On a CRT monitor the image is analog, while LCD screens are inherently digital. In the first case, all the video card does is translate the digital signal it receives from the computer into an analog signal for the CRT monitor.
Because LCD screens are digital, an extra step is added when the signal enters the monitor over VGA: the translation from analog back to digital. This conversion causes signal degradation, and thus a less sharp image. That's why DVI was invented: DVI output is digital from end to end, and results in a much sharper image because there is no signal conversion at all.
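To make the point concrete, here is a toy sketch of that VGA signal path. It is not how real monitor hardware works; the function names, noise level, and 8-bit resolution are illustrative assumptions. It just shows that a digital-to-analog-to-digital round trip can nudge pixel values off by a level, while a noise-free (all-digital) path returns them exactly.

```python
# Toy model of the analog VGA path: the card's DAC turns a pixel value
# into a voltage, the cable picks up a little noise, and the LCD's ADC
# quantizes the voltage back to a pixel value. All numbers here are
# made-up assumptions for illustration, not real hardware specs.
import random

random.seed(42)

def dac(value, levels=256):
    """Convert an 8-bit pixel value to an analog level in [0, 1]."""
    return value / (levels - 1)

def adc(voltage, levels=256):
    """Quantize an analog level back to the nearest 8-bit value."""
    v = min(max(voltage, 0.0), 1.0)
    return round(v * (levels - 1))

def vga_path(pixel, noise=0.004):
    """Digital -> analog -> (noisy cable) -> digital round trip."""
    v = dac(pixel) + random.uniform(-noise, noise)
    return adc(v)

pixels = [0, 17, 128, 200, 255]
received = [vga_path(p) for p in pixels]
errors = [r - p for p, r in zip(pixels, received)]
print(received)
print(errors)  # errors of a level or so may appear
```

With DVI there is no DAC/ADC pair in the path, which in this sketch corresponds to `noise=0.0` and a perfect round trip.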
I have a 15" LCD screen with a VGA connection, and the image looks very good to me. However, on the bigger screens (17"+) there are more pixels per frame, so there is more digital-to-analog-to-digital conversion going on and a greater chance of visible signal loss.
This is all from personal experience, so go to a computer shop and ask someone to give you a demonstration.
All the new high-end cards come with DVI connectors, but personally I think LCD screens are usually bought because people think they look neat. Or perhaps they have so much money that it doesn't matter what they buy. A flat-screen CRT monitor with a high refresh rate has a much better picture, IMO.
That extra $200 can be put toward higher quality brands and a bigger, flatter screen.
One of my monitors has digital BNC inputs as well as analog. You could get a DVI-to-BNC dongle and just find a monitor with those inputs if a good digital picture is what you want.
Originally posted by HandyBuddy: All the new high-end cards come with DVI connectors, but personally I think LCD screens are usually bought because people think they look neat. Or perhaps they have so much money that it doesn't matter what they buy.
Sure, 85 Hz+ is pretty much flicker-free, but compared to the completely steady image of an LCD it's a big difference. Too bad LCDs still have downsides, like resolution, viewing angle, dead pixels and so on...
Originally posted by adamg: I'm looking at two LCDs (both 15"). Both Samsung: one is the 151N (analog) for $525 and the other is the 152T (analog/digital) for $625. Seems like the difference is only $100.
I work in a store where we sell the 151N and I must say the picture is quite sharp. I've got its predecessor, the Samsung 151s, myself, and I like what I see; the picture is sharp enough. As I said in my previous post, go to a local computer shop and ask for a demonstration of both monitors. That way you can decide whether you can see the difference, and whether it's worth the extra $100 for a DVI monitor.
A friend and I both have this same LCD (Dell 1800FP, see sig). I use the digital connection and he uses analog, and I was surprised at the clarity his showed. I couldn't put them side by side, but I couldn't notice any real difference between the two, though I'm sure I could find one if I were able to compare them directly. Plus, I'm sure digital offers truer colors and uses the entire contrast range the LCD is capable of.
Also, these Dells are very decent LCDs and probably have good analog-to-digital converters in them.
One artifact I know would go away on my dad's 17" Hansol if it had DVI is the strange shadows that appear on the screen. If you have a white background with black objects on it (like text), the objects cast faint shadows about 1 cm to their right. This is caused by signal reflections (bounce) in the analog cable; with a digital signal these shadows would not appear at all.
Ep, glad to see you come back and tidy up. I did want to ask a one-day favor: I want to enhance my resume, and was hoping you could make me administrator for a day. If so, take me right off afterwards, since I won't be here to do anything and don't know the slightest thing about the board, but it would be nice putting "served as administrator, OSNN" on there. If you can do it, THANKS.