On Behalf Of Michael Wilkinson
> No, it does not assist.
OK I'll bite ;)

> Your comments are rather hazy with
> an authority that lacks detail, so please, be more detailed
> in your response. I'm not questioning your knowledge, just
> the depth / explanation of your response. There are lots of us
> who are looking to move on from CRT to LCD screens that
> really do need to get the facts before we buy.

I am not fully up-to-date on everything technical regarding hardware
implementations of CRT vs LCD / TFT screens, so my descriptions have to be a
bit hazy. However:

FOR CRTs
CRTs work by scanning electron beams across the screen, line by line, from
top to bottom. The rate at which individual lines are drawn is the
horizontal scan rate; the number of complete screens drawn per second is the
vertical refresh rate - normally people are used to seeing 60Hz+... (UK TVs
are 50Hz I believe, even though we only get 25 frames per second, because
each frame is sent as two interlaced fields). For reference there will be
three electron guns - one for each of R, G, B - which I believe is an
improvement on most TVs, which have just one. This means that each gun only
fires to activate phosphors of one colour on the front of the screen. The
number of electrons streamed at a phosphor determines how brightly it will
glow.
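To put some rough numbers on the scanning description above - a quick sketch in Python, where the line count and the ~5% vertical blanking allowance (time for the beam to fly back to the top) are illustrative assumptions of my own, not figures from any particular monitor:

```python
# Rough relationship between vertical refresh and horizontal scan rate
# on a CRT. The beam must draw every line of every frame.
visible_lines = 1024                     # e.g. a 1280*1024 mode
total_lines = int(visible_lines * 1.05)  # visible + ~5% blanking (assumption)
vertical_refresh_hz = 85                 # full-screen redraws per second

# Lines drawn per second, expressed in kHz:
horizontal_scan_khz = total_lines * vertical_refresh_hz / 1000.0
print(f"~{horizontal_scan_khz:.1f} kHz horizontal scan rate")

# How long any one phosphor waits between strikes of the beam:
frame_period_ms = 1000.0 / vertical_refresh_hz
print(f"~{frame_period_ms:.1f} ms between refreshes of any one pixel")
```

At 85Hz each phosphor goes nearly 12ms between strikes - which is why how quickly it fades in that gap (see below) matters so much for flicker.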

Typically, CRT users have always wanted a 70Hz+ vertical refresh because
they sit close to the screen, so flicker becomes more likely to induce
headaches etc. Screen refresh is an issue because I believe the subconscious
mind picks up on flicker even when the conscious mind does not. Peripheral
vision is also more sensitive to flicker - so glancing at the screen out of
the corner of your eye can be a good way to check whether the refresh rate
is high enough for you. Flicker is doubtless caused by many things, but I
believe the main factor is that the phosphors only glow for so long after
being struck by the beam (the techie term for this is the phosphor's
'persistence'); if they have too much chance to fade before being activated
by the beam again, you see flicker. Hence, for CRTs, typically, the higher
the refresh rate the better. However, this very disposition of the phosphors
to fade quite quickly can actually be useful for some things, like moving
images. There have been 100Hz TVs available for some time over here, and
fluorescent 100Hz lighting is also easier on the eyes.

CRTs receive ANALOG information via (typically for PC users; I can't speak
for Macs) a VGA cable. Of course, the image in your graphics card's memory
is digital, so the card needs a RAMDAC (Random Access Memory
Digital-to-Analog Converter) to convert that digital information to an
analog signal to feed down the wire. Hence, the bandwidth of the card's
RAMDAC limits how high a resolution-and-refresh combination it can drive on
a CRT screen without noticeable flicker. To complicate matters, the screens
often have a 'bandwidth' of sorts too; so even if you buy a CRT that is
capable of 1600*1200, say, that does not mean it will automatically run at
the same refresh rates that a recent, good graphics card can output.
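The bandwidth trade-off can be sketched with simple arithmetic: the RAMDAC must emit one pixel's worth of analog signal for every pixel of every frame, plus the blanking intervals. The ~25% blanking overhead and the 350 MHz RAMDAC limit below are my own illustrative assumptions, not the spec of any particular card:

```python
# Why resolution and refresh rate trade off against RAMDAC bandwidth.
def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.25):
    """Approximate pixel clock the RAMDAC must sustain, in MHz.
    blanking_overhead is an assumed allowance for retrace timings."""
    total_pixels = width * height * (1 + blanking_overhead)
    return total_pixels * refresh_hz / 1e6

ramdac_limit_mhz = 350  # assumed figure for a decent card of this era

for refresh in (60, 85, 100):
    clk = pixel_clock_mhz(1600, 1200, refresh)
    verdict = "OK" if clk <= ramdac_limit_mhz else "beyond this RAMDAC"
    print(f"1600*1200 @ {refresh}Hz needs ~{clk:.0f} MHz -> {verdict}")
```

The same sum run against the *monitor's* quoted bandwidth shows why a cheap CRT may cap the refresh rate even when the card has headroom to spare.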

FOR LCDs / TFTs
The whole situation is different. Strictly speaking, a TFT *is* a type of
LCD - and I believe most of us say 'LCD' to mean any flat screen, when in
fact the good ones are active-matrix TFTs (which is what we are interested
in!). The technical details of how these panels get the picture onto the
screen are a little beyond me... But due to their different underlying
characteristics, it just makes more sense to get a digital representation of
the screen image to the monitor, so that there is less degradation compared
with an analog signal. This is where the DVI interface steps in. And again,
for precise reasons that elude me, the vertical refresh rate of the screen
becomes far less important and 60Hz becomes reasonable... If you want some
of my ideas on why I think this is possible, we should probably take this
off-list! (Let's just say that while the video signal may contain
information about any one pixel only 60 times per second, the screen
hardware _may_ actually be refreshing that pixel 500 times per second,
electronically.) The response time on old LCDs can be awful, leaving ghostly
trails behind a moving mouse pointer etc. I've never noticed such on my
TFTs. More here on monitor recommendations:
http://www.mwords.co.uk/pages/cm/profileMonitor.htm#monitor
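One concrete reason 60Hz is the norm over DVI: a single-link DVI connection is capped at a 165 MHz pixel clock by the DVI 1.0 spec, so high resolutions simply cannot fit down the cable at CRT-style refresh rates. A sketch of which modes fit - the blanking-overhead figures are my own assumptions:

```python
# Single-link DVI tops out at a 165 MHz pixel clock (DVI 1.0 spec).
DVI_SINGLE_LINK_MHZ = 165

def fits_single_link(width, height, refresh_hz, blanking_overhead):
    """Return (fits, approx pixel clock in MHz) for a given mode."""
    clk_mhz = width * height * (1 + blanking_overhead) * refresh_hz / 1e6
    return clk_mhz <= DVI_SINGLE_LINK_MHZ, clk_mhz

modes = [(1280, 1024, 60, 0.25),   # CRT-style blanking (assumption)
         (1600, 1200, 60, 0.05),   # reduced blanking, as flat panels use
         (1600, 1200, 85, 0.25)]   # CRT-style refresh at high resolution
for w, h, hz, blank in modes:
    ok, clk = fits_single_link(w, h, hz, blank)
    print(f"{w}*{h} @ {hz}Hz: ~{clk:.0f} MHz ->",
          "fits single-link DVI" if ok else "does not fit")
```

So 1600*1200 @ 60Hz squeezes through, while the same resolution at 85Hz would not - and since the panel doesn't flicker at 60Hz anyway, nothing is lost.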

MY OWN RECOMMENDATIONS FOR GRAPHICS CARDS
So I think this means that you should look for a card with both VGA and DVI
outputs if you are thinking you may upgrade to a TFT flat screen in the
future. Looking for one whose specifications quote a particular
resolution-and-refresh combination (at 1600*1200, say, if that is how you
run your screen) is probably a good sign of a reasonable card - though as
noted in my previous email, this bandwidth may not be as important for
digital displays as it is for CRTs. On RAM quantity, I do not personally
believe that more than about 32MB is required per screen you plan to run
(Mac Cinema Display users may push this boundary). Graphics cards with
128MB+ of memory are typically '3d specialist' rendering cards, which use
the extra memory to store textures that the cards then render onto 3d
shapes. In this case '3d specialist' means 'graphics cards for games
players' for the vast majority of people.
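The 32MB-per-screen figure is easy to sanity-check: the desktop image itself is just width * height * bytes-per-pixel. The double-buffering below is my own assumption (2D desktops often run single-buffered, which halves the total):

```python
# Why ~32MB per screen is plenty for 2D work: a framebuffer sketch.
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=2):
    """Memory needed to hold the desktop image itself, in MB.
    buffers=2 assumes double-buffering (a generous assumption for 2D)."""
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

print(f"1600*1200, 32-bit colour: ~{framebuffer_mb(1600, 1200):.1f} MB")
# Even an Apple Cinema Display's 1920*1200 stays well under 32MB:
print(f"1920*1200, 32-bit colour: ~{framebuffer_mb(1920, 1200):.1f} MB")
```

Anything beyond that on a 128MB card is only ever touched by 3d texture storage - hence the games-player remark above.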

As ever, there is so much more that could be said!
Best Regards,
nij

===============================================================
GO TO http://www.prodig.org for ~ GUIDELINES ~ un/SUBSCRIBING ~ ITEMS for SALE
