Video Cards and Monitors
Without a video card and monitor,
you don't see anything. In fact, every PC that I have ever seen won't even
boot unless there is a video card in it. Granted, your
computer could boot and even work without being attached to a monitor (and I have seen such setups), but it's no fun unless you get to see what's going on.
When PCs first hit the market, there was only one kind of video system. High
resolution and millions of colors were something you read about in
science-fiction novels. Times changed and so did graphics adapters. The first
dramatic change was with the introduction of color with IBM's color graphics
adapter (CGA), which required a completely new (and incompatible) video
subsystem. In an attempt to integrate color and monochrome systems, IBM came out
with the enhanced graphics adapter (EGA).
But I'm not going to talk about
those. Why? First, no one buys them anymore, and I doubt that anyone still makes
them. (If you could find one, there would be no problem at all installing it and
getting it to work.) Second, they are simply not that common any longer. Because
"no one" uses them anymore, the time I spend telling you why I won't tell you
about them is already too much.
What are we going to talk about instead? Well, the first thing is Video
Graphics Array (VGA). VGA is the standard on which
virtually all video card manufacturers base their products. Though enhancements
to VGA (Super VGA, or SVGA) exist, they are all based on VGA.
Before I talk about VGA, I first need to talk about some basics of video
technology. The first issue is just how things work. Digital signals are sent
by the operating system to the video card, which sends them
through a digital-to-analog converter (DAC). Usually a single chip contains
three DACs, one for each color (red, green, and blue, or RGB). The
DAC has a look-up table that determines the voltage to be
output on each line for the respective color.
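If it helps to see the idea in code, here is a minimal sketch of such a look-up table in Python. The palette entries and the 0.7-volt peak level are just illustrative assumptions, not values from any particular DAC chip:

```python
# A sketch of a DAC look-up table (palette). The entries and the 0.7 V
# peak level are illustrative assumptions, not from a real DAC chip.

# Each color index maps to (red, green, blue) levels, 0-255.
palette = {
    0:  (0, 0, 0),          # black
    1:  (0, 0, 170),        # blue
    4:  (170, 0, 0),        # red
    15: (255, 255, 255),    # white
}

def dac_output(color_index, peak_voltage=0.7):
    """Return the analog voltage driven on each of the R, G, and B lines."""
    r, g, b = palette[color_index]
    return tuple(level / 255 * peak_voltage for level in (r, g, b))

print(dac_output(4))   # red line driven, green and blue lines at 0 V
```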
The voltage that the
DAC has found for a given color is sent to the three
electron guns at the back of the monitor's cathode ray tube (CRT), again, one
for each color. The intensity of each electron stream is a result of this voltage.
The video adapter also sends a signal to
the magnetic deflection yoke, which aims the electron beams to
the right place on the screen. This signal determines how far apart the dots
are, as well as how often the screen is redrawn. The dots are referred to as
pixels, the distance between them is the pitch, and
how often the screen is redrawn is the refresh rate.
To keep the
beams precisely aligned, they first pass through a shadow mask, a metal
plate containing hundreds of thousands of little holes. The dot pitch is the distance
between the holes in the shadow mask. The closer the holes, the lower the pitch. A
lower pitch means a sharper image.
The electrons from the electron guns
strike the phosphors inside the monitor screen and make them glow. Three
different phosphors are used, one for each color. The stronger the beams, the
more intense the color. Colors other than red, green, and blue are created by
varying the amount displayed of each of these three colors, that is, by changing
the intensity of each color. For example, purple would be created by exciting
the red and blue phosphors but not the green. After the beam stops hitting a
phosphor, the phosphor continues to glow for a short time. To keep the image on
the screen, the phosphor must be recharged by the electron beam again.
The electron beams are moved across
the screen by changing the current through the deflection yoke. When the beams
reach the other side,
they are turned off and returned to the starting side, just below the line where
they left off. When the guns reach the last line, they move back up to the top.
This is called raster scanning, and it is done approximately 60 times a second.
When the beam has completed a line, it needs to return to the
other end to begin the new line. This is called horizontal
retrace. Similarly, when the beam has reached the bottom, it needs to
move back to the top again. This is the vertical retrace. In both cases,
the beams' intensity is reduced to the point that nothing is drawn on
the screen. (This is called blanking.)
Some monitor manufacturers
try to save money by using less expensive components. The trade-off is that the
beams cannot scan every line during each pass. Instead, they scan every other
line during the first pass then scan the lines they missed during the second
pass. This is called interlacing because the scan lines are interlaced.
Although this provides higher resolutions in less expensive monitors, the images
will "flicker" because the phosphors begin to fade before they can be recharged.
(This flickering gives me, and other people, a headache.)
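To picture the scan order, here is a little Python sketch using a toy eight-line screen (the real thing happens in analog hardware, of course; the loop is only an illustration):

```python
# Progressive versus interlaced scan order on a toy 8-line screen.
lines = list(range(8))

progressive = lines                              # every line on every pass
even_field = [n for n in lines if n % 2 == 0]    # first pass: lines 0, 2, 4, 6
odd_field = [n for n in lines if n % 2 == 1]     # second pass: lines 1, 3, 5, 7

print("progressive:", progressive)
print("interlaced: ", even_field, "then", odd_field)
# Interlaced, each line is refreshed only on every other pass, so its
# phosphors have twice as long to fade -- hence the flicker.
```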
For most users,
the most important aspect is the resolution. Resolution determines the
total number of pixels that can be shown on the screen. In graphics mode,
standard VGA has a resolution of 640 pixels horizontally
and 480 pixels vertically. By convention, you say that your resolution is 640x480.
A pixel is actually a set of three phosphors rather than just
a single phosphor. So, in essence, a pixel is a single spot of color on the
screen. What color is shown at any given location is an interaction
between the operating system and the video card. Years ago,
the operating system (or program) had to tell the video card where each dot on
the screen was. It had an internal array (or table) of pixels, each containing
the appropriate color values. Today, some video cards can simply be told what to
draw. The operating system doesn't need to specify every red dot between
points A and B; instead, it simply tells the card to draw a red line from point A to
point B. This results in faster graphics because the video card has taken over
much of the work.
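As a rough sketch of the difference, compare setting every dot yourself with issuing one high-level command. The framebuffer layout and the function name here are made up for illustration:

```python
# Contrast: per-pixel writes versus a single high-level draw command.
# The framebuffer layout and the function name are hypothetical.

WIDTH, HEIGHT = 640, 480
RED = 4
framebuffer = [[0] * WIDTH for _ in range(HEIGHT)]

# Old style: the operating system (or program) sets every dot itself.
for x in range(50, 201):
    framebuffer[100][x] = RED

# New style: one command; the card works out the individual pixels.
def draw_hline(y, x1, x2, color):
    for x in range(x1, x2 + 1):
        framebuffer[y][x] = color

draw_hline(100, 50, 200, RED)   # one call instead of 151 pixel writes
```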
In other cases, the system still needs to keep track of
which colors are where. If we had a truly monochrome video system, any given
pixel would either be on or off. Therefore, a single bit can be used to store
that information. If we go up to 16 colors, we need 4 bits, or half a byte of
information (2^4 = 16). If we go to a whole byte, then we can have 256
colors at once (2^8 = 256). Many video cards use three bytes to store the
color data, one for each of the primary colors (RGB). In this way, they can get
more than 16 million colors!
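The arithmetic behind these color counts is just powers of two, as a few lines of Python show:

```python
# Bits per pixel determine how many colors can be shown at once: 2**n.
for bits in (1, 4, 8, 24):
    print(f"{bits:2d} bits/pixel -> {2 ** bits:,} colors")

# Output:
#  1 bits/pixel -> 2 colors          (monochrome: on or off)
#  4 bits/pixel -> 16 colors
#  8 bits/pixel -> 256 colors
# 24 bits/pixel -> 16,777,216 colors ("more than 16 million")
```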
Now, 16 million colors seems like a lot,
and it is. However, it's actually too much. Humans cannot distinguish that many
colors, so much of the ability is wasted. Add to that the fact that most monitors
are limited to just a few hundred thousand colors. So, no matter what your friends
tell you about how wonderful their video card is that does 16 million colors,
you need not be impressed. The odds are the monitor can't handle them and you
certainly can't see them.
However, don't think that the makers of video
cards are trying to rip us off. In fact, it's easier to design cards that use
multiples of whole bytes. If we had an 18-bit display (which would be needed to
get the roughly 260,000 colors that monitors can handle), we would either use 6
bits in each of three different bytes or two whole bytes and 2 bits of a third.
Either way, bits are wasted and you spend time processing them. If you know that
you have to read three whole bytes, one for each color, then there is not as much
processing involved.
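Here is a quick sketch of that difference; the pixel values are arbitrary, but the shifting and masking is exactly the extra work an 18-bit layout would force on you:

```python
# Byte-aligned 24-bit pixel: each color is simply one byte.
pixel24 = bytes([200, 100, 50])     # R, G, B: arbitrary example values
r8, g8, b8 = pixel24                # trivial to unpack

# Packed 18-bit pixel: three 6-bit fields straddling byte boundaries,
# which must be shifted and masked out.
pixel18 = 0b110010_011001_001100
r6 = (pixel18 >> 12) & 0x3F         # top 6 bits
g6 = (pixel18 >> 6) & 0x3F          # middle 6 bits
b6 = pixel18 & 0x3F                 # bottom 6 bits

print(r8, g8, b8, "|", r6, g6, b6)
```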
How many pixels and how many colors a video card can show
are interdependent. When you bought it, your video card came with
a certain amount of memory, and that amount limits the total number
of pixels and colors you can have. If you take the standard resolution of a
VGA card of 640x480 pixels, that's 307,200 pixels. If we
want to show 16 colors, that's 307,200 x 4 bits, or 1,228,800 bits. Dividing this
by eight gives you 153,600 bytes needed to display 640x480 in 16 colors. Because
memory is usually produced in powers of two, the smallest size that will fit is
256 kilobytes. Therefore, a video card with 256K of memory is needed.
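That calculation is simple enough to capture in a few lines of Python (just the arithmetic from the text, nothing more):

```python
def video_memory_bytes(width, height, bits_per_pixel):
    """Bytes of video memory needed for one full screen."""
    return width * height * bits_per_pixel // 8

print(video_memory_bytes(640, 480, 4))   # 153,600 bytes -> a 256K card
```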
For many people, this is enough. For me, I don't get enough on the screen at 640x480, and only
16 colors looks terrible (at least to me). However, if you never run any
graphics applications on your machine, such as X-Windows, then there is no need
for anything better. Operating in text mode, your video card gets by with far
less memory.
As I said, I am not happy with this. I want more. If I want to
go up to the next highest resolution (800x600) with 16 colors, I need 240,000
bytes. That is still less than the 256K card I needed for 640x480 and 16 colors,
so the same card will do. If, instead, I want 256 colors (which requires 8 bits
per pixel), I need at least 480,000 bytes. I now need 512K on the video card.
Now I buy a great big
monitor and want something closer to "true color." Let's not get greedy, but say
I want a resolution of 1,024x768 (the next step up) and "only" 65,536
colors. I now need 1,572,864 bytes of memory. Because my video card has only 1MB
of memory, I'm out of luck!
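Running the same sort of calculation over these examples confirms the numbers (the helper is repeated so the snippet stands alone):

```python
def video_memory_bytes(width, height, bits_per_pixel):
    return width * height * bits_per_pixel // 8   # same sketch as above

print(video_memory_bytes(800, 600, 4))     # 240,000 -> a 256K card still works
print(video_memory_bytes(800, 600, 8))     # 480,000 -> needs 512K
print(video_memory_bytes(1024, 768, 16))   # 1,572,864 -> more than 1MB
```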
But wait a minute! Doesn't the
VGA standard only support resolutions up to 640x480? True.
However, the Video Electronics Standards Association (VESA) has defined
resolutions above 640x480 as Super VGA. In addition to the resolutions I
mentioned previously (800x600 and 1,024x768), SVGA also
includes 1,280x1,024 and 1,600x1,200.
Okay. The mere fact that you have
a video card that can handle SVGA resolutions does not mean you
are going to get a decent picture (or at least not the picture you want). Any
system is only as good as its worst component, and this also applies to your
video system. It is therefore important to understand a characteristic of your
monitor: pitch. I mentioned this briefly before, but it is important to talk
about it further.
When shopping for a monitor, you will often see that
among the characteristics used to sell it is the pitch. The values you would see
could be something like .39 or .28, which is the spacing between the holes in
the shadow mask, measured in millimeters. Therefore, a
pitch of .28 is just more than one-quarter of a millimeter. The lower the pitch,
the closer together the holes and the sharper the image. Even if you aren't
using any graphics-oriented programs, it's worth the few extra dollars to get a
lower pitch and the resulting sharper image.