
How does old VGA graphics hardware work?

JT64
What functionality is embedded in older VGA cards? Obviously there is a digital-to-analog conversion of the information, and even font support on the cards?

The raw data sent from the motherboard to the graphics card, is that more than just pixel information? In text mode I imagine there are just ASCII values transported to the graphics card. But if I run a graphical game, is it not just pixel information?

I think that DVI/HDMI can run in standard VGA mode, 640x480 at 50 Hz.
Isn't the HDMI interface just buffered pixel information read from a memory buffer?

If you were to build a "DVI/HDMI card" interface that supported only graphics mode at 640x480 at 50 Hz, would that be complex hardware? What more than pixel information is streamed over the cable?

JT
 
Personally I used to just play around with the good ol' Interrupt 10h in DOS or CP/M-86, for example. Because I was using something like TP, you could essentially program your own procedures for lines, boxes, circles, or whatever!

SWAG was a great resource for stuff relating to VGA - though I'm not sure it's relevant to what you want, since what I'm talking about is fairly generic. If you're talking about programming the card directly - i.e. taking a card from a certain manufacturer and using their technical extensions to the VGA - then that's way beyond me.
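
For what it's worth, here is a minimal sketch of that style of Int 10h programming (my own illustration, in C rather than TP, assuming a 16-bit real-mode DOS compiler such as Turbo C): set mode 13h, plot one pixel directly into video memory, wait for a key, and restore text mode.

```c
#include <dos.h>

int main(void)
{
    union REGS r;
    unsigned char far *vram = (unsigned char far *)MK_FP(0xA000, 0);

    r.x.ax = 0x0013;            /* Int 10h, AH=00h: set video mode 13h (320x200, 256 colors) */
    int86(0x10, &r, &r);

    vram[100 * 320 + 160] = 4;  /* plot pixel at (160,100) with palette index 4 */

    r.x.ax = 0x0000;            /* Int 16h, AH=00h: wait for a keypress */
    int86(0x16, &r, &r);

    r.x.ax = 0x0003;            /* Int 10h: back to 80x25 text mode */
    int86(0x10, &r, &r);
    return 0;
}
```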
 

Modern LCD displays can run VGA at 640x480 @ 60Hz or 720x400 @ 70Hz. The display information is analog, so you would need a converter to use it with a DVI/HDMI input. Most LCDs have DVI-I inputs, which accept both analog and digital signals; to use VGA with those monitors, all you would need is a pin adapter. To use an LCD with DVI-D (digital only) or HDMI (no VGA input), you would need a device that converts the analog VGA signal into a TMDS signal. Pixel and synchronization signals are transmitted directly over a VGA cable.

In CGA/MDA/EGA/VGA text modes, each even byte written into video memory selects which character is displayed in each cell on the screen; the odd bytes hold the attributes (colors, blink) for the preceding even byte. EGA and VGA cards can also display user-defined fonts in text modes. In graphics modes, each byte written into memory determines the color of one or more pixels on the screen.
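
To illustrate that even/odd layout (my own sketch, not from the post, again assuming a Borland-style real-mode DOS compiler): in color text mode the buffer lives at segment B800h, and each screen cell is a character byte followed by an attribute byte.

```c
#include <dos.h>

int main(void)
{
    unsigned char far *text = (unsigned char far *)MK_FP(0xB800, 0);
    int cell = 10 * 80 + 40;        /* row 10, column 40 on an 80-column screen */

    text[cell * 2]     = 'A';       /* even byte: character code */
    text[cell * 2 + 1] = 0x1E;      /* odd byte: attribute (yellow on blue) */
    return 0;
}
```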
 

Well, I was thinking more in terms of what hardware would be necessary to build a graphics ISA card with no analog output or text mode - basically a pixel renderer that only outputs 640x480 in graphics mode. What is actually sent from the motherboard to an old-fashioned graphics card when you play a VGA game?

JT
 
I think that DVI/HDMI can run in standard VGA mode, 640x480 at 50 Hz.

HDMI and DVI are types of digital connectors; they can carry a variety of signal types, so do not confuse them with resolution. DVI is primarily used in the computer industry and HDMI in the A/V industry. HDMI can carry digital audio as well as video.

It is not easy to convert between the A/V specs and PC specs, but they are roughly close:

480i is roughly 640x480
480p is roughly 852x480
1080i/p is roughly 1920x1080
720i/p is roughly 1280x720.

These are HD signal types, and their names refer to the number of lines of vertical resolution. SD signals are typically lower than 200 scan lines (320x200 if memory serves me right).

Another consideration is that HD is almost always widescreen (16:9 aspect), while computer widescreen differs in that it is actually 16:10.

This is why you generally need a scan converter to plug a PC into a TV, or in the case of most newer LCDs the converter is built in.
 
I'm digging back into my musty past here so bear with me...

The D-A conversion is done automatically by the hardware on the card itself. Refresh rates and resolution can be controlled programmatically by setting values on the VGA controller chip itself; an easy way is via BIOS interrupt calls.

In terms of pixel rendering, why waste the effort of building something when it is supported by all VGA cards in one form or another? For VESA VGA cards, you put the card into the appropriate resolution/color depth you want. Again, BIOS calls can accomplish this. How the pixels get displayed depends on the mode.

Graphics modes on VGA cards are memory-mapped. The contents of the video memory (e.g. on a Trident VGA with 1 MB RAM, that 1 MB is the video memory) determine what will be displayed on the screen. The amount of memory on the card also determines the available resolutions and color depths.

Many modes on the VGA card use indexed palettes, for example 640x480 with 16 or 256 colors. In the indexed modes, you load the RGB color information into the color palette. Palettes are used to conserve what was at the time expensive memory. Color values are 0-indexed. For 16 colors there would be 16 entries in the palette, and for 256 colors there would be 256 entries (on a standard VGA DAC each entry is 18 bits, 6 bits per RGB component).
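
A small sketch of how such a palette entry gets loaded (my own illustration, not from the post above, assuming a Borland-style DOS compiler that provides outportb()): the DAC is programmed through I/O ports 3C8h/3C9h.

```c
#include <dos.h>

/* Write one palette entry through the VGA DAC registers.
   Each component is 6 bits wide (0-63) on standard VGA hardware. */
void set_palette_entry(unsigned char index,
                       unsigned char r6, unsigned char g6, unsigned char b6)
{
    outportb(0x3C8, index);   /* select the palette entry to write */
    outportb(0x3C9, r6);      /* red   (0-63) */
    outportb(0x3C9, g6);      /* green (0-63) */
    outportb(0x3C9, b6);      /* blue  (0-63) */
}
```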

Video memory is laid out as contiguous space. The first byte of video memory corresponds to (0,0), the top-left pixel on the screen. As addresses advance, so do the pixels. Memory is read out and sent to the VGA DAC, rendering pixels from the top left, moving left to right, row by row, down to the bottom right.

Here's where it gets fun. The color depth determines how many pixels a byte of video memory represents. For monochrome, a single byte represents 8 pixels: if a bit is set, the pixel is on; if it is clear, the pixel is off. For 16-color modes, one byte equals two pixels: the high and low nibbles (4 bits each) each hold the index value for a single pixel. The 0-15 value is read out of the high or low nibble, looked up in the palette to determine what color that value represents, and the appropriate color values are sent to the DAC for display as the memory is read. (Strictly speaking, the standard VGA 16-color modes are organized as four bit planes rather than packed nibbles, but conceptually each pixel is still a 4-bit palette index.) The same applies to the 256-color mode. 256-color modes are the easiest to program since one byte represents one pixel of a specific indexed color.

Higher color modes like 4096, 65536 and 16.7M colors work similarly, except that now multiple bytes are needed to represent individual pixels. For 4096 (12-bit) color, 3 bytes represent two pixels. 4096-color modes are typically not found on PCs; I use it only for reference.

The 65K and 16.7M modes, unlike the low color modes, are typically not palettized but rather direct-color modes. This means that instead of indexes into a palette of colors, the RGB color information is encoded directly into the values stored in memory. As a result, these modes are easier to program since a palette doesn't need to be set up beforehand. For 16.7M colors, 3 bytes are required to represent a single pixel.

That's where video memory affects what resolutions and color depths are available. Typical VGA cards had 128 KB or 256 KB of RAM onboard. On standard VGA cards, 256 colors at 640x480 wasn't possible since it requires 300 KB to render. The non-standard 640x400 resolution could display 256 colors on a 256 KB card. 320x200 at 256 colors was the most heavily used VGA mode for games and easy to program, since it used 64 KB to render, making it natural for DOS programmers used to 64 KB segment addressing.

It also shows why 800x600 was limited to 16 colors, since it would require roughly 480 KB to render 256 colors but only 240 KB to render 16 colors.

Super VGA and XGA allowed for the higher resolutions and color depths. Today, modern cards typically use 24 or 32 bits per pixel. It doesn't take much to see how much memory is required to support the high-resolution, high-color modes we are used to today: 1280x1024 at 16.7M colors requires between 4 MB and 5 MB of video memory, which back then often exceeded the total RAM in the computer.
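
To make that arithmetic explicit (my own summary, nothing VGA-specific): bytes required = width x height x bits-per-pixel / 8, which reproduces the figures quoted above.

```c
#include <stdio.h>

/* Framebuffer size in bytes = width * height * bits-per-pixel / 8. */
static long vram_bytes(long w, long h, long bpp)
{
    return w * h * bpp / 8;
}

int main(void)
{
    printf("320x200, 8 bpp  : %ld bytes\n", vram_bytes(320, 200, 8));    /* 64000   */
    printf("640x480, 8 bpp  : %ld bytes\n", vram_bytes(640, 480, 8));    /* 307200  */
    printf("800x600, 4 bpp  : %ld bytes\n", vram_bytes(800, 600, 4));    /* 240000  */
    printf("1280x1024,32 bpp: %ld bytes\n", vram_bytes(1280, 1024, 32)); /* 5242880 */
    return 0;
}
```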

As to your question about what the motherboard sends to the card, it's pretty straightforward:

1) The program calls Int 10h to initialize the card and put it into the proper resolution. Let's assume 320x200x256 (mode 13h). It would also initialize the color palette.
2) Depending on the mode, a specific region of the address space under DOS is mapped to video memory (segment A000h for mode 13h); the mode determines where this is.
3) The program points itself to the start of video memory and begins writing bytes to it. As bytes are written, the VGA reads them out and the DAC updates the signals, so the resulting pixels appear on the monitor.

For 16-color and monochrome modes, the program would also be responsible for the logical bit operations needed to turn on the 2 or 8 pixels, respectively, represented by each byte before writing it back to memory (see the sketch below).
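
A sketch of that read-modify-write for a hypothetical packed 4-bits-per-pixel buffer (my own illustration; as noted above, real VGA 16-color modes are planar, so actual hardware access goes through the VGA's plane registers instead):

```c
/* Two pixels per byte: even x lives in the high nibble, odd x in the low nibble. */
static unsigned char framebuffer[640 / 2 * 480];   /* hypothetical packed 4 bpp buffer */

void put_pixel_4bpp(int x, int y, unsigned char color /* 0-15 */)
{
    unsigned char *p = &framebuffer[(y * 640 + x) / 2];

    if ((x & 1) == 0)
        *p = (*p & 0x0F) | (unsigned char)(color << 4);  /* even x: replace high nibble */
    else
        *p = (*p & 0xF0) | (color & 0x0F);               /* odd x: replace low nibble */
}
```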

That's really it. Forgive me if some of the information is inaccurate as I am drawing from my college days and memory when I used to do video programming. Hopefully this helps.

Matt
 
These are HD signal types, and their names refer to the number of lines of vertical resolution. SD signals are typically lower than 200 scan lines (320x200 if memory serves me right).
Standard definition is 480i and 576i, often incorrectly called NTSC and PAL/SECAM respectively (where NTSC, PAL and SECAM are simply analogue colour encoding methods).

These translate to 640x480 at 60 fields/30 frames per second and 720x576 at 50 fields/25 frames per second.
 
I think you meant 576p, not 576i.
 

Thank you for the briefing, over my head but not incomprehensible.
It would be interesting to know the other part of the story: what is an HDMI/DVI signal? Could you use a simple microcontroller to turn a pixel on or off on an LCD screen?
I guess there must be a clock to sync with and a protocol to conform to?

OK, I know next to nothing about electronics, so bear with me.
Would it not be possible to hook up a microcontroller to read from the VGA memory buffer and output a DVI-compatible signal? I know that HDMI needs a lot of bandwidth, so maybe a microcontroller does not have the necessary speed and clock. And what about signal voltage, would you have to deal with such stuff too?

I probably have no idea what I am talking about, but the subject is interesting, and maybe it is a bit of a simplification to think that everything could be dealt with using a microcontroller in the digital domain. But they are pretty good at simple things like reading from memory, they have a built-in clock and a small buffer, so they can handle simple bit and byte manipulation (right?).

Wouldn't it be a piece of cake for the guys who built the IDE controller to make a hack where a microcontroller reads the memory buffers of a VGA card? A little header programming would be needed to conform with the DVI/HDMI standard, but what more would need to be done hardware-wise to get an HDMI/DVI-compatible signal out of the microcontroller?

:mrgreen: And how come we have no digital ISA sound card kit for the XT? Could you let a microcontroller read directly from the ISA bus, without a lot of analogue stuff in between, and "with a little programming" output an S/PDIF-compatible signal?

One last remark: it is usually very easy to solve problems when you have no idea how hard they would be to solve in reality. But strictly speaking, in the digital domain all problems are solved with "a little programming".

I just wish I knew more about that analogue black box.

JT
 