Why was VGA 640x480 w/256kb ?

rmay635703

Veteran Member
Time for the dumb question of the day.

Something I’ve wondered since I got my Tandy 1000 RLX is where the non-binary-divisible 640x480 resolution came from.

640x480x16 colors is 150 KB and runs at roughly twice the speed of 15 kHz TV

Most previous video standards had a resolution that fit exactly into a specific amount of VRAM

CGA, Hercules, even Mac mono fit exactly into the VRAM set aside for video.

So why did VGA usually have 256 KB on the card?
To me 640x400x256 would have been ideal since it would use the whole amount.
But that mode is rare.

I also wondered if older VGA cards had something other than 256 KB, but...
I only know of one transitional EGA/VGA card that had 160 KB of VRAM.
A few VGA mono cards and the original MCGA had 64 KB, which also doesn’t make sense except for 320x200x256.

The same question applies to the standard EGA cards that were expandable to 256 KB.

Why have more RAM than you need?
I get Mode X double buffering, but that wouldn’t apply to full VGA resolution.

Just curious; this always seemed strange.
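
For reference, the arithmetic I'm going by is just width x height x bits-per-pixel / 8 compared against the RAM normally fitted to each adapter; a quick sketch (the mode geometries are the usual published ones, and the 640x400x256 row is the hypothetical mode I mean):

Code:
/* Frame size vs. fitted VRAM for a few adapters.  The 640x400x256 row
   is hypothetical -- the mode that would fill 256 KB almost exactly. */
#include <stdio.h>

struct mode { const char *name; long w, h, bpp, vram; };

int main(void)
{
    static const struct mode m[] = {
        { "CGA  640x200x2",    640, 200, 1,  16L * 1024 },
        { "Herc 720x348x2",    720, 348, 1,  32L * 1024 },
        { "MCGA 320x200x256",  320, 200, 8,  64L * 1024 },
        { "VGA  640x480x16",   640, 480, 4, 256L * 1024 },
        { "VGA  640x400x256?", 640, 400, 8, 256L * 1024 },
    };
    int i;

    for (i = 0; i < (int)(sizeof m / sizeof m[0]); i++) {
        long frame = m[i].w * m[i].h * m[i].bpp / 8;
        printf("%-17s %6ld bytes of %6ld (%ld left over)\n",
               m[i].name, frame, m[i].vram, m[i].vram - frame);
    }
    return 0;
}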
 
640x480x16 colors is 150 KB and runs at roughly twice the speed of 15 kHz TV

Not sure what you mean by twice the speed. VGA 640x480x60 is 25.175 MHz dot with 800 x 525 total pixels. NTSC is 27.00 MHz dot with 858 x 525 total pixels (counting both fields).

I'm guessing 640 = 480 / 3 * 4, i.e. a 4:3 aspect ratio. The odder question to me is why 720 x 480 for NTSC?

Most previous video standards had a resolution that fit exactly into a specific amount of VRAM

CGA, Hercules, even Mac mono fit exactly into the VRAM set aside for video.

They fit by coincidence. My guess is that since 640 was well established horizontally, they expanded to the industry norm of 480 active scan lines. 800 x 600 x 16 eventually lined up better with 256KB (240,000 bytes) but would have run slower on period hardware.

So why did VGA usually have 256 KB on the card?

Because it was a log2 next-step round-up of the RAM required for one page? 192K would have cost more than 256K. And as you pointed out, 320x200 frames lined up better, and that was the more popular VGA mode thanks to the 8-bit palette. You could have four active pages of 320 x 200 x 8-bit LUT with 256KB.

Remember, 640x480 was rarely used for graphics; the card spent most of its time in text mode, which needs only about 2K of video RAM. 320 x 200 x 8-bit was by far the more popular graphics choice. So why pick on 640 x 480 for your alignment question?

To me 640x400x256 would have been ideal since it would use the whole amount.
But that mode is rare.

It is supported in every VGA card but to use it you had to de-couple the 4 video planes which made pixel access non-sequential. Thus it was difficult to use.

And, again, most games required double buffering for seamless redraw. 640 x 400 x 256 left room for only one visible frame, with no back buffer to draw into.
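
To put numbers on that, whole pages per 256 KB is what decides whether there is a spare page to draw into; a quick sketch of the division:

Code:
/* Complete display pages that fit in 256 KB of VRAM.  Anything under 2
   means no off-screen page -- redraws happen in front of the viewer. */
#include <stdio.h>

int main(void)
{
    long vram = 256L * 1024;
    long p320x200x8 = 320L * 200;      /*  64,000 bytes */
    long p320x240x8 = 320L * 240;      /*  76,800 bytes */
    long p640x480x4 = 640L * 480 / 2;  /* 153,600 bytes */
    long p640x400x8 = 640L * 400;      /* 256,000 bytes */

    printf("320x200x256: %ld pages\n", vram / p320x200x8);  /* 4 */
    printf("320x240x256: %ld pages\n", vram / p320x240x8);  /* 3 */
    printf("640x480x16 : %ld pages\n", vram / p640x480x4);  /* 1 */
    printf("640x400x256: %ld pages\n", vram / p640x400x8);  /* 1 */
    return 0;
}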
 
EGA could be expanded to 256 KB, so IBM made sure that the PS/2 VGA would also have 256 KB. Note that there were enhanced EGA cards that supported 640x480, and VGA cards quickly went to 800x600 with no change in memory. MCGA also supported the critically important 640x480 monochrome mode, which was to have been the secretary's screen resolution.

Why more RAM? Well, the price of 256-kbit chips quickly fell to only slightly more than that of 64-kbit chips, so why not buy the extra RAM and make a convenient four planes of 64K each?
 
640 x 480 was supported by CAD packages going back to the NEC APC. On screen it may only have been capable of 400 lines, but the hardware supported panning. When it became economical enough, games used that resolution. For the day, if you devoted enough memory to 320 x 200 screens you had photorealism. But clearly more pixels look better.
 
Technically, isn't a bit plane one bit deep? I'm not sure, just asking.

IBM monochrome VGA used only a single 64K block. 16-color VGA used all four 64K blocks, each holding a different color bit. In those cases the memory was effectively laid out as bit planes. The 256-color mode used 64K for 320x200, with one byte for each pixel.
 
So is "plane" synonymous with "screen", that is, whatever one screen consists of with however many colors that mode has? "Planes" or "bit planes" was a term used in EGA docs as well as Tandy 2000 docs, IIRC. It finally occurred to me that I'm not terribly clear on the usage. Thinking about it, a plane is a single color at whatever resolution. Best guess anyway.
 
A "bitplane" (outside of weird exceptions like mode X which have to do with coercing the hardware into doing things it wasn't designed to) is one bit deep, yes. Normal EGA and 16-color VGA modes have four bitplanes, which are four separate 1bpp bitmaps in memory that provide the four bits of each pixel required to provide a 16-color image. MCGA, VGA mode 13, and mode X (as well as 4-color CGA and Tandy/PCjr 16-color modes) use the other approach, "chunky" bitmaps in which all the bits for a single pixel are packed into a single block (a byte in the case of the 256-color modes, a nybble in the 16-color modes, etc.)
 
To me 640x400x256 would have been ideal since it would use the whole amount.
But that mode is rare.
It is supported in every VGA card but to use it you had to de-couple the 4 video planes which made pixel access non-sequential. Thus it was difficult to use.
Can you provide some example program supporting this mode on a vanilla VGA?
I've seen many 256-color xmodes: 320x400, 320x480, 360x480, 376x564, 400x564, 400x600, 320x240, and others.
But *never* 640x400.
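
For what it's worth, every tweaked 256-colour mode in that list addresses pixels the same "unchained" way; the differences are all in the CRTC timing and whether the memory can keep up. A rough putpixel sketch, assuming the mode is already set up, real-mode DOS, and a Borland-style <dos.h> for outportb()/MK_FP():

Code:
/* Unchained 256-colour putpixel: each host address covers four pixels,
   one per plane, and the plane is chosen through the Sequencer Map Mask
   register (index 02h at port 3C4h). */
#include <dos.h>

void unchained_putpixel(int x, int y, unsigned char colour, int width)
{
    unsigned char far *vga = (unsigned char far *)MK_FP(0xA000, 0);
    unsigned offset = (unsigned)y * (unsigned)(width / 4) + (unsigned)(x / 4);

    outportb(0x3C4, 0x02);          /* Sequencer index: Map Mask        */
    outportb(0x3C5, 1 << (x & 3));  /* enable only the plane for this x */
    vga[offset] = colour;
}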

As for the original question, it should be noted that 256KB wasn't a new feature of VGA; it was already present in EGA, at least optionally.
And the reason behind that option was "panning".
High-resolution monitors were expensive, but increasing the virtual resolution was relatively cheap.
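
The hardware side of that panning is just the CRTC start-address pair: keep a frame buffer wider or taller than the screen and move the visible window by reprogramming where the CRTC starts fetching. A sketch, again assuming a Borland-style <dos.h>; note the unit of the offset (byte/word/dword) depends on how the CRTC addressing is programmed, and fine horizontal panning would additionally use the Attribute Controller pel-panning register:

Code:
/* Coarse panning/scrolling: split a 16-bit display-start offset across
   CRTC registers 0Ch (high) and 0Dh (low) at ports 3D4h/3D5h. */
#include <dos.h>

void set_display_start(unsigned offset)
{
    outportb(0x3D4, 0x0C);                  /* Start Address High */
    outportb(0x3D5, (offset >> 8) & 0xFF);
    outportb(0x3D4, 0x0D);                  /* Start Address Low  */
    outportb(0x3D5, offset & 0xFF);
}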
 
The more odd question to me is why 720 x 480 for NTSC?

https://cardinalpeak.com/blog/the-math-behind-analog-video-resolution/

While NTSC is defined as 525 scan lines, there was a great deal of variability and interpretation. Not all CRTs were square and not all televisions had the same aperture settings, meaning there was a large variation in the amount of overscan. Storage media for NTSC also wasn't perfect (e.g. VCRs), so you often ended up with half or less of the original information being retained from a signal.

The 720x480 definition was usually associated with DVDs, but a good portion of the horizontal resolution wasn't usually visible on a CRT television.
 
I've seen many 256-color xmodes. But *never* 640x400.

I had a brain fart; 320x400 is the mode most familiar to me. As I also pointed out, 640x400 would not have left half the memory available for double buffering; you would see the updates on-screen with 256K, which isn't that useful for any mode.


While NTSC is defined as 525 scan lines... The 720x480 definition was usually associated with DVDs.

Yes, there were 525 scan lines, but 480 was the industry-accepted figure for what was available in the visible portion guaranteed under the overscan. By the '70s most TVs were converging on the relative position of scan lines with respect to the tube; otherwise CC, GCR, Nielsen rating info, Teletext, etc. would not be possible.
 
The missing piece of the puzzle here is memory bandwidth. If IBM had created a VGA card with 128kB (half as many DRAM chips) then the amount of data that can be transferred out of VRAM and onto the screen per scanline would also have been halved, meaning that you could have at most 360 horizontal pixels in 16 colours or 180 horizontal pixels in 256 colours, and at most 4 colours at the 720 horizontal pixel resolution.

This is also the reason why a standard IBM VGA card cannot do a 640x400x256 mode: there is enough VRAM for it, but the VGA chipset can't transfer data out of VRAM and pass it to the DACs quickly enough.
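
Rough numbers for that, treating it purely as bytes the display side must move per scanline (a back-of-the-envelope sketch that ignores refresh, CPU accesses and the exact fetch width):

Code:
/* Display fetch requirement per scanline, for a line time of
   800 dots at 25.175 MHz (about 31.8 microseconds). */
#include <stdio.h>

int main(void)
{
    double line_us = 800.0 / 25.175;   /* ~31.8 us per scanline */
    long b640x480x4 = 640L * 4 / 8;    /* 320 bytes per line    */
    long b640x400x8 = 640L * 8 / 8;    /* 640 bytes per line    */

    printf("640x480x16 : %ld bytes in %.1f us = about %.0f MB/s\n",
           b640x480x4, line_us, b640x480x4 / line_us);
    printf("640x400x256: %ld bytes in %.1f us = about %.0f MB/s\n",
           b640x400x8, line_us, b640x400x8 / line_us);
    /* Halve the number of DRAM chips and the sustainable fetch rate
       halves with them, which is the 128 KB problem described above. */
    return 0;
}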

Another advantage of the 640x480 resolution (and the 320x240 mode X resolution) is that it has square pixels with the 4:3 monitor aspect ratios typical of the time. This simplifies a number of programming tasks compared to earlier graphics standards. For example, the code to draw a circle is simpler and faster than the code to draw an ellipse, but if you don't have square pixels then you need to draw ellipses to get proper circles anyway, so you're stuck with the slower and more complicated code.
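
As a concrete example, the same plotting routine serves both cases if you carry a pixel-aspect factor: on 640x480 or 320x240 it is 1.0, while on 320x200 (whose pixels are about 1.2x taller than wide on a 4:3 tube) the vertical radius has to be squashed to roughly 5/6 to look round. A sketch, with putpixel() left as whatever the target mode provides:

Code:
/* Draw a visually round circle.  yscale = 1.0 on square-pixel modes,
   ~0.833 on 320x200.  putpixel() is assumed, not defined here. */
#include <math.h>

extern void putpixel(int x, int y, unsigned char colour);

void circle(int cx, int cy, int r, double yscale, unsigned char colour)
{
    double pi = 3.14159265358979;
    int steps = 8 * r;              /* dense enough to avoid gaps */
    int i;

    for (i = 0; i < steps; i++) {
        double t = 2.0 * pi * i / steps;
        putpixel(cx + (int)floor(r * cos(t) + 0.5),
                 cy + (int)floor(r * sin(t) * yscale + 0.5),
                 colour);
    }
}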
 
So 640x480 was a simple case of using the industry-standard 80-column (640-pixel) width while matching the 4:3 square pixels found on other systems.

I wonder why IBM didn’t offer a “3 color” mode instead of plain monochrome on MCGA.

3 colors can be driven with 3 bits for every 2 pixels (dropping one of the nine combinations) or 8 bits for every 5 pixels (pixel accurate), which comes to 60 KB of VRAM at 640x480.
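
The "8 bits for every 5 pixels" packing is just base-3: 3^5 = 243 combinations fit in a byte, and 640x480 / 5 = 61,440 bytes, i.e. the 60 KB figure. A sketch of the encode/decode, assuming pixel values 0..2:

Code:
/* Pack five 3-level pixels into one byte (base 3) and back again. */
unsigned char pack5(const unsigned char px[5])
{
    int i;
    unsigned char b = 0;
    for (i = 4; i >= 0; i--)
        b = b * 3 + px[i];     /* px[0] ends up least significant */
    return b;
}

void unpack5(unsigned char b, unsigned char px[5])
{
    int i;
    for (i = 0; i < 5; i++) {
        px[i] = b % 3;
        b /= 3;
    }
}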

Similarly, 640x480 should support 64 colors (6-bit depth) in less than 256K, albeit with a speed penalty.

Ah well
 
The missing piece of the puzzle here is memory bandwidth. If IBM had created a VGA card with 128kB (half as many DRAM chips) then the amount of data that can be transferred out of VRAM and onto the screen per scanline would also have been halved

This is also the reason why a standard IBM VGA card cannot do a 640x400x256 mode: there is enough VRAM for it, but the VGA chipset can't transfer data out of VRAM and pass it to the DACs quickly enough.

Another advantage of the 640x480 resolution (and the 320x240 mode X resolution) is that it has square pixels with the 4:3 monitor aspect ratios typical of the time.

I don’t think bandwidth within the VGA card was an issue; perhaps CPU bandwidth was, since even 150K pages were large for the time. But you can bank 16 bits into 128K of RAM just as easily as you can bank 8 bits with 256K chips.

I can point out several 128 KB VGA cards and one 160 KB VGA card, so it’s physically possible to build a VGA card with less memory.
 
640x480 at 16 colors is a standard VGA mode (12h) and requires 150K for one page. So any "VGA" card with 128K is not actually VGA. Maybe MCGA.
 