VGA Mode 13h output

Someone told me that VGA cards (original VGA cards) did not output mode 13h in 320x200, but internally line-doubled it to 640x400 before sending it out to a monitor.

For whatever reason, I cannot remember this being a thing, and I'm having trouble finding a mention of it with a search. Can anyone here confirm/deny this statement?

Thanks,
 
Thanks!

This is a great point of reference, but it almost brings up more questions. 720x400 isn't a straight doubling of 320x200, so what's going on there? Also, that Trident card is from the early 1990s, so is it behaving the same as older VGA cards from the 1980s?
Is it that different cards would deal with the resolution in different ways?
 
You can't really go by what a monitor tells you about horizontal resolution; it can't count pixels, it just makes assumptions based on the scan rate.

The vertical resolution of an analogue video signal is quantal, horizontal is not. I suspect what you're getting is actually 320x400. But it doesn't really matter. What matters is the number of pixels that can be generated, not how they are realised.
 

Ok. Thanks. This is a lot closer to what I had forming in my head. I read this article: http://nerdlypleasures.blogspot.com/2013/10/320x200-resolution-of-choice-for-ibm-pc.html

In it, Great Hierophant states that: "200-line modes would be double scanned, with each pixel being double-clocked and each vertical line being repeated to fill up the refresh rate. This gives a different kind of scan-line structure compared with earlier monitors."

So, that would give 320x400 if the vertical lines are double-scanned. Then, I am assuming, the monitor is aspect-correcting the material to stretch it out horizontally on the screen.

Am I sort of correct here?
 
Not exactly - the monitor doesn't have to stretch or correct anything; most 'old-school' VGA monitors don't even have that ability. The dimensions of the screen depend only on the timing of the video signal (modulo whatever adjustment knobs the monitor may have). The VGA hardware takes care of that.

For each scanline, as long as the duration of the active/refresh periods is the same, it'll have a constant width - the monitor doesn't know or care how it's divided horizontally into pixels. Vertically it's slightly different, because scan rates have a narrow acceptable range, but again the card takes care of all timings (which determine the height of the picture). The double-scanning, I guess, was implemented in order to stay within that acceptable range without over-complicating the design.

Double-scanning can be (and often is) used for other resolutions too, e.g. *x240 is sent to the monitor as *x480.
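
For the register-curious: on standard VGA, the per-row repeat that implements this double scanning lives in the CRTC Maximum Scan Line register (index 09h). Here's a minimal real-mode DOS sketch that just reads it back, assuming a Borland-style compiler (outportb/inportb from dos.h) and a color adapter with the CRTC at 3D4h/3D5h; it's an illustration, not something tested on the cards discussed in this thread:

```c
/* Peek at the VGA CRTC Maximum Scan Line register (index 09h), whose
 * low five bits hold the per-row scanline repeat count behind the
 * double scanning described above.  Assumes a real-mode DOS compiler
 * with Borland-style port I/O (dos.h) and a color adapter (CRTC at
 * 3D4h/3D5h; mono adapters use 3B4h/3B5h instead). */
#include <dos.h>
#include <stdio.h>

#define CRTC_INDEX 0x3D4
#define CRTC_DATA  0x3D5

int main(void)
{
    unsigned char msl;

    outportb(CRTC_INDEX, 0x09);   /* select Maximum Scan Line register */
    msl = inportb(CRTC_DATA);

    /* Bits 0-4: each row of pixels is scanned (value + 1) times.
     * In mode 13h the BIOS typically programs this to 1, so every
     * 200-line frame leaves the card as 400 scanlines. */
    printf("Maximum Scan Line = 0x%02X -> each row scanned %u times\n",
           msl, (unsigned)(msl & 0x1F) + 1);
    return 0;
}
```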
 
Ok. So, the monitor just sees X number of vertical lines (in this case 400) and gets them in a duration defined by the refresh rate. It then equally divides the (400) lines horizontally over the entire length of the screen. Is that better?

thanks,
 
I found this video very instructive

Hey, thanks! That's a really great video, and I had a lot of fun watching it. :)
I suppose this is all stuff I was familiar with, though. My real question here is how the monitor handles displaying the pixels at full width horizontally when the output is only line-doubled vertically.
In other words, how does it know to stretch the image horizontally instead of pillarboxing the sides of the screen when the vertical pixels are doubled and not the horizontal pixels?

Wouldn't the horizontal sync pulse then need to happen twice as often? Assuming the same duration, it seems likely that the horizontal sync is being derived from the vertical sync pulses, and the monitor is simply refreshing twice as fast because it would be designed to fill the raster.
At least this is what I think is going on.
 
You're missing some analogue fundamentals. The monitor doesn't care in any way, and can't even know what the horizontal resolution is. Think like the guy who invented electronic television, Philo Farnsworth: imagine that you're plowing a field in a raster pattern. You dump paint as you plow to make an image. You don't stop the tractor to make a pixel, the tractor just keeps going at the same speed, in each row. How fast or slow you dump paint determines your resolution. The tractor doesn't do anything different.
 
I think what I was having trouble with is understanding how the sync pulses would change given the vertical double scan in 13h.

I guess, from what you're saying here, the monitor just waits twice as long to send the vertical refresh signal to account for the doubling of the lines, but the horizontal sync pulse stays the same?
 
If you mean a double-scanned mode (13h with 200 lines) compared to a non-double-scanned mode (say 80x25 text with 400 lines), then both the vertical and horizontal pulses are the same. The double scanning is done in the VGA hardware; in both cases the monitor receives 400 scanlines progressively during each frame. The only difference is that in mode 13h, each odd line is a duplicate of the preceding even line, but the monitor can't tell the difference (and has no reason to).
 
That more or less seems consistent with what I have in my head. If I had to choose a comparison, it would be 320x400 (13h) vs. 320x200 (which I think is 01h in 16 colors). The difference here is the length of time between vertical sync pulses (13h being twice the duration of 01h).
 
No. It's exactly the same. The monitor can't really "wait."

What VileR is saying is that the video card outputs identical video signals in 320x200 and 320x400 modes. It just sends the same information for each line twice in the 200-line mode.
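
Just to picture that, here's a purely conceptual C sketch of what the scan-out amounts to; framebuffer and emit_scanline are made-up names for illustration (the real hardware does this with counters, not code):

```c
/* Purely conceptual: what the card's scan-out amounts to in a
 * double-scanned 200-line mode.  The names below are made up for
 * illustration; real VGA hardware does this with counters, not code. */
#include <stdint.h>
#include <stdio.h>

#define SRC_ROWS 200                 /* rows stored in video memory (mode 13h) */
#define SRC_COLS 320                 /* pixels per stored row                  */

static uint8_t framebuffer[SRC_ROWS][SRC_COLS];   /* 64,000 bytes, as in 13h */

/* Stand-in for "send one scanline's worth of pixels to the DAC/monitor". */
static void emit_scanline(const uint8_t *row, int n)
{
    (void)row;
    printf("scanline of %d pixels sent\n", n);
}

static void scan_out_frame(void)
{
    /* 400 scanlines leave the card per frame; each stored row is sent
     * twice, so the signal timing is the same as any 400-line mode. */
    for (int out_line = 0; out_line < 2 * SRC_ROWS; out_line++)
        emit_scanline(framebuffer[out_line / 2], SRC_COLS);
}

int main(void)
{
    scan_out_frame();
    return 0;
}
```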
 
VileR's example was 13h vs. a 400-line text mode. In that case, it would clearly be 400 lines in both modes, with the same vertical refresh period.

My example is 13h (320x400) vs. 01h (320x200). Are you saying that the vertical refresh period would be the same for that comparison, or for any other pair of modes? For example, 320x200 vs. 640x480?
 
There is no such thing as a "320x200" for the display timing.
All 200-line modes must be double-scanned to form a 400-line mode sent to the monitor.
All 240-line modes must be double-scanned to form a 480-line mode sent to the monitor.
All 300-line modes must be double-scanned to form a 600-line mode sent to the monitor.

So there is no difference in timing between modes 01h and 13h.
Both modes are 320x200 pixels, sent as 320x400. Their timing is also identical to that of any 640x400 mode.

The timing for 320x240 is exactly the same as for 640x480. In both modes, a line contains 25.42 µs of visible data - it is up to you how many pixels to cram into that time. For a while, I used a small 9" cash register monitor with 1056x480 (same timing as 640x480; the monitor did not support any other frequency).
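
A quick worked version of that "cram pixels into the time" arithmetic; the 25.42 µs figure and the 640/1056 widths are taken from the post above, and the implied dot clocks are just division, not measured values:

```c
/* Worked example: how many pixels fit on a line is purely a matter of
 * how fast you clock them out during the same 25.42 us of active video.
 * The widths come from the discussion above; the implied dot clocks
 * are simple arithmetic, not measured values. */
#include <stdio.h>

int main(void)
{
    const double active_us = 25.42;          /* visible portion of one scanline */
    const int widths[] = { 320, 640, 1056 }; /* pixels crammed into that window */
    size_t i;

    for (i = 0; i < sizeof widths / sizeof widths[0]; i++) {
        double dot_clock_mhz = widths[i] / active_us;   /* Mpixels/s = MHz */
        printf("%4d pixels in %.2f us -> dot clock ~%.2f MHz\n",
               widths[i], active_us, dot_clock_mhz);
    }
    return 0;
}
```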
 
Svenska for the win.

(If anyone wonders if standard VGA can really output 600 lines, it can, but you need a multisync monitor. I used this trick in the 1990s to display 320x200 18-bit color images using dithering, with each source line represented by 3 lines to hold the red, green, and blue components.)
 
Awesome. Thanks for the excellent explanation. I was laboring under false assumptions.
I am still assuming, though, that the different modes (*x400, *x480, *x600) each have their own vertical refresh rates.
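
For what it's worth, that assumption matches the standard VGA numbers as I understand them: the 400- and 480-line timings share a roughly 31.47 kHz line rate but differ in total lines per frame (449 vs. 525), which is where the usual ~70 Hz and ~60 Hz vertical rates come from. A small sketch of that arithmetic (the 600-line case needs a multisync monitor, as noted above, so it's left out):

```c
/* Worked check of the refresh-rate assumption: the standard VGA 400-line
 * and 480-line timings share a ~31.469 kHz horizontal rate but differ in
 * total lines per frame (449 vs. 525), giving ~70 Hz and ~60 Hz vertical
 * rates.  The 600-line multisync case is omitted. */
#include <stdio.h>

int main(void)
{
    const double hsync_khz = 31.469;               /* horizontal line rate   */
    const int total_lines[] = { 449, 525 };        /* lines per frame, incl. blanking */
    const char *family[] = { "*x400 (incl. double-scanned *x200)",
                             "*x480 (incl. double-scanned *x240)" };
    size_t i;

    for (i = 0; i < 2; i++) {
        double vrefresh_hz = hsync_khz * 1000.0 / total_lines[i];
        printf("%-38s %d total lines -> %.2f Hz\n",
               family[i], total_lines[i], vrefresh_hz);
    }
    return 0;
}
```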
 