
Trying to get 256 color mode working in win98 (think I might need a driver?)

J. Radon

Experienced Member
Joined
Sep 17, 2018
Messages
87
Location
Troutdale, OR
I'm a bit confused with how colors work in windows 98SE.
In the settings tab under Display Properties it says it's set to 256 color mode and my screen area is 1024 by 768. The color gradient underneath it, however, looks like only 6 colors with dithering instead of a smooth blend. Then when I go into Paint and try to define custom colors, it's more of the same: dithering. I don't think it's actually displaying 256 colors, I think it's maybe displaying in 16 bit color? I don't really know how to tell what the actual color depth is, but whenever I try to set any particular custom color, it turns into grey.

It's particularly annoying because I wanted to play around with making some tiled pixel art for a desktop background, and I can't get a good shade of mint green.

I think it's a driver issue. I can't find a working driver online for my trident tgui9680-1 video card. I found one called w98-tgui.exe (can't remember where, I dug around a bit on a few sites), and based on the description it sounded like what I was looking for. However, when I load it onto a floppy and pop it into my win98 rig, I get an error when I try to run it or copy it to the desktop (I tried 3 different floppies, and verified the files copied right when I wrote them on my win10 machine).

I feel pretty dumb asking a lot of probably basic questions here; I hope I'm not too much of a bother. A few years ago when I signed up here I referred to my grandpa's pc as an "IBM clone", so clearly I didn't know what I was talking about. I'm learning a lot about vintage computers (especially vintage hardware), but a lot of stuff still eludes me, like why this thing has a power switch directly wired to the power supply; was that just standard at the time? It completely bypasses the motherboard. When I try to shut it down in the system it always hangs at the end and I have to manually power it off, but that's another completely unrelated thing; this topic is just about getting 256 color working.
 
"why this thing has a power switch directly wired to the power supply; was that just standard at the time? It completely bypasses the motherboard. When I try to shut it down in the system it always hangs at the end and I have to manually power it off"

Standard for PCs until about midway through the 486 era. PC and AT style power supplies have manual switches built into the power supply that completely cut the power. Late AT style and 'mini' AT power supplies have the switch on a remote cable. ATX power supplies are where they introduced 'soft' power on/off, which could be controlled by the motherboard. The power switch on those typically goes to the motherboard and requests that the motherboard turn on/off the power.
 
"this topic is just about getting 256 color working"

Do you know how much memory there is on the video card? I'm pretty sure 1024x768x256 requires 1 meg. Can it do 800x600x256?
<edit>

Just looked up the tgui9680, it's a PCI card with a minimum of 1 meg, which means both of the posts I wrote were pointless. Yes, it sounds like a driver issue, and yes if you have a PCI based system, it should properly shut down under operating system control - unless it's one of the early PCI motherboards that has an AT style power supply.
 
I'm a bit confused with how colors work in windows 98SE.
In the settings tab under Display Properties it says it's set to 256 color mode and my screen area is 1024 by 768. The color gradient underneath it, however, looks like only 6 colors with dithering instead of a smooth blend. Then when I go into Paint and try to define custom colors, it's more of the same: dithering. I don't think it's actually displaying 256 colors, I think it's maybe displaying in 16 bit color?

If you're getting a dithered color bar, you're in 256 color mode. 16 color mode will show 16 distinct color blocks with no dithering. 15/16 bit color (32768 or 65536 colors) will give you a RGB color ramp with vertical lines, and 32 bit (16,777,216 colors) will give you a smooth color ramp.

Since you're able to run at 1024x768@8bpp, it means that Windows 98 has some sort of driver for your video card. If you had no driver at all, Windows would be using its generic video driver, which locks you to 640x480 at 16 colors. You should be able to see what Windows thinks your video card is in the device manager.

But the behavior you're describing in Paint is to be expected because you only have 256 colors.

ATX power supplies are where they introduced 'soft' power on/off, which could be controlled by the motherboard. The power switch on those typically goes to the motherboard and requests that the motherboard turn on/off the power.

Incorrect. The power switch on ATX power supplies is a physical on/off switch. Depending on who designs the unit, it can either be a single or double pole switch. Cheaper supplies generally only use a single pole switch, while more expensive supplies will use double pole switches. The former will usually only cut the hot, while the latter will cut both the hot and the neutral.
 
Incorrect. The power switch on ATX power supplies is a physical on/off switch. Depending on who designs the unit, it can either be a single or double pole switch. Cheaper supplies generally only use a single pole switch, while more expensive supplies will use double pole switches. The former will usually only cut the hot, while the latter will cut both the hot and the neutral.
Not incorrect. You are wrong, as you just described AT, not ATX. What he said is 100% correct.
 
"this topic is just about getting 256 color working"

Do you know how much memory there is on the video card? I'm pretty sure 1024x768x256 requires 1 meg. Can it do 800x600x256?
<edit>

Just looked up the tgui9680, it's a PCI card with a minimum of 1 meg, which means both of the posts I wrote were pointless. Yes, it sounds like a driver issue, and yes if you have a PCI based system, it should properly shut down under operating system control - unless it's one of the early PCI motherboards that has an AT style power supply.

I believe it is an AT power supply based on your description. I'm only familiar with the modern standard of ATX.
It has two 6 pin connectors to supply the board power, and the supply has a wired switch that runs to the power button slot on the front of the case, with a ground wire that splits off and gets bolted onto the case. It was pretty weird to me, and I was worried about potentially having to replace it if anything went wrong, since I don't know anything about older PSU standards and was having trouble googling "power supply with switch" or similar terms. Everything just brought up modern ATX supplies and the shutoff switch on the back of the supply.

edit: I bring this up because I wonder if that's what's causing the odd shutdown behavior. When I try to shut it down from the start menu, it just hangs after what I assume is it ending any processes. I then end up having to press the power button. I'm careful never to use just the power switch to shut it down like I do with my main desktop, since I know with this machine the power button directly cuts power from the supply instead of signaling the board to start a shutdown process. Not having a power switch pin header on the board is just absolutely wild to me.
 
If you're getting a dithered color bar, you're in 256 color mode. 16 color mode will show 16 distinct color blocks with no dithering. 15/16 bit color (32768 or 65536 colors) will give you a RGB color ramp with vertical lines, and 32 bit (16,777,216 colors) will give you a smooth color ramp.

Since you're able to run at 1024x768@8bpp, it means that Windows 98 has some sort of driver for your video card. If you had no driver at all, Windows would be using its generic video driver, which locks you to 640x480 at 16 colors. You should be able to see what Windows thinks your video card is in the device manager.

But the behavior you're describing in Paint is to be expected because you only have 256 colors.



Incorrect. The power switch on ATX power supplies is a physical on/off switch. Depending on who designs the unit, it can either be a single or double pole switch. Cheaper supplies generally only use a single pole switch, while more expensive supplies will use double pole switches. The former will usually only cut the hot, while the latter will cut both the hot and the neutral.

Okay, weird. I figured there would be more actual colors in 256 color mode, but if having a dithered color bar is normal then I'm not sure what's up.
I installed the driver linked by kc8eyt, but it was an older Windows 95 driver and it didn't seem to change much. The driver Windows is defaulting to is "Trident 9685/9680/9682/9385/9382/9385-1 PCI", which it says is sourced from the original Windows 98SE installation files, so it recognizes that it's a Trident PCI card.
Maybe 32 bit color is what I'm looking for? I don't know though.

I think something is wrong, because it seems like there are more colors available when I lower the video resolution and set it to high color (16 bit) or true color (24 bit), but the color depth options seem to be locked to the video resolution. If I choose true color (24 bit) and start at a resolution of 640 by 480, then bump the slider up to 800 by 600, it swaps to high color (16 bit). From there, going back down keeps it at 16 bit color and doesn't revert to 24 bit color, but if I manually select 24 bit color from the dropdown, it sets the resolution to 640 by 480.
Bumping it up from either of those settings to position 3 on the slider (1024 by 768) changes it to 256 color mode. With 256 color mode selected it will let me choose any of the 3 set resolutions, but the colors look off, and with the 16 and 24 bit color modes, trying to increase the resolution forces the option back to 256 color mode.
 
Not incorrect. You are wrong, as you just described AT, not ATX. What he said is 100% correct.

I mean, he's not totally wrong (I think): there's a physical hard cutoff switch on the back of most if not all (that I know of) ATX power supplies, but it's different from the wired switch described on AT power supplies, which faces inside the machine and is a toggle button style switch instead of a rocker switch, so maybe that's where he's getting confused.

ATX supplies have both a physical rocker switch on the back that cuts power directly, and the ability to be turned on or shut down from the OS or from a button wired to the motherboard, through the 24 pin motherboard connection.

What eswan describes is also true: the power supply in my win98 rig (which I believe is an AT power supply) has no rocker switch on the back. The back features only a male and a female 3 pin mains power socket, and a voltage selector toggle. On the side facing in are all the wires, plus a switch wired out that goes to the front panel of the case, and an earth ground wire that terminates in a crimped washer style terminal that gets screwed into the case. Nowhere on the board is there a place to wire a power switch, so there's no way (that I know of) for the board to tell the power supply to shut down; there are just the two 6 pin cables from the supply, and I don't think either of them lets the board tell the supply to turn off.
 
If you're getting a dithered color bar, you're in 256 color mode. 16 color mode will show 16 distinct color blocks with no dithering. 15/16 bit color (32768 or 65536 colors) will give you a RGB color ramp with vertical lines, and 32 bit (16,777,216 colors) will give you a smooth color ramp.

Since you're able to run at 1024x768@8bpp, it means that Windows 98 has some sort of driver for your video card. If you had no driver at all, Windows would be using its generic video driver, which locks you to 640x480 at 16 colors. You should be able to see what Windows thinks your video card is in the device manager.

But the behavior you're describing in Paint is to be expected because you only have 256 colors.



Incorrect. The power switch on ATX power supplies is a physical on/off switch. Depending on who designs the unit, it can either be a single or double pole switch. Cheaper supplies generally only use a single pole switch, while more expensive supplies will use double pole switches. The former will usually only cut the hot, while the latter will cut both the hot and the neutral.

Wait... aren't 32 bit and 256 color mode the same thing?
 
No, 256 colors are 8 bit.

wait wh-

So 256 is the lowest of the 3 color settings I have to choose from?
If that's the case it'd explain a lot.

Is it a memory issue or something then?
It seems like the higher I set the resolution, the lower the color bit depth it lets me choose from (?)
 
Lowest is 16 colors normally. But depending on the driver, it may or may not offer that color depth in every screen resolution.

The graphics card needs enough video memory to store at least one full frame with the given resolution and color depth.

1024x768 @ 8 bits (256 colors) works with 1 MB
1024x768 @ 16 bits (64k colors) needs at least 2 MB
1024x768 @ 24/32 bits (true RGB, 16.7 million colors) needs at least 3 MB (that is 4 MB, as 3 MB is not an option)
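Timo's numbers above can be sanity-checked with a quick sketch (Python; the helper names are mine, and I'm assuming cards come in power-of-two sizes, which is the usual case):

```python
# Framebuffer size for a given mode, and the smallest common
# (power-of-two) card size that can hold one full frame.

def frame_kb(width, height, bpp):
    """KB needed to store one full frame."""
    return width * height * bpp // 8 // 1024

def card_mb(width, height, bpp):
    """Smallest power-of-two card size (MB) that fits the frame."""
    mb = 1
    while mb * 1024 < frame_kb(width, height, bpp):
        mb *= 2
    return mb

for bpp in (8, 16, 24, 32):
    print(f"1024x768 @ {bpp:2} bpp: {frame_kb(1024, 768, bpp):4} KB "
          f"-> {card_mb(1024, 768, bpp)} MB card")
```

Which reproduces the list: 768 KB fits in 1 MB, 1536 KB needs 2 MB, and both 24 and 32 bpp round up to a 4 MB card.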
 
Wow, I feel extra stupid.
I should have realized 256 was the lowest color setting of the group. It's even listed in ascending order!
So it seems like the best options I can currently get are 24 bit color @ 640 by 480 px or 16 bit color @ 800 by 600 px depending on if I want to get the most out of colors or resolution.

What's limiting me from higher bit color at higher resolutions? Is it a driver thing, or is it my video card, or is it something else?
For now I'll probably set it to 16 bit color at 800 by 600 px since it seems like the best compromise between a good color range and high resolution.

Is my video card also what's limiting my resolution selection? I only have 3 fixed resolution options.
 
Lowest is 16 colors normally. But depending on the driver, it may or may not offer that color depth in every screen resolution.

The graphics card needs enough video memory to store at least one full frame with the given resolution and color depth.

1024x768 @ 8 bits (256 colors) works with 1 MB
1024x768 @ 16 bits (64k colors) needs at least 2 MB
1024x768 @ 24/32 bits (true RGB, 16.7 million colors) needs at least 3 MB (that is 4 MB, as 3 MB is not an option)

Oh shoot, so it sounds like it's my video card that's the limit then?

To be honest, I'm not too worried about it.
800*600 at 16 bit should be plenty for now, and now I know what to upgrade if I want better output in the future.

I just feel stupid for assuming 256 was the highest just because it's the biggest number XD

Thank you for educating me.
 
If the video card has empty ram sockets, you should be able to bring it up to 2 meg at least. Ought to be 2 pieces of 256k x 16 EDO SOJ.
 
If the video card has empty ram sockets, you should be able to bring it up to 2 meg at least. Ought to be 2 pieces of 256k x 16 EDO SOJ.

Oh yeah, look at that!
I opened it up and sure enough it has two empty sockets on it.
That's wild.

I assume it wasn't too different from buying RAM or any other memory component back in the day then?
I know the BIOS is also on a removable flash chip, but it's still kinda crazy to me seeing this sort of thing.
The only experience I have with socket based memory chips is from a project I did years ago where I got a Willem serial port flash programmer to make a homemade Game Boy flash cart.
I still have a ton of PLCC 32 pin flash chips lying around.

What kind of chips do I need to fill the slots out? I'll do some googling, but I assume they probably use some sort of common standard chip from the time?

edit: I'm stupid, you literally listed the specs of the chips. "256k x 16 EDO SOJ"
 
If you add RAM, be careful to get the orientation right. The chips will have a dimple in one corner that lines up with an arrow on the socket. They can fit backwards, and will get really hot really quickly and likely self destruct. Other than that, they just snap in easily.
 
If you add RAM, be careful to get the orientation right. The chips will have a dimple in one corner that lines up with an arrow on the socket. They can fit backwards, and will get really hot really quickly and likely self destruct. Other than that, they just snap in easily.

I already knew about that, but I'm glad you pointed it out!
It's something really easy to mess up if you aren't paying attention.

I worked as a soldering technician for a while for a small company that made prototype cameras on commission for clients looking to mass produce.

A lot of my work essentially boiled down to following board schematics and cross referencing them with the build order to populate components on small batch PCBs.

Some of the components I had to solder were BGAs, and because we didn't have an X-ray machine, I was essentially hand soldering them blind with a heat gun and hoping they worked.

Needless to say, I prefer socketed chips and chips with exposed legs over BGAs any day. Nothing is worse than realizing you might've killed a $20 chip when you only have maybe 5 to work with and need at least one working board by the end of the day :c
 
I'm going to expand on something Timo wrote. What he wrote is correct (except 3 MB is an option, just not a commonly used one - the SuperMac Spectrum/24, for example, has a 3 MB framebuffer), but you're not the first person here in whom I've seen some confusion about this kind of thing, so I'll try to spell it out in greater detail.

The video card needs to have memory to hold the pixels that are drawn on the screen. There are a lot of arrangements here; the simplest is the "linear framebuffer", which is essentially a region of memory that directly maps to the pixels on the screen. On a modern GPU-driven graphics card, this is a tiny fraction of the total available graphics memory. In the era you're looking at, basically the entire memory on the card is the framebuffer.

The amount of memory needed for the framebuffer is a product of the number of pixels, and the number of bits needed to describe the color space.

A black and white display needs one bit per pixel (it's either on, or it's off) -- you can fit eight pixels in one byte. So the amount of memory needed is the number of pixels, divided by eight. 1024*768*0.125 = 96 KB. Typically what you'd find in that case is a 128 KB framebuffer, as this simplifies the design, and the extra 32 KB becomes available as "off screen memory", which takes advantage of the fact that it's often faster or more efficient to copy memory between different parts of the frame buffer than to copy it between main memory and the frame buffer. So pre-rendered font glyphs, icons, or whatever else can be drawn there, and then copied as needed into the part of the frame buffer that is visible on screen. Your modern GPU-based graphics cards still do exactly this, but on a much grander scale.

With 4 bits per pixel (which can produce 2^4 colors, or 16), you can fit two pixels in one byte. So the amount of memory needed is the number of pixels, divided by 2. 1024*768*0.5 = 384 KB. Again, what you'll typically find is a 512 KB frame buffer, with 128 KB offscreen.

With 8 bits per pixel (2^8 colors, or 256), one pixel fits in one byte. 1024*768 = 768 KB. Typically, a 1024 KB framebuffer with 256 KB offscreen.
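The 1, 4, and 8 bpp figures above, plus the leftover off-screen memory, can be reproduced in a few lines (a sketch in Python; the function name is mine):

```python
# Frame size in KB at sub-byte and one-byte-per-pixel depths, and the
# off-screen memory left over on the typical card sizes from the post.

def frame_kb(width, height, bpp):
    return width * height * bpp // 8 // 1024

# (card KB, bpp) pairs: 128 KB for 1 bpp mono, 512 KB for 4 bpp,
# 1024 KB for 8 bpp
for card_kb, bpp in ((128, 1), (512, 4), (1024, 8)):
    used = frame_kb(1024, 768, bpp)
    print(f"{bpp} bpp: {used} KB on screen, {card_kb - used} KB off screen")
```

That gives 96/32, 384/128, and 768/256 KB on/off screen, matching the walkthrough.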

This is where the DAC starts to matter, and it's not just a question of the amount of framebuffer memory. Analog display adapters (such as VGA) use a DAC to convert the digital contents of the framebuffer memory into an analog color signal the monitor can use.

1. The DAC defines how much total color fidelity you get - some early 256 color displays like the IBM PGC have a 12 bit DAC, which means it's possible to put 2^12 (4096) different colors on the display. You get to decide which 256 out of the total 4096, by programming the palette registers (a framebuffer memory cell is used to look up one of 256 12 bit color values from a lookup table (LUT), rather than describing the displayed color directly). VGA uses an 18 bit DAC, for 2^18 (262,144) possible colors, with the same kind of LUT arrangement. One of the "stupid tricks" of the VGA is to produce animations by leaving the image in the frame buffer alone, and just changing the color assignments in the LUT.

2. The DAC also has a maximum bandwidth, which defines how fast pixels can be blasted out to the display. At a vertical refresh rate of 60 Hz, the DAC has 1/60th of a second (actually, a little less) to read all the displayable pixels out of the framebuffer. The calculations here are little more complex, but the short story is that this is why things happen like, some Matrox Millennium 2s with an 8 MB framebuffer, which is big enough to accommodate 1600x1200x32bpp, can't actually use that mode because the 220 MHz DAC is short of what's required. Some of them have 250 MHz DACs, though, which is enough, and the mode works on those cards.
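The DAC-bandwidth point can be ballparked. The sketch below assumes roughly 35% blanking overhead and an 85 Hz refresh (my numbers, not from the post), which happens to land right around the Millennium II example:

```python
# Required DAC pixel rate: active pixels per second, inflated by the
# horizontal/vertical blanking intervals (~35% is a rough assumption).

def pixel_clock_mhz(width, height, refresh_hz, blanking=1.35):
    return width * height * refresh_hz * blanking / 1e6

needed = pixel_clock_mhz(1600, 1200, 85)
print(f"1600x1200 @ 85 Hz needs ~{needed:.0f} MHz")
print("220 MHz DAC:", "ok" if 220 >= needed else "too slow")
print("250 MHz DAC:", "ok" if 250 >= needed else "too slow")
```

Under these assumptions the mode needs just over 220 MHz, so a 220 MHz DAC falls short while a 250 MHz one clears it.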

The LUT ("pseudo-color") arrangement for 8bpp and 12- or 18-bit DACs is no accident, if you consider both how it saves framebuffer memory, and the disadvantages of trying to represent an RGB color value directly using 8 bits. There's no way to split 8 bits evenly between each of red, green, and blue. So you end up having to decide: which gets short shrift with only two bits of fidelity? R (RRGGGBBB), G (RRRGGBBB), or B (RRRGGGBB)? Blue seems reasonable as the human eye is least sensitive to it, but whichever you choose, it makes for some quite obvious defects, e.g. trying to display shades of gray. Or do you restrict yourself to two bits for each (RRGGBB), which is evenly distributed but gives you only 2^6 (64) colors, and wastes the extra two bits? With the LUT, you are just referencing a value whose bit count does divide evenly by three (12 bits: 4 each for R, G, and B; or 18 bits: 6 each).
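A tiny simulation of the pseudo-color arrangement (Python; names are mine, and the last step is the palette-swap animation trick mentioned above):

```python
# 8 bpp pseudo-color: the framebuffer holds palette indices; the DAC
# looks each index up in a LUT of (r, g, b) values. On VGA each channel
# is 6 bits (0-63), for 2^18 possible colors.

palette = [(0, 0, 0)] * 256
palette[1] = (0, 63, 0)              # index 1 -> pure green

framebuffer = bytes([0, 1, 1, 0])    # four pixels, one byte each

def scanout(fb, lut):
    """What the DAC hands to the monitor: one LUT lookup per pixel."""
    return [lut[i] for i in fb]

print(scanout(framebuffer, palette))

# The "stupid trick": change what's on screen without touching the
# framebuffer at all, just by rewriting one palette entry.
palette[1] = (63, 0, 0)              # the same indices now show red
print(scanout(framebuffer, palette))
```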

That said:

With 16 bits per pixel (2^16 colors, or 65536), one pixel fits in two bytes. 1024*768*2 = 1536 KB. But there is no LUT in this mode; the frame buffer memory location actually holds the color information directly ("direct-color"). So there is one extra bit of fidelity for one of R, G, or B. Most implementations gave it to green (R5 G6 B5, or RRRRRGGGGGGBBBBB), for human color perception reasons, but not all did. 15bpp doesn't have this problem, which gives it some advantages despite the lower total color fidelity. Since most memory bus arrangements are in multiples of 8 bits, actually implementing a 15 bit wide memory for a 15bpp display was not something that happened often (if ever at all) - they were pretty much all done in 16 bits with one bit "wasted". There's no LUT because there's no benefit: 16 bits of palette index uses the same amount of memory as 16 bits of color information, and 15 bits of color information is a pretty reasonable compromise, perception-wise.
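The R5 G6 B5 packing is easy to show concretely (a sketch; the helper names are mine):

```python
# 16 bpp direct color, R5 G6 B5: red and blue get 5 bits (0-31),
# green gets the spare bit (0-63).

def pack_rgb565(r, g, b):
    return (r << 11) | (g << 5) | b

def unpack_rgb565(pixel):
    return (pixel >> 11) & 0x1F, (pixel >> 5) & 0x3F, pixel & 0x1F

white = pack_rgb565(31, 63, 31)
print(hex(white))                  # all bits set
print(unpack_rgb565(0x07E0))       # the middle 6 bits: pure green
```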

24 bits per pixel (2^24 colors, or 16,777,216): one pixel fits in three bytes (24 bits / 8 bits per byte). 1024*768*3 = 2304 KB (so you can see that 2 MB, or 2048 KB, is not enough). It gets a little awkward, because while some display adapters (like the Spectrum/24 I mentioned earlier) have their memory arranged in a way that makes it reasonable to have a 24 bit wide framebuffer, most boards had a 16- or 32- (or even 64- or more) bit wide framebuffer memory bus, which makes trying to store a 24-bit value either wasteful (of one byte per pixel) or inefficient (by "packing" four 24 bit pixels into three 32 bit memory words - 96 bits - known as "packed pixel"). The downside of packed pixel mode is that if you want to manipulate a single pixel, you have to deal with four of them at a time, and a given 32 bit word may contain parts of two different pixels. So there's computational overhead.

32 bits per pixel makes the framebuffer design a lot more straightforward. One pixel fits in four bytes. 1024*768*4 = 3072 KB = 3.0 MB. You still get 2^24 colors, with the extra 8 bits either "wasted", or useful for other purposes (most commonly, an alpha channel, used to describe 2^8 (256) levels of transparency). Because of the overheads of packed pixel mode, a number of drivers offer "24 bit" modes which actually use 32 bits per pixel of framebuffer memory. This becomes noticeable with modes like, e.g. 1280*1024, which fits in a 4 MB framebuffer (3840 KB) using packed pixel, but not without (5120 KB). This is another reason, besides the DAC bandwidth, that a mode you might expect to be available... isn't.
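The 1280x1024 example works out like this (quick arithmetic in Python; the 4 MB card size is the one from the paragraph above):

```python
# Framebuffer KB for packed 24 bpp (3 bytes/pixel) versus padded
# 32 bpp (4 bytes/pixel, one byte wasted or used as alpha).

def packed24_kb(width, height):
    return width * height * 3 // 1024

def padded32_kb(width, height):
    return width * height * 4 // 1024

w, h = 1280, 1024
print(f"packed 24 bpp: {packed24_kb(w, h)} KB")   # fits in 4 MB (4096 KB)
print(f"padded 32 bpp: {padded32_kb(w, h)} KB")   # does not fit
```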


Probably people who are domain experts at this stuff are going to find plenty to nitpick in what I wrote. But I think it should do as a beginner's primer.
 