
Thread: Composite color artifacts: made by design or happy coincidence?

  1. #1

    Default Composite color artifacts: made by design or happy coincidence?

    Hi! I'm wondering whether the composite/NTSC TV color artifacts, such as those on the Apple ][ or the IBM CGA, were intended by the engineers, or whether they were just a happy coincidence and/or side effect that was cleverly taken advantage of by the programmers of that time.

    Thanks!

  2. #2

    Default

    Certainly intended on the Apple II.

    Not sure about CGA. I would say it was intended there as well, otherwise they would have just made the composite signal b/w from the outset, which would have greatly enhanced clarity and sharpness.

  3. #3

    Default

    Quote Originally Posted by Timo W. View Post
    Not sure about CGA. I would say it was intended there as well, otherwise they would have just made the composite signal b/w from the outset, which would have greatly enhanced clarity and sharpness.
    If you wanted sharp text, IBM would gladly sell you an MDA card and a matching monitor.

    Remember, there was no color monitor from IBM for the first two years of the PC's existence. They very much intended for you to connect it to your home TV through an RF modulator. Some of the early brochures showed a family using it in their living room connected to their TV.

  4. #4

    Default

    Quote Originally Posted by Timo W. View Post
    Certainly intended on the Apple II.

    Not sure about CGA. I would say it was intended there as well, otherwise they would have just made the composite signal b/w from the outset, which would have greatly enhanced clarity and sharpness.
    The 640x200 mode provided by the BIOS does have the color burst disabled on the composite output - you have to tweak the mode register to get the 15 colour composite mode that most composite games used. It's quite possible that IBM's engineers just never thought of using artifact colour. It's also possible that they did, and designed the hardware in such a way as to give it the flexibility to do things beyond the specification they had designed it to meet.
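
    For reference, it's a single extra port write after the BIOS mode set - something along these lines (an untested sketch, assuming a Borland-style 16-bit DOS compiler with dos.h's int86() and outportb(); port 3D8h is the CGA Mode Control Register, and if I remember right the BIOS leaves it at 1Eh for mode 6, with bit 2 being the "B/W" bit that suppresses the burst):

    Code:
    /* Enable artifact colour in CGA 640x200: set BIOS mode 6, then clear
       the B/W bit in the Mode Control Register so the colour burst comes
       back on and a composite monitor decodes the pixel patterns as colour.
       Sketch for a Borland-style 16-bit DOS compiler (Turbo C et al.).     */
    #include <dos.h>
    #include <conio.h>

    #define CGA_MODE_CTRL 0x3D8          /* Mode Control Register (write-only) */

    static void bios_set_mode(unsigned char mode)
    {
        union REGS r;
        r.h.ah = 0x00;                   /* INT 10h function 0: set video mode */
        r.h.al = mode;
        int86(0x10, &r, &r);
    }

    int main(void)
    {
        bios_set_mode(0x06);             /* 640x200; BIOS programs 1Eh into 3D8h  */
        outportb(CGA_MODE_CTRL, 0x1A);   /* 1Eh with bit 2 (B/W) cleared:
                                            colour burst re-enabled               */
        /* ...draw 1-bit pixel patterns here; pairs of pixels become colours...   */
        getch();                         /* wait for a key                        */
        bios_set_mode(0x03);             /* back to 80x25 text                    */
        return 0;
    }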

  5. #5

    Default

    It's *possible* in CGA's case that IBM never actually thought of it, and the fact that the hardware is capable of it is a happy accident resulting from sheer cheapness. (I.e., it's possible IBM could have used a different oscillator, instead of the colorburst-times-four 14.318 MHz clock, for the pixel clock, combined with a somewhat more sophisticated method of handling the color burst; that would have made the color text modes a little cleaner on color composite monitors. The pixel clock not being a multiple of the colorburst frequency is why, for instance, the Commodore 64 is fairly immune to generating useful artifact colors.) But somehow I doubt it, because...

    As has already been mentioned, the whole *point* of the Apple II was artifact colors. (It relies entirely on the pixel data to "implicitly" generate the colors; there is no separate system that generates color subcarrier pulses based on the contents of "color RAM" or whatever. The only "color control" is that the high bit of each graphics data byte in the high-res mode shifts a block of pixels over by half a clock, changing the phase and resulting in an alternate set of colors. This pixel shift serves absolutely no purpose when the machine is connected to a monochrome monitor, other than making the display look like it might be subtly broken.) By the time the PC came out, the technique was a well-known cheat to get more colors out of a system than you properly had the RAM for. It seems unlikely the designers of the CGA card were unaware of it, and considering the limitations (nay, the sheer crudeness) of CGA's "proper" color generation support, they may well have been hoping people would leverage it even if they didn't *officially* document it.
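
    If it helps to see the arithmetic behind that, here's a toy sketch (plain C; the byte layout is the real hi-res format, but which phase ends up reading as green, purple, orange or blue depends on the monitor's tint knob, so the hue names below are just the conventional ones, not gospel):

    Code:
    /* Toy model of Apple II hi-res artifact colour.  The 14.318 MHz dot
       clock is exactly 4x the 3.579545 MHz colour subcarrier, so a lone
       lit hi-res pixel (2 dots wide) starts on one of two subcarrier
       phases, and the bit-7 "half pixel" delay of one dot adds another
       90 degrees - that's the entire colour system.                      */
    #include <stdio.h>

    static const char *hue[4] = { "purple", "blue", "green", "orange" };

    static void show_hires_byte(unsigned byte_col, unsigned char b)
    {
        unsigned shift = (b & 0x80) ? 1 : 0;   /* bit 7: delay the dots by half a pixel */
        unsigned bit;

        for (bit = 0; bit < 7; bit++) {        /* bits 0..6 are pixels, LSB drawn first */
            if (b & (1u << bit)) {
                unsigned dot   = (byte_col * 7 + bit) * 2 + shift; /* 14 MHz dot position      */
                unsigned phase = dot & 3;                          /* place in subcarrier cycle */
                printf("pixel %u starts at phase %u -> %s-ish when isolated\n",
                       byte_col * 7 + bit, phase, hue[phase]);
            }
        }
    }

    int main(void)
    {
        show_hires_byte(0, 0x55);   /* isolated pixels 0,2,4,6 with bit 7 clear */
        show_hires_byte(0, 0xD5);   /* the same pixels with bit 7 set: shifted
                                       half a pixel, so the other colour pair   */
        return 0;
    }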

  6. #6

    Default

    Am I the only one who simply just despises composite video? Obviously the quality is shit - but I just hate the intrinsic concept of shoving all the lines' chroma data into a tiny little back porch.

    Sorry for going off topic, but god, CVBS is stupid and terrible.

  7. #7

    Default

    Quote Originally Posted by maxtherabbit View Post
    Am I the only one who simply just despises composite video? Obviously the quality is shit - but I just hate the intrinsic concept of shoving all the lines' chroma data into a tiny little back-porch
    But the chroma data isn’t shoved into the back porch; only the color burst lives there. (Which is the sync/phase reference for the chroma data overlaid/mixed on the luminance signal.)

    Sure, it’s not a perfect system, because of the requirement of backwards compatibility with monochrome sets and the bandwidth limitations that hold the effective color resolution to only a fraction of the luminance resolution, but it was realizable with 1950s technology and fits a complete video picture into a single wire/carrier wave. And because it *can* be abused with tricks like artifact colors, it let computers built in the 1970s with only tens of kilobytes of RAM display reasonable facsimiles of graphics that would otherwise have required hundreds or thousands of dollars’ worth of additional hardware.
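
    If it helps to picture what "overlaid/mixed on the luminance signal" actually means, here's a back-of-the-envelope sketch (plain C, idealized NTSC, arbitrary scaling rather than real IRE levels, and no sync or burst generation) of how one sample of active video gets encoded:

    Code:
    /* Idealized NTSC active video: the composite signal is just luma plus
       a 3.579545 MHz subcarrier whose amplitude carries saturation and
       whose phase (measured against the burst sent in the back porch)
       carries hue.  The scaling here is illustrative only.               */
    #include <math.h>
    #include <stdio.h>

    #define FSC 3579545.0                /* colour subcarrier frequency, Hz */
    static const double PI = 3.141592653589793;

    /* y: luma 0..1, sat: chroma amplitude 0..1, hue: radians vs. the burst */
    static double composite_sample(double y, double sat, double hue, double t)
    {
        return y + sat * sin(2.0 * PI * FSC * t + hue);
    }

    int main(void)
    {
        /* print one subcarrier cycle's worth of samples for a mid-grey
           pixel with a bit of colour on it                               */
        double t;
        for (t = 0.0; t < 1.0 / FSC; t += 1.0 / (8.0 * FSC))
            printf("%f\n", composite_sample(0.5, 0.3, 0.0, t));
        return 0;
    }

    Artifact colour is that equation read backwards: a luma-only pixel pattern that happens to repeat at ~3.58 MHz looks, to the decoder, exactly like y + sat*sin(...), so it comes out of the TV as colour even though nothing ever deliberately modulated any chroma.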

  8. #8

    Default

    Quote Originally Posted by Eudimorphodon View Post
    But the chroma data isn’t shoved into the back porch; only the color burst lives there. (Which is the sync/phase reference for the chroma data overlaid/mixed on the luminance signal.)
    Man, that just shows how much I hate it; I've never even looked at it on a scope, or I would have known that.

    So why does the decoder need a delay, then? It's always offset relative to Y/C or RGB.

  9. #9

    Default

    The details give me a headache, but the long and short of it is that the circuitry which has to trap out the narrow-bandwidth chroma and compare it to the reference phase imposes a significant delay that the luma portion doesn’t experience, so if you just piped the luma straight through without running it through a delay line, the color and video “data” would be seriously out of sync.

    Remember, generally speaking delay lines *delay*; they don’t change frequency. If the color information for a line were all crammed into the colorburst, then you’d need some kind of variable-speed memory device to “unpack” it. And, also, by definition “artifact color” would be a nonsensical concept, since the TV wouldn’t be looking for color information mixed into the luma.
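
    In code terms the luma path is nothing more exotic than this (a toy sketch; CHROMA_DELAY is an arbitrary stand-in for the group delay of a real decoder's chroma bandpass/demodulator, not a number from any datasheet):

    Code:
    /* Toy luma delay line: the chroma path's filtering/demodulation takes
       time, so the luma path is simply delayed by a matching number of
       samples.  Nothing is stored per line or "unpacked" - luma just
       comes out a fixed number of samples late.                          */
    #include <stdio.h>

    #define CHROMA_DELAY 8               /* illustrative, not a real figure */

    static double delay_luma(double y_in)
    {
        static double ring[CHROMA_DELAY];
        static unsigned idx = 0;

        double y_out = ring[idx];        /* the sample from CHROMA_DELAY calls ago */
        ring[idx] = y_in;                /* remember the new sample                */
        idx = (idx + 1) % CHROMA_DELAY;
        return y_out;
    }

    int main(void)
    {
        int i;
        for (i = 0; i < 12; i++)         /* feed a ramp through the delay */
            printf("in %d -> out %g\n", i, delay_luma((double)i));
        return 0;
    }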

  10. #10

    Default

    Well, now, in retrospect, after thinking about it and discussing it, my previous misconception about the color burst containing the chroma data really makes no sense at all. Such is the intensity of the passionate hatred I harbor for this format that it suspended my faculties of reason.
