
Thread: IBM/C128 CGA conversation, capture, and VGA display success stories

  1. #141


    Feel free to make one!

    My first issue with HDMI is that most, if not all, of the monitors have the wrong aspect ratio. Most old VGA monitors are 4:3. That might not seem like a big thing, but it annoys some people.
    My second issue with HDMI is the licence: you either buy an expensive licensed chip or pay the fee yourself. Almost ALL the HDMI encoders are massive, horrible chips with tiny, impossible-to-solder pins, and they require a microcontroller to program them before they work. They also mostly don't have inputs that take anything remotely useful for the task at hand.

    To make a CGA-to-HDMI converter you need to convert from digital RGBI to analogue RGBs, capture that to digital YUV 4:2:2 (another big, horrible-to-solder chip), then send that to an HDMI encoder chip. It's not a fun thing to even think about designing, and it's certainly out of my reach. The other option is to learn how to program an FPGA, capture the RGBI directly, and encode your own HDMI output. You do, of course, come up against the issues of voltage (1.8 V I/O, anybody?), pin count (144 pins? Have fun soldering that), and just learning how to program it.

    So, long story short: this S*** is hard.

    I started out with a basic RGBI-to-RGBs DAC. I then upgraded to a multi-output DAC with S-Video as an option. Now I am making a complex scandoubler device.
    In that time, how many CGA video converters have been developed by anybody else? None. (Two other basic DAC boards appeared about the same time as mine, and you guys tried them, but that is it as far as I know.)

  2. #142
    Join Date: Mar 2011
    Location: Atlanta, GA, USA
    Posts: 1,477


    Quote Originally Posted by Pyrofer
    My first issue with HDMI is that most, if not all, of the monitors have the wrong aspect ratio. Most old VGA monitors are 4:3. That might not seem like a big thing, but it annoys some people.
    That's pretty subjective. How can it be 'wrong' if it was done by design, including consideration of 4:3? 16:9 was a compromise between 4:3 and Panavision CinemaScope (2.35:1), producing similarly ratioed pillar-box vs. letter-box bars relative to the content when viewing either format. Maybe you are too young to remember "This film has been edited to fit your television screen" or the concept of Pan & Scan. One can certainly fault the creators for not anticipating the rate of demise of 4:3 content in subsequent years, aside from the rather awesome Knight Rider box set. But CinemaScope isn't fading in popularity for live-action features, and most animated features have made the jump to 16:9. I certainly enjoy watching them on 16:9 with smaller letter-boxing, or none at all, compared to a 4:3 tube. Just because it isn't what we formerly used doesn't make it wrong. The 16:9 creators would tell you the same thing, then and now, about how to handle VGA: pillar-box the 4:3 signal in the center of the 16:9 screen. Most retro hobbyists would be happy with that too. But...

    But that's only half the issue; pixel aspect is the other. At least the newer wide-ratio formats have an arguably 'not-wrong' pixel aspect ratio of 1:1. The types of 4:3 signals we are discussing here (digital standards) have rectangular pixels that vary in weird ratios by video mode. For example, modern NTSC's 720x480 is 1.125:1, MDA/Hercules mode 0's 720x350 is 1.54:1, CGA mode 1's 320x200 is 1.2:1, EGA mode 2's 640x350 is 1.37:1, etc. Nothing was consistent back then. Now that is 'wrong' in my opinion. Only standard VGA's 640x480 and the later VESA resolutions (800x600, 1024x768, ...) are 1:1. All 16:9 formats are sane square pixels as well. So even if you had a 4:3 HDMI display, chances are it uses a native square-pixel resolution, and you would have to perform pixel-aspect scaling to compensate for the oddness of anything pre-VGA.
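    The ratios quoted above all follow from one calculation: how far the mode's storage aspect (width/height in pixels) deviates from the 4:3 screen it was designed to fill. A quick sketch, using that convention:

    ```python
    # Pixel-aspect oddness of a 4:3-era video mode, expressed (as in the
    # figures above) as storage aspect divided by the 4:3 display aspect.
    DISPLAY_ASPECT = 4 / 3

    def pixel_aspect(width, height):
        """How much wider than tall the stored frame is relative to 4:3.
        1.0 means square pixels; >1.0 means pixels must be squeezed."""
        return (width / height) / DISPLAY_ASPECT

    modes = {
        "NTSC 720x480":      (720, 480),  # 1.125
        "MDA/Herc 720x350":  (720, 350),  # ~1.54
        "CGA 320x200":       (320, 200),  # 1.2
        "EGA 640x350":       (640, 350),  # ~1.37
        "VGA 640x480":       (640, 480),  # 1.0 (square pixels)
    }
    for name, (w, h) in modes.items():
        print(f"{name}: {pixel_aspect(w, h):.3f}")
    ```

    Running it reproduces the 1.125, 1.54, 1.2, and 1.37 figures, and confirms VGA 640x480 is the first square-pixel mode in the list.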

    Quote Originally Posted by Pyrofer
    My second issue with HDMI is the licence: you either buy an expensive licensed chip or pay the fee yourself. Almost ALL the HDMI encoders are massive, horrible chips with tiny, impossible-to-solder pins, and they require a microcontroller to program them before they work. They also mostly don't have inputs that take anything remotely useful for the task at hand.
    The license is paid by the chip and connector manufacturers. Yes, if you want to output a compliant TMDS signal from a device like an FPGA and call it HDMI, you would have to pay a license to put the HDMI logo on your box or advertising material. But I don't believe HDMI LLC or its member companies are going to spend a lot of legal effort coming after an RGBI-to-HDMI hobby project, and only then if that project isn't using a licensed chip.

    Expensive and complex are relative too. TMDS is a 10:1 dot-clock shift at a minimum, and HDMI requires a minimum dot clock of around 50 MHz for digital insertion of blanking intervals, so that is a TMDS rate of >500 MHz. Obviously that's difficult to achieve in a leaded part like a DIP or PLCC. You can buy a TFP410 off Digi-Key in a QFP64 package for around $6. And before you start moaning about the differences between DVI and HDMI, let me say they are both completely forward- and backward-compatible, electrically and in protocol. A TFP410 will output a perfectly compliant HDMI 1.0 signal that all current TVs will sync to, as long as the timing conforms to a standard defined by CEA-861. I know, because I've done it in several projects.
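    The 10:1 figure is worth spelling out, since it's what pushes the link speed out of hobbyist-solderable territory. TMDS encodes each 8-bit channel as a 10-bit symbol, so every data lane toggles at ten times the pixel clock:

    ```python
    def tmds_bit_rate_mhz(pixel_clock_mhz):
        """Per-lane TMDS bit rate: TMDS 8b/10b-style encoding sends one
        10-bit symbol per pixel per lane, i.e. 10x the dot clock."""
        return pixel_clock_mhz * 10

    # The ~50 MHz minimum dot clock mentioned above -> >500 Mbit/s per lane.
    print(tmds_bit_rate_mhz(50))      # 500
    # Classic VGA 640x480@60 (a CEA-861 mode) uses a 25.175 MHz dot clock:
    print(tmds_bit_rate_mhz(25.175))  # 251.75
    ```

    Even the slowest legal mode is hundreds of megabits per lane, which is why leaded DIP/PLCC packages are out and QFP parts like the TFP410 are about as friendly as it gets.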

    Quote Originally Posted by Pyrofer
    To make a CGA to HDMI out converter you need to convert from digital RGBI to analogue RGBs...
    WTF for? Just use a 16-entry (or 64-entry for EGA) color lookup table. It's all linear bit extension, except for accounting for the CGA 'brown'. No analog needed. All you need to capture MDA/Herc/CGA/EGA is a '245 buffer and some load resistors.
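    The whole table is small enough to sketch. A minimal Python model of that lookup, assuming the common IRGB bit ordering (which in practice depends on how you wire the connector to your buffer):

    ```python
    def cga_palette():
        """16-entry RGBI -> 24-bit RGB lookup table (IRGB bit order),
        all linear bit extension plus the classic 'brown' exception."""
        lut = []
        for n in range(16):
            i = (n >> 3) & 1          # intensity bit
            r = (n >> 2) & 1
            g = (n >> 1) & 1
            b = n & 1
            # Linear extension: 0xAA for the colour bit, +0x55 for intensity.
            rgb = tuple(0xAA * c + 0x55 * i for c in (r, g, b))
            if n == 6:                # dark yellow -> brown: halve green
                rgb = (0xAA, 0x55, 0x00)
            lut.append(rgb)
        return lut

    lut = cga_palette()
    print(lut[6])    # (170, 85, 0)     brown, not dark yellow
    print(lut[15])   # (255, 255, 255)  high-intensity white
    ```

    In hardware the same mapping is just a 16x24-bit ROM (or a handful of LUT entries in a CPLD); the only entry that isn't pure bit extension is number 6.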

    Quote Originally Posted by Pyrofer
    ...then capture that to digital YUV 4:2:2 (another big, horrible-to-solder chip), then send that to an HDMI encoder chip. It's not a fun thing to even think about designing, and it's certainly out of my reach. The other option is to learn how to program an FPGA, capture the RGBI directly, and encode your own HDMI output. You do, of course, come up against the issues of voltage (1.8 V I/O, anybody?), pin count (144 pins? Have fun soldering that), and just learning how to program it.
    You never have to go to YCbCr at all. Every HDMI sink must accept RGB 4:4:4 as a baseline; YCbCr 4:4:4 and 4:2:0 are optional. The TI TFP410 I referenced above can be run completely configuration-less, with no I2C programming, and assumes 24-bit RGB input. You can go from 2-bit MDA, 4-bit CGA, or 6-bit EGA to correct 24-bit RGB with a ROM. In the configuration-less mode, the TFP410 will not send AVI InfoFrames, and you can't query the sink's EDID resolution capabilities for negotiation. But most TVs and monitors these days have scalers that will eat anything, even VESA modes, as long as it's a CEA-861 resolution. The TFP410 is a 3.3 V part too, so no need for another voltage rail.
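    The EGA case is the same idea with two bits per channel. A sketch of the 64-entry ROM contents, assuming the common rgbRGB ordering (bits 5..3 the secondary/low-intensity r,g,b, bits 2..0 the primary R,G,B; the actual pin order depends on your wiring):

    ```python
    def ega_palette():
        """64-entry EGA -> 24-bit RGB ROM contents. Each channel is
        0xAA * primary_bit + 0x55 * secondary_bit (pure bit extension)."""
        lut = []
        for n in range(64):
            chans = []
            for shift in (2, 1, 0):           # R, G, B
                hi = (n >> shift) & 1         # primary bit (bits 2..0)
                lo = (n >> (shift + 3)) & 1   # secondary bit (bits 5..3)
                chans.append(0xAA * hi + 0x55 * lo)
            lut.append(tuple(chans))
        return lut

    rom = ega_palette()
    # Unlike CGA, EGA has no special case: brown is simply
    # primary red + secondary green.
    print(rom[0x14])   # (170, 85, 0)
    print(rom[0x3F])   # (255, 255, 255)
    ```

    Burn those 64 words into a small ROM (or a block RAM in a CPLD/FPGA) and the TFP410's 24-bit parallel input gets correct color with zero analog stages.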

    Quote Originally Posted by Pyrofer
    So, long story short: this S*** is hard.

    I started out with a basic RGBI-to-RGBs DAC. I then upgraded to a multi-output DAC with S-Video as an option. Now I am making a complex scandoubler device.
    In that time, how many CGA video converters have been developed by anybody else? None. (Two other basic DAC boards appeared about the same time as mine, and you guys tried them, but that is it as far as I know.)
    An FPGA with some RAM is the correct tool for this. Just hurdling the pixel-aspect-ratio conversion problem requires a scaler, not just a doubler.
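    To see why doubling alone isn't enough: correcting a non-square pixel aspect means resampling each scanline by a non-integer factor. A minimal nearest-neighbour sketch of that per-line resample (an FPGA would do the same with a line buffer and a fractional step accumulator):

    ```python
    def scale_line(pixels, pixel_aspect):
        """Nearest-neighbour resample of one scanline so a mode with the
        given (non-square) pixel aspect displays correctly on square
        pixels. pixel_aspect > 1 widens the line."""
        out_w = round(len(pixels) * pixel_aspect)
        return [pixels[int(x / pixel_aspect)] for x in range(out_w)]

    # CGA 320x200 has a 1.2:1 pixel aspect: each 320-pixel line
    # becomes 384 output pixels, a 6:5 ratio no doubler can produce.
    line = list(range(320))
    print(len(scale_line(line, 1.2)))   # 384
    ```

    A doubler only ever emits 2x the input, so the fractional 6:5 (or 720x350's ~1.54) step is exactly the part that forces a real scaler with line storage.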

    I started a project a couple of years ago to do RGBI2USB. I realized no purist would be happy with any scaler, period. So the plan was to capture the raw frames to a PC host and let the user do whatever they wanted later in post-processing. I even wrote a Qt GUI app that would show the capture, with or without scaler correction, in real time on your desktop. But the MCE2VGA and other similar projects stole its thunder and reduced demand. I told Trixter I would try to make another, more compact prototype before VCF-MW, but I'm not going to get it done in time. I've had a lot of paid, higher-priority contract work lately.

    But if someone is willing to take up the torch..... hint hint.
    "Good engineers keep thick authoritative books on their shelf. Not for their own reference, but to throw at people who ask stupid questions; hoping a small fragment of knowledge will osmotically transfer with each cranial impact." - Me

  3. #143


    By "wrong" I didn't mean to impugn 16:9 as a standard, just that the output I am working with was designed for 4:3 monitors. Stretching it looks horrid, and black bars are wasteful.

    There is no existing chip I know of that can take in RGBI (in its digital form) and output HDMI. I don't even know of any that do analogue RGB direct to HDMI. This is assuming a purchasable finished IC, not an FPGA you have to program yourself. If you know of chips that do this, please let me know. I am not able at this point in time to design and program an FPGA solution. I do, however, get what you mean about converting the digital RGBI to an input the HDMI chip can take. That IS an interesting thought. I will take a long, hard look at the chip you mention too, the TI TFP410, as it appears to be almost solderable by me, and the level conversion isn't a total nightmare with so few inputs. Remember, I am working with 5 V on the CGA side, so even a 3.3 V part is a pain, however little there is to power and interface.

    I want to go one step at a time, and I have invested a lot in the scandoubler board, so I will finish and sell that. But... curse you... I will be looking at that HDMI chip now.
