5150: Setting up EGA/VGA


CGA = 1981 – 1983 (prime years)
EGA = 1984 – 1987 (still can display CGA video modes)
VGA = 1987 – 1999 (replaced by DVI, which was replaced by HDMI or DisplayPort)

“VGA connectors” remain popular because they are a simple, low-cost way to offer some kind of display capability. I still see some high-end servers delivered with humble VGA connectors (which can be irritating when the corresponding KVM switches have no VGA connections).

CGA is IBM’s original Color Graphics Adapter; its standard graphics modes were limited to 4 colors.

EGA is IBM’s 16-color Enhanced Graphics Adapter, offering graphics essentially comparable to those of the PCjr and Tandy 1000 series. While certainly a step up from CGA, competing systems (e.g. the Amiga) were offering substantially better graphics capability, and the IBM PC was obliged to keep up. Note that any VGA card can also display EGA and CGA video modes.

VGA 8-bit ISA cards are actually not that straightforward to find. It may be because the 8-bit bus doesn’t have enough bandwidth to push the higher resolutions at any useful rate, so there was no real business reason to make them. This also means you won’t find many good software titles that work well on the 4.77 MHz IBM PC at VGA resolutions (as one exception, see the excellent strategy game Planet X3 by the 8-Bit Guy, released around 2017).
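To see why the 8-bit bus is the bottleneck, here is a rough back-of-envelope sketch. The figures below (bus cycles per transfer, target frame rate) are approximations I am assuming for illustration, not measured values:

```python
# Back-of-envelope: can an 8-bit ISA bus keep up with VGA mode 13h (320x200, 256 colors)?
# All figures are approximate assumptions, for illustration only.

BUS_CLOCK_HZ = 4_772_727        # ~4.77 MHz ISA clock on the IBM PC 5150
CYCLES_PER_BYTE = 4             # assume ~4 bus cycles per 8-bit transfer
bus_peak_bytes_per_s = BUS_CLOCK_HZ / CYCLES_PER_BYTE

WIDTH, HEIGHT = 320, 200        # mode 13h: one byte per pixel
frame_bytes = WIDTH * HEIGHT    # 64,000 bytes per full-screen frame

max_fps = bus_peak_bytes_per_s / frame_bytes
print(f"Theoretical bus peak: {bus_peak_bytes_per_s / 1e6:.2f} MB/s")
print(f"Full-screen redraws per second (best case): {max_fps:.1f}")
```

Even under these generous assumptions the bus tops out well below 30 full-screen redraws per second, before accounting for the CPU doing any actual game logic — which is consistent with why so few IBM PC titles targeted VGA resolutions.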

There were many competing after-market video cards, and many of those manufacturers are now defunct (making it difficult to find documentation on things like DIP-switch settings).

While I have an original MDA/CGA card (which is more “authentic” to the early 1980s experience), I mainly use the VGA card for the convenience of being able to plug directly into a more modern VGA monitor without an intermediate MDA/VGA adapter. Here is an example of an Oak Technology VGA ISA card.

This particular card still has a 9-pin output connector. I can’t find any information or documentation about this particular card, so I’m not sure if it is an MDA or CGA output (or if that is configurable with the DIP switches).

By the era of EGA, the ability to connect your computer to a standard-definition television was becoming less popular (televisions were roughly 640×480 resolution, whereas by the early 1990s SuperVGA was far surpassing the resolution of televisions). “Monitors” offered higher resolution and faster refresh rates. It took a while (maybe 10–15 years), but eventually monitors and televisions blended into essentially the same thing (with televisions becoming high-resolution, fast-refresh-rate monitors). The days of S-Video quietly came to an end: there was a period of Component Video, then DVI, but HDMI became a great standard because it also carried audio over the same cable, making it much easier to set up systems.

Also keep in mind that in 2009, there was a U.S. government mandate for all television broadcast stations to switch to digital. In addition to other implications, this also meant that every television now needed some digital conversion capability (built in, or via an external converter box).
