Legacy Cables: A Timeline Lesson
Understanding where cables came from helps you understand why modern connectors exist. Every cable on this list solved a real problem when it was introduced and was eventually replaced when something better came along. Think of it like phones: we went from flip phones to smartphones not because flip phones were broken, but because we needed more.
Coaxial Cable (~1950 to ~2005)
Coaxial cable is one of the oldest standardized cables used in computing and telecommunications. The design is clever: a central copper conductor is wrapped in insulation, then a metal shield, then an outer jacket. That layered design made it very good at carrying signals over long distances without picking up interference.
In its prime, coaxial was everywhere. It carried TV signals into homes, connected radio equipment, and was even used in early computer networking. The 10BASE2 Ethernet standard of the 1980s ran entirely over thin coaxial cable. As fiber optic and twisted-pair Ethernet got cheaper and faster, coaxial slowly lost its role in networking. You still see it in cable TV and cable modem setups today, but it is essentially gone from computing.
Why it got replaced: Fiber optic and ethernet (Cat5/Cat6) were faster, easier to install, and cheaper to run long term.
Serial Cable / RS-232 (~1960 to ~2010)
RS-232 is notable for how long it remained in mainstream use: more than 50 years. The standard was first published in 1962 and transmitted data one bit at a time, with separate lines for transmitting and receiving. Slow by modern standards, but simple and reliable, which mattered a lot at the time.
Serial ports connected some of the most important peripherals of the early PC era: external modems for dial-up internet, mice, and early networking equipment. In Windows, serial ports appeared as COM ports such as COM1 and COM2. That naming still exists in Windows today for backward compatibility. Network administrators also used serial connections to access router and switch management consoles, and some still do in enterprise environments.
Why it got replaced: USB overtook it completely. USB 2.0 reached 480 Mbps compared to a typical serial maximum of 115.2 kbps, added hot-swapping, and gave users one universal standard instead of multiple COM ports.
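The speed gap is easy to quantify. Here is a minimal back-of-the-envelope sketch, assuming classic 8N1 serial framing (each byte costs 10 bits on the wire: one start bit, eight data bits, one stop bit) and ignoring USB protocol overhead:

```python
# Rough transfer-time comparison: RS-232 at 115.2 kbps vs USB 2.0 at 480 Mbps.
# Assumes 8N1 serial framing (10 bits per byte); USB overhead is ignored.

SERIAL_BPS = 115_200        # classic top serial rate
USB2_BPS = 480_000_000      # USB 2.0 signaling rate

def serial_seconds(num_bytes: int) -> float:
    """Time to send num_bytes over 8N1 serial: 10 bits on the wire per byte."""
    return num_bytes * 10 / SERIAL_BPS

def usb2_seconds(num_bytes: int) -> float:
    """Idealized USB 2.0 time: raw bits over the signaling rate."""
    return num_bytes * 8 / USB2_BPS

one_mb = 1_000_000  # a 1 MB file
print(f"Serial: {serial_seconds(one_mb):.1f} s")          # ~86.8 s
print(f"USB 2.0: {usb2_seconds(one_mb) * 1000:.1f} ms")   # ~16.7 ms
```

Even with USB's real-world overhead factored in, the gap is three to four orders of magnitude, which is why nobody mourned the serial port for bulk data transfer.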
Parallel Cable / IEEE 1284 (~1970 to ~2005)
While serial sent data one bit at a time, parallel cables sent eight bits at once across eight data wires, which made them noticeably faster for their era. The IEEE 1284 standard finalized in 1994 is the version most people remember.
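The difference can be sketched as a toy clock-tick count: serial shifts a byte out one bit per tick, while parallel drives all eight data lines in a single tick. This is illustrative only; real IEEE 1284 adds handshaking lines and several transfer modes.

```python
def serial_ticks(data: bytes) -> int:
    """One data line: each byte takes 8 clock ticks (framing bits ignored)."""
    return len(data) * 8

def parallel_ticks(data: bytes) -> int:
    """Eight data lines: a whole byte goes out in a single tick."""
    return len(data)

msg = b"HELLO PRINTER"  # 13 bytes
print(serial_ticks(msg), "ticks over serial")      # 104
print(parallel_ticks(msg), "ticks over parallel")  # 13
```

The eightfold advantage held only at short cable lengths; keeping eight wires in sync over distance (clock skew) is part of why high-speed interfaces later returned to fast serial designs.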
Parallel ports were closely associated with printers. If you wanted to print something in the 1990s, you probably used the parallel port, also called the LPT port, short for Line Printer Terminal. Scanners used them too. The connector on the PC side was a 25-pin DB25, and on the printer side it was a larger 36-pin Centronics connector.
Why it got replaced: USB followed the same pattern as with serial: simpler, faster, hot-swappable, and useful for many device types instead of serving as a dedicated printer port.
VGA / Video Graphics Array (1987 to ~2015)
VGA was introduced by IBM in 1987 and became the standard for PC video output for almost 30 years. The 15-pin connector carried analog video signals to monitors and projectors, and because the signal was analog there was no hard resolution cap: in practice it could drive resolutions well beyond 1080p, though image quality degraded at higher resolutions and with longer or cheaper cables.
For many people growing up in the late 1990s and 2000s, VGA was simply the blue cable. It appeared on school computers, projectors, and monitors everywhere. Even after digital interfaces became available, manufacturers kept including VGA because classrooms and conference rooms still depended on it. VGA lasted well beyond its ideal technical lifespan because it was so deeply embedded in education and business hardware.
Why it got replaced: VGA is analog, so a GPU's digital signal has to be converted to analog, sent through the cable, and then converted back to digital inside a flat-panel display. Each conversion, plus the analog run in between, degrades the signal. HDMI and DisplayPort avoid those conversions, carry audio, and support much higher resolutions more cleanly.
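A rough way to see the loss is to model that digital-to-analog-to-digital round trip for one 8-bit color channel, treating the analog leg as a small amount of additive noise before the display re-samples the signal. The noise figure here is illustrative, not a measured VGA characteristic:

```python
import random

def vga_round_trip(pixel: int, noise: float = 1.5) -> int:
    """Toy model of one 8-bit channel: DAC out, noisy analog cable, ADC back in.

    The analog stage is simulated as additive noise; re-digitizing rounds
    and clamps back to 0-255, so recovered values can be slightly wrong.
    """
    analog = pixel + random.uniform(-noise, noise)   # cable degradation (toy model)
    return max(0, min(255, round(analog)))           # display's ADC re-quantizes

random.seed(0)
original = [0, 64, 128, 200, 255]
recovered = [vga_round_trip(p) for p in original]
print("recovered:", recovered)
print("max error:", max(abs(a - b) for a, b in zip(original, recovered)))
```

Small per-pixel errors like these are why VGA images tended to look slightly soft or shimmery next to a digital connection, where every pixel value arrives exactly as the GPU sent it.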
DVI / Digital Visual Interface (1999 to ~2015)
DVI appeared in 1999 as a bridge between the analog and digital eras. It was the first mainstream video connector designed to carry digital signals natively, while still supporting analog for backward compatibility with older monitors.
There were five DVI connector layouts covering three signal types, and the pin layout showed which one you had: DVI-A was analog only, DVI-D was digital only, and DVI-I supported both. DVI-D and DVI-I each came in single-link and dual-link versions, with dual-link providing double the bandwidth for higher resolutions. DVI was common on flat-panel displays and graphics cards through much of the 2000s.
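The single-link versus dual-link split comes down to pixel clock. A back-of-the-envelope check, assuming single-link's 165 MHz pixel-clock ceiling and a ballpark 25% blanking overhead per frame (the overhead figure is a rough assumption, not taken from the DVI spec):

```python
def required_pixel_clock_mhz(width: int, height: int, refresh_hz: int,
                             blanking_overhead: float = 0.25) -> float:
    """Approximate pixel clock a display mode needs, including blanking time."""
    return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

SINGLE_LINK_MHZ = 165                 # DVI single-link pixel-clock limit
DUAL_LINK_MHZ = 2 * SINGLE_LINK_MHZ   # dual-link doubles the TMDS data pairs

for w, h in [(1600, 1200), (2560, 1600)]:
    clock = required_pixel_clock_mhz(w, h, 60)
    link = "single" if clock <= SINGLE_LINK_MHZ else "dual"
    print(f"{w}x{h}@60: ~{clock:.0f} MHz -> {link}-link")  # 144 single, 307 dual
```

So 1600x1200 at 60 Hz fits comfortably in single-link, while 2560x1600 needs dual-link, which is exactly the resolution tier where dual-link cables and the 30-inch monitors of the late 2000s showed up together.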
Why it got replaced: DVI never carried audio, the connectors were large and secured with thumbscrews, and the multiple incompatible versions created confusion. HDMI combined video and audio in one smaller cable, and DisplayPort advanced further with higher bandwidth.
The Bigger Pattern
Looking at all five together, the pattern is clear. Each generation was replaced for similar reasons: it became too slow, too large, analog when digital was needed, or too limited to one purpose. The cables that lasted the longest, such as coaxial and serial, survived not because they were technically best, but because they were so deeply built into existing infrastructure that changing over took decades.
Modern cables like USB-C and Thunderbolt were designed with much of that history in mind. They are small, reversible, fast, digital, and capable of handling data, video, and power through one connection. That would have seemed like science fiction to someone setting up a PC in 1987.