The Evolution of Computer Monitors


Given today’s high-tech computer monitors, few would come close to guessing when and how the technology began. From the high-resolution liquid crystal display (LCD) screens that graphic designers prefer to rugged, intuitive industrial touchscreen monitors, computer displays have come a long way. The following shows how computer display technology evolved over the years.

Blinking Lights

Computer display technology started out with rows of blinking lights. This was back when a single computer took up almost an entire room, and the blinking lights looked more like background movie props than an interface.


Punched-Card Data Processing

Punched cards are yet another reason it is hard to imagine how computer display technology began. Commands were entered into the computer via holes punched in cards, and the computer returned its results as cards with punched holes as well. Operators used another device to decode and interpret the results. During the same era (the 1940s), another version of the machine punched holes in rolls of paper instead.

Cathode Ray Tubes

The 1950s and 1960s saw the advent of cathode ray tube (CRT) technology for computers. Designers of the first computer CRT displays took inspiration from radar and oscilloscope screens, adapting the technology to the SAGE and Programmed Data Processor (PDP) computer systems.


Teletypes

Teletypes were essentially electronic typewriters communicating with the computer, with the session automatically printed on paper. This technology remained the preferred way of interfacing with computers until the mid-1970s because of its cost-efficiency.

Glass Teletype

Designers saw the possibility of combining the teletype and CRT technologies. Merging the two soon produced the first working CRT display capable of showing text, the so-called glass teletype.

Composite Video Out

Because CRT technology was adapted from military radar and oscilloscope designs, the first CRTs were expensive. Three designers, Steve Wozniak, Don Lancaster and Lee Felsenstein, turned to cheaper closed-circuit television (CCTV) monitors as an alternative display. By 1976, the first computers with composite video output, the Apple I and the Sol-20, had been designed.

Television Sets

From 1976 into the 1980s, composite video monitors underwent several more improvements as the era of personal computers began to gain traction. The Apple II came with a radio frequency (RF) modulator that let ordinary television sets display the computer’s output.


Early LCDs

LCD technology became available in the 1960s, but it was not incorporated into computers until the 1980s. These early LCDs were used in portable computers and came only in monochrome.

More Monitor Variants in the 1980s

Several more types of computer monitors came out in the 1980s, including displays for IBM and Apple Macintosh computers. IBM also launched the Video Graphics Array (VGA) standard in the same decade, a technology still in use on some PCs today.

LCDs in the 1990s

LCD technology continued to improve while the PC market remained focused primarily on VGA CRT monitors. Around 1997, ViewSonic, IBM and Apple launched the first color LCD monitors with features and prices competitive enough to challenge CRTs.

Computer Monitors Today

LCD technology has virtually phased out CRTs. LCDs are lighter, more cost-effective and more energy-efficient, and they come in many types with different features to suit various applications.


Computer display technology continues to evolve toward better and more practical screens. Touchscreens, gesture control, virtual reality head-mounted displays, curved monitors, 4K displays and all-in-ones are already available. Experts predict that flexible screens, transparent monitors, holograms and 8K displays may arrive in the coming years.