Ever since it was first conceived, the electronic computing machine (known today as the PC) has needed a display device. After all, what's the point of computing something if you can't see the final result? Fittingly, the original name of the PC monitor was the VDU (Visual Display Unit).
Although early IBM computers followed the usual three-part rule, comprising a monitor, a computing unit and a keyboard (the mouse came later), some companies manufactured the computing unit as an integral part of the display. Funnily enough, before monitors became the main display units for the PC, one interesting attempt at solving the output problem was to use printers as display devices. As you can imagine, this idea never became widespread.

The beginning
The first widely used display technology was the CRT (Cathode Ray Tube). The main concept was based on a negatively charged cathode, an electron gun and a screen coated with a grid of phosphor dots. The electron gun shot electrons down the tube, and the phosphor dots would glow when struck by the electron beam. As the number of phosphor dots on the screen is limited, images had to be discretized in order to be displayed.
This discretization was accomplished by taking into consideration the number of pixels the screen has. Each pixel contains three phosphor dots of different colors (Red, Green and Blue) which, when combined, can reproduce (with a certain degree of approximation) any real-life color. This is not to say that the first CRT monitors were color displays. They were actually monochrome units with very low resolutions, a result of high manufacturing costs and the composite interface used to connect to the PC.
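As a quick illustration of why three dots per pixel are enough, here is a small Python sketch of the arithmetic. It assumes the common 8 bits of intensity per channel (a property of later digital displays, not of the early CRTs described above):

```python
# Additive RGB mixing: each pixel's red, green and blue dots are driven
# independently, so the number of displayable colors is the product of
# the intensity levels per dot. 8 bits per channel is assumed here.
BITS_PER_CHANNEL = 8

levels = 2 ** BITS_PER_CHANNEL   # 256 intensity levels per phosphor dot
colors = levels ** 3             # every red/green/blue combination

print(levels)  # 256
print(colors)  # 16777216 (~16.7 million colors)
```

The same multiplication explains why monochrome units were so much simpler: with a single phosphor per pixel, only the intensity levels of one channel are available.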
Ferdinand Braun, a German physicist, invented the CRT in 1897 and, even if it may be difficult to imagine nowadays, the screen was not flat. This was mainly due to the early technology used in the design: the electron beam was steered by magnetic deflection (a varying magnetic field generated by coils), and a curved surface kept the beam's path length roughly constant, so the rectangular image could be drawn without distortion at the edges.
As a side note, I don't know how many of you have ever wondered about the true meaning of the screensaver. Nowadays its main goal is to let computer users relax with some kind of animation or even an image slide show, but the name itself surely suggests otherwise. In fact, its original purpose was to “save” the screen. The thing is that the phosphor in CRT screens becomes less bright over time.
So, after a few years of displaying the Windows desktop, if a full-screen picture was shown on the device, users could observe an imprint of the taskbar at the bottom of the screen. To solve this problem, the screensaver was introduced. It would display a moving animation that evened out phosphor wear, so that no part of the screen would show a static image for an extended period of time.

The evolution
In time, CRTs evolved. Brightness, resolution and contrast improved, the color gamut became richer and images could be displayed more accurately. Resolutions ranged from the VGA standard (640 x 480) up to 2304 x 1440 pixels. Much of this evolution took place in the 90's, driven by Sony's patented Trinitron tube. This was a more advanced cathode ray tube, which provided a wider color gamut, higher brightness and contrast, as well as better electron-beam focus across the screen.
Sony's patent expired in 1996, and other companies began using the same technology under different names (Mitsubishi, for example, called it Diamondtron). The 90's CRT market was led by Sony and its high-quality monitors. Eizo, for instance, only became popular because it was re-branding Sony's high-end monitors. Soon, other companies like Dell, Apple, IBM and Sun Microsystems joined the re-branding bandwagon.
This evolution was also made possible by an updated connector called HD-15 (widely known as the VGA connector or D-Sub). This was a 15-pin connector carrying an analog component RGBHV signal, where the H and V stand for the horizontal and vertical sync signals.
The last update to CRT technology was used in the FD Wega series (yes, Sony again). FD meant Flat Display, and it translated to “perfect” image representation on a PC monitor. By improving the magnetic deflection technology and calibrating the coil positions at the sides of the screen, manufacturers began building almost flat screens. I say almost flat because some companies kind of cheated here and created screens that were not actually flat: their surface was in fact part of a very large sphere, creating the illusion of a flat surface. Sony was not one of those companies.

The revolution
The CRT era seemed to go on forever. Sony was in business with high-quality models that were re-branded by other well-known companies. But as history shows us, technology is continuously evolving, and sitting around bragging about having the best technology while not investing in R&D is not a good policy. Sony learned this by being dethroned as the display manufacturing king.
Somewhere at the end of the 90's, other companies began introducing a new technology to the market, namely the Liquid Crystal Display (LCD). The new technology used a grid of liquid crystal cells, with three transistors driving each cell. The three transistors were used to generate the RGB spectrum and, in the end, display sharper images than CRT monitors could. Monitors featuring LCD panels came in slimmer form factors and with lower radiation emissions. They were a step up in design compared to the bulky CRT monitors people were so used to.
The first generations of LCD panels came with a few drawbacks. One of their biggest problems was the response time, which represents the amount of time a pixel takes to go from black to white and back to black again. The first generations had a response time of 25 ms, which was acceptable when using your PC only for office-related tasks, but became a serious problem when watching movies or playing games. The high response time introduced ghosting (also known as image lag) when the display was forced to render fast-changing images.
The further evolution of LCDs solved this problem, and today's monitors have response times as low as 2 ms. Some companies tried to cheat by advertising response times measured with different methods (like gray-to-gray transitions instead of black-to-white-to-black).
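The link between response time and ghosting can be sketched with a little arithmetic: if a pixel's transition takes longer than one video frame, the previous image is still partially visible when the next one arrives. The refresh rate and panel figures below are illustrative, not measurements of any specific monitor:

```python
# Rough sketch: a pixel whose transition takes longer than one frame
# lags behind the image, which is perceived as ghosting.

def frame_time_ms(refresh_hz: float) -> float:
    """Duration of a single frame in milliseconds."""
    return 1000.0 / refresh_hz

def ghosts(response_ms: float, refresh_hz: float) -> bool:
    """True if the pixel cannot finish its transition within one frame."""
    return response_ms > frame_time_ms(refresh_hz)

# At 60 Hz a frame lasts about 16.7 ms.
print(round(frame_time_ms(60), 1))  # 16.7
print(ghosts(25, 60))               # True  -> early 25 ms panels lagged
print(ghosts(2, 60))                # False -> modern 2 ms panels keep up
```

This is also why the gray-to-gray trick mentioned above is misleading: partial transitions are faster than a full black-white-black cycle, so the quoted number looks better without the panel actually keeping up any better.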
Another problem that came along with these panels was the so-called dead pixel. This was the result of a stage in the manufacturing process where some of the transistors were damaged; the affected pixels would always display the same color (usually red or black). Most vendors refused to honor users' warranties, stating that a few dead pixels were not covered by the manufacturer.
Again, contrast ratios and brightness improved with each new generation of displays, only this time Sony was no longer the leading manufacturer (actually, the company's LCDs were not even among the best). At some point, rumors emerged that Sony was in fact re-branding Samsung panels.
Yes, the new leader of the PC display market was Samsung. The company manufactured high-quality panels and allowed users to exchange panels that came with damaged pixels, or offered them their money back.
The age of HD
Together with the evolution of LCDs came a new video interface called DVI (Digital Visual Interface). Compared to the analog VGA connection, image quality was no longer bound to the quality and length of the cable: DVI provided a fully digital transfer of information between the graphics adapter and the monitor.
Later on (at some point in 2004), HDCP (High-bandwidth Digital Content Protection) was introduced. It was developed to protect the new high-definition content of Blu-ray and HD DVD discs. In order for a Blu-ray movie to play, users had to have both an HDCP-enabled player and an HDCP-enabled display. For computer users, the graphics card needed to have an HDCP decoding chip.
HDCP decoding only works when the monitor is connected through the DVI interface, since this interface allows non-display data to be transferred to the device. Samsung was among the first to provide users with HDCP-enabled monitors (even though, at first, the feature wasn't even advertised).
Today's monitors range from 15 to 30-inch LCDs, with resolutions reaching 2560 x 1600 pixels. They are outfitted with webcams, USB hubs and audio speakers. Design-wise, manufacturers are trying to attract people with slimmer and more fashion-focused devices. Companies are also trying to develop 3D monitors that provide a more interactive experience. Most of these attempts fail within the first two or three generations, but (as the LCD market has shown us) some will eventually replace current-generation displays.