In an industry in which development is so rapid, it is somewhat surprising that the technology behind monitors and televisions is over a hundred years old. Whilst confusion surrounds the precise origins of the cathode-ray tube, or CRT, it's generally agreed that German scientist Karl Ferdinand Braun developed the first controllable CRT in 1897, when he added alternating voltages to the device to enable it to send controlled streams of electrons from one end of the tube to the other. However, it wasn't until the late 1940s that CRTs were used in the first television sets. Although the CRTs found in modern-day monitors have undergone modifications to improve picture quality, they still follow the same basic principles. The demise of the CRT monitor as a desktop PC peripheral had long been predicted, and not without good reason.
Whilst competing technologies - such as LCDs and plasma display panels (PDPs) - had established themselves in specialist areas, there were several good reasons why the CRT was able to maintain its dominance in the PC monitor market into the new millennium.
However, by 2001 the writing was clearly on the wall, and the CRT's long period of dominance appeared finally to be coming to an end. In the summer of that year Philips Electronics - the world's largest CRT manufacturer - agreed to merge its CRT business with that of rival LG Electronics, Apple began shipping all its systems with LCD monitors, and Hitachi closed its $500m-a-year CRT operation, proclaiming that "there are no prospects for growth of the monitor CRT market". Having peaked at nearly $20 billion in 1999, revenues from CRT monitor sales were forecast to plunge to about half that figure by 2007.