Part 1: Foundations of Display Technology
The basic principles of how images are created electronically.
It is not surprising that the moment humanity discovered it was possible to send information down a wire, people immediately saw the need to send more information down the wire, and to come up with new kinds of information to send. We can send words down a wire. How do we send pictures down a wire? Text-based art emerged very soon after the invention of the typewriter, so it is easy to imagine that text-based graphics followed the ability to send structured text over a wire. This section follows the arc of development from these early transmission systems to the point where the core elements of image transmission are established and the early endpoints for video display are emerging.
One continuing theme is that the final form of a thing is not required for that thing to develop and gain popularity. Surprisingly, the modern typewriter keyboard arrives late in this arc. Early printing telegraphs and typewriters used keyboards borrowed from pipe organs and pianos. Borrowing existing parts, or building upon existing models, is part of a pattern in which a slightly hacky way of doing something is eventually supplanted by a more highly optimized way of doing it. This tends to happen as a function of mass acceptance and commercialization. The value of inventing something as complex as a typewriter is tied to the scale of the market, or even to the identification of a market in the first place. The original way of “typing” now looks very strange, but at the point of invention it may have had more in common with stenography keyboards: this was a specialized skill, and normal people did not have telegraphs at home.
The section highlights the evolution of information transmission, starting from sending text to the development of text-based graphics and early image transmission systems. At the center of this journey is human ingenuity and the persistent drive to transform the way we capture, transmit, and display visual information. This text illuminates how today’s digital displays depend upon layers of innovation dating back to the mid-19th century, when the first images were transmitted through wires.
We’re Scanning. We’re Scanning.
The development of image scanning technology is a pivotal moment in the development of information processing, establishing foundational principles that will later inform computing, telecommunications, and video. The central innovation is the ability to convert analog visual information into discrete digital data that could be transmitted, processed, and reproduced.
One way to look at this is the difference between the Royal Earl House telegraph, Morse code, and MIDI. I include the House telegraph because US patent 8505 discloses the design of a steam-assisted telegraph, and this is precisely the sort of information that this series requires as fuel. The House telegraph was not binary in the Morse sense but worked rather like a remote print head: a person in one location could command the print head at the other end of a wire to “print” or not. The binary function of the House telegraph was “power on” and “power off”. This was the very first KVM.
Morse code is a binary system of long (dash) and short (dot) elements that can be put together in blocks and transmitted over a static time domain with gaps to define the blocks of information. Samuel Morse was, interestingly, primarily a fine art painter.
MIDI is a binary system of 1s and 0s using messages composed of 8-bit bytes, where data bytes carry 7-bit values so that endpoints can be controlled from 0 to 127. We can debate whether it is fair to compare a single group of flashes/taps that forms a single letter to a 7-bit payload in a message; the key point is that the 7-bit payload lets a user send a lot more information.
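The payload difference can be made concrete. Below is a minimal Python sketch (the `note_on` helper is illustrative, not part of any MIDI library) that builds a three-byte MIDI Note On message: one status byte, plus two data bytes whose high bit must be zero, leaving the 7-bit 0–127 range described above.

```python
def note_on(channel: int, note: int, velocity: int) -> bytes:
    """Build a MIDI Note On message; the two data bytes carry 7-bit payloads."""
    assert 0 <= channel <= 15 and 0 <= note <= 127 and 0 <= velocity <= 127
    # Status byte: 0x90 (Note On) with the channel in the low nibble.
    # Data bytes: masked to 7 bits, so their high bit is always 0.
    return bytes([0x90 | channel, note & 0x7F, velocity & 0x7F])

msg = note_on(channel=0, note=60, velocity=100)  # middle C at moderate velocity
print(msg.hex())  # -> 903c64
```

Where a Morse group resolves to a single letter, each of those two data bytes selects one of 128 values, which is the whole argument in miniature.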
Drum Scanners: The Mechanical Foundation
The earliest image scanning systems established the fundamental principle of breaking down continuous images into discrete elements. Frederick Bakewell’s working fax machine of 1847 represented an early mechanical scanning approach, utilizing a revolving drum coated in tinfoil with a scanning stylus that moved across the surface. If you think the tinfoil sounds absurd I give you this quote from Wikipedia “Bakewell replaced the pendulums of Bain’s system with synchronized rotating cylinders”, which is key information if you ever wanted to pin down exactly when the timeline really went wrong. The Bakewell approach established the critical concept of sequential scanning along a predictable grid pattern, which would later become essential to digital image processing.
By the early 20th century, more sophisticated scanning systems emerged. Alexander Murray and Richard Morse at Eastman Kodak developed the first analog color scanner in 1937, which used a drum scanner to image color transparencies with a light source and three photocells equipped with color filters. This technology laid the groundwork for converting continuous-tone images into separate color channels that could be individually processed.
Pulse Code Modulation: The Critical Bridge
The systems above all transmit continuous data meaning that the source and the endpoint are to some degree coupled as with the House telegraph. The transition from analog scanning to digital imaging required a method to convert continuous analog signals into discrete digital values. Pulse Code Modulation (PCM), invented by Alec Reeves at International Telephone and Telegraph in 1937, provided this critical bridge.
PCM’s contribution to image digitization can be understood through four fundamental stages:
Sampling: PCM established the concept of measuring an analog signal at regular intervals rather than continuously. In image scanning, this translates to measuring light intensity at specific points along a grid rather than as a continuous field.
Quantization: The sampled values are assigned to discrete numerical levels. In early systems like the Bartlane system, this was demonstrated by using five different exposure levels to correspond to five quantization levels. This early quantization later evolved into the multi-bit quantization used in modern systems.
Encoding: The quantized values are converted into binary code for storage or transmission. The Bartlane system notably used the five-bit Baudot code to transmit the grayscale digital image, establishing the practice of bit-based representation of visual information.
Modulation: For transmission, these digital values need to be converted into signals suitable for the transmission medium, a principle applied in all image transmission systems from early facsimile to modern telecommunications.
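The first three stages above (modulation depends on the physical medium, so it is omitted) can be sketched in a few lines of Python. This is a toy illustration rather than any standard codec: it samples a signal at fixed points, quantizes each sample to one of five levels as in the early Bartlane system, and encodes each level as a fixed-width binary codeword.

```python
def pcm_encode(signal, sample_points, levels=5):
    """Toy PCM pipeline: sample -> quantize -> encode.

    `signal` is a function of time; its values are assumed to lie in [0, 1).
    Quantization maps each sample to one of `levels` discrete steps (five,
    as in the Bartlane system); encoding packs each step index into a
    fixed-width binary codeword.
    """
    bits_per_sample = max(1, (levels - 1).bit_length())
    samples = [signal(t) for t in sample_points]                      # 1. sampling
    quantized = [min(levels - 1, int(s * levels)) for s in samples]   # 2. quantization
    encoded = [format(q, f"0{bits_per_sample}b") for q in quantized]  # 3. encoding
    return quantized, encoded

# A slow ramp sampled at 8 evenly spaced points.
points = [i / 8 for i in range(8)]
q, codes = pcm_encode(lambda t: t, points, levels=5)
print(q)      # -> [0, 0, 1, 1, 2, 3, 3, 4]
print(codes)  # 3-bit codewords: '000', '000', '001', ...
```

Note that five levels need three bits per sample here; the Bartlane system instead mapped its levels onto the existing 5-bit Baudot alphabet, reusing telegraph infrastructure rather than inventing a new code.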
The Bartlane System: Local Storage and Gray Scale
The Bartlane system, developed by Harry G. Bartholomew and Maynard D. McFarlane in 1920, represents a watershed moment in the digitization of images. This system produced what is widely considered the first digital image and established several key principles of digital image processing:
- The conversion of continuous tones into discrete levels (initially 5, later expanded to 15 by 1929)
- The use of a standardized code (Baudot, the predecessor to International Telegraph Alphabet No. 2) for transmitting this visual information
- The concept of storing digital image information (via punched paper tape!)
The Bartlane system’s approach to transmitting digitized newspaper images between London and New York showed how digital representation could overcome the limitations of analog transmission across long distances, reducing transatlantic image transmission time from over a week to just three hours.
From Theory to Practice: Advancing the Technology
Claude Shannon’s information theory, formalized in his landmark 1948 paper “A Mathematical Theory of Communication,” provided the theoretical foundation for understanding how much information could be encoded and transmitted through digital channels. This work established the bit as the fundamental unit of information and formalized concepts of channel capacity that would guide all subsequent digital imaging systems.
Shannon’s information theory built upon Harry Nyquist’s earlier work on signal transmission, particularly the Nyquist–Shannon sampling theorem which established that to accurately reconstruct a signal, the sampling rate must be at least twice the highest frequency component of the signal being sampled. This principle became fundamental to all digital image and audio sampling processes, establishing the theoretical minimum sampling density required to capture and later reproduce the original analog information without loss.
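A quick numerical illustration of why the theorem matters: a 3 Hz sine sampled at only 4 Hz (below the 6 Hz the theorem requires) produces exactly the same sample values as a 1 Hz tone of opposite sign, so the original frequency cannot be recovered from the samples. A small Python sketch:

```python
import math

def sample(freq_hz, rate_hz, n_samples):
    """Sample sin(2*pi*f*t) at the given rate."""
    return [math.sin(2 * math.pi * freq_hz * n / rate_hz)
            for n in range(n_samples)]

rate = 4  # Hz: below the Nyquist rate for a 3 Hz tone (which needs > 6 Hz)
high = sample(3, rate, 8)    # 3 Hz tone, undersampled...
alias = sample(-1, rate, 8)  # ...is indistinguishable from an inverted 1 Hz tone

assert all(abs(a - b) < 1e-9 for a, b in zip(high, alias))
```

This collapse of distinct frequencies onto identical samples is aliasing, and the sampling theorem states the minimum rate at which it is guaranteed not to happen.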
Around this same time John Tukey coined the term “bit” by bashing together the words “binary” and “digit”. The same man also coined the word “software”. He later co-developed the Fast Fourier Transform, an algorithm for rapidly computing the discrete Fourier transform that underpins much of modern signal processing and data compression.
Russell Kirsch’s work at the National Bureau of Standards (now NIST) in 1957 led to the first fully digital image scanner using a drum scanner and photomultiplier tube that generated digital data that could be stored in a computer. This system produced the first digital photograph—a 176×176 pixel image of Kirsch’s infant son—marking the birth of what we now recognize as digital imaging.
Is this the bitmap?
An important parallel development came in the form of the Williams-Kilburn tube, developed by Frederic C. Williams and Tom Kilburn in 1946-1947. While primarily designed as computer memory, this technology demonstrated the principle of representing binary information visually on a Cathode Ray Tube (CRT) display. Bright spots represented 1s and dim spots represented 0s, with each spot corresponding to an area of electrostatic charge. This may have been influenced by the same work that led to the RCA Selectron tube, developed by Jan A. Rajchman.
This visual representation of binary data on the CRT established a fundamental principle that would later inform all raster display systems: the organization of visual information into a grid of discrete elements that could each independently represent binary states, effectively creating the concept of the pixel before the term was widely used.
Raster Display Systems are the final piece of this puzzle. Modern displays utilize the grid-based approach established by early scanning and CRT technologies, organizing visual information into pixels arranged in scan lines. Raster scan systems represent screens as a logical collection of blocks known as pixels that correspond to integer coordinates in the screen coordinate system.
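That mapping from integer screen coordinates to a flat block of memory can be sketched in a few lines of Python. The row-major (scan-line order) layout below is the common convention, though real framebuffers add strides, padding, and multi-byte pixels.

```python
def pixel_index(x, y, width):
    """Map integer screen coordinates to an offset in a row-major framebuffer."""
    return y * width + x

# A tiny 4x3 "framebuffer" as a flat list, one value per pixel.
WIDTH, HEIGHT = 4, 3
framebuffer = [0] * (WIDTH * HEIGHT)
framebuffer[pixel_index(2, 1, WIDTH)] = 1  # light one pixel at (x=2, y=1)

for row in range(HEIGHT):
    print(framebuffer[row * WIDTH:(row + 1) * WIDTH])
# -> [0, 0, 0, 0]
#    [0, 0, 1, 0]
#    [0, 0, 0, 0]
```

Scanning the flat list one `width`-sized slice at a time is exactly the raster scan: the same sequential, grid-ordered traversal that Bakewell's drum established in 1847.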
Key Contributors to the Technology Stack
Several innovators played critical roles in this technological evolution. It is a historical truism that some contributions have been minimized or erased while others have been overstated; this is, nonetheless, a list. For example, the habit of crediting sampling exclusively to Shannon is often corrected by appending various additional names in an attempt at accuracy: Nyquist, Küpfmüller, Kotel’nikov, Ogura, Raabe, and Someya.
Alexander Bain: Proposed the first facsimile machine in 1843, establishing the concept of transmitting visual information electronically. Descendants of Bain would go on to perfect this same system using consultants.
Frederick Bakewell: Developed the first working fax machine in 1847, introducing the rotating drum scanning method.
Arthur Korn: Created the phototelautograph in 1902, which used a light-sensitive selenium cell to scan papers, introducing photoelectric scanning.
E.T. Whittaker: Wrote a paper in 1915 on sampling theory that precedes the work of Shannon and others.
Harry G. Bartholomew and Maynard D. McFarlane: Invented the Bartlane system in 1920, creating the first digital image transmission system.
K. Ogura: Wrote a key paper in 1920 on interpolation that fed into the work of Shannon, one of multiple papers on the subject. Ogura’s paper found errors in the earlier work of Whittaker.
Harry Nyquist: A Bell Labs employee with a habit of having lunch with very smart people. In 1928 he established what became known as the Nyquist rate: a sampling rate of twice the bandwidth of the waveform being sampled; sampling at or above this rate ensures that the waveform can be reconstructed accurately.
Vladimir Kotelnikov: In 1933 Kotelnikov presented work on sampling theory prior to Shannon and was later presented with several IEEE awards for his work.
Alec Reeves: Invented Pulse Code Modulation in 1937, establishing the fundamental principles of converting analog signals to digital representation.
Claude Shannon: Developed information theory in 1948, providing the mathematical foundation for digital representation and transmission of information.
Russell Kirsch: Created the first digital image scanner in 1957, producing the first computer-scanned photograph.
Frederic C. Williams and Tom Kilburn: Developed the Williams-Kilburn tube in 1946-1947, demonstrating visual representation of binary data.
Pulse Code Modulation stands as the foundational bridge between the analog and digital worlds. By establishing the principles of sampling, quantization, encoding, and modulation, PCM provided the conceptual framework that guided the development of all subsequent digital imaging technologies (probably). This framework transformed the visual world from continuous analog information into the discrete, manipulable, glitchable, and transmissible digital representations that now dominate every moment of your day. <insert meme>
