The word modem is a portmanteau, compressed from modulator-demodulator. To send digital data over an analogue telephone line, you must first modulate it — convert a stream of binary digits into audio tones that the telephone network was designed to carry. At the receiving end, you must demodulate those tones back into binary digits. The device that does both jobs is a modem.
This was not an obvious idea in 1950. The telephone network had been engineered for one purpose: carrying the human voice. It passed frequencies roughly between 300 and 3400 hertz — enough for speech to sound recognisable — and discarded everything outside that range. Nobody had designed it for data. The engineers who built the first modems had to work entirely within those constraints, treating the telephone network as a given and finding ways to encode digital information as patterns of sound that it would faithfully carry.
The SAGE project had proved the concept, but its modems belonged to a classified military system the size of a wardrobe. The first modem intended for broader use came from AT&T's Bell System in 1958: the Bell 101, developed for the SAGE network but soon adapted for commercial data services. It operated at 110 bits per second using a technique called frequency-shift keying (FSK) — representing a binary 1 as one audio tone and a binary 0 as a different tone.
In 1962, Bell followed with the Bell 103, which became the first truly influential modem standard. The Bell 103 ran at 300 bits per second — 300 baud — and introduced full-duplex operation over a single telephone line. Full duplex meant both ends could send and receive simultaneously: the originating modem used one pair of tones (1070 Hz for 0, 1270 Hz for 1), while the answering modem used a different pair (2025 Hz for 0, 2225 Hz for 1). The two conversations occupied different frequency bands and did not interfere.
Imagine you want to send the letter A in ASCII, which is the binary pattern 01000001. With FSK, you play a low tone for each 0 and a high tone for each 1. The receiver listens to the incoming audio and converts each tone back to a 0 or 1. The telephone line carries the tones just as it would carry a voice — it has no idea it is transmitting data.
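To make this concrete, here is a minimal sketch of the scheme in Python. The tone pair is the Bell 103 originate side described above; everything else (the sample rate, the assumption that bit boundaries are perfectly aligned, the simple correlation detector, and the omission of the start and stop bits a real modem would add) is an illustrative choice, not part of the Bell specification.

```python
import numpy as np

# A sketch of Bell 103-style FSK on the originate side, using the tone
# pair from the text (1070 Hz for 0, 1270 Hz for 1). Sample rate,
# perfect bit alignment, and the correlation detector are assumptions.

SAMPLE_RATE = 9600
BAUD = 300                          # 300 symbols/s, 1 bit per symbol
SAMPLES_PER_BIT = SAMPLE_RATE // BAUD
TONES = {0: 1070.0, 1: 1270.0}      # originate-side frequencies (Hz)

def modulate(bits):
    """Play one tone per bit: a low tone for 0, a high tone for 1."""
    t = np.arange(SAMPLES_PER_BIT) / SAMPLE_RATE
    return np.concatenate([np.sin(2 * np.pi * TONES[b] * t) for b in bits])

def demodulate(signal):
    """Decide each bit by which tone correlates better with the audio."""
    bits = []
    for i in range(0, len(signal), SAMPLES_PER_BIT):
        chunk = signal[i:i + SAMPLES_PER_BIT]
        t = np.arange(len(chunk)) / SAMPLE_RATE
        # Correlate the slice against each candidate tone; stronger wins.
        power = {b: abs(np.sum(chunk * np.exp(-2j * np.pi * f * t)))
                 for b, f in TONES.items()}
        bits.append(max(power, key=power.get))
    return bits

# The letter A in ASCII, as the pattern 01000001 from the text.
letter_a = [0, 1, 0, 0, 0, 0, 0, 1]
assert demodulate(modulate(letter_a)) == letter_a
```

Real demodulators also had to cope with line noise, phase distortion, and unknown bit timing; the point here is only that the mapping between bits and tones amounts to a lookup table on one side and a tone detector on the other.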
FSK is simple, robust, and slow. Each symbol carries exactly one bit. With 300 symbols per second and one bit per symbol, you get 300 bits per second. To go faster, you need either more symbols per second (capped by the Nyquist limit at roughly twice the channel's bandwidth) or more bits per symbol — which requires more sophisticated modulation schemes. Later epochs explored both routes, as the sketch below makes explicit.
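The arithmetic is worth writing down. A minimal sketch in Python; the function name is ours, and the second example uses the Bell 212A figures this section reaches below.

```python
# Throughput is simply symbols per second times bits per symbol.
# Nyquist caps the symbol rate on a band-limited channel, so once that
# ceiling is reached the only way up is more bits per symbol.
def throughput_bps(symbols_per_second: int, bits_per_symbol: int) -> int:
    return symbols_per_second * bits_per_symbol

print(throughput_bps(300, 1))   # Bell 103: FSK, 1 bit/symbol   -> 300
print(throughput_bps(600, 2))   # Bell 212A: 4-phase PSK, 2 bits -> 1200
```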
At 300 baud, the Bell 103 could transmit roughly 30 characters per second: with a start bit and a stop bit wrapped around each character's data bits, a character cost about ten bits on the wire. Thirty characters a second is far faster than anyone can type, but close to the pace of a quick reader — text arrived at roughly the speed you could take it in. It was slow by any later standard, but it was enough to drive an entire generation of online services, time-sharing computer systems, and eventually the first bulletin boards.
There was a problem with the Bell 103. AT&T owned the telephone network, and AT&T's rules — backed by law — prohibited customers from connecting any non-Bell equipment directly to the phone line. If you wanted to use a modem, you had to rent it from AT&T. The telephone jack as we know it did not yet exist; the line came out of the wall and was wired directly into Bell-supplied equipment.
Engineers found a way around this restriction that required no electrical connection at all: the acoustic coupler. An acoustic coupler was a cradle into which you placed the handset of an ordinary telephone. A small speaker inside the coupler played audio tones into the handset's microphone; a small microphone in the coupler listened to the tones coming from the handset's speaker. Data travelled as sound through air, not as electrical signals through wire. AT&T could not object, because nothing was connected to their network — just a telephone, used exactly as intended.
Acoustic couplers were awkward. They were sensitive to background noise. Holding the handset firmly in the rubber cups was a skill. But they liberated modems from Bell's monopoly and made it possible for universities, research labs, and eventually hobbyists to connect computers to the telephone network with equipment they had built or bought themselves. The Novation CAT was a popular acoustically coupled modem of this era, and independent makers such as Vadic — whose VA3400 of 1973 reached 1200 bits per second — were beginning to challenge Bell outright.
The acoustic coupler workaround was a symptom of a deeper problem. A Texas businessman named Thomas Carter had built a device called the Carterfone, which connected two-way radio systems to the telephone network. AT&T moved to ban it. Carter sued, and in 1968 the FCC ruled in his favour, establishing the legal principle that customers had the right to attach any device to the telephone network, provided it did not harm the network.
The Carterfone decision of 1968 was transformational for the modem industry. It meant that manufacturers other than AT&T could now make modems that connected directly to the telephone line. It opened the door to competition, innovation, and eventually the consumer modem market that flourished in the 1980s. Without Carterfone, there might have been no Hayes Smartmodem, no BBS culture, no dial-up internet.
In 1969, the U.S. Advanced Research Projects Agency (ARPA, later renamed DARPA) switched on the first nodes of ARPANET — the network that would eventually become the internet. The four initial nodes were at UCLA, Stanford Research Institute, UC Santa Barbara, and the University of Utah. They were connected not by dial-up telephone calls but by dedicated leased lines running at 50 kilobits per second, using Interface Message Processors (IMPs) built by Bolt Beranek and Newman.
ARPANET itself did not use dial-up modems — it used dedicated leased lines and specialised hardware. But it demonstrated, at scale, that packet-switched computer networking worked. And the researchers who used ARPANET also needed to connect from home and from other institutions. For those connections, they used exactly what was available: Bell 103 modems at 300 baud, acoustic couplers, and time-sharing terminals.
By the mid-1970s, the combination of ARPANET, time-sharing systems at universities, and cheap (if slow) dial-up access had created a small but real online culture: electronic mail, mailing lists, file transfers, and remote login. The essential architecture of the internet was already in place; what was missing was speed, and the hardware to deliver it to ordinary people.
Through the 1970s, modem speeds began to climb. AT&T introduced the Bell 212A in 1976, raising the speed to 1200 bits per second with a more sophisticated modulation technique called phase-shift keying (PSK). Instead of switching between two frequencies, PSK kept the frequency constant but varied the phase of the signal — the point in the wave cycle at which each symbol began. With four possible phases, each symbol could represent two bits instead of one; signalling at 600 symbols per second, the 212A carried 1200 bits per second, four times the Bell 103's rate.
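Here is a sketch of the four-phase idea in Python, parallel to the FSK example earlier. The 1200 Hz carrier, the particular bit-pair-to-phase table, and the ideal noise-free phase detector are illustrative assumptions; the Bell 212A's actual line coding (a differential variant of PSK, in which bits ride on phase *changes* rather than absolute phases) was somewhat more involved.

```python
import numpy as np

# A sketch of four-phase PSK in the spirit of the Bell 212A. Carrier
# frequency, phase table, and framing are illustrative assumptions.

SAMPLE_RATE = 9600
CARRIER_HZ = 1200.0
BAUD = 600                               # 600 symbols/s x 2 bits = 1200 bps
SAMPLES_PER_SYMBOL = SAMPLE_RATE // BAUD

# Each pair of bits (a "dibit") selects one of four carrier phases.
PHASES = {(0, 0): 0.25 * np.pi, (0, 1): 0.75 * np.pi,
          (1, 1): 1.25 * np.pi, (1, 0): 1.75 * np.pi}

def modulate(bits):
    """Emit one carrier burst per dibit, offset to that dibit's phase."""
    t = np.arange(SAMPLES_PER_SYMBOL) / SAMPLE_RATE
    dibits = zip(bits[0::2], bits[1::2])
    return np.concatenate([np.sin(2 * np.pi * CARRIER_HZ * t + PHASES[d])
                           for d in dibits])

def demodulate(signal):
    """Estimate each burst's phase and map it to the nearest dibit."""
    bits = []
    for i in range(0, len(signal), SAMPLES_PER_SYMBOL):
        chunk = signal[i:i + SAMPLES_PER_SYMBOL]
        t = np.arange(len(chunk)) / SAMPLE_RATE
        # Project onto the complex carrier; the result's angle tracks the
        # transmitted phase (offset by -pi/2 because we sent sines).
        z = np.sum(chunk * np.exp(-2j * np.pi * CARRIER_HZ * t))
        phase = (np.angle(z) + np.pi / 2) % (2 * np.pi)
        dist = lambda p: min(abs(phase - p), 2 * np.pi - abs(phase - p))
        bits.extend(min(PHASES, key=lambda d: dist(PHASES[d])))
    return bits

# Two bits per symbol: eight bits travel as four phase bursts.
payload = [0, 1, 0, 0, 0, 0, 0, 1]
assert demodulate(modulate(payload)) == payload
```

The difference from FSK is visible in the constants: the symbol clock runs at only 600 symbols per second, comfortably within the telephone channel, yet each symbol now carries two bits.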
Internationally, the CCITT (now ITU-T) began publishing the V-series recommendations — a set of international standards for modem design. The V.21 standard (1964) defined 300 baud FSK modems similar to the Bell 103, though it specified different tone frequencies, so the two were not interoperable. The V.22 standard (1980) defined 1200 bps modems. These standards ensured that a modem made in Japan could communicate with one made in Germany — an essential prerequisite for the global data network that was coming.
By 1980, the modem had evolved from a classified military device into a commercial product sold in electronics catalogues. Speeds had grown roughly tenfold in two decades, from 110 to 1200 bits per second. The two decades that followed would multiply that figure nearly fiftyfold — and the device that made it possible was about to be announced by a small company in Norcross, Georgia.