In 1981, a small company in Norcross, Georgia called Hayes Microcomputer Products shipped the first Smartmodem 300. It cost $279 — a significant sum at the time — and it changed the modem industry permanently. What made it smart was not its speed, which was still 300 baud, identical to the Bell 103. What made it smart was that a computer could control it through software.
Before the Smartmodem, connecting a modem to a telephone line required manual intervention: you dialled the number on a telephone handset, listened for the answering tone, then placed the handset in an acoustic coupler or flipped a switch to transfer the line to the modem. Every step required a human. Hayes changed this by embedding a small microprocessor in the modem and defining a command language that a computer could send over the serial port. The modem would listen for commands, execute them, and report results back — all without human hands on the telephone.
Every command in the Hayes language began with the letters AT — short for attention — followed by one or more letters and numbers specifying what to do. A few examples that any modem user of the era would recognise:
ATD — Dial. ATDT 5551234 dials 5551234 using touch tones.
ATA — Answer an incoming call.
ATH — Hang up (H for Hook, as in hook-switch).
ATE1 — Echo commands back to the screen so you can see what you typed.
ATZ — Reset the modem to its default settings.
The responses were equally terse: OK meant the command succeeded; CONNECT meant a connection had been established; NO CARRIER meant the line had dropped; BUSY meant the remote number was engaged. These messages appeared on millions of screens every day throughout the 1980s and 1990s.
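The command-and-response exchange is simple enough to capture in a toy interpreter. The sketch below is an illustration of the grammar described above, not real modem firmware: it handles only the handful of commands listed, and real modems accepted a far richer syntax.

```python
# Toy interpreter for a Hayes-style AT command subset.
# Illustrative sketch only -- real firmware handled dozens of
# commands, registers, and chained command strings.

def at_interpreter(command: str) -> str:
    """Parse one AT command line and return the modem's response."""
    cmd = command.strip().upper()
    if not cmd.startswith("AT"):
        return "ERROR"
    body = cmd[2:]
    if body in ("", "Z"):                       # bare AT, or ATZ (reset)
        return "OK"
    if body.startswith("DT") or body.startswith("DP"):
        number = body[2:].strip()               # ATDT/ATDP -- dial
        return "CONNECT" if number else "ERROR"
    if body == "A":                             # ATA -- answer a call
        return "CONNECT"
    if body in ("H", "H0"):                     # ATH -- hang up
        return "OK"
    if body in ("E0", "E1"):                    # ATE0/ATE1 -- echo off/on
        return "OK"
    return "ERROR"

print(at_interpreter("ATZ"))           # OK
print(at_interpreter("ATDT 5551234"))  # CONNECT
print(at_interpreter("ATH"))           # OK
```

Communications software of the era did essentially this in reverse: it wrote command strings to the serial port and matched the terse responses to decide what to do next.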
Hayes never formally patented the AT command set, and competitors copied it freely. Within a few years it had become a de facto industry standard. Every modem manufacturer implemented it. Every communications software package assumed it. It remained the universal language of modems until the dial-up era ended.
The Smartmodem 300 was followed quickly by the Smartmodem 1200 in 1982, implementing the Bell 212A standard. But the speed race was just beginning. Through the 1980s, the ITU-T V-series standards arrived in rapid succession, each one pushing the boundary of what was possible over a standard telephone line.
The key technical challenge was that the telephone channel offered only about 3000 Hz of usable bandwidth. By the Nyquist theorem, the maximum symbol rate is twice the bandwidth, so at most around 6000 symbols per second. To send more bits per second without exceeding that limit, modem designers had to encode more bits into each symbol, which required more complex modulation schemes that could represent many distinct signal states.
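The arithmetic is worth making concrete. The sketch below uses 3200 symbols per second, one of the symbol rates V.34 could negotiate (an assumption for illustration; the text gives only the 3000 Hz and 6000-symbol figures):

```python
# Back-of-envelope arithmetic for the bandwidth argument above.

bandwidth_hz = 3000                 # usable telephone passband (approx.)
max_symbol_rate = 2 * bandwidth_hz  # Nyquist: at most 6000 symbols/second

target_bps = 28800                  # V.34 data rate
symbol_rate = 3200                  # one V.34 symbol rate (assumed)
bits_per_symbol = target_bps / symbol_rate

print(max_symbol_rate)   # 6000
print(bits_per_symbol)   # 9.0 -- each symbol must carry 9 bits
```

Nine bits per symbol means distinguishing hundreds of signal states reliably, which is why modulation, not raw signalling speed, became the battleground.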
The modulation technique that drove the speed race of the 1980s and 1990s was QAM. Instead of varying just the frequency or just the phase of the carrier signal, QAM varies both the amplitude (volume) and the phase (timing) simultaneously. By combining different amplitudes and phases, a modem can define a large number of distinct signal states — called a constellation.
A 16-QAM modem defines 16 distinct states. Since 2⁴ = 16, each state represents 4 bits. At the same symbol rate, a 16-QAM modem sends four times as much data as a simple two-state FSK modem. A 64-QAM modem defines 64 states (6 bits each). A 256-QAM modem defines 256 states (8 bits each).
The catch is that larger constellations are more sensitive to noise. If the signal is slightly distorted, the receiver may mistake one constellation point for an adjacent one and decode the wrong bits. This is exactly where the Shannon limit bites: it tells you precisely how large a constellation a channel can support given its noise level. Better phone lines, better signal processing, and better error correction all pushed the practical limit closer to Shannon's theoretical ceiling.
The progression of ITU-T standards tells the story of engineers steadily closing the gap:
V.22 (1980) — 1200 bps, using PSK.
V.22bis (1984) — 2400 bps, using QAM. The suffix bis is French for second.
V.32 (1984) — 9600 bps, full duplex over a two-wire line, using trellis-coded modulation.
V.32bis (1991) — 14,400 bps. The first speed at which downloading software became practical, measured in minutes rather than hours.
V.34 (1994) — 28,800 bps, later extended to 33,600 bps. V.34 analysed the specific characteristics of the telephone line at the start of each call and adapted its modulation to extract the maximum possible data rate from that particular connection.
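V.34's adapt-to-the-line behaviour can be sketched with the standard "SNR gap" approximation from modem engineering: usable bits per symbol ≈ log2(1 + SNR/Γ), where the gap Γ measures how far a practical coded modem falls short of the Shannon ideal. The specific numbers below (a 3200-symbol rate and a 6 dB gap) are illustrative assumptions, not figures from the text:

```python
import math

def bits_per_symbol(snr_db: float, gap_db: float = 6.0) -> int:
    """Gap approximation: bits/symbol = floor(log2(1 + SNR/gap))."""
    snr = 10 ** (snr_db / 10)
    gap = 10 ** (gap_db / 10)
    return math.floor(math.log2(1 + snr / gap))

symbol_rate = 3200   # one symbol rate a V.34 modem could negotiate
for snr_db in (20, 28, 36):
    b = bits_per_symbol(snr_db)
    print(f"SNR {snr_db} dB -> {b} bits/symbol -> {b * symbol_rate} bps")
```

The pattern matches everyday experience of the era: the same modem that connected at 28,800 bps on a clean line would fall back to far lower rates on a noisy one, because the line simply could not support a large constellation.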
Raw speed was only part of the story. Telephone lines were noisy, and noise caused errors. In the early days, a corrupted bit simply garbled the data and the user had to notice and request a retransmission manually. By the mid-1980s, modems were handling this automatically.
MNP (Microcom Networking Protocol), developed by Microcom Inc., introduced automatic error correction at the modem level. MNP Class 4 became widely adopted through the late 1980s. In 1989, the ITU standardised error correction as V.42, incorporating MNP as a fallback and adding a new protocol called LAPM. The ITU also standardised data compression as V.42bis, which could compress certain data types by up to 4:1 — effectively quadrupling throughput on compressible files. A V.34 modem with V.42bis could push text data at effective rates approaching 115,200 bps, though real-world gains depended entirely on the nature of the data being sent.
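The dependence of effective throughput on the data itself is easy to demonstrate. V.42bis used a dictionary-based Lempel-Ziv algorithm (BTLZ); the sketch below substitutes Python's zlib as a stand-in to show the same principle, so the exact ratios differ from a real modem's:

```python
import os
import zlib

LINK_BPS = 28800  # raw V.34 line rate

def effective_rate(data: bytes) -> float:
    """Effective throughput if data is compressed before transmission."""
    ratio = len(data) / len(zlib.compress(data))
    return LINK_BPS * ratio

# Repetitive text compresses dramatically...
text = b"the quick brown fox jumps over the lazy dog " * 200
# ...while random bytes barely compress at all.
noise = os.urandom(8800)

print(round(effective_rate(text)))   # far above the raw line rate
print(round(effective_rate(noise)))  # at or slightly below the line rate
```

This is why the "up to 4:1" figure came with such heavy qualification: text and spreadsheets flew, while already-compressed archives gained nothing.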
On 16 February 1978, Ward Christensen and Randy Suess switched on the first public Bulletin Board System (BBS) in Chicago. Users could dial in, leave messages, and read messages left by others — a public noticeboard accessible by telephone. The idea spread rapidly. By the mid-1980s, thousands of BBSs were operating across North America, Europe, and beyond.
Each BBS was typically run by an individual — the sysop, short for system operator — on hardware in a spare bedroom or basement. Users called in and could read and post messages in topic areas, download files, upload their own contributions, play text-based door games, and chat in real time with other users online simultaneously. Handles (pseudonyms) were standard. ASCII art decorated menus and welcome screens.
File sharing was the lifeblood of the community: shareware software, public domain utilities, digitised images, and later pirated commercial software all circulated through BBS file sections. FidoNet, created by Tom Jennings in 1984, connected BBSs into a global network: messages posted in Toronto could propagate overnight through a chain of telephone calls to a BBS in Tokyo. At its peak in the early 1990s, the BBS ecosystem numbered in the tens of thousands of systems worldwide. It was the social network, the app store, the file-sharing platform, and the discussion forum of its era — all running over dial-up modems, one phone call at a time.
Alongside the grassroots BBS world, commercial online services emerged to serve a less technically inclined audience. CompuServe launched its consumer Information Service in 1979, offering email, news, weather, financial data, and hundreds of special-interest forums, all accessible via a dial-up modem. Prodigy, a joint venture of IBM and Sears launched in 1988, aimed at families and offered a graphical interface — primitive by later standards but genuinely novel at the time.
America Online (AOL), which launched its consumer service in 1991 and famously distributed tens of millions of free trial discs through the post, grew to become the dominant commercial service of the early 1990s. At its peak, AOL accounted for roughly half of all consumer internet time in the United States. These services were walled gardens — you could email other AOL users but not CompuServe users — but they introduced most ordinary people to the idea of going online for the first time.
Through the 1980s, most modems were external devices: a separate box on the desk, connected to the computer's serial port and to the telephone line via an RJ-11 connector. External modems wore their status on their face: a row of LEDs — HS, AA, CD, OH, RD, SD, TR, MR — flickered in patterns that experienced users could read like a diagnostic display.
As expansion slots became standard in PCs, internal modems appeared on ISA cards: cheaper, needing no desk space, drawing power from the computer's supply. Through the early 1990s they became the default for new PC purchases.
Toward the end of this epoch, a cost-cutting innovation appeared that would prove controversial: the Winmodem, also called a software modem. A Winmodem offloaded all signal processing from dedicated hardware to the computer's main CPU, running it as a Windows driver. This made the card itself very cheap to manufacture. The tradeoffs were real: Winmodems consumed CPU cycles, performed poorly under load, and only worked under Microsoft Windows. Linux users developed a lasting and vocal animosity toward them — an animosity that was entirely justified.