Epoch VI

Mobile Modems & the Wireless World
2000 – today

 

The Network in Your Pocket

The story of the modem, from Bell 103 to V.92, was the story of a wire. A copper pair running from a telephone exchange to a house, carrying modulated audio tones, governed by the laws that Nyquist and Shannon had written down decades earlier. That story did not end — copper DSL lines are still active in hundreds of millions of homes — but a parallel story had been developing since the 1980s, one that needed no wire at all.

Mobile telecommunications networks were originally designed for voice. The first generation (1G) systems of the early 1980s — AMPS in North America, NMT in Scandinavia, TACS in the UK — were analogue, carried voice only, and had no concept of data. The second generation (2G) systems that replaced them in the early 1990s were digital, and digitisation opened the door. A digital voice channel could, with the right equipment on both ends, carry data just as a telephone line could. The mobile modem was born from exactly this opportunity.

 

GSM Data: CSD, HSCSD, and the Trickle Begins

The GSM standard (Global System for Mobile Communications), which became the dominant 2G technology worldwide from the early 1990s, included a data capability from the start. CSD — Circuit Switched Data — allowed a GSM phone to make a data call in exactly the same way as a voice call, at a maximum speed of 9,600 bits per second. The phone's built-in modem modulated data onto the radio channel; the network's base station demodulated it and forwarded it to the destination.

CSD was slow, expensive (billed by the minute, like a voice call), and occupied a full voice channel for the duration of the connection. Sending an email from a mobile phone in 1995 required patience, a data-capable handset, and a willingness to pay voice-call rates for the time connected. Nevertheless, CSD worked, and it established the principle of mobile data as a real, if niche, service.

HSCSD (High Speed Circuit Switched Data), introduced in the late 1990s, improved on CSD by bundling multiple GSM time slots together, achieving speeds of up to 57.6 kbps. But HSCSD remained circuit-switched — it held a dedicated channel open for the duration of the session, regardless of whether data was actually being sent. It was an evolutionary dead end, superseded almost immediately by a fundamentally different approach.

 

GPRS: Always-On Arrives on Mobile (2000)

GPRS — General Packet Radio Service — launched commercially in 2000 and represented a genuine architectural shift. Instead of circuit switching — dedicating a channel for the duration of a connection — GPRS used packet switching: data was broken into packets and sent opportunistically whenever radio capacity was available, interleaved with other users' data on the same channels. You were billed for the data transferred, not the time connected. And crucially, you were always connected: no dialling, no handshake, no waiting.

GPRS could use between one and four GSM time slots simultaneously, giving theoretical maximum speeds of around 114 kbps, though real-world speeds were typically 20–40 kbps. It was slower than a dial-up modem, and far slower than ADSL. But it was mobile. You could receive email on a train. You could look up a map on a street corner. The use cases that GPRS enabled — modest, text-heavy, carefully designed for low bandwidth — pointed directly toward the smartphone era that was coming.

GPRS was marketed as 2.5G — a half-generation step between 2G voice and the 3G data networks that were being planned. Its successor, EDGE (Enhanced Data rates for GSM Evolution), introduced in 2003, used a more efficient modulation scheme (8PSK instead of GMSK) to push speeds up to a theoretical 384 kbps and typical real-world speeds of 80–150 kbps. EDGE was marketed as 2.75G. Apple's first iPhone, launched in 2007, used EDGE rather than 3G — a decision that reflected both the limited 3G coverage of the time and the battery constraints of the hardware.

📷 Image suggestion: Search for "Nokia 6310i GPRS phone" or "Sony Ericsson T68i data cable" — these were among the most popular GPRS-capable phones of the early 2000s, often used with a data cable to connect a laptop. Also consider the original Apple iPhone (2007) as an EDGE device.
 

3G: Mobile Broadband Becomes Real

The third generation of mobile networks was designed from the outset as a data platform, not a voice network with data grafted on. The ITU's IMT-2000 specification called for peak data rates of 2 Mbit/s in stationary conditions and 384 kbps in mobile conditions. Two main technology families emerged to meet it: UMTS (Universal Mobile Telecommunications System), based on WCDMA radio technology and adopted by most of Europe and Asia, and CDMA2000, developed from Qualcomm's IS-95 CDMA technology and adopted primarily in North America, South Korea, and Japan.

UMTS launched commercially in Japan (NTT DoCoMo, 2001) and in Europe (2003). Initial real-world speeds were modest — typically 200–384 kbps — but the technology improved rapidly. HSPA (High Speed Packet Access), standardised in 2005, transformed 3G into a genuine broadband technology. HSPA downstream (HSDPA) reached 14.4 Mbit/s in its initial form; HSPA+ pushed this to 42 Mbit/s and eventually 84 Mbit/s in advanced deployments. For the first time, a mobile data connection could be genuinely faster than an average home ADSL line.

WCDMA and Spread Spectrum — in plain English

Earlier mobile systems like GSM divided the radio spectrum into narrow frequency slots and assigned one slot to each user — a technique called FDMA (Frequency Division Multiple Access). WCDMA, the radio technology underlying UMTS, took a different approach called spread spectrum.

In WCDMA, every user transmits across the full 5 MHz channel simultaneously. Each user's data is multiplied by a unique spreading code — a pseudo-random sequence of chips at 3.84 million chips per second — that spreads it across the entire bandwidth. The receiver, knowing the spreading code, can extract its intended signal from the combined noise of all other users. To other receivers, each user's transmission looks like random noise.

This has a counter-intuitive but important property: as more users share the channel, the noise floor rises and each user's effective data rate decreases gradually, rather than falling off a cliff. The channel degrades gracefully under load rather than suddenly becoming unusable. It also made WCDMA very resistant to narrowband interference — a jammer occupying one frequency disturbs only a tiny fraction of the spread signal.
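The spreading and despreading arithmetic can be sketched in a few lines. This is a toy illustration using short Walsh-style orthogonal codes with a spreading factor of 4 — real WCDMA uses OVSF channelisation codes combined with pseudo-random scrambling codes, at 3.84 million chips per second — but the correlation trick is the same:

```python
import numpy as np

# Two Walsh-style orthogonal spreading codes, spreading factor 4.
# Real WCDMA uses much longer OVSF channelisation codes plus a
# pseudo-random scrambling code; these tiny codes are illustrative.
code_a = np.array([+1, +1, +1, +1])
code_b = np.array([+1, -1, +1, -1])

def spread(bits, code):
    """Spread: each data bit (+1/-1) becomes a full chip sequence."""
    return np.concatenate([b * code for b in bits])

def despread(chips, code):
    """Despread: correlate the received chips against the code, one
    group at a time; the sign of each correlation recovers one bit."""
    return np.sign(chips.reshape(-1, len(code)) @ code)

bits_a = np.array([+1, -1, +1])  # user A's data
bits_b = np.array([-1, -1, +1])  # user B's data

# Both users transmit across the whole channel at once; the air
# interface simply adds their chip streams together.
channel = spread(bits_a, code_a) + spread(bits_b, code_b)

print(despread(channel, code_a))  # recovers user A's bits
print(despread(channel, code_b))  # recovers user B's bits
```

To a receiver holding the wrong code, each transmission correlates to zero — which is why, in the analogue domain, other users' signals look like featureless noise.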

📷 Image suggestion: Search for "Huawei E220 USB modem" or "3G USB dongle laptop". The Huawei E220, launched in 2006, was the first widely deployed HSPA USB dongle and became the defining image of mobile broadband in the late 2000s.
 

The USB Dongle: Broadband in Your Bag

The physical form of the mobile modem evolved dramatically through the 2000s. The earliest mobile data connections required a separate modem card — a PCMCIA card that slid into a laptop's PC Card slot — or a data cable connecting a mobile phone to a laptop's serial or USB port. Neither was convenient. The breakthrough in form factor came with the USB dongle: a self-contained mobile modem the size of a large thumb drive, plugging directly into any USB port.

Huawei pioneered the mass-market USB dongle with the E220 in 2006, supporting HSDPA at up to 3.6 Mbit/s. The device was an immediate success. Mobile operators around the world white-labelled it under their own brands and sold it with prepaid or contract data SIM cards. For the first time, mobile broadband was accessible to anyone with a laptop and a USB port — no specialist knowledge required, no PCMCIA slot needed, plug in and connect.

Subsequent generations of USB dongles supported progressively higher speeds: HSPA+ at 21 Mbit/s, then 42 Mbit/s, then early 4G LTE at 100 Mbit/s and beyond. The dongle also spawned the MiFi — a pocket-sized device that combined a mobile modem with a Wi-Fi hotspot, allowing multiple devices to share a single mobile data connection. MiFi devices became standard travel accessories for business users and a lifeline for remote workers in areas without fixed broadband.

 

4G LTE: The Modern Baseline

4G LTE — Long Term Evolution — was standardised by the 3GPP as Release 8 in 2008 and launched commercially by TeliaSonera in Stockholm and Oslo in December 2009, making Scandinavia once again the site of a landmark in mobile communications history. LTE was designed as a clean break from 3G: a new radio access technology built on OFDMA (Orthogonal Frequency Division Multiple Access) downstream and SC-FDMA upstream, capable of peak theoretical speeds of 100 Mbit/s downstream and 50 Mbit/s upstream in its initial specification.

OFDM — the Technology Behind 4G and 5G, in plain English

OFDM (Orthogonal Frequency Division Multiplexing) is the modulation scheme that underlies LTE, Wi-Fi (since 802.11a in 1999), and 5G NR. It is, in a sense, a wireless version of the DMT technique used in ADSL — the same fundamental idea applied to a radio channel.

OFDM divides the available radio bandwidth into a large number of narrow, closely spaced sub-carriers — in LTE, up to 1200 sub-carriers each 15 kHz wide. Each sub-carrier carries a low-rate QAM signal. The sub-carriers are mathematically orthogonal to each other: their spectra actually overlap, yet a receiver that separates them with a Fourier transform sees no interference between them, so no guard bands are needed.

The advantages are significant. OFDM is highly resistant to multipath fading — the interference caused by signals bouncing off buildings and arriving at the receiver via multiple paths with different delays. It adapts easily to different bandwidths by using more or fewer sub-carriers. And it allows different sub-carriers to be allocated to different users simultaneously (OFDMA), making efficient use of the spectrum in a cell with many active users.
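The orthogonality at the heart of OFDM is simply the Fourier transform at work, and it can be demonstrated in a few lines. This sketch uses 64 sub-carriers carrying QPSK symbols; the sub-carrier count, prefix length, and mapping are illustrative choices, not figures from the LTE specification:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy OFDM symbol: 64 sub-carriers, each carrying one QPSK symbol
# (2 bits). LTE uses up to 1200 sub-carriers of 15 kHz each.
n_sc = 64
bits = rng.integers(0, 2, size=2 * n_sc)
symbols = (1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])  # QPSK map

# Transmitter: the inverse FFT turns one symbol per sub-carrier into
# a time-domain waveform in which all the sub-carriers overlap.
tx = np.fft.ifft(symbols)

# A cyclic prefix (a copy of the waveform's tail) absorbs multipath
# echoes shorter than the prefix itself.
cp = 16
tx_cp = np.concatenate([tx[-cp:], tx])

# Receiver: strip the prefix and apply the forward FFT. Because the
# sub-carriers are orthogonal, every symbol comes back exactly.
rx = np.fft.fft(tx_cp[cp:])
assert np.allclose(rx, symbols)
```

The cyclic prefix is the piece that tames multipath: as long as the echoes arrive within the prefix interval, they wrap around harmlessly instead of smearing one OFDM symbol into the next.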

The same mathematical elegance that made DMT the right choice for ADSL in 1999 made OFDM the right choice for LTE in 2009 — and for 5G NR a decade later. Nyquist and Shannon's framework, applied by different engineers to different physical channels, converged on the same solution.

LTE Advanced (Release 10, 2011) introduced carrier aggregation — bonding multiple LTE frequency bands to multiply throughput, just as DOCSIS 3.0 had done for cable. Peak speeds in carrier aggregation deployments reached 300 Mbit/s, 450 Mbit/s, and eventually over 1 Gbit/s in laboratory conditions. In the real world, LTE became the baseline expectation for mobile connectivity: fast enough for HD video streaming, low-latency enough for voice calls over IP, ubiquitous enough to be taken for granted in most urban environments worldwide.

📷 Image suggestion: Search Wikimedia Commons for "LTE frequency bands diagram" or "Qualcomm Snapdragon modem chip". The Snapdragon X-series modems are the most widely deployed LTE and 5G modem chipsets in smartphones.
 

5G NR: Speed, Latency, and the Connected World

5G NR (New Radio) was standardised by the 3GPP as Release 15 in 2018 and began commercial deployment in South Korea, the United States, and the UK in 2019. 5G is not a single technology but a framework spanning an enormous range of frequencies and use cases, from sub-1 GHz bands for wide-area coverage to millimetre-wave bands above 24 GHz for ultra-high-capacity urban hotspots.

The headline numbers are striking: peak theoretical downstream speeds of 20 Gbit/s, latency below 1 millisecond in ideal conditions, and the ability to support up to one million connected devices per square kilometre. These specifications are not aimed at the smartphone user streaming video — though 5G delivers that experience better than any previous generation. They are aimed at industrial applications: autonomous vehicles that need sub-millisecond response times, factory automation systems with thousands of sensors per production floor, remote surgery where network latency must be imperceptible.

5G modems are no longer separate devices in most deployments. They are integrated circuits — a modem SoC (System on Chip) — embedded directly in the processor package of a smartphone, laptop, or industrial module. Qualcomm's Snapdragon X65 and X70, Samsung's Exynos modems, MediaTek's Dimensity series, and Apple's in-house modem development (the first Apple-designed modem, the C1, appeared in the iPhone 16e in 2025) are all descended, in a direct engineering lineage, from the signal processing work that produced the Bell 103 in 1962.

 

The Modem Everywhere: IoT and What Comes Next

The final chapter of the modem's story — if it has a final chapter — is the Internet of Things. The devices connecting to mobile networks today are no longer primarily smartphones and laptops. They are electricity meters, industrial sensors, vehicle tracking units, medical implants, agricultural monitors, smart home devices, and thousands of other categories of embedded hardware. Each of these devices contains a modem — typically a small, low-power module optimised for the specific requirements of its application.

The 3GPP has standardised several radio technologies specifically for IoT: NB-IoT (Narrowband IoT) for devices that send small amounts of data infrequently, such as utility meters; LTE-M (LTE for Machines) for devices that need slightly higher throughput or voice capability, such as wearables and asset trackers. Both technologies operate in licensed spectrum, provide guaranteed quality of service, and are designed for devices that may run on battery power for ten years or more.

The modem of 2025 bears almost no physical resemblance to the Bell 103 of 1962. It occupies a few square millimetres of silicon rather than a refrigerator-sized cabinet. It operates at gigabits per second rather than 300 bits per second. It communicates wirelessly over distances of kilometres rather than through a copper wire to an exchange. And yet the mathematical principles governing its operation — the Nyquist rate, the Shannon limit, the tradeoff between constellation size and noise immunity, the fundamental problem of encoding information onto a physical channel and decoding it reliably at the other end — are identical to those that governed the SAGE modem in 1958, and the telegraph relay in 1844.
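That continuity is easy to make concrete. Shannon's capacity formula, C = B log₂(1 + S/N), applies unchanged from the analogue voice channel to a 5G carrier; only the bandwidth and signal-to-noise figures change. A quick sketch — the bandwidths and SNR values below are representative assumptions, not measurements:

```python
import math

def shannon_capacity(bandwidth_hz, snr_db):
    """Shannon limit: C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + 10 ** (snr_db / 10))

# A ~3 kHz telephone voice channel at ~30 dB SNR: about 30 kbit/s,
# which is why late dial-up modems stalled in the mid-30s of kbit/s.
print(f"{shannon_capacity(3_000, 30) / 1e3:.1f} kbit/s")

# A 100 MHz 5G NR carrier at ~20 dB SNR: hundreds of Mbit/s on a
# single antenna stream, before MIMO multiplies it further.
print(f"{shannon_capacity(100e6, 20) / 1e6:.0f} Mbit/s")
```

Same formula, six orders of magnitude apart: the gains come from wider channels, better coding, and more antennas, not from any escape from Shannon's limit.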

The modem is the story of humanity trying to communicate faster, more reliably, and over greater distances. That story is not over.

 
 

Key People

Epoch VI

  • » Irwin Jacobs — co-founder of Qualcomm, architect of CDMA
  • » Erik Dahlman — Ericsson, key architect of WCDMA and LTE
  • » Ren Zhengfei — founder of Huawei, dominant force in mobile modem hardware
  • » Steve Jobs — iPhone redefined what mobile data was for
  • » 3GPP teams — hundreds of engineers across dozens of companies who wrote the LTE and 5G standards

Key Dates

2000 – today

  • » 2000 — GPRS launches (2.5G)
  • » 2001 — UMTS 3G, NTT DoCoMo Japan
  • » 2003 — EDGE launches (2.75G)
  • » 2005 — HSPA, 14.4 Mbit/s
  • » 2006 — Huawei E220 USB dongle
  • » 2007 — iPhone (EDGE)
  • » 2009 — LTE launches, Scandinavia
  • » 2011 — LTE Advanced, carrier aggregation
  • » 2019 — 5G NR commercial launch
  • » 2025 — Apple C1, first in-house modem (iPhone 16e)