
Paul McLellan

2G: Mobile Goes Digital

7 May 2020 • 12 minute read


In last week's post, 1G Mobile: AMPS, TOPS, C-450, Radiocom 2000, and All Those Japanese Ones, I covered 1G mobile and the first analog standards. Then we went digital.

2G

 The Nordic countries started work on an all-digital standard to replace NMT. By then it was clear in Europe at least that it was ridiculous for every country to have its own standard, with high equipment costs and no capability for roaming from, say, France to Germany. So Europe (actually the European Conference of Postal and Telecommunications Administrations) set up a special group to work on a new standard and on European-wide deployment. This special group was the Groupe Spécial Mobile or GSM. Later, everyone would pretend that GSM had stood for Global System for Mobile all along, but originally it was just the name of a working group that stuck (the Bluetooth name arose the same way). By the way, the dots in the GSM logo are meant to symbolize three clients on their home network and one roaming client.

Eventually, the GSM standard would be transferred to ETSI, the newly created European Telecommunications Standards Institute. ETSI is headquartered in Sophia Antipolis in the south of France, not far from where I worked for years at VLSI Technology and Compass Design Automation, and not far from where Cadence's Sophia Antipolis office is today. For more about Sophia, see my post Sophia Antipolis.

Most of the world decided to adopt the GSM standard so that they could take advantage of the scale of manufacture of equipment and handsets (which, in GSM terminology, were still called "terminals"). China used GSM (after a smaller deployment of TACS). The USSR (and its successors) all used GSM. The Indian subcontinent all used GSM. North America and Japan were the big exceptions.

 GSM used digital transmission instead of analog. This brought three big advantages over its predecessors:

  • Calls could be encrypted and so were confidential.
  • It could consume very little power when not making a call since it wasn't necessary to listen to the paging channel continuously.
  • The encoding made much better use of spectrum since it squeezed four calls into the space of one analog call.

The paging channel point is a bit subtle, but it was important to making battery life longer as phones increasingly moved out of cars and into pockets and bags. With a digital paging channel, it was easy to have the phone be almost completely shut down (except for the real-time clock) when it was on standby (not making a call). Of course, if it was shut down all the time there would be no way to tell the handset that it had an incoming call. But from experience with landlines, everyone is used to the idea that it takes a few seconds for a call to connect. So the phone only needs to wake up every few seconds at a precisely determined time, listen for a moment, and then go back to sleep unless it is told that there is an incoming call.
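
To get a feel for why this matters, here is a rough back-of-the-envelope sketch. All of the numbers (sleep and receive currents, paging interval, listen time, battery capacity) are made-up illustrative values, not figures from the GSM standard; the point is simply that waking briefly every couple of seconds gives an average drain close to the sleep current, while listening continuously runs the battery down in a day or less.

```python
# Back-of-the-envelope sketch of why the digital paging channel mattered for
# standby time. All numbers below are illustrative assumptions, not values
# from the GSM standard.

SLEEP_MA = 0.05          # assumed current while asleep (real-time clock only)
LISTEN_MA = 60.0         # assumed current while the receiver is on
PAGING_PERIOD_S = 2.0    # wake up every couple of seconds
LISTEN_TIME_S = 0.005    # listen for a few milliseconds each time

avg_ma = (LISTEN_MA * LISTEN_TIME_S +
          SLEEP_MA * (PAGING_PERIOD_S - LISTEN_TIME_S)) / PAGING_PERIOD_S
always_on_ma = LISTEN_MA  # analog paging channel: receiver on all the time

battery_mah = 1000.0      # assumed battery capacity
print(f"duty-cycled standby: {battery_mah / avg_ma:7.0f} hours")
print(f"always listening:    {battery_mah / always_on_ma:7.0f} hours")
```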

The encoding was Time Division Multiple Access or TDMA. Technically, it used FDMA as well, since there wasn't just a single frequency. Each channel was divided into four slots, and a call would be allocated to one of the slots. It would transmit for one-fourth of the time, then wait three-fourths of the time, then transmit again. This worked because voice had been compressed down to 16Kb/s, so four calls could fit into the 64Kb/s that a voice call required on landline networks. This required very precise timing, since one handset might be near the basestation and another 30 kilometers away. So handsets had to advance or retard their timing so that their call would arrive at the basestation in precisely its allocated slot (and vice versa). Plus, the phones might be moving on a high-speed train, shifting the frequency through the Doppler effect.
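
A quick calculation shows why the timing adjustment is needed. The 30 kilometers is the figure from the paragraph above; the roughly 3.69µs bit period (GSM's 270.833kb/s channel rate) and the 0-63 range of the timing advance are the GSM numbers; the rest is speed-of-light arithmetic.

```python
# Why timing advance is needed: a handset 30km from the basestation (the
# distance mentioned above) adds a round-trip radio delay comparable to a
# burst, so it has to start transmitting early by that amount.

C = 3.0e8                  # speed of light, m/s
BIT_PERIOD_S = 48 / 13e6   # GSM bit period, about 3.69us (270.833kb/s channel rate)

distance_m = 30_000
round_trip_s = 2 * distance_m / C
ta_steps = round(round_trip_s / BIT_PERIOD_S)  # timing advance is quantized in bit periods

print(f"round-trip delay: {round_trip_s * 1e6:.0f} us")             # ~200 us
print(f"timing advance:   {ta_steps} bit periods (GSM allows 0-63)")
```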

Another innovation in GSM was the SIM card (Subscriber Identity Module). The original SIM was credit-card sized. When GSM was developed, there was still no real comprehension of how ubiquitous and cheap mobile would get. It was expected that all cars and perhaps train seats would have a phone and you would put your SIM card into it. Of course, a few rich people might have an actual portable phone, but not us great unwashed masses. Luckily, provision was made for a smaller SIM, the mini-SIM, that could be punched out of the credit-card-sized one. In practice, the full-sized card was never used, and even the mini-SIM would later prove too large; smaller versions still, the micro-SIM and nano-SIM, have been introduced since. The functionality is the same in all cases. The SIM holds various technical data, but from a user's point of view, it appears to hold the phone number. What it actually holds is a more complicated number, the International Mobile Subscriber Identity or IMSI in GSM-speak.
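
As a sketch of what that looks like, the IMSI is a structured decimal string rather than a phone number: a three-digit mobile country code, a two- or three-digit mobile network code, and then the subscriber number. The snippet below just slices an example apart; the value is fabricated, and the MNC length genuinely varies by country, so treat it as an illustration rather than a parser for real SIMs.

```python
# The IMSI on the SIM is a structured identifier, not the phone number:
#   MCC (3 digits) + MNC (2 or 3 digits) + MSIN (subscriber number)
# The example value below is made up for illustration.

def split_imsi(imsi: str, mnc_digits: int = 2):
    mcc = imsi[:3]                 # mobile country code
    mnc = imsi[3:3 + mnc_digits]   # mobile network code (the operator)
    msin = imsi[3 + mnc_digits:]   # mobile subscriber identification number
    return mcc, mnc, msin

print(split_imsi("234150000000001"))   # ('234', '15', '0000000001')
```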

In the US, AMPS was succeeded at first by D-AMPS (Digital AMPS), also called IS-54. I think this was a poor decision, since it only made the voice channels digital, with a version of TDMA splitting the channel into three slots. This made it really easy to share basestations between AMPS and D-AMPS since they both used the same analog paging channel. But that meant that battery life was terrible: the phones could not properly go into standby, they had to listen all the time. IS-54 was eventually replaced by a standard with a digital paging channel called IS-136.

When more spectrum was opened up, most carriers transitioned from IS-136 to GSM. AT&T in particular, which had developed AMPS back in the Bell System era and later D-AMPS, switched to GSM, and D-AMPS development basically stopped.

CDMA

But there was another technology that started to be used in the US called Code Division Multiple Access, or CDMA. This had been developed from whole cloth by Qualcomm. I was skeptical it would ever work in practice; it seemed too elegant to cope with the real, noisy world, relying as it did on a mathematical abstraction called Walsh-Hadamard codes, originally researched in 1893.

But it did work, and it turned out to work better than TDMA in terms of the one thing that matters the most: spectral efficiency. You can read more about how CDMA works, and my adventures licensing it for VLSI Technology, in my post The CDMA Story and Qualcomm (spoiler alert: Qualcomm, today a company with over $25B in annual revenue, desperately needed $2M to make their quarterly number). CDMA didn't have the same timing challenges as GSM (or any TDMA system); instead, it had power challenges, since the transmission power of the radio needed to be changed dynamically thousands of times per second. CDMA is often compared to lots of people talking at a party in different languages. But if one person shouts too loudly (uses too much power), you can't hear the person you are trying to listen to. And it risks everyone gradually raising their voices more and more to be heard until everyone is shouting. The commercial CDMA technology was called cdmaOne and was standardized as IS-95.
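
If you want a feel for the Walsh-Hadamard trick, here is a toy spreading example. It is a sketch of the principle only: no noise, no radio, no power control, and nothing like the real IS-95 air interface. Two users spread their bits with different (orthogonal) rows of a Hadamard matrix, transmit on top of each other, and each receiver recovers its own bits by correlating against its own code.

```python
# Toy illustration of CDMA spreading with Walsh-Hadamard codes (a sketch of
# the principle, not the real IS-95 air interface).

def hadamard(n):
    """Sylvester construction: n must be a power of two."""
    h = [[1]]
    while len(h) < n:
        h = [row + row for row in h] + [row + [-x for x in row] for row in h]
    return h

H = hadamard(8)              # 8 mutually orthogonal codes of length 8
code_a, code_b = H[1], H[2]

def spread(bits, code):
    """Each data bit (+1/-1) is multiplied by the whole spreading code."""
    return [b * c for b in bits for c in code]

def despread(signal, code):
    """Correlate chip-by-chip with the code and take the sign per symbol."""
    n = len(code)
    out = []
    for i in range(0, len(signal), n):
        corr = sum(s * c for s, c in zip(signal[i:i + n], code))
        out.append(1 if corr > 0 else -1)
    return out

bits_a = [1, -1, 1]    # user A's data
bits_b = [-1, -1, 1]   # user B's data

# Both users transmit at the same time on the same frequency; the channel
# simply adds their chip streams together.
channel = [a + b for a, b in zip(spread(bits_a, code_a), spread(bits_b, code_b))]

print(despread(channel, code_a))   # [1, -1, 1]   -> user A recovered
print(despread(channel, code_b))   # [-1, -1, 1]  -> user B recovered
```

In a real system the codes are no longer perfectly orthogonal once the channel and asynchrony get involved, which is where the aggressive power control described above comes in.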

I gave a high-level view of the timing challenges of TDMA, altering timing to keep transmissions in their assigned slot at the receiver. If you think about it for a moment, this is going to make the handoff between basestations very tricky. Not only are the frequencies going to change, but also the distances and so the timing. The handoff also had to happen in the gap between transmissions. CDMA didn't have this problem since it had connections to multiple basestations as part of its architecture, so as the phone moved, one would get stronger, and another would get weaker until it faded completely.

Another advantage CDMA had is that when you were not saying anything, it didn't transmit. GSM didn't transmit either, to save power, but all the slots it would have needed to use still remained reserved and so used up bandwidth. In both technologies, if you listen when nobody is saying anything, you think you can hear background noise from the other end of the call. Actually, that is "comfort noise" inserted automatically, since people get worried and think the call has been dropped if there is complete silence. In 2G, calls were half-duplex, transmitting in one direction or the other but not both at once. If you try to talk over the other person on a 2G network and you can hear them, then they can't hear you (insert your own mansplaining joke here).

SMS aka Text Message

Another thing that happened in 2G was the invention of the text message. The GSM standard (and IS-136) included a part of the protocol known as the Short Message Service, or SMS. Originally, this grew from a similar technology used in wireline systems to send short messages using the control channel instead of the voice channel. The SMS protocol in the GSM standard limited the message length to 128 characters so it would fit in the control channel without disturbing real control messages (later increased to 160...I'm not quite sure how Twitter ended up with 140 for compatibility with SMS).
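
For what it's worth, the 160 figure falls straight out of the payload arithmetic: the message payload is 140 octets, and the GSM default alphabet uses 7 bits per character. A minimal sketch of the sums (ignoring the details of the real GSM 03.38 alphabet and the multipart and Unicode variants):

```python
# 140 octets of payload; the GSM default alphabet packs 7 bits per character.
PAYLOAD_OCTETS = 140
BITS_PER_CHAR_GSM7 = 7    # GSM 7-bit default alphabet
BITS_PER_CHAR_UCS2 = 16   # Unicode (UCS-2) messages

print(PAYLOAD_OCTETS * 8 // BITS_PER_CHAR_GSM7)   # 160 characters per message
print(PAYLOAD_OCTETS * 8 // BITS_PER_CHAR_UCS2)   # 70 characters if you text in Unicode
```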

The first text message was sent in 1992 when Neil Papworth, a 22-year-old engineer at Vodafone, sent "Merry Christmas" to Richard Jarvis, a director of the company, who was at a party celebrating exactly that. I assume test messages had been sent before, but this was the first non-test message. It was a good job it worked better than the first message on the Arpanet/internet, which only managed the "lo" of "login" before crashing. I expect the party continued with much Champagne. But I bet even the attendees had no idea how widespread text messaging would become.

Text messages completely changed how people used phones. Text messages were either free or cheap, and voice calls were expensive. People with no money, aka teenagers, switched from making calls to texting. Eventually, their parents got with the program, too. At one point, operators were making more money from text messages than their total profit, as if they ran the whole network just so they could sell text messages.

Africa and M-Pesa

2G was also the era when African countries started to build out their mobile networks. They had never really had wireline networks except in the biggest cities, so this allowed them to leapfrog that stage of development. So just as my kids will never have a landline, neither will a rural Tanzanian. Most of those networks were adequate for years, and so many of them are still running 2G today. Several African countries, starting with Kenya and Tanzania, could also leapfrog the fact that they didn't have a branch banking system and go straight to mobile for moving money and making payments. M-Pesa ("pesa" is Swahili for money) transfers several times the entire GDP of these countries every year. M-Pesa was implemented in a way that worked even if only SMS text messages were available. You didn't need a data-ready phone to use it.

One driver behind mobile payments is the observation that the world has 6+B cellphones, but only 2B bank accounts and only 1B credit cards. If you want to start payments or banking somewhere that doesn't have them already, then mobile is clearly the way to go. In China today, farmers selling vegetables at the side of the road take mobile payments, and apparently even beggars have QR codes to accept money.

Data Service

Adding data to voice is sometimes called 2.5G, since data is packet-switched and voice is circuit-switched (landlines are the same: the internet is packet-switched, the phone network is circuit-switched, at least on the surface).

The first data technology was General Packet Radio Service, or GPRS. This was then upgraded to Enhanced Data rates for GSM Evolution, or EDGE. You know how occasionally, when reception is really bad and you don't seem to be able to get any data at all, you see that E at the top right of your screen? It doesn't stand for error, it stands for EDGE. But we are so used to high datarates that when we drop back to EDGE rates it seems like the system is failing. The original GPRS rate was 56kb/s (yes, bits, so 7kB/s) and the "enhanced" EDGE rate went up to three times as fast. To be fair, these have continued to evolve, so when you see that E, you might be getting more like 400kb/s. When the iPhone first came out and didn't support 3G, it wasn't voice calls that people were complaining about (a call either works or it doesn't). It was the fact that the data rates were too low to have a good experience using internet-based applications like maps or search.
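
To put those rates in perspective, here is a trivial calculation of how long a single (hypothetical) 100KB page or map tile takes to fetch at each of the rates quoted above. The 100KB figure is purely an assumption for illustration, and this ignores latency and protocol overhead, which made the real experience worse still.

```python
# Rough time to fetch an assumed 100KB page at the rates quoted in the text,
# ignoring latency and protocol overhead.

PAGE_BYTES = 100 * 1024

for name, kbps in [("GPRS", 56), ("EDGE (original)", 3 * 56), ("EDGE (later)", 400)]:
    seconds = PAGE_BYTES * 8 / (kbps * 1000)
    print(f"{name:16s} {kbps:4d} kb/s -> {seconds:5.1f} s")
```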

Around this time I got my first phone that could make use of data, a Samsung Blackjack. It had a full mechanical keyboard and a small screen, but you could access email, maps, and similar services. I believe it was 3G-compatible, but 3G was actually very slow to roll out. Obviously, in both look and name, the Blackjack was designed to compete with RIM's BlackBerry. RIM sued, and the case was settled out of court.

These data technologies were software upgrades to the 2G systems and didn't require new infrastructure such as new antennas or additional basestations. They used unused slots in the normal TDMA voice channel to transmit the data. This actually turned the disadvantage versus CDMA that I mentioned above (those empty slots) into an advantage. Since CDMA didn't have the equivalent of unused slots, adding data proved more difficult and lagged behind GSM/EDGE.

3G

Although GPRS and EDGE provided some data functionality, it was not enough for convenient internet access. That would require more bandwidth, and more emphasis on data compared to voice. Although 1G and 2G had never been called by those terms at the time they were launched, 3G was explicitly named (and 1G and 2G were retroactively named). We ended the 2G era with two main standards, cdmaOne and GSM/EDGE. I think Japan was still going its own way. One result of this decision, shortsighted in my opinion, is that Japanese phone manufacturers had the home market to themselves, since nobody else could be bothered with a single network for "only" 150M people. But Japan gave up on being a global player in mobile, except for a half-hearted attempt at a joint venture between Sony and Ericsson called...you'll never guess...Sony Ericsson, based in Lund, Sweden.

Look for posts on 3G and 4G soon.

 

Sign up for Sunday Brunch, the weekly Breakfast Bytes email.