1, 2, 3 G!
Evolution
by Pamela Weintraub
The key to 3G is speed – the ability to transmit and
receive digital data at rates of about 2 million bits per
second, more than 35 times faster than today's fastest dial-up
personal computer modems and more than 200 times the speed of
most current handheld wireless data devices. Today's crazy
quilt of incompatible wireless standards prevents most mobile
phones from working in far-flung areas of the world. The
developers of 3G technology hope to create a single, unified
global standard – known in Europe as the Universal Mobile
Telecommunications System, or UMTS.
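Those multiples check out on a rough back-of-the-envelope basis,
assuming a 56 kbit/s dial-up modem and a 9.6 kbit/s handheld data
channel as the baselines (era-typical figures, not numbers given
in the article):

```python
# Rough sanity check of the speed comparisons above.
# The 56 kbit/s and 9.6 kbit/s baselines are assumed, era-typical figures.
THREE_G_BPS = 2_000_000      # ~2 million bits per second for 3G
DIALUP_BPS = 56_000          # fastest dial-up PC modem (assumed)
HANDHELD_BPS = 9_600         # typical handheld wireless data device (assumed)

print(f"3G vs. dial-up:  {THREE_G_BPS / DIALUP_BPS:.0f}x")    # ~36x, i.e. "more than 35 times"
print(f"3G vs. handheld: {THREE_G_BPS / HANDHELD_BPS:.0f}x")  # ~208x, i.e. "more than 200 times"
```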
To understand the 3G future, says Ericsson's Gunnar
Liljegren, it's important to grasp the underpinnings of cell
technology, including the older 1G and 2G. Until the late
1970s, mobile telephony was virtually impossible due to a
fundamental problem: technological limitations confined
wireless phone calls to a small area known as a cell, covered
by a single base station, or radio tower. Single cells
couldn't service a big enough area to make the business
viable. The moment a user left the area of a given cell for
another, the signal fell apart and the call came to an end.
Just as important, a user could access the system only through a
designated cell tower; when the user "roamed" outside the cell
area to the vicinity of another tower, the network could no
longer recognize the account.
It was the new technology of computer-controlled switching
that overcame these problems, enabling the first generation of
mobile phones to reach the market: mobile networks could
automatically switch radio waves from one cell to the next,
keeping phone calls intact as users moved around. "Today you
can start in New York and then go to L.A. or even London, and
the system will find you," Liljegren says. "The
computer-controlled switches were as pivotal to the cellular
industry as the transistor was to electronics and silicon was
to computers."
By the early 1980s, first-generation systems had rolled out
in Europe and the Middle East. Though composed of a hodgepodge
of technologies from one region to the next, 1G systems all
shared the same essential handicaps: more popular than anyone
had predicted, they were hampered by inefficient use of the
designated spectrum, and by analog technology that made it
easy for eavesdroppers to listen in.
Up and running by 1990, second-generation systems, based on
digital technology, were far more robust, efficient and
private. Today they offer a host of features including voice
mail, stock trading and text messaging, as well as e-mail. But
although improved, the 2G systems failed to anticipate the
extraordinary changes ushered in by globalization, increases
in computing power and the Internet. "All you needed to do was
look at the trends in fixed telecommunications systems," says
Liljegren, "to see where cell systems fell short."
Indeed, wired phone lines were carrying so much data that data
traffic had eclipsed voice in terms of revenue and demand. To
keep up, wireless would have to deliver data, too –
and not just text, but also graphics and multimedia.
Designed to be faster and more efficient than 2G systems, 3G
systems are meant to provide true intelligence, such as
navigational capability that can point users toward city subway
stations or along country roads.
Not only would 3G need to be fast, it would also have to work
in sync with the protocols that already powered the World Wide
Web. Finally, 3G would have to strive for a level of
integration never before seen in the mobile-phone universe:
instead of varying from locality to locality, systems would
need to be more or less uniform from one region to the next.