What is 4G broadband, and do any wireless carriers really have it?
Verizon says it is fast. AT&T says it is even faster. Mobile broadband is exploding and everyone claims they have the best fourth generation network coming to a legion of data-hungry consumers. Really, though, what is this supposed 4G?
What constitutes 4G mobile broadband?
The four major U.S. mobile carriers all claim to have superior fourth generation technology for their phones, whether in practical use or in production. Yet, if you ask Verizon Wireless, it will tell you that AT&T’s 4G is really not 4G. AT&T will retort that there is really no discernible difference between its product and Verizon’s, except that AT&T is faster. Sprint claims that its 4G blows everybody away.
Everybody is the fastest. Every carrier has the most connectivity. Every carrier claims the best phones.
This all has very little to do with technology.
It is marketing, pure and simple. Yes, there are discernible differences between the carriers and the technologies they employ, but one truth stands above all – pretty soon, everything is going to be so fast that it will hardly make a difference.
Related coverage:
Meet AT&T’s 4G – almost the same as the 3G
BlackBerry reveals plans for 4G tablet PC
5 smart choices in smart phones
To understand the 4G wars, it helps to first understand how we got to this supposed 4G in the first place. After all, 4G implies that there were three generations before it.
So, how did we get here? Let’s take a look at the timeline:
(Note: Thank you to Subrahmanyam Kvj, a telecom and media consultant from Mumbai, India, for helping me answer this question on the social media question and answer site Quora.)
1G, Pre-1990. Advanced Mobile Phone System (AMPS), Frequency Division Multiple Access (FDMA), 9.6 kilobits/sec.
Analog technology. Think of the first generation as any mobile phone you saw before the rise of the personal computer. These phones were voice-focused, with no ability to handle text or data. They were satellite or car phones, often needed a suitcase full of gear to pick up a signal and were about as big as a shoebox. Zack Morris from "Saved By The Bell" and his huge cell phone? That probably ran on AMPS.
2G, 1991-2000. Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), 14.4 kilobits/sec.
Cell phones become digital. The cell phone as you know it came to life in the 1990s, with the ability to send text messages and talk on reasonably sized phones from nearly anywhere. This is also where current American mobile standards first split between CDMA (Verizon/Sprint) and GSM, the Global System for Mobile Communications (AT&T/T-Mobile).
2.5G, 2000-present. General Packet Radio Service (GPRS), CDMA 1x, 56 to 156 kilobits/sec.
The rise of the BlackBerry. If you are in government, you are probably using a BlackBerry and have probably seen GPRS or CDMA at the top of your screen, telling you what connection you are using for your basic data services. These standards enabled simple data browsing through Wireless Application Protocol (WAP) connections, giving rise to the "mobile web" – the simplified mobile version of the Internet that predated WebKit-based browsers (such as the iPhone's) and richer mobile media.
2.75G, 2003-present. Enhanced Data Rates For Global Evolution (EDGE), 236.8 kilobits/sec.
This is the GSM standard that many 3G phones (AT&T and T-Mobile phones specifically) currently fall back to when 3G is unavailable. The iPhone was initially an EDGE-based phone.
3G, 2000-present. Universal Mobile Telecommunications System (UMTS), CDMA 2000, up to 8 megabits/sec.
This is where the current 3G war is being fought. The International Mobile Telecommunications-2000 (IMT-2000) standard defines what qualifies as "3G." This is where mobile bandwidth became robust enough to handle the Era of the App, as found in the Apple App Store, Android Market or BlackBerry App World. 3G can handle full rendering of websites or enriched mobile versions. UMTS evolved from the GSM standard and is maintained by a global collaboration of groups called the 3rd Generation Partnership Project (3GPP). That is not to be confused with 3GPP2, the parallel collaboration behind the CDMA 2000 standard.
3.5G, 2006-present. High Speed Packet Access (HSPA), 14 megabits/sec downlink, 5.6 megabits/sec uplink.
A further evolution of the GSM standard, currently used by AT&T. The CDMA counterpart is Evolution-Data Optimized Revision B (EV-DO Rev. B).
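To give those figures some practical meaning, here is a rough back-of-the-envelope sketch (not part of the original article) estimating how long a 5 MB file download would take at each generation's nominal peak rate from the timeline above. The 5 MB file size is an illustrative assumption, and real-world throughput runs well below these peaks.

```python
# Back-of-the-envelope estimate: download time for a 5 MB file at each
# generation's nominal peak downlink rate from the timeline above.
FILE_SIZE_BITS = 5 * 8 * 1_000_000  # 5 megabytes expressed in bits

nominal_downlink_kbps = {
    "1G (AMPS)": 9.6,
    "2G (TDMA/CDMA)": 14.4,
    "2.5G (GPRS/CDMA 1x)": 156,
    "2.75G (EDGE)": 236.8,
    "3G (UMTS/CDMA 2000)": 8_000,
    "3.5G (HSPA)": 14_000,
}

for generation, rate_kbps in nominal_downlink_kbps.items():
    seconds = FILE_SIZE_BITS / (rate_kbps * 1_000)  # bits divided by bits per second
    print(f"{generation:<22} {seconds:>10,.1f} seconds")
```

At 1G's 9.6 kilobits/sec, that 5 MB file would take more than an hour; at 3.5G's 14 megabits/sec peak, it takes under three seconds.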
That brings us to 4G.
Or does it?
Technically speaking, true 4G does not yet exist. International Mobile Telecommunications-Advanced (IMT-Advanced, the more technical name for 4G), as defined by the International Telecommunication Union (ITU, an agency of the United Nations), requires a downlink speed of 1 gigabit/sec for low-mobility users – those who are stationary or moving slowly – and 100 megabits/sec at higher travel speeds, such as on a train. It must also have carrier flexibility and be based on an all-IP packet-switched network (which more or less means fully compliant with IPv6).
None of the carriers can meet such rigorous criteria. Even the so-called 4G technologies coming into play these days are considered "pre-4G" and are a stopgap between the initial 3G standards and true IMT-Advanced.
This is where the “I am the fastest” 4G marketing comes into play.
Verizon Wireless has been working since 2007 to implement Long Term Evolution (LTE), a branch of the GSM tree and the next step after UMTS and HSPA. Popular mindshare treats LTE as "true" 4G technology. That is not the case. LTE will dramatically increase speeds but will not hit that 1 gigabit/sec threshold. It will, theoretically, provide 100 megabits/sec downlink speeds (50 megabits/sec uplink), a significant upgrade over current advanced 3G speeds, which rarely exceed 14.4 megabits/sec. AT&T's work to deploy LTE technology through its infrastructure is behind Verizon's. T-Mobile and Sprint are also known to be looking into the standard.
The next step in LTE will be LTE-Advanced, which is expected to be deployed after 2013. By the IMT-Advanced definition of 4G, this iteration is supposed to hit the mark and has been recognized by the ITU as a distinct fourth generation. In the same category is Worldwide Interoperability for Microwave Access – Advanced (WiMax Advanced). WiMax is currently the standard Sprint uses for its supposed 4G network; it can theoretically reach 70 megabits/sec downlink, with somewhere between 10 and 30 megabits/sec more likely for actual users.
So, why do all the carriers claim that they have 4G networks when in reality none of them do? Well, on Dec. 6, 2010, the ITU announced that carriers could use the “4G” name for “evolved 3G technologies providing a substantial level of improvement in performance and capabilities with respect to the initial third generation systems.”
That means AT&T and T-Mobile can now call their networks 4G even though they do not currently deploy (in any meaningful way, at least) the "pre-4G" standards of LTE or WiMax. Both carriers run their advanced 3G networks on HSPA+, which is an upgrade over UMTS and can reach a maximum downlink speed of 56 megabits/sec.
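To see how far those "pre-4G" peaks sit from the real thing, here is a crude, illustrative comparison (not an official ITU test) of each technology's theoretical peak downlink, as cited in this article, against the 1 gigabit/sec IMT-Advanced mark. The ITU targets apply to different mobility conditions, so treat this only as a rough gauge.

```python
# Crude comparison of theoretical peak downlink rates (as cited in this article)
# against the IMT-Advanced figures. Not an official ITU conformance test.
IMT_ADVANCED_STATIONARY_MBPS = 1_000   # 1 gigabit/sec for stationary or slow-moving users
IMT_ADVANCED_HIGH_MOBILITY_MBPS = 100  # 100 megabits/sec for fast-moving users

peak_downlink_mbps = {
    "HSPA+ (AT&T, T-Mobile)": 56,
    "WiMax (Sprint)": 70,
    "LTE (Verizon)": 100,
}

for tech, peak in peak_downlink_mbps.items():
    shortfall = IMT_ADVANCED_STATIONARY_MBPS - peak
    print(f"{tech:<24} peak {peak:>4} Mbit/s, "
          f"{shortfall:>4} Mbit/s short of the 1 Gbit/s IMT-Advanced mark")
```

Even the fastest of the three, LTE at a theoretical 100 megabits/sec, reaches only a tenth of the stationary-user target that defines true 4G.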
Outside of the hazy ITU proclamations and definitions, 4G is mostly about semantics. During Verizon's Jan. 11 unveiling of the iPhone 4 on its network, CEO Lowell McAdam implied that he would rather call his LTE network "6G" than "4G," especially now that the company's chief rival, AT&T, is using the 4G label for its HSPA+ rollout.
For the most part, once a carrier network, be it HSPA+, LTE or WiMax, reaches a certain speed (50 megabits/sec, for example), users in the near term will not notice the difference. Phone applications can only do so much, and the current crop will run beautifully at speeds that fast. Consumers will start to notice, though, when richer and more sophisticated mobile applications that run on carrier data come into play and fully use the advantages that 4G (IMT-Advanced, if you prefer) and pre-4G have to offer.
To put this all in context, the landline link for the computer this story is being written on runs at 100 megabits/sec. On the office's WiFi (802.11g), the maximum is 54 megabits/sec, and that assumes full network stability.
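For one more frame of reference, a small side-by-side sketch (figures drawn from this article; the WiFi number is the 802.11g standard's 54 megabits/sec maximum) lines up those office connections against the mobile rates discussed above.

```python
# Side-by-side of the office links mentioned above and the theoretical mobile
# rates cited in this article. All figures are peak or nominal, not measured.
links_mbps = {
    "Office landline": 100,
    "Office WiFi (802.11g maximum)": 54,
    "HSPA+ theoretical downlink": 56,
    "WiMax theoretical downlink": 70,
    "LTE theoretical downlink": 100,
    "IMT-Advanced (true 4G) target": 1_000,
}

for name, mbps in sorted(links_mbps.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name:<32} {mbps:>5} Mbit/s")
```

In other words, the "4G" networks being marketed today are already in the same range as a decent office connection, and true 4G is an order of magnitude beyond it.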
In government, that means agencies should start looking at the ability to create specific, project-oriented mobile applications that can increase the efficiency of their workforces. Whether that means creating and distributing multimedia or running sophisticated, deep data queries, the capabilities of mobile devices will skyrocket within the next five years as wireless standards advance. Think about it. Most current landline broadband services cannot carry 1 gigabit/sec to workstations and laptops. Sooner rather than later, your phone will be able to do it.
What will you do with all that data?