Japan’s Rakuten is the first global mobile network operator (MNO) to fully virtualize its network, with millions of active customers on commercial service. Rakuten has leveraged its expertise as a fully virtualized operator to create the Rakuten Communications Platform (RCP), which packages its vendor portfolio and deployment expertise and markets them to other operators that also want to run their networks in the cloud. Interestingly, the vendor product portfolio being sold is more extensive than what Rakuten has chosen to deploy itself. Based on news reports, Rakuten has already signed up 15 customers for RCP. The first publicly announced RCP trial partner is Ligado, a would-be US operator that owns spectrum previously licensed for satellite service. Until recently, Ligado was locked in a fight with the Department of Defense and NTIA over potential interference with GPS, but the FCC sided with Ligado and allowed it to use its spectrum for commercial service.
Based on our research, the RCP universe consists of the following players:
Area | Company
4G Core | Cisco
Converged 4G/5G Core | NEC
Servers | Quanta
Service orchestration | Innoeye (acquired by Rakuten)
IMS/RCS | Mavenir
Open RAN software | Altiostar
4G Sub-6 GHz Radios | Nokia
5G Sub-6 GHz Radios | NEC
4G/5G mmWave Radios and Software | Airspan
Qualcomm and Intel are also mentioned as RCP participants but are apparently involved mostly as silicon providers in their particular areas of expertise.
Rakuten continues to use Innoeye, which it purchased outright, for orchestration, and Altiostar, in which it holds an equity investment, for Open RAN software. Mavenir continues to supply the IMS/RCS software, and Quanta provides the servers.
Rakuten made several adjustments to the RCP vendor portfolio when it added 5G support to the network. It switched its 4G packet core provider from Cisco to NEC, which will work with Rakuten on building a converged 4G/5G core. NEC also replaced Nokia for the 5G sub-6 GHz radios, as Nokia only provides 4G radios for Rakuten. This change was surprising because the NEC 5G radios are actively cooled, whereas state-of-the-art radios are passively cooled. Rakuten did not switch mmWave radio providers; Airspan continues to supply them for both 4G and 5G.
One of the powerful features of RCP is that an MNO can mix and match from any vendor in the RCP portfolio. An MNO that prefers Nokia as its radio vendor could, for example, use Nokia’s mmWave product for 5G, or it could use Airspan radios for both mmWave and sub-6 GHz and run either Airspan’s or Altiostar’s Open RAN software.
RCP fits into an interesting sweet spot in the market. Most large MNOs, especially in the United States, will chart their own path toward Open RAN based on how it fits into their current network. Changing or introducing vendors for such a significant network transition is like changing an airplane’s engines in midflight. Small operators, especially if they have already chosen Huawei equipment, are locked in. These operators typically have lean engineering and operations staffs that are not trained or sized for such a significant network transition. This makes small operators dependent on large network providers as prime project managers and for vendor financing. Huawei’s growth to become the largest global provider of 5G equipment to MNOs has been based on significant deployment and customer service capability, to the point where almost every Huawei-powered network is a custom network with generous vendor financing packages. A side effect of this per-network customization is that it makes it difficult for other vendors to win a part of the network equipment business. Medium-size MNOs and greenfield operators that are not dependent on vendor financing, as well as the small rural MNOs in the United States that have to replace the Huawei equipment in their networks and are collectively being paid $1.9 billion to do so, are prime targets for RCP. Rakuten’s offering lets MNOs jump to state-of-the-art software-defined networks with Open RAN. Software-defined networks are more flexible and cheaper to operate, and due to the standardization of hardware, they are also less expensive to buy.
The rural operators who are replacing their equipment, in particular, should invest in the technologies of the future, SDN and Open RAN, regardless of whether they choose RCP or a custom route, rather than in the integrated hardware and software of the past. For rural MNOs who have to run their networks with a lean team, SDN’s automation allows the network operations teams to be more efficient and effective. Before RCP, the path to SDN and Open RAN was quite daunting, as, for example, Dish’s Charlie Ergen remarked during the Q4 2020 earnings call. RCP solves the complexity problem by allowing rural MNOs to use a working suite of products from another network operator. As a further bonus, several MNOs could combine their network operations and share a common core and operations team for additional cost benefits. Switching to SDN and Open RAN would also align with President Biden’s Buy American initiative. Several leaders in the field, such as Airspan, Altiostar, Cisco, and Mavenir, are American companies. For too long, we have complained that there are no American telecom network equipment providers. Now the telecom industry has an opportunity to diversify its vendor base.
In recent years, the FTC raised concerns that Qualcomm’s patent portfolio and licensing scheme would prevent other companies from manufacturing and selling 5G chipsets, leading to an anti-trust lawsuit that concluded in November 2019. However, the prediction has not been borne out. Currently, two companies, Qualcomm and MediaTek, sell 5G chipsets to the device ecosphere at large, two captive suppliers make their own 5G chipsets for internal consumption, and one company is creating its own new 5G chipset, also for internal consumption.
The mobile chipset business has a series of players with different objectives. Companies like Qualcomm and MediaTek provide mobile chipsets to device manufacturers and serve the vital function of ecosphere enablers. Without them, the plethora of devices and choices consumers enjoy when it comes to smartphones would not be possible. Another set of companies are making mobile chipsets only for themselves to create a competitive advantage in the marketplace. Apple and Samsung fall into this camp. Huawei is potentially a hybrid case as it was previously only providing its own handset group with chipsets, but now also provides them to a Chinese state-led consortium that purchased the Honor handset line.
Company | Customer | Modem | Integrated SoC | Stand-alone Application Processor | RF Frontend
Qualcomm | Ecosphere | Yes | Yes | No | Yes
MediaTek | Ecosphere | No | Yes | No | No
Huawei | Divested divisions | Yes | Yes | Discontinued | No
Samsung | Captive | Yes | Yes | No | No
Apple | Captive | Future | Future | Yes | No
Intel (sold to Apple) | Ecosphere | Yes | No | Aborted | No
Currently, Qualcomm provides high-quality systems on chip (SoC) that integrate multiple components, ranging from baseband, AI, graphics, and camera to CPU, into one chip and sells them to anyone interested. Qualcomm was the first company to offer 5G chipsets, with the first devices hitting the market at the end of 2019. MediaTek offers a similar, but less advanced and less integrated, product line to device manufacturers looking for low-end to mid-range chipsets. By the middle of 2020, MediaTek’s chipsets were powering a broad portfolio of handsets.
Intel, another ecosphere provider, sold its mobile chip business to Apple in December 2019, nine years after it entered the mobile chip market by buying a division of Infineon. Intel’s motivation to buy Infineon was that Infineon was the sole provider of modems to Apple. Reportedly, during the negotiations between Intel and Infineon, then-Intel CEO Paul Otellini sought reassurances from then-Apple CEO Steve Jobs that Apple would continue to use Infineon products after the acquisition, as Otellini recognized the importance of Apple as a chipset customer. During the nine years after the Infineon acquisition, the fate of Intel’s mobile chipset division was intricately linked to Apple, as Intel struggled to find other customers among mobile device manufacturers. In a nutshell, Intel was unable to compete with Qualcomm on quality, such as RF performance and SoC integration, and unwilling to compete with MediaTek, which had a more integrated solution than Intel did. Intel ultimately threw in the towel on the heels of Apple and Qualcomm settling their lawsuit and agreeing to a six-plus-two-year licensing agreement and a multiyear chipset supply agreement.
Huawei, through its HiSilicon subsidiary, has developed its own 5G chipsets and integrated them into its own devices. While the Huawei chipsets are not as integrated and small as Qualcomm’s, Huawei’s engineers have found ways to integrate them into its devices. It uses Qualcomm, Skyworks, and Qorvo, all from the US, for its RF front end. Huawei’s role in the mobile world got a lot more interesting when it sold its Honor-brand device division to a Chinese state-led consortium of more than three dozen companies, after experiencing heavy pressure on its device sales due to American sanctions. Reportedly, Huawei is also considering selling its Mate and P-line device groups in the hope that American sanctions will not follow the device businesses to their new owners. Until now, Huawei has not sold its HiSilicon chipsets to other companies, apart from the group of Huawei dealers that acquired the Honor-brand device division, in order to keep its best technology captive as a competitive weapon. In 2019, during the trade tensions between the US and China over Huawei, the company offered to license its 5G intellectual property to American companies to alleviate any spying concerns, but no deal has emerged to date. If Huawei divests its entire device portfolio, it might either divest its HiSilicon division with it or become an ecosphere provider for other handset manufacturers. The direction of Huawei’s HiSilicon business will be quite telling of the height of the Chinese walls between Huawei, its divested handset businesses, and other handset vendors.
Samsung has been producing its own Exynos modems and mobile processors and has also purchased mobile chipsets from Qualcomm. Samsung’s new 5G devices, including its S20 5G flagship smartphone, ship with either the Exynos or the Qualcomm Snapdragon chipset. Samsung sells the Qualcomm variant in the US, China, and most recently South Korea, and the Exynos variant in the rest of the world. Benchmarking has shown that the Qualcomm version regularly outperforms the Exynos one and that Samsung uses the Qualcomm variant in its most competitive markets to close the gap with Apple’s iPhone.
In 2008, Apple, with its computer heritage, bought P.A. Semi, a processor development company specializing in highly power-efficient designs, to build its own ARM-based processors for iPhones, iPads, and similar devices. Apple’s ARM processors are now among the fastest CPUs in the market and will start powering Apple Mac computers in 2021. Apple sourced its baseband chipset first from Infineon, then from Qualcomm, then dual-sourced from Intel and Qualcomm, and most recently, in 2019, signed an agreement to return to Qualcomm. In 2019, Apple also bought Intel’s baseband chipset business and has started hiring more wireless engineers in San Diego, Qualcomm’s home market. Considering Apple’s track record, it is quite logical that Apple will try to replicate its successful ARM processor endeavor in modems and internally source its 5G mobile chipsets when the Qualcomm agreement expires. The Qualcomm agreement gives Apple breathing room to pour its resources into an area that is a key differentiator between mobile devices.
These successful 5G chipset endeavors demonstrate that Qualcomm’s patent portfolio and licensing policy do not present a significant barrier to innovation. Qualcomm’s licensing rates have not changed since it first started licensing CDMA in the 1990s, while its portfolio has grown substantially and continues to be licensed on a fair, reasonable, and non-discriminatory basis, facilitating the continued innovation that has made the United States a leader in international telecommunications. As silicon merchants to the industry, Qualcomm and MediaTek create choice and opportunity: many mobile device manufacturers can buy a chipset that meets their needs and budgets, which greatly increases the range of consumer choices without infringing on the ability of other companies to enter the market.
The Super Bowl is not only the pinnacle of the American football season; it is also the showcase event for wireless carriers. Every year, every wireless carrier sets aside tens of millions of dollars to improve the wireless network for the big game. The newest and best gets installed so that attendees and regular citizens alike get a superior experience. One of the challenges that mobile operators have to overcome in order to provide faster speeds is how to best use the spectrum they have. Each carrier’s spectrum portfolio contains different amounts of spectrum, ranging from 600 MHz and 700 MHz all the way to 39 GHz. To increase the speed for customers, these spectrum slivers can be bonded together in a process called carrier aggregation (CA). The maximum channel bandwidth is 20 MHz for LTE and 100 MHz for 5G. So if a mobile network operator had 1,200 MHz of contiguous spectrum with 5G, it could aggregate twelve 100 MHz channels for the maximum amount of bandwidth, which directly translates into speed.
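The channel arithmetic above can be sketched in a few lines. This is an illustration of the math only; real-world CA also depends on supported band combinations and device capabilities:

```python
# Per-channel bandwidth caps from the text: 20 MHz for LTE, 100 MHz for 5G NR.
LTE_CHANNEL_MHZ = 20
NR_CHANNEL_MHZ = 100

def aggregated_channels(holding_mhz: int, channel_mhz: int) -> int:
    """How many full-width channels fit into a contiguous spectrum holding."""
    return holding_mhz // channel_mhz

# The article's example: 1,200 MHz of contiguous spectrum on 5G.
channels = aggregated_channels(1200, NR_CHANNEL_MHZ)
print(channels)                    # 12 channels
print(channels * NR_CHANNEL_MHZ)  # 1200 MHz of aggregated bandwidth
```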
Global Wireless Solutions (GWS) conducted network testing during the 2021 Super Bowl in Tampa and provided some additional insights that typically don’t get mentioned in other tests. GWS found that AT&T used 8 channel CA, using 800 MHz of its 39 GHz spectrum; Verizon used 6 channel CA, using 600 MHz in the 28 GHz band; and T-Mobile used 4 channel CA, using 400 MHz in 39 GHz plus 80 MHz in the 2.5 GHz band. Availability in the 28 and 39 GHz bands was almost the same for every carrier, at 73% to 77%, which isn’t surprising considering the relatively small area covered. T-Mobile’s 2.5 GHz coverage added another 20% of coverage. Considering that more spectrum means more speed, it comes as no surprise that AT&T was the fastest with peak speeds of 1.7 Gbps, followed by Verizon with 1.5 Gbps and T-Mobile with 1.1 Gbps.
The GWS tests show that it is not only important to have a lot of spectrum; it is even more important to use it. Not only does the network have to be ready for it, but also the device. Most flagship devices use the Qualcomm Snapdragon X55 modem. Devices using this modem, like the Samsung Note 20 that the GWS engineers used for the tests, can utilize 8 channel CA in the mmWave bands with 2×2 MIMO, 200 MHz with 4×4 MIMO below 6 GHz, and 7×20 MHz channel CA for LTE. This means that when T-Mobile uses all of its (on average) 130 MHz of 2.5 GHz spectrum, current flagship devices will be ready. When Verizon and T-Mobile are ready to follow AT&T’s lead with 8 channel CA, the devices will also be able to support that. The newer Qualcomm X60 modem can aggregate spectrum below and above 6 GHz. This allows combining the better coverage characteristics of sub-6 GHz spectrum with the massive spectrum bands, and therefore speeds, that are available above 6 GHz. The X60 modem also works with the upcoming C-Band networks that will probably become available in 2022.
The bidding for licenses in the C-Band auction has ended with bids of $81 billion for 280 MHz, surprising most observers. The C-Band auction exceeded the previous record holder, the 2015 AWS-3 auction, which yielded $44.9 billion. While the C-Band raised almost twice as much as AWS-3, one thing to consider is that different amounts of spectrum were for sale: 280 MHz in the C-Band auction versus 65 MHz in the AWS-3 auction. In terms of dollars per MHz per person covered ($/MHz Pop), the metric by which we compare different amounts of spectrum and population coverage, the AWS-3 auction is still the most expensive auction in US history.
What are the drivers for spectrum prices? Looking at some of the most important auctions over the last 15 years gives us some important pointers, both when comparing prices within an auction and from auction to auction.
Auction | Year | Band (MHz) | MHz sold | Result | $/MHz Pop | Notes
C-Band | 2021 | 3700 | 280 | $80.9b | $0.94 | Unencumbered spectrum
CBRS | 2020 | 3500 | 70 | $4.6b | $0.22 | Maximum 30 MHz per bidder, combined with unlicensed use, limited power
600 MHz | 2017 | 600 | 70 | $19.6b | $0.88 | AT&T & Verizon mostly ineligible to bid
AWS-3 paired | 2015 | 1700/2100 | 50 | $42.5b | $2.71 | Unencumbered spectrum
AWS-3 unpaired | 2015 | 1700 | 15 | $2.4b | $0.52 | Unpaired spectrum
700 MHz B Block | 2008 | 700 | 12 | $9.06b | $2.24 | Unencumbered spectrum
700 MHz A Block | 2008 | 700 | 12 | $3.87b | $1.17 | Needed filters
700 MHz C Block | 2008 | 700 | 22 | $4.75b | $0.76 | Net neutrality conditions
700 MHz E Block | 2008 | 700 | 6 | $1.26b | $0.74 | Unpaired spectrum
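The $/MHz Pop numbers in the table can be reproduced to a good approximation from the gross proceeds. This sketch assumes a flat nationwide population of roughly 309 million, which is an assumption; the FCC computes pops license by license, so the results are only approximate:

```python
# Reconstructing the $/MHz Pop metric: gross proceeds divided by the amount
# of spectrum sold times the population covered by the licenses.
POP_COVERED = 309e6  # assumed nationwide population; actual pops vary per auction

def usd_per_mhz_pop(proceeds_usd: float, mhz: float, pop: float = POP_COVERED) -> float:
    return proceeds_usd / (mhz * pop)

print(round(usd_per_mhz_pop(80.9e9, 280), 2))  # C-Band: 0.93, table shows $0.94
print(round(usd_per_mhz_pop(42.5e9, 50), 2))   # AWS-3 paired: 2.75, table shows $2.71
```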
Conducted in 2008, the 700 MHz auction sold four different blocks of spectrum. The A Block had significant interference issues because some broadcasters were still operating in the spectrum at the time, and handsets required filters in order to work properly. However, none of the handsets at the time had those filters. The B Block was clean, ready-to-use spectrum. The C Block came with net neutrality provisions attached because Google had promised the FCC it would bid on the spectrum. The FCC at the time was very eager to incentivize a new entrant into the US wireless market, and it interpreted what Google had told it to mean that Google would win the C Block and become a mobile network operator. Finally, the E Block was a sliver of unpaired spectrum.
To no one’s surprise, the B Block, the clean, ready-to-use spectrum, sold for the highest price. The problematic A Block sold for half the price of the B Block. The C Block sold for one third of the B Block after Google bid once in the first round, Verizon topped it in the second round, and nobody else bid on it in the next 222 rounds of the auction. The smaller E Block of unpaired spectrum sold for a tiny bit less than the C Block because, at the time, nobody really knew what to do with it, as the technology to use it was not yet mainstream.
In the 2015 AWS-3 auction, clean paired spectrum again sold for substantially more than unpaired spectrum. This time, it was more than five times the amount: $2.71 per MHz Pop compared to $0.52. Since the AWS-3 block lay next to the AWS-1 block, which mobile operators had already deployed, it was very easy and cheap to use the spectrum without incurring infrastructure cost, as the same equipment could be used. Basically, roughly the avoided infrastructure cost went into the auction bids and drove up the price of the spectrum, since operators budget based on the total cost of ownership: spectrum plus deployment.
Prior to the 2017 600 MHz Broadcast Incentive Auction, T-Mobile and regional operators argued that AT&T and Verizon had too much low band spectrum for others to be competitive and therefore should not be allowed to bid in the 600 MHz auction. T-Mobile argued that it had no 700 MHz spectrum and was therefore at a disadvantage, omitting that it had chosen at the time not to bid in the 700 MHz auction. Long story short, AT&T and Verizon were excluded from bidding on the vast majority of licenses. T-Mobile won more than half of the spectrum on offer in the auction (37 MHz of 70 MHz) for one third of the price ($0.88 per MHz Pop) of what spectrum went for at the AWS-3 ($2.71 per MHz Pop) and 700 MHz ($2.24 per MHz Pop) auctions, since the regional operators did not have the financial resources to compete effectively with T-Mobile and Sprint chose not to participate.
Fast forward three years: T-Mobile buys Sprint for $26 billion. Sprint owned between 160 and 194 MHz of 2.5 GHz spectrum in the top 100 markets, with a nationwide average of 137 MHz, plus 37 MHz in the 800 MHz, PCS, and AWS bands, and the Department of Justice and the FCC only required T-Mobile to divest 14 MHz of 800 MHz spectrum to Dish for $3.6 billion. Suddenly, T-Mobile has three times the low- and mid-band spectrum of Verizon and twice that of AT&T. Well played, T-Mobile, well played!
In 2020, the FCC auctioned off 70 MHz of the 150 MHz CBRS band. Auction winners could buy up to three 10 MHz Priority Access Licenses (PAL) and use them exclusively, in addition to the 80 MHz of General Authorized Access (GAA) spectrum available to everyone. Wherever PAL licenses were not sold, that spectrum reverted to GAA and could also be used by anyone. This novel approach of combining shared access with licensing yielded an auction total of $4.6 billion for the US Treasury, or $0.22 per MHz Pop. This is by far the lowest amount per MHz Pop of all the auctions: one quarter of the 600 MHz auction, one tenth of the 700 MHz auctions, and one twelfth of the AWS-3 auction.
A few months after the CBRS auction concluded, the C-Band auction took place. The two spectrum bands are adjacent to each other and have identical propagation characteristics. Similar spectrum sold at almost the same time could lead one to believe that the prices for the two bands would be very similar. Instead, the C-Band spectrum, which is clean and unencumbered, yielded more than four times the price per MHz Pop of the CBRS auction with its sharing regime. Furthermore, CBRS licenses were auctioned at the county level, whereas C-Band licenses were auctioned on a Partial Economic Area basis, which covers much larger areas.
Comparing the different prices, the following drivers of spectrum proceeds become obvious and should be considered by the FCC when designing upcoming spectrum auctions. After all, it’s the taxpayer’s money:
Exclusivity and larger license areas: Exclusive use in larger license areas has a 4x premium over shared use – C-Band versus CBRS (427%)
Clean spectrum: Cleared spectrum without incumbents sharing the spectrum is up to 2x as valuable – A Block versus 700 B Block (191%)
No bidder restrictions: Allowing everyone to participate in an auction increases spectrum value by up to 3x – AWS-3 paired versus 600 MHz (307%) or 700 MHz B Block versus 600 MHz (254%)
No restrictions on business models: Lack of business model restrictions increases spectrum value by 3x – 700 MHz B Block compared to 700 MHz C Block (294%)
Propagation characteristics: Unrestricted low band spectrum is 3x as valuable as mid band spectrum – AWS-3 paired compared to C-Band (288%)
Paired and unpaired spectrum no longer matters: The historic 3x difference between paired and unpaired spectrum does not have a technical reason anymore as 5G works better on unpaired spectrum.
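The multipliers in the list above are simply ratios of the $/MHz Pop column in the auction table; a quick check:

```python
# $/MHz Pop values taken from the auction table above.
price = {
    "C-Band": 0.94, "CBRS": 0.22, "600 MHz": 0.88, "AWS-3 paired": 2.71,
    "700 B": 2.24, "700 A": 1.17, "700 C": 0.76,
}

def premium(a: str, b: str) -> int:
    """Price of auction a relative to auction b, as a percentage."""
    return round(100 * price[a] / price[b])

print(premium("C-Band", "CBRS"))           # 427 -> exclusivity premium
print(premium("700 B", "700 A"))           # 191 -> clean-spectrum premium
print(premium("700 B", "700 C"))           # 295 -> business-model premium (the list shows 294)
print(premium("AWS-3 paired", "600 MHz"))  # 308 -> bidder-restriction premium (the list shows 307)
```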
As China has become a major global economy and grown more assertive on the global stage, the country has discovered the power of anti-trust legislation. While built on the three common pillars of fighting anti-competitive agreements between companies, abuse of a dominant position, and mergers that may eliminate or restrict competition, the implementation is increasingly different. There have been hundreds of cases where Chinese authorities have reviewed mergers between Chinese companies, and not one has been objectionable to the authorities. But if the state ultimately controls both companies anyway, whether they are separate or merged, how could a merger between state-owned businesses ever be found anti-competitive?
In a state capitalist system such as China’s, Communist Party groups are part of every company with three or more Party members, including private domestic companies and international joint ventures, and all foreign investment takes the shape of a joint venture with a Chinese partner. While the Party groups have long-established formal power in state-owned enterprises (SOE), in joint ventures there is increasing pressure to let Party groups approve all critical matters before they are presented to the board, based on the 2017 Communist Party directive entitled “Notice about firmly promoting writing SOE party building work into company articles of association.” Following this logic, reviews of intra-Chinese mergers have always ended in approval.
Mergers in the last decade
As you can see from the chart above, there are no outright merger rejections and only a small number of approvals with conditions. Interestingly, the only mergers that have come under scrutiny are mergers without Chinese involvement. Due to the extraterritorial nature of Chinese anti-trust law, even mergers of companies outside China fall under its purview when they involve companies with a substantial amount of business in China. For example, in 2019, the five cases that were approved with conditions were KLA-Tencor (US)/Orbotech (Israel), Cargotec (Finland)/TTS (Norway), II-VI Incorporated (US)/Finisar (US), Zhejiang Huayuan Biotechnology (PR China)/Royal DSM (Netherlands), and Novelis (US)/Aleris (US). In addition, there are cases like Qualcomm (US)/NXP (Netherlands), where instead of denying the application, the Chinese anti-trust authorities simply ran out the clock. After two years of waiting for the acquisition of NXP by Qualcomm to be permitted, the companies reached the end of the contractual merger period and were forced to give up. This de facto denial was never recorded as a denial, as the Chinese anti-trust authorities simply did not rule. Due to the small size of China’s anti-trust authority, the country has plausible deniability when it delays ruling on a merger. At face value, China’s perfect record of only approving mergers remains intact, when in reality the merger was forcibly abandoned.
What’s really at stake
Cases such as those mentioned above create the appearance that Chinese anti-trust enforcement is directed not at protecting Chinese consumers but at protecting Chinese industrial policy. The conditional approvals of Marubeni’s (Japan) acquisition of Gavilon (US) and Glencore’s (Swiss) acquisition of Xstrata (Swiss/UK) demonstrate that China’s industrial policy leads anti-trust merger enforcement. In both cases, China was concerned about the supply of vital commodities, grain and copper respectively, and the mergers were approved only after significant divestitures that alleviated these concerns.
With this in mind, the acquisition of ARM (UK) from Softbank (Japan) by Nvidia (US) will be another interesting case. Most casual observers would conclude that Chinese anti-trust authorities would not be involved. Au contraire, mon ami! Almost all smartphone central processors use ARM instruction sets, and Chinese companies have built their AI and neural processing technology on them. Huawei went a step further and built its Ascend AI and Kunpeng general purpose processor programs entirely on ARM. The increasing reliance is due to both technical and political reasons.
President Trump’s moves to use American intellectual property in trade battles with China, as well as to restrict its use in military and dual-use applications, have complicated the lives of Chinese high-tech companies, and this is likely to continue during President Biden’s administration. As a reaction, China has accelerated its Made in China 2025 project, which is focused on reducing its dependency on foreign technology and products and shifting to non-American suppliers. If the Nvidia acquisition of ARM goes through, another key technology will come more closely under the control of US authorities, giving them another potential tool to assert pressure on China. It would also give Nvidia a significant boost in the competitive AI race that China considers one of its highest priorities. Nvidia is a leader in network-based AI and ARM a leader in device-based AI, also known as edge AI. Combining the two companies makes them a much more formidable competitor, allowing them to cross-pollinate network AI with edge AI technology and vice versa. Both companies have substantial business in China and hence fall under Chinese anti-trust law and are subject to review.
Considering China’s track record, it is almost inevitably going to either block or just refuse to approve the Nvidia/ARM transaction to protect its domestic industry from further US sanctions and restrictions and to prevent a stronger competitor in the AI marketplace. It is more likely that China will simply run out the clock on the merger, while a more aggressive and higher profile move would be an outright denial of the merger. This would send a much stronger signal to the United States than passive aggressive non-approval and would be a harbinger of a more adversarial phase in the relationship between the two countries.
When Nvidia announced that it was in the process of buying ARM from Softbank, many analysts and industry observers were exuberant about how it would transform the semiconductor industry by combining the leading data center Artificial Intelligence (AI) processor company with the leading device AI processor architecture company. While some see the potential advantages that Nvidia would gain by owning ARM, it is also important to look at the risks that the merger poses for the ecosphere at large and for the course of innovation.
An understanding of the two business models and their interplay highlights the importance of the proposed merger. Nvidia became the industry leader in data center AI almost by accident. It became the largest graphics provider by combining strong hardware with frequently updated software drivers. Unlike its competitors, Nvidia constantly improved not only its newest graphics cards but also past-generation cards with new drivers that made them faster. This extended the useful life of the graphics cards but, more importantly, it also created a superior value proposition and, therefore, customer loyalty. The software also added flexibility, as Nvidia realized that the same capability that makes graphics processing on PCs efficient and powerful – parallel processing – is also suitable for other heavy computing workloads like bitcoin mining and AI tasks. This opened up a large new market that its competitors could not follow it into due to their lack of suitable software capabilities. It made Nvidia the market leader in both PC graphics cards and data center AI computation with the same underlying hardware and software. Nvidia further expanded its lead by adding a parallel computing platform and application programming interface (API) to its graphics cards, which laid the foundation for Nvidia’s strong performance and leading market share in AI.
ARM, on the other hand, does not sell hardware or software. Rather, it licenses its intellectual property to chip manufacturers, who then build processors based on the designs. ARM is so successful that virtually all mobile devices use ARM-based CPUs. Apple, which has used ARM-based processors in the iPhone since its inception, is now also switching its computer processors from Intel to internally built ARM-based CPUs. The ARM processor designs are now so capable, and so focused on low power usage, that they have become a credible threat to Intel’s, AMD’s, and VIA Technologies’ x86-based CPUs. Apple’s move to eliminate the x86 architecture from its lineup is a watershed moment in that it solves a platform development issue by allowing developers to natively design data center apps on their Macs. Consequently, it is only a matter of time before ARM processor designs show up in data centers.
This inevitability highlights one of the major differences between ARM and Nvidia’s business model. ARM makes money by creating processor designs and selling them to as many companies that want to build processors as possible. Nvidia’s business model, on the other hand, is to create its own processor designs, turn them into hardware, and then sell an integrated solution to its customers. It is hard to overstate how diametrically different the business models are and hard to imagine how one could reconcile these two business models in the same company.
Currently, device AI and data center AI are innovating and competing around what kind of tasks are computed and whether the work is done on the device or at the data center or both. This type of innovative competition is the prerequisite for positive long-term outcomes as the marketplace decides what is the best distribution of effort and which technology should win out. With this competition in full swing, it is hard to see how a company CEO can reconcile this battle of the business models within a company. Even more so, the idea that one division of the New Nvidia, ARM, could sell to Nvidia’s competitors, for example, in the datacenter or automotive industry and make them more competitive is just not credible, especially for such a vigorous competitor as Nvidia. It would also not be palatable to shareholders for long. The concept of neutrality that is core to ARM’s business would go straight out of the window. Nvidia wouldn’t even have to be overt about it. The company could tip the scales of innovation towards the core data center AI business by simply underinvesting in the ARM business, or in industries it chooses to deprioritize in favor of the datacenter. It would also be extremely difficult to prove what would be underinvesting when Nvidia simply maintained current R&D spend rather than increasing it, as another owner might do as they see the AI business as a significant growth opportunity rather than a threat as Nvidia might see it.
It is hard to overestimate the importance of ARM to mobile devices and increasingly to general purpose computing – with more than 130 billion processors made as of the end of 2019. If ARM is somehow impeded from freely innovating as it has, the pace of global innovation could very well slow down. The insidious thing about such an innovative slow down would be that it would be hard to quantify and impossible to rectify.
The proposed acquisition of ARM by Nvidia also comes at a time of heightened anti-trust activity. Attorney Generals of several states have accused Facebook of predatory conduct. New York Attorney General Letitia James said that Facebook used its market position “to crush smaller rivals and snuff out competition, all at the expense of everyday users.” The type of anti-competitive conduct that was cited as basis for the anti-trust lawsuit against Facebook was also that of predatory acquisitions to lessen the threat of competitive pressure by innovative companies that might become a threat to the core business of Facebook.
The parallels are eerie and plain to see. The acquisition of ARM by Nvidia is all too similar to Facebook’s acquisitions of Instagram and WhatsApp in that both allow the purchasing entity to hedge their growth strategy regardless of customer preferences while potentially stifling innovation. And while Facebook was in the driver’s seat, it could take advantage of customer preferences. Whereas in some countries and customer segments the core Facebook brand is seen as uncool and old, Instagram is seen as novel and different than Facebook. From Facebook’s perspective, the strategy keeps the customer in-house.
The new focus by both States and the federal government, Republicans and Democrats alike, on potentially innovation-inhibiting acquisitions, highlighted by their lawsuits looking at past acquisitions as in Facebook’s and Google’s case, make it inevitable that new mergers will receive the same scrutiny. It is likely that regulators will come to the conclusion that the proposed acquisition of ARM by Nvidia looks and feels like an act that is meant to take control of the engine that fuels the most credible competitors to Nvidia’s core business just as it and its customers expands into the AI segment and are becoming likely threats to Nvidia. In a different time, regardless of administration, this merger would have been waved through, but it would be surprising if that would be the case in 2021 or 2022.
When Nvidia announced that it was in the process of buying ARM from SoftBank, many analysts and industry observers were exuberant about how the deal would transform the semiconductor industry by combining the leading data center Artificial Intelligence (AI) processor company with the leading device AI processor architecture company. While some see the potential advantages that Nvidia would gain by owning ARM, it is also important to look at the risks that the merger poses for the ecosystem at large and the course of innovation.
An understanding of the two companies' business models and their interplay highlights the importance of the proposed merger. Nvidia became the industry leader in data center AI almost by accident. It became the largest graphics provider by combining strong hardware with frequently updated software drivers. Unlike its competitors, Nvidia constantly improved not only its newest graphics cards but also past-generation cards with new drivers that made them faster. This extended the useful life of its graphics cards but, more importantly, it also created a superior value proposition and, therefore, customer loyalty. The software also added flexibility, as Nvidia realized that the same capability that makes graphics processing on PCs efficient and powerful – parallel processing – is also suitable for other heavy computing workloads like bitcoin mining and AI tasks. This opened up a large new market that its competitors could not follow it into, due to their lack of suitable software capabilities, and made Nvidia the market leader in both PC graphics cards and data center AI computation with the same underlying hardware and software. Nvidia further expanded its lead by adding a parallel computing platform and application programming interface (API) to its graphics cards, laying the foundation for its strong performance and leading market share in AI.
ARM, on the other hand, does not sell hardware or software. Rather, it licenses its intellectual property to chip manufacturers, who then build processors based on the designs. ARM is so successful that virtually all mobile devices use ARM-based CPUs. Apple, which has used ARM-based processors in the iPhone since its inception, is now also switching its computer processors from Intel to internally designed ARM-based CPUs. ARM processor designs are now so capable, and so focused on low power usage, that they have become a credible threat to Intel, AMD, and VIA Technologies' x86-based CPUs. Apple's move to eliminate the x86 architecture from its SKUs is a watershed moment in that it solves a platform development issue by allowing developers to natively design data center apps on their Macs. Consequently, it is only a matter of time before ARM processor designs show up in data centers.
This inevitability highlights one of the major differences between ARM's and Nvidia's business models. ARM makes money by creating processor designs and licensing them to as many companies as want to build processors. Nvidia's business model, on the other hand, is to create its own processor designs, turn them into hardware, and then sell an integrated solution to its customers. It is hard to overstate how diametrically different the two business models are, and hard to imagine how one could reconcile them in the same company.
Currently, device AI and data center AI are innovating and competing around what kinds of tasks are computed and whether the work is done on the device, in the data center, or both. This type of innovative competition is the prerequisite for positive long-term outcomes, as the marketplace decides the best distribution of effort and which technology should win out. With this competition in full swing, it is hard to see how a CEO could reconcile this battle of business models within one company. Even less credible is the idea that one division of the new Nvidia – ARM – would sell to Nvidia's competitors in, for example, the data center or automotive industries and make them more competitive, especially given how vigorous a competitor Nvidia is. It would also not be palatable to shareholders for long. The concept of neutrality that is core to ARM's business would go straight out of the window. Nvidia would not even have to be overt about it. The company could tip the scales of innovation toward its core data center AI business simply by underinvesting in the ARM business, or in industries it chooses to deprioritize in favor of the data center. Such underinvestment would also be extremely difficult to prove: Nvidia could simply maintain current R&D spend rather than increase it, as another owner might, since a different owner would see the AI business as a significant growth opportunity rather than the threat Nvidia might see it as.
It is hard to overestimate the importance of ARM to mobile devices and, increasingly, to general purpose computing – more than 130 billion ARM-based processors had been made as of the end of 2019. If ARM is somehow impeded from innovating as freely as it has, the pace of global innovation could very well slow down. The insidious thing about such an innovation slowdown is that it would be hard to quantify and impossible to rectify.
The proposed acquisition of ARM by Nvidia also comes at a time of heightened antitrust activity. The attorneys general of several states have accused Facebook of predatory conduct. New York Attorney General Letitia James said that Facebook used its market position “to crush smaller rivals and snuff out competition, all at the expense of everyday users.” Among the anti-competitive conduct cited as the basis for the antitrust lawsuit against Facebook were predatory acquisitions meant to lessen the competitive pressure from innovative companies that might become a threat to Facebook's core business.
The parallels are eerie and plain to see. The acquisition of ARM by Nvidia is all too similar to Facebook's acquisitions of Instagram and WhatsApp in that both allow the purchasing entity to hedge its growth strategy regardless of customer preferences while potentially stifling innovation. And with Facebook in the driver's seat, it could take advantage of shifting customer preferences: in some countries and customer segments the core Facebook brand is seen as uncool and old, while Instagram is seen as novel and different from Facebook. From Facebook's perspective, the strategy keeps the customer in-house.
The new focus by both the states and the federal government, Republicans and Democrats alike, on potentially innovation-inhibiting acquisitions – highlighted by lawsuits looking at past acquisitions in Facebook's and Google's cases – makes it inevitable that new mergers will receive the same scrutiny. Regulators are likely to conclude that the proposed acquisition of ARM by Nvidia looks and feels like an act meant to take control of the engine that fuels Nvidia's most credible competitors, just as ARM and its customers expand into the AI segment and become likely threats to Nvidia. In a different time, regardless of administration, this merger would have been waved through, but it would be surprising if that were the case in 2021 or 2022.
A week into the C-Band auction, all signs point to an intense struggle for spectrum among the auction participants. With 5G deployments in full swing, the 280 MHz of mid-band licenses on offer sit directly in the 'Goldilocks zone': attractive propagation characteristics (good in-building penetration and range) combined with 20 MHz blocks large enough to deploy significant capacity and speed for 5G.
For bidders looking to use spectrum quickly, the 100 MHz that makes up the A block of the auction is scheduled to be cleared as early as 2021, while B and C block licenses may not be available until 2023. As such, A block licenses in 46 of the top 50 markets can be bid on separately and will likely come at a premium compared to B and C block licenses in the same market.
While we won’t know who bid on which markets until the auction is over, each round the FCC reports the demand for licenses versus those available via its public reporting system. In every market where demand exceeds supply, the price increases by 10% for the next round. Ten percent may not sound like much initially, but exponential growth soon catches up, and markets with intense bidding quickly get expensive. For example, after ten rounds where demand exceeds supply the price more than doubles, after twenty rounds it has increased more than six-fold, and after fifty rounds it has risen to over 100x the original cost.
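The compounding arithmetic behind those figures is simple to check. A minimal sketch (illustrative only, not FCC data; the starting price is normalized to 1):

```python
# Each round where demand exceeds supply raises a license's clock price
# by 10%, so prices compound geometrically across rounds.
def price_after(rounds_with_excess_demand, start_price=1.0, increment=0.10):
    """Clock price after N consecutive rounds of 10% increases."""
    return start_price * (1 + increment) ** rounds_with_excess_demand

for n in (10, 20, 50):
    print(f"after {n} rounds: {price_after(n):.1f}x the starting price")
# after 10 rounds: 2.6x, after 20 rounds: 6.7x, after 50 rounds: 117.4x
```

The 10/20/50-round multipliers of roughly 2.6x, 6.7x, and 117x match the "more than doubles", "six-fold", and "100x" figures above.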
Price escalation invariably forces decisions on even the most well-heeled bidders, but we have not yet reached that point in the C-Band auction. So far the number of markets with bids over supply has INCREASED, not decreased, resulting in over $10.5B in gross proceeds across the 411 markets offered through round 20.
What is interesting about the C-Band auction so far is that the bidding volume for smaller markets, which typically builds later in an auction, has heated up early. During the first ten rounds, demand in roughly two-thirds of markets exceeded supply, and those markets therefore increased in price by 10% every round. Starting in round 12, bidding volume increased further, catapulting the share of markets with price increases from 68% of all markets to 81%, and it continued to build to over 90% of all markets by round 17.
Digging a bit deeper, the jump in markets with more demand than supply was driven by increased interest in smaller-population markets while prices for larger markets were still not settled. Through round 10, only about half of the lower-population-tier markets (ranked 100-400 by population) had more bids than supply, but by round 15 over 86% of them did.
But what about those A sub block licenses that could help the winners race to deploy mid-band 5G? There are only 5 sub blocks available per market, yet through 20 rounds bidding in many of the top markets still shows demand in the double digits in excess of supply. In fact, demand for the A block is still so strong that in the first 20 rounds every A sub block in the 46 markets increased in price every round. That adds up to a 612% increase over 20 rounds and over $3.5B of the total gross proceeds of $10.5B across the entire auction.
Bidders seem to be slowly adjusting to the expectation that they may not win the entire A block. In round 18, demand in the top 39 markets dropped by 2 to 3 units across all markets, likely indicating a coordinated pull-back by one bidder. While demand has fallen across the top markets, none of the A block markets have reached bidding equilibrium, where supply equals demand and price increases cease. At the current pace, the licenses in the A sub block markets would collectively be worth over $5B in just 5 more rounds and over $10B by round 32.
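Those projections follow directly from compounding the roughly $3.5B of A-block proceeds through round 20 at 10% per round. A rough sketch (it assumes every A-block market keeps clearing its 10% increment each round, which bidders can stop at any time):

```python
# Project A-block gross proceeds forward from round 20, assuming all
# A-block markets continue to see excess demand (a simplification).
value_round_20 = 3.5  # $B, A-block proceeds through round 20 per the text

def projected_value(target_round):
    """Projected A-block value ($B) if 10% increments continue every round."""
    return value_round_20 * 1.10 ** (target_round - 20)

print(f"round 25: ${projected_value(25):.1f}B")  # $5.6B
print(f"round 32: ${projected_value(32):.1f}B")  # $11.0B
```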
Within the A sub block, which markets are most popular among bidders? As always, the top 10 markets have among the highest demand, but the market with the most demand in excess of supply so far is Salt Lake City, the 27th largest market in the auction, with 4 bidders for every available block (5 blocks available, 15 in excess of supply). A cluster of markets follows Salt Lake City for the second-most activity: Chicago, Dallas, Miami, Houston, Orlando, Las Vegas, Kansas City, Austin, and Milwaukee all have 12 bids over supply (total bids of 17 each).
So what can we learn from the first week of the C-Band auction? At $10.5B through round 20, the level of interest in mid-band 5G spectrum is sky high. Verizon and AT&T’s well-understood need for mid-band 5G spectrum is surely playing a role in the bidding intensity, but bidding volumes suggest there are also other players willing to throw their hats in the ring, particularly for the A block, which will be available soon. Regardless of who wins, it’s likely we’ll see consumers enjoying the benefits of C-Band 5G sooner rather than later.
Over the past 15 years, there have been several government initiatives to expand the adoption of broadband in the United States. At the same time, industry has been busily focused on extending the reach and capacity of both fixed and mobile broadband networks. Yet, a digital divide still exists. Why? Let’s review the history here.
Since xxx, the cable and telecom industries have successfully provided broadband connectivity to more than 110.8 million households, adding about 2.4 million households per year. Gigabit speeds are now available to 85% of households. The broadband companies expand their footprints in an economically responsible way, as they are accountable to their shareholders. That still leaves 17.7 million households to cover. With the number of households increasing by roughly one million per year, closing the gap at the current pace would take around 13 years. The current pandemic, with its work-from-home and study-from-home demands, shows us that we do not have 13 years to close this digital divide. In order to make the best possible decision on how to solve the problem, we should look at what has and has not worked in the past.
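The 13-year estimate is back-of-the-envelope arithmetic from the figures above: each year about 2.4 million households get connected while roughly one million new households are formed, so net progress is about 1.4 million per year against a 17.7-million-household gap. A quick check:

```python
# Pace check using the figures stated in the text (all in millions).
unconnected = 17.7           # households still without broadband
connected_per_year = 2.4     # households connected annually
new_households_per_year = 1.0  # household formation offsets progress

net_progress = connected_per_year - new_households_per_year
years = unconnected / net_progress
print(f"~{years:.0f} years at the current pace")  # ~13 years
```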
One of the most hotly debated solutions being proposed to close the digital divide is to have the government support municipal broadband, a catch-all term for providers of broadband that includes telephone and electric cooperatives. The general caveat against government entering what is a private business market is what economists call crowding out. A for-profit company typically has no chance of competing against a government entity: the latter does not have a profit goal and can provide service at a loss indefinitely, as it has access to government revenue in the form of taxes or bonds to cover the losses. At the same time, the government has a poor record of innovating and adjusting to a rapidly changing technological environment. The pro-municipal broadband argument holds that if for-profit companies are not offering services in a particular geographic location, they cannot be crowded out.
Electric cooperatives were founded in the 1930s to solve the 20th century equivalent of the broadband problem, and the solution is instructive for our current situation. The Institute for Local Self-Reliance, an organization in favor of dispersing economic power and ownership, identified eight municipal networks that failed in the United States. The common thread of failure was inexperience in running customer-facing organizations, as neophytes struggled to learn a new skill set. This highlights the gap between running a relatively small number of government services and running a much larger and more technically complicated broadband network, and the difficulty of recruiting people with the right existing skill sets.
The most likely scenario for success is the addition of broadband service to an existing electric or telephone cooperative’s portfolio. In this case, an entity with decades of experience in running a customer-facing operation and a network simply expands its service. The cooperatives already serve mostly rural customers and do not crowd out for-profit cable and telecom providers. The FCC has recognized this and has explicitly included electric cooperatives in the Connect America Fund II initiative (which we will discuss later).
As we can see from the map above, the opportunity for rural broadband coverage from cooperatives is significant, as rural areas, often in the South and the Great Plains, have low population density. Engaging both electric and telephone cooperatives in rural areas may be an effective way to close the digital divide in some areas. These efforts could take the form of public-private partnerships and potentially avoid the pitfalls of muni-broadband.
Muni-broadband has failed for different reasons. Research shows that most of the failed entities were urban, often engaging in direct competition with incumbent providers; examples such as Monticello, MN, Salisbury, NC, and Tacoma, WA come to mind. In other cases, municipal broadband networks such as those in Muscatine, IA and Utopia, UT had to be bailed out by taxpayers or the electric cooperative because they could not stay afloat. We also have Provo, UT and Groton, CT, which ended up selling to private companies at a great loss to taxpayers; Burlington, VT, where a lack of oversight and the cover-up of incompetence led to failure; and Bristol, VA, where corruption meant the end of the network.
In 2010, Google announced that it would start providing broadband fiber connectivity in a number of cities to between 50,000 and 500,000 households. Cleverly, Google put out a request for information asking municipalities to apply to have Google offer fiber in their city or town. This reversed the traditional relationship between provider and municipality, in which the provider asks the municipality if it can provide service in the area and the municipality responds with its demands in terms of fees and extra services. Ever wondered why so many pools, parks, and sports arenas are sponsored by telecom and cable companies? Sponsorship was one of the demands cities made in exchange for allowing the service provider to offer service in town. By inverting the relationship and asking towns to apply for consideration, Google shifted the power dynamic and was able to receive terms so favorable that telecom and cable providers went to cities and demanded the same terms and conditions Google got but they had never been able to get on their own. Under equal treatment rules, these cities had to extend the favorable Google terms and conditions to every other provider. Kansas City was the first city where Google Fiber launched, followed by Austin, Provo, and fifteen more cities. The Provo network was a defunct municipal network that was built for $39 million and then sold to Google for one dollar. After realizing the high cost of building a fiber network and the long delay before payback, Google first halted further network expansions after it had deployed in five cities, and then switched to a public-private partnership (PPP) model in which the municipality builds the network and incurs the cost while Google sells the service. In addition, Google made an acquisition in the fixed wireless broadband space to also provide broadband wirelessly.
This has slowed the expansion significantly, but the scope has grown beyond what can be called a trial – as Google likes to call every endeavor it gets into – as Google Fiber now covers 18 cities.
The 19th market for Google Fiber will be West Des Moines, Iowa. Similar to Huntsville, Alabama, the city will build a fiber network for $39 million; in exchange, Google will pay the city $2.25 for each household that connects to the network. Over the 20-year agreement, Google will pay the city at least $4.5 million. The project is expected to be completed by the end of 2023. By entering PPPs, Google gets the various cities to pay for the expensive buildout and makes money by providing the service. Google’s experience highlights that even one of the largest companies in the world does not have the focus, wherewithal, and patience to actually build out a nationwide system, but relies on the government to pay for the physical buildout.
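As a rough illustration of what these terms imply: if the $2.25 per connected household is a monthly payment (an assumption on our part; the text does not state the payment cadence), the $4.5 million 20-year minimum corresponds to a modest average subscriber base:

```python
# Hypothetical reading of the West Des Moines terms. The monthly cadence
# of the $2.25 payment is an assumption, not stated in the agreement text.
per_household = 2.25        # $ paid per connected household per month (assumed)
minimum_total = 4_500_000   # $ minimum over the agreement
years = 20

households = minimum_total / (per_household * 12 * years)
print(f"~{households:,.0f} households on average")  # ~8,333 households
```

Under that reading, the guaranteed floor covers only about 8,300 average subscribers, which shows how little downside risk Google carries relative to the city's $39 million buildout.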
When the government helps in areas with adverse circumstances, either through low population density or low income, a business case can be made that allows the deployment of broadband services. The societal good that comes from broadband in the form of access to online learning for students, job resources for adults and an overall increase in computer skills will create greater long-term benefits than long-term costs.
On the government side of the equation, the FCC has been very focused on allocating monies (and spectrum) for broadband. The FCC’s Connect America Fund (CAF) was born out of the National Broadband Plan from 2010 aiming to broaden the availability of broadband. Now in its second iteration, CAF II, the fund is a reverse auction subsidy for broadband providers, satellite companies and electric cooperatives to provide coverage in underserved areas.
At the end of the CAF II auction, $1.49 billion of subsidies over ten years were awarded to provide broadband and voice services to 700,000 locations in 45 states, highlighted in the map above. Prospective providers successively bid down the subsidy for which they would cover an underserved market. This ensures that the area is covered at the least cost to taxpayers.
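The core idea of the reverse-auction mechanic can be sketched as follows (a deliberately minimal illustration with hypothetical provider names and figures; the real auction runs in descending-clock rounds with performance-tier weights, not a single sealed round):

```python
# Minimal reverse-auction sketch: the provider willing to serve an area
# for the smallest subsidy wins, minimizing the cost to taxpayers.
def award_subsidy(bids):
    """bids: {provider: requested_subsidy_in_$M}; returns (winner, subsidy)."""
    winner = min(bids, key=bids.get)
    return winner, bids[winner]

winner, subsidy = award_subsidy(
    {"Provider A": 12.0, "Provider B": 9.5, "Provider C": 11.0}
)
print(winner, subsidy)  # Provider B 9.5
```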
CAF II and other government programs are increasingly closing the gap, with more than $20.4 billion committed over the next 10 years. The US Department of Agriculture has been one of the longest-standing sources of support for bringing broadband to rural America, with $600 million per year from its ReConnect program. In October 2020, the FCC will launch the auction for the Rural Digital Opportunity Fund (RDOF), a 10-year, $20.4 billion program to bring broadband to areas that lack broadband as defined by 25 Mbit/s download and 3 Mbit/s upload speeds.
The biggest controversy around CAF II is its mapping issues. In a nutshell, if only one location in a census tract has access to broadband, it is assumed that all locations do. In a significant number of cases this is not true: some locations have access while others do not. This is especially true in urban areas, where some high-population pockets still lack access to broadband. Part of the FCC Commission wanted to delay additional projects until the mapping problem was solved, but the majority voted to release the funds and work on the problem concurrently, as the underserved markets remain underserved even under a tighter requirement. While criticized for its complexity and for a lack of clarity on how overachieving the target goals is recognized and affects winning the subsidy, the program has overall been lauded as a success.
When we look at what has and has not worked, it becomes apparent that the for-profit system has given 90% of Americans access to at least one broadband provider. The problem is the hard-to-reach, in both urban and rural environments. No matter how we look at the issue, it becomes clear that government and cooperatives have a role to play in alleviating what is a societal problem.
Since Silicon Valley giants like Google, with almost infinite resources, have balked at building out fiber in many urban areas and are relying on cooperatives or municipalities to foot the bill, the economics of building out the hard-to-reach parts of the United States are clearly even more difficult.
The broadband industry is investing between $70 billion and $80 billion per year to connect Americans, and the wireless industry is investing another $25 billion to $30 billion; that this is still not enough shows the industry cannot shoulder the burden alone.
Electric cooperatives, as non-profits, have a longer time horizon, which makes their investment in underserved rural areas easier, as they already have established customer relationships with prospective subscribers and established connections to the locations.
The CAF and other funds have worked by providing the minimal subsidy needed to cover underserved markets; we just need more of it. Some have complained that the approach provides for only one choice of provider, ignoring that 85% of households already have a choice of two wireline providers and 99% of Americans can choose between at least three mobile service providers. The counter-argument for very rural parts of the United States is that one choice in an economically unprofitable market is better than no choice. One also has to consider that requiring every location to have two choices roughly doubles the cost of deployment and leaves half of the infrastructure idle.
The program will work even better with more accurate mapping of underserved areas, which would broaden its scope from mainly rural areas to urban ones as well and make it location-agnostic. If a follow-up program wants to bring not only access but also competition to an underserved area, the government would have to quadruple if not quintuple the subsidy, since deployment costs double while the expected revenue per provider halves.
The consequence of not building out areas that lack broadband access today – whether urban or rural – is to perpetuate the current trends, in which parts of society cannot participate in the economic and social life of our country. As 2020 has shown us, broadband internet has become the lifeline of businesses, and video conferencing has become a necessity for employees working remotely. This means that many better-paid jobs are closed to people based on where they live, whether the area without broadband is urban or rural. Left unsolved, this will force a further depopulation of rural America and a flight from unserved urban areas, as critical employees and business owners are effectively prevented from earning a living there. At least as important is equal access to education. Student homework and tests cannot count toward grades unless every student in the class is able to participate. Without broadband access, not only the children who live in unserved areas are affected but also their classmates who have access.
For a country known for being as efficient, organized, and technologically advanced as Germany, the state of its mobile networks constitutes a rare black mark. Germany is the fourth-largest economy in the world, with 82 million inhabitants (double that of California in half the area) and a highly efficient, advanced high-tech manufacturing industry. Where it struggles is with the digitalization of the economy and with both fixed and wireless networks. Germany’s wireless networks are ranked 32nd out of 34 countries, ahead only of Ireland and Belarus. No other European country has a larger share of 3G users than Germany, and it is not uncommon to fall back to EDGE networks in both urban and rural areas. The reasons for this atypical performance lie with the actions of regulators and companies alike.
In 2010, Germany auctioned 4G licenses with the requirement that within 5 years 97% of the population would be covered by 4G. Yet even by 2020, every operator had failed to meet the 2015 buildout requirement. How could this happen in a country that prides itself on following the rules?
With every new generation, German mobile operators suffer from low technology adoption because they use the same playbook over and over again (3G, 4G, and now 5G), resulting in the same poor outcome. Wireless licenses in Germany and most of Europe are tied to a specific technology, whereas US licenses can be used with any technology, which allows a more efficient transition from one generation to the next. Regardless, German operators rightfully recognize the high value of new spectrum for next generation technology and bid more money per capita for next generation licenses than operators anywhere else in Europe. As a result of the significant investment in licenses, German operators position the next generation product as a premium product with a significant price premium. For this reason, consumers and businesses are reluctant to adopt next generation service plans and devices, leading to suppressed next generation revenues and profits. These low profits are then used as a justification to limit capital investment in next generation technologies. Consequently, German wireless networks cover less area than they can and should. This self-fulfilling prophecy is now in its third iteration: we have seen it in 3G, in 4G, and now in 5G in Germany.
US carriers start from the same point of recognizing the value of next generation technology and spectrum, and US spectrum auctions have yielded the highest values globally. Unlike in Germany, US mobile operators make the new technology available at the same price point as the last generation technology, creating greater profitability through a significantly lower cost structure, given that next generation technology typically lowers the cost per gigabyte by 90% over the previous generation. As a result, US mobile operators see a rapid shift from the old generation to next generation network usage as customers upgrade their devices to take advantage of the new networks. By holding price points steady for next generation networks with their faster speeds, US operators are under less price pressure than European operators, allowing them to invest heavily in their networks and differentiate on coverage. The US therefore ranks fifth in the world for 4G availability, behind South Korea, Japan, Norway, and Hong Kong, which together make up a combined 9.3% of the area of the United States. Everyone wins in the US approach: customers get faster access to next generation technology, and operators make a higher profit.
Germany’s cost problem is compounded by a legal and regulatory regime that lacks provisions favoring the building of cell sites, such as Section 332 of the US Telecom Act. German building permits are notoriously lengthy endeavors, and frequent lawsuits against cell sites lead to drawn-out legal reviews that slow down network buildout. None of these policies are friendly to capital investment in wireless networks.
The problem of how to cover thinly populated rural areas in Germany persists. Mobile operators complain that it is unprofitable to cover many rural areas. During the 2018 Mobilfunkgipfel (Mobile Summit) between the German government and mobile operators, the government committed to share part of the cost of covering rural parts of Germany.
Coverage issues in rural parts of a country are not unique to Germany. Its neighbor France, roughly the size of Texas, has tackled the issue in three different ways. For the 2G rollout, mobile operators, the central government, and the departments (provinces) with rural coverage gaps split the cost three ways. In 2015, the French government set aside $1 billion to close the 3G coverage gaps. In 2018, the French government came to an agreement with the four incumbent operators to extend their license terms in exchange for closing coverage gaps and jointly installing more than 5,000 masts and antennas.
There are four key lessons that we can take away from the German and French examples:
The business model matters. American operators provide world-class service, especially considering the size of the country. The US operator model of capturing profit through cost reduction rather than price increases is the superior model: it results in faster and higher adoption of next-generation technology and greater capital investment. The one US carrier that tried to charge a premium for 5G, Verizon, has two European executives at the helm. Customer pressure quickly forced Verizon to abandon its European-style price premium and revert to the US model.
A mobile-friendly regulatory regime that enables the rapid building of new cell sites makes a positive difference. It is a no-brainer that when it is difficult for operators to build new sites, coverage suffers.
Even medium-size, economically prosperous countries like France and Germany face similar challenges in economically building out mobile networks. While it is more cost-effective to cover rural areas with wireless rather than fixed technology, the business case is far from a foregone conclusion.
The comparison between the US and more tightly regulated countries shows that incentives and support for wireless networks, without red tape and strings attached, create better results.