In case you are just tuning in, Verizon has been going through a rough time for about two years now. In fall 2021, it replaced Ronan Dunne as CEO of Verizon’s Consumer Group (VCG) as the unit struggled, eventually filling the position with Manon Brouillette. It would be difficult to say that things have improved.

We live in peculiar times. For a long time, financial analysts wanted to convince us that when a mobile network operator (MNO) is larger than its competitors, that size advantage gives it a substantial edge in the market. Now, other financial analysts want to convince us that Verizon, because it is the largest provider in the market, is destined to lose customers for the foreseeable future. I disagree with both positions and would instead point to having a good plan, the ability to rapidly adapt to new circumstances, and superior execution as the only sustainable competitive advantage in the market.

Verizon has traditionally differentiated itself as the premium provider in the market based on superior network performance. Taking network leadership to heart, Verizon charged ahead in 2G, 3G and 4G and created the fastest and largest network for at least the first three to four years of what is generally a seven-year technology era. The shock and awe of the early, rapid build created a nimbus of permanent network superiority even though, at least in urban markets, by year five, we had network parity. In contrast, in rural markets the network superiority persisted.

For the last decade, Verizon has internally fretted about what it should do if and when this trick no longer worked and its nimbus of network superiority diminished, or worse, large swaths of customers perceived network parity, or, worst of all, believed someone else had the better network.

Several poor decisions and outcomes around spectrum auctions weakened the strong network foundation. Verizon seems to have then tried to replace the internal differentiation of being the provider of the undisputedly best network with having the best streaming bundle and differentiating around that.

Replacing an internally generated differentiation with an externally acquired one, especially one so easily replicable, is a dangerous gamble. To make this decision even more puzzling, Verizon engaged in the content-differentiation strategy at the same time that AT&T was exiting content bundling with wireless.

AT&T and T-Mobile, having seen the 2G, 3G and 4G outcomes, decided they didn’t want to live through the same experience with 5G and put a lot more emphasis on network performance. While Verizon still leads the network performance categories in Recon Analytics’ weekly net promoter score data, the gap has undoubtedly diminished. Metered speed tests show Verizon behind, but how much does it matter? In our purchase decision factor ranking, speed is a solid second out of nine metrics.

T-Mobile especially, powered by Sprint’s spectrum and a greater network focus with various firsts, has given Verizon’s network team a run for its money. AT&T has been more judicious in spectrum expenditures and build-out pace, betting that speed test results alone don’t win customers and aligning its build-out more closely with customer needs, technical capabilities, and usage. The slower build-out has not hurt AT&T’s success in the marketplace because it was able to execute on other purchase factors that existing and prospective customers find important.

Verizon’s recent promotion

On May 22, 2022, Verizon launched an online promotion where single-line customers would get $15 off, two-line customers $12.50 per line off, and three-line customers $5 per line off. Since Verizon did not issue a press release around it, it went largely unreported.

We took it as an opportunity to test Verizon’s value proposition across all plans – 5G Start, 5G Play More, 5G Do More, and 5G Get More – against what the customers of other providers, ranging from T-Mobile and AT&T to Google Fi and Mint Mobile, were willing to pay for the different plans at different numbers of lines. One week later, this gave our clients a read on whether they should worry, to what degree, and about which part of Verizon’s promotion.

Below is how T-Mobile customers alone viewed Verizon’s single-line plans and at what price they would switch to each plan. While this is no sophisticated conjoint pricing analysis, it nevertheless gives some interesting insights. It also does not consider the larger long-term pricing strategies that a company like Verizon has to weigh when making pricing decisions. The yellow shading represents the take rate at a given price point, while the magenta line represents the revenue that would be realized.

With the promotion, Verizon charged $55 for a single line of 5G Start, $65 for 5G Play More or 5G Do More, and $75 for 5G Get More. As a reference point, Verizon just launched Welcome Unlimited at $65 for a single line, a skinnier offer than 5G Start.

T-Mobile customers’ revenue-maximizing price point was $40 with a 64% take rate for 5G Start, vis-à-vis Verizon’s $55 promotional price. 5G Play More and 5G Do More were valued at $50, with 55% and 64% take rates respectively, while Verizon was charging $65. Verizon’s 5G Get More plan, discounted to $75 during the promotion, was also valued at $50, with a 73% take rate.
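The relationship between the yellow take-rate shading and the magenta revenue line can be sketched in a few lines: expected revenue per surveyed customer is simply price times take rate, and the revenue-maximizing price is where that product peaks. In the sketch below, only the $40/64% point for 5G Start comes from the survey data above; the other take rates are hypothetical placeholders for illustration.

```python
# Expected revenue per respondent = price x take rate.
# Only the $40 / 64% point comes from the survey reported above;
# the other take rates are hypothetical, for illustration only.
price_take_rate = [
    (30, 0.80),  # hypothetical
    (40, 0.64),  # reported: 64% of T-Mobile customers would switch at $40
    (50, 0.45),  # hypothetical
    (55, 0.35),  # hypothetical (Verizon's promotional price)
    (65, 0.22),  # hypothetical
]

def revenue_curve(points):
    """Return (price, expected revenue per respondent) for each price point."""
    return [(price, round(price * take, 2)) for price, take in points]

def revenue_maximizing_price(points):
    """Price whose price-times-take-rate product is highest."""
    return max(points, key=lambda pt: pt[0] * pt[1])[0]

curve = revenue_curve(price_take_rate)
best = revenue_maximizing_price(price_take_rate)  # 40 with these inputs
```

With these illustrative inputs the curve peaks at $40 (40 × 0.64 = $25.60 expected revenue per respondent), which is why a higher sticker price does not automatically mean higher revenue.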

Analyzing the data as in the above example vis-à-vis T-Mobile, it became apparent that in the one-line segment, the Verizon promotion would not save Verizon’s quarter. The numbers for the other providers for single-line customers were roughly similar to those of T-Mobile customers.

Interestingly, despite being the least generous, the three-line offer was the most competitive for several of Verizon’s plans. This brings us back to Verizon’s new Welcome Unlimited plan. It looks like a significant uphill battle to convince single-line customers to spend $65 per month when, only two months ago, T-Mobile customers at least thought the richer 5G Start plan was worth only $40.

As I mentioned before, customers of different mobile service providers and for different line counts value Verizon’s plans differently, but in the one- and two-line part of the market a similar picture emerged.

Verizon isn’t suffering from a large size that dooms its progress; it suffers from a value positioning, value perception, and long-term pricing strategy issue.

Lessons from the Field


There is broad agreement in the United States that we need to increase broadband coverage and make broadband more affordable for low-income Americans. To arrive at that conclusion, it is unnecessary, and actually counterproductive, to use less-than-reliable data to make the point.

Thomas Philippon recently authored a paper entitled “How Expensive are U.S. Broadband and Wireless Services.” In it, he concludes that American consumers pay more for broadband and wireless services than consumers in other industrialized nations. Unfortunately, these conclusions appear to be based on data that are dated, incorrect, omitted or misinterpreted. The findings are actually a disservice to the goal of closing the digital divide that exists both in coverage and affordability.

Fixed broadband prices

To compare U.S. with European prices, the paper, rather than relying on unbiased sources, uses data collected by cable.co.uk, an advertising website in the UK that tries to convince UK customers to buy UK broadband services. It is not clear what expertise this company has in determining U.S. broadband prices (or what interest it has in showing them to be economical) and in conducting proper apples-to-apples international price comparisons. In particular, there are no data contained within cable.co.uk’s currently provided spreadsheet that allow a reviewer to ascertain that similar-quality plans were sampled in each country. Indeed, this seems in doubt: in its most recent study, cable.co.uk reports that the plans it sampled from the U.S. had an average price of $59.99, with a minimum price of $29.99 and a maximum of $299.95. The magnitude of this variation suggests that the sampled plans varied widely in quality (i.e., offered speeds), and it is especially curious that cable.co.uk’s computed average price should be exactly $59.99. For an average of 26 observations to land on this archetypical retail price number seems improbable.

Indeed, it is highly likely that the quality of broadband services that cable.co.uk compares are quite different. In its current report on prices, cable.co.uk suggests that readers should also examine cable.co.uk’s study of worldwide broadband speeds. It is more than a little revealing that in the provided spreadsheet, cable.co.uk finds U.S. average speeds to be nearly twice as fast as UK speeds (71.20 Mbps vs. 37.82 Mbps). Furthermore, the only listed European countries or dependencies that exceed the U.S. in speed are:

  • Liechtenstein (population 38,747)
  • Jersey (a British Crown dependency, population 107,800)
  • Andorra (population 77,142)
  • Gibraltar (a British Overseas Territory, population 33,701)
  • Luxembourg (population 590,667)
  • Iceland (population 356,991)
  • Switzerland (population 8.5 million)
  • Monaco (population 38,964)
  • Hungary (population 9.7 million)
  • Netherlands (population 17.2 million)
  • Malta (population 460,297)
  • Denmark (population 5.7 million)
  • Aland Islands (a Swedish-speaking semi-autonomous region of Finland, population 27,929)
  • Sweden (population 10 million)
  • Slovakia (population 5.4 million)

As you can easily see, someone tried really hard to inflate the count of geographies with faster speeds than the U.S. by including parts of countries, dependencies, and duchies in the mix. Nearly all of these are small countries or semi-autonomous regions with populations smaller than a typical U.S. city or state. It is notable that no European country with a population larger than that of the Netherlands makes the list of countries with faster average speeds than the U.S. Given that U.S. broadband speeds significantly exceed those in Europe, and that the U.S. generally has much lower population densities, higher wages, and thus significantly higher per-home network deployment costs, it is unremarkable that U.S. prices might exceed European prices, as it costs substantially more to deploy these networks.

The scatterplot of cable.co.uk’s collected prices appears to be consistent with 2017 prices reported by the OECD. But four-year-old prices don’t seem terribly apposite to debates about the current price and quality performance of broadband in the U.S. As demonstrated by USTelecom, broadband prices in the U.S. have dropped significantly over the past six years even as service quality has increased dramatically.

After noting that “some of the data measures presented above [in the paper about pricing] are a few years old,” the paper turns its attention to USTelecom’s recent report detailing that deployment of advanced networks is further along in the U.S., and that subscription to these high-speed U.S. networks exceeds European subscription to similar networks. The paper’s retort to these findings, which derive directly from data and official statistics collected for the U.S. by the FCC and for Europe by the European Commission (EC), is to reference a 247-slide presentation published on the internet claiming that in 2019, 87% of people in the U.S. used the internet, while in Western Europe and Northern Europe the figures were 92% and 95%, respectively.[1]

There are at least two problems with this response. First, even if these data were valid, they focus on geographic subsections.[2] Why not compare these most-developed areas of Europe against U.S. figures strictly for the Northeast or the Pacific Coast? But aside from needing to slice and dice European data to adduce a favorable comparison, the biggest problem is that even if Europeans are using the internet, EC DESI data on connectivity show that many are not using it via a fixed broadband connection. The EC finds that only 78% of European households subscribed to fixed broadband in 2019. So it is more than likely that the slide presentation the paper cites on usage includes people who use the internet via mobile wireless broadband connections, satellite connections and dial-up connections, in addition to fixed-line broadband connections.[3] In any event, one year earlier, in 2018, 84% of U.S. households had fixed broadband subscriptions – and the U.S. advantage over Europe widens when only 30+ Mbps and 100+ Mbps broadband speeds are considered.

Fixed broadband speeds

The next topic specifically addressed by the paper is fixed broadband connection speeds. For this, the paper refers to slide 52 of the 247-slide presentation. This slide, the paper says, “shows the US is close to the EU median, and slightly below France, in terms of speed.” The first statement appears to be false; the second is immaterial. Let’s unpack.

The European countries listed (in terms of speed) on the slide are: Romania, Switzerland, France, Sweden, Spain, Denmark, Netherlands, Portugal, Poland, Belgium, Germany, Ireland, U.K., Italy and Austria.[4] The U.S. slots between Sweden and Spain. Even assuming that the paper meant these comparisons to be against European countries rather than EU countries as it states, the median European country is Portugal – which lies four positions below the U.S. If only EU countries are considered, the median position drops another half slot, to between Portugal and Poland.[5] While the paper may consider the U.S.’ positioning in these lists to be “close” to the median, it could also have noted that the only major EU country ahead of the U.S. was France, with a minuscule (and likely statistically meaningless) speed advantage of 500 Kbps (131.3 Mbps for France versus 130.8 Mbps for the U.S.).

Thus, rather than showing the U.S. to be a laggard in fixed broadband speeds, the paper’s analysis appears to show it significantly in the lead.

Lightning round

ARPU: The paper then claims to look at broadband ARPU for Altice and Comcast in the U.S., and pronounces it significantly above that in France. The validity of the underlying data is highly questionable, though. For example, Philippon claims (without citation) that Altice’s ARPU is $90/month.[6] Reference to Altice’s SEC 10-K report (at p. 3) indicates that its residential broadband ARPU is $70.52, substantially less than Philippon’s unreferenced figure of $90. Further, Altice is a cable company with a very substantial FTTH footprint. It reports that the average speed purchased by its customers exceeds 300 Mbps – over twice the average speed experienced by French customers.

Prices of comparable contracts: A chart is displayed suggesting that prices for triple-play services in the U.S. significantly exceed those in European countries. This statistic is likely meaningless because it is well known that the cost of television services in U.S. triple-plays is vastly above similar charges in Europe. This is due to many factors, including:

  • U.S. bundles typically include many more channels, especially HD channels, than European bundles.
  • Fees paid by U.S. triple-play operators to acquire local broadcast channels, sports channels and other cable television networks greatly exceed those paid in Europe. Indeed, in many European countries, local broadcast channels are paid for via television license fees that customers pay separately and that are not included in their triple-play bills.
  • U.S. bundles also commonly allow the subscriber to watch several simultaneous programs on multiple television sets, in contrast to European bundles that may be restricted to a single TV set stream.

Labor cost adjustment: Here the paper argues that because “wages are about 20% higher in America than in the main EU countries” and “since compensation of employees accounts for half of the value added in private industries, one might expect [U.S.] price to be [only] 10% higher” than in Europe. This analysis is not compelling. Even if these national-level statistics are specifically applicable to the U.S. broadband industry, there is no reason wage differences must account for the entire amount of any putative elevation in U.S. broadband prices over European ones; they are only a contributor. The fact that the U.S. is much less densely populated than Europe, and that U.S. networks provide much higher speeds and carry much more data per household than European ones, are also likely contributors.

Profits and investment: This section contains a mishmash of data purporting to suggest that U.S. capital investment is not impressive. But the data presented for “Comcast, AT&T, and other Telecom companies” are not an appropriate basis for analysis, because many of these companies are diversified into businesses other than broadband. Comcast and AT&T offer television services and own movie studios. Comcast owns theme parks, and AT&T owns legacy copper telephone networks and DBS satellite systems. Consolidated capex figures from these companies are inadequate to discern broadband-specific investments.

But in any event, discussion of the above is probably intended to divert attention away from the best available investment comparator for telecommunications, the data collected by the OECD from national statistical agencies or regulators.[7]

Sources: Investment data extracted from: OECD, “Telecommunications database”, OECD Telecommunications and Internet Statistics (database), http://dx.doi.org/10.1787/data-00170-en; OECD Digital Economy Outlook 2017, Table 3.10 https://www.oecd.org/sti/ieconomy/deo2017data/Table%203.10.%20Investment.xls; OECD Digital Economy Outlook 2015, Table 2.26 https://www.oecd.org/sti/ieconomy/deo2015data/2.26-Investment.xls and Table 2.31 https://www.oecd.org/sti/ieconomy/deo2015data/2.31-InvestCapita.xls. Population data extracted from https://stats.oecd.org/Index.aspx?DataSetCode=ALFS_POP_VITAL# .

All the paper appears to say in response to the investment history presented in the above chart is that since 2015, “investment by the main Telecom operators in Europe has grown rather quickly.” So it has; but so has investment grown in the U.S. – and U.S. per capita investment levels remain at nearly twice those in Europe.

Coda

It is odd that the paper should resort to such a strange mix of data that are old, wrong, or misinterpreted to try to support its claim that U.S. broadband is too expensive, and that this can only be the result of a lack of competition. Only by ignoring the fact that broadband networks are more widely deployed in the U.S. than in Europe, offer higher speeds, carry more data, and are more heavily subscribed to can the paper conclude that the European model is to be preferred. But while that may be the paper’s conclusion, it is not that of the European Commission, which has studied these issues directly. Table 7 of Annex 3 of its International Digital Economy and Society Index, which deals specifically with connectivity, finds the U.S. to score higher than all but the top EU country (Denmark) and to tie in score with the next two highest EU countries (Finland and Malta).[8] All other EU countries score lower.

So the real truth is that U.S. fixed broadband leads, and does not lag, Europe’s performance. This does not take away from the fact that there are Americans who cannot get broadband internet or cannot afford it. We need to help these people, and we do not need to trump up transatlantic differences to come to that conclusion.


[1] Curiously, the paper ascribes his data source’s stated figure of 87% usage for North America to be the usage figure specific to the U.S. This, of course, neglects the fact that roughly 10% of North America’s inhabitants are Canadians.

[2] The sources cited for this agglomeration of statistics (slide 34 of 247) are not traceable. They include: ITU, Global Web Index, GSMA Intelligence, Eurostat, Social Media Platforms’ Self-Service Advertising Tools, Local Government Bodies and Regulatory Authorities, APJ, and United Nations. Indeed, this same boilerplate list of sources appears on several other slides in the presentation.

[3] Indeed, reference to the Eurostat household questionnaire that appears to be the ultimate source for the paper’s cited statistics confirms that wireless broadband, satellite and dial-up are included in the European usage figures. See, ICT usage in households and by individuals (isoc_i) (europa.eu) and https://circabc.europa.eu/sd/a/919a2bf9-d2e0-4d00-8c72-ddcee7f3285a/MQ_2020_ICT_HH_IND.pdf.

[4] Turkey (which is below Austria in speed) is possibly another European country on the slide, but out of conservatism, we will not include it in our analysis.

[5] The EU countries contained in the slide’s list are: Romania, France, Sweden, Spain, Denmark, Netherlands, Portugal, Poland, Belgium, Germany, Ireland, U.K., Italy and Austria. Switzerland is not an EU country. Note that the U.K. was still a member of the EU when these data were collected.

[6] We assume this is intended to be residential ARPU because the paper compares it against a French statistic for residential broadband prices.

[7] OECD Telecommunications and Internet Statistics, http://www.oecd.org/sti/broadband/9b.Investment.xls and https://stats.oecd.org/Index.aspx?DataSetCode=EDU_DEM.

[8] Connectivity dimensions are defined as: Fixed Broadband Coverage, Fixed Broadband Take-up, 4G Coverage, Mobile Broadband Take-up, Fixed Wired Broadband Speed and Broadband Price Index.

Americans overwhelmingly believe that broadband should be available to every American and that the funding base to achieve that should be broadened to every company that makes money through the internet.

More than 78% of respondents agreed that broadband internet should be available to every American, showing broad support across the population. Looking a bit closer at the answers of respondents who say they have broadband versus those who say they don’t, support for universal access among the 26% of respondents who do not have what they consider broadband drops to 64%. This indicates that there is not only an availability and affordability gap but also an educational gap. Many people who don’t have broadband internet access either do not want it or do not understand why they should have it. These findings, which are mirrored in other studies, indicate that any broadband infrastructure program should include an educational component to increase the number of broadband subscribers. Otherwise, broadband penetration will never reach its full potential.

As we found in previous surveys on the topic, around 54% of respondents use the internet for work purposes from home. This matches the share of employees classified as white-collar by the U.S. Department of Labor. The number highlights the importance of broadband for the functioning of American businesses and enterprises during the continuing pandemic. It is likely that the added agility and flexibility of working from home will continue to be utilized after the country has emerged from the pandemic restrictions. We also found similar opinions around what Americans consider broadband. The median American considers 50/5 Mbps to be broadband, whereas the single most common answer was gigabit speed, chosen by 29% of respondents.
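The median and mode claims can be checked directly against the answer distribution reported in the survey results at the end of this piece; a minimal sketch:

```python
# "How would you define broadband internet access?" answer shares (in %),
# ordered from slowest to fastest tier, as reported in the survey results.
definition_shares = [
    ("3/1", 9.3),
    ("10/1", 9.0),
    ("25/3", 14.8),
    ("50/5", 17.3),
    ("100/10", 20.9),
    ("Gigabit", 28.8),
]

# Mode: the single most common answer.
mode = max(definition_shares, key=lambda kv: kv[1])[0]  # "Gigabit"

# Median: the first tier at which the cumulative share reaches 50%.
cumulative = 0.0
median = None
for tier, share in definition_shares:
    cumulative += share
    if cumulative >= 50.0:
        median = tier  # "50/5" (cumulative share hits 50.4% here)
        break
```

The cumulative share crosses 50% at the 50/5 tier (9.3 + 9.0 + 14.8 + 17.3 = 50.4%), while gigabit remains the most common single answer at 28.8%.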

The high percentage of Americans who think broadband should be available to everyone is probably based on intensive usage and the need of many Americans to work from home over the internet. Using the weighted average of the responses, Americans spend around six hours every day on the internet.

How many hours do you spend on the internet with a mobile device or computer?

Response | Uses the internet from home for work | Does not use the internet from home for work | Combined
Less than an hour | 4.8% | 18.0% | 10.9%
Two to four hours | 18.1% | 36.0% | 26.4%
Four to six hours | 15.2% | 26.3% | 20.4%
Six to eight hours | 25.2% | 8.8% | 17.6%
Eight to twelve hours | 25.7% | 4.8% | 16.0%
More than twelve hours | 11.0% | 6.0% | 8.7%
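The “around six hours” figure can be reproduced from the combined column of the table above by assigning each bucket a midpoint; the 0.5-hour and 13-hour endpoints for the two open-ended buckets are our assumptions, not survey data:

```python
# Combined response shares from the table above, with assumed bucket midpoints.
# The 0.5 h and 13 h endpoints are assumptions; the shares come from the survey.
buckets = [
    (0.5, 0.109),   # less than an hour
    (3.0, 0.264),   # two to four hours
    (5.0, 0.204),   # four to six hours
    (7.0, 0.176),   # six to eight hours
    (10.0, 0.160),  # eight to twelve hours
    (13.0, 0.087),  # more than twelve hours
]

# Weighted average daily hours online.
avg_hours = sum(midpoint * share for midpoint, share in buckets)  # about 5.8
```

Under these midpoint assumptions the weighted average comes out to roughly 5.8 hours per day, consistent with the “around six hours” figure cited above.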

When looking at those who use the internet from home for work, unsurprisingly the usage pattern is significantly heavier, as it includes both business and leisure activities.

While only 37% of respondents knew that the Lifeline program provides low-income Americans with basic phone and internet service, they were open to new funding sources to close the digital divide. More than 71% of Americans are in favor of companies whose business models rely solely on the internet to exist, and who generate revenue from those businesses, like Google and Facebook, also contributing to providing access for Americans who currently do not have access to the internet. Such a move would dramatically expand the funding sources for a broadband access plan and include the companies that have extracted the most value and profits from the internet.

What is really interesting is that the survey also found support for extending net neutrality rules to websites and e-commerce companies. We framed questions around the net neutrality principles of no blocking, no throttling, and no paid prioritization by asking Americans if websites like Google, Facebook or Amazon should be allowed to restrict access to legal sites, give preference to their own products and services over others, and change search results based on how much money they receive from others.

More than 72% of Americans are against companies like Facebook or Google restricting access to legal sites for any reason. This is exactly the behavior Facebook showed when it made it impossible to link from Facebook to news sites in Australia (and, for a short time, to itself) to avoid having to compensate the news sites it linked to. In essence, it was a commercial and legislative conflict in which Facebook wanted to use its customer base as a bargaining chip in its negotiations. This is the essence of the “No Blocking” rule in net neutrality.

More than 55% of Americans believe companies like Amazon, Google, or Facebook should not be allowed to give preference to their own products and services over those of others, a self-dealing practice that has cost Google more than $10 billion in fines from the European Union. Search engines like Amazon, Google, and Facebook, all of which provide you with what you are looking for, are increasingly the prism through which we see the world. They have incredible power over our perception of what it is we are actually looking for. By pushing competing products into the obscurity of lower-ranked results, they, in essence, throttle the success of other products that are better but do not fit the commercial objective of the search engine provider.

In terms of pay-to-play result manipulation, more than 80% of Americans say they are against search engines altering results based on how much websites and advertisers pay for preferential positioning. It is common for the first few search results for a given term to be occupied by responses marked only by the easily missed word “Ad” in front of the link. This effectively operates as paid prioritization, something ISPs are not allowed to do under California’s net neutrality law, nor under the earlier versions of net neutrality rules that the Democrats might be considering reinstating.

The results of our survey showcase two key points: Americans are open to reining in tech giants, which rely solely on the internet to generate revenue, and curbing their ongoing uncompetitive behavior; and they are open to having these companies contribute part of said revenues to subsidize access to broadband for low-income Americans. While the Biden Administration focuses on proposing ideas that have been tried and tested, perhaps it should take a step back and listen to consumers, whom the administration ought to serve and prioritize.

While we all agree that the United States needs more broadband and net neutrality, most Americans do not support the Biden administration’s plan. The majority of Americans want internet companies to pay their share to build the broadband network that these companies are profiting from. They also want to be protected from the demonstrated behavior of internet-based companies that violate the net neutrality rules these companies want to impose on other companies but not on themselves. Net neutrality rules need to protect consumers, not one set of companies that want to prevent other companies from competing effectively with them. Any net neutrality rules that do not apply to both internet service providers and internet companies like search engines, social media companies, and e-commerce providers are just cleverly disguised corporate welfare, with the government picking winners and losers.

Between March 16 and March 26, 2021, Recon Analytics conducted a demographically representative survey of 1,000 Americans using the internet and cell phones, asking them about their opinions and attitudes around universal access, funding mechanisms, conduct, and usage.

Do you believe that access to broadband internet should be available to every American?
Yes 78.2% / No 21.8%

Did you know that the government requires a small portion of your phone bill to be used to fund phone service for low-income Americans, aka Lifeline service?
No 62.9% / Yes 37.1%

Do you think companies like Google and Facebook that make money through the internet should contribute to providing access for Americans who do not have the internet?
Yes 71.4% / No 28.6%

Should companies like Google or Facebook be allowed to restrict access to legal sites for any reason?
No 72.7% / Yes 27.3%

Should companies like Amazon, Google, or Facebook be allowed to give preference to their own products and services?
No 55.8% / Yes 44.2%

Should search engines be allowed to alter search results based on how much money they receive from websites or advertisers?
No 80.6% / Yes 19.4%

How would you define broadband internet access?
3/1 9.3% / 10/1 9.0% / 25/3 14.8% / 50/5 17.3% / 100/10 20.9% / Gigabit 28.8%

Do you currently have broadband internet access?
Yes 74.2% / No 25.8%

Does your job require internet access at home?
Yes 53.4% / No 46.6%

How much time per day do you spend on the internet (via your mobile device or on your computer)?
Less than an hour 10.9% / 2-4 hours 26.4% / 4-6 hours 20.4% / 6-8 hours 17.6% / 8-12 hours 16.0% / More than 12 hours 8.7%

Japan’s Rakuten is the first mobile network operator (MNO) globally to fully virtualize its network, with millions of active customers on commercial service. Rakuten has taken its expertise as a fully virtualized operator to create the Rakuten Communications Platform (RCP), which packages its vendor portfolio and deployment expertise and markets it to other operators who also want to run their networks in the cloud. Interestingly, the vendor product portfolio being sold is more extensive than what Rakuten has chosen to deploy. Based on news reports, Rakuten has already signed up 15 customers for RCP. The first publicly announced RCP trial partner is Ligado, a would-be US operator that owns spectrum in the United States previously used for satellite service. Until recently, Ligado was locked in a fight with the Department of Defense and NTIA over potential interference with GPS, but the FCC sided with Ligado and allowed it to use its spectrum for commercial service.

Based on our research, the RCP universe consists of the following players:

Area | Company
4G Core | Cisco
Converged 4G/5G Core | NEC
Servers | Quanta
Service orchestration | Innoeye (acquired by Rakuten)
IMS/RCS | Mavenir
Open RAN software | Altiostar
4G Sub-6 GHz Radios | Nokia
5G Sub-6 GHz Radios | NEC
4G/5G mmWave Radios and Software | Airspan

Qualcomm and Intel are also mentioned as RCP participants but apparently are mostly involved as silicon providers for their particular area of expertise.

Rakuten continues to use Innoeye, which it purchased outright, and Altiostar, in which it holds an equity investment, for orchestration and Open RAN software, respectively. Mavenir continues to supply the IMS/RCS software, and Quanta provides the servers.

RCP made several adjustments to its vendor portfolio when it added 5G support to the network. Rakuten switched its packet core from Cisco to NEC, which will work with Rakuten on building a converged 4G/5G core. NEC also replaced Nokia for the sub-6 GHz radios, as Nokia only provides 4G radios for Rakuten. This change was surprising, as the NEC 5G radios are actively cooled, whereas state-of-the-art radios are passively cooled. Rakuten did not switch mmWave radio providers; Airspan continues to supply mmWave radios for both 4G and 5G.

One of the powerful features of RCP is that an MNO can mix and match from any vendor in the RCP portfolio. If an MNO prefers Nokia or Airspan as its radio vendor, it can use Nokia's radios for 4G or Airspan's for both mmWave and sub-6 GHz, and pair them with either Airspan's Open RAN software or Altiostar's.
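As a sketch of that mix-and-match idea, the vendor table above can be modeled as a catalog from which one vendor is chosen per functional area. The area names and the `build_network` helper below are illustrative assumptions, not an actual RCP API:

```python
# Illustrative model of the RCP vendor catalog (vendors from the table above);
# the area keys and the build_network helper are hypothetical, not an RCP API.
RCP_CATALOG = {
    "core":          ["Cisco", "NEC"],
    "servers":       ["Quanta"],
    "orchestration": ["Innoeye"],
    "ims_rcs":       ["Mavenir"],
    "open_ran_sw":   ["Altiostar", "Airspan"],
    "sub6_radios":   ["Nokia", "NEC"],
    "mmwave_radios": ["Airspan"],
}

def build_network(choices):
    """Validate a mix-and-match vendor selection against the catalog."""
    for area, vendor in choices.items():
        if vendor not in RCP_CATALOG.get(area, []):
            raise ValueError(f"{vendor} is not offered for {area}")
    return choices

# An MNO picking NEC for core and sub-6 radios, Airspan for mmWave:
config = build_network({"core": "NEC", "sub6_radios": "NEC",
                        "mmwave_radios": "Airspan", "open_ran_sw": "Altiostar"})
```

The point of the sketch is simply that each area is independently swappable, which is exactly what differentiates RCP from a traditional single-vendor, integrated network build.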

RCP fits into an interesting sweet spot in the market. Most large MNOs, especially in the United States, will chart their own path toward Open RAN based on how it fits into their current networks. Changing or introducing vendors for such a significant network transition is like changing an airplane's engines in midflight. Small operators, especially if they have already chosen Huawei equipment, are locked in. These operators typically have lean engineering and operations staffs that are not trained or sized for such a significant network transition, which makes them dependent on large network providers as prime project managers and for vendor financing. Huawei's growth into the largest global provider of 5G equipment to MNOs has been based on significant deployment and customer service capability, to the point where almost every Huawei-powered network is a custom network with a generous vendor financing package. A side effect of this customization is that it is difficult for other vendors to win a share of the network equipment business. Medium-size MNOs and greenfield operators, especially those not dependent on vendor financing, as well as the small rural MNOs in the United States that have to replace the Huawei equipment in their networks (and are collectively being paid $1.9 billion to do so), are prime targets for RCP. Rakuten's offering lets MNOs jump to state-of-the-art software-defined networks with Open RAN. Software-defined networks are more flexible and cheaper to operate, and because the hardware is standardized, they are also less expensive to buy.

The rural operators who are replacing their equipment should, regardless of whether they choose RCP or a custom route, invest in the technologies of the future, SDN and Open RAN, rather than in the past's integrated hardware and software. For rural MNOs who have to run their network with a lean team, SDN's automation allows the network operations team to be more efficient and effective. Before RCP, the path to SDN and Open RAN was quite daunting, as, for example, Dish's Charlie Ergen remarked during the company's Q4 2020 earnings call. RCP solves the complexity problem by letting rural MNOs use a working suite of products from another network operator. As a further bonus, several MNOs could combine their network operations and share a common core and operations team for additional cost benefits. Switching to SDN and Open RAN would also align with President Biden's Buy American initiative: several leaders in the field, such as Airspan, Altiostar, Cisco, and Mavenir, are American companies. For all too long, we have complained that there are no American telecom network equipment providers. Now the telecom industry has an opportunity to diversify its vendor base.

In recent years, the FTC raised concerns that Qualcomm's patent portfolio and licensing practices would prevent other companies from manufacturing and selling 5G chipsets, leading to an anti-trust lawsuit that concluded in November 2019. However, the prediction has not been borne out. Currently, two companies, Qualcomm and MediaTek, sell 5G chipsets to the device ecosphere at large, two captive suppliers make their own 5G chipsets for internal consumption, and one company is creating its own new 5G chipset, also for internal consumption.

The mobile chipset business has a series of players with different objectives. Companies like Qualcomm and MediaTek provide mobile chipsets to device manufacturers and serve the vital function of ecosphere enablers. Without them, the plethora of devices and choices consumers enjoy when it comes to smartphones would not be possible. Another set of companies are making mobile chipsets only for themselves to create a competitive advantage in the marketplace. Apple and Samsung fall into this camp. Huawei is potentially a hybrid case as it was previously only providing its own handset group with chipsets, but now also provides them to a Chinese state-led consortium that purchased the Honor handset line.

Company | Customer | Modem | Integrated SoC | Stand-alone Application Processor | RF Front-end
Qualcomm | Ecosphere | Yes | Yes | No | Yes
MediaTek | Ecosphere | No | Yes | No | No
Huawei | Divested divisions | Yes | Yes | Discontinued | No
Samsung | Captive | Yes | Yes | No | No
Apple | Captive | Future | Future | Yes | No
Intel (sold to Apple) | Ecosphere | Yes | No | Aborted | No

Currently, Qualcomm provides high-quality Systems on a Chip (SoC) that integrate multiple components, ranging from baseband, AI, graphics, and camera to CPU, into one chip, and sells them to anyone interested. Qualcomm was the first company to offer 5G chipsets, with the first devices hitting the market at the end of 2019. MediaTek offers a similar, but less advanced and less integrated, product line to device manufacturers looking for low-end to mid-range chipsets. By the middle of 2020, MediaTek's chipsets were powering a broad portfolio of handsets.

Intel, another ecosphere provider, sold its mobile chip business to Apple in December 2019, nine years after it entered the mobile chip market by buying a division of Infineon. Intel's motivation for buying Infineon was that Infineon was the sole provider of modems to Apple. Reportedly, during the negotiations, then-Intel CEO Paul Otellini sought reassurances from then-Apple CEO Steve Jobs that Apple would continue to use Infineon products after the acquisition, as Otellini recognized the importance of Apple as a customer for Intel's chipsets. During the nine years after the Infineon acquisition, the fate of Intel's mobile chipset division was intricately linked to Apple, as Intel struggled to find other customers in the mobile device manufacturer ecosphere. In a nutshell, Intel was unable to compete with Qualcomm on quality, such as RF performance and SoC integration, and unwilling to compete with MediaTek, which had a more integrated solution than Intel did. Intel ultimately threw in the towel on the heels of Apple and Qualcomm settling their lawsuit and agreeing to a six-year licensing agreement (with a two-year extension option) and a multiyear chipset supply agreement.

Huawei, through its HiSilicon subsidiary, has developed its own 5G chipsets and integrated them into its own devices. While the Huawei chipsets are not as integrated and small as Qualcomm's, Huawei's engineers have found ways to integrate them into its devices. Huawei uses Qualcomm, Skyworks, and Qorvo, all from the US, for its RF front-end. Huawei's role in the mobile world got a lot more interesting when it sold its Honor-brand device division to a Chinese state-led consortium of more than three dozen companies, as American sanctions put heavy pressure on its device sales. Reportedly, Huawei is also considering selling its Mate and P-line device groups in the hope that American sanctions will not follow the device businesses to their new owners. Until now, Huawei has not sold its HiSilicon chipsets to companies other than the group of Huawei dealers that acquired the Honor-brand device division, keeping its best technology captive as a competitive weapon. In 2019, during the trade tensions between the US and China over Huawei, the company offered to license its 5G intellectual property to American companies to alleviate spying concerns, but no deal has emerged to date. If Huawei divests its entire device portfolio, it might also divest its HiSilicon division or turn it into an ecosphere provider for other handset manufacturers. The direction of Huawei's HiSilicon business will be quite telling of the size of the Chinese walls between Huawei, its divested handset businesses, and other handset vendors.

Samsung has been producing its own Exynos modems and mobile processors and has also purchased mobile chipsets from Qualcomm. Samsung's new 5G devices, including its S20 5G flagship smartphone, ship with either the Exynos or the Qualcomm Snapdragon chipset. Samsung sells the Qualcomm variant in the US, China, and, most recently, South Korea, and the Exynos variant in the rest of the world. Benchmarking has shown that the Qualcomm version regularly outperforms the Exynos one and that Samsung uses the Qualcomm variant in the most competitive markets to close the gap against Apple's iPhone.

In 2008, Apple, with its computer heritage, bought P.A. Semi, a processor development company specializing in highly power-efficient designs, to build its own ARM-based processors for iPhones, iPads, and similar devices. Apple's ARM processors are now the fastest CPUs in the market and will start powering Apple Mac computers in 2021. Apple sourced its baseband chipset first from Infineon, then post-acquisition from Intel, then a few years later from Qualcomm, then dual-sourced from Intel and Qualcomm, and most recently, in 2019, signed an agreement to return to Qualcomm. In 2019, Apple also bought Intel's baseband chipset business and has started hiring more wireless engineers in San Diego, Qualcomm's home market. Considering Apple's track record, it is quite logical that Apple will try to replicate its successful ARM processor endeavor in modems and internally source its 5G mobile chipsets when the Qualcomm agreement expires. The Qualcomm agreement gives Apple breathing room to pour resources into an area that is a key differentiator between mobile devices.

These successful 5G chipset endeavors demonstrate that Qualcomm's patent portfolio and licensing policy do not present a significant barrier to innovation. Qualcomm's licensing rates have not changed since it first started licensing CDMA in the 1990s, while its portfolio has grown substantially, facilitating, on a fair, reasonable, and non-discriminatory basis, the continued innovation that has made the United States a leader in international telecommunications. As silicon merchants to the industry, Qualcomm and MediaTek create choice and opportunity for many mobile device manufacturers to find a chipset that meets their needs and budgets, which greatly increases the range of consumer choices without infringing on the ability of other companies to enter the market.

The Super Bowl is not only the pinnacle of the American football season, it is also the showcase event for wireless carriers. Every year, every wireless carrier sets aside tens of millions of dollars to improve the wireless network for the big game. The newest and best gets installed so that attendees and regular citizens alike get a superior experience. One of the challenges mobile operators have to overcome to provide faster speeds is how to best use the spectrum they have. Each carrier's portfolio contains different amounts of spectrum, ranging from 600 MHz and 700 MHz all the way to 39 GHz. To increase speeds for customers, these spectrum slivers can be bonded together in a process called carrier aggregation (CA). The maximum channel bandwidth is 20 MHz for LTE and 100 MHz for 5G. So if a mobile network operator had 1,200 MHz of contiguous spectrum, with 5G it could aggregate twelve 100 MHz channels for the maximum amount of bandwidth, which directly translates into speed.
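As a minimal sketch of that arithmetic, assuming the channel caps stated above (20 MHz for LTE, 100 MHz for 5G); the helper function is illustrative, not a standard API:

```python
# Aggregated bandwidth under carrier aggregation: whole channels only,
# limited both by the channel size cap and by how many channels can be
# combined (the CA limit of the network and device).
def aggregated_bandwidth_mhz(spectrum_mhz, channel_cap_mhz, max_channels):
    """Usable bandwidth in MHz from spectrum, channel cap, and CA limit."""
    channels = min(int(spectrum_mhz // channel_cap_mhz), max_channels)
    return channels * channel_cap_mhz

# 1,200 MHz of contiguous spectrum, 100 MHz 5G channels, 12-channel CA:
print(aggregated_bandwidth_mhz(1200, 100, 12))  # -> 1200
```

The same function shows why LTE tops out much lower: 100 MHz of spectrum with 20 MHz channels yields at most five channels, or 100 MHz total.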

Global Wireless Solutions (GWS) conducted network testing during the 2021 Super Bowl in Tampa and provided some additional insights that typically don't get mentioned in other tests. GWS found that AT&T used 8-channel CA with 800 MHz of its 39 GHz spectrum, Verizon used 6-channel CA with 600 MHz in the 28 GHz band, and T-Mobile used 4-channel CA with 400 MHz in the 39 GHz band plus 80 MHz in the 2.5 GHz band. Availability in the 28 and 39 GHz bands was almost the same for every carrier at 73% to 77%, which isn't surprising considering the relatively small area covered. T-Mobile's 2.5 GHz coverage added another 20% of coverage. Considering that more spectrum means more speed, it comes as no surprise that AT&T was the fastest with peak speeds of 1.7 Gbps, followed by Verizon with 1.5 Gbps and T-Mobile with 1.1 Gbps.
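As illustrative back-of-the-envelope arithmetic only, using the GWS figures quoted above, dividing each carrier's peak speed by its aggregated spectrum gives an implied peak efficiency in bits per second per hertz:

```python
# GWS Super Bowl figures from the text: (aggregated MHz, peak Gbps).
# T-Mobile's total combines 400 MHz of mmWave with 80 MHz of 2.5 GHz.
results = {
    "AT&T":     (800, 1.7),
    "Verizon":  (600, 1.5),
    "T-Mobile": (480, 1.1),
}

for carrier, (mhz, gbps) in results.items():
    # Gbps * 1000 / MHz = bps per Hz of aggregated spectrum
    print(f"{carrier}: {gbps * 1000 / mhz:.2f} bps/Hz")
```

The implied efficiencies cluster around 2 to 2.5 bps/Hz, which supports the article's point: the speed ranking tracks the amount of spectrum put to use far more than any difference in how efficiently each network uses it.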

The GWS tests show that it is not only important to have a lot of spectrum; it is even more important to use it. Not only does the network have to be ready for it, but also the device. Most flagship devices use the Qualcomm Snapdragon X55 modem. Devices using this modem, like the Samsung Note 20 that the GWS engineers used for the tests, can utilize 8-channel CA in the mmWave bands (above 6 GHz) with 2×2 MIMO, and 200 MHz with 4×4 MIMO below 6 GHz, plus 7×20 MHz channel CA for LTE. This means that when T-Mobile uses all of its 2.5 GHz spectrum (130 MHz on average), current flagship devices will be ready. When Verizon and T-Mobile are ready to follow AT&T's lead with 8-channel CA, the devices will also be able to support that. The Qualcomm X60 modem, found for example in the Samsung Galaxy S21, can aggregate spectrum below and above 6 GHz. This allows combining the better coverage characteristics of sub-6 GHz spectrum with the massive spectrum bands, and therefore speeds, available above 6 GHz. The X60 modem also works with the upcoming C-Band networks that will probably become available in 2022.

When Nvidia announced that it was in the process of buying Arm from Softbank, many analysts and industry observers were exuberant about how it would transform the semiconductor industry by combining the leading data center Artificial Intelligence (AI) CPU company with the leading device AI processor architecture company. While some see the potential advantages that Nvidia would gain by owning ARM, it is also important to look at the risks that the merger poses for the ecosphere at large and the course of innovation.

An understanding of the two companies' business models and their interplay highlights the significance of the proposed merger. Nvidia became the industry leader in data center AI almost by accident. Nvidia became the largest graphics provider by combining strong hardware with frequently updated software drivers. Unlike its competitors, Nvidia constantly improved not only its newest graphics cards but also past-generation cards with new drivers that made them faster. This extended the useful life of graphics cards but, more importantly, it also created a superior value proposition and, therefore, customer loyalty. The software also added flexibility, as Nvidia realized that the same approach that makes graphics processing on PCs efficient and powerful – parallel processing – is also suitable for other heavy computing workloads like bitcoin mining and AI tasks. This opened up a large new market that its competitors could not follow into due to their lack of suitable software capabilities, and it made Nvidia the market leader in both PC graphics cards and data center AI computation with the same underlying hardware and software. Nvidia further expanded its lead by adding a parallel computing platform and application programming interface (API) to its graphics cards, which laid the foundation for Nvidia's strong performance and leading market share in AI.

ARM, on the other hand, does not sell hardware or software. Rather, it licenses its intellectual property to chip manufacturers, who then build processors based on the designs. ARM is so successful that virtually all mobile devices use ARM-based CPUs. Apple, which has used ARM-based processors in the iPhone since its inception, is now also switching its computer processors from Intel to ARM-based, internally built CPUs. The ARM processor designs are now so capable and so focused on low power usage that they have become a credible threat to Intel, AMD, and VIA Technologies' x86-based CPUs. Apple's move to eliminate the x86 architecture from its SKUs is a watershed moment in that it solves a platform development issue by allowing developers to natively design data center apps on their Macs. Consequently, it is only a matter of time before ARM processor designs show up in data centers.

This inevitability highlights one of the major differences between ARM's and Nvidia's business models. ARM makes money by creating processor designs and licensing them to as many companies as possible that want to build processors. Nvidia's business model, on the other hand, is to create its own processor designs, turn them into hardware, and then sell an integrated solution to its customers. It is hard to overstate how diametrically different the business models are, and hard to imagine how one could reconcile them in the same company.

Currently, device AI and data center AI are innovating and competing around what kinds of tasks are computed and whether the work is done on the device, at the data center, or both. This type of innovative competition is the prerequisite for positive long-term outcomes, as the marketplace decides the best distribution of effort and which technology should win out. With this competition in full swing, it is hard to see how a CEO could reconcile this battle of the business models within one company. Even less credible is the idea that one division of the new Nvidia, ARM, could sell to Nvidia's competitors, for example in the data center or automotive industries, and make them more competitive, especially for such a vigorous competitor as Nvidia. It would also not be palatable to shareholders for long. The concept of neutrality that is core to ARM's business would go straight out of the window. Nvidia wouldn't even have to be overt about it: the company could tip the scales of innovation toward its core data center AI business by simply underinvesting in the ARM business, or in industries it chooses to deprioritize in favor of the data center. It would also be extremely difficult to prove underinvestment if Nvidia simply maintained current R&D spend rather than increasing it, as another owner might, one that sees the ARM business as a significant growth opportunity rather than the threat Nvidia might see.

It is hard to overestimate the importance of ARM to mobile devices and, increasingly, to general-purpose computing: more than 130 billion ARM-based processors had been made as of the end of 2019. If ARM is somehow impeded from innovating as freely as it has, the pace of global innovation could very well slow down. The insidious thing about such an innovation slowdown is that it would be hard to quantify and impossible to rectify.

The proposed acquisition of ARM by Nvidia also comes at a time of heightened anti-trust activity. Attorneys General of several states have accused Facebook of predatory conduct. New York Attorney General Letitia James said that Facebook used its market position “to crush smaller rivals and snuff out competition, all at the expense of everyday users.” Among the anti-competitive conduct cited as the basis for the anti-trust lawsuit against Facebook were predatory acquisitions meant to lessen the competitive pressure from innovative companies that might threaten Facebook's core business.

The parallels are eerie and plain to see. The acquisition of ARM by Nvidia is all too similar to Facebook's acquisitions of Instagram and WhatsApp in that both allow the purchasing entity to hedge its growth strategy regardless of customer preferences while potentially stifling innovation. And because Facebook was in the driver's seat, it could take advantage of customer preferences: whereas in some countries and customer segments the core Facebook brand is seen as uncool and old, Instagram is seen as novel and different from Facebook. From Facebook's perspective, the strategy keeps the customer in-house.

The new focus by both the states and the federal government, Republicans and Democrats alike, on potentially innovation-inhibiting acquisitions, highlighted by lawsuits examining past acquisitions in Facebook's and Google's cases, makes it inevitable that new mergers will receive the same scrutiny. Regulators are likely to conclude that the proposed acquisition of ARM by Nvidia looks and feels like an act meant to take control of the engine that fuels the most credible competitors to Nvidia's core business, just as ARM and its customers expand into the AI segment and become likely threats to Nvidia. In a different time, regardless of administration, this merger would have been waved through, but it would be surprising if that were the case in 2021 or 2022.

Over the past 15 years, there have been several government initiatives to expand the adoption of broadband in the United States. At the same time, industry has been busily focused on extending the reach and capacity of both fixed and mobile broadband networks.  Yet, a digital divide still exists.  Why?  Let’s review the history here.

Since xxx, the cable and telecom industry have successfully provided broadband connectivity to more than 110.8 million households, adding about 2.4 million households per year. Gigabit speeds are now available to 85% of households. The broadband companies expand their footprint in an economically responsible way, as they are accountable to their shareholders. Still, this leaves us with 17.7 million households left to cover. With the number of households increasing by roughly one million per year, at the current pace closing the gap would take around 13 years. The current pandemic, with its work- and study-from-home demands, shows us that we do not have 13 years to close this digital divide. To make the best possible decision on how to solve the problem, we should look at what has and has not worked in the past.
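The "around 13 years" figure falls out of simple arithmetic, sketched here with the household numbers quoted above:

```python
# Years to close the digital divide at the current pace:
# 17.7M uncovered households, 2.4M newly connected per year,
# offset by roughly 1M new households formed per year.
uncovered = 17.7        # million households without broadband
added_per_year = 2.4    # million households connected per year
new_per_year = 1.0      # million new households formed per year

years = uncovered / (added_per_year - new_per_year)  # net 1.4M/year
print(round(years, 1))  # -> 12.6, i.e. roughly 13 years
```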

One of the most hotly debated solutions proposed to close the digital divide is to have the government support municipal broadband, a catch-all term for providers of broadband that includes telephone and electric cooperatives. The general caveat of government entering what is a private business market is what economists call crowding out. A for-profit company typically has no chance of competing against a government entity: the latter has no profit goal and can provide service at a loss indefinitely, as it has access to government revenue in the form of taxes or bonds to cover the losses. At the same time, the government has a poor record of adapting to a rapidly changing technological environment. The pro-municipal-broadband argument holds that if for-profit companies are not offering services in a particular geographic location, they cannot be crowded out.

Electric cooperatives were founded in the 1930s to solve the 20th-century equivalent of the broadband problem, and that solution is instructive for our current situation. The Institute for Local Self-Reliance (ILSR), an organization in favor of dispersing economic power and ownership, identified eight municipal networks that failed in the United States. The common thread of failure was inexperience in running customer-facing organizations, as neophytes struggled to learn a new skill set. This highlights the gap between running a relatively small number of government services and running a much larger and more technically complicated broadband network, and the difficulty of recruiting people with the right skill sets.

The most likely scenario for success is the addition of broadband service to an existing electric or telephone cooperative’s portfolio. In this case, an entity that has run a customer-facing operation and network for decades simply expands its service. The cooperatives already serve mostly rural customers and do not crowd out for-profit cable and telecom providers. The FCC has recognized this and has explicitly included electric cooperatives in the Connect America Fund II initiative (which we will discuss later).

Source: ILSR

As we can see from the map above, the opportunity for rural broadband coverage from cooperatives is significant, as rural areas, often in the South and the Great Plains, have low population density. Engaging both electric and telephone cooperatives may be an effective way to close the digital divide in some rural areas. These efforts could take the form of public-private partnerships and potentially avoid the pitfalls of muni-broadband.

Muni-broadband has failed for different reasons. Research shows that most of the failed entities are urban, often engaging in direct competition with incumbent providers; examples such as Monticello, MN, Salisbury, NC, and Tacoma, WA come to mind. In other cases, municipal broadband networks such as those in Muscatine, IA and Utopia, UT had to be bailed out by taxpayers or the electric cooperative because they could not stay afloat. We also have Provo, UT and Groton, CT, which ended up selling to private companies at a great loss to taxpayers; Burlington, VT, where lack of oversight and a cover-up of incompetence led to failure; and Bristol, VA, where corruption meant the end of the network.

In 2010, Google announced that it would start providing broadband fiber connectivity in a number of cities to between 50,000 and 500,000 households. Cleverly, Google put out a request for information asking municipalities to apply to have Google offer fiber in their city or town. This reversed the traditional relationship between provider and municipality, in which the provider asks the municipality if it can provide service in the area and the municipality responds with its demands in terms of fees and extra services. Ever wondered why so many pools, parks, and sports arenas are sponsored by telecom and cable companies? It was one of the demands of the city in exchange for allowing the service provider to offer service in the town. By inverting the relationship and asking towns to apply to Google for consideration, Google shifted the power relationship and was able to receive terms so favorable that telecom and cable providers went to cities and demanded the same terms and conditions that Google got but they had never been able to obtain on their own. Under equal-treatment rules, these cities had to extend the favorable Google terms and conditions to every other provider. Kansas City was the first city where Google Fiber launched, followed by Austin, Provo, and fifteen more cities. The Provo network was a defunct municipal network that was built for $39 million and then sold to Google for one dollar. After realizing the high cost of building a fiber network and the long delay before payback, Google first halted further network expansion after it had deployed in five cities, and then switched to a public-private partnership (PPP) model in which the municipality builds the network and incurs the cost while Google sells the service. In addition, Google made an acquisition in the fixed wireless broadband space to also provide broadband wirelessly.
This has slowed the expansion significantly, but the scope has grown beyond what can be called a trial (as Google likes to call every endeavor it gets into), as Google now covers 18 cities.

The 19th market for Google Fiber will be West Des Moines, Iowa. As in Huntsville, Alabama, the city will build a fiber network for $39 million; in exchange, Google will pay the city $2.25 for each household that connects to the network. Over the 20-year agreement, Google will pay at least $4.5 million to the city. The project will be completed by the end of 2023. By entering PPPs, Google gets the various cities to pay for the expensive buildout and makes money by providing the service. Google’s experience highlights that even one of the largest companies in the world does not have the focus, wherewithal, and patience to actually build out a nationwide system, but relies on the government to pay for the physical buildout.

When the government helps in areas with adverse circumstances, whether through low population density or low income, a business case can be made that allows the deployment of broadband services. The societal good that comes from broadband, in the form of access to online learning for students, job resources for adults, and an overall increase in computer skills, will create long-term benefits that outweigh the long-term costs.

On the government side of the equation, the FCC has been very focused on allocating monies (and spectrum) for broadband. The FCC’s Connect America Fund (CAF) was born out of the 2010 National Broadband Plan, aiming to broaden the availability of broadband. Now in its second iteration, CAF II, the fund is a reverse-auction subsidy for broadband providers, satellite companies, and electric cooperatives to provide coverage in underserved areas.

At the end of the CAF II auction, $1.49 billion of subsidies over ten years were awarded to provide broadband and voice services to 700,000 locations in 45 states, highlighted in the map above. Prospective providers successively bid lower and lower subsidy amounts for the right to cover an underserved market, which ensures that the area is covered at the least cost to taxpayers.
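The core of the reverse-auction mechanism described above can be sketched in a few lines. The bid figures below are hypothetical illustrations, not real CAF II data, and the actual auction used more elaborate multi-round descending rules:

```python
# Minimal sketch of a reverse subsidy auction: each prospective provider
# bids the subsidy it would need to cover an underserved area, and the
# area is awarded to the lowest bid, minimizing cost to taxpayers.
# All bid amounts are hypothetical.

bids = {
    "Provider A": 4_200_000,      # annual subsidy demanded, in dollars
    "Provider B": 3_600_000,
    "Electric Co-op C": 2_900_000,
}

# The winner is the bidder demanding the least subsidy.
winner = min(bids, key=bids.get)
print(winner, bids[winner])  # → Electric Co-op C 2900000
```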

CAF II and other government programs are increasingly closing the gap, with more than $20.4 billion over the next 10 years. The US Department of Agriculture has been one of the longest-standing sources of support for bringing broadband to rural America, with $600 million per year from the ReConnect program. In October 2020, the FCC will launch the auction for the Rural Digital Opportunity Fund (RDOF), a 10-year, $20.4 billion program to bring broadband to areas that lack it, with broadband defined as 25 Mbit/s download speed and 3 Mbit/s upload speed.

The biggest controversy around CAF II is its mapping issues. In a nutshell, if only one location in a census tract has access to broadband, it is assumed that all locations have broadband. In a significant number of cases this is not true: some locations have access while others do not. This is especially true in urban areas, where we still have some high-population pockets that lack access to broadband. Part of the FCC Commission wanted to delay additional projects until the mapping problem was solved, whereas the majority voted to release the funds and work on the problem concurrently, as the underserved markets remain underserved even under a tighter requirement. While criticized for its complexity and for a lack of clarity about how overachievement of the target goals gets recognized and impacts winning the subsidy, the program has overall been lauded as a success.

When we look at what has and has not worked, it becomes apparent that the for-profit system has given 90% of Americans access to at least one broadband provider. The problem is the hard-to-reach, in both urban and rural environments. No matter how we look at the issue, it becomes clear that government and cooperatives play a role in alleviating what is ultimately a societal problem.

  • Since Silicon Valley giants like Google with almost infinite resources have balked at building out fiber in many urban areas and are relying on cooperatives or municipalities to foot the bill, the economics of building out hard to reach parts of the United States are even more difficult.
  • The broadband industry is investing between $70 billion and $80 billion per year to connect Americans, and the wireless industry another $25 billion to $30 billion; even at that scale, the industry cannot shoulder the task alone.
  • Electric cooperatives, as non-profits, have a longer time horizon, which makes their investment in underserved rural areas easier; they also have an established relationship with prospective customers and an established connection to the location.
  • The CAF and other funds have worked by providing the minimal subsidy needed to cover underserved markets; we just need more of them. Some have complained that the program provides for only one choice, ignoring that 85% of households have a choice of two wireline providers and 99% of Americans can choose between at least three mobile service providers. The counterargument for very rural parts of the United States is that one choice in an economically unprofitable market is better than no choice. One also has to consider that requiring every location to have two choices roughly doubles the cost of deployment and leaves half of the infrastructure idle.
  • The program will work even better with more accurate mapping of underserved areas, which would broaden its scope from mainly rural to also urban areas and make it location agnostic. If a follow-up program wants to bring not only access but also competition to an underserved area, the government would have to quadruple if not quintuple the subsidy, since deployment costs double while the expected revenue per provider halves.

Not building out the areas that lack broadband access today, regardless of whether they are urban or rural, perpetuates the current trends in which parts of society cannot participate in the economic and social life of our country. As 2020 has shown us, broadband internet has become the lifeline of businesses, and video conferencing has become a necessity for employees working remotely. This means that many better-paid jobs are closed to people depending on where they live, whether the area without broadband is urban or rural. Left unsolved, this will force a further depopulation of rural America and a flight from unserved urban areas, as critical employees and business owners are effectively prevented from earning a living there. At least as important is equal access to education. Student homework and tests cannot be counted for grading unless every student in the class is able to participate. Without broadband access, not only the children who live in these unserved areas are affected but also their classmates who have access.

For a country known for being as efficient, organized, and technologically advanced as Germany, the state of its mobile networks constitutes a rare black mark. Germany is the fourth-largest economy in the world, with 82 million inhabitants (double the population of California in a slightly smaller area) and a highly efficient and advanced high-tech manufacturing industry. Where it struggles is with the digitalization of the economy and with both fixed and wireless networks. Germany’s wireless networks are ranked 32nd out of 34 countries, ahead only of Ireland and Belarus. Yet no other European country has a larger share of 3G users than Germany, and it is not uncommon to fall back to EDGE networks in both urban and rural areas. The reasons for this atypical performance lie with the actions of regulators and companies alike.

In 2010, Germany auctioned 4G licenses with the requirement that within five years 97% of the population would be covered by 4G. Yet even by 2020, every operator had failed to meet the 2015 buildout requirement. How could this happen in a country that prides itself on following the rules?

State                  | Requirement | Telefónica | Telekom | Vodafone
Baden-Württemberg      | 97%         | 82.7%      | 96.01%  | 97.7%
Bayern                 | 97%         | 80.7%      | 97.58%  | 98.3%
Berlin                 | 97%         | 100%       | 99.96%  | 100%
Brandenburg            | 97%         | 62.6%      | 97.5%   | 99%
Bremen                 | 97%         | 99.9%      | 99.99%  | 100%
Hamburg                | 97%         | 100%       | 99.99%  | 100%
Hessen                 | 97%         | 76.7%      | 98.39%  | 97.4%
Mecklenburg-Vorpommern | 97%         | 72.9%      | 97.52%  | 99.3%
Niedersachsen          | 97%         | 85.9%      | 98.6%   | 99%
Nordrhein-Westfalen    | 97%         | 94.3%      | 99.28%  | 99.4%
Rheinland-Pfalz        | 97%         | 65.4%      | 96.48%  | 97%
Saarland               | 97%         | 78.9%      | 95.43%  | 97.9%
Sachsen                | 97%         | 80.9%      | 98.12%  | 99%
Sachsen-Anhalt         | 97%         | 80.6%      | 98.49%  | 98.7%
Schleswig-Holstein     | 97%         | 90.6%      | 98.53%  | 99.9%
Thüringen              | 97%         | 73.2%      | 97%     | 98.1%
Nationwide             | 98%         | 84.3%      | 98.1%   | 98.6%
Interstates            | 100%        | 77.9%      | 97.6%   | 96%
Rail                   | 100%        | 80.3%      | 96.4%   | 95%***

Source: Bundesnetzagentur, May 2020

With every new generation, German mobile operators suffer from low technology adoption because they use the same playbook over and over again, resulting in the same poor outcome. Wireless licenses in Germany and most of Europe are tied to a specific technology, whereas US licenses can be used with any technology, which allows a more efficient transition from one generation to the next. Regardless, German operators rightly recognize the high value of new spectrum for next-generation technology and bid more money per capita for next-generation licenses than anywhere else in Europe. As a result of the significant investment in licenses, German operators position the next-generation product as a premium product with a significant price premium. For this reason, consumers and businesses are reluctant to adopt next-generation service plans and devices, leading to suppressed next-generation revenues and profits. These low profits are then used as a justification to limit capital investment in next-generation technologies. Consequently, German wireless networks cover less area than they could and should. This self-fulfilling prophecy is now in its third iteration: we have seen it in 3G and 4G, and we are seeing it now in 5G.

US carriers start from the same point of recognizing the value of next-generation technology and spectrum, and US spectrum auctions have yielded the highest values globally. Unlike their German counterparts, US mobile operators make the new technology available at the same price point as the last-generation technology, creating greater profitability through a significantly lower cost structure, given that next-generation technology typically lowers the cost per gigabyte by 90% over the previous generation. As a result, US mobile operators see a rapid shift of usage from the old generation to the next-generation network as customers upgrade their devices to take advantage of the new networks. By holding price points steady for next-generation networks with their faster speeds, US operators are under less price pressure than European operators, allowing them to invest heavily in their networks and differentiate on coverage. The US ranks fifth in the world for 4G availability, behind South Korea, Japan, Norway, and Hong Kong, which combined make up 9.3% of the area of the United States. Everyone wins in the US approach: customers get faster access to next-generation technology, and operators make a higher profit.

Germany’s cost problem is compounded by a legal and regulatory regime that lacks provisions favoring the building of cell sites similar to Section 332 of the US Telecom Act. German building permits are notoriously lengthy endeavors. Frequent lawsuits against cell sites lead to drawn-out legal reviews, which slow down network buildout. None of these policies are friendly to capital investment in wireless networks.

The problem of how to cover thinly populated rural areas in Germany persists. Mobile operators complain that it is unprofitable to cover many rural areas. During the 2018 Mobilfunkgipfel (Mobile Summit) between the German government and mobile operators, the government committed to share part of the cost of covering rural parts of Germany.

Coverage issues in rural parts of a country are not unique to Germany. Germany’s neighbor France, roughly the size of Texas, has tackled the issue in three different ways. For the 2G rollout, mobile operators, the central government, and the departments (provinces) with coverage gaps split the cost of covering rural France three ways. In 2015, the French government set aside $1 billion to close the 3G coverage gaps. In 2018, the French government came to an agreement with the four incumbent operators to extend their license terms in exchange for closing coverage gaps and jointly installing more than 5,000 masts and antennas.

There are four key lessons that we can take away from the German and French examples:

  1. The business model matters. American operators are providing world-class service, especially considering the size of the country. The US operator model of capturing profit through cost reduction rather than price increases is the superior model: it results in faster and higher adoption of next-generation technology and greater capital investment. The one US carrier that tried to charge a premium for 5G, Verizon, has two European executives at the helm. Customer pressure quickly forced Verizon to abandon its European model of a price premium and revert to the US model.
  2. A mobile-friendly regulatory regime that enables the rapid building of new cell sites makes a positive difference. It is a no-brainer that when it is difficult for operators to build new sites, coverage suffers.
  3. Even medium-sized, economically prosperous countries like France and Germany have problems economically building out mobile networks. While it is more cost-effective to build out rural areas with wireless rather than fixed technology, the business case is far from a foregone conclusion.
  4. The comparison between the US and more tightly regulated countries shows that incentives and support for wireless networks, without red tape and strings attached, create better results.

A new report called “Broadband 2020” by Recon Analytics shows that over 40% of employees in the United States are able to telecommute. The Department of Labor’s Bureau of Labor Statistics defines the professional workforce as all workers in “management, professional, and related occupations,” colloquially known as white-collar workers, which make up 41.2% of all jobs in America. This means that basically every white-collar worker is able to telecommute, which highlights the dramatic change the American workplace has undergone during the pandemic.

The pandemic also has the potential to halt or even reverse the decades-long migration of Americans from rural to urban settings. A slight majority (50.9%) of Americans that can telecommute are contemplating moving to a smaller city or town as the pandemic has prompted many Americans to reevaluate their priorities and living conditions.

What is surprising is that even 31% of Americans that cannot telecommute are considering moving to a smaller city or town. It shows that the luster of metropolitan areas has been waning.

But not all new places are equal, so we asked what factors would stop people from moving to a new place. The results were equal parts predictable and surprising:

More than a third of Americans cite no reason that would prevent them from moving to a different place. Where it gets interesting is the reasons why people would not move. The number one reason for not moving to a different town or village, cited by 31.6% of respondents, is a pay cut. Companies like Facebook have announced that employees who work from home in lower-cost areas (and everything is lower cost than Silicon Valley) would receive a pay cut. A move that ties compensation to location rather than contribution would prevent a significant number of employees from moving away from Silicon Valley, which already experiences a severe housing shortage and overloaded roads. Facebook’s reasoning also allows a glimpse at its compensation philosophy, which seems to focus more on competitive factors than on what is good for the community or the employee. Almost as many respondents, 31%, would not move to a town or village without broadband, just ahead of access to quality health care at 30.1%, and that in the midst of a pandemic. One has to recognize the magnitude of this finding: availability of broadband, access to quality health care, and a pay cut are equally important in the minds of Americans during a pandemic and recession.

At 36.3%, the 45-54 age segment considers the lack of broadband to be the most significant barrier to moving, followed by the 25-34 age segment with 35.8%. More than a quarter of seniors (26.1%) will not move to a new location if broadband isn’t readily available.

Broadband is even more important than politics. Only 22.5% of Americans would not move to an area with what they consider an incompatible political climate, significantly fewer than the share citing broadband. The 45-54 age segment is most focused on politics, with 30.9% citing an unwillingness to move due to an incompatible political climate. The next most polarized age segment is those over the age of 65, where 22.1% mention that an incompatible political climate prevents them from moving.

The lack of a nearby airport or a buzzing nightlife mattered least in people’s minds. Only 13.7% of respondents thought that not having an airport within a 50-mile radius would prevent them from moving somewhere. A buzzing nightlife or restaurant scene weighs even less: only 9.6% of 18 to 24-year-olds find its absence an obstacle to moving, whereas 13.1% of the 25 to 34 age segment needs a buzzing nightlife and restaurant scene.

We asked people what they consider broadband. The median American considers 50 Mbit/s download and 5 Mbit/s upload to be broadband. People’s expectations are running ahead of the FCC’s definition of broadband, which currently sits at 25 Mbit/s download and 3 Mbit/s upload.

The reason becomes apparent when we look at the use cases. In our survey we looked at several use cases, and the prevalence of video conferencing has driven bandwidth requirements upwards, especially on the upload side. An HD video stream requires a minimum of 5 Mbit/s upload and download per stream. With more than 25% of Americans now frequently using video conferencing for work and another 21% sometimes using it for work, the bar has effectively been raised.
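The arithmetic behind the raised bar is simple. The 5 Mbit/s per-stream figure comes from the text; the household stream counts below are illustrative assumptions:

```python
# Why video conferencing raises the bar: each HD stream needs at least
# 5 Mbit/s in each direction (figure from the text), while the FCC's
# broadband definition allows only 3 Mbit/s upload.
PER_STREAM_MBITS = 5
FCC_UPLOAD_MBITS = 3

# Illustrative household scenarios: one worker; worker plus one child
# in online class; worker plus two children.
for streams in (1, 2, 3):
    needed_up = streams * PER_STREAM_MBITS
    print(streams, needed_up, needed_up > FCC_UPLOAD_MBITS)
# Even a single HD stream (5 Mbit/s up) already exceeds the FCC's
# 3 Mbit/s upload threshold, which helps explain why the median
# American's 50/5 expectation leads the official 25/3 definition.
```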

While the lack of widely available broadband is a significant hurdle for cities and towns trying to attract new residents, it is almost outright disqualifying for housing options: 77.5% of respondents would not move into a house or apartment that does not have broadband. This makes the availability of broadband one of the key selection criteria when choosing a new residence. When almost half of the population has to be on video conferences sometimes or frequently, having broadband becomes a job requirement. The pandemic, for good and bad, has made our homes places of work, with the IT and connectivity needs that were traditionally reserved for the workplace. These are just some of the highlights of the new Recon Analytics report “Broadband 2020.”

The results of the report reinforce the data from the FCC’s 2020 Broadband Deployment Report, which represents the most recent government data on the topic and the progress the industry made from 2014 to 2018.

As of 2018, 94.4% of Americans have access to broadband as the FCC defines it: 25 Mbit/s download, 3 Mbit/s upload (25/3). In urban areas the figure is 98.5%, but in rural areas and on tribal lands, availability is significantly lower: 77.7% of Americans in rural areas and 72.3% on tribal lands have access to 25/3 broadband. In higher tiers, access in urban areas drops only slightly, but much more significantly in rural areas and on tribal lands. At the 250/25 Mbit/s tier, 94% of Americans in urban areas have access, a drop of 4.5 percentage points from the 25/3 level. In rural areas, 51.6% of Americans have access to 250/25, which is 26.1 percentage points less than at 25/3. On tribal lands, 45.5% have access to 250/25, 26.8 percentage points less than at 25/3.
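The drops cited above are simple percentage-point differences; a short check of the figures quoted from the FCC report:

```python
# Access to 25/3 vs 250/25 broadband by geography (percent of Americans,
# figures quoted from the FCC's 2020 Broadband Deployment Report above).
access = {
    #          25/3   250/25
    "urban":  (98.5,  94.0),
    "rural":  (77.7,  51.6),
    "tribal": (72.3,  45.5),
}

for area, (tier_25_3, tier_250_25) in access.items():
    drop = round(tier_25_3 - tier_250_25, 1)  # in percentage points
    print(area, drop)
# → urban 4.5, rural 26.1, tribal 26.8: the higher tier is nearly
# universal in cities but reaches only about half of rural and
# tribal America.
```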

The numbers make it clear that there is still more than enough to do in urban, rural, and tribal areas to provide connectivity for essential tasks. As it looks increasingly unlikely that children in every school district will be able to go back to school, we need to ensure that every child in the United States can access the internet to participate in school and classroom work. If only one child cannot participate, the progress and grades for the entire class are not counted. While fixed broadband deployment is a time-consuming endeavor, mobile broadband can and should close the homework gap. T-Mobile has announced that as part of its merger commitments it will deliver mobile broadband to 10 million households; we have only a few weeks to turn this promise into a meaningful difference before the new school year starts. The other mobile operators, in conjunction with the FCC and federal funding, should seize the opportunity and close the homework gap as quickly as possible.

To recover as quickly as possible from the current economic slump, we should put money where it has the biggest impact. Different technologies can achieve the same goals but have strengths and weaknesses in different areas. This means that any funding has to be technology agnostic and look at performance characteristics. The United States has wisely always used performance characteristics such as download and upload speed as well as latency as its selection criteria, rather than being tied to a particular technology, regardless of whether it is fiber, hybrid fiber-coax, VDSL, satellite, or any generation of wireless standards.

If you would like to buy the underlying report, please give us a call at 617.823.3363