February 3, 2026 | Joe Salesky, Analyst & Head of AI Research
Despite Microsoft’s enterprise distribution advantages and Office 365 integration, Microsoft Copilot lost 7.3 percentage points of paid subscriber share in seven months while Google Gemini gained 2.9 points, based on more than 150,000 respondents. Distribution advantages do not lock in market position. Employees receiving enterprise AI tools evaluate options and select based on experience. The platform that delivers the most reliable results wins, regardless of vendor seat licenses.
The 39% Market Contraction
Copilot’s decline from 18.8% in July 2025 to 11.5% in January 2026 represents a 39% contraction in market position among U.S. paid AI subscribers. This occurred during a period when Microsoft actively invested in enterprise distribution and deepened Office 365 integration. The platform accesses the same OpenAI models as ChatGPT, so underlying capability is comparable. The divergence in user perception points to product experience and integration execution rather than model quality.
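The 39% figure is the relative decline in Copilot’s share, not the absolute point drop; a quick arithmetic check using the percentages cited above:

```python
# Sanity check of the "39% contraction" figure from the survey data.
july_share = 18.8   # Copilot share of U.S. paid AI subscribers, July 2025 (%)
jan_share = 11.5    # Copilot share, January 2026 (%)

contraction = (july_share - jan_share) / july_share
print(f"Relative contraction: {contraction:.1%}")  # -> 38.8%, i.e. ~39%
```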
ChatGPT maintained dominant share near 55.2% with modest erosion from 60.7% in July. Gemini climbed from 12.8% to 15.7%, crossing Copilot in late November to claim the number-two position. The two platforms are now separated by more than 4 percentage points, with Gemini’s trajectory continuing upward while Copilot stabilized in the 10-12% range.
Exhibit 1: Primary Platform Share Among U.S. Paid AI Subscribers
Source: Recon Analytics U.S. AI Survey, July 2025 – January 2026. U.S. paid subscribers only.
The Workplace Conversion Gap
When Copilot is the only AI platform an employer provides, 68% of workers adopt it as their primary tool. This demonstrates meaningful uptake when alternatives are absent. The competitive dynamics shift when employers offer multiple platforms.
Among workers with both Copilot and ChatGPT available, Copilot’s adoption falls to 18% while ChatGPT captures 76%. When all three major platforms are available, only 8% choose Copilot while 70% choose ChatGPT and 18% choose Gemini. The pattern is consistent: as worker choice expands, Copilot adoption collapses.
ChatGPT converts 83.1% of U.S. paid subscribers who have workplace access. Copilot converts 35.8%. Gemini converts 34.0%. The 47-point gap between ChatGPT and Copilot quantifies the adoption challenge. Microsoft’s Office 365 distribution creates exposure. That exposure does not automatically translate to preference when workers evaluate alternatives.
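A minimal check of the conversion gap, using only the survey percentages quoted above:

```python
# Workplace conversion rates among U.S. paid subscribers (% of those with access).
chatgpt, copilot, gemini = 83.1, 35.8, 34.0

gap = chatgpt - copilot
print(f"ChatGPT-to-Copilot conversion gap: {gap:.1f} points")  # -> 47.3, the "47-point gap"
```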
Exhibit 2: Workplace Conversion Rates by Platform
Source: Recon Analytics U.S. AI Survey, July 2025 – January 2026. Workers with paid AI subscriptions only.
Quality Perception Drives Share Movement
Gemini’s rise correlates with quality leadership. The platform posts the highest accuracy satisfaction scores among major competitors, 23 points above Copilot and 9 points above ChatGPT. Copilot’s decline correlates with the lowest accuracy perception in the market. Quality drives share movement, not deployment volume.
Copilot’s accuracy NPS remained persistently negative throughout the measurement period. July showed -3.5, September declined to -24.1, and January finished at -19.8. The gap is not closing. Users who tried Copilot and stopped using it cited distrust of answers at 44.2%, exceeding comparable figures for Gemini (42.8%) and ChatGPT (40.6%).
The correlation between accuracy perception and market share movement is direct. Platforms investing in quality gain share. Those relying on ecosystem integration without matching quality lose share.
The Enterprise Market Remains Contestable
Multi-platform enterprise deployment is common. Among U.S. paid subscribers with Copilot available at work, over half also have ChatGPT available. Workers evaluate options and select based on experience rather than defaulting to whatever platform their employer provisions.
The enterprise AI market is not winner-take-all. Every renewal cycle presents an opportunity for challengers to displace incumbents based on demonstrated superiority. Microsoft’s enterprise dominance in productivity software does not foreordain Copilot’s dominance in AI. Google’s Workspace position does not guarantee Gemini’s success. The platforms that execute on accuracy, integration, and use case demonstration will capture enterprise spend in 2026.
ChatGPT’s 83.1% workplace conversion rate and 55.2% market share reflect entrenched product-market fit. Competitors are not displacing ChatGPT. They are competing for second position. Gemini demonstrated that share can shift when product experience improves. Copilot demonstrated that share can decline when it does not.
Report Details
User Illusion: Licenses Don’t Equal Adoption
U.S. Paid AI Subscriber Market Analysis | July 2025 – January 2026
The complete 24-page report includes detailed analysis of:
Platform-by-platform strategic implications for Microsoft, Google, and OpenAI
Use case performance across web search, research, writing, coding, and data analysis
NPS trajectories showing accuracy perception trends over seven months
Churn intent and retention dynamics by platform
Investment theses for enterprise AI market positioning
On January 16, 2026, OpenAI announced plans to test advertisements in ChatGPT’s free tier and the new $8/month “Go” tier in the United States. The move was widely anticipated: advertising funded the scaling of Google Search and Facebook and has been expected as the monetization path for consumer AI services. With OpenAI reportedly losing over $11.5 billion in Q3 2025 and projecting infrastructure spending of $1.4 trillion over the next eight years, the decision reflects economic necessity rather than strategic pivot.
Our analysis of 117,467 U.S. consumers from the Recon Analytics US_AI Survey reveals the consumer dynamics underlying this decision—and the structural challenges that limit OpenAI’s advertising ambitions.
ChatGPT commands 48.5% usage share in the U.S., far ahead of Google Gemini at 18.5%. Only 22.3% of ChatGPT users pay for the service, creating a substantial free user base where advertising represents incremental revenue that would otherwise not exist. Among free users, 40% indicate they would never pay for AI services at any price point. For this segment, advertising is the only viable monetization path.
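As an illustrative back-of-envelope (a derived figure, not one reported in the survey): combining the 22.3% paid share with the 40% "never pay" rate among free users gives the slice of the ChatGPT base reachable only through advertising.

```python
# Back-of-envelope sizing of the ads-only segment, from the figures above.
paid_share = 0.223   # share of ChatGPT users who pay for the service
never_pay = 0.40     # share of free users who say they would never pay

free_share = 1 - paid_share
ads_only = free_share * never_pay
print(f"Free users: {free_share:.1%}")               # -> 77.7%
print(f"Ads-only monetizable segment: {ads_only:.1%}")  # -> ~31.1% of all users
```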
OpenAI faces what we term “Pioneer’s Peril”: being first to test AI advertising while helping incumbents refine their approach. Google and Meta will defend aggressively. Advertising represents 77% and 97% of their respective revenues, totaling approximately $456 billion in 2025 and controlling roughly 50% of the global digital ad market. Both already deploy AI-powered ad tools. Google’s Performance Max and AI Max deliver 14% average conversion lifts; Meta’s Advantage+ shows 22% ROAS improvements. The six major agency holding companies control approximately 30% of U.S. ad spend with established workflows and proven ROI benchmarks. Switching costs are material—not technical, but institutional.
Digital advertising already represents 82% of total ad spend globally. OpenAI cannot rely on a secular shift from traditional media; that transition is complete. Any meaningful revenue must come from the existing $777 billion digital pool. Capturing even 1% ($7.8 billion) would require displacing entrenched competitors with superior targeting, measurement, and advertiser relationships that do not yet exist.
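The 1% capture figure follows directly from the size of the digital pool cited above:

```python
# 1% of the existing global digital ad pool.
digital_pool_bn = 777   # global digital ad spend, $B
capture_rate = 0.01

print(f"1% capture = ${digital_pool_bn * capture_rate:.1f}B")  # -> $7.8B
```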
User Sentiment: A Window of Opportunity
Ads rank last among current user concerns at just 2%, well below privacy (27%) and job displacement (29%). Users have not yet formed strong negative associations with AI advertising. Whether this remains true depends entirely on implementation quality. Nearly one-third of users (31.6%) indicate ads could trigger them to switch platforms, suggesting that intrusive or poorly executed advertising could accelerate competitive dynamics in a market where switching costs are minimal.
Concern                    % Citing
Job displacement           29%
Privacy                    27%
Accuracy of responses      18%
Bias in AI                 12%
Ads / sponsored content     2%
Source: Recon Analytics US_AI Survey; n=117,467
OpenAI’s decision to introduce advertising in the free tier follows sound business logic for user monetization. With 40% of free users indicating they will never pay, advertising represents the only viable revenue path for this segment. However, building a material advertising business faces structural headwinds. Google and Meta’s entrenched positions, AI-powered ad tools, and deep agency relationships create formidable barriers. The most likely near-term outcome is that ChatGPT advertising generates incremental revenue from the free tier but struggles to capture meaningful share of advertiser budgets from platforms with proven performance. We expect modest revenue contribution in 2025-2026, with OpenAI’s advertising ambitions likely measured in hundreds of millions rather than billions.
Methodology: Data from Recon Analytics US_AI Consumer Survey. Fielded May 1 – December 5, 2025. Total sample: n=117,467. Margin of error: ±0.3% at 95% confidence.
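The stated margin of error can be reproduced from the sample size with the standard worst-case formula (p = 0.5, z = 1.96 for 95% confidence):

```python
import math

n = 117_467   # total sample from the US_AI Consumer Survey
z = 1.96      # z-score for 95% confidence
p = 0.5       # worst-case proportion maximizes the margin of error

moe = z * math.sqrt(p * (1 - p) / n)
print(f"Margin of error: +/-{moe:.1%}")  # -> +/-0.3%
```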
Numbers and facts are important because they define ultimate limits and capabilities, but numbers and facts don’t make decisions: people make decisions. Nowhere is this truer than in the U.S. satellite broadband market of late 2025. If we look strictly at the operational scoreboard, the game is over. Starlink has achieved a scale that no competitor can mathematically replicate within the relevant investment horizon. Data from a bit over one million respondents in our Recon Analytics Telecom Pulse Service shows Starlink holding a massive customer satisfaction lead in rural America over terrestrial providers as well as legacy satellite providers like HughesNet, but dwelling on this gap is an exercise in archaeological irrelevance. HughesNet is effectively liquidating its business model, and ViaSat is pivoting away from it; both are implicitly acknowledging that the laws of physics have rendered them obsolete. Rural telcos stuck with DSL are holding on for dear life in an era that is rapidly ending. The war against legacy GEO is not just over; the battlefield has been cleared. It is only a matter of a few years until the last remnants of rural DSL are swept away by their skyborne replacement.
The real narrative is not about Starlink beating zombies; it is about the politically engineered survival of its future competitors. The industry is bifurcating into two distinct realities: SpaceX’s operational “rout” and the strategic mandates sustaining Amazon Leo and AST SpaceMobile. These companies matter not because they are currently beating Starlink on metrics—they aren’t—but because the U.S. government and the nation’s largest wireless carriers have decided that a Musk monopoly is strategically unacceptable. Consequently, we are witnessing the creation of a managed market where strategic intervention and corporate hedging sustain competitors that market forces alone would eliminate.
The Carrier Insurgency: The “Never Musk” Wager
While T-Mobile grabbed headlines by pairing with an iconic inventor and a proven technology years ahead of the competition, the most consequential satellite-communications decision of recent years happened quietly in AT&T’s and Verizon’s boardrooms in 2024. Their commitments of capital and spectrum to AST SpaceMobile weren’t bets on the best technology available: they were bets on strategic independence. Even in 2024, it was clear that AST was operationally behind, struggling with a single-digit satellite count while Starlink was deploying thousands. The carriers knew that AST’s service would likely launch later and offer less initial capacity than the vertically integrated juggernaut of SpaceX. They looked at the spreadsheets, saw the performance gap, and decided to stomach it.
This was a calculated strategic sacrifice. AT&T’s decision to lock into a binding agreement with AST through 2030 represents a deliberate strategy to preserve network sovereignty rather than a forced reaction to market constraints. Management feared, and correctly so, that utilizing Starlink would ultimately accelerate Elon Musk’s ambition to become a full-fledged service provider, leading to their own disintermediation as network operators. If they partnered with Starlink, they risked becoming mere resellers in a Musk-controlled ecosystem, effectively funding their own future competitor. Consequently, AT&T was willing to endure the short-term pain of AST’s operational delays to nurture a competitor that preserves their control, calculating that the cost of funding a future Starlink monopoly far exceeds the risks of supporting a slower, inferior alternative.
Verizon followed a similar, albeit more hedged, logic. Their $100 million investment in AST was a coldly calculated but necessary option premium. Verizon leadership recognized that T-Mobile’s exclusivity with SpaceX was temporary, but they also recognized that a world with only one satellite provider gives that provider infinite pricing power. By propping up AST, Verizon keeps a non-SpaceX option alive to discipline the market. They are funding AST not because the tech is currently better (the gap between AST’s 5 satellites and Starlink’s 660 D2C satellites is more than 100-to-1) but because the contract isn’t with Musk. AST has effectively become a compliance cost for the wireless industry, a tax paid by carriers to ensure they never have to bend the knee to SpaceX.
This “Not-Musk” imperative explains why the investment thesis for AST remains robust despite the fact that its primary differentiator—broadband to the phone—has been neutralized. SpaceX’s confirmed Q1 2026 rollout of full data and voice capabilities has effectively evaporated AST’s unique value proposition. Yet, the carriers cannot waver. The 2025 rupture between Donald Trump and Elon Musk only validated the carriers’ 2024 foresight: relying on a single, politically volatile billionaire for critical infrastructure is a fiduciary hazard. AT&T and Verizon are stuck with AST, and they are happy to be stuck, because the alternative is captivity.
Amazon Leo: The “Too Big to Fail” Regulatory Gamble
If the carriers are engineering AST’s survival through capital, the federal government is engineering Amazon Leo’s survival through regulation. Amazon Leo is not a standard growth story; it is a binary derivative trade on regulatory relief. The scale of Amazon’s deployment deficit is staggering. As of late 2025, Amazon has managed to place only 153 satellites into orbit, leaving a gap of 1,465 satellites against the FCC’s deadline requiring 1,618 by July 2026. This gap is mathematically uncloseable through launch cadence alone. Consequently, Amazon requires a waiver that would typically invite withering scrutiny.
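The "mathematically uncloseable" claim can be sketched with rough numbers. The satellites-per-launch and months-remaining figures below are illustrative assumptions for the sake of the arithmetic, not reported values:

```python
# Rough feasibility check on Amazon's FCC deployment milestone.
required = 1_618   # satellites required by the FCC's July 2026 deadline
in_orbit = 153     # satellites in orbit as of late 2025
per_launch = 30    # ASSUMPTION: round-number average satellites per launch
months_left = 8    # ASSUMPTION: roughly November 2025 to July 2026

gap = required - in_orbit
launches_needed = -(-gap // per_launch)  # ceiling division
print(f"Gap: {gap} satellites")                                   # -> 1465
print(f"Launches needed: {launches_needed}")                      # -> 49
print(f"Cadence: {launches_needed / months_left:.1f} per month")  # -> ~6.1
```

Even under these generous assumptions, the required cadence of roughly six launches per month for a single constellation is far beyond any demonstrated capability other than SpaceX's own.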
However, Amazon has successfully constructed a regulatory shield by securing BEAD awards for 211,194 locations across 33 states. These awards create a government interest in Amazon’s success. State broadband offices, desperate to show competition, accepted Amazon’s paper promises over SpaceX’s operational reality, effectively making Amazon too big to fail without collapsing a critical federal program. If Amazon cannot illuminate these locations, states face clawbacks and the administration faces a failure of its signature infrastructure project.
The most dominant policy force in the market today is the BEAD program. Amazon Leo’s dominance of the BEAD program was achieved by aggressively buying the market with average bids of just $560 per location, effectively undercutting Starlink by a factor of three. This secures a guaranteed revenue floor estimated at $177 million annually, which exists independent of consumer preference. Regulators are expected to grant the accommodation to avoid entrenching a SpaceX monopoly, using the waiver to provide political cover while maintaining the appearance of regulatory neutrality. The Trump administration increasingly favors Jeff Bezos over the volatile Elon Musk in this context, rendering regulatory accommodation probable. Amazon Leo survives not because it executed, but because the government cannot afford to let it die.
The Political Overlay: 2025 as an Accelerant
While the carriers made their anti-monopoly decisions in 2024, the political volatility of 2025 acted as a powerful accelerant, hardening the “Not-Musk” resolve across the ecosystem. The alliance between Donald Trump and Elon Musk collapsed in June 2025 due to disputes over fiscal policy and devolved into name calling. Although a pragmatic reconciliation began in November, the era of automatic regulatory preference for SpaceX is finished. The relationship has stabilized at “neutral,” a significant downgrade from the “favored” status Musk enjoyed early in the year.
This political oscillation drives strategic positioning. The Pentagon, seeking to hedge political risk rather than simply improve capability, directed “Golden Dome” defense planners to diversify away from exclusive reliance on SpaceX in favor of Amazon. This directive to “diversify” is now embedded in procurement logic, creating a permanent, protected market for a “second source” regardless of the headlines. Just as AT&T and Verizon funded AST to avoid commercial captivity, the Department of Defense is funding Amazon and AST to avoid strategic captivity.
The Reality of Market Bifurcation
The satellite internet industry has organized into four distinct competitive segments, and understanding this structure is essential because winners in one segment do not necessarily dominate the others. While Starlink dominates the LEO consumer broadband market with a +42 Net Promoter Score, the government and carriers have effectively decided to subsidize competitors to ensure market health. This creates a floor for Amazon and AST, and a ceiling on Starlink’s monopoly power.
The numbers are definitive: Starlink’s operational dominance provides a shield that regulation cannot easily penetrate. Its satisfaction lead creates a political asset, insulating the company because no administration can politically afford to disconnect rural American voters. However, the strategic landscape proves that performance is not the only metric that matters. Amazon Leo’s 211,194 committed BEAD locations provide a survival path even if the FCC denies a consumer waiver, converting it into a government-subsidized utility. AST SpaceMobile’s binding contracts with AT&T and Verizon ensure it remains a viable entity, serving as the industry’s indispensable “Plan B”.
Ultimately, the satellite industry acts as a mirror for the broader political economy. The “SpaceX Paradox” defines Amazon’s desperate position: to compete with Starlink, Amazon was forced to contract launches from its primary competitor, implicitly admitting that SpaceX’s capacity was necessary for its own survival. Yet, Jeff Bezos has successfully positioned himself as a “responsible” alternative, securing a vital revenue lifeline to sustain Amazon Leo. The market has bifurcated: Starlink wins on physics and performance in the consumer zone, while Amazon and AST win on politics and diversity mandates in the regulatory and carrier zones.
For investors and executives, the lesson is clear: The narrative of “failure” surrounding legacy providers is simply the sound of the past dying; ignore it. The real signal is the deliberate, expensive, and strategic effort by the world’s largest telecom companies to prevent a SpaceX monopoly. AT&T and Verizon knew exactly what they were buying in 2024: an inferior product that offered the superior benefit of independence. They decided to stomach the lag, the risk, and the cost because the alternative was a future where Elon Musk held the keys to their network. The data tells us who has the best product, but the strategy tells us who will be allowed to survive.
If you want to read more about the interplay between the satellite and broadband industries, have a look here: https://www.reconanalytics.com/products/2027-november-satellite-report-vf/
Commerce Secretary Lutnick signaled in an interview with Broadband Breakfast on March 5th, 2025, that the U.S. government will rethink the BEAD program so that Americans “get the benefit of the bargain.” He elaborated that this could mean homes getting broadband through satellite instead of fiber. “We want the lowest cost broadband access to Americans,” he said.
Secretary Lutnick gets it. Connecting a location to broadband for $10k-$30k makes more sense than spending $60k to do the same thing. By approaching the issue from a technology-neutral perspective, we can connect a lot more people for a lot less money while improving connectivity and satisfaction with the connection.
From February 28th, 2024, to February 28th, 2025, Recon Analytics surveyed 160,848 people in the United States and asked them about their broadband experience with their current provider.
To determine satisfaction with their service, we asked consumers a standard net promoter question as developed by Bain, but modified to ask about specific components of the customer experience. Below you see a heatmap of Recon Analytics’ component NPS scores for connectivity. I omitted the customer interaction part of the heatmap for readability.
The 1,113 Starlink subscribers we interviewed over the last year were the most satisfied, followed by fixed wireless (FWA) and cable customers, with DSL customers the least happy. Why are Starlink and FWA customers happier than cable and DSL customers?
While Starlink and FWA can be slower than speeds over cable, most consumers are not engineers or economists who make decisions based solely on technical merit or price. In the case of both Starlink and FWA, our data confirms that customers value the fact that the technology solution is easy to install and easy to get rid of if the consumer is not happy. In Starlink’s case, we have about three times as many former Starlink customers as current Starlink customers. This indicates that there are quite a few people who were unhappy with the solution and swapped out for a different one.
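As a rough illustration of what the three-to-one former-to-current ratio implies (treating the survey sample as representative of the subscriber base, which is an assumption):

```python
# Implied retention among everyone who has ever subscribed to Starlink,
# given ~3x as many former customers as current customers in the sample.
former_per_current = 3

retained = 1 / (1 + former_per_current)
print(f"Share of ever-subscribers still active: {retained:.0%}")  # -> 25%
```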
At the same time, the people for whom Starlink or FWA works are very happy with it, especially in comparison to what they had before. Interestingly, we consistently find in our polling data that the higher the cost of a service, the lower consumer satisfaction is, all other things held equal. The low satisfaction scores for cable, even though cable internet service is substantially cheaper than fiber, are a clear indication that cable needs to do a much better job of serving its customers. As confirmed in the quarterly reported financial data, customers leave services with low satisfaction and join services with high satisfaction. In the home internet case, customers choose a more expensive service because, in their experience, the cheaper service is not worth it.
We also see in the data that the Starlink and FWA routers are state-of-the-art equipment. Investment in good routers leads to better scores for how well existing devices stay connected to the Wi-Fi network and how easy it is to connect to Wi-Fi, and it helps every other connectivity and customer-issue metric as well. This is demonstrated again in the table below.
We are also asking all of our home internet respondents how often they have subjectively experienced internet issues in the last 90 days. The data below is again from the 160,848 people we surveyed over the last year.
What is striking is how well Fiber, FWA, and Starlink are performing when it comes to reliability. When looking at reliability from a customer perspective, it is the interplay between the connection, which is determined by network technology; the router used or supplied; and the end user equipment. The end user equipment is the same for every customer – a mix of smartphones, laptops, desktops, and other connected devices. While Verizon FiOS and AT&T Fiber customers report fewer instances of their internet connection going down, the FWA and Starlink routers are able to mitigate a lot of the more difficult connection technology challenges.
When comparing the data presented in this research note with what was presented six months ago, the Starlink scores for connectivity increased as the company launched more satellites during that period. As Starlink continues to launch more satellites, its scores will change depending on what increases faster – the number of satellites or the number of customers.
Furthermore, it is interesting to see where Starlink customers came from. Eighty-five percent of Starlink respondents are from rural areas, consistent with Starlink’s reporting of where it sells. Almost 31% came from small ISPs, and for more than 11% of respondents, Starlink is the first home internet provider they have ever had. The largest named prior providers are CenturyLink, Charter, Frontier, and Comcast, which provide a lot of internet coverage in rural areas.
Recon Analytics data shows that a technology-neutral approach is the right way to allocate federal dollars to get affordable broadband out to as many Americans as will take it. There are many Americans, especially in very rural areas, who are very happy with Starlink’s service. Fixed wireless is also solving the broadband problem for many customers in a satisfactory way. Fixed wireless is especially valuable in less densely populated areas, where ample spectrum, and thereby speed and capacity, is shared among fewer people, resulting in better performance. Fiber, without a doubt, is the workhorse technology for more densely populated areas, where satellite and FWA do not have sufficient capacity, given the current licensed full-power spectrum constraints, to serve customers well.
What we need, and what Secretary Lutnick rightfully alluded to, is a technology-neutral approach where Americans can choose how they want to be connected at the lowest price considering the circumstances.
Americans love the internet, accessing it from home and on the road. Until 2007, Americans essentially had two choices for home internet: cable or DSL. To the cable industry’s great credit, it was the first to provide high-speed internet access to most Americans, with DSL the slow “other choice” where cable was not available or was too expensive. But beginning in 2007, the telecom companies began to build out fiber, first with Verizon FiOS and then with AT&T launching fiber service in 2013. By launching fiber networks, the telecom companies brought competition to cable in the home broadband market and offered Americans more choices for connecting to the Internet.
In the last three years, the competitive landscape has changed again, to the benefit of American consumers of all stripes. The mobile network operators have launched Fixed Wireless Access (FWA) and, as we saw during Hurricane Helene, satellite provider Starlink proved its prowess in rural, hard-to-reach geographies.
FWA has become such a popular choice that the cable companies are losing home internet customers to FWA providers, a trend that has thumped the cablecos’ market cap. Since launch, Verizon, T-Mobile, and AT&T have added 10.675 million customers to their FWA services, and almost all of those subscribers came from cable companies.
FWA service is typically slightly less expensive than fiber or cable home internet, but its satisfaction scores across all 16 cNPS categories are higher.
Part of the reason for the superior cNPS scores is a better purchasing and installation experience, lower price points, and the ability to easily return the product if it does not meet the customer’s expectations. As a result, the customers who keep the service are happy with it, while unhappy customers cancel the service, return the router, and continue with their existing provider, remaining less than happy with it.
The high satisfaction and lower price of FWA, combined with dissatisfaction with the other available choices, has made FWA the preferred next home internet provider for Americans.
Based on interviews with 288,490 Americans conducted between July 2023 and December 2024, 44% of Americans would choose an FWA provider if they had to pick a provider other than their existing one, 25% would choose a fiber provider, 17% a cable provider, and 6% each would choose DSL or a satellite provider (predominantly Starlink).
The change in customer preferences is also an opportunity. FWA is the first home internet offer that is being advertised on a nationwide basis, both standalone and converged: more than 70% of FWA customers use the mobile service of the same provider. We are also seeing a remarkable number of customers switching from one FWA provider to another, indicating both a preference for FWA and a high aversion to the available wired solutions. It is also a wakeup call for existing providers, especially cable, to improve their service, both on a technical basis with DOCSIS 4.0 and on a relational basis in how they interact with their customers. We are full of hope as some cable providers introduce NPS as a metric they track, and full of dismay as FWA is described as CPI, or Cell Phone Internet. By describing FWA as cell phone internet, these cable providers do themselves a disservice: cell phones have nothing to do with FWA beyond sharing the same network, and the label shows a blindness to the real threat FWA poses to them. As long as cable views FWA as CPI, it will continue to lose, living in its own world disconnected from the preferences of everyday Americans.
This has interesting implications for the spectrum policy world. Cable, understandably, is trying to prevent new licensed, full-power spectrum from being authorized for cellular use. Why wouldn’t it? That additional spectrum would enable the mobile operators to offer even more FWA options. While the wireless industry is pushing hard for more full-power commercial spectrum, it is not a done deal. In 2024, we saw FWA speeds and the availability of FWA sign-ups in urban markets decline, indicating that the growth of FWA is becoming more a supply issue than a lack of demand. Hence the need for more full-power spectrum to amp up network capacity to support more FWA.
The outgoing 118th Congress failed to provide Americans with a spectrum pipeline and the FCC with general spectrum authority (Congress provided only temporary spectrum authority to reauction the returned AWS-3 licenses). The chances that the incoming 119th Congress, which takes over in 2025, will provide a spectrum pipeline with licensed, full-powered spectrum are much higher. The last Trump White House leaned much more heavily on the Department of Defense and was able to clear the 3.45 GHz spectrum for commercial use in a record one-year time period. The incoming Senate Commerce Committee Chairman, Senator Cruz (R-TX), has also traditionally been less accommodating to Department of Defense preferences and to FCC failures to live up to its congressionally mandated requirements.
In a nutshell, FWA has higher satisfaction scores than any other technology and more Americans want FWA as the way they connect to the internet than any other choice. It is up to Congress to decide if Americans get their wish.
Sometimes old album titles say it best. Today marks the start of the expansion of AT&T’s fixed wireless home internet service, AT&T Internet Air. After offering it in its DSL footprint for the last few months, AT&T is now becoming the third nationwide mobile network operator (MNO) to launch a 5G (where available) internet offer.
AT&T is starting in Los Angeles, Philadelphia, Cincinnati, Harrisburg/Lancaster/Lebanon, PA; Pittsburgh, Chicago, Detroit, Flint-Saginaw-Bay City, MI; Las Vegas, Minneapolis-St. Paul, Phoenix (Prescott), AZ; Portland, OR; Salt Lake City, Seattle-Tacoma, Tampa-St. Petersburg (Sarasota), and Hartford-New-Haven, CT. Notably, Los Angeles is Charter’s largest market and a T-Mobile FWA stronghold, Philadelphia is Comcast’s home market, and Seattle is T-Mobile’s home market. If the carriers are looking for attention, these launch markets are certainly going to attract it. Another very interesting market is Phoenix. Gigapower, a joint venture in which AT&T is involved, is building out fiber in Mesa, AZ. While the two are about 100 miles apart, it will be interesting to see how the two technologies will be adopted in the same market.
With a nationwide combined 3.45 GHz and C-Band holding of 120 MHz on average, and at least 100 MHz in every market, AT&T can put significant bandwidth behind its FWA offer. The theoretical maximum speed achievable with 100 MHz of spectrum is 2.3 Gbit/s. It is important to keep in mind that what is possible in theory is only occasionally achieved in reality, because wireless is a shared resource. Will someone sitting next to a tower be the only person on the cell and get 2.3 Gbit/s? Possibly, but while quite a few wireless speed testers have reported download speeds of 600 to 800 Mbit/s, such speeds are far from certain on a loaded network. Even half the theoretical speed is still more than respectable. More quietly than its competitors, AT&T has rolled out its mid-band network to more than 175 million pops.
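The 2.3 Gbit/s figure can be sanity-checked with a back-of-envelope version of the 5G NR peak data rate calculation: bandwidth × MIMO layers × modulation bits × coding rate × (1 − overhead). The parameter values below (4×4 MIMO, 256-QAM, a 22% overhead share) are illustrative assumptions for a mid-band deployment, not measured AT&T network figures.

```python
# Back-of-envelope 5G NR peak throughput.
# All parameter values are illustrative assumptions, not carrier data.
def peak_throughput_bps(bandwidth_hz, mimo_layers, bits_per_symbol,
                        coding_rate, overhead):
    """Simplified peak rate: BW x layers x modulation x coding x (1 - overhead)."""
    return bandwidth_hz * mimo_layers * bits_per_symbol * coding_rate * (1 - overhead)

peak = peak_throughput_bps(
    bandwidth_hz=100e6,   # 100 MHz of mid-band spectrum
    mimo_layers=4,        # assumed 4x4 MIMO
    bits_per_symbol=8,    # 256-QAM
    coding_rate=0.926,    # highest NR code rate (948/1024)
    overhead=0.22,        # assumed control/reference-signal overhead
)
print(f"{peak / 1e9:.1f} Gbit/s")  # roughly 2.3 Gbit/s
```

Halving the assumed MIMO layers or raising the overhead on a loaded cell quickly lands in the 600 to 800 Mbit/s range that speed testers actually report.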
AT&T Mid-Band Spectrum Depth of 3.45 GHz and C-Band
With the advent of 5G and, with it, a technology called network slicing, many parties, ranging from mobile network operators and enterprises to policymakers, are re-examining how to deploy customized networks that were previously unfeasible. The flexibility afforded by network slicing will allow wireless operators to more efficiently meet the needs of their enterprise customers, particularly those customers concerned about the potential costs or burdens of “do-it-yourself” or outsourced private networks. Alternatively, enterprises may choose to deploy a private wireless network to meet their needs. This would require a specific plan to utilize spectrum they obtain or lease from another entity, along with a private network managed by an operator, network supplier, or viable third party.
Network slicing allows a network operator using the same physical wireless network to provide virtual slices with different characteristics to serve different customer needs. It allows the operator to tailor technical characteristics such as bandwidth, latency, and security for certain types of applications. This is especially significant for enterprise customers who desire to enjoy the economic benefits associated with large-scale network operators; instead of defaulting to largely homogenized offerings, companies and operators now enjoy unprecedented flexibility to develop customized solutions at a much lower cost.
Flexibility comes from the move towards virtualization and software-defined networking, where network infrastructure that was once an integrated software and hardware component becomes disaggregated into software programs running on general-purpose computers and servers.
Cost savings come from the shared use of physical infrastructure when new specialized tasks only require the installation of a new software application solving the enterprise’s specific need.
The enhanced capabilities and cost savings allow enterprises to switch from a production process that has been hardwired to a flexible wireless approach, just as mobile phones have replaced landline phones as the preferred way to communicate. In particular, the development of network slicing allows providers and customers throughout the wireless ecosystem to continue to enjoy the efficiencies of flexible use licenses and larger geographic license sizes, while also benefitting from small-scale customization.
5G Network Slicing
Over the last five years, the concepts of software defined networking (SDN) and network function virtualization (NFV) have taken large strides in the telecom world. The change from application-specific hardware to networks that are built on a general-purpose computing foundation, implemented and orchestrated by software, is a watershed event. The typical wireless network of 2015 consisted of roughly 30,000 to 50,000 different pieces of hardware with integrated software. The hardware elements (also called SKUs, for Stock Keeping Units) included base stations, internet routers, messaging gateways, multimedia gateways, switches and many other components. For every function in the network, one new SKU was created. These SKUs are highly efficient at running the specific task for which they are designed as long as they are close to being fully utilized. The networks are also designed around the busiest time of the day for the specific function, plus a significant margin for growth so that customers can always use any function they might want to at any time of the day. Usage is highly variable both over time and across functions, leading to a significant amount of idle capacity and significant cost.
As MNOs have transformed their networks from hardware-centric to software-centric, wireless networks have become a lot more flexible. By being able to use each network function as an application, these developments have allowed MNOs to create network slices that provide service assurance by creating virtual wireless networks as part of the overall wireless network.
The key advantages of network slicing are:
Enhanced mobile broadband. It allows the operator to guarantee reliable, ultra-high speed data connections for applications such as live 4K video streams.
Very low latency. It makes applications such as drone flying beyond visual range possible and, among other things, opens up the ability to inspect power lines and buildings.
Massive IoT. Thousands of IoT devices such as sensors can be installed per square mile, allowing for a range of applications such as extremely even temperature control on a shop floor where material variances are measured in microns, requiring temperatures to be within one degree for product uniformity.
Importantly, these key advantages can be mixed and matched to suit different industry verticals, all over the same physical network.
Below in Exhibit 1, we have abstracted the network into three layers: The radio access network (RAN) layer, which is responsible for the wireless connectivity between the device and the network; the core network function layer (core) that controls how voice calls and data sessions are connected; and the network application layer on which the various services run. The network application layer consists of either a private or public cloud, centralized or at the edge of the network. This way, a specific amount of resources can be allocated to every network task. In the exhibit, the width of the bars is the amount of resources guaranteed to each slice. The MNO can thereby guarantee all customers who have bought a network slice a minimum performance through a service level agreement (SLA).
For example, the MNO has set aside a specific amount of network resources for its retail or consumer customers (the MNO Network Slice). Since its customers use a balanced mix of applications, it needs a corresponding amount of core and RAN resources to serve all of its customers, both consumers and enterprises, well. The MNO is also hosting an MVNO, which is smaller than the MNO and has therefore purchased fewer network resources, also in a balanced mix. The same is true of a large industrial provider and an IoT provider, who have created their own private networks by acquiring smaller slices of the network. On the right side of the exhibit are two examples that show the strength of the network slicing model. In the green case, a customer purchases a video-centric slice, either for internal use, like a large closed-circuit video network, or to provide video services to its customers. Video is the RAN killer app, both in terms of popularity and in terms of RAN resources needed. For that reason, the company purchases a significant amount of RAN resources. At the same time, such a use needs fewer core resources and even fewer resources at the network application layer, be it in the form of a private, public or mobile edge cloud, as it streams the video. A contrasting example would be a game-centric slice. Multi-player online games are not data intensive, using only a small fraction of the bandwidth that a video stream uses, but are highly latency dependent. To ensure the required latency, gaming demands a service with a higher amount of core resources and an even higher amount of network application resources, where the processor-intensive calculation takes place.
Exhibit 1:
Source: Recon Analytics, 2021
Network slicing allows MNO administrators to custom tailor which resources are needed to deliver the services that the customer wants and needs by guaranteeing their availability at the network level.
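The layered allocation described above can be sketched as a simple admission check: each slice reserves a share of RAN, core, and application-layer resources, and a new slice is admitted only if every layer still has headroom to honor existing SLAs. The slice names and percentages below are hypothetical illustrations mirroring the structure of Exhibit 1, not real operator figures.

```python
# Illustrative slice admission check; slice names and resource shares
# are hypothetical, mirroring the three-layer structure of Exhibit 1.
LAYERS = ("ran", "core", "app")

slices = {
    "mno_retail":  {"ran": 40, "core": 40, "app": 40},
    "mvno":        {"ran": 15, "core": 15, "app": 15},
    "video_slice": {"ran": 30, "core": 15, "app": 5},   # RAN-heavy
    "gaming":      {"ran": 5,  "core": 20, "app": 30},  # latency/compute-heavy
}

def admit(existing, new_slice, capacity=100):
    """Admit a slice only if every layer keeps its guaranteed resources (SLA)."""
    for layer in LAYERS:
        used = sum(s[layer] for s in existing.values())
        if used + new_slice[layer] > capacity:
            return False
    return True

# An IoT slice needs little of each layer and fits into the remaining headroom.
iot = {"ran": 5, "core": 5, "app": 5}
print(admit(slices, iot))  # True
```

The asymmetric shares (video heavy in RAN, gaming heavy in core and application resources) capture why a single homogeneous network cannot serve both well, while one sliced network can.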
Building a wireless network involves a significant amount of investment regardless of whether the network serves one customer or one hundred million customers. Building a RAN with towers, antennas and radio heads, a core network and application network layer with servers is a significant undertaking, even for established MNOs. Network slicing allows non-telecom companies to have the benefits of a private, secure, custom-tailored network meeting their needs without having to build, manage, maintain and upgrade it themselves. It is the difference between building your own car and leasing a car to drive to work every day.
Telecom engineers were determined to discover better ways to run a network than with 50,000 individual components. Using data centers and even personal computers as an example, they posited that it would be more efficient, flexible and agile to virtualize network functions (NFV), turning the vast majority of SKUs into computer programs and using general purpose computers or servers to create what is known as software-defined networking (SDN). This solves various issues:
Agility: With an SDN it is much easier to launch new products and services. Instead of having to justify the cost and complexity of having to add another SKU to the network, mobile network operators (MNOs) can simply launch the service by starting the software program. Adding a new service becomes (almost) as easy as installing a new browser or word processor on a PC. If the service is successful, it takes up more data center resources, using excess capacity of a service that has fallen out of favor. If wildly successful, the MNO can simply add more servers. If the service fails to catch on, the application either uses minimal network resources or is deleted. Thus there is no need to pay for, install, and potentially remove hardware from the network.
Reduction in network complexity and cost: As mentioned before, an average mobile network has 30,000 to 50,000 different SKUs performing different network functions. The fully virtualized SDNs that are operational today have reduced that number to six. Fewer network elements mean lower maintenance and purchasing costs as well as fewer hardware incompatibilities that need to be overcome. With fewer network elements, fewer people are needed to operate the network, thereby lowering operating costs. Since the network runs on a large number of commoditized servers that are interchangeable, costs are lower due to larger purchase volumes of generally available commoditized hardware.
Speed and scale of innovation: By using standardized, general-purpose server hardware, wireless networks join the larger and faster innovation cycle of general computing rather than continuing to rely on the pace of innovation in niche applications. Furthermore, by shifting innovation to the software layer, many more companies can develop new software than would be possible in an integrated hardware and software model.
Security and vendor independence: By using standardized hardware to run the network, it becomes less likely that malware is introduced to the network through compromised hardware, which is expensive and time-consuming to remove. If any software is compromised, it can quickly and inexpensively be replaced by a competing product. Furthermore, by relying on software over standardized hardware, MNOs can much more easily switch from one vendor to another for technical or commercial reasons.
Ease of management: In an SDN, all virtualized network functions can be controlled and changed from one platform and one location. In a traditional wireless network, individual network elements have to be individually changed and monitored, often with different applications that do not work together, and the changes have to be made locally. Most importantly, when network software runs on standardized hardware, it is possible to allocate specific resources to predetermined tasks, something that is called network slicing.
Until now, enterprise customers whose operations were dependent on features like these were incented to build and operate (or outsource) their own customized networks, entailing significant capital and operational expenses since this approach would not incorporate the efficiencies passed through to customers by large-scale traditional mobile network operators. Network slicing now allows enterprise customers to share in those efficiencies of scale, while still gaining the advantages of network customization – the best of both worlds.
Network Slicing Use Cases
5G network slicing enables guaranteed network performance that was not possible in 4G, allowing mobile network operators to offer a whole new set of capabilities.
Industrial use case
5G is ideal for factories as it removes the need to lay cables to run different machinery, cabling that can cost hundreds of thousands of dollars depending on the size of the factory floor. Network slicing, with its guaranteed performance, allows factories to connect precision machinery that needs to react to changes in the production process, like metal works or chemical processes, with less than 10 millisecond reaction time. It also allows the production process flow on the shop floor to be changed for new and different tasks without incurring the prohibitive expense and time of rewiring the shop floor. In addition, the material flow on the shop floor can be automated and, with below 10 millisecond reaction time, centrally controlled. Work-in-progress material vehicles can shuttle material from workstation to workstation. By shifting the work-in-progress material flow from manual labor to an automated process, the number of work-related accidents on the shop floor can be significantly reduced, improving employee health and decreasing work-related healthcare costs.
As in other use cases, the industrial company can choose which elements and functions it wants to source from the mobile network operator and which parts it wants to own itself for maximum flexibility. For example, in an automobile manufacturing plant, the manufacturer installs its own antenna network and uses its own industrial applications to run the robots and the manufacturing line, but relies on the mobile network operator to integrate it all and run it on the operator’s core.
Drones beyond visual line of sight
One of the advantages of 5G and network slicing is guaranteed low latency and high data throughput. What has been the exclusive purview of the military is coming to the civilian sector: flying drones beyond the visual line of sight (BVLOS). The FAA approved BVLOS flights for public safety in August 2020, with a commercial application approved in January 2021. Due to the reaction-time limits of traditional technology, the drone cannot be further than 1,500 feet from the pilot and cannot fly more than 50 feet above or within 400 feet horizontally of any obstacle. With 5G and network slicing, operators can guarantee a stable connection and low latencies that will allow drones to fly wherever there is a network connection and much closer to buildings, cables, and anything else the FAA describes as obstacles. Power lines, local and long-distance, as well as cell sites need to be visually inspected at regular intervals to ensure their structural integrity. Currently, many of these inspections are done either by car or on foot with binoculars, or personnel have to climb up the structure to inspect it. Due to OSHA regulations, power lines have to be turned off to be inspected, which impacts customers. When the industry switches to BVLOS, one drone pilot can inspect these structures from a central location, leading to lower cost and improved inspections, as the drone delivers a 4K video stream that is then recorded. Successive videos of the same structures can be compared against previous recordings to evaluate how the structure is withstanding environmental impacts and aging. This allows for preventive maintenance, reducing cost and increasing uptime. For the public safety community, BVLOS gives firefighters a better view of forest fires than what is possible with planes.
Drones can fly closer to the ground and more slowly than planes or helicopters, without endangering a pilot, giving firefighters a better understanding of conditions before they fight the fire in person. Furthermore, the use of augmented reality (AR) and virtual reality (VR) technology can aid in flying the drones and help with inspecting various pieces of equipment.
Augmented and virtual reality use case
Combining 5G with network slicing allows reliable and predictable performance of augmented and virtual reality applications. For example, television programs can use 5G to holographically capture interviews and events, transmit them to a TV studio, and project the live images, allowing people in the studio to interact with the holographic images. Examples of this include interviews from sports events where the athletes are projected straight into the studio to be interviewed. Network slicing is particularly useful for large, highly latency-sensitive data streams for multimedia applications, since it allows for the dedication of bandwidth and computational resources needed to ensure flawless delivery.
Secure banking use case
Mobile payment and banking applications that currently operate on the common network can be enhanced with a network slice dedicated solely to a bank or payment system. This completely separates the payment and banking app from the commonly shared wireless network, allowing for greater security and flexibility.
International examples
The advent of network slicing increases the efficiency and attractiveness of larger geographic license sizes. Internationally, the most prevalent way spectrum is licensed is through nationwide licenses. Some large countries like Canada and Brazil follow the US model of splitting the country into a number of licenses. In particular, Canada, with its regional telecom providers, has a similar system for the same reasons as the United States, namely to allow regional providers an opportunity to participate in the wireless marketplace.
With the advent of 5G, several European countries have set aside spectrum for Industry 4.0 companies in the 3.7 GHz to 3.8 GHz band. Despite its moniker, the licenses are available to all companies at costs in the thousands of dollars per year. One of the leading examples is Germany, with its significant manufacturing sector. The German government lays out four scenarios[1] for how German companies can use a 5G campus network, ranging from a slice of a public network to a hybrid-shared RAN where the public RAN is supplemented with private small cells, a hybrid RAN with a private core, and a fully separate stand-alone network.
The premise of this approach is that allocating spectrum in smaller parcels to individual companies would be more efficient than allocating larger flexible use licenses to MNOs, but experience has not borne this out, as uptake has been limited and is projected to remain minor. The first companies that have announced 5G private networks are pursuing either stand-alone networks or network slices, but almost all of them are being built together with an MNO, either as the provider of a slice or as the lead partner to the customer. For example, General Motors and Honeywell have a 5G indoor system deployed in the United States with Verizon[2]. In Germany, Lufthansa together with Vodafone has launched a stand-alone private 5G network[3], and OSRAM together with Deutsche Telekom has launched a hybrid slice network[4]. In the UK, Ford is working with Vodafone[5]; Air France and Groupe ADP are planning a 5G network[6]; and in China multiple companies are building or have already built 5G networks with China Mobile, China Telecom and China Unicom respectively[7].
Bitkom, the 2,700-member German industry association for the digital sector, conducted a survey of companies regarding their plans for 5G[8]. Half of all companies considered 5G an important factor for their company in the future. Thirty-six percent of companies are planning or discussing using a network slice from an MNO for their 5G needs, and six percent are planning to use the 100 MHz license that the German government has set aside for private company networks. For one of the leading high-tech manufacturing countries in the world, the finding that only six percent of German companies are interested in deploying a private 5G network, where the enterprise owns the radios, network core and applications, on dedicated spectrum is sobering. The advent of network slicing allows enterprises to avoid taking on those responsibilities while still enjoying the benefits of customization.
Opportunity cost of different license area sizes or licensing schemes
Exhibit 2: Industrial locations in Germany with more than 1,000 employees
Private networks, like the industrial networks envisioned in Germany, still require network design and someone to build, operate and maintain the network. This is a challenge for even the largest industrial providers. As a result, they are looking to either large system integrators or mobile operators to provide these services.
An additional challenge is the relatively small license areas for these networks. The smaller the license area, the more challenging network design becomes. In the low- and mid-band, radio waves inevitably will travel further than the smallest, city-block-size census blocks in major metropolitan markets. This creates interference issues with neighboring antennas, especially if they are owned by a different licensee. The interference issue gets smaller with higher frequency spectrum as the signal is increasingly attenuated and has problems penetrating walls.
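The attenuation point can be illustrated with the standard free-space path loss formula, FSPL(dB) = 20·log10(d_km) + 20·log10(f_MHz) + 32.44: at the same distance, a higher-frequency signal arrives substantially weaker, so it is less likely to interfere across a small license boundary. The frequencies below are generic low-, mid-, and high-band examples, not a specific band plan.

```python
import math

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss in dB (distance in km, frequency in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Loss over 1 km at example low-, mid-, and high-band frequencies;
# each step up in frequency adds attenuation at the same distance.
for f in (600, 3700, 28000):
    print(f"{f:>6} MHz: {fspl_db(1.0, f):.1f} dB")
```

Even before wall penetration losses, the roughly 33 dB gap between 600 MHz and 28 GHz at the same distance is why small license areas are far more workable in high-band spectrum.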
Another issue to consider is that if only a small number of companies are willing or eligible to use the spectrum, the unused spectrum lies fallow and cannot be used otherwise. This is particularly a problem for low- and mid-band spectrum that would otherwise be deployed across wider areas.
Even in a country as large as Germany, with twice as many people as California on half the geography, larger businesses with more than 1,000 employees are highly concentrated, as we can see in Exhibit 2[9]. In large swaths of Germany, the 100 MHz of 5G spectrum reserved for companies will lie fallow, as indicated by the large grey areas, and cannot be used for important tasks such as wireless broadband to homes or businesses. This runs counter to the idea that MNOs in both urban and especially rural areas can solve connectivity issues driven by inadequate landline solutions. Network slicing by MNOs allows enterprise customers to avoid the inefficiencies of small license sizes, while still achieving the customization benefits of private networks.
Conclusion
5G and network slicing are opening up new opportunities for consumers and businesses alike. In terms of added flexibility, network slicing is the best thing to happen to mobile networks since sliced bread. 5G, SDN and network slicing eliminate the delineation between telecom and information technology and create a contiguous development space for software engineers. As software is eating the world, this includes wireless networks, just as wireless is eating the world as well. This development opens up new opportunities ranging from augmented and virtual reality to drones to more fully optimized factories.
The cost savings, from replacing expensive cabling to reducing workplace accidents, are significant as wireless connections replace wired ones. The increased flexibility of changing the workflow on a factory floor is substantial. The ability to fly drones beyond the line of sight will simplify many tasks that currently require people on the ground to inspect structures. Considering that the core competency of most companies is their specific focus, not running IT systems or highly sophisticated 5G networks at limited scale, the deployment scenarios we see in Germany and other European countries will be network slicing or a hybrid solution that MNOs manage. While it is possible that we will see stand-alone networks, the trend in corporate support functions is towards outsourcing, as it often provides a better solution at a lower cost.
Most companies have outsourced their entire IT segment to specialist companies that can provide a superior experience for a lower cost. Even email, a relatively simple application, has become a hosted solution provided by large IT companies. Network slicing allows for custom network applications previously deployed on private networks or not at all by essentially creating a network-as-a-service approach, where connectivity is just one of the components of an enterprise solution. Network innovations that would require a separate investment in a private network arrive automatically with a network slice as the overall MNO network is upgraded. Working with an MNO and using a network slice allows companies to benefit from technical innovation at no extra cost, as the MNO will upgrade its network to remain competitive. With a stand-alone network, the enterprise would have to bear that cost itself. Private networks are also an option, but each enterprise should consider the requirements to develop and manage such a network – from spectrum, to design, development, and ongoing operations.
Network slicing allows enterprises to have all the benefits of customized private networks without having to build, operate and keep current a network that is increasingly essential to remaining competitive.
This paper has been commissioned by CTIA – The Wireless Industry Association
When Nvidia announced that it was in the process of buying Arm from Softbank, many analysts and industry observers were exuberant about how it would transform the semiconductor industry by combining the leading data center Artificial Intelligence (AI) GPU company with the leading device AI processor architecture company. While some see the potential advantages that Nvidia would gain by owning ARM, it is also important to look at the risks that the merger poses for the ecosystem at large and the course of innovation.
An understanding of each company’s business model and how they interact highlights the importance of the proposed merger. Nvidia became the industry leader in data center AI almost by accident. Nvidia became the largest graphics provider by combining strong hardware with frequently updated software drivers. Unlike its competitors, Nvidia constantly improved not only its newest graphics cards but also past-generation cards with new drivers that made them faster. This extended the useful life of graphics cards but, more importantly, it also created a superior value proposition and, therefore, customer loyalty. The software also added flexibility, as Nvidia realized that the same capability that makes graphics processing on PCs efficient and powerful – parallel processing – is also suitable for other heavy computing workloads like bitcoin mining and AI tasks. This opened up a large new market that its competitors could not follow into due to their lack of suitable software capabilities. This made Nvidia the market leader in both PC graphics cards and data center AI computation with the same underlying hardware and software. Nvidia further expanded its lead by adding a parallel computing platform and application programming interface (API) to its graphics cards that has laid the foundation for Nvidia’s strong performance and leading market share in AI.
ARM, on the other hand, does not sell hardware or software. Rather, it licenses its ARM intellectual property to chip manufacturers, who then build processors based on the designs. ARM is so successful that virtually all mobile devices use ARM-based CPUs. Apple, which has used ARM-based processors in the iPhone since its inception, is now also switching its computer processors from Intel to internally built ARM-based CPUs. The ARM processor designs are now so capable and focused on low power usage that they have become a credible threat to Intel, AMD, and Via Technology’s x86-based CPUs. Apple’s move to eliminate the x86 architecture from its SKUs is a watershed moment, in that it solves a platform development issue by allowing developers to natively design data center apps on their Macs. Consequently, it is only a matter of time before ARM processor designs show up in data centers.
This inevitability highlights one of the major differences between ARM’s and Nvidia’s business models. ARM makes money by creating processor designs and selling them to as many companies as possible that want to build processors. Nvidia’s business model, on the other hand, is to create its own processor designs, turn them into hardware, and then sell an integrated solution to its customers. It is hard to overstate how diametrically different these business models are, and hard to imagine how one could reconcile them in the same company.
Currently, device AI and data center AI are innovating and competing around what kind of tasks are computed and whether the work is done on the device or at the data center or both. This type of innovative competition is the prerequisite for positive long-term outcomes, as the marketplace decides what the best distribution of effort is and which technology should win out. With this competition in full swing, it is hard to see how a company CEO could reconcile this battle of the business models within one company. Even more so, the idea that one division of the new Nvidia, ARM, could sell to Nvidia’s competitors, for example in the datacenter or automotive industry, and make them more competitive is just not credible, especially for such a vigorous competitor as Nvidia. It would also not be palatable to shareholders for long. The concept of neutrality that is core to ARM’s business would go straight out the window. Nvidia wouldn’t even have to be overt about it. The company could tip the scales of innovation towards the core data center AI business by simply underinvesting in the ARM business, or in industries it chooses to deprioritize in favor of the datacenter. It would also be extremely difficult to prove underinvestment if Nvidia simply maintained current R&D spend rather than increasing it, as another owner might, seeing the AI business as a significant growth opportunity rather than the threat Nvidia might perceive.
It is hard to overestimate the importance of ARM to mobile devices and, increasingly, to general purpose computing – with more than 130 billion processors made as of the end of 2019. If ARM is somehow impeded from innovating as freely as it has, the pace of global innovation could very well slow down. The insidious thing about such an innovation slowdown is that it would be hard to quantify and impossible to rectify.
The proposed acquisition of ARM by Nvidia also comes at a time of heightened anti-trust activity. Attorneys General of several states have accused Facebook of predatory conduct. New York Attorney General Letitia James said that Facebook used its market position “to crush smaller rivals and snuff out competition, all at the expense of everyday users.” The anti-competitive conduct cited as the basis for the anti-trust lawsuit against Facebook also included predatory acquisitions made to lessen the competitive pressure from innovative companies that might become a threat to Facebook’s core business.
The parallels are eerie and plain to see. The acquisition of ARM by Nvidia is all too similar to Facebook’s acquisitions of Instagram and WhatsApp in that each allows the acquirer to hedge its growth strategy regardless of customer preferences while potentially stifling innovation. And with Facebook in the driver’s seat, it could exploit shifting customer preferences: whereas in some countries and customer segments the core Facebook brand is seen as uncool and old, Instagram is seen as novel and distinct from Facebook. From Facebook’s perspective, the strategy keeps the customer in-house.
The new focus by both the states and the federal government, Republicans and Democrats alike, on potentially innovation-inhibiting acquisitions, highlighted by lawsuits examining past acquisitions in Facebook’s and Google’s cases, makes it inevitable that new mergers will receive the same scrutiny. Regulators will likely conclude that the proposed acquisition of ARM looks and feels like an attempt to take control of the engine that fuels the most credible competitors to Nvidia’s core business, just as ARM and its customers expand into the AI segment and become likely threats to Nvidia. In a different time, regardless of administration, this merger would have been waved through, but it would be surprising if that were the case in 2021 or 2022.
A new Recon Analytics report, “Broadband 2020,” shows that over 40% of employees in the United States are able to telecommute. The Department of Labor’s Bureau of Labor Statistics defines the professional workforce as all workers in “management, professional, and related occupations,” colloquially known as white-collar workers, who make up 41.2% of all jobs in America. This means that essentially every white-collar worker is able to telecommute, which highlights the dramatic change the American workplace has undergone during the pandemic.
The pandemic also has the potential to halt or even reverse the decades-long migration of Americans from rural to urban settings. A slight majority (50.9%) of Americans who can telecommute are contemplating a move to a smaller city or town, as the pandemic has prompted many Americans to reevaluate their priorities and living conditions.
What is surprising is that even 31% of Americans who cannot telecommute are considering moving to a smaller city or town. The luster of metropolitan areas has been waning.
But not all new places are equal, so we asked what factors would stop people from moving to a new place. The results were equal parts predictable and surprising:
More than a third of Americans have no reason that would prevent them from moving to a different place. Where it gets interesting is why the rest would not move. The number one reason for not moving to a different town or village is a pay cut, cited by 31.6% of respondents. Companies like Facebook have announced that employees who work from home in lower-cost areas, and everything is lower cost than Silicon Valley, would receive a pay cut. Tying compensation to location rather than contribution would prevent a significant number of employees from moving away from Silicon Valley, which already suffers from a severe housing shortage and overloaded roads. Facebook’s reasoning also offers a glimpse at its compensation philosophy, which seems to focus more on competitive factors than on what is good for the community or the employee. Almost as many, 31%, would not move to a town or village without broadband, just ahead of access to quality health care at 30.1%, and that in the midst of a pandemic. One has to recognize the magnitude of this finding: availability of broadband, access to quality health care, and avoiding a pay cut weigh equally in the minds of Americans during a pandemic and recession.
At 36.3%, the 45-54 age segment considers the lack of broadband to be the most significant barrier to moving, followed by the 25-34 age segment with 35.8%. More than a quarter of seniors (26.1%) will not move to a new location if broadband isn’t readily available.
Broadband is even more important than politics. Only 22.5% of Americans would not move to an area with what they consider an incompatible political climate, significantly fewer than for broadband. The 45-54 age segment is most focused on politics, with 30.9% citing an unwillingness to move due to an incompatible political climate. The next most polarized age segment is those over the age of 65, where 22.1% say an incompatible political climate prevents them from moving.
The lack of a nearby airport or a buzzing nightlife ranked lowest in people’s minds. Only 13.7% of respondents thought that not having an airport within a 50-mile radius would keep them from moving somewhere. A buzzing nightlife or restaurant scene matters even less: only 9.6% of 18-to-24-year-olds consider its absence an obstacle to moving, compared with 13.1% of the 25-34 age segment.
We asked people what they consider broadband. The median American considers 50 Mbit/s download and 5 Mbit/s upload to be broadband. These expectations run ahead of the FCC’s definition of broadband, which currently sits at 25 Mbit/s download and 3 Mbit/s upload.
The reason for this becomes apparent when we look at the use cases. Our survey examined several use cases, and the prevalence of video conferencing has driven bandwidth requirements upward, especially on the upload side. An HD video stream requires a minimum of 5 Mbit/s upload and download per stream. With more than 25% of Americans now frequently using video conferencing for work and another 21% using it sometimes for work, the bar has effectively been raised.
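The arithmetic behind the raised bar can be sketched quickly. The 5 Mbit/s per-stream figure comes from the text above; the 25% protocol-overhead factor is an illustrative assumption, not a survey finding.

```python
# Rough household upload estimate for simultaneous HD video calls.
# Per-stream requirement of 5 Mbit/s is cited in the text above;
# the 25% overhead factor is an assumption for illustration.

HD_STREAM_MBPS = 5  # minimum upload per HD stream

def required_upload_mbps(concurrent_streams: int, overhead: float = 0.25) -> float:
    """Upload bandwidth needed for a number of simultaneous HD calls."""
    return concurrent_streams * HD_STREAM_MBPS * (1 + overhead)

# A single HD call already exceeds the FCC's 3 Mbit/s upload floor,
# and two household members on calls at once more than quadruple it.
print(required_upload_mbps(1))  # 6.25
print(required_upload_mbps(2))  # 12.5
```

Even without the overhead assumption, one HD call alone outstrips the FCC’s 3 Mbit/s upload definition, which is consistent with respondents’ higher 5 Mbit/s upload expectation.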
While the lack of widely available broadband is a significant hurdle for cities and towns trying to attract new residents, it is almost outright disqualifying for housing options: 77.5% of respondents would not move into a house or apartment that does not have broadband. This makes broadband availability one of the key selection criteria when choosing a new residence. When almost half of the population has to be on video conferences sometimes or frequently, having broadband becomes a job requirement. The pandemic, for good and bad, has made our homes places of work, with the IT and connectivity needs that were traditionally reserved for offices. These are just some of the highlights of the new Recon Analytics report “Broadband 2020.”
The results of the report reinforce the data from the FCC’s 2020 Broadband Deployment Report, the most recent government data on the topic, which tracks the industry’s progress from 2014 to 2018.
As of 2018, 94.4% of Americans have access to broadband as the FCC defines it: 25 Mbit/s download, 3 Mbit/s upload (25/3). In urban areas, availability reaches 98.5%, but in rural areas and on tribal lands it is significantly lower: 77.7% of Americans in rural areas and 72.3% on tribal lands have access to 25/3 broadband. At higher tiers, access in urban areas drops only slightly but falls much more sharply in rural areas and on tribal lands. At the 250/25 Mbit/s tier, 94% of Americans in urban areas have access, a drop of 4.5 percentage points from the 25/3 level. In rural areas, 51.6% of Americans have access to 250/25, which is 26.1 percentage points below the 25/3 figure. On tribal lands, 45.5% have access to 250/25, 26.8 percentage points below 25/3.
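The tier-to-tier gaps quoted above can be recomputed directly from the FCC availability figures cited in the text:

```python
# Availability figures from the FCC 2020 Broadband Deployment Report,
# as cited above (percent of population with access to each tier).
access = {
    "urban":  {"25/3": 98.5, "250/25": 94.0},
    "rural":  {"25/3": 77.7, "250/25": 51.6},
    "tribal": {"25/3": 72.3, "250/25": 45.5},
}

# Drop, in percentage points, going from the 25/3 tier to the 250/25 tier.
drops = {area: round(t["25/3"] - t["250/25"], 1) for area, t in access.items()}
print(drops)  # {'urban': 4.5, 'rural': 26.1, 'tribal': 26.8}
```

The calculation makes the urban/rural divide explicit: the move to the higher tier costs urban areas under five percentage points of coverage but rural and tribal areas more than twenty-six each.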
The numbers make it clear that there is still plenty to do in urban, rural, and tribal areas to provide connectivity for essential tasks. As it looks increasingly unlikely that children in every school district will be able to go back to school, we need to ensure that every child in the United States can access the internet to participate in school and classroom work. If only one child cannot participate, the progress and grades for the entire class are not counted. While fixed broadband deployment is a time-consuming endeavor, mobile broadband can and should close the homework gap. T-Mobile has announced that, as part of its merger commitments, it will deliver mobile broadband to 10 million households. With the new school year starting, we have only a few weeks to turn this promise into a meaningful difference. The other mobile operators, in conjunction with the FCC and federal funding, should seize the opportunity and close the homework gap as quickly as possible.
In order to recover as quickly as possible from the current economic slump, we should put money where it has the biggest impact. Different technologies can achieve the same goals but have strengths and weaknesses in different areas. Any funding therefore has to be technology-agnostic and judged on performance characteristics. The United States has wisely always used performance characteristics such as download and upload speed, as well as latency, as its selection criteria rather than tying funding to a particular technology, be it fiber, hybrid fiber-coax, VDSL, satellite, or any generation of wireless standards.
If you would like to buy the underlying report, please give us a call at 617.823.3363.