Amdocs: Operators Plan Massive Growth in Carrier-Grade Wi-Fi to Meet Customer Expectations

Carrier-Grade Wi-Fi to Grow from 14% in 2014 to 72% in 2018

New network planning and management tools critical for enabling monetization of carrier-grade Wi-Fi

ST. LOUIS - December 3, 2014 - Amdocs, the leading provider of customer experience solutions, today announced at Small Cells Americas the results of new, independent research exploring the transition from “best-effort” to “carrier-grade” Wi-Fi networks among Multiple Service Operators (MSOs) and Mobile Network Operators (MNOs).  This transition comes in response to new end-user expectations for improved capacity and quality for “everywhere” content, and the need to support new revenue streams.

The Amdocs-commissioned research, conducted by Real Wireless and Rethink Technology Research, reveals plans for massive growth in carrier-grade Wi-Fi, the different strategies operators intend to deploy and the technical barriers to be overcome. According to the research, service providers realize that "best-effort" Wi-Fi is becoming less profitable and that new revenue streams can only be built once a higher quality of experience (QoE) is assured. This higher QoE is necessary for services such as TV everywhere, health monitoring, enterprise voice, online gaming, media streaming and voice over Internet protocol (VoIP) services. The survey also highlights the criticality of tools for carrier-grade network planning and performance management, spanning both cellular and Wi-Fi networks, to enable the leap to new Wi-Fi services.

Key findings include:

  • Carrier-grade Wi-Fi hotspots will grow from 14 percent today to 72 percent of overall Wi-Fi hotspots by 2018
  • As part of their Wi-Fi network strategy to enable Wi-Fi coverage on the move, by 2016, 77 percent of service providers plan to use “homespots” (where the user agrees to leave the hotspot open for use by passers-by), growing from 30 percent today
  • Almost all operators (85 percent) plan to invest in carrier-grade Wi-Fi by 2016. MSOs see carrier-grade Wi-Fi providing better positioning in mobile virtual network operator (MVNO) deals and supporting quad-play offerings and wireless services, while MNOs plan to use carrier-grade Wi-Fi to broaden their networks and offload radio access network (RAN) traffic
  • By the end of 2016, 61 percent of MSOs' Wi-Fi hotspots, and 70 percent of MNOs', will be sourced from third parties to take advantage of shared cost savings and accelerated deployment, up from 45 percent today
  • Almost two-thirds (65 percent) of respondents placed the lack of strong network planning and management tools among their top three risk factors for investing in carrier-grade Wi-Fi, with 65 percent stating that their existing tools will not extend well to Wi-Fi without additional investment.

“Service providers are starting to see Wi-Fi as a strategically important offering that can enhance or damage their reputations and which needs to support a user experience comparable to that of cellular networks,” said Oliver Bosshard, Managing Consultant at Real Wireless. “Best-effort Wi-Fi networks are not controlled from the operator’s core network or operational support systems tools, and the access points often do not support any form of traffic management or prioritization. As a result, operators are unable to monitor or address performance issues such as congestion, meaning they cannot guarantee QoE – properties such as connection speed, latency or prioritization that are all critical to enable the monetization options for Wi-Fi.”

“Because quality of experience is essential to current and future network monetization strategies, operators need to have the right planning and management tools in place.  These are areas that are critical to the business case – to ensure optimal and cost-efficient roll outs, and to provide detailed analysis of network behavior and customer usage, which can feed into improved quality of experience,” said Rebecca Prudhomme, vice president for product and solutions marketing at Amdocs. “Amdocs network solutions allow service providers to maximize network capacity and deliver quality of service based on real-time customer insights while enabling greater cost-efficiency.”

The research was conducted between August and October 2014, with Wi-Fi managers from 40 service providers in Asia Pacific, Europe, Latin America and North America.

Will cognitive radio, dynamic spectrum access come of age in 5G?

Around 10 years ago, the Defense Advanced Research Projects Agency (DARPA)'s neXt Generation (XG) Communications program constructed a prototype cognitive radio system, which used dynamic spectrum access for its communications. By identifying unused sections of spectrum in the area in which it was operating, it was hoped that up to 10 times more spectrum would become available for transmissions. This highlighted a growing interest in the defense community in dynamic spectrum access (DSA) techniques, which had been developed with the challenges of battle-space spectrum in mind but also appeared applicable in commercial environments, where they promised more efficient use of valuable spectrum resources and potentially a path towards spectrum trading. The XG program was one of the largest cognitive radio projects at the time, but interest in cognitive radio was by no means limited to the U.S.

Martin Cave’s audit of public sector bands in 2005, which highlighted just how much more efficiently U.K. defense spectrum could be utilized, provoked interest in the topic in the U.K. This was produced alongside Ofcom’s Spectrum Framework Review, which set out ambitious targets for a general move from the traditional “command and control” approach to spectrum licensing to a more dynamic approach based on “market mechanisms” with the overall ambition of realizing better value from spectrum for the U.K.

With the switchover to digital television and release of TV white space, a debate was ignited over whether DSA could be applied to these civilian bands too. The obvious example of this has been the activity around TV white space, although the Federal Communications Commission discussion on 3.5 GHz is also significant.

However, the digital TV switchover was six years ago and the commercial roll out of white space devices is still fairly limited due to the complications of deploying these devices in practice. Concerns over the so-called “hidden node” issue (interference provoked by the failure of one device to detect the presence of all other devices) and how devices with different spectral views would liaise with each other have meant that the regulation of these white space devices has taken some time to agree.

In attempting to overcome these limitations, regulators gradually shied away from a pure spectrum sensing approach, towards the introduction of beacon signals to identify usage, before settling on the use of a centralized database of white spaces in each location that is used in addition to spectrum sensing.
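To make the combined approach concrete, here is a minimal Python sketch (with entirely hypothetical database and sensing functions, not any real regulator's API) of a white space device that only transmits on channels that are both permitted by the geolocation database for its location and appear vacant to its own local sensing:

```python
# Illustrative sketch only: a white space device combines a geolocation
# database lookup with local spectrum sensing before transmitting.
from dataclasses import dataclass


@dataclass
class ChannelInfo:
    channel: int
    max_eirp_dbm: float  # maximum permitted transmit power from the database


def query_whitespace_database(lat: float, lon: float) -> list[ChannelInfo]:
    """Hypothetical stand-in for a geolocation database query.
    A real deployment would call an approved database operator over HTTPS."""
    return [ChannelInfo(21, 20.0), ChannelInfo(24, 30.0), ChannelInfo(27, 30.0)]


def sense_channel_power_dbm(channel: int) -> float:
    """Hypothetical local sensing result: measured power on the channel."""
    measurements = {21: -95.0, 24: -60.0, 27: -98.0}
    return measurements.get(channel, -100.0)


def usable_channels(lat: float, lon: float, sensing_threshold_dbm: float = -84.0):
    """Channels allowed by the database AND apparently vacant according to sensing."""
    allowed = query_whitespace_database(lat, lon)
    return [c for c in allowed if sense_channel_power_dbm(c.channel) < sensing_threshold_dbm]


if __name__ == "__main__":
    for c in usable_channels(50.82, -0.14):
        print(f"Channel {c.channel}: OK to use, up to {c.max_eirp_dbm} dBm EIRP")
```

The database protects incumbents such as TV broadcasters and wireless microphones, while the sensing step guards against the hidden node problem the database cannot see.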

But even then, the practical use of TV white spaces has continued to be fairly limited. Vendors and operators have struggled to find an application that suits the availability of white spaces, as well as handling the lack of guaranteed spectrum.

This same philosophy is being proposed for 3.5 GHz in the U.S., where some locations have other users (e.g. marine radar), but the combination of database and sensing could allow this band to be used. This is especially important as 3.5 GHz is one of the few LTE bands that is supported globally, so there is a clear commercial imperative.

Enter 5G

At the recent 5G Huddle, rethinking how existing technologies make use of spectrum was a key topic of discussion, with spectrum sharing a major part of this.

There are some strong arguments for why this would be sensible:

  • We’re starting to reach the limits of what we can achieve through higher order modulation schemes, with any gains insufficient to keep pace with demand.
  • We may still be making some gains with regards to multiple-input, multiple-output (MIMO) and coordinated multipoint (CoMP), but again, not at the same rate that demand is increasing.
  • Small cells and network densification, both of which are increasing, lend themselves well to spectrum sharing.
  • The last variable available to us in our attempts to increase capacity is spectrum, and (at least in theory) DSA maximizes the availability and efficiency of spectrum across all operators.

On that last point, this is of course only if it is deployed correctly, with polite protocols for communications.
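A quick back-of-the-envelope sketch in Python, using nothing more than the Shannon capacity formula (the figures are illustrative assumptions, not measurements), shows why spectrum rather than modulation is where the remaining headroom lies: improving the SNR of an already-good link buys comparatively little, whereas doubling the available spectrum doubles capacity outright.

```python
# Shannon capacity, C = B * log2(1 + SNR), with illustrative numbers.
import math


def shannon_capacity_mbps(bandwidth_mhz: float, snr_db: float) -> float:
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_mhz * math.log2(1 + snr_linear)  # MHz * bit/s/Hz = Mbit/s


base = shannon_capacity_mbps(20, 20)       # 20 MHz at 20 dB SNR
more_snr = shannon_capacity_mbps(20, 23)   # 3 dB more SNR (double the power)
more_band = shannon_capacity_mbps(40, 20)  # double the bandwidth instead

print(f"20 MHz @ 20 dB : {base:6.1f} Mbit/s")
print(f"20 MHz @ 23 dB : {more_snr:6.1f} Mbit/s (+{more_snr / base - 1:.0%})")
print(f"40 MHz @ 20 dB : {more_band:6.1f} Mbit/s (+{more_band / base - 1:.0%})")
```

Doubling the transmit power gains roughly 15 percent in this example, while doubling the spectrum gains 100 percent, which is why access to more spectrum, shared or otherwise, is so attractive.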

However, introducing dynamic spectrum sharing to "5G" would surely result in 5G suffering from the same technical issues that cognitive radio has encountered before.

After all, one of the key differentiators of cellular over many other wireless technologies, such as Wi-Fi, is the guaranteed quality of service. Indeed, we have previously examined how exclusively licensed spectrum loses value as the sharing arrangements increase uncertainty for operators.

Wouldn’t 5G lose this edge if spectrum access became dynamic and without guarantees?

At present, yes, but it is unlikely anyone would be satisfied with introducing such a glaring problem into 5G. Rather, the key difference between earlier cognitive radios and 5G is that, as demonstrated by the discussions at the 5G Huddle, major commercial vendors and operators are putting significant research time and investment behind the technology.

Perhaps this time around, with the full weight of the industry behind it, and with an appropriate understanding of what operators need from spectrum sharing conditions to offer high-quality services, cognitive radio and DSA can really come of age.

This blog post originally appeared as part of RCR Wireless’s Analyst Angle, where the industry’s leading analysts discuss the hot topics in the wireless industry.

The UK needs to address rural coverage – but national roaming isn’t the answer

This week has seen the UK government bring back proposals for national roaming, the idea being that those in remote villages and towns should be able to jump onto rival networks if their current provider isn’t delivering. It’s certainly an admirable initiative and one worthy of discussion – but national roaming isn’t the answer.

There has been plenty of discussion today on the pros and cons of this approach, with The Register doing a particularly good job of summarising the key reasons why this policy is well intentioned but not well thought through.

So rather than going over the same ground, we wanted to look at other potential, viable solutions to the problem.

LTE is coming

As part of the 4G licence award, Telefónica O2 has an obligation to provide “a mobile broadband service for indoor reception to at least 98% of the UK population and at least 95% of the population of each of the UK nations… by the end of 2017 at the latest.” And, perhaps encouraged by this obligation, all the operators have committed to meeting this target by the end of 2015. So much will change in the next year without further government intervention.

While LTE has been in big cities for a while now, it’s yet to reach much of the countryside or the smaller towns. But it’s on the way.

Real Wireless completed a project for the Scottish Government looking specifically at rural coverage, and the findings suggest people will be genuinely surprised by just how good LTE coverage is set to become.

We found that providing 95% of the population with indoor coverage, growing to 98% with gradual enhancements, is not beyond the reach of operators to achieve by the end of 2015. This is a huge improvement over 2G coverage, which even today averages only around 85% indoors. We also found that the average indoor mobile data speed available across Scotland will increase from about 2.5Mbps in 2012 to approximately 36Mbps by 2023.

The 4G roll-outs will reach 95% of the population surprisingly quickly, and there are ways to accelerate the rollout to 98%. However, it's the final 2% that presents the most difficult challenge – and it isn't something national roaming would solve either.

Rural coverage is expensive

Building networks is expensive, yet the UK already has amongst the lowest mobile infrastructure investment per head – something we touched on in a previous blog here. This is a real problem and one that puts us behind the rest of the world.

Technology has developed so that operators no longer need to invest in coverage over a wide area to get service where users need it most – indoors. Vodafone's Open Sure Signal initiative is a good example of how this can work.

Targeted coverage makes it much more cost effective for operators to deploy sites and also avoids many of the planning challenges that can slow down installations. It's this sort of technology that needs to be considered when addressing that final 2%, rather than expecting a blanket coverage approach. Such technology also provides operators with a way to continue to compete on coverage even as they share more of their wider networks, which is surely in the interest of consumers.

No easy answers

Rural coverage isn't easy, and the challenge has always been balancing the cost of network investment with the potential return. However, with the wider rollout of LTE and the development of much cheaper, targeted ways of delivering coverage, there are viable solutions available. It's these approaches that Government and operators need to examine in parallel, rather than pushing ahead with an approach that, while well intentioned, has some significant flaws. Government also needs to be aware of the risks of unintended consequences – just one example is the potential impact on national security flagged by police chiefs and the Home Secretary.

Wireless in Stadiums

The Challenge & Opportunities of Stadium Wireless

We presented on this topic at a conference in Barcelona last week.

Stadiums are famously some of the most challenging environments for wireless, but they also offer plenty of opportunities for improving customer satisfaction, creating new revenue streams and boosting operational efficiency.

The presentation covers the challenges faced by stadium owners in meeting increasing demand for wireless without sacrificing the core values of their loyal supporters.

Stadiums do have some specific technical difficulties compared with other wireless environments, including:

  • Multiple user communities
  • Challenging architecture
  • Critical business requirements
  • High capacity density
  • A mixed propagation environment

Getting the best from wireless in stadiums involves recognising and addressing all of these challenges, and is likely to lead to a multi-party engagement model with appropriate independent wireless expertise. Whilst difficult, it is definitely possible.

The presentation also highlights some of the current challenges, such as the difficulty of getting cost-effective Wi-Fi into a stadium bowl without producing a disappointing experience, and the current opportunities, such as using LTE Broadcast for high-quality video streaming and as a new source of content revenue.

The presentation is available in our downloads library.

These issues are discussed in more detail in a Stadium Magazine interview with RW Commercial Director Mark Keenan.

In addition, we discussed many of the commercial aspects in our White Paper.

Real Wireless warns existing networks need upgrade to cope with demand for wireless on transport

Industry experts warn of reliance on public networks and lack of holistic business case in transport wireless services

Current approaches to the provision of mobile connectivity to travellers on public and private transport risk failing to meet demand or justify themselves financially. This is according to independent wireless technology advisory firm Real Wireless.

Instead, a service that meets demand will only be possible through the rollout of additional custom mobile infrastructure, in order to complement the existing provision of public wireless networks.

The expense involved in this approach means that a comprehensive business case is vital before any rollout. In many cases, a positive return on investment will only be possible if transport operators take a holistic approach to planning wireless service rollouts, combining revenue from passengers with operational efficiency savings in other areas of their business.

To explore the additional services and business models that can be enabled by wireless, Real Wireless has published a new guide ‘The business opportunities for wireless in transport’. Bringing together the experience and insight of its experts from across the wireless industry, both working in and outside of the transport sector, it provides an overview of the potential services that must be taken into account by transport companies looking at rolling out wireless services.

Transportation has seen a series of high profile announcements in 2014 regarding the integration of wireless services. The highest profile of these have centered on the introduction of in-vehicle data connectivity for passengers on railways and airlines, delivering benefits for both customer experience and productivity and creating new revenue streams for operators.

However, 'wireless services' extend far beyond passenger data connectivity, covering technologies such as Wi-Fi, cellular reception, machine-to-machine communications and 'big data' analytics. Similarly, wireless can also bring benefits to the companies operating the transport services, delivering operational efficiencies and new opportunities to streamline the business.

“For passengers, the benefits of having access to data services on the move are obvious,” said Mark Keenan, Commercial Director at Real Wireless. “But our analysis shows that operators must carefully consider all their options before proceeding with a rollout.

"Transport operators should build a comprehensive business case for the introduction of wireless, taking into account both direct and indirect cost savings and revenue streams and fully factoring in recent and expected advances in technology. A well-considered approach can prove highly lucrative for operators and regulators, as well as streamlining their everyday operations and enhancing their customers' experience."

In the report, Real Wireless identifies the added benefits wireless can offer the following sectors:

  • Railways – including enhanced customer services, better insights into customer behaviour and reduced carriage weight
  • Aerospace – including reduced turnaround time, more effective airport security and crisis management and revenue from on demand content
  • Roads – including accident prevention, shorter accident response times, and enhanced traffic flow systems to reduce congestion
  • Maritime - including onboard cellular networks, better tracking of cargo and new revenue streams for port operators

The Real Wireless guide to the business opportunity wireless presents to the transport sector, 'The business opportunities for wireless in transport', is available to download free of charge here.

Resilience, space weather, and the end of the world

Animation courtesy of Spaceweather.com

Last week you may have caught the news of two large coronal mass ejections (CMEs) occurring within a few days of each other, hitting Earth with a good dose of radiation. The two CMEs were the result of the catchily titled AR2158 sunspot, and their power was placed within the ‘extreme’ bracket of the scale used by astronomers.

On the night, many people saw the beauty in the event – thanks to it resulting in a fantastic Northern Lights display – whilst others predicted that it would lead to a nightmare of doomsday proportions.

In the end, the only ones really impacted were amateur (ham) radio enthusiasts. The HF signals used in ham radio transmissions propagate by 'bouncing' off the ionosphere, the atmospheric layer disturbed by geomagnetic activity. Normally that bounce is a good thing for amateur radio enthusiasts, allowing communication over much longer distances than usual, but when a storm disturbs the ionosphere that long-distance propagation suffers.

It was therefore far from the cataclysmic existential risk some had made it out to be – but there is just cause for concern, thanks to the potential future impact of such an event on wireless, and the consequences for wider society.

The most famous solar event is the Carrington Event of 1859, a powerful solar storm that produced the largest geomagnetic storm ever recorded. Aurora Borealis sightings near the equator were noted, whilst a famous anecdote states that gold miners in Denver woke up at 1AM and began their morning routines due to the brightness.

We've seen written testimony discussing Northern Lights events like this throughout history; the reason the Carrington Event is remembered is partially due to our increased scientific understanding by 1859, but also due to its impact on the early telegraph systems in place by that point. These systems failed, pylons sparked and operators suffered electric shocks.

Fast forward to 2014, and we are witnessing another peak in solar activity. After a major solar superstorm narrowly missed Earth in 2012, a NASA study in the December 2013 edition of Space Weather estimated the chances of a Carrington-level event hitting Earth by 2022 at 12%. That said, I'm dubious about how you assign statistics to something that has only happened once.

Such an event could induce huge currents in east-west wires – the longer the wire, the bigger the effect. That could cause significant disruption in the USA and continental Europe, though less so in the UK, where our power lines mostly run north-south. Transformers failing, power networks collapsing, fires and other unpleasant effects could result. With no power, water supplies and sewage systems – and of course communication networks – could stop working if not specifically designed to take such effects into account.
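To give a feel for the scale, here is a rough, illustrative calculation (the field strength and line parameters are assumed values for the sketch, not figures from any of the reports mentioned below): the storm drives a geoelectric field along the ground, and the voltage induced across a line grows roughly in proportion to its length, which is why long east-west lines are the most exposed.

```python
# Rough, illustrative geomagnetically-induced-current (GIC) estimate.
# The field strength and circuit parameters below are assumptions for
# illustration, not values taken from any published study.
extreme_field_v_per_km = 5.0   # assumed geoelectric field during an extreme storm
line_length_km = 500.0         # a long east-west transmission line
loop_resistance_ohm = 5.0      # assumed resistance of line plus transformer earths

induced_voltage = extreme_field_v_per_km * line_length_km   # ~2500 V along the line
quasi_dc_current = induced_voltage / loop_resistance_ohm    # ~500 A quasi-DC

print(f"Induced voltage along the line: {induced_voltage:.0f} V")
print(f"Quasi-DC current through transformer neutrals: {quasi_dc_current:.0f} A")
```

Quasi-DC currents of this order flowing through transformer windings designed for AC are what drive the saturation, heating and potential failures described above.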

Just as we assess and mitigate the risks of terrorism to wireless infrastructure, the impact such an event could have on communications infrastructure is something we must take into account when planning wireless networks.

Why? A 2013 report from Lloyd's investigating the risk of such a storm estimated costs of between $0.6 trillion and $2.6 trillion from US power shortages alone, with the lower-end estimates relying on utility businesses being prepared for such an event.

We contributed to arguably the most authoritative study on this issue: a report by a Royal Academy of Engineering committee on the impacts of so-called Extreme Space Weather on engineered systems and infrastructure. This included a group of eminent space scientists together with representatives of major services such as power networks and aviation, with Real Wireless representing the interests of wireless communication networks.

One interesting finding was that, although Carrington events are very extreme, even more typical activity should result in measurable impacts on mobile network quality around once a week. We recommended that systems needed for critical applications should carefully examine their use of synchronization systems based on GPS, which could be vulnerable.

The full report is available here.

We can't prevent such an event – one will probably happen in the not-too-distant future – so all we can do is ensure we are prepared to mitigate any impact it may have. We hope that the Royal Academy of Engineering report will provide a basis for proper planning to minimise the potential consequences.

Clouding the Edge for LTE-A and Beyond

This blog post was originally published over at Light Reading.

One of the areas of increasing discussion about LTE-Advanced (LTE-A) and especially around the yet-to-be defined 5G standard is the tension between the “edge” and the “cloud.”

Over the last decades in telecom the powerful trend has been to push intelligence out to the edge. David Isenberg wrote a very good — but oddly not as widely known or distributed as it deserves — essay on this way back in 1997: The Rise of the Stupid Network.

We now have edge routers, we have gateways in our phones, and new smartphones have “intelligence” onboard in a way landline phones never did.

In wireless networking, a few years after Isenberg's essay, broadband was proving this logic, with TCP/IP pushing intelligence out to the edge. While in 2G the smarts were quite centralized — with a basestation controller (BSC) in the network — with 3G that focus shifted and the network started to flatten out a bit. (See Mobile Infrastructure 101.)

Bell Labs, meanwhile, had the idea of putting the router and stack all the way into the basestation with the snappily named BaseStationRouter. That of course then became the 3G small cell, with the medium access control (MAC) and stacks moving into the NodeB and Iuh replacing Iub, and then on to the "flat architecture" of LTE. (See Telco in Transition: The Move to 4G Mobility.)

So small cells represent the clear case of intelligence to the edge — some people call this the Distributed RAN (D-RAN). (See Know Your Small Cell: Home, Enterprise, or Public Access?)

The advantages are that networks become better: We put capacity exactly where we need it. The small cell is responsive and efficient, and we can do things like offload and edge caching, latency is reduced (which improves speed and QoE) and so on. It is a cost effective and intelligent way to make the network better and has been the "obvious" paradigm for the last few years. (See Meet the Next 4G: LTE-Advanced.)

But over the last few years we have seen the reverse trend too.

In computing we have the cloud, with intelligence moving from the edge into the center: the widespread use of Amazon AWS or Google Cloud to host services, the rise of Chromebooks, and cloud-based services like Salesforce, Dropbox or Gmail.

This trend has also been felt in the wireless world, as we have heard more and more about cloud RAN (C-RAN). This is the opposite trend to small cells: Having a "dumb" remote radio head (RRH) at the edge, with all the digitized samples sent back over fiber — aka "fronthaul" — to a huge farm of servers that do all of the signal processing for the whole network. No basestation and certainly no basestation router. (See What the [Bleep] Is Fronthaul? and C-RAN Blazes a Trail to True 4G.)

Some simple advantages here are from economies of scale: One big server farm is cheaper and more efficient than having the same processing power distributed — electricity and cooling needs at the basestation are reduced for example. A more subtle gain is from pooling, which is sometimes called “peak/average” or “trunking gain.”

In a normal network, every basestation must be designed to cope with the peak traffic it will support — even though other basestations will be lightly loaded at that moment, only to hit their own peaks at some other time. So the network needs worst-case dimensioning at every site, even though on average there is a lot of wasted capacity. In contrast, the cloud RAN can have just the right amount of capacity for the network as a whole, and that capacity "sloshes around" to exactly where it is needed.
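A toy numerical example (a sketch with made-up hourly traffic figures, not data from any operator) makes the pooling argument concrete: dimensioning each site for its own peak requires noticeably more total capacity than dimensioning a shared pool for the peak of the aggregate, because the individual peaks do not all happen at once.

```python
# Toy illustration of pooling ("trunking") gain with made-up hourly traffic.
# Distributed RAN: each site is dimensioned for its own busy-hour peak.
# Cloud RAN: one shared pool is dimensioned for the peak of the aggregate.
site_traffic = {
    "business_district": [80, 90, 100, 60, 30, 10],   # peaks during the working day
    "residential":       [20, 30, 40, 60, 90, 100],   # peaks in the evening
    "stadium":           [5, 5, 10, 100, 40, 5],      # peaks at match time
}

per_site_peaks = {name: max(hours) for name, hours in site_traffic.items()}
distributed_capacity = sum(per_site_peaks.values())

aggregate_by_hour = [sum(hours) for hours in zip(*site_traffic.values())]
pooled_capacity = max(aggregate_by_hour)

print("Per-site peaks:", per_site_peaks)
print("Distributed dimensioning:", distributed_capacity)   # 300 units of capacity
print("Pooled dimensioning:", pooled_capacity)              # 220 units of capacity
print(f"Pooling gain: {1 - pooled_capacity / distributed_capacity:.0%} less capacity needed")
```

With these assumed figures the pooled design needs roughly a quarter less processing capacity, which is the "peak/average" gain referred to above.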

That is a benefit, but it has not seemed significant enough to persuade most carriers.

The problem has been connectivity: Those radio heads produce a huge amount of data and the connectivity almost certainly requires dark fiber. Most carriers simply do not have enough fiber, and even for those who do it is unfeasibly expensive. So, for most operators C-RAN has so far been economically interesting but not compelling and not worth the cost. (See DoCoMo’s 2020 Vision for 5G.)
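A quick back-of-the-envelope calculation shows why the fronthaul demands are so punishing (this sketch uses standard CPRI-style figures for a single 20 MHz LTE carrier with a 2x2 radio head; exact overheads vary by CPRI option and vendor settings, so treat the numbers as indicative):

```python
# Back-of-the-envelope CPRI-style fronthaul rate for one 20 MHz LTE carrier.
# Indicative figures only: real rates depend on the CPRI option in use.
sample_rate_sps = 30.72e6   # LTE 20 MHz baseband sampling rate
bits_per_sample = 15        # per I and per Q component
antennas = 2                # 2x2 MIMO remote radio head

payload_gbps = sample_rate_sps * 2 * bits_per_sample * antennas / 1e9
with_control = payload_gbps * 16 / 15    # one control word per 15 data words
line_rate_gbps = with_control * 10 / 8   # 8b/10b line coding

print(f"I/Q payload:        {payload_gbps:.2f} Gbit/s")
print(f"With control words: {with_control:.2f} Gbit/s")
print(f"On the fibre:       {line_rate_gbps:.2f} Gbit/s per sector-carrier")
```

Roughly 2.5 Gbit/s for every sector-carrier, before carrier aggregation or more antennas are added, is why dark fiber has effectively been a prerequisite for C-RAN.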

But there is an increasingly strong reason that is changing that calculation.

Most of the advances in signal processing that make LTE-A and 5G interesting rely on much tighter coordination between basestations. Whether they are called CoMP, or macro-diversity or 3D MIMO or beam-shaping, they all rely on fast, low-level communication between different sites. This is impossible with "intelligence at the edge" but relatively easy with a centralized approach. (See Sprint Promises 180Mbit/s 'Peaks' in 2015 for more on recent advances in multiple input, multiple output antennas.)

Hence the renewed focus on centralized solutions: whilst before the economics were merely intriguing, these performance and spectral efficiency gains may now make it compelling.

There is the twist that maybe a "halfway" solution would be optimal. This would perhaps have some signal processing in the radio, to reduce the data rate needed on fronthaul — and use something easier and cheaper than dark fiber — while still getting the pooling economies and signal processing benefits. (See 60GHz: A Frequency to Watch and Mimosa's Backhaul Bubbles With Massive MIMO.)

This tension between the edge and the cloud will be one of the more interesting architectural choices facing 5G and is something 3rd Generation Partnership Project (3GPP) and 5GPPP are looking at, as is the Small Cell Forum Ltd. (See 5G Will Give Operators Massive Headaches – Bell Labs.)

But it might be an ironic twist if the architecture that becomes 5G is back to the "some at edge, some in core" we had with GSM or 3G, and we re-invent Abis and Iub for a new generation. [Ed note: Abis is the interface that links the BTS and the BSC in a GSM network; Iub links the Radio Network Controller (RNC) and Node B in a 3G UMTS network.]

2G and 3G are dead, long live LTE

Earlier this month, Verizon CFO Fran Shammo finally confirmed that the long-delayed launch of voice over LTE (VoLTE) on the operator's network will happen in Q4 of this year.

This signals a turning point for the technology; it's been a long and slow road to get here, but we're finally at the point where it is starting to infiltrate the mainstream consciousness.

Both AT&T and Verizon have committed themselves to offering phones that can take advantage of the new VoLTE technology by Q4 – and I'd hazard a guess that it will be a standard feature in both Apple and Samsung's latest-generation phones. This in turn will undoubtedly mean their competitors are not far behind with their own offerings.

So far no real surprises. The more interesting question, though, is when will we see the first LTE only devices? After all, many operators and handset manufacturers have made no secret of their desire to turn off 2G or 3G networks.

For the operators, supporting these now-legacy technologies not only occupies valuable spectrum but also adds infrastructure rollout and maintenance costs.

For handset manufacturers, the need to support 2G and 3G networks adds modem requirements and costs. These in turn negatively impact battery life and phone size. We've recently seen several new companies emerge offering 'LTE-only' thin modems at very aggressive prices, which no doubt has piqued the interest of manufacturers.

Obviously, switching over entirely to LTE has only been made possible by the introduction of VoLTE. The lack of native voice support has meant that circuit-switched fallback has been a requirement until now, and therefore 2G and 3G networks were a necessity.

Another key barrier up to now has been LTE coverage. Obviously, until this catches up, we’re unlikely to see any operator in a hurry to offer handsets that only support LTE, as this would severely impact their customers’ experiences.  But, as we saw in our recent work for the Scottish Government, the speed with which LTE has rolled out means it won’t be long until it catches up – our estimates put indoor 4G coverage in Scotland at 95% by the end of 2015.

Verizon originally forecast the introduction of LTE-only phones to their network by the end of 2014, a prediction that raised more than a few eyebrows. Their updated forecast now pushes this out to early 2016.

I think this is not only likely, but perhaps a necessity; should they wait any longer, the ecosystem will be in place for a competitor to take advantage of their delay.