Ofcom Spectrum Advisory Board Gains New Members

I previously gave details of this Board – better known as OSAB – when I was appointed to it (see here). OSAB has proven to be extremely interesting. As well as the many stimulating discussions around challenges posed to us by the Ofcom board, one of the best things has been the chance to meet the diverse members of OSAB, including technical folks, economists and sociologists.

So it’s great to have some new members to work with, announced today. These are as follows:

  • Linda Doyle – Associate Professor in the Department of Electronic & Electrical Engineering at Trinity College, University of Dublin.
  • Philip Marnick – currently Chief Technology Officer of SpinVox, a speech technology company.
  • Brigadier David Meyer – currently the Deputy Chief Information Officer at the Ministry of Defence in London.
  • Gavin Young – currently Chief Architect for Access within Cable & Wireless (C&W).

I’ve had the pleasure of working with Gavin a little in his role as Technical Chair of the Broadband Forum, and I’m looking forward to meeting the other new members.

For more information, see the OSAB website and particularly the latest annual report, which gives some sense of the range of topics we have discussed recently.

Simon

Sixteen paths to mobile enlightenment…


It used to be that operators came in clearly-defined categories, as different from one another as reptiles are from mammals. Fixed operators had fixed lines and telephone exchanges and sold fixed-line telephone services. Mobile operators had spectrum and base stations and sold mobile telephony. Cable operators had coaxial cables and sold television services. Then mobile operators started selling wholesale minutes to other operators, who sold them on to end customers under a different brand, and the MVNO was born. Mobile minutes started to displace fixed-line minutes, so fixed-line operators became MVNOs, adding mobile services to bundles of television and broadband services. More recently, operators have started to outsource their network operations and to share their networks with other operators, so that even ‘real’ mobile operators may no longer own the network they use to deliver services.

This led us to wonder how many different ways operators could deliver mobile services, and whether all of those ways had been properly identified and investigated. Clearly there are some essential assets which are needed to deliver a service, and any aspiring operator must somehow gain access to a complete set of those assets. These could be characterised in various ways, but we think a helpful approach is to consider the following four essential ingredients:

  • Spectrum: Licences which permit mobile services, in bands for which handsets are widely available, are essential to delivering a viable service.
  • Sites: In order to deliver a mobile service which provides coverage, wide area mobility and a large amount of capacity, mobile operators need a portfolio of sites in appropriate locations and with associated rights to deploy base station equipment.
  • Network: The mobile network itself is clearly an essential ingredient, composed of radio, transport and core elements.
  • Customers: It may seem obvious, but a large and willing customer base has to be available – although the mobile operator’s relationship with those customers may not necessarily be direct, as in the case of MVNOs.

So any organisation with at least one of these four essential elements can play a role in delivering mobile services. Traditionally, all of these needed to be associated with a single mobile operator, but recent history has demonstrated that they can be distributed between different organisations, so long as the ‘joins’ between the different elements are sufficiently seamless that customers ultimately get a good service.

In fact a mobile apps developer could even deliver a service without holding any of these elements at all. With each of the four elements either held or not held, there are 16 possible types of player, illustrated in Table 1 and enumerated in the sketch below. This includes organisations such as spectrum band managers who may not actively run any part of a service, but have an asset (a spectrum licence) which they can make available to another operator under suitable terms. It’s interesting to note that examples of many of these are hard to come by: does this mean there is no value in such an offering, or just that no-one has thought to offer it?
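
The sixteen types fall out of simple combinatorics: each organisation either holds or does not hold each of the four ingredients, giving 2 × 2 × 2 × 2 = 16 combinations. Here is a minimal Python sketch of that enumeration; the asset names and the printed ordering are ours for illustration and will not match the numbering used in Table 1.

```python
from itertools import product

# The four essential ingredients discussed above; names are ours, for illustration.
ASSETS = ["spectrum", "sites", "network", "customers"]

def operator_types():
    """Enumerate every combination of assets an organisation might hold:
    each asset is either held or not, giving 2**4 = 16 types in total,
    including the 'none held' case (e.g. a pure apps developer)."""
    return [
        {asset for asset, held in zip(ASSETS, flags) if held}
        for flags in product([False, True], repeat=len(ASSETS))
    ]

if __name__ == "__main__":
    for i, owned in enumerate(operator_types(), start=1):
        label = " + ".join(sorted(owned)) if owned else "(no assets - e.g. apps developer)"
        print(f"{i:2d}: {label}")
```

The combination holding all four assets corresponds to the conventional vertically integrated operator, while the empty set corresponds to a player, such as an apps developer, holding none of them.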

Table 1: 16 forms of operator

Of course, all of these players (with the exception of the conventional vertically integrated operator – #1 in the table) need to form partnerships with others in order for a complete service to be delivered. Figure 2 shows what we think is the complete set of viable combinations.

Figure 2: Viable combinations for delivering a full service

The first combination – a wholesale operator with an MVNO – is well known. Some of the others seem outlandish today, e.g. a site provider with a band manager, a network-only operator and an MVNO. However, the concept of multiple operators sharing the same network and then outsourcing it all to a third party would have seemed like a flight of fancy just a few years ago…
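
As a rough illustration of how a set of complementary combinations like Figure 2 can be derived, the sketch below assumes that a viable combination is a group of players whose asset holdings are disjoint and together cover all four ingredients – our working interpretation, since the figure itself is not reproduced here. That is equivalent to enumerating the set partitions of the four assets.

```python
ASSETS = ["spectrum", "sites", "network", "customers"]

def partitions(items):
    """Yield every way of splitting `items` into non-empty groups,
    each group being the set of assets held by one partner."""
    if not items:
        yield []
        return
    first, rest = items[0], items[1:]
    for smaller in partitions(rest):
        # Add `first` to each existing group in turn...
        for i, group in enumerate(smaller):
            yield smaller[:i] + [group + [first]] + smaller[i + 1:]
        # ...or give `first` a partner of its own.
        yield smaller + [[first]]

if __name__ == "__main__":
    combos = list(partitions(ASSETS))
    for n, combo in enumerate(combos, start=1):
        print(f"{n:2d}: " + "  +  ".join("/".join(group) for group in combo))
    print(f"{len(combos)} complementary combinations in total")
```

Run on the four ingredients this yields 15 groupings, from the vertically integrated operator at one extreme to the all-singletons case at the other (the band manager plus site provider plus network-only operator plus MVNO example above). Whether every one of them appears in Figure 2 depends on which the figure treats as practically viable.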

The practical significance of all this is of course that there may be more ways to compete in the mobile market than might at first be apparent, and indeed that there might be ways to deliver niche mobile services – such as those for limited customer segments or geographical areas – with a lower barrier to entry than traditionally imagined.

We’d love to hear your views, and particularly real examples of these potential patchwork mobile service partnerships around the world.

Wireless Communications: A wrong turn taken many years ago leads to a dead end?

Introduction

Some years ago I bought my daughter a wooden train track. The starter set included a small amount of track and a couple of engines. She played with it briefly and then lost interest. I concluded that more track and other items like stations were needed to make it sufficiently interesting to engage her attention and carried on doing this for quite some time before I finally realised that it was nothing to do with the amount of track – she just was not interested in train tracks.

My thesis is that the wireless communications industry similarly made some poor decisions many years ago. Just like my daughter, end users did not refuse the new services offered but quickly lost interest in them. Only now are operators and manufacturers starting to realise that the path they have been on was the wrong one. This paper discusses the decisions made and the resulting outcome, and then suggests the path that should have been followed and how we might redirect our efforts to get back on track.

Historical decisions

The main developments in wireless communications have been within the cellular industry. Here the industry has progressed through a series of “generations” from 1G to 3G, with 4G now being widely discussed. The decisions made when designing the next generation are key – they affect the services that can be offered, the cost of the network and even aspects such as the battery life of handsets. The timing of the generations is also important in that it affects the need for operators to invest in new spectrum and technologies. It is in making these decisions that the wrong turn was taken. This section explains the decisions that were made and then subsequent sections discuss why these were inappropriate and have led us to a dead end.

The first generation was a mix of different analogue standards with a range of problems including security, lack of roaming and limited capacity. The standard that was developed in response to this was GSM (there were other standards in the US and Japan but these eventually became sidelined). GSM has remained secure to this date, provided enough capacity for all and facilitated roaming. In addition, almost as an after-thought, the short message service (SMS) was added which became the hugely popular texting service (something we will return to later). Over time, evolutions were added to enable packet data transmission which was generally more efficient for data services.

Once the team that had been working on the second generation completed their task, their attention naturally turned to the next challenge. Since 2G had followed about a decade after 1G there was a natural supposition that 3G would follow a decade later. However, there was nothing obviously wrong with 2G that needed fixing, so the developers of 3G focussed on making it do the same things as 2G, only better. Better to them meant faster, so 3G was designed to deliver much higher data rates than 2G. It was also somewhat more spectrum efficient, although the gains here were relatively small (perhaps a factor of three for voice, somewhat higher for packet data). This was one of the key decisions that we will return to – that “better” meant “faster”.

With 3G introduced, if somewhat shakily, the bandwagon rolled on, looking at what would be required for 4G. Just as with the 2G to 3G transition there was little that obviously needed fixing, so the same teams concluded that they would make 4G even better than 3G – again predominantly by making it even faster. As before, some small spectrum efficiency gains were anticipated, although these were even smaller than in the previous transition as technologies came closer to fundamental limits. For 4G, the assumptions about what “better” meant that had been made when 3G was designed were simply taken to be true.

So it was a fundamental decision taken in the mid 1990s that wireless systems needed to offer ever higher data rates that has broadly placed us on the path we have followed since then. But there is little evidence that users value higher data rates and plenty that they value other things. This divergence between what the system designers think “good” looks like and what the end users want has been growing over time and is now leading to serious problems. These are explored in the next section.

The current position

Despite ever “better” technology, the current position of the wireless communications industry is not generally healthy. Most wireless operators are no longer seeing growth in revenues and indeed many are now seeing a small fall each year as competition drives down call costs for subscribers. This has led many operators to adopt cost-cutting measures. The manufacturing industry is also in poor shape. Nortel is bankrupt, many other manufacturers have merged and few are currently profitable. Many are reliant for future growth on 4G deployments of technologies such as LTE or WiMAX, but these deployments might be some years away. Although it is almost impossible to ascertain, it seems unlikely that many operators who invested in 3G spectrum and networks have yet recouped their investment, almost a decade after the first 3G systems were launched.

Other areas of wireless are somewhat healthier. In the short-range area, WiFi and Bluetooth chipset sales continue to grow and become embedded in ever more products. The number of WiFi hotspots is still growing and they are being used to an increasing extent. Satellite communications remain stable, although they address only a small niche market segment. Broadcast of TV and radio is also stable, although there are some concerns over the funding models for broadcasters – but these are not predominantly technical.

Paradoxically, 3G does appear to have managed a recent success in the form of wireless data, or “3G dongles” as they are often known. While WiFi has demonstrated to users that there can be value in downloading email and surfing the web when away from the home and office, WiFi coverage is erratic and often requires user intervention to log into each different zone. It appeared to some that 3G might be able to offer an alternative. With data rates high enough that downloads happen fast enough for most, and no need to log into different zones, it does appear to solve the “data roaming” problem. But this is an illusion. For while 3G does indeed solve the problem, it is unable to provide enough capacity at a low enough cost. A hint as to why this might be is to note that between 2G and 4G data rates have risen in the region of 100-1,000 fold but capacity has only gone up around 5-10 fold.

It is worth dwelling on this problem a little more because it exposes some of the fallacies in the previous decisions. Current 3G networks in the UK (and other countries are likely similar) can support data transfers of around 1 GByte per user per month. This is adequate for occasional email download on the move but rapidly gets used up if there is any video involved or if the connection is used as the primary household broadband connection. Beyond this level cellular networks become congested and the data rates users can achieve suffer substantially. There are some ways to enhance this. One is to acquire additional spectrum; however, this is costly in terms of spectrum fees, infrastructure upgrades and the need to subsidise dongles that can work on the new frequencies. Another is to deploy more cells, but this is again expensive and becoming increasingly difficult in crowded areas where suitable locations are hard to find.
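
To see roughly where a figure like 1 GByte per user per month comes from, here is a back-of-the-envelope sketch. Every input below – sector throughput, busy hours and the number of subscribers sharing a sector – is an illustrative assumption of ours, not a figure from this article or from any operator.

```python
# Rough estimate of monthly data per user on a loaded 3G sector.
# All inputs are illustrative assumptions, not measured operator figures.

sector_throughput_mbps = 3.0   # average achievable sector throughput (assumed)
busy_hours_per_day = 10        # hours per day the sector is usefully loaded (assumed)
days_per_month = 30
subs_per_sector = 300          # active subscribers sharing the sector (assumed)

bits_per_month = sector_throughput_mbps * 1e6 * 3600 * busy_hours_per_day * days_per_month
sector_gb_per_month = bits_per_month / 8 / 1e9          # bits -> bytes -> gigabytes
gb_per_user = sector_gb_per_month / subs_per_sector

print(f"~{sector_gb_per_month:.0f} GB per sector per month, "
      f"~{gb_per_user:.1f} GB per user per month")
```

With these assumed figures a sector carries roughly 400 GB a month, or a little over 1 GB per subscriber – the same order of magnitude as the quota discussed above – and it is easy to see how a handful of video-heavy users or home-broadband substitutes would exhaust it.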

In order to increase capacity, then, operators will need to spend more money. This only makes sense if they can charge enough for the data usage to justify the cost. Herein lies the problem. Users will pay quite a lot for the initial connection, but as the volume of data increases and the data rate grows, the amount that they are prepared to pay per bit of data transferred falls. At present users pay only around 1% as much per bit for data transfer as they do for voice, despite the fact that a bit of data requires exactly the same network resources as a bit of voice. Only by reducing the price per month have operators made wireless data successful, but the price point they have reached is insufficient to justify new investment in the network. Users, then, do quite like the idea of high-speed wireless data, but not enough to pay at the levels necessary to make this an attractive and sustainable business for the operators.
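
The “around 1%” figure can be reproduced with some simple arithmetic. The tariff and codec numbers below are illustrative assumptions chosen only to show the shape of the calculation, not quoted prices.

```python
# Compare the effective price per bit of voice and of mobile data.
# Prices and rates below are illustrative assumptions only.

voice_price_per_minute = 0.10   # currency units per voice minute (assumed)
voice_codec_kbps = 12.2         # narrowband speech codec rate (assumed)
data_price_per_gb = 10.0        # currency units per GB of mobile data (assumed)

voice_bits_per_minute = voice_codec_kbps * 1000 * 60
voice_price_per_bit = voice_price_per_minute / voice_bits_per_minute
data_price_per_bit = data_price_per_gb / 8e9    # 1 GB taken as 8e9 bits

ratio = data_price_per_bit / voice_price_per_bit
print(f"voice: {voice_price_per_bit:.2e}/bit, data: {data_price_per_bit:.2e}/bit, "
      f"data is {ratio:.1%} of the voice price per bit")
```

With these assumed numbers a bit of data earns the operator a little under 1% of what a bit of voice does, which is the scale of disparity described above; the exact percentage obviously moves with the tariffs chosen.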

We are left in something of a dead end. Revenue into the industry in the form of subscriber ARPU is falling, leaving operators with shrinking revenues and putting downward pressure on the rest of the industry. The solution that the industry proposes to this is faster 3G and ultimately even faster 4G, but while users generally like the idea of faster, it is only of marginal value to them and they will not pay much for high-speed wireless data – not enough to justify further investment. Few are prepared to admit it, but 3G looks like it was mostly a mistake and 4G looks even more problematic. As manufacturers go bankrupt and operators increasingly look to merge, where does wireless go from here?

Where we need to be

The greatest success in the wireless industry in recent years is Apple’s iPhone. In particular, the fact that the iPhone was initially launched as a 2G device is instructive. Indeed, another of the success stories of recent years – the BlackBerry – is also a 2G device. The iPhone has since been upgraded to 3G, and this does appear to have somewhat improved the user experience, but nevertheless this clearly shows that end users value something other than data rate. Of course, all things being equal, higher data rates are better, but they do not add all that much value for most users.

The iPhone succeeded through a much improved user interface that enabled users to do much more with their phone. Its popularity was further enhanced by the “App Store”, which enabled users to select from an enormous range of different games and applications and download them, typically for a small one-off payment. It is notable that Apple and others succeeded where the operators failed. Operators have been trying to introduce new services for decades. These include WAP, group calling, video calling, picture messaging, mobile TV, location-based services and more. They have almost all failed, for a range of reasons. These include the fact that the operators wanted to charge a recurring per-usage fee when users wanted to pay a one-off price, and the operators’ desire to roll out a service consistently across the thousand or more handset variants operating on their networks, which tended to bring the service experience down to the lowest common denominator as well as slow its introduction.

Despite having high data rate channels at their disposal, users continue predominantly to use apparently highly inferior approaches such as texting and, most recently, Twitter – a text-only solution with limited message size. All of this suggests that while the designers of 3G might have thought that “better” meant faster, this was far from what end users understood by “better”. So what might better actually look like?

A first conclusion is that “better” is not necessarily more. If this were true then voice calls would have been replaced by video calls giving not only voice but images as well. Instead, if anything, voice calls have been replaced by texting, emails and tweets. Less, it appears, is often better than more.

A second conclusion is that variety is generally a good thing. New methods of communication such as Facebook have been embraced rapidly while older ones like texting have not declined. Humans like a wide range of communications mechanisms to select from depending on the context. Some things, like rejection of a suitor, are done less painfully from a distance over a very “thin” communications channel. Sometimes a text to say “I love you” is worth much more than a video call to say the same thing.

A third is that users are very different. There are many thousands of applications in the App Store. Some are very niche but very valuable to a subset of users. A wide range of simple applications and services is better than a narrow range of highly developed applications.

Perhaps above all else, though, the conclusion that stands out is that designers of communications systems do not really understand how their solutions will be used and what users will see as “better”. Making any guess as to what “better” might be is likely to fail, while “worse” (e.g. texting) may actually turn out to be just what the user wants. So faster is not better (or at least, not much better), what is better is unpredictable, and providing users and developers with the tools to play around, be inventive and have a wide range of channels at their disposal is more likely to result in a good outcome. To paraphrase: if we’d concentrated more on developing iPhones and less on 3G we might be in a much better place now.

So, to answer the question set at the start of this section, where we need to be is an environment with a wide range of communications mechanisms which are flexible and relatively low cost, and which are amenable to new services being developed and to experimentation by end users. Technically, this might mean a range of large-cell networks (e.g. 2G) and small-cell networks (e.g. WiFi), as well as fixed networks of course, which devices can readily connect to and which offer a simple standardised interface. It means devices with a small number of standardised operating systems onto which applications can be readily downloaded, so that developers can write just one version of their application. It means a wide range of extra features in the handset, such as location and cameras, to provide more “hooks” for developers to experiment with. And it means a flexible value chain where service providers can readily integrate data transmission across multiple networks and develop services making use of multiple resources.

How we can get back on track

To summarise the discussion so far: wireless communications is currently not in a healthy position. Operators and manufacturers are facing increasing pressure, to the extent that they are merging or going bankrupt, and yet the next generation of technology only threatens to make things worse by increasing expenditure without providing substantial end-user benefit. This sorry state is all a result of assuming that “better” meant “faster”.

Where we need to be is a mix of different low-cost networks that offer flexibility and enable experimentation. The good news is that we already have most of the technology and investment that we need – the fact that the iPhone has been so successful is ample evidence of this. It is less a case of changing the technology and more one of changing the structure.

A first step for the operators is to bring to a halt most of their infrastructure expenditure. There is no evidence that 4G will be any more successful than 3G nor that further investment in 3G will be any more beneficial than the investment to date (back to the opening discussion about the train track). But there is plenty of evidence that the existing 2G and 3G networks can carry data in a flexible manner that allows most services to be implemented. Halting expenditure will also be the first step to restoring profitability.

The next step is for the operators to separate into network and service provision elements. This will open up the value chain, making it easier for service providers to put together converged offerings spanning multiple networks – fixed, mobile and WiFi. This will allow, for example, simple roaming across home, office, WiFi and cellular networks, enabling lowest-cost transfer of data in a manner that will meet the needs of most while minimising overall network cost. It also enables the network elements to merge, outsource, share masts and generally reduce the costs of their operations, further aiding the operators’ business cases.

Along with this is the need for application development environments where developers can write once for a range of different networks and devices. This requires operators to provide standard interfaces into network elements such as their location databases, and mobile devices to standardise on a small number of operating systems which are able to interwork, in the way that the same document can be viewed on an Apple computer or a PC. Initiatives are underway in these areas but would benefit from greater commitment.

The final step is an acceptance that the value chain comprises network operators who run networks and provide wholesale bitpipes, service providers who aggregate data provision across multiple networks and provide customer care, and application developers who put together the applications that run on top of all these. This enables the greatest flexibility allowing many different communications channels to be provided and a wide range of applications to be made available.

Conclusions

Wireless communications can provide immense value to users. A range of new wireless services such as travel assistance and interworking with home networks and appliances would add substantially to this value. However, at present the industry is on the wrong path. It is fixated on ever higher data rates – a fixation born of a time in the 1990s when work started on 3G but there were few obvious problems to fix with 2G. While higher data rates are no bad thing, they add little value for the end user and do not help provide any of the services which might revolutionise the role of the mobile phone. Further, if users try to make widespread use of these data rates the networks will rapidly run into capacity problems. This fixation has been steadily leading the whole wireless sector down the wrong path, to the extent that many operators and manufacturers now see a problematic future.

Happily, getting back on track is not overly difficult. It requires operators to forego plans to deploy new networks and instead concentrate on opening up their networks, perhaps through structural separation of the network and service provision elements, and enabling a wide range of applications to be developed and deployed by others. It is the structure of the sector we need to address most, not the technology it might deploy.

Highlights from Femtocells World Summit 2009

I recently chaired and presented at Femtocells World Summit, the biggest public femtocell event of the year, on behalf of Femto Forum. Attended by over 300 people, it was a fantastically informative and newsworthy show. Some highlights included:

  • Vodafone’s announcement of their UK launch of femtocells, or the Vodafone Access Gateway as they call them. Since 1st July these have been available from www.vodafone.co.uk/gateway. This video interview highlights the key aspects of their launch:
  • Details of the latest achievements by the Femto Forum:
  • Demonstrations of a range of applications specifically enabled by femtocells, beyond those which would be available in traditional mobile networks. See for example these demonstrations by ip.access and Airvana.
  • The Femto Forum Femtocell Industry Awards, where we hosted a gala ceremony to announce the winners, full details of which are available here.

Distinguished International Spectrum Manager Joins Real Wireless Team

Real Wireless is proud to announce the appointment of Mike Goddard OBE to our associates team as International Spectrum Policy Advisor. Mike has had a very distinguished career in the field of national and international spectrum management, most recently as Director of Spectrum and International Policy at Ofcom, the UK communications regulator.

We asked Mike about his experiences in this fast-changing field over the last 4 decades.

Q: How did you start in spectrum regulation?

 Mike: Following a student apprenticeship with what was then the General Post Office, my interest in radio communication led me into my first job in frequency planning for broadcasting, at a time when there was no commercial radio in the UK, and we still had 405-line black and white TV at VHF. I later moved into fixed link planning and then into more general frequency policy and international work. My first international experience came within 2 years of leaving university.   

 Q: What do you consider your proudest achievement?

 Mike: As the first Chairman of the CEPT’s European Radiocommunications Committee, I was responsible for introducing widespread changes in the organisation, expanding membership, widening contact with industry, and setting up the European Radiocommunications Office (ERO) in Copenhagen. I am proud of the latter in particular as the ERO opened, fully funded and staffed, in less than 12 months from the initial committee decisions. But since then I have taken a lot of pride in leading many UK delegations to ITU World Radio Conferences and the Regional Conference in 2006 in which we achieved 100% of our objectives and established a sound basis for digital switchover in the UK, Europe and Africa.

Q: You’ve worked for several government agencies dealing with spectrum over the years, and seen significant shifts in the approach to spectrum regulation around the world. What are the major changes?

Mike: There have been many. Apart from the increasing complexity resulting from new technologies and applications, and from competition in many sectors, there has been a general move towards openness and dialogue with industry. There has also been a greater convergence of spectrum, telecommunications and broadcasting policies. Perhaps the biggest single change is the recognition of the economic and strategic value of spectrum; it is no longer considered just a specialised technical subject but an essential and major contributor to national economies, public services and a wide range of industries. This has led to increased emphasis on allowing flexible use of the spectrum and allowing the market to make important decisions at least in some sectors. Of course the extent of these changes, and the pace of change, varies considerably from country to country and this in itself poses major challenges for international agreement.

Q: What will you miss most about working for Ofcom?

 Mike: The people (an enthusiastic, highly skilled and experienced bunch of people), the dynamics and the rich variety of subjects, not only in spectrum management.

 Q: How was your trip to Buckingham Palace to collect your OBE last year?

Mike: In a word: “great!” Apart from the immense pride in being honoured and collecting the award with my wife and grown-up children, I had a brief but very pleasant conversation with Prince Charles about digital switchover and retirement!