Mobile coverage in rural areas – a step in the wrong direction?

The UK currently has around 54,403 mobile phone masts dotted around the country — many of which stand on land leased to the major telecoms companies by local landowners.

Around 4,000 leases are due to expire this year, which could have serious consequences for telecoms companies and consumers alike. In the absence of any regulation, lease renewal negotiations could see landlords demand significant rent increases in a large number of cases. Telecoms companies would then need either to pass this cost on to consumers, through more expensive tariffs, or to remove macrocells completely and create coverage or capacity gaps. The Telegraph recently wrote an article on this topic.

Macrocells are still vital to mobile coverage

Despite advances in small cell technology and Wi-Fi calling, macrocells remain the backbone of the mobile network, delivering the majority of the UK’s coverage and capacity.

There’s no alternative to macrocells, either, that doesn’t involve some form of relationship with a property or asset owner. In-building connectivity solutions like small cells and DAS do improve coverage and capacity in homes, offices and public buildings, but they will never replace macrocells entirely and do not provide wide area coverage in towns and around the countryside.

Operators need to protect their investment

Vital infrastructure is often expensive to provide and macrocells are no different. Operators naturally want to keep hold of their existing assets, given they’ve invested heavily in constructing macrocells in the first place.

Operators and landowners both know the difficulties of finding alternative sites for macrocells and obtaining planning permission, and the time and cost associated with doing so would be significant. Whilst operators could resort to invoking code powers, this is not a step that would be taken lightly, but nor can it be dismissed entirely as an idle threat.

How rent rises will affect mobile provision

The first impact of rent rises is likely to be felt by users in those locations where high costs force MNOs to remove macrocells, resulting in coverage or capacity gaps. Site closures aren’t going to happen overnight, though. MNOs will fight to keep their sites at rents at or below current levels. But if landowners insist on excessive rent increases, users will no doubt bear the brunt through higher tariffs. Most likely, the operators will pass some costs on to users and absorb the majority, but this will mean less investment in new network infrastructure and invariably a negative impact on the digital economy generally.

Can the government intervene?

The story of land rentals is an old chestnut in the mobile industry. The cycle of site acquisition, rental renewals and notices to quit will carry on as long as the mobile industry exists — unless the government is prepared to intervene to help regulate the rental levels that MNOs pay for this essential infrastructure. At the same time, MNOs need to realise that landlords and building owners should not have their genuine development plans for their land or property undermined by MNO macrocells that may have been in place for many years.

The British public already relies heavily on mobile communications and, within a few years, the Police and other emergency services will rely on mobile networks to carry their vital communications. This suggests that this particular debate should be opened up, so that representatives of the various parties (MNOs, property owners and government) can create a solid and sustainable basis for maintaining mobile communications services throughout the UK.

How Real Wireless is shaping the future of wireless connectivity with 5G

Whilst 4G might only just have started to be appreciated by personal and business users, the wireless industry is already awash with discussions about 5G. Whilst Boris Johnson’s prediction that London will have 5G by 2020 is ambitious, it’s a solid bet to say it will start to be rolled out – in some form – in the early part of the 2020s, with a few non-standard networks trialling it before this (at the Tokyo Olympics, for example).

But at the same time, the reality is that the 5G technology isn’t actually defined yet. To make matters more complicated, there’s little appetite for rolling out an expensive new generation of cellular technology that only offers the “usual” higher speeds and bigger capacity benefits we have come to expect.

Instead, 5G is aiming to be the first wireless generation that is designed to explicitly cater to the needs of specific vertical industries. These could be anything from the emergency services, to broadcasting, smart highways, and utility networks.

As a result, the industry is fully aware that the end technology will need to be hugely flexible: capable of providing wide-range connectivity to wireless sensors in remote locations, through to the short-delay communications required to meet the needs of M2M. There are also niche use cases, such as in hyperdense venues like stadiums, where it needs to be capable of handling tens of gigabits per second of data.

This in turn requires new, more flexible network architectures at all levels. The core network needs to be able to route traffic quickly and efficiently, adapting to suit the current application and available transport networks. The radio network needs to be flexible enough to suit the needs of immensely different applications, some demanding decades of battery life, others gigabit speeds or millisecond latency…

…fingers crossed it’s not having to provide all of those at the same time!

To meet this need, and to ensure that 5G becomes a timely reality, Real Wireless is playing a key role in the research it requires, via initiatives which include:

1. The EC socio-economic analysis – Catering to all these needs could prove immensely expensive. It’s therefore particularly important that we closely examine the business case for the new business models 5G could enable – and the associated social and economic benefits these in turn could provide.

In May, the European Commission launched a 12-month study into the socioeconomic benefits of 5G. The study will help provide a better understanding of the potential impact that 5G will have in a variety of industries including health and travel.

After working with the European Commission on several other projects, Real Wireless was selected, along with three other key independent project stakeholders, to perform the analysis for this assessment.

The study will include a series of stakeholder hearings starting on 22nd September and a workshop on 19th October.

2. 5G Architecture research – The technological elements of 5G are – and will continue to be – the subject of intensive international research over the next few years. Real Wireless is contributing to this research, some of which is being funded by the EU – to the tune of €700 million, no less – including as part of its 5G PPP programme.

A great example of our involvement in this work is our recently announced 5G NORMA project. In this piece of work, we are working to identify the optimum architectures for 5G – you can find more details on this here.

3. Membership of research centres – The 5G Innovation Centre (5GIC) at the University of Surrey is the UK’s only research centre dedicated to the next generation of mobile communications.

Real Wireless is now a pioneering SME member of the centre and will advise it on regulatory, technical and business challenges — driving the delivery of a mobile communications network capable of meeting tomorrow’s needs.

We have also been contributing to the work of the world-renowned CONNECT research centre at Trinity College Dublin.

With the upward trend in mobile device adoption levels, 5G will become the crucial network underpinning almost every application, so the work we do now is crucial to ensure the infrastructure is ready when the world needs it.

It’s therefore important to us that we continue to play a key role in the development of the technology – both from an economic and technological standpoint.

Our work is also not without direct benefits for Real Wireless customers. Our insight into the development process allows us to provide truly informed advice both to wireless industry players who wish to establish a position on 5G, and to our wireless user customers who want to be sure that they are best placed to make the most of 5G’s potential to address their particular needs – at a time that is right for them.

It’s time for operators to keep the vendor community on track

With the dust starting to settle on the Nokia and ALU news, it’s clear that this won’t be the last M&A story we’ll be talking about this year. Mergers amongst operators are becoming rife and the moves in the vendor sector mirror this.

We’ve already talked about the good and bad sides of operator mergers with one particular aspect being the impact on innovation. With multiple competitors watching every move in an attempt to capitalise on any mistakes, can we blame any operator that doesn’t want to experiment with an unproven technology?

In the vendor world, the picture is much more complicated as operators need both global scale and interoperability alongside innovation and new ideas. So what is the impact of fewer vendors on operators and what does this mean for the industry?

The importance of scale for the mobile industry cannot be overstated. Products need to be developed for a global audience and, more importantly, in line with global standards. So on face value, bigger vendors with increased reach and larger R&D teams look like good news.

As the industry gears up for 5G, we need lots of dynamic, fresh thinking and innovative companies driving new standards and approaches. Bigger companies are not always best placed to deliver this.

Additionally, less choice brings risks for operators, who may seek to second-source, but are often beholden to investments in single-vendor ecosystems. In particular, it’s a guilty secret of the industry that standards give no guarantee of inter-vendor interoperability: indeed this is the exception rather than the norm.

So where does this leave operators?

Faced with a smaller pool of vendors to choose from, operators need to wield their collective energies to ensure that there is a mix of vendors of all scales, by insisting on open interfaces and real interoperability in networks, and by encouraging small developers, not just in phone apps but deep in the heart of the network as well. This way we’ll continue to see the sort of innovation that drives our industry forwards, alongside the benefits that come from vendors with a global reach.

Will we really have Wi-Fi on trains by 2017?

Prime Minister David Cameron announced today that all trains in the UK should have free WiFi from 2017, partly helped by £50M of government funding.

At Prime Minister’s Questions he said the plans would cover services operated by TSGN, Southeastern, Chiltern and Arriva Trains Wales. (It isn’t clear whether only those operators will receive the funding, or whether only they are expected to offer the service for free.)

But another facet is that this is likely to become a prerequisite of tender submissions: TOCs will have to offer Wi-Fi as part of the criteria in the next round of franchise submissions – and will need to compete on the level of service they offer.

It is clear the operators see the benefits: Wi-Fi is a great way to make train travel more productive and hence more attractive than driving.

A spokesman for the Rail Delivery Group, which represents Network Rail and train operators, said: “It is good news that even more rail passengers will be able to benefit from Wi-Fi on their train. Rail plays a crucial role in keeping people connected to friends, family and jobs and the wider rollout of Wi-Fi on the rail network will mean people can make even better use of their time on the train.”

But saying people should do it is the easy bit: actually making this work is extremely challenging, and this is an area where Real Wireless has done a lot of work.

Trains are an extremely challenging environment for on-board connectivity, whether via Wi-Fi or small cells. For a start, there are very strict safety standards, which complicate installations. But most challenging is the issue of backhaul: trains move fast, through difficult terrain (tunnels, cuttings) and often through remote areas. Getting that connection to work reliably is not trivial, and might need specialist links or dedicated spectrum.
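To put the speed challenge in rough numbers, here is a back-of-envelope sketch. All figures are hypothetical, chosen purely for illustration, not taken from our studies:

```python
# Hypothetical illustration: how often a fast train crosses trackside cells.

def handovers_per_hour(speed_kmh: float, cell_spacing_km: float) -> float:
    """Handover rate for a train passing trackside sites at a fixed spacing."""
    return speed_kmh / cell_spacing_km

# A 200 km/h train with trackside sites every 2 km must hand over the whole
# train's aggregated traffic 100 times per hour, i.e. every 36 seconds.
rate = handovers_per_hour(200, 2)
dwell_s = 3600 / rate
print(rate, dwell_s)  # 100.0 handovers/hour, 36.0 s dwell time per cell
```

Every one of those handovers carries the combined traffic of hundreds of passengers, which is why the trackside network design matters so much.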

That makes it critical to have an appropriately designed trackside network, on-train equipment and spectrum.

We have worked on these issues for a number of clients, and have some in-depth expertise in this area.

An example in the public domain, carried out with Mott MacDonald for the Rail Safety and Standards Board (RSSB), considered both spectrum and technology: “Supporting the Rail Industry’s Wireless Communications”.

We analysed spectrum (the characteristics of different frequency bands, the status of regulatory policy) and technology (capabilities that would be suitable to both operational and passenger services, and both Wi-Fi and LTE).

We have done a number of other projects on train communications and how to make them work, reliably and cost-effectively.

A few things to consider:

  • It may seem surprising, but one of our findings was that it is not actually as expensive as often thought to install mobile equipment on all the carriages of all the trains in the country – if it’s done in a coordinated fashion.
  • What’s more, that the cost is massively outweighed by savings in necessary trackside infrastructure, given the right use of technology and spectrum. But too many people are not doing that right.
  • There are significant benefits from operational use: looking only at passenger use omits many of the opportunities for telemetry, maintenance and other in-house savings.
  • If you are looking at Wi-Fi, you should consider cellular service at the same time. Including a small cell to improve cellular connectivity is a small incremental cost but has a major benefit for passengers in serving all devices with both voice and data services.
  • People need to anticipate the future and plan ahead. These solutions need to be robust with the right technology, capacity and QoS to support the number of travellers using the service – especially as passenger numbers rise by 2017.

For some more details please contact us, or see our white paper “The business opportunities for wireless in transport”, which explains how network operators can invest in infrastructure to support better connectivity and new business opportunities on trains and other modes of transport.



Clouding the Edge for LTE-A and Beyond

This blog post was originally published over at Light Reading.

One of the areas of increasing discussion about LTE-Advanced (LTE-A) and especially around the yet-to-be defined 5G standard is the tension between the “edge” and the “cloud.”

Over the last few decades in telecom, the powerful trend has been to push intelligence out to the edge. David Isenberg wrote a very good — but oddly not as widely known or distributed as it deserves — essay on this way back in 1997: The Rise of the Stupid Network.

We now have edge routers, we have gateways in our phones, and new smartphones have “intelligence” onboard in a way landline phones never did.

In wireless networking, a few years after Isenberg’s essay, broadband was proving this logic, with TCP/IP pushing intelligence out to the edge. In 2G the smarts were quite centralized, with a basestation controller (BSC) in the network, but with 3G that focus shifted and the network started to flatten out a bit. (See Mobile Infrastructure 101.)

Bell Labs, meanwhile, had the idea of putting the router and stack all the way into the basestation with the snappily named BaseStationRouter. That of course then became the 3G small cell, with the medium access control (MAC) and stacks moving into the NodeB with Iuh replacing Iub, and then onto “flat architecture” of LTE. (See Telco in Transition: The Move to 4G Mobility.)

So small cells represent the clear case of intelligence to the edge — some people call this the Distributed RAN (D-RAN). (See Know Your Small Cell: Home, Enterprise, or Public Access?)

The advantages are that networks become better: We put capacity exactly where we need it. The small cell is responsive and efficient, and we can do things like offload and edge caching, latency is reduced (which improves speed and QoE) and so on. It is a cost effective and intelligent way to make the network better and has been the “obvious” paradigm for the last few years. (See Meet the Next 4G: LTE-Advanced.)

But over the last few years we have seen the reverse trend too.

In computing we have the cloud. Intelligence moving out of the edge into the center: The widespread use of Amazon AWS or Google Cloud to host services, the rise of Chromebooks, cloud-based services like Salesforce, Dropbox or Gmail.

This concept is also being felt in the wireless world, as we have heard more and more about cloud RAN (C-RAN). This is the opposite trend to small cells: Having a “dumb” remote radio head (RRH) at the edge with all the digitised samples sent back over fiber — aka “fronthaul” — to a huge farm of servers that does all the signal processing for the whole network. No basestation and certainly no basestation router. (See What the [Bleep] Is Fronthaul? and C-RAN Blazes a Trail to True 4G.)

Some simple advantages here are from economies of scale: One big server farm is cheaper and more efficient than having the same processing power distributed — electricity and cooling needs at the basestation are reduced for example. A more subtle gain is from pooling, which is sometimes called “peak/average” or “trunking gain.”

In a normal network, every basestation must be designed to cope with the peak traffic it will support — even though other basestations will be lightly loaded at that moment, only to hit their own peaks at some other time. So the network needs worst-case dimensioning at every site, even though on average there is a lot of wasted capacity. In contrast, the Cloud RAN can have just the right amount of capacity for the network as a whole, and it “sloshes around” to exactly where it is needed.
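The pooling gain is easy to see with toy numbers (entirely invented for illustration): if each site peaks at a different time of day, the peak of the summed load stays far below the sum of the individual peaks.

```python
import random

# Toy pooling-gain illustration with made-up traffic: each site has one
# randomly chosen busy hour (load 10), and a background load of 2 otherwise.
random.seed(1)
HOURS, SITES = 24, 50

loads = []
for _ in range(SITES):
    busy = random.randrange(HOURS)
    loads.append([10 if h == busy else 2 for h in range(HOURS)])

# Distributed case: every site dimensioned for its own peak.
sum_of_peaks = sum(max(site) for site in loads)
# Pooled (C-RAN) case: one pool dimensioned for the network-wide peak.
peak_of_sum = max(sum(site[h] for site in loads) for h in range(HOURS))

print(sum_of_peaks, peak_of_sum)  # pooled peak is far smaller
```

With these numbers the distributed design needs capacity of 500 units, while the pool needs only a fraction of that, because the sites never all peak together.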

That is a benefit, but it has not seemed significant enough to persuade most carriers.

The problem has been connectivity: Those radio heads produce a huge amount of data and the connectivity almost certainly requires dark fiber. Most carriers simply do not have enough fiber, and even for those who do it is unfeasibly expensive. So, for most operators C-RAN has so far been economically interesting but not compelling and not worth the cost. (See DoCoMo’s 2020 Vision for 5G.)
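A rough sketch of why those data volumes are so large: a CPRI-style fronthaul link carries raw I/Q samples, so its bit rate is fixed by the sampling rate rather than by the user traffic. The figures below are typical published values for a 20 MHz LTE carrier, used here purely as an illustration:

```python
# Rough fronthaul bit-rate estimate for one "dumb" radio head:
# sample rate x 2 (I and Q) x bits per sample x antennas x line coding.

sample_rate = 30.72e6   # I/Q samples per second for a 20 MHz LTE carrier
bits_per_sample = 15    # per I or per Q component
antennas = 2            # 2x2 MIMO
line_coding = 10 / 8    # 8b/10b line-coding overhead

rate_bps = sample_rate * 2 * bits_per_sample * antennas * line_coding
print(rate_bps / 1e9)   # ~2.3 Gbit/s, for ~150 Mbit/s of user data
```

So each radio head needs gigabits per second of constant-rate transport regardless of load, which is why dark fiber is effectively a prerequisite.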

But there is an increasingly strong reason that is changing that calculation.

Most of the advances in signal processing that make LTE-A and 5G interesting rely on much tighter coordination between basestations. Whether they are called CoMP, or macro-diversity or 3D MIMO or beam-shaping, they all rely on fast, low-level communication between different sites. This is impossible with “intelligence at the edge” but relatively easy with a centralized approach. (See Sprint Promises 180Mbit/s ‘Peaks’ in 2015 for more on recent advances in multiple input, multiple output antennas.)

Hence the renewed focus on centralized solutions: whilst before the economics were merely intriguing, these performance and spectral efficiency gains may make it compelling.

There is the twist that maybe a “halfway” solution would be optimal. This would perhaps have some signal processing in the radio, to reduce the data rate needed on fronthaul — and use something easier and cheaper than dark fiber — while still getting the pooling economies and signal processing benefits. (See 60GHz: A Frequency to Watch and Mimosa’s Backhaul Bubbles With Massive MIMO.)

This tension between the edge and the cloud will be one of the more interesting architectural choices facing 5G and is something the 3rd Generation Partnership Project (3GPP) and 5GPPP are looking at, as is the Small Cell Forum Ltd. (See 5G Will Give Operators Massive Headaches – Bell Labs.)

But it might be an ironic twist if the architecture that becomes 5G is back to the “some at edge, some in core” we had with GSM or 3G, and we re-invent Abis and Iub for a new generation. [Ed note: Abis is the interface that links the BTS and the BSC in a GSM network; Iub links the Radio Network Controller (RNC) and Node B in a UMTS network.]

Ofcom finalises 4G auction rules

Ofcom today published its final rules for the 4G spectrum auction in the UK. Key points:

  • The combined total of reserve prices is £1.3 billion
  • Provisional application date is 11th December
  • Bidding begins in January
  • The outcome depends on the bidding process, but bidders should know what they have won and its cost in February/March
  • Ofcom expects resulting services to be launched in May/June

4G’s here … the last word in mobile network capacity?

The UK’s first 4G service has just gone live with others set to follow next spring, but some people are asking whether anyone really needs faster 4G speeds yet.

In addition, the amount of spectrum that can be used for mobile services is more than doubling with the 4G spectrum auctions that have taken place, or soon will, across Europe. So the future’s bright: surely our mobile and wireless networks will have the capacity to meet the future demands of consumers and businesses for smartphones and wireless broadband services?

However, the amount of data we consume through our mobile devices has been growing frenetically and many expect that growth to continue, particularly as smart phones and tablets become more widespread.

The chart below shows a series of market forecasts that vary widely but all show rapid growth – the Mid forecast shows roughly a 100-times increase in demand for mobile data over the next 10 years. Note that it is plotted on a logarithmic scale, which gives a compressed view of how fast demand for data is predicted to increase: going up one notch on the vertical axis represents a ten-fold increase in demand (not a doubling). So is there perhaps a question to answer, despite the imminent arrival of 4G and so much new spectrum? And what could we do if there were a risk of a mobile network capacity crunch in the future?
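As a sanity check on the scale of the Mid forecast, a 100-times increase over 10 years corresponds to a constant compound growth rate of roughly 58% per year:

```python
import math

# A 100x increase over 10 years implies a constant annual growth rate of
# 100**(1/10) - 1, i.e. about 58% per year.
growth = 100 ** (1 / 10) - 1
print(round(growth * 100, 1))  # 58.5 (% per year)

# Equivalently, demand roughly doubles every year and a half.
doubling_years = math.log(2) / math.log(1 + growth)
print(round(doubling_years, 2))  # 1.51 (years per doubling)
```

Sustaining that rate for a decade is exactly what makes a capacity crunch plausible despite new spectrum and technology.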


Source: Real Wireless

Does or can government help industry meet soaring demand?

One reason to consider this now is because, if we do need to bring more spectrum on stream in the future, the process is cumbersome to say the least – potentially years of international negotiations and heaps of technical work. In order to get more spectrum in 10 years’ time, we might need to set the wheels in motion quite soon.

The key things to consider are:

  • how fast demand for mobile data may grow in the future, taking into account that some of the demand might be carried over Wi-Fi or indoor small cells (i.e. a mini femtocell or picocell base station inside a home or office)
  • what spectrum may be available for mobile in the future – it also makes a difference whether other countries are considering doing the same thing
  • potential future developments in technologies which could improve mobile network capacity.

My associates Real Wireless, experts in mobile technologies, mapped out the potential future technology enhancements that could increase the capacity of mobile networks in a study for Ofcom. Generally we can identify quite a few techniques now which could be introduced over the next 10 years (despite the uncertainty inherent in technology forecasts):

  • deploying more infrastructure – either outdoor small cells (micro / picocells) or full scale base stations (macrocells)
  • improvements to 4G technologies e.g. LTE Advanced should enable mobile networks to use spectrum more efficiently and flexibly and increase the top speeds mobile networks can deliver
  • techniques to use mobile frequencies more efficiently – e.g. increased sectorisation and use of multiple antenna technologies (MIMO)
  • distributed processing and sharing of traffic loads across multiple cell-sites – e.g. Coordinated multi-point and Cloud RAN.

Real Wireless worked out a number of plausible combinations of these techniques and looked at how much additional spectrum Ofcom is currently predicted to make available for mobile use over the next 20 years – up to 350MHz (which compares well to the 200MHz of 4G spectrum currently being released in Europe). This enabled them to make a good forecast (using information on real geographic areas) of how mobile network capacity is likely to increase in the future.

This allowed mobile data demand to be matched against mobile network capacity (once the fluctuations of mobile data demand during the day were taken into account to get a measure of the peak demand).
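That peak-demand step can be illustrated with a small sketch. All of the numbers below are invented for illustration; the point is only the conversion from a daily per-area demand to a busy-hour rate that can be compared against network capacity:

```python
# Illustrative conversion of daily mobile data demand into a busy-hour rate.
# Both input figures are hypothetical.

daily_demand_gb_per_km2 = 50   # assumed daily area demand
busy_hour_fraction = 0.10      # assume 10% of the day's traffic falls in the peak hour

busy_hour_gb = daily_demand_gb_per_km2 * busy_hour_fraction
# GB -> Mbit (x 8000), spread over the 3600 seconds of the busy hour.
rate_mbps_per_km2 = busy_hour_gb * 8 * 1000 / 3600

print(round(rate_mbps_per_km2, 1))  # 11.1 Mbit/s per km^2
```

It is this busy-hour rate, not the daily average, that the network must be dimensioned to carry.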

Spectrum currently earmarked for mobile could be exhausted in just over a decade

The result is that there may well be a network capacity crunch, in as little as 10 to 12 years’ time in some areas, even given the likely technological improvements and increased spectrum we currently expect to come on stream.

By capacity crunch we mean that the mobile operators will have exhausted all the techniques for increasing capacity we can currently forecast, and the only way to increase capacity would be a significant expansion in base station sites. This would not only be costly, but physical and planning limitations could mean that a major expansion is unlikely to be feasible, particularly in urban areas.

The result was derived by evaluating the costs of the alternatives for increasing mobile network capacity, i.e. using more of the spectrum available for mobile vs. new technologies vs. deploying more base stations. The most cost-effective way to increase capacity to meet demand was calculated on a rolling 2-3 year basis. The result is shown in the graph below.


Source: Real Wireless

What could be done to provide more capacity?

The option that is most in the control of governments and regulators is to try to allocate more spectrum for mobile. It’s likely that any suitable candidates are already being used for something else, hence there would be a cost to society in switching over such spectrum to mobile.

The 700MHz band is one possibility. Although currently used for terrestrial TV broadcasting, moves are afoot in Europe and in other regions to consider possible future mobile use. The 700MHz band is attractive because it may gain broad international support. This would make it more likely that leading handsets would work on it. Also, its physical characteristics mean that it can provide more reliable coverage, and hence capacity, compared to the majority of existing mobile spectrum.

700MHz could alleviate the potential capacity crunch

Our research shows that mobile operators could save substantial sums of money by deploying 700MHz spectrum at the key point in the future, instead of deploying more base stations. Consumers should benefit as well through lower prices and more consistent service quality.

However, the timing of 700MHz availability is important, particularly in the worst-case scenario, in which mobile broadband demand is high and the government cannot release as much spectrum for mobile as it currently expects over the next 10 years.

If 700MHz spectrum were available in 2020, the benefits for mobile operators (and consumers) would be much greater than if it were only available when current 700MHz licences expire in 2026.

If 700MHz is not available until 2026, mobile operators would have to start deploying new base station sites when the capacity crunch hit in 2022 to 2024. Deploying new sites would lock the operators into a certain course of action (to exploit the new sites to the full). The potential cost savings from using 700MHz would be much lower than if 700MHz had been available before the new sites were deployed. In other words, there is a risk that the industry could get locked into the wrong technology path.


Despite the exciting changes that 4G is likely to make to our smartphone and tablet experiences, regulators and mobile operators have to keep an eye on the future needs of the mobile networks. Our technological inventiveness may not be enough to avoid a capacity crunch 10 years down the line, hence the mobile sector is likely to need even more spectrum, preferably harmonised on a European or wider basis.

The 700MHz spectrum is potentially a good prospect, but the cost savings it could bring need to be offset against the costs of clearing out the existing broadcasting users.


Full details of the work, including an illustrative video, download of the full report and a link to Ofcom’s use of analysis in their UHF strategy consultation are available at:

Hexagons in 3D – Is it time to update the defining image of the cellular industry?

If you had to pick a single iconic image to represent the world of the mobile operator, it would have to be the hexagon. Much used in the early marketing literature of operators, the hexagon provided a simple representation of the area covered by a base station, and helped to illustrate how a limited set of frequencies could be reused in order to serve an unlimited number of users. This is the central ‘magic’ of cellular networks.


Hexagons define frequency re-use in outdoor macrocellular networks

Hexagons defining outdoor cellular coverage areas


More formally, a hexagon defines the region containing the points that are closer to one base station site than to any other, if the base stations are arranged in a regular grid. Assuming uniform wave propagation conditions, it therefore shows the coverage area of the base station at the centre of the hexagon, i.e. the locations where a mobile would receive from, and deliver to, this base station a stronger signal than to any other. The hexagon is a special case of a Voronoi polygon, which encloses the locations closest to a given point within any arbitrary set of points.
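The nearest-site rule is easy to state in code. A minimal sketch, in plain Python with illustrative site coordinates (a small triangular lattice, the arrangement whose Voronoi cells are hexagons):

```python
import math

# Nearest-site ("Voronoi") assignment: a test location belongs to the cell
# of whichever base station site is closest.

def nearest_site(point, sites):
    return min(range(len(sites)), key=lambda i: math.dist(point, sites[i]))

# A small triangular lattice of sites: unit spacing, alternate rows offset
# by half a spacing. The Voronoi cell of each interior site is a hexagon.
sites = [(x + 0.5 * (y % 2), y * math.sqrt(3) / 2)
         for y in range(3) for x in range(3)]

# A point near the first site falls inside that site's (hexagonal) cell.
print(nearest_site((0.1, 0.1), sites))  # 0
```

Real planning tools apply exactly this rule, but with measured or modelled path loss in place of straight-line distance.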



 Voronoi Polygons for Random Points (Base Station Sites)


Real-world propagation conditions are never like that, of course; in practice the coverage area of a given base station is very irregular indeed. Nevertheless, the hexagon provides a useful idealisation – its six sides give an indication of the number of sources of interference which need to be considered when working out the total capacity of a basic cellular system.
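One consequence of this geometry, from standard cellular planning theory, is that hexagonal layouts only permit frequency-reuse cluster sizes of the form N = i² + ij + j² for integers i, j (hence the familiar 7-cell reuse cluster). A quick sketch enumerates them:

```python
# Allowed reuse cluster sizes for a hexagonal cell layout: N = i^2 + i*j + j^2.
sizes = sorted({i * i + i * j + j * j
                for i in range(5) for j in range(5)} - {0})
print(sizes[:8])  # [1, 3, 4, 7, 9, 12, 13, 16]
```

Only these values of N let co-channel cells sit on a regular, symmetric grid, which is why early networks were planned around clusters of 4, 7 or 12.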


So what’s new? Today, cellular systems are undergoing a period of renewal. Well over two-thirds of voice traffic occurs inside buildings and it’s likely that the proportion for data services will be even higher. This means that mobile networks need to do more than provide coverage to a 2D plane – they need to consider the third dimension. Re-use of radio resources vertically is inevitable, whether using Wi-Fi access points or femtocells. A 2D map can be coloured using just four colours without reusing a colour in adjacent shapes, so four frequencies can be reused without limit while avoiding interference between adjacent cells. In 3D, the number rises greatly, adding to the complexity. [Note: I haven’t yet been able to find the 3D equivalent of the four-colour theorem – I’d be fascinated to hear if anyone knows the answer]


So the question arises: what is the equivalent of the hexagon in three dimensions? In the jargon, we are seeking a space-filling polyhedron. There exist various exotic candidate shapes (anyone for rhombo-hexagonal dodecahedra?). However, we don’t simply want a polyhedron which fills the space, but one which corresponds to a 3D version of the Voronoi polygon, enclosing the points closest to the antennas.


If the antennas in a building are on a regular grid across each floor of the building, with antennas directly above and below each other on successive floors, then the Voronoi polygon is simply the humble cube.



Cubic Honeycomb

Inside a cubic lattice


If the antennas on successive floors are offset between floors, so that an antenna is at the midpoint of its four nearest neighbours on the floor above, then a rather more interesting shape results. This arrangement is known to crystallographers as a body-centred cubic arrangement, for which the Voronoi polygon is the truncated octahedron. This has 8 regular hexagonal faces, 6 regular square faces, 24 vertices and 36 edges.
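The 8 + 6 face count can be checked numerically: in a unit body-centred cubic lattice, the hexagonal faces of the truncated octahedron face the 8 cube-centre neighbours at distance √3/2, and the square faces face the 6 axis neighbours at distance 1. A small sketch:

```python
import itertools, math

# Sites of a unit body-centred cubic lattice around the origin:
# the cube corners (integer coordinates) plus the cube centres (half-offsets).
corners = list(itertools.product((-1, 0, 1), repeat=3))
centres = [tuple(c + 0.5 for c in p)
           for p in itertools.product((-1, 0), repeat=3)]
sites = [p for p in corners + centres if p != (0, 0, 0)]

dists = [math.dist((0, 0, 0), s) for s in sites]
# Hexagonal faces: the 8 cube-centre neighbours at distance sqrt(3)/2.
hex_faces = sum(1 for d in dists if abs(d - math.sqrt(3) / 2) < 1e-9)
# Square faces: the 6 axis neighbours at distance 1.
square_faces = sum(1 for d in dists if abs(d - 1.0) < 1e-9)

print(hex_faces, square_faces, hex_faces + square_faces)  # 8 6 14
```

All other lattice sites are further away and their bisector planes never touch the central cell, so 14 is the complete count of face-sharing neighbours.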




 The Truncated Octahedron



So there are 14 adjacent interference-creating cells surrounding each antenna:


A Lattice of Truncated Octahedra



Finally, we can contemplate arranging the antennas in a face-centred cubic pattern. The Voronoi polygon in this case is the rhombic dodecahedron, with 12 rhombic faces.




Rhombic Dodecahedron


The lattice in this case looks like this:

Lattice of rhombic dodecahedra



Of course, these patterns don’t relate to the reality of in-building propagation any more closely than hexagons do to the real world of outdoor macrocell planning. In particular, the high propagation losses involved in penetration through walls and floors will distort the relevant shapes hugely.


Nevertheless, doesn’t an industry which has changed so much deserve a new defining image? Perhaps the truncated octahedron could fit the bill!


The Truncated Octahedron

Mobile Broadband Data – in Finland!

In case anyone doubted the staggeringly high growth of mobile data over the last year or so, I came across some fascinating (and very detailed) statistics on mobile data in Finland (thanks to Dean Bubley for pointing these out).

- Total data traffic in ‘07 was 13 x larger in volume than the previous year
- 92% of data traffic was from computers rather than phones
- This share of traffic was from just 2.1% of devices
Full details at:
Horsham, 28th May 2008 (via HSDPA!)