Wednesday, June 29, 2011

Broadband - A discussion of the possibility of a broadband speed bubble, and the argument that there is no case for investing in higher-speed networks

[telecoms.com] So, everyone’s agreed: broadband operators will eventually replace their decades-old copper networks with superfast fibre all the way to the home. That, at least, was the consensus of some speakers on stage at last week’s fibre-to-the-x (FTTx) and Next-Generation Access (NGA) Summit in Berlin, Germany. But talk from operators and vendors on the show floor gave me yet more cause to question this conclusion.

The risk of being a next-generation nay-sayer

It has become a risk for anyone in the broadband industry – and increasingly, in the world of politics and public affairs – to question whether operators and nations need fibre-to-the-home (FTTH). To do so would be akin to making Bill Gates’ reported 1981 proclamation that “640K of [home computer] memory should be enough for anybody” – a statement which, incidentally, the Microsoft co-founder denies he ever made.

Certainly, the history of the Internet backs up the view that people will find uses for extra bandwidth in the same way that they found uses for extra home computer memory. Take online video, for example. As speeds have ticked up from 512Kbps to 1Mbps, then 2Mbps, 8Mbps and beyond, video has evolved from short, downloadable low-resolution clips to full-length high-definition movies that can be streamed within a few seconds.

Broadband even has its own equivalent of Moore’s law, the rule of thumb that computing hardware power doubles every two years, which is generally believed to have held true since the invention of the integrated circuit in 1958. Nielsen’s law states that Internet connection speeds for high-end home users increase by 50% every year, and various observers say it has similarly held true since the early 1980s.
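For the numerically inclined, here is a quick sketch of my own comparing the two growth rates; a 50% annual increase actually implies a shorter doubling time than Moore's law:

```python
import math

# Nielsen's law: high-end connection speeds grow ~50% per year.
# Moore's law (roughly): computing power doubles every two years.
nielsen_growth = 1.5         # x1.5 per year
moore_growth = 2 ** (1 / 2)  # doubling every 2 years ~= x1.41 per year

# Doubling time implied by each annual growth rate, in years.
print(f"Nielsen: doubles every {math.log(2) / math.log(nielsen_growth):.1f} years")  # ~1.7
print(f"Moore:   doubles every {math.log(2) / math.log(moore_growth):.1f} years")    # 2.0
```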

The problem for operators is that adding more fibre to their networks is profoundly more risky than increasing the power of computers.

Bandwidth is a commodity; FTTH definitely is not

It is true that bandwidth, like computing power, has become more widely available and inexpensive over time. A study I led last year, for example, found that 100Mbps tariffs were available in several European and Asia Pacific countries for less than US$30 per month, equal to a cost per Mbps considerably lower than that offered by current-generation broadband services with much slower speeds.
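To make that per-Mbps arithmetic concrete, here is a rough sketch; the $30-for-100Mbps figure comes from the study, while the current-generation tariff is a hypothetical of my own:

```python
# Cost per Mbps: next-generation vs current-generation tariffs.
# $30 for 100Mbps is from the study above; the ADSL tariff is assumed.
nga_price, nga_speed = 30.0, 100    # $/month, Mbps
adsl_price, adsl_speed = 24.0, 8    # hypothetical current-generation tariff

print(f"NGA:  ${nga_price / nga_speed:.2f} per Mbps")    # $0.30
print(f"ADSL: ${adsl_price / adsl_speed:.2f} per Mbps")  # $3.00
```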

But while the cost to consumers is likely to continue to fall in more countries around the world as next-generation services become cheaper and more widely available, the cost to operators of rolling out the networks to provide those extra Mbps will follow a quite different trend.

In most countries, the cost to the consumer has followed a smooth curve as operators’ marketing departments have set prices based on what people have been prepared to pay before and what they are likely to pay in future. As such, the rate of decline in cost per Mbps has exceeded expectations as operators have slashed their prices after realising that few people are prepared to pay a significant premium for faster speeds. In some markets, operators have even priced superfast services below those on their current-generation networks in order to encourage apathetic subscribers to migrate to their next-generation infrastructure.

The problem for operators is that increasing speeds to certain levels is much more costly than updating their marketing collateral. The upgrade from up to 8Mbps ADSL to up to 24Mbps ADSL2+ requires the replacement of equipment in local telephone exchanges and in the home, for example. The upgrade to 100Mbps+ FTTH can be the most costly of all, requiring operators to also replace the copper leading to and within the home with fibre, a task that can cost thousands of dollars per household.
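A rough payback sketch shows why those per-household costs weigh so heavily; every figure here is an illustrative assumption of mine, not a number from any operator:

```python
# Back-of-the-envelope FTTH payback per household (all figures assumed).
capex_per_home = 1500.0    # $ to bring fibre to and into one home
monthly_premium = 10.0     # extra $/month subscribers will pay for speed

payback_years = capex_per_home / (monthly_premium * 12)
print(f"Payback on the speed premium alone: {payback_years:.1f} years")  # 12.5
```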

Why not fibre-to-somewhere-near-the-home?

Proponents argue that only FTTH is future-proof enough to provide the speeds operators need now and in future. But momentum is growing behind an array of technologies which promise to do the same by harnessing decades-old copper networks, as talk from the FTTx & NGA Summit demonstrated.

Until recently, it had been assumed that operators could only reasonably provide speeds of about 50Mbps using the “half-way house” approach of rolling fibre to street cabinets (FTTC) and using VDSL2 for the final copper connection to the home. Technologies such as vectoring and line-bonding, however, promise to boost commercial VDSL2 speeds to close to 100Mbps at a fraction of the cost of FTTH.
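As a toy model of that claim, the arithmetic stacks up roughly as follows; the per-pair rate and the vectoring gain are illustrative assumptions of mine, not vendor figures:

```python
# Toy model of the VDSL2 uplift described above (all figures assumed).
per_pair = 25           # Mbps: crosstalk-limited commercial VDSL2 per pair
vectoring_gain = 2.0    # vectoring cancels crosstalk between pairs (assumed x2)
pairs_bonded = 2        # line bonding aggregates two copper pairs per home

speed = per_pair * vectoring_gain * pairs_bonded
print(f"Vectored + bonded VDSL2: ~{speed:.0f}Mbps")  # ~100Mbps, per the claim above
```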

Conversations with operators and vendors at the summit also confirmed my growing suspicion that certain technical and commercial obstacles to line-bonding and vectoring are less formidable than previously thought. Line-bonding has already been deployed by AT&T in the US and by Pakistan’s PTCL; vectoring has been trialled by Belgacom, Orange, Swisscom and Telekom Austria, among others; and many across the industry are confident that operators will deploy one or both before the end of this year.

Plus, vectoring and line-bonding may not be copper’s swan song. Stefaan Vanhastel of Alcatel-Lucent provided some insights into what is being called “the final form of DSL”. Omega DSL promises to deliver speeds of up to 1Gbps – or 1,000Mbps – over 200 metres or less of copper, perfect for enabling operators to provide FTTH-like speeds without having to dig up subscribers’ gardens or enter their homes.

Research into Omega DSL is only at an early stage, though Vanhastel was confident that products would be available before 2020.

Remembering what’s Nielsen’s law is really about

FTTH purists will probably scoff at such a timetable, adding that any speed increases vectoring and line-bonding may provide in the meantime won’t be enough.

Nielsen’s law also appears to suggest that even if operators start using the latest VDSL technology to provide 100Mbps services next year, they would need to upgrade to FTTH by 2014, by which time top speeds will have passed the 200Mbps mark. This need would be especially pressing in markets where cable TV operators are strong, because the structure of their hybrid fibre/coaxial (HFC) networks makes upgrading to offer progressively faster speeds a much less costly option.

But such an analysis ignores some fundamental principles that Nielsen built into his law. First of all, the law applies to connection speeds for high-end users only, not all users. And as Nielsen notes, average connection speeds will diverge ever further from high-end ones as the mass market gets online, because new arrivals are more likely to be low-end users.

“Unfortunately, I can argue as much as I want: most users still save on bandwidth and prefer a $20/month ISP over a $30/month one with better service,” he states.

Nielsen wrote that in 1998, when few people had even dialup connections, let alone DSL or fibre ones, but the sentiment remains true to this day. In most markets, at least 80% of all broadband users are subscribed to low-end services with speeds well within the means of current-generation networks.

Surprisingly, the same is true of FTTH networks. The vast majority of FTTH customers are subscribed to services with speeds ADSL2+ could easily deliver. The top-tier 100Mbps+ services are akin to the most expensive bottle of wine on a menu; they’re aimed at making the restaurant look more sophisticated and the mid-priced and house options most people actually order appear more affordable.

The exceptions are countries such as Japan and South Korea where, for reasons related to market structure and regulatory concessions, 100Mbps+ services are the only ones available over FTTH networks. Even then, take-up has been slower than operators have wanted.

Build-it-and-they-will-come… Many years later

Given that Nielsen published his law in 1998, he is understandably vague about the rate at which speeds for the vast majority of low-end users will increase, merely stating that they will lag two to three years behind those for high-end users.

But a back-of-the-envelope calculation based on applying Nielsen’s law to packages offered by UK cable operator Virgin Media at the end of last year suggests rival operators will be able to comfortably serve the 70-80% of broadband customers with “low-end” 50Mbps services and 10-20% with “mid-tier” 100Mbps ones by 2015.
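One way to reproduce numbers of that order is to run each tier forward at Nielsen's 50% a year; the end-2010 tier speeds below are my assumptions rather than Virgin Media's published line-up:

```python
# Back-of-the-envelope: grow assumed end-2010 speed tiers at 50% per year.
# The tier values (10, 20 and 50Mbps) are my assumptions, not figures
# from the article or from Virgin Media.
GROWTH = 1.5
tiers_2010 = {"low-end": 10, "mid-tier": 20, "high-end": 50}  # Mbps

for name, mbps in tiers_2010.items():
    print(f"{name}: {mbps}Mbps -> ~{mbps * GROWTH ** 4:.0f}Mbps by 2015")
# low-end: ~51Mbps, mid-tier: ~101Mbps, high-end: ~253Mbps
```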

This would make the decision by Virgin’s arch-rival BT to invest in widespread FTTC and VDSL and offer 40Mbps speeds at mid-market prices eminently sensible. This analysis was backed up by two very senior executives of two very different major operators, who told me at the summit that they believed that 100Mbps would be more than enough for the vast majority of their customers for the next 5-10 years at least.

The irony is that Nielsen alludes to one of the central causes for the slow take-up of superfast broadband in the blog post where he sets out his law: the lack of applications that require such speeds.

Nielsen is a web usability expert at heart, and makes his observation about Internet bandwidth in service of a wider point about web design. In short, he advises his readers to stick to minimalist web pages until about 2003, despite his prediction that high-end users would be using 1.5Mbps connections by then. To serve the mass market, websites need to be designed to be downloaded quickly by the vast number of low-end users on much slower connections.

This same principle holds true for all manner of online content providers today. The BBC’s iPlayer service, for example, employs so-called adaptive bit-rate technology to enable the broadcaster to deliver reliable video streams over connections not much faster than those available in the early part of last decade. Online video provider Netflix, meanwhile, recently reduced the default bit-rates of its Canadian service to take into account relatively low download caps introduced by operators.
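For readers unfamiliar with the technique, here is a minimal sketch of how adaptive bit-rate selection works in principle; the renditions and safety margin are invented for illustration and are not iPlayer's actual values:

```python
# Minimal sketch of adaptive bit-rate (ABR) stream selection.
# Rendition bit-rates and the safety margin are illustrative only.
RENDITIONS_KBPS = [500, 800, 1500, 3200]   # available video bit-rates
SAFETY_MARGIN = 0.8                        # use only 80% of measured throughput

def pick_rendition(measured_kbps: float) -> int:
    """Return the highest rendition that fits within usable throughput."""
    usable = measured_kbps * SAFETY_MARGIN
    fitting = [r for r in RENDITIONS_KBPS if r <= usable]
    return max(fitting) if fitting else min(RENDITIONS_KBPS)

print(pick_rendition(2000))  # 1500 -- a 2Mbps line still streams comfortably
print(pick_rendition(600))   # 500  -- slow lines fall back to the lowest rendition
```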

The lesson for operators is that content providers won’t help them drive take-up of superfast broadband; they’ll always aim to serve the lowest common denominator.

Another irony is that FTTH operators have arguably widened the lag between the availability of superfast services and the emergence of more bandwidth-hungry online services. By marketing speeds so far in excess of what’s required today, they might have inadvertently made customers less likely to upgrade, further delaying the point at which content providers produce services that use those speeds.

Simply throwing bandwidth at the problem won’t work for another reason: the emergence of popular online content and applications depends on factors that have little to do with broadband speeds. Did you know, for example, that average Internet users in the US, Spain, the UK or Italy generate more traffic than those in Japan, despite the fact that some 20 million Japanese homes are subscribed to 100Mbps FTTH/B services?

Farewell to arms: broadband in a post-speed world

The key question is when – or even if – operators will feel the need to make those multi-billion dollar investments in FTTH. To serve high-end users prepared to pay for superfast broadband now? Certainly, the shrinking premiums operators are able to charge this small sliver of customers suggest that the business case can’t be justified on increased revenues alone.

Perhaps then, when Nielsen’s law dictates that low- and mid-tier speeds will exceed xDSL’s capabilities? Well, maybe Nielsen’s law just won’t apply by then.

Top speeds in the advanced markets of Japan and South Korea have been defying Nielsen’s law for some time, having been stuck at 100Mbps for three or four years. KT offers 1Gbps WDM-PON in Greater Seoul as a showcase trial, but is already losing money on IPTV and so sees no point in upgrading on a wider scale and throwing good money after bad. Speeds offered by NTT East and NTT West have stalled for similar reasons, although IPTV is not as advanced in Japan.

And as Nielsen predicted, price has trumped speed in consumers’ minds time and time again. Demand for telecoms is also shifting away from standalone broadband to bundles that include telephony, TV and other services. Who’s to say that broadband speeds won’t become increasingly unimportant to consumers, and so fail to justify the multibillion dollar investments FTTH requires? Could operators simply hold their nerve and compete for the mass-market of less-demanding low-end users on well-priced bundles alone? Certainly, we’ve seen numerous ISPs use low-priced no-frills ADSL2+ services to neutralise the supposed threat posed by FTTH in many markets.

The real question of whether telecoms operators need to lay fibre all the way to the home will arise when their cable TV rivals boost the speeds of their low-end packages beyond those that even the most advanced fibre/copper hybrid networks can support. Even so, the aforementioned advances in VDSL2 technologies and trends in demand for broadband suggest that the time to pose this question, let alone answer it, could be several years away.

Perhaps telecoms operators are committing to next-generation networks because secretly they know that providing connectivity, rather than content and services, is what they do best. After all, if operators don’t invest in infrastructure, then what is the point of them? If the investment cripples them financially then there is a fair chance that the regulator will step in to protect them.

I am well aware that it is a risk to even pose these questions, especially given my role as an analyst seeking to engage with the very companies promoting the opposite views. And personally speaking, I would very much like to live in a world where near-infinite bandwidth has led to the emergence of all manner of unimaginable new applications and services.

But given how demand for superfast broadband is likely to play out and the vast amounts of money operators – and increasingly, governments – are being urged to invest in it, I feel these questions need to be asked. As always, your feedback is welcome and encouraged.



Questioning the unquestionable: is fibre-to-the-home really the future of broadband?
