disco-tech | Discovery Institute's Technology Blog: Exaflood Archives

January 3, 2010
State of IP

At Telephony Online, Rich Karpinski notes,

In today's carrier networks, IP may not always be hyped or even seen, but it is indeed everywhere -- and in 2010, it's only going deeper and making an even bigger impact.
I think this protocol proliferation in the name of IP is the death rattle of the old network. IP is a data protocol, so of course it dominated the enterprise market; and it is prevalent on the Internet, so of course Internet players such as Google want it upgraded for so-called multimedia.

But the message of all the brave talk about "ultimate outcomes that have yet to take hold today" is that once again it is becoming reasonable to predict that cable will win. Cable TV is already frankly devoted to the transmission of the high-definition interactive video that will comprise 99 percent of network traffic. This is the black hole into which all the plans for sophisticated Rich Communications Suites, guaranteed QoS, Internet Multimedia Services, and all the abortive plans for Long Term Evolution (LTE) will fall.

The companies for the new era will be the hardware enablers of broadband interactive video: graphics and network processors, optical transponders, wavelength division muxing gear, and optical circuit switches for the new TDM circuits that will be crucial for the robust streaming video that will be at the heart of the market.

That's the Henry Gau Telecosm and I'm sticking to it. Upgrading the old networks for video and multimedia, one service at a time, is a non-starter. It will be swept away by truly broadband wavelength circuits optimized for interactive video streams. Within these circuits all other traffic can flow without significant additional expense.

Security, routing, session management, and switching will all be done at the customer edge and in the datacenter, which will comprise the bulk of the server edge.

Unless the telcos grasp that their old circuit model is relevant again, they are going to give way to cable TV players who already get the picture in high definition and are moving ever closer to video teleconferencing.


October 5, 2009
Swanson on the problem with 'net neutrality'

A must-read from Bret Swanson:

Despite the brutal economic downturn, Internet-sector growth has been solid. From the Amazon Kindle and 85,000 iPhone "apps" to Hulu video and broadband health care, Web innovation flourishes. Mr. Genachowski heartily acknowledges these happy industry facts but then pivots to assert the Web is at a "crossroads" and only the FCC can choose the right path.

The events of the last half-decade prove otherwise. Since 2004, bandwidth per capita in the U.S. grew to three megabits per second from just 262 kilobits per second, and monthly Internet traffic increased to two billion gigabytes from 170 million gigabytes--both tenfold leaps.

* * * *

At a time of continued national economic peril, the last thing we need is a new heavy hand weighing down our most promising high-growth sector. Better to maintain the existing open-Web principles and let the Internet evolve.
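
Swanson's two "tenfold leaps" are easy to verify from the figures he quotes. A quick sketch in Python, using only the numbers above:

    # Check the two "tenfold leaps" (2004 -> 2009) from the quoted column.
    bandwidth_2004_kbps = 262     # per-capita bandwidth, 2004
    bandwidth_2009_kbps = 3000    # three megabits per second, 2009
    traffic_2004_gb = 170e6       # monthly U.S. Internet traffic, 2004 (gigabytes)
    traffic_2009_gb = 2e9         # monthly U.S. Internet traffic, 2009 (gigabytes)

    print(f"bandwidth growth: {bandwidth_2009_kbps / bandwidth_2004_kbps:.1f}x")  # ~11.5x
    print(f"traffic growth:   {traffic_2009_gb / traffic_2004_gb:.1f}x")          # ~11.8x

Both ratios come out a bit above ten, which is what the column rounds to.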


July 28, 2009
Thoughts on broadband strategy

The FCC received reply comments last week concerning the national broadband plan it is required -- pursuant to the stimulus legislation -- to deliver to Congress by Feb. 17, 2010.

In the attached reply comments of my own, I conclude:

  • The broadband market is delivering better services at lower prices. There is no evidence of a market failure which would justify additional regulation.

    I pointed out that just as the Sherman Act does not "give judges carte blanche to insist that a monopolist alter its way of doing business whenever some other approach might yield greater competition," according to the Supreme Court, the Commission would be wise not to insist that broadband providers alter their way of doing business just because it hopes some other approach might yield more consumer benefits. The pursuit of the "perfect" may prove elusive. Meanwhile, the "good" -- which presently exists in the form of a fast-charging, innovative market -- could be destroyed.

  • The Commission should focus on non-regulatory strategies which have proven effective in promoting the adoption of broadband services.

    For example, a lot of Americans don't subscribe to broadband because they don't see the need for it or because they are concerned about the price. According to the Pew Research Center, 50 percent of dial-up and non-online users fall into the former category and 19 percent fall into the latter category.

    A public-private partnership in Kentucky discovered that the lack of a computer at home ranked even higher than the monthly service fee as a barrier to the adoption of household broadband. In Kentucky, the number of people actually using broadband jumped from 22% to 44% as a result of the partnership's efforts.

  • Common carrier regulation could interfere with innovation and legitimate network management.

    If government mandates that sellers have to charge everyone the same price, that potentially limits returns on investment (because some consumers are willing to pay more than others). If government says sellers can't serve some customers unless they can serve all customers, that potentially limits investment opportunities. Net neutrality regulation would potentially lead to these and perhaps other consequences.

    One such consequence might be to prevent network operators from proactively managing the network to reduce congestion and the malicious traffic that leads to identity theft and cyber attacks.

  • There is no compelling evidence of excessive profits which would justify reregulation of the special access market.

    Purchasers of these high capacity services allege profiteering, but a more reasonable analysis has found that instead of earning a 138% return on special access investment, AT&T is more likely earning 30%. Qwest is probably earning 38%, not 175%. And Verizon, 15% instead of 62%.

    If AT&T, Qwest and Verizon are earning excess profits, cable and fixed wireless competitors will be able to undercut their prices and capture market share. The higher the profits, the faster the entry.

    If regulation pushed special access prices lower, that would reduce the revenue investors could expect to earn from new competitive facilities. If investment won't be profitable, it won't be made.



June 29, 2009
Bandwidth Boom

Bret Swanson at Entropy Economics makes some fascinating findings in a new paper:

We estimate that by the end of 2008, U.S. consumer bandwidth totaled almost 717 terabits per second. On a per capita basis, U.S. consumers now enjoy almost 2.4 megabits per second of communications power, compared to just over 28 kilobits per second in 2000. The ability of Americans to communicate and capitalize on all of the Internet's proliferating applications and services is thus, on average, about 100 times greater than it was in 2000.
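
The paper's per-capita arithmetic is easy to sanity-check. A quick sketch in Python (the 300 million population figure is my round approximation, not a number from the paper):

    # Sanity-check the Entropy Economics figures quoted above.
    total_bandwidth_bps = 717e12   # 717 terabits per second at the end of 2008
    population = 300e6             # rough U.S. population -- my assumption

    per_capita_bps = total_bandwidth_bps / population
    print(f"per capita: {per_capita_bps / 1e6:.2f} Mbps")  # ~2.39 Mbps, as the paper says

    per_capita_2000_bps = 28e3     # just over 28 kbps in 2000
    print(f"growth since 2000: {per_capita_bps / per_capita_2000_bps:.0f}x")  # ~85x, roughly the "100 times"
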
It sort of makes you wonder why we need a National Broadband Plan from the government, particularly when you consider the possibility that the government's well-intentioned efforts may backfire. Consider Swanson's observation as to the last time the government tried to improve the telecommunications market:
The millennial technology and telecom crash was, in part, a result of this broadband dearth. Thousands of Silicon Valley dot-com business plans had been conceived on the assumption that real broadband would be rapidly deployed and adopted across the nation. More than half a dozen communications companies took advantage of the newly deregulated long-haul transmission market and built nationwide fiber optic networks, boosting intercity bandwidth by several orders of magnitude. But local telecom markets weren't similarly deregulated. They were re-regulated. At the FCC and in 51 state utility commissions, in fact, complex rules and price controls grew for DSL and threatened to engulf cable modems as well. Investment ground to a halt. The resulting bandwidth gap, with the crucial last mile falling well short of the market's expectations, helped produce the crash, which lasted through 2002.


November 25, 2008
Bracing for new regulation

Observers predict stepped-up regulatory battles in telecom, according to the Wall Street Journal,

New congressional leaders as well as policy makers in the Obama administration are expected to press for fresh limits on media consolidation and require phone and cable firms to open their networks to Internet competitors, lobbyists and industry officials say.
The article overlooks the fact that broadcast ownership limits and forced access policies are restraints on the free speech rights of broadcasters and network providers, and that the constitutionality of new regulation could ultimately be decided by the courts.


November 24, 2008
Regulation and investment?

Misguided regulatory policy is "among the most important inhibitors of capital investment in telecommunications," conclude Debra J. Aron and Robert W. Crandall in a recent paper.

The authors observe that

Business firms do not make investments for altruistic reasons but rather make investments in order to earn a return on the invested capital. For any company to make any investment, it must determine, and convince the capital market, that the investment is reasonably likely to produce a positive return in net present value (NPV) terms sufficient to compensate for the risk incurred. When companies seek funding to execute a project, they compete for those funds with all other potential projects in the economy, not just with other investment opportunities available to the company itself and not just with investment opportunities in the same industry or geographic area.
As a practical matter, regulators cannot set optimal prices -- only prices that are either too high or too low. Prices that are too low discourage investment.
The risk that regulatory prices would not be compensatory is magnified by the fact that any investment in new fixed-wire networks is largely sunk. That is, the company making the investment cannot remove the assets and deploy them in alternative pursuits if they prove to be non-remunerative in the telecom sector. Thus, a decision to invest today in a given technology is irrevocable and potentially very costly. In contrast, if a competitor were to be granted access to these assets, once they are in place, at regulated rates, the competitor's decision would not be irrevocable. If it is allowed to lease these facilities on a short-term basis, it could simply walk away if a new technology were to appear. For this reason, economists refer to the competitor as having a "real option" which should be priced into the regulated rate. Alternatively, the competitor could be required to share the incumbent's investment risk by leasing the asset for its entire life. In this way, if the competitor remained solvent, it would be faced with its proportionate share of the risk of early obsolescence. (footnotes omitted.)
But that is not what regulators do. Regulators require incumbents to share the rewards of successful investments, not the losses arising from investment failures. The competitor gets to walk away while the incumbent is forced to write off huge amounts of fixed investment.
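
To see the "real option" in miniature, consider a toy two-period example. Every number below is invented for illustration; none comes from the Aron and Crandall paper:

    # Toy illustration of the walk-away option a leasing competitor holds.
    invest_cost = 100.0   # incumbent's sunk network investment
    lease_rate = 55.0     # regulated annual lease rate over a two-year asset life
    p_obsolete = 0.3      # chance a better technology appears after year one

    # The incumbent's cost is sunk: it bears the full amount no matter what.
    incumbent_cost = invest_cost

    # A competitor leasing year-to-year pays for year one, then walks away if
    # the asset is obsolete, paying year two only with probability (1 - p_obsolete).
    competitor_cost = lease_rate + (1 - p_obsolete) * lease_rate

    print(f"incumbent committed cost: {incumbent_cost:.1f}")   # 100.0
    print(f"competitor expected cost: {competitor_cost:.1f}")  # 93.5
    print(f"walk-away option value:   {incumbent_cost - competitor_cost:.1f}")  # 6.5

Unless the regulated rate prices that option, or the lease runs the life of the asset, the competitor bears less risk than the incumbent for the very same facilities.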

Next, the authors confirm that Wall Street is skeptical of Verizon's and AT&T's massive broadband investments.

A recent report by Bernstein Research, for example, concludes that "Even with aggressive assumptions about incremental adoption and retention, we believe the FiOS [Verizon's fiber-to-the-home initiative] project, in aggregate, falls well short of earning its cost of capital." An earlier report by industry analysts Pike & Fisher was also pessimistic, stating that its "report suggests Verizon is spending so much on FiOS that it could take a decade or more for the company to pay back its investment should it fall considerably short of its market-penetration goals." In contrast, Stifel Nicolaus analysts Christopher King and Billie Warrick were fairly optimistic about Verizon's FiOS product, predicting that "Verizon will still be able to offer a superior product to cable (and AT&T) due to its FTTH [fiber-to-the-home] architecture, and will still be able to generate a positive ROI [return on investment], given its superior product offering to its cable competitors, in our view." (footnotes omitted.)
The authors caution that regulation harms some consumers more than others.
The effects of the depressed investment incentives would be most immediately and directly felt in areas where the economics of investment are at the edge of profitability even without unbundling burdens. This is likely to be in already disadvantaged geographic areas. Hence, consumers in the least attractive areas for investment in advanced broadband networks would be the ones who would likely be disproportionately deprived of the new investment.
The authors point out that
The vigor and speed with which ILECs make investments in broadband infrastructure will affect the vigor and speed with which cable and wireless broadband companies will continue to invest in response, and the ferocity of intermodal competition.
Finally, we are reminded that the Federal Communications Commission's policy of deregulating broadband investment by incumbent telephone companies has in fact unleashed a virtuous cycle of multi-billion-dollar investment by the phone companies and their competitors in the cable industry.
In this deregulatory environment, broadband subscriptions in the U.S. have soared, more than trebling in the three years ended June 2007. Clearly, the FCC's forbearance policy has borne substantial fruit for U.S. citizens.

Verizon and AT&T are not alone among communications companies in the U.S. in substantially increasing their investments since the TRO decision. Consistent with the mutually-reinforcing dynamic of responsive competitive investments we discussed earlier, cable companies have made massive investments in their broadband infrastructures as well. While the combined annual capital expenditures of AT&T and Verizon have increased from $17.1 to $24.6 billion since 2004, the aggregate annual capital expenditures of the three largest publicly held cable providers, Comcast, Cablevision, and Time Warner Cable, have nearly doubled, from $5.6 billion to $10.1 billion. (footnotes omitted.)

The paper is entitled "Investment in Next Generation Networks and Wholesale Telecommunications Regulation."


June 6, 2008
Telecosm recap

You should have been there! Telecosm was thrilling. I will list the ways, in chronological order in two or three posts over the next few days. (Below is Part 1.)

1) Lawrence Solomon, author of The Deniers, demonstrated, beyond cavil, that nearly all the relevant scientists, outside of the government echo-chambers, completely repudiate the climate panic. He concluded by pointing to evidence for a cooling trend ahead.

2) After I presented the statistics showing that most of the global economy is driven by innovation in the Telecosm--teleputers, datacenters, optical fiber, fiberspeed electronics--Steve Forbes gave a magisterial tour of the world economy. Relevant to the debates on the Gilder Telecosm Forum subscriber message board was his assertion that the Fed had been too loose in the face of a collapse in the demand for dollars caused by the muddled cheap-dollar leadership from the administration. Later in the conference, in an incandescent speech mostly about the amazing expansion of freedom and supply-side economics in China, John Rutledge maintained that the Fed had been too tight, as measured by the flat monetary base. But then, as far as I could grasp, Rutledge contradicted himself by showing a dramatic surge of bank lending to small and midsized businesses. If that surge was caused by the collapse of other lending sources, he gave no evidence of it.

3) Nicholas Carr gave a suave and lucid presentation of the themes of his book The Big Switch, comparing the emergence of cloud computing to the rise of the centralized power grid. Raising an issue that recurred throughout the conference, our regnant expert on the power grid, Carver Mead, dismissed the analogy as simplistic, since one-way power delivery and two-way information transfer are radically different processes. Bill Tucker, author of the forthcoming Terrestrial Energy, pointed out in a compelling speech that Moore's Law is about miniaturization of bits, while the energy industry is better described by a Law of More--more power and more efficiency. He explained that all the energy in the atom is in the nucleus and pointed to the immense heat caused by nuclear fission and fusion within the earth. Then he impugned the venture capitalists' compulsion to waste arable land and space twiddling with electrons and photons, and presented much evidence that solar energy in all its forms would never provide adequate power for an ever-growing economy. Physicist Howard Hayden of Energy Advocate enthusiastically confirmed this view.

4) Andy Kessler followed with an uproarious investigation of Who Killed Bear Stearns? His answer pointed not to the usual culprits (though he did politely finger front-row auditors me and Bob Metcalfe) but to Bear Stearns itself. After preparing a feculent feast of sub-prime pork ("they knew better than anyone else what was in it"), then packaging it all into putatively succulent AAA delicacies, they totally lost it and ate their own sausages.

5) The Exaflood Panel presented Andrew Odlyzko's dour but learned analysis of Internet traffic, which concluded that the real danger is not too much traffic but not enough to sustain all the businesses in the sector. Joe Weinman, a brilliant strategist from AT&T, however, confirmed the Exaflood thesis, and Johna Till Johnson of Nemertes offered compelling evidence that the best way to examine the issue is from the supply side. If you don't build it, they definitely will not come. Traffic in the core is dependent on access from the edge, which still lags in the U.S.; even Odlyzko showed usage rates in Korea and Hong Kong six times U.S. rates. Lane Patterson of Equinix confirmed aggressive estimates of traffic growth and still more ambitious growth of Equinix datacenters, but said that patterns of traffic confirm that the core is being starved by inadequate access on the edge.


May 5, 2008
Terabit Ethernet coming soon

George Gilder is getting some well-deserved recognition in Technology Review in an article by Mark Williams entitled "The State of the Global Telecosm - The most notorious promoter of the 1990s telecom boom has been proved right."

"I'm a fan of George Gilder, the bubble bursting notwithstanding," Ethernet co­inventor Bob Metcalfe (a member of Technology Review's board of directors) told me after his San Diego keynote speech, "Toward Terabit Ethernet." Metcalfe had told his audience not only that optical networks would soon deliver 40- and 100-gigabit-per-second Ethernet--standards bodies are now hammering out the technical specifications--but also that 1,000-gigabyte-per-second Ethernet, which Metcalfe dubbed "terabit Ethernet," would emerge around 2015. Why, I asked, did Metcalfe believe this? "Last night, Gilder spoke to 300 of us at an executive forum about his 'Exaflood' paper, in which he predicts a zettabyte of U.S. Internet traffic by the year 2015," Metcalfe said. "Since I admire Gilder, I extrapolated from his prediction."

An exabyte is 10^18 bytes of data; a zettabyte is 10^21 bytes. Metcalfe pointed to video, new mobile, and embedded systems as the factors driving this rising data flood: "Video is becoming the Internet's dominant traffic, and that's before high definition comes fully online. Mobile Internet just passed a billion new cell phones per year. Then totally new sources of traffic exist, like the 10 billion embedded microcontrollers now shipped annually."
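
For a sense of what a zettabyte per year means as a flow, here is the unit conversion (nothing assumed beyond the units themselves):

    # A zettabyte of annual traffic expressed as a sustained average bit rate.
    zettabyte_bytes = 1e21
    seconds_per_year = 365 * 24 * 3600   # ~3.15e7 seconds

    avg_bps = zettabyte_bytes * 8 / seconds_per_year
    print(f"average throughput: {avg_bps / 1e12:.0f} Tbps")  # ~254 Tbps, sustained

And that is only the average; peak loads run well above it, which is why terabit links in the backbone are less outlandish than they sound.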

Metcalfe also addresses the interesting question of whether there is sufficient capacity in the Internet backbone to accommodate the surging traffic:
Did Metcalfe believe that the existing infrastructure--built in the boom years, when great excesses of fiber-optic cable were laid down--could support terabit Ethernet? "That dark fiber laid down then is being lit up, and some routes are now full," he said. "That's the principal pressure to go to 40 and 100 gigabits per second. It seems we can reach those speeds with basically the same fibers, lasers, photodetectors, and 1,500-nanometer wavelengths we have, mostly by means of modulation improvement. But it's doubtful we'll wring another factor of 10 beyond that." Thus, the backbone networks would need to be overhauled and new technologies implemented.


April 25, 2008
The bandwidth conundrum

John Dvorak, PCMag.com:

In today's world, bandwidth demand is similar to what processing demand was 20 years ago. You just can't get enough speed, no matter how hard you try. Even when you have enough speed on your own end, some other bottleneck is killing you.

This comes to mind as, over the past few months, I've noticed how many YouTube videos essentially come to a grinding halt halfway through playback and display that little spinning timer. Why don't they just put the word "buffering" on the screen?

All too often, it's not the speed of my connection that's at issue--it's the speed of the connection at the other end. It may not even be the connection speed itself; it may simply be the site's ability to deliver content at full speed under heavy demand.

This concerns me, since I'm an advocate of IPTV and other technologies that need lots of speed to work. We seldom consider the fact that if something becomes hyper-popular (like YouTube), user demand on the system is enormous and can easily break the system from the demand side....

Read On

Interesting article that misses the chief recent development on the net: the huge advances in the efficiencies of the datacenters that dispense these web pages. The "cloud" computing paradigm, pioneered by Google, is now going mainstream as Nicholas Carr, Telecosm speaker this year (www.TelecosmConference.com), documents in his intriguing book. For example, Jules Urbach--our movie and virtual world renderman and Telecosm star with his Lightstage corporation--can send images from thousands of different "viewports" per second from his graphic processor based OTOY servers, which can scale to millions of users. A company called Azul has developed cheap scalable datacenter technology that delivers terabits per second from its OS neutral Java-based clusters of servers.

The bottleneck is rapidly moving back to where it has long resided: at the last mile, where passive optical networks, such as VZ's FiOS, are increasingly necessary. For IPTV, content delivery networks (CDN) from Akamai and its increasing throng of video rivals using a variety of ingenious delivery algorithms will eclipse the cumbersome BitTorrent mesh model, which shuffles video files through underused personal computers across the network.


February 22, 2008
Unleashing the Exaflood

Bret Swanson and George Gilder have a column in today's Wall Street Journal in which they argue that more Internet capacity will be necessary to keep up with movie downloads, gaming, virtual worlds and other fast-growing applications.

They explain that Internet capacity will have to increase 50 times in the coming years in their recent report "Estimating the Exaflood: The Impact of Video and Rich Media on the Internet -- A 'zettabyte' by 2015?," which I discuss here.

In their column, Gilder and Swanson warn this won't happen if politicians re-regulate network providers:

The petitions under consideration at the FCC and in the Markey net neutrality bill would set an entirely new course for U.S. broadband policy, marking every network bit and byte for inspection, regulation and possible litigation. Every price, partnership, advertisement and experimental business plan on the Net would have to look to Washington for permission. Many would be banned. Wall Street will not deploy the needed $100 billion in risk capital if Mr. Markey, digital traffic cop, insists on policing every intersection of the Internet.
I included a similar warning in comments to the Federal Communications Commission last week.


February 16, 2008
The Coming Ad Revolution

Check out Taylor Frigon's blog post, "A paradigm's shift in the way you get information," which links to a story in the Wall Street Journal by Esther Dyson entitled: "The Coming Ad Revolution." Dyson's column discusses major changes in advertising that have been on their way for years but which few people today even see coming. Frigon writes:

The article outlines an impending paradigm shift in the way people find information, which will have a tremendous impact on the advertising business and those that support it.

But this revolution in the way that people find information will impact more than just the ad industry. We wrote about some of the potential implications in the world of search two months ago in a post entitled, "What is the future of search?" And there are thousands of other ways in which the kinds of changes that Dyson is discussing in this article will impact business and life beyond business.

George Gilder predicted these very same revolutionary forces in his 2000 book Telecosm: How Infinite Bandwidth Will Revolutionize our World. In chapter 18, "The Lifespan Limit," he wrote:

"The supreme time waster, though, is television. Many people still have trouble understanding how egregious a time consumer, how obsolete a business model, how atavistic a technology, and how debauched a cultural force it is. [. . .] For as much as seven hours a day, on average, consuming perhaps two thirds of your disposable time, year after year, all in order to grab your eyeballs for a few minutes of artfully crafted advertising images that you don't want to see, of products that you will never buy.

"[. . .] In the future, no one will be able to tease or trick you into watching an ad. Your time is too precious and you are too powerful. Advertisements will truly add value rather than subtract it (247 - 252)."

The value of your trusted circle of friends, family, colleagues, and the various networks to which you belong or with which you associate may become easier than ever to tap for help with decisions, diminishing the power of old-fashioned advertising as Gilder foresaw years ago and as Dyson describes in today's article.

You may well make purchasing decisions based on these existing networks, as well as based on new networks which arise to provide you with access to what products other consumers like you find valuable.

Based on this outlook, the tremendous valuations for companies like Google, whose revenues are based upon a very primitive version of tying advertisements to what you are looking for, may be something of a house of cards. If the paradigm is truly shifting in the ways that are foreseen by Dyson and Gilder, there are new opportunities few see now, and the companies most dominant today may become examples for future discussions of the topple rate.


February 1, 2008
A Breakdown of the Innovation Culture?

In preparation for the "Exaflood" paper, I read the November 2007 paper by Nemertes Research -- "The Internet Singularity Delayed: Why Limits in Internet Capacity Will Stifle Innovation on the Web." It is an exemplary supply-side work (low utilization rates signify inadequate bandwidth rather than lack of demand). Failure to invest in infrastructure will produce not a breakdown of the Internet but a breakdown of the innovation culture of the net that brought us YouTube et al.

I recommend the paper to all as a guide to the prospects of our network processor and hollow router paradigms. It contains a number of obvious errors (dates reversed on charts (p. 22), confusions between zettabits per second and petabits), and a "What, me worry?" approach to huge conflicts between the Nemertes and Odlyzko estimates of global capacity in 2000 (Odlyzko 85 petabytes per month; Nemertes 61 exabytes!). Today global access capacity is around a zettabyte (10^21 bytes) per month (2 plus petabits per second), with the U.S. commanding only one seventh of it (300 Tbps), or 14%, while our GDP was close to 20% and our market cap 40% (adjusting for dollar doldrums).
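
For readers keeping score on the units, the conversion arithmetic looks like this (assuming a 30-day month; the inputs are the figures above):

    # Unit arithmetic behind the capacity figures discussed above.
    PB, EB, ZB = 1e15, 1e18, 1e21   # petabyte, exabyte, zettabyte, in bytes
    month_seconds = 30 * 24 * 3600

    # A zettabyte per month as a sustained bit rate:
    print(f"1 ZB/month = {ZB * 8 / month_seconds / 1e15:.1f} petabits per second")  # ~3 Pbps

    # The gulf between the two year-2000 estimates of global capacity:
    print(f"Nemertes vs. Odlyzko: {61 * EB / (85 * PB):.0f}x apart")  # ~700x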

Meanwhile, U.S. investment in infrastructure (capex only) was roughly $5B out of a global total of $20B and U.S. investment in access equipment (again capex only) was under $1B or about a fifth of global access investment in capex ($5B plus). But the U.S. out-invests the rest of the world in edge router/switch connectivity for the metro and high-end enterprise. Enterprise IP traffic is estimated to be about 1.5 times Internet IP traffic, but convergence continues.

On page 29, the report contains a breakdown of core and edge router/switch unit growth that is relevant to our network processor paradigm. On the order of 10 to 15 thousand core routers are sold annually, compared to between 30 and 60 thousand edge and metro routers and literally billions of access nodes of all kinds. The NPA is a key product for volume production of NPUs for scale and learning curves.

The other insight is that IPv6 meets the need for addresses but does not respond to the expansion of routing tables that will slow the net in coming years if it is not remedied. The conclusion is a large need for CAMs, Knowledge Processors, and other memory and lookup table accelerators.
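
To see why lookup accelerators matter, recall what a router does for every packet: find the longest prefix in its table that matches the destination address. Here is a minimal software sketch of longest-prefix match, illustrative only; real routers do this in CAMs or specialized pipelines at line rate, and bigger tables mean more work per packet:

    # Minimal longest-prefix-match lookup over a binary trie -- the operation
    # that CAMs and lookup accelerators implement in hardware.
    import ipaddress

    class RouteTrie:
        def __init__(self):
            self.root = {}

        def insert(self, prefix, next_hop):
            net = ipaddress.ip_network(prefix)
            bits = format(int(net.network_address), "032b")[:net.prefixlen]
            node = self.root
            for b in bits:
                node = node.setdefault(b, {})
            node["hop"] = next_hop

        def lookup(self, addr):
            bits = format(int(ipaddress.ip_address(addr)), "032b")
            node, best = self.root, None
            for b in bits:
                if "hop" in node:      # remember the longest match seen so far
                    best = node["hop"]
                if b not in node:
                    break
                node = node[b]
            else:
                best = node.get("hop", best)
            return best

    table = RouteTrie()
    table.insert("10.0.0.0/8", "hop-A")
    table.insert("10.1.0.0/16", "hop-B")
    print(table.lookup("10.1.2.3"))    # hop-B: the /16 wins over the /8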

Nemertes declares that the U.S. confronts a coming bandwidth crunch in 2010, when access constraints will begin seriously to limit investment in Internet service innovation. The argument is that Moore's law increases in capacity will yield rising utilization rates and that traffic is supremely sensitive to utilization rates. In other words, as bandwidth increases we use it more and innovate more. I believe this. Others don't.

"If we build it, they will come," is the underlying assumption of Nemertes and me. People laugh but it is true over any run longer than a year or so.


January 29, 2008
Estimating the Exaflood

Bret Swanson and George Gilder predict that the U.S. Internet of 2015 will be at least 50 times larger than it was in 2006. Their report, "Estimating the Exaflood: The Impact of Video and Rich Media on the Internet -- A 'zettabyte' by 2015?," estimates annual totals for various categories of U.S. IP traffic in the year 2015. It projects:

  • Movie downloads and P2P file sharing of 100 exabytes
  • Internet video, gaming and virtual worlds of 200 exabytes
  • Non-internet IPTV of 100 exabytes, and possibly much more
  • Business IP Traffic of 100 exabytes
Gilder notes that an exabyte is equal to one billion gigabytes, or approximately 50,000 times the contents of the U.S. Library of Congress.
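
Summing the categories shows how the report's arithmetic hangs together (the per-category figures are the report's; the 2006 baseline is just the implied division):

    # Add up the report's 2015 projections and back out the 2006 baseline.
    projections_eb = {
        "movie downloads and P2P": 100,
        "Internet video, gaming, virtual worlds": 200,
        "non-Internet IPTV": 100,
        "business IP traffic": 100,
    }
    total_eb = sum(projections_eb.values())
    print(f"projected 2015 total: {total_eb} exabytes")             # 500 EB
    print(f"implied 2006 baseline: ~{total_eb / 50:.0f} exabytes")  # ~10 EB if 2015 is 50x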

This report expands on Swanson's article "The Coming Exaflood," which was published in the Wall Street Journal on January 20, 2007.

Let Technology Revive the Economy
Swanson and Gilder point out that the network isn't currently designed to handle this increase.

Internet growth at these levels will require a dramatic expansion of bandwidth, storage, and traffic management capabilities in core, edge, metro, and access networks. A recent Nemertes Research study estimates that these changes will entail a total new investment of some $137 billion in the worldwide Internet infrastructure by 2010. In the U.S., currently lagging Asia, the total new network investments will exceed $100 billion by 2012.
Wow, this is roughly comparable to the projected cost of the economic stimulus bill now winding its way through Congress ($146 billion). But I'll bet no one's thought of empowering the telephone and cable companies to revive the economy. That would mean scrapping welfare for Silicon Valley (aka network neutrality legislation) and eliminating discriminatory taxation of communications services. Nah, not as long as they can contribute boatloads of cash to politicians through their political action committees. Swanson and Gilder correctly point to a fact that's lost on the political class:
Technology remains the key engine of U.S. economic growth and its competitive edge. Policies that encourage investment and innovation in our digital and communications sectors should be among America's highest national priorities.


December 3, 2007
Exa-blogging

Two links of interest:

(1) The New York Times technology blog BITS takes up the Exaflood question here.

and

(2) Matthew Yglesias of The Atlantic endorses Paul Krugman's broadband vision that says the U.S. has fallen hideously behind other nations because of too little regulation. Here was my reply in the comments section:

Hyper-regulation of telecom networks in the late 90s and early 2000s blocked investment and helped cause the telecom/tech crash. America fell far behind Korea and other Asian and European nations, with Korea by 2003 boasting some 40 times the per capita bandwidth of the U.S. Then the U.S. wised up, the FCC and state utility commissions relaxed or eliminated many of our dumb anti-investment regulations, and fiber-optic and wireless broadband investment is now BOOMING. Verizon and AT&T are in the midst of a massive fiber-optic buildout that might approach $50 billion. Verizon's FiOS service offers speeds up to 50 Mbps, with plans to offer 100 Mbps. AT&T is offering a 25 Mbps service. The cable companies -- who were always less regulated and thus took the broadband lead over the last decade -- are now responding in kind, increasing their 3 and 5 Mbps services up to 15 and even 30 Mbps in many areas. After a disastrous decade of regulatory paralysis, we are now in a virtuous upward spiral of competition, and American broadband is now hitting on all cylinders, with world-class speeds and services either being delivered right now or just around the corner. Virtually the only things that could stop these positive developments are the very actions Krugman et al. call for: net neutrality regulation, re-regulation of cable now being sought inexplicably by FCC chair Martin, and more government involvement in general.

Bret Swanson





November 26, 2007
New IP traffic study

This is just one of a flurry of stories about a new Internet traffic study from Nemertes Research. Nemertes approaches the topic from a different angle than, say, the Cisco study, which projected growth of particular Net applications. Nemertes instead believes that many new applications and innovations are beyond our ability to predict with any degree of precision and thus uses estimates of future network capacity and utilization rates to arrive at traffic projections. The one thing we know for sure, they say, is that we will find ways to use up a certain large portion of new bandwidth, just as we find creative ways to use or even "waste" digital MIPS and storage. Nemertes' approach is somewhat appealing on the surface, but I'm still digesting the report and will have lots more to say.
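
The supply-side method can be sketched in a few lines. The capacity, growth, and utilization numbers below are placeholder assumptions of mine, not Nemertes' figures; the point is only the shape of the calculation:

    # Sketch of a supply-side traffic projection: capacity times utilization.
    capacity_pbps = 3.0    # assumed access capacity, petabits per second
    growth = 1.4           # assumed annual capacity growth rate
    utilization = 0.30     # assumed fraction of capacity actually used

    for year in range(2008, 2013):
        traffic_pbps = capacity_pbps * utilization
        print(f"{year}: capacity {capacity_pbps:6.1f} Pbps -> traffic {traffic_pbps:5.1f} Pbps")
        capacity_pbps *= growth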

Bret Swanson


November 15, 2007
HDTube

High definition is coming to YouTube.


November 12, 2007
Web 50.0 feeds the exaflood

Peter Huber tours the exotic locale of a teenager's bedroom. There he finds the fiber-fed 3D digital trickles that are beginning to puncture the narrowband dike and feed the exaflood.

Let's not forget how rotten today's Web really is. Amazon is useless if you love picking your way through books stacked high on tables, flipping pages and skimming dust jackets. Normal people don't shop for groceries by clicking boxes on a meticulously prepared list; they make choices as they stroll down aisles packed with merchandise. Or, for an expert opinion on your so-called digital life, drag your teenager away from his Xbox to help you shop for a new minivan. Show him the neat video feature that takes you inside the car through the lens of a camera that you can pan, tilt and turn with your mouse. Tell him you think it's "way cool." He won't know whether to laugh or to cry.

The graphics on his Xbox are cool. And while his fancy joystick can't type, it can move him through virtual space a whole lot better than a mouse can. The Wii remote incorporates motion sensors, and, primitive as they still are, they let you stand in your living room really swinging a "Wiimote" bat or club at the virtual ball. If you could plug a strand of glass into the far side of the box, you could race the dealer's virtual minivan on a Nascar track against Richard Petty.

As I attempt to quantify the exaflood in terms of IP traffic, I would put the applications Huber describes here under the heading of "Virtual Worlds" -- though gaming and virtual worlds will cross over and converge and expand beyond our current conceptions. Huber reminds us that this category will be even larger and more diverse than we commonly believe.

Bret Swanson


November 2, 2007
High Def will become "standard"

Verizon's fiber-optic FiOS service will boost its high-definition content five-fold to 150 HD channels. VZ also says it has gotten a better response to its high-end 50 Mbps broadband offering and could move faster towards its goal of delivering 100 Mbps by reducing the number of homes served by each 2.4 Gbps GPON (passive optical network) node to 24.
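
The arithmetic behind that move is simple: shrink the split and each home's share of the node's 2.4 Gbps rises. A quick sketch (the 64- and 32-way splits are common GPON configurations added for comparison; only the 24-way figure is Verizon's):

    # Per-home share of a 2.4 Gbps GPON node at different split ratios.
    gpon_gbps = 2.4
    for homes in (64, 32, 24):
        print(f"{homes} homes per node -> {gpon_gbps * 1000 / homes:.0f} Mbps each")
    # 64 -> 38 Mbps, 32 -> 75 Mbps, 24 -> 100 Mbps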


October 3, 2007
An exabyte here, an exabyte there...

...and pretty soon we're talking real Internet traffic.

On Monday I gave the keynote at the Fiber-to-the-Home Conference in Orlando. I opened my talk by citing an amusing article by John Markoff of The New York Times from 1993. In that year, Gopher, an early Google of sorts, grew 400%. It retrieved an unimaginable "200 billion bytes a month -- about 7 million newspaper pages." Or about the size of today's average PC hard-drive. "At the National Center for Supercomputer Applications in Champaign, Ill.," the article continued,

a new service that answers requests to an electronic library called the World Wide Web, has seen the number of daily queries explode from almost 100,000 requests in June to almost 400,000 in October. Officials at the center say the only solution may be to take a $15 million supercomputer away from its normal scientific number-crunching duties and employ it full time as an electronic librarian.

Imagine, $15 million to run the WWW. Today Google alone spends between $2 and $3 billion per year on "digital librarians" (read chips and disks), and YouTube streams 100 million videos a day.

In 1993 total Internet traffic was around 100 terabytes for the year. Today you can order a Dell PC with a TB of disk storage. By 2010, Seagate will put 3 TB on a single 3.5-inch disk.

I closed my talk Monday by estimating that by 2015, U.S. IP traffic could reach 1 zettabyte -- 10 to the 21st power. That's 10 million times larger than the 1993 Internet. World Internet traffic will cross the zettabyte level several years before the U.S.
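
That growth factor is easy to check:

    # From ~100 terabytes in 1993 to a projected zettabyte in 2015.
    traffic_1993_bytes = 100e12   # ~100 TB for all of 1993
    traffic_2015_bytes = 1e21     # 1 ZB projected for U.S. IP traffic in 2015

    print(f"growth: {traffic_2015_bytes / traffic_1993_bytes:,.0f}x")  # 10,000,000x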

-Bret Swanson


September 10, 2007
Traffic Report

Professor Andrew Odlyzko has long been one of the key trackers of Internet traffic. See here his brand new website -- called MINTS for Minnesota Internet Traffic Studies -- dedicated to U.S. and global Internet trends.


August 28, 2007
Cisco's exabyte estimates

Check out Cisco's new estimates of Web application trends and Internet traffic growth in the "Exabyte Era," as they call it.

-- The Exabyte Era
-- The methodology

We couldn't help noticing a similarity to our work on the "Coming Exaflood." There's lots of great stuff in these reports, but one projection struck me as extremely conservative. Two-way video conferencing won't take off until 2015 or after? I doubt it will happen next year, but I also doubt it will take eight years. Anyway, Cisco's new 15 Mbps synchronous Telepresence videoconferencing system is apparently sci-fi awesome.


August 8, 2007
Exaflood Powers Cisco

From today's Wall Street Journal:

Web Growth Fuels Cisco

"Cisco Chief Executive John Chambers in a conference call said the Internet is entering its second major phase, brought about by new technologies like social networking and online video. He said such trends will result in innovation and productivity increases for businesses, which will also be a boon for Cisco. "We believe there's an opportunity to be an instant replay to what occurred for Cisco in the very early 1990s," when the company experienced rapid growth, he said."


July 20, 2007
FTTH v. Net Neutrality

The U.S. ranks 11th worldwide in fiber-to-the-home penetration, according to a new study from the globe's three FTTH Councils in Europe, Asia, and North America. With tens of billions of dollars worth of new optical networks under construction by the major telcos, we should move up the list smartly in the next few years. The biggest obstacle to an improvement from No. 11 and an unleashing of an exaflood of rich new content and services is, of course, Net Neutrality regulation.

-Bret Swanson


March 24, 2007
Exaflood debate

The Fiber-to-the-Home Council has a new video clip nicely summarizing the impact rich video will have on Internet traffic and the resulting need for massive new network investment. In this, it follows the themes of an article I wrote in The Wall Street Journal on Jan. 20 called "The Coming Exaflood."

Prolific blogger Tim Lee doesn't like the "exaflood" idea and compares it to "peak oil" theories, which is funny because in my previous Journal column I debunked "peak oil." Tim says peak oil assumes insatiable demand for oil, but in fact what peak oil asserts is that even though there is more oil in the ground, oil production will soon peak and begin a long, slow downward slide.

Although I strongly disagree with this theory and believe petroleum is fabulously abundant, bandwidth is even more so. At least potentially. We continue to find new ways to reuse the already expansive electromagnetic spectrum. It just takes dollars to build the networks.

Lee seems to misunderstand the exaflood as a "crisis" where a rush of data will overwhelm and crash the Internet. That is not my thesis. Lee claims to disagree with the exaflood premise, but then he uses the very YouTube example from my "Coming Exaflood" article. Broadcast.com -- the YouTube of its day -- didn't work...because there wasn't enough bandwidth.

I believe the video clip that Lee questions explicitly says that the exaflood is not a bad thing, so long as we build enough network capacity in the right places to accommodate the new applications coming online.

The ecosystem of networks (core and edge), bandwidth, software, consumer devices, and Net applications is a complex technical and business environment. These things build on each other. They interact. Companies build more network capacity because they think new applications will consume more bandwidth. People create more bandwidth-hungry apps because they think there will be more broadband. But as we saw in the telecom crash, if the ambition of the applications -- in that case, the thousands of dot-coms that failed -- outstrips the capabilities of the network, both the app companies and the Net companies might fail business-wise, but the network itself may not crash.

Here Lee and I agree. If new fiber-optic broadband links to homes and businesses are not built, there won't be an "exaflood," at least not as I envision it. Lee is correct that "demand for data is not an independent variable." Demand is, of course, essentially infinite, but effective demand -- what we can get for what we're willing to pay -- is dependent on the quality and capacity of the network.

Now it's possible that if millions more people suddenly get radically better broadband connections, and they, again, suddenly all begin doing hi-res video conferencing, and if the network companies get caught short and haven't built out their core networks, then there could be a bandwidth crisis in the middle of the Net. But this is not what I, at least, have been predicting. I don't equate the exaflood with a crisis. I think it is a great sign of advance. It will be good for hardware, software, network, and applications companies. It will be good for consumers. It will happen if we let it.

It is inevitable in the long term, but we can stop it in the short term if we enact bad, investment-killing policy. If we retard the build-out of high-speed fiber networks to homes and businesses, there will be no exaflood, and we could, once again, crash the Internet ecosystem.
