disco-tech | Discovery Institute's Technology Blog: Internet Archives

February 24, 2014
Advertisers vs. ad-supported pirate sites

A sampling of 596 web sites that deal primarily in pirated content made an estimated $227 million in annual advertising revenues, according to the Digital Citizens Alliance (See: "Good Money Gone Bad: Digital Thieves and the Hijacking of the Online Ad Business - A Report on the Profitability of Ad-Supported Content Theft"). "The 30 largest sites studied that are supported only by ads average $4.4 million annually, with the largest BitTorrent portal sites topping $6 million. Even small sites can make more than $100,000 a year from advertising."

"It is important to note that the advertising profits garnered by content thieves do not equate with the losses incurred by the owners of the content," notes the report. "These losses are unquestionably greater by many orders of magnitude..."

Fortunately, the advertising industry is not willing to tolerate intellectual property infringement. "The future health of digital media is at stake," according to Bob Liodice, head of the Association of National Advertisers, "and we owe it to ourselves, our industry and its brands to attack the issue head-on."



January 9, 2012
Opponents overreact to online piracy legislation


Showdowns are likely in the Senate and House of Representatives later this month on legislation combating online piracy. The House Judiciary Committee is expected to vote on the Stop Online Piracy Act, H.R. 3261 (SOPA), and the full Senate on the Preventing Real Online Threats to Economic Creativity and Theft of Intellectual Property Act, S. 968 (Protect IP Act). These measures have generated some overheated rhetoric.

A recent column in Roll Call by Stephen DeMaura and David Segal, entitled "All Candidates Should Be Concerned About SOPA," for example, suggests that SOPA could be exploited by political opponents to restrict free speech.

Here's a plausible campaign scenario under SOPA. Imagine you are running for Congress in a competitive House district. You give a strong interview to a local morning news show and your campaign posts the clip on your website. When your opponent's campaign sees the video, it decides to play hardball and sends a notice to your Internet service provider alerting them to what it deems "infringing content." It doesn't matter if the content is actually pirated. The ISP has five days to pull down your website and the offending clip or be sued. If you don't take the video down, even if you believe that the content is protected under fair use, your website goes dark.
Another recent column in Politico by Tim Mak entitled "Bloggers: SOPA's the end of us" makes a similar claim and implies a tidal wave of opposition is forming (we shall see):
The conservative and liberal blogospheres are unifying behind opposition to Congress's Stop Online Piracy Act, with right-leaning bloggers arguing their very existence could be wiped out if the anti-piracy bill passes.
There is no way these bills would permit an opposing campaign or campaign committee to pull down websites harboring "infringing content," nor would they authorize censorship of lawful speech.



April 23, 2010
My canonical paradigm

A Chinese American entrepreneur and engineer named Henry Gao has written a Chinese book paralleling, enriching and affirming the more far-reaching propositions in Telecosm.

His theme is that the history of communications networks has passed through three eras: 1) the telegraph (data with delay and buffering); 2) the public switched telephone network (PSTN, for real-time two-way voice); and 3) now back to the telegraph (the data-rich Internet protocols and layers, with many asynch buffers and best efforts and lost bits).

Today under the stress of an interactive video exaflood, there is a new fork in the road. On the one hand, the industry wants to continue on its current path back to a new video best efforts telegraph--an ever more complex Internet patched and epicycled and upgraded for interactive video. This is the current choice.



April 7, 2010
FCC gets squashed
The U.S. Court of Appeals for the D.C. Circuit ruled that the authority the FCC used to regulate Internet access providers is very limited. The ruling is obviously a victory for broadband Internet access providers. But it is also a victory for the rest of us. As the court noted, the legal interpretation the FCC fought to defend "would virtually free the Commission from its congressional tether."

In Comcast v. FCC, the court held that the FCC lacked authority to sanction Comcast for discriminating against peer-to-peer file sharing it deemed necessary to manage scarce network capacity. The opinion was written by Judge David S. Tatel, a Clinton appointee.

The question before the court was whether the FCC has any jurisdiction to regulate Internet access providers' network management practices. The FCC acknowledged it has no express statutory authority, but it argued that section 4(i) of the Communications Act of 1934 (47 U.S.C. § 154(i)) authorizes it to "perform any and all acts, make such rules and regulations, and issue such orders, not inconsistent with this chapter, as may be necessary in the execution of its functions."

Courts have referred to the commission's section 4(i) power as its "ancillary authority." The FCC successfully used this authority in 1968 to restrict the geographic area in which a cable company could operate, even though the Communications Act gave the commission no express authority over cable television at the time. The FCC acted reasonably when it limited the retransmission of distant broadcast signals by cable operators, according to the Supreme Court, because otherwise the commission's ability to fulfill its statutory responsibility for fostering local broadcast service could have been thwarted.

This and subsequent cases established the principle that the FCC may exercise its "ancillary" authority only if it demonstrates that its action is "reasonably ancillary to the . . . effective performance of its statutorily mandated responsibilities." In the present case, the FCC couldn't cite a single statutory responsibility affected by an Internet access provider's interference with peer-to-peer communications.

Comcast v. FCC presents a fairly high hurdle for the FCC to overcome going forward to the extent it seeks to regulate the Internet on the basis of "ancillary" authority.

Public Knowledge is already at work on a backup plan. It recently filed a petition asking the FCC to reclassify high-speed Internet access as a "telecommunications" service subject to common carrier regulation under Title II of the Communications Act -- "the home of some heavy-handed regulation, to be sure," notes Susan Crawford (a former Special Assistant to the President for Science, Technology, and Innovation Policy). She nevertheless supports the idea.

One problem with this approach is the FCC has already taken the position that high-speed Internet access is not a telecommunications service subject to Title II common-carrier regulation. This determination was upheld by the Supreme Court in NCTA v. Brand X (2005). The commission's logic, noted the Supreme Court, was that cable companies do not "offe[r] telecommunications service to the end user, but rather . . . merely us[e] telecommunications to provide end users with cable modem service."

In other words, a product or service does not become "telecommunications" subject to heavy-handed Title II common carrier regulation just because it utilizes telecommunications. Imagine the consequences of taking the opposite approach and saying that if a product or service (insert your own example) includes a telecommunications component, the FCC in its discretion can treat it as "telecommunications."

Public Knowledge's target is not every product or service, just broadband Internet access providers. The sweeping power it is urging the FCC to grasp, however, knows no such bounds.

It is a fallacy to assume there can be one set of rules for broadband service providers and another set for everyone else. Since broadband Internet service providers are not monopolies, it will not be possible to relegate them to legal categories designed for monopolies. The courts will have no choice but to ensure that their rights and responsibilities are reasonably consistent with ours, and ours with theirs.

Meanwhile, the FCC, like all federal agencies, needs a congressional tether, and thanks to the D.C. Circuit Court of Appeals it still has one. If in its wisdom Congress believes it is appropriate for the FCC to have statutory authority to regulate the Internet, it can always supply that.




March 18, 2010
Internet a "treasure trove" for catching criminals
Some of the fascinating ways social media sites are making it easier for police to nab the bad guys, from FoxNews.com:
  • Police in Indiana were able to arrest a New York fugitive who in effect turned himself in by posting his workplace on his MySpace and Facebook pages.
  • A Florida man who was convicted of murdering his friend was caught because he posted pictures of the friend on his MySpace page next to the words "rest in peace" and "live through me," hours before the death was even reported to police.
  • A burglar in Pennsylvania who ransacked a home and stole some jewelry was caught after the victim found the burglar's Facebook account open on her home computer.
  • A bank fraud fugitive in Washington was turned in by a Facebook friend when the fugitive boasted about his new life in Mexico.
  • An Oregon burglar was caught when his victim found the stolen property advertised on Craigslist.
"We utilize data posted on blogs and various social networking sites like Facebook, MySpace and Twitter to validate and corroborate information we have developed regarding the target of an investigation," a retired New York detective told FoxNews.com. "...These sites are a potential treasure trove."

February 5, 2010
Why antagonize China?

From George Gilder's column in today's Wall Street Journal,

Meanwhile, Secretary of State Hillary Clinton and the president's friends at Google are hectoring China on Internet policy. Although commanding twice as many Internet users as we do, China originates fewer viruses and scams than does the U.S. and with Taiwan produces comparable amounts of Internet gear. As an authoritarian regime, it obviously will not be amenable to an open and anonymous net regime. Protecting information on the Internet is a responsibility of U.S. corporations and their security tools, not the State Department.
The full column is here.


January 3, 2010
State of IP

At Telephony Online, Rich Karpinski notes,

In today's carrier networks, IP may not always be hyped or even seen, but it is indeed everywhere -- and in 2010, it's only going deeper and making an even bigger impact.
I think this protocol proliferation in the name of IP is the death rattle of the old network. IP is a data protocol, so of course it dominated the enterprise market; and it is prevalent on the Internet, so of course Internet players such as Google want it upgraded for so-called multimedia.

But the message of all the brave talk about "ultimate outcomes that have yet to take hold today" is that once again it is becoming reasonable to predict that cable will win. Cable TV is already frankly devoted to the transmission of the high-definition interactive video that will comprise 99 percent of network traffic. This is the black hole into which all the plans for sophisticated Rich Communications Suites, guaranteed QoS, Internet Multimedia Services, and all the abortive plans for Long Term Evolution (LTE) will fall.

The companies for the new era will be the hardware enablers of broadband interactive video: graphics and network processors, optical transponders, wavelength division muxing gear, and optical circuit switches for the new TDM circuits that will be crucial for the robust streaming video that will be at the heart of the market.

That's the Henry Gao Telecosm and I'm sticking to it. Upgrading the old networks for video and multimedia, one service at a time, is a non-starter. It will be swept away by truly broadband wavelength circuits optimized for interactive video streams. Within these circuits all other traffic can flow without significant additional expense.

Security, routing, session management, and switching all will be done on the customer edge and the datacenter, which will comprise the bulk of the server edge.

Unless the telcos grasp that their old circuit model is relevant again, they are going to give way to cable TV players who already get the picture in high definition and are moving ever closer to video teleconferencing.


March 1, 2009
Goodbye, local newspaper

The Rocky Mountain News has published its last edition, the San Francisco Chronicle may be next, and unfortunately this may be just the beginning.

Rush Limbaugh says "[w]e haven't even gotten close to how bad it was in the 1970s. We haven't gotten close to the recession in 1982." Yet within the newsrooms of America it is a different story; it is a Depression for the employees of newspapers, and they are the people from whom the rest of us get the news and the spin.

Slate editor Michael Kinsley points out that

the harsh truth is that the typical American newspaper is an anachronism. It is an artifact from a time when chopping down trees was essential to telling the news, and when you couldn't get The New York Times or The Washington Post closer to your bed than the front door, where the local paper lies, sopping wet.

The Times, The Post and a few others probably will survive. When the recession ends, advertising will come back, with fewer places to go.

Computers and the Internet have reshaped the newspaper industry as they revamped so many others, like the old locally headquartered bank or telephone company or department store. They have made it completely unnecessary to have virtually stand-alone enterprises in cities across America that collect, print and distribute the news locally. Most of what the local newspapers publish is national and international news from a wire service, or locally produced content that could have been produced in any locality, so most of their operations are highly redundant.

It is sad to behold; but before getting too sentimental, it's worth remembering that local newspapers are cozy little monopolies, duopolies or oligopolies. And nothing lasts forever, not even market power. Nor should it!

There will always be an appetite for local news, but a national publication can easily duplicate that with small local bureaus and regional editions.

The implication for politicians is interesting: no more threat of a local newspaper that opposes your reelection, but much harder to grab headlines, especially for the trivial stuff you do.

The consolidation of the industry was inevitable.

There is still the issue of how even the few newspapers that survive can attract sufficient revenue to remain viable. Charging subscribers is futile and unnecessary. Kinsley observes that

[n]ewspaper readers have never paid for the content (words and photos). What they have paid for is the paper that content is printed on. A week of The Washington Post weighs about eight pounds and costs $1.81 for new subscribers, home-delivered. With newsprint (that's the paper, not the ink) costing around $750 a metric ton, or 34 cents a pound, Post subscribers are getting almost a dollar's worth of paper free every week -- not to mention the ink, the delivery, etc. The Times is more svelte and more expensive. It might even have a viable business model if it could sell the paper with nothing written on it. A more promising idea is the opposite: give away the content without the paper. In theory, a reader who stops paying for the physical paper but continues to read the content online is doing the publisher a favor.
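Kinsley's arithmetic is easy to check. Here is a quick back-of-the-envelope sketch in Python, using only the figures quoted above (eight pounds of paper a week, $750 per metric ton of newsprint, $1.81 a week for new subscribers):

```python
# Back-of-the-envelope check of Kinsley's newsprint arithmetic,
# using only the figures quoted above.
paper_lbs_per_week = 8                 # weight of a week of the Post
newsprint_usd_per_lb = 750 / 2204.6    # $750 per metric ton (~34 cents/lb)
subscription_usd_per_week = 1.81       # new-subscriber home delivery

paper_value = paper_lbs_per_week * newsprint_usd_per_lb
print(f"Raw newsprint value: ${paper_value:.2f} per week")        # ~$2.72
print(f"Paper given away free: ${paper_value - subscription_usd_per_week:.2f} per week")  # ~$0.91
```

The numbers come out as Kinsley says: roughly $2.72 of raw paper delivered for $1.81, or almost a dollar's worth of paper free every week.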
It might seem that charging readers is essential, since advertising appears hopeless: Internet search-related advertising is so much more efficient for advertisers.

But online newspapers can target ads to some extent. And if advertising really is capable of motivating people to buy something they did not know they wanted or thought they needed, then newspapers are in a great position to cater to potential buyers who surf the Web for news and entertainment. Just like (almost) no one buys a newspaper to look at the ads, (almost) no one visits the Web to conduct searches. They're looking for content.

This is not the first impossible-seeming monetization problem. There was much consternation in the airline industry following deregulation in the late 1970s over how carriers could possibly induce some travelers to pay exorbitant ticket prices while simultaneously offering the significantly lower fares needed to fill all the empty seats at the time. Robert Crandall of American Airlines came up with the frequent flyer program, which was part of the answer.

Somewhere there is a creative newspaper executive who will figure this out, too.

* * * *

ADDENDUM -- James DeLong's excellent column, "Preparing the Obituary," appeared in The American on Mar. 3, 2009, after I wrote this post. Among other things, he notes that

The editor of the New York Times just "challenged the belief among some of the digerati that 'information wants to be free,' saying 'a lot of people in the news business, myself included, don't buy as a matter of theology that information "wants to be free." Really good information, often extracted from reluctant sources, truth-tested, organized, and explained--that stuff wants to be paid for.'" The LA Times talked of the need for an antitrust exemption so newspapers can jointly agree to stop giving away the product, and several columnists have chimed in about the need for monetization.


November 25, 2008
Bracing for new regulation

Observers predict stepped-up regulatory battles in telecom, according to the Wall Street Journal,

New congressional leaders as well as policy makers in the Obama administration are expected to press for fresh limits on media consolidation and require phone and cable firms to open their networks to Internet competitors, lobbyists and industry officials say.
The article overlooks the fact that broadcast ownership limits and forced access policies are restraints on the free speech rights of broadcasters and network providers, and that the constitutionality of new regulation could ultimately be decided by the courts.


June 6, 2008
Telecosm recap

You should have been there! Telecosm was thrilling. I will list the ways, in chronological order, in two or three posts over the next few days. (Below is Part 1.)

1) Lawrence Solomon, author of The Deniers, demonstrated, beyond cavil, that nearly all the relevant scientists, outside of the government echo-chambers, completely repudiate the climate panic. He concluded by pointing to evidence for a cooling trend ahead.

2) After I presented the statistics showing that most of the global economy is driven by innovation in the Telecosm--teleputers, datacenters, optical fiber, fiberspeed electronics--Steve Forbes gave a magisterial tour of the world economy. Relevant to the debates on the Gilder Telecosm Forum subscriber message board was his assertion that the Fed had been too loose in the face of a collapse in the demand for dollars caused by the muddled cheap-dollar leadership from the administration. Later in the conference, in an incandescent speech mostly about the amazing expansion of freedom and supply-side economics in China, John Rutledge maintained that the Fed had been too tight, measured by the flat monetary base. But then, as far as I could grasp, Rutledge contradicted himself by showing a dramatic surge of bank lending to small and midsized businesses. If it was caused by the collapse of other lending sources, he did not give any evidence.

3) Nicholas Carr gave a suave and lucid presentation of the themes of his book The Big Switch, comparing the emergence of cloud computing to the rise of the centralized power grid. Raising an issue that recurred throughout the conference, our regnant expert on the power grid, Carver Mead, dismissed the analogy as simplistic, since one-way power delivery and two-way information transfer are radically different processes. Bill Tucker, author of the forthcoming Terrestrial Energy, pointed out in a compelling speech that Moore's Law is about miniaturization of bits while the energy industry is better described by a Law of More--more power and more efficiency. He explained that all the energy in the atom is in the nucleus and pointed to the immense heat caused by nuclear fission and fusion within the earth. Then he impugned the venture capitalists' compulsion to waste arable land and space twiddling with electrons and photons and presented much evidence that solar energy in all its forms would never provide adequate power for an ever-growing economy. Physicist Howard Hayden of Energy Advocate enthusiastically confirmed this view.

4) Andy Kessler followed with an uproarious investigation of "Who Killed Bear Stearns?" His answer pointed not to the usual culprits (though he did politely finger front-row auditors me and Bob Metcalfe) but to Bear Stearns itself. After preparing a feculent feast of sub-prime pork ("they knew better than anyone else what was in it"), then packaging it all into putatively succulent AAA delicacies, they totally lost it and ate their own sausages.

5) The Exaflood Panel presented Andrew Odlyzko's dour but learned analysis of Internet traffic, which concluded that the real danger is not too much traffic but not enough to sustain all the businesses in the sector. Joe Weinman, a brilliant strategist from AT&T, however, confirmed the Exaflood thesis, and Johna Till Johnson of Nemertes offered compelling evidence that the best way to examine the issue is from the supply side. If you don't build it, they definitely will not come. Traffic in the core is dependent on access from the edge, which still lags in the US; even Odlyzko showed rates of usage in Korea and Hong Kong six times US usage rates. Lane Patterson of Equinix confirmed aggressive estimates of traffic growth and still more ambitious growth of Equinix datacenters, but said that patterns of traffic confirm that the core is being starved by inadequate access on the edge.


May 5, 2008
Terabit Ethernet coming soon

George Gilder is getting some well-deserved recognition in Technology Review in an article by Mark Williams entitled "The State of the Global Telecosm - The most notorious promoter of the 1990s telecom boom has been proved right."

"I'm a fan of George Gilder, the bubble bursting notwithstanding," Ethernet co­inventor Bob Metcalfe (a member of Technology Review's board of directors) told me after his San Diego keynote speech, "Toward Terabit Ethernet." Metcalfe had told his audience not only that optical networks would soon deliver 40- and 100-gigabit-per-second Ethernet--standards bodies are now hammering out the technical specifications--but also that 1,000-gigabyte-per-second Ethernet, which Metcalfe dubbed "terabit Ethernet," would emerge around 2015. Why, I asked, did Metcalfe believe this? "Last night, Gilder spoke to 300 of us at an executive forum about his 'Exaflood' paper, in which he predicts a zettabyte of U.S. Internet traffic by the year 2015," Metcalfe said. "Since I admire Gilder, I extrapolated from his prediction."

An exabyte is 10^18 bytes of data; a zettabyte is 10^21 bytes. Metcalfe pointed to video, new mobile, and embedded systems as the factors driving this rising data flood: "Video is becoming the Internet's dominant traffic, and that's before high definition comes fully online. Mobile Internet just passed a billion new cell phones per year. Then totally new sources of traffic exist, like the 10 billion embedded microcontrollers now shipped annually."
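For a sense of scale, here is a quick back-of-the-envelope sketch in Python of what a zettabyte a year would mean as a sustained average data rate; the zettabyte figure is Gilder's prediction quoted above, and the rest is straight unit conversion:

```python
# What a zettabyte per year of U.S. Internet traffic implies as an
# average sustained data rate. The zettabyte figure is Gilder's
# prediction quoted above; everything else is unit conversion.
ZETTABYTE_BYTES = 10**21
SECONDS_PER_YEAR = 365 * 24 * 3600          # ~3.15e7 seconds

avg_bytes_per_second = ZETTABYTE_BYTES / SECONDS_PER_YEAR
avg_terabits_per_second = avg_bytes_per_second * 8 / 10**12
print(f"Average sustained load: {avg_terabits_per_second:,.0f} Tbit/s")  # ~254 Tbit/s
```

Roughly a quarter of a petabit per second of average load, around the clock, across the U.S. network.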

Metcalfe also addresses the interesting question of whether there is sufficient capacity in the Internet backbone to accommodate the surging traffic:
Did Metcalfe believe that the existing infrastructure--built in the boom years, when great excesses of fiber-optic cable were laid down--could support terabit Ethernet? "That dark fiber laid down then is being lit up, and some routes are now full," he said. "That's the principal pressure to go to 40 and 100 gigabits per second. It seems we can reach those speeds with basically the same fibers, lasers, photodetectors, and 1,550-nanometer wavelengths we have, mostly by means of modulation improvement. But it's doubtful we'll wring another factor of 10 beyond that." Thus, the backbone networks would need to be overhauled and new technologies implemented.


April 25, 2008
The bandwidth conundrum

John Dvorak, PCMag.com:

In today's world, bandwidth demand is similar to what processing demand was 20 years ago. You just can't get enough speed, no matter how hard you try. Even when you have enough speed on your own end, some other bottleneck is killing you.

This comes to mind as, over the past few months, I've noticed how many YouTube videos essentially come to a grinding halt halfway through playback and display that little spinning timer. Why don't they just put the word "buffering" on the screen?

All too often, it's not the speed of my connection that's at issue--it's the speed of the connection at the other end. It may not even be the connection speed itself; it may simply be the site's ability to deliver content at full speed under heavy demand.

This concerns me, since I'm an advocate of IPTV and other technologies that need lots of speed to work. We seldom consider the fact that if something becomes hyper-popular (like YouTube), user demand on the system is enormous and can easily break the system from the demand side....


Interesting article that misses the chief recent development on the net: the huge advances in the efficiencies of the datacenters that dispense these web pages. The "cloud" computing paradigm, pioneered by Google, is now going mainstream, as Nicholas Carr, Telecosm speaker this year (www.TelecosmConference.com), documents in his intriguing book. For example, Jules Urbach--our movie and virtual world renderman and Telecosm star with his Lightstage corporation--can send images from thousands of different "viewports" per second from his graphics-processor-based OTOY servers, which can scale to millions of users. A company called Azul has developed cheap scalable datacenter technology that delivers terabits per second from its OS-neutral Java-based clusters of servers.

The bottleneck is rapidly moving back to where it has long resided: at the last mile, where passive optical networks, such as VZ's FiOS, are increasingly necessary. For IPTV, content delivery networks (CDN) from Akamai and its increasing throng of video rivals using a variety of ingenious delivery algorithms will eclipse the cumbersome BitTorrent mesh model, which shuffles video files through underused personal computers across the network.


April 8, 2008
Google's bids

Communications Daily ($) cited my recent post comparing Google's limited objectives for the 700 MHz auction with the expansive objectives it outlined to the Federal Communications Commission last summer, and it included the following reaction to my comments from Richard Whitt of Google:

Whitt said in response that Haney had misread his company's comments from last summer. "We consistently have argued that the open access license conditions adopted by the FCC would inject much-needed competition into the wireless apps and handset sectors, but would not by themselves lead to new wireless networks," he said Monday. "Only if the commission had adopted the interconnection and resale license conditions we also had suggested -- which the agency ultimately did not do -- would we have seen the potential for new facilities-based competition."

Another way to look at this: if there wasn't any potential for new facilities-based wireless competition without the interconnection and resale license conditions Google wanted, why would Google have submitted bids for spectrum it might have won and had to pay for?

I do agree that, prior to the FCC's adoption of two of the four open platform principles Google proposed, the company consistently premised its commitment to participate in the auction on the FCC adopting all four principles. I also agree Google was clear that it believed all four principles were necessary to promote competition.

Then it participated in the auction anyway.

This case may reveal how some regulators and some legislators are shrewd, have their own ideas about how to get what they want and even think they know what's in the best interest of corporations like Google.

It makes sense, as Whitt told Communications Daily, that the interconnection and resale license conditions would seem necessary to a hypothetical competitor who is a network provider. But in its Jul. 9th letter (and in the statement to Communications Daily) Google characterizes all four principles as being relevant to whether a new entrant would bid for the spectrum. For example:

Should the Commission not adopt the four open platforms requirements listed above, we believe it is doubtful that even the most determined and committed new entrant will be able to outbid an equally determined and committed incumbent wireless carrier, or consequently pave the way for second order competition.
In other words, each of the principles could be of interest to a new entrant who might bid for the spectrum. That seems logical, and the proof is Google. A new entrant who isn't a network provider -- such as Google -- might be more interested in open platforms for applications and handsets upon which its lucrative advertising plans depend. It might be worth it for Google to become a wireless broadband competitor in order to promote its highly profitable legacy business model.

Google was presenting an all-or-nothing offer. But in Washington all-or-nothing deals are rare. Google must have known this. Google got half of what it asked for (the typical return on investment here). And half a loaf seemed to be enough, in view of the fact that Google participated in the auction.

If in its prior conduct Google was saying only that it intended to ensure that the reserve price was met but it had no interest in owning the spectrum itself, that wasn't particularly clear.

Reasonable people might differ, but I think that if Google never intended to win the spectrum (unless there was no way around it), and was merely advancing its hypothesis that the four open platform conditions would summon forth hypothetical new entrants, that wasn't especially clear at the time, either. Nor would it have seemed convincing to many people. Google's proposal wouldn't have acquired much momentum. The excitement was around the possibility Google would become the competitor. Google's previous Jul. 9th letter to the FCC said "Google remains keenly interested in participating in the auction" and its subsequent behavior continued to highlight that interest.


March 27, 2008
Problem solved

Comcast and BitTorrent are working together to improve the delivery of video files on Comcast's broadband network.

Rather than slow traffic by certain types of applications -- such as file-sharing software or companies like BitTorrent -- Comcast will slow traffic for those users who consume the most bandwidth, said Comcast's [Chief Technology Officer, Tony] Werner. Comcast hopes to be able to switch to a new policy based on this model as soon as the end of the year, he added. The company's push to add additional data capacity to its network also will play a role, he said. Comcast will start with lab tests to determine if the model is feasible.
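A protocol-agnostic policy of this sort is easy to sketch. The following Python fragment is a minimal illustration, assuming a simple rule that deprioritizes the heaviest recent users during congestion; the function, names and thresholds are my own illustration, not Comcast's actual implementation:

```python
# Minimal sketch of protocol-agnostic congestion management as described
# above: when a link is congested, deprioritize the heaviest recent users
# instead of targeting particular applications such as BitTorrent.
# All names and thresholds here are illustrative assumptions.
def users_to_deprioritize(recent_usage_mb: dict, congested: bool,
                          heavy_share: float = 0.1) -> set:
    """Return the set of users whose traffic should be deprioritized."""
    if not congested:
        return set()  # no congestion, no intervention
    ranked = sorted(recent_usage_mb, key=recent_usage_mb.get, reverse=True)
    cutoff = max(1, int(len(ranked) * heavy_share))  # top slice by usage
    return set(ranked[:cutoff])

usage = {"user_a": 900.0, "user_b": 35.0, "user_c": 12.0}
print(users_to_deprioritize(usage, congested=True))   # {'user_a'}
print(users_to_deprioritize(usage, congested=False))  # set()
```

The design point is that the rule never inspects what application generated the traffic, only how much of it each user generated.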
Over at Public Knowledge, Jef Pearlman argues that the pioneering joint effort by Comcast and BitTorrent "changes nothing about the issues raised in petitions" before the FCC advocating more regulation, because Comcast and BitTorrent are "commercial entities whose goals are, in the end, to make sure that their networks and technology are as profitable as possible."

Setting aside whether the pursuit of profit is a good thing or not, what this episode actually proves is that the Federal Communications Commission has done its job, that the threat of regulation is a credible deterrent to unreasonable discrimination by broadband service providers, and that we don't need a new regulatory framework, with the unintended consequences regulation always entails.

If we want innovation, more choices and ultimately lower prices, we have to be prepared to allow broadband service providers to experiment and to succeed or fail in the market. Regulation always discourages all three.

We also need an enforcement backstop, of course. But it doesn't have to be formalistic and inflexible.

Aside from FCC authority under the Communications Act of 1934 as amended, the professional staff of the Federal Trade Commission has concluded that antitrust law is "well-equipped to analyze potential conduct and business arrangements involving broadband Internet access."

Here at the Tech Policy Summit in Hollywood, one panelist claimed during a breakout session that antitrust enforcement in this area is impaired as a result of the Supreme Court's decision in Verizon v. Trinko (2004). But it isn't so.

In that case, the plaintiff was trying to convert an alleged breach of the Communications Act into an antitrust claim under §2 of the Sherman Act. In other words, the plaintiff was trying to expand the application of antitrust jurisprudence. The Court ruled that the Telecommunications Act of 1996 neither expanded nor limited the antitrust laws.

The 1996 Act has no effect upon the application of traditional antitrust principles. Its saving clause--which provides that "nothing in this Act ... shall be construed to modify, impair, or supersede the applicability of any of the antitrust laws," 47 U. S. C. §152--preserves claims that satisfy established antitrust standards, but does not create new claims that go beyond those standards.
The Court went on to conclude that the activity of Verizon which Trinko complained of did not violate pre-existing antitrust standards.

The bottom line is that we have three federal agencies -- the Antitrust Division of the Department of Justice in addition to the two previously mentioned -- that have the jurisdiction, expertise and some actual experience to intervene if broadband providers unreasonably discriminate.

Groups like Public Knowledge have done a great job and can declare victory now.


March 3, 2008
Chaos and opportunity

Referring to Bret Swanson's and George Gilder's prediction that U.S. IP traffic will reach an annual total of 1,000 exabytes, or one million million billion bytes, by 2015, Ethernet co-inventor Robert Metcalfe foresees a terabit-per-second Ethernet, according to Telephony. Although not sure exactly when, Metcalfe predicts --

New modulation schemes will be needed for the coming network, he said, as well as "new fiber, new lasers, new everything."

The need to replace existing technologies will create "chaos," Metcalfe said, but also opportunity for equipment vendors.



May 24, 2007
State and local tax collectors have ambitious plans for taxing the Internet

The Senate Commerce Committee's hearing Wednesday on the Internet tax moratorium demonstrates the necessity of making the ban on state and local taxation of Internet access services permanent. Another temporary extension simply guarantees opponents another chance to overturn it down the road, and creates the possibility they can win new concessions in the meantime.

The hearing showed that opponents are still determined to gut the moratorium. Harley Duncan, the Washington representative of state and local tax administrators, rehashed the old argument that the moratorium is unnecessary because "the economic evidence is that state taxation of Internet access charges has little or nothing to do with the adoption of Internet services by consumers or the deployment of services by industry." And he cited a new Government Accountability Office conclusion that taxing Internet access is "not a statistically significant factor influencing the adoption of broadband service at the 5 percent level. It was statistically significant at the 10 percent level." Even assuming this conclusion is valid, it still doesn't mean anything, because once states and localities are allowed to impose taxes on Internet access, they won't hold the line at 5 percent.

To get an idea what states and localities might do with Internet access, just consider what they do with telecommunications. Right now, for instance, Jeff Dircksen of the National Taxpayers Union & Foundation notes that they are pushing the combined tax burden on cellphone services above 20 percent.

Local and state governments believe wireless taxes, fees, and surcharges are a "cash cow" for the 21st Century. Yet, they fail to consider that the total wireless tax and fee burden can exceed 20 percent in some areas -- a higher effective tax rate than the typical middle-class consumer pays on a 1040 federal income tax return.

Annabelle Canning with Verizon Communications points to a 1999 study by the Committee on State Taxation which found that consumers of telecommunications services paid effective state and local tax rates that were "more than twice those imposed on taxable goods sold by general business (13.74% vs. 6%)." She also cited a Heartland Institute conclusion that consumers of cable TV, wireless and wireline phone service paid an average of 13.5% in taxes, more than two times the 6.6% average sales tax rate.

The taxes on telecom, cable and wireless include franchise taxes, utility taxes, line access and right-of-way charges, 911 fees, relay charges, and maintenance surcharges, according to Dircksen. He notes that there are approximately 11,000 state and local governmental entities that could levy taxes or fees on telecommunication activities, according to the National Conference of State Legislatures. Canning mentioned that the typical communications service provider was required to file "seven to eight times as many tax returns compared to those filed by typical businesses (63,879 vs. 8,951 annually)."

This is how we can expect state and local government to handle the taxation of Internet access services if given the chance.

"Bundled" and "Acquired" Services

If Duncan can't eliminate the moratorium, he'd like to gut it. In his Senate testimony, he suggested changing the definition of "Internet access" to make it clear that an Internet service provider cannot "bundle" other types of Internet services, content and information (some of which may be currently taxable) into a package of "Internet access" and claim that the state would be preempted from taxing any part of that package.

There is also a controversy over so-called "acquired" services. Basically, opponents want to apply the moratorium only to retail services while allowing wholesale services to be taxed. The effect is to allow taxation of the Internet backbone. Unfortunately, GAO has signed off on this interpretation. Its testimony, which includes the following diagram, claims that the Internet tax moratorium does not apply to Internet backbone services (described as "acquired" services).

[GAO diagram: "acquired" (Internet backbone) services and the scope of the moratorium]

The bottom line is that the Internet tax moratorium helps keep the price consumers pay for broadband as low as possible, and affordability is usually a key factor when consumers make a purchasing decision. If we didn't have the moratorium, we might be forced to consider subsidizing broadband to make it ubiquitous. Duncan's testimony makes it clear that's exactly what the tax administrators recommend.

Online Sales Taxes

States and localities are prohibited from taxing "remote sales" by virtue of a Supreme Court ruling, not the Internet Tax Moratorium. Duncan requested enactment of Federal legislation to authorize states to require remote sellers to collect sales and use taxes on goods and services sold into the state. The Supreme Court's Bellas Hess rule has been around since 1967. State and local officials have been trying, without success, to overturn it ever since. If they were to ever succeed, they would surely discover that the Internet provides a lot of opportunities for tax avoidance and evasion. That would only lead to more regulation which would stifle innovation.

Policymakers have more serious things to talk about than debate these same issues every few years; that's why Congress needs to make the Internet Tax Moratorium permanent.


April 13, 2007
Give IRS keys to the Internet?

The IRS likes to talk about how it's primarily concerned with improving taxpayer services, particularly this time of year. But don't be fooled. Earlier this year, the Bush Administration proposed to require "brokers" to report online sales of tangible personal property to the IRS.

This is really another giant surveillance program, like the trial balloon the administration previously floated to require Internet service providers to retain customer data to combat crimes committed against children (as I've discussed here and here). In both cases, the government is trying to harness the unique capacity of the Internet to identify and document conduct in ways that were never feasible before -- in this case ordinary commercial transactions that just happen to be conducted online. According to press coverage, the proposal is specifically aimed at online auctions (see this and this).

One problem with these seemingly well-intentioned proposals for leveraging Internet capabilities to reduce crime is the proverbial slippery slope. The Internet can be used to collect, store and cross-reference potentially limitless information about each one of us -- our day-to-day activities, our associations, our spending habits, even our thoughts. If, for example, the government wanted to control Medicare and Medicaid costs by identifying who smokes, who drinks, and who doesn't follow the government's dietary, exercise or safe-sex guidelines, it will increasingly become possible for the government to do that. Is the only criterion going to be whether the government could save money by imposing surveillance mandates on the private sector (which would operate indiscriminately against the innocent and the guilty alike) versus spending more for law enforcement (which must respect basic civil liberties) or other government programs? There might be little cause for worry if innocence were its "own shield" (but it isn't), or if the government could be trusted to safeguard personal information (but it can't). Not only do government agents falsely accuse people (remember when Senator Edward M. Kennedy's name showed up on the government's terrorist watch list?), they also lose their laptops every day.

The latest idea, innocuously entitled "Expand Broker Information Reporting," is described on page 65 of a publication entitled "General Explanations of the Administration's Fiscal Year 2008 Revenue Proposals," and is one of five proposals for tighter information reporting. Another would require the organizations who process credit and debit card payments for merchants ("merchant acquiring banks") to report to the IRS the gross reimbursement payments made to merchants.

Brokers like eBay and Amazon would be required to collect social security numbers and file IRS "information returns" identifying gross proceeds from the sale of tangible personal property. The proposal has a de minimis exception ("would apply only with respect to a customer for whom the broker has handled 100 or more separate transactions generating at least $5,000 in gross proceeds in a year") but, honestly, this is designed to divide potential opposition and will get ratcheted back as compliance costs decline for the brokers or for any number of other reasons.
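To make the threshold concrete, here is a minimal sketch of the de minimis test in Python; the function and its names are my own illustration, not anything in the proposal itself, while the 100-transaction and $5,000 figures come from the proposal as described above:

```python
# Hypothetical illustration of the proposal's de minimis exception:
# a broker would file an information return for a customer only if
# BOTH thresholds are met in a given year.
def subject_to_reporting(transactions: int, gross_proceeds: float) -> bool:
    """True if yearly activity would trigger broker reporting."""
    return transactions >= 100 and gross_proceeds >= 5000.0

print(subject_to_reporting(99, 250_000))  # False: under 100 transactions
print(subject_to_reporting(120, 6_500))   # True: both thresholds met
```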

The proposal assumes online buyers and sellers require the assistance of U.S. middlemen like eBay and Amazon. But any seller who wants to sidestep this information reporting requirement could easily use an off-shore middleman or set up their own shop on the web. As they say, our tax system ultimately relies on voluntary compliance. Just how big of a problem are we talking about here? I was surprised to learn, from a recent Government Accountability Office report, that most taxpayers not subject to information reporting pay their taxes anyway.

Past IRS data have shown that independent contractors report 97 percent of the income that appears on information returns, while contractors that do not receive these returns report only 83 percent of income.

Senate Finance Ranking Member Chuck Grassley (R-IA) agrees that most taxpayers pay their taxes and further points out that "a significant amount of noncompliance is unintentional." Grassley wisely adds that "in our zeal to get at the tax gap we cannot wreck the lives of the honest taxpayers. We can't be like a fellow who tears down his house to get at a mouse."

The proposal received criticism in the GAO report, which notes that "the dollar amounts expected to be raised are quite small." I conclude this is a reason not to do it at all, although some might not see it that way. A bureaucrat might propose to solve this problem by requiring that more commercial activity be conducted online, since it is easier to track that way.

Anyway, Senate Finance Chairman Max Baucus (D-MT) has also criticized the proposal for not raising enough money, and suggested that the administration ought to be emphasizing traditional audits.

The IRS says that it gets a four-to-one return on investment in tax enforcement. For every $1 it spends, it gets $4 back in additional taxes collected. So, it would make sense for the administration to propose an IRS budget that would take advantage of that four-to-one return. But they have not. And the tax gap just keeps growing.

But the administration has good reason to proceed cautiously in stepping up the IRS's enforcement activities, which has been tried before and has led to serious abuses. Remember the Taxpayer's Bill of Rights?

Another approach, as highlighted by GAO, would be to simplify the tax code.

Simplifying the tax code or fundamental tax reform has the potential to reduce the tax gap by billions of dollars. IRS has estimated that errors in claiming tax credits and deductions for tax year 2001 contributed $32 billion to the tax gap.

Simplification sounds like a good thing in the abstract, but the term is usually employed as a euphemism for sucking billions of dollars out of the private sector to fund bigger government.

_______________

See, e.g.: "IRS' Case of the Missing Laptops," by Declan McCullagh ("The IRS has lost or misplaced 2,332 laptop computers, desktop computers and servers over three years, according to a recent report by Treasury Department auditors. They concluded it's a persistent problem: The IRS has "reported a material weakness in inventory controls" every year since 1983.")

See also: "Database snafu puts US Senator on terror watch list," by Thomas C. Greene ("US Senator Ted Kennedy was prohibited from flying because his name sparked a terror alert, the Associated Press reports. Apparently, the Senator's name came up on a terrorist watch list, or no-fly list, while attempting to board a US Airways shuttle out of Washington.")


April 6, 2007
"National strategy" for broadband?

Japan has 7.2 million all-fiber broadband subscribers who pay $34 per month, and incumbent providers NTT East and NTT West have only a 66% market share. According to Takashi Ebihara, a Senior Director in the Corporate Strategy Department at Japan's NTT East Corp. and currently a Visiting Fellow at the Center for Strategic and International Studies here in Washington, Japan has the "fastest and least expensive" broadband in the world, and non-incumbent CLECs have a "reasonable" market share. Ebihara was speaking at the Information Technology and Innovation Foundation, and his presentation can be found here. Ebihara said government strategy played a significant role: local loop unbundling and line sharing led to fierce competition in DSL, which forced the incumbents to move to fiber-to-the-premises.

Others have taken a slightly different view. Nobuo Ikeda, formerly a Senior Fellow with Japan's Research Institute of Economy, Trade and Industry, says that the "success of Japan's broadband has been brought about by such accidental combination of a Softbank's risky investment and NTT's strategic mistakes." Ebihara acknowledges that the results of the unbundling regulation have been "mixed" in terms of competitors investing in their own local switching and last-mile facilities, as the U.S. discovered for itself.

The whole point of Ebihara's lecture was that the U.S. doesn't have what he and others consider a national broadband strategy. Never mind that Verizon already plans to spend $23 billion to construct an all-fiber broadband network that will pass up to 18 million homes by 2010, according to USA Today. And AT&T is spending $4.6 billion to deploy VDSL to 19 million homes by 2008.

Viewed in hindsight, and not because the Bush Administration has done a particularly good job touting its own success, a clear strategy emerges. It consists mainly of relief from unbundling regulation for fiber deployments; flexibility to offer broadband services on a common-carrier basis, a non-common-carrier basis, or some combination of both; and national guidance for local franchising authorities.

When, on Feb. 20, 2003, the FCC set new rules for telephone network unbundling which freed fiber-to-the-home loops, hybrid fiber-copper loops and line-sharing from the unbundling obligations of incumbent carriers, then-SBC Communications (now AT&T) and Verizon quickly responded. Verizon announced it would begin installing fiber to the premises (FTTP) in Keller, Tex., and that it planned to pass "about 1 million homes in parts of nine states with this new technology by the end of the year." SBC outlined its own plans to deploy fiber to nodes (FTTN) within 5,000 feet of existing customers in order to deliver 20 to 25 Mbps DSL downstream to every home, and to construct fiber to the premises for all new builds. SBC projected that FTTN deployment could be completed in one-fourth the time required for an FTTP overbuild and with about one-fifth the capital investment. Verizon subsequently announced it would hire between 3,000 and 5,000 new employees by the end of 2005 to help build the new network, on which it planned to spend $800 million that year, and that it planned to pass two million additional homes in 2006.

It may look like these major investment decisions didn't depend on subsequent deregulatory actions -- such as the Jun. 27, 2005 decision of the Supreme Court in NCTA v. Brand X Internet Services, which cleared the way for the FCC, on Aug. 5, 2005, to eliminate the requirement for telephone companies to share their DSL services with competitors. The FCC decision finally put DSL on an equal regulatory footing with cable modem services. However, it began to emerge as early as 1998 -- in an FCC Report to Congress -- that asymmetric regulation of the broadband offerings of the telephone companies versus their competitors would be impossible to sustain as a matter of logic. A decision by the U.S. Court of Appeals for the Ninth Circuit in 2000 all but confirmed this. Thus, it was possible to foresee that either cable would have to be regulated or the phone companies would have to be deregulated. When cable modem service achieved a higher market penetration than DSL, and given the Bush administration's preference for less regulation, it became possible to anticipate that DSL would ultimately be deregulated.

The FCC didn't enact national guidelines for local franchise authorities until Dec. 20, 2006, but there was a long history of abuses by local franchise authorities. In a report to Congress in 1990 the FCC said that "in order '[t]o encourage more robust competition in the local video marketplace, the Congress should ... forbid local franchising authorities from unreasonably denying a franchise to potential competitors who are ready and able to provide service.'" Despite howls of protest from local officials, Congress imposed limits on the franchise authorities in the Cable Act of 1992. Similar abuses began showing up when the telephone companies looked serious about upgrading their broadband services. After months of discussion, the FCC in Nov. 2005 began the proceeding that resulted in the current guidelines.

There's more to be done. Spectrum policy, in particular, remains mired in special-interest broadcaster and public safety politics and must be fully sorted out. But it's not clear the U.S. should follow the costly Japanese model, with its heavy reliance on tax breaks, debt guarantees and subsidies (see, e.g., this). And don't forget that Japan had zero interest rates. Industrial policy leads to higher costs, because taxpayers are footing the bill. It also relies on policymakers, who usually understand the least about technology. Consider this poignant example, as noted by Philip J. Weiser:

It was the threat of Japan's rise in the 1980s that spurred the course toward digital television that the United States still follows today. Washington committed wide swaths of spectrum to digital television, leaving U.S. mobile-phone providers with less bandwidth than they needed and only about half the amount of their European counterparts. The entire effort assumed that Americans would continue to watch television shows broadcast over the air. Yet over the past two decades, more U.S. consumers have begun to watch cable and satellite television, undermining the rationale for this expensive policy, which has also delayed innovation and imposed unjustifiable costs on the nation.


March 15, 2007
Digital Prosperity Report Concludes IT Investment Critical

Policy makers should recognize information technology as the centerpiece of economic policy and develop their plans accordingly, concludes the Digital Prosperity study published this week by the Information Technology and Innovation Foundation.

"In the new global economy information and communications technology (IT) is the major driver, not just of improved quality of life, but also of economic growth," writes Foundation president, Dr. Robert D. Atkinson, author of the study.

Atkinson is a widely respected economist who formerly served as project director of the Congressional Office of Technology Assessment, and is the former director of the Progressive Policy Institute's Technology and New Economy Project of the centrist Democratic Leadership Council.

Based on reviews of other studies, and Atkinson's own research, the report maintains, "IT was responsible for two-thirds of total factor growth in productivity between 1995 and 2002 and virtually all of the growth in labor productivity" in the United States.



February 5, 2007
Health IT Creating a Buzz

Patients, adept at using the internet to schedule travel, conduct business, and access information with the click of a mouse, are now driving changes in the way state and federal policymakers address health care reform.

"Health IT" is the new buzzword for health care, and information technology proposals for healthcare reform are sprouting like daffodils in April!


[Photo: Tennessee Gov. Phil Bredesen]

So far this year, the National Governors Association has announced the creation of the State Alliance for E-Health, co-chaired by Tennessee Gov. Phil Bredesen and Vermont Gov. Jim Douglas. Their purpose is to bring together office holders and policy experts to "address state-level health information technology (HIT) issues and challenges to enabling appropriate, interoperable, electronic health information exchange (HIE)."

As quoted in the National Journal's coverage of the event, Gov. Bredesen explained, "...the states can move much more quickly....I don't trust the federal government to actually do anything on my watch."



September 19, 2006
Internet data storage is the wrong way to combat child sexual exploitation

Kudos to Attorney General Alberto R. Gonzales for cracking down on child sexual exploitation, but it's troubling that he's still considering whether to ask Congress for legislation requiring communications companies to store things like search queries and which web sites their customers visit. Proposals like this endanger the civil liberties of the innocent and risk creating a police state. They are a dangerous substitute for adequately funded law enforcement and prisons, and for placing a higher priority on children's safety than on second and third chances for dangerous criminals.

When Congress held hearings on protecting children from sexual predators in 2005, it emerged that protecting children had not always been a very high priority for some public officials. Consider these findings:

Continue reading "Internet data storage is the wrong way to combat child sexual exploitation" »


May 18, 2006
"Where's The Outrage?"

Sen. Ted Stevens (R-AK) rightly worries that current universal service mechanisms are unsustainable as consumers migrate to Internet phone services that are lightly taxed and regulated (these services clearly should contribute their fair share). Stevens and others also believe that rural America won't get broadband services without subsidies (we can't know this for sure, because we have never tried the alternative approach of removing all of the barriers to competition and investment).

Anyway, while Internet content and conduit providers obsess over net neutrality, something equally harmful is lurking in the shadows. A little-noticed provision in the Senate's "staff working draft" designed to expand the universal service funding base could have profound consequences.

Currently, consumers of "telecommunications" services contribute billions of dollars to subsidize telephone service in rural areas. "Telecommunications" includes telephone, cell phone and, for the moment, DSL services. DSL has been deregulated, and the requirement that it contribute to universal service is temporary. VoIP contributes a small amount, but nothing like a fair share. The Senate staff draft expands the category of contributors and ensures that they all pay equally. It's true that the Internet backbone would not contribute to universal service, but so what? Everything that travels to or from the public Internet would pay.

Here's how this looks:

''(1) CONTRIBUTION MECHANISM.-- ''(A) IN GENERAL.--Each communications service provider shall contribute as provided in this subsection to support universal service."
''COMMUNICATIONS SERVICE.--The term 'communications service' means telecommunications service, broadband service, or IP-enabled voice service (whether offered separately or as part of a bundle of services)."
''(A) BROADBAND SERVICE.--The term 'broadband service' means any service used for transmission of information of a user's choosing with a transmission speed of at least 200 kilobits per second in at least 1 direction, regardless of the transmission medium or technology employed, that connects to the public Internet for a fee directly-- ''(i) to the public; or ''(ii) to such classes of users as to be effectively available directly to the public."
Setting aside the issue of VoIP -- whose free ride should clearly end -- advocates of expanding the funding base sound like tax collectors when they argue that spreading the burden will lower the individual contributions. Contributions that are set low initially are, of course, much easier to raise in the future. And that will happen, because there are no limits on the growth of most of the universal service funding mechanisms.

May 4, 2006
The Stevens bill

This week Senate Commerce Chairman Ted Stevens (R-AK) introduced comprehensive telecom reform legislation which, as Adam Thierer notes, is a 135-page monster that represents a counterproductive obsession on the part of some policymakers with the smallest details of communications policy and doesn't tear down any of the old regulatory paradigms that it should.

That said, the proposal would move the country in a positive direction in several respects.

  • Net Neutrality -- Unlike the House bill, which grants the FCC specific new authority to enforce the commission's net neutrality principles -- and which is guaranteed to lead to questionable enforcement proceedings and perhaps litigation between grasping and delusional competitors -- the Stevens bill wisely requires the FCC merely to keep a watchful eye on industry practices and issue annual reports for five years. The reports may contain recommendations, but the commission may not recommend new rulemaking authority for itself. Unfortunately, the commission must also report on peering and other business arrangements -- appropriate objects for antitrust scrutiny, if egregious, but not for an agency that is not governed by a clear and principled competition standard emphasizing consumer welfare, as Randy May and the Regulatory Framework Working Group have outlined. It is unnecessary to include any provision regarding net neutrality in telecom reform legislation, as I have argued here, for example. However, Senator Stevens has the best net neutrality proposal so far.
  • Video Franchising -- The proposal encourages negotiations between cities and competitive entrants, but establishes a 30-day shot clock and eliminates the ability of the cities to extort in-kind contributions (beyond 1% of gross revenues for Public, Educational and Governmental channels) or set anticompetitive buildout requirements. It also ensures comparable treatment for cable operators that face competition from a new entrant. Unfortunately, the proposal preserves almost all of the existing video regulations -- such as must-carry, PEG and I-Nets -- even though the market is competitive and all vendors need to be able to raise vast sums of capital to deliver broadband speeds of 50 Mbps to 100 Mbps.
  • Universal Service -- On the distribution side, the bill would require periodic audits of universal service recipients and would set up a review process to prevent waste, fraud and abuse. That Senator Stevens, one of the strongest defenders of universal service, acknowledges the potential for waste, fraud and abuse in what is nothing more or less than an entitlement program is welcome and significant. Although this is a great start, ways must be found to ensure that the size of the fund declines as technology reduces the cost of services. One way to do this is to mandate price caps on all recipients. Another way is to auction the loans and/or subsidies for broadband services to the providers and the technologies that can offer the service at the lowest cost.

    On the contribution side, the bill would authorize the FCC to expand the contribution base in virtually any conceivable way and would hide everything from the Congressional budget process. Since taxes and regulation go hand-in-hand, Senator Stevens' proposal raises the worrisome possibility that not only taxes but also regulation may be coming to the Internet. I'm not sure which is worse: that taxes and regulation could ruin the Internet, or that the Internet might provide potentially limitless opportunities for taxes and regulation to stifle everything else. At a minimum, Congress should cap the fund if it's going to give the FCC virtually unlimited authority to collect "fees."

  • Video and Audio Flag -- The bill would preserve the broadcasting business model by withholding content from the Internet. This is protectionist and anti-consumer. Congress should not pick winners and losers.
  • Sports Freedom -- The bill would outlaw exclusive contracts with programming vendors for sporting events. In a free market, exclusive contracts can benefit producers and consumers. Government, as a general matter, should not interfere with private contractual arrangements. On the other hand, these particular arrangements are not the product of a competitive marketplace and have the potential to retard competitive entry. A permanent prohibition is heavy-handed and could be damaging, but some kind of transitional relief seems appropriate.


March 6, 2006
"There you go again"

One of history's great propaganda experts believed that if you repeat a lie often enough it becomes the truth. A lot of politicians follow this advice.

The Consumers Union and the Consumer Federation of America endlessly wave the same bloody shirt of higher phone rates. The latest incantation is:

"If approved, [the] merger [between AT&T and BellSouth] will lead to higher local, long distance and cell phone prices for consumers across the country."

This is absurd because AT&T and BellSouth do not compete against each other for local or mobile phone services. And any competition between them in long distance is de minimis. The consumer groups see these companies as potential competitors. They have argued unsuccessfully for years that antitrust enforcers should take a more active role protecting and nurturing potential competition. This would require a clairvoyance that government simply does not possess.

The consumer groups also repeat their second most frequent remark:

"Telecommunications has now gone from a regulated monopoly to an unregulated duopoly with just two major players."

The implication seems to be that there is barely any difference between a monopoly and a duopoly. Sometimes that's true, but a fact-specific analysis is required. The answer depends on the barriers to entry. Telecom used to bear the hallmarks of a "natural monopoly," where the barriers were very high. But technology has led to plummeting costs for network equipment, and Congress eliminated the legal barriers in 1996. So any company that abuses a dominant position will simply invite competitive entry.

VoIP and cellphones are destroying traditional wireline phone services as the phone companies invest everything they can to become broadband providers. This has got to be one of the most dynamic markets in the world today. Why would anyone want to turn back the clock?


February 16, 2006
Cringely...so close, but so far

Rich Karlgaard at his great Digital Rules blog refers to Robert Cringely's new column.

How can an article make so much and so little sense at the same time? Cringely correctly identifies big bandwidth as a replacement for Quality of Service (QoS). Big bandwidth will indeed render moot most of the "blocking" and "degradation" fears of the content companies. Congress, furthermore, should refrain from imposing new rules in a dynamic realm it knows little about. Yes, yes, yes. He must be lurking here at disco-tech.

But then Cringely gets mixed up and implies the bandwidth providers (cablecos and telcos) are the ones asking for special new rules. Exactly wrong. It's the content companies asking Congress to impose a massive new regulatory regime on the Internet, which could make the disastrous '96 Act look tame by contrast.

-Bret Swanson

Franchise reform countdown

This week's hearing on local franchising in the Senate Commerce Committee was breathtaking. Senator after senator expressed doubts about the wisdom of subjecting new entrants to the cable franchise process. Consumer advocates generally supported the phone companies. The same day, a group of six Republicans and Democrats on the committee signed a letter stating that Congress should reform the franchise process.

"I think the stars are aligned," noted Senator Jay Rockefeller (D-WV).

One gets the impression that the cable industry hasn't been paying attention for the past 25 years, as it takes positions and employs arguments that monopolists have used in the past with little to no success (see, e.g., Deal of the Century: The Breakup of AT&T [1987], by Steve Coll).

Most members of the Senate Commerce Committee are committed to "pro-competition" policy, a results-oriented philosophy that embraces regulation and allows for picking winners and losers. A good example of pro-competition doctrine was the observation made by the Consumers Union's Gene Kimmelman at the hearing that "a transition always requires some benefits to the new entrant."

This is also called asymmetric regulation, and it may produce quick results. But those results can be short-lived -- as in the case of the CLECs -- since inefficient competition is unsustainable in a deregulated market. Both the telephone and cable companies are big enough to take care of themselves and, as Senator Conrad Burns (R-MT) said, should be subject to the same rule book.

This week's hearing produced a lot of consensus that a regulatory rule book would stifle competitive entry. Hopefully the committee will draw the logical inference that what is needed now is a deregulatory rule book.

Memorable comments

On deregulation:

"It is ironic that cellphone service is widely available at low cost [in India] because it was regarded as a luxury and therefore left to the market, while electricity is hard to obtain because it has been regarded as a necessity and therefore managed by the government."

--Former Council of Economic Advisors Chairman Martin Feldstein, writing in the Wall Street Journal, Feb. 16, 2006.

* * *

On net neutrality:

"with or without a new law, the FCC will affect the future in a major way by its approach to the question of broadband's openness. Sometimes called net neutrality, the question of openness is multidimensional. It is hard to define and harder to answer. Chairman Martin and his colleagues have the talent, expertise, and courage to come up with the right answers on this topic."

--Former FCC Chairman Reed Hundt, speaking at George Washington University on the 10th anniversary of the Telecommunications Act, Feb. 6, 2006.

* * *

On video franchising:

"When there was no competition to the telephone and cable companies, local governments could tax and over-regulate both of them and use the extracted revenues for perks and to cross-subsidize consumers or finance unrelated public services. Cable television and phone companies submitted to this over-regulation and over-taxation because their government-sanctioned monopolies meant they could recover their investment by raising prices. Consumers had no choice but to pay. But cable tv and telephone companies are no longer monopolies."

--Senator Jim DeMint (R-S.C.), at the Senate Commerce Committee hearing on local franchising, Feb. 15, 2006.


February 13, 2006
Net neutrality: part 38...Talk about degrading service...

Maybe Google should look to old-economy providers of rich content to find actual examples of content degradation. It seems Netflix, the popular postal purveyor of DVDs, has been using "fairness algorithms" to slow the mailing of DVDs to its most voracious customers. High volume customers impose higher postal costs on Netflix, which charges a flat fee for all users. Low volume customers are more profitable. Netflix now spells out this policy in its service agreement so customers know what they're getting. Seems reasonable enough. Google and other online content companies have been fretting over the figment of online service blockages and degradation, though no one can seem to find any actual examples. Here's an example of a content company degrading itself because it found customers taking advantage of its business model and platform. Google better look in the mirror for examples of how other companies and individuals might exploit Google's platform, and what Google's response might be. Otherwise the "net neutrality" laws Google thinks it wants to govern the Internet could come back to bite -- hard.
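
Netflix hasn't published how its "fairness algorithms" actually work, so here is only a toy sketch of the idea in Python: pending requests are filled lightest-user-first, so flat-fee heavy users wait longer. The ranking rule, member names, and numbers are all invented for illustration.

    from dataclasses import dataclass, field
    import heapq

    @dataclass(order=True)
    class Request:
        discs_this_month: int                  # sort key: heavy users go last
        member: str = field(compare=False)
        title: str = field(compare=False)

    def ship_order(queue):
        """A crude 'fairness algorithm': fill lighter users' requests first.

        Hypothetical -- the real rule isn't public; the service agreement
        says only that high-volume members may see slower shipping.
        """
        while queue:
            req = heapq.heappop(queue)
            yield req.member, req.title

    pending = []
    heapq.heappush(pending, Request(22, "alice", "Metropolis"))
    heapq.heappush(pending, Request(3, "bob", "Casablanca"))
    for member, title in ship_order(pending):
        print(member, title)                   # bob's disc ships before alice's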

-Bret Swanson


February 8, 2006
Industry reaffirms commitment to free Internet

Yesterday the head of the trade association representing most of the nation's telephone companies testified that telephone companies will not block, impair or degrade what consumers and vendors can do on the Internet.

"Today, I make the same commitment to you that our member companies make to their Internet customers: We will not block, impair, or degrade content, applications, or services. That is the plainest and most direct way I know to address concerns that have been raised about net neutrality."

--Walter B. McCormick, Jr.
President and Chief Executive Officer
United States Telecom Association
February 7, 2006

As a practical matter, a voluntary commitment is significant because it is a de facto standard by which the actions of individual companies will be measured by consumers, investors, regulators, legislators, judges and the press. The mistakes and excesses become easier to fix because the range of what is subjectively okay and not okay is significantly narrowed. As if this weren't enough, the FCC has a similar policy. Some argue that the FCC's "ancillary jurisdiction" may limit its freedom of action to enforce the policy. I completely disagree. In 1968, the Supreme Court upheld the FCC's use of ancillary jurisdiction to regulate the cable industry even though Congress had declined to pass a cable act. If the FCC needs to take enforcement action in the future to prevent blocking, impairment or degradation on the Internet, a reviewing court now has a standard as well as precedent to follow.

To the extent that net neutrality was ever a problem, it has been effectively solved.


February 2, 2006
DeLong on Google/China

Here's James DeLong, with the most sophisticated take on the Google-China dilemma. He not only gets the technology right but also offers intriguing thoughts on the geopolitical landscape and the very state of democracy in the West. It's provocative, but I think he's right.

Tom Hazlett also understands what's going on.

-Bret Swanson


January 26, 2006
Stuck in Neutral

Are Comcast and Verizon bent on slowing your Google and Yahoo! searches to a crawl? Each day, it seems, yet another pundit jumps into the "net neutrality" fray, and that is the impression they give readers. In last Sunday's Washington Post, it was Christopher Stern failing to listen to the technology. Stern's treatment was fairer than most but still drew a false caricature of the complex business and technical issues that have recently dominated the Internet and New Media debate.

Stern asks:

"Do you prefer to search for information online with Google or Yahoo? What about bargain shopping -- do you go to Amazon or eBay? Many of us make these kinds of decisions several times a day, based on who knows what -- maybe you don't like bidding, or maybe Google's clean white search page suits you better than Yahoo's colorful clutter.

"But the nation's largest telephone companies have a new business plan, and if it comes to pass you may one day discover that Yahoo suddenly responds much faster to your inquiries, overriding your affinity for Google. Or that Amazon's Web site seems sluggish compared with eBay's.

"The changes may sound subtle, but make no mistake: The telecommunications companies' proposals have the potential, within just a few years, to alter the flow of commerce and information -- and your personal experience -- on the Internet."

The fact is that simple Google and Yahoo! searches or eBay or Amazon transactions are not going to slow down at all. They don't require very much bandwidth. Most of the delay comes in processing the query or transaction at the server cluster. In fact, with advanced new networks delivering broadband speeds of tens of megabits per second, the Net is going to deliver a far better experience for the vast majority of users and applications. After the flood of bandwidth, the key to delivering bits quickly and robustly is "storewidth" -- the capacity of computers to find, sort, process, and serve web pages, applications, answers, services, transactions, and other forms of content with little delay. Customers want quick computing. Slowing down Google searches or eBay transactions would be disastrous for the bandwidth service providers.

The real question, then, is what about large files and huge streams -- namely, rich media such as video and audio. These are the only types of Internet traffic that will impose much of a bandwidth burden on the network operators. Video and audio also happen to intersect with (interfere with?) the content business models of the cable and telecom companies. High definition video packets traveling from coast to coast will require special processing, using priority quality of service (QoS) technologies integrated in the new generation of routers and switches. High definition video consumes massive amounts of bandwidth and requires precise delivery without latency or jitter. New algorithms like Digital Fountain's ingenious "raptor codes" or "fountain codes" can do some of the work. But much of the work will be done through packet prioritization, both in the core of the network and at the edge.
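
Raptor codes themselves are elaborate, but the fountain-code idea they build on fits in a few lines. A toy sketch in Python -- an illustration of the concept, not Digital Fountain's algorithm; source blocks are modeled as small integers and the degree distribution is simplified:

    import random

    def encode(blocks, n_symbols, seed=1):
        """Spray out n_symbols, each the XOR of a small random subset of blocks."""
        rng = random.Random(seed)
        k = len(blocks)
        symbols = []
        for _ in range(n_symbols):
            degree = rng.randint(1, min(3, k))          # toy degree distribution
            idxs = set(rng.sample(range(k), degree))
            val = 0
            for i in idxs:
                val ^= blocks[i]
            symbols.append((idxs, val))
        return symbols

    def decode(symbols, k):
        """Peeling decoder: any symbol with one unknown block reveals it."""
        recovered = {}
        progress = True
        while progress and len(recovered) < k:
            progress = False
            for idxs, val in symbols:
                unknown = idxs - recovered.keys()
                if len(unknown) == 1:
                    for j in idxs & recovered.keys():   # XOR out known blocks
                        val ^= recovered[j]
                    recovered[unknown.pop()] = val
                    progress = True
        return [recovered.get(i) for i in range(k)]

    blocks = [0x1A, 0x2B, 0x3C, 0x4D]                   # four source blocks
    print(decode(encode(blocks, 10), 4))                # usually [26, 43, 60, 77]

The receiver doesn't care which symbols arrive or in what order; any sufficiently large handful lets the peeling decoder finish, which is what makes the approach attractive on lossy, jittery networks.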

Today, the Internet is a "best effort" system that reaches servers in a roundabout way, with algorithms that seek the best route but sometimes hit obstacles or bottlenecks. All sorts of methods have been employed to shuttle VIPs (Very Important Packets) around the Net with faster and more reliable service than their pedestrian counterparts. We've given streaming video and voice-over-IP, for instance, special attention. Very often extra bandwidth is used as an elegant substitute for complicated QoS or geographically optimized content caching. Increase the size of the pipe enough, and all the packets get red carpet treatment. Nevertheless, going forward, some combination of big bandwidth, big storewidth, and QoS will be used to deliver the goods. This stuff costs money, and in many cases the VIPs will cost more to serve than the JSPs (Joe Six Packets). Content companies will pay bandwidth providers for VIP service, as they do today. Bandwidth providers will offer their customers contracts that say, in effect, "we will give you so much bandwidth to the open, best effort Internet for so many dollars." If the bandwidth providers don't provide that access -- if they actively block or slow certain sites or packets, effectively breaking that bandwidth-per-dollar contract -- not only will they breach the agreement, but customers will also find another provider.

The fact that some VIPs will enjoy bandwidth martinis will not prevent Joe Six Packets from drinking as much beer as he wants.
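
For readers who want the mechanics behind the metaphor, here is a minimal two-class scheduler sketch in Python -- an illustration of packet prioritization in general, not any vendor's router logic; the class names and packets are invented:

    import heapq
    from itertools import count

    class PriorityScheduler:
        """Toy two-class scheduler: VIP packets drain before best-effort ones."""
        VIP, BEST_EFFORT = 0, 1

        def __init__(self):
            self._queue = []
            self._arrival = count()            # preserves FIFO order in a class

        def enqueue(self, packet, klass):
            heapq.heappush(self._queue, (klass, next(self._arrival), packet))

        def dequeue(self):
            return heapq.heappop(self._queue)[2] if self._queue else None

    sched = PriorityScheduler()
    sched.enqueue("web page", PriorityScheduler.BEST_EFFORT)
    sched.enqueue("HD video frame", PriorityScheduler.VIP)
    print(sched.dequeue())                     # the video frame jumps the queue

Strict priority like this can starve the best-effort queue under sustained VIP load, which is why production routers favor weighted schemes: the martinis get poured first, but the beer keeps flowing.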

-Bret Swanson


January 25, 2006
Google's "Infinite Database" targeted by Rep. Markey

Google thinks everyone should have the right to visit any legal web site they choose -- as long as it can track every move and remains free to manage the data in its wisdom.

Google maintains server logs that record the date, time and originating IP address of every search query and subsequent click on a link. The New York Times reported in 2002 that Google collects "150 million queries a day in its databases, updating and storing the computer logs millisecond by millisecond."

If you were a prosecutor, an investigator or a private plaintiff, could you resist the temptation to examine this material? And what about the more serious problem of theft and loss? If Gen. Wesley Clark's mobile phone records can be purchased by anyone, it's just a matter of time before someone can obtain server logs from Google or one of the other Internet companies that compile this information. The decision(s) to create a "bottomless, timeless database" (the words of Rep. Ed Markey (D-MA)) was an error of judgment -- one that Google now hopes to offset by unfairly portraying the Bush Administration as a villain for seeking aggregated data to protect children from online pornography.

Google and the other Internet companies that store server logs should have seen this one coming. Permanent server logs are about as wise as blocking the web site of a competitor. Both reflect an indifference to the rights and expectations of consumers. Madison River Communications' mistake in blocking the web site of Vonage Holdings Corp. was an isolated incident. Server logs apparently are an industry practice.

Rep. Markey has announced he will introduce legislation to require the destruction of "personally identifiable information derived from a consumer's Internet use" by Internet companies beyond a reasonable period. You can't argue with Rep. Markey's objective, just like you can't argue with the objective of network neutrality. But in both cases the likelihood is that if Congress does pass legislation it will be a clumsy affair that goes too far in some respects and not far enough in others.
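
Whatever shape the legislation takes, the engineering half of the objective is straightforward. A sketch of retention-limited logging in Python, with the 90-day window and the log schema invented for illustration:

    from datetime import datetime, timedelta, timezone

    RETENTION = timedelta(days=90)        # hypothetical "reasonable period"

    def scrub(entries, now=None):
        """Anonymize query-log entries older than the retention window.

        Each entry is assumed to look like
        {"ts": tz-aware datetime, "ip": "203.0.113.9", "query": "..."}.
        The aggregate record survives; the personally identifiable IP does not.
        """
        now = now or datetime.now(timezone.utc)
        out = []
        for e in entries:
            if now - e["ts"] > RETENTION:
                e = {**e, "ip": None}     # keep the query, drop who asked it
            out.append(e)
        return out

    log = [{"ts": datetime(2005, 9, 1, tzinfo=timezone.utc),
            "ip": "203.0.113.9", "query": "cheap flights"}]
    print(scrub(log, now=datetime(2006, 1, 25, tzinfo=timezone.utc)))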



January 19, 2006
Fragmenting the Internet

Proof that you can never have it both ways can be found in a report by Christopher Rhoads in today's Wall Street Journal, which notes that countries and organizations are erecting rival Internets. Internet pioneers such as Vinton Cerf are alarmed about a fragmentation of the Internet, according to Rhoads. But we should step back and give thanks for what this development is not. It is not U.N. control of the Internet. The U.N. is a sclerotic, and some say corrupt, organization that is full of strange notions about the importance of personal and commercial freedom. Were it to control the Internet, foreign dictators and bureaucrats would be able to influence how we can use the Internet in this country. Many foresaw the possibility of rival Internets -- indeed, their inevitability -- in the wake of the Bush Administration's success in beating back the proposal for an Internet dominated by the U.N.

The advent of rival Internets will create challenges, but it will also increase interest and participation in what has always been a "network of networks." Rivals may also help to limit the impact of Internet abuses by repressive regimes, and will certainly limit the opportunities for control by world organizations. Your domain name may no longer be your domain name in every corner of the world, but with a bit of ingenuity there should be no reason that we can't continue to have as much order and stability as we want in the U.S.

Rival Internets will be an interesting test case for those who believe that government regulation is needed to facilitate network interconnection, such as the drafters of various proposals for telecom reform on Capitol Hill. My prediction is that interconnection between rival Internets will be the norm, not the exception. For one thing, the benefits of interconnection are disproportionately felt by the users of a smaller network, who gain access to more content and services than they would otherwise have. For another, interconnection can be direct or indirect. Years ago, when the old AT&T refused to interconnect with upstart MCI, the upstart was able to gain access to AT&T's customers by finding a small, independent local exchange carrier that was both interconnected with AT&T and willing to interconnect with MCI. The involuntary interconnection that MCI thereby achieved with AT&T through indirect means may not have been ideal from a network engineering perspective, but it worked.

It has been obvious for some time that the Internet is becoming too important culturally and politically for the status quo to continue indefinitely. The Internet appears to have outgrown places like Silicon Valley and Marina del Rey -- perhaps making it less likely that the future will look like Star Trek, where Earth is a member of a Federation of Planets headquartered in San Francisco of all places.


January 18, 2006
Thought Free Telecom

Today at Slate.com, Adam Penenberg examined the "net neutrality" debate in an article entitled "Internet Freeloaders: Should Google have to pay for the bandwidth it consumes?"

Following is Penenberg's column (indented) with my comments interspersed:

Internet Freeloaders

The Internet has always been about democracy--what the geeks who designed it call "network neutrality." Data, whether e-mail, a Web page, or video, get sent as packets that are reassembled at the end of their journeys. All packets are created equal, and Internet service providers deliver them without prejudice, based on their network's speed and capacity.

This isn't quite right. For years, providers of certain content, applications, and services have used specialized techniques to deliver higher value data in faster and more robust ways. For example, companies that provide streaming video or audio, or companies like Vonage that provide voice-over-IP service, have used caching architectures and quality-of-service technologies to reduce latency and "jitter" and thus deliver smooth audio and video services. From content delivery networks (CDNs) that store and push data to optimal points on the Net to asynchronous transfer mode (ATM) and multiprotocol label switching (MPLS), which route packets across the globe according to levels of priority, networks have exhibited high degrees of "prejudice." In other words, not all packets have been created equal. Going forward, however, more bandwidth resulting from new fiber-optic capacity could actually reduce the need for QoS and caching technologies in some cases and thus reduce the need for packet discrimination. More about this later....
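
To make the point concrete: an application can ask the network for preferential treatment simply by setting the DSCP bits in the IP headers of its packets. A minimal Python sketch -- the peer address is a documentation placeholder, the option is exposed on most Unix-like systems, and whether any router actually honors the mark is entirely a matter of network policy:

    import socket

    # DSCP "Expedited Forwarding" (value 46) occupies the top six bits of
    # the old IP TOS byte, so the byte written to the header is 46 << 2.
    EF = 46 << 2

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, EF)   # mark our packets
    sock.sendto(b"voice frame", ("198.51.100.7", 5004))     # placeholder peer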

Telecommunications and cable companies--let's call them telco-cable--want to change that. Verizon, Comcast, and their ilk have been lobbying Congress to transform the Internet into a two-tiered system. By tagging content, broadband providers would ensure that their own packets (or those from companies paying them protection money) get preferential treatment and reach subscribers faster than second-tier content. This would give companies like Verizon a tremendous advantage as they roll out their own television and VoIP telephone services.

Cable and telecom companies have not been lobbying Congress for a two-tiered system. Some members of Congress and many industry pundits have suggested new regulations--broadly known as "network neutrality" rules--that could radically constrain the flexibility of both infrastructure and content companies to pursue new business plans on the Net. The telecom and cable companies have opposed these proposed new regulations that could massively intrude on and calcify what is a fast-moving and organic telecom environment. They have not lobbied for any two-tier system.

Telco-cable companies have spent billions to lay down broadband pipe and want a return on their investment. They are tired of bandwidth hogs like Google, Amazon, and Microsoft getting a free ride. This was fine when the Internet consisted mostly of e-mail and static Web pages. With the advent of online video, Internet telephony, and IPTV, Verizon, AT&T, and BellSouth want content providers to share the cost. Their reasoning: If Google is going to introduce a video service, shouldn't it have to pay for some of the bandwidth it scarfs down?

But it isn't just the Googles of the Web that are soaking up bandwidth. According to the U.K.-based technology firm Cachelogic, peer-to-peer traffic accounts for 80 percent of the traffic of so-called last-mile providers, companies like Verizon and Time Warner Cable that take broadband that final mile into your home. All of this demand for video, music, and file-sharing could create bottlenecks for Verizon and Time Warner--the ones who hook up your home to the data grid.

As a result, telco-cable has been lobbying Congress to rewrite the Telecommunications Act of 1996.

The 1996 Telecom Act is not up for rewrite because of these issues, which have come up fairly recently. The '96 Act is up for rewrite because it was a terrible failure. It barely even acknowledged the existence, let alone the importance, of the Internet. It empowered the FCC and 51 state utility commissions to micromanage telecom prices and services. It socialized the network by giving "open access" to faux telecom companies that merely played a game of regulatory arbitrage. A rewrite could now do away with these mistakes and actually deregulate this competitive arena. The "net neutrality" effort is a push by many of the same forces who gave us the disastrous Open Access I to revive the monster at a different layer of the network. Call it Open Access II.

A draft of the new bill would codify "network neutrality" (which to this point has been voluntary) and forbid network service providers from blocking or otherwise sabotaging content. Usually fierce competitors, these gatekeepers can agree on one thing: They want to strike the network neutrality clause. Google, Yahoo!, Microsoft, and eBay want to keep it. If telco-cable wins, it will be able to set up separate tiers, forcing Google to pay up or ride in the slow lane.

Google and every other content and infrastructure company on the Net already pay for bandwidth. They pay for long-haul bandwidth. They pay for bandwidth within data centers. If they don't buy enough bandwidth, they can experience "the slow lane." "[S]eparate tiers" of service pervade the Net, not to mention every matter of commerce and life. A cross-country "lambda" running at 10 gigabits per second costs more than a cross-country T-1 line at 1.5 megabits. Ten lambdas cost more than one lambda. Equinix, which operates data centers--large secure facilities that host the servers and network equipment for just about every major content and network company--charges its clients per megabit of traffic. Paying to use or acquire someone else's facilities or product is pretty elementary. Nothing new here.

At this month's Consumer Electronics Show, Verizon CEO Ivan Seidenberg explained, "We have to make sure that they [application providers] don't sit on our network and chew up bandwidth. We need to pay for the pipe." Perhaps, but what Verizon proposes is to charge twice for broadband: first to subscribers, then to content providers. In essence, telcos and cable companies want to privatize the Internet--a model we've pretty much left behind since the days of CompuServe, Prodigy, and AOL.

Charge twice? What about magazines? They charge subscribers for the magazine, and they charge advertisers who take up space in the magazine. This model may not work on the Net. We just don't know yet. The situation is too fluid and moving too fast. But "charging twice" is not some nefarious trick. It's just another business model that may or may not work.

If the telcos and cable companies get their way, we'll have a Balkanized Web. Content providers who can afford to pay for premium service will market superior products to consumers with fast connections. Everyone else will make do with second-class companies at second-class speeds.

Time for profundity: You get what you pay for....Seriously, the idea that the Net, which today is a slow molasses trickle of spam, viruses, and herky-jerky video, will regress into Bosnian chaos when cornucopian new fiber optic networks come online is the opposite of the truth. There will be different levels of service, as with every other product on the planet, but new networks will dramatically improve content and services across the Net.

The business model that this most resembles is cable television. There's one key difference, though. In the cable world, the service providers pay channels for the rights to broadcast their shows. In the system that telco-cable is proposing for the Internet, the content providers--who provide the services that make customers clamor for broadband in the first place--would have to pay for the privilege of being included.

Not all content providers are taking this lying down. Business 2.0's Om Malik reports that Google has been buying up miles of "dark" fiber--unused fiber-optic cable--at severely depressed prices. Malik believes that Google plans to "blanket major cities with Wi-Fi," including San Francisco, Washington, D.C., and New York. Given Google's ethos, its Wi-Fi would probably be free, with revenue derived from targeted advertising. Obviously, the telcos and cable companies would have trouble competing with that. Even if telco-cable is successful in implementing a two-tiered Internet plan, another workaround could be municipal wireless networks, like those being built in Philadelphia. (No wonder Verizon has been fighting them tooth and nail.)

There's a far better solution than Verizon charging Google to use its bandwidth or Google becoming a service provider itself. What about having subscribers pay for the bandwidth they consume? Just like you buy variable rate cell-phone plans and pay for electricity based on how much you use, your broadband bills should be calculated the same way. That way, heavy Net users could subsidize the Internet for those who don't use it as often, and access would be available for anybody who wants it. Then content would remain free, and everyone would benefit.

Subscribers paying the whole bill depending on amount of usage--dollars per megabit per month is the common method--is a perfectly reasonable option. Maybe the telcos and cable companies will adopt this end-user-pays-the-whole-bill model and the debate will evaporate. Or maybe the telcos and cable companies will find that consumers won't pay more than, say, $75 for advanced connectivity. Eventually, of course, consumers will pay one way or another. How to split the cost of bandwidth, however, is a matter of business strategy, not regulation. Whatever happens, not all content will be "free." Some content will be free in the sense that it is paid for by advertising or subsidized by some other source. But lots of content--movies, educational videos, sports, music, and lots of writing--will be paid for by end-users. The utopian vision of free networks and free content would be costly if taken seriously by policymakers. It is a vision free of thought.
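
The metering arithmetic is simple enough to sketch in Python; the prices below are invented for illustration, since the only thing endorsed here is the method, not any particular rate:

    def monthly_bill(peak_mbps, rate_per_mbps=10.0, base_fee=20.0):
        """Usage-based bill: a flat base fee plus dollars per megabit per month.

        Both prices are hypothetical.
        """
        return base_fee + rate_per_mbps * peak_mbps

    print(monthly_bill(1.5))    # light user:  35.0
    print(monthly_bill(8.0))    # heavy user: 100.0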

-Bret Swanson


December 1, 2005
Charging for web speed

At a Congressional staff briefing this week, the Chief Technology Officer of BellSouth referenced an agreement between BellSouth and Movielink in which BellSouth receives a fee to ensure that Movielink's customers can download movies quickly -- even if they have a slow Internet connection. In today's Washington Post, Gigi Sohn -- an advocate of "network neutrality" -- ridiculed the arrangement with this clever comment: "If we want to ruin the Internet, we'll turn it into a cable TV system." I would actually hate to see the Internet turn into either a cable TV system or a telephone network. That's the problem with Sohn's proposal.

Sohn's proposal would turn the Internet into a wasteland where transport providers can't make any money. The problem with that is that no one will invest another penny to make the Internet faster or more ubiquitous. The BellSouth-Movielink arrangement has at least two pro-consumer advantages that are being overlooked. One is that customers who elect to pay for a slow Internet connection can get faster downloads that are subsidized by others. Another is that as BellSouth invests in network upgrades to provide a higher quality experience for Movielink's customers, traffic moves more quickly, resulting in faster speeds for everyone.


November 21, 2005
Google and Net Neutrality (continued)

Like my colleague Hance Haney, I find Google's support of "net neutrality" regulation surprising. Or if not surprising, at least disappointing. Google is not a search engine company or a dot-com. Google is an Internet infrastructure company. A networked computer company. It is a general purpose platform of processors, bandwidth, and software. It does search, yes, but every few months now Google introduces another array of new Net products and services: GoogleVideo, GoogleBase, GoogleMaps, GoogleDesktop to organize my PC. Read this column by Robert Cringely, who explains Google's infrastructure buildout and says Google is about to monopolize the Net, leaving no room for competitors or even mid-sized companies, only small guys and entrepreneurs.

I don't believe Google will monopolize the Net, but I do believe Google is building the ultimate storewidth system -- a generic network computer that will challenge not only Microsoft but also NBC, CBS, Fox, and Hollywood. Google will challenge Microsoft directly by putting your stuff -- your info, your e-mail, your documents, your photos, your apps -- out on the Net and making them accessible anywhere, anytime, from any device, without the headaches of a Windows PC. And because it is a general purpose content storage and delivery system, it will empower individual producers and consumers of content, and it will disrupt the existing content aggregators and distributors, namely Big TV and Hollywood. Google's strength is its physical infrastructure -- hundreds of thousands of computers at dozens of major network nodes, all linked with big bandwidth and super-nerd software to make all the disparate resources run in parallel. The geographically diverse data centers reduce latency and boost performance by getting close to users. This is the vision of storewidth -- the combination of storage and bandwidth -- that George Gilder outlined five years ago in the Gilder Technology Report and at our Storewidth conferences, the first several of which were attended by the then-less-famous Larry Page and Eric Schmidt, Google's co-founder and CEO, respectively. Which came first, I don't know, but Page and Schmidt have implemented all the best ideas that we batted around at those conferences -- including co-opting Yale computer scientist David Gelernter's "life streams" software -- and they are finally making good on Sun Microsystems' long-held notion that "the network is the computer."

So now we come to net neutrality, the regulatory concept that an operator at one layer of the network should not discriminate against a company or service operating at another layer of the network. Net neutrality advocates are worried the Comcasts, Coxes, Verizons, and SBCs (sorry, AT&Ts) of the world will leverage their last-mile pipes into homes and businesses to prevent voice-over-IP companies (Vonage, Skype (now eBay)), content providers (Yahoo, Movielink), or applications and service providers (MSN, Google) from reaching the end user, or at least from reaching them on their own terms. We've argued strenuously that it is in the best interest of the bandwidth providers (Comcast, Verizon) to offer unfettered access to the cornucopia of content and services that only the Internet can provide. But it is a matter to be decided by business strategy. Most walled gardens wither and die. But clearly the last mile communications companies are going to leverage their hundred-billion dollar assets to their own advantage where it makes sense and profits.

Google, likewise, has unique assets of computers, wires, and software, distributed around the world, at very strategic locations. If Verizon, AT&T, and Comcast dominate the physical layer of the Internet, Google will soon dominate the logical layer of the Net. But if the physical layer of the Net should be "neutral" and subject to "open access" regulation, as Google and other Silicon Valley companies say it must be, then why not the logical layer? A layer that, if we are to believe many technology pundits like Robert Cringely, will be even more dominated by Google than the physical one is by the Bellcos and Cablecos. Does Google really want the FCC forcing and micromanaging "open access" to its data centers, processors, and software, to its advertising and content delivery platforms? The Internet is composed of numerous and diverse networks. The current net neutrality push targets one narrow set of companies for regulation, but if it were ever applied, well, neutrally, net neutrality law would cover the whole Net and impose never-ending micromanagement and litigation on the most dynamic and important engine of the world economy.

-Bret Swanson


November 18, 2005
Vint Cerf and Net Neutrality

Internet pioneer Vint Cerf penned a letter expressing his fear that legislation before the House Energy & Commerce Committee "would do great damage to the Internet as we know it." Cerf is now an employee of Google, a great company that unfortunately strongly supports the legislation's net neutrality provisions.

Continue reading "Vint Cerf and Net Neutrality" »


August 24, 2005
GoogleTalk About Game Over

So now that Google has entered the instant message and voice-over-IP games, adding to the existing 76 million U.S. users of AIM, Yahoo, and MSN Messenger, not to mention Skype's 47 million VoIP users and a few million Vonage customers, with robust broadband video conferencing from these web-based applications providers on the way, can we finally agree that the rigid price ceilings and floors and geographic pricing layers and cross-subsidies for traditional voice telephony administered by the 51 state utility commissions are no longer operative?


-Bret Swanson
