Discovery Institute
disco-tech | Discovery Institute's Technology Blog: Privacy Archives

November 14, 2011
GPS tracking devices do not have power to rewrite Fourth Amendment

Futurists have been predicting for years that there will be diminished privacy in the future, and we will just have to adapt. In 1999, for example, Sun Microsystems CEO Scott McNealy posited that we have "zero privacy." Now, Wall Street Journal columnist Gordon Crovitz is suggesting that technology has the "power to rewrite constitutional protections." He is referring to GPS tracking devices, of all things.

The Supreme Court is considering whether it was unreasonable for police to hide a GPS tracking device on a vehicle belonging to a suspected drug dealer. The Bill of Rights protects each of us against unreasonable searches and seizures. According to the Fourth Amendment,

The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.
In the case before the Supreme Court, U.S. v. Antoine Jones, the requirement to obtain a warrant was not problematic. In fact, the police established probable cause to suspect Jones of a crime and obtained a warrant. The problem is, the police violated the terms of the warrant, which had expired and which was never valid in the jurisdiction where the tracking occurred. Therefore, first and foremost, this is a case about police misconduct.

Continue reading "GPS tracking devices do not have power to rewrite Fourth Amendment" »

February 28, 2011
Targeted ad bubble?

On Friday Scott Cleland predicted that online advertising is an investment bubble which will eventually burst as a result of latent consumer privacy concerns.

Expect privacy concerns to be the eventual catalyst that ultimately bursts the Internet investment Bubble 2.0. It is rare when there is a profound disconnect and suspension of reality between industry behavior/investment expectations and customer wants, needs and expectations, but that is precisely what is at work in Bubble 2.0.

In the Wall Street Journal, Scott Thurm reported that venture firms have invested $4.7 billion since 2007 in 356 online ad firms. The investment climate is "frothy" according to one of Thurm's sources. And there's an arms race among the start-ups for math specialists, he reports. Some specialists are migrating to advertising from Wall Street.

Sounds eerily familiar.

Another item in the same publication caught my eye this morning, and suggests -- to me, at least -- that we may also have an irrational boundless-optimism effect on our hands. Julia Angwin and Emily Steel cite a knowledgeable source who suggests that large amounts of data for targeting ads do not necessarily produce "great results" for advertisers.

A cofounder of Allow, Justin Basini ... came up with the idea for his new business when working as head of brand marketing for Capital One Europe. He says he was amazed at the "huge amounts" of data the credit-card companies had amassed about individuals.

But the data didn't produce great results, he says. The response rate to Capital One's targeted mailings was 1-in-100, he says -- vastly better than untargeted mailings, but still "massively inefficient," Mr. Basini says. "So I thought, 'Why not try to incentivize the customer to become part of the process?'"

November 22, 2010

Simple in theory, tricky in practice

You want to save the world; you have a clear and simple idea. If it weren't for the details! Ah, the pitfalls of regulation. From the Wall Street Journal,

Seeking to be a leader in protecting online privacy, the European Union last year passed a law requiring companies to obtain consent from Web users when tracking files such as cookies are placed on users' computers. Enactment awaits action by member countries.

Now, Internet companies, advertisers, lawmakers, privacy advocates and EU member nations can't agree on the law's meaning. Is it sufficient if users agree to cookies when setting up Web browsers? Is an industry-backed plan acceptable that would let users see--and opt out of--data collected about them? Must placing cookies on a machine depend on the user checking a box each time?

Bold actions and frightening consequences

Analyst Anna Maria Kovacs says House Republicans have made it clear they would be particularly offended if the Federal Communications Commission moves to impose network neutrality regulation this year, before House Republicans assume the majority. The consequences for the FCC, Kovacs suggests, could be severe.

Alienating the Republican leadership could create some pain for the FCC. Budget requests could receive much tougher scrutiny and some degree of budget cut becomes a possibility. Considerable staff and commissioner time could be spent on oversight, responding to questions from Congress and testifying at hearings. A bill to abolish or radically reform the FCC could not be enacted, but the process of fighting it could be time-consuming and unpleasant .... Getting commissioner nominations through the Senate might become impossible ... This potentially could leave the FCC with only four commissioners at the end of 2011, when Commissioner Copps [D] has to leave unless he is reappointed. That would leave a Commission that is most likely to deadlock two-two on controversial issues which would make it far more difficult for Chairman Genachowski to implement the [National Broadband] Plan.

November 10, 2010
Promoting online privacy with phony symbolism

The New York Times reports that officials at the Federal Trade Commission are exploring a "Do Not Track" option on websites and browsers similar to the "Do Not Call" list which prevents unwanted telemarketing calls. Meanwhile, the White House has established an interagency panel to ensure that any restrictions do not impede law enforcement and national security efforts.

A "Do Not Track" feature won't protect consumers from unwanted ads, only relevant ads they are more likely to find useful. That's the whole purpose of tracking. Advertisers, who underwrite much of the cost of Internet content, applications and services, will lose an efficient opportunity to connect with potential customers. For what?

Meanwhile, even if there will be no tracking for commercial purposes, there will still have to be full tracking for law enforcement and national security efforts. ISPs and websites will have to continue to track everyone to comply with warrants or subpoenas they may receive in the future. As long as this information is collected and stored, there's a risk it can be misused. The principal risk isn't from the ISPs or websites themselves, which would suffer reputational damage for any misuse; the primary threats are from hackers and over-eager investigators.

A more appropriate first step would be for Congress to amend the Electronic Communications Privacy Act to provide the same legal protection for information gathered by ISPs and websites as for the information stored on your PC or in your safe deposit box.

December 8, 2009
Behavioral advertising: Poor excuse for regulation

With U.S. Rep. Rick Boucher (D-VA) and now the Federal Trade Commission holding hearings on privacy and online advertising, it seemed like a good time to visit the Google Privacy Center to see what categories Google believes I fall into based on my online behavior.

My interests were:

News & Current Events
That was it.

I could opt out of interest-based advertising or manage my ad preferences at the Google site, but, I figured, what's the point?

A Google representative told the New York Times that the Privacy Center pulls in tens of thousands of visitors each week. For every one person who opts out, four people change the categories they have fallen into, and 10 people do nothing, just look over the information on the site.

The same article quotes an academic who notes that some consumers do not understand behavioral advertising, and that "people are confused about which part of a Web page is advertising." So apparently the argument is that regulation is justified because it would relieve some people of the responsibility to educate themselves.

But one of the chief problems with any regulation -- no matter how well-intentioned -- is the difficulty containing it. For example, "a number of parties have suggested it would be appropriate to extend these privacy rights as a consumer protection to the offline side as well," Rep. Boucher says.

September 30, 2009
Regulating behavioral ads

Rep. Rick Boucher (D-Va.), chairman of the House Subcommittee on Communications, Technology and the Internet, is planning to introduce legislation to ensure privacy protection for Internet users. According to Boucher,

Industry is to be commended for its recent advancement of self-regulatory principles. However, while proactive, these entirely voluntary principles do not go far enough, and there is no guarantee that every company that collects information from the Internet-using public will abide by them.
* * * *
The structure I have set forth should not prove burdensome for Internet-based businesses that rely on targeted advertising. In fact, it is in line with the practices of reputable service providers today. More importantly, by giving Internet users a greater confidence that they have control over the collection and use of information about them by websites, it will encourage greater levels of general Internet usage and e-commerce, benefiting not only consumers, but also the companies that transact business online and our nation's economy.
In other words, Boucher is identifying the "best practices" which have already emerged within the industry in the absence of regulation, and he is planning to impose them by legislative fiat on every industry participant.

If consumers like the best practices, other industry participants will adopt them in the absence of regulation as a matter of competitive necessity. Legislation isn't necessary.

But the issue is as politically hot as fears of cellphone radiation causing cancer, which was the subject of a recent Senate hearing.

Note: According to the National Cancer Institute, there is no evidence cellphones cause cancer.

Studies suggest that the amount of RF energy produced by cellular telephones is too low to produce significant tissue heating or an increase in body temperature. However, more research is needed to determine what effects, if any, low-level non-ionizing RF energy has on the body and whether it poses a health danger. (footnote omitted.)
Today the New York Times reported that a survey of 1,000 Internet users revealed that two-thirds object to online tracking by advertisers.
Tailored ads in general did not appeal to 66 percent of respondents. Then the respondents were told about different ways companies tailor ads: by following what someone does on the company's site, on other sites and in offline places like stores.

The respondents' aversion to tailored ads increased once they learned about targeting methods. In addition to the original 66 percent that said tailored ads were "not O.K.," an additional 7 percent said such ads were not O.K. when they were tracked on the site. An additional 18 percent said it was not O.K. when they were tracked via other Web sites, and an additional 20 percent said it was not O.K. when they were tracked offline.

The survey company also asked about customized discounts and customized news. Fifty-one percent of respondents said that tailored discounts were O.K., and 58 percent said that customized news was fine.

Boucher correctly notes that targeted advertising has many important benefits for consumers and should not be disrupted:
It is also important to note that online advertising supports much of the commercial content, applications and services that are available to Internet users today without charge, and I have no intention of disrupting this well-established and successful business model.
The real problem here is not that Web sites collect and use customer data. The data is only read by machines, not humans; and it can be disaggregated and stored in different places so that even in the event of a data breach no one has access to enough pieces of information to assemble the whole puzzle.
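The disaggregation idea in this paragraph can be sketched as simple XOR secret-splitting. This is an illustration of the principle, not a claim about any site's actual storage design: a record is split into shares kept in separate stores, and a breach of any single store yields only random-looking bytes.

```python
import secrets
from functools import reduce

def _xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split(data: bytes, n: int = 3) -> list[bytes]:
    """Split a record into n shares; any n-1 of them look like pure noise."""
    noise = [secrets.token_bytes(len(data)) for _ in range(n - 1)]
    final = reduce(_xor, noise, data)  # data XORed with every noise share
    return noise + [final]

def recombine(shares: list[bytes]) -> bytes:
    """XORing all the shares together cancels the noise and recovers the record."""
    return reduce(_xor, shares)
```

Only an attacker who compromises every store at once can reassemble the record, which is exactly the "whole puzzle" property the paragraph describes.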

The bigger danger is there is insufficient Fourth Amendment protection against unreasonable searches and seizures of information in the possession of third parties.

The Third Party Rule is unworkable in the cloud computing age, because storage and processing are migrating from the desktop to data centers. And consumers won't always be able to tell the difference, which will undermine the justification for the Third Party Rule.

April 26, 2009
Consumers have privacy options

According to House Subcommittee on Communications, Technology, and the Internet Chairman Rick Boucher (D-VA)

Deep packet inspection enables the opening of the packets which hold the content of Internet transported communications. Through the use of DPI the content can be fully revealed and examined.

It has generally accepted beneficial uses such as enabling better control of networks and the blocking of Internet viruses and worms. It also enables better compliance by Internet service providers with warrants authorizing electronic message intercepts by law enforcement.

But its privacy intrusion potential is nothing short of frightening. The thought that a network operator could track a user's every move on the Internet, record the details of every search and read every email or attached document is alarming.

And while I'm certain that no one appearing on the panel today uses DPI in this way, our discussion today of the capabilities of the technology, the extent of its deployment and the uses to which it is being put will give us a better understanding of where to draw lines between permissible and impermissible uses or uses that might justify opt-in as opposed to opt-out consent.

But as Kyle McSlarrow, CEO of the National Cable and Telecommunications Association notes,
Packet inspection serves a number of pro-consumer purposes. First, it can be used to detect and prevent spam and malware, and protect subscribers against invasions of their home computers. It can identify packets that contain viruses or worms that will trigger denial of service attacks; and it can proactively prevent so-called Trojan horse infections from opening a user's PC to hackers and surreptitiously transmitting identity information to the sender of the virus.

Packet inspection can also be used to help prevent phishing attacks from malicious emails that promote fake bank sites and other sites. And it can be used to prevent hackers from using infected customers' PCs as "proxies," a technique used by criminals, in which user PCs are taken over and used as jumping-off points to access the Internet, while the traffic appears to be generated by the subscriber's PC. As a result, the technology can be used in spam filters and firewalls.

Second, packet inspection can be used for network diagnostics and capacity planning. Cable operators cannot plan for network growth without understanding how Internet traffic is growing and the uses to which it is put. By using this technology to analyze the aggregate growth and usage changes in network traffic patterns over time, cable operators can anticipate the needs of their subscribers and appropriately plan for network growth.

Third, packet inspection can help network operators accurately respond to formal requests from law enforcement agencies for the interception of communications for law enforcement purposes. When law enforcement agencies identify traffic of concern, this technology allows network operators to comply with their legal obligations to flag that traffic.

Finally, the Internet is not static. Different opportunities and challenges will emerge and this technology may prove useful in providing consumers more choice and control in ways that are difficult to predict today. For instance, as streaming video capabilities increase, this technology could be a means of supporting more advanced parental controls.
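The spam and malware filtering McSlarrow describes ultimately rests on matching packet payloads against known byte signatures. A minimal sketch of that matching step follows; the signatures and payloads here are made up for illustration, and real DPI engines use large, curated rule sets and far more sophisticated matching.

```python
# Hypothetical signatures for illustration only.
SIGNATURES = {
    b"X5O!P%@AP": "EICAR-like test pattern",
    b"<script>evil()": "suspicious inline script",
}

def inspect_payload(payload: bytes) -> list[str]:
    """Return the names of any known signatures found inside a packet payload."""
    return [name for sig, name in SIGNATURES.items() if sig in payload]

packets = [b"GET /index.html HTTP/1.1", b"...<script>evil()..."]
for p in packets:
    hits = inspect_payload(p)
    # A filter would flag, quarantine, or drop the packet when hits is non-empty.
```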

It will not be possible for Congress to outlaw deep packet inspection, because the consumer benefits are too compelling.

And besides, deep packet inspection is only one of the ways online providers can accumulate personally identifiable information about consumer preferences.

If Congress tries to regulate how online providers collect sensitive consumer data, it runs the risk of choosing the wrong technology winners and losers.

Consumers ought to be allowed to opt in or out.

AT&T says it will let consumers opt in

AT&T will not use consumer information for online behavioral advertising without an affirmative, advance action by the consumer that is based on a clear explanation of how the consumer's action will affect the use of her information.
But other entities, such as Google, are using an opt-out approach.

Both are excellent business models which allow consumers to choose. The AT&T approach may promote more privacy; the Google approach may promote more free services. Both approaches give consumers notice and control.

AT&T, Google and others ought to be allowed to compete. And consumers ought to be allowed to choose.

With freedom to innovate, who can predict what new service and features the providers will come up with?

A legitimate issue for Congress is what the liability of online providers should be for violations of privacy policies, including breaches of sensitive consumer data.

This is an area in which it would be appropriate for Congress to enact tough sanctions, if it is so inclined.

January 20, 2009
Privacy recommendations

An interesting report, "Securing Cyberspace for the 44th Presidency" by the Center for Strategic and International Studies, focuses primarily on numerous process improvements within the executive branch. Process is important, of course, but I have never been convinced process is a substitute for the content of human hearts.

According to the report, the government should make "strong authentication of identity, based on robust in-person proofing and through verification of devices, a mandatory requirement for critical cyber infrastructures," but consumers should be protected from businesses and other services who might require strong government-issued or commercially issued credentials for all online activities (this could be done by requiring businesses to adopt a risk-based approach to credentialing).

Well, goodness. I guess some might consider the renewal of an auto registration or the remittance of a traffic fine more deserving of superior credentials than a mortgage or student loan application. The former contains little personal information; the latter a lot. But I guess the report's authors think that any government function is important and private interactions are not.

This may sound shocking, but I suspect businesses -- through trial and error -- may be capable of discovering the optimum balance between security and convenience for most consumers.

February 22, 2008
Stealing encrypted data

Researchers at Princeton have figured out how to crack encrypted files stored on a computer's hard drive, according to the New York Times.

"Cool the chips in liquid nitrogen (-196 °C) and they hold their state for hours at least, without any power," Edward W. Felten, a Princeton computer scientist, wrote in a Web posting. "Just put the chips back into a machine and you can read out their contents."
This technique -- which enabled the researchers to retrieve encryption keys from DRAM chips -- can't be carried out remotely via the Internet or a WiFi connection; it works only if your computer is stolen or seized.
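The search step of an attack like this can be crudely illustrated in code. This is not the Princeton researchers' actual keyfind tool (which searches for AES key-schedule structure); it is a simplified heuristic that flags high-entropy 16-byte windows in a captured memory image, since key material looks statistically random while most RAM contents do not.

```python
import math
from collections import Counter

def entropy(block: bytes) -> float:
    """Shannon entropy of a byte block, in bits per byte (max 4.0 for 16 bytes)."""
    counts = Counter(block)
    n = len(block)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def candidate_keys(image: bytes, window: int = 16, threshold: float = 3.5):
    """Yield offsets of high-entropy windows -- possible AES-128 key material."""
    for off in range(0, len(image) - window + 1):
        if entropy(image[off:off + window]) >= threshold:
            yield off
```

A real tool would then test each candidate against the ciphertext or verify the key-schedule relationships rather than rely on entropy alone.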

One way to look at this is to lament that nothing one stores on a computer can be truly safe. But that's pessimistic -- a bit like lamenting that no one can build a ship which can't sink or a vehicle which can't be stolen. Just as it is true that any secret code can be broken, it's equally true there's no limit to the complexity or redundancy one can add to secret codes to make them harder to compromise. Microsoft and Apple suggest how to protect one's personal files in case one's computer is stolen or seized:

Austin Wilson, director of Windows product management security at Microsoft, said the company recommended that BitLocker be used in some cases with additional hardware security. That might include either a special U.S.B. hardware key, or a secure identification card that generates an additional key string.

The Princeton researchers acknowledged that in these advanced modes, BitLocker encrypted data could not be accessed using the vulnerability they discovered.

An Apple spokeswoman said that the security of the FileVault system could also be enhanced by using a secure card to add to the strength of the key.
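The layered defense both vendors describe -- a passphrase plus a hardware token -- can be mimicked in principle by deriving the disk key from both secrets, so that neither alone suffices. The sketch below is a hedged illustration of that idea, not BitLocker's or FileVault's actual key-derivation scheme; the function names and parameters are my own.

```python
import hashlib

def derive_disk_key(passphrase: str, token_secret: bytes, salt: bytes) -> bytes:
    """Stretch the passphrase with a slow KDF, then mix in the hardware
    token's secret. An attacker who recovers the passphrase from memory
    still lacks the token; one who steals the token still faces the KDF."""
    stretched = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)
    return hashlib.sha256(stretched + token_secret).digest()

key = derive_disk_key("correct horse battery", b"\x01" * 32, b"per-disk-salt")
```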

November 9, 2007
Give them immunity

With all due respect for the views of my colleagues (here and here) and commenters at Technology Liberation Front, former Sen. Bob Kerrey had this, and other, mature insights in an op-ed which appeared yesterday in The Hill regarding whether to include immunity for telecom carriers in the Foreign Intelligence Surveillance Act (FISA) reauthorization:

Consider the atmosphere: the president had gone before Congress and said "one vial, one canister, one crate, slipped into this country, could bring a day of horror like none we have ever known." So if these companies refused to cooperate, by implication, that dark day could be on their conscience. And now they cannot even defend themselves in court, because the details of the investigations remain classified.

Opposition to immunity isn't aimed so much at punishing the telecom providers, but at obtaining information about what really happened and about reaffirming the significant legal duties that telecom providers have for safeguarding the privacy of their law-abiding customers.

Presumably any judge would have some sympathy for the telecom providers, considering the extraordinary circumstances; still, investors have an irrational fear of legal bills and uncertainty.

Whether the warrantless surveillance was really unconstitutional isn't absolutely clear. The Supreme Court hasn't said, and some believe the Court might defer to the president, who was acting as commander-in-chief to protect the nation's security. The Fourth Amendment concerns "unreasonable" searches and seizures, and electronic surveillance is routinely conducted on all sides during wartime.

Under FISA, the Foreign Intelligence Surveillance Court can authorize electronic surveillance when there's probable cause to believe that the target of surveillance is an agent of a foreign power or a terrorist. The argument is that the Bush administration should have invoked this procedure, which would have protected the telecom providers from liability.

But Richard A. Posner observed in February 2006 that FISA was "dangerously obsolete" because, while it allowed electronic surveillance against known terrorists, it couldn't authorize surveillance for the purpose of identifying potential terrorists and their supporters.

[FISA] retains value as a framework for monitoring the communications of known terrorists, but it is hopeless as a framework for detecting terrorists. It requires that surveillance be conducted pursuant to warrants based on probable cause to believe that the target of surveillance is a terrorist, when the desperate need is to find out who is a terrorist.
Writing in 2005, William Kristol and Gary Schmitt posited the following hypothetical:
A U.S. president has just received word that American counterterrorist operatives have captured a senior al Qaeda operative in Pakistan. Among his possessions are a couple of cell phones -- phones that contain several American phone numbers. In the wake of Sept. 11, 2001, what's a president to do?
Kristol and Schmitt rightly asked where is the evidence, in this hypothetical, to support a finding of probable cause to believe the targets of electronic surveillance, in the U.S., are terrorists?
Who knows why the person seized in Pakistan was calling these people? Even terrorists make innocent calls and have relationships with folks who are not themselves terrorists.
I have no idea if this was the actual justification or not, but it sounds plausible and legitimate to me.

Kerrey makes the logical point that the fight against terrorism will require access by the government to all kinds of personal data:

It is now clearer than ever that to connect the dots in future terror investigations, the government simply cannot do it alone -- it must have the full, unwavering support of private industry. The global proliferation and increasing sophistication of terrorist operations means that every private enterprise -- from the telecom and tech companies to the car renters and airlines, data-mining and credit card firms, chemical manufacturers and fertilizer retailers -- virtually every private concern in the U.S. economy must be willing to help out when a terrorism investigator comes to call.

The possibilities for abuse, given the occasional corrupt politician, careless bureaucrat or scheming corporation, stagger the imagination. Corporations like to curry favor from politicians; bureaucrats are assigned laptops, for some reason; politicians like to leak damaging details about their opponents' private lives; the list goes on. But the question ought to be whether it's possible to prevent abuse in most cases while allowing the government every tool to detect and prevent terrorist attacks.

Posner suggested a combination of criminal penalties and evidentiary prohibitions which sound like a promising starting point:
Forbid any use of intercepted information for any purpose other than "national security" as defined in the statute ... Thus the information could not be used as evidence or leads in a prosecution for ordinary crime. There would be heavy criminal penalties for violating this provision, to allay concern that "wild talk" picked up by electronic surveillance would lead to criminal investigations unrelated to national security.

The suggestion is evocative, at least for me, of the Miranda ruling, which addressed the problem of unscrupulous police investigators who conducted coercive interrogations to obtain confessions from innocent suspects. The Supreme Court solved the problem by making improperly obtained evidence inadmissible and not by prohibiting interrogations or confessions -- which the Court recognized were indispensable techniques for fighting crime. I don't know many who would argue that the Miranda Warning hasn't worked pretty well.

February 22, 2007
Data retention would be costly

I surmised here that it would be costly for ISPs to retain customer data pursuant to a new proposal in the House of Representatives, and subsequently came upon a news report from a couple years ago in which industry sources predicted the cost of a similar proposal under consideration in the European Union would be quite large:

For AOL, retaining communications data for one year would add an enormous cost, said de Stempel. "There are huge amounts of data involved. AOL has 329m user sessions a day, and its customers send 597m emails, and we're just one ISP." De Stempel said that to save all communications data on its UK customers for just one day would require 100 CDs. "If you multiply that (for a year) it will have an enormous impact on our business."

Further costs would be incurred because an ISP could not simply hand a whole year's worth of CDs (36,000 in the case of AOL) over to police or other law enforcement agency when a request was made because, they say, this would be an offence under Regulation of Investigatory Powers Act (RIPA). RIPA says that any requests for communications data has to be proportional. "We'd have to search for a particular piece of data," said de Stempel.

Clive Feather, an Internet expert at ISP Thus who also gave evidence, said AOL's figure of 36,000 CDs was if anything an underestimate of the scale of the problem. "This is raw data. If ISPs are retaining data so it can be searched later then it has to be organised and indexed," said Feather. "And this would all have to be paid for."

Feather said he had no idea where the government's estimate of £20m for the whole industry came from. "The cost would be £5m to £6m for us alone," he added.
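De Stempel's figures can be checked with simple arithmetic, assuming a standard CD-R capacity of roughly 700 MB (an assumption on my part; the testimony does not specify the disc capacity):

```python
cds_per_day = 100   # AOL's estimate for one day of UK communications data
days = 365
mb_per_cd = 700     # assumed standard CD-R capacity

cds_per_year = cds_per_day * days                    # close to the "36,000" cited
tb_per_year = cds_per_year * mb_per_cd / 1_000_000   # decimal terabytes

print(cds_per_year)  # 36500
print(tb_per_year)   # 25.55
```

So even at the raw, unindexed level, one ISP would accumulate on the order of 25 TB per year -- before the indexing and organization costs Feather notes.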

The full article can be found here.

The transcript of the testimony is here.

November 18, 2005
Spyware legislation advances in Senate

The Senate Commerce Committee approved a modified version of S. 687, a bill sponsored by Senator Conrad Burns (R-MT) and Senator Barbara Boxer (D-CA) which would target a variety of malicious practices that include: computer hijacking, spam zombies, endless loop pop-up advertisements and fraudulent software installation. A similar measure (H.R. 29) introduced by Rep. Mary Bono (R-CA) and Rep. Ed Towns (D-NY) has passed the House. The House has also approved H.R. 744, by Rep. Bob Goodlatte (R-VA) and Rep. Zoe Lofgren (D-CA), which addresses criminal penalties and prosecutorial tools related to spyware.

Spyware legislation is beneficial because it will promote consumer awareness and assist law enforcement. But technological solutions to the problem may ultimately prove more important. The industry is working on a number of solutions and requires flexibility to respond to evolving challenges. Lawmakers in both the Senate and House appear to be fully conscious of the danger of unintended consequences from legislating in this area. If the legislation's aims and means are too expansive or are not described with optimal clarity, for example, not only could it kill promising technological solutions but it could also ensnare legitimate applications and services that will make the use of computers more simple and secure for ordinary Americans.

Ratify the Cybercrime Convention

It is already against the law in the U.S. to interfere with someone else's computer or to commit traditional crimes with the aid of a computer. However, many countries have gaps in their criminal laws governing computer-related crimes and have become havens for cyber-criminals. Another problem is that electronic evidence of a crime is difficult for law enforcers to locate and secure when it crosses borders. A treaty awaiting final Senate approval would fully criminalize computer-related offenses in other countries and require each country to have the power to quickly preserve and disclose stored computer data, to compel the production of electronic evidence by ISPs, to search and seize computers and data, and to collect traffic data and content in real time. These evidence-gathering and surveillance powers are already provided for under U.S. law.

The Convention on Cybercrime has been criticized on the ground that it could allow a foreign country to collect evidence or eavesdrop in the U.S. -- on who knows what? -- via "mutual assistance." But the evidence-gathering and surveillance powers are subject to conditions and safeguards under domestic law that protect civil liberties, such as the First Amendment.

The Senate should ratify the treaty, which will promote an international minimum baseline in computer-related criminal offenses and law enforcement tools.
