By Tim Sparapani
Innovators and startups welcome the news that policymakers are taking a fresh look at how to protect consumers’ privacy online. While the headlines may try to spin this as just another partisan food fight, in truth it’s an incredibly important opportunity to restore balance and clarity to consumer privacy rules in the online ecosystem.
As we’ve said from the start, the privacy rules adopted late last year by the Wheeler FCC were clearly flawed, and the ongoing jurisdictional tussle over privacy needs to be resolved for the benefit of consumers and companies alike. The Wheeler rules created an inconsistent, confusing patchwork in which consumers’ private information on the internet would be protected differently depending on which servers and routers their data happened to be crossing. Yes, the exact same data would arbitrarily enjoy different levels of protection. Ninety-four percent of consumers believe that all companies collecting their information online should face the same set of rules – and they’re right. The Wheeler rules break from the bipartisan FTC privacy framework under which the internet has thrived and grown, introducing new friction and erecting confusing, unjustified obstacles to even the most mundane uses of data that any consumer would regard as non-sensitive. This kind of regulation is bad for consumers, bad for entrepreneurs, and bad for innovation.
In addition, a little-known consequence of the Wheeler rules was that they jeopardized the United States’ privacy agreement with the European Union. The Privacy Shield is predicated in part on the United States having a single, lead consumer privacy agency, and the dilution of the FTC’s authority puts this agreement at risk.
We’re glad that policymakers at the FCC and in Congress will have an opportunity to review the rules again and, hopefully, correct these flaws. A return to the FTC’s role as the lead privacy enforcer would allow innovators to do what they do best: innovate. In addition, a consistent set of rules would help assuage consumer advocates’ concern that gaps in enforcement could delay critical privacy actions when companies ignore or outright abuse their data responsibilities to their customers.
October 7, 2016
“This version, like the first, falls woefully short of its noble goal to safeguard consumer data and increase transparency for the public. Subjecting the exact same data to different and arbitrary rules depending upon a company’s primary offering in today’s era of vertical integration does not increase consumer privacy. It is also blind to the realities of the marketplace. We need 21st Century privacy rules to govern a 21st Century data market,” said Tim Sparapani, CALinnovates’ senior policy counsel.
“Chairman Wheeler has indicated that some favored companies will be allowed to practice permissionless innovation outside the FCC’s jurisdiction while other disfavored entities must operate under the microscope, despite the fact that the data is one and the same. Businesses of all types today are data companies first and foremost, whether they make software or deliver internet access – or both. And innovation can and should spring from all types of companies, ISPs included.”
“Today, Verizon owns Yahoo, AOL and Huffington Post, and the line between ISPs and edge providers has been increasingly blurred. Consumers will be no better off under this scheme than the previous one, but they may be worse off than they are today.”
“This is a referendum on innovation and an affront to consumers who expect more and demand better. No matter how Chairman Wheeler tries to spin it, his latest iteration of the FCC’s privacy proposal is nothing more than lipstick on a pig,” said Mike Montgomery, executive director of CALinnovates.
“CALinnovates encourages Chairman Wheeler to return to the drawing board to rewrite the rules one more time. Better yet, the FCC should seek further public input as well as guidance from Congress and the FTC, which has the longstanding privacy expertise the FCC lacks.”
CALinnovates is a non-partisan coalition of tech companies, founders, funders and non-profits determined to make the new economy a reality.
By Tim Sparapani
“Bad UI leads to bad UX” is one of the most common sayings in Silicon Valley. Translated, this means that bad user interface (UI) – the look, feel and relative usability of an app or website’s design – will inevitably create a bad user experience (UX). Silicon Valley spends considerable resources trying to build more intuitive, instinctive designs, especially when trying to get consumers’ attention and permission for using their data to offer them products and services. This challenge is at the heart of an upcoming Federal Trade Commission conference this week focusing on the effectiveness of online disclosures to consumers.
Companies have long had to balance completeness and usefulness in disclosures — take our constantly evolving nutrition labels, for example. In the digital age, while tech companies have made important strides in communicating with consumers, striking the right balance remains a challenge, just as it does when describing the most important details of your favorite cereal.
Despite decades of work by all kinds of companies to figure out best disclosure practices, it’s remarkable how much we still have to learn about designing disclosures of critical information to consumers. The challenge is most stark online: tech companies must figure out how to get their consumers’ attention, tell them what they need to know, and obtain their permission when needed, all without creating the dreaded “notice fatigue” in which consumers ignore disclosures or, worse, abandon the website or app out of annoyance.
That’s why the upcoming FTC workshop to explore consumer disclosures is both so interesting and so important. The FTC, state attorneys general, and consumers themselves rightly expect that companies should communicate clearly what consumers should or must know about a company’s products or services. Today, the FTC will gather experts from various disciplines to explore how consumers process messages, along with the challenges of disclosures, permissions and warnings, all with the goal of advancing UI.
How to communicate something you really need someone to know is a vexing problem in life. Perhaps, if you are married, you are really good at sharing important news and wisdom with your spouse. You probably do not try to use the same words to share the same bit of wisdom with your children or someone who is from an older generation. Words and phrases, much less idioms or technical language, often mean different things to people with different experiences. Those consumers for whom English is not their first language may understand words in translation differently than those who are native speakers.
Read the full article here.
By: Tim Sparapani
If you want to buy someone’s private data, it’s disturbingly easy to do. It’s all there for sale on the dark web, a completely anonymous twin of the web most of us use daily.
The dark web (a subset of the larger “deep web” of unindexed pages) is “dark” because the sites on it cannot be crawled and indexed by search engines such as Google. That makes it hard for ordinary people, and law enforcement, to find specific websites. This anonymity has the advantage of creating a zone of free speech where individuals can communicate, think and explore ideas without government interference.
But it also creates a haven for illicit activity, including the buying and selling of drugs, child pornography and individuals’ private information such as social security numbers, health records and passwords.
People who don’t closely follow privacy issues probably associate the dark web with Silk Road, the infamous illegal drug marketplace that did millions in business before the FBI managed to shut the site down in 2013.
But the death of Silk Road didn’t put an end to the dark web. This shady technological playground is still going strong, and many sites that thrive on the dark web are a daily threat to privacy and the economy.
Over the past three months, the website LeakedSource has uncovered huge caches of account data being sold on the dark web from eight websites including Twitter, MySpace and LinkedIn. In some cases, those accounts came from privacy breaches at the web companies. In other cases, data thieves were able to steal information directly from users.
The way the account information was stolen matters less than the fact that so much of it is for sale. Need a Netflix password? They’re available for pennies on the dark web. You can also get stolen passwords for Hulu, HBO Go and Spotify.
The dark web has also become a haven for child pornography. According to an article on Wired, over 80% of dark web searches are related to pedophilia.
By: Tim Sparapani
The US Supreme Court has just made the law of privacy in the US about as settled as wet cement. Now, neither consumers nor companies handling consumer data know where things stand.
This all came about when a data broker – a company that gathers data about individuals, typically without their knowledge or consent, and then resells that data – created a file of wholly inaccurate information about an individual for resale. Upon learning of the data broker Spokeo’s actions, the individual sued Spokeo, citing a violation of his rights under the Fair Credit Reporting Act. That federal statute creates a right to sue for violations. The trial court nevertheless dismissed the case, but the US Court of Appeals for the Ninth Circuit allowed it to proceed. The Supreme Court overturned that decision and sent the case back for additional consideration because the Ninth Circuit had not determined whether the plaintiff’s alleged injury was sufficiently real, tangible, or, as the Court put it, “concrete” enough to meet Constitutional standards for sustaining a lawsuit.
This effective non-decision by the US Supreme Court, coupled with the barest of guidance, has created tremendous controversy in privacy law. A debate is now raging in Washington and in the offices of corporate General Counsels and plaintiffs’ attorneys nationwide about what it takes to satisfy this vague standard. Pitched battles are being waged to influence the interpretation of that non-decision and shape what happens next, because so much is at stake at a time when our economy is driven by identifying and unlocking value from consumers’ data.
How do we know when a company’s actions using a consumer’s data – especially erroneous data – harmed that consumer’s privacy? The whole debate will now turn on the definition of the term “concrete.” It’s a word that’s hard to lock down, and dictionaries provide only slightly more help than the thin guidance offered by the US Supreme Court: the Cambridge English Dictionary defines “concrete” as “based on sure facts or existing things rather than guesses or theories,” while Merriam-Webster offers “specific, particular; real, tangible.”
This non-decision has corporate America celebrating because fewer privacy cases will be successful. Influential privacy and consumer advocates, in contrast, argue that giving the lower court a do-over, in effect, changes nothing. The truth lies somewhere in between, of course.
This is, no doubt, a blow for plaintiffs trying to bring lawsuits. By forcing people who want to sue to describe a tangible injury – and perhaps barring ephemeral or hard-to-quantify privacy harms, even when Congress created a statutory right to sue – the Court has raised the barrier to successfully suing to vindicate privacy invasions. While that’s not the end of all privacy suits, as some have erroneously claimed, it does mean that some privacy cases that would have gone forward in the past will not make the cut. That surely means that some not-well-articulated but nonetheless important privacy harms will go unaddressed in the courts.
By Tim Sparapani
Los Angeles is considering new regulations around Airbnb, and other home-sharing platforms, that should deeply worry anyone who cares about keeping their personal information private. If approved, the regulations would require people who rent out space via a home-sharing platform to hold on to three years’ worth of information about who rented their property for how long and at what price. The Office of Finance would have the right to inspect these records at any time.
It’s unclear exactly why the government is proposing this level of privacy invasion. The main thrust of the proposed legislation, which will eventually need to be approved by the LA City Council, is to set out guidelines and fines that would ensure a level of safety and accountability for home rentals. This market is growing quickly. According to a recent poll by Time magazine, 26% of the population has used a home-sharing service. As such, it’s not a stretch for the government to set some commonsense rules around the market and collect taxes from commercial activity.
But the invasion of privacy outlined in the Los Angeles proposal will create unnecessary risks for consumers.
Think about when you check into a hotel. If you pay with a credit card, the hotel will likely look at your driver’s license to make sure it matches the name on the card, but they don’t have to. If you pay with cash, they don’t need any kind of proof of identity. You pay your money and you get your room.
So why is the sharing economy potentially going to be held to a different standard?
By: Tim Sparapani
It’s late in the college basketball season, and just as the best teams must tighten their defense to win a championship, it may be time for our privacy regulators to increase their defensive intensity to deter and prevent further consumer privacy violations. Just like in basketball, sometimes it takes switching up your tactics and your defense to take your team’s game to a new level.
The challenges of policing misuse of consumers’ data are getting harder, not easier, for regulators on the privacy beat. Those regulators – chiefly the Federal Trade Commission and the state Attorneys General – have performed admirably in an age when accidental or clumsy data breaches are daily events, and cyber attacks – many successful – are the norm, not the exception. The FTC, like a seven-foot-tall center, is well-practiced at swatting away the easy, slam-dunk cases in which companies deceive consumers about their privacy practices, through conduct ranging from neglect to poor cross-company coordination to outright lies. Yet, as a longtime privacy and consumer advocate, I’m eager to see more done. I want to shout “De-Fense!” – or maybe “Un-Fair-Ness!” – and exhort the FTC to do more for consumers, and in a new way.
Let’s be honest, however: the FTC has resource constraints; it simply cannot police every violation of consumer trust or every misuse of consumers’ data by a corporation. Nor is privacy defense the only role assigned to the FTC. The FTC is statutorily required to enforce dozens of consumer protection statutes, and its work on consumer data privacy, while it has led to groundbreaking and important results, is limited by a lack of staff and the intricacies of sophisticated new scams. The FTC’s data privacy work is also limited by the fast-moving pace of technology: new systems are brought to the public and then rendered obsolete by subsequent iterations within months, not years.