People Still Value Privacy. Get Over It. Online Privacy Alliance.

A recent article in The National Law Review, "People Still Value Privacy. Get Over It.," was written by Mark F. Foley of von Briesen & Roper, S.C.:


Sun Microsystems’ CEO Scott McNealy famously quipped to reporters in 1999: “You have zero privacy anyway. Get over it.” Sun on Privacy: ‘Get Over It‘, WIRED, Jan. 26, 1999, http://www.wired.com/politics/law/news/1999/01/17538.

 

At the time, Sun Microsystems was a member of the Online Privacy Alliance, an industry coalition seeking to head off government regulation of online consumer privacy in favor of industry self-regulation. Although McNealy was widely criticized for his views at the time, it is fair to say that much of the technology world agreed then, or agrees now, with his remark.

Have we gotten over it? Do we reside in a world in which individuals assign so little value to personal privacy that companies who collect, process, analyze, sell, and use personal data are free to do whatever they want?

There are indications that if it ever were true that consumers did not value privacy, their interest in privacy is making a comeback. Where commercial enterprises do not align their practices with consumer expectations and interests, a regulator will step in and propose something unnecessarily broad and commercially damaging, or outraged consumers will take matters into their own hands. Recent privacy tornadoes provide the proof.

For some time, employers have accessed public information from social media sites to monitor employee activities or to investigate the personal qualifications of prospective hires. But recently, companies have gone further, demanding that employees and prospects provide user names and passwords that would enable the company to access otherwise limited distribution material. Dave Johnson, a writer for CBS Money Watch, said employer demands for access to an employee’s or prospective hire’s Facebook username and password are “hard to see … as anything other than an absolutely unprecedented invasion of privacy.”  http://www.cbsnews.com/8301-505143_162-57562365/states-protect-employees-social-media-privacy/

The reaction was predictable. In the past year, six states – California, Delaware, Illinois, Maryland, Michigan and New Jersey – have reacted to public outcries by outlawing the practice of employers coercing employees into turning over social media account access information. At least eight more states have similar bills pending, including Massachusetts, Minnesota, Missouri, New York, Ohio, Pennsylvania, South Carolina, and Washington. See National Conference of State Legislatures Legislation Summary as of Jan. 8, 2013 at http://www.ncsl.org/issues-research/telecom/employer-access-to-social-media-passwords.aspx.

Similarly, Congress enacted the Children’s Online Privacy Protection Act (COPPA) in 1998 in response to the failure of self-regulation to limit the scope and nature of information collected from young children. COPPA and implementing regulations limited the collection of information from or about children less than 13 years old. In the past several years, it was widely conceded that this law was not effective in preventing the collection and use of personal information about our children, particularly where photographs and mobile phones were concerned. Companies collecting and using information about children took no action to satisfy parental concerns.

The reaction? In December 2012, the Federal Trade Commission issued amended regulations to make clear that COPPA rules apply to a child-oriented site or service that integrates outside services, such as plug-ins or advertising networks, to collect personal information from visitors. The definition of restricted personal information now includes geolocation as well as photos, videos, and audio files that contain a child’s image or voice, and “persistent identifiers” that recognize users over time and across different websites or services.

Parents and job counselors have been warning for years that teenagers and young adults must not post unflattering images to their Facebook pages because, even if deleted, they will persist somewhere on the internet and may be found by prospective colleges and employers. There were many anecdotes about teenagers committing suicide after nasty postings or the distribution of photos. There did not seem to be a practical solution to the problem.

Last year, the European Commission proposed a sweeping revision to its already difficult data privacy rules to include an explicit “right to be forgotten.” If the proposal is adopted, individuals can demand that websites remove personal photos or other data. Companies that fail or refuse to do so could be fined an amount based on their annual income. The rules, as proposed, would apply both to information the data subject posted about herself and embarrassing information others posted about her, unless the website can prove to a regulator that the information is part of a legitimate journalistic, literary, or artistic exercise. Such a new law would set up a dramatic clash between the European concept of privacy and the American concept of free speech.

For the past three years we’ve heard shocking stories about phone Apps that quietly collect information about our searches, interests, contacts, locations, and more without disclosure or a chance to opt out. The uproar led to only limited action that has not satisfied consumer concerns.

The reaction? U.S. Representative Hank Johnson has proposed The Application Privacy, Protection, and Security (APPS) Act of 2013, which would require App developers to disclose their information-gathering practices and allow users to require that their stored information be deleted.

Increasingly, consumers are not waiting for regulatory action, but are taking privacy protection into their own hands. For example, Instagram built a business on its photo-sharing App. Shortly after it became popular enough to be purchased by Facebook, Instagram issued new terms of service and privacy policies that appeared to give the company the right to use uploaded images without permission and without compensation. The Washington Post described consumer reaction as a “user revolt. . . on Twitter where shock and outrage mixed with fierce declarations swearing off the popular photo-sharing site for good.” http://articles.washingtonpost.com/2012-12-18/business/35908189_1_kevin-systrom-instagram-consumer-privacy. The Twitter response was so memorable that perhaps, in the future, “insta-gram” will come to have a secondary meaning of “a massively parallel instantaneous complaint in cyberspace.”

The blogosphere and Twitterterra were filled with apologies and explanations by Instagram and others stating the company was not a bad actor and truly had no intention of using photos of your naked child to sell diapers without your permission. Even some of the harshest critics admitted, “it’s [not] quite as dramatic as everyone . . . made it seem like on Twitter.” See Theron Humphrey quoted in David Brancaccio’s Marketplace Tech Report for December 19, 2012, http://www.marketplace.org/topics/tech/instagrams-privacy-backlash-and-dirty-secret-data-caps. But the truth about the revised terms and conditions may not matter because consumer goodwill toward Instagram had been destroyed by the perception.

Instagram users are not alone in their disapproval of commercial uses of personal information. Consumer analytics company LoyaltyOne released a July 2012 survey that shows U.S. consumers are increasingly protective of personal information. Of the 1,000 consumers responding, only about 50% said they would be willing to give a trusted company their religious or political affiliation or sexual orientation, only 25% were willing to share commonly commercialized data such as their browsing history, and only 15% were willing to share their smart phone location. See summary of findings at http://www.retailcustomerexperience.com/article/200735/Consumers-still-value-privacy-survey-shows. USA Today reported that an ISACA survey of adults 18 years and older showed that 35% would not share any personal information if offered 50% off a $100 item, 52% would not share any personal information if offered 50% off a $500 item, and 55% would not share any personal information if offered 50% off a $1,000 item. USA TODAY, Bigger Discount, Less Sharing, January 21, 2013.

I’m confident everyone reading this Update has been sufficiently careful and prudent in their own personal and professional lives; but who among us has not had an, ahem, family member who regrets a photo posted to a social media site, an unappreciated email joke, or a comment in a tweet or blog that looks much less “awesomely insightful” after the passage of a few days? (Is there an emoticon meaning “I’m being really facetious”?) Such brief moments of indiscretion can lead to disproportionately bad results.

Have commercial collectors, users, and resellers of such information shown sufficient willingness to respond to consumers’ widespread discomfort with the permanent retention of, and uncontrolled access to, their personal information, candid photos, and musings?

We no longer inhabit a Wild West without limits on the collection and use of personal information for commercial purposes. Be assured that when something perceived to be bad happens, there will be a violent, goodwill-damaging, market-value-destroying, throw-out-the-baby-with-the-bath-water, Instagram-like response that will obliterate some current business models and corporate franchises. Notwithstanding terms and conditions of service that try contractually to deprive users of any right to complain about your use of their data, they will complain and they will vote, with both their Feet and their Tweets.

There are very good social, psychological, religious, and political reasons why privacy should be protected. See Wolfgang Sofsky, PRIVACY: A MANIFESTO (Princeton Univ. Press 2008). As consumers and parents we instinctively know that privacy is important, even if we can’t precisely define it and can’t say exactly why. Even though we’ve sometimes been too foolishly willing to let go of privacy protections in exchange for the convenience of a nifty new website or clever new App, we do, in the end, still care. We know there is something important at issue here. We should not forget this insight when we change hats and become business people deciding what data to collect and how to use it.

Companies that want to avoid receiving an “insta-gram,” and that want to build long-term relationships with consumers, need to accept that sentiment has changed and design their programs, analytics, and business models accordingly. It’s time to throw out McNealy’s aphorism. Businesses need to recognize that today consumers increasingly do value their privacy, and get over it.

©2013 von Briesen & Roper, s.c.

Lessons from the Facebook Privacy Fiasco

An article recently published in The National Law Review by Dean W. Harvey of Andrews Kurth LLP regarding Facebook privacy:

Facebook is a wildly popular social media site which allows users to share information about themselves, send messages to friends, play games and join common interest groups. It is the most visited site in the U.S., with over 100 million active U.S. users and hundreds of millions of active users worldwide.1

During the week of April 18, 2010, Facebook made material changes to the way that its users’ personal information was classified and disclosed. The changes resulted in complicated privacy settings that confused users, and in some cases, personal data which users had previously designated as private was allegedly made public. As a result, a group of petitioners, including the Electronic Privacy Information Center (“EPIC”), filed a complaint with the FTC requesting that the Commission investigate Facebook to determine whether it engaged in unfair or deceptive trade practices (“Complaint”).

Allegations

The Complaint claimed that Facebook violated its own privacy policy, disclosed personal information of Facebook users without consent, and engaged in unfair and deceptive trade practices. Specifically, the Complaint alleged that among other things:

  • Facebook made publicly available personal information which users had previously designated as private.2
  • Facebook disclosed to third parties information that users designated as available to Friends Only (including to third-party websites, applications, other Facebook users and outsiders who happen onto Facebook pages).3
  • Facebook claimed that none of a user’s information was shared with sites visited via a plug-in (such as the Like button, Recommend button, etc.). However, such plug-ins may reveal users’ personal data to such websites without consent.4
  • Facebook designed privacy settings “to confuse users and to frustrate attempts to limit the public disclosure of personal information . . .”5
  • Although the Facebook terms which many users accepted indicated that developers would be limited to a 24-hour retention period for any user data, Facebook announced that the limit no longer exists.6

Angry End Users

Regardless of whether each of the above allegations is true, it is clear that Facebook’s changes to its privacy practices inflamed some of its users. In support of its allegations, the EPIC Complaint included quotes from experts and users about Facebook’s privacy practices such as:

“I shouldn’t have to dive into complicated settings that give the fiction of privacy control but don’t, since they are so hard to understand that they’re ignored. I shouldn’t need a flowchart to understand what friends of friends of friends can share with others. Things should be naturally clear and easy for me.”7

“Facebook constantly is changing the privacy rules and I’m forced to hack through the jungle of their well-hidden privacy controls to prune out new types of permissions Facebook recently added. I have no idea how much of my personal information was released before I learned of a new angle the company has developed to give my information to others.”8

“‘Instant Personalization’ is turned on automatically by default. That means instead of giving you the option to “opt-in” and give your permission for this to happen, Facebook is making you “opt-out,” essentially using your information how they see fit unless you make the extra effort to turn that feature off.”9

“Facebook has become Big Brother. Facebook has succeeded in giving its users the allusion [sic] of privacy on a public site, leaving everyone to become complacent about keeping track of the myriad changes going on behind the scenes. The constant changes assure Facebook that you can never keep all your information private.”10

The Proposed Settlement

The FTC investigated the Complaint and ultimately agreed to a proposed settlement agreement containing a consent order.11 Without admitting liability, Facebook has agreed to a settlement that among other things requires the following:

  • Facebook must establish, implement and maintain a comprehensive privacy program designed to: (1) address privacy risks related to the development and management of new and existing products and services for consumers; and (2) protect the privacy and security of covered information.12
  • Facebook must obtain an independent third-party audit every other year for the next 20 years certifying that the Facebook privacy program meets or exceeds the requirements of the FTC order.13
  • Facebook is required to obtain express consent from a user before enacting changes that override the user’s privacy preferences.14
  • Facebook is required to prevent third parties from accessing user data after the user has deleted it (with exceptions for legal compliance and fraud prevention).15

Lessons from the Complaint and Order

Facebook received significant negative publicity, incurred legal costs and business disruption associated with a government investigation, and will incur compliance costs for the next 20 years as a result of the proposed settlement. Businesses that deal with consumer information would be well advised to learn from Facebook’s experience. There are several lessons that businesses can draw from the Facebook privacy fiasco in dealing with data privacy issues.

A. Don’t Make Your Customers Angry

Facebook’s intentions in making the changes to its privacy settings may have been entirely good. For example, Facebook may have honestly been trying to improve its user experience. However, the changes significantly angered some of its customers. The lesson to be learned here is that intentions don’t matter if you anger your customers with your changes. The ultimate user experience may be better, the site may objectively offer more functionality, but none of that matters if users are offended by the process.

Businesses need to achieve innovations and improvements in the use of consumer data with user consent, and without breaking prior promises. Keeping your customers satisfied isn’t just good business; it also greatly reduces the likelihood that they will be filing deceptive trade practice complaints with the FTC.

B. Keep the Privacy Settings Simple

Much of the Complaint is dedicated to showing how complicated the Facebook settings are, and many of the quoted user statements underscore that issue as well. Such complexity often leads to errors (such as permitting applications to access personal information of a user through the user’s friends). Even when the settings work perfectly, the average person may find such complexity frustrating, leading to angry end users.

It is important to keep privacy policies simple and establish privacy settings so that they can be easily understood by an average user. Informed consent is really only obtained when the user understands the policy or setting to which he or she is consenting.
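By way of a purely illustrative sketch (the field names and audience values below are assumptions, not any platform’s actual settings model), a settings design that offers only a few explicit audience choices and defaults every field to the most restrictive one is far easier for an average user to understand than a web of interdependent toggles:

    from dataclasses import dataclass
    from enum import Enum

    class Audience(Enum):
        """A deliberately small set of choices a user can grasp at a glance."""
        ONLY_ME = "only_me"
        FRIENDS = "friends"
        PUBLIC = "public"

    @dataclass
    class PrivacySettings:
        # Every field starts at the most restrictive audience; nothing becomes
        # more visible unless the user explicitly changes it.
        profile_photo: Audience = Audience.ONLY_ME
        friend_list: Audience = Audience.ONLY_ME
        posts: Audience = Audience.FRIENDS

        def widen(self, field_name: str, audience: Audience, user_confirmed: bool) -> None:
            """Widen visibility only on an explicit, affirmative confirmation."""
            if not user_confirmed:
                raise PermissionError("visibility can only be widened with explicit user consent")
            setattr(self, field_name, audience)

The particulars do not matter; the shape does: few options, restrictive defaults, and no path to wider visibility that bypasses an explicit user action.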

C. Consider How Applications Access User Data

When drafting a privacy policy, it is easy to focus on the organization’s use of data for internal purposes and with its vendors and subcontractors. However, special care must be taken with use of consumers’ data by software applications. For example, it is alleged that Facebook indicated applications only had access to the user information necessary for their operation, when the applications in fact had access to all user information.

In order to accurately describe how applications use consumer data in your privacy policy, you have to investigate the operation of the applications on your site, document that operation, and establish IT policies and procedures governing the use of data by new or modified applications. If you do not take these steps, it is likely that any promise regarding the use of data by applications will become misleading over time as the applications change and are updated.
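One way to make such a promise enforceable in practice is to route every application’s data request through a single gate that returns only the fields that application was granted. The sketch below is hypothetical (the app identifiers, field names, and grant table are assumptions, not any platform’s actual API):

    # Hypothetical per-application grants, recorded when the user installs each app.
    APP_GRANTS = {
        "quiz_app": {"display_name"},                  # needs nothing more to run
        "calendar_app": {"display_name", "birthday"},
    }

    USER_PROFILE = {
        "display_name": "Jane Doe",
        "birthday": "1990-01-01",
        "email": "jane@example.com",
        "location": "Dallas, TX",
    }

    def data_for_app(app_id: str, profile: dict) -> dict:
        """Return only the profile fields this application was explicitly granted.

        Anything not granted is simply absent, so later changes to the profile
        schema never silently widen an application's access.
        """
        granted = APP_GRANTS.get(app_id, set())
        return {key: value for key, value in profile.items() if key in granted}

    print(data_for_app("quiz_app", USER_PROFILE))   # {'display_name': 'Jane Doe'}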

D. Monitor Linking and Other Advertising Arrangements

Linking and advertising arrangements are the lifeblood of many sites. In order to make accurate statements about the types of data shared in such arrangements, it is necessary to review the contracts to understand what types of user data will be shared through business processes. However, this is not sufficient to ensure that the full use of data is understood. Just as with applications, it is necessary to investigate what data is collected or shared in the process of passing the user to the third party. Similar to applications, it is important to document what user data is permitted to be shared with advertisers and other third parties, and to establish IT policies and procedures to enforce such permitted uses.
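One concrete form that investigation can take is an automated audit comparing the query parameters actually passed in an outbound advertiser link against the data elements documented as permitted for that partner. The allowlist and parameter names below are assumptions for illustration only:

    from urllib.parse import urlparse, parse_qs

    # Documented, contractually permitted data elements for each advertising partner.
    PERMITTED_PARAMS = {
        "ads.example.com": {"campaign_id", "placement"},
    }

    def audit_outbound_link(url: str) -> list:
        """Return any query parameters not on the documented allowlist for the host."""
        parsed = urlparse(url)
        allowed = PERMITTED_PARAMS.get(parsed.hostname, set())
        return [name for name in parse_qs(parsed.query) if name not in allowed]

    # A link that quietly appends a user identifier gets flagged for review.
    print(audit_outbound_link(
        "https://ads.example.com/click?campaign_id=7&user_email=jane%40example.com"))
    # ['user_email']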

E. Don’t Make User Data Public Without Consent

One of the problems many businesses face with privacy policies is that as their business changes, the types of user data that they want to access or use may change as well. However, it is important to remember that no matter what the motive, if you have promised to keep certain elements of user data private in your privacy policy, you should not make it public by default without first obtaining affirmative user consent.
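In code terms, the safe migration pattern is the one sketched below (with hypothetical setting values): carry each user’s existing preference forward unchanged, and widen it only when an affirmative opt-in has actually been recorded.

    def migrate_visibility(old_setting: str, user_opted_in: bool) -> str:
        """Carry the existing preference forward; widen it only on affirmative opt-in."""
        return "public" if user_opted_in else old_setting

    # The prior promise stands unless the user affirmatively says otherwise.
    assert migrate_visibility("private", user_opted_in=False) == "private"
    assert migrate_visibility("private", user_opted_in=True) == "public"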

Privacy compliance is difficult in a changing online environment, even for businesses that don’t have hundreds of millions of users. The Complaint and Order in the Facebook matter highlight some of the many ways that a business can go wrong in protecting private consumer information. In order to successfully protect such information, a business which deals extensively with consumer data should establish, maintain, update and enforce a comprehensive privacy and security program, which takes into account material risks as well as lessons learned from the experience of other companies, such as Facebook.

1. In the Matter of Facebook, Inc., Complaint paragraph 31 (May 5, 2010); available at http://epic.org/privacy/facebook/EPIC_FTC_FB_Complaint.pdf.

2. Id. at paragraph 55.

3. Id. at paragraph 59.

4. Id. at paragraph 65.

5. Id. at paragraph 64.

6. Id. at paragraphs 92-94.

7. Id. at paragraph 95.

8. Id. at paragraph 97.

9. Id. at paragraph 98.

10. Id. at paragraph 106.

11. In the Matter of Facebook, Inc., File No. 092 3184, Agreement Containing Consent Order (“Order”); available at http://www.ftc.gov/os/caselist/0923184/111129facebookagree.pdf.

12. Id. at paragraph IV.

13. Id. at paragraph V.

14. Id. at paragraph II.

15. Id. at paragraph III.

© 2012 Andrews Kurth LLP