Protect Your CEO’s Tweets and Posts from U.S. Securities and Exchange Commission (SEC) Enforcement Action


The U.S. Securities and Exchange Commission (SEC) Enforcement Division altered the jet stream of blogosphere commentary last December by, for the first time, recommending legal action against a CEO on account of a Facebook post. Immediately after the announcement, a blizzard of articles, tweets, and blogs buried the mediascape with opinions about the critical role of CEO social media use in the new economy, the wisdom or foolishness of allowing CEOs to tweet or post, and whether the SEC should be time-warped back to the Stone Age it seems to prefer.

Sweeping away the accumulated hyperbole reveals two important takeaways from the SEC’s announcement, applicable to both public and private companies: i) the more things change, the more they remain the same, and ii) this latest “grave threat” to the modern world is not a crisis, but an opportunity. Social media can be a valid, legal, and effective way to communicate with investors, if it’s done right.

About Regulation FD

The SEC’s action responded to a July 2012 Facebook post by CEO Reed Hastings stating that members watched over 1 billion hours on Netflix in June. Netflix estimated that Hastings had reached 200,000 people through his Facebook, Twitter, and LinkedIn accounts. The SEC felt this was material information for investors and that by announcing it through social media, rather than more traditional outlets, Netflix had violated Regulation Fair Disclosure (Reg. FD).

The SEC adopted Reg. FD in 2000 to fix a perceived lack of fairness in the public securities markets. Before Reg. FD, public companies could share material information with analysts who participated in conference calls or meetings not open to smaller investors. Well-connected investors got trading advantages over the general public. Reg. FD prohibits public companies from providing material information to limited groups of investors without simultaneously making the information available to the entire marketplace.

Under Reg. FD, public disclosures must be made by “filing or furnishing a Form 8-K, or by another method or combination of methods that is reasonably designed to effect broad, non-exclusionary distribution of the information to the public.” The “other method” most often employed is a press release to an array of media outlets likely to disseminate the information broadly and quickly. Individuals and companies violating Reg. FD risk injunctions and monetary penalties.

Use of Social Media Growing, Creating Risks

Social media channels first became critical communication tools for companies after adoption of Reg. FD. A 2010 study of the 100 largest companies in the Fortune 500 found that 79% were using at least one of the four most popular social media platforms. See Burson-Marsteller Fortune Global 100 Social Media Study, Feb. 23, 2010, available at http://www.burson-marsteller.com/Innovation_and_insights/blogs_and_podcasts/BM_Blog/Lists/Posts/Post.aspx?ID=160

A 2012 Forbes article cited an IBM study predicting that 57% of surveyed CEOs would likely be using social media by 2017. Mark Fidelman, IBM Study: If You Don’t Have a Social CEO, You’re Going to be Less Competitive, FORBES, May 22, 2012.

The SEC itself uses social media to disclose important information such as speeches, trading suspensions, litigation releases, and administrative proceedings.

While some CEOs see social media as “part of their job description,” others try to minimize risk by having employees write or review tweets before posting, and some CEOs have already tried social media and moved on. See Leslie Kwoh and Melissa Korn, 140 Characters of Risk: Some CEOs Fear Twitter, WALL STREET JOURNAL, September 26, 2012.

Not everyone does, or should, use all forms of social media. The point of Twitter, for example, is to provide information contemporaneously with the occurrence of a thought or an event. This promptness is both the differentiating touchstone of the medium and its source of danger. Quick, unconsidered, unscripted communications by senior executives of public companies pose risks in the form of leaked intellectual property, disclosed business plans, angered customers, litigious investors, and frothy regulators. The SEC’s Netflix announcement demonstrates the potential liability that arises when information requiring careful consideration is disclosed through a social media channel focused solely on promptness. A Facebook post subjected to prior review might have been a better choice.

Even where the SEC does not act, executives may be at risk. In May 2012, retailer Francesca’s Holdings Corporation fired its CFO, Gene Morphis, after he tweeted: “Board meeting. Good numbers = Happy Board.” Mr. Morphis, who was also active on other social media outlets, had a history of postings about earnings calls, road shows, and other work-related matters. Morphis lost his job even though the SEC took no action. Rachel Emma Silverman, Facebook and Twitter Postings Cost CFO His Job, WALL STREET JOURNAL, May 14, 2012.

Social Media Without Big Risk

The SEC has never issued guidance specifically addressing social media, but it has issued guidance that a company’s website could be deemed sufficiently “public” to satisfy Reg. FD when: (1) it is a recognized channel of distribution, (2) posting on the website disseminates the information in a manner making it available to the securities marketplace in general, and (3) there has been a reasonable waiting period for investors and the market to react to the posted information. Indeed, “for some companies in certain circumstances, posting … information on the company’s web site, in and of itself, may be a sufficient method of public disclosure,” SEC Release No. 34-58288 (Aug. 7, 2008) at 18, 25.

This is an example of how “the more things change, the more they stay the same” when it comes to the intersection of law and technology. The purpose of Reg. FD is to make sure that all investors have access to the same information roughly simultaneously. The specific communications method is not important so long as the principle of public disclosure to the general market, not subsets of investors, is served. Because 8-K filings and press releases were the most common ways to quickly and broadly disseminate information in the past, investors knew where to look for them and could monitor those information outlets. Now, when companies establish their websites as well-known places to find press releases, SEC filings, and supplemental information, they, too, have become acceptable means for Reg. FD disclosures.

The same analysis applies to social media, as well as any new communications technology that may exist in the future. The critical question is: has the company sufficiently alerted the market to its disclosure practices based on the regularity, prominence, accuracy, accessibility, and media coverage of its disclosure methods? If so, social media should be just as acceptable as any other communication tool.

One company seems to have found the right balance. Alan Meckler, CEO of WebMediaBrands Inc., drew the SEC’s attention in December 2010 after a pattern of regularly disclosing company information through social media. The SEC’s Division of Corporation Finance questioned whether Mr. Meckler’s Tweets “conveyed information in compliance with Regulation FD.” SEC letter dated December 9, 2010. Despite the investigation, the SEC brought no enforcement action.

To use social media with minimum SEC risk, the company must educate investors so that they know such communications will always occur at a particular place and at least simultaneously with other outlets. This is done by a regular pattern of social media disclosure and links to other sources, such as SEC filings, showing the way. A company should not force investors to win a shell game, finding the nut of important information in Twitter this time, on Facebook the next time, and Instagram after that. Consistency, predictability, and transparency are key. Used this way, social media present an opportunity to communicate with investors in new ways, not a source of legal problems.

©2013 von Briesen & Roper, s.c.

Can Having Employees Pose for the Camera Pose Problems for You?

The National Law Review recently featured an article regarding employee photos, written by Amy D. Cubbage with McBrayer, McGinnis, Leslie and Kirkland, PLLC:


Employers have a variety of reasons for using employee photos, including:

  • internal company use (for a company directory or in the break room);
  • external use (such as the company website or a blog post—you’ll find my picture below);
  • for safety precautions (name badges or scan cards); and
  • for commercial use in advertisements or marketing.

Employees are usually amenable to having their picture taken. But there may be a few who express genuine reluctance to be photographed. Such employees could simply be camera shy; others may have a more serious reason to refuse to have an image published. Some may need to protect anonymity for personal reasons, such as past domestic abuse. Others may adhere to religions that forbid taking pictures.

There are generally no legal ramifications for using employee photos, unless it is for commercial purposes. Most states, including Kentucky, have laws that require permission before using an individual’s name or “likeness” for commercial purposes. This is due to the commonly held notion that a person has property rights in his or her name and likeness and that those rights should be shielded from exploitation. Kentucky’s law is codified in KRS 391.170.

If you need to use employee photos for a commercial use, there is a simple solution. Have employees sign releases in which they acknowledge that their picture may be used in a company advertisement and they will receive no compensation for the use of their photo. Keep these releases on file.

Even in a state where consent is not required, it is always a smart approach to use a release so that employees will not be surprised when they see their face plastered on a promotional piece. If minors appear in the commercial materials, always use extra caution. Use a consent form, whether required or not, signed by the child’s parents.

A warning about taking photos of potential employees: if you take photographs of applicants applying for a job (to help remember who’s who), it may put you at risk for a discrimination claim. A photograph creates a record of certain protected characteristics (i.e., sex, race, or the presence of a disability) that employers generally cannot use in hiring considerations. If this information is collected and a discrimination claim arises, the burden will be on the employer to prove the photographs were not used to make a discriminatory employment decision.

I will leave you with a little common sense about employee photos. Always remember to publicize when the office picture day will be; no one likes showing up ill-prepared. Offer a “redo day” for those who are truly unhappy about how their picture turned out. If all else fails, resort to photoshopping. A little lighting adjustment or cropping can work wonders for a shutterbug humbug.

© 2013 by McBrayer, McGinnis, Leslie & Kirkland, PLLC

Administration Launches Strategy on Mitigating Theft of U.S. Trade Secrets

The National Law Review recently published an article, Administration Launches Strategy on Mitigating Theft of U.S. Trade Secrets, written by Lauren M. Papenhausen with McDermott Will & Emery:



On February 20, 2013, the White House announced an “Administration Strategy on Mitigating the Theft of U.S. Trade Secrets.”  Companies should view the announcement of this strategy as both a wake-up call from the government and an offer of assistance.  Given the losses that can arise from competitors’ purposeful theft of trade secrets, entities should review this government announcement and decide whether they need to be more active in protecting their trade secrets.

The administration strategy articulates a broad governmental commitment to addressing an “accelerating” threat to U.S. intellectual property.  The strategy encompasses five action items:

  • Focusing diplomatic efforts to protect trade secrets through diplomatic pressure, trade policy and cooperation with international entities
  • Promoting voluntary best practices by private industry to protect trade secrets
  • Enhancing domestic law enforcement, including through outreach and information-sharing with the private sector
  • Improving domestic legislation to combat trade secret theft
  • Improving public awareness and stakeholder outreach

Three main themes emerge from the administration strategy that are important for U.S. businesses.

First, the strategy and its supporting documentation highlight how frighteningly real the prospect of trade secrets theft is.  The White House report is peppered with references to household name companies that have been victimized by trade secrets theft over the past few years, often at a cost of tens of millions of dollars or more.  Mandated reports from the defense industry to the government indicate a 75 percent increase between FY2010 and FY2011 in reports of suspicious activity aimed at acquiring protected information.  Coupled with a recent New York Times article asserting Chinese government involvement in more than 100 attempted cyber attacks on U.S. companies since 2006, these reports warrant sitting up and taking notice.  According to a report by the Office of the National Counterintelligence Executive, particular targets include companies that possess the following:

  • Information and communications technologies
  • Business information that relates to supplies of scarce natural resources or that gives foreign actors an edge in negotiations with U.S. businesses or the U.S. government
  • Military technologies, particularly in connection with marine systems, unmanned aerial vehicles and other aerospace/aeronautic technologies
  • Civilian and dual-use technologies in sectors likely to experience fast growth, such as clean energy, health care and pharmaceuticals, advanced materials and manufacturing techniques, and agricultural technology

Second, the government alone cannot solve the problem.  The administration commits to making the investigation and prosecution of trade secret theft a “top priority” and states that the Federal Bureau of Investigation has increased the number of trade secret theft investigations by 29 percent since 2010.  On its face, however, a 29 percent increase in investigations cannot keep pace with a 75 percent increase in attempted trade secret thefts.  Historically, as a result of limited resources, the government has been able to address only a tiny fraction of trade secret thefts, and there is no indication that there will be the massive influx of resources necessary to change this dynamic materially.  Indeed, the administration strategy recognizes the need for public-private partnerships on this issue and asks companies and industry associations to develop and adopt voluntary best practices to protect themselves against trade secret theft.  And, of course, there are significant drawbacks to any after-the-fact solution, whether relying on government intervention or a private lawsuit.

The best solution is to prevent a trade secret theft from ever occurring.  Even if that is not possible, having taken strong measures to protect trade secrets will aid success both in any civil litigation against the perpetrator and in any criminal action the government may bring.  Entities should consider at least the following types of protective measures:

  • Research and development compartmentalization, i.e., keeping information on a “need to know” basis, particularly where outside contractors are involved in any aspect of the process
  • Information security policies, e.g., requiring multiple passwords or multi-factor authentication measures and providing for data encryption
  • Physical security policies, e.g., using controlled access cards and an alarm system
  • Human resources policies, e.g., using employee non-disclosure agreements, conducting employee training on the protection of trade secrets and performing exit interviews.

It also will be important in any future litigation that a company has clearly designated as confidential any materials it may wish to assert are trade secrets.

Third, the new administration approach to trade secrets offers some opportunities for U.S. companies.

The government interest in enhancing law enforcement operations indicates that businesses may have a better chance of encouraging the government to investigate and bring criminal charges under the Economic Espionage Act (EEA) against the perpetrators of trade secret thefts.  The possibility of seeking government involvement is a powerful tool that should be considered and discussed with counsel any time there is a significant suspected trade secret theft.  Obtaining government involvement in specific instances of trade secret theft can allow businesses to take advantage of information learned via government tactics such as undercover investigations and search warrants.  It also can significantly enhance any civil litigation—for example, a finding of criminal liability can make a civil outcome a foregone conclusion.

The administration strategy’s focus on improving domestic legislation and increasing communication with the private sector suggests that there is an opportunity for the private sector to collaborate with government actors in communicating industry needs and shaping policy.  For example, it is possible that the time is ripe for an amendment to the EEA (currently a federal criminal statute that offers no private right of action) to create a federal, private cause of action for misappropriation of trade secrets.  A bill to this effect was introduced in Congress in 2012 and did not progress, but two other amendments to strengthen the EEA that passed overwhelmingly in December 2012, plus the recently issued administration strategy, suggest there may be gathering momentum for such a change.

In an executive order signed on February 12, 2013, entitled “Improving Critical Infrastructure Cybersecurity,” President Obama outlined government plans to significantly increase the amount of information that the government shares with private sector entities about cyber threats.  Specifically, the order directs government agencies to develop procedures to create and disseminate to targeted entities unclassified reports of cyber threats that identify them as targets, to disseminate classified reports of cyber threats under certain circumstances to “critical infrastructure entities,” and to expand the Enhanced Cybersecurity Services program (previously available only to defense contractors to assist in information-sharing about cyber threats and protection of trade secrets) to “eligible critical infrastructure companies or commercial service providers that offer security services to critical infrastructure.”  The directives in the executive order are in addition to and complement various information-sharing tactics set forth in the administration strategy designed to provide warnings, threat assessments and other information to industry.  Companies, particularly those involved in the power grid or the provision of other utilities or critical systems, should be aware of the possibility of obtaining additional information from the government about threats to protected information.

© 2013 McDermott Will & Emery

Federal Trade Commission (FTC) Recommends Privacy Practices for Mobile Apps

The National Law Review recently published an article, Federal Trade Commission (FTC) Recommends Privacy Practices for Mobile Apps, written by Daniel F. Gottlieb, Randall J. Ortman, and Heather Egan Sussman with McDermott Will & Emery:


On February 1, 2013, the Federal Trade Commission (FTC) released a report entitled “Mobile Privacy Disclosures: Building Trust Through Transparency” (Report), which urges mobile device application (app) platforms and developers to improve the privacy policies for their apps to better inform consumers about their privacy practices.  This report follows other recent publications from the FTC concerning mobile apps—including “Mobile Apps for Kids: Disclosures Still Not Making the Grade,” released December 2012 (December 2012 Report), and “Mobile Apps for Kids: Current Privacy Disclosures are Disappointing,” released February 2012 (February 2012 Report)—and the adoption of the amended Children’s Online Privacy Protection Act (COPPA) Rule on December 19, 2012.  (See “FTC Updates Rule for Children’s Online Privacy Protection” for more information regarding the recent COPPA amendments.)

Among other things, the Report offers recommendations to key stakeholders in the mobile device application marketplace, particularly operating system providers (e.g., Apple and Microsoft), application developers, advertising networks and related trade associations.  Such recommendations reflect the FTC’s enforcement and policy experience with mobile applications and public comment on the matter; however, where the Report goes beyond existing legal requirements, “it is not intended to serve as a template for law enforcement actions or regulations under laws currently enforced by the FTC.”  Nevertheless, such key stakeholders should take the FTC’s recommendations into account when determining how they will collect, use and transfer personal information about consumers and preparing privacy policies to describe their information practices because they reflect the FTC’s expectations under its consumer protection authorities.

At a minimum, operating system providers and application developers should review their existing privacy policies and make revisions, as necessary, to comply with the recommendations included within the Report.  However, all key stakeholders should consider the implications of recommendations specific to their industry segment, as summarized below.

Operating System Providers

The Report characterizes operating system providers as “gatekeepers to the app marketplace,” stating that they have the “greatest ability to effectuate change with respect to improving mobile privacy disclosures.”  Operating system providers, which create and maintain the platform upon which mobile apps run, promulgate rules that app developers must follow in order to access the platform and facilitate interactions between developers and consumers.  Given their prominent role within the app marketplace, it is not surprising that the FTC directs numerous recommendations toward operating system providers, including:

  • Just-In-Time Disclosures.  The Report urges operating system providers to display just-in-time disclosures to consumers and obtain express, opt-in (rather than implied) consent before allowing apps to access sensitive information like geolocation (i.e., the real world physical location of a mobile device), and other information that consumers may find sensitive, such as contacts, photos, calendar entries or recorded audio or video.  Thus, operating system providers and mobile app developers should carefully consider the types of personal information practices that require an opt-in rather than mere use of the app to evidence consent.
  • Privacy Dashboard.  The Report suggests that operating system providers should consider developing a privacy “dashboard” that would centralize privacy settings for various apps to allow consumers to easily review the types of information accessed by the apps they have downloaded.  The “dashboard” model would enable consumers to determine which apps have access to different types of information about the consumer or the consumer’s device and to revisit the choices they initially made about the apps.
  • Icons.  The Report notes that operating system providers currently use status icons for a variety of purposes, such as indicating when an app is accessing geolocation information.  The FTC suggests expansion of this practice to provide an icon that would indicate the transmission of personal information or other information more broadly.
  • Best Practices.  The Report recommends that operating system providers establish best practices for app developers.  For example, operating system providers can compel app developers to make privacy disclosures to consumers by restricting access to their platforms.
  • Review of Apps.  The Report suggests that operating system providers should also make clear disclosures to consumers about the extent to which they review apps developed for their platforms.  Such disclosures may include conditions for making apps available within the platform’s app marketplace and efforts to ensure continued compliance.
  • Do Not Track Mechanism.  The Report directs operating system providers to consider offering a “Do Not Track” (DNT) mechanism, which would provide consumers with the option to prevent tracking by advertising networks or other third parties as they use apps on their mobile devices.  This approach allows consumers to make a single election, rather than case-by-case decisions for each app.

App Developers

Although some practices may be imposed upon app developers by operating system providers, as discussed above, app developers can take several steps to adopt the FTC’s recommendations, including:

  • Privacy Policies.  The FTC encourages all app developers to have a privacy policy, and to include reference to such policy when submitting apps to an operating system provider.
  • Just-In-Time Disclosures.  As with the recommendations for operating system providers, the Report suggests that app developers provide just-in-time disclosures and obtain affirmative express consent before collecting and sharing sensitive information.
  • Coordination with Advertising Networks.  The FTC argues for improved coordination and communication between app developers and advertising networks and other third parties that provide certain functions, such as data analytics, to ensure app developers have an adequate understanding of the software they are incorporating into their apps and can accurately describe such software to consumers.
  • Participation in Trade Associations.  The Report urges app developers to participate in trade associations and other industry organizations, particularly in the development of self-regulatory programs addressing privacy in mobile apps.

Advertising Networks and Other Third Parties

By specifically including advertising networks and other third parties in the Report, the FTC recognizes that cooperation with such networks and parties is necessary to achieve the recommendations outlined for operating system providers and app developers.  The recommendations for advertising networks and other third parties include:

  • Coordination with App Developers.  The Report calls upon advertising networks and other third parties to communicate with app developers to enable such developers to provide accurate disclosures to consumers.
  • DNT Mechanism.  Consistent with its recommendations for operating system providers, the FTC suggests that advertising networks and other third parties work with operating system providers to implement a DNT mechanism.

Trade Associations

The FTC states that trade associations can facilitate standardized privacy disclosures.  The Report makes the following recommendations for trade associations:

  • Icons.  Trade associations can work with operating system providers to develop standardized icons to indicate the transmission of personal information and other data.
  • Badges.  Similar to icons, the Report suggests that trade associations consider developing “badges” or other visual cues used to convey information about a particular app’s data practices.
  • Privacy Policies.  Finally, the FTC suggests that trade associations are uniquely positioned to explore other opportunities to standardize privacy policies across the mobile app industry.

Children and Mobile Apps

Commenting on progress between the February 2012 Report and December 2012 Report, both of which relied on a survey of 400 mobile apps targeted at children, the FTC stated that “little or no progress has been made” in increasing transparency in the mobile app industry with regard to privacy practices specific to children.  The December 2012 Report suggests that very few mobile apps targeted to children include basic information about the app’s privacy practices and interactive features, including the type of data collected, the purpose of the collection and whether third parties have access to such data:

  • Privacy Disclosures.  According to the December 2012 Report, approximately 20 percent of the mobile apps reviewed disclosed any privacy-related information prior to the download process and the same proportion provided access to a privacy disclosure after downloading the app.  Among those mobile apps, the December 2012 Report characterizes their disclosures as lengthy, difficult to read or lacking basic detail, such as the specific types of information collected.
  • Information Collection and Sharing Practices.  The December 2012 Report notes that 59 percent of the mobile apps transmitted some information to the app developer or to a third party.  Unique device identifiers were the most frequently transmitted data point, which the December 2012 Report cites as problematic, suggesting that such identifiers are routinely used to create user “profiles,” which may track consumers across multiple mobile apps.
  • Disclosure Practices Regarding Interactive App Features.  The FTC reports that nearly half of the apps that stated they did not include advertising actually contained advertising, including ads targeted to a mature audience.  Similarly, the December 2012 Report notes that approximately 9 percent of the mobile apps reviewed disclosed that they linked with social media applications; however, this number represented only half of the mobile apps that actually linked to social media applications.  Mobile app developers using a template privacy policy as a starting point for an app’s privacy policy should carefully tailor the template to reflect the developer’s actual privacy practices for the app.

Increased Enforcement

In addition to the reports discussed above and the revisions to the COPPA Rule, effective July 1, 2013, the FTC has also increased enforcement efforts relating to mobile app privacy.  On February 1, 2013, the FTC announced an agreement with Path Inc., operator of the Path social networking mobile app, to settle allegations that it deceived consumers by collecting personal information from their mobile device address books without their knowledge or consent.  Under the terms of the agreement, Path Inc. must establish a comprehensive privacy program, obtain independent privacy assessments every other year for the next 20 years and pay $800,000 in civil penalties specifically relating to alleged violations of the COPPA Rule.  In announcing the agreement, the FTC commented on its commitment to continued scrutiny of privacy practices within the mobile app industry, adding that “no matter what new technologies emerge, the [FTC] will continue to safeguard the privacy of Americans.”

Key Takeaways

App developers and other key stakeholders should consider the following next steps:

  • Review existing privacy policies to confirm they accurately describe current privacy practices for the particular app rather than merely following the developer’s preferred template privacy policy
  • Where practical, update actual privacy practices and privacy policies to be more in line with the FTC’s expectations for transparency and consumer choice, including use of opt-in rather than opt-out consent models
  • Revisit privacy practices in light of heightened FTC enforcement under COPPA and its other consumer protection authorities

© 2013 McDermott Will & Emery

People Still Value Privacy. Get Over It. Online Privacy Alliance.

The National Law Review recently published an article, People Still Value Privacy. Get Over It. Online Privacy Alliance., written by Mark F. Foley of von Briesen & Roper, S.C.:

Sun Microsystems’ CEO Scott McNealy famously quipped to reporters in 1999: “You have zero privacy anyway. Get over it.” Sun on Privacy: ‘Get Over It’, WIRED, Jan. 26, 1999, http://www.wired.com/politics/law/news/1999/01/17538.

At the time, Sun Microsystems was a member of the Online Privacy Alliance, an industry coalition seeking to head off government regulation of online consumer privacy in favor of industry self-regulation. Although McNealy was widely criticized for his views at the time, it is fair to say that much of the technology world agreed then, or agrees now, with his remark.

Have we gotten over it? Do we reside in a world in which individuals assign so little value to personal privacy that companies who collect, process, analyze, sell, and use personal data are free to do whatever they want?

There are indications that if it ever were true that consumers did not value privacy, their interest in privacy is making a comeback. Where commercial enterprises do not align their practices with consumer expectations and interests, a regulator will step in and propose something unnecessarily broad and commercially damaging, or outraged consumers will take matters into their own hands. Recent privacy tornadoes provide the proof.

For some time, employers have accessed public information from social media sites to monitor employee activities or to investigate the personal qualifications of prospective hires. But recently, companies have gone further, demanding that employees and prospects provide user names and passwords that would enable the company to access otherwise limited distribution material. Dave Johnson, a writer for CBS Money Watch, said employer demands for access to an employee’s or prospective hire’s Facebook username and password are “hard to see … as anything other than an absolutely unprecedented invasion of privacy.”  http://www.cbsnews.com/8301-505143_162-57562365/states-protect-employees-social-media-privacy/

The reaction was predictable. In the past year, six states – California, Delaware, Illinois, Maryland, Michigan and New Jersey – have reacted to public outcries by outlawing the practice of employers coercing employees into turning over social media account access information. At least eight more states have similar bills pending, including Massachusetts, Minnesota, Missouri, New York, Ohio, Pennsylvania, South Carolina, and Washington. See National Conference of State Legislatures Legislation Summary as of Jan. 8, 2013 at http://www.ncsl.org/issues-research/telecom/employer-access-to-social-media-passwords.aspx.

Similarly, Congress enacted the Children’s Online Privacy Protection Act (COPPA) in 1998 in response to the failure of self-regulation to limit the scope and nature of information collected from young children. COPPA and implementing regulations limited the collection of information from or about children less than 13 years old. In the past several years, it was widely conceded that this law was not effective in preventing the collection and use of personal information about our children, particularly where photographs and mobile phones were concerned. Companies collecting and using information about children took no action to satisfy parental concerns.

The reaction? In December 2012, the Federal Trade Commission issued amended regulations to make clear that COPPA rules apply to a child-oriented site or service that integrates outside services, such as plug-ins or advertising networks, to collect personal information from visitors. The definition of restricted personal information now includes geolocation as well as photos, videos, and audio files that contain a child’s image or voice, and “persistent identifiers” that recognize users over time and across different websites or services.

Parents and job counselors have been warning for years that teenagers and young adults must not post unflattering images to their Facebook pages because, even if deleted, they will persist somewhere on the internet and may be found by prospective colleges and employers. There were many anecdotes about teenagers committing suicide after nasty postings or the distribution of photos. There did not seem to be a practical solution to the problem.

Last year, the European Commission proposed a sweeping revision to its already difficult data privacy rules to include an explicit “right to be forgotten.” If the proposal is adopted, individuals can demand that websites remove personal photos or other data. Companies that fail or refuse to do so could be fined an amount based on their annual income. The rules, as proposed, would apply both to information the data subject posted about herself and embarrassing information others posted about her, unless the website can prove to a regulator that the information is part of a legitimate journalistic, literary, or artistic exercise. Such a new law would set up a dramatic clash between the European concept of privacy and the American concept of free speech.

For the past three years we’ve heard shocking stories about phone Apps that quietly collect information about our searches, interests, contacts, locations, and more without disclosure or a chance to opt out. The uproar led to only limited action that has not satisfied consumer concerns.

The reaction? U.S. Representative Hank Johnson has proposed The Application Privacy, Protection, and Security (APPS) Act of 2013, which would require App developers to disclose their information-gathering practices and allow users to require that their stored information be deleted.

Increasingly, consumers are not waiting for regulatory action, but are taking privacy protection into their own hands. For example, Instagram built a business on its photosharing App. Shortly after it became popular enough to be purchased by Facebook, Instagram issued new terms of service and privacy policies that appeared to give the company the right to use uploaded images without permission and without compensation. The Washington Post described consumer reaction as a “user revolt. . . on Twitter where shock and outrage mixed with fierce declarations swearing off the popular photo-sharing site for good.” http://articles.washingtonpost.com/2012-12-18/business/35908189_1_kevin-systrom-instagram-consumer-privacy. The Twitter response was so memorable that perhaps, in the future “insta-gram” will come to have a secondary meaning of “a massively parallel instantaneous complaint in cyberspace.”

The blogosphere and Twitterterra were filled with apologies and explanations by Instagram and others stating the company was not a bad actor and truly had no intention of using photos of your naked child to sell diapers without your permission. Even some of the harshest critics admitted, “it’s [not] quite as dramatic as everyone . . . made it seem like on Twitter.” See Theron Humphrey quoted in David Brancaccio’s Marketplace Tech Report for December 19, 2012, http://www.marketplace.org/topics/tech/instagrams-privacy-backlash-and-dirty-secret-data-caps. But the truth about the revised terms and conditions may not matter because consumer goodwill toward Instagram had been destroyed by the perception.

Instagram users are not alone in their disapproval of commercial uses of personal information. Consumer analytics company LoyaltyOne released a July 2012 survey that shows U.S. consumers are increasingly protective of personal information. Of the 1,000 consumers responding, only about 50% said they would be willing to give a trusted company their religious or political affiliation or sexual orientation, only 25% were willing to share commonly commercialized data such as their browsing history, and only 15% were willing to share their smart phone location. See summary of findings at http://www.retailcustomerexperience.com/article/200735/Consumers-still-value-privacy-survey-shows. USA Today reported that an ISACA survey of adults 18 years and older showed that 35% would not share any personal information if offered 50% off a $100 item, 52% would not share any personal information if offered 50% off a $500 item, and 55% would not share any personal information if offered 50% off a $1,000 item. USA TODAY, Bigger Discount, Less Sharing, January 21, 2013.

I’m confident everyone reading this Update has been sufficiently careful and prudent in their own personal and professional lives; but who among us has not had an, ahem, family member who regrets a photo posted to a social media site, an unappreciated email joke, or a comment in a tweet or blog that looks much less “awesomely insightful” after the passage of a few days? (Is there an emoticon meaning “I’m being really facetious”?) Such brief moments of indiscretion can lead to disproportionately bad results.

Have commercial collectors, users, and resellers of such information shown sufficient willingness to respond to consumers’ widespread discomfort with the permanent retention of, and uncontrolled access to, their personal information, candid photos, and musings?

We no longer inhabit a Wild West without limits on the collection and use of personal information for commercial purposes. Be assured that when something perceived to be bad happens, there will be a violent, goodwill-damaging, market-value-destroying, throw-out-the-baby-with-the-bath-water, Instagram-like response that will obliterate some current business models and corporate franchises. Notwithstanding terms and conditions of service that try contractually to deprive users of any right to complain about your use of their data, they will complain and they will vote, with both their Feet and their Tweets.

There are very good social, psychological, religious, and political reasons why privacy should be protected. See Wolfgang Sofsky, PRIVACY: A MANIFESTO (Princeton Univ. Press 2008). As consumers and parents we instinctively know that privacy is important, even if we can’t precisely define it and can’t say exactly why. Even though we’ve sometimes been too foolishly willing to let go of privacy protections in exchange for the convenience of a nifty new website or clever new App, we do, in the end, still care. We know there is something important at issue here. We should not forget this insight when we change hats and become business people deciding what data to collect and how to use it.

Companies that want to avoid receiving an “insta-gram,” and that want to build long-term relationships with consumers, need to accept that sentiment has changed when designing their programs, analytics, and business models. It’s time to throw out McNealy’s aphorism. Businesses need to recognize that today consumers increasingly do value their privacy, and get over it.

©2013 von Briesen & Roper, s.c.

Estate Planning with Digital Assets in Mind

“It’s ‘Bosco’!!”  Seinfeld fans will recall from “The Secret Code” episode that George Costanza created a good deal of chaos by being reluctant to share his secret code.  By the same token, failing to share the secret codes to your digital assets could put a wrench in your best-laid estate plans.  This article will discuss various measures that you can implement to ensure that your digital assets will pass in accordance with your desires.

Whether we like it or not, the world is changing at warp speed.  Paper statements for bank accounts and the like are going the way of the dodo bird.  Those dusty old books that used to gobble up shelf space can now be stored on a device that fits in the palm of your hand.  Same goes for the vinyl records you bought with money from mowing lawns.  And who would have ever thought that you’d be able to share pictures of your children or grandchildren with your friends and family by posting them on Facebook?

As the world becomes more and more digital, so too do the assets which comprise your estate.  Digital assets encompass a wide variety of items.  The website www.digitalestateresourse.com defines digital assets to include the following:

  1. files stored on digital devices, including but not limited to, desktops, laptops, tablets, peripherals, storage devices, mobile telephones, smartphones, and any similar digital device which currently exists or may exist as technology develops; and
  2. e-mails received, e-mail accounts, digital music, digital photographs, digital videos, digital books, software licenses, social network accounts, file sharing accounts, financial accounts, banking accounts, tax preparation service accounts, online stores, affiliate programs, other online accounts, and similar digital items which currently exist or may exist as technology develops, regardless of the ownership of the physical device upon which the digital item is stored.

Failing to properly catalogue your digital assets could have a variety of negative consequences.  By way of example, that rainy day savings account that you never told anyone about could go undetected by the executor of your estate; and those vacation photos which your family would so enjoy could be forever locked in a Shutterfly account.

So what needs to be done to ensure that your digital assets are properly accounted for and that they go to their intended beneficiaries?  Taking the following steps will go a long way towards accomplishing your objectives: (1) keep a master list of your digital assets; (2) keep the master list current; (3) tell someone where you keep the master list; (4) determine whether your digital assets are transferable; and (5) consider making specific provisions for them in your Will.

(1) KEEPING A LIST.  The most important step in properly handling your digital assets is to create a master list of such assets.  I find Excel spreadsheets to be a helpful tool for creating and maintaining such lists.  For each of your digital assets, consider including the following information: (i) a description of the asset (e.g., TD Ameritrade Brokerage Account); (ii) where the asset is located (e.g., www.tdameritrade.com); (iii) any account number or user name associated with the asset; and (iv) any password that is necessary to gain access to the asset.

(2)  CURRENT INFORMATION.  Creating a list of digital assets without keeping the information current is about as useful as having an ashtray on a motorcycle.  It doesn’t do your executor any good to know that the brokerage account you opened in 2004 was with TD Ameritrade.  Rather, he really needs to know that you transferred the assets to Fidelity Investments in 2009 and that is where the assets are currently located.  Ideally you should update the master list every time you change the location of the assets, change a password or make a similar change.  Short of that, you should review your master list at least once every three months and after you have done so, make a notation to that effect on the master list.  Something such as “Current as of 12/1/12” would work nicely.

(3)  LOCATION OF THE LIST.  Creating and maintaining the master list does your heirs no good unless you share its location with someone you trust.  As a best practice, you should tell your executor where the master list is located and you should keep a copy of the master list with your other valuable papers and documents.

(4)  NOT ALL DIGITAL ASSETS ARE TRANSFERABLE.  Unless you are the one person in 10,000 who actually reads the user agreement when you establish an online account, you should revisit each user agreement for your online accounts to determine which of your digital assets are transferable upon your death.  By way of example, not all airlines permit the transfer of frequent flyer miles upon the death of the account holder.  Upon making such a determination, you should update your master list accordingly.

(5)  SPECIFIC BEQUESTS OF DIGITAL ASSETS.  Now that your executor knows your digital assets exist, they should pass in accordance with your overall estate plan.  Without making specific provisions for your digital assets, they will pass pursuant to the residuary clause of your Will.  So, while it is not necessary to make specific bequests of your digital assets, as a practical matter it may be advisable to do so.  For example, I know that my wife would love to have the family photos stored on my laptop, but I can promise you that she has no interest in the Alex Cross novels I’ve purchased for my Kindle Fire or the Johnny Cash albums I’ve purchased for my iPhone.
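The master list described in step (1) can live in any tabular format, not just Excel. As a minimal sketch (the file name, accounts, and entries below are invented examples, and writing real passwords into a plain file is shown only for illustration; the list itself should of course be stored securely, per step (3)), the four suggested columns could be written out as a CSV:

```python
import csv

# The four columns mirror items (i)-(iv) suggested in step (1) above.
FIELDS = ["description", "location", "account_or_username", "password"]

# Invented example entries -- not real accounts.
assets = [
    {"description": "Brokerage account",
     "location": "www.example-broker.com",
     "account_or_username": "jdoe123",
     "password": "(kept in a sealed envelope)"},
    {"description": "Photo sharing account",
     "location": "www.example-photos.com",
     "account_or_username": "jdoe@example.com",
     "password": "(kept in a sealed envelope)"},
]

# Write the master list as a CSV file that any spreadsheet can open.
with open("digital_asset_master_list.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(assets)
```

A review date along the lines of step (2)’s “Current as of 12/1/12” notation could be added as a fifth column or kept in the file name.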

Digital assets are often an overlooked component of even the most complicated estate plans.  However, with proper planning you can make sure that all of your digital assets are properly accounted for and that they pass according to your wishes.  To assess the current health of your estate plan, including a determination of whether your digital assets are properly accounted for, consider scheduling an appointment with your estate planning attorney.

© 2013 by McBrayer, McGinnis, Leslie & Kirkland, PLLC

Trade Secret Misappropriation: When An Insider Takes Your Trade Secrets With Them

Raymond Law Group LLC’s Stephen G. Troiano recently had an article, Trade Secret Misappropriation: When An Insider Takes Your Trade Secrets With Them, featured in The National Law Review:

While companies are often focused on outsider risks such as breach of their systems through a stolen laptop or hacking, often the biggest risk is from insiders themselves. Such problems of access management with existing employees, independent contractors and other persons are as much a threat to proprietary information as threats from outside sources.

In any industry dominated by two main players there will be intense competition for an advantage. Advanced Micro Devices (AMD) and Nvidia dominate the graphics card market. They put out competing models of graphics cards at similar price points. Played by the rules, such competition is beneficial for both the industry and consumers.

AMD has sued four former employees for allegedly taking “sensitive” documents when they left to work for Nvidia. In its complaint, filed in the U.S. District Court for the District of Massachusetts, AMD claims this is “an extraordinary case of trade secret transfer/misappropriation and strategic employee solicitation.” Allegedly, forensically recovered data show that when the AMD employees left in July of 2012 they transferred thousands of files to external hard drives that they then took with them. Advanced Micro Devices, Inc. v. Feldstein et al., No. 4:2013cv40007 (D. Mass. 2013).

On January 14, 2013, the District Court of Massachusetts granted AMD’s ex parte temporary restraining order, finding AMD would suffer immediate and irreparable injury if the Court did not issue the TRO. The TRO required the AMD employees to immediately provide their computers and storage devices for forensic evaluation and to refrain from using or disclosing any AMD confidential information.

The employees did not have a non-compete contract. Instead, the complaint is centered on an allegation of misappropriation of trade secrets. While both AMD and Nvidia are extremely competitive in the consumer discrete GPU market of PC gaming enthusiasts, there are rumors that AMD managed to secure placement of its hardware in both forthcoming next-generation consoles, the Sony PlayStation 4 and Microsoft Xbox 720. AMD’s TRO, and the ultimate goal of the suit, may therefore be to preclude any of its proprietary technology from being used by its former employees to assist Nvidia in the future.

The law does protect companies and individuals such as AMD from having their trade secrets misappropriated. The AMD case has only recently been filed, and it is therefore unclear what the response from the employees will be. What is clear is how fast AMD was able to move to deal with a potential insider threat. Companies need to be aware of who has access to what data and for how long, so that in the event of a breach, whether internal or external, they can move quickly to isolate and identify the breach and take steps, such as litigation, to ensure their proprietary information is protected.

© 2013 by Raymond Law Group LLC

Data Privacy Day 2013 – Passwords

The National Law Review recently featured an article on Passwords written by Cynthia J. Larose with Mintz, Levin, Cohn, Ferris, Glovsky and Popeo, P.C.:

Something everyone can do for Data Privacy Day:  make it a point to change at least one password and make it “long and strong.”

Here are some tips for building strong passwords from David Sherry, Chief Information Security Officer at Brown University:

To create a strong password, you should use a string of text that mixes numbers, lowercase and uppercase letters, and special characters. Best practice says it should be at least eight characters, but the more the better. The characters should be random, not following from words, alphabetical order, or your keyboard layout.

So how do you make such a password?

Spell something backwards. Example: Turn “New York” into “kroywen”

Use “l33t speak”: Substitute numbers for certain letters.  Example: Turn “kroywen” into “kr0yw3n”

Randomly throw in some capital letters.  Example: Turn “kr0yw3n” into “Kr0yW3n”

Don’t forget the special character.  Example: Turn “Kr0yW3n” into “!Kr0y-W3n$”

So, you say you can’t remember “complex” passwords…

One suggestion: create one, very strong, password and “append” it with an identifier:

!Kr0y-W3n$Bro

!Kr0y-W3n$Ama

!Kr0y-W3n$Boa

!Kr0y-W3n$Goo

!Kr0y-W3n$Yah
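Taken together, the steps above amount to a simple recipe. As a rough sketch only (the substitution map, the fixed capitalization positions, and the hyphen placement are assumptions chosen to reproduce the article’s example; a password you actually use should be your own, since these examples are now public), the recipe might look like:

```python
def build_password(base: str, identifier: str = "") -> str:
    """Follow the article's recipe: reverse the base word, l33t-substitute,
    capitalize a couple of letters, wrap in special characters, and
    optionally append a per-site identifier."""
    # Step 1: spell it backwards, dropping spaces ("New York" -> "kroywen")
    s = base.replace(" ", "").lower()[::-1]
    # Step 2: l33t speak -- substitute numbers for letters ("kr0yw3n")
    s = s.translate(str.maketrans({"o": "0", "e": "3"}))
    # Step 3: capitalize some letters; fixed positions here for repeatability
    s = s[0].upper() + s[1:4] + s[4].upper() + s[5:]      # "Kr0yW3n"
    # Step 4: add special characters around and inside ("!Kr0y-W3n$")
    s = "!" + s[:4] + "-" + s[4:] + "$"
    # Step 5: append an identifier such as "Bro" or "Ama" per site
    return s + identifier

print(build_password("New York"))         # !Kr0y-W3n$
print(build_password("New York", "Bro"))  # !Kr0y-W3n$Bro
```

The identifier suffix keeps one strong core memorable while making each site’s password distinct, which is the point of the “append” suggestion above.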

©1994-2013 Mintz, Levin, Cohn, Ferris, Glovsky and Popeo, P.C.

Privacy of Mobile Applications

The National Law Review recently featured an article, Privacy of Mobile Applications, written by Cynthia J. Larose with Mintz, Levin, Cohn, Ferris, Glovsky and Popeo, P.C.:

As we continue our “new year, new look” series into important privacy issues for 2013, we boldly predict:

Regulatory Scrutiny of Data Collection and Use Practices of Mobile Apps Will Increase in 2013

Mobile apps are becoming a ubiquitous part of the everyday technology experience.  But consumer apprehension over data collection and personal privacy with respect to mobile applications has been growing.  And as consumer apprehension grows, so does regulatory scrutiny.  In 2012, the Federal Trade Commission (FTC) offered guidance to mobile app developers to “get privacy right from the start.”  At the end of 2012, the California Attorney General’s office brought its first privacy complaint against Delta Air Lines, Inc., alleging that Delta’s mobile app “Fly Delta” failed to have a conspicuously posted privacy policy in violation of California’s Online Privacy Protection Act.  And also in December, SpongeBob SquarePants found himself in the middle of a complaint filed at the FTC by a privacy advocacy group alleging that the mobile game SpongeBob Diner Dash collected personal information about children without obtaining parental consent.

In 2013, we expect to see new regulatory investigations into the privacy practices of mobile applications.  Delta was just one of 100 recipients of notices of non-compliance from the California AG’s office, and the first to be the subject of a complaint.  Expect to see more of these filed early this year as the AG’s office plows through responses from the lucky notice recipients.  Also, we can expect to hear more from the FTC on mobile app disclosure of data collection and use practices, and perhaps some enforcement actions against the most blatant offenders.

Recommendation for action in 2013:  Take a good look at your mobile app and its privacy policy.  If you have simply ported your website privacy policy over to your mobile app, take another look.  How is the policy displayed to the end user?  How does the user “accept” its terms?  Is this consistent with existing law, such as California’s, and does it follow the FTC guidelines?

©1994-2013 Mintz, Levin, Cohn, Ferris, Glovsky and Popeo, P.C.

Privacy Policies Now a Must for Mobile Apps

The National Law Review recently published an article, Privacy Policies Now a Must for Mobile Apps, written by Tanya L. Curtis, Leonard A. Ferber, and Doron S. Goldstein of Katten Muchin Rosenman LLP:

California has long been a leader in privacy legislation. That position was strengthened recently when the California Attorney General filed a first-of-its-kind lawsuit against a company for its failure to include a privacy policy with a smartphone application. The lawsuit, filed on December 6 against Delta Air Lines, alleges that the airline violated California law requiring online services to “conspicuously post its privacy policy” by failing to include such a policy with its “Fly Delta” mobile application. This action by the state of California has broad implications for anyone developing or distributing mobile apps.

Background

In 2004, California enacted the California Online Privacy Protection Act (CalOPPA), requiring commercial operators of websites and online services to conspicuously post detailed privacy policies to enable consumers to understand what personal information is collected by a website and the categories of third parties with which operators share that information. CalOPPA provides that “an operator shall be in violation of this [posting requirement] only if the operator fails to post its policy within 30 days after being notified of noncompliance,” and if the violation is made either (a) knowingly and willingly or (b) negligently and materially. In the case of an online service, “conspicuously posting” a privacy policy requires that the policy be “reasonably accessible…for consumers of the online service.”

While CalOPPA does not define an “online service” or specifically mention “mobile” or “smartphone” applications, the California Attorney General considers any service available over the internet or that connects to the internet, including mobile apps, to be an “online service.” In light of this interpretation, in 2011 the Attorney General’s office contacted the six leading operators of mobile application platforms in an attempt to improve mobile app compliance with CalOPPA. In February 2012, the Attorney General reached an agreement with these companies on a set of principles designed to ensure that mobile apps include a conspicuously posted privacy policy where applicable law so requires (such as in California), and that the policy appear in a consistent location on the app download screen.

Delta markets its Fly Delta mobile app through various online “app stores.” Among other things, the Fly Delta app allows customers to check in to flights, rebook cancelled flights and pay for checked baggage. Delta has a website that includes a privacy policy, but that policy did not mention the Fly Delta app or the types of information collected by the app.

The Case

In October, the California Attorney General’s office sent letters to a number of mobile application makers, including Delta, that did not have a privacy policy reasonably accessible to app users, giving them 30 days to respond or make their privacy policies accessible in their apps. Delta either forgot about or ignored the letter, and the Attorney General filed suit.

The complaint stated that the Fly Delta application did not have a privacy policy within the application itself or in the app stores from which the application could be downloaded. The complaint also noted that, while Delta’s website has a privacy policy, the policy does not mention the Fly Delta app or the personal information collected by the app, and is not reasonably accessible to consumers who download the app. Since Delta failed to respond to the October letter, the Attorney General charged the airline with violating California law by knowingly and willfully, or negligently and materially, failing to comply with CalOPPA. And, in a separate charge under a provision of CalOPPA not requiring 30 days’ notice of noncompliance, the Attorney General alleged that Delta failed to comply with the privacy policy posted on its own website, in that the Fly Delta app does not comply with that policy. The complaint asks for damages of $2,500 for each violation, presumably for each download.

What You Need to Know

While California is currently unique in applying its privacy law to mobile applications, many states look to California, as a leader in this area, for guidance. CalOPPA applies to any “operator of a commercial website or online service that collects personally identifiable information through the Internet about individual consumers residing in California who use or visit its commercial website or online service…” In light of California’s large population, the practical effect of CalOPPA is that an overwhelming number of online businesses (including mobile app developers) must comply with it.

It is now clear that virtually all mobile or smartphone app makers, as well as companies that use smartphone apps as part of their “mobile strategy,” must make privacy policies accessible to app users. Such accessibility can be achieved either by including the privacy policy within the app itself or by creating an icon or text link to a readable version of the privacy policy, which may be part of a company’s or developer’s overall web privacy policy. The actions of the California Attorney General also make it clear that there is a cost to noncompliance.

©2012 Katten Muchin Rosenman LLP