Don’t Gamble with the GDPR

The European Union’s (EU) General Data Protection Regulation (GDPR) goes into effect on May 25, 2018, and so do the significant fines against businesses that are not in compliance. Failure to comply carries penalties of up to 4 percent of global annual revenue per violation or €20 million – whichever is higher.

This regulatory rollout is notable for U.S.-based hospitality businesses because the GDPR is not limited to companies located in the EU. Rather, the GDPR applies to any organization, no matter where it operates, if it offers goods or services to, or monitors the behavior of, EU individuals. It also applies to organizations that process or hold the personal data of EU individuals regardless of the company’s location. In other words, if a hotel markets its goods or services to EU individuals, beyond merely having a website, the GDPR applies.

The personal data at issue includes an individual’s name, address, date of birth, identification number, billing information, and any information that can be used alone or with other data to identify a person.

The risks are particularly high for the U.S. hospitality industry, including casino-resorts, because their businesses trigger GDPR-compliance obligations on numerous fronts. Hotels collect personal data from their guests to reserve rooms, coordinate event tickets, and offer loyalty/reward programs and other targeted incentives. Hotels with onsite casinos also collect and use financial information to set up gaming accounts, to track player win/loss activity, and to comply with federal anti-money laundering “know your customer” regulations.

Privacy Law Lags in the U.S.

Before getting into the details of GDPR, it is important to understand that the concept of privacy in the United States is vastly different from the concept of privacy in the rest of the world. For example, while the United States does not even have a federal law standardizing data breach notification across the country, the EU has had a significant privacy directive, the Data Protection Directive, since 1995. The GDPR is replacing the Directive in an attempt to standardize and improve data protection across the EU member states.

Where’s the Data?

Probably the most difficult part of the GDPR is understanding what data a company has, where it got it, how it is getting it, where it is stored, and with whom it is sharing that data. Depending on the size and geographical sprawl of the company, the data identification and audit process can be quite mind-boggling.

A proper data mapping process will take a micro-approach in determining what information the company has, where the information is located, who has access to the information, how the information is used, and how the information is transferred to any third parties. Once a company fully understands what information it has, why it has it, and what it is doing with it, it can start preparing for the GDPR.
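
For organizations that track this inventory in a structured way, the sketch below shows what a single data-mapping record might capture. It is only an illustration of the audit questions described above; the DataInventoryEntry structure and its field names are hypothetical, not taken from the GDPR itself.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class DataInventoryEntry:
        """One row of a hypothetical data-mapping inventory (illustrative only)."""
        data_category: str                 # e.g., "guest email address"
        source: str                        # how/where the data was collected
        storage_location: str              # system or vendor holding the data
        purpose: str                       # why the company holds the data
        who_has_access: List[str] = field(default_factory=list)
        shared_with: List[str] = field(default_factory=list)  # third-party recipients

    # Example entry for a hotel loyalty program
    entry = DataInventoryEntry(
        data_category="guest email address",
        source="loyalty/rewards registration form",
        storage_location="CRM database",
        purpose="loyalty program communications",
        who_has_access=["marketing team"],
        shared_with=["email service provider"],
    )
    print(entry)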

What Does the Compliance Requirement Look Like in Application?

One of the key issues for GDPR compliance is data subject consent. The concept is easy enough to understand: if a company collects a person’s personal information, it must fully inform the individual why it is collecting the information and what it may do with that information and, unless another lawful basis exists, obtain express consent from the individual to collect and use that information.

To obtain express consent under the GDPR, a company will have to review and revise (and possibly implement) its internal policies, privacy notices, and vendor contracts to do the following:

  • Inform individuals what data you are collecting and why;

  • Inform individuals how you may use their data;

  • Inform individuals how you may share their data and, in turn, what the entities with which you share it may do with it; and

  • Provide the individual a clear and concise mechanism to give express consent to the collection, each use, and any transfer of their information.

At a functional level, this means modifying internal data-collection processes to allow for express consent. In other words, rather than language such as, “by continuing to stay at this hotel, you consent to the terms of our Privacy Policy,” or “by continuing to use this website, you consent to the terms of our Privacy Policy,” individuals must be given the opportunity not to consent to the collection of their information, e.g., an unchecked click-box that the individual must affirmatively select rather than an automatically checked box.
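
As a rough illustration of that functional change, the sketch below shows server-side logic that treats anything other than an affirmative, unchecked-by-default opt-in as a refusal. The function and field names are invented for this example, and the snippet is a sketch of the idea rather than a compliance tool.

    from datetime import datetime, timezone

    def record_consent(form_data: dict, purpose: str) -> dict:
        """Record consent only if the guest affirmatively opted in (illustrative sketch).

        The consent box must default to unchecked, so a missing or False value
        is treated as no consent for the stated purpose.
        """
        if form_data.get(f"consent_{purpose}") is not True:
            raise ValueError(f"No express consent given for purpose: {purpose}")
        return {
            "purpose": purpose,
            "granted_at": datetime.now(timezone.utc).isoformat(),
            "mechanism": "opt-in box affirmatively checked by the guest",
        }

    # A pre-checked or missing box never yields consent:
    # record_consent({}, "marketing_emails") would raise ValueError.
    print(record_consent({"consent_marketing_emails": True}, "marketing_emails"))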

The more difficult part regarding consent is that there is no grandfather clause for personal information collected pre-GDPR. This means that companies with personal data subject to the GDPR will no longer be allowed to have or use that information unless the personal information was obtained in line with the consent requirements of the GDPR or the company obtains proper consent for use of the data prior to the GDPR’s effective date of May 25, 2018.

What Other “Lawful Bases” Exist to Collect Data Besides Consent?

Although consent will provide hotels the broadest green light to collect, process, and use personal data, other lawful bases may allow a hotel to collect data. These include when collection is necessary to perform a contract, to comply with legal obligations (such as AML compliance), or to serve the hotel’s legitimate interests where those interests are not overridden by the interests of the individual. This means that during the internal audit of a hotel’s personal information collection methods (e.g., online forms, guest check-in forms, loyalty/rewards program registration forms, etc.), each question asked of guests should be reviewed to ensure the information requested either is not personal information or is supported by a lawful reason for asking. For example, a guest’s arrival and departure dates are relevant for scheduling purposes; a guest’s birthday, however, other than to ensure the person is of legal age to consent, is more difficult to justify.

What Other Data Subject Rights Must Be Communicated?

Another significant requirement is that guests be informed of the various other rights they have under the GDPR and how they can exercise them, including:

  • The right of access to their personal information;

  • The right to rectify their personal information;

  • The right to erase their personal information (the right to be forgotten);

  • The right to restrict processing of their personal information;

  • The right to object;

  • The right of portability, i.e., to have their data transferred to another entity; and

  • The right not to be included in automated marketing initiatives or profiling.

Not only should these data subject rights be spelled out clearly in all guest-facing privacy notices and consent forms, but those notices/forms should include instructions and contact information informing the individuals how to exercise their rights.

What Is Required with Vendor Contracts?

Third parties are given access to certain data for various reasons, including to process credit card payments, implement loyalty/rewards programs, etc. For a hotel to allow a third party to access personal data, it must enter into a GDPR-compliant Data Processing Agreement (DPA) or revise an existing agreement so that it is GDPR compliant. This is because downstream processors of information protected by the GDPR must also comply with the GDPR. These processor requirements, combined with the controller requirements, i.e., those of the hotel that controls the data, require that the controller and processor enter into a written agreement that expressly provides:

  • The subject matter and duration of processing;

  • The nature and purpose of the processing;

  • The type of personal data and categories of data subject;

  • The obligations and rights of the controller;

  • The processor will only act on the written instructions of the controller;

  • The processor will ensure that people processing the data are subject to duty of confidence;

  • The processor will take appropriate measures to ensure the security of processing;

  • The processor will only engage sub-processors with the prior consent of the controller under a written contract;

  • The processor will assist the controller in providing subject access and allowing data subjects to exercise their rights under the GDPR;

  • The processor will assist the controller in meeting its GDPR obligations in relation to the security of processing, the notification of personal data breaches, and data protection impact assessments;

  • The processor will delete or return all personal data to the controller as required at the end of the contract; and

  • The processor will submit to audits and inspections, provide the controller with whatever information it needs to ensure that both parties are meeting their Article 28 obligations, and tell the controller immediately if it is asked to do something that infringes the GDPR or other data protection law of the EU or a member state.

Other GDPR Concerns and Key Features

Consent and data portability are not the only things that hotels and gambling companies need to think about once the GDPR becomes a reality. They also need to think about the following issues:

  • Demonstrating compliance. All companies will need to be able to prove they are complying with the GDPR. This means keeping records of issues such as consent.

  • Data protection officer. Most companies that deal with large-scale data processing will need to appoint a data protection officer.

  • Breach reporting. Breaches of data must be reported to authorities within 72 hours and to affected individuals “without undue delay.” This means that hotels will need to have policies and procedures in place to comply with this requirement and, where applicable, ensure that any processors are contractually required to cooperate with the breach-notification process.
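
One practical way to operationalize the 72-hour rule in the last bullet is to compute and track the regulator-notification deadline the moment a breach is discovered. The snippet below is a minimal sketch of that calculation; the function name and the choice of when the clock starts are simplifying assumptions, not legal advice.

    from datetime import datetime, timedelta, timezone

    REGULATOR_WINDOW = timedelta(hours=72)  # GDPR reporting window to authorities

    def regulator_deadline(breach_discovered_at: datetime) -> datetime:
        """Latest time by which the supervisory authority must be notified."""
        return breach_discovered_at + REGULATOR_WINDOW

    discovered = datetime(2018, 5, 30, 9, 0, tzinfo=timezone.utc)
    deadline = regulator_deadline(discovered)
    print(f"Notify the supervisory authority by {deadline.isoformat()}")
    print(f"Overdue? {datetime.now(timezone.utc) > deadline}")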

© Copyright 2018 Dickinson Wright PLLC
This post was written by Sara H. Jodka of Dickinson Wright PLLC.

GDPR May 25th Deadline Approaching – Businesses Globally Will Feel Impact

In less than four months, the General Data Protection Regulation (the “GDPR” or the “Regulation”) will take effect in the European Union/European Economic Area, giving individuals in the EU/EEA greater control over their personal data and imposing a sweeping set of privacy and data protection rules on data controllers and data processors alike. Failure to comply with the Regulation’s requirements could result in substantial fines of up to the greater of €20 million or 4% of a company’s annual worldwide gross revenues. Although many American companies that do not have a physical presence in the EU/EEA may have been ignoring GDPR compliance based on the mistaken belief that the Regulation’s burdens and obligations do not apply outside of the EU/EEA, they are doing so at their own peril.

A common misconception is that the Regulation only applies to EU/EEA-based corporations or multinational corporations with operations within the EU/EEA. However, the GDPR’s broad reach applies to any company that is offering goods or services to individuals located within the EU/EEA or monitoring the behavior of individuals in the EU/EEA, even if the company is located outside of the European territory. All companies within the GDPR’s ambit also must ensure that their data processors (i.e., vendors and other partners) process all personal data on the companies’ behalf in accordance with the Regulation, and are fully liable for any damage caused by their vendors’ non-compliant processing. Unsurprisingly, companies are using indemnity and insurance clauses in data processing agreements with their vendors to contractually shift any damages caused by non-compliant processing activities back onto the non-compliant processors, even if those vendors are not located in the EU/EEA. As a result, many American organizations that do not have direct operations in the EU/EEA nevertheless will need to comply with the GDPR because they are receiving, storing, using, or otherwise processing personal data on behalf of customers or business partners that are subject to the Regulation and its penalties. Indeed, all companies with a direct or indirect connection to the EU/EEA – including business relationships with entities that are covered by the Regulation – should be assessing the potential implications of the GDPR for their businesses.

Compliance with the Regulation is a substantial undertaking that, for most organizations, necessitates a wide range of changes, including:

  • Implementing “Privacy by Default” and “Privacy by Design”;
  • Maintaining appropriate data security;
  • Notifying European data protection agencies and consumers of data breaches on an expedited basis;
  • Taking responsibility for the security and processing activities of third-party vendors;
  • Conducting “Data Protection Impact Assessments” on new processing activities;
  • Instituting safeguards for cross-border transfers; and
  • Recordkeeping sufficient to demonstrate compliance on demand.

Failure to comply with the Regulation’s requirements carries significant risk. Most prominently, the GDPR empowers regulators to impose fines for non-compliance of up to the greater of €20 million or 4% of worldwide annual gross revenue. In addition to fines, regulators also may block non-compliant companies from accessing the EU/EEA marketplace through a variety of legal and technological methods. Even setting these potential penalties aside, simply being investigated for a potential GDPR violation will be costly, burdensome and disruptive, since during a pending investigation regulators have the authority to demand records demonstrating a company’s compliance, impose temporary data processing bans, and suspend cross-border data flows.
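
To make the “greater of” fine cap concrete, the toy calculation below compares the €20 million floor with the 4% revenue-based cap; the revenue figures are invented purely for illustration.

    def max_gdpr_fine(annual_worldwide_revenue_eur: float) -> float:
        """Upper bound on a GDPR fine: the greater of EUR 20 million or 4% of revenue."""
        return max(20_000_000.0, 0.04 * annual_worldwide_revenue_eur)

    # A company with EUR 1 billion in revenue faces a cap of EUR 40 million,
    # while one with EUR 100 million in revenue is still exposed to the EUR 20 million floor.
    print(max_gdpr_fine(1_000_000_000))  # 40000000.0
    print(max_gdpr_fine(100_000_000))    # 20000000.0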

The impending May 25, 2018 deadline means that there are only a few months left for companies to get their compliance programs in place before regulators begin enforcement. In light of the substantial regulatory penalties and serious contractual implications of non-compliance, any company that could be required to meet the Regulation’s obligations should be assessing its current operations and implementing the necessary controls to ensure that it is processing personal data in a GDPR-compliant manner.

 

© 2018 Neal, Gerber & Eisenberg LLP.

Elder Abuse: Are Granny Cams a Solution, a Compliance Burden, or Both?

In Minnesota, 97% of the 25,226 allegations of elder abuse (neglect, physical abuse, unexplained serious injuries and thefts) in state-licensed senior facilities in 2016 were never investigated. This prompted Minnesota Governor Mark Dayton to announce plans last week to form a task force to find out why. As one might expect, Minnesota is not alone. A study published in 2011 found that an estimated 260,000 (1 in 13) older adults in New York had been victims of one form of abuse or another during a 12-month period between 2008 and 2009, with “a dramatic gap” between elder abuse events reported and the number of cases referred to formal elder abuse services. Clearly, states are struggling to protect a vulnerable and growing group of residents from abuse. Technologies such as hidden cameras may help to address the problem, but their use raises privacy, security, compliance, and other concerns.

With governmental agencies apparently lacking the resources to identify, investigate, and respond to mounting cases of elder abuse in the long-term care services industry, and the number of persons in need of long-term care services on the rise, this problem is likely to get worse before it gets better. According to a 2016 CDC report concerning users of long-term care services, more than 9 million people in the United States receive regulated long-term care services. These numbers are only expected to increase. The Family Caregiver Alliance reports that

by 2050, the number of individuals using paid long-term care services in any setting (e.g., at home, residential care such as assisted living, or skilled nursing facilities) will likely double from the 13 million using services in 2000, to 27 million people.

However, technologies such as hidden cameras are making it easier for families and others to step in and help protect their loved ones. In fact, some states are implementing measures to leverage these technologies to help address the problem of elder abuse. For example, New Jersey’s Attorney General recently expanded the “Safe Care Cam” program which lends cameras and memory cards to Garden State residents who suspect their loved ones may be victims of abuse by an in-home caregiver.

Commonly known as “granny cams,” these easy-to-hide devices, which can record video and sometimes audio, are being strategically placed in nursing homes, long-term care, and residential care facilities. For example, the “Charge Cam” is designed to look like, and actually function as, a plug used to charge smartphone devices. Once plugged in, it is able to record eight hours of video and sound. For a nursing home resident’s family concerned about the treatment of the resident, use of a “Charge Cam” or similar device could be a very helpful way of getting answers to their suspicions of abuse. However, for the unsuspecting nursing home or other residential or long-term care facility, as well as for the well-meaning family members, the use of these devices can pose a number of issues and potential risks. Here are just some questions that should be considered:

  • Is there a state law that specifically addresses “granny cams”? Note that at least five states (Illinois, New Mexico, Oklahoma, Texas, and Washington) have laws specifically addressing the use of cameras in this context. In Illinois, for example, the resident and the resident’s roommate must consent to the camera, and notice must be posted outside the resident’s room to alert those entering the room about the recording.
  • Is consent required from all of the parties to conversations that are recorded by the device?
  • Do the HIPAA privacy and security regulations apply to video and audio recordings that contain individually identifiable health information of the resident or of other residents whose information is captured in the recording?
  • How do the features of the device, such as camera placement and zoom capabilities, affect the analysis of the issues raised above?
  • How can the validity of a recording be confirmed?
  • What effects will there be on employee recruiting and employee retention?
  • If the organization permits the device to be installed, what rights and obligations does it have with respect to the scope, content, security, preservation, and other aspects of the recording?

Just as body cameras for police are viewed by some as a way to help address concerns over police brutality allegations, some believe granny cams can serve as a deterrent to abuse of residents at long-term care and similar facilities. However, families and facilities have to consider these technologies carefully.

This post was written by Joseph J. Lazzarotti  of Jackson Lewis P.C. © 2017

Employees Sue for Fingerprint Use

Employees of Peacock Foods, an Illinois-based food product manufacturer, recently filed a lawsuit against their employer for alleged violations of Illinois’ Biometric Information Privacy Act. Under BIPA, companies that collect biometric information must, inter alia, have a written retention policy (that they follow). As part of the policy, the law states that they must delete biometric information after they no longer need it, or three years after the last transaction with the individual. Companies also need consent to collect the information under the Illinois law, cannot sell the information, and, if the information is shared, must obtain consent for such sharing.
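
The retention rule described above lends itself to a simple automated check. The sketch below flags biometric records that are due for deletion; it assumes, as BIPA provides, that deletion is required when the collection purpose is satisfied or three years after the individual’s last interaction, whichever comes first, and the function and field names are hypothetical.

    from datetime import datetime, timedelta
    from typing import Optional

    RETENTION_LIMIT = timedelta(days=3 * 365)  # three years after the last transaction

    def must_delete(purpose_satisfied: bool, last_transaction: datetime,
                    now: Optional[datetime] = None) -> bool:
        """Illustrative check: delete once the data is no longer needed,
        or three years after the individual's last transaction, whichever comes first."""
        now = now or datetime.utcnow()
        return purpose_satisfied or (now - last_transaction) > RETENTION_LIMIT

    # A former employee whose last punch was in mid-2013 is past the three-year mark:
    print(must_delete(False, datetime(2013, 6, 1), now=datetime(2017, 6, 1)))  # True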

According to the plaintiff-employees, Peacock Foods used their fingerprints for a time tracking system without explaining in writing how the saved fingerprints would be used, or if they would be shared with third parties. According to the employees, this violated BIPA’s requirement to explain, in the consent request, how the information would be used and how long it would be kept. The employees also alleged that Peacock Foods did not have a retention schedule and program in place for deleting biometric data. The employees are currently seeking class certification.

Putting it Into Practice: This case is a reminder that plaintiffs’ class action lawyers are looking at BIPA and possible complaints that can be brought under the law. To address the Illinois law – and similar ones in Texas and Washington – companies should look at the notice and consent process they have in place.   

This post was written by Liisa M. Thomas & Mukund H. Sharma of Sheppard Mullin Richter & Hampton LLP., Copyright © 2017


Recording Conversations with Your Cellphone: with Great Power Comes Potential Legal Liability

In the cellphone age, nearly everyone walks around with a multi-tasking recording device in their pocket or purse, and it comes in handy for many of our modern problems: Your dog suddenly started doing something adorable? Open your video app and start rolling. Need to share that epic burger you just ordered with your foodie friends? There’s an app for that. Want to remember the great plot twist you just thought of for that novel you’ve been working on? Record a voice memo.

Sometimes, though, the need arises to record more serious matters. Many people involved in lawsuits choose to record conversations with their phones, all in the name of preserving evidence that might be relevant in court. People involved in contentious divorce or child custody cases, for example, might try to record a hostile confrontation that occurred during a pickup for visitation. Conversely, others might be worried that an ex-spouse has secretly recorded a conversation and plans to use it against them out of context.

But while everyone has the power to record just about anything with a few swipes on their phone, do they have the legal right to do so? If not, what are the possible consequences? Can you even use recorded conversations in court? Consider these important questions before you press record.

Criminal Liability: Can you go to jail just for recording someone’s conversation?

The short answer: Yes. Under Michigan’s Eavesdropping law,[1] it is a felony punishable by up to two years’ imprisonment and a $2,000 fine to willfully use any device to eavesdrop on (meaning to overhear, record, amplify, or transmit) a conversation without the consent of all participants in that conversation.[2] It is also a felony for a person to “use or divulge” any information that they know was obtained through illegal eavesdropping.[3]

But there is one important distinction that Michigan courts have recognized: if you are a participant in the conversation, then you do not need permission of other participants to record the conversation (at least not when it comes to the eavesdropping law; there may be other laws that apply, as discussed below).[4] This makes sense given the purposes of the law. The theory is that if you are a participant in the conversation, then other participants at least have a chance to judge your character and determine if you are the kind of person who might relay the conversation to others (either verbally or by making a recording).

The bottom line is that if you use a device, like your cellphone, to record, overhear, amplify, or transmit a conversation that you are not a part of without the permission of all participants, you could face criminal consequences.

Civil Liability: If someone records your private conversation, can you file a lawsuit against them?

The short answer: Yes. The eavesdropping statute allows eavesdropping victims to bring a civil lawsuit against the perpetrator.[5] But the same distinction applies; you cannot sue someone for recording a conversation that they participated in.

Before filing a civil eavesdropping claim, though, consider what, if anything, there is to gain. The eavesdropping statute permits a judge to issue an injunction prohibiting the perpetrator from further eavesdropping. This may be a valuable remedy if there is a risk that the eavesdropper would otherwise continue eavesdropping on your conversations. The statute also allows a plaintiff to recover actual damages and punitive damages from the wrongdoer. In many cases, actual damages will likely be minimal, and punitive damages are subject to the whims of the judge or jury deciding the case. As a result, the cost of litigation may exceed any monetary recovery unless actual damages are significant or the eavesdropper’s conduct was egregious enough to elicit a large punitive award from a jury.

Evidence and Admissibility: Can I use a recorded conversation in court?

Many people are familiar with the exclusionary rule that arises from the Fourth Amendment of the United States Constitution, which provides that if police officers obtain evidence as a result of an illegal search or seizure, then the prosecution is prohibited from using that evidence to support their case. This raises the question:

If a regular civilian obtains evidence by recording a conversation in violation of the eavesdropping statute, is that evidence automatically excluded from court proceedings?

The short answer: No. The exclusionary rule is specifically designed to curb the potentially oppressive power of the government in order to guarantee the protections of the Fourth Amendment, at the expense of excluding potentially valuable evidence from court proceedings. Since the Fourth Amendment only restricts government conduct, the exclusionary rule only applies to evidence obtained as a result of unconstitutional government action. As a result, even if a private citizen breaks the law and records your conversation, that recording is not automatically excluded from court.[6]

So does this mean you can use any recorded conversation in court whenever you want?

The short answer: No. Anything presented in court still needs to comply with the Rules of Evidence, and in many cases recorded conversations will not make the cut. A big reason is the hearsay rule, which says that out of court statements cannot be used to prove the truth of the matter asserted.[7] In other words, you can’t use a recording of your neighbor saying “I use my neighbor’s Wi-Fi” as evidence to prove that he was, in fact, using your Wi-Fi.

But there are many exceptions to the hearsay rule which might allow a recorded conversation into court. Salient among these exceptions is the rule that admissions of a party-opponent are not hearsay.[8] Consequently, if a man records his ex-wife’s conversation with her current husband, the hearsay rule will not prevent the man from using the recording of his ex-wife against her in a child custody case; the ex-wife is a “party-opponent” and her out-of-court statements are not considered hearsay.

Continuing this same example, note that the man’s actions would violate the eavesdropping statute (assuming he didn’t have permission to make the recording) because he was not a participant in the hypothetical conversation. But this violation would not keep the recording out of court. Nevertheless, if a prosecutor wanted to press charges, the man could be subject to criminal liability. And if the ex-wife was so inclined, she could file a civil lawsuit against the man and ask for an injunction and monetary damages.

Other Law: Is the eavesdropping statute the only law you need to worry about before recording all of your conversations?

The short answer: No, don’t hit record just yet. Even if you comply with the eavesdropping statute, there are still other potential pitfalls to be aware of. For instance, wiretapping laws govern the recording and interception of telephone calls and electronic communications, and carry criminal penalties. For interstate phone calls, the laws of other states may come into play as well. And depending on the means you use to obtain a recording and what you do with the recording once you have it, you risk incurring civil liability for a variety of privacy torts, such as intrusion upon seclusion or public disclosure of private facts.

The safest route is to always get permission from everyone involved before recording a conversation or sharing a recorded conversation with anyone. If that’s not an option, consult with a lawyer who has had an opportunity to consider all of the facts involved in your case.

________________________________

[1] MCL 750.539 et seq.
[2] MCL 750.539a; MCL 750.539c.
[3] MCL 750.539e.
[4] See Sullivan v. Gray, 117 Mich. App. 476, 324 N.W.2d 58, 59 – 61 (1982).
[5] MCL 750.539h.
[6] See, e.g., Swan v. Bob Maxey Lincoln Mercury, No. 216564, 2001 WL 682371, at *2 n.3 (Mich. Ct. App. Apr. 24, 2001).
[7] MRE 802.
[8] MRE 801(d)(2).

This post was written by Jeffrey D. Koelzer of  Varnum LLP © 2017

Employees Celebrate Chip Party: Embedding RFID Chips – Would You Agree to This?

On 1 August 2017, employees of a Wisconsin-based technology company enjoyed a “Chip Party” – but not the salty kind. Twenty-one of Three Square Market’s 85 employees agreed to allow their employer to embed radio frequency identification (RFID) chips in their bodies. We are familiar with the Internet of Things; is this the Internet of People?

Three Square Market (known as 32M) highlighted the convenience of microchipping their employees, reporting that they will be able to use the RFID chip to make purchases in the company break room, open doors, access copy machines and log in to their computers.

While the “chipped” employees reported that they felt only a brief sting when the chips were inserted, chipping employees draws deeper cuts through ethical and privacy issues.

One such issue is the potential for the technology to gradually encroach with further applications not contemplated by its original purpose. RFID technology has the potential to be used for surveillance and location-tracking purposes, similar to GPS technology. It also has potential to be used as a password or authentication tool, to store health information, access public transport or even as a passport.

While these potential applications will offer convenience to employers and consumers, the value of the information generated by each transaction is arguably greater for the marketers, data brokers and law enforcement entities that use it for their own purposes. Once data like this exists it can be accessed in all manner of circumstances.  Can you ever provide sufficient advice and counselling to employees to create informed consent free from the power imbalance of the employment relationship?

We are all keen on tech here at K&L Gates, but no one was putting their hand up for a similar program here; we’ll all just use our pass cards to open the door, thanks. We were left brainstorming films that use implants to see where this technology could take us, as it is all too common in sci-fi films. Have a look at The Final Cut (2004) (warning: 37% Rotten Tomatoes rating), where implants took centre stage by storing people’s experiences. We are not there yet, but we have taken the first wobbly step on the path.

Read more about 32M’s use of RFID chips here.

See here to find out more about tracking employees with other technologies.


Olivia Coburn and Cameron Abbott of K&L Gates contributed this article.

Privacy Hat Trick: Three New State Laws to Juggle

Nevada, Oregon and New Jersey recently passed laws focusing on the collection of consumer information, serving as a reminder for advertisers, retailers, publishers and data collectors to keep up-to-date, accurate and compliant privacy and information collection policies.

Nevada: A Website Privacy Notice is Required

Nevada joined California and Delaware in explicitly requiring websites and online services to post an accessible privacy notice. The Nevada law, effective October 1, 2017, requires disclosure of the following:

  • The categories of “covered information” collected about consumers who visit the website or online service;

  • The categories of third parties with whom the operator may share such information;

  • A description of the process, if any, through which consumers may review and request changes to their information;

  • A description of the process by which operators will notify consumers of material changes to the notice;

  • Whether a third party may collect covered information about the consumer’s online activities over time and across different Internet websites or online services; and

  • The effective date of the notice.

“Covered Information” is defined to include a consumer’s name, address, email address, telephone number, social security number, an identifier that allows a specific person to be contacted physically or online, and any other information concerning a person maintained by the operator in combination with an identifier.

Takeaway: Website and online service operators (including Ad Techs and other data collectors) should review their privacy policies to ensure they are disclosing all collection of information that identifies, can be used to contact, or that is combined with information that identifies consumers. Website operators should also be sure that they are aware of, and are properly disclosing, any information that is shared with or collected by their third-party service providers and how that information is used.

Oregon: Misrepresentation of Privacy Practices = Unlawful Trade Practice.

Oregon expanded its definition of an “unlawful trade practice”, effective January 1, 2018, to expressly include using, disclosing, collecting, maintaining, deleting or disposing of information in a manner materially inconsistent with any statement or representation published on a business’s website or in a consumer agreement related to a consumer transaction. The new Oregon law is broader than other similar state laws, which limit their application to “personal information”. Oregon’s law, which does not define “information”, could apply to misrepresentations about any information collection practices, even if not related to consumer personal information.

Takeaway: Businesses should be mindful when drafting privacy policies, terms of use, sweepstakes and contest rules and other consumer-facing policies and statements not to misrepresent their practices with respect to any information collected, not just personal information.

New Jersey: ID Cards Can Only be Scanned for Limited Purposes (not Advertising)

New Jersey’s new Personal Information and Privacy Protection Act, effective October 1, 2017, limits the purposes for which a retail establishment may scan a person’s identification card to the following:

  • To verify the authenticity of the card or the identity of the person paying for goods or services with a method other than cash, returning an item or requesting a refund or exchange;

  • To verify the person’s age when providing age-restricted goods or services to the person;

  • To prevent fraud or other criminal activity using a fraud prevention service company or system if the person returns an item or requests a refund or exchange;

  • To prevent fraud or other criminal activity related to a credit transaction to open or manage a credit account;

  • To establish or maintain a contractual relationship;

  • To record, retain, or transmit information required by State or federal law;

  • To transmit information to a consumer reporting agency, financial institution, or debt collector to be used as permitted by the Fair Credit Reporting Act and the Fair Debt Collection Practices Act; or

  • To record, retain, or transmit information governed by the medical privacy and security rules of the Health Insurance Portability and Accountability Act.

The law also prohibits the retention of information scanned from an identification card for verification purposes and specifically prohibits the sharing of information scanned from an identification card with a third party for marketing, advertising or promotional activities, or any other purpose not specified above. The law does make an exception to permit a retailer’s automated return fraud system to share ID information with a third party for purposes of issuing a reward coupon to a loyal customer.

Takeaway: Retail establishments with locations in New Jersey should review their point-of-sale practices to ensure they are not scanning ID cards for marketing, advertising, promotional or any other purposes not permitted by the New Jersey law.


This post was written by Julie Erin Rubash of Sheppard Mullin Richter & Hampton LLP.

Weapons in the Cyber Defense Arsenal

In May 2017, the world experienced an unprecedented global cyberattack that targeted the public and private sectors, including an auto factory in France, dozens of hospitals and health care facilities in the United Kingdom, gas stations in China and banks in Russia. This is just the tip of the iceberg and more attacks are certain to follow. As this experience shows, companies of all sizes, across all industries, in every country are vulnerable to cyberattacks that can have devastating consequences for their businesses and operations.

The Malware Families

Exploiting vulnerabilities in Microsoft® software, hackers launched a widespread ransomware attack targeting hundreds of thousands of companies worldwide. The vector, “WannaCry” malware, encrypts electronic files and locks them until released by the hacker after a ransom is paid in untraceable Bitcoin. The malware also has the ability to spread to all other computer systems on a network. On the heels of WannaCry, a new attack called “Adylkuzz” is crippling computers by diverting their processing power.

The most prevalent types of ransomware found in 2016 were Cerber and Locky. Microsoft detected Cerber, used in spam campaigns, in more than 600,000 computers and observed that it was one of the most profitable of 2016. Spread via malicious spam emails that have an executable virus file, Cerber has gained increasing popularity due to its Ransomware-as-a-Service (RaaS) business model, which enables less sophisticated hackers to lease the malware.

Check Point Software indicated that Locky was the second most prevalent piece of malware worldwide in November 2016. Microsoft detected Locky in more than 500,000 computers in 2016. First discovered in February 2016, Locky is typically delivered via an email attachment (including Microsoft Office documents and compressed attachments) in phishing campaigns designed to entice unsuspecting individuals to click on the attachment. Of course, as the most recent global attacks demonstrate, hackers are devising and deploying new variants of ransomware with different capabilities all the time.

The Rise of Ransomware Attacks

The rise in ransomware attacks is directly related to the ease with which it is deployed and the quick return for the attackers. The U.S. Department of Justice has reported that there was an average of more than 4,000 ransomware attacks daily in 2016, a 300 percent increase over the prior year. Some experts believe that ransomware may be one of the most profitable cybercrime tactics in history, earning approximately $1 billion in 2016. Worse yet, even with the ransom paid, some data already may have been compromised or may never be recovered.

The risk is even greater if your ransom-encrypted data contains protected health information (PHI). In July 2016, the U.S. Department of Health and Human Services, Office for Civil Rights (HHS/OCR) advised that the encryption or permanent loss of PHI would trigger HIPAA’s Breach Notification Rule for the affected population, unless a low probability that the PHI had been compromised could be demonstrated. This means a mandated investigation to confirm the likelihood that the PHI was not accessed or otherwise compromised.

Ransomware Statistics

According to security products and solutions provider Symantec Corporation, ransomware was the most dangerous cybercrime threat facing consumers and businesses in 2016:

  • The majority of 2016 ransomware infections happened in consumer computers, at 69 percent, with enterprises at 31 percent.

  • The average ransom demanded in 2016 rose to $1,077, up from $294 in 2015.

  • There was a 36 percent increase in ransomware infections from 340,665 in 2015 to 463,841 in 2016.

  • The number of ransomware “families” found totaled 101 in 2016, triple the 30 found in 2015.

  • The biggest event of 2016 was the beginning of RaaS, or the development of malware packages that can be sold to attackers in return for a percentage of the profits.

  • Since January 1, 2016, more than 4,000 ransomware attacks have occurred daily – a 300 percent increase over the 1,000 daily attacks seen in 2015.

  • In the second half of 2016, the percentage of recognized ransomware attacks from all malware attacks globally doubled from 5.5 percent to 10.5 percent.

The Best Defense Is a Good Offense

While no perfectly secure computer system exists, companies can take precautionary measures to increase their preparedness and reduce their exposure to potentially crippling cyberattacks. While Microsoft no longer supports Windows XP operating systems, which were hit the hardest by WannaCry, Microsoft has made an emergency patch available to protect against WannaCry. However, those still using Windows XP should upgrade all devices to a more current operating system that is still fully supported by Microsoft to ensure protection against emerging threats. Currently, that means upgrading to Windows 7, Windows 8 or Windows 10.

Even current, supported software needs to be updated when prompted by the computer. Those who delay installing updates may find themselves at risk. Microsoft issued a patch for supported operating systems in March 2017 to protect against the vulnerability that WannaCry exploited. Needless to say, many companies did not bother to patch their systems in a timely manner.

Ransomware creates even greater business disruption when a company does not have secure backups of files that are critical to key business functions and operations. It also is important for companies to back up files frequently, because a stale backup that is several months old or older may not be particularly useful. Companies also should make certain that their antivirus and anti-malware software is current to protect against emerging threats.

In addition, companies need to train their employees on detecting and mitigating potential cyber threats. Employees are frequently a company’s first line of defense against many forms of routine cyberattacks that originate from seemingly innocuous emails, attachments and links from unknown sources. Indeed, many cyberattacks can be avoided if employees are simply trained not to click on suspicious links or attachments that could surreptitiously install malware.

Last but not least, companies should consider purchasing cyber liability insurance coverage, which is readily available. While cyber policies are still evolving and there are no standardized policy forms, coverage can be purchased at varying price points with different levels of coverage. Some of the more comprehensive forms of coverage provide additional “bells and whistles” such as immediate access to preapproved professionals that can guide companies through the legal and technical web of cybersecurity events and incident response.

Other cyber policies afford bundled coverages that may include:

  • The costs of a forensics investigation to identify the source and scope of an incident

  • Notification to affected individuals

  • Remediation in the form of credit monitoring and identity theft restoration services

  • Costs to restore lost, stolen or corrupted data and computer equipment

  • Defense of third-party claims and regulatory investigations arising out of a cyberattack.

 

This post was written by Anjali C. Das, Kevin M. Scott and John Busch of Wilson Elser Moskowitz Edelman & Dicker LLP.

Sharing Cyber Threat Information

The Information Sharing and Analysis Organization-Standards Organization (ISAO-SO) was set up under the aegis of the Department of Homeland Security pursuant to a Presidential Executive Order intended to foster threat vector sharing among private entities and with the government. ISAOs are proliferating in many critical infrastructure fields, including health care, where cybersecurity and data privacy are particularly sensitive issues given HIPAA requirements and disproportionate industry human and systems vulnerabilities. Therefore, in advising their companies’ management, general counsel and others might benefit from reviewing the FAQs and answers contained in the draft document that can be accessed at the link below.

Announcing the April 20 – May 5, 2017 comment period, the Standards Organization has noted the following:

Broadening participation in voluntary information sharing is an important goal, the success of which will fuel the creation of an increasing number of Information Sharing and Analysis Organizations (ISAOs) across a wide range of corporate, institutional and governmental sectors. While information sharing had been occurring for many years, the Cybersecurity Act of 2015 (Pub. L. No. 114-113) (CISA) was intended to encourage participation by even more entities by adding certain express liability protections that apply in several certain circumstances. As such proliferation continues, it likely will be organizational general counsel who will be called upon to recommend to their superiors whether to participate in such an effort.

With the growth of the ISAO movement, it is possible that joint private-public information exchange as contemplated under CISA will result in expanded liability protection and government policy that favors cooperation over an enforcement mentality.

To aid in that decision making, we have set forth a compilation of frequently asked questions and related guidance that might shed light on evaluating the potential risks and rewards of information sharing and the development of policies and procedures to succeed in it. We do not pretend that the listing of either is exhaustive, and nothing contained therein should be considered to contain legal advice. That is the ultimate prerogative of the in-house and outside counsel of each organization. And while this memorandum is targeted at general counsels, we hope that it also might be useful to others who contribute to decisions about cyber-threat information sharing and participation in ISAOs.

The draft FAQs can be accessed at: https://www.isao.org/drafts/isao-sp-8000-frequently-asked-questions-for-isao-general-counsels-v0-01/

©2017 Epstein Becker & Green, P.C. All rights reserved.

Broadband Internet Service Providers In Regulatory Limbo After Repeal of FCC Privacy and Data Security Rules

Potentially signaling the end of the short-lived stint by the Federal Communications Commission (“FCC”) to regulate consumer data privacy on the internet, the Trump Administration recently repealed Obama-era data privacy and security rules for broadband providers. The action, passed by Congress and signed by President Trump pursuant to the Congressional Review Act, completely rescinds the rules that would have gone into effect later this year. While the move has been welcomed by industry insiders, it leaves broadband providers in regulatory limbo as the Trump Administration seeks to determine which agency and what rules will oversee data protection in this sector going forward.

The FCC’s Privacy Order and Its Repeal

In November 2016, the FCC released comprehensive consumer privacy and data security rules (the “2016 Privacy Order”) for broadband internet access service (“BIAS”) providers.1  BIAS providers offer consumers high-speed, continuous access to the internet, typically through cable, telephone, wireless, or fiber-optic connections.  They are different from entities such as Amazon and Facebook, which do not provide connections to the internet but rather offer internet services such as cloud storage, messaging, news, video streaming, and online shopping and are regulated, with respect to data privacy matters, by the Federal Trade Commission (“FTC”).

The 2016 Privacy Order would have, among other things, required BIAS providers to obtain affirmative customer consent (“opt-in” consent) prior to using and sharing, for commercial purposes, confidential customer data, such as a user’s web browsing history, application usage history, or geo-location information, and prohibited them from refusing to serve customers who did not provide such consent.  It also required BIAS providers to adopt “reasonable measures” to protect customer data from unauthorized disclosure, and required them to give notice to customers affected by any data breach “without unreasonable delay” but not later than 30 days after determining that a breach had occurred.

Repeal of the 2016 Privacy Order comes as a welcome development for industry groups, which vigorously opposed the rules both before and after their finalization. In January 2017, the FCC received multiple petitions to reconsider and stay the order.2 The BIAS industry complained that some of the new rules – particularly the opt-in rule for the use of sensitive customer information – put BIAS providers at a competitive disadvantage because the rules were more restrictive than the FTC rules that applied to other internet entities such as Amazon and Facebook and, further, would have required costly updates to BIAS providers’ systems. In response, the FCC – now with a Chairman appointed by President Trump and a majority of Republican-appointed commissioners – reversed course and, on March 1, 2017, voted to stay some of the provisions of the 2016 Privacy Order that had been due to come into effect.3 Shortly thereafter, Congress and President Trump used their authority under the Congressional Review Act to completely rescind the 2016 Privacy Order.4

Is Net Neutrality Next?

To answer the question of where the Trump Administration might go from here first requires an explanation of how the FCC came to be responsible for regulating data privacy and security for BIAS providers in the first place.

Until 2015, BIAS providers, like other internet service and content providers, were not considered to be “common carriers” by the FCC and, thus, were not subject to data privacy regulation by the FCC.  Instead, for matters concerning data privacy and protection, BIAS providers looked to the FTC.  That changed in 2015, when the FCC issued the “Open Internet Order,”5 which reclassified BIAS providers as “telecommunications services” and, therefore, subjected them to common carrier regulation by the FCC under Title II of the Communications Act of 1934 (“Title II”).  Among other things, Title II requires “telecommunications services” to furnish services to customers “upon reasonable request” and prohibits “unjust and unreasonable discrimination” in the services that common carriers provide.  Title II further provides that “telecommunications services” have a duty to protect the privacy of customer data.6

This reclassification was necessary for the FCC to promote and establish, as the centerpiece of the Open Internet Order, “net neutrality” rules for BIAS Providers.  “Net neutrality” rules require BIAS providers to allow users equal access to all otherwise lawful internet websites, content, and services, without favoring or restricting access, whether the websites are owned or controlled by the service providers’ affiliates, business partners, or competitors.  For example, absent net neutrality rules, a BIAS provider might, in exchange for a fee or other consideration, agree with a video sharing website, such as YouTube, to provide its customers with faster and better access to YouTube than to a rival video sharing website, such as Vimeo.

Previous attempts by the FCC to impose net neutrality rules on BIAS providers had been rejected by the Court of Appeals for the D.C. Circuit.  Most recently, in 2014, the D.C. Circuit held that the FCC did not have the authority to impose net neutrality rules on BIAS providers because they were not subject to the common carrier rules under Title II.7  In response, the FCC reclassified BIAS providers as common carriers in its Open Internet Order.  The 2016 Privacy Order was an attempt by the FCC to further define the data privacy and protection rules that applied to BIAS providers under Title II.

The Trump Administration now seeks to return the BIAS industry to privacy oversight by the FTC, as both the current FCC and FTC Chairpersons have indicated that “jurisdiction over broadband providers’ privacy and data security practices should be returned to the FTC, the nation’s expert agency with respect to these important subjects.”8  However, this is easier said than done, as it would require that the FCC revoke the Open Internet Order and its accompanying net neutrality rules.  Such a move would be favored by the BIAS industry and the new Chairman of the FCC, Ajit Pai, who regards the net neutrality rules as a “mistake,”9 but would be met by criticism from many major internet content providers and services, such as Amazon, Google, and Facebook.10

In the meantime, the FTC is without authority to regulate BIAS providers regarding data privacy, as the FTC Act contains an express exemption of FTC jurisdiction for common carriers.11  Further complicating matters is an August 2016 decision of the Court of Appeals for the Ninth Circuit, which interpreted the FTC’s common carrier exemption as including all activities of any entity designated as a common carrier, even those activities that are unrelated to the entity’s common carrier business and which otherwise might be subject to FTC jurisdiction if they were carried out by a separate entity.12  If the Ninth Circuit position were to stand and be adopted by other Circuits – the FTC is currently seeking a rehearing en banc – the FCC suddenly might find itself responsible for regulating a host of non-common carrier related business activities merely because they are provided by entities that have been designated as common carriers under Title II.

Many large BIAS providers have faced this uncertainty by pledging to take “reasonable measures to protect customer information” and notify “consumers of data breaches as appropriate” in accordance with the existing FTC data privacy framework (i.e., ensuring that their data security practices are not “unfair or deceptive” in contravention of Section 5 of the FTC Act).[13]

BIAS providers are also presently subject to a host of state laws concerning data privacy and protection, including at least 48 state data breach notification laws, the most recent of which was enacted in New Mexico.14  These laws typically require businesses to notify the state authorities, affected customers, and major credit reporting agencies when the state’s residents’ confidential personal information, such as social security or driver’s license numbers, credit card numbers, and passwords, has been exposed through a data breach.  In addition, some states, such as Massachusetts15 and California,16 also require businesses to implement and maintain reasonable security procedures and practices to protect customer information.  Finally, some states maintain consumer protection laws, which, similar to the FTC Act, generally protect against unfair or deceptive trade practices and have been used by state attorneys general to penalize companies that fail to protect customer data.17

Conclusion

The Trump Administration’s repeal of the 2016 Privacy Order has provided a respite for the BIAS industry from vigorous new requirements that would have gone into effect this year.  However, it also has created a period of regulatory uncertainty as regulators determine the way forward, including the fate of the Open Internet Order.  In the meantime, BIAS providers should, as they have promised, continue to follow reasonable data privacy and protection practices, consistent at least with those required by the FTC, and also carefully consider whether any other applicable federal or state data privacy laws apply to their business.

© Copyright 2017 Cadwalader, Wickersham & Taft LLP


1 Protecting the Privacy of Customers of Broadband and Other Telecommunications Services, Report and Order, 31 FCC Rcd 13911 (2016), available at https://apps.fcc.gov/edocs_public/attachmatch/FCC-16-148A1.pdf.

2 See, e.g., Joint Petition for Stay, available at https://ecfsapi.fcc.gov/file/101270254521574/012717%20Petition%20for%20Stay.pdf (“Stay Petition”).

3 See Order Granting Stay Petition, available at https://apps.fcc.gov/edocs_public/attachmatch/FCC-17-19A1.pdf.

4 See S.J. Res. 34 – 115th Congress, available at https://www.congress.gov/bill/115th-congress/senate-joint-resolution/34/text.

5 See Protecting and Promoting the Open Internet, Report and Order on Remand, Declaratory Ruling, and Order, 30 FCC Rcd 5601 (2015), available at https://apps.fcc.gov/edocs_public/attachmatch/FCC-15-24A1.pdf.

6 See 47 U.S.C. § 222(a) (“Every telecommunications carrier has a duty to protect the confidentiality of proprietary information of, and relating to . . . customers.”).

7 See Verizon v. F.C.C., 740 F.3d 623 (D.C. Cir. 2014).

8 See Joint Statement of Acting FTC Chairman Maureen K. Ohlhausen and FCC Chairman Ajit Pai on Protecting Americans’ Online Privacy, available at https://www.ftc.gov/news-events/press-releases/2017/03/joint-statement-acting-ftc-chairman-maureen-k-ohlhausen-fcc.

9 See Remarks of Federal Communications Commission Chairman Ajit Pai at the Mobile World Congress (February 28, 2017), available at https://apps.fcc.gov/edocs_public/attachmatch/DOC-343646A1.pdf.

10 See Google, Facebook and Amazon write to FCC demanding true net neutrality, The Guardian (May 7, 2014), available at https://www.theguardian.com/technology/2014/may/08/google-facebook-and-amazon-sign-letter-criticising-fcc-net-neutrality-plan.

11 See 15 U.S.C. § 45(a)(2).

12 See F.T.C. v. AT&T Mobility LLC, 835 F.3d 993 (9th Cir. 2016).  The FTC has sought rehearing en banc.

13 See Stay Petition, ISP Privacy Principles.

14 See New Mexico H.B. 15, Data Breach Notification Act (2017).

15 See Mass Gen. Laws Ann. ch. 93H, § 2.

16 See Cal. Civ. Code § 1798.81.5(b).

17 See, e.g., Press Release, A.G. Schneiderman Announces $100K Settlement with E-Retailer after Data Breach Exposes Over 25K Credit Card Numbers, N.Y. State Attorney General’s Office (Aug. 5, 2016), available at https://ag.ny.gov/press-release/ag-schneiderman-announces-100k-settlement-e-retailer-after-data-breach-exposes-over