Private Email Woes Infect The Private Sector in Delaware

Vice Chancellor J. Travis Laster’s ruling in Amalgamated Bank v. Yahoo!, Inc., C.A. No. 10774-VCL (Del. Ch. Feb. 2, 2016) should sound a tocsin to directors that their “private” emails may not be so private.  The ruling addressed Amalgamated Bank’s demand to inspect the books and records of Yahoo! pursuant to Section 220 of the Delaware General Corporation Law.  The bank sought to inspect, among other things, documents that reflect discussions or decisions of Yahoo’s full Board or a committee.  Documents covered by the demand included emails to and from the directors, from management or the compensation consultant, emails among the directors themselves, and documents and communications prepared by Yahoo officers and employees about the Board’s deliberations.

Vice Chancellor Laster found that emails are records subject to inspection under Section 220 and that, through Delaware’s jurisdiction over a corporation, a court can compel production of documents in the possession of officers, directors, and managing agents of the firm.  According to the Vice Chancellor, the court can impose sanctions or other consequences on the firm if the officer, director, or managing agent fails to comply. He further noted that if a personal email account was used to conduct corporate business, those emails are subject to production under Section 220. Directors and corporate officers should therefore take heed that emails concerning corporate business may be subject to disclosure even if sent from a private email address.

© 2010-2016 Allen Matkins Leck Gamble Mallory & Natsis LLP

 

HHS Issues Final Rule on HIPAA and Firearm Background Check Reporting

On January 6, as part of President Obama’s executive action to combat gun violence, HHS promulgated a final regulation modifying the HIPAA Privacy Rule to allow certain HIPAA covered entities to disclose limited information to the National Instant Criminal Background Check System (NICS).

Background:  The NICS, maintained by the Federal Bureau of Investigation (FBI), is the national database used to conduct background checks on persons who may be disqualified from receiving firearms based on federal or state law.  Federal law identifies several categories of potential disqualifiers, known as “prohibitors,” including a federal mental health prohibitor.  By statute, the federal mental health prohibitor applies to individuals who have been committed to a mental institution or adjudicated as a mental defective.  The Department of Justice has promulgated regulations that define these categories to include the following individuals:

  • individuals committed to a mental institution for reasons such as mental illness or drug use;

  • individuals found incompetent to stand trial or not guilty by reason of insanity; or

  • individuals who have been otherwise determined by a court, board, commission, or other lawful authority to be a danger to themselves or others or to lack the mental capacity to contract or manage their own affairs as a result of marked subnormal intelligence or mental illness, incompetency, condition, or disease.

However, there is currently no federal law that requires state agencies to report data to the NICS, including the identity of individuals who are subject to the mental health prohibitor.  HHS believes that HIPAA poses a potential barrier to such reporting. Under current law, HIPAA only permits covered entities (e.g., state mental health agencies) to disclose such information to the NICS in limited circumstances: when the entity is a “hybrid” entity under HIPAA (and the Privacy Rule does not apply to these functions) or when state law otherwise requires disclosure, and thus disclosure is permitted under HIPAA’s “required by law” category.

Final Rule:  HHS finalized its proposed rule without any substantive changes. Under the final rule, a new section 164.512(k)(7) of the HIPAA Privacy Rule expressly permits certain covered entities to disclose information relevant to the federal mental health prohibitor to the NICS.

The permitted disclosure applies only to those covered entities that function as repositories of information relevant to the federal mental health prohibitor on behalf of a State or are responsible for ordering the involuntary commitments or the adjudications that would make someone subject to the prohibitor.  Thus, most treating providers may not disclose protected health information about their own patients to the NICS, unless otherwise permitted by the HIPAA Privacy Rule.  HHS also clarifies that individuals who seek voluntary treatment are not subject to the prohibitor.

The rule limits disclosure only to the NICS or an entity designated by the State to report data to the NICS.  And only that information that is “needed for purposes of reporting to the NICS” may be disclosed, though HHS gives States the flexibility to determine which data elements are “needed” to create a NICS record (consistent with requirements of the FBI, which maintains the NICS).  At present, the required data elements for the NICS are: name; date of birth; sex; and codes identifying the relevant prohibitor, the submitting state agency, and the supporting record.  The NICS also allows disclosure of certain optional data elements (e.g., social security number and identifying characteristics).  HHS notes that applicable covered entities may disclose such optional data elements “to the extent necessary to exclude false matches.”
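As a rough illustration only, the minimum record a state repository might assemble for a NICS report could be sketched as follows; the class and field names here are hypothetical and do not reflect an actual NICS schema:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class MentalHealthProhibitorReport:
    """Hypothetical sketch of the data elements described above; not an official NICS format."""
    # Required data elements
    name: str
    date_of_birth: date
    sex: str
    prohibitor_code: str          # code identifying the relevant federal prohibitor
    submitting_agency_code: str   # code identifying the submitting state agency
    supporting_record_code: str   # code identifying the supporting record
    # Optional elements, disclosed only to the extent necessary to exclude false matches
    social_security_number: Optional[str] = None
    identifying_characteristics: Optional[str] = None
```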

HHS declined many commenters’ suggestion to expand the rule to permit the disclosure of information about individuals who are subject to state-only mental health prohibitors. HHS fears that expanding the scope of the permitted disclosure would disrupt the careful balance between public safety and encouraging patients to seek mental health care.

Finally, in the preamble, HHS defended its statutory authority to make this change, despite the fact that Congress did not address HIPAA in recent legislation to strengthen the NICS.  HHS explained that the “HIPAA statute confers broad authority on the Department to specify the permitted uses and disclosures of PHI by HIPAA covered entities.”

© 2015 Covington & Burling LLP

That Drone in Your Holiday Stocking Must Now Be Registered With FAA

Fearing for public safety over the explosion of hobby-type drones taking to the air, on December 14, 2015, the Federal Aviation Administration (FAA) issued the first rule[1] of its kind directed squarely at owners and operators of hobby and recreational small unmanned aircraft systems (sUAS, aka drones); the rule requires their registration on the FAA’s new online drone registration portal.[2] A second, more comprehensive drone rule is expected to issue in 2016 to allow the use of drones for certain commercial purposes where the risk to public safety is low.

In the meantime, the hobby drone rule, which goes into effect on December 21, 2015, ahead of the holiday rush, requires all hobby-type drones weighing between 0.55 pounds (about two sticks of butter) and 55 pounds that have flown prior to December 21, 2015 to be registered no later than February 19, 2016.[3] For hobby drones in this weight class flying for the first time on or after December 21, 2015 (i.e., holiday stocking stuffers), the rule requires that the owner register the drone before the first flight.

The FAA structured the grace period to encourage registration of millions of pre-existing drones while also requiring that all new drones be registered before first flight. To further encourage registration, the $5.00 registration fee will be credited back to registrants who register within the first 30 days. And to lessen the burden of registration, the rule provides that a single registration applies to as many drones as an owner/operator owns or operates. If these incentives are insufficient to prompt compliance, the rule provides for civil penalties of up to $27,500, and criminal penalties including fines of up to $250,000 and/or imprisonment for up to three years.

U.S. citizens age 13 or older can register their drones at the FAA’s registration portal. The registrant will need to provide the FAA with their name, physical address, mailing address (if different) and email address. Upon completion of the registration process, the FAA will provide the registrant (or certificate holder, as the FAA calls them) with a unique registration number that must be affixed to each drone.[4] In addition, each registration must be renewed every three years and will require an additional $5.00 renewal fee.
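For readers who like to see the mechanics spelled out, here is a small, purely hypothetical sketch of the information a registrant supplies and receives, including the three-year renewal window; the class and helper names are illustrative only and do not reflect any actual FAA interface:

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

@dataclass
class DroneRegistration:
    """Illustrative record of a hobby-drone registration; hypothetical structure."""
    registrant_name: str
    physical_address: str
    email: str
    mailing_address: Optional[str] = None      # only needed if different from the physical address
    registration_number: Optional[str] = None  # unique number issued by the FAA
    issued_on: Optional[date] = None

    def renewal_due(self) -> Optional[date]:
        """Registrations must be renewed every three years (approximated here as 3 x 365 days)."""
        if self.issued_on is None:
            return None
        return self.issued_on + timedelta(days=3 * 365)

# One registration covers every drone the owner operates; the issued number
# must be affixed to each aircraft.
reg = DroneRegistration("A. Hobbyist", "123 Main St, Anytown, USA", "pilot@example.com",
                        registration_number="FA-EXAMPLE-123", issued_on=date(2015, 12, 21))
print(reg.renewal_due())  # approximately three years after issuance
```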

After the rule goes into effect on December 21, 2015, all operators of hobby drones falling within the weight limits of the rule must provide proof of registration in the form of a Certificate of Aircraft Registration, either in printed or electronic form—much the same way as an angler or a hunter currently provides proof of a state fishing license or a state hunting license to a game warden.[5]

A number of exceptions to the rule are worth mentioning. First, hobby drones that weigh less than 0.55 pounds (i.e., radio-controlled micro quadcopters that fit in the palm of your hand) do not require registration.[6] The rule will not apply to drones flown solely indoors. The rule also does not apply to those who want to fly drones for commercial purposes (i.e., for pay and/or hire) or on behalf of a state or federal government agency (i.e., fire departments, police departments, etc.). Such operators must first obtain an exemption from the FAA’s rules applicable to, for example, passenger-carrying aircraft, which the FAA has interpreted as also applying to commercial operators of unmanned drones regardless of size and weight. In addition, entities other than individual hobbyists, such as corporations or anyone wanting to record a lease or security interest, cannot use the online registration portal.

The new rule is the first concrete step that the FAA has taken in response to Congress’ mandate to the FAA under Section 333 of the FAA Modernization and Reform Act of 2012 to integrate unmanned aircraft into the National Airspace System (NAS). The next big step will be the FAA’s issuance in 2016 of rules that provide a legal path for using drones for commercial purposes without having to obtain prior FAA approval. Those rules will likely have a sweeping impact on business owners of all types who want to buy and sell aerial imaging and/or aerial data transmission services, including large-scale land developers, shopping mall operators and owners of other large structures, real estate agents, wedding photographers, and live TV/radio broadcast providers, to name a few, as well as industries that support those businesses, including software and hardware component suppliers, venture capital and financing providers, accountants, and the like.

In the interim, the new rule will force a dramatic change in the way consumers think about small radio-controlled unmanned aircraft in the future—they’re not just toys anymore.


[1] See FAA Interim Final Rule (IFR), available at: https://www.faa.gov/news/updates/media/20151213_IFR.pdf
[2] The FAA’s drone registration portal is available at: https://www.faa.gov/uas/registration/
[3] Model radio-controlled aircraft of all types, including the type of fixed-wing, radio-controlled model aircraft that have flown in parks and fields for decades, fall under the new rule.
[4] The unique registration number may be affixed via permanent marker, label, engraving or other means as long as the number is readily accessible and readable upon close visual inspection.
[5] Although it is unlikely that the FAA will have the manpower to enforce the new rule against hobbyists who do not register their drones, the FAA nevertheless intends to employ a strategic approach to encourage compliance, ranging from outreach and education programs to administrative and/or legal action should the facts of a case so warrant.
[6] According to the FAA, most toy drones costing $100 or less will likely weigh less than 0.55 pounds (250 grams).

Happy Holidays: VTech Data Breach Affects Over 11 Million Parents and Children Worldwide

The recent data breach of Hong Kong-based electronic toy manufacturer VTech Holdings Limited (“VTech” or the “Company”) is making headlines around the world for good reason: it exposed sensitive personal information of over 11 million parent and child users of VTech’s Learning Lodge app store, Kid Connect network, and PlanetVTech in 16 countries! VTech’s Learning Lodge website allows customers to download apps, games, e-books and other educational content to their VTech products, the Kid Connect network allows parents using a smartphone app to chat with their children using a VTech tablet, and PlanetVTech is an online gaming site. As of December 3rd, VTech has suspended all its Learning Lodge sites, the Kid Connect network and thirteen other websites pending investigation.

VTech announced the cyberattack on November 27th by press release and has since issued follow-on press releases on November 30th and December 3rd, noting that “the Learning Lodge, Kid Connect and PlanetVTech databases have been attacked by a skilled hacker” and that the Company is “deeply shocked by this orchestrated and sophisticated attack.” According to the various press releases, upon learning of the cyber attack, VTech “conducted a comprehensive check of the affected site” and has “taken thorough actions against future attacks.” The Company has reported that it is currently working with FireEye’s Mandiant Incident Response services and with law enforcement worldwide to investigate the attack. According to VTech’s latest update on the incident:

  • 4,854,209 parent Learning Lodge accounts containing the following information were affected: name, email address, secret question and answer for password retrieval, IP address, mailing address, download history and encrypted passwords;

  • 6,368,509 children’s profiles containing the following information were affected: name, gender, and birthdate. 1.2 million of the affected profiles have the Kid Connect app enabled, meaning that the hackers could also have access to profile photos and undelivered Kid Connect chat messages;

  • The compromised databases also included encrypted Learning Lodge content (bulletin board postings, e-books, apps, games, etc.), sales report logs and progress logs to track games, but did not include credit card, debit card or other financial account information, or Social Security numbers, driver’s license numbers, or ID card numbers; and

  • The affected individuals are located in the following countries and regions: USA, Canada, United Kingdom, Republic of Ireland, France, Germany, Spain, Belgium, the Netherlands, Denmark, Luxembourg, Latin America, Hong Kong, China, Australia and New Zealand. The largest numbers of affected individuals were reported in the U.S. (2,212,863 parent accounts and 2,894,091 children’s profiles), France (868,650 parent accounts and 1,173,497 children’s profiles), the UK (560,487 parent accounts and 727,155 children’s profiles), and Germany (390,985 parent accounts and 508,806 children’s profiles).

Given the magnitude and wide territorial reach of the VTech cyber attack, the incident is already on the radar of regulators in Hong Kong and at least two attorneys general in the United States. On December 1, the Hong Kong Office of the Privacy Commissioner for Personal Data announced that it has initiated “a compliance check on the data leakage incident” of VTech Learning Lodge.  In addition, on December 3rd, two separate class actions were filed against VTech Electronics North America, L.L.C. and VTech Holdings Limited in the Northern District of Illinois.  Since the data breach compromised personal information of children located in the United States (first and last name, photographs, online contact information, etc.), it is likely that the Federal Trade Commission (FTC) will investigate VTech’s compliance with the Children’s Online Privacy Protection Act (“COPPA”) and its implementing rule (as amended, the “COPPA Rule”). If a COPPA violation is found, the civil penalties can be steep, up to $16,000 per violation. In addition to civil penalties imposed by a court, the FTC can require an entity to implement a comprehensive privacy program and to obtain regular, independent privacy assessments for a period of time.

©1994-2015 Mintz, Levin, Cohn, Ferris, Glovsky and Popeo, P.C. All Rights Reserved.

Regulating Recording Features of Personal Wearable Technology in the Workplace

With each passing day, personal wearable technology, like the Apple Watch and Google Glass, becomes more mainstream and technologically advanced.  Employers should be aware of the challenges posed by employees wearing their technology into the workplace.  Businesses have already had to consider decreased productivity, exposure to computer viruses, and potential data breaches caused by personal wearable technology in the workplace. In addition, employers are now wondering if personal wearable devices are being used to discreetly and instantaneously record events and copy information in the workplace. Several employment laws are implicated when employers seek to regulate the recording features of personal wearable technology in the workplace.

Restrictions on personal wearable technology in the workplace are subject to Section 7 of the National Labor Relations Act, which prohibits workplace rules and policies that chill discussions among non-management employees about wages, working conditions, work instructions, and the exercise of other concerted activities for mutual aid or protection.  NLRB General Counsel Memorandum No. 15-04 contains examples of both overbroad and lawful work rules restricting recording devices in the workplace.  These examples are instructive when drafting employment policies restricting personal wearable devices.

Under Section 7, employers may prohibit employees from copying or disclosing confidential or proprietary information about the employer’s business, using wearable technology or otherwise.  Employers may also prohibit employees from taking, distributing, or posting on social media pictures, video, and audio recordings of work areas while on working time, so long as the policy carves out an exception for conduct protected by Section 7.  The exception should expressly cite specific examples of permitted recordings, such as “taking pictures of health, safety and/or working condition concerns or of strike, protests and work-related issues and/or other protected concerted activities.”  Existing employment policies restricting personal cell phone and camera use in the workplace should be updated to include restrictions on the use of recording features of wearable technology.

The recording features of personal wearable technology also provide new methods and means for employees to engage in unlawful workplace harassment and other workplace misconduct.  Employers should consider revising their anti-harassment and conduct policies to prohibit the use of wearable technology, including its recording features, in an unlawful manner.  As technology continues to evolve, so too should employment policies, to address the use of such personal devices in the workplace.

Article By Stan Hill of Polsinelli PC

Three Trending Topics in IoT: Privacy, Security, and Fog Computing

Cisco has estimated that there will be 50 billion Internet of Things (IoT) devices connected to the Internet by the year 2020. IoT has been a buzzword over the past couple of years. However, the buzz surrounding IoT in 2015 has IoT enthusiasts particularly excited. This year, IoT has taken center stage at many conferences around the world, including the Consumer Electronics Show (CES 2015), SEMICON 2015, and CEATEC Japan, among others.

1. IoT will Redefine the Expectations of Privacy

Privacy is of utmost concern to consumers and enterprises alike. For consumers, the deployment of IoT devices in their homes and other places where they typically expect privacy will lead to significant privacy concerns. IoT devices in homes are capable of identifying people’s habits that are otherwise unknown to others. For instance, a washing machine can track how frequently someone does laundry, and what laundry settings they prefer. A shower head can track how often someone showers and what temperature settings they prefer. When consumers purchase these devices, they may not be aware that these IoT devices collect and/or monetize this data.

The world’s biggest Web companies, namely Google, Facebook, LinkedIn, and Yahoo, are currently involved in lawsuits in which the issues relate to consent and whether the Web companies have provided an explicit enough picture of what data is being collected and how the data is being used. To share some perspective on the severity of the legal issues relating to online data collection, more than 250 suits have been filed in the U.S. in the past couple of years against companies’ tracking of online activities, compared to just 10 in the year 2010. As IoT devices become more prevalent, legal issues relating to consent and disclosure of how the data is being collected, used, shared or otherwise monetized will certainly arise.

2. Data and Device Security is Paramount to the Viability of an IoT Solution

At the enterprise level, data security is paramount. IoT devices can be sources of network security breaches, and as such, ensuring that IoT devices remain secure is key. When developing and deploying IoT solutions at the enterprise level, enterprises should conduct due diligence not only to prevent security breaches via the IoT deployment, but also to ensure that even if an IoT device is compromised, access to more sensitive data within the network remains secure. Corporations retain confidential data about their customers and are responsible for having adequate safeguards in place to protect the data. Corporations may be liable for deploying IoT solutions that are easily compromised. As we have seen with the countless data breaches over the past couple of years, companies have a lot to lose, financially and otherwise.

3. Immediacy of Access to Data and Fog Computing

For many IoT solutions, timing is everything. Many IoT devices and environments are “latency sensitive,” such that actions need to be taken on the data being collected almost instantaneously. Relying on the “cloud” to process the collected data and generate actions will likely not be a solution for such IoT environments, in which the immediacy of access to data is important. “Fog computing” aims to bring the storage, processing and data intelligence closer to the IoT devices deployed in the physical world to reduce the latency that typically exists with traditional cloud-based solutions. Companies developing large-scale IoT solutions should investigate architectures where most of the processing is done at the edge of the network, closer to the physical IoT devices.
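A minimal sketch of the fog-computing idea, with entirely hypothetical thresholds and function names: latency-sensitive decisions are taken on a fog/edge node sitting next to the device, and only summarized data is shipped upstream to the cloud:

```python
import statistics
from typing import List

LOCAL_ALERT_THRESHOLD = 90.0  # hypothetical threshold, e.g., degrees Celsius

def handle_reading_at_edge(reading: float, buffer: List[float]) -> None:
    """Process a sensor reading on the fog/edge node, close to the device."""
    if reading > LOCAL_ALERT_THRESHOLD:
        trigger_local_action()          # act immediately, with no round trip to the cloud
    buffer.append(reading)
    if len(buffer) >= 100:              # ship only an aggregate upstream, not raw readings
        send_summary_to_cloud(min(buffer), max(buffer), statistics.mean(buffer))
        buffer.clear()

def trigger_local_action() -> None:
    print("shutting valve locally")     # placeholder for a real actuator command

def send_summary_to_cloud(lo: float, hi: float, avg: float) -> None:
    print(f"uploading summary: min={lo}, max={hi}, avg={avg:.1f}")  # placeholder for a real upload

# Example usage: the last reading exceeds the threshold and triggers an immediate local action.
readings = [70.2, 71.0, 95.3]
buf: List[float] = []
for r in readings:
    handle_reading_at_edge(r, buf)
```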

The Internet of Things has brought about new challenges and opportunities for technology companies. Privacy, security and immediacy of access to data are three important trends companies must consider going forward.

© 2015 Foley & Lardner LLP

Are UK-to-US employee data transfers sunk by ECJ’s torpedoing of Safe Harbor regime?

So there it is – in a tremendous boost for transatlantic relations, the European Court of Justice has decided that America is not to be trusted with the personal data of EU residents.  That is not exactly the way the decision is phrased, of course, which (so far as relevant to UK HR) is more like this:

Under the Eighth Principle of the UK’s Data Protection Act (and all or most of its EU cousins) the personal data of your employees can be transferred outside the EU only where the recipient country ensures an adequate level of protection for the rights and freedoms of data subjects.

Until now an EU employer has been able to rely in this respect on a US company’s registration with the Safe Harbor (sic) scheme, a series of commitments designed to replicate the safeguards of EU law for that data.  As of this week, however, that reliance has been deemed misplaced – the ability and tendency of the US security agencies to access personal data held by US employers has been found to compromise those commitments beyond immediate repair.  In addition, one of the EU “model clauses” which can legitimise international data transfers requires the US recipient to confirm that it is aware of no legislation which could compel it to disclose that personal data to third parties without the employee’s consent.  New US laws enacted to boost homeland security mean that this can simply no longer be said.  Therefore Safe Harbor has been comprehensively blown up and can no longer be used as automatic air-cover for employee data transfers to the US.

This creates two immediate questions for HR in the UK.  First, what exposure do we have for past data transfers to the US on a basis which is now shown to be illegitimate?  Second, what do we do about such transfers starting now?

  • Don’t panic! To make any meaningful challenge out of this issue, the UK employee would need to show some loss or damage arising out of that transfer.  In other words, even if the data has been used in the US as the basis for a negative decision about him (dismissal or demotion or no bonus), the employee would need to show that that decision would have been more favourable to him if it had been taken by the same people based on the same data but physically within the EU.  Clearly a pretty tough gig.

Second, all this case does is remove the presumption that Safe Harbor registrants are safe destinations – it does not prove that they are not, either now or historically.  The question of adequacy of protection is assessed by reference to all the circumstances of the case, including the nature of the personal data sent, why it is sent to the US and what relevant codes of conduct and legislative protections exist there.

Last, Schedule 4 of the DPA disapplies the Eighth Principle where the data subject (the employee) has given his consent to the international transfer, or where the transfer is necessary for the entering or performance of the employment contract between the employee and the UK employer.  It will rarely be the case that neither of these exceptions applies.

If you have not previously had complaints from your UK employees that their personal data has been misused/lost/damaged in the US, nothing in this decision makes that particularly likely now.

  • Still don’t panic.

  • However, do be aware that this case is likely to lead to stricter precautions being required to ensure that what is sent to the US is genuinely only the bare minimum.

  • On its face, Schedule 4 should allow most reasonable international transfers of employee data anyway, pretty much regardless of what level of protection is offered in the destination country. However, there is a strong body of opinion, especially in Continental Europe, that reliance on this provision alone is unsafe and that it is still appropriate for the EU employer to take specific steps (most usually, some form of data export agreement with its US parent) to satisfy itself that a reasonable level of protection for that data exists. It may also wish to be seen to reconsider how far those HR decisions need to be made in the US at all, and whether EU employee data could be kept on an EU-based server if that is not currently the case.

  • To the extent that employment contracts do not already include it, amend them to include an express consent to the transfer of relevant personal data to the US (but do note another possible avenue of attack much mulled-over in Europe, i.e. that consent in an employment contract is not freely given because the job hangs upon it). Last, be seen to prune the UK employee data you do hold in the US back to what is strictly necessary and get rid of stuff which is no longer (if it ever was) relevant to the performance of the employment contract.

© Copyright 2015 Squire Patton Boggs (US) LLP

ECJ Rules EU-US Safe Harbor Programme Is Invalid

The powers of EU data protection authorities are significantly strengthened by the decision, allowing them to suspend some or all personal data flows into the United States in certain circumstances.

In Maximillian Schrems v. Data Protection Commissioner (case C-362/14), the European Court of Justice (ECJ) has ruled[1] that the European Commission decision approving the Safe Harbor programme is invalid. Further, the ECJ ruled that EU data protection authorities do have powers to investigate complaints about the transfer of personal data outside Europe (whether by Safe Harbor-certified organisations or otherwise, but excluding countries deemed by the EU to have “adequate” data protection laws). Finally, the ECJ ruled that data protection authorities can, where justified, suspend data transfers outside Europe until their investigations are completed.

Safe Harbor Programme

According to the European Commission, the United States is a country with “inadequate” data protection laws. The European Commission and the US Department of Commerce, therefore, agreed in 2000 to a self-certification programme for US organisations that receive personal data from Europe. Pursuant to the self-certification programme, a US organisation receiving personal data from Europe must certify that it adheres to certain standards of data processing comparable with EU data protection laws, such that EU citizens’ personal data is treated as adequately as if it had remained in Europe. The Safe Harbor programme is operated by the US Department of Commerce and enforced by the Federal Trade Commission. Over 4,000 organisations have current self-certifications of adherence to Safe Harbor principles.[2]

The Schrems Case

Mr. Schrems complained in Irish legal proceedings that the Irish Data Protection Commissioner refused to investigate his complaint that the Safe Harbor programme failed to adequately protect personal data after its transfer to the US in light of revelations about the National Security Agency’s (NSA’s) PRISM programme. The question of whether EU data protection authorities have the power to investigate complaints about the Safe Harbor programme was referred to the ECJ. Yves Bot, Advocate General at the ECJ, said in an opinion released on 23 September 2015 that the Safe Harbor programme does not currently do enough to protect EU citizens’ personal data because such data was transferred to US authorities in the course of “mass and indiscriminate surveillance and interception of such data” from Safe Harbor-certified organisations. Mr. Bot was of the opinion that the Irish Data Protection Commissioner, therefore, had the power to investigate complaints about Safe Harbor-certified organisations and, if there were “exceptional circumstances in which the suspension of specific data flows should be justified”, to suspend the data transfers pending the outcome of its investigation.

The ECJ followed Mr. Bot’s opinion and, further, declared that the European Commission’s decision to approve the Safe Harbor programme in 2000 was “invalid” on the basis that US laws fail to protect personal data transferred to US state authorities pursuant to derogations of “national security, public law or law enforcement requirements”. Furthermore, EU citizens do not have adequate rights of redress when their personal data protection rights are breached by US authorities.

The EU-US Data Protection Umbrella Agreement

In the last two years, the European Commission and various data protection working parties have discussed ways to improve the Safe Harbor programme and strengthen rights for EU citizens in cases where their personal data is transferred to the United States. Recently, the United States and European Union finalised a data protection umbrella agreement to provide minimum privacy protections for personal data transferred between EU and US authorities for law enforcement purposes. The umbrella agreement will provide certain protections to ensure that personal data is protected when exchanged between police and criminal justice authorities of the United States and the European Union. The umbrella agreement, however, does not apply to personal data shared with national security agencies.

The umbrella agreement also provides that EU citizens will have the right to seek judicial redress before US courts where US authorities deny access or rectification or unlawfully disclose their personal data. Currently, US citizens have the right to seek judicial redress in the European Union if their data—transferred for law enforcement purposes—is misused by EU law enforcement authorities. EU citizens, however, do not have corresponding rights of redress in the United States. A judicial redress bill has been introduced in the US House of Representatives; adoption of the bill would allow the United States and European Union to finalise the umbrella agreement.

Key Findings of the ECJ Decision

The key findings of the ECJ decision are as follows (quotes indicate excerpts from the ruling itself):

“The guarantee of independence of national supervisory authorities is intended to ensure the effectiveness and reliability of the monitoring of compliance with the provisions concerning protection of individuals”.

The powers of supervisory authorities include “effective powers of intervention, such as that of imposing a temporary or definitive ban on processing of data, and the power to engage in legal proceedings”.

The Safe Harbor programme “cannot prevent persons whose personal data has been or could be transferred to a third country from lodging with the national supervisory authorities a claim. . .concerning the protection of their rights and freedoms”.

National courts can consider the validity of the Safe Harbor programme, but only the ECJ can declare that it is invalid.

Where the national data protection authorities find that complaints regarding the protection of personal data by Safe Harbor-certified companies are well-founded, they “must. . .be able to engage in legal proceedings”.

Organisations self-certified under the Safe Harbor programme are permitted to “disregard” the Safe Harbor principles to comply with US national security, public interest, or law enforcement requirements.

There is no provision in the Safe Harbor programme for protection for EU citizens against US authorities who gain access to their personal data transferred to the United States pursuant to the Safe Harbor programme. There is only a provision for commercial dispute resolution.

The EU Data Protection Directive[3] “requires derogations and limitations in relation to the protection of personal data to apply only in so far as is strictly necessary”, but there is no such requirement applicable in the United States following the transfer of personal data pursuant to the Safe Harbor programme.

The Safe Harbor programme “fails to comply with the requirements” to protect personal data to the “adequate” standard required by the EU Data Protection Directive and is “accordingly invalid”.

Other Options to Transfer Personal Data to the United States

Safe Harbor-certified organisations should note that there are other options to transfer personal data to the United States, including express consent and the use of Binding Corporate Rules or EU-approved model clause agreements. Organisations using Safe Harbor-certified vendors may wish to discuss these other options with their vendors. There is, however, a risk that this decision could affect these other options, as national security derogations are likely to override the protection of personal data regardless of how it is transferred, with the only exception being the specific and informed consent of an individual to the transfer of his or her personal data to governmental authorities for national security purposes.

Conclusion

The ECJ decision is likely to have taken the European Commission by surprise.

The powers of national data protection authorities are significantly strengthened by this decision. These powers could allow data protection authorities to suspend some or all personal data flows into the United States in serious circumstances and where there is a justifiable reason to do so. There is a risk that a data protection authority could order that data transfers by an international organisation outside of Europe be suspended from that jurisdiction, even while data transfers from other European jurisdictions remain permitted. To mitigate this risk, the European Commission is entitled to issue EU-wide “adequacy decisions” for consistency purposes.

The European Commission has today announced that it intends to release guidance for Safe Harbor-certified companies within the next two weeks.

Article By Stephanie A. “Tess” Blair, Dr. Axel Spies & Pulina Whitaker of Morgan, Lewis & Bockius LLP
Copyright © 2015 by Morgan, Lewis & Bockius LLP. All Rights Reserved.

[1] See Judgment of the Court (Grand Chamber) (6 October 2015)

[2] See Safe Harbor List.

[3] Directive 95/46/EC

EU Official Calls for Invalidation of EU–U.S. Safe Harbor Pact

A European Court of Justice (ECJ) advocate general, Yves Bot, has called for the European Union–U.S. Safe Harbor Agreement to be invalidated due to concerns over U.S. surveillance practices (press release here, opinion here). The ECJ has discretion to reject the recommendation, but such opinions are generally followed. A final decision on the issue is expected to be issued late this year or next year.

The issue arises out of the claims of an Austrian law student, Max Schrems, who challenged Facebook’s compliance with EU data privacy laws. (The case is Schrems v. (Irish) Data Protection Commissioner, ECJ C-362/14.) He claims that the Safe Harbor Framework fails to guarantee “adequate” protection of EU citizen data in light of the U.S. National Security Agency’s (NSA) surveillance activities. Although the Irish data protection authority rejected his claim, he appealed and the case was referred to the ECJ.

The European Data Protection Directive prohibits data of EU citizens from being transferred to third countries unless the privacy protections of the third countries are deemed adequate to protect EU citizens’ data. The U.S. and EU signed the Safe Harbor Framework in 2000, which permits companies to self-certify to the U.S. Department of Commerce (DOC) annually that they abide by certain privacy principles when transferring data outside the EU. Companies must agree to provide clear data privacy and collection notices and offer opt-out mechanisms for EU consumers.

In 2013, former NSA contractor Edward Snowden began revealing large-scale interception and collection of data about U.S. and foreign citizens from companies and government sources around the globe. The revelations, which continue, have alarmed officials around the world, and already prompted the European Commission to urge more stringent oversight of data security mechanisms. The European Parliament voted in March 2014 to withdraw recognition from the Safe Harbor Framework. Apparently in response to the concern, the Federal Trade Commission (FTC) has taken action against over two dozen companies for failing to maintain Safe Harbor certifications while advertising compliance with the Framework, and in some cases claiming compliance without ever certifying in the first place. For more, see here (FTC urged to investigate companies), here (FTC settles with 13 companies in August 2015), and here (FTC settles with 14 companies in July 2014).

Advocate General Bot does not appear to have been mollified by the U.S. efforts, however. He determined that “the law and practice of the United States allow the large-scale collection of the personal data of citizens of the [EU,] which is transferred under the [S]afe [H]arbor scheme, without those citizens benefiting from effective judicial protection.” He concluded that this amounted to interference in violation of the right to privacy guaranteed under EU law, and that, notwithstanding the European Commission’s approval of the Safe Harbor Framework, EU member states have the authority to take measures to suspend data transfers between their countries and the U.S.

While the legal basis of that opinion may be questioned, and larger political realities regarding the ability to negotiate agreements between the EU and the U.S. are at play, if followed by the ECJ, this opinion would make it extremely difficult for companies to offer websites and services in the EU. This holds true even for many EU companies, including those that may have cloud infrastructures that store or process data in U.S. data centers. It could prompt a new round of negotiations by the U.S. and European Commission to address increased concerns in the EU about surveillance.

Congressional action already underway may help release some tension, with the House Judiciary Committee unanimously approving legislation that would give EU consumers a judicial right of action in the U.S. for violations of their privacy. This legislation was a key requirement of the EU in an agreement in principle that would allow the EU and U.S. to exchange data between law enforcement agencies during criminal and terrorism investigations.

Although the specific outcome of this case will not be known for months, the implications for many businesses are clear: confusion and continued change in the realms of privacy and data security, and uncertainty about the legal rules of the game. Increased fragmentation across the EU may result, with a concomitant need to keep abreast of varying requirements in more countries. Change and lack of harmonization is surely the new normal now.

© 2015 Keller and Heckman LLP

Wearables, Wellness and Privacy

Bloomberg BNA recently reported that this fall the Center for Democracy & Technology (CDT) will be issuing a report on Fitbit Inc.’s privacy practices. Avid runners, walkers or those up on the latest gadgets likely know about Fitbit and its line of wearable fitness devices. Others may know about Fitbit due to the need to measure progress in their employers’ wellness programs, or even to determine whether they qualify for an incentive. When participating in those programs, employees frequently raise questions about the privacy and security of data collected under such programs, a compliance issue for employers. Earlier this month, Fitbit reported that its wellness platform is HIPAA compliant.

Fitbit’s Charge HR (the one I use) tracks some interesting data in addition to the number of steps: heart rate, calories burned, sleep activity, and caller ID. This and other data can be synced with a mobile app, allowing users, among other things, to create a profile with more information about themselves, track progress daily and weekly, and find and communicate with friends also using a similar device.

Pretty cool stuff, and reasons why Fitbit is the most popular manufacturer of wearables, with nearly 25 percent of the market, as noted by Bloomberg BNA. But, of course, Fitbit is not the only player in the market, and the same issues have to be considered with the use of wearables regardless of the manufacturer.

According to Bloomberg BNA’s article, one of the concerns raised by CDT about Fitbit and other wearables is that the consumer data collected by the devices may not be protected by federal health privacy laws. However, CDT’s deputy director of the Consumer Privacy Project stated to Bloomberg BNA that she has “a real sense that privacy matters” to Fitbit. This is a good sign, but the laws that apply to the use of these kinds of devices depend on how they are used.

When it comes to employer-sponsored wellness programs and health plans, a range of laws may apply raising questions about what data can be collected, how it can be used and disclosed, and what security safeguards should be in place. At the federal level, the Health Insurance Portability and Accountability Act (HIPAA), the Americans with Disabilities Act (ADA), and the Genetic Information Nondiscrimination Act (GINA) should be on every employer’s list. State laws, such as California’s Confidentiality of Medical Information Act, also have to be taken into account when using these devices in an employment context.

Recently issued EEOC proposed regulations concerning wellness programs and the ADA address medical information confidentiality. If finalized in their current form, among other safeguards, the regulations would require employers to provide a notice informing employees about:

  • what medical information will be obtained,

  • who will receive the medical information,

  • how the medical information will be used,

  • the restrictions on its disclosure, and

  • the methods that will be used to prevent improper disclosure.

Preparing these notices for programs using wearables will require knowing more about the capabilities of the devices and how data is accessed, managed, disclosed and safeguarded.

But is all information collected from a wearable “medical information”? Probably not. The number of steps a person takes on a given day, in and of itself, seems unlikely to be medical information. However, data such as heart rate and other biometrics might be considered medical information subject to the confidentiality rule. Big data analytics and IoT may begin to play a greater role here, enabling more detailed pictures to be developed about employees and their activities and health through the many devices they use.

Increasingly, wellness programs seek to incentivize the household, or at least employees and their spouses. Collecting data from wearables of both employee and spouse may raise issues under GINA, which prohibits employers from providing incentives to obtain genetic information from employees. Genetic information includes the manifestation of disease in family members (yes, spouses are considered family members under GINA). The EEOC is currently working on proposed regulations under GINA that we hope will provide helpful insight into this and other issues related to GINA.

HIPAA too may apply to wearables and their collection of health-related data when related to the operation of a group health plan. Employers will need to consider the implications of this popular set of privacy and security standards including whether (i) changes are needed in the plan’s Notice of Privacy Practices, (ii) business associate agreements are needed with certain vendors, and (iii) the plan’s risk assessment and policies and procedures adequately address the security of PHI in connection with these devices.

Working through plans for the design and implementation of a typical wellness program certainly must involve privacy and security; more so for programs that incorporate wearables. Fitbits and other devices likely raise employees’ interest and desire to get involved, and can ease administration of the program, such as in regard to tracking achievement of program goals. But they raise additional privacy and security issues in an area where the law continues to develop. So, employers need to consider this carefully with their vendors and counselors, and keep a watchful eye for more regulation likely to be coming.

Until then, I need to get a few more steps in…

Article By Joseph J. Lazzarotti of Jackson Lewis P.C.