Lessons in Becoming a Second-Rate Intellectual Power – Through Privacy Regulation!

The EU’s endless regulation of data usage has spilled over into academia, providing another lesson in kneecapping your own society by overregulating it. And they wonder why none of the big internet companies arose in the EU (or ever will). This time, the European data regulators seem to be doing everything they can to hamstring clinical trials and drive the research (and the resulting tens of billions of dollars of annual spend) outside the EU. That’s bad for pharma and biotech companies, but it’s also bad for universities that want to attract, retain, and teach top-notch talent.

The European Data Protection Board’s Opinion 3/2019 (the “Opinion”) fires an early and self-wounding shot in the coming war over the meaning and application of “informed consent” under the GDPR. The Board insists on defining “informed consent” in a manner that would cripple most serious health research on humans and human tissue that would otherwise take place in European hospitals and universities.

As discussed in a US law review article by former Microsoft Chief Privacy Counsel Mike Hintze, Science and Privacy: Data Protection Laws and Their Impact on Research (14 Washington Journal of Law, Technology & Arts 103 (2019)), and noted in a recent IAPP story from Hintze and Gary LaFever, both the strict interpretation of “informed consent” and the GDPR’s right to withdraw consent can cripple serious clinical trials. Further, according to LaFever and Hintze, researchers have raised concerns that “requirements to obtain consent for accessing data for research purposes can lead to inadequate sample sizes, delays and other costs that can interfere with efforts to produce timely and useful research results.”

A clinical researcher must have a “legal basis” to use personal information, especially health information, in trials. One of the primary legal-basis options is simply gaining the test subject’s permission for data use. Only this is not so simple.

On its face, the GDPR requires clear affirmative consent for using personal data (including health data) to be “freely given, specific, informed and unambiguous.” The Opinion clarifies that nearly all operations of a clinical trial – start to finish – are considered regulated transactions involving use of personal information, and special “explicit consent” is required for use of health data. Explicit consent requirements are satisfied by written statements signed by the data subject.

That consent would need to include, among other things:

  • the purpose of each of the processing operations for which consent is sought,
  • what (type of) data will be collected and used, and
  • the existence of the right to withdraw consent.

The Opinion makes clear that the Board’s authors believe clinical trials are characterized by an imbalance of power between the data subject and the sponsor of the trial, so that consent for use of personal data would likely be coercive and not “freely given.” This raises the specter not only that the data subject can pull out of a trial at any time (or insist that his/her data be removed upon completion of the trial), but that EU privacy regulators are likely to simply cancel the right to use personal health data on the ground that consent could not be freely given where the trial sponsor held an imbalance of power over the data subject. Imagine spending years and tens of millions of euros conducting clinical trials, only to have the results rendered meaningless because, suddenly, the remaining trial participants constitute an insufficient sample size.

Further, if the clinical trial operator does not get permission to use personal information for analytics, academic publication/presentation, or any other use of the trial results, then the trial operator cannot use the results in those ways. This means that either the trial sponsor insists on broad permissions to use clinical results for almost any purpose (which would raise the specter of coercive permissions), or the trial is hobbled by an inability to use data for opportunities that arise later. All in all, using subject permission as a basis for supporting legal use of personal data creates unnecessary problems for clinical trials.

That leaves the following legal bases for use of personal data in clinical trials:

  • a task carried out in the public interest under Article 6(1)(e) in conjunction with Article 9(2)(i) or (j) of the GDPR; or

  • the legitimate interests of the controller under Article 6(1)(f) in conjunction with Article 9(2)(j) of the GDPR.

Not every clinical trial will be able to establish that it is being conducted in the public interest, especially where the trial doesn’t fall “within the mandate, missions and tasks vested in a public or private body by national law.” Relying on this basis means that a trial could later be challenged as unsupported by national law, and unless the researchers have legislators or regulators pass or promulgate a clear statement of support for the research, this basis is vulnerable to privacy regulators’ whims.

Further, as observed by Hintze and LaFever, relying on legitimate interests “involves a balancing test between those legitimate interests pursued by the controller or by a third party and the risks to the interests or rights of the data subject.” So even the most controller-centric of legal supports can be reversed if the local privacy regulator feels that a legitimate use is outweighed by the interests of the data subject. I suppose the case of Henrietta Lacks, if it arose in the present-day EU, would be a clear situation in which a non-scientific regulator could squelch a clinical trial because the data subject’s right to privacy was considered more important than any trial using her genetic material.

So none of the “legal basis” options is easy or guaranteed not to be reversed later, once millions in resources have been spent on the clinical trial. Further, as Hintze observes, “The GDPR also includes data minimization principles, including retention limitations which may be in tension with the idea that researchers need to gather and retain large volumes of data to conduct big data analytics tools and machine learning.” This means that privacy regulators could step in, decide that a clinician has been too ambitious in her use of personal data in violation of data minimization rules, and shut down further use of the data for scientific purposes.

The regulators emphasize that “appropriate safeguards” will help protect clinical trials from interference, but I read such promises in the inverse. If a hacker gains access to data in a clinical trial, or if some of this data is accidentally emailed to the wrong people, or if one of the 50,000 laptops lost each day contains clinical research, then the regulators will pounce and attack the academic institution (rarely a paragon of cutting-edge data security) for demonstrating a lack of appropriate safeguards. The recent staggeringly high fines against Marriott and British Airways demonstrate the presumption, of the ICO at least, that an entity suffering a hack or otherwise losing data will be viciously punished.

If clinicians choosing where to site human trials knew about this all-encompassing privacy law and how it throws the very nature of their trials into suspicion and possible jeopardy, I can’t see why they would risk holding trials with residents of the European Economic Area. The uncertainty and risk involved in aggressively intrusive privacy regulators now taking a specific interest in clinical trials may drive important academic work overseas. If we see a data breach at a European university, or an academic enforcement action based on the laws cited above, it will drive home the risks.

In that case, this particular European shot in the privacy wars is likely to end up pushing serious researchers out of Europe, to the detriment of academic and intellectual life in the Union.

Damaging friendly fire indeed.

 

Copyright © 2019 Womble Bond Dickinson (US) LLP All Rights Reserved.

Privacy Concerns Loom as Direct-to-Consumer Genetic Testing Industry Grows

The market for direct-to-consumer (“DTC”) genetic testing has grown dramatically in recent years as more people use at-home DNA tests. The global market for this industry is projected to hit $2.5 billion by 2024. Many consumers purchase DTC genetic tests because the tests can provide insights into genetic background and ancestry. However, as more consumers’ genetic data becomes available and is shared, legal experts are growing concerned that the safeguards implemented by U.S. companies are not enough to protect consumers from privacy risks.

States vary in how they regulate genetic testing. According to the National Conference of State Legislatures, the majority of states have “taken steps to safeguard [genetic] information beyond the protections provided for other types of health information.” Most states restrict how certain parties may use or disclose genetic information without consent. Rhode Island and Washington require that companies receive written authorization to disclose genetic information. Alaska, Colorado, Florida, Georgia, and Louisiana have each defined genetic information as “personal property.” Despite these safeguards, some of these laws still do not adequately address critical privacy and security issues relating to genomic data.

Many testing companies also share and sell genetic data to third parties – albeit in accordance with “take-it-or-leave-it” privacy policies.  This genetic data often contains highly sensitive information about a consumer’s identity and health, such as ancestry, personal traits, and disease propensity.

Further, despite promises made in privacy policies, companies cannot guarantee privacy or data protection. While many companies share genetic data only with the consumer’s explicit consent, others have less strict safeguards. In some cases, companies share genetic data on a “de-identified” basis. However, concerns remain about whether genetic data can be effectively de-identified. Therefore, even when a company agrees to share only de-identified data, privacy concerns may persist because an emerging consensus holds that genetic data cannot truly be de-identified. For instance, some report that the powerful computing algorithms accessible to Big Data analysts make it very challenging to prevent de-identified data from being re-identified.

To complicate matters, patients have historically expected their health information to be protected because the Health Insurance Portability and Accountability Act (“HIPAA”) governs most patient information. Given those expectations of privacy under HIPAA, many consumers assume that this information is maintained and stored securely. Yet HIPAA does not typically govern the activities of DTC genetic testing companies – leaving consumers to agree to privacy and security protections buried in click-through privacy policies. To protect genetic privacy, the Federal Trade Commission (“FTC”) has recommended that consumers refrain from purchasing a kit until they have scrutinized the company’s website and privacy practices regarding how genomic data is used, stored and disclosed.

Although the regulation of DTC genetic testing companies remains uncertain, it is increasingly evident that consumers expect robust privacy and security controls. As such, even in the absence of clear privacy or security regulations, DTC genetic testing companies should consider implementing robust privacy and security programs to manage these risks. Companies should also approach data sharing with caution. For further guidance, companies in this space may want to review the Privacy Best Practices for Consumer Genetic Testing Services issued by the Future of Privacy Forum in July 2018. Finally, the legal and regulatory privacy landscape is rapidly expanding and evolving, so DTC genetic testing companies and the consumers they serve should watch for changes to how genetic information may be collected, used and shared over time.

 

©2019 Epstein Becker & Green, P.C. All rights reserved.
This article written by Brian Hedgeman and Alaap B. Shah from Epstein Becker & Green, P.C.

Federal Privacy Law – Could It Happen in 2019?

This was a busy week for activity and discussions on the federal level regarding existing privacy laws – namely the European Union’s General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA). But the real question is, could a federal privacy law actually happen in 2019? Cybersecurity issues and the possibility of a federal privacy law were in the spotlight at the recent Senate Judiciary Committee hearing. This week also saw the introduction of bipartisan federal legislation regarding Internet of Things (IoT)-connected devices.

Senate Judiciary Committee Hearing on GDPR and CCPA

Let’s start by discussing this week’s hearing before the Senate Judiciary Committee in Washington. On March 12, the Committee convened a hearing entitled GDPR & CCPA: Opt-ins, Consumer Control, and the Impact on Competition and Innovation. The Committee received testimony from several interested parties who discussed the pros and cons of both laws from various perspectives. One thing was clear – technology has outpaced the law, and several of those who provided testimony to the Committee argued strongly for one uniform federal privacy law rather than a patchwork of 50 different state laws.

Some of the testimony focused on the GDPR’s impact on both businesses and the economy, and some witnesses felt it is still too early to know its full impact. Others discussed ethical concerns regarding data use, competition, artificial intelligence, and the need for meaningful enforcement by the Federal Trade Commission (FTC).

One thing made clear by the testimony is that people want their data protected, and perhaps even want to prevent it from being shared and sold, but the current landscape is difficult for consumers to navigate. The reality is that many of us simply can’t keep track of every privacy policy we read or every “cookie” we consent to. It’s also increasingly clear that putting the burden on consumers to opt in or out, or to puzzle out where their data goes and how it’s used, may not be the most effective means of legislating privacy protections.

Model Federal Privacy Law

Several of the presenters at the Senate hearing included legislative proposals for a federal privacy law. (See the link included above to the Committee website with links to individual testimony.) Recently, the U.S. Chamber of Commerce also released its version of a model federal privacy law. The model legislation contains consumer opt-out rights and a deletion option, and would empower the FTC to enforce the law and impose civil penalties for violations.

IoT Federal Legislation Is Back – Sort of

In 2017, federal legislation regarding IoT was introduced but didn’t pass. This week, the Internet of Things Cybersecurity Improvement Act of 2019 was introduced in Congress in a bipartisan effort to impose cybersecurity standards on IoT devices purchased by the federal government. The bill’s supporters acknowledge the proliferation of internet-connected devices and the risks that IoT cybersecurity vulnerabilities pose to the federal government. This latest federal legislation applies only to federal government purchases of IoT devices, not to a broader audience. We recently discussed the California IoT law that was enacted last year: effective January 1, 2020, manufacturers of IoT devices sold in California must equip them with “reasonable security feature or features” to “protect the device and any information contained therein from unauthorized access, destruction, use, modification, or disclosure.”

The convergence of the new California law and the prospect of federal IoT legislation raises the question of whether these changes would be enough to drive the industry to increase the security of all IoT devices. The even bigger question is whether there is the political will in 2019 to enact a comprehensive federal privacy law. That remains to be seen as the year progresses.

 

Copyright © 2019 Robinson & Cole LLP. All rights reserved.
This post was written by Deborah A. George of Robinson & Cole LLP.

Save the Internet Act of 2019 Introduced

On 6 March 2019, Democrats in the House and Senate introduced the “Save the Internet Act of 2019.” The three-page bill (1) repeals the FCC’s Restoring Internet Freedom Order (RIF Order) released in early 2018, as adopted by the Republican-led FCC under Chairman Ajit Pai; (2) prohibits the FCC from reissuing the RIF Order or adopting rules substantively similar to those adopted in the RIF Order; and (3) restores the Open Internet Order released in 2015, as adopted by the Democratic-led FCC under Chairman Tom Wheeler.

Major Impacts:

  • Broadband Internet Access Service (BIAS) is reclassified as a “telecommunications service,” potentially subject to all provisions in Title II of the Communications Act.

  • The three bright line rules of the Open Internet Order are restored: (1) no blocking of access to lawful content, (2) no throttling of Internet speeds, exclusive of reasonable network management practices, and (3) no paid prioritization.

  • Reinstates FCC oversight of Internet exchange traffic (transit and peering), the General Conduct Rule that authorizes the FCC to address anti-competitive practices of broadband providers, and the FCC’s primary enforcement authority over the Open Internet Order’s rules and policies.

  • Per the Open Internet Order, BIAS and all high-speed Internet access services remain subject to the FCC’s exclusive jurisdiction, and the revenues derived from these services remain exempt from Universal Service Fund (USF) contribution obligations.

  • The prescriptive service disclosure and marketing rules of the Open Internet Order, subject to the small service provider exemption, would apply in lieu of the Transparency Rule adopted in the RIF Order.

FCC Chairman Pai promptly issued a statement strongly defending the merits and benefits of the RIF Order.

KH Assessment

  • From a political perspective, the Save the Internet Act of 2019 garners support from many individuals and major edge providers committed to net neutrality principles, but faces challenges in the Republican-controlled Senate.

  • In comments filed in the proceeding culminating in the RIF Order, the major wireline and wireless broadband providers supported a legislative solution that codified the no-blocking and no-throttling principles, but not the paid prioritization prohibition or the classification of BIAS as a telecommunications service.

It is highly unlikely that the legislation will be enacted as introduced. Though still unlikely, there is a better chance that a legislative compromise may be reached.

 

© 2019 Keller and Heckman LLP.

CCPA Part 2 – What Does Your Business Need to Know? Consumer Requests and Notice to Consumers of Personal Information Collected

This week we continue our series of articles on the California Consumer Privacy Act of 2018 (CCPA). We’ve been discussing the broad nature of this privacy law and answering some general questions, such as what is it? Who does it apply to? What protections are included for consumers? How does it affect businesses? What rights do consumers have regarding their personal information? What happens if there is a violation? This series is a follow up to our earlier post on the CCPA.

In Part 1 of this series, we discussed the purpose of the CCPA, the types of businesses impacted, and the rights of consumers regarding their personal information. This week we’ll review consumer requests and businesses’ obligations regarding data collection: the categories and specific pieces of personal information a business has collected, and how the categories of personal information shall be used.

We begin with two questions regarding data collection:

  • What notice must a business provide to inform a consumer what personal information it collects?
  • What is a business required to do if that consumer makes a verified request to disclose the categories and specific pieces of personal information the business has collected?

First, the CCPA requires businesses to notify a consumer, at or before the point of collection, as to the categories of personal information to be collected and the purposes for which the categories of personal information shall be used. A business shall not collect additional categories of personal information or use personal information collected for additional purposes without providing the consumer with notice consistent with this section. Cal. Civ. Code §1798.100.

Second, under the CCPA, businesses shall, upon request of the consumer, be required to inform consumers as to the categories of personal information collected and the purposes for which the categories of personal information shall be used. The CCPA states that “a business that receives a verifiable consumer request from a consumer to access personal information shall promptly take steps to disclose and deliver, free of charge to the consumer, the personal information required by this section. The information may be delivered by mail or electronically, and if provided electronically, the information shall be in a portable and, to the extent technically feasible, readily useable format that allows the consumer to transmit this information to another entity without hindrance. A business may provide personal information to a consumer at any time, but shall not be required to provide personal information to a consumer more than twice in a 12-month period.” Section 1798.100(d).

Section 1798.130(a) states that to comply with the law, a business shall, in a form that is reasonably accessible to consumers, (1) make available to consumers two or more designated methods for submitting requests for information required to be disclosed, including, at a minimum, a toll-free telephone number and, if the business maintains an Internet web site, a web site address; and (2) disclose and deliver the required information to a consumer free of charge within forty-five (45) days of receiving a verifiable request from the consumer.

Many have suggested during the rule-making process that there should be an easy-to-follow, standardized process for consumers to make their requests, so that it is clear to both consumers and businesses that a consumer has made a verified request. Such a standard would be welcome, as it would simplify this aspect of compliance for consumers and businesses alike.

When responding to consumers’ requests, businesses will be helped by several measures: a clear website privacy policy that explains the types of information collected, a documented process for consumers to make verified requests, a protocol for responding to consumer requests, audit logs of consumer requests and business responses, a dedicated website link, and clear, understandable language in privacy notices. Together, these help businesses respond to consumers and document their responses.
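For readers building internal tooling around these obligations, the two timing rules above can be sketched in a few lines of code. This is a hypothetical illustration only, not legal advice or a complete compliance implementation; the function and constant names are invented for the example.

```python
from datetime import date, timedelta

# Illustrative sketch of the CCPA timing rules discussed above:
# a business must disclose and deliver the required information within
# 45 days of a verifiable request (Section 1798.130(a)), and is not
# required to honor more than two requests per consumer in any
# 12-month period (Section 1798.100(d)).

RESPONSE_WINDOW = timedelta(days=45)
MAX_REQUESTS_PER_YEAR = 2

def response_deadline(received: date) -> date:
    """Latest date by which the disclosure must be delivered."""
    return received + RESPONSE_WINDOW

def must_honor(prior_request_dates: list[date], new_request: date) -> bool:
    """True if the business is required to honor this request,
    i.e., the consumer has made fewer than two requests in the
    preceding 12 months."""
    window_start = new_request - timedelta(days=365)
    recent = [d for d in prior_request_dates if window_start <= d <= new_request]
    return len(recent) < MAX_REQUESTS_PER_YEAR

# A request received July 1, 2019 must be answered by August 15, 2019.
deadline = response_deadline(date(2019, 7, 1))
```

A real system would, of course, also need to handle request verification, delivery format, and record-keeping, which the statute addresses separately.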

As we continue to explore the CCPA and its provisions, we strive to understand the law and translate the rights conferred by the law into business operations, processes and practices to ensure compliance with the law. In the coming weeks, we’ll focus on understanding more of these provisions and the challenges they present.

 

Copyright © 2019 Robinson & Cole LLP. All rights reserved.
This post was written by Deborah A. George of Robinson & Cole LLP.

Six Flags Raises Red Flags: Illinois Supreme Court Weighs In On BIPA

On January 25, the Illinois Supreme Court held that a person can seek liquidated damages based on a technical violation of the Illinois Biometric Information Privacy Act (BIPA), even if that person has suffered no actual injury as a result of the violation. Rosenbach v. Six Flags Entertainment Corp., No. 123186 (Ill. Jan. 25, 2019), presents operational and legal issues for companies that collect fingerprints, facial scans, or other images that may be considered biometric information.

As we have previously addressed, BIPA requires Illinois businesses that collect biometric information from employees and consumers to, among other things, adopt written policies, notify individuals, and obtain written releases. A handful of other states impose similar requirements, but the Illinois BIPA is unique because it provides individuals whose data has been collected with a private right of action for violations of the statute.

Now, the Illinois Supreme Court has held that even technical violations may be actionable.  BIPA requires that businesses use a “reasonable standard of care” when storing, transmitting, or protecting biometric data, so as to protect the privacy of the person who provides the data. The rules are detailed. Among other things, BIPA requires businesses collecting or storing biometric data to do the following:

  • establish a written policy with a retention schedule and guidelines for permanently destroying biometric identifiers and biometric information;
  • notify individuals in writing that the information is being collected or stored and the purpose and length of time for which the biometric identifier will be collected, stored, and used;
  • obtain a written release from the individual; and
  • not disclose biometric information to a third party without the individual’s consent.

The Illinois Supreme Court has now held that a plaintiff may be entitled to up to $5,000 in liquidated damages if a company violates any of these requirements, even without proof of actual damages.

In Rosenbach, the plaintiff’s son’s fingerprint was scanned so that he could use it to enter the Six Flags theme park under his season pass. Neither the plaintiff nor her son signed a written release, and neither was given the written notice required by BIPA. The plaintiff did not allege that she or her son suffered a specific injury, but claimed that if she had known Six Flags collected biometric data, she would not have purchased a pass for her son. She brought a class action on behalf of all similarly situated theme park customers, seeking maximum damages ($5,000 per violation) under BIPA. The Illinois appellate court held that the plaintiff could not maintain a BIPA action because technical violations did not render a party “aggrieved,” a key element of a BIPA claim.

In a unanimous decision, the Illinois Supreme Court disagreed. The court held that “an individual need not allege some actual injury or adverse effect, beyond violation of his or her rights under the Act, in order to qualify as an ‘aggrieved’ person and be entitled to seek liquidated damages and injunctive relief pursuant to the Act.” Even more pointedly, the court held that when a private entity fails to comply with BIPA’s requirements regarding the collection, retention, disclosure, and destruction of a person’s biometric identifiers or biometric information, that violation alone, in the absence of any actual pecuniary or other injury, constitutes an invasion, impairment, or denial of the person’s statutory rights.

This decision – along with the 200 class actions already filed – shows how important it is for vendors and companies using fingerprint timeclocks or other technologies that may collect biometric information to be aware of BIPA’s requirements.
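To see why per-violation liquidated damages make class actions so potent, a back-of-the-envelope exposure calculation helps. This sketch is illustrative only; the function name is invented, and it assumes BIPA’s statutory damages tiers of $1,000 per negligent violation and $5,000 per intentional or reckless violation.

```python
# Hypothetical BIPA class-exposure arithmetic: liquidated damages of
# $1,000 per negligent violation or $5,000 per intentional/reckless
# violation, multiplied across a class.

NEGLIGENT_RATE = 1_000
RECKLESS_RATE = 5_000

def bipa_exposure(class_size: int, violations_per_member: int,
                  reckless: bool) -> int:
    """Total statutory exposure for a class, in dollars."""
    rate = RECKLESS_RATE if reckless else NEGLIGENT_RATE
    return class_size * violations_per_member * rate

# A modest 10,000-member class, one reckless violation each,
# already yields $50 million in potential liquidated damages.
exposure = bipa_exposure(10_000, 1, reckless=True)
```

Numbers like these, with no actual-injury requirement after Rosenbach, explain the wave of filings.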

 

© 2019 Schiff Hardin LLP

Privacy Legislation Proposed in New York

The prevailing wisdom after last year’s enactment of the California Consumer Privacy Act (CCPA) was that it would result in other states enacting consumer privacy legislation. The perceived inevitability of a “50-state solution to privacy” motivated businesses previously opposed to federal privacy legislation to push for its enactment. With state legislatures now convening, we have identified what could be the first such proposed legislation: New York Senate Bill 224.

The proposed legislation is not nearly as extensive as the CCPA and is perhaps more analogous to California’s Shine the Light Law. The proposed legislation would require a “business that retains a customer’s personal information [to] make available to the customer free of charge access to, or copies of, all of the customer’s personal information retained by the business.” It also would require businesses that disclose customer personal information to third parties to disclose certain information to customers about the third parties and the personal information that is shared. Businesses would have to provide this information within 30 days of a customer request and for a twelve-month lookback period. The rights also would have to be disclosed in online privacy notices. Notably, the bill would create a private right of action for violations of its provisions.

We will continue to monitor this legislation and any other proposed legislation.

Copyright © by Ballard Spahr LLP.

This post was written by David M. Stauss of Ballard Spahr LLP.

Law Firm Security: Privacy & Data Security Laws that Affect Your Law Firm

At this point in the cybersecurity game, it’s a given that to prevent a breach, law firms must take every precaution to protect their own data as well as the valuable data of their clients. What may not be as clear are the obligations that law firms, or any other third party, owe to certain organizations under industry-specific privacy and data security laws and regulations. These requirements are put in place by industry bodies, statutes, and agency policies to ensure that the organizations they cover are not left vulnerable to cybersecurity attacks.

Privacy and Data Security Laws and Regulations

Although many organizations are subject to such requirements, this article will address the most high-profile laws and standards, including the following:

Health Insurance Portability and Accountability Act (HIPAA)

HIPAA applies to covered entities such as health plans, health care clearinghouses and certain health care providers. Because these entities do not operate in a vacuum and often rely on the services of third-party businesses, there are provisions that allow these entities to share information with business associates, which can include law firms.

A business associate “is a person or entity that performs certain functions or activities that involve the use or disclosure of protected health information on behalf of, or provides services to, a covered entity,” according to the U.S. Department of Health & Human Services website.

Before information is shared with a business associate, the covered entity must first receive satisfactory assurances that the information will be used only for the purposes for which it was obtained, that it will be safeguarded, and that it will help the covered entity perform its duties. These satisfactory assurances must be in writing to ensure compliance with privacy and data security laws.

Gramm Leach Bliley Act (GLBA)

The GLBA was enacted to require financial institutions to explain their information-sharing practices to their customers and to safeguard vulnerable customer data from a security breach.

Under the Safeguards Rule of the GLBA, all financial institutions must protect collected consumer information from a security breach. Typically, the data collected includes names, addresses and phone numbers; bank and credit card account numbers; income and credit histories; and Social Security numbers.

Further, financial institutions are required to ensure that the parties with whom they do business, such as law firms, can also safeguard the data with which they have been entrusted. To comply with privacy and data security laws, financial institutions must “select service providers that can maintain appropriate safeguards. Make sure your contract requires them to maintain safeguards, and oversee their handling of customer information,” according to the FTC website.

The FTC provides a detailed list of tips that financial institutions, as well as third parties, can use to set up a strong security system to prevent a breach of customer information.

Payment Card Industry Data Security Standard (PCI-DSS)

The PCI Security Standards Council was founded by American Express, Discover Financial Services, JCB International, MasterCard, and Visa, Inc. with the intent to “develop, enhance, disseminate and assist with the understanding of security standards for payment account security,” according to its website.

The standards apply to all entities that store, process or transmit cardholder data. This would include law firms, of course. The website lists 12 requirements that must be maintained:

  1. Install and maintain a firewall configuration to protect cardholder data.
  2. Do not use vendor-supplied defaults for system passwords and other security parameters.
  3. Protect stored cardholder data.
  4. Encrypt transmission of cardholder data across open, public networks.
  5. Use and regularly update anti-virus software or programs.
  6. Develop and maintain secure systems and applications.
  7. Restrict access to cardholder data by business need-to-know.
  8. Assign a unique ID to each person with computer access.
  9. Restrict physical access to cardholder data.
  10. Track and monitor all access to network resources and cardholder data.
  11. Regularly test security systems and processes.
  12. Maintain a policy that addresses information security for employees and contractors.
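As a purely illustrative sketch (not part of the standard itself), requirement 3 — protect stored cardholder data — is commonly satisfied by never storing the full primary account number (PAN) in readable form, instead masking it to the last four digits and keeping only a salted hash for record matching. The function names below are hypothetical examples, not PCI-mandated APIs:

```python
import hashlib

def mask_pan(pan: str) -> str:
    """Mask a primary account number, keeping only the last four digits.

    Masking or truncation is one common way to render stored cardholder
    data unreadable, per PCI-DSS requirement 3.
    """
    digits = pan.replace(" ", "").replace("-", "")
    return "*" * (len(digits) - 4) + digits[-4:]

def pan_fingerprint(pan: str, salt: bytes) -> str:
    """Return a salted SHA-256 fingerprint of the PAN.

    Lets systems match records against a card without storing the PAN;
    the salt must itself be stored securely.
    """
    digits = pan.replace(" ", "").replace("-", "")
    return hashlib.sha256(salt + digits.encode("ascii")).hexdigest()
```

A real deployment would layer this behind access controls and encryption at rest (requirements 3, 7 and 9); the sketch only shows the data-minimization idea.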

Federal Reserve System

The Federal Reserve System issued its Guidance on Managing Outsourcing Risk to address concerns about third-party vendors or service providers and the risks of a data breach. The Federal Reserve defines a service provider as “all entities that have entered into a contractual relationship with a financial institution to provide business functions or activities.”

The publication indicates that a financial institution should scale its service provider risk management program to the level of risk presented by each service provider. “It should focus on outsourced activities that have a substantial impact on a financial institution’s financial condition; are critical to the institution’s ongoing operations; involve sensitive customer information or new bank products or services; or pose material compliance risk,” according to the publication.

An effective program should include the following:

  1. Risk assessments;
  2. Due diligence and selection of service providers;
  3. Contract provisions and considerations;
  4. Incentive compensation review;
  5. Oversight and monitoring of service providers; and
  6. Business continuity and contingency plans.

Federal Deposit Insurance Corporation (FDIC)

The FDIC issued its Guidance for Managing Third-Party Risk, in which the agency makes clear that an institution’s board of directors and senior management are responsible for the activities and risks associated with third-party vendors, including a breach of a third party’s systems. The publication singles out as significant those relationships where “the relationship has a material effect on the institution’s revenues or expenses; the third party performs critical functions; the third-party stores, accesses, transmits, or performs transactions on sensitive customer information.” Any of these could describe law firms that work with financial institutions.

The publication summarizes the risks that third-party entities may pose, including strategic risk, reputation risk, operational risk, transaction risk, credit risk, compliance risk, and others. It also summarizes a risk management process comprising four elements: (1) risk assessment, (2) due diligence in selecting a third party, (3) contract structuring and review, and (4) oversight.

Conclusion

Being treated as a third-party cybersecurity risk may be foreign territory to most law firms. But many of the organizations law firms serve are subject to privacy and data security laws and regulations designed to protect systems that could be vulnerable to a cybersecurity breach. It behooves law firms to be aware of these requirements and to implement them as thoroughly and as expeditiously as possible.

© Copyright 2018 PracticePanther

Proposed House Bill Would Set National Data Security Standards for Financial Services Industry

A new bill introduced by House Financial Services subcommittee Chairman Rep. Blaine Luetkemeyer would significantly change data security and breach notification standards for the financial services and insurance industries. Most notably, the proposed legislation would create a national standard for data security and breach notification and preempt all current state law on the matter.

Breach Notification Standard

The Gramm-Leach-Bliley Act (GLBA) currently requires covered entities to establish appropriate safeguards to ensure the security and confidentiality of customer records and information and to protect those records against unauthorized access or use. The proposed House bill would amend and expand GLBA to mandate notification to customers “in the event of unauthorized access that is reasonably likely to result in identity theft, fraud, or economic loss.”

To codify breach notification at the national level, the proposed legislation requires all GLBA covered entities to adopt and implement the breach notification standards promulgated by the Comptroller of the Currency, the Board of Governors of the Federal Reserve System, the Federal Deposit Insurance Corporation, and the Office of Thrift Supervision in the Interagency Guidance on Response Programs for Unauthorized Access to Customer Information and Customer Notice. This guidance details the requirements for notifying individuals in the event of unauthorized access to sensitive information that has resulted in, or is reasonably likely to result in, misuse of that information, including the timing and content of the notification.

While the Interagency Guidance was drafted specifically for the banking sector, the proposed legislation also covers insurance providers, investment companies, securities brokers and dealers, and all businesses “significantly engaged” in providing financial products or services.

If enacted, this legislation will preempt all state laws, rules, and regulations in the financial services and insurance industries with respect to data security and breach notification.

Cohesiveness in the Insurance Industry

The proposed legislation provides uniform reporting obligations for covered entities – a benefit particularly for insurance companies, which currently must navigate a maze of sometimes conflicting state breach notification standards. Under the proposed legislation, an assuming insurer need only notify the state insurance authority in the state in which it is domiciled. The proposed legislation also requires the insurance industry to adopt new codified standards for data security.

To ensure consistency throughout the insurance industry, the proposed legislation also prohibits states from imposing any data security requirement in addition to, or different from, the standards of GLBA or the Interagency Guidance.

If enacted, this proposed legislation will substantially change the data security and breach notification landscape for the financial services and insurance industries. Entities within these industries should keep a careful eye on this legislation and proactively consider how these proposed revisions may impact their current policies and procedures.

 

Copyright © by Ballard Spahr LLP

California’s Turn: California Consumer Privacy Act of 2018 Enhances Privacy Protections and Control for Consumers

On Friday, June 29, 2018, California passed comprehensive privacy legislation, the California Consumer Privacy Act of 2018. The legislation is some of the most progressive privacy legislation in the United States, drawing comparisons to the European Union’s General Data Protection Regulation, or GDPR, which went into effect on May 25, 2018. Karen Schuler, leader of BDO’s National Data & Information Governance practice and a former forensic investigator for the SEC, provides some insight into this legislation, how it compares to the EU’s GDPR, and how businesses can navigate the complexities of today’s privacy regulatory landscape.

California Consumer Privacy Act 2018

The California Consumer Privacy Act of 2018 was passed by both the California Senate and Assembly, and quickly signed into law by Governor Brown, hours before a deadline to withdraw a voter-led initiative that could potentially have put into place even stricter privacy regulations for businesses. The legislation will have a tremendous impact on the privacy landscape in the United States and beyond: it gives consumers much more control of their information, expands the definition of personal information, and allows consumers to control whether companies sell or share their data. The law goes into effect on January 1, 2020.

California Privacy Legislation v. GDPR

In many ways, the California law resembles the GDPR; however, there are notable differences, and ways in which the California legislation goes even further.

Schuler points out:

“the theme that resonates throughout both GDPR and the California Consumer Privacy Act is to limit or prevent harm to its residents. . . both seem to be keenly focused on lawful processing of data, as well as knowing where your personal information goes and ensuring that companies protect data accordingly.”

One way California goes a bit further is in the ability of consumers to prevent a company from selling or otherwise sharing consumer information. Schuler says, “California has proposed that if a consumer chooses not to have their information sold, then the company must respect that.” While the GDPR provides data protections for consumers and grants them rights to modify, delete and access their information, there is no precedent under which the GDPR can stop a company from selling consumer data if the company has a legal basis to do so.

In terms of compliance burden, Schuler hypothesizes that companies already in good shape on GDPR might have a bit of a head start on complying with the California legislation; however, there is still a lot of work to do before the law goes into effect on January 1, 2020. Schuler says, “There are also different descriptions of personal data between regulations like HIPAA, PCI, GDPR and others that may require – under this law – companies to look at their categorizations of data. For some organizations this is an extremely large undertaking.”

Compliance with Privacy Regulations: No Short-Cuts

With these stricter regulations coming into play, understanding data flows is of primary importance for companies. In many ways, GDPR compliance was a wake-up call to the complexities of data privacy issues in companies. Schuler says, “Ultimately, we have found that companies are making good strides against becoming GDPR compliant, but that they may have waited too long and underestimated the level of effort it takes to institute a strong privacy or GDPR governance program.” On how companies achieve compliance with whatever regulation they are trying to understand and implement, Schuler says, “It is critical companies understand where data exists, who stores it, who has access to it, how it’s categorized and protected.” Additionally, companies across industries are moving to a culture of mindfulness around privacy and data security issues – a lengthy process that can require a lot of training and buy-in from all levels of the company.

While the United States still has a patchwork of privacy regulations, including breach notification statutes, this California legislation could be a game-changer.  What is clear is that companies will need to contend with privacy legislation and consumer protections. Understanding the data flows in an organization is crucial to compliance, and it turns out GDPR may have just been the beginning.

This post was written by Eilene Spear.

Copyright ©2018 National Law Forum, LLC.