Proposed House Bill Would Set National Data Security Standards for Financial Services Industry

A new bill introduced by House Financial Services subcommittee Chairman Rep. Blaine Luetkemeyer would significantly change data security and breach notification standards for the financial services and insurance industries. Most notably, the proposed legislation would create a national standard for data security and breach notification and preempt all current state law on the matter.

Breach Notification Standard

The Gramm-Leach-Bliley Act (GLBA) currently requires covered entities to establish appropriate safeguards to ensure the security and confidentiality of customer records and information and to protect those records against unauthorized access or use. The proposed House bill would amend and expand GLBA to mandate notification to customers “in the event of unauthorized access that is reasonably likely to result in identity theft, fraud, or economic loss.”

To codify breach notification at the national level, the proposed legislation requires all GLBA-covered entities to adopt and implement the breach notification standards promulgated by the Comptroller of the Currency, the Board of Governors of the Federal Reserve System, the Federal Deposit Insurance Corporation, and the Office of Thrift Supervision in their Interagency Guidance on Response Programs for Unauthorized Access to Customer Information and Customer Notice. This guidance details the requirements for notification to individuals in the event of unauthorized access to sensitive information that has resulted in, or is reasonably likely to result in, misuse of that information, including the timing and content of the notification.

While the Interagency Guidance was drafted specifically for the banking sector, the proposed legislation also covers insurance providers, investment companies, securities brokers and dealers, and all businesses “significantly engaged” in providing financial products or services.

If enacted, this legislation will preempt all state laws, rules, and regulations in the financial services and insurance industries with respect to data security and breach notification.

Cohesiveness in the Insurance Industry

The proposed legislation provides uniform reporting obligations for covered entities – a particular benefit for insurance companies, which currently must navigate a maze of sometimes conflicting state breach notification laws. Under the proposed legislation, an assuming insurer need only notify the insurance authority of the state in which it is domiciled. The proposed legislation also requires the insurance industry to adopt new codified standards for data security.

To ensure consistency throughout the insurance industry, the proposed legislation also prohibits states from imposing any data security requirement in addition to, or different from, the standards of GLBA or the Interagency Guidance.

If enacted, this proposed legislation will substantially change the data security and breach notification landscape for the financial services and insurance industries. Entities within these industries should keep a careful eye on this legislation and proactively consider how these proposed revisions may impact their current policies and procedures.

 

Copyright © by Ballard Spahr LLP

California’s Turn: California Consumer Privacy Act of 2018 Enhances Privacy Protections and Control for Consumers

On Friday, June 29, 2018, California passed comprehensive privacy legislation, the California Consumer Privacy Act of 2018.  The legislation is among the most progressive privacy legislation in the United States, drawing comparisons to the European Union’s General Data Protection Regulation, or GDPR, which went into effect on May 25, 2018.  Karen Schuler, leader of BDO’s National Data & Information Governance practice and a former forensic investigator for the SEC, provides some insight into this legislation, how it compares to the EU’s GDPR, and how businesses can navigate the complexities of today’s privacy regulatory landscape.

California Consumer Privacy Act 2018

The California Consumer Privacy Act of 2018 was passed by both the California Senate and Assembly and quickly signed into law by Governor Brown, hours before the deadline to withdraw a voter-led initiative that could have put into place even stricter privacy regulations for businesses.  This legislation will have a tremendous impact on the privacy landscape in the United States and beyond: it gives consumers much more control over their information, including whether companies may sell or share their data, and it adopts an expanded definition of personal information.  The law goes into effect on January 1, 2020. You can read more about the California Consumer Privacy Act of 2018 here.

California Privacy Legislation v. GDPR

In many ways the California law resembles the GDPR; however, there are notable differences, and in some respects the California legislation goes even further.

Schuler points out:

“the theme that resonates throughout both GDPR and the California Consumer Privacy Act is to limit or prevent harm to its residents. . . both seem to be keenly focused on lawful processing of data, as well as knowing where your personal information goes and ensuring that companies protect data accordingly.”

One way California goes a bit further is in the ability of consumers to prevent a company from selling or otherwise sharing their information.  Schuler says, “California has proposed that if a consumer chooses not to have their information sold, then the company must respect that.” While the GDPR provides data protections for consumers and gives them rights to modify, delete and access their information, there is no precedent under the GDPR for stopping a company from selling consumer data if the company has a legal basis to do so.

In terms of compliance burden, Schuler hypothesizes that companies already in good shape on GDPR might have a bit of a head start on compliance with the California legislation; however, there is still a lot of work to do before the law goes into effect on January 1, 2020.  Schuler says, “There are also different descriptions of personal data between regulations like HIPAA, PCI, GDPR and others that may require – under this law – companies to look at their categorizations of data. For some organizations this is an extremely large undertaking.”

Compliance with Privacy Regulations: No Short-Cuts

With these stricter regulations coming into play, understanding data flows is of primary importance for companies. In many ways, GDPR compliance was a wake-up call to the complexities of data privacy issues in companies.  Schuler says, “Ultimately, we have found that companies are making good strides against becoming GDPR compliant, but that they may have waited too long and underestimated the level of effort it takes to institute a strong privacy or GDPR governance program.”  On how companies achieve compliance with whatever regulation they are trying to understand and implement, Schuler says, “It is critical companies understand where data exists, who stores it, who has access to it, how it’s categorized and protected.” Additionally, across industries companies are moving toward a culture of mindfulness around privacy and data security issues, a lengthy process that can require a lot of training and buy-in from all levels of the company.

While the United States still has a patchwork of privacy regulations, including breach notification statutes, this California legislation could be a game-changer.  What is clear is that companies will need to contend with privacy legislation and consumer protections. Understanding the data flows in an organization is crucial to compliance, and it turns out GDPR may have just been the beginning.

This post was written by Eilene Spear.

Copyright ©2018 National Law Forum, LLC.

California May Be Headed Towards Sweeping Consumer Privacy Protections

On June 21st, California legislative Democrats reached a tentative agreement with a group of consumer privacy activists spearheading a ballot initiative for heightened consumer privacy protections: the activists would withdraw the existing ballot initiative in exchange for the California Legislature passing, and Governor Jerry Brown signing into law, a similar piece of legislation, with some concessions, by June 28th, the final deadline to withdraw ballot initiatives.  If enacted, the Act would take effect January 1, 2020.

In the “compromise bill,” Assemblyman Ed Chau (D-Arcadia) amended the California Consumer Privacy Act of 2018 (AB 375) to ensure that the consumer privacy activists and, conversely, opponents of the ballot initiative would be comfortable with its terms.

Some of the key consumer rights provided for in AB 375 include:

  • A consumer’s right to request deletion of personal information, which would require the business to delete the information upon receipt of a verified request;

  • A consumer’s right to request that a business that sells the consumer’s personal information, or discloses it for a business purpose, disclose the categories of information it collects, the categories of information sold or disclosed, and the identity of any third parties to which the information was sold or disclosed;

  • A consumer’s right to opt out of the sale of personal information by a business, with the business prohibited from discriminating against the consumer for exercising this right, including a prohibition on charging a consumer who opts out a different price or providing the consumer a different quality of goods or services, except if the difference is reasonably related to the value provided by the consumer’s data.

Covered entities under AB 375 would include any entity that does business in the State of California and satisfies one or more of the following: (i) has annual gross revenue in excess of $25 million; (ii) alone or in combination, annually buys, receives for the business’s commercial purposes, sells, or shares for commercial purposes the personal information of 50,000 or more consumers, households, or devices; or (iii) derives 50 percent or more of its annual revenues from selling consumers’ personal information.

Though far-reaching, the amended AB 375 limits legal damages and provides significant concessions to business opponents of the bill. For example, the bill allows a business 30 days to “cure” any alleged violations prior to the California attorney general initiating legal action. Similarly, while a private action is permissible, a consumer is required to provide a business 30 days’ written notice before instituting an action, during which time the business has the same 30 days to “cure” any alleged violations.  Specifically, the bill provides: “In the event a cure is possible, if within the 30 days the business actually cures the noticed violation and provides the consumer an express written statement that the violations have been cured and that no further violations shall occur, no action for individual statutory damages or class-wide statutory damages may be initiated against the business.”  Civil penalties for actions brought by the Attorney General are capped at $7,500 for each intentional violation.  The damages in any private action brought by a consumer are not less than one hundred dollars ($100) and not greater than seven hundred and fifty dollars ($750) per consumer per incident, or actual damages, whichever is greater.

Overall, consumer privacy advocates are pleased with the amended legislation, which is “substantially similar to our initiative,” said Alastair Mactaggart, a San Francisco real estate developer leading the ballot initiative. “It gives more privacy protection in some areas, and less in others.”

The consumer rights provided for in the amended version of the California Consumer Privacy Act of 2018 are reminiscent of those found in the European Union’s sweeping privacy regulation, the General Data Protection Regulation (“GDPR”) (see Does the GDPR Apply to Your U.S. Based Company?), which took effect May 25th. Moreover, California is not the only United States locality considering far-reaching privacy protections. Recently, the Chicago City Council introduced the Personal Data Collection and Protection Ordinance, which, inter alia, would require opt-in consent from Chicago residents to use, disclose or sell their personal information. On the federal level, several legislative proposals are being considered to heighten consumer privacy protection, including the Consumer Privacy Protection Act and the Data Security and Breach Notification Act.

 

Jackson Lewis P.C. © 2018
This post was written by Joseph J. Lazzarotti of Jackson Lewis P.C.

GDPR May 25th Deadline Approaching – Businesses Globally Will Feel Impact

In less than four months, the General Data Protection Regulation (the “GDPR” or the “Regulation”) will take effect in the European Union/European Economic Area, giving individuals in the EU/EEA greater control over their personal data and imposing a sweeping set of privacy and data protection rules on data controllers and data processors alike. Failure to comply with the Regulation’s requirements could result in substantial fines of up to the greater of €20 million or 4% of a company’s annual worldwide gross revenues. Although many American companies that do not have a physical presence in the EU/EEA may have been ignoring GDPR compliance based on the mistaken belief that the Regulation’s burdens and obligations do not apply outside of the EU/EEA, they are doing so at their own peril.

A common misconception is that the Regulation only applies to EU/EEA-based corporations or multinational corporations with operations within the EU/EEA. However, the GDPR’s broad reach applies to any company that is offering goods or services to individuals located within the EU/EEA or monitoring the behavior of individuals in the EU/EEA, even if the company is located outside of the European territory. All companies within the GDPR’s ambit also must ensure that their data processors (i.e., vendors and other partners) process all personal data on the companies’ behalf in accordance with the Regulation, and are fully liable for any damage caused by their vendors’ non-compliant processing. Unsurprisingly, companies are using indemnity and insurance clauses in data processing agreements with their vendors to contractually shift any damages caused by non-compliant processing activities back onto the non-compliant processors, even if those vendors are not located in the EU/EEA. As a result, many American organizations that do not have direct operations in the EU/EEA nevertheless will need to comply with the GDPR because they are receiving, storing, using, or otherwise processing personal data on behalf of customers or business partners that are subject to the Regulation and its penalties. Indeed, all companies with a direct or indirect connection to the EU/EEA – including business relationships with entities that are covered by the Regulation – should be assessing the potential implications of the GDPR for their businesses.

Compliance with the Regulation is a substantial undertaking that, for most organizations, necessitates a wide range of changes, including:

  • Implementing “Privacy by Default” and “Privacy by Design”;
  • Maintaining appropriate data security;
  • Notifying European data protection agencies and consumers of data breaches on an expedited basis;
  • Taking responsibility for the security and processing practices of third-party vendors;
  • Conducting “Data Protection Impact Assessments” on new processing activities;
  • Instituting safeguards for cross-border transfers; and
  • Recordkeeping sufficient to demonstrate compliance on demand.

Failure to comply with the Regulation’s requirements carries significant risk. Most prominently, the GDPR empowers regulators to impose fines for non-compliance of up to the greater of €20 million or 4% of worldwide annual gross revenue. In addition to fines, regulators also may block non-compliant companies from accessing the EU/EEA marketplace through a variety of legal and technological methods. Even setting these potential penalties aside, simply being investigated for a potential GDPR violation will be costly, burdensome and disruptive, since during a pending investigation regulators have the authority to demand records demonstrating a company’s compliance, impose temporary data processing bans, and suspend cross-border data flows.

The impending May 25, 2018 deadline means that there are only a few months left for companies to get their compliance programs in place before regulators begin enforcement. In light of the substantial regulatory penalties and serious contractual implications of non-compliance, any company that could be required to meet the Regulation’s obligations should be assessing its current operations and implementing the necessary controls to ensure that it is processing personal data in a GDPR-compliant manner.

 

© 2018 Neal, Gerber & Eisenberg LLP.

Privacy Hat Trick: Three New State Laws to Juggle

Nevada, Oregon and New Jersey recently passed laws focusing on the collection of consumer information, serving as a reminder for advertisers, retailers, publishers and data collectors to keep up-to-date, accurate and compliant privacy and information collection policies.

Nevada: A Website Privacy Notice is Required

Nevada joined California and Delaware in explicitly requiring websites and online services to post an accessible privacy notice. The Nevada law, effective October 1, 2017, requires disclosure of the following:

  • The categories of “covered information” collected about consumers who visit the website or online service;

  • The categories of third parties with whom the operator may share such information;

  • A description of the process, if any, through which consumers may review and request changes to their information;

  • A description of the process by which operators will notify consumers of material changes to the notice;

  • Whether a third party may collect covered information about the consumer’s online activities over time and across different Internet websites or online services; and

  • The effective date of the notice.

“Covered Information” is defined to include a consumer’s name, address, email address, telephone number, social security number, an identifier that allows a specific person to be contacted physically or online, and any other information concerning a person maintained by the operator in combination with an identifier.

Takeaway: Website and online service operators (including Ad Techs and other data collectors) should review their privacy policies to ensure they are disclosing all collection of information that identifies, can be used to contact, or that is combined with information that identifies consumers. Website operators should also be sure that they are aware of, and are properly disclosing, any information that is shared with or collected by their third-party service providers and how that information is used.

Oregon: Misrepresentation of Privacy Practices = Unlawful Trade Practice

Oregon expanded its definition of an “unlawful trade practice,” effective January 1, 2018, to expressly include using, disclosing, collecting, maintaining, deleting or disposing of information in a manner materially inconsistent with any statement or representation published on a business’s website or in a consumer agreement related to a consumer transaction. The new Oregon law is broader than other similar state laws, which limit their application to “personal information.” Oregon’s law, which does not define “information,” could apply to misrepresentations about any information collection practices, even if not related to consumer personal information.

Takeaway: Businesses should be mindful when drafting privacy policies, terms of use, sweepstakes and contest rules and other consumer-facing policies and statements not to misrepresent their practices with respect to any information collected, not just personal information.

New Jersey: ID Cards Can Only be Scanned for Limited Purposes (not Advertising)

New Jersey’s new Personal Information and Privacy Protection Act, effective October 1, 2017, limits the purposes for which a retail establishment may scan a person’s identification card to the following:

  • To verify the authenticity of the card or the identity of the person paying for goods or services with a method other than cash, returning an item or requesting a refund or exchange;

  • To verify the person’s age when providing age-restricted goods or services to the person;

  • To prevent fraud or other criminal activity using a fraud prevention service company or system if the person returns an item or requests a refund or exchange;

  • To prevent fraud or other criminal activity related to a credit transaction to open or manage a credit account;

  • To establish or maintain a contractual relationship;

  • To record, retain, or transmit information required by State or federal law;

  • To transmit information to a consumer reporting agency, financial institution, or debt collector to be used as permitted by the Fair Credit Reporting Act and the Fair Debt Collection Practices Act; or

  • To record, retain, or transmit information governed by the medical privacy and security rules of the Health Insurance Portability and Accountability Act.

The law also prohibits the retention of information scanned from an identification card for verification purposes and specifically prohibits the sharing of information scanned from an identification card with a third party for marketing, advertising or promotional activities, or any other purpose not specified above. The law does make an exception to permit a retailer’s automated return fraud system to share ID information with a third party for purposes of issuing a reward coupon to a loyal customer.

Takeaway: Retail establishments with locations in New Jersey should review their point-of-sale practices to ensure they are not scanning ID cards for marketing, advertising, promotional or any other purposes not permitted by the New Jersey law.


This post was written by Julie Erin Rubash of Sheppard Mullin Richter & Hampton LLP.

“WannaCry” Ransomware Attack Causes Disruption Globally – With Worst Yet to Come

A ransomware known as “WannaCry” affected 200,000 people in 150 countries over the weekend, locking computer files and demanding payment to release them. As of this morning, Australia and New Zealand users seem to have avoided the brunt of the attack, with the Federal Government confirming only three reports of Australian companies being affected.  That said, ransomware attacks tend not to be reported: the rate of payment among affected users is quite high, as the ransom is deliberately priced cheaper than most alternatives unless your back-up process is very good.

The ransomware utilises vulnerabilities in out-of-date, unpatched versions of Microsoft Windows to infect devices. It spreads from computer to computer as it finds exposed targets, without the user having to open an e-mail attachment or click a link, as is commonplace in most attacks. Ransom demands start at US$300 and double after three days.

The U.K. National Health Service (NHS) was among the worst-hit organisations, forcing hospitals to cancel appointments and delay operations as they could not access their patients’ medical records. The Telegraph suggested that 90 percent of NHS trusts were using the 16-year-old Windows XP, which was particularly vulnerable to the attack. More attacks are anticipated throughout the working week as companies and organisations turn on their devices.

The U.K. National Cyber Security Center has released guidance to help both home users and organisations limit the impact of the attacks. It can be read here.

Edwin Tan is co-author of this article. 

The Department Of Homeland Security Proposes New Rules Affecting Federal Government Contractors

This week, the Department of Homeland Security (“DHS”) issued three proposed rules expanding data security and privacy requirements for contractors and subcontractors. The proposed rules build upon other recent efforts by various federal agencies to strengthen safeguarding requirements for sensitive government information.  Given the increasing emphasis on data security and privacy, contractors and subcontractors are well advised to familiarize themselves with these new requirements and undertake a careful review of their current data security and privacy procedures to ensure they comply.

  • Privacy Training

DHS contracts currently require contractor and subcontractor employees to complete privacy training before accessing a Government system of records; handling Personally Identifiable Information and/or Sensitive Personally Identifiable Information; or designing, developing, maintaining, or operating a Government system of records. DHS proposes to include this training requirement in the Homeland Security Acquisition Regulation (“HSAR”) and to make the training more easily accessible by hosting it on a public website.  By including the rule in the HSAR, DHS would standardize the obligation across all DHS contracts.  The new rule would require the training to be completed within thirty days of the award of a contract and on an annual basis thereafter.

DHS invites comment on the proposed rule. In particular, DHS asks commenters to offer their views on the burden, if any, associated with the requirement to complete DHS-developed privacy training.  DHS also asks whether industry should be given the flexibility to develop its own privacy training.  Comments must be submitted on or before March 20, 2017.

  • Information Technology Security Awareness Training

DHS currently requires contractor and subcontractor employees to complete information technology security awareness training before accessing DHS information systems and information resources. DHS proposes to amend the HSAR to require IT security awareness training for all contractor and subcontractor employees who will access (1) DHS information systems and information resources or (2) contractor owned and/or operated information systems and information resources capable of collecting, processing, storing or transmitting controlled unclassified information (“CUI”) (defined below).  DHS will require employees to undergo training and to sign DHS’s Rules of Behavior (“RoB”) before they are granted access to those systems and resources.  DHS also proposes to make this training and the RoB more easily accessible by hosting them on a public website.  Thereafter, annual training will be required.  In addition, contractors will be required to submit training certification and signed copies of the RoB to the contracting officer and maintain copies in their own records.

Through this proposed rule, DHS intends to require contractors to identify employees who will require access, to ensure that those employees complete training before they are granted access and annually thereafter, and to provide to the government and maintain evidence that training has been conducted. Comments on the proposed rule are due on or before March 20, 2017.

  • Safeguarding of Controlled Unclassified Information

DHS’s third proposed rule will implement new security and privacy measures, including handling and incident reporting requirements, in order to better safeguard CUI. According to DHS, “[r]ecent high-profile breaches of Federal information further demonstrate the need to ensure that information security protections are clearly, effectively, and consistently addressed in contracts.”  Accordingly, the proposed rule – which addresses specific safeguarding requirements set out in an Office of Management and Budget policy document on managing government data – is intended to “strengthen[] and expand[]” upon existing HSAR language.

DHS’s proposed rule broadly defines “CUI” as “any information the Government creates or possesses, or an entity creates or possesses for or on behalf of the Government (other than classified information) that a law, regulation, or Government-wide policy requires or permits an agency to handle using safeguarding or dissemination controls[,]” including any “such information which, if lost, misused, disclosed, or, without authorization is accessed, or modified, could adversely affect the national or homeland security interest, the conduct of Federal programs, or the privacy of individuals.” The new safeguarding requirements, which apply to both contractors and subcontractors, include mandatory contract clauses; collection, processing, storage, and transmittal guidelines (which incorporate by reference any existing DHS policies and procedures); incident reporting timelines; and inspection provisions. Comments on the proposed rule are due on or before March 20, 2017.

  • Other Recent Efforts To Safeguard Contract Information

DHS’s new rules follow a number of other recent efforts by the federal government to better control CUI and other sensitive government information.

Last fall, for example, the National Archives and Records Administration (“NARA”) issued a final rule standardizing marking and handling requirements for CUI. The final rule, which went into effect on November 14, 2016, clarifies and standardizes the treatment of CUI across the federal government.

NARA’s final rule defines “CUI” as an intermediate level of protected information between classified information and uncontrolled information.  As defined, it includes such broad categories of information as proprietary information, export-controlled information, and certain information relating to legal proceedings.  The final rule also makes an important distinction between two types of systems that process, store or transmit CUI:  (1) information systems “used or operated by an agency or by a contractor of an agency or other organization on behalf of an agency”; and (2) other systems that are not operated on behalf of an agency but that otherwise store, transmit, or process CUI.

Although the final rule directly applies only to federal agencies, it directs agencies to include CUI protection requirements in all federal agreements (including contracts, grants and licenses) that may involve such information.  As a result, its requirements indirectly extend to government contractors.  At the same time, however, it is likely that some government contractor systems will fall into the second category of systems and will not have to abide by the final rule’s restrictions.  A pending FAR case and anticipated forthcoming FAR regulation will further implement this directive for federal contractors.

Similarly, last year the Department of Defense (“DOD”), General Services Administration, and the National Aeronautics and Space Administration issued a new subpart and contract clause (52.204-21) to the FAR “for the basic safeguarding of contractor information systems that process, store, or transmit Federal contract information.”  The provision adds a number of new information security controls with which contractors must comply.

DOD’s final rule imposes a set of fifteen “basic” security controls for covered “contractor information systems” upon which “Federal contract information” transits or resides.  The new controls include: (1) limiting access to the information to authorized users; (2) limiting information system access to the types of transactions and functions that authorized users are permitted to execute; (3) verifying controls on connections to external information systems; (4) imposing controls on information that is posted or processed on publicly accessible information systems; (5) identifying information system users and processes acting on behalf of users or devices; (6) authenticating or verifying the identities of users, processes, and devices before allowing access to an information system; (7) sanitizing or destroying information system media containing Federal contract information before disposal, release, or reuse; (8) limiting physical access to information systems, equipment, and operating environments to authorized individuals; (9) escorting visitors and monitoring visitor activity, maintaining audit logs of physical access, and controlling and managing physical access devices; (10) monitoring, controlling, and protecting organizational communications at external boundaries and key internal boundaries of information systems; (11) implementing subnetworks for publicly accessible system components that are physically or logically separated from internal networks; (12) identifying, reporting, and correcting information and information system flaws in a timely manner; (13) providing protection from malicious code at appropriate locations within organizational information systems; (14) updating malicious code protection mechanisms when new releases are available; and (15) performing periodic scans of the information system and real-time scans of files from external sources as files are downloaded, opened, or executed.

“Federal contract information” is broadly defined to include any information provided by or generated for the federal government under a government contract.  It does not, however, include either:  (1) information provided by the Government to the public, such as on a website; or (2) simple transactional information, such as that needed to process payments.  A “covered contractor information system” is defined as one that is:  (1) owned or operated by a contractor; and (2) “possesses, stores, or transmits” Federal contract information.

ARTICLE BY Connie N. Bertram, Amy Blackwood & Emilie Adams of Proskauer Rose LLP

Law Firm Data Breaches: Big Law, Big Data, Big Problem

The Year of the Breach

2016 was the year that law firm data breaches landed and stayed squarely in both the national and international headlines. There have been numerous law firm data breaches involving incidents ranging from lost or stolen laptops and other portable media to deep intrusions exposing everything in the law firm’s network. In March, the FBI issued a warning that a cybercrime insider-trading scheme was targeting international law firms to gain non-public information to be used for financial gain. In April, perhaps the largest-volume data breach of all time involved law firm Mossack Fonseca in Panama. Millions of documents and terabytes of leaked data aired the (dirty) laundry of dozens of companies, celebrities and global leaders. Finally, Chicago law firm Johnson & Bell Ltd. was in the news in December when a proposed class action accusing it of failing to protect client data was unsealed.

A Duty to Safeguard

Law firms are warehouses of client information, and how that information is protected is increasingly regulated and scrutinized. The legal ethics rules require attorneys to take competent and reasonable measures to safeguard information relating to clients (ABA Model Rules 1.1, 1.6 and Comments). Attorneys also have contractual and regulatory obligations to protect information relating to clients and other personally identifiable information, such as financial and health information.

American Bar Association’s 2016 TechReport

Annually, the ABA conducts a Legal Technology Survey (Survey) to gauge the state of our industry vis-à-vis technology and data security. The Survey revealed that the largest firms (500 or more attorneys) reported experiencing the most security breaches, with 26% of respondents admitting they had experienced some type of breach. This continues a generally upward trend from past years, and analysts expect this number only to rise. This is likely because larger firms have more people, more technology and more data, so there is a greater exposure surface and many more risk touch-points.

Consequences of Breach

The most serious consequence of a law firm security breach is the loss of or unauthorized access to sensitive client data. However, the Survey shows a low incidence of this: only about 2% of breaches overall resulted in loss of client data. Other consequences of the breaches are significant, though: 37% of respondents reported business downtime/loss of billable hours, 28% reported hefty fees for correction, including consulting fees, 22% reported costs associated with having to replace hardware/software, and 14% reported loss of important files and information.

Employing & Increasing Safeguards Commonly Used in other Industries

The 2016 Survey shows that while many law firms are employing some safeguards and generally increasing and diversifying their use of those safeguards, our industry may not be using common security measures that other industries employ.

1. Programs and Policies. The first step of any organization in protecting its data is establishing a comprehensive data security program. Security programs should include measures to prevent breaches (like policies that regulate the use of technology) and measures to identify, protect against, detect, respond to and recover from data breaches and security incidents. Any program should designate an individual, like a full-time privacy officer or information security director, who is responsible for coordinating security. However, the numbers show that the legal industry may not be up to speed on this basic need. Survey respondents reported their firms had the following documented policies:

Document or records management and retention policy: 56%

Email use policy: 49%

Internet use/computer use policy: 41%

Social media use: 34%

2. Assessments. Using security assessments conducted by independent third parties has been a growing security practice for other industries; however, law firms have been slow to adopt this security tool, with only 18% of law firms overall reporting that they had a full assessment.

3. Standards/Frameworks. Other industries use security standards and frameworks, like those published by the International Organization for Standardization (ISO) to provide approaches to information security programs or to seek formal security certification from one of these bodies. Overall, only 5% of law firms reported that they have received such a certification.

4. Encryption. Security professionals view encryption as a basic safeguard that should be widely deployed, and it is increasingly being required by law for personal information; however, only 38% of overall respondents reported use of file encryption and only 15% reported use of drive encryption. Email encryption has become inexpensive for businesses and easier to use with commercial email services, yet overall only 26% of respondents reported using email encryption for confidential/privileged communications or documents sent to clients.

5. Cybersecurity Insurance. Many general liability and malpractice policies do not cover security incidents or data breaches, so there is an increasing need for businesses to supplement their coverage with cybersecurity insurance. Unfortunately, only 17% of attorneys reported that they have cyber coverage.

Conclusion

It is important to note that the figures revealed by the 2016 Survey, while dismaying, may also be extremely conservative, as law firms have a vested interest in keeping a breach of their clients’ data as quiet as possible. There is also the very real possibility that many firms don’t yet know that they have been breached. The 2016 Survey demonstrates that there is still a lot of room for improvement in the privacy and data security space for law firms. As law firms continue to make the news for these types of incidents, it is likely that improvement will come sooner rather than later.

President Donald J. Trump – What Lies Ahead for Privacy, Cybersecurity, e-Communication?

Following a brutal campaign – one laced with Wikileaks’ email dumps, confidential Clinton emails left unprotected, and flurries of Twitter and other social media activity – it will be interesting to see how a Trump Administration will address the serious issues of privacy, cybersecurity and electronic communications, including in social media.

Mr. Trump had not been too specific with many of his positions while campaigning, so it is difficult to have a sense of where his administration might focus. But one place to look is his campaign website, where the now President-elect outlined a vision, summarized as follows:

  • Order an immediate review of all U.S. cyber defenses and vulnerabilities by individuals from the military, law enforcement, and the private sector, the “Cyber Review Team.”

  • The Cyber Review Team will provide specific recommendations for safeguarding with the best defense technologies tailored to the likely threats.

  • The Cyber Review Team will establish detailed protocols and mandatory cyber awareness training for all government employees.

  • Instruct the U.S. Department of Justice to coordinate responses to cyber threats.

  • Develop the offensive cyber capabilities we need to deter attacks by both state and non-state actors and, if necessary, to respond appropriately.

There is nothing new here as these positions appear generally to continue the work of prior administrations in the area of cybersecurity. Perhaps insight into President-elect Trump’s direction in these areas will be influenced by his campaign experiences.

Should we expect a tightening of cybersecurity requirements through new statutes and regulations?

Mr. Trump has expressed a desire to reduce regulation, not increase it. However, political party hackings and unfavorable email dumps from Wikileaks, coupled with continued data breaches affecting private and public sector entities, may prompt his administration and Congress to do more. Politics aside, cybersecurity clearly is a top national security threat, and it is having a significant impact on private sector risk management strategies and individual security. Some additional regulation may be coming.

An important question for many, especially for organizations that have suffered a multi-state data breach, is whether we will see a federal data breach notification standard, one that would “trump” the current patchwork of state laws. With Republicans in control of the executive and legislative branches, at least for the next two years, and considering the past legislative activity in this area, a federal law on data breach notification that supersedes state law does not seem likely.

Should we expect an expansion of privacy rights or other protections for electronic communication such as email or social media communication?

Again, much has been made of the disclosure of private email during the campaign, and President-elect Trump is famous (or infamous) for his use of social media, particularly his Twitter account. For some time, however, many have expressed concern that federal laws such as the Electronic Communications Privacy Act and the Stored Communications Act are in need of significant updates to address new technologies and usage, while others continue to have questions about the application of the Communications Decency Act. We also have seen an increase in scrutiny over the content of electronic communications by the National Labor Relations Board, and more than twenty states have passed laws concerning the privacy of social media and online personal accounts. Meanwhile, the emergence of Big Data, artificial intelligence, IoT, cognitive computing and other technologies continue to spur significant privacy questions about the collection and use of data.

While there may be a tightening of the rules concerning how certain federal employees handle work emails, based on what we have seen, it does not appear at this point that a Trump Administration will make these issues a priority for the private sector.

We’ll just have to wait and see.

Jackson Lewis P.C. © 2016

IP Addresses Constitute Personal Data According to Court of Justice of European Union

In a decision dated 19 October 2016, the Court of Justice of the European Union (CJEU) provided much-needed clarification on a long-standing issue in EU data protection law.

A German politician brought an action concerning websites operated by the Federal Republic of Germany that stored personal data, including IP addresses, in logfiles for two weeks.  The question before the CJEU was: are IP addresses personal data?  According to Article 2(a) of EU Directive 95/46, “personal data” is any information relating to an identified or identifiable natural person. An identifiable person is one who can be identified, directly or indirectly, from the data.

The CJEU ruled that dynamic IP addresses constitute personal data for an online media service provider (here the Federal Republic of Germany) that makes a website accessible.

A dynamic IP address means that the computer’s IP address is newly assigned each time the website is visited.  Unlike with static IP addresses, it is not possible with dynamic IP addresses, using only files which are accessible to the public, to create an identifiable link between the user’s computer and the physical connection to the internet provider’s network. Hence, the data included in a dynamic IP address does not, by itself, enable the online media service provider to identify the user.

However, according to the CJEU, a dynamic IP address will be personal data if the additional data necessary to identify the user of a website is stored by the user’s internet service provider. The website provider need only have legal means enabling it to identify the user with the help of that additional data; such legal means exist, for example, in the event of cyber attacks, and do not have to be exercised in the specific case.

This decision has significant practical implications for all website providers, because the storing of such user information falls under data protection laws. Ultimately, the website provider needs the consent of the user to store the dynamic IP address. This will also apply after the General Data Protection Regulation (GDPR) comes into force in May 2018, because Article 2 of Directive 95/46 is incorporated in almost the same words in Article 4(1) of the GDPR.

© Copyright 2016 Squire Patton Boggs (US) LLP