The California Consumer Privacy Act Series Part 1: Applicability

California’s new privacy law, the California Consumer Privacy Act (the “CCPA”), goes into effect on January 1, 2020.  It is the most expansive state privacy law in U.S. history, imposing GDPR-like transparency and individual rights requirements on companies.  The law will impact nearly every entity that handles “personal information” regarding California residents, including (at least for now) employees.  An overview of the CCPA’s applicability is set forth below.

Who will the CCPA impact?

Most of the CCPA’s obligations apply directly to a “business,” which is an entity that:

  1. Handles “personal information” about California residents;
  2. Determines the purposes and means of processing that “personal information”; and
  3. Does business in California, and meets one of the following threshold requirements:

(a) Has annual gross revenues in excess of $25 million;

(b) Annually handles “personal information” regarding at least 50,000 consumers, households, or devices; or

(c) Derives 50% or more of its annual revenue from selling “personal information.”
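For illustration only, below is a minimal sketch of how an entity might self-assess against these three thresholds. The function name and inputs are hypothetical assumptions, not statutory terms, and the check says nothing about the separate requirements of doing business in California and determining the purposes and means of processing.

```python
# Illustrative only: a hypothetical self-assessment against the CCPA's
# three "business" thresholds (Cal. Civ. Code § 1798.140(c)(1)).
# An entity must also do business in California and determine the
# purposes and means of processing; this checks the thresholds alone.

def ccpa_thresholds_met(annual_gross_revenue_usd: float,
                        records_handled_annually: int,
                        revenue_share_from_selling_pi: float) -> bool:
    """Return True if any one of the three CCPA thresholds is met."""
    return (
        annual_gross_revenue_usd > 25_000_000          # (a) revenue
        or records_handled_annually >= 50_000          # (b) consumers/households/devices
        or revenue_share_from_selling_pi >= 0.50       # (c) revenue from selling PI
    )

# Example: $10M in revenue, but 60,000 device records handled annually
print(ccpa_thresholds_met(10_000_000, 60_000, 0.0))  # True, via threshold (b)
```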

However, “service providers” that handle “personal information” on behalf of a business, as well as other third parties that receive “personal information,” will also be impacted.  As currently written, the CCPA does not apply to non-profit organizations.

The CCPA’s three threshold requirements seem relatively straightforward, yet upon examination raise additional questions that will need to be clarified down the road.  For example:

  • Does the 50,000 devices threshold cover devices of California residents only, or apply more broadly?
  • Is the $25 million annual revenue trigger applicable only to revenue derived from California or globally?
  • What timeframe do businesses that suddenly find themselves within the CCPA’s ambit have to bring themselves into compliance with its provisions?

What is “personal information” as defined in the CCPA?

The CCPA defines “personal information” broadly in terms of (a) types of individuals and (b) types of data elements.  First, the term “consumer” refers to, and the CCPA applies to data about, any California resident, which ostensibly includes website visitors, B2B contacts and (at least for now) employees.  It is not limited to B2C customers that actually purchase goods or services.  Second, the data elements that constitute “personal information” include non-sensitive items that historically have been less regulated in the U.S., such as Internet browsing histories, IP addresses, product preferences, purchasing histories, and inferences drawn from any other types of personal information described in the statute, including:

  • Identifiers such as name, address, phone number, email address;
  • Characteristics of protected classifications under California and federal law;
  • Commercial information such as property records, products purchased, and other consuming histories;
  • Biometric information;
  • Internet or other electronic network activity;
  • Geolocation data;
  • Olfactory, audio, and visual information; and
  • Professional or educational information.

Does the CCPA have any exemptions?

The CCPA will apply to a broad number of businesses, covering nearly all commercial entities that do business in California, regardless of whether the business has a physical location or employees in the State.  However, there are some nuanced exemptions.

As a general matter, the exemptions are based on the types of information that a business collects, and not on the industry of the business collecting the information.  These include information that is collected and used wholly outside of California, subject to other state and federal laws, or sold to or from consumer reporting agencies.  Specifically, the excluded categories of “personal information” include:

      1. Activity “wholly outside” California

The CCPA does not apply to conduct that takes place “wholly outside” of California, although it is unclear how such an exemption will apply in practice.  The statute provides that this exemption applies if:

  • The business collects information while the consumer is outside of California;
  • No part of the sale of the consumer’s “personal information” occurs in California; and
  • No “personal information” collected while the consumer is in California is sold.

Determining when a consumer is outside of California when his or her “personal information” is collected will be challenging for businesses.  For example, given that an IP address is expressly included as “personal information” under the law, is a business supposed to do a reverse-lookup to determine whether an individual’s IP address originates in California?
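If a business did attempt such a lookup, a rough sketch might use MaxMind’s geoip2 library with a GeoLite2 database, as below. The database filename is an assumption, and IP geolocation is inherently imprecise (VPNs, proxies, and mobile carriers all undermine accuracy), which underscores the compliance challenge.

```python
# Illustrative sketch only: approximate whether an IP address geolocates
# to California using MaxMind's geoip2 library and a GeoLite2 database.
# The database path is an assumption; results are probabilistic at best.

import geoip2.database
import geoip2.errors

def ip_appears_californian(ip: str, db_path: str = "GeoLite2-City.mmdb") -> bool:
    with geoip2.database.Reader(db_path) as reader:
        try:
            response = reader.city(ip)
        except geoip2.errors.AddressNotFoundError:
            return False  # location unknown; handle as conservatively as needed
        return (response.country.iso_code == "US"
                and response.subdivisions.most_specific.iso_code == "CA")
```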

      2. Data subject to other U.S. laws

While the CCPA exempts certain types of information subject to other laws, importantly it does not exempt entities subject to those laws altogether.  Entities subject to these laws are also not exempt from the CCPA’s statutory damages (i.e., no injury necessary) provisions relating to data breaches.  Likewise, some types of information (clarified below) are not exempt from the data breach liability provision.  At a glance, these exemptions appear helpful; however, they may end up making operationalizing the law even more difficult for certain entities.  For example:

  • Protected Health Information (“PHI”) and “Medical Information.” The CCPA exempts all PHI collected by “covered entities” and “business associates” subject to HIPAA and “medical information” subject to California’s analogous law, the Confidentiality of Medical Information Act (“CMIA”).  It also exempts any patient information to the extent a “covered entity” or “provider of health care,” respectively, maintains the patient information in the same manner as PHI or “medical information.”  However, many of these entities and their “business associates” collect information beyond what is considered PHI, such as employment records, technical data about website visitors, B2B information, and types of research data.  This data may not be eligible for the CCPA exemption.
  • Clinical Trial Information. The CCPA exempts information collected as part of a clinical trial subject to the Federal Policy for the Protection of Human Subjects, also known as the Common Rule.
  • Financial Information. Information processed pursuant to the Gramm-Leach-Bliley Act (“GLBA”) or the California Financial Information Privacy Act (“CalFIPA”) is exempt from the CCPA.  Much like the health-related exemption, this rule does not exempt entities subject to those laws altogether; such entities remain subject to the CCPA to the extent they process information not expressly covered by GLBA/CalFIPA.  This particular exemption does not apply to the data breach liability provision.
  • Consumer Reporting Information. The CCPA exempts information sold to and from consumer reporting agencies if that information is reported in, or used to generate, a consumer report and use of that information is limited by the Fair Credit Reporting Act.
  • Driver Information. The CCPA also exempts information processed pursuant to the Driver’s Privacy Protection Act of 1994 (“DPPA”).  Importantly, entities subject to this law are not altogether exempt and this exemption does not apply to the data breach liability provision.

Moreover, the differences in definitions of relevant terms (e.g., “personal information” under the CCPA versus “nonpublic personal information” under GLBA) are important to consider when assessing relevant obligations and could result in institutions being only partially exempt from CCPA compliance.

 

© Copyright 2019 Squire Patton Boggs (US) LLP
This post was written by India K. Scarver and Elliot Golding of Squire Patton Boggs.

US Government Issues Office 365 Security Advice, Including the Use of MFA (Multi-Factor Authentication)!

Bleepingcomputer.com reported that the “Cybersecurity and Infrastructure Security Agency (CISA) issued a set of best practices designed to help organizations to mitigate risks and vulnerabilities associated with migrating their email services to Microsoft Office 365.”  The May 13, 2019 article, entitled “U.S. Govt Issues Microsoft Office 365 Security Best Practices,” highlighted the following examples of Microsoft Office 365 configuration vulnerabilities from CISA’s AR19-133A analysis report:

Multi-factor authentication for administrator accounts not enabled by default: Azure Active Directory (AD) Global Administrators in an O365 environment have the highest level of administrator privileges at the tenant level. Multi-factor authentication (MFA) is not enabled by default for these accounts.

Mailbox auditing disabled: O365 mailbox auditing logs actions that mailbox owners, delegates, and administrators perform. Microsoft did not enable auditing by default in O365 prior to January 2019. Customers who procured their O365 environment before 2019 had to explicitly enable mailbox auditing.

Password sync enabled: Azure AD Connect integrates on-premises environments with Azure AD when customers migrate to O365. If this option is enabled, the password from on-premises overwrites the password in Azure AD. In this particular situation, if the on-premises AD identity is compromised, then an attacker could move laterally to the cloud when the sync occurs.

Authentication unsupported by legacy protocols: Azure AD is the authentication method that O365 uses to authenticate with Exchange Online, which provides email services. A number of protocols associated with Exchange Online authentication do not support modern authentication methods with MFA features. Disabling these legacy protocols will greatly reduce the attack surface for organizations.
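By way of illustration only, an organization could begin reviewing one of these items, MFA enforcement via conditional access, by listing its Azure AD conditional access policies through the Microsoft Graph API. The tenant, client ID, and secret below are placeholders, the app registration would need the Policy.Read.All permission, and this is a sketch rather than CISA’s prescribed method.

```python
# Hypothetical review sketch: list Azure AD conditional access policies
# via Microsoft Graph to check whether an MFA policy covers admin roles.
# All credentials are placeholders; assumes token acquisition succeeds.

import msal
import requests

TENANT_ID = "your-tenant-id"                 # placeholder
app = msal.ConfidentialClientApplication(
    "your-client-id",                        # placeholder
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential="your-client-secret",  # placeholder
)
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])

resp = requests.get(
    "https://graph.microsoft.com/v1.0/identity/conditionalAccess/policies",
    headers={"Authorization": f"Bearer {token['access_token']}"},
)
resp.raise_for_status()
for policy in resp.json().get("value", []):
    print(policy["displayName"], "-", policy["state"])  # review for MFA coverage
```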

Given the widespread use of Office 365, this is critical advice!

 

© 2019 Foley & Lardner LLP
This post was written by Peter Vogel of Foley & Lardner LLP.

FTC Settlement with Video Social Networking App Largest Civil Penalty in a Children’s Privacy Case

The Federal Trade Commission (FTC) announced a settlement with Musical.ly, a Cayman Islands corporation with its principal place of business in Shanghai, China, resolving allegations that the defendants violated the Children’s Online Privacy Protection Act (COPPA) Rule.

Musical.ly operates a video social networking app with 200 million users worldwide and 65 million in the United States. The app provides a platform for users to create short videos of themselves or others lip-syncing to music and share those videos with other users. The app also provides a platform for users to connect and interact with other users, and until October 2016 had a feature that allowed a user to tap on the “my city” tab and receive a list of other users within a 50-mile radius.

According to the complaint, the defendants (1) were aware that a significant percentage of users were younger than 13 years of age and (2) had received thousands of complaints from parents that their children under 13 had created Musical.ly accounts.

The FTC’s COPPA Rule prohibits the unauthorized or unnecessary collection of children’s personal information online by internet website operators and online services, and requires that verifiable parental consent be obtained prior to collecting, using, or disclosing the personal information of children under the age of 13.
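To make the structure of the Rule concrete, here is a minimal, hypothetical gate an operator’s code might apply before any collection. The field names are assumptions, and obtaining real verifiable parental consent is a process involving far more than a boolean flag.

```python
# Hypothetical illustration of COPPA's gating logic: no collection from a
# child under 13 without verifiable parental consent. Field names are
# assumptions; "verifiable" consent is a documented process, not a flag.

from dataclasses import dataclass

@dataclass
class UserProfile:
    age: int
    verifiable_parental_consent: bool = False

def may_collect_personal_info(user: UserProfile) -> bool:
    if user.age >= 13:
        return True
    return user.verifiable_parental_consent

print(may_collect_personal_info(UserProfile(age=12)))  # False: consent required
```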

In addition to requiring the payment of the largest civil penalty ever imposed for a COPPA case ($5.7 million), the consent decree prohibits the defendants from violating the COPPA Rule and requires that they delete and destroy all of the personal information of children in their possession, custody, or control unless verifiable parental consent has been obtained.

FTC Commissioners Chopra and Slaughter issued a joint statement noting their belief that the FTC should prioritize uncovering the role of corporate officers and directors and hold accountable everyone who broke the law.

 

©2019 Drinker Biddle & Reath LLP. All Rights Reserved

The Digital Revolution Takes on New Meaning: Among Calls for Heightened U.S. Data Privacy Measures, California is King

California’s ambitious new data privacy law, the California Consumer Privacy Act of 2018 (“CCPA”),[1] will go into effect on January 1, 2020, and promises to bring a new era of digital regulation to America’s shores. Financial institutions that just navigated their way through implementing the European Union’s General Data Protection Regulation (“GDPR”),[2] which became effective in May 2018,[3] may be uneasy about the prospect of complying with yet another new data privacy compliance regime. They will find some comfort in the fact that many of the systems and processes designed for GDPR compliance will serve their needs under the CCPA as well. However, between now and the go-live date of the CCPA, U.S. federal and state laws and regulations are likely to continue to evolve and expand, and financial institutions will need to prepare for CCPA implementation while staying abreast of other fast-moving developments. In this article, we provide some key takeaways for how firms can be as prepared as possible for the continuing evolution of U.S. data privacy law.

  1. The New California Data Privacy Law Will Apply Broadly to Financial Institutions with Customers in California

Financial institutions with customers who are California residents almost certainly fit within the types of businesses to which the CCPA will apply. A “business” subject to the CCPA includes for-profit sole proprietorships, partnerships, limited liability companies, corporations, associations, or any other legal entities that collect consumers’ personal information and that satisfy one or more of the following criteria:

  • has annual gross revenues in excess of $25 million;

  • alone or in combination annually buys, receives for the business’ commercial purposes, sells, or shares for commercial purposes, alone or in combination, the personal information of 50,000 or more consumers, households, or devices; or

  • derives 50% or more of its annual revenue from selling consumers’ personal information.[4]

The CCPA also applies to legal entities that control or are controlled by a CCPA-covered business, and where the two legal entities share common branding (such as a shared name, servicemark, or trademark).[5]

For U.S. businesses seeking to remain outside the purview of the CCPA, the available carve-out is extremely narrow. Businesses that collect or sell the personal information of a California resident are exempt from the CCPA only if “every aspect of that commercial conduct takes place wholly outside of California.” This requires that (a) the personal information must have been collected when the consumer was outside of California, (b) no part of the sale of the consumer’s personal information occurred in California, and (c) no personal information collected while the consumer was in California was sold. In practice, this means that any firm with a website or other digital presence visited by California residents will likely be ensnared by the CCPA even if they lack employees or a physical presence in the state.[6]

Businesses that fail to comply with the CCPA are subject to the possibility of a state enforcement action and consumer lawsuits (available only after providing notice to the business and the business fails to cure the violation within 30 days).[7] However, unlike the GDPR which can impose fines calculated as a factor of global revenue, the CCPA assesses penalties of up to $2,500 per violation and up to $7,500 per intentional violation.[8]

  2. California’s Expansive Concept of “Personal Information” Is Similar to the GDPR

When determining what consumer data will constitute personal information under the CCPA, firms can look to certain similarities with the GDPR.

Under the CCPA, “personal information” means “information that identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household.” This includes, but is not limited to, names, addresses, identification number (such as social security, driver’s license, or passport), email address, and Internet Protocol (IP) address. It also includes biometric information, internet activity information (such as web browser or search history, or information regarding a consumer’s interaction with a website), geolocation data, and employment-related or education information.[9] This definition is largely consistent with how the GDPR broadly defines “personal data” for residents of the EU.[10]

The CCPA does not apply to data that has been “deidentified,” which means personal information that cannot reasonably identify, relate to, describe, or be linked to a particular consumer.[11] This is akin to the GDPR’s exclusion for “anonymized” data which cannot be used to identify a data subject. In addition, the CCPA does not apply to “aggregate consumer information,” which is information that relates to a group or category of consumers, from which individual consumer identities have been removed, that is not linked or reasonably linkable to any consumer or household or device.[12]

One difference between the two regimes, however, is that the CCPA’s definition of personal information excludes “publicly available” information, which is information that is lawfully made available from federal, state, or local government records.[13] The GDPR does not have a similar exception and instead provides the same protections to personal data regardless of its source.

  3. California Consumers Will Enjoy a New Bill of Rights Protecting their Personal Information

Another similarity between the CCPA and the GDPR is the recognition of several fundamental rights that consumers will soon enjoy relating to the collection, use, and sale of their personal information. Under the CCPA, these can effectively be described as:

  • Right of Disclosure. A business that collects a consumer’s personal information will be required, at or before the point of collection, to inform consumers as to the categories of personal information to be collected and the purposes for which the categories of personal information will be used.[14] A consumer, i.e., a “natural person who is a California resident,” will also have the right to request that such a business disclose to that consumer the categories and specific pieces of personal information the business has collected.[15] Such a request must be complied with promptly, by mail or electronically, and free of charge to the consumer; however, businesses will not be required to provide such information per consumer request more than twice in a 12-month period.[16] Together with this right, consumers will also have the ability to request the business or commercial purpose for collecting or selling personal information, and the categories of third parties with whom the business shares personal information.[17] Finally, consumers will have the right to request that a business that sells the consumer’s personal information, or discloses it for a business purpose, disclose what personal information was collected and the categories of third parties to whom it was sold.[18]

  • Right of Deletion. A consumer will have the right to request that a business delete any personal information about the consumer which the business has collected from the consumer.[19] If a business has received such a request, it will be required not only to delete the consumer’s personal information from its records, but also to direct any service providers to do the same.[20] This obligation to delete personal information at consumer request is subject to several exceptions, including for the completion of a financial transaction, to detect security incidents or debug errors, and to comply with legal obligations.[21]

  • Right to “Opt Out.” A consumer will have the right to direct a business that sells personal information about the consumer to third parties not to sell the consumer’s personal information going forward.[22] Once a business has received such an instruction from a consumer, it may not resume selling that consumer’s personal information unless expressly authorized to do so.[23] This right of a consumer to “opt out” must be clearly communicated to consumers on a business’ website under a banner titled “Do Not Sell My Personal Information,” with an accompanying link that enables a customer to opt out of the sale of the consumer’s personal information.[24]

  • Right to Non-Discrimination. Businesses will be prohibited from discriminating against consumers who exercise their various rights under the CCPA by denying them goods or services, charging different prices, or providing a different level or quality of goods or services.[25]

  4. Financial Institutions Should Not Expect a Complete Carve-Out Under Federal Law

The CCPA will not apply to personal information that is collected, processed, sold, or disclosed under certain federal laws.[26] One such law is the Gramm-Leach-Bliley Act (“GLBA”),[27] which covers financial institutions that offer consumers financial products, like banks, and contains its own consumer privacy-related protections.[28] However, this is not a complete exception because the CCPA defines personal information far more broadly than the financial-transaction-related data contemplated by the GLBA, and includes such data as browser history and IP address. As a result, firms will need to contemplate what personal information they collect in addition to what is captured under the GLBA and be prepared to protect it accordingly under the CCPA.

  5. Conclusion

California may be the next big word on U.S. data privacy legislation, but it is unlikely to be the last. In recent years, Congress and other states have faced increased pressure to explore new cybersecurity and data privacy legislation due to a multitude of factors including a growing awareness of how businesses collect and use personal information as seen with Cambridge Analytica’s use of Facebook data, and public frustration with companies’ perceived lackluster responses to major customer data breaches.[29] A recent report from the U.S. Government Accountability Office further highlights America’s growing appetite for GDPR-like legislation, calling it an “appropriate time for Congress to consider comprehensive Internet privacy legislation.”[30]  And while the last Congress failed to enact any new national data privacy legislation into law, both the House and Senate have held hearings recently to receive testimony on guiding principles for a potential federal data privacy law, with a key question being whether any such law should preempt state laws like the CCPA.[31] So while a full-blown U.S. equivalent of the GDPR may not yet be in the cards, the current mood among the public and among lawmakers points in the direction of more rather than less intensive data privacy rules to come.

1  SB-1121 California Consumer Privacy Act of 2018 (Sept. 24, 2018).

2 European Commission, General Data Protection Regulation (Regulation (EU) 2016/679) of the European Parliament.

3  See Joseph Moreno et al., The EU’s New Data Protection Regulation – Are Your Cybersecurity and Data Protection Measures up to Scratch?, Cadwalader, Wickersham & Taft LLP (Mar. 6, 2017).

4   Cal. Civ. Code § 1798.140(c)(1).

5   § 1798.140(c)(2).

6   § 1798.145(a)(6).

7   § 1798.150(b).

8   § 1798.155(b).

9   § 1798.140(o)(1).

10  Article 4 of the GDPR defines “personal data” as “any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.”

11  § 1798.140(h).

12  § 1798.140(a).

13  § 1798.140(o)(2). Under the CCPA, personal information loses its “publicly available” designation if that data is “used for a purpose that is not compatible with the purpose for which the data is maintained and made available in the government records or for which it is publicly maintained.” Id.

14  § 1798.100(b).

15  § 1798.100(a).

16  § 1798.100(d).

17  § 1798.110(a).

18  § 1798.115(a).

19  § 1798.105(a).

20  § 1798.105(c).

21  § 1798.105(d).

22  § 1798.120(a).

23  § 1798.120(c).

24  § 1798.135(a)(1).

25  § 1798.125(a)(1).

26  § 1798.145(e).

27  15 U.S.C. §§ 6801-6809, 6821-6827.

28  Federal Financial Institutions Examination Council, Gramm-Leach-Bliley Summary of Provisions.

29  See Joseph Moreno, States Respond to Equifax Cyber Breach with Enforcement Actions and Calls for Enhanced Regulatory Powers, Cadwalader, Wickersham & Taft LLP (Oct. 13, 2017).

30  United States Government Accountability Office, Internet Privacy: Additional Federal Authority Could Enhance Consumer Protection and Provide Flexibility (Jan. 2019), https://www.gao.gov/assets/700/696437.pdf.

31  U.S. House Committee on Energy & Commerce Subcommittee on Consumer Protection & Commerce, Hearing on “Protecting Consumer Privacy in the Era of Big Data” (Feb. 26, 2019); U.S. Senate Committee on Commerce, Science, and Transportation, Policy Principles for a Federal Data Privacy Framework in the United States (Feb. 27, 2019); Alfred Ng, At Hearing on Federal Data-Privacy Law, Debate Flares Over State Rules, CNET (Feb. 26, 2019); Daniel R. Stoller, New FTC Powers Weighed in Senate Data Privacy Hearing (1), Bloomberg Law (Feb. 27, 2019).

 

© Copyright 2019 Cadwalader, Wickersham & Taft LLP

California AG Announces Amendment to the CCPA

On February 25, 2019, California Attorney General Xavier Becerra and Senator Hannah-Beth Jackson introduced Senate Bill 561, legislation intended to strengthen and clarify the California Consumer Privacy Act (CCPA), which was enacted in June of 2018. If enacted, this would be the second amendment to the CCPA, following an earlier amendment in September of 2018, when Governor Jerry Brown signed into law Senate Bill 1121, which also clarified and strengthened the original version of the law.

As we reported previously, the CCPA will apply to any entity that does business in the State of California and satisfies one or more of the following: (i) annual gross revenue in excess of $25 million, (ii) alone or in combination, annually buys, receives for the business’ commercial purposes, sells, or shares for commercial purposes, alone or in combination, the personal information of 50,000 or more consumers, households, or devices, or (iii) derives 50 percent or more of its annual revenues from selling consumers’ personal information. Under the CCPA, key consumer rights will include:

  • A consumer’s right to request deletion of personal information which would require the business to delete information upon receipt of a verified request;
  • A consumer’s right to request that a business that sells the consumer’s personal information, or discloses it for a business purpose, disclose the categories of information that it collects and the categories of information and third parties to which the information was sold or disclosed;
  • A consumer’s right to opt-out of the sale of personal information by a business and prohibiting the business from discriminating against the consumer for exercising this right, including a prohibition on charging the consumer who opts-out a different price or providing the consumer a different quality of goods or services, except if the difference is reasonably related to value provided by the consumer’s data.

SB 561’s amendments include:

  • Expands a consumer’s right to bring a private cause of action. Currently, the CCPA provides consumers a private right of action if their nonencrypted or nonredacted personal information is subject to unauthorized access and exfiltration, theft, or disclosure because the covered business did not meet its duty to implement and maintain reasonable safeguards to protect that information. The amendment broadens this provision to grant consumers a private right of action if any of their rights under the CCPA are violated.
  • Removes language that allows businesses the opportunity to cure an alleged violation within 30 days after being notified of alleged noncompliance.
  • Removes language allowing a business or third party to seek the opinion of the Attorney General for guidance on how to comply with the law. Instead, the amendment specifies that the Attorney General may publish materials that provide businesses and others with general guidance on how to comply with the law.

With an effective date of January 1, 2020 (and regulations not yet proposed), it is expected that additional amendments will be negotiated, drafted, and published. Last month, the California Attorney General’s Office began the CCPA rulemaking process with a six-part series of public forums, allowing all interested persons the opportunity to provide their comments on the new law.

SB 561 comes just days after AG Becerra, together with Assemblymember Marc Levine, announced Assembly Bill 1130 to strengthen California’s existing data breach notification law. No doubt, California is leading the way in U.S. data privacy and security law.

Jackson Lewis P.C. © 2019.

This post was written by Joseph J. Lazzarotti, Jason C. Gavejian, and Maya Atrakchi.

Google Fined $57 Million in First Major Enforcement of GDPR Against a US-based Company

On January 21, 2019, Google was fined nearly $57 million (approximately 50 million euros) by France’s Data Protection Authority, CNIL, for an alleged violation of the General Data Protection Regulation (GDPR).[1] CNIL found Google violated the GDPR based on a lack of transparency, inadequate information, and lack of valid consent regarding ad personalization. This fine is the largest imposed under the GDPR since it went into effect in May 2018 and the first to be imposed on a U.S.-based company.

CNIL began investigating Google’s practices based on complaints received from two GDPR consumer privacy rights organizations alleging Google did not have a valid legal basis to process the personal data of the users of its services, particularly for Google’s personalized advertisement purposes. The first of the complaints was filed on May 25, 2018, the effective date of the GDPR.

Following its investigation, CNIL found the general structure of the information required to be disclosed by Google relating to its processing of users’ information was “excessively disseminated across several documents.” CNIL stated the relevant information pertaining to privacy rights was only available after several steps, which sometimes required up to five or six actions. Moreover, CNIL indicated users were not able to fully understand the extent of the processing operations carried out by Google because the operations were described in a “too generic and vague manner.” Additionally, the regulator determined information regarding the retention period was not provided for some data collected by Google.

Google’s process for obtaining user consent to data collection for advertisement personalization was also alleged to be problematic under the GDPR. CNIL stated Google users’ consent was not considered to be sufficiently informed due to the information on processing operations for advertisement being spread across several documents. The consent obtained by Google was not deemed to be specific to any individual Google service, and CNIL determined it was impossible for the user to be aware of the extent of the data processed and combined.

Finally, CNIL determined the user consent captured by Google was not “specific” or “unambiguous” as these terms are defined by the GDPR. By way of example, CNIL noted that Google’s users were asked to click the boxes “I agree to Google’s Terms of Service” and “I agree to the processing of my information as described above and further explained in the Privacy Policy” in order to create an account. As a result, the user was required to give consent, in full, for all processing operations purposes carried out by Google based on this consent, rather than for distinct purposes, as required under the GDPR. Additionally, CNIL commented that Google’s checkbox used to capture user consent relating to ad personalization was “pre-clicked.” The GDPR requires consent to be “unambiguous,” with clear affirmative action from the user, which, according to CNIL, requires the user to click an unchecked box.

This fine may be appealed by Google, which indicated it remained committed to meeting the “high standards of transparency and control” expected by its users and to complying with the consent requirements of the GDPR. Google indicated it would study the decision to determine next steps. Given Google is the first U.S.-based company against whom a DPA has attempted GDPR enforcement, in combination with the size of the fine imposed, it will be interesting to watch how Google responds.

The GDPR enforcement action against Google should be seen as a message to all U.S.-based organizations that collect the data of citizens of the European Union. Companies should review their privacy policies, practices, and end-user agreements to ensure they are compliant with the consent requirements of the GDPR.


© 2019 Dinsmore & Shohl LLP. All rights reserved.
This post was written by Matthew S. Arend and Jared M. Bruce of Dinsmore & Shohl LLP.

Law Firm Security: Privacy & Data Security Laws that Affect Your Law Firm

At this point in the cybersecurity game, it’s a given that to prevent a breach, law firms must take every precaution to protect their own data as well as the valuable data of their clients. What may not be as clear are the obligations that law firms, or any other third party, owe to certain organizations under industry-specific privacy and data security laws and regulations. These requirements are put in place by industry bodies, government laws, and agency policies to ensure that third parties are not vulnerable to cybersecurity attacks.

Privacy and Data Security Laws and Regulations

Although there are many organizations that are subject to these laws, this article will address the most high-profile laws and regulations, including the following:

Health Insurance Portability and Accountability Act (HIPAA)

HIPAA applies to covered entities such as health plans, health care clearinghouses and certain health care providers. Because these entities do not operate in a vacuum and often rely on the services of third-party businesses, there are provisions that allow these entities to share information with business associates, including law firms.

A business associate “is a person or entity that performs certain functions or activities that involve the use or disclosure of protected health information on behalf of, or provides services to, a covered entity,” according to the U.S. Department of Health & Human Services website.

Before information is shared with a business associate, the entity must first receive satisfactory assurances that the information will only be used for the purposes for which it was obtained, that the information will be safeguarded and that the information will help the covered entity to perform its duties. The satisfactory assurances must be in writing to ensure compliance with privacy and data security laws.

Gramm Leach Bliley Act (GLBA)

The GLBA was enacted to require financial institutions to explain their information-sharing practices to their customers and to safeguard vulnerable customer data from a security breach.

Under the Safeguards Rule of the GLBA, all financial institutions must protect the consumer information they collect from a security breach. Typically, the data collected includes names, addresses and phone numbers; bank and credit card account numbers; income and credit histories; and Social Security numbers.

Further, financial institutions are required to ensure that parties with whom they do business, such as law firms, are also able to safeguard the data with which they have been entrusted. To ensure compliance with privacy and data security laws, financial institutions must “select service providers that can maintain appropriate safeguards. Make sure your contract requires them to maintain safeguards, and oversee their handling of customer information,” according to the FTC website.

The FTC provides a detailed list of tips that financial institutions, as well as third-parties, can use to set up a strong security system to prevent a data breach of a customer’s information.

Payment Card Industry Data Security Standard (PCI-DSS)

The PCI Security Standards Council was founded by American Express, Discover Financial Services, JCB International, MasterCard, and Visa, Inc. with the intent to “develop, enhance, disseminate and assist with the understanding of security standards for payment account security,” according to its website.

The standards apply to all entities that store, process or transmit cardholder data. This would include law firms, of course. The website lists 12 requirements that must be maintained, including:

  1. Install and maintain a firewall configuration to protect cardholder data.
  2. Do not use vendor-supplied defaults for system passwords and other security parameters.
  3. Protect stored cardholder data.
  4. Encrypt transmission of cardholder data across open, public networks (see the sketch after this list).
  5. Use and regularly update anti-virus software or programs.
  6. Develop and maintain secure systems and applications.
  7. Restrict access to cardholder data by business need-to-know.
  8. Assign a unique ID to each person with computer access.
  9. Restrict physical access to cardholder data.
  10. Track and monitor all access to network resources and cardholder data.
  11. Regularly test security systems and processes.
  12. Maintain a policy that addresses privacy and data security laws and regulations for employees and contractors.
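As a narrow illustration of requirement 4 above, a Python client transmitting cardholder data over a public network might refuse anything below TLS 1.2, as in this sketch. The endpoint URL is a placeholder, and actual PCI-DSS compliance involves far more than transport encryption.

```python
# Sketch for requirement 4 only: enforce certificate validation and a
# TLS 1.2+ floor when transmitting data over an open, public network.
# The URL is a placeholder; PCI-DSS scope is much broader than this.

import ssl
import urllib.request

context = ssl.create_default_context()             # verifies certificates by default
context.minimum_version = ssl.TLSVersion.TLSv1_2   # refuse SSLv3 / TLS 1.0 / TLS 1.1

request = urllib.request.Request("https://payments.example.com/charge")  # placeholder
with urllib.request.urlopen(request, context=context) as response:
    print(response.status)
```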

Federal Reserve System

The Federal Reserve System issued the Guidance on Managing Outsourcing Risk publication to address concerns about third-party vendors or service providers and the risks of a data breach. The Federal Reserve defines service provider as “all entities that have entered into a contractual relationship with a financial institution to provide business functions or activities.”

The publication indicates that a financial institution should treat the service provider risk management program commensurate with the level of risk presented by each service provider. “It should focus on outsourced activities that have a substantial impact on a financial institution’s financial condition; are critical to the institution’s ongoing operations; involve sensitive customer information or new bank products or services; or pose material compliance risk,” according to the publication.

An effective program should include the following:

  1. Risk assessments;
  2. Due diligence and selection of service providers;
  3. Contract provisions and considerations;
  4. Incentive compensation review;
  5. Oversight and monitoring of service providers; and
  6. Business continuity and contingency plans.

Federal Deposit Insurance Corporation (FDIC)

The FDIC issued a Guidance for Managing Third-Party Risk where the agency makes clear that an institution’s board of directors and senior management are responsible for the activities and risks associated with third-party vendors. This includes a breach into a third-party’s system. Among other third-party organizations, the publication lists significant organizations where “the relationship has a material effect on the institution’s revenues or expenses; the third party performs critical functions; the third-party stores, accesses, transmits, or performs transactions on sensitive customer information.” All of these could involve law firms that work with financial institutions.

The publication summarizes risks that third-party entities may pose, including strategic risk, reputation risk, operational risk, transaction risk, credit risk, compliance risk, and other risks. It also summarizes a risk management process, which includes the following elements: (1) risk assessment, (2) due diligence in selecting a third party, (3) contract structuring and review, and (4) oversight.

Conclusion

Being a third-party cybersecurity risk may be unfamiliar territory for most law firms. But many organizations have privacy and data security laws and regulations in place to protect systems that could be vulnerable to a cybersecurity breach. It behooves law firms to be aware of these laws and regulations and to comply with them as thoroughly and as expeditiously as possible.


© Copyright 2018 PracticePanther

The Importance of Information Security Plans

In the first installment of our weekly series during National Cybersecurity Awareness Month, we examine information security plans (ISP) as part of an overall cybersecurity strategy.  Regardless of the size or function of an organization, having an ISP is a critical planning and risk management tool and, depending on the business, it may be required by law.  An ISP details the categories of data collected, the ways that data is processed or used, and the measures in place to protect it.  An ISP should address the different categories of data maintained by the organization, including employee data and customer data, as well as sensitive business information like trade secrets.

Having an ISP is beneficial for many reasons but there are two primary benefits.  First, once an organization identifies the data it owns and processes, it can more effectively assess risks and protect the data.  Second, in the event of a cyber attack or breach, an organization’s thorough understanding of the types of data it holds and the location of that data will expedite response efforts and reduce financial and reputational damage.

While it is a tedious task to determine the data that an organization collects and create a data inventory from that information, it is well worth the effort.  Once an organization assembles a data inventory, it can assess whether it needs all the data it collects before it invests time, effort and money into protecting it.  From a risk management perspective, it is always best to collect the least amount of information necessary to carry out business functions.  By eliminating unnecessary data, there is less information to protect and, therefore, less information at risk in the event of a cyber attack or breach.

Some state, federal and international laws require an ISP (or something like it).  For example, in Massachusetts, all businesses (regardless of location) that collect personal information of Massachusetts residents, which includes an organization’s own employees, “shall develop, implement, and maintain a comprehensive information security program that is written . . . and contains administrative, technical, and physical safeguards” based on the size, operations and sophistication of the organization.  The MA Office of Consumer Affairs and Business Regulation created a guide for small businesses to assist with compliance.

In Connecticut, there is no ISP requirement unless you contract with the state or are a health insurer.  However, the state data breach law pertaining to electronically stored information offers a presumption of compliance when, following a breach, the organization timely notifies and reports under the statute and follows its own ISP.  Practically speaking, this means that the state Attorney General’s office is far less likely to launch an investigation into the breach.

On the federal level, by way of example, the Gramm Leach Bliley Act (GLBA) requires financial institutions to have an ISP and the Health Insurance Portability and Accountability Act (HIPAA) requires covered entities to perform a risk analysis, which includes an assessment of the types of data collected and how that data is maintained and protected.  Internationally, the EU General Data Protection Regulation (GDPR), which took effect on May 25, 2018 and applies to many US-based organizations, requires a “record of processing activities.”  While this requirement is more extensive than the ISP requirements noted above, the concept is similar.

Here is a strategy for creating an ISP for your organization:

  1. Identify the departments that collect, store or process data.
  2. Ask each department to identify: (a) the categories of data they collect (e.g., business data and personal data such as name, email address, date of birth, social security number, credit card or financial account number, government ID number, etc.); (b) how and why they collect it; (c) how they use the data; (d) where it is stored; (e) the format of the data (paper or electronic); and (f) who has access to it (see the sketch after this list).
  3. Examine the above information and determine whether it needs to continue to be collected or maintained.
  4. Perform a security assessment, including physical and technological safeguards that are in place to protect the data.
  5. Devise additional measures, as necessary, to protect the information identified.  Such measures may include limiting electronic access to certain employees, file encryption, IT security solutions to protect the information from outside intruders or locked file cabinets for paper documents.  Training should always be an identified measure for protecting information and we will explore that topic thoroughly later this month.
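One way to capture each department’s answers from step 2 is a simple structured record, as in this sketch. The field names are illustrative assumptions, not a prescribed format.

```python
# Illustrative data-inventory record for the steps above. Field names
# are assumptions; adapt the categories to what your organization holds.

from dataclasses import dataclass, field
from typing import List

@dataclass
class DataInventoryEntry:
    department: str                       # who collects/stores/processes it
    categories: List[str]                 # e.g., ["name", "email", "SSN"]
    purpose: str                          # how and why it is collected and used
    storage_location: str                 # where it is stored
    data_format: str                      # "paper" or "electronic"
    accessible_to: List[str] = field(default_factory=list)  # who has access
    still_needed: bool = True             # step 3: keep collecting it?

hr_payroll = DataInventoryEntry(
    department="HR",
    categories=["name", "social security number", "bank account number"],
    purpose="payroll processing",
    storage_location="payroll vendor system",
    data_format="electronic",
    accessible_to=["HR staff", "payroll vendor"],
)
```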
© Copyright 2018 Murtha Cullina

“Hey Alexa – Tell Me About Your Security Measures”

California continues to lead the nation in cybersecurity and privacy legislation on the heels of the recent California Consumer Privacy Act of 2018 (“CCPA”).  On September 28, 2018, Governor Brown signed into law two nearly identical bills, Assembly Bill No. 1906 and Senate Bill No. 327 (the “Legislation”), each of which required the signing of the other to become law.  Thus, California becomes the first state in the nation to regulate “connected devices” – the Internet of Things (IoT). The Legislation will go into effect in January 2020.

  1. CA IoT Bills Apply to Manufacturers of Connected Devices

This Legislation applies to manufacturers of connected devices sold or offered for sale in California.  A connected device is defined as any device with an Internet Protocol (IP) or Bluetooth address, and capable of connecting directly or indirectly to the Internet.  Beyond examples such as cell phones and laptops, numerous household devices, from appliances such as refrigerators and washing machines, televisions, and children’s toys, could all meet the definition of connected device.

  2. What Manufacturers of Connected Devices Must Do

Manufacturers must equip connected devices with reasonable security feature(s) that are “appropriate to the nature and function of the device, appropriate to the information it may collect, contain, or transmit, [and] designed to protect the device and any information contained therein from unauthorized access, destruction, use, modification, or disclosure.”

The Legislation provides some guidance as to what will be considered a reasonable security measure.  Devices that either ship with a preprogrammed password unique to each device manufactured, or provide a security feature that forces the user to generate a new means of authentication before access is granted, will be deemed to have implemented a reasonable security feature.  The use of a generic, default password will not suffice.
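For instance, a manufacturer choosing the unique-preprogrammed-password route might provision each unit along the lines of this hypothetical sketch; the provisioning flow is an assumption, not anything the statute prescribes.

```python
# Hypothetical provisioning sketch: assign each unit a unique default
# password rather than a shared factory default -- one way to satisfy
# the statute's "unique to each device manufactured" option.

import secrets

def provision_device(serial_number: str) -> dict:
    # Cryptographically strong and unique per unit; would typically be
    # printed on the device label and stored in the manufacturing record.
    return {"serial": serial_number,
            "default_password": secrets.token_urlsafe(12)}

print(provision_device("SN-000123"))
```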

Other than following this guidance, the Legislation does not provide specific methods of providing for reasonable security features.

  3. What Is Not Covered

a. Unaffiliated Third Party Software:  Many connected devices use multiple pieces of software to function.  The Legislation specifically states that “This title shall not be construed to impose any duty upon the manufacturer of a connected device related to unaffiliated third-party software or applications that a user chooses to add to a connected device.”

b. Companies That Provide Mechanisms To Sell Or Distribute Software: Application store owners, and others that provide a means of purchasing or downloading software or applications are not required to enforce compliance.

c. Devices or Functionality Already Regulated by Federal Authority: Connected Devices whose functionality is already covered by federal law, regulations or guidance of a federal agency need not comply.

d. Manufacturers Are Not Required To Lock Down Devices: Manufacturers are not required to prevent users from gaining full control of the device, including being able to load their own software at their own discretion.

  4. No Private Right of Action

No private right of action is provided; instead, the “Attorney General, a city attorney, a county counsel, or a district attorney shall have the exclusive authority to enforce this title.”

  5. Not Limited To Personal Information

Previously, other California legislation had required that data security measures be implemented.  For example, California’s overarching data security law (Cal. Civ. Code § 1798.81.5) requires reasonable data security measures to protect certain types of personal information.  The current approach is not tied to personal information, but rather applies to any connected device that meets the definition provided.

  6. Likely Consequences After The Legislation Comes Into Effect in January 2020

a. Impact Will Be National: Nearly all manufacturers will want to sell their devices in California.  As such, they will need to comply with this California Legislation, and unless they somehow segment which devices are offered for sale in the California market, they will effectively have to comply nationally.

b. While Physical Device Manufacturers Bear Initial Burden, Software Companies Will Be Affected: The Legislation applies to “any device, or other physical object that is capable of connecting to the Internet, directly or indirectly, and that is assigned an Internet Protocol address or Bluetooth address.”  While this puts the burden foremost on physical device manufacturers, software companies that provide software to device manufacturers for inclusion on the device before the device is offered for sale will need to support compliance with the Legislation.

c. Merger And Acquisition Events Will Serve As Private Enforcement Mechanisms: While there may not be a private right of action provided, whenever entities or portions of entities that are subject to the Legislation are bought and sold, the buyer will want to ensure compliance by the seller with the Legislation or otherwise ensure that the seller bears the risk or has compensated the buyer.  Effectively, this will mean that companies that want to be acquired will need to come into compliance or face a reduced sales price or a similar mechanism of risk shifting.

 

©1994-2018 Mintz, Levin, Cohn, Ferris, Glovsky and Popeo, P.C. All Rights Reserved.

Apple Imposes Privacy Policy Requirement for All Apps Operating on its Platform

As Apple recently reminded developers, starting on October 3, 2018, it will require all apps submitted for distribution through its app store, or for testing by its TestFlight service, to have a publicly posted privacy policy. This requirement was incorporated into Apple’s App Store Review Guidelines and will apply to all new apps, as well as all updated versions of existing apps. Previously, only those apps that collected user information had to have a privacy policy.

Apple’s previous requirements were consistent with a 2012 Joint Statement of Principles agreement that Apple and other app store platforms made with the California Attorney General. In that statement, the platforms agreed to require apps that collect information to conspicuously post a privacy policy telling consumers how their personal data was being collected, used, and shared. To encourage transparency of apps’ privacy practices, the platforms also agreed to allow app developers to link to their privacy policy directly from the store. Finally, the platforms agreed to create ways for consumers to notify them if an app was not living up to its policies, and to respond to such complaints.

The new Guidelines build on the principles established in 2012 and expand the privacy policy requirement to all apps, even utility apps that do not collect user information and apps still in the testing phase. Per the Guidelines, the policy will need to be included in the App Store Connect metadata field and as a link in the app itself. Without the policy, the app will not be reviewed and will not be made available on Apple’s platform.

Under the new Guidelines, an app’s privacy policy must still have a description of what data the app collects, how that data is collected, and how it is used. The policy must also notify users how long the app developer will keep the information it collects and how it will be deleted. The Guidelines also require the policy to inform users how they can revoke their consent (if applicable) for data collection and how to make a request to have their data be deleted. Finally, the policy will have to confirm that the app will follow Apple’s guidelines about sharing information with third parties, and that any third party that the information is sent to will be held to Apple’s data security guidelines. If the app’s privacy policy sets higher standards for data protection than Apple’s guidelines, the third party will have to also meet that benchmark.

Putting it Into Practice: This announcement is a reminder for companies to look at how they are sharing privacy practices with consumers across a variety of platforms, including mobile apps.

 

Copyright © 2018, Sheppard Mullin Richter & Hampton LLP.