Lessons in Becoming a Second-Rate Intellectual Power – Through Privacy Regulation!

The EU’s endless regulation of data usage has spilled over into academia, providing another lesson in kneecapping your own society by overregulating it. And they wonder why none of the big internet companies arose from the EU (or ever will). This time, the European data regulators seem to be doing everything they can to hamstring clinical trials and drive the research (and the resulting tens of billions of dollars of annual spend) outside the EU. That’s bad for pharma and biotech companies, but it’s also bad for universities that want to attract, retain, and teach top-notch talent.

The European Data Protection Board’s Opinion 3/2019 (the “Opinion”) fires an early and self-wounding shot in the coming war over the meaning and application of “informed consent” under the GDPR. The EU Board insists on defining “informed consent” in a manner that would cripple most serious health research on humans and human tissue that would otherwise take place in European hospitals and universities.

As discussed in a US law review article by former Microsoft Chief Privacy Counsel Mike Hintze, Science and Privacy: Data Protection Laws and Their Impact on Research (14 Washington Journal of Law, Technology & Arts 103 (2019)), and noted in a recent IAPP story from Hintze and Gary LaFever, both the strict interpretation of “informed consent” and the GDPR’s right to withdraw consent can cripple serious clinical trials. Further, according to LaFever and Hintze, researchers have raised concerns that “requirements to obtain consent for accessing data for research purposes can lead to inadequate sample sizes, delays and other costs that can interfere with efforts to produce timely and useful research results.”

A clinical researcher must have a “legal basis” to use personal information, especially health information, in trials.  One of the primary legal basis options is simply gaining permission from the test subject for data use.  Only this is not so simple.

On its face, the GDPR requires that clear affirmative consent for using personal data (including health data) be “freely given, specific, informed and unambiguous.” The Opinion clarifies that nearly all operations of a clinical trial – start to finish – are considered regulated transactions involving use of personal information, and that special “explicit consent” is required for use of health data. Explicit consent requirements can be satisfied by a written statement signed by the data subject.

That consent would need to include, among other things:

  • the purpose of each of the processing operations for which consent is sought,
  • what (type of) data will be collected and used, and
  • the existence of the right to withdraw consent.

The Opinion is clear that the EU Board authors believe clinical trials are characterized by an imbalance of power between the data subject and the sponsor of the trial, so that consent for use of personal data would likely be coercive and not “freely given.” This raises the specter that not only can the data subject pull out of a trial at any time (or insist that his/her data be removed upon completion of the trial), but EU privacy regulators are likely to simply cancel the right to use personal health data on the ground that the signatures could not have been freely given where the trial sponsor held an imbalance of power over the data subject. Imagine spending years and tens of millions of euros conducting clinical trials, only to have the results rendered meaningless because, with participants’ data withdrawn, the remaining sample size is insufficient.

Further, if the clinical trial operator does not get permission to use personal information for analytics, academic publication/presentation, or any other use of the trial results, then the trial operator cannot use the results in those ways. This means that either the trial sponsor insists on broad permissions to use clinical results for almost any purpose (which would raise the specter of coercive permissions), or the trial is hobbled by an inability to use data for opportunities that might arise later. All in all, using subject permission as the legal basis for use of personal data creates unnecessary problems for clinical trials.

That leaves the following legal bases for use of personal data in clinical trials:

  • a task carried out in the public interest under Article 6(1)(e) in conjunction with Article 9(2)(i) or (j) of the GDPR; or

  • the legitimate interests of the controller under Article 6(1)(f) in conjunction with Article 9(2)(j) of the GDPR.

Not every clinical trial will be able to establish it is being conducted in the public interest, especially where the trial doesn’t fall “within the mandate, missions and tasks vested in a public or private body by national law.”  Relying on this basis means that a trial could be challenged later as not supported by national law, and unless the researchers have legislators or regulators pass or promulgate a clear statement of support for the research, this basis is vulnerable to privacy regulators’ whims.

Further, as observed by Hintze and LaFever, relying on this legal basis “involves a balancing test between those legitimate interests pursued by the controller or by a third party and the risks to the interests or rights of the data subject.” So even the most controller-centric of legal supports can be reversed if the local privacy regulator feels that a legitimate use is outweighed by the interests of the data subject. I suppose the case of Henrietta Lacks, had it arisen in the present-day EU, would be a clear situation where a non-scientific regulator could squelch a clinical trial because the data subject’s privacy rights were considered more important than any trial using her genetic material.

So none of the “legal basis” options is either easy or guaranteed not to be reversed later, once millions in resources have been spent on the clinical trial. Further, as Hintze observes, “The GDPR also includes data minimization principles, including retention limitations which may be in tension with the idea that researchers need to gather and retain large volumes of data to conduct big data analytics tools and machine learning.” That means privacy regulators could step in, decide that a clinician has been too ambitious in her use of personal data in violation of data minimization rules, and shut down further use of the data for scientific purposes.

The regulators emphasize that “appropriate safeguards” will help protect clinical trials from interference, but I read such promises in the inverse. If a hacker gains access to data in a clinical trial, or if some of this data is accidentally emailed to the wrong people, or if one of the 50,000 laptops lost each day contains clinical research, then the regulators will pounce and attack the academic institution (rarely a paragon of cutting-edge data security) as demonstrating a lack of appropriate safeguards. Recent staggeringly high fines against Marriott and British Airways demonstrate the presumption of the ICO, at least, that an entity suffering a hack or otherwise losing data will be viciously punished.

If clinicians choosing where to site human trials knew about this all-encompassing privacy law and how it throws the very nature of their trials into suspicion and possible jeopardy, I can’t see why they would risk holding trials with residents of the European Economic Area. The uncertainty and risk involved now that aggressively intrusive privacy regulators have taken a specific interest in clinical trials may drive important academic work overseas. A data breach at a European university, or an academic enforcement action based on the rules cited above, will drive the risks home.

In that case, this particular European shot in the privacy wars is likely to end up pushing serious researchers out of Europe, to the detriment of academic and intellectual life in the Union.

Damaging friendly fire indeed.

 

Copyright © 2019 Womble Bond Dickinson (US) LLP All Rights Reserved.

Privacy Concerns Loom as Direct-to-Consumer Genetic Testing Industry Grows

The market for direct-to-consumer (“DTC”) genetic testing has grown dramatically over recent years as more people use at-home DNA tests. The global market for this industry is projected to hit $2.5 billion by 2024.  Many consumers turn to DTC genetic testing because these tests can provide insights into genetic background and ancestry.  However, as more consumers’ genetic data becomes available and is shared, legal experts are growing concerned that the safeguards implemented by U.S. companies are not enough to protect consumers from privacy risks.

States vary in how they regulate genetic testing.  According to the National Conference of State Legislatures, the majority of states have “taken steps to safeguard [genetic] information beyond the protections provided for other types of health information.”  Most states place some restrictions on what parties may do with genetic information without consent.  Rhode Island and Washington require that companies receive written authorization to disclose genetic information.  Alaska, Colorado, Florida, Georgia, and Louisiana have each defined genetic information as “personal property.”  Despite these safeguards, some of these laws still do not adequately address critical privacy and security issues relating to genomic data.

Many testing companies also share and sell genetic data to third parties – albeit in accordance with “take-it-or-leave-it” privacy policies.  This genetic data often contains highly sensitive information about a consumer’s identity and health, such as ancestry, personal traits, and disease propensity.

Further, despite promises made in privacy policies, companies cannot guarantee privacy or data protection.  While many companies share genetic data only with the consumer’s explicit consent, other companies have less strict safeguards. In some cases, companies share genetic data on a “de-identified” basis.  However, concerns remain about whether genetic data can be effectively de-identified.  Therefore, even when a company agrees to share only de-identified data, privacy concerns may persist because an emerging consensus holds that genetic data cannot truly be de-identified. For instance, some report that the powerful computing algorithms accessible to Big Data analysts make it very challenging to prevent data from being re-identified.

To complicate matters, patients have historically come to expect that their health information will be protected because the Health Insurance Portability and Accountability Act (“HIPAA”) governs most patient information. Given those expectations of privacy under HIPAA, many consumers assume that this information is maintained and stored securely.  Yet HIPAA does not typically govern the activities of DTC genetic testing companies – leaving consumers to agree to privacy and security protections buried in click-through privacy policies.  To protect genetic privacy, the Federal Trade Commission (“FTC”) has recommended that consumers hold off on purchasing a kit until they have scrutinized the company’s website and privacy practices regarding how genomic data is used, stored, and disclosed.

Although the regulation of DTC genetic testing companies remains uncertain, it is increasingly evident that consumers expect robust privacy and security controls.  As such, even in the absence of clear privacy or security regulations, DTC genetic testing companies should consider implementing robust privacy and security programs to manage these risks.  Companies should also approach data sharing with caution.  For further guidance, companies in this space may want to review the Privacy Best Practices for Consumer Genetic Testing Services issued by the Future of Privacy Forum in July 2018.  Further, the legal and regulatory privacy landscape is rapidly expanding and evolving, so DTC genetic testing companies and the consumers they serve should be watchful of changes to how genetic information may be collected, used, and shared over time.

 

©2019 Epstein Becker & Green, P.C. All rights reserved.
This article written by Brian Hedgeman and Alaap B. Shah from Epstein Becker & Green, P.C.

FTC Settlement with Video Social Networking App Largest Civil Penalty in a Children’s Privacy Case

The Federal Trade Commission (FTC) announced a settlement with Musical.ly, a Cayman Islands corporation with its principal place of business in Shanghai, China, resolving allegations that the defendants violated the Children’s Online Privacy Protection Act (COPPA) Rule.

Musical.ly operates a video social networking app with 200 million users worldwide and 65 million in the United States. The app provides a platform for users to create short videos of themselves or others lip-syncing to music and share those videos with other users. The app also provides a platform for users to connect and interact with other users, and until October 2016 had a feature that allowed a user to tap on the “my city” tab and receive a list of other users within a 50-mile radius.

According to the complaint, the defendants (1) were aware that a significant percentage of users were younger than 13 years of age and (2) had received thousands of complaints from parents that their children under 13 had created Musical.ly accounts.

The FTC’s COPPA Rule prohibits the unauthorized or unnecessary collection of children’s personal information online by internet website operators and online services, and requires that verifiable parental consent be obtained prior to collecting, using, and/or disclosing the personal information of children under the age of 13.

In addition to requiring the payment of the largest civil penalty ever imposed for a COPPA case ($5.7 million), the consent decree prohibits the defendants from violating the COPPA Rule and requires that they delete and destroy all of the personal information of children in their possession, custody, or control unless verifiable parental consent has been obtained.

FTC Commissioners Chopra and Slaughter issued a joint statement noting their belief that the FTC should prioritize uncovering the role of corporate officers and directors and hold accountable everyone who broke the law.

 

©2019 Drinker Biddle & Reath LLP. All Rights Reserved

CCPA Part 2 – What Does Your Business Need to Know? Consumer Requests and Notice to Consumers of Personal Information Collected

This week we continue our series of articles on the California Consumer Privacy Act of 2018 (CCPA). We’ve been discussing the broad nature of this privacy law and answering some general questions: What is it? Who does it apply to? What protections are included for consumers? How does it affect businesses? What rights do consumers have regarding their personal information? What happens if there is a violation? This series is a follow-up to our earlier post on the CCPA.

In Part 1 of this series, we discussed the purpose of the CCPA, the types of businesses impacted, and the rights of consumers regarding their personal information. This week we’ll review consumer requests and businesses’ obligations regarding data collection, the categories and specific pieces of personal information the business has collected, and how the categories of personal information shall be used.

We begin with two questions regarding data collection:

  • What notice does a business need to provide to the consumer to tell a consumer what personal information it collects?
  • What is a business required to do if that consumer makes a verified request to disclose the categories and specific pieces of personal information the business has collected?

First, the CCPA requires businesses to notify a consumer, at or before the point of collection, as to the categories of personal information to be collected and the purposes for which the categories of personal information shall be used. A business shall not collect additional categories of personal information or use personal information collected for additional purposes without providing the consumer with notice consistent with this section. Cal. Civ. Code §1798.100.

Second, under the CCPA, businesses shall, upon request of the consumer, be required to inform consumers as to the categories of personal information to be collected and the purposes for which the categories of personal information shall be used. The CCPA states that “a business that receives a verifiable consumer request from a consumer to access personal information shall promptly take steps to disclose and deliver, free of charge to the consumer, the personal information required by this section. The information may be delivered by mail or electronically, and if provided electronically, the information shall be in a portable and, to the extent technically feasible, in a readily useable format that allows the consumer to transmit this information to another entity without hindrance. A business may provide personal information to a consumer at any time, but shall not be required to provide personal information to a consumer more than twice in a 12-month period.” Section 1798.100 (d).

Section 1798.130(a) states that to comply with the law, a business shall, in a form that is reasonably accessible to consumers, (1) make available to consumers two or more designated methods for submitting requests for information required to be disclosed, including, at a minimum, a toll-free telephone number and, if the business maintains an Internet web site, a web site address; and (2) disclose and deliver the required information to a consumer free of charge within forty-five (45) days of receiving a verifiable request from the consumer.

Many have suggested during the rule-making process that there should be an easy-to-follow, standardized process for consumers to make their requests, so that it is clear to both consumers and businesses that a consumer has made a verified request. Such a standard would be welcome, as it would make this aspect of compliance simpler for the consumer as well as the business.

When businesses respond to consumers’ requests, several measures will help: a clear website privacy policy that explains the types of information collected, a documented process for consumers to make a verified request, a protocol for responding to consumer requests, audit logs of consumer requests and business responses, a dedicated website link, and clear and understandable language in privacy notices. Together, these help the business respond to consumers and document its responses, as sketched below.
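As a rough illustration only – the class and field names below are hypothetical, and nothing here is legal advice – the following sketch shows one way a business might log consumer requests so that the 45-day response window of Section 1798.130(a) and the twice-in-12-months delivery limit of Section 1798.100(d) can be tracked:

```python
from dataclasses import dataclass, field
from datetime import date, timedelta
from typing import List, Optional

RESPONSE_WINDOW_DAYS = 45        # Section 1798.130(a): respond within 45 days
MAX_DELIVERIES_PER_YEAR = 2      # Section 1798.100(d): no more than twice in a 12-month period


@dataclass
class ConsumerRequest:
    consumer_id: str                  # hypothetical internal identifier
    received: date
    verified: bool = False            # set to True once identity verification succeeds
    responded: Optional[date] = None  # date the disclosure was delivered, if any

    @property
    def response_due(self) -> date:
        # Deadline for disclosing and delivering the required information.
        return self.received + timedelta(days=RESPONSE_WINDOW_DAYS)


@dataclass
class RequestLog:
    requests: List[ConsumerRequest] = field(default_factory=list)

    def record(self, request: ConsumerRequest) -> None:
        self.requests.append(request)

    def deliveries_in_last_year(self, consumer_id: str, today: date) -> int:
        cutoff = today - timedelta(days=365)
        return sum(
            1
            for r in self.requests
            if r.consumer_id == consumer_id and r.responded and r.responded >= cutoff
        )

    def must_respond(self, request: ConsumerRequest, today: date) -> bool:
        # Only a verified request triggers the disclosure duty, and the statute does not
        # require more than two deliveries to the same consumer in a 12-month period.
        return (
            request.verified
            and request.responded is None
            and self.deliveries_in_last_year(request.consumer_id, today) < MAX_DELIVERIES_PER_YEAR
        )
```

A real program would also need to handle identity verification itself, the free-of-charge and portable-format delivery requirements, and record retention; the sketch only illustrates deadline and frequency tracking.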

As we continue to explore the CCPA and its provisions, we strive to understand the law and translate the rights conferred by the law into business operations, processes and practices to ensure compliance with the law. In the coming weeks, we’ll focus on understanding more of these provisions and the challenges they present.

 

Copyright © 2019 Robinson & Cole LLP. All rights reserved.
This post was written by Deborah A. George of Robinson & Cole LLP.

New Washington State Privacy Bill Incorporates Some GDPR Concepts

A new bill, titled the “Washington Privacy Act,” was introduced in the Washington State Senate on January 18, 2019. If enacted, Washington would follow California to become the second state to adopt a comprehensive privacy law.

Similar to the California Consumer Privacy Act (CCPA), the Washington bill applies to entities that conduct business in the state or produce products or services that are intentionally targeted to residents of Washington, and it includes similar, though not identical, size triggers. For example, it would apply to businesses that 1) control or process data of 100,000 or more consumers; or 2) derive 50 percent or more of gross revenue from the sale of personal information and process or control personal information of 25,000 or more consumers. The bill would not apply to certain data sets regulated by some federal laws, to employment records, or to state or local governments. A rough sketch of how the size triggers combine appears below.
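Purely as an illustration of how the bill’s two size triggers combine – the function and parameter names are hypothetical, and the bill’s own definitions of “consumer,” “process,” and “sale” would control – the applicability test might be expressed as:

```python
def washington_privacy_act_applies(
    targets_washington: bool,         # conducts business in WA or intentionally targets WA residents
    consumers_processed: int,         # consumers whose personal data the entity controls or processes
    revenue_share_from_sales: float,  # fraction of gross revenue derived from selling personal data
) -> bool:
    """Rough sketch of the bill's size triggers as summarized above (not legal advice)."""
    if not targets_washington:
        return False
    large_processor = consumers_processed >= 100_000
    data_seller = revenue_share_from_sales >= 0.50 and consumers_processed >= 25_000
    return large_processor or data_seller
```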

The bill incorporates aspects of the EU’s General Data Protection Regulation (GDPR), borrowing the GDPR’s “controller”/“processor” lexicon and identifying obligations for each role. It defines personal data as any information relating to an identified or identifiable natural person, but excludes de-identified data. Similar to the GDPR, it treats certain types of sensitive information differently. Unlike the CCPA, the bill excludes from the definition of “consumer” employees and contractors acting in the scope of their employment. Additionally, the definition of “sale” is narrower, limited to the exchange of personal data to a third party “for purposes of licensing or selling personal data at the third party’s discretion to additional third parties,” and excluding any exchange that is “consistent with a consumer’s reasonable expectations considering the context in which the consumer provided the personal data to the controller.”

Another GDPR-like element of the bill requires businesses to conduct and document comprehensive risk assessments annually and whenever their data processing procedures materially change. In addition, the bill would impose notice requirements when engaging in profiling and would prohibit decision-making based solely on profiling.

Consumer rights 

Similar to both the GDPR and the CCPA, the bill outlines specific consumer rights.  Specifically, upon request from the consumer, a controller must:

  • Confirm if a consumer’s personal data is being processed and provide access to such data.
  • Correct inaccurate consumer data.
  • Delete the consumer’s personal data if certain grounds apply, such as in cases where the data is no longer necessary for the purpose for which it was collected.
  • Restrict the processing of such information if certain grounds apply, including the right to object to the processing of personal data related to direct marketing. If the consumer objects to processing for any purpose other than direct marketing, the controller may continue processing the personal data if the controller can demonstrate a compelling legitimate ground to process such data.

If a controller sells personal data to data brokers or processes personal data for direct marketing purposes, it must disclose such processing as well as how a consumer may exercise the right to object to such processing.

The bill specifically addresses the use of facial recognition technologies. It requires controllers that use facial recognition for profiling purposes to employ meaningful human review prior to making final decisions and obtain consumer consent prior to deploying facial recognition services. State and local government agencies are prohibited from using facial recognition technology to engage in ongoing surveillance of specified individuals in public spaces, absent a court order or in the case of an emergency.

The Washington State Attorney General would enforce the act and would have the authority to obtain not more than $2,500 for each violation or $7,500 for each intentional violation. There is no private right of action.

The Washington Senate Committee on Environment, Energy & Technology held a public hearing on January 22, 2019 to solicit public opinions on this proposed legislation. At the beginning of the public hearing, the Chief Privacy Officer of Washington, Alex Alben, commented that the proposed legislation would be just in time to address a “point of crisis [when] our economy has shifted into a data-driven economy” in the absence of federal legislation regarding data security and privacy protection.

Industry reaction to the bill

Companies and industry groups with an interest in this process applauded the proposed legislation as good news for entities that have become, or are on their way to becoming, compliant with the GDPR. Many also shared suggestions or criticisms. Among others, some speakers cautioned that by setting a high standard closely resembling the GDPR, the bill might drive small- or medium-sized companies to block Washington customers, just as some have done in the past to avoid compliance with the GDPR.

Some representatives, including the Chief of the Consumer Protection Division of the Washington Attorney General’s Office, called for a private cause of action so that this law would mean more to a private citizen than simply “a click on the banner.” The retail industry, the land title association, and other small-business representatives expressed their preference for legislation on a federal level and a higher threshold for applicable businesses. Specifically, Stuart Halsan of the Washington Land Title Association recommended that the Washington Senate consider the bill’s impact on industries, such as land title insurance, where the number of customers is significantly lower than the volume of data processed in the ordinary course of business.

In response to these industry concerns, the committee acknowledged that the new legislation would need to apply proportionately to businesses of different sizes and technological capabilities. The committee also recognized the need to make the legislation more administratively feasible for industries or entities that face difficulty complying (such as the secondary ticketing market) or that are subject to complicated regulatory frameworks (such as the banking industry). The Washington Senate continues to invite individuals, companies, and industry groups to submit brief written comments.

 

©2019 Drinker Biddle & Reath LLP. All Rights Reserved

Google Fined $57 Million in First Major Enforcement of GDPR Against a US-based Company

On January 21, 2019, Google was fined nearly $57 million (approximately 50 million euros) by France’s Data Protection Authority, CNIL, for an alleged violation of the General Data Protection Regulation (GDPR).[1] CNIL found Google violated the GDPR based on a lack of transparency, inadequate information, and lack of valid consent regarding ad personalization. This fine is the largest imposed under the GDPR since it went into effect in May 2018 and the first to be imposed on a U.S.-based company.

CNIL began investigating Google’s practices based on complaints received from two GDPR consumer privacy rights organizations alleging Google did not have a valid legal basis to process the personal data of the users of its services, particularly for Google’s personalized advertisement purposes. The first of the complaints was filed on May 25, 2018, the effective date of the GDPR.

Following its investigation, CNIL found the general structure of the information required to be disclosed by Google relating to its processing of users’ information was “excessively disseminated across several documents.” CNIL stated the relevant information pertaining to privacy rights was only available after several steps, which sometimes required up to five or six actions. Moreover, CNIL indicated users were not able to fully understand the extent of the processing operations carried out by Google because the operations were described in a “too generic and vague manner.” Additionally, the regulator determined information regarding the retention period was not provided for some data collected by Google.

Google’s process for obtaining user consent to data collection for advertisement personalization was also alleged to be problematic under the GDPR. CNIL stated Google users’ consent was not considered to be sufficiently informed due to the information on processing operations for advertisement being spread across several documents. The consent obtained by Google was not deemed to be specific to any individual Google service, and CNIL determined it was impossible for the user to be aware of the extent of the data processed and combined.

Finally, CNIL determined the user consent captured by Google was not “specific” or “unambiguous” as those terms are defined by the GDPR. By way of example, CNIL noted that Google’s users were asked to click the boxes “I agree to Google’s Terms of Service” and “I agree to the processing of my information as described above and further explained in the Privacy Policy” in order to create an account. As a result, the user was required to give consent in full for all processing purposes carried out by Google on the basis of that consent, rather than for distinct purposes, as required under the GDPR. Additionally, CNIL commented that Google’s checkbox used to capture user consent relating to ad personalization was “pre-clicked.” The GDPR requires consent to be “unambiguous,” with clear affirmative action from the user, which, according to CNIL, requires clicking a box that is not pre-checked.

Google may appeal the fine; it has indicated it remains committed to meeting the “high standards of transparency and control” expected by its users and to complying with the consent requirements of the GDPR, and that it will study the decision to determine next steps. Given that Google is the first U.S.-based company against which a DPA has attempted GDPR enforcement, combined with the size of the fine imposed, it will be interesting to watch how Google responds.

The GDPR enforcement action against Google should be seen as a message to all U.S.-based organizations that collect the data of citizens of the European Union. Companies should review their privacy policies, practices, and end-user agreements to ensure they are compliant with the consent requirements of the GDPR.


© 2019 Dinsmore & Shohl LLP. All rights reserved.
This post was written by Matthew S. Arend and Jared M. Bruce of Dinsmore & Shohl LLP.

Privacy Legislation Proposed in New York

The prevailing wisdom after last year’s enactment of the California Consumer Privacy Act (CCPA) was that it would lead other states to enact consumer privacy legislation. The perceived inevitability of a “50-state solution to privacy” motivated businesses previously opposed to federal privacy legislation to push for its enactment. With state legislatures now convening, we have identified what could be the first such proposal: New York Senate Bill 224.

The proposed legislation is not nearly as extensive as the CCPA and is perhaps more analogous to California’s Shine the Light Law. The proposed legislation would require a “business that retains a customer’s personal information [to] make available to the customer free of charge access to, or copies of, all of the customer’s personal information retained by the business.” It also would require businesses that disclose customer personal information to third parties to disclose certain information to customers about the third parties and the personal information that is shared. Businesses would have to provide this information within 30 days of a customer request and for a twelve-month lookback period. The rights also would have to be disclosed in online privacy notices. Notably, the bill would create a private right of action for violations of its provisions.

We will continue to monitor this legislation and any other proposed legislation.

Copyright © by Ballard Spahr LLP.

This post was written by David M. Stauss of Ballard Spahr LLP.

Fourth Circuit Expands Title IX Liability for Harassment Through Anonymous Online Posts

The Fourth Circuit recently held that universities could be liable for Title IX violations if they fail to adequately respond to harassment that occurs through anonymous-messaging apps.

The case, Feminist Majority Foundation v. Hurley, concerned messages sent through the now-defunct app Yik Yak to the individual plaintiffs, who were students at the University of Mary Washington. Yik Yak was a messaging app that allowed users to anonymously post to discussion threads.

Because of the app’s location feature, which allowed users to see posts within a 5-mile radius, the Court concluded that the University had substantial control over the context of the harassment because the threatening messages originated on or within the immediate vicinity of campus. Additionally, some of the posts at issue were made using the University’s wireless network and thus necessarily originated on campus.

The Court rejected the University’s argument that it was unable to control the harassers because the posts were anonymous. It held that the University could be liable if it never sought to discern whether it could identify the harassers.

The dissent encouraged the University to appeal the decision stating that “the majority’s novel and unsupported decision will have a profound effect, particularly on institutions of higher education . . .  Institutions, like the university, will be compelled to venture into an ethereal world of non-university forums at great cost and significant liability, in order to avoid the Catch-22 Title IX liability the majority now proclaims. The university should not hesitate to seek further review.”

 

Copyright © 2019 Robinson & Cole LLP. All rights reserved.
This post was written by Kathleen E. Dion of Robinson & Cole LLP.

Fake Apps Find Their Way to Google Play!

Over the last two months, a string of fake banking apps has hit the Google Play store, leaving many customers wondering whether they have been affected by the scam. A report by security firm ESET found that users of three Indian banks were targeted by the apps, which all claimed to increase credit card limits, only to convince customers to divulge their personal data, including credit card and internet banking details. The impact of the scam was heightened because the data stolen from unsuspecting customers was then leaked online by way of an exposed server.

The report claims these apps all utilise the same process:

  1. Once the app is downloaded and launched, a form appears asking the user to fill in credit card details (including credit card number, expiry date, CVV and login credentials);
  2. Once the form is completed and submitted, a pop-up customer service box is displayed;
  3. The pop-up box thanks users for their interest in the bank and indicates a ‘Customer Service Executive’ will be in contact shortly; and
  4. In the meantime, no representative makes contact with the customer, and the data entered into the form is sent back to the attacker’s server – IN PLAIN TEXT.

Alarmingly, the ESET report revealed that the listing of stolen data on the attacker’s server is accessible to anyone with the link, meaning the sensitive stolen personal data was available to absolutely anyone who happened to come across it.

Whilst the reality is that any app on your personal smartphone may place your phone and personal data at risk (as discussed here: ‘Research Reports say risks to smartphone security aren’t phoney’), customers can mitigate risk by:

  • only using their financial institution’s official banking apps, which are downloadable from the relevant institution’s official website;
  • paying attention to ratings and customer reviews when downloading from Google Play;
  • implementing security controls on their smartphone device from a reputable mobile security provider; and
  • contacting their financial institution directly to seek further guidance on the particular banking apps in use.

It cannot be overlooked that, whilst Google Play moved quickly to remove the apps, we query how it was so easy for cyber criminals to launch fake apps on Google Play in the first place.

Copyright 2018 K & L Gates.

This post was written by Cameron Abbott  and Jessica McIntosh of K & L Gates.


California’s Turn: California Consumer Privacy Act of 2018 Enhances Privacy Protections and Control for Consumers

On Friday, June 29, 2018, California passed comprehensive privacy legislation, the California Consumer Privacy Act of 2018.  The legislation is among the most progressive privacy legislation in the United States, drawing comparisons to the European Union’s General Data Protection Regulation, or GDPR, which went into effect on May 25, 2018.  Karen Schuler, leader of BDO’s National Data & Information Governance practice and a former forensic investigator for the SEC, provides some insight into this legislation, how it compares to the EU’s GDPR, and how businesses can navigate the complexities of today’s privacy regulatory landscape.

California Consumer Privacy Act 2018

The California Consumer Privacy Act of 2018 was passed by both the California Senate and Assembly and quickly signed into law by Governor Brown, hours before a deadline to withdraw a voter-led initiative that could have put into place even stricter privacy regulations for businesses.  The legislation will have a tremendous impact on the privacy landscape in the United States and beyond: it gives consumers much more control over their information, expands the definition of personal information, and allows consumers to control whether companies sell or share their data.  The law goes into effect on January 1, 2020. You can read more about the California Consumer Privacy Act of 2018 here.

California Privacy Legislation v. GDPR

In many ways, the California law has similarities to the GDPR; however, there are notable differences, and there are ways in which the California legislation goes even further.

As Schuler points out:

“the theme that resonates throughout both GDPR and the California Consumer Privacy Act is to limit or prevent harm to its residents. . . both seem to be keenly focused on lawful processing of data, as well as knowing where your personal information goes and ensuring that companies protect data accordingly.”

One way California goes a bit further is in the ability of consumers to prevent a company from selling or otherwise sharing their information.  Schuler says, “California has proposed that if a consumer chooses not to have their information sold, then the company must respect that.” While the GDPR provides data protections for consumers and gives them rights to modify, delete, and access their information, there is no precedent under the GDPR for stopping a company from selling consumer data if the company has a legal basis to do so.

In terms of the compliance burden, Schuler hypothesizes that companies that are in good shape on GDPR compliance might have a bit of a head start in complying with the California legislation; however, there is still a lot of work to do before the law goes into effect on January 1, 2020.  Schuler says, “There are also different descriptions of personal data between regulations like HIPAA, PCI, GDPR and others that may require – under this law – companies to look at their categorizations of data. For some organizations this is an extremely large undertaking.”

Compliance with Privacy Regulations: No Short-Cuts

With these stricter regulations coming into play, companies are in a place where understanding data flows is of primary importance. In many ways, GDPR compliance was a wake-up call to the complexities of data privacy issues within companies.  Schuler says, “Ultimately, we have found that companies are making good strides against becoming GDPR compliant, but that they may have waited too long and underestimated the level of effort it takes to institute a strong privacy or GDPR governance program.”  On how companies achieve compliance with whatever regulation they are trying to understand and implement, Schuler says, “It is critical companies understand where data exists, who stores it, who has access to it, how it’s categorized and protected.” Additionally, across industries companies are moving toward a culture of mindfulness around privacy and data security issues, a lengthy process that can require extensive training and buy-in from all levels of the company.

While the United States still has a patchwork of privacy regulations, including breach notification statutes, this California legislation could be a game-changer.  What is clear is that companies will need to contend with privacy legislation and consumer protections. Understanding the data flows in an organization is crucial to compliance, and it turns out GDPR may have just been the beginning.

This post was written by Eilene Spear.

Copyright ©2018 National Law Forum, LLC.