Survey Says: Fortune 500 Disclosing Cyber Risks


Ever since our 2013 prediction, an ever-increasing number of public companies have been adding disclosure related to cybersecurity and data breach risks to their public filings. We previously analyzed how the nation's largest banks have begun disclosing their cybersecurity risks. Now, it appears that the rest of the Fortune 500 companies are catching on and including some level of disclosure of their cyber risks in response to the 2011 SEC Guidance.

The recently published Willis Fortune 500 Cyber Disclosure Report, 2013 (the "Report") analyzes cybersecurity disclosure by Fortune 500 public companies. The Report found that as of April 2013, 85% of Fortune 500 companies were following the SEC guidance and providing some level of disclosure regarding cyber exposures. Interestingly, though, only 36% of Fortune 500 companies described the risk as "material," "serious," or with a similar term, and only 2% used a stronger term, such as "critical."

Following the SEC's recommendation in its guidance, 95% of the disclosing companies mentioned specific cyber risks that they face. The top three cyber risks identified by those companies were:

1) Loss or theft of confidential information (65%).

2) Loss of reputation (50%).

3) Direct loss from malicious acts (hackers, viruses, etc.) (48%).

Surprisingly, 15% of Fortune 500 companies indicated that they did not have the resources to protect themselves against critical attacks, and only 52% referred to technical solutions they have in place to defend against cyber risks.

The Report notes that despite the large number of Fortune 500 companies that acknowledge cyber risks in their disclosure, only 6% mentioned that they purchase insurance to cover cyber risks.  This number runs contrary to a survey published by the Chubb Group of Insurance Companies in which Chubb indicates that about 36% of public companies purchase cyber risk insurance.  For whatever reason, it appears that many of the Fortune 500 companies are simply not disclosing that they purchase cyber risk insurance as a means of protecting against cyber risk.

Almost two years after its issuance, the Report's findings indicate that the 2011 SEC Guidance is taking hold in practice. As more large companies disclose cyber risks in their public filings, the practice will continue to trickle down to smaller companies that rely on those filings for precedent and guidance. The Report provides a clear snapshot of where things stand in cyber risk disclosure by Fortune 500 public companies. The scope of the Report is expected to expand to include Fortune 1000 companies, and it will be interesting to see how this data changes, if at all, when drawn from a larger pool of public companies.

Stay tuned!


Basic Guidelines for Protecting Company Trade Secrets


Under the Uniform Trade Secrets Act (UTSA), "trade secrets" are generally defined as confidential proprietary information that provides a competitive advantage or economic benefit. Trade secrets are protected under the Economic Espionage Act of 1996 (EEA) at the federal level, and the vast majority of states have enacted statutes modeled after the UTSA. (Note that some jurisdictions, such as California, Texas, and Illinois, have adopted trade secret laws that differ substantially from the UTSA; thus, businesses should research the laws of the relevant jurisdiction(s).) Under the UTSA, to be protectable as a trade secret, information must meet three requirements:

i. the information must fall within the statutory definition of “information” eligible for protection;

ii. the information must derive independent economic value from not being generally known to, or readily ascertainable through proper means by, others; and

iii. the information must be the subject of reasonable efforts to maintain its secrecy.

Trade secret theft continues to accelerate among U.S. companies, and can have drastic consequences. To combat this threat, Congress and certain state legislatures have recently enacted legislation to broaden trade secret protection. As a result, it is paramount that companies safeguard all proprietary information that may qualify as protectable trade secrets. This blog post explains some key trade secrets concepts, and offers pointers on how to identify and protect trade secrets.

(1) Determine Which Data Constitutes “Information”

The UTSA-type statutes generally define “information” to include:

Financial, business, scientific, technical, economic, and engineering information;

Computer code, plans, compilations, formulas, designs, prototypes, techniques, processes, or procedures; and

Information that has commercial value, such as customer lists or the results of expensive research.

Courts have similarly interpreted "information" to cover virtually any commercially valuable information. Examples of information found to constitute trade secrets include pricing and marketing techniques, customer and financial information, sources of supply, manufacturing processes, and product designs.

(2) “Valuable” and “Not Readily Ascertainable” Information

To be protectable, information must also have “economic value” and not be “readily ascertainable” by others. Courts generally determine whether information satisfies this standard by considering the following factors:

Reasonable measures have been put in place to protect the information from disclosure;

The information has actual or potential commercial value to a company;

The information is known by a limited number of people on a need-to-know basis;

The information would be useful to competitors and would require a significant investment to duplicate or acquire the information; and

The information is not generally known to the public.

(3) Take Reasonable Measures to Maintain Secrecy

Businesses should implement technical, administrative, contractual and physical safeguards to keep secret the information sought to be protected. Companies should identify foreseeable threats to the security of confidential information; assess the likelihood of potential harm flowing from such threats; and implement security protocols to address those threats. Examples of security measures might include:

  • restricting access to confidential information on a need-to-know basis and employing computer access restrictions such as passwords and network firewalls;
  • tracking all access to network resources and confidential information, and restricting the ability to email, print or otherwise transfer confidential information;
  • encrypting confidential information;
  • circulating an employee handbook that outlines company policies governing confidential information;
  • conducting entrance interviews for new hires to determine whether they are subject to restrictive covenants with former employers, and exit interviews with departing personnel to ensure that the employee has returned all company materials and agrees to abide by post-employment obligations; and
  • employing security personnel, limiting visitor access, establishing surveillance procedures, and limiting physical access to areas that may contain confidential information.
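As a purely illustrative sketch of the first two technical measures above (the law does not prescribe any particular technology, and the roster, file names and log destination below are hypothetical), the following Python fragment shows one way a company might enforce need-to-know access to files tagged as confidential while keeping the access log that is often cited as evidence of "reasonable efforts":

```python
# Illustrative only: a minimal need-to-know access check with audit logging.
# The roster, file names, and log file are hypothetical examples, not a
# prescribed or legally sufficient control set.
import logging
from datetime import datetime, timezone

logging.basicConfig(filename="confidential_access.log", level=logging.INFO)

# Hypothetical roster mapping confidential resources to the employees
# who have a documented need to know them.
ACCESS_ROSTER = {
    "pricing_model_2013.xlsx": {"jdoe", "mchen"},
    "customer_list_east.csv": {"jdoe"},
}

def request_access(user: str, resource: str) -> bool:
    """Grant access only on a need-to-know basis and record every attempt."""
    allowed = user in ACCESS_ROSTER.get(resource, set())
    logging.info(
        "%s access=%s user=%s resource=%s",
        datetime.now(timezone.utc).isoformat(),
        "GRANTED" if allowed else "DENIED",
        user,
        resource,
    )
    return allowed

if __name__ == "__main__":
    print(request_access("jdoe", "customer_list_east.csv"))      # True
    print(request_access("intern1", "pricing_model_2013.xlsx"))  # False
```

The point of the sketch is simply that both the restriction and the record of who asked for what, and when, are part of what a court may weigh when deciding whether secrecy efforts were reasonable.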

Conclusion

This blog post is intended to provide some broad guidelines to identifying and protecting company trade secrets. Most if not all companies have confidential information that may be protectable as a trade secret. But certain precautions need to be in place to ensure that the information is protectable. Because each company and situation is different, you should seek advice about your specific circumstances.


China’s First-Ever National Standard on Data Privacy – Best Practices for Companies in China on Managing Data Privacy


Companies doing business in China should take careful notice that China is now paying closer attention to the protection of personal data. This would be an opportune time for private companies to review their existing data collection and management practices internally, determine whether those practices fall within the new guidelines, and, where necessary, develop and implement new internal data privacy practices.

The Information Security Technology-Guide for Personal Information Protection within Public and Commercial Systems ("Guidelines"), China's first-ever national standard for personal data privacy protection, came into effect on February 1, 2013. The Guidelines, while not legally binding, are just what they purport to be: guidelines, and some commentators view them as technical guidelines. They should not be taken lightly, however, as they may be a precursor of new legislation to come. China is not quite ready to issue new binding legislation, but there are indications it seeks to develop consistency with other internationally accepted practices, especially following recent data legislation enacted in the region by neighboring Hong Kong and other Asian countries.

What should companies look for when examining existing data privacy and collection policy and practices? As the Guidelines provide for rules on collecting, handling, transferring and deleting personal information, these areas of a company’s current policies should be reviewed.

“Personal Information”

What personal information is subject to the Guidelines? The Guidelines define “personal information” as “computer data that may be processed by an information system, relevant to a certain natural person, and that may be used solely or along with other information to identify such natural person.”

“General” and “Sensitive” Personal Information

The Guidelines distinguish between the handling of "general" and "sensitive" personal information. Sensitive personal information is defined as "information the leakage of which will cause adverse consequences to the subject individual," e.g., an individual's identity card information, religious views or fingerprints.

Consent Required

If an individual's personal information is being collected, that individual should be informed of the purpose and the scope of the data being collected, and tacit consent must be obtained, meaning the individual does not object after being adequately informed. Where "sensitive" personal information is being collected, a higher level of consent must be obtained prior to collection and use: the individual must provide express consent, and evidence of that consent must be retained.

Notice

Best practices dictate that a well-informed notice be given to the individual prior to the collection of any personal information. The notice should clearly spell out, among other items, what information is being collected, the purpose for which the information will be used, the method of collection, the parties to whom the personal information will be disclosed, and the retention period.

Cross Border Transfer

The Guidelines further restrict the transfer of personal information to any organization outside of P.R. China except where the individual provides consent, the government authorizes the transfer, or the transfer is required by law. It is unclear which law applies where a transfer is "required by law": PRC law or the law of another country.

Notification of Breach

There is also a notification requirement. The individual must be notified if personal information is lost, altered or divulged. If the breach incident is material, the "personal information protection administration authority" must be notified as well. The Guidelines, however, do not define or otherwise make clear who this administration authority is.

Retention and Deletion

Best practice for a company is to minimize the amount of personal information collected. Once personal information has been used to achieve its intended purpose, it should not be stored and maintained but should be deleted immediately.

The Guidelines may not be binding authority, but at a minimum they set certain standards for the collection, transfer and management of personal information. Especially for companies operating in China, the Guidelines are a call to action and for the implementation of best practices relating to data privacy. Companies should take this opportunity to assess their data privacy and security policies, review and revise customer information intake procedures and documentation, and develop and implement clear, company-wide internal data privacy policies and methods.


New Cybersecurity Guidance Released by the National Institute of Standards and Technology: What You Need to Know for Your Business


The National Institute of Standards and Technology ("NIST")1 has released the fourth revision of its standard-setting computer security guide, Special Publication 800-53, titled Security and Privacy Controls for Federal Information Systems and Organizations2 ("SP 800-53 Revision 4"), and this marks a very important release in the world of data privacy controls and standards. First published in 2005, SP 800-53 is the catalog of security controls used by federal agencies and federal contractors in their cybersecurity and information risk management programs. Developed by NIST, the Department of Defense, the Intelligence Community, and the Committee on National Security Systems as part of the Joint Task Force Transformation Initiative Interagency Working Group3 over a period of several years, with input collected from industry, Revision 4 "is the most comprehensive update to the security controls catalog since the document's inception in 2005."4

Taking "a more holistic approach to information security and risk management,"5 the new revision of SP 800-53 also includes, for the first time, a catalog of privacy controls (the "Privacy Controls") and offers guidance in the selection, implementation, assessment, and ongoing monitoring of the privacy controls for federal information systems, programs, and organizations (the "Privacy Appendix").6 The Privacy Controls are a structured set of standardized administrative, technical, and physical safeguards, based on best practices, for the protection of the privacy of personally identifiable information ("PII")7 in both paper and electronic form during the entire life cycle8 of the PII, in accordance with federal privacy legislation, policies, directives, regulations, guidelines, and best practices.9 The Privacy Controls can also be used by organizations that do not collect and use PII, but otherwise engage in activities that raise privacy risk, to analyze and, if necessary, mitigate such risk.

Description of the Eight Families of Privacy Controls

The Privacy Appendix catalogs eight privacy control families, based on the widely accepted Fair Information Practice Principles (FIPPs)10 embodied in the Privacy Act of 1974, Section 208 of the E-Government Act of 2002, and policies of the Office of Management and Budget (OMB). Each of the following eight privacy control families aligns with one of the eight FIPPs:

  1. Authority and Purpose. This family of controls ensures that an organization (i) identifies the legal authority for its collection of PII or for engaging in other activities that impact privacy, and (ii) describes the purpose of PII collection in its privacy notice(s).
  2. Accountability, Audit, and Risk Management. This family of controls ensures that an organization (i) develops and implements a comprehensive governance and privacy program; (ii) documents and implements a privacy risk management process that assesses privacy risk to individuals resulting from collection of PII and/or other activities that involve such PII; (iii) conducts Privacy Impact Assessments (“PIAs”) for information systems, programs, or other activities that pose a privacy risk; (iv) establishes privacy requirements for contractors and service providers and includes such requirements in the agreements with such third parties; (v) monitors and audits privacy controls and internal privacy policy to ensure effective implementation; (vi) develops, implements, and updates a comprehensive awareness and training program for personnel; (vii) engages in internal and external privacy reporting; (viii) designs information systems to support privacy by automating privacy controls, and (ix) maintains an accurate accounting of disclosures of records in accordance with the applicable requirements and, upon request, provides such accounting of disclosures to the persons named in the record.
  3. Data Quality and Integrity. This family of controls ensures that an organization takes reasonable steps to validate that the PII collected and maintained by the organization is accurate, relevant, timely, and complete.
  4. Data Minimization and Retention. This family of controls addresses (i) the implementation of data minimization requirements to collect, use, and retain only PII that is relevant and necessary for the original, legally authorized purpose of collection, and (ii) the implementation of data retention and disposal requirements.
  5. Individual Participation and Redress. This family of controls addresses implementation of processes (i) to obtain consent from individuals for the collection of their PII, (ii) to provide such individuals with access to the PII, (iii) to correct or amend collected PII, as appropriate, and (iv) to manage complaints from individuals.
  6. Security. This family of controls supplements the security controls in Appendix F and is implemented in coordination with information security personnel to ensure that the appropriate administrative, technical, and physical safeguards are in place to (i) protect the confidentiality, integrity, and availability of PII, and (ii) ensure compliance with applicable federal policies and guidance.
  7. Transparency. This family of controls ensures that organizations (i) provide clear and comprehensive notices to the public and to individuals regarding their information practices and activities that impact privacy, and (ii) generally keep the public informed of their privacy practices.
  8. Use Limitation. This family of controls addresses the implementation of mechanisms that ensure that an organization's use of PII is limited to the scope specified in its privacy notice or as otherwise permitted by law.

Some of the Privacy Controls, such as Data Quality and Integrity, Data Minimization and Retention, Individual Participation and Redress, and Transparency also contain control enhancements, and while these enhancements reflect best practices which organizations should strive to achieve, they are not mandatory.11 The Office of Management and Budget (“OMB”), tasked with enforcement of the Privacy Controls, expects all federal agencies and third-party contractors to implement the mandatory Privacy Controls by April 30, 2014.

The privacy control families must be analyzed and selected based on the specific operational needs and privacy requirements of each organization and can be implemented at various operational levels (e.g., organization level, mission/business process level, and/or information system level12). The Privacy Controls and the roadmap provided in the Privacy Appendix will primarily be used by Chief Privacy Officers ("CPOs") or Senior Agency Officials for Privacy ("SAOPs") to develop enterprise-wide privacy programs or to improve existing privacy programs in order to meet an organization's privacy requirements and demonstrate compliance with those requirements. The Privacy Controls supplement and complement the security control families set forth in Appendix F (Security Control Catalog) and Appendix G (Information Security Programs), and together these controls can be used by an organization's privacy, information security, and other risk management offices to develop and maintain a robust and effective enterprise-wide program for management of information security and privacy risk.

What You Need to Know

The Privacy Appendix is based upon best practices developed under current law, regulations, policies, and guidance applicable to federal information systems, programs, and organizations, and by implication, to their third-party contractors. If you provide services to the federal government, work on government contracts, or are the recipient of certain grants that may require compliance with federal information system security practices, you should already be sitting up and paying attention. This revision puts privacy up front with security.

Like other NIST publications, this revision will be looked at as an industry standard for best practices, even for commercial entities that are not doing business with the federal government. In fact, over the last few years, we have seen increasing references to compliance with NIST 800-53 as setting a contractual baseline for security. We expect that this will continue, and now will include both the Security Controls and the Privacy Controls. As such, general counsel, business executives and IT professionals should become familiar with and conversant in the Privacy Controls set forth in the new revision to SP 800-53. At a minimum, businesses should undertake a gap analysis of the privacy controls at their organization against these Privacy Controls to determine if they are up to par or if they have to enhance their current privacy programs. And, if NIST 800-53 appears in contract language as the “minimum standard” to which your company’s policies and procedures must comply, the gap analysis will at least inform you of what needs to be done to bring both your privacy and security programs up to speed.
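As a rough illustration of the gap analysis suggested above, the sketch below compares a hypothetical inventory of implemented privacy measures against the eight control families cataloged in Appendix J. Only the family names come from Revision 4 itself; the inventory, the mapping, and the pass/fail notion of a "gap" are invented for illustration and are not an assessment methodology endorsed by NIST or OMB.

```python
# Illustrative gap analysis: which of the eight Appendix J privacy control
# families have at least one corresponding measure documented? The
# implemented-measures inventory below is a hypothetical example.
PRIVACY_CONTROL_FAMILIES = [
    "Authority and Purpose",
    "Accountability, Audit, and Risk Management",
    "Data Quality and Integrity",
    "Data Minimization and Retention",
    "Individual Participation and Redress",
    "Security",
    "Transparency",
    "Use Limitation",
]

# Hypothetical mapping of an organization's existing measures to families.
implemented = {
    "Authority and Purpose": ["privacy notice states purpose of collection"],
    "Security": ["PII encrypted at rest", "role-based access control"],
    "Transparency": ["public privacy policy published"],
}

def gap_report(inventory: dict) -> list:
    """Return the families with no documented measure (the 'gaps')."""
    return [f for f in PRIVACY_CONTROL_FAMILIES if not inventory.get(f)]

if __name__ == "__main__":
    for family in gap_report(implemented):
        print(f"GAP: no documented controls for '{family}'")
```

A real gap analysis would, of course, go control by control within each family and weigh the non-mandatory enhancements; the sketch only shows the shape of the exercise.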


1 The National Institute of Standards and Technology is a non-regulatory agency within the U.S. Department of Commerce, which, among other things, develops information security standards and guidelines, including minimum requirements for federal information systems to assist federal agencies in implementing the Federal Information Security Management Act of 2002.

2 See Security and Privacy Controls for Federal Information Systems and Organizations, NIST Special Publ. (SP) 800-53, Rev. 4 (April 30, 2013), http://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-53r4.pdf.

3 The Joint Task Force Transformation Initiative Interagency Working Group is an interagency partnership formed in 2009 to produce a unified security framework for the federal government. It includes representatives from the Civil, Defense, and Intelligence Communities of the federal government.

4 See NIST Press Release for SP 800-53 Revision 4 at http://www.nist.gov/itl/csd/201304_sp80053.cfm. Revision 4 of SP 800-53 adds a substantial number of security controls to the catalog, including controls that address new technology such as digital and mobile technologies and cloud computing. With the exception of the controls that address evolving technologies, the majority of the cataloged security controls are policy and technology neutral, focusing on the fundamental safeguards and countermeasures required to protect information during processing, while in storage, and during transmission.

5 See NIST Press Release for SP 800-53 Revision 4 at http://www.nist.gov/itl/csd/201304_sp80053.cfm.

6 See Appendix J, Privacy Control Catalog to Security and Privacy Controls for Federal Information Systems and Organizations, NIST Special Publ. (SP) 800-53, Rev. 4 (April 30, 2013), http://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-53r4.pdf. Appendix J was developed by NIST and the Privacy Committee of the Federal Chief Information Officer (CIO) Council.

7 Personally Identifiable Information is defined broadly in the Glossary to SP 800-53 Revision 4 as "Information which can be used to distinguish or trace the identity of an individual (e.g., name, social security number, biometric records, etc.) alone, or when combined with other personal or identifying information which is linked or linkable to a specific individual (e.g., date and place of birth, mother's maiden name, etc.)." See page B-16 of Appendix B to Security and Privacy Controls for Federal Information Systems and Organizations, NIST Special Publ. (SP) 800-53, Rev. 4 (April 30, 2013), http://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-53r4.pdf. However, as stated in footnote 119 in Appendix J, "the privacy controls in this appendix apply regardless of the definition of PII by organizations."

8 Collection, use, retention, disclosure, and disposal of PII.

9 See page J-4 of Appendix J, Privacy Control Catalog to Security and Privacy Controls for Federal Information Systems and Organizations, NIST Special Publ. (SP) 800-53, Rev. 4 (April 30, 2013), http://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-53r4.pdf.

10 See NIST description and overview of Fair Information Practice Principles at http://www.nist.gov/nstic/NSTIC-FIPPs.pdf.

11 See page J-4 of Appendix J, Privacy Control Catalog to Security and Privacy Controls for Federal Information Systems and Organizations, NIST Special Publ. (SP) 800-53, Rev. 4 (April 30, 2013), http://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-53r4.pdf.

12 See page J-2 of Appendix J, Privacy Control Catalog to Security and Privacy Controls for Federal Information Systems and Organizations, NIST Special Publ. (SP) 800-53, Rev. 4 (April 30, 2013), http://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-53r4.pdf.

What’s New Out There? A Trade and Business Regulatory Update

Proposed DoD Rule: Detection and Avoidance of Counterfeit Electronic Parts (DFARS Case 2012-D-005)

On May 16, 2013, the Department of Defense ("DoD") issued a proposed rule that would amend the Defense Federal Acquisition Regulation Supplement ("DFARS") relating to the detection and avoidance of counterfeit parts, in partial implementation of the National Defense Authorization Act ("NDAA") for Fiscal Year ("FY") 2012 (Pub. L. 112-81) and the NDAA for FY 2013 (Pub. L. 112-239). 78 Fed. Reg. 28780 (May 16, 2013). The proposed rule would impose new obligations on contractors to detect and protect against the inclusion of counterfeit parts in their products. Public comments in response to the proposed amendment are due by July 15, 2013.

The proposed rule, titled Detection and Avoidance of Counterfeit Electronic Parts (DFARS Case 2012-D-005), partially implements Section 818 of the NDAA for FY 2012 requiring the issuance of regulations addressing the responsibility of contractors (a) to detect and avoid the use or inclusion of counterfeit – or suspect counterfeit – electronic parts, (b) to use trusted suppliers, and (c) to report counterfeit and suspect counterfeit electronic parts. Pub. L. 112-81, § 818(c). Section 818(c) also requires DoD to revise the DFARS to make unallowable the costs of re-work or other actions necessary to deal with the use or suspected use of counterfeit electronic parts. Id. The new rule also proposes the following in order to implement the requirements defined in Section 818.

  • Definitions: Adds definitions to DFARS 202.101 for the terms “counterfeit part,” “electronic part,” “legally authorized source,” and “suspect counterfeit part.”
  • Cost Principles and Procedures: Adds DFARS section 231.205-71, which would apply to contractors covered by the Cost Accounting Standards (“CAS”) who supply electronic parts, and would make unallowable the costs of counterfeit or suspect counterfeit electronic parts and the costs of rework or corrective action that may be required to remedy the use or inclusion of such parts. This section provides a narrow exception where (1) the contractor has an operational system to detect and avoid counterfeit parts that has been reviewed and approved by DoD pursuant to DFARS 244.303; (2) the counterfeit or suspect counterfeit electronic parts are government furnished property defined in FAR 45.101; and (3) the covered contractor provides timely notice to the Government.
  • Avoidance and Detection System: Requires contractors to establish and maintain an acceptable counterfeit avoidance detection system that addresses, at a minimum, the following areas: training personnel; inspection and testing; processes to abolish counterfeit parts proliferation; traceability of parts to suppliers; use and qualification of trusted suppliers; reporting and quarantining counterfeit and suspect counterfeit parts; systems to detect and avoid counterfeit electronic parts; and the flow down of avoidance and detection requirements to subcontractors.

Potential Impacts on Contractors and Subcontractors

Although the rule is designed constructively to combat the problem of counterfeit parts in the military supply chain, it imposes additional obligations and related liabilities on contractors and subcontractors alike.

  • The proposed rule shifts the burden of protecting against counterfeit electronic parts to contractors, thus increasing contractor costs and potential contractor liability in this area.
  • Under the proposed rule, contractors would need to take steps to establish avoidance and detection systems in order to monitor for and protect against potential counterfeit electronic parts, also increasing the financial and temporal impact on contractors.
  • Avoidance and detection system requirements will need to be flowed down to subcontractors, increasing subcontractors’ responsibility – and thus liability – for counterfeit parts.
  • The proposed rule would also make unallowable the costs incurred to remove and replace counterfeit parts, which could have a significant financial impact on contractors – even under cost type contracts.
  • As it currently stands, the narrow exception regarding the allowability of such costs applies only where the contractor meets all three requirements of the exception, which likely would be a rare occurrence.

Interim SBA Rule: Expansion of WOSB Program, RIN 3245-AG55

On May 7, 2013, the Small Business Administration ("SBA") issued an interim final rule implementing Section 1697 of the NDAA for FY 2013, removing the statutory dollar limitation on contracts set aside for Women-Owned Small Businesses ("WOSBs") under the Women-Owned Small Business Program. 78 Fed. Reg. 26504 (May 7, 2013). Comments are due by June 6, 2013.

The new rule would amend SBA 127.503 to permit Contracting Officers (“COs”) to set aside contracts for WOSBs and Economically Disadvantaged WOSBs (“EDWOSBs”) at any dollar amount if there is a reasonable expectation of competition among WOSBs as follows: (1) in industries where WOSBs are underrepresented, the CO may set aside the procurement where two or more EDWOSBs will submit offers for the contract and the CO finds that the contract will be awarded at a fair and reasonable price; or (2) in industries where WOSBs are substantially underrepresented, the CO may set aside the procurement if two or more WOSBs will submit offers for the contract, and the CO finds that the contract will be awarded at a fair and reasonable price.



Weighing Going Private or Sale to Carl Icahn, Dell Cuts off Info


As Dell Inc. considers its future after a massive loss in value over the past decade, the question may fundamentally be this: are the company's problems the result of poor leadership, or a relatively straightforward matter of shedding its stock obligations?

Two proposals are on the table. First, founder Michael Dell has proposed taking the company private by buying out the company's stock for $24.4 billion with the private equity firm Silver Lake. Second, business magnate Carl Icahn, together with Southeastern Asset Management, has offered to buy Dell for $12 per share in cash. Unfortunately, it is not clear how the buyout negotiations are going.

An unquestioned leader in the personal computer industry in the 1990s, Dell had lost some $68 billion in stock market value by 2010, reportedly due to a change in its customer base and an inability to respond to Apple's iPhone and iPad products. Sales at Dell continue to shrink, with a quarterly report filed last week reportedly showing a 79 percent drop in profit.

As part of the buyout negotiations, Icahn sent a letter seeking more detailed information from Dell, including data room access for a certain potential lender. This week, however, a special committee of Dell's board of directors sent Icahn a letter refusing access to that information until it can determine whether his offer is "superior" to Michael Dell's.

Meanwhile, Dell insisted upon more information from Icahn — such as whether his offer is even serious. In its response, the committee specifically asked Icahn to make “an actual acquisition proposal that the Board could evaluate” as opposed to merely offering the board a backup plan in case Michael Dell’s proposal fails to move forward.

“Please understand that unless we receive information that is responsive to our May 13 letter, we are not in a position to evaluate whether your proposal meets that standard,” the special committee reportedly wrote in response to Icahn’s request.

The question on Wall Street is the same as Dell's: Is the Southeastern Asset Management offer serious? Icahn reportedly already owns 4.5 percent of Dell's stock, while Southeastern, already Dell's largest outside shareholder, owns 8 percent.


“Actually, Someone Knows You are a Dog”– the Chinese Regulation Efforts on Private Data Protection


Do you have privacy in the era of information?

"On the Internet, nobody knows you're a dog." First published in The New Yorker on July 5, 1993, this widely known saying has been quoted many times to describe the anonymous nature of the Internet. Now, however, the description is drifting away from the truth.

The truth is that some people using the Internet may know you better than you know yourself. When you log on to Amazon, not only will the site greet you by name, the homepage will also suggest certain purchases, and you will likely be interested in at least a third of them. Your addresses have been recorded, and Amazon will automatically calculate the delivery period. Beyond online shopping sites, collecting visitors' information is common practice among online service and information providers. Youku and Netflix suggest videos to watch. Weibo and Facebook suggest friends to follow. Douban and IMDB suggest movie tickets to buy and parties to attend.

On one hand, these recommendations add convenience to your life and entertainment; on the other hand, they can feel intrusive and unsettling, because these services know so much about you. For example, you have just bought an apartment and have not even received the keys, yet decoration companies and contractors call to tell you that designs for the new apartment are ready. You have just submitted resumes for a job, and even before the interview, insurance companies and training companies call and email you with sales pitches. Have you wondered how strangers come to know your private, personal information?

Every time you log on to a website, make a call, or buy a ticket by showing your ID card, computer systems track you and record everything you have clicked and purchased. Data analysis systems collect, characterize, and store your information, and take further actions based on it. Some entities even purchase and resell personal data for profit. Personal data have become a commodity because direct marketing based on private data is profitable: marketing communications are classified as "direct marketing" only where they are addressed to a specific person by name or where a phone call is made to a specific person, and the use of private data is the foundation of direct marketing. The newly issued Hong Kong Personal Data (Privacy) Amendment Ordinance contains a number of new provisions regulating the use of personal data in connection with direct marketing activities in Hong Kong, and it came into force on April 1, 2013. Apart from Hong Kong, there are over fifty countries and regions with laws and regulations protecting personal data.

What is the new trend in China to protect personal data?

In order to safeguard the legitimate rights and interests of Chinese citizens concerning private data protection, the Ministry of Industry and Information Technology of China ("MIIT") announced the Provisions on the Protection of Personal Information of Telecommunication and Internet Users (Draft for Comments) ("PPI Rules") and the Provisions on the Registration of True Identity Information of Telephone Users (Draft for Comments) ("RTII Rules") and sought public comments. The deadline for submitting comments is May 15, 2013.

The PPI Rules and RTII Rules are a breakthrough with respect to legislation of personal information protection. Although these two rules are not officially a personal information protection law, they are a good beginning and call for a complete set of rules.

The PPI Rules and RTII Rules are designed to protect personal information from two perspectives. While the PPI Rules regulate the collection and use of users' private information, the RTII Rules require "real-name registration" of telephone users to prevent direct or indirect marketing through anonymous telephone numbers. Specifically, the PPI Rules require that telecommunication service providers and Internet information service providers ("Service Providers") not collect or use users' personal information without their consent. Service Providers must also clearly notify users of the purpose, method and scope of the collection and use of their personal information, the retention period of such information, the ways to access and modify such information, and the consequences of refusing to provide such information.
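To make those notice and consent requirements concrete, here is a minimal, hypothetical sketch of how a service provider might record the disclosures the draft PPI Rules call for (purpose, method, scope, retention period, access and correction channel, and the consequences of refusal) together with the user's consent. The field names, and the idea of storing them as a structured record at all, are assumptions made for illustration; the draft rules do not specify any particular format.

```python
# Hypothetical record of the disclosures a Service Provider would make before
# collecting personal information under the draft PPI Rules, plus the user's
# consent. Field names are illustrative assumptions, not rule text.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class CollectionNotice:
    purpose: str                 # why the data is collected
    method: str                  # how it is collected
    scope: list                  # which data elements are collected
    retention_period_days: int   # how long it will be kept
    access_channel: str          # how the user can view or correct the data
    refusal_consequence: str     # what happens if the user declines
    consent_given: bool = False
    consent_time: str = ""

    def record_consent(self) -> None:
        """Mark that the user consented after seeing the notice."""
        self.consent_given = True
        self.consent_time = datetime.now(timezone.utc).isoformat()

if __name__ == "__main__":
    notice = CollectionNotice(
        purpose="account registration",
        method="web form",
        scope=["name", "phone number"],
        retention_period_days=365,
        access_channel="account settings page",
        refusal_consequence="registration cannot be completed",
    )
    notice.record_consent()
    print(notice)
```

The value of capturing consent in this structured way is simply evidentiary: if the rules are finalized in anything like their current form, a provider will want to be able to show what it told each user and when the user agreed.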

Meanwhile, the "real-name registration" required by the RTII Rules is a double-edged sword. Not only are telephone users required to supply their true identity information; some Internet services, such as Weibo (the Chinese counterpart of Twitter), also require it. On one hand, this will reduce the risk of private information abuse through anonymous telephone numbers and Weibo accounts. On the other hand, the "real-name registration" regime makes it legitimate for telephone and certain Internet service providers to collect their users' information. Although the RTII Rules prohibit the sale and illegal provision of users' information, that does not mean those providers will not use the information for profit or provide it to the government or other entities with compulsory powers. "Real-name registration" may limit the healthy development of the Internet and even harm users' right to free speech. Is "real-name registration" the only way to protect personal information? This is a controversial topic.

What can enterprises do to avoid violations of personal data protection rules in China?

Putting that controversy aside, what can enterprises doing business in China do under the new rules to protect personal information? The affected enterprises may not be limited to Internet and telecommunication service providers, because the regime may expand in the future to regulate more entities that have access to citizens' personal data.

First, the concerned enterprises can visit the MIIT's official website and submit comments, if any. They can make their voices heard while the rules are in the "draft for comments" period.

Second, a thorough study of the new rules and other anticipated rules in this area is needed. The concerned enterprises should provide proper training to their employees regarding the protection of users' information, since this is not only required by the new rules; the enterprises might also face joint and several liability with employees who abuse users' information.

Third, properly drafted disclaimers, declarations, and agreements are needed when an enterprise wants to collect and use users' private information. The enterprise needs to make sure it has obtained users' consent to the collection and use of their information. Proper preparation now will help avoid future risks.


Second Circuit Bars Criminal Defendant from Accessing Assets Frozen by Regulators


The US Court of Appeals for the Second Circuit recently upheld a district court’s refusal to release nearly $4 million in assets frozen by the Securities and Exchange Commission and the Commodity Futures Trading Commission to help a defendant fund his criminal defense.

Stephen Walsh, a defendant in a criminal fraud case, had requested the release of $3.7 million in assets stemming from the sale of a house that had been seized by regulators in a parallel civil enforcement action. In denying Walsh’s motion to access the frozen funds, the US District Court for the Southern District of New York found that the government had shown probable cause that the proceeds had been tainted by defendant’s fraud, and were therefore subject to forfeiture. Though Walsh and his wife had purchased the home in question using funds unrelated to the fraud, Walsh ultimately acquired title to the home pursuant to a divorce settlement in exchange for a $12.5 million distributive award paid to his wife, at least $6 million of which, according to the court, was traceable to the fraud.

Agreeing with the District Court, the Second Circuit found that although the house itself was not a fungible asset, it was “an asset purchased with” the tainted funds from the marital estate by operation of the divorce agreement and affirmed the denial of defendant’s request. Further, since Walsh’s assets did not exceed $6 million at the time of his arrest, under the Second Circuit’s “drugs-in, first-out” approach, all of his assets became traceable to the fraud.

U.S. v. Stephen Walsh, No. 12-2383-cr (2d Cir. Apr. 2, 2013).


Trade Secret Misappropriation: When An Insider Takes Your Trade Secrets With Them

Raymond Law Group LLC‘s Stephen G. Troiano recently had an article, Trade Secret Misappropriation: When An Insider Takes Your Trade Secrets With Them, featured in The National Law Review:


While companies are often focused on outsider risks such as breach of their systems through a stolen laptop or hacking, often the biggest risk is from insiders themselves. Such problems of access management with existing employees, independent contractors and other persons are as much a threat to proprietary information as threats from outside sources.

In any industry dominated by two main players there will be intense competition for an advantage. Advanced Micro Devices and Nvidia dominate the graphics card market, putting out competing models of graphics cards at similar price points. When the players follow the rules, such competition is beneficial for both the industry and consumers.

AMD has sued four former employees for allegedly taking "sensitive" documents when they left to work for Nvidia. In its complaint, filed in the U.S. District Court for the District of Massachusetts, AMD claims this is "an extraordinary case of trade secret transfer/misappropriation and strategic employee solicitation." Allegedly, forensically recovered data show that when the AMD employees left in July 2012, they transferred thousands of files to external hard drives that they then took with them. Advanced Micro Devices, Inc. v. Feldstein et al., No. 4:2013cv40007 (D. Mass. 2013).

On January 14, 2013, the district court granted AMD's ex parte temporary restraining order, finding AMD would suffer immediate and irreparable injury if the Court did not issue the TRO. The TRO required the AMD employees to immediately provide their computers and storage devices for forensic evaluation and to refrain from using or disclosing any AMD confidential information.

The employees did not have non-compete contracts. Instead, the complaint centers on allegations of trade secret misappropriation. While both AMD and Nvidia compete intensely in the consumer discrete GPU market serving PC gaming enthusiasts, there are rumors that AMD managed to secure its hardware a place in both forthcoming next-generation consoles, the Sony PlayStation 4 and the Microsoft Xbox 720. AMD's TRO, and the ultimate goal of the suit, may therefore be to preclude any of its proprietary technology from being used by its former employees to assist Nvidia in the future.

The law does protect companies and individuals such as AMD from having their trade secrets misappropriated. The AMD case has only recently been filed, so it is unclear what the employees' response will be. What is clear is how fast AMD was able to move to deal with a potential insider threat. Companies need to be aware of who has access to what data and for how long, so that in the event of a breach, whether internal or external, they can move quickly to isolate and identify it and take steps, such as litigation, to ensure their proprietary information is protected.

© 2013 by Raymond Law Group LLC

Brace for Impact – Final HITECH Rules Will Require Substantially More Breach Reporting

The National Law Review recently published an article, Brace for Impact – Final HITECH Rules Will Require Substantially More Breach Reporting, written by Elizabeth H. Johnson with Poyner Spruill LLP:


 

The U.S. Department of Health and Human Services (HHS) has finally issued its omnibus HITECH Rules.  Our firm will issue a comprehensive summary of the rules shortly (sign up here), but of immediate import is the change to the breach reporting harm threshold.  The modification will make it much more difficult for covered entities and business associates to justify a decision not to notify when an incident occurs.

Under the interim rule, which remains in effect until September 23, 2013, a breach must be reported if it “poses a significant risk of financial, reputational, or other harm to the individual.” The final rule, released yesterday, eliminates that threshold and instead states:

“[A]n acquisition, access, use, or disclosure of protected health information in a manner not permitted under subpart E [the Privacy Rule] is presumed to be a breach unless the covered entity or business associate, as applicable, demonstrates that there is a low probability that the protected health information has been compromised based on a risk assessment of at least the following factors:

(i) The nature and extent of the protected health information involved, including the types of identifiers and the likelihood of re-identification;

(ii) The unauthorized person who used the protected health information or to whom the disclosure was made;

(iii) Whether the protected health information was actually acquired or viewed; and

(iv) The extent to which the risk to the protected health information has been mitigated.”
(Emphasis added).

In other words, if a use or disclosure of information is not permitted by the Privacy Rule (and is not subject to one of only three very narrow exceptions), that use or disclosure will be presumed to be a breach.  Breaches must be reported to affected individuals, HHS and, in some cases, the media.  To rebut the presumption that the incident constitutes a reportable breach, covered entities and business associates must conduct the above-described risk analysis and demonstrate that there is only a low probability that the data have been compromised.  If the probability is higher, breach notification is required regardless of whether harm to the individuals affected is likely.  (Interestingly, this analysis means that if there is a low probability of compromise, notice may not be required even if the potential harm is very high.)
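Because every non-excepted Privacy Rule violation must now be evaluated against, and documented under, the four factors quoted above, some entities may find it useful to capture each assessment in a structured form. The sketch below is a hypothetical documentation aid only: the factor labels track the rule text, but the fields, the layout, and the final "low probability" conclusion are judgments the rule leaves to the covered entity or business associate, not a form or scoring method HHS prescribes.

```python
# Hypothetical template for documenting the four-factor risk assessment that
# can rebut the presumption of a reportable breach. The structure and the
# 'low_probability_of_compromise' conclusion are illustrative; the rule does
# not prescribe any particular form or scoring method.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class BreachRiskAssessment:
    incident_description: str
    factor_1_nature_and_extent_of_phi: str       # identifiers involved, re-identification risk
    factor_2_unauthorized_recipient: str         # who used or received the PHI
    factor_3_phi_actually_acquired_or_viewed: str
    factor_4_extent_risk_mitigated: str
    low_probability_of_compromise: bool          # the entity's documented conclusion
    assessed_on: date = field(default_factory=date.today)

    @property
    def notification_required(self) -> bool:
        """Notification is required unless low probability is demonstrated."""
        return not self.low_probability_of_compromise

if __name__ == "__main__":
    assessment = BreachRiskAssessment(
        incident_description="misdirected fax containing one patient's name",
        factor_1_nature_and_extent_of_phi="name only; low re-identification risk",
        factor_2_unauthorized_recipient="another covered entity bound by HIPAA",
        factor_3_phi_actually_acquired_or_viewed="recipient states fax was not read",
        factor_4_extent_risk_mitigated="written assurance of destruction obtained",
        low_probability_of_compromise=True,
    )
    print("Notify?", assessment.notification_required)
```

Keeping each completed assessment on file matters as much as the conclusion itself, since HHS requires that the analyses be documented and may review them after any reported breach.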

What is the effect of this change?  First, there will be many more breaches reported, resulting in even greater costs and churn than the already staggering figures published by Ponemon, which reports that 96% of health care entities have experienced a breach, with average annual costs of $6.5 billion since 2010.

Second, enforcement will increase.  Under the new rules, the agency is required (it has no discretion) to conduct compliance reviews when "a preliminary review of the facts" suggests a violation due to willful neglect.  Any reported breach that suggests willful neglect would then appear to require agency follow-up, and the agency is of course free to investigate any breach reported to it.  HHS reports that it already receives an average of 19,000 notifications per year under the current, more favorable breach reporting requirements, so where will it find the time and money to conduct all these reviews?  Well, the agency's increased fining authority, up to an annual maximum of $1.5 million per type of violation, ought to be some help.

Third, covered entities and business associates can expect to spend a lot of time performing risk analyses.  Every single incident that violates the Privacy Rule and does not fit into one of three narrow exceptions must be the subject of a risk analysis in order to defeat the presumption that it is a reportable breach.  The agency requires that those risk analyses be documented, and they must include at least the factors listed above.

So why did the agency change the reporting standard?  As it says in the rule issuance, “We recognize that some persons may have interpreted the risk of harm standard in the interim final rule as setting a much higher threshold for breach notification than we intended to set. As a result, we have clarified our position that breach notification is necessary in all situations except those in which the covered entity or business associate, as applicable, demonstrates that there is a low probability that the protected health information has been compromised. . . .”

The agency may also have changed the standard because it was criticized for having initially included a harm threshold in the rule, with critics claiming that the HITECH Act did not provide the authority to insert such a standard.  Although the new standard does, in essence, permit covered entities and business associates to engage in a risk-based analysis to determine whether notice is required, the agency takes the position that the new standard is not a "harm threshold."  As it puts it, "[W]e have removed the harm standard and modified the risk assessment to focus more objectively on the risk that the protected health information has been compromised."  So, the agency got its way in that it will not have to receive notice of every single event that violates the Privacy Rule, and it has made a passable argument to satisfy critics that the "harm threshold" was removed.

The new rules are effective March 26, 2013 with a compliance deadline of September 23, 2013.  Until then, the current breach notification rule with its “significant risk of harm” threshold is in effect.  To prepare for compliance with this new rule, covered entities and business associates need to do the following:

  • Create a risk analysis procedure to facilitate the types of analyses HHS now requires and prepare to apply it in virtually every situation where a use or disclosure of PHI violates the Privacy Rule.
  • Revisit security incident response and breach notification procedures and modify them to adjust notification standards and the need to conduct the risk analysis.
  • Revisit contracts with business associates and subcontractors to ensure that they are reporting appropriate incidents (the definition of a “breach” has now changed and may no longer be correct in your contracts, among other things).
  • If you have not already, consider strong breach mitigation, cost coverage, and indemnification provisions in those contracts.
  • Revisit your data security and breach insurance policies to evaluate coverage, or lack thereof, if applicable.
  • Consider strengthening and reissuing training.  With every Privacy Rule violation now a potentially reportable breach, it’s more important than ever to avoid mistakes by your workforce.  And if they happen anyway, during a subsequent compliance review, it will be important to be able to show that your staff was appropriately trained.
  • Update your policies to address in full these new HIPAA rules.  The rules require it, and it will improve your compliance posture if HHS does conduct a review following a reported breach.

As noted above, our firm will issue a more comprehensive summary of these new HIPAA rules in coming days.

© 2013 Poyner Spruill LLP