The Federal Register officially published the FCC’s new rules governing net neutrality on Monday, April 13, 2015, and the rules will take effect 60 days after the date of publication. As anticipated, AT&T and the wireless and cable industry groups filed suit the very next day, Tuesday, April 14, 2015, in the U.S. Court of Appeals for the D.C. Circuit to challenge the new rules. The litigation is spearheaded by AT&T and its trade group CTIA – The Wireless Association, which also represents Verizon, Sprint and T-Mobile. The suit marks a new stage in the telecommunications industry’s efforts to challenge the recently enacted rules. Read additional coverage of the suit, including potential arguments the telecommunications groups will raise, and stay tuned for our take on the developing litigation.
IoT – It’s All About the Data, Right?
A few weeks ago, the FTC released a report on the Internet of Things (IoT). IoT refers to “things” such as devices or sensors – other than computers, smartphones, or tablets – that connect, communicate or transmit information with or between each other through the Internet. This year, there are estimated to be over 25 billion connected devices, and by 2020, 50 billion. With the ubiquity of IoT devices raising various concerns, the FTC has provided several recommendations.
Security
The report includes the following security recommendations for companies developing Internet of Things devices:
- Build security into devices at the outset, rather than as an afterthought in the design process
- Train employees about the importance of security, and ensure that security is managed at an appropriate level in the organization
- Ensure that, when outside service providers are hired, those providers are capable of maintaining reasonable security, and provide reasonable oversight of the providers
- When a security risk is identified, consider a “defense-in-depth” strategy whereby multiple layers of security may be used to defend against a particular risk
- Consider measures to keep unauthorized users from accessing a consumer’s device, data, or personal information stored on the network (a brief sketch of one such safeguard follows this list)
- Monitor connected devices throughout their expected life cycle, and where feasible, provide security patches to cover known risks
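To make the last two recommendations concrete, here is a minimal sketch (in Python, assuming the third-party `cryptography` package) of a device encrypting its telemetry before storage or transmission and verifying a keyed signature before honoring a command. The field names and key-handling approach are illustrative assumptions, not FTC requirements.

```python
# Minimal sketch: encrypt device telemetry and authenticate commands.
# Assumes the third-party `cryptography` package; key provisioning and
# rotation are out of scope for this illustration.
import hashlib
import hmac
import json

from cryptography.fernet import Fernet

# In practice the device key would be provisioned securely at manufacture.
DEVICE_KEY = Fernet.generate_key()             # symmetric key for payload encryption
COMMAND_SECRET = b"per-device-shared-secret"   # illustrative shared secret

def encrypt_telemetry(reading: dict) -> bytes:
    """Encrypt a sensor reading so it is never stored or sent in the clear."""
    return Fernet(DEVICE_KEY).encrypt(json.dumps(reading).encode())

def command_is_authentic(payload: bytes, signature: str) -> bool:
    """Reject commands whose keyed signature does not match (one layer of defense in depth)."""
    expected = hmac.new(COMMAND_SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

token = encrypt_telemetry({"device_id": "sensor-42", "soil_moisture": 0.31})
print(Fernet(DEVICE_KEY).decrypt(token))       # only a key holder can read the data
```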
Data Minimization
The report suggested companies consider data minimization – that is, limiting the collection of consumer data and retaining that information only for a set period of time rather than indefinitely. Data minimization addresses two key privacy risks: first, that a company holding a large store of consumer data becomes a more enticing target for data thieves or hackers, and second, that consumer data will be used in ways contrary to consumers’ expectations.
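A common way to operationalize data minimization is to assign each category of collected data a retention window and routinely purge anything older. The sketch below is a minimal illustration in Python; the categories and retention periods are invented assumptions, not recommendations from the report.

```python
# Minimal sketch of retention-based purging; categories and windows are illustrative.
from datetime import datetime, timedelta, timezone

RETENTION = {                        # hypothetical retention policy
    "diagnostics": timedelta(days=30),
    "usage_history": timedelta(days=365),
}

def purge_expired(records):
    """Keep only records still inside their category's retention window."""
    now = datetime.now(timezone.utc)
    return [
        r for r in records
        if now - r["collected_at"] <= RETENTION.get(r["category"], timedelta(0))
    ]

records = [
    {"category": "diagnostics",
     "collected_at": datetime.now(timezone.utc) - timedelta(days=90)},
    {"category": "usage_history",
     "collected_at": datetime.now(timezone.utc) - timedelta(days=10)},
]
print(len(purge_expired(records)))   # -> 1; the stale diagnostics record is dropped
```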
Notice and Choice
The FTC provided further recommendations relating to notice and choice. It recommends that companies notify consumers and give them choices about how their information will be used, particularly when the data collection goes beyond consumers’ reasonable expectations.
What Does This Mean for Device Manufacturers?
It is evident from the FTC’s report that security and data governance are important features for IoT device manufacturers to consider. Although the report suggests implementing data minimization protocols to limit the type and amount of data collected and stored, IoT device manufacturers should not be short-sighted when deciding what data to collect and store through their IoT devices. For many manufacturers, the data collected may be immensely valuable to them and to other stakeholders. It would be naïve to decide not to collect certain types of data simply because there is no clear use for the data yet, because the costs and risks of storing it seem prohibitive, or because the manufacturer wants to reduce its exposure in the event of a security breach. In fact, IoT device manufacturers often do not realize what types of data may be useful. They would be best served by analyzing who the stakeholders of their data may be.
For instance, an IoT device manufacturer that monitors soil conditions on farms may realize that the data it collects can be useful not only to farmers, but also to insurance companies seeking to understand water table levels; to produce suppliers, wholesalers, and retailers predicting produce inventory; and to farm equipment suppliers, among others. IoT device manufacturers should therefore identify the stakeholders in the data they collect early on, and revisit that data periodically to spot new stakeholders suggested by the trends the data reveals.
Moreover, IoT device manufacturers should constantly consider ways to monetize or otherwise leverage the data they collect. Manufacturers tend to shy away from owning that data in an effort to respect their customers’ privacy. Rather than declining to collect sensitive data at all, they would be best served by exploring and implementing data collection and storage techniques that reduce their exposure to security breaches while at the same time allaying the fears of customers.
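One such technique is pseudonymization: replacing the direct customer identifier with a keyed hash before the record ever reaches the data store, so the data keeps its analytic value while a breach of the store alone does not reveal who the customer is. The following minimal Python sketch is illustrative only; the field names and keyed-hash scheme are assumptions, not a compliance recipe.

```python
# Minimal sketch: pseudonymize the customer identifier before storage.
# The hash key must live outside the data store for this to reduce breach exposure.
import hashlib
import hmac

PSEUDONYM_KEY = b"keep-this-key-out-of-the-data-store"   # illustrative only

def pseudonymize(record: dict) -> dict:
    """Replace the direct identifier with a keyed hash; keep the useful measurements."""
    alias = hmac.new(PSEUDONYM_KEY, record["customer_id"].encode(),
                     hashlib.sha256).hexdigest()
    return {"customer_alias": alias,
            "soil_moisture": record["soil_moisture"],
            "timestamp": record["timestamp"]}

print(pseudonymize({"customer_id": "farm-001",
                    "soil_moisture": 0.27,
                    "timestamp": "2015-02-01T06:00:00Z"}))
```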
The New Competition – Emerging Legal Technologies Out of Silicon Valley
In January, the National Law Review had the pleasure of attending the Annual Marketing Partner Forum in beautiful Rancho Palos Verdes, California. Programming was provided by the Legal Executive Institute at Thomson Reuters and featured over 15 hours of dynamic workshops. Hundreds of marketing partners, managing partners, in-house counsel and senior-level marketing and business development professionals were in attendance.
The “New Competition” program featured emerging legal technologies within Silicon Valley. Catherine Hammack of Jurispect, Monica Zent of Foxwordy, and Daniel Lewis of Ravel Law each showcased their innovative technologies and shared their thoughts as to where innovation is taking the legal industry in 2015.
“Jurispect will help fundamentally transform how companies operate by providing organizations with a real-time analytical view of both exposure and opportunities to take proactive steps to manage legal and regulatory risk.” – Catherine Hammack
Catherine was present on two momentous occasions in U.S. financial history: as an intern at Arthur Andersen when Enron was indicted, and as a first-day associate at Bingham McCutchen on the day Lehman Brothers filed for bankruptcy at the start of the 2008 financial crisis. Following her time at Bingham as a financial litigator, she joined Google’s Policy team, where her perspective on legal services dramatically changed.
As Catherine elaborated in a post-conference interview: “There was a huge gap between the way law firms traditionally provide counsel and the way companies need information to make business decisions.” She was surrounded by engineers and data scientists who were analyzing vast amounts of data with cutting-edge technology. Catherine became interested in adapting these technologies to managing risk in the legal and regulatory industries. Inspired by Google’s data-driven decision-making policies, she founded Jurispect.
Jurispect is a tool that companies can use to track legal and regulatory changes relevant to their industry, and possibly to identify risks earlier on to help avoid future Enrons and Lehman Brothers. Currently, Jurispect is geared toward companies in the financial services and technology space, and will be expanding into other regulated industries in the very near future. Key decision makers in the corporate legal, compliance and risk departments of companies are benefitting from Jurispect’s actionable intelligence. The user’s experience is customized: Jurispect’s technology adjusts based on user profile settings and company attributes. As the user continues to utilize Jurispect, its algorithms continuously calibrate to improve the relevancy of information presented to each user.
Jurispect’s team of seasoned experts in engineering, data science, product management, marketing, legal and compliance collaborated to develop the latest machine learning and semantic analysis technologies. These technologies are used to aggregate information across regulatory agencies, including sources such as policy statements and enforcement actions. Jurispect also analyzes information in relevant press releases, as well as coverage by industry bodies and mainstream news. The most time-saving aspect of Jurispect is how the results coalesce into user-friendly reports that highlight the importance and relevance of the regulatory information to the user’s company. Users can view this intelligence in the form of notifications, trends, and predictive analytics reports. Jurispect makes data analytics work for legal professionals so they spend less time searching and more time on higher-level competencies. As Catherine elaborated, “We believe that analytics are quickly becoming central to any technology solution, and the regulatory space is no exception.”
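Jurispect has not published the details of its technology, but the general idea of ranking regulatory items by relevance to a company profile can be sketched with off-the-shelf tools. The example below uses TF-IDF and cosine similarity from scikit-learn on invented documents and an invented company profile; it illustrates the concept, not Jurispect’s implementation.

```python
# Hedged sketch (not Jurispect's actual technology): rank regulatory items
# by textual similarity to a company profile using TF-IDF.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "SEC enforcement action over inadequate disclosure of off-balance-sheet risk",
    "FCC order adopting open internet rules for broadband providers",
    "CFPB guidance on consumer data security for payment processors",
]
company_profile = "payments technology company handling consumer financial data"

vectorizer = TfidfVectorizer(stop_words="english")
doc_matrix = vectorizer.fit_transform(documents)
profile_vec = vectorizer.transform([company_profile])
scores = cosine_similarity(profile_vec, doc_matrix).ravel()

# Highest-scoring items are the most relevant to this (invented) company profile.
for score, doc in sorted(zip(scores, documents), reverse=True):
    print(f"{score:.2f}  {doc}")
```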
“Foxwordy is ushering in the era of the social age for lawyers and for the legal industry.” – Monica Zent
Monica is an experienced entrepreneur who was already running a successful alternative law firm practice when she founded Foxwordy, a private social network exclusively for lawyers. Monica reminded the audience that we are, remarkably, ten years into the social media experience, and that all attorneys should consider a well-rounded social media toolkit that includes Foxwordy, Twitter, and LinkedIn.
However, as Monica elaborated in a post-conference interview, LinkedIn, for example, “falls short of the needs of professionals like lawyers who are in a space that is regulated; where there’s privacy, [and] professional ethics standards.” As an experienced attorney and social seller, Monica understands that lawyers’ needs are different from other professionals that use the more mainstream and very public social networks, which is why she set out to create Foxwordy.
Foxwordy is currently available to licensed attorneys, including those who are not currently practicing but are regularly involved in the business of law, and to certified paralegals; it will eventually open up to law students. Anyone who fits these criteria can request membership from the homepage, and all potential members go through a vetting process to ensure that they are part of the legal community. At its inception, the Foxwordy team expected millennials and solo practitioners to be the main groups taking advantage of the opportunity to network on Foxwordy. Those populations have joined as expected, but what was surprising is how the product has resonated across all demographics, positions and segments of the legal industry. Foxwordy members now include general counsel, in-house counsel, solo practitioners, major law firm partners, law school deans, judges, politicians and more.
Foxwordy is currently available to join and will emerge from public beta around summertime this year. As Monica said during her presentation, “Time is the new currency,” and what the Foxwordy team has found via two clinical trials is that engaging with Foxwordy saves lawyers an average of two hours per day. Membership includes all the core social features, such as a profile page, connecting with others, and the ability to ask questions and engage anonymously, exchange referrals, and exchange other information and resources. Free members get full access to the core functions, and a premium membership is available with enhanced features and unlimited use of Foxwordy. In the closing thoughts of her post-conference interview, Monica shared that “the ability to engage anonymously and discreetly, yet at the same time collaborate with our legal colleagues and engage with them on a social level has been very powerful.”
“There is an amazing opportunity to use data analytics and technology to create a competitive edge for lawyers amidst all of this information…” – Daniel Lewis
Data analytics and technology have been used in many different fields to predict successful results. In his presentation, Daniel pointed out that fields traditionally considered more art than science have benefitted from the use of data analytics to generate accurate predictions.
Having conducted metrics-based research and advocacy while at the Bipartisan Policy Center, and observing how data-driven decision making was being used in areas like baseball and politics, Daniel was curious why the legal industry had fallen so far behind. Even though the legal field is often considered to be slow moving, there are currently over 11 million opinions in the U.S. judicial system with more than 350,000 new opinions issued per year. There is also a glut of secondary material that has appeared on the scene in the form of legal news sources, white papers, law blogs and more. Inspired by technology’s ability to harness and utilize vast amounts of information, Daniel founded Ravel Law to accommodate the dramatically growing world of legal information.
Ravel Law is optimized for all lawyers across the country. Currently, thousands of associates, partners, and in-house counsel are using Ravel. Ravel has also begun working with 30 of the top law schools around the country, with thousands of law students learning how to use it right alongside legal research staples such as Westlaw and LexisNexis. Professors and students around the country have also independently discovered Ravel and are using and teaching it. When asked why he works with law schools, Daniel said, “We work with schools because students are always the latest generation and have the highest expectations about how technology should work for them.” Students have given the Ravel team excellent feedback and have grown into a loyal user base over the past few years. Once these students graduate, they introduce Ravel to their firms. Ravel’s user base has been growing very quickly, and the company has released only a small portion of what its technology is ultimately capable of.
Ravel’s team of PhDs and technical advisors from Google, LinkedIn, and Facebook has coded advanced search algorithms to determine what is relevant, thereby enhancing the effectiveness and efficiency of legal research. Ravel provides insights, rather than simply lists of related materials, by using big data technologies such as machine learning, data visualization, advanced statistics and natural language processing. In a post-conference follow-up, Daniel elaborated: “Our visualizations then show how the results connect in context, helping people understand the legal landscape very rapidly as well as find needles in the haystack.” Ravel guides users toward analysis of relevant passages in a particular case, without navigating away from the original case or conducting a new search. Daniel and his colleagues will be launching more new features this year and are looking forward to continuing to “transform how attorneys search and understand all legal information.”
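Ravel has likewise not disclosed its algorithms, but one familiar building block for surfacing influential cases is a citation graph scored with a PageRank-style measure. The sketch below, using the `networkx` library on a handful of invented citations, illustrates that idea only; it is not Ravel’s implementation.

```python
# Hedged sketch (not Ravel's actual algorithm): score cases by citation influence.
import networkx as nx

# Invented citation edges: (citing case -> cited case).
citations = [
    ("Case C", "Case A"), ("Case D", "Case A"),
    ("Case D", "Case B"), ("Case E", "Case D"),
]

graph = nx.DiGraph(citations)
scores = nx.pagerank(graph)   # heavily cited, well-connected cases score higher

for case, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{case}: {score:.3f}")
```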
Sponsors of Net Neutrality Bill Receive Thousands from Internet Service Providers
Late last month, Rep. Fred Upton (R-MI) and Rep. Greg Walden (R-OR) held a hearing to discuss congressional action on net neutrality. The representatives, who chair House committees that oversee the Federal Communications Commission (FCC), also released an early draft of a bill to regulate the open flow of information on the Internet.
Consumer advocates have argued that the draft proposal fails to adequately regulate net neutrality, and have instead voiced support for FCC Chairman Tom Wheeler’s efforts to regulate the internet like a utility. The FCC is scheduled to vote on Chairman Wheeler’s proposal on Feb. 26.
Campaign Contributions: MapLight analyzed campaign contributions from employees and political action committees (PACs) of the four largest internet service providers in the United States – AT&T, Comcast, Time Warner Cable, and Verizon Communications – to the campaign committees of Rep. Fred Upton (R-MI) and Rep. Greg Walden (R-OR) during the 2014 election cycle.
Member | AT&T | Comcast | Time Warner Cable | Verizon Communications | Total
Rep. Fred Upton (R-MI) | $10,000 | $37,600 | $21,500 | $30,400 | $99,500
Rep. Greg Walden (R-OR) | $5,000 | $32,050 | $14,500 | $5,250 | $56,800
Total | $15,000 | $69,650 | $36,000 | $35,650 | $156,300
- The top four internet service providers, AT&T, Comcast, Time Warner Cable, and Verizon Communications, contributed $156,300 to Rep. Fred Upton (R-MI) and Rep. Greg Walden (R-OR) during the 2014 election cycle.
- The four companies contributed $99,500 to Rep. Upton.
- The four companies contributed $56,800 to Rep. Walden.
- The top contributor, Comcast, contributed $69,650 to the two chairmen during the 2014 election cycle.
To view lobbying data for AT&T, Comcast, Time Warner Cable, and Verizon Communications, please click here.
Campaign Contributions Methodology: MapLight analysis of campaign contributions to Rep. Fred Upton (R-MI) and Rep. Greg Walden (R-OR) from PACs and employees of AT&T, Comcast, Cox Communications, Time Warner Cable, and Verizon Communications, from January 1, 2013 to December 31, 2014. Contributions data source: Federal Election Commission
Online Behavioral Advertising: Industry Guides Require Real Time Notice When Data Are Collected or Used for Personalized Ads
WHAT’S COVERED?
Online behavioral advertising (OBA) has become a very common tool for commercial websites. OBA can be defined as follows:
the collection of data online from a particular computer or device regarding web viewing behaviors over time and across Web sites for the purpose of using such data to predict preferences or interests and to deliver advertising to that computer or device presumed to be of interest to the user of the computer/device based on observed Web viewing behaviors.
OBA might be implemented by use of cookies directly on a company’s website by the company itself. Or it might occur through technology embedded in ads from other parties displayed on the company’s site. Either way, the operators of commercial websites need to be aware when OBA is occurring on their sites and should be taking steps to provide greater transparency about OBA occurring on their sites.
WHAT’S THE CONCERN?
While the use of OBA is largely unregulated by law in the U.S. at this time, its spread has generated concern among privacy advocates. Of particular concern is the gathering of data about consumers without their knowledge: such information is supposed to be anonymous, but advances in technology make it increasingly possible to link that information to individuals (not just devices) by combining it with other information. Examples include information about health conditions and other sensitive information gleaned by watching the sites a user visits, the searches he or she conducts, and so on. Key characteristics of OBA are that it is: (a) invisible to the user; (b) hard to detect; and (c) resilient to being blocked or removed.
In an effort to stave off government regulation of OBA in the United States, the Digital Advertising Alliance (DAA), a consortium of the leading advertising trade associations, has instituted a set of self-regulatory guidelines. Based on standards proposed by the Federal Trade Commission, the DAA Self-Regulatory Program is designed to give consumers enhanced control over the collection and use of data regarding their Internet viewing for OBA purposes.
WHAT’S REQUIRED?
The key principles of the DAA’s guides are to provide greater transparency to consumers to allow them to know when OBA is occurring and to provide the ability to opt out. For commercial website operators that allow OBA on their sites, the compliance implications are as follows:
- First-Party OBA. First Parties are website operators/publishers. If a company simply gathers information for its own purposes on its own site, it is generally not covered by the guidelines. However, as soon as the First Party allows others to engage in OBA via the site, it has a duty to monitor and make sure that proper disclosures are being made, and even to make the disclosures itself if the others do not do so, including assuring that “enhanced notice” (usually the icon discussed below or a similar statement) appears on every page of the First Party’s site where OBA is occurring.
- Third-Party OBA. Third Parties are ad networks, data companies/brokers, and sometimes advertisers themselves, who engage in OBA through ads placed on other parties’ sites. These Third Parties should provide consumers with the ability to exercise choice with respect to the collection and use of data for OBA purposes. (See below on how to provide recommended disclosures.)
- Service Providers. These are providers of Internet access, search capability, browsers, apps or other tools that collect data about the sites a user visits. Service Providers generally are expected to provide clear disclosure of OBA practices that may occur via their services, obtain consumer consent for such practices, and provide an easy-to-use opt-out mechanism.
HOW TO COMPLY
Generally, Third Parties and Service Providers should give clear, meaningful, and prominent notice on their own websites that describes their OBA data collection and use practices. Such notice should include clear descriptions that include:
- The types of data collected online, including any PII for OBA purposes;
- The uses of such data, including whether the data will be transferred to a nonaffiliate for OBA purposes;
- An easy-to-use mechanism for exercising choice with respect to the collection and use of the data for OBA purposes or to the transfer of such data to a nonaffiliate for such purpose; and
- The fact that the entity adheres to OBA principles.
In addition, “enhanced notice” should appear on each and every ad (or page) where OBA is occurring. “Enhanced notice” means more than traditional disclosure in a privacy policy: it means placing a notice on the very page or ad where OBA is occurring. The notice typically is given in the form of the DAA’s blue AdChoices icon, which should link to a DAA page describing OBA practices and providing an easy-to-use opt-out mechanism.
The icon/link should appear in or around each ad where data are collected. Alternatively, it can appear on each page of a website on which any OBA ads are being served. It is normally the duty of the advertisers (Third Parties) to deploy the icon. However, if they fail to do so, then the operator of the site where the OBA ads appear has the duty to make appropriate real-time disclosures about OBA on each page where OBA activity is occurring, including links to the DAA page describing OBA practices and providing an easy-to-use opt-out mechanism.
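For a publisher auditing its own pages, the practical task is simple to state: wherever a third-party OBA ad renders, render the enhanced-notice link with it. The Python sketch below is a simplified, hypothetical illustration; the ad markup, helper name, and link placement are assumptions rather than DAA-prescribed code, though the linked URL points to the DAA’s consumer choice page.

```python
# Hedged sketch: make sure every OBA ad slot carries an enhanced-notice link.
# The ad markup and helper name are hypothetical.
DAA_CHOICES_URL = "http://www.aboutads.info/choices/"   # DAA consumer opt-out page

def render_ad_with_notice(ad_html: str, serves_oba: bool) -> str:
    """Append an AdChoices-style notice link whenever the ad involves OBA."""
    if not serves_oba:
        return ad_html
    notice = (f'<a class="oba-notice" href="{DAA_CHOICES_URL}" '
              'rel="nofollow">AdChoices</a>')
    return f'<div class="ad-slot">{ad_html}{notice}</div>'

print(render_ad_with_notice('<img src="https://ads.example.net/banner.png">', True))
```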
ENFORCEMENT
The DAA is taking its OBA guidelines seriously. It has issued sets of “compliance warnings” to many major U.S. companies. While DAA has no direct authority to impose fines or penalties, its issuance of a ruling finding a violation of its guidelines could create a tempting target for the FTC or plaintiffs’ class action lawyers to bring separate actions against a company not following the DAA guidelines. For all these reasons, operators of websites employing OBA (either first party or third party) should pay heed to the DAA Guidelines.
Data Analytics as a Risk Management Strategy
In our increasingly competitive business environment, companies everywhere are looking for the next new thing to give them a competitive edge. But perhaps the next new thing is applying new techniques and capabilities to existing concepts such as risk management. The exponential growth of data, together with recent technologies and techniques for managing and analyzing it, creates new opportunities to do exactly that.
Enterprise risk management can encompass so much more than merely making sure your business has purchased the right types and amounts of insurance. With the tools now available, businesses can quantify and model the risks they face to enable smarter mitigation strategies and better strategic decisions.
The discipline of risk management in general and the increasingly popular field of enterprise risk management have been around for years. But several recent trends and developments have increased the ability to execute on the concept of enterprise risk management.
First, the amount of data being produced everywhere has exploded and continues to accelerate. The typical executive today is swamped by data coming from all directions. Luckily, just as the raw amount of data has grown, the cost of the hardware to store data has decreased at an exponential rate. For example, in the last 10 years, retail hard-drive costs have dropped from about $1.20 per gigabyte (GB) in 2004 to about 4 cents per GB today. What’s more, the cost of hardware to store all that enterprise data is quickly becoming negligible.
But such huge amounts of data present a problem: somebody has to manage and analyze it. Not all data is equally important or relevant to the problems business executives need to solve or the risks they are trying to manage. The explosion of data has created a greater amount of helpful and relevant data, but that data can get lost in an even greater amount of useless, irrelevant, and distracting data. So an effective data management and analytics program is crucial to take advantage of the opportunities resident in the new flood of data.
One job of analytics is to sort the important from the unimportant and analyze and synthesize the data in new ways that create actionable information. Fortunately, the tools and techniques to manage large volumes of data have been progressing over the past several years. In particular, there has been a lot of buzz about big data. The field of big data has developed from a specific platform to manage large volumes of data into an entire ecosystem of related technologies. These tools are critical to the process of picking out the grains of useful intelligence from the vast quantities of distracting chaff that are characteristic of many big data sources.
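As a toy illustration of sorting the important from the unimportant, the sketch below filters a noisy feed of events down to the categories a risk manager cares about and rolls them up into a simple exposure summary. It uses pandas on invented data; the column names, categories, and figures are assumptions.

```python
# Minimal sketch: reduce a noisy event feed to an actionable risk summary.
# Data, column names, and categories are invented for illustration.
import pandas as pd

events = pd.DataFrame({
    "category": ["phishing", "marketing", "outage", "phishing", "newsletter"],
    "loss_estimate": [12000, 0, 55000, 8000, 0],
})

RELEVANT = {"phishing", "outage"}              # categories tied to enterprise risk
relevant = events[events["category"].isin(RELEVANT)]

summary = (relevant.groupby("category")["loss_estimate"]
           .agg(incidents="count", total_loss="sum")
           .sort_values("total_loss", ascending=False))
print(summary)
```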
Of course, the recent technical developments and analytic techniques that make it possible to extract actionable information from a flood of data are all professionally exciting—if you’re an analyst. However, analytics for analytics’ sake does not help an organization. Too often, analytics groups remain isolated from the business itself. When such groups ultimately present what they have discovered, they may talk mostly about the part most interesting to them—the analytics process—rather than focusing on the resulting information.
It is important to remember that actionable information is the ultimate goal of the entire exercise. The information must reach the decision makers in an understandable form when it is needed—the right information at the right place and at the right time. When designing information systems or even just presenting information to business executives, it is important for technical professionals to keep technical details to a minimum and focus on the actionable information. A feedback mechanism is critical. Users of the information must have a method to tell the creators of the information whether it was sufficient, correct, timely and understandable.
It’s been said that the three most important factors in real estate are location, location, and location. Similarly, the three most important factors in effective analytics are data, data, and data. Good data can sometimes make up for mediocre analytics, but even the best analytics will never produce anything useful from poor data.
Where should a business begin to leverage the new data and risk analytics? It has to start with the data itself. So start collecting and storing the data that’s available to you. Every business generates vast amounts of data every day. Collecting, managing, and analyzing internal data is necessary, but by also looking outside the organization at social media, government data sources and third-party data vendors, a company can really begin to illuminate the environment in which it operates.
Managing data for analytics is a specialized field in its own right, and a topic for another day. But the business that can effectively leverage data and analytics to manage the risks it faces will be rewarded by seeing the future more clearly, making better decisions and ultimately being more successful than those companies that cannot.
Article authored by Phil Hatfield, modeling data services executive for ISO Insurance Programs and Analytic Services (IPAS), a Verisk Analytics (Nasdaq:VRSK) business.
Online Presence Management: You Down with OP…M? Yeah, You Should Be!
The stakes are higher than ever when it comes to your company’s online presence management (OPM), and you should be proactive in ensuring that your company is best positioned for success.
We are talking about total OPM. Yes, it is a real thing. The soaring growth of online media revenues (over 17%, recently), the sophistication of bad actors responsible for “mega-hacks,” and the ever-expanding social media market are but a few of the headlines that top the news on a daily basis.
Public interest is extremely high. As such, the risks and liabilities to your company are self-evident.
As a responsible lawyer (or, at least, someone interested enough in the law to read this blog), you should take a proactive approach to ensuring your organization is aligned with measures to both capitalize on the enormous opportunities presented by, as well as mitigate the risks associated with, managing your company’s total online presence.
So, where do you begin? What are the first steps? We recommend scheduling an internal discussion with your relevant stakeholders to take inventory of where you are with respect to your company’s OPM. You don’t need to involve outside counsel or be an expert in every nuance of the OPM space. Instead, the goal is to start a dialogue between your business team and legal team about the structure and needs of your company, so that you have the information you need to provide, or to seek, artful advice.
Here are the top three agenda items for your initial meeting:
1. Online Contracting Discussion—What agreements do you use, or should you use, on your website? Terms of service? Terms of use? Privacy policies? Codes of conduct? Foreign Corrupt Practices Act policies? Open-source policies? Once the inventory is completed, have a candid discussion with your business/marketing/OPM teams about (1) how each agreement is executed and used within the organization, (2) how updates are communicated, and (3) any pain points experienced by the business team. Special attention should be given to agreements that control services or products that produce revenue or that deal with the handling of important data or information. Understanding the total picture of your company’s online contracting structure allows you to identify risks and install protections to mitigate them.
2. Security Protocol Discussion—What processes are in place to monitor and respond to potential security threats? What would your company do if it suspected a breach? How quickly could it react? What reporting systems are in place to alert responsible OPM team members to suspicious activities? Lawyers, like CEOs, can no longer assume that their company’s IT personnel handle these issues. By understanding the lay of the land, in-house lawyers and well-integrated outside counsel can better respond to emergency situations.
3. Data Leverage Discussion—What data is collected by your company’s internal tools? What data is collected by third-party tools and services? How is the data collected from the website (both personally identifiable and commercial) used by the company? Are there synergies that various business teams could gain from access to either of the above? Understanding what data is collected, especially commercial data such as user tendencies and product information, can help lawyers understand which rights to negotiate when dealing with outside vendors and how to draft privacy policies.
As you can probably tell, the “discussions” approach will likely lead to many tangential discussions and to the identification of issues that you didn’t even realize existed in your organization. This is intentional.
In today’s online environment, you need to be proactive and agile to ensure that your company’s OPM is handled in a responsible, predictable, and measured manner. Having the discussions above will at least give you a starting point to demonstrate a more active approach and likely result in you being able to provide better and more business-focused counsel.
Not Just Your (Company) Email System Anymore! re: NLRB Purple Communications Ruling
On December 10, 2014, the National Labor Relations Board (Board) ruled in Purple Communications, Inc., 361 N.L.R.B. No. 126, that employees have a right, protected by the National Labor Relations Act (Act), to use an employer’s email system during non-working time for communications protected by the Act (e.g., to discuss union issues or to engage in other concerted activities protected by Section 7 of the Act). The Board thus overruled prior precedent, set out in Register Guard, 351 N.L.R.B. 1110 (2007), holding that the Act did not give employees the right to use their employer’s email systems for Section 7 purposes.
A copy of the December 10, 2014 Board decision can be found here. The following passage sums up the scope of the Board’s ruling:
First, [this ruling] applies only to employees who have already been granted access to the employer’s email system in the course of their work and does not require employers to provide such access. Second, an employer may justify a total ban on nonwork use of email, including Section 7 use on nonworking time, by demonstrating that special circumstances make the ban necessary to maintain production or discipline. Absent justification for a total ban, the employer may apply uniform and consistently enforced controls over its email system to the extent such controls are necessary to maintain production and discipline. Finally, we do not address email access by nonemployees, nor do we address any other type of electronic communications systems, as neither issue is raised in this case.
The Board’s decision may be appealed by the employer, but even if it is not appealed, the email issue will likely continue to be litigated before the Board. For now, employers should review their electronic communications policies to ensure compliance with the Board’s new standards or to, at a minimum, understand their risk.
QVC Sues Shopping App for Web Scraping That Allegedly Triggered Site Outage
Operators of public-facing websites are typically concerned about the unauthorized, technology-based extraction of large volumes of information from their sites, often by competitors or others in related businesses. The practice, usually referred to as screen scraping, web harvesting, crawling or spidering, has been the subject of many questions and a fair amount of litigation over the last decade.
However, despite the litigation in this area, the state of the law on this issue remains somewhat unsettled: neither scrapers looking to access data on public-facing websites nor website operators seeking remedies against scrapers that violate their posted terms of use have very concrete answers as to what is permissible and what is not.
In the latest scraping dispute, the e-commerce site QVC objected to the Pinterest-like shopping aggregator Resultly’s scraping of QVC’s site for real-time pricing data. In its complaint, QVC claimed that Resultly “excessively crawled” QVC’s retail site (purportedly sending search requests to QVC’s website at rates ranging from 200-300 requests per minute to up to 36,000 requests per minute), causing a crash that wasn’t resolved for two days and resulting in lost sales. (See QVC Inc. v. Resultly LLC, No. 14-06714 (E.D. Pa. filed Nov. 24, 2014)). The complaint alleges that the defendant disguised its web crawler to mask its source IP address and thus prevented QVC technicians from identifying the source of the requests and quickly repairing the problem. QVC brought several of the causes of action often alleged in this type of case, including violations of the Computer Fraud and Abuse Act (CFAA), breach of contract (based on QVC’s website terms of use), unjust enrichment, tortious interference with prospective economic advantage, conversion and negligence. Of these and other causes of action typically alleged in these situations, the breach of contract claim is often the clearest source of a remedy.
This case is a particularly interesting scraping dispute because QVC is seeking damages for the unavailability of its website, which QVC alleges was caused by Resultly. This is an unusual theory of recovery in these types of cases. For example, this past summer, LinkedIn settled a scraping dispute with Robocog, the operator of HiringSolved, a “people aggregator” employee recruiting service, over claims that the service employed bots to register false accounts in order to scrape LinkedIn member profile data and thereafter post it to its service without authorization from LinkedIn or its members. LinkedIn brought various claims under the DMCA and the CFAA, as well as state law claims of trespass and breach of contract, but did not allege that its service was unavailable due to the defendant’s activities. The parties settled the matter, with Robocog agreeing to pay $40,000, cease crawling LinkedIn’s site and destroy all LinkedIn member data it had collected. (LinkedIn Corp. v. Robocog Inc., No. 14-00068 (N.D. Cal. Proposed Final Judgment filed July 11, 2014).)
However, in one of the early, yet still leading cases on scraping, eBay, Inc. v. Bidder’s Edge, Inc., 100 F. Supp. 2d 1058 (N.D. Cal. 2000), the district court touched on the foreseeable harm that could result from screen scraping activities, at least when taken in the aggregate. In the case, the defendant Bidder’s Edge operated an auction aggregation site and accessed eBay’s site about 100,000 times per day, accounting for between 1 and 2 percent of the information requests received by eBay and a slightly smaller percentage of the data transferred by eBay. The court rejected eBay’s claim that it was entitled to injunctive relief because of the defendant’s unauthorized presence alone, or because of the incremental cost the defendant had imposed on operation of the eBay site, but found sufficient proof of threatened harm in the potential for others to imitate the defendant’s activity.
It remains to be seen whether the parties will reach a resolution or the court will have a chance to interpret QVC’s claims, and, if the case proceeds, whether QVC can offer sufficient evidence of causation between Resultly’s activities and the website outage.
Companies concerned about scraping should make sure that their website terms of use are clear about what is and isn’t permitted, and that the terms are positioned on the site to support their enforceability. In addition, website owners should ensure they are using “robots.txt,” crawl delays and other technical means to communicate their intentions regarding scraping. Companies that are interested in scraping should evaluate the terms at issue and other circumstances to understand the limitations in this area.
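On the scraping side, Python’s standard library already covers the basic courtesies mentioned above: check robots.txt before fetching and honor any declared crawl delay. A minimal sketch, with a hypothetical target site and user agent:

```python
# Minimal polite-crawler sketch: respect robots.txt and any declared crawl delay.
# The target URL and user agent are hypothetical.
import time
import urllib.request
import urllib.robotparser

USER_AGENT = "example-price-bot"

robots = urllib.robotparser.RobotFileParser()
robots.set_url("https://www.example-retailer.com/robots.txt")
robots.read()

url = "https://www.example-retailer.com/products/page1"
if robots.can_fetch(USER_AGENT, url):
    delay = robots.crawl_delay(USER_AGENT) or 5.0        # fall back to a modest pause
    request = urllib.request.Request(url, headers={"User-Agent": USER_AGENT})
    with urllib.request.urlopen(request) as response:
        html = response.read()
    time.sleep(delay)                                     # throttle before the next request
else:
    print("robots.txt disallows fetching this URL; skipping.")
```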
FCC: The New Data Security Sheriff In Town
Data security seems to make headlines nearly every week, but last Friday a new player entered the ring. The Federal Communications Commission (“FCC”) made its first foray into the regulation of data security, an area that has been dominated by the Federal Trade Commission. In its 3-2 vote, the FCC did not tread lightly: it assessed a $10 million fine against two telecommunications companies for failing to adequately safeguard customers’ personal information.
The companies, TerraCom, Inc. and YourTel America, Inc., provide telecommunications services to qualifying low-income consumers for a reduced charge. The FCC found that the companies collected the names, addresses, Social Security numbers, driver’s licenses, and other personal information of over 300,000 consumers. The data was stored on Internet servers without password protection or encryption, allowing public access to the data through Internet search engines. This, the FCC found, exposed consumers to “an unacceptable risk of identity theft.”
The FCC charged the companies with violation of Section 222(a) of the Communications Act, which it interpreted to impose a duty on telecommunications carriers to protect customers’ “private information that customers have an interest in protecting from public exposure,” whether for economic or personal reasons. Additionally, the companies were charged with violation of Section 201(b), which requires carriers to treat such information in a “just and reasonable” manner.
The companies were determined to have violated Sections 201(b) and 222(a) by failing to employ “even the most basic and readily available technologies and security features.” The companies further violated Section 201(b), the FCC found, by misrepresenting in their privacy policies and statements on their websites that they employ reasonable and updated security measures, and by failing to notify all of the affected customers of the data breach.
Commissioners Ajit Pai and Michael O’Rielly dissented, arguing, among other things, that the FCC had not previously interpreted the Communications Act to impose an enforceable duty to employ data security measures and to notify customers in the event of a breach. Now that the FCC has interpreted the Act this way, however, we can expect it to keep a close eye on data security.
The FCC made clear that protection of consumer information is “a fundamental obligation of all telecommunications carriers.” Friday’s decision also makes clear that the FCC will enforce notification duties in the event of a breach, and will look closely at carriers’ privacy policies and online statements regarding data security.