Overview
On January 6, 2016, the Federal Trade Commission (FTC) released a report titled "Big Data: A Tool for Inclusion or Exclusion." The report highlights concerns that big data projects could result in the illegal exclusion of certain groups, such as low-income populations, from benefits made available to others. The report does not contain any legislative recommendations, but it does identify key legal and compliance issues that can arise when engaging in big data analytics. For instance, it cites examples where poor-quality data sets produced inaccurate predictions that harmed affected consumers and employees. Accordingly, the report encourages companies to apply big data analytics in ways that provide benefits and opportunities to consumers.
FTC Issues Important Report on Big Data, Compliance, and Consumer Protection
On January 6, the Federal Trade Commission (FTC) released a report titled Big Data: A Tool for Inclusion or Exclusion, which describes how uses of big data can run afoul of federal laws and regulations. Although the FTC report also praises the consumer benefits companies have delivered through big data analytics, such as increasing educational opportunities for otherwise underserved students and tailoring health care to individual patients' needs, it highlights the potential for unintended and exclusionary effects of big data uses. In particular, the report raises concerns that big data projects could result in the illegal exclusion of certain groups, such as low-income and underserved populations, from benefits made available to others. The report follows the FTC's Sept. 15, 2014, public workshop titled "Big Data: A Tool for Inclusion or Exclusion?" and a March 19, 2014, seminar on "alternative scoring products."
The FTC report does not contain any legislative recommendations, but it does identify key legal and compliance issues that can arise when engaging in big data analytics. The report cites a number of examples where poor-quality or incomplete data sets delivered flawed and unrepresentative results, leading to inaccurate predictions and potential harm to affected consumers and employees. For example, the report takes note of credit card companies that lower a customer's credit limit, not based on the customer's own payment history, but based on analysis of other customers with poor repayment histories who shopped at the same establishments where the customer shops. The report also expresses concern that using big data to target advertisements to specific populations could lead to biased sales or lending practices. It suggests, for example, that advertising driven by big data might result in low-income consumers never receiving advertisements for more advantageous financial products. Research also shows that big data can result in companies charging different prices to consumers in different ZIP codes.
Accordingly, the report describes three bodies of federal law that could affect companies considering big data projects: the Fair Credit Reporting Act (FCRA) and its associated regulations; equal opportunity laws; and Section 5 of the Federal Trade Commission Act (FTC Act).
The Fair Credit Reporting Act
FCRA regulates consumer reporting agencies (CRAs) that compile and sell "consumer reports," how those consumer reports may be used, and how companies provide information to a CRA. Entities that furnish personal information to others for use in making employment, credit, insurance, housing, or other decisions could be considered CRAs, even if they attempt to disclaim that role and instruct their clients not to use the furnished personal information for a covered eligibility decision. Entities using consumer reports for covered eligibility decisions are also required to take certain steps, including providing "adverse action" notices in appropriate cases.
The FTC's report calls companies' attention to the potential applicability of the FCRA to big data and data analytics. The report suggests that companies should pause to consider whether the use of novel data sets or analytical scoring models implicates the FCRA's requirements. It notes that, although big data sets might include information beyond that found in traditional credit history reports, using big data to analyze information about a specific individual could implicate the FCRA, even if that information is drawn from social media behavior or other online activity. For example, although a company using its own data generally will not be considered a CRA or be deemed to be using a "consumer report," a company that provides its data to an unaffiliated firm and then acts on reports received back from that firm might, depending on the facts, be using "consumer reports," and the other firm might be considered a CRA. The report cautions, however, that this and other questions about the FCRA's applicability are highly fact-specific and will be resolved on a case-by-case basis.
Equal Opportunity Laws
The FTC report also notes that a number of federal laws, including the Equal Credit Opportunity Act (ECOA), Title VII of the 1964 Civil Rights Act, the Americans with Disabilities Act (ADA), the Age Discrimination in Employment Act (ADEA), and the Genetic Information Nondiscrimination Act (GINA), prohibit discrimination in different contexts based on "protected characteristics" such as race, color, sex or gender, religion, age, disability status, and genetic information. Using a big data model that suggests different treatment for a group of persons on the basis of a protected characteristic, the FTC report warns, could violate one or more of these laws.
The report also notes that many of these laws contain prohibitions on practices that result in "disparate treatment" or a "disparate impact" on persons with a common protected characteristic. Even if a company does not, for example, "expressly" consider gender in evaluating job applications, the use of "big data analytics to screen job applicants in a way that has a disparate impact on women may still be subject to certain equal employment opportunity laws" and could subject the company to enforcement actions or civil suits. The report also cautions creditors, in particular, to be mindful of the use of big data in targeting advertisements. Taking note of Regulation B under the ECOA, the report reminds creditors not to use targeted advertisements to make statements that might "discourage on a prohibited basis a reasonable person from making or pursuing an application." Ads targeted at a population on the basis of a protected characteristic that run afoul of this prohibition could be found to violate the ECOA.
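To make the disparate impact concept concrete, the following is a minimal, hypothetical sketch (it is not drawn from the FTC report) of the kind of preliminary check an analytics team might run on a screening model's outcomes. It applies the "four-fifths rule" from the EEOC's Uniform Guidelines on Employee Selection Procedures, under which a selection rate for any group that is less than 80 percent of the rate for the highest-scoring group is generally regarded as evidence of adverse impact. The applicant data, group labels, and output format are all invented for illustration.

```python
# Hypothetical illustration: flagging potential disparate impact in a
# screening model's outcomes using the EEOC "four-fifths" rule of thumb.
# The outcome data and group labels below are invented for this sketch.

from collections import defaultdict

# (group, selected) pairs, e.g. the output of a big-data hiring screen.
outcomes = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

counts = defaultdict(lambda: [0, 0])  # group -> [selected, total]
for group, selected in outcomes:
    counts[group][0] += int(selected)
    counts[group][1] += 1

rates = {g: sel / total for g, (sel, total) in counts.items()}
best = max(rates.values())

for group, rate in rates.items():
    ratio = rate / best
    # Under the four-fifths rule, a ratio below 0.8 is commonly treated
    # as preliminary evidence of adverse (disparate) impact.
    flag = "REVIEW" if ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.2f}, ratio {ratio:.2f} -> {flag}")
```

A check of this kind is only a screening heuristic, not a legal conclusion; a flagged result would be the starting point for the fact-specific legal analysis the report describes, not a substitute for it.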
The Federal Trade Commission Act
In addition to sector-specific and characteristic-specific laws, uses of big data consisting of consumer information are subject to Section 5 of the FTC Act, which prohibits "unfair" or "deceptive" acts or practices in or affecting commerce. As the report highlights, "[u]nlike the FCRA or equal opportunity laws, Section 5 is not confined to particular market sectors but is generally applicable to most companies acting in commerce." The "deception" prong of Section 5 involves material statements or omissions that are likely to mislead consumers. The report notes that this includes material statements or omissions regarding the use of big data, such as promises not to share data or failing to disclose material information relating to its use. For example, if an organization states in its privacy statement that it will not share information collected from its customers with third parties, it would be deceptive if that organization subsequently shared the collected and consolidated customer information (i.e., the "big data") with third parties.
"Unfair" acts or practices under Section 5 are those resulting in injury to consumers where the injury is (i) substantial, (ii) not outweighed by any countervailing benefits to consumers or competition and (iii) not reasonably avoidable by consumers. The report focuses on two general practices of concern: securing big data, and the sale of data to customers who may use it for fraudulent or discriminatory purposes. Companies should reasonably secure big data consistent with the amount and sensitivity of the data maintained, and failure to do so may be an unfair practice. Big data consisting of sensitive categories of information, such as Social Security numbers or medical information, requires "particularly robust security measures." A second potentially unfair practice occurs if a company sells data to customers that a company knows or has reason to know will use the data for fraudulent or discriminatory purposes. Notably, the examples of "unfair" and "deceptive" acts contained in the report are far from exhaustive. As the report emphasizes, the "inquiry will be fact-specific, and in every case, the test will be whether the company is offering or using big data analytics in a deceptive or unfair way."
Special Policy Considerations Raised by Big Data Research
To encourage companies to maximize the benefits of big data analytics and limit harm, the report concludes by offering questions for companies to consider:
1. How representative is your data set? If analyzed data sets are missing information about certain populations, those populations may be ignored or otherwise harmed. Organizations should therefore "[c]onsider whether [their] data sets are missing information from particular populations and, if they are, take appropriate steps to address this problem." (A brief illustrative sketch of such a check appears after this list.)
2. Does your data model account for biases? Data analytics can reproduce biases present during the collection and analysis of data, thereby perpetuating those biases (even unintentionally). Organizations should "[r]eview [their] data sets and algorithms to ensure that hidden biases are not having an unintended impact on certain populations."
3. How accurate are your predictions based on big data? Correlations derived from big data analytics are not always meaningful and, if relied upon without further scrutiny, may produce harmful or wasteful outcomes. Stated differently, correlation is not causation: just because two factors are related does not mean that one causes or explains the other. Accordingly, organizations should "balance the risks of using [the] results [of data analytics]," and "[i]t may be worthwhile to have human oversight of data and algorithms when big data tools are used to make important decisions, such as those implicating health, credit, and employment."
4. Does your reliance on big data raise ethical or fairness concerns? Even when correlations derived from big data analytics are meaningful, overreliance on these meanings "could potentially result in a company not thinking critically about the value, fairness, and other implications of their uses of big data."
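As a companion to questions 1 and 2 above, here is a minimal, hypothetical sketch of how an analytics team might compare a data set's composition against an external benchmark (for example, census figures) to surface under-represented populations before any modeling begins. The group names, counts, benchmark shares, and the 50 percent flagging threshold are all assumptions made for this illustration; none of them come from the FTC report.

```python
# Hypothetical illustration for questions 1 and 2 above: comparing a
# data set's group composition to an external benchmark to surface
# populations that are under-represented before any model is trained.
# All figures below are invented for this sketch.

dataset_counts = {"urban": 8200, "suburban": 1200, "rural": 300}

# Assumed reference shares for the population the model's results will
# be applied to (e.g., derived from census data).
benchmark_shares = {"urban": 0.55, "suburban": 0.30, "rural": 0.15}

total = sum(dataset_counts.values())

for group, count in dataset_counts.items():
    observed = count / total
    expected = benchmark_shares[group]
    # Flag groups whose share of the data falls well below their share
    # of the population the results will be applied to.
    if observed < 0.5 * expected:
        print(f"{group}: {observed:.1%} of data vs {expected:.1%} expected -> under-represented")
    else:
        print(f"{group}: {observed:.1%} of data vs {expected:.1%} expected -> ok")
```

A flagged group would prompt the "appropriate steps" the report calls for, such as collecting additional data or adjusting how the model's outputs are used for that population.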
In short, the FTC report "encourages companies to apply big data analytics in ways that provide benefits and opportunities to consumers, while avoiding pitfalls that may violate consumer protection or equal opportunity laws, or detract from core values of inclusion and fairness."
Conclusion
The FTC report highlights the potential pitfalls that companies face when utilizing big data analytics and identifies key factors that companies should consider when employing a big data analytics approach. When leveraging big data analytics, in-house counsel should implement data collection and analysis processes that ensure underlying data sets are complete and accurate and do not contain hidden biases. Furthermore, in-house counsel should take reasonable steps to investigate whether correlations derived from data analytics are truly meaningful and, if so, whether the company's reliance on those correlations is appropriate and proportional.