CPPA Releases Public Comments for CPRA Regulations
11 Jan '22

Public comments submitted in response to the California Privacy Protection Agency’s recent invitation for preliminary comments on rulemaking under the California Privacy Rights Act (CPRA) show that stakeholders sharply disagree on multiple areas of the law. Seventy submissions totaling nearly 900 pages were published by the Agency following a forty-five-day comment period. Some of the contested topics are summarized below.1

The CPRA, which amends and expands the California Consumer Privacy Act (CCPA), will become effective on January 1, 2023. The CPRA establishes the California Privacy Protection Agency (CPPA or “Agency”), which has authority to update existing CCPA regulations and adopt new regulations implementing the CPRA.2

The California Attorney General’s Office published an initial set of final regulations governing compliance with the CCPA, which went into effect on August 14, 2020. Additional amendments to the final regulations went into effect on March 15, 2021. The CPRA now directs the new Agency to engage in further rulemaking on a variety of topics.

To that end, the Agency solicited preliminary written comments from the public from September 22, 2021 through November 8, 2021, specifically asking about “new and undecided issues” not addressed by the existing CCPA regulations.3 These issues include the following:

  1. Cybersecurity audits and risk assessments performed by businesses: The CPRA directs the Agency to issue regulations requiring businesses “whose processing of consumers’ personal information presents significant risk to consumers’ privacy or security” to perform annual cybersecurity audits, to conduct regular risk assessments and to submit those risk assessments to the Agency.4 The Agency sought comments on, among other things, the following topics: when a business’s processing of personal information presents a significant risk to consumers’ privacy or security; what businesses should be required to cover in annual cybersecurity audits; what businesses should cover in their regular risk assessments and how often they should submit them; and when “the risks to the privacy of the consumer [would] outweigh the benefits of” businesses processing consumer information.5
  2. Automated decision making: To issue regulations governing consumers’ access and opt-out rights with respect to businesses’ use of automated decision-making technology,6 the Agency sought comments regarding the types of activities that should be deemed to constitute “automated decision-making technology” and/or “profiling”; when consumers should be able to access information about businesses’ use of such technology; the types of information businesses must provide to consumers in response to access requests; and the scope of and processes governing consumers’ opt-out rights regarding automated decision making.
  3. The Agency’s audit authority: The CPRA gives the Agency authority to audit businesses’ compliance with the law.7 The Agency sought public comments on the scope of its audit authority, criteria to select businesses to audit and safeguards to protect consumers’ personal information from disclosure to an auditor.8
  4. Consumers’ right to delete, right to correct and right to know: The CCPA gives consumers the right to request deletion of their personal information, to know what personal information is being collected, to access that personal information and to know what categories of personal information are being sold or shared.9 The CPRA amended the CCPA to add a new right to request correction of inaccurate personal information.10 The Agency sought comments regarding procedures for consumers to correct inaccurate personal information, such as how often and under what circumstances a consumer may make such a request and how businesses should respond. The Agency also requested comments on the circumstances under which a business should be exempt from the obligation to act on a consumer request because compliance would be “impossible, or involve a disproportionate effort.”11
  5. Opt-out preference signals: The CPRA provides for additional rulemaking to update the CCPA rules on the right to opt out of the sale of personal information.12 To that end, the Agency sought comments to inform these regulations, including what requirements and technical specifications should be established for an opt-out preference signal.
  6. Consumers’ right to limit the use and disclosure of sensitive personal information: The CPRA amends the CCPA to give consumers additional rights over a new category of information: “sensitive personal information,”13 which includes, for example, social security numbers, information allowing access to a financial account, precise geolocation information, information about race, ethnicity, sexual orientation, religious or philosophical beliefs, and genetic data. The Agency sought comments on what constitutes “sensitive personal information” that should be deemed “collected or processed without the purpose of inferring characteristics about a consumer” and therefore not subject to the right to limit use and disclosure. In addition, the Agency requested comments on permissible uses or disclosures of consumers’ sensitive personal information.
  7. Information to be provided in response to a consumer’s request to know: In response to a consumer’s request, the CPRA requires businesses to disclose certain information covering the 12 months prior to the date of the request.14 However, for all information processed on or after January 1, 2022, consumers may request information beyond the 12-month window unless a business determines that providing information beyond the 12-month window is “impossible” or “would involve a disproportionate effort.”15 The Agency sought comments on what standards should govern this exception.
  8. Definitions of various terms: The CCPA and CPRA provide for regulations to create or update definitions of certain important terms. The Agency sought comments to assist it with deciding whether to update or create definitions for such terms, including but not limited to the following: “personal information,” “sensitive personal information,” “deidentified,” “precise geolocation” and “dark patterns.”

The nearly 900 pages of submissions came from a wide range of individuals and organizations, including trade associations, companies from various industry sectors, consumer rights groups and academics. The public comments reveal significant disagreements in policy and statutory interpretation among these stakeholders. Below are high-level examples of hotly contested issues.

Privacy and Security Risk Assessments

The CPRA requires organizations engaged in data processing that poses a “significant risk” to consumer privacy and security to conduct risk assessments on a “regular basis” and submit them to the Agency.16 However, the law does not specify when and how frequently businesses should conduct and submit assessments, or the scope of and procedures for completing such assessments.

Multiple industry groups suggested that privacy and cybersecurity risk assessments should be submitted to the Agency only upon request, which would be consistent with current Virginia and Colorado privacy laws. They expressed concern that adopting strict or formalistic reporting requirements would place unnecessary burdens on both businesses and the Agency.

Civil society organizations typically sought to impose expansive assessment requirements on covered businesses, with one coalition arguing that assessments should be conducted prior to any change in business practices that “might alter the resulting risks to individuals’ privacy” and be resubmitted to the Agency at 6-month intervals.

Californians for Consumer Privacy, a nonprofit organization which had a role in drafting the CPRA ballot initiative, suggested a graduated approach, which would initially require risk assessments from large processors of personal information.

Automated Decision-Making Technology

The CPRA directs the Agency to develop regulations governing access and opt-out rights with respect to the use of automated decision-making technology, including “profiling.” Although the CPRA does not define “automated decision-making technology,” it defines “profiling” as “any form of automated processing of personal information, as further defined by regulations pursuant to paragraph (16) of subdivision (a) of Section 1798.185, to evaluate certain personal aspects relating to a natural person and in particular to analyze or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location, or movements.” Id. § 1798.140(z).

Multiple commenters recommended that the Agency regulate only automated decision-making technology that produces “legal or similarly significant effects” for consumers. Such effects would include the automatic refusal of an online credit application, employment decisions made by online job recruitment platforms and other decisions that affect a consumer’s eligibility for credit, employment, insurance, rental housing, or a license or other government benefit. Some industry groups pointed to a variety of low-risk, socially beneficial automated tools (e.g., calculators, spellcheckers) that should not be swept up by overly broad regulation.

By contrast, the Electronic Frontier Foundation argued that the Agency’s regulations should more broadly define automatic decision-making technology to include “systems that provide recommendations, support a decision, or contextualize information.”

Opt-Out Signals

The Attorney General’s FAQ page states that a browser signal known as the Global Privacy Control “must be honored by covered businesses as a valid consumer request to stop the sale of personal information.”17 However, commenters disagree on whether the CPRA requires businesses to honor the Global Privacy Control.
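As technical background: under the Global Privacy Control proposal, a participating browser attaches the HTTP request header `Sec-GPC: 1` to outgoing requests (and exposes the signal to page scripts via `navigator.globalPrivacyControl`). A minimal server-side sketch of detecting the signal might look like the following; the function name and the headers-as-dictionary interface are illustrative assumptions, not anything prescribed by the statute or the draft regulations.

```python
def honors_gpc(headers: dict) -> bool:
    """Return True if the request carries a Global Privacy Control signal.

    Per the GPC proposal, user agents express the opt-out preference
    with the HTTP request header `Sec-GPC: 1`. (Illustrative sketch,
    not an implementation of any regulatory requirement.)
    """
    # HTTP header names are case-insensitive, so normalize before lookup.
    normalized = {name.lower(): value.strip() for name, value in headers.items()}
    return normalized.get("sec-gpc") == "1"


# Example: a request with the signal set vs. one without it.
print(honors_gpc({"Sec-GPC": "1"}))  # True
print(honors_gpc({}))                # False
```

A business treating the signal as a valid opt-out request would route consumers whose requests return `True` here into its do-not-sell workflow; how to resolve conflicts with a consumer's service-specific opt-in is precisely one of the points commenters dispute below.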

Some industry groups, including the California Chamber of Commerce, advocated for providing businesses with sufficient flexibility in responding to Global Privacy Control signals, particularly where they receive competing signals (e.g., when a person opts out through a universal control but opts in for a specific service). They also interpreted Section 1798.135(b) to suggest it is optional for a business to recognize an opt-out signal.

By contrast, Mozilla, which recently implemented the Global Privacy Control in the Firefox browser, urged the Agency to expressly require that companies comply with the Global Privacy Control under the CPRA. Consumer rights groups also argued that the CPRA expressly mandates the recognition of global privacy signals, pointing to Section 1798.135(e). Californians for Consumer Privacy likewise argued that “there is no reading of the statute that would allow a business to [refuse] to honor a global opt-out signal enabled by a consumer.”

Definition of Dark Patterns

The CPRA defines “dark patterns” as “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice, as further defined by regulation.”18 The Agency’s Board Chairperson has flagged the concept of dark patterns as a potential subject for discussion at future Agency informational hearings.19

Some industry groups argued that the definition of “dark patterns” under the CPRA would encompass any interface that could be interpreted as impairing user choice. These organizations requested a narrower definition of “dark patterns” to focus on design practices that amount to consumer fraud. Other commenters suggested that the Agency should define the term as broadly as possible beyond the context of consent interfaces.

Conclusion

The Agency intends to publish its initial set of proposed regulations in early 2022, ahead of the CPRA’s July 1, 2022 rulemaking deadline. It remains to be seen how the Agency will address the wide range of viewpoints raised by the commenters. We will continue to monitor and provide updates on significant developments in the Agency’s rulemaking process.

If you have any questions, please contact a member of the Akin Gump Cybersecurity, Privacy and Data Protection team.


1 CPPA Regulations, available at https://cppa.ca.gov/regulations/.

2 See, e.g., Cal. Civ. Code § 1798.185.

3 See Invitation for Preliminary Comments on Proposed Rulemaking Under the California Privacy Rights Act of 2020, available at https://cppa.ca.gov/regulations/pdf/invitation_for_comments.pdf.

4 Cal. Civ. Code § 1798.185(a)(15).

5 Id.

6 Id. § 1798.185(a)(16).

7 Id. § 1798.199.65.

8 Id. § 1798.185(a)(18).

9 Id. §§ 1798.105, 1798.110, 1798.115, and 1798.130.

10 Id. §§ 1798.105 and 1798.130.

11 Id. § 1798.185(a)(8)(A).

12 Id. §§ 1798.185(a)(4) and 1798.185(a)(19)-(20).

13 Id. § 1798.121.

14 Id. § 1798.130(a)(2)(B).

15 Id. § 1798.130(a)(2)(B).

16 Cal. Civ. Code § 1798.185(a)(15).

17 California Consumer Privacy Act, Frequently Asked Questions, available at https://oag.ca.gov/privacy/ccpa.

18 Cal. Civ. Code § 1798.140(l).

19 California Privacy Protection Agency Board Meeting, Draft Meeting Minutes, available at https://cppa.ca.gov/meetings/materials/20211115_item3c.pdf.