On December 8, 2023, the California Privacy Protection Agency (“CPPA”) Board (the “Board”) held a public meeting to discuss, among other things, regulations addressing: (1) cybersecurity audits; (2) risk assessments; and (3) automated decisionmaking technology (“ADMT”).  After years in the making, the December 8 Board meeting was another step towards the final rulemaking process for these regulations.  The Board’s discussion of the draft regulations revealed their broad implications for businesses covered by the California Consumer Privacy Act (“CCPA”) and the challenging task of balancing the competing interests of privacy, the administrative burden of compliance, and the free market for goods, services, and employment.

Cybersecurity Audit Regulations

Previously, on September 8, 2023, the Board discussed an earlier draft of the cybersecurity audit regulations, which we wrote about here.  In advance of the Board’s December 8 meeting, the CPPA posted an updated draft.  At the December 8 meeting, as at the September 8 meeting, the Board again focused on the revenue and processing thresholds that trigger the cybersecurity audit requirement—a topic of primary importance given the potential administrative burden for companies that fall within the regulations’ scope.  At the outset, the CPPA will require cybersecurity audits only for a subset of CCPA-covered businesses.  In the most recent draft of the regulations, the criteria are: (1) deriving 50 percent or more of annual revenue from selling or sharing consumers’ personal information; or (2) having $25 million in gross annual revenue, and either (a) processing the personal information of 250,000 or more consumers in the preceding calendar year, (b) processing the sensitive personal information of 50,000 or more consumers in the preceding calendar year, or (c) processing the personal information of 50,000 consumers that the business had actual knowledge were less than 16 years of age.  An economic analysis has not yet been finalized, so the Board could not determine the economic impact these thresholds would have on California businesses.  As a result, the threshold criteria might change after the formal rulemaking process begins.

Also noteworthy was Board member Alastair MacTaggart’s observation that the draft cybersecurity audit regulations might implicate compelled speech in light of the Northern District of California’s ruling in Netchoice, LLC v. Rob Bonta.  In Netchoice, the court preliminarily enjoined enforcement of the California Age-Appropriate Design Code Act, finding that provisions that “require covered businesses to identify and disclose to the government potential risks to minors and to develop a timed plan to mitigate or eliminate the identified risks” regulate the distribution of speech and therefore trigger First Amendment scrutiny.  In that regard, Board member MacTaggart was concerned that the cybersecurity audit regulations might similarly trigger First Amendment scrutiny by requiring businesses to assess and document negative impacts to consumers associated with a breach, including psychological harm.  It is possible that this requirement will be scaled back during the final rulemaking process.

As a procedural matter, the CPPA staff will revise the regulations for readability and clarity, and then, after the Board approves those changes, the regulations will proceed to the formal rulemaking’s 45-day public comment period.  Ultimately, the cybersecurity audit regulations appear to be in near-final form before public comment begins.

Risk Assessment Regulations

The Board next addressed the updated draft risk assessment regulations, which it also posted in advance.  A key focus was the cadence of risk assessments under the draft regulations.  The draft provides that initial risk assessments must be conducted within twenty-four months of the effective date of the regulations.  After some discussion, the Board was inclined to use a three-year cadence for updating and reviewing risk assessments, the longest of the available options.  As Board member Lydia De La Torre explained, the longer time period was justified in part by the significant undertaking a risk assessment can entail.  Indeed, unlike other jurisdictions, California’s risk assessments must also consider employment-related information, which increases the administrative burden of compliance.

During the meeting, Board members also raised concerns about the breadth of the risk assessment regulations.  For example, Board member MacTaggart noted that the requirement to conduct risk assessments when “profiling a consumer while they are in a publicly accessible space” could be interpreted broadly to encompass the use of a mobile application in public.

The Board agreed to provide additional feedback before moving the risk assessment regulations forward to the formal rulemaking and public comment process.  In that regard, the draft risk assessment regulations appear to be on a slower track than the cybersecurity audit regulations.  The Board also noted that the ADMT regulations, discussed below, and the risk assessment regulations are interrelated and therefore might progress on a similar track.

Automated Decisionmaking Technology Regulations

The Board next discussed the recently released draft ADMT regulations publicly for the first time.  ADMT is broadly defined as “any system, software, or process—including one derived from machine learning, statistics, or other data-processing or artificial intelligence—that processes personal information and uses computation as whole or part of a system to make or execute a decision or facilitate human decisionmaking.”  Board members noted that this is a very broad definition that could include any application or program that assists in decision-making.  In their current form, however, the draft ADMT regulations do not regulate all use of ADMT.  The primary obligations and rights imposed by the regulations—i.e., the “pre-use” notice and the rights to opt out and access information—would be required only when ADMT is used in certain contexts: when making a decision that produces legal or similarly significant effects; when profiling a consumer who is acting in their capacity as an employee, independent contractor, job applicant, or student; and when profiling in a publicly accessible space.  At the meeting, the Board considered clarifying and potentially limiting the scope of ADMT use covered by these regulations.  For example, the Board discussed introducing the concept of “intrusive profiling” into the regulations.

Questions regarding the ADMT opt-out right were top of mind for the Board members.  Board member Jeffrey Worthe questioned whether businesses would be required to provide goods or services to consumers who opt out of ADMT.  Similarly, Board member MacTaggart questioned whether employees would have a protected right to opt out of ADMT that might promote employee safety or merely monitor employee attendance at work.  The theme, again, was how broadly these ADMT regulations should sweep, potentially reaching beyond the realm of privacy.  Considering the CCPA’s prohibition on discriminating against consumers who exercise their rights, the implications of an ADMT opt-out right will create difficult legal issues for businesses covered by the CCPA.

The Board authorized staff to receive additional feedback on the ADMT regulations from individual Board members and to propose a revised draft at a future meeting.  It is likely that the ADMT regulations will undergo meaningful revisions before the formal rulemaking process begins.

Looking Forward

While the cybersecurity audit regulations should be moving forward to the formal rulemaking process soon (likely at the next Board meeting), the risk assessment and ADMT regulations will require additional refinement and clarification.  Businesses that are covered by the CCPA should start considering the obligations these new regulations might impose to stay ahead of the curve.
