
Quarterly Compliance Update: Fall 2025

  • October 7, 2025
  • Michael Kendrick
  • Approx. Read Time: 13 Minutes
  • Updated on October 7, 2025

From landmark U.S. Supreme Court decisions to sweeping new privacy enforcement efforts, the regulatory landscape continues to shift in ways that impact compliance leaders across every industry. In this quarter’s roundup, we highlight emerging trends in employment law, AI regulation, immigration enforcement, data privacy, and international data transfers. With increased scrutiny from regulators and rising litigation risks, staying current on these updates is more critical than ever.

 

 

Key Takeaways

        • The U.S. Supreme Court unanimously ruled that all Title VII plaintiffs—regardless of group status—must meet the same standard to raise a claim.

        • California, Colorado, and Connecticut launched a joint privacy enforcement sweep targeting failures to honor consumer opt-out requests.

        • A class action lawsuit against Otter.ai signals growing risks around the use of AI transcription and notetaking tools without proper consent.

        • The DOJ’s Data Security Program is now enforceable, and companies must finalize compliance programs by October 6, 2025.

        • California’s new AI employment regulations, effective October 1, 2025, impose significant responsibilities on employers using automated decision-making systems.

 

 

Table of Contents

  1. Federal Updates
  2. State, City, County, & Municipal Updates
  3. International Updates
  4. Conclusion

 


 

FEDERAL UPDATES

 

Supreme Court Unanimously Rejects Distinction Between “Majority” and “Minority” Plaintiffs in Employment Discrimination Claims: What’s Next for the McDonnell Douglas Framework?

On June 5, 2025, the Supreme Court unanimously held in Ames v. Ohio Dept. of Youth Services that all plaintiffs, regardless of race, color, religion, national origin, or sex, must satisfy the same proof requirements to meet their initial burden of establishing an employment discrimination claim under Title VII of the Civil Rights Act. That initial burden requires only that a plaintiff show that she applied for a position for which she was qualified and was rejected “under circumstances which give rise to an inference of unlawful discrimination.” The Court concluded that the plaintiff in this case, a straight woman, was unlawfully burdened by the Sixth Circuit’s additional requirement that “majority-group” plaintiffs provide more evidentiary support than “minority-group” plaintiffs to raise an initial inference of discrimination.

 

The Ames decision applies the legal framework mentioned above, which the Court first outlined in McDonnell Douglas v. Green.

 

This framework is used in most jurisdictions for proving intentional discrimination claims in the absence of direct evidence of discrimination. In Ames, the Court limited its holding to a rejection of the extra burden the Sixth Circuit and four other federal Courts of Appeals had imposed on a subset of “majority-group” plaintiffs. Although the Court’s decision is not surprising, it has drawn particular interest from employment lawyers. In a concurrence joined by Justice Gorsuch, Justice Thomas suggested that the Supreme Court, when next presented with the opportunity, should consider sweeping changes to the McDonnell Douglas framework, despite its having been accepted by the Court and applied by the lower courts since 1981.

 

Click Here for the Original Article

 

 

AI Notetaking Tools Under Fire: Lessons from the Otter.ai Class Action Complaint

 

The rapid adoption of AI notetaking and transcription tools has transformed how organizations (and individuals) capture, analyze, and share meeting and other content. But as these technologies expand, so too do the legal and compliance risks. A recent putative class action lawsuit filed in federal court in California against Otter.ai, a leading provider of AI transcription services, highlights the potential pitfalls for organizations relying on these tools.

 

The Complaint Against Otter.ai

 

Filed in August 2025, Brewer v. Otter.ai alleges that Otter’s “Otter Notetaker” and “OtterPilot” services recorded, accessed, and used the contents of private conversations without obtaining proper consent. According to the complaint, the AI-powered notetaker:

    • Joins Zoom, Google Meet, and Microsoft Teams meetings as a participant and transmits conversations to Otter in real time for transcription.
    • Records meeting participants’ conversations even if they are not Otter accountholders. The lead plaintiff in this case is not an Otter accountholder.
    • Uses those recordings to train Otter’s automatic speech recognition (ASR) and machine learning models.
    • Provides little or no notice to non-accountholders and shifts the burden of obtaining permissions onto its accountholders.

The lawsuit asserts a wide range of claims, including violations of:

    • Federal law: the Electronic Communications Privacy Act (ECPA) and the Computer Fraud and Abuse Act (CFAA).
    • California law: the California Invasion of Privacy Act (CIPA), the Comprehensive Computer Data and Fraud Access Act, common law intrusion upon seclusion and conversion, and the Unfair Competition Law (UCL).

Click Here for the Original Article

 

DOJ’s 90-Day Data Security Compliance Grace Period is Over: Are You Compliant?

The U.S. Department of Justice (“DOJ”) Data Security Program (“DSP”) 90-day enforcement grace period ended on July 8, 2025. While the program became effective April 8, 2025, DOJ implemented a 90-day grace period, through July 8, 2025, for good-faith compliance efforts (see our previous blog here). With the expiration of the grace period, the majority of the DSP is now effective and will be enforced.

Background

As a reminder, the DOJ DSP aims to protect Americans’ sensitive personal data and certain U.S. Government-related data from foreign adversaries (see our blog here for more details on the rule). Specifically, the program prohibits or restricts “covered data transactions,” i.e., any transaction that involves any access by a country of concern (China, Russia, Iran, North Korea, Cuba, and Venezuela) or covered person to any bulk U.S. sensitive personal data or government-related data (as defined in the regulations) and that involves data brokerage; a vendor agreement; an employment agreement; or an investment agreement. Common types of data that will be subject to this rule include health and biometric data; human genomic data; financial data; personal health data; government identification numbers (such as social security numbers); demographic and contact information; and network, device, and advertising identifiers.

Enforcement Timeline and Path to Compliance

While the majority of the DSP is now effective and will be enforced as of July 8, 2025, the DSP includes another deadline for companies to establish required internal policies and procedures. By October 6, 2025, companies must implement the final requirements of the DSP to create a data compliance program (if participating in restricted transactions) and comply with reporting and auditing requirements.

It is crucial that companies evaluate and strengthen their data practices in advance of the upcoming October 6, 2025 deadline. Specifically, U.S. entities subject to the DOJ DSP should evaluate the following when shoring up compliance efforts.

    • Risk-based procedures for data security
    • Vendor management and validation
    • Written data and security policies with annual certification
    • Employee training programs
    • Dedicated compliance personnel
    • Audit, record-keeping, and reporting procedures for data security compliance

Companies should not delay implementing compliance programs, especially given the potential enforcement penalties associated with the DSP. The DOJ may bring civil enforcement actions and criminal prosecutions for knowing or willful violations of DSP requirements.

Click Here for the Original Article

 

 

ICE is Chomping at the Bit: Get Ready Now

 

In recent weeks, a Louisiana racetrack was raided by Immigration and Customs Enforcement (ICE), and 84 people were arrested. Several central Kentucky farms were recently contacted by ICE and warned of fines regarding undocumented workers. The federal government’s promised crackdown on illegal immigration raises concerns for public and private facilities in both the horse racing industry and the sport horse industry. Employers can expect that ICE may show up on their property or in public settings where racing or horse shows take place, or may conduct audits to examine whether any employees are not properly authorized to work. The time to plan is before this happens, ensuring that employees and managers know what to expect and understand their rights.

 

Most horse farms, horse trainers, racetracks, and their employees do not know how to respond when ICE agents show up unannounced. The uncertainty drives anxiety and fear among employees and their families. Documented and undocumented workers alike now fear traveling to horse shows and the races, despite a customarily itinerant lifestyle. This places additional burdens on the employer and on other employees, who may unexpectedly have to shoulder extra work or be short-staffed, which is particularly problematic because it may create a horse welfare issue.

 

Agents will seek to search stabling areas and property, retrieve documents, and/or speak to your employees. Here are some dos and don'ts for how to handle the situation when ICE agents arrive unexpectedly.

 

Click Here for the Original Article

 

 


 

STATE, CITY, COUNTY, AND MUNICIPAL UPDATES

 

California, Colorado, and Connecticut Launch Joint Privacy Sweep Over Opt-Out Rights

 

On September 9, the California Privacy Protection Agency (CPPA) and the Attorneys General of California, Colorado, and Connecticut announced a joint enforcement sweep targeting businesses that may be failing to honor consumer opt-out requests under state privacy laws. The joint effort centers on the Global Privacy Control (GPC), a browser setting that automatically signals to companies that a consumer does not want their personal information sold or shared.
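The GPC signal is technical as well as legal: participating browsers transmit it as a `Sec-GPC: 1` HTTP request header (and expose it to page scripts as `navigator.globalPrivacyControl`). As a rough illustration of what honoring the signal entails server-side, a request handler might check for the header before any sale or sharing of a visitor's data. The sketch below is illustrative only, not a compliance implementation; the function name is our own:

```python
def honors_gpc_opt_out(headers: dict) -> bool:
    """Return True if an HTTP request carries a Global Privacy Control signal.

    Per the GPC proposal, participating browsers send a `Sec-GPC: 1`
    request header; under the CCPA and similar state laws, a covered
    business must treat that signal as a valid opt-out of sale/sharing.
    Illustrative sketch only -- not legal advice.
    """
    # HTTP header names are case-insensitive, so normalize keys first.
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("sec-gpc") == "1"


# Example: a request arriving from a GPC-enabled browser.
request_headers = {"User-Agent": "Mozilla/5.0", "Sec-GPC": "1"}
if honors_gpc_opt_out(request_headers):
    print("Opt-out signal received: suppress sale/sharing for this visitor")
```

A business that simply never inspects this header is exactly the kind of target the sweep describes: the opt-out arrives on every request and is silently ignored.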

 

The sweep underscores increasing state-level enforcement of consumer privacy protections, particularly where businesses fail to recognize or process opt-out signals. Regulators emphasized that refusal to honor GPC requests could amount to violations of state privacy statutes, including laws modeled after the California Consumer Privacy Act. The announcement also follows earlier CPPA enforcement actions and highlights unfair, deceptive, or abusive practices (UDAP) concerns tied to ignoring consumer choices in online tracking and data sharing.

 

Iowa Consumer Data Protection Act – Effective January 1, 2025

 

The Iowa law applies to organizations that conduct business in Iowa or produce products or services that are targeted to Iowa residents and, during a calendar year, do at least one of the following:

 

  1. Control or process the personal data of 100,000 or more Iowa residents, or
  2. Control or process the personal data of 25,000 or more Iowa residents and derive more than 50% of their gross revenue from the sale of personal data.
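The two applicability prongs reduce to a simple threshold test. The sketch below is illustrative only (the function name and parameters are our own, and it is not legal advice):

```python
def iowa_cdpa_applies(ia_residents_processed: int,
                      revenue_share_from_data_sales: float) -> bool:
    """Rough applicability check for the Iowa Consumer Data Protection Act.

    Thresholds as summarized above, per calendar year. Illustrative
    sketch only -- not legal advice.
    """
    # Prong 1: personal data of 100,000 or more Iowa residents.
    if ia_residents_processed >= 100_000:
        return True
    # Prong 2: 25,000 or more Iowa residents AND more than 50% of
    # gross revenue derived from the sale of personal data.
    return (ia_residents_processed >= 25_000
            and revenue_share_from_data_sales > 0.50)
```

Note that a data broker handling 30,000 Iowa residents' records can be covered while a retailer processing 90,000 is not, because the second prong turns on the revenue share, not just volume.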

Iowa’s law differs from most comprehensive state privacy laws in that it does not provide consumers with a right to correct their personal data. Businesses subject to the Iowa law are not required to conduct risk assessments for activities that pose a significant risk of harm to consumers, which is a requirement in other states.

Click Here for the Original Article

 

 

California Privacy Protection Agency Defends Broad Authority to Investigate Potential CCPA Violations

 

The authority of the California Privacy Protection Agency (“CPPA”) to examine companies’ conduct prior to the enactment of the regulations implementing the California Consumer Privacy Act in 2023 (“2023 CCPA Regulations”) has recently been challenged. On August 6, 2025, the CPPA announced that it had filed a petition in Sacramento County Superior Court to enforce an investigative subpoena against Fortune 500 rural lifestyle retailer Tractor Supply Company (“Tractor Supply”).

 

According to the petition, the CPPA issued a subpoena to Tractor Supply in January 2025 requesting information about its privacy practices, including facts about the company’s: (1) processing of consumer rights requests, (2) use of technology to track consumers who visit the company’s website, and (3) relationship with entities who receive consumers’ personal information. The subpoena requested that Tractor Supply provide information covering the period between January 1, 2020, the date when the California Consumer Privacy Act (“CCPA”) became operative, and the present.

 

Although the subpoena required Tractor Supply to answer questions under oath, the CPPA alleges that the company failed to answer questions about its privacy practices prior to January 1, 2023 – three years after the CCPA’s effective date. According to the petition, Tractor Supply has asserted that the agency’s requested lookback period is overbroad, and that the company’s privacy practices before 2023 fall outside of the scope of the CPPA’s enforcement authority, as the 2023 CCPA Regulations had not yet been in effect.

 

In response, the CPPA asserts that “although regulations implementing aspects of the law followed over time, including in 2020, 2021, and 2023, any changes in the law do not abrogate the Agency’s authority to investigate possible violations.”

 

Click Here for the Original Article

 

 

California’s New AI Employment Regulations Are Set To Go Into Effect On October 1, 2025

 

The California Civil Rights Council, which promulgates regulations that implement California’s civil rights laws, has published a new set of regulations concerning artificial intelligence (“AI”) in the workplace. These new rules (available here) are set to go into effect on October 1, 2025 and amend the existing regulatory framework of the Fair Employment and Housing Act (“FEHA”). This latest round of regulations is continuing a trend of California policing AI in the workplace, as we have previously reported here.  

 

According to the Civil Rights Department, these regulations are needed because “[a]utomated-decision systems — which may rely on algorithms or artificial intelligence — are increasingly used in employment settings to facilitate a wide range of decisions related to job applicants or employees, including with respect to recruitment, hiring, and promotion … [and] can also exacerbate existing biases and contribute to discriminatory outcomes.” Such “automated-decision systems” are defined as computational processes that make a decision or facilitate human decision-making regarding an employment benefit, which may be derived from and/or use artificial intelligence, machine-learning, algorithms, statistics, and/or other data processing techniques.

 

Click Here for the Original Article

 

 

NYC Limits Housing Discrimination Based on Criminal Background: Is ‘Criminal History’ History?

 

Housing providers that choose to run background checks must comply with the New York City Fair Chance Housing Act, Local Law 24, which prohibits discrimination against prospective and current housing occupants based on criminal history. Covered providers must ensure they are aware of the Act’s parameters.

 

Who Is Covered?

 

Compliance under the Act extends to owners, lessors, lessees, sublessees, assignees, co-op and condo boards, and agents, employees, and real estate brokers of housing agencies authorized to sell, rent, or lease housing accommodations (“Providers”). The Act also applies to prospective purchasers, renters, and lessees of housing accommodations.

 

Prohibited Considerations

 

Providers are prohibited from engaging in the following conduct based on criminal history:

 

  1. Discriminating against any individual in the terms, conditions, or privileges of a sale, rental, or lease;
  2. Refusing to sell, rent, or lease, or to approve the sale, rental, or lease of, or otherwise denying, a housing accommodation;
  3. Representing to any individual that any housing accommodation is not available for inspection, sale, rental, or lease;
  4. Directly or indirectly excluding applicants with a criminal history in advertisements for the purchase, rental, or lease of such a housing accommodation; and
  5. Conducting criminal background checks in connection with a housing accommodation, except as permitted below.

 

Permitted Considerations

 

Of course, the Act does not require Providers to completely ignore the potential for a legitimate adverse action based on criminal history. Providers may:

 

    • Consider “reviewable criminal history” including: 
        • Convictions registered on the New York, federal, or other jurisdictional sex offense registries;
        • Misdemeanor convictions where fewer than three years have passed from the date of release from incarceration, or the date of sentencing for an individual who was not sentenced to a period of incarceration; and
        • Felony convictions where fewer than five years have passed from the date of release from incarceration, or the date of sentencing for an individual who was not sentenced to a period of incarceration.
    • Take any lawful action against an individual for acts of physical violence against other residents or other acts that would adversely affect the health, safety, or welfare of other residents; and
    • Make statements, deny housing accommodations, or conduct criminal background checks where required or specifically authorized by applicable laws.
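The misdemeanor and felony lookback windows above lend themselves to a simple date calculation. The sketch below is illustrative only; it models just those two windows (not the registry prong), the function name is our own, and it is not legal advice:

```python
from datetime import date

# Lookback windows from the Fair Chance Housing Act summary above:
# the clock runs from release from incarceration, or from sentencing
# if no incarceration was imposed.
LOOKBACK_YEARS = {"misdemeanor": 3, "felony": 5}


def is_reviewable(offense_type: str, clock_start: date, today: date) -> bool:
    """Illustrative check of whether a conviction falls within the Act's
    'reviewable criminal history' window. Registry convictions (always
    reviewable) are not modeled here. Not legal advice."""
    window = LOOKBACK_YEARS[offense_type]
    elapsed_years = (today - clock_start).days / 365.25
    return elapsed_years < window
```

For example, a felony with a 2018 release date is outside the five-year window by 2025 and could not be considered, while a misdemeanor sentenced last year still could.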

Click Here for the Original Article

 

INTERNATIONAL UPDATES

 

Adequacy of the EU–U.S. Data Privacy Framework Survives Challenge

On September 3, 2025, the European General Court (General Court) dismissed an action challenging the EU–U.S. Data Privacy Framework (DPF), developed to provide U.S. organizations with a reliable means to receive personal data transferred from the European Union to the United States, consistent with EU law.

The General Court’s judgment in case T-553/23, Philippe Latombe v European Commission, confirms that “the United States ensured an adequate level of protection for personal data transferred from the European Union to organizations in that country,” the Court’s press release states. The General Court and the Court of Justice make up the Court of Justice of the European Union (CJEU).

This decision means that entities that have self-certified compliance with the DPF may, for now, continue to rely on that mechanism for personal data transfers to the United States from the European Union (EU). The self-certification process includes, for example, a description of an organization’s activities with regard to all personal data received from the European Union in reliance on the EU-U.S. DPF, the organization’s policies covering such data, the types of data processed and, if applicable, the type of third parties to which it discloses such personal information.

Philippe Latombe, a politician in the French National Assembly (the lower house of the French Parliament), asserted in his September 2023 complaint that the DPF lacks a guarantee of the right to an effective remedy and access to an independent tribunal. The U.S. Data Protection Review Court (DPRC)—which reviews determinations of a Civil Liberties Protection Officer (CLPO) of the Office of the Director of National Intelligence in the United States—is not an impartial or independent tribunal, is dependent on the executive, and does not offer guarantees similar to those required by EU law, Latombe contended.

The DPRC, established by then-Attorney General Merrick Garland in 2022, “was established by an act of the United States executive and not by law,” the 2023 complaint argues. Latombe, a user of platforms that collect personal data for transfer to the United States, also asserted that the practice of U.S. intelligence agencies of collecting bulk personal data in transit from the EU, without prior authorization of a court or independent administrative authority, was illegal.

Latombe’s action for annulment was dismissed by the General Court, however, which found adequate “safeguards and conditions to ensure the independence” of the DPRC. And the General Court found nothing in EU case law, specifically Schrems II, “to suggest that collection must necessarily be subject to prior authorization issued by an independent authority.”

Click Here for the Original Article

 

Conclusion

This quarter's updates reflect an increasingly aggressive regulatory environment—particularly around data privacy, automated technologies, and fair employment practices. At Cisive, we help organizations anticipate compliance challenges, build defensible screening programs, and navigate the evolving legal landscape. If your team needs guidance, policy updates, or audit support, Cisive’s compliance experts are here to help you hire with confidence.

 

Let's Build a Smarter Screening Strategy Together

 


Author: Michael Kendrick

Bio: Director of Corporate Legal/Compliance at Cisive.

Let's Connect on LinkedIn