The SECURE Data Act: Advertising Industry Opportunities and Impact

On April 22, 2026, House Republicans announced the introduction of the SECURE Data Act (“SECURE”), a comprehensive, preemptive federal privacy bill designed to end the state “patchwork” era of regulation by establishing a single national standard. If enacted, SECURE would take effect 2 years after the enactment date, except that the obligations described in the Consumer Rights, Data Security, and Data Broker sections would take effect 1 year after the enactment date.

SECURE applies to entities that collect and process the personal data of more than 200,000 individuals annually and have at least $25 million in revenue, or that process the personal data of 100,000 or more individuals while deriving 25% of their revenue from data sales.

For the advertising industry, this represents the most significant legislative opportunity in years, moving beyond the compromise-heavy framework of past proposals like the American Privacy Rights Act (APRA).

Notable Industry Bill Priorities

Three specific provisions make this bill uniquely favorable to the advertising ecosystem compared to previous federal attempts:

Preemption: SECURE would broadly preempt state laws and other requirements that “relate” to its provisions. Notably, the current draft omits the typical “intent of Congress” preemption language that often signals a desire to fully displace state regimes. Without that express statement, courts may be more inclined to interpret preemption narrowly—particularly where states are seen as protecting rights, such as biometric privacy, that the federal framework does not address in detail.

Enforcement and Private Right of Action: The SECURE Data Act centralizes enforcement strictly within the Federal Trade Commission (FTC) and State Attorneys General. While the bill represents a significant step toward providing businesses with greater legal certainty, it currently lacks clarifying language expressly prohibiting a private right of action. Without an express prohibition, some stakeholders worry the door could remain open for future legislative maneuvering or judicial interpretation.

Pseudonymized / De-identified Data Exemptions: Under SECURE, de-identified data is completely exempt from the Act’s jurisdiction, provided the company publicly commits to a non-re-identification policy and contractually binds all third-party recipients to the same restriction. Pseudonymized data, by contrast, remains “personal data” but is granted a significant shield: it is largely exempt from the bill’s core consumer rights obligations—including the right to opt out of targeted advertising—so long as the information cannot be reasonably linked to an identified individual. This tiered framework creates a powerful incentive for agencies to adopt Privacy-Enhancing Technologies (PETs) and clean rooms, allowing effective measurement and modeling to persist even as traditional identifiers face stricter regulation.

However, a critical legal nuance remains regarding the depth of this exemption. Because the bill specifically limits the carve-out to the “assertion” of consumer rights, a debate has emerged as to whether the exemption applies only to consumer-triggered requests (like access or deletion) or whether it proactively exempts controllers from the affirmative opt-in consent mandate for sensitive data—a leap no current state law has made.

How the SECURE Data Act Aligns with Themes from Existing State Privacy Laws

In many ways, SECURE codifies the “Virginia-model” framework that has become the blueprint for state privacy laws across the U.S. 

Exemptions: To maintain a streamlined regulatory environment, SECURE provides comprehensive entity-level exemptions for government bodies, institutions of higher education, political organizations, and nonprofits, while specifically carving out data already governed by existing federal sectoral laws such as HIPAA (health), GLBA (financial), FCRA (credit reporting), and FERPA (education).

Consumer Rights: SECURE grants Americans the right to access, correct, delete, and port their personal data. The bill would also permit a consumer to opt out of processing for: (1) targeted advertising; (2) sales; and (3) reliance on profiling to make a decision that has a legal or similarly significant effect.

Consumer Rights Processes: SECURE requires consumer rights requests to be effectuated within 45 days with a single 45-day extension available for complex cases. If a request is denied, businesses must provide a conspicuously available appeals process similar to the initial submission method and respond to any appeal within 60 days. Crucially, SECURE permits controllers to require identity verification before fulfilling a request to prevent unauthorized data disclosure, while clarifying that businesses are not required to re-identify pseudonymous data or maintain data in an identifiable form solely to satisfy these rights. 

Data Minimization: SECURE would require a controller to limit data collection to what is adequate, relevant, and reasonably necessary to the purposes disclosed to a consumer. It would also prohibit a controller from processing personal data for a purpose that is not reasonably necessary or compatible with a disclosed purpose without the consumer’s consent.

Controller-Processor Relationship and Requirements: SECURE maintains the fundamental distinction between “controllers” and “processors,” requiring clear contractual flow-downs to protect consumer data. Beyond these contracts, processors are subject to several rigorous federal mandates:

  • Comprehensive Privacy Audits: Processors must conduct annual internal or external privacy audits. Unlike some state-mandated data protection assessments (DPAs), which are typically project-specific and triggered only by high-risk activities, SECURE audits are comprehensive reviews of the company’s entire data architecture and compliance posture.
  • Executive Accountability: For large data holders, SECURE mandates that a designated privacy officer or a high-ranking executive certify the audit results, creating a direct line of C-suite liability for data practices.
  • Assistance Mandates: Processors are legally required to assist controllers in meeting their obligations, specifically in responding to consumer rights requests and providing the necessary technical information to verify the effectiveness of security safeguards.
  • Direct Regulatory Oversight: Crucially, processors are directly accountable to the FTC and State Attorneys General; they cannot shield themselves behind a controller’s instructions if their processing activities independently violate the Act’s security or minimization standards.
  • Sub-Processor Transparency: SECURE requires processors to provide controllers with prior notice and the opportunity to object before engaging any sub-processors, ensuring that the chain of custody for consumer data remains transparent and contractually bound to the same federal standards.
  • Security Practices:  SECURE would require a controller to establish, implement, and maintain reasonable administrative, technical, and physical data security practices that are appropriate to the volume, sensitivity, and nature of personal data processed.

Sensitive Data: SECURE mandates an affirmative opt-in consent model for “sensitive data”. Sensitive data is defined as a category of personal data that includes: (1) personal data that discloses racial or ethnic origin, religious belief, mental or physical health diagnosis, sexual orientation, or citizenship or immigration status; (2) genetic or biometric data that is processed for the purpose of uniquely identifying a specific individual; (3) personal data collected from a child or teen; and (4) precise geolocation data (within a radius of 1850 feet).

Targeted Advertising: SECURE defines targeted advertising as “the display of an advertisement to a consumer in which the advertisement is selected based on personal data obtained from the activities of that consumer over time and across nonaffiliated websites or online applications to predict the preferences or interests of that consumer.” The term does not include: (1) an advertisement based on activities within the website or online application of a controller; (2) an advertisement based on the context of a current search query, website visit, or online application; (3) an advertisement directed to a consumer in response to a request for information; or (4) personal data processed solely for measuring or reporting advertising or content performance, reach, or frequency. This definition explicitly protects core operational functions—such as contextual advertising, first-party retargeting, and performance measurement—by excluding them from the “targeted advertising” classification and its associated opt-out requirements. By carving out internal site activities and frequency capping from the definition of “targeted,” SECURE provides agencies with a path to continue essential campaign optimization and reporting without the friction of federal opt-out mandates.

Universal Opt-Out Mechanism Requirements and Study: SECURE aligns with business-friendly state models by omitting any requirement for controllers to recognize universal opt-out mechanisms (UOOMs), such as the Global Privacy Control (GPC). This absence establishes a federal “ceiling” that favors direct, site-specific interactions over browser-based signals, effectively preempting stricter state mandates like those in California and Colorado that currently require businesses to honor automated opt-out signals. However, SECURE would require the Secretary of Commerce to publish a report reviewing commercially available technologies relating to universal opt-out mechanisms.

Cure Period: To further protect businesses from immediate litigation, SECURE mandates a 45-day notice-and-cure period, requiring the FTC or State Attorneys General to provide written notice of a specific violation before initiating an enforcement action. 

To ensure SECURE provides a robust “safe harbor” rather than a procedural formality, industry advocates may seek to strengthen the cure period through a more precise definition of remediation. A key amendment would explicitly define a “cure” as the combination of technical remediation and future prevention measures. By codifying this standard, Congress could ensure that fixing a software bug or updating a compliance banner within 45 days legally “eliminates” the violation—even if past data sharing cannot be technically “undone.” This would prevent regulators from claiming a violation is “incurable” as a pretext for immediate fines, a tactic that has frequently characterized enforcement actions in California.

Small Business Applicability: To reduce the administrative burden on the startup ecosystem, SECURE provides a significant “on-ramp” for small businesses by exempting them from the bill’s most rigorous compliance mandates. For those falling below the covered entity thresholds, the bill offers a “safe harbor” through voluntary codes of conduct developed by the Secretary of Commerce. Adhering to these frameworks provides a rebuttable presumption of compliance, allowing smaller firms to demonstrate a commitment to federal privacy standards without the prohibitive overhead of the mandatory annual audits and executive certifications required of larger data holders.

The Federal Difference: Where the Bill Breaks New Ground

While the baseline rights feel familiar, SECURE introduces several “federal-only” nuances that represent a significant shift from the current state-by-state status quo.

Teens’ Data: SECURE significantly expands protections for younger demographics by reclassifying the personal data of all minors under the age of 16 as sensitive, effectively extending COPPA-style requirements to minors aged 13 to 15.  Of particular concern to the industry, the current draft lacks a “knowledge standard”—such as “actual knowledge” or “willful disregard”—implying a strict liability approach where businesses may be held responsible regardless of whether they knew a user’s age. This shift mandates verifiable parental consent for the 13–15 age bracket, forcing a major strategic pivot for brands that must now implement robust age-verification or consent workflows to avoid federal enforcement.

Targeted Advertising Disclosures: Following the precedent set by California’s “Notice at Collection,” SECURE requires controllers to provide a clear and conspicuous disclosure before collecting or using data for targeted advertising. This notice must not only inform the consumer of the activity but also provide a direct explanation of how to exercise their right to opt out. By codifying this into a single federal requirement, the Act provides a predictable disclosure roadmap for national campaigns, ending the need for the varied, state-specific footer links and “Notice at Collection” pages that currently clutter brand websites.

Consumer Profiling Right: SECURE introduces an advanced profiling right, specifically designed to address automated decision-making that results in “legal or similarly significant effects” (such as eligibility for housing, credit, or employment). Under this provision, businesses are not only required to disclose the use of such algorithms but must also provide consumers with an explicit pre-decision opt-out. Under the bill, it would no longer be enough to include a profiling notice in a privacy policy. This provision grants consumers a definitive opt-out for automated decisioning, necessitating that organizations establish manual oversight protocols or risk forfeiting the use of AI-driven predictive modeling for vital consumer segments.

Other Items: SECURE introduces several technical shifts that broaden the scope of federal oversight while offering unique defensive protections for businesses. Specifically, the bill extends its regulatory reach to communications common carriers subject to Title II of the Communications Act of 1934, closing a jurisdictional gap often debated in previous legislative sessions.

To balance this expanded scope, SECURE establishes a voluntary Code of Conduct certification framework, modeled after the COPPA safe harbor program. Under this system, entities can submit industry-specific self-regulatory guidelines to the Secretary of Commerce for approval. Once certified, these organizations gain a rebuttable presumption of compliance, effectively shifting the burden of proof in enforcement actions—a robust affirmative defense currently mirrored only in Tennessee’s state privacy law. Furthermore, the bill explicitly recognizes the Global Cross-Border Privacy Rules (CBPR) as a pre-approved code, providing a streamlined compliance pathway for international data transfers.

Legislative Prognosis

While SECURE represents a cohesive push for a national standard, its path to enactment is complicated by a fundamental ideological divide over preemption and enforcement. The bill’s “maximum preemption” strategy and the absence of a Private Right of Action are essential for securing industry support, yet these same provisions face intense opposition from Congressional Democrats and privacy advocates. Critics argue that stripping away individual legal recourse and overriding established state-level statutes, particularly those in California, represents a significant rollback of existing consumer protections.

Beyond these policy disputes, the legislative calendar is under the immense pressure of the 2026 midterm election clock. As the focus shifts toward campaigning, the appetite for the complex, high-stakes compromise required to bridge partisan gaps typically diminishes. While SECURE may successfully navigate House committees, it faces a steeper climb in the Senate, where the filibuster necessitates a 60-vote supermajority. Without bipartisan negotiations, the bill risks stalling as political priorities pivot toward the ballot box, potentially leaving the era of fragmented state compliance intact for the foreseeable future. However, the post-election “lame duck” session may offer a final, unique window for passage. During this period, departing lawmakers often feel less political pressure, potentially creating the necessary room for the complex horse-trading required to move a compromise version of SECURE across the finish line before the new Congress is seated.

Conclusion

SECURE represents a watershed moment for the advertising industry, promising to replace the arduous “state patchwork” with a unified federal ceiling that prioritizes operational stability. By securing maximum preemption and excluding essential functions like contextual targeting and performance measurement from the definition of “targeted advertising,” the bill offers a meaningful opportunity to preserve marketing practices grounded in responsible data use.

While this framework establishes a clear pro-business baseline, the industry must recognize the tactical nature of the current draft. Certain absences within the legislative text are likely intentional placeholders intended to facilitate horse-trading as the bill seeks broader coalition support. Although the path to passage is not yet fixed, this proposal is significant for being the first to bridge the gap between divergent state protections and a unified federal mandate, effectively establishing the baseline for any future privacy law of the land.

Agencies should closely monitor the bill’s progress as it moves through House committees and into the Senate. We will be specifically watching for any “poison pill” amendments—particularly regarding a Private Right of Action or weakened preemption—that could destabilize a pragmatic, business-ready, federal privacy bill.

Have questions about the SECURE Data Act or the evolving federal privacy landscape? Contact Amanda Anderson, VP of Government Relations or Jim Potter, CHC Executive Director.
