No. 10668502
United States Court of Appeals for the Ninth Circuit

Netchoice, LLC v. Bonta

Decided September 9, 2025
FlawFinder last updated this page Apr. 2, 2026
Case Details
Court
United States Court of Appeals for the Ninth Circuit
Decided
September 9, 2025
Citation
No. 10668502
Disposition
Affirmed in part, reversed in part, and remanded.
Full Opinion
FOR PUBLICATION

UNITED STATES COURT OF APPEALS FOR THE NINTH CIRCUIT

NETCHOICE, LLC, Plaintiff-Appellant,
v.
ROB BONTA, in his official capacity as Attorney General of California, Defendant-Appellee.

No. 25-146
D.C. No. 5:24-cv-07885-EJD

OPINION

Appeal from the United States District Court for the Northern District of California
Edward J. Davila, District Judge, Presiding

Argued and Submitted April 2, 2025, Phoenix, Arizona
Filed September 9, 2025

Before: Michael Daly Hawkins, William A. Fletcher, and Ryan D. Nelson, Circuit Judges.

Opinion by Judge R. Nelson

SUMMARY*

First Amendment / Social Media

In a case in which NetChoice, an internet trade association that includes Google, Meta, and X, challenges California’s Protecting Our Kids from Social Media Addiction Act on First Amendment grounds, the panel, with one exception, affirmed the district court’s partial denial of NetChoice’s motion for injunctive relief and remanded with instructions that the district court modify its injunction.

The Act broadly regulates social media offerings to minors in California. Without parental consent, the Act (1) restricts minors’ access to algorithmic feeds through its personalized-feeds provisions, (2) restricts certain platform design features through its default-settings provisions, and (3) mandates that platforms institute yet-unknown age-verification procedures (to be announced before 2027).

Addressing NetChoice’s First Amendment as-applied challenges, the panel first held that the district court did not abuse its discretion in finding that, without more information and the participation of individual members, NetChoice lacked associational standing on behalf of its members to challenge the Act’s personalized-feed provisions.
* This summary constitutes no part of the opinion of the court. It has been prepared by court staff for the convenience of the reader.

The panel next held that NetChoice was likely to prevail on its argument that the Act’s requirement that minors’ accounts operate with default settings that cannot show the number of likes, shares, or other feedback that a post has received is unconstitutional as applied to its members. Finding the like-count requirement to be content based and applying strict scrutiny, the panel held that the requirement was not the least restrictive way to advance California’s interest in protecting minors’ mental health.

The panel held that the district court did not err by declining to enjoin the Act’s private-mode default setting provision, which requires that only users connected to the minors’ account can view or interact with the minors’ posts. The private-mode provision was not content based and survived intermediate scrutiny. The panel next held that NetChoice’s as-applied challenge to the Act’s age-verification requirement was unripe given that it was not set to start until 2027.

Turning to NetChoice’s First Amendment facial challenges, the panel first held that NetChoice failed to support its facial challenge to the Act’s personalized-feed provision. Although some personalized feed algorithms may be expressive, that inquiry is fact intensive and NetChoice failed to show that the Act’s unconstitutional applications substantially outweigh its constitutional applications. The panel held that NetChoice established a likelihood of success on its facial challenge to the like-count provision but not to the private-mode provision, applying the same reasoning the panel used in the as-applied sections of its opinion. NetChoice’s facial challenge to the age-verification provisions was unripe for the same reasons as the as-applied challenge.
The panel held that the Act was not unconstitutionally vague, rejecting NetChoice’s arguments that the statutory phrase “addictive feed” is pejorative and standardless and that the Act’s numerous exceptions create confusion about what services are covered.

The panel held that although the like-count default setting was likely unconstitutional on its face and as applied, the provision was severable and could be excised from the Act. The panel held that the remaining preliminary injunction factors—irreparable harm, the balance of equities and the public interest—favored NetChoice on the issue.

The panel reversed the district court’s denial of an injunction as to the like-count provision and remanded with instructions that the district court modify its injunction to enjoin the provision’s enforcement. In all other respects, the panel affirmed the district court’s denial of a preliminary injunction.

COUNSEL

Scott A. Keller (argued), Steven P. Lehotsky, Jeremy E. Maltz, and Shannon G. Denmark, Lehotsky Keller Cohn LLP, Washington, D.C.; Joshua P. Morrow, Lehotsky Keller Cohn LLP, Austin, Texas; Jared B. Magnuson, Lehotsky Keller Cohn LLP, Atlanta, Georgia; Bradley A. Benbrook, Stephen M. Duvernay, Benbrook Law Group PC, Sacramento, California; for Plaintiff-Appellant.

Christopher J. Kissel (argued), Jennifer E. Rosenberg, and Shiwon Choe, Deputy Attorneys General; Lara Haddad, Supervising Deputy Attorney General; Thomas S. Patterson, Senior Assistant Attorney General; Rob Bonta, California Attorney General; Office of the California Attorney General, Los Angeles, California; for Defendant-Appellee.

Thomas A. Berry, Cato Institute, Washington, D.C., for Amicus Curiae the Cato Institute.

Aaron D. Mackey and Emma L. Armstrong, Electronic Frontier Foundation, San Francisco, California, for Amici Curiae Electronic Frontier Foundation, Freedom to Read Foundation, and Library Futures.

Mark W. Brennan, J. Ryan Thompson, Erin Mizraki, and Thomas B. Veitch, Hogan Lovells US LLP, Washington, D.C.; Lawrence Walters, Walters Law Group, Longwood, Florida; Kerry M. Sheehan, Chamber of Progress, McLean, Virginia; Carlos Gutierrez, LGBT Tech, Staunton, Virginia; for Amici Curiae Chamber of Progress, LGBT Tech, and the Woodhull Freedom Foundation.

Andrew S. Bruns and Flora D. Morgan, Keker Van Nest & Peters LLP, San Francisco, California, for Amicus Curiae Center for Democracy & Technology.

Ariel F. Johnson, Digital Smarts Law & Policy LLC, Shaker Heights, Ohio, for Amicus Curiae Common Sense Media.

Kwame N. Akosah, Assistant Solicitor General; Ester Murdukhayeva, Deputy Solicitor General; Barbara D. Underwood, Solicitor General; Letitia James, Attorney General; Office of the New York Attorney General, New York, New York; Kristin K. Mayes, Arizona Attorney General, Office of the Arizona Attorney General, Phoenix, Arizona; Tim Griffin, Arkansas Attorney General, Office of the Arkansas Attorney General, Little Rock, Arkansas; Philip J. Weiser, Colorado Attorney General, Office of the Colorado Attorney General, Denver, Colorado; William Tong, Connecticut Attorney General, Office of the Connecticut Attorney General, Hartford, Connecticut; Kathleen Jennings, Delaware Attorney General, Office of the Delaware Attorney General, Wilmington, Delaware; Brian L. Schwalb, District of Columbia Attorney General, Office of the District of Columbia Attorney General, Washington, D.C.; Raul Labrador, Idaho Attorney General, Office of the Idaho Attorney General, Boise, Idaho; Kwame Raoul, Illinois Attorney General, Office of the Illinois Attorney General, Chicago, Illinois; Anthony G. Brown, Maryland Attorney General, Office of the Maryland Attorney General, Baltimore, Maryland; Andrea J. Campbell, Commonwealth of Massachusetts Attorney General, Office of the Commonwealth of Massachusetts Attorney General, Boston, Massachusetts; Dana Nessel, Michigan Attorney General, Office of the Michigan Attorney General, Lansing, Michigan; Keith Ellison, Minnesota Attorney General, Office of the Minnesota Attorney General, St. Paul, Minnesota; Aaron D. Ford, Nevada Attorney General, Office of the Nevada Attorney General, Carson City, Nevada; Matthew J. Platkin, New Jersey Attorney General, Office of the New Jersey Attorney General, Trenton, New Jersey; Raul Torrez, New Mexico Attorney General, Office of the New Mexico Attorney General, Santa Fe, New Mexico; Jeff Jackson, North Carolina Attorney General, Office of the North Carolina Attorney General, Raleigh, North Carolina; Gentner Drummond, Oklahoma Attorney General, Office of the Oklahoma Attorney General, Oklahoma City, Oklahoma; Dan Rayfield, Oregon Attorney General, Office of the Oregon Attorney General, Salem, Oregon; Peter F. Neronha, Rhode Island Attorney General, Office of the Rhode Island Attorney General, Providence, Rhode Island; Marty J. Jackley, South Dakota Attorney General, Office of the South Dakota Attorney General, Pierre, South Dakota; Ken Paxton, Texas Attorney General, Office of the Texas Attorney General, Austin, Texas; Derek Brown, Utah Attorney General, Office of the Utah Attorney General, Salt Lake City, Utah; Charity R. Clark, Vermont Attorney General, Office of the Vermont Attorney General, Montpelier, Vermont; Nicholas W. Brown, Washington Attorney General, Office of the Washington Attorney General, Olympia, Washington; for Amici Curiae States of New York, Arizona, Arkansas, Colorado, Connecticut, Delaware, Idaho, Illinois, Maryland, Massachusetts, Michigan, Minnesota, Nevada, New Jersey, New Mexico, North Carolina, Oklahoma, Oregon, Rhode Island, South Dakota, Texas, Utah, Vermont, Washington, and the District of Columbia.

Alison S. Gaffney, Dean N. Kawamoto, Felicia J. Craick, and William Dreher, Keller Rohrback LLP, Seattle, Washington, for Amici Curiae the American Federation of Teachers and the California Federation of Teachers.

Megan Iorio and Tom McBrien, Electronic Privacy Information Center, Washington, D.C., for Amici Curiae the Electronic Privacy Information Center, Tech Justice Law Project, and Eighteen Law and Technology Scholars and Practitioners.

OPINION

R. NELSON, Circuit Judge:

Addressing the growing concern that our youth are becoming addicted to social media, California passed a law regulating how internet platforms allow minors to access personalized recommendation algorithms. NetChoice sued, arguing that the law violates the First Amendment. The district court preliminarily enjoined some provisions but largely left the law in place. NetChoice appeals the district court’s denial of injunctive relief. With one exception, we affirm the district court.

I

A

Some websites look the same to everyone. Our court’s website, for example, shows every visitor the same list of opinions, the same list of judges, and the same history of the court. Other websites, however, personalize users’ experiences, showing each user something different based on their location, past use of the site, and other information. Major social media platforms are prototypical examples of personalized websites, which use algorithms to curate personalized content feeds. Users see something different when they log on, depending on what they have viewed, interests they have expressed, or other data associated with their profiles.

These personalized feeds work. Websites show users what they engage with to maximize user time on the website and to encourage return visits. Like many Americans, California thinks personalized feeds work too well. California worries that our children and youth are becoming addicted.
So it passed the “Protecting Our Kids from Social Media Addiction Act.” 2024 Cal. Stats. ch. 321, SB 976, 2023–24 Reg. Sess. (Cal. 2024) (codified at Cal. Health & Safety Code §§ 27000–07). “Approximately 95 percent of 13- to 17-year-olds, inclusive, say that they use at least one social media platform, and more than one-third report using social media almost constantly.” Id. § 1. So California aims to reduce the “significant risk of harm to the mental health and well-being of children and adolescents” posed by “the algorithmic delivery of content” on online platforms, which the California legislature considered “addictive.” Id.

The Act broadly regulates social media offerings to minors in California. Without parental consent, the Act (1) restricts minors’ access to algorithmic feeds through its personalized-feeds provisions and (2) restricts certain platform design features through its default-settings provisions. It also (3) mandates that platforms institute yet-unknown age-verification procedures (to be announced before 2027).

The Act regulates many websites that use personalized feeds to serve up content to users. Those personalized-feeds restrictions make it unlawful—absent parental permission—for websites to provide “addictive feed[s]” to minors. Cal. Health & Safety Code § 27001(a); see also id. § 27002(b)(2), (b)(4). The law is mainly directed at social media platforms—meaning internet-based services or applications that “connect users in order to allow users to interact socially with each other” through content and posts created by networked profiles. Cal. Bus. & Profs. Code § 22675(f); see also Cal. Health & Safety Code § 27000.5(b)(1). But the Act also covers any website or online service that “provides users with an addictive feed as a significant part” of its service—regardless of whether that service qualifies as a social media platform. Cal. Health & Safety Code § 27000.5(b)(1).
The Act’s central coverage definition describes “addictive feed” as any part of an online service or mobile application in which media “generated or shared by users are, either concurrently or sequentially, recommended, selected, or prioritized for display” based “on information provided by the user, or otherwise associated with the user or the user’s device.” Id. § 27000.5(a). As the state’s Attorney General puts it: An addictive feed consists of media shared by other users in which that media is shown to a user based on the user’s unique information and online activity. This means that, with some exceptions, covered platforms cannot use any information provided by minor users to decide what content to show them.1 Cf. id. (enumerating seven exceptions); id. § 27000.5(b)(2) (two additional exceptions).

1 One such exception is of particular concern here. The Act excludes websites “for which interactions between users are limited to commercial transactions or to consumer reviews of products, sellers, services, events, or places, or any combination thereof.” Cal. Health & Safety Code § 27000.5(b)(2)(A).

Regulating minors’ access to personalized feeds is the Act’s pièce de résistance. But it attacks the problem of minors’ social media addiction in other ways, too. California seeks to defang social media sites by regulating some design features and functionalities that it views as making these platforms especially addictive, targeting a series of default settings the Act imposes on minors.

Two default settings are at issue. First, covered web platforms may not show minors how many likes, shares, or other forms of feedback a post has received within a personalized recommendation feed. Id. § 27002(b)(3). Second, covered platforms must make minors’ accounts private, which means their posts are visible only to friends on the platform. Id. § 27002(b)(5).
All these restrictions—the personalized-feeds restrictions and the default settings—can be bypassed with parental consent. See id. §§ 27001(a)(2), (b), 27002. So families can opt out of the Act’s new regulatory regime if they do not share California’s views about the risks of internet addiction. California wants to know how often parents take this route: Covered companies must file annual reports showing how many minors use their services and how many of those minors’ parents opted out. Id. § 27005.

All of this raises the question: What if a company does not know how old its users are? Until 2027, covered platforms need only apply the Act’s restrictions to users that they know are minors. Id. §§ 27001(a), 27002(a). That changes on January 1, 2027, when companies will have to apply the Act’s restrictions to all users, unless the company has “reasonably determined” a user is an adult. Id. The Attorney General of California must promulgate regulations before January 1, 2027, that define platforms’ age-verification obligations. Id. § 27006(b). What those regulations will look like—and what platforms will have to do to comply—is unknown.

B

Enter NetChoice, an internet trade association. Its members include Google (which owns and operates YouTube), Meta (proprietor of Facebook and Instagram), Nextdoor, Pinterest, and X (formerly Twitter). Each of these members owns a platform that allows users to create a profile, connect with other user profiles, and post content. And each employs a recommendation algorithm.

In NetChoice’s view, the Act unconstitutionally limits these members’ ability to speak to minors (via personalized recommendation algorithms), impedes minors’ ability to access speech or speak publicly (via accounts not in private mode), and deters adults from accessing companies’ speech (by requiring them to first prove that they are adults). So NetChoice sued and sought a preliminary injunction.
It argues that the Act is unconstitutional, both facially and as applied to its members. It also argues that some of the Act’s language is void for vagueness.

The district court granted in part and denied in part NetChoice’s motion for a preliminary injunction. See NetChoice v. Bonta, 761 F. Supp. 3d 1202, 1232 (N.D. Cal. 2024). The district court preliminarily enjoined California from enforcing two of the Act’s provisions: the restriction on sending minors notifications and the requirement that companies annually disclose the number of minors that use their services. Id. at 1227–30. Because California did not appeal those issues, they are not before us. Instead, we focus on the provisions of the Act that the district court left in place.

First, the district court concluded that NetChoice’s challenge to the age-verification requirements was not ripe. It concluded that courts must wait until we know what the Attorney General’s chosen regulatory regime will look like before we can decide how much speech might be affected.2 Id. at 1211–19. In other words, further factual development is needed to determine what kind of constitutional burden the Act creates for NetChoice members and the broader population. Id.

Next, the district court decided that it could not adjudicate NetChoice’s facial challenges to the personalized-feed provisions because there was not an adequate record to decide the scope of the Act’s coverage—including how much of the coverage would be unconstitutional—across the entire internet. Id. at 1218–23. In doing so, the district court assumed that not all personalized recommendation algorithms raised the same First Amendment concerns; if an algorithm “respond[s] solely to how users act online,” the district court doubted that it is expressive. Id. at 1220–21 (quotation omitted). Nor would the regulation of personalized algorithms necessarily burden users’ speech rights, either, the district court concluded. See id.
As to the Act’s other provisions, the court concluded that the Act’s central coverage definition—which defines addictive feeds—was not content based, and decided that the like-count and private-mode default settings passed constitutional muster. Id. at 1225–26, 1228.

2 This relies on the assumption that some age-verification methods might raise constitutional concerns, while others might not. The district court opined that the regulations may require age-estimation tools “that run in the background and require no user input,” finding that such tools existed and would be feasible. NetChoice, 761 F. Supp. 3d at 1215–16.

The district court did not separately consider NetChoice’s as-applied challenges to many provisions of the Act. See id. at 1230. It decided that doing so would be duplicative of the facial challenges. Id. It separately considered the as-applied challenge to the personalized-feed restrictions, concluding that NetChoice lacked associational standing because the as-applied challenges for “each separate NetChoice member” require their “own ‘ad hoc factual inquiry.’” Id. at 1230–31. To adjudicate these claims, NetChoice members needed to participate individually in the lawsuit to discover “how each of those members’ feeds work.” Id. at 1231.

Finally, the district court concluded that the Act’s terms were not void for vagueness, and that facially unconstitutional provisions were severable. See id. at 1231–32.

NetChoice appealed. We issued a stay preventing California from enforcing any aspect of the Act until we resolved NetChoice’s appeal and expedited proceedings. We now affirm in part and reverse in part the district court’s injunction.

II

We have subject-matter jurisdiction under 28 U.S.C. § 1331 and appellate jurisdiction under 28 U.S.C. § 1292(a)(1). We review the district court’s denial of a preliminary injunction for abuse of discretion. X Corp. v. Bonta, 116 F.4th 888, 897 (9th Cir. 2024).
A court abuses its discretion if it commits legal error. See id. We review factual findings for clear error. Am. Beverage Ass’n v. City & Cnty. of S.F., 916 F.3d 749, 754 (9th Cir. 2019) (en banc).

To warrant the “extraordinary remedy” of preliminary relief, NetChoice must show that it is likely to succeed on the merits, that it is likely to suffer irreparable harm in the interim, and that an injunction is both equitable and in the public interest. Winter v. Nat. Res. Def. Council, Inc., 555 U.S. 7, 20, 24 (2008).

III

NetChoice asks us to reverse the district court in part and enjoin the parts of the Act that the district court left in effect. It wants that relief not only for its members, but for the world at large under a facial challenge. Before we address NetChoice’s arguments, we first focus on the distinction between facial and as-applied challenges.

To invoke the court’s jurisdiction, NetChoice must allege a deprivation of a cognizable and “particularized” interest. Lujan v. Defs. of Wildlife, 504 U.S. 555, 560 (1992). Generally, a plaintiff “must assert his own legal rights and interests,” and cannot rely on the interests of third parties. Warth v. Seldin, 422 U.S. 490, 499 (1975). First Amendment facial challenges provide a narrow exception to this rule. Even if a speech regulation is constitutional as applied to a plaintiff, the plaintiff may challenge the law as facially overbroad or vague based on its impact on others’ speech. See Broadrick v. Oklahoma, 413 U.S. 601, 612 (1973) (overbreadth); United States v. Williams, 553 U.S. 285, 304 (2008) (vagueness).

The parties do not rigorously delineate between as-applied and facial challenges. These two analyses overlap, but the differences are not limited to the scope of relief. That is, a facial challenge is not merely an as-applied challenge extended to cover non-parties.
Granted, the line between facial and as-applied challenges can be “amorphous” at times. Project Veritas v. Schmidt, 125 F.4th 929, 940 (9th Cir. 2025) (en banc) (quotation omitted); Young v. Hawaii, 992 F.3d 765, 862–63 (9th Cir. 2021) (R. Nelson, J., dissenting), vacated, 142 S. Ct. 2895 (2022). But the distinction is critical. A facial challenge requires a different—and more stringent—analysis that demands record development on third-party speech and the full scope of a law’s potential applications. It is NetChoice’s burden to make those showings. Moody v. NetChoice, LLC, 603 U.S. 707, 724–26 (2024); Virginia v. Hicks, 539 U.S. 113, 122 (2003).

A

We start with NetChoice’s as-applied challenges. Then, we turn to the facial challenges before addressing the vagueness claim and the remaining injunction factors. We largely conclude that the district court did not err by denying NetChoice’s motion for a preliminary injunction.

1

First, we address NetChoice’s as-applied challenge to the Act’s personalized-feeds provisions. Cal. Health & Safety Code § 27001(a); id. § 27002(b)(2), (b)(4). The district court concluded that NetChoice had a standing problem and could not pursue this claim on behalf of its members. See NetChoice, 761 F. Supp. 3d at 1230–31. We agree.

Plaintiffs generally “must assert [their] own legal rights and interests,” and cannot rely on the interests of third parties. Warth, 422 U.S. at 499. NetChoice, however, seeks to vindicate its members’ rights. So it must establish three threshold issues for associational standing. First, it must show that its members would have standing to sue on their own. Id. This is a constitutional requirement. United Food & Com. Workers Union Loc. 751 v. Brown Grp., 517 U.S. 544, 555 (1996). Second, it must show that the interest it asserts is germane to its purpose. See Hunt v. Wash. State Apple Advert. Comm’n, 432 U.S. 333, 343 (1977).
Third, it must show that neither the claim asserted nor the relief requested “requires individualized proof.” Id. at 343–44. This last requirement is prudential. United Food, 517 U.S. at 555.

It is that final, prudential requirement that the district court focused on. See NetChoice, 761 F. Supp. 3d at 1230–31. And for good reason. NetChoice, which has over one hundred members, discloses only six platforms that it suggests are covered by the Act: YouTube, Facebook, Instagram, Nextdoor, Pinterest, and X. Of those, it provides declarations discussing the operations of only YouTube, Facebook, and Instagram.3

3 It also briefly explains dreamwidth.org, whose coverage is unclear.

Whether these particular platforms’ feeds are expressive is not what matters. What matters is that, for the prudential prong of associational standing, we must determine whether an algorithmic feed is expressive, which requires review of each member’s algorithm and how it functions. Moody, 603 U.S. at 723–26. The First Amendment analysis is “fact intensive” and will “surely vary” from “platform to platform.” Id. at 747 (Barrett, J., concurring). Such is the nature of a First Amendment claim related to algorithmic speech; and it is just as true in this as-applied challenge as it was in Moody’s facial challenge. Cf. id. And it means, in turn, that the merits of “the claim asserted” and the “relief requested” require the participation of individual NetChoice members, making associational standing inappropriate. Hunt, 432 U.S. at 343.

As to the specific algorithms here, NetChoice acknowledges that each of its members is unique. That matters because the unique design of each platform and its algorithm affects whether the algorithm at issue is expressive. For example, the more an algorithm implements human editorial directions, the more likely it is to be expressive for First Amendment purposes.
An algorithm that promotes a platform’s own message to users is likely to be protected speech. E.g., TikTok Inc. v. Garland, 604 U.S. __, 145 S. Ct. 57, 72 (2025). Such an algorithm, after all, is not unlike traditional media curated by human editors. See Moody, 603 U.S. at 731 (discussing Arkansas Ed. Television Comm’n v. Forbes, 523 U.S. 666, 674 (1998)); see also id. at 732–34, 738. On the other hand, an algorithm that “respond[s] solely to how users act online,” merely “giving them the content they appear to want,” probably is not expressive. Id. at 736 n.5; accord id. at 746 (Barrett, J., concurring). Personalized algorithms might express a platform’s unique message to the world, or they might reflect users’ revealed preferences to them. Knowing where each NetChoice member’s algorithm falls on that spectrum reasonably requires some individual platforms’ participation. See Hunt, 432 U.S. at 343.

The district court decided it needed more information—and the participation of individual members—to adjudicate NetChoice’s “fact intensive” claims about all of its members’ algorithms and the appropriate relief. Cf. Moody, 603 U.S. at 747 (Barrett, J., concurring). After Moody, the district court’s holding was not arbitrary, irrational, or contrary to law. See United States v. Hinkson, 585 F.3d 1247, 1263 (9th Cir. 2009) (en banc). It was thus reasonable for the district court to conclude that, as a prudential matter, NetChoice had not established associational standing without more information about members’ algorithms and feeds. In short, it was not an abuse of discretion to find that NetChoice lacked associational standing on behalf of its members for this issue. See Hinkson, 585 F.3d at 1263.

2

NetChoice also raises an as-applied challenge to the Act’s requirement that minors’ accounts operate with certain default settings, which can be turned off by a parent.
Two such default settings are at issue: (1) that covered platforms cannot show minors the number of likes or other feedback on a post, see Cal. Health & Safety Code § 27002(b)(3); and (2) that minors’ accounts must be on “private mode,” id. § 27002(b)(5).

a

Starting with the like-count default setting, we face the threshold question: What level of scrutiny is warranted? Laws that regulate speech based on its content are presumptively unconstitutional and subject to strict scrutiny. Reed v. Town of Gilbert, 576 U.S. 155, 163 (2015). Content-neutral laws, on the other hand, are subject to intermediate scrutiny. City of Austin v. Reagan Nat’l Advert. of Austin, LLC, 596 U.S. 61, 76 (2022).

A law is content based if it “targets speech based on its communicative content.” Id. at 69 (cleaned up). If a law’s applicability turns on “the topic discussed” or “the idea or message expressed,” the law is “obvious[ly]” content based. Id. at 69, 74 (quoting Reed, 576 U.S. at 163). Other content discrimination is “subtler.” Id. at 74. If a law turns on the speech’s “function or purpose,” it is also content based, at least when the function-or-purpose classification is a “proxy” for subject-matter or topic discrimination. Id. (discussing Reed, 576 U.S. at 163).

NetChoice argues that strict scrutiny should be applied here for three reasons. First, it argues that the Act’s exception for commercial websites makes the whole Act content based. Second, it argues that the Act’s focus on social media does the same. Third, it argues that the like-count provision in particular is content based, and so it, at least, is subject to strict scrutiny. We disagree that the whole Act is content based, but agree that the like-count provision itself is.

i

First, the district court correctly concluded the Act’s exception from coverage of websites “limited to commercial transactions or to consumer reviews,” Cal. Health & Safety Code § 27000.5(b)(2)(A); see NetChoice, 761 F.
Supp. 3d at 1225–26, is not content based. While a close question, we agree.

In City of Austin, the Supreme Court rejected “the view that any examination of speech or expression inherently triggers heightened First Amendment concern.” 596 U.S. at 73. City of Austin instead recognized an implication of this rule that governs here: Statutes that classify and single out solicitation “require some evaluation of the speech and nonetheless remain content neutral.” 596 U.S. at 72.

The Court’s definition of “solicitation” is instructive. It is “speech ‘requesting or seeking to obtain something’ or ‘[a]n attempt or effort to gain business.’” Id. (quoting Solicitation, Black’s Law Dictionary (11th ed. 2019)). And “the Court has reasoned that restrictions on solicitation are not content based.” Id. (citing Heffron v. Int’l Soc. for Krishna Consciousness, Inc., 452 U.S. 640, 649 (1981)). Sitting en banc, we reiterated and elaborated on this solicitation carveout earlier this year. See Project Veritas, 125 F.4th at 949–50.

Although the Act does not define “commercial transactions” or “consumer reviews,” cf. Cal. Health & Safety Code § 27000.5, the ordinary meaning of those terms suggests that they amount to commercial solicitation as City of Austin and Project Veritas discussed the term. This exception’s description of “[a]n internet website, online service, online application, or mobile application for which interactions between users are limited to commercial transactions or to consumer reviews of products,” id. § 27000.5(b)(2)(A), simply describes websites “requesting or seeking to obtain something” or “attempt[ing] . . . to gain business” online,4 see City of Austin, 596 U.S. at 72 (citation omitted). Thus, the exception categorizes websites along lines that have been affirmed as content neutral. See Project Veritas, 125 F.4th at 948–49.
Even NetChoice seems to understand that the Act regulates “communication of all kinds,” indiscriminate of topic or content. Thus, the Act “applies evenhandedly to all who wish to distribute and sell” online. See Heffron, 452 U.S. at 649.

4 It is irrelevant that the Act does not use the term “solicit.” For example, in Heffron, the constitutionally permissible regulation placed burdens on the “[s]ale or distribution of any merchandise, including printed or written material.” 452 U.S. at 643. But that was not enough to make the regulation content based. Here too, the relevant language of the exception—websites “for which interactions between users are limited to commercial transactions or to consumer reviews”—does not rely on “solicit” as a talismanic word but simply describes the concept in the digital context.

ii

NetChoice also argues that the Act’s focus on social media makes the entire Act content based. We disagree. The Act applies to any internet website “including, but not limited to, a ‘social media platform’” that personalizes feeds based on information provided by the user. Cal. Health & Safety Code § 27000.5(b)(1). A “social media platform” is a service whose “substantial function” is to facilitate social interaction. Id.; Cal. Bus. & Profs. Code § 22675(f)(1)(A). According to NetChoice, this definition is content based because it regulates websites based on whether they facilitate “social interaction” or “other forms of content.” But California’s use of “social media platform” as statutory shorthand does not render the Act content based, since it applies to websites whether they facilitate social interaction or other forms of content. So neither the commercial-transactions exception nor the Act’s focus on “social media” platforms makes the Act as a whole content based.

iii

That said, the regulation of like counts in particular is independently content based. See id. § 27002(b)(3).
Like counts are “speech with a particular content.” Sorrell v. IMS Health Inc., 564 U.S. 552, 564 (2011). The Act prohibits platforms from describing posts based on “the idea or message expressed” by the description. Reed, 576 U.S. at 163. A platform may show a post to a minor. And it may presumably tell that minor that other users have interacted with it. But it cannot tell the minor the number of likes or feedback that the post has received. Thus, whether the Act restricts a website’s description of a post turns on what message the description will communicate. See Barr v. Am. Ass’n of Pol. Consultants, 591 U.S. 610, 619 (2020). That is content discrimination. As a result, strict scrutiny applies to this provision.

And the like-count default setting is not the least restrictive way to advance California’s interest in protecting minors’ mental health. Normally, we would remand to the district court to conduct the strict scrutiny analysis. See Boyer v. City of Simi Valley, 978 F.3d 618, 624 (9th Cir. 2020). Here, however, on-point authority compels a single result. Just last year, we addressed a similar question in NetChoice v. Bonta, 113 F.4th 1101 (9th Cir. 2024). Here, as in that case, California could encourage websites “to offer voluntary content filters” related to like counts or educate children and parents on such filters. Id. at 1121; cf. Brown v. Ent. Merchs. Ass’n, 564 U.S. 786, 803 (2011). We see no basis to distinguish that recent case. So we conclude that NetChoice is likely to prevail on the merits of its challenge to the like-count provision as applied to its members. Cf. NetChoice, 761 F. Supp. 3d at 1228.

b

We next address the as-applied challenge to the private-mode default setting. Cal. Health & Safety Code § 27002(b)(5). In private mode, only users connected to a minor’s account (being “friends,” for example) can view or interact with that minor’s posts. Id.
NetChoice argues that this restriction is subject to strict scrutiny because it is speaker based. This restriction may be speaker based. But not all speaker-based laws are subject to strict scrutiny. See Reed, 576 U.S. at 169–70. A speaker preference is problematic only if it “reflects a content preference.” Id. at 170 (quotation omitted). After all, speaker-based distinctions are suspect only because they “are all too often simply a means to control content.” Id. (quotation omitted). Contrary to NetChoice’s suggestion, “[c]haracterizing a distinction as speaker based is only the beginning—not the end—of the inquiry.” Id.; Barr, 591 U.S. at 619–20.

The private-mode provision does not “reflect[] a content preference.” See Reed, 576 U.S. at 170 (quotation omitted). NetChoice does not explain how the private-mode default is a backdoor means of controlling content. As NetChoice concedes, users connected to a minor can share or access “the same message”—any message—that an unconnected user can share. The private-mode default is agnostic as to content and therefore need only survive intermediate scrutiny.

It does so. While not perfectly tailored, this restriction is narrowly tailored. See Williams-Yulee v. Fla. Bar, 575 U.S. 433, 454 (2015). It is not underinclusive enough to raise “doubts about whether the government is in fact pursuing” the asserted interest. Id. at 448 (quotation omitted). In private mode, minors cannot conform their social media habits to maximize interaction and approval of a worldwide audience. This logically serves the end of protecting minors’ mental health by reducing screentime and habit-forming platform usage.
The provision may allow minors “to communicate with unconnected users on other types of services.” But contrary to NetChoice’s contention, that does not mean that the Act is so “riddled with exceptions” that it raises doubts about whether California is trying to mitigate the addictive nature of platforms that provide personalized feeds. Id. (citing City of Ladue v. Gilleo, 512 U.S. 43, 52–53 (1994)). Neither is the provision so overinclusive as to make it “substantially broader than necessary” to achieve California’s interest. TikTok, 145 S. Ct. at 71 (quotation omitted); McCullen v. Coakley, 573 U.S. 464, 486 (2014); Ward v. Rock Against Racism, 491 U.S. 781, 800 (1989). True, the requirement “applies to all covered websites and minor users, regardless of why they are using a particular service.” But California’s interests are wide-ranging. And California took a relatively nuanced approach. So the district court did not err by declining to enjoin the private-mode default setting provision.

3

The final as-applied challenge we consider is directed to the Act’s age-verification requirements. Again, starting January 1, 2027, the Act requires NetChoice’s members to “reasonably determine[]” whether their users are minors. Cal. Health & Safety Code §§ 27001(a)(1)(B), 27002(a)(2). What that means, exactly, will depend on what the California Attorney General says it means to “reasonably determine” the age of users. Id. § 27006(b). Until then, the Act does not require NetChoice members to make any age determinations—the substantive limits on notifications and personalized feeds apply only if NetChoice members already have “actual knowledge” that a specific user is a minor. Id. §§ 27001(a)(1)(A), 27002(a)(1). NetChoice argues that these age-verification requirements will chill users’ access to speech, and that any age-verification procedure will be costly for its members.
But third-party internet users’ interests at large are not enough to challenge this provision as applied to NetChoice’s members. NetChoice cannot assert third-party interests separate from its members. So if internet users believe that the age-verification requirement will burden their speech, they must raise their own challenge. As a result, NetChoice must rely on its members’ pocketbook injury alone, which would be a cognizable injury. See TransUnion LLC v. Ramirez, 594 U.S. 413, 425 (2021). But this injury is neither imminent nor ripe.

Ripeness has constitutional and prudential components. See Twitter, Inc. v. Paxton, 56 F.4th 1170, 1173–74 (9th Cir. 2022). Constitutional ripeness restricts courts from acting before an actual case or controversy arises since we “are not roving commissions assigned to pass judgment on the validity of the Nation’s laws.” Broadrick, 413 U.S. at 610–11. The judicial power may be used only when “necess[ary]” to “adjudicat[e] rights in particular cases between the litigants brought before the Court.” Id. at 611 (quotation omitted). Ripeness asks whether the injury has been suffered—or is imminent enough to invoke the judicial power. See Anderson v. Green, 513 U.S. 557, 559 (1995) (per curiam) (“[R]ipeness is peculiarly a question of timing.” (quotation omitted)).

NetChoice’s members have suffered no ripe injury attributable to the age-verification provision, and none is imminent. The Act does not require NetChoice members to do any age verification before 2027. Cal. Health & Safety Code §§ 27001(a)(1)(B), 27002(a)(2). Moreover, the state attorney general has not yet issued regulations defining what NetChoice members must do to verify users’ age. Nor is there any indication what those regulations will require. At this point, any pocketbook injury caused by the age-verification provision is purely speculative.
And any costs that NetChoice members are incurring now to comply with yet-unknown regulations are voluntarily self-inflicted. See Twitter, 56 F.4th at 1176.

To be sure, a suit can be ripe even if the injury has not yet been suffered. But pre-enforcement review is the exception, not the norm. See Whole Woman’s Health v. Jackson, 595 U.S. 30, 49–50 (2021). Such review is ripe only when the plaintiff intends to act in a way “proscribed by the statute” and when there is “a credible threat” that the violation will be prosecuted. Susan B. Anthony List v. Driehaus, 573 U.S. 149, 158–59 (2014) (quotation omitted). There is not yet any indication what conduct will be “proscribed by the statute.” Id. (quotation omitted). So NetChoice cannot show that it intends to engage in conduct prohibited by the Act. The currently non-existent age-verification requirements do not imminently impose any injury on NetChoice members and may not do so until 2027. And the current prospect that NetChoice members will have to do something—without any indication of what, exactly, that means—is purely “conjectural” or “hypothetical.” See Lujan, 504 U.S. at 560 (quotation omitted); Twitter, 56 F.4th at 1173 (extending these requirements to ripeness). Thus, NetChoice’s as-applied challenge to the age-verification requirement is unripe.

* * *

NetChoice’s as-applied challenges are, for the most part, not likely to succeed on the merits. NetChoice lacks associational standing to challenge the personalized-feeds provisions. Its challenge to the age-verification requirement is unripe. And the private-mode default setting passes constitutional muster under intermediate scrutiny. The like-count default setting, however, triggers (and fails) strict scrutiny.

B

We next address NetChoice’s facial challenges. NetChoice challenges the same provisions discussed above, substantially relying on third-party speech interests.
Unlike in the as-applied area, reliance on the speech of nonlitigants is permissible for facial challenges. Facial challenges are an exception to the general rule that courts adjudicate discrete disputes between the interests of the parties before them. Warth, 422 U.S. at 499. Generally, plaintiffs may claim that a statute is unconstitutional on its face only if there is “no set of circumstances” under which the statute would be valid. United States v. Stevens, 559 U.S. 460, 472 (2010) (quotation omitted). In the First Amendment context, however, the Court has recognized a “second type of facial challenge”—the overbreadth challenge. Id. at 473. Under this framework, we consider whether the law is unconstitutional in “a substantial number” of its applications, “judged in relation to the statute’s plainly legitimate sweep.” Moody, 603 U.S. at 723 (quotation omitted). In other words, we must determine whether the law’s unconstitutional applications “substantially outweigh” the constitutional ones. Id. at 724.

To do this, a court first assesses the law’s scope and then decides “which of the law[’s] applications violate the First Amendment,” measuring the violative applications “against the rest.” Moody, 603 U.S. at 724–25. This requires a plaintiff to prove that the unconstitutional applications are “realistic, not fanciful,” and are “substantially disproportionate” to the lawful sweep. United States v. Hansen, 599 U.S. 762, 770 (2023).

Like all facial challenges, First Amendment overbreadth challenges are “unusual.” Id. at 769. Overbreadth challenges allow the interests of unrepresented third parties to be considered—deviating from normal rules of standing and asking courts to speculate about hypothetical injuries to imaginary persons. See id.; United States v. Sineneng-Smith, 590 U.S. 371, 387 (2020) (Thomas, J., concurring).
They risk “premature interpretation of statutes” based on “factually barebones records” and threaten to “short circuit the democratic process” by preventing states from enforcing a law in its constitutional applications. Wash. State Grange v. Wash. State Republican Party, 552 U.S. 442, 450–51 (2008) (quotation omitted). And because calculating a ratio of constitutional-to-unconstitutional applications risks veering into the purely speculative, the Supreme Court has emphasized that facial challengers bear heavy factual burdens. Moody, 603 U.S. at 726; Hicks, 539 U.S. at 122. Moody did not invent these “hard to win” burdens but emphasizes how they play out when plaintiffs seek internet-wide injunctions related to complex algorithmic speech. See 603 U.S. at 723.

Establishing a facial challenge entails two steps. First, a court must consider how a statute works in “all of its applications.” Id. at 744 (emphasis added). Because the “online world is variegated and complex, encompassing an ever-growing number of apps, services, functionalities, and methods for communication and connection,” this may be nearly impossible. Id. at 725; see also id. at 745 (Barrett, J., concurring) (“[D]ealing with a broad swath of varied platforms and functions in a facial challenge strikes me as a daunting, if not impossible, task.”). To determine “what [a law] covers,” a court needs a massive amount of information about the internet—which cannot be based solely on its own experience. Id. at 725 (majority opinion) (quoting Hansen, 599 U.S. at 770).5

At the second step, the court cannot speculate whether a law “unduly burden[s] expression” without a developed record that explains not only how plaintiffs’ platforms work, but also how a wide range of third-party services operate. Id. at 725–26. This information is necessary to characterize the “denominator” of a law’s applications. Hicks, 539 U.S. at 125 (Souter, J., concurring).
Otherwise, what we know about the “numerator”—even if a statute is unconstitutional as applied to challengers—is not helpful. Id. The test is necessarily relative. Thus, facial challengers cannot focus on a law’s effect on their own platforms—they cannot even limit the inquiry to the tech giants. See Moody, 603 U.S. at 724. They must canvass “the full range of activities” covered by the law and “measure the constitutional against the unconstitutional applications.” Id. Otherwise, we cannot determine whether the ratio of unconstitutional applications is “lopsided.” Hansen, 599 U.S. at 770.

In addition, at this second step, the court needs information not only about the law’s scope, but also about the facts that determine the law’s constitutionality in each of its applications. See Ams. for Prosperity Found. v. Bonta, 594 U.S. 595, 615, 618 (2021). If, for example, a law is unconstitutional because it is insufficiently tailored, the court needs to know the facts that underlie that tailoring determination—for every application. See id. Failing to carry these burdens at either step, thereby treating a facial challenge “more like as-applied claims,” is fatal. Moody, 603 U.S. at 724. The Court has not only made this clear, but it has made it clear to NetChoice. See id.

5 See also Moody, 603 U.S. at 747 (Barrett, J., concurring) (“[T]he analysis is bound to be fact intensive, and it will surely vary from function to function and platform to platform.”); id. at 749 (Jackson, J., concurring) (“Even when evaluating a broad facial challenge, courts must make sure they carefully parse not only what entities are regulated, but how the regulated activities actually function.”); id. at 769 (Alito, J., concurring in judgment) (“[I]t is impossible to determine whether [statutes] are unconstitutional in all their applications without surveying those applications.”).
1

With these principles in mind, we turn to NetChoice’s first facial challenge directed at the Act’s restrictions on minors’ access to algorithmic feeds. The only speech regulated by this provision is the speech made up of the algorithms themselves. All other content—including third-party content created by other users—remains available to minors. Minors can still search for it or follow its creator, after all. The district court concluded that personalized feeds are not necessarily a form of social media platforms’ speech, so restricting personalized feeds does not restrict access to those platforms’ speech. See NetChoice, 761 F. Supp. 3d at 1220–21. This is a novel question, and we are careful not to decide more than necessary—especially given the preliminary and interlocutory posture of this matter and a thin record.6 See id. at 1223.

6 Not to mention that artificial intelligence—also at issue—is a doctrinal and factual question mark all its own. See Moody, 603 U.S. at 795 (Alito, J., concurring) (“[W]hen AI algorithms make a decision, ‘even the researchers and programmers creating them don’t really understand why the models they have built make the decisions they make.’” (quotation omitted)).

Moody left open the outer limits of which algorithm-based feeds are expressive for First Amendment purposes as we discussed above. 603 U.S. at 736 n.5; see supra Section III.A.1. We do not have to push those limits here. As above, all we recognize is that some personalized recommendation algorithms may be expressive, while others are not, and that inquiry is fact intensive. Moody made that much clear. We need go no further because NetChoice “fails to show that any unconstitutional applications of the statute substantially outweigh its constitutional applications.” Project Veritas, 125 F.4th at 937.
At the first step of the overbreadth framework, NetChoice failed to develop a record that would allow the court to “determine [the] law’s full set of applications,” cataloging what “activities, by what actors” the law regulates. Moody, 603 U.S. at 718, 724. Doing so would entail the “daunting, if not impossible” task, id. at 745 (Barrett, J., concurring), of canvassing how the Act applies to an “ever-growing number of apps, services, functionalities, and methods for communication and connection,” id. at 725 (majority opinion).7 In fairness to NetChoice, current constitutional doctrine may make such a showing unrealistic. At any rate, “[i]t is a mystery how NetChoice could expect to prevail on a facial challenge without candidly disclosing the platforms that it thinks the challenged laws reach.” Id. at 787 (Alito, J., concurring).

7 On the current record, we are left to speculate about the full scope of the Act’s coverage provision, and whether every “internet website, online service, online application, or mobile application” that personalizes feeds based on user-provided information is engaged in expressive activity. Cal. Health & Safety Code § 27000.5(a). We are also left to guess whether those applications are the least restrictive means to a compelling interest. See Ams. for Prosperity, 594 U.S. at 615. This is a daunting prospect. Just think: ESPN.com? wsj.com? neopets.com? chess.com? Airbnb? Or any number of thousands of platforms.

NetChoice also fails to carry its burden at the second step. It does not explain what’s different about third-party recommendation algorithms or how they work. Unlike other cases in which the constitutional analysis is materially identical across all applications of a law, here this record development matters. See NetChoice, 113 F.4th at 1116, 1123; Ams. for Prosperity, 594 U.S. at 615. As NetChoice tacitly concedes, the Act does not raise the same First Amendment issues in every conceivable application.
And unchallenged applications of a statute must be assumed constitutional for purposes of a facial challenge—that is the only way the overbreadth framework makes sense. NetChoice bore the burden to rebut this null hypothesis, and it has failed to do so. So, on the current record, we cannot hold the Act is unconstitutionally “lopsided.” Hansen, 599 U.S. at 770; accord Project Veritas, 125 F.4th at 961.

2

Next, we consider NetChoice’s facial challenge to the Act’s default-settings provisions. Unlike the personalized-feeds provisions, these raise the same First Amendment issues “in every application to a covered business.” NetChoice, 113 F.4th at 1116. Thus, we do not need a dense factual record to know how these settings will play out in the real world. Otherwise stated, “the current record allows us to analyze whether the [default settings regime] is likely to violate the First Amendment . . . through a facial challenge.” Id. at 1122.

Every website covered by the Act must, as a default, set minor users’ accounts to block feedback metrics and limit their ability to immediately interact with others. “Whether it be NetChoice’s members or other covered businesses,” all are “under the same statutory obligation.” Id. at 1116. Unlike the personalized-feed restriction, the constitutionality of which depends on the variations in algorithms, platforms’ speech interests are uniformly implicated by the default-settings provisions. As discussed above, the like-count provision is unconstitutional as applied to NetChoice members. See supra Section III.A.2.a.iii. The private-mode provision is not. See supra Section III.A.2.b. Since they apply the same way to all covered websites, NetChoice has established a likelihood of success for its facial challenge to the like-count provision, as well.

3

NetChoice’s facial challenge to the age-verification provisions is unripe for the same reasons as the as-applied challenge. See supra Section III.A.3.
What’s more, without knowing what age verification the Act will require, we cannot determine whether those procedures unconstitutionally chill the speech of users. Nor can we determine whether the requirements are unconstitutional in a substantial number of their applications. That is especially true if, as the district court found, NetChoice members can verify users’ age in the background without requiring user input. See NetChoice, 761 F. Supp. 3d at 1215–16. Given this factual finding by the district court, which is not clearly erroneous, NetChoice is unlikely to prevail on the merits of this facial challenge.

C

NetChoice also argues that the Act is void for vagueness. The prohibition against overly vague laws comes from the Due Process Clause. Williams, 553 U.S. at 304. The question is whether a law gives “fair notice” to a person “of ordinary intelligence” of what is prohibited, or whether the law is “so standardless” that it allows or encourages “seriously discriminatory enforcement.” Id. NetChoice identifies four areas of purported vagueness.

First, NetChoice suggests that the statutory phrase “addictive feed” is vague because it is “pejorative.” While NetChoice may disagree with California’s choice to use the “addictive” label, that is a policy question that the California legislature can answer, whether or not it comports with some scientific definition. To the extent that NetChoice thinks that “addictive” is standardless and will authorize discriminatory enforcement, that argument is foreclosed by the statutory text. If the Act simply prohibited “addictive” algorithms, the Act might feasibly be vague. But context, including statutory definitions, can render an otherwise vague term determinate. Grayned v. City of Rockford, 408 U.S. 104, 111 (1972); Williams, 553 U.S. at 294–97, 306. And the Act defines “addictive feed” and “addictive internet-based service.” Cal. Health & Safety Code § 27000.5(a)–(b).
Those definitions do not circularly rely on the word “addictive” or any purportedly pejorative synonym. See id.

Second, NetChoice argues that the definition of “addictive feed” is vague. Some platforms individualize users’ feeds by listing the content posted on the pages that each user follows. NetChoice says it is unclear whether those platforms provide an “addictive feed.” There is no uncertainty. If a user requests to see media by a particular content creator and the platform merely generates a feed of that content, the platform is exempt from the Act. Id. § 27000.5(a)(4). NetChoice notes that the Act contains a carveout from that exception. If, in addition to providing content from followed pages, a website proposes content based on “other information associated with the user,” the website is not exempt. Id. At best, this exception and carveout create “close cases.” Williams, 553 U.S. at 305–06. In some cases, it might be difficult to determine whether a feed is based on a user’s “follows” or “other information.” Yet there is no question about the statutory criteria. And that is the only indeterminacy that makes a law vague. Id.

Third, NetChoice criticizes the definition of “addictive internet-based service” as an online application that provides a personalized feed “as a significant part” of its service. Cal. Health & Safety Code § 27000.5(b)(1). The Act then exempts websites that “operate[] a feed for the primary purpose of cloud storage.” Id. § 27000.5(b)(2)(B). NetChoice challenges the terms “significant part,” “operates a feed,” and “primary purpose.” NetChoice never explains why these phrases are “so standardless” that they allow or encourage “seriously discriminatory enforcement.” Williams, 553 U.S. at 304. At best, they create space for close cases about when a platform meets the statutory criteria. But the Act provides “a person of ordinary intelligence” fair notice of what those criteria are. See id.
Finally, NetChoice says that the Act’s “numerous exceptions” create “genuine confusion about which services are covered.” But NetChoice does not identify any exception—other than those just discussed—that creates confusion. And on review, each exception is clear enough to allow a person of ordinary intelligence to understand what conduct is prohibited by the Act. At bottom, the Act is not unconstitutionally vague.

D

Because we conclude that the like-count default setting is likely unconstitutional on its face and as applied to NetChoice’s members, we must consider whether it may be excised from the Act. We conclude that this provision is severable, meaning that the Act is not “unconstitutional in its entirety.” Brockett v. Spokane Arcades, Inc., 472 U.S. 491, 506 (1985).

When it comes to state law, state severability principles govern. Leavitt v. Jane L., 518 U.S. 137, 139 (1996) (per curiam). In California, a severability clause “normally calls for sustaining the valid part of the enactment.” Garcia v. City of L.A., 11 F.4th 1113, 1120 (9th Cir. 2021) (quoting Cal. Redevelopment Ass’n v. Matosantos, 267 P.3d 580, 607 (Cal. 2011)). The Act contains a severability clause, dictating that if “any provision” or “application” is invalid, “that invalidity shall not affect other provisions or applications” of the Act that “can be given effect without the invalid provision or application.” Cal. Health & Safety Code § 27007. To that end, “the provisions of [the Act] are declared to be severable.” Id. Under California law, this bolsters the presumption of severability. Matosantos, 267 P.3d at 607–08.

At any rate, to be severable, an invalid provision must be “grammatically, functionally, and volitionally separable” from the rest of the Act. Id. at 607 (quotation omitted). Here, the like-count provision is grammatically severable from the rest of the Act. It is isolated from the Act’s other provisions, and removing it does not affect the “wording” or “coherence” of what remains. See id. at 607–08; Cal. Health & Safety Code § 27002(b)(3). Enjoining enforcement of this provision does not affect the operation of any other default setting outlined in the Act—much less the other provisions of the Act regarding personalized feeds or age verification. Without the offending provision, the Act thus remains “complete in itself.” Matosantos, 267 P.3d at 607–08 (quotation omitted). Finally, the provisions are volitionally separable. The rest of the Act “would have been adopted by the legislative body” even had it “foreseen the partial invalidation of the statute.” Id. (quotation omitted). The severability clause is strong evidence of this factor.

E

NetChoice’s as-applied and facial challenges to the like-count default setting have a likelihood of success on the merits. To obtain an injunction, NetChoice must also show that it is likely to suffer irreparable harm without an injunction, that an injunction is in the public interest, and that the equities favor an injunction. Winter, 555 U.S. at 20, 24. We have held that the latter two elements—the public interest and equities—“merge” in lawsuits against the government. Porretti v. Dzurenda, 11 F.4th 1037, 1047 (9th Cir. 2021).

Starting with irreparable harm: “By bringing a colorable First Amendment claim,” NetChoice “raises the specter of irreparable injury.” Stormans, Inc. v. Selecky, 586 F.3d 1109, 1138 (9th Cir. 2009) (quotation omitted). “The loss of First Amendment freedoms, for even minimal periods of time, unquestionably constitutes irreparable injury.” Elrod v. Burns, 427 U.S. 347, 373 (1976). California does not argue otherwise, and so NetChoice has established irreparable harm as to the like-count provision. See Stormans, 586 F.3d at 1138.
Turning to the balance of the equities and the public interest, we have repeatedly held that when plaintiffs show they are likely to succeed on a First Amendment claim, that “compels a finding” that the equities and public interest favor an injunction. Am. Beverage Ass’n, 916 F.3d at 758 (quoting Cmty. House, Inc. v. City of Boise, 490 F.3d 1041, 1059 (9th Cir. 2007)). That makes sense here, too. Although California may have a weighty interest in preventing minors from becoming addicted to personalized feeds, California can pursue that interest in less intrusive ways. See Doe v. Harris, 772 F.3d 563, 583 (9th Cir. 2014). Indeed, the district court concluded that the remaining injunction factors favored NetChoice given the infringement of its First Amendment rights. See NetChoice, 761 F. Supp. 3d at 1227–28, 1230. The district court did not abuse its discretion in that analysis, and California did not cross-appeal or otherwise challenge that conclusion. Thus, the equities and the public interest favor NetChoice. And because we have concluded that NetChoice has shown a likelihood of success on the merits of its challenges to the like-count provision, we reverse the district court’s denial of an injunction as to that provision and remand to the district court with instructions to enter an order enjoining its enforcement.

IV

For the most part, the district court got it right. NetChoice’s challenges to the personalized-feeds provisions fail because it lacks associational standing for its as-applied challenge and because it has not marshalled an adequate record to support its facial challenge. And the district court correctly concluded that the age-verification challenges are not yet ripe. When it comes to the default-settings provisions, the district court was right that the private-mode default setting passed constitutional muster both as applied and facially.
But when it comes to the like-count default setting, the district court overlooked that the regulation is, itself, content based and thus triggers strict scrutiny. This means that the provision likely fails strict scrutiny for the reasons we explained in NetChoice, 113 F.4th at 1121. Because we conclude that NetChoice has shown a likelihood of success on the merits regarding this provision, and because the district court has already found for NetChoice on the remaining injunction factors, we direct the district court to modify its injunction. In all other respects, we affirm the district court’s denial of a preliminary injunction. AFFIRMED IN PART, REVERSED IN PART, and REMANDED.
FlawCheck shows no negative treatment for Netchoice, LLC v. Bonta in the current circuit citation data.
This case was decided on September 9, 2025.
Use the citation No. 10668502 and verify it against the official reporter before filing.