We are subject to numerous US federal and state as well as foreign laws and regulations covering a wide variety of subjects, and our introduction of new businesses, products, services, and technologies will likely continue to subject us to additional laws and regulations. In recent years, governments around the world have proposed and adopted a large number of new laws and regulations relevant to the digital economy, particularly in the areas of data privacy and security, competition, AI, and online content. The costs of compliance with these measures are high and are likely to increase in the future, including as a result of differing, and sometimes conflicting, laws and regulations.
New or changing laws and regulations, or interpretations or applications of existing laws and regulations in a manner inconsistent with our interpretations of such laws and regulations or our practices, have resulted in, and may continue to result in, less useful products and services, altered business models and operations, limited ability to pursue certain business practices or offer certain products and services, substantial costs, and civil or criminal liability. Examples include laws and regulations regarding:
- Competition and technology platforms' business practices: Laws and regulations focused on large technology platforms, including the Digital Markets Act in the European Union (EU) and the Act on Promotion of Competition for Specified Smartphone Software in Japan; regulations and legal settlements in the US, South Korea, and elsewhere that affect Google Play's billing policies, fees, and business model; as well as litigation and new and expected regulations in a range of jurisdictions.
- AI: Laws and regulations focused on the development, use, and provision of AI technologies and other digital products and services, which could result in monetary penalties or other regulatory actions. For example, the EU AI Act came into force on August 1, 2024, and will generally become fully applicable after a two-year transitional period (although certain obligations have already taken effect). The EU AI Act introduces various requirements for AI systems and models placed on the market in the EU, including specific transparency, safety, and copyright requirements for general purpose AI systems and the models on which those systems are based. Various countries, including Brazil, India, Japan, South Korea, Singapore, and Vietnam, have also enacted or are considering enacting regulations focused on AI. In the US, an increasing amount of legislative and regulatory activity regarding AI is taking place at the state level. In 2025, state legislatures considered more than 1,000 AI-related bills, covering topics such as foundation model research and development, synthetic media, and algorithmic decision-making, and took a variety of approaches to AI regulation. For instance, in 2025, California and New York passed the Transparency in Frontier Artificial Intelligence Act and the Responsible AI Safety and Education Act, respectively, each of which imposes safety and reporting obligations on developers of frontier models. At the same time, the White House's Executive Order, Removing Barriers to American Leadership in Artificial Intelligence, prioritizes deregulation, while its AI Action Plan emphasizes accelerating American innovation leadership.
- Data privacy, collection, processing, and portability: Laws and regulations further restricting the collection, processing, or sharing of user or advertising-related data, including privacy and data protection laws; laws affecting the processing of children's data (as discussed further below); data breach notification laws; laws limiting data transfers (including data localization laws); laws limiting use of data for AI training; and laws requiring data portability.
- Copyright and other intellectual property: Copyright and related laws, including the EU Directive on Copyright in the Digital Single Market and European Economic Area transpositions, which have introduced new licensing regimes, increased liability with respect to content uploaded by users or linked to from our platforms, and created property rights in news publications that could require payments to news agencies and publishers, and which may result in other regulatory actions. The scope of the text and data mining exception is being challenged before courts in the EU, which could harm some aspects of our business.
- Content moderation: Various laws covering content moderation and removal, and related disclosure obligations, such as the EU's Digital Services Act, Florida's Senate Bill 7072, and Texas' House Bill 20, as well as laws and proposed legislation in Singapore, Australia, and the United Kingdom (UK). These measures impose penalties for failure to remove certain types of content or require disclosure of information about the operation of our services and algorithms, which may make it harder for services like Google Search and YouTube to detect and limit low-quality, deceptive, or harmful content or, on the other hand, may impinge on the rights of free expression and access to content. Additionally, new regulations apply to online child safety, including access and content restrictions as well as other limitations for minors, which may also conflict with rights of free expression and access to information. These regulations could result in our having to modify our products and services and monitor minors' experiences on them.
- Consumer protection: Consumer protection laws, including the EU's New Deal for Consumers, which could result in monetary penalties and create a range of new compliance obligations.
In addition, the applicability and scope of these and other laws and regulations, as interpreted by courts, regulators, or administrative bodies, remain uncertain and could be interpreted in ways that harm our business. For example, we rely on statutory safe harbors, like those set forth in the Digital Millennium Copyright Act and Section 230 of the Communications Decency Act in the US and the Digital Services Act in Europe, to protect against liability for various linking, caching, ranking, recommending, and hosting activities. Legislation or court rulings affecting these safe harbors may harm us and may impose significant operational challenges. There are legislative proposals and pending litigation in the US, EU, and around the world that could diminish or eliminate safe harbor protection for websites and online platforms. Our development, use, and commercialization of AI products and services (including our implementation of AI in our offerings and internal systems) could subject us to regulatory action and legal liability, including under specific legislation regulating AI, as well as new applications of existing data protection, cybersecurity, privacy, intellectual property, and other laws.
Further, we are subject to evolving laws, regulations, policies, and international accords relating to matters beyond our core products and services, including environmental sustainability, climate change, human capital, and employment matters. In response, we have implemented robust programs and initiatives and adopted reporting frameworks and principles that may require considerable investments. For instance, AI's energy and water demands have made our efforts to reduce emissions more complex and challenging. We cannot guarantee that our initiatives will be fully realized on the timelines we expect, or at all, and projects that are completed as planned may not achieve the results we anticipate.