In the ordinary course of business, we collect, receive, access, generate, transfer, store, disclose, share, make accessible, protect, secure, dispose of, use, and otherwise process general personal data. Our data processing activities may subject us to numerous data privacy and security obligations, such as various laws, codes, regulations, industry standards, external and internal privacy and security policies, contracts, and other obligations that govern the processing of personal data by us and on our behalf.
In the U.S., federal, state, and local governments have enacted numerous data privacy and security laws, including data breach notification laws, personal data privacy laws, consumer protection laws (e.g., Section 5 of the Federal Trade Commission Act), and other similar laws (e.g., wiretapping laws). The California Consumer Privacy Act ("CCPA") requires businesses to provide specific disclosures in privacy notices and to honor requests of California residents to exercise certain privacy rights. The CCPA provides for civil penalties of up to $7,500 per violation and allows private litigants affected by certain data breaches to recover significant statutory damages. In addition, the California Privacy Rights Act of 2020, which became operative on January 1, 2023, expanded the CCPA's requirements to apply to personal information of business representatives and employees and established a new regulatory agency to implement and enforce the law.
Other states, such as Virginia, Colorado, Utah, and Connecticut, have also passed comprehensive privacy laws, and similar laws are being considered in several other states, as well as at the federal and local levels. These developments may further complicate compliance efforts and may increase legal risk and compliance costs for us and the third parties upon whom we rely.
Additionally, regulations promulgated pursuant to the federal Health Insurance Portability and Accountability Act of 1996, as amended by the Health Information Technology for Economic and Clinical Health Act (collectively, "HIPAA"), establish privacy and security standards that limit the use and disclosure of individually identifiable health information, or protected health information, and require the implementation of administrative, physical, and technological safeguards to protect the privacy of protected health information and ensure the confidentiality, integrity, and availability of electronic protected health information. Determining whether protected health information has been handled in compliance with applicable privacy standards and our contractual obligations can be complex and may be subject to changing interpretation. These obligations may be applicable to some or all of our business activities now or in the future.
If we are unable to properly protect the privacy and security of protected health information, we could be found to have breached our contracts, including HIPAA-required business associate agreements. Further, if we fail to comply with applicable privacy laws, including applicable HIPAA privacy and security standards, we could face civil and criminal penalties. Enforcement activity by the U.S. Department of Health and Human Services can result in financial liability and reputational harm, and responses to such enforcement activity can consume significant internal resources. In addition, state attorneys general are authorized to bring civil actions seeking either injunctions or damages in response to violations that threaten the privacy of state residents. We cannot be sure how these regulations will be interpreted, enforced, or applied to our operations. In addition to the risks associated with enforcement activities and potential contractual liabilities, our ongoing efforts to comply with evolving laws and regulations at the federal and state level may be costly and require ongoing modifications to our policies, procedures, and systems.
Outside of the U.S., an increasing number of laws, regulations, and industry standards apply to data privacy and security. The EU General Data Protection Regulation ("GDPR") and the U.K. GDPR impose strict requirements for processing personal data. Under the GDPR, government regulators may impose temporary or definitive bans on data processing, as well as fines of up to 20 million euros or 4% of annual global revenue, whichever is greater, and we may face private litigation related to the processing of personal data brought by classes of data subjects or consumer protection organizations authorized by law to represent their interests. We also target customers in Asia, have operations in India and Australia, and are subject to new and emerging data privacy regimes in Asia. In addition, privacy advocates and industry groups have proposed, and may propose, standards with which we are legally or contractually bound to comply.
Certain jurisdictions have enacted data localization laws and cross-border personal data transfer laws, which could make it more difficult to transfer information across jurisdictions (such as transferring or receiving personal data that originates in the EU or in other foreign jurisdictions). Existing mechanisms that facilitate cross-border personal data transfers may change or be invalidated. For example, absent appropriate safeguards or other circumstances, the EU GDPR generally restricts the transfer of personal data to countries outside of the EEA that the European Commission does not consider to provide an adequate level of data privacy and security, such as the U.S. The European Commission released a set of Standard Contractual Clauses ("SCCs") that are designed to be a valid mechanism to facilitate personal data transfers out of the EEA to these jurisdictions. In addition, the EU-U.S. Data Privacy Framework (the "Data Privacy Framework"), which went into effect in July 2023, provides a transfer mechanism for relevant U.S.-based organizations that self-certify compliance with, and participate in, the Data Privacy Framework. Currently, the SCCs and certification under the Data Privacy Framework are valid mechanisms to transfer personal data outside of the EEA, but both are subject to legal challenges, and there is no assurance that we can satisfy or rely on these measures to lawfully transfer personal data to the United States. Additionally, the SCCs impose additional compliance burdens, such as conducting transfer impact assessments to determine whether additional security measures are necessary to protect the at-issue personal data. Some European regulators have ordered certain companies to suspend or permanently cease certain transfers out of Europe for allegedly violating the GDPR's cross-border data transfer limitations.
If we cannot implement a valid compliance mechanism for cross-border data transfers from jurisdictions such as the U.K., we may face increased exposure to regulatory actions, substantial fines, and injunctions against processing or transferring personal data from the U.K. or other foreign jurisdictions. The inability to import personal data to the U.S. could significantly and negatively impact our business operations, including by limiting our ability to collaborate with parties that are subject to such cross-border data transfer or localization laws, or by requiring us to increase our personal data processing capabilities and infrastructure in foreign jurisdictions at significant expense.
The privacy of children's personal data collected online is also becoming increasingly scrutinized both in the United States and internationally. For example, the United Kingdom's Age Appropriate Design Code, or AADC, and the incoming Online Safety Bill focus on online safety and the protection of children's privacy online. In the U.S., we may have obligations at the federal level under the Children's Online Privacy Protection Act, or COPPA. COPPA applies to operators or co-operators of commercial websites and online services directed to U.S. children under the age of 13 that collect personal information from children, and to operators of general audience sites with actual knowledge that they are collecting information from U.S. children under the age of 13. Our platform is aimed at a general audience, and any information that we might collect from third-party business partners about data subjects under the age of 13 would be de-identified. There may be situations, however, where, despite the de-identification, we could be alleged to be collecting personal information from children or to be a co-operator under COPPA.
Our obligations related to data privacy and security are quickly changing in an increasingly stringent fashion, creating uncertainty as to the legal framework that will apply in the future. The use and development of AI and machine learning systems is also an area of developing laws, rules, and regulations. Our employees and other personnel may use generative AI technologies to perform their work, and the disclosure and use of personal information in generative AI technologies is subject to various privacy laws and other privacy obligations. Additionally, these obligations may be subject to differing applications and interpretations, which may be inconsistent or conflict among jurisdictions. Preparing for and complying with these obligations requires significant resources and may necessitate changes to our information technologies, systems, and practices and to those of any third parties that process personal data on our behalf. In addition, these obligations may require us to change our business model. Our use of this technology could result in additional compliance costs, regulatory investigations and actions, and consumer lawsuits. If we are unable to use generative AI, it could make our business less efficient and result in competitive disadvantages. We also use AI/machine learning to assist us in making certain decisions, a practice that is regulated by certain privacy laws. Due to inaccuracies or flaws in the inputs, outputs, or logic of the AI/machine learning, the model could be biased and could lead us to make decisions that are biased against certain individuals (or classes of individuals) and that adversely impact their rights, employment, and ability to obtain certain pricing, products, services, or benefits.
Our business model materially depends on our ability to process user engagement data, so we are particularly exposed to the risks associated with the rapidly changing legal landscape. For example, we may be at heightened risk of regulatory scrutiny, and any changes in the regulatory framework could require us to fundamentally change our business model. Moreover, despite our efforts, our personnel or third parties upon whom we rely may fail to comply with such obligations, which could negatively impact our business operations and compliance posture. For example, any failure by a third-party processor to comply with applicable laws, regulations, or contractual obligations (including as a result of a data breach or similar incident) could result in adverse effects, including the inability to operate our business or interruptions in our operations, as well as proceedings against us by governmental entities or others.
If we fail, or are perceived to have failed, to address or comply with data privacy and security obligations, we could face significant consequences. These consequences may include, but are not limited to, government enforcement actions (e.g., investigations, fines, penalties, audits, inspections, and similar actions); litigation (including class-related claims); additional reporting requirements and/or oversight; bans on processing personal data; orders to destroy or not use personal data; and imprisonment of company officials. Any of these events could have a material adverse effect on our reputation, business, or financial condition, including but not limited to: loss of customers; interruptions or stoppages in our business operations (including interruptions or stoppages of data collection needed to train our algorithms); inability to process personal data or to operate in certain jurisdictions; limited ability to develop or commercialize our products; expenditure of time and resources to defend any claim or inquiry; adverse publicity; or revision or restructuring of our operations.
Several jurisdictions around the globe, including Europe and certain U.S. states, have proposed or enacted laws governing AI/machine learning. For example, European regulators have proposed a stringent AI regulation, and we expect other jurisdictions will adopt similar laws. Additionally, certain privacy laws extend rights to consumers (such as the right to delete certain personal data) and regulate automated decision making, which may be incompatible with our use of AI/machine learning. These obligations may make it harder for us to conduct our business using AI/machine learning, lead to regulatory fines or penalties, require us to change our business practices, require us to retrain our AI/machine learning models, or prevent or limit our use of AI/machine learning. For example, the FTC has required other companies to turn over (or disgorge) valuable insights or trained models generated through the use of AI/machine learning where it has alleged the company violated privacy and consumer protection laws. If we cannot use AI/machine learning, or if its use is restricted, our business may be less efficient, or we may be at a competitive disadvantage.
Additionally, we maintain privacy policies and other documentation regarding our processing of personal data. Although we endeavor to comply with our privacy policies and other data protection obligations, we may at times fail to do so or may be perceived to have failed to do so. Moreover, despite our efforts, we may not be successful in achieving compliance if our employees, contractors, service providers, or vendors fail to comply with our policies and documentation. Such failures could subject us to potential foreign, federal, state, and local enforcement actions if our policies or statements about our practices are found to be deceptive, unfair, or misrepresentative of our actual practices. Claims that we have violated individuals' privacy rights or failed to comply with privacy policies and other data protection obligations, even if we are not found liable, could be expensive and time-consuming to defend and could result in adverse publicity that could harm our business. We are also bound by contractual obligations related to data privacy and security (including obligations related to industry standards), and our efforts to comply with such obligations may not be successful. For example, certain privacy laws, such as the GDPR and the CCPA, require our customers to impose specific contractual restrictions on their service providers. Additionally, some of our customer contracts require us to host personal data locally.
We may in the future receive inquiries from or be subject to investigations by data protection authorities regarding, among other things, our privacy, data protection, and information security practices. Any such investigations could impact our brand reputation, subject us to monetary remedies and costs, interrupt or require us to change our business practices, divert resources and the attention of management from our business, or subject us to other remedies that adversely affect our business.