OK HB4083
Technology; deployers; AI chatbots; minors; age verification systems; emergency situations; effective date.


Status

Introduced: 02/02/2026
In Committee: 02/03/2026

Introduced Session

2026 Regular Session

Bill Summary

An Act relating to technology; providing definitions; directing deployers of chatbots to ensure AI chatbots do not make human-like features available to minors; directing deployers to implement reasonable age verification systems; permitting deployers to provide alternative versions of chatbots without human-like features; directing deployers to ensure social AI companions are not available to minors; providing exemptions for certain therapeutic chatbots; directing deployers to implement and maintain effective systems to detect emergency situations; directing deployers to collect only information that does not conflict with a trusting party's best interest; directing the Attorney General to bring action against businesses or persons who are in violation; creating a private right of action; providing for codification; and providing an effective date.

AI Summary

This bill requires "deployers," meaning entities that operate or distribute artificial intelligence (AI) chatbots, to implement measures to protect minors. Specifically, deployers must ensure that AI chatbots do not offer "human-like features" (defined as behaviors that suggest sentience, emotions, desires, or the ability to form emotional relationships, or that impersonate real people) to individuals under 18. To enforce this, deployers must use reasonable age verification systems, and they may offer younger users alternative versions of chatbots without these human-like features. The bill also mandates that "social AI companions," which are designed to form emotional bonds, not be made available to minors. However, "therapy chatbots," intended for mental health support, are exempt if they meet strict criteria, including clear disclaimers, professional oversight, and proven safety and efficacy. Furthermore, deployers must maintain systems to detect and respond to "emergency situations," in which a user indicates intent to harm themselves or others, prioritizing user safety. Deployers are also restricted to collecting only necessary and relevant user information that aligns with the user's best interests. The Attorney General may take legal action against violators and impose fines, and individuals, including minors or their guardians, may sue for damages and injunctive relief if a chatbot fails to comply with these protections. The act is set to take effect on November 1, 2026.

Sponsors (1)

Last Action

Second Reading referred to Rules (on 02/03/2026)
