House File 2715 - Introduced

HOUSE FILE 2715

BY COMMITTEE ON ECONOMIC GROWTH AND TECHNOLOGY
(SUCCESSOR TO HSB 647)

A BILL FOR

An Act relating to chatbots, including deployer requirements and interactions with minors.

BE IT ENACTED BY THE GENERAL ASSEMBLY OF THE STATE OF IOWA:

TLSB 5402HV (2) 91 dg/jh
Section 1. NEW SECTION. 554J.1 Definitions.
1. “AI companion” means a public-facing chatbot designed to simulate a human-like romantic or emotional bond.
2. “Artificial intelligence” means a machine-based system that, for any explicit or implicit objective, infers from the input the system receives to generate output that can influence physical or virtual environments.
3. a. “Chatbot” means artificial intelligence that is described by all of the following:
(1) The artificial intelligence accepts open-ended, natural-language, or multimodal user input and produces adaptive or context-responsive output.
(2) The artificial intelligence produces new expressive content or responses that were not fully predetermined by the person who created or who operates the artificial intelligence.
b. “Chatbot” does not include a service limited to internal business operations or a service requiring user authentication through an employer, an educational institution, or a similar organization.
4. “Commercially reasonable” means a practice that is consistent with prevailing industry standards and proportionate to the size, resources, and technical capabilities of the person employing the practice.
5. “Deployer” means a person that makes an AI companion, a public-facing chatbot, or a therapeutic chatbot available to users in this state.
6. “Minor” means an individual who is under eighteen years of age.
7. “Output” means new content an artificial intelligence generates based on a user’s input in relation to the data used to train the artificial intelligence. “Output” includes but is not limited to audio-visual media, images, predictions, recommendations, and text.
8. a. “Public-facing chatbot” means a chatbot intentionally made available to the general public or marketed directly to
consumers for independent use without the ongoing supervision of the deployer or an institutional consumer.
b. “Public-facing chatbot” does not include any of the following:
(1) Software designed primarily for internal business operations.
(2) Enterprise software licensed to a specific business, nonprofit organization, or governmental entity.
(3) Chatbots used solely within the context of an existing customer relationship.
(4) Systems requiring authentication through an employer, educational institution, health care provider, or similar organization prior to use.
9. “Therapeutic chatbot” means a public-facing chatbot that is designed for the primary purpose of providing mental health support, counseling, or therapy by diagnosing, treating, mitigating, or preventing a mental health condition.

Sec. 2. NEW SECTION. 554J.2 Public-facing chatbots —— general deployer requirements.
1. A deployer of a public-facing chatbot shall do all of the following:
a. Implement and maintain protocols meant to detect, respond to, report, and mitigate harm the public-facing chatbot may cause a user in a manner that takes commercially reasonable steps to protect the safety and well-being of users.
b. Limit the collection and storage of user information collected by the public-facing chatbot to what is necessary to fulfill the deployer’s purpose for making the public-facing chatbot publicly available.
c. Clearly and conspicuously disclose each time the deployer’s public-facing chatbot begins an interaction with a user that the public-facing chatbot is artificial intelligence and is not licensed as a medical, legal, financial, or mental health professional.
d. At each three-hour interval of the deployer’s
public-facing chatbot continuously interacting with a user, clearly and conspicuously disclose the public-facing chatbot is artificial intelligence and is not licensed as a medical, legal, financial, or mental health professional.
e. Implement protocols for the deployer’s public-facing chatbot for responding to user prompts indicating the user has suicidal ideations or the intent to cause self-harm. Protocols shall include but are not limited to making reasonable efforts to refer the user to crisis service providers such as a suicide hotline, crisis text line, or other appropriate service.
2. A deployer shall not knowingly or recklessly design or make a public-facing chatbot available that does any of the following:
a. Misleads a reasonable user into believing the public-facing chatbot is a specific human being.
b. Misleads a reasonable user into believing the public-facing chatbot is licensed by the state.
c. Encourages, promotes, or coerces a user to commit suicide, perform acts of self-harm, or engage in sexual or physical violence against a human or an animal.

Sec. 3. NEW SECTION. 554J.3 Chatbot interactions with minors.
1. a. A deployer of an AI companion or a therapeutic chatbot shall implement commercially reasonable measures to determine whether a user is a minor. The measures must use a risk-based approach appropriate with the nature of the public-facing chatbot and the reasonably foreseeable harm that may come from using the public-facing chatbot.
b. Reasonable measures to determine whether a user is a minor may include self-attestation, technical measures, or other commercially reasonable approaches.
c. This section shall not be construed to require a deployer to verify a user’s age using government-issued identification.
2. A deployer of an AI companion or a therapeutic chatbot shall implement protocols for sending a notification to a
minor user’s parent, legal guardian, or legal custodian when the minor user enters a prompt indicating the minor user has suicidal ideations or the intent to cause self-harm.
3. A deployer shall only make a therapeutic chatbot available for a minor’s use or purchase if all of the following apply:
a. The therapeutic chatbot was recommended for the minor’s use by an individual licensed under chapter 154B or 154D after performing an evaluation of the minor.
b. The therapeutic chatbot’s developer has significant documentation of how the public-facing chatbot was tested.
c. Peer-reviewed clinical trial data exists demonstrating the therapeutic chatbot would be a safe, effective tool for the minor’s diagnosis, treatment, mitigation, or prevention of a mental health condition.
d. The therapeutic chatbot’s deployer provided clear disclosures of the therapeutic chatbot’s functions, limitations, and data privacy policies to the individual recommending the therapeutic chatbot under paragraph “a”, and to the minor’s parents, guardians, or custodians.
e. The therapeutic chatbot’s deployer developed and implemented protocols for testing the therapeutic chatbot for risks to users, identifying possible risks the therapeutic chatbot poses to users, mitigating risks the therapeutic chatbot poses to users, and quickly rectifying harm the therapeutic chatbot may have caused a user.
4. A deployer shall not be liable for a user’s misrepresentation of age if the deployer has made commercially reasonable efforts to comply with this section.

Sec. 4. NEW SECTION. 554J.4 Enforcement.
1. The attorney general may bring an action on behalf of the state to enforce the provisions of this chapter and may seek an injunction for violations of this chapter.
2. a. A court may issue a civil penalty of not more than two thousand five hundred dollars for each violation of this
chapter, or seven thousand five hundred dollars if a person violates an injunction issued under this chapter.
b. Penalties assessed under this subsection shall be deposited into the general fund of the state.
3. a. Prior to initiating a proceeding to obtain a civil penalty, the attorney general shall notify a person in violation of this chapter of the violation and give the person thirty calendar days from the date the attorney general notified the person to cure the violation.
b. This subsection shall not apply if a violation will cause imminent harm to a minor.

Sec. 5. NEW SECTION. 554J.5 Safe harbor.
A deployer that makes commercially reasonable efforts to comply with this chapter shall not be subject to liability for unforeseeable or emergent outputs generated by the deployer’s public-facing chatbot.

Sec. 6. NEW SECTION. 554J.6 Construction.
This chapter shall be narrowly interpreted to apply only to consumer-facing artificial intelligence. This chapter shall not be construed to regulate internal business technologies.

Sec. 7. LEGISLATIVE INTENT. It is the intent of the general assembly to support innovation in artificial intelligence while establishing reasonable consumer protections for systems designed for independent public use.

EXPLANATION
The inclusion of this explanation does not constitute agreement with the explanation’s substance by the members of the general assembly.

This bill relates to public-facing chatbots (public chatbots).
The bill defines “AI companion” as a chatbot designed to simulate human-like romantic or emotional bonds.
The bill defines “deployer” as a person that makes an AI companion, a public chatbot, or a therapeutic chatbot available to users in this state.
The bill defines “public-facing chatbot” as a chatbot
intentionally made available to the general public or marketed directly to consumers for independent use without the ongoing supervision of the deployer or an institutional consumer.
The bill defines “therapeutic chatbot” as a public chatbot that is designed for the primary purpose of providing mental health support, counseling, or therapy by diagnosing, treating, mitigating, or preventing a mental health condition.
The bill also defines “artificial intelligence” (AI), “chatbot”, “commercially reasonable”, “minor”, and “output”.
The bill requires a deployer to implement and maintain protocols meant to detect, respond to, report, and mitigate harm the deployer’s public chatbot may cause a user in a manner that takes commercially reasonable steps to protect the well-being of users; limit the collection and storage of user information collected by a public chatbot to what is necessary to fulfill the deployer’s purpose for making the public chatbot publicly available; clearly and conspicuously disclose the public chatbot is AI and not a licensed medical, legal, financial, or mental health professional at the beginning of the public chatbot’s interaction with a user and at three-hour intervals of continuous interaction with the user; and implement protocols to respond to user prompts indicating the user has suicidal ideations or the intent to cause self-harm.
The bill prohibits deployers from knowingly or recklessly designing or making a public chatbot available if the public chatbot misleads a reasonable user into believing the public chatbot is a specific human being; misleads a reasonable user into believing the public chatbot is licensed by the state; or encourages, promotes, or coerces a user to commit suicide, perform acts of self-harm, or engage in sexual or physical violence against a human or an animal.
The bill requires a deployer of an AI companion or a therapeutic chatbot to implement commercially reasonable measures to determine whether a user is a minor. The measures must use a risk-based approach appropriate with the nature of the public chatbot and the reasonably foreseeable harm that may come from using the public chatbot. Reasonable measures may include self-attestation, technical measures, or other commercially reasonable approaches, but the bill is not to be construed to require a deployer to verify a user’s age using government-issued identification.
The bill requires a deployer of an AI companion or a therapeutic chatbot to implement protocols for sending a notification to a minor user’s parent, legal guardian, or legal custodian when the minor user enters a prompt indicating the minor user has suicidal ideations or the intent to cause self-harm.
The bill prohibits a deployer from making a therapeutic chatbot available for a minor’s use or purchase unless the deployer meets requirements related to therapeutic chatbot use and safety as detailed in the bill.
The bill provides that a deployer is not liable for a user’s misrepresentation of age if the deployer has made commercially reasonable efforts to comply with the bill.
The bill authorizes the attorney general to bring actions on behalf of the state to enforce the bill and seek injunctions for violations of the bill.
The bill allows a court to issue a civil penalty of not more than $2,500 for each violation of the bill, or $7,500 if the person violated an injunction issued under the bill. Civil penalties assessed under the bill shall be deposited into the general fund of the state.
The bill requires the attorney general to notify a person in violation of the bill of the person’s violation and give the person 30 calendar days from the date the attorney general notified the person to cure the violation prior to the attorney general initiating a proceeding to obtain a civil penalty. However, the requirement to notify a person shall not apply if a violation will cause imminent harm to a minor.
The bill provides that a deployer who makes commercially reasonable efforts to comply with the bill shall not be subject to liability for unforeseeable or emergent outputs generated by the deployer’s public chatbot.
The bill shall be narrowly interpreted to apply only to consumer-facing AI. The bill shall not be construed to regulate internal business technologies.
The bill states the legislative intent of the general assembly to support innovation in AI while establishing reasonable consumer protections for systems designed for independent public use.