As Introduced

136th General Assembly

Regular Session H. B. No. 628

2025-2026

Representative Mathews, T.

Cosponsors: Representatives Ritter, Hall, T., Deeter, Fischer

To enact sections 3755.01, 3755.02, 3755.03, 3755.04, 3755.041, 3755.05, 3755.06, 3755.07, 3755.08, 3755.09, 3755.091, 3755.10, 3755.11, and 3755.12 of the Revised Code to create an independent verification organization license for verifying artificial intelligence risk mitigation.

BE IT ENACTED BY THE GENERAL ASSEMBLY OF THE STATE OF OHIO:

Section 1. That sections 3755.01, 3755.02, 3755.03, 3755.04, 3755.041, 3755.05, 3755.06, 3755.07, 3755.08, 3755.09, 3755.091, 3755.10, 3755.11, and 3755.12 of the Revised Code be enacted to read as follows:

Sec. 3755.01. As used in this chapter:

(A) "Artificial intelligence application" means a software program or system that uses artificial intelligence models to perform tasks that typically require human intelligence.

(B) "Artificial intelligence model" means an engineered or machine-based system that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.

(C)(1) "Deployer" means a person or entity that implements, integrates, or makes operational an artificial intelligence model or artificial intelligence application within this state.

(2) "Deployer" includes a person or entity that makes an artificial intelligence model or artificial intelligence application available for use by others within this state, whether directly or as part of a product or service.

(D) "Developer" means a person or entity that develops an artificial intelligence model or artificial intelligence application that is deployed in this state.

(E) "Independent verification organization" means an entity licensed by the attorney general pursuant to section 3755.03 of the Revised Code to assess artificial intelligence models' or applications' adherence to standards reflecting best practices for the prevention of personal injury and property damage.

(F) "Security vendor" means a third-party entity engaged by an independent verification organization or developer to evaluate the safety or security of an artificial intelligence model or artificial intelligence application, using processes such as red teaming, risk detection, and risk mitigation.

Sec. 3755.02. An application for an independent verification organization license shall be made by filing with the attorney general the information, materials, and forms specified in rules adopted by the attorney general, along with a plan detailing all of the following information:

(A)(1) The risk or risks with respect to which the applicant intends to verify that artificial intelligence models or artificial intelligence applications achieve acceptable levels of mitigation;

(2) For each risk detailed in division (A)(1) of this section, a proposed definition of acceptable levels of mitigation, along with all of the following:

(a) Measurable outcome metrics that are reasonable proxies for the attainment of acceptable levels of risk mitigation;

(b) Baselines and targets for the outcome metrics described in division (A)(2)(a) of this section, identified data sources, and specific measurement methods;

(c) A description of the evaluation and reporting protocol the independent verification organization will use to determine whether verified artificial intelligence models or artificial intelligence applications meet the outcome metrics described in division (A)(2)(a) of this section on an ongoing basis.

(B) Proposed technical and operational requirements for developers or deployers, including both pre-development and post-development procedures, to ensure that an artificial intelligence model or artificial intelligence application achieves acceptable levels of risk mitigation, with requirements for both of the following:

(1) Ongoing monitoring of risks;

(2) Ongoing assessment of the efficacy of mitigation measures.

(C) The methodologies and sources the applicant proposes using to evaluate the efficacy of its technical and operational requirements in ensuring acceptable risk mitigation and to update those requirements to address identified gaps or deficiencies;

(D) The benchmarks, technologies, and audit methodologies the applicant proposes using to verify developers' and deployers' adherence to its technical and operational requirements;

(E) The applicant's approach to ensuring continual good standing of a verified entity, including by reviewing and assessing the developer's or deployer's maintenance of artificial intelligence governance plans and policies, processes for risk monitoring and mitigation, whistleblower protections, and training for employees and third parties;

(F) The applicant's proposed requirements for developers or deployers to disclose to the applicant detected risks, incident reports, or material changes to the risk profile of the model or application, including risks detected prior to verification and risks resulting from fine-tuning or modifying an artificial intelligence model or artificial intelligence application after verification;

(G) The applicant's proposed procedure for prescribing and verifying implementation of corrective action to remedy an identified failure by a developer or deployer to do any of the following:

(1) Achieve acceptable risk mitigation with respect to an artificial intelligence application or artificial intelligence model;

(2) Comply with any other requirements promulgated by the applicant;

(3) Comply with the developer's or deployer's artificial intelligence governance plans and policies.

(H) The applicant's proposed standards and procedures for revoking the verification of artificial intelligence models or artificial intelligence applications for noncompliance with the applicant's requirements, failure to achieve acceptable levels of risk mitigation, or noncompliance with the developer's or deployer's artificial intelligence governance plans and policies;

(I) Whether the applicant proposes providing verification for one or more particular actual or potential artificial intelligence industry market segments and, if so, how the elements of the applicant's plans are tailored to any unique attributes of that market segment;

(J) The applicant's plan for interfacing and coordinating effectively with federal and state authorities;

(K) The applicant's personnel and the qualifications of those personnel;

(L) Whether the applicant's proposed risk mitigation procedures described in division (B) of this section will involve the use of security vendors;

(M) The applicant's governance policies, sources of funding, and policies to ensure its independence in carrying out its responsibilities under this chapter;

(N) Any other information the attorney general requires.

Sec. 3755.03. (A) The attorney general may license an applicant as an independent verification organization if the attorney general determines both of the following:

(1) That the applicant has demonstrated its independence from the artificial intelligence industry;

(2) That every element of the applicant's plan required under section 3755.02 of the Revised Code is adequate to ensure that artificial intelligence models or artificial intelligence applications verified pursuant to the plan will mitigate to an acceptable level one or more risks for which the applicant proposes conducting verification.

(B) If an applicant proposes conducting verification for a particular artificial intelligence market segment, the attorney general shall, in determining the plan's adequacy, account for the characteristics of the relevant market segment.

(C) If the attorney general finds that an applicant's plan under section 3755.02 of the Revised Code adequately mitigates some, but not all, of the proposed risks, the applicant shall be licensed to verify only those risks for which the plan is deemed adequate.

(D)(1) In licensing an independent verification organization, the attorney general shall expressly and specifically identify the risks for which the independent verification organization is licensed to conduct verification.

(2) If the independent verification organization proposes conducting verification for one or more specific market segments, the attorney general shall expressly and specifically identify the market segments for which the independent verification organization is licensed to conduct verification.

Sec. 3755.04. The attorney general shall revoke an independent verification organization's license if the attorney general determines any of the following:

(A) The independent verification organization's plan is materially misleading or inaccurate.

(B) The independent verification organization fails to adhere to its plan in a manner that materially impairs its ability to fulfill its responsibilities, including failure to adhere to the plan's procedures for ongoing monitoring of verified artificial intelligence models or applications and implementation of corrective action.

(C) A material change compromises the independent verification organization's independence from the artificial intelligence industry.

(D) Evolution of technology renders the independent verification organization's methods obsolete for ensuring acceptable mitigation of the risks the attorney general has licensed the independent verification organization to verify.

(E) An artificial intelligence model or artificial intelligence application verified by the independent verification organization causes a material harm of the type the independent verification organization seeks to prevent by establishing acceptable risk levels.

Sec. 3755.041. Notwithstanding section 3755.04 of the Revised Code, if the attorney general determines that the public interest so requires, the attorney general may provide an independent verification organization with an opportunity to cure the basis for revocation before revoking the independent verification organization's license.

Sec. 3755.05. (A) The attorney general shall establish reasonable application fees and annual renewal fees for independent verification organizations licensed under this chapter sufficient to offset the costs incurred by the attorney general in administering this chapter.

(B) All fees assessed pursuant to this section shall be made payable to the attorney general.

(C) All fees collected pursuant to this section shall be used only to pay for the following:

(1) Processing of applications for independent verification organization licensure;

(2) Auditing of licensed independent verification organizations;

(3) Payment of members of the artificial intelligence safety advisory council;

(4) Other costs arising from the administration of this chapter.

Sec. 3755.06. (A) The artificial intelligence safety advisory council is established in the office of the attorney general, in conjunction with the auditor of state. The attorney general shall, after consulting with the auditor of state, determine the appropriate size of the council and appoint all members.

(B) The attorney general may delegate powers and duties provided to the attorney general under this chapter to the advisory council, including the licensing of independent verification organizations.

(C) The advisory council shall include at least one member representing the interests of civil society, including non-governmental organizations, educational and research institutions, public policy institutes, or consumer and business advocacy organizations.

(D) All members of the advisory council shall do all of the following:

(1) Remain free from undue influence and refrain from taking any action that could compromise their ability to carry out their responsibilities under this chapter or otherwise cast doubt on their ability to independently assess artificial intelligence models or artificial intelligence applications;

(2) Refrain from any action or occupation, whether gainful or not, incompatible with their duties, including employment by a developer or deployer of an artificial intelligence model or artificial intelligence application;

(3)(a) Refrain from owning or acquiring any equity or other interest, directly or indirectly, in any company whose business consists in significant part of developing or deploying artificial intelligence models or artificial intelligence applications.

(b) Division (D)(3)(a) of this section does not apply to equity acquired via a mutual fund or an exchange traded fund.

(4) Be precluded from accepting employment from an entity licensed or seeking licensure as an independent verification organization, or from an artificial intelligence model or artificial intelligence application developer or deployer, for a period of one year after leaving the advisory council;

(5) Have the required qualifications, experience, and skills to perform their duties, including evaluating whether the plan provided by an applicant for an independent verification organization license ensures acceptable risk mitigation, and determining standards for evaluating the plans of applicants.

(E) No member of the advisory council may serve for more than two consecutive terms.

(F) Each member of the advisory council shall receive the member's actual and necessary expenses incurred in the discharge of the member's duties. Each member may also receive a salary for carrying out the member's duties under this chapter.

(G) Members of the advisory council may be removed by the attorney general for inefficiency, neglect of duty, or malfeasance of office.

(H) A majority of the members of the advisory council constitutes a quorum, and a concurrence of a majority of a quorum is sufficient for any determination of the council.

(I) The advisory council shall keep a record of its proceedings, including all considerations relating to the issuance, refusal, renewal, and revocation of independent verification organization licensure.

(J) The advisory council shall publish redacted versions of reports issued by independent verification organizations on the attorney general's web site.

Sec. 3755.07. (A) An independent verification organization licensed pursuant to this chapter shall implement the plan submitted to the attorney general to verify artificial intelligence models' or artificial intelligence applications' ongoing mitigation of the risks that the independent verification organization is licensed to verify.

(B) Nothing in this chapter requires a developer or deployer of an artificial intelligence model or artificial intelligence application to seek verification from an independent verification organization.

Sec. 3755.08. An independent verification organization licensed pursuant to this chapter shall revoke the verification of any artificial intelligence model or artificial intelligence application whose developer or deployer fails to do any of the following:

(A) Meet the requirements prescribed by the independent verification organization;

(B) Cooperate with the independent verification organization's ongoing monitoring;

(C) Adhere to its artificial intelligence governance plans or policies;

(D) Implement corrective actions prescribed by the independent verification organization.

Sec. 3755.09. (A) An independent verification organization licensed pursuant to this chapter may, at any time, update or modify its technical and operational requirements, evaluation benchmarks, audit methodologies, governance plans, or any other element of its plan in order to do any of the following:

(1) Take advantage of improved technology;

(2) Address issues previously discovered;

(3) Otherwise enhance the efficacy of its verification activities.

(B) An independent verification organization shall provide written notice to the attorney general of any material changes under division (A) of this section. The independent verification organization shall describe the proposed changes, the rationale for the proposed changes, and an explanation of how the proposed changes will better enable the independent verification organization to ensure acceptable mitigation of the relevant risks.

(C) An independent verification organization may implement changes proposed under division (A) of this section upon delivery of the written notice required by division (B) of this section.

Sec. 3755.091. (A) The attorney general may, within six months after receiving notice of proposed changes under division (B) of section 3755.09 of the Revised Code, request additional information from the independent verification organization regarding the proposed changes or may issue a written notice denying the changes in whole or in part, if the attorney general determines that the proposed changes are inadequate to ensure acceptable mitigation of the relevant risks or are otherwise inconsistent with the goals of this chapter.

(B) If the attorney general denies the changes in whole or in part under division (A) of this section, the independent verification organization has thirty days to modify its plan to comply with the attorney general's determination and to assess whether artificial intelligence models or artificial intelligence applications assessed under the previous plan must be reassessed.

Sec. 3755.10. (A) An independent verification organization shall submit an annual report to the general assembly, the attorney general, and the auditor of state. The report shall be filed at the time and in the form prescribed by the attorney general, shall be duly verified, and shall cover the yearly period fixed by the attorney general. The report shall include all of the following:

(1) Aggregated information on the capabilities of the artificial intelligence models and artificial intelligence applications evaluated by the independent verification organization, the observed societal risks and benefits associated with those capabilities, and the potential societal risks and benefits associated with those capabilities;

(2) The adequacy of existing evaluation resources and mitigation measures to address observed and potential risks;

(3) Aggregated results of verification assessments;

(4) Remedial measures prescribed by the independent verification organization and whether the developer or deployer complied with those measures;

(5) Anonymized descriptions of additional risks beyond those the independent verification organization is licensed to verify and the adequacy of existing mitigation measures to address those risks;

(6) A list of all artificial intelligence models or artificial intelligence applications verified by the independent verification organization;

(7) A description of the independent verification organization's methods, technologies, and administrative procedures for evaluating risk mitigation by artificial intelligence models and artificial intelligence applications;

(8) A description of any changes to the independent verification organization's governance policies or sources of funding, and of any other changes that would call into question its independence in carrying out its responsibilities under this chapter.

(B) Nothing in this section shall be construed to prohibit an independent verification organization from redacting any material that the independent verification organization determines, in good faith, would risk revealing any of the following:

(1) Trade secrets;

(2) Competitively sensitive information;

(3) Personal identifying information;

(4) Information that otherwise presents a risk to the security of an artificial intelligence model or artificial intelligence application if publicly disclosed.

(C) An independent verification organization shall retain all documentation used for the preparation of the report required under division (A) of this section for ten years following the submission of the report.

Sec. 3755.11. (A) In a civil action asserting claims for personal injury or property damage caused by an artificial intelligence model or artificial intelligence application, there is a rebuttable presumption against liability if all of the following apply:

(1) The artificial intelligence model or artificial intelligence application in question was verified by a licensed independent verification organization at the time of the plaintiff's alleged injuries.

(2) The plaintiff's alleged injury arose from a risk that the independent verification organization was licensed to verify and with respect to which the independent verification organization verified the artificial intelligence model or artificial intelligence application.

(3) The artificial intelligence model or artificial intelligence application fell within the specified market segment, if any, for which the independent verification organization was licensed to conduct verification.

(B) The presumption against liability described in division (A) of this section may be rebutted upon showing, by clear and convincing evidence, both of the following:

(1) The defendant did any of the following:

(a) Engaged in intentional, willful, or reckless misconduct;

(b) Induced the independent verification organization to verify based on material misrepresentations or omissions about the artificial intelligence model or artificial intelligence application or the risks such artificial intelligence model or artificial intelligence application poses;

(c) Failed to adhere to representations made to the independent verification organization;

(d) Failed to satisfy a condition upon which the verification was expressly based;

(e) Failed to disclose to the independent verification organization newly identified risks, known shortcomings of existing mitigation measures, material changes to the risk profile of the verified artificial intelligence model or artificial intelligence application, or any other information required to be disclosed to the independent verification organization under the independent verification organization's verification requirements;

(f) Failed to implement corrective action prescribed by the independent verification organization to address identified risks.

(2) The conduct described in division (B)(1) of this section was a proximate cause of the plaintiff's alleged injuries.

Sec. 3755.12. (A) The attorney general shall adopt rules pursuant to Chapter 119. of the Revised Code to implement and administer this chapter.

(B) Rules adopted under division (A) of this section shall do at least all of the following:

(1) Establish conflict of interest rules for independent verification organizations, including reporting requirements regarding the independent verification organization's funding sources and revenue generation, and self-audit requirements regarding the independent verification organization's board composition, to ensure adequate independence from the artificial intelligence industry;

(2) Identify any additional elements required in an applicant's plan to ensure acceptable mitigation of risk from any artificial intelligence models or artificial intelligence applications verified by an independent verification organization;

(3) Specify the circumstances in which corrective action, including loss of licensure, is mandated, such as the failure to adhere to representations made to the independent verification organization to obtain licensure;

(4) Subject to section 3755.06 of the Revised Code, establish the composition of the artificial intelligence safety advisory council, including the procedure for appointing additional members, and the term length of members;

(5) Establish the per diem salary of members of the artificial intelligence safety advisory council;

(6) Provide the information, materials, and forms required to apply for an independent verification organization license under section 3755.02 of the Revised Code.

(C) The attorney general shall consider input received from stakeholders in adopting rules pursuant to this section.