2024 -- H 7521
========
LC004182
========
STATE OF RHODE ISLAND
IN GENERAL ASSEMBLY
JANUARY SESSION, A.D. 2024
____________
A N A C T
RELATING TO STATE AFFAIRS AND GOVERNMENT -- AUTOMATED DECISION TOOLS -- ARTIFICIAL INTELLIGENCE

Introduced By: Representatives Baginski, Carson, and Boylan
Date Introduced: February 07, 2024
Referred To: House Innovation, Internet, & Technology

It is enacted by the General Assembly as follows:
SECTION 1. Title 42 of the General Laws entitled "STATE AFFAIRS AND GOVERNMENT" is hereby amended by adding thereto the following chapter:

CHAPTER 166
AUTOMATED DECISION TOOLS

42-166-1. Definitions.

As used in this chapter, the following terms shall have the following meanings:

(1) "Algorithmic discrimination" means the condition in which an automated decision tool contributes to unjustified differential treatment or impacts disfavoring people based on their actual or perceived race, color, ethnicity, sex, religion, age, national origin, limited English proficiency, disability, veteran status, genetic information, reproductive health, or any other classification protected by state law.

(2) "Artificial intelligence" means a machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing a real or virtual environment.

(3) "Automated decision tool" means a system or service that uses artificial intelligence and has been specifically developed and marketed to, or specifically modified to, make or be a controlling factor in making consequential decisions.

(4) "Consequential decision" means a decision or judgment that has a legal, material, or similarly significant effect on an individual's life relating to the impact of, access to, or the cost, terms, or availability of, any of the following:

(i) Employment, worker management, or self-employment including, but not limited to, all of the following:
(A) Pay or promotion;
(B) Hiring or termination; or
(C) Automated task allocation.

(ii) Education and vocational training including, but not limited to, all of the following:
(A) Assessment including, but not limited to, detecting student cheating or plagiarism;
(B) Accreditation;
(C) Certification;
(D) Admissions; or
(E) Financial aid or scholarships.

(iii) Housing or lodging, including rental or short-term housing or lodging.

(iv) Essential utilities, including electricity, heat, water, Internet or telecommunications access, or transportation.

(v) Family planning, including adoption services or reproductive services, as well as assessments related to child protective services.

(vi) Health care or health insurance, including mental health care, dental, or vision.

(vii) The criminal justice system is addressed below; financial services, including a financial service provided by a mortgage company, mortgage broker, or creditor.

(viii) The criminal justice system including, but not limited to, all of the following:
(A) Risk assessments for pretrial hearings;
(B) Sentencing; or
(C) Parole.

(ix) Legal services, including private arbitration or mediation.

(x) Voting.

(xi) Access to benefits or services or assignment of penalties.

(5) "Deployer" means a person, partnership, state or local government agency, or corporation that uses an automated decision tool to make a consequential decision.

(6) "Developer" means a person, partnership, state or local government agency, or corporation that designs, codes, or produces an automated decision tool, or substantially modifies an artificial intelligence system or service for the intended purpose of making, or being a controlling factor in making, consequential decisions, whether for its own use or for use by a third party.
(7) "Impact assessment" means a documented risk-based evaluation of an automated decision tool that meets the criteria of § 42-166-2.

(8) "Sex" means and includes pregnancy, childbirth, and related conditions, gender identity, intersex status, and sexual orientation.

(9) "Significant update" means a new version, new release, or other update to an automated decision tool that includes changes to its use case, key functionality, or expected outcomes.

42-166-2. Assessments for automated decision tools.

(a) On or before January 1, 2025, and annually thereafter, a deployer of an automated decision tool shall perform an impact assessment for any automated decision tool the deployer uses that includes all of the following:

(1) A statement of the purpose of the automated decision tool and its intended benefits, uses, and deployment contexts;

(2) A description of the automated decision tool's outputs and how they are used to make, or be a controlling factor in making, a consequential decision;

(3) A summary of the type of data collected from natural persons and processed by the automated decision tool when it is used to make, or be a controlling factor in making, a consequential decision;

(4) A statement of the extent to which the deployer's use of the automated decision tool is consistent with or varies from the statement required of the developer by § 42-166-4;

(5) An analysis of potential adverse impacts on the basis of sex, race, color, ethnicity, religion, age, national origin, limited English proficiency, disability, veteran status, or genetic information from the deployer's use of the automated decision tool;

(6) A description of the safeguards implemented, or that will be implemented, by the deployer to address any reasonably foreseeable risks of algorithmic discrimination arising from the use of the automated decision tool known to the deployer at the time of the impact assessment;

(7) A description of how the automated decision tool will be used by a natural person, or monitored when it is used, to make, or be a controlling factor in making, a consequential decision; and

(8) A description of how the automated decision tool has been or will be evaluated for validity or relevance.

(b) On or before January 1, 2025, and annually thereafter, a developer of an automated decision tool shall complete and document an assessment of any automated decision tool that it designs, codes, or produces that includes all of the following:

(1) A statement of the purpose of the automated decision tool and its intended benefits, uses, and deployment contexts;
(2) A description of the automated decision tool's outputs and how they are used to make, or be a controlling factor in making, a consequential decision;

(3) A summary of the type of data collected from natural persons and processed by the automated decision tool when it is used to make, or be a controlling factor in making, a consequential decision;

(4) An analysis of a potential adverse impact on the basis of sex, race, color, ethnicity, religion, age, national origin, limited English proficiency, disability, veteran status, or genetic information from the deployer's use of the automated decision tool;

(5) A description of the measures taken by the developer to mitigate the risk known to the developer of algorithmic discrimination arising from the use of the automated decision tool; and

(6) A description of how the automated decision tool can be used by a natural person, or monitored when it is used, to make, or be a controlling factor in making, a consequential decision.

(c) A deployer or developer shall, in addition to the impact assessment required by subsections (a) and (b) of this section, perform, as soon as feasible, an impact assessment with respect to any significant update.

(d) This section does not apply to a deployer with fewer than twenty-five (25) employees unless, as of the end of the prior calendar year, the deployer deployed an automated decision tool that impacted more than nine hundred ninety-nine (999) people per year.

42-166-3. Notifications and requests not to be subject.

(a) Notifications of consequential decisions.

(1) A deployer shall, at or before the time an automated decision tool is used to make a consequential decision, notify any natural person that is the subject of the consequential decision that an automated decision tool is being used to make, or be a controlling factor in making, the consequential decision.

(2) A deployer shall provide to a natural person notified pursuant to this subsection all of the following:
(i) A statement of the purpose of the automated decision tool;
(ii) Contact information for the deployer; and
(iii) A plain language description of the automated decision tool that includes a description of any human components and how any automated component is used to inform a consequential decision.

(b) Request to not be subject to the automated decision tool.

(1) If a consequential decision is made solely based on the output of an automated decision tool, a deployer shall, if technically feasible, accommodate a natural person's request to not be subject to the automated decision tool and to be subject to an alternative selection process or accommodation.
(2) After a request pursuant to subsection (b)(1) of this section, a deployer may reasonably request, collect, and process information from a natural person for the purposes of identifying the person and the associated consequential decision. If the person does not provide that information, the deployer shall not be obligated to provide an alternative selection process or accommodation.

42-166-4. Statement of intended uses of automated decision tools.

(a) A developer shall provide a deployer with a statement regarding the intended uses of the automated decision tool and documentation regarding all of the following:

(1) The known limitations of the automated decision tool, including any reasonably foreseeable risks of algorithmic discrimination arising from its intended use;

(2) A description of the type of data used to program or train the automated decision tool; and

(3) A description of how the automated decision tool was evaluated for validity and explainability before sale or licensing.

(b) This section does not require the disclosure of trade secrets, as defined in chapter 41 of title 6 ("uniform trade secrets act").

42-166-5. Safeguards and designates.

(a) Administrative and technical safeguards.

(1) A deployer or developer shall establish, document, implement, and maintain a governance program that contains reasonable administrative and technical safeguards to map, measure, manage, and govern the reasonably foreseeable risks of algorithmic discrimination associated with the use or intended use of an automated decision tool.

(2) The safeguards required by this subsection shall be appropriate to all of the following:
(i) The use or intended use of the automated decision tool;
(ii) The deployer's or developer's role as a deployer or developer;
(iii) The size, complexity, and resources of the deployer or developer;
(iv) The nature, context, and scope of the activities of the deployer or developer in connection with the automated decision tool; and
(v) The technical feasibility and cost of available tools, assessments, and other means used by a deployer or developer to map, measure, manage, and govern the risks associated with an automated decision tool.

(b) The governance program required by this section shall be designed to do all of the following:
(1) Employee designates:
(i) Designation of at least one employee to be responsible for overseeing and maintaining the governance program and compliance with this chapter.
(ii) An employee designated pursuant to this section shall have the authority to assert to the employee's employer a good faith belief that the design, production, or use of an automated decision tool fails to comply with the requirements of this chapter.
(iii) An employer of an employee designated pursuant to this section shall conduct a prompt and complete assessment of any compliance issue raised by that employee.

(2) Identify and implement safeguards to address reasonably foreseeable risks of algorithmic discrimination resulting from the use or intended use of an automated decision tool.

(3) If established by a deployer, provide for the performance of impact assessments as required by § 42-166-2.

(4) If established by a developer, provide for compliance with §§ 42-166-3 and 42-166-4.

(5) Conduct an annual and comprehensive review of policies, practices, and procedures to ensure compliance with this chapter.

(6) Maintain for two (2) years after completion the results of an impact assessment.

(7) Evaluate and make reasonable adjustments to administrative and technical safeguards in light of material changes in technology, the risks associated with the automated decision tool, the state of technical standards, and changes in business arrangements or operations of the deployer or developer.

(c) This section does not apply to a deployer with fewer than twenty-five (25) employees unless, as of the end of the prior calendar year, the deployer deployed an automated decision tool that impacted more than nine hundred ninety-nine (999) people per year.

42-166-6. Required publicly available information.

A deployer or developer shall make publicly available, in a readily accessible manner, a clear policy that provides a summary of both of the following:

(1) The types of automated decision tools currently in use or made available to others by the deployer or developer; and

(2) How the deployer or developer manages the reasonably foreseeable risks of algorithmic discrimination that may arise from the use of the automated decision tools it currently uses or makes available to others.

42-166-7. Algorithmic discrimination.

(a) A deployer shall not use an automated decision tool that results in algorithmic discrimination.
(b) Civil actions for algorithmic discrimination.

(1) On and after January 1, 2026, a person may bring a civil action against a deployer for violation of this section.

(2) In an action brought pursuant to this section and § 42-166-8, the plaintiff shall have the burden of proof to demonstrate that the deployer's use of the automated decision tool resulted in algorithmic discrimination that caused actual harm to the person bringing the civil action.

(c) In addition to any other remedy at law, a deployer that violates this section shall be liable to a prevailing plaintiff for any of the following:
(1) Compensatory damages;
(2) Declaratory relief; and
(3) Reasonable attorneys' fees and costs.

42-166-8. Civil actions for algorithmic discrimination.

(a) Parties authorized to bring civil actions.

(1) Any of the following public entities may bring a civil action against a deployer or developer for a violation of this chapter:
(i) The attorney general; or
(ii) A city or town solicitor with the consent of the attorney general.

(2) A court may award in an action brought pursuant to this subsection all of the following:
(i) Injunctive relief;
(ii) Declaratory relief; and
(iii) Reasonable attorneys' fees and litigation costs.

(b) Written notice. An authorized party, before commencing an action pursuant to this section for injunctive relief, shall provide forty-five (45) days' written notice to a deployer or developer of the alleged violations of this chapter.

(c) Ability to cure.

(1) The developer or deployer may cure, within forty-five (45) days of receiving the written notice described in this section, the noticed violation and provide the person who gave the notice an express written statement, made under penalty of perjury, that the violation has been cured and that no further violations shall occur.

(2) If the developer or deployer cures the noticed violation and provides the express written statement pursuant to this section, a claim for injunctive relief shall not be maintained for the noticed violation.

SECTION 2. This act shall take effect upon passage.
========
LC004182
========
EXPLANATION
BY THE LEGISLATIVE COUNCIL
OF
A N A C T
RELATING TO STATE AFFAIRS AND GOVERNMENT -- AUTOMATED DECISION TOOLS -- ARTIFICIAL INTELLIGENCE
***
This act would require a deployer and a developer of an automated decision tool to perform an impact assessment that includes a statement of the purpose of the automated decision tool and its intended benefits, uses, and deployment contexts.

This act would also require a deployer to notify any natural person who is the subject of a consequential decision when an automated decision tool is being used to make, or be a controlling factor in making, a consequential decision, and to provide that person with a statement of the purpose of the automated decision tool.

The act would further require a deployer, if a consequential decision is made solely based on the output of an automated decision tool and if technically feasible, to accommodate a natural person's request to not be subject to the automated decision tool and to be subject to an alternative selection process or accommodation, as prescribed.

Additionally, this act would prohibit a deployer from using an automated decision tool that results in algorithmic discrimination and allow the attorney general and local solicitors to bring civil actions against developers and deployers for algorithmic discrimination.

This act would take effect upon passage.