2026 -- H 7954
========
LC005174
========
STATE OF RHODE ISLAND
IN GENERAL ASSEMBLY
JANUARY SESSION, A.D. 2026
____________
A N A C T
RELATING TO COMMERCIAL LAW -- GENERAL REGULATORY PROVISIONS --
DIGITAL PLATFORM TRANSPARENCY AND DEMOCRATIC INTEGRITY ACT

Introduced By: Representatives Potter, Caldwell, Boylan, Cortvriend, Dawson, Carson,
Date Introduced: February 27, 2026
Referred To: House Innovation, Internet, & Technology
It is enacted by the General Assembly as follows:
SECTION 1. Legislative findings and purpose.
The general assembly finds and declares that:
(1) Large digital platforms function as primary forums for public discourse and civic engagement;
(2) Algorithmic ranking systems determine the visibility, amplification, and suppression of content shown to users;
(3) The criteria used to rank and amplify content are generally opaque and may materially shape public understanding and consumer decision-making;
(4) Synthetic media, automated amplification, coordinated inauthentic behavior, and foreign-sponsored influence campaigns have been used to mislead consumers and distort public discourse;
(5) Consumers in Rhode Island are entitled to accurate, factual information regarding how content is curated, ranked, and amplified;
(6) The State has a compelling interest in preventing materially deceptive digital practices and preserving the conditions necessary for informed democratic self-governance.
SECTION 2. Title 6 of the General Laws entitled "COMMERCIAL LAW -- GENERAL REGULATORY PROVISIONS" is hereby amended by adding thereto the following chapter:
CHAPTER 63
DIGITAL PLATFORM TRANSPARENCY AND DEMOCRATIC INTEGRITY ACT
6-63-1. Definitions.
As used in this chapter:
(1) "Algorithmic ranking system" means an automated computational process used to prioritize, recommend, amplify, demote, or suppress content based on user data, engagement metrics, behavioral profiling, or other automated inputs.
(2) "Covered platform" means an online service, website, or application that:
(i) Has more than one million (1,000,000) monthly active users nationally; and
(ii) Allows users to create profiles and generate or share content visible to other users.
(3) "Synthetic media" means audio, video, image, or text content that has been materially generated or altered using artificial intelligence or automated technologies in a manner that would cause a reasonable person to believe the content depicts a real individual, statement, or event that did not occur.
(4) "Coordinated inauthentic behavior" means the use of false identities, automated accounts, or deceptive attribution practices to artificially amplify content.
(5) "Foreign-controlled entity" means an entity owned or controlled, directly or indirectly, by a foreign government or foreign national as defined under federal election law.
6-63-2. Applicability.
(a) This chapter shall apply only to content presented to users physically located within the State of Rhode Island.
(b) Nothing in this chapter shall be construed to regulate commerce occurring wholly outside the State of Rhode Island.
6-63-3. Algorithmic transparency requirements.
(a) A covered platform shall publicly disclose, in clear and plain language:
(1) The primary factors used to determine ranking, recommendation, amplification, or suppression of content;
(2) Whether engagement-based metrics materially affect content visibility;
(3) Whether paid promotion, sponsored placement, or financial incentives influence content prioritization; and
(4) Whether behavioral profiling materially influences content personalization.
(b) A covered platform shall clearly disclose to users physically located within the state whether content is presented chronologically or through algorithmic ranking.
(c) Nothing in this section shall require disclosure of proprietary source code, trade secrets, or confidential business information.
6-63-4. Synthetic media identification and disclosure.
(a) A covered platform shall implement and maintain commercially reasonable policies and technical measures designed to detect synthetic media that materially depicts real individuals, statements, or events.
(b) Upon actual knowledge or reasonable detection of synthetic media, a covered platform shall clearly and conspicuously disclose that such content has been generated or materially altered using artificial intelligence.
(c) Compliance with this section shall be evaluated based on commercially reasonable industry standards.
(d) Nothing in this section shall require continuous monitoring of user content or proactive surveillance beyond commercially reasonable detection practices.
6-63-5. Prohibited deceptive practices.
It shall constitute an unfair and deceptive trade practice under chapter 13.1 of title 6 for a covered platform to:
(1) Materially misrepresent the operation of its algorithmic ranking systems;
(2) Make an objectively verifiable and materially false representation that its content feed is neutral, unbiased, or organic where algorithmic amplification materially alters content visibility without disclosure;
(3) Knowingly fail to provide the disclosures required under this chapter; and
(4) Knowingly permit coordinated inauthentic behavior by foreign-controlled entities after receiving actual notice of such activity.
6-63-6. Enforcement.
(a) The attorney general may enforce violations of this chapter pursuant to chapter 13.1 of title 6.
(b) Any Rhode Island resident materially harmed by a violation of this chapter may bring a civil action for declaratory or injunctive relief.
(c) Civil penalties shall not exceed ten thousand dollars ($10,000) per knowing violation. Each day a knowing violation continues shall constitute a separate violation.
(d) No damages shall be awarded absent proof that the covered platform intentionally or recklessly violated this chapter.
6-63-7. Construction.
(a) This chapter shall be construed as a content-neutral regulation of commercial conduct and transparency.
(b) Nothing in this chapter shall be construed to:
(1) Regulate speech based on viewpoint or ideology;
(2) Require a platform to host or remove specific content;
(3) Impose liability for third-party content;
(4) Conflict with 47 U.S.C. § 230; or
(5) Require continuous monitoring of user speech.
6-63-8. Severability.
If any provision of this chapter is held invalid, such invalidity shall not affect other provisions that can be given effect without the invalid provision, and to this end the provisions of this chapter are declared to be severable.
SECTION 3. This act shall take effect upon passage.
========
LC005174
========
EXPLANATION
BY THE LEGISLATIVE COUNCIL
OF
A N A C T
RELATING TO COMMERCIAL LAW -- GENERAL REGULATORY PROVISIONS --
DIGITAL PLATFORM TRANSPARENCY AND DEMOCRATIC INTEGRITY ACT
***
This act would regulate how certain large social media platforms utilize algorithms. Knowing violations of this chapter would be subject to a civil penalty of up to ten thousand dollars ($10,000), to be enforced by the attorney general.
This act would take effect upon passage.
========
LC005174
========