Attorney General Phil Weiser Testimony on Senate Bill 25-318 – Before the Committee on Business, Labor, and Technology
Colorado Senate – May 5, 2025 – (submitted to Legislative Council Staff)
Thank you, Majority Leader Rodriguez and members of the Committee, for taking up the important question of how Colorado should approach artificial intelligence (“AI”), recognizing both its potential benefits and its risks. The current bill, as introduced, however, is inadequate to address the unintended and problematic consequences inherent in S.B. 24-205, enacted last year. If the General Assembly cannot substantially address the important concerns that I and others have with this bill in the time remaining in the session, I recommend delaying the law's implementation by one year so that we are not in the position of being asked to implement a problematic law. By so doing, Colorado can avoid moving ahead too quickly in a complex area, creating unintended consequences, and putting our state at a competitive disadvantage in an emerging field.
I. Background
Last year, at the end of session, the General Assembly enacted S.B. 24-205, which imposes a series of obligations on developers of AI technologies and on deployers of those technologies. That bill moved quickly through the legislative process and, unlike the Colorado Privacy Act, for example, S.B. 24-205 would have benefited from a more thorough process of reflection and feedback. Given the significant risk of unintended consequences from the law as adopted and signed, Governor Polis, Majority Leader Rodriguez, and I came together to write an open letter, pledging that “state and legislative leaders will engage in a process to revise the new law, and minimize unintended consequences associated with its implementation.”
I want to begin by acknowledging the significant amount of work that went into this process and these conversations, including the impressive amount of time invested by Majority Leader Rodriguez and other members of the Joint Technology Committee. I also want to acknowledge that, in an ideal world, we would have national leadership driving this conversation forward. Unfortunately, the U.S. Congress has demonstrated an inability to lead on critical technology policy issues, ranging from data privacy to oversight of social media to AI. This failure puts states in a difficult position: either step up with what is a “second best” strategy to national leadership, or allow important public policy issues to go unraised and unaddressed.
The conversation that Majority Leader Rodriguez has spurred is an important one for the United States. As with other technology policy topics, such as data privacy and oversight of social media, it is important that Colorado and other states proceed thoughtfully. There are, in other words, risks that arise from failing to address concerns around AI, but there are also risks that arise from prematurely imposing unnecessary and problematic obligations on technology companies or other organizations developing or adopting this new technology.
To prevent Colorado from implementing oversight in the AI area prematurely and isolating our state on this issue while others continue to develop and hone appropriate solutions, I recommend delaying the implementation of the law by one year. By so doing, Colorado could continue to advance this important dialogue without subjecting Colorado companies and organizations to obligations that are still likely to include unintended and unfortunate consequences.
II. Continuing the Conversation and Improving the AI Governance Model
It is important for the conversation on AI oversight to address a range of concerns with the current law, and one way to create the time to do that is to delay the law's implementation by a year. In short, this is a complicated area, and even with the impressive amount of work that has gone into the current draft of the amended law, I am very concerned that we have yet to land on a workable model.
First, let me discuss a question that calls for more reflection and refinement in the current law: the obligations placed on those who adopt, or deploy, AI systems. In discussing this issue, it is important to recognize that AI cannot be easily disentangled from software; more and more software, and even what we call “data analytics,” uses AI. To that end, an appropriate first step might be for deployers to disclose that they are using AI in making consequential decisions, but not to require them to engage in what can be costly “impact assessments.”
Second, even if the impact assessment requirement is removed from deployers and placed solely on developers, further reflection is still merited on what those assessments should look like and what topics they should cover. It is important that startup companies, including those eligible for the Advanced Industries Tax Credit (namely, companies with no more than $10 million in capital raised, $5 million in revenue generated, and five or fewer years in existence) and those under a certain size, be exempt from such requirements. Such an approach, wisely taken in the Colorado Privacy Act, appreciates the importance of leaving companies room to develop the necessary scale and sophistication to handle such obligations without imposing a regulatory burden that hampers their ability to innovate and grow.
As for the obligations on deployers and developers, there are important issues that warrant further discussion. A reasonable obligation on developers would focus on appropriate testing and oversight, managed in a thoughtful way that supports a thriving environment for innovation and protects consumers smartly and effectively. Those discussions are likely to continue in Colorado and other states, and, hopefully, in Congress at some point, as we confront the opportunities and risks of AI. Rather than have Colorado move to implement a problematic law in the near term, I recommend we delay its implementation by one year.
Finally, let me close on a practical concern. The level of obligations currently placed on deployers would create a massive enforcement challenge for our department if the amended law were adopted in its current form. Given my view that the obligations for impact assessments are more logically placed on the developers that create products for expected use cases that could give rise to discrimination, I believe that is where the enforcement focus should lie, not on the conduct of deployers, which include a range of organizations that might adopt this technology and be unprepared for such a requirement.
* * *
At our best in Colorado, we are innovative, collaborative, and respectful in working through difficult public policy issues. Unlike Congress, we have shown the ability to do so on important technology policy issues. In this case, we have an opportunity to continue doing so and to create the space, through delayed implementation, for a critical dialogue and for thoughtful problem solving. Thank you, again, to the members of the Committee and to all who have been devoting considerable time and effort to getting this important topic right.