Prepared remarks to the Federal Circuit Bar Association (June 29, 2023)
Thanks for the opportunity to join you today in Colorado, where we are still celebrating the 2023 NBA Champion Denver Nuggets. For some, being at altitude is a rough slog—as was experienced by the many teams the Nuggets beat along the way. For others, it is inspiring and can provide valuable perspective. I hope you are all in the latter category.
At the Colorado Attorney General’s Office, we are excited to live in Colorado and be able to address a range of emerging issues at the intersection of law and technology. After talking about these issues, I’d also like to spend a few minutes speaking with you about the role of engagement between the bench and bar as well as touch on some of the challenges facing our profession.
I. The Colorado Privacy Act

When we consider the work of our department regulating emerging technologies, I think about Colorado’s path-marking work on our Colorado Privacy Act. The story of Colorado’s Privacy Act is a familiar one: Congress failed, for well over a decade, to act on a bipartisan consensus that consumers’ personal information should not be shared or used—particularly in ways that can harm them—without their awareness or ability to protect themselves. Congress’ failure to act on technology policy issues has forced the states to step up and lead—something the states have done time and again when Congress is unwilling or unable to act. And that’s what we did—on an overwhelming bipartisan basis—in enacting the Colorado Privacy Act.
Colorado’s privacy law provides consumers with important transparency around how their personal data is used and, critically, establishes core rights for consumers. Notably, it affords them the ability to opt out of the sale of their personal data and its use for targeted advertising and profiling. And, beginning in July 2024, Colorado consumers will have the right to use a universal opt-out mechanism to opt out of the sale of their personal data and the processing of their data for targeted advertising. One additional important right the law provides, paralleling the federal Fair Credit Reporting Act, is the ability of consumers to access and delete personal data that businesses collect about them and to correct data that is inaccurate. Consumers are also granted the right to access data collected about them in a portable form.
The Colorado Privacy Act imposes a series of responsibilities on businesses that hold and use consumers’ data. First, it requires them to provide meaningful privacy notices to consumers and to specify the purposes for which consumers’ data is collected and processed. The law also imposes a data minimization requirement, specifying that businesses may collect only personal data that is reasonably necessary in relation to a specified purpose. It further requires that businesses use sound practices in storing personal data, avoid processing personal data in ways that violate antidiscrimination laws, and obtain affirmative consent before processing sensitive data. Finally, the law requires businesses to conduct data protection assessments before undertaking data processing activities that present a heightened risk of harm to consumers (which include the use of personal data for targeted advertising, the sale of data, and the processing of sensitive data).
After the law was enacted by the legislature, our department began the process of developing Colorado Privacy Act rules. At the outset, I committed myself and our team to a transparent, inclusive, and engaged process. I am proud of how our team—and it is truly an A Team—accomplished just that goal. By all accounts, we conducted an extensive, transparent, and collaborative rulemaking, taking extensive public comments and wading through complex issues to develop rules that are understandable and workable. We finalized the rules in March 2023 after considering over 200 comments received through our comment portal, listening sessions, and a public hearing. Within these rules are some national firsts, including guidance on data protection assessments, where we adopted a “principle-based” approach and worked to discourage “check the box” compliance activities.
Consumers and businesses can analyze our rules for themselves. And, like any regulatory system, we expect to revisit and improve these rules over time. Given the novelty of some of the rules, we know that we are going to learn a significant amount as they go into effect. Consider, for example, that the concept of a universal opt-out mechanism is a new one. To drive this effort forward, we are taking on the responsibility of maintaining a list of recognized mechanisms. We are also interested to see how our requirements on privacy notices and purpose specification play out in practice and give consumers more insight into how their data is used. And we are going to keep a close eye on so-called “dark patterns,” with rules now in place that clarify when such patterns are at play.
II. Artificial Intelligence
If past is prologue, the federal government will struggle to meet the challenges of regulating artificial intelligence. I deeply hope that our national system of governance finds the footing to operate on what is known as “regular order”—holding hearings, working through issues collaboratively, and enacting legislation. In a modest step in that direction, two Colorado members of Congress from opposing parties—Ken Buck and Joe Neguse—worked together to pass a law providing both the federal government and the states with better tools to enforce antitrust laws. We could surely use that same type of leadership in meeting the challenge of overseeing artificial intelligence.
As with other emerging technologies, a critical challenge in the area of artificial intelligence (AI) is to encourage and oversee the proper development of dynamic and trustworthy tools without hampering innovation. Trustworthiness is a critical concept in AI; after all, AI can enable a range of deepfakes, deceptions, and scams that can erode trust in our society and in businesses. I commend the National Telecommunications and Information Administration (NTIA) at the U.S. Department of Commerce for seeking comments on how to approach this challenge. In my reflections today, I will draw on the comments we submitted, along with 22 other states, in the current proceeding on that topic.
Let me acknowledge that the speed of innovation, the improved productivity, and the new solutions promised by AI hold great promise. Public policy should not get in the way of such improvements. Consequently, and given the dynamism of this emerging technology, I am very skeptical that we can at this point use prescriptive limitations as a regulatory tool. By contrast, I am very keen to see how commitments to robust transparency, reliable testing and assessment requirements, and after-the-fact enforcement can provide valuable guideposts on the road to regulating AI. Moreover, as in the data privacy arena, I would commend a risk-based approach, recognizing that certain use cases (say, routes for package delivery) are less concerning than others (say, health care options), though a nuanced evaluation of risk will likely vary context by context. A risk-based approach should consider what an AI system could impact, such as collective or individual physical or psychological safety, civil and human rights, or equal access to goods, services, and opportunities. It also should consider what categories of data the AI system uses (e.g., sensitive data such as medical information, biometric data, or personal information about children) and what its ultimate outputs are (e.g., “deepfakes” or manipulation).
Organizations engaging in high-risk activities should be encouraged to publish—or, where appropriate mechanisms (such as procurement requirements) allow, required to publish—public-facing policies that describe what decisions are powered by AI, what human involvement there is in validating those decisions, and what process individuals can use to appeal those decisions. Ideally, key transparency measures would also include consumer access to information about personal data used in any such decisions and a method for individuals to access and correct any personal information the AI uses in making them.
A focus on high-risk AI systems aligns with existing AI regulatory frameworks, such as the EU AI Act, which prohibits the use of AI in certain high-risk contexts and imposes heightened obligations on other high-risk systems. U.S. privacy law has created similar guardrails around high-risk use cases. Laws in Colorado, Connecticut, Montana, and Virginia, for example, set heightened requirements when data is processed to support decisions that result in legal or other significant effects for an individual, such as automated decisions that impact an individual’s access to financial, educational, housing, or employment opportunities. Laws governing facial recognition technologies have also created heightened requirements in high-risk contexts, requiring meaningful human review when government entities use facial recognition technologies in ways that may impact individual liberties and rights.
Finally, I would be remiss not to emphasize that, if there is any federal legislation in the privacy or AI area, it is critical that new enforcement mechanisms be coupled with concurrent authority for state attorneys general. That provides a crucial check on the possibility that the federal government will not be active in enforcement, and it ensures that the on-the-ground expertise of state attorneys general can be leveraged appropriately. Moreover, my recommended architecture for technology policy—one suited to emerging technologies, where today’s solution becomes tomorrow’s antiquated approach—is agile and adaptive. To that end, I have advanced an entrepreneurial model of regulation, a model that Colorado’s privacy law and our suggested AI oversight regime both follow.
III. The Bench, the Bar, and the Legal Profession
In the presence of this group of leaders in our profession, I have to mention the importance of the legal profession—particularly at this point in our history and given the frightening state of our politics—and of engagement between the bench and bar. Notably, we must acknowledge that we live in a difficult time for the rule of law. As Justice Ginsburg often said, the rule of law depends on respectful engagement, and our era of rising polarization and demonization presents a fundamental challenge to our institutions and the rule of law. As lawyers, we need to take this challenge seriously.
We could spend plenty of time analyzing what is driving rising polarization and demonization, but given this audience, I would like to focus on the role played by emerging social media platforms. Those platforms, as is well known, use algorithms designed to keep people “engaged,” even when that means promoting dangerous content. Significantly, it is no accident that many who assaulted the Capitol on January 6 were organized via social media platforms, and some even received suggestions about “groups you might like.”
For Thomas Webster, his crimes on January 6 marked an evolution from respected police officer—one who served in Mayor Bloomberg’s security detail—to convicted felon. When asked what explained his conduct and why he showed up at the Capitol in full body armor, he answered, “I’ve seen countless videos [on social media] of people with my beliefs being assaulted by large groups of people.” After a jury trial, he was found guilty of multiple counts and was sentenced to ten years in prison. Before the verdict, the judge in his case had allowed him to remain free pending trial on one condition: no access to the Internet.
We must recognize that January 6 is not merely in the past—the misinformation, demonization, and violence of that day remain with us. That’s why I led 47 state attorneys general to condemn the actions of January 6 and call for accountability for the Capitol attackers. And who could forget the display of Nazi symbols on January 6, or the sight of a Confederate flag paraded through the halls of the U.S. Capitol? Those symbols are an attack on the core ethos of our nation: e pluribus unum. As lawyers, we must defend our core institutions, our Nation’s commitment to the rule of law, and our vision of an inclusive society with liberty and justice for all.
A core part of this work is viewing everyone as a fellow citizen and never allowing people to be demonized as “the other.” To that end, our department sponsored what we called the “Ginsburg-Scalia Initiative,” which elevated the importance of respectful engagement. As part of it, we sponsored the Unify Colorado Challenge, which brought together citizens from different parts of our state with different points of view for a respectful conversation. The results were powerful and were captured in a documentary. We made the documentary, along with related lesson plans, available to teachers to give them tools for helping students navigate difficult conversations around difference. By emphasizing listening to and learning from alternative points of view, these resources reinforce core tenets of civic education—something we need more of.
In terms of this overall project, I speak often about the need to lead with “empathy, not judgment.” It is too easy, unfortunately, to judge others, particularly those we don’t know or those who are different. As lawyers, we can lead the way in encouraging greater empathy and a commitment to true listening, which I believe will make us better people and will help heal our democratic republic. Or, as Arthur Brooks wrote in his book, Love Your Enemies, we need a greater spirit of loving kindness, not one of contempt toward others. On this score, I think often of what Tom Junod, the writer profiled in the movie A Beautiful Day in the Neighborhood, said about Fred Rogers:
Fred was a man with a vision, and his vision was of the public square, a place full of strangers, transformed by love and kindness into something like a neighborhood. That vision depended on civility, on strangers feeling welcome in the public square, and so civility couldn’t be debatable. It couldn’t be subject to politics but rather had to be the very basis of politics, along with everything else worthwhile.
* * *
Let me wrap up by recognizing that it is easy to be discouraged or even cynical about the challenges we face as Americans. The advent of AI, the challenges we face in maintaining our democratic republic and the rule of law, and a range of other challenges from teen mental health to a changing climate all are causes for consternation or even anxiety. But our Nation has withstood hard challenges and difficult times before—and I know we will again.
When people ask me how I can be an optimist and keep a positive attitude about what the future holds, I think back to my grandmother, who survived the Holocaust, gave birth to my mom in a Nazi concentration camp, and came to the United States with nothing—not speaking a word of English—and became an accomplished investor. When I asked my grandmother how she could keep her positivity and believe in a better future during difficult times, she would always say, “it’s easier to believe.” She refused to give up hope. That’s why I believe that hope is a choice we can make every day. Our Nation was built on that spirit—and I know that it will continue to prevail.
 I have called on Congress, for example, to take action and empower an Internet platform agency to address a range of technology policy issues, joining with then-Pennsylvania Attorney General Josh Shapiro to make the case in 2021. https://coag.gov/app/uploads/2021/10/Internet-Regulation-Letter-to-US-Senate-10.0.21-Final.pdf Unfortunately, Congress has failed to do so—and there’s no sign of that changing soon.
 The remarks below draw on my prior explanation of the law to the Sedona Conference, which discuss our privacy law at greater length. Those remarks can be found at https://coag.gov/blog-post/prepared-remarks-attorney-general-phil-weiser-on-the-way-forward-on-data-privacy-may-4-2023/.
 That commitment was set forth in remarks to the International Association of Privacy Professionals (IAPP). https://coag.gov/blog-post/prepared-remarks-attorney-general-phil-weiser-at-the-international-association-of-privacy-professionals-april-12-2022/
 Those comments can be found here: https://coag.gov/app/uploads/2023/06/NTIA-AI-Comment.pdf.
 I outline this model in this article. See Philip J. Weiser, Entrepreneurial Administration, 97 B.U. L. Rev. 2011 (2017), available at https://scholar.law.colorado.edu/faculty-articles/838.
 For one such example, see “Ruth Bader Ginsburg: Senate Exemplifies Trend Of Sticking With ‘One’s Own Home Crowd”, CNN, available at https://www.cnn.com/2020/02/07/politics/ruth-bader-ginsburg-senate-partisan-polarization/index.html.
 Michael Wilson, “How a Respected N.Y.P.D. Officer Became the Accused Capitol Riot #EyeGouger”, New York Times, available at https://www.nytimes.com/2021/07/27/nyregion/capitol-riot-january-6.html.
 The Colorado Bar Association published my talk on this topic. See https://cl.cobar.org/departments/leading-with-empathy/.
 Tom Junod, “My Friend Mister Rogers,” Atlantic (Dec. 2019), https://www.theatlantic.com/magazine/archive/2019/12/what-would-mister-rogers-do/600772