Another try at regulating Automated Decision Systems in Washington state (UPDATED)

You can probably guess what happened

Sen. Bob Hasegawa at the SB 5356 hearing, with a copy of the Automated Decision-Making Systems Workgroup report

Originally published February 16, with a slightly different ending.

"Governments are increasingly turning to automated systems to make decisions for criminal sentencing, medicare eligibility, and school placement. Public officials and companies tout gains in speed and efficiency, and the hope that automated decisions can bring more fairness into bureaucratic processes. But the drawbacks are coming into clearer focus. Computer algorithms can also reinforce or introduce new biases. Studies have shown that facial recognition tools used by law enforcement could disproportionately impact African-Americans, that predictive policing software could unfairly target minority neighborhoods because of incomplete crime databases, and that algorithms can perpetuate racial disparities in insurance premium costs.

Lawmakers in Washington State are taking steps to tame this digital frontier."

Washington could be the first state to rein in automated decision-making,  DJ Pangburn, Fast Company, 2019

Four years later, it's all still true.  The discriminatory effects of automated decision systems (ADS) continue to come into clearer and clearer focus, and Washington's legislature is considering legislation to regulate ADS.  This time, it's Sen. Bob Hasegawa's SB 5356, which proposes minimum standards for fairness and accountability for any government agency buying or using ADS.  

Update: despite overwhelmingly positive testimony and sign-ins at SB 5356's hearing, the Senate ENET Committee didn't act on the bill before the "policy cutoff", which means it's probably dead for the session.

If you're pressed for time, feel free to skip ahead to details of SB 5356, the discussion of the hearing, and the next steps.  But if you've got a couple of minutes, it's worth looking at how we got here.

What's past is prologue

The landmark 2019 algorithmic accountability bill the Fast Company article covered died in committee.  So did Sen. Hasegawa's Automated Decision Systems (ADS) regulation bill in 2021 ... and 2022.

And it's not just a Washington problem!  As Todd Feathers' Why It’s So Hard to Regulate Algorithms in The Markup (from early 2022) describes, Big Tech and government contractors have successfully derailed legislation almost everywhere by arguing that proposals are too broad. And even when legislation has passed, it's gotten watered down – last week's New York City’s Law on Using Tech to Make Hiring Decisions Keeps Getting Weaker (by Matthew Scherer and Ridhi Shetty in Slate Future Tense) is a good example.

More positively, even though Sen. Hasegawa's 2021 bill didn't pass, it led to a small budget proviso that created the Automated Decision Systems (ADS) workgroup.  The ADS workgroup was tasked with "developing recommendations for changes in state law and policy regarding the development, procurement, and use of ADS by public agencies."  In addition to government agency leads, the workgroup included researchers and representatives from advocacy organizations that represent communities disproportionately vulnerable to being harmed by algorithmic bias.

When I was invited to join (as a volunteer) to provide research perspectives, I was quite hesitant. New York City's earlier ADS workgroup had been a disaster: they couldn't even agree on the definition of an automated decision system1, government agencies stonewalled civil society groups, and the workgroup's failure to get information established a dangerous precedent.  Still, I decided it was worth a try ... and fortunately, it worked out much better than I had feared.

The agency leads in Washington state's ADS workgroup didn't try to stonewall; they understood the situation, wanted to make progress, and were candid about their constraints.  We also had the advantage of several more years of research on and experience with ADS.  Perhaps even more importantly, we also leveraged the insights in Rashida Richardson's 2019 Confronting Black Boxes "Shadow Report" from the NYC workgroup, which includes recommendations for workgroups and advocacy coalitions as well as for government use of ADS systems – all of which proved hugely valuable to us (and should be required reading for anybody working on any kind of algorithmic regulation).

So it wasn't a disaster at all.  In fact, we even reached consensus on guiding principles and recommendations, and included them in the ADS Workgroup Report we issued.  

Are you at risk? Discriminatory algorithms in government agencies, a March 2022 Indivisible Town Hall, has perspectives from workgroup members María Paula Ángel, Jennifer Lee, and Ashley del Villar as well as Ben Winters of EPIC, Maya Morales of WA People's Privacy, and me.


The ADS Workgroup: Consensus on a path forward

"Automated decision systems are a way to reduce costs, improve delivery of public services, and make decisions more efficient, reliable, and accurate.... However, a growing body of evidence indicates that automated decision systems can be discriminatory, inaccurate and lack transparency and accountability.

Washington state agencies currently deploy a large number and range of automated decision systems....  Some systems in use by governmental agencies today have not been audited for biases, and in many cases, were developed several years ago when techniques for identifying and addressing biases were not as advanced as they are today."

– Washington state Automated Decision Systems (ADS) Workgroup Report, 2021

With my engineering management hat on (I’ve been CTO, VP of Engineering and Chief Product Officer at Silicon Valley-based companies as well as Software Architect at Microsoft), I could really relate to the challenging situation agency leads are in. And again, it's not just here in Washington, it's true everywhere. There's been a lot of progress in techniques for recognizing, correcting, and avoiding bias and discrimination in ADS over the last few years ... but most of the systems in use today were built long before that.

From an engineering perspective, this means a huge amount of "technical debt" has accumulated over the years.  As awareness of the harms of these systems grows, that debt's cost keeps rising: the longer it goes unaddressed, the more the state is exposed to steadily increasing litigation risk – Challenging the Use of Algorithm-driven Decision-making in Benefits Determinations Affecting People with Disabilities has multiple examples of lawsuits in other states.

It’s a tricky situation. Our state government depends on these systems, and so do millions of Washingtonians, so we can't just put everything on hold until we build new versions.  But the problem won't just fix itself.

So here’s a summary of the consensus recommendations we came up with.

  1. Prioritization of Resources: The state should develop a prioritization framework for allocating resources to address existing and future ADS.
  2. Procurement: As a part of the procurement process, assess new ADS procured by the state.
  3. Evaluation of Existing Systems: ADS currently in use by the state that produce legal effects on people should be assessed if they are processing data on a large scale or have substantial effects on the rights or freedoms of natural persons.
  4. Transparency: Require transparency of use, procurement, and development of ADS including monitoring or testing for accuracy and bias, that produce legal effects on identified or identifiable natural persons.
  5. Determination on Whether to Use System: The state should adopt a framework to evaluate whether use of ADS or AI-enabled profiling should be prohibited.
  6. Ongoing Monitoring or Auditing: Ongoing monitoring or auditing should be performed on ADS that have legal effects on people to ensure they do not develop differential effects on subpopulations over time.
  7. Training in Risk of Automation Bias: Require training of state employees who develop, procure, operate or use automatic decision-making systems as to risk of automation bias.

These all still make sense to me – and align well with other important work since our report, including the recommendations in the Algorithmic Justice League’s Who Audits the Auditors survey of best practices and the White House Office of Science and Technology Policy's Blueprint for an AI Bill of Rights.

For more information:

  • The ADS Workgroup Report discusses the Guiding Principles and Recommendations on pages 10-12.  To make it concrete, we also included a discussion of how these recommendations would apply to a specific agency ADS system in use today – a Department of Corrections system that affects tens of thousands of Washingtonians – on pages 14-16.            
  • What are Automated Decision Systems and why you should care?, a webinar that state Chief Privacy Officer Katy Ruckle and I did last September for WaTech, discusses the workgroup's research findings (starting at 15:40) and recommendations (starting at 18:00).
The devil is in the details

Of course, these are fairly high-level recommendations, and the devil is in the details.  As Sen. Hasegawa worked on a revised version of SB 5356's predecessor last year, scaling back goals and extending timeframes to try to make it more palatable to government agencies, it became apparent that the consensus didn't extend to the specific language of his bill.  A scheduled Ways & Means hearing was postponed twice to give more time to negotiate a compromise.  When the hearing finally happened, several workgroup members, including me, testified in support of the bill ... but Chief Privacy Officer Ruckle (who had facilitated the workgroup) testified in opposition.  

Awkward!

Sherri Sheridan of the Governor's Office also testified in opposition, highlighting concerns that the bill could put the systems the state government depends on at risk. With Gov. Inslee likely to veto the bill even if the legislature passed it, the legislature instead approved a budget proviso funding an inventory of ADS by the end of 2022, so that it would be available for this year's session.

But Gov. Inslee vetoed the budget proviso, instead directing agencies to begin a more-leisurely inventory without any additional funding and hoping that the situation will magically fix itself.  Last September, What are Automated Decision Systems and why you should care? discussed the leisurely inventory in more detail. At the hearing, Sen. Hasegawa described the results to date as "dismal": apparently most agencies aren't even bothering to respond.

So the current plan of record is that when the leisurely inventory delivers its dismal results at the end of 2023, at least we'll know how many government ADS there currently are at the agencies that bother to respond – although we still won't know how many of them have been tested for biases.

So the dilemma I talked about in my testimony last year remains.

"It's a difficult situation, and one that won't magically fix itself.  The longer we wait to address it, the harder and more expensive it is to make progress -- and the more likely lawsuits are.  Investing now leads to longer-term cost savings by increasing workforce skills and productivity as well as reducing litigation risk.  How to do that in a fiscally responsible way?"

SB 5356

Sen. Hasegawa's latest proposal for ADS regulation legislation has a new bill number (SB 5356) and has been assigned to a new committee: the Senate Environment, Energy & Technology (ENET) Committee, chaired by Sen. Joe Nguyen.2   Sen. Hasegawa made some additional changes (further scaling back goals and extending timelines) to try to make it more palatable to agencies, but the essence remains similar.

Here's how the Bill Report prepared by non-partisan legislative staff summarizes the bill. SB 5356:

  • Requires public agencies to develop an algorithmic accountability report
    that meets certain requirements on the use of an automated decision
    system (ADS).
  • Specifies minimum requirements when an agency develops, procures, or
    uses an ADS.
  • Requires the Office of the Chief Information Officer to adopt guidance,
    develop a prioritization framework, inventory, and conduct audits for an
    ADS.
  • Establishes the Algorithmic Accountability Review Board.
  • Expands Washington's Law Against Discrimination to prohibit
    discrimination by automated decision systems.

Those still seem like good ideas to me, and align with the consensus recommendations.  But will the legislature agree?

"We need to get ahead of this program. And I think we're already too late."

February 15 Senate ENET Hearing on SB 5356

790 people and organizations testifying at or signing in to SB 5356's February 15 hearing signed in PRO to support it, including representatives from the Unemployment Law Project, Washington Patients in Intractable Pain, WA People's Privacy, EPIC Privacy, ACLU of Washington, Southwest Washington Equity Coalition, the Japanese American Citizens League's Seattle Chapter, and Fix Democracy First.

19 signed in CON, including representatives of the Governor's Office, WaTech (the state IT group), and the Big Tech lobbying organizations TechNet and Washington Technology Industry Association (WTIA).

Sen. Hasegawa, the bill sponsor, kicked off the hearing with a history of how we got here (highlighting the ADS workgroup's consensus recommendations) and an update on the discussions this year.  Despite the changes in this year's bill, the Governor's office continued to oppose it.  Sen. Hasegawa had worked on a substitute version, moving virtually everything to the "intent" section (where it has no legal force); he described it as "so scaled back that it's only a shell of the original bill".  But the Governor's office didn't respond, so he didn't offer it as an amendment.

Urging the committee to advance the bill despite opposition, Sen. Hasegawa noted the broader trends including the White House Office of Science and Technology Policy's Blueprint for an AI Bill of Rights.  He ended by talking about the threats to our freedoms and liberties: "We need to get ahead of this program.  And I think we're already too late."

Testimony highlighted a couple really horrible examples of discriminatory ADS here in Washington state and elsewhere:

  • Cyndi Hoenhous of Washington Patients in Intractable Pain discussed NarxCare, an ADS that uses a proprietary algorithm to deny medication to disabled people and people in poor health (especially minorities and women) with moderate to high pain.  Well, okay, NarxCare claims it identifies "patients with substance use disorders" (like opioids); but it's frequently wrong: 17% of the high scores are false, so deploying the system would lead to 72,000 people being unable to get access to the medication they need.  Today, the decision to deploy this system could be made without any transparency or accountability.
  • Anne Paxton of the Unemployment Law Project discussed Michigan's MIDAS system, which falsely accused 40,000 people of fraud. Many people didn't get the notifications, or didn't think they needed to respond, which led to charges of fraud ruining many people's careers and lives. Washington uses the same IT vendor as Michigan, and something similar happened: the system sent out questionnaires that people didn't respond to, and more than 100,000 people have gotten overpayment notices, aggressive collections, threats of garnishment and seizure – with no explanation.   Her summary: "black boxes are the enemies of equity."

Workgroup members María Paula Ángel of UW and Jennifer Lee of ACLU both highlighted the alignment of SB 5356 with the workgroup recommendations.  Ángel also noted SB 5356's alignment with ADS regulation internationally; Ben Winters of EPIC Privacy similarly highlighted the bill's alignment with ADS regulation in other states like California and Colorado.  And Maya Morales of WA People's Privacy, after noting that the bill comes from a place of love for WA residents and centers the human beings who are affected, asked legislators to look at systems.  Are they helpful to people in the state? Or do they hinder, and create inequities?

I wound up as the last person testifying, supporting SB 5356 as "a good first step", although expressing openness to other potential first steps as well.3 You can see my full testimony below. Here's an excerpt.

"The longer we wait to start regulating ADS, the more Washingtonians are harmed.

The longer we wait, the greater the potential costs and liabilities to the state from litigation over discriminatory systems.

And the longer we wait, the more expensive it will be to clean it up.  The partial fiscal note for SB 5356’s predecessor in 2021 estimated the costs over the first two years as $8.5 million; now, the estimated price tag has risen to $23 million – and that’s just in the first two years.  How much higher will the estimates be next year?"

What next?

"Big Tech and government contractors have successfully derailed legislation by arguing that proposals are too broad—in some cases claiming they would prevent public officials from using calculators and spreadsheets—and that requiring agencies to examine whether an ADS system is discriminatory would kill innovation and increase the price of government procurement."

Why It’s So Hard to Regulate Algorithms

Friday (February 17) was the "policy committee cutoff" in the Washington legislature, so if the ENET committee didn't advance the bill at their executive session that morning, then it's dead for the session unless they decide to ignore the rules, which usually only happens for bills the Governor supports.  Sen. Hasegawa made the very reasonable request to advance the bill to give the legislature more time to work on this critical issue, and the committee certainly could have advanced a very weak substitute to keep the discussions going.

Before the hearing, I wrote

"Let's hope that's what happens.  Otherwise, Washingtonians will keep getting harmed until the legislature takes action,  and the burden will continue to fall disproportionately on the most vulnerable populations.  And not to sound like a broken record, but the price tag keeps increasing.  So even putting aside the issues of Washingtonians getting harmed, the fiscal aspects really are the kind of thing that Ways & Means (the Senate's fiscal committee) should be talking about.  

Then again, Chair Nguyen and the ENET committee may just decide to kick the can down the road, meekly wait for the dismal results of the leisurely inventory, and continue hoping that the situation will magically fix itself."

You can probably guess what happened.4

What next?  Sen. Hasegawa made it clear that he'd keep fighting for ADS regulation.  Even though this year's bill didn't go forward, the hearing was useful for publicly documenting what's been discussed behind the scenes for a while:

  • the systems the state of Washington depends on are discriminatory, and cause harms to hundreds of thousands of Washingtonians
  • the Governor's Office, WaTech, TechNet, and WTIA are opposed to even the tiniest steps to make progress.  

There isn't any way to thread the needle.  The current version was scaled back from last year's bill, which was scaled back from the year before's – and the Governor's Office wouldn't even support a proposed further-scaled-back substitute.

More positively, thinking about ADS regulation has come a long way since SB 5356's predecessor was first drafted several years ago.  So it's time for some creative thinking to find a different path forward.  Easier said than done, of course, but it's not impossible.  We shall see.


Testimony

With only two minutes for live testimony, I usually follow up with an extended written version.  Here's what I sent the committee.

Chair Nguyen, Ranking Member MacEwen, and members of the Senate Environment, Energy, & Technology Committee,

I'm Jon Pincus from Bellevue, a technologist and entrepreneur, and founder of the Nexus of Privacy.  In 2021, I was a member of the Washington State Automated Decision-making Systems (ADS) Workgroup; I’ve also been General Manager of Strategy Development at Microsoft, Senior Researcher at Microsoft Research, and founder/CTO of a successful Silicon Valley-based venture-funded startup. I appreciate the chance to provide testimony on this bill, and would like to thank Chair Nguyen for scheduling this hearing, and Sen. Hasegawa, Vice Chair Lovelett and the other co-sponsors of the bill.

I strongly SUPPORT SB 5356. As you heard at the hearing, the systems that run our state government often have biases that reinforce and magnify discrimination.

The longer we wait to start regulating ADS, the more Washingtonians are harmed.

The longer we wait, the greater the potential costs and liabilities to the state from litigation over discriminatory systems.

And the longer we wait, the more expensive it will be to clean it up.  The partial fiscal note for SB 5356’s predecessor in 2021 estimated the costs over the first two years as $8.5 million; now, the estimated price tag has risen to $23 million – and that’s just in the first two years.  How much higher will the estimates be next year?

The ADS Workgroup report includes more details on ADS usage in Washington, a path forward, and the landscape of ADS regulation elsewhere.   What are Automated Decision Systems and why you should care?, a webinar that state Chief Privacy Officer Katy Ruckle and I did last September for WaTech, is a good overview of the workgroup’s findings and the consensus recommendations of the agency leads, community groups, and researchers who were part of it.

I want to highlight the consensus in the recommendations in that report.  Agency leads have a good understanding of the situation, and everybody largely agreed about the general direction of the next steps to get from here to better solutions. The devil is in the details, of course, and it’s been a challenge turning this consensus into legislation … but let’s focus for a minute on where we agree.

ADS are powerful and when they work well can help government agencies provide better service to more Washingtonians conveniently and far less expensively than purely manual systems. But today’s ADS often embed biases that reinforce and magnify discrimination – and most systems in use at state agencies today were designed and implemented without using the techniques that today are recognized as industry best practices.

For example, the workgroup looked at one Department of Corrections system that affects tens of thousands of Washingtonians, and based on the information provided to us concluded that its implementation "has not included adequate monitoring or testing for bias" (see p. 14 of the ADS Workgroup's Report for more details).  If it’s typical of other systems, state government agencies may have dozens, maybe hundreds, of systems that haven't been tested for biases.

This is a problem everywhere, not just here in Washington state.  From an engineering perspective, a huge amount of "technical debt" has accumulated over the years.  As awareness of the harms of these systems grows, that debt's cost keeps rising: the longer it goes unaddressed, the more the state is exposed to steadily increasing litigation risk – Challenging the Use of Algorithm-driven Decision-making in Benefits Determinations Affecting People with Disabilities has multiple examples of lawsuits in other states.

It’s a tricky situation, because our state government depends on these systems, and so do millions of Washingtonians, so we can't just put everything on hold until we build new versions.

The recommendations and principles in our report align well with important work that’s happened since then, including the recommendations in the White House Office of Science and Technology Policy's Blueprint for an AI Bill of Rights that Sen. Hasegawa mentioned in the hearing, as well as the Algorithmic Justice League’s Who Audits the Auditors survey of best practices.  And with my engineering management hat on, they’re a good way to approach a difficult situation where you can’t afford to shut down critical systems.

As I said in my testimony on SB 5356’s predecessor last year,

"It's a difficult situation, and one that won't magically fix itself.  The longer we wait to address it, the harder and more expensive it is to make progress -- and the more likely lawsuits are.  Investing now leads to longer-term cost savings by increasing workforce skills and productivity as well as reducing litigation risk.  How to do that in a fiscally responsible way?"

SB 5356 as written takes valuable steps towards these recommendations – especially if the estimates in the fiscal note include resources for training, bias testing, monitoring, and auditing.  The bill’s timelines and policies for extensions give agencies a lot of flexibility (again, as long as the fiscal note includes the resources they need in addition to making progress with everything else on their plate). It also aligns with proposed state legislation elsewhere.

That said, I certainly don’t think SB 5356 as written is the only way forward, and there could well be some other better first step.3 No matter the approach, some key elements to consider:

  • Start by examining the highest-priority existing systems.  The prioritization framework in the ADS Working Group's recommendations is an excellent suggestion from an agency lead that reflects reality – and (realizing it will take time to develop the framework) also recommends giving agencies the flexibility to make their own prioritization decision in the short term.
  • Make sure to account for costs and risks as well as benefits.  Audit requirements ensure that vendors and contractors provide a full accounting; the transparency requirements follow best practices by enabling third-party audits (as recommended by Who Audits the Auditors and the Blueprint for an AI Bill of Rights).
  • Don't make the situation worse!  Rules on procurement ensure that new systems are designed, implemented, tested, and deployed in line with current best practices. SB 5356 and California's proposed ADS Accountability Act take similar approaches, including algorithmic impact statements from contractors as part of the procurement process.

One way or another, I think it’s very important for the legislature to pass some legislation or do a significantly-sized budget proviso this year.  The longer you wait, the more expensive it gets.  So that means it’s critical for ENET to advance this bill or a substitute in order to continue the fiscal discussions in Ways & Means and home in on an achievable, fiscally responsible first step.

Thank you for your ongoing attention to this important issue, and I hope that you schedule an executive session and advance SB 5356 or a substitute to Ways & Means.

Jon Pincus, Bellevue

Notes

1 Rashida Richardson's 2021 Defining and Demystifying Automated Decision Systems discusses definitional issues in detail.  Here's her "narrow" definition.

“Automated Decision Systems” are any systems, software, or process that use computation to aid or replace government decisions, judgments, and/or policy implementation that impact opportunities, access, liberties, rights, and/or safety. Automated Decisions Systems can involve predicting, classifying, optimizing, identifying, and/or recommending.

Here's the definition in SB 5356, which is somewhat narrower.

"Automated decision system" means any algorithm, including one incorporating machine learning or other artificial intelligence techniques, that uses data-based analysis or calculations to make or support government decisions, judgments, or conclusions that cause a Washington resident or business to be treated differently than another Washington resident or business or results in statistically significant disparities with other classes of persons or businesses in the nature or amount of governmental interaction with that individual or business including, without limitation, benefits, protections, procurement processes, required payments, penalties, regulations, or timing, application, or process requirements.

2 Sen. Nguyen is no longer a Microsoft employee, by the way; he left after last session and started up a strategy consulting business. Washington has a part-time legislature and the previous ENET Chair did strategy consulting for the tech industry while sponsoring legislation like the Bad Washington Privacy Act that would benefit the tech industry, so we're used to this kind of stuff here.

3 Politically, I agree with Sen. Hasegawa that there's value in keeping the bill alive this session so that discussions continue.  If the only way to do that is to advance something even weaker than SB 5356, oh well, but in my view it'd still be better than once again punting till next year.

4 Of course it ain't over 'til it's over.  If the legislature decides they want to make progress on the ADS problem this session, they still could.  Even though SB 5356 is dead for the session, it could spring to life again. For example, the usual deadlines don't apply to bills that are deemed Necessary To Implement the Budget (NTIB), which a bill with a steadily-rising eight-figure price tag could certainly qualify for.  The Bad Washington Privacy Act was marked NTIB in 2021 (and the Bad Washington Privacy Act's sponsor reportedly tried to "encourage" his colleagues to vote for it by threatening to hold funding for eviction protection hostage), as was the Foundational Data Privacy Act last year, and their budget impact was significantly less than SB 5356's.