Binding Public Sector AI Diffusion
Abstract
The article discusses the potential negative impact of the new Office of Management and Budget (OMB) policy on the diffusion of AI in the public sector. It highlights the extensive new process requirements and restrictions imposed on "rights impacting" and "safety impacting" AI systems, which could significantly slow down or halt the adoption of AI in government agencies due to budgetary and administrative challenges.
Q&A
[01] Process Risks
1. What are the new categories of AI systems introduced in the OMB policy? The new policy sets out requirements for two new categories of AI systems:
- "Rights impacting" AI: Any system that could impact access to government services, civil rights, human autonomy, discrimination protections, or surveillance.
- "Safety impacting" AI: Systems that could impact human safety or well-being, the climate or environment, critical infrastructure, or government assets.
2. What are the "minimum practices" required for agencies to approve and use these AI systems? Agencies must now complete numerous "minimum practices" to use these AI systems, including:
- Independent evaluation
- Impact assessments
- Consultation with affected communities and the public
- Transparency requirements
- Human review and opt-out processes
- Real-world scenario testing
- Ongoing monitoring
3. How could these new requirements impact the use of AI in government? The article argues that these requirements will likely cause delays, fiscal waste, and harmful failures in AI adoption, because they layer substantial demands for time, money, paperwork, code, and bureaucratic staffing on top of the pre-existing rules that already contribute to government IT failures.
4. What examples are given of the potential over-inclusion of AI systems under these new rules? The article cites the example of the Postal Service's use of a "crude form of AI to read ZIP codes to route letters and verify postage," which could potentially be considered a "rights impacting" system under the new policy. It also mentions the Paperwork Reduction Act being over-interpreted to require White House approval for IT user experience testing, leading to delays and poorly tested systems.
[02] Process Meets Budget
1. How are IT budgets being affected as these new AI regulations take effect? The article notes that just as the new AI rules come into force, IT budgets are being heavily squeezed. For example, the proposed budget for the Technology Modernization Fund (TMF), which is meant to aid federal AI diffusion, would be cut by 62.5%. Additionally, most civilian agencies are seeing IT funding decreases or freezes.
2. How could these budget cuts and new regulations impact the Department of Veterans Affairs (VA) and its use of AI? The VA has an extensive inventory of 130 AI applications, many of them potentially transformative in healthcare. However, the article argues that with a proposed 20% reduction in IT funds for FY2025 and the new administrative burdens, VA officials will likely question the ROI of AI implementation projects and default to legacy options instead, causing the VA to fall behind in adopting the latest AI healthcare technologies.
3. What is the key balancing act the article suggests the administration is failing to recognize? The article argues that the administration is failing to recognize the need to balance regulation and funding for effective government. It suggests that when funds are low, regulations must loosen to maintain state capacity and enable the rapid diffusion of beneficial technologies like AI.