You make the call

In a memorandum sent last month to the heads of executive departments and agencies, the White House directed them “to advance the United States’ global AI dominance and to promote responsible AI innovation.” The question is: does this AI push mean anything?

Well, of course it means something. The problem is that its meaning is, well, mean.

First, the memo all but erases its 2024 predecessor, the Biden Administration’s OMB memo Advancing the Responsible Acquisition of Artificial Intelligence in Government.

According to an analysis by tech experts, the Trump memo removes any discussion of “responsibility,” along with the previously adopted testing requirements for “rights-impacting” and “safety-impacting” AI systems. Gone are terms such as “equity,” “bias,” and “environmental.” Surprise, surprise.

Gone, too, is the Biden Administration’s adoption of “procurement as policy.” Biden sought to use the federal procurement power to shape the market for more trustworthy AI. The Trump memo? Not so much.

Furthermore, the Biden policy required alignment with NIST’s AI Risk Management Framework to ensure standardized risk assessment. Trump’s new policy imposes no requirement to follow the NIST AI RMF; agencies are free to come up with their own performance standards. Typical of Trump, the emphasis is on performance, not risk.

Equity, civil rights, and environmental considerations are likewise absent from the Trump AI memo, as is much of the accompanying AI guidance. By contrast, the Biden policy required disclosure to agencies of information about training data, data labor, model architecture, and relevant evaluations, and it contractually bound vendors to make best efforts to filter out child sexual abuse material (CSAM), non-consensual intimate imagery (NCII), and other toxic content.

If the agencies follow their orders and implement the Trump AI plan, they will focus on cost and effectiveness, not safety. Biden’s work to shape a trustworthy AI market will be sacrificed to Trump’s short-sighted greed and ignorance.

So is the White House memo a good idea? You make the call.