Like many fellow Americans, I chuckled when President Trump announced the creation of the US Space Force on December 20, 2019. I even remember laughing heartily at the late-night circuit’s many Star Trek jokes that day. Yet I had mostly forgotten that the Space Force even exists – until last week, when Secretary of War Pete Hegseth opened a policy speech alongside Elon Musk at SpaceX’s headquarters by flashing the Vulcan salute and affirming Musk’s desire to “make Star Trek real”.
The absurdity of Musk’s introduction – in which he spoke of “going beyond our star system to other star systems, where we may meet aliens or discover long dead alien civilizations” as if this could happen in any of our lifetimes – belied the seriousness of the new US military artificial intelligence strategy that Secretary Hegseth proceeded to announce.
Before an audience of Pentagon leadership and SpaceX employees, Hegseth outlined the structures, initiatives, and objectives in place to bring about what he called “America’s military AI dominance”, with his remarks largely following the plan documented in the July 2025 report “America’s AI Action Plan”.
A core goal Hegseth specified was “becoming an AI-first warfighting force across all domains”. He elaborated that AI will be deployed in three ways: for “warfighting, intelligence, and enterprise missions”.
Hegseth shared that the military’s generative AI model, known as genai.mil, launched last month for all three million Department of War (DOW) employees and will run on “every unclassified and classified network throughout our department.” The initial model was developed with Google Gemini and will soon incorporate xAI’s Grok. In its first month, one-third of DOW’s workforce (one million people) has already used the model.
In the speech, Hegseth repeated phrases such as “removing red tape”, “blowing up bureaucratic barriers”, and “taking a wartime approach” to the people and policies that he called “blockers”. The specifics he voiced disdain for included the regulations in ‘Title 10 and 50’ – referring to Title 10 of the US Code (the legal bedrock of the armed forces, including the configuration of each branch) and Title 50 of the US Code (the laws governing national security, intelligence, defense contracts, war powers, and more). These don’t sound like the types of data, processes, and policies to treat with a ‘move fast and break things’ approach.
How genai.mil might be used is even more frightening, especially as we learn how other AI programs are already being used to direct intelligence, surveillance, and warfare.
An April 2024 report from +972 Magazine unveiled an Israeli military AI program known as ‘Lavender’, which was used to generate kill lists of Palestinians. Despite the program’s reported 10 percent false-identification rate, no human validation was required before launching air strikes on the AI-identified targets. Another system, known as ‘Where’s Daddy?’, employed AI to track targeted individuals to their family homes.
This article is excerpted from ‘The Military’s AI Strategy Threatens Everything We Love’. Courtesy: Counterpunch.org