
Coby Adcock’s Scout AI raises $100M for war training models

Summary

– Scout AI, a defense startup founded in 2024, raised a $100 million Series A round to develop its “Fury” AI model for operating military assets, starting with logistics and moving toward autonomous weapons.
– The company uses Vision Language Action models (VLAs) to train AI on all-terrain vehicles at a military base, allowing the AI to learn off-road navigation through simulated missions and human takeover data.
– Scout’s first product, “Ox” command and control software, aims to let soldiers orchestrate drones and ground vehicles with simple commands, and the company has secured $11 million in military contracts from DARPA and other Department of Defense customers.
– The startup is testing autonomous weapons systems, including drone swarms that can identify and attack targets without human intervention, though experts note similar technology like heat-seeking missiles has existed for decades.
– Scout founders criticize other AI companies for refusing to work with the military, and they plan to build their own model from scratch to achieve AGI through real-world interaction rather than just internet data.

At a classified U.S. military installation in central California, four-seat all-terrain vehicles carve through dusty hillside trails. This is a training exercise, but the focus isn’t on the human passengers inside. Instead, it’s an intensive effort to train AI models for conflict zones, teaching machines to navigate terrain where no road signs, traffic lights, or lane markings exist.

The autonomous military ATVs are operated by Scout AI, a defense-focused startup founded in 2024 by Coby Adcock and Collin Otis. The company describes itself as a “frontier lab for defense” and announced Wednesday that it has closed a $100 million Series A funding round. The round was led by Align Ventures and Draper Associates, following a $15 million seed round in January 2025. Scout invited TechCrunch for an exclusive look at its training operations on a military base it asked us not to name.

The startup is building an AI model called “Fury” to operate and command military assets. Initially focused on logistical support, the model is designed to eventually control autonomous weapons. CTO Collin Otis compares the work, which builds on existing large language models (LLMs), to training a new soldier.

“They start when they’re 18 years old, and sometimes they even start after college, so you want to start with that base level of intelligence,” Otis told TechCrunch. “It’s useful to start with someone who’s already made an investment and then say, hey, what do I have to do to teach this thing to be an incredible military AGI, versus just being a broadly intelligent AGI?”

Scout has already secured $11 million in military technology development contracts from organizations including DARPA, the Army Applications Laboratory, and other Department of Defense customers. It is one of 20 autonomy companies whose technology is being used by the U.S. Army’s 1st Cavalry Division during its regular training cycle at Fort Hood, Texas. The unit is expected to bring along the products that prove themselves when it next deploys in 2027.

For Scout’s internal testing, the rubber meets the dirt on the base’s rugged terrain. There, the company’s operations team, led by former soldiers, is putting the vehicles through simulated missions.

While autonomous cars are beginning to appear in cities worldwide, they operate in structured environments with clear rules. Navigating unmarked trails or off-road terrain is an entirely different challenge. Otis, a former executive at autonomous trucking company Kodiak, said he was motivated to start Scout when he realized the system he helped build there wasn’t intelligent enough to operate in an unpredictable war zone.

A new approach to autonomy

Scout is turning to a newer technology: Vision Language Action models (VLAs). Based on LLMs and used to control robots, VLAs were first released by Google DeepMind in 2023. The technology seeded robotics startups like Physical Intelligence and Figure AI, the humanoid robot company led by Adcock’s brother, Brett.

Adcock sits on Figure’s board. He says that experience convinced him of the opportunity to bring broader intelligence to the military’s growing fleet of autonomous vehicles. His brother introduced him to Otis, who was advising Figure, and they set about applying the latest AI advances to military solutions.

“If I handed you the controller of a drone right now and I strapped a headset on you, you could learn to fly that thing in minutes,” Otis said. “You’re actually just learning how to connect your prior knowledge to these couple little joysticks. It’s not a big leap. That’s the way to think about VLAs and why they’re such an unlock.”

I got a chance to drive one of Scout’s ATVs around the rutty trails, and the terrain was challenging: steep hills, loose sand on turns, disappearing tracks, and confusing intersections. I’m not an experienced ATV driver but made a fair go of it on my first attempt. That’s the kind of general intelligence the company wants in its models, which it has been training on these ATVs for just six weeks, after starting the process with civilian ATVs.

I also rode in an ATV under autonomous control, and could feel the difference. It accelerates faster than a human driver who might be thinking about a passenger’s comfort. The operations team points out how the vehicles hug the right side on wider trails but stay in the middle of narrow ones, mimicking their training drivers. When confused, they slow down suddenly to think over their next move, something that happened a few times as our vehicle carried us on a 6.5 km loop before returning to base.

Though VLAs are new enough that no company has deployed them in an operational setting, “the technology is good enough to be doing that experimentation in the field with soldiers to figure out how to most be effective to US forces,” said Stuart Young, a former DARPA program manager who worked on ground vehicle autonomy. Like other autonomy companies, Scout’s full autonomy stack also includes deterministic systems and other flavors of AI to round out its agents’ capabilities.

Young left DARPA this month to join Field after managing a program called RACER. It asked companies to create high-speed, autonomous off-road vehicles, helping seed this space the same way the organization’s Grand Challenge boosted self-driving cars. Two competitors in this space, Field AI and Overland AI, were spun out of that program, and Scout also participated as a later addition.

The first applications of ground autonomy, according to Scout executives and military technologists, will be automated resupply: carrying water or ammunition to distant observation posts, or in a convoy where a crewed truck might be followed by six to ten autonomous vehicles, saving precious human labor for more important tasks. Brian Mathwich, an active duty infantry officer doing a stint as a military fellow at Scout, recalled a recent exercise in Alaska where he led a resupply convoy in total darkness and wished for autonomous vehicles to help him out.

Adding intelligence to the Army’s motor pool

Scout sees itself primarily as a software company, building an intelligence layer for military machines. It doesn’t intend to make the autonomous vehicles themselves but to build atop them.

Adcock expects the startup’s first widely adopted product to be “Ox,” the company’s command and control software, bundled on hardened computer hardware (GPUs, communications, cameras). It’s intended to allow individual soldiers to orchestrate multiple drones and autonomous ground vehicles with prompt-like commands: “Go to this waypoint and watch for enemy forces.”
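Scout hasn’t published Ox’s interface, but the “prompt-like command” idea can be illustrated with a minimal sketch. Everything here — the `Tasking` schema, the asset names, the `fan_out` helper — is hypothetical, invented for illustration: a single operator order is translated into a structured tasking for each available asset.

```python
from dataclasses import dataclass

@dataclass
class Tasking:
    """Structured order derived from a prompt-like command (hypothetical schema)."""
    asset_id: str
    action: str                     # e.g. "move_to", "observe"
    waypoint: tuple[float, float]   # (lat, lon)

def fan_out(waypoint, assets, action="observe"):
    """Turn one operator command into one tasking per asset."""
    return [Tasking(asset_id=a, action=action, waypoint=waypoint) for a in assets]

# "Go to this waypoint and watch for enemy forces" -> a tasking for each asset
orders = fan_out((36.33, -119.65), ["uav-1", "ugv-1", "ugv-2"])
print(len(orders))  # 3
```

In a real system the natural-language order would presumably be parsed by an LLM into such a structure; the point of the sketch is only that one command fans out to many machines.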

However, making that software work requires training on real vehicles. Hence Foundry, which is what the company calls its training range at the military base. There, drivers spend eight-hour shifts putting the ATVs through their paces, then work through a reinforcement learning system, logging where they had to take over; that data is used to improve the model. The base commander has already asked for the company’s ATVs to take a turn on security patrols.
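The takeover-logging loop can be sketched in a few lines. The schema below is hypothetical (Scout hasn’t described its data format), but it captures the idea: each run records the intervals where a human had to take control, and the share of time spent under manual control is a crude metric of how the autonomy is improving between training cycles.

```python
from dataclasses import dataclass, field

@dataclass
class Takeover:
    start_s: float   # seconds into the run when the driver took over
    end_s: float
    note: str        # driver's description of why

@dataclass
class RunLog:
    """One training run at the range (hypothetical schema)."""
    takeovers: list[Takeover] = field(default_factory=list)

    def takeover_fraction(self, run_length_s: float) -> float:
        """Share of the run spent under human control."""
        manual = sum(t.end_s - t.start_s for t in self.takeovers)
        return manual / run_length_s

log = RunLog([Takeover(120.0, 135.0, "washed-out turn"),
              Takeover(400.0, 410.0, "ambiguous fork")])
print(round(log.takeover_fraction(3600.0), 4))  # 0.0069
```

The flagged intervals are exactly the segments most valuable for retraining, since they mark situations the model couldn’t handle on its own.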

One hypothesis Scout is testing is that VLAs will enable this relatively limited data set, alongside training data in simulations, to deliver a fully capable driving agent. While the vehicle seems comfortable on trails, for example, it isn’t ready to operate fully off-road yet.

Scout is also practicing with drones for reconnaissance and as weapons, giving them intelligence with vision language models, a multi-modal LLM variant. The company is working on a system that would see groups of munition drones fly with a larger “quarterback” platform that provides more compute resources to command them. In one mission, the drones would search a geographic area for hidden enemy tanks and attack them, possibly without human intervention. Otis contends that the alternative approach in this scenario might be indirect artillery fire, which is imprecise compared to drone strikes.

While autonomous weapons are a flash point in the politics of defense tech, experts note the concept is old: heat-seeking missiles and mines have been in use for many decades. The question for technologists is how the weapons are controlled, Jay Adams, a retired U.S. Army captain who leads Scout’s operations team, told TechCrunch.

He notes the company’s munitions drones can be programmed to only attack threats in a specific geographic area, or only with human confirmation. He also says autonomous weapons platforms are unlikely to fire because they are scared, the way an 18-year-old soldier might.
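Those two constraints — a geographic boundary and a human-confirmation requirement — compose naturally into a gate in front of any engagement decision. The sketch below is illustrative only (the function, bounding-box format, and flags are invented, not Scout’s implementation):

```python
def may_engage(target_pos, box, human_confirmed, require_confirmation=True):
    """Rules-of-engagement gate (illustrative): engage only inside the
    authorized box, and only with human sign-off when confirmation is required."""
    lat, lon = target_pos
    lat_min, lat_max, lon_min, lon_max = box
    in_box = lat_min <= lat <= lat_max and lon_min <= lon <= lon_max
    if not in_box:
        return False  # outside the authorized area: never engage
    return human_confirmed if require_confirmation else True

box = (36.0, 36.5, -120.0, -119.5)
print(may_engage((36.2, -119.8), box, human_confirmed=False))  # False
print(may_engage((36.2, -119.8), box, human_confirmed=True))   # True
```

Setting `require_confirmation=False` corresponds to the fully autonomous mode described above, where the geographic constraint alone bounds the weapon’s behavior.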

VLAs, too, offer promise for better targeting. Scout says its models are pretrained on a specific set of military data to prepare them for, say, running into an enemy tank while on a resupply mission. Lt. Col. Nick Rinaldi, who supervises Scout’s work for the Army Applications Laboratory, says that while automated targeting is hard and unlikely to be used outside of constrained environments in the near term, the potential of VLAs to reason about threats makes them a promising technology to investigate.

Adams says the promise of drones that can identify their own targets is key to future warfare. While Russia’s invasion of Ukraine has generated intense interest in drone warfare, he believes having humans operate individual UAVs doesn’t scale well enough for the U.S. to counter large numbers of low-cost unmanned systems should they threaten American forces.

A mission to counter anti-military vibes

Like many defense startups, Scout wears its mission on its sleeve, and executives will freely criticize companies that are reluctant to hand their technology over to the government. Google, for example, reportedly pulled out of a Pentagon contest to develop control systems for autonomous drone swarms, a capability Scout is also working on.

“The AI people don’t want to work with the military,” Otis told TechCrunch, referencing Anthropic’s spat with the Pentagon over its terms of service. “None of them are open to running agents on one-way attack drones, or running agents on missile systems.”

Nevertheless, Scout is actually using existing LLMs as the base for its agents, though it declined to say which ones. Otis says it has agreements with “very well known hyperscalers” to provide the pretrained intelligence for Scout’s foundation model. He also declined to comment on whether the company uses open-weight models, such as those offered by Chinese companies. Many companies that rely on AI inference build on such models because they cost less to run than models from frontier labs like Anthropic or OpenAI.

Scout expects to address this by building its own model from the ground up in the years ahead, and the founders say much of its capital will go into those training and compute costs. Indeed, Otis wonders if Scout will beat the existing leaders to AGI because its model will be constantly interacting with the real world.

“There’s an argument in the AGI community along the lines that you can only get so intelligent by reading the internet, and most intelligence comes with interacting in the world,” Otis said.

Does that mean Adcock is competing with his brother’s army of humanoid robots at Figure? No, Otis says, but “we can get to scale much faster because our customer has assets,” he said, referring to the Pentagon.

(Source: TechCrunch)
