Intent does not equal adoption. Twenty years in enterprise platforms taught me the same lesson every wave of technology repeats: deployment is easy; sustained, correct use at scale is where transformation wins or loses.
Same adoption challenge, new acronym
I have spent 20 years helping enterprise organizations adopt complex technology platforms. Before it was AI, it was CRM. Before CRM, it was omni-channel customer service. Before that, it was basic workflow automation. The technology changes. The adoption challenge does not.
The Department’s public Data, Analytics, and AI Adoption Strategy (and the accompanying fact sheet) reads like an enterprise transformation charter at national scale: strengthen data foundations, advance the analytics and AI ecosystem, invest in interoperable, federated infrastructure, strengthen governance and remove policy barriers, and deliver joint and business impact — all under responsible AI guardrails aligned to the Department’s ethical principles for AI.
The intent is unmistakable: field capability quickly and make it stick.
Scale and human dynamics
But intent does not equal adoption — and this is where experience in enterprise SaaS maps directly to the defense mission.
In 20 years of deploying Zendesk, Salesforce, and AI-powered platforms across Fortune 500 organizations, I have seen the same pattern repeat:
| What looks easy | What is actually hard |
|---|---|
| Technology deployment | Getting thousands of people to use it correctly and consistently |
| Choosing a vendor | Producing the outcomes leadership expects |
The DOD’s total force (military, civilian, and Selected Reserve end strength in the low millions, reported annually through manpower and budget materials) dwarfs any single commercial tenant I have ever rolled out. For scale context, see the Office of the Under Secretary of Defense for Personnel and Readiness and published Defense Manpower Requirements reports. The point is not the exact headcount on a given day; it is that culture, process, and identity multiply across an enormous graph of organizations.
What works in enterprise — and how it applies
Start with the workflow, not the model
The most successful AI deployments I have led did not begin with "Which AI should we use?" They began by mapping the existing workflow in painful detail: every step, handoff, decision point, and exception. Federal parallels show up in business process reengineering and service delivery modernization guidance; the NIST AI Risk Management Framework is likewise explicit that Map and Measure precede confident Manage. Same instinct, different vocabulary.
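To make that concrete, here is a minimal sketch (Python, with invented field names, not a standard schema) of how an adoption team might encode an as-is workflow before any model conversation starts. Each step records its owner, handoffs, decision points, and known exceptions, so candidate AI insertion points fall out of the map rather than the vendor pitch.

```python
from dataclasses import dataclass, field

@dataclass
class WorkflowStep:
    """One step in an as-is workflow map (illustrative fields only)."""
    name: str
    owner: str                                               # role that performs the step
    hands_off_to: list[str] = field(default_factory=list)    # downstream roles
    decision: str | None = None                              # decision made here, if any
    exceptions: list[str] = field(default_factory=list)      # known ways the step goes sideways

def ai_candidates(steps: list[WorkflowStep]) -> list[str]:
    """Flag steps with a decision or repeated exceptions as candidates for AI assistance."""
    return [s.name for s in steps if s.decision or len(s.exceptions) >= 2]

if __name__ == "__main__":
    intake = WorkflowStep("intake request", owner="clerk", hands_off_to=["supply sergeant"])
    triage = WorkflowStep(
        "triage and prioritize", owner="supply sergeant", hands_off_to=["warehouse"],
        decision="priority level", exceptions=["missing NSN", "duplicate request"],
    )
    print(ai_candidates([intake, triage]))  # -> ['triage and prioritize']
```

The value is not the data structure itself; it is that the exercise forces the team to name every handoff and exception before anyone argues about models.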
Invest in champions, not just technology
Every successful deployment had a network of internal advocates who understood both the technology and their colleagues' daily realities. In DOD terms, think NCOs and mid-grade officers who translate strategic directives into operational practice. The White House’s OMB M-24-10 similarly pushes governance boards and Chief AI Officers for civilian agencies — a structural admission that people and process must co-evolve with tools.
Measure adoption, not deployment
Deploying a tool and deploying a tool people use are different achievements. In enterprise SaaS we track adoption obsessively — daily active users, feature utilization, time-to-first-value. DOD should apply the same rigor to AI adoption that it applies to weapons system readiness.
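As a minimal illustration (synthetic data, invented feature names), the same three numbers we track in commercial rollouts can be computed from nothing more than an event log and a roster of provisioned users:

```python
from collections import defaultdict
from datetime import date, datetime

# Hypothetical event log: (user, feature, timestamp). Real telemetry schemas vary.
events = [
    ("a.smith", "summarize_report", datetime(2024, 6, 3, 9, 0)),
    ("a.smith", "draft_message",    datetime(2024, 6, 3, 14, 0)),
    ("b.jones", "summarize_report", datetime(2024, 6, 4, 8, 30)),
]
provisioned_users = {"a.smith", "b.jones", "c.lee"}            # everyone "deployed to"
provisioned_on = {u: date(2024, 6, 1) for u in provisioned_users}

# Daily active users: distinct users with any event on each day.
dau = defaultdict(set)
for user, _feature, ts in events:
    dau[ts.date()].add(user)

# Feature utilization: share of provisioned users who ever touched each feature.
feature_users = defaultdict(set)
for user, feature, _ts in events:
    feature_users[feature].add(user)
utilization = {f: len(u) / len(provisioned_users) for f, u in feature_users.items()}

# Time-to-first-value: days from provisioning to a user's first event.
first_event = {}
for user, _feature, ts in sorted(events, key=lambda e: e[2]):
    first_event.setdefault(user, ts.date())
ttfv_days = {u: (d - provisioned_on[u]).days for u, d in first_event.items()}

print({d: len(u) for d, u in dau.items()})  # daily active users
print(utilization)                          # note c.lee never appears: deployed, not adopted
print(ttfv_days)
```

The third user in this toy example is the whole point: provisioned on day one, invisible in the usage data ever after.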
Plan for the resistance
Resistance is not a failure of character; it is a natural human response to change that threatens established competence. The supply sergeant who has managed inventory for 15 years is not backward; to that sergeant, AI threatens the expertise that defines professional identity. Successful adoption moves the narrative from "my job is being replaced" to "my job is being amplified."
Reinventing wheels is optional
The DOD does not need to reinvent the wheel on AI adoption. Twenty years of enterprise technology transformation produced a substantial body of knowledge about what works. The defense mission adds unique constraints — security classifications, operational tempo, hierarchy — but the human dynamics of technology adoption are universal.
Takeaway
The organizations that win the AI race will not be the ones with the best models alone. They will be the ones that close the gap between deploying AI and actually using it.
Further reading
- DoD Data, Analytics, and AI Adoption Strategy (PDF) — strategic goals, RAI alignment, and enterprise coordination via the CDAO Council.
- DOD adopts five principles of AI ethics — the public anchor for Responsible, Equitable, Traceable, Reliable, Governable.
- Responsible AI Strategy and Implementation Pathway (PDF) — how principles translate into acquisition and engineering practice.
- NIST AI Risk Management Framework (AI RMF 1.0) — Govern, Map, Measure, Manage — a vocabulary DOD and industry both borrow.
- OMB M-24-10: Advancing governance, innovation, and risk management for agency use of AI (PDF) — civilian-side governance patterns that rhyme with enterprise SaaS program management.
- OMB M-24-18: Advancing responsible AI acquisition (PDF) — procurement guardrails for rights- and safety-impacting systems — relevant when AI touches public-facing or high-consequence workflows.