There Is No Middle: Why Presidents Must Choose a Future-Ready AI Strategy
By Robert J. Clougherty, PhD
In 1983, Microsoft released Multi-Tool Word. “Total Eclipse of the Heart” was on the radio. Dallas dominated TV. The iPhone didn’t exist, and “software” was barely a household word. Forty years later, most university business still plays out in documents, spreadsheets, and slide decks. The files are digital. The metaphors are not.
We still sit at virtual “desktops,” file items in “folders,” and ask IT to “fix” things. If Don Draper wandered into a university boardroom today, he’d be confused by the monitors—but not by the metaphors. The real problem? Neither are we.
We think we’ve modernized. But in truth, higher education is still working off the same conceptual architecture that defined office work in the late twentieth century. The language of “documents” and “files” didn’t come from the academy—it came from business, filtered through Xerox PARC’s vision of the workplace. That vocabulary still shapes what we believe our tools—and by extension, our institutions—are for.
And now higher education is standing in what complexity theorists call a phase transition: a moment when incremental change tips into transformation. AI isn’t coming. It’s already shaping how we teach, learn, govern, and communicate. But the culture of most campuses hasn’t caught up. Leadership still hedges: half-investments, pilot projects, small bets on big change—safe experiments conducted in the corner while the building quietly rewires itself.
There is no middle.
You cannot invest in AI while holding tight to legacy systems designed for stability, siloing, and replication. The strategic logic of these two worlds is incompatible. One optimizes for repetition. The other for emergence. Choosing both is a guaranteed way to fail at both—and exhaust the people asked to maintain the contradiction.
Let’s name the practical constraint: resources. You do not have the staff, time, or money to run two full strategies indefinitely. Legacy environments demand constant maintenance, vendor management, security patching, workarounds, and institutional “muscle memory.” AI-ready environments demand data stewardship, governance, training, experimentation, privacy design, and cross-functional collaboration. Doing both at scale is not “balanced.” It’s an unfunded mandate.
This is not an IT problem. It’s a leadership problem. And like it or not, the technologies you adopt—or avoid—are semiotic acts. They signal who you are, what you value, and whether you're designing the future or defending the past. Boards notice. So do your students. So do your faculty candidates. So do the regional accreditors who expect evidence of institutional effectiveness—and, increasingly, evidence of responsible AI use.
To move forward, we must change how we understand computing. Traditional systems ask: “What do you want to automate?” AI asks: “What pattern are you not seeing yet?” It is a telescope, not a typewriter. It doesn’t just speed up the old workflow—it reveals what your workflow has been hiding.
Most institutions are still stuck in a loop: digitalization as paper mimicry, spreadsheets as strategy, IT as janitor. Meanwhile, emerging AI systems thrive on unstructured data—the kind higher ed has in abundance but rarely uses: syllabi, learning outcomes, narrative assessment reports, open-ended survey comments, advising notes, policy archives, web content, exit interviews. This is the goldmine presidents are standing on.
Think about it. You’ve got departments rewriting the same accreditation report every five years from scratch, not realizing they’ve accumulated a decade of language that can be analyzed, improved, and reused with AI support. You’ve got campuses redesigning websites every three years with outside consultants, without first asking an AI to assess tone, clarity, accessibility, and student confusion—at scale—based on real user journeys and feedback. You’ve got leaders sending campus-wide messages without ever seeing how their tone lands across audiences with different histories, identities, and trust levels.
But mining this gold takes more than IT. It takes mindset. It takes governance. It takes cultural accountability from every office that owns data, people, or processes. AI is not a tool you plug in. It’s a campus-wide shift in posture.
This is the uncomfortable part: functional areas must stop outsourcing basic computing and cyber responsibility to IT. If an office can approve a hire, sign a contract, or adopt a cloud service, that office can also own the basics of data stewardship, access control, and secure practices. The age of “I’m not a techie” is over. Not because everyone must become technical—but because everyone is now implicated in the institution’s digital reality. Culture is not soft. Culture is operational.
So what can presidents and provosts do now—without turning this into a 40-page plan that no one reads? Start with clarity, then build capacity.
- Stop investing in two futures. Choosing both is choosing neither.
- Align AI adoption with mission, not convenience. Use AI to augment what you most value: student success, equity, research, public service, learning.
- Shift IT from break/fix to strategy and enablement. IT must be at the table, not under it.
- Require data literacy in cabinet-level planning. You don’t need to code—but you must understand how models work, where your data lives, and what “good” looks like.
- Establish governance that is real, not ceremonial: clear policies for privacy, bias, accountability, and appropriate use—especially when vendors promise “AI” but deliver only a new interface.
- Craft your narrative. What story is your tech stack telling your board, your students, your accreditors? Are you signaling courage—or caution dressed up as prudence?
Presidents and provosts don’t need to write Python. But they do need to lead. AI is not a product. It is a process of rethinking what your institution is for—and how it will remain relevant. And relevance is not static. It is cultivated.
The coming years will define whether your institution is a participant in the next era of learning—or merely an observer with an expensive set of legacy systems and a dwindling talent pipeline.
There is no middle.
There is only momentum—or drift.
Robert J. Clougherty, PhD, is the founder of Recursive Meaning AI and a long-time scholar-practitioner exploring the intersections of artificial intelligence, organizational change, and meaning-making in higher education.