The Jobs Aren't Just Changing — They're Being Reimagined

AI and the Future of the Higher Education Workforce

Post 1 of 6 in the series: Leading Through the AI Shift: A Higher Education Leadership Series


There's a conversation happening in every provost's office, every HR leadership meeting, every cabinet retreat right now — and it tends to circle the same unresolved question: What exactly happens to our people?

Not in a dramatic, science-fiction sense. Nobody in higher education is seriously expecting robots to walk the halls of the registrar's office. But the quieter, more disorienting reality is already here: AI is absorbing the cognitive, knowledge-based tasks that made up the professional identity of many of our colleagues. And most institutions are not yet sure what that means for how they staff, develop, and lead their workforce.

This post opens a six-part series that draws on the themes from EDUCAUSE's upcoming summit on AI and workforce transformation in higher education — an event that places workforce implications, not just technology capabilities, at the center of the AI conversation. That framing matters. It's the right framing.


What We Mean When We Say "Transformation"

Let me be direct about something: the word "transformation" is used so casually in higher education that it has nearly lost its meaning. Every new student information system is called transformational. Every learning management platform refresh gets the same label. That overuse has made leaders understandably skeptical when they hear it again in the context of AI.

But this time, the word actually fits — and it fits in ways that are uncomfortable.

The World Economic Forum's 2025 Future of Jobs Report projects that by 2030, around 60% of the global workforce will require significant upskilling. Not new jobs — existing workers in existing positions who will need fundamentally different capabilities just to do the work they were already doing. That's not incremental change. That's structural.

What makes higher education's situation particularly pointed is that the roles most exposed to AI automation are not the ones we typically think of as "automatable." They're not warehouse workers or assembly line operators. They're knowledge workers — the advisors, analysts, coordinators, and administrators who make our institutions function. Goldman Sachs has estimated that roughly 300 million full-time jobs globally are exposed to AI-driven automation, with white-collar, professional roles disproportionately affected.

This is a direct challenge to a sector built almost entirely on knowledge work.


The Higher Education Workforce Is Uniquely Exposed

Consider what a student success advisor does: they review academic records, track early warning indicators, interpret data on student engagement, write follow-up emails, coordinate with faculty and financial aid, schedule appointments, and maintain case notes. Nearly every one of those tasks — except perhaps the relational core of the advising conversation itself — is something an AI system can now assist with, accelerate, or, in some cases, perform independently.

Or consider institutional research staff who spend weeks constructing reports that a well-configured data tool could surface in hours. Or compliance teams manually reviewing documentation that large language models can scan in seconds. Or communications offices producing first drafts of every piece of content from scratch.

This isn't hypothetical. A 2026 EDUCAUSE study on the impact of AI on work in higher education found that among institutions already implementing AI work strategies, 69% are prioritizing upskilling and reskilling existing staff — which suggests that, at least at the leading edge, institutions understand the workforce implications are real and immediate.

The question is whether the rest of the sector will follow deliberately or wait until it is forced into reacting under pressure.


The Human-Centered Argument Is Not Optional

Here's where I want to push back on a framing that sometimes shows up in these conversations: the idea that emphasizing "human-centered values" in AI adoption is a kind of philosophical luxury — something we say because it sounds responsible, while the real work is about productivity and cost reduction.

That framing is wrong, and it leads institutions astray.

The workforce implications of AI are not primarily a technology problem. They are a people problem. The staff members who feel their roles threatened by AI adoption — and many of them do, even if they're not saying it loudly — will either become active participants in transformation or they will become passive resisters. Both responses are deeply human, and neither can be managed by better software.

UNESCO's survey on AI in higher education found that two-thirds of institutions now have or are actively developing AI guidance, but guidance documents alone do not build trust. They signal intention. Building actual trust — the kind that allows a workforce to absorb significant role redefinition without fracturing — requires visible leadership, honest communication, and genuine investment in people's professional futures.

The Deloitte 2026 Higher Education Trends report is pointed on this: human-skills-centered approaches are not just ethically sound; they're strategically necessary for institutions navigating the enrollment, financial, and competitive pressures that define this moment. Institutions that strip out roles without thoughtfully rebuilding them will find themselves with capability gaps they didn't anticipate, at exactly the moment they can least afford them.


What's Actually Being Asked of Leaders

I want to be honest about how hard this is. Leading a workforce through AI-driven transformation is not a communications challenge or a project management challenge. It's a leadership challenge — one that requires holding real tensions without pretending they don't exist.

The tension between efficiency and employment is real. AI will, over time, reduce the headcount needed for certain functions. That's not conjecture; it's the operational logic of the technology. Leaders who pretend otherwise lose credibility with their teams. Leaders who acknowledge it honestly, while committing to ethical transition strategies and genuine investment in reskilling, build the kind of trust that makes transformation possible.

The tension between moving fast and moving carefully is also real. There's institutional pressure — from boards, from peer comparisons, from budget pressures — to show AI-driven productivity gains quickly. And there's an equally valid concern that moving too fast, without adequate support for workforce transitions, generates exactly the cultural resistance that slows long-term adoption. The institutions getting this right are not moving fastest. They're moving most intentionally.

Finally, the tension between strategic vision and operational reality is something every leader navigating this moment knows well. The vision is clear enough: a workforce that collaborates with AI, that focuses human attention on the work AI cannot do — relationship, judgment, creativity, context. The operational path from here to there, involving actual people in actual departments with actual job descriptions, is considerably harder.


The Roles That Survive — And Why

Not every role in higher education is equally threatened. It's worth naming what the research and early evidence suggest about durability.

Roles that center on complex human relationships — counseling, academic advising at its highest level, faculty mentorship, community building — appear more durable, because the relational core of those interactions is genuinely hard to replicate with AI. The technology can surface information and identify patterns; it cannot substitute for the trust built between a human advisor and a student navigating a difficult semester.

Roles that require contextual judgment about specific, complex institutional situations — legal, accreditation, faculty governance, complex donor relations — also appear more durable, because the judgment required is highly context-specific and depends on institutional knowledge and relationships that don't exist in any training dataset.

Less durable are the production-oriented dimensions of professional roles — the report writing, the data compilation, the first-draft generation, the routine communication, the compliance documentation. These are the tasks that fill most of the working day for many higher education professionals, and they are exactly the tasks where AI assistance is advancing fastest.

This doesn't mean those professionals are not needed. It means their roles are changing in ways that require active management — and that the institutions that actively manage that change, investing in the development of their people's higher-order capabilities while AI handles more of the production work, will have stronger workforces than those that simply let the change happen.

The IMF's 2026 analysis on new skills and AI reshaping work identifies a clear pattern across sectors: workers who combine domain expertise with AI fluency are dramatically more productive than either AI alone or human experts working without AI. That combination is the workforce target higher education institutions should be building toward — not AI instead of people, but people equipped to lead and leverage AI effectively.


A Place to Start

If I were advising an institution just beginning to think seriously about this, I'd start with an honest workforce analysis — not a technology audit. Before asking "What AI tools should we adopt?", ask: Which roles in our institution are most exposed to AI-enabled automation? Which are most durable? What do we owe our people in the transition, and how will we fund it?

That conversation — uncomfortable, specific, grounded in real job families and real career pathways — is the one that leads somewhere useful. The institutions that will navigate this well are not the ones that deployed the most tools. They're the ones that led the most honestly.

This series will work through six dimensions of that challenge. The next post examines what it actually means to build AI-augmented teams — not just in theory, but in the practical, day-to-day reality of higher education departments where the work still has to get done.


The EDUCAUSE Summit on AI and Workforce Transformation brings together higher education leaders to work through exactly these questions. Learn more at events.educause.edu.

If you're working through AI readiness at your institution and want to think through the workforce strategy dimension, EDIE, our AI assistant for education, is a place to start.