Are Universities Preparing Students Adequately for the Age of AI?
It's still early, but spoiler alert: the answer is no.
Let’s cut to the chase: universities are not preparing students for a world dominated by artificial intelligence. The gap between what higher education offers and what the job market demands has always existed. But in the era of generative AI, that gap is becoming a chasm.
Here's a true story.
A neighbor of ours is a C-suite exec who recently hosted potential interns from a top NYC university, part of a program that has run successfully for years with students from the school. This fresh batch of business school students was given a set of real-world problems in advance and told to prepare a pitch with their ideas; the best ones would be selected for the internship.

How many were hired? Zero. The students read ChatGPT-generated content from their phones and presented Midjourney-created PowerPoints. At best, their presentations were basic or obvious; at worst, flat-out wrong. Worse still, the students couldn't answer basic questions about "their" ideas. The internship program has been paused indefinitely.
This kind of disconnect isn’t new. It’s happened before with the arrival of the internet, Wikipedia, and Google.
Once again, higher education is being challenged to adapt its pedagogy. But walk into most lecture halls today and you’ll still find the same model that’s been used for decades—professors lecturing, students note-taking, and exams that reward memorization. Meanwhile, AI is redefining how we think, work, and create.
Higher education is being disrupted.
Are business majors just learning how to prompt an LLM? Are they being taught to use AI as an assistant, offloading routine work rather than supplanting their own creative thinking? To critically evaluate outputs and use the results responsibly? Clearly not yet, judging by the failed interns above.
To be fair, there are bright spots—forward-thinking programs, standout professors, and a handful of institutions that are starting to get it.
Ohio State University, for instance, has launched a campus-wide AI Literacy initiative that will be required for all undergraduates starting in 2025. Students will engage in foundational coursework and ethics seminars designed to foster responsible, hands-on AI engagement. At the University of Michigan's Ross School of Business, faculty have embedded generative AI into the curriculum, creating custom tools for students to use AI to solve real business problems. These are promising steps, but they remain outliers.
For most schools, though, AI sits at the edge of the workflow, not at the core. Some schools are slapping AI labels on existing courses as little more than a branding exercise. That will not create savvy AI natives. The majority of students graduate without ever formally engaging with the tools and technologies that will shape their careers. Administrators and trustees (plot twist: one of us is one) have been timid about stepping into what has traditionally been the exclusive domain of the faculty. Faculty rigorously defend their academic freedom, which includes control over how and what they teach in the classroom, and this is enshrined in the governance of most institutions of higher education. In many places, this has led to the same knee-jerk reaction seen with other recent disruptions: any LLM use is called cheating and prohibited.

But think you can catch when AI is being used? You probably can't, at least not reliably.
A 2024 study of six major AI detectors found that these tools are less than 39.5% accurate on average, plummeting to 17.9% accuracy when text is deliberately manipulated. That leaves faculty with few good options. Should they spend hours reviewing content that was generated in seconds? Or try to use the tools themselves, often without adequate training or institutional guidance?
A recent case at Northeastern University brought this tension into focus. A student requested a tuition refund after discovering her professor had used ChatGPT to generate grading feedback, inadvertently including the prompt in the returned comments. The student's logic? If AI is doing the teaching and grading, what justifies the tuition? After all, ChatGPT Plus is $20/month, far less than what students are paying for a human education.
Schools have to adapt. Very quickly.
This isn’t a call to abandon foundational knowledge. This is a call to evolve. To integrate. This is a call to accept that AI is here, and it’s not going anywhere. Universities should be helping students become fluent in the language of the future. If things are to change, faculty and administrators need to open a dialogue about a shared plan for where AI fits on their campus. Academia’s reluctance to embrace this shift doesn’t prevent it; it just means students are going it alone and learning the wrong lessons.
And learning the right lessons is more important than ever: witness the mass layoffs in the once scorching-hot tech sector, and what is starting to happen to MBAs. Is AI the cause, or is it economic uncertainty? We may not know exactly why yet, but it is still our responsibility to get this right for our students and prepare them for the rest of their lives.
In much less time than we think, every 6-year-old will have a PhD-level assistant in their pocket. And not just a PhD in physics, but a PhD in everything, since it will hold most of the world’s knowledge in its memory. This looming reality is going to upend higher ed as we know it. How is still unclear, but the status quo ante is not on the cards.
The institutions that get ahead by reimagining this new world will thrive. The ones that don’t will become increasingly irrelevant.
Students know it.
Employers know it.
The question is: do universities know it?
Thanks for reading! We’d love to hear from you. Please leave a comment and tell us your thoughts on AI in education.
Want to dive deeper? Check out our book BUILDING ROCKETSHIPS 🚀 and continue this and other conversations in our ProductMind Slack community.