AI in higher education: Experts discuss changes to be seen

Gaby Clark, scientific editor
Andrew Zinin, lead editor

Artificial intelligence, for better or for worse, has become an integral part of college students' study habits.
In fact, a September 2025 study by plagiarism detection platform Copyleaks found that most college students use AI, with 53% using it at least weekly.
While students are flocking to AI tools, many universities have failed to keep up with the dramatic shift. A January 2025 report from the Digital Education Council found that while 61% of college faculty have used AI in teaching, institutional policy and training have lagged behind.
The University of Cincinnati is at the forefront of shaping AI policy and equipping faculty and students with AI literacy skills. MidwestCon, one of the region's premier tech conferences, recently hosted its 2025 event at UC's 1819 Innovation Hub, where panelists discussed the role of AI in higher education. Below are some takeaways from the session titled "Brains, Bots and the Battle for Creativity."
Instant gratification has become easier than ever due to AI. Instead of having to think deeply about a solution to a problem, anyone can prompt generative AI and receive an immediate answer.
Panel moderator Padmini Soni cited a recent study, which found that "among students who use ChatGPT to write, their brain connectivity weakens, memory fades and ownership of ideas erode." The session kicked off by asking panelists what this means for the future of learning in an AI-driven world.
James Francis, founder of AI strategy firm Artificial Integrity, responded by noting that, "We have compressed the time between asking and answering … and we have instant information that sometimes can be confused as learning."
In other words, there's a massive difference between giving the correct answer and understanding the why behind it. Authentic learning, the panelists agreed, stems from a trial-and-error exercise that doesn't occur when students copy and paste solutions from generative AI.
"Learning is a process, and this idea of being able to get instant answers is skipping so many vital steps," Francis continues. "Evaluation, judgment, revisions, being able to explain, attribution, things like that … and those are critical skills in learning."
In a higher education environment where ChatGPT can write essays and solve test questions in seconds, evaluation techniques must adapt. The old approach to learning—teachers passing knowledge on to students and assessing their comprehension via exams—doesn't align well with a world transformed by AI.
Panelists discussed a necessary shift in higher education from evaluating students' final products to assessing how students arrived at their outcomes. This would mean replacing traditional exams with the following testing methods that emphasize the learning journey:
- Presentations
- Student-teacher conversations
- Debates
- Ideation walkthroughs
The goal, according to brand studio Bloomology founder Jeneba Wint, is to fairly judge soft skills like "communication, adaptability, flexibility, critical thinking, problem-solving and connecting ideas to each other" with these methods, since they're "the soft skills already existing before AI even came into the room."
All panelists agreed that this set of skills will grow in importance, especially considering AI's ability to perform routine tasks. Evaluating the soft skills valued by tomorrow's employers is critical—and as America's fourth-best school for co-ops and internships, UC is leading the way.
Many college students are comfortable providing prompts to AI and copying its outputs into assignments, thereby avoiding the actual learning process. Francis thinks this is a major problem: "Students are blindly putting in AI's ideas, getting high-fives and turning it in," he says. "One, they didn't go through the learning process. And two, [the answer] might not even be good."
What's missing here is basic AI literacy, and the data supports this. One recent survey of college students found the following:
- 58% of college students feel they lack sufficient AI knowledge and skills
- 48% think they're unprepared for an AI-driven work environment
Students blame higher education institutions for falling short on AI: 80% say their school's integration of AI into teaching, learning and training is insufficient. University of Cincinnati vice president and chief digital officer Bharath Prabhakaran says the school is training Bearcat faculty and students on AI's proper use—as a sparring partner for generating novel ideas—while introducing them to personalized, AI-driven learning tools that help them do so.
"Higher ed really needs to … think about what the student in the future will look like," Prabhakaran says. "How do we equip our students to have the skills to succeed in an AI-first or AI-driven world? And then how do we adapt our sort of pedagogy to achieve that result?"
AI adoption has been a continual learning process for UC's task forces tackling the issue as well. "I don't think anyone's got the answer [for AI in higher education] yet," Prabhakaran states. "Still, we either have to evolve, or the evolution will be done without us."
Used the right way, AI is a brilliant tool for introducing students to perspectives they'd otherwise miss. Since ChatGPT and similar tools can't interpret meaning, the onus is on users to verify that information is accurate and to challenge the perspectives AI presents.
"There's a certain level of self-awareness you can get to with AI," Wint says. "You can challenge your own thinking … you can expand your thinking from using AI as well. It's allowed me to get to levels I wouldn't have been able to get to in my own technique and all my knowledge without it."
This requires an essential ingredient from college students: curiosity. Francis views it as the faculty's job to ignite passions inside students, leading them to use AI as a learning mechanism rather than as an instant-answer machine. "The responsibility is on the teacher," he says. "The accountability is on the student."
"Sparring partner" was a commonly referenced term during this session, and it describes the ideal way for college students to view AI. UC faculty encourage Bearcats to question ChatGPT's answers, leading them to more deeply understand the how and why of their own perspectives. This requires students to look past AI's tendency to excessively agree with and flatter users—a practice known as sycophancy bias.
Incorporating AI into higher education comes with risks, but the potential benefits often outweigh them. Forward-thinking schools like UC have established task forces dedicated to responsibly integrating AI into the college experience, and they're relying on thought leaders in spaces such as UC's 1819 Innovation Hub and the Cincinnati Innovation District for guidance.
One final warning from Francis was that ChatGPT has no inherent values; it simply produces information based on what's already online. That may lead to biases that, if people take AI's answers as facts, could undermine specific groups of people or ideological perspectives.
"Before we start focusing on synthesizing information, we need to get the information right," Francis says. "So unless we have context and unless we have values, then the value of synthesizing that information needs to be questioned."
Basic AI literacy and foundational knowledge are of paramount importance for college students, faculty and staff. That's why the thought leaders at MidwestCon—and those at UC as a whole—are focused on transforming AI from a work shortcut into a valuable tool for higher education.
Provided by University of Cincinnati