How John Carroll University Is Preparing Students to Lead in an AI-Driven World

Research & Innovation
JCU faculty attended the AI Builders Forum in Rome where they met with prominent AI thought leaders and researchers from across the globe.

In a classroom in John Carroll University’s College of Health, small groups of counseling students are deep in discussion.

One group is talking through how to support a teenager struggling with anxiety, while another is debating how to help a couple navigate conflict. Across the room, students are weighing what questions they should ask next and what ethical considerations should guide the conversation forward.

This is what good professional preparation has always looked like: students learning together, applying what they know, and preparing to work with real people.

But look a little closer, and you’ll see that something new is happening here. The “clients” that students are discussing were custom-created by their professor using artificial intelligence.

It might seem like a shortcut—but it isn’t. AI isn’t making the work easier; it’s challenging the students in a meaningful way and deepening their learning.

Professor Andrew Intagliata uses AI in his Family & Relationship Counseling courses to create fictional clients—individuals, couples, and families whose stories evolve throughout the semester and align with the developmental and counseling concepts students are reading about in their textbook and discussing in class each week.

Students work in the same small groups all term with the same simulated clients, gathering new information as the course progresses. Each week, they learn new details—just as they would in a real counseling session—and must interpret what they are seeing, decide what matters, and determine how they would respond as professionals.

Intagliata designed these simulated clients himself using the University’s supported AI platform, Gemini, which allows him to closely align scenarios with course material and adapt them to student interests or timely news issues relevant to the class. He uses the data-protected “walled garden” environment provided by John Carroll to keep the work secure. This means student and faculty data are not reviewed by outside companies or used to train commercial AI models, so he can innovate while modeling responsible and ethical AI use. And while Intagliata uses AI to help generate these evolving scenarios, he reviews and refines every output before sharing it with students.

While the technology helps create an interactive learning environment, it’s the professor—with years of clinical expertise and professional judgment—who shapes the experience and guides students as they work toward becoming counselors themselves.

The result is not passive use of AI. Instead, students practice some of the hardest parts of their future careers before they ever set foot in the workplace: thinking critically, making judgment calls, and working through situations where there is no perfect answer.

Why John Carroll Is Leaning Into AI Now

Across higher education, educators are grappling with how to respond to AI’s growing impact on professional work. At John Carroll, the focus is on a deliberate approach to student formation: how do we prepare graduates not just to use these tools, but to think critically, exercise sound judgment, and act ethically in an AI-enabled world?

The University’s response is grounded in four guiding principles that shape our academic strategy:

  1. We recognize that AI has fundamentally changed the environment in which college students learn.
  2. We form critical thinkers and ethical actors.
  3. We prepare students for an AI-enabled workforce.
  4. We affirm an approach to AI rooted in our Jesuit values.

And these principles aren’t just words on a page. They come to life in classrooms like Intagliata’s, where professors model ethical AI use and students learn to treat AI as a tool that helps strengthen the human-centered skills their professions will increasingly demand.

Teaching Students What AI Can’t Replace

Professor Intagliata and his students aren’t alone in this work.

Just down the hall, Professor Melissa Smith is tackling similar questions from a different angle. A counselor educator, clinical supervisor, and scholar whose research focuses on the intersection of counseling, technology, and ethical professional formation, she asks herself a simple but important question as she prepares her courses: If AI is changing parts of professional work, what should she, as an educator, emphasize even more?

Her answer is rooted in the reality of a rapidly changing work environment. In many clinical settings today, AI-powered electronic health record systems already assist with documentation and treatment planning. Rather than ignoring that reality, Smith has adjusted her teaching to reflect the change. Her students learn how these systems work, but she is intentionally shifting more time toward the professional skills technology can’t automate.

“Now, instead of a two-class overview of how to write progress notes, we can use that time to teach more important skills that can’t be automated, like how do you build empathy and trust and make ethical decisions,” she explains. “These are the skills that make us irreplaceable. AI can't replace us if we're focusing on human connection.”

Her students are learning how to use AI tools. But they’re also learning how to question them, review their outputs carefully, and take responsibility for the final professional judgment.

This reflects John Carroll’s second guiding principle: forming critical thinkers and ethical actors. The goal is not adoption for its own sake, but thoughtful integration that strengthens student learning.

Preparing Students for Their Future Careers

John Carroll’s third guiding principle focuses on workforce preparation. AI literacy is quickly becoming a baseline expectation in many professions, and the University is working to ensure students graduate ready for that reality.

Through the AI Faculty Fellows program, faculty members across disciplines are working with department leadership to ensure every academic program includes at least one meaningful AI learning outcome connected to professional practice. This means students are learning:

  • How AI is used in their discipline
  • What responsible use looks like in their field
  • Where professional judgment must guide technology
  • How to collaborate with AI rather than rely on it

Melissa Smith sees this preparation as essential for healthcare students. “Our graduates are entering workplaces where AI already exists,” she says. “Our responsibility is to make sure they’re prepared to engage with it thoughtfully, ethically, and confidently.”

A Return to Relationship-Rich Learning

While headlines warn that AI is replacing human thinking or threatening real learning with shortcuts, something very different is actually unfolding in John Carroll classrooms.

Sean Hansen, Dean of the Boler College of Business, sees John Carroll’s thoughtful implementation of AI as accelerating a return to the kind of education the University has always valued, leading to deeper engagement rather than surface learning. “I think we're seeing a real return to the Socratic method,” he explains. “Because the mere facts professors once delivered are now readily accessible, our role is increasingly to guide students through active exploration of what these tools mean for their professions.”

Ultimately, the goal is not simply to teach students how to use artificial intelligence, but to form the kind of professionals who know when (and how) to use it wisely. It’s a shift that places greater emphasis on discussion, mentorship, and guided inquiry—all hallmarks of a classic Jesuit education.

A Jesuit Approach to AI

John Carroll’s fourth guiding principle reflects the Jesuit ideal at the heart of our educational mission: that AI education must remain rooted in human dignity and the common good. This means the University’s focus is not simply on policies or restrictions, but on helping students develop judgment, responsibility, and purpose.

As Edward Peck, Ph.D., Vice President for Mission & Identity and co-chair of the University’s AI Steering Committee, explains, “For a Jesuit university, the question around AI is not just what these tools can do, but who our students are becoming as they use them. Our work is about accompanying students as they develop the discernment, character, and sense of purpose needed to ensure technology serves humanity. We’re not just preparing students to work with intelligent machines; we are forming the people who will shape how those tools are used.”

Building a Culture of Thoughtful Innovation

What’s happening in classrooms across campus is supported by a broader institutional effort. Faculty development workshops, interdisciplinary AI conversations, and national partnerships with organizations like the AAC&U and the Council of Independent Colleges are helping John Carroll refine its approach while staying connected to emerging best practices.

The goal is not isolated innovation but a campus-wide culture where faculty share ideas, experiment thoughtfully, and support one another as they prepare students for what comes next.

What This Means for Students

For students, the impact is already clear.

They’re not simply learning about AI. They’re learning how to use it thoughtfully as part of their professional development. Along the way, they’re developing the ability to:

  • Use it to deepen their thinking
  • Question its outputs
  • Apply professional judgment
  • Strengthen the human skills their careers will demand

In Professor Intagliata’s classroom, that preparation means students graduate with experience navigating difficult conversations and complex human situations much like those they will face in their professional lives. And that’s not because AI made learning easier, but because it made richer learning possible.

Education for a Changing World

At John Carroll, our approach to AI is not about rushing to keep up with technology or reacting to headlines. It is about preparing students for a world that already exists, one where technology will continue to evolve and careers will continue to change alongside it.

In that environment, the most valuable skills will remain deeply human ones. That is why our AI strategy is intentional: guided by thoughtful teaching, careful leadership, and a commitment to engaging AI in a way that reflects the University’s mission and Jesuit values.

Because ultimately, the goal is not simply to teach students how to use artificial intelligence, but to form the kind of professionals who know when—and how—to use it wisely and ethically.
