The Future is Human - Even With AI

By Sloan MacRae, March 30, 2026
Ithaca College is charting a values-driven, student-centered path through the most transformative technological shift in a generation.

A Clear View

If you spend any time talking with David Weil ’87, MS ’89—IC’s senior vice president for strategic services and initiatives and chief information officer (CIO)—you quickly learn Cayuga Lake is one of his favorite places. He references it often as a source of perspective, a place where calmness and clarity naturally coexist. When Weil describes how he approaches complex technological change, he returns to the qualities the lake embodies—stillness, depth, and the ability to see patterns clearly once the surface settles. Of all the technological shifts he has helped IC to navigate, artificial intelligence (AI) is likely the most sweeping. Even so, Weil doesn’t treat it as a crisis to brace for or a spectacle to chase but as a fast-moving current that becomes navigable when you understand what’s happening beneath.

AI, he would admit, is not calm water. Ithaca College’s approach to AI is shaped by that same instinct for clarity: begin with people, listen longer than you speak, understand before you act, and make decisions that will hold up not just this semester but years from now.

When Weil talks about AI, he doesn’t start with tools or trends. He starts with students: “I think about the student who shows up at a job interview two years from now,” he said in our conversation last winter. “If they haven’t been exposed to AI, if they’re afraid of it, if they don’t understand how it fits with their craft—they’re going to be at a disadvantage.” He paused, choosing his words with the measured precision that colleagues across campus know well: “AI isn’t something you can fence off and hope it won’t reach you. It’s here. And it’s accelerating.”

Across higher education, early reactions to generative AI, which creates content like text and images, ranged from panic to prohibition. Some banned new tools outright. Others issued quick declarations about academic integrity. Many treated the technology as a passing disruption rather than a structural shift. At Ithaca College, the response is taking a different shape, one that is human-centered, transparent, and rooted in the same clarity that Weil draws from Cayuga Lake.

The Moment We’re In

Artificial intelligence is not the first major technological shift to reach Ithaca College, but it may be the first to arrive everywhere at once: in syllabi, residence halls, advising meetings, coursework, administrative workflows, and campus governance.

By early 2023, while the national conversation oscillated between alarm and hype, Ithaca College faculty and staff began gathering informally to understand what AI meant for teaching, learning, and institutional operations. Those early conversations evolved into the Presidential Working Group on Artificial Intelligence, convened from August 2024 through May 2025 with faculty, staff, and student representatives from across the campus.

What distinguished the working group was not its speed but its posture. It did not begin with fear. It began with listening to the people AI would affect most. That orientation was reflected in the group’s own framing of its charge. As the working group noted in the introduction to its final report, the members approached artificial intelligence with both openness and restraint: “From the outset, we acknowledged that AI has the potential to transform how Ithaca College operates, streamlining services, expanding access to information, and enabling proactive decision-making. At the same time, we recognized the risks of adopting AI without adequate oversight, particularly when it comes to data privacy, equity, and trust.”

The group, chaired by Casey Kendall, IC’s deputy CIO, spent months gathering input from:

• faculty seeking clarity without sacrificing academic freedom or creative autonomy.

• staff navigating opportunities as well as risks.

• students unsure where the ethical lines were.

• skeptics raising concerns about privacy, labor, and surveillance.

• sustainability advocates focused on AI’s energy and environmental costs.

• creative practitioners wary of replacement rather than enhancement.

This listening phase shaped the framework eventually adopted by the president and Executive Council in 2025, one that emphasized a human-centered approach, responsible use, equity, transparency, and alignment with IC’s mission. For Weil, the process of working collectively to understand the implications and impact of this new technology echoed other major technological transitions he has helped the college navigate over his three decades of work at IC: the use of computers to help with teaching and learning, the rise of networked computing, the growth of cloud collaboration tools, the evolution of cybersecurity, and the expansion of a student technology workforce that has become a defining strength of his division. But none, he believes, carry the same broad implications as AI.

Seven Questions: Building IC’s Collective Approach

Late in fall 2023, as Weil listened to conversations across campus and across higher education, he realized that there were seven key questions that Ithaca and other institutions needed to grapple with to help better understand the impact of AI on their campuses:

1. How do we provide the skills and build the critical thinking competencies that students need to be prepared to enter a workplace that will use AI in key processes and tools?

2. How can AI enhance our students’ experience?

3. How can AI improve the effectiveness and efficiencies of our processes and support our strategic success?

4. What is the impact of AI on the skills, roles, and organizational structures of our workforce?

5. What AI-related academic, research, ethical, data, privacy, acceptable use, intellectual property, and other policy issues should our institution explore and develop?

6. How do AI and its use align with our institutional philosophy, approach, and core values?

7. What opportunities, drawbacks, and potential threats does AI present for the fundamental missions of teaching, learning, and research at our institution?

“You can’t talk about AI responsibly without talking about sustainability.”

David Weil ’87, MS ’89, senior vice president for strategic services and initiatives and chief information officer

Answering these seven questions would drive Ithaca College’s AI exploration and ensure that it was not shaped by any one office, one technologist, or one philosophy. It would be shaped by many.

That philosophy was echoed in President La Jerne Terry Cornish’s charge in July 2024, when she introduced the Presidential Working Group on Artificial Intelligence: “Establishing this working group signifies our commitment to integrating AI into our work and activities in ways that reinforce our educational mission and uphold our core values.”

Rather than jumping to policies or pronouncements, the group began by exploring the foundational questions about student experience, workforce readiness, ethics, privacy, sustainability, governance, and operational effectiveness. Those questions, and the community input they drew, eventually shaped the “Guiding Principles for the Use of AI” adopted by the president and Executive Council in summer 2025. Questions related to curriculum and classroom practice remained intentionally outside the scope of the Presidential Working Group. Those conversations were led by faculty, through work facilitated by the Center for Faculty Excellence in partnership with the Center for Instructional Design and Educational Technology, reflecting a range of perspectives on AI in teaching and learning.

By beginning with “How do we ensure our choices about AI reflect who we are?” rather than “Which AI tools do we need?” the college’s working group strategy was clear: the future will not be decided by tools but by the clarity and judgment of the people using them. This approach anchors IC’s AI framework by being measured, inclusive, values-driven, and distinctly Ithaca. It also acknowledges the real and valid concerns that continue to shape the conversation around AI.

The Concerns Are Legitimate

Weil doesn’t shy away from concerns over AI; in fact, he talks about and validates many of them before he makes the case for innovation. This validation doesn’t come from a sense of preemptive defensiveness but because he believes honesty is the only position from which a community can build trust and address the issues. In Weil’s view, these concerns need to be addressed and taken seriously:

AI cannot (and should not) replace the human element. At the top of the list are concerns voiced by faculty, students, and staff about AI replacing what makes Ithaca College distinctive in the first place: human insight and connection, intellectual curiosity, and a sense of community. As the college looks toward a future that includes AI, preserving that humanity remains essential. The adoption of AI tools must support—and ideally strengthen—human-to-human engagement, not diminish or replace it. Weil argues that when used thoughtfully, responsibly, and ethically, AI can do just that, freeing people to focus on deeper relationships, more meaningful interactions, and the work that benefits most from a human touch. But that potential depends on constant vigilance about how and where these tools are applied.

The environmental cost isn’t abstract. Campus sustainability advocates raised the alarm early: training AI models can require immense amounts of energy and water depending on the use case. Weil didn’t downplay it: “You can’t talk about AI responsibly without talking about sustainability.” The Presidential Working Group agreed, naming environmental transparency as a core principle and urging the college to weigh benefit against impact in every adoption decision. But Weil hasn’t limited these concerns to internal discussions. He has used his voice to elevate the conversation more broadly. In a recent article he authored for the EDUCAUSE Review, he challenged AI solution specialists to “provide greater transparency and more usable data on energy consumption so that institutions can make responsible decisions about how they architect AI-powered systems and when it is appropriate to use specific tools or alternatives.”

AI is having an impact on the creative arts. Across all five schools—not only in the Park School and the School of Music, Theatre, and Dance—students and faculty raised concerns about the growing use of AI to replace artists and traditional creative professionals. As these tools have become more capable of mimicking voices, performances, visuals, and written work, they have raised fundamental questions about what it means to create—and about the future roles our students may occupy in creative fields. At the same time, those technologies are opening new avenues for expression, enabling more people to develop and share creative ideas in ways that were previously out of reach.

Anxiety about entry-level work is real. Students in many majors, including media, design, business, and communications, raised worries about AI tools displacing early-career roles, precisely the work areas they expect to enter. Weil agreed, emphasizing the importance of helping students understand how AI is shaping the fields they are entering—and how to articulate the value their “human intelligence” (their judgment, communication, collaboration, critical thinking, and empathy) brings to the workplace. While some tasks will change, new paths will undoubtedly emerge, and preparing students for those paths is central to Ithaca College’s strategy. Many of the skills IC cultivates—leadership, collaboration, creativity, problem-solving—remain fundamentally human strengths that AI cannot replace. As history has shown, when work changes, opportunity doesn’t vanish; it shifts.

Overhype and panic overshadow nuance. “We need to get past one-sentence reactions,” Weil said. Too often people talk in absolutes: “AI is bad.” “AI is good.” Instead, we must continue to engage in nuanced discussions: What aspects of AI are harmful? What use cases add value? What is the cost if we don’t use AI in a situation? One of the working group’s earliest conclusions was that IC needed better questions, not louder answers. The group’s framework has discouraged both tech-solutionism and blanket fear.

Access and equity cannot be afterthoughts. Without shared literacy, AI widens gaps. The working group’s “Guiding Principles for the Use of AI” explicitly names equitable access as a priority, and the deliverables promote campus-wide education rather than exclusive expertise. “At a place like IC,” Weil said, “we don’t accept a future where some students get left behind.” This insistence on clarity and transparency led directly to IC’s strategic posture, one President Cornish articulated plainly: “AI is reshaping our world in ways we cannot ignore. Our obligation is to ensure that Ithaca College students are not simply reacting to that change but prepared to lead in it.” No mere slogan, this principle became the philosophical center of the entire AI framework. Where other institutions rushed toward sweeping bans or sweeping adoptions, Ithaca chose something different: an approach that, like a lake on a fine day, is steady, clear, and reflective.

Future-Facing, Not Future-Chasing

The refrain, “Technology should amplify human relationships, not replace them,” is embedded throughout Ithaca College’s approach. This human-centered stance, formalized in the college’s “Guiding Principles for the Use of AI,” commits the college to AI use that enhances connection and learning rather than automating them. It encourages:

• faculty autonomy in determining pedagogical fit.

• student creativity and agency.

• ethical reflection embedded across disciplines.

• mentorship as a defining feature of an IC education.

• hands-on exploration rather than passive consumption.

“This isn’t about automating teaching,” Weil said. “We’re trying to support it.” That perspective anchors campus-wide decision-making around AI.

One of the most influential concepts to emerge that Weil often emphasizes is the ability for AI to provide a longitudinal view of the student and their journey. As Weil explained, “Our students do not engage with us in a vacuum. Their interactions flow across the entire institution, from the dining hall to the classroom, the gym, the library, and into the residence halls. These discrete moments combine to form each student’s unique journey. AI offers a way to connect those moments. It can integrate the totality of a student’s experiences across the institution to help inform decisions, anticipate needs, and enhance both engagement and success.”

The power of AI in this context lies not in any single data point but in its ability to recognize patterns and context over time. This longitudinal view—when guided by human judgment, ethical boundaries, and respect for privacy—can transform the student experience from reactive to anticipatory, and from transactional to deeply personal.

It is important to work with students to equip them to take a broad and long-term view of AI. Generative tools will not just shape students’ lives for a semester; they will shape careers, industries, and civic life for decades. In the Presidential Working Group’s own language, AI requires institutions to “enhance student readiness for an AI-integrated future” and address the “impact of AI on workforce skills and roles.”

Short-term, reflexive reactions do not prepare students for an AI-infused world, but long-term clarity does. For the working group, clarity includes helping students understand how AI intersects with their fields, whether these students are marketing majors training with prompting frameworks, writers experimenting with revision tools, or physics students exploring modeling systems—and the list goes on. The common denominator is agency, not avoidance.

Scale is a consideration. Weil was refreshingly direct about IC’s size and resources. IC is not a massive university with expansive AI research budgets—and that, he argues, can be a strength rather than a limitation.

In practice, Weil has long championed the idea that we don’t have to chase every shiny object. The working group’s framework reflects that ethos, advocating for:

• targeted pilot programs to learn from rather than starting with institution-wide rollouts.

• low-cost or open-source experimentation that leverages our existing infrastructure and tools.

• faculty mini-grants to explore practical instructional applications.

• collaborative workshops that promote community learning.

Of course, the return on investment (ROI)—and not only in terms of dollars—is key. If a tool cannot improve student learning, strengthen support systems, increase accessibility, or meaningfully reduce administrative burden, IC’s strategy suggests it isn’t worth adopting. The working group’s executive summary emphasized “outcomes grounded in institutional values” not vendor promises. Weil echoed this sentiment: AI should be integrated only when it meaningfully enhances the human work of teaching, advising, supporting, or creating.

The ROI conversation also protects against inequity. If only some students or faculty understand AI’s value, the technology can deepen divides rather than broaden access.

Perhaps even more importantly, the working group does not consider ethics a box to check or an appendix to the strategy. It is the anchor. Questions about bias, data provenance, environmental impact, algorithmic accountability, and equitable access are not deferred to later committees but shape decisions from the start. In this way, IC is not positioning itself as an early adopter of every tool. It is positioning itself as an early adopter of responsible adoption.

The heart of IC’s strategy reflects something essential about the institution: when Ithaca College moves with intention, it moves together. The participants in the Presidential Working Group reflect that ethos and include:

• faculty from writing, physics, and strategic communication.

• staff from advancement, dining, and employee development.

• student success and student affairs leaders.

• diversity and belonging practitioners.

• a student majoring in mathematics and computer science.

• IT staff providing facilitation, context, and infrastructure insight.

Weil’s role as executive sponsor was not to dictate outcomes but to ensure the group had the context, tools, and access it needed—and to bring national conversations (through his EDUCAUSE work and recent board election) into IC’s local reality.

The result was a set of deliverables that now anchor the campus-wide AI approach: a governance process, guiding principles, use-case pathways, and a philosophical foundation tied unmistakably to IC’s mission and its approach: not resistance, not blind acceleration, but intentional movement shaped by values and the needs of the students who will inherit this future.

Navigating Currents

When Weil talks about the future shaped by AI, he avoids easy binaries. It will neither save higher education nor destroy it, neither erase human creativity nor render expertise obsolete. Instead, he frames the future as something far more textured, a landscape in which human judgment, ethical reasoning, and creative insight become more valuable.

Many of the shifts he anticipates echo concerns raised across campus during the Presidential Working Group’s listening sessions. Others reflect what he has observed through national leadership roles and conversations at EDUCAUSE and elsewhere. Taken together, they form a picture of a future that is challenging, fast-moving, and full of possibility—provided institutions are willing to prepare thoughtfully.

Weil sees this work happening across four pillars that the college has identified for AI exploration and adoption over the next few years:

1. Student success: Ithaca College will explore AI tools that provide insights to support each student’s academic journey with greater precision, personalization, and timeliness. These tools will enable earlier interventions, more relevant guidance, and increased capacity for student-facing staff to focus on meaningful, relationship-based work. Weil emphasized that these tools will be used to inform, not replace, the care and insights provided by faculty and staff.

2. Student experience: As students navigate their college experience, they may encounter small incidents of friction—missed information, delayed responses, and moments where systems lag behind student needs—that can add up over time. AI can help to improve the overall campus experience by making services more responsive, information easier to access, and campus interactions more personalized, reducing this friction and improving student experience and success.

3. Student preparation for an AI-powered workplace: Ithaca College is working with faculty to help graduates thrive in a world shaped by AI by embedding AI literacy in various disciplines; deepening understanding of ethical, sustainability, and societal implications; expanding applied learning and new academic programs; and strengthening partnerships that offer hands-on experience with real tools. Equally important, students must be able to articulate their “human intelligence,” which becomes even more essential as AI reshapes early career roles.

4. Operational innovation, effectiveness, and efficiency: The college will leverage AI to streamline institutional operations, enhance planning and decision-making, and free staff for human-centered work. By automating routine tasks and improving access to reliable information, the college can reduce administrative friction and deliver clearer, faster, and more consistent service to the campus community.

The college’s readiness does not stem from having the “right” tools (those will change) but from having the right conditions:

• a collaborative governance structure built through the Presidential Working Group.

• guiding principles that emphasize ethics, transparency, human connection, and mission alignment.

• an IT division that leads with people, not products.

• faculty who value experimentation alongside integrity.

• staff who see AI as a means to improve support and equity.

• students who are curious, adaptable, and eager to apply new skills.

Human-Centered Implementation

AI may be global, but its implementation is local. At Ithaca College, the local context is strong: a scale that encourages collaboration, a culture rooted in mentorship, and a mission that prioritizes hands-on learning shaped by values.

At Ithaca College, the future of AI won’t be guided by velocity or spectacle but by a steady way of seeing—one that looks past the churn, focuses on what endures, and keeps human judgment, curiosity, and connection at the center.

Steady Leadership in a Time of Change

David Weil ’87, MS ’89, would be the first to say that Ithaca College’s AI approach is not “his” plan. His instinct is always to redirect attention to the people around him: staff colleagues, faculty innovators, cross-campus collaborators, and especially the students who help support much of IC’s technology ecosystem. But understanding the college’s direction also means understanding the steady, human-centered philosophy of the person who has advised, facilitated, and translated so much of this work.

Weil has spent virtually his entire professional life at Ithaca College, an uncommon path in higher education and one that speaks as much to the institution’s values as to his own. Over more than three decades, he has held roles across academic technology, instructional support, systems integration, customer service, and now strategic leadership. Few people have witnessed—and helped shape—so many eras of technological change at IC. Through each transition, his approach has remained remarkably consistent: listen carefully, act thoughtfully, and put people at the center of every decision.

One of the signature initiatives shaped by Weil’s leadership is Educational Technology Day, an annual regional gathering he helped establish more than three decades ago. What began as a modest campus event has grown into a long-standing forum that brings more than 1,000 participants and dozens of national vendors to Ithaca each spring to explore the evolving relationship between technology and education. Now entering its 32nd year, the event reflects Weil’s belief that progress is communal, not solitary. Ed Tech Day embodies the way he leads: convening people, sharing knowledge, and creating spaces where ideas can take shape across roles, disciplines, and institutions.

Colleagues often point to Weil’s calm presence, collaborative instinct, and deep institutional memory, all of which shaped his leadership of the Presidential Working Group on Artificial Intelligence. His national work with EDUCAUSE, the leading professional association for technology, data, and digital transformation in higher education, including his recent election to its board of directors, brings a broader lens to IC’s local conversations. That perspective helps the college situate its AI strategy within national trends without losing sight of its own scale, values, or mission.

One throughline in Weil’s career has been his commitment to developing people, especially students. The IT student workforce he helped grow is now a defining strength of his division, launching graduates into careers in technology, instructional design, media systems, and cybersecurity. As he put it in our conversation last winter, students learn “real skills working with real users,” gaining confidence in ways that cannot be replicated through coursework alone. That belief, that technology should expand human capability rather than overshadow it, shapes every corner of IC’s AI strategy.

Weil doesn’t claim to have all the answers. What he brings instead is something quieter but more durable: a conviction that institutions navigate disruption best when they cultivate trust, empower people to ask better questions, and keep relationships at the center of technological change. It is that leadership style—steady, collaborative, and grounded—that continues to influence how IC is preparing its community for an AI-shaped future.

The Gift That Catalyzed AI at IC

For John Neeson ’84, chair of Ithaca College’s Board of Trustees, the decision to support the college’s work around artificial intelligence (AI) grew out of a clear-eyed assessment of where the college stood. The vision was taking shape. The questions were the right ones. Neeson’s $520,000 gift provided the funding to launch this work in earnest and sustain a coordinated, campus-wide approach.

“I feel like we’re at this next frontier,” Neeson said. “I would love nothing more than my college, Ithaca College, to be at the forefront of AI, for our students to come out with skills they can apply that they’ve learned in college like no other college has done. We make the education even more special by applying AI thoughtfully, and that’s the real key: thoughtfully.”

When David Weil ’87, MS ’89, shared how the college was approaching AI—as an institution-wide responsibility that would require governance, ethical boundaries, and long-term stewardship—Neeson recognized the kind of work it would take to do this well.

Neeson’s perspective was shaped by decades working at the intersection of technology, strategy, and marketing. As a cofounder of the global business-to-business research and advisory firm SiriusDecisions, he helped organizations move beyond hype to define practical frameworks for how technology supports decision-making, accountability, and long-term value—experience that informed why Ithaca College’s measured, values-driven approach resonated with him and earned his support.

“I would love nothing more than my college, Ithaca College, to be at the forefront of AI.”

John Neeson ’84, chair of Ithaca College’s Board of Trustees

A core part of IC’s approach is a belief in student agency. For students, Neeson said, the goal with AI is to “incorporate it, not as something that’s doing your work, but as something that’s actually helping your work to be even more effective, a partner that can help you create ideas you can build upon. It can help you get to data and information that you can use more effectively and be more accurate.”

That framing—AI as a partner rather than a shortcut—matters in the classroom and in the work students will go on to do. It reinforces how students learn to think critically, how they test ideas, and how they understand their own value in an AI-shaped workplace. It also reflects a broader philosophy: education becomes more powerful, not less, when technology is applied thoughtfully.

For Neeson, that philosophy carries an implication: Institutions can allow technologies like AI to shape them by default—or they can decide, deliberately, how those tools align with who they are and how they teach. “We have a choice as a college,” Neeson said. “We can wait for it to happen, or we can get ahead of it and make it part of what we’re all about. I’d rather the latter happen.”

The work on AI is still unfolding, and it is intentionally collective. As the college continues to build a human-centered approach to AI, alumni and supporters who share Neeson’s belief in preparation, agency, and thoughtful leadership have an opportunity to take part in this foundational moment—one that will shape how students learn, create, and lead in the years ahead. Contact giving@ithaca.edu to learn more.