COLUMBUS, Ohio – Artificial intelligence is rapidly becoming a common feature in higher education, particularly in admissions and campus operations. From tutoring platforms to predictive analytics and admissions chatbots, universities are encountering an expanding marketplace of AI-based tools. Yet experts caution that the growing wave of products requires careful scrutiny before adoption.
At the National Association for College Admission Counseling’s recent conference, a panel of specialists in higher education technology stressed that the first step in responsible AI implementation is to define the problem clearly. Without doing so, colleges risk purchasing flashy but ineffective tools.
“Define what your AI use case is, and then find the purpose-built tool for that,” said Jasmine Solomon, senior associate director of systems operations at New York University. She warned against repurposing generic AI systems for needs they were not designed to meet. The result, she said, is often disappointing or misleading.
Asking “Why” Before Buying
Solomon and other panelists emphasized that institutions should first question whether AI is even the right solution. “How does AI solve this problem better?” she asked. “Maybe your team or the tools you already have can address the issue. Maybe you don’t need an AI tool for this.”
This perspective reflects a growing call for administrators to evaluate tools not only by their technological promise but also by their practical fit within the institution. Panelists recommended that leaders consider who will use the tool, the potential privacy risks, and whether the advertised features are truly functional. Some AI products, Solomon noted, are sold with features that are still in development or effectively in beta testing, leaving institutions to discover shortcomings after rollout.
Ethical and Operational Challenges
Beyond product quality, there are broader challenges involving compliance, staff responsibilities, and institutional workflows. Becky Mulholland, director of first-year admission and operations at the University of Rhode Island, urged institutions to understand how AI fits into existing structures.
“For those who are considering this, please make sure you’re familiar with those aspects,” Mulholland said. “We’ve seen this not go well in some other spaces.” She pointed to issues such as data storage, privacy concerns, and the presence of AI-related stipulations in collective bargaining contracts. Ignoring these factors, she argued, could expose colleges to legal, ethical, and labor complications.
The Environmental Cost of AI
One challenge that drew particular attention is AI’s environmental footprint. Research shows that AI-powered search engines can consume up to 30 times more energy than traditional search tools. In addition, massive data centers that support AI require vast amounts of water for cooling.
While panelists admitted there are few institutional-level solutions to these problems, they stressed the importance of awareness. Mulholland noted that future scientific innovations may enable more efficient use of energy and water, but the technology is not there yet.
Solomon added that user behavior plays a role. When individuals repeatedly prompt AI tools to refine answers, the cumulative energy demand increases significantly. She argued that improving user training in areas like prompt engineering could reduce unnecessary strain on resources.
Balancing Innovation and Responsibility
The discussion highlighted the complex balance universities face as they adopt AI. The promise of efficiency and innovation is undeniable, but so are the risks tied to ethics, labor, privacy, and sustainability. Both Solomon and Mulholland agreed that thoughtful planning, oversight, and ongoing audits are necessary if AI is to serve higher education responsibly.
In the end, AI is not a plug-and-play solution. It requires institutions to ask difficult questions, evaluate risks, and maintain accountability long after implementation. Responsible use, the panelists argued, depends less on the technology itself and more on the careful human work of guiding it.