For more than a year, we’ve been tracking how AI reshapes program discovery across 100,000+ student journeys. In graduate search, students began asking conversational questions, getting instant comparisons, and forming shortlists before ever visiting a university website. That same pattern is now firmly present in undergraduate search, but the risks are different.
A recent New York Times report describes how high school students and parents are turning to tools like ChatGPT, Claude, and Copilot as “always-on” college counselors, often at the very start of the search process. With school counselors stretched thin and families overwhelmed, AI isn’t just supplementing guidance—it’s actively shaping it.
For institutions that haven’t addressed how their programs appear inside AI-generated answers, two risks emerge quickly.
Two Risks Universities Face as Undergraduate Search Moves into AI
What makes this shift different for undergraduate recruitment is when and with whom control is being lost. These AI-driven conversations are happening at the very earliest stage of the student journey, and often with both students and parents, who are frequently the most influential decision-makers. By the time a student reaches a university website or speaks with admissions, assumptions about fit, affordability, and eligibility may already be set.
Risk #1: Other Universities Make the Shortlist Before Yours
AI is becoming the first “counselor” families turn to. The New York Times cites an average of 376 students per school counselor, helping explain why parents and students rely on AI: it’s immediate, always available, and feels authoritative.
Students and parents are asking questions like: What schools fit my GPA and interests? Which majors lead to this career? What scholarships might apply to us?
If your content isn’t structured in a way AI can clearly parse and cite, your institution may be excluded from those early answers—while competitor schools are named, compared, and shortlisted instead.
Risk #2: AI Misinformation Shapes Expectations Before You Can
As AI tools take on an advisory role, families increasingly trust the answers they receive—even when those answers are wrong. The New York Times highlights examples of AI inventing scholarships that don’t exist and telling students they’re “100 percent” likely to be admitted to ultra-selective schools.
When AI gets details wrong, it distorts expectations around admissions criteria, cost, competitiveness, and eligibility. The result isn’t just confusion—it’s more wrong-fit inquiries, lower yield, and additional strain on admissions teams tasked with correcting assumptions they didn’t create.
Practical Steps Universities Can Take Now
1. Strengthen your website content. Your website is your “source of truth.” AI systems rely on clear, consistent, and authoritative information. Admissions requirements, scholarship eligibility, program outcomes, costs, and policies need to be accurate, plainly written, and easy for AI to reference—otherwise ambiguity invites exclusion or misrepresentation.
2. Optimize for AI-powered discovery, not just traditional SEO. SEO still matters, but AI tools increasingly answer student and parent questions directly, often without sending them to a website. Content needs to be structured so AI can clearly interpret, attribute, and reuse it when shaping those answers.
3. Monitor how your institution appears in AI answers. Universities should regularly test how their programs show up in common student and parent prompts. Identifying gaps or inaccuracies early allows institutions to update content before misinformation shapes expectations at scale.
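For step 2, one common approach to making program content machine-readable is schema.org structured data embedded as JSON-LD. The snippet below is an illustrative sketch only—the university name, URL, dates, and dollar figures are placeholders, and the exact properties any given AI system consumes will vary:

```json
{
  "@context": "https://schema.org",
  "@type": "EducationalOccupationalProgram",
  "name": "Bachelor of Science in Computer Science",
  "provider": {
    "@type": "CollegeOrUniversity",
    "name": "Example University",
    "url": "https://www.example.edu"
  },
  "educationalCredentialAwarded": "Bachelor of Science",
  "programType": "Full-time",
  "timeToComplete": "P4Y",
  "applicationDeadline": "2026-01-15",
  "offers": {
    "@type": "Offer",
    "category": "Tuition",
    "price": "32000",
    "priceCurrency": "USD"
  }
}
```

Markup like this doesn’t guarantee inclusion in AI answers, but it removes ambiguity about the facts—deadlines, credentials, costs—that AI systems would otherwise have to infer from prose.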
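For step 3, a simple starting point is to spot-check captured AI answers against the facts your institution actually publishes. The sketch below is a hypothetical helper, not a product feature: the facts, the sample answer, and the university name are all placeholder assumptions, and the check only flags whether a published value appears verbatim (a “missing” result may mean the AI stated a different figure entirely):

```python
# Illustrative sketch: compare an AI-generated answer against facts your
# institution publishes. All facts and the sample answer are hypothetical.

def check_answer(answer: str, published_facts: dict[str, str]) -> dict[str, str]:
    """Label each published fact 'present' or 'missing' in the AI answer.

    A 'missing' fact is a prompt to investigate: the AI may have omitted it,
    or stated a different (possibly wrong) value.
    """
    results = {}
    for label, value in published_facts.items():
        results[label] = "present" if value.lower() in answer.lower() else "missing"
    return results

# Facts as they appear on the (hypothetical) university website.
facts = {
    "application deadline": "January 15",
    "in-state tuition": "$11,000",
    "merit scholarship": "Presidential Scholarship",
}

# A (hypothetical) AI answer captured from a common student-style prompt.
ai_answer = (
    "Example University's application deadline is January 15. "
    "In-state tuition is about $13,500 per year."
)

for label, status in check_answer(ai_answer, facts).items():
    print(f"{label}: {status}")
```

Run against a standing set of common student and parent prompts, even a crude check like this surfaces the gaps and inaccuracies worth correcting before they shape expectations at scale.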
How Everspring Is Helping Universities Respond
At Everspring, we’ve been working on this shift for nearly two years—ensuring that university websites are cited within AI-generated answers so that undergraduate programs are included early and represented accurately. The focus isn’t chasing AI trends, but aligning institutional content with how students and families are already researching and deciding.
The Bottom Line
Whether institutions are ready or not, undergraduate programs are already being evaluated inside AI tools. If your university isn’t showing up—or isn’t showing up accurately—you’re losing control at the very earliest and most influential stage of the decision process, often before admissions ever enters the conversation.
Want to know how your institution shows up in AI search?
We’ll assess how your programs appear across common student and parent prompts, identify where invisibility or misrepresentation may be shaping decisions, and outline practical steps to improve accuracy and shortlist presence.