“But imagine if we could…”
Part 1 of an interview with Emily Pacheco
As part of my work in this space, I seek to highlight the folks I’ve been in conversation with or learning from over the last few years as we navigate teaching and learning in the age of AI. If you have thoughts about AI and higher education, consider being interviewed for this Substack.
Introduction
Emily Pacheco is an educator and AI ethics advocate focused on the responsible use of emerging technologies in education. She has worked in higher education for over 20 years, guiding students and institutions through the evolving landscape of U.S. education.
Emily currently serves as a community organizer for Imperial College London, based at their Global Hub in San Francisco, where she supports partnerships and alumni engagement initiatives. She is also the founder of the AI in College Admissions community and EdHub.ai, where she provides resources and learning opportunities on the evolving role of AI in education.
Emily is passionate about ensuring AI is used responsibly to enhance access and transparency. She believes in fostering informed discussions about AI’s ethical implications, helping educators and students alike navigate this rapidly changing landscape.
Interview Part 1
Lance Eaton: Let’s get into it. As it relates to AI, what’s your origin story?
Emily Pacheco: I stepped into it the way a lot of people do: with that first “aha” moment where you discover these tools and think, “Let me apply this to something specific I’m doing in my work.” And then you realize, okay, there’s something really cool here.
At the time, my work was feeling pretty stagnant. It was year six of doing the same thing at the same institution, having the same conversations. The processes were all in place, and I’m a person who doesn’t love that. I don’t like it when all the processes get set because it feels boring. When that happens, I either reinvigorate my own work with new projects or I move on to my next role.
I looked at what this technology could do to make my work interesting again. In that exploration, I looked outward to find who are the people already doing this. Not necessarily in AI and education broadly, but specifically in admissions. Who’s doing cool stuff in my niche?
And there was almost nothing. I found Jeff Neill as the face of this conversation, but there just weren’t a lot of resources out there. I even looked to the main organization for college counseling, the National Association for College Admission Counseling (NACAC), where I thought, surely they’d have something. They’ll have a committee. But they didn’t yet have a community looking at edtech, which really surprised me. They offered nothing focused on technology.
When I approached them about such a group, they were keen on me starting something, and because there wasn’t much in this space, it kind of just took off. I joined with a few others who were interested in leading these efforts, and here we are, less than two years later, coming up on 3,000 members.
Finding AI gave me an entirely new way to create a community and help others learn. I’ve always liked good design and quality content, but I haven’t always been able to create it. I didn’t have the tools at my disposal to do so, and gaining access to AI tools let me do so much more creatively. With these tools, I can now create things I feel comfortable sharing with people around me. It really opened up my world.
The first thing I jumped into was presenting at conferences. I hadn’t done much formal presenting before, and with AI, I was able to put together a number of great session proposals, which I submitted to see how they would be received. They were all accepted. Every single one. I presented three sessions at one conference. There was clearly an audience and an interest in the topic.
Since then, I have worked to put out various resources that can help educators begin to navigate this technology in a thoughtful way. I want to create safe spaces where people can feel comfortable starting to have the conversations that need to happen in this new space.
Lance Eaton: I definitely experience that idea that the only way through is through. It reminds me so much of how I ended up in instructional design and faculty development about 16 years ago. Back then, I was just starting to use blogs, wikis, and Web 2.0. Faculty were having similar reactions, and I was full-time adjuncting. I remember thinking, yeah, but let’s use it and figure it out. So I started doing lots of talks and presentations, just exploring what was possible.
It hearkens back to that same energy. And again, clearly, you’ve used it in your own space. That hesitation, that angst, and that part about the student admissions community not having a tech focus: I just find it fascinating.
At one point, I even hoped to teach a course in critical studies in technology and higher education, looking at this strange tension where all of our meaning-making, experience, and work in higher education are digital, yet we don’t fully understand that. We don’t fully understand that in college, regardless of your role, a large piece of our work is about thinking through that digital space: when it shows up, where it shows up, and how it shapes what we do.
Emily Pacheco: Definitely. And there’s sometimes almost an aversion to it, like it’s noble not to be in that digital space. In higher ed, we’re a brick-and-mortar establishment; we sometimes see ourselves as higher value than something that’s just online or digital.
What I’ve found is that a lot of what is happening with AI is going on behind closed doors; it doesn’t get talked about. Almost every university is using artificial intelligence in pretty interesting ways. They’re just not talking about it. Many are paying consulting firms who are then using this technology, and often the offices don’t even clearly understand what the firms they have hired are doing with it. There’s a lot of cool stuff happening, but it doesn’t get shared because there’s mostly still a very negative perception of the technology. Using too much technology in the admission process is seen as taking away from what really matters: the human connection.
A lot of admission offices build that identity. You’ve got a staff of 20 admission counselors creating this façade of doing the human work: boots on the ground, meeting students. But if you look deeper into the machine, they’re paying millions to consulting firms that are actually running these systems. All of that is digital and much of it is now AI, figuring out who applies and why and who ends up enrolling, but we’re not advertising that or talking about it. Sometimes we’re even embarrassed by how it actually works.
Lance Eaton: I think that’s such an interesting dynamic: the way institutions are digitally structured. All the work is digital, and I often think about it like this: imagine you’re a student profile, first created the moment you interact with an institution. That digital profile gets sliced and moved around through all the infrastructure. It ends up in the student information system, the application process, the alumni database.
So that “digital double” goes everywhere. We create these systems, but we don’t really think about them. I tie that to right now because this is another moment where even more of that is happening. And in that context, the push to “center relationships,” whether it’s lip service or a genuine attempt, feels really hard.
Emily Pacheco: It feels very hard, but it also feels like we’re on the cusp of an opportunity. Like you said, there are all these different systems, and we have all this information scattered here, there, and everywhere. We’ve lacked great systems for extracting that data. This is a whole new way where we could have humans actually working with data in more meaningful ways.
I just came from an alumni relations meeting, and they were talking about how they have 19 different alumni groups across the United States. Each one has its own CRM, plus there’s a campus CRM. Information is everywhere. They’re trying to figure out how to bring it all together. And now, with these AI systems, I’m hearing that this is the perfect problem for an edtech company to solve: bringing all of it together, helping us make sense of the data, and uncovering the nuances that tell us what our alumni are doing and where.
That’s the human connection piece: how can we tap into that information in a deeply human way, now that we have this depth of insight and research?
Lance Eaton: Yeah, it’s that question of whether we can leverage these tools to better foster connection. One of my good friends, Rhoan Garnett, his entire dissertation and work focuses on how technology might foster belonging for college students. He’s been studying how to improve matching and mentorship through AI, to help students as they enter new environments.
Especially for students who are first-gen or coming from communities very different from the ones they’re stepping into, he considers how can AI help build those bridges? In that sense, admissions isn’t just about admitting. It’s about facilitating the entrance in a much deeper, more human way.
Emily Pacheco: It’s everything from that first point of contact when a student has just discovered your campus and is starting to look at you, all the way until their butt is in the seat on the first day of class. That’s the goal. It’s not just getting them to apply or even getting them to pay that $500 deposit. It’s getting them to show up and to succeed at your institution. I think the ultimate goal is successful alumni out in the world who are happy and thriving and giving back to their communities, and AI can help university admission offices be much more thoughtful and intentional about how they make this a reality from that first point of contact.
There’s so much potential for these new systems to work with students in entirely new ways. I just heard an example that really stood out. The University of the Pacific noticed they were losing a lot of students who were admitted but never enrolled. These students had shown lots of engagement and lots of communication throughout the admission process, but then they’d get their admission letter, and that would be the last the school heard from them. The school wondered what was happening.
They knew that a large percentage of these admits were Pell Grant recipients and first-generation students. When they reached out to ask what was going on, they learned that those students weren’t getting the guidance they needed. Once the admission letter came, they’d Google, “How much will it cost to attend?” and quickly decide that they couldn’t afford that path. What they didn’t know is that most would qualify for a lot of financial aid that they were unable to see at that point. Some could’ve attended for less than $5,000 a semester.
These students were struggling to understand the complexities of higher ed financial aid and the idea of taking on $70,000 a year in debt, which is what it looked like initially, shut down the conversation completely. So, the University of the Pacific started using a chatbot system designed to support these students more deeply and specifically. It ensured they received follow-up after the admission letter and could get answers quickly. They provided these students with new tools that helped them understand the financial aid system better, earlier on. The chatbot was also made available to families, designed to guide them through the nuances of college admission and ensure they had access to the information and answers needed to better support their graduating high schooler.
There were still humans involved, but at a higher level. The chatbot handled 24/7 availability. If a student wanted to talk to a person, the system would say, “We’ll have someone call you tomorrow.” And they’ve seen those dropout numbers shrink. Students are staying engaged longer, asking more questions, and getting information in new ways. That’s one example of how AI can really expand access.
Lance Eaton: I’d love for you to walk through that process from the moment a student starts thinking about applying until their butt is in the seat. What do you see as the role of AI in each of those phases? And I mean that both from the institution’s perspective and the student’s: what feels appropriate right now?
Because this is all changing so fast. We haven’t even talked about AI agents yet and everything is evolving around that. But I’m curious, at this moment, what are you seeing in that process from both sides?
Emily Pacheco: You’re asking this at just the right time. As of last week, I’ve been putting together stories from both an admissions officer’s perspective and a high school college counselor’s perspective, imagining what all of this looks like in 10 years.
I’m picturing it as Admissions 2035. I used ChatGPT to help me write stories about this envisioned future, and they were so fascinating.
It’s 2035; what does the admission journey look like? Reading those stories, I kept thinking, God, this sounds like solutions to the things that are broken right now. The admissions system, the process we’ve been using for so long, is outdated and stale. The admission process is not aligned with what we actually need from students, and I don’t think it’s serving its purpose of helping us understand whether a student will be a great fit or a meaningful contributor to campus life.
So, what I see is a shift that’s already started utilizing this new technology. From a marketing perspective, the experience is changing in significant ways. It used to be very black-and-white: universities would buy lists of student names, usually from the College Board, send out mass mailers, and blast the same generic email to everyone. Using drip campaigns today is like handing every student the same campus map and walking away, assuming they’ll all follow the same path at the same pace, regardless of where they are or what they’re looking for. Now, institutions are doing much more creative, personalized outreach. The data they’re using is sharper, enabling them to provide much more targeted and relevant materials. They know much more about what students are interested in and what they’re doing. That is changing how universities market to prospective students.
To be clear, I’m not saying this is happening everywhere. Maybe 15 to 20 percent of institutions are doing truly innovative work, but those schools are connecting with students on a whole new level and changing the game. And that’s what students are beginning to expect. It’s not about, “We offer you this program and have this club.” It’s more and more, “Here’s how we fit into your personal life, your interests, your goals.” It’s much more personal.
That personalization is transforming the student search process for the better.
The other major change, and this is the part I’m really excited about, is how college counselors are working with students. The tools they have now are incredible compared to what they had just a few years ago. Historically, quality college counseling has been expensive and available only to a small number of privileged students. But there are new tools emerging that can help one counselor support 500 students, each getting a truly personalized experience, with the counselor drawing on AI assistance.
That’s still in early stages, but some tools are starting to be used this way. Soon, students will be able to say, “I don’t have a college counselor, but I have access to an AI tool that could be a great replacement for one.” And honestly, these tools already do a pretty good job at college counseling. It’s the best free tool students have ever had access to, and we are going to see what that means for college access and how it changes the process.
So, yeah, that’s what I’m seeing right now. It’s already changing everything in college admission: for students, for college counselors, and for universities, and the next few years are going to be quite a roller coaster.
Lance Eaton: The idea that a counselor could now work with 500 students, and do so with quality, roughly speaking, is something I’ve been keeping an eye on. There’s a book by Allison Pugh called The Last Human Job, which focuses on what she calls connective labor.
She describes these roles, such as teaching, ministry, therapy, social work, where the meaning-making and the development happen in the relationships themselves. The work is the connection.
I think career counselors in high schools are a really interesting example. Yes, they could potentially support 500 students, but that raises questions about the other artifacts of those relationships, and whether we should still be holding on to them or rethinking them. For example, the recommendation letter. Is that still feasible? Or does the signal a recommendation letter carries become diminished when it’s no longer possible to have 500 deep relationships?
Because if the AI is doing more of the connecting, and the advisor is just signing off, what does that mean? I can see that becoming more common or at least being offered as an option. And I worry about the ripple effects. It starts to resemble what we’re already seeing elsewhere: scaling up human-to-human work to a point where it’s diluted.
It’s that same pattern we see in higher ed; suddenly, it’s one faculty member to a thousand students.
Emily Pacheco: I often say this because counselors will ask, “Does this mean I’m going to be out of a job?” And I tell them, I don’t think there’s a parent out there who, if given the choice between a great human college counselor or a computer, would choose the computer. Most people would pick the qualified human if given the choice, and I don’t think that is going to change anytime soon. Right now, a human can do this better.
I believe we are still going to have a thriving network of human college counselors because of exactly what you mention: this is connective labor. Students and their families are navigating a serious life decision, and having a knowledgeable and capable person to help in that process is ideal. The problem is that there is a shortage of knowledgeable and capable college counselors, and most of them are at a price point outside what many students can afford. Until now, this has meant that only students from a certain socio-economic background, or who have been able to access counseling through the very few free community-based organizations that offer such services, are getting access to quality college counseling. Yes, this is connective work, but only for those who can find the person who can connect with them, and most of the time that requires a lot of resources.
But we could scale these services with AI: one high-quality counselor, working with AI, could support hundreds of students, and I think that can be done without sacrificing too much meaningful connection. Ideally, it wouldn’t replace relationships but enhance them. AI could provide counselors with deep insights into the students they are working with and give them more pointed ways to connect with the students who need it the most. It can more easily recognize the students who are falling behind on the college application timeline and help counselors figure out where their time might be best spent. As they say, it is quality of time and not quantity, and I believe AI could help counselors spend their time more efficiently and find and connect with the students who need them most.
I think most people still agree that a human counselor is ideal. More and more, though, I hear people suggesting that a human who knows how to use AI well is even better. And if a qualified human isn’t available, some would argue that AI is still preferable to nothing. I know plenty of people who don’t agree with that last point, but when it comes to college counseling, I definitely think AI is better than having no support at all.
There are some great examples of how college counselors have used AI to enhance what they are doing. Jeff Neill, the counselor I mentioned earlier at Graded, The American School of São Paulo, works in a well-resourced office. About two years ago, his office began using AI to support a number of processes, including the letter of recommendation workflow. Their team uses AI to help draft letters, which has freed up time for more meaningful, direct work with students. They surveyed students and parents before and after, and not surprisingly, satisfaction with the counseling office has gone up markedly since adopting the AI system. You can see here how AI didn’t replace the counselors, but instead made their work more meaningful.
I think many high schools will continue to see the value of human college counselors, but ideally, those counselors will also have the tools and training to use AI in their work, finding new and creative ways to support their students.
I hope this isn’t just limited to college counseling, because there are so many paths to a successful and happy life beyond the typical four-year college route. I can imagine a tool that helps students navigate all the options available to them, not just college admission, and one that starts much earlier than the moment applications are due. I picture something that can identify a child’s strengths and areas for growth, and offer a framework to help them move toward a future where they’re happy and thriving. That feels like the real goal. A tool like this, paired with a great school counselor, could be incredibly powerful.
A tool like this could also help students put together a better college application, giving universities a clearer sense of who they are and what they might contribute to a campus. I imagine that kind of insight would be extremely helpful to universities when they are building incoming cohorts.
I don’t see AI diminishing college counseling. I see it transforming it. Right now, many college counselors are focusing almost exclusively on four-year institutions, and this leaves a lot of kids behind. These tools could help create a better way to help all students find meaningful, realistic, and fulfilling paths forward after high school.
Join us in the next post for Part 2!
The Update Space
Upcoming Sightings & Shenanigans
Continuous Improvement Summit, February 2026
EDUCAUSE Online Program: Teaching with AI. Virtual. Facilitating sessions: ongoing
Recently Recorded Panels, Talks, & Publications
David Bachman interviewed me on his Substack, Entropy Bonus (November).
The AI Diatribe with Jason Low (November): Episode 17: Can Universities Keep Pace With AI?
The Opposite of Cheating Podcast with Dr. Tricia Bertram Gallant (October 2025): Season 2, Episode 31.
The Learning Stack Podcast with Thomas Thompson (August 2025). “(i)nnovations, AI, Pirates, and Access”.
Intentional Teaching Podcast with Derek Bruff (August 2025). Episode 73: Study Hall with Lance Eaton, Michelle D. Miller, and David Nelson.
Dissertation: Elbow Patches To Eye Patches: A Phenomenographic Study Of Scholarly Practices, Research Literature Access, And Academic Piracy
“In the Room Where It Happens: Generative AI Policy Creation in Higher Education,” co-authored with Esther Brandon, Dana Gavin and Allison Papini. EDUCAUSE Review (May 2025)
“Does AI have a copyright problem?” in LSE Impact Blog (May 2025).
“Growing Orchids Amid Dandelions” in Inside Higher Ed, co-authored with JT Torres & Deborah Kronenberg (April 2025).
AI Policy Resources
AI Syllabi Policy Repository: 190+ policies (always looking for more; submit your AI syllabus policy here)
AI Institutional Policy Repository: 17 policies (always looking for more; submit your AI institutional policy here)
Finally, if you are doing interesting things with AI in the teaching and learning space, particularly for higher education, consider being interviewed for this Substack or even contributing. Complete this form and I’ll get back to you soon!
We periodically host small-group workshops and leadership sessions for higher ed teams. You can learn more about our current offerings here.
AI+Edu=Simplified by Lance Eaton is licensed under Attribution-ShareAlike 4.0 International