Knight Explores AI’s Role in Higher Education at Global Symposium

As artificial intelligence continues to reshape higher education, Emerson College Dean of Faculty Brooke Knight is helping lead the conversation. Fresh off delivering a keynote address at Hong Kong Baptist University (HKBU)’s AI + COMM International Symposium, Knight spoke about the evolving role of AI on campus—and what comes next.
Knight, who co-chairs Emerson’s AI Governance Group with Frankie Frain ’08, MFA ’12, Assistant Vice President of IT Security & Infrastructure, attended the March 7-12 symposium alongside Interim School of Communication Dean Paul Mihailidis and Vice Provost for Global Engagement and Programs Anthony Pinder.
What were your goals in attending the symposium?
Knight: We went to HKBU in Hong Kong for two different purposes. One was to reestablish what had been a roughly decade-long relationship that we've had with [HKBU]. Prior to the pandemic, we had an exchange program and a memorandum of understanding. We had worked with HKBU in a couple of different ways, and then the pandemic really kind of shut that down.
[Now,] both institutions are eager to further the relationship and we’re moving forward to look at different ways to engage with each other.
HKBU hired [Professor and Dean of School of Communication] Bu Zhong recently, and he was a PhD program classmate of Paul Mihailidis. He and Paul connected and we started to think about rekindling the relationship with HKBU.

Also, we have a relationship with one of their partner institutions [Beijing Normal-Hong Kong Baptist University], which is on the mainland of China in Zhuhai. During the early days of the pandemic, for about a year and a half, BNBU hosted 90-some Emerson students and taught Emerson classes so those students wouldn’t fall behind their classmates at Emerson. That was enormously helpful to us. After the pandemic, those students returned to attend Emerson in Boston.
I was also there to talk about my work with AI at Emerson. Paul and I were invited as participants.
What was your keynote address about?
Knight: Essentially, there were eight short keynote addresses [by] mostly social science researchers from around the globe, including from Penn State, the United Kingdom, China, Canada, and Germany.
A lot of them are doing specific research on the impact of AI on society, and what the implications are. For example: the environmental impacts of AI.
My talk was a little bit different because I talked about Emerson's AI Guidance and Governance Principles document that we've been working on, and about the idea that the story comes first at Emerson. That was the title of my talk: "The Story Comes First." It's a living document that we are in the process of developing about the approach that we're taking to look at deploying AI at Emerson.
What did you discuss about Emerson’s AI Guidance and Governance Principles?
Knight: It’s about a way to ethically approach AI, and understanding the perils involved, the concerns that folks have, but also recognize the opportunities and the promise that AI, as a suite of technologies, holds. It’s about how we look to use technologies to advance human creative endeavors.
First and foremost, accountability remains with people and not machines. It’s about accepting responsibility, and at the same time, how do we critically engage using skepticism and rigor as part of our process, not only with AI output, but with AI itself. We are making sure that we’re transparent and that we use AI with integrity.
We look to have students ready for an AI future. It’s difficult to imagine a position in the future that’s not going to be impacted by AI in one way or another, especially in creative fields. It’s a huge threat to a lot of folks, but there are also opportunities. We want to make sure that our students have the opportunity to be prepared.
Lastly, it’s a reaffirmation of our data governance policy and making sure that any tool that we would use would protect Emerson’s information, and information that came from Emerson people and the institution.
Both Paul and I gave 10-minute talks in the second half of the conference at BNBU. Paul talked about media and proximity and its importance for community building, social good, and media literacy. He looked at it through an AI lens.
My talk was a thought experiment—looking at what the future of American higher education looks like and what impact AI may have on institutions of higher education in the U.S. The session I was in was about structures—governmental, societal, and educational structures.
What will be the impact of AI on higher education?
Knight: There will be an impact, and I’m not sure which way it’s going to go. Higher education [has] been served well by being relatively conservative in not changing with trends and fads too quickly.
We see a lot of institutions being really cautious about their adoption of AI in a variety of different ways.
I think that institutions granting degrees based on credits and sharing knowledge will remain. The format that we have will remain. I also talked about other potential futures—that maybe the humanities as well as high-touch experiences for students are going to be really valuable in an AI-saturated world. Maybe students will crave and want human connection.
I also imagine that some of the larger institutions that work online at scale might be able to use AI to scale up even more. These institutions have hundreds of thousands of students. They might even grow to millions of students, and potentially lower the cost of a college education for some. There may be some for whom an online educational experience speaks to them because AI allows for really highly individualized instruction. And for them, a low-cost option might be really appealing. But there will be folks for whom a more traditional campus-based, in-person experience will be worthwhile.
I think it really challenges the question of, "Why go to college?" Let's wait and see what happens after a couple of years. There's a lot of AI hype right now, and there's a lot of AI promise, but not as much actual benefit to companies as one might expect by this point. If our experience with the dot-com bubble and bust 25 years ago tells us anything, it's that AI adoption won't unfold in the way we initially expected, but it will still significantly impact how we teach and learn, and how people do their jobs in the future, in ways we don't know yet.
How will attending the symposium benefit your work and Emerson College?
Knight: It was great to hear what is happening in research, in the field, and ways in which AI might be used for public good. In my own role as Dean of Faculty, I learned a lot about the impact of AI on research, and how faculty might use AI to accelerate their research.
We discussed at length what this may lead to—newer discoveries more quickly—and then challenges that it is going to place on the academic publishing industry, which has already had a number of challenges. The turnaround time for research may diminish significantly, and then the publication process might change. There was discussion at the conference about the pre-publication process, where it’s not fully vetted before it moves on to the next stage. It’s going to challenge the systems in which we evaluate scholarship and research.
Do you use AI?
Knight: I use it sometimes to help me draft stuff. I've had it help create slides out of [content] that I've drafted. I've had it do some research for me to look at what other institutions are doing about different policies. I use it as a research assistant. I haven't done a lot of image generation. Candidly, my use has been mostly text-based.
How do you personally feel about using AI?
Knight: I'm surprised, both by what it can do and by what it gets wrong. There have been times in my work that I spent [too long] trying to have AI do a task that I thought it could do relatively quickly, rather than just doing it manually myself. But other times it's great and really helpful for finding information that would've taken me a couple of days. I would've needed emails going back and forth and time looking through websites, and now I can have it at my fingertips right away. It's amazing. But other times it's been incredibly frustrating.
The other thing is that the models are getting better and better at understanding what people mean by their prompts.
Any final thoughts?
Knight: There’s a lot we don’t know yet about how AI is going to impact the professions into which our students graduate. We don’t know how it’s going to impact the institution. But I strongly feel we should be engaged with AI, and clear-eyed about the challenges and problems that it poses, but also be cognizant of the opportunities that might be available for us.