AI Assist: Advising, Teaching, and Career Prep for College of Engineering Students

Feb 8, 2024, 10:17 AM


Photo: Grainger College of Engineering Professor Jonathan Makela appears on CITL's Teach, Talk, Listen Learn podcast.

By Robert Baird, Senior Associate Director at the Center for Innovation in Teaching & Learning

Like much of the academic world, The Grainger College of Engineering at the University of Illinois has been exploring the potential of generative AI for teaching, learning, and research. One of the more intriguing administrative uses of generative AI in Engineering is the exploration of an advising chatbot for students. Professor Jonathan Makela (Electrical and Computer Engineering), who serves as the Associate Dean for Undergraduate Programs in the college, has been coordinating these efforts. Seen as a tool to support the college's very busy human advisors, the always-on, endlessly patient chatbot, once fully realized, will be able to answer students' common questions 24 hours a day.

Watch this short video where Jonathan talks about Advising Bots.

Engineering’s advising chatbot is being tested on a web-based OpenAI-powered tool built by the Department of Electrical and Computer Engineering in partnership with the National Center for Supercomputing Applications. By providing the advising chatbot with the college-specific guidelines, documents, policies, and web pages related to advising, the design team ensured that the chatbot could respond to student questions quickly and in a conversational manner. Jonathan: “This, I think, is a really powerful use that we're just experimenting with right now in this sandbox where we can control the information that the AI is using so that we can make sure that as the student is interacting with the chatbot, the information that's given back is the information we want the student to receive.” Addressing concerns about AI’s reliability and choice of sources, Jonathan finds that “what I like about what’s been set up at the NCSA through this site is that what comes back from the AI is trained on a specific set of information and actually referenced so you can then go and click and it shows you the website where it got that information.”

Student Privacy Safeguards

Mindful of student and faculty privacy, the design team built on a campus platform, separate from commercial and more open AI systems, giving “faculty and other people the ability to experiment with this technology in a controlled and safe environment where you can essentially ask it to incorporate whatever body of knowledge you want.” Jonathan explained that the design and use of the advising chatbot was carefully calibrated to the need to protect student information and privacy: “That's one of the concerns that we have in the student services area, especially when we talk with our advisors and staff that are experimenting with open AI. We need to be protective of student information, even for something as simple as the use case of AI helping draft a response email. You wouldn't want to put in the student's email to you because that contains personal identifiable information that we need to be protective of.”

AI-Powered College Advising

While the college uses many methods to communicate with students, most of them assume that students will scan websites, read through FAQs, and attend informational sessions. The advising chatbot could offer a benefit beyond those channels: students would simply ask questions directly, and the chatbot would assemble the relevant information into a conversational response.

For advising professionals and administrators in the college, one longstanding issue has been that some students do not seek out advisors or search through college information. For some, asking a chatbot may be easier and less intimidating than searching for the information or asking someone. Jonathan: “We need to recognize that there are situations in which students are going to be less willing or less likely to come into an advising appointment on their own. So, if we can give them the information that they need to navigate our systems in a way that they're comfortable with and willing to interact with, I think that's a win.” However, he is quick to point out that the AI needs to be able to identify times when students should seek the advice of a college professional, and to recommend that course of action. He also points out that the correct information to provide depends on the context of the question (e.g., the student's major, or the phase of the academic semester in which the question is asked), something that current AI does not yet handle well.

A More “Empathetic” Advising Chatbot

While the advising chatbot has no emotions or feelings, the way it is prompted to respond, and its programmed language style, do convey emotions (or the lack of them) to human readers. Jonathan noticed that when the advising chatbot replied to questions in a clinical, just-the-facts manner with a simple bullet list, it felt curt and uncaring. The team was able to tune the chatbot so that its responses were perceived as more empathetic. Jonathan highlighted that “a lot of times in the advising space, there are feelings and emotions at play. And so it is important to be able to prompt the AI to be a little more empathetic in these responses and provide some information that is now in a different tonality.”

Learning With AI in Engineering

Jonathan hopes to fine-tune the college’s AI resources based on students’ individual needs: “How [AI] can be used in the classroom changes, whether you're working with first-year students versus Ph.D. students. I think what we're really trying to understand now is how that changes throughout the student experience . . . maybe we shouldn’t be using AI at the lower levels because this is where you're really trying to develop those foundational skills . . . but, then, later on in your career, the learning process becomes more open to using AI to offload some of that lower level work that you now understand, and you can focus in on some of those deeper learning topics.”

Another hope is to teach students to be critical consumers of AI outputs: “I think one of the interesting use cases for teaching, is using AI to ask it a homework problem as part of a classroom activity and then work with your students to critique the results, the answer that AI gave you, because it is not 100% correct . . . if you as a learner can go through and critique and see, yes, this is where it's right, this is where it's wrong and this is why it's wrong, I think that's a really nice demonstration of learning concepts.”

Teaching in a World With AI

When asked his advice for instructors adapting to the prevalence of AI, Jonathan had some tips: “For a week before classes start, run your homework through ChatGPT and look at the answers that come back. I did that and it was eye opening for me because it provided reasonably correct responses, which was impressive. It knew all the parlance of the domain language and could pull off some of the equations, but it wasn't exactly correct. I would give it a ‘C+’. But if you are a student going through the material for the first time, it sounded 100% plausible. And so that, again, is this idea of having the teacher in the loop to be able to guide students, to understand that this is what's right, this is what's wrong, and this is why it's wrong about that. That can be a valuable part of the learning process.”

Watch this short video where Jonathan talks about how teachers can get started with AI.

Teaching and Learning With AI

While obviously a new and disruptive technology, AI recalls previous historical moments and traditions: “We’re always going to offload calculations onto computers. So now that the computer incorporates AI, and you can interact with it in a different way . . . You could always go look up the even numbered problems, which were there in the back of the books I learned from, and now you could simply ask the AI what the answer is . . . This is one of the challenges that we have as educators. We have to be able to convince and show the value to our students of the process of learning rather than only focusing on getting the right answer.”

Engineering Careers and AI

Of course, the disciplinary and career focus of Engineering requires that the college not only pay attention to AI but also be actively involved in its use, on campus and off. “I think that it's just a fact that as our students graduate and go off into industry or academic careers, they're going to be utilizing AI with their peers and with their colleagues. So, they're going to need to be conversant in the technology, which is the reason why I think we can’t shy away from its use in the classroom, because it is just another tool that successful engineers are going to utilize in one way or the other. And so, it's important that in the safe confines of the university system our students can experiment with it.”

Did you like this story? Read more about how University of Illinois faculty are using generative AI in the classroom.