A short interview with Jennifer Cromley
Jennifer Cromley is a distinguished faculty member in Educational Psychology. As the instructor of a graduate introductory statistics course, she recognized both the rising popularity of Generative AI and the challenges it poses in the educational landscape. This prompted her to reevaluate her teaching methods and her students' learning experiences.
To ensure her students truly grasped the subject matter rather than relying on shortcuts from Generative AI, Jennifer started a comprehensive research journey. She delved into the world of Generative AI, attended many AI workshops, and engaged in thoughtful discussions with her colleagues. The result? Jennifer made the bold decision to revamp her course for this fall semester. She dedicated many hours this summer to crafting a curriculum that prioritizes genuine learning over the convenience of Generative AI.
In this brief interview by the TLS Team, Jennifer shares her approach to steering away from Generative AI in assessments and her firsthand experiences in teaching this innovative course this fall.
Q: How did you start your journey of revamping your course?
A: Different professions use AI differently. The course I teach is a graduate introductory statistics course. It prepares social science students to become scholars, researchers, and professionals in other fields that require analyzing data and reporting the results in APA format. Our major publishers, including Elsevier, Taylor & Francis, and Routledge, publish the highest-impact journals, and they have all banned the use of generative AI in writing for publication other than checking grammar. So it's totally fine to use Grammarly. If you want to share your intellectual property with the whole world, you can put the whole paper in and let the generative AI give you suggestions for how to rephrase things or be more colloquial. So my decision about changing the course came from that perspective, that we are like the field of law. Students have to know how to analyze data and write reports without using generative AI.
Q: How did you revise your course to avoid generative AI?
A: I have four kinds of assignments. I think I changed all of them somewhat, either in terms of what I was gonna give the students or by adding things like proctoring, but also by making some assignments optional. The goals of the course are not to click on a lot of buttons or to do a lot of math or mathematical proofs. My goal has always been for students to understand their data set, choose appropriate analyses, and report the results. The first assessment I revised was the analysis homework. In the past, I asked students to analyze small and complete data sets, and I would provide feedback and chances for them to revise their answers. Because it is easy to copy and paste a data set and ask GenAI to do the homework, this time I used a large and incomplete data set for my students to analyze. And it was still revisable as long as there was no cheating.
The second assessment was to review research articles. Previously, students would read a research article, write their answers in a Word document, and submit that document. This time, I asked students to annotate the article in PDF and submit the annotated PDF for grading.
The third one was the quiz. In the past, the quiz questions were scenario-based, and students would draw a conclusion from analyzing the problem. It allowed partial credit, and I gave them feedback. I also told students to make a crib sheet to summarize the key learning points, but they didn’t need to submit it. This fall, I changed all my quiz questions to be image-based: students read and interpret a data visualization. I also tried what I called “good faith effort” grading. It means that if students really tried to follow what I taught in class, then they get the credit, and I provide feedback. For example, I had a scatterplot that was a little hard to read. The graph had an X axis that didn't start at 0, and lots of my students gave a kind of obvious but wrong answer. But they were doing what I taught them to do: look at the scatterplot, see where the regression line crosses the Y axis, and identify the value on Y.
Another change to the quiz is that I started requiring students to submit their crib sheets as part of their quiz preparation. What's fascinating to me is not just the effort, but also that students have to summarize. They have to identify for themselves what the big ideas are and where the most important ideas are, and they build their notes around those big ideas. In class, I clearly signal these on the PowerPoint. There's a section called key points on every PowerPoint, so when students ask, Well, what should I put on my crib sheet? Well, the key points. I also used my own crib sheets for about 10 years after I finished my PhD, so they are really lasting and useful summaries of a course. I guess the second unintended outcome of doing this is that it’s fascinating to see what the students do and don't put on their crib sheets. It gives me additional information about how they're thinking and how to help them learn better.
The last assignment is the exam. It used to be two required two-hour exams, a midterm and a final. This fall, I changed them to be optional, or what we called credit recovery. If students didn’t complete all the data analysis homework, the article homework, or the quizzes, they can take the optional exams to earn additional credit. But if students put in enough effort and the grades they have earned from all the other assessments add up to an A, they don’t have to take the exams. I also made the exams proctored, either in person or online. So there were more options for students.
In general, the revision proceeded in three stages: first, identify how ChatGPT might be a threat to my teaching and students' learning; second, explore what reasonable workarounds I could come up with; and third, actually try them out with students.
Q: What are the students’ responses?
A: Well, this has been fascinating to me. First of all, I put together a PowerPoint that just had lots of examples: the US House of Representatives banned the use of generative AI; Google, the company behind Bard, one of the generative AI apps, banned its own employees from using it; Samsung banned its employees from using it because trade secrets were actually being released into the world. So, I pulled together as many of these kinds of things as I could. I put up examples from all the publishers. And I asked for the students’ feedback. The response seemed great. Nobody pushed back against it. Everybody was nodding their heads and saying, Okay, this makes a lot of sense; you're training us for a job where we're not gonna be allowed to use this stuff.
So, then I chatted with a few students afterward just as I ran into them in the hallway and asked if I was too harsh about the generative AI. They were all like, no, it's just great to know what's expected. So I think they like clear guidance about what they should and shouldn't do.
Q: What are some benefits that you can see?
A: It's still a work in progress. So far, nobody has left any questions blank on the first homework. Previously, students were sometimes gaming the homework; they would answer 10% of the questions, submit it, and then revise it. But this time, two weeks after the revision, nobody picked and chose which questions to work on; they all answered all of the questions. Maybe that's a third outcome: their work was much more complete. I think they have this understanding. From their perspective, they're thinking, I can buy myself this really nice mid-semester break by working really hard, being very careful and conscientious, paying attention to all the details, each week for those first seven weeks. And then I can actually, with my effort, buy myself a break and not take the optional midterm exam.
After the revision, I think this course produces an equally good learning outcome but is much less stressful for students, which I think is very interesting. There have been some equity issues before: students used to bear the stress of preparing for high-stakes exams, and I could pretty much predict their midterm and final exam scores from how they had been scoring on all the previous assignments. So, at some level, it started to feel a little unfair to me, because some students get very stressed out by taking a two-hour exam.
Q: What is your teaching experience so far?
A: It did feel like a huge amount of work over the summer, but it really feels like it's paying off with more rewarding teaching this fall. It's just been a pleasure. It's been more enjoyable to teach. And I wasn't expecting that. But it feels like, okay, there's this cost of the work over the summer. And then there's this payoff during the semester.
This interview was featured in the Instructor Voices segment of CITL's Teaching & Learning newsletter. Subscribe to the newsletter to hear from more University of Illinois instructors.