Obsolete within Two Weeks: Challenges of Using ChatGPT in Asynchronous Library Instruction Content for WRDS

In the Spring of 2024, I collaborated with a WRDS professor to incorporate ChatGPT into two of my most commonly requested workshops, Academic Sources and Researchable Research Questions, and to create two separate videos for his online class.

For Researchable Research Questions, part of what I focus on is prompting students to create keywords for their research topics. What I have found during my nine years as a librarian here at UNC Charlotte is that students most often struggle to find sources not because they are “bad at research” but because their research topics aren’t researchable. Their topics or questions are often too broad (how does social media impact mental health?), too narrow (what do the lyrics of [this band] say about society?), or unanswerable (are humans good or bad?). To help with this, during my in-person workshops I ask students to work in pairs and write down the first ten things that come to mind in relation to their research topics. This encourages a discussion about how research topics sometimes have to be adjusted to find the best sources.

Students often don’t realize that academic articles, which they are usually required to find, are very focused and may not match their exact topic or question. So the student who wants to research social media and mental health is going to be overwhelmed by the plethora of academic articles that focus on a particular social media platform (TikTok), a particular aspect of mental health (body image), and another element, usually a population (pre-teen girls). With a list of words related to their topic, students can usually find a more focused aspect of it to draw on when searching for sources. I joke with them that I know they’ll never be working on this at 2 a.m. the night before it’s due, but if they are, having this list ready can keep them from getting frustrated when they can’t find enough sources.

To incorporate AI into my workshops, I asked ChatGPT to give me keywords related to my sample research questions. I believe that AI is a tool to enhance students’ research skills. Ultimately, I want students to learn, which comes from reading sources, not finding them; if students spend all of their time trying to find sources, they aren’t learning. This activity is fairly low stakes, aside from any learned bias that might lead ChatGPT to suggest some keywords but not others.

For Academic Sources, the stakes were considerably higher. This workshop involved a more interesting and contentious interaction with ChatGPT than simply asking it to suggest keywords. When I made the video, ChatGPT’s response was:

“I can’t browse the internet in real-time, so I can’t find or provide links to specific articles. However, I can suggest some general search terms you can use to find academic articles on video games and mental health. You can try searching for terms like “video games and mental health,” “effects of video games on mental health”, or “video game addiction and mental health”. You can use databases like PubMed, Google Scholar, or PsycINFO to find relevant articles.”

This was at the height of the backlash against ChatGPT for making up sources and citations that didn’t exist. ChatGPT had recently switched to saying that it could not search the internet and could not give citations.

But by the time students viewed the video in their course, two weeks after I created it, ChatGPT had already changed again and was giving citations to real academic articles, so the information in my video was already incorrect. Keeping up with this fast-changing environment is a challenge of creating online, asynchronous content. Luckily, I anticipated this and said in the video that it was created in Spring 2024, so that viewers coming to it later would know the information might be out of date. If I make similar videos in the future, though, I’ll put the exact date in my presentation so viewers know which version of ChatGPT I’m using.