South African university students use AI to help them understand – not to avoid work
Students often use generative AI tools for engaged learning. They have a critical and nuanced understanding of these tools.
When ChatGPT was released in November 2022, it sparked many conversations and moral panics centred on the impact of generative artificial intelligence (AI) on the information environment. People worry that AI chatbots can negatively affect the integrity of creative and academic work, especially since they can produce human-like texts and images.
ChatGPT is a generative AI model using machine learning. It creates human-like responses, having been trained to recognise patterns in data. While it appears the model is engaging in natural conversation, it references a vast amount of data and extracts features and patterns to generate coherent replies.
Higher education is one sector in which the rise of AI like ChatGPT has sparked concerns. Some of these relate to ethics and integrity in teaching, learning and knowledge production.
We’re a group of academics in the field of media and communication, teaching in South African universities. We wanted to understand how university students were using generative AI and AI-powered tools in their academic practices. We administered an online survey to undergraduate students at five South African universities: the University of Cape Town, Cape Peninsula University of Technology, Stellenbosch University, Rhodes University, and the University of the Witwatersrand.
The results suggest that the moral panics around the use of generative AI are unwarranted. Students are not hyper-focused on ChatGPT. We found that students often use generative AI tools for engaged learning and that they have a critical and nuanced understanding of these tools.
Of greater concern from a teaching and learning perspective is that the second most common use of AI-powered tools, after clarifying concepts, is generating ideas for assignments or essays, or getting unstuck on a specific topic.
Unpacking the data
The survey was completed by 1,471 students. Most spoke English as their home language, followed by isiXhosa and isiZulu. The majority were first-year students. Most respondents were registered in Humanities, followed by Science, Education and Commerce. While the survey is thus skewed towards first-year Humanities students, it provides useful indicative findings as educators explore new terrain.
We asked students whether they had used individual AI tools, listing some of the most popular tools across several categories. Our survey did not explore lecturers’ attitudes or policies towards AI tools. This will be probed in the next phase of our study, which will comprise focus groups with students and interviews with lecturers. Our study was not on ChatGPT specifically, though we did ask students about their use of that tool. We explored broad uses of AI-powered technologies to get a sense of how students use these tools, which tools they use, and where ChatGPT fits into these practices.
These were the key findings:
- 41% of respondents indicated that they primarily used a laptop for their academic work, followed by a smartphone (29.8%). Only 10.5% used a desktop computer and 6.6% used a tablet.
- Students tended to favour a range of AI-powered tools other than ChatGPT, including translation and referencing tools. Regarding online writing assistants such as Quillbot, 46.5% of respondents indicated that they had used such tools to improve their writing style for an assignment, and 80.5% indicated that they had used Grammarly or similar tools to help them write in appropriate English.
- Just over a third of survey respondents (37.3%) said that they had used ChatGPT to answer an essay question.
- Students acknowledged that AI-powered tools could lead to plagiarism or affect their learning. However, they also stated that they did not use these tools in problematic ways.
- Respondents were overwhelmingly positive about the potential of digital and AI tools to make it easier for them to progress through university. They indicated that these tools could help to: clarify academic concepts; formulate ideas; structure essays; improve academic writing; save time; check spelling and grammar; clarify assignment instructions; find information or academic sources; summarise academic texts; guide students for whom English is not a native language to improve their academic writing; study for a test; paraphrase better; avoid plagiarism; and reference better.
- Most students who viewed these tools as beneficial to the learning process used tools such as ChatGPT to clarify concepts related to their studies that they could not fully grasp or that they felt were not properly explained by lecturers.
Engaged learning
We were particularly interested to find that students often used generative AI tools for engaged learning. This is an educational approach in which students take responsibility for their own learning. They actively develop thinking and learning skills and strategies, and formulate new ideas and understanding through conversations and collaborative work.
Through their use of AI tools, students can tailor content to address their specific strengths and weaknesses, making for a more engaged learning experience. AI tools can also act as a kind of personalised online “tutor” with which students have “conversations” to help them understand difficult concepts.
Concerns about how AI tools potentially undermine academic assessment and integrity are valid. However, those working in higher education must note the importance of factoring in students’ perspectives to work towards new pathways of assessment and learning.
The full version of this article was co-authored by Marenet Jordaan, Admire Mare, Job Mwaura, Sisanda Nkoala, Alette Schoon and Alexia Smit.
Tanja Bosch, Professor in Media Studies and Production, University of Cape Town and Chikezie E. Uzuegbunam, Lecturer & MA Programme Coordinator, Rhodes University
This article is republished from The Conversation under a Creative Commons license. Read the original article.