The Unintended Effects of ChatGPT on Learning

There’s been a lot of hype in the media recently about the impact of ChatGPT. While some are considering the positive impact it could have in corporate environments, others suggest it may have very negative impacts in academic and other learning environments.

I’ll confess, I’ve been wrestling with this question, more so from the perspective of my (not so little) kids and their learning experience, both at home and in school. And so, I thought I would spend some time developing a few thoughts to see where my own experience leads me.

Having raised five kids, I have experienced plenty of occasions when they have come home with a homework assignment, math problem or class project, requesting some form of help. And that’s where I want to start my thought process.

What Does Quality Help Look Like?

In the context of a learning experience, sometimes we need help along the way. But what does good, quality help look like? And when does that help begin to hinder the learner’s ability to acquire new knowledge and skills because the person providing the help actually produced the desired outcome, without the learner’s involvement? In other words: Am I truly helping my child, or am I doing the work for them?

Scenario #1: The Math Word Problem

Let’s look at a math word problem, for example. If the wording is a bit complex and the learner needs help breaking it down, I can help by facilitating a conversation and offering suggestions on how to break the scenario down. More importantly, I’m allowing the learner to write things down and do the thinking as we work together.

I can further help by emphasizing specific callouts in the scenario, again allowing the learner to write things down as we attempt to piece together numbers, put things in the right order, etc. But as soon as I say, “add these two numbers” or “subtract these two numbers,” I’ve jumped ahead. I’ve essentially given the learner the answer and deprived them of the opportunity to make the connection in their own mind. In other words, I haven’t allowed them to work toward full comprehension.

To an extent, one might argue, the learner is not just learning math at that moment; they are also learning how to think critically and apply logic, skills that are highly valued in the workforce.

Scenario #2: The Research Paper

Another example might involve writing a paper. Of course, this includes much more than the actual writing. It means performing the research, reading and learning the material, and finally composing a set of paragraphs that explain the newly acquired knowledge in the learner’s own words (without plagiarizing). That last part is usually the most challenging.

Obviously, this process involves thinking, processing, understanding and working hard (self-discipline) to comprehend a new fact or topic to the point that the learner can explain it in his/her own words. If the learner requests help in this type of setting, beyond explaining the meaning of vocabulary words or assisting with the correction of grammatical or spelling errors, there are only so many things one can do before crossing the line into doing the task for them.

One might provide encouragement, consider incentives and help the learner stay focused on the task at hand by removing distractions. But in the end, the learner needs to do the work themselves, by reading and understanding the material and writing the paper. (Important note: In this scenario, I am assuming that the learner does not have a disability. Where a disability is present, the approach would look very different for each learner.)

Scenario #3: The Creative Project

Finally, let’s consider a creative task, one that involves the imagination and the play of words, such as composing a poem. Granted, to some this may seem a natural, easy task, while to others it may seem very challenging. But in the end, it’s something we can all learn to do to one extent or another. (In my case, I’ll be happy to place two sentences together that end with words that rhyme!)

But here again, if the learner requests help with this type of task, it is important that the help given does not interfere with the use of the learner’s own imagination, or with the opportunity to learn new tools that might assist with such a task (a synonym finder, for example).

What is Considered Cheating?

Back to the original question: What does any of this have to do with ChatGPT? There have been all kinds of concerns expressed in academic circles about the use (or abuse) of ChatGPT to “cheat”.

Before I move ahead too quickly, I think it’s important we understand what we mean by cheating. In doing a little research, I found that colleges across the nation define cheating (or “academic dishonesty”) in a variety of ways. I encourage you to look them up!

I’ll be honest: In my opinion, some definitions are pretty weak. I do like this one, though, from the University of California San Diego: “Cheating occurs when a student attempts to get academic credit in a way that is dishonest, disrespectful, irresponsible, untrustworthy or unfair.” For my purposes here, I’m using my own, more specific definition: Cheating occurs when a method or tool is introduced that circumvents the learning process or assessment process, yielding results that do not accurately reflect the student’s actual knowledge or skills. Or, maybe more relevant to the above examples: Cheating occurs when another resource is used to do the work for the student or to provide an answer that the student did not derive from their own efforts.

Could AI Count as Cheating?

I think it’s important to put things in context and draw a comparison here, because before ChatGPT, there were other ways to cheat. In fact, it’s very possible that people with the best intentions really just ended up doing the mental work for students and, as a result, prevented them from learning (circumventing the learning process).

If we consider the previous three examples of a learner engaged in a learning experience, I think we can agree that learners are on a journey to not just acquire new knowledge but, in most cases, to acquire other extremely valuable skills along the way. Learning how to use logic, think critically, be self-disciplined and use the imagination are all important professional skills (or character traits) that we’ll take with us to any job later in life. Unlike a particular set of technical skills, professional skills and character traits grow and mature with us throughout our entire lives.

But what happens if we introduce AI into any learning experience? If we consider ChatGPT as an AI tool that might assist in the learning process, we need to immediately consider its ability to teach.

ChatGPT doesn’t know where to draw the line between helping the learner and doing the work for the learner, especially if the learner approaches it without a true desire to learn. Simply enter a basic problem, and it will provide the solution. Request a paragraph-style essay on a topic, and it will produce one for you. Request a poem on a given topic, and it will compose one for you. Where is the learner’s imagination stretched throughout that engagement? Where is the learner’s logic applied in producing the results?

Should AI Replace Brain Work in Learning Environments?

And most importantly, when is the learner’s work ethic put into practice with this approach? What might have taken hours for a learner to produce takes minutes for AI to produce. As they say in the world of sports: no pain, no gain. Why? Because in order to run longer or faster than you did yesterday, you need to practice. I don’t foresee a day when AI will run a track race for a high school student, so why are we considering it as a replacement for “brain work” in academic environments? Do we not need to make our brains “run races” all the same? And what happens if the next generation of students is allowed to engage less and begins leaning on AI to generate results (whatever those may be)? Will it not result in the reduced use of our imagination, logic and critical thinking?

Maybe I’m missing something here, or maybe my logic is flawed. But as of right now, I am a bit concerned about what a future generation of workers might look like if the impact of AI platforms like ChatGPT is not given serious consideration and these tools are simply allowed to run unchecked over the long term.

For now, I’ll end with this: My daughter, who is a bit of an artist, came to me recently, all excited because she had discovered ChatGPT, and she told me how it could compose poems and develop all sorts of “beautiful” stanzas. I smiled and shared her excitement in the moment. But as she was leaving the room, I encouraged her to do her own college work and discouraged her from using ChatGPT to help with it, because in the long run it would only hinder her ability to learn and, specifically, her ability to learn how to work hard.
