
Generative AI is now a staple of many Gen Z and Gen Alpha students’ educational experience. Tools like ChatGPT and DeepSeek are used for everything from editing assignments and studying course content to full-blown cheating in every kind of class and job. These tools can be prompted to produce written work, math and science steps and calculations, code, and even pictures. In software engineering, ChatGPT, DeepSeek, and generative AI tools built into Integrated Development Environments, like GitHub Copilot, are used to create and debug code. Personally, I have used all three.
In ICS 314, Software Engineering, ChatGPT, DeepSeek, and GitHub Copilot have been a huge help. In both Experience WODs done at home and in-class practice and graded WODs, I have used ChatGPT and DeepSeek to understand the concepts and code used for the assignment, with prompts such as “what are the functional programming methods in TypeScript and what is each used for?” I also prompt them to give other examples, explain code, and write code for me. For example, if code is provided for an assignment, I may give it to ChatGPT and prompt it to explain the sections that I don’t understand. I would often ask either ChatGPT or DeepSeek for a specific section of code that I needed next, after giving it my current code and very direct instructions. This helped me learn how the code I needed for this class worked, but it also kept me dependent on generative AI for creating it. Once I had seen enough examples, I was able to understand and reproduce code myself, but I often still chose to prompt ChatGPT or DeepSeek to do it for me, because I found HTML and CSS monotonous. For debugging and quality assurance, AI is my first step. GitHub Copilot was convenient because it is built into VS Code, but the debugging tips it gave were only useful for some of my issues. It sometimes misunderstood the situation or what I was trying to achieve, so I opted for ChatGPT or DeepSeek, where I could add a detailed explanation and prompt. Sometimes I gave ChatGPT or DeepSeek my essay drafts to get feedback on how to reword them or what to add, but I usually didn’t take the advice because it wasn’t accurate to my perspective and writer’s voice. My final project group used ChatGPT, DeepSeek, and GitHub Copilot frequently for these same purposes. They were helpful for producing and debugging code, but they easily wrote incorrect code if we didn’t include all of the context or other related code. I never even considered using generative AI to document our code, because it does not have the same understanding of our vision and use case as we do as humans. For asking smart questions in class or on Discord, I usually opted to ask ChatGPT or DeepSeek first if the question was technical, because I found them more efficient and in-depth than asking someone else. Generally, the only times I posted a question or asked my classmates or professor were when it was a question about the class itself, like about an assignment or its due date, because that was out of the scope of AI.
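As an illustration of what that first prompt was asking about, here is a minimal sketch of the kind of TypeScript functional programming methods ChatGPT or DeepSeek would explain; the sample data is made up for the example.

```typescript
// Hypothetical sample data, just for illustration.
const scores: number[] = [72, 88, 95, 61, 84];

// map: transform each element into a new value.
const curved: number[] = scores.map((score) => score + 5);

// filter: keep only the elements that satisfy a condition.
const passing: number[] = curved.filter((score) => score >= 70);

// reduce: combine all elements into a single result.
const total: number = passing.reduce((sum, score) => sum + score, 0);

console.log(curved, passing, total);
```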
My use of generative AI has enhanced my understanding of the course material, but it has created a challenge for my abilities without it. In my learning experience in ICS 314, I have often used generative AI to explain assignment instructions or sample code, write code for me, or tell me how to fix an error. This has streamlined my software engineering experience considerably, making it more time- and energy-efficient for me; however, it has also let the development of my skills and problem solving stagnate. When writing new code or debugging my code, I usually turn to generative AI for any help I need, instead of sitting with the problem and thinking about it longer, or consulting documentation, Stack Overflow, or my peers. This has made my learning experience easier yet less rewarding and comprehensive.
Generative AI is already being used in the real world, for important projects and situations. The important thing about responsible AI use is its impact: what are its results, and is the way it produced them ethical? An example of an issue that has arisen here is in the healthcare industry, where health insurance automation is being developed. Some studies have shown that AI used to process insurance claims produced alarmingly high rates of claim denials, including one system that issued sixteen times as many denials as the previous method of evaluation. To combat issues like this, AI should be used alongside human efforts, so that the human’s possible errors or biases and the AI’s can balance each other while still increasing productivity and lowering costs.
A challenge with AI use is its effectiveness, which depends both on how humans use it and on what it is capable of. Within this class, I encountered erroneous advice from the previously mentioned AI tools when I had neglected to provide part of the code or explain the situation in enough depth; the AI did not have the complete picture of what was going on, so its response was simply a best guess at how to deal with the situation. In this class I also encountered a diverse range of uses of generative AI by my classmates, with varying levels of effectiveness. I witnessed classmates copy and paste entire assignment instructions into ChatGPT and submit the code it generated without proofreading it; while this seemed to work for them at the beginning of the class, it did not aid their understanding or ability, and they attended class less after using AI as a crutch. There were also classmates who, in my opinion, were less efficient with their AI use than I would have been in their situation, giving their AI tool little information in their prompts and relying more on established outside sources and on figuring things out themselves, even when it was content they did not yet understand. These students seemed to take much longer to produce their final result, but they possibly retained the content better, along with the ability to produce their results themselves. In future software engineering classes, it would be compelling to discuss everyone’s AI use in class and how we can best use it as a tool while also continuing to learn the content ourselves, which can be a challenge to balance.
To the student, AI use is much more straightforward than traditional learning methods in terms of explanations and examples that are functional and easy to understand. It can be much more engaging because prompts can be customized to the student’s exact question, and the responses that tools like ChatGPT and DeepSeek produce can be extremely accurate and effective while also being presented in a user-friendly way. This is a stark contrast to the technical documentation of many languages and applications, which is often much more challenging to understand, is not interactive in any way, and may use examples that are not comparable to the student’s current situation. However, traditional teaching methods maintain a stable focus on students learning all of the content, retaining it, and developing their skills, which is reinforced by the more difficult learning process. Learning with AI, on the other hand, reinforces a student’s tendency to forget the content, because they know they can always just prompt the AI for help again.
Generative AI will only become more prevalent in education as it is continually developed and normalized. Many young students already have a difficult time in school with the tasks required of them, including reading and writing, as a result of the effects of the Covid-19 pandemic and the state of technology in all aspects of our lives. This leads students to depend on AI to learn and to get by, which creates a circular relationship in which they never learn independence from technology. While this is dangerous for the outcome of modern society, it is hard to criticize the people who are complicit in the rise of AI. To improve the presence of AI in our lives, governments and educational institutions should place strict regulations on how AI is developed and used, and on its energy usage, to keep AI ethical, or else not use it at all. For example, in this class, there could be guidelines or other course content developed to instruct students on the best ways to use AI in their studies, and if it is too soon to develop that, then this class could be a tool in itself for developing it. Students need structure and instruction to grow into their best selves, and a good understanding of AI and its use is no different. Additionally, future software engineers will need to be able to use AI ethically and efficiently.