As Kelli Maria Korducki explains in The Atlantic, computer science is no longer the safe major.
From the time I left high school, computer science degrees have been sold as one of the safest paths to job security. Through several downturns, coding jobs kept multiplying and wages stayed high. An entry-level software engineering position can pay more than €150,000 annually. Salaries like these have drawn enrollments away from humanities degrees.
Then along came generative AI. As I have discovered during several semesters with students at workstations, ChatGPT and Bing can produce lines of code in several programming languages. Korducki says, “Many programmers have now developed rudimentary smartphone apps coded by AI. In the ultimate irony, software engineers helped create AI, and now they are the American workers who think it will have the biggest impact on their livelihoods.”
I’m teaching students how to revise essay drafts generated by ChatGPT. Once you know how to integrate generative AI into your workflow, you can finish creative content jobs faster. In my classrooms, AI helps students iterate faster and produce higher word counts during half-hour scrum sessions than ever before. I think it’s important to show students how they can guide machine intelligence towards ever-faster content production.
But unless you keep learning how generative AI thinks, you won’t avoid being replaced by someone else who has better thought processes.
I know generative AI can reduce the time it takes to produce content like written materials or lines of code. I also know there is a higher cognitive load on the people who try to leverage the AI. With an AI at your fingertips, you need to know how to define the finished job and how to guide the AI towards helping you do that job faster. For a lecturer like me at the head of the class, this means being able to teach students (1) how to think as they ask AI questions and (2) how to cross-check the results the AI generates. If a student wants to coexist with an AI, they need to know what the AI is meant to be doing. Students don’t need to know how the AI is getting the results.
I have realised I cannot teach students in 2023 the way I taught in my classrooms of the 1990s. Today, I focus on teaching prompt writing alongside critical thinking. Prompt engineering works in plain human language: you can, for example, instruct an AI to adopt the voice of a specific person in the answers it generates.
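To make the idea concrete for students, a persona instruction can be expressed as a structured prompt. Below is a minimal Python sketch, assuming the common "system message plus user message" chat convention used by many AI services; the helper name and wording are illustrative, and no actual AI service is called here.

```python
# Sketch of a persona-style prompt in the common system/user chat format.
# build_persona_prompt is a hypothetical helper name; it only assembles the
# prompt messages -- sending them to an AI service is a separate step.

def build_persona_prompt(persona: str, question: str) -> list:
    """Assemble chat messages asking the model to answer in a given voice."""
    return [
        {
            "role": "system",
            "content": (
                f"Answer in the voice and style of {persona}. "
                "Keep the reply under 150 words."
            ),
        },
        {"role": "user", "content": question},
    ]

messages = build_persona_prompt(
    persona="a patient writing tutor",
    question="How should I revise the opening paragraph of my essay?",
)
print(messages[0]["content"])
```

Changing the persona string is all it takes to redirect the tone of every reply, which is exactly the kind of lever I want students to notice and question.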
After my first semester using AI in the classroom, I’ve seen ways I can teach conceptual ingenuity instead of showing students how to write a well-phrased search query. When I succeed with this new approach, my students start to think more entrepreneurially. They become makers of content because they know how to ask for better results and they can vet the provenance of the results the AI generates. Prompt engineers and content curators should be immune to automation in the workforce.