OPINION: ChatGPT is RUINING writing!?
May 1, 2023
ChatGPT is RUINING writing?!
No. It’s actually pretty cool.
Lately, I’ve been beaten over the head with AI. It started with funny deepfakes of Tom Cruise, then transitioned to Kanye AI voice covers and the creepy DALL-E photos of Will Smith eating spaghetti. The one that’s interested me the most, though, is ChatGPT.
I’ve been using ChatGPT for a little while, and it’s somewhat special. Of all these AIs, it’s arguably the most difficult to distinguish from a real human being. With the strange voice AIs and AI pictures, you can obviously pick out what is and isn’t made by a human. But ChatGPT is much … harder.
It’s really great at helping with writer’s block. It’s a great place to see your ideas come to life, and it can give you new ones as it pulls from other sources online to try to make new stuff. For this reason, it’s already being used to write books and in journalism. And it’s great! I love to make it write meaningless jokes about Mario and Kanye West. It’s great at making creative things, but that seems like somewhat of a problem.
What if part of this paper is AI-generated? Would you know? Probably not. Even when a response sounds very inhuman, you can lead it to make something sound more alive. You can do this by giving it prompts such as “could you give it more personality” or “could you write it from the perspective of a human.” This, combined with the ability to mix what it gives you into your own writing, makes it practically undetectable in certain scenarios. AI detectors are a possible countermeasure, but they aren’t always accurate.
I love it as a toy, or as a gimmick, but a lot of people try to use it to replace human writing, which I think is a mistake. It can’t really replace the originality of other people, because it’s always pulling from established information. It gets even worse when you realize that it sometimes gives misinformation. It can accidentally produce things that aren’t true, and this could directly affect work that depends on correct information, like journalism. I don’t think this ruins writing as we know it, but I do feel we have to be more careful about how we use this tool.
In my opinion, there is already enough misinformation, especially online. The internet has gotten to a point where people spread misinformation as a joke. When credible journalism sources start switching to tools that can give people wrong information, the problem only gets worse. I guess the counterargument is that there are still editors who check for misinformation. To that I say, sure, but that still doesn’t replace one of the most important parts.
The human part of the process. There are some cases where writing should not have a voice, where it should just be straightforward, unbiased writing. But writing also involves humanity. Some writers are able to cut through the noise and give a fresh perspective with their unique voice. Get rid of that, and what do you have?
I do think it is a tool. I think it can be used for great ideas. I think it can be helpful, but I don’t think people should depend too heavily on it. Over time, we will find out how developed it becomes and whether it is more than a fad. There will be more limits on what it is and isn’t allowed to do. If one day it surpasses all expectations, then maybe we could revisit the idea of a loss of true human writing. For right now, let’s just use AI to write funny stories about Kanye West.