OpenAI makes AI-Generated Videos

Date of Post: 2/16/2024


Wow, a new blog post after almost 2 months. Oopsies. Anyways, tonight I was browsing the Agora Road Forum as I usually do when I came across a new thread: OpenAI revealed a new model, SORA. What SORA does is take a text prompt describing what you want (e.g. "Guy drives a Nissan Altima on a sunny day with a mountain background") and generate a very realistic video based on that prompt. So I watched the example video compilation provided on YouTube, and I found the videos to be.... not too realistic, to be honest. The people in those videos all behave like stock footage, and that's because right now SORA looks like it's based on stock footage. Now, this is great and all, that you can pretty much generate your own stock footage with this bot.

But what I, and many, many other more skeptical people, are worried about is that in the future this technology will become much more refined and realistic. As SORA gets used more and its algorithms become more and more sophisticated, it will eventually make more "realistic" videos. And therein lies the problem. Much like AI deepfakes and images, it will be harder for people to tell whether videos are AI-generated or not. For example, if I had a much bigger online social media presence (which thankfully I do not), what if someone made a video of me doing something very explicit and started sharing that video everywhere? Even if the video is AI-generated, some people will still take the bait, my online presence will be ruined, and it could even spill over into real life.

But AI-generated videos don't stop there. Can video evidence be trusted anymore if AI can eventually fabricate a video like that? How will both US and foreign propagandists use such technology? For example, there could be AI-generated video propaganda of some militia group in the Middle East beheading Americans, and despite the video being fake, people who watch it would still get outraged and probably support an invasion of (insert Middle Eastern country here).

Also, when it comes to social media content, people will just try even less. The lazy shit content people already make on TikTok, Instagram and Twitter could end up being AI-generated video instead, taking even less work, given enough time for SORA and similar models to become sophisticated enough. But content nowadays is already becoming more and more AI-generated. The Dead Internet Theory expands on this, so I won't really mention anything related to it, but AI-generated videos will blur the line between reality and illusion.

There might end up being a future where you cannot even trust one of your own senses anymore, as everything you see could be made up by algorithms and machines. In that future, most if not all of the things you think you see online, which you take for granted to be real, are in fact entirely fabricated, and you would have no way of knowing it. All of the reactions you have to those videos: Happiness, Sadness, Anger, Disgust, Cringe, Second-Hand Embarrassment. All of those emotions from a video generated by an algorithm whose sole purpose is to manipulate you into supporting its benefactors and them only. A machine-induced psychosis.