Popular YouTuber Casey Neistat is raising concerns about OpenAI’s newly launched Sora app, which he describes as a “TikTok clone where every video is AI.” In a video released October 5, Neistat examines what the platform’s AI-generated content could mean for creativity and content creation.
The app, officially announced by OpenAI on September 30, allows users to generate videos featuring themselves and friends using the new Sora 2 model. OpenAI describes it as a social platform built around a feature called “cameos” that lets users place themselves into AI-generated scenes.
Neistat uses the term “AI slop” to describe what he sees as “an endless dribble of computer-generated nonsense that no one cares about.” He expresses concern about the platform’s potential to overwhelm authentic creativity with machine-generated content.
“What happens when all you have to do is type a couple words in from your bed in a dark room and click a button and it gives you a piece of video and then you share it?” Neistat asks in his analysis.
The creator uses a funnel metaphor to illustrate his concerns, suggesting that while the volume of content will increase dramatically, the proportion of quality content could diminish significantly.
Personal Identity Concerns
According to Neistat, the Sora app requires users to scan their face, much like setting up the iPhone’s Face ID. This allows them to insert themselves into AI-generated scenarios, which aligns with OpenAI’s description of the “cameos” feature.
A TechCrunch report confirms this functionality, noting that users must “upload a one-time video-and-audio recording to verify their identity and capture their appearance” before using their likeness in generated videos.
Potential for Misuse and Safety Concerns
Neistat identifies several potential issues with the technology, including bullying through unauthorized videos of peers and misrepresentation of individuals. He specifically mentions concerns about young users potentially creating inappropriate content, such as videos of themselves receiving cosmetic procedures.
TechCrunch’s report echoes these concerns, noting that “even if a user trusts someone they know with access to their AI likeness, that person could still generate deceptive content that could be used to harm that person.”
In Neistat’s video, several users express enthusiasm for the app, despite its AI nature. When asked if they would tire of the “AI sloppiness,” one user responds, “No. Because it’s yourself. You’re inserting yourself into anything. Like it’s me being able to do everything I’ve wanted since I was a kid.”
OpenAI launched the Sora app alongside its Sora 2 model on September 30. The iOS app is currently available in the U.S. and Canada on an invite-only basis, though ChatGPT Pro users can access the Sora 2 Pro model without an invitation. The platform features an algorithmic feed reminiscent of TikTok and will be free at launch, with potential charges for generating additional videos during periods of high demand.