Just days after OpenAI unveiled Sora 2, its upgraded AI video generator, the tech world is buzzing with a mix of excitement and concern. The new Sora app, essentially a TikTok rival powered by the model, has exploded in popularity, but access remains tightly controlled through invite codes. Users are scrambling online, trading codes on forums like Reddit and even reselling them on eBay for up to $50 a pop. It's a frenzy that highlights how desperately people want in on this next wave of AI creativity.
The app lets anyone generate short, realistic videos from text prompts or photos, complete with synced audio. Early adopters have flooded feeds with everything from dreamlike scenes to personalized clips featuring friends. But not all of it is whimsical. Within hours of launch, videos of Pikachu rampaging through city streets or battling in elaborate animations went viral, drawing millions of views and sparking copyright debates, since Pokémon is protected intellectual property. OpenAI insists safeguards are in place, yet the sheer ease of creation has led to slip-ups, with some clips blurring the line between fan homage and infringement.
The darker side emerged just as quickly. Terrifying deepfakes of OpenAI CEO Sam Altman have surfaced: one shows him shoplifting in a convenience store, another has him ranting in scenarios that never happened. These aren't harmless pranks; they raise real alarms about AI-fueled misinformation. Altman himself posted a cautious note urging users to watermark AI content, but enforcement seems spotty so far. Even so, the app's collaborative side, which lets anyone remix others' videos, keeps pulling people back despite the risks.
Meanwhile, regional blocks have frustrated fans abroad, pushing some to use VPNs just to snag codes. OpenAI says it plans to expand access soon, but for now it's a gold rush. One can't help but wonder whether this unchecked creativity will redefine social media or simply amplify our digital chaos.