OpenAI's Sora 2 AI video generation engine has taken the world by storm with its capabilities, and one of its most intriguing features lets users make a cameo in any video. However, this feature could also be misused in manipulative ways. To curb the issue, OpenAI is rolling out major updates that give users more control over how their cameo can be used on the platform. Bill Peebles, Head of Sora at OpenAI, announced the changes in an X post after considering feedback on how Sora 2 manages personal likeness. The most crucial part of the update is that users can now impose specific restrictions on the use of their cameo on the platform. For example, users can block political content or ban certain words from appearing alongside their cameo. Peebles stated that this feature will give users more control and open the gates for mass creative participation. To set these restrictions, go to the Edit Cameo settings in Sora 2, then Cameo Preferences, and then Restrictions.
What Else Is On The Cards?
Apart from that, OpenAI has also introduced a new watermark for Sora to make it 'more visible and clearer' and increase transparency around AI-generated content. Peebles also announced that the platform's back-end safety features are going through an update to improve the detection of potential misuse and minimise false negatives in moderation. In addition, all Sora 2 users will be able to delete their Sora accounts without having to delete their ChatGPT profiles, a change many of them had requested. In the future, we may also see fictional character cameos on Sora 2, as Peebles said these are already on the roadmap. With these updates, OpenAI will be looking to make Sora 2 a safer video generation platform that keeps its guardrails intact. It will be more than interesting to see how things pan out and how safe the platform proves to be.