What's Happening?
The U.S. Sentencing Commission is seeking public input on proposed sentencing guidelines for offenses under the Take It Down Act, a law aimed at curbing nonconsensual deepfake pornography. The legislation,
which passed with strong bipartisan support, criminalizes the publication of nonconsensual intimate imagery, whether real or AI-generated. It requires platforms to remove such content within 48 hours of notification and empowers the Federal Trade Commission to enforce compliance. The proposed guidelines would set prison terms and financial penalties for digital forgery: up to two years' imprisonment when the victim is an adult and up to three years when the victim is a minor. The commission is also weighing more specific penalties for particular offenses and is accepting public comments until February 16, 2026.
Why It's Important?
These guidelines are significant because they address the growing misuse of AI to create deepfakes, which carries serious consequences for privacy and personal security. By establishing clear penalties, the law aims to deter potential offenders and protect individuals from digital exploitation. The legislation also underscores the need for regulatory frameworks that keep pace with technological change, so that legal systems can effectively respond to new forms of digital crime. This move could set a precedent for future laws addressing AI-related harms, with implications for tech companies, legal professionals, and digital rights advocates.
What's Next?
The U.S. Sentencing Commission will review public feedback on the proposed guidelines, which could lead to adjustments in the final sentencing framework. Stakeholders, including tech companies and digital rights organizations, may weigh in on the proposals, potentially shaping the final regulations. Enforcement of the law will likely push platforms to strengthen their content monitoring and takedown processes to meet the 48-hour removal requirement. The legal community, meanwhile, may need to adapt to new challenges in evidence handling and in the developing case law around deepfakes.