What's Happening?
Threat actors have been abusing the OpenAI Assistants API to operate a backdoor named 'SesameOp', which lets them manage compromised devices remotely. Microsoft's Detection and
Response Team (DART) discovered the backdoor during an investigation in July 2025. The intrusion relied on a chain of internal web shells and compromised Microsoft Visual Studio utilities. Rather than standing up traditional command-and-control infrastructure, the backdoor relays its command-and-control traffic through the OpenAI Assistants API, which OpenAI plans to deprecate in August 2026. Built for stealth and persistence, SesameOp pairs a heavily obfuscated loader DLL with a .NET-based backdoor that executes commands and sends the results back through the same API, concealing the exchanges with compression and encryption.
Why Is It Important?
The abuse of the OpenAI Assistants API for malicious purposes shows how legitimate AI services can be repurposed as covert communication channels; notably, this is misuse of an intended feature rather than exploitation of a vulnerability in the API itself. Because the malicious traffic flows to a trusted API endpoint, it blends in with legitimate use, making it harder for defenders to distinguish benign activity from command-and-control operations. The incident underscores the need for robust security measures around AI-driven technologies: organizations that rely on them must strengthen their monitoring and security protocols, since this kind of abuse can enable unauthorized access, persistent control over devices, and ultimately breaches of sensitive data.
What's Next?
Microsoft has recommended several mitigation strategies to reduce exposure to the SesameOp threat. With OpenAI planning to deprecate the Assistants API in August 2026, organizations that depend on it should prepare for that transition and keep their systems updated to close off similar avenues of abuse. Cybersecurity teams must remain vigilant for legitimate cloud and AI services being repurposed as covert command-and-control channels, and the incident may prompt further scrutiny and regulatory measures to ensure the secure deployment of AI applications.
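One practical way to act on that vigilance, sketched here as an illustration rather than as part of Microsoft's published guidance, is to review outbound proxy or firewall logs for hosts contacting `api.openai.com` that have no business reason to do so. The log format, file layout, and allow-list below are all assumptions for the sake of the example:

```python
# Sketch: flag hosts that contact api.openai.com but are not on an allow-list.
# Assumes a simple CSV proxy log with columns: timestamp,source_host,destination_domain
# (the format, column names, and allow-list entries are illustrative assumptions).
import csv
from collections import Counter

# Hypothetical hosts with a legitimate reason to call the OpenAI API.
EXPECTED_OPENAI_HOSTS = {"ml-dev-01", "chat-gateway"}

def flag_unexpected_openai_traffic(log_path):
    """Return a Counter mapping unexpected source hosts to request counts."""
    hits = Counter()
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            if (row["destination_domain"].endswith("api.openai.com")
                    and row["source_host"] not in EXPECTED_OPENAI_HOSTS):
                hits[row["source_host"]] += 1
    return hits
```

A host such as a file server or domain controller showing up in this report would warrant closer inspection, since SesameOp-style tradecraft hides command-and-control traffic inside exactly this kind of trusted-looking API connection.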