"EchoLeak" Vulnerability Disclosed in Microsoft 365 Copilot

On June 12, 2025, researchers at the cybersecurity firm Aim Security disclosed a serious vulnerability in Microsoft 365 Copilot, the widely deployed AI assistant. Dubbed "EchoLeak" and tracked as CVE-2025-32711, the flaw could allow attackers to gain unauthorized access to other users' confidential information.

As reported by information security publications including SecurityWeek and SC World, the core of the vulnerability was the potential for cross-tenant information disclosure: an attacker with legitimate Copilot access within their own organization could use specially crafted prompts to "trick" the assistant into revealing snippets of data it had processed for other users, including users at entirely different companies sharing the same cloud infrastructure. This posed a significant risk of leaking trade secrets, employees' personal data, and other sensitive corporate information accessible to Copilot.

Aim Security notified Microsoft of the flaw promptly under a responsible disclosure program, and Microsoft developed and released a patch that resolves the issue for all users. Although the vulnerability was fixed before any known exploitation in the wild, the incident starkly highlights the new, complex cybersecurity challenges that arise from deeply integrating powerful AI models into corporate ecosystems.
