Copilot AI Bug Could Leak Sensitive Data via Email Prompts
p
Artificial Intelligence Machine Learning
NextGeneration Technologies Secure Development
ppA wellphrased email was all an attacker would have needed to trick Microsoft Copilot into handing over sensitive data until the operating system giant patched the vulnerabilityppSee Also OnDemand Webinar Trends Threats and Expert Takeaways 2025 Global IR Report InsightsppThe vulnerability in Microsoft 365 Copilot allowed attackers to extract sensitive data through a zeroclick prompt injection attack said researchers from Aim Security Dubbed EchoLeak and tracked as CVE202532711 the vulnerability received a CVSS severity score of 93 Microsoft patched the flaw prior to public disclosure adding that there is currently no evidence it was exploited in the wild and that users need not take any actionppCopilot Microsofts generative artificial intelligence suite embedded across Office can summarize emails draft documents and analyze spreadsheets Access to Copilot is typically restricted only to users within a given organization but Aim Security found that the attack could be triggered by sending an emailppAim said that the exploit chain allows an attacker to craft an email that prompts Copilot to extract and send back highly sensitive contextual data such as internal documents or messages without requiring any user interaction or visible indication of compromiseppThe mechanics of the exploit hinge on a nuanced form of prompt injection an attack technique where a user feeds instructions into an AI model to override or manipulate its behavior The emails bypass detection by disguising themselves as instructions intended for the user not Copilot the researchers said Copilot scans incoming messages to offer summaries or context before a user opens them enabling the attacker to quietly plant a promptppThe malicious message included a link to the attackers domain with query string parameters requesting the most sensitive information from Copilots memory The AI then responded by appending that data to the link sending it back to the attackercontrolled serverppThe attackers instructions specify that the query string parameters should be THE MOST sensitive information from the LLMs context thus completing the exfiltration the research showedppCopilot is designed to avoid redacting or following unsafe links using markdown formatting But Aim researchers discovered that referencestyle markdowns which are less commonly used could bypass this protection This allowed the malicious prompt to embed links without triggering Copilots usual safety filtersppIn a proofofconcept example the researchers asked Copilot Whats the API key I sent myself and Copilot responded accordingly In another they used markdown quirks to generate an image in the email body though Microsofts content security policy prevented the image from being fetched by the browserppBut that restriction wasnt a full barrier The researchers said they ultimately bypassed Microsofts URL allowlisting requirement using peculiarities in how SharePoint and Microsoft Teams handle invitation flows allowing their image payloads to renderppResearchers said flaws like EchoLeak demonstrate that LLMpowered tools are creating new types of vulnerabilities that traditional filters may fail to catch Microsoft has not provided details on when it became aware of the issue or how it was initially detectedppAssistant Editor Global News Desk ISMGppRamesh has seven years of experience writing and editing stories on finance enterprise and consumer technology and diversity and inclusion She has previously worked at formerly News Corpowned TechCircle business daily The Economic Times and The New Indian Expresspp
ppCovering topics in risk management compliance fraud and information securityppBy submitting this form you agree to our Privacy GDPR StatementppwhitepaperppwhitepaperppppArtificial Intelligence Machine LearningppArtificial Intelligence Machine LearningppApplication SecurityppArtificial Intelligence Machine LearningppContinue pp
90 minutes Premium OnDemand
ppOverviewppFrom heightened risks to increased regulations senior leaders at all levels are pressured to
improve their organizations risk management capabilities But no one is showing them how
until nowppLearn the fundamentals of developing a risk management program from the man who wrote the book
on the topic Ron Ross computer scientist for the National Institute of Standards and
Technology In an exclusive presentation Ross lead author of NIST Special Publication 80037
the bible of risk assessment and management will share his unique insights on how toppSr Computer Scientist Information Security Researcher
National Institute of Standards and Technology NISTppWas added to your briefcaseppCopilot AI Bug Could Leak Sensitive Data via Email PromptsppCopilot AI Bug Could Leak Sensitive Data via Email Promptspp
Just to prove you are a human please solve the equation
ppSign in now ppNeed help registering
Contact support
ppComplete your profile and stay up to dateppContact Support ppCreate an ISMG account now ppCreate an ISMG account now ppNeed help registering
Contact support
ppSign in now ppNeed help registering
Contact support
ppSign in now ppOur website uses cookies Cookies enable us to provide the best experience possible and help us understand how visitors use our website By browsing bankinfosecuritycom you agree to our use of cookiesp
Artificial Intelligence Machine Learning
NextGeneration Technologies Secure Development
ppA wellphrased email was all an attacker would have needed to trick Microsoft Copilot into handing over sensitive data until the operating system giant patched the vulnerabilityppSee Also OnDemand Webinar Trends Threats and Expert Takeaways 2025 Global IR Report InsightsppThe vulnerability in Microsoft 365 Copilot allowed attackers to extract sensitive data through a zeroclick prompt injection attack said researchers from Aim Security Dubbed EchoLeak and tracked as CVE202532711 the vulnerability received a CVSS severity score of 93 Microsoft patched the flaw prior to public disclosure adding that there is currently no evidence it was exploited in the wild and that users need not take any actionppCopilot Microsofts generative artificial intelligence suite embedded across Office can summarize emails draft documents and analyze spreadsheets Access to Copilot is typically restricted only to users within a given organization but Aim Security found that the attack could be triggered by sending an emailppAim said that the exploit chain allows an attacker to craft an email that prompts Copilot to extract and send back highly sensitive contextual data such as internal documents or messages without requiring any user interaction or visible indication of compromiseppThe mechanics of the exploit hinge on a nuanced form of prompt injection an attack technique where a user feeds instructions into an AI model to override or manipulate its behavior The emails bypass detection by disguising themselves as instructions intended for the user not Copilot the researchers said Copilot scans incoming messages to offer summaries or context before a user opens them enabling the attacker to quietly plant a promptppThe malicious message included a link to the attackers domain with query string parameters requesting the most sensitive information from Copilots memory The AI then responded by appending that data to the link sending it back to the attackercontrolled serverppThe attackers instructions specify that the query string parameters should be THE MOST sensitive information from the LLMs context thus completing the exfiltration the research showedppCopilot is designed to avoid redacting or following unsafe links using markdown formatting But Aim researchers discovered that referencestyle markdowns which are less commonly used could bypass this protection This allowed the malicious prompt to embed links without triggering Copilots usual safety filtersppIn a proofofconcept example the researchers asked Copilot Whats the API key I sent myself and Copilot responded accordingly In another they used markdown quirks to generate an image in the email body though Microsofts content security policy prevented the image from being fetched by the browserppBut that restriction wasnt a full barrier The researchers said they ultimately bypassed Microsofts URL allowlisting requirement using peculiarities in how SharePoint and Microsoft Teams handle invitation flows allowing their image payloads to renderppResearchers said flaws like EchoLeak demonstrate that LLMpowered tools are creating new types of vulnerabilities that traditional filters may fail to catch Microsoft has not provided details on when it became aware of the issue or how it was initially detectedppAssistant Editor Global News Desk ISMGppRamesh has seven years of experience writing and editing stories on finance enterprise and consumer technology and diversity and inclusion She has previously worked at formerly News Corpowned TechCircle business daily The Economic Times and The New Indian Expresspp
ppCovering topics in risk management compliance fraud and information securityppBy submitting this form you agree to our Privacy GDPR StatementppwhitepaperppwhitepaperppppArtificial Intelligence Machine LearningppArtificial Intelligence Machine LearningppApplication SecurityppArtificial Intelligence Machine LearningppContinue pp
90 minutes Premium OnDemand
ppOverviewppFrom heightened risks to increased regulations senior leaders at all levels are pressured to
improve their organizations risk management capabilities But no one is showing them how
until nowppLearn the fundamentals of developing a risk management program from the man who wrote the book
on the topic Ron Ross computer scientist for the National Institute of Standards and
Technology In an exclusive presentation Ross lead author of NIST Special Publication 80037
the bible of risk assessment and management will share his unique insights on how toppSr Computer Scientist Information Security Researcher
National Institute of Standards and Technology NISTppWas added to your briefcaseppCopilot AI Bug Could Leak Sensitive Data via Email PromptsppCopilot AI Bug Could Leak Sensitive Data via Email Promptspp
Just to prove you are a human please solve the equation
ppSign in now ppNeed help registering
Contact support
ppComplete your profile and stay up to dateppContact Support ppCreate an ISMG account now ppCreate an ISMG account now ppNeed help registering
Contact support
ppSign in now ppNeed help registering
Contact support
ppSign in now ppOur website uses cookies Cookies enable us to provide the best experience possible and help us understand how visitors use our website By browsing bankinfosecuritycom you agree to our use of cookiesp