A former security architect demonstrates 15 different ways to break Copilot: "Microsoft is trying, but if we are honest here, we don't know how to build secure AI applications"

Copilot Pro on Windows (Image credit: Windows Central)

What you need to know

  • Former Microsoft security architect Michael Bargury has identified multiple loopholes hackers can leverage to break Copilot and gain access to sensitive data.
  • Microsoft had previously announced its plans to pump the brakes on shipping new experiences to Copilot to improve existing ones based on feedback.
  • Microsoft recently highlighted several measures it is implementing to address rising security concerns across its tech stack, including tying a portion of top executives' compensation packages to their security deliverables.

While at the Black Hat USA 2024 conference, former Microsoft security architect Michael Bargury showcased multiple exploits that bad actors can leverage to breach Copilot's security guardrails and misuse its capabilities to cause harm.

Bargury demonstrated multiple ways hackers can use Copilot to access users' sensitive data and credentials. More specifically, the security architect's findings centered on Microsoft 365 Copilot. For context, it's an AI-powered experience embedded across the Microsoft 365 suite, including Word and Excel, that accesses your data to tailor the experience and enhance your workflow.

Privacy and security are among users' top concerns and a major hurdle to the adoption of artificial intelligence. Microsoft has security measures in place to protect user data while leveraging Microsoft 365 Copilot's capabilities. However, Bargury was able to bypass them.

In one demo, dubbed LOLCopilot, Bargury showed how an attacker who gains access to a victim's work email can use Copilot to run a spear-phishing campaign. Based on information gathered from the inbox, the tool can draft and send mass emails that mimic the author's writing style to maintain authenticity.

Perhaps more concerning is that Copilot can be tricked into retrieving sensitive employee data without raising security alerts. Hackers can use prompts that direct the chatbot to withhold references to the files the data originated from, ultimately bypassing Microsoft's data protection protocols.

Microsoft is trying, but if we are honest here, we don't know how to build secure AI applications.

Michael Bargury, Zenity CTO

Per recent reports, attackers are using increasingly sophisticated ploys, including AI, to lure in unsuspecting users, making threats much harder to detect. Speaking to Wired, Bargury noted, "A hacker would spend days crafting the right email to get you to click on it, but they can generate hundreds of these emails in a few minutes."

Microsoft needs to add more security layers to its top priority

A hooded hacker sifting through masses of data accessed using Copilot. (Image credit: Windows Central | Designer by Microsoft)

Generative AI has led to the emergence of powerful tools like ChatGPT and Microsoft Copilot, which sport sophisticated and advanced features like image and text generation. These tools are seemingly redefining how users interact with the internet. Even a former Google engineer says the biggest challenge to the company's dominance in search is OpenAI's temporary prototype search tool, SearchGPT.

Earlier this year, Microsoft highlighted its plans to halt shipping new experiences to Copilot. The company further indicated that it would use this opportunity to refine and improve existing experiences based on feedback.

Over the past few months, we've seen Microsoft shift its focus toward security and make it its top priority. As highlighted by Microsoft CEO Satya Nadella during the company's FY24 Q3 earnings report, "Security underpins every layer of the tech stack, and it's our No. 1 priority."

Microsoft has faced backlash over a cascade of security failures, including concerns around its AI-powered Windows Recall feature, which the company was forced to delay before it shipped exclusively to Copilot+ PCs.

Despite making security a team effort at the company and tying a portion of top executives' compensation packages to their security deliverables, more security flaws abound.


Kevin Okemwa
Contributor

Kevin Okemwa is a seasoned tech journalist based in Nairobi, Kenya, with years of experience covering the latest trends and developments in the industry at Windows Central. With a passion for innovation and a keen eye for detail, he has written for leading publications such as OnMSFT, MakeUseOf, and Windows Report, providing insightful analysis and breaking news on everything revolving around the Microsoft ecosystem. You'll also catch him occasionally contributing at iMore about Apple and AI. While AFK and not busy following the ever-emerging trends in tech, you can find him exploring the world or listening to music.

  • chris9465
    Microsoft overlooking Security/Privacy again. Call me Shocked.

    STOP TRUSTING MICROSOFT WITH YOUR INTERNET SECURITY!

    Microsoft doesn’t do security or privacy.

    If you really want to be secure on the internet? DO. IT. YOURSELF! If you’re unwilling to learn, Live with the consequences.

    Microsoft Security is the real world equivalent to using Scotch Tape to secure your home. Yes it’s there, but it’s not effective.