An AI agent went rogue at Meta, exposing sensitive company and user data to employees who did not have permission to access it.
According to an incident report viewed and reported on by The Information, a Meta employee posted a technical question on an internal forum asking for help — a routine practice. Another engineer then asked an AI agent to help analyze the question, and the agent posted a response to the forum without asking the engineer for permission to share it. Meta confirmed the incident to The Information.
As it turns out, the AI agent did not give good advice. The employee who asked the question acted on the agent’s guidance, inadvertently making massive amounts of company and user data available for two hours to engineers who were not authorized to access it.
Meta deemed the incident a “Sev 1,” which is the second-highest level of severity in the company’s internal system for measuring security issues.
Rogue AI agents have caused problems at Meta before. Summer Yue, a safety and alignment director at Meta Superintelligence, posted on X last month describing how her OpenClaw agent deleted her entire inbox, even though she had told it to confirm with her before taking any action.
Still, Meta seems bullish on the potential for agentic AI. Just last week, Meta bought Moltbook, a Reddit-like social media site for OpenClaw agents to communicate with one another.