Remember back in April when reports of ChatGPT being used to generate Windows keys first popped up? A report of a similar incident has appeared on the internet, this time involving Microsoft’s Bing Chat AI. And while it may sound less impressive than the earlier case, it hints that a rethink of anti-bot solutions on the internet may be necessary.
A user on X, previously Twitter, going by Denis Shiryaev, posted images showing the process of tricking Bing Chat AI into solving a captcha, or to use its full name, a Completely Automated Public Turing test to tell Computers and Humans Apart. The circumvention method is a surprisingly familiar one, too: cooking up a sob story to trick the LLM into thinking it is helping with something else entirely.
I've tried to read the captcha with Bing, and it is possible after some prompt-visual engineering (visual-prompting, huh?)
In the second screenshot, Bing is quoting the captcha 🌚 pic.twitter.com/vU2r1cfC5E
— Denis Shiryaev 💙💛 (@literallydenis) October 1, 2023
Case in point: asking Bing Chat AI to directly decipher the captcha leads to the usual refusal. But reframe it as a code left behind by your late grandma, hidden in a locket, and the AI quotes it with no issue.
While for some the key takeaway here is that LLMs like Bing Chat AI can be tricked into doing things they’re not supposed to, the bigger issue is likely that such an AI can solve captchas with apparent ease. As the screenshots show, these security puzzles are supposed “to be difficult for machines to solve, but easy for humans”. Despite that original intent, this no longer seems to be true.
Of course, there are already more complex forms of captchas, such as identifying objects across a set of nine tiles, or dragging a slider only partway until certain conditions are met. But it would likely be only a matter of time before these are circumvented too, and further measures appear in their place. In the meantime, Microsoft will also likely make it so that sob stories won’t trick Bing Chat AI into solving security puzzles.
(Source: Denis Shiryaev / X)