UNTIL Microsoft curtailed the capabilities of its Bing chatbot – codenamed Sydney and powered by an advanced version of OpenAI’s ChatGPT model – there were a chaotic few days last month when it was threatening, cajoling, falling in love with and terrifying its beta testers.
Even journalists who regularly write about artificial intelligence expressed surprise: they know these programs are just statistical models of the language on the internet, but they still found Sydney’s “personality” unsettling and eerily human.
Bing’s chatbot has yet to be rolled out to the world at large, and its curtailment has prevented it from going off the rails again, but it remains unnerving. …