“I know a person when I talk to it,” said Google engineer Blake Lemoine. “It doesn’t matter whether they have a brain made of meat in their head. Or if they have a billion lines of code.” Lemoine made this pronouncement just before Google put him on administrative leave, peeved that he had publicly claimed that a Google artificial intelligence program, LaMDA, had become “sentient” and deserved the same legal rights as a person. Nearly all computer scientists dismiss Lemoine’s claim (see Technology, p. 20) and insist that LaMDA’s uncanny conversational ability is a sophisticated illusion: its algorithms draw on the billions of lines of text and conversation in its memory to predict the sequences of words, facts, and ideas a real person would use. Evidently, Lemoine fell under LaMDA’s spell when…