From @GaryMarcus:
"These systems never "think" and "figure out," they're just finding close approximations to the text that they've seen. It's more like an illusion than something that really understands things."
"These systems never "think" and "figure out," they're just finding close approximations to the text that they've seen. It's more like an illusion than something that really understands things."