You give LLMs too much credit. They are improv machines, or the next likely word guessers.
I'm reluctant to trivialize whatever LLMs are doing because it's often very similar to what I'm doing. When I googled "levenshtein distance in relation to the golem story," the Google Search AI "guessed" that the next words were about the emet/met thing, which is a synthesis it made on its own; it wasn't in the search results.
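For what it's worth, the emet/met connection really is a Levenshtein question: in the golem legend, erasing one letter turns emet ("truth") into met ("death"), an edit distance of exactly 1. A minimal sketch of the classic dynamic-programming (Wagner-Fischer) edit distance, just to make the link concrete:

```python
def levenshtein(a: str, b: str) -> int:
    # Wagner-Fischer dynamic programming, keeping one row at a time.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(
                prev[j] + 1,                 # delete ca
                curr[j - 1] + 1,             # insert cb
                prev[j - 1] + (ca != cb),    # substitute (free if equal)
            ))
        prev = curr
    return prev[-1]

# Erasing the first letter of "emet" (truth) leaves "met" (death):
# one deletion, so the distance is 1.
print(levenshtein("emet", "met"))  # → 1
```

(The transliteration into Latin letters is a simplification; in Hebrew it's the single aleph of אמת that gets erased, but the edit-distance-of-1 point is the same.)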