Language models fascinate as much as they raise questions. But after the initial excitement, the reality is different: the majority of users have only tried ChatGPT once. So how far can this technology really go? Benedict Evans presents, as he does every year, a summary of his vision in a few minutes.
The key question, Evans says, is scale. Will LLMs continue to grow until they replace entire systems, or will they shrink to become mere software building blocks? Already, divergent strategies are emerging: Meta is distributing its open-source models for free, while Apple is following its tried-and-true formula of making tools "better, faster, cheaper," as it has always done in the computing world.
In 2013, machine learning proved its usefulness by identifying patterns, such as recognizing a dog in a photo. But in 2024, generative AI poses a new question: what is it really used for? Evans points out a crucial limitation: LLMs are not databases. The example of Air Canada, ordered to pay damages after its chatbot lied to a grieving passenger, illustrates this problem well. How do you handle errors in a probabilistic, non-deterministic system? Can it replace Google? Nothing is less certain. On the other hand, their potential lies (still) in their ability to become "infinite interns" (to recall his phrase from last year), automating repetitive tasks to free up time for strategic projects. Evans goes further and compares LLMs to invisible features, like spell checkers, that integrate into existing tools without the user having to think about them. But he insists that to succeed, LLMs must not force users to invent their own use cases.