You’ll probably need to learn a new acronym to keep up with the latest AI trend


Hello and welcome to Eye on AI.

The tech press is lighting up with reviews of the Rabbit R1, the bright orange “AI box” that recently started shipping to users after making a huge splash at the CES trade show in January. The $200 standalone device, smaller than a smartphone, is designed to answer queries just as ChatGPT, Gemini, or any other general AI chatbot would, but also to call you an Uber, order you food via DoorDash, play music from Spotify, and generate images with Midjourney. Reviewers have largely found it charming, puzzling, and mostly useless: it doesn’t do those things particularly well, frequently gives wrong information (as chatbots do), and has a lot of other limitations. But regardless of its utility, one thing the R1 is doing successfully is putting LAMs on the map.

In addition to an LLM, the Rabbit R1 uses another model called a Large Action Model, or LAM, to perform the four aforementioned tasks that involve other services (the company says it plans to add support for more services soon). LAMs are an emerging type of model that essentially turn LLMs into agents that can not only give you information but connect with external systems to take actions on your behalf. In short, they take in natural language (whatever you tell them to do) and spit out actions (do what you requested).
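To make that loop concrete, here’s a minimal, hypothetical sketch in Python of the natural-language-in, actions-out pattern. Rabbit hasn’t published its implementation, so everything below is illustrative: the toy regex “parser” stands in for the trained model itself, and the service stub stands in for the real Uber, DoorDash, Spotify, and Midjourney integrations a LAM would drive.

```python
# Illustrative only: a toy version of the LAM loop, where free-form text
# is mapped to a structured action and dispatched to an external service.
# A real LAM would use a transformer for the mapping, not regexes.
import re
from dataclasses import dataclass

@dataclass
class Action:
    service: str   # which external system to invoke
    command: str   # what to ask it to do
    argument: str  # free-text payload pulled from the request

# Hypothetical intent table standing in for the model's learned mapping.
INTENT_PATTERNS = [
    (r"play (.+)", "spotify", "play"),
    (r"order (?:me )?(.+)", "doordash", "order"),
    (r"(?:call|get) me an? (uber|ride)", "uber", "request_ride"),
    (r"(?:draw|generate an image of) (.+)", "midjourney", "imagine"),
]

def parse_intent(request: str) -> Action | None:
    """Map free-form text to a structured action, or None if unrecognized."""
    for pattern, service, command in INTENT_PATTERNS:
        match = re.search(pattern, request, re.IGNORECASE)
        if match:
            return Action(service, command, match.group(1))
    return None

def execute(action: Action) -> str:
    """Stand-in for calling the external service's API or driving its UI."""
    return f"[{action.service}] {action.command}({action.argument!r})"

if __name__ == "__main__":
    for request in ["Play the new Beyoncé album", "Order me a burrito", "Get me an Uber"]:
        action = parse_intent(request)
        print(request, "->", execute(action) if action else "no matching action")
```

The key idea the sketch tries to capture is the structured action in the middle: the model’s job is to translate an open-ended request into something an external system can actually execute, which is precisely the step that separates a LAM from a chat-only LLM.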

LLMs are limited to generating content and can’t take other actions, but LAMs build on their success. They use the same underlying transformer architecture that made LLMs possible, and they could open up a slew of new use cases and AI applications. The long-sought vision of a true AI assistant, for example, depends on a system that can perform tasks on its own. LAMs are poised to play a major role in making that vision a reality, and if all goes as planned, they’ll also further elevate AI’s role and power in our lives.

The LAM-based actions the R1 is currently capable of aren’t hugely consequential. In his review for The Verge, David Pierce described asking the R1 to play Beyoncé’s new album, only for it to excitedly present a lullaby version of “Crazy in Love” from an artist called “Rockabye Baby!” Online, users have also complained that the R1 ordered them the wrong meal or had it delivered to the wrong place: frustrating, for sure, but not the end of the world. But while LAMs are still at an early stage, the ambition to use them across nearly all sectors and for more significant use cases is growing.

Microsoft, for example, says it’s developed LAMs “that will be capable to perform complex tasks in a multitude of scenarios, like how to play a video game, how to move in the real world, and how to use an Operating System (OS).” One recent paper coauthored by researchers at the company proposes a paradigm for training AI agents across a wide range of domains and tasks, demonstrating its performance in healthcare, robotics, and gaming.

Of course, this also includes growing interest in how the models can be deployed in the enterprise. Salesforce is turning to LAMs for its Service Cloud and Sales Cloud products, where it’s looking to have the models take actions on clients’ behalf. In March, banking platform NetXD unveiled a “LAM for the enterprise” geared toward banks, health care companies, and other institutions that it says can understand user instructions and generate code to automate the execution of microservices and actions. There’s also startup Adept, which has earned backing from Microsoft and Nvidia, along with a $1 billion valuation, for its pursuit of a “machine learning model that can interact with everything on your computer.” LLMs are already everywhere, and LAMs are closing in fast behind them.

Now here’s more AI news.

Sage Lazzaro
sage.lazzaro@consultant.fortune.com
sagelazzaro.com

Correction: Last week’s edition (April 25) stated that Eli Lilly last year acquired AI drug discovery startup XtalPi in a deal worth $250 million. It was a collaboration deal valued at that amount.

