DEV Community

Arbisoft

Liquid AI Redesigns Neural Networks: Introducing Liquid Foundation Models (LFMs)

Before we dive into this topic, here is the link to our complete blog that you can check out for more details!

Liquid AI recently launched its Liquid Foundation Models (LFMs) at an event that highlighted their capabilities and potential applications. Here’s a comprehensive look at the event and what these models mean for the future of AI.

During the launch, the presenter, Lechner, explained that Liquid AI has developed three specific LFM models engineered for “training at every scale.” These models are tailored for key offline uses, with each one designed to perform optimally on different hardware setups.

Model Specifications

Liquid AI introduced three models:

  • 1.3 billion parameters
  • 3 billion parameters
  • 40 billion parameters

Lechner emphasized that these numbers were not chosen at random, stating, “We deploy where no one else can.” This approach ensures that each model is specifically optimized for the device it operates on.

Furthermore, his session focused on how these models are tailored for different devices, how they handle information smoothly, and how they work in different situations.

Compact Yet Powerful

The smallest model, with 1.3 billion parameters, is designed to run on very compact hardware, specifically the Raspberry Pi. Lechner showcased this capability, noting that the Raspberry Pi is a full-fledged computer, and demonstrated the smallest LFM running effectively on such a small device.
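Why does a 1.3-billion-parameter model fit on a Raspberry Pi at all? A back-of-the-envelope memory estimate makes it plausible. The sketch below assumes 4-bit quantization of the weights, which is a common deployment choice but not something the presentation confirmed for LFMs:

```python
def model_memory_gb(num_params: float, bits_per_param: int = 4) -> float:
    """Rough weight-memory footprint at a given quantization level.

    Ignores activations, KV cache, and runtime overhead; a real
    deployment needs headroom beyond this figure.
    """
    return num_params * bits_per_param / 8 / 1e9

# At 4 bits per weight, ~1.3B parameters take about 0.65 GB, which
# fits in a Raspberry Pi's 4-8 GB of RAM. The 40B model at the same
# quantization needs ~20 GB, explaining why each model targets
# different hardware.
print(round(model_memory_gb(1.3e9, 4), 2))  # → 0.65
print(round(model_memory_gb(40e9, 4), 1))   # → 20.0
```

This simple arithmetic is one way to read Lechner's point that the parameter counts "were not chosen at random": each size maps to a tier of device memory.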

Enhancing Responsiveness

Lechner highlighted that the models are responsive and use their context effectively. He noted that adding web search functionality lets the models incorporate fresh information that falls outside their training data, improving the relevance of their answers.
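The web-search idea boils down to retrieval augmentation: fetch fresh snippets, then put them in the model's context ahead of the question. The talk did not show code for this, so the helper below is a minimal sketch of the prompt-assembly step; the `search()` and `lfm.generate()` calls referenced in the comment are hypothetical placeholders:

```python
def build_search_augmented_prompt(question: str, snippets: list[str]) -> str:
    """Prepend retrieved web snippets so the model answers from fresh
    context rather than from its (possibly stale) training data."""
    context = "\n".join(f"[{i + 1}] {s}" for i, s in enumerate(snippets))
    return (
        "Answer the question using the search results below. "
        "Cite results by their [number].\n\n"
        f"Search results:\n{context}\n\n"
        f"Question: {question}"
    )

# Hypothetical end-to-end flow:
#   snippets = search("Liquid Foundation Models release date")
#   answer = lfm.generate(build_search_augmented_prompt(question, snippets))
prompt = build_search_augmented_prompt(
    "When were LFMs announced?",
    ["Liquid AI announced LFMs at a launch event."],
)
print(prompt.splitlines()[0])
```

The design choice worth noting: the model itself never "browses"; it only ever sees text already placed in its context window, which is why context efficiency matters so much for this pattern.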

Demonstrations

One of the most engaging parts of Lechner’s presentation was the live demos. Using the 3 billion parameter model on a laptop, he asked an AI voice assistant about things to do in Boston and for tips on presenting at events like the one they were attending. The AI provided valuable etiquette advice, such as maintaining eye contact and offering a LinkedIn profile or business card after a meeting. The warmth of the assistant’s responses, including affirmations like “Absolutely!” and “You’re welcome!” made the interaction feel emotionally authentic.

Offline Functionality

Lechner also addressed privacy concerns and the challenges of having no internet connection. He demonstrated this by turning off Wi-Fi and asking the model for three AI-themed Halloween costume ideas. The AI creatively suggested:

  • Neural Network Necromancer
  • Deep Learning Droid
  • Generative Adversarial Knight

Then, he prompted the AI to create a short story about one of the costumes, resulting in a tale titled “The Necromancer’s Awakening,” which garnered applause from the audience.

Transforming Customer Service

In another demo, Lechner illustrated a potential customer service application using the 3 billion parameter model. He showed how the model could process unstructured data from two verbal customer problem descriptions and return accurate summaries, classifications, and customer names. This demonstrated the model’s ability to transform customer interactions into structured data effectively.
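Turning free-form customer messages into structured records is usually done by instructing the model to reply in JSON and then parsing that reply. The demo's actual prompt and schema were not shown, so the prompt template and field names below (`customer_name`, `category`, `summary`) are illustrative assumptions:

```python
import json

# Hypothetical instruction template; the demo's real prompt was not shown.
EXTRACTION_PROMPT = (
    "Read the customer's message and reply with ONLY a JSON object with "
    'keys "customer_name", "category", and "summary".\n\nMessage: {message}'
)

def parse_ticket(model_reply: str) -> dict:
    """Parse the model's JSON reply into a structured ticket record.

    Models sometimes wrap the JSON in extra prose, so extract the first
    {...} span before parsing.
    """
    start, end = model_reply.find("{"), model_reply.rfind("}")
    if start == -1 or end == -1:
        raise ValueError("no JSON object found in model reply")
    return json.loads(model_reply[start : end + 1])

# Example with a hypothetical model reply:
reply = ('{"customer_name": "Dana", "category": "billing", '
         '"summary": "Charged twice for one order."}')
ticket = parse_ticket(reply)
print(ticket["category"])  # → billing
```

A pipeline like this is what makes the demo's output (summaries, classifications, customer names) directly usable by a downstream ticketing system instead of remaining free text.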

Looking Ahead

Lechner concluded his presentation by emphasizing, “We unlock the power of GenAI on devices that were once out of reach, in a private and secure manner.” He explained that these models can be easily fine-tuned and tailored to meet specific customer needs.

The impressive nature of the demos, combined with Lechner’s insights on deployment, paints a promising picture of how Liquid AI’s LFM models could be utilized across various markets. The transformative potential of these technologies is immense, and as businesses adapt to this innovative landscape, the future of AI looks brighter than ever.
