LLM libraries are most prevalent in Python and TypeScript, and Ruby, which I'll cover this time, still lags behind in popularity. However, a considerable number of people build applications with Rails, and there is strong demand to add LLM functionality to those applications. For many of them, implementing LLM features in Python or TypeScript, languages they may not be familiar with, feels disconnected from their everyday web development work.
Current State of LLM Libraries (as of June 28, 2023)
First, let's summarize the current state of LLM libraries for Python and TypeScript, which are likely to have many users.
Python
- openai/openai-python: The OpenAI Python library provides convenient access to the OpenAI API from applications written in the Python language.
- pinecone-io/pinecone-python-client: The Pinecone Python client
- hwchase17/langchain: ⚡ Building applications with LLMs through composability ⚡
- jerryjliu/llama_index: LlamaIndex (GPT Index) is a data framework for your LLM applications
- microsoft/guidance: A guidance language for controlling large language models.
TypeScript
- openai/openai-node: Node.js library for the OpenAI API
- rileytomasek/pinecone-client: Pinecone.io client with excellent TypeScript support.
- hwchase17/langchainjs
Surprisingly, there are options for Ruby as well
Now let's introduce recommended libraries for Ruby.
Boxcars
Unlike Langchain, Boxcars focuses not on vector database integration but on a range of features for having LLMs generate queries against things like Google search, SQL, and ActiveRecord. The experience of driving ActiveRecord with natural language is quite interesting.
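To give a feel for that, here is a minimal sketch based on my reading of the Boxcars README; the constructor options and the Comment and Ticket models are assumptions on my part, so check the gem's documentation for the current API:
require "boxcars"

# Let the LLM answer a natural-language question by generating and
# running ActiveRecord queries against the given models.
boxcar = Boxcars::ActiveRecord.new(models: [Comment, Ticket])
boxcar.run("How many comments are there for ticket 5?")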
Langchain.rb
andreibondarev/langchainrb: Build LLM-backed Ruby applications
This is an unofficial Ruby version of Langchain. The name is expected to change eventually, but for now it goes by langchainrb. I am also involved in its development, and in recent updates integration with ActiveRecord has been added.
Hooks are provided to keep the data in the connected vector database in sync with updates on the ActiveRecord (RDB) side. For example:
class Product < ActiveRecord::Base
  vectorsearch provider: Langchain::Vectorsearch::Pinecone.new(
    api_key: ENV["PINECONE_API_KEY"],
    environment: ENV["PINECONE_ENVIRONMENT"],
    index_name: "Products",
    llm: Langchain::LLM::OpenAI.new(api_key: ENV["OPENAI_API_KEY"])
  )

  after_save :upsert_to_vectorsearch
end
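With that in place, saving a record fires the after_save hook and upserts its embedding into Pinecone. As a hedged usage sketch (the name and description attributes are placeholders, and I recall the integration exposing a similarity_search class method, but check the langchainrb README for the current API):
# Creating a product triggers after_save, which upserts the record's
# embedding into the "Products" Pinecone index configured above.
Product.create(name: "Wireless keyboard", description: "Slim Bluetooth keyboard")

# Semantic search over the synced vectors.
Product.similarity_search("something to type on without cables")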
It's easy to imagine how this would integrate with real applications!
Baran
moekidev/baran: Text Splitter for Large Language Model (LLM) datasets.
Baran is a library I recently released for text splitting. Text splitting comes up constantly when working with LLMs: it is an effective way to stay within prompt token limits and to improve the accuracy of vector search. Langchain, for example, has a Text Splitter module for this.
Split by character | 🦜️🔗 Langchain
Baran is also used in Langchain.rb. It can split Markdown as well, and here's an example of how it works:
require "baran"

splitter = Baran::MarkdownSplitter.new
splitter.chunks(markdown)
# => [{ cursor: 0, text: "..." }, ...]
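For plain text, you can tune chunk size and overlap to stay within a prompt's token budget. A minimal sketch, assuming the RecursiveCharacterTextSplitter options (chunk_size and chunk_overlap) from Baran's README; double-check the current signature:
# Split long_text into roughly 500-character chunks with a 64-character
# overlap so adjacent chunks share some context for retrieval.
splitter = Baran::RecursiveCharacterTextSplitter.new(
  chunk_size: 500,
  chunk_overlap: 64
)
splitter.chunks(long_text)
# => [{ cursor: 0, text: "..." }, ...]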
Pulling in Langchain.rb just for text splitting would be overkill, so I created this library for the situations where splitting on its own is all I needed.
ActivePinecone
This is another library I recently released, named in the Active-something style of the Rails family (ActiveRecord, ActiveStorage, and so on). It lets you manipulate vector data models through an ActiveRecord-like interface.
For example, you can define a Recipe model like this:
class Recipe < ActivePinecone::Base
  vectorizes :title, :body
end
You can create and search for recipes with semantic search capabilities:
recipe = Recipe.create(
  title: "Hamburger",
  body: "...",
  author: "Kevin"
)

recipes = Recipe.search("How to make a hamburger")
recipes.first.title
# => "Hamburger"
It even comes with an assistant:
assistant = Recipe.assistant
reply = assistant.reply("How to make a hamburger")

reply.content
# => "OK!..."

reply.references
# => [#<Recipe:0x000...>, ...]
It can be a fun experience!
Ruby's LLM ecosystem is still a work in progress
All of these libraries still have relatively few contributors. As I mentioned earlier, I believe that deepening the integration with Rails will make LLM functionality fit more naturally into the applications developers already work on every day (and these projects may be a good fit for first-time contributors, too). So I encourage everyone to jump in and contribute!