Emil Valeev

🕵 Agency: The Go Way to AI. Part 1
Making AI Easy for Go Developers

The rise of generative AI, from the OpenAI API to local LLMs, is reshaping application development. Python and JavaScript have plenty of tools for this, with LangChain being the most popular. Go developers, however, have fewer options. LangChainGo, a Go port of LangChain, doesn't fit well with Go's style, and some also find LangChain itself too complex.

Recognizing the need for a Go-friendly tool that's simple yet powerful, we developed Agency. This Go library takes a clean approach and plays to Go's strengths: a static type system and high performance. It's our answer to bringing easy-to-use, efficient AI capabilities to Go developers.

Example - Continue Text

Let's start with the simplest possible example: chat completion. We'll break it down piece by piece and then combine everything into a single, cohesive example.

First, we need to create a provider:

provider := openai.New(
    openai.Params{Key: "YOUR_OPENAI_API_KEY"},
)

A provider is a set of operations implemented by some external service. In this case, it's OpenAI.

Note that we passed a Params struct. For a local LLM (it must expose an OpenAI-compatible API), we can pass openai.Params{BaseURL: "YOUR_SERVICE_URL_HERE"} instead.

Now that we have a provider, it's time to create an operation:

operation := provider.
    TextToText(openai.TextToTextParams{Model: "gpt-3.5-turbo"}).
    SetPrompt("You are a helpful assistant that translates English to French")

The TextToText method we call here is an operation-builder: a function that takes some params and returns an operation, a value of type agency.Operation.

Operations are basic building blocks.

This particular operation-builder has the following signature:

func (p Provider) TextToText(params TextToTextParams) *agency.Operation

The TextToTextParams struct is specific to the concrete provider and operation-builder: it depends on which external service the provider uses and which functionality (modality) it implements. Almost every provider/builder lets you specify a Model, but we'll see the differences later on.

For example, an Anthropic provider could have different params than OpenAI, and the openai.SpeechToText operation-builder also has different params, because it uses a different model under the hood than text-to-text (Whisper instead of GPT).

Now, see this SetPrompt("You are a helpful assistant that translates English to French") line? This is how we configure operations. In this case, we configure the prompt that will be used.

Okay, it looks like it's time to talk about what operations really are. Let's dig into the library source code:

// Operation is basic building block.
type Operation struct {
    handler OperationHandler
    config  *OperationConfig
}

// OperationHandler is a function that implements the actual logic.
// It could be thought of as an interface that providers must implement.
type OperationHandler func(context.Context, Message, *OperationConfig) (Message, error)

// OperationConfig represents abstract operation configuration.
// It contains fields for all possible modalities but nothing specific to concrete model implementations.
type OperationConfig struct {
    Prompt   string
    Messages []Message
}

At the time of writing, the library is at version v0.1.0 and under active development, so implementation details may change, but the main idea should remain: an operation consists of a handler and a config.

The handler is a function that implements the actual logic. Its signature, just as the comment says, can be "thought of as an interface that providers must implement".

Now let's look at what this SetPrompt("...") method does:

func (p *Operation) SetPrompt(prompt string, args ...any) *Operation {
    p.config.Prompt = fmt.Sprintf(prompt, args...)
    return p
}

That's it. It simply configures the operation and implements templating via fmt.Sprintf.

Are you still here with me? We're almost finished! The scariest stuff is behind us.

Our operation is now ready to use! Next, we need some input for it. Let's create a message:

input := agency.UserMessage("I love programming.")

We aren't scared of digging into the source code, are we? Here's the UserMessage implementation:

func UserMessage(content string, args ...any) Message {
    s := fmt.Sprintf(content, args...)
    return Message{Role: UserRole, Content: []byte(s)}
}

That's right, it's just a tiny helper for Message. What's that? It's an abstract message that represents any possible message that operations operate on:

type Message struct {
    Role    Role
    Content []byte
}

We already saw this Message before, in the operation handler's definition:

func(context.Context, Message, *OperationConfig) (Message, error)

In other words, an operation is a function that takes a message as input and returns a message as output.

Finally, let's execute our operation and see what happens!

output, err := operation.Execute(context.Background(), input)
if err != nil {
    panic(err)
}

Now let's put it all together!

provider := openai.New(
    openai.Params{Key: os.Getenv("OPENAI_API_KEY")},
)

operation := provider.
    TextToText(openai.TextToTextParams{Model: "gpt-3.5-turbo"}).
    SetPrompt("You are a helpful assistant that translates English to French")

input := agency.UserMessage("I love programming.")
output, err := operation.Execute(context.Background(), input)
if err != nil {
    panic(err)
}

fmt.Println(string(output.Content))

Don't forget to set your OPENAI_API_KEY environment variable. Here's my output:

J'adore la programmation.

Looks like it works! You can, of course, rewrite this in a more compact form if you want:

output, err := openai.New(openai.Params{Key: "YOUR_OPENAI_API_KEY"}).
    TextToText(openai.TextToTextParams{Model: "gpt-3.5-turbo"}).
    SetPrompt("You are a helpful assistant that translates English to French").
    Execute(context.Background(), agency.UserMessage("I love programming."))

That's all for now! Thank you very much for reading. If you find any mistakes, please leave a comment. Links to this and many more working examples, as well as to the library itself, can be found below.

We want to build the best library in the field, and we need your help. Feature requests, bug reports, and of course pull requests are more than welcome. We'll try to provide feedback as soon as possible.


In the next parts of this series, we will explore topics such as:

  • How to create a ChatGPT-like application in 40 lines of code
  • How to combine operations in chains to execute them sequentially
  • How to create custom operations
  • How to use interceptors to observe operation-chains
  • How to use prompt-templating
  • How to use different modalities (speech, images, etc.)
  • How to implement RAG (using vector databases)
  • ... And many more!

Links:

Top comments (1)

Tobias Gleiter

Are you going to implement a manager llm?