Akmal Chaudhri for SingleStore

Quick tip: How to Build Local LLM Apps with Ollama, DeepSeek-R1 and SingleStore

Abstract

In a previous article, we saw how to use Ollama with SingleStore. In this article, we'll modify that example, replacing the existing LLM with DeepSeek-R1.

The notebook file used in this article is available on GitHub.

Introduction

We'll follow the setup instructions from the previous article.

Fill out the notebook

We'll configure the code to use the smallest DeepSeek-R1 model, as follows:

import ollama

llm = "deepseek-r1:1.5b"

ollama.pull(llm)

We'll use LangChain's SingleStoreDB vector store to store the documents and their vector embeddings, as follows:

docsearch = SingleStoreDB.from_documents(
    docs,
    embeddings,
    table_name = "langchain_docs",
    distance_strategy = DistanceStrategy.EUCLIDEAN_DISTANCE,
    use_vector_index = True,
    vector_size = dimensions
)
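The DistanceStrategy.EUCLIDEAN_DISTANCE setting means similarity search ranks documents by the L2 distance between their embeddings and the query embedding, smallest first. A minimal standalone sketch of that metric, using toy 3-dimensional vectors rather than real embeddings:

```python
import math

def euclidean_distance(a, b):
    # L2 distance between two equal-length vectors
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Toy 3-dimensional "embeddings" (real embedding vectors are much longer)
query = [1.0, 0.0, 0.0]
doc_a = [0.9, 0.1, 0.0]  # close to the query
doc_b = [0.0, 1.0, 0.0]  # farther from the query

print(euclidean_distance(query, doc_a) < euclidean_distance(query, doc_b))  # True
```

With use_vector_index enabled, SingleStore can answer these nearest-neighbour lookups without scanning every row.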

Next, we'll use the following prompt:

prompt = "What animals are llamas related to?"
docs = docsearch.similarity_search(prompt)
data = docs[0].page_content
print(data)

Example output:

Llamas are members of the camelid family meaning they're pretty closely related to vicuñas and camels

We'll then use the prompt and response as input to DeepSeek-R1, as follows:

import re

output = ollama.generate(
    model = llm,
    prompt = f"Using this data: {data}. Respond to this prompt: {prompt}."
)

content = output["response"]
remove_think_tags = True

if remove_think_tags:
    # Strip the chain-of-thought section that DeepSeek-R1 wraps in <think> tags
    content = re.sub(r"<think>.*?</think>", "", content, flags = re.DOTALL)

print(content)
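To see the think-tag stripping step in isolation, here is a standalone check using a made-up response string (no Ollama call involved):

```python
import re

# Hypothetical DeepSeek-R1 style response containing a reasoning block
response = "<think>Llamas are camelids, so...</think>Llamas are related to camels and vicuñas."

# The non-greedy .*? plus re.DOTALL removes everything between the tags,
# even if the reasoning spans multiple lines
cleaned = re.sub(r"<think>.*?</think>", "", response, flags = re.DOTALL)
print(cleaned)  # Llamas are related to camels and vicuñas.
```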

DeepSeek-R1 wraps its reasoning process in <think> and </think> tags, so we use a flag to control whether that reasoning appears in the output.

Example output:

LLAMAS ARE RELATED TO CAMELS (THROUGH SIMILAR HOVES) AND VICUNVAS (THROUGH THEIR SIMILAR SKIN TEXTURE). They may also be indirectly related to other animals that use products with rubbery or bumpy skin, but their primary connections are through these shared characteristics.

The answer mixes correct and incorrect statements. For example, llamas and camels are indeed related, but because both are camelids, not because of similar hooves or skin texture.

Summary

Using a local Ollama installation gives us great flexibility in our choice of LLM. In this article, we were able to replace one LLM with another quite easily.
