
Tommaso Stocchi

.NET Aspire for Data Chatting

.NET Aspire is a great way to orchestrate our distributed applications: it helps from local development all the way to cloud deployment, and it works with different coding languages as well! Let's use it to build something a little more complex than the distributed application we saw last time. Let's build a chat application! Specifically, I want to be able to chat with my database.

The idea

With the growth of AI, we can see more and more chatbots around. They are used in many different scenarios, from customer service to personal assistants. But what if we could chat with our database? We could ask it to retrieve some data, update some records, or even create new ones. This is the idea behind this project. We will build a chat application that allows us to interact with our database using natural language.

In order to achieve this, we will use the following technologies: .NET Aspire, Semantic Kernel, and Data API Builder (DAB).

I've already discussed .NET Aspire in a previous article, so I won't go into details here. Semantic Kernel is a lightweight, open-source development kit that lets you easily build AI agents and integrate the latest AI models into your C#, Python, or Java codebase.

Data API Builder is a tool that allows you to create RESTful and GraphQL APIs for your database without writing any code.

As I mentioned, I want to be able to talk to my database using natural language. To do this, we will use Semantic Kernel to process the text input and generate a query that can be executed by DAB. The response from DAB will be sent back to the user, completing the loop. Of course, we will need to leverage Semantic Kernel's capability to use plugins and the GraphQL API provided by DAB.

GraphQL is a query language for your API that lets you request only the data you need, so Semantic Kernel will be able to generate the perfect query based on the user input.
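To make that concrete: once we expose a Customer entity through DAB (as we will below), a question like "show me the first five customers whose last name starts with A" could be translated by the model into a query along these lines. This is a hand-written illustration of DAB's GraphQL conventions (list field, items wrapper, filter argument), not output captured from the running app:

query {
  customers(first: 5, filter: { LastName: { startsWith: "A" } }) {
    items {
      FirstName
      LastName
      EmailAddress
    }
  }
}

Only the fields listed in the query travel back, which keeps the payload small and easy for the model to reason about.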

Data API Builder

Let's start with the core of our application: Data API Builder. DAB is cross-platform, open-source, and independent of language, technology, and frameworks. It's free, with no premium tier, and can run in any cloud. Best of all, it doesn't require any coding! We can either use the dab-cli or write the JSON file ourselves to have the API up and running in no time.

For this project, I have created a database from the AdventureWorks template. Using the dab-cli, I can simply run a few commands to initialize the configuration file:

dab init --database-type "mssql" --connection-string "@env('ConnectionStrings__sqldb')"

dab add "Customer" --source "SalesLT.Customer" --permissions "anonymous:*"
dab add "ProductModel" --source "SalesLT.ProductModel" --permissions "anonymous:*"
...

and a few more to define the relationships between the tables:

dab update Address --relationship customerAddresses --target.entity CustomerAddress --cardinality many --relationship.fields "AddressID:AddressID"
dab update Customer --relationship customerAddresses --target.entity CustomerAddress --cardinality many --relationship.fields "CustomerID:CustomerID"
...

This will generate a JSON file that looks something like this:


  "$schema": "https://github.com/Azure/data-api-builder/releases/download/v1.1.7/dab.draft.schema.json",
  "data-source": {
    "database-type": "mssql",
    "connection-string": "@env('ConnectionStrings__sqldb')",
    "options": {
      "set-session-context": false
    }
  },
  "runtime": {
    "graphql": {
      "enabled": true,
      "path": "/graphql",
      "allow-introspection": true
    },
  },
  "entities": {
    "Customer": {
      "source": {
        "object": "SalesLT.Customer",
        "type": "table"
      },
      "graphql": {
        "enabled": true,
        "type": {
          "singular": "Customer",
          "plural": "Customers"
        }
      },
      "permissions": [
        {
          "role": "anonymous",
          "actions": [
            {
              "action": "*"
            }
          ]
        }
      ],
      "relationships": {
        "customerAddresses": {
          "cardinality": "many",
          "target.entity": "CustomerAddress",
          "source.fields": [
            "CustomerID"
          ],
          "target.fields": [
            "CustomerID"
          ]
        },
        ...

Using DAB we can configure everything: authentication, authorization, data access roles, row-level security, and more.
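For example, instead of the anonymous:* permission used above, an entity could be limited to authenticated users and filtered per user with a database policy. The snippet below is only a sketch of what such a permissions block can look like; the userId claim is a hypothetical example, not something configured in this project:

"permissions": [
  {
    "role": "authenticated",
    "actions": [
      {
        "action": "read",
        "policy": {
          "database": "@claims.userId eq @item.CustomerID"
        }
      }
    ]
  }
]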

Semantic Kernel

Because we want to be able to chat with our database, we need to process the text input and generate a query that DAB can execute. This is where Semantic Kernel comes in. We can add an OpenAPI plugin to Semantic Kernel so that it can invoke an API. Earlier, we defined a GraphQL API using DAB; we can describe that GraphQL API in an OpenAPI file written in YAML, so that it can be consumed by Semantic Kernel.

openapi: 3.0.1
info:
  title: Data API builder - Graphql Endpoint
  version: 0.12.0
servers:
  - url: http://localhost:5000
paths:
  /graphql:
    post:
      description: a graphql endpoint.
      operationId: graphql_post
      ...

Now the only thing left is importing the plugin into Semantic Kernel:

await kernel.ImportPluginFromOpenApiAsync(
    pluginName: "GQLPlugin",
    filePath: Path.Combine(AppDomain.CurrentDomain.BaseDirectory, @"Plugins/DAB_GraphQL/openapi.yml"),
    executionParameters: new OpenApiFunctionExecutionParameters()
    {
        ServerUrlOverride = new Uri(Environment.GetEnvironmentVariable("services__dab__http__0")!),
        EnableDynamicPayload = true,
        IgnoreNonCompliantErrors = true,
        EnablePayloadNamespacing = true,
    });

and of course a good system prompt to make sure it will use the plugin correctly.

The most important thing is that the Kernel doesn't try to guess the GraphQL schema, but rather uses introspection to understand it:

...
### on the graphql_post tool:
- the graphql_post tool provides access to data useful to answer the user's questions.
- the graphql_post tool is a graphql endpoint.
- **Do not** guess schema type names, relationships or data types.
- You **must** use introspection to understand the whole schema before planning the query you will execute.
- **Remember** names, including field names and type names, are case sensitive in graphql.
- when composing the graphql request you can use **only** the following parameters: filter, orderBy, first and after.
- when composing the graphql request you can use **only** the following operators: eq, neq, gt, gte, lt, lte, isnull, contains, notcontains, startsWith, endsWith, and, or.
- mutation operations are not allowed.
...
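With the plugin imported and the system prompt in place, a chat turn is just a regular Semantic Kernel completion call with automatic function invocation enabled, so the model can decide on its own when to call graphql_post. The snippet below is a minimal sketch of that wiring; the variable names and execution settings are my own and may differ from the repository code:

using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;

// systemPrompt holds the guidelines shown above
var chatService = kernel.GetRequiredService<IChatCompletionService>();
var history = new ChatHistory(systemPrompt);
history.AddUserMessage("List the first five product models in the catalog.");

// Let the model call the GQLPlugin functions automatically when it needs data
var settings = new OpenAIPromptExecutionSettings
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};

var reply = await chatService.GetChatMessageContentAsync(history, settings, kernel);
Console.WriteLine(reply.Content);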

Here's the result of a simple request:

Chat Screenshot

As you can see, I let the agent outline its reasoning and give a somewhat formatted output. Of course, for business scenarios, we will need to cut out the reasoning and show the results in a more user-friendly way.

Putting it all together

I mentioned at the start that this whole project relies on .NET Aspire. It's the orchestrator that makes sure the correct services are running and the correct endpoints are injected where needed.

Of course, Data API Builder can be run in a container, so our App Host becomes pretty simple:

var builder = DistributedApplication.CreateBuilder(args);

var sql = builder.AddConnectionString("sqldb");

var dabService = builder.AddContainer("dab", "mcr.microsoft.com/azure-databases/data-api-builder")
    .WithHttpEndpoint(targetPort: 5000, name: "http")
    .WithBindMount(@"D:\src\distributed-dab-sk\aw-data-api-builder\dab-config.json", "/App/dab-bm0/dab-config.json", true)
    .WithArgs("--ConfigFileName", "./dab-bm0/dab-config.json")
    .WithReference(sql)
    .WithOtlpExporter()
    .PublishAsContainer();
var dabServiceEndpoint = dabService.GetEndpoint("http");

builder.AddProject<Projects.DataAPIBuilder_AI_Web>("webfrontend")
    .WithExternalHttpEndpoints()
    .WithReference(dabServiceEndpoint);

builder.Build().Run();

Notice how the connection string for the database is injected into the DAB container and the endpoint of the DAB container is injected into the web frontend. That environment variable is what overrides the server URL defined in the OpenAPI plugin file mentioned above.
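Concretely, at run time the two resources end up with environment variables along these lines (the names match what the configuration and code above rely on; the values here are purely illustrative):

# injected into the "dab" container by WithReference(sql)
ConnectionStrings__sqldb=Server=...;Initial Catalog=AdventureWorksLT;...

# injected into the "webfrontend" project by WithReference(dabServiceEndpoint)
services__dab__http__0=http://localhost:5000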

With this in place, we can leverage the .NET Aspire manifest to deploy the whole application to the cloud, hosting it on Azure Container Apps.
We now have a chatbot for our database that requires no coding when new tables are added and that runs in a serverless, scalable environment!

You can find the full code along with some more technical details in the GitHub repository.
