Making a Financial Chatbot

Chatbots are everywhere. But what is a chatbot? How can you build your own? In this article we describe several tools and techniques to get you started.

Requirements

  • Examples are written in the Java programming language (Java 1.8)
  • Telegram Desktop client v 1.2.1 or Telegram Mobile client (Android, iOS) v 4.6
  • Telegram Bot API v 3.5
  • Python 2 v. 2.6.5+ or Python 3 v. 3.3+
  • Python pip utility
  • A basic understanding of modern web protocols, including REST, is helpful.

Introduction

A chatbot (bot) is any program that communicates with human users or other bots using natural language. Protocols and APIs now exist such that chatbots can be developed and integrated into social media channels like Facebook and Telegram. In this article we look at approaches to developing a simple chatbot that can respond to a basic question in the financial domain.

Smart Chatbots

Imagine designing a simple chatbot. How would a conversation begin? If a bot sees the text “Hello!”, it might reply with “Hi!” or “Hello!”. What if, instead of “Hello!”, a user types “Hi!”, “Howdy”, “How do you do?” or even “Good morning”? What if a user changes the word order, turning “Are you a bot?” into “You are a bot, aren’t you?”? We all get frustrated with bots and devices that reply “Hmm, I didn’t get it, say it again, please?”, but anticipating every possible input is likely impossible. However, techniques and tools exist to make our bot appear smarter.

Regular Expressions

Regular Expressions (regex) are used to find patterns in text. While the details and usage of regular expressions are beyond the scope of this article, they can identify complex predefined patterns even in large strings of text. We can use regular expressions to define a set of “commands” that our bot will understand. When we receive a message from a user, we use regular expressions to match against our predefined commands. In more complex cases these matches also allow us to extract additional information from the message. For example, consider the following messages:

  • “What if I bought Apple on 20161223”
  • “Tell me how much I’d have earned if I purchased EUR/USD on 20170301”
  • “If I own 100 shares of Microsoft since 20130101”

We create a command called “whatif”. A response to “whatif” should be:

  • “If you bought XYZ on DATE, you would make (or lose) X$ per share (or contract)”.

Here “XYZ”, “DATE” and “X$” are a financial instrument symbol, a calendar date and a dollar amount, respectively.

Here is our “simple” regular expression that can capture all the above examples.

whatif=if\s+(?:[Ii])?\s*(?:bought|purchased|own|had bought)\s+(?:\d+)?\s*(?:share|shares|lot|lots)?\s*(?:of)?\s*(<<instrument_symbol>>)\s+(?:on|since)\s+(\d{8})

and <<instrument_symbol>> is defined with another regex:

<<instrument_symbol>>=[a-zA-Z0-9#$/]{1,20}

When a message matches, we can produce a templated response using the extracted information and security prices retrieved through a separate API.
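
Here is a minimal sketch of how the “whatif” command could be matched in Java with java.util.regex. The class is ours, the <<instrument_symbol>> sub-pattern is expanded inline, and the price lookup is left as a placeholder:

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class WhatIfCommand {

    // The "whatif" regex from above, with <<instrument_symbol>> expanded inline.
    private static final Pattern WHAT_IF = Pattern.compile(
            "if\\s+(?:[Ii])?\\s*(?:bought|purchased|own|had bought)\\s+(?:\\d+)?\\s*"
                    + "(?:share|shares|lot|lots)?\\s*(?:of)?\\s*([a-zA-Z0-9#$/]{1,20})\\s+(?:on|since)\\s+(\\d{8})",
            Pattern.CASE_INSENSITIVE);

    public static void main(String[] args) {
        String message = "What if I bought Apple on 20161223";
        Matcher m = WHAT_IF.matcher(message);
        if (m.find()) {
            String symbol = m.group(1); // "Apple"
            String date = m.group(2);   // "20161223"
            // In a real bot we would resolve the symbol and fetch historical prices
            // through a separate market-data API before filling in the template.
            System.out.printf("If you bought %s on %s, you would make (or lose) X$ per share%n",
                    symbol, date);
        } else {
            System.out.println("Hmm, I didn't get it, say it again, please?");
        }
    }
}

Note that we use find() rather than matches(), because the command may be embedded in a longer sentence such as “Tell me how much I’d have earned if I purchased EUR/USD on 20170301”.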

Natural Language Processing

Regular expressions are a powerful tool but have limits to their robustness. What if you want the bot to communicate with users more naturally? In this section we are going to look at Natural Language Processing (NLP) approaches used by IBM (Watson) and Amazon (AWS) that are also available to all developers. Then we will examine another NLP approach with Apache’s OpenNLP library.

IBM Watson

IBM has long worked on developing intelligent applications, and taking a moment to read about Deep Blue and Watson is well worth the time. Fortunately, aspects of this same technology are available for free. To get started, register with IBM Cloud; the Lite plan is free and should be sufficient to explore its capabilities.

Watson Conversation

The Watson Conversation service facilitates building natural conversations between an application and a user. The Lite plan allows 10,000 API calls per month, and while that would not be sufficient for an industrial application, it is plenty for experimenting.

The Watson Conversation web designer application is easy to use, with helpful samples, tips and tutorials. To begin, we create a new instance of a Conversation service for our application (Conversation-traderBot). Each instance has associated credentials needed to communicate with the service via the API. You can view your credentials or save them in JSON format from the Manage page of the service dashboard.

From the service console screen, launch the designer tool to begin building a conversation flow and create a new Workspace (Trader Dashboard). The building blocks of a conversation flow are Intents and Entities. Each Intent represents a question or a statement that a bot needs to understand. You provide the responses for each recognized Intent, which can incorporate Entities and context variables. Entities are the identified portions of the user’s input that can be used in a response. You can create new Entities for your dialogue’s domain as well as use the system Entities; common pre-defined Entities that can be extracted include dates, locations, and currencies.

Once the Intents and Entities are defined, you are ready to build a conversation flow in the form of a dialogue. In the dialogue designer, you configure all the transition states from the initial intent to the final one. When the dialogue is ready, you can test it inside the designer. Finally, when complete, you can connect the workspace to other Watson services or external services such as Facebook or Slack.

Now we can have a simple conversation with Watson in our workspace.

 Ask watson: hello
 Dec 29, 2017 5:45:53 PM okhttp3.internal.platform.Platform log
 INFO: --> POST https://gateway.watsonplatform.net/conversation/api/v1/workspaces/0b31dca0-49a0-4f41-bcc9-78618bad6dee/message?version=2017-05-26 http/1.1 (307-byte body)
 Dec 29, 2017 5:45:53 PM okhttp3.internal.platform.Platform log
 INFO: <-- 200 OK https://gateway.watsonplatform.net/conversation/api/v1/workspaces/0b31dca0-49a0-4f41-bcc9-78618bad6dee/message?version=2017-05-26 (231ms, unknown-length body)
 Hello!
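
The log lines above come from our small Java utility. Under the hood it simply POSTs the user’s text to the workspace’s message endpoint; here is a rough sketch of that call without any SDK, assuming Basic-auth service credentials and a placeholder workspace ID:

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;
import java.util.Scanner;

public class WatsonConversationClient {

    public static void main(String[] args) throws Exception {
        // Placeholders: use the credentials and workspace ID from your service's Manage page.
        String username = "YOUR_USERNAME";
        String password = "YOUR_PASSWORD";
        String workspaceId = "YOUR_WORKSPACE_ID";

        URL url = new URL("https://gateway.watsonplatform.net/conversation/api/v1/workspaces/"
                + workspaceId + "/message?version=2017-05-26");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/json");
        String auth = Base64.getEncoder()
                .encodeToString((username + ":" + password).getBytes(StandardCharsets.UTF_8));
        conn.setRequestProperty("Authorization", "Basic " + auth);

        // The Conversation message API expects the user's text in input.text.
        String body = "{\"input\": {\"text\": \"hello\"}}";
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body.getBytes(StandardCharsets.UTF_8));
        }

        // Print the raw JSON response; the bot's reply is in output.text.
        try (Scanner in = new Scanner(conn.getInputStream(), "UTF-8")) {
            System.out.println(in.useDelimiter("\\A").next());
        }
    }
}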

If we ask Watson something it wasn’t trained to understand, it will reply with our predefined “negative” answer:

 Ask watson: How is it going?
 Dec 29, 2017 5:54:39 PM okhttp3.internal.platform.Platform log
 INFO: --> POST https://gateway.watsonplatform.net/conversation/api/v1/workspaces/0b31dca0-49a0-4f41-bcc9-78618bad6dee/message?version=2017-05-26 http/1.1 (388-byte body)
 Dec 29, 2017 5:54:39 PM okhttp3.internal.platform.Platform log
 INFO: <-- 200 OK https://gateway.watsonplatform.net/conversation/api/v1/workspaces/0b31dca0-49a0-4f41-bcc9-78618bad6dee/message?version=2017-05-26 (232ms, unknown-length body)
 I didn't understand. You can try rephrasing.

Creating chatbot dialogues in the Watson Conversation service is best suited for well-structured conversations like hotel reservations or opening a new trader account. If you want your application to understand free-form user input, another IBM Watson service may be helpful: Natural Language Understanding (NLU).

Natural Language Understanding

NLU services attempt to extract meta-data from content. Such meta-data may capture keywords, semantic roles or even the sentiment of a message. With the Lite plan, you can employ the Watson Knowledge Studio tool and use up to 30,000 NLU items per month.

To illustrate the NLU service, let’s send the following question to IBM Watson:

What if I invested $1000 in IBM ten years ago?

The POST request looks like this:

https://gateway.watsonplatform.net/natural-language-understanding/api/v1/analyze?version=2017-02-27&text=What if I invested $1000 in IBM ten years ago?&features=semantic_roles,entities

Please note that we have omitted our own AUTH parameters, obtained when we created our NLU instance. We requested the Semantic Roles and Entities features from NLU to analyze our text. Here is the response from IBM Watson in JSON format:

NLU response

{
    "usage": {
        "text_units": 1,
        "text_characters": 46,
        "features": 2
    },
    "semantic_roles": [
        {
            "subject": {
                "text": "I"
            },
            "sentence": "What if I invested $1000 in IBM ten years ago?",
            "object": {
                "text": "$1000"
            },
            "action": {
                "verb": {
                    "text": "invest",
                    "tense": "past"
                },
                "text": "invested",
                "normalized": "invest"
            }
        }
    ],
    "language": "en",
    "entities": [
        {
            "type": "Company",
            "text": "IBM",
            "relevance": 0.33,
            "count": 1
        },
        {
            "type": "Quantity",
            "text": "ten years",
            "relevance": 0.33,
            "count": 1
        },
        {
            "type": "Quantity",
            "text": "$1000",
            "relevance": 0.33,
            "count": 1
        }
    ]
}

The response identified three NLU items: one text unit and two feature units. The NLU service extracted a subject, an object and an action from the text, and the entities feature extracted the company “IBM” and the quantities “$1000” and “ten years”. Unfortunately, date extraction is not currently a feature of this service. Nonetheless, the information provided by the Watson NLU service can be used in our application to understand the user’s intent.
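
To illustrate how this JSON might be consumed, here is a rough sketch that pulls out the entities and semantic roles with Gson (the library choice and class name are ours, and we assume the response above was saved to a local file):

import com.google.gson.Gson;
import com.google.gson.JsonElement;
import com.google.gson.JsonObject;

import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

public class NluResponseParser {

    public static void main(String[] args) throws Exception {
        // Assume the NLU response shown above was saved to nlu-response.json.
        String json = new String(Files.readAllBytes(Paths.get("nlu-response.json")),
                StandardCharsets.UTF_8);
        JsonObject root = new Gson().fromJson(json, JsonObject.class);

        // Entities: the company and quantities detected in the question.
        for (JsonElement e : root.getAsJsonArray("entities")) {
            JsonObject entity = e.getAsJsonObject();
            System.out.println(entity.get("type").getAsString() + " -> "
                    + entity.get("text").getAsString());
        }

        // Semantic roles: subject, action and object of the sentence.
        JsonObject role = root.getAsJsonArray("semantic_roles").get(0).getAsJsonObject();
        System.out.println("subject: " + role.getAsJsonObject("subject").get("text").getAsString());
        System.out.println("action:  " + role.getAsJsonObject("action").get("normalized").getAsString());
        System.out.println("object:  " + role.getAsJsonObject("object").get("text").getAsString());
    }
}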

Amazon AWS

For those familiar with the Amazon Alexa personal assistant, it should not be surprising that Amazon has also made significant progress in natural language understanding. One Amazon Web Services (AWS) offering that supports language understanding is Amazon Lex. Signing up for the AWS Free Tier provides 12 months of free access to many AWS services. When signing up for the first time you will be required to verify your account with a $1 credit card charge. Please note that if you exceed the limits of the AWS Free Tier, you may be charged a monthly fee.

Amazon Lex

Amazon Lex provides the ability for a developer to create conversational chatbots. Similar to IBM Watson Conversation, you specify the basic conversation flow in the Amazon Lex Console; Amazon Lex manages the dialog and dynamically adjusts the responses. To start, let’s create a Lex bot in the Amazon Lex Console called TraderAssistant. You can build a dialog by adding Intents. An intent captures the general action intended by the user and provides slots that identify and store user context related to the intent. Let’s create and configure a “MakeAnInvestment” intent for the TraderAssistant bot. The purpose of this intent is to guide a user through the investment process. For simplicity, we assume that the user selects the FOREX type of investment. To initiate the intent we will support the following utterances:

I would like to invest; I would like to trade; I would like to buy; I would like to sell;

To extract the data required to fulfill the intent, we will create and add the following slots:

Required | Name           | Slot Type       | Prompt
yes      | InstrumentType | InstrumentTypes | What type of investment would you like to make?
yes      | Symbol         | ForexSymbols    | Which currency pair would you like to trade?
yes      | Amount         | AMAZON.NUMBER   | How much money do you want to invest in {Symbol}?

InstrumentTypes and ForexSymbols are custom slot types that we have created. When configuring new slots, you can provide a list of synonyms and slot resolutions, which restrict or expand slot values. For the InstrumentType slot, we provided the prompt text “What type of investment would you like to make?” and “I would like to invest in {InstrumentType}” as one of the corresponding utterances. If the user’s utterance is recognized, the InstrumentType slot is filled and the dialogue moves to the next slot/prompt.

After filling in an intent’s slots, you need to build it using the BUILD button in the Amazon Lex Console. If there are no errors, you can test your intent in the console. If you are satisfied, you can publish it; you will be prompted to add an alias for your build. When publishing is complete, you can connect to the intent from different platforms, including Facebook, Slack, Twilio SMS and Kik. There are also Android and iOS Amazon Lex SDKs for connecting to a bot from mobile applications. We will use the AWS SDK for Java to connect to the TraderAssistant chatbot and its MakeAnInvestment intent.
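
Our utility hides each service behind the same simple chatbot interface: a user message in, a list of reply strings out. Here is a rough sketch of a Lex-backed implementation, assuming the AWS SDK for Java v1 Lex runtime client and a published bot alias named “prod” (the interface and class names are ours):

import com.amazonaws.services.lexruntime.AmazonLexRuntime;
import com.amazonaws.services.lexruntime.AmazonLexRuntimeClientBuilder;
import com.amazonaws.services.lexruntime.model.PostTextRequest;
import com.amazonaws.services.lexruntime.model.PostTextResult;

import java.util.Collections;
import java.util.List;

/** A user message in, the service's replies out. */
interface ChatbotService {
    List<String> ask(String message);
}

public class AmazonLexChatbot implements ChatbotService {

    // Uses the credentials and region configured with `aws configure` (see below).
    private final AmazonLexRuntime lex = AmazonLexRuntimeClientBuilder.defaultClient();

    @Override
    public List<String> ask(String message) {
        PostTextRequest request = new PostTextRequest()
                .withBotName("TraderAssistant")
                .withBotAlias("prod")          // the alias chosen when publishing (assumption)
                .withUserId("console-user-1")  // any stable per-conversation id
                .withInputText(message);
        PostTextResult result = lex.postText(request);
        // The result also carries IntentName, Slots and DialogState (examined below).
        return Collections.singletonList(result.getMessage());
    }

    public static void main(String[] args) {
        ChatbotService bot = new AmazonLexChatbot();
        bot.ask("I would like to make an investment").forEach(System.out::println);
    }
}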

The easiest way to configure AWS credentials on your machine is to install the AWS Command Line Interface (AWS CLI) tool. You will need Python 2 v. 2.6.5+ or Python 3 v. 3.3+ and Python’s pip utility. Installing the AWS CLI with pip is easy:

$ pip install awscli --upgrade --user

You can verify that the AWS CLI installed correctly by running the aws --version command:

$ aws --version
aws-cli/1.11.186 Python/2.7.10 Darwin/17.3.0 botocore/1.7.44

You need to provide the AWS access key id and secret access key you created in the AWS web console. Here is the default profile configuration from our machine:

$ aws configure
AWS Access Key ID [****************KSHQ]:
AWS Secret Access Key [****************v7XT]:
Default region name [us-east-1]:
Default output format [json]:

Once again, as in our IBM Watson example, we implement our simple interface for chatbot services (sketched above): it takes a user’s message as a string, sends it to the service, and returns the response as a list of strings. Here is the console output from our utility application, showing the dialogue with Amazon Lex:

 Ask amazon: I would like to make an investment
 What type of investment would you like to make?
 Ask amazon: forex
 Which currency pair would you like to trade?
 Ask amazon: EURUSD
 Would you like to buy or sell EURUSD?
 Ask amazon: sell
 How much money do you want to invest in EURUSD?
 Ask amazon: 1000 dollars
 Okay, I will Sell EURUSD with 1000 of your money. Does this sound okay?
 Ask amazon: yes

The text message responses printed above are part of the PostTextResult object that Amazon Lex sends back to the application. Let’s examine the content of the result objects corresponding to the last three stages of our dialogue:

 Ask amazon: sell
 Result: {IntentName: MakeInvestment,Slots: {InvestmentType=Sell, Symbol=EURUSD, InstrumentType=Forex, InvestmentAmount=null},
 Message: How much money do you want to invest in EURUSD?,DialogState: ElicitSlot,SlotToElicit: InvestmentAmount,}
 How much money do you want to invest in EURUSD?
 Ask amazon: 1000
 Result: {IntentName: MakeInvestment,Slots: {InvestmentType=Sell, Symbol=EURUSD, InstrumentType=Forex, InvestmentAmount=1000},
 Message: Okay, I will Sell EURUSD with 1000 of your money. Does this sound okay?,DialogState: ConfirmIntent,}
 Okay, I will Sell EURUSD with 1000 of your money. Does this sound okay?
 Ask amazon: yes
 Result: {IntentName: MakeInvestment,Slots: {InvestmentType=Sell, Symbol=EURUSD, InstrumentType=Forex, InvestmentAmount=1000},
 DialogState: ReadyForFulfillment,}

When the DialogState is ReadyForFulfillment, we have all the information we need (in our simple example) to proceed with the investment decision, as sketched below.
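
A rough sketch of that check, reusing the PostTextResult returned by the Lex client above (the fulfillment step itself is hypothetical):

import com.amazonaws.services.lexruntime.model.PostTextResult;

public class FulfillmentHandler {

    /** Called for every PostTextResult received from Amazon Lex. */
    public static void handle(PostTextResult result) {
        if ("ReadyForFulfillment".equals(result.getDialogState())) {
            // All required slots are filled; hand the values to our (hypothetical) trading logic.
            String type   = result.getSlots().get("InvestmentType");   // e.g. "Sell"
            String symbol = result.getSlots().get("Symbol");           // e.g. "EURUSD"
            String amount = result.getSlots().get("InvestmentAmount"); // e.g. "1000"
            System.out.printf("Fulfilling: %s %s for %s%n", type, symbol, amount);
        }
    }
}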

Instead of communicating with Amazon Lex via text messages, you can also send voice commands as recorded audio files.

Apache OpenNLP

If you are interested in adding NLP capability to your application from the ground up, there are a number of tools and libraries available. Python perhaps has the largest set of existing libraries, but to continue with Java we will look at the Apache OpenNLP toolkit for its ease of use, capabilities, and community support.

The Apache OpenNLP library supports the most common NLP tasks with pre-built models for several languages. OpenNLP provides a processing pipeline through several components including:

  • language detector
  • sentence detector
  • tokenizer
  • name finder
  • document categorizer
  • part-of-speech tagger
  • lemmatizer
  • chunker
  • parser

Each component takes input from its predecessor and produces output for its successor component.

After downloading and installing the Apache OpenNLP CLI, configure OpenNLP on your machine by defining the JAVA_CMD, JAVA_HOME and OPENNLP_HOME environment variables and adding $OPENNLP_HOME/bin (or %OPENNLP_HOME%\bin on Windows) to the PATH variable. You can then start the CLI tool from the command prompt.

$ opennlp

The script will print the current version, usage and a list of available tools.

Next, download the pre-trained English models. We will assume English as our default language, but other options are available. In addition, we will include other models to support our specific type of financial conversations.

The goal of our application is to “understand” a user’s investment intent from a sentence written in English. To “understand” an intent, we need to extract a subject, an object, an action and other associated entities, such as the date, quantity and company. A sample sentence we should support would be:

What if I bought 100 shares of Microsoft 5 years ago

We would like to extract from this sentence the subject (“I”), the object (“100 shares of Microsoft”), and the action (“bought”), while detecting the entities company (“Microsoft”), date (“5 years ago”) and quantity (“100 shares”).

Let’s trace the process of our sample question as it passes through each of the models.

The first and simplest tool is the tokenizer which breaks the sentence down into tokens where tokens are words, punctuation, numbers, etc.

Next is the Named Entity Recognition tool (NER) which detects names, dates, and organizations within the tokens. In this example, only Microsoft is identified and tagged. The generic NER model is limited and sometimes misses key terms that relate to the financial domain and our results might be improved by training the model with more examples.

The part-of-speech (POS) tagger takes the output from the tokenizer and marks tokens with their corresponding word type based on the token and its context. The OpenNLP POS Tagger uses a probability model to predict the correct tag.

The chunker divides the POS output into syntactically correlated groups of words, such as noun groups and verb groups.
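
Putting these steps together, here is a minimal sketch of the pipeline using the pre-trained models (en-token.bin, en-ner-organization.bin, en-pos-maxent.bin and en-chunker.bin are assumed to be downloaded into the working directory):

import opennlp.tools.chunker.ChunkerME;
import opennlp.tools.chunker.ChunkerModel;
import opennlp.tools.namefind.NameFinderME;
import opennlp.tools.namefind.TokenNameFinderModel;
import opennlp.tools.postag.POSModel;
import opennlp.tools.postag.POSTaggerME;
import opennlp.tools.tokenize.TokenizerME;
import opennlp.tools.tokenize.TokenizerModel;
import opennlp.tools.util.Span;

import java.io.FileInputStream;
import java.util.Arrays;

public class NlpPipeline {

    public static void main(String[] args) throws Exception {
        String sentence = "What if I bought 100 shares of Microsoft 5 years ago";

        // 1. Tokenizer: split the sentence into tokens.
        TokenizerME tokenizer = new TokenizerME(
                new TokenizerModel(new FileInputStream("en-token.bin")));
        String[] tokens = tokenizer.tokenize(sentence);

        // 2. Name finder (NER): detect organizations among the tokens.
        NameFinderME nameFinder = new NameFinderME(
                new TokenNameFinderModel(new FileInputStream("en-ner-organization.bin")));
        Span[] organizations = nameFinder.find(tokens);

        // 3. Part-of-speech tagger: assign a POS tag to every token.
        POSTaggerME posTagger = new POSTaggerME(
                new POSModel(new FileInputStream("en-pos-maxent.bin")));
        String[] tags = posTagger.tag(tokens);

        // 4. Chunker: group tokens into noun/verb phrases based on the POS tags.
        ChunkerME chunker = new ChunkerME(
                new ChunkerModel(new FileInputStream("en-chunker.bin")));
        String[] chunks = chunker.chunk(tokens, tags);

        System.out.println("tokens:        " + Arrays.toString(tokens));
        System.out.println("organizations: " + Arrays.toString(Span.spansToStrings(organizations, tokens)));
        System.out.println("pos tags:      " + Arrays.toString(tags));
        System.out.println("chunks:        " + Arrays.toString(chunks));
    }
}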

Finally, all of this content and context information is fed into the decision-making process. Our TraderNlpDecision class expects input text from the investment domain. With the help of the NLP tools, we can map the extracted pieces onto the correct aspects of our investment intent. After processing, TraderNlpDecision produces the following investment intent:

Investment Intent

Complete intent:(Score: 96)
QUESTION: what
SUBJ: I
ACTION: had bought
OBJ: Microsoft
QUANTITY: 100 shares
DATE: 2013-01-12
EXTRA: of
Purpose: PAST INVESTMENT

Not bad! With this information in hand, our application can respond to a user with charts and quotes for a specific date or time interval, and calculate a prospective Profit/Loss (P/L).
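
For a simple long stock position, the prospective P/L is just the price change multiplied by the number of shares. A quick sketch (the prices here are made-up placeholders, not real quotes):

import java.math.BigDecimal;

public class ProfitLoss {

    /** P/L of a long position: (current price - purchase price) * quantity. */
    static BigDecimal profitLoss(BigDecimal purchasePrice, BigDecimal currentPrice, int quantity) {
        return currentPrice.subtract(purchasePrice).multiply(BigDecimal.valueOf(quantity));
    }

    public static void main(String[] args) {
        // Placeholder prices for "100 shares of Microsoft bought 5 years ago".
        BigDecimal purchasePrice = new BigDecimal("26.83");
        BigDecimal currentPrice  = new BigDecimal("85.50");
        System.out.println("P/L: $" + profitLoss(purchasePrice, currentPrice, 100));
    }
}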

Summary

In this article we described several techniques that help chatbots interact with users. We started with regular expressions, a powerful and reliable tool for short queries and commands. We then examined NLP for better understanding and processing of complex queries and longer sentences, and how this can be achieved with cloud-based tools from IBM (Watson) and Amazon (AWS). With Watson and AWS we can create chatbots that follow well-defined scenarios or scripts. Finally, we described how to implement NLP processing in a bot application with the help of the Apache OpenNLP library. We hope that the information in this article was useful and helps you decide which technique to use for your application.

by Dmitry Tsyganov, senior software engineer at Devexperts
