Suppose you believe that Flutter is the future, especially for its ability to compile one codebase for many platforms, which can greatly cut your development costs. You have existing customers, and those customers will eventually want to adopt new technology, so you have to keep up with market trends to satisfy their needs.
Even if you want to deliver a revamped app to your customers, in reality you face one important problem: customers may love to use your new app, but they most likely do not want to pay a high price to change their database. From their perspective, their existing database has simply worked for many years; there is no reason to 'upgrade' it if it is not broken. By that I mean they are using SQL Server, and most likely the Microsoft one.
Background
I have spent a good amount of time searching for Flutter plugins that can communicate with Microsoft SQL Server. Sadly, although Flutter has the trusty sqflite plugin and there is also the drift package (they're both great, and I have used them), talking to SQL Server is still a missing piece in the Flutter ecosystem. drift does have a remote library for talking to a remote database, but it requires the other end to have drift installed too (correct me if I'm wrong). I don't think I want to spend much time setting up drift against an existing, well-configured SQL Server.
At the time of writing, there is one working Flutter plugin that can communicate over the MSSQL protocol: sql_conn. I thought I had found a saviour, until I saw this:
I was like, whaaaat? Android only; other platforms are out of luck. I thought of forking the project on GitHub and adding support for other platforms as well, but it is not as easy as I thought.
What To Do?
If there is no off-the-shelf plugin I can find to achieve this, should I abandon my visionary Flutter project? After serious consideration, I decided not to opt for another framework. Instead, I thought of a different approach: writing my own API backend as middleware.
Middleware -- FastAPI + SQLAlchemy
I am familiar with Python, so choosing to write the middleware in Python is a no-brainer for me. I stumbled upon this "FastAPI" thing, which claims to be a verrryyyyy fast API framework, and there is good old SQLAlchemy with pyodbc to provide the protocol for communicating with MSSQL.
So, shall we?
Starting The Project
First things first, fire up your terminal and install the required libraries.
python -m pip install fastapi sqlalchemy pyodbc
Then we can create a Python file and write something like this:
from fastapi import FastAPI
from sqlalchemy import create_engine, text
from sqlalchemy.orm import sessionmaker

# Database connection settings
config = {
    "sqldbIP": "192.168.16.99",
    "sqldbPort": "1433",
    "sqldbDB": "RMS",
    "sqldbUserId": "test2",
    "sqldbPass": "test2",
    "MachineId": "1"
}

app = FastAPI()

# Start of program, connecting to DB
conn_string = f"mssql+pyodbc://{config['sqldbUserId']}:{config['sqldbPass']}@{config['sqldbIP']}:{config['sqldbPort']}/{config['sqldbDB']}?driver=ODBC+Driver+17+for+SQL+Server"
engine = create_engine(conn_string)
Session = sessionmaker(bind=engine)
session = Session()
The above code tells Python to import the necessary libraries, then defines a dictionary called config that tells SQLAlchemy where to connect. We initialise the FastAPI app by calling app = FastAPI(), then define the connection string conn_string. First we pass the pyodbc dialect mssql+pyodbc so SQLAlchemy can communicate with MSSQL, then follow the usual user:password@host:port/database format, and lastly pass the driver name driver=ODBC+Driver+17+for+SQL+Server (which means the Microsoft ODBC Driver 17 for SQL Server must be installed on the machine running the middleware). Finally, we create an engine and a session so we can execute SQL queries.
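Before moving on, if you want to confirm the connection actually works, a quick sanity check could look like this (just my own habit, not something FastAPI or SQLAlchemy requires):

# Optional sanity check: run a trivial query against the server
result = session.execute(text("SELECT 1"))
print(result.fetchone())  # prints (1,) if the connection is alive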
Queries
Now, we are going to write some raw SQL queries to interact with SQL Server. I would suggest creating a file specifically for the queries instead of mixing them into the main Python file, so the code stays easier to maintain when the project scales up.
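As a tiny sketch of what I mean (the file names here are just my own choice, not a requirement; the retrieve_data_from_table function is written a few snippets below):

# queries.py -- all the raw SQL query functions live here
# main.py    -- the FastAPI app, config, engine/session setup and the paths
#
# In main.py you then simply import what you need, for example:
from queries import retrieve_data_from_table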
Let's start with the most basic query:
SELECT * FROM <table_name>
This tells SQL Server to return all records in that particular table. Let's turn it into a Python script:
# Import necessary libraries
from sqlalchemy import text
import json

def retrieve_data_from_table(session):
    query = text("SELECT * FROM employee_table")
    cursor = session.execute(query)

    employee_id = []
    employee_name = []

    # Populate the lists column by column
    for row in cursor.fetchall():
        employee_id.append(row[0])
        employee_name.append(row[1])

    # Build the dictionary that will be returned as JSON
    json_data = {"id": employee_id, "name": employee_name}

    cursor.close()
    session.close()
    return json_data
As you can see, I defined a function specifically to retrieve the data from one table, and everything is pretty much hard-coded just to make sure it works. I used the text method from the sqlalchemy library to write the query, because it produces a properly formatted statement for SQLAlchemy to run against the database. The cursor holds the data fetched from the database after calling the execute method on the session; then a for loop populates the employee_id and employee_name lists, and lastly the results are stored in the json_data dictionary, which can be returned directly as the output of an API call. Finally, before returning the JSON output, don't forget to close the cursor and the session.
So you can imagine: whatever query MSSQL supports, you can throw it into the text() method, execute it, and put the results into a Python dictionary. You can structure the dictionary however you please; in the end it will be returned as JSON to the client.
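For instance, here is a minimal sketch of another query function that takes a parameter, assuming a hypothetical orders table (the table and column names are made up for illustration); note the bound parameter :employee_id, which lets SQLAlchemy escape the value safely:

from sqlalchemy import text

def retrieve_orders_by_employee(session, employee_id):
    query = text("SELECT order_id, order_total FROM orders WHERE employee_id = :employee_id")
    cursor = session.execute(query, {"employee_id": employee_id})

    order_ids = []
    order_totals = []
    for row in cursor.fetchall():
        order_ids.append(row[0])
        order_totals.append(row[1])

    cursor.close()
    # Structure the dictionary however your client expects it
    return {"order_id": order_ids, "order_total": order_totals}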
Paths
We need paths so the client side can communicate with the backend. Defining a path is very simple in FastAPI. Here's an example:
@app.get("/")
async def home():
return retrieve_data_from_table(session)
First, we use the decorator @app.get("/"), where app is the FastAPI instance; this tells FastAPI that the path should use the HTTP GET operation. Then we define an asynchronous function (because of FastAPI's asynchronous nature) home(), and call the query function retrieve_data_from_table(session) to retrieve our data from SQL Server. When this path is accessed by the client-side Flutter app, its output will be the json_data we defined in the query function.
If the path takes query parameters, you can declare them as parameters of the async function, just like in a normal function, and then pass them along in your query function call.
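For example, a path that accepts a query parameter could look like this (reusing the hypothetical retrieve_orders_by_employee sketch from the Queries section):

@app.get("/orders")
async def orders(employee_id: int):
    # FastAPI reads employee_id from the query string, e.g. /orders?employee_id=3
    return retrieve_orders_by_employee(session, employee_id)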
Additionally, if you want to perform an HTTP POST operation (when the Flutter user submits data), you can use the post method:
# Import necessary libraries
import json
from fastapi import Request

@app.post("/submit")
async def submit(request: Request):
    # Read the raw request body, then decode it into a dictionary
    data = await request.body()
    data_str = data.decode()
    data_json = json.loads(data_str)
    # submit_order is one of your own query functions
    return submit_order(session, data_json)
Notice that I use post instead of get in the decorator this time. The request parameter is needed so FastAPI can hand us the request body; we decode it and send it to the submit_order function, which performs whatever INSERT, UPDATE, or DELETE queries you define yourself.
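The submit_order function itself is not shown here because it depends entirely on your schema. As a rough sketch, assuming the same hypothetical orders table as before and that the client sends a list of objects under a "data" key (which is what the Flutter code later in this article does):

from sqlalchemy import text

def submit_order(session, data_json):
    query = text("INSERT INTO orders (employee_id, order_total) VALUES (:employee_id, :order_total)")
    # data_json is the decoded dictionary sent by the Flutter app
    for item in data_json["data"]:
        session.execute(query, {"employee_id": item["employee_id"], "order_total": item["order_total"]})
    session.commit()
    return {"status": "ok"}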
On Flutter's Side
We're done with the backend side (Python); now let's move to the client side (Flutter).
Create an empty Dart file and put some code into it:
import 'package:http/http.dart' as http;
import 'dart:convert';
const config = {
"apiUrl": "10.0.2.2:8000", // Connect to localhost for Android emulator
};
Future<Map<String, dynamic>> fetchEmployeeData() async {
var url = Uri.http(config["apiUrl"]!, '/');
final response =
await http.get(url);
if (response.statusCode == 200) {
// If the server returns a 200 OK response, parse the JSON.
Map<String, dynamic> employeeData = json.decode(utf8.decode(response.bodyBytes));
return employeeData;
} else {
// If the server returns an error response, throw an exception.
throw Exception('Failed to load employee data.');
}
}
Make sure the http package is added to pubspec.yaml to make things work. I added some comments in the code so you can see how it works. Basically, apiUrl is defined so the Flutter app can connect to the FastAPI server using the http.get(url) method. When the response status code is 200 (meaning OK), the app decodes the JSON (the Python dictionary we defined as json_data earlier) into a Dart Map variable employeeData. Finally, you use this employeeData to display your data.
If you have query parameters, I encourage you to take a short read of the Uri.http reference to see how to pass them.
Additionally, if you want to submit user input to the server (a POST instead of a GET), it looks like this:
Future<bool> submit(
{required List<dynamic> data}) async {
var url = Uri.http(config["apiUrl"]!, '/submit');
var headers = {'Content-Type': 'application/json'};
final response = await http.post(
url,
headers: headers,
body: json.encode({
'data': data,
}),
);
if (response.statusCode == 200) {
bool submitStatus = true;
return submitStatus;
} else {
throw Exception('Submission Failed: ${response.reasonPhrase}');
}
}
This time, notice that there is a required parameter in the submit() function, which contains the data to submit to the server; a headers map is needed for the POST operation, and in the end we use the http.post method to perform it. In this case, I return a boolean submitStatus to tell the app whether the POST operation was successful, but you can modify the function to suit your own needs.
Start the Server
After everything is set up, fire up your FastAPI server. FastAPI uses uvicorn as its default development server, and you can run this command in the terminal:
uvicorn main:app --reload
This tells uvicorn to run the app instance found in main.py (so the command assumes your Python file is named main.py), and to reload the server every time the code changes. The default port is 8000, but you can tell uvicorn to use whichever port you desire by passing --port <integer>; in that case you also need to change the target port in the Flutter code.
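If you prefer starting the server from the Python file itself rather than typing the uvicorn command, a small sketch (again assuming the file is named main.py):

# At the bottom of main.py
import uvicorn

if __name__ == "__main__":
    uvicorn.run("main:app", host="127.0.0.1", port=8000, reload=True)

Running python main.py then behaves the same as the uvicorn command above.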
The End
Phew! It's a long article, but with this setup your Flutter app can communicate with SQL Server no matter which platform the app is compiled for. It's platform-agnostic.
If you found a better solution, do leave a comment and let me know too! Thank you.