
Creating a Text-to-SQL App with OpenAI + FastAPI + SQLite



 

Introduction

 
Data has become an indispensable resource for any successful business, as it provides valuable insights for informed decision-making. Given the importance of data, many companies are building systems to store and analyze it. However, there are many cases where it is hard to acquire and analyze the necessary data, especially as data systems grow more complex.

With the advent of generative AI, data work has become significantly easier, as we can now use plain natural language to receive mostly accurate output that closely follows the input we provide. This also applies to data processing and analysis with SQL, where we can ask the model to generate queries for us.

In this article, we'll develop a simple API application that translates natural language into SQL queries our database understands. We will use three main tools: OpenAI, FastAPI, and SQLite.

Here's the plan.

 

Text-to-SQL App Development

 
First, we'll prepare everything needed for our project. All you need to provide is an OpenAI API key, which we'll use to access the generative model. To containerize the application, we'll use Docker, which you can get locally by installing Docker Desktop.

Other components, such as SQLite, are already available when you install Python, and FastAPI will be installed later.

For the overall project structure, we'll use the following:

text_to_sql_app/
├── app/
│   ├── __init__.py
│   ├── database.py
│   ├── openai_utils.py
│   └── main.py
├── demo.db
├── init_db.sql
├── requirements.txt
├── Dockerfile
├── docker-compose.yml
└── .env

 

Create the structure as shown above, or use the accompanying repository to make things easier. We will still go through each file to understand how the application is developed.

Let's start by populating the .env file with the OpenAI API key we acquired earlier. You can do that with the following line:

OPENAI_API_KEY=YOUR-API-KEY

 

Then, go to requirements.txt and fill in the libraries we'll use:

fastapi
uvicorn
sqlalchemy
openai
pydantic
python-dotenv

 

Next, we move on to the __init__.py file and put the following code inside:

from pathlib import Path
from dotenv import load_dotenv

load_dotenv(dotenv_path=Path(__file__).resolve().parent.parent / ".env", override=False)

 

The code above ensures that the environment contains all the necessary keys we need.

Then, we'll write Python code in the database.py file to connect to the SQLite database we'll create later (called demo.db) and provide a way to run SQL queries.

from sqlalchemy import create_engine, text
from sqlalchemy.orm import Session

ENGINE = create_engine("sqlite:///demo.db", future=True, echo=False)

def run_query(sql: str) -> list[dict]:
    with Session(ENGINE) as session:
        rows = session.execute(text(sql)).mappings().all()
    return [dict(r) for r in rows]
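To sanity-check the "rows as a list of dicts" idea without SQLAlchemy, here is a minimal standalone equivalent using only Python's built-in sqlite3 module; the helper name run_query_stdlib is ours, for illustration only:

```python
import sqlite3

def run_query_stdlib(db_path: str, sql: str) -> list[dict]:
    conn = sqlite3.connect(db_path)
    conn.row_factory = sqlite3.Row  # rows become mapping-like objects
    try:
        return [dict(r) for r in conn.execute(sql).fetchall()]
    finally:
        conn.close()

print(run_query_stdlib(":memory:", "SELECT 1 AS one"))  # [{'one': 1}]
```

The SQLAlchemy version in database.py does the same thing, but shares one engine across requests instead of opening a new connection each time.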

 

After that, we'll prepare the openai_utils.py file, which accepts the database schema and the input question. The output will be JSON containing the SQL query (with a guard to prevent any write operations).

import os
import json
from openai import OpenAI

client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

_SYSTEM_PROMPT = """
You convert natural-language questions into read-only SQLite SQL.
Never output INSERT / UPDATE / DELETE.
Return JSON: { "sql": "..." }.
"""

def text_to_sql(question: str, schema: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        temperature=0.1,
        response_format={"type": "json_object"},
        messages=[
            {"role": "system", "content": _SYSTEM_PROMPT},
            {"role": "user",
             "content": f"schema:\n{schema}\n\nquestion: {question}"}
        ]
    )
    payload = json.loads(response.choices[0].message.content)
    return payload["sql"]
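Because the request sets response_format to a JSON object, the model's reply is a JSON string, and the final two lines simply parse it and extract the sql key. Here is what that step looks like with a hand-written stand-in payload, so you can see the shape without calling the API:

```python
import json

# Hand-written stand-in for response.choices[0].message.content:
raw = '{ "sql": "SELECT COUNT(*) AS customer_count FROM customers;" }'

payload = json.loads(raw)
print(payload["sql"])  # SELECT COUNT(*) AS customer_count FROM customers;
```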

 

With both the code and the connection ready, we'll build the application itself with FastAPI in main.py. The application accepts natural-language questions, converts them into SQL SELECT queries using the database schema captured at startup, runs them against the SQLite database, and returns the results as JSON. It is exposed as an API we can call from the CLI.

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel
from sqlalchemy import inspect
from .database import ENGINE, run_query
from .openai_utils import text_to_sql

app = FastAPI(title="Text-to-SQL Demo")

class NLRequest(BaseModel):
    question: str

@app.on_event("startup")
def capture_schema() -> None:
    # Build a compact schema description to include in the LLM prompt.
    insp = inspect(ENGINE)
    global SCHEMA_STR
    SCHEMA_STR = "\n".join(
        f"CREATE TABLE {t} ({', '.join(c['name'] for c in insp.get_columns(t))});"
        for t in insp.get_table_names()
    )

@app.post("/query")
def query(req: NLRequest):
    try:
        sql = text_to_sql(req.question, SCHEMA_STR)
        if not sql.lstrip().lower().startswith("select"):
            raise ValueError("Only SELECT statements are allowed")
        return {"sql": sql, "result": run_query(sql)}
    except Exception as e:
        raise HTTPException(status_code=400, detail=str(e))
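The startswith check above is a minimal guard. If you want something slightly stricter, here is a sketch of a helper (our own, not part of the project code) that also rejects multiple statements and obvious write keywords; note it is still not a substitute for proper database permissions or a read-only connection:

```python
import re

# Keywords that should never appear in a read-only query.
_FORBIDDEN = re.compile(
    r"\b(insert|update|delete|drop|alter|create|replace|attach|pragma)\b",
    re.IGNORECASE,
)

def is_safe_select(sql: str) -> bool:
    stripped = sql.strip().rstrip(";")
    if ";" in stripped:  # more than one statement
        return False
    if not stripped.lower().startswith("select"):
        return False
    return not _FORBIDDEN.search(stripped)

print(is_safe_select("SELECT * FROM customers"))       # True
print(is_safe_select("DROP TABLE customers"))          # False
print(is_safe_select("SELECT 1; DELETE FROM orders"))  # False
```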

 

That's everything we need for the main application. The next thing to prepare is the database. Use the schema and data below in init_db.sql for example purposes; you can always change them if you like.


DROP TABLE IF EXISTS order_items;
DROP TABLE IF EXISTS orders;
DROP TABLE IF EXISTS payments;
DROP TABLE IF EXISTS products;
DROP TABLE IF EXISTS customers;

CREATE TABLE customers (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    country TEXT,
    signup_date DATE
);

CREATE TABLE products (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    category TEXT,
    price REAL
);

CREATE TABLE orders (
    id INTEGER PRIMARY KEY,
    customer_id INTEGER,
    order_date DATE,
    total REAL,
    FOREIGN KEY (customer_id) REFERENCES customers(id)
);

CREATE TABLE order_items (
    order_id INTEGER,
    product_id INTEGER,
    quantity INTEGER,
    unit_price REAL,
    PRIMARY KEY (order_id, product_id),
    FOREIGN KEY (order_id) REFERENCES orders(id),
    FOREIGN KEY (product_id) REFERENCES products(id)
);

CREATE TABLE payments (
    id INTEGER PRIMARY KEY,
    order_id INTEGER,
    payment_date DATE,
    amount REAL,
    method TEXT,
    FOREIGN KEY (order_id) REFERENCES orders(id)
);

INSERT INTO customers (id, name, country, signup_date) VALUES
 (1,'Alice','USA','2024-01-05'),
 (2,'Bob','UK','2024-03-10'),
 (3,'Choi','KR','2024-06-22'),
 (4,'Dara','ID','2025-01-15');

INSERT INTO products (id, name, category, price) VALUES
 (1,'Laptop Pro','Electronics',1500.00),
 (2,'Noise-Canceling Headphones','Electronics',300.00),
 (3,'Standing Desk','Furniture',450.00),
 (4,'Ergonomic Chair','Furniture',250.00),
 (5,'Monitor 27"','Electronics',350.00);

INSERT INTO orders (id, customer_id, order_date, total) VALUES
 (1,1,'2025-02-01',1850.00),
 (2,2,'2025-02-03',600.00),
 (3,3,'2025-02-05',350.00),
 (4,1,'2025-02-07',450.00);

INSERT INTO order_items (order_id, product_id, quantity, unit_price) VALUES
 (1,1,1,1500.00),
 (1,2,1,300.00),
 (1,5,1,350.00),
 (2,3,1,450.00),
 (2,4,1,250.00),
 (3,5,1,350.00),
 (4,3,1,450.00);

INSERT INTO payments (id, order_id, payment_date, amount, method) VALUES
 (1,1,'2025-02-01',1850.00,'Credit Card'),
 (2,2,'2025-02-03',600.00,'PayPal'),
 (3,3,'2025-02-05',350.00,'Credit Card'),
 (4,4,'2025-02-07',450.00,'Bank Transfer');

 

Then, run the following command in your CLI to create the SQLite database for our project.

sqlite3 demo.db < init_db.sql  
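If the sqlite3 CLI is not available on your machine, the same initialization can be done with Python's built-in sqlite3 module; the init_database helper below is our own, for illustration:

```python
import sqlite3
from pathlib import Path

def init_database(db_path: str, sql_path: str) -> None:
    """Create or refresh a SQLite database from a .sql script file."""
    script = Path(sql_path).read_text()
    conn = sqlite3.connect(db_path)
    try:
        conn.executescript(script)  # runs every statement in the file
        conn.commit()
    finally:
        conn.close()

# With init_db.sql in the current directory:
# init_database("demo.db", "init_db.sql")
```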

 

With the database ready, we'll create a Dockerfile to containerize the application.

FROM python:3.12-slim
WORKDIR /code

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]

 

We'll also create a docker-compose.yml file to run the application more smoothly.

services:
  text2sql:
    build: .
    env_file: .env
    ports:
      - "8000:8000"
    restart: unless-stopped
    volumes:
      - ./demo.db:/code/demo.db

 

With everything in place, start Docker Desktop and run the following commands to build and launch the application.

docker compose build --no-cache
docker compose up -d

 

If everything went well, you can test the application with the command below. We will ask how many customers we have in the data.

curl -X POST "http://localhost:8000/query" -H "Content-Type: application/json" -d '{"question":"How many customers?"}'

 

The output will look like this:

{"sql":"SELECT COUNT(*) AS customer_count FROM customers;","result":[{"customer_count":4}]}

 

We can try something more complex, like the number of orders for each customer:

curl -X POST "http://localhost:8000/query" -H "Content-Type: application/json" -d '{"question":"What is the number of orders placed by each customer?"}'

 

With output like the following:

{"sql":"SELECT customer_id, COUNT(*) AS number_of_orders FROM orders GROUP BY customer_id;","result":[{"customer_id":1,"number_of_orders":2},{"customer_id":2,"number_of_orders":1},{"customer_id":3,"number_of_orders":1}]}
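You can also call the endpoint from Python instead of curl, using only the standard library. This assumes the container from the previous step is running on localhost:8000; the ask helper is our own, for illustration:

```python
import json
import urllib.request

def ask(question: str, url: str = "http://localhost:8000/query") -> dict:
    """POST a natural-language question to the /query endpoint."""
    body = json.dumps({"question": question}).encode("utf-8")
    req = urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# With the container running:
# print(ask("How many customers?"))
```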

 

That's all you need to build a basic Text-to-SQL application. You can enhance it further with a front-end interface and a more complex system tailored to your needs.

 

Wrapping Up

 
Data is at the heart of any data work, and companies use it to make decisions. Often, the systems we have are complex, and generative AI can help us navigate them.

In this article, we learned how to develop a simple Text-to-SQL application using an OpenAI model, FastAPI, and SQLite.

I hope this has helped!
 
 

Cornellius Yudha Wijaya is a data science assistant manager and data writer. While working full-time at Allianz Indonesia, he loves to share Python and data tips via social media and writing media. Cornellius writes on a variety of AI and machine learning topics.
