Introduction
Building internal tools or AI-powered applications the "conventional" way throws developers into a maze of repetitive, error-prone tasks. First, they have to spin up a dedicated Postgres instance, configure networking, backups, and monitoring, and then spend hours (or days) plumbing that database into the front-end framework they're using. On top of that, they have to write custom authentication flows, map granular permissions, and keep those security controls in sync across the UI, API layer, and database. Each application component lives in a different environment, from a managed cloud service to a self-hosted VM. This forces developers to juggle disparate deployment pipelines, environment variables, and credential stores. The result is a fragmented stack where a single change, like a schema migration or a new feature, ripples through multiple systems, demanding manual updates, extensive testing, and constant coordination. All of this overhead distracts developers from the real value-add: building the product's core features and intelligence.
With Databricks Lakebase and Databricks Apps, the entire application stack sits together, alongside the lakehouse. Lakebase is a fully managed Postgres database that offers low-latency reads and writes, integrated with the same underlying lakehouse tables that power your analytics and AI workloads. Databricks Apps provides a serverless runtime for the UI, along with built-in authentication, fine-grained permissions, and governance controls that are automatically applied to the same data that Lakebase serves. This makes it easy to build and deploy apps that combine transactional state, analytics, and AI without stitching together multiple platforms, synchronizing databases, replicating pipelines, or reconciling security policies across systems.
Why Lakebase + Databricks Apps
Lakebase and Databricks Apps work together to simplify full-stack development on the Databricks platform:
- Lakebase gives you a fully managed Postgres database with fast reads, writes, and updates, plus modern features like branching and point-in-time recovery.
- Databricks Apps provides the serverless runtime for your application frontend, with built-in identity, access control, and integration with Unity Catalog and other lakehouse components.
By combining the two, you can build interactive tools that store and update state in Lakebase, access governed data in the lakehouse, and serve everything through a secure, serverless UI, all without managing separate infrastructure. In the example below, we'll show how to build a simple holiday request approval app using this setup.
Getting Started: Build a Transactional App with Lakebase
This walkthrough shows how to create a simple Databricks App that helps managers review and approve holiday requests from their team. The app is built with Databricks Apps and uses Lakebase as the backend database to store and update the requests.
Here's what the solution covers:
- Provision a Lakebase database: Set up a serverless Postgres OLTP database with a few clicks.
- Create a Databricks App: Build an interactive app using a Python framework (like Streamlit or Dash) that reads from and writes to Lakebase.
- Configure schema, tables, and access controls: Create the required tables and assign fine-grained permissions to the app using the app's client ID.
- Securely connect and interact with Lakebase: Use the Databricks SDK and SQLAlchemy to securely read from and write to Lakebase from your app code.
The walkthrough is designed to get you started quickly with a minimal working example. Later, you can extend it with more advanced configuration.
Step 1: Provision Lakebase
Before building the app, you'll need to create a Lakebase database. To do this, go to the Compute tab, select OLTP Database, and provide a name and size. This provisions a serverless Lakebase instance. In this example, our database instance is named lakebase-demo-instance.

Step 2: Create a Databricks App and Add Database Access
Now that we have a database, let's create the Databricks App that will connect to it. You can start from a blank app or choose a template (e.g., Streamlit or Flask). After naming your app, add the Database as a resource. In this example, the pre-created databricks_postgres database is selected.
Adding the Database resource automatically:
- Grants the app CONNECT and CREATE privileges
- Creates a Postgres role tied to the app's client ID
This role will later be used to grant table-level access.

Step 3: Create a Schema, Table, and Set Permissions
With the database provisioned and the app connected, you can now define the schema and table the app will use.
1. Retrieve the app's client ID
From the app's Environment tab, copy the value of the DATABRICKS_CLIENT_ID variable. You'll need this for the GRANT statements.
2. Open the Lakebase SQL editor
Go to your Lakebase instance and click New Query. This opens the SQL editor with the database endpoint already selected.

3. Run the following SQL:
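As a rough sketch, the schema, table, and grants could look like the following (the holiday_app schema name, the column layout, and the <APP_CLIENT_ID> placeholder are illustrative assumptions; substitute the client ID you copied in step 1):

```sql
-- Illustrative schema and table for the holiday request demo.
CREATE SCHEMA IF NOT EXISTS holiday_app;

CREATE TABLE IF NOT EXISTS holiday_app.holiday_requests (
    id            SERIAL PRIMARY KEY,
    employee_name TEXT NOT NULL,
    start_date    DATE NOT NULL,
    end_date      DATE NOT NULL,
    status        TEXT NOT NULL DEFAULT 'pending'  -- pending | approved | rejected
);

-- Grant the app's Postgres role (created from its client ID) access.
GRANT USAGE ON SCHEMA holiday_app TO "<APP_CLIENT_ID>";
GRANT SELECT, INSERT, UPDATE ON holiday_app.holiday_requests TO "<APP_CLIENT_ID>";
```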
Note that while the SQL editor is a quick and effective way to perform this step, managing database schemas at scale is best handled by dedicated tools that support versioning, collaboration, and automation. Tools like Flyway and Liquibase let you track schema changes, integrate with CI/CD pipelines, and ensure your database structure evolves safely alongside your application code.
Step 4: Build the App
With permissions in place, you can now build your app. In this example, the app fetches holiday requests from Lakebase and lets a manager approve or reject them. Updates are written back to the same table, as shown in the sketch below.
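As a rough illustration, a Streamlit front end for this flow might look like the following. The fetch_requests and update_request_status helpers are assumptions here; one possible implementation appears in Step 6.

```python
import streamlit as st

# Hypothetical helpers; see Step 6 for one possible implementation.
from holiday_db import fetch_requests, update_request_status

st.title("Holiday Request Approvals")

# List each pending request with approve/reject buttons.
for req in fetch_requests(status="pending"):
    details, approve, reject = st.columns([3, 1, 1])
    details.write(f"{req['employee_name']}: {req['start_date']} to {req['end_date']}")
    if approve.button("Approve", key=f"approve-{req['id']}"):
        update_request_status(req["id"], "approved")
        st.rerun()
    if reject.button("Reject", key=f"reject-{req['id']}"):
        update_request_status(req["id"], "rejected")
        st.rerun()
```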

Step 5: Connect Securely to Lakebase
Use SQLAlchemy and the Databricks SDK to connect your app to Lakebase with secure, token-based authentication. When you add the Lakebase resource, PGHOST and PGUSER are exposed automatically. The SDK handles token caching.
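A minimal sketch, assuming the app reads PGHOST and PGUSER from its environment and mints short-lived Postgres credentials through the SDK's database API (the generate_database_credential call and the instance name reflect our demo setup; adjust them to your SDK version and instance):

```python
import os
import uuid

from databricks.sdk import WorkspaceClient
from sqlalchemy import create_engine, event

workspace_client = WorkspaceClient()

# PGHOST and PGUSER are injected automatically when the Lakebase
# resource is added to the app; the database name matches our example.
host = os.environ["PGHOST"]
user = os.environ["PGUSER"]
engine = create_engine(
    f"postgresql+psycopg2://{user}:@{host}:5432/databricks_postgres?sslmode=require"
)

@event.listens_for(engine, "do_connect")
def provide_token(dialect, conn_rec, cargs, cparams):
    # Mint a fresh OAuth token for each new Postgres connection.
    cred = workspace_client.database.generate_database_credential(
        request_id=str(uuid.uuid4()),
        instance_names=["lakebase-demo-instance"],
    )
    cparams["password"] = cred.token
```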
Step 6: Read and Update Data
The following functions read from and update the holiday request table:
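A minimal sketch, reusing the engine from Step 5 and the illustrative holiday_app.holiday_requests table from Step 3 (the function names and signatures are our own assumptions):

```python
from sqlalchemy import text

# `engine` is the SQLAlchemy engine created in Step 5.

def fetch_requests(status: str = "pending") -> list[dict]:
    """Return holiday requests with the given status."""
    query = text(
        "SELECT id, employee_name, start_date, end_date, status "
        "FROM holiday_app.holiday_requests WHERE status = :status"
    )
    with engine.connect() as conn:
        return [dict(row) for row in conn.execute(query, {"status": status}).mappings()]


def update_request_status(request_id: int, new_status: str) -> None:
    """Approve or reject a single request."""
    stmt = text(
        "UPDATE holiday_app.holiday_requests SET status = :status WHERE id = :id"
    )
    with engine.begin() as conn:
        conn.execute(stmt, {"status": new_status, "id": request_id})
```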
The code snippets above can be used together with frameworks such as Streamlit, Dash, and Flask to pull data from Lakebase and visualize it in your app. To make sure all necessary dependencies are installed, add the required packages to your app's requirements.txt file. The packages used in the code snippets are listed below.
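For the sketches above, that would be roughly the following (version pins omitted; adjust to your environment):

```
databricks-sdk
sqlalchemy
psycopg2-binary
streamlit
```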
Extending the Lakehouse with Lakebase
Lakebase adds transactional capabilities to the lakehouse by integrating a fully managed OLTP database directly into the platform. This reduces the need for external databases or complex pipelines when building applications that require both reads and writes.

Because it's natively integrated with Databricks, Lakebase benefits from the same data synchronization, identity and authentication, and network security as other data assets in the lakehouse. You don't need custom ETL or reverse ETL to move data between systems. For example:
- You can serve analytical features back to applications in real time (available today) using the Online Feature Store and synced tables.
- You can synchronize operational data with Delta tables, e.g., for historical data analysis (in Private Preview).
These capabilities make it easier to support production-grade use cases like:
- Updating state in AI agents
- Managing real-time workflows (e.g., approvals, task routing)
- Feeding live data into recommendation systems or pricing engines
Lakebase is already being used across industries for applications including personalized recommendations, chatbots, and workflow management tools.
What's Next
If you're already using Databricks for analytics and AI, Lakebase makes it easier to add real-time interactivity to your applications. With support for low-latency transactions, built-in security, and tight integration with Databricks Apps, you can go from prototype to production without leaving the platform.
Summary
Lakebase provides a transactional Postgres database that works seamlessly with Databricks Apps and integrates easily with lakehouse data. It simplifies the development of full-stack data and AI applications by eliminating the need for external OLTP systems or manual integration steps.
In this example, we showed how to:
- Set up a Lakebase instance and configure access
- Create a Databricks App that reads and writes to Lakebase
- Use secure, token-based authentication with minimal setup
- Build a basic app for managing holiday requests using Python and SQL
Lakebase is now in Public Preview. You can try it today directly from your Databricks workspace. For details on usage and pricing, see the Lakebase and Apps documentation.
