Databricks integration
Sync data between Databricks and your SaaS tools. Use your lakehouse as a source for activating data, or as a destination for consolidating records from every tool.
Activate your lakehouse data
Push Databricks tables to SaaS tools or consolidate tool data in one place.
Reverse ETL, built in
Push modeled data from any Databricks table directly to your CRM, marketing, or support tools. No separate reverse ETL product — Oneprofile handles it.
Popular integrations with Databricks
Connect Databricks to these tools for powerful data workflows

CRM
Push modeled customer data from Databricks to HubSpot contacts for sales context without reverse ETL tooling.
Databricks + HubSpot


CRM
Sync computed account scores and usage metrics from Databricks into Salesforce for revenue teams.
Databricks + Salesforce


Customer Support
Enrich Intercom conversations with product usage data stored in Databricks tables.
Databricks + Intercom

Payments
Consolidate Stripe billing events into Databricks for revenue analytics and churn modeling.
Databricks + Stripe


Analytics
Mirror PostHog product analytics into Databricks for SQL-based cohort analysis.
Databricks + PostHog


Database
Consolidate Databricks data into PostgreSQL for custom reporting and SQL-based dashboards.
Databricks + PostgreSQL
View All Integrations
Connect the tools you already use
Oneprofile supports a wide range of integrations across categories
HubSpot (CRM)
Salesforce (CRM)
Attio (CRM)
Stripe (Payments)
Intercom (Customer Support)
PostHog (Analytics)
Mixpanel (Analytics)
PostgreSQL (Database)
MongoDB (Database)
Braze
ActiveCampaign
SendGrid
Iterable
Shopify (E-commerce)
Gainsight (Analytics)
Planhat (Analytics)
Vitally (Analytics)
Marketo
Facebook Ads (Advertising)
Recharge (Payments)
View All Integrations
About Databricks
Oneprofile is not just about sending data from one tool to another. It ensures customer profiles and events stay consistent across every system, even as data changes over time.

SUPPORTED RECORDS: Any Table, Any Column
INTEGRATION TYPE: Source, Destination
CATEGORY: Data Warehouse, Analytics
FAQ
How does Oneprofile connect to Databricks?
Through a Databricks personal access token and your workspace URL. Paste both into Oneprofile, and the connection is validated against the live Databricks API before saving. Tables and columns are discovered automatically.
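As a rough illustration of the pre-flight step described above, a client might sanity-check the two credentials before the server-side validation against the live Databricks API. This sketch assumes common Databricks conventions (workspace hostnames per cloud, the `dapi` token prefix); it is not Oneprofile's actual validation logic:

```python
import re

def validate_databricks_credentials(workspace_url: str, token: str) -> list[str]:
    """Lightweight local checks before the live API validation.
    Returns a list of problems; an empty list means both values look plausible."""
    problems = []
    # Workspace URLs typically look like https://<instance>.cloud.databricks.com
    # (or .azuredatabricks.net / .gcp.databricks.com on other clouds).
    if not re.match(
        r"^https://[\w.-]+\.(cloud\.databricks\.com"
        r"|azuredatabricks\.net|gcp\.databricks\.com)/?$",
        workspace_url,
    ):
        problems.append("workspace URL does not look like a Databricks workspace")
    # Personal access tokens are issued with a 'dapi' prefix.
    if not token.startswith("dapi"):
        problems.append("token does not look like a personal access token")
    return problems
```

A passing check here only saves a round trip; the real test is the live API call.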
Can I sync any Databricks table, or only specific ones?
Any table. Databricks uses wildcard record types in Oneprofile. Every table in your workspace is available as a source or destination. You choose which tables and columns to include during field mapping.
Does the Databricks integration support Delta Lake tables?
Yes. Oneprofile reads from and writes to Delta Lake tables in your Databricks workspace. Schema discovery detects column names and types automatically, including nested and complex types.
Can I use Databricks as both a source and destination?
Yes. Every Oneprofile integration is bidirectional. Read from Databricks to push data to SaaS tools, or write SaaS tool data into Databricks tables. Each direction runs as a separate sync config.
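To make "each direction runs as a separate sync config" concrete, here is a sketch of two independent configs over the same connection. The field names and table paths are hypothetical, not Oneprofile's actual configuration schema:

```python
# Outbound: push modeled Databricks data into a SaaS tool.
# (All field names below are illustrative, not a real Oneprofile schema.)
databricks_to_hubspot = {
    "source": {"connector": "databricks", "table": "analytics.gold.customers"},
    "destination": {"connector": "hubspot", "record": "contact"},
    "mapping": {"email": "email", "plan_tier": "plan_tier"},
}

# Inbound: land SaaS tool records back into a Databricks table.
hubspot_to_databricks = {
    "source": {"connector": "hubspot", "record": "contact"},
    "destination": {"connector": "databricks", "table": "analytics.bronze.hubspot_contacts"},
    "mapping": {"email": "email", "lifecycle_stage": "lifecyclestage"},
}
```

Keeping the two directions as separate configs lets each have its own schedule, field mapping, and change-detection settings.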
How does incremental sync work with Databricks?
Designate a timestamp or sequence column for change detection. Each sync run processes only rows newer than the last synced value. Initial loads do a full table read, then switch to incremental automatically.
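The watermark pattern described above can be sketched as a simple query builder. This is an illustrative simplification: a production sync engine would parameterize values rather than interpolate them, handle type-specific comparisons, and page through results:

```python
def incremental_query(table: str, cursor_column: str, last_value=None) -> str:
    """Build the SELECT for one sync run against a Databricks table.

    Illustrative sketch only: real engines parameterize the cursor value
    instead of string-interpolating it.
    """
    if last_value is None:
        # Initial load: full table read, ordered so the final row
        # establishes the first watermark.
        return f"SELECT * FROM {table} ORDER BY {cursor_column}"
    # Incremental run: only rows newer than the last synced value.
    return (
        f"SELECT * FROM {table} "
        f"WHERE {cursor_column} > '{last_value}' "
        f"ORDER BY {cursor_column}"
    )
```

After each run, the engine stores the largest cursor value it saw and passes it as `last_value` next time.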
Is the Databricks integration available on the free plan?
Yes. Databricks is available on all plans, including free. The free plan limits total integrations and sync configs, but there are no per-connector fees on any plan.

