Data Migration Process: Step-by-Step Guide
The data migration process for SaaS tools in 6 steps. Audit source data, map fields, test on a small batch, backfill, validate, and set up ongoing sync.
You're switching CRMs. You have 20,000 contacts, 3,000 companies, and 800 active deals in the old tool. Every data migration process guide you find describes a seven-step enterprise methodology that assumes you have a data engineer, a staging environment, and three months. You have until Friday. For the fundamentals of what data migration is and the five types you'll encounter, see our guide to data migration.
This guide covers the hands-on data migration process for moving records between SaaS tools. Six concrete steps. No warehouse, no ETL pipeline, no custom scripts. The same process works whether you're migrating a CRM, switching billing platforms, or consolidating customer records after an acquisition.
The data migration process starts with a source audit
Every failed migration traces back to assumptions about the source data. Before you move anything, spend 30 minutes understanding what actually exists.
Open your source tool and answer three questions:
What record types need to move? For a CRM migration, that's contacts, companies, deals, and sometimes activities or notes. Write the list down. If you skip this step, you will discover a missing record type three weeks after cutover when someone asks where the deal history went.
Which fields are actually populated? A CRM with 20,000 contacts might have email on 97% of records but phone number on 35%. Export a sample of 100 records and check field completeness. This tells you which fields are worth mapping and which would transfer as empty columns.
What is your matching key? How will you link records between the old tool and the new one? Email is the most reliable matching key for contacts. Company domain works for organizations. If your source data lacks a consistent unique identifier, you need to pick one before mapping fields. Without a matching key, every rerun creates duplicates.
| Audit question | What to check | Time |
|---|---|---|
| Record types | Contacts, companies, deals, activities | 5 min |
| Field completeness | % populated for top 10 fields | 10 min |
| Matching key | Email, domain, or custom ID | 5 min |
| Custom fields | Picklist values, data types | 10 min |
The source audit is the step that separates a clean migration from a week of cleanup. It takes 30 minutes. Skipping it costs days.
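The completeness check is easy to script against a sample export. A minimal sketch (the field names and sample rows here are illustrative; in practice you'd load ~100 rows from your CRM's CSV export with `csv.DictReader`):

```python
from collections import Counter

def field_completeness(rows, fields):
    """Return the % of non-empty values per field across a sample of records."""
    filled = Counter()
    for row in rows:
        for f in fields:
            if str(row.get(f, "")).strip():
                filled[f] += 1
    total = len(rows) or 1
    return {f: round(100 * filled[f] / total) for f in fields}

# Illustrative in-memory sample standing in for a CSV export
sample = [
    {"email": "ana@acme.com", "phone": "+1 555 0100"},
    {"email": "bo@initech.com", "phone": ""},
    {"email": "", "phone": ""},
]
print(field_completeness(sample, ["email", "phone"]))  # → {'email': 67, 'phone': 33}
```

Anything below roughly 30% populated is usually not worth mapping; anything above 90% is a candidate matching key.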
Data migration process step 2: map fields between tools
Field mapping is where you decide what data goes where. Get it wrong, and phone numbers end up in email fields, dates render as text strings, and picklist values silently drop.
Build a mapping table with four columns: source field name, destination field name, data type, and any transformation needed.
| Source (old CRM) | Destination (new CRM) | Type | Transformation |
|---|---|---|---|
| Full Name | first_name + last_name | Text | Split on first space |
| Company | organization_name | Text | Direct map |
| Phone | phone_number | Phone | Convert to E.164 |
| Lead Source | lead_source | Picklist | Remap values |
| Created Date | created_at | Date | ISO 8601 conversion |
| Subscription Status | subscription_status | Text | Direct map |
| Plan Name | plan_tier | Text | Direct map |
| Lifetime Revenue | total_revenue | Number | Cents to dollars (/100) |
Three things break during field mapping more than anything else:
Type mismatches. The source stores dates as "MM/DD/YYYY" strings. The destination expects ISO 8601 timestamps. The source stores currency in cents (10000). The destination expects dollars (100.00). Identify every type mismatch and define the conversion rule before running any migration.
Structural mismatches. One tool has a single "Full Name" field. The other has separate "first_name" and "last_name" fields. One tool stores address as a single text block. The other has five fields (street, city, state, zip, country). These require transformation logic, not just mapping.
Missing destination fields. Your source has custom fields (subscription_status, plan_name, renewal_date) that don't exist in the new tool yet. You need to create custom properties in the destination before the migration runs, or use a sync tool that creates them automatically.
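A few of the transformations from the mapping table, sketched in Python. The picklist remapping values are assumptions; replace them with your destination's actual allowed options:

```python
from datetime import datetime

# Assumed remapping; substitute your destination's real picklist options.
LEAD_SOURCE_MAP = {"Web Form": "website", "Trade Show": "event"}

def transform(record):
    """Apply the mapping table: split names, convert types, remap picklists."""
    first, _, last = record["Full Name"].partition(" ")
    return {
        "first_name": first,
        "last_name": last,
        "organization_name": record["Company"],
        # "MM/DD/YYYY" string -> ISO 8601 date
        "created_at": datetime.strptime(
            record["Created Date"], "%m/%d/%Y"
        ).date().isoformat(),
        # Unmatched values fall through to "other" instead of silently dropping
        "lead_source": LEAD_SOURCE_MAP.get(record["Lead Source"], "other"),
        # Cents -> dollars
        "total_revenue": record["Lifetime Revenue"] / 100,
    }

print(transform({
    "Full Name": "Ada Lovelace",
    "Company": "Analytical Engines",
    "Created Date": "03/14/2019",
    "Lead Source": "Web Form",
    "Lifetime Revenue": 10000,
}))
```

The explicit fallback on the picklist lookup is the point: silent drops are what you're defending against.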
Run a test data migration on a small batch
Never run the full data migration process on your entire dataset first. A test batch of 50 records catches mapping errors that would otherwise affect every record in your system.
Pick records deliberately, not randomly. Include:
- Records with all custom fields populated
- Records with missing required fields (no email, no company)
- Records with special characters in names (accents, apostrophes, CJK characters)
- The oldest and newest records in your system (date formatting edge cases)
- Records with the most complex data (multiple deals, extensive notes)
Run the test batch. Then inspect 10 records in the destination manually. Check every field, not just the ones you expect to work. The fields you didn't think about are the ones that break.
Common problems the test batch reveals:
- Names truncated at 50 characters because the destination field is shorter
- Dates showing as January 1, 1970 (Unix epoch) because the format conversion failed
- Picklist values mapped to "Other" or blank because the source value didn't match any destination option
- Currency fields off by a factor of 100 (cents vs. dollars)
Fix the mapping. Run the test again. Two or three iterations is normal. Each one takes 15 minutes. Skipping this step costs days of cleanup on 20,000 records.
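Picking edge cases can itself be scripted. A partial sketch covering three of the criteria above (missing matching keys, special characters, date extremes); the field names are assumptions based on the earlier mapping table:

```python
def pick_test_batch(records, size=50):
    """Deliberately select edge cases rather than a random sample."""
    def is_edge(r):
        name = r.get("Full Name", "")
        return (
            not r.get("Email")                  # missing matching key
            or any(ord(c) > 127 for c in name)  # accents / CJK characters
            or "'" in name                      # apostrophes
        )
    edges = [r for r in records if is_edge(r)]
    # Oldest and newest records catch date-formatting edge cases
    # (assumes sortable ISO-style date strings)
    by_date = sorted(records, key=lambda r: r["Created Date"])
    batch = edges + [by_date[0], by_date[-1]]
    # Deduplicate while preserving order, then cap at the batch size
    seen, out = set(), []
    for r in batch:
        if id(r) not in seen:
            seen.add(id(r))
            out.append(r)
    return out[:size]
```

Records with fully populated custom fields and complex related data still need to be picked by hand; no heuristic knows your schema better than you do.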
Execute the full data migration process with backfill
With your field mapping validated on a test batch, the full migration is straightforward. Connect the source tool and the destination tool, apply your mapping, and run the initial sync.
Choose your sync mode. Use "Update or Create" for most migrations. This updates existing records in the destination (if they match on your matching key) and creates new records for everything else. If you want the destination to be an exact mirror of the source, including deletions, use "Mirror" mode.
Run the backfill. The first sync transfers every record from the source to the destination. For most SaaS datasets (10,000 to 100,000 records), this takes minutes. The sync tool handles API rate limits, pagination, and batching automatically.
Watch the dead letter queue. Records that fail (field validation errors, rate limits, API timeouts) should land in a review queue, not disappear. After the backfill completes, check the queue. Common failures: a phone number with letters in it that the destination API rejected, a record missing a required field, or a batch that hit a rate limit and needs to be reprocessed.
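The Update-or-Create loop with a dead letter queue reduces to a small amount of logic. A sketch with an in-memory stand-in for the destination API (the client class and its `find_one`/`create`/`update` interface are hypothetical; a real CRM client differs):

```python
class InMemoryDestination:
    """Stand-in for a real CRM API client (hypothetical interface)."""
    def __init__(self):
        self.rows = {}
        self.next_id = 1

    def find_one(self, key, value):
        return next((r for r in self.rows.values() if r.get(key) == value), None)

    def create(self, rec):
        if not rec.get("email"):
            raise ValueError("email is required")  # simulated API validation
        self.rows[self.next_id] = dict(rec, id=self.next_id)
        self.next_id += 1

    def update(self, row_id, rec):
        self.rows[row_id].update(rec)

def upsert_batch(records, destination, match_key="email"):
    """Update-or-Create: match on the key, update hits, create misses.
    Failures land in a dead letter queue instead of disappearing."""
    dead_letter = []
    for rec in records:
        try:
            existing = destination.find_one(match_key, rec.get(match_key))
            if existing:
                destination.update(existing["id"], rec)
            else:
                destination.create(rec)
        except Exception as err:  # validation error, rate limit, timeout
            dead_letter.append({"record": rec, "error": str(err)})
    return dead_letter
```

Because matches are updated rather than re-created, running the batch twice leaves one record per matching key, which is what makes reruns safe.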
With Oneprofile, connect your source and destination with API keys or OAuth. Map fields visually. Choose Update or Create mode. The first sync backfills all historical records. Property-level change tracking means only changed fields get written to the destination, reducing API calls and preventing data loss from full-record overwrites. Records that fail land in the dead letter queue for investigation.
Validate the data migration: record counts and field completeness
A migration that "completed successfully" but dropped 2,000 phone numbers is worse than one that failed with an error message. Validation is not optional. Run three checks:
Record count validation. Compare total records per type between source and destination. Contacts: 20,000 in source, 20,000 in destination. Companies: 3,000 and 3,000. Deals: 800 and 800. Counts should match within 1-2%. Larger gaps mean records were dropped or duplicated.
Field completeness validation. For your five most critical fields, compare the percentage of non-empty values. If email is 97% populated in the source but 88% in the destination, the mapping dropped records or the field didn't transfer.
| Field | Source % populated | Destination % populated | Status |
|---|---|---|---|
| Email | 97% | 97% | Pass |
| Company | 89% | 89% | Pass |
| Phone | 35% | 35% | Pass |
| Plan Name | 72% | 71% | Check |
| Created Date | 100% | 100% | Pass |
Spot check. Pull 20 records at random. Compare every field value between source and destination. This catches formatting issues (dates in wrong format), truncation (names cut off), and mapping errors (data in wrong field) that aggregate counts miss.
If validation fails, check the dead letter queue first. Most failures trace back to a small set of records with specific data quality issues. Fix the root cause, reprocess the failed records, and validate again.
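The first two checks are mechanical enough to script against exports from both tools. A minimal sketch, assuming you can pull records from each system as lists of dicts; the 1-2% count tolerance and 1-point completeness tolerance mirror the thresholds above:

```python
def validate_migration(source_rows, dest_rows, fields,
                       count_tolerance=0.02, field_tolerance=0.01):
    """Compare record counts and per-field completeness between systems."""
    def pct(rows, f):
        return sum(1 for r in rows if str(r.get(f, "")).strip()) / (len(rows) or 1)

    report = {
        # Counts should match within ~1-2%
        "count_ok": abs(len(source_rows) - len(dest_rows))
                    <= count_tolerance * (len(source_rows) or 1),
        "fields": {},
    }
    for f in fields:
        src, dst = pct(source_rows, f), pct(dest_rows, f)
        report["fields"][f] = {
            "source": round(100 * src),
            "dest": round(100 * dst),
            "status": "Pass" if abs(src - dst) <= field_tolerance else "Check",
        }
    return report
```

The spot check stays manual on purpose: aggregate percentages can't tell you that a date landed in the wrong format or a name was truncated.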
Convert from one-time data migration to ongoing sync
The traditional data migration methodology treats migration as a project with a start and end date. But SaaS tool transitions don't work that way. Your team uses both tools for days or weeks while integrations, automations, and habits catch up. During that overlap period, data changes in both places. Without ongoing sync, the two systems diverge within hours. For a detailed look at why this divergence is the most underestimated migration risk, see our guide to data migration risks.
The better approach: treat migration as sync with a starting point.
The initial backfill (Step 4) moved all historical records. That was the migration.
Now set a 15-minute sync schedule. Every 15 minutes, only records that changed in the source get synced to the destination. Your team keeps using the old tool. Changes flow to the new tool automatically.
When your team has fully moved to the new tool, stop the sync and decommission the old one.
This eliminates the most common data migration failure: the gap between "migration done" and "everyone actually using the new tool." Instead of a hard cutover that forces your team to switch overnight, you get a gradual transition where both systems stay current.
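Conceptually, the ongoing sync is the backfill loop with a cursor. A sketch with a hypothetical source client (the `changed_since` method and `updated_at` field are assumptions; real tools expose this as a "modified since" API filter):

```python
from datetime import datetime, timezone

class FakeSource:
    """Stand-in for the old tool's API (hypothetical interface)."""
    def __init__(self, rows):
        self.rows = rows

    def changed_since(self, cursor):
        # Only records modified after the last sync cursor
        return [r for r in self.rows if r["updated_at"] > cursor]

def incremental_sync(source, upsert, cursor):
    """Move only the delta since the last run; return the new cursor."""
    for rec in source.changed_since(cursor):
        upsert(rec)  # Update-or-Create on the matching key, as in the backfill
    return datetime.now(timezone.utc)
```

Run `incremental_sync` on a 15-minute schedule (cron or any scheduler) and each pass moves only what changed, so both tools stay current through the overlap period.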
With Oneprofile, the migration and the ongoing sync are the same configuration. The backfill that migrated your data keeps running on a schedule. Property-level change tracking ensures only fields that actually changed get updated. When a sync fails, the record lands in the dead letter queue instead of disappearing. No second tool, no re-mapping, no new configuration. The data migration process becomes an ongoing data sync by changing one setting: the schedule.
How long does the data migration process take between SaaS tools?
The transfer itself takes minutes for most SaaS datasets (under 100,000 records). The full data migration process including audit, field mapping, test batch, and validation takes one to three days.
Do I need a data engineer to migrate data between tools?
No. SaaS tool migrations involve mapping fields between two APIs with well-defined schemas. A sync tool with visual field mapping handles it without code, scripts, or staging environments.
What is the most common data migration mistake?
Skipping the test batch. Teams run the full migration without testing on 50 records first, then discover field mapping errors across every record. A 30-minute test prevents days of cleanup.
Can I keep both tools running during the migration?
Yes. Run the initial backfill while both tools stay live. Enable incremental sync to keep them aligned every 15 minutes. Cut over to the new tool when you have verified the data is complete.
How do I prevent duplicate records during migration?
Use a matching key like email for contacts or domain for companies. The sync tool checks if a record exists before creating a new one. Update or Create mode prevents duplicates on every run.