Migrating data into or out of Salesforce is a time-consuming but necessary task, and it poses a challenge for any enterprise. Both organizational and technical challenges need to be managed and overcome to ensure a smooth Salesforce integration process.

Why pay attention to the data migration process? It’s simple: bad data leads to bad business outcomes. Data can become misaligned or outdated in an enterprise for a number of reasons:

  • Large companies often have databases managed by disparate parts of the organization
  • Acquisitions bring in databases whose data is organized differently and must be synchronized with existing Salesforce data
  • Legacy systems, whether old in-house systems or ones that came with acquisitions, might not communicate with the latest APIs and/or might contain unnecessary data

Companies need a Salesforce data migration strategy to ensure their sales and marketing teams always have access to the best possible data to achieve business objectives. To help ensure that your data stays up-to-date, this post covers essential data migration strategies.

Table of Contents

  1. Best Practices for Your Salesforce Data Migration
  2. 3 Strategies to Guide Your Salesforce Data Migration
  3. The Xplenty Advantage

Integrate Your Data Today!

Try Xplenty free for 7 days. No credit card required.

Best Practices For Your Salesforce Data Migration

The goal of data migration is simple: to keep data stored in a centralized location up-to-date, or to keep data in multiple locations synchronized. This ensures that sales teams in a Salesforce organization are always using the most accurate data, which will lead to improved sales outcomes. 

1) Establish a Data Governance Plan

The first best practice is obvious: create and commit to a data governance plan.

Your organization must have a clear process so that all stakeholders are on the same page about the data migration. The plan can be part of an enterprise-wide data management initiative, or it can be a simple agreement between the groups responsible for Salesforce and the system being integrated. One key piece of the plan is that it establishes an ongoing process for CRM data migration, because keeping data clean is an ongoing task.

Any data governance plan needs to identify details about the data being exchanged (such as how legacy IDs are mapped to user IDs), a process for determining how to add or remove data from the exchange, and the decision-makers on both sides of the exchange. 
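To make this concrete, here is a minimal sketch (in Python, purely illustrative) of the kind of field-mapping record a governance plan might maintain for each data element in the exchange. The system names, field names, transforms, and owners are hypothetical placeholders, not a prescribed format.

```python
# Hypothetical field-mapping spec a data governance plan might capture.
# Every name below is an example placeholder, not a real system or field.
FIELD_MAPPINGS = [
    {
        "source_system": "legacy_erp",
        "source_field": "CUST_ID",
        "salesforce_object": "Account",
        "salesforce_field": "Legacy_ID__c",   # custom external-ID field
        "transform": "strip leading zeros",
        "owner": "erp-team@example.com",      # decision-maker for this field
    },
    {
        "source_system": "legacy_erp",
        "source_field": "SALES_REP",
        "salesforce_object": "Account",
        "salesforce_field": "OwnerId",
        "transform": "look up Salesforce user ID by rep email",
        "owner": "sales-ops@example.com",
    },
]
```

Even a lightweight spec like this answers the key governance questions: what data is exchanged, how legacy IDs map to Salesforce fields, and who owns each decision.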

2) Make Sure Your Data Is Organized

You can’t successfully migrate data without focusing on data quality. The essential first step is a data governance process that ensures clean source data. Many companies do need custom fields, but every custom field should go through a rigorous vetting process that explains what the field is, how it will be used, and what distinguishes “good” from “bad” data.

Administrators will use this information to create validation rules, picklists, and field dependencies to enable data mapping. Sometimes, validation rules and picklists aren’t enough to ensure quality data for a field. In that case, audit the field by running regular reports to check for junk data. Data cleanup and organization isn’t a one-time project; ensuring data quality is a continuous process.
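As one example of such an audit, the sketch below uses the open-source simple_salesforce Python library to run a simple exception check against a hypothetical custom field (Industry_Segment__c). The credentials, field name, and allowed values are placeholders for whatever your own governance plan defines.

```python
# Sketch of a scheduled "exception report" that flags junk data in a custom
# field. Assumes the simple_salesforce library and a hypothetical field
# Industry_Segment__c whose valid values live in VALID_SEGMENTS.
from simple_salesforce import Salesforce

VALID_SEGMENTS = {"Retail", "Manufacturing", "Healthcare"}

sf = Salesforce(username="user@example.com",
                password="password",
                security_token="token")

# Pull recently modified Accounts and check the field against the allowed list.
results = sf.query_all(
    "SELECT Id, Name, Industry_Segment__c FROM Account "
    "WHERE LastModifiedDate = LAST_N_DAYS:7"
)

bad_records = [
    r for r in results["records"]
    if r["Industry_Segment__c"] not in VALID_SEGMENTS
]

for r in bad_records:
    print(f"Junk data in {r['Id']} ({r['Name']}): {r['Industry_Segment__c']!r}")
```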

Related Reading: Salesforce Data Quality

3) Start With a Pilot Project

In most organizations, Salesforce is the newer system, and its flexible data structure can conflict with existing systems that have inflexible data governance processes. This can cause tension between business units. A useful approach is to start with a simple pilot project that shows how easily the CRM system can import data.

For instance, perhaps there is useful sales data sitting in an older, less flexible database where it’s hard to extract value from that information. Even using only the default data settings in Salesforce, you can filter out garbage data and accept odd legacy formats (among other tricks) to bring that legacy data into the Salesforce CRM database. From there, Salesforce reporting can be used to surface trends and perhaps even identify purchase behavior.
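A pilot of this kind often needs only a small amount of pre-import cleanup. The sketch below, with hypothetical file and column names, shows the general idea: normalize an odd legacy date format and drop obvious garbage rows before handing the file to Salesforce.

```python
# Minimal pre-import cleanup for a legacy CSV extract. File names, column
# names, and the legacy date format are hypothetical examples.
import csv
from datetime import datetime

with open("legacy_sales.csv", newline="") as src, \
     open("salesforce_ready.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=["Name", "CloseDate", "Amount"])
    writer.writeheader()
    for row in reader:
        name = row["ACCT_NAME"].strip()
        if not name or name.upper() in {"N/A", "UNKNOWN"}:
            continue  # skip garbage rows rather than importing them
        # Legacy system stores dates as DD.MM.YYYY; Salesforce expects YYYY-MM-DD.
        close_date = datetime.strptime(row["CLOSE_DT"], "%d.%m.%Y").date().isoformat()
        writer.writerow({"Name": name,
                         "CloseDate": close_date,
                         "Amount": row["AMOUNT"].replace(",", "")})
```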

Once you show value with a pilot project, the data governance team and other departments will more readily work with you. 

4) Closely Monitor Your Integration

Most data migrations and integrations are ongoing projects, not one-time efforts. Constant monitoring is required to ensure data conforms to the standard. An integration tool is the best way to do this: it will identify data issues and send alerts (often via email) to notify process owners. Another method is to regularly run exception reports to find data that doesn’t conform to the standard.

Each method has flaws. Alerts can create false alarms that cause monitoring fatigue. Reports are only as good as the people running them and evaluating the results. 

We recommend combining a good integration tool with an integration built by someone who completely understands that tool. Alerts are then more likely to be taken seriously because they’ll be more accurate. As an additional way to track progress on data quality and duplicate records, Salesforce dashboard tools can create an integration status dashboard. Go beyond simple text reports: for example, create a dashboard component that counts integration errors over time to track progress toward your quality metrics.
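If integration errors are logged to Salesforce itself (for example, to a hypothetical custom object such as Integration_Error__c), the daily counts behind such a dashboard component could be pulled with an aggregate SOQL query. The sketch below uses the simple_salesforce library; the credentials and object name are placeholders.

```python
# Sketch: pull integration-error counts by day for a status dashboard.
# Assumes errors are logged to a hypothetical Integration_Error__c object.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com",
                password="password",
                security_token="token")

results = sf.query_all(
    "SELECT DAY_ONLY(CreatedDate) error_day, COUNT(Id) error_count "
    "FROM Integration_Error__c "
    "WHERE CreatedDate = LAST_N_DAYS:30 "
    "GROUP BY DAY_ONLY(CreatedDate) "
    "ORDER BY DAY_ONLY(CreatedDate)"
)

# Print one line per day; a dashboard component would chart the same numbers.
for row in results["records"]:
    print(row["error_day"], row["error_count"])
```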

5) Consider Third-Party Data Integration Tools

The best technology can’t make up for a poor integration plan. Even though we are a data integration vendor, we always focus on process first, then technology. Without a strong process, an integration built on your company’s data model will never be as effective as it could be. On the other hand, a good integration plan can fall short with weak technology.

A tool like Xplenty can provide a great integration experience between your Salesforce implementation and your other key systems:

  • Drag-and-drop tools to build data pipelines with steps that cleanse and check data
  • Fully configurable pipelines that enable alerts for bad data
  • Secure integration between a cloud service like Salesforce and the systems behind your firewall
  • A team of experts that can build your integration pipeline if you don’t have the in-house expertise to do it yourself


3 Strategies To Guide Your Salesforce Data Migration 

Now that you have a solid foundation of general data migration best practices, how do you turn those ideas into reality? There are three general methods to exchange data between Salesforce and internal and external systems:

  • Use Salesforce Tools
  • Employ the Salesforce API
  • Choose a Third-Party Integration Tool

There’s no single “right” solution. In fact, many organizations use multiple solutions to accomplish their data migration needs.

1) Use Salesforce Tools

Salesforce reporting can export Excel and text files for transfer to other systems. The Data Loader can load data from outside systems into Salesforce. An administrator or skilled user can use these capabilities to create an integration pipeline without writing any code. 

It’s a simple approach but takes a lot of manual effort. During the data load, careful oversight is required to spot bad data and formatting issues. Even so, it’s a great way to begin integrating a new Salesforce implementation into an organization. The prototype integrations can be coded later or implemented using a third-party tool.

Let’s use customer tracking as an example:

  1. Periodically run a report of object IDs and other customer information to export new customers, i.e., accounts created after the previous report ran.
  2. Upload that export to the external customer tracking system, which runs a process that exports a list of customer IDs in .csv format.
  3. Use a Data Loader step to read the .csv file and update the custom field containing the external account IDs (see the sketch after this list).
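As a rough illustration of step 3, the snippet below reshapes the external system’s export into a file the Data Loader can use for an update operation. The file names, column names, and the External_Account_ID__c custom field are hypothetical.

```python
# Sketch: reshape the external system's export into a Data Loader update file
# keyed by Salesforce record Id. All names here are placeholder examples.
import csv

with open("external_export.csv", newline="") as src, \
     open("dataloader_update.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)                      # columns: SF_ID, EXT_ID
    writer = csv.DictWriter(dst, fieldnames=["Id", "External_Account_ID__c"])
    writer.writeheader()
    for row in reader:
        writer.writerow({"Id": row["SF_ID"],
                         "External_Account_ID__c": row["EXT_ID"]})
```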

When your needs are simple and infrequent, the built-in Salesforce tools could be your best choice, since the cost of building an API integration or adopting a third-party tool may outweigh the benefit.

2) Employ The Salesforce API

Salesforce’s application programming interface (API) is used by programmers to upload and download data. 

Programmers use the proprietary Salesforce Object Query Language (SOQL) to retrieve data from Salesforce. They can also write code to insert or update standard or custom Salesforce objects.

The Salesforce API is extremely flexible and powerful, but requires programming. Every integration using the API requires development resources and an integration specification. Like any development project, there are risks and costs.
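For a sense of what API-based integration code looks like, here is a short sketch using the open-source simple_salesforce Python library: it runs a SOQL query and then updates a hypothetical custom field on each returned Account. The credentials and field name are placeholders.

```python
# Sketch of the API approach: query with SOQL, then write back to Salesforce.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com",
                password="password",
                security_token="token")

# Retrieve data with SOQL.
accounts = sf.query(
    "SELECT Id, Name FROM Account WHERE CreatedDate = LAST_N_DAYS:1"
)

# Update a standard or custom object; External_Account_ID__c is hypothetical.
for acct in accounts["records"]:
    # In a real integration this value would come from the external system.
    external_id = "EXT-" + acct["Id"][:15]
    sf.Account.update(acct["Id"], {"External_Account_ID__c": external_id})
```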

API benefits include:

  • Reduces data entry needs and saves time
  • Minimizes human error
  • Automates the movement of data between two different apps 
  • Eliminates or decreases the time-intensive need to manually run a report and export the results

If you have complex or frequent data migration needs and/or developers with time to spare, building your own API interface could be the right data migration strategy for you.

3) Choose a Third-Party Integration Tool

Third-party tools occupy the middle ground between a manual approach and writing your own code. These tools can be as simple as a one-way extraction program, or as complex as a tool that has its own programming language. 

These tools shield the user from the details of the API, and many of them have a drag-and-drop interface that makes it easy to create an integration workflow. Often, these tools are cloud-based so there’s no need for the hassle of an on-premises install.

A good third-party tool will include many of these features:

  • Data cleansing and verification technology to make sure the data being transferred in the integration is not corrupt.
  • Scheduling and process automation, so manual intervention is the exception, not the norm.
  • A wide variety of data targets, including most of the popular cloud data warehouse platforms (Amazon Redshift, Snowflake, Google BigQuery, etc.), relational databases (Oracle, SQL Server, MySQL, etc.), and generic REST APIs. The latter two targets are vitally important in an enterprise context because integration targets in the enterprise are often systems “inside the firewall.”
  • Bi-directional data flow, so it can read from and write to your Salesforce instance.

In addition to the obvious benefit of avoiding programming, third-party tools promise faster implementation and higher-quality integrations. Implementations can be fast because the tools are pre-built, and integration quality can be high because of built-in data validation and cleansing tools. The drawbacks of third-party integrations are cost, vendor stability, and the possibility that the integration tool may lack the capability necessary to fully implement the integration.

The Xplenty Advantage

As a third-party integration tool, Xplenty has a number of strengths: 

  • We integrate with almost any modern tool, including cloud data warehouses (Redshift, BigQuery, Snowflake, and others), relational and NoSQL databases (including MySQL, Heroku PostgreSQL, SQL Server, MongoDB), and any system, cloud or internal, that has a REST API interface. 
  • Our Salesforce integration is bi-directional: we can read from and write to standard or custom Salesforce objects.
  • Since Xplenty can securely access data behind your enterprise firewall, it can be used for integrations between Salesforce and your internal systems.
  • Xplenty uses a drag-and-drop interface that lets non-developers build data pipelines between systems without writing code. Or, our skilled data engineers can build pipelines for you, if you don’t have the resources to do it yourself.

If you want to start extracting value from your Salesforce implementation like never before, contact us for an Xplenty demo, test pilot, and complimentary startup session with our implementation team.