WEBINAR Series

Salesforce Data Integrity Webinar

Use Salesforce regularly? This webinar recap is for you. Here, Xplenty's panel of experts explore hot-button Salesforce issues and more.

HOST Leonard Linde
KEYNOTE SPEAKER Sten Ebenau, CEO of Plauti B.V.
AREAS WE WILL COVER Salesforce, Data Integrity

Do you work with Salesforce on a regular basis? Then this comprehensive webinar recap is perfect for you. In this engaging, all-encompassing discussion, Xplenty brings together experts from the data and Salesforce world to tackle some of the hot-button issues surrounding data integrity within Salesforce.

Hosted by Xplenty's Leonard Linde, this talk features Sten Ebenau (Plauti), Matt Kennedy (OwnBackup), and Andrea Hall (Form Assembly) touching on a wide range of Salesforce data integrity topics. Beginning with introductions and some all-too-relatable "horror stories" about data integration gone wrong on different platforms, the experts dive deep into the hot-button issues of the Salesforce and data world.

Andrea helps to cover some of the best practices for data gathering, specifically when it comes to using forms with Salesforce. Then, Sten covers what listeners need to know about data duplication within their systems and Salesforce databases, and presents some ways to avoid the duplicates that can plague data collection. Matt speaks on backup strategies, underscoring the importance of an effective redundancy plan for an organization's critical data.

The panel also answers some questions from attendees, covering important data issues from the perspective of both the producer and the consumer and addressing the problems of data validation in form tools and realistic timelines for archiving critical data. The recap concludes with examples of a good-quality workflow and ways to work with zero-budget customers. Want to learn more? Take a deep dive into the issues with the full transcript!

TRANSCRIPT
  • Salesforce data "horror stories" (01:56)
  • Using web forms with Salesforce and data gathering best practices (07:43)
  • Salesforce native duplicate manager or other apps? (14:03)
  • Effective duplicate management strategies and tools (17:13)
  • Compliance with Salesforce data (19:11)
  • Archiving your own data (26:07)
  • Data as a producer and consumer (27:59)
  • Timelines for archiving data (32:06)
  • Data validation in form tools (34:21)
  • Examples of a good-quality workflow (39:21)
  • Working with "zero-budget" customers (43:51)

Leonard Linde:

(00:41) I want to welcome everybody to the first X-Force Data Summit webinar. This is building off the X-Force Data Summit, which was a virtual conference that we held. We are Xplenty; we're a company that makes ETL tools, and one of our targets is Salesforce. Essentially, what we're trying to do is hold discussions and other virtual events in this time when a lot of people are locked down, where we can talk about all kinds of different Salesforce topics. So today I'm going to start with this agenda. We're going to do some introductions, and then we're going to deal with some data integrity topics - best practices for data gathering, data cleanup, deduplication, and more.

Leonard Linde:

(01:26) For data enrichment - unfortunately, the expert on that topic, Jeremy Boudinet, was unable to make it at the last minute today, so maybe I'll say something on data enrichment myself. I'm sure it will be much less interesting than anything Jeremy was going to say. Then we'll go to backup and recovery, and Salesforce in the enterprise. By that, I mean places where people are using Salesforce as a single source of truth. So what does it mean to be a good data integrity steward in that environment?

Leonard Linde:

(01:56) So, let's begin with introductions. When you introduce yourself, if you have a hair-curling, interesting horror story to tell, that's welcome; if you don't, that's fine too. Maybe just tell us a little about yourself. I'm going to start with Sten, from a company called Plauti.

Sten Ebenau:

(02:16) Welcome, everyone. My name is Sten. I'm the CEO and founder of Plauti, a company based in the Netherlands. We create two applications for Salesforce, both on the AppExchange: Duplicate Check for Salesforce and Record Validation for Salesforce. Duplicate Check is all about finding duplicates; Record Validation is all about validating your existing data - whether the address is correct, whether the phone number is correct, the postal address, of course. I got excited about data quality when I was doing some work for Procter & Gamble.

(03:00) They had based their global customer data center and customer contact center on Salesforce and the Service Cloud. They were giving out coupons for free diapers and promotion packs, batteries, et cetera, and people were just so - how do you call it - creative about how to send those coupons in with different addresses that were basically the same address. But that's all about data quality. So their ROI - return on investment - on a data quality tool for that alone was immense. That's really when I got excited about doing data quality for Salesforce. I think that's all for my introduction.

Leonard Linde:

(03:56) Thanks, Sten. We're going to head to Andrea now. Tell us a little about yourself and whether you have any data quality horror stories.

Andrea Hall:

(04:03) Sure. My name is Andrea Hall; I'm the manager of our implementation team at Form Assembly. Form Assembly is a data collection tool - a form-building data collection tool - and our biggest integration is with Salesforce. I'll tell a story that we actually see quite often. Within our Salesforce connector, we allow AND and OR logic when you're looking up records to either create or update them. Every once in a while, we'll see a customer who doesn't realize their lookup uses all AND logic rather than a combination.

(04:40) So, for example, they're trying to look up a contact based on last name and email, or last name and mobile phone number, or last name and home phone. But what they don't realize when they're setting it up is that they actually have AND logic throughout all of that. So the connector tries to match on all of those fields, and nine times out of ten, that results in no record found. It's not going to match on all of that unless you have a contact who keeps everything up to date.

(05:06) Every so often, this happens to coincide with a form that expects a lot of responses once it's released. That usually means they're expecting anywhere from a few hundred to a few thousand responses in those first few hours - which leaves them with a few hundred to a few thousand duplicate records in Salesforce. Luckily, they're usually able just to remove all those duplicates that get created, because the data is actually stored in Form Assembly. So we can go through and help them get their connector configured correctly and reprocess all of them, but it usually does cause a bit of a stir within the first few hours.
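The AND-versus-OR lookup pitfall Andrea describes can be pictured in a few lines. This is a hypothetical sketch of the matching behavior, not Form Assembly's actual connector code: with AND logic, a record must match on every lookup field, so one stale phone number is enough to produce "no record found" and therefore a duplicate.

```python
# Hypothetical sketch of the lookup pitfall described above - not
# Form Assembly's actual connector code. With AND logic, a record must
# match on every lookup field; with OR logic, any one field suffices.

def find_matches(records, criteria, mode="AND"):
    """Return records matching the lookup criteria under AND or OR logic."""
    def matches(rec):
        hits = [rec.get(field) == value for field, value in criteria.items()]
        return all(hits) if mode == "AND" else any(hits)
    return [rec for rec in records if matches(rec)]

contacts = [
    {"LastName": "Doe", "Email": "jdoe@example.com", "MobilePhone": None},
]
criteria = {"LastName": "Doe", "Email": "jdoe@example.com", "MobilePhone": "555-0100"}

# AND across last name, email, AND mobile fails because the stored
# phone number is stale, so "no record found" leads to a duplicate:
print(find_matches(contacts, criteria, mode="AND"))  # []
# OR logic (or a last name + email combination) still finds the contact:
print(find_matches(contacts, criteria, mode="OR"))
```

In practice the fix Andrea describes is exactly this: match on a small combination of reliable fields rather than requiring every field to agree.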

Leonard Linde:

(05:47) Yeah. Duplicates. Thank you, Andrea. Our final panelist is Matt Kennedy from OwnBackup. Tell us about yourself, Matt.

Matt Kennedy:

(05:57) Yeah. Thanks, Leonard. As you said, my name is Matt Kennedy, Director of Partner Enablement here at OwnBackup. I've been in the Salesforce ecosystem since about 2009 - I'm a Salesforce certified admin with a lot of years of developer experience. I'm also a user group leader here in New Jersey. I've been with OwnBackup for the last three and a half years, initially as a sales engineer, but now I've switched over to the alliances team. So I'm a technical resource, working with a lot of partners and customers.

(06:25) I think we have lots of stories because we're backing up over 2,000 Salesforce instances. But the one that scares me the most is from one of our customers, a company called Yadkin Bank (another banking company later acquired them). Think about banks working on the Salesforce platform - I know you're going to talk later about a system of record, a source of truth, right?

(06:52) Think about a loan and a loan record within Salesforce - it's not just one record. Usually, there's a lot of information associated with that loan: you're attaching all kinds of application forms and everything that goes with it. They had an integration running with Informatica, another ETL tool, and the integration should have updated these records. Well, instead of updating 45,000 loan collateral records, it deleted them. Just think about that.

(07:21) That's 45,000 records deleted, plus all the information associated with them. Fortunately, they were using our tool and were able to recover all of it. But if they hadn't had a backup solution in place, imagine the damage that would have inflicted on their business.

Leonard Linde:

(07:43) Agreed. I'm going to get to our first question here, and it's about data gathering best practices. What are the main gotchas to avoid when gathering data from a web form for use in Salesforce? I'll shoot this to Andrea, but everybody else can chime in, since using web forms to gather data for Salesforce is definitely a common use case.

Andrea Hall:

(08:17) I have a couple of points on best practices when looking for form tools to send your data into Salesforce. The first one is to make sure that all of your form field validations align with your Salesforce validations. I'm not just talking about making sure a phone number gets formatted the correct way throughout your records, but also, you know, if an opportunity stage moves to Closed Won, then there's a required field - things like that.

(08:47) You want to make sure that your form tool can match all those validations and all those complexities. This will ensure that the data you're sending into Salesforce is in the same format and all of your data gets collected correctly every single time. The second one would be to make sure there's a spam-prevention solution, and that you enable it whenever possible.
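Mirroring a Salesforce validation rule in the form layer, as Andrea suggests, can be sketched very simply. This is a minimal illustration under assumed rules - the specific rule and the field name `Close_Reason__c` are hypothetical examples, not actual Form Assembly or Salesforce defaults:

```python
# Minimal sketch of mirroring Salesforce validation rules in the form
# layer. The rules and the field name Close_Reason__c are hypothetical
# examples, not Form Assembly features or Salesforce defaults.

def validate_submission(values):
    """Return a list of validation errors for one form submission."""
    errors = []
    # Mirror of an assumed Salesforce rule: a close reason is required
    # once an opportunity stage reaches Closed Won.
    if values.get("StageName") == "Closed Won" and not values.get("Close_Reason__c"):
        errors.append("Close_Reason__c is required when stage is Closed Won")
    # Mirror of an assumed phone formatting rule.
    phone = values.get("Phone", "")
    if phone and not phone.replace("-", "").replace(" ", "").isdigit():
        errors.append("Phone must contain only digits, spaces, or dashes")
    return errors

print(validate_submission({"StageName": "Closed Won"}))   # one error
print(validate_submission({"StageName": "Prospecting"}))  # []
```

The point is that rejecting a submission at the form, with the same rules Salesforce will apply, beats having the connector fail silently later.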

(09:09) This will minimize the possibility of sending hundreds of thousands of spam responses to Salesforce. I've also seen some customers who will leave that off their forms, and they have a connector set up to create a new contact every time the form is submitted. They wake up the next morning at 8:00 AM and have thousands of responses inside their Salesforce environment. That's never fun. You want to make sure it's always enabled whenever you can. Form Assembly offers Google reCAPTCHA on all of our plans to help alleviate some of those issues.

Leonard Linde:

(09:44) I was just going to ask - it seems like reCAPTCHA is the gold standard. It's the only one I see anymore. Do you guys feel that way too?

Andrea Hall:

(09:52) I do, yes. I have to fill that out on a lot of websites. Click all the stop signs. 

Leonard Linde:

(09:59) Right? Right. They have that magic one where, if you just click it, it figures out that you're human. I don't know exactly how that works...

Andrea Hall:

(10:04) Watch us through the webcam, maybe. I don't know. The next one is duplicate management. Just make sure you keep that top of mind, not only when you're choosing your form solution, but when you're building out your web forms as well. When you're sending your data into Salesforce, you want to make sure you're considering all the audiences that you're going to be collecting that data from.

(10:32) Then, with that, you have to consider whether you're going to be creating new records every time, updating existing records, or maybe a combination of both. Our connector does allow you to just create new Salesforce records, update existing Salesforce records, or just do a lookup to find something related - things like that. This will eliminate a lot of the possible duplicates that you see with form tools that only offer standard create options.

(11:02) Then the last one is the ability to pre-fill existing information into a form. If the ability is there, definitely use it. There's a much higher rate of completion when somebody gets a form that's already filled out for them. The best example of this is a stay-in-touch form. Your form respondents are way more likely to give you up-to-date information when everything's filled in and all they have to do is look it over, make sure it's still accurate, and hit submit. Your data is a lot more likely to stay clean and up to date.

Leonard Linde:

(11:36) Somebody in the audience asked what your take is on too many versus too few form fields. I think what he's trying to get at is: what do you find the threshold is for people just abandoning a form while filling it out?

Andrea Hall:

(11:56) I think it depends on the type of form. For example, I've seen bank applications that have, let's say, ten pages, but on each of those ten pages there's maybe a maximum of 20 fields, and 10 of those fields are conditional based on whether you said yes or no to a different question. So I think the key is just to organize them: put any fields that go together into groups, visually show that they belong together, and make sure that any logic is in place.

(12:28) So, rather than saying "If you answered yes, please explain," you can trigger those fields conditionally behind the scenes so someone doesn't have to fill out a field that doesn't apply. Another great option is some type of save-and-resume feature - enable that on longer forms. That way, if someone does get tired of filling it out in one sitting, they can save it and come back to it the next day.

Matt Kennedy:

(12:52) Are you finding, Andrea, that the email would be the primary key? Or are you using multiple fields? Like, if you're prepopulating those forms, what are you using as your lookups?

Andrea Hall:

(13:02) It depends on how somebody is getting to the form. We see a lot of people who will send out stay-in-touch emails from Salesforce. In that case, you can actually attach the contact ID or an account ID to the URL and use that as your main parameter in the lookup. Then I see other people who do a two-form process, where you first put in maybe first name, last name, and email, and then it queries Salesforce to look for the rest of your information and pulls that into the second form. Awesome.

Leonard Linde:

(13:37) I have to say, I just worked with a client that was using a WordPress form - one of the hundred out there - and the ability to reach into Salesforce and populate the form just doesn't exist in a lot of these low-end form solutions. To me, that's a real impediment to user acceptance and quality data.

Andrea Hall:

(13:59) Oh, absolutely. It can be very powerful. 

Leonard Linde:

(14:03) That was data gathering best practices. Here's one for Sten, but everybody else can chip in if you've dealt with this before. What's your take on Salesforce's native duplicate manager versus apps like DemandTools and Duplicate Check?

Sten Ebenau:

(14:36) Yeah.

Leonard Linde:

(14:39) "Duplicate Check is much better" - is that your take?

Sten Ebenau:

(14:44) It really depends on what your goal is with duplicates and data. If you use Salesforce as a sales organization tool just for meetings - basically sales - a simple duplicate management system is okay for you: a sort of prevention mechanism when you're entering new records into Salesforce, and that's about it. Salesforce duplicate management is probably the right fit for you, because Salesforce duplicate management can give you that.

(15:19) But if you're working with forms like Form Assembly's, or even the standard web forms that Salesforce offers, or if you have a marketing automation tool behind Salesforce - Pardot, HubSpot, Marketing Cloud, it doesn't really matter which one - then Salesforce is going to be the customer 360 platform.

(15:37) Then you need much more than the standard Salesforce duplicate management tools. Then you need batch processing. Then you need API entry protection. Because all those marketing clouds and all those integrations into your Salesforce system can create records, update records, do whatever they want.

(16:01) If you have the email address or even the ID, that's not an issue, but if you don't have that information, then you're probably going to create duplicate records. One example: one of our biggest customers is Education First. They have lead generation on their website - around 10 to 20,000 new leads each day - and because they have offices around the world, their goal is to call each one of them within five minutes.

(16:31) But one person is going to hit that submit button on the website two or three times because they want different brochures or different folders, and you don't want a team calling that person two or three times. So within one minute of that submit, the lead needs to be duplicate-checked. Well, you can't do that with standard Salesforce duplicate management; that's just impossible. So it really depends on the goal you have with duplicate management, how many records you have, and how much influx of records you have - that will tell you the tool you need.
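A near-real-time dedupe like the one Sten describes typically keys each incoming submission on normalized fields, so repeat submits collapse to one record. Here is a minimal sketch of that general idea - an assumption about the approach, not Plauti's actual matching algorithm:

```python
# Illustrative sketch only (an assumption about the general approach,
# not Plauti's actual matching algorithm): key each incoming lead on
# normalized fields so repeat submits collapse to a single record.
import re

def dedupe_key(lead):
    """Prefer a normalized email; fall back to last name + phone digits."""
    email = lead.get("email", "").strip().lower()
    if email:
        return ("email", email)
    phone = re.sub(r"\D", "", lead.get("phone", ""))  # keep digits only
    name = lead.get("last_name", "").strip().lower()
    return ("name+phone", name, phone)

def dedupe(leads):
    """Keep the first lead seen for each key, in arrival order."""
    seen, unique = set(), []
    for lead in leads:
        key = dedupe_key(lead)
        if key not in seen:
            seen.add(key)
            unique.append(lead)
    return unique

leads = [
    {"email": "Jan@Example.com", "last_name": "Jansen"},
    {"email": "jan@example.com ", "last_name": "Jansen"},  # repeat submit
]
print(len(dedupe(leads)))  # 1
```

Real tools add fuzzy matching on top of exact normalized keys, but the normalization step is what lets "two or three submits" from the same visitor resolve to one lead within a minute.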

Leonard Linde:

(17:09) That's interesting. Does anybody else have anything to add about duplicate management? 

Matt Kennedy:

(17:13) I'll just add to that, because I think it's really critical when you think about the amount of data that could be coming in. If you're running a webinar like this, right, afterward there's a series of leads that come out of it, and they're all getting loaded in. We look at it as: you want to make sure you have a baseline of your data - a backup in place before any kind of large integration of new information or any deduplication process.

(17:45) Then you want to have a backup during and after, so you can compare points in time. Because if something does go wrong - if you bring in a large number of records with incorrectly formatted data - you want to be able to undo that without your production org ending up with a bunch of bad information in it.

Sten Ebenau:

(18:03) Absolutely. We have plenty of examples - someone using our tool with an incorrect setup, or somewhere the intern is responsible for cleaning up the duplicates.

Matt Kennedy:

(18:19) Yeah. Instead of a manual process where an intern goes in and spends hours trying to figure out what changed and correct it, you're able to automate that and compare.

Sten Ebenau:

(18:30) I mean, we have examples where a client had a database of, let's say, 10,000 accounts, an intern hit the wrong button, and they only had two thousand left. So you really need a backup. The warning is all over our app - when you hit a merge or an auto-merge: "merge cannot be undone." It's really loud. Everywhere.

Leonard Linde:

(19:01) One strategy, if you don't have a full backup, is to mark whatever's coming in from a third-party source, but that doesn't deal with any of the use cases you just brought up.

Sten Ebenau:

(19:11) So let's look at the standards. Standard Salesforce duplicate management isn't really geared toward enterprise usage or data governance. You don't have any audit tracking. You don't have any logging of what's happening inside your system - who is merging what, which system is merging what, et cetera. So if you have an enterprise environment with a lot of customizations, you need one of our tools - or one of our competitors' - but you need them.

(19:40) You need a better duplicate management system than standard Salesforce. If you look at why Salesforce built its duplicate management system, it's because the competitors already had all that - Microsoft CRM already had something like it. But especially for the GDPR compliance bit: you need a proper duplicate management tool if you want to be GDPR compliant.

Matt Kennedy:

(20:04) I think your story is interesting, because it's not just about merging leads and finding things that should be put together. Like you said, the same person on that website maybe hit submit on three different pages. If they hear from three different people, that's going to be a really bad experience. So the fact that your tool can recognize that and merge those on the spot is really going to save that customer a lot of aggravation.

(20:34) Same thing with GDPR, right? If somebody puts in a "right to be forgotten" request and you've marked that lead so nobody should be contacting them anymore, and there was a duplicate - now suddenly they're contacted. That's going to create a problem.

Sten Ebenau:

(20:47) That's a big fine in the European Union. 

Leonard Linde:

(20:52) Those are some interesting cases. Since we touched on backup, we'll get into Matthew's world here. We'll talk about key factors Salesforce customers should be thinking about when they're formulating their backup strategy. 

Matt Kennedy:

(21:15) That's a good question. It's interesting. Obviously, in these times, we're all social distancing, right? Normally, next week would be the New York City World Tour. I haven't missed that event in five years, and it's going to be weird not to be there. But when we're out at these events and people come by the booth, that's usually the first thing we hear. They look at us like, "Why do I need a backup solution? My data's in Salesforce, it's in the cloud, it's secure."

(21:42) And we're like: that's completely true. You can log into Salesforce 99.9 percent of the time and access your data as it exists right now. But to Sten's point, if you're doing a merge, or to Andrea's point, if you're bringing in a lot of data through forms - if something goes wrong, how do you roll back to what you had yesterday or the day before?

(22:05) So that's really what we talk about. When we talk to Salesforce customers, it's not about the Salesforce platform - the platform is secure, your data's protected. However, as an end-user, you have the ability to modify your data - and not just your data, but also your metadata. Think of admins going in and changing profiles, permission sets, workflows. We're doing daily snapshots of all the configuration files as well, so if something goes wrong, they can roll that back. So usually, the question I ask people when they come to the booth is, "Have you ever suffered a data loss?"

(22:37) The most common reply we get is, "I don't think so." And that's not really optimal - you want to know. You want to be confident that you have a baseline and can see daily what's getting added, what's getting changed, and what's getting deleted, so you can roll back. Right? You don't want that unknown about what's in your Salesforce instance.

Leonard Linde:

(23:00) One of the key things in any backup strategy is checkpoints, right? If you're going to do some big data change, you want to be able to take a checkpoint. You're talking about logging, but I assume OwnBackup and probably the other competitors have the ability to say, "Hey, I'm about to do something really dangerous - I want a checkpoint."

Matt Kennedy:

(23:24) Our solution, by default, does an automated daily backup. Usually overnight, we do a complete backup of the whole environment, but we also have the ability - through the user interface or through our API - to do an on-demand backup. So we tell people: hey, if you're about to run a large deployment, or you're running an integration, force a backup right before that.

(23:45) Even if you have the backup from last night, you've usually been working all day, so a backup late in the afternoon gives you another restore point. Think about the native Salesforce option: out of the box, it's a weekly export. Maybe on a Sunday you run a full dump of your org, but then you could lose up to seven days' worth of information, and there really is no restore capability - it's basically just a copy of your data.

Leonard Linde:

(24:12) Yeah. It's terrible. It's terrible, I mean, I used to do that because we didn't have a backup solution. It's like, well, if some disaster strikes, I guess I'll figure out what to do with this. 

Matt Kennedy:

(24:24) I think that's the key differentiator we try to make with people: it's not just backup, right? Having a copy of your data is important, but so is the frequency of those backups - whether you're doing it daily or hourly. The hard part is the comparison and the restoration. If something goes wrong, you want to be able to look at two different points in time and analyze and identify exactly what changed, so you can selectively recover, right?

(24:51) You don't want to be a time machine and say, "Hey, something went wrong today on a Thursday - I need to roll my whole org back to my weekly export from Sunday." I want to go in and just fix this merge that I did, or maybe this import that I did and undo that, but leave everything else intact. 

Sten Ebenau:

(25:08) Once a month, basically, we get a customer deleting something by accident. And now Salesforce itself is removing the data restoration capability it used to have. So backup is more and more important, because there's no fail-safe anymore.

Matt Kennedy:

(25:32) Right. Salesforce has something called the data recovery service that is retiring at the end of July. 

Leonard Linde:

(25:40) One of the things that one might want to do also is have some data that essentially you want to archive. You don't want it in Salesforce, but you want it someplace you can get at it. Is that something that OwnBackup provides? 

Matt Kennedy:

(26:07) That's a great question. Our core solution is really about backup and restore, but we did release another application last year on the Salesforce AppExchange called OwnBackup Archiver. It does exactly what you described, which is basically for compliance requirements. Maybe you have older data - you could have case information that goes back years, and you only need to keep maybe two or three years' worth of that history.

(26:35) You can set up policies in our Archiver application that say: any closed case older than two years, pull it out of Salesforce, put it in offsite storage, reduce your storage footprint, and improve Salesforce performance. You also have the ability to resurface that data in Salesforce as a related list. It would be view-only, but if you went to an account, you could see active cases versus archived cases.
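A retention policy like "closed cases older than two years" boils down to a date-filtered query against the object being archived. Here is a hedged sketch of building such a SOQL filter - the helper itself is hypothetical, not OwnBackup Archiver's API, though `Case`, `ClosedDate`, and `IsClosed` are standard Salesforce names:

```python
# Hedged sketch of what an archive policy boils down to: a date-filtered
# query. The helper function is hypothetical (not OwnBackup Archiver's
# API); Case, ClosedDate, and IsClosed are standard Salesforce names.
from datetime import date, timedelta

def archive_query(sobject, date_field, older_than_days, extra_filter=None):
    """Build a SOQL query selecting records eligible for archiving."""
    cutoff = (date.today() - timedelta(days=older_than_days)).isoformat()
    where = f"{date_field} < {cutoff}"
    if extra_filter:
        where += f" AND {extra_filter}"
    return f"SELECT Id FROM {sobject} WHERE {where}"

# Closed cases older than roughly two years:
print(archive_query("Case", "ClosedDate", 730, "IsClosed = true"))
```

The same shape works for tasks or any other object Matt mentions later: pick the object, pick the date field, and add whatever status criteria the policy needs.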

Leonard Linde:

(26:59) Where are those archived cases stored? In S3? 

Matt Kennedy:

(27:03) AWS, S3 primarily. 

Leonard Linde:

(27:03) Cool, cool. I'd seen your product before, but I hadn't seen that. 

Matt Kennedy:

(27:11) As you're using Salesforce, storage goes up, right? And if you're getting more and more information, it's a way to cut down some of that storage.

Sten Ebenau:

(27:21) Cool. That's also pretty useful if you're looking at duplicates, because otherwise you're comparing against things from four years back. So if things get archived, your duplicate checking will be quicker as well, and your validation will be quicker. So archiving, for us as well, is key there.

Matt Kennedy:

(27:42) Absolutely. I can see that as you add more and more information, that data growth affects your searches, your reports - everything can potentially slow down.

Sten Ebenau:

(27:51) Yeah, absolutely. If you're looking at millions of records, a search should still come back within a second, of course.

Leonard Linde:

(28:03) Switching gears a little bit - this is just a general question. With our customers, we often see Salesforce used as a single source of truth, or as the source of truth for a certain set of data in the enterprise - often customer data, which makes sense. So as a player in the enterprise, you're both a producer and a consumer. As a consumer, how do you make sure your data starts and stays clean as you import it? And as a producer, how do you make sure the consumer understands and uses your data correctly?

Sten Ebenau:

(28:39) For us, the most important thing in this respect is to define what your data is. If you don't define what it is and what you want to do with it, it's difficult to tell the business, or even the end-users, what they need to do with it. If you define what it is, you can also define the quality rules: an email address should always look like this, a phone number should always be formatted like that, this field is mandatory or not - those kinds of things.

(29:17) You can also define what you don't need, because you can end up storing all kinds of information you don't need. Only keep the things you need from a data perspective - I think that's very important.

(29:27) If you start doing data management and data integrity, from our standpoint, that's the first step: define what you have, then define the protection rules, then do the cleanup and the monitoring, of course, because you need to know whether it's still okay or not.

(29:46) When you get to that enterprise level, in the definition phase you always define: where is my data coming from? Which integrations do I have - is it Form Assembly, is it another tool? Define what comes in, define what you want to do with it, and then start acting on it. The definition phase is the most important here.

Matt Kennedy:

(30:10) I think it goes back to the question I had for Andrea earlier - it's about uniqueness, right? What is the primary key of that data? As she said, if you can look that up in Salesforce and prepopulate the form with a lot of the information, the quality of the data going in will be so much better. You're not filling in a bunch of blank fields; you have it prepopulated. I think that's really the power of their solution.

Andrea Hall:

(30:35) Yeah, for sure. I think another big piece of collecting data is not only making sure that it's clean, but that you're collecting it securely and meeting all of your compliance requirements - we're talking about GDPR and HIPAA compliance and things like that. We try to make that easy for our customers.

(30:54) Our plans are compliant with GDPR and PCI DSS Level 1, and our Compliance Cloud plan is compliant with HIPAA. On top of that, we try to make it flexible so that you can pre-fill any data you already have and then push it to any CRM or database - the main one being Salesforce, but we have quite a few customers who push it other places.

Andrea Hall:

(31:18) Then, with a few simple settings in your account, you can wipe that data from Form Assembly. So you can store anything you want collected from your forms, but you also have the power to remove it so that you remain compliant. If your data should only be held in Salesforce, you can wipe an entire response, or you can wipe just the sensitive fields - things like that. So I think that's another big piece of collecting data as well.

Matt Kennedy:

(31:46) I think if you look at the alternative - an Excel spreadsheet or a CSV file - there are no controls, right? It's free text: you're going to be missing all the validation rules, you're not going to be hitting the dropdowns with the proper values, and you're going to get all kinds of duplicates that will have to get cleaned up afterward.

Leonard Linde:

(32:06) There's an audience question asking how long it takes to implement an archiving solution once you've got your data policies in place, and for a rough idea of the steps involved.

Matt Kennedy:

(32:26) Yeah, it's not too bad. It's basically a managed package on the AppExchange. As I said, you just go to the Salesforce AppExchange and search for it. When it installs, it's basically creating the structure of the archiver. At that point, it's really up to you to define the policies, and policies are pretty straightforward. They can run daily, weekly, or monthly, and you pick the object you want the policy to run against.

(32:49) Like I said, if it's tasks or cases or whatever object you're looking to archive, you then just set up the filtering criteria. So very simply, say whether it's based on close date or status or whatever it might be. The cases that meet those criteria are then archived out. I saw that the follow-up question was about the work on the AWS side. With that, storage is included with the application, so there is no configuration of AWS. We take care of all that for you.
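
The policy Matt describes boils down to three choices: an object, a schedule, and filter criteria. As a rough illustration only (the class and field names below are hypothetical, not the managed package's actual API), such a policy could be modeled like this:

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Illustrative model of an archiving policy: pick an object, a run
# frequency, and filter criteria (status plus age of the close date).
@dataclass
class ArchivePolicy:
    sobject: str             # e.g. "Case" or "Task"
    frequency: str           # "daily" | "weekly" | "monthly"
    status: str              # archive records with this status...
    closed_before_days: int  # ...closed at least this many days ago

    def matches(self, record, today):
        """True if the record meets this policy's archiving criteria."""
        cutoff = today - timedelta(days=self.closed_before_days)
        return (record["status"] == self.status
                and record["closed_date"] <= cutoff)

policy = ArchivePolicy("Case", "monthly", "Closed", 365)
old_case = {"status": "Closed", "closed_date": date(2020, 1, 1)}
print(policy.matches(old_case, date(2024, 1, 1)))  # True
```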

(33:24) Then the retrieve. So basically, when you do that, we have a Lightning component you can drop onto the form. It'll just be a related list. So it knows what objects it's associated with. If they were tasks on accounts, you could drop that onto the accounts, and it would just pull up as a related list. 

Leonard Linde:

(33:45) And the last question was about external objects, but my understanding of external objects is that they get backed up in whatever system the external object lives in.

Matt Kennedy:

(33:51) We have access to anything through the Salesforce API. So if it's standard or custom objects, or if it's big objects, we can access all that data. If it's outside of Salesforce, then yeah, that would be different in terms of the Lightning and web components used to display it.

Leonard Linde:

(34:21) Another one for Andrea here. What should I expect from a good form tool in terms of data validation? A classic one is you have a picklist based on a picklist in Salesforce, and you need to keep that in sync. What can tools like Form Assembly offer? What do you think the best practices are there?

Andrea Hall:

(34:57) So I did touch on validations; I'll just say one more quick thing. I think we talked about some built-in validations - phone numbers, emails, and number validation. But you also want to look for the ability to define custom validations. With Form Assembly, we offer a couple of different options. The most popular one is validating with a regular expression, which allows you to validate almost anything.

(36:26) I mean, you can get as specific as you want with a very particular string, or you can require a particular date format within a specific range, things like that. So that's a key feature to look out for with validations. In terms of Salesforce picklists versus form picklists, it's always good if you can find a form tool that can pull your picklist from Salesforce.
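
To make the regex idea concrete, here is a small sketch of the kind of custom check a form tool might run - not Form Assembly's implementation, just a stdlib illustration of "a particular date format within a specific range":

```python
import re
from datetime import date

# Accept only ISO dates (YYYY-MM-DD) that also fall inside a given range.
DATE_RE = re.compile(r"^\d{4}-\d{2}-\d{2}$")

def validate_date_field(value, earliest, latest):
    """True if value matches the pattern AND lies in [earliest, latest]."""
    if not DATE_RE.fullmatch(value):
        return False
    try:
        d = date.fromisoformat(value)
    except ValueError:  # "2024-13-99" matches the regex but isn't a real date
        return False
    return earliest <= d <= latest

print(validate_date_field("2024-06-15", date(2024, 1, 1), date(2024, 12, 31)))  # True
print(validate_date_field("15/06/2024", date(2024, 1, 1), date(2024, 12, 31)))  # False
```

Note the second, non-regex check: a pattern alone can pass strings that aren't valid dates, so format and range are validated separately.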

(35:54) Having that ability to connect your form picklists to Salesforce and pull those values in will alleviate a lot of errors in Salesforce. In Form Assembly, if you get an error in Salesforce, it actually comes back to the form as well, so you'll see that error in both places. Sometimes it's something as simple as a space or even capitalization - you know, maybe a state is capitalized in Salesforce and not capitalized in the form.

(36:24) That will throw it off. So pulling those picklist values in is very important and will alleviate a lot of pain points when you're trying to clean up your data and collect it in a very clean way. Then, another thing that Form Assembly can do with dynamic picklists is actually query Salesforce to pull in a list of records. For instance, you can have a form - let's say you have a household account - and you send that form over to that family.

(36:58) You can have a picklist that actually shows all the different contacts within that account. When they click on a contact, that will actually pre-fill their information into the form, so you can check multiple people in one form. That can be very powerful. It also makes sure that all of your contacts get associated with accounts and things like that.
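
The space-and-capitalization failures Andrea mentions are easy to detect mechanically. As an illustrative sketch only (both value lists are made up), a hard-coded form picklist can be diffed against the Salesforce-side values:

```python
# Compare a form's hard-coded picklist against the values configured in
# Salesforce, flagging entries that differ only by case or whitespace.
def find_picklist_mismatches(form_values, salesforce_values):
    """Map each non-matching form value to the Salesforce value it
    differs from only by case/surrounding whitespace, or None if it
    has no counterpart at all."""
    sf_exact = set(salesforce_values)
    sf_normalized = {v.strip().lower(): v for v in salesforce_values}
    mismatches = {}
    for v in form_values:
        if v in sf_exact:
            continue  # exact match, nothing to fix
        mismatches[v] = sf_normalized.get(v.strip().lower())
    return mismatches

print(find_picklist_mismatches(
    ["california", " Texas", "Oregon"],
    ["California", "Texas", "Nevada"],
))
# {'california': 'California', ' Texas': 'Texas', 'Oregon': None}
```

Pulling the picklist live from Salesforce, as Form Assembly does, makes this whole class of drift impossible in the first place.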

Leonard Linde:

(37:19) So in the case of a rarely updated picklist, does Form Assembly check that every time, or do you have some kind of process where it watches the picklist metadata to see if the picklist changed?

Andrea Hall:

(37:34) It does. It does pull that in every single time. So with our picklists, you authenticate right on the form over to Salesforce, and you choose your object and the picklist that you want to associate it to. Then every time the form loads, it pulls in the newest values. So whether you're changing it once a year or once a week, it's going to pull in those values every time.

Leonard Linde:

(37:55) On the data validation - usually there are two tiers of data validation. The simple stuff is sometimes done client-side, right? You have JavaScript checking that a field that's supposed to be a number doesn't get a letter or whatever. Do you have that two-tiered element, or is all your stuff making a round trip to get validated?

Andrea Hall:

(38:13) What exactly do you mean by two-tier? 

Leonard Linde:

(38:16) Well, what I mean is, for example, in the browser, if you have to put in a phone number and you start typing a letter, it just won't let you do it - it'll automatically say, oh, I want a number here. It doesn't go back to Salesforce or back to the server to do that; it's all on the client, in your web browser.

Andrea Hall:

(38:34) We have a couple of different options. All of the validations that you set within Form Assembly will run when you hit the next page or the submit button. It won't process on to Salesforce or whatever other CRM you might be sending it to until it meets all of your Form Assembly validations. And then we do have a few other ones where you can require strictly numbers or something like that, and it will only allow you to type in numbers.

(39:05) Once it passes Form Assembly, if you do get a validation error from Salesforce, you can trigger that to show on your form. So then it's submitting to Form Assembly, but it's not submitting to Salesforce until that data is actually the way that it needs to be. 
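
The two-tier flow the panel is describing - cheap local checks first, then the authoritative check by the target system - can be sketched as follows. This is an illustration, not Form Assembly's or Salesforce's actual code; the field names and the duplicate-email rule are hypothetical stand-ins:

```python
import re

def tier1_local_checks(record):
    """Tier 1: cheap format checks that need no round trip."""
    errors = []
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", record.get("email", "")):
        errors.append("email: invalid format")
    if not record.get("phone", "").isdigit():
        errors.append("phone: digits only")
    return errors

def tier2_server_checks(record, existing_emails):
    """Tier 2: stand-in for a server-side rule, e.g. duplicate detection."""
    if record["email"] in existing_emails:
        return ["email: duplicate of an existing record"]
    return []

def submit(record, existing_emails):
    errors = tier1_local_checks(record)
    if errors:                      # fail fast, no round trip made
        return ("rejected locally", errors)
    errors = tier2_server_checks(record, existing_emails)
    if errors:                      # surfaced back on the form
        return ("rejected by server", errors)
    return ("accepted", [])

print(submit({"email": "a@b.com", "phone": "5551234"}, {"x@y.com"}))
```

The point of the split is exactly what Andrea describes: nothing reaches the target system until the record passes both tiers, and a tier-2 error is shown back on the form.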

Leonard Linde:

(39:21) This is for Stan - knowing that one size doesn't fit all, do you have one or more examples of a good data quality workflow? What would you recommend people do for certain applications or a certain type of customer? Give us an example of going through it - first you do this to check your data, then you do that - if that makes sense.

Stan Ebenau:

(40:09) There's one global workflow that works for basically everyone, and that's prevent first, clean up after. So that's the global one. The same goes for Form Assembly: they prevent the information you don't want from being entered in the first place, and then we have tools to clean it up if there is already dirty data.

(40:36) So that's the most important thing, but if you look at how data quality is usually handled, data management is often an administrative task - the Salesforce administrator has to clean up the database. I think that's not the way to go, because data, and especially data quality, is a shared responsibility. So you should be able to help your salesperson or your marketing person and create data quality with each other.

(41:16) So if they're entering new data, they should get help entering correct data. That means validations and a direct prevention popup, but also, if they're looking at a record, they need to have the tools to directly clean it up, so they don't have to send it over to an administrator who has to go in again and do the validations, et cetera. The salesperson is already on that record. I think if you can clean it up in two clicks, everyone is happy.

(41:50) So my goal, and I think our company's goal, is to make sure that data quality and the process of cleaning up data are a shared responsibility. I think if you implement that in your company - that everyone is responsible for their data - the data quality itself will go up quite easily, because it's your responsibility.

(42:11) It's not someone else deep down in the tech department whom you can blame. No, you have to blame yourself if the email is not delivered. From a broader perspective, that's my advice to companies on how to implement data quality and data management: make the team responsible, not someone behind a desk somewhere else.

Matt Kennedy:

(42:35) It's a good point, Stan. I mean, a lot of people think about it like it's an admin's job to go clean up all this data. To your point, I think Salesforce has done a much better job with Lightning, so when salespeople are entering a new lead, it should pop up and say this looks really similar to a lead you already have. At that point, they can do that evaluation and prevent duplication in the first place.

Stan Ebenau:

(43:01) We have those tools as well. So we have the tool, of course - and Salesforce has it too - where if you look at a certain record, you can directly see whether the email address is correct, the phone number is correct, or the postal address is correct. With one click, you can revalidate it. So if it's incorrect, you just click validate and revalidate that information. Then it's clean, so everyone wins there.

Matt Kennedy:

(43:26) It results in much better quality when the person who's familiar with that account can make the corrections.

Stan Ebenau:

(43:33) Within the company, we call that the data happiness survey. We are trying to broadcast a message of data happiness across the Salesforce ecosystem.

Matt Kennedy:

(43:48) It sounds like there's a t-shirt coming out soon.

Leonard Linde:

(43:51) Or a plush doll. One last - but not least - question, for Matt: what do you tell a customer with a zero budget? Do you have a better-than-nothing approach? In other words, if you don't have any money, is there something that you can do? And the second one: since they have a zero budget (but nobody actually has a zero budget for anything), how do you convince management to invest in backup on the cloud to keep their data safe and secure?

Matt Kennedy:

(44:39) If nothing else, you need to take ownership of your data and use the default solution, the weekly export. Salesforce does have some tools built into the application. As an admin, you can schedule that weekly export to run and download that information manually every week.

(44:56) There's no cost to that, except for obviously the admin's time. If that's too much, at least use Data Loader or run reports on some kind of regular basis and extract that information. If you know the system's good right now and you're about to run a big integration, well, do an export first - take the existing data with a report and at least get a copy of it in case something goes wrong.

(44:22) Maybe when you're running that job, keep track of the batch and know the inserts, so you can undo them if you ever need to. I think that's the key: even if you have zero budget, it is still your responsibility.
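
The zero-budget habit Matt recommends - snapshot before you integrate, and remember what the batch inserted - is simple enough to sketch. This is an illustration under assumed, made-up field names, not a real Salesforce integration:

```python
import csv
import io
from datetime import datetime, timezone

def snapshot_to_csv(rows, fieldnames):
    """Serialize existing records to CSV text, to be saved as a
    timestamped backup file before the integration runs."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

def run_batch(insert_fn, new_rows):
    """Run the inserts, returning the IDs created so the whole batch
    can be deleted later if it turns out to be a mistake."""
    return [insert_fn(row) for row in new_rows]

existing = [{"Id": "001A", "Name": "Acme"}]
stamp = datetime.now(timezone.utc).strftime("%Y%m%d-%H%M%S")
filename = f"backup-{stamp}.csv"          # where the snapshot would be written
backup = snapshot_to_csv(existing, ["Id", "Name"])

# insert_fn is a stand-in for the real API call; here it just echoes the Id.
ids = run_batch(lambda row: row["Id"], [{"Id": "001B", "Name": "Globex"}])
print(ids)  # ['001B']
```

Even this much - a CSV copy plus a list of inserted IDs - gives you an undo path that a raw, untracked bulk load does not.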

Leonard Linde:

(45:49) The second question is, what's your elevator pitch to management to sell them a backup solution when they believe that the cloud keeps everything safe?

Matt Kennedy:

(45:58) I mean, we talked about it earlier - it's really that your instance is safe except from yourself, right? The biggest thing that we see is human-inflicted data loss. Somebody does an account merge, and then they realize they made a mistake. How do you undo it? They run a large integration through an ETL tool, and it updates thousands of records or potentially deletes the information. How do you undo that?

(46:25) So we always ask what is your Salesforce data worth? If you're putting in a lot of information, you're running your business out of Salesforce, what would happen if that data went missing? What is that worth? Then, does that correspond to a purchase of a proper solution to protect that data? 

Leonard Linde:

(46:43) Right. That's a great way to put it. 

Stan Ebenau:

(46:50) We're not a big company, so I'm the administrator of our own Salesforce instance. I think last year we bought a backup solution, because everything we do is in Salesforce. If our Salesforce gets corrupted or something happens, then our business is out, just like that. When I really realized that, it was easy to make money available to invest in it.

Leonard Linde:

(47:26) Right. If it's going to be a single source of truth, you don't want to lose your truth. 

Stan Ebenau:

(47:30) Yeah, absolutely. Especially because we have a lot of customers where Salesforce is the single source of truth, the only source of truth. There's no option not to have a backup, basically. 

Leonard Linde:

(47:46) We're at 48 minutes in here, and we scheduled this for about 45 minutes or so, so I think that's pretty much going to wrap it up. I want to thank Stan, Andrea, and Matt for participating and for their very interesting answers. And I'd like to thank the audience for their questions and their attention.

(48:12) We're going to keep doing these webinars, and anybody who attended this, of course, will be getting plenty of information about what we're doing next. So thanks, everybody, and we'll see you next time, hopefully.
