In this entertaining and thorough talk, Kevin Dieny, Digital Marketing Analyst at CallSource, provides guidance to help marketers achieve their Holy Grail -- attribution. Every marketing professional who wants to get the credit they deserve for contributing to sales needs to watch this video for advice and inspiration. 

Dieny shares his personal successes and failures as he focused on creating a marketing attribution model to prove the value and influence of marketing efforts and campaigns. He shares actionable advice, including the importance of having clean data. He also reveals how trying to find the data to answer the question "how much is marketing influencing sales?" led to a reevaluation of the data process. GIGO (garbage in, garbage out) is as important for marketers as it is for data professionals.

Come along on an entertaining journey and you’ll pick up a few ideas to prove the value of your marketing campaigns to your sales team and senior leadership.

What You’ll Learn

  • Call Tracking Explained [00:03:21]
  • How Attribution Works and an Example [00:06:02]
  • Relationship Between Attribution and Influence [00:15:14]
  • The Attribution Model [00:35:50]
  • Drafting with Excel and Xplenty [00:42:18] 
  • The Results of Comparing the Model in Excel to Real Data Queries [00:47:58]
  • Most Important Takeaway: Actionable Results are Best [00:53:16]
  • Murphy’s Law: Be Ready for Things to Go Wrong [00:56:09]
  • A Simple Outline of the Process [00:59:45]

Integrate Your Data Today!

Try Xplenty free for 7 days. No credit card required.

Full Transcription

[00:00:11] Hello and welcome to another X-Force virtual summit presentation. Today I'm happy to introduce Kevin Dieny. He's a digital marketing analyst at a company called CallSource, which does a whole bunch of interesting things with telephone numbers that I'd never heard of until I met Kevin. And he's going to tell us about that.

[00:00:32] He's going to tell us about attribution, which is like the big deal at CallSource. And I'm sure it's a big deal in a lot of other places. So, here's Kevin to explain more of that. 

[00:00:44] Yes. I'm Kevin Dieny, and you might be wondering, who's this guy? So I work for CallSource. And a little bit about me, I guess, would be a good place to start here.

I figured this would be kind of a good way to do it. So, who I kind of see myself as: I have four kids, and it's busy. It keeps me very much on my toes. My lovely wife and my family, we live in Southern California. My company is based out of Westlake Village, very close to Thousand Oaks, which, if you don't know the area, is in Southern California.

[00:01:25] So we're Pacific time. And as for what my coworkers kind of see me as, I usually get two nicknames. One of them is the wizard, because apparently I do magical things with data. And the other one is the Batman of analytics stuff, because I have Batman stuff all over my desk and people here know that I love Batman.

[00:01:51] I actually put a little background so you can see it right here; I love the animated series. So this kind of Gandalf on the computer with the spinner hat is kind of how I'm seen here at my company. I sit in the marketing department, but I'm sort of halfway between marketing and, I don't know, the analytics, data-sciency edge.

[00:02:13] So I do a lot of marketing things. I build email programs. I build and run our advertisement acquisition, do a lot of our web analytics. I basically handle all of the analytics and reporting for our marketing department, and I've been involved with a lot of projects involving data, data governance, and information monetization here at CallSource.

[00:02:36] So that kind of leads me to the last thing: what do my customers see at the end of the day that I do? Because I do a lot of analytics, and they don't necessarily see that. But I do run our acquisition and advertising stuff, and I've heard many times, especially from our own employees who happen to hit the right page and get pulled into my audience, that they feel like I follow them everywhere with my ads and retargeting. So that's Johnny Depp running away from all my ads; it's like a perfect emotional capture of what it feels like for them. So I thought that'd be a good little snapshot of me and what I do at CallSource.

[00:03:21] CallSource, the company I work for, is a very old, one of the original, call tracking companies. We actually have the original patent for call tracking. That's how old we are, that's how long we've been in business, and yet at conferences, everywhere I go, everyone's like, "Who does call tracking? What are you talking about?" So, to give a little insight into what I'm actually talking about, because I'm not talking about tracking boyfriends and girlfriends, which is a huge request we get, you'd be surprised. It's actually, as you can see down here in the lower left of the slide: we supply millions of tracking phone numbers to businesses, and they put these phone numbers wherever they want customers to call them from. So it could be their listings, it could be mailers. We do this on the web with digital visitors, the visitors that hit your website.

[00:04:26] We also have numbers that spell things. So I'm sure there's a lot of common jingles that you can think of; 1-800-FLOWERS is the most famous one out there that people have heard of. So we do things like that. And then since there's a unique number on all your marketing. 

[00:04:47] That's where we get attribution from. We can tell, okay, if it's on this campaign and they're calling you, we know which number they saw, because that's the number they called. So because you have a unique number on all these different marketing channels or assets, or even attached to keywords and things like that, we can see which marketing component or asset is driving calls for you.
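The mechanism Dieny describes is essentially a lookup: each campaign gets its own tracking number, so the number a customer dials identifies the campaign that drove the call. Here is a minimal Python sketch of that idea; all numbers and campaign names are invented for illustration.

```python
# Hypothetical mapping of unique tracking numbers to the marketing
# asset or campaign each one was placed on.
TRACKING_NUMBERS = {
    "+1-805-555-0101": "google-ads-brand",
    "+1-805-555-0102": "direct-mailer-q2",
    "+1-805-555-0103": "website-visitors",
}

def attribute_call(dialed_number: str) -> str:
    """Return the campaign that owns the dialed tracking number."""
    return TRACKING_NUMBERS.get(dialed_number, "unattributed")

print(attribute_call("+1-805-555-0102"))  # direct-mailer-q2
```

Because every channel carries a distinct number, no cookie or form fill is needed to tie a phone call back to its source.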

[00:05:11] And then the last thing we do, this third box on the right here, is we give you the analytics, the insights. We also listen to calls and score calls, and by that I mean we tell you if a lead is calling or if it was just grandma or a girlfriend calling. We tell you if an appointment ended up being set, or if there was some kind of positive conversion result at the end of the call. We tell you how many you're getting, and based on the sources, you can see which places are sending you the best calls.

[00:05:40] So that's what CallSource has been in business for, for a long time. And we do offer businesses all kinds of solutions to fit their needs, but we tend to be in that space of the phone call component of conversions, in terms of all the things you're measuring.

[00:06:02] So that's CallSource. Attribution is like the ghost in our CallSource machine. Attribution is a lot of what we're about, because a lot of what we do is attributing: we listen to calls, we track calls, we do all that. But at the heart of it, we are all about attribution.

[00:06:26] I also wanted to define how I see, and how we see, attribution here. Attribution is a lot less about what's better or worse, even though that's kind of what everyone uses it for. It's more about tracking the influence of things over time. To me, that's a better way of looking at attribution: not competing things against each other necessarily.

[00:06:50] But to see their influence over time, because even small things can have big impacts. So attribution is a little dicey, and that's why I think this presentation could be very helpful for those of you who are trying to tackle the marketing attribution hurdle.

[00:07:11] Let's get into the meat: marketing attribution. It's always been something in marketing you're trying to do, but we were tasked, I was tasked with: okay, Kevin, go tell me the revenue that's coming in. The actual specific ask was: I need a monthly financial breakdown of all the marketing-attributed revenue, with a couple of different dimensions. Meaning, I want to see the revenue that came in by department, division, or vertical grouping, so I can see how marketing is impacting revenue across our business sectors or markets. So I put a little example here.

[00:08:00] It's not real information. The ask was, "okay, tell me how marketing is performing here." I guess the big takeaway I wanted to get across is that our ask was very simple, and it didn't require me to make insane business intelligence charts or interactive drill-down reports or anything like that.

[00:08:32] Our ask was actually fairly simple: just tell me what you did. And even that has huge struggles and issues in trying to get it. What did we actually do at the end of the day, and how can we prove that? I think what's important when you do any attribution, actually, when you do any sort of database reporting, when you're preparing anything for anyone, is that it's really important to know who's asking for it.

Like, who are the stakeholders, or who's going to be the person at the end of the day looking at this information? I don't know if you should always prepare a little dossier, like how I pictured it here, but over time I think you develop a pretty good understanding of the kind of person or people you're going to be providing this information to.

[00:09:25] And one of the reasons my ask and that single-page report looked the way they did is who was asking for it. My boss, who previously was the CFO and is now the president of our company, has a very long history in the finance industry. So he's used to seeing things like income statements, balance sheets, things like that.

[00:09:50] So that's where he was like: this is how I want to see it. And that's why it's presented that way, because that's how he wants to see it. If he wanted charts and things like that, we could make them. That's not really what he wanted, and it's usually not what he wants to see. He's very much a numbers guy.

[00:10:09] So that's why it's all numbers; he's used to seeing it that way and reading through a sheet very quickly that way. And so when we have a conversation, I understand the sheet because I've made it, and he can look at it and he already knows that kind of sheet template. So it's very easy for us to get right into what matters. Instead of spending a whole bunch of time explaining what this number means, what that number means, why this is there, or whatever, we can get straight into it. So I think that was a pretty decent format for getting straight to the meat of the ask.

[00:10:44] So next, there are a couple of things I wanted to cover: what were the constraints? By that I mean, what were the issues, struggles, and limitations my team had in getting to marketing attribution? I think these may be common with other departments and other teams out there.

[00:11:07] And so these were the ones that were the most important for us, that we really had to overcome. The first one was an operational constraint. This one might be fairly common for you guys. We knew we had problems, but these were problems caused by the processes we had in place already.

[00:11:34] And in trying to obtain marketing attribution information, we needed a lot of information that was in siloed areas of the company. That also meant that when I needed something from another department, they had to figure out how to get it for me. So we had a lot of non-Salesforce people having to go into Salesforce for us, because Salesforce is where we try to push everything.

That's our CRM, where we are trying to keep all our information together. We try to limit how many different platforms, systems, and silos our data is in. I don't think I've ever heard of anyone having the perfect system out of the gate that has everything in one place.

[00:12:17] We just have different types of systems that are meant to hold different types of data, and a lot of marketing tools, as you may have found, don't really integrate well with every other tool. We also needed the final reports to come out in a financial statement format, and our current processes aren't set up to do that.

[00:12:40] We can't just press a button in Salesforce and have it print out that exact template sheet. So there were a couple of places where manual steps had to come in to change it up. Basically, we looked at it like: okay, yeah, we can do it, but man, there are a lot of operational processes we'll have to put in place, like who's going to need to send what information, to whom, where, and in what order.

[00:13:02] So there were quite a lot of meetings where it felt like nothing necessarily got accomplished, but we just planned out how we were going to do everything. I mentioned this data is in silos. Looking at this at a macro level, I've mapped out about 21 marketing data sources we have, but not all that information is important for telling me revenue.

[00:13:32] So in this instance, it may have just been 10 or so that I needed information pulled in from so that I could do this. But I can't just send all that data into Salesforce, because Salesforce maps everything to records, like people; the different objects, like leads and contacts, accounts, opportunities, deals, things like that.

[00:13:54] Those are kinds of information about something very specific. Whereas marketing data may just be: here's a visitor cookie ID, or here's a visit time, or here's an interaction with a specific email, or something like that. Sometimes they have the right information to connect them, and sometimes they don't.

[00:14:09] Like with phone call data, we have the phone number; with emails, we have an email address. If they don't have a way to stitch them together, that's sort of a problem. So we wanted Salesforce to be the center of truth, but we could only put stuff in there that it was designed for or is capable of holding, and we couldn't overload Salesforce with a ton of information.

[00:14:30] So some of the data had to stay out, but we still had to be able to connect it. That's where a lot of Xplenty's ingestion capabilities came in: letting systems and silos stay where they are, in a sense, and only stitching together the data that you need for a specific query or a specific task.
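The stitching Dieny describes amounts to joining siloed datasets onto CRM records through whatever shared keys exist, phone number for call data, email address for email data, while the bulky raw data stays outside the CRM. A minimal sketch in plain Python, with all records and field names invented:

```python
# Hypothetical CRM lead records carrying both join keys.
leads = [
    {"lead_id": "L1", "email": "ann@example.com", "phone": "805-555-0101"},
    {"lead_id": "L2", "email": "bob@example.com", "phone": "805-555-0102"},
]

# Siloed marketing data, each keyed by the only identifier it has.
call_data = {"805-555-0101": {"calls": 3}}        # keyed by phone number
email_data = {"bob@example.com": {"clicks": 5}}   # keyed by email address

def stitch(lead: dict) -> dict:
    """Merge whatever silo data matches this lead's keys."""
    merged = dict(lead)
    merged.update(call_data.get(lead["phone"], {}))
    merged.update(email_data.get(lead["email"], {}))
    return merged

stitched = [stitch(lead) for lead in leads]
```

The point of the pattern is that each silo only needs one reliable key in common with the CRM record; an ETL tool does the same join at scale.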

And we love the unicorn example. This is a joke we always tell everyone, especially the sales teams we joke about it with: if it's not in Salesforce, it doesn't exist. It's like a unicorn. The next constraint we had was quality. With marketing attribution, quality kind of refers to our ability to prove something happened.

[00:15:14] The way we work here is, if we send an email out, we don't just get revenue attribution for everyone that got the email. We track it based on engagement. So if I send an email to 10,000 people and 10% of them opened, and then 10% of those opens clicked, we still wouldn't count attribution yet.
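Working through the funnel numbers in that example makes the point concrete: even the clicks are only "influence" at this stage, not attributed revenue.

```python
# The email funnel from the example: 10,000 sends, 10% open rate,
# and 10% of the opens click through.
sent = 10_000
opens = int(sent * 0.10)    # 1,000 people opened
clicks = int(opens * 0.10)  # 100 people clicked

print(opens, clicks)  # 1000 100
```

Out of 10,000 recipients, only 100 clicked, and none of them has "raised their hand" yet, which is why the bar for counting attribution sits even further down the funnel.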

[00:15:36] We count it when they, we call it, "raise their hand." So when someone says, yes, I want to talk to sales. That's a very tough mountain to climb sometimes, because not every marketing channel, not every marketing thing, is really designed to do that. So on the back end, on the marketing team, we also track influence, which is, let's say: okay, if someone clicks, they're interested a little bit.

[00:16:09] Sometimes teams use scoring models to accrue engagement: okay, we're going to pass it over when we think they're ready, even before they've said, "yes, call me. I want to talk to your sales team. I'd love to talk to sales." That doesn't happen all that much.

That's pretty hard. It came down to this: in so many conversations we were having, we were asking, "how do we prove it?" Marketing is always going to be denied attribution unless we can prove it. So we were told: okay, just stop. We are now going to have this rule where 80% accuracy is okay.

[00:16:42] And that might make a lot of people, especially data scientists, cringe a little bit. It makes me a little sad because I think, at the end of the day, we could be misled a little bit. But at the same time, it takes some of the burden off of trying to be so precise and so perfect with everything, and prove everything down to perfection. It's really hard, especially with top-of-funnel marketing, to prove that anything's happening. So 80% for us is kind of our baseline -- let's get there. And a little bit of what that does is reduce the manual work we sometimes have to do, because getting to 90% or 99% may actually take 10 times more effort.

[00:17:28] But manual work is still a huge cost, especially right now with all the things that are happening in the world. It's very important to be as productive and as efficient as you possibly can. So if there's a way to take something that's manual and make it automated, a lot of people are trying to do that right now. But we had to start everything manually. Before we could prove marketing attribution, we kind of had to lay it all out and do it all, but we had to do it manually.

[00:18:00] That meant guaranteed high labor costs going into this project. The type of information involved is in my head; it's not well-documented. If I leave or something happens to me, we always joke here, "What if I get hit by a bus?" What are we going to do about it? Those are real things that companies should be looking at, and teams and managers should be considering: there is a lot going on right now. If I lose people, what happens? Am I still able to even complete this project? That's a real concern, a real thing that's been happening.

[00:18:36] For us it's like, well, okay, who's going to do this now? So at the end of the day, the amount of manual work we had to do to get this off the ground was a lot. It takes a lot of time, and a lot of people have different projects they're working on that have deadlines. So it meant a high-stress environment.

[00:18:54] It wasn't the most fun activity we've ever done, but we were still able to get there. I want to show you what I mean by a manual process. Leonard and I talked about it a little beforehand, and it sounded like a good idea to have an example of what I mean. It doesn't just mean it's manual like I'm turning a wrench somewhere. It's that when information crosses silos, or something happens in between that maybe, let's say, Xplenty would do for us, we start out by doing it manually. The output of our attribution model kicks out: here's the opportunity ID, and the attribution, whether it was marketing or not marketing that gets credit.

[00:19:41] Initially that was: okay, we know which ones get credit and which ones don't. Now that we've stitched all this data into the warehouse, we need to take it and put it back into Salesforce. Initially that was manual, and it's still a little bit manual right now, because we're still ironing out all the kinks.

[00:19:59] We have a field on opportunities in Salesforce. An opportunity is like a deal that's happening, a sale that's in progress; that's how we track sales in Salesforce. We have a little field for attribution, and that would be marked yes or no, basically. And then that way you can run a report in Salesforce, which is in the lower right.

[00:20:23] And so that's what my report looks like. I'm running a report to say, okay, which opportunities is marketing getting credit for, which should it get credit for? And then that information is passed over to our accounting team. The accounting team are non-Salesforce users, because nothing they do is really in Salesforce.

[00:20:09] Salesforce for them is just a big headache. I jokingly put an old computer on here, because that's how their process feels to me. They're ultra-accurate and ultra-precise in what they do. They're just not used to the same systems as marketing, which is flying all over the place with new tools and innovative things. They're very much like: we're used to this, and this is how it works. So we had a clash of departmental cultures there. But at the end of the day, this got us the results we needed. It went to them, and they were able to confirm which contracts closed, which had receivables in them, and things like that. I just mentioned it, right? We had cultural issues. This is one of the best water-cooler conversations I've had at events and conferences: "Oh man, what are the cultural constraints you're dealing with?"

[00:21:39] Like, what does your manager say? How does your team look at things? One of the things, and I don't think it's been solved, I think this is just a struggle, is our culture here. Marketing attribution gets a little technical, because a couple of times I've been asked: why can't you just tell me?

[00:22:00] Why can't you just go and look and tell me what the revenue from marketing is, or what the impact of it is? And every time, I'm like: oh, it's not easy. It's not just there. I have to use a model and confirm with everyone that, yes, this is what we're going to consider attribution.

This is what we're going to consider credit for marketing. It's not easy. At so many companies that I've been with and that I've worked with, there's a serious lack of trust in marketing. I think that probably comes from the fact that it's been almost a cost center for a long time. It's just been a place where we put money in.

[00:22:42] We probably should emotionally get over the fact that maybe it's going to disappear forever. It's a sunk cost; we may never see it come back. And marketing may do their dance and pretty frills and make it look like it's having an impact. But generally, I feel like marketing as a department lacks a serious amount of trust.

[00:22:58] When it comes to proving revenue, marketing has to go to the moon to prove that something happened. But a lot of the other departments don't have to work that way. Someone could just say, "Oh yeah, I think I did something there," and they'll get credit for it. They'll be trusted on that.

[00:23:07] So, something we were even told by people here was: wow, marketing has to do all that to prove its attribution? That's a lot. Then, in a completely ironic, opposite way, and this is hilarious, we'll over-prove: a big, huge manila folder (metaphorically speaking) full of proof: look, these are all the deals we influenced. And most of the time what we hear is: look, I don't care how the sausage is made. I don't care how you did it. It's fine, I trust you. But then every once in a while there's that: oh, I don't know about that.

[00:24:00] Or: I don't think so. And then we have to come back with the huge manila folder and have it. But if we always bring it, it seems like: okay, you're doing too much. I don't need to know all the little details, just give me the final numbers. But every once in a while we're asked to prove it, or asked to go into it and drill down.

So we'd better have it around. It's a weird irony: we don't really trust you, and yet we trust you. Marketing attribution is like a double-edged sword sometimes. And "I don't care how the sausage is made" is a great quote; a friend and coworker here said it, and I've always remembered it.

[00:24:43] So now let's get into what we actually did. You understand what we were asked to do, which was getting monthly revenue from marketing, being able to break it into different dimensions, and having proof of what happened. And you also know the things that were keeping us from doing that really easily, the things that were causing us issues, which may be the same kinds of things causing you issues. So I want to go through how we were able to overcome the constraints and, ultimately, how we did our attribution. This is one of those Franklin Covey quotes. We actually have posters of this stuff all over our office.

[00:25:21] But keeping the end in mind is really important. By that I mean we've got to know what the end report, the end need, is for the data, and how it's going to be used, before we even start. We've gotten into a lot of issues and struggles, and wasted so much time, by making a lot of assumptions about what is needed, where it needs to go, how it's going to get used, how often it's going to get used, what its application is, and just completely misfiring on that.

[00:25:51] I would strongly suggest that everyone who begins any sort of project try to scope out what exactly the end is; otherwise you're going to waste a lot of time. And that happens with marketing attribution a lot, because it's so much work. Doing it over again is just soul-crushing.

[00:26:13] So one of the ways to get around that, at least the way I've figured out how to do this, and it works pretty well for me, is: if you can turn every request into a question back to them, and try to restate or rephrase exactly what they want, then you can usually get clarity. That seems pretty simple, but it's very effective. So I put a little example conversation here. This is my boss, right?

I would love to see how marketing is performing each month. 

And then I say, Oh, so you want to see monthly recurring revenue of all marketing campaigns by month? Right there, I'm saying, okay, you want to see the monthly recurring revenue.

[00:27:02] Do you not want to see revenue minus expenses? Do you want profit? Do you want to see this monthly, or annualized? I'm trying very much to specify the kind of metric I need. And "marketing campaigns," that's a dimension, because we have campaigns, and we have them grouped up as channels.

[00:27:17] Beneath campaigns there might be different types of things they're doing, like a specific email or a specific ad or whatever. So at what level do you want this? And then "by month," I'm just clarifying: okay, what do I aggregate to? Do I have a big pile of data that I'm scooping up a month at a time, or what am I doing?

[00:27:39] So my boss, in this example, would say: yes, just the one number. Okay, that tells me I don't need a huge chart for this. I just need the number, and we'll review it each time we meet. That's actually how the simple request went in the beginning: I didn't need to make a chart, I just needed the straight number. If he needed to ask more, the data behind it would be sitting there, and I could aggregate all my campaigns. It made it so I knew exactly what I needed to get to. So by using question-and-answer rules like this, you can really figure out what metrics, what dimensions, and what timeframes are needed. We call it here a "menu" of what's requested, and then you can serve them the final result: here you go.
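The end state of that question-and-answer exercise, one monthly recurring revenue number, aggregated from campaign-level rows that stay available for drill-down, can be sketched in a few lines. All deal rows, campaign names, and amounts here are invented.

```python
from collections import defaultdict

# Hypothetical campaign-level rows: the detail that "sits behind"
# the single number the boss asks for.
deals = [
    {"month": "2020-03", "campaign": "email-nurture", "mrr": 1200.0},
    {"month": "2020-03", "campaign": "paid-search",   "mrr": 800.0},
    {"month": "2020-04", "campaign": "paid-search",   "mrr": 500.0},
]

# Aggregate to the requested grain: one MRR total per month.
monthly_mrr = defaultdict(float)
for deal in deals:
    monthly_mrr[deal["month"]] += deal["mrr"]

print(monthly_mrr["2020-03"])  # 2000.0
```

The campaign rows are the "menu"; the monthly total is the one number served, and any follow-up question can be answered by re-grouping the same rows by campaign or channel.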

[00:28:26] That's how we overcame this issue and how it started out. I've talked about the cultural issues and the operational silos. "Working together" sort of sounds like my kid's teacher at school: come on everybody, let's work together. But it's a little more complicated than that. In the work environment, all the departments we have here are measured differently. So if one department spends a lot of time on this project with you, that might completely mess them up, whereas you may be measured totally differently, and it doesn't matter if there's a little waste of time here or there.

[00:29:03] And then in this project, we had people working in a line. This person had to finish their job before that person could start, and that person had to wait for the previous person to finish before they could start. Because there are dependencies in getting things done, we had to figure out: okay, who do I go to when I have an issue? Who's in charge of what, and what limitations do other teams have? Figuring that all out is very crucial, and it required a little bit of empathy, or maybe a lot of empathy. There was ego clashing.

[00:29:40] There was authority head-butting, and by that I mean: we're in charge of this, or we're going to be doing this and this is how it's going to go. We weren't on an even playing field; you're trying to figure this out together. And there's a lot of emotional roller coaster, because sometimes we'd be like, oh yeah, everything's working well.

[00:29:58] And then a couple of days later we'd come back to them and be like, oh man, all that information is all wrong. And there would be such a high to such a low. So any sort of project like this, with a lot of data and different teams that are used to data, requires a lot of this understanding up front.

[00:30:20] This may not be easy, and that could be okay; we may struggle, or we may fail at this a lot, and that should be okay. And I need to know, or it may be helpful to know, the priority of this thing in my lap among the different things I'm doing. We had teams tell us: look, we can't work on this the fourth week because we have this huge project, but we can work on it the week after. That helped everyone understand what was going on. Because at the end of the day, the big picture is that everyone wants everything to work. Everyone wants everything to be successful. They're not trying to stab each other in the back here. So I think knowing that everyone's trying their best might be a good attitude to have when going about something like this.

[00:31:03] GIGO is probably one of the most common phrases I've heard our data scientists say. We have an analytics team here that crunches all of our call analytics data, so hundreds of millions of phone calls all the time, and "garbage in, garbage out" is one of the most common things I've ever heard them say. Making sure the highest quality of data comes out requires the right information coming in. We have a Salesforce admin; maybe not everyone has that. Maybe the CRM is a shared thing that everyone manages, but for us there was someone in charge of the CRM and its architecture.

[00:31:45] And that meant defining the fields, objects, values, taxonomies, and formats that everything needed to be in. With marketing attribution, that's critical, because a lot of marketing attribution relies on things like the text, the date, or the ID numbers, and matching on those is case sensitive.

[00:32:05] So you really can't mess around with: oh, I typed "emial" instead of "email," I've mistyped it. That can't really happen. So how can you control the inputs to make it really hard to mess up? That way people will feel better too, instead of worrying: oh man, did I mistype that? Could the whole thing fall apart if one little bolt falls out of place? Those are pretty serious questions to figure out, because when you stitch data together, say you're going to stitch an email to an email, they have to match exactly. And sometimes someone will accidentally put in an extra space, or they'll forget the dot, or whatever.
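Because exact-match joins break on an extra space or a stray capital letter, a common defense is normalizing the key before stitching. Here is a minimal sketch of that idea (it is an illustration, not how CallSource's pipeline is necessarily built):

```python
def normalize_email(raw: str) -> str:
    """Lowercase and strip whitespace so ' Ann@Example.com ' joins
    cleanly against 'ann@example.com'."""
    return raw.strip().lower()

print(normalize_email(" Ann@Example.com "))  # ann@example.com

# Note: a true typo like 'emial' for 'email' can't be fixed by
# normalization; it has to be caught by a validation check instead.
```

Normalization handles the mechanical mismatches; genuine misspellings still need the validation reports described next.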

[00:32:45] We have the process for connecting the data, but then we also have processes above and below it that validate and check every point. That may take extra work just to validate, but it saves time; if you get to the end and you're like, wow, this thing only has five values in it, something's obviously wrong. So you can set up validation and things like that. I'd actually recommend setting it up with Xplenty, or setting up reports in Salesforce whose whole purpose is just to validate. We have ones that look for invalid emails, invalid phone numbers, and invalid or missing identifiers.
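A validation pass of the kind described, flagging rows with invalid emails, invalid phone numbers, or missing identifiers before they poison a join, might look like this sketch. The patterns and field names are simplified assumptions, not CallSource's actual rules.

```python
import re

# Deliberately simple patterns for illustration; real-world email and
# phone validation is usually stricter.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
PHONE_RE = re.compile(r"^\d{3}-\d{3}-\d{4}$")

def validate(row: dict) -> list:
    """Return a list of data-quality problems found in one record."""
    errors = []
    if not row.get("id"):
        errors.append("missing id")
    if not EMAIL_RE.match(row.get("email", "")):
        errors.append("invalid email")
    if not PHONE_RE.match(row.get("phone", "")):
        errors.append("invalid phone")
    return errors

# A row with an empty ID and an email missing its domain dot:
bad = validate({"id": "", "email": "ann@example", "phone": "805-555-0101"})
print(bad)  # ['missing id', 'invalid email']
```

Running checks like these above and below the stitching step is what lets an error be caught at any point in the pipeline rather than discovered in the final report.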

[00:33:27] All these different things are part of our attribution model process, so that at any point along the way I can catch an error and figure it out. I don't need to look at the reports all the time, but if I go in and something doesn't really match the result I expected, I may go check them to see if there are any major flaws.

[00:33:48] I may not be able to control all the garbage that goes in, but I can put validation in place to check it and see if it's working. Here I'm crunching a year of data down, like this example. We knew how the model was going to work.

[00:34:12] My personal expertise is not in data science. I'm a marketing analyst. I've picked up some SQL and picked up the things I need to do as I do them. So I'm really comfortable taking things into Excel, stitching them together, and modeling how it will work there, before I go into Salesforce or Xplenty or our automation tools and build it all out. We even have a tool called Lucidchart where we have boxes and arrows and figure out how we're going to plan and how the processes are going to happen. So we whiteboard, we do all those types of things.

[00:34:57] Initially it was, okay, what's important to know for attribution? Which is basically all the stitch points and joins that need to happen, and the values that represent revenue or denote attribution or campaigns, things that basically say, okay, marketing did this. Those are vital to have. Ultimately we're working with functions and formulas. We can't have divide-by-zero errors, missing or null values, or things that don't match up. So what do all the formulas look like? I did that all in Excel first, which maybe to some is kind of crude, but I'm so much more comfortable breaking something in Excel, where I know it's not going to affect our nodes or anything else, and I can keep it in a safe environment just for me.

[00:35:50] So this is actually my attribution model. You can see the boxes and the arrows; that's our visual process planning tool, sort of our way of doing whiteboarding, because my handwriting is terrible. This tool lets everyone see, okay, these are actually the rules it's going to go through. Each tier or level here represents an if/then statement, and for each if/then statement I actually created a field and a value. So the first one would be, okay, was it created by an SDR or not? Created-by-SDR could be a true/false. And all the way down, it's building up a logic: this is yes, this is yes, this is yes. That way data flows the right way, and this is the data flow path to getting to: does marketing deserve the attribution or not?
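The tiered if/then flow he describes can be sketched as a plain decision function. The field names here are illustrative placeholders, not the real Salesforce fields:

```python
def attribute(opportunity, window_days=183):
    """Walk the tiered if/then flow: each level is a true/false
    check, and the path decides who earns the attribution."""
    if opportunity["created_by_sdr"]:
        return "sdr"
    if not opportunity["campaigns"]:
        return "none"                  # no campaign responded
    if opportunity["days_created_to_responded"] > window_days:
        return "none"                  # outside the credit window
    return "marketing"

deal = {"created_by_sdr": False,
        "campaigns": ["email-nurture-q2"],
        "days_created_to_responded": 120}
print(attribute(deal))  # -> marketing
```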

[00:36:53] One of the things Leonard asked was, what is this 183 nonsense going on here? It's the third level down. So at this point, it's either SDR or not, and there's a campaign attached to it or not; that's what I mean by campaign responded. And then it's, okay, is created-to-responded less than or equal to 183 days? That comes out of our product pipeline life cycle. From the moment someone comes in to the moment some of the larger deals close, what is that window? What's the window of credit? Should marketing get credit for something it did, like, four years ago? That didn't make sense. I can't feel confident suggesting we should get credit for something that happened so long ago. So where is that comfort point, and how does your company figure it out? A lot of these are top-down decisions that come from up above: okay, what is fair, what's right for marketing attribution? Because on small deals we've had --

[00:38:07] we have a phrase here called one call close. A deal can happen within 20 minutes. And on the other end, we have deals that take months, if not years, to close. So what's fair? There's a small number that take really, really long and a small number that happen really fast.

But where is it fair? 

[00:38:27] 180 days represents roughly six months. We came up with 183 as three extra days on the six months, because our Salesforce reps suggested, okay, whatever model you decide to go with, just add a few extra days. We decided six months was fair, so we added three extra days. That's why you see this weird number. I don't know if anyone else has ever had a 183-day attribution window, but that's ours.

[00:39:04] We also have a higher confidence 90-day window that's not represented on this, but that's just an extra thing. That's mostly for me to see what we're impacting within a closer, shorter window of time. So windows are almost like different attribution models, alternate selves of each other.

Finally, the colors here represent, sort of, our channels, at least the way marketing's internal channels look. For us there's SDR, our sales development rep team; they get credit for the things they work and do. But marketing, the digital or campaign side, has to have a campaign attached to it, with some pretty good hand raising and stuff like that to prove it. And each of our campaigns falls into different, we call them channel buckets. Every campaign has what's called a campaign type, which is just a soft version of channel, and that tells us, okay, an email campaign affected this deal and it should get credit, or in yellow here, ads: if an ad campaign did it, it should get credit.

[00:40:15] What do we do, though? Because for most attribution models, the big question is, what if you have multiple? What if the Plinko chip here falls down and both email and ads should get credit, then what? That's where the thing at the bottom came in. After many, many deliberative, slightly tense hours of discussion with our teams, it fell to the equal-weighted attribution model.

[00:40:45] That's because we want to see, this is sort of our baseline, right? We want to see how everything behaves in an equal environment, whether it was the first thing that happened or the last thing that happened. Because we actually have marketing across, and this is simplistic, the top, middle, and bottom of the funnel. We have marketing at all stages, and we know a last-touch model would completely tilt toward the stuff at the end, because the top-of-funnel stuff is never going to be represented.

[00:41:15] And so, just seeing everything equally is sort of our 80% correct, to get back to that way of knowing what revenue is coming from where. So this is how it looks. And by equal, I guess I should clarify: I mean if a hundred dollar deal happened and two channels, two campaigns or whatever, are responsible, they each get 50%. You take whatever the number is and divide it by the number of attributing campaigns or channels, and it splits. So if we're looking at the channel level and two channels get credit, it's 50/50. If you're looking at campaigns and there's five, each one gets 20%. It's always split evenly; it's not like one gets more, or the poles get more credit. So that's our current attribution model and the rules and the data flow for how it works. This is probably one of the more important slides, so I want to make sure it makes sense.
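The equal-weighted split he describes is simple arithmetic; a minimal sketch:

```python
def equal_split(deal_amount, credited):
    """Split a closed deal evenly across every channel or campaign
    that earned attribution, per the equal-weighted model above."""
    if not credited:
        return {}
    share = deal_amount / len(credited)
    return {name: share for name in credited}

# Two channels on a $100 deal: 50/50
assert equal_split(100.0, ["email", "ads"]) == {"email": 50.0, "ads": 50.0}
# Five campaigns: 20% each
assert equal_split(100.0, ["a", "b", "c", "d", "e"])["c"] == 20.0
```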

[00:42:18] So, getting the first draft together: we had our model, we had the visual, we knew what it looked like, the important fields and values. I'd modeled it in Excel, so I knew roughly what to expect along the way and what kind of data I should be looking at. And that's where it's, okay, now we're ready to put this in Xplenty, our ingestion tool, so all the data can come in, the formulas and expressions can run, and everything can be output the right way. At the end of the day I can take the result and compare it to what I started with in Excel to see if they match. A couple of things about the first draft, to make things run a little smoother, at least for non-Xplenty experts: Excel is a great place to start, but not everything translates easily over to Xplenty expressions.

[00:43:13] If I'm doing an index match, which is like Excel's version of a join, at least the way I use it, how am I going to join data in Xplenty? Well, actually they have a module for that, and that's really easy to do. But not everything is, especially with some expressions. One I love using is index of, which is basically being able to search for a word or a phrase in a field, where in Excel I might just use a find or search. Also, don't put in a whole bunch of data you don't need. That's one of the things I love to do: I'm like, oh, what if I need this later? No. I have to continually tell myself, just start with the very basics of what's needed. You don't need that many dimensions; normally it's the dimensions I end up trying to cram in there. And then the last thing, from someone who is not a data scientist, even if you are a data scientist, even if you know what you're doing: I would not hesitate to ask for help. One of the best things about Xplenty and Salesforce and a lot of tools is they have amazing support, so you can just hit the chat and ask them complex things. I started out not knowing a thing about SQL when I got Xplenty, and now I'm

[00:44:37] building some pretty crazy stuff. And that's because I asked for help: how do I do this? Or, I have an idea of how I want it to work, and they've given me three solutions, three options, almost every time. Then I can ask questions: how is this working? What is this doing? Why is this the best solution? How many nodes do I want or need? All those kinds of things are amazing uses of going to the help. It's a little bit of a blow to your pride, like, oh man, I can't do it myself, because I want to do it myself all the time.

[00:45:11] But I really recommend you use the support, because you'll save a lot of time, even if you have to wait 10 or 20 minutes. You're still going to get there faster almost all the time. All right, so this was the very first thing I worked on. It was, all right, I need to get my Salesforce data, all that multitude of objects, into my warehouse; we use a Redshift warehouse. And I don't want it all in just one table, because that's insane and I probably won't need it like that. So I figured, okay, I'll take the common objects, and some of the objects actually have like five other objects all related to them.

[00:46:07] So I want to roll them up into each master object and save that as its own table in the warehouse. For instance, leads: there's lead, lead history, lead status, all these lead objects in the Salesforce API. I want most or all of them that have relevant information to become just one lead object with all that information in there. So this is what it ends up looking like. For me, it was massive. There's a lot of joins and duplication and distinct and all kinds of stuff going on here, and I put this here because I want you to try to keep it simple.
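Rolling related child objects up onto one master record looks roughly like this in miniature. The record shapes are invented for illustration; the real pipeline does this with joins across many Salesforce objects:

```python
def roll_up(leads, lead_history):
    """Attach each lead's history rows to the master lead record,
    producing one flat lead table for the warehouse."""
    history_by_lead = {}
    for h in lead_history:
        history_by_lead.setdefault(h["lead_id"], []).append(h["status"])
    return [dict(lead, status_history=history_by_lead.get(lead["id"], []))
            for lead in leads]

leads = [{"id": "L1", "email": "a@example.com"}]
history = [{"lead_id": "L1", "status": "Open"},
           {"lead_id": "L1", "status": "Qualified"}]
assert roll_up(leads, history)[0]["status_history"] == ["Open", "Qualified"]
```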

[00:46:43] Obviously this is me not keeping it simple, so learn from what I've done. Maybe it does require this, but a lot of times I don't think it does. I don't think, at the end of the day, you need all this information in there. I think you can keep it simpler than this, but initially I had no idea what I wanted.

[00:47:04]  I know at the beginning of any kind of data ingestion, I'm kind of like, I don't know what I'm going to need down the road, so maybe I'll stick it all in there. But, again, that's like not keeping the end in mind. So this is what it looks like when you're like, okay, I want to put everything in.

[00:47:19] And, it can be fairly overwhelming. I don't know, maybe some of you have way more complicated branching and stuff than this, but this is definitely like, 

Moderator: I don't know if it's possible for anybody to have anything more complicated than this.

[00:47:37] Yeah. Yeah. So there's a lot going on here. There are expressions and fields in a lot of the select statements in here; it's all over the place. So just try to keep it simple. You do what you have to do, but I'd recommend keeping things very simple at first. Then if something goes wrong, you'll know, okay, this is where it's going wrong.

[00:47:58] So now that the embarrassing part is over, what were the initial results? We had the model, we had the data, we knew the flow. I mapped it out in Excel, transformed that, and built it out in Xplenty. I ran the query. What were the top errors I ran into just getting my initial results right? A bit of it was, okay, I'm using the wrong expressions, the wrong modules, the wrong stuff in Xplenty, or I totally scoped this out wrong. Once those kinds of issues are dealt with, the other kind of issue I always run into is, okay, I'm using the wrong data type.

[00:48:41] When I pull data out of Salesforce, or out of anywhere (Salesforce is normally pretty good, but not every platform is), everything will default to string, and I'm like, okay, I've got to cast this to a date, or to a datetime, or figure out how I'm going to work with this. So data types and data formatting are a big deal. If at the end you're just going to take it out and throw it in Excel anyway, it may not be a big problem. But when you put information into any sort of charting or business intelligence tool, the data type and format make a big difference in how those tools segment, filter, and give you drill-down options. So if you're in a place where formatting doesn't matter, just be cautious that at some point it may. And it's one of those things where, in Xplenty anyway, once you go in and make sure everything's in the right data type, you don't need to do it over and over again, because you're done.
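The cast he's describing, everything arriving as strings and needing real types, looks like this in miniature. The field names and the date format are assumptions for illustration:

```python
from datetime import datetime

def cast_row(row):
    """Cast string fields from an API extract into proper types so
    BI tools can segment, filter, and drill down correctly."""
    return {
        "opportunity_id": row["opportunity_id"],        # stays a string
        "amount": float(row["amount"]),                 # string -> float
        "close_date": datetime.strptime(row["close_date"], "%Y-%m-%d").date(),
    }

typed = cast_row({"opportunity_id": "006XYZ",
                  "amount": "1500.00",
                  "close_date": "2020-03-31"})
assert typed["amount"] == 1500.0
assert typed["close_date"].isoformat() == "2020-03-31"
```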

[00:49:42] And once you've figured out how to cast or change or transform or convert one type to another, you can even create a document. I have a document of, okay, what are all my expressions? Again, that data format stuff. So I have a reference: if I'm doing this to that, I should use this expression. Because there's a lot of stuff in there and I don't memorize all of it.

[00:50:03] So data formatting is a big thing we run into issues and errors with all the time. I've talked about how I include too many columns; that's just an obsession I have. And then asking for help is where this third check comes from: there's an expression for that. I will try to get it right myself, and sometimes I'll spend a few hours, and then I'll ask for help and they'll be like, oh, that expression isn't right. Like the index of, being able to search for text, a text-contains kind of thing. I didn't even know that existed, and they told me, you're doing a lot of extra work here, use this. And that's what I ended up using. I was like, wow, I wish I had asked for help earlier. So try not to do that. The snowman example here is how any bright-eyed marketing analyst goes in expecting this amazing thing, and then when you actually try, you're like, gosh, this is hard, or this is cold, or this is not what I expected at the end.

[00:51:05] And you're like, well, it kind of looks like it. We're kind of at the point, and it looks kind of right. I think it's valuable to remember that it's okay; everyone's going to really mess up on this stuff at first, but you can do it. At least I believe in you. And that comes out of not trusting results, or feeling like, okay, this is off, maybe there's something wrong. So when you're at the point where you've run it manually, you've tested it, maybe with experimental or test or sandbox nodes, and you're ready, okay, I'm going to put this into production, into maybe a workflow where there's an order, this has to happen before that, you can really get things cooking or automated. So some questions to make sure you're aware of: frequency requirements. Back to that initial example,

[00:52:00] I need marketing revenue monthly. To me, that sounds like a monthly frequency, and I don't need to run it more often than that. But I actually keep track of it weekly so I can stay ahead of it. That way, at the end of the month the accounting team isn't just waiting for me to hand over all the attribution; I'm only working on a few days of attribution by the time it gets there, because each week I've chopped the big project down so it's smaller by the end of the month. The end of the month is a crazy time for accounting. So that's how I figured out my frequency for this. And also your nodes: there's a lot around nodes I can't get too much into, but how many you need, how many is safe and good, the quantity and volume of data, and the size of that insanely long thing that's going on.

[00:52:50] There's a lot there around what's an efficient way to get this done, and how many concurrent programs or projects are happening at the same time that may overload it. So that's an important thing to consider. And then I think maybe the most important thing you can take away from this entire presentation has got to be

[00:53:16] actionable results are best. When you scope out any project, at the beginning, step one, and then at the end when it's being delivered, and for maybe weeks or months after that, there needs to be evaluation of how whatever you're doing is being used. At some point, maybe a couple of years ago, only one in maybe ten projects actually got used or acted on; otherwise people were like, oh, that's nice, and they put it down. That's terrible. At least it feels terrible; maybe that's a pretty good average. But it's so critically important that for the things you work on, there's a plan for how they're going to be acted upon. So marketing got the attribution, or marketing proved its revenue.

[00:54:00] Again, marketing is doing well. The action isn't, okay, marketing is either going to stick around or get fired; that's not what I'm talking about. It's more, okay, what positive things does marketing want to come out of providing these results? Well, we want more budget. We want more people. We want more tools. We want more freedom. Those are the things we want; how can this project get us there? Have that in mind. Or if you're a different team and you're responsible for some other thing,

how do you know, and how do you get feedback, that the things you're delivering are actually being used, and that there's good feedback coming in that will help you next time? Or, if they don't use it, then they don't see it as valuable or important, and why is that? So go back to the drawing board: okay, we've been doing this for a couple of months, but I'm not seeing anything come out of it. What are you using it for? And if they don't have anything they're using it for, maybe something is wrong and you should reevaluate it so they will start using it.

[00:55:08] We did a whole project, and how we used it was we changed our operations at the beginning. Right, the garbage-in part. We did this whole long project to tell us something, and we didn't use it for that something; we used it to realize we didn't have enough to tell us anything. It was like, okay, we want to know how much revenue we had, and we couldn't even get there because we had terrible processes.

Well, the project actually let us change the processes, so that's okay. That might be where a lot of things go with data: wow, we don't have quality data, so now we have to go back and figure out how to get it. That happens a lot. Also, is it being acted upon or used in some way? Because that's so important. It's tragic for information to not be used; it has to keep moving and building on its value. The moment data stops moving or transforming or being used, its value starts to die, at least the way I see it.

[00:56:09] We're near the end of this. Murphy's Law is one of my favorite ironic, sarcastic, philosophical ideas, and it says something's always going to go wrong, or assume it will. So basically, what should you consider if that's true? You may not believe in Murphy's Law, and maybe you're feeling fairly immune to things like that. But what I think all teams should consider, if you're doing marketing attribution or any sort of data ingestion, with the constraint of information leaving and people leaving (we have this huge crisis going on), is to keep things flexible.

[00:56:55] With my projects, expressions, sources, everything, I try to keep it flexible. And by that I mean, with the case sensitivity of stuff, can you write an expression or formula or algorithm that allows for things to be incorrect? For instance, the easiest way to get around the case-sensitivity problem isn't explaining it to everyone; it's just to lowercase everything as it comes in, so you don't have to worry about how perfectly it's typed. There are little things you can do, very little things, to make your stuff more flexible, so that when it's reading and interpreting information in an automated way, it holds up when things change, or if team members leave

and you're doing stuff based on usernames and user emails, that can cause havoc: every time a staff change happens, you have to go in and rewrite all your stuff. That's not really scalable; that's manual again. Is there a way to make your stuff more flexible to things happening? That's trying to get ahead of Murphy's Law, knowing there are going to be issues. And then the last thing here: even though you may consider an event to be rare, like, oh, black swans don't exist,

[00:58:16] we found out they do in Australia, and we didn't know that for thousands of years. So even though rare events are rare, they may have large, sweeping consequences if they happen. I would never have guessed we'd be in a pandemic right now, and that's happening; there's stay-at-home stuff going on, and that's a big deal. How could anyone have foreseen that? It's really hard. You may not be able to foresee everything; you can't have a plan for all the volcanoes and earthquakes and meteors that are extremely rare. But just within your scope, are there any kinds of rare things you've seen along the way that have happened? People with experience probably have a lot of things that have happened. And so it may take them a little longer, it may take them down a different path

[00:59:09] doing the work they do, but they're building it so that it's more flexible. So raising up your career, raising up your abilities here, is knowing how to be a little more flexible with the things you're making so they don't break as often.

[00:59:25] So people aren't always coming to you like, oh, this data's wrong. Because if that happens over and over again, why are they going to trust the data? Are they going to trust the information? The more flexible you can be with your results at the end, the more it'll allow you to come in, tweak, change, get it right or fix it, and reduce any errors that happen.

[00:59:45] So let's go through some concluding thoughts. This is a complete summary of all my suggestions for the process it took us to get to marketing attribution: being able to prove, in the simplest way, monthly revenue from marketing, with some dimensions, in a monthly format that came out in a financial statement, basically. You may not have anywhere near the same request or ask that we did; you may just generally be curious about how marketing gets attributed. Basically, you can follow these six steps.

[01:00:33] And that is: start with the end in mind, and model the end result with the stakeholders, so you know exactly what's required and exactly what you're going to need from the beginning. The second one here is that you're going to have to work with other teams, possibly other departments, other people, to get the information you need to prove marketing attribution.

For us, accounting and marketing are different silos, different information silos, different cultural silos altogether. So that required us crossing over, but you may have many more silos than that to work with. Just try to avoid being the bottleneck or the squeaky wheel.

[01:01:17] I guess it depends on which way you want to look at it, but try to be the one that's not holding everybody else up. For us, we use Salesforce, so that's why I put it here, but basically: keep your CRM, keep your data sources, as clean and as high quality as you can. A lot of times, when a project starts on something new, there won't be much consideration of what information goes into fields or what information we need, because we don't really know what we're going to do with anything six months from now. So when we start a project, we may start out fairly dirty. But if you have some idea that, okay, we may need these two things, we're going to base other actions off of them, then it's really important to make sure they're gathered appropriately and accurately.

[01:02:01] So not just knowing what's required at the end, but knowing what information is going into a system, is going to help you have that data in the future, so you won't have to rewrite it, or realize it's all garbage and have to start all over again.

The fourth one here is sort of just my personal way, but model the data on paper, on whiteboards, in Excel, in Lucidchart, which is what we use, or whatever you're comfortable with. It's really helpful, because if you can take something really complex and explain it to someone who has no idea, and they can kind of get it, that means you're really onto something.

If you can put a model of what you're trying to do in a very simple way, just the basics at first, then you're well on your way to taking that and turning it into something with more complicated tools, to pull it off or make it more automated. And that way you can always refer back to it, and it's not all in your head.

[01:03:03] There's actually a documented process of what's happening. I think the last two here are about getting it past the finish line. So: building everything out and testing it thoroughly, and then everyone hits it, it goes, and okay, it ran successfully. That's a good sign.

And then, what was the output of that information? If you've set up a way to compare whether it's right or not, that's huge. That's why I do it in Excel first, then I do it in Xplenty, and then I compare the two. Did they match? Am I off? If this says a hundred and this says 5,000, there's some serious issue that happened along the way.

[01:03:44] And then again, Murphy's Law: try to make things as flexible as possible. But at the end of the day, your information has to be actionable. So you want to spend a bit of time making things flexible, but you may want to start with: are they actually using this? Is it being used to make a change, or for some big decision to happen?

You may not always have clarity on that, but asking for feedback on what you're providing, and whether there are ways of making it better, at some regular interval, is a good way to follow up and make sure: are they at least using the information I'm providing?

[01:04:34] Marketing attribution is one of those things where it's like you may start providing it weekly or monthly, and then it may be like, Oh, I'm fine with it right now. Maybe I'll wait six months to see it.  And then you wait six months and they're like, Holy cow, things are crazy. I need to go back to monthly.

So things can ebb and flow with how the stakeholder is adapting to the information they're given. They're digesting information from a lot of sources, so some of them might just be looking for fires, and some of them may actually be looking for opportunities. Those are important things to separate.

[01:05:06] Oh, that's it. I'm probably going to turn orange here, but yeah. I appreciate it. I'm really grateful for the opportunity to present on the topic of marketing attribution with Xplenty and X-Force.

Moderator: This is great. Thanks, Kevin. 

[01:05:25] I've got a question for you. How long did this project take, from the minute you got a go from your management to when you had pretty much a good output? It sounds like it was years.

Dieny: I'm glad it sounded like it was years, because emotionally, that's how it felt. Leading up to this specific ask, as the analyst on the team, I was foreseeing, okay, I absolutely need to track this stuff, but not necessarily knowing exactly what the ask was going to be.

Because we had KPIs, we knew the stuff we had to measure. But I would say getting UTM parameters set up, getting taxonomies well established, and figuring out, okay, how are we going to report on our revenue? That took a couple of years. I think every marketing department is always asking, okay, how are we doing this?

This specific project, with this specific ask, took two months. Maybe that sounds kind of fast, but it was during a downtime for us, and also during a time when we had people come in and say, look, we're kind of flattening and cutting.

So, cutting the red tape: marketing, we need to know what your attribution revenue is monthly. Your team needs to know this; this team needs to know that. So everyone knew, okay, everyone's going to be asking everybody else. Before this, things were drawn out because we had meetings, but no one knew the priority of this.

We were given high priority from the top down to figure out this marketing attribution problem, it being sort of the heart of CallSource. We work at an attribution company; why can't we figure this out? This should be easy, or whatever.

And yeah, we have our phone call stuff down fine, but we have all these other marketing channels, so how do we get them all in one place, and stuff like that. One other little thing I would add is when you have executive-level changes or departmental operational changes happen. Like right now, maybe a lot of companies are changing their dynamics.

[01:07:43] That could be something for us that changed like every week, on a whim. We joke here that someone had a dream last night, and they came in and said, well, this is how it's going to be. Or they went to some event and came back with a new direction. We kind of had to establish that for us to even do our job, we need to be able to pivot on this in a slow way, because it takes a while to see if something's working, and then it takes a while to act on that and improve it.

And then a while to see if that improvement worked. We can't just keep changing and iterating so fast, so the pace of iteration had to be a little longer for us. That's why, initially, the first couple of years, that was all we did. Then this last project was, okay, we've decided all these things, and we were like, oh, thank you.

Good. It was so nice that someone actually told us to draw a line in the sand for how things were going to be. It was a huge struggle for us up to that point.

[01:08:40] Q: So just so I understand: it took two months to do this specific project, but underneath the whole thing, you already had Salesforce as a single source of truth. You had done some data cleansing. You'd done a lot of laying the groundwork that allowed you to do this one specific project. Is that right?

Dieny: Yeah. For instance, not everyone is using Salesforce or using campaigns. We do, heavily, and not everyone uses UTM parameters.

Some people use an attribution tool, like Bizible or something. Some people use different ways to say this is credit and this is not, or whatever; maybe they tag things, or they look at lead source or something like that. We use campaigns because that's what Salesforce is built to measure and track, and marketing is the only one using campaigns, so it's a little easier.

Sales doesn't use them, even though they could, but that way we could track multiple campaigns' influence on opportunity deals. Salesforce has a report for campaign-influenced opportunities, or opportunities influenced by campaigns. So out of the box, we could get our attribution reporting almost straight out of Salesforce: Salesforce says this is an opportunity influenced by campaigns, and we had some processes and stuff like that for data inputs. Actually, this is just a side note here.

My colleague Matt Whitmire, who's in charge of our SDR team, and I made an entire Salesforce training course. We filmed the videos for it; I'm in a green room right now, that's why you see Batman in the background. We filmed an entire course for our own company to train them on what they should do, because it was such a problem for marketing.

So we spent three months last summer filming an entire Salesforce training course on how to do it, how to do it right, and why, to try to help them. And we got our sales and executive leaders to buy into that, and to set some punitive reasons why they should do it, and also some rewards for who does it best.

So we set up that whole thing. That helped us get better information in, and also helped the sales leaders know their teams are trained. They can actually tell their teams: no, if you don't know how to do this, go watch that video on the intranet that our own people made for you.

Moderator: Well, Kevin, thank you so much for presenting and for sharing all your knowledge about your attribution project, and for telling us how Xplenty helped you achieve your goals for your company. We appreciate it very much.


About Xforce

The Xforce Data Summit is a virtual event that features companies and experts from around the world sharing their knowledge and best practices surrounding Salesforce data and integrations. Learn more at www.xforcesummit.com.