Mark Stouse is the CEO of Proof Analytics, a sales and marketing analytics platform that helps CMOs and CFOs bridge the ROI gap by providing cause-and-effect analytics that shows true business impact and financial worth.
The company’s ‘Proof Business GPS’ guides businesses through the whole marketing lifecycle, and provides a complete picture of a company’s marketing efforts. Their solution enables planning, budgeting, and optimization of marketing in all channels.
An award-winning B2B CMO and CCO, Mark is one of the first leaders to connect all types of marketing investment to revenue, margin, and cash flow impact in complex, long-cycle companies.
In 2014, he was named Innovator of the Year for his pioneering work by U.S. marketing leaders.
What you will learn
- Mark’s journey in building his software platform Proof Analytics
- How to develop a good reputation in B2B marketing
- How Proof Analytics is implementing AI into their systems
- Mark shares the onboarding process for businesses interested in Proof Analytics
- How to decide whether your company should outsource analytics or manage it in-house
- Why it’s critical to understand your own limitations and be open to learning throughout the entrepreneurial journey
- How detailed analytics can help you make confident business decisions
- Mark shares his key learnings as an entrepreneur
- Plus loads more!
00:00:07 - 00:01:24
Hi everyone and welcome to The Jeff Bullas Show. Today I have with me Mark Stouse. Now, Mark is the CEO of Proof Analytics, a sales and marketing analytics platform that helps CMOs and CFOs bridge the ROI gap by providing cause-and-effect analytics that shows marketing and sales' true business impact and financial worth. Code for: he does a lot of math. So the company's 'Proof Business GPS' guides businesses through the whole marketing lifecycle and provides a complete picture of a company's marketing efforts. Not only do they analyze the past, but they are helpful in predicting the future with data and maths; their solution enables planning, budgeting, and optimization of marketing in all channels. And as Mark and I were discussing before, data is the new oil, but we just wanna make sure it's not crude but developed and smart. He's an award-winning B2B CMO and CCO, and Mark is one of the first leaders to connect all types of marketing investment to revenue, margin, and cash flow impact in complex, long-cycle companies. In 2014, he was named Innovator of the Year for his pioneering work by US marketing leaders.
Welcome to the show, Mark, it's an absolute pleasure to have you here.
00:01:24 - 00:01:28
Thank you so much. It's great to be here with you.
00:01:28 - 00:02:03
So, Mark, a lot of people, when you mention maths, start to cloud over, right? Yeah, but your job is to make sense of maths for the real world, especially for marketing and sales, which is fantastic. So, Mark, tell us a story: how did you come to build Proof Analytics? What was the idea? Where did the idea come from? What were the pain points or the inspiration that got you to start it?
00:02:04 - 00:03:51
So I think that, you know, that has evolved over time, but I'm gonna answer it this way rather than just kind of giving you a narrative. So this is now like 20 years ago, and I was a very senior leader in large companies like HP at that time. I was very ambitious. I was on a search for significance, personally and professionally. And business leaders above me would essentially criticize what we had done, criticize the function of marketing for spending a lot of money and not having anything really to provably show for it. From a purely ego point of view, and I'm not saying this is the best motive, by the way, that really sucked, right? And so I just said, you know what? I either have to get the hell out of here and do something else, or I have to fix this. And, you know, we're often counseled by really smart men and women to be careful what we ask for. And so I started that whole process of becoming reacquainted with high school and university math. I think I told you before we got started that I went back and actually found my old math textbooks, and for the first time ever, I actually read them.
00:03:51 - 00:03:53
Do they look brand new, did they?
00:03:54 - 00:07:10
Yeah. Well, some of them did, that's for sure. They certainly were not annotated in any way, right? And so I got reacquainted with regression, and then I started bumping into the work that Procter & Gamble had done with econometrics. And then the CFO of HP, Bob Wayman, kind of started to mentor me a little bit on it. And then I left HP and went to BMC, and the coolest thing for about the next five or six years is that no matter where I worked, I had total support from my CEO and my CFO to pursue this, virtually carte blanche, right? And that just goes to show you that no matter how egocentric we are in the moment, we never do anything, we never accomplish anything, alone, because if they had not backed my play the way they did, it wouldn't have turned out this way, right? But they did. And so by 2008-2009, I had gotten a reputation in B2B marketing for having done this work, for having stitched it all together analytically and being able to show the boards or the C-suites a portrait of cause and effect that was so compelling and so mathematically solid that they were like, okay, you know what? That works, and we not only will accept it, we do accept it, right? And they started calibrating not just marketing spend but overall go-to-market spend, even though we didn't call it that at the time, based on those analytics. And then we climbed further up the hill, right? So initially, the analytics were all about what we were doing. And then we realized, holy crap, the rest of the world, the conditions out there, the externalities is the technical term, really matter a lot. And so we started including the relevant externalities in our models, and there was a period of adjustment on that one. And I was ultimately hired to be CMO of Honeywell Aerospace based on this work.
Then came several years of really amazing proof-of-concept success with it. But then we realized that the only way to get scale and speed with the analytics, and to be able to operationalize those analytics in a way that was useful, was to create software that automated significant parts of the process. And that's how we got to Proof.
00:07:11 - 00:07:27
So really, you got beaten up a little bit by CEOs going, you can't explain to me how we got from A to B, can you? And that's really what you needed to do.
00:07:27 - 00:08:26
Yeah, I mean, one time I was in a conference room with Mark Hurd and a bunch of other people, and he decided to really, you know, come after me a little bit. And I was standing up because I had been presenting, and he walks over and he literally, physically backs me up, without touching me at all, right? He backs me up against the wall, and he's hot, and his nose is maybe six inches from my nose, right? You don't forget that kind of stuff. I mean, before he passed away, he and I had several really great conversations about it, and he took a lot of credit for the fact that I had gone where I had gone, right? And that's cool, that's all good, right? But yeah, sometimes you have to enter into really uncomfortable circumstances to make a change.
00:08:27 - 00:08:31
So he asked some tough questions that forced you to go looking for answers.
00:08:32 - 00:08:51
Yeah, because basically what he also was saying is, you guys either get your act together or you're not gonna be here very long. So you have a choice, like anyone has a choice in that circumstance: you're either motivated to go get a different job, or you're motivated to do something to keep your job.
00:08:51 - 00:09:09
Yeah. So in essence, you were able to learn and do proof of concept about what you do today with Proof Analytics because of the corporations you were working for and the tough questions that were asked of you in those organizations.
00:09:10 - 00:09:59
Yeah, absolutely. You know, because the math is the math, right? I mean, there's no proprietary math on this stuff for anyone, right? It's all about understanding how to ask the questions that are necessary so that you can build a model that then delivers a pretty solid probabilistic answer, right? And that's really super important to say, because everything that we deal with in this world is a probabilistic equation. Assuming you believe in God, that's the only entity that knows it all enough to be deterministic, right?
00:10:00 - 00:10:14
So part of what you were trying to do is you built a platform that showed that when you turned knob A it produced a result XYZ. Is that correct?
00:10:14 - 00:11:00
Yeah. Although it's not just knob A; depending on the question you're asking, it could be 50 different knobs, right? And so there's a lot of interlacing of effects across different time lags. The time lag runs from the time you spend the money to the time that you can see a discernible impact from it. How long is that? Depending on what the situation is, the business you're in, the industry you're in, what the channel is, whatever, there are gonna be different time lags, and all of those have to be computed individually as well as collectively.
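One simple way to sketch the time-lag idea Mark describes, finding how many periods pass between spend and discernible impact, is to scan candidate lags and keep the one where the two series correlate most strongly. All the data and names here are invented for illustration; this is a toy sketch, not Proof's actual method.

```python
# Estimate the lag (in periods) between a spend series and an impact series
# by finding the shift that maximizes their correlation.

def correlation(xs, ys):
    # Plain Pearson correlation, computed by hand with the standard library.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def best_lag(spend, impact, max_lag):
    # For each candidate lag, correlate spend[t] against impact[t + lag].
    scores = {}
    for lag in range(max_lag + 1):
        xs = spend[: len(spend) - lag]
        ys = impact[lag:]
        scores[lag] = correlation(xs, ys)
    return max(scores, key=scores.get)

# Toy monthly series: impact echoes spend three months later.
spend  = [10, 20, 15, 30, 25, 40, 35, 50, 45, 60, 55, 70]
impact = [5, 6, 5, 11, 21, 16, 31, 26, 41, 36, 51, 46]
print(best_lag(spend, impact, max_lag=5))  # → 3
```

In a real model each channel would get its own estimated lag, which is why Mark says the lags have to be computed individually as well as collectively.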
00:11:00 - 00:11:13
So basically what you're looking for is trying to work out, if I spent more on marketing and advertising on Facebook, for example, as a simple equation, would that produce a sales result?
00:11:14 - 00:13:01
Yeah. So at a tactical level, that would be an example of a use case. A more strategic question, right? You're spending $100 million a year on marketing and sales together, let's say, right? And all of a sudden the economy changes, and maybe there's a pandemic, right? So there's a whole bunch of headwinds or tailwinds, because these things can be either, depending on the circumstance and the type of business that you're in, right? So, like, high interest rates can be a real solid headwind for most businesses, but they're going to be a massive tailwind for other businesses, right? So then you're computing: okay, what is the headwind that we're now going to market into the teeth of? How much are these headwinds eroding the effectiveness of what we have historically been spending? And what are we going to have to spend if we are to overcome those headwinds, and is that doable? Is that the way we want to spend the money? I mean, now you're not talking about a marketing question or a sales question anymore, you're talking about a business question, right? It obviously has implications for those functions, but it's a much higher-order question. So that's the kind of range we work across. Our software is used all up and down that scale, all the way from the strategy-level questions like the one I just described, all the way down to your Facebook type.
00:13:02 - 00:13:43
So one of the keywords you used at the beginning, when we were having the conversation prior to hitting the record button here, was latency. Because, as you said, different businesses are different; like you said, high interest rates will be a positive for one and a negative for the other. The other thing is that some businesses, especially larger enterprises, have a long sales cycle. It could be 12 months, while others have a sales cycle of four weeks. So you need to collect data that gives you insights into both those models, don't you?
00:13:44 - 00:15:36
Yeah. The thing about it, though, is that companies measure things that they know are important to them at a cadence, whatever that is, that generally represents the clock speed of their business, right? And that is so uniformly true that it means a system like Proof is sort of self-calibrating on lag and latency. And also, because it's automated, it can recalculate the results of the models at that same cadence that things are being measured at. So basically what happens in Proof is that fresh data comes in at whatever intervals it's set up for, and as soon as it hits the model, or models plural, everything automatically gets recomputed and re-forecast, right? So on an on-demand, relevant-time kind of construct, people are able to get the answers they need at any time to make the decisions they need to make. So, like, retail can be hourly and daily. But Honeywell Aerospace is realistically gonna be monthly. There'll be underlying weekly data in those models, but the cycle is so long, right, that you're not gonna look at weeks, you're gonna look at months.
00:15:37 - 00:16:08
Yeah, exactly. So that data you can plug in is data from the past; you plug it in, the model starts to make sense of it, and it can give you an analysis of where you are. In other words, turning these five knobs has produced this result at the end. So you started Proof Analytics in about 2015, after going through the pain of being challenged by CEOs about how we got these sales results and whether marketing contributed.
00:16:08 - 00:16:42
And then the success of meeting those CEOs where they were, right? And giving them the answers. But realizing that most companies weren't willing to pay anywhere from five to $10 million a year for marketing analytics. And so there had to be a cheaper way, a more scalable way, a more understandable way, a faster way, right? So that's why we did the software. That's why we did Proof.
00:16:42 - 00:16:55
Yeah. So when you started in 2015, there was emerging what is now one of the biggest changes in technology in decades: AI.
00:16:56 - 00:16:59
Yeah, which is a huge umbrella of technologies.
00:16:59 - 00:17:32
It is, oh absolutely. It's just a catch-all phrase for a lot of technologies, you know, ML, machine learning, which uses a lot of data. That's why NVIDIA, you know, the chip company's share price has gone through the roof. So when did you, or have you, implemented AI into your platform to assist you in both analyzing the current situation and predicting the future a little bit from past data?
00:17:33 - 00:20:12
So we were an AI-first product, right? It's been in the product since the beginning. The AI that we use, though, is designed to help analysts create models a lot faster, and with much easier collaboration with business counterparts, than they normally could. The parts of AI that are mainstream, so you mentioned machine learning, are not really good at causal analytics. Machine learning will show you a lot of patterns, repeating patterns, in large amounts of data. But those patterns are not significant just because they repeat all the time. They can actually be central to something, or they can be way out here, right, repeating all the time but not meaning anything. The other thing, and I'm kind of skipping across the waves here a little bit, but I think it's really important to understand, is that all artificial intelligence is a big data solution. The problem is that, while there are exceptions to this statement, most corporations don't have a lot of big data. They have a lot of data, they have a lot of lean data, right? They have a lot of time series data, but they don't have a lot of big data. The pharmaceutical sector would be an exception, anything that is research-based. So automotive, aerospace, they're gonna have a lot of big data. Anything that is life-and-death oriented is typically going to have a lot of big data associated with it. But big tech, strangely enough, does not have a lot of big data. So their use of really significant AI is very limited by that, right? So let's just talk about LLMs for a second, right? A public LLM? Its training data is the entire internet.
00:20:12 - 00:20:18
And well, let's explain for the people who maybe haven't heard the term: an LLM is a Large Language Model, isn't it?
00:20:19 - 00:21:49
That's correct. So this would be like ChatGPT and things like that, right? So it's sourcing its training data from the entire internet. So from a purely quantitative perspective, statistically, it has more than enough big data to make a determination. The problem is that there's a lot of crap on the internet, right? And so you've probably noticed that if you ask ChatGPT to write you a research paper on some subject, and then you read through it with any kind of knowledge, there are a lot of flaws in it, right? So then you've got private LLMs, which now suffer from the problem we were just talking about, which is a constriction of big data inside of a corporation, which then makes the quality of the data that is there far more important, right? The reason why I'm bringing this up is that it's an example of the limitations of AI right here, right now. It's not about problems with the algorithms or the AI technology; it's really what we feed it that's the problem.
00:21:49 - 00:21:56
Well, there's a term for that which has been used for a long time: garbage in, garbage out.
00:21:57 - 00:22:10
That's right. So Proof, like a lot of analytics, is built on regression, and regression is still the workhorse; if you were to ask a data scientist about this, they'd say we use regression 80% of the time.
00:22:10 - 00:22:13
Explain regression to our audience just quickly.
00:22:14 - 00:22:37
So regression is a multivariable calculation of how a bunch of different factors on one end conspire together, or not, as the case may be, to produce a particular outcome into the future. And it will tell you about how long that's going to take, the time calculation.
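The multivariable regression Mark describes can be sketched with ordinary least squares: several factors fitted jointly against an outcome, then used to forecast a new scenario. The factor names and all the numbers below are invented for the demo; real go-to-market models would have many more variables, lags, and noise.

```python
# Ordinary least squares: fit several factors at once against an outcome,
# then predict the outcome for a new combination of factors.

def fit_ols(X, y):
    # Add an intercept column, then solve the normal equations X'X b = X'y
    # by Gaussian elimination with partial pivoting.
    rows = [[1.0] + list(r) for r in X]
    k = len(rows[0])
    A = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
    for i in range(k):
        p = max(range(i, k), key=lambda r: abs(A[r][i]))  # pivot row
        A[i], A[p] = A[p], A[i]
        b[i], b[p] = b[p], b[i]
        for r in range(i + 1, k):
            f = A[r][i] / A[i][i]
            for c in range(i, k):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    coef = [0.0] * k
    for i in reversed(range(k)):  # back-substitution
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, k))) / A[i][i]
    return coef  # [intercept, factor coefficients...]

def predict(coef, factors):
    return coef[0] + sum(c * f for c, f in zip(coef[1:], factors))

# Toy data: revenue driven by ad spend and an externality (say, interest rate).
X = [(10, 5), (20, 5), (30, 4), (40, 4), (50, 3), (60, 3)]
y = [2 * s - 10 * r + 100 for s, r in X]  # exact linear relationship, for the demo
coef = fit_ols(X, y)
print(round(predict(coef, (70, 2))))  # → 220
```

The point of the fitted coefficients is exactly what Mark says: each one estimates how much a factor contributes to the outcome, which is what separates this from just eyeballing a trend.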
00:22:37 - 00:22:42
So the regression is about the past telling you about the future, or today's situation.
00:22:42 - 00:25:24
Yes, but it's not extrapolation, right? Extrapolation is: wow, you know what, the last six quarters have all been up and to the right, and so we naturally say that the next three quarters will also be up and to the right. That's extrapolation. It totally ignores externalities, totally ignores a bunch of things, right? It just makes a giant assumption. And my dad always said, you know, when you assume, it makes an ass out of you and me. So we don't like that. Regression, on the other hand, will take a look at all the history that you have, all the patterning in that history, like seasonality, stuff like that, and when it forecasts into the future, it will take all that into account. So the reason why Proof is so attractive to so many companies in this regard is that it recalculates these models, like, every day, every week, something like that, whatever is relevant to that business. And so it can check the accuracy of the forecasts repeatedly and on a very low-latency basis. The future becomes the present, it gets measured, it gets recalculated in the model, and everybody goes, wow, we're tracking almost exactly on the original forecast. Or we're not, right? Oh wow, there's a gap opening up here between forecast and actuals; what's causing that to happen? Oh, it's this, that and the other, right? So then they can respond right there on the screen with different scenario planning, kind of war gaming, and decide what it's gonna take, what changes are gonna be necessary to get back on track. This is where Proof actually operates exactly like a GPS, right? You'll be cruising along on your route to your destination, and all of a sudden the world will shift: traffic ahead of you, a wreck, whatever, right? And so it will recalculate and say, okay, if you stay where you are, man, you're gonna be an hour late.
But if you take a right and you take a left and you take two more rights and one more left, you'll be there in eight minutes. So why don't we go do that? That's exactly what Proof does for businesses.
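The recalculate-and-compare loop Mark describes, forecast, measure the actual, check the gap, re-forecast with the fresh data, can be sketched like this. The trend model, the data, and the tolerance threshold are all invented for illustration; it's a toy version of the idea, not Proof's actual machinery.

```python
# Each time fresh data arrives: refit a simple trend model, compare the
# previous one-step-ahead forecast to the actual, and flag when a gap
# opens up between forecast and actuals (the "recalculate the route" moment).

def fit_trend(ys):
    # Least-squares line through (0, y0), (1, y1), ...; returns (intercept, slope).
    n = len(ys)
    xs = range(n)
    mx, my = (n - 1) / 2, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - slope * mx, slope

def gps_loop(history, incoming, tolerance=0.15):
    statuses = []
    for actual in incoming:
        a, b = fit_trend(history)
        forecast = a + b * len(history)  # one step ahead
        gap = abs(actual - forecast) / abs(forecast)
        statuses.append("on track" if gap <= tolerance else "recalculating route")
        history.append(actual)  # fresh data joins the model, as in Proof
    return statuses

# Revenue trends up, then an externality (say, a rate hike) breaks the trend.
print(gps_loop([100, 110, 120, 130], [140, 150, 90]))
# → ['on track', 'on track', 'recalculating route']
```

In the real product the model being refit is a multivariable causal model rather than a straight line, but the loop is the same: the future becomes the present, gets measured, and the forecast is re-checked at the cadence of the business.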
00:25:25 - 00:25:44
Right. That's very, very cool. So can you give us a concrete example of how you would engage with a customer? In other words, they say, we're interested in your software, we're interested in Proof Analytics. What does the engagement process look like from inquiry to implementation?
00:25:45 - 00:28:06
So, one of the things we really work on a lot is our ICP, right? Our Ideal Customer Portrait, because we want our customers to be successful. And like with a lot of SaaS, if you sell it to people who aren't ready for it yet, and it's not their fault, it's just reality, right, they will not have a good experience with it and they will churn, and that would be bad for everybody. So we really try very, very hard to avoid that. And so we spend the first call, the first meeting, asking questions to determine where they are on, let's call it a maturity curve. And if they're not where we really think they have the greatest chance of success with our software, we will say, look, this is kind of where we think you are, and so we don't think that selling you the software is a good idea, but we are happy to sell consulting to you, or whatever, to help you get, ultimately at your own pace, whatever is right for you, to a place where using our software could easily be a good thing for you. And the principles are the same, so if you bought all our consulting and then you bought somebody else's software, it's still gonna be good for you, right? So let's say, though, that they are really right for Proof, they're ready to go. We will say, okay, you know what, we have a usage-based pricing scheme, right? And so you can start with as little as 50 data sets, and you can model to your heart's content at no additional charge with those 50 data sets. And that's gonna be about $500 per data set per year. So it's a little over a dollar a day per data set.
00:28:07 - 00:28:08
And a data set means?
00:28:08 - 00:31:58
It can be of any size, right? It doesn't matter to us what the size is. A data set, in our case, is a time series; in most corporations, it's a repeating time series measurement of something, right? So the basis of the pricing model is crystal clear. Everybody goes, yeah, totally got it. That's great. So then we sit down with them after the deal is done and we do the onboarding, and where we start is we say, okay, what do you want to know? What do you want to focus on? What are your top 10 questions? Your top 20 questions? Whatever the number is, what do you want to focus on? They usually have absolutely no problem rattling those off. It's the stuff that keeps them awake at night, right? So then we say, okay, got it. And we show them, using a couple of those as examples, how to build a model framework. The difference between a model framework and a model is that a model framework has no data in it yet and the model does. The model framework, though, will spit out a punch list of the required data types necessary to arm the model to get the answer they need. And so that gives them, like, a laundry list, a grocery list: go out and get these 12 data sets, right? And then those are loaded into the model. You can do it manually, you can do it by API, however you want to. And then they get a first cut, right? And it happens in seconds. The whole process averages about three hours per model, to create a model, which, to kind of put that in perspective, compares to days using any other tool, right? So a really talented data analyst can probably create eight of them in a day, 20 of them in several days, right? I mean, it's that kind of scale.
So then what happens is that they'll work on those models and they'll go, wow, you know what, this is, like, amazing. It's really helping us out. And they will pick out some number of those models and create a standardized package of models. And then, let's say it's a multinational doing business in 100 countries. They'll say, over the next year, we want to scale this package of models, apples to apples, out to all those countries. We understand it's not all gonna happen at once, okay? But at the end of the year, that's what we're after. And that way we can see, apples to apples, how all this is happening around the world in the markets that we serve. And maybe we get some meta-level learnings. Maybe we can say, hey, these guys over here in Zimbabwe, they're doing something that everybody else probably needs to, at a minimum, investigate for their markets, do more of that, try it out. It's stuff like that.
00:31:58 - 00:32:08
Right. So initially, there's a fair bit of hand-holding by your team with their team.
00:32:09 - 00:34:13
Yeah, I would say that is probably overstating it, because when we start the onboarding process, we're typically working with data analysts, and so they just need guidance on, like, where everything is in the system, kind of thing; they don't need education on the principles, right? The other kind of onboarding, which happens a little later, is that we sit down with marketing teams, sales teams, business leaders, whatever, and we work with them on how to use this kind of output to make better decisions, because that's a whole skill in and of itself. Our readouts are very intelligible, very understandable, very intuitive. But still, right, if you've never done it before, you're gonna need some orientation. And so we do that as well. And typically the way it ultimately goes down, and again, this is purely driven by maturity levels around this whole topic, is that either we sell them software licenses, they fully implement it internally, and they're fishing themselves, right? Or they don't know very much, but they know that they need this, and so they will outsource it to us, or to our partners, on a managed-service basis. And that kind of takes two forms. One is: hey, you know what? Really love this, really need it. Totally want to outsource this to you for, like, ever, right? Or: I want you to teach us how to fish for the next year and then hand it off to us, and we're gonna fish ourselves.
00:34:13 - 00:34:32
Yeah. So looking at your platform, you've obviously had to build it. So did you create an MVP, a minimum viable product, initially?
00:34:33 - 00:37:16
That's a great question. So in most areas of SaaS, you would do exactly that: you'd create an MVP, right? In analytics, there's a real challenge there, because a product, mathematically speaking, is either complete or it's incomplete, and if it's incomplete, no one cares, right? So what we had to do, and this is why it took us the better part of three years to build it, is work with about 18 or 19 early customers, like pre-release customers, who were paying us a little bit of money, but most of the compensation was feedback on the product, and we had to iterate the product with their feedback pre-release. And so that was our way of approaching the MVP idea from a different perspective, because all of those customers were totally okay with the fact that at any given moment during that three-year period of time, it wasn't complete. They were able to kind of atomize and separate different functions of it and focus, right? You can't really ask the market to do that. So yeah, it was a trick. I mean, it really was. And look, I don't care who you are, there's a whole bunch of that whole journey that you do not know. So you feel like you're dodging bullets on a regular basis. And the temptation is to say to yourself, what an idiot you were for getting involved in this, because you feel like all you're doing is bumping into areas of no knowledge, or very little knowledge, and you know that you desperately needed to know that, whatever it was, a year ago, right? And then you have to kind of make peace with the fact that it never stops.
00:37:17 - 00:37:20
In other words, be comfortable in confusion.
00:37:21 - 00:38:13
Yeah, and also I think the scientific method, or the scientific culture, is actually a really healthy view on all this. They just assume that they are ignorant all the time; that's why you hardly ever see a scientist who's really willing to say, this is absolutely, positively it, right? Because it's almost like a realtor saying, we've sold the house. The contract's not inked yet, but we've sold the house. It's almost seen as a hex on it, right? And that is definitely the way scientists look at it: as soon as I make a claim like that, I will be proven wrong. And so they don't do it.
00:38:14 - 00:38:39
Yeah. So you had to go through a journey, starting with a lot of knowledge, but then you discovered that you didn't have enough knowledge, that you had to continue to learn. And you had those 19 customers. Were they telling you a lot of the same things to guide you, or were they very disparate and different?
00:38:40 - 00:41:41
You know, some of them, almost by virtue of their culture, were really focused on what I would call the back end of the tool. So capabilities, mathematical accuracy, all this kind of stuff, right? Other ones just kind of assumed, correctly, that we had that part of it under control, and so they were really focused on user experience. They had absolutely no problem telling us, okay, I see what you're showing me and I have no effing clue what this is and what I'm supposed to do with it, right? And while it's always okay to have a situation where something has to be onboarded or explained in some way, it can't be that opaque at first glance. They have to be able to have some level of connection and orientation to whatever it is you're showing them. If you get an answer like the one we got a long time ago, you know that you are not even remotely in the right area, right? You're not meeting them at all where they are. Now, their data scientists would look at it and go, what's your problem? I totally understand this, right? So the breakthrough for us, because we really struggled on this a lot, everybody had a slightly different version of what they thought a proper UX was, and obviously that doesn't really scale, came from one of our early customers, a large retailer who was feeding hourly data into Proof. They were kind of late in the game, so to speak, right? So all of a sudden, when they started doing that, we saw on our screens the models recomputing at that frequency, right? It was just like a GPS. It was literally like watching the GPS on your phone, right? And we were like, oh my God, that's it, right? Because if you think about it, almost every question in life, certainly in business, is a navigation question, either literally or metaphorically.
00:41:41 - 00:41:42
How do I get from A to B.
00:41:43 - 00:42:58
Where am I? Where do I need to go? How am I gonna get there? What's changing around me to make me change in response? Right? And so that's where we went with it. And then we did a batch of, well, we showed it to all of them and we showed it to some net new people, right? And everybody went, oh, that's great. That's awesome. That's exactly it, right? And the fact that it was also interactive where they could war game changes and they could say, for example, right there on the screen live in whatever meeting you could say, if you were to invest another $10 million in marketing and we were just to distribute it in the same way that we're currently distributing what we are spending, this is what it would look like.
Yeah, or you can change the mix right there, right? And that's what that would look like. And so all of a sudden people were making real business decisions about marketing as opposed to marketing decisions about marketing.
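The war-gaming idea described here, projecting the outcome of an extra $10 million distributed pro rata versus a changed mix, can be sketched with a toy model. Everything below (the channel names, the coefficients, the linear response) is invented for illustration only; Proof's actual models are not public and are certainly more sophisticated than a simple linear formula.

```python
# Toy "war-gaming" sketch: project revenue under budget scenarios.
# Channel names and revenue-per-dollar coefficients are hypothetical.
channels = ["paid_search", "events", "content", "brand"]
current_spend = {"paid_search": 12.0, "events": 8.0, "content": 6.0, "brand": 4.0}  # $M
coef = {"paid_search": 1.8, "events": 1.1, "content": 2.4, "brand": 0.9}  # toy $-for-$ yield

def projected_revenue(spend):
    """Toy linear response: revenue = sum of coef_i * spend_i."""
    return sum(coef[c] * spend[c] for c in channels)

def add_pro_rata(spend, extra):
    """Distribute `extra` $M across channels in the current proportions."""
    total = sum(spend.values())
    return {c: s + extra * (s / total) for c, s in spend.items()}

baseline = projected_revenue(current_spend)

# Scenario 1: add $10M, same mix as today.
same_mix = projected_revenue(add_pro_rata(current_spend, 10.0))

# Scenario 2: change the mix, put the whole $10M into the highest-yield channel.
best = max(channels, key=lambda c: coef[c])
new_mix = dict(current_spend)
new_mix[best] += 10.0
changed_mix = projected_revenue(new_mix)

print(baseline, same_mix, changed_mix)
```

The point of the interactive version Mark describes is that decision-makers can see both scenarios recompute live, the way a GPS reroutes, rather than waiting for an analyst's report.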
00:42:59 - 00:43:03
So instead of guessing you had the data to support decision making.
00:43:04 - 00:44:23
Yeah, even more than the data, it was the relationships between the data, that was what it really was. And the fact that people could see it evolving live in front of them also gave them a lot of confidence. The other thing I'll point out about the GPS on your phone: right around 2012 or 2013, GPS providers started including the countdown to arrival, the ETA, right? And nothing made a bigger difference in people's confidence in the GPS than that, because when you turn into the parking lot of the restaurant and the ETA goes to zero, exactly when they said you'd arrive, you go, okay, this machine knows its stuff, I can count on it, right? We do the same thing, and that made a big difference.
00:44:24 - 00:44:43
So GPS is very interesting in the sense that it's collecting data from real cars on a similar route or nearby, isn't it? And it's feeding that back into the machine, and it's updating, I suppose, its prediction of when you're going to arrive.
00:44:44 - 00:45:00
Yeah, so it's tracking all kinds of externalities around you, and either the satellite or the terrestrial station is tracking your progress, right? Relative to what's going on around you.
00:45:00 - 00:45:13
Yeah, that's a really great analogy, I think in terms of what you guys are doing. So, now you've got Proof Analytics or Proof quite stable now and customers are very happy with it.
00:45:14 - 00:46:14
And then some years later we acquired a marketing resource management tool, which is essentially an ERP for marketing. If you know what an ERP is, everything it does, regardless of whatever function it's pointed at, this one does for marketing, right? It's unique in the sense that it is the only one native on Salesforce, so there are tremendous advantages that accrue just from that alone. Then we integrated it with the analytics. So now you can plan, predict, prove, and pivot in an endless loop, right? The analytics flip back around into your next planning cycle and help you understand how you need to be planning and budgeting your next move.
00:46:14 - 00:46:21
Right. So, sales analytics, sorry, is Salesforce your main focus?
00:46:22 - 00:46:50
It's certainly a very heavy focus. We also have it on AWS for a kind of platform-agnostic use. You would go AWS if you weren't a heavy Salesforce shop. If you were, and you were trying to get more and more leverage out of your core Salesforce investment, you would totally buy our products on Salesforce and use them there.
00:46:50 - 00:47:00
Right. Cool. So where's the future for Proof Analytics? Is it doubling down on AI in its use? What's the future for you guys?
00:47:01 - 00:49:07
I mean, I think there are some really early steps there. We're getting ready to release essentially ChatGPT on the AWS version, and a Salesforce version of ChatGPT on that one, where you can say, write me a report, or create a slide deck for me, or whatever, based on the key learnings so far in Proof. It's kind of cool, but it's pretty standard, no big deal. I think it will be very useful, particularly if in your prompts you say, write me a talk track, a script, because I'm going to present the latest findings in Proof, but I'm going to use Proof live. There are no slides; I'm essentially going to be demoing Proof, and I need a talk track to make sure I cover it all, right? So I think there's a lot of value in that. And I think we'll continue to add the use cases our customers ask for. Implicit in Proof, and I'm talking about the analytics product here, is that it's an agnostic product. It's not limited to marketing and sales; you could literally use it for anything, right? If you wanted to use it to study climate change, you could. So while we are staying very focused as a startup on the go-to-market space with Proof, we already have customers that are using it in other ways. And that's fine.
00:49:08 - 00:49:48
That's really cool. So I think hearing your journey, from being confronted by a CEO to actually looking for the answers to those questions, is really cool, and then developing it with customers to get it to maturity. So there are two more questions I want to ask before we wrap up. Number one is: what have you learned along the way as an entrepreneur by starting Proof Analytics? What are a couple of learnings? And this could go for hours, I know, so just pick one.
00:49:48 - 00:53:54
Yeah, I would say this: there are so many things, but if I had to pick one that's super relevant, particularly right now in the current situation everybody is in. Entrepreneurship certainly has some really awesome moments, but it has a lot of suffering. I choose that word on purpose because that's actually what it is; it's not just hard times or something like that, particularly if you are the CEO. It amazes me, actually, that we lionize entrepreneurs and CEOs, and I think we only do that because they're perceived as having a lot more money and power than a lot of people, despite the amount of suffering in those roles, compounded by loneliness. Which, again, if you've never experienced it, you cannot relate to, so don't even try, right? But when you're tempted to not have empathy for your CEO, you might want to rethink that a little bit. I'm not saying that some of them aren't jerks, and I'm not saying that they're beyond accountability. That's not what I'm saying at all. I'm just saying it's a lot harder than it looks. And so, as you and I were talking about earlier, you can either be really pulled down by all that, or you can make a choice to learn from it and also to see what there is to be thankful for in the midst of it. Sometimes it's something totally different, something that's obviously pretty easy to be thankful for and isn't part of the suffering. And sometimes you have these moments where you go, you know what? This really sucks. But I'm learning so much right now, and I'm very thankful for that. Ultimately, I think you have to be very proactive about this: you have to cultivate joy that is not related to your circumstances. It's about something higher and bigger than that, right? It's kind of like opening up the optic and seeing more of your life, what it has been, what it is, and what it could be at any given moment.
Most of us don't do that very much unless it's a worry, right? So as a data scientist, what I will add to that is a rule of thumb: two-thirds of every model, minimum, involves things that you do not control. It's the wave that you are seeking to surf, but you will never fully understand the wave, you will never control it, and it can kill you or really mess you up at any time, right? You have to realize that. This is the paradox that's illustrated in analytics over and over again: even though what you do is super important, it's still a very small percentage contribution to what ultimately happens. But that's not an excuse for not doing your best, right?
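That two-thirds rule of thumb can be illustrated with a quick simulation. The numbers below are my own toy choices, not Mark's: with independent drivers, each factor's share of the explained outcome variance is its squared coefficient times its variance, so weighting the external driver by a factor of the square root of two makes it carry roughly two-thirds of what the model explains.

```python
import numpy as np

# Illustrative only: one driver you control, one "wave" you don't.
rng = np.random.default_rng(0)
n = 100_000
own = rng.normal(0.0, 1.0, n)       # actions you control (e.g., your spend)
external = rng.normal(0.0, 1.0, n)  # the wave: market, economy, competitors

# Outcome: external factors weighted so they explain ~2x what your actions do.
y = 1.0 * own + np.sqrt(2.0) * external + rng.normal(0.0, 0.5, n)

# Share of explained variance attributable to each (independent) driver.
own_var = np.var(1.0 * own)
external_var = np.var(np.sqrt(2.0) * external)
external_share = external_var / (own_var + external_var)

print(round(external_share, 2))  # roughly 0.67: two-thirds you don't control
```

The practical reading is exactly Mark's paradox: your own contribution is real and worth optimizing, yet it is a minority share of the total outcome.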
00:53:54 - 00:54:06
Yeah. So it's very much about the journey and enjoying that rather than getting lost to the pain or the actual moment, I suppose.
00:54:07 - 00:54:44
Yeah, you cannot allow yourself to become lost in the pain. And to be very clear what that means, at an absolute minimum: early in that process, when you feel like you're being subsumed by the pain, you have to know that you've got to ask for help, because if you try to handle it all yourself, it will not end well.
00:54:45 - 00:54:59
Yeah, that's a very good, yeah, that's a very good insight. So that leads to my other question which we touched on in your last answer is what brings Mark joy?
00:55:00 - 00:56:25
You know, I have to say that it's a whole bunch of different things. I love my boys. I'm really blessed to have two sons who are, I think by any measure, really great human beings. And they have made being a dad remarkably easy. I feel like I sort of scraped the cream off the top of that experience in many, many ways, just because the need to correct them, certainly the need to get mad at them, has been exceptionally rare. So that's definitely part of it. But I will say this: I think that joy actually is most related to something you cannot lose. God forbid, I could lose one of my sons, right? And that would be a horrible thing, as it would be for any parent. You can lose your money, too, so if you find your joy in your money, that's a very dicey proposition.
00:56:26 - 00:58:20
If you find it in a relationship with somebody else, in your marriage or whatever, that can change. There are just so many things that change in this world that if you are not anchored in something that operates above all that, separate from it and yet integrated with it, then your joy rests on a fragile foundation, right? For me personally, I have a faith that started very, very small and has grown and grown and grown. It meets me emotionally and intellectually where I am, it's based upon thought and action, and it doesn't require me to be perfect. There's a lot of grace in my faith, but there's also accountability, right? And you do have to believe in God for this to work, but I feel like I'm in a relationship, not a voice-to-voice thing or something like that, but I know that the God I believe in is looking out for me. That doesn't mean I'm going to be saved from all pain and suffering, clearly. He takes a bigger view of pain and suffering, and he expects me to take the same big view.
00:58:21 - 00:58:45
Yeah, I think you've touched on quite a few elements there, which is great. Your relationship with your boys is one; a guiding philosophy or religion is another. And the other one you touched on is meaningful work, because obviously you wouldn't be doing what you're doing unless you felt you were doing something of consequence and on purpose for you.
00:58:46 - 01:00:13
Oh, it's so true. And I will also say this: one of the things I appear to have gained a reputation for on LinkedIn is the way I integrate a lot of this, right? I'm kind of the philosopher CEO to a lot of people, and those posts get many times more engagement than my technical posts, right? It's been great, because ultimately the reason I built Proof was that I had acquired all this knowledge, after a lot of pain with different CEOs and CFOs, and I wanted to share that knowledge and that capability, because marketing is a great profession. It just sucks in this area, right? So if I can help people get past that, it's totally awesome. And we're in a period of time right now where a lot of people are in a lot of pain. They're being dragged across a lot of broken glass, by other people, by their situations, by themselves, right? I certainly don't have anywhere near all the answers, but if sharing what I do have makes a difference for somebody, then quite honestly, it doesn't get much better than that.
01:00:14 - 01:00:55
Yeah. Thank you very much for sharing. That reminds me of the three fundamentals of a good and fulfilled life, code for happiness, code for joy. It was Epicurus who said the first element is the relationship with good friends, and that includes family. The next one was meaningful and purposeful work, and you're obviously doing that. And three, a philosophy or religion that supports you. You've summed all three up, so thank you for sharing.
01:00:55 - 01:02:06
You bet. I would add a subset to number one, or excuse me, for me personally: the thrill of a eureka moment, when all of a sudden you feel like you've had a breakthrough of understanding. It could be in my software product, it could be with a customer, it could be whatever, right? In addition to all this, I'm kind of a longstanding scholar of 15th-century technology and innovation, mainly comparing and contrasting Northern Italy and Southern Germany. They were both hotbeds of it, but they did it very differently. So when all of a sudden I feel like I have an epiphany there, a eureka moment, it is a thrill. It's a joy. Now, it's a transitory thing in a sense; you don't just keep feeling that particular joy constantly into the future, but it's still pretty awesome.
01:02:06 - 01:02:14
Yeah, I think you, what you've touched on is what I think brings joy to a lot of humans. That's the art of creating.
01:02:14 - 01:02:17
Oh, that's exactly it. Yeah.
01:02:18 - 01:02:49
And Mark, you've created a business, but a lot of people don't give enough credit to entrepreneurs as artists and creators. Creators, artists, writers, musicians, it doesn't matter: they take all these elements and bring them together to create a masterpiece. And to me, hearing your story, you've created a masterpiece that will live beyond you and will be remembered. It makes a difference to the world, even if some of us don't understand the language.
01:02:50 - 01:04:12
So I'll tell you a parting story that illustrates the point you just made, right? Leonardo da Vinci was a close creative partner of Lorenzo de Medici. Medici was very creative but had no craft skills, so he really needed Leonardo to realize his vision, right? But they had this incredible partnership, and there are a number of things, either in da Vinci's diaries where he would sketch designs, or in some cases in actual pieces of art, where secretly there is a joint signature by da Vinci. It's in code, but it's da Vinci and Medici, right? He's giving equal billing to Medici, not because Medici paid for it, but because da Vinci is recognizing the creativity, the co-creativity, that Medici had brought to the table. And that is really cool. I love that.
01:04:13 - 01:04:46
Well, it reminds me of the saying that we all stand on the shoulders of giants. You mentioned before that you drew on the wisdom of corporate mathematicians and others you learned from, but you've created your own particular formula, which is your art. So thank you very much for sharing your story, Mark, and your insights. It's been an absolute joy.
01:04:46 - 01:04:50
Thank you so much. It was a thrill for me too.
01:04:51 - 01:04:53
Thank you everyone and thank you Mark.