AI Today (Aired 09-01-25) AI Regulation and Pilots 2025: CEO’s Guide to Success

September 01, 2025 00:50:16
AI Today (Audio)

Show Notes

Master AI integration: navigate regulations, plan pilots, manage data, reduce risk, and boost revenue while protecting brand trust.


Episode Transcript

[00:00:00] Speaker A: Sam. Hello, hello. Welcome back to Power CEOs and AI Today. I am Jen Goade and I am here with Dr. Alan Badot. And we have decided that the time has come for another special collaboration, because AI is evolving. There's a lot happening. There are regulation changes, there is agentic AI. There are so many things that have happened. And so we wanted to come together today and give you an entire hour dedicated to what do I need to know from a business standpoint, what is the tech situation, and more. So, Alan, it's a pleasure to be here with you. Thank you for agreeing to do this.
[00:01:02] Speaker B: Yeah, it's great to be back with you, Jen.
[00:01:05] Speaker A: Let's dive in. The AI race got new referees, and their rule book can impact your bottom line even if you're not in Europe. The EU's AI Act isn't just European red tape. It really is a global wake up call for CEOs and investors. Whether you're building AI tools or just integrating them into your workflows, this regulation could dictate how you design, deploy or simply integrate them into your business, or even how you market your business. So, Alan, I wanted to break down what's changing, why it matters now, and how to turn compliance into our next competitive advantage as entrepreneurs. So first things first, will you give us an overview of the EU AI Act? What are the key requirements and timelines, and why can't US companies ignore it?
[00:01:56] Speaker B: Yeah, sure. So this is something that's not entirely new. It started really almost a year ago, when they came out with their initial EU AI Act and laid out what a framework was going to look like and some things from a compliance perspective. Well, then deadlines started to hit in early February, I think it was February 2nd or 3rd or something like that, of 2025. That's when some initial deadlines were supposed to take effect, which looked at, oh geez, where are they getting the data from, how are they using the data, what kinds of models are they actually training, what are these models going to do, those sorts of things. And then it started to really just focus on, okay, they're getting a lot of data, they're using our citizen data, we need to protect them. So now we're going to say, okay, depending on what kind of models you're going to use and how you're going to use that data, there are going to be some risks associated with that. And so they started to assign risk scores to some of these models, and those risk scores really just lead to what they can do with those models later on, how that's going to impact businesses and where businesses can use them.
[00:03:28] Speaker A: So let me ask you a question about that, because we've seen all kinds of things happen. We've seen countries want to ban the use of certain models and open source and whatnot. For a company that's currently integrating maybe an open source LLM, what are the first compliance checkpoints that they need to be considering, or that they need to really start thinking about integrating?
[00:03:57] Speaker B: Yeah, there are sort of two thoughts around this. If you're going to do business over in Europe, then you've got to look at GDPR.
That's how data is moved, that's how data is used, and that's how data is, you know, really computed across different computers and the Internet and those kinds of things. And that's the first step that you would want to look at. But from another compliance perspective, in the U.S. you've got to look at whether the data is allowed to be used in different types of models. Because medical data, PII data and those kinds of things really dictate where it can be used, how it can be used and what it can be used for. And if you're a business, you've got to get through those technical hurdles before you can really do anything else. And then it just starts a laundry list of other things that you have to take into account. But now my question back to you, Jen, is taking that stuff into account, the costs. How do you start to weigh those costs, those benefits, those trade offs that companies have to invest in? How do they determine if something like that is worth it or not?
[00:05:17] Speaker A: Yeah, well, Alan, that's an excellent question. I think this goes back to what we talk about a lot, and that's what are the strategic goals of the company? Where are the bottlenecks? Because if there's something that is repetitive, it's predictable in some way, shape or form, and it's time consuming, those are almost always wins when we look at advanced AI automation. Because we're not just looking at the monetary cost benefit analysis, we're also looking at the human labor side of things. And one of the things that keeps coming up, and I'll be frank, Alan, the most successful integrations are the ones that consider this particular aspect: we're in an AI age. AI is being integrated into a lot of different things, whether we want it or not. So we have to make a decision as an executive, as a founder today: do we want to control where this is going for our companies and be a part of that, or do we want to let it control us? And so that piece of the cost benefit analysis is being ignored in a lot of places. People are putting their heads in the sand and not even thinking about it. But if it's something that's repetitive, it's predictable, it's time consuming, those are the first things we need to be looking at when we start to assess this. And then of course, as you know, do we have the right data, do we have the right protections, do we have the right governance in place? And being aware of what that looks like from the beginning helps us mitigate some of those compliance and risk costs on the other side. Right? So if we have a knowledge of, okay, I'm in medical, I know I have to worry about protected information, I know I need to redact this information, I know what I can and cannot use within my guidelines, then we have to start with that end in mind and start with those guidelines in mind. And when we look at our cost benefit analysis, we also have to ask ourselves a core question, and actually, I'm going to kick it to you, because a lot of us have stuff stored in the cloud. But as we've seen recently, the cloud is not necessarily the safest place to be housing things, as all of these bigger systems keep getting hacked or they have exposures. We're seeing this in financial services, we're seeing this in the news every day.
So when we're thinking about this, we also have to consider: where is my data being stored? Where am I housing this? Because there's also risk associated with a lot of the aspects that go into data security. So I am not the expert in that space. I'm going to defer to you and ask you, how do we engage that in our conversation about, okay, this is something that we need to consider, what is the cost going forward? Because those costs also keep escalating. If we're using cloud storage, every other month these price points are going up. So how do we look at this from a what-is-the-actual-risk standpoint, and then from a cost standpoint? This is going up, and we know there's not enough infrastructure. So what are you seeing on your side of that, so that we can make more informed choices when we look at the financial burden?
[00:08:26] Speaker B: Yeah, I think the biggest challenge that folks have is they treat data separately from machine learning or even basic analytics. The challenge is you've really got to have a data ops piece that combines with your ML ops or your analysis ops, whatever you want to call it, so that as your strategy is refined for the data, it also gets refined naturally for your models or whatever you're using. Oftentimes we don't do that, because we either look at only the data and getting the data, or we look at only the models and training the models. And we're not very good at doing both. And that's just going to have to change, because you're right, there are security issues all over the place. People are just housing data, it's not encrypted, it's not backed up. The laundry list is just absolutely huge. And folks need to pay attention to the cycles, the things that are happening, because that really is going to fundamentally shift what your strategy is going to be, where you get your data, how you get your data, how you use the data. That entire data life cycle is going to be exceptionally more important now than it was even two, three months ago, because of the size of some of these models. That's really the interesting piece, actually, about the AI regulations coming out of the EU. They can actually start to fine you now. I think it was something like 3% of your revenue on some of those. It's still a little wishy washy, I think, but it's better. There are some teeth behind it now, those kinds of things. And so, you as a business owner, if I came to you as the CTO and said, hey, we've got to do this, this and this, and oh, by the way, we could get fined 3% of our revenue associated with this, how would you handle that, and what sort of conversation would we have after that?
[00:10:36] Speaker A: Yeah, I think the first thing that would do is give me pause, because that's not chump change. We're not talking about 3% of our profit, we're talking about 3% of our top line revenue. And a lot of companies have a margin of maybe 10, 18, 20% at the end of the day. And so it would give me pause, and I would have to say, okay, well, let me reconsider this, maybe we shouldn't do business there. But that's actually not a really good option, because as we have seen with other regulations, what happens in one place, people are going to pick and choose, and that's going to come to be here as well.
So then the next thought becomes, okay, so we design two models. We design something that's in compliance over there, with a more limited data set, and then we have a separate thing for those of us here in the United States. But then that's a double cost. So there are a lot of things that are now going to give me pause, and for a lot of people that actually has the effect of, maybe I'm going to stick my head in the sand. And this is what I'm seeing. I'm seeing people say, oh, wait, hold on, this just came out. I'm going to press pause, and I'm not even going to think about this until it all shakes out. Well, that doesn't work either, because by the time it all shakes out, I'm going to be out of business, because my competition is going to leapfrog me. So these are the kinds of questions that what you've asked me opens up, the questions I now have to work through in a conversation with you as the CTO: okay, how do I solve for this? What makes the most sense? And how do we keep it within this sort of budget, within this sort of context, so that we're in compliance and we're able to be future proofed? And so I know that you have something to say on this, and I want to dive into this, but we're hitting up on the end of the segment. So we'll be right back to dive into this particular question after these messages. We're back from break, and before the break, Alan, I asked you a couple of questions. We're having this dialogue because we have these new regulations that have come in, these governance considerations, these compliance considerations we were talking about. Where is our data? How do we encrypt our data? How do we get it? How do we use it? The entire data lifecycle, and the conversation between the CEO or founder and the CTO on how we move forward from here. And some of the things that I posed before the break were: how do I consider this? Do I have two separate models, one for Europe and one for the U.S.? Do I take a gamble on what's actually going to shake out once all the dust settles from a global regulatory and compliance standpoint? How do I consider this? Because 3% of my top line revenue is not something I'm willing to risk to be out of compliance. So that opens these questions and this conversation between me as the CEO and you as the CTO. And so I'm going to pose the question: what do you think is the most economical or budget friendly way for us to move forward with AI now, considering these options?
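
To put the "3% of top-line revenue" concern in perspective, a quick back-of-the-envelope calculation in Python helps; the revenue and margin figures below are purely hypothetical placeholders, not numbers from the episode:

    # Illustrative only: comparing a revenue-based fine to profit at a typical margin.
    revenue = 10_000_000   # hypothetical annual top-line revenue
    margin = 0.15          # hypothetical 15% net margin
    fine_rate = 0.03       # the roughly 3%-of-revenue figure discussed above

    profit = revenue * margin        # 1,500,000
    fine = revenue * fine_rate       # 300,000
    print(f"Fine equals {fine / profit:.0%} of annual profit")  # Fine equals 20% of annual profit

At a 15% margin, a 3% top-line fine eats roughly a fifth of the year's profit, which is why Jen treats it as a planning constraint rather than a rounding error.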
[00:14:05] Speaker B: Yeah, the challenge, and the tough part about that, is that if you've dealt with GDPR in the past, you have in essence had to have two solutions already. You've had to have one that made sure that your data didn't cross certain borders. Just because you're running on Amazon doesn't mean your data doesn't sometimes go up to Canada and back. And you may not know that, but with GDPR you're supposed to know that, and you have to prevent those sorts of things. So you would have to have a local system that was almost specifically built so that data did not leave the country, citizen data didn't leave the country, those kinds of things. And so now, if you had that already, then you're in pretty decent shape moving forward. Now you just have to say, oh, geez, if I'm going to use one of these general models, how am I going to guarantee that it doesn't have any sort of risky data in there? And the examples that I've seen are, oh, it's got this model now, it lowers the barrier to create chemical weapons and those kinds of things. Well, in theory that's the case. But what you have to do is be very focused on what you want to apply these models to, and choose the right models. Just going out and grabbing a general model, a foundational model, training your data on that and moving forward is not the answer anymore, because they're assigning risk now to each one of those categories. So you've got to be very particular. And so I think you have to spend more time, like we always talk about, planning what model you're going to choose, what you're going to apply it to and how you're going to do that. And so my response back to you would be, we're going to have to spend three, six more months planning, coming up with a good strategy, and this is the cost that it's going to take in order to do that. Are you willing to really invest in that moving forward? And that's going to be my bigger question back to you: how much bandwidth and how much leash are you going to give me as the CTO to spend that extra money to do that kind of planning? Because it is going to be more intensive.
[00:16:27] Speaker A: Yeah. And so that's a common question. I actually have this conversation with entrepreneurs that I coach and consult with, because everybody thinks it's just the cost of the integration or the tool, and they don't even consider the planning phase to begin with. And then they plan for full integration over a pilot. And there's always risk with a pilot, because the pilot might not even work. We don't know. And so I think the next step in this conversation naturally becomes, okay, what is this costing me now? What is my cost of doing business now? What is my labor cost? What is the cost of the redundant activities that are being done? What is the human capital and financial capital? And we have to really get granular with each process that we're considering in our pilot or in our rollout, so that we have a clear understanding of, okay, I run this process 100 times a week, it's $100 per process, it costs this amount per year. If I am already going to be spending that amount of money, and I'm not doing mental math right now, let's just call it $500,000 a year that I'm spending on a process, to make it very simple. If I'm already spending, no matter what, $500,000 to run a process every year with my current clientele, and I have a growth rate that I'm anticipating, which is of course going to increase that cost by the same percentage, then I'm going to look at it and say, okay, I'm willing to put 50% of that into this project, and of that maybe I'll take 20% and put it into planning and pilot, because I'm spending the money anyway, and because there's a risk that it won't happen. I know that I'm growing. I know that this cost of doing business, this cost of running this process, is going to continue to increase as I grow. But I'm willing to take that money and put it aside right now to pilot and see: is this going to work? Am I able to do this? And now I have a budget for my pilots. And so if, as we called it, it's 500,000, maybe I'm willing to put 100 grand, 20%, into the pilot to determine: is this something that's feasible, is it going to work?
Is there something that I can move forward with and then actually do that process? So that would be how I would look at it. That's how a lot of the entrepreneurs that I coach would look at it, because you now have numbers, you have a budget for the pilot and you have an idea of what you want to do. So maybe if I'm going to spend a hundred thousand dollars, what I want in my ROI is for it to save me $300,000 in the overarching scheme of running the process. And so I'm looking at all of those things. So I started with: what are my business initiatives? What am I trying to achieve? I'm trying to achieve operational efficiency, in this case. What are my current costs? $500,000 per year and growing at a 20% rate, because my company is growing at a 20% rate, whatever that may be. I'm willing to spend 20% of one year's cost to do a pilot to make sure that this is actually going to work. And I expect to see 2 or 3x ROI on the money that I'm spending in that particular process. And if I meet all those goals and the pilot meets those goals, then I'm going to say, okay, let's roll it out. Right? And so I've already tested it. Now, if the pilot fails, then I'm going to come back to you and say, okay, this didn't work. What happens? What do we need to do? What do we need to tweak? What do we need to iterate? And we see that all the time, Alan. We see that sometimes we need to tweak the pilot for a little bit, and maybe it runs over the project cost by 20%. So how do we have that conversation when the pilot maybe didn't yield the result that we were hoping for?
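
Jen's pilot-budgeting logic can be captured in a few lines. A minimal sketch in Python, using the illustrative numbers from the conversation (a $500K annual process cost, a 20% pilot allocation, and a 3x ROI target); the function name and structure are just one hypothetical way to lay it out:

    # Rough pilot-budget math as described above; all inputs are illustrative.
    def pilot_budget(annual_process_cost: float,
                     pilot_share: float = 0.20,
                     roi_target: float = 3.0) -> dict:
        """Size the pilot budget and the savings it must show to justify a rollout."""
        budget = annual_process_cost * pilot_share    # e.g. 20% of one year's process cost
        required_savings = budget * roi_target        # e.g. 3x the pilot spend
        return {"pilot_budget": budget, "required_annual_savings": required_savings}

    print(pilot_budget(500_000))
    # {'pilot_budget': 100000.0, 'required_annual_savings': 300000.0}

With those inputs the sketch reproduces the numbers Jen walks through: a $100K pilot that needs to show roughly $300K in savings before a full rollout makes sense.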
[00:20:11] Speaker B: Yeah, I would say, hopefully we've had conversations already that things were not looking as rosy as we had planned, so it would not be a shock to you as the CEO that the pilot was not working. That's the worst possible case that I could think of: we're in this big bang pilot discussion, you have never heard anything about it, and then it doesn't work. Right. That's a disaster.
[00:20:43] Speaker A: It happens though.
[00:20:44] Speaker B: It does, it happens.
[00:20:47] Speaker A: Because people go out and they hire a company to do all of this, and they don't ask these questions or set up these communications. So that's why, I mean, the reality, folks, is we're having this conversation on the air, because we see this in the day to day, when somebody says, oh, I'm going to do this, go do this, and then they step out and they just expect it to be magic on the other side.
[00:21:06] Speaker B: That's right, that's right. That third party, they've got it, they're telling us everything is great, it looks fantastic, and then it's not even close. Yeah, it's not a good situation. But I think one of the good things, at least for a tech company that's trying to do business in the US, is that as these penalties become more clear, and the legal ramifications around these models become more clear, and me as a small business, what's my responsibility from a technical perspective if it's part of my software and it hallucinates, what sort of responsibility do I have? I think that gives an opening to tech companies in the US to say, I'm going to focus on what we're doing over here first, and then if I can apply it over in Europe later on, that's great. If not, I'm not going to stress about it, because I've got a revenue stream coming in. And as you align your pilots, I think people have to think about that: we failed because of this, and this is the reason, along with those longer term sales opportunities and business things that you've tried to align to it. I would just focus on areas where there's less risk from a failure perspective than not. Because if you fail over in Europe on a pilot, it could be it hallucinates too much, and you're done. In the States, you modify it, you keep going, and you can have something that's sellable, but over there the risk seems too high. And from an acceptable risk perspective, I can talk about it from a technology perspective, but as an owner, as a business, as an investor and things like that, how are you now going to look at the risk associated with some of these things that are trying to get into that market?
[00:23:07] Speaker A: You know, it's an ever evolving conversation that we have in my circles, because the risk is evolving. Let's be real, regulation can't keep up. Regulation is what, three years, five years behind what we're seeing right now? And they're wanting to backtrack and say, oh, everything that you've done until now is now out of compliance. So what are we doing? How are we doing this? And I think it really goes back, Alan, to what we discussed in the very beginning, and that's we have to have a solid business use case. We have to know exactly what it is that we hope to achieve and define success. And then we have to know what is happening with the data, because we are responsible for that. The risk to us is tremendous. As a founder, it can sink our company, because it's not just about the financial standpoint. It's also that if we integrate something and all of a sudden we get slapped with a fine, people are going to know about that, and that destroys brand trust. And so we spend all of our time building brand trust, and now we have something that's broken and doesn't work, and now we've gotten fined for it, and it's a compliance nightmare. And that's just going to create fear in our potential consumers. And it's a reality that we're facing. And so we unfortunately have to break, because we're getting to the end of our segment. But I think there are a lot of takeaways from this. For those of you watching: do you know what you're hoping to achieve? Do you have it dialed in? And I think that's where we should go next, Alan. After the break, let's go to how do we dial in an appropriately sized pilot? How do we nail that down? What does it look like, and what does an appropriate scope for a pilot look like? Because that's the thing that I'm finding in all of the companies that I consult with: they have this very vague idea, and it ends up being a complete disaster and overspend and everything, because they haven't got a very clearly identified scope and understanding of what it's going to take. So we will open up with that after these important messages. We are back.
We are talking all things AI. What do we need to consider? We started today, Alan, talking about the EU AI Act: what is the compliance and regulatory risk? We talked about the 3% hit on revenue if we're out of compliance, and some of these other things. And we started talking about compliance by design. And one of the things that I alluded to before the break was that we are finding, and you and I both consult with companies all the time, that companies have this sort of very vague idea of what they want. They want a solution that does everything. And when we talk to them about a pilot, they have no concept of what kind of data requirements that's going to take. They have no concept of the auxiliary aspects like AI governance. I say AI governance and I get the deer-in-the-headlights look. And I'm like, okay, so you need a policy about this and how it's going to be used for your company. It doesn't matter where you are, that's a problem. There are some basic things that I keep seeing come up, and I know you and I have seen it on some of the projects that we work on. So how do we build this? How do we build the plan moving forward? What are the things that go into a scope that makes sense for a pilot that could be successful? And I want you to give me a contrast with one where the scope is completely muddy, and we've had quite a few of these recently: they have this vague, pie in the sky idea, they think a little bit of data is going to yield a result when we know that's not the case. Give me a compare and contrast of a very good, well defined example of scope versus one that's kind of muddy and needs some work.
[00:27:21] Speaker B: Yeah, the best example is something that is really focused. They've got a strategy in place. They have a technical roadmap laid out that says these are the milestones that we will hit at these points, and this is what the end result is going to be, or at least what the result is going to be at those milestones. All of that goes back to requirements. Have you sat down and looked at the user requirements? What do you want the user to do? Have you looked at the data requirements? How much data are you going to need? Where are you going to get the data from? Is it clean data? All those other things, and then the security and every other aspect around that. But it's got to be very tightly focused, usually something that you can get done in six to eight weeks. You don't want to go past that, because then you've gone from a pilot to something else entirely, where you've mishmashed everything from an operational plan and rollout into a pilot, and it's just awful. It's just awful. And then you're really in continuous build, is what winds up happening. And that's what happens a lot with these AI things, because they think, oh, now that I've got this model, I can do just about anything with it. Well, that's great in theory, but in practice, we all know it may work, it may not work, and there's all those other things. And so if you don't have a good scope, if you haven't focused on what that pilot deliverable is going to be, what is that thing that you can hang on the door or pin on the wall that says, this is it, this is successful, then you're going to be in trouble. If you cannot define success for your pilot, don't start your pilot, is really the best way to put it. Because as I'm pulling all this stuff together as the CTO, I've got to give you a budget. And if my budget keeps moving all the time, then I think that's going to lead you to not have very much confidence that it's going to be successful at the end. And so that dialogue back and forth is going to be really important. So what are some of the questions that you would chuck at me as the CTO to say, do you know what the hell you're doing? Are you spending too much on things that you shouldn't be doing? And what's the end result going to be?
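
One way to hold the line Alan is describing is to write the pilot scope down as a small, reviewable artifact before anything gets built. The Python sketch below is hypothetical, not a template from the show; every field, value, and the eight-week check are placeholders to adapt:

    # Hypothetical pilot-scope checklist; every value below is a placeholder.
    from dataclasses import dataclass, field

    @dataclass
    class PilotScope:
        problem: str                 # the one bottleneck this pilot targets
        success_metric: str          # the measurable "pin it on the wall" result
        duration_weeks: int          # keep it tight; the conversation suggests 6 to 8 weeks
        budget: float                # fixed up front so it can't quietly creep
        data_sources: list = field(default_factory=list)
        stakeholders: list = field(default_factory=list)

        def ready_to_start(self) -> bool:
            """If you can't define success, don't start the pilot."""
            return bool(self.success_metric) and self.duration_weeks <= 8

    scope = PilotScope(
        problem="Back-office process takes too long per request",
        success_metric="Cut average handling time per request in half",
        duration_weeks=8,
        budget=100_000,
        data_sources=["process logs", "historical tickets"],
        stakeholders=["operations", "customer service", "IT", "finance"],
    )
    print(scope.ready_to_start())  # True

The point is not the data structure; it's that the success metric, the time box, and the budget are written down and agreed on before the CTO is asked for a number.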
[00:29:57] Speaker A: Yeah. So I think some of the key things that we need to be asking as CEOs are: do we have the data that we need in order to solve this problem? We have to have an idea of what problem we're trying to solve and come to you with, hey, Alan, I'm trying to solve this precise problem, and this is about what it's costing me right now. So I have an idea of what that problem is costing me, what that bottleneck in my business is costing me, or what that growth strategy is. Because a lot of times what we're looking at is amplifying our current human capital through the use of automation. That's what most of my clients are looking for. They're wanting to grow and scale. So I have a very good idea of an operational efficiency goal, maybe, or whatever it is. And I'm going to come to you and say, this is the problem, this is what it's currently costing me. What do you need from me? What do you need in order for us to assess this problem and create a technical solution for it? Because I'm going to tell you, and I am talking to every one of you who is watching right now: CEOs, we don't know everything. We definitely don't know the tech side. Even tech CEOs miss the mark on this sometimes. And so we have to ask our subject matter experts in this space these questions, so we know whether we have the data to compile. So I'm going to ask you, Alan, what is it that I need in order to solve this problem? What are the considerations from a data standpoint? Can you go into what that looks like? Is our data clean, and what data do I need? I'm going to ask you what is necessary to solve this problem, and I'm going to shut up and listen.
[00:31:37] Speaker B: Yeah, I think my response back to you is, of course, it's going to depend on the problem. But the biggest challenge that we as tech folks have is that we sometimes put blinders on and only look at data that is related to our field, and we just ignore everything else that impacts it because we're not familiar with it, or we over engineer things sometimes. And with data it's very easy to do that with these models. If you assume that they're not biased, good luck, but that's a different discussion. However, as you're training them and trying to get them to focus and specialize in certain areas,
if you are only wed to one model and that's all you ever use, then you've got a big problem, because some data is going to work better with certain models than other data will. And so just really keep an open mind, try multiple models, use different types of data. And by types, I'm not talking about formats and such; I'm talking about maybe I need to use financial data, maybe I need to use customer data, maybe I need to use weather data, whatever that is. But really look at all sides of that and say, this model I'll use for this with this data, this one with this, and then I'll try to combine them. And I think from a CEO perspective, understanding and having that dialogue to say, oh, you know what, this impacts us in more ways than you as a tech guy can see, that's really where the CEOs should come into play. Because that's what you guys are great at, right? You've got a ton of business skills in a lot of different fields, and you would be able to tell me better, oh yeah, this impacts this, you need to look at that. Doesn't matter; we'll figure out how to get the data. But having that understanding and that dialogue is going to be really important.
[00:33:41] Speaker A: And I think, you know, Alan, I'm sorry, I didn't mean to interrupt you, but this goes to who do we have at the table? This isn't just a CEO and a CTO conversation. This is who are the players, how many people are going to be touched by this, how does this interrelate? So maybe our solution is a back end optimization in customer service, and the feedback that we've been hearing from our customers is that it takes too long to solve customer service issues, or maybe they're on the line for a long time, or it takes too long to get a solution. Whatever that issue is, that's an identified bottleneck in our business. And I will come to you and I'll say, okay, well, right now the time on a call is an hour and 17 minutes. That's because our customer service team is understaffed, it's hard to hire, we have a revolving door, whatever the reasons are that we have decided to identify this as our first opportunity to automate. And then I'm going to say, okay, so who touches customer service? As a CEO, it's not me; I'm the one who gets everything reported to me. It's our customer service team, it's the boots on the ground, it's the front end workers, it's their managers, it's the customer who is going to interact with this. Maybe our sales team has a part in this, because they're talking to customers and explaining what the customer service is. So every point in that entire process, anyone who touches it, has to have representation at the table in the planning phase of this. Because I can't just dictate. And this is one of the biggest things that I see with people who I consult with. The CEO dictates: I'm going to do this, I'm going to automate this, it's going to make life lovely, everybody go forth and conquer. And it's vague. The process is not right. We're missing crucial pieces, or people who touch this particular process along the way. So it's important for us as founders, as executives, as entrepreneurs to understand every place. Where does this process originate? Oh, it originates when they hit the website. Oh my goodness. Okay, so we have to go all the way to our front end marketing.
So we need to have our marketing and our sales and our customer service, and then on the output side, the fulfillment side has to know about this, because maybe it's a fulfillment issue and it's not getting to people on time, or whatever that is. We have to have every single department that touches this in this conversation. Because the reality is, we are not doing it as CEOs. What CEO is touching every part of this process? Maybe from a solopreneur standpoint, which we also are, but we need to know every single point in the process, otherwise it's going to fail because we're missing some key point in the process. So it really comes down to you bringing that question to me and saying, hold the phone, I need to get the right people in the room so that we can have this educated discussion and look at what we actually have and what is actually happening.
[00:36:30] Speaker B: Yeah, yeah. And one of the customers that I've been working with, this is the difference between understanding what machine learning models are really good at and what they're not. They just give you the right answer, but the wrong results that you're looking for. They are a product company, and they were trying, and are trying, to better market themselves to folks, and they're training all their models on their own internal data. Well, that's great, but your customers are more than just your data. And so they're saying, oh, the AI is not working because it's not increasing the acceptance of our products. And they're only training on their data. So the AI is giving them the exact answer that they're getting and should be getting, because that's the only data they're using. And then they're shocked: oh, how come the AI can't do this, this and this? And it's because you didn't train on anything else. They don't live in your bubble. People live outside of the bubble. Use more data than just that. And that's a discussion that has to be had early on, too.
[00:37:44] Speaker A: Yeah, you're absolutely right. And so we do have to take a brief break, because we have run out of time, unfortunately, for the segment. But let's pick that up when we are right back after these important messages. Welcome back. Alan and I are discussing everything you need to know and plan for, for a successful pilot. And before the break, Alan, you were talking about a client example where they wanted to improve their marketing. They had a product, they were trying to get a higher acceptance rate, but they only trained on their own internal data. And so some of the first questions, and we talked about this in the break, but some of the first questions that come to my mind are: are they even good at marketing? Because I see this all the time. A company comes to me and they want to utilize AI for their marketing, but their marketing is not good to begin with. They don't have a clear marketing message. They don't have a clear brand identity or brand values. And their sales process is, quite frankly, terrible, and they're not closing. Maybe they have a 5% or a 10% conversion rate when the industry average is 35%.
And so the first question I ask, as a CEO or as a consultant in this space, is I ask the CEO: okay, so what are your sales conversions? What is your sales conversion rate? And then I look it up and I say, well, here is your industry conversion rate, and this is your sales conversion rate. What are you missing? And have you gone and asked? And the easiest solution to this, Alan, is so simple. It's to ask the people who say no why they're saying no. What is it? Is it price? Is it not the right fit? Is it not something that's important enough for you to spend money on in today's tighter wallet economy? What is it that is causing that no? And if we are not closing that feedback loop with our prospects, we don't know.
[00:39:58] Speaker B: That's right.
[00:39:58] Speaker A: We don't have that data. And I'll tell you, most small businesses do not close this loop at all. So give me a couple of things that you're seeing on your side. Let's keep that same company who trained only on their own internal data. But maybe they're not optimized, and the CEO is not happy because they didn't get the result that they wanted, because, quite frankly, they don't have the right data to solve their problem. Talk us through that and how to solve it.
[00:40:24] Speaker B: Yeah, it goes back to the same old sales discussion that we always have. It's great that you have a whole bunch of people that have said yes, but I don't care about the yeses in the machine learning models. The yeses are going to follow your standard Gaussian curve, and you're going to hit those more often than not. How you really increase your acceptance and your ability to market, and train these models to help you, is by understanding the no's. Why did they say no? Why did they say, we don't even want to look at that, we're not interested in that, whatever. I need those cases. I need more of that data and more of that information. And oftentimes they don't even look at it from that perspective. They won't try to dig in. They won't try to get more info on the no's. They just put those aside and write them off. That's the data that we need when we're trying to train these models to help improve. And like I said, most companies don't even look at that, because they see it as a negative and they don't like to deal with negatives. And I know some CEOs that I've talked to ignore that kind of stuff. They don't want any part of it, because they want to highlight the positives and get rid of the negatives. But me as an engineer, I need the negatives to train these models. And so I think it's a mindset shift that's going to have to start to take place.
[00:41:53] Speaker A: You know, it's really funny. I'm sitting here thinking, man, you could be talking about what we do. I'm a coach. I'm a business strategist and an executive coach. And one of the biggest things that we see is they always make excuses, or they have reasons why they had no's, but they haven't actually asked. And it's so hard to get them over that hump, because they just say, well, it's the numbers, it's a numbers game. Well, yeah, it's a numbers game, but your competition is winning. Your competition is winning. What are they doing? Have you secret shopped them?
And let me tell you, 99.9% of the time when I ask that one question, the answer is no. Well, why would I? Yeah. So if your competition has a greater market share, you need to understand what they're doing, where the gaps are, where their clients are not happy. Solve for that and, boom, you increase your market share. It is such a simple thing, but it's like this: entrepreneurs or executives or marketing teams have a flashlight. They walk into a dark, dark room and they're shining the flashlight, and they can only see what's spotlighted, and maybe a little bit just off of it. But when I take that flashlight and go look back there, they're like, oh my gosh, I had no idea that was back there. And it's that classic tunnel vision, because we are on our way and we're looking for that one thing. I call it confirmation bias. It's confirmation bias. We want to see the result that we're looking for, and we see it in research; it's very hard. That's why quality research requires double blind studies and all these other things, because we have something in our mind that is our hypothesis and we think we know. So we're always looking for the confirmation of what we know, and we make excuses about all the rest. But the rest is where our growth is.
[00:43:42] Speaker B: That's right. That's right. And if they would just take a page out of, and I don't say this very often, even social media reviews: people will write more when they're ticked off about something and they've had a bad experience. If you don't even bother to ask, then you're wasting so much information that is more relevant than a yes. It kills me. It really kills me.
[00:44:11] Speaker A: Yeah. And I look at it like this. My mindset shift is it's just a not yet. They don't k-n-o-w know enough to say yes. That's the reality. Or I didn't show them the value; they didn't see the value in how I presented a product or service. And it's really hard for salespeople to take that. It's really hard for marketers to take that point of view, because their whole job revolves around getting leads. Whether it's a quality lead or not is a whole other ball game. And we don't know if we don't explore the no.
[00:44:44] Speaker B: That's right. And if they would think about it and maybe shift it a little bit, it may not even be a no because of your product. There are other events taking place in people's lives that we can get data for: public data, even social media, Nielsen TV data, for goodness sake, Census data. All this data out there is available. And they could be saying no to your product because of some other reason that has nothing to do with your product. And having that kind of information is even more power, because then you can really start to use AI to expose the gaps that you need to fill, and how you can market, and how you can really give the person that you're trying to sell to that perception that they have to have it, and these are the reasons why, as opposed to, oh yeah, you're a current customer, you should do this. Yeah, that's not going to work anymore.
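
Alan's point about needing the no's doesn't require anything sophisticated to start acting on: keep the lost deals, tag why they were lost, and count. A minimal Python sketch with entirely made-up records and reason codes; in practice these would come from your CRM, enriched with whatever outside data you can attach:

    # Made-up lost-deal records; real ones would come from a CRM export.
    from collections import Counter

    lost_deals = [
        {"reason": "price", "segment": "small business"},
        {"reason": "timing", "segment": "enterprise"},
        {"reason": "price", "segment": "small business"},
        {"reason": "missing feature", "segment": "enterprise"},
        {"reason": "price", "segment": "small business"},
    ]

    # Counting the no's already shows where to dig, and the same labeled
    # records are exactly what a model would need to learn from later.
    by_reason = Counter(deal["reason"] for deal in lost_deals)
    print(by_reason.most_common())
    # [('price', 3), ('timing', 1), ('missing feature', 1)]

The counting is trivial on purpose; the hard part, as both hosts say, is asking the question and recording the answer at all.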
[00:45:45] Speaker A: Yeah, I couldn't agree more. We are getting toward the end of our show today, so I want to wrap this up with a couple of clear, actionable items. Number one, regulation is here. There are very big risks, especially now in the EU, with AI utilization. We have to know where our data is stored, how we're getting our data, how we're using our data. We have to think about encryption. We have to think about a lot of things a lot more clearly, especially if we're going to do business in that area. And guess what, folks, if you're in the US saying, oh great, well, I'll just do business in the US, there are going to be regulations coming down the pipe for us as well. So it's better that we start to think about this now and solve for these things in the planning phase of our product. So if you're looking at AI integration and your head is in the sand, that's the wrong answer, because it's here. And you will not be in business in a year and a half if you haven't thought about this and automated something, because your competition is going to be integrating ahead of you. So that's the wrong answer. The right answer is: where are the bottlenecks in your business? What is your strategic plan, and what is the one pilot, the next pilot, that you want to explore? And then get with your CTO. If you don't have one, hire somebody such as Alan. If you're not sure about AI, hire myself. We come in and we can help you suss out: does this make sense from a financial standpoint? Do you have a good scope? What do you need from a data standpoint? Are you thinking about these things? Because that's the role of consultants. The role of consultants is to shine a light back there where you're not looking and no one on your team is looking, because you're laser focused on what you're doing. So having that external consultant is so powerful when it comes to considering your AI integrations and what you're doing next. Any automation, anything that you're doing in your business, if you're stuck, it's because you're not looking behind you. And the reality is, all of us do that. So hire somebody to come in and help you with your strategy, to help you get your game plan in order, and spend 96% of your time in the planning phase and then do your pilot. That way you have a better chance of success in your pilot so that you can move forward. Well, Jen, how much does it cost? Well, that's easy. If you don't know this, then it's time for you to dig into your numbers. Yes, I know you hate numbers. Everybody tells me they hate numbers. I hate numbers too. It's not my favorite thing in the world. I'm a people person. But I have to look at them in order to know where the business is going, because otherwise I'm throwing spaghetti at the wall, and that's just a waste of money and time and resources. So what is the process that you've identified as your bottleneck, the one that, if we unstop it, is going to accelerate your growth? And then how much does that process currently cost? What percentage of that are you willing to put into your pilot? And you've got your budget. Very simple. It's oversimplified for the sake of the show, but do that. So if you have an idea of your bottleneck, next get your numbers, then you plan out: okay, what kind of data am I going to need? If you don't know that, contact Alan. Alan, how can people reach out to you?
[00:48:50] Speaker B: Again, LinkedIn is the easiest way to get a hold of me, or go to my website.
[00:48:54] Speaker A: Yeah, I'm with you. LinkedIn. Go to LinkedIn. Reach out to me, reach out to Alan. We are there and we will hop on a call with you. We are consultants in this space for a reason.
We've seen this, and we want to make sure you're set up for inevitable success. So if you don't know how to get to that scope, hire consultants who have been doing this for a while, who know this inside and out and can help guide you to a very dialed in scope, a dialed in pilot that has a higher chance of success and less of a chance of going over budget. And then once you have a successful pilot and you see the results, and it's the results that you want, then you roll it out. Those are kind of the keys to successful integration. Alan, anything you wanted to add?
[00:49:35] Speaker B: No, I agree 100% with what you said. We've got to continue to go back to that old adage: measure twice, cut once. It's the exact same thing with AI and pilot projects. Spend more time on the planning, and then everything else is going to be smooth. If you skip the planning, you're in for a nightmare.
[00:49:57] Speaker A: That's exactly right. And unfortunately, all things come to an end, including this show. But the good news is we'll be back, same time, same stage, next week. So until then, take your action. Reach out to Alan or myself. We are here for you. We are a resource, and we'll see you next time.

Other Episodes


March 19, 2025 00:53:40

AI Today (Aired 03-19-2025) AI and Cybersecurity: Detect, Prevent, and Stay Ahead of Digital Threats

Discover how AI is transforming cybersecurity—detect phishing, fraud, and deepfake scams before they strike. Learn defense strategies to safeguard your identity, business, and data....


October 16, 2024 00:50:16

AI Today (Aired 10-16-2024) : Mastering AI Integration in Business

Dr. Allen Badeau & AI advisor Jen Gaudet explore AI integration in business—challenges, strategies & human impact


June 10, 2025 00:50:29

AI Today (Aired 06-10-2025) AI Surveillance and Identity Theft: What You Need to Know Before

Dr. Allen Bado reveals how AI, facial recognition, and deepfakes power modern scams—and what actions you can take to protect your data and digital...
