How to Balance AI Magic with Human Expertise written by John Jantsch read more at Duct Tape Marketing
The Duct Tape Marketing Podcast with John Jantsch
In this episode of the Duct Tape Marketing Podcast, I interviewed Jeff Coyle, the Co-founder and Chief Strategy Officer for MarketMuse. Jeff is a data-driven search engine marketing executive with 20+ years of experience in the search industry.
He is focused on helping content marketers, search engine marketers, agencies, and e-commerce managers build topical authority, improve content quality and turn semantic research into actionable insights. Prior to starting MarketMuse in 2015, Jeff was a marketing consultant in Atlanta and led the Traffic, Search and Engagement team for seven years at TechTarget, a leader in B2B technology publishing and lead generation. We discuss the intricate balance between leveraging AI technology and harnessing human expertise to create authentic content in today’s digital landscape.
Key Takeaways
Questions I ask Jeff Coyle:
[01:54] Can you set the record straight on the discourse of the inauthenticity of AI?
[03:49] What are the differences between some of the more robust technologies and the more pedestrian types such as ChatGPT?
[07:44] Does feeding language models lead to more authentic outputs?
[10:36] Talk about how tools like MarketMuse are perfecting AI as a tool
[14:58] How do you overcome the shortcomings of AI as a less-than-ideal substitute for expertise now that search engines reward it?
[23:07] Is there some place you’d invite people to connect with you?
More About Jeff Coyle:
Like this show? Click on over and give us a review on iTunes, please!
Connect with John Jantsch on LinkedIn
This episode of The Duct Tape Marketing Podcast is brought to you by Porkbun
Go to Porkbun.com/DuctTapeMarketing24 to get a .BIO domain name for your link-in-bio page for less than $3 at Porkbun today.
Testimonial (00:00): I was like, I found it. I found it. This is what I’ve been looking for. I can honestly say it has genuinely changed the way I run my business. It’s changed the results that I’m seeing. It’s changed my engagement with clients. It’s changed my engagement with the team. I couldn’t be happier. Honestly, it’s the best investment I ever made.
John (00:17): What you just heard was a testimonial from a recent graduate of the Duct Tape Marketing certification intensive program for fractional CMOs, marketing agencies, and consultants just like them. You can choose our system to move from vendor to trusted advisor, attract only ideal clients, and confidently present your strategies to build monthly recurring revenue. Visit DTM.world/scale to book your free advisory call and learn more. It’s time to transform your approach. Book your call today: DTM.world/scale.
(01:03): Hello and welcome to another episode of the Duct Tape Marketing Podcast. This is John Jantsch. My guest today is Jeff Coyle. He is the Co-founder and Chief Strategy Officer for MarketMuse. He’s a data-driven search engine marketing executive with 20-plus years of experience in the search industry, focused on helping content marketers, search engine marketers, agencies, and e-commerce managers build topical authority, improve content quality, and turn semantic research into actionable insights. So Jeff, welcome back to the show.
Jeff (01:37): Hey, thanks John. It’s good to be here.
John (01:39): So when you proposed this topic of AI and authenticity, I jumped on it immediately because certainly that’s one of the biggest complaints. I mean, you hear people out there saying, oh, well never use AI because that’s inauthentic. So why don’t you just globally set the table for how you’re approaching that topic?
Jeff (02:00): Yeah, I think the easiest way to set the table is this: the AI that you may have access to isn’t equal to all of AI as a concept. Just because the thing you have access to on your phone doesn’t do it doesn’t mean it can’t be done, and doesn’t mean that you can’t use that type of technology to inject all of the things that would represent you and your authenticity, or your brand’s authenticity, or your company’s authenticity. Not to say it’s user error, because there are a lot of problems with the technology that exists, and it is accelerating so fast. However, what I’ve heard is that there are large waves of folks who are almost discrediting an entire science based on their experience, and that is always something to take notice of. And I build content strategy, content plans, content briefs, content itself with authentic examples, authentic items. The last piece of this worth talking about is content execution. Content marketing, building content, doesn’t only mean the generation of text or the draft. It’s all the steps that lead up to it and everything that happens after it. And those things can be built and customized, made to be authentic, throughout that process, as long as you have an understanding of what it is and what it isn’t.
John (03:40): Okay. You said about 10 things there, I know. Let’s break that up a little.
Jeff (03:43): I tried to set a table, it’s a buffet.
John (03:48): So let’s back up a little bit there. What are the differences between some of the more robust technologies, if we want to call them that, and maybe the more pedestrian sort? Let’s start with ChatGPT, just because so many people know that. And I’m guessing when you said the thing you had on your phone, you were talking about that.
Jeff (04:05): It could be that it could be Siri, it could be, I mean, Alexa, the first time you had an Amazon Echo, you asked it a question, it probably didn’t get it right. Probably gotten better over time,
John (04:15): Right? Yeah, yeah. I listened to obscure bands. Apparently it can’t ever get that right.
Jeff (04:20): Oh, I know I did. It never does. Right?
John (04:23): So what are the differences between the more robust technologies, if we’re going to call it that, and then something that you referenced that you might have access to?
Jeff (04:33): Sure, sure, sure. So when you’re thinking about a ChatGPT or a Claude or Bing Chat or the chat you have on a search engine, those are just your access point to query: you’re hitting a large language model with a prompt and you’re getting a response. That’s the most basic prompt-and-response large language model. But there are complex AIs used for algorithmically predicting stock picks or algorithmically predicting churn risk. You can build models that will do things that are very relevant to your business that have nothing to do with large language models. Contrast that: if I ask my phone what stocks I should pick tomorrow and it tells me, hey, go pick Microsoft and Google, I’m like, hey, that doesn’t pick stocks, right? It’s not that AI can’t do that; it’s that this particular tool can’t.
(05:41): Another example would be complex math problems. If you ask ChatGPT to do some sort of complex college math, it won’t do it. However, there’s an entire Google team, the Minerva team, working on college-level mathematics and doing a very passable job of it. So the simple-access, tuned-for-the-masses large language model will get you the most predictive response to whatever you sent, however it interpreted it, with some error built in. It’s basically saying that the capital of North Dakota is, well, 99.99999% of the time the right answer to that question is Bismarck, so it’s going to follow up with that.
(06:33): If it’s giving more of a holistic, predictive step-through, it’s trying to pick words, maybe put some creativity in it. Even with ChatGPT’s technology, the GPT-4 that everybody’s using now, you can go into the backend and tweak what they call temperature to make it more or less creative. But in your experience on your phone, you don’t have that. You get one input, one output, with not a lot of configuration. So you’re getting a front end on a very complex technology that can be configured in dozens and dozens of different ways to be more creative or less creative. You hear people make fun of its use of the word "delve" in everything, and its use of the word "crucial." That’s because there are particular settings in the backend yielding that outcome. You can tweak all of that. It can be as customized as you might want just from the base model. So people are attributing to the science what really belongs to the user interface and the default settings of the software. And that’s really the differentiator.
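The temperature setting Jeff describes can be sketched in a few lines of Python. This is a toy illustration of the general idea, temperature rescaling a model’s raw scores before they become word probabilities, not any vendor’s actual implementation; the scores below are invented.

```python
import math

def sample_distribution(logits, temperature):
    """Turn raw model scores (logits) into next-word probabilities.

    Low temperature sharpens the distribution toward the single most
    likely word (predictable output); high temperature flattens it,
    giving less likely, more 'creative' words a real chance.
    """
    scaled = [score / temperature for score in logits]
    peak = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - peak) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Made-up scores for three candidate next words.
logits = [4.0, 2.0, 1.0]

cautious = sample_distribution(logits, temperature=0.2)  # near-deterministic
creative = sample_distribution(logits, temperature=2.0)  # much flatter

# At low temperature the top word takes almost all the probability mass;
# at high temperature the alternatives stay in play.
print(cautious[0], creative[0])
```

The same base model behaves very differently under the two settings, which is Jeff’s point: the defaults you see on your phone are one configuration among many.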
John (07:44): So you mentioned the language model a couple of times. Essentially what you’re saying is that’s what it’s been taught, that’s what it’s been fed. And a lot of people are finding that they can actually teach it, that they can feed it everything John Jantsch has written in the last 15 years and build a model around that. If I want it to write in my authentic voice, is feeding it or teaching it that going to give me a much closer model?
Jeff (08:14): Oh, absolutely it does. So what you can do is take something that’s been trained, or an ensemble of language models, and then also stack on your stuff, everything that you want. You can do just your stuff, but that could have challenges. So you can work with collections of those items. You can even do complex comparisons. There’s a thing called an AI agent. You might’ve heard they’re doing custom GPTs with ChatGPT and that software, but you’ve been able to build agents that do a lot of different things for a while. You can actually chain together various steps in a process to reference each of your works. Let’s say you found out it would be more precise to reference them individually, or to step through each one in a particular pattern, versus just treating you as the training data for the LLM. So you can take different approaches to solve specific goals with instruction sets. But it’s not limited to that. You can say: only include references to this book by John. This is the source of truth. Everything you do must reference this book, and you should consider it the tome of all knowledge. So there can be specific instruction sets, and these are not tremendously complex. These are things a business user can learn how to do, not just an AI developer or somebody who’s been in product management, engineering, and data science for 20 years like me.
John (10:06): So one of the things about ChatGPT is that it’s both a plus and a minus. The plus was it’s so easy to use; you just go type something, right? The minus was you got a much better result if you knew what you were doing, if you knew how to query deeper, how to ask for tone, and all those kinds of things. I’ll use MarketMuse as an example. What are some of the use cases you are building, actual steps or templates, if you will? I don’t know if we’re going to call them that. Talk a little bit about how tools like MarketMuse are actually taking the power of AI and building it into use cases.
Jeff (10:45): Some of the most exciting things, I mean, the engineering team and the data science team at MarketMuse are building by far the most amazing things we’ve ever built. The content briefs that we are building now are by far the best content briefs that exist in the market, which is exciting. They’re not out yet; I’m pushing really hard. But what we’re able to do is really special. When you already have your own data and you already have your own innovations and insights, you can use artificial intelligence to work with those things very magically. There’s a technology you might hear called RAG, retrieval-augmented generation, built on vector databases. If you imagine a concept in 3D space, and clusters of information in 3D space, you can work with those types of technologies. What used to cost millions of dollars worth of research and innovation now costs thousands, maybe tens of thousands, to get usable things that you can work with.
(11:55): And so what we’re working with now: we’ve always had this beautiful content inventorying product where we can look at your entire site and your competitors’ sites to understand and prioritize. We’ve always been able to build out a great basic topic model, how to cover a topic comprehensively so you sound like an expert. But now we’re able to use all of our preexisting innovations in novel ways to get amazing solutions, like, hey, what article should I write today? And have that actually come from a place of knowledge, not just what you would get if you typed "hey, what should I write today?" into ChatGPT.
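The "clusters of information in 3D space" idea behind RAG can be sketched with a toy retriever: content chunks become vectors, and the chunks nearest the query vector get handed to the model as context. The three-dimensional vectors and document names below are invented for illustration; real systems use model-generated embeddings with hundreds of dimensions stored in a vector database.

```python
import math

def cosine(a, b):
    """Cosine similarity: how closely two vectors point the same way."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def retrieve(query_vec, corpus, k=1):
    """Return the k chunk titles whose vectors sit closest to the query."""
    ranked = sorted(corpus, key=lambda item: cosine(query_vec, item[1]),
                    reverse=True)
    return [title for title, _ in ranked[:k]]

# Toy 3-D "embeddings" for three content chunks.
corpus = [
    ("Guide to topical authority", [0.9, 0.1, 0.0]),
    ("Brewery tour recap",         [0.0, 0.2, 0.9]),
    ("Content brief checklist",    [0.8, 0.3, 0.1]),
]
query = [1.0, 0.2, 0.0]  # pretend embedding of "how do I build authority?"

print(retrieve(query, corpus, k=1))  # → ['Guide to topical authority']
```

The retrieved chunks would then be placed into the prompt, so the generation step draws on your own data rather than the model’s generic training.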
John (12:33): Your website, your domain, I mean, that’s real estate that you want to own. If you’re an influencer, online creator, blogger, or really anyone who cares about their personal brand, then you need a unique domain. And now you can get your name dot bio, right? JohnJantsch.bio. Create a bio page to house all your various interests. It’s short, simple, easy to remember. Put all your links in one place instead of a laundry list of locations you want to send people to in a profile. You can reserve your own link for around three bucks right now at Porkbun.com/DuctTapeMarketing24. That’s right, around $3 right now: Porkbun.com/DuctTapeMarketing24.
Jeff (13:24): So if you think about that, it starts you on the path of having a content-strategist artificial intelligence product, which we are, in effect, building. We’ve also done the same with thousands and thousands of examples of developmental editing, which is the process of giving feedback on pages after you’ve consumed them. And we can do that from a place of expertise. Versus if you just ask, hey, how should I improve this piece, you’re going to get random stuff. But when it comes from an understanding of what’s going to work for organic search and what works in editorial processes, pretty cool stuff happens. I was just working on a page example, trying to mess with my own system, and gave it something to optimize the page for that the page wasn’t really about. And it coached me through how to actually make the page about something that it’s not.
(14:17): So I was like, oh wow, this is pretty wild, and very much through the lens of expertise. So those are the things we’re working on: how to prioritize, what-to-do analysis, clustering, quality analysis in a very unique way, what should I prioritize, and then executing with amazing briefs. What did I not say? Actually generating content. I firmly believe that’s the thing these technologies do the worst. If you can coach yourself almost there and then take it over the finish line with your expertise built in, magic things happen. Very magic things happen.
John (14:52): So you mentioned expertise at least four times in that answer, and it leads right to my next question. Search engines are theoretically now looking for expertise, looking for actual experience when you’re writing about a topic, that you can demonstrate "we do this" or "I’ve done this." That’s certainly one of the things that can be a potential shortcoming in AI. So how do you overcome that? Because you were clearly thinking about it; you’ve mentioned it numerous times.
Jeff (15:25): Well, I’ve been doing it for a very long time. I’ve been saying "topical authority" for, sadly, more than a decade now, trying to get people on the horse to be thinking about quality. But now it’s true, what you said: certain queries, certain research paths, require expertise. They should have always required expertise, but in reality, could it be assessed? So what I’m pushing for is ensuring that you have a human in the loop at various stages, and that you understand the types of things that, if you were to inject them into the process, would exhibit expertise. It’s things like brand style, tone, and voice. It’s things like the personal anecdote that needs to be in there. It’s things like, when I own a microbrewery and I’m writing an article featuring these types of beers, make sure the one I make is in that listicle, because that’s going to make it unique and illustrate expertise as well.
(16:30): If there’s a technique I want to feature, if there’s an area of expertise, if there’s a visit to a particular country I need to include to represent that I’ve actually been there, those are things you can inject early in the process and keep the flow logical, so it doesn’t just seem tacked on like a dreaded post-publish edit. And those are the ones getting killed right now, these post-publish edits. I call them postage-stamp content: 90% generated, and then they’ll put a personal anecdote at the bottom. You don’t even know whether it’s real or not. And the price of not being authentic, people are feeling it for the first time right now, because they didn’t go there early enough. A lot of their pages exhibit inauthentic information. And that’s where you get into a big problem.
John (17:26): Yeah, because before ChatGPT, let’s face it, for five, eight years, there were a whole bunch of content farms out there producing ChatGPT-like content, weren’t there?
Jeff (17:35): I’ll call it that. However, the sharps were winning, and many of those really sharp folks are still winning. But yes, there was a lot of content that was, I’ll call it, powder-room makeup on the pig. And you are seeing that, for many reasons, those sites aren’t tremendously viable right now. Where you had great infrastructure built, those still are, whether they were using generative AI or not. In the end, the content has to produce information gain: when someone reads it, they’re getting value. And then it has to be part of an infrastructure that makes sense. For example, if I put out the most amazing article ever written, medically reviewed by 20 veterinarians, the definitive guide to owning a Boston Terrier, but my site doesn’t support the fact that I am the source of content for that particular dog breed, it doesn’t make sense. If my site’s about ginger ale, it doesn’t make sense that I publish that piece. So there are pieces of the puzzle that can’t only be looked at at the page level or the paragraph level, and it all tells the story of: hey, you will pay the price if any part of your site is inauthentic.
John (18:53): Yeah, yeah. Wait, there are sites about ginger ale? That’s awesome.
Jeff (18:55): I’m sure there are. The two things I’m looking at right now are my dog and my ginger ale.
John (19:02): So I’m sure you get asked this: imagine you’re talking to a group of agencies, agency owners, or content folks at agencies. What’s the future for that aspect of the business?
Jeff (19:14): All the time. This is the question, right? What can I bring? I think what happened was, you saw a huge fallout of low-quality content providers. They’re in the process of going away, or they’re pivoting to focus on where they could go: either "we have to get really good at AI and use it basically on your behalf," or "we have to get really good at what differentiates us." So it’s "we will extract great information out of your expert’s head," or "we will actually go find experts and tell that story." What’s the key there? The ones that are still, I wouldn’t call them thriving, but surviving, asterisk, right now on content delivery are the ones that have experts in the loop at some level, or have really great processes for extracting information.
(20:07): I had a presentation over a year ago called The Rise of the Subject Matter Expert, predicting this very conversation, right? I changed it this year to The Subject Matter Expert Has Risen, because they are the most important person in the room, and that is where the puck’s going. It’s going to be about finding passionate people who can tell a unique story that can be woven into content assisted by AI. That has always been the magic. Now it’s going to be the only magic that works, with very few exceptions, especially when topics require experience. And in cases where it’s really hard to differentiate, where there’s a lot of duplicative content, people are going to have to get creative and differentiate, and that’s going to be a race to the top, frankly. So if you’ve gotten away with lots of generic content and it’s still working now, I wouldn’t feel comfortable.
(21:12): I would still want to go back and say, what are we bringing to this that’s special? Is it unique imagery? Is it personal experience? I’m not saying write a 2,500-word blog on top of all your recipes, but other industries have already gone through this, and that’s one of them. Folks, there are only so many simple chocolate chip cookie recipes. That industry already had to figure out how to differentiate. Do they differentiate with their brand? Do they differentiate with their clout? Do they differentiate with exciting imagery and video? Those are going to be the challenges for folks who haven’t historically had to deal with this but are still prospering with generic content.
John (21:54): Yeah, it’s kind of like in the old days, black hat SEO stuff would work, and that’s why people kept doing it, but then one day it didn’t.
Jeff (22:03): "Then one day." It’s the Pixar story spine, but they don’t like to finish it, right? It’s like, "then one day," and then we said, oh, shucks. No, it’s true. It’s all rinse, spin cycle, repeat. And there are enterprise SEO brands, large brands, large publishers, getting away with black hat techniques. Oh, yeah. I mean, give me three beers and I’ll name each one and walk you exactly through the processes they’re using. But you aren’t able to do that. And if you are doing it, it is still a risk to do some of these things, and we see that with the gains and losses. With every step you take, you want to get more predictive and less speculative. And if you’re relying on tactics that are troublesome, they have a shelf life; you just have to do it with those eyes open.
John (22:58): Yeah. Yeah. Well, Jeff, I appreciate you coming and spending a few minutes on the Duct Tape Marketing Podcast to talk about this interesting and evolving topic. Is there someplace where you’d invite people to connect with you and find out more about MarketMuse?
Jeff (23:11): Oh yeah, sure. You can email me at jeff@marketmuse.com, and we’re at marketmuse.com online. We have reverse trials, so you can get free access to our paid solution, and lifetime free access to some of these other solutions. Give me a call if you want to talk more about the site-level strategy and recommendation engines we’re working on building. I’ve got tons of webinars, everything; I love this stuff. I’m on LinkedIn, and on Twitter/X I’m Jeffrey Coyle. And what I’d recommend: if you’re playing with this stuff by yourself, there are a lot of communities trying to figure out the best ways to do it. Just make sure you’re bringing your expertise with you at every step of the process, and your outcomes are going to get better and better every day, whether it’s your brand style, tone, and format, or you’re actually bringing in your personal anecdotes and ensuring they make it into that final product. You can get pretty far with just some basics like that and really make this work for you. And always have a human in the loop. Don’t publish right out of the machine yet. There aren’t any examples of that working that I think don’t have a shelf life. I’ll leave it there.
John (24:23): I know that I’ve written at least 4,000 blog posts in my career, and I’ve never wrapped one up by saying "in conclusion," which every single ChatGPT article seems to do.
Jeff (24:35): Yeah, there are a lot of those. Some of them also say "continue."
John (24:41): Or I’m out of soap. Yeah, exactly. Right.
Jeff (24:45): But the reality is, being able to look at any one of those thousands of blog posts through the lens of a developmental editor? Guess what, I can guarantee you it’ll be 4,000 great experiences, and that’s the kind of stuff I’m doing, because you are the expert, and it’s about getting through those blind spots and seeing what’s magical and what you could do next. There isn’t anything out there doing that out of the box, but that’s not to say it can’t be done. For everybody listening: if you have a great use case, get it into a community with people you trust, and they may have a path to somebody who can do this stuff with AI, because I know some smart folks who get this stuff done.
John (25:30): Awesome. Again, I appreciate you taking a moment to stop by. Hopefully we’ll run into you one of these days out there on the road.