Publish & Prosper
Where publishing, ecommerce, and marketing collide. Brought to you by Lulu.com.
Where Does AI Belong in Publishing?
In this episode, Matt & Lauren explore four different ways publishers integrate AI tools to better support and streamline internal workflows. While we never advocate for using AI as a substitute for human creativity or editorial judgment, these tools can be a great way to help:
- Clean up and optimize book metadata
- Route and triage manuscript submissions
- Track version changes and content updates
- Track, plan, and modify production schedules
Listen now wherever you get your podcasts, or watch the video episode on YouTube!
Dive Deeper
💡 Learn More About
- IBPA | The Fast-Changing World of AI and Publishing
- Publishers Weekly | When It Comes to AI, Adapt or Else, Says Keith Riegert
💡 Listen to These Episodes
- Ep #59 | Mastering Book Metadata to Maximize Market Reach
- Ep #78 | How to Use Generative AI to Sell More Books
- Ep #100 | Looking Ahead at Publishing Trends for 2026
💡 Read These Blog Posts
- Think Outside the Bot: How AI Can Help Content Creators
- Genres, BISAC Subject Codes, and You
- What the Hell Is Metadata?
Sound Bites From This Episode
🎙️ [4:46] “The simple rule of thumb here is that AI can click, copy, sort, summarize. But humans should choose, shape, and approve.”
🎙️ [22:05] “I just think that any time you're dealing with multiple drafts of something and you're using an outdated tool and workflow and process, you're just bound for all kinds of heartache and trouble.”
🎙️ [45:34] “Use it as a tool to free up the creative people on your team, or yourself personally. So that you can devote more time to those creative pursuits, that creative work and the things that specifically need that human touch. Let it do the boring work so that you can do the fun stuff.”
💀 Can’t wait for our next episode? Check out our Resources page for links to our blog, our YouTube channel, and more.
💀 Find us on Facebook, Instagram, and LinkedIn at luludotcom!
💀 Email us at podcast@lulu.com
💀 Sign up for our mailing list.
Matt: Welcome back everyone, to the first episode of Publish & Prosper for 2026.
Lauren: Wow.
Matt: Hope you had a good holiday break. I certainly did. That's a weird statement to make because actually, in all transparency, we're recording this before January. So, I'm going to assume I had a good holiday break. I know I will.
Lauren: I hope you had a great holiday break.
Matt: I'm sure you and Rose will have a great one, too.
Lauren: I hope so too.
Matt: Yeah.
Lauren: I actually just got a text from my dad, who’s staying at my apartment.
Matt: Oh, I thought you were going to say I got a text from Rose.
Lauren: Oh, no. I got a text from my dad, who’s staying at my apartment right now, that Rose is curled up in a ball under the blankets on my bed. So she is spending her snowy Friday afternoon in December exactly the right way.
Matt: I'm sorry for your dad.
Lauren: I'm sorry for Rose.
Matt: Fair enough.
[1:02] - Episode Topic Intro
Matt: Okay. So first episode back. I'm kind of excited about this one. Today we're going to be talking about using AI in the back office of publishing. So using AI to automate workflows without touching or sacrificing editorial judgment. So again, not using AI for creative purposes, but how you, as a publisher, can use AI and how a lot of other publishers are already using AI, whether you realize it or not. So the latest issue of the IBPA Independent. The first thing I did when I got it was open it up, and the very first page I opened it up to was a nice big quote right there from, from Keith Riegert, or Riegert, apologies if I said one of those the wrong way, but. Keith speaks at a lot of conferences, a lot of publishing events, and has a lot of great opinions. But even he says here in this one, “There's no doubt in my mind that every industry, including publishing, is going to become very reliant on generative AI, AI agents, and AI workforces.” Now, I know that's a somewhat general statement, but I think it's one that people need to keep hearing. And it's part of the reason we're going to talk about this today. And if you can get your hands on this, a copy of the IBPA Independent, and I'm sure it's available online too. There is some really great content in here around AI and publishing. And we're going to talk about some stuff today too that hopefully runs hand in hand with that. And I'm sure we'll touch on this more throughout the year.
Lauren: Absolutely. I think it's unavoidable at this point. I also do think that, while this is definitely coming at it from a publishing lens, a lot of the practical things that we're going to talk about in here are also applicable to indie authors, small business owners, entrepreneurs, creators. Just when it comes to how to integrate AI as an assistant, as an operational tool into your workflow.
Matt: Yeah.
Lauren: So we're going to talk about it through a publishing lens, but I think it's applicable to pretty much anybody listening. Just…
Matt: Yeah.
Lauren: Tweak it slightly –
Matt: Nope.
Lauren: – for your own needs.
Matt: That's a great thing to point out, yeah. We typically talk about topics where we can have something that's applicable across groups of users, user bases. So, yeah, while this is very much focused on using AI in a publishing back office setting, it is not necessarily specific to small, medium, or large publishers, hybrid publishers or like you said, indies and other types of businesses.
Lauren: Yeah.
[3:20] - Understanding AI as a Tool
Matt: So, yeah, let's jump in. One of the, the most common misconceptions, or at least, you know, most AI talk in publishing seems to be sort of wrapped up in conspiracy and debate about writing blurbs and ad copy and even, you know, manuscript copy. But the real, I think, progress being made and I think where the real conversations should be happening, are in that boring stuff. It's in the spreadsheets, it's in routing workflows. It's in file names and calendars and notes and automating other things that save you time so you can be more creative.
Lauren: I think that that's again, something that if you're not inside the scope of the industry or even in-house, you might not realize how much...
Matt: Yeah.
Lauren: Busy work goes on on the back end, inside the office. And I think that there's a lot of opportunities that people don't even think about or realize for implementing tools like this to help streamline that. So instead of your editor, or their assistant, spending so much time working on the minutiae of these details, they can work on things like helping you make your book better.
Matt: Yeah. Yeah. So essentially, again, we're talking about AI as more of an operations assistant, right? An office assistant, however you want to phrase that, like using AI again to do those things, not as an acquisitions editor or a line editor or a cover designer or any of those types of activities, not as a tool that would actually choose manuscripts or submissions for you. I think the simple rule of thumb here is that AI can click, copy, sort, summarize. But humans should choose, shape, and approve. And I think if we can apply that, as a rule of thumb over everything we talk about and how you implement AI, I think we're off to a good start. So if a decision needs taste or ethics or brand risk baked into it, then a human should make it. Because AI does not have taste. It certainly doesn't have ethics, at least not right now. And definitely doesn't quite understand brand risk at all. So those are absolutely human territory.
Lauren: Yeah, but even if it can do the things to just help you better understand –
Matt: That’s right.
Lauren: – how to make those decisions.
Matt: Yeah.
Lauren: Just doing the grunt work, basically.
Matt: Yep.
Lauren: And that’s what we're talking about here.
Matt: So we'll talk about some use cases, three or four of them. Again we've touched a little bit on, in a previous episode, how people are already getting comfortable using AI to clean up their metadata. Suggest better metadata, you know, pull comps, things like that. That is the stuff that you should be using AI for. But where it's also being used in some publishers' operations already, and where others can start incorporating it, is things like routing submissions or version tracking or production scheduling or taking meeting notes. If you're not already doing something as basic as that, turn on Gemini and let it take meeting notes when you're in a meeting; it's a much cleaner process. It's going to catch everything and then summarize it for you at the end of it, and there's a ton of different tools out there already that do that stuff. So, we're going to focus on four. Metadata cleanup. We'll talk about routing and triage of submissions. We'll talk a little bit about version tracking, and change summaries, because I think there's a lot of that that goes on. And if you can automate any of that, it could greatly reduce the amount of time it takes to get a title to market. And then lastly, we'll touch on some production scheduling and operational planning. And then we'll follow it up by talking about maybe some best practices, how would you roll this out –
Lauren: Yeah.
Matt: – potentially, inside of a, an office where maybe you've got a lot of people who are still kind of deathly afraid of it.
Lauren: Culturally and practically, how to roll this out, how to implement it, how to start. You don't just wake up one morning and say we're going to completely overhaul our workflow.
Matt: Yeah.
Lauren: And automate everything.
Matt: Yeah.
Lauren: So we’ll talk through some practices for that too.
Matt: Yeah, reaction is everything. What your culture is currently is important as it pertains to how you treat a tool like this, but how you shape your culture going forward to sort of incorporate AI and AI usage, I think, is also super important. So.
Lauren: Something that we've talked about in the past, really important within that implementation, within the culture and the introduction and understanding that, is if your approach to this is I hate this, I want nothing to do with it, I'm going to stick my head in the sand and avoid it.
Matt: Yeah.
Lauren: And you don't implement policies and guidelines for how it can be used or you're not aware of how people are using it, you are more likely to get screwed over by it. Like if you don't have an understanding of how it works, or if you don't have a structure in place of like, we can use it for this, we absolutely cannot use it for this. We will not use it for this. We do not condone the use of it for this.
Matt: Yeah.
Lauren: Because you haven't made that clear, you no longer have control over whether or not people are using it in ways that you don't want them to.
Matt: Yeah, it's true. I think people are often times more comfortable trying or doing something when there are guardrails in place, because they don't feel like it could go off the track so easily. Especially if they don't know what they're doing or they're afraid of it. Whereas if you just say, here's ChatGPT go do some cool stuff with it, I think people are generally still like, oof.
Lauren: Yeah.
Matt: I don't know, this, this seems sketch. So I agree with that. I think, yeah, I think that's important.
[8:25] - Using AI for Metadata Cleanup and Enrichment/Optimization
Matt: So we'll jump into the first use case that I think a lot of publishers are already using this.
Lauren: Yes.
Matt: Small, medium, and large. There’re already a lot of tools on the market specifically for doing this. But I think the most popular and the first thing people really started using AI for in publishing, for the most part – outside of some of the bad actors who immediately started creating terrible books and posting them up on Amazon – is metadata cleanup and enrichment. So running your existing metadata, if you have it, through a tool, an AI-based tool, or even ChatGPT or something, and getting cleaner and better metadata, getting better comps, picking better BISAC codes. Like, that's always a pain point for people. Even when I published my book, BISAC codes were a nightmare. I thought it would be easy for me.
Lauren: Yeah.
Matt: But like, publishing a nonfiction book and having to go through the nonfiction business BISAC categories and pick one that applied to the book that I wrote, I think I landed on like, entrepreneurship and maybe one or two others that were close. But, I mean, you know, picking BISAC category codes is a big pain point sometimes, you need to get it right.
Lauren: And that's what's so important in that. It is something that is time consuming and complicated, but it's also really important to get it right.
Matt: Yeah.
Lauren: I think a lot of people, especially in indie publishing, make the mistake of saying, ugh, there's too many options for BISAC, and there's too many different keywords, and I don't have the time or the bandwidth or the like, full knowledge of all the different options available to me to choose the right one. So I'm going to default to choosing a higher level one that I know is applicable.
Matt: Yeah.
Lauren: While we know that best practices are to niche down as –
Matt: To do what?
Lauren: – as tightly as you possibly can.
Matt: Did you say niche down?
Lauren: I said to niche down –
Matt: Oh.
Lauren: – into your preferred niche.
Matt: I don't think you ended it properly but that's cool.
Lauren: Well.
Matt: Yeah.
Lauren: That's okay. Close enough.
Matt: It is crazy, yeah. You know, the amount of people that don't understand the importance of your metadata.
Lauren: Yes.
Matt: And honestly, for the first several years that I was in publishing, I didn't truly understand the impact metadata had on discoverability. Understanding that is first and foremost central to being able to do it properly. But having a tool like AI, you know, or an AI powered tool that can do that for you, like you said, I mean, these days invaluable, so. There are other issues, I think, too though, that AI is really good for, when it comes to metadata and things. You may have backlist titles that you're going to kind of reposition. It's really good for taking that existing metadata if you want to reposition that title, feeding that metadata through, telling it what you're trying to accomplish with the repositioning strategy. And it gives you new, better metadata to go after that. Maybe that other audience or particular retailers and things. And then, you know, if you've got some last minute rushes for metadata, before a major retailer push or a deadline or something like that, instead of you poring over BISAC categories, metadata, descriptions, keywords, all of these things that we're talking about, again, this thing can do it for you in a matter of seconds. So I think it's probably one of the most used ways right now, if not the number one most used way in publishing. And one of the ways that they're using this is taking a first pass, using it to scan, you know, your catalog, flagging some, some different, inconsistencies that might be there, titles with missing keywords or categories that don't necessarily match the description, or weird capitalization or style breaks, I think is really helpful.
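To make that first pass concrete, here's a minimal sketch of what a catalog scan like the one Matt describes could look like, assuming your metadata lives in a CSV export with columns such as title, series, bisac_code, keywords, and description (all hypothetical names; a real catalog feed or ONIX export would differ):

```python
import csv
from collections import defaultdict

# Hypothetical catalog export with columns: title, series, bisac_code, keywords, description
def scan_catalog(path):
    flags = []
    series_titles = defaultdict(list)

    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            title = row.get("title", "").strip()
            series_titles[row.get("series", "").strip()].append(title)

            # Flag missing or thin metadata
            if not row.get("keywords", "").strip():
                flags.append((title, "no keywords"))
            if not row.get("bisac_code", "").strip():
                flags.append((title, "no BISAC code"))
            if len(row.get("description", "")) < 100:
                flags.append((title, "description under 100 characters"))

    # Flag style inconsistencies within a series (e.g., hyphenation drift across titles)
    for series, titles in series_titles.items():
        if series and len({t.count("-") for t in titles}) > 1:
            flags.append((series, "inconsistent hyphenation across series titles"))

    return flags

for title, issue in scan_catalog("catalog_export.csv"):
    print(f"REVIEW: {title} -> {issue}")
```

The output is only a review list; a human still decides what actually gets changed, which is the click-copy-sort-summarize boundary from earlier in the episode.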
Lauren: I think style guide is an incredible way to be using this. To be able to go through and say, hey, were we inconsistent –
Matt: Yeah.
Lauren: – in our metadata? Did one of these titles – which is actually something that affects libraries and bookstores.
Matt: Yeah.
Lauren: And if the metadata is inconsistent and in six of the seven titles in a series, you have a hyphen in between two words.
Matt: Yeah.
Lauren: And then in the seventh one, you don't have that hyphen there. That is going to disrupt the flow of those books appearing in the cataloging system. And something that small, something as small as a hyphen could disrupt your book’s discoverability –
Matt: Yeah.
Lauren: – your catalog in general. And who–who has the time to go through and manually review all that?
Matt: Yeah. Keywords.
Lauren: Yeah.
Matt: So ask it to give you…ten potential reader search keywords for this title, based on a description that you give it. You make the final call as a human, like we said. But let the tool do the work. Again, we talked about style. You just touched on some really good things, but also adding in things like, brand terms, tone and style guides. Using that guide as kind of your prompt guardrails each time, you can upload a style guide into your house AI tool that you're using. And so the tools are very good about making sure if you've got an uploaded file or guide that you want it to stick to each time, it will. And then again, we touched on backlist, right?
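As a rough sketch of the keyword prompt Matt describes, here's what it might look like with the OpenAI Python SDK; the model name, style guide file, and description are placeholders, and any chat-style tool your house has approved would work the same way:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Hypothetical house style guide, loaded once and reused as a guardrail on every request
style_guide = open("house_style_guide.txt", encoding="utf-8").read()

description = "A practical guide to launching a print-on-demand publishing business."

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; use whatever model your team has approved
    messages=[
        {"role": "system",
         "content": "You help a publisher draft metadata. Follow this style guide exactly:\n"
                    + style_guide},
        {"role": "user",
         "content": "Suggest ten reader search keywords for this title, one per line. "
                    "A human will make the final call.\n\nDescription:\n" + description},
    ],
)

print(response.choices[0].message.content)
```

The style guide travels with every request, which is the prompt-guardrails idea: the tool drafts, the human approves.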
Lauren: Yes.
Matt: So picking, like, a small experimental group of backlist titles, or something, doing a metadata refresh, and seeing if that helps regenerate some new sales, give some resurgence to those titles. I think those are some really great ways to use AI for metadata.
Lauren: Also for metadata research. Cause it is able to scrape not just your own internal systems, but any available, publicly available data as well. And you can do that. If you're sitting here saying, okay, I want…I want to best position this new book in the market.
Matt: Yeah.
Lauren: What are the keywords and categories that these ten other titles from other publishers are using?
Matt: Yep. And again, I don't know if I've ever met somebody that was like, I'm so stoked. Today I've got to do, you know, a couple different metadata projects for a couple different titles. Like, nobody wants to do metadata, it’s the worst. BISAC categories are like the tenth level of Dante's nine, ten levels of hell or whatever it is.
Lauren: It’s nine, nine levels of hell.
Matt: Yeah, it's the tenth level of Dante's Hell. Like, that and TikTok, they, they share the tenth ring, the tenth circle, whatever that is.
[14:00] - Using AI to Route and Triage Manuscript Submissions
Matt: All right. So we'll move on because that one's pretty obvious. And I think another way that publishers can use it, and some already are, to a degree, is implementing it as a way to route and triage submissions. And we've heard that editors, they’re just spending too much time trying to sort through all these submissions and weeding out the ones that just aren't a good fit, you know what I mean? Like, they're spending a lot of time doing that. And there are ways that you can speed that process up through using AI. You know, using it as a traffic cop, not necessarily a gatekeeper. Like, you're not gonna use AI to say this submission, this manuscript looks great.
Lauren: Right.
Matt: This trope is awesome. It's the perfect niche for us.
Lauren: Somebody did that trick where they copy and pasted the job description into a blank page in the doc and then made the text white and ChatGPT read that and said, oh, future New York Times bestseller.
Matt: Eugh.
Lauren: Yeah no, we're not talking about doing that.
Matt: No.
Lauren: Definitely, definitely not the case.
Matt: But, but using it as a traffic cop –
Lauren: Yes.
Matt: – and not a gatekeeper, I think, I think is the right way to do that. You know, you can use it to automatically tag submissions by genres, age categories, word count, whatever you want. However you already intake submissions and that process, and however you already use tagging, if you do. If you don't, great. Here's a good way to start. It will definitely make your life easier, but if you already are using different types of tagging, just build that out within an AI tool and it'll just take care of that for you. It'll, you know, bucket everything properly, it’ll route them to the correct queue, if you have queues set up. So, if you've got YA, fantasy, nonfiction, all those things, this can greatly reduce the amount of time that a human has to sit there and sort and go through and put these in the right buckets, so that another human can go in and be like, good fit, bad fit, good fit, bad–you know, whatever that process looks like. So.
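A minimal sketch of that traffic-cop idea might look like the snippet below: tag each submission by genre and word count and suggest a queue, with everything left open for a human to override. The genre buckets and field names are made up for illustration, and a real setup could swap the keyword rules for an LLM call or a trained classifier:

```python
# Hypothetical genre buckets; even simple rules can route the obvious cases.
GENRE_KEYWORDS = {
    "ya_fantasy": ["young adult", "ya", "fantasy", "magic"],
    "nonfiction_business": ["entrepreneur", "business", "marketing", "startup"],
    "romance": ["romance", "love story", "enemies to lovers"],
}

def triage(submission):
    """Tag a submission and suggest a queue. A human can override any field."""
    text = (submission["query_letter"] + " " + submission["synopsis"]).lower()
    word_count = len(submission["manuscript_text"].split())

    tags = [genre for genre, words in GENRE_KEYWORDS.items()
            if any(w in text for w in words)]

    return {
        "title": submission["title"],
        "word_count": word_count,
        "tags": tags or ["unsorted"],           # anything unclear goes to a human first
        "suggested_queue": tags[0] if tags else "manual_review",
        "routed_by": "ai_triage",               # audit trail: AI sorts, humans decide
    }

example = {
    "title": "The Glass Orchard",
    "query_letter": "A young adult fantasy about a girl who can grow magic from seeds.",
    "synopsis": "...",
    "manuscript_text": "word " * 82000,
}
print(triage(example))
```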
Lauren: I do think it's also important to acknowledge that we're not exclusively talking about something like ChatGPT.
Matt: No –
Lauren: Like, AI tools –
Matt: Yeah.
Lauren: – there, there are all kinds of tools.
Matt: That’s right.
Lauren: It's not just specifically things like ChatGPT or Gemini or Perplexity or whatever.
Matt: There are a lot of different tools out there on the market -
Lauren: Yeah.
Matt: - that exist. There are tools that allow you to piece together multiple tools to make workflows and things. There's just a lot of great stuff happening on that front. So it shouldn't be an issue of tools. Now, you might have to find somebody internally that is able to find the right tools for you.
Lauren: Right.
Matt: Sometimes that can be a bit of a challenge, but that's still not rocket science. That just is somebody who's going to do a little bit of work. Go get yourself an intern from the local college. Hands down they're going to know more about AI than anybody on your staff, that's for sure, and have them help you build this thing out. So sorting and routing, things like that, can really help speed up the triage of submissions coming in, especially if you're just inundated. But again, as a traffic cop, not a gatekeeper.
Lauren: Yes.
Matt: AI should not be making decisions on manuscript submissions, but use it to help route it, use it to help speed up the efficiency for those who do have to trudge through those piles. AI can do things like extract out bits and pieces of information from, you know, the query or the synopsis that they're required to send with the submission, and use that for tagging and routing as well. Author names, genres, comps that were mentioned, things like that. There should definitely be some guardrails though.
Lauren: Yes. Yeah.
Matt: Again, AI should never be the only one filtering and reviewing -
Lauren: Sure.
Matt: - manuscripts and things like that. It should never be making decisions on content per se. Humans should always be allowed to override the tagging process, by the way, and the routing. So if AI gets it wrong for one reason or another, there has to be a system in place where you could very easily reroute that if need be. And then quite frankly, AI probably shouldn't be allowed to write and send a rejection or an acquisition email on its own. So.
Lauren: Definitely not.
Matt: I think those are some easy ones to think about. Again, I don't know how many small, medium, and large publishers are struggling with their submission inboxes right now, but I know quite a few are. And I think this is a great way to test out some AI stuff if you're not already using it. And again, it could be pretty simple to get something like this implemented.
Lauren: Maybe. Maybe not. But maybe this is worth experi–I, you know, I love traditional publishing. I've always been less of a hater about it than some people in this room. But I absolutely think that traditional publishing has a technology problem. Where they are resistant to –
Matt: Yes.
Lauren: – adoption of –
Matt: Yes.
Lauren: – new technology. Including one of the things that I think is – unless you had more to say on submissions?
Matt: No.
Lauren: Can I move on to the next one?
Matt: But I have something to say about technology in the traditional publishing world.
Lauren: Oh well, you go ahead.
Matt: I agree with you.
Lauren: Okay.
Matt: I think I think from the office to the internet, like they get it wrong on so many levels.
Lauren: Yes.
Matt: You know what I mean?
Lauren: Yeah.
Matt: And it's amazing because when you think about the size of many of these publishers and the amount of money they spend on certain things, to this day you can still go to some of their websites and they're pure garbage. They don't work properly. You get stuck in infinite loops. Even just publishing industry, not even traditional publishers, publishing industry-related brands and businesses like... I don't know, for some reason, you're right. Like, there's a huge technology lag there, to a degree. Thankfully there are people out there doing the Lord's work like, you know, Supadu goes out there and builds websites for traditional publishers, and they build really nice websites. So there are people out there actively trying to help. And I think AI will also help with a lot of this. But you're right, like traditionally, historically, it seems like traditional publishing has been slow to adopt technology that could ultimately make their lives easier and, quite frankly, add a few more points to that margin. You know?
Lauren: Yes.
Matt: Make that P&L sheet look even better at the end of the year. So I agree with you.
Lauren: Yes, absolutely.
Matt: Now we can move on.
[19:30] - Using AI to Track Version Changes
Lauren: Okay. Well, I was going to say one of the things that I, I've actually had heated arguments with people–not arguments.
Matt: You?
Lauren: I'd say heated debates.
Matt: No.
Lauren: Me? Never. With people in the past about the absolute adamance that every traditional publishing company has about still using Word Docs with inline comments and just sending back and forth different versions.
Matt: Yeah, that is an internal struggle, I think for traditional publishing as well. And their production departments and their editorial departments is...I guess, alternatives have to exist.
Lauren: Right.
Matt: That are at least similar to what they're used to, which is Word Docs with inline changes, comments, and changes being tracked that way. But there are better ways, obviously.
Lauren: There are.
Matt: And so I think -
Lauren: Yeah.
Matt: I think that was a good transition for you to take us to the next use case.
Lauren: Yeah. Because one of the arguments that I will concede that is a somewhat valid reason for using the Word Docs, is that you have the historic –
Matt: Yeah.
Lauren: – track changes. You have every single version, unless you're saving over them, which nobody ever is. Everyone's always saving it as version 1.0, version 2.0, whatever. So you have that history of the different versions.
Matt: Yeah.
Lauren: But if you had some way of keeping track of and summarizing the different versions and understanding what's been changed from version to version, what existed in previous documents, what have we made different here? What updates do we need to make? Which one is the final version? Maybe there's a better system for doing that.
Matt: Yeah. I think there's a lot of ways that you can use AI to help with that and streamline that process. And some of the ways we know some publishers are experimenting with it right now are using it for things like comparing drafts, uploading a couple of different drafts. Ask your AI tool to summarize the main changes between the two, in bullet points or whatever works best for you, and then use that summary for like, internal status reports or cover brief updates or things like that. Or, you know, if the marketing team needs some sort of context or something like... So being able to compare drafts, extract or extrapolate out, you know, the major changes or differences. Being able to upload multiples and really sort of get the latest version, the most approved version, things like that. Comment suppression. You can take a long-ass editorial letter and turn it into like a one page internal summary. Something like here's the top three structural changes, or here's how the positioning of the book has shifted now, from this version to the latest version, or whatever that might be. And then ultimately you can use it to address your biggest concern, which is use it to sort of define the single source of truth, you know? Decide where that final final manuscript lives. Make sure your workflows always point back to that. And AI can help by summarizing and labeling a lot of different things. So, I think there are ways that you can implement AI to really help clean up that process, not have 37 different versions floating around with some confusion as to which one’s the final, final, final. And then when you add on top of that, again, things like editorial letters that are attached to it that are meant to be internal, for stakeholders and other people involved in the project. I just think that any time you're dealing with multiple drafts of something and you're using an outdated tool and workflow and process, you're just bound for all kinds of heartache and trouble. So I like the idea of using AI to really streamline that process and make sure everybody's clear on which version is the one we're working from right now. Which version’s the final final, ready to go to the formatting team.
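One small sketch of the draft-comparison step, using Python's built-in difflib; the file names are placeholders, and in practice you'd hand the raw diff to your AI tool and ask for a plain-language bullet summary for status reports or cover briefs:

```python
import difflib

def summarize_changes(old_path, new_path):
    """Produce a rough changed-line count and a unified diff between two drafts."""
    old = open(old_path, encoding="utf-8").read().splitlines()
    new = open(new_path, encoding="utf-8").read().splitlines()

    diff = list(difflib.unified_diff(old, new, fromfile=old_path, tofile=new_path,
                                     lineterm=""))
    added = sum(1 for line in diff if line.startswith("+") and not line.startswith("+++"))
    removed = sum(1 for line in diff if line.startswith("-") and not line.startswith("---"))

    return {"lines_added": added, "lines_removed": removed, "diff": diff}

report = summarize_changes("draft_v3.txt", "draft_v4.txt")
print(f"{report['lines_added']} lines added, {report['lines_removed']} removed")
# Optionally: paste report["diff"] into your AI tool and ask for a three-bullet
# summary of the structural changes for the marketing and design teams.
```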
Lauren: Kind of buried in there, you made a really good point too, about communicating the shifts in, in positioning between teams within the publishing house.
Matt: And between drafts too.
Lauren: Between drafts. Because that absolutely can be something that changes. And I've seen that happen before, where I've seen the marketing team working on an outdated positioning statement for a book, like they were working on the pitch that was given at a sales meeting six months ago. And we're two drafts later down the line, and some of the key details and some of the key positioning points of the manuscript have changed. But marketing and publicity haven't been made aware of that. So they're still marketing a book that is not accurate to the book that is going to make the final cut. And that's, that's a recipe for disaster.
Matt: Sounds like fun.
Lauren: So, you know, that's a great opportunity to be like, hey, we're just making sure that everybody is pulling from the most updated summary of this is what the current state of this book is. This is what the current –
Matt: Yeah.
Lauren: – top five USPs are. These are the keywords that we're working on, or the very specific things that you should be talking about when you're reaching out for earned promo opportunities for this book, this is what you need to know.
Matt: Yeah.
Lauren: Instead of people operating on outdated information.
Matt: Yeah.
Lauren: Because that can blow up in your face really fast.
Matt: Yep. Couple of guardrails for versioning. You never want to let whatever tool you're using or whatever AI you're working with modify the master files. That should be obvious, but we should also state it. And then, you know, because of the nature of the content you're dealing with, you want to make sure that whatever tools you're using comply with your own internal policy requirements and stuff that you've set forth as a business. Again, you're dealing with content. You're dealing with someone's manuscript. You're dealing with things that, right now, those are the hot button issues with AI. So you want to be very careful about that. When you're using AI to help with anything that relates to a manuscript, you really have to be very transparent and clear about how that's happening and making sure privacy guardrails are in place so that nobody freaks out.
Lauren: Which is possible.
Matt: Yeah.
Lauren: It is absolutely possible to use these tools with privacy settings in place for that.
Matt: Yeah. Of course.
Lauren: So.
Matt: Of course.
Lauren: Yeah.
[25:16] - Using AI to Manage Production Schedules
Matt: The fourth use case, and I know for a fact of a couple of what I would call mid-market sized publishers that are using it for this, but I'm sure there are a lot of others. They're using it around production scheduling, operational planning, things that have to do with actually getting the book to market. And the things that are kind of ancillary to that, or parallel with that. At that point in the journey, there's a lot of spreadsheets going around for one title, you know? Whether that's the sales team or the finance team or even the marketing team. There are now spreadsheets that have been developed. This title's getting close to go to market time. I might build a spreadsheet for something. And the minute I pass it on to somebody else, they're not going to understand it, I wrote that spreadsheet to fit my needs. If I write a spreadsheet from a marketing standpoint and I send that over to Nick or Thomas in Finance, they're not gonna know what my spreadsheet’s talking about. Chances are they won't. There are other issues around production scheduling, operational stuff. Publication dates get changed all the time. What do those delays do to the rest of that publishing chain? If a date gets changed, there are still things happening that need to be altered. There are activities that are being put in motion that now need to also accommodate that date change. Newer staff members potentially struggling to understand production timelines, things like that. So using AI to really come in and help with some of this operational and production stuff, the scheduling of things and planning, I think is extremely helpful. Definitely at that mid-market size where you got just enough people on staff for things to start getting confusing. But definitely at the higher levels of the bigger traditional publishing houses as well. Smaller publishers, maybe not so much. When you got a smaller team, it's a little easier to run timelines and schedules and have everybody kind of be on board. But when you've got a pretty big team, half of which, if not more of them, are working remotely, you're dealing with a bunch of different deadlines, like…stuff can go sideways pretty fast, right?
Lauren: Not just a pretty big team, but also a pretty big list. If you're putting out more than a couple of books a month, you've got to keep track of not just the production schedule for one title, but for all the titles on your list.
Matt: Yeah.
Lauren: And that's something that you need to be able to take a high level look at and see like, oh, we didn't realize that we had fifteen books coming out in the same week in May and only five books coming out the week before that in May. And that's going to make February really tough for some reason or whatever it is. I think also to your point about not just where spreadsheets are internal, especially for a specific person, but also when you're in a company or in a position where you're working out of either hard copies of documents, or digital documents that you're saving manually to your own computer and not using a cloud-based doc –
Matt: Yeah.
Lauren: – or a spreadsheet. If Matt puts together a production schedule and sends it to me, and I change two things on it.
Matt: Yeah.
Lauren: And then send that to Paul, and I forget to send the updated version back to Matt.
Matt: Right.
Lauren: Or I forget to tell him, or it gets lost in his email or hundreds and thousands of ClickUp notifications, and he misses that update. And he's still working off of the original version that he has on his computer.
Matt: Yeah.
Lauren: That's going to snowball into a very serious production issue later on down the line.
Matt: Yeah.
Lauren: That would not have been an issue at all if we were all working out of a streamlined –
Matt: Time is money.
Lauren: – managed –
Matt: Yeah.
Lauren: – system. Yes.
Matt: Time is money.
Lauren: Absolutely.
Matt: And in publishing, like a lot of other businesses, but definitely in publishing. The margins get tighter the further down the line you go? Or some would argue the reverse of that. They start the tightest at the publishing and production levels, and then they get better as they go on down to retail. And depending on who you ask, they'll tell you which way it goes. But the point is, margins are tight, and time is money. And so when you're dealing with a ton of different launch dates, a ton of different titles – you can define ton for your business relative to the amount of people that work for you, right? A ton for a ten person team can be twenty, a ton for Simon & Schuster could be 130. The point is, when you have a pretty standard process, right? A book kind of goes through the same stages and steps no matter what kind of book it is.
Lauren: Yeah.
Matt: For the most part. What's really cool is you can take that list of steps, right? So acceptance of manuscript, copy edit, design, proofing, all those things that happen. You can create a document from that. And then what you can do is take historical data from your last…I don't know, two or three hundred releases, and look at the time frames for each of those steps for each of those titles, based on the word count, dump all that into an AI tool, and then tell it to give you the averages. And then what it can do is it can generate a draft schedule for you for any title that you now take in, because it has all that historical data. So if you take in a new title and it's going to be, let's say, 225 pages or X amount of words, 80,000 words, and it's whatever, whatever, you can literally ask, based on historical trending and averages: what is the timeline, what should the target publication date be for this book? It's a great way to estimate publication dates, launch dates, and different timelines for activities. And it gives you a really good overview while you're project managing a bunch of different titles going through the pipeline. So I think that's really fun to think about.
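A minimal sketch of that averages idea, assuming a hypothetical spreadsheet export with one row per past title per stage and a days column (the stage list and column names are illustrative, and pandas is assumed to be available):

```python
from datetime import date, timedelta
import pandas as pd

# Hypothetical export: one row per (title, stage) with how many days that stage took
history = pd.read_csv("production_history.csv")  # columns: title, stage, days

STAGES = ["manuscript acceptance", "copy edit", "design", "proofing", "print & ship"]
avg_days = history.groupby("stage")["days"].mean()

def draft_schedule(start):
    """Generate a draft schedule from historical averages. A human owns the final dates."""
    schedule, current = [], start
    for stage in STAGES:
        days = int(round(avg_days.get(stage, 14)))  # fall back to 14 days if no history
        current = current + timedelta(days=days)
        schedule.append((stage, days, current))
    return schedule

for stage, days, finish in draft_schedule(date(2026, 2, 1)):
    print(f"{stage:24s} ~{days:3d} days  target finish {finish}")
```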
Lauren: I think it's also a really good opportunity. If you find yourself in a position where you're trying to either tighten up the process or trying to do something out of the normal process. Let's say there's a really popular surge in a specific book genre, and you're trying to get a new title to market while this trend is happening. And so while your normal production timeline from ideation to publication is twelve months, you want to see if you can get a title out in six months.
Matt: Yeah.
Lauren: So that you can get it to market faster. Having access to a tool like this where you can say, okay, one, is it even possible for us to get a title out in six months? And two, where would we have to tighten up? What would we have to streamline? What are our absolute drop-dead dates for when things have to be done in order to get this done? Help me streamline this so that we can get it out in six months instead of twelve. And then also, while we're doing that, what adjustments have to be made to the other titles on the schedule to make sure that they're not going to be delayed later on in the process because we're devoting our attention to getting this title out faster?
Matt: Yeah, and you can even run this as just, go into the tool and say, hey, listen, if I have to delay copy editing on this title, you know, by, let's say a week, realistically, what has to shift?
Lauren: Yeah.
Matt: You know, who do I need to contact? And hopefully you've already fed, again, all this stuff into your tool and including maybe the people in your team that touch the different… The AI can outline, you know, which milestones might need to move and what risks those introduce and help you better game plan. Like, okay, they really want me to delay this one by two weeks. We need to do this. Or in your example, I really got to pull this forward because we're trying to hit this particular event that might be happening. And this title is perfect. Like, we could capitalize on the wave that this event is going to produce, but that means I've got to pull forward everybody's timelines. Type that into your tool. What's going to happen? Who do I need to talk to? Which teams are going to be touched the hardest, and what do I need to make happen? What does that do to the other titles that are currently in the pipeline? It would know all that stuff already. It's such an easy way, in real time, to shift your production values and see what happens. So that's, that's really cool, I think. Yeah.
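Building on the same kind of schedule, a tiny what-if helper could shift one stage and push every downstream target by the same amount; the stage list matches the sketch above, the dates are made-up examples, and a real rollout would also surface which teams own each affected milestone:

```python
from datetime import date, timedelta

# A draft schedule like the one generated above: (stage, days, target finish date)
schedule = [
    ("manuscript acceptance", 10, date(2026, 2, 11)),
    ("copy edit",             30, date(2026, 3, 13)),
    ("design",                21, date(2026, 4, 3)),
    ("proofing",              14, date(2026, 4, 17)),
    ("print & ship",          45, date(2026, 6, 1)),
]

def shift_stage(schedule, stage_name, delay_days):
    """Delay one stage and push every downstream milestone by the same number of days."""
    shifted, push = [], timedelta(0)
    for stage, days, finish in schedule:
        if stage == stage_name:
            push = timedelta(days=delay_days)
            days += delay_days
        shifted.append((stage, days, finish + push))
    return shifted

# What happens if copy editing slips by a week?
for stage, days, finish in shift_stage(schedule, "copy edit", 7):
    print(f"{stage:24s} ~{days:3d} days  new target {finish}")
```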
Lauren: I would encourage anybody, if anybody is still listening to this that doesn't work in publishing, or is just listening to it for funsies at this point. You know, whatever. Thanks for listening. But I would encourage anybody to, to test out having a conversation with ChatGPT or your generative AI of choice.
Matt: Claude, Perplexity.
Lauren: Right.
Matt: Whatever you use.
Lauren: About how to streamline your workflow.
Matt: Llama.
Lauren: Sure.
Matt: Nano Banana – no, Banana’s a...image generation. Sorry. Some of these names are just getting ridiculous. Although that's kind of cool, but also kind of like, really? All right.
Lauren: I mean, I guess they’re memorable? I don't know. But I spent about an hour the other day exhaustively typing out my current workflow for the production of a podcast episode.
Matt: Oh.
Lauren: I typed out every tool that I use, every step of the process, what my goals are, why I'm doing it this way, what's important to me. I did all of that. I uploaded it all to ChatGPT and I said, this is my current workflow. These are my non-negotiables. This is what I need to end up with at the end of it. These are my goals for how I'm trying to streamline this. These are my pain points for I can't figure out how to do this thing, and that's why I'm doing it this way instead. How can you help me streamline my workflow?
Matt: Yeah.
Lauren: We had a surprisingly productive conversation. Which I also can't believe I just said conversation twice. But that is –
Matt: You've actually officially, officially cemented yourself as a professional yapper. Because it is now confirmed that you will talk to anything or anybody.
Lauren: Yeah, obviously.
Matt: Okay. I like that though. As your boss, I really like hearing that you did that.
Lauren: Yes.
Matt: I hope something productive came out of that. But that's a great – I love that. Yeah.
Lauren: Yeah. I just think it was a really interesting… Because it is something that in this particular position for me especially, on my team, there's nobody else who does my exact job. So there is nobody else in-house –
Matt: Yeah.
Lauren: – that I can bounce this off of that I can say where, where would you tighten this up? And now publishing, maybe not the same thing. But maybe you have different processes for doing something. Maybe one of you is doing something that the other one isn't. Maybe you've never thought about doing this way before. If you have the opportunity to say hey, here take an overview with this, what am I missing? What could I be doing differently? Is anybody else who's using this shared workspace doing something that I didn't think of within this workflow?
Matt: Yeah.
Lauren: It could be a small change that is a game changer.
Matt: Yeah. That's true. If you're doing all this anyways and putting all this information in there, take the chance for the opportunity to also then say, by the way, is this the best way to be doing this?
Lauren: Yeah.
Matt: I was initially coming at it from the standpoint of transparency. And again, everybody having a better understanding of just what happens in the production timeline when you move something. And again, being able to see that in real time. Or even like how beneficial would it be for somebody editorial to be able to sit with somebody in marketing and say, okay, you want us to do this, but let me show you what happens when we do this. You push one button and the tool shows what happens when you mess with the timelines for copyediting, developmental editing, proofreading, those things. And they can literally watch all these things shift. What happens to that target publication date? What happens to what would be those pre-launch campaigns? And so I think that's really cool. And that level of transparency, I think, can really help all the way down the chain. So.
Lauren: What about having an understanding of how that changes budget too?
Matt: Yeah, all of it.
Lauren: And how it changes the bottom line.
Matt: Yeah.
Lauren: Because that's.
Matt: Yeah.
Lauren: How does this one simple change domino the entire like, cost of this project?
Matt: Yep. All of it.
Lauren: Absolutely.
Matt: The guardrails there are, again, you know, humans need to be making final decisions on things, especially when it comes to production. A human is really going to understand better than AI, at least currently, whether or not that's enough time to get 25,000 books offset printed in whatever country you get them printed in and shipped to where they need to be. AI can't really make that decision for you. So things like that, humans have to have the final say. And then again, whatever you're getting from your tools, treat these as proposals or drafts, things like that. This should not be the final, like, you hand this over to the sales or marketing team and go, this is the delivery schedule.
Lauren: Yep.
Matt: Here you go. Like, you know, things, things are always in flux. So just, just remember that.
[36:34] - Tips for Defining AI Policies and Rolling Out New Tools
Matt: Now we get to the policies area, the guardrails, how to roll this out and, and maintain a really good culture around it. So let's talk about some of those things.
Lauren: It's a tough one. Well, it is something that is -
Matt: Why is it a tough one?
Lauren: Well, because it's. It's something that is worth approaching delicately, cautiously, intentionally. But I think it's also something that...I get a little bit of agita every time Matt says to me I want to do an episode on AI. Because I'm like, oh God, is, is this the time that we get cancelled, because we did?
Matt: If it makes you feel better, I do too.
Lauren: It does make me feel better, but you still keep making me do it.
Matt: Well, so let's talk about that. I think the first step for any organization, whether you're a publisher or some other type of business, and you really want to start implementing AI in some of your back office stuff, like we've been talking about. To make people's jobs easier, by the way, not necessarily to replace them. You start with some things that AI should never do.
Lauren: Yes.
Matt: You said this at the top of the episode. There really should be very clear guidelines about what people in your organization should not be doing with AI. So things like: AI is never in charge of accepting or rejecting a manuscript. It's there for routing purposes. It's there for clearing that inbox faster and getting them where they need to go.
Lauren: Indexing and cataloging, yeah.
Matt: It should never approve cover designs, right?
Lauren: It should never generate a cover design either, but.
Matt: That’s a whole other thing. I agree with you, but it definitely shouldn't be making the final decisions on cover designs, like. Or copy or again, anything where a human needs to create this content. Again, this should be obvious, but we'll say it: it should never make a legal or ethical call. There are a lot of people right now who are trying to replace their attorney fees by using ChatGPT, and it is absolutely comical.
Lauren: Jail.
Matt: So again, I think if you could lay down some of these things AI should never do in our organization first, people will get more comfortable. And like you said, when you have some guardrails, people are a little more prone to step out of their comfort zone and try stuff. So all creative decisions, all creative decisions, should have a human owner that has a name attached to it. Lauren, Matt, Paul, Laurie. Anything creative, any creative decision? Again, a human should make those final calls. So I think if you start there and we move into a few of these other things, quickly, based on the amount of time we've rambled on this stuff, but I think that helps set the tone.
Lauren: Yes.
Matt: Right? Instead of your boss or whoever, like the IT guy or head of development just coming in one day and going, hey guys, I got everybody a license for Claude. I want everybody to start using Claude, and then just walking out of the room. Like, everybody's going to be like…WTF?
Lauren: Yup.
Matt: What? Oof. Immediately everybody's on Slack talking junk. But if that person was to get up there and say these things and talk about how we hope to use it to make our business more efficient, to make your jobs easier so you can spend more time being creative or supporting the creative teams, or going out there and finding new authors at events, and the fun stuff that AI can't do. The really important stuff. I think that's the way to do it.
Lauren: I think that's important both internally and externally too.
Matt: Sure.
Lauren: As much as it is about like, internal culture and making sure that your teams feel comfortable with this, and that they feel like they're not in danger of being replaced or anything like that.
Matt: Yeah.
Lauren: It's also important to have these guidelines in place for public transparency. If you have an author that approaches you and says, oh my God, I heard you guys are using AI on my book, and why am I working with an editor, or whatever? Like, you can very clearly point to these and say, no, no, no, we're not. No AI tool has had any kind of creative input on your book whatsoever. It is exclusively being used under these guidelines, and it is emphatically not allowed to be used in these circumstances.
Matt: Yeah.
Lauren: Are there still people out there that are going to be upset that –
Matt: Maybe.
Lauren: – your company is using AI? Yeah, but they're going to do that whether or not you're a publishing company, honestly, so.
Matt: Yeah, but I think you're right. I think the more honest and transparent you are about it, the more comfortable they get with it. And then again, working back internally, with regards to culture. Having those conversations, acknowledging that fear too, by the way.
Lauren: Yes.
Matt: You know, having a leader that steps up and says, you know, if you're quietly worried about the fact that AI could take your job, relax.
Lauren: Yeah.
Matt: We really want to do this to try and make your job easier. You can do less administrative data entry type stuff and really focus more on what we actually hired you to do all those years ago, whether that's creative work or sales work or marketing work. Every team can benefit by some sort of AI assistance, by the way. Whether it's sales or marketing or editorial or design. You can do things like have little lunch and learn sessions, you know? Where you all get in a room and you have some fun with it, do some fun, silly stuff with it, but then also do some very real, you know, exercises with it. Like some of the things we've talked about. Show them what it looks like to dump in a historical document of the last two hundred releases and the timelines for those releases, and what each step of the way took in terms of time and how you can pull averages and then adjust those times and get better on those windows to release dates. Getting your team really comfortable with it. That will also help them be more comfortable talking about it publicly because you, as a brand or business or a publishing company, could go out there publicly and say, yeah, we're doing this, and here's how we're using it, and we absolutely aren't using it this way. But if internally, your team is still like, ooh, it's – you're not aligned, it's not, it's going to show, you know what I mean? So making sure both – you were right to say external – internal and external, are kind of on the same page with, with how you're using it, I think is extremely important.
Lauren: And also what is being used within the AI and what's being turned out from it. So I think, you know, basically we're talking about data privacy, security, making sure those things are being handled and that you are clear and upfront about how they're being handled.
Matt: Yes. I think when you're rolling something out internally, you know, and we talk about creating policies, things like that, it's probably a little easier said than done, but just keeping it clear, concise, simple. Like we said, just a really short thing. Here's what AI can be used for. You know, administrative stuff, draft summaries, first passes. Here's what AI cannot be used for. You know, final approval, sensitive contract information stuff. Performance reviews. Never use AI for a performance review. How to use AI to generate labels for certain things. I think that again, to your point, having those guardrails, those do's and don'ts, will really help people adopt it, and move forward a little bit faster. Real quickly, you know, to implement a quick experiment, maybe run it for thirty days or whatever. Choose one of the workflows that we've talked about or something that makes sense for your team, your business. It can be something as small as like, just using it to take meeting notes. If that's your, your first toe in the water, great. Do that for a couple of weeks to a month, see how that works out. Are people getting better notes or are ideas being pitched more clearly? Define the before. So how long does that typically take now, whatever it is you're experimenting with, so that you know what to measure it against afterwards. Put the pilot in place. So use it for whatever, thirty days, two or three weeks, whatever that might be. And then again, having that before snapshot to have something to measure against. At the end of that period get together as a team. Talk about did you find this easy to use? Were you afraid to use it? How do you feel now after having used it? Did you get better notes or did you feel like the submission process became a little easier in terms of you knew exactly which bucket, you know, these particular submissions were in, so you could go and start going through them. Whatever the experiment was that you ran. What do you still feel needs human oversight? Ask all those important questions. And then lastly, from that experiment, write down a little tiny policy like we talked about. If you're going to continue to use this and then move on to the next experiment, what's the policy for this one? So here's how we'll use AI for this. And here's how we won’t. And now you're on to the next one. This is a really quick like five step way to start implementing one little AI initiative at a time within your organization.
Lauren: Emphasis on one at a time. It's like AB testing. You're only going to change one thing at a time.
Matt: That's right.
Lauren: Try one thing. And if you're not sure where to start, start by asking the question what is the most time consuming thing that I do every day that I wish could be automated?
Matt: Yeah.
Lauren: Or what is the most annoying task that I'm working on every week? The most repetitive thing that I'm working on every week. That would be something that would be really nice to streamline, so that I had more time to do this other thing instead.
Matt: Agreed, yeah. AI is great for helping you run your back office stuff. That is what you should be using it for. We've said this a thousand times. We'll keep saying it. Hopefully this episode shed a little more light on exactly what we mean by back office stuff. Again, some publishers are doing these things, or some of these things, some are not. Small to medium sized publishers, you might have a little more flexibility in implementing these things. This could give you your leg up to really start growing in a market right now that seems to be somewhat stagnant for a lot of publishers and genres. So, just remember, it belongs in your back office. It does not belong on your editorial team's calendar of how we're going to run these next few manuscripts or things like that.
Lauren: Use it as a tool to free up the creative people on your team, or yourself personally.
Matt: Yeah.
Lauren: So that you can devote more time to those creative pursuits, that creative work and the things that specifically need that human touch. Let it do the boring work so that you can do the fun stuff.
[45:53] - Wrap Up
Matt: Yeah.
Lauren: You can also –
Matt: That was fun.
Lauren: Yeah.
Matt: AI.
Lauren: Well.
Matt: I have a feeling we'll be talking more about it throughout the year.
Lauren: I have a feeling you're going to make me talk more about it throughout the year. But if you have any questions about it, or if you have anything in particular that you want to hear us talk about, or if you want to argue with Matt about why we shouldn't talk more about it. Just kidding.
Matt: No, I like that.
Lauren: You can always reach out to us at podcast@lulu.com. You can leave us a comment on any of Lulu's social media channels, or on our YouTube channel, where, if you're not already watching them, you can watch all of these episodes in full video and leave comments directly on those videos. Lucky you.
Matt: Lucky you. Yeah.
Lauren: And if you're all good with that one, that's okay. We'll be back next week with another new episode.
Matt: Fingers crossed. Just kidding, because you've got a whole new process now.
Lauren: I'm working on it.
Matt: Working on it, okay.
Lauren: I'm working on implementing the new process.
Matt: Got it.
Lauren: So we'll see how it goes. Wish me luck.
Matt: Very clear.
Lauren: And uh, we’ll see you next week.
Matt: All right. Later.
Lauren: Thanks for listening.