Jun 20, 2023
Want ideas on how marketing professionals are actually using AI in their daily work? Listen in as Will Francis and Danny Richman chat about what's new in how AI is affecting marketing.
This week they talk about OpenAI's recent announcement of function calling in its API - Danny explains what that's all about and how it'll be of huge benefit to those using GPT for content development (or co-piloting, as he calls it). He also mentions Zapier's new API builder, which makes that even easier to use.
“It feels very much like having a kind of a colleague sitting next to me, and we're working together to kind of solve this problem” Danny Richman
Will Francis 00:00
So news from OpenAI this week about their API allowing function calling. Now, that's been talked about quite a bit, but I'm aware not everyone knows what function calling means. You don't happen to have an idiot-proof definition of what that means, do you?
Danny Richman 00:20
Yeah. So I mean, I think, you know, the people that are probably going to have the best understanding of, and the most use from, this new feature will be developers who are building stuff with GPT. However, I think there are going to be implications and practical uses for most people. So let me just kind of explain why it's important and what it does. One of the challenges there has been in using GPT, if you're using it as part of an application, is that you can ask it for some information, or you can ask it to do something, but the way that it responded wasn't completely consistent and reliable. So say, for example, you wanted to get it to, I don't know, provide a description of a piece of text, to give a summary of it. Sometimes it might come back and just give you the summary. Sometimes it might say, "and here is your summary:" There'd be all kinds of variations in the format and the way that it would come back with that information.
Danny Richman 01:30
So one of the big advantages of being able to now define functions is that you can stipulate, with the prompt that you're sending to GPT, how that information is going to be sent and how you want it to come back.
Will Francis 01:45
Right, so you can get structured data back, basically. That's the key.
Danny Richman 01:49
Yes, you get very structured data that you can rely on 100%: it will always be in the same format each time it comes back. And so that has implications if you're trying to get back data in a very consistent way that you're then going to use somewhere else. Or you might want that data back in a way that allows you to go and make an API request, or call some other function within an application, where you need it to be absolutely consistent every single time.
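A minimal sketch of what that consistency looks like in practice (the function name and fields below are illustrative, not from the episode): you describe a function schema to the model alongside your prompt, and its reply comes back as a JSON string matching that schema, which your code can parse the same way every time.

```python
import json

# Illustrative function schema of the kind you'd pass to the OpenAI
# chat completions API alongside your prompt (names are made up here).
summarize_fn = {
    "name": "return_summary",
    "description": "Return a summary of a piece of text",
    "parameters": {
        "type": "object",
        "properties": {
            "summary": {"type": "string", "description": "The summary itself"},
            "word_count": {"type": "integer"},
        },
        "required": ["summary"],
    },
}

# The model replies with the function name plus its arguments as a JSON
# string matching the schema - so there's no "Here is your summary:"
# preamble to strip off before you can use the result.
raw_arguments = '{"summary": "A short recap of the text.", "word_count": 6}'
args = json.loads(raw_arguments)
print(args["summary"])  # → A short recap of the text.
```

Because the shape is guaranteed, downstream code can index `args["summary"]` directly instead of parsing free text.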
Will Francis 02:22
Because it's structured and it's machine-readable, right? Basically, you can send and get back structured data that is essentially machine-readable, which means an app that you've built can send off bits of information in a structured way to GPT and get it back in a very structured way, so it can put that back into the app in a form it can work with, rather than just getting, you know, streams of text that it has to try to parse and separate, right?
Danny Richman 02:50
Correct, that's exactly right. And, you know, one of the ways that I've been playing around with this this week is in conjunction with a new feature that's been released by Zapier this week. They've got a new thing called API builder. So if you've ever used Zapier, then you may know that one of the ways you can trigger a zap - for anyone not familiar, Zapier is basically an automation platform. So you can say, you know, if something happens here, then go and kind of do this over there.
Will Francis 03:23
Simple stuff, right? Like if I make a sale, add it as a row to my Google Sheet, or if I post on Instagram, put that photo in my Dropbox, and all that.
Danny Richman 03:30
Absolutely. And one of the most common use cases is using it for kind of CRM stuff and processing leads and all that kind of thing, you know. So, you know, when this happens in Salesforce, then trigger an email to be sent here, add it to a Google Sheet. That's basically what Zapier does - a very powerful tool, very flexible. But one of the things that Zapier has always had is the ability to trigger automations via a webhook. So you get a URL, and using that URL, you can then start that automation going. And you can also send it data through that URL and say, okay, now do something with this data.
Danny Richman 04:18
What you have never been able to do previously is send it a URL and then get data back from that same automation. And that's essentially what an API is, you know: you're triggering a URL and then expecting to get some data back. So this is the first time they've done it - Zapier have created this thing called API builder, where you can send a webhook and then get data back. So, for example, I could send a URL to Zapier with the email address of a customer. It can then go and retrieve that customer's details from a Google Sheet, do some stuff with that data, send it off to GPT, format it into an email, whatever it is you want to do with it. And the API can then return the response, which I can then use anywhere else, you know.
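To make the distinction concrete, here's a hedged Python sketch (the hook URL is hypothetical - in reality Zapier generates one for your zap). A classic webhook is fire-and-forget; with API builder, the same kind of request also returns data you can use downstream.

```python
from urllib.parse import urlencode, urlsplit, parse_qs

# Hypothetical webhook URL - the real one comes from your own zap.
hook = "https://hooks.zapier.com/hooks/catch/123/abc"

# Trigger the automation, passing the customer's email in the query string.
url = hook + "?" + urlencode({"email": "customer@example.com"})

# Classic webhook: you'd fire this request and get nothing useful back.
# With API builder, the same request returns the automation's output, e.g.:
#     with urllib.request.urlopen(url) as resp:
#         customer = json.loads(resp.read())  # details pulled from the Sheet
print(url)
```

The actual request is left as a comment since it needs a live zap; the point is that the round trip now has a response worth reading.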
Will Francis 05:14
Yes. And that sounds like it opens up millions of creative possibilities for the ways we can get GPT to interact with our various apps, our toolkit, really our tech stack.
Danny Richman 05:29
Completely. I mean, it's early days for me playing with this, and I would love to be able to say to you, you know, here is an example of an actual use case where you could go and use it, but it's going to be so dependent on the workflows and needs of whatever organization is using it. But just knowing that you can do that - that's how I find I tend to use this stuff. I just need to know it can do it. And then my head starts encountering all these kinds of situations where it's just, oh, yeah, I could do that with that.
Will Francis 06:04
Here's an idea, and this is just off the top of my head. What would be interesting would be if you knew what products someone had bought, and you knew a few other environmental factors, like what the weather's like - maybe, if it's a sports brand, any major results in sports, those kinds of things, right. Could you then basically automate very highly personalized email messages that were individual to each customer?
Danny Richman 06:32
You absolutely can. And of course, the beauty of that as well is that you could have probably done that before, but you would have needed to spend an awful lot of money with developers to go and set something like that up. Now it's all available to people that have no coding skills whatsoever. And that's a game changer, really, to make that available to anyone and everyone that needs it. Even tiny little businesses, you know, little local florists and window cleaners that would never have been able to do this kind of thing, can now do this really quite easily.
Will Francis 07:07
Absolutely. Yeah. I mean, again, my mind's racing. If you were a florist, and you had customer records, and you knew the names of people they'd sent flowers to previously and the occasions on which they'd done that, and things like that - this is all stuff you could have sitting in a database, like just Google Sheets or Airtable or something. And, I mean, you'd have to trust it, but you could have ChatGPT start to imagine messages it could send - very highly personalized messages that you could send to people based on environmental factors and current events.
Danny Richman 07:42
Even things like really clever retrieval of data. So something I was just doing this week, actually: you know, you can create an automation in a tool like Zapier where you can say, okay, here's the email address of a customer, go and find all the records that belong to that particular customer. Now, the only problem is, it's very limited - all you can essentially do is find a row where that customer email matches. I can't really do anything more flexible, anything more clever, with the data that's in that sheet. But you can do it with code. So you can say, find all the records that belong to the customer Danny Richman, and have a look at all the products they've bought. What sort of category of thing do they buy? What's their buying pattern? Do they spend certain money at certain times? All that kind of thing.
Danny Richman 08:41
Now, you know, you can't actually do that natively in Zapier - it's really pretty much impossible to do. But you can do it by writing code. So what I've been doing is just going to GPT and saying, I've got a spreadsheet, a Google Sheet, that contains all this data. Write me some code that will look at that data and, using this email address, will do that. And it just writes the code. I don't have to know anything about coding. And it will basically just give me all the answers I want. And it works perfectly. It works absolutely perfectly.
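The kind of code GPT writes for that task is usually just a loop-and-aggregate over the sheet's rows. Here's a small sketch with made-up data (the column names, emails, and figures are all invented for illustration):

```python
from collections import Counter

# Rows as they might come out of a Google Sheet export - invented data.
rows = [
    {"email": "danny@example.com", "product": "Roses",  "category": "Flowers", "spend": 25.0},
    {"email": "danny@example.com", "product": "Tulips", "category": "Flowers", "spend": 18.0},
    {"email": "other@example.com", "product": "Vase",   "category": "Gifts",   "spend": 30.0},
]

def customer_profile(rows, email):
    """What a single 'find a row' Zapier step can't do: gather every
    matching record and summarize the customer's buying pattern."""
    mine = [r for r in rows if r["email"] == email]
    categories = Counter(r["category"] for r in mine)
    return {
        "orders": len(mine),
        "top_category": categories.most_common(1)[0][0] if mine else None,
        "total_spend": sum(r["spend"] for r in mine),
    }

profile = customer_profile(rows, "danny@example.com")
print(profile)  # → {'orders': 2, 'top_category': 'Flowers', 'total_spend': 43.0}
```

A simple email match returns one row; this returns every row plus derived answers (favourite category, total spend), which is the flexibility Danny is describing.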
Will Francis 09:18
Yes, that's really cool. And I suppose, just to kind of double back to the conversation we had last time about my dream of totally, you know, intelligent agents - basically a virtual assistant working on my behalf - it does feel like a step closer to that.
Danny Richman 09:39
It 100% is a step closer to that, because one of the things you're now going to be able to do with this new function calling in GPT is it can call other services. So if there is an API that allows you to book theater tickets, if there's an API that can get the weather, if there's an API that can book a restaurant, you know, now you can get that data structured in such a way that it can tap into all of these services and just do anything on your command. So, yeah, we're definitely getting closer to that dream robot you so desperately want. Indeed.
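The "call other services" idea boils down to routing: because the model names a function and hands back JSON arguments, your code can dispatch to the right service without any text-parsing. A sketch with invented function names (a real version would call actual weather or booking APIs):

```python
import json

# A function-calling reply of the shape the model returns: a function name
# plus JSON arguments. Hand-written here to show the shape; names invented.
model_reply = {"name": "get_weather", "arguments": '{"city": "London"}'}

def dispatch(reply):
    """Route a structured function call to the matching service."""
    args = json.loads(reply["arguments"])
    if reply["name"] == "get_weather":
        # A real agent would hit a weather API here.
        return f"Looking up the weather in {args['city']}"
    if reply["name"] == "book_restaurant":
        # ...or a restaurant-booking API here.
        return f"Booking {args['restaurant']} for {args['time']}"
    raise ValueError(f"Unknown function: {reply['name']}")

print(dispatch(model_reply))  # → Looking up the weather in London
```

One `dispatch` function per set of services is the whole "agent" plumbing; the structured reply is what makes it reliable.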
Will Francis 10:22
Another bit of news was the increase in the size of the context window (in ChatGPT) to 16k. So the significance of that is, obviously, we can have longer conversations that provide wider context, without it forgetting earlier things that we said.
Danny Richman 10:40
Which is a real problem at the moment, you know. I'm finding, as I'm using ChatGPT, that things that either I or it mentioned at the beginning of the conversation - by the time we're all the way down at the bottom of the page, it's now forgotten that we discussed those things already, and so it can make errors based on that. So yes, you've got a much bigger context, but it also means you can put much more data into your messages to ChatGPT.
Will Francis 11:09
And then worry less about hitting that limit, right?
Danny Richman 11:12
Worrying less about hitting the limit, yes. And also using that in conjunction with the Bing add-on as well. So, you know, a really nice thing I was playing around with the other day that works really well: let's say you're thinking about applying for a new position, a new job somewhere, and you've seen an ad for a job in your field. So you go to the recruitment website, the job website, where they've got the whole job description listed out with all the details of what they're looking for. You've got your CV and all your experience on your LinkedIn profile. So you can now just go to ChatGPT and say, here's my CV, here is the description of the job I'm applying for, write me a covering letter that will show this employer why I am best suited to this role. And bang, it just does it.
Will Francis 12:06
That's really good, isn't it? And, you know, that also points to something that maybe people listening to this haven't really thought about so much. The reason that context window is so important is that prompting actually takes a number of exchanges - getting what you want takes feedback, doesn't it? And I don't think enough people quite acknowledge that. They kind of go, write me a blog post about the best afternoon tea in London, and that's it. But of course, it requires you to iterate - I mean, OpenAI do recommend that you break it down into smaller tasks and feed back on results. And, you know, 8,000 tokens - how many words does that equate to, roughly?
Danny Richman 12:50
You can't equate it exactly. It's around about three or four to one, something like that. So, yeah.
Will Francis 12:58
So, you know, it's only about 2,000 words, right? Yeah. And so the earlier exchanges quickly get lost, quickly get forgotten. They do. And it's feedback, you know.
Danny Richman 13:11
And also, quite often, I find that I don't know exactly what I want until I get it, you know. So, you know, I'll give ChatGPT an instruction, it will come back, and I realize I haven't quite been clear enough about what I'm trying to get here. But by seeing something come back that's not right, I can then say, actually, I need that adapted, I need it in a different format, I need it in a different style. And so there is this exchange, you know. And I particularly notice this when I'm using ChatGPT for generating code: quite often it will come back with code, and the code doesn't quite do what I want it to, or it's coming up with an error when I try to run it. So I then put the error that's being generated back into GPT, and it says, ah yes, really sorry about that. And it feels very much like a collaborative process between the two of us. We're testing things, I'm giving it feedback, it's saying maybe we should try this. It feels very much like having a kind of a colleague sitting next to me, and we're working together to kind of solve this problem, you know?
Will Francis 14:19
Yes. And same here. I mean, I know I'm probably going to say this every time we talk, but I really want to get past the idea of it as, like, a lazy copywriter. I use it as a co-thinker. So I'm asking it to help me with ideas for things, and to find better ways to explain things in my talks and workshops and that kind of thing. And you're right, I start out maybe not quite being fully clear on what I mean, or what I want, and I navigate that with its help. And that's the useful thing about it. So I think Sam Altman talked about people at some point in the future getting a 32,000-token window in GPT-4, but I don't know when that's happening. That would equate to maybe, I don't know, 10,000 words, something like that, in GPT-4 - so you'd have the plugins and the web browsing. So I think that's when you can start to have those much longer exchanges and really go kind of deep.
Danny Richman 15:16
When you're using it for coding, they call it a co-pilot, and I think that's actually a really good term. You know, I think it's a good way to think about ChatGPT, particularly: as a co-pilot when you're working on something, rather than just giving it a job to do. Rather than just treating it like an intern, treat it like a co-pilot.
Will Francis 15:36
That's exactly it.