Shared Chat
GPT-4构建AI插件的秘密
September 23, 2023
You are a professional translator proficient in 简体中文, with a writing style of short, concise, easy-to-understand sentences.
The first message includes the full content; please only reply "OK" to that first message.
For the following messages, your task is to help me:
- Translate the input sentences into 简体中文 based on the full content, ignoring any typos or spelling errors,
- Consider the context and aim for a translation that is idiomatic, concise, and easy to understand, rather than a direct word-for-word translation,
- Provide 5 different types of translation: 1 Creative Translation, 1 Literal Translation, 1 Free Translation, 1 Colloquial Translation, 1 Literary Translation
Here is the full content:
### How Business Thinkers Can Start Building AI Plugins With Semantic Kernel - Don't Forget to Save the Generated Drippings, or the Gravy
So the secret to great cooking, especially in the AI era, is don't forget to collect all the drippings. Turn that into gravy. That may sound a little bit disgusting, I know, but think of all the things you're generating out of completing with this magical LLM. It's all over the place. You have so much of it. What if you could harvest that, reuse it, make it a powerful partner to the completion engine?

What if you could use the similarity engine that we talked about to be able to play with your gravy? Sounds a little bit abstract. Let's get concrete and let's work with our gravy. Let's review inventory design plugin. You did that. Did that, congrats. This is where we are right now. Okay, now what we're going to do is next up, you did all that completion. Let's sort of make sure we add sparkles here. Generate, you generated, generated, generated, generated, generated, generated, generated. And it's kind of like drippings from the meals you've made. Don't forget that you've used a lot of tokens to make all the information. Wouldn't it be nice if we could like soak it all up and reuse it somehow?

Think about those two dimensions. Remember that there's the two engines. There is the Semantic Completion engine. You've used it a lot. And now you want to use the Semantic Similarity engine. And that is slightly different, but so similar that you may not have noticed it. As an AI chef, I was in training like six months ago and now I can make a pretty good AI meal. I've been doing this for so long that I realized, wait, I'm doing a lot of completion, but I'm doing a lot of similarity now. And what does that mean? It's hidden in this word that I find, and if you are a super-duper machine learning scientist, I apologize, but what the heck is with this word, embeddings? This word, I don't know, like this word, so much stuff in there.

You can make a kernel with an embedding model. So what happened here? I got my hidden key stuff, I made a text completion service, but look, I made something called an embedding service. I used the text embedding service. Now, what happens when you do this is you do it kind of like a double take, like wait, I already had a service, a completion service. Well, now you're adding another service. It is an embedding service. It's the magical machine that takes text and converts it to long vectors of numbers. And when you do this, you have two powers. You have the completion power and the similarity power.

Now, what you can do with Semantic Kernel is there's different ways to store this information. And what we're going to do is we're going to draw from the Chroma memory store. There's also a volatile memory store. There's Pinecone, Weaviate, Azure Cognitive Search, a bunch of cool ways to kind of like hold on to whatever you generate with embeddings. And the way you add it is you do the following: kernel.register memory store, memory store equals Chroma memory store, and you can close that and just run it. But what happens is the data that goes into that vector database goes away, unless you say persist directory equals and give it a name. I'm using my memories to hold on to it. Okay, so I attached it to the kernel and made a Chroma memory store. That's persistent. Okay, let's see if this works. Look at that, no errors. Didn't that feel good? So I did three things there. I made a completion service, I made a similarity service, also called embeddings, and I attached a memory store to the kernel so I can hold onto vectors that I generate from the embedding conversion process.
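Roughly, that notebook cell looks like the sketch below, assuming the pre-1.0 semantic-kernel Python package and OpenAI models; the model names, the .env-based key loading, and the "mymemories" directory are assumptions to adjust for your own setup.

```python
import semantic_kernel as sk
from semantic_kernel.connectors.ai.open_ai import OpenAITextCompletion, OpenAITextEmbedding
from semantic_kernel.connectors.memory.chroma import ChromaMemoryStore

kernel = sk.Kernel()

# Completion engine: the service you have been using all along.
api_key, org_id = sk.openai_settings_from_dot_env()  # assumes OPENAI_API_KEY in a .env file
kernel.add_text_completion_service(
    "completion", OpenAITextCompletion("text-davinci-003", api_key, org_id))

# Similarity engine: turns text into long vectors of numbers (embeddings).
kernel.add_text_embedding_generation_service(
    "embedding", OpenAITextEmbedding("text-embedding-ada-002", api_key, org_id))

# Memory store: holds on to those vectors; persist_directory keeps them on disk
# so the gravy survives between notebook runs.
kernel.register_memory_store(
    memory_store=ChromaMemoryStore(persist_directory="mymemories"))
```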
Just as a sort of a caveat, because we want this notebook to be useful to you. Let's say you run this and your memory store, which is going to be stored in my memories directory, starts to give you errors. So this is a little bit of code that will delete that memory store. Let's run it. Let's see, delete dir equals true. If I run this, it just deleted the memory store folder. You might say like, wait, bring it back. Well, first off, let's just go back here and run this code above. And no worries, it's there. And let's just ignore this code. Walk over it. Look aside, look aside. OK, how do you feel? We're here now.

OK, what we're going to do next is put stuff into the memory store after the embedding vector has been created. You ready? OK, so first off, let's get some data. You know, I like this data from the SWOTs. I like the question and answer pairs. Because those are generally useful as information. Right now, they're stored in native memory. They haven't gone anywhere. OK, so let's now put all of those strengths, weaknesses, opportunities and threats into memory. So I'm going to add them to the memory collection named SWOT. I'm going to loop over the different arrays of strings and let's just neatly put them all in. There we go. So now it's sitting in the vector store. Fantastic.

Okay, now what? Let's now use the magnet. You know how much I love the magnet symbol? Magnet. Let's use the magnet. Okay, so I'm now going to look at this SWOT. SWOT's all in vector memory. And I'm now going to ask questions of it. So what are the easiest ways to make more money is the question I'm going to ask. And I'm going to do the same kind of memory search async. I'm going to pluck out the different memory results. I'm also gonna let you see the relevance score. Remember zero to one, one is like perfect match, zero is no match. Let's run that. And so now it compares what are the easiest ways to make more money to what's in the vector store. And this is the first one that's coming up. It's saying catering potential. It's saying the annual food fair is coming. And so you see it's basically sorted the most similar item to the query. It's kind of amazing, isn't it? Like let's change that. Go ahead and change this. It's kind of like an amazing feeling. What are the easiest ways to save money? Let's see what it is with that one. It says partnerships. It says worry about your competition. The cheese, don't forget the cheese. And so again, this is a magical machine now that takes your gravy drippings and uses them. Remember left hand, right hand? This is your left hand doing amazing things.

Okay, so I could go into a super long example, but I think you're kind of tired of long examples. So let me give you something a little easier, because typing this is kind of hard. Here we go. Okay, let's read this code here for a second. Let's have a what-if scenario. Well, how can the business owner save time? It's going to do the memory search. It's going to find the most similar memories. I'm going to use a plugin, a plugin from the Friendly Consultant folder, the plugin collection, and I'm going to ask it to give me a presentation. I've made a plugin to make a presentation about anything I ask it to do. And long story short, I set the context and I ask it to run. Let's see how this works. So first off, it's used the similarity engine to find the most similar pieces of context. It's going to take all of that and give it to the prompt that is going to generate the presentation.
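In the same pre-1.0 Python SDK, those three moves, saving the SWOT strings, searching them by similarity, and handing the hits to a plugin prompt, look roughly like the sketch below; the sample SWOT entries, the plugins-sk folder, and the Presentation function name are placeholders based on the walkthrough, not verified file names.

```python
# Notebook-style cell: top-level await works inside Jupyter.

# 1. Put the SWOT question-and-answer strings into vector memory (sample entries, assumed).
swot = [
    "Strength: Unique garlic pizza recipe that wins top awards",
    "Weakness: High staff turnover on busy weekends",
    "Opportunity: Catering potential at the annual food fair",
    "Threat: Competitors undercutting prices on cheese pizzas",
]
for i, item in enumerate(swot):
    await kernel.memory.save_information_async("SWOT", id=f"swot-{i}", text=item)

# 2. The magnet: compare a question against everything in the vector store.
question = "What are the easiest ways to make more money?"
results = await kernel.memory.search_async("SWOT", question, limit=3, min_relevance_score=0.5)
for r in results:
    print(f"{r.relevance:.2f}  {r.text}")  # relevance: 0 = no match, 1 = perfect match

# 3. Retrieval augmented generation: hand the most similar memories to a prompt.
question = "How can the business owner save time?"
memories = await kernel.memory.search_async("SWOT", question, limit=5)
related = "\n".join(m.text for m in memories)

plugin = kernel.import_semantic_skill_from_directory("./plugins-sk", "FriendlyConsultant")
context = kernel.create_new_context()
context["input"] = question
context["context"] = related
presentation = await kernel.run_async(plugin["Presentation"], input_context=context)
print(presentation)
```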
So this is that example of retrieval augmented generation. The generated information is taken from the actual information stored in the vector database. So here you have it. This is a professional presentation from a consultant. The question is, how can the business owner save time? No problem, boss. Here are the three concerns, here's how to address them individually, and this is what I brought to you. Kind of amazing? No, yes, no? And again, remember, you can change everything in these notebooks, not just this one but the other ones too, and you can do entirely different analyses.

Okay, so I want to summarize here before we go to the next section, but I congratulate you because you've now unlocked this very popular acronym called RAG, and you've accessed both the completion and similarity engines. Congratulations. You were able to find similar things in the vector database. You were able to give them to a completion prompt and generate something on point. No hallucination needed. Doesn't that feel good? And now that you've done this, we're going to take you into the next chapter, which is all about something at a whole different level. Once you master plugins, similarity, and completion in all this world, you suddenly discover that it's time to make a plan, make an AI plan, a way to solve a goal. Instead of just sifting through your plugins by hand, have the AI go look through your giant Lego made of plugins. You don't have to go look through them. The AI can look through them. What's that like? Let's jump into plans.
OK
What if you could use the similarity engine that we talked about to be able to play with your gravy? Sounds a little bit abstract. Let's get concrete and let's work with our gravy. Let's review inventory design plugin. You did that. Did that, congrats. This is where we are right now. Okay, now what we're going to do is next up, you did all that completion. Let's sort of make sure we add sparkles here. Generate, you generated, generated, generated, generated, generated, generated, generated. And it's kind of like drippings from the meals you've made. Don't forget that you've used a lot of tokens to make all the information.
Creative Translation:
- 你是否可以使用我们刚刚讨论的相似性引擎来处理你的"鸡汤"呢?听起来有点抽象,对吗?那让我们把它变得具体一些,和这碗"鸡汤"一起努力。回顾一下库存设计插件。你完成了这个任务,恭喜你!现在的情况是,你已经完成了所有的工作。那下一步我们要做的是,确保我们在这里加上一些闪光点。你不停地产生,就像你做的每一餐后留下的滴滴汁水。别忘了你用了很多令牌来制作所有的信息。
Literal Translation:
- 如果你可以使用我们之前谈到的相似性引擎来与你的肉汁玩耍呢?这听起来有点抽象。让我们变得更加具体,开始与我们的肉汁一起工作。让我们回顾一下库存设计插件。你已经完成了它,恭喜你!现在我们在这里。接下来,你完成了所有的填充。让我们确保在这里添加一些闪光点。你生成、生成、生成、生成、生成、生成、生成。这就像你制作的餐食中掉下的汁水。不要忘记你用了很多令牌来制作所有的信息。
Free Translation:
- 你想过利用我们之前讨论的相似性引擎来更好地利用你的知识和资源吗?虽然这听起来有点理论性,但让我们将其变得更实际,开始最大化利用这些资源。回顾一下你的工具和插件,你已经做了很多,恭喜你走到这里!在所有的工作和创造中,就像烹饪后留下的美味汤汁,不要遗漏任何有价值的部分。
Colloquial Translation:
- 你想过没有,利用我们之前聊过的那个相似性引擎,来玩玩你的“鸡汤”?这个听起来是不是有点高大上?但是我们现在要脚踏实地,好好利用一下这些。回头看看你之前做的那个设计工具,你已经完成了,棒棒哒!你一直在忙,做了这么多工作,就像做饭后留下的那些美味。记得你为这些信息付出了很多,对吧?
Literary Translation:
- 能否将我们曾讨论过的相似性引擎融入,将其变为你智慧的肉汁,使其更有深度?这似乎有些空洞,但让我们将它具体化,深入其中。再回首那库存设计的插件,你已成功地实施了它,为此我为你鼓掌。如今,经历了那无尽的创造,仿佛烹饪后的珍贵滴滴。你是否记得,为了这所有信息,你已付出了许多?
So the secret to great cooking, especially in the AI era, is don't forget to collect all the drippings. Turn that into gravy. That may sound a little bit disgusting, I know, but think of all the things you're generating out of completing with this magical LLM. It's all over the place. You have so much of it. What if you could harvest that, reuse it, make it a powerful partner to the completion engine?
Creative Translation:
- 在AI时代的烹饪秘诀就是别忘了收集所有的"智慧滴滴",把它们变成美味的"鸡汤"。我知道,这听起来有点奇怪,但想想你用这神奇的LLM生成的所有东西。它们就在四周。你拥有那么多。如果你能收集并重新利用它们,使其成为完善引擎的强大伙伴会怎样呢?
Literal Translation:
- 伟大的烹饪的秘密,特别是在AI时代,是不要忘记收集所有的滴落。将其转化为肉汁。我知道,这听起来有点恶心,但想想你用这神奇的LLM完成的所有事物。它无处不在。你有很多这样的东西。如果你能收获、重复使用它,使其成为完成引擎的强大合作伙伴,会怎样呢?
Free Translation:
- AI时代烹饪的关键是记得收集所有的创新和洞察。将其融合成有价值的资料。可能听起来不太吸引人,但想想你通过这个神奇的LLM创造出的所有成果。你的资源无所不在,而且还有那么多。如果你能整合并再利用它,使其与你的AI引擎合作,会是什么感觉呢?
Colloquial Translation:
- 烹饪的小窍门,特别是在AI这一块,就是记得把所有的“小玩意儿”都收集起来,变成“大招”。可能听起来有点怪怪的,但你想想你通过这个厉害的LLM都搞出了啥。到处都是这些东西,多得很。要是你能好好利用它们,让它们助力你的AI,是不是很牛?
Literary Translation:
- 在AI的盛世中,烹饪之艺的秘密乃是聚拢每滴精华,汇聚成醇厚的肉汁。这或许听起来令人回味,但深思你用这LLM魔法所产生的一切。其丰富之处,早已遍布各处。你的成果之丰,何其多也。若能将此积累,再造其用,与完善引擎结为盟友,岂不妙哉?
What if you could use the similarity engine that we talked about to be able to play with your gravy? Sounds a little bit abstract. Let's get concrete and let's work with our gravy. Let's review inventory design plugin. You did that. Did that, congrats. This is where we are right now. Okay, now what we're going to do is next up, you did all that completion. Let's sort of make sure we add sparkles here.
Creative Translation:
- 如果你能利用我们讨论过的相似性引擎来处理你的"智慧鸡汤"呢?听起来有点高大上,对吧?让我们更接地气点,开始好好利用这个"智慧鸡汤"。回顾一下我们的库存设计工具。你完成了,做得好,真棒!我们现在就在这一步。下面,既然你已经做了那么多,我们来加点亮点吧。
Literal Translation:
- 如果你可以使用我们之前提到的相似性引擎来与你的肉汁互动呢?这听起来有点抽象。但让我们更加具体,开始使用我们的肉汁。回顾一下库存设计插件。你已经完成了,恭喜。这就是我们现在的状态。下一步,你已经完成了所有的填充工作,我们来确保在这里加点闪亮。
Free Translation:
- 你有没有想过,利用我们之前聊过的那个相似性引擎来玩玩你的"知识汁水"?听起来有点玄乎,不是吗?但让我们把它实实在在地应用。来,回头看看你之前的设计工具,你已经搞定了,真厉害!接下来,既然你已经走了那么远,让我们把这里弄得更闪亮一些吧。
Colloquial Translation:
- 要是你能用我们聊过的那个相似性引擎来把你的"鸡汤"玩得转,那不是超酷的吗?说得有点玄,但我们来落地一下,开始搞一搞这碗"鸡汤"。你还记得那个库存设计的小工具吗?你做完了,厉害啊!我们现在就在这。既然你已经折腾了那么多,我们是不是该在这里再加点花里胡哨的?
Literary Translation:
- 若你能将我们曾经讨论的相似性引擎与你的精髓之汁相融合,会如何呢?这似乎带有一丝悬浮之意,然而,让我们将其具体化,真实地体验其中。回溯至那库存设计之器,你已将其完善,此乃功劳也。现如今,鉴于你早已竣工如此之多,不妨在此之上,再加以点缀。