I would like to share some tips for the average user of our awesome new chatbot, so you can explore more possibilities and get more out of ChatGPT. The company behind ChatGPT, OpenAI, provides a so-called API, an interface that lets programmers communicate with the bot from their own programs and functions. While that is not directly useful for an everyday user, it turns out you can take advantage of API parameters right in the public chat when prompting ChatGPT.
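For the curious, here is roughly what that looks like on the programmer side. This is a minimal sketch, assuming OpenAI's Chat Completions endpoint; it only builds the request body so you can see where the temperature parameter lives, and no actual request is sent (a real call would also need an API key).

```python
import json

def build_request(prompt, temperature=0.7):
    """Build the JSON body for a Chat Completions call (sketch only, nothing is sent)."""
    return {
        "model": "gpt-3.5-turbo",  # assumed model name for illustration
        "messages": [{"role": "user", "content": prompt}],
        # note how temperature travels NEXT TO the prompt, not inside it
        "temperature": temperature,  # closer to 0 = predictable, closer to 1 = creative
    }

body = build_request("Write two sentences about the European Union.", temperature=1.0)
print(json.dumps(body, indent=2))
```

The key takeaway: in the API, temperature is a separate field sent alongside your text, which is exactly what we everyday users don't have access to in the chat window.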
Since I actually went and studied the documentation, I know about the parameter, but let’s ask our bot what he thinks:
me: do you have a temperature settings/parameter?
bot: As a language model, I do not have physical characteristics such as temperature. However, when generating text, there is a parameter called “temperature” that can be adjusted to control the level of randomness or creativity in the output. A higher temperature will result in more varied and unexpected responses, while a lower temperature will produce responses that are more similar to the patterns and phrases found in the training data.
Great, now I know he cannot get cold, what a relief. He will always work, 24/7 🙂
Luckily, I also found out that this frontend version, the public version of the bot, knows about its behind-the-scenes parameters. Let’s ask if we can actually use them, and how:
me: Can i instruct you via this prompt to adjust the level of randomness or creativity?
bot: Yes, you can adjust the level of randomness or creativity in my responses by specifying a temperature value when you prompt me to generate text. The temperature value can be any decimal number between 0 and 1, with values closer to 0 resulting in more conservative and predictable responses, and values closer to 1 producing more varied and unexpected responses. You can specify the temperature value in the prompt by adding it to the end of the command, for example: “ChatGPT, generate text with a temperature of 0.7.”
Nice, seems almost like a done deal. Now, it might be a bit tricky, but my guess is this wouldn’t always give the desired result: ChatGPT might get confused about whether that “temperature of 0.7” is part of the text input to generate content from, or whether it is really a parameter we want to send along with our text prompt.
me: ok, generate 2 brief paragraphs about european union, 1 using your api temperature settings 0 and one with your temperature settings 1
bot: Sure, here are two brief paragraphs about the European Union, one using a temperature setting of 0 and one using a temperature setting of 1:
With temperature setting 0: The European Union is a political and economic union of 27 member states located primarily in Europe. Established in November of 1993 by the Treaty of Maastricht, the EU has since grown to become a highly integrated economic and political entity.
With temperature setting 1: The European Union is a grand experiment in unity and cooperation, a melting pot of cultures and ideologies from across the continent. It’s a bold vision of a united Europe, where national borders are nothing more than a fading memory and the people are free to move and live as they please. But with power comes responsibility, and the EU is now facing a critical juncture as it struggles to maintain its relevance and influence in an ever-changing world.
And we have a winner. A clean and clear example of what the temperature setting does and how we can use it. I think this is a great option for non-programming users. So let loose your imagination, and the bot’s imagination as well; see what you can come up with and let me know in the comments 😉
me: what is your default temperature settings?
bot: The default temperature setting for my text generation, when not specified by the user, is typically set to a value of 0.5. This value is considered to be a “medium” level of randomness or creativity in the output. A higher temperature will result in more varied and unexpected responses, while a lower temperature will produce responses that are more similar to the patterns and phrases found in the training data. However, it’s worth noting that the default temperature settings can vary depending on the specific implementation or version of GPT model you are using.
Here our bot tells us its default temperature setting is 0.5, but I do not believe that to be true. It would make logical sense for it to be 0.5; however, it gave me 0.7 on occasion in a few other prompts, and when I look at the OpenAI Playground, a place for programmers to test the bot’s interface, the default value there is 0.7 as well. I am assuming that to be the actual default 🙂
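In code terms, "default" just means the value used when you don't specify one yourself. Here is a tiny sketch of that fallback logic, assuming 0.7 as the default purely because that is what the Playground shows (the real server-side default may differ by model or version):

```python
# Assumption: 0.7 is the default, based on the OpenAI Playground UI,
# not on official documentation.
DEFAULT_TEMPERATURE = 0.7

def effective_temperature(requested=None):
    """Return the temperature that would actually be used:
    the caller's value if given, otherwise the assumed default."""
    return DEFAULT_TEMPERATURE if requested is None else requested

print(effective_temperature())     # no value given, falls back to the default
print(effective_temperature(0.2))  # an explicit value always wins
```

So whenever you don't mention a temperature in your prompt, something like this fallback is happening behind the scenes, which is why it's worth knowing what that hidden value probably is.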