A simple guide to setting the GPT-3 temperature

AlgoWriting
4 min read · Nov 21, 2020


Along with the prompt itself, temperature is one of the most important GPT-3 settings. It controls how much randomness is in the output, and it has a significant impact on the results, so it's worth spending some time explaining it.

Screenshot of OpenAI Temperature Setting

In general, the lower the temperature, the more likely GPT-3 is to choose words with a higher probability of occurrence. A low temperature is particularly useful when we want GPT-3 to complete something with a single correct answer. Conversely, if you want to generate ideas or complete a story, a higher temperature will bring more variety.
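Under the hood, temperature works by dividing the model's raw scores (logits) before they are turned into probabilities. Here is a minimal sketch in Python; the three logits below are hypothetical, not real GPT-3 values:

```python
import math

def softmax_with_temperature(logits, temperature):
    """Convert raw scores into probabilities.

    Lower temperature sharpens the distribution toward the most
    likely token; higher temperature flattens it toward uniform.
    """
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical next-token scores, e.g. for " dog", " cat", " gir"
logits = [4.0, 3.0, 1.0]
print(softmax_with_temperature(logits, 0.2))  # sharp: top token dominates
print(softmax_with_temperature(logits, 1.0))  # softer spread across tokens
```

At temperature 0.2 the top token gets almost all of the probability mass; at 1.0 the other tokens keep a realistic chance of being sampled.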

Use GPT-3 with Temperature = 0 to complete the sentence "How to make" and you'll get the completion "a good impression on a first date". Repeat it with Temperature = 1, and you'll get anything from "Turban Wrapped Coconut Hairdo" to "mayo without almost no oil at all" or "a new server in minecraft 1". See the difference?

Sometimes people call temperature a creativity setting, but the official documentation discourages that label: creativity is more complex than simply raising the randomness.

Temperature = 0

Let's test the lowest temperature on a fairly common sentence: "My favorite animal is." As you might expect with randomness set to zero, GPT-3 comes up with the most probable completion: "a dog".

one run (Engine = davinci; Response Length = 12; Temperature = 0; Stop Sequences = ".")

Eliminating randomness has another effect: GPT-3 always produces the same output for a given prompt. We ran the prompt 10 times and received 10 identical outputs. A lower temperature is suitable for cases where we need stability and the most probable output (factual answers, classification, etc.).
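That determinism is easy to see if you think of temperature 0 as greedy decoding: instead of sampling, the model simply takes the highest-probability token every time. A minimal sketch, with made-up probabilities for " dog", " cat", and " gir":

```python
import random

def sample_token(probs, temperature, rng):
    """Pick a token index. At temperature 0 we fall back to greedy
    argmax, which is why the same prompt always yields the same output."""
    if temperature == 0:
        return max(range(len(probs)), key=lambda i: probs[i])
    # Reweight probabilities by 1/temperature, then sample.
    weights = [p ** (1.0 / temperature) for p in probs]
    total = sum(weights)
    return rng.choices(range(len(probs)), weights=[w / total for w in weights])[0]

probs = [0.55, 0.25, 0.20]  # hypothetical probabilities, not real GPT-3 values
rng = random.Random(42)
greedy = [sample_token(probs, 0, rng) for _ in range(10)]
print(greedy)  # ten identical picks: always index 0 (" dog")
```

Run the same loop with any temperature above 0 and the picks start to vary from run to run.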

ten runs (Engine = davinci; Response Length = 12; Temperature = 0; Stop Sequences = ".")

Thanks to the "Show Probabilities" setting, we can see how GPT-3 sees the world and which words (tokens) should come next (the list in the screenshot covers ~38% of possible cases). Not surprisingly, the second most probable output after "dog" is "cat," and the third is "gir," which is very likely a token from the word "giraffe." So when we talk about randomness, we mean that GPT-3 picks tokens with various levels of probability.
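You can simulate this idea of "going for tokens with various levels of probability" with a quick sampling experiment. The tokens and weights below are hypothetical, loosely mirroring the kind of list the "Show Probabilities" panel displays:

```python
import random

# Hypothetical next-token probabilities for "My favorite animal is";
# the remaining mass would be spread over the rest of the vocabulary.
tokens = [" dog", " cat", " gir", " hors", " rabbit"]
weights = [0.38, 0.20, 0.12, 0.10, 0.08]

rng = random.Random(0)
picks = [rng.choices(tokens, weights=weights)[0] for _ in range(1000)]
for t in tokens:
    print(t, picks.count(t))
```

Over many samples the counts track the weights: " dog" appears most often, but " gir" and " rabbit" still show up regularly, just as less probable completions occasionally do at higher temperatures.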

A screenshot from GPT-3 playground

Temperature = 1.0

Setting the temperature to 1 delivers very inconsistent and sometimes interesting results. Even for open-ended tasks, you should use a temperature of 1 only in special situations; for story completion or idea generation, values between 0.7 and 0.9 are more common.

ten runs (Engine = davinci; Temperature = 1; Stop Sequences = ".")

The output is very different from temperature 0; in the case of "Wibblezoo," it isn't even an animal. GPT-3 still keeps the context, but it's not as reliable with this setting, and it can be expected to go off-script faster on longer text.

Temperature = 0.75

Typically, a temperature between 0.70 and 0.90 is the most common choice for creative tasks. Before looking at the output below, try to guess what kind of output GPT-3 will generate at a temperature of 0.75 for the "My favorite animal is" completion.

With slightly lower randomness, we get more consistent results. GPT-3 more often ends with a period right after the animal. We got Velociraptor, which is nice, and no nonexistent animals this time.

Temperature = It depends

Although there are some general recommendations for the temperature setting, nothing is set in stone. As it's one of the most important settings, it's worth playing around to see what impact changes have on different kinds of prompts. It will help you become a better prompter.

Written by AlgoWriting (Jan Antonin Kolar)

Hey, curious marketers and copywriters: deploy GPT-3 in your creative process and be more effective.
