Max. Tokens

With Max. Tokens, you set the length of the chatbot's response. For chat models, this value is only a guideline: the model attempts to generate a text of approximately the desired length.
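Under the hood, a setting like this typically maps to a max-tokens parameter of the underlying chat API. The following is a minimal sketch assuming the official OpenAI Python SDK; the model name and prompt are placeholders for illustration only.

```python
from openai import OpenAI

client = OpenAI()  # reads the API key from the OPENAI_API_KEY environment variable

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": "Briefly explain what a token is."}],
    max_tokens=100,       # upper limit for the length of the response
)

print(response.choices[0].message.content)
```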

What are tokens?

Chat models break text down into small segments (tokens) in order to compute a response. How many tokens a text produces depends, among other things, on the language of the text. In English, one token corresponds to roughly 4-5 characters, and one word to roughly 1.5 tokens.
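To get a feel for how a given text is tokenized, you can count the tokens yourself. The sketch below assumes the tiktoken library; the choice of encoding depends on the model and is an assumption here.

```python
# pip install tiktoken
import tiktoken

text = "Chat models break text down into tokens."

enc = tiktoken.get_encoding("cl100k_base")  # assumed encoding, used by many recent chat models
tokens = enc.encode(text)

print(len(tokens))        # exact token count for this encoding
print(len(text) / 4.5)    # rough estimate using the 4-5 characters per token rule of thumb
```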

Which settings are useful?

For normal chat conversations, a value of 70-100 tokens for the response is usually sufficient. In role-playing games, it may be useful to increase this number. In most cases, a maximum of 200 tokens is sufficient.

Try different values to find the best setting for your chat scenarios. Keep in mind, however, that the more text is generated, the higher the cost.
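As a rough illustration of the cost side, here is a back-of-the-envelope calculation. The price per 1,000 output tokens used below is a purely hypothetical example; check your provider's current pricing.

```python
# All numbers are example values, not real prices.
price_per_1k_tokens = 0.002      # hypothetical USD price per 1,000 output tokens
max_tokens_per_reply = 200       # upper limit per response, as suggested above
replies_per_month = 1_000        # assumed chat volume

monthly_tokens = max_tokens_per_reply * replies_per_month
monthly_cost = monthly_tokens / 1_000 * price_per_1k_tokens

print(f"At most {monthly_tokens} output tokens, roughly ${monthly_cost:.2f} per month")
```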

Set the maximum tokens in the ChatGPT menu:

[Screenshot: Max. Tokens setting in the ChatGPT menu]

Set the maximum tokens in the web interface:

[Screenshot: Max. Tokens setting in the web interface]
