What are tokens?

Tokens are the basic units that OpenAI GPT models use to measure the length of a text. A token is a chunk of characters that sometimes aligns with a whole word, but not always: short common words are often a single token, longer words may be split into several tokens, and punctuation marks typically count as tokens of their own. As a rule of thumb, one token corresponds to roughly 4 characters of English text. The cost of a prompt is measured in tokens used.
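As a minimal sketch of the rule of thumb above, the function below estimates a token count from character length alone. This is a heuristic only; exact counts require the model's actual tokenizer (for example OpenAI's tiktoken library), and the split points it produces do not follow character boundaries this neatly.

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate using the ~4-characters-per-token
    rule of thumb for English text.

    Heuristic only: the real tokenizer may produce more tokens
    (e.g. for punctuation-heavy or non-English text) or fewer.
    """
    # Guard against empty input; any non-empty prompt costs at least 1 token.
    return max(1, round(len(text) / 4))


prompt = "Tokens are the basic unit GPT models use to measure text length."
print(estimate_tokens(prompt))  # 64 characters -> estimate of 16 tokens
```

Such an estimate is useful for quick cost ballparking, but billing is based on the exact token count reported by the API.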