Tokens are the basic units that OpenAI GPT models use to measure the length of a text. A token is a group of characters that sometimes aligns with a whole word, but not always: how a text splits into tokens depends on its characters, and punctuation marks count toward the total. The cost of a prompt is measured in tokens used.
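Exact token counts come from a tokenizer such as OpenAI's tiktoken library, but a common rule of thumb for English text is roughly four characters per token. The sketch below (the helper name `estimate_tokens` is illustrative, not an OpenAI API) uses that heuristic for a quick cost estimate:

```python
def estimate_tokens(text: str) -> int:
    """Rough token-count estimate using the ~4 characters-per-token
    rule of thumb for English text. For exact counts, use a real
    tokenizer such as OpenAI's tiktoken library."""
    return max(1, round(len(text) / 4))

prompt = "Tokens are groups of characters, not always whole words."
print(estimate_tokens(prompt))  # → 14 (57-character string, minus 1 rounding)
```

This is only an approximation: actual token counts vary with vocabulary, punctuation, and language, so use it for ballpark figures rather than billing calculations.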