How LLM (Large Language Model) Top-P Works


  1. Token Probability: The LLM (Large Language Model) assigns a probability to every possible next token in its vocabulary based on the preceding text.
  2. Cumulative Probability: The tokens are then sorted in descending order of probability.
  3. Nucleus Selection: Starting with the most likely token, the model sums the probabilities until the cumulative sum reaches or exceeds the Top-P value. This set of tokens is the "nucleus".
  4. Token Selection: The model then randomly samples the next token from this nucleus, weighted by probability, rather than selecting from the entire vocabulary.
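The four steps above can be sketched in plain Python. This is a minimal, illustrative implementation of nucleus (Top-P) sampling over a toy token-to-probability dictionary, not any particular model's internals; the function name `top_p_sample` and the example distribution are assumptions for the sketch.

```python
import random

def top_p_sample(token_probs, top_p, rng=random):
    """Sample the next token using nucleus (Top-P) sampling.

    token_probs: dict mapping token -> probability (sums to 1.0).
    top_p: cumulative probability threshold in (0, 1].
    """
    # Steps 1-2: sort tokens in descending order of probability.
    ranked = sorted(token_probs.items(), key=lambda kv: kv[1], reverse=True)

    # Step 3: accumulate tokens until the cumulative probability
    # reaches or exceeds top_p -- this subset is the "nucleus".
    nucleus, cumulative = [], 0.0
    for token, prob in ranked:
        nucleus.append((token, prob))
        cumulative += prob
        if cumulative >= top_p:
            break

    # Step 4: renormalize within the nucleus and sample from it only,
    # rather than from the entire vocabulary.
    total = sum(p for _, p in nucleus)
    tokens = [t for t, _ in nucleus]
    weights = [p / total for _, p in nucleus]
    return rng.choices(tokens, weights=weights, k=1)[0]

# Toy distribution: with top_p=0.5, only the most likely token
# ("the", p=0.5) fits in the nucleus, so it is always chosen.
probs = {"the": 0.5, "a": 0.3, "cat": 0.15, "zzz": 0.05}
print(top_p_sample(probs, top_p=0.5))  # → the
```

Note that real LLM samplers work on logits over tens of thousands of tokens and usually combine Top-P with temperature, but the selection logic follows this same shape.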

Use Cases

Low Top-P value

Use Case: Customer Support

Why?

  • Answers need to be accurate, precise, and factual
  • Avoids unusual or overly creative answers that could confuse the customer

High Top-P value

Use Case: Creative writing

Why?

  • Broader vocabulary and more creativity in writing and presenting story ideas