2025 Sep 18

How LLM (Large Language Model) Top-P works

Use cases: Low Top-P value – Customer support. High Top-P value – Creative writing.

2025 Sep 18

Cost and value balance with LLMs (LLM parameters – LLM Top-P)

The LLM parameter Top-P, also known as nucleus sampling, controls the diversity of the output by setting a cumulative probability threshold for selecting the next token. Depending on the setting, it produces higher-quality and more diverse outputs. When generating text, tokens (words, sentences …
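The cumulative-threshold idea can be sketched in a few lines of plain Python. This is a simplified illustration, not any vendor's actual sampler: the example probabilities are made up, and real implementations work on full vocabulary logits rather than a small dict.

```python
def top_p_filter(probs, top_p=0.9):
    """Keep the smallest set of tokens whose cumulative probability
    reaches top_p (nucleus sampling), then renormalise the rest."""
    # Rank tokens from most to least likely.
    ranked = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)
    nucleus, total = {}, 0.0
    for token, p in ranked:
        nucleus[token] = p
        total += p
        if total >= top_p:
            break  # cumulative threshold reached; drop the long tail
    # Renormalise so the surviving probabilities sum to 1.
    return {t: p / total for t, p in nucleus.items()}

# Hypothetical next-token distribution (values for illustration only).
probs = {"the": 0.5, "a": 0.3, "cat": 0.15, "zebra": 0.05}
print(top_p_filter(probs, top_p=0.8))  # only "the" and "a" survive
```

With `top_p=0.8`, the low-probability tail ("cat", "zebra") is cut off before sampling, which is why a low Top-P yields more predictable text and a high Top-P more diverse text.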

2025 Sep 18

Cost and value balance with LLMs (LLM parameters – LLM temperature)

With this setting you can choose the right balance between randomness and determinism in the output generated by the LLM. This is important in applications where decisions must be based on solid facts. On the other hand, if you need more creativity …

2025 Sep 18

Cost and value balance with LLMs (LLM parameters – Max tokens)

LLM parameters give you the chance to configure additional important settings for your LLM. With these settings you can influence the balance of cost and value, for instance. In addition, you can influence how the output will be generated …
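The cost side of the max-tokens setting can be sketched as a simple budget cap on generation. Everything below is an assumption for illustration: the price constant is not a real tariff, and real APIs enforce the cap server-side.

```python
def generate_with_budget(token_stream, max_tokens):
    """Stop emitting tokens once the max-token budget is spent;
    this caps both the response length and the per-request cost."""
    out = []
    for tok in token_stream:
        if len(out) >= max_tokens:
            break  # budget exhausted, truncate the output here
        out.append(tok)
    return out

# Hypothetical price per 1,000 output tokens (assumption, not a real rate).
PRICE_PER_1K_TOKENS = 0.002

tokens = ["The", " answer", " is", " forty", "-", "two", "."]
reply = generate_with_budget(tokens, max_tokens=4)
cost = len(reply) / 1000 * PRICE_PER_1K_TOKENS
print(reply, f"cost ~ ${cost:.6f}")
```

A tighter `max_tokens` lowers cost but risks cutting answers off mid-sentence, which is exactly the cost-versus-value balance the parameter controls.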
