Settings to use when calling an LLM.
This class holds optional model configuration parameters such as temperature,
top_p, penalties, and truncation.
Not all models/providers support all of these parameters, so check the API
documentation for the specific model and provider you are using.
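As a rough illustration, such a settings class is often modeled as a dataclass whose fields all default to `None` (meaning "use the provider's default"), with a helper that merges per-call overrides onto a base configuration. The sketch below is hypothetical: the class name `ModelSettings`, the field set, and the `resolve` helper are assumptions for illustration, not a definitive API.

```python
from dataclasses import dataclass, fields, replace
from typing import Optional

# Hypothetical sketch: field names mirror common LLM API parameters,
# but any given model/provider may ignore or reject some of them.
@dataclass
class ModelSettings:
    temperature: Optional[float] = None
    top_p: Optional[float] = None
    frequency_penalty: Optional[float] = None
    presence_penalty: Optional[float] = None
    max_tokens: Optional[int] = None

    def resolve(self, override: "Optional[ModelSettings]") -> "ModelSettings":
        """Return a copy where fields explicitly set on `override` win."""
        if override is None:
            return self
        # Only non-None fields on the override replace the base values.
        changes = {
            f.name: getattr(override, f.name)
            for f in fields(self)
            if getattr(override, f.name) is not None
        }
        return replace(self, **changes)

base = ModelSettings(temperature=0.7, top_p=1.0)
merged = base.resolve(ModelSettings(temperature=0.2))
print(merged.temperature, merged.top_p)  # 0.2 1.0
```

Defaulting every field to `None` rather than a concrete value keeps the distinction between "not specified" and "explicitly set to the default", which is what makes the merge semantics above possible.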