LLM Parameter Settings


  • To get the best programming performance out of LLMs, several key parameters need to be tuned.

  • Previously, these were adjusted by hand; in fact, we can simply ask each LLM to recommend its own parameter settings.

  • After asking each model for its recommendations, the following settings are currently in use:


  Model            Context Length   temperature   topP
  gemini1.5 pro    4096             0.2           0.95
  gpt-4o           4096             0.2           0.95
  Grok2 beta       4096             0.2           0.95
  Mistral Large    8192             0.1           0.9
  GPT-4o mini      4096             0.2           0.95
  Llama3.2 90b     4096             0.2           0.95
  Doubao           4096             0.1           1
  Qwen2.5 72b      4096             0.2           0.9
  Llama3.1 405b    4096             0.2           0.9
  • Among them, Mistral Large seems notably confident about the length of its context window. Judging from the other models' responses, however, such a large window is only needed for somewhat larger projects. Let's try these settings out and see.
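The table above can be captured in code so the recommended settings are applied consistently per model. Below is a minimal sketch in Python: the `SETTINGS` dictionary and the `request_params` helper are hypothetical names, and mapping the recommended context length onto a `max_tokens`-style field is an assumption — how that value is actually used depends on the provider's API.

```python
# Recommended settings from the table above, keyed by model name.
# "context_length" is the value each model recommended for itself.
SETTINGS = {
    "gemini1.5 pro": {"context_length": 4096, "temperature": 0.2, "top_p": 0.95},
    "gpt-4o":        {"context_length": 4096, "temperature": 0.2, "top_p": 0.95},
    "Grok2 beta":    {"context_length": 4096, "temperature": 0.2, "top_p": 0.95},
    "Mistral Large": {"context_length": 8192, "temperature": 0.1, "top_p": 0.9},
    "GPT-4o mini":   {"context_length": 4096, "temperature": 0.2, "top_p": 0.95},
    "Llama3.2 90b":  {"context_length": 4096, "temperature": 0.2, "top_p": 0.95},
    "Doubao":        {"context_length": 4096, "temperature": 0.1, "top_p": 1.0},
    "Qwen2.5 72b":   {"context_length": 4096, "temperature": 0.2, "top_p": 0.9},
    "Llama3.1 405b": {"context_length": 4096, "temperature": 0.2, "top_p": 0.9},
}

def request_params(model: str) -> dict:
    """Build keyword arguments for an OpenAI-style chat-completion call."""
    s = SETTINGS[model]
    return {
        "model": model,
        "temperature": s["temperature"],
        "top_p": s["top_p"],
        # Assumption: the recommended context length is used as the output cap.
        "max_tokens": s["context_length"],
    }

print(request_params("Mistral Large"))
```

A call would then look like `client.chat.completions.create(messages=..., **request_params("gpt-4o"))` for an OpenAI-compatible client.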