LLM Parameter Settings
- To best leverage the programming capabilities of LLMs, several key parameters need to be tuned.
- Previously these were adjusted by hand, but we can instead ask each LLM to recommend its own settings.
- After asking each model for its recommendations, the following settings are currently adopted:
 
| Model | Context Length (tokens) | Temperature | Top-p |
|---|---|---|---|
| Gemini 1.5 Pro | 4096 | 0.2 | 0.95 |
| GPT-4o | 4096 | 0.2 | 0.95 |
| Grok-2 Beta | 4096 | 0.2 | 0.95 |
| Mistral Large | 8192 | 0.1 | 0.9 |
| GPT-4o mini | 4096 | 0.2 | 0.95 |
| Llama 3.2 90B | 4096 | 0.2 | 0.95 |
| Doubao | 4096 | 0.1 | 1 |
| Qwen2.5 72B | 4096 | 0.2 | 0.9 |
| Llama 3.1 405B | 4096 | 0.2 | 0.9 |
- Among them, Mistral Large appears notably confident about its context window, recommending twice the length of the other models. Judging from the other responses, though, a window that large is only needed for somewhat larger projects. Let's try these settings out and see.
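
As a concrete illustration, here is a minimal sketch of applying the GPT-4o row from the table using the official OpenAI Python SDK (`pip install openai`). The prompt text is made up for the example, and treating the table's context length as a cap on output tokens via `max_tokens` is an assumption, since the SDK does not expose a direct context-window setting.

```python
# Minimal sketch: applying the GPT-4o settings from the table above
# with the OpenAI Python SDK. Assumes OPENAI_API_KEY is set in the
# environment. The prompt is illustrative only.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o",
    temperature=0.2,   # low temperature for more deterministic code output
    top_p=0.95,        # nucleus sampling cutoff from the table
    max_tokens=4096,   # assumption: treat the table's context length as an output cap
    messages=[
        {"role": "system", "content": "You are a careful programming assistant."},
        {"role": "user", "content": "Write a Python function that reverses a linked list."},
    ],
)

print(response.choices[0].message.content)
```

Other providers expose the same three knobs under similar names (for example `temperature`, `top_p`, and a max-token limit), so the rows of the table translate to their SDKs in the same way.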