🧠 DeepSeek LLM Chat with Parameter Tuning

A chat interface for DeepSeek LLM with tunable generation parameters:

- Model Provider: the backend serving the model.
- Prompt: the input text sent to the model.
- Temperature (0.1 – 1.5): sampling randomness; lower values give more deterministic output.
- Top-p (0.1 – 1): nucleus-sampling cutoff over cumulative token probability.
- Max New Tokens (32 – 2048): upper limit on the number of generated tokens.
- Repetition Penalty (1 – 2): discourages repeating already-generated tokens; 1 disables the penalty.
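The sliders above bound each sampling parameter. A minimal Python sketch of how those bounds could be enforced before passing settings to a model (the helper name and dictionary are illustrative, not part of the demo):

```python
# (min, max) slider range for each parameter, taken from the interface above.
PARAM_RANGES = {
    "temperature": (0.1, 1.5),
    "top_p": (0.1, 1.0),
    "max_new_tokens": (32, 2048),
    "repetition_penalty": (1.0, 2.0),
}

def clamp_params(params: dict) -> dict:
    """Return a copy of `params` with each value clipped into its slider range."""
    clamped = {}
    for name, value in params.items():
        lo, hi = PARAM_RANGES[name]
        clamped[name] = min(max(value, lo), hi)
    return clamped

# Example: out-of-range values are pulled back to the slider limits.
settings = clamp_params({
    "temperature": 2.0,        # above the 1.5 maximum
    "top_p": 0.9,              # already in range
    "max_new_tokens": 4096,    # above the 2048 maximum
    "repetition_penalty": 1.1, # already in range
})
```

The clamped dictionary can then be forwarded as keyword arguments to whatever generation API the chosen model provider exposes.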