
= Ollama deterministic output
{c}

TODO: haven't managed to make the output deterministic, even with `/set parameter seed 0`. Related discussions (a REST API sketch follows this list):
* https://github.com/ollama/ollama/issues/3775
* https://github.com/ollama/ollama/issues/2773#issuecomment-2732874259
* https://www.reddit.com/r/ollama/comments/1jmnb8b/testability_of_llms_the_elusive_hunt_for/
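
A minimal sketch of the same idea through the REST API, assuming a local server on the default port 11434 and a pulled model named `llama3.2` (both of which are assumptions); it pins `seed` and `temperature` in `options` and compares two runs:

```python
# Sketch: send the same prompt twice with a fixed non-zero seed and
# temperature 0, then check whether the two completions are identical.
# Assumes an Ollama server at localhost:11434 and a pulled "llama3.2" model.
import json
import urllib.request

URL = "http://localhost:11434/api/generate"

def generate(prompt: str) -> str:
    payload = {
        "model": "llama3.2",   # placeholder: use any model you have pulled
        "prompt": prompt,
        "stream": False,       # get one JSON object instead of a stream
        "options": {
            "seed": 42,        # fixed, non-zero seed
            "temperature": 0,  # remove sampling randomness as far as possible
        },
    }
    req = urllib.request.Request(
        URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

prompt = "What is the capital of France? Answer in one word."
print("identical" if generate(prompt) == generate(prompt) else "different")
```

If the two runs still differ with those options set, the remaining non-determinism presumably comes from the backend itself, which is what the issues above discuss.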

Determinism across different hardware:
* https://stackoverflow.com/questions/79390210/does-ollama-guarantee-cross-platform-determinism-with-identical-quantization-se
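
One low-tech way to compare runs across machines is to hash the generated text on each host and compare short digests instead of full outputs, e.g.:

```python
# Sketch: read generated text from stdin and print a short SHA-256 digest,
# so completions produced on different machines can be compared at a glance.
import hashlib
import sys

text = sys.stdin.read()
print(hashlib.sha256(text.encode()).hexdigest()[:16])
```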

It might be easier to just use <llama-cli> for this, since it has explicit `--seed` and `--temp` flags.
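
A rough sketch of driving `llama-cli` from a script with a fixed seed and zero temperature; the model path is a placeholder and `llama-cli` is assumed to be on `PATH`:

```python
# Sketch: run llama-cli with a pinned seed and temperature 0 and capture
# its output. The GGUF path and prompt are placeholders.
import subprocess

cmd = [
    "llama-cli",
    "-m", "model.gguf",  # placeholder path to a local GGUF model
    "-p", "What is the capital of France? Answer in one word.",
    "-n", "32",          # cap the number of generated tokens
    "--temp", "0",       # temperature 0
    "--seed", "42",      # fixed seed
]
result = subprocess.run(cmd, capture_output=True, text=True, check=True)
print(result.stdout)
```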