OurBigBook About$ Donate
 Sign in Sign up

Ollama deterministic output

Ciro Santilli (@cirosantilli, 37) ... AI text generation Text-to-text model Large language model Open source LLM Ollama Ollama HOWTO
Created 2025-03-20 Updated 2025-07-16  0 By others on same topic  0 Discussions Create my own version
TODO: I haven't managed to get deterministic output, even with /set parameter seed 0. Relevant threads:
  • github.com/ollama/ollama/issues/3775
  • github.com/ollama/ollama/issues/2773#issuecomment-2732874259
  • www.reddit.com/r/ollama/comments/1jmnb8b/testability_of_llms_the_elusive_hunt_for/
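Supposing the seed option does take effect, a minimal sketch of pinning both seed and temperature through Ollama's REST /api/generate endpoint (the model name is a placeholder; "seed" and "temperature" are documented request options):

```python
import json
import urllib.request

def build_payload(model, prompt, seed=0, temperature=0.0):
    """Build a /api/generate request body pinning seed and temperature."""
    return {
        "model": model,          # placeholder model name
        "prompt": prompt,
        "stream": False,
        "options": {"seed": seed, "temperature": temperature},
    }

payload = build_payload("llama3.2", "Count to three.")
print(json.dumps(payload["options"]))

# Actually sending the request requires a running local ollama server:
# req = urllib.request.Request(
#     "http://localhost:11434/api/generate",
#     data=json.dumps(payload).encode(),
#     headers={"Content-Type": "application/json"},
# )
# print(json.loads(urllib.request.urlopen(req).read())["response"])
```

Repeating the same pinned request and diffing the two responses is a quick way to check whether determinism actually holds on a given setup.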
On whether determinism holds across different hardware:
  • stackoverflow.com/questions/79390210/does-ollama-guarantee-cross-platform-determinism-with-identical-quantization-se
It might be easier to just use llama-cli for this, since it exposes seed and temperature flags directly.
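A hedged llama-cli invocation along those lines (flag spellings from recent llama.cpp builds; verify against your version's --help):

```shell
# llama.cpp: fix the sampling seed and zero the temperature
# model.gguf and the prompt are placeholders
llama-cli -m model.gguf -p "Count to three." --seed 0 --temp 0
```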



