OurBigBook About$ Donate
 Sign in Sign up

Ollama HOWTO

Ciro Santilli (@cirosantilli)
Created 2025-03-20, updated 2025-07-16
  • Table of contents
    • Ollama output size
    • Ollama deterministic output

Ollama output size

 0  0
Ollama HOWTO
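TODO: notes to come. In the meantime, a minimal sketch of what appears to be the relevant knob: the num_predict option caps the number of tokens Ollama generates. It can be set interactively with /set parameter num_predict 64, or through the options field of the HTTP API; the model name llama3.2 below is just an example:

    curl http://localhost:11434/api/generate -d '{
      "model": "llama3.2",
      "prompt": "Tell me about the moon.",
      "stream": false,
      "options": { "num_predict": 64 }
    }'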

Ollama deterministic output

 0  0
Ollama HOWTO
TODO: haven't managed, e.g. with /set parameter seed 0 (a full invocation is sketched after the links below):
  • github.com/ollama/ollama/issues/3775
  • github.com/ollama/ollama/issues/2773#issuecomment-2732874259
  • www.reddit.com/r/ollama/comments/1jmnb8b/testability_of_llms_the_elusive_hunt_for/
Across hardware:
  • stackoverflow.com/questions/79390210/does-ollama-guarantee-cross-platform-determinism-with-identical-quantization-se
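For reference, a minimal sketch of the attempt, using llama3.2 as an example model. In the REPL:

    ollama run llama3.2
    >>> /set parameter seed 0
    >>> /set parameter temperature 0
    >>> Tell me about the moon.

or equivalently through the options field of the HTTP API:

    curl http://localhost:11434/api/generate -d '{
      "model": "llama3.2",
      "prompt": "Tell me about the moon.",
      "stream": false,
      "options": { "seed": 0, "temperature": 0 }
    }'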
It might be easier to just use llama-cli from llama.cpp for this, which exposes --temp and --seed flags directly.
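A minimal sketch, assuming llama.cpp's llama-cli and a placeholder local GGUF model path:

    # Greedy sampling (temperature 0) plus a fixed seed; -n caps the output length.
    # ./model.gguf stands in for any local GGUF model file.
    llama-cli -m ./model.gguf \
      -p "Tell me about the moon." \
      --temp 0 \
      --seed 0 \
      -n 128

Note that at temperature 0 sampling is effectively greedy, so the seed mainly matters at nonzero temperatures.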


