OurBigBook About$ Donate
 Sign in+ Sign up
by Ciro Santilli (@cirosantilli, 37)

Ollama set parameter on CLI

Created 2025-03-20, updated 2025-06-12.
Tags: Ollama HOWTO
It appears to be impossible without expect. Fuck...
  • github.com/ollama/ollama/issues/2505
  • github.com/ollama/ollama/issues/1415
  • github.com/ollama/ollama/pull/3134
  • genai.stackexchange.com/questions/699/how-to-set-ollama-temperature-from-command-line/2277#2277
Attempt at: ollama-expect

ollama-expect

 0  0 
Ollama set parameter on CLI
(root) / ollama-expect
Usage:
./ollama-expect <model> <prompt>
e.g.:
./ollama-expect llama3.2 'What is quantum field theory?'
Benchmarks:
  • P14s: 4.8s, CPU only
  • P51: 9.6s, uses Nvidia GPU
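No expect is needed if you go through the HTTP API instead: `/api/generate` accepts an `options` object carrying the same keys as `/set parameter`. A sketch, assuming `ollama serve` is running on the default `localhost:11434`:

```shell
# Alternative to expect: pass parameters in the "options" object of the
# HTTP API. Assumes "ollama serve" is running on the default port; the
# trailing echo just keeps this sketch harmless when it is not.
curl -s http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "What is quantum field theory?",
  "options": { "temperature": 0 },
  "stream": false
}' || echo "is ollama serve running?"
```

This is likely the cleaner route for scripting, at the cost of having to parse the JSON response yourself.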


 About$ Donate Content license: CC BY-SA 4.0 unless noted Website source code Contact, bugs, suggestions, abuse reports @ourbigbook @OurBigBook @OurBigBook