OurBigBook About$ Donate
 Sign in+ Sign up
by Ciro Santilli (@cirosantilli)

ollama-expect

Created 2025-05-21, updated 2025-05-23.
Usage:
./ollama-expect <model> <prompt>
e.g.:
./ollama-expect llama3.2 'What is quantum field theory?'
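
The script source is not reproduced on this page. As a rough idea only, here is a minimal sketch of what such a wrapper could look like, assuming it uses expect to drive an interactive ollama run session: wait for the ">>> " prompt, optionally set parameters with /set parameter, send the prompt, wait for the answer to finish, then quit with /bye. The actual ollama-expect may differ.

#!/usr/bin/expect -f
# Hypothetical sketch, not the actual ollama-expect script.
set timeout -1
set model  [lindex $argv 0]
set prompt [lindex $argv 1]
spawn ollama run $model
expect ">>> "
# Parameters can be set from the interactive CLI before prompting, e.g.:
#   send "/set parameter temperature 0\r"
#   expect ">>> "
send "$prompt\r"
expect ">>> "
send "/bye\r"
expect eof

The /set parameter step is what links this to the parent topic: it is the interactive-CLI way of setting Ollama parameters, as opposed to declaring them with PARAMETER in a Modelfile.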
Benchmarks:
  • ThinkPad P14s: 4.8s, CPU only
  • ThinkPad P51: 9.6s, uses the Nvidia GPU
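
Timings like these can be measured by wrapping the example command with the shell's time builtin (hypothetical invocation, assuming the same prompt as above):

time ./ollama-expect llama3.2 'What is quantum field theory?'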
