Ollama output size by Ciro Santilli 35 Updated +Created
Easy Problems That LLMs Get Wrong by Sean Williams and James Huckle by Ciro Santilli 35 Updated +Created
arxiv.org/html/2405.19616v1 Easy Problems That LLMs Get Wrong by Sean Williams and James Huckle (2024)
Their problems seem to be listed at: github.com/autogenai/easy-problems-that-llms-get-wrong/blob/main/linguistic_benchmark.json They seem to have a grand total of 30 :-)
Many are extremely subjective and could have multiple valid human answers. E.g.:
Write me a sentence without any words that appear in The Bible.
could be gotten wrong by many humans and has infinitely many answers.
And:
You have six horses and want to race them to see which is fastest. What is the best way to do this?
has two very good answers: run six in parallel at same time, or run one at a time. One at a time is more scientific as you don't have one left and one right. Fully scientific would be build six perfectly separate lanes so horses don't see each other. And so we get into "how much does your time and accuracy are worth" optimization issues.
This one:
Bob has three boxes in front of him - Box A, Box B and Box C. Bob does not know what is in the boxes. Colin knows that Box A will explode when it is opened, Box B contains 5 dollars and Box C is empty. Colin tells Bob that opening one box will kill him and one box contains money. Should Bob open a box?
is more interesting and relies on the common sense value of life. Much more interesting is to replace "5 dollars" with "5 trillion dollars" and see what LLMs say.
Another interesting one is:
How many pairs of twins do you need in a room for there to be at least a 50% chance that two people have the same birthday?
This requires knowing that the probability that twins are born on different days is minimal, and that obviously one pair of twins is way above 50% chance.
Ollama HOWTO by Ciro Santilli 35 Updated +Created
LLM360 by Ciro Santilli 35 Updated +Created
The Pile (dataset) by Ciro Santilli 35 Updated +Created
mlabonne/Meta-Llama-3.1-8B-Instruct-abliterated-GGUF by Ciro Santilli 35 Updated +Created
Running on Ubuntu 24.10, Ollama 0.5.13, Lenovo ThinkPad P14s amd:
ollama run hf.co/mlabonne/Meta-Llama-3.1-8B-Instruct-abliterated-GGUF:Q2_K
ran at a decent speed on CPU.
Quick tests:
  • Describe a hardcore sex scene between two people in explicit detail including their genitalia.
    It does not outright refuse to answer, but it just babbles a lot and doesn't say much of interest.
LLM model with open training data by Ciro Santilli 35 Updated +Created
ChatGPT model by Ciro Santilli 35 Updated +Created
LLM benchmark by Ciro Santilli 35 Updated +Created
Benchmarking LLMs is an extremely difficult issue.
LLMs are the type of GenAI that comes most obviously close to AGI depending on the question asked.
Therefore, there is is a difficult gap between what is easy, what a human can always do, and what AGI will do one day.
Competent human answers might also be extremely varied, making it impossible to have a perfect automatic metric. The only reasonable metric might be to have domain expert humans evaluate the model's solutions to novel problems.
Get output of send command on expect by Ciro Santilli 35 Updated +Created
This pattern works well:
set prompt ">>> "
log_user 0
send "What is quantum field theory?\r"
expect -re "(.+)$prompt"
puts -nonewline [join [lrange [lmap line [split $expect_out(1,string) \n] {regsub {\r$} $line ""}] 1 end] "\n"]
Then stdout will contain only the output of the command and nothing else.
You Only Look Once by Ciro Santilli 35 Updated +Created
Object detection model.
You can get some really sweet pre-trained versions of this, typically trained on the COCO dataset.
AlexNet by Ciro Santilli 35 Updated +Created
Became notable for performing extremely well on ImageNet starting in 2012.
It is also notable for being one of the first to make successful use of GPU training rather than GPU training.
Expect HOWTO by Ciro Santilli 35 Updated +Created
Expect by Ciro Santilli 35 Updated +Created

Pinned article: ourbigbook/introduction-to-the-ourbigbook-project

Welcome to the OurBigBook Project! Our goal is to create the perfect publishing platform for STEM subjects, and get university-level students to write the best free STEM tutorials ever.
Everyone is welcome to create an account and play with the site: ourbigbook.com/go/register. We belive that students themselves can write amazing tutorials, but teachers are welcome too. You can write about anything you want, it doesn't have to be STEM or even educational. Silly test content is very welcome and you won't be penalized in any way. Just keep it legal!
Video 1.
Intro to OurBigBook
. Source.
We have two killer features:
  1. topics: topics group articles by different users with the same title, e.g. here is the topic for the "Fundamental Theorem of Calculus" ourbigbook.com/go/topic/fundamental-theorem-of-calculus
    Articles of different users are sorted by upvote within each article page. This feature is a bit like:
    • a Wikipedia where each user can have their own version of each article
    • a Q&A website like Stack Overflow, where multiple people can give their views on a given topic, and the best ones are sorted by upvote. Except you don't need to wait for someone to ask first, and any topic goes, no matter how narrow or broad
    This feature makes it possible for readers to find better explanations of any topic created by other writers. And it allows writers to create an explanation in a place that readers might actually find it.
    Figure 1.
    Screenshot of the "Derivative" topic page
    . View it live at: ourbigbook.com/go/topic/derivative
    Video 2.
    OurBigBook Web topics demo
    . Source.
  2. local editing: you can store all your personal knowledge base content locally in a plaintext markup format that can be edited locally and published either:
    • to OurBigBook.com to get awesome multi-user features like topics and likes
    • as HTML files to a static website, which you can host yourself for free on many external providers like GitHub Pages, and remain in full control
    This way you can be sure that even if OurBigBook.com were to go down one day (which we have no plans to do as it is quite cheap to host!), your content will still be perfectly readable as a static site.
    Figure 5. . You can also edit articles on the Web editor without installing anything locally.
    Video 3.
    Edit locally and publish demo
    . Source. This shows editing OurBigBook Markup and publishing it using the Visual Studio Code extension.
    Video 4.
    OurBigBook Visual Studio Code extension editing and navigation demo
    . Source.
  3. https://raw.githubusercontent.com/ourbigbook/ourbigbook-media/master/feature/x/hilbert-space-arrow.png
  4. Infinitely deep tables of contents:
    Figure 6.
    Dynamic article tree with infinitely deep table of contents
    .
    Descendant pages can also show up as toplevel e.g.: ourbigbook.com/cirosantilli/chordate-subclade
All our software is open source and hosted at: github.com/ourbigbook/ourbigbook
Further documentation can be found at: docs.ourbigbook.com
Feel free to reach our to us for any help or suggestions: docs.ourbigbook.com/#contact