This dude actually managed to convince a brain-dead British court that he was Satoshi and force a takedown of the Bitcoin whitepaper from bitcoin.org/bitcoin.pdf where it had been for many years prior: coinmarketcap.com/academy/article/bitcoin-org-ordered-to-take-down-bitcoin-whitepaper-because-of-copyright-infringement The page was updated to simply display the following Satoshi quote:
It takes advantage of the nature of information being easy to spread but hard to stifle. - Satoshi Nakamoto
The mere thought that Satoshi would attempt a copyright takedown of the Bitcoin whitepaper, and not be able to back his identity with any cryptographic keys, makes one shrivel to the bones.
Also, kids, this is why you put a fucking license on everything you release to the public, and especially when doing so anonymously!!! A quick CC BY-SA on that paper would have prevented all this bullshit.
The existence of this outrageous fraudster has had, it must be said, two good effects on the world:
- the release of Adam Back and Martti Malmi's early email history with Satoshi: www.forbes.com/sites/digital-assets/2024/02/23/new-emails-reveal-staggering-clues-to-the-mystery-of-bitcoin-creator-satoshi-nakamoto
- the memes: Craig Steven Wright memes
Timeline:
- 2015-12-08 Wired article claims he may be Satoshi: www.wired.com/2015/12/bitcoins-creator-satoshi-nakamoto-is-probably-this-unknown-australian-genius/. A few days later, evidence of foul play emerged, and on 2019-04-30 Wired retracted the article altogether
- 2016-05-02 publicly claims he is Satoshi www.timesofisrael.com/australian-entrepreneur-craig-wright-says-he-created-bitcoin/
- 2024-05-20 British judge James Mellor fisting the fuck out of Craig: www.reuters.com/technology/self-proclaimed-bitcoin-inventor-lied-repeatedly-support-claim-says-uk-judge-2024-05-20/
An Australian computer scientist who claimed he invented bitcoin lied "extensively and repeatedly" and forged documents "on a grand scale" to support his false claim, a judge at London's High Court ruled on Monday.
Interesting:
- www.reddit.com/r/Bitcoin/comments/4i7k9a/strange_edits_on_craig_wrights_wikipedia_page/ "Strange edits on Craig Wright's Wikipedia page made two days before the revelation, from an IP address in Barbados (possibly made by Craig himself?)"
Although Ciro Santilli is a bit past their era, there's an aura of technical excellence about those people. It just seems that they sucked at business. Those open source hippies. Erm, wait.
Bibliography:
- archive.org/details/sunburstascentof00hall Sunburst: the ascent of Sun Microsystems by Mark Hall (1990)
It can be seen as the limit case of an Einstein solid at high temperatures. At lower temperatures, the heat capacity drops below that constant value and becomes temperature dependent.
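Concretely, assuming the usual textbook expression for the Einstein solid heat capacity with Einstein temperature \theta_E (the formula itself is not quoted above, so this is just a sketch of the claimed limit):
C_V = 3 N k_B \left(\frac{\theta_E}{T}\right)^2 \frac{e^{\theta_E/T}}{\left(e^{\theta_E/T} - 1\right)^2} \xrightarrow{T \gg \theta_E} 3 N k_B
so at high temperatures the heat capacity tends to the constant value 3 N k_B, independent of temperature.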
This is good. But it misses some key operations, so much so that it makes Ciro not want to learn/use it daily.
As of December 2023, the cheapest instance with an Nvidia GPU is g4dn.xlarge, so let's try that out. In that instance, lspci contains:
00:1e.0 3D controller: NVIDIA Corporation TU104GL [Tesla T4] (rev a1)
so we see that it runs an Nvidia T4 GPU.
Be careful not to confuse it with g4ad.xlarge, which has an AMD GPU instead. TODO meaning of "ad"? "a" presumably means AMD, but what is the "d"?
Some documentation on which GPU is in each instance can be seen at: docs.aws.amazon.com/dlami/latest/devguide/gpu.html (archive) with a list of which GPUs they have at that random point in time. Can the GPU ever change for a given instance name? Likely not. Also as of December 2023 the list is already outdated, e.g. P5 is not shown, though it is mentioned at: aws.amazon.com/ec2/instance-types/p5/
When selecting the instance to launch, the GPU apparently doesn't show up anywhere on the instance information page, it is so bad!
Also note that this instance has 4 vCPUs, so on a new account you must first make a customer support request to Amazon to increase your limit from the default of 0 to 4, see also: stackoverflow.com/questions/68347900/you-have-requested-more-vcpu-capacity-than-your-current-vcpu-limit-of-0, otherwise instance launch will fail with:
You have requested more vCPU capacity than your current vCPU limit of 0 allows for the instance bucket that the specified instance type belongs to. Please visit aws.amazon.com/contact-us/ec2-request to request an adjustment to this limit.
When starting up the instance, also select:
- image: Ubuntu 22.04
- storage size: 30 GB (maximum free tier allowance)
Once you finally manage to SSH into the instance, first we have to install the drivers and reboot:
sudo apt update
sudo apt install nvidia-driver-510 nvidia-utils-510 nvidia-cuda-toolkit
sudo reboot
and now running:
nvidia-smi
shows something like:
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 525.147.05 Driver Version: 525.147.05 CUDA Version: 12.0 |
|-------------------------------+----------------------+----------------------+
| GPU Name Persistence-M| Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap| Memory-Usage | GPU-Util Compute M. |
| | | MIG M. |
|===============================+======================+======================|
| 0 Tesla T4 Off | 00000000:00:1E.0 Off | 0 |
| N/A 25C P8 12W / 70W | 2MiB / 15360MiB | 0% Default |
| | | N/A |
+-------------------------------+----------------------+----------------------+
+-----------------------------------------------------------------------------+
| Processes: |
| GPU GI CI PID Type Process name GPU Memory |
| ID ID Usage |
|=============================================================================|
| No running processes found |
+-----------------------------------------------------------------------------+
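As a side note, if you want to confirm the GPU from inside a program rather than via nvidia-smi, a minimal sketch using the standard CUDA runtime calls cudaGetDeviceCount and cudaGetDeviceProperties (not from the original setup, just an illustrative devinfo.cu, compiled with nvcc devinfo.cu) would be:
// devinfo.cu (hypothetical file name): list every visible CUDA device.
#include <stdio.h>
#include <cuda_runtime.h>

int main(void) {
    int n = 0;
    cudaGetDeviceCount(&n);
    for (int i = 0; i < n; i++) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        // On a g4dn.xlarge we expect a single Tesla T4 with ~15 GiB of memory.
        printf("device %d: %s, %zu MiB\n",
               i, prop.name, prop.totalGlobalMem / (1024 * 1024));
    }
    return 0;
}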
If we start from the raw Ubuntu 22.04, first we have to install drivers:
- docs.aws.amazon.com/AWSEC2/latest/UserGuide/install-nvidia-driver.html official docs
- stackoverflow.com/questions/63689325/how-to-activate-the-use-of-a-gpu-on-aws-ec2-instance
- askubuntu.com/questions/1109662/how-do-i-install-cuda-on-an-ec2-ubuntu-18-04-instance
- askubuntu.com/questions/1397934/how-to-install-nvidia-cuda-driver-on-aws-ec2-instance
From there basically everything should just work as normal. E.g. we were able to run a CUDA hello world just fine with:
nvcc inc.cu
./a.out
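The contents of inc.cu are not shown above, but a minimal CUDA hello world along those lines could look something like this (the kernel below is just an assumed stand-in, not the original file):
// inc.cu: print a message from a few GPU threads.
#include <stdio.h>

__global__ void hello() {
    // Device-side printf: each thread reports its index.
    printf("Hello from GPU thread %d\n", threadIdx.x);
}

int main(void) {
    // Launch 1 block of 4 threads on the GPU.
    hello<<<1, 4>>>();
    // Wait for the kernel to finish so the output gets flushed before exit.
    cudaDeviceSynchronize();
    return 0;
}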
One issue with this setup, besides the time it takes to set up, is that you might also have to pay some network charges as it downloads a bunch of stuff into the instance. We should try out some of the pre-built images. But it is also good to know this pristine setup just in case.
We then managed to run Ollama just fine with:
curl https://ollama.ai/install.sh | sh
/bin/time ollama run llama2 'What is quantum field theory?'
which gave:
0.07user 0.05system 0:16.91elapsed 0%CPU (0avgtext+0avgdata 16896maxresident)k
0inputs+0outputs (0major+1960minor)pagefaults 0swaps
so way faster than on my local desktop CPU, hurray.
After setup from: askubuntu.com/a/1309774/52975 we were able to run:
head -n1000 pap.txt | ARGOS_DEVICE_TYPE=cuda time argos-translate --from-lang en --to-lang fr > pap-fr.txt
which gave:
77.95user 2.87system 0:39.93elapsed 202%CPU (0avgtext+0avgdata 4345988maxresident)k
0inputs+88outputs (0major+910748minor)pagefaults 0swaps
so only marginally better than on the P14s. It would be fun to see how much faster we could make things on a more powerful GPU.
Capoeira music is amazing. And some Brazilian pop adaptations of it have also been awesome.
- Parabolicamará by Gilberto Gil. Title track of the 1991 album.
- Triste Bahia by Caetano Veloso.
- Na Roda de Capoeira by Nara Leão. From the 1964 album "Opinião De Nara".
Defining properties of elementary particles
As suggested at Physics from Symmetry by Jakob Schwichtenberg (2015) chapter 3.9 "Elementary particles", it appears that in the Standard Model, the behaviour of each particle can be uniquely defined by the following five numbers:
- mass
- spin
- electric charge
- weak charge (weak isospin)
- colour charge
Once you specify these properties, you could in theory just plug them into the Standard Model Lagrangian and simulate what happens.
Setting new random values for those properties would also allow us to create new particles. It appears to be unknown why we only see the particles that we do, and why their properties have the values they have.
Why can't you collimate incoherent light as well as a laser?
You could put an LED in a cavity with a long thin hole, but then most rays, which are not aligned with the hole, would just bounce around inside, producing heat.
So you would have a very hot device, and very little efficiency on the light output. This heat might also behave like a black-body radiation source, so you would not have a single frequency.
The beauty of lasers is that the laser cavity (two parallel mirrors around the medium) preferentially selects light moving parallel to the cavity axis, see e.g.: youtu.be/_JOchLyNO_w?t=832 from Video "How Lasers Work by Scientized (2017)".
Pinned article: ourbigbook/introduction-to-the-ourbigbook-project
Welcome to the OurBigBook Project! Our goal is to create the perfect publishing platform for STEM subjects, and get university-level students to write the best free STEM tutorials ever.
Everyone is welcome to create an account and play with the site: ourbigbook.com/go/register. We believe that students themselves can write amazing tutorials, but teachers are welcome too. You can write about anything you want, it doesn't have to be STEM or even educational. Silly test content is very welcome and you won't be penalized in any way. Just keep it legal!
Video 1. Intro to OurBigBook.
We have two killer features:
- topics: topics group articles by different users with the same title, e.g. here is the topic for the "Fundamental Theorem of Calculus": ourbigbook.com/go/topic/fundamental-theorem-of-calculus
  Articles of different users are sorted by upvote within each article page. This feature is a bit like:
  - a Wikipedia where each user can have their own version of each article
  - a Q&A website like Stack Overflow, where multiple people can give their views on a given topic, and the best ones are sorted by upvote. Except you don't need to wait for someone to ask first, and any topic goes, no matter how narrow or broad
  This feature makes it possible for readers to find better explanations of any topic created by other writers. And it allows writers to create an explanation in a place that readers might actually find it.
  Figure 1. Screenshot of the "Derivative" topic page. View it live at: ourbigbook.com/go/topic/derivative
  Video 2. OurBigBook Web topics demo.
- local editing: you can store all your personal knowledge base content locally in a plaintext markup format that can be edited locally and published either:
  - to OurBigBook.com to get awesome multi-user features like topics and likes
  - as HTML files to a static website, which you can host yourself for free on many external providers like GitHub Pages, and remain in full control
  This way you can be sure that even if OurBigBook.com were to go down one day (which we have no plans to do as it is quite cheap to host!), your content will still be perfectly readable as a static site.
  Figure 2. You can publish local OurBigBook lightweight markup files to either OurBigBook.com or as a static website.
  Figure 3. Visual Studio Code extension installation.
  Figure 5. You can also edit articles on the Web editor without installing anything locally.
  Video 3. Edit locally and publish demo. This shows editing OurBigBook Markup and publishing it using the Visual Studio Code extension.
- Infinitely deep tables of contents:
All our software is open source and hosted at: github.com/ourbigbook/ourbigbook
Further documentation can be found at: docs.ourbigbook.com
Feel free to reach out to us for any help or suggestions: docs.ourbigbook.com/#contact