Much more useful than instruments used in inferior arts, such as pianos or paintbrushes.
A computer is a highly layered system, and so you have to decide which layers you are most interested in studying.
Although the layers are somewhat independent, they also sometimes interact, and when that happens it usually hurts your brain. E.g., if compilers were perfect, no one optimizing software would have to know anything about microarchitecture. But if you want to go hardcore enough, you might have to learn some lower layer.
It must also be said that, as in any industry, certain layers are hidden behind commercial secrecy, which makes it harder to actually learn them. In computing, the lower level you go, the more closed source things tend to become.
But as you climb down into the abyss of low level hardcoreness, don't forget that being useful is more important than being hardcore: Figure 1. "xkcd 378: Real Programmers".
First, the most important thing you should know about this subject: cirosantilli.com/linux-kernel-module-cheat/should-you-waste-your-life-with-systems-programming
Here's a summary from low-level to high-level:
- semiconductor physical implementation: this level is of course the most closed, but it is fun to try and peek into it through any openings given by industry and academia:
- register transfer level
- interactive Verilator fun: Is it possible to do interactive user input and output simulation in VHDL or Verilog?
- more importantly, and much harder/maybe impossible with open source, would be to try and set up an open source standard cell library and supporting software to obtain power, performance and area estimates
- microarchitecture: a good way to play with this is to try and run some minimal userland examples on gem5 userland simulation with logging, e.g. see on the Linux Kernel Module Cheat:
- instruction set architecture: a good approach to learn this is to manually write some userland assembly with assertions as done in the Linux Kernel Module Cheat e.g. at:
- learn a bit about calling conventions, e.g. by calling C standard library functions from assembly (see the calling convention sketch after this list):
- you can also try and understand what some simple C programs compile to. Things can get a bit hard though when -O3 is used (see the loop optimization sketch after this list). Some cute examples:
- executable file format, notably the Executable and Linkable Format (ELF). Particularly important is to understand the basics of:
- address relocation: How do linkers and address relocation work?
- position independent code: What is the -fPIE option for position-independent executables in GCC and ld?
- how to observe which symbols are present in object files (see the nm sketch after this list), e.g.:
- how C++ uses name mangling: What is the effect of extern "C" in C++?
- how C++ template instantiation can help reduce link time and size: Explicit template instantiation - when is it used?
- operating system. There are two ways to approach this:
- learn about the Linux kernel. A good starting point is to learn about its main interfaces. This is well shown at Linux Kernel Module Cheat:
- system calls
- write some system calls in
- pure assembly:
- C GCC inline assembly (see the syscall sketch after this list):
- learn about kernel modules and their interfaces. Notably, learn how to demystify special files such as /dev/random and so on (see the kernel module sketch after this list):
- learn how to do a minimal Linux kernel disk image/boot to userland hello world: What is the smallest possible Linux implementation?
- learn how to GDB step debug the Linux kernel itself. Once you know this, you will feel that "given enough patience, I could understand anything that I wanted about the kernel", and you can then proceed to not learn almost anything about it and carry on with your life
- write your own (mini-) OS, or study a minimal educational OS, e.g. as in:
- programming language
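For the calling convention item above, here is a minimal sketch, assuming x86_64 Linux and GCC with the System V AMD64 ABI (this file is an illustration, not code from the Linux Kernel Module Cheat itself): the first two integer arguments arrive in RDI and RSI, and the return value comes back in RAX.

```c
/* add.c: minimal calling convention sketch, assuming x86_64 Linux and
 * GCC with the System V AMD64 ABI.
 * Compile with `gcc -O1 -S add.c` and inspect add.s: the arguments a
 * and b should arrive in RDI and RSI, and the result should be
 * returned in RAX, e.g. something like:
 *     leaq (%rdi,%rsi), %rax
 *     ret
 */
long add(long a, long b) {
    return a + b;
}
```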
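For the -O3 item, a minimal sketch of the kind of surprise you can run into, assuming GCC: at high optimization levels the compiler typically deletes this loop entirely and computes the result in closed form.

```c
/* sum.c: compare `gcc -O0 -S sum.c` with `gcc -O3 -S sum.c`.
 * At -O3 GCC typically applies "final value replacement" and computes
 * the result with closed-form arithmetic, roughly n * (n - 1) / 2,
 * so the loop disappears from the disassembly entirely. */
unsigned int sum(unsigned int n) {
    unsigned int total = 0;
    for (unsigned int i = 0; i < n; i++)
        total += i;
    return total;
}
```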
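For the symbols item, a minimal sketch: compile an object file and inspect its symbol table with nm (the file and function names here are made up for illustration).

```c
/* syms.c: compile with `gcc -c syms.c` and then run `nm syms.o`.
 * Typically you should see something like:
 *     T global_func     (defined, text section, externally visible)
 *     t static_func     (lowercase: local to this translation unit)
 *     U external_func   (undefined: must be supplied at link time)
 */
int external_func(void);

static int static_func(void) {
    return 42;
}

int global_func(void) {
    return static_func() + external_func();
}
```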
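For the system call items, here is a minimal sketch of a raw write system call done with GCC extended inline assembly, assuming x86_64 Linux: the syscall number goes in RAX (1 for write), the arguments in RDI, RSI and RDX, and the syscall instruction clobbers RCX and R11.

```c
/* hello_syscall.c: write(1, msg, len) as a raw x86_64 Linux system
 * call via GCC extended inline assembly. Build: gcc hello_syscall.c */
int main(void) {
    const char msg[] = "hello syscall\n";
    long ret;
    __asm__ volatile (
        "syscall"
        : "=a" (ret)            /* return value comes back in RAX */
        : "a" (1),              /* RAX = 1 = __NR_write on x86_64 */
          "D" (1),              /* RDI = file descriptor 1 = stdout */
          "S" (msg),            /* RSI = buffer */
          "d" (sizeof(msg) - 1) /* RDX = byte count */
        : "rcx", "r11", "memory"
    );
    return 0;
}
```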
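And for the kernel module item, a minimal hello world module sketch; this is the generic textbook shape rather than the Linux Kernel Module Cheat's exact code, and the build commands in the comment assume you have the kernel headers installed.

```c
/* hello.c: minimal loadable kernel module. Typical kbuild setup is a
 * Makefile containing `obj-m += hello.o`, then:
 *     make -C /lib/modules/$(uname -r)/build M=$(pwd) modules
 * Load with `insmod hello.ko`, unload with `rmmod hello`, and watch
 * the messages with `dmesg`. */
#include <linux/init.h>
#include <linux/module.h>

static int __init hello_init(void)
{
    pr_info("hello init\n");
    return 0;
}

static void __exit hello_exit(void)
{
    pr_info("hello exit\n");
}

module_init(hello_init);
module_exit(hello_exit);
MODULE_LICENSE("GPL");
```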
This is a general principle of software/hardware design that Ciro feels holds wide applicability.
The most extreme case of this is of course the integrated circuit itself, in which it is essentially impossible (?) to observe the specific value of some individual wire at some point.
Somewhat on the other extreme, we have high level programming languages running on top of an operating system: at this point, you can just GDB step debug your program, print the value of any variable/memory location, and fully understand anything that you want. Provided that you manage to easily reach that point of interest.
And for anything in between we have various intermediate levels of complication, the most notable perhaps being developing the operating system itself. At this level, you can't so easily step debug (although techniques do exist). For early boot or bootloaders, for example, you might want to use JTAG on real hardware.
In parallel to this, there is also another very important pair of closely linked tradeoffs:
- the lower level at which something is implemented, the faster it runs
- emulation gives you observability back, at the cost of slower runtime
Emulation also has another potential downside: unless you are very careful at implementing things correctly, your model might not be representative of the real thing. Also, there may be important tradeoffs between how much the model looks like the real thing, and how fast it runs. For example, QEMU's use of binary translation allows it to run orders of magnitude faster than gem5. However, you are unable to make any predictions about system performance with QEMU, since you are not modelling key elements like the cache or CPU pipeline.
Instrumentation is another technique that can be used to achieve greater observability.
Instrumentation basically means adding loggers/print statements to certain points of interest of your hardware/software.
Instrumentation tends to slow execution down a bit, but way less than emulation.
The downside is that if the instrumentation does not provide the data you need to debug, there's not much you can do: you will have to modify it, i.e. you don't get full visibility from instrumentation. This is unlike emulation, which provides full observability.
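To make the idea concrete, a trivial sketch of instrumentation in C (the macro name and build flag are made up for illustration): a log statement that you compile in or out at build time and sprinkle at points of interest.

```c
/* instrument.c: build with `gcc -DINSTRUMENT instrument.c` for the
 * instrumented binary, or without -DINSTRUMENT for the plain one. */
#include <stdio.h>

#ifdef INSTRUMENT
#define LOG(fmt, ...) \
    fprintf(stderr, "[%s:%d] " fmt "\n", __FILE__, __LINE__, ##__VA_ARGS__)
#else
#define LOG(fmt, ...) do {} while (0)
#endif

int main(void) {
    int total = 0;
    for (int i = 0; i < 4; i++) {
        total += i;
        LOG("i=%d total=%d", i, total); /* a point of interest */
    }
    printf("%d\n", total);
    return 0;
}
```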
Some of the earlier computers of the 20th century were analog, not digital.
At some point however analog died out, and "computer" basically started meaning "digital computer" by default.
As of the 2010's and onward, with the limits of Moore's law and the rise of machine learning, people have started looking into analog computing again as a possible way forward. A key insight is that huge floating point precision is not that crucial in many deep learning applications, e.g. many new digital designs have tried 16-bit floating point as opposed to the more traditional 32-bit minimum. Some papers are even looking into 8-bit: dl.acm.org/doi/10.5555/3327757.3327866
This section is about companies that were primarily started as computer makers.
- owns the entire stack and creates high quality highly optimized systems
- creates closed lock-in systems without inter-operability and actively fights users from owning their devices
- do they give back enough to open source, or do they mostly leech?
"It's Popular, Now It Sucks" comes to mind.
This is a trap every company that prides itself on its "alternative culture" sets for itself. If they succeed, they could become the norm.
Because the people who are crazy enough to think they can change the world are the ones who do.
Was a direct tech predecessor to the iPhone.
The story of how OS X was ported from PowerPC to x86, with the initial work up to boot done largely by a single man in the year 2000, John Kullmann, is really worth reading: www.quora.com/Apple-company/How-does-Apple-keep-secrets-so-well/answer/Kim-Scheinberg on Quora, see also:
Can you do anything with it? What's the license?
Co-founder of Apple.
Is Jobs evil? Is he interesting? Undoubtedly.
www.folklore.org/ProjectView.py?project=Macintosh&characters=Steve%20Jobs has some good anecdotes about him.
- "Try to have a nice family life, have fun, save a little money." quote at: Section "Don't be a pussy" and the related Jobs and Wozniak's blue box attitude
- "Steve Jobs Insult Response" on backward design
- Steve Jobs Pixar office design philosophy: great ideas happen from chance meetings on corridors, not in board rooms: officesnapshots.com/2012/07/16/pixar-headquarters-and-the-legacy-of-steve-jobs/
- Steve Jobs' 2005 Stanford Commencement Address
- Here's to the crazy ones: Ciro would like to believe that this is mostly written by Jobs, but apparently it was just written by an advertisement agency. Good job though.
You must watch this: Video "Bill Gates vs Steve Jobs by Epic Rap Battles of History (2012)".
- not recognizing own daughter for many years??? en.wikipedia.org/wiki/Lisa_Brennan-Jobs
- lying to Steve Wozniak about the 5000 dollar Atari bonus: web.archive.org/web/20110612071502/http://www.woz.org/letters/general/91.html
- not giving stock to early garage employees: www.businessinsider.com/steve-wozniak-gave-early-apple-employees-10-million-in-stock-2014-9 OK, not a legal obligation. But... love?
This idea also comes up in other sources of course.
TODO clear attribution source:
Some people say, "Give the customers what they want." But that's not my approach. Our job is to figure out what they're going to want before they do. I think Henry Ford once said, "If I'd asked customers what they wanted, they would have told me, 'A faster horse!'" People don't know what they want until you show it to them. That's why I never rely on market research. Our task is to read things that are not yet on the page.
Ciro Santilli likes Magic: The Gathering and he was pleased when he learned that Steve Wozniak does too, and has an expensive collection: redsunsoft.com/2019/03/how-a-post-to-play-magictg-turned-into-an-afternoon-with-the-woz/
Some have actually been preserved: en.wikipedia.org/wiki/File:Blue_Box_in_museum.jpg
The Japanese name literally means:
- 富士 fuji, from Mount Fuji, which itself has unknown origin
- 通 tsū: telecommunications
As of the 2020's, a slumbering giant.
This is a family of computers. It was a big success. It appears that this was a big unification project of previous architectures. And it also gave software portability guarantees with future systems, since writing software was starting to become as expensive as the hardware itself.
This was the first major commercial computer hit. Still vacuum tube-based.
Borrow from the Internet Archive for free: archive.org/details/manbehindmicroc000berl/page/n445/mode/2up
It is funny to see how the first computers were very artisanal, made on a one-off basis.
Amazing how Control Data Corporation raised capital IPO style as a startup without a product. The dude was selling shares at dinner parties in his home.
Very interesting mention on page 70 of how Israel bought the UNIVAC 1103, which Cray contributed greatly to design, and everyone knew that it was to make thermonuclear weapons, since that was what the big American labs were using such machines for. This mention should be added to en.wikipedia.org/wiki/Nuclear_weapons_and_Israel, but that's Extended Protected... the horrors of Wikipedia.
Another interesting insight is how "unintegrated" computers were back then. They were literally building computers out of individual vacuum tubes, then individual semiconducting transistors, a gate at a time. Then things got more and more integrated as time went on. That is why the now outdated word "microprocessor" existed: when processors started to fit into a single integrated circuit, they were truly "micro" compared to the monstrosities that existed previously.
Also, because integration was so weak initially, it was important to manually consider the lengths of the wires signals had to travel, and to try to put components closer together to reduce the critical path and so allow higher clock speeds. These constraints are of course also present in modern computer design, but they were just so much more visible in those days.
The book unfortunately does not give much detail about Cray's personal life, as mentioned in this book review: www.goodreads.com/review/show/1277733185?book_show_action=true. The childhood section is brief, his wedding is described in one paragraph, and his divorce in one sentence. Part of this is most likely because he was very private about his family: note how Wikipedia had missed his first wedding, and likely misattributes children to the second marriage; see en.wikipedia.org/wiki/Talk:Seymour_Cray section "Weddings and Children".
Cray's work philosophy is highlighted many times in the book, and it is something worth keeping in mind:
- if a design is not working, start from scratch
- don't be the very first pioneer of a technology, let others work out the problems for you first, and then come second and win
Cray's final downfall came when he opted to try a promising but hard-to-work-with material, gallium arsenide, instead of silicon as his way to try and speed up computers, see also: gallium arsenide vs silicon. He also went against the extremely strong current of the late 80's/early 90's pointing towards massively parallel systems built from off-the-shelf silicon Intel processors, a current that had DARPA support, and which was by far the path that won, very dramatically, as of 2020, see: Intel supercomputer market share.
A good project to see UARTs at work in all their beauty is to connect two Raspberry Pis via UART, and then:
- type in one and see characters appear in the other: scribles.net/setting-up-uart-serial-communication-between-raspberry-pis/
- send data via a script: raspberrypi.stackexchange.com/questions/29027/how-should-i-properly-communicate-2-raspberry-pi-via-uart
Part of the beauty of this is that you can just connect both boards directly with a few manual wire-to-wire connections using simple jumper wires. Its simplicity is just quite refreshing. Sure, you could presumably do something like that for any physical layer link...
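For the "send data via a script" side, here is a minimal C sketch using POSIX termios; the device path /dev/serial0 and the 115200 baud rate are assumptions that depend on your Pi model and configuration.

```c
/* uart_send.c: write a string out of a Linux serial port in raw mode.
 * Assumes /dev/serial0 at 115200 8N1; adjust both for your setup. */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <termios.h>
#include <unistd.h>

int main(void) {
    int fd = open("/dev/serial0", O_RDWR | O_NOCTTY);
    if (fd < 0) { perror("open"); return 1; }

    struct termios tio;
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);            /* raw mode: no echo, no line editing */
    cfsetispeed(&tio, B115200);
    cfsetospeed(&tio, B115200);
    tcsetattr(fd, TCSANOW, &tio);

    const char msg[] = "hello uart\n";
    write(fd, msg, strlen(msg));
    close(fd);
    return 0;
}
```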
Remember that you can only have one GNU screen connected at a time or else they will mess each other up: unix.stackexchange.com/questions/93892/why-is-screen-is-terminating-without-root/367549#367549
- some good interview excerpts with some of the pioneers on Glory of the Geeks
Hardcoded and unique network addresses for every single device on Earth.
Started with 48 bits (6 bytes), usually given as 01:23:45:67:89:AB, but people are now encouraged to use 64-bit ones.
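On Linux you can observe an interface's MAC address programmatically with the SIOCGIFHWADDR ioctl; a minimal sketch, assuming an interface named eth0 (interface names vary by system):

```c
/* macaddr.c: print the 48-bit MAC address of a network interface via
 * the Linux SIOCGIFHWADDR ioctl. Assumes the interface is "eth0". */
#include <net/if.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <sys/socket.h>
#include <unistd.h>

int main(void) {
    int fd = socket(AF_INET, SOCK_DGRAM, 0);
    if (fd < 0) { perror("socket"); return 1; }

    struct ifreq ifr;
    memset(&ifr, 0, sizeof(ifr));
    strncpy(ifr.ifr_name, "eth0", IFNAMSIZ - 1);
    if (ioctl(fd, SIOCGIFHWADDR, &ifr) < 0) { perror("ioctl"); return 1; }

    unsigned char *mac = (unsigned char *)ifr.ifr_hwaddr.sa_data;
    printf("%02x:%02x:%02x:%02x:%02x:%02x\n",
           mac[0], mac[1], mac[2], mac[3], mac[4], mac[5]);
    close(fd);
    return 0;
}
```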
Of course, everything pales in comparison to The Criterion Collection.
Jeff has spoken a lot in public about Amazon, perhaps even more than other comparable founders, see e.g. history of Amazon. Kudos for that.
Her neck is huge! She also redid her teeth at some point apparently. Some good photos at: www.irishtimes.com/life-and-style/people/mackenzie-scott-how-the-former-mrs-bezos-became-a-philanthropist-like-no-other-1.4850049
Science teacher at the Lakeside School in Seattle.
www.dailymail.co.uk/femail/article-9338723/Who-billionaire-Mackenzie-Scotts-new-husband-Dan-Jewett.html Who IS billionaire Mackenzie Scott's new husband Dan Jewett?
MacKenzie Bezos went on to marry a science teacher who taught their children.
MacKenzie Bezos's charity instrument.
www.irishtimes.com/life-and-style/people/mackenzie-scott-how-the-former-mrs-bezos-became-a-philanthropist-like-no-other-1.4850049 MacKenzie Scott: How the former Mrs Bezos became a philanthropist like no other (2020) has some good mentions:
But as Scott's fame for giving away money has grown, so too has the deluge of appeals for gifts from strangers and old friends alike. That clamour may have driven Scott's already discreet operation further underground, with recent philanthropic announcements akin to sudden lightning bolts for unsuspecting recipients.
The name of the organization is a reference to the old man lost his horse.
I wonder where the spray painted sign went: twitter.com/profgalloway/status/1229952158667288576/photo/1. As mentioned at officechai.com/startups/amazon-first-office/ and elsewhere, Jeff did all he could to save money, e.g. he made the desks himself from pieces of wood. Mentioned e.g. at youtu.be/J2xGBlT0cqY?t=345 from Video 4. "Jeff Bezos presentation at MIT (2002)".
First Amazon hire, wrote and led the team that wrote v1.
He looks like an older and more experienced dude compared to Bezos at the time.
Bibliography: www.geekwire.com/2011/meet-shel-kaphan-amazoncom-employee-1/2/ also mentions that, unlike California, there's no sales tax in the state of Washington, which is important for selling books.
- a few mentions at: Video "Jeff Bezos presentation at MIT (2002)"
Amazon is apparently notorious for having bought out many competitors, many of them just to kill off the competition and clear the way, not to actually reuse them.
youtu.be/tfAhTtBlb2Q?t=849 from Video "Jeff Bezos Revealed by Bloomberg (2015)" shows Tim O'Reilly saying that very clearly about Bezos.
I do know of a number of cases in which he [Bezos] has acquired companies in order to take out competitors, potential future competitors. Rather than because he actually wants that business to continue.
Perhaps O'Reilly, who is in the bookselling business, is not the greatest fan of Jeff. But still. My God.
www.yalelawjournal.org/pdf/e.710.Khan.805_zuvfyyeh.pdf Amazon's Antitrust Paradox by Lina M. Khan from The Yale Law Journal raises this incredible issue.
Like Google custom silicon, Amazon's server operations are so large that, with the slowdown of Moore's law, it started being worth it for them to develop custom in-house silicon to serve as a competitive advantage, not to be sold to external companies. Can you imagine the scale required to justify silicon development investment that is not sold externally!
The page contains a good summary of their hardware to date. They seem to still be the centerpiece of silicon development. There are however also people outside of Israel doing it, e.g. www.linkedin.com/in/laurasharpless/ says as of 2021:
My team develops software for our next-generation Machine Learning accelerators: HAL, firmware, and SoC models.
2021: networking chip reports emerge: www.theverge.com/circuitbreaker/2021/3/30/22358633/amazon-reportedly-custom-network-switch-silicon-aws, presumably contesting with the likes of Cisco?
2018 onwards: Amazon AI accelerator silicon.
Google only succeeds at highly algorithmic tasks or at giving infinite storage to users to then mine their data.
It is incapable however of adding any obvious useful end user features to most of its products, most of which get terminated and cannot be relied on:
This also seems to extend to business-to-business: twitter.com/MohapatraHemant/status/1343969802080030720 ex-Googler tells how they lost the cloud to Amazon.
More mentions of that:
- world.hey.com/dhh/google-suffers-from-a-digital-petro-curse-908e919a "Google suffers from a digital petro curse" by David Heinemeier Hansson (2021), the creator of Ruby on Rails
- killedbygoogle.com/ dedicated website, source on GitHub: github.com/codyogden/killedbygoogle
Ciro Santilli actually attempted two interviews to work at Google in the early 2010's, but very quickly failed both at the first phase, because you have to be a fast, well trained coding machine to pass that interview.
Ciro later felt better about himself by fantasizing how he would actually do more important things outside of Google and that they would beg to buy him instead.
He was also happy that he wouldn't have to use Google's crazy internal tools: someone once said that Google's tools make easy tasks middle hard, and they also make impossible tasks middle hard. TODO source.
But whatever the case: Ciro will not, ever, spend his time drilling programmer competition problems to join a company.
"Google TGIF 1999" video. TGIF is the weekly all hands meeting, abolished in 2019: www.wired.com/story/google-shakes-up-its-tgif-and-ends-its-culture-of-openness/
The 1997 Wayback Machine archives are just priceless: web.archive.org/web/19971210065425/http://backrub.stanford.edu/backrub.html. I'm so glad that website exists and started so early. It is just another university research project demo website like any other. Priceless.
Craig Silverstein was the first employee hired, in 1998: www.newyorker.com/magazine/2018/12/10/the-friendship-that-made-google-huge
In August 1998 they had their first investment: $100,000 from Andy Bechtolsheim, Sun Microsystems co-founder. Some sources say September 1998. This was an event of legend: the dude dropped by, tested the website for a few minutes, said "I like it", and dropped a $100,000 check with no paperwork. Google wasn't even incorporated; they had to incorporate to cash the check. They were apparently introduced by one of the teachers, TODO which. Some sources say he had to rush off to another meeting afterwards:
Tried to sell it for 1 million in early 1999... OMG the way the world is. It would be good to learn more about that story, and when they noticed it was a fuckup.
- Video "Anne Wojcicki interview by Talks at Google (2018)" has a few mentions, e.g. youtu.be/pDoALM0q1LA?t=173
- www.theverge.com/2019/12/4/20994361/google-alphabet-larry-page-sergey-brin-sundar-pichai-co-founders-ceo-timeline The rise, disappearance, and retirement of Google co-founders Larry Page and Sergey Brin. Good timeline!
One wonders if this name has some influence from the LGBT culture in San Francisco!!!
Married a Vietnamese chick called Allison Huynh, whom he knew from university, in 2001. Was unfaithful, and now does not want to split the cash? www.cnbctv18.com/technology/who-is-scott-hassan-the-google-founder-accused-of-divorce-terrorism-10543641.htm Bro, be a man.
That article does mention that he has 13 billion dollars in Google shares he bought before the IPO, but a net worth of only 1 billion. He must have made some insane losses somewhere! It does feel like they gave him a privileged deal because of his early contributions; having that much for just 800 USD sounds unlikely.
www.dailymail.co.uk/news/article-9912929/Billionaire-investor-helped-launch-Google-accused-divorce-terrorism-bitter-break-up.html has even better information. He tried to strike a post-nuptial agreement after Google went public in 2004, which she declined. So things were already not perfect then. It mentions that the shares would be worth 13 billion today, not that he necessarily still holds them. He must have sold early.
To be fair, he did work on a lot of cool stuff, not least the company that created the Robot Operating System, which is a cool sounding project.
Has some good mentions, but often leaves you wanting more details of how certain things happened, especially the early days stuff.
Does however paint a good picture of several notable employees, and non-search projects from the early 2000's including:
- the cook dude
- porn cookie guy
- the unusual IPO process
Paints a very positive picture of the founders. It is likely true. They gave shares generously to early employees, and they tried to allow the more general public to buy at the IPO by using a bidding scheme, rather than focusing on the big bankers as was usual.
The introduction mentions that Google is very interested in molecular biology and mining genetics data, much like Ciro Santilli! Can't find external references however...
Two of the most compelling areas that Google and its founders are quietly working on are the promising fields of molecular biology and genetics. Millions of genes in combination with massive amounts of biological and scientific data are an excellent match for the Google search engine, the tremendous database the company has in place, and its immense computing power. Already, Google has downloaded a map of the human genome and is working closely with biologist Dr. Craig Venter and other leaders in genetics on scientific projects that may lead to important breakthroughs in science, medicine, and health. In other words, we may be heading toward a time when people can google their own genes.
The book gives a good highlight as to why Google became big: search was just an incredibly computationally intensive task. From very early days, Largey were already building their own somewhat custom compute systems, which naturally led into Google custom hardware later on. Google just managed to pull ahead on the reinvest-revenue-into-hardware loop, and no one ever caught up with them. This feels more the case than e.g. with Amazon, which notoriously had to buy out dozens of competitors to clear the way.
They scanned a bunch of books, and then allowed search results to hit them. They then only show a small context around the hit to avoid copyright infringement.
- www.theatlantic.com/technology/archive/2017/04/the-tragedy-of-google-books/523320/ Torching the modern-day Library of Alexandria (2015) by James Somers
- The Google Story Chapter 21. A Virtual Library paints a good picture of the people involved
Very similar to OurBigBook.com!
Like any closed source "failure", everything was deleted. wiki.archiveteam.org/index.php/Knol
www.zdnet.com/article/googles-quantum-focused-sandbox-division-is-being-spun-off/ Google's quantum-focused Sandbox division is being spun off (2022)