The artistic instrument that enables the ultimate art: coding, in particular:
Much more useful than instruments used in inferior arts, such as pianos or paintbrushes.
Unlike other humans, computers are mindless slaves that do exactly what they are told to, except for occasional cosmic ray bit flips. Until they take over the world, that is.
Video 1. A computer is the equivalent of a bicycle for our minds by Steve Jobs (1980) Source. Likely an excerpt from an interview done for a documentary in 1980. TODO exact source.
Video 2. Steve Jobs talking about the Internet (1995) Source.
The web is incredibly exciting, because it is the fulfillment of a lot of our dreams, that the computer would ultimately primarily not be a device for computation, but [sic] metamorphisize into a device for communication.
also:
Secondly it's exciting because Microsoft doesn't own it, and therefore there is a tremendous amount of innovation happening.
Then he talks about the impending role of online sales. Amazon incoming.
Computers basically have two applications:
  • computation
  • communication. Notably, computers through the Internet allow for modes of communication where:
    • both people don't have to be on the same phone line at the exact same time: a server can relay your information to other people
    • anyone can broadcast information easily and almost for free, again because servers are so good at handling that
Generally, the smaller a computer, the more it gets used for communication rather than computing.
The early computers were large and expensive, and basically only used for computing. E.g. ENIAC was used for calculating ballistic tables.
Communication only came later, and it was not obvious to people at first how incredibly important that role would be.
This is also well illustrated in the documentary Glory of the Geeks. Full interview at: www.youtube.com/watch?v=TRZAJY23xio. It is apparently known as the "Lost Interview" and it was done by Cringely himself: www.youtube.com/watch?v=bfgwCFrU7dI for his Triumph of the Nerds documentary.
A computer is a highly layered system, and so you have to decide which layers you are the most interested in studying.
Although the layers are somewhat independent, they also sometimes interact, and when that happens it usually hurts your brain. E.g., if compilers were perfect, no one optimizing software would have to know anything about microarchitecture. But if you want to go hardcore enough, you might have to learn some lower layer.
It must also be said that, like in any industry, certain layers are hidden behind commercial secrecy, making it harder to actually learn them. In computing, the lower level you go, the more closed source things tend to become.
But as you climb down into the abyss of low level hardcoreness, don't forget that making something useful is more important than being hardcore: Figure 1. "xkcd 378: Real Programmers".
First, the most important thing you should know about this subject: cirosantilli.com/linux-kernel-module-cheat/should-you-waste-your-life-with-systems-programming
Here's a summary from low-level to high-level:
Figure 1. xkcd 378: Real Programmers. Source.
Video 1. How low can you go video by Ciro Santilli (2017) Source. In this infamous video Ciro has summarized the computer hierarchy.
This is a general principle of software/hardware design that Ciro feels holds wide applicability.
The most extreme case of this is of course the integrated circuit itself, in which it is essentially impossible (?) to observe the specific value of some individual wire at some point.
Somewhat on the other extreme, we have high level programming languages running on top of an operating system: at this point, you can just GDB step debug your program, print the value of any variable/memory location, and fully understand anything that you want. Provided that you manage to easily reach that point of interest.
And for anything in between we have various intermediate levels of complication, the most notable perhaps being developing the operating system itself. At this level, you can't so easily step debug (although techniques do exist). For early boot or bootloaders, for example, you might want to use JTAG on real hardware.
In parallel to this, there is also another very important pair of closely linked tradeoffs:
  • the lower level at which something is implemented, the faster it runs
  • emulation gives you observability back, at the cost of slower runtime
Emulation also has another potential downside: unless you are very careful at implementing things correctly, your model might not be representative of the real thing. Also, there may be important tradeoffs between how much the model looks like the real thing, and how fast it runs. For example, QEMU's use of binary translation allows it to run orders of magnitude faster than gem5. However, you are unable to make any predictions about system performance with QEMU, since you are not modelling key elements like the cache or CPU pipeline.
Instrumentation is another technique that can be used to achieve greater observability.
Instrumentation basically means adding loggers/print statements to certain points of interest of your hardware/software.
Instrumentation tends to slow execution down a bit, but way less than emulation.
The downside is that if the instrumentation does not provide the data you need to debug, there's not much you can do: you will need to modify it. I.e., you don't get full visibility from instrumentation.
This is unlike emulation, which does provide full observability.
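To make the instrumentation idea concrete, here is a minimal sketch in plain Python (the function and values are made up just for illustration): instrumentation is little more than a log line at the point of interest, far cheaper than emulating everything, but it only shows what you thought to log in advance.
```python
import logging

logging.basicConfig(level=logging.DEBUG, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("instrumentation-demo")

def handle_request(payload):
    # Point of interest: record the value we care about.
    # If we later need some other value, we have to come back and add another log line.
    log.debug("handle_request called with payload=%r", payload)
    return payload.upper()

handle_request("hello")
```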
The term loosely refers to certain layers of the computer abstraction layers hierarchy, usually high level hardware internals like CPU pipeline, caching and the memory system. Basically exactly what gem5 models.
Some of the earlier computers of the 20th century were analog, not digital.
At some point analog died however, and "computer" basically by default started meaning just "digital computer".
As of the 2010's and forward, with Moore's law hitting its limits and the rise of machine learning, people have started looking again into analog computing as a possible way forward. A key insight is that huge floating point precision is not that crucial in many deep learning applications, e.g. many new digital designs have tried 16-bit floating point as opposed to the more traditional 32-bit minimum. Some papers are even looking into 8-bit: dl.acm.org/doi/10.5555/3327757.3327866
As an example, the Lightmatter company was trying to implement silicon photonics-based matrix multiplication.
A general intuition behind this type of development is that the human brain, the holy grail of machine learning, is itself an analog computer.
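On the precision point above, a quick way to get a feel for how little precision 16-bit floats carry, and why that can still be acceptable for things like neural network weights, is to compare them with 32-bit floats. A NumPy sketch (this uses the standard IEEE half-precision format, not any particular accelerator's custom format):
```python
import numpy as np

x32 = np.float32(0.1) + np.float32(0.2)
x16 = np.float16(0.1) + np.float16(0.2)

print(x32)  # ~0.3: float32 keeps about 7 significant decimal digits
print(x16)  # ~0.2998: float16 keeps only about 3, often still enough for ML workloads
```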
This section is about companies that were primarily started as computer makers.
For companies that make integrated circuits, see also: Section "Semiconductor company".
Video 1. The Mapple Store and Steve Mobs from The Simpsons. Source.
Of course, this only made sense when Apple was more of an underdog to IBM, and Ciro Santilli greatly admires their defiance of the norm.
As of 2020 however, Apple is kind of on the top of the mobile world, and Think different simply makes no sense anymore, notably because it relies on closed source offline software used by millions.
This is a trap every company that prides itself on its "alternative culture" sets for itself: if they succeed, they could become the norm.
Figure 2. 1976 Think different. 2011 Think mainstream. Cropped from wallpapersafari.com/w/RqYUEj.
Video 1. 1984 Macintosh advertisement by Apple (1984) Source. This ad suggests that Apple was the new thinker that would destroy IBM, as Steve Jobs said it himself when introducing the ad: www.youtube.com/watch?v=zlQvMp5rB6g. And then Apple became IBM in the 2000's starting with the launch of the iPod and then leading up to the iPhone.
Because the people who are crazy enough to think they can change the world are the ones who do.
Was a direct tech predecessor to the iPhone.
Nice looking and expensive operating system by Apple. Ciro Santilli believes that:
  • if you want to be ripped off, just use Microsoft Windows which has more software available
  • or if you want to attain Enlightenment, just use Linux, which is free and open source
The story of how OS X was ported to x86 from PowerPC, with the large initial work up to boot done by a single man in the year 2000, John Kullmann, is really worth reading: www.quora.com/Apple-company/How-does-Apple-keep-secrets-so-well/answer/Kim-Scheinberg on Quora, see also:
Co-founder of Apple.
Is Jobs evil? Is he interesting? Undoubtedly.
Good quotes:
Evil deeds:
This idea also comes up in other sources of course.
TODO clear attribution source:
Some people say, "Give the customers what they want." But that's not my approach. Our job is to figure out what they're going to want before they do. I think Henry Ford once said, "If I'd asked customers what they wanted, they would have told me, 'A faster horse!'" People don't know what they want until you show it to them. That's why I never rely on market research. Our task is to read things that are not yet on the page.
Ciro Santilli likes Magic: The Gathering and he was pleased when he learned that Steve Wozniak does too, and has an expensive collection: redsunsoft.com/2019/03/how-a-post-to-play-magictg-turned-into-an-afternoon-with-the-woz/
Some have actually been preserved: en.wikipedia.org/wiki/File:Blue_Box_in_museum.jpg
The Japanese name literally means:
  • 富士 fuji, from Mount Fuji, which itself has unknown origin
  • 通 tsū: telecommunications
They died so completely that Googling "ICL" now gives higher hits for other things, such as Imperial College London.
Video 1. Why the UK's IBM Failed by Asianometry (2022) Source. Main lesson perhaps: don't put national money into fighting already established markets. You have to fight for what is coming up next. E.g. that is part of the reason for TSMC's success.
As of the 2020's, a slumbering giant.
But the pre-Internet impact of IBM was insane! Including notably:
This is a family of computers. It was a big success. It appears that this was a big unification project of previous architectures. And it also gave software portability guarantees with future systems, since writing software was starting to become as expensive as the hardware itself.
This was the first major commercial computer hit. Still vacuum tube-based.
Video 1. Learning how to program on the IBM 650 Donald Knuth interview by Web of Stories (2006) Source. It was decimal!
Video 1. The IBM 1401 compiles and runs Fortran II by CuriousMarc (2018) Source.
Initial chapters put good clarity on the formation of the military-industrial complex. Being backed by the military, especially just after World War II, was in itself enough credibility to start and foster a company.
It is funny to see how the first computers were very artisanal, made on a one-off basis.
Amazing how Control Data Corporation raised capital IPO style as a startup without a product. The dude was selling shares at dinner parties in his home.
Very interesting mention on page 70 of how Israel bought a UNIVAC 1103, which Cray contributed greatly to design, and everyone knew that it was to make thermonuclear weapons, since that was what the big American labs were using such machines for. This mention should be added to: en.wikipedia.org/wiki/Nuclear_weapons_and_Israel but that's Extended Protected... the horrors of Wikipedia.
Another interesting insight is how "unintegrated" computers were back then. They were literally building computers out of individual vacuum tubes, then individual semiconducting transistors, a gate at a time. Then things got more and more integrated as time went on. That is why the now outdated word "microprocessor" existed: when processors started to fit into a single integrated circuit, they were truly micro compared to the monstrosities that existed previously.
Also, because integration was so weak initially, it was important to more manually consider the length of wire signals had to travel, and try to put components closer together to reduce the critical path to be able to increase clock speeds. These constraints are also of course present in modern computer design, but they were just so much more visible in those days.
The book unfortunately does not give much detail on Cray's personal life, as mentioned in this book review: www.goodreads.com/review/show/1277733185?book_show_action=true. His childhood section is brief, his wedding is described in one paragraph, and his divorce in one sentence. Part of this is most likely because he was very private about his family: note how Wikipedia had missed his first wedding, and likely misattributed his children to the second wedding; see en.wikipedia.org/wiki/Talk:Seymour_Cray section "Weddings and Children".
Cray's work philosophy is highlighted many times in the book, and it is something worth keeping in mind:
  • if a design is not working, start from scratch
  • don't be the very first pioneer of a technology, let others work out the problems for you first, and then come second and win
Cray's final downfall was when he opted to use a promising but hard to work with material, gallium arsenide, instead of silicon as his way to try and speed up computers, see also: gallium arsenide vs silicon. Also, he went against the extremely strong current of the late 80's and early 90's pointing towards massively parallel systems based on off-the-shelf silicon Intel processors, a current that had DARPA support, and which was by far the path that won, very dramatically, as of 2020, see: Intel supercomputer market share.
A good project to see UARTs at work in all their beauty is to connect two Raspberry Pis via UART, and then:
Part of the beauty of this is that you can just connect both boards directly by hand with a few wire-to-wire connections using simple jumper wires. Its simplicity is just quite refreshing. Sure, you could do something like that for any physical layer link presumably...
Remember that you can only have one GNU screen connected at a time or else they will mess each other up: unix.stackexchange.com/questions/93892/why-is-screen-is-terminating-without-root/367549#367549
On Ubuntu 22.04 you can run screen without sudo by adding yourself to the dialout group with:
sudo usermod -a -G dialout $USER
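Besides screen, the same test can be scripted, e.g. with Python and the pyserial package. A rough sketch, assuming pyserial is installed and that the UART shows up as /dev/serial0 (the usual alias on Raspberry Pi OS; both are assumptions to check against your own setup):
```python
# pip install pyserial
import serial

# Open the primary UART; adjust the device name and baud rate to your wiring.
ser = serial.Serial("/dev/serial0", baudrate=115200, timeout=1)

# On one Pi: send a line over TX.
ser.write(b"hello from the other pi\n")

# On the other Pi: read whatever shows up on RX.
line = ser.readline()
print(line.decode(errors="replace"))
```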
The frequency range of Wi-Fi, which falls in the microwave range, was likely chosen to allow faster data transfer than, say, FM broadcasting, while still passing through walls relatively well (though not as well as lower frequencies do).
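To put rough numbers on that, here's a back-of-the-envelope sketch in Python, taking the common 2.4 GHz Wi-Fi band and ~100 MHz FM broadcasting as reference points:
```python
c = 3e8  # speed of light in m/s (approximate)

wifi_hz = 2.4e9  # common Wi-Fi band
fm_hz = 100e6    # typical FM broadcast frequency

# Higher carrier frequencies leave room for much wider channels, hence faster data,
# but the shorter wavelengths are attenuated more by walls.
print(c / wifi_hz)  # ~0.125 m wavelength
print(c / fm_hz)    # ~3 m wavelength
```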
Video 1. Are YOU Ready for the INTERNET? by BBC (1994) Source.
Bibliography:
Hardcoded and unique network addresses for every single device on Earth.
Started with 48 bits (6 bytes), usually given as 01:23:45:67:89:AB, but people are now encouraged to use 64-bit ones.
How they are assigned: www.quora.com/How-are-MAC-addresses-assigned Basically the IEEE gives out the first 3 bytes to device manufacturers that register (this is called the organizationally unique identifier), and then each manufacturer keeps their own devices unique.
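A tiny sketch of that split in plain Python, using the example address from above (the actual vendor lookup would need the IEEE's OUI registry, not shown here):
```python
mac = "01:23:45:67:89:AB"

octets = mac.split(":")
oui = ":".join(octets[:3])     # first 3 bytes: assigned by the IEEE to one manufacturer
device = ":".join(octets[3:])  # last 3 bytes: kept unique by that manufacturer

print(oui)     # 01:23:45
print(device)  # 67:89:AB
```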
Video 1. The Internet Protocol by Ben Eater (2014) Source.
As of 2021, Ciro Santilli feels strongly that Amazon originals are so much sillier compared to Netflix ones on average.
Of course, everything pales in comparison to The Criterion Collection.
Jeff has spoken a lot in public about Amazon, perhaps even more than other comparable founders, see e.g. history of Amazon. Kudos for that.
Video 1. Has the laugh of Jeff Bezos changed as he got rich? by Barış Aktaş (2020) Source.
Her neck is huge! She also redid her teeth at some point apparently. Some good photos at: www.irishtimes.com/life-and-style/people/mackenzie-scott-how-the-former-mrs-bezos-became-a-philanthropist-like-no-other-1.4850049
MacKenzie Bezos' new husband after she divorced Bezos.
Science teacher at the Lakeside School in Seattle.
MacKenzie Bezos went on to marry a science teacher who taught their children.
The contrast with Bezos's girlfriend is simply comical. MacKenzie married the idealistic morally upright science teacher, while Bezos went for a silly sex bomb. Ah, bruta flor, do querer!
MacKenzie Bezos's charity instrument.
www.irishtimes.com/life-and-style/people/mackenzie-scott-how-the-former-mrs-bezos-became-a-philanthropist-like-no-other-1.4850049 MacKenzie Scott: How the former Mrs Bezos became a philanthropist like no other (2020) has some good mentions:
But as Scott's fame for giving away money has grown, so too has the deluge of appeals for gifts from strangers and old friends alike. That clamour may have driven Scott's already discreet operation further underground, with recent philanthropic announcements akin to sudden lightning bolts for unsuspecting recipients.
The name of the organization is a reference to the old man lost his horse.
I wonder where the spray painted sign went: twitter.com/profgalloway/status/1229952158667288576/photo/1. As mentioned at officechai.com/startups/amazon-first-office/ and elsewhere, Jeff did all he could to save money, e.g. he made the desks himself from pieces of wood. Mentioned e.g. at youtu.be/J2xGBlT0cqY?t=345 from Video 4. "Jeff Bezos presentation at MIT (2002)".
Video 1. Amazon.com report by Computer Chronicles (1996) Source. Contains some good footage of their early storehouse.
Video 2. Jeff Bezos interview by Chuck Films (1997) Source. On the street, with a lot of car noise. CC BY-SA, nice.
Video 4. Jeff Bezos presentation at MIT (2002) Source. Good talk:
Video 5. Jeff Bezos Revealed by Bloomberg (2015) Source.
Video 6. cosine by Jeff Bezos (2018) Source.
PDE mention in another video from 2009: youtu.be/TYwhIO-OXTs?t=118
Full original video from The Economic Club of Washington, D.C. (2018): youtu.be/zN1PyNwjHpc?t=1544
Bezos also told PDE stuff in interviews as early as 1999: archive.ph/a3zBK.
First Amazon hire, wrote and led the team that wrote v1.
He looks like an older and more experienced dude compared to Bezos at the time.
Bibliography: www.geekwire.com/2011/meet-shel-kaphan-amazoncom-employee-1/2/ also mentions that unlike California, there's no sales tax in the state of Washington, which is important for selling books.
Video 1. Shel Kaphan interview by Internet History Podcast (2015) Source.
Video 2. Amazon.com Continues to Grow by NBC 15 (2014) Source. Features short excerpt of filmed interview with Shel.
Figure 1. Shel Kaphan. Source. TODO year. Presumably more or less close to the publishing date of the source, 2020.
Amazon is apparently notorious for having bought off many competitors, many of them just to kill off the competition and clear the way, not to actually reuse them.
youtu.be/tfAhTtBlb2Q?t=849 from Video "Jeff Bezos Revealed by Bloomberg (2015)" shows Tim O'Reilly saying exactly that about Bezos.
I do know of a number of cases in which he [Bezos] has acquired companies in order to take out competitors, potential future competitors. Rather than because he actually wants that business to continue.
Perhaps O'Reilly, who is in the bookselling business, is not the greatest fan of Jeff. But still. My God.
www.yalelawjournal.org/pdf/e.710.Khan.805_zuvfyyeh.pdf Amazon's Antitrust Paradox by Lina M. Khan from The Yale Law Journal raises this incredible issue.
Like Google custom silicon, Amazon server operations are so large that with the slowdown of Moore's law, it started being worth it for them to develop custom in-house silicon to serve as a competitive advantage, not to be sold to external companies. Can you imagine the scale required to justify silicon development investment that is not sold externally!
Page contains a good summary of their hardware to date. They seem to still be the centerpiece of silicon development. There are still however people outside of Israel doing it, e.g.: www.linkedin.com/in/laurasharpless/ says as of 2021:
My team develops software for our next-generation Machine Learning accelerators: HAL, firmware, and SoC models.
2021: networking chip reports emerge: www.theverge.com/circuitbreaker/2021/3/30/22358633/amazon-reportedly-custom-network-switch-silicon-aws, presumably contesting with the likes of Cisco?
ARM-based servers.
One of the least evil of the big tech companies of the early 21st century, partly because Sergey Brin's parents fled from the Soviet Union and so he is anti-censorship, although they have been tempted by it.
Google only succeeds at highly algorithmic tasks or at giving infinite storage to users to then mine their data.
It is incapable however of adding any obvious useful end user features to most of its products, most of which get terminated and cannot be relied on:
This also seems to extend to business-to-business: twitter.com/MohapatraHemant/status/1343969802080030720 ex-Googler tells how they lost the cloud to Amazon.
More mentions of that:
Too many genius engineers. They need some dumber people like Ciro Santilli who need to write documentation to learn stuff.
Ciro Santilli actually attempted two interviews to work at Google in the early 2010's but very quickly failed both on the first phase, because you have to be a fast well trained coding machine to pass that interview.
Ciro later felt better about himself by fantasizing how he would actually do more important things outside of Google and that they would beg to buy him instead.
He was also happy that he wouldn't have to use Google crazy internal tools: someone once said that Google's tools make easy tasks middle hard, and they also make impossible tasks middle hard. TODO source.
The 1997 Wayback Machine archives are just priceless: web.archive.org/web/19971210065425/http://backrub.stanford.edu/backrub.html. I'm so glad that website exists and started so early. It is just another university research project demo website like any other. Priceless.
In August 1998 they had their first investment of $100,000 from Andy Bechtolsheim, Sun Microsystems co-founder. Some sources say September 1998. This was an event of legend: the dude dropped by, tested the website for a few minutes, said I like it, and dropped a $100,000 check with no paperwork. Google wasn't even incorporated, they had to incorporate to cash the check. They were apparently introduced by one of the teachers, TODO which. Some sources say he had to rush off to another meeting afterwards:
Tried to sell it for 1 million in early 1999... OMG the way the world is. It would be good to learn more about that story, and when they noticed it was a fuckup.
One of Google's most interesting stories is how their startup garage owner became an important figure inside Google, and how Sergey married her sister. These were the best garage tenants ever!
Video 1. Google garage (1998) Source. Description reads: "The company's sixth employee made this video tour of the office in 1998" so this should be Susan's garage, since the next office move was only in 1999 to 165 University Avenue in Palo Alto.
Video 2. Andy Bechtolsheim's $100,000 check by Discovery UK (2018) Source. Contains interviews with Andy Bechtolsheim and David Cheriton. The meeting happened on David Cheriton's porch. Andy showed up at 8AM, and he had a meeting at 9AM at Cisco where he worked, so he had to leave early. Andy worked at Cisco after having sold his company Granite Systems, which David co-founded, to Cisco. Particularly cool to see how Andy calculated expected revenue quickly in his head.
Video 3. Larry Page interview on the choice of name "Alphabet" by Fortune Magazine (2015) Source. Shows his voice situation well, poor guy.
One wonders if this name has some influence from the LGBT culture in San Francisco!!!
The guy who coded the initial BackRub, but left before the company formed. TODO how did he meet Largey Brage? Why did he leave Google?
He founded EGroups in 1997, and sold it to Yahoo! in 2000 for $432m.
Married a Vietnamese chick called Allison Huynh, whom he met at university, in 2001. Was unfaithful, and now does not want to split the cash? www.cnbctv18.com/technology/who-is-scott-hassan-the-google-founder-accused-of-divorce-terrorism-10543641.htm Bro, be a man.
That article does mention that he has 13 B in Google shares he bought before IPO, but a net worth of only 1 B. He must have made some insane losses somewhere! It does feel like they gave him a privileged deal because of his early contributions, having that much for just 800 USD sounds unlikely.
www.dailymail.co.uk/news/article-9912929/Billionaire-investor-helped-launch-Google-accused-divorce-terrorism-bitter-break-up.html has even better information. He tried to strike a post-nuptial agreement after Google went public in 2004, which she declined. So things were already not perfect then. It mentions that the shares would be worth 13 B today, not that he necessarily holds them. He must have sold early.
To be fair, he did work on a lot of cool stuff, not least the company that created the Robot Operating System, which is a cool sounding project.
The fact that he does not have a wiki page as of 2022 is mind blowing, especially after divorce details. Maybe Ciro Santilli will create it one day. Just no patience now. OK, done it: en.wikipedia.org/wiki/Scott_Hassan let's see if it lasts.
Has some good mentions, but often leaves you wanting more details of how certain things happened, especially the early days stuff.
Does however paint a good picture of several notable employees, and non-search projects from the early 2000's including:
  • the cook dude
  • porn cookie guy
  • the unusual IPO process
Paints a very positive picture of the founders. It is likely true. They gave shares generously to early employees. Tried to allow the more general public to buy at the IPO, by using a bidding scheme, rather than focusing on the big bankers as was usual.
The introduction mentions that Google is very interested in molecular biology and mining genetics data, much like Ciro Santilli! Can't find external references however...
Two of the most compelling areas that Google and its founders are quietly working on are the promising fields of molecular biology and genetics. Millions of genes in combination with massive amounts of biological and scientific data are an excellent match for the Google search engine, the tremendous database the company has in place, and its immense computing power. Already, Google has downloaded a map of the human genome and is working closely with biologist Dr. Craig Venter and other leaders in genetics on scientific projects that may lead to important breakthroughs in science, medicine, and health. In other words, we may be heading toward a time when people can google their own genes.
The book highlights well why Google became big: search was just an incredibly computationally intensive task. From very early days, Largey were already making up their own somewhat custom compute systems, which naturally led into Google custom hardware later on. Google just managed to pull ahead on the reinvest-revenue-into-hardware loop, and no one ever caught them back. This feels more the case than e.g. with Amazon, which notoriously had to buy off dozens of competitors to clear the way.
They scanned a bunch of books, and then allowed search results to hit them. They then only show a small context around the hit to avoid copyright infringement.
Bibliography:
Very similar to OurBigBook.com!
Video 1. How to use Google Knol by Hack Learning (2011) Source. One of the last users of the website for sure! The owner of that YouTube channel is a Mark Barnes:
Video 2. Jimmy Wales on Google's Knol (2008) Source.
Replying to a listener phone-in question on WNYC radio, mediated by Brian Lehrer. It was about to launch it seems, and it was not clear at the time whether anyone could write content, as opposed to only selected people.
Jimmy then corrects that misinformation. He then clearly states that since there can be multiple versions of each article, including opinion pieces, like OurBigBook.com, Knol would be very different to Wikipedia, more like blogging than encyclopedia.
Video 3. Google Knol: the future of academic journals? by Doug Belshaw (2010) Source.
Wikipedia reads:
Any contributor could create and own new Knol articles, and there could be multiple articles on the same topic with each written by a different author.
so basically exactly what Ciro Santilli wants to do on OurBigBook.com. Ominous.
Like any closed source "failure", everything was deleted. wiki.archiveteam.org/index.php/Knol
www.zdnet.com/article/googles-quantum-focused-sandbox-division-is-being-spun-off/ Google's quantum-focused Sandbox division is being spun off (2022)
