Like Google custom silicon, Amazon's server operations are so large that, with the slowdown of Moore's law, it started being worth it for them to develop custom in-house silicon to serve as a competitive advantage, not to be sold to external companies. Can you imagine the scale required to justify silicon development investment that is not sold externally!
Some of the early computers of the 20th century were analog computers, not digital.
At some point analog died however, and "computer" basically by default started meaning just "digital computer".
From the 2010s onward, with the limits of Moore's law and the rise of machine learning, people have started looking again into analog computing as a possible way forward. A key insight is that huge floating point precision is not that crucial in many deep learning applications, e.g. many new digital designs have tried 16-bit floating point as opposed to the more traditional 32-bit minimum. Some papers are even looking into 8-bit: dl.acm.org/doi/10.5555/3327757.3327866
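To get a feel for how little is lost with reduced precision, here is a minimal sketch (not from the original text; it assumes NumPy is available) comparing a 16-bit floating point matrix multiplication against a 32-bit reference:

```python
# Hypothetical illustration: compare float16 vs float32 matrix multiplication.
# The point is that the error is small relative to typical deep learning tolerances.
import numpy as np

rng = np.random.default_rng(0)
a = rng.standard_normal((256, 256)).astype(np.float32)
b = rng.standard_normal((256, 256)).astype(np.float32)

ref = a @ b                                          # 32-bit reference result
low = a.astype(np.float16) @ b.astype(np.float16)    # 16-bit version

rel_err = np.abs(low.astype(np.float32) - ref) / (np.abs(ref) + 1e-12)
print("median relative error:", np.median(rel_err))  # small, roughly on the order of 1e-3
```

Analog matrix multipliers push this trade-off further: they accept even noisier arithmetic in exchange for potentially large gains in speed and energy efficiency.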
As an example, the Lightmatter company was trying to implement silicon photonics-based matrix multiplication.
A general intuition behind this type of development is that the human brain, the holy grail of machine learning, is itself an analog computer.
This ISA basically completely dominated the smartphone market of the 2010s and beyond, but it started appearing in other areas as well, as the end of Moore's law made it more economical for large companies to start developing their own silicon, e.g. Google custom silicon, Amazon custom silicon.
It is exciting to see ARM entering the server, desktop and supercomputer market circa 2020, beyond its dominant mobile position and roots.
Ciro Santilli likes to see the underdogs rise and take a bite out of the dominant ones.
Conversely however, the same excitement applies to RISC-V possibly taking over ARM's mobile market one day.
Basically, as long as you are a huge company seeking to develop a CPU and able to control your own ecosystem independently of Windows' desktop dominance (held in place by the need for backward compatibility with a billion end user programs), ARM would be a possibility on your mind.
- in 2020, the Fugaku supercomputer, which uses an ARM-based Fujitsu-designed chip, became the number 1 fastest supercomputer in the TOP500: www.top500.org/lists/top500/2021/11/. It was later beaten by another x86 supercomputer www.top500.org/lists/top500/2022/06/, but the message was clearly heard.
- 2012 hackaday.com/2012/07/09/pedal-powered-32-core-arm-linux-server/ pedal-powered 32-core Arm Linux server. A publicity stunt, but still, cool.
- AWS Graviton
Given enough computational power per dollar, AGI is inevitable, but it is not certain that it will ever happen given the end of Moore's Law.
Alternatively, it could also be achieved with genetically modified biological brains + brain in a vat.
Imagine a brain the size of a building, perfectly engineered to solve certain engineering problems, and giving hints to human operators + taking feedback from cameras and audio attached to the operators.
This likely implies transhumanism, and mind uploading.
Ciro Santilli joined the silicon industry at one point to help increase our computational capacity and reach AGI.
Ciro believes that the easiest route to full AI, if any, could involve Ciro's 2D reinforcement learning games.
Ciro Santilli is a fan of this late 2010's buzzword.
It basically came about because of the endless stream of useless software startups made since the 2000s by one or two people with no investment, riding the continued increase in computer and Internet speeds, until the great wall was reached.
Deep tech means not one of those. More specifically, it means technologies that require significant investment in expensive materials and laboratory equipment to progress, such as molecular biology technologies and quantum computing.
And it basically comes down to technologies that wrestle with the fundamental laws of physics rather than software data wrangling.
Computers are of course limited by the laws of physics, but those limits are largely hidden behind several layers of indirection.
Full visibility and full control make computer tasks ones that eventually always work out more or less as expected.
The same does not hold true when real Physics is involved.
Physics is brutal.
To start with, you can't even see your system very clearly, and often doing so requires altering its behaviour.
For example, in molecular biology, most great discoveries are made after some new technique is developed that makes it possible to observe smaller things.
But you often have to kill your cells to make those observations, which makes it very hard to understand how they work dynamically.
What we would really want would be to track every single protein as it goes about inside the cell. But that is likely an impossible dream.
The same for the brain. If we had observations of every neuron, how long would it take to understand it? Not long, people are really good at reverse engineering things when there is enough information available to do so, see also science is the reverse engineering of nature.
Then, even when you start to see the system, you might have a very hard time controlling it, because it is so fragile. This is basically the case of quantum computing in 2020.
It is for those reasons that deep tech is so exciting.
The next big things will come from deep tech. Failure is always a possibility, and you can't know before you try.
But that's also why it's so fun to dare.
Stuff that Ciro Santilli considers "deep tech" as of 2020:
- brain-computer interface
- fusion power. The question there is, when is "deep", "too deep"?
Google has put considerable effort into custom hardware to greatly optimize its stack, in a way that is quite notable compared to other tech companies.
- 2021 www.theregister.com/2021/03/23/google_to_build_server_socs/ Google vows to build its own server system-on-chips, hires Intel veteran. Inevitable with the end of Moore's law. Instruction set architecture unannounced however. I'll bet on the ARM instruction set.
- 2021 codec acceleration for YouTube: www.tomshardware.com/uk/news/intel-replaces-xeons-with-custom-vcus
As of 2019, the silicon industry is ending, and molecular biology technology is one of the most promising and fastest growing fields of engineering.
Such advances could one day lead to both biological super-AGI and immortality.
Ciro Santilli is especially excited about DNA-related technologies, because DNA is the centerpiece of biology, and it is programmable.
First, starting in the 2000s, the cost of DNA sequencing fell, reaching about 1000 USD per genome by the end of the 2010s: Figure 2. "Cost per genome vs Moore's law from 2000 to 2019", largely due to Illumina's technology.
The consequences of this revolution are still trickling down towards medical applications as of 2019, inevitably, but somewhat slowly due to tight privacy controls on medical records.
Ciro Santilli predicts that when the 100 dollar mark is reached, every person in the First World will have their genome sequenced, and then medical applications will be closer at hand than ever.
But even 100 dollars is not enough. Sequencing power is like computing power: humankind can never have enough. Sequencing is not a one per person thing. For example, as of 2019 tumors are already being sequenced to help understand and treat them, and scientists/doctors will sequence as many tumor cells as budget allows.
Then, in the 2010's, CRISPR/Cas9 gene editing started opening up the way to actually modifying the genome that we could now see through sequencing.
What's next?
Ciro believes that the next step in the revolution could be: de novo DNA synthesis.
This technology could be the key to one of the ultimate dreams of biologists: cheap programmable biology with push-button organism bootstrap!
Just imagine this: in the comfort of your own garage, you take some model organism of interest, maybe starting humble with Escherichia coli. Then you modify its DNA to your liking, and upload it to a 3D-printer-sized machine on your workbench, which automatically synthesizes the DNA and injects it into a bootstrapped cell.
You then make experiments to check if the modified cell achieves your desired new properties, e.g. production of some protein, and if not reiterate, just like a software engineer.
Of course, even if we were able to do the bootstrap, the debugging process then becomes key, as visibility is the main limitation of biology; maybe we need other cheap technologies to come in at that point.
This is a point where we see the beauty of evolution at its brightest: evolution does not require observability. But it also implies that if your changes to the organism make it less fit, then your mutation will also likely be lost. This has to be one of the considerations made when designing your organism.
Other cool topics include:
- computational biology: simulations of cell metabolism, proteins and small molecules, including computational protein folding and chemical reactions. This is basically the simulation part of omics. If we could only simulate those, we would basically "solve molecular biology". Just imagine: instead of experimenting for a whole year, the 2021 Nobel Prize in Physiology and Medicine could have been won with a few hours on a supercomputer to determine which protein had the desired properties, using just DNA sequencing as a starting point!
- microscopy: crystallography, cryoEM
- analytical chemistry: mass spectrometry, single cell analysis (Single-cell RNA sequencing)
It's weird, cells feel a lot like embedded systems: small, complex, hard to observe, and profound.
Ciro is sad that by the time he dies, humanity won't have understood the human brain, maybe not even a measly Escherichia coli... Heck, even key molecular biology events are not yet fully understood, see e.g. transcription regulation.
One of the most exciting aspects of molecular biology technologies is their relatively low entry cost, compared for example to other areas such as fusion energy and quantum computing.
Unconditional basic income is Ciro Santilli's ultimate non-transhumanist technological dream: to reach a state of technological advancement and distribution of resources so high that everyone gets money for doing nothing, enough for:
- basic survival needs: food, housing, clothes, hygiene, etc.
- two children to keep the world going. Or immortality tech, but that is harder and borderline transhumanist :-)
- high speed computer and Internet
Once a person has that, they can "learn, teach" and create whatever they want. Or play video games all day long if they wish.
Ciro Santilli will not live to see this, and is content with helping it happen faster by increasing the efficiency of the world. And having at least two well educated kids to carry on the project after he dies :-)
Technologies which would help a lot towards unconditional basic income, and which might be strictly required, are:
- artificial general intelligence
- affordable humanoid robots with human-like energy efficiency and power-to-weight ratio. This is even less likely than AGI due to the end of silicon Moore's Law and the fact that we are only at the start of the Genome's Moore's law: information doubles, small sizes halve, but macroscopic mechanical artifacts stay the same. Brain-computer interfaces are pretty certain to happen however, after Ciro Santilli dies.
So in the worst case we can just grow brainless bodies and fill the cranial cavity with a computer that controls the body, possibly with high level decisions coming from a remote building-sized genetically engineered biological AGI brain.
Of course, it is all about costs. A human costs about 130k 2010 USD/year. So how cheap can we make the AGI/robot human-equivalent per year for a given task?
AGI + humanoid robots likely implies AI takeover though. It would then come down to human loving bots vs human hating bots fighting it out. It will be both terrifying and fun to watch.
AGI alone would be very dangerous, in case it can get control of our nuclear arsenals through software zero days or social engineering. Although some claim that is unlikely.
Humanity's best bet to achieve silicon AGI today is to work on: Ciro's 2D reinforcement learning games.
By Charles Bukowski mentioned e.g. at tatyanany.medium.com/slavery-was-never-abolished-it-was-only-extended-to-include-all-the-colors-6ca21d586e7e:
Slavery was never abolished, it was only extended to include all the colors.
Bibliography:
- www.youtube.com/watch?v=bldeaDRWJYc Lecture 24: Unemployment, Re-employment & Income Security by Ian Shapiro (2019)