= Analog computer
{wiki}
Some of the earlier computers of the 20th century were analog computers, not digital ones.
At some point analog computing died out, however, and "computer" started to mean just "<digital computer>" by default.
From the 2010s onward, with <Moore's law> hitting its limits and the rise of <machine learning>, people have started looking at analog computing again as a possible way forward. A key insight is that high floating point precision is not crucial in many <deep learning> applications: many newer digital designs use <16-bit floating point> instead of the more traditional 32-bit minimum, and some papers even explore 8-bit: https://dl.acm.org/doi/10.5555/3327757.3327866
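A minimal sketch of why reduced precision is tolerable (not from the original article, using NumPy and arbitrary matrix sizes): compute the same matrix product in 32-bit and 16-bit floating point and compare the results.

``
# Sketch: deep-learning-style matrix products tolerate reduced precision.
import numpy as np

rng = np.random.default_rng(0)
a = rng.standard_normal((256, 256)).astype(np.float32)
b = rng.standard_normal((256, 256)).astype(np.float32)

ref = a @ b                                                    # 32-bit reference
low = (a.astype(np.float16) @ b.astype(np.float16)).astype(np.float32)

rel_err = np.abs(ref - low) / (np.abs(ref) + 1e-6)
print(f"median relative error: {np.median(rel_err):.5f}")     # small, ~1e-3
``

The median relative error stays on the order of 10^-3, which is typically negligible compared to the noise already present in trained network weights and activations.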
As an example, the <Lightmatter> company was trying to implement <silicon photonics>-based matrix multiplication.
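The same tolerance argument applies to analog hardware, whose outputs are inherently noisy. A hypothetical illustration (the 1% noise level and layer sizes are assumptions, not Lightmatter specifications): model an analog matrix multiply as the exact product plus multiplicative noise and check how often the argmax "classification" decision survives.

``
# Sketch: analog noise on a matrix multiply rarely changes the argmax output.
import numpy as np

rng = np.random.default_rng(1)
W = rng.standard_normal((10, 784))      # one layer's weights, hypothetical sizes
trials, agree = 1000, 0
for _ in range(trials):
    x = rng.standard_normal(784)
    exact = W @ x
    noisy = exact * (1 + 0.01 * rng.standard_normal(10))  # assumed 1% analog noise
    agree += np.argmax(exact) == np.argmax(noisy)
print(f"argmax unchanged in {agree / trials:.1%} of trials")
``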
A general intuition behind this type of development is that the <human brain>, the holy grail of <machine learning>, is itself an <analog computer>.