Analog computer Updated +Created
Some of the earliest computers of the 20th century were analog computers, not digital ones.
At some point, however, analog computing died out, and "computer" by default came to mean just "digital computer".
As of the 2010s and onward, with Moore's law reaching its limits and the rise of machine learning, people have started looking into analog computing again as a possible way forward. A key insight is that high floating point precision is not crucial in many deep learning applications: many newer digital designs use 16-bit floating point instead of the more traditional 32-bit minimum, and some papers even look into 8-bit: dl.acm.org/doi/10.5555/3327757.3327866
As an example, the Lightmatter company was trying to implement silicon photonics-based matrix multiplication.
A general intuition behind this type of development is that the human brain, the holy grail of machine learning, is itself an analog computer.
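As a rough numerical sketch of the claim that deep learning tolerates reduced precision, the following toy comparison (assuming NumPy is available; the layer sizes and random data are made up for illustration) contrasts a 16-bit matrix product against a 32-bit reference:

```python
import numpy as np

# Hypothetical single dense layer: activations @ weights.
rng = np.random.default_rng(0)
activations = rng.standard_normal((64, 256)).astype(np.float32)
weights = rng.standard_normal((256, 128)).astype(np.float32)

# 32-bit reference result.
reference = activations @ weights

# Same product computed in 16-bit, then cast back for comparison.
low_precision = (
    activations.astype(np.float16) @ weights.astype(np.float16)
).astype(np.float32)

# Maximum relative deviation of the 16-bit result from the 32-bit one.
relative_error = np.abs(reference - low_precision).max() / np.abs(reference).max()
print(f"max relative error of float16 matmul: {relative_error:.4f}")
# Typically a small fraction of a percent: large compared to float32 rounding,
# but tolerable for many deep learning workloads.
```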
Telecommunication Updated +Created
Communicating at a distance, from the Greek "tele", meaning "far"!
A very cool thing about telecommunication, besides how incredibly fast it advanced (in this sense it is no cooler than integrated circuit development), is how much physics and information theory is involved in it. Techniques developed to implement telecommunication also spill over into other fields: some proposed quantum computing approaches, for example, rely on remarkably similar technology, such as microwaves and silicon photonics.
This understanding made Ciro Santilli wish he had opted for telecommunications engineering back when he was in school in Brazil. For some incomprehensible reason, telecommunications was the least competitive specialization in the electrical engineering department at the time, behind even power electronics. This goes to show both how completely disconnected from reality university is, and how completely outdated Brazil is/was. Sad stuff.