# AnalogComputing

*ThoughtStorms Wiki*

Analog computing is coming back, particularly because it offers fast, low-power (and eventually cheap) NeuralNetworks and EdgeAI.

Mythic AI: https://mythic.ai/

**Quora Answer: Are analogue computers obsolete?**

Not necessarily.

What is an "analogue computer"?

It's basically something that does calculus with electricity. You model the terms of your mathematical equation with the electrical characteristics of the components, so that the circuit acts as an "analogue" of the equation: when you change particular resistances or voltages, you are changing the parameters of the equation.
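To make that correspondence concrete, here is a small sketch of my own (not from the original answer): a classic analogue building block is the RC circuit, whose capacitor voltage obeys dV/dt = -V/(RC). Swapping in a different resistor or capacitor literally changes the equation's parameter, which is the sense in which the circuit is an "analogue" of the maths. The code integrates that equation numerically and compares it with the exact solution V(t) = V0·e^(-t/RC); the component values are arbitrary illustrative choices.

```python
import math

def rc_discharge(v0, r, c, t_end, dt=1e-6):
    """Numerically integrate the circuit equation dV/dt = -V/(R*C)."""
    v, t = v0, 0.0
    while t < t_end:
        v += dt * (-v / (r * c))  # Euler step: the capacitor discharging
        t += dt
    return v

R, C, V0 = 1e3, 1e-6, 5.0           # 1 kOhm, 1 uF, 5 V: time constant RC = 1 ms
simulated = rc_discharge(V0, R, C, t_end=1e-3)
exact = V0 * math.exp(-1.0)         # after one time constant, V = V0 / e
print(simulated, exact)
```

A real analogue computer gets this result "for free" and continuously, because the physics *is* the integration; the digital version above has to step through it.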

What that means is that analogue computers are *fast*. But *inflexible*.

Every time you need to model a new equation, you need to build a new circuit. To an extent, you can do that with a bunch of modules and some kind of plug-board or patch-bay. But there are still only a limited number of combinations.

And that has tended to mean that digital computers, which can be reconfigured in software and can run "any" computational algorithm, win out.

But analogue scores if you need to solve the same equations many times, very quickly.

Already there are certain applications where we want the same thing calculated again and again, very fast. Bitcoin mining. Neural computing. Other kinds of simulation. Perhaps some kinds of navigation or other kinematic algorithms in robotics.

Not all the algorithms we can use for these things can be modelled with analogue circuits. But some can.
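One example of an algorithm that does map well onto analogue circuits (my illustration, not from the original answer) is the multiply-accumulate at the heart of neural networks, which is roughly what chips like Mythic's exploit. Weights become conductances G, inputs become voltages V, and Kirchhoff's current law sums the products for free: each output current is I_j = Σ_i V_i·G_ij, a matrix-vector product done by physics. The sketch below just models that idealised crossbar digitally; the values are made up.

```python
def crossbar_currents(voltages, conductances):
    """Output currents of an idealised analogue crossbar: I = V . G,
    where G[i][j] is the conductance joining input line i to output line j."""
    n_out = len(conductances[0])
    return [
        sum(v * row[j] for v, row in zip(voltages, conductances))
        for j in range(n_out)
    ]

V = [1.0, 0.5]                      # input voltages (volts)
G = [[2e-3, 1e-3],                  # conductances in siemens: the "weights"
     [4e-3, 0.0]]
print(crossbar_currents(V, G))      # summed currents per output line, in amps
```

The appeal is that the whole matrix-vector product happens in one step, in parallel, at the speed of the electronics, rather than as a loop of digital multiplies.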

For these applications we've already started moving from plain CPUs, to GPUs (which have simpler logic, repeated many more times), to FPGAs (large arrays of software-configurable simple logic units), to ASICs ("application-specific integrated circuits").

I'm sure some ASICs are already doing what is effectively "analogue computing". And when we discover algorithms that are basically calculus, and the economics work out, custom analogue electronics will do the job a lot faster, using a lot less energy and more cheaply, once we've got over the up-front cost of making an ASIC in the first place.

**Some links:**

Why Algorithms Suck and Analog Computers are the Future - De Gruyter Conversations