
Thought it was very interesting to see the precursors of modern computers and how they achieved the various mathematical functions mechanically.


The Thomson tide-predicting machine (that's William Thomson, better known as Lord Kelvin of kelvin fame) takes us back to the late 1800s.

https://en.wikipedia.org/wiki/Tide-predicting_machine

One implementation of it was a notable part of WWII:

> They came to be regarded as of military strategic importance during World War I, and again during the Second World War, when the US No.2 Tide Predicting Machine, described below, was classified, along with the data that it produced, and used to predict tides for the D-Day Normandy landings and all the island landings in the Pacific War.

https://en.wikipedia.org/wiki/Tide-Predicting_Machine_No._2
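
For anyone curious what the machine was actually computing: it mechanically summed a set of cosine terms (the harmonic constituents of the local tide), each with its own amplitude, speed and phase, via pulleys and a wire. A rough sketch of the same sum in Python (the constituent speeds below are the standard M2/S2/K1 values, but the amplitudes and phases are just made up for illustration):

    import math

    # Hypothetical constituents: (amplitude in metres, speed in degrees/hour, phase in degrees).
    # Real machines used 10-40 of these, derived from local tide-gauge records.
    constituents = [
        (1.20, 28.984, 110.0),  # M2: principal lunar semidiurnal
        (0.45, 30.000,  45.0),  # S2: principal solar semidiurnal
        (0.30, 15.041, 200.0),  # K1: lunisolar diurnal
    ]

    def tide_height(t_hours, mean_level=2.0):
        # The sum of cosines that the pulleys and wire added up mechanically.
        return mean_level + sum(
            amp * math.cos(math.radians(speed * t_hours + phase))
            for amp, speed, phase in constituents
        )

    for t in range(0, 25, 6):
        print(f"t = {t:2d} h  height = {tide_height(t):5.2f} m")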

From Veritasium: "The Most Powerful Computers You've Never Heard Of" - the tide calculator plays a prominent part in the video. https://youtu.be/IgF3OX8nT0w (The follow-up video, "Future Computers Will Be Radically Different", covers the same topic and gets into more modern implementations and uses: https://youtu.be/GVsUOuSjvcg - https://the-analog-thing.org is the device shown in that video.)


What's amazing is that some of these fire-control systems used up to 15 kW just to keep all the motors and mechanical linkages moving!


On the topic of analog computers, I recently decided to explore the topic a bit and built something similar to "Tennis for Two" (one of the first video games) on a single breadboard using op amps and relays: https://blog.qiqitori.com/2024/08/implementing-tennis-for-tw...
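
At its core the game is basically two integrators per axis: gravity gets integrated into velocity, and velocity into position, and the op amps do that continuously. A rough digital sketch of that integration chain (constants are arbitrary, not the values from the post, and the relay/switching logic is left out):

    # Toy digital version of the integration chain for the ball's vertical axis.
    dt = 0.001        # time step, seconds
    g = -9.8          # gravity fed into the first integrator
    vy, y = 6.0, 0.0  # initial velocity (set by the "hit") and position

    trajectory = []
    while y >= 0.0:          # until the ball comes back down to the court
        vy += g * dt         # first integrator: acceleration -> velocity
        y += vy * dt         # second integrator: velocity -> position
        trajectory.append(y)

    print(f"flight time ~{len(trajectory) * dt:.2f} s, peak height ~{max(trajectory):.2f}")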


And it's fascinating that, unlike digital computers, where operations take clock cycles, calculations in an analog computer are effectively instantaneous.


It's better than that. In an analog computer, both the inputs and outputs are _continuous_, so it's possible to resolve very small deltas, limited only by the internal precision of the system itself and the precision with which those inputs and outputs can be measured.

At the same time, precision is dictated by machining tolerances for the instruments in the calculation chain, as well as any mechanical forces in play at the time. Even temperature changes alter part dimensions, which introduces error. And then there's the accumulation of error across a deep enough mechanical "pipeline".

What really gets me is the tradeoff between analog and digital computers. Digital systems don't have precision errors from misshaped parts, but they trade them for quantization (digitization) errors instead.
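
To put a number on the quantization side of that tradeoff, here's a small sketch (my own toy example, not anything from above): rounding a smooth signal onto an n-bit grid puts a hard floor of roughly half a step on the error, no matter how well-made the rest of the system is.

    import math

    def quantize(x, bits, full_scale=1.0):
        # Round x to the nearest level of a signed n-bit grid spanning +/- full_scale.
        step = full_scale / 2 ** (bits - 1)
        return round(x / step) * step

    for bits in (4, 8, 12, 16):
        worst = max(
            abs(s - quantize(s, bits))
            for s in (math.sin(2 * math.pi * k / 1000) for k in range(1000))
        )
        print(f"{bits:2d} bits: worst-case quantization error ~ {worst:.6f}")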


Backlash means there is delay in mechanical analog computers that is generally worse than in digital ones. However, the calculations you perform on analog computers are generally much simpler, so you don't notice the lag.

edit: I should point out that some calculations are trivial on analog computers and difficult on digital ones, so analog may have less lag for those specific calculations. In general, though, it is safe to say digital is faster overall, even if in the real world you will find many examples where analog is faster.
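
To make the backlash concrete: when the input shaft reverses, the output doesn't move at all until the slack in the gears is taken up, so the output lags the input by up to the width of the dead band. A toy model (dead-band width chosen arbitrarily):

    def backlash(positions, width=0.1):
        # Toy model of gear backlash: the output shaft only moves once the input
        # has crossed the slack (dead band); inside it the output holds still.
        out, outputs = 0.0, []
        for x in positions:
            if x > out + width / 2:       # pushing forward, slack taken up
                out = x - width / 2
            elif x < out - width / 2:     # pushing backward, slack taken up
                out = x + width / 2
            outputs.append(round(out, 3))
        return outputs

    # Input shaft sweeps forward then reverses; note the flat spot right after the reversal.
    print(backlash([0.0, 0.2, 0.4, 0.6, 0.55, 0.5, 0.4, 0.2, 0.0]))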


The neat part is that it's almost instantaneous, but not quite.

How electricity flows through analog circuits and finds its path is another fascinating subject. Seeing how it actually works "in action" offers a glimpse of possible ultra-fast computing paradigms: not just computing with analog circuits, but also structuring computation like circuits.

https://www.youtube.com/watch?v=2AXv49dDQJw


Not instantaneous on any scale a digital system would regard as important. You can't turn a shaft very far in a nanosecond, and you are in general limited by the inertia of the mechanical system, as well as factors such as bearing heating, lubricant viscosity, the maximum force that can be applied through any particular component, and so on.



