I don’t know; as the Three Stooges put it, “I’m trying to think but nothing is happening”.
I only looked at a few comments, but they seem to agree. There is this one:
There are plenty of people who understand “hard real time”. Once upon a time I did. In hard real time there’s no debug code, which makes it exciting. It is indeed hard work.
The biggest problem until well into the 1980s was in fact the analog/digital interface. The real world is analog, for the most part, at the scales we are considering.
To take the output from, say, a rate gyro, you need to digitise it. An interview question of the day might be “OK, you have a 12 bit A/D with an acquisition time of 15 microseconds. What is the highest frequency you can digitise?”
The answer is nonintuitive for most people. Assuming you are doing 11 bits plus sign, a complete cycle at full resolution involves 8192 one-bit steps. Under ideal conditions using DMA to get the data out, you will be able to digitise about 6Hz.
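(As a rough sanity check of that figure, here is a quick back-of-the-envelope calculation of my own, not part of the quoted comment. It assumes a 12-bit converter, 15 microsecond conversions, and a full-scale sine input.)

```python
import math

t_conv = 15e-6               # conversion/acquisition time per sample, seconds
codes = 2 ** 12              # 4096 codes for 11 bits plus sign
steps_per_cycle = 2 * codes  # full range traversed up and then back down: 8192 one-bit steps

# Average-rate estimate: one conversion for every one-bit step over a whole cycle.
f_avg = 1 / (steps_per_cycle * t_conv)                # ~8.1 Hz

# Worst-case estimate: the sine slews fastest at the zero crossing, where the
# code must not move by more than 1 LSB during a single conversion.
amplitude_lsb = codes / 2                             # 2048 LSB peak amplitude
f_worst = 1 / (2 * math.pi * amplitude_lsb * t_conv)  # ~5.2 Hz

print(f"average-rate limit: {f_avg:.1f} Hz")
print(f"worst-case (slew) limit: {f_worst:.1f} Hz")
```

The average-rate estimate comes out near 8Hz and the worst-case slew-rate limit near 5Hz, so the quoted “about 6Hz” is the right ballpark: single-digit hertz either way.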
Now compare this with the acquisition of a fast op amp. The OP-37s we used to use back in the day might be running off +/-15V and the maximum excursion might be +/-10V.
The practical input offset error under working conditions might be +/-20 microvolts, giving a theoretical resolution equivalent to about 19 bits, so it would take a fair number of cascaded stages before the accumulated error got down to +/-11 bit equivalent. The gain/bandwidth product is 63MHz. A quick calculation shows that a single stage could handle about 15kHz. What’s more, because of the large accuracy overhead the signal accuracy doesn’t degrade the way the digital one does: the noise over a kHz of bandwidth, say, is only about 100nV.
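(Again a sketch of my own rather than the commenter’s working: the 3nV/√Hz input noise density is an assumed typical figure for an op amp of this class, and the 12-bit loop-gain criterion is one plausible reading of the “quick calculation” mentioned above.)

```python
import math

v_span = 20.0            # +/-10 V excursion -> 20 V span
v_offset_band = 40e-6    # +/-20 uV input offset -> 40 uV uncertainty band
gbw = 63e6               # gain/bandwidth product, Hz (as quoted)
noise_density = 3e-9     # assumed ~3 nV/sqrt(Hz) input noise for this class of op amp

# Equivalent resolution limited by the input offset error
offset_bits = math.log2(v_span / v_offset_band)   # ~18.9 bits, i.e. "about 19"

# Bandwidth over which one unity-gain stage still has ~2**12 of loop gain in hand
f_single_stage = gbw / 2 ** 12                    # ~15.4 kHz

# Input noise integrated over a 1 kHz bandwidth
v_noise = noise_density * math.sqrt(1e3)          # ~95 nV, i.e. "only 100nV"

print(f"offset-limited resolution: {offset_bits:.1f} bits")
print(f"single-stage bandwidth:    {f_single_stage / 1e3:.1f} kHz")
print(f"noise over 1 kHz:          {v_noise * 1e9:.0f} nV")
```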
For some applications, therefore, analog circuitry could handle much faster signals more accurately than a typical 12 bit military A/D of the day.
A lot can be done with analog circuitry, such as using analog switches to reconfigure circuits on the fly. By this means, automatic stabilisation and offset drift compensation can be built in. Analog multipliers using very precisely made transistor pairs were very much a thing in that period.
Of course once you have digitised the signal it will only degrade when you do a calculation that reduces the precision of the output, whereas with analog there is a steady loss of precision the bigger the system becomes. In the longer run, therefore, as A/D and D/A conversion got better, computers sped up, FPUs got fast and cheap, and it became possible to utilise multiple CPUs with many interrupt sources in real-time systems, digital took over. But in the days of PDP-11s and the like, and with the extremely precise high-performance components made by companies like Analog Devices, the tradeoffs often favoured analog.
I personally worked with, and helped tweak the designs of, new analog computers in the 70s. Yes, we were still designing new analog “computers” for U.S. military systems (not just for the U.S. Navy) in the early 70s. I remember the transition from mechanical to electrical analog computers. My personal experience was that both mechanical (properly maintained) and electrical (again, properly maintained) were good to about four significant figures.
Why was that good enough? Because the calculations were updated continuously. There was virtually zero latency or lag (for the systems on which I worked) between the inputs and the output. For digital computers of that era it might have taken seconds (or even tens of seconds) to come up with the same answers. When the job (like pointing a sensor at a far-off target, or aiming a rapid-fire gun) required virtually zero lag, digital computers were simply not fast enough. The potential added accuracy to 7 or 8 significant figures was not worth the tradeoff.