When I was in the CG I took the Navy’s 5"/38 Gun “C” School at Dam Neck, Virginia. The design of the gun had to include not only a system to absorb the recoil but also a way to dissipate the energy stored by that recoil system.
Drawing of the recoil and counter-recoil systems. The arrow shows the motion of the housing in the slide during recoil.
My pop wrote the watch bill at Dam Neck shortly before he retired after 30 years. A missile guy and torpedoman, a Chief TMC. He wrinkled his eyes more than a few times over procedures back then. A good old boy who got his hands dirty. I really miss him. I can only think his opinion of this incident would not be “graceful.”
On the 4.5" guns I sailed with, the recoil was absorbed by hydraulically driving the half-ton breech block down. With the shell and cartridge on the tray, the retreating rammer tripped the breech block, which locked in place, and as soon as the circuit completed, the gun fired. A well-trained crew in a standard double turret could achieve 48 rounds per minute.
The modern 125mm gun has nobody in the turret.
When assembled precisely, analog computers can be much more accurate than digital computers on these types of questions. Because they use physical rather than digital inputs and outputs, they can represent curves and other geometric elements of calculations with an infinite level of resolution (though the precision of those calculations depends on how well their parts are machined, and on losses from friction and slippage). There are no least significant digits dropped, and answers are continuous rather than dependent on “for-next” clock-driven computing cycles.
I disagree with this. Analog computers (mechanical or electronic) are very fast in 1950s terms, limited only by how fast the parts can accurately move, plus settling times for electronic ones. But digital computers have gotten so absurdly fast that they can calculate these answers to arbitrary precision extremely rapidly; and be unaffected by wear, adjustment, lubrication, manufacturing tolerances etc.
I don’t know; as the Three Stooges put it, “I’m trying to think but nothing is happening.”
I only looked at a few comments, but they seem to agree. There is this one:
There are plenty of people who understand “hard real time”. Once upon a time I did. In hard real time there’s no debug code, which makes it exciting. It is indeed hard work.
The biggest problem up to well into the 1980s was in fact the analog/digital interface. The real world is analog, for the most part, at the scales we are considering.
To take the output from, say, a rate gyro, you need to digitise it. An interview question of the day might be “OK, you have a 12 bit A/D with an acquisition time of 15 microseconds. What is the highest frequency you can digitise?”
The answer is nonintuitive for most people. Assuming you are doing 11 bits plus sign, a complete cycle at full resolution involves 8192 one-bit steps. Under ideal conditions using DMA to get the data out, you will be able to digitise about 6Hz.
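The arithmetic behind that answer can be checked in a couple of lines (the figures are just the ones from the comment; the quoted ~6 Hz presumably folds conversion and DMA overhead in on top of the raw acquisition time):

```python
# Back-of-the-envelope check of the 12-bit A/D example above.
# Assumption: to track a full-scale signal at full resolution you need
# one sample per one-LSB step. With 11 bits plus sign the signal sweeps
# 4096 codes up and 4096 codes down each cycle, i.e. 8192 steps.
acq_time = 15e-6             # acquisition time per sample, seconds
steps_per_cycle = 2 * 2**12  # 8192 one-LSB steps per full cycle
max_freq = 1 / (steps_per_cycle * acq_time)
print(f"{max_freq:.1f} Hz")  # about 8 Hz before any conversion/DMA overhead
```

Even under these ideal assumptions you land in the single-digit hertz range, which is the nonintuitive part: a 15-microsecond converter sounds fast until you demand full 12-bit tracking.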
Now compare this with the acquisition of a fast op amp. The OP-37s we used to use back in the day might be running off +/-15V and the maximum excursion might be +/-10V.
The practical input offset error under working conditions might be +/-20 microvolts, giving a theoretical resolution equivalent to about 19 bits, so a fair number of stages might be needed to get down to +/-11 bit equivalent. The gain/bandwidth product is 63MHz. A quick calculation shows that a single stage could handle about 15kHz. What’s more, because of the high accuracy overhead, the signal accuracy doesn’t degrade like the digital one does because the noise at a kHz, say, is only 100nV.
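As a sanity check on those figures, using only the numbers quoted above (the closed-loop gain of 4096 is my assumption, chosen because it reproduces the ~15 kHz figure from a 63 MHz gain-bandwidth product):

```python
import math

# Resolution implied by a +/-20 uV offset over a +/-10 V swing.
full_scale = 20.0   # volts peak-to-peak (+/-10 V excursion)
offset = 40e-6      # volts peak-to-peak worst-case offset (+/-20 uV)
bits = math.log2(full_scale / offset)
print(f"{bits:.1f} bits")          # about 18.9, i.e. the ~19 bits quoted

# Bandwidth of one OP-37 stage at an assumed closed-loop gain of 4096.
gbw = 63e6          # gain-bandwidth product, Hz
gain = 4096
bandwidth = gbw / gain
print(f"{bandwidth/1e3:.1f} kHz")  # about 15.4 kHz
```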
For some applications, therefore, analog circuitry could handle much faster signals more accurately than a typical 12 bit military A/D of the day.
A lot can be done with analog circuitry, such as using analog switches to reconfigure circuits on the fly. By this means, automatic stabilisation and offset drift compensation can be built in. Analog multipliers using very precisely made transistor pairs were very much a thing in that period.
Of course once you have digitised the signal it will only degrade when you do a calculation that reduces the precision of the output, whereas with analog there is a steady loss of precision the bigger the system becomes. In the longer run, therefore, as A/D and D/A conversion got better, computers speeded up, FPUs got fast and cheap and it became possible to utilise multiple CPUs with many interrupt sources in real time systems, digital took over. But in the days of PDP-11s and the like, and with the extremely precise high performance components made by companies like Analog Devices, the tradeoffs often favoured analog.
I personally worked with and helped tweak the designs of new analog computers in the 70s. Yes, we were still designing new analog “computers” for U.S. military systems (not just for the U.S. Navy) in the early 70s. I remember the transition to electrical analog computers. My personal experience was that both mechanical (properly maintained) and electrical (again, properly maintained) were good to about four significant figures.
Why was that good enough? Because the calculations were updated continuously. There was virtually zero latency or lag (for the systems on which I worked) between the time of the inputs and the time of the output. For digital computers of that era it might have taken seconds (or even tens of seconds) to come up with the same answers. When implementation (like pointing a sensor at a far-off target, or pointing a rapid-fire gun) requires virtually zero lag, digital computers were simply not fast enough. The potential added accuracy of 7 or 8 significant figures was not worth the tradeoff.
My first electronic job was working for Analog Devices in 1979, doing post-burn-in board repair on a universal weighing head that they built using their 12-bit ADCs and fancy op amps. Their boast was you could take an unamplified load cell, run the cable from it through a few kilometers of electrically noisy coal mine (or whatever) and get accurate readings.
This was a small side line on their main business of making the components themselves.
As an “expert” in 5" 38 cal recoil absorption, based on approximately 7500 rounds expended in Viet Nam, the way to “dissipate” the energy was to fire broadsides, not parallel with the axis of the ship. However, many of those rounds were expended along the centerline, but even then, the energy was handily dissipated by the whole ship vibrating like a tuning fork, with damn near everything falling “off”, “over”, or “down”. Light fixtures were hanging by electrical cords, racks [bunks] hanging by chains, crockery strewn about the mess decks.
The IBM 360 would have been 1965–1978 (Wikipedia). I don’t know when digital replaced analog in fire-control computers, but the CG cutter I was on in 1975 had one. The article says they were in use in the ’80s.
My experience with them was in 1970. I had a job at a corporation in their information center while working my way through school. In retrospect it is mind-boggling. My cell phone can do way more than a 360/70, and they took up a LOT of space, about the size of a garage.
Not quite air blast injection but not too far removed from the first generation of what we now call common rail. They were Fairbanks Morse and Clevelands. I have had the pleasure of working on both since then.
I believe that fire-control system was a leftover from WW2. They were being replaced with early electronic versions, but not being a torpedo guy I never had much to do with that stuff; I was on the wardroom team working the plotting board.
My father was a senior field engineer for IBM in the 1960s. His major client was Bank of America in San Francisco, and his machine was the 360. He said that the 360 made IBM the company it became. The computer covered 1 1/2 floors of the bank building. My mom grew to dislike that machine, as Dad would return home with grease on his long-sleeve white shirts (the required uniform of IBM). Being electromechanical, that machine was also very noisy and hot. Ah, the memories of childhood. Thanks.
Oh yes. If the AC shut off, the entire IBM computer operation had to be shut down until the mechanics got the AC running again. It happened three times in one month, so they brought in an HVAC design specialist with a PhD to have a look; he was one of my professors, making some extra money.