Hello, I’m studying mid-latitude, Mercator sailings etc. I have the process down, but my answers are usually off by a couple of minutes, which is a problem. I’ve been playing around with how many decimal points I carry forward on the math but can’t find a way to get the answers dead-on. Any suggestions on how many decimal points to carry forward when? Thanks

I’m not sure if this is still true but a professor of mine used to claim that if you round to 4 decimal places every time you do an operation the answer comes out spot on.

I was told while at school that for USCG exams take everything out to four decimal points.

The difference between an angle of 45.1234° and another of 45.1235° is just 0.36 arcseconds, or roughly 11 meters of latitude on the Earth's surface, a negligible quantity for navigation.
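Working that conversion through (a quick sketch; the only constant assumed is the standard 1852 m nautical mile, which by definition equals one minute of latitude):

```python
# How big is a 0.0001 degree difference on the Earth's surface?
deg_diff = 45.1235 - 45.1234           # 0.0001 degrees
arcsec = deg_diff * 3600.0             # degrees -> arcseconds

# One minute of latitude = 1 nautical mile = 1852 m,
# so one arcsecond of latitude is 1852 / 60 meters.
meters = arcsec * (1852.0 / 60.0)

print(f"{arcsec:.2f} arcseconds, about {meters:.1f} m of latitude")
```

So the last digit of four decimal places of a degree is down at the ten-meter level, far finer than anyone navigates.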

However, calculating positions and routes is done with trigonometric functions, and near their extremes (sine around 90°, cosine around 0° or 180°) even a difference of a whole degree produces only a tiny change in the function's value.

Therefore, astronomers use every digit they have and round only the final result (e.g. 15 significant digits in Excel).

The problem exists even for mariners:

Sine(90°) = 1

Sine(89.5°) = 0.999961923…, which also rounds to 1 at 4 decimal places.

For calculations with trigonometric functions, rounding to 4 decimal places at every single step can therefore be insufficient.
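A minimal sketch of the effect, comparing a mid-latitude sailing worked at full machine precision against the same work with every intermediate value rounded to 4 decimal places. The positions and the `midlat_sailing` helper are made up for illustration; this is not any official exam method:

```python
import math

def midlat_sailing(lat1, lon1, lat2, lon2, nd=None):
    """Mid-latitude sailing: course angle (deg) and distance (nm).

    Positions in decimal degrees (negative = S / W).  If nd is given,
    every intermediate value is rounded to nd decimal places, to mimic
    working the problem by hand at a fixed precision.
    """
    r = (lambda x: round(x, nd)) if nd is not None else (lambda x: x)
    dlat = r((lat2 - lat1) * 60.0)        # difference of latitude, minutes
    dlon = r((lon2 - lon1) * 60.0)        # difference of longitude, minutes
    mid_lat = r((lat1 + lat2) / 2.0)      # mean latitude, degrees
    dep = r(dlon * r(math.cos(math.radians(mid_lat))))  # departure, minutes
    course = r(math.degrees(math.atan2(dep, dlat)))     # course angle, degrees
    dist = r(dlat / r(math.cos(math.radians(course))))  # distance, nautical miles
    return course, dist

# Hypothetical run: 40°N 70°W to 45°N 60°W (made-up positions)
full = midlat_sailing(40.0, -70.0, 45.0, -60.0)
r4 = midlat_sailing(40.0, -70.0, 45.0, -60.0, nd=4)
print("full precision:", full)
print("rounded to 4dp:", r4)
print("distance difference (nm):", abs(full[1] - r4[1]))
```

On a run like this the two answers agree to well under a mile, but the gap grows as courses approach due north/south or east/west, where a trig function is near its extreme and 4 rounded decimals stop distinguishing nearby angles.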

I don’t like it but he seemed certain that that was how the test writers did it. It gets the answers spot on.

Try it on practice tests and see how it works out, that’s all I’m suggesting.

I was told the same and it worked for every exam I took up to Master.

Thanks very much, I’ll give it a shot