Assuming perfect visibility, you can see a very flat beach from about 230 miles away from an airplane flying at an altitude of 6.8 miles.
You can see a mountain like Everest from even farther away.
You can work out the beach distance using the Pythagorean theorem. The hypotenuse is the radius of the Earth plus the height of the plane, the first leg is the radius of the Earth, and the second leg is the viewing distance, which is what you need to solve for.
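As a minimal sketch of that triangle, assuming the rounded 3,900-mile radius used below (the function name is just for illustration):

```python
import math

R = 3900.0  # Earth's radius in miles (rounded, as used in this post)

def horizon_distance(height):
    """Tangent line-of-sight distance (miles) to the horizon
    from `height` miles above a smooth, sea-level sphere."""
    return math.sqrt((R + height) ** 2 - R ** 2)

print(horizon_distance(6.8))  # ~230.4 miles from a plane 6.8 miles up
```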
To figure out how far from the plane you can see a mountaintop that's 5 miles up, you can't just lump the two heights into one triangle. The sight line from the plane to the peak grazes the horizon somewhere between them, so you solve the same equation twice, once for each height, and add the two tangent distances together.
Leg 1 in both triangles: R, or 3,900 miles. Leg 2: L (the variable we need to solve for, the tangent distance to the horizon on a smooth, globe-shaped Earth).

For the plane 6.8 miles up, the hypotenuse is R + 6.8 = 3,906.8 miles:
3,906.8 = √(L₁² + 3,900²)
Squaring both sides gives us... 15,263,086.24 = L₁² + 15,210,000
Subtracting 15,210,000 from both sides gives us... L₁² = 53,086.24
Taking the square root gives us... L₁ ≈ 230.4 miles

For the mountaintop 5 miles up, the hypotenuse is R + 5 = 3,905 miles:
3,905 = √(L₂² + 3,900²)
Squaring both sides gives us... 15,249,025 = L₂² + 15,210,000
Subtracting 15,210,000 from both sides gives us... L₂² = 39,025
Taking the square root gives us... L₂ ≈ 197.5 miles

Adding the two tangent distances: 230.4 + 197.5 ≈ 427.9 miles.
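The same two-tangent sum as a self-contained Python sketch, under the same smooth-sphere assumption:

```python
import math

R = 3900.0  # Earth's radius in miles (rounded)

def horizon_distance(height):
    """Tangent distance (miles) to the horizon from `height` miles up."""
    return math.sqrt((R + height) ** 2 - R ** 2)

plane = horizon_distance(6.8)  # ~230.4 miles
peak = horizon_distance(5.0)   # ~197.5 miles
print(plane + peak)            # ~427.9 miles, plane to mountaintop
```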
So you can see the top of a 5-mile-high mountain like Everest from roughly 430 miles away from a plane flying at 6.8 miles up, whereas you can only see the beach out to about 230 miles.