Quote:
Originally Posted by LukeS
I imagine that the disparity differs so minutely for
objects 1000 m and 2000 m away that a shaky finger and a
cross-eyed measurement will not accurately bring out the real
distance/disparity, and since the curve flattens off as we
measure and collate the splitting for objects further and
further away, it will become increasingly difficult to measure
and calculate the differences.
Yep, a quick calculation (using angle rather than distance, because it's
easier) gives 0.463597 radians at 1000 m and 0.463622 radians at 2000 m.
That's a relative difference of about 0.0054 percent. The moon turns out to be
0.463647, another 0.0054 percent away, and that's no coincidence: for large d
the angle falls short of its limit by about e/d, and the step from 1/1000 to
1/2000 is exactly as big as the step from 1/2000 to zero.
(The formula for the angle I got was

theta = arctan(e/c) - arctan(e/d)

where d is the distance to the object, c is the distance to the finger
("calibration") and e is the distance between your nose and an
eye. I assumed c was 0.1 and e was 0.05, both in metres.)
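If you want to check the numbers, here's a quick Python sketch of the same calculation (the moon distance of 3.844e8 m is my addition, not one of the original figures):

import math

def finger_angle(d, c=0.1, e=0.05):
    # Angle between the sight line from one eye (offset e from the
    # nose) to the finger at distance c, and the sight line from
    # that eye to the object at distance d.
    return math.atan(e / c) - math.atan(e / d)

for d in (1000.0, 2000.0, 3.844e8):
    print(f"{d:>11.0f} m: {finger_angle(d):.7f} rad")

near, far = finger_angle(1000.0), finger_angle(2000.0)
print(f"relative difference: {100 * (far - near) / near:.4f} %")

This prints 0.4635976, 0.4636226 and 0.4636476 rad, and 0.0054 %, matching the figures above.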
Edit: Here's a graph of the angle against distance (in metres):
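(The image itself hasn't survived the copy, but a matplotlib sketch along these lines, using the same assumed c and e, would reproduce it:)

import math
import matplotlib.pyplot as plt

c, e = 0.1, 0.05
distances = [x / 10.0 for x in range(2, 10001)]  # 0.2 m to 1000 m
angles = [math.atan(e / c) - math.atan(e / d) for d in distances]

plt.plot(distances, angles)
plt.xlabel("distance to object (m)")
plt.ylabel("angle (rad)")
plt.show()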
Quote:
But in principle the method should work, shouldn't it?
Yep, it's called parallax, and we do use it to measure the distances to
nearby stellar objects (albeit with the "eyes" being the Earth at opposite
points of its orbit around the sun).
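As a rough sketch of how that scales (the Proxima Centauri parallax of about 0.768 arcseconds is my assumed example, not from the post):

import math

AU_M = 1.495978707e11   # astronomical unit in metres
LY_M = 9.4607e15        # light-year in metres

def parallax_distance(parallax_arcsec):
    # The parallax angle is half the apparent shift seen between
    # opposite points of Earth's orbit (baseline 2 AU), so the
    # distance is 1 AU over the tangent of that angle.
    p = math.radians(parallax_arcsec / 3600.0)
    return AU_M / math.tan(p)

# Proxima Centauri: parallax roughly 0.768 arcsec (assumed value)
print(f"{parallax_distance(0.768) / LY_M:.2f} light-years")  # ~4.25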