Originally Posted by
Champlain Islander
An arrow will certainly drop more than 4" at 40 yds no matter how fast it goes. That is just physics. Arrows come off the string, climb to the line of sight, then fall as gravity takes over and the trajectory completes. To use a single pin successfully out to 40 yds, the zero point (where the arrow crosses the line of sight) has to be pushed out past 20 yds to closer to 30. It's similar to sighting in your ML 2" high at 50 to be zeroed at 100; if you need more range, adjust higher at 50 to push the zero out. The arrow goes up, then down. Total rise and drop will add up to much more than 4", but it is possible to have a 4" drop at 40 from the zero point, depending on the zero yardage. My own 1-pin setup is good to 33 yds. I know at 15-20 yds I will hit slightly high, but still in the vitals.
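The single-pin idea above can be sketched with a little trajectory math. This is a hypothetical illustration only, assuming a constant 300 FPS arrow with no drag and small launch angles: for a zero at distance x_z, the arrow's height relative to the line of sight works out to ½·g·x·(x_z − x)/v².

```python
# Hypothetical sketch of the single-pin zeroing described above.
# Assumes a constant arrow speed (no drag) and small launch angles.
G = 32.174  # gravitational acceleration, ft/s^2

def height_vs_sight_in(range_yd, zero_yd, speed_fps=300.0):
    """Arrow height above (+) or below (-) the line of sight, in inches,
    for a bow zeroed at zero_yd with a constant arrow speed."""
    x, xz = range_yd * 3.0, zero_yd * 3.0          # yards -> feet
    return 12.0 * 0.5 * G * x * (xz - x) / speed_fps**2

# With a 30 yd zero at 300 fps: a few inches high at 15-20 yds,
# dead-on at 30, and several inches low by 40.
for r in (15, 20, 30, 40):
    print(r, round(height_vs_sight_in(r, zero_yd=30), 1))
```

This matches the post's experience: with the zero pushed out to around 30 yds, mid-range shots land slightly high but still in the vitals.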
CI,
Say you've got a bow that drives the arrow a bit over 300 FPS. Let's assume the arrow doesn't lose velocity very quickly, and averages 300 FPS over the 100 yard range. So: 300 foot range, 300 FPS, and the arrow's time of flight is 1 second. Drop under gravity is 1/2 x g x t squared, so at 100 yards the arrow falls about 16 feet. Since drop goes with the square of the flight time, at 40 yards (0.4 seconds) the drop is 16 X 0.16, about 2.6 feet. Bump the speed up to 400 FPS and these figures become about 9 feet at 100 yards and 1.4 feet at 40 yards.
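The arithmetic above can be checked with a few lines. A minimal sketch, assuming (as the post does) a constant arrow velocity and no drag:

```python
# Free-fall drop for an arrow at constant speed, no drag (the
# simplifying assumption made in the post above).
G = 32.174  # gravitational acceleration, ft/s^2

def arrow_drop_ft(range_yd: float, speed_fps: float) -> float:
    """Gravity drop over range_yd yards at a constant speed_fps."""
    t = (range_yd * 3.0) / speed_fps   # time of flight, seconds
    return 0.5 * G * t * t             # drop = 1/2 * g * t^2

# 300 fps: ~16 ft of drop at 100 yd, ~2.6 ft at 40 yd.
print(round(arrow_drop_ft(100, 300), 1))
print(round(arrow_drop_ft(40, 300), 1))
# 400 fps: ~9 ft at 100 yd, ~1.4 ft at 40 yd.
print(round(arrow_drop_ft(100, 400), 1))
print(round(arrow_drop_ft(40, 400), 1))
```

Either way, that's still a couple of feet of drop at 40 yards, which is the point here.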
Just can't call this "flat shooting", huh?
OldBob