ORIGINAL: rybohunter
A while back I did some similar calculations. I arbitrarily chose the horizontal deceleration of the arrow to be 0.25 ft/s per yard travelled, and I carried my calcs out to 50 yds, at which distance an arrow starting at 300 fps takes .51 sec to arrive.
The speed of sound is 1116 ft/s, so sound covers 50 yds in .134 sec. Difference in time: .376 sec.
At 30 yds an arrow takes .294 sec & sound takes .08 sec. Difference of .214 sec.
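Those numbers are easy to check with a quick script. This is just a sketch using the arbitrary 0.25 ft/s-per-yard deceleration from the post, stepped yard by yard; the function names are mine, not anything official:

```python
SPEED_OF_SOUND = 1116.0       # ft/s
INITIAL_ARROW_SPEED = 300.0   # ft/s
DECEL_PER_YARD = 0.25         # ft/s lost per yard travelled (arbitrary, per the post)

def arrow_time(yards):
    """Step yard by yard: each yard is 3 ft, and speed drops 0.25 ft/s per yard."""
    t, v = 0.0, INITIAL_ARROW_SPEED
    for _ in range(int(yards)):
        t += 3.0 / v
        v -= DECEL_PER_YARD
    return t

def sound_time(yards):
    """Sound travels at a constant 1116 ft/s."""
    return yards * 3.0 / SPEED_OF_SOUND

print(round(arrow_time(50), 2), round(sound_time(50), 3))   # ~0.51 sec vs ~0.134 sec
print(round(arrow_time(30), 2), round(sound_time(30), 3))   # ~0.30 sec vs ~0.081 sec
```

The gap between the sound arriving and the arrow arriving is larger at 50 yds than at 30 yds, which is the point of the comparison.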
So a deer does not have less time to react the farther away it is; if anything, the gap between hearing the shot and the arrow arriving grows with distance (.214 sec at 30 yds vs .376 sec at 50 yds).
My bad, using your arbitrary deceleration ratio, there is a 5/100ths difference using the initial speeds I used. I concede that the arrow will slow down, but even at that distance the difference is very minuscule. Now take into account the sound level difference. Something that is 50 decibels at the source is not going to be as audible at 50 yards. Is the softer sound going to startle an animal as much as the louder one?
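For what it's worth, in an idealized free field, sound level falls off roughly 6 dB for every doubling of distance (the inverse-square law). A rough sketch, assuming the 50 dB figure is measured at a reference distance of 1 yard (that reference distance is my assumption, not stated in the post), and ignoring ground effects and air absorption:

```python
import math

def spl_at_distance(spl_ref, d_ref, d):
    """Free-field inverse-square falloff: level drops 20*log10(d/d_ref) dB.

    spl_ref: sound level (dB) measured at reference distance d_ref.
    d_ref, d: distances in the same units (yards here).
    Assumes ideal spherical spreading; real terrain attenuates further.
    """
    return spl_ref - 20.0 * math.log10(d / d_ref)

# 50 dB at 1 yd would be down to roughly 16 dB at 50 yds under these assumptions.
print(round(spl_at_distance(50.0, 1.0, 50.0), 1))
```

So whatever the deer hears at 50 yards is a much fainter version of the shot than what it would hear at 30, which supports the question about whether it startles as hard.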