Hi there, I am wondering if someone could guide me through a problem.
I have the following data (derived from a horse race):
The distance is 1760 yards (1 mile) and the standard (expected) time is 96.04s, so the average speed needed to match it is 18.32 yps (1760/96.04).
The distance is broken down into two sections: s1 = 0y-1320y and s2 = 1320y-1760y, so s1 is 75% of the race distance.
We know that the optimal way of pacing the race is to run s2 at 105% of the total race speed.
Therefore, we can estimate the time and avg speed for each section to run optimally: s1 = 73.18s (18.04yps); s2 = 22.86s (19.24yps)
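In case it makes the arithmetic easier to follow, here is how those section targets fall out in a few lines of Python (just a sketch; the 105% figure is the pacing assumption stated above):

```python
# Derive the optimal section times/speeds from the standard time
# and the "run s2 at 105% of overall race speed" assumption.
DIST = 1760.0              # total distance (yards)
STD_TIME = 96.04           # standard (expected) time (seconds)
S1_DIST, S2_DIST = 1320.0, 440.0

avg_speed = DIST / STD_TIME       # ~18.32 yps overall
s2_speed = 1.05 * avg_speed       # s2 run at 105% of race speed, ~19.24 yps
s2_time = S2_DIST / s2_speed      # ~22.86 s
s1_time = STD_TIME - s2_time      # ~73.18 s (whatever time remains)
s1_speed = S1_DIST / s1_time      # ~18.04 yps
```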
The race is run and the winner's data reads as follows: t = 97.44s (+1.40) (18.06 yps); s1 = 74.60s (+1.42) (17.69 yps); s2 = 22.84s (-0.02) (19.26 yps)
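And here is the same winner-vs-target comparison worked through in Python, so the bracketed deviations above are reproducible (all figures copied straight from the data; nothing new is assumed):

```python
# Compare the winner's clock against the optimal-pacing targets.
STD_TIME, S1_OPT, S2_OPT = 96.04, 73.18, 22.86   # targets from the model above
t, s1, s2 = 97.44, 74.60, 22.84                  # winner's overall and sectional times

loss_total = t - STD_TIME    # ~ +1.40 s slower than standard overall
loss_s1 = s1 - S1_OPT        # ~ +1.42 s lost in s1
loss_s2 = s2 - S2_OPT        # ~ -0.02 s (s2 fractionally faster than target)

s1_speed = 1320 / s1         # ~17.69 yps
s2_speed = 440 / s2          # ~19.26 yps
```

So essentially all of the 1.40s deficit was incurred in s1, while s2 was run almost exactly to target.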
What I am trying to establish is a method of calculating the relationship between s1 and s2: if the speed of s1 increases, by how much does the speed of s2 decrease? With that I could adjust for 'time loss' due to inefficient pacing.
Any help incredibly gratefully received!