The 0.11 Hz change in pitch is WAY off the mark. That figure was calculated by looking at the angular difference at its extremes, which are many minutes apart, and then working out what that angular change means in terms of pitch. But say the recording is of a 1,000 Hz signal: at any point along the record it is playing a 1,000 Hz signal, which is what you would hear no matter where on the record you are playing. To the extent that the very tiny forward or backward movement of the stylus, as the tonearm swings along its arced path, changes pitch, the effect is extraordinarily small, and the amount of movement depends on the time frame one uses to measure the change. If one measures, say, a two-second interval, there will be an extremely small change in position relative to the starting position, which, I suppose, could represent a theoretical pitch change; a one-second interval would then show about half as much change, and a half-second interval half again (kind of a Zeno's paradox). The instantaneous pitch (if there can ever be such a thing) would represent a point with no change at all.
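Just to put some rough numbers on how small that movement is over a short listening window, here is a quick back-of-the-envelope sketch. The figures in it (33⅓ RPM, a roughly 20-minute side, modulated grooves from about 146 mm down to 60 mm radius, a 100 mm reference radius) are my own assumed ballpark values, not anything taken from the original calculation:

```python
# Back-of-the-envelope sketch with assumed, typical LP numbers:
# 33-1/3 RPM, ~20-minute side, grooves from ~146 mm down to ~60 mm radius.
# Question: over a short window, how far does the stylus actually creep
# across the record compared with how much groove passes under it?
import math

RPM = 33.333
SIDE_SECONDS = 20 * 60      # assumed ~20-minute side
R_OUTER_MM = 146.0          # assumed outermost modulated groove radius
R_INNER_MM = 60.0           # assumed innermost modulated groove radius
MID_RADIUS_MM = 100.0       # a representative mid-side radius

# Average rate at which the stylus spirals inward (mm per second).
radial_rate = (R_OUTER_MM - R_INNER_MM) / SIDE_SECONDS

# Linear groove speed passing under the stylus at the mid radius (mm per second).
groove_speed = 2 * math.pi * MID_RADIUS_MM * (RPM / 60)

for window in (2.0, 1.0, 0.5):                # measurement windows in seconds
    creep = radial_rate * window              # stylus drift across the disc
    groove_played = groove_speed * window     # groove length played in that window
    print(f"{window:>4} s window: stylus creeps ~{creep * 1000:.0f} micrometres "
          f"while ~{groove_played:.0f} mm of groove passes under it")
```

With those assumed numbers, the stylus creeps something like a tenth of a millimetre in two seconds while roughly 700 mm of groove passes under it, and halving the window halves the creep, which is the Zeno-like point above.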
The fallacy of comparing the two extreme points on the record and treating the difference as a change in pitch is somewhat like the following analogy: suppose I have a fifteen-foot-long car. If I move it ten feet forward, what I have after the move is a fifteen-foot car whose location is ten feet from where it originally was. It is not a stretched-out twenty-five-foot car covering the interval of its movement (again, a problem Zeno grappled with).