The physical testing is interesting and all, but in-game application is more important.
A guy can be 0.5s quicker, but if another player reads where to run 0.5s faster, does it really matter?
A player can be agile but lack spatial awareness.
A player can jump 10cm higher, but if they can't time their jump to mark at its peak, does it really matter?
The cardio testing is a little more interesting, but it's easy to manage and pace your output on those tests. Some great runners struggle to pace their output and/or adrenaline in-game and get tired quicker.
As we discussed probably a few dozen pages ago, I call it "the Colby McKercher theory".
It needs a balance of opinion. The testing data is really important as the main basis, but you then need to verify its readings with the eye imo. Some players might simply never be in a position in a scouted game to actually show their pace, their vertical or their agility.
Or the nature of one game is so contested vs an open one that you aren't able to determine players' endurance capabilities from one game to the next.
I put most weight in the testing data, then try to confirm its validity, especially around speed, agility and vertical. Endurance is fine imo, although intensity and determination come into it in game situations vs on a running track (e.g. George Wardlaw, who would will himself in transition in games vs on the Yo-Yo test).
In McKercher's case the data failed the eye-test validation. It's probably the biggest discrepancy I can remember.
Phillips may be an example where the data has failed the eye test at AFL level, but in the opposite direction. He has very quick, bordering on rapid, 20m sprint times and above-average endurance. The glandular fever may have killed his endurance, but I've only seen flashes of that speed compared to how he used it at times at u18 level.
It may have also been a dodgy testing year, which can happen. The testing equipment is brought in, set up and calibrated on site; it's not exactly done in a controlled AIS lab.