It is the end of the Honours year in Sport and Exercise Science at the University of Canberra.
This year I have had the opportunity to be a member of James Simpson’s supervision team.
James is a postgraduate scholar in Performance Analysis at the Australian Institute of Sport.
He has been investigating Changes in Tactical Behaviour in Elite Women’s Water Polo.
I am mindful that James will be submitting a paper for peer review and so I will not preempt any submission he makes by sharing the detail of his research here.
I do want to share an outline of his work here to add another example to the literature on the occupational culture of a performance analyst.
In October 2014, the Fédération Internationale de Natation (FINA) approved a rule change in junior and youth water polo. Overall team size was reduced from thirteen to eleven. The number of players from each team in the pool during a game was reduced from seven to six, of which one must be the goal keeper.
The rule change was used for the first time in international competition at the 2015 Under 20 World Championships.
James worked with the head coach of the Australian women’s team at the World Championships.
James’s journey to provide a performance analysis service at the Under 20 World Championships followed a classic service model approach:
- Preliminary discussions between the AIS and Swimming Australia.
- Preliminary meetings between the head coach of the junior women’s team and James, the analyst.
- Agreement about priorities for the coach.
- Agreement about real-time and lapsed-time services.
- Confirmation of digital technology resources for the project.
- Logistics for accreditation, travel, accommodation and subsistence agreed and confirmed.
- Delivery of the service at the championships.
All of which seems very reasonable. But as with all performance analysis projects, James needed to be agile in his provision of the service. Two of the obstacles to the service exemplify the tenacity required to deliver on agreements made with a head coach who would be relying on a performance analyst:
- The venue was changed at relatively short notice: a trip to Mexico became a trip to Greece.
- At the venue, despite assurances from the organisers, James was not able to film from the agreed position in the pool.
The second of these did place a great deal of stress on James as there was no official video record of the championships.
James overcame these obstacles and as a result has a total record of the championships for his archive and for subsequent analysis. He was sufficiently well organised to be able to code events in the pool in real time and then to add detail and check for accuracy with lapsed-time analysis.
Changes in Behaviour
Whilst James was exploring some statistical tests to compare behaviours under the new and old rules, my colleague Chris Barnes and I went off on a tangent to look at some machine learning possibilities for the data James had collected.
Chris used JMP to explore a decision tree approach. His leaf report has these measures (response probability and response counts) for new rules (NR) and old rules (OR):
We thought this promising.
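For readers unfamiliar with JMP’s leaf reports, the idea can be sketched in open-source tools too. The following is a minimal, hypothetical illustration of a decision tree classifying games as new-rules (NR) or old-rules (OR); the feature names and the synthetic data are mine, not James’s, and the leaf proportions printed at the end are the analogue of JMP’s response probabilities and response counts.

```python
# Hypothetical sketch of a decision-tree classification of games as
# new-rules (NR) vs old-rules (OR). All feature names and data values
# here are illustrative, NOT the actual championship data.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Illustrative per-game features (e.g. counter-attack count, exclusion count),
# 20 old-rules games and 20 new-rules games.
X_or = rng.normal(loc=[10, 6], scale=2, size=(20, 2))
X_nr = rng.normal(loc=[13, 4], scale=2, size=(20, 2))
X = np.vstack([X_or, X_nr])
y = np.array(["OR"] * 20 + ["NR"] * 20)

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# Each leaf's class proportions play the role of JMP's "response
# probabilities"; its sample count plays the role of "response counts".
leaf_ids = tree.apply(X)
for leaf in np.unique(leaf_ids):
    mask = leaf_ids == leaf
    probs = {c: round(float((y[mask] == c).mean()), 2) for c in ("NR", "OR")}
    print(f"leaf {leaf}: count={int(mask.sum())}, response probabilities={probs}")
```

The same fitted tree can then label an unseen game from its event counts alone, which is the kernel of the classification idea discussed below.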
Chris visualised these data with these clustered correlations:
The dark red diagonal line shows maximum correlation.
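The clustered-correlation idea can also be reproduced outside JMP. Here is a small, self-contained sketch, again on synthetic data with illustrative indicator counts: compute the correlation matrix across performance indicators, then reorder it by hierarchical clustering on correlation distance so that strongly correlated indicators sit in contiguous blocks, with the maximum-correlation diagonal (each indicator with itself) running through the middle.

```python
# Hypothetical sketch of a clustered correlation matrix for performance
# indicators. The data here are synthetic, NOT the actual water polo data.
import numpy as np
from scipy.cluster.hierarchy import linkage, leaves_list
from scipy.spatial.distance import squareform

rng = np.random.default_rng(1)
data = rng.normal(size=(40, 6))  # 40 games x 6 illustrative indicators
data[:, 1] += data[:, 0]         # force two indicators to correlate

corr = np.corrcoef(data, rowvar=False)

# Cluster on correlation distance (1 - r) and reorder the matrix so that
# correlated indicators are adjacent; the diagonal is then the line of
# maximum correlation (r = 1).
dist = squareform(1 - corr, checks=False)
order = leaves_list(linkage(dist, method="average"))
clustered = corr[np.ix_(order, order)]
print(np.round(clustered, 2))
```

With real data, plotting `clustered` as a heatmap gives the kind of dark-red diagonal described above.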
This started us thinking about how we might classify games by data alone … and how many additional games we might need to make our approach much more robust.
This is a work in progress. We are keen to share it here for some good reasons:
- This is not reported in James’ paper.
- It suggests a fascinating line of enquiry in an abstract approach to behavioural change in games where the rules have changed.
- It demonstrates, I think, the growing knowledge performance analysts will develop as they start to interrogate data collected with traditional performance analysis methods.
I am looking forward to the response of peer reviewers to James’ paper. There will be lots to discuss.
In the meantime, Chris and I will continue on our tangent.