Actionable insights: sport analytics

 

Introduction

A post by Mary Hamilton (2017) about her time at The Guardian has sent me off thinking about actionable insights in sport analytics.

In her article, Mary shares thirteen lessons from her time as executive editor for audience at The Guardian. Three of the thirteen had a particular resonance with me.

Insight 1 is ‘Data isn’t magic, it’s what you do with it that counts’. She notes “We make better decisions when we’re better informed, and all data is is information”. She adds that the aim in developing an in-house data resource, Ophan, was explicit: “It’s not just about putting numbers into the hands of editorial people — it’s explicitly about getting them to change the way they make decisions, and to make them better”.

Insight 11 is ‘Radical transparency helps people work with complexity’. Mary observes “In a fast-moving environment where everything is constantly changing … you have no way of knowing what someone else might need to know in order to do their job well. The only way to deal with this is to be a conduit for information, and not bottle anything up or hide it unless it’s genuinely confidential”.

Insight 13 is ‘What you say matters far less than what you do’. Mary’s take is “This should be obvious, but it probably isn’t. It doesn’t matter what you say you want, it’s what you do to make it happen that makes a difference in the world”.

Action

Mary’s thirteenth insight underscores the importance of action. In another context, Adam Cooper (2012) proposes:

Analytics is the process of developing actionable insights through problem definition and the application of statistical models and analysis against existing and/or simulated future data.

 

He adds “‘actionable’ indicates that analytics is concerned with the potential for practical action rather than either theoretical description or mere reporting”.

Some of the key contributors to the sports analytics literature have focused on action.

In 2011, Ben Alamar and Vijay Mehrotra defined sport analytics as:

the management of structured historical data, the application of predictive analytic models that utilize that data, and the use of information systems to inform decision makers and enable them to help their organizations in gaining a competitive advantage on the field of play.

Three years later, Chris Anderson defined sports analytics as:

The discovery, communication, and implementation of actionable insights derived from structured information in order to improve the quality of decisions and performance in an organization.

In 2016, Bill Gerrard observed “Sports analytics is all about using data analysis to provide actionable insight for coaches and sporting directors”. He added “Analytics is analysis for purpose. It’s a servant function, there to help managers to make better informed decisions”. In his conceptualisation, “Analytics is decision-driven, domain-specific data analysis”.

Meta-Issues

In An Essay Concerning Human Understanding, John Locke asserts:

it is ambition enough to be employed as an under-labourer in clearing the ground a little, and removing some of the rubbish that lies in the way to knowledge

Bill’s comment about the servant function of analytics took me back to John Locke and under-labouring. I thought that any sport analyst would be keen to be such a labourer and contribute to “clearing the ground a little”.

One aspect of an analyst’s role is, I think, to reflect on the role granularity plays in the actionable insights we share. Another is to consider our creation of actionable insights from a user’s perspective (Kunal Jain, 2015).

But as Alexander Franks and his colleagues (2016) point out, there are some important meta-issues at play here too. They consider “the metrics that provide the most unique, reliable, and useful information for decision-makers”. They employ three criteria to evaluate sport metrics:

  • Does the metric measure the same thing over time? (Stability)
  • Does the metric differentiate between players? (Discrimination)
  • Does the metric provide new information? (Independence)

Alexander and his colleagues note:

In general, any individual making a management, coaching, or gambling decision has potentially dozens of metrics at his/her disposal, but finding the right metrics to support a given decision can be daunting. We seek to ameliorate this problem by proposing a set of “meta-metrics” that describe which metrics provide the most unique and reliable information for decision-makers.

They add:

The core idea of our work is that quantifying sources of variability—and how these sources are related across metrics, players, and time—is essential for understanding how sports metrics can be used.

They conclude their discussion of meta-metrics by proposing a fourth meta-metric: relevance.

Relevance could simply be a qualitative description of the metric’s meaning or it could be a quantitative summary of the causal or predictive relationship between the metric and an outcome of interest…
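
To make these meta-metrics concrete, here is a minimal Python sketch of how the three criteria might be computed for simulated player data. It is an illustration only: Franks and his colleagues develop model-based variance decompositions, whereas this sketch uses simple correlations and variance ratios, and the players, seasons and metric names are all invented.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy data: 50 players observed over 4 seasons on 3 metrics.
# Each metric mixes a stable player "skill" with season-to-season noise.
n_players, n_seasons = 50, 4
skill = rng.normal(0, 1, n_players)

def simulate_metric(signal_weight, noise_sd):
    # One value per player per season.
    return (signal_weight * skill[:, None]
            + rng.normal(0, noise_sd, (n_players, n_seasons)))

metrics = {
    "metric_A": simulate_metric(1.0, 0.3),  # mostly signal
    "metric_B": simulate_metric(1.0, 1.5),  # the same signal, but noisier
    "metric_C": simulate_metric(0.2, 1.0),  # weak signal, mostly noise
}

def stability(m):
    # Does the metric measure the same thing over time?
    # Mean correlation between consecutive seasons.
    return np.mean([np.corrcoef(m[:, t], m[:, t + 1])[0, 1]
                    for t in range(n_seasons - 1)])

def discrimination(m):
    # Does the metric differentiate between players?
    # Share of total variance that lies between players.
    return np.var(m.mean(axis=1)) / np.var(m)

def independence(name):
    # Does the metric provide new information?
    # One minus the largest squared correlation with any other metric.
    own = metrics[name].mean(axis=1)
    r2 = max(np.corrcoef(own, other.mean(axis=1))[0, 1] ** 2
             for other_name, other in metrics.items() if other_name != name)
    return 1 - r2

for name, m in metrics.items():
    print(f"{name}: stability {stability(m):.2f}, "
          f"discrimination {discrimination(m):.2f}, "
          f"independence {independence(name):.2f}")
```

Even in this toy setting the pattern is visible: a noisier version of a metric scores lower on stability, and a metric that tracks the same underlying signal as another scores lower on independence.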

In Practice

Earlier this week, the Crusaders rugby union team, from New Zealand, advertised for a sport scientist. The job description provides a fascinating empirical focus for the discussion in this blog post.

The position description has these elements:

  • Reporting to the Crusaders’ Head Strength and Conditioning Coach, you will be responsible for overseeing and coordinating all aspects of the Crusaders’ performance monitoring systems including further enhancement of data collection, processing and reporting methods.
  • You will also be responsible for collating and reporting on all performance monitoring data to ensure optimal player loading for conditioning and recovery.
  • To be successful in this role, you will need to be appropriately qualified by training and/or experience, including a proven ability in research, data analysis and reporting including an outstanding level of understanding of performance monitoring and analysis tools ideally in a rugby environment.
  • You will also need to demonstrate extensive experience in the use of GPS Technology both hardware and software…
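
The advertisement’s emphasis on performance monitoring and optimal player loading suggests the kind of analysis the post holder might routinise. As a hedged illustration, not taken from the advertisement itself, here is a minimal Python sketch of one common sports-science heuristic, the acute:chronic workload ratio, applied to hypothetical GPS-derived daily loads. The threshold in the sketch is indicative only; values of this kind are debated in the literature.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical daily GPS-derived loads for one player
# (e.g. total distance covered, in metres) over the last 28 days.
daily_load = rng.normal(5500, 900, 28).clip(min=0)

def acute_chronic_ratio(loads, acute_days=7, chronic_days=28):
    # Mean load over the most recent 7 days relative to
    # the mean load over the most recent 28 days.
    acute = loads[-acute_days:].mean()
    chronic = loads[-chronic_days:].mean()
    return acute / chronic

ratio = acute_chronic_ratio(daily_load)
print(f"Acute:chronic workload ratio: {ratio:.2f}")

# An indicative flag only: thresholds vary by sport and are contested.
if ratio > 1.3:
    print("Recent load is rising quickly relative to the player's baseline.")
```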

This kind of role is becoming increasingly common. It will be good to learn how the post holder adapts to it and provides relevant, domain-specific, actionable insights whilst being mindful that, alongside structured data, there are growing opportunities for the analysis of unstructured data.

Photo Credits

Tree (Keith Lyons, CC BY 4.0)

Crusaders v Cheetahs (Geof Wilson, CC BY-NC-ND 2.0)

What is it we do in Performance Analysis?

One of Jacquie Tran’s delightful sketchnotes appeared in my Twitter feed a couple of days ago …

It coincided with a message I received from Jamie Coles and the subsequent guest posts that appeared on Clyde Street today.

Doug’s definition of performance analysis includes ‘insight’, ‘information’ and ‘decisions’. Jacquie’s note of his definition sent me off thinking about some other words too … ‘augmentation’, ‘support’ and ‘actionable’.

In my thinking, I returned to two seminal papers from the same year, 1991, that helped me reflect on what the craft of performance analysis might involve at the time I was establishing the Centre for Notational Analysis in Cardiff:

Ian Franks and Gary Miller, Training coaches to observe and remember. Their abstract:

This study tested a video training method that was intended to improve the observational skills of soccer coaches. Three groups of soccer coaches were tested prior to and following a training period. The experimental group was exposed to a video training programme that was designed to highlight certain key elements of soccer team performance. Although both control groups were exposed to the same video excerpts as the experimental group, they were given different orienting activities. The subjects in control group 2 were asked to discuss these excerpts with a colleague and then write a report on what they had seen, while control group 1 members repeated prior test conditions that required them to remember certain events that preceded the scoring of goals. The results indicate that, although all coaches were incapable of remembering more than 40% of pertinent information, the subjects in the experimental group improved their ability to recall all events that surrounded the ‘taking of shots’.

Richard Schmidt’s Frequent augmented feedback can degrade learning: Evidence and interpretations. His abstract includes these observations:

Several lines of evidence from various research paradigms show that, as compared to feedback provided frequently (after every trial), less frequent feedback provides benefits in learning as measured on tests of long-term retention. … several interpretations are provided in terms of the underlying processes that are degraded by frequent feedback.

I do think both are very important primary sources for performance analysts. They form part of the epistemological foundations that informed Doug’s presentation.

His definition also includes ‘effective’ and ‘efficient’ dimensions. Both emphasise for me the social skills of the performance analyst in harmony with the everyday coaching environment and the rhythms of a season.

Jacquie’s sketchnote raised again for me the inevitable merging of performance analysis and analytics. I revisited Chris Anderson’s (2014) definition of sports analytics as:

The discovery, communication, and implementation of actionable insights derived from structured information in order to improve the quality of decisions and performance in an organization.

And Bill Gerrard’s (2016) proposal for “a narrow definition of sports analytics” as the analysis of tactical data to support tactics-related sporting decisions. He suggests “this narrow definition captures the uniqueness and the innovatory nature of sports analytics as the analysis of tactical performance data.”

I am immensely grateful to Jacquie for this prompt. I was not able to attend the event at which Doug and others presented, and I found her visualisation of the day very welcome.

 

Insights for #UCSIA16 from Bill


I use Scoop.it! as a way to aggregate news each day about activities that interest me.

I curate some of the links with the #UCSIA16 tag.

This is one I posted today:

Bill shares eight lessons on “how to use data analytics effectively to improve performance”.

  1. Analytics must always be decision-driven not data-led or technique-led.
  2. Analytics can only be effective in organisations with an evidence-based culture.
  3. Analytics should result in data reduction rather than adding to data overload.
  4. Data analysis is a signal-extraction process.
  5. The most important data are the expert data created within an organisation.
  6. Analytics is not all about big data.
  7. Analytics is mostly exploratory and explanatory, seldom predictive.
  8. Effective analysts are humble servants who respect the experience and expertise of the end-users.
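
Lessons 3 and 4 lend themselves to a small worked example. The sketch below is my illustration rather than Bill’s: a rolling mean is about the simplest signal-extraction device there is, and it also achieves data reduction, turning thirty noisy match-by-match observations into a short smoothed trend.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical match-by-match performance indicator for one team:
# a slow underlying improvement buried in match-to-match noise.
n_matches = 30
trend = np.linspace(-0.5, 0.8, n_matches)          # the signal
observed = trend + rng.normal(0, 0.7, n_matches)   # what the data show

def rolling_mean(x, window=5):
    # Average each block of `window` consecutive matches:
    # 30 noisy numbers become 26 smoothed ones.
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="valid")

smoothed = rolling_mean(observed)
print("First and last smoothed values:",
      round(smoothed[0], 2), round(smoothed[-1], 2))
```

A coach reading the smoothed series sees a trend rather than thirty numbers, which is the point of lesson 3.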

I am delighted Bill has shared these insights. I like the humility dimension of number 8 on this list. His list provides an excellent guide for students following the #UCSIA16 WikiEducator course.

They are particularly helpful in the capstone topic for the course.

Photo Credit

Des Frawley, Athletics Carnival, Brisbane, 1952 (State Library of Queensland, no known copyright restrictions)