This is a guest post on Clyde Street written by Darrell Cobner.
I have tremendous admiration for Darrell, his vision and his practice. I think he exemplifies the digital scholarship of 21st century sport science. I benefit enormously from his commitment to connectivism.
I was thinking of posting a foreword and afterword to his post but I believe the issues he raises are so important that I need to write a separate response.
I do wish to affirm that disciplinary gaze in performance analysis is addressing fundamental issues in the social construction of pedagogy and is doing so in spaces in addition to peer reviewed journals. I am grateful to Darrell for raising these issues here.
He and I welcome your comments on this post. The choice of picture from the grounds of the Shanghai University of Sport is mine and shared with a CC BY 3.0 license.
Sharing the Practice of Performance Analysis
We are undertaking some of our biggest Performance Analysis (PA) challenges to date at Cardiff Metropolitan University. These include establishing a PA hub to service multiple teams and developing work-based learning curricula that encourage open discussion and reflection.
We are fortunate enough to be in an environment where, as a teaching team, we can ask questions of a diverse range of people: Level 4-7 students, coaches, athletes and external stakeholders. We are hopeful that our approach assists a disciplinary gaze (Carling, Wright, Nelson & Bradley, 2013) into social and cultural influences that impact PA delivery, student/analyst/coach/athlete learning, and research in applied settings.
Last week, I was exploring current perceptions of the role of an analyst in order to deliver two one-hour learning experiences for our new first year students. This took me on a journey, leading to topical opinions, some of which challenged approaches to PA and traditional PA research, and invited me to question the best medium for disseminating information.
My attempt to create a balanced picture for the students included:
- Noting the consensus about PA reported in a recent undergraduate dissertation
- Compiling PA job descriptions accumulated over recent years
- Working through personal notes
- Reflecting on current PA practice
- Summarising two recent academic papers (Mackenzie & Cushion, 2013; Carling, Wright, Nelson & Bradley, 2013).
I have a practical, applied background in PA. I have struggled to engage with the academic elements in PA as I have found that direct applicability is not often clear. Although the exploration of the two papers was arduous at times, I was pleasantly surprised at the content, and there was a lot of thought provoking information to extract.
This prompted me to think about how we share the contents of papers in an open blogging space. The papers to which we refer are often closed behind a paywall. I am delighted that Mackenzie & Cushion (2013) is a free access paper.
A core theme that emerged from my reading was the uninspiring approaches and repetitious papers that have been published in PA. I was relieved that others shared the same viewpoint as me. But I really had to concentrate, read and re-read to get the points out (or my interpretation of them). Is that really necessary? Is this loss of flow in academic writing in fact a contributing factor for the blockage between research and applied practice?
Pulling my thoughts together for the OAPS101 small open online course was a great exercise this time last year. I revisited primary source papers from 1985 and 1987 that set an underpinning framework for PA in the coaching process. This encouraged me to consider whether anything had changed in PA and how PA could be progressed.
I can relate to many of the themes that emerged from my reading of Mackenzie & Cushion (2013) and Carling, Wright, Nelson & Bradley (2013):
- Has PA become too big?
- Does it need to be broken down further into specialisms?
- Are we all performance analysts?
- Whose role is it to collect and analyse GPS data?
- Is too much expected of one person, when they are sometimes underpaid, sometimes unpaid?
I wondered if these thoughts might be explored in other forums. Keith Lyons, amongst others, has chosen this open route. I have the utmost respect for the peer review process but am profoundly committed to open sharing and engagement. Jessie Daniels has explored how digital media are transforming our practice. I like the fluidity of scholarship she presents:
My experience with the germ of an idea shared as a Tweet at an academic conference that became a blog post, then a series of blog posts, and (eventually) a peer-reviewed article is just one example of the changing nature of scholarship. From where I sit, being a scholar now involves creating knowledge in ways that are more open, more fluid, and more easily read by wider audiences.
I learn a great deal from a daily stream of blogs and feeds; from people who are sharing their thoughts without the motivation of titles or monetary gain (and sometimes from aspiring performance analysts looking for a role).
Should conversations about academic papers migrate to platforms, including blogs, that are more conducive to conviviality and might facilitate or accelerate change? Would this conversational approach enable the immediate flow of ideas between research and practice? It would enable more people to contribute to discussion and to work in applied contexts.
I am attracted to the openness of blogging, the immediacy of conversation it allows, the ability to question thoughts or ask for clarification. Conversation and collaboration are very important for me. I have preferred to engage with my community of practice in this way rather than through formal peer review.
I wonder if open sharing makes it more likely that practitioners will transform their practice. I do understand the challenges facing open access sharing and publication. My experience has been that most practitioners do not unthinkingly adopt new techniques for observing, recording and analysing performance. However, my experience also tells me that practitioners are not turning to academic journal papers as a first port of call.
At Cardiff Met we are trying to ensure that our PA students engage with the spectrum of agile scholarship outlined by Jessie Daniels.
The process of preparing the two one-hour learning experiences for our new first year students led me to ponder these questions:
1. We are encouraged to customise PA to the individual environment, but conversely, people are looking to compare global data using the same operational definitions and approaches. Do we need authenticity or standardisation/stability to explore the potential of PA?
2. If there is to be a thrust towards centralised terminology/operational definitions, is the written word the best way of communicating them? The word limitation of journal articles was cited as a hindrance for allowing comparisons. Surely we are in a different era now? Within a discipline which utilises powerful visual media to deliver/share information, should we be progressing towards pictures/videos as demonstrations of operational definitions rather than written attempts?
3. The best assessors of reliability are the consumers – the players/coaches who actually use the available resources to review performance. Are they confident enough to rely on the information and make decisions based on the accuracy of the data collected by the service providers?
4. Being able to produce reams of data, extended reports and pretty visuals is no longer the challenge. Is simplicity overlooked?
5. Big data – do the needles exist?
6. Is the chase of technology/features impacting on the PA process?
7. Many performance analysts have learned their craft in applied contexts, often alongside learning how to manage and train people. How do we support the development of social skills?
8. What is to count as research? Do we need other metrics to extend the concept of impact?
9. Should the teaching be research-informed or practice-led, particularly if the research elements are not being adopted in everyday practice?
Mackenzie, R. and Cushion, C. (2013). Performance analysis in football: A critical review and implications for future research. Journal of Sports Sciences. 31 (6), 639-676.
Carling, C., Wright, C., Nelson, L.J. and Bradley, P.S. (2013). Comment on ‘Performance analysis in football: A critical review and implications for future research’. Journal of Sports Sciences. DOI:10.1080/02640414.2013.807352.
Great article Darrell – some really good points in this.
In relation to the ‘peer review’ debate and the disconnect between applied and academic performance analysis, this is a big issue for PA as an industry. I have no doubt that there are very few coaches who go to academic journals as a first reference and there is a need to find a better way of disseminating the information.
Part of the problem is the structure of a journal article – to anybody not used to reading them, they can be very off-putting. Coaches don't have the inclination or time to invest in this. Unfortunately, there is also pressure on researchers to publish in this format, as there is strong resistance from certain groups to accepting any piece of research that has not gone through the 'peer review' process. I believe (as you have addressed above) that the ultimate peer review is that of coaches and industry practitioners.
There are no easy answers to some of the problems you raised, but it is well worth having the debate.
I agree, Rob.
I think much of the debate for me is about how we share stories about performance.
Enjoyed the piece, Darrell. Is my writing style that bad? 🙂
Keep up the good work Keith!
I have posted some content on Medium to explore the feasibility of that platform for creating the kind of conversation and interaction discussed in this blog.