Connecting 131010


I have signed up for the Beta version of a new platform.

It aims to be “an open platform for the collaborative evaluation of knowledge. It will combine sentence-level critique with community peer-review to provide commentary, references, and insight on top of news, blogs, scientific articles, books, terms of service, ballot initiatives, legislation and regulations, software code and more”.

I note that it is founded on 12 principles. These principles resonate with CommentPress.

I am hopeful that both resources will enable a new form of inclusive disciplinary gaze, a topic that has been preoccupying me since the publication of a Clyde Street guest post by Darrell Cobner.

Both are open resources and embody the connectedness that fascinates me.

I see Curation as a fundamental contribution to this connectedness. Today brought me Sue Waters’ digital curation links, including John Pearce’s presentation Curation: The Next Big C.


After reading Annalee Newitz’s post about the Great Library of Alexandria I am wondering how we will deal with the immanence and permanence that cloud computing affords us. We should heed her conclusion:

Though we imagine that knowledge and civilizations are destroyed in one fell stroke, a rain of fire as it were, the truth is a lot more ugly and more slow. The ancient world’s greatest library didn’t die in battle — it died from thousands of little cuts, over centuries, that reduced this great institution of knowledge to a shadow of its former self.

We have so much data now that each of us will need to find a way of dealing with complexity. I was interested to learn this morning that Michelle Zhou has been using a sample of 200 Twitter messages to make an educated guess about personality traits. I will need to look carefully at topological data analysis if I am to understand this complexity/simplicity relationship.

Perhaps an interim approach might be to participate in the workshops organised and facilitated by Jane Hart and Harold Jarche. I admire their Seek, Sense, Share approach. Their “workshops” are intended to provide a semi-structured approach “to kickstart the informal, social learning that will be needed to become proficient”. The workshops are designed to give “just enough structure, without constraining personal and social learning”. Jane and Harold:

  • Curate what they think are the essential resources on a topic and also provide additional links and resources for those who are interested.
  • Encourage all discussions to be done in the private workshop group area, so that people can learn from each other.
  • Try to find ways to help each person as issues arise in the conversations.

After reading about their series of upcoming workshops and thinking about self-directed, intrinsically motivated approaches to learning, I came across Jason Farman’s Manifesto for Active Learning. I liked his sharing of fallibility and how he went about transforming his approach to blended teaching:

Working alongside these seasoned scholar-teachers, I realized that everything I had taken for granted about my own teaching wasn’t always the best approach. I very quickly realized that each one of my assumptions had to be reevaluated, beginning with the idea that I was a good teacher.

Connections and curation make this kind of professional reflection a rich experience. I am hopeful that my engagement with connected learning provides me with opportunities for the ongoing inclusion of practices as well as moments of conversive trauma.

I sense that this platform and CommentPress will make it possible to explore these experiences with a community of practice willing to share experiences and insights.

Photo Credits

Presenting Social Marginalia (Kevin Lim, CC BY-NC-SA 2.0)

Curation of Information (George Couros, CC BY-NC-SA 2.0)

Guest Post: Darrell Cobner – Sharing the Practice of Performance Analysis


Out of constraint (Keith Lyons, CC BY 3.0)

This is a guest post on Clyde Street written by Darrell Cobner.

I have tremendous admiration for Darrell, his vision and his practice. I think he exemplifies the digital scholarship of 21st century sport science. I benefit enormously from his commitment to connectivism.

I was thinking of posting a foreword and afterword to his post but I believe the issues he raises are so important that I need to write a separate response.

I do wish to affirm that disciplinary gaze in performance analysis is addressing fundamental issues in the social construction of pedagogy and is doing so in spaces in addition to peer reviewed journals. I am grateful to Darrell for raising these issues here.

He and I welcome your comments on this post. The choice of picture from the grounds of the Shanghai University of Sport is mine and shared with a CC BY 3.0 license.

Sharing the Practice of Performance Analysis

Darrell Cobner


We are undertaking some of our biggest Performance Analysis (PA) challenges to date at Cardiff Metropolitan University. These include establishing a PA hub to service multiple teams and developing work-based learning curricula that encourage open discussion and reflection.

We are fortunate enough to be in an environment where, as a teaching team, we can ask questions of a diverse range of people: Level 4-7 students, coaches, athletes and external stakeholders. We are hopeful that our approach assists a disciplinary gaze (Carling, Wright, Nelson & Bradley, 2013) into social and cultural influences that impact PA delivery, student/analyst/coach/athlete learning, and research in applied settings.

Last week, I was exploring current perceptions of the role of an analyst in order to deliver two one-hour learning experiences for our new first year students. This took me on a journey, leading to topical opinions, some of which challenged approaches to PA and traditional PA research, and invited me to question the best medium for disseminating information.

My attempt to create a balanced picture for the students included:

  • Noting the consensus about PA reported in a recent undergraduate dissertation
  • Compiling PA job descriptions accumulated over recent years
  • Working through personal notes
  • Reflecting on current PA practice
  • Summarising two recent academic papers (Mackenzie & Cushion, 2013; Carling, Wright, Nelson & Bradley, 2013).

I have a practical, applied background in PA. I have struggled to engage with the academic elements of PA because their direct applicability is not often clear. Although the exploration of the two papers was arduous at times, I was pleasantly surprised by the content, and there was a lot of thought-provoking information to extract.

This prompted me to think about how we share the contents of papers in an open blogging space. The papers to which we refer are often closed behind a paywall. I am delighted that Mackenzie & Cushion (2013) is a free access paper.

A core theme that emerged from my reading was the uninspiring approaches and repetitious papers that have been published in PA. I was relieved that others shared the same viewpoint as me. But I really had to concentrate, read and re-read to get the points out (or my interpretation of them). Is that really necessary? Is this loss of flow in academic writing in fact a contributing factor for the blockage between research and applied practice?

Pulling my thoughts together for the OAPS101 small open online course was a great exercise this time last year. I revisited primary source papers from 1985 and 1987 that set an underpinning framework for PA in the coaching process. This encouraged me to consider whether anything had changed in PA and how PA could be progressed.

I can relate to many of the themes that emerged from my reading of Mackenzie & Cushion (2013) and Carling, Wright, Nelson & Bradley (2013).

  • Has PA become too big?
  • Does it need to be broken down further into specialisms?
  • Are we all performance analysts?
  • Whose role is it to collect and analyse GPS data?
  • Is too much expected of one person, when they are sometimes underpaid, sometimes unpaid?

I wondered if these thoughts might be explored in other forums. Keith Lyons, amongst others, has chosen this open route. I have the utmost respect for the peer review process but am profoundly committed to open sharing and engagement. Jessie Daniels has explored how digital media are transforming our practice. I like the fluidity of scholarship she presents:

My experience with the germ of an idea shared as a Tweet at an academic conference that became a blog post, then a series of blog posts, and (eventually) a peer-reviewed article is just one example of the changing nature of scholarship.  From where I sit, being a scholar now involves creating knowledge in ways that are more open, more fluid, and more easily read by wider audiences. 

I learn a great deal from a daily stream of blogs and feeds; from people who are sharing their thoughts without the motivation of titles or monetary gain (and sometimes from an aspiring performance analyst looking for a role).

Should the conversations about academic papers migrate to platforms, including blogs, that are more conducive to conviviality and that might facilitate or accelerate change? Would this conversational approach enable the immediate flow of ideas between research and practice? It would enable more people to contribute to discussion and to work in applied contexts.

I am attracted to the openness of blogging, the immediacy of conversation it allows, the ability to question thoughts or ask for clarification. Conversation and collaboration are very important for me. I have preferred to engage with my community of practice in this way rather than through formal peer review.

I wonder if open sharing makes it more likely that practitioners will transform their practice. I do understand the challenges facing open access sharing and publication. My experience has been that most practitioners do not unthinkingly adopt new techniques for observing, recording and analysing performance. However, my experience also tells me that practitioners are not going to academic journal papers as a first port of call.

At Cardiff Met we are trying to ensure that our PA students engage with the spectrum of agile scholarship outlined by Jessie Daniels.

The process of preparing the two one-hour learning experiences for our new first year students led me to ponder these questions:

1. We are encouraged to customise PA to the individual environment, but conversely, people are looking to compare global data using the same operational definitions and approaches. Do we need authenticity or standardisation/stability to explore the potential of PA?

2. If there is to be a thrust towards centralised terminology/operational definitions, is the written word the best way of communicating them? The word limitation of journal articles was cited as a hindrance for allowing comparisons. Surely we are in a different era now? Within a discipline which utilises powerful visual media to deliver/share information, should we be progressing towards pictures/videos as demonstrations of operational definitions rather than written attempts?

3. The best assessors of reliability are the consumers – the players/coaches that actually use the available resources to review performance. Are they confident enough to rely on the information to make decisions based on the accuracy of the data collected by the service providers?

4. Being able to produce reams of data, extended reports and pretties is no longer the challenge. Is simplicity overlooked?

5. Big data – do the needles exist?

6. Is the chase of technology/features impacting on the PA process?

7. A lot of performance analysts have learned their craft in applied contexts. Many of them are learning this alongside learning how to manage and train people. How do we support the development of social skills?

8. What is to count as research? Do we need other metrics to extend the concept of impact?

9. Should the teaching be research-informed or practice-led? Particularly if the research elements are not being adopted in everyday practice.


Mackenzie, R. and Cushion, C. (2013). Performance analysis in football: A critical review and implications for future research. Journal of Sports Sciences. 31 (6), 639-676.

Carling, C., Wright, C., Nelson, L.J. and Bradley, P.S. (2013). Comment on ‘Performance analysis in football: A critical review and implications for future research’. Journal of Sports Sciences. DOI:10.1080/02640414.2013.807352.



The Visual Analytic Turn

Seventeen years ago, Usama Fayyad, Gregory Piatetsky-Shapiro and Padhraic Smyth wrote:

Across a wide variety of fields, data are being collected and accumulated at a dramatic pace. There is an urgent need for a new generation of computational theories and tools to assist humans in extracting useful information (knowledge) from the rapidly growing volumes of digital data. These theories and tools are the subject of the emerging field of knowledge discovery in databases (KDD).

I revisited their article in the AI Magazine this week after a number of finds prompted me to think about the visual analytic turn in sport.

The first visualisation that grabbed my attention was an English Premier League fixture strength table prepared by Neil Kellie (shared with me by Julian Zipparo). Neil used Tableau Public for his visualisation.


Neil developed his table by combining a static star rating with a form rating to give a score for each fixture. This becomes a dynamic table as the season progresses. It has prompted me to think about how we weight a previous year’s ranking in a model.
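A combination of a static rating and a form rating might be sketched as follows. The 0.6/0.4 weighting, the rating scales and the opponent names are illustrative assumptions of mine, not Neil’s actual model:

```python
# Illustrative sketch of a fixture-strength score that blends a static
# (pre-season) star rating with a rolling form rating, both on a 1-5 scale.
# The 0.6/0.4 weighting and sample ratings are assumptions for illustration.

def fixture_score(static_rating, form_rating, static_weight=0.6):
    """Blend a fixed pre-season rating with current form."""
    return static_weight * static_rating + (1 - static_weight) * form_rating

# Hypothetical opponents: (static star rating, recent form rating)
opponents = {
    "Opponent A": (5.0, 3.5),
    "Opponent B": (2.0, 4.5),
    "Opponent C": (3.0, 3.0),
}

# Rank fixtures from hardest to easiest; because the form ratings update
# each week, the table becomes dynamic as the season progresses.
for name, (static, form) in sorted(
    opponents.items(),
    key=lambda kv: fixture_score(*kv[1]),
    reverse=True,
):
    print(f"{name}: {fixture_score(static, form):.2f}")
```

Re-running the ranking after each round with refreshed form ratings gives the dynamic quality Neil describes.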

The Economist added its weight to the Fantasy Football discussions with its post on 16 August. The post uses topological data-analysis software provided by Ayasdi to visualise Opta data on the different attributes of players. In an experimental interactive chart:

the data is divided into overlapping groups. These groups contain clusters of data—in this case footballers with similar attributes—which are visualised as nodes. Because the groups overlap, footballers can appear in more than one node; when they do, a branch is drawn between the nodes. Some nodes have multiple connections, whereas others have few or none.


There is a 2m 32s introduction to the Ayasdi Viewer on YouTube. Lum et al. (2013) exemplify their discussion of topology with an analysis of NBA roles. Their insights received considerable publicity earlier this year (“this topological network suggests a much finer stratification of players into thirteen positions rather than the traditional division into five positions”).
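The node-and-branch construction The Economist describes can be sketched in miniature. The overlapping groups, clusters and player names below are toy assumptions of mine, not Ayasdi’s actual pipeline:

```python
# Toy sketch of the construction described in the quote above: players fall
# into overlapping groups; clusters within each group become nodes, and two
# nodes that share a player are joined by a branch. The clusters and player
# names are invented for illustration.
from itertools import combinations

# Hypothetical clusters (nodes), each a set of players with similar attributes.
nodes = [
    {"Player A", "Player B"},   # cluster from group 1
    {"Player B", "Player C"},   # cluster from group 2 (overlaps group 1)
    {"Player D"},               # cluster from group 3 (isolated)
]

# Draw a branch between any two nodes that share at least one player.
branches = [
    (i, j)
    for i, j in combinations(range(len(nodes)), 2)
    if nodes[i] & nodes[j]
]

print(branches)  # nodes 0 and 1 share Player B; node 2 has no connections
```

Because Player B appears in two overlapping groups, a branch joins those two nodes, while the isolated cluster has none – exactly the mix of connected and unconnected nodes the quote mentions.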

Back at Tableau Public, I found news of a Fanalytics seminar. One of the presenters at the workshop is Adam McCann. Adam’s most recent blog post is a comparison of radar and parallel coordinate charts. Adam led me to a keynote address by Noah Iliinsky: Four Pillars of Data Visualization (46m YouTube video). Noah works in IBM’s Center for Advanced Visualization.


This snowball sample underscores for me just how many remarkable people are in the visualisation space. I am interested to learn that a number of these people are using Tableau Public … to share sport data.

In other links this week, Satyam Mukherjee shared his visualisation of Batting Partnerships in the first Ashes Test 2013:




Simon Gleave’s 26 Predictions: English Premier League forecasting laid bare reminded me of the discussions following Nate Silver’s analysis of the 2012 Presidential Elections. I enjoyed Simon’s juxtaposition of 26 pre-season Premier League predictions, “13 which are at least partially model based, and 13 from the media. The models select Manchester City as title favourites but the journalists favour Chelsea”. Simon’s post introduced me to James Grayson and his reflection on predictions about performance. I think Simon and James have a very impressive approach to data.

This week’s links have left me thinking about an idea I had back in 2005. I wondered at that time if I could become skilful enough to combine the insights offered by Edward Tufte and Usama Fayyad. More recently, I have been wondering if I could do that with the virtuosity that pervades Snow Fall.