I’ve just spent the last two days at the inaugural Sports Analytics Europe conference. The organisers’ long-term aim is to grow it into Europe’s answer to the MIT Sloan Sports Analytics Conference. It will be interesting to see if it can grow from the 100-200 strong gathering this year into the geekapalooza that is MIT.
Perhaps most interesting was the launch of NOSPA (Network of Sports Performance Analysts). I genuinely hope this initiative flourishes, as it was during the coffee breaks and over dinner that I learned the most. Some of the speakers were excellent, but I wonder whether there is a conference format out there that could strike a balance between presentations and unstructured learning.
Flicking back through my notebook now, I can see some very common themes across the speakers.
1. What is the Performance Question?
EVERYTHING MUST START WITH A PERFORMANCE QUESTION. This is something I wrote about here, and every speaker touched on it. If you start by buying fancy equipment rather than with a performance question, you are going to waste time and resources.
It is ridiculous that so many teams start out with ‘we must buy this piece of equipment’ rather than ‘we need to be able to answer this question’. Stop concentrating on the fancy tools and concentrate on the performance question you need to answer.
2. Know Your Sport
This is not about being an expert coach in your sport. That question was put to the panel and the answer was very much ‘it depends’. Sometimes being a sport-specific expert works really well, but equally there are very successful analysts for whom the opposite is the case.
WITTW (What It Takes To Win). Essentially, have you got a benchmark of the performance standards you need to achieve to be #1? If the NGB doesn’t know that, how can you expect to get there? This applies in individual sports, where Infostrada presented their rich dataset of historical performance standards, as well as in team sports, where David Archer, Chris Barnes & Michael Bush presented their research paper on the Evolution of Match Performance Parameters for Various Playing Positions in the English Premier League. Again, I think it is staggering how many people skip this basic step in the analysis process.
3. Coach Analyst Relationship
I eventually stopped writing this one down, it was mentioned that often. Without a good coach-analyst relationship, forget it. Stafford Murray (EIS) believes it takes 3-4 years to properly embed an analyst within a high-performance team. In a follow-up question he was asked how to speed that process up. His advice: patience, and don’t be a knob. Once someone has achieved the minimum qualification standard (MSc), they recruit on personality. With 50,000 undergraduates studying sports courses in the UK, that’s a lot of people who will achieve the minimum qualification standard. Finding ways to show you are not a knob is crucial.
Building on that idea of patience, Tim Chartier, who works with a lot of college teams and the NBA, spoke about how initially it is about giving the coaches what they want, perhaps in a better format, but the first step is to answer their questions, regardless of whether you think they are good questions or not. Once you start to build that relationship you can begin to question and challenge, but it takes time.
To counter that, Barry McNeill, CEO Europe at Catapult Sports (and formerly of Prozone), made the excellent point that most analysis work to date is transactional: we get asked for something and we supply it. But are we being disruptive enough? I certainly think there is plenty of evidence of disruptive work being done outside clubs, and I guess we will never know the inner workings of clubs, but it is an interesting statement coming from someone with the inside track.
It reminded me a lot of the famous Henry Ford quote:
“If I’d asked my customers what they wanted, they would have asked for faster horses”
I guess finding that balance between servicing the immediate needs of the coaches but also being disruptive is a sweet spot we would all like to find.
4. Feedback Science
Stafford also mentioned the idea of Feedback Science, an area that seems very under-researched, and he used the phrase ‘Edible Data’, which I think is great. It was a point also made by Tim Chartier and Chris Barnes: the constant need to get our sometimes quite detailed and comprehensive analysis across in a way that is understandable, and ultimately usable, by coaching staff.
Tableau gave an excellent product demonstration of turning raw datasets into an Edible format. My mid-season resolution is to get better at this tool.
5. Question the data
Whether it’s from wearables, third-party providers or self-collected, we are not asking enough questions about how reliable that data is. We need to be more vigilant about the accuracy and reliability of the data.
We are over-reliant on outcome data, and not enough thought is given to process data. We also don’t ask often enough, ‘When can I trust that stat?’ How long does it take for data to stabilize and become reliable enough to base assumptions on?
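The ‘when can I trust that stat?’ question can be made concrete with a quick simulation. This is purely my own illustration, not something presented at the conference, and the numbers (a true success rate of 40%, a 5-point tolerance band, 20 attempts per match) are invented for the sake of the example:

```python
# Sketch: simulate a player whose true success rate is 40%, then see
# how many matches it takes before the cumulative observed rate
# settles within +/-5 percentage points of the truth and stays there.
import random

random.seed(42)

TRUE_RATE = 0.40        # the underlying "process" value (assumed)
TOLERANCE = 0.05        # how close is close enough (assumed)
N_MATCHES = 500
ATTEMPTS_PER_MATCH = 20

successes = 0
attempts = 0
rates = []  # cumulative observed rate after each match
for _ in range(N_MATCHES):
    successes += sum(random.random() < TRUE_RATE
                     for _ in range(ATTEMPTS_PER_MATCH))
    attempts += ATTEMPTS_PER_MATCH
    rates.append(successes / attempts)

# Stabilisation point: the match after which the cumulative rate
# never again leaves the tolerance band around the true rate.
last_bad = -1
for i, r in enumerate(rates):
    if abs(r - TRUE_RATE) > TOLERANCE:
        last_bad = i
stable_from = last_bad + 2  # matches are 1-indexed; stable from the next one

print(f"Cumulative rate stays within +/-{TOLERANCE:.0%} of the true "
      f"rate from match {stable_from} onwards")
```

Run with different tolerances or noisier per-match data and the stabilisation point moves, which is exactly the point: a stat from three matches and the ‘same’ stat from thirty are not equally trustworthy.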