
Monday 6 July 2015

Explaining Goals - How much do shot quality, location and distance actually add?

As the shot quality argument rages on in football, I have been running the numbers to see what all the fuss is about. Expected Goals (xG) believers and skeptics have been back and forth on the topic for ages.

I will admit that I am skeptical of the degree to which shot location, distance and quality impact goals. Much of this doubt comes from the work done in hockey, but James Grayson and Ben Pugsley have also contributed to my lack of 'buy-in'.

Even with this doubt, I expected that, given the popularity of xG models, they would explain Goals For/Against better than raw shot metrics.

I would like to thank Michael Caley* and Paul Riley** for making their xG results public. This piece is not directed at them or anyone else in particular. I was curious and it just happens that they do not hide behind a curtain of secrecy, which I really respect.




Explaining Goals For

Testing Errors

2014-15 EPL 
For a description of the work I did, please go to the bottom of the page.


When looking at the errors, the lower the number the better.

Shots on Target seems to be the forgotten/ignored metric in the shot quality argument. Without the influence of location, it fared much better than the two xG models.

Danger Zone Shots are the elephant in the room. They did a worse job than Total Shots For in explaining goals this past season.

For the 2014-15 season, we see little evidence that location made a significant impact: Total Shots > Shots inside the Box (SiB) > Danger Zone.


We Need More Data



Now I know you are thinking that this is only one season of one league. I tried to find past-season xG results for Caley's and Riley's (or any other) models so I could test more seasons of the EPL, but was unable to find them on the web (if you care to share, I would love to test the results).

I did find xG data from Michael Caley's site for the 2013-14 Bundesliga, La Liga and Serie A seasons.

Here are the results from the continent:




Bundesliga 2013-14



In Germany, we see that location has little impact on Goals For. Shots on Target come out the best. 


La Liga 2013-14


Shots on Target continues the trend of being the best at explaining Goals For. Location? Not adding a lot.



Serie A 2013-14


And finally, in Italia, we see that Shots on Target (SoT) leads the way. Again, location is not really doing what I had expected.


What Are the xGF Models Measuring?


I ran the r^2 of each xGF model vs all the raw shot metrics to look for correlation, and these were the best matches:

EPL:
Riley xGF vs Shots on Target For: .92
Caley xGF vs Shots Inside the Box For: .87

Bundesliga:
Caley xGF vs Shots Inside the Box For: .95

La Liga:
Caley xGF vs Shots Inside the Box For: .94

Serie A:
Caley xGF vs Shots Inside the Box For: .88

We see that Caley's model closely tracks Shots Inside the Box, whilst Riley's model closely resembles Shots on Target (as I understand it, his model is mostly based on Shots on Target).

Those are strong relationships to basic shot counts.
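
For anyone who wants to replicate this correlation check, here is a minimal Python sketch. The team totals below are made-up placeholders, not real league data; swap in the published xGF values and the raw shot counts for whichever league and season you are testing.

# Minimal sketch of the correlation check above. The team totals are
# hypothetical placeholders -- substitute the published xGF values and
# raw shot counts for the league/season you want to test.
import numpy as np

def r_squared(x, y):
    """r^2 between two team-level series."""
    return np.corrcoef(x, y)[0, 1] ** 2

# One value per team for a single season (made-up numbers)
xgf = np.array([68.2, 61.5, 55.0, 49.3, 44.8])  # a model's xGF
raw_metrics = {
    "Shots on Target For":      np.array([232, 210, 198, 175, 160]),
    "Shots Inside the Box For": np.array([401, 380, 355, 330, 310]),
    "Total Shots For":          np.array([610, 585, 540, 500, 470]),
}

# r^2 against each raw metric, best match first
scores = {name: r_squared(vals, xgf) for name, vals in raw_metrics.items()}
for name, r2 in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"xGF vs {name}: {r2:.2f}")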



Goals For: Does Location Matter?



From what I can tell there is not a lot that location adds to the story of explaining Goals For.

Most intriguing here is that Danger Zone Shots are less reliable than counting all Shots inside the Box in every league. 

Shot quality may exist in attack but its impact is negligible given all the factors that go into scoring a goal.

The lack of publicly available data to properly test the various models is the most frustrating thing. The secrecy and proprietary nature of xG models is a hindrance to moving forward. Again, it is a breath of fresh air that Paul and Michael release their results (even if their formulas and algorithms remain a secret).



Explaining Goals Against

Testing Errors

2014-15 EPL




The first thing to point out is that Goals Against are tougher to explain, as shown by the lower r^2 values.

We see here that Caley's model does a better job on the defensive side of things. Shots on Target and Shots inside the Box trail only slightly in explaining Goals Against. 

Given all the different factors that go into a goal against, are these marginal gains reason to claim location or quality of shot is significant?

Total Shots, Danger Zone and Shots inside the Box are all pretty close based on the errors.  Again it’s tough to call the differences significant but Shots inside the Box just edge Danger Zone Shots in terms of explanatory power.

Here are the results from the continent:



Bundesliga 2013-14



All metrics are in close proximity based on the errors, with Shots on Target holding a marginal edge.



La Liga 2013-14



In Spain, things are close again with Caley xGA edging out Shots on Target. 



Serie A 2013-14


Finally, in Serie A, it is too close to call.



What Are the xGA Models Measuring?




I ran the r^2 of each xGA model vs all the raw shot metrics to look for correlation, and these were the best matches:

EPL:
Riley xGA vs Shots on Target Ag: .89
Caley xGA vs Danger Zone Shots Ag: .88

Bundesliga:
Caley xGA vs Shots Inside the Box Ag: .95

La Liga:
Caley xGA vs Shots Inside the Box Ag: .95

Serie A:
Caley xGA vs Shots Inside the Box Ag: .94

Again, we see strong links to basic shot counts.


Goals Against: Does Location Matter?



As with Goals For, I am struggling to separate shot quality metrics from non-shot quality metrics based on the errors. The xGA models did a better job than their xGF counterparts, but we only see slight differences in explanatory power vs the metrics that do not account for location.



Conclusion



Despite my initial skepticism, I came into this wondering 'how far ahead are xG models?' Given their widespread acceptance, I assumed it would be obvious that adjusting for location/quality makes a significant difference.

Instead, the numbers show that shot distance/location/quality is just a tiny portion of shooting (both for and against). The added complexity seems unnecessary, at least in explaining goals; it tends not to add anything significant compared to non-location-based metrics, specifically Shots on Target.

I have often seen analysis comparing xG to TSR (Total Shots Ratio); I think I have shown it is more valuable to start comparing xG to Shots on Target instead of Total Shots.

Danger Zone Shots need more rigorous testing too. Repeatability and predictability tests need to be done so we can determine whether this is more than a fancy name. Maybe that is harsh, but I am not impressed with how they performed vs. how they are presented in the public domain.

I get that this will not be a popular post, but I hope it creates a conversation and leads to more rigorous public testing of xG models as well as the inputs that go into them. I guess in the end I want more transparency. The point of analytics is to find truths and debunk myths. The results are important to moving the conversation forward.
 
Lastly, I remain a skeptic on shot quality. I do believe it is in there somewhere but more work needs to be done.

Cheers,

Clarke
@footyinthecloud



*Michael Caley's data and methodology can be found here and he is on twitter: @MC_of_A

**Paul Riley's blog and methodology is found here and he is also on twitter: @footballfactman



Explanation

What I have done is use the shot data for the various leagues from Michael Caley's* site to find slope equations for Total Shots For, Shots on Target For, Shots Inside the Box For and Danger Zone Shots For vs. actual Goals For, and then compared the results to both Caley's* and Riley's** xGF. I ran a bunch of error tests to help us see more than just the r^2.


For example, here is Goals For vs Total Shots For:



We take the slope equation, plug the actual Total Shots For into x, and this gives us our 'xGF based on Total Shots For' for the 2014-15 season.

We do this for each shot metric and compare to the Caley and Riley xG models by testing the errors of each metric in explaining Goals For. If you want to know the meaning of the errors, I would suggest Google :)
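
Here is a minimal Python sketch of that procedure, again using made-up team totals rather than the real league data. It fits the slope equation for one metric (Total Shots For), plugs the actual totals back into x to get the 'xGF based on Total Shots For', and reports a few common error measures (lower is better).

# Minimal sketch of the testing procedure, with hypothetical team totals.
# Repeat for each raw shot metric (SoT, SiB, Danger Zone) and compare the
# errors against those of the published xG models.
import numpy as np

goals_for       = np.array([73., 68., 55., 49., 44.])       # actual GF per team
total_shots_for = np.array([610., 585., 540., 500., 470.])  # raw TSF per team

# Fit the slope equation: GF ~ m * TSF + b
m, b = np.polyfit(total_shots_for, goals_for, 1)

# Plug the actual Total Shots For into x -> 'xGF based on Total Shots For'
xgf_from_tsf = m * total_shots_for + b

# Error tests (the lower the number, the better)
residuals = goals_for - xgf_from_tsf
rmse = np.sqrt(np.mean(residuals ** 2))                 # root mean squared error
mae  = np.mean(np.abs(residuals))                       # mean absolute error
r2   = np.corrcoef(goals_for, xgf_from_tsf)[0, 1] ** 2  # r^2

print(f"slope={m:.3f}  intercept={b:.2f}  RMSE={rmse:.2f}  MAE={mae:.2f}  r^2={r2:.2f}")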




2 comments:

  1. There are two distinct objectives of any statistic, in my mind. One, does it explain the results that already happened, and two, does it predict future results that haven't yet happened. Shots on target is sure to explain goal scoring well because you can't score if the shot doesn't go on frame. That doesn't necessarily mean teams are getting better shots...it just means they were putting more on frame. Was it skill or unsustainable "luck"?

    When you time-lag the data, and only measure shots on target from period 1 and goals scored from period 2, then your regression is measuring a statistic's predictability. I have already done this for 4.5 seasons of MLS now over at American Soccer Analysis, and I have found that combinations of expected goals and accuracy (shots on target or goals scored) work well together for predicting future outcomes.

    Furthermore, when it comes to explaining which shots on target do score, location matters (as well as some other things). I've published our logistic model using only shots on target*, and many other things are statistically significant indicators of whether or not the shot on target actually went in. That's the model we used for valuing a keeper's ability to prevent goals.

    *http://www.americansocceranalysis.com/explanation/ (the second one)

  2. I am not oblivious to predictability vs explanation. Shots on Target in MLS may not be repeatable/predictable, but in the EPL and La Liga (I don't have German/Italian data) they are. Maybe this is because of the bigger talent skew? SoT are also decent at predicting future goals in both the EPL & La Liga.

    yr to yr repeatability (r^2) (09-15):
    EPL SoT For: 0.67 (~82% skill)
    EPL SoT Ag: 0.55 (~74% skill)
    La Liga SoT For: 0.74 (~86% skill)
    La Liga SoT Ag: 0.43 (~66% skill)
    EPL SoT +/-: 0.72 (85% skill)
    La Liga SoT +/-: 0.72 (85% skill)

    Predictability (r^2) (09-15):
    EPL GFyr2 vs SoT For yr1: 0.53
    EPL GAyr2 vs SoT Agyr1: 0.47
    La Liga GFyr2 vs SoT For yr1: 0.71
    LaLiga GAyr2 vs SoT Ag yr1: 0.38
    EPL GD yr2 vs SoT +/-yr1: 0.63
    La Liga GD yr2 vs SoT yr1: 0.72

    The question I had was: how much does location add? Is it significant? From the leagues and seasons I had, it seemed to provide little extra. I am not saying it doesn't matter or doesn't exist. I am also not saying that one metric is better than the other. For people who don't have xG models or do not want to take the time to create one, basic shot metrics do really well.

    Clarke



