Edited July 10, 2022, 6:19 PM
Quoting: BeterChiarelli
Care to walk us through the math to reinforce our understanding?
Quoting: Tintin
I'd like to know as well!
How does putting a rate on GSAx turn into SV%?
And is GSAA a different metric from GSAx or are they the same thing?
GSAx is goals saved above expected: it uses an xG model that accounts for shot quality (distance, angle, time since the last shot event, change in angle from the previous shot event, etc.) to predict how many goals should have been scored based on the number and quality of the shots the goalie faced (expected goals). If you take this number and subtract the number of goals the goalie actually allowed (actual goals), you get goals saved above expected.
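As a minimal sketch of that arithmetic (the per-shot xG values below are made up; in reality they come out of an xG model like MoneyPuck's):

```python
# Hypothetical per-shot expected-goal values from an xG model
# (distance, angle, etc. are already baked into each number).
shot_xg = [0.03, 0.12, 0.45, 0.07, 0.21]  # made-up xG for 5 shots faced
goals_allowed = 1                          # goals the goalie actually gave up

expected_goals = sum(shot_xg)              # total xG the model predicts
gsax = expected_goals - goals_allowed      # goals saved above expected

print(f"xG faced: {expected_goals:.2f}, GSAx: {gsax:+.2f}")
```

Here the goalie allowed one goal on 0.88 xG worth of shots, so his GSAx is slightly negative (he gave up a bit more than the model expected).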
GSAA is goals saved above average: it is NOT an advanced stat. All it does is multiply the league-average save percentage by the number of shots a goalie has faced, giving the number of saves an average goalie would be expected to make (not accounting for shot quality at all). Subtracting that from total shots faced gives the average number of goals allowed, and the difference between this value and the goalie's actual goals against is GSAA. It's a pretty useless stat since it treats every shot as equal: a dump-in from center ice that happens to be on net counts the same as a 2-on-0 breakaway ending in a cross-ice one-timer.
For GSAx, it makes sense to normalize the data by dividing by either A) time on ice, or B) shots/shot attempts faced. Without normalizing you cannot fairly compare a goalie who has played 40 games to one who has played 20. Normalizing by time on ice (usually GSAx/60: goals saved above expected per 60 minutes of ice time) is most common, but it isn't completely fair for comparing goalies who face drastically different numbers of shots per minute played. For example, if two goalies are equally good but one faces 25 shots per 60 minutes and the other faces 35, the goalie facing more shots has more chances to accumulate GSAx, so the two end up with different GSAx/60 even though they are equally good. In my opinion, normalizing GSAx by shots (or shot attempts, though to a lesser extent) is better, since every shot affects GSAx but not every minute played does (you can go multiple minutes of game time without facing a shot against). GSAx/Shot will likely be highly correlated with SV%, since average shot quality doesn't vary hugely between goalies, but it also provides a lot of info that SV% doesn't. GSAx/Shot is the best metric to evaluate goaltenders, in my opinion.
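To make the shot-volume problem concrete, here is a toy comparison (all numbers invented) of two goalies with identical underlying talent who face different shot volumes. Per-shot GSAx comes out equal, while GSAx/60 does not:

```python
# Two hypothetical, equally good goalies: each saves 0.005 goals above
# expected per shot faced, but goalie B sees a higher shot volume.
goalies = {
    "A": {"toi_min": 600, "shots": 250},   # ~25 shots per 60 min
    "B": {"toi_min": 600, "shots": 350},   # ~35 shots per 60 min
}
GSAX_PER_SHOT_TRUE = 0.005  # identical underlying talent (made-up)

results = {}
for name, g in goalies.items():
    gsax = GSAX_PER_SHOT_TRUE * g["shots"]          # total GSAx accumulated
    results[name] = {
        "per_60": gsax / g["toi_min"] * 60,         # GSAx/60
        "per_shot": gsax / g["shots"],              # GSAx/Shot
    }
    print(f"{name}: GSAx/60 = {results[name]['per_60']:.3f}, "
          f"GSAx/shot = {results[name]['per_shot']:.3f}")
```

Both goalies show GSAx/shot = 0.005, but B's GSAx/60 is higher purely because he faces more rubber, which is the bias described above.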
For GSAA, the stat is pretty useless anyway, and normalizing by shots faced doesn't help much since you still aren't accounting for shot quality. From the thread
@BeterChiarelli linked earlier, this is the pair he was talking about having an R-squared value of 1: SV% and GSAA/Shot. That means GSAA/Shot is a useless stat that doesn't tell you anything SV% doesn't already tell you.
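The R-squared of 1 falls straight out of the algebra: GSAA/Shot = (shots × (1 − league SV%) − GA) / shots = SV% − league SV%. In other words, GSAA/Shot is just SV% shifted by a constant, so a scatter of one against the other is a perfect line. A quick check with made-up numbers:

```python
league_sv_pct = 0.905  # league-average save percentage (made-up)

# A few hypothetical goalie seasons: (shots faced, goals against)
for shots, goals_against in [(1000, 80), (500, 55), (1200, 100)]:
    sv_pct = 1 - goals_against / shots
    gsaa = shots * (1 - league_sv_pct) - goals_against
    gsaa_per_shot = gsaa / shots
    # GSAA/Shot equals SV% minus the league SV% -- a straight line,
    # hence the R^2 of 1 against SV%.
    assert abs(gsaa_per_shot - (sv_pct - league_sv_pct)) < 1e-12
    print(f"SV% {sv_pct:.3f} -> GSAA/shot {gsaa_per_shot:+.4f}")
```

So GSAA/Shot carries exactly zero information beyond SV% itself, which is what the R-squared of 1 in the linked thread was showing.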
I made a quick Excel spreadsheet using MoneyPuck data so you can see the formulas for the different columns, plus some graphs showing the correlation between SV% and a couple of these normalized stats. I don't think I can share files directly on here, so I put it in a public Google Drive:
https://drive.google.com/drive/folders/1rX32cDqpf08SHQQhfvtT-jcZ7MBsDgIe?usp=sharing
EDIT: Google Sheets kinda messed with the chart formatting, so you can either download it and open it as a .xlsx file or look at the images in the Drive.