Tuesday, May 09, 2006

How do we know?

Some of the more regular readers of this blog know I occasionally go off on wild big-picture discussions, the greatest being perhaps my snooze-fest post where I applied the guidelines for higher-ed accreditation to junior hockey programs. Unfortunately, it’s long, boring, and probably didn’t mean much to the vast majority of readers who lack the background for this kind of thought.

So I’m going to take a step back and try something that might be a little easier to relate to, and discuss business scorecards and key performance indicators. In many ways this is jumping ahead, because you can’t develop these until your vision/mission/values/goals have been established. However, I am going to assume there is already a set of these elements in play, all existing within the envelope of developing hockey players and advancing them to the “next level.” I don’t want to bog this down in specifics related to some league or level (that’s what the comments function is for, so fire away there), and regardless I feel there is probably 80% congruency of thought anyway, the other 20% existing in that realm of constructive and destructive politics that makes this sport interesting (and helps make this blog possible).

So, assuming there are some generic goals out there, both on the ice and off, how do we know if/when they have been met/accomplished? Well, that’s what measures are for. I’m not psychic and I don’t have a nifty tricorder from Star Trek, so there’s currently no way to glance at a program and determine its health. Yes, we have some general sense… but intuition is not knowledge.


Product Measures

Winning Pct.
League Rank (an ordinal ranking based on playoff and regular season performance)

Total Players (those who saw action in at least one game last season)

Average Age of All Players
Weighted Average Age of All Players (weighs age against games played)
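To show what the weighted version buys you, here is a minimal sketch of a games-played-weighted average age; the ages and games-played figures are invented purely for illustration:

```python
# Weighted average age: each player's age is weighted by games played,
# so everyday players count more than brief call-ups.
# All numbers below are made-up examples, not real roster data.
players = [
    {"age": 20, "games": 58},
    {"age": 19, "games": 60},
    {"age": 17, "games": 4},  # a brief call-up barely moves the weighted number
]

total_games = sum(p["games"] for p in players)
plain_avg = sum(p["age"] for p in players) / len(players)
weighted_avg = sum(p["age"] * p["games"] for p in players) / total_games

print(round(plain_avg, 2))     # 18.67 -- simple average, pulled down by the call-up
print(round(weighted_avg, 2))  # 19.41 -- games-weighted average, closer to the regulars
```

The gap between the two numbers tells you how much your roster age is being skewed by players who barely saw the ice.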

Average Age of Rookies (those not playing at this level or higher for more than 20 games the previous season)

Average Age of Veterans

Source of Rookies - Midget AAA
Source of Rookies - Midget AA
Source of Rookies - High School
Source of Rookies - Junior B
Source of Rookies - Other (International, Jr. C, etc)

Percent of All Players (Final Roster) Whose Home Town or Last Team is Within 200 Miles of Your Team

D1 Commitments – Total
D1 Commitments – Hockey East
D1 Commitments – CCHA
D1 Commitments – WCHA
D1 Commitments – ECAC
D1 Commitments – Atlantic Hockey
D1 Commitments – CHA
D1 Commitments – Independent

D3 Commitments

Number of NHL Draft Picks
Number of Players Ranked by Central Scouting Bureau (Final Rankings)

Percentage of D1 Commitments Still Rostered Two Seasons Later (those still playing for their college team and have not washed out)

Percentage of D3 Commitments Still Rostered Two Seasons Later (those still playing for their college team and have not washed out)

Average HS GPA of Rookies
Average HS GPA of All Players
Average Top Combined ACT Score of All Players
Average Top SAT Score of All Players

Total Alumni in Pros – NHL
Total Alumni in Pros – AHL
Total Alumni in Pros – AA
Total Alumni in Pros – International

Average on-ice practice time per week
Average off-ice practice time per week
Number of Tryout Camps Held (Including Final Camp)
Total Players Attending All Camps
Total Unduplicated Players Attending All Camps

Business Measures

Total Staff FTEs
Total Hockey Staff FTEs
Total Business Staff FTEs
Total Game Night Operations Staff

Total Budget
Total Staff Payroll
Total Travel Expenses
Total Recruiting Expenses (Including Camp Expenses)
Total Ice Expenses (practice/game ice only, do not include tryout camps)
Total Daily Game Operations Budget (include cost of officials)
Total Net Gain/Loss

Total Paid Attendance
Average Paid Attendance
Total Drop Count Attendance
Average Drop Count Attendance
Percent of Facility Capacity (Paid)
Percent of Facility Capacity (Drop)
Total Comps
Average Comps
Total Paid Season Tickets (exclude billets and other comps, include corporate package deals)
Total Ticket Revenue
Total Merchandise Revenue
Total Sponsorship Revenue
Total Camp Revenue
Total Concessions Revenue

Note: this is not a complete list

Most of these measures are what I consider to be of the first order, in that they are basic sums (totals). There are a few second-order derivatives, such as the averages, but the ones I listed are pretty familiar to most people (like average attendance). The next step in performance evaluation is to pit some measures against each other. If a specific goal is to win with a younger team, divide your winning percentage by your average age. Want to measure efficiency in the front office? Ratio your revenues against your FTEs, or corporate revenue per sponsorship.

Also, earlier in this post I said that intuition is not knowledge. Well, data is not knowledge, either, so I caution readers who might interpret these measures (and the data that would come from them) as knowledge. This, in part, goes back to me jumping ahead to measures before first establishing the required vision/mission/values/goals. Data is never knowledge until one asks of the data, “why?” WHY is the data what it is? When you know why the data is what it is, and what impacts it both positively and negatively, then you have something of value.
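The ratio idea can be sketched in a few lines; every figure below is invented purely for illustration, not drawn from any real team:

```python
# Second-order measures: pit one first-order measure against another.
# All numbers here are made-up examples for illustration only.
winning_pct = 0.620
avg_age = 19.3
total_revenue = 850_000.0
staff_ftes = 6.5
sponsorship_revenue = 120_000.0
sponsorships = 15

win_per_age = winning_pct / avg_age            # winning with a younger team
revenue_per_fte = total_revenue / staff_ftes   # front-office efficiency
revenue_per_sponsor = sponsorship_revenue / sponsorships

print(round(win_per_age, 4))       # 0.0321
print(round(revenue_per_fte, 2))   # 130769.23
print(round(revenue_per_sponsor, 2))  # 8000.0
```

None of these ratios means much in isolation; as the next paragraph argues, they only become useful once you can compare them against peer teams tracking the same things.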

Of course, none of this means anything unless you have something to compare it to, and that’s where transparency comes in. How do you know you’re a high-level performer if you don’t have any peer data? Thus we arrive at the punch line: if an individual team does this, the usefulness is limited, but if an entire league shares its data, the value of the information increases markedly. For one thing, teams are competitive, and if driven (ideally) by the same vision/mission/values/goals, they are immediately going to step it up both on and off the ice. A basic truth of performance is that it really does matter when/if you’re keeping score. Secondly, for the business measures, it’s in everyone’s best interest to share best practices: “How did one team improve so much in a given area?” followed by “Could their change in process or strategy improve my performance, too?” I’m not saying teams/leagues should share all their results with the public. Heaven knows I drown in enough data as it is, so I’m not particularly interested unless someone wants to put me on the payroll (hint hint, nudge nudge, say no more).

Anyway, that’s my school of thought on this. There’s plenty more to discuss, like why these measures, and it’s my hope these types of discussions take place as leagues and USAH go into their summer meetings.

Comments on "How do we know?"


Anonymous Marc's wife said ... (1:24 PM, May 09, 2006) : 

Well, you mentioned ACT and SAT scores as part of the whole player evaluation package. Are you going to break that down into subscores? What about the subscore total system utilized by the NCAA Clearinghouse for the ACT? Are you going to include all of the tests (including the new Writing portions) or are you just including the grandfathered portions? (This again would be following the current NCAA standards for Clearinghouse requirements, but not necessarily NCAA collegiate admissions.)


Anonymous Anonymous said ... (2:19 PM, May 09, 2006) : 


If your criteria were really applied to the whole Junior hockey system, I honestly believe that one could not justify it on a scientific basis. I suspect that applying some measurable evaluative system would result in kids going to college after high school, like all other sports. Of course, all the now-unemployed folks involved in Junior hockey would decry the criteria and sing the praises and benefits of the Junior system.

