
Error rates
This rating is a percentage indicating how this player's error rate compares to the average fielder at his position in the era in which he played. A rating of 100 means the player is average -- that is, he makes 100% of the errors expected of someone at that position. A player who makes only 50% as many errors as his peers is rated 50. Someone who makes twice as many errors as his peers is rated 200.

The following table summarizes how error rates have changed over time, in five-year intervals. Each entry in the table is the number of errors made per 100 full games (or 900 defensive innings).



  Year    P    C   1B   2B   3B   SS   OF
  1895   24   27   26   44   46   67   19
  1900   22   24   23   38   38   59   14
  1905   18   22   20   31   28   50   10
  1910   16   19   18   28   25   45    9
  1915   16   17   15   25   22   40    9
  1920   14   14   13   23   20   35    8
  1925   12   12   11   21   17   32    8
  1930   10   10   10   19   16   30    7
  1935   10   10   10   18   16   27    7
  1940   10   10   10   17   15   25    6
  1945   10    9    9   16   15   23    6
  1950   10    9    9   15   15   22    5
  1955   10    9    9   14   15   20    5
  1960   10    9    9   13   15   19    5
  1965   10    9    9   13   15   19    5
  1970   10    9    8   12   15   18    5
  1975   10    9    8   12   15   17    5
  1980    9    9    8   11   15   16    5
  1985    9    9    8   10   15   16    4
  1990    9    8    8    9   15   15    4
  1995    9    8    8    9   15   15    4
  2000    8    7    7    9   14   14    4


For example, to assign an error rating to a shortstop from 1912, determine how many errors that player made per 100 games. Suppose the player made 39 errors and was the shortstop about 80% of the time. Based on a 154-game schedule, that's about 123 full games. In 100 games, he would have made 39 x 100 / 123 = 32 errors. Interpolating between the 1910 row (45) and the 1915 row (40) of the table, the average shortstop in 1912 made about 43 errors per 100 games. Our shortstop's rate of 32 is 74% of 43, so his rating is 74.
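The steps above can be sketched in a few lines of Python. The function names and the linear-interpolation helper are my own illustration, not part of the original rating system; only the 1910 and 1915 shortstop values from the table are carried over.

```python
# Errors per 100 games for shortstops, taken from the table above
# (only the two rows needed for the 1912 example are shown).
SS_ERRORS_PER_100 = {1910: 45, 1915: 40}

def league_rate(table, year):
    """Linearly interpolate the league-average error rate for a year."""
    years = sorted(table)
    lo = max(y for y in years if y <= year)
    hi = min(y for y in years if y >= year)
    if lo == hi:
        return float(table[lo])
    frac = (year - lo) / (hi - lo)
    return table[lo] + frac * (table[hi] - table[lo])

def error_rating(errors, full_games, avg_rate):
    """Rating = player's errors per 100 games as a percentage of the
    league-average rate; 100 means exactly average."""
    per_100 = errors * 100 / full_games
    return round(per_100 / avg_rate * 100)

# The 1912 shortstop from the example: 39 errors while playing
# shortstop about 80% of a 154-game schedule.
games = round(0.80 * 154)                       # about 123 full games
avg = league_rate(SS_ERRORS_PER_100, 1912)      # about 43 errors/100 games
rating = error_rating(39, games, avg)           # rating of 74
```

Interpolating between table rows, rather than snapping to the nearest five-year entry, keeps ratings from jumping when a player's season falls just on one side of a boundary.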