Have you ever watched a pitcher and wondered whether ERA is really the best measure of his performance?  In baseball, numbers rule the game to the nth degree.  There are multiple layers of statistics that describe just about any quality you can think of, ranging from layperson stats such as batting average and ERA to more advanced sabermetric measures like BABIP, ground-ball and fly-ball rates, and WHIP.

Of all the statistics that you can look at, why is ERA still relevant?

ERA is the average number of earned runs a pitcher allows per nine innings of work.  On the surface this seems reasonable, but the deeper you look, the less meaning the stat actually has.
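For reference, the standard formula scales earned runs to a nine-inning game. A quick sketch (the pitcher's numbers here are made up purely for illustration):

```python
def era(earned_runs, innings_pitched):
    """Standard ERA: earned runs allowed, scaled to nine innings."""
    return 9 * earned_runs / innings_pitched

# A hypothetical starter: 70 earned runs over 180 innings
print(round(era(70, 180), 2))  # 3.5
```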

For example, the more innings a pitcher throws in a game, the more susceptible he is to giving up runs.  This puts starting pitchers at an ERA disadvantage compared to relievers, because relievers pitch to fewer batters and have more “gas in the tank” when they face them.

Starters have to face the same batters multiple times in a game, and the second and third times a batter sees a pitcher are when he does the most damage. Relief pitchers have artificially suppressed ERAs because they face each batter only once and have more energy when they do.

Second, pitchers rarely throw all nine innings of a game anymore. So why does it make sense to measure a pitcher by the earned runs he gives up over a nine-inning span, which for most starters amounts to roughly two games' worth of work?

I would prefer an ERA measure calculated as earned runs per inning pitched.  That way, the number of innings a pitcher throws would make no difference, and any two pitchers could be compared regardless of their roles.
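A sketch of that per-inning version, again using made-up numbers for a hypothetical starter and reliever:

```python
def runs_per_inning(earned_runs, innings_pitched):
    """Proposed measure: earned runs per single inning pitched."""
    return earned_runs / innings_pitched

# Hypothetical starter (70 ER in 180 IP) vs. reliever (25 ER in 60 IP)
starter = runs_per_inning(70, 180)
reliever = runs_per_inning(25, 60)
print(round(starter, 3), round(reliever, 3))  # 0.389 0.417
```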

Going one step further, I would like to see an ERA that is defense-independent: not all pitchers have a good defense behind them, and the quality of that defense should not affect their ratings.
