I recently wrote a piece regarding the diminishing value of ERA as a measure of a starting pitcher's effectiveness. The basic gist of it is that due
to the rise of the compartmentalized bullpen and the attendant fall of the complete game, earned run average is operating under the false assumption
that a starting pitcher will go nine innings and has therefore lost its luster. In other words, the diminishing value of a nine-inning projection in
the virtual absence of the complete game is an issue that needs to be addressed. After all, statistics are the foundation of baseball analysis.
I also propose a new stat for starters called Durable Earned Run Quotient (DERQ). DERQ essentially measures a starter's earned runs per inning
pitched as a function of innings pitched per start, or stinginess as a function of durability.
The piece from my website (Dean's List: Amusing Musings on Sport) follows, but you can also find it there:
"The end of an ERA"
Fact: statistics are the foundation of baseball analysis. Yet statistics amount to little more than a numerical representation of the obvious. Hits,
walks, home runs - the usual. In the beginning, the statistic created the stat, which was nothing more than a symbol of brevity. Today, the statistic
is simple fact, but the stat is a complex theory. The statistic is an idea, but the stat is philosophy. The statistic is the "status quo," but the
stat is evolution. Quite simply, the stat has made the statistic just another statistic, and earned run average is no different.
Before we move on, Encyclopedia.com offers us the following historical analysis of ERA:
"Henry Chadwick is credited with first devising the statistic. It caught on as a measure of pitching effectiveness after relief pitching came into
vogue in the 1900s. Prior to the 1900s, every pitcher was expected to pitch a complete game (and, in fact, for many years afterward). After pitchers
like Otis Crandall and Charlie Hall made names for themselves as relief specialists, gauging a pitcher's effectiveness became more difficult using the
traditional method of tabulating wins and losses. The National League first kept official earned run average statistics in 1912 (the statistic was
called 'Heydler's Statistic' for a while, after then-NL secretary John Heydler), with the American League following suit afterward."
And so begins our dismantling of ERA...
Topping things off is the bottom line: starting pitchers in Major League Baseball are being pampered like never before. On top of averaging just six
innings per outing, starters are going nine in only three percent of their starts in 2004.
The complete game isn't just on a steady decline - it's on life support. Way back in 1917, Boston's Babe Ruth led Major League Baseball with 34
complete games. A decade later, 30 did the trick. In 1937, 27 was enough. In 1947, 24 was enough. In 1957, Warren Spahn led the Majors with just
eighteen complete games.
On the surface, not much changed over the next forty years. Like Spahn before him, Roger Clemens topped the Bigs with eighteen complete games in 1987.
A decade later, it only took thirteen finishes for Pedro Martinez to lead the Majors. But then the slope got really slippery. Since 1999, no pitcher
has even notched as many as ten complete games in a year. So what happened?
Well, it's actually still happening. Ironically, ERA is dying for the same reason it was created: the rise of the reliever. Today's compartmentalized
bullpen has become so vital to success that it practically forbids the complete game.
The starters of Major League Baseball should be irate. The glorified mop-up men of the 'pen stole their stat and slowly cheapened it to death. Now
even the Paul Quantrills of the world have ERAs under two. What a rip-off. The funny thing is, ERA wasn't the bullpen's stat in the first place. It was
created for the starters they relieved. What good is a nine-inning projection if a pitcher can't last more than two innings? And let's not forget
about Eric Gagne. Prior to becoming the most dominant reliever in the history of baseball, Gagne was a mediocre starter at best.
Back to the point. For decades, ERA has been based upon the false assumption that starting pitchers go nine innings with certain regularity. For the
record, let's interpret "certain regularity" to mean at least half the time. When a theory is operating under false pretenses, it must be laid to
rest. With that, I pronounce ERA dead. But don't blame me - I didn't kill ERA, the bullpen did.
To create a new stat for starters, for starters, we need a new assumption. Our new stat will operate under the assumption that a starting pitcher won't
go nine innings. An inherent implication of our assumption is that it doesn't matter how many earned runs he might have given up over the course of an
entire regulation game. Now we just need to define the hardware of the starter's true value. Only then can we create the software known as the stat.
It might sound inane, but the essence of pitching isn't winning. A porous defense or a faulty bullpen can always blow the game. It's not about
strikeouts, either. Flyouts and groundouts are just as good. It's not even about hits and walks allowed. It's about earned runs allowed. In the
practical absence of the complete game, it's also about durability.
Factor one is earned runs allowed per inning pitched, or ER/IP. "The lower, the better" applies here. Factor two is innings pitched per game started,
or IP/GS. "The higher the better" applies in this case. Finally, we divide factor two by factor one to get the starter's Durable Earned Run Quotient.
Naturally, we'll name our new stat DERQ. And do note that "the higher the better" applies to quotients such as DERQ that have a "higher the better"
numerator and a "lower the better" denominator.
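The two-factor recipe above is easy to sketch in a few lines of Python (a minimal illustration; the function name derq is just my shorthand here):

```python
def derq(earned_runs, innings_pitched, games_started):
    """Durable Earned Run Quotient: factor two (IP/GS) divided by factor one (ER/IP)."""
    er_per_inning = earned_runs / innings_pitched    # factor one: stinginess (lower is better)
    ip_per_start = innings_pitched / games_started   # factor two: durability (higher is better)
    return ip_per_start / er_per_inning

# Treating each archetypal outing as a single start:
print(round(derq(3, 5, 1), 1))  # bad starter: 3 ER in 5 IP -> 8.3
print(round(derq(3, 6, 1), 1))  # average: 3 ER in 6 IP -> 12.0
print(round(derq(2, 7, 1), 1))  # All-Star: 2 ER in 7 IP -> 24.5
```

Collapsing the two divisions shows that DERQ equals IP squared divided by (GS times ER), which makes plain just how heavily the stat rewards durability.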
DERQ primarily measures a pitcher's per-inning performance. But DERQ takes the essence of ERA a step further by measuring earned runs per inning in
terms of durability instead of arbitrarily projecting the number over nine innings. In other words, a starter's per-inning performance has greater
significance the longer it can be maintained.
DERQ won't look pretty at first, but this isn't the Miss America Pageant. And it just so happens that a brief synopsis of Jeff Weaver's early years will
help you see DERQ's inner beauty.
Before disaster struck in New York, the former Tiger improved steadily during his first four Big League seasons in Detroit. The breakdown follows in
terms of DERQ:
As a rookie in 1999, Weaver was just plain bad. In 29 starts, he pitched 160 innings and gave up 100 earned runs. Fittingly, his DERQ was a lowly 8.8...
Jeff's sophomore season was better, but nonetheless average. In 30 starts, Weaver pitched 197 innings and conceded 94 earned runs. His DERQ that year
was a mediocre 13.8...
Weaver became a good pitcher during his third season in 2001. That year, Jeff gave up 104 earned runs in 229.1 innings over 33 starts. His 15.32 DERQ
reflected this improvement...
In 2002, Weaver began pitching like an All-Star. During the first half of the year in Detroit, Jeff started seventeen games and gave up only 42 earned
runs in 121.2 innings. His 20.7 DERQ was evidence of his rise - albeit temporary - to excellence.
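Anyone who wants to check the Weaver math can run this little script (a sketch; keep in mind that box scores write thirds of an inning as .1 and .2, which matters for the 229.1 and 121.2 totals):

```python
def innings(ip_notation):
    # Box-score shorthand: .1 means one-third of an inning, .2 means two-thirds.
    whole = int(ip_notation)
    thirds = round((ip_notation - whole) * 10)
    return whole + thirds / 3

def derq(earned_runs, innings_pitched, games_started):
    # DERQ = (IP per start) / (ER per inning)
    return (innings_pitched / games_started) / (earned_runs / innings_pitched)

# Weaver's first three and a half seasons: (year, ER, raw IP, GS)
seasons = [(1999, 100, 160.0, 29), (2000, 94, 197.0, 30),
           (2001, 104, 229.1, 33), (2002, 42, 121.2, 17)]
for year, er, ip, gs in seasons:
    # 1999 -> 8.8, 2000 -> 13.8, 2001 -> 15.3, 2002 -> 20.7
    print(year, round(derq(er, innings(ip), gs), 1))
```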
If that doesn't satisfy you, think about DERQ this way:
A bad pitcher typically gives up three runs in five innings. His DERQ is 8.3...
On average, the average pitcher gives up three runs in six innings, good for a DERQ of 12...
Then we have the good pitcher, who typically gives up three runs in seven innings. His DERQ is 16.3...
Lastly, we have the All-Star. On most days, the All-Star starter concedes two runs in seven innings, good for a DERQ of 24.5.
You might not need it, but the following scale sums up DERQ in a nutshell: 0-10: Bad; 10-15: Average; 15-20: Good; 20+: All-Star.
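Expressed as code, that scale is a simple lookup (a sketch using the grade labels above):

```python
def derq_grade(derq_value):
    # The rough scale for starters: 0-10 Bad, 10-15 Average, 15-20 Good, 20+ All-Star.
    if derq_value < 10:
        return "Bad"
    if derq_value < 15:
        return "Average"
    if derq_value < 20:
        return "Good"
    return "All-Star"

print(derq_grade(8.3))   # Bad
print(derq_grade(24.5))  # All-Star
```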
I strongly advise you to get to know DERQ - STAT! By the way, I just said stat eighteen times. And that was a statistic, not a stat.
[Edited on 8/12/04 by deanchristopher]