## Monday, December 12, 2005

### A little bit of baseball trivia

(None of this actually happened in baseball history; the numbers are made up.)

Let's say that Ty Cobb's season batting average is the same as Shoeless Joe Jackson's at the beginning of a late-season double-header. Assume both batters have had hundreds of at-bats.

Cobb went 7 for 8 (.875) while Jackson went 9 for 12 (.750). But at the end of the day, Jackson's season average turned out to be higher than Cobb's. How is this possible?

1. Cobb could have gone to bat more times that day and whiffed every time. I don't know too much about baseball.

2. Shoeless Joe had a couple of walks, which don't count against the BA?

3. They had different numbers of at-bats in the first part of the season.

4. Cobb had more season at-bats, so even though his single-day performance was better, its effect on his season batting average was smaller than Shoeless Joe's. Using an example I can do in my head...
If Cobb was 100 for 200 (.500) on the season, he ends the day at 107 for 208 ~.514
If Shoeless Joe was 50 for 100 (.500), he ends the day 59 for 112 ~.526. The post says to assume hundreds of at-bats, so for fun, I'll add an extra zero to everything.
Cobb: 1007/2008 ~ .501
Jackson: 509/1012 ~ .503
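Answer 4's arithmetic is easy to double-check with a few lines of Python. This is just a sketch that recomputes the example's numbers exactly; the names and starting totals are the made-up ones from the example, not real statistics.

```python
from fractions import Fraction

def avg_after_day(hits, at_bats, day_hits, day_abs):
    """Season batting average after folding in one day's line."""
    return Fraction(hits + day_hits, at_bats + day_abs)

# Answer 4's example: both start the day at .500, but Cobb has
# twice as many season at-bats, so his .875 day moves him less.
cobb = avg_after_day(100, 200, 7, 8)      # 107/208
jackson = avg_after_day(50, 100, 9, 12)   # 59/112

print(float(cobb))     # ~.514
print(float(jackson))  # ~.527
assert jackson > cobb
```

Using `Fraction` keeps the averages exact, so the comparison isn't at the mercy of floating-point rounding.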

5. Going 9 for 12 is the same as going 7 for 8, and following that up with 2 for 4. Assuming their averages were both well below .500, Jackson's extra 2 for 4 increment could be enough to raise his average above Cobb's.

If you don't believe me, here's some math:
If they were both 60/200 (.300) at the beginning of the day, at the end of the day, Cobb is 67/208 = .322 and Jackson is 69/212 = .325.
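Answer 5's decomposition can be checked the same way: going 9 for 12 really is going 7 for 8 plus 2 for 4, and since a 2-for-4 (.500) stretch is above both players' .300 averages, it pulls Jackson up past Cobb. A quick sketch, using the example's invented 60-for-200 starting line:

```python
from fractions import Fraction

# The decomposition: 9-for-12 = 7-for-8 plus 2-for-4.
assert Fraction(7 + 2, 8 + 4) == Fraction(9, 12)

# Both start at 60/200 (.300). Jackson's extra 2-for-4 is above .300,
# so it raises his average beyond Cobb's.
cobb = Fraction(60 + 7, 200 + 8)      # 67/208
jackson = Fraction(60 + 9, 200 + 12)  # 69/212

print(float(cobb))     # ~.322
print(float(jackson))  # ~.325
assert jackson > cobb
```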

Leave your answer in the comments or, if you want to post a question of your own, send me an e-mail. Look in the about section to find my e-mail address. If your question is new, I'll post it soon.

Please don't leave spam or 'Awesome blog, come visit mine' messages. I'll delete them soon after.
