In the modern era of professional baseball, a season batting average above 0.400 is considered a nearly unachievable goal, yet it occurred more frequently in the early years before the 1940s. It is tempting to attribute the disappearance of 0.400 hitters to a decline in player skill, but Gould (1997) argues the contrary. According to Gould, as professional baseball has improved, the variance of the batting average among individual players has decreased while the mean level has remained unchanged, causing the extreme values of the batting average to shrink. In that case, the change would have happened gradually over time. We approach the phenomenon with a change point analysis of extreme batting averages, using the annual top batting average data from Major League Baseball in the United States. A likelihood ratio test for the change point is proposed using a profile likelihood method, and the p-value of the observed test statistic is obtained from a parametric bootstrap procedure. The test procedure is extended to compare a smoothly changing model against a change point model, with either one serving as the null hypothesis and the other as the alternative. A change point was detected in the 1940s, and in model comparison the change point model provided a better fit than a smoothly changing model. The results call for further, alternative explanations of the disappearance of 0.400 hitters.
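The general workflow the abstract describes (profile the likelihood over candidate change points, form a likelihood ratio statistic, then calibrate its p-value by parametric bootstrap) can be sketched as follows. This is a minimal illustration under a simplifying assumption of Gaussian annual maxima with a mean shift, not the chapter's actual extreme-value model; the function names and the toy data are hypothetical.

```python
# Sketch: change-point likelihood ratio test with a parametric bootstrap
# p-value, assuming (for illustration only) i.i.d. Gaussian annual maxima
# with a possible shift in mean at an unknown change point.
import numpy as np

def loglik(x, mu, sigma):
    """Gaussian log-likelihood of sample x under N(mu, sigma^2)."""
    return (-0.5 * len(x) * np.log(2 * np.pi * sigma**2)
            - np.sum((x - mu)**2) / (2 * sigma**2))

def lrt_stat(x):
    """LRT statistic: no change (null) vs. one mean shift, with the
    change point profiled out by maximizing over all candidate years."""
    n = len(x)
    l0 = loglik(x, x.mean(), x.std())          # null: common mean
    best = -np.inf
    for tau in range(2, n - 1):                # profile over change point
        a, b = x[:tau], x[tau:]
        # pooled MLE of sigma under the two-mean alternative
        s = np.sqrt((((a - a.mean())**2).sum()
                     + ((b - b.mean())**2).sum()) / n)
        best = max(best, loglik(a, a.mean(), s) + loglik(b, b.mean(), s))
    return 2 * (best - l0)

def bootstrap_pvalue(x, n_boot=200, seed=0):
    """Parametric bootstrap: simulate from the fitted null model and
    compare simulated LRT statistics with the observed one."""
    rng = np.random.default_rng(seed)
    obs = lrt_stat(x)
    mu0, s0 = x.mean(), x.std()
    sims = [lrt_stat(rng.normal(mu0, s0, len(x))) for _ in range(n_boot)]
    return obs, float(np.mean([s >= obs for s in sims]))
```

The bootstrap step is what replaces the standard chi-squared calibration, which is invalid here because the change point parameter vanishes under the null hypothesis.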
Title of host publication: Extreme Value Modeling and Risk Analysis
Subtitle of host publication: Methods and Applications
Number of pages: 12
Publication status: Published - 2016 Jan 6
Bibliographical note: Publisher Copyright © 2016 by Taylor & Francis Group, LLC.