Who Has the Best Fantasy Football Projections? 2016 Update

March 12, 2016

(This article was first published on R – Fantasy Football Analytics, and kindly contributed to R-bloggers)

In prior posts, we demonstrated how to download projections from numerous sources, calculate custom projections for your league, and compare the accuracy of different sources of projections (2013, 2014, 2015).  In the latest version of our annual series, we hold the forecasters accountable and see who had the most and least accurate fantasy football projections over the last 4 years.

The R Script

You can download the R script for comparing the projections from different sources here.  You can download the historical projections and performance using our Projections tool.

To compare the accuracy of the projections, we use two metrics:

  1. R-squared (R²): the proportion of variance in players' actual fantasy points explained by the projections (higher is better)
  2. Mean absolute scaled error (MASE): the average absolute error of the projections, scaled by the error of a naive benchmark forecast (lower is better)

For a discussion of these metrics, see here and here.
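Concretely, the two metrics can be computed as follows. This is a minimal sketch with hypothetical numbers; the naive benchmark for MASE is assumed here to be each player's prior-season points, which may differ from the exact benchmark used in the analysis.

```r
# Minimal sketch of the two accuracy metrics (hypothetical numbers).
# The naive benchmark for MASE is assumed to be each player's
# prior-season fantasy points; the exact benchmark may differ.
actual    <- c(280, 150, 210,  95)   # actual fantasy points
projected <- c(260, 170, 190, 120)   # one source's projections
naive     <- c(300, 140, 180, 110)   # naive benchmark forecast

r_squared <- cor(actual, projected)^2
mase <- mean(abs(actual - projected)) / mean(abs(actual - naive))

round(r_squared, 3)  # 0.969
round(mase, 3)       # 1.133
```

A MASE above 1 means the projections were worse than the naive benchmark for these (made-up) numbers; the real sources in the table below all score well under 1.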

Whose Predictions Were the Best?

The results are in the table below.  We compared the accuracy of projections for the following positions: QB, RB, WR, and TE.  The rows represent the different sources of predictions (e.g., ESPN, CBS) and the columns show each source's R² and MASE for each of the last four years, along with the average across years.  Each cell shows R² / MASE; higher R² and lower MASE indicate more accurate projections.  Blank cells ( – ) indicate years for which we did not have projections from that source.

Source | 2012 | 2013 | 2014 | 2015 | Average
Fantasy Football Analytics: Average | .670 / .545 | .567 / .635 | .618 / .577 | .626 / .553 | .620 / .578
Fantasy Football Analytics: Robust Average | .667 / .549 | .561 / .636 | .613 / .581 | .628 / .554 | .617 / .580
Fantasy Football Analytics: Weighted Average | – | – | – | .626 / .553 | .626 / .553
CBS Average | .637 / .604 | .479 / .722 | .575 / .632 | .500 / .664 | .548 / .656
EDS Football | – | – | .554 / .651 | .584 / .624 | .569 / .638
ESPN | .576 / .669 | .500 / .705 | .498 / .723 | .615 / .585 | .547 / .671
FantasySharks | – | – | – | .529 / .673 | .529 / .673
FFtoday | .661 / .551 | .550 / .646 | .530 / .659 | .546 / .626 | .572 / .621
FOX Sports | – | – | .459 / .720 | .550 / .677 | .505 / .699
NFL.com | .551 / .650 | .505 / .709 | .518 / .692 | .582 / .632 | .539 / .671
numberFire | – | – | .486 / .712 | .560 / .643 | .523 / .678
RTSports | – | – | – | .547 / .670 | .547 / .670
WalterFootball | – | – | .472 / .713 | .431 / .724 | .452 / .719
Yahoo | – | – | .547 / .645 | .635 / .554 | .591 / .600
Here is how the projections ranked over the last four years (based on average MASE; lower is better):
  1. Fantasy Football Analytics: Average (or Weighted Average)
  2. Fantasy Football Analytics: Robust Average
  3. Yahoo
  4. FFtoday
  5. EDS Football
  6. CBS Average
  7. RTSports
  8. ESPN
  9. NFL.com
  10. FantasySharks
  11. numberFire
  12. FOX Sports
  13. WalterFootball
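This ranking follows directly from the Average MASE column in the table above; as a quick check, it can be reproduced in R from those values:

```r
# Rank sources by their four-year average MASE from the table above
# (lower MASE = more accurate).
avg_mase <- c("FFA: Average" = .578, "FFA: Robust Average" = .580,
              "Yahoo" = .600, "FFtoday" = .621, "EDS Football" = .638,
              "CBS Average" = .656, "RTSports" = .670, "ESPN" = .671,
              "NFL.com" = .671, "FantasySharks" = .673,
              "numberFire" = .678, "FOX Sports" = .699,
              "WalterFootball" = .719)

names(sort(avg_mase))  # same order as the ranking above
```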

Notes:

  1. CBS estimates were averaged across Jamey Eisenberg and Dave Richard.
  2. FantasyFootballNerd projections were not included because the full projections are subscription only.
  3. We did not calculate the weighted average prior to 2015.
  4. The accuracy estimates may differ slightly from those provided in prior years because a) we now use standard league scoring settings (you can see the league scoring settings we used here) and b) we are only examining the following positions: QB, RB, WR, and TE.
  5. The weights for the weighted average were based on historical accuracy (1 − MASE).  For the analysts not included in the accuracy calculations, we calculated the average (1 − MASE) value and subtracted 1/2 the standard deviation of (1 − MASE).

The weights in the weighted average for 2015 were:

CBS Average: .428
EDS Football: .428
ESPN: .383
FantasyFootballNerd: .428
FFToday: .482
FOX Sports: .428
NFL.com: .384
numberFire: .404
RTSports.com: .428
WalterFootball: .428
Yahoo Sports: .433
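Given weights like these, a source-weighted projection for a player is just a weighted mean. Here is a minimal sketch; the point projections are hypothetical, and the default-weight rule follows the note above:

```r
# Weighted average projection for one player (hypothetical point values;
# the weights are a subset of the 2015 weights listed above).
weights <- c(ESPN = .383, FFToday = .482, NFL.com = .384, numberFire = .404)
proj    <- c(ESPN = 240,  FFToday = 262,  NFL.com = 248,  numberFire = 255)

weighted_avg <- sum(weights * proj) / sum(weights)  # ~251.9 points

# Default weight for an analyst with no accuracy history, per the note:
# the average (1 - MASE) across analysts minus half its standard deviation
# (the MASE values below are hypothetical).
one_minus_mase <- 1 - c(.585, .626, .632, .643)
default_weight <- mean(one_minus_mase) - 0.5 * sd(one_minus_mase)
```

More accurate sources (higher 1 − MASE) pull the combined projection further toward their own estimates, while the default keeps unproven analysts below the historical average weight.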

Here is a scatterplot of our average projections in relation to players’ actual fantasy points scored in 2015:

[Figure: Accuracy 2015 — scatterplot of FFA average projections vs. actual fantasy points]


Interesting Observations

  1. Projections that combined multiple sources of projections (FFA Average, Weighted Average, Robust Average) were more accurate than all single sources of projections (e.g., CBS, NFL.com, ESPN) every year.  This is consistent with the wisdom of the crowd.
  2. The simple average (mean) was more accurate than the robust average, which gives extreme values less weight in the calculation.  This suggests that outlying projections may carry meaningful signal (e.g., they may help capture a player's ceiling/floor) rather than simply being bad projections (error/noise).
  3. The weighted average was about as accurate as the simple average.  Weights were based on historical accuracy.  If the best analysts are consistently more accurate than other analysts, the weighted average will likely outperform the mean.  If, on the other hand, analysts don't reliably outperform each other, the mean might be more accurate.
  4. The FFA Average explained 57–67% of the variation in players' actual performance.  That means the projections are somewhat accurate but leave much room for improvement: roughly a third or more (33–43%) of the variance in actual points is unexplained by the projections.  Nevertheless, the projections are likely more accurate than pre-season rankings.
  5. The R-squared of the FFA average projection was .67 in 2012, .57 in 2013, .62 in 2014, and .63 in 2015.  This suggests that players are more predictable in some years than others.
  6. There was little consistency in performance across time among sites that provide a single set of projections (CBS, NFL.com, ESPN).  In 2012, CBS was the most accurate of these sources, but it was the least accurate in 2013.  Likewise, ESPN was among the least accurate in 2014, but among the most accurate in 2015.  This suggests that no single source reliably outperforms the others.  While some sites may do better than others in any given year (largely because of chance), they are unlikely to keep outperforming the other sites.
  7. Projections were more accurate for some positions than others.  Projections were much more accurate for QBs and WRs than for RBs.  Projections were the least accurate for Ks, DBs, and DSTs.  For more info, see here.  Here is how positions ranked in accuracy of their projections (from most to least accurate):
    1. QB: R² = .71
    2. WR: R² = .57
    3. LB: R² = .56
    4. TE: R² = .54
    5. DL: R² = .48
    6. RB: R² = .47
    7. K: R² = .38
    8. DB: R² = .32
    9. DST: R² = .15
  8. Projections overestimated players' performance by about 4–10 points every year across most positions (based on mean error).  It will be interesting to see whether this pattern holds in future seasons.  If it does, we could adjust players' projections to account for this overestimation.  In a future post, I hope to explore the types of players for whom this overestimation occurs.
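The simple vs. robust average distinction in observation 2 can be sketched in a few lines of R.  This illustration uses a trimmed mean as the robust estimator; the exact robust estimator used in the FFA Robust Average may differ.

```r
# Simple vs. robust average of one player's projections across sources
# (hypothetical numbers; one source is an outlier).
proj <- c(210, 215, 220, 225, 310)

simple_avg <- mean(proj)               # outlier pulls the mean up: 236
robust_avg <- mean(proj, trim = 0.25)  # trims extremes first: 220
```

Because the simple average beat the robust average here, the outlying projection (310) apparently carried useful information rather than pure noise — down-weighting it would have discarded signal about the player's ceiling.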


Fantasy Football Analytics had the most accurate projections over the last four years.  Why?  We average across sources.  Combining sources of projections removes some of their individual judgment biases (error) and yields a more accurate fantasy projection.  No single source (CBS, NFL.com, ESPN) reliably outperformed the others or the crowd, suggesting that differences between them are due in large part to chance.  In sum, for fantasy football projections, crowd projections are more accurate than individuals' judgments.  People often like to "go with their gut" when picking players.  That's fine; fantasy football is a game, so do what is fun for you.  But crowd projections are the most reliably accurate of any source.  Do with that what you will!  And don't take my word for it: examine the accuracy yourself with our Projections tool, and let us know if you find something interesting!

