
Table 3 Summary of results from the FluSight influenza forecast challenges*

From: Applying infectious disease forecasting to public health: a path forward using influenza forecasting examples

|  | 2013–14 season | 2014–15 season | 2015–16 season | 2016–17 season | 2017–18 season |
| --- | --- | --- | --- | --- | --- |
| Number of participating teams | 9 | 5 | 11 | 21 | 22 |
| Number of submitted forecasts† | 13 | 7 | 14 | 28 | 29 |
| Season onset top skill | N/A** | 0.41 | 0.18 | 0.78 | 0.69 |
| Peak week top skill | N/A | 0.49 | 0.20 | 0.49 | 0.50 |
| Peak intensity top skill | N/A | 0.17 | 0.66 | 0.36 | 0.26 |
| 1-week ahead top skill | N/A | 0.43 | 0.89 | 0.60 | 0.54 |
| 2-weeks ahead top skill | N/A | 0.36 | 0.76 | 0.46 | 0.37 |
| 3-weeks ahead top skill | N/A | 0.37 | 0.66 | 0.41 | 0.29 |
| 4-weeks ahead top skill | N/A | 0.35 | 0.58 | 0.38 | 0.26 |
| Overall top performing team | Columbia University | Delphi group, Carnegie Mellon University | Delphi group, Carnegie Mellon University | Delphi group, Carnegie Mellon University | Delphi group, Carnegie Mellon University |
*Skill scores for the 2016–17 and 2017–18 challenges have not been published. Results from the 2018–19 challenge are not complete as of August 2019

†The number of submitted forecasts does not include the unweighted average ensemble or historical average forecasts

**The logarithmic scoring rule used to determine forecast skill scores was not introduced until the second year of the challenge (2014–15). Skill scores for the challenge pilot (2013–14) are therefore not available