With the pool stage over, I pulled together a quick workflow to compare the predictions against the actual results. Of the 37 games that were eventually played, four results went against the data and my linear regression's prediction. Funnily enough, those games all involved either Fiji or Japan.
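The comparison itself is straightforward: a prediction counts as correct when the predicted winner matches the actual winner. The workflow was built in Alteryx, but the logic can be sketched in a few lines of Python; the match records below are illustrative placeholders, not the real tournament data.

```python
# Count how many results went against the predicted winner.
# A positive margin means the first-listed team won; negative means the second.
# These records are illustrative placeholders, NOT the actual RWC results.
matches = [
    {"teams": ("Japan", "Ireland"), "predicted_margin": -12, "actual_margin": 7},
    {"teams": ("Wales", "Australia"), "predicted_margin": 4, "actual_margin": 4},
    {"teams": ("England", "France"), "predicted_margin": 8, "actual_margin": 6},
]

def is_upset(match):
    """A result goes against the prediction when the margins have
    different signs, i.e. a different team won than was predicted."""
    return (match["predicted_margin"] > 0) != (match["actual_margin"] > 0)

upsets = [m for m in matches if is_upset(m)]
print(len(upsets))  # → 1 (Japan beating Ireland against the prediction)
```

Running the same sign check over all 37 completed pool matches is what surfaces the four upsets mentioned above.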
It would have been difficult to avoid the coverage of Japan's incredible run, in which they overcame both Ireland and Scotland against the odds (and against the predictive model). Whether this is down to home advantage, a remarkable ability to cause an upset, or an upturn in form that the data doesn't account for is hard to tell.
To analyse how accurate the predictions were, I leveraged the Smart Tile functionality found in the (under-appreciated) Tile tool to see the spread of disparity between the predicted margin and the actual margin.
As you can see in the summary below, an impressive 10 matches were predicted within 5 points of the actual result. This includes one match that was predicted exactly right: the captivating battle between Wales and Australia, which Wales sneaked by 4 points.
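The binning step above amounts to bucketing each match's absolute prediction error into tiers. A rough Python equivalent is sketched below; note that the bin edges and error values are my own illustrative choices, not the cut-offs the Smart Tile option actually computes from the data's distribution.

```python
# Bucket each match's prediction error into tiers, loosely mimicking what
# the Smart Tile option in Alteryx's Tile tool does with a numeric field.
# Both the error values and the bin edges are illustrative assumptions.
errors = [0, 3, 4, 7, 12, 2, 5, 18, 1, 9]  # |predicted margin - actual margin|

bins = {"within 5": 0, "6-10": 0, "over 10": 0}
for e in errors:
    if e <= 5:
        bins["within 5"] += 1
    elif e <= 10:
        bins["6-10"] += 1
    else:
        bins["over 10"] += 1

print(bins)  # → {'within 5': 6, '6-10': 2, 'over 10': 2}
```

The real Smart Tile derives its tiers from the standard spread of the field rather than fixed edges, but the resulting summary table reads the same way: a count of matches per disparity band.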
With the semi-finals upcoming, it will be interesting to see whether there will be any more upsets or whether the perennial favourites New Zealand will prevail. As an Englishman, I hope the prediction doesn't hold true and that England can upset the odds. Below are the knockout-stage predictions from when I last updated the workflow.