How well all seven models matched the 2007-08 final coaches' poll.

Table explanation: The Spearman correlation coefficients (SCC) below measure how well the top 15, 25, and 35 teams in the final coaches' poll match the predictions of the models listed. The '0,1,2' column gives how many of the top 35 team rankings were off by exactly 0, 1, or 2 places, followed by the average absolute deviation between the actual ranking and the prediction. (A quoted value in the 'Wins' column of the large table at the bottom represents the number of wins that team earned in the NIT tournament.)

Model SCC15 SCC25 SCC35 0,1,2 Average
ZP2 0.96964 0.97731 0.96537 8,10,7 1.9000
PR2 0.98036 0.97077 0.95655 8,10,5 2.2857
50T 0.98036 0.97538 0.96467 8,9,7 2.0286
LN2 0.95983 0.96231 0.95851 11,5,7 2.0571
ZPF 0.89107 0.94115 0.94128 7,9,5 2.5426
MCB 0.95357 0.88615 0.93480 9,8,6 2.4000
OCC 0.94464 0.97154 0.92700 11,7,6 2.3286
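The metrics in the table above can be sketched in a few lines of Python. This is illustrative only: the sample ranks below are hypothetical, and the closed-form Spearman formula used here assumes no ties (the actual poll contains ties such as 35T and 38T, which would require the tie-corrected form).

```python
def spearman(poll, pred):
    """Spearman rank correlation via 1 - 6*sum(d^2) / (n*(n^2 - 1)).

    Valid only when neither rank list contains ties.
    """
    n = len(poll)
    d2 = sum((a - b) ** 2 for a, b in zip(poll, pred))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))


def deviation_stats(poll, pred):
    """Return ([count off by 0, by 1, by 2], average absolute deviation)."""
    diffs = [abs(a - b) for a, b in zip(poll, pred)]
    counts = [diffs.count(k) for k in (0, 1, 2)]
    return counts, sum(diffs) / len(diffs)


# Hypothetical 5-team example (not the real data from the tables):
poll = [1, 2, 3, 4, 5]
pred = [1, 2, 4, 3, 5]
print(spearman(poll, pred))            # 0.9 for this sample
counts, avg = deviation_stats(poll, pred)
print(counts, avg)                     # [3, 2, 0] 0.4
```

The real SCC-15/25/35 columns would apply `spearman` to the first 15, 25, or 35 rows of the poll, and the '0,1,2' and 'Average' columns correspond to `deviation_stats` over the top 35.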

Commentary: The 2008 tournament was the first (and only) one in which all four #1 seeds reached the Final Four, and that may have contributed to every one of the models matching the top 5 teams in the final poll exactly. All of the models also matched the top 15, 25, and 35 fairly well across the board: the ZP2 model recorded its second-highest SCC-35 value and its fourth-lowest average deviation ever; the PR2 model recorded its third-best SCC-15 value; and the 50T model had its second-lowest SCC-35 (and SCC-15) value.

This was probably the best overall year for the MCB model: its average deviation tied (with 2017) for its lowest, its SCC-35 value was its second highest, and its SCC-15 value was its third best. MCB also had Ohio State (the NIT champion) pegged fairly well at #36, just one spot below the Buckeyes' (tie for the) #35 poll position; every model except LN2 and MCB ranked them higher than #35.

OCC also did fairly well, but it was off by 13 places for Texas A&M (predicted #40 against the poll's #27) and by 11 for Kansas State (predicted #41 against the poll's #30). However, the OCC model matched the final rank exactly for Indiana and BYU, at least two positions better than the six other models, and it also predicted USC's final rank best; all three of those teams appeared in the lower tier of the final poll. (CBI champion Tulsa did not receive much recognition from the seven models here, but the team received only one vote in this year's final poll.)

TeamName Poll Wins ZP2 PR2 50T LN2 ZPF MCB OCC
Kansas 1 6 1 1 1 1 1 1 1
Memphis 2 5 2 2 2 2 2 2 2
NorthCarolina 3 4 3 3 3 3 3 3 3
UCLA 4 4 4 4 4 4 4 4 4
Texas 5 3 5 5 5 5 5 5 5
Louisville 6 3 7 7 7 7 9 7 8
Tennessee 7 2 8 8 8 9 7 9 9
Xavier 8 3 6 6 6 6 8 6 7
Davidson 9 3 11 10 10 11 14 10 10
Wisconsin 10 2 9 9 9 8 6 8 6
Stanford 11 2 10 11 11 10 10 12 11
Georgetown 12 1 12 12 12 12 11 13 12
MichiganSt 13 2 15 14 14 15 15 14 14
Butler 14 1 14 15 15 14 13 17 16
WashingtonSt 15 2 16 16 16 16 17 15 15
Duke 16 1 13 13 13 13 12 11 13
WestVirginia 17 2 18 18 18 19 22 16 18
Pittsburgh 18 1 19 20 20 22 19 21 21
NotreDame 19 1 17 17 17 17 16 20 17
Purdue 20 1 22 22 22 21 18 24 20
Marquette 21 1 24 24 24 24 26 18 22
WKentucky 22 2 20 19 19 18 21 19 19
Drake 23 0 23 23 23 23 20 23 23
Villanova 24 2 21 21 21 20 23 26 25
Vanderbilt 25 0 26 29 27 25 24 39 28
Connecticut 26 0 25 25 25 26 25 32 29
TexasA&M 27 1 28 28 29 30 31 22 40
MississippiSt 28 1 30 31 31 28 29 30 26
Clemson 29 0 35 35 34 36 37 29 31
KansasSt 30 1 33 33 33 31 32 25 41
Gonzaga 31 0 38 38 37 38 39 31 34
Arkansas 32 1 29 30 30 29 30 34 27
Indiana 33 0 36 37 36 37 38 28 33
UNLV 34 1 27 27 28 27 27 27 24
OhioSt 35T "5" 32 26 26 39 28 36 32
BYU 35T 0 41 41 40 40 40 37 35
MiamiFL 37 1 34 34 35 32 33 33 42
USC 38T 0 42 42 41 42 42 41 37
Tulsa 38T CBI 110 110 113 101 101 83 93