Current Actual RPI Ratings, Ranks, and Related Information
The following tables show actual RPI ratings and ranks and other information based on games played through Sunday, October 20. The first table is for teams and the second for conferences.
This year, the NCAA expanded the pre-NCAA Tournament season by a week, from 12 to 13 weeks. In the table for teams, there are color-coded columns that show which teams are potential #1 through #4 seeds, which teams almost certainly will get at large positions if not seeded, and which teams potentially may get at large positions, as of the completion of week 10 of the season. The ranges in those columns are based on a 12-week season. Since the season now is 13 weeks, I will deal with that as follows:
This week, which is as of the completion of week 10, I am showing the historic week 10 ranges;
Next week, which will be as of the completion of week 11, I again will show the historic week 10 ranges;
The following week, which will be as of the completion of week 12, I will show the historic week 11 ranges; and
The final week, which will be as of the end of the season, I will show the end-of-season (historic week 12) ranges.
Because of the change in season length, it is possible that the ranges in this week's report should be a little larger than what the teams table shows.
Here is the teams table. Scroll to the right to see all the columns.
Below is the conferences table. In this table, I suggest you take a close look at the two right-hand yellow columns. The first of these shows, for each conference, the average NCAA RPI rank of its opponents (Conference Opponents Average NCAA RPI Rank). The second shows, for each conference, the average NCAA RPI formula Strength of Schedule Contributor rank of its opponents (Conference Opponents Average NCAA RPI Strength of Schedule Contributor Rank). As you can see, for the stronger conferences, at the top, their opponents' RPI formula ranks as strength of schedule contributors are much poorer than their actual RPI ranks. As you go down the table from the stronger to the weaker conferences, this reverses, so that at the bottom, the weaker conferences' opponents' RPI formula ranks as strength of schedule contributors are much better than their actual RPI ranks. This pattern is a function of the NCAA RPI's faulty method for calculating strength of schedule and results in the RPI's discrimination against stronger and in favor of weaker conferences, which I often have written about.
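To make the pattern concrete, here is a minimal Python sketch of the two measures in play. It assumes the basic published NCAA RPI weighting (25% own winning percentage, 50% opponents', 25% opponents' opponents', ignoring bonus and penalty adjustments) and reconstructs a Strength of Schedule Contributor from those same weights: within an opponent's strength of schedule, a team's own winning percentage enters at 50% and its opponents' at 25%, a 2:1 blend. The two team profiles are hypothetical.

```python
def rpi(wp, owp, oowp):
    """Basic NCAA RPI: 25% own winning percentage (WP), 50% opponents'
    (OWP), 25% opponents' opponents' (OOWP); bonus/penalty adjustments
    are omitted in this sketch."""
    return 0.25 * wp + 0.50 * owp + 0.25 * oowp

def sos_contributor(wp, owp):
    """Assumed reconstruction of the Strength of Schedule Contributor:
    what a team feeds into an opponent's schedule strength is its own WP
    (weighted 50% in the opponent's RPI) and its OWP (weighted 25%),
    i.e. a 2:1 blend."""
    return (2 * wp + owp) / 3

# Hypothetical strong-conference team: a .500 record earned against a
# very strong schedule.
a = {"wp": 0.50, "owp": 0.70, "oowp": 0.55}
# Hypothetical weak-conference team: a .700 record against a weak schedule.
b = {"wp": 0.70, "owp": 0.45, "oowp": 0.50}

print(rpi(**a), sos_contributor(a["wp"], a["owp"]))  # ~0.613, ~0.567
print(rpi(**b), sos_contributor(b["wp"], b["owp"]))  # ~0.525, ~0.617
```

Team A rates better than Team B on the RPI itself, yet contributes less to its opponents' strengths of schedule. That is the mismatch the two yellow columns expose.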
And new this week, here is a similar table for the regions into which I divide teams, based on where the teams from each state play the majority (or plurality) of their games.
Predicted Team RPI and Balanced RPI Ranks, Plus RPI and Balanced RPI Strength of Schedule Contributor Ranks
The following table for teams and the next ones for conferences and regions show predicted end-of-season ranks based on the actual results of games played through October 20 and predicted results of games not yet played. The predicted results of future games are based on teams' actual RPI ratings from games played through October 20.
In the table, the ARPI 2015 BPs column shows ranks using the NCAA's 2024 RPI formula, and the URPI 50 50 SoS Iteration 15 column shows ranks using the Balanced RPI formula.
Predicted NCAA Tournament Automatic Qualifiers, Disqualified Teams, and At Large Selection Status, All for the Top 57 Teams
Below, I show predicted #1 through #8 seeds and at large selections based on the Women's Soccer Committee's historic decision patterns. Since we still have three weeks of regular season play (including conference tournaments), there will be changes, possibly significant, from the predictions. Nevertheless, we now are moving closer to where things will end up.
The first table below is for potential #1 seeds. The #1 seeds always have come from the teams ranked #1 through #7 in the end-of-season RPI rankings, so the table is limited to the teams predicted to fall in that rank range. The table is based on applying history-based standards to team scores in relation to a series of factors, all of which are related to the NCAA-specified criteria the Committee is required to use in making at large decisions. For each factor, there is a standard that says, if a team met this standard historically, the team always has gotten a #1 seed. I refer to this as a "yes" standard. For most of the factors, there likewise is a standard that says, if a team met this standard historically, it never has gotten a #1 seed. This is a "no" standard. In the table, I have sorted the Top 7 RPI #1 seed candidates in order of the number of yes standards they meet and then in order of the number of no standards.
This shows North Carolina, Arkansas, and Mississippi State as clear #1 seeds. Wake Forest, Duke, and Penn State have profiles the Committee has not seen before (meeting both "yes" and "no" standards), but are possible #1 seeds. Southern California likewise is a possible #1 seed. The following table applies the "tiebreaker" for #1 seeds to the "possible" group. (A "tiebreaker" is a factor that historically has been the best predictor of Committee decisions for a particular decision.)
As you can see, Duke scores best on the tiebreaker and so joins North Carolina, Arkansas, and Mississippi State as the predicted #1 seeds.
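The sorting described above, most "yes" standards first and then the number of "no" standards, can be sketched as follows. The team names and counts are hypothetical, and I am assuming that meeting fewer "no" standards is the better position in the second-level sort.

```python
# Hypothetical profiles: how many "yes" and "no" standards each team meets.
profiles = {
    "Team A": {"yes": 9, "no": 0},  # meets only "yes" standards: clear seed
    "Team B": {"yes": 4, "no": 2},  # mixed profile
    "Team C": {"yes": 4, "no": 1},  # mixed profile, fewer "no" standards
}

# Sort: most "yes" standards first; among ties, fewest "no" standards.
order = sorted(profiles, key=lambda t: (-profiles[t]["yes"], profiles[t]["no"]))
print(order)  # ['Team A', 'Team C', 'Team B']
```

A team meeting many "yes" standards and no "no" standards sorts to the top as a clear seed; teams meeting both kinds of standards are the "possible" group that goes to the tiebreaker.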
The candidates for #2 seeds are teams ranked through #14. With the #1 seeds already assigned, this produces the following table:
Here, the decision is clear: Wake Forest, Penn State, Michigan State, and Southern California are #2 seeds.
The candidates for #3 seeds are teams ranked through #23. With the #1 and 2 seeds already assigned, this produces the following table:
Here, Stanford and Florida State are clear #3 seeds. Notre Dame, Iowa, UCLA, and Georgetown are potential #3 seeds. Here is the tiebreaker table for #3 seeds for those teams:
Based on the tiebreaker, Iowa and Notre Dame join Stanford and Florida State for #3 seeds.
The candidates for #4 seeds are teams ranked through #26. With the #1, 2, and 3 seeds already assigned, this produces the following table:
Here, Auburn is a clear #4 seed. Utah State, UCLA, Georgetown, Vanderbilt, Oklahoma State, and South Carolina are potential #4 seeds. Here is the tiebreaker table for #4 seeds for those teams:
Based on the tiebreaker Vanderbilt, UCLA, and South Carolina join Auburn for #4 seeds.
For #5 through #8 seeds, the candidates are teams ranked #1 through #49. Although the data are limited since we have had those seeds for only a few years, the best indicator of which teams will get those seeds is a combination of teams' RPI ranks and their Top 60 Head to Head results ranks:
Using this table, the #5 seeds are Utah State, TCU, Georgetown, and Xavier. The #6s are Virginia, Texas Tech, St. Louis, and Oklahoma State. The #7s are Kentucky, Virginia Tech, West Virginia, and Liberty. The #8s are Ohio State, Western Michigan, Fairfield, and Santa Clara.
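A combined-rank indicator like the one just described can be sketched as below. The team names and ranks are hypothetical, and the exact way the two ranks are combined isn't specified here; a simple sum, lower being better, is one plausible form.

```python
# Hypothetical candidates with an RPI rank and a Top 60 Head to Head
# results rank (lower is better for both).
candidates = {
    "Team X": {"rpi": 28, "h2h": 35},
    "Team Y": {"rpi": 33, "h2h": 25},
    "Team Z": {"rpi": 30, "h2h": 40},
}

# One plausible combination: sum the two ranks and sort ascending.
combined = {t: r["rpi"] + r["h2h"] for t, r in candidates.items()}
order = sorted(combined, key=combined.get)
print(order)  # ['Team Y', 'Team X', 'Team Z']
```

The same combined-rank idea applies later for the at large tiebreaker, with the Head to Head rank swapped for the Common Opponent results rank.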
For the remaining At Large selections, the candidates run up to RPI #57, producing the following initial table:
Here, Rutgers, Minnesota, Pittsburgh, Wisconsin, Georgia, and Texas are clear At Large teams, with 5 additional spots to fill. Colorado, Tennessee, Creighton, Pepperdine, Dayton, BYU, Massachusetts, and Washington are candidates for those spots. The remaining teams are not at large selections. For At Large, the tiebreaker is a combination of teams' RPI ranks and their ranks in terms of their Common Opponent results in relation to other Top 60 teams.
Based on the tiebreaker, Pepperdine, Dayton, BYU, Washington, and Massachusetts get the last at large positions.
The above produces the final compilation of seeds, Automatic Qualifiers, at large selections, and Top 57 teams not getting at large selections. In the NCAA Seed or Selection column, the seeds are self-explanatory, the 5s are unseeded Automatic Qualifiers, the 6s are unseeded at large selections, and the 7s are Top 57 teams not getting at large positions.
What If the Committee Were Using the Balanced RPI?
If the Committee were using the Balanced RPI, which does not have the NCAA RPI's problem of discrimination in relation to conferences and regions, the following teams would drop out of the RPI Top 57:
Army (AQ) drops from #44 to #63; South Florida (no at large) from #43 to #65; Liberty (AQ) from #40 to #68; James Madison (AQ) from #47 to #70; Columbia (AQ) from #53 to #72; Grand Canyon (AQ) from #50 to #73; Ohio (AQ) from #57 to #94; Fairfield (AQ) from #41 to #102.
The following teams would move into the Balanced RPI Top 57:
California climbs from #76 to #38; Loyola Marymount from #81 to #41; Baylor from #59 to #45; Utah Valley from #61 to #49; Utah from #84 to #50; Memphis from #63 to #51; Clemson from #64 to #55; Kansas from #73 to #57.
It's worth noting that these shifts primarily involve teams from weaker conferences moving out of the Top 57 and teams from stronger conferences moving in.
In terms of actual at large changes, it is likely that Dayton, BYU, and Massachusetts would lose their predicted at large positions and would be replaced by Tennessee, Colorado, and California.