Monday, October 14, 2024

2024 ARTICLE 11: POST-WEEK-9 ACTUAL RATINGS AND UPDATED PREDICTIONS

Current Actual RPI Ratings, Ranks, and Related Information

The following tables show actual RPI ratings, ranks, and other information based on games played through Sunday, October 13.  The first table is for teams and the second for conferences.

Note: Scroll to the right to see all the columns.





Predicted Team RPI and Balanced RPI Ranks, Plus RPI and Balanced RPI Strength of Schedule Contributor Ranks

The following table for teams and the next one for conferences show predicted end-of-season ranks based on the actual results of games played through October 13 and predicted results of games not yet played.  The predicted results of future games are based on teams' actual RPI ratings from games played through October 13.

In the table, ARPI 2015 BPs shows ranks using the NCAA's 2024 RPI Formula, and URPI 50 50 SoS Iteration 15 shows ranks using the Balanced RPI formula.





Predicted NCAA Tournament Automatic Qualifiers, Disqualified Teams, and At Large Selection Status, All for the Top 57 Teams

Starting this week, I will show predicted #1 through #8 seeds and at large selections based on the Women's Soccer Committee's historic decision patterns.  With a good part of the season remaining to be played, there will be changes from these predictions, some of them substantial.  My main purpose now is to show how the prediction system works.

The first table below is for potential #1 seeds.  The #1 seeds always have come from the teams ranked #1 through #7 in the end-of-season RPI rankings, so the table is limited to the teams predicted to fall in that rank range.  The table is based on applying history-based standards to team scores in relation to a series of factors, all of which are related to the NCAA-specified criteria the Committee is required to use in making at large decisions.  For each factor, there is a standard that says: if a team met this standard historically, the team always has gotten a #1 seed.  I refer to this as a "yes" standard.  For most of the factors, there likewise is a standard that says: if a team met this standard historically, it never has gotten a #1 seed.  This is a "no" standard.  In the table, I have sorted the Top 7 RPI teams in order of the number of "yes" standards they meet and then in order of the number of "no" standards.


This table has the #1 seed candidates sorted in order of the number of "yes" #1 seed standards they meet, from largest number to smallest (1 Seed Total), and the number of "no" standards they meet, from smallest to largest (No 1 Seed Total).  In the table, North Carolina and Mississippi State each meets at least one "yes" standard and no "no" standards, so each historically would get a #1 seed.  Duke and Penn State meet both "yes" and "no" standards, which means they have profiles the Committee has not seen historically.  Arkansas and Southern California meet no "yes" and no "no" standards, which means their getting or not getting #1 seeds would be consistent with Committee historic patterns.  Wake Forest meets no "yes" and 5 "no" standards, which means it historically would not get a #1 seed.  Altogether this means we have North Carolina and Mississippi State as #1 seeds, with Duke, Penn State, Arkansas, and Southern California as candidates for the remaining two #1 seed positions.
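The classification and sorting logic described above can be sketched in a few lines of code.  This is only an illustration: the underlying factor standards are not reproduced here, and the "yes"/"no" counts for Duke and Penn State below are hypothetical placeholders (the article gives exact counts only for Wake Forest), with the other counts taken from the discussion above.

```python
# Sketch of the "yes"/"no" standards classification for seed candidates.
# The factor standards themselves are not reproduced; only the counts
# of standards met are used, as in the table above.

def classify(yes_count, no_count):
    """Classify a candidate against historic Committee patterns."""
    if yes_count > 0 and no_count == 0:
        return "historically gets the seed"
    if yes_count == 0 and no_count > 0:
        return "historically does not get the seed"
    if yes_count > 0 and no_count > 0:
        return "profile the Committee has not seen"
    return "either outcome consistent with history"

# (team, "yes" standards met, "no" standards met); counts for Duke and
# Penn State are hypothetical -- the article says only that each meets
# at least one of each kind of standard.
candidates = [
    ("North Carolina", 1, 0),
    ("Mississippi State", 1, 0),
    ("Duke", 1, 1),
    ("Penn State", 1, 1),
    ("Arkansas", 0, 0),
    ("Southern California", 0, 0),
    ("Wake Forest", 0, 5),
]

# Sort: most "yes" standards first, then fewest "no" standards.
candidates.sort(key=lambda t: (-t[1], t[2]))
for team, y, n in candidates:
    print(f"{team}: {classify(y, n)}")
```

Python's sort is stable, so teams with identical counts keep their table order.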

When there are more #1 seed candidates remaining than there are positions to fill, historically the best indicator of which teams the Committee will pick is a standard that combines a team's RPI rank with its conference final standing position, with each weighted at 50%.  For conference standing, the standard uses the average of the team's regular season conference standing position and its conference tournament finishing position.  Using this standard produces the following table, where the lower the standard value the better:
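The 50/50 tiebreaker described above is simple arithmetic.  A minimal sketch, with a hypothetical team's values for illustration:

```python
def seed_tiebreaker(rpi_rank, reg_season_standing, tourney_finish):
    """Combine RPI rank (50%) with conference standing (50%), where
    conference standing is the average of the regular season standing
    position and the conference tournament finishing position.
    Lower values are better."""
    conference_standing = (reg_season_standing + tourney_finish) / 2
    return 0.5 * rpi_rank + 0.5 * conference_standing

# Hypothetical example: an RPI #3 team that finished 1st in the
# regular season and 2nd in its conference tournament.
print(seed_tiebreaker(3, 1, 2))  # -> 2.25
```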


The system thus predicts that in addition to North Carolina and Mississippi State, the Committee will give #1 seeds to Duke and Arkansas.

Going through a similar process for #2 seeds, where the candidates are teams ranked #1 through #14, produces the following initial table:


In this table, Iowa, Wake Forest, and Stanford are clear #2 seeds.  Penn State, Southern California, and Auburn are potential #2s.  The last four teams are not #2s.  The tiebreaker for #2 seeds is a standard combining teams' conference standing with their Top 60 Head to Head Results Ranks:


Thus Southern California joins Iowa, Wake Forest, and Stanford as #2 seeds.

For #3 seeds, the candidates are teams ranked #1 through #23, producing the following initial table:


Here, Penn State and Michigan State are clear #3 seeds.  Xavier, Florida State, TCU, Auburn, UCLA, and Georgetown are candidates.  The remaining teams are not #3 seeds.  For #3s, the tiebreaker is teams' ranks in terms of their common opponent results in relation to results of other Top 60 teams.


Thus TCU and Auburn join Penn State and Michigan State as #3 seeds.

For #4 seeds, the candidates are teams ranked #1 through #26, producing the following initial table:


Here, Florida State and Ohio State are clear #4 seeds.  Utah State, Xavier, UCLA, Georgetown, and Pepperdine are candidates.  The remaining teams are not #4 seeds.  For #4s, the tiebreaker is a combination of teams' ranks in terms of their results against Top 50 opponents and their conferences' ranks.


Thus UCLA and Georgetown join Florida State and Ohio State as #4 seeds.

For #5 through #8 seeds, the candidates are teams ranked #1 through #49.  Although the data are limited since we have had those seeds for only a few years, the best indicator of which teams will get those seeds is a combination of teams' RPI ranks and their Top 60 Head to Head Results Ranks:


Using this table, the #5 seeds are Utah State, Xavier, West Virginia, and Vanderbilt.  The #6s are St. Louis, Virginia, Liberty, and Pittsburgh.  The #7s are Wisconsin, Notre Dame, Pepperdine, and South Carolina.  The #8s are Virginia Tech, Western Michigan, Kentucky, and Oklahoma State.

For the remaining At Large selections, the candidates run up to RPI #57, producing the following initial table:


Here, Rutgers, Texas Tech, Texas, and Minnesota are clear At Large teams, with 6 additional spots to fill.  Buffalo, Oklahoma, Connecticut, Arizona, Georgia, Colorado, BYU, Washington, and Tennessee are candidates.  The remaining teams are not at large selections.  For At Large, the tiebreaker is a combination of teams' RPI ranks and their ranks in terms of their Common Opponent results in relation to other Top 60 teams.


Thus the unseeded at large selections are Rutgers, Texas Tech, Texas, and Minnesota, joined by Arizona, Georgia, Buffalo, Oklahoma, BYU, and Colorado.  Connecticut, Washington, and Tennessee just miss getting at large positions.

What If the Committee Were Using the Balanced RPI?

If the Committee were using the Balanced RPI, which does not have the NCAA RPI's problem of discrimination in relation to conferences and regions, then Washington, Tennessee, and California (which would move inside the Top 57 and thus become a candidate) would get at large selections, while BYU, Oklahoma, and Buffalo (which would move outside the Top 57 and thus no longer be a candidate) would not.
