Tuesday, November 5, 2024

2024 ARTICLE 14: POST-WEEK-12 ACTUAL RATINGS AND UPDATED PREDICTIONS

I am sorry to be late with this week's report, but I had to spend time figuring out why my ratings and ranks, and those of Chris Henderson/All White Kit, did not match the NCAA ratings and ranks published as of November 3.  As it turns out, the culprit is either Duquesne or George Mason, with the wrong outcome for their October 27 game having found its way into the NCAA's RPI database.  The actual result was a 1-1 tie.  The score in the database was a 2-1 George Mason win.  This had a ripple effect throughout the ratings and ranks.  The NCAA stats staff became aware of the error, corrected it, and as of Tuesday morning published corrected RPI ratings and ranks.

Current Actual RPI Ratings, Ranks, and Related Information

The following tables show actual RPI ratings and ranks and other information based on games played through Sunday, November 3.  The first table is for teams, the second for conferences, and the third for geographic regions.  Scroll to the right to see all the columns.







Predicted Team RPI and Balanced RPI Ranks, Plus RPI and Balanced RPI Strength of Schedule Contributor Ranks

The following table for teams and the next ones for conferences and regions show predicted end-of-season ranks based on the actual results of games played through November 3 and predicted results of games not yet played, including conference tournament games.  The predicted results of future games are based on teams' actual RPI ratings from games played through November 3.

In the table, ARPI 2015 BPs shows ranks using the NCAA's 2024 RPI Formula.  URPI 50 50 SoS Iteration 15 shows ranks using the Balanced RPI formula.
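For readers who want a concrete picture of the prediction step, here is a minimal sketch in Python.  It assumes the simplest possible rule, namely that the team with the higher current RPI rating wins each not-yet-played game, and it recomputes an unadjusted RPI (0.25 x WP + 0.50 x OWP + 0.25 x OOWP) over the combined actual and predicted results.  The data structures are hypothetical, and the sketch omits ties, the home/away weighting of winning percentages, and the bonus/penalty adjustments that are part of the full NCAA formula.

```python
# Minimal sketch (hypothetical data structures, not the actual workbook):
# predict the remaining games from current RPI ratings, then recompute a
# simplified RPI over the combined actual-plus-predicted schedule.
# Omitted: ties, home/away weighting, and bonus/penalty adjustments.

def wp(results, team, exclude=None):
    """Winning percentage of 'team', optionally excluding games against 'exclude'
    (the RPI excludes games against the team whose rating is being computed)."""
    games = [(w, l) for (w, l) in results if team in (w, l) and exclude not in (w, l)]
    return sum(1 for (w, _) in games if w == team) / len(games) if games else 0.0

def simple_rpi(results, team):
    """Unadjusted RPI = 0.25*WP + 0.50*OWP + 0.25*OOWP, wins and losses only."""
    opps = [l if w == team else w for (w, l) in results if team in (w, l)]
    owp = sum(wp(results, o, exclude=team) for o in opps) / len(opps)
    oowp = 0.0
    for o in opps:
        o_opps = [l if w == o else w for (w, l) in results if o in (w, l)]
        oowp += sum(wp(results, oo, exclude=o) for oo in o_opps) / len(o_opps)
    oowp /= len(opps)
    return 0.25 * wp(results, team) + 0.50 * owp + 0.25 * oowp

def predicted_final_order(played, remaining, current_rpi):
    """Assume the higher-rated team wins each unplayed game, then re-rank everyone."""
    predicted = [(a, b) if current_rpi[a] >= current_rpi[b] else (b, a)
                 for (a, b) in remaining]          # stored as (winner, loser)
    all_results = played + predicted
    teams = {t for game in all_results for t in game}
    ratings = {t: simple_rpi(all_results, t) for t in teams}
    return sorted(teams, key=lambda t: ratings[t], reverse=True)
```

The returned list is in predicted rank order; the Balanced RPI columns come from the same combined schedule, with the Balanced RPI formula substituted for the simplified one above.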







Predicted NCAA Tournament Automatic Qualifiers, Disqualified Teams, Seeds, and At Large Selection Status, All for the Top 57 Teams

Below, I show predicted #1 through #8 seeds and at large selections based on the Women's Soccer Committee's historic decision patterns.  With one week of regular season play left to go, consisting mostly of conference tournaments, there will be some changes from the predictions.  Nevertheless, we now are getting close to where things will end up.

The first table below is for potential #1 seeds.  The #1 seeds always have come from the teams ranked #1 through #7 in the end-of-season RPI rankings, so the table is limited to the teams predicted to fall in that rank range.  The table is based on applying history-based standards to team scores in relation to a series of factors, all of which are related to the NCAA-specified criteria the Committee is required to use in making at large decisions.  For each factor, there is a standard that says, if a team met this standard historically, the team always has gotten a #1 seed.  I refer to this as a "yes" standard.  For most of the factors, there likewise is a standard that says, if a team met this standard historically, it never has gotten a #1 seed.  This is a "no" standard.  In the table, I have sorted the Top 7 RPI #1 seed candidates in order of the number of yes standards they meet and then in order of the number of no standards they meet.
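As a rough illustration of how these standards sort the candidates, here is a short sketch.  The factor names, thresholds, and the assumption that a higher score is always better are hypothetical; the real factors run in different directions and use history-based thresholds.

```python
# Sketch of the "yes"/"no" standards sort (hypothetical factors and thresholds;
# for simplicity, every factor here is treated as "higher is better").

def classify(team_scores, yes_standards, no_standards):
    """Count the yes and no standards each team meets and label its profile."""
    labeled = {}
    for team, scores in team_scores.items():
        yes = sum(scores[f] >= cut for f, cut in yes_standards.items())
        no = sum(scores[f] <= cut for f, cut in no_standards.items())
        if yes and not no:
            label = "clear"             # teams like this always have gotten the seed
        elif yes and no:
            label = "profile not seen"  # meets both kinds, so history gives no answer
        elif not yes and not no:
            label = "possible"          # meets neither kind of standard
        else:
            label = "out"               # teams like this never have gotten the seed
        labeled[team] = (yes, no, label)
    return labeled

def table_order(labeled):
    """Sort as in the table: most yes standards met first, then fewest no standards."""
    return sorted(labeled, key=lambda t: (-labeled[t][0], labeled[t][1]))
```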



This shows Mississippi State as the one clear #1 seed.  Duke, North Carolina, and Penn State have profiles the Committee has not seen before (meeting both "yes" and "no" standards), but are possible #1 seeds.  Arkansas, Southern California, and Florida State likewise are possible #1 seeds.  The following table applies the "tiebreaker" for #1 seeds to the "possible" group.  (A "tiebreaker" is a factor that historically has been the best predictor for a particular Committee decision.)


As you can see, Duke, Southern California, and Arkansas score best on the tiebreaker and so join Mississippi State as the predicted #1 seeds.
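Mechanically, the tiebreaker step is just an ordering of the "possible" group by that single factor, with the open slots going to the teams at the top of the ordering.  A minimal sketch, assuming a hypothetical tiebreaker_score lookup and that a higher score is better:

```python
# Sketch: fill the open seed slots from the "possible" group using the tiebreaker,
# the single factor that historically has best predicted this Committee decision.
# 'tiebreaker_score' is a hypothetical lookup; higher is assumed to be better.
def apply_tiebreaker(possible_teams, tiebreaker_score, open_slots):
    ordered = sorted(possible_teams, key=lambda t: tiebreaker_score[t], reverse=True)
    return ordered[:open_slots]
```

For the #1 seeds above, the "possible" group has six teams and three open slots, and the three best tiebreaker scores fill them.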

The candidates for #2 seeds are teams ranked through #14.  With the #1 seeds already assigned, this produces the following table:



This shows North Carolina, Wake Forest, and Florida State as clear #2 seeds.  Penn State has a profile the Committee has not seen before, but is a possible #2 seed.  There are no other possible #2 seeds.  Given that, North Carolina, Wake Forest, Florida State, and Penn State are the predicted #2 seeds.

The candidates for #3 seeds are teams ranked through #23.  With the #1 and 2 seeds already assigned, this produces the following table:



This shows there are no clear #3 seeds.  Notre Dame and Virginia have profiles the Committee has not seen before, but are possible #3 seeds.  Iowa, Georgetown, UCLA, South Carolina, and Michigan State likewise are possible #3 seeds.  The following table applies the "tiebreaker" for #3 seeds to the "possible" group.



As you can see, Iowa, Michigan State, Notre Dame, and UCLA score best on the tiebreaker and so are the predicted #3 seeds.

The candidates for #4 seeds are teams ranked through #26.  With the #1, 2, and 3 seeds already assigned, this produces the following table:



This shows no clear #4 seeds.  Virginia, Stanford, and Utah State have profiles the Committee has not seen before, but are possible #4 seeds.  Georgetown, South Carolina, Ohio State, and Texas likewise are possible #4 seeds.  The following table applies the "tiebreaker" for #4 seeds to the "possible" group.



As you can see, Stanford, Ohio State, Virginia, and South Carolina score best on the tiebreaker and so are the predicted #4 seeds.

For #5 through #8 seeds, the candidates are the not-yet-seeded teams ranked #1 through #49.  Although the data are limited since we have had those seeds for only a few years, the best indicator of which teams will get those seeds is a combination of teams' RPI ranks and their Top 60 Head to Head results ranks:



Using this table, the #5 seeds are TCU, Utah State, Georgetown, and St. Louis.  The #6s are Auburn, Texas, Western Michigan, and BYU.  The #7s are Vanderbilt, Kentucky, Minnesota, and Rutgers.  The #8s are Texas Tech, Xavier, Virginia Tech, and Liberty.
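The combination itself is simple to sketch.  One way to make it concrete, shown below, is to sum the two ranks and order the candidates by the total; the actual combination I use may weight the two components differently, so treat this as illustrative only.

```python
# Sketch: order the #5 through #8 seed candidates by a combination of RPI rank
# and Top 60 head-to-head results rank.  A plain unweighted sum of the two ranks
# is used here for illustration.
def seed_5_through_8_order(candidates, rpi_rank, top60_h2h_rank):
    return sorted(candidates, key=lambda t: rpi_rank[t] + top60_h2h_rank[t])
```

The first four teams in the returned order would be the #5 seeds, the next four the #6 seeds, and so on through the #8 seeds.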

For the remaining At Large selections, the candidates run up to RPI #57, producing the following initial table:



In the NCAA Seed or Selection column, the 5s are the unseeded Automatic Qualifiers.

As the table indicates, Oklahoma State, Georgia, Pepperdine, and Wisconsin are clear At Large teams.  This leaves 6 additional spots to fill.  West Virginia and Pittsburgh have profiles the Committee has not seen before and are candidates for those spots.  Tennessee, LSU, Washington, Kansas, and Colorado also are candidates for those spots.  In addition, Massachusetts, Dayton, California, and Connecticut are close, so they may be additional candidates for the open spots if the Committee breaks its historic patterns.  The remaining teams in the table, Oklahoma, Rice, and Army, are unlikely to get at large positions.

The following table applies the "tiebreaker" for the last at large candidates and also for the close candidates:



Based on the two above tables, the predicted unseeded at large selections go to Oklahoma State, Georgia, Pepperdine, Wisconsin, West Virginia, Tennessee, Pittsburgh, and Washington, plus either Massachusetts and Dayton (if the Committee breaks with past patterns) or LSU and Kansas (if it does not).

Based on the above, this produces the final compilation of seeds, Automatic Qualifiers, at large selections, and Top 57 teams not getting at large selections.  In the NCAA Seed or Selection column, in addition to the seeds and at large selections, the 6s are unseeded at large selections, the 6.5s are the "edge of the bubble" group from which 2 teams will get at large positions, with the ones getting them depending on whether the Committee breaks with its historic patterns, and the 7s are Top 57 teams not getting at large positions.  And, of course, the Committee could break even more of its historic patterns, so one always must take that possibility into account.



What If the Committee Were Using the Balanced RPI?

If the Committee were using the Balanced RPI, which does not have the NCAA RPI's problem of discrimination in relation to conferences and regions, the following teams would drop out of the RPI Top 57:

Fairfield, AQ, drops from NCAA RPI 39 to Balanced RPI 90; Liberty, AQ, 41 to 68; South Florida, AQ, 42 to 66; Massachusetts, 43 to 62; Dayton, 45 to 60; James Madison (AQ as of 11/3), 52 to 67; Army, 56 to 77; and Rice, 57 to 96.

The following teams would move into the Balanced RPI Top 57:

Arizona, 61 to 40; Loyola Marymount, 80 to 47; Baylor, 65 to 49; UC Davis, 76 to 53; Alabama, 85 to 54; Illinois, 101 to 55; Boston College, 71 to 56; and Nebraska, 111 to 57.

It's worth noting that these shifts primarily are teams from weaker conferences moving out of the Top 57 and teams from stronger conferences moving in.  In addition, no teams from the West geographic region move out of the Top 57 and three from the West region move in.  None of this is surprising given the RPI's discrimination problems.

In terms of actual at large changes, it is likely that Oklahoma State, West Virginia, and one of Kansas, Massachusetts, and Dayton would lose their predicted at large positions, with California, Colorado, and Baylor replacing them.

Monday, October 28, 2024

2024 ARTICLE 13: POST-WEEK-11 ACTUAL RATINGS AND UPDATED PREDICTIONS

Current Actual RPI Ratings, Ranks, and Related Information

The following tables show actual RPI ratings and ranks and other information based on games played through Sunday, October 27.  The first table is for teams, the second for conferences, and the third for geographic regions.  Scroll to the right to see all the columns.






Predicted Team RPI and Balanced RPI Ranks, Plus RPI and Balanced RPI Strength of Schedule Contributor Ranks

The following table for teams and the next ones for conferences and regions show predicted end-of-season ranks based on the actual results of games played through October 27 and predicted results of games not yet played, including conference tournament games.  The predicted results of future games are based on teams' actual RPI ratings from games played through October 27.

In the table, ARPI 2015 BPs shows ranks using the NCAA's 2024 RPI Formula.  URPI 50 50 SoS Iteration 15 shows ranks using the Balanced RPI formula.






Predicted NCAA Tournament Automatic Qualifiers, Disqualified Teams, and At Large Selection Status, All for the Top 57 Teams

Below, I show predicted #1 through #8 seeds and at large selections based on the Women's Soccer Committee's historic decision patterns.  With two weeks of regular season play left to go (including conference tournaments), there will be changes, possibly significant, from the predictions.  Nevertheless, we now are getting closer to where things will end up.

The first table below is for potential #1 seeds.  The #1 seeds always have come from the teams ranked #1 through #7 in the end-of-season RPI rankings, so the table is limited to the teams predicted to fall in that rank range.  The table is based on applying history-based standards to team scores in relation to a series of factors, all of which are related to the NCAA-specified criteria the Committee is required to use in making at large decisions.  For each factor, there is a standard that says, if a team met this standard historically, the team always has gotten a #1 seed.  I refer to this as a "yes" standard.  For most of the factors, there likewise is a standard that says, if a team met this standard historically, it never has gotten a #1 seed.  This is a "no" standard.  In the table, I have sorted the Top 7 RPI #1 seed candidates in order of the number of yes standards they meet and then in order of the number of no standards they meet.


This shows North Carolina and Mississippi State as clear #1 seeds.  Duke and Wake Forest have profiles the Committee has not seen before (meeting both "yes" and "no" standards), but are possible #1 seeds.  Arkansas and Southern California likewise are possible #1 seeds.  The following table applies the "tiebreaker" for #1 seeds to the "possible" group.  (A "tiebreaker" is a factor that historically has been the best predictor for a particular Committee decision.)


As you can see, Duke and Southern California score best on the tiebreaker and so join North Carolina and Mississippi State as the predicted #1 seeds.

The candidates for #2 seeds are teams ranked through #14.  With the #1 seeds already assigned, this produces the following table:



This shows Wake Forest, Arkansas, and Florida State as clear #2 seeds.  Penn State and Stanford have profiles the Committee has not seen before, but are possible #2 seeds.  Iowa and Michigan State likewise are possible #2 seeds.  The following table applies the "tiebreaker" for #2 seeds to the "possible" group.


As you can see, Iowa scores best on the tiebreaker and so joins Wake Forest, Arkansas, and Florida State as the predicted #2 seeds.

The candidates for #3 seeds are teams ranked through #23.  With the #1 and 2 seeds already assigned, this produces the following table:


This shows Penn State and Stanford as clear #3 seeds.  Notre Dame has a profile the Committee has not seen before, but is a possible #3 seed.  Michigan State, UCLA, and Georgetown likewise are possible #3 seeds.  The following table applies the "tiebreaker" for #3 seeds to the "possible" group.


As you can see, Michigan State and Notre Dame score best on the tiebreaker and so join Penn State and Stanford as the predicted #3 seeds.

The candidates for #4 seeds are teams ranked through #26.  With the #1, 2, and 3 seeds already assigned, this produces the following table:



This shows no clear #4 seeds.  Virginia and Utah State have profiles the Committee has not seen before, but are possible #4 seeds.  UCLA, South Carolina, Vanderbilt, Georgetown, TCU, and Auburn likewise are possible #4 seeds.  The following table applies the "tiebreaker" for #4 seeds to the "possible" group.


As you can see, Virginia, UCLA, South Carolina, and Vanderbilt score best on the tiebreaker and so are the predicted #4 seeds.

For #5 through #8 seeds, the candidates are the not-yet-seeded teams ranked #1 through #49.  Although the data are limited since we have had those seeds for only a few years, the best indicator of which teams will get those seeds is a combination of teams' RPI ranks and their Top 60 Head to Head results ranks:


Using this table, the #5 seeds are TCU, Utah State, Auburn, and Georgetown.  The #6s are Virginia Tech, Minnesota, Ohio State, and Xavier.  The #7s are St. Louis, Kentucky, Western Michigan, and West Virginia.  The #8s are Texas, Liberty, Wisconsin, and Santa Clara.

For the remaining At Large selections, the candidates run up to RPI #57, producing the following initial table:


Here, Oklahoma State, Georgia, Texas Tech, and BYU are clear At Large teams, with 7 additional spots to fill.  Pittsburgh, Buffalo, and Memphis have profiles the Committee has not seen before and are candidates for those spots.  Pepperdine, Rutgers, Washington, and Arizona also are candidates for those spots.  Texas A&M would be a candidate, but a predicted conference tournament first round loss puts its winning percentage below 0.500, which makes it ineligible for an at large position.  The remaining teams are not at large selections.  Since there are only 7 eligible candidates to fill the 7 open spots, all of Pittsburgh, Buffalo, Memphis, Pepperdine, Rutgers, Washington, and Arizona fill those spots.

Based on the above, this produces the final compilation of seeds, Automatic Qualifiers, at large selections, and Top 57 teams not getting at large selections.  In the NCAA Seed or Selection column, the seeds are self-explanatory, the 5s are unseeded Automatic Qualifiers, the 6s are unseeded at large selections, the 6.5 is the disqualified team, and the 7s are Top 57 teams not getting at large positions.


What If the Committee Were Using the Balanced RPI?

If the Committee were using the Balanced RPI, which does not have the NCAA RPI's problem of discrimination in relation to conferences and regions, the following teams would drop out of the RPI Top 57:

Fairfield (AQ) drops from RPI rank 37 to Balanced RPI rank 94; Liberty (AQ), 39 to 68; South Florida (AQ), 40 to 71; Dayton (no at large), 41 to 62; James Madison (AQ), 44 to 64; Massachusetts (no at large), 46 to 70; Columbia (AQ), 47 to 63; Army (no at large), 49 to 65; Buffalo (at large), 54 to 79; and Texas A&M (disqualified), 55 to 66.

The following teams would move into the Balanced RPI Top 57:

California moves from RPI rank 65 to Balanced RPI rank 35; Colorado, 58 to 40; Loyola Marymount, 79 to 43; Tennessee, 60 to 33; Illinois, 98 to 51; UC Davis, 80 to 52; Baylor, 74 to 53; Connecticut, 64 to 54; Kansas, 64 to 56; and Utah, 86 to 57.

It's worth noting that these shifts primarily are teams from weaker conferences moving out of the Top 57 and teams from stronger conferences moving in.  In addition, no teams from the West geographic region move out of the Top 57 and five from the West region move in.

In terms of actual at large changes, it is likely that Oklahoma State, Memphis, and Buffalo would lose their predicted at large positions and would be replaced by California, Tennessee, and Colorado.

Monday, October 21, 2024

2024 ARTICLE 12: POST-WEEK-10 ACTUAL RATINGS AND UPDATED PREDICTIONS

Current Actual RPI Ratings, Ranks, and Related Information

The following tables show actual RPI ratings and ranks and other information based on games played through Sunday, October 20.  The first table is for teams and the second for conferences.

This year, the NCAA expanded the pre-NCAA Tournament season by a week, from 12 to 13 weeks.  In the table for teams, there are color-coded columns that show which teams are potential #1 through #4 seeds, which teams almost certainly will get at large positions if not seeded, and which teams potentially may get at large positions, as of the completion of week 10 of the season.  The ranges in those columns are based on a 12-week season.  Since the season now is 13 weeks, I will deal with that as follows:

This week, which is as of the completion of week 10, I am showing the historic week 10 ranges;

Next week, which will be as of the completion of week 11, I again will show the historic week 10 ranges;

The following week, which will be as of the completion of week 12, I will show the historic week 11 ranges; and

The final week, which will be as of the end of the season, I will show the end-of-season (historic week 12) ranges.

Because of the change in season length, it is possible that the ranges in this week's report should be a little larger than what the teams table shows.

Here is the teams table.  Scroll to the right to see all the columns.


Below is the conferences table.  In this table, I suggest you take a close look at the two right-hand yellow columns.  The first of these shows, for each conference, the average NCAA RPI rank of its opponents (Conference Opponents Average NCAA RPI Rank).  The second shows, for each conference, the average NCAA RPI formula Strength of Schedule Contributor rank of its opponents (Conference Opponents Average NCAA RPI Strength of Schedule Contributor Rank).  As you can see, for the stronger conferences, at the top, their opponents' RPI formula ranks as strength of schedule contributors are much poorer than their actual RPI ranks.  As you go down the table from the stronger to the weaker conferences, this reverses, so that at the bottom, the weaker conferences' opponents' RPI formula ranks as strength of schedule contributors are much better than their actual RPI ranks.  This pattern is a function of the NCAA RPI's faulty method for calculating strength of schedule and results in the RPI's discrimination against stronger and in favor of weaker conferences that I often have written about.
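For readers who want to see where the gap between those two columns comes from, recall that the NCAA RPI is 0.25 x WP + 0.50 x OWP + 0.25 x OOWP.  An opponent therefore feeds into your strength of schedule only through its own winning percentage (at double weight) and its opponents' winning percentage (at single weight), not through its full RPI.  The short sketch below contrasts the two numbers for a single opponent, using hypothetical element values and ignoring the home/away weighting and bonus/penalty adjustments of the full formula.

```python
# Sketch: an opponent's own (unadjusted) RPI versus the value it contributes to
# your strength of schedule under the 0.25/0.50/0.25 element weights.
# Hypothetical numbers; ignores home/away weighting and bonus/penalty adjustments.

def own_rpi(wp, owp, oowp):
    """The opponent's own rating."""
    return 0.25 * wp + 0.50 * owp + 0.25 * oowp

def sos_contribution(wp, owp):
    """What the opponent feeds into your RPI: its WP enters your OWP (50% weight)
    and its OWP enters your OOWP (25% weight), i.e. a 2-to-1 blend."""
    return (2 * wp + owp) / 3

# A strong-conference team: modest winning percentage against tough opponents.
print(own_rpi(0.55, 0.65, 0.60))        # 0.6125 -- its own rating
print(sos_contribution(0.55, 0.65))     # 0.5833 -- what it contributes to your SoS

# A weak-conference team: high winning percentage against weak opponents.
print(own_rpi(0.85, 0.42, 0.48))        # 0.5425
print(sos_contribution(0.85, 0.42))     # 0.7067
```

In this illustration, the stronger team rates higher but contributes less to its opponents' strength of schedule, while the weaker team rates lower but contributes more, which is exactly the pattern the two yellow columns show at the conference level.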



And, new this week, here is a similar table for the geographic regions into which I divide teams, based on where teams from the different states play the majority or plurality of their games.




Predicted Team RPI and Balanced RPI Ranks, Plus RPI and Balanced RPI Strength of Schedule Contributor Ranks

The following table for teams and the next ones for conferences and regions show predicted end-of-season ranks based on the actual results of games played through October 20 and predicted results of games not yet played.  The predicted results of future games are based on teams' actual RPI ratings from games played through October 20.

In the table, ARPI 2015 BPs shows ranks using the NCAA's 2024 RPI Formula.  URPI 50 50 SoS Iteration 15 shows ranks using the Balanced RPI formula.








Predicted NCAA Tournament Automatic Qualifiers, Disqualified Teams, and At Large Selection Status, All for the Top 57 Teams

Below, I show predicted #1 through #8 seeds and at large selections based on the Women's Soccer Committee's historic decision patterns.  Since we still have three weeks of regular season play (including conference tournaments), there will be changes, possibly significant, from the predictions.  Nevertheless, we now are moving closer to where things will end up.

The first table below is for potential #1 seeds.  The #1 seeds always have come from the teams ranked #1 through #7 in the end-of-season RPI rankings, so the table is limited to the teams predicted to fall in that rank range.  The table is based on applying history-based standards to team scores in relation to a series of factors, all of which are related to the NCAA-specified criteria the Committee is required to use in making at large decisions.  For each factor, there is a standard that says, if a team met this standard historically, the team always has gotten a #1 seed.  I refer to this as a "yes" standard.  For most of the factors, there likewise is a standard that says, if a team met this standard historically, it never has gotten a #1 seed.  This is a "no" standard.  In the table, I have sorted the Top 7 RPI #1 seed candidates in order of the number of yes standards they meet and then in order of the number of no standards they meet.




This shows North Carolina, Arkansas, and Mississippi State as clear #1 seeds.  Wake Forest, Duke, and Penn State have profiles the Committee has not seen before (meeting both "yes" and "no" standards), but are possible #1 seeds.  Southern California likewise is a possible #1 seed.  The following table applies the "tiebreaker" for #1 seeds to the "possible" group.  (A "tiebreaker" is a factor that historically has been the best predictor for a particular Committee decision.)


 
As you can see, Duke scores best on the tiebreaker and so joins North Carolina, Arkansas, and Mississippi State as the predicted #1 seeds.

The candidates for #2 seeds are teams ranked through #14.  With the #1 seeds already assigned, this produces the following table:



Here, the decision is clear: Wake Forest, Penn State, Michigan State, and Southern California are #2 seeds.

The candidates for #3 seeds are teams ranked through #23.  With the #1 and 2 seeds already assigned, this produces the following table:



Here, Stanford and Florida State are clear #3 seeds.  Notre Dame, Iowa, UCLA, and Georgetown are potential #3 seeds.  Here is the tiebreaker table for #3 seeds for those teams:




Based on the tiebreaker, Iowa and Notre Dame join Stanford and Florida State for #3 seeds.

The candidates for #4 seeds are teams ranked through #26.  With the #1, 2, and 3 seeds already assigned, this produces the following table:



Here, Auburn is a clear #4 seed.  Utah State, UCLA, Georgetown, Vanderbilt, Oklahoma State, and South Carolina are potential #4 seeds.  Here is the tiebreaker table for #4 seeds for those teams:



Based on the tiebreaker, Vanderbilt, UCLA, and South Carolina join Auburn for #4 seeds.

For #5 through #8 seeds, the candidates are the not-yet-seeded teams ranked #1 through #49.  Although the data are limited since we have had those seeds for only a few years, the best indicator of which teams will get those seeds is a combination of teams' RPI ranks and their Top 60 Head to Head results ranks:



Using this table, the #5 seeds are Utah State, TCU, Georgetown, and Xavier.  The #6s are Virginia, Texas Tech, St. Louis, and Oklahoma State.  The #7s are Kentucky, Virginia Tech, West Virginia, and Liberty.  The #8s are Ohio State, Western Michigan, Fairfield, and Santa Clara.

For the remaining At Large selections, the candidates run up to RPI #57, producing the following initial table:


Here, Rutgers, Minnesota, Pittsburgh, Wisconsin, Georgia, and Texas are clear At Large teams, with 5 additional spots to fill.  Colorado, Tennessee, Creighton, Pepperdine, Dayton, BYU, Massachusetts, and Washington are candidates for those spots.  The remaining teams are not at large selections.  For At Large, the tiebreaker is a combination of teams' RPI ranks and their ranks in terms of their Common Opponent results in relation to other Top 60 teams.


Based on the tiebreaker, Pepperdine, Dayton, BYU, Washington, and Massachusetts get the last at large positions.

Based on the above, this produces the final compilation of seeds, Automatic Qualifiers, at large selections, and Top 57 teams not getting at large selections.  In the NCAA Seed or Selection column, the seeds are self-explanatory, the 5s are unseeded Automatic Qualifiers, the 6s are unseeded at large selections, and the 7s are Top 57 teams not getting at large positions.


What If the Committee Were Using the Balanced RPI?

If the Committee were using the Balanced RPI, which does not have the NCAA RPI's problem of discrimination in relation to conferences and regions, the following teams would drop out of the RPI Top 57:

Army (AQ) drops from #44 to #63; South Florida (no at large) from #43 to #65; Liberty (AQ) from #40 to #68; James Madison (AQ) from #47 to #70; Columbia (AQ) from #53 to #72; Grand Canyon (AQ) from #50 to #73; Ohio (AQ) from #57 to #94; and Fairfield (AQ) from #41 to #102.

The following teams would move into the Balanced RPI Top 57:

California climbs from #76 to #38; Loyola Marymount from #81 to #41; Baylor from #59 to #45; Utah Valley from #61 to #49; Utah from #84 to #50; Memphis from #63 to #51; Clemson from #64 to #55; Kansas from #73 to #57.

It's worth noting that these shifts primarily are teams from weaker conferences moving out of the Top 57 and teams from stronger conferences moving in.

In terms of actual at large changes, it is likely that Dayton, BYU, and Massachusetts would lose their predicted at large positions and would be replaced by Tennessee, Colorado, and California.