No matter where you stand on the Admissions and Consumer Transparency Supplement (ACTS) survey component that the Trump Administration wants to add to IPEDS this year, we should all want it to produce trustworthy data. That is almost certain not to happen, not with the enormous number of new questions and data fields it will require colleges and universities to report on, not with the expectation that they will submit their reports in the next six months, and not with the survey being administered by an agency that laid off almost everyone who worked in NCES and everyone who worked on IPEDS.
The data that will come out of the ACTS survey is likely to be riddled with errors and missing information, which is bad enough in itself. What is even worse is that ACTS will place hundreds and hundreds of colleges making good faith efforts to complete a deeply flawed survey at risk for audits and punishment. In his Executive Order “Ensuring Transparency in Higher Education Admissions,” which led to the creation of ACTS, President Trump ordered the Secretary of Education to “increase accuracy checks of submitted data” and “take remedial action…if institutions fail to submit data in a timely manner or are found to have submitted incomplete or inaccurate data.” Without significant improvements and more time for institutions to prepare, ACTS will become a trap for institutions of higher education, forcing them to carry out an impossible task in an absurd amount of time and then punishing them for not succeeding.
The Trump Administration and the Department of Education should recall the major lesson from DOGE: good ideas don’t matter when you have a terrible process carried out by people who have no clue what they are doing. Most people understand the need to make the federal government more efficient, but ramming through huge changes without talking to experts and stakeholders does not just fail to deliver results; it makes things much worse. DOGE did irreparable harm to USAID, illegally cancelled thousands of research contracts, and fired the people who protect us from ebola and floods, all while missing the savings it promised by over $800 billion. It took the Trump Administration less than a year to see that DOGE was not working and to wind it down.
The White House and the Department of Education are about to DOGE IPEDS with the ACTS survey component. Increasing transparency in admissions is a good idea, but, like DOGE, the Department of Education is rushing major changes to existing systems and doing so with little understanding of how college admissions or the agency’s annual higher education survey work and with even less regard for the impossible situation they are putting almost 2,000 colleges and universities in. The ACTS survey will massively expand the administrative burden on colleges and universities that are already facing a host of financial pressures and now will face legal pressure to comply.
It’s not just the size of the task that is a problem. It’s the timing and survey components themselves. The Department of Education wants colleges and universities to undertake this huge data reporting project in less than six months, filling out thousands and thousands of new data fields that are poorly defined or are totally lacking in, well, data.
The size, timing, and technical problems with ACTS will make it impossible for any institution to accurately complete the survey, no matter how hard it tries. Wealthy private colleges and state flagship universities might be able to meet some of this challenge, but at hundreds and hundreds of smaller private colleges and regional public colleges the office of institutional research is in reality an officer of institutional research who, if they're lucky, might have some part-time help or a work-study student pitching in occasionally.
We need more data to better understand how college admissions works and how we can make it fairer, but those efforts depend on high-quality data we can trust. Resolving the technical problems I run through below will take time, which is why the Trump Administration should delay implementing ACTS until 2026 at the soonest.
The Trump Administration Wants Colleges to Complete Nearly 70,000 New Undergraduate Data Fields in IPEDS in 2025-26
The ACTS survey component will add more than 100 new questions about undergraduate admissions, financial aid, and outcomes to the IPEDS survey as well as thousands of new fields to complete. The massive addition of data fields is due to the Trump Administration’s desire for data disaggregated by race, sex, income, parental education, test scores, GPA, and more (although not by legacy status). According to my estimates, the Department of Education is asking colleges to complete more than 11,000 new data fields every year. The table below attempts to show how I arrived at that total.
Gathering the raw data, conducting the statistical analysis, and reporting this much new data to the federal government this year sounds ridiculous enough, until you recall that institutions are being asked to do the same thing for the previous 5 years as well. Colleges and universities will need to provide data for almost 70,000 new undergraduate fields.
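The arithmetic behind totals like these is straightforward multiplication across disaggregation categories. Here is a minimal sketch of that multiplication, using illustrative category counts of my own rather than the survey's actual layout:

```python
# Illustrative sketch (not the actual ACTS worksheet): how disaggregation
# multiplies data fields. The category counts are assumptions for illustration.
RACE = 9        # current IPEDS racial/ethnic categories
SEX = 2         # male / female under the new scheme
QUINTILES = 5   # test-score or GPA quintiles
STAGES = 3      # applicants, admits, enrolled

# Crossing the categories yields the field count for a single metric
fields_per_metric = RACE * SEX * QUINTILES * STAGES
years = 6       # the current year plus five years of historical reporting

print(fields_per_metric)          # 270 fields for one metric in one year
print(fields_per_metric * years)  # 1620 fields for one metric across all years
```

Each additional metric, and each additional disaggregation axis, multiplies the total again, which is how the survey climbs into the tens of thousands of fields.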
And that is only half of it. ACTS will also require new data reporting on graduate programs. The details have not been announced, but this requirement will likely double the number of new fields to be completed. It may amount to the equivalent of adding a whole other IPEDS survey. At the very least, the Department of Education should delay historical reporting until institutions have completed a couple of successful cycles of single-year reporting in ACTS.
ACTS will be even harder if NCES does not resolve a wide range of technical problems, some of which I discuss below.
Technical Problems with ACTS
In order to answer a question, you need to know what it means. In order to conduct a reliable survey, you need everyone answering it to agree on what the questions mean and how they should be answered. You also need to have the data available to answer them. ACTS fails on all these counts.
There are multiple new data elements in ACTS, all of which demand guidance from NCES to guarantee that institutions complete the data fields correctly and uniformly. Guidance is not likely to appear until after the mandatory sixty-day comment period on the proposed changes closes, after the understaffed Department of Education and National Center for Education Statistics (NCES) respond to comments, and after the guidance and survey components are created and revised. That means institutions will probably not see the new ACTS survey questions or any guidance until late October at the soonest. The IPEDS Admissions survey typically opens in December and closes in February.
Below I break the technical problems with the ACTS survey components into two groups: poorly defined data fields and fields for which there is little or no data available to colleges and universities. This analysis is not meant to be a comprehensive list of technical problems. I suspect there are many more. I am looking for institutional research officers, particularly at smaller institutions, to talk to about the problems they have spotted in the survey. Contact me here.
Poorly Defined Data Fields
Undergraduate Students
We are off to a bad start when it is not clear what comprises the most basic element of the ACTS survey component. The announcement of planned changes to IPEDS in the federal register mentions “undergraduate students” in the context of admissions and “newly enrolled undergraduate students” in the context of financial aid, which suggests that the admissions component will include all undergraduate students. IPEDS currently collects a wide range of data on all enrolled undergraduates at institutions, but the Admissions survey is limited to first-time students for the obvious reason that admission is the point of entry for everyone except transfers, who are covered by a separate survey. The announcement fails to mention transfers and transfer admissions, let alone address how they should be included in ACTS.
Including all undergrads in an admissions survey would be a break from current practice and add significantly to the administrative burden. Each year’s reported data would entail collecting what is in fact at least four years of admissions data, although at most institutions significant numbers of undergraduates take five, six, or more years to complete a degree. If institutions are required to provide admissions data for all undergraduates going back to academic year 2020-21, they will in fact need to pull data from 2015 or even earlier. Has the Department of Education conducted any research to determine how likely institutions are to have data going back a decade or more?
For what it’s worth, including admissions data for all enrolled undergraduates will actually undermine the Trump Administration’s stated desire to evaluate institutional response to the SFFA decision on admissions practices, since it will be several years before we see entire student bodies admitted post-SFFA. Limiting the admissions components of the survey to first-time students would reduce the administrative burden imposed by ACTS and improve the integrity of the data.
Race
Currently, there are nine racial/ethnic groups included in IPEDS reporting, including categories for nonresidents and no race reported. It is unclear whether ACTS data will be disaggregated by all nine groups, but it should be if the Department of Education wants to avoid large gaps in reporting.
This disaggregation will get more complicated in a couple years. IPEDS is scheduled to make some significant changes to its reporting of race and ethnicity in 2027. In addition to adding a “Middle Eastern or North African” category, the survey will no longer separate “Hispanic or Latino” ethnicity out from race, and it will begin reporting more granular data on students who identify with two or more racial and ethnic categories. For example, a student who identifies as Asian and White will be identified in the “Two or More Races” category, but they will also likely be identified as “Asian and White” within that category. (The details have not been finalized). For the first time, students who identify as Hispanic and one or more other races will also be counted in the “Two or More Races” category, not just as “Hispanic or Latino.” These changes will make race-based comparisons over time much harder, but, to be fair, that will be the case whether or not ACTS becomes a reality. What’s more salient is that these changes in reporting, particularly around people who identify as two or more races, will multiply the burden on institutions by significantly expanding the number of disaggregations they will need to carry out on the data.
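The scale of that multiplication is easy to underestimate. A short sketch, using an illustrative category list since the final 2027 scheme is not settled, shows how many distinct multi-group combinations a fully granular "Two or More Races" category could require:

```python
# Sketch of the combinatorial expansion under the planned 2027 race/ethnicity
# changes. The seven base categories below are illustrative; the final list
# and reporting rules have not been finalized.
from itertools import combinations

base = ["AI/AN", "Asian", "Black", "Hispanic", "MENA", "NH/PI", "White"]

# Every specific combination of two or more groups is a potential
# reporting category within "Two or More Races"
multi = [c for k in range(2, len(base) + 1) for c in combinations(base, k)]

print(len(base))   # 7 single-group categories
print(len(multi))  # 120 possible multi-group combinations
```

Even if NCES only reports the most common pairings, each one is another row that institutions must tabulate, and another axis to cross with sex and every ACTS metric.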
There is, too, the problem that ACTS could create an incentive for institutions not to ask about race and for applicants not to report it. In the wake of the SFFA decision, there were signs that the number of applicants to highly selective institutions who did not share their race increased. (The Common Application did not find this trend across the entire applicant pool, however.) If that trend grows among applicants or if more institutions stop asking students to check a box on race, it will undermine the quality of the data and limit its usefulness. It would be smart for the Department of Education to conduct some analysis of the potential impact of ACTS on non-reporting and non-recording of race in college applications before disaggregating sex/race pairs by so many components in IPEDS.
Sex
In 2021, IPEDS added “gender unknown” and “another gender” to “men” and “women” as categories under “gender.” This year “gender” will be replaced with “sex,” and the only two options will be “male” and “female.” It is unclear how colleges will report data from the past five years on sex without using “unknown” and “another” as categories. Guidance is necessary here.
Quintiles
The ACTS survey component will use test score and GPA quintiles to further disaggregate data, even though IPEDS has not, to my knowledge, asked institutions to break down any data categories into fifths. The current Admissions survey component asks institutions to report the 25th, 50th, and 75th percentiles of SAT and ACT scores for all students for whom a test score was used in the admissions decision. It also asks them to report the number and percentage of first-time students that submitted a test score. ACTS wants institutions to further disaggregate student counts of applicants, admits, and enrollments, already disaggregated by sex and race, by test score and GPA quintiles, but the announcement does not explain what quintiles institutions are meant to reference here. Are these quintiles for each disaggregated group in the pool, for the total pool, or for the nation? National quintiles do not exist for GPA, and institutional research offices cannot be expected to keep abreast of test score percentiles as they change each year, so presumably this field is meant to cover institutional pools, but at which level? Nor is the survey consistent with its terminology. In the description of plans to collect data on graduation rates, the ACTS announcement talks about high school GPA ranges, rather than quintiles. It will be necessary for IPEDS to provide guidance on what it wants collected by quintiles, and NCES should give institutions at least a year to prepare for this new data collection.
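The reference-pool problem described above is not a quibble; the same student can fall into different quintiles depending on which pool the boundaries are computed over. A small sketch with invented SAT scores makes the ambiguity concrete:

```python
# Sketch of the quintile ambiguity: boundaries differ depending on which pool
# you compute them over. All scores below are invented for illustration.
from statistics import quantiles

all_applicants = [1050, 1100, 1150, 1200, 1250, 1300, 1350, 1400, 1450, 1500]
one_subgroup   = [1300, 1350, 1400, 1450, 1500]

# quantiles(n=5) returns the four cut points that split a pool into fifths
pool_cuts     = quantiles(all_applicants, n=5)
subgroup_cuts = quantiles(one_subgroup, n=5)

print(pool_cuts)      # [1110.0, 1220.0, 1330.0, 1440.0]
print(subgroup_cuts)  # [1310.0, 1370.0, 1430.0, 1490.0]
# A 1350 sits in the 4th quintile of the whole pool but the 2nd quintile
# of the subgroup -- which is why guidance on the reference pool matters.
```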
GPA
When I was discussing the proposed changes to IPEDS with an admissions dean at a highly selective college, one of the things he wanted to talk about was the heavy emphasis on GPA, an emphasis he and his colleagues at peer institutions do not share. Although high school students and economists might obsess over GPAs, admissions offices at more selective institutions tend to pay little attention to the number. They care about good grades in rigorous academic classes, not a single number that is calculated in so many ways as to render it meaningless. At some high schools, the GPA is calculated using all courses; at others it only includes core academic courses. Some high schools include freshman year grades; others do not. I recently saw a transcript from a public high school in San Diego that provided three GPAs, not one of which matched the GPA that the University of California admissions system would calculate using its A-G method.
The days in which a perfect GPA is a 4.0 are long gone. Some private schools no longer calculate GPA. Many high schools weight course grades based on their academic rigor, so an A in a basic math class might be worth 4 points but an A in an AP class might be worth 5 points. At Thomas Jefferson High School in Virginia, senior GPAs ranged in 2023 from 3.255 to 4.663. The Denver public school system uses a 5.2-point scale. Stuyvesant uses a 100 point scale. Phillips Exeter Academy uses an 11-point scale. Phillips Academy in Andover, MA, uses a 6-point scale. If even the Phillipses can’t agree on how to calculate GPA, what hope is there for the thousands of high schools across the nation to standardize the practice?
This mess of GPA numbers will lead to absurd findings at some institutions. When this same admissions dean calculated the average GPA for their most recent class, which enrolled students from across the nation and the globe, the average self-reported GPA came out to a 19. That’s not a typo. It was a 19.
For GPA to have any meaning or value as a data point, the Department of Education needs to provide guidance on how admissions offices should standardize the calculation of GPA across every school in the US and abroad. That calculation would not only add to the administrative burden on institutions, it would be of almost no value to admissions offices, since many of them do not in fact really care about GPAs and simply record a self-reported GPA in their files.
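Even the simplest imaginable "standardization" illustrates how hard this guidance would be to write. The naive sketch below (my illustration, not a proposed standard) linearly rescales each school's GPA onto a 0-4.0 scale, which already requires knowing every school's maximum, and which weighted scales break outright:

```python
# A naive rescaling sketch: linearly map each school's GPA scale onto 0-4.0.
# This is illustrative only -- it erases rigor weighting and course selection,
# the things selective admissions offices actually care about.
def rescale_gpa(gpa: float, scale_max: float) -> float:
    """Linearly map a GPA from a school's nominal maximum onto a 4.0 scale."""
    return round(gpa / scale_max * 4.0, 2)

print(rescale_gpa(96.0, 100.0))  # Stuyvesant-style 100-point scale -> 3.84
print(rescale_gpa(5.0, 5.2))     # Denver-style 5.2 scale -> 3.85
print(rescale_gpa(9.5, 11.0))    # Exeter-style 11-point scale -> 3.45

# A weighted 4.663 on a nominal 4.0 scale breaks the mapping entirely:
print(rescale_gpa(4.663, 4.0))   # -> 4.66, above the supposed maximum
```

Any real guidance would have to handle weighted scales, missing scale maxima, schools that report no GPA at all, and international grading systems, which is exactly why this cannot be improvised in six months.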
Test Scores
ACTS will require institutions to report test scores by quintiles by sex-race pairs, but it is not clear which test scores will be used. IPEDS currently reports the 25th, 50th, and 75th percentile scores for enrolled first year students for the following tests:
- SAT Math
- SAT Reading and Writing
- ACT Composite
- ACT Math
- ACT Reading
ACTS should use these same five scores in order to keep historical comparisons in place, but disaggregating all five of them by race will significantly increase the amount of data being reported to IPEDS. It is also unclear whether other tests used in admissions, such as the Classic Learning Test, will be included in data reporting.
Parental Education
Parental education is an important category in college admissions. Many admissions offices rightly recognize that having a parent with a bachelor’s degree or higher provides a range of advantages to an applicant. To compensate for this advantage, some institutions provide an admission preference for so-called first generation applicants, whose parents have not attained a bachelor’s degree. It makes good sense to disaggregate admission data by parents’ highest level of education, as ACTS plans to do, but this survey question will be of little value if NCES does not provide clear guidance on how institutions should ask about and report this data point. A recent Common Application study determined that, “depending on the exact definition of first-generation status used, the number of first-generation applicants on the Common App in 2022 can vary from 304,338 to 709,850.”
I have assumed that this question will include a range of responses and not just be a simple binary, such as first gen or not, but how many responses will it include? In its reporting on parental education level, NCES uses seven categories for highest level of education, based on the American Community Survey, as does the University of California application. The Common Application uses ten school types plus eight degree types to categorize parental education. It is unclear whether ACTS plans to ask for both parents’ highest level of educational attainment or just the parent with the highest level. What if that parent is deceased or otherwise absent from an applicant’s life? What if the parents earned their degrees abroad? Clearly defining this category and providing time for institutions to modify their applications to reflect these definitions will be necessary to make this data meaningful.
Average Cost of Attendance
ACTS will collect the “average cost of attendance” and disaggregate it by test scores, GPA, family income, and form of enrollment. I put this category in quotation marks because the terminology is muddled here. Presumably, what the Department of Education wants here is the average net price, not the cost of attendance. These are technical terms that are already used by IPEDS and the College Scorecard, which defines the “average cost of attendance” (COA) as the sum of “tuition and fees, books and supplies, and living expenses for all full-time, first-time, degree-/certificate-seeking undergraduates who receive Title IV aid.”
At most institutions, most students will not in fact pay the full COA. They will receive grants and discounts that lower it, which is why higher education experts focus much more on the net price students pay, which is “derived from the full cost of attendance…minus federal, state, and institutional grant/scholarship aid, for full-time, first-time undergraduate Title IV-receiving students.” Net price is almost certainly what the Department of Education wants to collect, because it makes no sense to disaggregate COA: that number is essentially the same for all students, apart from some variation depending on whether they live on or off campus. The agency needs to clarify that what it actually wants here is net price, not cost of attendance.
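The distinction is easy to see with numbers. A sketch with invented figures shows why disaggregating COA is pointless while disaggregating net price is not:

```python
# Sketch of the COA / net price distinction. All figures are invented.
coa = 62_000  # sticker price: tuition/fees + books + living expenses

# Grant/scholarship aid received by five hypothetical Title IV students
grants = [0, 15_000, 30_000, 45_000, 55_000]

# Net price = cost of attendance minus grant/scholarship aid
net_prices = [coa - g for g in grants]
avg_net_price = sum(net_prices) / len(net_prices)

# COA is the same for everyone; net price varies widely across students,
# which is what makes it worth disaggregating.
print(net_prices)      # [62000, 47000, 32000, 17000, 7000]
print(avg_net_price)   # 33000.0
```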
It also needs to clarify just which students it wants this data for. As with the admissions questions in ACTS, the language suggests that data will be collected for all undergraduates, since the last sentence of the outcomes paragraph notes that “additional data may be gathered to better understand remedial or other non-credit coursework for newly enrolled students.” As seen in the discussion above, cost of attendance and net price are currently calculated for “full-time, first-time undergraduate Title IV-receiving students,” i.e., full-time freshmen who get federal financial aid. Let’s leave aside for now the issue that this calculation leaves out potentially large portions of the student body who do not get federal aid. If the Department of Education actually does want to calculate net price for all undergraduates, it will need to clarify that it is asking institutions to create a new calculation alongside the existing reporting of net price by income range. It will also need to clarify how many cohorts this calculation will cover.
Average Cumulative College GPA at the End of the Year
ACTS will ask colleges and universities to report and disaggregate the year-end GPAs of enrolled students overall and by race and sex pairs. Once again, we run into the problem of whether this category will be calculated for all undergraduates or just first-time students. If it includes all undergraduates, then it will be hard to disentangle pre- and post-SFFA outcomes. The average should probably be weighted to reflect different cohort sizes and the likelihood that cumulative GPAs for upperclassmen will trend higher since they will not include students who stopped out in earlier years for academic or other reasons. Will transfer students be included in this average? Will their GPA include grades from their previous institutions?
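The weighting point is worth making concrete. With invented cohort sizes and GPAs, a simple mean of cohort averages diverges from the enrollment-weighted mean precisely because cohorts shrink through attrition and survivors' GPAs trend higher:

```python
# Sketch of why weighting matters for a year-end average GPA.
# Cohort sizes and GPAs below are invented for illustration.
cohorts = [
    # (class level, enrolled students, average cumulative GPA)
    ("freshman",  1000, 3.10),
    ("sophomore",  900, 3.25),
    ("junior",     850, 3.35),
    ("senior",     800, 3.45),
]

# Unweighted: treats a shrinking senior class the same as a large freshman class
simple_mean = sum(gpa for _, _, gpa in cohorts) / len(cohorts)

# Weighted: each cohort contributes in proportion to its enrollment
weighted_mean = (
    sum(n * gpa for _, n, gpa in cohorts) / sum(n for _, n, _ in cohorts)
)

print(round(simple_mean, 3))    # overstates the campus-wide average
print(round(weighted_mean, 3))  # the enrollment-weighted figure is lower
```

Without guidance on weighting, two institutions with identical students could report different "average cumulative GPAs."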
Graduation Rates and Graduates’ Final GPA
The plan to collect “graduation rates further disaggregated by admission test score quintiles and ranges of high school grade point average” as well as “graduates’ final cumulative grade point average” not only inherits all the problems with test score and high school GPA collection but also creates new ones. Will this be a 4-year, 6-year, or 8-year graduation rate? The College Scorecard uses an 8-year graduation rate. Will this graduation rate include students who transfer into an institution? How will it treat students who transfer out? These questions need to be resolved, and guidance must be provided.
Fields With No Data or Missing Data
Family Income and Pell Eligibility
The impulse to disaggregate admissions data by income and Pell Grant eligibility is a good one, but any effort to do so must start with a recognition that institutional access to this financial data is limited at the point of enrollment, extremely limited at the point of admission, and virtually absent at the point of application. Whatever information institutions have and report to the Department of Education about a student’s household income or Pell eligibility comes from the FAFSA. Institutions will not have FAFSAs for very many or quite likely most applicants. The situation will be only slightly better for admitted students. Given this lack of data, it makes little sense for ACTS to disaggregate applicants and admits by income or Pell status.
Even with enrolled students, at many institutions a significant portion of the class will never complete a FAFSA, which means institutions will have no data on their household income. The same will be true for international students, who of course do not fill out a FAFSA because they are not eligible for federal financial aid. At the 12 Ivy Plus institutions, anywhere from half to three-quarters of the freshman class does not receive federal aid of any kind.
IPEDS currently reports net price across five ranges of family income, but only for enrolled students who completed the FAFSA. Will ACTS include a sixth group, FAFSA non-completers, in this data collection, or will it leave the non-completers out of the reporting altogether? Of the two options, the former would be better, but disaggregating by non-completers would be an entirely new data field, and institutions would need time and guidance to begin including it in their reporting.
Test scores
During the height of COVID, admissions exams were much less available and almost every institution in the nation made reporting test scores optional out of necessity. Today, more than 80 percent of colleges and universities remain test optional, which means that they will not have any test scores for some portion of their students, or test blind, which means that they will have test scores for none of their students. It is not uncommon at even very selective institutions for a third to half of the enrolled students to have not submitted a test score. Will ACTS include non-submitters as an additional category when it disaggregates by test score quintiles? At the very least, the new survey should report the portion of applicants, admits, and enrollments who submitted no test score as part of their application and disaggregate that data by race and sex.
Privacy and Suppressed Data
One of the inevitable effects of disaggregating data to the degree that ACTS will is that the resulting group counts will be so small that they risk revealing personally identifiable information about an individual student. For instance, disaggregating data for the three Native American/Alaskan Native students in the 2024 freshman Harvard class would run a strong chance of revealing sensitive information about those students. The same could well be true for all 24 of the Native undergraduates at Harvard last year, once this group is disaggregated by sex and, say, early action enrollment. Colleges and universities have a legal obligation to protect each student’s privacy under the Family Educational Rights and Privacy Act (FERPA). The Higher Education Opportunity Act of 2008 permits such disclosures only “if the number of students in subgroups is sufficient to yield statistically reliable information and reporting will not reveal personally identifiable information about an individual student. If such number is not sufficient for such purposes, then the institution shall note that the institution enrolled too few of such students to so disclose or report with confidence and confidentiality.” Will ACTS suppress data for small subgroups? What will be the threshold for suppression?
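Small-cell suppression of this kind is a standard practice in statistical reporting, and it is simple to sketch. The threshold and counts below are assumptions of mine; ACTS has not specified any:

```python
# Sketch of small-cell suppression: cells below a minimum count are masked
# before release to avoid revealing identifiable students. The threshold of 10
# and the counts are hypothetical -- ACTS has announced no suppression rule.
THRESHOLD = 10  # hypothetical minimum cell size

cells = {
    ("Am. Indian/Alaska Native", "male"): 1,
    ("Am. Indian/Alaska Native", "female"): 2,
    ("Asian", "male"): 412,
    ("Asian", "female"): 389,
}

released = {
    group: (count if count >= THRESHOLD else "suppressed")
    for group, count in cells.items()
}

print(released[("Am. Indian/Alaska Native", "male")])  # suppressed
print(released[("Asian", "male")])                     # 412
```

The more axes ACTS crosses (race by sex by quintile by stage), the more cells fall below any reasonable threshold, and the more of the released data will consist of suppression marks rather than numbers.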
ACTS Got 11,000 Data Fields, but Legacy Ain’t One
Even with this huge expansion of data and transparency around admissions, financial aid, and outcomes, the Department of Education has ignored some of the most important factors in admission to highly selective colleges and universities. Opportunity Insights observed in a recent study that 24% of the admissions advantage that the very rich have at elite universities is explained by the recruitment of athletes; 46% is driven by preferential admission of legacies, who are five times as likely to be admitted as the average applicant with similar test scores, demographic characteristics, and admissions office ratings; and the remaining 31% of the admissions advantage comes from attending expensive private high schools. The plaintiffs in SFFA and their expert witnesses recognized that legacy preferences and athletic recruiting played such a profound role in determining who got into Harvard that they had to exclude them from their analysis; keeping them in their models would have distorted the effect of race-conscious admissions policies on admissions decisions. The Trump Administration has been calling on universities to admit students based on merit, not ancestry, and yet it has left the most egregious example of basing admissions on ancestry rather than merit, legacy preferences, out of this data collection. As a result, legacy preferences and athletic recruiting will distort the findings, but no one will be able to see their effect. NCES must include disaggregated data for legacy admissions and for athletic recruits in ACTS if it wants a fuller picture of the college admissions process and if it wants to make that process more meritocratic.
The Department of Education Needs to Think before it ACTS
Given the serious technical problems with ACTS and the absurdly compressed timeline for requiring institutions to complete more than 32,000 data fields that are poorly defined and/or impossible to complete, the Department of Education has no choice. It must delay the ACTS survey in its entirety until 2026-7 at the soonest.
The Trump Administration and the Department of Education reached a similar conclusion earlier this year, when it extended the reporting deadline for the much less burdensome financial value transparency and gainful employment (FVT/GE) regulations by seven months to ensure institutions had the time to complete their reporting accurately. These regulations, passed by the Biden Administration, had similar problems to ACTS with guidance and timing, although institutions already had more than a year to prepare for the reporting and were ultimately given a second year thanks to the Trump Administration’s extension.
Rushing ACTS through will lead to bad data, put hundreds of colleges in a precarious legal position, and ultimately undermine the Trump Administration’s own interest in identifying colleges and universities it believes are violating civil rights laws and disregarding the SFFA decision. Any institution the administration accuses of violations will have strong legal grounds to challenge the accusation based on the reckless process under which ACTS is being foisted on colleges and universities. Many of DOGE’s efforts to move fast and break things were rejected in the courts because of that process.
Some might believe the best course would be to let ACTS proceed as planned and let it fail, but I think that is a mistake. We need more transparency in college admissions, but we need to be able to trust the data, which is why the Department of Education should pause ACTS for at least a year.
If you want to submit a comment on the proposed changes to IPEDS, you can do that here. You have until Oct 14, 2025.