George Washington University significantly overstated the academic credentials of students who entered the school in fall 2011, reporting that 78 percent had graduated in the top 10 percent of their high school class when only 58 percent had done so, university officials disclosed Thursday.

The overstatement, which GWU officials called the result of a flaw in the school’s reporting methods dating back more than a decade, inflated a key variable used to calculate the university’s standing in national rankings.

Lists compiled by U.S. News & World Report and other analysts, which seek to give an objective measure of a university’s excellence and value, have become a driving force in higher education in recent decades. The rankings often influence how students choose colleges. They also have come under fierce criticism from academic leaders who say that ranking universities in such a way is a fool’s errand and that schools themselves suffer when they chase prestige.

In September, U.S. News & World Report ranked the D.C. university 51st among national universities based in part on data gathered for the fall 2011 freshman class. It is unclear whether that ranking is in error, given Thursday’s disclosure. GWU officials said they have delivered corrected class-rank data to U.S. News.

“I deeply regret this error and want to assure you that corrective action has been taken and safeguards put into place to prevent such errors from occurring in the future,” GWU President Steven Knapp told the university community in a statement.

GWU ranked 50th on the U.S. News list last year, a point of pride for a school whose reputation is on the rise. Whether that year’s ranking — or other previous rankings — would be affected by the data problem is also unknown.

With its disclosure, GWU joined other prominent schools that have publicly acknowledged reporting flawed data about the achievements of their students. Claremont McKenna College in California and Emory University in Georgia acknowledged this year that school officials had inflated SAT scores of incoming freshmen in key reports.

Ultimately, U.S. News concluded that the 2011-12 rankings of the two schools weren’t affected. Robert Morse, data research director for U.S. News, could not be reached late Thursday to comment on the GWU disclosure.

Student selectivity is a key part of the formula U.S. News uses to rank schools, accounting for 15 percent of a school’s overall score. To calculate selectivity, U.S. News checks SAT and ACT admissions test data (50 percent of the selectivity score), class rank data (40 percent) and acceptance rates (10 percent).
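As a rough illustration of how those weights combine — not U.S. News’s actual methodology, which normalizes each input against other schools before weighting — the arithmetic can be sketched as follows. The input values here are hypothetical placeholders, not real scores.

```python
# Illustrative sketch of the U.S. News selectivity weighting described above.
# The real methodology normalizes each input relative to other schools;
# the numbers passed in below are hypothetical, for demonstration only.

def selectivity_score(test_score, class_rank, acceptance):
    """Weighted selectivity component: admissions tests 50%,
    class rank 40%, acceptance rate 10%."""
    return 0.5 * test_score + 0.4 * class_rank + 0.1 * acceptance

# Hypothetical inputs on a 0-100 scale, before and after the correction
# (class rank dropping from 78 to 58, other inputs held fixed):
before = selectivity_score(test_score=80, class_rank=78, acceptance=70)
after = selectivity_score(test_score=80, class_rank=58, acceptance=70)

# Selectivity itself counts for 15 percent of the overall score,
# so the shift in the overall score is 0.15 times the difference.
overall_shift = 0.15 * (before - after)
print(before, after, overall_shift)
```

Under these assumed inputs, the 20-point class-rank correction moves the selectivity component by 8 points, and the overall score by 15 percent of that.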

Other factors in the overall U.S. News calculations include the academic reputation of schools as determined in surveys, graduation and retention rates, financial resources, faculty resources and alumni giving.

GWU officials said the class-rank problem came to light during an administrative reorganization under Steven R. Lerman, the provost and executive vice president for academic affairs, who has held that position since July 2010.

Forrest Maltzman, senior vice provost for academic affairs, said Lerman asked him last summer to examine how the university collects and reports admissions data.

Maltzman said that some experts he consulted within the university expressed concern about the class-rank data because more and more high schools have declined to give universities information about which graduates are in the top 10 percent.

Closer examination of the university’s data procedures found that GWU staff often would estimate whether a strong student was in the top 10 percent of a high school class. Maltzman said those internal estimates, combined with actual class-ranking information, were then used to answer questions for what is known as the “common data set,” which collects standardized information for major publishers of college guidebooks. The procedure, he said, traced back to the late 1990s.

GWU officials said the university hired an auditing firm, Baker Tilly Beers & Cutler, to review the school’s records on student selectivity. The auditor, they said, found no problems outside of the class-rank data.

Maltzman said no evidence has been found that the “error” was intentional. He said personnel at the university “have been held accountable,” but he declined to elaborate. Kathryn Napper has been GWU dean of undergraduate admissions since 1997. The university declined to make her available to comment.

Henry R. Broaddus, dean of admission at the College of William and Mary, said “there is heightened scrutiny” of admissions data this year at all colleges following the disclosures at Claremont McKenna and Emory. He said that questions about class ranking have multiplied as more high schools have declined to provide such information. Fairfax County schools, for example, do not give colleges any information about how their students rank among their high school peers, Broaddus said.

Broaddus said the swing of 20 percentage points in GWU’s data — from reporting that 78 percent of incoming students were in the top 10 percent of their high school class to 58 percent — appeared to be significant. “It does strike me as a dramatic discrepancy,” Broaddus said.