Measuring ARRA: Asking the Right Questions

July 22, 2010

Also in Volume 1, Issue 7:

State Chamber Educates on Standards & Reform

Congress Debates Education Funding

Monitoring and Evaluation Plans of Selected ARRA Programs

At-a-Glance: American Recovery and Reinvestment Act of 2009



Once American Recovery and Reinvestment Act (ARRA) funds are distributed to states and other grantees, they become very difficult to track. Mandatory quarterly reporting is limited to dollars spent and jobs saved or created, leaving out many of the details observers would like to know.

But according to a new ED report, a variety of information on how these programs are operating and what ARRA dollars are accomplishing should soon be available. Even more important, this summer presents an opportunity to weigh in on how the success of ARRA programs will be measured. (The full report is available at http://www2.ed.gov/policy/gen/leg/recovery/recovery-plans-2010.pdf.)

ARRA supports 21 programs within ED, some newly created and some existing programs expanded with ARRA funds. Although the new programs (the State Fiscal Stabilization Fund, Race to the Top, and Investing in Innovation) have understandably attracted most of the attention, ARRA has also prompted the Department to expand its monitoring of the programs that received additional funding, giving observers a rare opportunity to shape how those programs are judged.

ARRA Monitoring and Oversight

The Department of Education has established agency-wide teams to oversee implementation and reporting, and these teams are now developing monitoring plans and metrics. For example, the Department is drafting performance measures for the $4 billion Race to the Top program and the $3.545 billion Title I School Improvement Grants. Both of these large programs are intended to dramatically alter education and deserve well-thought-out agency and external oversight.

The Department is also developing the form for the report that states will be required to submit for the State Fiscal Stabilization Fund.

Presumably to lessen the chances of fraud and waste, the Department has adopted a risk-management monitoring strategy under which ARRA programs are examined “to identify concentrations of risk that will inform the targeting of ED’s technical assistance and oversight.” Risk factors include program size, program complexity, and prior audit findings.

However, these are compliance-based factors, unlikely to advance achievement or innovation. In keeping with this compliance focus, program-specific metrics tend toward reporting participation data. It may be up to the community to demand strong performance data.

Consider, for example, the State Longitudinal Data Systems funding. The Department proposes to count the number of states with a data system that includes all of the elements required by the America COMPETES Act. It could instead look at how those data systems are being used – for example, to provide achievement information to classroom teachers. This summer presents two opportunities: to press the Department to expand its expectations from compliance to performance, and to provide input on the actual metric – how many states must have a qualifying system for the program to count as a success.

The chart “Monitoring and Evaluation Plans of Selected ARRA Programs” in this issue summarizes the Department’s proposed monitoring and evaluation efforts. Note that while some programs have established public reporting websites, others have not.

ED has already made good progress on transparency around the Investing in Innovation (i3) program, at least with respect to posting information on applications. [See “Education Stimulus Report: Volume 1, Issue 6”] However, once the winners are announced next month, the focus will shift to how these funds are being spent and what can be learned from the funded programs. The chart below provides an overview of the outcomes that grantees will be required to track.

In and of themselves, however, the proposed measures seem unlikely to inspire innovation or justify future funding. Missing are some of the straightforward questions that could actually demonstrate effectiveness and innovation, such as:

  • How many programs improved student achievement by at least one grade level?
  • How many programs increased the achievement of English-language learners and students with disabilities – and by how much?
  • How many programs accelerated student learning so that students were on grade level?
  • How many programs increased graduation rates?
  • How many programs resulted in a lower dropout rate?
  • How many programs resulted in sustainable use of data in classroom settings?
  • How many programs appear replicable in other settings?
  • How much time is sufficient to improve outcomes?

With an investment this large, the business community must insist on additional meaningful measures that will guide education in the future, well beyond this specific program.

Additional Federal Oversight

The Government Accountability Office (GAO) and the Department of Education’s Office of Inspector General (IG) have also been involved in monitoring ARRA funds.

ARRA mandates that GAO review state and local use of stimulus funding from across all federal agencies. This review has included a focus on Department of Education programs and has resulted in several reports, most recently a review of two districts in North Carolina. Earlier this year, GAO provided Congress with recommendations on oversight of ARRA funds, including ways the Department of Education should strengthen its reporting requirements. These and other reports on ARRA can be found on GAO’s website at www.gao.gov.

ED’s Inspector General also has oversight of ARRA funds and was provided $14 million specifically to carry out this function. Its efforts so far have resulted in more than a dozen separate reports, on concerns ranging from proper state procedures for monitoring schools to more general issues related to state education spending. These reports can be found on the IG’s website at www2.ed.gov/about/offices/list/oig/recoveryact.html.

Investing in Innovation

Development Grants (up to $5M)

Short-Term Measures:
  • Percentage of grantees whose projects are being implemented with fidelity
  • Percentage of programs with ongoing evaluations that provide evidence of promise for improving student outcomes
  • Cost per student served

Long-Term Measures:
  • Percentage of programs with a completed evaluation with evidence of promise for improving student outcomes
  • Percentage of programs with a completed evaluation with information to facilitate further development, replication, or testing in other settings
  • Cost per student for proven promising strategies

Validation Grants (up to $30M)

Short-Term Measures:
  • Percentage of grantees that reach annual target number of students
  • Percentage of programs with evaluations that will provide evidence of student improvement
  • Percentage of programs with evaluations that provide high-quality implementation data
  • Cost per student served

Long-Term Measures:
  • Percentage of grantees that reach target number of students
  • Percentage of programs that complete evaluation providing evidence of student improvement
  • Percentage of programs that complete evaluation that provides information for replication
  • Cost per student for proven effective strategies

Scale-Up Grants (up to $50M)

Short-Term Measures:
  • Percentage of grantees that reach annual target number of students
  • Percentage of programs with evaluations that will provide evidence of student improvement
  • Percentage of programs with evaluations that provide high-quality implementation data
  • Cost per student served

Long-Term Measures:
  • Percentage of grantees that reach target number of students
  • Percentage of programs that complete evaluation providing evidence of student improvement
  • Percentage of programs that complete evaluation that provides information for replication
  • Cost per student for proven effective strategies

Download ICW's Education Stimulus Report (Vol. 1, Issue 7) (pdf)