MANAGING FOR RESULTS:
The Performance Management Playbook for
Federal Awarding Agencies
April 2020
VERSION I
The practices in the PM Playbook are shared informally for the purpose of peer-to-peer technical assistance, stakeholder engagement, and
dialogue. This document is not official OMB guidance, and is not intended for audit purposes. Implementation of the practices discussed may
differ by agency and applicable legal authority.
Executive Summary
In coordination with the Chief Financial Officers Council (CFOC) and the Performance Improvement Council (PIC), the President’s Management Agenda (PMA) Cross-Agency Priority (CAP) Goal, Results-Oriented Accountability for Grants, Performance Workgroup is proud to release version I of “Managing for Results: The Performance Management Playbook for Federal Awarding Agencies” (PM Playbook).
Playbook Purpose. The purpose of the PM Playbook is to provide Federal awarding agencies
with promising practices for increasing their emphasis on analyzing program and project results
as well as individual award recipient performance, while maintaining, and where possible
minimizing, compliance efforts. Some ideas in the PM Playbook are reflected in the proposed
revisions to Title 2 of the Code of Federal Regulations (2 CFR) for Grants and Agreements. As a
playbook, this document is not Office of Management and Budget (OMB) guidance but a
resource for Federal awarding agencies as they continue efforts to improve the design and
implementation of Federal financial assistance programs for awards. This first version of the PM
Playbook is released with the intent to engage stakeholders on practices and principles for
improving performance to help shape the Federal strategy in this area and to influence future
revisions to 2 CFR. Importantly, the PM Playbook represents the Federal government’s shift toward performance and a focus on results. Subsequent versions of the PM Playbook will be released as organizational learning occurs in implementing the practices and concepts outlined in the document.
Shifting the Grants Management Paradigm. Federal awarding agencies are encouraged to
begin to make a paradigm shift in grants management from one heavy on compliance to a more
balanced approach that includes establishing measurable program and project goals and
analyzing data to improve results. This effort supports the President’s Management Agenda
(PMA), which seeks to improve the ability of agencies to “deliver mission outcomes, provide
excellent service, and effectively steward taxpayer dollars on behalf of the American people.” [1]
To track and achieve these priorities, the PMA leverages Cross-Agency Priority (CAP) Goals,
including the Results-Oriented Accountability for Grants CAP Goal (Grants CAP Goal). The
purpose of the Grants CAP Goal is to “maximize the value of grant funding by applying a risk-
based, data-driven framework that balances compliance requirements with demonstrating
successful results.” [2]
Strategy for Getting There. The Grants CAP Goal has four strategies, including one dedicated
to “achieving program and project goals and objectives.” The objective of this strategy is to
demonstrate advancement toward or achievement of program goals and objectives by focusing
[1] The President’s Management Agenda website: https://www.performance.gov/PMA/PMA.html
[2] The President’s Management Agenda, Results-Oriented Accountability for Grants website: https://www.performance.gov/CAP/grants/
on developing processes and tools to help Federal awarding entities improve their ability to
monitor program and project performance, improve award recipient performance, and ultimately
demonstrate to the American taxpayer that they are receiving value for funds spent on grant
programs. The PM Playbook is a product of that effort. An interagency workgroup is charged
with achieving the goals of this strategy, including the development of the PM Playbook. [3] The
intended audience for the PM Playbook is Federal awarding agencies that provide Federal
financial assistance (grants and/or cooperative agreements) to non-Federal entities. While
Federal statutes require compliance activities to be upheld, awarding agencies’ focus on
compliance often overshadows the importance of examining performance results on a recurring
basis during the grant period of performance and immediately after awards are completed.
Agencies often have difficulty showing that Federal dollars are spent wisely and that those
dollars have the intended impact and produce value to the taxpayer. See Figure 1: Balancing
compliance and performance to achieve results.
Figure 1: Shifting the balance from compliance toward performance to achieve results [4]
To assess program impact, agencies are encouraged to establish clear program goals and
objectives, and measure both project and individual award recipient progress against them.
Applied in the context of Federal financial assistance awards, the practices identified in this
playbook are informed by and complement the work of the Performance Improvement Council
(PIC), which has focused on institutionalizing performance management as a key management
discipline and capability more broadly within the Federal government following enactment of the
Government Performance and Results Modernization Act (GPRA). The PM Playbook is one of
several recent administration efforts to modernize the Federal grants management process by
strengthening the Federal agency approach to performance. Some of these additional efforts
include, but are not limited to: the Grants Management Federal Integrated Business Framework
[3] An interagency work group representing nearly 20 Federal agencies designed the PM Playbook.
[4] Figure developed by authors of the PM Playbook.
(FIBF), the Reducing Federal Administrative Burdens on Research Report, the Performance
Improvement Council’s Goal Playbook, and proposed revisions to the “Uniform Administrative
Requirements, Cost Principles, and Audit Requirements” in Title 2 of the Code of Federal
Regulations, Part 200 (2 CFR 200). [5][6][7][8]
As part of on-going efforts to continue the dialogue on this topic and develop future iterations of
this work, the Grants CAP Goal Performance Workgroup is looking to hear from stakeholders at
[email protected]OP.Gov with any comments, suggestions, and examples of success to be
considered in future iterations of the PM Playbook.
[5] The Federal Integrated Business Framework (FIBF): https://ussm.gsa.gov/fibf/
[6] Reducing Federal Administrative Burdens on Research Report: https://www.whitehouse.gov/wp-content/uploads/2018/05/Reducing-Federal-Administrative-and-Regulatory-Burdens-on-Research.pdf
[7] Performance Improvement Council’s Goal Playbook: https://www.pic.gov/goalplaybook/
[8] Federal Register Notice for the Proposed Revisions to Title 2 of the Code of Federal Regulations: https://s3.amazonaws.com/public-inspection.federalregister.gov/2019-28524.pdf
TABLE OF CONTENTS
I. Introduction
   Definitions
   Federal Laws and Regulations
II. Performance Management Basics
   Programmatic Performance Management Principles
   Risk Management and Performance Management
   The Federal Grants Lifecycle
III. The Performance Management Approach for Grants
   Phase 1: Program Administration
      Program Design Steps
      Notice of Funding Opportunity (NOFO)
      Performance Management Requirements
   Phase 2: Pre-Award Management
      Selection Criteria for Making Awards
   Phase 3: Award Management
      Risk Assessment and Special Conditions
      Federal Award and Performance Reporting
      Issuing Awards
   Phase 4: Post-Award Management and Closeout
      Award Recipient Performance Monitoring and Assessment
      Award Closeout
   Phase 5: Program Oversight
      Analysis of Program and Project Results
      Dissemination of Lessons Learned
      Program Evaluation
      Federal Evidence Building
IV. Maintaining a Results-Oriented Culture
V. Conclusion
VI. Appendices
   Appendix A. Glossary of Terms
   Appendix B. Key Stakeholders
   Appendix C. Federal Laws and Regulations
VII. Agency Acknowledgements
I. Introduction
The primary purpose of the Performance Management Playbook (PM Playbook) is to provide
Federal awarding agencies with promising performance practices for examining larger program
and project goals and subsequent results as well as individual award recipient performance. To
this end, the PM Playbook breaks down performance management into three distinct, yet
connected, levels of activity. These activities take place at the program (i.e., assistance listing),
project (i.e., Notice of Funding Opportunity (NOFO)), and sub-project (i.e., award recipient) levels. [9] See Figure 2 below for an illustration of the three levels of performance activities. While
some Federal agencies may use other terms for these same activities, the PM Playbook uses
program, project, and sub-project throughout the document to avoid confusion over terminology.
Figure 2: Performance Activity Levels [10]
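For readers who find a concrete representation helpful, the sketch below models the three activity levels as nested Python data classes. This is a minimal illustrative sketch: the class and field names are assumptions made for illustration, not terminology defined in 2 CFR or elsewhere in the Playbook.

```python
from dataclasses import dataclass, field

@dataclass
class SubProject:
    """Sub-project level: what an individual award recipient
    plans to accomplish with the award (illustrative fields)."""
    recipient: str
    award_id: str

@dataclass
class Project:
    """Project level: the activities described in a single NOFO."""
    nofo_id: str
    goals: list[str]
    sub_projects: list[SubProject] = field(default_factory=list)

@dataclass
class Program:
    """Program level: all Federal awards under one assistance listing."""
    assistance_listing: str
    goals: list[str]
    projects: list[Project] = field(default_factory=list)
```

The nesting mirrors Figure 2: a program (assistance listing) contains one or more projects (NOFOs), and each project contains the sub-projects carried out by individual award recipients.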
The PM Playbook promotes a common understanding of performance management practices and
processes for Federal awarding agencies and is a resource for leaders and others who want to
strengthen their agency’s approach to performance by focusing on program, project, and sub-
project goals, objectives, and results. [11]
[9] The term sub-project refers to the activities that an award recipient plans to accomplish with the award.
[10] Figure developed by authors of the PM Playbook.
[11] Most often, performance is assessed within the award period. However, at times evaluations may examine award recipient and/or program performance after the award period has ended.
Program goals and their intended results, however, differ by type of Federal program. For
example, criminal justice programs may focus on specific goals such as reducing crime; basic
scientific research programs may focus on expanding knowledge and/or promoting new
discoveries; and infrastructure programs may fund specific building or transportation projects.
With this in mind, the PM Playbook highlights how practices may differ depending on the types
of awards an agency oversees (such as service delivery, science/research, or infrastructure).
I.A. Definitions
The PM Playbook contains a Glossary of Terms in the Appendix, which aligns with the standard
language and definitions in 2 CFR Part 200 and OMB Circular A-11 (2019 version), Preparation,
Submission, and Execution of the Budget (A-11 (2019)). Federal agencies often use different
terms and phrases to describe the same activities or processes. For example, different agencies
use “Funding Opportunity,” “Funding Opportunity Announcement,” “Notice of Funding
Opportunity Announcement” (NOFO), and/or “solicitation” to refer to guidance documents with
programmatic information and instructions for applicants on how to apply for awards. For
consistency and clarity, the PM Playbook uses NOFO since this is the phrase used in 2 CFR 200.
A NOFO is “any paper or electronic issuance that an agency uses to announce a funding
opportunity.” [12]
OMB defines performance management as the “use of goals, measurement, evaluation, analysis
and data-driven reviews to improve the effectiveness and efficiency of agency operations.” [13]
The PM Playbook references a more granular level and uses the phrase “performance
management” to refer to program and project results as well as award recipient performance. [14]
While Federal agencies can be recipients of Federal awards, for the purposes of the PM
Playbook, the phrase “award recipient” refers to non-Federal entities (NFE) that receive Federal financial assistance. [15]
In addition, the term “program” throughout this playbook refers to all Federal awards assigned a single assistance listing number in the System for Award Management (SAM), which was formerly the Catalog of Federal Domestic Assistance (CFDA). [16] 2 CFR 200 requires
assistance listings to have unique titles and be clearly aligned with the program’s authorization
[12] The definition described above is found in 2 CFR §25.200. In 2 CFR §200.1, NOFO is further defined as a “formal announcement of the availability of Federal funding through a financial assistance program from a Federal awarding agency.”
[13] OMB Circular A-11 (2019 version) Part 6, Section 200, p. 23: https://www.whitehouse.gov/wp-content/uploads/2018/06/a11.pdf
[14] This phrase is not to be confused with human resources or employee performance.
[15] 2 CFR §200.1 defines a non-Federal entity as a “state, local government, Indian Tribe, Institution of Higher Education, or non-profit organization that carries out a Federal award as a recipient or subrecipient.”
[16] Beta.SAM.gov describes an assistance listing as a program designed to “provide assistance to the American public in the form of projects, services, and activities, which support a broad range of programs such as education, health care, research, infrastructure, economic development and other programs.”
and Congressional intent. To clarify further, a program (assistance listing) may have one or more
associated projects. [17]
The PM Playbook uses the term “project” to refer to the activities described in individual NOFOs.
For example, each year the Department of Justice (DOJ) Second Chance Act (SCA) program
(i.e., assistance listing) has multiple NOFOs, which provide funding for projects that fall under
the SCA authorization. The goals and objectives of each project are associated with the larger
program. [18] See Figure 3 for an illustration of how a government-wide initiative flows down to a
program, projects, funding vehicle, award, and recipient for the DOJ SCA example.
[17] It is important to note that the activity codes used for the Federal budget program inventory (see OMB Circular A-11 Part 6 (2019 version)) are not the same as the assistance listing number. In June 2019, OMB issued updated guidance to agencies on further implementation of the GPRAMA 2010 requirement for a Federal Program Inventory, to leverage program activity(ies), as defined in 31 U.S.C. § 1115(h)(11), for implementation of the program reporting requirements. Agencies’ consideration of how these Program Activities link to budget, performance, and other information will transform the reporting framework to enable improved decision-making, accountability, and transparency of the Federal Government. The current links between the Program Activity and the CFDA can be a one-to-one, a one-to-many, or a many-to-one relationship. These linkages can be explored on USASpending.gov. More information can be found in A-11, Part 6 (2019 version).
[18] Please note, in some instances a NOFO may be aligned with more than one program (assistance listing).
Figure 3: DOJ SCA Example: Program, Project, Funding Vehicle, Award, and Recipient Relationships [19][20]
[19] The Second Chance Act (SCA) supports state, local, and tribal governments and nonprofit organizations in their work to reduce recidivism and improve outcomes for people returning from state and federal prisons, local jails, and juvenile facilities. https://csgjusticecenter.org/nrrc/projects/second-chance-act/
[20] Figure developed by authors of the PM Playbook as an illustrative example. Actual projects and funding vehicles may vary.
Other key definitions to keep in mind when reviewing the PM Playbook are output and outcome.
Table 1: Output and Outcome Measures Defined [21][22]

Output: Quantity of products or services delivered by a program, such as the number of inspections completed or the number of people trained.

Outcome: The desired results of a program. For example, an outcome of a nation-wide program aimed to prevent the transmission of HIV infection might be a lower rate of new HIV infections in the United States. Agencies are strongly encouraged to set outcome-focused performance goals to ensure they apply the full range of tools at their disposal to improve outcomes and find lower cost ways to deliver.

[21][22] OMB Circular A-11 (2019 version) Part 6, Section 200: https://www.whitehouse.gov/wp-content/uploads/2018/06/a11.pdf
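To make the distinction concrete, the sketch below tags each measure as an output or an outcome using the examples from Table 1. The `PerformanceMeasure` structure is an illustrative assumption, not a prescribed schema.

```python
from dataclasses import dataclass
from enum import Enum

class MeasureType(Enum):
    OUTPUT = "output"    # quantity of products or services delivered
    OUTCOME = "outcome"  # desired result of the program

@dataclass
class PerformanceMeasure:
    name: str
    kind: MeasureType

# Examples drawn from Table 1 above.
measures = [
    PerformanceMeasure("Number of inspections completed", MeasureType.OUTPUT),
    PerformanceMeasure("Number of people trained", MeasureType.OUTPUT),
    PerformanceMeasure("Rate of new HIV infections in the United States",
                       MeasureType.OUTCOME),
]

for m in measures:
    print(f"{m.kind.value:>7}: {m.name}")
```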
I.B. Federal Laws and Regulations
Several statutes and regulations underpin performance management in the 21st century. Some of the Federal laws and regulations that the PM Playbook aligns with are listed below. [23]
Many of
these Federal laws and regulations are related in that they promote the same objectives outlined
in the PMA: to increase transparency, accountability, and results-oriented decision-making. They
also promote a risk-based approach to making awards, establishing clear goals and objectives to
show progress toward achieving results, and showing the taxpayer what they are receiving for
the funding spent on grant programs. See Appendix C for more information on each of the
Federal laws and regulations.
• Federal Grant and Cooperative Agreement Act of 1977
• Chief Financial Officers Act of 1990
• Clinger-Cohen Act of 1996 (previously the Information Technology Management Reform Act)
• Federal Funding Accountability and Transparency Act (FFATA) of 2006
• Government Performance and Results Modernization Act of 2010
• Digital Accountability and Transparency Act (DATA) of 2014
• Program Management Improvement Accountability Act (PMIAA) of 2016
• American Innovation and Competitiveness Act (AICA) of 2017
• Foundations for Evidence-Based Policymaking Act of 2018
   o Office of Management and Budget (OMB) M-19-23, “Phase 1 Implementation of the Foundations for Evidence-Based Policymaking Act of 2018: Learning Agendas, Personnel, and Planning Guidance” and M-20-12, “Phase 4 Implementation of the Foundations for Evidence-Based Policymaking Act of 2018: Program Evaluation Standards and Practices”
• Grant Reporting Efficiency and Agreements Transparency (GREAT) Act of 2019
• OMB Circular A-123, “Management’s Responsibility for Enterprise Risk Management and Internal Control”
• OMB Circular A-11, Part 6 (2019 version), The Federal Performance Framework for Improving Program and Service Delivery
• Code of Federal Regulations (CFR), Title 2 “Grants and Agreements,” Part 200 “Uniform Administrative Requirements, Cost Principles, and Audit Requirements for Federal Awards” (Developed 2013, Revised 2020)

[23] Agencies may implement the tools and promising practices in the PM Playbook in support of their authorizations and appropriations as established by Congress.
II. Performance Management Basics
As a reference guide Federal agencies may use in the context of financial assistance awards, the
PM Playbook focuses on strengthening the Federal government’s approach to performance
management by encouraging Federal awarding agencies to set measurable program and project
goals and objectives, and to use relevant data to measure both program and project results and
award recipient performance. While there is no one “right” path to improve performance
management practices, this section provides an overview of how performance management may
fit into larger agency processes as well as the grants lifecycle. This section is not all-inclusive but
rather highlights significant issues that agency leaders and their program, policy, grant, and
performance managers should consider at all three levels of performance activity: program,
project, and sub-project.
II.A. Programmatic Performance Management Principles
The PM Playbook promotes several important principles that are necessary to successfully
implement a performance management framework within an awarding agency. While there are
other important principles, such as transparency and accountability that awarding agencies may
consider, the four principles listed below are essential to changing and/or strengthening Federal
awarding agency approaches to program, project, and sub-project performance.
1. Leadership Support is Critical to Success
2. Performance Management is Everyone’s Responsibility
3. Data-Informed Decision-Making Improves Results
4. Continuous Improvement is Crucial to Achieving Results
1. Leadership Support is Critical to Success
Performance management processes are most successful when leaders champion them.
Leaders at all levels of an agency should participate in clearly defining program goals and
objectives (in relation to statutes, appropriations, and agency priorities). Leaders should
communicate program goals and objectives to grant managers and other relevant employees
so that they can align project (i.e., NOFO) goals and objectives to them.
Promising Practice - Leadership Support
Leadership support at the U.S. Department of State (State Department) has been integral to the success of the agency’s Managing for Results: Program Design and Performance Management Toolkit. [24] The State Department implemented the toolkit as a manual for its bureaus, offices, and posts to use to assess the degree to which their programs and projects were successful in advancing long-term strategic plan goals and achieving short-term results. The toolkit describes the major steps of program design and can be used to design new programs and/or evaluate whether programs are on track to meet their intended goals. Several leaders championed the initial use of the toolkit and, due to leadership’s continued backing, the toolkit remains in wide use throughout the State Department.
2. Performance Management is Everyone’s Responsibility
Everyone involved in the grants management lifecycle plays an important role in achieving
effective results. Federal employees in awarding agencies are encouraged to understand and
participate in the performance management activities related to their programs. Grant
managers, performance analysts, program managers, and others should be involved in the
entire program design and implementation process so that clear, measurable program goals
and performance measures are established and tracked through each associated project, as
applicable. Agency policies should provide clear guidance on the performance management
roles and responsibilities within the organization.
[24] The Department of State’s “Program Design and Performance Management (PD/PM) Toolkit” provides an extensive overview of how Federal agencies can achieve their strategic goals by following strong program design and performance management practices. See https://www.state.gov/wp-content/uploads/2018/12/Program-Design-and-Performance-Management-Toolkit.pdf
Promising Practice - Clear Lines of Responsibility
The Department of Health and Human Services’ (HHS) Office of Grants provides
Department-wide leadership on grants policy and evaluation, and maintains the HHS Grants Policy Administration Manual (GPAM). The GPAM describes the roles and responsibilities of
several grant management positions, including the grants management officer (GMO), the
grants management specialist (GMS), and the project officer/program official (PO). While
the PO has the main responsibility for program design, NOFO development, and
monitoring program and project performance, the PO works closely with both the GMO
and GMS throughout the grants lifecycle.
3. Data-Informed Decision-Making Improves Results
Data are critical to making informed decisions and improving results. Agencies use data to
assess whether and to what degree they are successful in meeting their strategic plan goals by
looking at program and project results. Agencies realize the benefits of collecting and
analyzing performance data (i.e., historical, prospective, and current) about programs and
projects when that information is used to make decisions about improving program and
project results.
Promising Practice - Data-Informed Risk Decision-Making
The National Aeronautics and Space Administration (NASA) developed the NASA Risk-Informed Decision-Making (RIDM) Handbook to address the importance of assessing risk as part of the analysis of alternatives within a deliberative, data-informed decision-making process. [25]
Although it was written primarily for program and project requirements-setting
decisions, its principles are applicable to all decision-making under conditions of
uncertainty, where each alternative carries its own risks. RIDM is an intentional process that uses a diverse set of performance measures, in addition to other data, to inform
decision-making. NASA manages its high-level objectives through agency strategic goals,
which cascade through the NASA organizational hierarchy as explicitly established and
stated objectives and performance requirements for each unit. Organizational unit
managers use RIDM to understand the risk implications of their decisions on these
objectives and to ensure that the risks to which their decisions expose the unit are within
their risk acceptance authorities. [26] This process ensures that program and project risk
exposures are understood, formally accepted, and consistent with risk tolerances at all
levels of the NASA organizational hierarchy.
[25] The NASA Risk-Informed Decision-Making Handbook: https://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20100021361.pdf
[26] NASA Risk Management Directive: https://nodis3.gsfc.nasa.gov/npg_img/N_PR_8000_004B_/N_PR_8000_004B_.pdf
4. Continuous Improvement is Crucial to Achieving Results
Performance management does not exist in a vacuum. When awarding agencies analyze
performance data, such as performance measures and evaluation findings, consistently throughout the grant lifecycle, they can use what they learn to improve program and project
results and award recipient performance. A continuous process of analyzing data and
providing technical assistance to improve programs and projects can also help Federal
awarding agencies better implement their missions, achieve their strategic plan goals, and
improve program results.
Promising Practice - Continuous Improvement
The U.S. Environmental Protection Agency’s (EPA) Office of Continuous Improvement
(OCI) coordinates the agency-wide implementation of the EPA Lean Management System (ELMS). [27] EPA uses ELMS, which is based on Lean process improvement principles and tools, to help set ambitious and achievable targets for its programs, measure results, and improve processes by bridging gaps between targets and results. ELMS uses regularly
updated performance and workflow data to make improvements and monitor progress
toward achieving EPA’s Strategic Plan targets.
II.B. Risk Management and Performance Management
Risk management is important to consider for performance management. As noted in the
Enterprise Risk Management (ERM) for the U.S. Federal Government (2016) Playbook,
Federal agencies should assess risk as part of their strategic planning, budget formulation and
execution, and grants management activities. [28]
ERM promotes cross-agency discussion of risks
and coordination of risk mitigation. These discussions promote better performance management
strategies, as grants, program, and performance management are often separate agency functions.
While ERM practices take place at the agency level, grant managers and others assess risk at the
award recipient level. [29] Risk management is also a critical component of an agency’s performance management framework because it helps identify risks that may affect advancement toward or the achievement of a project’s or sub-project’s goals and objectives. Typically,
integrating risk management for grants begins in the program administration and pre-award
phases of the grants lifecycle, and staff with responsibility over grant programs and grant performance assess and manage risks during the entire process. Further, integrating risk management practices after an award is made can help Federal managers determine an appropriate level of project oversight to monitor award recipient progress.

[27] The EPA Office of Continuous Improvement: https://www.epa.gov/aboutepa/about-office-continuous-improvement-oci
[28] The ERM Playbook helps government agencies meet the requirements of Office of Management and Budget (OMB) Circular A-123. The ERM Playbook provides high-level key concepts for consideration when establishing a comprehensive and effective ERM program. See https://cfo.gov/wp-content/uploads/2016/07/FINAL-ERM-Playbook.pdf
[29] OMB Circular A-123 and Title 2 of the Code of Federal Regulations (CFR).
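As a purely illustrative sketch of the risk-based oversight idea described above, the function below converts a few recipient risk factors into an oversight tier. The factors, weights, and thresholds are invented for illustration and do not represent OMB or agency policy.

```python
def oversight_level(past_performance_issues: int,
                    open_audit_findings: int,
                    first_time_recipient: bool) -> str:
    """Map simple recipient risk factors to an oversight tier.

    All weights and thresholds here are hypothetical."""
    score = (2 * past_performance_issues
             + 3 * open_audit_findings
             + (2 if first_time_recipient else 0))
    if score >= 6:
        return "enhanced monitoring (e.g., more frequent performance reporting)"
    if score >= 3:
        return "standard monitoring with periodic check-ins"
    return "baseline monitoring"

# A first-time recipient with one open audit finding lands in the middle tier.
print(oversight_level(past_performance_issues=0,
                      open_audit_findings=1,
                      first_time_recipient=True))
```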
As noted in OMB Circular A-123, risk assessment is a component of internal control and an
integral part of agency internal control processes. [30] Grant managers and others assess and
monitor risk as part of their award management activities, and work to document as well as
improve internal control processes. Thus, the PM Playbook notes that there are areas of
intersection with both ERM and award recipient risk where opportunities and threats are
important to recognize. [31]
Promising Practice - Assessing Risk
The Department of Education (ED) developed Risk Management Tools to assist employees
involved in implementing agency award programs with mitigating risk throughout the grants
lifecycle. [32] These tools include: 1) Grant Training Courses, including courses on internal controls; 2)
States’ Risk Management Practices, Tools, and Resources; 3) An Internal Controls Technical
Assistance Presentation; and 4) a Risk Management Project Management Presentation. ED’s
grant managers and others have found these tools helpful in assessing and implementing award
oversight activities.
[30] OMB Circular A-123: https://www.whitehouse.gov/sites/whitehouse.gov/files/omb/memoranda/2016/m-16-17.pdf
[31] Federal Integrated Business Framework (FIBF): https://ussm.gsa.gov/fibf-gm/
[32] Department of Education Risk Management Tools: https://www2.ed.gov/fund/grant/about/risk-management-tools.html?src=grants-page
II.C. The Federal Grants Lifecycle
The PM Playbook follows a grants lifecycle structure similar to that of the Federal Integrated Business Framework (FIBF) for Grants Management. [33] The PM Playbook strongly emphasizes program design in Phase 1 of the grants lifecycle, mirroring the proposed revisions to 2 CFR 200, which include a new provision on program design.
Table 2: At a Glance: Performance Management as Part of the Grants Lifecycle [34]

Phase 1: Program Administration
   Description: Define the problem to be solved and the desired long-term program results, develop NOFOs at the project level, and create merit review process standards.
   Performance Activities: Conduct program-planning activities and establish program goals, objectives, and performance measures. Link project goals to the larger program, establish award recipient responsibilities for reporting performance indicators, develop risk reduction strategies, assess past performance, and identify independent sources of data when appropriate.
   Activity Level: 1) Program; 2) Project

Phase 2: Pre-Award Management (see also 2 CFR 200 Subpart C, Pre-Federal Award Requirements and Contents of Federal Awards)
   Description: Review applications and select recipients. Notify approved applicants of award selection.
   Performance Activities: Evaluate and document application eligibility and merit. Evaluate and document applicant risk based on past performance, as applicable.
   Activity Level: 2) Project; 3) Sub-project

Phase 3: Award Management (see also 2 CFR 200 Subpart C, Pre-Federal Award Requirements and Contents of Federal Awards)
   Description: Award recipient notification.
   Performance Activities: Issue award notifications, including special conditions to address award recipient risks and reporting on performance.
   Activity Level: 3) Sub-project

Phase 4: Post-Award Management and Closeout (see also 2 CFR 200 Subpart D, Post Federal Award Requirements)
   Description: Monitor and assess award recipient financial and performance data. Perform grant closeout activities.
   Performance Activities: Document and analyze award recipient performance data, notify award recipients of concerns about performance, and document corrective actions, when needed. Review and resolve audit findings, and update recipient risk assessments based on results.
   Activity Level: 3) Sub-project

Phase 5: Program Oversight (see also 2 CFR 200 Subpart F, Audit Requirements)
   Description: Program- and project-level analysis and review.
   Performance Activities: Report on program- and project-level performance data and related projects, including whether the program and/or project made progress toward their goals and objectives; use results to improve projects in the next funding cycle; develop and disseminate lessons learned and promising practices; and conduct evaluations.
   Activity Level: 1) Program; 2) Project

[33] Federal Integrated Business Framework (FIBF): https://ussm.gsa.gov/fibf-gm/
[34] Table developed by authors of the PM Playbook.
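One way to internalize Table 2 is to encode its phase-to-activity-level mapping directly. The sketch below does so; the enum names and the set representation are assumptions made for illustration, not part of the FIBF or 2 CFR.

```python
from enum import Enum

class GrantPhase(Enum):
    PROGRAM_ADMINISTRATION = 1
    PRE_AWARD_MANAGEMENT = 2
    AWARD_MANAGEMENT = 3
    POST_AWARD_MANAGEMENT_AND_CLOSEOUT = 4
    PROGRAM_OVERSIGHT = 5

# Activity levels per Table 2: 1 = program, 2 = project, 3 = sub-project.
ACTIVITY_LEVELS = {
    GrantPhase.PROGRAM_ADMINISTRATION: {1, 2},
    GrantPhase.PRE_AWARD_MANAGEMENT: {2, 3},
    GrantPhase.AWARD_MANAGEMENT: {3},
    GrantPhase.POST_AWARD_MANAGEMENT_AND_CLOSEOUT: {3},
    GrantPhase.PROGRAM_OVERSIGHT: {1, 2},
}

# Example: which phases touch the sub-project (award recipient) level?
print([p.name for p, levels in ACTIVITY_LEVELS.items() if 3 in levels])
```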
The PM Playbook discusses these five phases in detail in Section III below.
III. The Performance Management Approach for Grants
The PM Playbook outlines a performance management approach for grants that aligns with the
FIBF and emphasizes the importance of the program design phase. While GPRAMA and OMB
Circular A-11 provide guidance to agencies on requirements related to strategic and performance
planning, reporting, and goal setting in an organizational context, there are fewer Federally
mandated policies and tools on program design, NOFO development, and performance
management for Federal awards. The PM Playbook addresses this gap by providing details on
promising practices used by Federal agencies at each of the five phases of the grants lifecycle
highlighted below.
During the grants lifecycle, Federal awarding agencies focus on both compliance and
performance activities. Most often, these activities are combined when grant managers and
others monitor award recipients. Too often, however, Federal awarding agencies do not clearly
distinguish between these types of activities. As a result, Federal awarding agencies may request only compliance-related measures in the NOFO, rather than both compliance- and performance-related measures. Per 2 CFR Part 200, recipients must be informed of all reporting requirements, including performance requirements, within NOFOs.
It is necessary to distinguish between these activities to ensure that 1) individual award recipients
comply with programmatic, financial, and performance requirements; and 2) program and
project results are advanced or successfully achieved based on their goals and objectives.
To assist in lessening this confusion, the PM Playbook provides the following definitions:
Compliance Activities [35]: Compliance activities are the administrative, financial, audit, and
program requirements described in the NOFO and are used for recipient oversight and
monitoring, which conform with the Federal rules and regulations on reporting in 2 CFR Part
200. The primary purpose of compliance activities is to document that funds are spent in
accordance with the terms of the Federal award, including accomplishing the intended sub-
project purpose. Compliance activities take place at the sub-project (or individual award
recipient) level and include:
• Ensuring the timely expenditure of funds
• Preventing fraud, waste, and abuse
• Financial reporting
• Identifying the technical assistance needs of the award recipient
Performance Activities [36]: Performance activities include both performance measurement
(outputs and outcomes) and program/project evaluations. Evaluations may also include studies to
answer specific questions about how well an intervention is achieving its outcomes and why.
Many programs and projects entail a range of interventions in addition to other activities. For
example, specific interventions for a rural grant program might include an evaluation of different
types of intervention models.
• Performance measurement: Reporting on a program or project’s progress toward and accomplishment of its goals with performance indicators.
• Program/project evaluations: Conducting studies to answer specific questions about how well a program or project is achieving its outcomes and why.
The intent of performance activities is to focus on assessing higher-level program and project
outcomes and results. The ideal and preferred performance measures are outcomes; however, for
some programs it may be difficult to collect outcome measures during the period of performance.
In these instances, agencies may need to collect output measures to monitor performance. These
output measures should be meaningful and consistent with the theory of change, maturity model,
or logic model documented during the program design phase described below. (See Table 1:
Output and Outcome Measures Defined). Examples of output measures include the number of single parents who received home visits during the period of performance of an award or how much of an infrastructure project was completed in a given timeframe. Examples of outcome measures include whether a single parent who received home visits obtained assistance that improved their quality of life, the percent reduction in substance abuse relapse rates, or the reduction in average commute time in a given metropolitan area.
See Figure 4: Compliance Activities versus Performance Activities. [37]

[35] Developed by the authors of the PM Playbook.
[36] Developed by the authors of the PM Playbook.
[37] Figure developed by authors of the PM Playbook.
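As a worked illustration of one of the outcome measures named above, the snippet below computes a percent reduction in substance abuse relapse rates. All figures are hypothetical and chosen only to show the arithmetic.

```python
# Hypothetical figures for illustration only.
baseline_relapse_rate = 0.40  # relapse rate before the funded intervention
current_relapse_rate = 0.30   # relapse rate observed during the award period

percent_reduction = (100 * (baseline_relapse_rate - current_relapse_rate)
                     / baseline_relapse_rate)
print(f"Relapse rate reduced by {percent_reduction:.0f}%")  # prints 25%
```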
III.A. Phase 1: Program Administration [38]
The grants lifecycle begins with the enactment of an authorizing statute, which prompts an agency to set up and design the administration of the grant program. This phase focuses on planning and creating assistance listings and related projects (i.e., NOFOs). Sound program design is an
essential component of performance management and program administration. Ideally, program
design takes place before an agency drafts related projects. This enables Federal agency
leadership and employees to codify program goals, objectives, and intended results before
specifying the goals and objectives of specific projects in a NOFO.
[38] Refer to the OMB Uniform Guidance (2 CFR 200) for additional information: https://www.ecfr.gov/cgi-bin/retrieveECFR?gp=&SID=c5f8c20480c04403862fca0bcaf78a74&mc=true&n=pt2.1.200&r=PART&ty=HTML#_top
Program design begins with aligning program goals and objectives with Congressional intent as
stated in the program authorization and appropriations bill language. The program goals and
objectives should also be aligned with Federal agency leadership priorities, strategic plans and
priority goals. A well-designed program has clear goals and objectives that facilitate the delivery
of meaningful results, whether a new scientific discovery, a positive impact on citizens’ daily lives,
or improvement of the Nation’s infrastructure. Well-designed programs also represent a critical
component of an agency’s implementation strategies and efforts that contribute to and support
the longer-term outcomes of an agency’s strategic plan.
Program Design is Critical to Achieving Results
Program design occurs before a project is developed and described in a NOFO. Ideally, an
agency first designs the program (or assistance listing), including specifying goals, objectives,
and intended results, before developing one or more projects (NOFOs) under the program.
Consider the following metaphor: In some instances, a program is like an aircraft carrier, and
the projects are like the planes onboard the ship. The carrier has macro-level, mission goals
and each plane carries out micro-level goals and objectives based on their specific
assignments.
Agencies may also use program design principles when developing projects. In fact, the steps for
project development are the same as those for program development, with two additional steps: 1) aligning
the goals and objectives of the project back to those of the larger program; and 2) including
project-specific performance indicators in the NOFO. Thus, program design activities may occur
at both the program and project levels. [39]
[39] The PM Playbook authors note that performance requirements may vary by type of Federal financial assistance. Formula awards are noncompetitive and based on a predetermined formula. They are governed by statutes or Congressional appropriations acts that specify what factors are used to determine eligibility, how the funds will be allocated among eligible recipients, the method by which an applicant must demonstrate its eligibility for funding, and sometimes even performance reporting requirements. Discretionary awards, however, are typically provided through a competitive process. They also are awarded based on legislative and regulatory requirements, but agencies typically have more discretion over specifying program and performance requirements for award recipients.
Table 3: Program/Project Design - Activity Levels [40]

Level 1: Program. Federal awarding agencies establish program goals, objectives, and intended results that align with appropriations, which are described for each assistance listing on beta.sam.gov.

Level 2: Project. Federal awarding agencies establish project goals, objectives, and intended results (that align with the larger program), which are described in a NOFO.
The steps involved in program and project design take place before agencies write NOFOs and include the following (a minimal data sketch of the resulting design record appears below):
1. Developing a problem statement with complexity awareness. [41]
2. Identifying goals and objectives.
3. Developing a theory of change, maturity model, and/or logic model depicting the program or project’s structure. [42]
4. Developing performance indicators, as appropriate, to measure the program and/or project results, which may include independently available sources of data.
5. Identifying stakeholders that may benefit from promising practices, discoveries, or expanded knowledge.
6. Researching existing programs that address similar problems for information on previous challenges and successes.
7. Developing an evaluation strategy (see Section IV.E.3 on Evaluations for more information).
[40] Table developed by authors of the PM Playbook.
[41] The phrase “complexity awareness” is often used in research but is applicable in many situations. Complexity awareness acknowledges the prevalence and importance of non-linear, unpredictable interrelationships, non-linear causality, and emergent properties that may impact a problem.
[42] See the glossary for definitions of these phrases.
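The seven design steps above each produce an artifact. As a minimal sketch, assuming no particular agency format, the data class below collects those artifacts into one design record; every field name here is an illustrative assumption.

```python
from dataclasses import dataclass, field

@dataclass
class ProgramDesign:
    """Captures the artifacts produced by design Steps 1-7 (illustrative)."""
    problem_statement: str                 # Step 1
    goals: list[str]                       # Step 2
    objectives: list[str]                  # Step 2
    change_model: str                      # Step 3: theory of change, maturity, or logic model
    performance_indicators: list[str]      # Step 4
    stakeholders: list[str]                # Step 5
    related_programs: list[str] = field(default_factory=list)  # Step 6
    evaluation_strategy: str = ""          # Step 7
```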
Promising Practice - Program and Project Design, Monitoring, and Evaluation Policy
In 2017, the Department of State issued a “Program and Project Design, Monitoring, and Evaluation Policy” to establish clear links from its strategic plan goals to the key programs and projects that advance those goals, and to collect data on whether these efforts were working as intended. The policy clearly defines the terms program and project as follows:
• Program: A set of activities or processes aimed at achieving a goal or objective that is typically implemented by several parties over a specified period of time and may cut across sectors, themes, and/or geographic areas.
• Project: A set of activities intended to achieve a defined product, service, or result with specified resources within a set schedule. Multiple projects often make up the portfolio of a program and support achieving a goal or objective.
According to the State Department, program and project design work serves as a foundation
for the collection and validation of performance monitoring data, confirming alignment to
strategic objectives, and purposeful evaluative and learning questions. The State Department
developed guidance and a plan for implementing the policy. This plan included coordination
with bureaus and offices throughout the agency to complete program and project design steps
for their major lines of effort.
Bureaus that could integrate sound program design, monitoring, and evaluation practices did so either with:
• A well-resourced effort with strong leadership buy-in, or
• A bureau champion who was able to convince colleagues that well-documented program designs, monitoring plans, and progress reviews could improve the efficiency and efficacy of their work and their ability to answer questions from stakeholders.
Implementation brought additional benefits, including broader collaboration among bureaus
and missions and greater connectivity among the sources of information on planning,
budgeting, managing and evaluating projects and programs.
III.A.1 Program Design Steps
As noted previously, Federal agencies can focus on program design at both the assistance listing
and the project level. Federal agencies often design programs with multiple projects. While the
steps outlined below focus on programs, the same steps can be applied to project development.
Step 1. Develop a problem statement. Program design begins with understanding
Congressional intent in authorizing and appropriations language, the priorities of agency
leadership, and/or the lessons learned from previous programs and projects. This understanding
informs the purpose of a program and/or project and guides its design. [43]
Align Program Purpose with Authorization Language
The mission of the Department of Transportation’s Federal Motor Carrier Safety Administration (FMCSA) is to improve commercial motor vehicle (CMV) carrier and driver safety through the administration of Federal award programs. The purpose of FMCSA’s Motor Carrier Safety Assistance Program (MCSAP) is to reduce CMV-involved crashes, fatalities, and injuries through consistent, uniform, and effective CMV safety programs (assistance listing #20.218), which is consistent with the program’s authorization under the Safe, Accountable, Flexible, Efficient Transportation Equity Act. [44]
After the intent of the program is understood, a problem statement is developed. The problem
statement clearly defines the nature and extent of the problem to be addressed. Agencies may
develop problem (or need) statements by conducting a situational analysis or needs assessment,
which may include internal and external stakeholder engagement. This involves a systematic gathering and analysis of data and information relevant to the problem the program seeks to address, and identifying the priorities, concerns, and perspectives of those with an interest in the problem or need being addressed. During this step, evidence is gathered (if available) to help identify successful ways to advance or achieve the program’s goals. If it is unclear whether evidence exists
on the program, the program should consult with the agency evaluation officer (EO) or chief data
officer (CDO). The data and evidence gathered will help to inform both program and project
development.
[43] As noted previously, the PM Playbook begins the program design process at the assistance listing level (not at the project or NOFO level). However, using the "program design" process at the NOFO level can be extremely useful, especially when new programs and their corresponding projects are developed.
[44] Pub. L. No. 109-59, § 4107(b), 119 Stat. 1144, 1720 (2005), amended by SAFETEA-LU Technical Corrections Act of 2008, Pub. L. No. 110-244, 122 Stat. 1572 (2008).
Promising Practice - Consultation with Experts to Improve Program Design
The National Aeronautics and Space Administration’s (NASA) Science Mission Directorate
(SMD) uses NASA's Strategic Goals and Objectives and the high-level objectives that flow
from them as one of four components to its research grant program design activities. In turn,
the high-level objectives are derived from "Decadal Surveys" created by the National
Academies of Sciences, Engineering, and Medicine every ten years and reviewed every five
years. These reports summarize the state of the art of SMD's four science foci (Heliophysics,
Earth Science, Planetary Science, and Astrophysics) and contain recommendations for
future work.[45]
The NASA Advisory Council (NAC) Science Committee, a high-level standing committee of
the NAC, supports the advisory needs of the NASA Administrator, SMD, and other NASA
Mission Directorates, and is the third design component. This committee provides input on
NASA's Earth and space science-related discretionary research grant programs, large flight
missions, NASA facilities, and related matters.
Finally, the SMD contracts with the Space Studies Board (SSB) at the National Academies of
Sciences, Engineering, and Medicine to engage the Nation’s science expert stakeholders to
identify and prioritize leading-edge scientific questions and the observations required to
answer them as the fourth component. The SSB’s evidence-based consensus studies examine
key questions posed by NASA and other U.S. government agencies.
SMD integrates these design practices to create an evidence-based feedback system. For
example, Decadal Surveys inform the production of NASA's strategic plan and SMD's science
plan and allow grant programs to be kept up to date rather than being completely reliant on
agency-produced program goals and objectives.
Step 2. Identify goals and objectives. Goals establish the direction and focus of a program and
serve as the foundation for developing program objectives. They are broad statements about what
should happen because of the program, although the program may not achieve the long-term
result(s) within the period of performance for the program. When establishing program goal(s),
state the goal(s) clearly, avoid vague statements that lack criteria for evaluating program
effectiveness, and phrase the goal(s) in terms of the change the program should advance and/or
achieve, rather than as an activity or summary of the services or products the program will
provide.
Objectives are the intermediate effects or results the program can achieve towards advancing
program goal(s). They are statements of the condition(s) or state(s) the program expects to
achieve or affect within the timeframe and resources of the program. High-quality objectives
[45] See NASA's example to Improve Program Design at https://science.nasa.gov/about-us/science-strategy.
often incorporate SMART principles: Specific, Measurable, Achievable, Relevant, and Time-bound.[46]
SMART objectives help to identify elements of the evaluation plan and performance
management framework, including performance indicators and data collection criteria. Program
objectives often serve as the starting point for developing projects and drafting the related
NOFOs. While there is no one right way to design programs and related projects, agencies
cannot assess program and project success if their goals and objectives are not clearly stated from
the beginning.
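To make the SMART elements concrete, the sketch below shows one hypothetical way a program office might record an objective in structured form so that its measurable elements carry forward into performance indicators and targets. It is illustrative only; the field names and values are invented for this example, and Python is used purely for readability.

```python
# A minimal, hypothetical sketch of a SMART objective as a structured record.
from dataclasses import dataclass
from datetime import date

@dataclass
class SmartObjective:
    statement: str   # Specific: the condition the program expects to affect
    indicator: str   # Measurable: the performance indicator that tracks it
    baseline: float  # Measurable: the starting value for the indicator
    target: float    # Achievable: the value the program aims to reach
    relevance: str   # Relevant: the program goal this objective advances
    deadline: date   # Time-bound: when the target should be met

objective = SmartObjective(
    statement="Increase enrollment of eligible participants in reentry services",
    indicator="Percent of eligible participants enrolled",
    baseline=40.0,
    target=60.0,
    relevance="Program goal: reduce recidivism",
    deadline=date(2022, 9, 30),
)
print(f"{objective.indicator}: {objective.baseline}% -> "
      f"{objective.target}% by {objective.deadline}")
```

Recording objectives in a form like this makes it straightforward to confirm that each one carries an indicator, a baseline, a target, and a deadline before a NOFO is drafted.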
Step 3. Develop a theory of change, maturity model, or logic model depicting the program's
structure. Logic models, maturity models, and theories of change are the building blocks for
developing programs. They may be used individually or together. For example, a theory of
change defines a cause-and-effect relationship between a specific intervention, or service
activity, and an intended outcome.[47] A theory of change explains how and why a program is
expected to produce a desired result. A theory of change also can be used to summarize why
changes in a logic model are expected to occur, and logic models can be used to show a
summary of the underlying theory.[48]
A maturity model is a tool used to assess the effectiveness of a program and to determine the
capabilities needed to improve performance. While maturity models are often used as a project
management tool, they also can help to assess a program's and/or project's need for
improvement.[49]
Logic models describe how programs are linked to the results the program is expected to
advance or achieve. Logic models are intended to identify problems (in the problem statement),
name desired results (in the goals and objectives) and develop strategies for achieving results.
Outcomes are the primary changes that are expected to occur as a result of the program or
project's activities, and are linked to the program or project goals and objectives.[50]
Logic models provide a visual representation of the causal relationships between a sequence of
related events, connecting the need for a planned program or project with desired results. Logic
models identify strategic elements (e.g., inputs, outputs, activities, and outcomes) and their
relationships. This also includes statements about the assumptions and external risks that may
influence success or present challenges.
[46] For more information on SMART goals: https://hr.wayne.edu/leads/phase1/smart-objectives
[47] More information about Theory of Change: https://www.nationalservice.gov/resources/performance-measurement/theory-change
[48] More information about Theory of Change: https://www.theoryofchange.org/what-is-theory-of-change/
[49] https://www.pmi.org/learning/library/maturity-model-implementation-case-study-8882
[50] See glossary for definition of "outcome."
Logic models also include program activities. These are proposed approaches to produce results
that meet the stated goals and objectives. A clear description of program activities provides the
basis for developing procedures for program implementation and measures to monitor
progress/status.
A well-built logic model is a powerful communications tool. Logic models show what a program
is doing (activities) and what the program plans to advance or achieve (goals and outcomes).
Logic models also communicate how a program is relevant (needs statement) and its intended
impact (outcomes). Agencies also may create cascading logic models, where the program
(assistance listing) logic model informs the development of logic models for individual projects
(NOFOs).
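One way to see how a logic model's strategic elements fit together, and how a project (NOFO-level) model can cascade from a program model, is to write both down in structured form. The sketch below is illustrative only; the elements shown are hypothetical and do not represent any actual agency program.

```python
# A hypothetical program (assistance listing) logic model as a simple mapping
# from strategic elements to their contents.
program_logic_model = {
    "problem": "High recidivism among adults returning from incarceration",
    "inputs": ["Grant funding", "Case managers", "Community partners"],
    "activities": ["Comprehensive case management", "Job-readiness training"],
    "outputs": ["Participants enrolled", "Case management plans completed"],
    "outcomes": ["Increased employment", "Reduced recidivism"],
    "assumptions": ["Eligible participants can be reached and retained"],
    "external_risks": ["Local labor-market downturn"],
}

# A cascading project (NOFO-level) model references the program outcome it
# rolls up to, so project indicators stay tied to program goals.
project_logic_model = {
    "parent_outcome": program_logic_model["outcomes"][1],
    "objective": "Increase availability of reentry services with case management",
}
print(project_logic_model["parent_outcome"])  # -> Reduced recidivism
```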
Promising Practice - Establishment of a Program Design Process
The Bureau of Justice Assistance (BJA), in the Department of Justice’s Office of Justice
Programs, developed an extensive program design process to assist its grant, performance, and
policy managers in developing goals and objectives, deliverables, logic models, and
performance indicators. The process includes a program design manual, which provides
information on how to facilitate the process and worksheets for creating program statements,
including a theory of change; logic models; and performance indicators. BJA’s program
design process has been successful in helping employees work toward a common
understanding of a program’s purpose, goals, objectives, and intended results. The process has
also helped employees develop projects (NOFOs) that align with the goals and objectives of
the larger program. The same process has been used to develop better projects and design
NOFOs.
Step 4. Develop performance indicators to measure program and/or project
accomplishments. Performance indicators should provide strategic and relevant information that
answer agency leadership and stakeholders’ questions. They should measure the results of the
actions that help to advance or achieve the program’s goals and objectives. While there is no
specific formula for developing performance indicators, characteristics of effective indicators
include the following:
• Reflect results, not the activities used to produce results
• Relate directly to a goal
• Are based on measurable data
• Are practical and easily understood by all
• Are accepted and have owners[51]
[51] A performance indicator owner is the person responsible for the process that the indicator is assessing.
Step 5: Identify stakeholders that may benefit from any promising practices, discoveries, or
expanded knowledge.
Involving stakeholders early in the process helps establish buy-in before the project begins. It is
also important to build strong, ongoing partnerships with stakeholders that are in some way
seeking the same outcome to determine what may be missing from the program design.
Step 6: Research existing programs that address similar problems for information on
previous challenges and successes.
Program design can be much improved by researching the challenges and successes of similar
programs. One possible source for this research is an agency evaluation clearinghouse. See
section III.E.2, Dissemination of Lessons Learned, for an example: the Department of
Education's "What Works Clearinghouse," which was designed to provide educators with the
information they need to make evidence-based decisions.
Step 7: Develop an evaluation strategy
When designing a program, an evaluation plan may also be developed during the program design
phase. See Section III.E.3, Program Evaluation, for more information.
III.A.2 Notice of Funding Opportunity (NOFO)
Per Appendix I, Full Text of the Notices of Funding Opportunity, of the OMB Uniform
Guidance located at 2 CFR 200, the pre-award phase includes: 1) developing NOFOs; 2)
establishing performance requirements for award recipients; and 3) establishing the merit review
process and criteria for ranking applicants.
Note on Terminology - What is a sub-project?
NOFOs may use terms like "initiative," "program," or "project" to refer to the activities that an
award recipient plans to accomplish with their award. To avoid confusion, the PM Playbook
uses the phrase "sub-project" to refer to these activities. As noted previously, the PM Playbook
uses the term "project" to refer to the activities specified in a NOFO. See Figure 1: DOJ SCA
Example: Program, Project, Funding Vehicle and Recipient Relationships.
The pre-award phase begins with reviewing a program’s goals and objectives as well as its logic
or maturity model. At this point, an agency may decide to fund one or more projects (NOFOs)
under the larger program. When developing NOFOs, grant managers and others should align the
goals, objectives, and performance indicators of the proposed project directly back to the larger
goals of the program. Agency leadership should review the program goals, objectives, activities,
and outcomes to determine which specific areas and/or activities the NOFOs will address. As
noted earlier, NOFO development is a project activity (level 2), and may mirror the process used
for developing a program.
Projects are funded through a NOFO
The process of developing a NOFO begins with aligning the goals and objectives of the
project funded in a NOFO with the larger goals of the program developed during the program
design process. NOFOs often have similar but more specific goals and objectives that
align with the activities and outcomes captured in the program logic model. For example,
the Department of Justice’s Second Chance Act (SCA) program has broader goals than those
for the SCA Comprehensive Community-based Adult Reentry Program.
• SCA program (assistance listing) goal: Reduce recidivism.
• SCA Comprehensive Community-based Adult Reentry project (NOFO) sub-goal: Increase the availability of reentry services with comprehensive case management plans that directly address criminal behaviors.
The purpose of the NOFO should contribute to advancing and/or achieving the
overarching program goal as well as the project's goals and objectives. Ideally, a project
addresses one or more of the program’s objectives (which are the means by which the program
goal(s) is/are advanced or achieved). During NOFO development, Federal programs develop
clear instructions on the types of performance indicators that award recipients must develop and
report on for their project. This process may include requiring potential award recipients to
include proposed target numbers as the baseline data for key performance indicators in their
applications. As each program’s objectives have their own set of performance indicators, the
project should include the subset of indicators and data collection criteria associated with the
program objective. A NOFO may include program as well as project specific indicators, and
should include a description of requirements needed to monitor compliance and outcomes for
assessing program results.
Appendix I of 2 CFR 200 - Full Text of Notice of Funding Opportunity[52]
2 CFR 200 requires that all NOFOs include the following sections:
A. Program Description
The program description may be as long as needed to adequately communicate to
potential applicants the areas in which funding may be provided. This section covers
many items, including the communication of indicators of successful projects.
B. Federal Award Information
Information contained in this section helps the applicant make an informed decision about
whether to submit a proposal.
C. Eligibility Information
Considerations or factors that determine applicant or application eligibility are included
in this section. Some examples include the types of entities that are eligible to apply,
information about cost sharing or matching (if applicable), and criteria that would make
an application or project ineligible for Federal awards. Also included in this section is a
general statement of eligibility that reflects the statutory authority for the program, and
other pertinent legal or policy requirements or restrictions. If an applicable statute or
regulation restricts eligibility, the NOFO must include an appropriate reference or cite to
the law.
D. Application and Submission Information
This section addresses how potential applicants will get application forms, kits or other
materials needed to apply. Additional required information for this section includes the
format for the application, required registration in the System for Award Management
(SAM), the submission dates and times, the intergovernmental review disclaimer (if
applicable), funding restrictions and other requirements.
E. Application Review Information
Information in this section includes how to apply for the award, application review
criteria, the review and selection process, specifications if the total Federal share is
expected to be greater than the simplified acquisition threshold, and anticipated announcement
and Federal award dates.
[52] 2 CFR 200, Appendix I to Part 200 - Full Text of Notice of Funding Opportunity: https://www.ecfr.gov/cgi-bin/retrieveECFR?gp=&SID=56df1ac037a4c02d491c0e6c56fbfe9b&mc=true&n=pt2.1.200&r=PART&ty=HTML#ap2.1.200_1521.i
Review Criteria
Discuss those elements used to consider the comparative value of different proposals
submitted under the NOFO. This section should include information on the criteria
used (and assignment of percentage weight, if applicable) by reviewers to evaluate
applications for competitive awards. This information may vary by NOFO.
Review Process
Include information about how applications will be reviewed (e.g., through peer
review) and who will make the final award decisions.
F. Federal Award Administration Information
Included in this section are Federal award notices, administrative and national policy
requirements, and reporting requirements. The reporting requirements include the type,
frequency, and means of submission of post-Federal award reports.
G. Federal Awarding Agency Contact(s)
This section provides a point of contact whom potential applicants can ask questions of or
request assistance from while the NOFO is open. It can include the name of the grant or
program manager, title, phone number, or email address.
III.A.3. Performance Management Requirements (2 CFR §§200.210, 200.301)
A NOFO often includes performance measurement requirements for the overall project as well
as the sub-project proposals, including performance indicators and targets, as well as baseline
data. The NOFO may require the potential award recipient to use project-specific performance
indicators and/or to propose their own indicators that relate to their sub-project. The information
collected from recipients to support agency program performance reporting must be cleared by
OMB as required under the Paperwork Reduction Act.[53] Agencies are also encouraged to request
flexibilities from OMB in support of innovative program design. For additional information on
requesting agency flexibilities, see 2 CFR §200.102.[54]
[53] "The Paperwork Reduction Act (PRA) of 1995 gives the Office of Management and Budget (OMB) authority over the collection of certain information, including performance measures and other types of data, by Federal agencies. OMB must approve all new and revised agency data collection plans before they are made public." For more information on the PRA: https://pra.digital.gov/
[54] See 2 CFR §200.102: https://www.ecfr.gov/cgi-bin/text-idx?SID=2942841ca6493ad99f7ad13c59bce7f9&mc=true&node=se2.1.200_1102&rgn=div8
1. Performance measurement requirements: While the NOFO includes performance
measurement requirements established by the project, the proposed sub-project should
include:
a. The data collection and reporting methods the potential award recipient would use if
funded and why those methods are likely to yield reliable, valid, and meaningful
performance data.
b. If the potential award recipient is required to collect data after the substantive work of
a sub-project is complete, they should describe the data collection and reporting
methods they would use during the post-performance period and why those methods
are likely to yield reliable, valid, and meaningful performance data.[55]
2. Project performance indicators: When the Federal awarding agency requires a potential
award recipient to create and propose their own sub-project performance indicators,
baseline data, or performance targets, the NOFO could include:
a. Performance Indicator. How each proposed performance indicator would accurately
measure the performance of the sub-project and how the proposed performance
indicator is consistent with the performance measures established for the program
funding the competition.
b. Data source. The data source and collection process to support its general accuracy
and reliability, as well as any data limitations.
c. Baseline data. Why each proposed baseline is valid. If the applicant has determined
that there is no established baseline data for a particular performance indicator, they
should explain why and describe how and when they would establish a valid baseline
for the performance indicator.
d. Performance targets. Why each proposed performance target is ambitious yet
achievable compared to the baseline for the performance indicator and when, during
the period of performance, the potential award recipient would meet the performance
target(s).
[55] See also materials developed by the Performance Improvement Council (PIC). For example, the PIC published "Performance Measurement Basics," which is a great resource for developing performance measures.
Promising Practice - Technical Assistance on Performance Measurement
Training and technical assistance to prospective applicants on performance measurement can
be extremely helpful, especially where the applicant is required to establish/propose its own
program-specific measures. This training, whether delivered at an in-person event or via
webinar, should focus on the program's goals, what makes a good measure, establishing
baselines and targets, and data collection and validation methods.
III.B. Phase 2: Pre-Award Management (2 CFR 200 Subpart C)[56]
The pre-award management phase includes performance activities during the grant application,
review, and selection process.
III.B.1. Selection Criteria for Making Awards (2 CFR §200.204; Appendix I.E)[57]
To assess the performance-related aspects of a proposed sub-project, the Federal awarding
agency generally adheres to predetermined selection criteria, which apply only to discretionary
awards. In addition to developing predetermined selection criteria for an award, it is important
for the success of the program or project to develop a budget and budget justification that is
well thought out and meets Federal requirements. Below are four general selection criteria and
factors to consider:
1. The quality of the design of the proposed sub-project (i.e., application or proposal).
a. The goals, objectives, and outcomes of the sub-project are clearly specified and
measurable.
b. The design of the proposed sub-project includes a thorough, high-quality review of the
relevant literature, a high-quality plan for project implementation, and the use of
appropriate methodological tools to ensure successful advancement or achievement of
project objectives.
c. The proposed demonstration design and procedures for documenting project activities
and results are of high quality.
d. The potential award recipient's level of past performance, if applicable (i.e., to what
degree did they successfully meet project goals in the past?).
[56] Refer to the OMB Uniform Guidance (2 CFR 200) for additional information about the grants lifecycle pre-award phase (Subpart C - Pre-Federal Award Requirements and Contents of Federal Awards, and Appendix XII to Part 200 - Award Term and Condition for Recipient Integrity and Performance Matters): https://www.ecfr.gov/cgi-bin/retrieveECFR?gp=&SID=c5f8c20480c04403862fca0bcaf78a74&mc=true&n=pt2.1.200&r=PART&ty=HTML#_top
[57] 2 CFR §§200.204, Federal awarding agency review of merit proposals, and 200.210, Information contained in a Federal award: https://www.ecfr.gov/cgi-bin/retrieveECFR?gp=&SID=56df1ac037a4c02d491c0e6c56fbfe9b&mc=true&n=pt2.1.200&r=PART&ty=HTML#_top
2. The proposed results that would be advanced or achieved through the award are supported by
evidence. This portion of a grant application should be reviewed by an individual with
evaluation and evidence expertise, either within the Federal awarding agency or recruited as
a grant reviewer. For awards that use tiered evidence strategies, the funding announcement
should specify the level of evidence required for each funding tier.
a. Provide demographic, economic conditions, and other information such as the number of
intended beneficiaries using reliable sources of information.
b. Include results of program evaluations conducted on similar, previously conducted
sub-projects.
c. Include program evaluation results on a similar program or project, with an explanation
on how the proposed program or project differs or innovates.
d. Include other types of evidence that help demonstrate the likelihood that the proposed
sub-project would be successful.
3. The quality of the evaluation proposed by the applicant, when required by the Federal
awarding agency. This portion of an application should be reviewed by an evaluation
methods expert, either within the agency or recruited as a reviewer. See OMB Circular A-
11 for more information on evaluation. The Federal awarding agency should look at the
following items when determining the quality of the evaluation proposed by the applicant:
a. The methods of evaluation are feasible, rigorous, and appropriate to the goals, objectives,
and outcomes of the proposed sub-project.
b. The evaluation is being conducted by individuals qualified to conduct the applicable
methods and can demonstrate objectivity and independence from the project or sub-
project.
4. Past performance of the recipient, when available. Review the past performance of a sub-
project to determine what could be improved or what worked well. The following factors
should be considered:
a. The potential award recipient's performance outcomes under a previous award under any
agency program.
b. The potential award recipient's failure under any agency program to submit a
performance report or its submission of a performance report of unacceptable quality.
c. Past performance information on this entity for a similar or the same program, with
context on how this relates to proposed goals for this new award. This should include
outcomes achieved during a specified performance period and explain the number of
affected beneficiaries or participants served.
The selection of proposed sub-projects for awards is based on the authorizing statute, the
selection criteria, and any priorities or other requirements applicable to the project and program.
Once reviewed, the proposed sub-projects are often listed in rank order based on the evaluation of
their quality according to the selection criteria. Past performance may be considered when
determining the order in which the proposed sub-projects will be selected for award.
III.C. Phase 3: Award Management (2 CFR 200 Subpart C)[58]
The award phase consists of: 1) documenting all standard terms and conditions; 2) developing
and documenting special conditions to address award recipient risks; and 3) issuing the awards.
For the purposes of this playbook, however, the following sections only focus on project and
sub-project (i.e., award recipient) performance.
III.C.1. Risk Assessment and Special Conditions (2 CFR §§200.205, 200.207)
Before funding an application, a Federal awarding agency should review the risk posed by the
potential award recipient in terms of their ability to accomplish sub-project goals and
objectives.[59] According to 2 CFR §200.205, Federal awarding agency review of risk posed by
applicants, a risk assessment considers a number of items, including:
• Prior and/or current performance information, including progress in achieving previous corrective actions set in place to resolve performance-related findings, if applicable.
• Single audit and financial data.
When Federal employees identify programmatic performance risks associated with potential
award recipients, agencies often impose specific award conditions in the Notice of Award (NOA)
to address the identified performance risks. These conditions include risk mitigation strategies
that are beyond routine post award monitoring and oversight strategies. Federal awarding
agencies also impose special conditions on an award if the potential award recipient has a history
of unsatisfactory performance or has not fulfilled the conditions of a prior award.[60]
[58] 2 CFR 200 Subpart C - Pre-Federal Award Requirements and Contents of Federal Awards: https://www.ecfr.gov/cgi-bin/retrieveECFR?gp=&SID=c5f8c20480c04403862fca0bcaf78a74&mc=true&n=pt2.1.200&r=PART&ty=HTML#_top
[59] 2 CFR §200.205, Federal awarding agency review of risk posed by applicants: https://www.ecfr.gov/cgi-bin/retrieveECFR?gp=&SID=c5f8c20480c04403862fca0bcaf78a74&mc=true&n=pt2.1.200&r=PART&ty=HTML#_top
[60] The Federal awarding agency may make a Federal award to a recipient who does not fully meet these standards if it is determined that the information is not relevant to the current Federal award or if there are specific conditions that can mitigate the effects of the risk.
Promising Practice - Mitigating Risk and Reducing Costs by Sharing Research Results
The National Aeronautics and Space Administration (NASA) employs tools to mitigate risk
and reduce cost by sharing research results. For example, all NASA funded researchers
archive manuscript versions of peer reviewed publications in PubMed Central® (PMC), a free
full-text archive at the U.S. National Institutes of Health's National Library of Medicine
(NIH/NLM). Rather than create a duplicative archive, NASA relies on NIH’s infrastructure,
which simplifies the public’s access to the results of federal grants and reduces costs.
The NASA Scientific and Technical Information (STI) Program supports the advancement of
aerospace knowledge and contributes to U.S. competitiveness in aerospace research and
development. This mission support program helps NASA avoid duplication of research by
sharing information. The NASA STI Program acquires, processes, archives, announces, and
disseminates NASA STI via the NASA Technical Reports Server (NTRS).
The NTRS is a world-class collection of STI with a public interface that is available at:
https://ntrs.nasa.gov/search.jsp. The information types include conference papers, patents,
technical videos, etc. created or funded by some NASA assistance awards and procurements.
NASA restricts input access to those contractors and award recipients who complete
the NTRS-R Registration form. At grant closeout, NASA requires award recipients to report
whether or not the grant produced new technology. The grant manager at NASA must verify
the recipient's answer before closeout is finalized.
When alerting an award recipient that there are special conditions related to performance,
Federal awarding agencies may provide the following types of information:
• The reason why the additional requirements are being imposed;
• The nature of the action needed to remove the additional requirements, if applicable;
• The time allowed for completing the actions, if applicable; and
• The method for requesting reconsideration of the additional requirements imposed.
III.C.2. Federal Award and Performance Reporting (2 CFR §200.210)
As noted in 2 CFR 200, a Notice of Award (NOA) must include information on the types of
performance information an award recipient must report on related to the results intended to be
advanced or achieved by the Federal project described in the NOFO.[61] This may include:
1. Performance Goals,
2. Indicators,
3. Milestones,
4. Expected Outcomes, and
5. Timeline for accomplishments.
Agencies should measure award recipient performance in ways that help to improve program
results and facilitate sharing lessons learned and the adoption of promising practices.[62]
[61] 2 CFR §200.210(d) Information Contained in a Federal Award: https://www.ecfr.gov/cgi-bin/retrieveECFR?gp=&SID=c5f8c20480c04403862fca0bcaf78a74&mc=true&n=pt2.1.200&r=PART&ty=HTML#_top
Promising Practice - Award Recipient Performance Reports
DOL’s Employment and Training Administration (ETA) typically requires its recipients to
submit quarterly performance reports that include data on how many participants received
services during the quarter; the types of services and training they received; credentials
attained; and employment outcomes. In addition, recipients submit narrative reports that
describe how the project is progressing toward its goals and objectives, including
challenges as well as successes in managing the award. To use this data, ETA program staff
create “dashboards” that show data metrics against targets for each award recipient to track the
progress from quarter to quarter. This tool helps identify which award recipients are on track
to accomplish their overall goals and to determine which may need technical assistance or to
make course corrections.
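A minimal sketch of the progress-against-target calculation that such dashboards typically display appears below. It is illustrative only: the recipients, indicator, numbers, and the 90 percent "on track" threshold are all hypothetical and are not ETA's actual method.

```python
# Hypothetical quarterly report data: actual results versus targets.
quarterly_reports = [
    {"recipient": "Grantee A", "indicator": "Participants served",
     "actual": 180, "target": 200},
    {"recipient": "Grantee B", "indicator": "Participants served",
     "actual": 95, "target": 200},
]

# Flag recipients whose progress falls below an assumed 90% threshold.
for report in quarterly_reports:
    pct = 100 * report["actual"] / report["target"]
    status = "on track" if pct >= 90 else "may need technical assistance"
    print(f'{report["recipient"]}: {pct:.0f}% of target ({status})')
```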
III.C.3. Issuing Awards (2 CFR §200.210)
After a Federal awarding agency completes the entire review process, it notifies potential award
recipients whether they will receive an award. Federal agencies are expected to clearly articulate
performance and reporting expectations in the terms and conditions of awards. After an award
recipient receives a NOA and the Federal awarding agency disburses the award funds, the
recipient will begin to implement its sub-project.[63] The award recipient is responsible for
meeting the administrative, financial, and performance reporting requirements of the award.
III.D. Phase 4: Post-Award Management and Closeout (2 CFR 200 Subpart D)[64]
The post-award phase includes monitoring compliance and assessing award recipient
performance through reporting and the use of data analytics. Agencies monitor award recipient
progress toward project goals and objectives through programmatic reporting and analyzing
performance indicators. At the end of the project period, award recipients must also submit their
final financial and programmatic reports.
[62] 2 CFR §200.301 Performance Measurement: https://www.ecfr.gov/cgi-bin/retrieveECFR?gp=&SID=c5f8c20480c04403862fca0bcaf78a74&mc=true&n=pt2.1.200&r=PART&ty=HTML#_top
[63] Some grant funds are awarded on a reimbursable basis.
[64] 2 CFR 200 Subpart D - Post Federal Award Requirements: https://www.ecfr.gov/cgi-bin/text-idx?SID=39abcf390fdf8c8adc219f844112a18f&mc=true&tpl=/ecfrbrowse/Title02/2cfr200_main_02.tpl
III.D.1. Award Recipient Performance Monitoring and Assessment (2 CFR §200.328)
Performance monitoring and assessment involves regularly collecting and analyzing data in
order to monitor award recipient compliance as well as to track progress against proposed targets
and goals. Performance monitoring can help identify whether the sub-project is meeting interim
milestones. Performance assessments can provide insight into whether the sub-project’s goals are
being advanced or achieved.
Level 3: Sub-Project Activities
Award recipient performance monitoring and assessment takes place at level 3, sub-project,
and includes both monitoring for compliance (e.g., meeting financial and performance
expectations), and assessing the degree to which the sub-project meets its intended goals.
The assessment of an award recipient's performance begins with the review of performance data
and results during the period of performance for the award.[65] Federal employees review an award
recipient's progress and performance reports on a quarterly, semi-annual, and/or annual basis.[66]
Federal awarding agencies may collect award recipient data through various avenues. Some
Federal agencies collect data through online systems that allow award recipients to upload data
files and submit reports.[67]
Performance reports generally include both quantitative and
qualitative information. Agencies commonly create performance report handbooks for award
recipients that describe each of the data elements, how they should report the data, and the
documentation that is required.
[65] Performance management should begin as soon as data and results start being reported and should continue throughout the period of performance, since some grants last multiple years.
[66] Agencies must obtain OMB approval before collecting any data. The Paperwork Reduction Act (PRA) clearance process for data collections is a lengthy one that should begin concurrently with the program design phase so that grantees can submit their data on schedule.
[67] Although the quality and use of Federal IT systems for collecting performance data is outside the scope of this document, readers should note that many challenges exist in both implementing new systems and updating older ones.
Promising Practice - Interim Performance Report[68]
The Department of Education (ED) requires recipients of multi-year discretionary awards to
submit annual performance reports. ED uses this information to determine whether recipients
have demonstrated substantial progress toward meeting the project goals and objectives. For
example, award recipients must report on at least one performance indicator for each project
objective (and sub-project objectives as applicable). They must establish targets for each
indicator and provide actual performance data demonstrating progress towards meeting or
exceeding this target. Based on this data, they also must describe how they are making
progress toward achieving the project’s goals and objectives.
Throughout the grants lifecycle, award recipients may also receive ongoing technical assistance
and annual audits to ensure that sub-project performance data collected is accurate and valid.
This technical assistance can include handbooks, training events and tutorials, and access to
subject matter experts either in person or virtually. Federal agencies often have a data
verification and validation process for checking information after an award recipient submits
data. This process often includes information technology checks (e.g., automatically checking
that data does not fall outside of a prescribed range).
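The sketch below illustrates an automated range check of the kind described above. It is a minimal example under assumed conditions: the data element names and prescribed ranges are hypothetical, not drawn from any agency system.

```python
# Hypothetical prescribed ranges for two reported data elements.
PRESCRIBED_RANGES = {
    "participants_enrolled": (0, 5000),
    "completion_rate_pct": (0, 100),
}

def validate(submission):
    """Return descriptions of values that fall outside their prescribed range."""
    flags = []
    for element, value in submission.items():
        low, high = PRESCRIBED_RANGES.get(element, (float("-inf"), float("inf")))
        if not low <= value <= high:
            flags.append(f"{element}={value} outside [{low}, {high}]")
    return flags

print(validate({"participants_enrolled": 250, "completion_rate_pct": 104}))
# -> ['completion_rate_pct=104 outside [0, 100]']
```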
In addition, Federal awarding agencies may analyze how award recipients are performing against
expected benchmarks or targets. The terms and conditions of the award may specify that
award recipients are required to set annual targets. This information can identify
necessary course corrections in the program overall. Analyzing performance data is also useful
for informing technical assistance efforts. This includes not only technical assistance for
performance management but also programmatic technical assistance to address common
challenges that award recipients may be experiencing with implementation of the project. If the
performance data shows that recipients are missing targets, such as lower than expected
enrollments in their specified activities, this may be an indication that further analysis is
warranted to determine if subpar performance is a result of the program or project design or if
there is a need to provide award recipients with technical assistance.
There are several tools and resources that Federal employees may use in tracking award recipient
performance. Some employees use “dashboards” that can show data metrics for each award
recipient during each reporting period, making it possible to look across awards and spot potential trouble.
Graphs, charts, infographics, and other visualizations can be useful in interpreting performance
data for sub-projects (level 3). Collecting and analyzing aggregate data can also demonstrate the
effectiveness of technical assistance interventions. These types of tools can help look at award
[68] Department of Education Instructions for Grant Performance Report: https://www2.ed.gov/fund/grant/apply/appforms/ed524b_instructions.pdf
recipient performance over the period of performance to see emerging trends and indicate
whether an award recipient needs technical assistance. For example, perhaps one metric was low
for one reporting period across all awards; however, recipients received technical assistance and
the metric improved by the subsequent reporting period. One suggested practice is for Federal
awarding agencies to upload technical assistance information, including tools and reports, on a
public website for recipients to access on an ongoing basis. It is recommended that these tools
and reports be updated periodically to remain current and relevant.
Promising Practice - Instructing Recipients on How to Write Performance Reports
The National Aeronautics and Space Administration’s Established Program to Stimulate
Competitive Research (EPSCoR) provides instructions to award recipients on how to write
progress and performance reports. The goal of the program is to promote research
opportunities that support NASA’s research, science, and technology priorities. EPSCoR’s
progress reports include:
1. The major goals and objectives of the project
2. Significant accomplishments under these goals
3. How results have been disseminated to interested communities and the public
4. Products such as publications, papers, websites, inventions, patents and technologies
5. Collaboration with other researchers, industries, agencies and organizations
6. How the investment in the project has beneficially impacted the public
7. Significant changes from the original proposal
III.D.2. Award Closeout (2 CFR §§200.343, 200.344)
As award recipients close out their individual sub-projects, Federal awarding agencies have an
additional opportunity to confirm and review final financial and programmatic performance data.
Closeout is a process by which a Federal awarding agency determines that all applicable
administrative actions and all required work of the award have been completed by the award
recipient. During the “closeout” period, recipients must submit all reports required under the
award within 90 days after the award expires or is terminated.
69
Proposed revisions to 2 CFR 200
include an extension from 90 days to 120 days for recipients to submit closeout reports and
liquidate financial obligations.
70
[69] 2 CFR §200.343 Closeout: https://www.ecfr.gov/cgi-bin/text-idx?SID=39abcf390fdf8c8adc219f844112a18f&mc=true&tpl=/ecfrbrowse/Title02/2cfr200_main_02.tpl
[70] Proposed Revisions to 2 CFR, Federal Register Notice: https://www.federalregister.gov/documents/2020/01/22/2019-28524/guidance-for-grants-and-agreements
Federal employees may close out an award after the following administrative actions and required
work of the grant have been completed (see the sketch after this list):
a. The grant period has expired.
b. All approved extensions have expired.
c. There are no funds remaining in the account, or there are no issues related to the funds
remaining in the account.
d. All performance and financial reports and data required by the terms and conditions of
the award have been received and accepted by Federal employees who determine all
programmatic requirements for the grant have been met.
e. All identified programmatic or financial issues/findings have been resolved, including
special conditions, high risk, and monitoring findings of noncompliance.
f. The Single Audit, if required during the period of performance, is completed, all audit
findings (including from Federal audits) are resolved, corrective actions are successfully
completed, and amounts due back have been paid or an approved payment plan has been
established.
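Because each condition above is a yes-or-no determination, the checklist lends itself to a simple structured check. The sketch below is one hypothetical way to encode it; the field names and the example award are invented and are not from any agency system.

```python
# Hypothetical encoding of the closeout conditions (a) through (f).
CLOSEOUT_CONDITIONS = {
    "grant period expired": "period_expired",
    "approved extensions expired": "extensions_expired",
    "remaining funds resolved": "funds_resolved",
    "reports received and accepted": "reports_accepted",
    "issues and findings resolved": "findings_resolved",
    "single audit completed and resolved": "audit_resolved",
}

def unmet_conditions(award):
    """Return the closeout conditions the award has not yet satisfied."""
    return [name for name, field in CLOSEOUT_CONDITIONS.items()
            if not award.get(field, False)]

award = {"period_expired": True, "extensions_expired": True,
         "funds_resolved": True, "reports_accepted": False,
         "findings_resolved": True, "audit_resolved": True}
print(unmet_conditions(award))  # -> ['reports received and accepted']
```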
If the award recipient does not complete associated administrative actions and required work, the
award may be closed out in noncompliance based on each agency’s established factors and
circumstances. Closing an award in noncompliance will be part of the award recipient’s record
with the Federal awarding agency and could impact future funding opportunities. Therefore,
everything should be done to make certain the award is closed in compliance. However, non-
compliant status is not a reason to avoid closing out an award.
Promising Practice - Performance Progress Reports[71]
During the closeout period, the National Institutes of Health (NIH) requires award recipients to
submit final Research Performance Progress Reports (RPPR). In the final RPPR, NIH asks
award recipients to discuss their accomplishments towards the goal of the sub-project,
manuscripts and publications produced, personnel who have worked on the project, changes to
level of effort of key personnel on the project, actual or planned challenges or delays in the
projects and plans for resolving them, significant changes regarding human or animal subjects,
and enrollment reports for clinical studies. NIH asks award recipients to provide information
on both accomplishments and products produced during the award period.
1. Accomplishments
• What were the major goals and objectives of the sub-project?
• What was accomplished under these goals?
• What opportunities for training and professional development did the sub-project provide?
• How were the results disseminated to communities of interest?
2. Products
• Publications, conference papers, and presentations
• Website(s) or other internet site(s)
• Technologies or techniques
• Inventions, patent applications, and/or licenses
• Other products, such as data or databases, physical collections, audio or video products, software, models, educational aids or curricula, instruments or equipment, research material, interventions (e.g., clinical or educational), or new business creation.
III.E. Phase 5: Program Oversight
The program oversight phase includes analyzing performance data, writing reports,
disseminating lessons learned, and finalizing program or project evaluations.
[71] The Department of Health and Human Services (HHS), National Institutes of Health (NIH) Research Performance Progress Report: https://grants.nih.gov/grants/post-award-monitoring-and-reporting.htm
III.E.1 Analysis of Program and Project Results
Agencies examine all performance data for an entire program and its related projects after the
period of performance for a project funded through a NOFO has ended. This step focuses on
reviewing both project level and the larger program’s outcomes as well as documenting
promising practices learned. Federal employees review overall program performance to assess
the degree to which programs and their related projects advanced or achieved their goals over the
course of the award.
The first step in this analysis is to review all of the data from the program's and/or project's
performance indicators. Four types of data analysis can be helpful for interpreting performance
data (a brief illustrative sketch follows this list):
• Descriptive analysis answers "what happened?"
Example: 5,000 people were screened for HIV and 95 percent of those that tested positive were enrolled in treatment.
• Diagnostic analysis answers "why did it happen?"
Example: Using data mining techniques, the researcher discovered that providers in certain geographical locations had unusual patterns of medical claims.
• Predictive analysis answers "what is likely to happen?"
Example: Using data mining techniques, the researcher was able to predict which people entering the clinic needed behavioral health services.
• Prescriptive analysis answers "what action should be taken?"
Example: Using artificial intelligence, the data analyst evaluated the cost-effectiveness of alternative recidivism programs.
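As a concrete illustration of the descriptive-analysis step, the sketch below recomputes the 95 percent figure from the screening example; the underlying counts are hypothetical, chosen only to match the example.

```python
# Descriptive analysis ("what happened?") from hypothetical screening counts.
screened = 5000
tested_positive = 400          # hypothetical count of positive tests
enrolled_in_treatment = 380    # hypothetical count enrolled in treatment

enrollment_rate = 100 * enrolled_in_treatment / tested_positive
print(f"{screened} people were screened; {enrollment_rate:.0f}% of those who "
      f"tested positive were enrolled in treatment")
```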
Agencies may present performance results in several ways depending on the intended audience,
including in graphs, tables and statistical comparisons, as well as in dashboards, spreadsheets,
and formal reports. Graphs, charts, infographics, and other types of visualization tools are
extremely useful for assessing results for projects as well as larger programs (Levels 2 and 1). To
assess project goals, Federal agencies examine a “cohort of sub-projects,” meaning that the
analysis assesses all of the sub-projects that were awarded under a single project. Similarly, they
also may examine all of the projects together that were initiated under a larger program.
Although the example below in Figure 5 is based on court statistics, it provides an example of
how project and program data could be shown in order to see larger trends.
Promising Practice - Analysis of Program and Project Results
Figure 5: Department of Justice National Center for State Courts Data Visualization[72]
Measuring and analyzing the outcomes of projects and programs raises many policy
questions, such as:
• Was the program successful in meeting performance targets?
• What was the effect on society resulting from the program?
• Can the performance outcomes of the program or project influence goals and objectives for future projects, programs, and NOFOs?
• Did the results provide adequate value relative to the costs?
• How will these results affect the project and/or program's future budget requests?
• How can Federal awarding agencies use performance data in decision-making?
Federal awarding agencies should use the results of their data analysis to make future decisions
in the continuation and/or refinement of projects and larger programs. Ultimately, the end goal is
[72] Department of Justice National Center for State Courts Data Visualization: https://public.tableau.com/profile/ncscviz#!/vizhome/JuvenileDependencyversusDelinquencyCaseload/Story
to have an effective program that achieves its objectives and captures lessons learned to
communicate a program’s success and outcomes. Agencies should: 1) analyze performance data
and use that information for future decision-making; and 2) make data transparent to internal and
external stakeholders and share performance results on their websites.
III.E.2. Dissemination of Lessons Learned
Federal awarding agencies are responsible for developing lessons learned and are encouraged to
disseminate them once developed.
One example of this is the Centers for Disease Control and Prevention's (CDC) Unified Process
Practice Guide on Lessons Learned.[73] According to the CDC guide, lessons learned are the
knowledge gained during or at the end of a single award or a cohort of sub-projects. The purpose
of practicing lessons learned is to share knowledge to: 1) promote good outcomes; and 2) prevent
undesirable outcomes. Each documented lesson learned should have at least these elements:
1. Project information, including goals, objectives, and results.
2. A clear description of the lesson.
3. A background summary describing how the lesson was learned.
4. Benefits of using the lesson and how the lesson may be used in the future.
There are many different ways to disseminate lessons learned. One example of the dissemination
of lessons learned is the Department of Education's "What Works Clearinghouse," designed to
"review the existing research on different programs, products, practices and policies in
education."[74] The goal of this clearinghouse is to provide educators with the information that
they need to make evidence-based decisions.
III.E.3. Program Evaluation
Agencies also measure the success of programs and projects by conducting evaluations.
Evaluations can play an important role in performance management because they can provide an analysis of how a single recipient or a cohort of sub-projects has performed over the entire period of performance rather than at specific points in time. Program evaluations are systematic
studies to assess how well a program advances or achieves its intended results or outcomes.
Evaluations can help policy-makers and agency managers strengthen the design and operation of
a program and can help determine how best to spend taxpayer dollars effectively.
Evaluations should address questions related to the overall performance of a program or project,
the effectiveness of particular project strategies, and/or factors that relate to variability in
program or project effectiveness. Evaluations can also examine questions related to measurement
of progress, such as the reliability of performance data, identifying appropriate goals or targets
for performance, and understanding factors surrounding a program.
[73] CDC Lessons Learned Practices Guide: https://www2a.cdc.gov/cdcup/library/practices_guides/CDC_UP_Lessons_Learned_Practices_Guide.pdf
[74] For more information on the ED "What Works Clearinghouse" visit the following website: https://ies.ed.gov/ncee/wwc/
Program evaluation begins with a discussion during the program design step (See Section IV.A.
Program Design) about when a program might undergo an evaluation. This involves thinking
through program evaluation questions regarding the program logic model and focusing on
questions that have real value for stakeholders and decision-makers.
Additionally, appropriate and measurable indicators and their data collection criteria developed
during the program design phase are key to a sound evaluation. Early identification of indicators
allows the program team to learn what baseline data already may be available to help evaluate
the project, or to design a process to collect baseline data before initiating or implementing the
program. The logic model is useful for identifying the elements of the program most likely to yield useful evaluation data and for identifying an appropriate sequence for collecting data and measuring progress.
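To illustrate why baseline data matters, the short sketch below compares a hypothetical indicator before and after implementation. The values are invented for illustration; a real evaluation would use a stronger design (for example, a comparison group) and appropriate statistical tests.

    # Hypothetical indicator values collected before the program (baseline)
    # and after implementation for the same sites.
    baseline = [14.2, 13.8, 14.5, 14.1]
    post = [12.0, 11.7, 12.4, 11.9]

    change = sum(post) / len(post) - sum(baseline) / len(baseline)
    print(f"Average change in indicator: {change:+.2f}")

Without the baseline values, there is nothing to compute the change against, which is why early identification of indicators and baseline collection belong in the program design phase.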
III.E.4. Federal Evidence Building
There are many ongoing efforts across the Federal Government to improve the way that agencies
build and use evidence in decision-making. The Foundations for Evidence-Based Policymaking
Act of 2018 (Evidence Act) offers a new set of tools that can help agencies better build and use
evidence and data to support decision-making and more efficient and effective execution of their
missions and operations. Agencies with award-making responsibilities should consider promising performance management practices in their efforts to establish data governance, learning agendas, evaluation plans, and capacity assessments. Please refer to OMB M-19-23 and M-20-12 for more information on these requirements. [75]
The Evidence Act and OMB implementation guidance reinforce existing Federal policies and procedures and also create a new paradigm that calls on agencies to significantly rethink how they plan and organize evidence-building, data management, and data access functions to ensure an integrated and direct connection to data and evidence needs. Taken together with the President's Management Agenda, specifically the Federal Data Strategy and the Grants CAP Goal, these collective efforts should improve how Federal agencies obtain the data and evidence necessary to make critical decisions about program operations, policy, and regulations. [76] These efforts also aim to give agencies visibility into how resource allocation affects the achievement of mission objectives.
[75] See the Office of Management and Budget Memorandum M-19-23, Phase 1 Implementation of the Foundations for Evidence-Based Policymaking Act of 2018: Learning Agendas, Personnel, and Planning Guidance (2019): https://www.whitehouse.gov/wp-content/uploads/2019/07/M-19-23.pdf, and Memorandum M-20-12, Phase 4 Implementation of the Foundations for Evidence-Based Policymaking Act of 2018: Program Evaluation Standards and Practices: https://www.whitehouse.gov/wp-content/uploads/2020/03/M-20-12.pdf
[76] Federal Data Strategy Website: https://strategy.data.gov/
IV. Maintaining a Results-Oriented Culture
As discussed at the beginning of the PM Playbook, the purpose of the Grants CAP Goal is to "maximize the value of grant funding by applying a risk-based, data-driven framework that balances compliance requirements with demonstrating successful results for the American taxpayer." [77] A major strategy for achieving this goal is to "hold recipients accountable for good performance practices that support achievement of program goals and objectives." [78]
As the PM
Playbook has highlighted, Federal awarding agencies can manage for results by: 1)
understanding what makes federal award programs and their projects successful; and 2)
establishing and/or maintaining a “results-oriented” culture in their organization.
Agencies measure success by examining evidence, such as performance indicators,
administrative data, survey results, scientific findings, descriptive research studies, and
evaluation results. Federal awarding agencies support this focus on results through the creation
of learning agendas and the use of risk-based, data-informed decision-making.
1. Learning agendas: A learning agenda is a plan for identifying and answering policy questions about programs and other items, and includes information on how data will be collected and analyzed to support the use of evidence in decision-making. [79] Federal awarding agencies such as DOL, HUD, and USAID have previously used learning agendas to focus on evaluating evidence. [80] Agency-wide learning agendas that align and are submitted with agency strategic plans are now required under the Evidence Act.
[77] The President's Management Agenda, Results-Oriented Accountability for Grants website: https://www.performance.gov/CAP/grants/
[78] The President's Management Agenda, Results-Oriented Accountability for Grants website: https://www.performance.gov/CAP/grants/
[79] A learning agenda is equivalent to the agency evidence-building plan required in Section 101 of the Evidence Act, 5 U.S.C. § 312(a).
[80] United States Agency for International Development (USAID) Learning Agenda Approach Document: https://usaidlearninglab.org/sites/default/files/resource/files/defining_a_learning_agenda.pdf
Promising Practices - Implementing a Learning Agenda [81]
The United States Agency for International Development (USAID) developed a resource guide
for implementing a learning agenda approach. USAID defines a learning agenda as “a set of
broad questions directly related to the work that an agency conducts that, when answered,
enables the agency to work more effectively and efficiently, particularly pertaining to
evaluation, evidence, and decision-making.” Learning agendas help establish a results-oriented
culture by:
Identifying and prioritizing the questions that need to be answered to improve program
effectiveness and build evidence;
Answering questions with appropriate tools and methods;
Implementing studies and analyses based on the strongest available methods;
Involving key stakeholders;
Acting on the results of what is learned; and
Disseminating findings for program improvement.
A strong learning agenda approach:
Maximizes results by helping agency and implementing partners learn more quickly
and make iterative, timely course corrections;
Reinforces the strategic direction of agency programs and policies by including
learning in all parts of program design and implementation;
Adapts as evidence and context shifts;
Helps the agency, implementing partners, and others identify and focus on priorities to
maintain and strengthen strategic direction;
Remains flexible. Although the learning agenda may be formally updated on a
particular timeline (e.g. once a year), it should not unnecessarily bind agencies or
discourage new ideas and updates.
Accommodates short- and longer-term priorities and intentionally builds evidence over time towards strategic objectives.
[81] United States Agency for International Development (USAID) Learning Agenda Approach Document: https://usaidlearninglab.org/sites/default/files/resource/files/defining_a_learning_agenda.pdf
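As a simple illustration of how such an agenda might be tracked, the sketch below stores each learning-agenda entry as structured data pairing a priority question with the method and evidence source that will answer it. The structure and the example questions are assumptions by the editors, not USAID's format.

    # Each entry: a priority question, how it will be answered, and the
    # decision it informs (all entries below are hypothetical).
    learning_agenda = [
        {
            "question": "Do shorter reporting cycles improve recipient performance?",
            "method": "comparative analysis of quarterly vs. annual reporters",
            "evidence_source": "recipient performance reports",
            "decision_informed": "reporting requirements in future NOFOs",
        },
        {
            "question": "Which technical assistance model best supports new recipients?",
            "method": "program evaluation across assistance models",
            "evidence_source": "site visits and recipient survey results",
            "decision_informed": "design of the next program cycle",
        },
    ]

    # Review: surface each question alongside the decision it informs.
    for item in learning_agenda:
        print(f"{item['question']} -> informs {item['decision_informed']}")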
2. Risk-based, data-informed decision-making: A risk-based, data-informed decision-making approach begins with collecting "evidence" (such as performance indicators and findings from research studies and program evaluations) and ends with using this data, along with an assessment of risk, to improve program implementation. This type of decision-making focuses on using a risk scale to help grant managers and others assess the impact of changing program and/or project implementation based on available evidence. Program- and project-level risks can be assessed by creating three types of scales (see below, and the scoring sketch after the scales).
1) Impact Scale: To assess the level of impact of a decision.
Minor: Requires little to no change in how the program operates
Moderate: Requires a change that would alter program operations in a generally positive direction
Extreme: Requires a risky change where the outcome on program operations is uncertain
2) Frequency Scale: To assess the probability that a change will have a negative impact.
Rare: Less than 20% chance of a negative impact from the change
Likely: At least 20% and less than 80% chance of a negative impact from the change
Frequent: 80% or greater chance of a negative impact from the change
3) Vulnerability Scale: To assess the level of preparation for a negative change.
Low: Program is ready to implement the change
Medium: Program needs time to implement the change
High: Program would have to be completely altered to implement the change
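The sketch below, forward-referenced above, combines the three scales into a single screening score. The numeric weights and the multiplication are assumptions by the editors for illustration; the Playbook does not prescribe a scoring formula.

    # Map each qualitative rating to a 1-3 weight (assumed, not prescribed).
    IMPACT = {"minor": 1, "moderate": 2, "extreme": 3}
    FREQUENCY = {"rare": 1, "likely": 2, "frequent": 3}   # <20%, 20-80%, >=80%
    VULNERABILITY = {"low": 1, "medium": 2, "high": 3}

    def risk_score(impact: str, frequency: str, vulnerability: str) -> int:
        """Score a proposed program/project change; higher means riskier."""
        return IMPACT[impact] * FREQUENCY[frequency] * VULNERABILITY[vulnerability]

    # Example: an extreme-impact change that is rare but hard to prepare for.
    print(risk_score("extreme", "rare", "high"))  # 3 * 1 * 3 = 9 (of 27)

A composite score of this kind is only a screening aid; grant managers would still weigh the underlying evidence behind each rating.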
The President's Management Agenda Grants CAP Goal includes a strategy dedicated to "managing risk." The goal of this strategy is to leverage data, including data produced by annual audits, to assess and manage recipient risk. The potential result of this strategy is a comprehensive risk management tool for government-wide use.
Federal awarding agencies are encouraged to either get involved with shaping this government-
wide strategy or periodically view the progress of this strategy on performance.gov for the latest
information.
Promising Practice - Create a Risk Appetite Statement [82]
United States Agency for International Development (USAID) issued a "Risk Appetite Statement" to provide agency employees with guidance on the amount and type of risk the agency will accept to achieve strategic plan goals and objectives. USAID defined seven categories of risk, including programmatic risk, that could undermine program effectiveness and results. USAID also created a three-level risk scale, from low to high, for many agency functions, such as programmatic (high appetite) versus fiduciary (low appetite):
Low Risk Appetite: Risk is avoided or minimized
Medium Risk Appetite: Balancing between the potential benefits and costs
High Risk Appetite: Disciplined risk-taking where potential benefits outweigh costs
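A risk appetite statement can be applied mechanically, as in the minimal sketch below: map each function to its stated appetite and flag activities whose assessed risk exceeds it. The mappings and threshold logic are the editors' assumptions, not USAID's method.

    # Appetite by function, per the example above; levels are assumed.
    APPETITE = {"programmatic": "high", "fiduciary": "low"}
    LEVEL = {"low": 1, "medium": 2, "high": 3}

    def within_appetite(function: str, assessed_risk: str) -> bool:
        """True if an activity's assessed risk stays within stated appetite."""
        return LEVEL[assessed_risk] <= LEVEL[APPETITE[function]]

    print(within_appetite("programmatic", "medium"))  # True: high appetite
    print(within_appetite("fiduciary", "medium"))     # False: exceeds low appetite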
Federal awarding agencies may use different types of procedures to create and maintain a results-
oriented culture that highlights performance results and outcomes within their organizations.
These include:
1. Leadership champions: Leadership support is essential to successful cultural change. The
change process needs committed leaders at different levels of the organization to support
the creation of new ways of doing business.
2. Federal awarding agency performance management frameworks: A performance
management framework tailored to the mission and needs of an agency can help
communicate the who, what, and how of culture change. For example, who will be
impacted, what is the impact of change, and how will the change be accomplished?
3. Data-driven reviews of performance and progress: A critical aspect of performance
management is reviewing program and project data and conducting regular assessments
on their level of success in meeting program goals and project objectives. Grant managers
and others can use the results of these program and project reviews to make changes to
programs and future projects (in NOFOs).
4. Standard operating procedures: Agencies should create and maintain standard operating
procedures (SOPs) for employees to follow. SOPs assist in teaching employees how to
best conduct performance management practices.
5. Performance management manuals or toolkits: As with SOPs, agencies can codify their performance management policies and practices in manuals and can help teach employees how to implement these policies and practices through toolkits. Both manuals and toolkits are effective communication tools.
[82] United States Agency for International Development (USAID) Risk Appetite Statement, June 2018: https://www.usaid.gov/sites/default/files/documents/1868/USAID_Risk-Appetite-Statement_Jun2018.pdf
6. Training for internal and external stakeholders: Training is an essential aspect of culture change. Both Federal employees and current and potential recipients will need to understand why and how change is taking place.
V. Conclusion
While considering a results-oriented culture for Federal award making, Federal awarding
agencies are encouraged to begin to make a paradigm shift in grants management from one
heavy on compliance to an approach focused more on performance that includes establishing
measurable program and project goals and analyzing data to improve results. The development
of a results-oriented culture requires change management and will occur over time. In the future,
program impact will be assessed across the Federal government and taxpayers will have a clear
picture of the impact of Federal dollars spent on programs. To assess program impact, agencies
must establish clear program and project goals and objectives, and measure project and
individual award recipient progress against them. The PM Playbook was developed to assist
Federal awarding agencies with improving their assessment of program impact.
Several multifaceted strategies have been developed to achieve the purpose of the Grants CAP Goal to "maximize the value of grant funding by applying a risk-based, data-driven framework that balances compliance requirements with demonstrating successful results." [83] The performance strategy to "achieve program goals and objectives" is only one part of the equation. The elements of performance, risk, and compliance all fit together to achieve this goal, and the PM Playbook describes each of these elements throughout. Risk and compliance activities support performance and evidence-based decision-making.
[83] The President's Management Agenda, Results-Oriented Accountability for Grants website: https://www.performance.gov/CAP/grants/
Figure 6: Achieving a balanced approach to making a Federal award [84]
This is the beginning of a dialogue on this topic and future versions of the PM Playbook will
encompass a larger scope and broader stakeholder group to help shape this area and possible
revisions to 2 CFR.
[84] Figure developed by authors of the PM Playbook.
VI. Appendices
Appendix A. Glossary of Terms
Applicant [85]: A person or entity that submits an application for Federal financial assistance.
Application [86]: The mechanism that an applicant uses to submit their credentials to obtain Federal financial assistance funding.
Assistance Listing [87]: The publicly available listing of Federal assistance programs managed and administered by the General Services Administration. Formerly known as the Catalog of Federal Domestic Assistance (CFDA).
Assistance Listing Level [88]: This playbook uses the phrase "assistance listing level" to refer to Federal assistance programs during programmatic performance management activities in order to distinguish between a program and a project.
Assistance Listing Number [89]: A unique number assigned to identify a Federal assistance listing. Formerly known as the CFDA Number.
Compliance [90]: Compliance means meeting the obligations associated with accepting a Federal award. This includes making sure that award funds are spent in accordance with applicable statutes, requirements in the Notice of Funding Opportunity (NOFO), specifications in the notice of award and the award recipient budget submission, and specific agency policies.
Cooperative Agreement [91]: A legal instrument of financial assistance between a Federal awarding agency or pass-through entity and a non-Federal entity that, consistent with 31 U.S.C. 6302-6305:
a) Is used to enter into a relationship the principal purpose of which is to transfer
anything of value from the Federal awarding agency or pass-through entity to the
non-Federal entity to carry out a public purpose authorized by a law of the United
States (see 31 U.S.C. 6101(3)); and not to acquire property or services for the Federal
Government or pass-through entity's direct benefit or use;
b) Is distinguished from a grant in that it provides for substantial involvement between
the Federal awarding agency or pass-through entity and the non-Federal entity in
carrying out the activity contemplated by the Federal award.
c) The term does not include:
1. A cooperative research and development agreement as defined in 15 U.S.C.
3710a; or
2. An agreement that provides only:
i. Direct United States Government cash assistance to an individual;
ii. A subsidy;
iii. A loan;
iv. A loan guarantee; or
v. Insurance.
[85] Developed by authors of the PM Playbook.
[86] Developed by authors of the PM Playbook.
[87] 2 CFR 200 Definitions: https://www.ecfr.gov/cgi-bin/retrieveECFR?gp=&SID=7936a2f1db857b8d4e3244a0555d11fb&mc=true&n=pt2.1.200&r=PART&ty=HTML
[88] Developed by authors of the PM Playbook.
[89] 2 CFR 200 Definitions: https://www.ecfr.gov/cgi-bin/retrieveECFR?gp=&SID=7936a2f1db857b8d4e3244a0555d11fb&mc=true&n=pt2.1.200&r=PART&ty=HTML
[90] Defined by authors of the PM Playbook.
[91] 2 CFR 200 Definitions: https://www.ecfr.gov/cgi-bin/retrieveECFR?gp=&SID=7936a2f1db857b8d4e3244a0555d11fb&mc=true&n=pt2.1.200&r=PART&ty=HTML
Goal, Performance [92]: A statement of the level of performance to be accomplished within a
timeframe, expressed as a tangible, measurable objective or as a quantitative standard, value, or
rate. For the purposes of this guidance and implementation of the GPRA Modernization Act, a
performance goal includes a performance indicator, a target, and a time period. The GPRA
Modernization Act requires performance goals to be expressed in an objective, quantifiable, and
measurable form unless agencies in consultation with OMB determine that it is not feasible. In
such cases an “alternative form” performance goal may be used. The requirement for OMB
approval of an alternative form goal applies to performance goals only. Milestones are often used
as the basis of an alternative form performance goal. Performance goals specified in alternative
form must be described in a way that makes it possible to discern if progress is being made
toward the goal.
Indicator [93]: A measurable value that indicates the state or level of something.
Various types of indicators (e.g. outcome, output, customer service, process, efficiency) may be
used as either performance indicators or other indicators. Agencies are encouraged to use
outcome indicators as performance indicators where feasible and appropriate. Agencies also are encouraged to consider whether indicators have been validated through research as being well correlated with what they are intended to measure. Some examples include the following:
Indicator, Input. A type of measure that indicates the consumption of resources, especially time and/or money.
[92] OMB Circular A-11 (2019 version) Part 6, Section 200: https://www.whitehouse.gov/wp-content/uploads/2018/06/a11.pdf
[93] OMB Circular A-11 (2019 version) Part 6, Section 200: https://www.whitehouse.gov/wp-content/uploads/2018/06/a11.pdf
Indicator, Outcome. A type of measure that indicates progress against achieving the
intended result of a program. Indicates changes in conditions that the government is
trying to influence.
Indicator, Customer Service. A type of measure that indicates or informs the improvement of the government's interaction with those it serves or regulates.
A performance indicator is the indicator for a performance goal or within an agency Priority
Goal statement that will be used to track progress toward a goal or target within a timeframe. By
definition, the indicators for which agencies set targets with timeframes are performance
indicators.
Internal Controls [94]:
(a) Internal controls for non-Federal entities means processes designed and implemented by
non-Federal entities to provide reasonable assurance regarding the achievement of objectives
in the following categories:
(1) Effectiveness and efficiency of operations;
(2) Reliability of reporting for internal and external use; and
(3) Compliance with applicable laws and regulations.
(b) Internal controls Federal awarding agencies are required to follow are located in OMB
Circular A-123.
Non-Federal Entity [95]: A state, local government, Indian tribe, Institution of Higher Education (IHE), or nonprofit organization that carries out a Federal award as a recipient or subrecipient.
Notice of Funding Opportunity (NOFO), Funding Opportunity Announcement (FOA), Notice of Funding Announcement (NOFA) [96]: The publicly available document that contains all the official information about a Federal grant. This is how a Federal awarding agency announces the availability of a grant and provides instructions on how to apply for that grant.
Output [97]: Quantity of products or services delivered by a program, such as the number of inspections completed or the number of people trained.
[94] 2 CFR 200 Definitions: https://www.ecfr.gov/cgi-bin/retrieveECFR?gp=&SID=7936a2f1db857b8d4e3244a0555d11fb&mc=true&n=pt2.1.200&r=PART&ty=HTML
[95] 2 CFR 200 Definitions: https://www.ecfr.gov/cgi-bin/retrieveECFR?gp=&SID=7936a2f1db857b8d4e3244a0555d11fb&mc=true&n=pt2.1.200&r=PART&ty=HTML
[96] The Grants.gov definition for an award solicitation: https://blog.grants.gov/2017/10/18/what-is-a-funding-opportunity-announcement/
[97] OMB Circular A-11 (2019 version) Part 6, Section 200: https://www.whitehouse.gov/wp-content/uploads/2018/06/a11.pdf
Outcome [98]: The desired results of a program. For example, an outcome of a nationwide
program aimed to prevent the transmission of HIV infection might be a lower rate of new HIV
infections in the U.S. Agencies are strongly encouraged to set outcome-focused performance
goals to ensure they apply the full range of tools at their disposal to improve outcomes and find
lower cost ways to deliver. However, there are circumstances where the effects of a program on
final outcomes are so small and confounded with other factors that it may be more appropriate to
base performance goals on indicators or intermediate outcomes. Ideally, those indicators and
intermediate outcomes should have strong theoretical and empirical ties to final outcomes.
Performance [99]: Performance means the measurement and analysis of outcomes and results, which generates reliable data on the effectiveness and efficiency of a project and/or program.
Performance Management [100]: Use of goals, measurement, evaluation, analysis, and data-driven
reviews to improve results of programs and the effectiveness and efficiency of agency
operations. Performance management activities often consist of planning, goal setting,
measuring, analyzing, reviewing, identifying performance improvement actions, reporting,
implementing, and evaluating. The primary purpose of performance management is to improve
performance and then to find lower cost ways to deliver effective programs.
Program [101]: Generally, an organized set of activities directed toward a common purpose or goal
that an agency undertakes or proposes to carry out its responsibilities. Within this broad
definition, agencies and their stakeholders currently use the term “program” in different ways.
Agencies have widely varying missions and achieve these missions through different
programmatic approaches, so differences in the use of the term “program” are legitimate and
meaningful. For this reason, OMB does not prescribe a superseding definition of “program”;
rather, consistent with the GPRA Modernization Act, agencies may identify programs consistent
with the manner in which the agency uses programs to interact with key stakeholders and to
execute its mission.
Program Activity [102]: Program activity means the principal program activity listed in the
program and financing schedules of the annual budget of the United States Government (see 31
USC 1115 (h)). Note, program activities do not necessarily match “programs” as defined in the
Government Performance and Results Act (GPRA), the GPRA Modernization Act, or the
Catalog of Federal Domestic Assistance.
[98] OMB Circular A-11 (2019 version) Part 6, Section 200: https://www.whitehouse.gov/wp-content/uploads/2018/06/a11.pdf
[99] Defined by authors of the PM Playbook.
[100] OMB Circular A-11 (2019 version) Part 6, Section 200: https://www.whitehouse.gov/wp-content/uploads/2018/06/a11.pdf
[101] OMB Circular A-11 (2019 version) Part 6, Section 200: https://www.whitehouse.gov/wp-content/uploads/2018/06/a11.pdf
[102] Defined by authors of the PM Playbook.
Program Inventory [103]: Program inventory refers to the Government Performance and Results
Act Modernization Act of 2010 (GPRAMA) requirement that OMB create and maintain an
inventory of federal programs. This inventory typically is a list of agency programs that align
directly with the program activities listed in the annual budget (or congressional justification).
Project [104]: A temporary endeavor to create a unique product or service with a start date, a completion date, and a defined scope. Projects are executed in a manner that improves the efficient and effective implementation of programs and contributes to or aligns with agency goals and objectives.
Recipient [105]: A non-Federal entity that receives a Federal award directly from a Federal awarding agency. The term recipient does not include subrecipients or an individual that is a beneficiary of the award.
Risk Assessment [106]: An evaluation process used to: 1) determine if prior areas of concern regarding an award recipient that were identified in (a) previous year(s) have been resolved; 2) identify persistent or long-standing areas of concern that have the potential to result in a finding; and 3) identify new area(s) of concern that may need to be addressed to ensure successful administration of a grant award.
Risk Management [107]: Risk management refers to the process of assessing, managing, and mitigating risk that may occur in agency activities.
Sub-project [108]: NOFOs may use terms like "initiative," "program," or "project" to refer to the activities that an award recipient plans to accomplish with their award. To avoid confusion, this playbook uses the phrase "sub-project" to refer to these activities. As noted previously, the playbook uses the term "project" to refer to the activities specified in a NOFO. See Figure 1: DOJ SCA Example: Program, Project, Funding Vehicle and Recipient Relationships.
Target [109]: Quantifiable or otherwise measurable characteristic, typically expressed as a number, that tells how well or at what level an agency or one of its components aspires to perform. In
setting and communicating targets, where available, agencies should include the baseline value from which the target change is calculated.
[103] Defined by authors of the PM Playbook.
[104] OMB Circular A-11 (2019 version) Part 6, Section 200: https://www.whitehouse.gov/wp-content/uploads/2018/06/a11.pdf
[105] 2 CFR 200 Definitions: https://www.ecfr.gov/cgi-bin/retrieveECFR?gp=&SID=7936a2f1db857b8d4e3244a0555d11fb&mc=true&n=pt2.1.200&r=PART&ty=HTML
[106] Defined by authors of the PM Playbook.
[107] Defined by authors of the PM Playbook.
[108] Defined by authors of the PM Playbook.
[109] OMB Circular A-11 (2019 version) Part 6, Section 200: https://www.whitehouse.gov/wp-content/uploads/2018/06/a11.pdf
Appendix B. Key Stakeholders
Agency COOs [110]: Deputy Secretaries or equivalent, who provide organizational leadership to improve performance.
Agency CXOs [111]: Executives who lead agency management functions, such as the Chief
Financial Officer (CFO), Chief Human Capital Officer (CHCO), Chief Acquisition Officer
(CAO), Chief Information Officer (CIO), and Chief Data Officer (CDO). Executives leading
these management functions work closely with the PIO, agency head and COO to ensure that
mission support resources are effectively and efficiently aligned and deployed to achieve the
agency mission. This includes such activities as routinely leading efforts to set goals, make
results transparent, review progress, and make course corrections as needed to ensure that the
agency’s management functions are effective in supporting agency goals and objectives.
Agency PIOs [112]: Report directly to the COO and are responsible for supporting the agency head and COO in leading efforts to set goals, make results transparent, review progress, and make course corrections.
Chief Financial Officers Council (CFOC) [113]: The Council was established pursuant to the Chief Financial Officers (CFO) Act of 1990 (Public Law 101-576). It is an organization of the CFOs and Deputy CFOs of the largest Federal agencies, senior officials of the Office of Management and Budget, and the Department of the Treasury who work collaboratively to improve financial management in the U.S. Government. The management of grants is conducted under different offices of Federal awarding agencies throughout the Federal government; when the CFO Act was enacted, OMB decided that grants management would be conducted under the CFOC.
Chief Risk Officers (CRO) [114]: Senior agency officials, or those leading an equivalent function, who champion agency-wide efforts to manage risk within the agency and advise senior leaders on the strategically-aligned portfolio view of risks at the agency.
[110] OMB Circular A-11 (2019 version) Part 6, Executive Summary: https://www.whitehouse.gov/wp-content/uploads/2018/06/a11.pdf
[111] OMB Circular A-11 (2019 version) Part 6, Executive Summary: https://www.whitehouse.gov/wp-content/uploads/2018/06/a11.pdf
[112] OMB Circular A-11 (2019 version) Part 6, Section 200: https://www.whitehouse.gov/wp-content/uploads/2018/06/a11.pdf
[113] Chief Financial Officers Council: https://cfo.gov/about/
[114] OMB Circular A-123: https://www.whitehouse.gov/sites/whitehouse.gov/files/omb/memoranda/2016/m-16-17
Congress [115]: When referring to a time period (e.g., the 114th Congress, which convened on
January 6, 2015) rather than the legislative branch generally, a Congress is the national
legislature in office (for approximately two years). It begins with the convening of a new
Congress comprised of members elected in the most-recent election and ends with the
adjournment sine die of the legislature (typically after a new election has occurred).
Evaluation Officers [116]: Play a leading role in overseeing the agency's evaluation activities, learning agenda, and information reported to OMB on evidence, as well as collaborating with, shaping, and making contributions to other evidence-building functions within the agency.
Goal Leaders [117]: Officials named by the agency head or COO who are held accountable for leading implementation efforts to achieve a goal. This role includes laying out strategies to achieve the goal, managing execution, regularly reviewing performance, engaging others as needed, and correcting course as appropriate.
Government Accountability Office (GAO) [118]: The U.S. Government Accountability Office
(GAO) is an independent, nonpartisan agency that works for Congress. Often called the
"congressional watchdog," GAO examines how taxpayer dollars are spent and provides Congress
and Federal agencies with objective, reliable information to help the government save money and
work more efficiently.
Local community served [119]: Beneficiary of the Federal program. This is not the recipient of the financial assistance funding; rather, it is the individual or group of individuals who receive the intended benefits of the Federal program.
Office of Inspector General (OIG) [120]: Per the Inspector General Act of 1978, as amended, the Inspector General's mission is to:
Conduct independent and objective audits, investigations and inspections
Prevent and detect waste, fraud and abuse,
Promote economy, effectiveness and efficiency,
Review pending legislation and regulation, and
Keep the agency head and Congress fully and currently informed.
[115] More information about Congress: https://congress.gov/help/legislative-glossary#c
[116] OMB Circular A-11 (2019 version) Part 6, Section 290: https://www.whitehouse.gov/wp-content/uploads/2018/06/a11.pdf
[117] OMB Circular A-11 (2019 version) Part 6, Section 200: https://www.whitehouse.gov/wp-content/uploads/2018/06/a11.pdf
[118] Direct text from the GAO site: https://www.gao.gov/about
[119] Defined by authors of the PM Playbook.
[120] More information about the Office of the Inspector General: https://www.ignet.gov/content/frequently-asked-questions
Pass-through entity (PTE) [121]: A non-Federal entity that provides a subaward to a subrecipient to carry out part of a Federal program.
Program Management Improvement Officers (PMIOs) [122]: Must report directly to the COO or other equivalent senior agency official responsible for agency program performance, and are responsible for leading efforts to enhance the role and practice of program and project management (P/PM).
Recipient: See Appendix A. Glossary of Terms.
The Performance Improvement Council (PIC) [123]: Composed of agency PIOs and OMB; advises on the development of government-wide policies designed to strengthen agency management and facilitate cross-agency learning and cooperation. The PIC is supported by the General Services Administration's (GSA) Office of Shared Solutions and Performance Improvement (OSSPI), which works with agencies to develop solutions to matters that affect mission activity, management functions, and performance, as well as supports OMB and Goal Leaders in analyzing progress on Priority Goals.
The Program Management Policy Council (PMPC) [124]: Composed of agency PMIOs and OMB; advises on the development and implementation of policies and strategies for strengthening program and project management within the Federal Government by facilitating cross-agency learning, cooperation, and the sharing of promising practices identified by agencies and private industry.
[121] See 2 CFR 200 Definitions: https://www.ecfr.gov/cgi-bin/retrieveECFR?gp=&SID=7936a2f1db857b8d4e3244a0555d11fb&mc=true&n=pt2.1.200&r=PART&ty=HTML
[122] OMB Circular A-11 (2019 version) Part 6, Section 270: https://www.whitehouse.gov/wp-content/uploads/2018/06/a11.pdf
[123] OMB Circular A-11 (2019 version) Part 6, Section 200: https://www.whitehouse.gov/wp-content/uploads/2018/06/a11.pdf
[124] OMB Circular A-11 (2019 version) Part 6, Section 270: https://www.whitehouse.gov/wp-content/uploads/2018/06/a11.pdf
Appendix C. Federal Laws and Regulations
Federal Grant and Cooperative Agreement Act of 1977 [125]:
Federal grant agreements and cooperative agreements are defined by the Federal
Grant and Cooperative Agreement Act of 1977, as codified in Title 31 Section 6304
of the U.S. Code. A key purpose of the Act was to distinguish financial assistance
relationships (grant agreements and cooperative agreements) from Federal procurement (contract) relationships. Along with definitions and clarifications, the
Act provides criteria for the Federal agency to select the appropriate legal instruments
to achieve:
o Uniformity in their use
o A clear definition of the relationships between the Federal agency and the
entity; and,
o A better understanding of the responsibilities of the Federal agency and the entity.
[125] More information about the Federal Grant and Cooperative Agreement Act of 1977: https://cfo.gov/lms/Lesson2/Module2/Lesson2/00200.htm
Chief Financial Officers Act of 1990 [126]:
Establishes the OMB Deputy Director for Management (DDM), OMB Office of
Federal Financial Management, agency Chief Financial Officers, agency Deputy
Chief Financial Officers, and the Chief Financial Officers Council.
Requires each executive agency to prepare annual financial statements for submission
to the Director.
Requires the Director to report to the Congress on which executive agencies perform
substantial commercial functions for which financial statements practicably can be
prepared. Provides for audits of such statements.
Sets forth requirements for specified departments, agencies, and bureaus to report a
financial statement to the Director of OMB and requires the Director to report an
analysis of such statements to the Congress.
Requires an audit of each financial statement prepared under this Act.
Revises the mandate and general procedures for: (1) the audit of financial statements
of Government corporations; and (2) the annual management reports of such
corporations.
Declares that no capital accounting standard or principle, including any human capital
standard or principle, shall be adopted for use in an executive department or agency
until it has been reported to the Congress and 45 days of continuous congressional
session have expired.
Clinger-Cohen Act of 1996 (previously the Information Technology Management Reform
Act):
In conjunction with OMB Circular A-130, the Clinger-Cohen Act improved the way
the federal government acquires, uses, and disposes of information technology (IT).
This Act directs agencies to consider their missions when undertaking significant
investments in IT. The Act also plays an integral role in the federal government’s
effort to standardize data categories, increase the accessibility of program and financial data, improve workflow automation (e.g., compliance, grant and contract reporting), and increase access to performance data.
[126] Chief Financial Officers Act of 1990: https://www.congress.gov/bill/101st-congress/house-bill/5687
Federal Funding Accountability and Transparency Act of 2006 (FFATA) [127]:
Congress passed FFATA in 2006 with the intent to provide more transparency in government decision-making and to hold the government accountable for spending decisions. As executive agencies and their components implement FFATA requirements, the public and federal employees will have increased access to information about program funding, performance, and expenditures. FFATA required the creation of USASpending.gov, which has increased the public availability of information on federal contracts and grant awards.
Government Performance and Results Act (GPRA) [128]:
Congress enacted GPRA in 1993. The law requires federal executive agencies to develop
strategic plans with long-term goals, submit annual performance plans, and report on projected
and prior year performance to Congress. To comply with GPRA, agencies develop strategic
plans, performance plans, and conduct gap analyses of projects. GPRA established project
planning, strategic planning, and a reporting framework for agencies to demonstrate progress
towards achieving strategic goals.
Government Performance and Results Act Modernization Act of 2010 (GPRAMA) [129]:
GPRAMA strengthened federal requirements defined by GPRA to produce more frequent,
relevant data for better-informed decision-making. GPRAMA requires more frequent reporting
and reviews than GPRA as a way to increase the use of performance information in program
decision-making, and provide more clarity on the connection between planning, programs, and
performance information. GPRAMA also requires annual progress reviews on agency strategic
objectives as established in their strategic plans to help inform decision- making.
Digital Accountability and Transparency (DATA) Act of 2014 [130]:
Enacted in 2014 as an update to FFATA, the DATA Act more closely links grant expenditures to
federal programs. The DATA Act requires federal agencies to combine accounting, procurement,
and financial assistance data to increase the transparency of how federal dollars are spent. The
DATA Act also improves the accuracy of information reported on USASpending.gov, by
requiring agencies to perform a more thorough review of financial data prior to submission.
[127] Federal Funding Accountability and Transparency Act of 2006: https://www.congress.gov/bill/109th-congress/senate-bill/2590
[128] Government Performance and Results Act: https://www.govtrack.us/congress/bills/103/s20/text
[129] Government Performance and Results Act Modernization Act of 2010: https://www.congress.gov/bill/111th-congress/house-bill/2142
[130] Digital Accountability and Transparency Act of 2014: https://www.congress.gov/bill/113th-congress/senate-bill/994
Program Management Improvement Accountability Act (PMIAA) of 2016 [131]:
This law establishes the following additional functions for the OMB Deputy Director of
Management:
Adopt and oversee implementation of government-wide standards, policies, and
guidelines for program and project management for executive agencies;
Chair the Program Management Policy Council (established by this Act);
Establish standards and policies for executive agencies consistent with widely
accepted standards for program and project management planning and delivery;
Engage with the private sector to identify best practices in program and project
management that would improve federal program and project management;
Conduct portfolio reviews to address programs identified as high risk by the
Government Accountability Office (GAO);
Conduct portfolio reviews of agency programs at least annually to assess the quality
and effectiveness of program management; and
Establish a five-year strategic plan for program and project management.
American Innovation and Competitiveness Act (AICA) of 2017 [132]:
The bill was developed to maximize basic research; reduce administrative and regulatory burden; strengthen science, technology, engineering, and math education; and leverage the private sector, manufacturing, innovation, and technology transfer.
This bill requires the National Science Foundation (NSF) to maintain the intellectual
merit and broader impacts criteria as the basis for evaluating grant proposals in the
merit review process.
The National Institute of Standards and Technology (NIST) shall: (1) research
information systems for future cybersecurity needs; and (2) develop a process to
research and identify, or if necessary, develop cryptography standards and guidelines
for future cybersecurity needs, including quantum-resistant cryptography standards.
The Office of Management and Budget shall establish an interagency working group
to reduce administrative burdens of federally funded researchers while protecting the
public's interest in the transparency of, and accountability for, federally funded
activities.
The Office of Science and Technology Policy (OSTP) shall establish a body under the National Science and Technology Council (NSTC) to identify and coordinate international science and technology cooperation in order to strengthen the U.S. science and technology enterprise, improve economic and national security, and support U.S. foreign policy goals.
[131] Program Management Improvement Accountability Act of 2016: https://www.congress.gov/bill/114th-congress/senate-bill/1550
[132] American Innovation and Competitiveness Act of 2017: https://www.congress.gov/bill/114th-congress/senate-bill/3084
The NSF, the Department of Education, the National Oceanic and Atmospheric
Administration (NOAA), and the National Aeronautics and Space Administration
(NASA) shall establish the STEM Education Advisory Panel to advise the NSTC
Committee on STEM Education on matters related to science, technology,
engineering, and mathematics (STEM).
Foundations for Evidence-Based Policymaking Act of 2018 [133]:
This bill requires agency data to be accessible and requires agencies to plan to develop statistical
evidence to support policymaking.
OMB Memorandum M-19-23, "Phase 1 Implementation of the Foundations for Evidence-Based Policymaking Act of 2018: Learning Agendas, Personnel, and Planning Guidance" [134]:
Phase 1 OMB Guidance for the Foundations for Evidence-Based Policymaking Act of 2018, which statutorily mandates Federal evidence-building activities, open government data, and confidential information protection and statistical efficiency. The guidance emphasizes collaboration and coordination to advance data and evidence-building functions in the Federal Government. Evidence is broadly defined and includes foundational fact-finding, performance measurement, policy analysis, and program evaluation.
OMB Memorandum M-20-12, "Phase 4 Implementation of the Foundations for Evidence-Based Policymaking Act of 2018: Program Evaluation Standards and Practices" [135]:
Phase 4 OMB Guidance for the Foundations for Evidence-Based Policymaking Act of 2018,
which provides program evaluation standards to guide agencies in developing and implementing
evaluation activities, evaluation policies, and in hiring and retaining qualified staff. It also
provides examples of leading practices for agencies to draw upon as they build evaluation
capacity, develop policies and procedures, and carry out evaluations to support evidence-based
policymaking.
Grant Reporting Efficiency and Agreements Transparency (GREAT) Act of 2019 [136]:
This bill requires the establishment and use of data standards for information reported
by recipients of federal grants.
The bill requires the Office of Management and Budget, jointly with the executive
department that issues the most federal grant awards, to (1) establish government-
wide data standards for information reported by grant recipients, (2) issue guidance
directing federal agencies to apply those standards, and (3) require the publication of
recipient-reported data collected from all agencies on a single public website.
Each agency shall ensure its awards use the data standards for future information collection requests.
[133] Foundations for Evidence-Based Policymaking Act of 2018: https://www.congress.gov/bill/115th-congress/house-bill/4174
[134] OMB Memorandum M-19-23: https://www.whitehouse.gov/wp-content/uploads/2019/07/M-19-23.pdf
[135] OMB Memorandum M-20-12: https://www.whitehouse.gov/wp-content/uploads/2020/03/M-20-12.pdf
[136] Grant Reporting Efficiency and Agreements Transparency Act of 2019: https://www.congress.gov/bill/116th-congress/house-bill/150
OMB Circular A-123, “Management’s Responsibility for Enterprise Risk Management and
Internal Control”
Circular A-123 requires all agencies to implement an Enterprise Risk Management capability
coordinated with the strategic planning and strategic review process established by the
GPRAMA. Agencies also are required to follow the internal control processes required by the Federal Managers Financial Integrity Act of 1982 and the Government Accountability Office's (GAO) Green Book.
OMB Circular A-11, Part 6 (2019 version), “The Federal Performance Framework for
Improving Program and Service Delivery”
As a complement to GPRAMA, OMB Circular A-11, Part 6, Section 200 (2019 version), "Overview of the Federal Performance Framework," provides guidance to federal executive agencies on the adoption of GPRAMA requirements. These include the development and implementation of integrated strategic planning, performance management, and budgeting activities (see Sections 220, 230, and 270 below):
Section 220 “Cross-Agency Priority Goals and Federal Performance Plans” outlines
how executive agencies may participate in cross-agency priority goals, which are a
subset of Presidential priorities.
Section 230 “Agency Strategic Planning” describes how executive
agencies should conduct strategic planning activities and develop strategic
plans, which highlight strategic goals and objectives that define what the
agency wants to accomplish in terms of outcomes or results.
Section 270 “Performance and Strategic Reviews” requires executive agencies to
conduct quarterly data-driven, performance reviews on the progress of each
strategic objective established in the Agency Strategic Plan. Such reviews should
inform strategic decision-making, budget formulation, and near-term agency
actions.
Code of Federal Regulations, Title 2 "Grants and Agreements," Part 200 "Uniform Administrative Requirements, Cost Principles, and Audit Requirements for Federal Awards" (Developed 2013, Revised 2020) [137][138]
[137] 2 CFR 200: Uniform Administrative Requirements, Cost Principles, and Audit Requirements for Federal Awards: https://www.ecfr.gov/cgi-bin/text-idx?SID=6710c8abe4ad631c481e73180543bb39&mc=true&tpl=/ecfrbrowse/Title02/2cfr200_main_02.tpl
[138] About 2 CFR 200: https://cfo.gov/grants/uniform-guidance/
In December 2014, OMB together with Federal awarding agencies issued an interim
final rule to implement the Uniform Administrative Requirements, Cost Principles,
and Audit Requirements for Federal Awards (Uniform Guidance). This guidance and its implementing regulations deliver on President Obama's second-term management agenda and his first-term directives under Executive Order 13520, the February 28,
2011 Presidential Memorandum, and the objectives laid out in OMB Memorandum
M-13-17 to better target financial risks and better direct resources to achieve
evidence-based outcomes. The final guidance, originally published December 26,
2013 (available at 78 FR 78589), simultaneously improves performance, transparency,
and oversight for Federal awards.
OMB issued a proposed rule in January 2020, which laid out many proposed revisions to the guidance. The final revisions are expected to be published by the end of fiscal year 2020. The Federal Register notice for the proposed rule can be found at the following link: https://www.federalregister.gov/documents/2020/01/22/2019-28524/guidance-for-grants-and-agreements
VII. Agency Acknowledgements
The CFOC and the PIC would like to thank the Grants CAP Goal Performance Workgroup for their tremendous contributions in developing the PM Playbook.