Evaluation of Promise Neighborhoods
Part A: Supporting Statement for Paperwork Reduction Act Submission
September 2021
Contents
A1. Circumstances making the collection of data necessary
A3. Use of technology to reduce burden
A4. Efforts to identify and avoid duplication
A5. Efforts to minimize burden on small businesses or other entities
A6. Consequences of not collecting the information
A8. Federal register announcement and consultation
A10. Assurances of confidentiality
A11. Questions of a sensitive nature
A12. Estimates of respondent burden
A13. Estimate of the cost burden to respondents
A14. Estimates of annualized government costs
A16. Plans for tabulation and publication of results
A17. Display of expiration date for OMB approval
A18. Exceptions to certification statement item 19 of OMB form 83-1
Tables
A.2. List of Technical Working Group participants, their affiliation and relevant expertise
A.3. Estimates of respondent burden
Appendix A: Email Notifications
Appendix B: Current Grantee Survey
Appendix C: Excel Workbook for Current Grantees
Appendix D: Previous Grantee Survey
Appendix E: Data Request Memo
The U.S. Department of Education’s (ED) Institute of Education Sciences (IES) requests clearance for data collection activities to support a study of the Promise Neighborhoods program. This program is funded through federal grants authorized by Title IV of the Elementary and Secondary Education Act (ESEA), most recently reauthorized as the Every Student Succeeds Act (ESSA). Congress has invested $506 million in Promise Neighborhoods grants and mandated an evaluation of the program.1 Modeled in part after the Harlem Children’s Zone, the program aims to build on existing community services and strengths to provide a comprehensive and coordinated pipeline of educational and developmental services from "cradle to career" to benefit children and families in the country’s most distressed neighborhoods.
This package requests approval to conduct a survey of Promise Neighborhoods grantees and to collect multiple years of administrative school records from districts. These data will be used to study the implementation and outcomes of the Promise Neighborhoods program.
To date, there has not been a national evaluation of the Promise Neighborhoods program despite the fact that 68 communities have been supported by program funds since 2010. Congress recently mandated such an evaluation under ESSA. Understanding the implementation of and outcomes associated with the Promise Neighborhoods program is particularly critical in the wake of the COVID-19 pandemic. Students and families in underserved neighborhoods, including racial and ethnic minority communities, are likely to have been disproportionately affected by the pandemic and may benefit from enhanced and coordinated services to recover. The evaluation’s findings will provide valuable evidence for a number of stakeholders, including ED, which administers the program; members of Congress, who decide whether to appropriate funds to the program; grantees seeking to improve their programs; and state, district, and school policymakers and community organizations who are considering similar initiatives.
IES contracted with Mathematica and its partners – Social Policy Research Associates and the Urban Institute – to conduct this study. This study will have two components that address several research questions (Table A.1). The first is an implementation analysis, which will describe the implementation of the Promise Neighborhoods grants in terms of the services offered, the characteristics of service recipients, the degree to which services are coordinated, implementation challenges, and funding sources. The second component is an outcomes analysis, which will assess whether any changes in outcomes after the grant award were unique to Promise Neighborhoods schools or whether similar changes were observed in other similar schools.
Table A.1. Key research questions
Implementation analysis |
Outcomes analysis |
The implementation analysis will be based on information collected through a grantee survey administered in fall 2021. Respondents will include 12 current grantees who were awarded five-year grants in FY2016, 2017, or 2018 and 10 previous grantees who were awarded five-year grants in FY2011 or 2012.2 This sample represents all Promise Neighborhoods grantees from FY2011 through FY2020, as grants were not awarded in FY2013, 2014, 2015, 2019, or 2020.
The survey will gather information about the Promise Neighborhoods services offered during the grant period, including the number and types of services by pipeline stage3, the recipients served, the needs each service focused on, and whether each service was added, improved, or expanded during the grant period. It will also ask how services changed during the grant period, the types of schools and students served by the Promise Neighborhood, implementation challenges, how services are coordinated and connected, and funding/cost of the program. The surveys for previous and current grantees include similar sets of questions, with a few purposeful differences. For example, the survey for previous grantees does not ask about the proportion of recipients who received the intended dosage of services, because it would likely be challenging for respondents to provide this type of in-depth information for grants that have ended. In addition, the current grantee survey includes a few questions about how the COVID-19 pandemic may have influenced community needs, the services provided, and the allocation of Promise Neighborhood grant funds.
One component of the survey for current grantees is an Excel workbook which will be pre-populated with existing information on each Promise Neighborhood’s services to minimize burden. The existing information will be pulled from previous annual performance reports provided to Urban Institute by the grantees and previous conversations with grantees that Mathematica had when designing the evaluation. Respondents will be asked to confirm and supplement (if necessary) the existing data. A separate tab in the workbook will automatically calculate counts that the grantees can then use to more quickly answer questions in the survey.
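The pre-population and automatic-count features of the workbook could be implemented in many ways; the following is a minimal illustrative sketch (not the study team's actual tooling) that writes service records drawn from prior annual performance reports into a workbook and adds a tab whose spreadsheet formulas tally services by pipeline stage. The file name, sheet names, columns, and example records are assumptions for illustration only.

```python
# Hypothetical sketch: pre-populate a grantee workbook with known service records
# and add a tab that tallies counts with spreadsheet formulas. File names, sheet
# names, columns, and records are illustrative assumptions only.
from openpyxl import Workbook

# Example service records (illustrative), as might be drawn from prior annual performance reports.
services = [
    ("Early childhood", "Home visiting", "Added"),
    ("K-12 education", "Tutoring", "Expanded"),
    ("College and career", "FAFSA counseling", "Improved"),
]

wb = Workbook()
detail = wb.active
detail.title = "Services"
detail.append(["Pipeline stage", "Service", "Change during grant"])
for row in services:
    detail.append(row)

# Summary tab: formulas recalculate automatically if the grantee edits or adds rows
# on the detail tab, so respondents can copy the counts directly into the survey.
summary = wb.create_sheet("Auto counts")
summary.append(["Pipeline stage", "Number of services"])
for i, stage in enumerate(["Early childhood", "K-12 education", "College and career"], start=2):
    summary.cell(row=i, column=1, value=stage)
    summary.cell(row=i, column=2, value=f'=COUNTIF(Services!A:A,"{stage}")')

wb.save("grantee_workbook_prepopulated.xlsx")
```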
The outcomes analysis will be based on school-level administrative data. In fall 2021, the study team will obtain school-level data for schools located in all FY2011 and FY2012 Promise Neighborhoods and for a group of similar schools—called comparison schools—that were not served by a Promise Neighborhoods grant. The outcomes analysis will focus on longer-term outcomes, which is consistent with the program’s theory of action and with grantees’ reports that the initial period after grant award is often focused on start-up activities. The outcomes analysis will not include the FY2016 Promise Neighborhoods because the COVID-19 pandemic occurred in the middle of those grants, and the resulting sparse and uneven administration of assessments would leave data missing for some outcomes in some years. The outcomes analysis will also not include the FY2017, FY2018, and FY2021 Promise Neighborhoods because their grant periods are not yet complete; fewer years of outcome data would be available for schools in these neighborhoods, which would not allow for an analysis of longer-term outcomes.
The study team will obtain school-level electronic records from districts where Promise Neighborhoods and comparison schools are located to gather information on school enrollment, achievement, attendance, graduation, college enrollment, kindergarten readiness, student mobility, and background characteristics of the student body. The study team will collect these data for the three years before the grant was received and all five years of the grant period. For example, for a FY2011 Promise Neighborhood, the study team will collect data for school years 2008–2009 through 2015–2016.
The data collection plan is designed to use technology in ways that minimize respondent burden.
Implementation analysis
The study team will use technology to reduce burden associated with the survey by including a tab in the Excel workbook that will automatically calculate counts that the current grantees can use when responding to questions in the survey. In addition, the study team will encrypt electronic versions of the survey materials so they can be shared securely via email with grantees. Encrypting files will enable respondents to (1) safely provide or confirm fine-grained service information electronically; (2) complete sections of the survey as they gather information from colleagues or other sources, without having to re-navigate to a specific question in a web survey; and (3) electronically share the survey with colleagues, if necessary. Respondents will complete the survey materials electronically and return them to Mathematica securely via email, using the password-protected files.
Outcomes analysis
The study team will use technology in two key ways to minimize burden on school districts. First, the study team will provide districts with a secure file transfer website so they can deliver data files efficiently and securely. Second, the study team will not impose any requirements on data file formats and will accept data in the format that is most convenient to each district.
No similar evaluations are being conducted, and there is no equivalent source for the information to be collected. Moreover, the data collection plan reflects careful attention to the potential sources of information, particularly to the reliability of the information and the efficiency in gathering it. The data collection plan avoids unnecessary collection of information from multiple sources. For example, student outcomes will be measured using existing administrative data, instead of administering a new assessment as part of this study.
To avoid duplication of effort, the study team will pre-populate the Excel workbook component of the survey with existing information on the current grantees. These grantees will be asked to verify the pre-populated information, add anything that is missing, and update information if necessary.
No small businesses will be involved in this study, but the school districts and about half of the Promise Neighborhoods grantees are small entities. The data collection procedures have been designed to minimize burden on these entities:
Implementation analysis
Promise Neighborhoods grantees. In order to minimize burden to grantees, the study team will pre-populate the Excel workbook component of the survey and communicate with current grantees in advance of the data collection period to (1) make sure they understand what the study team will be asking them to complete, and (2) ensure that the study team is scheduling data collection for a time period that is convenient for them.
Outcomes analysis
School districts. To minimize burden on school districts when collecting administrative data, the study team will request only the variables and records that are essential to addressing the study’s research questions, and only collect the minimum number of years of data necessary for a credible analysis (i.e., three years before the grant period and the five years during the grant period). In addition, the study team will collect all requested records from each district at one time (Fall 2021) to avoid repeated requests.
The data collection plan described in this submission is necessary for ED to understand the challenges and successes of implementing the Promise Neighborhoods approach and provide evidence on the outcomes associated with Promise Neighborhoods grants. Without these data, the Congressionally-mandated evaluation would not be able to fulfill this key objective of providing such information to policy makers, practitioners, future Promise Neighborhoods grantees, and the public.
There are no special circumstances involved with this data collection.
A 60-day notice to solicit public comments was published in the Federal Register, Volume 86, pages 32029-32030, on June 16, 2021. Three non-substantive comments were received. A 30-day notice will be published to solicit additional public comments.
In formulating the design for this evaluation, the study team sought input from individuals with expertise in place-based initiatives, evaluation methodology, and impact studies of federal grant programs. This input helps ensure the study is of the highest quality and that findings are relevant to policymakers, Promise Neighborhoods grantees, and other districts. Table A.2 lists the individuals who participated in the technical working group (TWG) meetings, their affiliation, and their relevant expertise.
Table A.2. List of Technical Working Group participants, their affiliation and relevant expertise
Name | Title | Affiliation
Elizabeth Stuart | Associate Dean for Education | Johns Hopkins School of Public Health
Petra Todd | Professor of Economics | University of Pennsylvania
Betina Jean-Louis | Principal Consultant | Arc of Evidence
Pieta Blakley | Consultant | Blakely Consulting, LLC
Margaret R. Burchinal | Director of the Data Management and Analysis Center, Frank Porter Graham Child Development Institute | University of North Carolina at Chapel Hill
James Jennings | Professor Emeritus | Tufts University
Isaac Castillo | Director of Outcomes, Assessment, and Learning | Venture Philanthropy Partners
Michael McAfee | President and CEO | PolicyLink
Will S. Dobbie | Professor of Public Policy | Harvard Kennedy School
Robin E. Smith | Senior Director | The DeBruce Foundation
Sara McTarnaghan | Research Associate | Urban Institute
Kenneth A. Dodge | William McDougall Distinguished Professor of Public Policy and Professor of Psychology and Neuroscience | Duke University
Incentives are proposed for the previous Promise Neighborhoods grantees (those with grant awards in FY2011 and FY2012) participating in the study. Because the grants for previous Promise Neighborhoods grantees have ended, the study team believes an incentive will help encourage these grantees to fill out the survey as accurately and completely as possible while also decreasing the number of reminders needed and the amount of follow-up outreach. Incentives are also proposed because high response rates are needed to make the study findings reliable, especially since there are so few grantees overall. To acknowledge the 75 minutes required to complete the survey, the study team proposes to offer a $50 incentive to the previous grantees who complete the survey. The proposed amount is aligned with the incentive guidelines outlined in the March 22, 2005 memo, “Guidelines for Incentives for NCEE Evaluation Studies,” prepared for OMB. Payments will be made in the form of a gift card.
The study team is not proposing an incentive for the current grantees because ED expects them to participate in the study (The Education Department General Administrative Regulations, 34 C.F.R. § 76.591).
Mathematica and its research partners will conduct all data collection activities for this study in accordance with relevant regulations and requirements, which are:
The Privacy Act of 1974, P.L. 93-579 (5 U.S.C. 552a)
The “Buckley Amendment,” Family Educational Rights and Privacy Act (FERPA) of 1974 (20 U.S.C. 1232g; 34 CFR Part 99)
The Protection of Pupil Rights Amendment (PPRA) (20 U.S.C. 1232h; 34 CFR Part 98)
The Education Sciences Reform Act of 2002, Title I, Part E, Section 183
The study team will protect the confidentiality of all data collected for the study and will use the data for research purposes only. The Mathematica project director and survey director will ensure that all individually identifiable information about respondents remains confidential. All data will be kept in secured locations, and identifiers will be destroyed as soon as they are no longer required. All members of the study team with access to the data will be trained and certified on the importance of confidentiality and data security. When reporting results, data will be presented only in aggregate form, so that individuals and schools are not identified. All voluntary requests for data will include the following or a similar statement:
“Responses to this data collection will be used only for research purposes. The report prepared for this study will summarize findings across the sample and will not associate responses with a specific school or individual. We will not provide information that identifies your school to anyone outside the study team, except as required by law.”
The following safeguards are routinely used by Mathematica to maintain data confidentiality, and they will be consistently applied to this study:
All Mathematica employees are required to sign a confidentiality pledge that emphasizes the importance of confidentiality and describes employees’ obligations to maintain it.
Personally identifiable information (PII) is maintained on separate forms and files, which are linked only by random, study-specific identification numbers.
Access to hard copy documents is strictly limited. Documents are stored in locked files and cabinets. Discarded materials are shredded.
Access to computer data files is protected by secure usernames and passwords, which are only available to specific users who have a need to access the data and who have the appropriate security clearances.
Sensitive data is encrypted and stored on removable storage devices that are kept physically secure when not in use.
Mathematica’s standard for maintaining confidentiality includes training staff regarding the meaning of confidentiality, particularly as it relates to handling requests for information, and providing assurance to respondents about the protection of their responses. It also includes built-in safeguards concerning status monitoring and receipt control systems. In addition, all study staff who have access to confidential data must obtain security clearance from ED which requires completing personnel security forms, providing fingerprints, and undergoing a background check.
No questions of a sensitive nature will be included in this study.
Table A.3 provides an estimate of burden for the data collection included in the current request, broken down by data collection activity. The estimates are based on the study team’s prior experience collecting survey data from federal grantees and administrative records data from districts.
The number of targeted respondents is 69, and the expected number of responses is 67. The total burden is estimated at 495 hours, or an average of 165 annual burden hours calculated across three years.
Table A.3. Estimates of respondent burden
Data Collection Activity and Respondent Type | Number of Targeted Respondents | Number of Expected Respondents | Minutes per Completion | Number of Administrations | Total Burden (Minutes) | Total Burden (Hours) | Total Costs^a
Survey of Promise Neighborhoods grantees | 22 | 20 | 75 | 1 | 1,500 | 25 | $1,206.00
School-level administrative data from districts | 47 | 47 | 600 | 1 | 28,200 | 470 | $22,672.80
Total | 69 | 67 | 75-600 | 1 | 29,700 | 495 | $23,878.80
Average burden per year over a three-year period | 23 | 22.3 | n.a. | n.a. | 9,900 | 165 |
a Assumes an average hourly wage of $48.24 for education managers, kindergarten through secondary (derived from the Bureau of Labor Statistics’ Occupational Employment and Wages for educational administrators, May 2019). See: https://www.bls.gov/oes/current/oes119032.htm#(4).
n.a. = not applicable.
The total of 495 hours is based on the assumption that the study team will reach out to all 22 eligible grantees (12 current and 10 previous) and at most 47 districts in which the Promise Neighborhoods schools and comparison schools are located. The exact number of districts will be defined by the number of comparison schools that are good matches for the Promise Neighborhoods schools. The study team has capped the number of districts at 47 to constrain costs. Based on prior experience and the expected intensity of follow-up efforts and incentives to encourage response, the study team expects at least an 85% response rate for the survey, and a 100% response rate for the administrative data.
The study team estimates the survey of grantees to take 75 minutes for both previous and current grantees. For previous grantees, this estimate includes time to locate and review information, talk with others at their organization, work through the exercise of thinking back to their grant period, and answer the survey questions. For current grantees, this time estimate includes time to locate and review information, talk with others at their organization, review and complete the Excel workbook, and answer the survey questions. We expect current grantees will need less time than previous grantees to locate information and talk with others, as most of the information we request in our materials should be easily accessible and/or at the front of their minds. But current grantees will also need to complete the Excel spreadsheet. As a result, we expect it will take both previous and current grantees 75 minutes to complete the materials.
The study team estimates that districts will spend an average of 600 minutes responding to the school-level administrative data request, including 60 minutes (1 hour) for an initial call to discuss the data request, 360 minutes (6 hours) to gather and provide the data files, 60 minutes (1 hour) to answer any questions from the study team about the data, and 120 minutes (2 hours) to gather and provide any revised data files.
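As a check on the figures in Table A.3, the burden and cost estimates follow directly from the per-respondent assumptions above: expected respondents multiplied by minutes per completion, converted to hours, and costed at the $48.24 hourly wage from the table note. A minimal sketch of that arithmetic:

```python
# Worked check of the burden and cost arithmetic reported in Table A.3.
HOURLY_WAGE = 48.24  # BLS average hourly wage for education administrators (see table note a)

activities = {
    # activity: (expected respondents, minutes per completion)
    "Grantee survey": (20, 75),
    "District administrative data request": (47, 600),
}

total_minutes = 0
for name, (respondents, minutes_each) in activities.items():
    minutes = respondents * minutes_each
    hours = minutes / 60
    total_minutes += minutes
    print(f"{name}: {minutes:,} minutes = {hours:,.0f} hours = ${hours * HOURLY_WAGE:,.2f}")

total_hours = total_minutes / 60
print(f"Total: {total_minutes:,} minutes = {total_hours:,.0f} hours = ${total_hours * HOURLY_WAGE:,.2f}")
print(f"Average burden per year over three years: {total_hours / 3:,.0f} hours")
```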
There are no direct, start-up, or maintenance costs to respondents or record keepers associated with this data collection.
The total cost to the federal government for this study is $2,452,981. The estimated average annual cost over the three-year study is $817,660.
This is a new data collection. No changes apply.
This study team will address the study research questions using implementation and outcomes analyses:
Implementation analysis. The study team will describe the implementation of Promise Neighborhoods across the FY2011-2018 grantee cohorts. The study team will use information from the surveys to describe the number and types of services offered in Promise Neighborhoods, both before and during the grant period. The study team will also use the surveys from grantees to describe how Promise Neighborhoods grantees allocated funding across services and to describe funding received from sources other than the federal grant. The study team will use the number of children living in the neighborhood (from publicly available data) and the percentage of neighborhood children receiving services (from the survey data) to calculate the total cost per year, per participant. In addition, the study team will describe any implementation challenges experienced by grantees or their partner organizations and how they were addressed. Finally, the study team will gather information about the extent to which services were interconnected and coordinated, both before and during the grant.
The findings from the implementation of the FY2011 and FY2012 grants will be further used to provide context for the findings from the outcomes analysis (which only includes the FY2011 and FY2012 grantees). For example, if the study team finds no greater improvement in student outcomes in Promise Neighborhoods schools compared to comparison schools, there may be particular findings from the implementation analysis that could potentially explain why. Additional findings from analysis of data collected from FY2016, FY2017, and FY2018 grantees will provide valuable information on how Promise Neighborhoods are currently implemented, including how implementation has changed over time, which may foreshadow potential future outcomes associated with the grants.
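To illustrate the cost-per-participant calculation described above, the following minimal sketch combines a neighborhood's child population (from publicly available data), the share of children receiving services (from the survey), and total annual funding. All numeric values are illustrative assumptions, not study data.

```python
# Hypothetical sketch of the cost-per-participant calculation described above.
# All numeric values are illustrative assumptions, not study data.
children_in_neighborhood = 5_000   # from publicly available data (illustrative)
share_receiving_services = 0.60    # from the grantee survey (illustrative)
total_annual_funding = 6_000_000   # federal grant plus other funding sources (illustrative)

participants = children_in_neighborhood * share_receiving_services
cost_per_participant_per_year = total_annual_funding / participants
print(f"Estimated participants served: {participants:,.0f}")
print(f"Total cost per participant, per year: ${cost_per_participant_per_year:,.2f}")
```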
Outcomes analysis. The study team will conduct an outcomes analysis focused on FY2011 and FY2012 Promise Neighborhoods. The study team will compare changes in student outcomes in Promise Neighborhoods schools to changes in outcomes in similar schools located in the same state as each Promise Neighborhood. Specifically, the study team will examine changes in the following school-level outcomes: kindergarten readiness, math and ELA achievement, attendance, graduation, and college enrollment. The study team will focus on these outcomes for two reasons. First, they are Government Performance and Results Act (GPRA) indicators that Promise Neighborhoods services aim to affect. Second, the Promise Neighborhoods grantees the study team spoke to during an earlier study that assessed the feasibility of evaluating Promise Neighborhoods listed these outcomes as high-priority objectives for their neighborhoods. The study team will identify similar schools in the same state as Promise Neighborhoods schools using a comprehensive set of demographic and outcome variables from publicly-available data. In addition to examining outcomes, the study team will descriptively compare student movement in and out of Promise Neighborhoods schools before and after grant award to student movement in and out of similar schools in the same state as the Promise Neighborhood. This comparison will provide context for interpreting observed changes in student outcomes in Promise Neighborhoods schools and provide evidence on whether any observed changes might be due to student mobility rather than the Promise Neighborhoods grants.
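The matching and change comparison described above could be carried out in several ways; the sketch below is purely illustrative and is not the study team's specified method. It uses nearest-neighbor matching within state on a few pre-grant school characteristics, followed by a simple comparison of pre-post changes in one outcome. The input file and column names are assumptions.

```python
# Illustrative sketch only: match each Promise Neighborhoods school to a similar
# in-state school on pre-grant characteristics, then compare pre/post changes in
# a school-level outcome. The input file and column names are assumptions.
import pandas as pd
from sklearn.neighbors import NearestNeighbors

schools = pd.read_csv("school_level_data.csv")  # hypothetical analysis file
covariates = ["pct_free_lunch_pre", "enrollment_pre", "math_achievement_pre"]

pn = schools[schools["promise_neighborhood"] == 1]
pool = schools[schools["promise_neighborhood"] == 0]

matched_ids = []
for state, pn_state in pn.groupby("state"):
    pool_state = pool[pool["state"] == state]
    if pool_state.empty:
        continue
    # In practice, covariates would typically be standardized before matching.
    nn = NearestNeighbors(n_neighbors=1).fit(pool_state[covariates])
    _, idx = nn.kneighbors(pn_state[covariates])
    matched_ids.extend(pool_state.iloc[idx.ravel()]["school_id"].tolist())

comparison = pool[pool["school_id"].isin(matched_ids)]

# Compare pre-to-post changes in one outcome (e.g., attendance rate) between groups.
pn_change = (pn["attendance_post"] - pn["attendance_pre"]).mean()
comp_change = (comparison["attendance_post"] - comparison["attendance_pre"]).mean()
print(f"Change in Promise Neighborhoods schools: {pn_change:.2f}")
print(f"Change in matched comparison schools:    {comp_change:.2f}")
print(f"Difference in changes:                   {pn_change - comp_change:.2f}")
```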
The study team will produce a final report with an anticipated release in 2023. The report will include both implementation and outcome analyses, as described in the prior section.
The Institute of Education Sciences is not requesting a waiver for the display of the OMB approval number and expiration date. The study will display the OMB expiration date.
This submission does not require an exception to the certification statement (5 CFR 1320.9).
1 Title IV Part F Section 4624(i).
2 One grantee that operated two separate Promise Neighborhoods had its grant terminated and no longer exists. An additional grantee dissolved at the end of its grant period. These grantees are excluded from the sample. The sample also excludes the FY2021 grantees because they will likely not have had enough time to fully implement their Promise Neighborhoods by the time the surveys are administered.
3 The Promise Neighborhoods cradle-to-career pipeline includes four stages: early childhood; K-12 education; college and career readiness; and family and community supports.