National Center for Education Statistics (NCES)
Statewide Longitudinal Data System (SLDS) Survey 2017–2019
OMB# 1850-0933 v.6
Supporting Statement Part A
October 2016
revised July 2018
1. Circumstances Making Collection of Information Necessary
2. Purposes and Uses of the Statewide Longitudinal Data System Survey
3. Appropriate Use of Information Technology
4. Efforts to Identify Duplication
6. Frequency of Data Collection
7. Special Circumstances of Data Collection
8. Consultants Inside and Outside the Agency
9. Provision of Payments or Gifts to Respondents
10. Assurance of Confidentiality
12. Estimates of Hour Burden for Information Collection
14. Annualized Cost to the Federal Government
15. Reasons for Changes in Response Burden and Costs
16. Time Schedule for SLDS Survey
17. Approval to not Display Expiration Date for OMB Approval
18. Exceptions to Certification for Paperwork Reduction Act Submissions
The National Center for Education Statistics (NCES), of the Institute of Education Sciences (IES), within the U.S. Department of Education, is requesting clearance to formalize the Statewide Longitudinal Data System (SLDS) Grant Program Interim Progress Report (IPR), which is intended to provide insight into State and U.S. territory SLDS capacity for automated linking of K-12, teacher, postsecondary, workforce, career and technical education (CTE), adult education, and early childhood data. This new SLDS Survey will be collected annually from State Education Agencies (SEAs) and will help inform NCES' ongoing evaluation and targeted technical assistance efforts to enhance the quality of the SLDS Program's support to States regarding systems development, enhancement, and use. This submission is to conduct the annual SLDS Survey from 2017 through 2019.
NCES is authorized to collect this information by the Education Sciences Reform Act of 2002 (ESRA 2002; 20 U.S.C. § 9543), which establishes the SLDS Grant Program.
The SLDS Survey will be the first formal, systematic collection of SLDS system capacity data of its kind. Aspects of these data have been obtained less systematically during regular interactions with SLDS grantees. This new effort will provide better information to meet NCES' quarterly progress reporting requirements, to inform future grant rounds and technical assistance efforts, and to improve public knowledge of State capacity to link and use data.
While States have provided some indication of data linkages and use to NCES program staff through annual reporting, monthly monitoring updates, and State site visits, these data do not provide a comprehensive look at data capacity. Reasons for incomplete data include, but are not limited to:
Data are collected from States that have active grants, which results in missing data for non-grantee States; and
Grantee States report primarily on their proposed – and funded – projects. As a result, States might not be discussing the full capacity of their State data systems with program officers, which could lead to an under-reporting of capacity.
While the SLDS program office has attempted to collect more uniform information about data linkages (for example, asking about early learning program data linkages), the efforts have been limited to States with active grants.
External organizations, including the Data Quality Campaign (DQC), have conducted surveys to document data linkage and use capacity at the State level. The DQC data are limited by three factors:
DQC stopped collecting information about data linkages in 2011, with no replacement data source in place.
The DQC survey relied on largely dichotomous measures of data linking (a “yes” response indicated that a State had a link in place; a “no” response indicated that it did not). States, however, tend to implement linkages more gradually. For example, a State might conduct a pilot in which Pre-K and K-12 data are linked for one Local Education Agency (LEA), or might link data from a limited set of Pre-K sources, such as Head Start or Early Head Start. The proposed NCES measure allows States to report on the continuum of data linkage and capacity. For example, the proposed SLDS Survey enables States to rate their own data linkage and use efforts as “Not planned,” “Planned,” “In Progress,” or “Operational.” A State that has established a pilot data linkage process would be deemed “In Progress,” while a universal roll-out would be considered “Operational.”
The DQC survey made less information publicly available about how States were matching data (for example, through a manual or an automated process) and who was matching the data (for example, a State agency or a vendor such as the National Student Clearinghouse (NSC)). Such information is quite useful for assessing States' needs and capacity for data linkages.
Since 2005, the U.S. Department of Education has awarded approximately $721 million in 97 grants to State Education Agencies to enable them to implement and enhance their SLDS systems. The Department now needs a clear and formal means of summarizing and communicating the status of these systems across all States and Territories to: 1) evaluate current needs for further systems development; 2) provide targeted technical assistance to States; and 3) accurately reflect progress on the development and use of statewide longitudinal data systems.
Survey results would inform:
Future grant rounds for the SLDS grant program and technical assistance support;
Program offices in the Department of Education, the Department of Labor, and the Department of Health and Human Services, in addition to external stakeholders;
State development and support efforts; and
Public knowledge of State capacity to link and use longitudinal data.
Information about State capacity for data linkages and use is vital to ensure that program dollars are targeted effectively, both for grant funding and for technical assistance development. As federal funding becomes increasingly limited (especially for SLDS infrastructure development as well as for long-term sustainability), we must have a clear sense of SLDS progress across the United States so that federal resources can be utilized and offered most efficiently and effectively. Currently, the SLDS grant program is responsible for providing OMB with up-to-date State capacity indicators on a quarterly basis, with the shortcoming that any changes or updates to these data primarily reflect information from active grantee States only. The report is produced based on continual communication with active grantees that allows the SLDS Program Officers to remain informed of these States' systems' capacity, progress, and constraints. Moreover, active grantees are responsible for providing summary reports on at least an annual basis, and this reporting validates assumptions and conversations that take place throughout the year between grantee States and SLDS Program Officers. Reporting for States without active grants has been only ad hoc.
As mentioned previously, there is growing interest in SLDS capacity across the United States: internally within the Department of Education, among States and U.S. territories, and across agencies with common and shared interests (the Department of Labor's Workforce Data Quality Initiative, for example). The SLDS program regularly responds to questions regarding State capacity for data linking and use, including, for example:
How many States can link:
teacher preparation programs to outcomes for students taught by teachers prepared in those programs (Title II);
K12 and postsecondary data (Performance metric, OPEPD);
K12, postsecondary, and workforce data (Performance metric, OPEPD, Department of Labor, Workforce Data Quality Campaign, White House Workforce Convening);
K12 and early learning data (Performance metric, Early Learning Challenge Technical Assistance, Office of Special Education Programs, US Department of Health and Human Services); and
How are States using data? (Performance metric, US Department of Labor)
States and Territories themselves often seek information about which States are linking and using data and what their processes entail. The SLDS Program facilitates States’ efforts to share promising practices with each other. This enables States to more easily collaborate, learn from each other, share resources with each other, and avoid duplicative work.
The SLDS program also receives questions about State capacity from the public, which is interested in learning which data are available at the State level and how the data might be accessed. We plan to generate a set of metrics and use cases showing data-linking and data-use capacity by State, which will enable interested users to quickly ascertain which States have capacity to link data across sectors (for example, which States and Territories can link K12, postsecondary, and workforce data). It will also include some examples of State data use capacity, including, for example, which States are providing feedback reports so that policy makers at the local level have an understanding of how their high school graduates are faring in postsecondary education or the workforce.
The SLDS Survey will be distributed to SEAs electronically, as an email attachment, by the State's Program Analyst contact (with the State's NCES Program Officer copied on the email). All States have a Program Officer and Program Analyst contact regardless of their grant status (including States that do not have active grants). The SLDS Survey was developed in Microsoft Word, from which a PDF version was created to ease completion efforts and ultimately reduce respondent burden. The resulting standard form has a built-in skip pattern, text boxes, and formatting restrictions that cannot be manipulated. The PDF version is expected to facilitate the data collection process, increase the reliability of the data, and reduce error. The PDF can also be downloaded, printed, and completed manually. If a State is unable to download the PDF, the Microsoft Word version of the document can be provided upon request; it can be completed within Microsoft Word or printed and completed manually. Respondents will return the completed survey as an attachment to the email sent from their Program Analyst and Program Officer.
Because the Survey is new to the SLDS community, NCES plans to host one to two webinars in 2017 to introduce the Survey to States and their respective respondents. Respondents will be invited to participate in the webinars via a listserv email invitation; the listserv is used regularly to communicate with the SLDS community. The webinars will provide more information about the purpose of the OMB-formalized survey, how to complete the instrument, and how NCES will use the data. In subsequent years, NCES will host one or two webinars per year, on an as-needed basis, to answer any questions States may have regarding the Survey.
The information collected through the SLDS Survey does not duplicate information requested or collected by any other federal agency. Further, there is no similar current information available on a consistent national basis that could be used or modified for these purposes. Program offices within the US Department of Education often request and report on similar data. Having a single source of information will decrease redundant data collections and improve ED’s ability to provide valid and reliable data for internal and external users.
As noted above, a similar but not equivalent survey has been conducted on an annual basis by DQC, a nonprofit organization participating in a national effort to bring quality information to education stakeholders. Between 2005 and 2011, DQC surveyed States to report on their progress toward building longitudinal data systems and implementing effective data use. In 2009, DQC launched the 10 State Actions to Ensure Effective Data Use, which document States' capacity to use the data in their systems.
While many of the questions that DQC has asked States to report on in the past parallel those set forth in the proposed SLDS Survey, the transition from data linking to data use in 2011 resulted in the loss of information about fundamental SLDS capacities. Because DQC has taken a new direction, States are no longer asked to report on the types of questions that can assist us in assessing SLDS progress to date. It is crucial that these data continue to be collected at the national level to guide future efforts in SLDS development and to provide information about State capacity to link and use education data. NCES plans to use the DQC survey responses as one resource to help understand changes in State capacity since 2011 and to evaluate State SLDS development and data use progress.
NCES has devised several measures to minimize the response burden for States and Territories participating in the SLDS Survey. Questions have been reviewed by the federal SLDS Program Team and by State Support Team (SST) members, a panel of experts who support the Program by offering technical assistance to States. All SST members have held leadership positions in their respective SEAs and, as a result, are generally aware of the level of burden that the Survey is likely to impose. Each contributor took this into account when providing input, in an effort to account for competing SEA responsibilities and demands and to minimize burden. Additionally, NCES will offer webinars to provide more information to respondents about the Survey, how to complete the instrument, and NCES's plans for the data. These proactive efforts are aimed at minimizing respondent burden over the long term.
The SLDS Survey will be an annual survey, beginning in April 2017 and then administered in August of each year starting in 2018. Nationwide, SLDS system capacity changes frequently (for example, infrastructure enhancements, evolving P20W agency collaborations, and State legislation impacts), so collecting data less often would make the information too obsolete to be useful for targeted technical assistance planning.
There are no additional circumstances that will require special data collection efforts.
OMB requested that the survey instrument be reviewed by a methodological expert within NCES. SLDS staff asked Dr. Andy Zukerberg, at NCES, to review the instrument. Dr. Zukerberg suggested revising the skip pattern, providing further (but concise) definitions of key concepts, shortening the survey, and piloting it with a few SEAs. Per these recommendations, the skip pattern was revised and enhancements were made to concept definitions and survey instructions prior to piloting.
The SLDS survey was piloted with the Kentucky, Minnesota, and Washington State Project Teams. Each participating SEA was given approximately two weeks to complete the survey, with notification that survey completion might require collaboration with other SLDS stakeholders outside of the immediate project team. Once the surveys were completed, a debrief teleconference was held to discuss possible improvements, suggestions, and other feedback. In general, pilot participants indicated that they preferred the SLDS Survey over the leading external survey designed to measure States' progress toward SLDS development and implementation, which has not been administered in the past five years. State pilot participants were satisfied with the length of the SLDS survey, stating that while it is somewhat extensive, it is comprehensive in assessing the current state and robustness of SLDS and P20W capacity. Based on the feedback received during the pilot, changes were made to the overall SLDS Survey structure, content, instructions, concept definitions, and language. As a result of the pilot, a comment box was also added to the end of the SLDS Survey so that State respondents could provide any desired clarifications or explanations.
In addition to the internal NCES review and SEA piloting, the following individuals from the SLDS State Support Team reviewed the data collection content and plans:
from Applied Engineering Management Corporation: Kathy Gosa (SST Lead), Missy Cochenour, Carla Howe, Bill Huennekens, Joyce Popp, Baron Rodriguez, and Jeff Sellers; and
from Chatis Consulting: Corey Chatis.
Once the SLDS survey is implemented, feedback and suggestions will be solicited and welcomed on an ongoing basis through the following measures:
Point of contact provided on the survey instrument,
Point of contact provided on SLDS website once the site reflects data from the SLDS Survey, and
Opportunity for discussion during monthly SLDS teleconference calls.
Additionally, during the 60-day public comment period announced in the Federal Register published on October 7, 2016 (Vol. 81, No. 195, pp. 69803-69804), NCES received three public comments. A document with copies of the three comments and NCES responses has been added to this submission.
No payments or gifts will be offered to survey respondents.
Data collected through the SLDS Survey are public domain data in their respective districts and States. As such, the data collection does not include a pledge of confidentiality.
None of the questions asked during the SLDS Survey are of a sensitive nature.
The response burden will vary by State and U.S. territory, with the expectation that, on average, it will take 2 hours for each SEA to complete the SLDS Survey. Although the expectation is for the Program Director or past Program Director to complete the survey on behalf of the State or U.S. territory, staff turnover and levels of knowledge and expertise vary by State. For example, respondents from a State that has focused on building a K12 SLDS (as opposed to a P20W system) might possess limited (if any) knowledge of workforce and postsecondary system capabilities because the workforce system might not be housed in their agency. In such cases, cross-agency communication and collaboration may be required to complete the SLDS survey effectively and successfully. By contrast, a State or U.S. territory with a tenured respondent involved in the implementation of a P20W system might be able to complete the survey independently and with greater ease.
NCES will host one or two 30-minute webinars annually, on an as-needed basis, to provide more information to respondents about the Survey and to answer their questions.
The estimated hours per respondent are based on information provided directly by past state Program Directors who have completed comparable information requests in previous years. Assuming that the respondents (state education agency administrators) earn on average $47.51 per hour (see footnote 1), the total annualized burden time cost to respondents for the SLDS Survey is estimated to be $6,652.
1 The mean salary for financial managers (SOC code 11-3031) working in State government is $47.51 per hour, per the Occupational Employment Statistics of the U.S. Department of Labor, Bureau of Labor Statistics (BLS), accessed on June 9, 2015. SOC code: Standard Occupational Classification code; see http://www.bls.gov/soc/home.htm.
Respondent | Number of Respondents | Number of Responses | Estimated Hours per Respondent | Total Respondent Burden Hours
States / U.S. Territories | 56 | 56 | 2 | 112
Webinars | 56 | 56 | 0.5 | 28
TOTAL | 56 | 112 | - | 140
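As an illustrative check (not an additional figure in this submission), the annualized respondent cost can be reproduced from the table above by multiplying the total burden hours by the assumed hourly rate:

140 total burden hours × $47.51 per hour = $6,651.40, or roughly the $6,652 reported above.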
SLDS Survey respondents will not incur any costs for this data collection other than their time to respond.
Method for estimating costs: The costs include the projected annual amount of time that Department of Education staff will spend on the survey, separated by labor type. Contracted staff are intended to support pre-collection, collection, and analysis, with federal guidance and oversight. The mean salary for financial managers ($47.51 per hour) was used as the hourly rate.
Estimated Annual Cost of the SLDS Survey to Federal Government for Fiscal Year 2017
Labor Type | Annual Labor Hours | Cost
Collection and Data Entry | 50 | $2,375.50
Data Analysis | 80 | $3,800.80
Product Development and Publishing | 40 | $1,900.40
Total | 170 | $8,076.70
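Each cost figure above is the product of the annual labor hours and the $47.51 hourly rate described in the estimation method, shown here only as a worked check:

50 hours × $47.51 = $2,375.50; 80 hours × $47.51 = $3,800.80; 40 hours × $47.51 = $1,900.40; total: 170 hours × $47.51 = $8,076.70.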
Cost: Department of Education staff assigned to the SLDS Survey include one-fifth of one FTE of a GS-15 Program Officer's time and one-fifth of one FTE of a GS-12 Program Officer's time. Contracted staff also supporting the SLDS Survey (included in the SLDS Contract total) consist of three full-time Program Analysts employed through Applied Engineering Management Corporation.
The Federal Government will incur no additional cost for the implementation of this survey beyond the existing cost of managing the SLDS grant program.
This is a new collection for the federal government. As such, it represents an overall burden increase.
The SLDS Survey is an annual collection, and the schedules for different years of its administration are shown below.
2017 Timeline | SLDS Survey Collection, Processing, and Publication
Early April 2017 | Email instructions to SEA respondents
April – June 15, 2017 | One or two webinars, on an as-needed basis, to provide more information about the Survey, how to complete the instrument, and NCES's planned use of the data, and to address respondents' questions about the Survey
June 15–30, 2017 | Survey final reminder email
June 30, 2017 | SEAs are urged to have finished submitting accurate and complete data
July 15, 2017 | Mandatory final submission date
September 12, 2017 | Responses by SEAs to requests for clarification, reconciliation, or other inquiries from NCES. All data issues to be resolved. Survey submission closes on the Tuesday following Labor Day. No files are accepted after close-out.
October 15, 2017 | NCES review of files, file documentation, and brief analysis completed. Provisional responses available for internal use but not publication
November 15, 2017 | Indicator tables and use cases become public; NCES website updated. Current-year collection data will be available to assess and respond to ad hoc requests
November 15, 2017 – April 1, 2018 | Respondents have the option to make update, change, or reconciliation requests to adjust state-specific data reflected in the SLDS Survey public indicator tables
Beginning in 2018, the SLDS Survey data collection will begin in August of each year and more time will be allocated to the review of the submitted data.
2018 & 2019 Timeline |
SLDS Survey Collection, Processing, and Publication |
August 2018 |
Email instructions to SEA respondents |
August – September 2018 |
One or two webinars, on an as-needed basis, to provide more information about the Survey, how to complete the instrument, NCES’s planned use of the data, and to address respondents’ questions about the Survey |
September 15–30, 2018 |
Survey final reminder email |
September 30, 2018 |
SEAs are urged to have finished submitting accurate and complete data |
October 15, 2018 |
Mandatory final submission date |
November 2018 |
Response by SEA’s to requests for clarification, reconciliation, or other inquiries from NCES. All data issues to be resolved. No files are accepted after close-out. |
February 15, 2018 |
NCES review of files, file documentation, and brief analysis completed. Provisional responses available for internal use but not publication |
June 15, 2018 |
Indicator tables and use cases become public, NCES website updated. Current year collection data will be available to assess and respond to ad hoc requests |
June 15 – August, 2018 |
Respondents have the option to make update, change, or reconciliation requests to adjust state-specific data reflected in the SLDS Survey public indicator tables |
NCES will generate a set of metrics and use cases showing data-linking and data-use capacity by State, which will enable interested users to quickly ascertain which States have the capacity to link data across sectors (for example, which can link K12, postsecondary, and workforce data) and how they are using these data to inform policy and practice. These metrics and use cases are expected to be published on the SLDS website. The SLDS grant program is currently responsible for providing updated indicators to OMB on a quarterly basis, so the validity of this reporting will be enhanced as a result of this collection. As data needs evolve, the intention is to post more data publicly. The data collected from the SLDS Survey will also be used to respond to questions from internal and external stakeholders regarding SLDS capacity in the States, and to inform future grant rounds and technical assistance planning.
No approval is sought to not display the expiration date of OMB approval.
There are no exceptions to the certification for Paperwork Reduction Act submission.