Memorandum
United States Department of Education
Institute of Education Sciences
National Center for Education Statistics
DATE: October 14, 2011
TO: Shelly Martinez, OMB
FROM: Freddie Cross, NCES
THROUGH: Kashka Kubzdela, NCES
SUBJECT: Beginning Teacher Longitudinal Study (BTLS) 2012 Questionnaire Items and Respondent Incentives Change Request (OMB# 1850-0868 v.3)
The purpose of this memo is to request clearance for changes to the Beginning Teacher Longitudinal Study (BTLS; OMB clearance number 1850-0868 v.2) wave 5 questionnaire and for a revision to the incentive offered to potential survey respondents, and to submit the results of a video reminder experiment.
The questionnaire changes were made to update relative dates and to refine language used in some of the BTLS survey items. Appendix A reflects changes other than relative date changes and explains the reason behind each change. Appendix B contains the full questionnaire, with its skip patterns, and shows which items have changed.
The spreadsheet in Appendix B is color-coded to highlight all changes. Changes to cells highlighted in blue mostly consist of changes to dates and variable names, while changes to cells highlighted in peach were made as a result of cognitive testing:
- Only changes in columns L and M reflect wording changes (explained in Appendix A).
- Changes in columns I, J, and N reflect updated variable labeling only, and changes in column K are a mixture of updated reference years and updated variable labeling, with red lettering indicating the specifics of the changes made. For example, the variable name in cell I13 was updated from W3REGCL to W4REGCL. These are the retrospective questions asked of respondents who did not complete the previous questionnaire: last year's W3REGCL asked respondents whether they were teaching in 2009-10 because they did not complete the wave 3 questionnaire, while this year's W4REGCL asks respondents whether they were teaching in 2010-11 because they did not complete the wave 4 questionnaire.
- In column J, all leading letters were changed from either H to I or, for retrospective items, from G to H.
- In column N, the skip pattern variable labels were updated to reflect the above changes to variable labels.
In summary, six items were deleted from the BTLS questionnaire (primarily due to low response rates), many items had only their dates changed to make them appropriate for this next wave, some items were refined and the revisions cognitively tested, and two items were added. Cognitive testing was conducted by Macro International under a contract with the U.S. Census Bureau (see Appendix C for the cognitive laboratory report).
Two items were added to the questionnaire because cognitive testing indicated that, for items LVHOM through LVIMP, respondents were not answering the question "Indicate the level of importance EACH of the following played in your decision to leave your pre-K-12 teaching position," but rather were indicating which reasons for leaving teaching were important. These items present specific reasons the respondent left teaching, with 24 matrix items spread over 9 pages of the questionnaire. We determined that respondent burden can be reduced by having respondents complete the added item LVWHY, an open-ended item asking respondents to fill in the reason they left teaching. This information will then be upcoded to the variables LVHOM through LVARW. If the respondent does not answer LVWHY, he or she will be directed to answer LVHOM through LVIMP.
Similarly, for items REHOM through REIMP, cognitive testing indicated that respondents were not answering the question "Indicate the level of importance EACH of the following played in your decision to return to the position of a pre-K-12 teacher," but rather were indicating which reasons for returning to teaching were important. These items present specific reasons the respondent returned to teaching, with 17 matrix items spread over 6 pages of the questionnaire. We added item REWHY, an open-ended item asking respondents to fill in the reason they returned to teaching. This information will be upcoded to the variables REHOM through RESCH; if the respondent does not answer REWHY, he or she will be directed to answer REHOM through REIMP.
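For clarity, the routing just described can be summarized in pseudocode. The sketch below is purely illustrative, with hypothetical field and function names; the authoritative skip patterns are those specified in the Appendix B questionnaire:

```python
# Illustrative sketch of the LVWHY/REWHY routing described above.
# Field names and structure are hypothetical, not the actual instrument code.

def route_leaver_items(response: dict) -> str:
    """Decide how the leaver 'reason' items are handled for one case."""
    open_answer = (response.get("LVWHY") or "").strip()
    if open_answer:
        # An open-ended reason was given: it is later upcoded to the
        # variables LVHOM through LVARW, and the 24-item matrix is
        # skipped, reducing respondent burden.
        return "upcode LVWHY to LVHOM-LVARW; skip matrix"
    # No open-ended answer: direct the respondent to the full matrix.
    return "administer LVHOM through LVIMP"

def route_returner_items(response: dict) -> str:
    """The same logic for the returner items, REWHY and REHOM-REIMP."""
    if (response.get("REWHY") or "").strip():
        return "upcode REWHY to REHOM-RESCH; skip matrix"
    return "administer REHOM through REIMP"
```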
We will use the results for the new items, LVWHY and REWHY, to examine whether this method significantly reduces respondent burden while yielding the same responses.
The BTLS questionnaire has many different skip patterns to accommodate different career trajectories of teachers, and in each administration of BTLS the proportion of teachers with different trajectories changes. Therefore, although six items have been deleted and two added, overall we do not expect a significant change in the total burden to respondents as compared to the previous year.
During BTLS wave 4, we conducted an experiment on whether a reminder video sent to study participants who did not respond to the questionnaire by the first follow-up date had an effect on response rates. The results of this experiment are provided in Appendix D.
During BTLS wave 3 we conducted an incentive experiment, comparing response rates and response times for respondents offered $10 versus $20 incentives. In wave 4, we offered all respondents $10. In light of the response rates achieved in wave 4, we would like to provide $20 incentives to all solicited potential respondents in wave 5. Wave 5 will be the last wave of this first BTLS study, and we need to minimize panel losses to assure the utility of the resulting data. Below we provide analyses of the wave 3 and wave 4 response patterns to explain why we believe that offering a $20 rather than a $10 incentive will help us achieve the needed response rate.
NCES plans to begin a new BTLS cohort in 2015-16, and we will review the results of the propensity-model approaches to providing incentives that are currently being tested by a number of NCES longitudinal studies to determine whether such models are appropriate for BTLS.
RELATIONSHIP BETWEEN INCENTIVES AND BTLS RESPONSE RATES
Introduction
The Beginning Teacher Longitudinal Study (BTLS) is a study of a group of public school teachers who began teaching in 2007 or 2008. The objective of the BTLS is to better understand the impact that different life events have on new teachers' careers and how new teachers respond to these life transitions. Because the study will follow this cohort of teachers for multiple years, maintaining a high response rate in each year of data collection is crucial to the usability of the resulting data. One of the strategies BTLS uses to achieve high response rates is offering a monetary incentive to potential respondents.
To boost response rates for the 2009-10 data collection (wave 3; OMB# 1850-0868 v.1), NCES gave noncontingent cash incentives to study participants in advance of the survey. Because an optimal incentive amount had not been determined at that time, NCES included an experimental design to test the effects of differential incentive amounts on response rates. All sampled cases in the BTLS cohort were randomly assigned to one of two experimental groups, a $10 incentive group or a $20 incentive group, and were mailed a letter with the incentive three days before they received an e-mail containing the link to the online BTLS instrument. The impact of the different monetary incentives on survey completion, completion date, and completeness of the survey responses was investigated.
The results showed that the larger incentive amount ($20) was associated with both a higher early survey response rate (responses prior to the start of the telephone follow-up period, which ran from 2/1/2010 to 6/4/2010) and a higher final response rate. However, the cost of the wave 3 data collection, taking into consideration the cost of telephone follow-up ($39 per case on average), showed that the $20 incentive group had a higher average cost per respondent¹ than the $10 incentive group ($41 vs. $35). The results also showed that the incentive amount was not associated with the completeness of the survey responses; 97 percent of the study interviews reached the last page of the survey, meaning that most respondents who answered the required items also completed the survey.
2010-11 (Wave 4) Incentive Plan
Based on OMB clearance request passback comments of 12/30/2010, NCES decided to give a $10 incentive to all sampled cases during the 2010-11 BTLS data collection. NCES did not repeat the wave 3 $20/$10 incentive design in wave 4 for the following reasons:
- We did not want to set long-term expectations for the $20 group if it was not clear that we could offer $20 to them in the future.
- Given the current economic situation, it would have been relatively easy to explain the decrease in the incentive amount to the $20 group from wave 3.
- With the release of the BTLS documentation, the cohort might learn of the differential incentives, which could have a negative impact on future response rates.
- The differential incentives might have complicated the video reminder experiment that was planned for wave 4.
- The impact of the decreased incentive could ultimately be investigated, contributing to research on the use of incentives in longitudinal surveys.
Methods and Analysis
Similar to the 2009-10 wave 3 data, the 2010-11 wave 4 data (OMB# 1850-0868 v.1 and v.2) were primarily collected through a web instrument with telephone follow-up. The first item and several subsequent items in the instrument were designed as required questions; that is, respondents could not proceed through the survey without answering them. These required questions were used to determine respondents' teaching status (current teachers vs. former teachers; stayers vs. movers), which determined the paths respondents took through the survey. In addition to the web instrument, BTLS participants had the option to complete the survey over the phone by calling a toll-free number. During the telephone follow-up period (from 2/14/2011 to 7/1/2011), sampled study participants who had not responded to the web instrument were called and offered the opportunity to answer the survey questions over the phone.
The same methods used in wave 3 for informing study participants about the cash incentive and for delivering the incentives through the Census Bureau's National Processing Center (NPC) in Jeffersonville, IN, were used for the wave 4 data collection. Specifically, in late 2010, all sampled cases in the BTLS cohort received initial contact letters/e-mails informing them about the upcoming incentive and asking them to update their addresses using our online tool so that the incentive could be mailed to the correct address. In early January 2011, all of the sampled cases in the BTLS cohort with valid addresses received letters with the incentives. Based on the sample size, a total of $19,900 in $10 bills was sent. By the end of the data collection, 53 incentives had been returned: 2 from study refusals and 51 as undeliverable as addressed (UAA).
The following research questions were explored in the analyses:
- Was the same incentive amount able to maintain the same early and final response rates?
- Did a decreased incentive amount result in lower early or final response rates?
Using chi-square tests, comparisons were made between wave 3 and wave 4 on the number of interviews completed before the telephone follow-up date (early responses) and on the number of interviews completed by the end of the data collection (final responses), within the same incentive group and the decreased incentive group, respectively.
These analyses were conducted using the Final Interview Status Recode (ISR) file, which contains the 2010-11 case statuses (whether a case is an interview, a nonrespondent, or out-of-scope for the collection) as well as the survey completion dates for complete cases and the last login dates for incomplete cases.
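As an illustration, the comparisons described above could be computed as follows. This is a minimal sketch, assuming a simplified in-memory stand-in for the ISR file; the field names ("status", "completion_date") are hypothetical, not the actual ISR layout:

```python
# Illustrative sketch of the chi-square comparisons described above.
from datetime import date
from scipy.stats import chi2_contingency

# Telephone follow-up start dates from the text.
FOLLOWUP_START = {"wave3": date(2010, 2, 1), "wave4": date(2011, 2, 14)}

def early_final_counts(cases, wave):
    """Count early respondents and final respondents among all cases."""
    early = final = 0
    for c in cases:
        if c["status"] == "interview":
            final += 1
            if c["completion_date"] < FOLLOWUP_START[wave]:
                early += 1
    return early, final, len(cases)

def compare_final_across_waves(cases_w3, cases_w4):
    """2x2 chi-square: rows = wave, columns = final respondent or not."""
    _, f3, n3 = early_final_counts(cases_w3, "wave3")
    _, f4, n4 = early_final_counts(cases_w4, "wave4")
    chi2, p, dof, _ = chi2_contingency(
        [[f3, n3 - f3], [f4, n4 - f4]], correction=False)
    return chi2, p, dof
```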
Results
The sample size used in the analyses presented here is smaller than the number of people who received incentives in wave 4 because some of them did not receive any incentives² in wave 3. The current sample includes 1,893 cases that received incentives in both wave 3 and wave 4 and were not deemed out-of-scope (OOS) by the end of the wave 4 data collection. Table 1 shows the response rates of this sample in waves 3 and 4. The overall final response rates were 88 percent for wave 3 and 85 percent for wave 4. The chi-square test showed a significant relationship between final response status (final respondent vs. final nonrespondent) and wave (chi-square with one degree of freedom = 6.87, p = .008), indicating that the overall final response rate was significantly lower in wave 4 than in wave 3.
In wave 4, the final response rate of people who received the same incentive amount was about 84 percent, and that of people who received the decreased amount was about 86 percent. For the people who received the same incentive amount in both waves, the chi-square test did not show a significant relationship between final response status and wave; that is, there was no significant change in their final response rate from wave 3 to wave 4. For the people who received the decreased incentive amount in wave 4, the chi-square test showed a significant relationship between final response status and wave (chi-square with one degree of freedom = 7.37, p = .007); that is, the final response rate of this group was significantly lower in wave 4 than in wave 3. The chi-square tests did not show a significant relationship between early response status (early vs. non-early respondent) and wave for either the same incentive or the decreased incentive group.
Within wave 4, the early response rate of the decreased incentive group was about 55 percent, compared with about 50 percent for the same incentive group. The chi-square test showed a significant relationship between early response status and group (chi-square with one degree of freedom = 4.02, p = .045). This means that the group that received $20 in wave 3 but only $10 in wave 4 still had a significantly higher early response rate in wave 4 than the group that received $10 in both waves. The chi-square test did not show a significant relationship between final response status and group.
Table 1. Response rates of BTLS wave 3 and wave 4 incentive experiment, by wave and incentive amount: 2009-2011

Wave                                  | Number of teachers receiving incentives | Early response rate¹ | Final response rate²
--------------------------------------|-----------------------------------------|----------------------|---------------------
Wave 3 overall response rate          | 1,893                                   | 53.1                 | 88.0
  10-dollar incentive group           | 948                                     | 49.5                 | 85.8
  20-dollar incentive group           | 945                                     | 56.8                 | 90.3
Wave 4 overall response rate          | 1,893                                   | 52.5                 | 85.1
  same incentive group ($10-$10)      | 948                                     | 50.2                 | 84.0
  decreased incentive group ($20-$10) | 945                                     | 54.8                 | 86.2

¹ Early response rate is the percentage of study interviews completed before the telephone follow-up start date: 02/01/2010 for wave 3 and 02/14/2011 for wave 4. Both completed surveys and partially completed surveys with the required items answered are counted as study interviews in BTLS processing.
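As a rough cross-check, the wave 4 early-response comparison reported above can be approximately reproduced from Table 1. In the sketch below, the counts are reconstructed from the rounded percentages in the table, so the statistic only approximates the reported value of 4.02:

```python
# Approximate reconstruction of the wave 4 early-response chi-square
# from Table 1; counts are derived from rounded percentages.
from scipy.stats import chi2_contingency

n_decr, n_same = 945, 948
early_decr = round(n_decr * 0.548)  # decreased incentive group, ~54.8%
early_same = round(n_same * 0.502)  # same incentive group, ~50.2%

table = [[early_decr, n_decr - early_decr],
         [early_same, n_same - early_same]]
chi2, p, dof, _ = chi2_contingency(table, correction=False)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3f}")  # ~4.02, p ~ .045
```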
Wave 4 Cost Analysis
Because the response rate for BTLS wave 4 did not reach the expected 85 percent by the end of the 4-month follow-up period, the data collection was extended for two additional weeks, until 7/1/2011. Due to this and the following factors, Census estimated that the telephone follow-up cost was about $84 per followed-up case for wave 4, compared with $46³ per case in wave 3:
- A greater number of interviewer hours was needed because respondents become harder to reach each year.
- Census had dedicated BTLS staff scheduled to take incoming wave 4 calls, whereas in wave 3 Census was able to use staff from another survey to answer incoming calls.
Summary of Wave 4 Findings
In summary, a decreased incentive amount across waves was associated with a lower final response rate, though the early response rate was not affected. The changes in final and early response rates were not significant when the same incentive amount was used ($10 in both waves). A larger incentive ($20) had an impact beyond one wave's data collection: it was associated with a higher early response rate in the following wave and therefore with a lower telephone follow-up cost in that wave. The telephone follow-up cost was not stable across waves; given the prolonged data collection and the different personnel used in telephone follow-up, the cost can change significantly from wave to wave.
Projections for Wave 5
Table 2 shows our projected final response rates and costs per respondent under the $20 vs. $10 incentive scenarios for wave 5, to be administered in 2012. We expect considerably higher early and final response rates and a lower telephone follow-up cost under the $20 scenario, which would result in approximately the same total projected cost per case as the $10 scenario (~$62 per respondent⁴). In our estimates, the projected cost of telephone follow-up per case is the same as the follow-up cost per case in wave 4. The projected early and final response rates for the $10 incentive are the wave 4 early and final response rates of the same incentive group; the projected early and final response rates for the $20 incentive are the wave 3 early and final response rates of the $20 incentive group.
The formula below is used to calculate the projected cost of telephone follow-up:

Projected cost of telephone follow-up = telephone follow-up cost per case ($84) × (number of cases − projected number of early respondents)
The formula below is used to calculate the total projected cost per respondent:
Total projected cost per respondent = (Projected cost of incentive + Projected cost of telephone follow-up)/Projected number of final respondents
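For transparency, the sketch below applies these two formulas to both wave 5 scenarios using the assumptions stated above (the wave 4 follow-up cost of $84 per case, 1,990 cases, and the borrowed response rates); small rounding differences from Table 2 are expected:

```python
# Sketch applying the two projection formulas above to both scenarios.
FOLLOWUP_COST_PER_CASE = 84  # wave 4 telephone follow-up cost per case
N_CASES = 1990

def project(incentive_amount, early_rate, final_rate):
    incentive_cost = incentive_amount * N_CASES
    early_respondents = N_CASES * early_rate
    followup_cost = FOLLOWUP_COST_PER_CASE * (N_CASES - early_respondents)
    final_respondents = N_CASES * final_rate
    per_respondent = (incentive_cost + followup_cost) / final_respondents
    return incentive_cost, followup_cost, per_respondent

# $10 scenario uses wave 4 same-incentive-group rates (50%, 84%);
# $20 scenario uses wave 3 $20-group rates (57%, 90%).
for amount, early, final in [(10, 0.50, 0.84), (20, 0.57, 0.90)]:
    inc, fup, cpr = project(amount, early, final)
    print(f"${amount}: incentive ${inc:,.0f}, follow-up ${fup:,.0f}, "
          f"total per respondent ~${cpr:.0f}")
```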
Table 2. Total projected cost for BTLS wave 5 incentive and telephone follow-up, by incentive amount: 2011-12

Incentive amount | Number of cases | Projected cost of incentive | Projected number of early respondents (response rate) | Projected cost of telephone follow-up | Projected number of final respondents (response rate) | Total projected cost per respondent
-----------------|-----------------|-----------------------------|-------------------------------------------------------|---------------------------------------|-------------------------------------------------------|------------------------------------
10 dollars       | 1,990           | $19,900                     | 995 (50%)                                             | $83,580                               | 1,672 (84%)                                           | $62
20 dollars       | 1,990           | $39,800                     | 1,134 (57%)                                           | $71,879                               | 1,791 (90%)                                           | $62
Recommendations for Wave 5
Because the 2011-12 questionnaire is the fifth and last data collection for the current BTLS cohort, maintaining a high final response rate is crucial to the utility of the resulting data. Based on what we have learned through the use of different incentive amounts in the last two waves, we have estimated the overall cost of using $10 vs. $20 incentives for the BTLS wave 5 data collection, and we recommend using $20 incentives in wave 5 to yield a higher response rate at approximately the same cost.
¹ As mentioned in the previous incentive experiment submission to OMB, the calculation is: cost per respondent = (cost of incentives − returned incentives + cost of telephone follow-up) / total number of respondents.
² Some incentives were returned by the respondent or by the post office as undeliverable as addressed.
³ The cost per case for wave 3 differs from the cost per case provided to OMB in an earlier document. During the wave 4 analysis, it was discovered that the wave 3 cost was based on the number of cases for which a Call Log was printed rather than the number of cases open when follow-up calls began (a difference of about 300 cases). Calculating the wave 3 and wave 4 costs using the same method results in a wave 3 cost of $46 rather than the $39 previously reported.
⁴ Based on rounded cost amounts.