REL Southwest
Project Title: 4.11 Understanding the Impact of Providing Information to Parents about the Role of Algebra II: An Opportunistic Study
Responses to Technical Working Group Comments
June 2014
Description of Topic and Articulation of Need
Comment Hans Bos:
I am not going to provide detailed feedback on this background section but all of my feedback on the 4.10 introduction applies. I think it is especially important to provide citations for the strong hypotheses about the differential impacts for low-income students.
RESPONSE: We have added citations for studies that suggest that changing graduation requirements could have a negative impact on low-income and minority students.
Research Questions
Comment Geoffrey Borman:
CLARIFY: What strata--Students stratified by demographics, or schools stratified by some characteristics? Since Q 1a and 1b deal with differential impacts by varying strata of schools, I’m even more confused.
RESPONSE: The strata are defined below in the section describing the selection of schools to participate in the study. It would be somewhat awkward to define them in the research question.
Comment Hans Bos:
What are these?
RESPONSE: The strata are defined below in the section describing the selection of schools to participate in the study. It would be somewhat awkward to define them in the research question.
Literature Review
Comment Geoffrey Borman:
CLARIFY: I’m not exactly sure what this means. Does this mean that low-income and minority students’ parents are more influenced by school systems’ stated graduation requirements than more affluent and non-minority students’ parents? That would be interesting—and I’m not sure why that might be the case. Or, is this simply saying that these kids are most often adversely affected by these requirements because their outcomes often fall below the required thresholds?
RESPONSE: We are saying that low-income and minority children are influenced more than other students by school systems’ graduation requirements because they are more likely to complete only the minimum requirements. This has been demonstrated by research showing that low-income and minority children take more courses, and more difficult courses, when graduation requirements become more stringent. This research leads us to suspect that these students may be particularly likely not to take algebra II once it is no longer a graduation requirement. We have added a sentence to clarify this.
The Intervention
Comment Hans Bos:
I think it would be good to also present what the alternatives for Algebra II could be, just to make the document seem balanced and comprehensive. Algebra II may not be the best option for everyone. You just want people to be aware of the consequences of their decisions either way.
RESPONSE: The goal of the document is to provide students with information about how well aligned the new graduation plans are with the admissions requirements of public four-year colleges and universities in Texas. The document will show each of the graduation plans and how it matches up to the admissions requirements. This will inform students of what they need to take to be “college ready.” If they do not wish to attend college, that is fine; the brochure will let them see that, in that case, they do not need to take algebra II.
Comment Hans Bos:
Why not also provide the information directly to the children?
RESPONSE: The informational brochures will be mailed directly to students’ homes. This way both students and parents should have access to the information. We considered providing additional materials for schools to distribute to students, but ultimately, we decided that it would be too difficult to determine whether or not schools had actually distributed the materials, and we did not wish to place additional burden on schools. Moreover, the materials are only appropriate for students in particular cohorts, and we did not wish to confuse students in non-targeted cohorts, as this would provide them with incorrect information about high school graduation requirements.
Comment Geoffrey Borman:
CLARIFY: What was produced in Florida?
RESPONSE: Florida produced a flyer about graduation requirements that was sent to parents. A copy of the flyer is now included in Appendix C.
Comment Geoffrey Borman:
CLARIFY: What is the 10% rule?
RESPONSE: The 10% rule is defined in the introduction. Under this rule, the top 10% of students in each public high school gain automatic admission to public four-year colleges and universities in Texas.
Comment Geoffrey Borman:
CLARIFY: The information goes to parents, and not students, right?
RESPONSE: Yes. This was a text error. It has been fixed.
Comment Geoffrey Borman:
FIX: Again, the intervention is targeted at the parents, and you hope that it influences students’ choices through this communication with their parents. I would suggest being more explicit and precise about explaining this throughout.
RESPONSE: Yes. The intervention is targeted at parents, and we hope that student course selection is influenced through communication with parents/guardians. We have changed the text to make this more explicit.
Recruitment and Information Dissemination
Comment Hans Bos:
I think you should stratify your sample so that you have a better chance of a representative sample regardless of your recruitment success.
RESPONSE: Yes. We are doing that. The method for doing this is described in the Selecting the Schools section.
Selecting the Schools
Comment Geoffrey Borman:
CLARIFY: The method calls for using covariates that are predictive of the outcome--students’ decision to complete Algebra II--and this point should be detailed here, I think.
RESPONSE: We have added language to clarify this.
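To illustrate the general approach (and only as an illustration), the sketch below stratifies a hypothetical set of schools on covariates predictive of algebra II completion and then randomly assigns schools to condition within strata. The column names, stratum definitions, and counts are assumptions for the sketch, not the study’s actual selection procedure.

```python
# A minimal sketch, not the study's actual selection procedure: stratify schools on
# covariates predictive of algebra II completion, then randomly assign within strata.
# All column names and values below are hypothetical.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2014)

# Hypothetical school-level frame
schools = pd.DataFrame({
    "school_id": np.arange(1, 241),
    "prior_alg2_rate": rng.uniform(0.40, 0.95, 240),  # prior completion rate
    "pct_low_income": rng.uniform(0.10, 0.90, 240),
})

# Cross quartiles of the two covariates to form strata
schools["stratum"] = (
    pd.qcut(schools["prior_alg2_rate"], 4, labels=False).astype(str)
    + "_"
    + pd.qcut(schools["pct_low_income"], 4, labels=False).astype(str)
)

# Randomly assign half of the schools within each stratum to treatment
def assign_within_stratum(group):
    random_rank = rng.permutation(len(group))
    return group.assign(treatment=(random_rank < len(group) // 2).astype(int))

schools = schools.groupby("stratum", group_keys=False).apply(assign_within_stratum)
print(schools.groupby(["stratum", "treatment"]).size())
```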
Power
Comment Geoffrey Borman:
CLARIFY: How did you conclude that this is a reasonable estimate of the expected impact? I’m having a difficult time conceptualizing what an impact of this magnitude would really mean in practical terms. One interpretation might be that in a typical T and C school of 1,000 kids, the intervention may be expected to increase Algebra completion by 50 more students in the T schools.
RESPONSE: We determined that a difference of less than 5 percentage points between the treatment and control groups would not be practically meaningful. In practical terms, this means that the algebra II completion rate would be 5 percentage points higher in the treatment group than in the control group. We do not find the interpretation suggested above to be clearer, so we have opted not to include that text in our proposal.
Comment Hans Bos:
That seems excessive for this experiment but I will pass judgment after reading the appendix.
RESPONSE: We have redone our power analysis, changing the assumptions slightly, and have arrived at a somewhat smaller number of schools. We acknowledge that this study may be somewhat overpowered; however, we are hesitant to reduce the number of schools included in this study much further. Changing the number of schools in this study has a direct impact on project 4.10, since the control group for this study comprises the entire sample for project 4.10. We worry that we will not achieve a representative sample of schools in project 4.10 if we use fewer than 100 schools. Additionally, reducing the number of schools in 4.11 will have a negative impact on the power for project 4.10.
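For reference, the sketch below shows the standard two-level cluster-RCT minimum detectable effect size (MDES) calculation that underlies reasoning of this kind. The intraclass correlation, covariate R-squared values, and per-school cohort size shown are illustrative assumptions only; they are not the values used in our power analysis.

```python
# A minimal sketch of a two-level cluster-RCT MDES calculation. The ICC, covariate
# R-squared values, and sample sizes below are illustrative assumptions, not the
# figures from the proposal's power analysis.
from scipy import stats

def mdes_cluster_rct(n_schools, students_per_school, icc, r2_school, r2_student,
                     prop_treated=0.5, alpha=0.05, power=0.80):
    """Minimum detectable effect size in standard deviation units."""
    df = n_schools - 2  # school-level df; stratum and covariate terms would reduce this further
    multiplier = stats.t.ppf(1 - alpha / 2, df) + stats.t.ppf(power, df)
    p = prop_treated
    var_between = icc * (1 - r2_school) / (p * (1 - p) * n_schools)
    var_within = (1 - icc) * (1 - r2_student) / (p * (1 - p) * n_schools * students_per_school)
    return multiplier * (var_between + var_within) ** 0.5

# Illustrative values: 220 schools, 200 students in the targeted cohort per school,
# ICC of 0.15, school-level covariates explaining half of the between-school variance.
print(round(mdes_cluster_rct(220, 200, icc=0.15, r2_school=0.50, r2_student=0.10), 3))
```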
Attrition
Comment Geoffrey Borman:
FIX: New guidance on cluster randomized trials from WWC has implications for the student sample. The student sample, ideally, should be determined based on student rosters obtained just prior to random assignment. A student-level impact is estimated only for those students who remain at the schools they were attending just prior to random assignment—the “stayer” sample. This may influence decisions about when to randomize and, of course, how you define the student sample.
RESPONSE: Our intervention functions somewhat differently from most interventions. In this study, students will not actually know that they are in a study, and they officially will not be in our study until they are in their final semester of Geometry. As such, any attrition from the study cannot be related to the intervention. This is similar to other multiyear RCTs that involve multiple cohorts of students: our students may be in the same cohort of incoming students, but they are not all in the same mathematics-level cohort.
Data Analyses
Comment Hans Bos:
As in the other paper I think it is important to broaden these outcomes to include “opportunity cost” variables, such as other courses or activities that students do instead of Algebra II, as well as graduation rates.
RESPONSE: While we agree that the outcomes needed to be broadened in project 4.10, and we have done so, we do not believe that doing so is appropriate for this study. Project 4.10 is designed to look at what courses students take instead of algebra II, so that question will not be neglected. This project, 4.11, is designed to look at the impact of an intervention designed to inform students about the role of algebra II in college access and success. As such, we are limiting this study to examining enrollment in and completion of algebra II. We would like to be able to look at changes in graduation and dropout rates, but we cannot do so, as the REL contract ends before data on these outcomes would be available.
Confirmatory
Analyses
Comment Hans Bos:
As with the other study, I think you should look at Algebra II enrollment as well.
RESPONSE: Algebra II enrollment has been added as an outcome in both studies.
Comment Geoffrey Borman:
CLARIFY: How many strata (and how many degrees of freedom) will be used?
RESPONSE: As described in the section on sample selection, the number of strata is determined by the analysis used to select schools. We suspect that the number of strata will be around 10; however, at present we do not yet have a definite answer to this question.
Comment Geoffrey Borman:
CONSIDER: Would it also make sense to include as a school-level covariate prior Algebra II completion rates for earlier cohorts?
RESPONSE: Yes. We have added prior Algebra II completion rates as a school-level covariate.
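As an illustration only, an impact estimate of this general form could be sketched as a student-level logistic regression of algebra II completion on the school-level treatment indicator, stratum fixed effects, and the prior completion-rate covariate, with standard errors clustered by school. The column names below are hypothetical, and the specification is a sketch rather than the exact model in the proposal.

```python
# A minimal sketch, not the proposal's specified model: student-level logit of
# algebra II completion on treatment assignment, stratum fixed effects, and the
# school's prior completion rate, with school-clustered standard errors.
# The analysis file and its column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

students = pd.read_csv("analysis_file.csv")  # hypothetical student-level file

model = smf.logit(
    "completed_alg2 ~ treatment + prior_alg2_rate + C(stratum)",
    data=students,
)
result = model.fit(
    cov_type="cluster",
    cov_kwds={"groups": students["school_id"]},
)
print(result.summary())
```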
Comment Geoffrey Borman:
CLARIFY: Will undeliverable letters come back to SEDL so that you may also track how many letters (and for whom) did not get delivered to the intended recipients? This could lead to some interesting TOT analyses, which would model school-level treatment assignment as an instrument for parent/student receipt of the information, and its corresponding relation to the outcome.
RESPONSE: Undeliverable envelopes will be returned to REL Southwest. This will allow us to track how many envelopes were not delivered. As suggested above, we will conduct an instrumental variables analysis in which we model school-level treatment assignment as an instrument for parent/guardian receipt of the information and examine its relationship to algebra II enrollment and completion.
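The sketch below illustrates the TOT logic with an explicit two-stage setup, using school-level assignment as the instrument for household receipt of the brochure. The variable names are hypothetical, and an actual analysis would use a dedicated IV estimator so that the standard errors account for the first stage and for the clustering of students within schools.

```python
# A minimal two-stage sketch of the TOT idea under assumed, illustrative column names:
# random assignment instruments for whether the household actually received the brochure.
# The two stages are shown explicitly for clarity only; a production analysis would use
# an IV routine with first-stage-corrected, school-clustered standard errors.
import pandas as pd
import statsmodels.formula.api as smf

students = pd.read_csv("analysis_file.csv")  # hypothetical student-level file

# First stage: receipt of the brochure as a function of assignment and strata
first = smf.ols("received_brochure ~ assigned + C(stratum)", data=students).fit()
students["received_hat"] = first.fittedvalues

# Second stage: algebra II completion on predicted receipt (linear probability model)
second = smf.ols("completed_alg2 ~ received_hat + C(stratum)", data=students).fit()
print(second.params["received_hat"])  # TOT estimate of receiving the brochure
```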
TWG Summary Notes
Geoffrey Borman Summary Notes:
I believe that the revisions have strengthened the proposal considerably. There seems to be some general evidence and literature that will be used to help craft the letter, and this has improved the overall plan. Still, I would suggest that you pilot the letter with non-sampled parents, if possible. Because the letter is the intervention, I suggest taking great care in how it is developed and piloted. Not seeing an example of the actual letter also concerns me somewhat. I am not certain why the difference-in-difference approach is proposed in addition to the HLM impact analysis. I would suggest eliminating this unless a stronger rationale can be proposed. There are recent WWC guidelines regarding cluster randomized trials, which I referred to in my comments. I would suggest that you track your student sample in accordance with these new guidelines. Regarding implementation fidelity, I would suggest (as I noted in my comments) that you also track which letters were undeliverable. In high-minority and high-poverty schools, I would guess that the undeliverable rates would be higher due to more frequent family and student mobility. Thus, you could erroneously conclude that the treatment was less effective in these schools, but in reality it may have been that fewer parents actually received the treatment in these schools. Finally, you should indicate that you will evaluate whether Algebra II completion rates are normally distributed. You may need to transform this outcome if it is highly skewed, for instance. There are a few other points of clarification noted in my comments but, overall, I think this proposal is quite strong.
RESPONSE: We do plan to test the brochure with parents of non-sampled students. Text referring to this has been added to the proposal. We have not yet created the brochure, but a copy of the brochure will be included in a later version of this proposal for RPR. We have removed the difference-in-difference analyses. We will also track the letters that are returned as undeliverable. The outcomes in this study are all binary. Therefore, we do not expect algebra II enrollment or completion to be normally distributed, and no transformation is necessary.
Summary Comment Hans Bos:
Here are my comments on the second proposal. Again, my apologies for the delay. I think this is an excellent proposal and a great opportunity to learn. Many of my big comments parallel those in the other proposal. I think you should focus on a larger set of outcomes and you need to make the proposal a bit more balanced about the Algebra II decision. It is possible that students who do not choose Algebra II do so because they know what they are doing. You have to allow for that possibility.
RESPONSE: We have added algebra II enrollment as an additional outcome to this study and to project 4.10. Project 4.10, the companion to this study, does include a wider array of outcomes, including the types of mathematics courses that students complete. As such, we are not repeating them here. The focus of this study is on the intervention, which is designed to provide students with information about the role of algebra II for college access and success. Thus, we are restricting our focus to outcomes that we believe are directly related to this, namely algebra II enrollment and completion.
Finally, I do not understand how you did the power analysis but it seems overly conservative. Even with a small MDES, my gut feeling is that this study should not take more than about 100-150 schools, provided you use individual student data and get some good school background data. But I didn’t repeat your calculations so I may be wrong.
RESPONSE: We have recalculated our power analysis and arrived at a somewhat smaller number of required schools—approximately 220. While we agree that this study may be overpowered, we are hesitant to reduce the sample further, as doing so would have a direct effect on the representativeness of the sample and power for project 4.10.