Attachment A-9. Technical Working Group Suggestions
In June 2014, Southwest REL researchers obtained comments from two TWG reviewers: Dan
Goldhaber and Geoff Borman. This Appendix summarizes in bullet form the conceptual
suggestions that Southwest REL researchers received. Line edits are omitted from this
summary. The researchers’ responses are noted under each bullet, prefaced by “RESPONSE.”
Description of Topic and Articulation of Need
Comments:
• This requires a stronger argument. “We believe that” and “there is little reason to
expect” are not good arguments. Instead, describe what events/trends/developments
could mess things up and how your study will address these. Maybe you are doing this
later, but it’s worth doing in the introduction too.
• This statement may be a bit too strong, in my opinion. This sounds like something one
might say about the experimental manipulation of such a policy, which is clearly not
the case here.
RESPONSE: We’ve added trend data for the past five years showing the percentage of
students who completed the pre-policy degree plans that required algebra II. There is
very little change across time in the percentage of students completing algebra II. We
have also altered the wording to make it sound less like we are using a quasi-experimental design.
Research Questions
• I think it would be much better to focus on Enrollment and Completion rather than
Completion and Failure. The problem with failure is that you only observe it for
students who choose to enroll. That creates a huge potential selection bias.
Enrollment and Completion are both observed for everyone and by differencing them
you can back out effects on Failure without calling them that.
RESPONSE: We have added enrollment in algebra II as an outcome, and we have
changed the failure outcome. We are now looking at the percentage of students who
fail their third mathematics course, regardless of course content—so we will be looking
at the mathematics failure rates for all students, not just the ones who enroll in algebra
II.
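To make the reviewer’s differencing argument concrete, the relationship can be written as a minimal illustration (assuming every student who enrolls either completes the course or fails it; this is not a formula taken from the study plan):

\[
\Pr(\text{fail}) = \Pr(\text{enroll}) - \Pr(\text{complete}),
\]

where all three proportions are computed over the full student cohort, not only over students who choose to enroll.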
• As we discussed at the TWG meeting, if there is a way to assess whether students are
taking more “career- and vocational-type” classes after HB5, that would seem like a
key policy-relevant hypothesis to test.
RESPONSE: This is what we are planning to do as part of our descriptive analysis looking
at the types of mathematics courses that students are taking post-policy. We’ve added
some language to make this clearer.
Literature Review
• This entire section is overly biased in favor of Algebra II. While I agree that HB5 is silly
and incomprehensible, you have to try to at least come up with some potential
counterarguments. Why did people want to get rid of Algebra II and dumb down their
kids? What are these other mathematics courses that kids can now take to meet the
requirements and how potentially beneficial are they? Are there any serious
education researchers who agree with Texas? Are there any potential positive effects
of HB5? (Students less likely to drop out or become discouraged, fewer underprepared students qualifying for UT, career students being better prepared through
specific courses they now have the freedom to take). Are there any states or countries
that have done something like this? What were their experiences?
RESPONSE: We have reconstructed the literature review to better present both sides of
the algebra II debate and to provide a more objective review of previous research.
• I don’t think you can say that. The NUMBER of courses they take may not be affected
but the TYPE of courses could still be affected even for students who take more
courses than required.
RESPONSE: The graduation requirements at that time were simply increased; students
were required to take more courses in order to graduate. Students already taking the
required number of courses were not affected by the policy change.
Power
• This would seem to be a tiny effect. What would be the practical policy relevance of
such an effect?
RESPONSE: We do not believe that an effect size this small would have practical policy
relevance. However, it is the minimum detectable effect size for this study, which IES
requires us to calculate. We have added language specifying what we believe would be a
policy-relevant difference.
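For readers unfamiliar with how a minimum detectable effect size (MDES) is derived, the sketch below shows one common way to approximate it for a simple two-group comparison of means. The sample size, treatment share, and covariate R-squared used here are illustrative assumptions, not parameters of this study, and the calculation ignores any clustering of students within schools or districts.

```python
from scipy.stats import norm

def mdes(n_total, p_treatment=0.5, r_squared=0.0, alpha=0.05, power=0.80):
    """Approximate minimum detectable effect size (in standard deviation
    units) for a two-group comparison of means with individual-level
    assignment, using the normal approximation to the usual multiplier."""
    multiplier = norm.ppf(1 - alpha / 2) + norm.ppf(power)  # about 2.80
    variance_term = (1 - r_squared) / (p_treatment * (1 - p_treatment) * n_total)
    return multiplier * variance_term ** 0.5

# Illustrative only: 10,000 students split evenly between groups, with
# covariates explaining 40 percent of the outcome variance.
print(round(mdes(10_000, r_squared=0.40), 3))  # roughly 0.043
```

Under these illustrative inputs the MDES is roughly 0.04 standard deviations, which helps show why very small effects can be statistically detectable without being practically meaningful.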
Outcomes Analyses
• I see that success must mean attempted and passed (would any grade other than an
“F” qualify as a success?), but does failure mean not taken, or taken and not passed,
or both? I would think that taken and not passed rates would be higher pre-HB5,
because a larger and more diverse group, presumably, took the course when it was
required. How would this affect your analysis?
RESPONSE: Failure refers to not passing the course, which in Texas means earning a
grade of “F”. We will retain this definition of failure for our analyses; however, we will
look at failure rates for all students’ third mathematics course, rather than just the
failure rates for those who enroll in algebra II.
Limitations and Possible Solutions
• Why use only one year? If you are going to do an interrupted time series study like
this you need to have at least a couple of pre-HB5 data points to account for history
and maturation bias problems. What if Algebra II enrollment was on an upward trend
independently already? That could cause you to draw completely the wrong
conclusion.
RESPONSE: We are not conducting an interrupted time series analysis.