SUPPORTING STATEMENT
FOR OMB CLEARANCE
PART A
NASA Summer of Innovation FY2012
2012 PROGRAM DATA COLLECTION
National Aeronautics and Space Administration
December 16, 2011
Contents
A.1 Explanation of Circumstances That Make Collection of Data Necessary
A.2 How the Information Will Be Collected, by Whom, and For What Purpose
How Information Will Be Collected
Who Will Collect the Information
A.3 Use of Improved Information Technology to Reduce Burden
A.4 Efforts to Identify and Avoid Duplication
A.5 Efforts to Minimize Burden on Small Business or Other Entities
A.6 Consequences of Less-Frequent Data Collection
A.7 Special Circumstances Requiring Collection of Information in a Manner Inconsistent with Section 1320.5(d)(2) of the Code of Federal Regulations
A.8 Federal Register Comments and Persons Consulted Outside the Agency
A.9 Payments to Respondents
A.10 Assurance of Confidentiality
A.11 Questions of a Sensitive Nature
A.12 Estimates of Respondent Burden
A.13 Estimates of the Cost Burden to Respondents
A.14 Estimates of Annualized Government Costs
A.15 Changes in Hour Burden
A.16 Time Schedule, Publication, and Analysis Plan
A.17 Display of Expiration Date for OMB Approval
A.1 Explanation of Circumstances That Make Collection of Data Necessary
The National Aeronautics and Space Administration (NASA) Office of Education requests that the Office of Management and Budget (OMB) approve, under the Paperwork Reduction Act of 1995, a 16-month clearance for NASA to collect parent consent forms, student survey data, teacher survey data, and awardee implementation data as part of the formative study of NASA’s Summer of Innovation (SoI) Project FY2011. This package combines the data collection efforts under a single PRA clearance and requests revisions to the data collection activities, instruments, and sampling plan that were approved to evaluate the summer 2011 and 2011/2012 school-year SoI activities.
In 2010, NASA’s Office of Education launched the SoI pilot, a NASA-infused summer experience for middle school students who are underperforming, underrepresented, and underserved in science, technology, engineering, and mathematics (STEM) fields. The SoI pilot utilized a multi-faceted approach to reach and engage middle school students in STEM learning with NASA content and experiences. The topics ranged broadly and included activities concerning robotics, rocketry, engineering design, meteorology, space science, and climate science. Evaluation data were collected from various sources during the pilot to produce lessons learned regarding program design, implementation, and program evaluation. The pilot evaluation produced valuable insight into the program and was used to redesign SoI for 2011. However, it was limited in its ability to generate hypotheses about promising practices; in most cases the pilot evaluation was not able to field baseline surveys, which are necessary for assessing change in the program’s outcomes of interest.
Drawing heavily upon the lessons learned from the evaluation of the 2010 summer SoI pilot, NASA modified its approach to focus on expanding the capacity of community and school-based organizations to engage youth in STEM learning activities. In FY2011, NASA implemented a three-tiered solicitation and award structure, designed to provide selected awardees with different levels of funding and access to NASA staff, facilities, and technology to engage 4th through 9th grade students in intensive, high-quality, inquiry-based content learning experiences in STEM during the summer and the school-year. In FY 2011, NASA revised SoI’s expected outcomes to better reflect the nature and objectives of the new SoI. Given that the activities are short in duration, SoI shifted the focus of the program from attempting to impact student achievement directly to inspiring and engaging middle school students in NASA STEM content.
The 2012 programming requirements for existing SoI awardees are similar to those in FY2011; national awardees are required to provide 40 hours of student STEM activities utilizing NASA content over the summer and an additional 25 hours by the end of June 2013, while NASA Center partnerships must provide 20 hours of student STEM activities utilizing NASA content during the summer and an additional two STEM activities integrating NASA content by the end of June 2013. For the national awards, organizations receiving SoI funding are required to provide middle school classroom teachers with 40 hours of professional development by the end of June 2013 and to use them as part of their summer staff delivering NASA content; NASA Center partnerships are not expected to provide professional development for classroom teachers.
Expected outcomes for students in FY2012 include increased interest and engagement in STEM topics, careers, and leisure activities in the short term; ultimately, the program aims to contribute to an increase in the overall number of students pursuing STEM degrees and related careers and, more specifically, the proportion of underrepresented students who pursue these paths. NASA also seeks specific outcomes for classroom teachers in FY2012: to increase their access to and use of NASA content and resources in their classrooms so that over the long term, they have a better understanding of NASA content, increased confidence in teaching NASA topics, and improved ability to teach NASA topics. Furthermore, NASA seeks to build the sustainability of the awardees’ programs by supporting the development of partnerships with formal and informal STEM institutions so that they can eventually operate high-quality STEM programs independently at scale, as SoI funding diminishes in subsequent years.
As it continues to develop and refine the program, NASA is planning to contract with Abt Associates Inc. to continue the formative evaluation efforts with the FY2011 SoI national awards and NASA Center partnerships in the summer and school-year of 2012/2013. The design is similar to the one cleared in 2011, although the design and implementation approach have been refined to reflect the lessons learned from the 2011 evaluation activities: specifically, the survey instruments, the parent consent form, the implementation forms, the sampling plan and the non-response bias analysis plan, as well as some of the procedures to collect the data, have been revised in response to the 2011 evaluation experiences. The upcoming evaluation will describe the different approaches taken by national awardees to meet the SoI requirements. It will also explore whether change occurs in key outcomes between baseline and follow-up surveys for participants at the national awardees and NASA Center SoI sites, where NASA is investing the majority of the SoI project funds.
This clearance request pertains to data collection and related activities that will occur between February 1, 2012 and June 30, 2013 for the formative evaluation. Data under this clearance will be collected during the second year of SoI implementation from the 2011 cohort of SoI national awardees and NASA Centers and their 2012-2013 participants (parents, students, and teachers). This clearance request combines and revises the evaluation design activities previously approved under two OMB control numbers, 2700-0150 and 2700-0151. This request includes revised, draft versions of the following instruments:
Parental consent form and associated survey and question-source-outcomes crosswalk (Appendices 1, 2 and 3);
Student survey and question-source-outcomes crosswalk (Appendices 4 and 5; note that baseline and follow-up surveys are identical);
Classroom teacher paper and online baseline and first follow-up surveys and question-source-outcomes crosswalk (Appendices 6, 7, 8, 9, and 10);
Awardee summer and school-year implementation reporting forms for student activities/events and professional development activities (Appendices 11, 12, and 13);
Teacher and after-school instructor school-year quarterly log forms (Appendices 14 and 15);
Awardee PI interview consent script (Appendix 16); and
Awardee PI pre-summer, post-summer, and post-school-year interview protocols (Appendices 17, 18, and 19).
The data to be collected are not available elsewhere and can only be obtained through the national evaluation. The teacher and student instruments will be used to gather data prior to and after the summer activities in order to assess change in SoI’s key short-term outcomes. Information about implementation will be gathered from numerous sources, including the awardee implementation reporting forms and awardee PI interviews. These data will allow the national evaluator to describe the different approaches taken by the awardees, as well as to compare awardees’ original plans with what they actually implement.
The national evaluation is an important opportunity to collect information needed to continue refining the program’s design. Further, by focusing on the second year of implementation and using updated instruments and an improved sampling design, the FY2012 evaluation will help NASA continue to gain an understanding of how SoI may influence students and educators. The evaluation will explore whether NASA’s requirements, in this second year of implementation, are feasible and appropriate, and continue to generate lessons learned for future implementations of SoI and NASA’s education activities more broadly. The evaluation will also continue to collect evidence of which SoI practices and models may have promise (those that are correlated with desired outcomes and perceived by NASA as worthy of further investment), which will inform a more rigorous examination in a future summative study.
A.2 How the Information Will Be Collected, by Whom, and For What Purpose
How Information Will Be Collected
Data will be collected using several methods. Consent to participate and background information will be collected through parent consent forms that will be part of the registration process. Student and classroom teacher data will be collected through survey instruments, while implementation data for summer activities will be collected via awardee reporting forms and interviews with the awardees’ principal investigators. As the structure of the programs changes for the school-year activities, implementation data will be collected through school-year implementation forms and teacher implementation forms. Because of different programmatic expectations and varying levels of funding, as well as the plans for internal monitoring of the Center activities, all types of data will be collected from the national awardees, while only student survey data will be collected at NASA Center partnerships.
Parent Consent Forms
SoI awardees and NASA Centers will include a parent consent form, which contains a brief parent survey (Appendices 1 and 2), in the registration materials that parents must return to enroll students in the SoI activities. The parent consent forms are similar to those used in the 2011 data collection. The first page of the form provides information regarding the purpose of the data collection, describes what participation entails and how the national evaluators will protect students’ privacy. It also includes contact information for parents who have questions about the study and their students’ rights as participants. On the second page, parents are asked to indicate whether they provide permission for their student to participate in the student surveys that will be administered during the SoI sessions. Finally, the last page of the consent form is a brief survey that asks parents to report student and parent demographic characteristics as well as the reasons for enrolling their child in the program. A crosswalk describing how the items are used to address research questions is included in Appendix 3. The following revisions reflect 2012 improvements to the parent consent forms:
Changed dates to reflect the 2012-2013 school-year.
Simplified and clarified the language explaining the purpose of the study, details of participation, and privacy.
Revised parental education attainment response options to be consistent with response options used on the National Center for Education Statistics’ (NCES) High School Longitudinal Study (HSLS).
Added items about parents’ educational and occupational experiences in science-related fields.
Revised parent reasons for enrolling student response options to include themes identified in parents’ open-ended responses on the FY2011 form.
Added a question, adopted from the NCES HSLS survey, about parent expectations for student educational attainment.
Included item requesting emergency contact information.
The parent consent form will be available in two formats, paper and online. Offering multiple survey modes will ease the burden on the awardees who collect the information, allowing the awardees to offer parents the most convenient mode. Awardees and Centers that would like to offer the parent consent form online would provide registering parents with the survey URL and a site-specific PIN needed to gain access to the survey (see Appendix 20 for an example of the PIN and agreement to participate screens). PINs would be unique to each Awardee/Center and PIN authentication would be necessary for access to the parent consent form. A link to the online survey will also be available on the NASA Summer of Innovation website. The data collected via the online surveys would be maintained on the survey vendor’s secure server and then safely transferred to the national evaluator. Data from the online survey will be collected, stored, and transferred in accordance with NASA’s privacy and security requirements. Awardees that choose to administer the paper version will return the completed forms to the national evaluator for safe-keeping and data entry. Participation in SoI programs will not be conditional on providing consent. We anticipate that enrollment will begin shortly after SoI funding decisions are announced in February 2012.
Student Surveys
Approximately 3,190 students across the national awards and 3,010 students across the NASA Center partnership sites will be sampled and asked to complete SoI surveys (a draft version of the revised student survey for all students is included in Appendix 4). A single survey will be used for all grades this year. This decision is based on the evaluation experience of summer 2011, when the national evaluator learned that the awardees and NASA Centers do not typically know student grade levels prior to the start of the camps, as students can register almost up until the first day of camp and awardees report that a good number of students who enroll in the program do not actually show up. Further, last summer’s experience suggests that the older student version can be used to collect data from younger students. At a few locations, grades 4 and 5 students were administered the grades 6 through 9 version of the survey.1 A comparative analysis of the construct validity of the two scales that varied across the survey versions (i.e., science career interest and interest in NASA-related activities) was conducted between the group of 105 grades 4 and 5 students that took the grades 6 through 9 survey and the 738 students in grades 6 through 9 that took the same survey; results indicate that the individual constructs of each of the scales remained highly correlated regardless of the student’s grade level,2 suggesting that the scales included on the grades 6 through 9 survey may also be appropriate for students across all the program’s targeted grade levels.
In addition to combining the two student survey versions into one, the following revisions were made:
Dates changed to reflect the 2012-2013 school-year.
Removed the nine-question scale pertaining to NASA-related activities to address feedback from awardees that these questions were redundant.
Added four items based on identified gaps in the FY2011 data collection. The four items are:
Education expectations
Career aspirations
Participation in FY2011 SoI
Reason for enrolling in SoI
A crosswalk that describes how the survey items link to the research questions, their purpose, and their sources is included in Appendix 5.
Paper surveys will be administered to students at baseline (prior to the start of the SoI summer activities) and immediately at the end of the summer SoI activities; a second follow-up survey will be mailed to students in spring 2013. The third wave will not be administered to students who participated in summer activities at NASA Centers, where students are minimally engaged in follow-on activities that are likely insufficient to affect their interest in science. To document changes over time on the outcome measures, the follow-up surveys will contain the same items as the baseline instruments.
Teacher Surveys
Teacher surveys will be administered to the 1,200 classroom teachers participating at national awardee sites at baseline (prior to the start of their summer involvement in SoI; see Appendix 6 for the paper version and Appendix 7 for the online version), at the end of the summer SoI activities in July or August 2012, depending on when activities are completed by their awardee (Appendix 8 for the paper version and Appendix 9 for the online version), and again at the conclusion of their awardee’s school-year activities (spring 2013).3 These surveys were all revised to reflect the 2011-2012 experiences. Based on teacher feedback to the 2011 teacher surveys, the current draft was shortened to reduce redundancy. In addition, some questions were added: a question asking teachers whether they participated in SoI during 2011-2012, and two questions to better capture teachers’ prior experiences with science-related content, one asking about the major field of study during postsecondary education and the second asking for the content area in which teachers are certified. A crosswalk that describes how the survey items link to the research questions, their purpose, and their source is included in Appendix 10. To capture changes over time on teacher outcomes, the follow-up survey will contain the same items as the baseline instrument; the follow-up survey also contains a few additional items to gather feedback regarding the SoI professional development.
Awardees will have the option of administering paper or online surveys to teachers, as requested by the national awardees to reduce their burden. Should the awardee prefer paper, they will include a paper survey as part of teachers’ employment/ registration materials; teachers will be asked to return them to their coordinator prior to participating in the summer activities. Alternatively, awardees and Centers that would like to offer the teacher survey online would provide registering teachers with the survey URL and a site-specific PIN, needed to gain access to the survey (see Appendix 20 for an example of the PIN and agreement to participate screens). PIN numbers would be unique to each Awardee/Center and PIN authentication would be necessary for access to the teacher survey. A link to the online survey will also be available on the NASA Summer of Innovation website. The data collected via the online surveys would be maintained on the survey vendor’s secure server and then securely transferred to the national evaluator. Data from the online survey will be collected, stored, and transferred in accordance with NASA’s privacy and security requirements.
All of the follow-up surveys will be online and emailed to teachers immediately following the completion of the summer activities (August 2012) and the school-year activities (spring 2013); in the few circumstances where internet access is not available, the national evaluator will mail a paper survey to the awardee for administration. As NASA Center partnerships are not required to engage classroom teachers in their SoI activities, surveys will only be administered to teachers participating at the national awardee sites.
Awardee Implementation Reporting Forms
Links to electronic implementation forms will be sent to the PI or their evaluation coordinator at each awardee to collect information about the awardees’ professional development and student activities that are actually implemented. The PI will be responsible for ensuring that each lead instructor of a summer activity completes a student activities implementation form at the conclusion of each student summer camp (draft version included as Appendix 11), each student school-year event coordinated by the awardee (draft version included as Appendix 12), as well as at the end of each summer and school-year professional development session (draft version included as Appendix 13). Just as in 2011, these forms ask awardees to report the actual dates of implementation, the content used, the number of contact hours, the number of hours during which NASA content was used, the number of participants enrolled and attending, the reasons why participants did not complete the activity, and who led the activities. Taken together, the data collected through these forms will allow the evaluators to describe the different approaches taken by awardees to meet the NASA requirements. The forms will also provide context for the data on student and teacher outcomes, perhaps helping to identify promising practices.
The implementation forms were revised based on their use in summer 2011. The changes made to the implementation reporting forms are listed below:
Professional Development Implementation Form (same for summer and school-year)
Included a question about the format of the professional development event (e.g. webinar, face-to-face) as multiple formats were used in summer 2011. Questions specific to the webinar format were added.
Updated NASA units, lessons, and activities list to reflect what will be provided in 2012.
Updated the NASA Support Tool List to reflect what will be available to awardees in 2012.
Updated “types of participants” so that instructor is asked for proportions of each type of educator (e.g., classroom teachers, informal educators, university faculty, etc.) instead of an exact count to ease respondent burden.
Summer Student Activities Implementation Form
Revised to collect information at the camp level (not the classroom level) as not all camps were organized into classrooms, causing confusion in completing the forms.
Added an item to identify whether the structure was a stand-alone camp or was embedded within another camp or program.
If the camp is embedded within a larger program or camp, respondents are asked to identify type of program or camp into which SoI is embedded.
If the camp is embedded within a larger program or camp, ask respondents if the program or camp receives funding from 21CCLC, GEAR-UP, TRIO programs.
Added an item to identify whether the camp included field trips or other special events.
Updated NASA units, lessons, and activities list to reflect what will be available to awardees in 2012.
School-Year Student Event Implementation Form
This form is the same as the revised summer student activities implementation form, except that it asks about out-of-school programs (e.g., clubs, before/after school activities) rather than camps.
School-Year Quarterly Log Form
During the FY2011 school-year, national awardees were not always involved in the delivery of school-year SoI events and activities; instead, the teachers and instructors they trained continued to engage students in SoI activities. This structure necessitates collecting some of the school-year implementation data directly from teachers or after-school instructors rather than from the awardees’ coordinators. To accomplish this, the national evaluation will use the school-year teacher log form (included as Appendices 14 and 15), which is designed to be completed electronically each quarter. In FY2011, teachers were asked to complete monthly logs, but feedback from awardees indicated that this frequency was overly burdensome for teachers and that less frequent administration could still gather the necessary data. Less frequent data collection may increase teachers’ likelihood of responding, producing fewer but more reliable data points.
Teachers and instructors within the awardees that do not directly provide structured school-year activities will receive reminders to complete the form via e-mail. This form is a shorter version of the student summer and school-year implementation forms. These quarterly logs ask respondents when they used NASA resources and the number of hours of NASA content they provided. Collecting these data will allow NASA to learn how school-year activities are implemented when an awardee does not coordinate them. Two versions of the form will be used, one for teachers and one for after-school instructors, to ensure that the wording on the form is appropriate. The revisions are as follows:
Teacher School-Year Quarterly Log
Updated NASA units, lessons, and activities list to reflect what will be available to awardees in 2012.
Added an item to identify whether the structure was a stand-alone camp or was embedded within another camp or program.
If the camp is embedded within a larger program or camp, respondents are asked to identify type of program or camp into which SoI is embedded.
If the camp is embedded within a larger program or camp, ask respondents if the program or camp receives funding from 21CCLC, GEAR-UP, TRIO Programs.
Added an item to identify whether the activity included field trips or other special events.
After-School Instructors School-Year Quarterly Log
Same as the revised Teacher School-Year Quarterly Log, except that the questions are reworded to pertain to after-school program instructors instead of teachers.
Interviews
All eight principal investigators of the SoI national awardees will be asked to participate in one-on-one interviews at three points in time: before the start of summer activities (spring 2012), after the summer activities end (fall 2012), and again after the school-year activities are complete (summer 2013). See Appendix 16 for the consent script and Appendices 17, 18 and 19 for the protocols. No changes were made to the FY2011 consent script or interview protocols. In spring 2012, the interviews will discuss plans for the upcoming summer implementation either in person (should there be an SoI meeting) or over the telephone. In the fall, PIs will be asked about the summer activities, their capacity-building efforts, and their plans for the upcoming school-year. After all school-year activities are completed, the evaluation team will ask PIs to reflect on the successes and challenges of the previous year and to discuss any potential changes to SoI implementation in summer 2013. These interviews will allow the evaluators to collect qualitative descriptions of the SoI programs as planned and as implemented that will complement the quantitative implementation data collected through reporting forms.
Review of Draft Online Forms
The FY2011 versions of the teacher surveys and implementation forms were programmed and reviewed by Abt SRBI’s senior survey staff: Courtney Kennedy, PhD (VP, Advanced Methods Group) and Robb Magaw (Senior Project Director with over 20 years of experience in conducting survey research, including Web surveys). This review ensured that the instruments met the highest standards supported by the literature and best practices; Mr. Chintan Turakhia (Senior Vice President of the Social and Public Policy research group, with over 20 years of survey research experience) served as project consultant. The online forms for 2012 will again be reviewed to ensure that best practices are leveraged. In addition, Abt SRBI’s team will continue to enhance the aesthetics and usability of the forms to create a professional-looking format, ensure consistency in how items are presented, and improve the usability of navigation buttons. It is critical that respondents are able to navigate through the online form easily. Central to this goal is providing clear navigational buttons that stand out, are strategically located, and adhere to the U.S. Department of Health and Human Services Web design guidelines (2006).
At a minimum, the online version will be programmed to address the following:
Employ a Paging Design: The forms contain skip patterns. That is, some items should only be answered by a subset of the respondents, and eligibility for these items is conditional on information entered earlier in the form. Forms featuring skips are best administered with a paging design rather than a scrolling design because skip patterns can be executed automatically without respondents needing to determine which items they should answer (Couper 2008). Automated skips can improve data quality by reducing errors of omission and errors of commission (Redline and Dillman 2002). Automated skips have also been shown to reduce the length of the time required to complete the form, particularly for respondents who are not eligible for certain items (Peytchev et al. 2006).
Eliminate Requests for Information That Can Be Captured Passively: The instruments were reviewed to remove any items that can be collected passively, eliminating the need for the respondent to report this information. Best practices of online data collection involve using some kind of authentication process to link each respondent to their form and no other (Couper 2008). The research literature supports using a “semiautomatic authentication” approach, which has the respondent enter a personal PIN when accessing the online form (Heerwegh and Loosveldt 2003). For example, under this approach it is not necessary for respondents to enter their username because it is embedded in the URL. Each PIN should be associated with a unique SoI Awardee, so asking for both the PIN and the Awardee name when completing the form would be redundant. Similarly, the date on which the form is completed can be captured automatically by the web survey software (a minimal illustration of this approach appears after this list).
Present Definitions of Key Terms Effectively: The Word document version presents five key definitions on the introductory page of the Planning Form, which research literature indicates is not an effective approach for presenting definitions. For example, eye-tracking research demonstrates that respondents are reluctant to invest effort in reading text that is not on the “critical path” to completing the form (Galesic et al. 2008). In other words, the likelihood that respondents completing the form will read and process definitions presented on an introductory page is quite low. Rather than presenting this information on the first page, the programmed version included hyperlinks in the wording of the relevant items. For example, a hyperlink for the phrase “key partner” that if clicked led the respondent to a new page containing the definition. This way the definition is presented in the item itself and is available exactly when the respondent may need to reference it. That said, hyperlinks are not a perfect solution because research indicates that some respondents are unwilling to expend the effort required to click on the link and read the definition (Conrad et al. 2006). Ideally, the definition would be integrated into the wording of the item itself or the item would be re-written so that the definition is less necessary (Couper 2008).
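To make the passive-capture and semiautomatic authentication points above concrete, the minimal sketch below shows one way an awardee identifier embedded in the survey URL could be paired with a respondent-entered PIN while the completion date is captured automatically. It is illustrative only: the parameter name, PIN value, and lookup table are assumptions for this example and do not represent the survey vendor’s actual implementation.

```python
# Minimal illustration of "semiautomatic authentication": the awardee is
# identified from the URL the respondent was given, the respondent supplies
# only a PIN, and the completion date is captured passively. All identifiers
# and values below are hypothetical.
from datetime import date
from urllib.parse import parse_qs, urlparse

VALID_PINS = {"awardee_07": "4821953"}  # hypothetical awardee-to-PIN lookup


def authenticate(survey_url: str, entered_pin: str):
    """Return (awardee_id, completion_date) if the PIN matches, else None."""
    query = parse_qs(urlparse(survey_url).query)
    awardee_id = query.get("awardee", [None])[0]  # embedded in the URL itself
    if awardee_id and VALID_PINS.get(awardee_id) == entered_pin:
        return awardee_id, date.today()  # completion date captured automatically
    return None


# Hypothetical usage:
# authenticate("https://example.org/soi-survey?awardee=awardee_07", "4821953")
```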
Who Will Collect the Information
As part of the solicitation, national awardees have been notified that they will be required to identify an on-site evaluation coordinator who will assist in the evaluation’s data collection. These coordinators will be responsible for administering the parent consent forms and the student and classroom teacher surveys, as well as for collecting the implementation forms. Likewise, the education leads at the NASA Centers who manage the SoI collaborations will be responsible for the collection of parent consent and student survey data (note: no teacher surveys or implementation forms will be administered at the NASA Center sites). The awardee PIs and Center leads will be accountable to NASA for ensuring that all data collection occurs in a timely manner.
During the kick-off meeting, the national evaluator will present the purpose of the evaluation to the principal investigators (PIs) and NASA Center leads and outline their responsibilities. Right before fielding, mandatory webinar trainings will be provided to coordinators, NASA Center education leads, and PIs in order to prepare them to administer the surveys and to ensure that the data are collected consistently across sites. Additional evaluation guidance will be provided in FY2012 to awardees and Centers in the form of a comprehensive guide to the evaluation activities, available online and in hardcopy. Finally, interviews with the awardees’ principal investigators will be conducted by the national evaluation team.
The purpose of this data collection effort is to support the national evaluation of the SoI project. The goal of the national evaluation is formative, that is, to gather data that will inform NASA’s continued development of the program as well as to assess whether evidence supports the progression to a more rigorous, summative, impact evaluation. As such, the evaluation will focus on describing SoI’s implementation and associated outcomes, but will not determine whether there is a causal link between the program and outcomes. Note, a group of evaluation experts will be convened in 2012 to design a more rigorous evaluation with a target implementation timeframe of 2014. The formative work will develop a description of the awardee models and possible linkages to desired outcomes, enable NASA to assess the fidelity of implementation, and generate lessons learned to improve future SoI activities.
Exhibit 1 below outlines the research questions for the SoI national evaluation, data sources, and constructs.
Exhibit 1: National Evaluation Research Questions

| Research Questions | SoI Tier of Interest | Data Sources | Constructs |
|---|---|---|---|
| 1. Who participates in SoI FY2011? | National awardees and NASA Centers | Parent consent forms | Participant demographic information |
| 2. Does student interest in science change significantly between the baseline and follow-up surveys? If so, are these changes larger at some awardees/NASA Centers than others? | National awardees and NASA Centers | Student surveys | Overall interest in science, career interest in science, leisure interest in science |
| 3. Does comfort in teaching NASA topics and access/use of NASA resources change between baseline and follow-up surveys? If so, are these changes larger at some awardees than others? | National awardees only | Teacher surveys | Access and use of NASA content and resources; comfort in teaching NASA topics |
| 4. How do awardees plan and implement their summer and school-year activities? What are the similarities and differences across the approaches? Are there any apparent relationships between the approaches and desired outcomes? | National awardees only | Implementation forms; interviews | Program activities, duration, content, delivery methods, participants |
| 5. What supports and challenges do awardees face in implementing their SoI programs? How do they negotiate these challenges? | National awardees only | Interviews | Implementation challenges and successes |
| 6. How are awardees preparing to operate independently of SoI funding? | National awardees only | Interviews | Sustainability planning |
The first research question will be answered using the parent consent form. The student surveys will address the second research question and allow the national evaluators to explore changes in student interest in STEM (including overall interest, career interest, and leisure interest) associated with participation in SoI. In the student survey, NASA focuses specifically on students’ interest in science and not other disciplines: while NASA is certain science will be addressed by all programs, technology, engineering, and mathematics may not be a focus of the summer programs across all awardees/Centers. The teacher surveys, which inquire about teacher access to and use of NASA resources, as well as comfort in teaching NASA content (all of which addresses science-related topics), will inform the third research question.
As mentioned earlier, while measuring outcomes at multiple points in time can provide evidence of whether the outcomes of interest change, it will not allow us to rule out the possibility that something other than the program is affecting this change. However, it will support investigation into associations between implementation and outcomes of interest to inform future program strategy, as well as inform the future decision about whether a more rigorous impact evaluation should be undertaken.
Implementation data collected through reporting forms and interviews will answer research questions four, five, and six. Not only will the data be vital in understanding the context in which any change in key outcomes is identified, but they will also serve as a resource to support additional research on STEM learning as it relates to informal and K-12 education by academic researchers and others interested in STEM engagement.
A.3 Use of Improved Information Technology to Reduce Burden
The data collection plan reflects sensitivity to issues of efficiency, accuracy, and respondent burden. The national evaluator will provide training and support to all awardees/Centers to assist in obtaining systematic and consistent data. Surveys were designed to require minimal effort, including only questions not available elsewhere. In particular, the student surveys were designed to be easy to read, with straightforward questions and minimal skip-patterns. Student survey data will be collected on paper distributed by the national evaluation coordinator at each awardee/Center during the summer and by mail in spring 2013. Parent consent forms as well as teacher surveys will be available online and on paper. These dual modes of survey data collection will allow awardees and the Center partners to choose the approach that is most convenient for their participants. Awardee implementation forms and quarterly logs will be administered online. The national evaluator’s electronic mail address and toll-free telephone number will be included on the first page of the teacher survey for participants who have questions. Taken together, these procedures are all designed to minimize the burden on respondents.
A.4 Efforts to Identify and Avoid Duplication
This effort will yield data to assess SoI implementation and measures of participant outcomes; as such, there is no similar evaluation being conducted and there is no alternative source for collecting the information. NASA has identified technical representatives who will be responsible for coordinating the requests for information from the SoI project team and contractors to ensure that duplicative questions are not asked.
A.5 Efforts to Minimize Burden on Small Business or Other Entities
No small businesses will be involved as respondents. The primary survey entities for the data collection efforts described in this package are parents, students, teachers, and awardees. Burden is minimized for all respondents by requesting only the minimum information required to meet study objectives. All primary data collection will be coordinated by the site administrators in partnership with the national evaluator, so as to reduce the burden on the SoI awardees and NASA Centers.
A.6 Consequences of Less-Frequent Data Collection
If the proposed parent survey data were not collected, NASA would not fulfill its compliance need to ascertain the demographic characteristics of the SoI participants. If the proposed student and teacher survey data were not collected, NASA would not fulfill its objectives in investigating student and teacher outcomes that may be associated with participation in SoI. Without the implementation data, NASA would not understand how the program models were intended to work or were actually implemented. In addition, NASA would not know what would be required to replicate the models, should they be associated with promising outcomes. Thus, by not collecting survey and implementation data, federal resources would be allocated and program decisions would be made in the absence of information about the actual activities provided by the SoI awardees and lessons learned.
A.7 Special Circumstances Requiring Collection of Information in a Manner Inconsistent with Section 1320.5(d)(2) of the Code of Federal Regulations
There are no special circumstances associated with this data collection.
A.8 Federal Register Comments and Persons Consulted Outside the Agency
In accordance with the Paperwork Reduction Act of 1995, NASA published a notice in the Federal Register announcing the agency’s intention to request an OMB review of data collection activities. The notice was published on January 10, 2011, in volume 76, number 6, page 1461, and provided a 30-day period for public comments. To date, no comments have been received.
The parent, student and teacher survey instruments were developed by the national evaluators, Abt Associates, Inc. and staff from the Education Development Center (EDC), comprising: Ricky Takai, Principal Investigator; Hilary Rhodes, Project Director; Alina Martinez, Principal Associate; Kristen Neishi, Deputy Project Director; Melissa Velez, Survey Analysis Task Manager; and Tamara Linkow, Task Leader, all of Abt Associates, Inc., and Jacqueline DeLisi, Abigail Levy, and Yueming Jia at EDC. Laura LoGerfo, the Project Officer for the High School Longitudinal Study of 2009 at the U.S. Department of Education National Center for Education Statistics, provided feedback on the parent, student, and teacher survey instruments. These surveys are based on the theory of change depicted in the SoI logic model and informed by the evaluators’ knowledge of the program. Items were selected from previously validated instruments where feasible. Feedback on the instruments was solicited from staff at NASA’s Office of Education. Surveys were then pilot tested, first during the SoI pilot in 2010 and then again in 2011, to ensure items were unambiguous and had face validity, that is, to learn whether they measure outcomes as intended.
The 2011 student survey uses three well-established measures of attitude and interest: an attitude measure adapted from the School and Social Experiences Questionnaire4 and two interest measures from the Test of Science Related Attitudes.5 The Abt-EDC team revised the student surveys based on the lessons learned from the FY2011 administration to alleviate respondent burden, clarify items, and ensure the inclusion of items that measure NASA’s outcomes of interest for SoI in FY2012. Based on feedback from the FY2011 survey administration, the following additional changes were made:
Dates changed to reflect the 2012-2013 school-year.
Removed the nine-question scale pertaining to NASA-related activities to address feedback from awardees that the survey was repetitive.
Added four items based on identified gaps in the FY2011 data collection. The four items are:
Education expectations
Career aspirations
Participation in FY2011 SoI (to be used in the analysis as a control variable)
Reason for enrolling in SoI
The biggest change proposed for 2012 is to use only one student survey version. Anecdotal evidence from the 2010 pilot indicated that younger middle school students had difficulty with certain items on the student survey, which led to the creation of two student survey versions in FY2011: one for younger middle school students (grades 4 and 5) and one for older middle school students (grades 6 through 9). However, analysis of the FY2011 data demonstrates that the grades 6 through 9 version provides accurate measures of the survey constructs. At a few sites, grades 4 and 5 students were administered the grades 6 through 9 version of the survey by the awardee. A comparative analysis of the construct validity of the two scales that varied across the two versions of the survey (i.e., career interest and interest in NASA-related activities), conducted between the group of 105 grades 4 and 5 students that took the grades 6 through 9 survey and the 738 grades 6 through 9 students, shows that the individual constructs of each of the scales remained highly correlated regardless of the student’s grade level. The Cronbach’s alpha for the career interest scale was .90 for the 105 grades 4 and 5 students on both the baseline and follow-up surveys, which is only slightly higher than the grades 6 through 9 alphas of .88 and .89 on the baseline and follow-up surveys, respectively. The alphas for the NASA-related activities scale were also consistent across the grade-level groupings (alphas for grades 4 and 5 students who took the grades 6 through 9 survey were .87 and .88 on the baseline and follow-up surveys, respectively, while the alpha for grades 6 through 9 students was .87 on both the baseline and follow-up surveys). This suggests that the scales included on the grades 6 through 9 survey can be used for students across all targeted grade levels.
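The reliability comparison described above relies on Cronbach’s alpha, a standard internal-consistency statistic. The sketch below shows how such a coefficient can be computed from item-level responses; it is illustrative only, and the array layout and variable names are assumptions rather than the national evaluator’s actual analysis code.

```python
# Illustrative computation of Cronbach's alpha for one survey scale.
import numpy as np


def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array with one row per student and one column per scale item."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)


# Hypothetical usage: compare reliability across the two grade groupings, where
# grades_4_5 (105 rows) and grades_6_9 (738 rows) hold the career interest
# item responses from the baseline survey.
# print(cronbach_alpha(grades_4_5), cronbach_alpha(grades_6_9))
```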
The Abt-EDC team then tested the FY2011 student surveys with seven students (four in grades 3 through 5 and three in grades 6 through 9) to estimate completion time and the understandability of the text. It took students between 4 and 15 minutes, averaging 7.9 minutes, to complete the survey.
Data from the FY2011 administration demonstrate that the three outcome constructs of interest (attitude toward science, career interest in science, and leisure interest in science) were appropriately grouped together. The Cronbach’s alpha, a statistical measure of internal consistency, for each construct was consistently high across both the baseline and follow-up surveys, ranging from 0.85 to 0.90. Given the minor changes to the FY2012 student survey and the fact that all changes were made in response to feedback from the FY2011 administration, no additional testing was completed.
The FY2012 teacher survey remains very similar to the FY2011 surveys, with a few minor exceptions. Based on teacher feedback on the FY2011 teacher surveys, the current draft was shortened to minimize repetition and the NASA content areas were updated to better reflect classroom content areas.
The Abt-EDC team tested the FY2011 surveys with six former and current secondary STEM teachers to estimate completion time and the understandability of the text. It took between 5 and 20 minutes, averaging 8 minutes, to complete. Based on the pilot, we learned that all teachers interpreted “how comfortable you are teaching” as a combination of their understanding of the material and their ability to use it in their classrooms. Slight modifications were made to the teacher survey based on the pilot test, specifically to clarify what is meant by “NASA resources.” Given the similarities between the FY2011 and FY2012 surveys, no further testing was done.
A.9 Payments to Respondents
There will be no payments to respondents.
A.10 Assurance of Confidentiality
Every effort will be made to maintain the privacy of respondents to the extent provided by law, including the use of several procedural and control measures to protect the data from unauthorized use. Collected data will not be released with individual identifiers, and results will be presented only in aggregated form. A statement to this effect will be included on the first page of the parent consent form and on each teacher survey, will be read to students before administering the survey, and will be read to awardees prior to conducting interviews (see Appendix 4 for the consent statement read to students prior to survey administration and Appendix 16 for the interview consent script). Respondents will be assured that all information identifying them will be kept private.
The procedures to protect data during information collection, data processing, and analysis activities are as follows:
All respondents included in the study sample will be informed that the information they provide will be used only for the purpose of this research. Individuals will not be cited as sources of information in prepared reports.
Hard-copy data collection forms will be delivered to a locked area at the contractor’s office for receipt and processing. The contractor will maintain restricted access to all data preparation areas (i.e., receipt, coding, and data entry). All data files on multi-user systems will be under the control of a database manager, with access limited to project staff on a “need-to-know” basis only.
Respondents accessing the online surveys will go through the vendor’s website, where they are protected by the vendor’s strict data security system. Only those given the seven-digit personal identification numbers (PINs) can enter the survey. Upon entering the PIN, the respondent moves to a non-public directory inaccessible through the Internet. When data are entered, they interface with a script located on a second non-public directory accessible only to the vendor’s system administrator. Sample screen shots of the introductory screens that respondents will see are included in Appendix 20. The final introductory screens of the online surveys will comply with NASA’s privacy and security requirements.
The survey vendor takes every precaution to ensure that data collected on the Internet remain both secure and confidential. All vendor data collection servers are housed in a facility that has redundant power, expandable bandwidth, and a high level of physical security. All data collection and data storage servers are built on a Storage Area Network (SAN). Database servers are mirrored with an active/passive configuration; passive servers become active 30 seconds after a hardware failure. The vendor maintains enough excess hardware capacity to withstand a hardware failure on any single device. The facility is monitored 24 hours a day, 7 days a week by on-site professional security guards and over continuous closed-circuit video surveillance from a command center via both stationary and 360° cameras located outside and inside the facility. The vendor’s security measures comply with NASA's privacy and security requirements.
Individual identifying information will be maintained separately from completed data collection forms and from computerized data files used for analysis.
The national evaluation team had the data collection protocols and surveys reviewed by Abt’s Institutional Review Board (IRB); Abt’s IRB has approved the data collection instruments, including the parent consent form, the student and teacher surveys, the interview protocols, the consent script, and the implementation forms. The IRB will ensure that the data collection protocols and procedures, including consent forms, abide by strict privacy procedures.
A.11 Questions of a Sensitive Nature
The questions included on the data collection instruments for this study do not involve sensitive topics, and respondents may skip items if they wish.
A.12 Estimates of Respondent Burden
Exhibit 2 presents estimates of the reporting burden for the parent consent form, the student surveys, the teacher surveys, and the implementation reporting. We estimate that the annualized response burden for the entire evaluation is 1,595 hours for students at awardees; 1,003.3 hours for students at NASA Centers; 1,400 hours for teachers (600 hours for the baseline and two follow-up surveys and 800 hours associated with the school-year quarterly implementation log forms); 2,916.7 hours for parents, related to the parent consent form; and 118.6 hours for awardees, the sum of the burden for the student summer and school-year implementation forms (61.3 hours), the summer and school-year professional development implementation forms (33.3 hours), and three interviews during the year (24 hours). The total burden associated with this evaluation is 7,034 hours.6
This estimate assumes that it will take about five minutes for parents to read the consent script and answer the demographic questions. As the form will be included in registration materials, we assume that all parents registering students will also review and return the documents. Estimates for the burden are based on time requirements from similar surveys conducted on comparable evaluations.
This estimate assumes that it will take both students and teachers about 10 minutes to read the survey’s introduction and answer the questions. Estimates for the student burden are based on time requirements from the 2011 administration of similar surveys. Estimates for the teacher burden are based on the time required during the 2011 data collection.
Burden associated with the collection of implementation data is estimated as follows. Given the time it took to complete implementation forms in 2011, we expect the summer implementation forms to require 10 minutes per camp. Assuming that each awardee has 31 camps during the summer, the maximum number that occurred in FY2011, and that the implementation form takes 10 minutes, the total summer camp reporting burden is 41.3 hours. Assuming that approximately 15 professional development sessions are held at each site during the summer (based on the maximum observed in 2011), and that these forms take 10 minutes, the professional development implementation reporting in the summer will require 20 hours.
Implementation forms for the school-year student activities will be collected from PIs/evaluation coordinators as well as from teachers and after-school instructors. We assume, based on schedules for the 2011-2012 school-year activities, that awardees will coordinate 15 of their school-year activities and about 10 professional development sessions, so we anticipate that they will complete 25 school-year implementation reporting forms. As the forms are essentially the same as the summer forms, each should take no more than 10 minutes, for a burden of 20 hours for student reporting and 13.3 hours for professional development reporting across all sites. All awardees may ask the classroom educators who are trained in the summer to lead SoI activities independently. To collect their implementation data, we will send each trained educator (1,200 in total) quarterly implementation log forms, which, as a shortened version of the implementation reporting form, should take no more than 10 minutes each (total burden of 800 hours).
Qualitative implementation data will also be collected through interviews. Principal investigators will be asked to participate in one-hour interviews scheduled at three points in time: in the spring before summer 2012 implementation begins, in the fall after summer 2012 implementation ends, and in spring 2013 after the school-year activities end. Assuming that all attend, total burden of these interviews is 24 hours.
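As a check on the figures reported above and in Exhibit 2, the burden-hour totals can be reproduced directly from the stated assumptions (respondent counts, responses per respondent, and minutes per response). The sketch below is illustrative only; all counts are taken from this section and from Exhibit 2.

```python
# Reproduces the annualized burden-hour estimates in Section A.12/Exhibit 2:
# respondents x responses per respondent x minutes per response / 60.
collections = {
    "Parent consent forms":                (35_000, 1, 5),
    "Awardee student surveys":             (3_190, 3, 10),
    "NASA Center student surveys":         (3_010, 2, 10),
    "Awardee teacher surveys":             (1_200, 3, 10),
    "Teacher quarterly log forms":         (1_200, 4, 10),
    "Summer student implementation forms": (248, 1, 10),
    "School-year student implementation":  (120, 1, 10),
    "Summer PD implementation forms":      (120, 1, 10),
    "School-year PD implementation forms": (80, 1, 10),
    "Awardee PI interviews":               (8, 3, 60),
}

total_hours = 0.0
for name, (respondents, responses, minutes) in collections.items():
    hours = respondents * responses * minutes / 60
    total_hours += hours
    print(f"{name:38s} {hours:8.1f} hours")
print(f"{'Total':38s} {total_hours:8.0f} hours")  # approximately 7,034 hours
```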
A.13 Estimates of the Cost Burden to Respondents
We estimate that the annualized cost burden is $11,563.75 for students at awardees (the baseline and two follow-up surveys), $7,274.17 for students at NASA Centers, $28,294 for teachers ($12,126 for the baseline and two follow-up surveys and $16,168 for the school-year log forms), $69,796.83 for parents, and $3,231.13 for awardees, which includes the cost burden associated with the student summer implementation forms ($989.11), the student school-year implementation forms ($478.60), the summer ($478.60) and school-year ($319.07) professional development implementation forms, and the spring, fall, and end-of-year interviews ($965.76). Please see Exhibit 2.
The cost burden associated with the surveys is estimated as follows: for students, we used the federal minimum wage; for teachers, we used the median income of middle school teachers (as of April 15, 2011); and for parents, we used the 2009 national median income.
For the cost burden associated with the collection of implementation data, we assumed that:
Awardee evaluation coordinators will complete the planning and implementation forms;
Teachers and after-school instructors will complete teacher school-year implementation log forms; and
Awardee principal investigators will participate in the interviews.
The cost-per-hour burden for completing the implementation and school-year planning forms, as well as for participating in the interviews, is calculated using the 2009 national median income. The cost-per-hour burden for PIs’ participation in the interviews is based on the assumption that this year’s PIs will be similar to those who participated in the pilot, several of whom were associate professors at baccalaureate institutions. As such, we calculated the burden based on the national average annual salary of associate professors. There are no annualized capital/startup or ongoing operation and maintenance costs associated with collecting the data. Other than their time to complete the surveys and forms, as well as the time to participate in interviews, which is estimated in Exhibit 2, there are no direct monetary costs to respondents.
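The hourly rates underlying these cost estimates follow from the cited annual incomes, assuming 40 hours per week for 52 weeks (2,080 paid hours per year), as stated in the notes to Exhibit 2. The short sketch below reproduces the rates and a few of the resulting cost figures; it is illustrative and uses only the salary figures cited in this section and in the exhibit notes.

```python
# Hourly rates derived from annual income (52 weeks x 40 hours = 2,080 hours).
HOURS_PER_YEAR = 52 * 40

parent_rate = round(49_777 / HOURS_PER_YEAR, 2)   # 2009 national median income -> $23.93
teacher_rate = round(42_033 / HOURS_PER_YEAR, 2)  # middle school teacher median -> $20.21
student_rate = 7.25                               # federal minimum wage

# Cost burden = burden hours x hourly rate, for example:
print(600 * teacher_rate)    # 12126.0   teacher baseline and two follow-up surveys
print(800 * teacher_rate)    # 16168.0   teacher school-year quarterly log forms
print(1_595 * student_rate)  # 11563.75  awardee student surveys
```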
A.14 Estimates of Annualized Government Costs
The estimated cost to the Federal Government for the data collection activities is $411,000 (based on the costs associated with data collection in 2011). The awardees’ evaluation coordinators will collect the survey and planning/implementation data.
This data collection builds on the data collected during the national evaluation of SoI in 2011. It constitutes the second year of data collection related to the activities of the FY2011 national awardee cohort and the NASA Center SoI FY2012 programs.
Exhibit 2. Estimates of Annualized Burden Hours and Cost for Data Collection

Data Collection Sources | Number of Respondents | Frequency of Response | Total Minutes per Respondent | Total Response Burden in Hours | Estimated Cost Per Hour | Total Cost Burden
Awardee & Center Parent Surveys | 35,000 | 1 | 5 | 2,916.7 | $23.93 a | $69,795.83
Awardee Student Surveys | 3,190 b | 3 | 30 | 1,595.0 | $7.25 c | $11,563.75
NASA Center Student Surveys | 3,010 | 2 | 20 | 1,003.3 | $7.25 | $7,274.17
Awardee Teacher Surveys | 1,200 | 3 | 30 | 600.0 | $20.21 d | $12,126.00
Student Summer Implementation Form | 248 e | 1 | 10 | 41.3 | $23.93 | $989.11
Student School-Year Implementation Form | 120 f | 1 | 10 | 20.0 | $23.93 | $478.60
Summer PD Implementation Form | 120 g | 1 | 10 | 20.0 | $23.93 | $478.60
School-Year PD Implementation Form | 80 h | 1 | 10 | 13.3 | $23.93 | $319.07
School-Year Teacher Quarterly Log Form | 1,200 i | 4 | 40 | 800.0 | $20.21 | $16,168.00
Awardee PI Interviews | 8 | 3 | 180 | 24.0 | $40.24 j | $965.76
Total Burden for Evaluation | 44,176 | | | 7,034 | | $120,158.89
Notes:
a Estimated cost per hour for parents is based on the 2009 national median income of $49,777 (approximately $23.93 per hour, assuming a 40-hour work week), according to the Current Population Survey (http://www.census.gov/prod/2010pubs/p60-238.pdf, retrieved March 9, 2011).
b Number of respondents is based on the estimated sample size, according to the power calculations discussed in Supporting Statement B.
c Estimated cost per hour for students is based on the federal minimum wage of $7.25 per hour, effective July 24, 2009.
d Estimated cost per hour for teachers is based on the median income of middle school teachers of $42,033 (as of April 15, 2011), or $20.21 per hour (http://www.payscale.com/research/US/All_K-12_Teachers/Salary).
e In FY2011, the maximum number of camps implemented by a single awardee was 31. We assume this value for every awardee in FY2012, recognizing that it may exceed what is actually implemented, to provide a conservative estimate of burden. We also assume that the camp coordinator will complete the reporting forms.
f In FY2011, the greatest number of planned school-year student activities for a single awardee was 12; accordingly, we assume 15 for every awardee to provide a conservative estimate of burden. We also assume that the class instructor will complete the reporting forms.
g In FY2011, the greatest number of summer PD sessions for a single awardee was 15; accordingly, we assume this value for every awardee, recognizing that it may exceed what is actually implemented, to provide a conservative estimate of burden. We also assume that the class instructor will complete the reporting forms.
h In FY2011, a total of 75 PD sessions were planned during the school year; to provide a conservative estimate of burden, we assume there will be 80 sessions in FY2012. We also assume that the class instructor will complete the reporting forms.
i The calculation assumes that all teachers trained in the summer will complete the quarterly teacher school-year implementation log forms.
j Estimated cost per hour for PIs is based on the assumption that, as last year, PIs will likely be associate professors at baccalaureate institutions, whose national average annual salary for 2009-2010 was $83,700 (approximately $40.24 per hour, assuming a 40-hour work week), as calculated from American Association of University Professors survey results (http://chronicle.com/article/Searchable-Database-AAUP/64231/, retrieved March 4, 2011).
A.16 Time Schedule, Publication, and Analysis Plan
The schedule shown in Exhibit 3 displays the sequence of activities required to conduct the information collection and includes key dates for data collection, analysis, and reporting. Two evaluation reports based on findings from the surveys and implementation data will be prepared: one following the completion of the summer activities (fall 2012) and one after the completion of the school-year activities (summer 2013).
Exhibit 3. SoI Schedule

Activities and Deliverables | Responsible Party | Date
Parent consent form & associated short survey collection | National evaluator & site administrators | February – June 2012
Student survey data collection | National evaluator & site administrators | May – August 2012; June 2013
SoI planning interviews | National evaluator | Spring 2012
Teacher survey data collection | National evaluator & site administrators | May – September 2012; June 2013
SoI implementation forms submission | Awardees | June – August 2012
Data analysis of baseline/follow-up student and teacher surveys, implementation data | National evaluator | Fall/Winter 2012
SoI “Lessons Learned” meeting and implementation interviews | NASA & national evaluator | Fall 2012
Expert panel review meeting | NASA & national evaluator | Summer 2012
National Report #1 | National evaluator | Fall 2012
Post school-year PI interviews | National evaluator | Spring 2013
Data analysis of post-school-year student and teacher survey, implementation data | National evaluator | Summer 2013
National Report #2 | National evaluator | Summer 2013
The national evaluator will analyze the survey and implementation data to assess changes in student and teacher outcomes over time, how awardees implemented their activities, and how the two might be related. Survey data will be analyzed separately for NASA Centers and awardees; implementation data will be collected only from national awardees.
Analysis of Survey Data
The analysis plan for the survey data is summarized below and discussed in fuller detail in Supporting Statement B.
Descriptive Cross-Sectional Analyses
The evaluation team will calculate representative, cross-sectional proportions and averages of student outcomes at the student level, both across all awardees/Centers and at the awardee/Center level, adjusting them for the sampling design by applying the calculation algorithm described in Supporting Statement B. Using the overall awardee/Center weight will allow for statements such as “the percent of students that ...,” where the percent refers to all SoI students in the country, not just the students that happen to be in the sample. Using the same calculation algorithm, but adjusting the weight to reflect all students at a particular awardee/Center, will allow for the calculation of statistics that are representative of all students at that awardee/Center (i.e., “the percent of students at Awardee A that ...”).
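The weighting algorithm itself is specified in Supporting Statement B; as a generic sketch only, a design-weighted proportion for a yes/no outcome takes the form

\[
\hat{p} = \frac{\sum_{i \in s} w_i y_i}{\sum_{i \in s} w_i},
\]

where \( y_i \) indicates whether sampled student \( i \) reports the outcome, \( w_i \) is the sampling weight (the overall weight for national estimates, or the awardee/Center-specific weight for site-level estimates), and \( s \) is the set of responding students.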
Because the full universe of teachers will be surveyed, the descriptive statistics for a single point in time do not need to be adjusted for sampling design. Means and standard deviations will be used to describe central tendency and variation for survey items using continuous scales. Frequency distributions and percentages will be used to summarize answers given on ordinal scales. Descriptive analyses about all awardees will be conducted on all teacher respondents, while descriptive analyses about teachers within a particular awardee will be restricted to respondents from that awardee.
Descriptive Change Over Time Analyses
The evaluation team will examine the student and teacher survey data to provide simple descriptions of change in a variable over time. For the student surveys, we will test whether the difference in proportions and means between two time points is zero using z-tests and t-tests that account for the overlap between samples. That is, the standard errors, which determine the precision of the estimates, will be computed recognizing that the samples are not independent (the same students take the surveys at different points in time), so the variance estimate must include the covariance between time points. For the teacher surveys, we will test whether the difference in proportions and means between two time points is zero using a McNemar test or a paired t-test, depending on the distribution of the outcome variables. Both the student and teacher statistical tests are distinct from a model in which the relationship between one or more predictor variables and the change in the outcome variable over time is assessed.
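As a standard illustration (not the exact estimator, which will reflect the final sampling design), the variance of the estimated change for overlapping samples is

\[
\operatorname{Var}(\bar{x}_2 - \bar{x}_1) = \operatorname{Var}(\bar{x}_1) + \operatorname{Var}(\bar{x}_2) - 2\operatorname{Cov}(\bar{x}_1, \bar{x}_2),
\]

and the test statistic is \( t = (\bar{x}_2 - \bar{x}_1)/\sqrt{\operatorname{Var}(\bar{x}_2 - \bar{x}_1)} \). Omitting the covariance term would overstate the standard error when responses are positively correlated across time points.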
Analysis of Implementation Data
Analysis of the implementation forms will be descriptive, using counts, ranges, frequencies, means, and standard deviations. Notes from the interviews will be content analyzed: they will be coded using NVivo, a qualitative analysis software program that facilitates the tagging and retrieval of data associated with selected themes. The implementation data will allow us to explore how summer activities were implemented and how strategies were similar or different between awardees. Further, implementation data will be used to explore associations with survey outcomes and to generate hypotheses.
A.17 Display of Expiration Date for OMB Approval
NASA is not requesting a waiver for the display of the OMB approval number and expiration date on the data collection instruments.
A.18 Exceptions to Certification Statement
This submission does not require an exception to the Certificate for Paperwork Reduction Act (5 CFR 1320.9).
References
Conrad, F. G., Couper, M. P., Tourangeau, R., & Peytchev, A. (2006). Use and non-use of clarification features in web surveys. Journal of Official Statistics, 22, 245-269.
Couper, M. P. (2008). Designing effective web surveys. Cambridge, England: Cambridge University Press.
Fraser, B. J. (1981). TOSRA: Test of science related attitudes handbook. Hawthorn, Victoria, Australia: Australian Council for Educational Research.
Galesic, M., Tourangeau, R., Couper, M. P., & Conrad, F. (2008). Eye-tracking data: New insights on response order effects and other cognitive shortcuts in survey responding. Public Opinion Quarterly, 72, 892-913.
Heerwegh, D., & Loosveldt, G. (2003). An evaluation of the semiautomatic login procedure to control web survey access. Social Science Computer Review, 21, 223-234.
Peytchev, A., Couper, M. P., McCabe, S., & Crawford, S. (2006). Web survey design: Paging versus scrolling. Public Opinion Quarterly, 70, 596-607.
Redline, C., & Dillman, D. (2002). The influence of alternative visual designs on respondents’ performance with branching questions in self-administered questionnaires. In R. M. Groves, D. A. Dillman, J. A. Eltinge, & R. J. A. Little (Eds.), Survey Nonresponse (pp. 179-193). New York: Wiley.
Singh, K., Chang, M., & Dika, S. (2006). Affective and motivational factors in engagement and achievement in science. International Journal of Learning, 12(6), 1447-9540.
U.S. Department of Health and Human Services (HHS). (2006). Research-based web design & usability guidelines. Washington, DC: Government Printing Office.
1 In most cases, awardees and Centers were not able to identify the grade level of students attending the SoI programs prior to their start; to ensure surveys were available at the start of activities, the national evaluator distributed the older survey version, which contained all survey items.
2 The alpha for the career interest scale was .90 for the 105 grade 4 and 5 students on both the baseline and follow-up surveys, only slightly higher than the alphas for the 738 grade 6 through 9 students, which were .88 and .89 on the baseline and follow-up surveys, respectively. The alphas for the NASA-related activities scale were also consistent across the grade-level groupings (for grade 4 and 5 students who took the grades 6 through 9 survey, the alphas were .87 and .88 on the baseline and follow-up surveys, respectively, while the alpha for grade 6 through 9 students was .87 on both the baseline and follow-up surveys).
3 The teacher post-summer activities follow-up survey and the post-school-year activities survey are the same.
4 Singh, K., Chang, M., & Dika, S. (2006). Affective and motivational factors in engagement and achievement in science. International Journal of Learning 12(6), 1447-9540.
5 Fraser, B. J. (1981). TOSRA: Test of science related attitudes handbook. Hawthorn, Victoria, Australia: Australian Council for Educational Research.
6 Note: small differences in sums due to rounding.