Pre-testing of Evaluation Surveys:
An Examination of the Intersection of Domestic Human Trafficking with Child Welfare and Runaway and Homeless Youth Programs
Information Collection Request
0970 - 0355
Supporting Statement
Part A
October 2015
Submitted By:
Office of Planning, Research and Evaluation
Administration for Children and Families
U.S. Department of Health and Human Services
7th Floor, West Aerospace Building
370 L’Enfant Promenade, SW
Washington, D.C. 20447
A1. Necessity for the Data Collection
This information collection request is for a generic clearance to allow us to use samples of more than nine participants in order to pretest short-form and long-form versions of a tool intended to identify victims of human trafficking within child welfare (CW) and runaway and homeless youth (RHY) populations. This exploratory work is an early step in the development of a screening tool and will allow the research team to test the screening tool’s feasibility, reliability, and internal validity with a purposive sample of youth in a variety of CW and RHY settings.
Although there have been efforts to develop screening tools and protocols to identify victims of human trafficking, many of these tools are either not designed specifically for youth or have not been validated. The lack of a screening tool that is specifically designed to determine if a child is a victim of human trafficking is challenging for service providers and CW and RHY professionals who seek to identify human trafficking victims among their service populations.
The U.S. Department of Health and Human Services (HHS) has contracted with the Urban Institute (Urban), in collaboration with CW and RHY expert advisors, to conduct an exploratory study that develops a human trafficking screening tool and pretests it with RHY and CW-involved youth. This exploratory study is the first phase of a larger body of work needed to develop, test, and validate a human trafficking screening tool for potential use across a range of settings. For the current study, Urban will pretest the tool by interviewing youth at various CW and RHY service settings across several sites in three states. However, the findings from this exploratory study will not establish the wider validity of the tool. Subsequent larger pilot studies, going through the normal Paperwork Reduction Act (PRA) clearance process, will be necessary to establish the tool’s broader validity and possible use in the field.
The collection is being undertaken at the discretion of HHS.
A2. Purpose of Survey and Data Collection Procedures
Overview of Purpose and Approach
The purpose of the tool administration and data collection described below is to pretest and further develop a screening tool intended to identify victims of human trafficking among CW-involved and RHY-involved youth populations. This is an initial step toward developing a tool that could provide ACF with useful information about human trafficking screening efforts in CW and RHY programs. To pretest the tool, we will administer it to a purposive sample of youth ages 13 to 24 who are involved in CW and RHY programs across several sites in three states and who voluntarily consent to participate in the study.
This pretest will be the first stage of a broader effort to create, pilot test, and validate a screening tool that could be used across RHY and CW settings. In future stages, the tool could be revised based on the findings of this pretest, and undergo further testing and validation under a full OMB information collection request.
Study Design
This study involves pretesting a screening tool for use in CW and RHY service environments. Below we describe the process used to create and finalize the current version of the tool and the procedures we plan to use to pretest it.
The Screening Tool: the Life Experiences Survey
The tool that we will pretest is called the Life Experiences Survey. The full instrument is included in Appendix A (Survey Instrument & Example Screenshots). The survey includes long-form and short-form versions of a human trafficking screening instrument, as well as questions regarding youths’ demographics, country of origin, family/runaway/foster care backgrounds, trafficking-related risk factors, and previous help-seeking attempts. The long-form version of the human trafficking screening tool includes 19 questions tapping the core dimensions of sex and labor trafficking, as follows:1
Force (3 questions),
Fraud (3 questions),
Coercion (8 questions), and
Commercial sexual exploitation (5 questions).
Of these 19 questions, several are adapted from prior human trafficking screening tools, most notably one developed by Covenant House. Appendix C (Sources of Trafficking Questions) repeats the human trafficking questions from our screener, with footnotes indicating the original source of the question. In most cases, the question wording was adapted slightly to work in our closed-response survey format.
These 19 questions are asked in the context of employer/work situations and cover youths’ lifetime and past-year experiences. In addition to these items, a 20th question on the long-form tool asks about youths’ lifetime and past-year experiences engaging in “survival sex,” or the trading of something sexual for money, shelter, food, or anything else of value without the involvement of a third-party exploiter. The short-form version of the human trafficking screening tool, described hereafter as the URBAN trafficking screener, consists of five questions adapted with modification from the long version to capture the dimensions of force, fraud, coercion, and commercial sexual exploitation within the context of a work/employer relationship (plus two optional, additional questions regarding survival sex and emotional abuse). We will administer the short-form and long-form tools simultaneously to allow for concurrent pretesting of both instruments.
The current draft tool was completed after making revisions based on input from HHS trafficking, RHY, and CW advisors regarding the feasibility and face/content validity of the tool (e.g., does it appear to measure what it purports to, does it comprehensively measure trafficking experiences, and are the questions structured in easily answerable ways). We also received suggestions for revisions from our Youth Advisory Council (YAC). The YAC comprises eight young adults ages 18 and older,2 with histories of human trafficking experiences, physical/sexual victimization, and/or experience as runaway/homeless or child-welfare-involved youth. We conducted two web-based conferences through which YAC members provided feedback about the screening tool questions, including comments on the survey’s length, readability, interpretability, item wording, and response choices, as well as suggestions about how the tool could be tailored to best identify trafficking victims among RHY and CW populations. The YAC members reviewed, but did not take, the screening tool survey.
Through the processes described above, we confirmed the feasibility of the tool’s overall length and item reading levels, as well as its face/content validity, that is, whether the items concisely and comprehensively measure the trafficking dimensions they purport to measure and whether the tool will be viable in the various service environments.
Programming and Pretesting the Instrument
After incorporating feedback from the HHS trafficking, RHY, and CW advisors and the YAC, we generated electronic and paper versions of the Life Experiences Survey (including the long- and short-form screening tools), with appropriate skip patterns. Urban staff and the YAC members checked the survey for length, appropriate skip patterns, question coding, and ease of use.
Next, to test the face validity and feasibility of administration of the Life Experiences Survey, Urban pre-piloted the survey at two separate sites in Washington, D.C. A total of nine youth were recruited from two different sites operated by Sasha Bruce Youthwork, a local organization that provides a variety of support and services to youth. At both sites, surveys were administered on tablets through Qualtrics in either online or offline mode. Multiple youth took the survey in one room, but on separate tablets. At one site (Alabama Avenue), a software glitch in the Wi-Fi-accessible (online) version of the survey caused it to restart if youth scrolled too far down on the screen. At the second site (Maryland Avenue), the surveys were administered in offline mode and experienced no software problems.
Participants were able to follow along on the tablet as a survey administrator read the instructions and consent form aloud. None of the youth expressed concerns regarding the clarity of instructions or consent. They understood that they could stop taking the survey at any time and seek help from available resources.
Overall, the participants expressed that the questions were clear. Youth suggested a few minor edits to the non-trafficking portions of the survey, such as adding an educational option of “Currently in a GED program” and clarifying the question on relationship with birth family to reflect that they maintained relationships with a few select family members. With regard to the face validity of the trafficking instrument embedded in the survey, participants expressed understanding of the individual questions but wanted to be reminded of the survey’s definition of “work” more frequently while answering the questions.
This pre-pilot feedback indicated that the Life Experiences Survey was feasible to administer in an offline tablet mode, even in a group setting in which youths’ individual responses were protected from others’ view. Further, the questions included in the survey, particularly those comprising the trafficking instrument, had a strong degree of face validity: youth clearly understood what the questions were asking and understood the progression of the questions. It is also reassuring that youth did not perceive the questions to be triggering or invasive; although the participating youth expressed that the questions were detailed and thorough, they did not believe they were overly personal or traumatizing.
Finalizing the Pretest Instrument
We have translated the current version of the survey into Spanish in both paper and electronic form, and are currently recording the Audio Computer-Assisted Self-Interview (ACASI) version of the electronic tool in both English and Spanish. Depending on the language needs of the youth at our partner sites, we will also consider translating the tool into other languages.
We will submit the final paper version of the survey instrument as an amendment to our Institutional Review Board for final approval, and to OMB.
Pretesting the Instrument
This information collection request is to pretest the Life Experiences Survey. We will use two different methods of survey administration in this pretest: self-administered and practitioner-administered. Self-administration on a tablet or computer allows youth full anonymity and privacy, since responses will be deposited either in a lockbox or on the secure online server where survey responses are stored, unattached to any personal identifiers. Anonymous, private self-administration may elicit more truthful responses because respondents need not fear the mandatory reporting steps that could follow if a trafficking situation is disclosed. However, anonymous reporting does not guarantee that service providers can connect youth to supportive services. We will therefore also test practitioner administration, which allows us to validate the tool in the manner in which service providers may use it in the future: identifying trafficking victims as part of existing intake procedures and then connecting them to services.
We plan to have 75 percent of the youth we survey take a self-administered, electronic version of the tool, and 25 percent take the survey as administered by an appropriate, trained service provider. Each respondent will be randomized to receive either the self-administered (SA) or practitioner-administered (PA) version of the tool. The precise mechanism for assigning youth to the PA or SA condition will depend on site-specific conditions. For example, we might instruct staff to assign youth with birthdays in the months of October, November, or December to the PA group; youth with birthdays in the other nine months would be assigned to the SA group.
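To illustrate this kind of site-level rule, a minimal sketch appears below. It assumes only that a youth’s birth month is known at intake; the function name, month cutoffs, and language are illustrative assumptions, not part of the study protocol, and each site’s actual mechanism will depend on local circumstances.

```python
# Illustrative sketch (not the study protocol) of a birthday-month assignment rule.
# Three of the twelve birth months map to the practitioner-administered (PA) group,
# approximating the planned 25 percent PA / 75 percent SA split.

PA_MONTHS = {10, 11, 12}  # October, November, December

def assign_condition(birth_month: int) -> str:
    """Return 'PA' (practitioner-administered) or 'SA' (self-administered)."""
    if not 1 <= birth_month <= 12:
        raise ValueError("birth_month must be between 1 and 12")
    return "PA" if birth_month in PA_MONTHS else "SA"

# Example: a youth born in November is assigned to the PA group;
# a youth born in April is assigned to the SA group.
print(assign_condition(11))  # PA
print(assign_condition(4))   # SA
```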
Whenever feasible, we will have the tool completed on a laptop or tablet. If any locations are not open to using the electronic survey, we will also have a paper version available with clearly marked skip patterns. We will maintain a record of which locations used paper tools and which used electronic tools, and will note in our dataset which responses came from paper versus electronic administration. We will also indicate in the dataset whether the questions were practitioner-administered or self-administered, which will allow us to evaluate how answers differ by mode of administration. During our initial visit to each location, we will note the specific circumstances in each location, such as the availability of desktop or laptop computers. As the pretest evolves, we will update this information if we learn our original assessment was not accurate.
We have been working to identify and select pretesting sites, and we have developed a process, described below and in Supporting Statement Part B, for survey administration and pretesting.
Limitations of this Pretest
As noted above, this pretesting process is only the first step in developing a human trafficking screening tool for youth that could be refined and undergo further validation. This study will rely on a convenience sample of youth, selected from RHY and CW service organizations that voluntarily agree to participate in our study. These organizations are not selected to be representative of all CW and RHY service organizations. Within those organizations, youth are free to volunteer to take the screening tool or may opt not to take it. Therefore, the set of youth taking the tool within each organization will also be self-selected, perhaps in ways that correlate with their levels of involvement in human trafficking. For this reason, the rate of trafficking we observe among our pretest participants should in no way be taken as an accurate estimate of the prevalence of human trafficking among RHY or CW youth within each organization, among RHY- and CW-involved youth in each site, or in the country overall. This also means that further work, subject to full OMB review, would be needed to test the external validity of the screening tool among a representative set of CW- and RHY-involved youth.
Instead, the goal of our study is to pretest the tool with a diverse set of youth who represent the range of racial/ethnic, gender, age, and sexual orientation characteristics of CW- and RHY-involved youth. Our pretest will evaluate how answers compare between the short form and long form of the tool among this diverse convenience sample of youth, and how answers compare between self-administered and practitioner-administered versions of the tool.
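As an illustration of the kind of short-form versus long-form comparison this pretest could support, the hypothetical sketch below computes two common agreement statistics between the screening classifications produced by the two forms. The toy data, the binary positive/negative coding, and the choice of percent agreement and Cohen’s kappa are all assumptions for illustration; they are not the study’s specified analysis plan.

```python
# Hypothetical sketch: agreement between short-form and long-form screening results.
# All data and metric choices here are illustrative assumptions.

def percent_agreement(short_flags, long_flags):
    """Share of respondents classified the same way by both forms."""
    return sum(s == l for s, l in zip(short_flags, long_flags)) / len(short_flags)

def cohens_kappa(short_flags, long_flags):
    """Chance-corrected agreement between two binary classifications."""
    n = len(short_flags)
    po = percent_agreement(short_flags, long_flags)
    p_short = sum(short_flags) / n
    p_long = sum(long_flags) / n
    pe = p_short * p_long + (1 - p_short) * (1 - p_long)  # expected chance agreement
    return (po - pe) / (1 - pe)

# Toy example: 1 = screened positive, 0 = screened negative
short_form = [1, 0, 0, 1, 0, 1, 0, 0]
long_form  = [1, 0, 1, 1, 0, 1, 0, 0]
print(round(percent_agreement(short_form, long_form), 2))  # 0.88
print(round(cohens_kappa(short_form, long_form), 2))       # 0.75
```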
A3. Improved Information Technology to Reduce Burden
Whenever feasible, youth and practitioners will complete the tool on a laptop or tablet, though we will also have a paper version in case a site cannot accommodate electronic administration and data collection. The electronic version of the survey, available in English or Spanish, will have audio available to accompany all text for youth who wish to hear the questions and answers read aloud.
A4. Efforts to Identify Duplication
This research does not duplicate any other questionnaire design and validation efforts. Some local agencies in our sites may be pretesting related screening tools; however, they appear to use different methods and more limited measures. In these sites, we plan to try to combine efforts to provide a cross-validation between our instruments, which will strengthen our ability to assess the validity of both our tool and theirs.
A5. Involvement of Small Organizations
The information collection activities proposed under this clearance will not be possible without the assistance of some small organizations. However, we have worked to reduce the burden on small organizations to the extent possible.
First, Urban will provide clear training to organizations prior to beginning survey administration, and will remain onsite for the first few days of administration in order to assist with the process and answer any questions that arise. Throughout the study period, Urban staff will be readily available to sites by phone or email to answer any questions that arise. If requested and if the budget allows, we will make a second in-person visit to select sites to further assist with the administration process. Second, our decision to have the majority (75 percent) of youth take the screening tool in self-administered form means that sites will only be asked to administer a small number of screening tools directly. Mostly, they will need to get youth set up on the computer or tablet, and then will be free to turn to other tasks until the youth finishes the survey. Third, we are also trying to identify a number of both CW and RHY organizations at each site, so that the number of surveys we ask from each individual organization is small.
A6. Consequences of Less Frequent Data Collection
We are planning for a one-time data collection process, for the purposes of pretesting a human trafficking screening tool.
A7. Special Circumstances
There are no special circumstances for the proposed data collection efforts.
A8. Federal Register Notice and Consultation
Federal Register Notice and Comments
In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity. This notice was published on September 15, 2014 (vol. 79, no. 178, p. 54985) and provided a 60-day period for public comment. ACF did not receive any comments in response to this notice.
HHS staff have been in close communication and coordination with Urban Institute staff in developing the survey, developing our study design for pretesting and validation, and selecting sites for the study. A small group of youth (fewer than nine) in child welfare or RHY programs also formed a Youth Advisory Council to review and comment on the proposed screening tool. The Council did not take the survey; members only reviewed it and provided feedback. Another small group of youth (nine total) in Washington, DC were administered a version of the screening tool to help test for face validity and the feasibility of tool administration.
A9. Incentives for Respondents
Youth participants who take the screening tool will receive a $25 gift card as a token of appreciation for participating in the screening process. Giving tokens of appreciation is standard practice in this type of research (for example, at least some sites participating in the Youth Count! initiative, launched by the U.S. Interagency Council on Homelessness (USICH), the Departments of Housing and Urban Development (HUD), Health and Human Services (HHS), and Education (ED) provided $25 gift cards to youth taking Youth Count! surveys) and has been approved by the Urban Institute’s Institutional Review Board. In order to ensure that such dollar amounts are not coercive, especially for homeless or low-income youth, we indicate on all consent/assent forms and train screening tool administrators to make it very clear that participants who choose to end their participation early or skip certain questions will still receive the $25 gift card.
A10. Privacy of Respondents
Before administering the screening tool, we will obtain informed, voluntary consent from all participants, and will take a number of measures to protect the anonymity of the survey responses. The consent forms are included as Appendix D (Consent Forms).
All individuals who participate in this study will be informed of the risks and benefits of participation in the study and will be required to give their informed consent/assent prior to beginning the screening tool. To indicate their consent/assent to participate, youth taking the tool on a computer or tablet will click on the forward arrow in the survey to begin the survey. Youth taking the survey on paper will indicate their consent/assent by moving on and answering survey questions. We are not collecting signed consent forms because the signature on the consent forms would connect the youth’s names to their survey responses, and we wish to maintain full privacy for respondents. They will be informed that they can choose not to participate or end their participation at any time with no penalty to them. All practitioners who help in administering the tool will sign a pledge of privacy that explicitly states that they cannot share any of the responses with anyone who is not listed in their mandatory reporting guidelines. We will also sign a memorandum of understanding (MOU) with each participating CW or RHY site, which articulates the terms of privacy in detail. We will ensure that each site’s MOU is fully executed prior to pretesting the screening tool at that site.
Special Permissions
Given that we will be unable to obtain parental consent for children under age 18 who have run away or been kicked out of their homes, we will not obtain parental consent for participants. Based on guidance from the Office for Human Research Protections (OHRP) regarding waivers of parental or guardian consent, this study poses minimal risk to the youth study participants. The OHRP definition of minimal risk states that “the probability and magnitude of harm or discomfort anticipated in the research are not greater in and of themselves than those ordinarily encountered in daily life” (45 CFR 46.102(i)). The OHRP also provides guidance on the contexts in which parental or guardian permission for research involving children can be waived. The OHRP states that permission requirements can be waived under either 45 CFR 46.116(c) or (d). We believe that this research fits the criteria under (d): “An IRB may approve a consent procedure which does not include, or which alters, some or all of the elements of informed consent set forth in this section, or waive the requirements to obtain informed consent provided the IRB finds and documents that: (1) The research involves no more than minimal risk to the subjects; (2) The waiver or alteration will not adversely affect the rights and welfare of the subjects; (3) The research could not practicably be carried out without the waiver or alteration; and (4) Whenever appropriate, the subjects will be provided with additional pertinent information after participation.” We believe this study meets all four criteria. The Urban Institute IRB has approved our plans for obtaining youth consent and waiving parental consent.
To obtain permission for participation for children under age 18 whom we reach through child welfare group homes, we will secure the necessary permissions within each location for direct research with child-welfare-involved youth. Processes generally include filing paperwork for review by a research review board within a county, city, or state child welfare or human services agency.
Protections for individuals who take the self-administered tool
The individuals who participate in the study by self-administering the human trafficking screening tool (75 percent of the sample) will do so anonymously; no individually identifying information will be recorded by any member of the research team. In cases where our screening tool is jointly administered with a local site’s own screening tool, a project-specific identifier will be attached to the youth’s response, solely for the purpose of comparing youths’ responses on the two screening instruments. The identifier chosen will be unique to this study and will not be linked in any way to any other personal identifier. In the few sites where this may be possible, we will explain to the youth in our consent forms that their data will be shared in de-identified form, in this manner and for this sole purpose.
All individuals self-administering the screening tool on a tablet or computer will do so in private so that no one can see their responses. Responses recorded on tablets or computers will be kept secure as explained below. All individuals completing the tool on paper will also complete the tool in private and seal their responses in envelopes before placing them in a lockbox (provided to sites by Urban).
Protections for individuals who participate in practitioner-administered screenings
For participants in practitioner-administered screenings who are under the age of 18, mandatory child abuse reporting laws require practitioners to report to authorities any victimization that falls under the broad purview of “child abuse,” including all sex and labor trafficking victimization.3 During the assent process, youth will be informed of these laws; if they are not comfortable with the requirement that practitioners report such abuse, or feel they would be tempted to lie to protect themselves from such mandated reporting, they can refrain from participation at no penalty. To preserve the randomization of the sample, youth cannot opt to take the self-administered version if they were selected to receive the practitioner-administered version. This may reduce the number of youth who participate, especially youth who have been trafficked; however, it is important to document whether this tool can be administered in this manner. Therefore, any youth under the age of 18 who chooses to report trafficking victimization to a practitioner during the practitioner-administered screening is making an affirmative choice to potentially avail him or herself of the protections and services that will be provided by authorities upon reporting. In other words, the youth has willingly taken a step to sacrifice his or her anonymity in the interest of receiving help. This is a protection for vulnerable youth in and of itself, but one that youth will take advantage of only if they feel comfortable and ready.
For participants in practitioner-administered screenings who are age 18 or older, mandated reporting laws will not pose a threat to the privacy of these youth, as laws that mandate the reporting of child abuse disclosed to practitioners only apply to youths under the age of 18. As such, regardless of what types of trafficking-related abuses these participants disclose, practitioners will not be mandated to report that abuse. The exception to this, as is the case in all face-to-face survey and interview administration, is if adult participants disclose their intent to commit a crime or harm themselves or others. Prior to the start of the screening tool, participants will be informed that practitioners will be required to report this information. More specifically, they will be told that the screening tool will not ask about intent to harm one’s self or others, but if participants choose to disclose it, the law requires that practitioners report it. This exception aside, the only individuals who will be able to link adult participants’ responses to the respondent are the practitioners who administer the tool.
We will make extensive efforts to train these practitioners on the importance of maintaining privacy. Practitioners will only be permitted to select multiple-choice answers as they go through the screening tool with the at-risk individuals; they will not be permitted to take other notes or recordings. In addition to providing extensive training, where feasible we will identify licensed social workers or clinical therapists to serve as the administrators at each of our sites. Practitioners trained in these fields will already have an understanding of the principles of privacy and human subjects’ protections. All practitioner-administered screenings will be conducted one-on-one in a private room.
Protections for all respondents
Participant responses recorded with a tablet or computer will be housed anonymously within the survey software system. Responses recorded on paper will be secured in a lockbox provided by the Urban Institute, accessible only to one staff member on site who has received privacy training and has signed the Urban Institute Confidentiality Pledge.
Data Security at Urban Institute
For surveys taken by computer or tablet, we will use Qualtrics, a web-based survey service that allows surveys to be administered using a computer or any mobile device. The service provides an HTML5 “web app,” essentially a website with features that make it behave like a native app. The local information stored by the Qualtrics mobile app resides only in the browser’s cache, so emptying the cache will clear all local information related to the Qualtrics survey from the device. In addition, access to the survey will be password protected; in order to start the survey, screening tool staff will have to enter a password for a respondent. This way, respondents cannot start taking the survey on their own or attempt to access another respondent’s data. The survey will also use SSL encryption, which will protect data as it is transferred to the Qualtrics server while a respondent is taking the survey.
All data collected using Qualtrics are owned by the user and accessible only through logging on with the username and password. Qualtrics does not have access to surveys created by users or to respondent data collected via user surveys. More information on the security of the Qualtrics site and servers can be found here: http://www.Qualtrics.com/security/. Information on the security of surveys can be found here: http://www.Qualtrics.com/a/showFeatures.do?featureID=6.
Because the data are stored on a remote server and not on the device itself, there is no risk of someone accessing completed survey data if a tablet is lost or stolen. Because the survey does not save partially completed surveys or allow respondents to continue their progress if they exit the browser, program staff will be instructed to close the survey browser after every survey and reopen it for each subsequent survey. There is also language at the beginning and end of the survey reminding respondents to close the survey browser if they decide to stop taking the survey. These steps will prevent responses from being viewed by the next survey taker. In addition, because we are not using names in the web survey, even if the data were compromised, responses would not be identifiable to any person who took the survey. The data collected through the web survey, therefore, cannot be used to identify any respondents.
Urban staff will maintain the Qualtrics user account, with sole access to the Qualtrics database, and will be responsible for downloading the data from Qualtrics for use at Urban. The downloaded data will be protected as private and stored only on an encrypted hard drive or on a flash drive kept in a locked filing cabinet.
A11. Sensitive Questions
By necessity, given the goal of the screener, some of the questions included in the screening tool are of a sensitive nature, such as questions about experiences of sexual, physical, or emotional abuse or questions about engaging in various forms of sex work in exchange for money, food, housing, drugs, or anything else.
Before finalizing the pretest survey instrument, we worked with a Youth Advisory Council, consisting of RHY and CW youth, some with trafficking experience, to word the questions in a sensitive and non-stigmatizing manner. We then further pretested the instrument with nine youth at an RHY program in Washington, DC, to see how youth reacted to the tool. These youth reported that the questions did not trigger discomfort and were not overly invasive.
The pretest of this tool, which we hope to complete with 600 youth, will provide further information about youths’ willingness to honestly answer sensitive questions, and whether youth will answer such questions similarly in self-administered and practitioner-administered settings.
Connecting Youth to Services
Because youth safety and comfort are of utmost importance, we will work to ensure that proper supports are in place while youth take the screening tool and as youth are identified as trafficking victims through the screening process. Urban will ensure that all sites either have a licensed clinical social worker on-site or have a quick and direct protocol for connecting youth with a clinical social worker during the entirety of the screening tool administration period. Most CW service centers and RHY sites already have these resources available around the clock. To help trafficking victims and other youth in need connect to services, we will consult with experts at each site to compile a list of local services to provide to every youth who participates in the screening tool pretest. This will allow youth to seek services anonymously and of their own accord. We will include a statement at the end of the survey tool that asks youth to voluntarily share their answers with the service provider if they are comfortable doing so or wish to seek additional assistance. We will clearly note for youth which types of information could trigger mandatory reporting if shared with providers, so they can make an informed choice about disclosing their experiences. Anyone taking the anonymous survey will be provided a list of services and contacts that can assist with any immediate and long-term needs, which youth may use immediately or whenever they are ready to seek out services. We will ask service providers to keep track of how many youth taking the self-administered tool choose to share their answers.
A12. Estimation of Information Collection Burden
Newly Requested Information Collections
We request approval of 168 total burden hours for youth taking the screening tool (which includes the short form and long form within the same survey instrument). We expect the tool will take about 17 minutes total (2 minutes for the cognitive screener and 15 minutes for the tool, including reading the consent) per respondent, based on the mean survey length during our pretesting of the tool with nine youth involved in an RHY program in Washington, DC.
Total Burden Requested Under this Information Collection
Type of Respondent | No. of Respondents | No. of Responses per Respondent | Average Burden per Response (in hours) | Total Burden Hours | Hourly Wage Rate | Total Respondent Costs
Youth, Self-Administered | 450 | 1 | 0.28 | 126 | $19.24 | $2,424.24
Youth, Practitioner-Administered | 150 | 1 | 0.28 | 42 | $19.24 | $808.08
TOTALS | 600 | | | 168 | | $3,232.32
To calculate the annualized cost to respondents for the hour burden, we assume an average household income of $40,180, or 200 percent of the poverty threshold of $20,090 for a family of three, because OPRE projects are expected to study low-income populations. This figure translates to an hourly rate of $19.24. However, this is a conservative, upper-bound estimate, since we believe that many youth participating in this pretest may not be employed or may not earn that much income.
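For transparency, the short sketch below reproduces the arithmetic underlying the burden table. The division of the assumed $40,180 household income by roughly 2,088 annual work hours to obtain the $19.24 hourly rate is our assumption for illustration; the statement itself reports only the resulting rate.

```python
# Reproduces the burden and cost arithmetic shown in the table above.
# The 2,088 annual work hours used to derive the hourly rate is an assumption;
# the supporting statement reports only the resulting $19.24 figure.

HOURLY_RATE = round(40_180 / 2_088, 2)   # -> 19.24
BURDEN_PER_RESPONSE = 0.28               # hours (about 17 minutes)

groups = {
    "Youth, Self-Administered": 450,
    "Youth, Practitioner-Administered": 150,
}

total_hours = 0.0
total_cost = 0.0
for name, respondents in groups.items():
    hours = respondents * BURDEN_PER_RESPONSE        # 126 and 42 hours
    cost = round(hours * HOURLY_RATE, 2)             # $2,424.24 and $808.08
    total_hours += hours
    total_cost += cost
    print(f"{name}: {hours:.0f} hours, ${cost:,.2f}")

print(f"TOTALS: {total_hours:.0f} hours, ${total_cost:,.2f}")  # 168 hours, $3,232.32
```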
A13. Cost Burden to Respondents or Record Keepers
There are no additional costs to respondents.
A14. Estimate of Cost to the Federal Government
The pretesting of this tool, including time spent developing the tool, and the time and resources we will use to analyze the results in order to validate the tool, is funded by a government contract totaling $152,975.
A15. Change in Burden
This request is for additional data collection under generic clearance 0970-0355.
A16. Plan and Time Schedule for Information Collection, Tabulation and Publication
Following the pretest, we will prepare a written report detailing findings from our review of prior screening tools; describing the selected screening tool’s development process and justification for its format and content; and detailing findings regarding its feasibility, reliability, and validity from this pretesting. The report will include descriptions about how the tool was administered in the sites, any challenges encountered, and lessons learned to overcome challenges to screening youth for signs of trafficking involvement. The report will note which questions in our draft tool seemed best able to screen for trafficking involvement, and which could be removed or amended for future studies. We will also note, if indicated, potential additional questions that could be added to screening tools.
The report will be written for an audience of researchers and HHS staff, as well as interested front line staff and organizations. The report will include instructions for how to use the tool consistent with its validated format and will note limitations of this pretest, including the need for additional research and testing to more comprehensively validate the tool. In advance of submission of a draft report to the COR, we will seek comments from members of our Youth Advisory Council, a group of eight youth who have experiences as runaway/homeless youth and/or in the child welfare system.
Draft aggregate data tables that will be included in the report will also be shared with service providers who assisted in the study, accompanied by text explaining the sampling and tool pretesting procedures and emphasizing that this is a convenience sample and that findings cannot be used as a prevalence estimate of trafficking among CW and RHY populations at their site or in any given geographic area.
Next steps
Possible next steps for this research agenda include revising the instrument based on findings from this pretest. Researchers could then apply for full OMB clearance, including a public comment period, in order to conduct a full pilot of the tool to further test its internal and external validity. If, in our pretest, the short form shows findings similar to the long form, researchers could move forward with applying for full OMB clearance for a future validation of the short form of the tool.
Timeline
Data collection will begin in fall 2015, pending OMB approval, and continue until a total of 600 respondents have taken the screening tool. We expect this to require a few months. Then, in spring 2016, we will analyze our findings, complete our analyses of the tool’s validity and reliability, and complete our final report.
A17. Reasons Not to Display OMB Expiration Date
All instruments will display the expiration date for OMB approval.
A18. Exceptions to Certification for Paperwork Reduction Act Submissions
No exceptions are necessary for this information collection.
1 See http://www.nij.gov/topics/crime/human-trafficking/pages/welcome.aspx for the definition of human trafficking.
2 We included one member who was then 17 years old after she obtained written consent from her child welfare caseworker.
3 Mandatory reporting laws differ by state, which will be taken into consideration during site selection.