Evaluation of the Comprehensive
Technical Assistance Centers
OMB Clearance Request for Data Collection Instruments
Part B: Supporting Statement for Paperwork Reduction Act Submission
(Revision to OMB Clearance 1850-0914)
October 2016
Prepared for:
U.S. Department of Education
Contract No. ED-IES-13-C-0059
Prepared by:
IMPAQ International
INTRODUCTION
This document is a revision of the currently approved collection for the National Evaluation of
the Comprehensive Technical Assistance Centers, a five-year evaluation that has been
underway since September of 2013. The original request was submitted in December of 2013
and approved in March of 2014 (OMB Control Number 1850-0914). It included six data
collection instruments: 1) Design-Focused Interview Guide for Center Staff, 2) Implementation-Focused Interview Guide for Center Staff, 3) Implementation-Focused Interview Guide for
Technical Assistance (TA) Recipients, 4) Center Staff Survey, 5) TA Recipient Survey, and 6) TA
Event Observation Guide. Of these six approved instruments, instruments 1-3 have been
completed and the related burden hours are deleted in this revision. Instruments 4-6 are still
being used to collect data. In the originally approved submission, it was noted that the
Outcomes-Focused Interview Protocols would be developed and added later. The current
request is for the addition of two new data collection instruments: 1) Outcomes-Focused
Interview Guide for Center Staff; and 2) Outcomes-Focused Interview Guide for TA Recipients.
This submission includes the original statement, along with the description of the two new
protocols and an updated total response burden estimate. We are requesting review of the
new protocols and the revised burden estimate.
The Institute of Education Sciences (IES) within the U.S. Department of Education (ED) is
conducting this evaluation. In the introduction to the supporting statement, we provide a
description of the Comprehensive Technical Assistance Centers program, the evaluation
questions, and the study design. The remaining sections of this document respond to specific
instructions of the Office of Management and Budget (OMB) for the preparation of a
supporting statement.
The Comprehensive Technical Assistance Centers
Title II of the Educational Technical Assistance Act of 2002 (ETAA, Section 203)¹ authorized the
Comprehensive Center Program, a discretionary grant program establishing technical assistance
centers. The Comprehensive Centers were last awarded in 2012, to “provide technical
assistance to State educational agencies (SEAs) that builds their capacity to support local
educational agencies (LEAs or districts) and schools, especially low-performing districts and
schools; improve educational outcomes for all students; close achievement gaps; and improve
the quality of instruction” (77 FR 33564).²
In 2012, the Department of Education awarded new five-year grants to 15 Regional Centers and
7 Content Centers under the Comprehensive Center Program. The Regional Centers each serve
one to seven U.S. states, territories, and possessions. They provide technical assistance (TA)
that builds the capacity of SEAs to implement, support, scale up, and sustain initiatives that
¹ http://www2.ed.gov/programs/newCenterp/legislation.html
² https://www.federalregister.gov/articles/2012/06/06/2012-13735/applications-for-new-awards-comprehensiveCenters-program#h-4
help districts and schools improve student outcomes. The Regional Centers focus their work on
seven federal priority areas:
1. Implementing college- and career-ready standards and aligned, high-quality
assessments for all students;
2. Identifying, recruiting, developing, and retaining highly effective teachers and leaders;
3. Turning around the lowest-performing schools;
4. Ensuring the school readiness and success of preschool-age children and their successful
transition to kindergarten;
5. Building rigorous instructional pathways that support the successful transition of all
students from secondary education to college without the need for remediation, and
careers;
6. Identifying and scaling up innovative approaches to teaching and learning that
significantly improve student outcomes; and
7. Using data-based decision-making to improve instructional practices, policies, and
student outcomes.
The Content Centers provide the Regional Centers and SEAs with in-depth content knowledge
and expertise by providing information, publications, tools, and specialized technical assistance.
The 7 Content Centers are:
1. Center on Standards and Assessments Implementation
2. Center on Great Teachers and Leaders
3. Center on School Turnaround
4. Center on Enhancing Early Learning Outcomes
5. Center on College and Career Readiness and Success
6. Center on Building State Capacity and Productivity
7. Center on Innovations in Learning
The National Evaluation of the Comprehensive Technical Assistance Centers
The National Evaluation is charged with examining and documenting how the Centers intend to
build SEA capacity (referred to as theories of action) and what types of activities they actually
conduct to build capacity.
Evaluation Questions
The evaluation will address questions in three areas:
Design:
1. How did the Centers define capacity building?
2. What theories of action did the Centers use to guide their general capacity-building
work?
3. How did the Centers assess the needs of their constituencies?
Implementation:
4. What strategies did Centers employ to achieve their outcomes?
5. To what extent did Centers implement technical assistance to their constituents as
planned?
6. To what extent and how did Centers collaborate with each other, by, for example,
sharing or building on other Centers’ resources and expertise?
Outcomes:
7. To what extent did Centers achieve their goals and objectives, especially capacity-building outcomes?
Focus on Two Federal Priority Areas
To gather more in-depth information, the evaluation will limit data collection on the
implementation and outcomes questions to two of the seven federal priority areas:
1. Identifying, recruiting, developing, and retaining highly effective teachers and leaders,
and
2. Ensuring the school readiness and success of preschool-age children and their successful
transition to kindergarten.
These two priority areas were purposefully selected. First, effective teachers and leaders is a
topic area in which all of the Regional Centers have ongoing projects. In addition, this is a topic
area where most SEAs have significant TA and capacity building needs, as many are choosing
and implementing educator evaluation systems or supporting districts and schools as they hire
and evaluate their professional staff. This priority is also tied to school reform efforts and large
federal funding streams such as the Race to the Top initiative, the School Improvement Grants,
and the Teacher Incentive Fund.
The second priority area, early learning, is another high-profile topic which has recently gained
increased attention. In response to federal initiatives and research findings on the benefits of
high-quality early education, many states have increased their funding for state-supported early
childhood education programs over the last few years. This evaluation is well poised to examine
the role that the Comprehensive Centers play in supporting state efforts in this priority area.
Given the overarching nature of the effective teachers and leaders area, and the recent policy
focus on early learning efforts, we believe that focusing on these two priority areas will give us
a good picture about how Centers generally develop SEA capacity (and in the case of Content
Centers, both SEA and Regional Center capacity) and what difference the Centers’ efforts may
have made. Further, we believe that SEAs’ capacity-building needs and the Centers’ approach
to providing TA in these two priority areas may differ across Centers in meaningful ways. These
differences are likely to produce different types of capacity-building outcomes (i.e., the needs
and approach to building capacity to develop effective teachers and leaders may be different
than the needs and approach to building capacity related to early learning initiatives). Thus, by
focusing on these two priority areas, we will gain detailed information on the Centers’ capacity-building activities, while still being able to learn about the variety of needs, approaches, and
outcomes.
The selection of two priority areas in no way implies that the Department has a preference for
these areas over others, or that the Centers or SEAs should shift the focus of their efforts to
these areas. Rather, this narrowing of focus allows us to target our resources in such a way that
we are able to learn about capacity-building activities and outcomes in sufficient detail.
Data Sources
Data collection for this study consists of interviews of Center staff and TA recipients, surveys of
Center staff and TA recipients, and observations of TA events. The study instruments include
three previously approved instruments for which all data collection has been completed (the
burden hours related to the completed instruments have been deleted in this revision); three
data collection instruments that will be in use for continuing data collection in 2017; and two
new instruments:
Already approved in the original OMB submission and all data collection completed:
Design-Focused Interview Guide for Center Staff
Implementation-Focused Interview Guide for Center Staff
Implementation-Focused Interview Guide for TA Recipients
Already approved in the original OMB submission and data collection continuing:
Center Staff Survey
TA Recipient Survey
TA Event Observation Guide
New data collection instruments to be reviewed in this submission:
Outcomes-Focused Interview Guide for Center Staff
o Purpose: To obtain the Center staff’s perspectives on outcomes of the Centers’
projects in the two key priority areas.
o Sample: The sample includes all 22 Centers and an estimated 114 interview participants. We estimate that 16 Centers will have projects in both teacher/leader effectiveness and early learning; we will conduct two group interviews at those Centers, one focused on each priority area. We estimate that 6 Centers will have projects in teacher/leader effectiveness but not early learning; we will conduct one group interview at each of those Centers. This yields an estimated 38 group interviews. We estimate an average of 3 participants per group interview, for a total of 114 participants. Each group interview will include staff working in the relevant priority area, including staff who work on one focal project to be discussed in the interview. The focal project will be among those that were discussed in the 2016 implementation-focused interviews; the outcomes-focused interviews will follow up on outcomes of these same projects. Center directors will be asked to identify appropriate staff to participate in these interviews. Groups may include TA managers, content specialists, and Center directors.
o Timing: April-May 2017
Outcomes-Focused Interview Guide for TA Recipients
o Purpose: To obtain the TA recipients’ perspectives on outcomes of the Centers’
projects in the two key priority areas.
o Sample: The TA recipient sample includes 38 participants. Participants represent
recipients of Center TA (usually staff of state education agencies) who work on
the projects discussed in Center interviews in each of the two priority areas.
Thus, based on the estimate above of 38 group interviews that focus on one
project each, we estimate a total interview sample of 38 participants (22
recipients of teacher effectiveness projects and 16 recipients of early learning
projects). We will ask each Center to identify one individual who is a key
recipient of services provided through the focal project discussed in each of the
outcomes-focused interviews with Center staff.
o Timing: Within two months of Center site visits, approximately May-August 2017
Evaluation Reports
The evaluation will produce two reports. The first report will be an interim report focusing on
how the Comprehensive Centers designed their work as technical assistance providers. The
report will describe the Centers’ underlying theories of action and definitions of “capacity
building,” and explain how the Centers assessed their constituencies’ needs and developed
work plans to address those needs. This report will be available in early 2017.
A final report will be produced in September 2018. The final report will integrate all study
findings.
DESCRIPTION OF STATISTICAL METHODS (PART B)
1. Describe the potential respondent universe (including a numerical estimate) and any
sampling or other respondent selection method to be used. Data on the number of
entities (e.g., establishments, state and local government units, households, or persons)
in the universe covered by the collection and in the corresponding sample are to be
provided in tabular form for the universe as a whole and for each of the strata in the
proposed sample. Indicate expected response rates for the proposed sample. Indicate
expected response rates for the collection as a whole. If the collection had been
conducted previously, include the actual response rate achieved during the last collection.
This study does not involve statistical methods for sample selection and estimation. The study is designed to represent the work of all 22 Centers in design, and all of the Centers’ work in implementation and outcomes for the two priority areas. (The number of Centers working
in each priority area may vary by year, but in data collection years so far, all Centers have
worked in the teacher/leader effectiveness area, and either 15 or 16 Centers have worked in
the early learning area.) Respondent universes vary by instrument. For surveys, these include
all Center staff and TA recipients relevant to the two priority areas, and the survey questions
address all projects in these areas. Implementation and outcomes interviews include smaller
numbers of projects and associated staff and constituents; the projects are purposively
selected. These smaller, purposive samples enable the researchers to meet the goal of the
study for an in-depth and nuanced understanding of capacity-building processes and outcomes.
In addition, the evaluation will profile a small number of projects in depth to illustrate the
process by which Centers design, implement, and produce outcomes for their technical
assistance.
Profiles of 6-10 successful projects will be developed within each of the two selected federal
priority areas. We will employ a purposeful selection process to identify projects for these
profiles. Within each priority area, we will develop profiles of Regional and Content Center
projects, and projects of varying scope (single state versus regional or national projects) and
emphasis. The profiles will involve no unique data collection other than observations of project
events; the profiles will draw on data gathered through all the instruments described in this
statement. Samples for new and continuing instruments are described below.
Already approved and continuing instruments:
Center Staff Survey: The Comprehensive Center staff survey was administered via online survey software in 2015 and 2016 and will be administered again in 2017. IMPAQ staff contact each of the
Centers to obtain the list of respondents for each site. The evaluation team provides guidelines
for the selection of survey respondents: they are staff who are actively involved in leading or
delivering technical assistance in the two focal priority areas.
TA Recipient Survey: The TA recipient survey was administered via online survey software in 2015 and 2016 and will be administered again in 2017. The sample includes individuals who receive technical
assistance from the Centers in the two priority areas each year. Prior to survey administration
each year, IMPAQ staff communicate with the Centers to confirm the list of TA projects in the
two priority areas that were active in the last 12 months and the list of TA recipients and their
contact information. The evaluation team provides guidelines as to criteria for inclusion of TA
recipients in the sample; they must have been active participants in Center projects in the two
priority areas over the past 12 months.
TA Event Observation Guide: Evaluators will ask Center staff members in early 2017 to identify
upcoming observable services or events of projects under consideration for project profiles.
When possible, observations will be conducted in conjunction with the 2017 site visits.
Observations of webinars or other virtual events will be conducted as they are scheduled.
New data collection instruments to be reviewed in this submission:
Outcomes-Focused Interview Guide for Center Staff: This instrument will be administered during the April-May 2017 site visits at all Centers, to the groups that participated in the 2016 implementation interviews regarding projects in the two priority areas. Based on earlier rounds of Center staff interviews, we expect a 100% response rate. We will conduct one or two group interviews at each Center, depending on whether the Center has projects in one or both of the priority areas. Each interview will include staff working in the relevant priority area and on a focal project in that area. Groups may include TA specialists, TA managers, content specialists, and Center directors. We estimate that we will interview 22 groups (one per Center) focusing on a project in the teacher/leader effectiveness area, and 16 groups (one at each of the 16 Centers with early learning projects) about projects in that area, since some Centers do not conduct projects in early learning. We estimate an average of 3 participants per group interview, for a total of 114 respondents.
Outcomes-Focused Interview Guide for TA Recipients: This instrument is designed for key
recipients of Centers’ TA (usually staff of state education agencies) who work on the projects
discussed in Center outcomes interviews; these interviews will be administered by telephone
within two months of the Center interviews in 2017. Based on earlier rounds of TA recipient
interviews, we estimate an 85% response rate. We will ask each Center to identify one
individual, a key constituent of the project discussed in each of the Center outcomes interviews.
Thus, based on the estimate of 38 group interviews with one focal project each, we estimate a
total interview sample of 38 (22 recipients of teacher effectiveness projects and 16 recipients of
early learning projects).
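For reference, the estimated interview counts described above can be tallied as follows; this simply restates the figures in this submission as arithmetic, and the counts remain estimates:

(22 Centers x 1 teacher/leader effectiveness group interview) + (16 Centers x 1 early learning group interview) = 38 group interviews with Center staff
38 group interviews x approximately 3 Center staff participants each = approximately 114 Center staff respondents
38 focal projects x 1 key TA recipient each = 38 TA recipient interview respondents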
An overview of each new and continuing instrument, its purpose, its administration timing, its sample respondents, and the research questions addressed is provided in Exhibit 1 below. Data
collection using the Design-Focused Interview Guide, which addresses research questions 1-3,
has been completed. Data collection using the Implementation-Focused Interview Guides for
Center Staff and TA Recipients, which focus on questions 4-6, has also been completed. Data
collection using the TA Recipient and Center Staff Surveys, which address questions 4, 5, 6, and
7, will continue in 2017. Data collection with the new Outcomes-Focused Interview Guide for
Center Staff and the new Outcomes-Focused Interview Guide for TA Recipients, both of which
focus on question 7, will take place in 2017.
Exhibit 1. Instruments, Purpose, Timing, Sample, and Research Questions
Newly Submitted

Instrument: Outcomes-Focused Interview Guide for Center Staff
Purpose: Understand technical assistance outcomes, including longer-term outcomes, especially how Centers have built SEA capacity
Administration Timing: April-May 2017
Estimated Sample: Centers with projects in the two selected priority areas (22 Centers have projects in teacher/leader effectiveness and 16 have projects in early learning); staff who work on projects in those areas; estimated sample of 114 staff
Major Research Questions Addressed: 7. To what extent did Centers achieve their goals and objectives?

Instrument: Outcomes-Focused Interview Guide for TA Recipients
Purpose: Understand Centers' longer-term outcomes from TA recipient perspectives, especially how Centers have built SEA capacity
Administration Timing: May-August 2017
Estimated Sample: State education staff identified by Centers as key recipients of projects discussed in Center interviews; estimated total of 38 TA recipients
Major Research Questions Addressed: 7. To what extent did Centers achieve their goals and objectives?

Already Approved and Continuing

Instrument: Center Staff Survey
Purpose: Understand Center staff individual roles and perceptions of projects in the two priority areas, including how projects build capacity
Administration Timing: May-June 2015, 2016, 2017
Estimated Sample: All Center staff who provide TA in the selected priority areas each year
Major Research Questions Addressed: 4. What strategies did Centers employ to achieve their outcomes? 5. To what extent did Centers implement technical assistance to their constituencies as planned? 6. To what extent and how did Centers collaborate with each other? 7. To what extent did Centers achieve their goals and objectives?

Instrument: TA Recipient Survey
Purpose: Investigate how technical assistance recipients, including state education staff and other participants as relevant, perceive and use the services they receive, and how these services have helped them build capacity
Administration Timing: May-June 2015, 2016, 2017
Estimated Sample: All key TA recipients of projects in selected priority areas each year
Major Research Questions Addressed: 4. What strategies did Centers employ to achieve their outcomes? 5. To what extent did Centers implement technical assistance to their constituencies as planned? 6. To what extent and how did Centers collaborate with each other? 7. To what extent did Centers achieve their goals and objectives?

Instrument: TA Event Observations
Purpose: Provide observational data about the strategies that Centers used to support capacity building and achieve planned outcomes; will inform project profiles
Administration Timing: In 2017, as relevant observable activities are identified
Estimated Sample: Observable events of TA activities in profiled projects; estimated total of 6-10 observations
Major Research Questions Addressed: 4. What strategies did Centers employ to achieve their outcomes? 5. To what extent did Centers implement technical assistance to their constituencies as planned?
2. Describe the procedures for the collection of information, including:
Statistical methodology for stratification and sample selection,
Estimation procedure,
Degree of accuracy needed for the purpose described in the justification,
Unusual problems requiring specialized sampling procedures, and
Any use of periodic (less frequent than annual) data collection cycles to reduce burden.
Below, we describe basic data collection procedures for the new instruments. As noted above,
this study does not involve statistical methods for sample selection and estimation. The
evaluation is designed to represent the universe of Centers, with purposive samples of priority
areas and projects used in interviews to allow for in-depth examination of capacity-building
processes, as relevant to the research questions of the study. Methods to obtain high response
rates are described under Question 3.
Data collection process details for all interviews:
Each site visit team will consist of an evaluator from IMPAQ (the site liaison) and a consultant
from a subcontractor. The evaluators have expertise in the study design and evaluation
methods, while the consultants have expertise in technical assistance to state and local
education agencies. Having the two visitors work as a team during the site visits enhances the
accuracy of the data gathered. The same two visitors conduct the TA recipient telephone
interviews associated with each Center, but scheduling difficulties sometimes necessitate that
only one of the interviewers conduct each interview. The same pair of visitors attends all site
visits of the Centers to which they are assigned. To ensure accurate notes, each interview is
audio recorded if the interviewees permit.
The study leaders conduct annual trainings of the site visit teams so that all team members
share a consistent understanding of the study, the research questions, the interview questions
and probes, and the data collection needs. Prior to each wave of site visits all site visitors
convene in Washington, DC or Oakland, CA for a half-day training session. The session includes
a study overview or update, site visit logistics and activities, site pre- and post-visit
communication, data collection procedures on site (including a review of the interview
protocols), and data handling. The site visit task leaders have distributed a site visitor guide
with a checklist that describes all tasks the site visitors need to perform before, during, and
after each visit. All site visitors meet regularly to discuss issues and concerns during the site visit
data collection period.
3. Describe methods to maximize response and to deal with issues of non-response. The
accuracy and reliability of information collected must be shown to be adequate for
intended uses. For collections based on sampling, a special justification must be provided
for any collection that will not yield “reliable” data that can be generalized to the universe
studied.
Overall, the evaluation team has designed instruments so that they are easy to understand and
place as little burden on participants as possible. Evaluators conduct multiple rounds of follow-up by email and phone for all data collection activities. Data collection has been completed for
years 2015 and 2016; the final round of interviews and surveys will take place in 2017. Methods
of data collection that have been in place for the first two years will continue in 2017. Details
are provided below.
Center Staff Interviews and Surveys: Interviews take place during site visits. Site liaisons
contact Center directors to schedule interviews, explain their purpose and general focus, and
explain the criteria for participation in group interviews. For implementation and outcomes
interviews, site liaisons ask Center directors to identify the projects to be discussed and the lists
of appropriate participants in advance of the interviews, following guidance of the site liaisons.
Site liaisons provide the Center directors with a form to complete for this purpose. Site liaisons
and consultants then interview identified participants in person, as a group, during site visits. In
the event that some identified Center staff members are participating in site visits via
videoconferencing or teleconferencing, site visitors conduct the interviews using these
technologies.
Evaluators plan the list of Center staff survey participants each year with Center directors or
their designees. Initial invitation emails with links to the survey, customized for each
respondent, are sent to each sample member. A week later, reminder emails are sent to
everyone who has not yet responded. A week after that, another round of reminders is sent to
non-respondents. The next week, a third set of reminders is sent to remaining non-respondents. After the first and second reminder emails, phone calls are made to non-respondents to remind them to complete the survey.
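To summarize the contact sequence described above, the follow-up effort amounts to the following approximate week-by-week schedule (illustrative only; actual timing may shift within the fielding period):

Week 0: initial invitation email with a customized survey link sent to each sample member
Week 1: first reminder email to non-respondents, followed by a reminder phone call
Week 2: second reminder email to non-respondents, followed by a reminder phone call
Week 3: third reminder email to remaining non-respondents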
TA Recipient Interviews and Surveys: For administration of the TA recipient interviews,
evaluators ask Centers for the contact information for relevant TA recipients of projects
discussed during the Center interviews. Evaluators contact the TA recipients using the following
procedures: 1) All identified SEA recipients receive an initial contact via email, requesting their
participation in a one-hour interview, and providing an overview of the scope and purpose of
the evaluation. The general purpose of the interview and the topics to be covered are also
described. 2) If there is no response within a week, evaluators continue to follow up by email or
phone on a weekly basis for up to three weeks, as needed. If evaluators are unable to schedule
a TA recipient interview for each relevant priority area project, evaluators ask the relevant
Center for names of alternate TA recipients who might serve as replacements or
representatives.
For administration of the TA recipient surveys, evaluators ask Center staff to identify and
provide contact information for TA recipients of all projects in the two focal priority areas.
Evaluators provide guidance to Center staff on the sample list: TA recipients identified for the
survey should be those who have worked directly with Center staff or who have participated
actively in Center-organized workgroups, conferences, or training sessions. Initial invitation
emails with links to the survey, customized for each respondent, are sent to each sample
member. A week later, reminder emails are sent to everyone who has not yet responded. A
week after that, another round of reminders is sent to non-respondents. The next week, a third
set of reminders is sent to remaining non-respondents. After the first and second reminder
emails, phone calls are made to non-respondents to remind them to complete the survey.
Throughout the fielding period, we contact the Centers regarding apparent email address errors
and receive corrections to contact information for various recipients. As we receive that
information, new invitations are sent to those people. In addition, during the phone follow-ups,
we receive new phone numbers for some recipients. When those numbers become available,
we send additional reminders to coincide with the additional calls.
4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as
an effective means of refining collections of information to minimize burden and improve
utility. Tests must be approved if they call for answers to identical questions from 10 or
more respondents. A proposed test or set of tests may be submitted for approval
separately or in combination with the main collection of information.
Study team members pre-tested drafts of the two new interview protocols with fewer than 10
respondents. The protocols were revised based on respondent feedback and to improve the
interview flow, clarity, and efficiency.
5. Provide the name and telephone number of individuals consulted on statistical aspects of
the design and the name of the agency unit, contractor(s), grantee(s), or other persons
who will actually collect and/or analyze the information for the agency.
The following individuals and organizations are involved in data collection, analysis, and
consulting on technical aspects of the study design.
Responsibility | Organization | Contact Name | Telephone Number
Co-Principal Investigator and Project Director | IMPAQ International | Phyllis Weinstock | 510-597-2423
Co-Principal Investigator | IMPAQ International | Chris Brandt | 443-259-5064
Associate Education Research Scientist | Institute of Education Sciences, ED | Amy Johnson | 202-245-7781
Senior Research Scientist | Institute of Education Sciences, ED | Thomas Wei | 202-245-7474
Technical Working Group Member | Consultant | Thomas Adams | 530-848-9728
Technical Working Group Member | Consultant | Margaret Goetz | 609-737-2464
Technical Working Group Member | Consultant | Laura Hamilton | 412-683-2300 x4403
Technical Working Group Member | Consultant | Constancia Warren | 212-367-4595
Technical Working Group Member | Consultant | Sharon Lynn Kagan | 212-678-8255
Task Lead, Site Liaison | IMPAQ International | Andrea Beesley | 443-832-2313
Task Lead, Site Liaison | IMPAQ International | Anne Chamberlain | 443-259-5215
Task Lead | IMPAQ International | Michaela Gulemetova | 202-774-1956
Co-Task Lead, Site Liaison | IMPAQ International | Nada Rayyes | 510-597-2422
Site Liaison | IMPAQ International | Stephanie Levin | 443-259-5413
Site Liaison | IMPAQ International | Raquel Sanchez | 510-282-4794
Site Liaison | IMPAQ International | Linda Toms Barker | 808-934-9297
Analytic Support | IMPAQ International | Ilana Barach | 510-597-2412
Analytic Support, Site Liaison | IMPAQ International | Maria DiFuccia | 202-774-1948
Analytic Support | IMPAQ International | Fata Karva | 202-774-1938
Analytic Support | IMPAQ International | Eliana Saltares | 202-774-1972
Analytic Support | IMPAQ International | Brandon Saunders | 202-774-1964
Analytic Support, Survey | IMPAQ International | Antoni Boston | 443-259-5120
Analytic Support, Survey | IMPAQ International | Maria Chen | 443-259-5520
Analytic Support, Survey | IMPAQ International | Rocco Russo | 202-774-1994
Analytic Support, Survey | IMPAQ International | Mousumi Sarkar | 202-774-1985
Analytic Support, Survey | IMPAQ International | Andrea Schwanz | 443-259-5146
Analytic Support, Survey | IMPAQ International | Mikhail Thomas | 443-259-5424
Analytic Support, Survey | IMPAQ International | Neil Thomas | 443-259-5422
Analytic Support, Survey | IMPAQ International | John Wendt | 443-259-5255
Technical Assistance Expert, Site Visitor | Consultant | Michelle Feist | 804-252-5714
Technical Assistance Expert, Site Visitor | Consultant | Deborah Jonas | 503-381-4164
Technical Assistance Expert, Site Visitor | Consultant | Leslie Rennie-Hill | 410-206-0394
Technical Assistance Expert, Site Visitor | Consultant | Paul Smith | 541-543-9179
Technical Assistance Expert, Site Visitor | Consultant | Michelle Swanson | 804-252-5714