Using Qualitative Interviews in Evaluations: Improving Interview Data

Transcript
Slide 1: Using Qualitative Interviews in Evaluations: Improving Interview Data (Part 2 of 2)

Sarah Heinemeier, Compass Evaluation & Research
Jill D. Lammert, Westat
September
Center to Improve Project Performance (CIPP), Westat

These webinars were developed as part of the Center to Improve Project Performance (CIPP) operated by Westat for the U.S. Department of Education, Office of Special Education Programs (OSEP). For those of you who aren't familiar with CIPP, our purpose is to advance the rigor and objectivity of evaluations conducted by or for OSEP-funded projects. This involves providing technical assistance in evaluation to OSEP grantees and staff and preparing TA products on a variety of evaluation-related topics. We'd like to thank our reviewers Jennifer Berktold, Susan Chibnall, and Jessica Edwards, and the OSEP Project Officers who provided input.

Suggested citation: Heinemeier, S., & Lammert, J. D. (2015). Using qualitative interviews in evaluations: Introduction to interview planning (Part 2 of 2). Webinar presented as part of the Technical Assistance Coordination Center (TACC) Webinar Series, July 29. Rockville, MD: Westat.

Slide 2: Goals of the Two-Part Series

Part 1
- Help a study team decide whether to include interviews in an evaluation design
- Outline the steps associated with using qualitative interviews as part of a rigorous evaluation
- Highlight key logistical decisions associated with planning high-quality interviews

Part 2
- Discuss what should be asked in the interview and how to collect high-quality interview data
- Offer guidance on how to define and recognize high-quality data
- Discuss strategies to ensure high-quality interview data are collected and utilized

Welcome to Part 2 in our two-part webinar series on interviews. We want to start by briefly recapping the material presented in Part 1, which addressed decisions about including interviews in an evaluation and planning key logistical aspects of the interview process. Then we'll move on to our subject material for today.

Slide 3: Takeaways from Part 1 in this Series

- It's essential to know the evaluation needs, goals, and budget when planning to conduct interviews.
- The evaluation plan can help identify the specific variables that data collections must provide.
- Knowing what data are needed to answer the evaluation questions can help identify who should be interviewed.
- Consider the 5 Ws when planning interviews and use strategies to ensure high-quality data are collected and reported.

Here are some of the key takeaway points from Part 1 of this two-part series. First, it's essential to know the evaluation needs, goals, and budget when planning to conduct interviews. Interviews should not be conducted in a vacuum from other evaluation plans and activities; this runs the risk that the interview will not generate the precise and accurate data that are needed. The evaluation plan should guide interview planning and the development of the interview protocol to ensure important variables or data are captured. Knowing what data are needed to answer the evaluation questions may point to specific individuals who should be interviewed. Finally, plan through the logistical tasks of interviews well in advance to ensure the interviews proceed smoothly. We refer to these logistical tasks as the 5 Ws, which we will turn to next.

Slide 4: The 5 Ws of Planning High-Quality Qualitative Interviews

1. Who should you interview?
2. Where should you conduct the interviews?
3. When should you conduct the interviews?
4. What should you ask in the interview?
5. How can you collect high-quality interview data?

In Part 1, we discussed the 5 Ws of planning and conducting interviews: who, where, when, what, and how. We discussed the first three Ws on this list in Part 1; we'll focus on the What and the How today.

Slide 5: Practical Example

The project: Provide training in a new reading program to teachers (available in face-to-face, online synchronous, and online asynchronous modes).

Goals of the evaluation:
1. Determine the quality, relevance, and usefulness of trainings.
2. Determine the extent to which training opportunities (overall and by mode) have:
   a. Improved teacher knowledge,
   b. Improved teacher practice,
   c. Contributed to improved student performance in reading.
3. Determine how the trainings can be improved.

In today's webinar we will continue using an example that was first presented in Part 1. The example is a project in which the evaluation may benefit from interviews as a data collection technique. In this example we have a project that provides training to teachers in three modes. The project's evaluation has three goals: (1) determine the quality, relevance, and usefulness of trainings; (2) determine the extent to which training opportunities have improved teacher knowledge, teacher practice, and student performance; and (3) determine how, if at all, trainings can be improved.

Slide 6: Practical Example (cont.)

The design: Mixed-methods, quasi-experimental, pre-post design. The data sources are shown below.

Quality, relevance & usefulness of training
  Quantitative: Training evaluation forms
  Qualitative: Interviews
Improved teacher knowledge
  Quantitative: Pre-post measures of knowledge
  Qualitative: Interviews
Improved teacher practice
  Quantitative: Surveys
  Qualitative: Observations, interviews
Improved student performance
  Quantitative: Pre-post measures of performance
  Qualitative: Interviews

Here is more information about our example: it is a mixed-methods, quasi-experimental design, with pre and post data collections. This means that the evaluation will collect data from teachers who participated in one or more training opportunities as well as teachers who did not participate in training. A mix of quantitative and qualitative data will be collected, including quantitative training evaluations, pre- and post-measures of teacher knowledge, and pre- and post-measures of student performance. The evaluation also incorporates qualitative data in the form of interviews and observations.

Slide 7: WHAT should you ask in the interview?

Interview questions should elicit the data necessary to answer evaluation/research questions or triangulate findings:
- Questions need to target specific variables of interest
- Questions may support the formative and summative aspects of the evaluation

Follow an interview protocol to:
- Ensure all participants answer the same core questions
- Ensure all questions deliver important information for the evaluation

We ended Part 1 with a brief consideration of WHAT should be asked in the interview and HOW high-quality data can be collected. As mentioned, we will go into more depth on these topics in today's session. Let's consider "what" first. The interview should provide specific and precise data in response to evaluation questions. That stated, interviews allow a study team to explore the boundaries of phenomena and experiences in order to get a better idea of what "specific and precise" may encompass. No matter the number or nature of the questions, it is critical to establish a formal protocol for the interview, to ensure all data are collected in a uniform way by all data collectors. An interview protocol is a detailed document that gives the interviewer instructions for how to conduct the interview and lists the questions, and maybe even the probes, that an interviewer should ask.
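To make the idea of a formal protocol concrete, here is a minimal sketch, in Python, of how a study team might represent a protocol as structured data so that every interviewer asks the same core questions and records the same fields. The field names and the example question are illustrative assumptions, not part of the webinar materials.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical representation of an interview protocol; field names are illustrative only.

@dataclass
class InterviewQuestion:
    qid: str                                         # identifier used when compiling data
    text: str                                        # exact wording read to every participant
    probes: List[str] = field(default_factory=list)  # optional follow-up prompts
    instructions: str = ""                           # guidance for the interviewer

@dataclass
class InterviewProtocol:
    title: str
    questions: List[InterviewQuestion]

# Example drawn loosely from the webinar's training-project scenario
protocol = InterviewProtocol(
    title="Teacher training follow-up interview",
    questions=[
        InterviewQuestion(
            qid="Q1",
            text="How did you hear about the training?",
            probes=["Was that another teacher, an administrator, or a member of the training team?"],
            instructions="Probe if the answer is vague (e.g., 'from a friend').",
        ),
    ],
)

for q in protocol.questions:
    print(q.qid, q.text)
```

Keeping the protocol in one structured document (whatever its format) is what lets multiple interviewers collect data in a uniform way.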
Slide 8: Creating and Testing Hypotheses

Create hypotheses that align with evaluation questions:
- Hypotheses specify the variables and variable values that determine whether or not the null hypothesis can be rejected
- Null hypothesis: the status quo, or what would exist if the project had no impact
- Alternative hypothesis: what might exist if the project does have an impact
- There may be more than one alternative hypothesis

One tool for deciding WHAT to ask is to create hypotheses that can be tested through the collection, compilation, and triangulation of data, including data from interviews. Triangulation is the process by which data from multiple data collections (different techniques, events, sources) are used to test or confirm findings or to fill knowledge gaps.

Slide 9: Let's revisit our example...

The project provides training in a new reading program to teachers.

Sample evaluation question: To what extent did the training contribute to improved teacher knowledge?
- Possibility #1: The training did not contribute to improved teacher knowledge (null hypothesis)
- Possibility #2: The training contributed to improved teacher knowledge (alternative hypothesis)

How do we know which possibility is supported?
- Pre-post knowledge assessments indicate average gains of a pre-determined magnitude
- A pre-determined percentage of interview participants cite at least one way in which the training improved their knowledge
- Both data sources verify that the training contributed to improved knowledge

Returning to our example, let's consider our training project and one of the evaluation questions: To what extent did the training contribute to improved teacher knowledge? In this example, possibility 1 (which is the null hypothesis) is that the training DID NOT contribute to improved teacher knowledge. Possibility 2 (the alternative hypothesis) is that the training DID contribute to improved knowledge. In this example, we decided to triangulate pre-post knowledge assessments with interviews to determine which possibility can be supported.
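The triangulation decision rule described above can be pictured with a short sketch. The threshold values (an average pre-post gain of at least 5 points and at least 60% of interviewees citing an improvement) are hypothetical placeholders for whatever pre-determined values a study team sets; they are not from the webinar.

```python
# Hypothetical sketch of the triangulation decision rule for the sample evaluation
# question. Threshold values are illustrative assumptions.

GAIN_THRESHOLD = 5.0        # pre-determined magnitude of average pre-post gain
CITATION_THRESHOLD = 0.60   # pre-determined share of interviewees citing improvement

def triangulate(pre_scores, post_scores, interview_cites_improvement):
    """Return which hypothesis the two data sources jointly support."""
    avg_gain = sum(post - pre for pre, post in zip(pre_scores, post_scores)) / len(pre_scores)
    share_citing = sum(interview_cites_improvement) / len(interview_cites_improvement)

    quant_supports = avg_gain >= GAIN_THRESHOLD
    qual_supports = share_citing >= CITATION_THRESHOLD

    if quant_supports and qual_supports:
        return "Alternative hypothesis supported: training improved teacher knowledge"
    if not quant_supports and not qual_supports:
        return "Null hypothesis retained: no evidence of improved knowledge"
    return "Contradictory findings: data sources do not triangulate"

# Example with made-up data for four teachers
print(triangulate(
    pre_scores=[60, 55, 70, 65],
    post_scores=[72, 61, 78, 70],
    interview_cites_improvement=[True, True, False, True],
))
```

The third branch is exactly the real-life challenge discussed on the next slide: when the data sources disagree, the decision rule cannot settle the question on its own.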
Slide 10: Testing Hypotheses: Real-Life Challenge

Data may deliver contradictory results:
- For example, 50% of teachers may say the training did increase knowledge, but 50% may say it did not.
- Which hypothesis is correct?

Possible solutions:
- Take a deeper dive; ask more questions
- Report the findings as contradictory

A challenge with testing hypotheses is that data do not always agree (do not triangulate on the same finding) or do not always clearly support one hypothesis over another. In this example, 50% of interview participants said the training did improve their knowledge, while the remaining 50% said it did not. What are we supposed to do with these results? Can we interpret this to mean the program is making a difference? When this happens, it might be good to add data collection events to take deeper dives into the phenomenon, or the study team should report the findings as contradictory. What are some other solutions you have used in the past?

Slide 11: HOW can you collect and use high-quality interview data?

Seven strategies:
1. Create an interview protocol to provide precise data that align with evaluation questions.
2. Make efforts to reduce response bias throughout the interview.
3. Train (and re-train) interviewers to elicit complete and information-rich answers.
4. Screen data for quality DURING the interview.
5. Capture responses accurately and completely.
6. Conduct rigorous qualitative analysis.
7. Create a report template for compiling the interview data.

Let's now turn to how high-quality data can be collected. In Part 1 of the series, we introduced seven strategies: the use of protocols, bias reduction, interviewer training, data screening, data capture, thorough analysis, and report templates. Before we discuss these in more depth, let's first define what we mean by high-quality data.

Slide 12: What are high-quality data?

Precise: To what extent do data reflect an exact measurement or the exact [narrative] information needed to respond to a question?
Accurate: To what extent do data reflect the actual value of an observation or achievement? For example, is a measure of height or weight accurate, or off by several inches or pounds? Does the participant provide a true (e.g., verifiable) account of a phenomenon or experience?
Reliable: To what extent can a data measurement be replicated with accuracy and precision? Does the participant have any reason to respond falsely?
Consistent: To what extent do data or responses agree with each other?
Complete: To what extent is complete information provided? For example, is the unit of measurement provided to help interpret the data? Is there enough context to understand a participant's response?

It is important in every data collection to ensure high-quality data are collected; this applies to both quantitative and qualitative data. There are five basic aspects of quality, as shown on this slide: precision, accuracy, reliability, consistency, and completeness. Let's take a few moments to review each of these aspects of quality. Do any seem more or less important than others? Do any seem more or less challenging when it comes to collecting interview data?

Slide 13: Strategy 1: Create an Interview Protocol to Provide Precise Data

Construct interview questions that will allow the study team to:
- Test hypotheses
- Elicit a precise response, but without leading the respondent
- Prompt for additional information, as needed

Now that we've defined high-quality data, let's turn back to the strategies for ensuring collection of high-quality data through interviews. Strategy 1 is to create and use an interview protocol that contains questions that will allow for testing hypotheses. It is helpful to think through and include question prompts when designing the interview protocol. Prompts may be necessary if the data an interviewee is providing aren't as precise, accurate, reliable, complete, or consistent as you would like (remember, these are the data quality characteristics). For example, let's suppose the interviewer asked a participant how they heard about a training and the participant responds "from a friend."
The interviewer may prompt the participant to indicate whether the friend is another teacher in the school, an administrator, a member of the training team, and so on. The challenge is assessing these characteristics of data quality on the fly; interviewers need to know a lot about the topic and be well trained in the protocol in order to effectively use prompts to elicit high-quality data. Some prompts are generated during the interview itself, as new topics or questions arise out of the conversation with each respondent. It may be helpful to create instructions for interviewers on how to handle these new topics or questions (e.g., explore a new topic more fully or try to get the respondent back on topic), as it is possible for an interview to get derailed, with the result that all data needed to answer the evaluation questions might not be collected. At a minimum, when this occurs, the interviewer should be sure to note the nature of the prompt that is generating new topics or questions so that the study team can explore whether it is necessary to add a new question or prompt to the protocol.

Slide 15: Question Types

- Structured
- Semi-structured
- Unstructured
  - Direct
  - Indirect

There are three basic types of questions: structured, semi-structured, and unstructured. Further, the interviewer can deliver unstructured questions in a direct or indirect manner. Let's spend some time discussing each.

Slide 16: Types of Interview Questions: Structured

Example: To what extent do you believe the training program provided the knowledge you need to successfully implement the reading program in your classroom? Please choose one of the following answers.
a. Great extent
b. Moderate extent
c. Little extent
d. Not at all

Here is an example of a structured interview question. Notice the use of predetermined answer choices, which generates very precise data. A structured interview question may look very similar to a survey question; this is by design. In fact, surveys that are completed orally are a form of interview.

Slide 17: Types of Interview Questions: Semi-Structured

Example: To what extent do you believe the training program provided the knowledge you need to successfully implement the reading program in your classroom? Please choose one of the following answers.
a. Great extent
b. Moderate extent
c. Little extent
d. Not at all
Please explain your answer:

Here is an example of a semi-structured question, which solicits precise data through a forced choice of response and then asks the respondent to provide greater detail and explanation of his or her answer.

Slide 18: Pros & Cons of Different Types of Interview Questions

Structured
  Pros: Help ensure data are collected in standardized, comparable ways across multiple respondents; limited, if any, use of prompts reduces the amount of time needed to answer the questions. Can be analyzed relatively quickly using quantitative techniques.
  Cons: Interviewer cannot divert from the structure and language of the interview question to probe for additional information from the respondent.

Semi-structured
  Pros: Help ensure complete or comprehensive data are collected across multiple respondents. Interviewer may have some ability to rephrase questions or to use prompts to probe for additional information.
  Cons: Interviewer has limited ability to divert from the structure and language of the interview question to probe for additional information. The use of probes increases the length of the interview. Require both quantitative and qualitative analysis techniques.

As a quick reference, we've outlined common pros and cons for structured and semi-structured questions. There is no perfect approach or question. Rather, the researcher will find the best choice for the project, balancing the need for high quality with the need to reduce bias.
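As a small illustration of the point above that structured responses can be analyzed relatively quickly with quantitative techniques, here is a hedged sketch that tallies answers to the structured question from slide 16. The response data are invented for illustration.

```python
from collections import Counter

# Invented responses to the structured question on slide 16
responses = [
    "Great extent", "Moderate extent", "Great extent",
    "Little extent", "Moderate extent", "Great extent", "Not at all",
]

counts = Counter(responses)
total = len(responses)

# Report a simple frequency distribution across the predetermined answer choices
for choice in ["Great extent", "Moderate extent", "Little extent", "Not at all"]:
    n = counts.get(choice, 0)
    print(f"{choice:>15}: {n} ({n / total:.0%})")
```

The open-ended "please explain" portion of a semi-structured question would still require qualitative analysis, which is the extra cost noted in the cons above.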
Slide 19: Types of Interview Questions: Unstructured

Example: In what ways do you think the training program provided (or didn't provide) you with the knowledge you need to successfully implement the reading program in your classroom?

Here is an example of an unstructured question. This type of question may result in a broad range of responses, and those responses may lack precision. However, unstructured questions allow the respondent to provide a fuller response than structured or semi-structured questions. (Note: If unstructured questions are used, keep in mind that much more time will be needed to conduct data analysis.)

Slide 20: Types of Unstructured Questions: Direct & Indirect

Examples of direct questions:
- How, if at all, did the training improve your knowledge?
- What were some of the new ideas and strategies you learned?

Examples of indirect questions:
- What information was covered during the training?
- How familiar were you with some or all of these concepts prior to the training?

What potential issues do you see with the direct and indirect questions listed above?

As shown here, unstructured questions can be designed to approach a topic or question DIRECTLY, in which the interviewer asks for the specific data that are of interest. In contrast, a question may be framed INDIRECTLY, in which the interviewer asks more general questions about the topic. Direct questions tend to limit the types of responses a respondent is likely to give and may lead to a bit more bias in the interview. However, the precise data they generate can be useful for helping to answer evaluation questions and often require less extensive data analysis. Indirect questions tend to reduce bias in the way a question is asked (e.g., they may be less leading), but they also may decrease the precision of responses and increase the amount of data analysis required. In addition, indirect questions may generate answers that are not the information that is being sought.

Slide 21: Pros & Cons of Different Types of Interview Questions

Un-struc