T7 Resources -- Data collection

1) Data Artifacts

Research methods are a specific set of procedures or tools for understanding more about how your action influenced the actions of others. You will need to decide what kinds of artifacts you are going to collect. To make your decision, revisit your cycle research question. The first part should describe the action you plan to take and the second part the outcome you hope to effect. Now look at your outcome and ask yourself: what sort of data would help you see the outcome through the eyes of others? How will you get past your own frames of thinking, which shape how you "see" the outcome? In action research, it is good to collect more than one form of data and see whether, taken together, they help you understand what happened from multiple perspectives. In collaborative action research you may be collecting data as a team.

Here is a list to help you think about the artifacts you might collect:

Perspectives of others: What do they think or say?
  • Interviews
  • Surveys or questionnaires (students, parents, co-workers)
  • Focus groups, informal discussions, or student reflections
  • Case studies of subsamples of participants

Observations of Performances of others: What do they do?

  • Students' blogs or journals
  • Assessments of skills (tests, quizzes, homework, report cards, grades)
  • Rubrics (student created) and student portfolios or projects (see the Buck Institute for examples)
  • Students' self-assessments

Record keeping: What can be counted or documented?
  • Checklists of different actions
  • Number of interruptions or disruptions of learning
  • Logs of the number of meetings
  • Minutes of the meetings
  • Photos, videos, or other forms of visual data
  • Audio recordings (for analysis of ways of talking)
  • Attendance, online participation, or completion rates
  • Diaries, journals, or field notes

Requests for assistance or information: Who asks whom for what?
  • Sociograms (who gets help from whom)
  • Number/type of requests for help
  • Number/type of questions asked
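The raw material for a sociogram is simply a log of who asked whom for help. A minimal sketch of tallying such a log is below; the names and requests are invented for illustration:

```python
from collections import Counter

# Hypothetical log of help requests observed in a classroom,
# recorded as (asker, helper) pairs.
requests = [
    ("Ana", "Ben"), ("Ana", "Ben"), ("Cal", "Ben"),
    ("Ben", "Dee"), ("Cal", "Dee"),
]

# Tally how often each person is asked for help -- these counts
# are what the arrows in a sociogram would represent.
helpers = Counter(helper for _, helper in requests)
print(helpers.most_common())  # most-consulted participants first
```

Even this simple count can reveal which participants act as hubs of assistance, which is often the first question a sociogram is meant to answer.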

It is important to make your stance clear. You are not trying to be an objective bystander beyond the politics of the situation. Instead, you are trying to effect change, and you are likely to have strong desires to have the change result in specific outcomes. Your evidence is not meant to "prove to others" that your actions caused the outcomes. Rather, it is for you to understand, at a deeper level, how your actions contributed to the outcome. Your honesty is to yourself first and to others second. You want to know what about a given practice is working. If you lie to yourself by overestimating the outcomes, you will fail to learn as much as you might. You are trying to figure out which actions lead to which reactions. Your care in looking at the outcomes from many different perspectives will be vital to your learning.

With this understanding, you are free to collect the type of evidence that will help you understand what is happening. You can use traditional social science methods, and you can creatively figure out what evidence will help you understand the change.

  • A teacher who introduced a change in the way he taught themes and ideas through literature used his discussions with parents at teacher conferences to collect evidence. His reasoning was that if students were genuinely engaged in working out the ideas, they would share them with others. One possible place of sharing is with parents, who are often curious about what is going on in school. Students share things that they find important or relevant. By finding out what parents knew about the unit ideas, he was finding out what students took home from their school experience.
  • A professional developer who was experimenting with a more engaging way of working with teachers used a version of time on task to see if her changes were effective. She noted the length of time teachers left the room for breaks or phone calls. She found that when she shifted the activity, the amount of time spent out of the workshop room decreased dramatically. She used this as evidence that the teachers found the shift in teaching more engaging.
  • A teacher who wanted to increase the amount of informal discussion of ideas among teachers created a number of professional learning experiences, including visiting one another's classrooms. He kept track of informal conversations with teachers. He would note the time and place of the chat and the percent that was social (about family or friends), procedural (what time or where things were taking place), or professional (dealing with instructional practices). This coding of his own conversations helped him see which activities prompted the most informal professional dialogue.
  • A media specialist in a hospital setting changed the way he educated nurses, moving out of the role of teacher to become more of a support person. The data he collected included the use of media by nurses in their educational programs. He was able to track how this shift led to more use, and how the nurses' ownership of the technology generated a higher demand for similar training, which had not been the case with earlier efforts.
  • A doctor who wanted to develop his patients' "self-healing" potential explored the nature of the doctor-patient interaction. He started by initiating email contact before and after the visit. He audio-taped the treatment sessions, and listening to these tapes helped him see a change that happened when he shifted his use of pronouns from "I" and "you" to "we." His next cycle was to experiment with deliberately using "we" in discussing the problem and attending to how this shifted the dialogue and the engagement of the patient in the healing process.
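The conversation-coding approach in the third example above can be sketched as a small script. The three category labels follow that example, but the logged chats and resulting percentages are invented for illustration:

```python
from collections import Counter

# Hypothetical log of informal chats, each coded into one of the
# three categories the teacher used: social, procedural, professional.
chats = ["social", "professional", "procedural",
         "professional", "professional", "social"]

# Tally each category and report it as a percentage of all chats.
counts = Counter(chats)
total = len(chats)
for category, n in counts.most_common():
    print(f"{category}: {100 * n / total:.0f}%")
```

Comparing these percentages across different activities (before and after classroom visits, for instance) is what let the teacher see which experiences prompted the most professional dialogue.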

2) Learning How to Collect Data

Action researchers collect many different forms of evidence: photographs, logs, journals, and structured observations. Some of these data collection strategies overlap with evaluation research. OERL has created a set of innovative tutorials to help researchers new to data collection learn some of these research strategies. While these efforts are meant to train evaluation researchers, they can be very useful for action researchers who decide to use one or more of these methods. Each module contains (1) step-by-step strategies and tools of evaluators, (2) a scenario that helps you see how to apply the tools to an evaluation problem, and (3) a case study in which there are questions to be answered (and then compared to peers and experts). You do not have to leave your name, and if some of the methodological concepts are new to you, don't panic. Take the learning you need from the tutorials.


Here is the list; as you can see, many of these will be of use to action researchers.
These tutorials will help you learn to write a simple, brief survey or develop an interview protocol that matches your research question and helps you avoid simple mistakes (for example, combining more than one question in a single item).

* Designing an Evaluation:
.......Methodological Approach and Sampling
* Developing Written Questionnaires:
.......Determining if Questionnaires Should be Used
.......Writing Questionnaires
.......Questionnaire Design
.......Administering Questionnaires
* Developing Interviews:
.......Preparing an Interview Protocol
.......Administering Interviews
* Data Collection:
.......Procedures, Schedule, and Monitoring
* Instrument Triangulation and Adaptation
* Developing Observation Instruments

The Listening Resource blog by Susan Elliot is very useful for developing strategies in qualitative research.

If you are creating surveys, make sure that you have permission and consent to collect this information. If your work involves distributing surveys at the end of training sessions, or student activities that involve reflection, then you might not have to make any special provisions. However, if you are collecting information from protected groups that is not a part of their educational process, it is imperative that you have the proper oversight (from your principal, district, and/or university) and that you obtain participant consent, and, if children are involved, parental consent.

See the discussion of ethics in action research.

3) Building a Valid and Reliable Data Collection Plan

You will need to think about how to ensure that the data you collect is a valid measure of your action and that your method of analysis is reliable, that is, a reasonably accurate representation of the data. This chapter by Richard Sagor in Guiding School Improvement with Action Research will help you think through some of the issues around validity: the truthfulness of the selected data as evidence of what you are exploring. The reliability of the measures addresses issues of accuracy: did you summarize the data and report it to others in a way that adequately represents what happened?
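One common way to check the reliability of a coding scheme is to have two people independently code the same data and compute their percent agreement. A minimal sketch, with invented codes for ten items:

```python
# Hypothetical category codes assigned by two raters to the
# same ten pieces of data (e.g., logged conversations).
rater_a = ["prof", "social", "proc", "prof", "social",
           "prof", "proc", "social", "prof", "prof"]
rater_b = ["prof", "social", "prof", "prof", "social",
           "prof", "proc", "social", "proc", "prof"]

# Percent agreement: the share of items both raters coded the same way.
agreements = sum(a == b for a, b in zip(rater_a, rater_b))
percent_agreement = 100 * agreements / len(rater_a)
print(f"Agreement: {percent_agreement:.0f}%")  # here, 8 of 10 items match
```

Low agreement suggests the category definitions are ambiguous and need refinement before the coded data can be trusted; more formal statistics such as Cohen's kappa also correct for chance agreement.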

4) Finding Instruments that others have used for Assessment

Whenever possible, it is better to use an instrument that has been validated by others. Searching online can help you find assessment instruments, but here are some lists to get you started. Feel free to share other lists or resources you might find in your search.

A tertiary practitioner's guide to collecting evidence of learner benefit -- This guide from our colleagues in New Zealand is focused on improving college-level education and has some ideas for what forms of data to collect and use to assess teaching. You are welcome to download it.

National Institute for Learning Outcomes Assessment (NILOA) has as their mission to "discover and disseminate ways that academic programs and institutions can productively use assessment data internally to inform and strengthen undergraduate education, and externally to communicate with policy makers, families and other stakeholders." This is another great source for figuring out what data to collect and what you learn is likely to also apply to other learners with some adjustments.

The High School Survey of Student Engagement (HSSSE) might be of use as many teachers are concerned about student engagement.

Here is a list of resources for finding assessment tools (mostly in science, technology, and engineering), copied from Purdue University's Assessment Center.

The following resources provide further information on other databases, additional assessments, and research journals.
ATIS: Assessment Tools in Informal Science - Another database of assessment tools. Some assessments in the Assessment Center link to ATIS because they can be found on that site.
AWE: Assessing Women and Men in Engineering - Provides assessment tools for K-16 formal and informal education outreach activities.
Buros Institute of Mental Measurements - A searchable database of evaluations of tests.

The Buck Institute for Education has many great resources for project based learning and ways to assess the outcomes with rubrics.

EDC: Center for Science Education - A web guide created to familiarize educators with classroom assessments used in K-12.
ETS Test Collection Database - A searchable database of tests.

Infinity Project - The Infinity Project is a curriculum designed to help educators in the fields of engineering and is suitable for grades 6-12 and early college students.

MCAS: Massachusetts Comprehensive Assessment System - A database of assessments and questionnaires that test knowledge on technology, science and engineering.

NAEP: National Assessment of Educational Progress - Largest nationally representative assessment of math, reading, science, writing, the arts, civics, economics, geography and US History.
PARE: Practical Assessment, Research and Evaluation - Online journal that provides access to refereed articles that can have a positive impact on assessment, research, evaluation, and teaching practice.
RET: Research Experience for Teachers- Tracks trajectory of teachers' experiences over 5 years.
Tech Tally: Approaches to Assessing Technical Literacy - This book will be of special interest to individuals and groups promoting technological literacy in the United States, education and government policy makers in federal and state agencies, as well as the education research community.
STEM Education Instruments - The ITEST Learning Resource Center has compiled information on various instruments to help researchers, evaluators, and practitioners identify and locate instruments used to assess learning and other related outcomes in STEM learning environments.

RETURN TO T7 Activities or to Tutorial 7
Continue forward to Tutorial 8
