When educators are asked how technology is being used in instruction, the answer commonly involves a frequency report—how much “stuff,” or equipment, exists in the schools, how often it is used, and perhaps a breakdown of grade levels or content areas of use. More sophisticated queries look at which technology applications are being used, but this line of inquiry still focuses only on the learning materials themselves, not on the outcomes they are intended to produce. So when you observe technology use in action, what should you look for?

The Arlington, Va., Public Schools (APS) touched on this theme in identifying indicators of a rich and rigorous curriculum. Specifically, the APS school board wanted to know whether the percentage of classrooms that were successfully integrating technology into instruction was increasing. Staff members in the office of instructional media and technology (IMT) were charged with responding.

The adverb in the directive, “successfully,” was key, shifting the query from quantity to quality. What constitutes successful integration? The school board was interested not only in raising the frequency of practice, but in ensuring that the practices were pedagogically sound and supported effective learning and teaching, and hence were successful.

What does successful integration look like? “I know it when I see it” is a common response. But verification through observation—by seeing instruction in action—demands standards. We asked ourselves, “What structures exist to help gauge the status of and improve technology integration through observation?”

Surveys for self-reporting are widely available and generously shared, as are checklists to identify which technologies are being used for which instructional goals (e.g., for research). But IMT staff unearthed no observation instruments that would help formulate a response to our school board—that is, no tool to aid in determining whether technology was being used successfully. So we decided to create our own observation device.

The tool we created had to lend valuable insight into how well technologies were being used to support the school system’s educational goals. It also had to identify where observed classroom lessons fell on a continuum of instructional technology implementation practices.

We developed our observation tool using an instructional design process, including cycles of field tests with experts and users. Our tool is based on observation devices used to determine the quality of instruction in content areas; research on change (most notably, the Concerns-Based Adoption Model and the Apple Classrooms of Tomorrow project); and the Levels of Technology Implementation (LoTi) framework developed by Dr. Christopher Moersch of the National Business Education Alliance and Oregon State University. The tool can be downloaded in PDF format from the district’s IMT web site under the description “Instructional Media & Technology Observation Form.”

The first part of the observation form is for collecting information about the setting (grade level, content area, instructional space, etc.) and technologies in use. The data recorded here are analyzed for trends in use—such as which technologies are being used at which grade levels, in which content areas, and in which settings (if at all!).
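For readers curious about the mechanics of that trend analysis, here is a minimal sketch in Python of how records transcribed from this first section might be tallied. The record fields (grade_level, content_area, technologies) and the sample values are hypothetical, not drawn from the actual APS form or data.

```python
from collections import Counter

# Hypothetical transcriptions of the form's first section; field names
# and values are illustrative, not taken from the APS instrument.
observations = [
    {"grade_level": "5", "content_area": "science",
     "technologies": ["probeware", "spreadsheet"]},
    {"grade_level": "8", "content_area": "social studies",
     "technologies": ["web research"]},
    {"grade_level": "5", "content_area": "science",
     "technologies": ["spreadsheet"]},
]

# Tally which technologies appear at which grade levels and in which
# content areas, mirroring the trends described above.
by_grade = Counter()
by_content = Counter()
for obs in observations:
    for tech in obs["technologies"]:
        by_grade[(obs["grade_level"], tech)] += 1
        by_content[(obs["content_area"], tech)] += 1

print(by_grade.most_common())    # e.g., (('5', 'spreadsheet'), 2), ...
print(by_content.most_common())
```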

The second segment focuses on the APS learner goals of:

• Demonstrating a high degree of knowledge in subject areas;

• Communicating subject matter clearly;

• Solving problems using an effective process to reach viable solutions;

• Applying learning to the world beyond the classroom; and

• Self-assessing work and the work process to set goals for next steps.

For each academic goal, the form lists descriptors of what students might be doing when engaged in learning activities aimed at that goal. Observers check each descriptor for which they see students engaging with technology, either actively (e.g., creating products, manipulating data, making decisions to affect outcomes) or passively (e.g., observing material presented in a technology format). Each learner goal is then assigned an overall qualitative rating that reflects how well technology is applied to student learning.

The data from the ratings provide a picture of which general academic goals and activities technology is supporting, which dominate technology integration, and how well. We use this information to identify where we need to improve how instructional goals are being accomplished through technology use. For example, our spring 2001 observations revealed that technology is underutilized in learning activities to exchange information or collaborate with others outside the classroom—a component of our learner goals that the power of technology addresses so well! The results highlight areas of focus for future professional development efforts and curriculum projects.
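As a toy illustration of how per-goal ratings might be summarized across many observations, the sketch below codes the qualitative ratings numerically and averages them by goal. The rating labels and their numeric coding are assumptions; the article does not specify how APS scores or aggregates the form.

```python
# Hypothetical numeric coding of the qualitative per-goal ratings;
# the actual APS scale is not described in this article.
RATING_SCALE = {"not observed": 0, "passive": 1, "active": 2, "exemplary": 3}

# One dict per observed lesson: learner goal -> qualitative rating.
goal_ratings = [
    {"communicate clearly": "active", "apply beyond classroom": "not observed"},
    {"communicate clearly": "exemplary", "apply beyond classroom": "passive"},
]

# Average the coded ratings per goal to see which goals technology
# supports well and which are underused (candidates for professional
# development, as with the spring 2001 collaboration finding).
totals, counts = {}, {}
for lesson in goal_ratings:
    for goal, rating in lesson.items():
        totals[goal] = totals.get(goal, 0) + RATING_SCALE[rating]
        counts[goal] = counts.get(goal, 0) + 1

for goal in totals:
    print(goal, round(totals[goal] / counts[goal], 2))
```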

The third, most comprehensive section looks at levels of technology implementation. It draws heavily on the LoTi framework (for more information on LoTi, see the related links at the end of this article). Our observation form includes six levels of implementation and corresponding descriptors. Additionally, each level of implementation is associated with specific pedagogical approaches that cite general teacher and student roles and responsibilities. Again, observers note all descriptors that are in evidence in the lesson; more often than not, descriptors across various levels are checked. Finally, an overall category rating for the observation is assigned, based on the pedagogy embraced and the level of implementation observed.
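Because checks typically land on several levels at once, a simple per-level tally can show an observer where a lesson clusters before the overall rating is assigned. The sketch below is one assumed way to summarize the checks; the descriptor texts and level assignments are invented for illustration, and the form itself leaves the final rating to the observer’s judgment.

```python
from collections import Counter

# Hypothetical checked descriptors, each tagged with its implementation
# level (1-6); the descriptor texts are illustrative, not from the form.
checked = [
    (2, "teacher demonstrates content with presentation software"),
    (3, "students use tools to produce a required product"),
    (3, "technology use is directed step by step by the teacher"),
    (4, "students choose tools to investigate a problem"),
]

# Count checks per level; the cluster suggests, but does not decide,
# the overall category rating the observer assigns.
per_level = Counter(level for level, _ in checked)
for level in sorted(per_level):
    print(f"level {level}: {per_level[level]} descriptor(s) checked")
```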

How are these data applied to improving practices? Teachers and other instructional leaders can use the observation tool’s results to reflect on:

• The match between the teacher’s pedagogical philosophies and actual practices when using technology to support instruction;

• What the lesson required of students in terms of cognitive processing, and how technology contributed to the cognitive demand on the student;

• What the lesson required of the students in terms of problem solving, decision making, and authentic learning, and how technology contributed to the development of these skills;

• What the lesson required of the students in terms of self-direction and collaboration, and how technology contributed to these;

• How to revise the lesson to move the activity to a more sophisticated level, if appropriate to the learning goals;

• Professional development needs; and

• Curriculum support needs and potential projects.

The observation tool, when used thoughtfully, shifts the focus from what equipment we have to how we use it. We return to letting our philosophies and goals for learners drive our technology use.

Dr. Sheryl Asen is the instructional technology design and evaluation specialist for the Arlington, Va., Public Schools. She welcomes comments and questions at (703) 228-6088 or sasen@erols.com.

Related links: Instructional Media & Technology Observation Form
http://www.arlington.k12.va.us/departments/imt

Levels of Technology Implementation (LoTi) framework
http://www.learning-quest.com/lotibreak.html