*** NOTICE ***

The ERIC Clearinghouse on Information & Technology web site is no longer in operation.

The United States Department of Education continues to offer the ERIC Database at https://www.eric.ed.gov

All ERIC Clearinghouses plus AskERIC will be closed permanently as of December 31, 2003.

In January 2004, the Department of Education will implement a reengineering plan for ERIC. The new ERIC mission continues the core function of providing a centralized bibliographic database of journal articles and other published and unpublished education materials. It enhances the database by adding free full text and electronic links to commercial sources and by making it easy to use and up to date.

From January 2004 until the new ERIC model for acquiring education literature is developed later in 2004, no new materials will be received and accepted for the database. However, the ERIC database will continue to grow, as thousands of documents selected by the ERIC clearinghouses throughout 2003 will be added. When the new model is ready later in 2004, the new ERIC contractor will communicate with publishers, education organizations, and other database contributors to add publications and materials released from January 2004 forward.

Please use www.eric.ed.gov to:

  • Search the ERIC database.
  • Search the ERIC Calendar of Education-Related Conferences.
  • Link to the ERIC Document Reproduction Service (EDRS) to purchase ERIC full-text documents.
  • Link to the ERIC Processing and Reference Facility to purchase ERIC tapes and tools.
  • Stay up-to-date about the ERIC transition to a new contractor and model.


Archived version of the site:

ERIC/IT Update


Feature Articles

The Maryland Virtual High School CoreModels Project:
Harnessing Computer Modeling for Scientific Inquiry

By
Mary Ellen Verona and Susan Ragan

The Challenge

Preparing students to “do” science in the real world of the future means guiding them in “doing” science now. Just as teachers would not think of teaching biology without a microscope or chemistry without test tubes, most of today’s scientists do not think of doing any kind of science without the aid of computer modeling and visualization. Because building a computer model requires intimate knowledge of a phenomenon and its interconnected parts, modeling often uncovers student-held misconceptions. According to College and Beranek (1992), “Computational modeling ideas and activities should have a key and central role throughout the science curriculum - not peripherally, and not only as a part of a special or optional course.”

The Methods

A vital component of Maryland Virtual High School’s CoreModels Project (MVHS CoreModels) is the role of the teacher in developing activities and evaluating their effectiveness in the classroom. In harmony with the standards for professional development found in the National Science Education Standards (National Research Council, 1996) and synthesized by the National Institute for Science Education (Loucks-Horsley, Stiles & Hewson, 1996), MVHS teachers work within a collegial, collaborative environment to reflect on teaching practices, facilitate change in science education, integrate theory and practice in school settings, and produce knowledge about learning and teaching.

According to Krajcik, Blumenfeld, Marx, and Soloway (1994), “The same constructivist approach recommended as a basis for classroom practice also applies to teachers. They construct their knowledge through interaction with peers, applying ideas, reflecting on the results, and implementing modifications.”

Professional Development

Through summer workshops, email exchanges, and web-based sharing of materials, science teachers across the state of Maryland have formed collaborative relationships through which they discuss teaching strategies. As a result of working with computer models, teachers are asking students different questions that require the analysis of more complex situations. By looking carefully at how their students express their understanding through their written answers to the MVHS assessments, teachers are discovering student misconceptions and misunderstandings. That knowledge is driving changes in classroom instruction. Assessments are now seen as a diagnostic tool as well as an evaluation of student learning.

Student Learning

An important part of our efforts is to document the kinds of student learning that result from using modeling and visualization tools. Through the use of video analysis of student dialogue, constructed response questions scored based on the Maryland Science Rubrics, and end-of-semester exams, we are examining the impact our materials and practices have on student performance.

Learning Community

MVHS CoreModels considers itself to be a community of learners who are engaged in discovering how to improve science education through computer modeling and visualization. The stakeholders in this research effort include students, classroom teachers, district-level science leaders, boards of education, community college educators, education researchers, cognitive scientists, and computational scientists.

Activities relate to computer models built with STELLA, a system dynamics software package. Curriculum reform standards such as the Benchmarks for Science Literacy, created by the American Association for the Advancement of Science’s (AAAS) Project 2061, focus on the nature of systems in the study of science. System concepts such as equilibrium and feedback transfer not only from one science to another, but also to other subjects such as the social sciences. Unlike much subject-specific software, STELLA is a general package that can be used to model any dynamic system, so students can continue to develop expertise throughout their school careers in a variety of subjects. Activities are also aligned with the expectations of the new Maryland Science Core Learning Goals. Goal 1 provides general skills applicable to all science areas; Goals 2-5 list expectations related to biology, earth science, chemistry, and physics.
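
For readers unfamiliar with system dynamics, the short Python sketch below shows the kind of stock-and-flow model that STELLA lets students build graphically. It is an illustration only, not taken from the CoreModels activity packets: a single hypothetical population stock with birth and death flows, advanced by simple Euler steps.

# Illustrative sketch (not from the CoreModels packets): one stock
# (population) with two flows (births, deaths), advanced with Euler steps.

def simulate_population(initial=100.0, birth_rate=0.05, death_rate=0.03,
                        dt=0.25, years=50):
    """Return (time, population) lists for a one-stock, two-flow model."""
    population = initial
    times, values = [0.0], [population]
    for step in range(1, int(years / dt) + 1):
        births = birth_rate * population       # inflow to the stock
        deaths = death_rate * population       # outflow from the stock
        population += (births - deaths) * dt   # Euler update of the stock
        times.append(step * dt)
        values.append(population)
    return times, values

if __name__ == "__main__":
    t, p = simulate_population()
    print(f"Population after {t[-1]:.0f} years: {p[-1]:.1f}")

System concepts appear directly in such a sketch: setting the birth rate equal to the death rate produces equilibrium, and making either rate depend on the stock itself introduces a feedback loop.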

The CoreModels leadership team, including the project director and eleven Maryland teachers selected as the center directors (CDs) and supporting teachers (STs) of three geographically distributed CoreModels centers, refined and piloted relevant computer models and activity packets during the 1997-1998 school year. Over fifty additional participating teachers (PTs) tested these activities while learning about modeling at summer workshops. They implemented improved versions with their students during the 1998-1999 and 1999-2000 school years. Teacher support included content instruction, guided practice with computer skills, ideas for engaging students, and discussion of strategies for addressing student difficulties. In addition to conducting traditional summer and Saturday workshops, MVHS teacher leaders visited their peers on site to help with their first implementation attempts and for subsequent debriefing sessions. Links to CoreModels curriculum pages for physics, biology, earth science, and chemistry were added to the MVHS Web site. In September 2000, MVHS CoreModels was designated one of five “promising” technology projects by the U.S. Department of Education.

The Research

Funding for CoreModels was provided by the National Science Foundation through the Research on Education, Policy and Practice (REPP) program. As a research project, the CoreModels Project has investigated two questions. Can computational modeling activities help students achieve core learning goals? In order for such activities to help students, they must actually be implemented in classrooms. Thus the CoreModels vision stressed a high rate of effective implementation. This led to our second question. Can teachers support their peers in implementing these activities? Very preliminary data suggest that common principles of good instruction, such as clear objectives and prompt teacher feedback, are important in fostering student learning with modeling activities. When it occurs, peer support can help teachers take charge of their own learning; however, peer collaboration and other forms of teacher professional development must be built into the school day.

Peer Collaboration

As described above, center directors and supporting teachers met as a group several times each quarter during the first year of the project. Directors also made appointments to observe supporting teachers enacting activities or invited them to visit the director’s classroom. In addition to working out problems with the modeling activities, we were piloting the peer support component of the project. Supporting teachers understood that they would take on a mentor role during the next year. These twelve individuals (the project director, CDs, and STs) were highly committed to the project and were generally able to work through the difficulties involved in an assigned mentorship. Since the CDs were released half-time, there was some flexibility in scheduling meetings with those they mentored. The directors also had the luxury of long phone calls during year one to support each other in mentoring the supporting teachers.

The second year began with a carefully designed program of peer support, which involved pairing supporting teachers with participating teachers and encouraging regular classroom visits and discussions between each ST/PT pair. Because pairs were originally matched across schools, logistical challenges (travel time, and the willingness or ability to miss one’s own class time to visit another teacher’s class) played a major role in keeping most of these relationships from developing. Unlike the center directors, who had a reduced class load, the STs had to take professional leave (paid for by the project) and arrange for a substitute. In addition, supporting teachers were uncomfortable in their role as “mentors,” especially since many Maryland districts had established mentoring relationships to help under-performing teachers. For this reason, teachers were generally unwilling to comment on or critique one another’s practices, which they understood to be a primary purpose of observing one another’s classrooms.

At the beginning of the second semester, directors met with supporting teachers to make mid-course corrections to the peer support paradigm. One teacher commented:

“The peer support portion of the project has been and continues to be a struggle. The association of classroom visits with evaluations by supervisors and administrators appears to be deeply ingrained. Teachers seem to be receptive to workshop-type sessions but less amenable to having other teachers visit. Most PT’s seem to be anxious to collaborate. Workshops and regional meetings have been productive for all parties involved. In this setting, PT's seem to be more relaxed, creative, and analytical. Support seems to work better in a group situation unrelated to a classroom, more of a brainstorming session.”

The second-year evaluation report hypothesized that the supporting teachers might not feel capable in their role as mentors. However, another teacher replied that he was comfortable in his ability to support participating teachers but had not found a way to communicate with them. Several of the six participating teachers he was assigned did not return email. He said, “As a peer mentor, I do not believe that I should force myself upon the participating teachers. On the other hand, there is no way to know how the project is being implemented in the classroom if the participating teachers are not observed.”

The mentors suggested several reasons for the lack of response from the new teachers:

  • Teachers expect to be autonomous; another person in the room is an invasion of privacy.
  • Teachers do not want to “burden” the supporting teacher without a significant need.
  • The summer experience went well; participating teachers have everything they need.
  • PTs do not invite others to observe since they cannot plan the computer use in advance.
  • PTs don’t recognize the need to document modeling activities in their classroom.
We realized that we had to reconsider interpersonal factors and explain the purpose of peer collaboration more carefully to all project participants. Instead of beginning visits to schools with classroom observation, relationships should develop first through pre-observation planning visits. Teachers might also refocus classroom observation with the ST acting as a helper and the CD observing both. Leaders also suggested post-implementation discussion of student difficulties and after school get-togethers to work on a new model or on assessment questions to supplement the modeling activity.

The main difficulty with the peer support paradigm was scheduling visits between supporting and participating teachers. Center directors did visit participating teachers, and they were extremely successful in working with teachers within their own schools. We built on this success during the third year by accepting additional teachers from the schools already participating.

According to Friedman and Culp (2001), “What we did find was that teachers gradually shifted to intra-school, more informal forms of peer support, and the program followed the teachers’ lead and instituted cross-discipline, within-school, team-oriented peer support structures during Year 3 of the program. This model seemed to function more productively for teachers.”

For example, the five teachers who joined the project from a single school during year three provided a critical mass of interest and know-how in the school. In addition, the center director visited so often that she was considered an “adjunct faculty member.” Anxiety levels were reduced when teachers saw the director planning with colleagues and teaching as well as observing in their classrooms.

Although they may not have considered themselves mentors, some of the supporting teachers demonstrated considerable leadership ability. The STs were invaluable in facilitating small-group discussions at district quarterly meetings. They developed discussion guides and other ways of providing structure and focus without inhibiting the full range of discussion issues. In addition to taking on increasingly prominent leadership roles within the project, they began outreach efforts within their schools and school districts, introducing teachers outside the project to modeling. Some became deeply involved in collaborating with other teachers on modeling curriculum issues. According to Friedman and Culp (2001), “The professional growth of this subset of program participants resulted in an expanded core group of teachers who were effectively leading the program and providing guidance to the larger cohort of teachers, strengthening an already strong group of teacher leaders and contributing to the persistent, gradual progress of the level and content of teachers’ discussion of modeling over the life of the program.”

Effect of Modeling on Student Learning

Each CoreModels activity was designed to meet the Maryland High School Science Core Learning Goals as well as the AAAS Project 2061 Benchmarks. At the same time that MVHS teachers were implementing these modeling activities, the Maryland State Department of Education (MSDE) was field-testing the Maryland High School Assessment (HSA), the final piece of the state’s systemic reform plan. The HSA includes both selected response items (e.g., multiple choice) and constructed response items, which require the analysis, synthesis, and written expression of ideas. There were early indications that CoreModels activities were effective in supporting student learning. One teacher was thrilled with his students’ first-in-the-district results on an early test of the biology HSA. Another teacher received accolades for the outstanding results of his physics students on the Force Concept Inventory, administered as part of his concurrent participation in the Arizona modeling project.

MVHS leaders decided that constructed response items scored using the MSDE rubric would be particularly relevant to teachers and to state leaders. By using the constructed response mode to measure student understanding gained through modeling, we would also be helping teachers provide practice for their students. Teachers reported that, as a result of using the materials, they saw improvement in their students’ ability to interpret graphical representations of data meaningfully and to understand how well a model represents real-world behavior; we sought to determine whether these observations were actually measurable. Two open-ended questions were designed for each activity in biology and physics. The first question presented the student with a graph produced by the STELLA model used to investigate a recently studied topic and asked the student to explain its meaning. The second question asked the student to evaluate the ability of the model to represent real-world behavior. Both questions would be scored using the 5-point Maryland High School Science Rubric, the same rubric to be used on the High School Assessment exams. In the fall of 1999, we asked for teachers who could meet the following conditions:

  1. Cover three MVHS activities during the second semester and administer an assessment after each one.
  2. Send the original assessments to MVHS and keep a copy to return to their students.
  3. Score the copies according to the Maryland High School Science rubric.
  4. Return the scored copies to the students and discuss the answers before administering the next assessment.
Eleven biology teachers and four physics teachers responded to our request. Teachers with semester-long block classes were more likely to participate since they were beginning with new students. The teachers’ experience with STELLA ranged from 1 to 4 years, and the classes ranged from Basic Skills to Advanced Placement. Eleven schools were represented in the study: six rural, four suburban, and one urban. In the summer of 2000, these teachers met with project leaders to score the assessments formally after general training and practice scoring for each topic. Each question was blind-scored by two teachers, with a third teacher resolving discrepancies.
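
As a rough illustration of the scoring procedure just described, the Python sketch below reconciles two blind rubric scores and falls back to a third score when they disagree. The 0-4 score range and the rule that the third teacher’s score becomes final are assumptions made for this sketch; the report does not spell out the resolution rule.

# Illustrative sketch of the double blind-scoring step described above.
# Assumptions (not from the project report): rubric scores run 0-4, and the
# third teacher's score is final whenever the first two scorers disagree.
# In practice a third teacher would be asked to score only when needed;
# here the value is simply passed in to keep the sketch short.

def final_score(score_a: int, score_b: int, third_score: int) -> int:
    """Return the final rubric score for one student response."""
    if score_a == score_b:
        return score_a            # the two blind scorers agree
    return third_score            # otherwise the third teacher resolves it

print(final_score(3, 2, third_score=3))   # -> 3
print(final_score(4, 4, third_score=2))   # -> 4 (third score not used)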

Biology Results

Mean scores on the graph interpretation question dropped significantly (p<0.01) between the first and third quizzes, while mean scores on the modeling heuristic question rose significantly (p<0.01). The drop in graph interpretation scores may be attributable to the fact that the third quiz was given at the end of the school year, when student motivation was low. The results above do not include several teachers who were not able to give the third quiz. When the entire group of teachers was considered, there was an increase in graph interpretation scores from quiz 1 to quiz 2 that approached significance (p=0.055). The results for the question concerning the ability of a model to represent real-world behavior are more promising: even though the majority of third quizzes were given late in the school year, the mean for quiz 3 was higher than the mean for quiz 1.
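
The report does not name the statistical test behind the p-values above. As one plausible reading, the sketch below runs a paired t-test on matched quiz 1 and quiz 3 rubric scores; both the choice of test and the scores themselves are assumptions, with the data being invented placeholders rather than project results.

# Hedged sketch of a quiz 1 vs. quiz 3 comparison like the one reported above.
# The rubric scores below are invented placeholders, and the paired t-test is
# an assumption; the report does not state which test produced its p-values.
from scipy import stats

quiz1 = [3, 2, 4, 3, 2, 3, 1, 4, 2, 3]   # hypothetical 0-4 rubric scores, quiz 1
quiz3 = [2, 2, 3, 2, 1, 3, 1, 3, 2, 2]   # same (hypothetical) students, quiz 3

t_stat, p_value = stats.ttest_rel(quiz1, quiz3)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A p-value below 0.01 would correspond to the "significant" changes reported.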

Is student performance on a question type related to teacher comfort with that question type? Graph interpretation is an area in which many biology teachers themselves have difficulty using mathematically accurate terminology. The teachers recognize this weakness in their backgrounds and are eager for more opportunities to practice graph interpretation skills with their students. Although the modeling activities provide that practice, teacher reinforcement in classroom discussions may need to improve before student performance improves steadily. We cannot expect to see student gains if teachers are not clear in their own expression of the meaning of graphs.

Question 2 requires a written description of the similarities and differences between the model and the real world. Although teachers were initially uncomfortable with this question, we know from anecdotal evidence that they do become more comfortable and increase their focus on model interpretation skills after the administration of the first quiz. Therefore, the large increase in mean scores from quiz 1 to quiz 2 is at least partially attributable to increased focus on model interpretation.

Physics

We expected student achievement on both quiz questions to increase as exposure to modeling activities increased. The data did not support this hypothesis: the means for the graph interpretation question went down, while the means for the model interpretation question went up. Scores on question 1 decreased between quizzes 1 and 2 for some classes and between quizzes 2 and 3 for others. To explore possible reasons for this discrepancy, we looked at the content of the quizzes. Question 1, which involved graph interpretation skills, appears to be more sensitive to the effect of content than question 2. One teacher gave eight assessments (n=24), providing the opportunity to look at the interplay between exposure to modeling activities and the difficulty of the specific topic being covered. Student performance increased significantly on both questions 1 and 2 over the first four quizzes, which covered kinematics-related topics. The concept of force was introduced in the fifth modeling activity, and on quiz 5 the student means dropped dramatically on question 1, but less so on question 2. The Elevator activity, the topic covered on quiz 6, reinforced the concept of force, and the student means on both questions 1 and 2 increased significantly. It therefore seems likely that the introduction of a new concept plays an important role in student assessment results, regardless of the number of previous exposures to modeling activities.

Although the increase in content difficulty partly explains the drop in scores in moving from kinematics to dynamics, other factors cannot be dismissed. Question 2 required a written description of the ways in which the computer model was similar to and different from the phenomenon it was meant to represent; the results there are more promising.

Summary

These results suggest that when teachers are supported by a peer-driven professional development program, they are able to integrate computer-based modeling within a range of curricular contexts to improve student understanding of some of the core scientific concepts underlying modeling as a scientific practice. Students’ abilities to interpret visual representations of data seem more resistant to improvement, especially when measured over multiple content areas.

According to Friedman and Culp (2001), “One conclusion that can be inferred from these findings is that central modeling concepts, such as the heuristic relationship of models to the physical world, seem to be relatively transferable concepts that can be elaborated across curricular content areas, while interpretation of visual representations of data remains, at least in this context, a more content-dependent skill that is not easily transferred from one content area to another. These findings demonstrate that CoreModels was successful in building teachers’ understanding of and ability to teach about modeling, not only as a way to explore specific content areas but as a particular conceptual approach to the task of scientific inquiry.”

References

American Association for the Advancement of Science. (1993). Benchmarks for science literacy. New York: Oxford University Press.

College, L., & Beranek, B. (1992). Setting a research and planning agenda for computer modeling in the pre-college curriculum (Final Report: NSF RED-9255877). Arlington, VA: National Science Foundation.

Friedman, W., & Culp, K. (2001). Evaluation of the CoreModels project: Final report. New York: Education Development Center/Center for Children & Technology.

Krajcik, J. S., Blumenfeld, P. C., Marx, R. W., & Soloway, E. (1994). A collaborative model for helping teachers learn project-based instruction. Elementary School Journal, 94, 483-497.

Loucks-Horsley, S., Stiles, K., & Hewson, P. (1996). Principles of effective professional development for mathematics and science education: A synthesis of standards (NISE Brief Vol. 1, No. 1). Madison, WI: National Institute for Science Education, University of Wisconsin-Madison.

National Research Council. (1996). National science education standards. Washington, DC: National Academy Press.