Assessments of the future?

In September I served on a panel providing input to the Board on Testing and Assessment (BOTA) at the National Academy of Sciences. BOTA and the Board on Science Education (BOSE) are collaborating on a report that will outline recommendations for future science assessments to support the vision of the Framework.

During my presentation, I commented on the viability of technology-based and curriculum-embedded assessment exemplars showcased in the meeting.  I also addressed practical, technical, and psychometric challenges that might arise with the implementation of these approaches.

Whatever assessments we use in the future, we will need to carefully consider their purpose. Rosemary Reshetar, Brian Reiser, and others outlined for the BOTA some of the purposes of the assessment tools they showcased during the daylong meeting. Looking forward, states will need to carefully examine assessment instruments to ensure that they are aligned to the vision of the Framework for K-12 Science Education and meet formative and/or summative reporting needs.

Assessment should not drive our system, BUT it can help us break some molds. For example, the integration of technology-based assessments like the ones described during the BOTA/BOSE meeting (and like the recent NAEP Science Assessments) will push the system to increasingly use technology to enhance learning. Phil Brookhouse and I will be doing a series of workshops this fall to outline how the MLTI tools can be used to enhance integration of the Science and Engineering Practices. Maine is well positioned for this integration.

The blueprint of the Framework makes it incumbent on us to produce assessments that look and report differently. The pending BOTA report and its findings will be critical for moving the vision of the Framework forward.

I will let you know when the report is released. For more information, you can watch the proceedings of the BOTA/BOSE meeting.

Word knowledge, the Framework, and the Common Core

Christine Anderson-Morehouse

Last week, Christine Anderson-Morehouse presented a session on Word Knowledge at the Department’s K-12 Summer Literacy Institute: Transition Planning for the Common Core State Standards. Christine is both a longtime science education consultant and the director of the Midcoast Professional Development Center, where her work has included supporting several partner schools in a statewide, five-year Maine Content Literacy Project grant. I am delighted that she agreed to share her insights with us in the entry that follows… Thank you, Christine.

Acquiring Word Knowledge—A Key to Science Understanding

Words are not just words. They are the nexus—the interface—between communication and thought. What makes vocabulary valuable and important is not the words themselves so much as the understandings they afford.  (Common Core State Standards, Appendix A)

Language is at the core of the Practices in the Framework for K-12 Science Education, especially Practice 6 (Constructing Explanations and Designing Solutions), Practice 7 (Engaging in Argument from Evidence), and Practice 8 (Obtaining, Evaluating, and Communicating Information).

The Common Core State Standards for ELA/Literacy in History/Social Studies, Science, and Technical Subjects (CCSS) make direct connections to science. We see this link especially in the Language Standards, which are intended to be embedded across all of the other CCSS strands: reading, writing, listening, and speaking. Although we in STEM education might not think of ourselves as literacy teachers, it’s imperative that we incorporate effective word work into our daily instruction.

I am fascinated by the related and complementary research into language and vocabulary instruction described in the Common Core documents (for example, CCSS Appendix A, “Acquiring Vocabulary,” p. 32).

Academic Language and Background Knowledge:  Which Words? 

In science, we know that students come to the classroom with preconceptions about how the world works. It’s our job to provide scaffolded experiences (both hands-on and through reading) that engage students’ existing understandings and lead them gradually to new concepts so that learning “sticks.” So, too, must we scaffold student use of language. All students, and especially struggling students, need explicit support if they are to understand and use Academic Language: both the “Tier 2” words (general academic words such as “contrast,” “analyze,” and “note” that are frequently used in school but rarely explicitly taught) and the “Tier 3” words (domain-specific terminology that represents either a brand-new concept, such as “density,” or terminology that extends existing conceptual knowledge, such as “raptor” applied to an already-understood concept of “bird”). Our choice of Tier 3 words should assuredly be based on the standards in science.

In selecting the general academic words that we’ll take the time to teach, we’d be wise to work with our colleagues across the curriculum to identify the Tier 2 words to emphasize at each grade level in each subject. It is widely accepted among researchers that differences in students’ academic vocabulary are a key factor in disparities in academic achievement, and that students struggle in school or drop out of college not because they can’t read (decode letters) but because they struggle to unlock the meaning of these more general academic words. The Academic Word List (AWL) consists of 570 of the most common terms used across disciplines, organized by frequency of use. An excellent book by Amy Benjamin, Vocabulary at the Center, includes similar word lists organized by categories of meaning. Either of these resources, combined with the descriptions of science practices in A Framework for K-12 Science Education, can support cross-disciplinary discussions and decision-making about which words we’ll emphasize in the science classroom.

Effective Vocabulary Instruction:  How? 

Historically, there has been an emphasis on the “assign/define/test” mode of studying word lists, often the bold words from textbook chapters. Research now tells us that words won’t stick when instruction follows this practice, because it involves little in the way of student engagement or the kinds of experiences that build long-term memory. Rather, students should have guidance so that they can create their own linguistic and non-linguistic representations and gradually shape their understandings of word meaning. To support this learning, teachers must provide multiple and varied exposures over time, with repeated opportunities for students to interact with one another about the words they’re learning. Learning word parts (roots, prefixes, and suffixes) and playing games with words can increase a student’s probability of academic success.

Two valuable, general resources about research-based vocabulary instruction are an article by Robert Marzano about a six-step vocabulary process and an entry by Susan Ebbers on the Vocabulogic blog (scroll down to a list of “effective and engaging vocabulary practices”). Specific to science, I’ve enjoyed using practical classroom strategies such as the Frayer Model and Synectics, described along with many others in Page Keeley’s book Science Formative Assessment.

Strengthening STEM learning requires the integration of literacy as described in the standards of the Common Core and the Practices of the Framework.

Building assessments for the future

Last week I worked with science educators from around the state to review assessment items that the MDOE may include in future MEA Science assessments at grades 5, 8, and 11. These groups of teachers, called Item Review Committees (IRCs), represent one step in a two-year development process that assessment items move through before the MDOE can include them in the state science assessment.

The teacher voice in IRCs is invaluable.  The teachers who participate help the MDOE and our assessment vendor, Measured Progress, ensure that items:

  • are accurate and, in the case of multiple choice items, have only one correct answer;
  • are clearly and appropriately worded for the grade level;
  • are aligned to the performance indicators and descriptors of the Maine Learning Results;
  • avoid bias and provide access; and
  • are correctly coded for Depth of Knowledge*.

I have been thinking a LOT about what future science assessments aligned to the NGSS might look like, and I eagerly await the National Academies report that will address this topic. The National Academies expect to have this report completed in 2013.

Assessments aligned to standards that combine practices, crosscutting concepts, and core ideas will certainly have to look different from current assessments in some ways. For example, I am excited by what I see as the need for more computer-enhanced simulations in future assessments. The recent NAEP Science Assessment gives us a window into what this might look like and into the important opportunity these simulations can provide for understanding student thinking. I also see the need for more constructed-response items that probe students’ ability to make and defend claims and to connect their arguments to core ideas in science.

I am grateful that Maine has continued to include constructed-response items in our MEA Science Assessment; many states use a multiple-choice-only format. Many states also do not release assessment items to the public. Maine currently releases 50% of the common items annually; budget constraints forced the MDOE to cut the percentage of released items from 100% to 50% several years ago.

As a small, rural state acting alone, Maine certainly doesn’t have the resources to develop the rich science simulation assessments we would like. However, the MDOE and Measured Progress are collaborating, within our current assessment design, to make constructed-response items more forward-thinking. We already incorporate graphs, tables, and representations into MEA questions, and we will be field testing items that ask students to make and defend claims and to use models to predict and describe science ideas related to the Physical Setting and the Living Environment. These efforts will help us better understand the opportunities and challenges of the science assessments of the future.

We fully expect that in the future states will band together in consortia for science assessment. It is my hope that, working together, states will be able to pool resources to develop assessments that reflect the vision for teaching, learning, and assessment we see in A Framework for K-12 Science Education.

How fast will the change take place? As I described above, assessment development takes at least two years, so if the Legislature adopts the NGSS in June of 2013, it would in theory take a minimum of two years to develop items aligned to the NGSS. However, there are complications. For example, with potentially many states collaborating, development could take longer. In addition, states will be trying to blend this work with the rollout of the CCSS assessments in 2014-2015, another complicating factor.

What does this mean? It means we have even more reason to understand the Framework. How much reading in the Framework have you completed this summer?

*Depth of Knowledge is a coding for the complexity of thinking that a student must use to respond to an item. This is very different from the difficulty of the item: item difficulty is determined by the proportion of students who answer the item correctly.
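For readers who like to see the arithmetic, here is a minimal sketch of how classical item difficulty can be computed from scored responses. The sample data and names are hypothetical, and this is my illustration, not an MDOE or Measured Progress procedure.

    # Item difficulty is the proportion of students answering an item
    # correctly (often called the item's p-value). Depth of Knowledge,
    # by contrast, is a human-assigned coding of cognitive complexity
    # and cannot be computed from response data.

    # Hypothetical scored responses: each inner list is one student's
    # results on three items (1 = correct, 0 = incorrect).
    responses = [
        [1, 0, 1],
        [1, 1, 0],
        [0, 0, 1],
        [1, 0, 0],
    ]

    num_students = len(responses)
    num_items = len(responses[0])

    for item in range(num_items):
        correct = sum(student[item] for student in responses)
        print(f"Item {item + 1}: difficulty p = {correct / num_students:.2f}")

    # Output: item 1 has p = 0.75, item 2 has p = 0.25, item 3 has p = 0.50.
    # A hard item (low p) can still be a low-DOK recall item, and an easy
    # item can demand high-DOK reasoning.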

NAEP data suggest need for implementation of Framework vision

The National Assessment of Educational Progress (NAEP) has released the results of the interactive science assessment administered nationwide to students at grades 4, 8, and 12. Along with the results, NAEP has released three of the Hands-On Tasks (HOTs) developed to assess students’ ability to plan and conduct scientific investigations, reason through complex problems, and apply scientific knowledge to real-world problems. NAEP is also releasing all nine of the Interactive Computer Tasks, which simulate natural and laboratory experimentation.

The NAEP data show three major findings that extend across the grade levels.

  • “Students were successful on parts of investigations that involved limited sets of data and straightforward observations of that data.
  • Students were challenged by parts of investigations that showed more variables to manipulate or involved strategic decision making to collect appropriate data.
  • The percentage of students who could select correct conclusions from an investigation was higher than for those students who could select correct conclusions and also explain their results.” (NAEP, 2012)

Students who participated in the NAEP also reported having limited opportunities in science classes to write explanations and explain results. While these data are correlational, not causal, they certainly invite some pondering. Would standards that place a priority on the use of models and on developing evidence-supported arguments lead to more opportunities in classrooms to develop reasoning skills?

The vision of the Framework seems like the right support at the right time for developing higher-order thinking and critical reasoning skills in Maine learners.

And what about assessments?

The most common follow-up question during a presentation on the Next Generation Science Standards is: What about the assessments? The answer is easy: they will have to be different from most assessments we currently have, but other than that, no one really knows for sure.

The good news is that the National Academies’ Board on Science Education (the same group responsible for the development of A Framework for K-12 Science Education) is in the beginning stages of a report that will address the implications of the Framework for future science assessment. The National Academies expect to release this report shortly after the completion of the Next Generation Science Standards.

What we do know is that assessment almost always lags behind standards development by two years: one year to develop the items and one year to field test them. Achieve hopes to have the standards completed by December of 2012. Assuming Achieve meets this goal, the Department would then need to bring the standards before the Legislature for approval. It seems VERY optimistic to think that any assessments aligned to the Next Generation Science Standards could be ready by the spring of 2015. Again… no one knows, and it is too early to speculate further.

The best thing to do is to get familiar with the Framework and take some time to provide feedback on the NGSS. For those of you who are forming groups to provide input, NSTA has developed an NGSS Study Group Guide, complete with agendas and facilitator notes!