Content as the Focus

When technology is seen as a tool for instructional delivery, communication, or information seeking, student learning is measured by mastery of content objectives rather than by mastery of technology use itself. Strategies for assessing student learning through technology-assisted processes and technology-produced products should be consistent with strategies used for assessing learning through more traditional means. This is especially true if the existing strategies already address the student-centered, problem-based, interdisciplinary learning that is characteristic of constructivist learning environments. Constructivist learning integrates disciplines, skills, and strategies so that students can develop their own understandings. It challenges teachers to assess student learning in a format that most closely matches how the information was learned, as well as with formative methods that give students feedback with which to inform further learning.

Student learning through collaborative team projects, which often culminate in products produced with technology tools such as multimedia presentations, cannot adequately be assessed using traditional methods. More appropriate strategies for assessing growth in content knowledge within project-based, integrated learning activities include conferences with students, anecdotal records, and observation (Kumar & Bristor, 1999). Rubrics can strengthen these assessments. Simkins (1999) defines rubrics as “sets of formal guidelines we use to rate examples of student work, usually presented in the form of a matrix with performance levels in the top row and performance dimensions along the left column”.

   TABLE 15.1  Assessment Rubric Example

Content
   Exemplary:  Information is in a logical, intuitive sequence; information is clear, appropriate, and accurate.
   Expected:   Information is in a logical sequence; information is largely clear, appropriate, and accurate.
   Adequate:   Information is in some logical sequence; some information is confusing, inappropriate, or inaccurate.
   Inadequate: Information is not in a logical sequence and is confusing, inappropriate, or inaccurate.

Written work
   Exemplary:  Maintains clear focus and logical organization; establishes a tone appropriate to the intended audience; clearly conveys complex ideas with ample supportive details; has no misspellings or grammatical errors.
   Expected:   Maintains focus and displays organization; conveys complex ideas with supportive details; has fewer than two misspellings and/or grammatical errors.
   Adequate:   Attempts to maintain focus and organization, but occasionally is not clear; simple ideas are conveyed well, but more complex notions are not developed and supported with details; has multiple misspellings and/or grammatical errors.
   Inadequate: Frequently loses focus and organization; ideas are not conveyed clearly or supported with details; has significant spelling errors and/or grammatical errors.


Well-constructed rubrics assist teachers in making decisions about student skills, competencies, abilities, and attitudes within complex learning processes and products (see Table 15.1). Students can be introduced to rubrics, and even students with learning disabilities can focus more precisely on learning when they know in advance what will be expected of them in an activity (Jackson & Larkin, 2002). Student teams can broaden their understanding by assessing peer projects according to rubric criteria. Simkins (1999) provides these tips for constructing effective assessment rubrics:

  • Construct rubrics that are not too task specific—more general rubrics can be used again for other projects.
  • Construct rubrics that are not too general, because they will lack the specificity required to make appropriate distinctions.
  • Avoid highly detailed criteria that become more of a checklist than a rubric.
  • Use a limited number of dimensions, or main areas of focus, so that the rubric targets the main learning priorities of the project.
  • Use key criteria that matter the most about each dimension of the project.
  • Use measurable criteria that can be counted or ranked.
  • Select descriptors that clearly describe traits that should be present in student work.
  • Use four performance levels that make fine enough discrimination, yet are not too divisive.
  • Maintain an equal interval between levels, so that the distance between the highest and next-highest levels equals the distance between the lowest and next-lowest.
  • Involve students in creating rubrics so they will clearly understand what the expectations are, and “buy in” to using them.
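To make Simkins's matrix definition and the tips above concrete, here is a minimal sketch in Python of a rubric as a data structure: performance levels across the top, dimensions down the side, with equal-interval points per level. All names and point values are hypothetical illustrations, not part of the chapter.

```python
# Sketch of a rubric as a matrix: performance levels x dimensions,
# condensed from Table 15.1. Descriptors are abbreviated for brevity.
LEVELS = ["Exemplary", "Expected", "Adequate", "Inadequate"]  # four levels, per the tips

rubric = {
    "Content": {
        "Exemplary":  "Logical, intuitive sequence; clear, appropriate, accurate.",
        "Expected":   "Logical sequence; largely clear, appropriate, accurate.",
        "Adequate":   "Some logical sequence; some confusion or inaccuracy.",
        "Inadequate": "Not in logical sequence; confusing or inaccurate.",
    },
    "Written work": {
        "Exemplary":  "Clear focus and organization; no spelling/grammar errors.",
        "Expected":   "Maintains focus; fewer than two spelling/grammar errors.",
        "Adequate":   "Occasionally unclear; multiple spelling/grammar errors.",
        "Inadequate": "Loses focus; significant spelling/grammar errors.",
    },
}

def score(ratings):
    """Convert per-dimension level ratings into points, keeping an equal
    interval between levels (Exemplary=4 down to Inadequate=1)."""
    points = {level: len(LEVELS) - i for i, level in enumerate(LEVELS)}
    return sum(points[level] for level in ratings.values())

# A teacher (or peer team) rates one project on both dimensions:
total = score({"Content": "Expected", "Written work": "Exemplary"})
print(total)  # 3 + 4 = 7
```

Keeping measurable, equal-interval levels in the data structure is what lets ratings be summed or compared across projects, which is the point of the "measurable criteria" and "equal interval" tips.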

Numerous rubric examples can also be found on the web (e.g., Kathy Schrock’s Guide for Educators). Practical assessment strategies using technology tools are described in the final section of this chapter.