Tuesday, April 15, 2008

Patterns & Memory and SPARK Podcasts

Dr. Ginger Campbell, the host and creator of The Brain Science Podcast, presents an excellent discussion of Elkhonon Goldberg's The Wisdom Paradox (click Listen to Older Episodes). Readers with an interest in the Architecture of Learning instructional design model will recognize the influence of this important work. I highly recommend this episode! The show notes for this episode can be found here.

I plan to post some insights from John Ratey's SPARK: The Revolutionary New Science of Exercise and the Brain (click Listen to Older Episodes) in a future post. You can hear Dr. Campbell's interview with Dr. Ratey in another episode of The Brain Science Podcast. Show notes for this episode can be found here.

Click the links above to be taken directly to the website where these episodes can be downloaded. Both are interesting and loaded with information. You can also find these episodes and subscribe to the Brain Science Podcast at iTunes. Subscription is FREE! Enjoy!

Saturday, April 12, 2008

Writing Achievement? Part 2

In the previous posting, I questioned the “encouraging” student writing achievement on the latest National Assessment of Educational Progress—aka, the nation’s report card. With only about ⅓ of eighth grade students and about ¼ of high school seniors achieving proficiency, I’m puzzled by the seemingly excited results. Yes, progress is progress, but using these results to show that “the death of writing has been greatly exaggerated,” as the vice chairwoman of the board overseeing the testing proclaimed (Dillon, 2008), seems akin to saying that because the tree is only ¾ dead, it is no exaggeration to call it healthy.

However, I also questioned what the test actually tests and what we actually teach. Let’s begin with the test, as it mirrors the main error I see in much of our instruction.

Most achievement tests, like the NAEP, give students a prompt—usually narrative, informative, or persuasive—and a confined period, often 20-30 minutes, in which to “write.” The results are then assessed by an “expert” or a panel who determines an achievement level.

Is this an authentic assessment of writing? When professional writers and editors discuss writing, they emphasize revising as THE key to quality writing. Just this morning, I heard an author interviewed on NPR who said writing was a matter of getting a voice in your head and then revising extensively. In A Writer’s Coach, editor Jack Hart spends about 15% of the book discussing matters of process and drafting and the rest of the book explaining all the details that need attention during multiple revisions.

This reality-based perspective reveals the weakness of the NAEP’s and our instruction’s approach to writing. We engage students in drafting and evaluate the results as if they were writing. Often, our approach lacks even sound pre-writing practices, such as developing a vision for what will be written. (And even this step, visioning, requires multiple actions to ensure the draft heads in the right direction.) In testing situations, the time allocated does not allow for much, if any, prewriting thought.

When I earned my undergraduate degrees, we were told that it was important for students to be writing frequently (i.e., daily) and to be producing vast amounts of written work, which we tended to proudly display—the more student writing on display, the better we must be teaching students to write. Right?

No, we were teaching students that writing meant creating a draft and calling it completed. Imagine you hired a builder to construct a new home, and when you arrived at the site several months later you found a pile of building materials. You question the builder who says, “But all the raw materials are here!” Having students draft frequently but rarely complete significant revisions engages students in piling up writing’s raw materials. A home will not emerge from the pile, and neither will authentic writing spring from just the draft.

I think one reason we take this approach is a lack of knowledge about revising writing. I know that in my case, even with a degree in English, I really did not understand revision processes until recently. I would tell students to revise their writing without any additional guidance or instruction. As a result, revising became little more than proofing, checking mostly for spelling errors.

So, it seems we have two “issues” to face. First, our current assessments of student writing do not actually measure writing as professionals working in the field describe it. Second, we need to engage students in significantly more revising of writing and provide the instruction necessary to equip students to engage in revision. As author and editor Susan Bell says, “The debate continues on whether you can teach someone to write; I know, unequivocally, that you can teach someone to edit” (p. 1). We need to teach students to edit, to revise and improve their drafts to produce authentic writing.

The NAEP likely reveals something about student achievement, but viewing it as an assessment of authentic writing seems off target. The best assessment of student writing capacity may be the teacher who instructs and observes at every stage of the process, noting the prewriting activity and, especially, the revision abilities students apply. No formalized test with confined, prompted drafting will ever produce writing samples that reveal authentic writing ability. For such assessment, we need teachers with deep understandings of the writing process and the collection of actions present in each step, and learning activities that emphasize revision, revision, revision. Currently, our students draft too much, and actually write too little.

I welcome your insights! Feel free to send me an email or, better yet, post a comment!

Bell, S. (2007). The artful edit: On the practice of editing yourself. New York: W. W. Norton.

Dillon, S. (2008). In test, few students are proficient writers. The New York Times, accessed April 3, 2008 via http://www.nytimes.com.

Monday, April 7, 2008

Writing Achievement? Part 1

The New York Times reported last week that despite the fact that about two-thirds of America’s eighth grade students and about three-fourths of high school seniors failed to reach proficient writing levels on the National Assessment of Educational Progress, the federal government is “encouraged by the results” (Dillon, 2008). You read that correctly. With a minority of our students achieving proficiency in writing, the federal government finds cause for celebration.

The federal government’s optimism rests in the contrast of the results with other indicators of student achievement. For some time, the general conclusion has been that student writing capacity is in free-fall. However, the NAEP results show “modest increases in the writing skills of low-performing students” (Dillon, 2008). This is hailed as success, even though the performance of high-performing students remains unchanged.

What’s going on here? Are we so desperate for good news that even a failure to achieve proficiency is hailed as a victory? Certainly the gains achieved by low-performing students are moving in the right direction, but in the four years since the last NAEP, those gains are merely modest while other scores remained unchanged.

And the celebrated “gains” are highly questionable. In the same report, the Times cites a 2006 survey of college professors that suggests a large majority of college students possess “limited writing skills” (Dillon, 2008), and a 2003 study that found American companies are spending billions of dollars on remedial training for employees—some “new hires straight out of college” (Dillon, 2008).

So, we have conflicting viewpoints. We are told to be excited by the moderate gains on the recent NAEP, yet we must remain mindful of the continued challenge we face in developing student writing capacity.

To me, even the NAEP results make a stronger case for being mindful of the challenge than excited by the gains. With nearly 75% of our seniors still lacking proficiency, the challenge far outweighs a reason for celebration.

However, some fundamental questions remain. Is the NAEP measuring writing ability? If we accept that it is measuring writing ability, what does it mean for instruction? And why are our current instructional efforts yielding such poor results?

I’ll address these questions in my next posting.

Dillon, S. (2008). In test, few students are proficient writers. The New York Times, accessed April 3, 2008 via http://www.nytimes.com.