
It’s often said that young people today don’t know how to write or, at least, cannot write as well as their parents’ generation did. (Not that their parents were great writers, either.) So how bad is the problem?

Pretty darn bad, and in specific ways that have now been evaluated by the people behind NoRedInk, a Web-based platform that attempts to help students in sixth through 12th grade improve their writing and critical-thinking skills through an adaptive, Internet-based curriculum. The material, including an evidence-based argument program, challenges students on a number of different skills, such as making logical deductions from a set of facts and recognizing vague language.

NoRedInk was started about five years ago by Jeff Scheur, a National Board Certified Teacher who taught — and graded high school English papers — for eight years, then decided to find a way to help students beyond his own classroom. There are both free and paid versions of the platform, which he said is used in about half of the school districts in the country.

Scheur said in an interview that the company recently decided to drill down on seven proficiency areas where students run into trouble when writing and thinking critically about written material. From the more than 3 billion questions that students have so far answered on NoRedInk, the company analyzed responses from 213,284 students (see the demographic breakdown below) in sixth through 12th grade across all 50 states. The proficiency areas were:

• Evaluating Undeveloped and Undefendable Claims — Can students evaluate when claims are undefendable or too obvious to be debated?
• Identifying Arguments That Don’t Progress — Can students identify when arguments fail to advance?
• Making Logical Deductions — Can students make logical deductions based on evidence?
• Distinguishing Between Claims, Evidence, & Reasoning — Can students distinguish between claims, evidence, and reasoning?
• Analyzing Evidence — Can students tell when evidence is irrelevant, not credible, or not factual?
• Recognizing Vague Language — Can students detect and strip away vague or ambiguous language in arguments?
• Revising Wordiness — Can students detect and eliminate wordy, redundant, or unnecessary language from arguments?

The percentages of students who can do these specific things are low, as you can see below from the analysis.

“The reason why it is happening is prima facie,” Scheur said. “Kids don’t get strong modeling and they don’t get sufficient practice. They need strong modeling. They need practice.”

Large class sizes can make it difficult for teachers to assign a lot of long writing projects.

“I have a teacher in Michigan [who uses NoRedInk] with 38 students,” he said. “Imagine having five classes of 38. What approach do you take to grading those papers and giving feedback, and then asking for revisions and then evaluating those revisions? Teachers rush through and give cursory feedback, or they take their time to give real feedback, but that naturally limits the number of such assignments they can give over the year.”

And, of course, a lot of writing instruction in recent years has been geared toward getting students to perform well on standardized tests that either don’t require real writing or call for essays of several paragraphs.

Another issue, Scheur said, is where students get their information. Many get it from the Internet without evaluating it for accuracy.

“Democracy depends on an informed citizenry and the mutual commitment to reasoned rational argument,” he said. “Those are the underpinnings of a system where we make decisions together. So the fact that students are having trouble with this is concerning.”

Here’s the breakdown of the students and results in the NoRedInk analysis:

[Slides: demographic breakdown of the student sample and results by skill area]

This comes from NoRedInk’s analysis:

*EVALUATING UNDEVELOPED AND UNDEFENDABLE CLAIMS
Modern political discourse has veered away from substantive discussion. Some claims are either entirely self-evident or patently untrue, with no viable evidence to support them, making them inappropriate for substantive debate. Our curriculum presents students with a mix of claims (some worthy of debate, some completely undefendable, some entirely self-evident) and assesses whether they can differentiate among them.

Even when students have a 33% chance of guessing correctly, about half of them (47%) can’t identify when claims “aren’t defendable” enough for substantive debate.

-0-0-

*IDENTIFYING ARGUMENTS THAT DON’T PROGRESS

Another challenge in modern political discourse is the Twitterization of our culture: making clickbait or 140-character arguments and failing to advance them.

Candidates frequently repeat themselves to drive home a point, rather than digging into the actual subject matter.

NoRedInk tests this skill by giving students claims that are followed by mere restatements and seeing whether they can identify the problem.


We also give students a series of statements to see if they can drag in the ones that actually advance the argument.

Fewer than half of students (47%) can identify when the follow-on statement is a mere repackaging of the original claim, even in a multiple-choice format.

-0-0-

*MAKING LOGICAL DEDUCTIONS
We also assess whether students can tell when arguments leap to unjustified conclusions based on the evidence.

Students do only 2 percentage points better than a coin flip (52%) when asked whether a piece of reasoning logically connects claims and evidence.

-0-0-

*DISTINGUISHING BETWEEN CLAIMS, EVIDENCE AND REASONING
To construct logical, persuasive arguments, students need to be able to understand the separate roles of claims, evidence, and reasoning. NoRedInk creates persuasive arguments on interesting topics and measures whether students can identify the different parts of the argument.

Again, fewer than half of students (49%) were able to complete the task successfully.

-0-0-

*ANALYZING EVIDENCE

Among the biggest challenges of modern political discourse is the lack of evidence to defend one’s claims. Often, the “evidence” provided is not credible, not factual, or not relevant. NoRedInk’s curriculum assesses students on each of these dimensions.

Only about half of students (54%) can tell that these arguments are weak.

NOT CREDIBLE

Only 42% of students can tell that these arguments are weak.

Only about half (54%) of students can tell that these arguments are weak.

-0-0-

*RECOGNIZING VAGUE LANGUAGE

When individuals hide behind vague language, they can obscure the truth or present misleading information.

And here is an example of how our curriculum trains students to flag ambiguity:

Unfortunately, only 35% of students are able to detect vague or ambiguous language that’s used in arguments like this.

-0-0-

*REVISING WORDINESS

In order for students to advocate for themselves, it’s important that they be able to articulate their ideas clearly and crisply, shedding words that distract and detract from their arguments.

The last skill area that NoRedInk assessed is students’ ability to eliminate wordy, redundant, or irrelevant language that has the potential to weaken their arguments.

We found that only 33% of students are able to detect and remove language that qualifies as “wordy,” “redundant,” or “irrelevant.”

(Update: removing extraneous words that were not part of the slide presentation)