Friday, June 13, 2014

Common Core opponents are irrational - and ignorant

Forgive me for getting a little overheated about this.

Right now, it's fashionable to trash the Common Core State Standards Initiative - or Common Core, as it's more generally known. However, after having read a lot of anti-CC articles (and comments on articles) on the Web, I have reached an important conclusion. Two conclusions, as a matter of fact.

1. Common Core opponents are irrational.
2. Common Core opponents are ignorant.

To illustrate their irrationality, I will point you once again to "Frustrated Parent." He couldn't figure out his son's math problem, and so he and the entire world blamed it on Common Core. I looked at the math problem in question. It simply presented another way to learn subtraction. As I explained in my posting about it, there are many different ways to teach subtraction.

Common Core does not tell you how to teach subtraction, but it allows for all those different teaching strategies.

To underscore the irrationality, I have been following the comments in another article about CC. One of the commenters was bashing CC because, to quote his words as well as I can from memory, it doesn't allow for the fact that there are many different ways to teach a concept, and it requires that the concept be taught in one way. Um, (a) nothing could be further from the truth; and (b) this directly contradicts the underlying notion behind the math problem that "Frustrated Parent" was bawling about.

As for ignorance, when I say "ignorant" I mean "stubbornly unwilling to learn (more) about a subject." I've read lots of comments by people who are spouting off authoritatively about something they know nothing about, when the truth is freely available, and easy to read, at the Common Core State Standards website. All they have to do is look it up. But they prefer to go on blabbering ignorantly about what they perceive as Common Core and all its shortcomings, unencumbered by facts or the truth.

Why are college kids so intolerant? Part Two

I have noticed that stuff on the Internet doesn't really last forever. It lasts until the publisher, copyright holder, or server administrator decides it's time to get rid of it. The Wayback Machine helps, but I doubt it can capture everything. Therefore, I decided to copy Matt Bai's excellent column and paste it right here.

I don't know if I can claim "fair use" for such a blatant copy. But I will include a link to the original: . And I will rewrite this posting to reduce Mr. Bai's essay to a bunch of acceptable excerpts, if he or the Yahoo lawyers ask me to.

Here, then, are Mr. Bai's words on the subject.

Don't blame college kids for intolerance. Blame us.
By Matt Bai
May 22, 2014 4:59 AM
Yahoo News

America's college kids are back and resting at home this week, which is a good thing, because during the long months away they seem to have gone completely out of their minds.

Last weekend, The New York Times' Jennifer Medina reported on the latest bizarre demand on campus: "trigger warnings" to let students know if the text they're about to study will expose them to some version of misogyny or homophobia, so they aren't unexpectedly traumatized by visions of things that can never be unseen – like, say, every novel written by a white man before 1960. That followed the public floggings of several commencement speakers whose invitations had to be rescinded, including such evildoers as former Secretary of State Condoleezza Rice, the International Monetary Fund's Christine Lagarde and Robert Birgeneau, the former chancellor of the University of California, Berkeley.

All of this has provoked a torrent of eloquent condemnation from pundits and academics, who worry that our elite universities, in the words of an editorial published in Monday’s Washington Post, are being "impoverished by intolerance." Which is a reasonable concern, except that it misses the point. It's not the students' fault that they expect to laze around in a world of ideological comfort. It's totally ours.

There's nothing new about the basic tension between speech and sensitivity on campus. When I was at Tufts in the late '80s, at the height of what we called political correctness, we argued fiercely about whether the military belonged on campus or whether certain faculty members were denied tenure because of their politics. But, by and large, we were primed to have the debate, not chill it.

We'd grown up with TV news that tried to get at complicated issues (Ted Koppel's "Nightline" was the single most influential news program of the era) and op-ed pages that crackled with competing arguments. I remember meeting William Colby, the former CIA director, at a symposium. A lot of us were disgusted by the role he had played in Vietnam, but it never occurred to us that he shouldn't speak or that his beliefs weren't at least defensible.

It was reasonable to hope, with the sudden explosion of what we called cyberspace a decade or so later, that this kind of exchange would become more commonplace and more enlightening, rather than less so. Only that's not what happened. Almost from the moment the first iteration of political blogs appeared, not long after the 2000 presidential election that exposed a deep cultural rift in America, like-minded activists began to wall themselves off from any version of reality they didn't like. They set about building ideological silos in the space where virtual town squares might have thrived.

Our political leaders and our media might have recognized the danger here and done their traditional duty, which was to ignore all the noise, and focus instead on explaining the complex realities of a country in social and technological transition. With some notable exceptions, that didn't happen, either. Instead, politics in the past 10 years has become a perennial contest of the already converted, a constant pursuit on either side of "base strategies" and data sets that tell you exactly which voters you need to turn out in order to get and hold power.

Those of us who cover and analyze the news – whose central purpose it is to challenge our own preconceptions about the world, and yours – haven't really performed much better, and I'm not just talking about the partisan rehashing on Fox News and MSNBC. Many of our most respected columnists and academics, too, occupy the predictable extremes, where they can always rely on the clicks of a comforted audience. They use a smokescreen of empiricism to prove to you, over and over again and without fail, that everything you already believe is borne out by some selective poll or study.

What's happened is that we've effectively left behind the Age of Persuasion and ushered in the Age of Confirmation. It sometimes seems the whole world exists to re-affirm our conceptions of it; you can get through days, even weeks, without being at all discomfited, if you know which sites to visit and which channels to watch.

This isn't confined to politics. We target self-help books and superhero movies at consumers whose habits we know, rather than do the hard work of trying to convince anyone to broaden their minds. (Did you like Sheryl Sandberg's book? Then you'll love Arianna Huffington's version, which is pretty much the same thing, right down to the catchphrase title and cover photo.) Log on to Amazon, the supermall of the confirmation culture, and you will instantly be introduced to all of the books, movies and songs that are exactly like all the others you've purchased recently.

We have more options and access to information than any society in human history, and less inclination to avail ourselves of it. Maybe we're just overwhelmed.

So tell me this: What exactly did we think the effect of all this was going to be on the generation after ours? Today's college senior was born around 1992 and developed a political awareness just as blogs and social media were bursting into the American consciousness. Did we really expect these kids to emerge from the moment with a sense of intellectual adventurism? Were they supposed to just know that the entire point of literature is to discomfort you with no warning at all?

Did we think the characteristic that F. Scott Fitzgerald cited as the hallmark of first-rate intelligence – "the ability to hold two opposed ideas in mind at the same time and retain the ability to function" – didn't have to be taught by example?

Here's the good news. First, while the loudest students have been grabbing the attention lately, anyone who spends any time on campus these days (or who reads some of the better polling of the so-called millennial generation) can tell you that a lot of younger Americans appreciate that something is wrong with the way we talk to each other, or don't. They're distrustful of old political and media institutions and eager to build a more tolerant, less fragmented society than their parents have to this point. That's to their credit.

Second, it's worth remembering that for all the missed opportunity around us, we're still in the infancy of the Internet culture, a moment roughly analogous to where television was in 1960. Our instinct has been to retreat into safe communities online that reinforce our convictions and banish all doubt. But media evolves, and political dialogue with it, and I'm betting we will figure out how to hear alternate (and even odious) worldviews without need for a trigger warning.

To paraphrase Martin Luther King Jr., the arc of technology is long, and it bends toward enlightenment.