Our world often feels too cluttered. So, as spring blossoms into full swing, Outlook asked 10 writers what ideas, traditions, people and places we’d be better off without. From Texas to flip-flops and college rankings, here are their picks.
Automatic tax withholding
During World War II, the U.S. government needed to raise cash — and fast. A team of experts that included an obscure young economist named Milton Friedman came up with income tax withholding. It was, as one senator put it, the best way to “get the greatest amount of money with the least amount of squawks.”
Friedman, who would go on to become the high priest of the free market and small government, eventually appreciated the irony of that statement. He didn’t regret suggesting withholding as a wartime measure, but he spent the rest of his life lamenting its longevity in peacetime. “It never occurred to me at the time that I was helping to develop machinery that would make possible a government that I would come to criticize severely as too large, too intrusive, too destructive of freedom,” Friedman wrote in his 1998 memoir, “Two Lucky People.” “Yet, that was precisely what I was doing.”
Withholding numbs workers to the pain of their taxes. As the Treasury Department Web site explained as recently as 2009: Tax withholding “greatly eased the collection of the tax for both the taxpayer and the Bureau of Internal Revenue. However, it also greatly reduced the taxpayer’s awareness of the amount of tax being collected, i.e. it reduced the transparency of the tax, which made it easier to raise taxes in the future.” Oddly, that fact sheet no longer appears on Treasury’s Web site.
Withholding leaves naive taxpayers suffering from a kind of fiscal Stockholm syndrome. They actually celebrate when they get a tax refund, the way a broken hostage might thank a kidnapper who returns his property to him. A refund is when the government pays you back for the interest-free loan it forced you to make in the first place. Congratulations!
Withholding is corrosive to democracy for many reasons. The unspoken assumption is that the government’s needs are more important than yours. Withholding means we are, in effect, working for the government before we are working for ourselves.
Worse, since taxpayers are anesthetized to the pain of paying taxes, we’re becoming ever more disconnected from the product we are buying. There’s a reason Tax Day and Election Day are about as far apart as possible. Why not make everyone write a check every quarter? Better yet, make them write a check once a year — on Election Day. Not only would you get what you pay for, but comparison shopping works better when the price tag is in plain sight.
Jonah Goldberg, editor at large of the National Review Online and a fellow at the American Enterprise Institute, is the author of “The Tyranny of Cliches: How Liberals Cheat in the War of Ideas.”
Ben Bernanke

Ben Bernanke has been savvy, creative and even daring in how he’s saved the U.S. and world economies from the abyss. But when his term is up in January, it will be time for him to go.
By then Bernanke will have put in eight years as Federal Reserve chairman; he is already regarded as one of the most consequential chairmen in the Fed’s history. He has no small number of fans among economists and on Wall Street who want him to stick around for another term to handle the unwinding of the unconventional programs he initiated. But eight years is enough.
When leaders start new jobs, they bring fresh eyes and energy to their organizations. As the years pass, they become more skilled at the job, but also more stuck in their ways and burned out. And leaders can peak earlier than you might think. A recent study found that, in terms of financial performance and other measures of success, a mere 4.8 years is the best tenure length for a chief executive.
From what I’ve seen in my reporting, central bankers can follow a similar trajectory. Today’s Bernanke is smarter about how financial markets will respond to the Fed’s moves and about how to communicate in news conferences and congressional hearings than the Bernanke I started covering in 2007. But to see the downside of staying too long, just look at his predecessor.
Alan Greenspan, who spent more than 18 years as Fed chair, took on the status of a demigod whose instincts were beyond question. Few people recognized during his chairmanship that the Fed was failing to adequately regulate big banks and subprime mortgages, contributing to the looming financial crisis. When Bernanke came to the Fed, he tried to rein in the imperial chairmanship. But the global megacrisis undermined those plans and gave him a bigger public profile than he intended. It’s time to try again, and ending Bernanke’s tenure after eight years would be a start.
We limit presidents to eight years in office. The FBI director’s term, since J. Edgar Hoover, has been limited to 10 years (though that was extended a bit for Robert Mueller). Bernanke’s counterpart in Europe, the president of the European Central Bank, gets eight. It’s time, Ben: Go write a book, give some speeches and enjoy semi-retirement. We need a new Fed chair.
Neil Irwin, a Washington Post columnist and Wonkblog economics editor, is the author of “The Alchemists: Three Central Bankers and a World on Fire.”
College rankings

If college rankings are to be believed, Warren Wilson College in North Carolina has the most liberal students, the College of Wooster in Ohio has the smartest professors, and Rice University has the happiest undergrads. And the very best college in the country is Princeton. Or Harvard and Princeton in a tie.
It doesn’t take a bachelor’s degree to figure out that most college rankings are at best highly flawed and at worst completely bogus. Rankings formally started back in the early 1980s, when U.S. News & World Report came up with measurements to judge the nation’s top universities so consumers could vet a school before enrolling. The list weighs many factors, including academic reputation, retention and graduation rates, faculty pay and credentials, incoming student test scores and alumni donations.
Over the years, this methodology has become complicated and controversial — and sometimes the results are inaccurate. In the past year, U.S. News publicly shamed a number of schools for fudging their numbers or outright cheating. George Washington University lost its No. 51 ranking after school officials disclosed that they had accidentally miscalculated the academic credentials of incoming freshmen.
U.S. News likes to describe its rankings as a public service to consumers — but sales of magazines, books and Web advertising driven by clicks have been highly profitable for the company. And that success and exposure prompted others to get into the game. Even the Obama administration recently unveiled a college scorecard Web site. It has data on nearly every college in the country and focuses on graduation rates and affordability, which some have criticized as an overly simple way to compare schools.
The worst rankings are those that attempt to evaluate such things as party scenes, dorm food and even professors’ looks — based on online surveys, questionable statistics and unfair stereotypes. There is no way such lists help students properly pick a college.
While rankings are often the starting place for many college searches, most families tend to make their final decisions based on information provided directly by the school. Other factors, such as cost, distance from home and a campus’s atmosphere during a whirlwind tour — and, frankly, gut instinct — can be far more influential than rankings. While a sliver of the population enrolls at the nation’s most selective, top-ranked colleges and universities, many more attend institutions you’ve probably never heard of, schools that are not always forthcoming about their abysmal graduation rates, sky-high student debt loads, teetering accreditation and serious financial problems.
How about we stop obsessing about rankings and start caring about that?
Jenna Johnson reports on higher education for The Washington Post.
Compliments

My father, rest his soul, used to say that every compliment should be the truth with a little sugar on top. He didn’t dole them out lightly, so whenever he complimented anyone, I took what he said as gospel. He knew that compliments, even when richly deserved, could backfire.
President Obama recently learned just how risky they can be when he called California Attorney General Kamala Harris the best-looking attorney general in the country, a remark he later apologized for. A week later, Rep. Steve Cohen (D-Tenn.) made a similar mistake, tweeting at Cyndi Lauper after a White House performance that he “couldn’t believe how hot u were.” He later said he tweeted — and deleted — that racy message to get media attention. However, the coverage it got probably wasn’t what he was fishing for.
And remember when, in the 2008 presidential campaign, then-Sen. Joe Biden called Obama “the first mainstream African American who is articulate and bright and clean and a nice-looking guy”? He said he regretted any offense his remarks might have caused, but they remain a cautionary tale.
Immigrants are used to similar backhanded compliments. “You speak English very well,” people often say to me, even after I tell them I’ve been living in this country on and off for 32 years.
Or: “You don’t look like you’re from [pick a country],” as if the person being complimented is on a split screen next to the stereotype in the other person’s head.
Then there are the over-the-top compliments that are so exaggerated they sound implausible. How can we be sure, for example, that someone is the sexiest or most intelligent man or woman alive when we haven’t met every sexy or intelligent person on the planet?
Then there are the “saving face” or “makeup” compliments, which usually follow an insult. For example, someone might call you a jerk when he or she doesn’t realize you are listening — and then turn around and compliment your shoes, to ease the blow. Or that person could suddenly blurt out that you are the sexiest and most intelligent person alive.
So imagine, my attentive and intelligent readers, all the time and face we would save if we no longer exerted so much energy fishing for compliments, tossing them out — and receiving, apologizing for and recovering from them.
How hot would that be?
Edwidge Danticat is a writer living in Miami. Her next novel, “Claire of the Sea Light,” will be published in August.
Flip-flops

Flip-flop, flip-flop, flip-flop, flip-flop. What a glorious sound to hear when it is accompanied by crashing waves, squawking seagulls and ice being blended for drinks with little umbrellas. During the walk to work or around town on a Saturday? Anyplace where there isn’t sand within 20 steps? Not so much. Actually, not at all.
At the beach, the foot thong screams “I’m on vacation!” In an urban environment, the message is “I give up” or “I don’t care” — and not in a good way. The freedom flip-flops give the wearer is undeniable. The desire for comfort is understandable. But the sloppiness they inspire is inexcusable.
Ain’t nothin’ chic or hip about a man or a woman dressed to the nines with their 10 toes hanging out. There is most definitely nothing cute about the sight of pants scraping the pavement, or the horrible conditions of said feet. You know what I’m talking about: crusted heels and toes blackened by the dirt and debris picked up ambling the concrete jungle. And those gross conditions are exacerbated by hoofing it to the office or around the neighborhood in the cold or the rain. Street gravy, anyone?
There are painful consequences to wearing flip-flops. A 2008 study from Auburn University found that the shoes can lead to orthopedic problems, such as sore feet, ankles and legs. There are also the blisters, calluses and stubbed toes to consider. “They are terrible for the arches. They give you no support and they don’t protect your feet,” Kathya Zinszer, associate professor of podiatric medicine and director of community outreach at Temple University’s School of Podiatric Medicine, told CNN. “You need to make sure that you are securing the biomechanics of your foot,” she said. “So if you just get what people call slides, or rubbery flip-flops, it is more dangerous than trying to do something barefoot.”
Look, I know I’m fighting a lonesome battle. Especially since the object I’d love to toss goes all the way back to the ancient Egyptians. But if you want to walk like an Egyptian, do so in lace-ups, heels or, if you must, any other type of sandal.
Jonathan Capehart is a Washington Post opinion writer.
“Innovation”

I recently asked my Twitter followers for heinous practices that would sound nicer if recast in innovation-friendly terms. So piracy became “copyright innovation,” child labor became “supply-chain innovation” and blackmail became — my favorite — “informational innovation.”
If these sound even vaguely plausible, it is because “innovation” has suddenly become ubiquitous and beloved. As is the case with “openness” or “sharing,” the mere invocation of innovation automatically confers credibility on ideas, companies or products that could — and should — be widely contested.
Throughout the ages, the meaning of “innovation” has evolved. (Yes, even innovation is not immune to innovation.) According to historian Benoit Godin, for more than 2,500 years, the innovator was “a heretic, a revolutionary, a cheater.” Innovators brought little but trouble: They challenged the status quo and undermined the stability of the state. As late as the 1940s, innovation was seen as a form of deviant behavior — like crime or delinquency.
By the 1950s, however, governments became convinced that technology and its cousin, innovation, are leading drivers of economic growth. A magazine for entrepreneurs called Innovation, launched in the mid-1960s, even adopted “Change or Die!” as its slogan.
As innovation has become synonymous with technological invention and economic efficiency, it’s no wonder that today’s gadget-obsessed and overworked society can’t find much wrong with the concept. A recent study that reviewed all academic articles about innovation published in English since the 1960s found that, out of thousands of studies, only 26 articles addressed the negative or undesirable consequences of innovation.
Sure, start-ups such as Uber or Airbnb — which help you book a cab or rent out your apartment for short vacations or sublets — might be innovating when it comes to providing transportation or short-term housing. But to know whether such innovation is desirable, we must look beyond technology and economics.
Will Uber circumvent the anti-discrimination rules that old-school cabs have to comply with? Will Airbnb help landlords get around rent control? If we don’t like anti-discrimination codes or rent control, we should abolish them through new laws, not erode them by innovating. “No innovation without representation” should be our rallying cry.
In an ideal world, start-up ventures would count as innovative only if they end up improving our society. The mere fact that we can do something more efficiently — without asking how such efficiency will affect the world around us — isn’t necessarily innovative.
Evgeny Morozov is the author of “To Save Everything, Click Here: The Folly of Technological Solutionism.”
Red lines

Here’s a news conference I’m sure President Obama would love to take back: In August he warned Syrian President Bashar al-Assad that if chemical weapons were used against the Syrian people, it would be a “red line” for the United States. Crossing the line, he warned, would be a “game-changer,” triggering unspecified bad consequences for Syria.
Flash-forward to this past week. The “red line” has been crossed; the United States and its allies say chemical weapons have been used. And Obama is backed into a predictable corner, reduced to spinning furiously at a news conference. What he really meant, he said, was that this is a game-changer “for the international community,” and besides, we need a much fuller investigation before we do anything rash.
So instead of talking about how we can stop a civil war that’s claimed some 70,000 lives, we are talking about Obama and his red line.
Guess what? Such threats pretty much never work. Like many other world leaders, Obama resorted to this cheap “red line” formula because he didn’t have another move he was prepared to make.
Look at Iran, which apparently continues to pursue a nuclear weapons capability despite years of finger-wagging red lines. Harvard professor Graham Allison counts seven red lines set by the international community that Iran has crossed in recent years. As former U.S. ambassador James Jeffrey pointed out to me recently, red lines have worked when the United States was actually prepared to go to war if our ultimatums weren’t met. For example, when Saddam Hussein invaded Kuwait in 1990 and we demanded he leave. Or when we insisted that the Taliban give up Osama bin Laden after Sept. 11, 2001. In both cases, the red line was crossed — and American troops rolled in.
But all too often, a red line is not a dramatic precursor to an invasion but the diplomatic equivalent of the parental cop-out. We all know it: If you haven’t cleaned your room, done your homework, come downstairs by the time I count to three, you’ll really get in trouble. It never works, at least not in my house, unless you do something that matters — like take away TV time.
When it comes to Syria, Obama needs an “or else.” Better yet, he should give up red lines altogether; it’s one presidential tool that should be tossed out with the trash.
Susan E. Glasser is editor in chief of Foreign Policy.
RTs ≠ endorsements
Retweets, or RTs, are critical for sharing content on Twitter — the equivalent of forwarding an e-mail to a large list. For some reason, lots of Twitter users, especially journalists and think tankers, like to say in their Twitter bios that RTs are not equal, or ≠, to endorsements. They might think this covers their behinds, but often it’s a sign that they are too clever for their own good.
I hate to break it to you, folks, but RTs are implied endorsements. Forwarding an article by e-mail without explaining why you are passing it on implies that you agree with it (and that you are someone who likes to waste my time). RTing something without comment means the same thing. However, a comment isn’t always necessary: If you are willing to stand by the content you are RTing or it’s just, say, a photo you admire, no problem.
Some people RT stuff that you know they vehemently disagree with; they’re doing it to show the idiocy of the people they oppose. But doing so without a comment is confusing. The only way to make sure your tweets aren’t misconstrued is to add a few words before the material you’re retweeting. Even “I disagree” or “This guy’s more out of it than I thought” will help reduce the confusion.
Putting the RT disclaimer in your Twitter bio presumes that it will shield you from responsibility for the content of the tweets you share. But the disclaimer is a useless crutch: your tweets circulate on their own, entirely unconnected to it.
No one sees something controversial you’ve RTed and says, “Gee, I wonder if this person agrees with that,” then clicks over to your bio and is massively relieved that you have that disclaimer.
I know several people who’ve gotten in trouble for their tweeting despite having those sorts of disclaimers in their bios. Take, for example, Jodi Rudoren, the Jerusalem bureau chief for the New York Times, whose bio says, “Tweets mean hey, look at this, nothing more.” She has one of the most closely scrutinized jobs in journalism, yet was so careless in her tweeting on the Middle East that she earned a “Twitter sitter” — an editor at the Times was assigned to approve all her tweets and Facebook postings.
So, ditch the disclaimers. Instead, use the extra space to tell us about a couple of subjects you tweet about.
Texas

For decades, Texans have been clamoring about leaving the Union. Letting the Lone Star State secede would set a bad precedent. (See the Civil War of 1861 to 1865.) But what about expelling it instead? There is promise in that.
I’ve been thinking about this ever since I stood at the ruins of Fort Jesup, a U.S. Army base built in 1822 by Lt. Col. Zachary Taylor on the western edge of Louisiana to guard the Sabine River, which formed the international border between the United States and Texas, then part of Mexico. Why not go back to that situation? It worked then.
After all, what has Texas given us? Without it, we might have avoided the presidents who gave us two of our longest and least necessary wars — Vietnam and Iraq — and John F. Kennedy might still be alive.
We wouldn’t have the Dallas Cowboys, nor the right-wing oilmen the state seems to produce. Of course, we would also lose Gov. Rick Perry, who already is making the split easier with his talk of moving the roughly $1 billion in Texas gold reserves, now in a vault in New York, back into the state.
There’s a lot of great music out of Texas, from Lightnin’ Hopkins and Bob Wills to Buddy Holly, Janis Joplin, Lyle Lovett and Archie Bell & the Drells. But we could still listen to them, just as we enjoy the sounds of such Canadian crooners as Neil Young, Joni Mitchell and Feist.
Sure, there would be some problems. It would be harder to drive from New Orleans to Tucson. Perhaps we could hold on to the interstate highways through the former state — we built them, after all. They could be like the corridors for driving to West Berlin back in the days of the Soviet empire. Having Texas floating free could be destabilizing for Mexico — but we could probably build some fences to help handle that.
Texans have that Lone Star flag all set. I think they’re ready to fly solo and lonely once again. Let them go.
Thomas E. Ricks, a former defense reporter for The Washington Post, is the author of five books about the U.S. military, most recently “The Generals: American Military Command From World War II to Today.”
“Working mother”

When women were relatively new to the working world, it made sense to identify mothers who worked for pay. But the term “working mother,” which dates back to the late 19th century, now implies that individuals who care for children without pay don’t actually work. It’s time to abolish it.
As Ann Crittenden observed more than a decade ago in “The Price of Motherhood,” cooking, cleaning, driving children to and from school and activities, and watching and educating them are viewed as “labors of love,” rather than real, hard work. Yet these are time-consuming and demanding tasks that, when outsourced, have been valued at $60,000 to $100,000 a year.
Moreover, our continued use of the term “working mothers” when we don’t call men “working fathers” reinforces the idea that mothers should be at home — hence we need an adjective when they are not. By contrast, we presume that fathers should be at work and thus don’t need an adjective. The term also perpetuates a divide between stay-at-home mothers and those who work outside the home, when all women should feel united in pushing for policies such as paid family leave and affordable day care.
Far better is the phrase that a number of men are using, calling themselves “work-at-home” dads. Most of them are working on projects that bring in income while also taking care of the kids. Or we could call any parents who are not commuting to an office, whether they are working for pay or not, “work-at-home mothers” and “work-at-home fathers.”
But if we really want to let in some fresh air this spring, let’s change the frame entirely — away from men and women, mothers and fathers, office and home. Let’s talk about caregivers and breadwinners — and acknowledge that the vast majority of American workers are both.
Anne-Marie Slaughter, a Princeton professor, was the director of policy planning at the State Department from 2009 to 2011. She is the incoming president of the New America Foundation.