Spring Cleaning 2015

Outlook's seventh annual spring cleaning

Ten things to toss: With spring weather finally here to stay, Outlook asked 10 writers to nominate something we're better off without. Here are their picks.

Published on April 9, 2015

READERS' CHOICE: What else should we throw away? Discuss your suggestions in the comments or tweet #springcleaning ideas to @PostOutlook.

Office chairs

There may be no more euphemistically titled field of research than "inactivity physiology." Or: the never-before-needed study of how incredibly awful it is to sit on your keister all day.

By now we're all familiar with the general findings: Prolonged sitting leads to more heart disease, decreased sensitivity to insulin, core muscle atrophy, herniated lumbar disks and, to top it off, decreased oxygen to the brain, which can lead to slower cognitive functioning. As Marc Hamilton, head of the Inactivity Physiology Laboratory at the Pennington Biomedical Research Center, explained to me, sitting causes lipoprotein lipase, an enzyme that regulates fat in your bloodstream, to go to sleep. Hamilton stood during the interview.

Perhaps the most frustrating news about sitting all day is that regular, vigorous exercise is only a limited antidote. In one harrowing study, even within groups of people who exercised at least five hours a week, those who spent 11 or more hours a day sitting were significantly more likely to die within the next three years. Death is considered a poor health outcome. And it's especially unfortunate after you've wasted five hours a week getting ripped.

So this spring, get rid of the desk chairs in your office. Consider following the lead of Mayo Clinic endocrinologists and hold walking meetings. If people need an occasional break, keep some communal chairs on hand. Whatever it takes not to become a subject in a study of inactivity physiology. (Of course, full-time chairs should remain available to anyone who has a legitimate medical need to sit.)

You know that standing desks have arrived because the "Shouts and Murmurs" section of the New Yorker made fun of them recently. In a hilarious piece, the author rails against Big Office Chair while promising to progress to a swimming desk.

I'll accept that, with any health trend, zealotry can be irksome. As the standing-desk evangelist in my office, even I giggle at times when I see colleagues' heads popping up — our desks can be raised or lowered — over cubicle partitions like mushrooms. If you want to get the last laugh, though, it's time to take a stand.

David Epstein is an investigative reporter at ProPublica and the author of "The Sports Gene: Inside the Science of Extraordinary Athletic Performance."

Football helmets

Football players are taught to block with "hat and hands" technique. You make contact with your head and your hands simultaneously, then shed your opponent to the left or right. As an NFL linebacker, I hit my head roughly 100 times per game, often facing off with guards, then a fullback and then a tailback, all on a single play. I thrived on that intensity and physicality. But as studies have shown, it's the high-impact nature of the sport that has allowed concussions to do so much damage to so many players.

Efforts to develop high-tech helmets, ban high-risk tackles and impose astronomical fines have done little to dial down the hits. It's unclear whether any helmet technology could prevent concussions. And even a five-figure fine can't stop a helmet-to-helmet collision when a quarterback drops his head slightly and a defensive player has only a fraction of a second to change his point of target.

We need to take the hard-shell helmets off. We could replace them with throwback leather helmets, which everyone understands provide limited protection. But we need to eliminate the idea that I'm in a cage, you're in a cage, and we can go at each other because we're indestructible.

That would require drilling everyone, players and coaches, from Pop Warner to the NFL, in shoulder tackling and other physical strategies of the sort that the Seattle Seahawks and other teams have begun incorporating.

Yes, removing plastic helmets would change how the game is played. But I don't buy that it would take the toughness out of football. Rugby players don't wear helmets, and having played with the English club Blackheath as part of my Travel Channel show, I can vouch that rugby is a tough and bloody sport. I came away from one practice with a black eye and more lactic acid buildup than I ever had in an NFL game.

Football is a beautiful game. It's America's game. By tossing hard-shell helmets, we can protect players while maintaining the sport's integrity.

Dhani Jones, who played football at Churchill High School in Potomac, Md., played 11 seasons in the NFL as a linebacker for the New York Giants, Philadelphia Eagles and Cincinnati Bengals.

Crowdfunding

Crowdfunding seems like such a great idea. Finally, the rank-and-file donors of America (and the world!) can pool their resources to tackle problems that matter or support companies with great ideas but no access to start-up money. Now anyone can breathe life into someone's creative vision. And occasionally they do: The Pebble smartwatch and LeVar Burton's "Reading Rainbow" revival came about thanks to backers like you.

The everyday reality is far less inspiring.

An endless stream of dubious campaigns flows through my inbox. There are projects to make potato salad or build an inflatable sculpture of Lionel Richie's head. MyFreeImplants.com lets you crowdfund breast augmentation so you can "help the women of your dreams achieve the body of their dreams." Trevolta.com allows you to pay for others to travel.

The problem is this: Unlike with investments in, say, SEC-compliant companies, there is simply no accountability. I want assurances that my money will back a successful endeavor, or at least a well-managed one with a detailed business plan, a budget and qualified employees in positions of authority. Once I fund a project, its creators should be required to show how they actually spent my money. Yet none of the crowdfunding platforms require these things.

Instead, money-seekers pitch to people ill-equipped to judge them, playing heavily to emotions. We want cool stuff like 3D-drawing pens or feel guilty denying a friend's new project. It truly is wonderful to fall in love with a good cause, but more than $10 million flows through crowdfunding sites monthly, and excitement is no check on how it's spent. Let's not forget, those sites make money on our feelings.

No wonder people feel no shame asking you to fund their 38 globe-trotting weddings, or their virtual-reality headset that never delivers or their games that raise half a million dollars and then die out.

While I'd love to help bring mini-robotic printers to market or help indigenous people in Burma receive medical care, the safest way to do those things is probably still to give to people who know what they're doing. Let's toss crowdfunding.

Leigh Shulman, a writer living in Argentina, runs Creative Revolution Retreats, writing retreats for women.

Selfies

The first known selfie was taken in 1839 by Robert Cornelius — an American pioneer in photography — when he produced a daguerreotype of himself. It's all been downhill from there. Selfies have already survived several waves of criticism, but 2014 saw the invention of the selfie stick, a monopod to which a photographer can attach a smartphone or a camera to provide deeper range for the photograph. The selfie stick ushers in a new, even worse and more dangerous era for the form. The stick doesn't just validate selfies by building a cottage industry around them. It also says, "Snap them everywhere!"

Please stop.

The old-fashioned selfie was bad enough. In New York's zoos, circuses and carnivals, taking selfies with tigers or other big cats is actually illegal. Turning your back to a large carnivore, apparently, is not a great idea. I've also seen individuals take selfies while crossing city streets or driving — talk about reckless.

In response to this absurd innovation, some organizations have gone further. The stick has been banned from the Australian Open, Emirates Stadium (home of Britain's Arsenal Football Club), the Smithsonian's Hirshhorn Museum and Sculpture Garden and the Museum of Modern Art. Music festivals such as Coachella and Lollapalooza have also forbidden the stick.

That's because, beyond the obvious narcissism of endlessly photographing oneself and blasting it over social networks for others to admire, selfies are dangerous — to animals, sports spectators, artwork and the rest of us. Last August, a Polish couple fell to their deaths off a cliff in Portugal, reportedly while trying to snap a selfie. And after a plane crash killed two people in Colorado last year, investigators blamed a pilot selfie session in the cockpit of the Cessna 150K.

When it comes to visiting national landmarks, distinguished museums or city streets, it's just not necessary. Focus your camera on your surroundings, not on yourself! Even Robert Cornelius would still believe you were there.

Ben Carson is emeritus professor of neurosurgery, oncology, plastic surgery and pediatrics at Johns Hopkins University, president and chief executive of American Business Collaborative, and a conservative activist.

Third year of law school

There's a wise old saying about law school: The first year they scare you to death, the second year they work you to death and the third year they bore you to death. Why waste students' time and money? Let's eliminate the 3L year of law school.

Cutting a year of classroom instruction would get students into the workforce sooner, with less debt. Getting a law degree can cost $250,000 or more these days, pushing many graduates into lucrative corporate law rather than public-interest or government careers. Cutting the price by a third would also help the more than 40 percent of law school graduates who don't have a full-time law job nine months after graduation.

Defenders of 3L year cite the large and growing body of law to be learned. As Justice Antonin Scalia put it, "To say you are a lawyer is to say you are learned in the law, and . . . you can't do that in two years." But by their third year, law students have generally taken all required courses, filling their schedules with esoteric electives instead. Fun, but hardly essential.

Already, some prominent law professors and deans have come out in favor of versions of this reform. A small number of schools have created two-year JD programs. Even President Obama, our constitutional law professor in chief, is on board. "In the first two years [of law school], young people are learning in the classroom," he said in a speech. "The third year, they'd be better off clerking or practicing in a firm even if they weren't getting paid that much, but that step alone would reduce the costs for the student."

He's right. Eliminating 3L year is long overdue.

David Lat is the founder and managing editor of Above the Law, a Web site covering the legal profession, and the author of "Supreme Ambitions: A Novel."

Ban on foreign-born presidents

The conspiracy theory that won't die — that Barack Obama can't be president because he was born in Kenya — is back for 2016, this time to haunt Canadian-born Sen. Ted Cruz. No, Obama wasn't born in Kenya. And yes, Cruz (R-Tex.) is eligible to be commander in chief; he was born to an American mother (something the conspiracy theorists conveniently forget about Obama, too).

But these non-controversies point to an important question: Why don't we let people who aren't "natural-born citizens" — the Constitution's requirement — run for president?

We have had many foreign-born governors (not just Ah-nold), U.S. representatives and senators. We have even had foreign-born people serve as top diplomats abroad and as chief foreign policy advisers: from secretaries of state Henry Kissinger and Madeleine Albright to national security adviser Zbigniew Brzezinski.

The "natural-born citizen" clause intends to guard the United States against foreign actors who could subvert our government. But if we trust so many foreign-born people to be our emissaries abroad or to shape our foreign policy, we can open the presidency, too.

Many of our early presidents didn't meet this requirement. Eight of the first nine presidents got a waiver, having been citizens at the time the Constitution was ratified but born as British subjects. Many of them are regarded as some of our best leaders.

The idea that a subversive, foreign-born candidate could be elected president while harboring nefarious goals is laughable. Maybe it was possible before the media picked apart every detail of a candidate's life — and candidates saw fit to release their birth certificates — but not today.

Do we really want to live in a country where Donald Trump can be president but an immigrant who has lived here since his or her youth can't? Maybe it's time to trust the voters to decide where politicians' allegiances lie.

Aaron Blake is managing editor of The Fix, The Washington Post's politics blog.

Starred dining reviews

Here are six words I can live without: "How many stars did they get?"

It doesn't matter if the stars come from a professional restaurant critic, a globe-trotting Michelin judge or a Yelper. Everyone wants to know who's in, who's out and most important: How do they rank?

I get it. Ratings are an easy measuring stick; they promise a quick answer to the question, "Is this place any good?"

But when someone asks me about a restaurant experience, I talk about the food, wine, cocktails, service, design and ambiance — the same information you'll find in any good restaurant review. Never have I answered that question by saying, "I'd give it two stars." A dining experience cannot be summed up by a single digit or symbol.

And the idea that you can compare one restaurant to another based on stars is misleading. For example, both R&R Taqueria in Elkridge, Md., and 1789 Restaurant in Georgetown are rated two stars by The Washington Post. R&R is great — an unassuming, authentic Mexican carryout in the back of a gas station. Meanwhile, 1789 is a D.C. institution, a special-occasion restaurant with white tablecloths, polished service, a seasonal menu and an in-house pastry chef. How can you possibly compare the two? It's like trying to decide who is better: AC/DC or Beethoven.

Others who want to change the system have suggested breaking star ratings into three categories: food, service and decor. While that would help differentiate fine dining from fast-casual, I don't think it solves the problem.

Our dining experiences don't need to be quantified by one rating or three. Remove the star ratings, and I'll bet more people take time to read the full reviews, which is why critics write them in the first place.

Mike Isabella is a Washington-based chef and restaurateur.

Art auctions

Art sales, which barely hit a speed bump after the financial collapse of 2008, have now gone into hyperdrive. A circuslike atmosphere reigns at mega-galleries, museums and art fairs of serious stock, with the likes of James Franco as center-ring performers. I'm American; I believe in making money and doing what you need to do in the name of art. But one precinct of this cattle call must be shut down and sent to hell: the auctions of art at Christie's, Sotheby's and Phillips.

These auction houses promote super-expensive works by verified modern masters, pieces that should be in museums. Wealthy bidders pack the house. Prices soar into the tens or even hundreds of millions of dollars, and the room erupts in applause. What they're applauding is art that is now out of reach for museums, which can't begin to compete at these prices. The ultra-wealthy Getty Museum was able to snag an Edouard Manet for $65.1 million at auction in 2014, but even the Getty had no chance of acquiring the Paul Gauguin that sold this year for $300 million, reportedly to someone from Qatar. Annual acquisition budgets at many museums are relatively modest — the Seattle Art Museum had a reported budget of $7.8 million for buying art in 2012; the Nelson-Atkins Museum of Art in Kansas City, Mo., puts the value of its 2014 acquisitions at $5.3 million, which includes art it was given. Most institutions can't stretch to buy even a second-level work by Francis Bacon or Gerhard Richter, let alone a Matisse.

Money and art have always been linked. The wealthy have been responsible for stocking many of the country's best museums with works to be enjoyed by the public indefinitely — witness the collection of Andrew W. Mellon at the National Gallery, Robert Lehman's gift to the Met and many others. But now the wealthy compete with museums on the trading floors, where speculation and spin turn masterpieces into trophies. Art auctions are vile pieces of work. A pox on all their houses.

Jerry Saltz is the senior art critic at New York magazine.

‘Moderate Muslims’

After the Charlie Hebdo attack in January, comedian Bill Maher tweeted that unless Muslims "strongly endorse the right of anyone to make fun of any religion/prophet," they aren't "moderate Muslim[s]." As the Middle East has devolved into chaos, the search for "moderate Muslims" has intensified. Once one is found, he or she must be feted, embraced and given a platform to share the good news — preferably in English or French.

The way we use the term, "moderate" means little more than "people we like or agree with." Almost always, it signals moderation relative to American or European standards of liberalism, freedom of speech, gender equality and so on. Yet in their own countries, people who want to depoliticize Islam and privatize religion aren't viewed as moderate; they're viewed as out of touch.

Setting up a "moderate" ideal lets us Americans off the hook for our own disastrous policies in the region. We can fall back on the idea that if only Muslims had a Reformation just like Christians did, they'd get their act together. In war zones such as Syria, we complain that there aren't enough moderate rebels to support. Why, exactly, would people who are willing to kill and die for a cause care about being moderate?

The search for moderate Muslims misunderstands the nature of the societies we're hoping to change. It would be extremely difficult to find many Egyptians, for instance, who would publicly affirm the right to blaspheme the prophet Muhammad. The spectrum is so skewed in a conservative direction that in countries like Egypt, even so-called secularists say and believe quite illiberal things.

The subtext of so many debates over Islam and the Middle East is frustration and impatience with Muslims for not joining our liberal, secular age. However well-intentioned, such discussions are patronizing and counterproductive. We shouldn't put Muslims in boxes that have little to do with the communities they live in.

Shadi Hamid, a fellow at the Project on U.S. Relations with the Islamic World at the Brookings Institution, is the author of "Temptations of Power: Islamists and Illiberal Democracy in a New Middle East."

Middle names

Your first name reveals what your parents expect or want from you. Your last name identifies your family. What does your middle name do? In some cases it may pay tribute to, or mollify, relatives. Sometimes it can be a nuisance, as Barack Hussein Obama discovered during the 2008 campaign. But mostly it fills a space that would otherwise seem empty, if only because we're used to seeing something there on driver's licenses and wedding invitations.

Middle names actually have a fairly short history in this country. You won't see any on the Mayflower passenger manifest and only a few among the signers of the Declaration of Independence. The custom, imported by German immigrants in the 1700s, was more common than not by the start of the Civil War and became nearly universal by the early 1900s.

Americans quickly abandoned the European practice of reserving the middle name for a baptismal name. More typically, middle names here have been snobbish and frivolous. Whether in homage to family lineage, rich aunts, childhood friends or celebrities, the main purpose has been to offer substance — to create an enhanced identity. Back in starchier times, a gentleman without a middle name was like someone wearing only one shoe. Think John Paul Jones, Oliver Wendell Holmes. Remove the middle name and what have you got? Someone who has to work a lot harder to make a name for himself.

Middle names were admittedly somewhat useful for identifying people when every other baby was a Mary or a Michael. But the pool of names, both first and last, has grown exponentially, so today there's rarely need for a middle name — or initial — that distinguishes one kindergartener from another.

The ancient Greeks were wise to the excess of middle names — preferring stand-alones such as Plato and Socrates over the Romans' Marcus Aurelius Antoninus Augustus. We, too, would be better off if we understood that one given name is enough.

Anne Bernays is a novelist and a co-author of "The Language of Names."
