With spring weather finally here to stay, Outlook asked 10 writers to nominate something — an idea, a tradition, a technology, a hashtag, anything — we’re better off without. Here are their picks.

What else should we throw away? Enter your suggestions here or tweet #springcleaning ideas to @PostOutlook.

Air Force

In 1947, with the Red Army occupying much of Europe, the United States tasked its Air Force with planning for the atomic destruction of the Soviet Union’s cities and industrial heartland. This was the first mission for the newly independent Air Force, after a 30-year campaign by aviators to separate from the Army. The Air Force was expected to essentially win wars by airpower alone.

Except it’s tough to win wars from the air. Over Korea in the 1950s, cheap Soviet interceptors cut B-29s to pieces. A decade later, the Air Force’s bombers couldn’t force Hanoi to give up on South Vietnam, and its fighters struggled against pesky, agile North Vietnamese MiGs. In 1999, it took three months for NATO to bring Serbia to its knees. The Air Force performed well in Iraq and Afghanistan, but mainly in a supporting role.

Now we have several air forces — in the Army, Navy, Marine Corps and Coast Guard — because no other service trusts the Air Force to meet its needs. America needs military aviation, but it doesn’t need a bureaucracy specifically devoted to airpower.

Drones further complicate the picture. For generations, the promise of flying a powerful jet drew would-be pilots to the Air Force. Managing robots from Nevada, however, is much less appealing. And the Air Force’s image has been hurt by growing unease over drones’ targeted killings, especially when it comes to assassinating U.S. citizens overseas.

Rather than continuing the Air Force’s search for definition and purpose, we should distribute its assets and responsibilities to the Army and the Navy. This would concentrate U.S. airpower on the support missions it performs so well, rather than the independent missions the Air Force does poorly. Cutting the Air Force might also correct the inaccurate notion that conflicts can be ended by airpower alone.

Robert M. Farley is the author of “Grounded: The Case for Abolishing the United States Air Force.” Follow him on Twitter: @drfarls.

AP classes

To hear the College Board tell it, Advanced Placement classes can do it all: Prepare teens to succeed in college! Expose poor and minority students to more rigorous material! Reduce the cost of college by allowing students to graduate earlier!

The educational and emotional toll these classes take would be too high even if the AP program delivered on all its promises. But it doesn’t. Students take nearly 4 million AP tests each year, and more than 40 percent fail. Passing rates are even lower for Latino and African American students.

Over the past five years, I’ve interviewed hundreds of educators and students across the country. More often than not, they tell me that the demands of AP classes leave students rushing through material without fully absorbing it. The pressure to succeed contributes to soaring rates of teen anxiety and depression.

The exams don’t bolster college performance, either. A 2006 study of 18,000 college students in introductory science courses found that having taken AP science classes did not significantly boost their grades.

Students and schools can end the arms race by opting out of AP. Scarsdale High School in New York abolished AP classes in 2007 and replaced them with “Advanced Topics” courses focused on in-depth study. High Tech High in San Diego decided against them more than a decade ago; instead it offers honors credit in every class, and 71 percent of students take it. Cranbrook Kingswood School in Bloomfield Hills, Mich., limits students to two or three AP classes per year.

College admissions officials should reduce or eliminate their reliance on AP coursework, quantitative rankings and padded high school “résumés.” Hundreds of institutions have already made the SAT and the ACT optional. They could also cap the number of AP courses they would consider on applications — and make clear that they wouldn’t penalize students who attend schools that abolish or limit AP. If the Ivies took this simple step, other schools would follow, and students might arrive at college healthier and more ready to engage.

Vicki Abeles, an attorney and advocate for children and families, produced the documentary film “Race to Nowhere.”

BRICs

People blame Goldman Sachs for many things. I blame the investment bank mainly for popularizing the acronym BRIC — Brazil, Russia, India and China — in a 2001 report by economist Jim O’Neill arguing that long-term growth in these emerging markets would surpass that of the world’s richer nations.

Investing in the BRICs sounded like a good idea when these countries were growing quickly and when the most-developed economies were sputtering. But the grouping quickly outlived its usefulness. It has become clear that, other than large territories and populations, the BRICs have little in common. Brazil and India have different domestic political challenges, China and Russia are pursuing disparate development strategies, and China’s geopolitical role is far more complicated than that of its BRIC comrades.

Now that the BRICs have entered a rough patch, the allure of these fast-growing economies has faded. Yet the bankers and consultants who dream up such monikers have simply created new ones. They include CIVETS (Colombia, Indonesia, Vietnam, Egypt, Turkey and South Africa), from the Economist Intelligence Unit; CARBS (Canada, Australia, Russia, Brazil and South Africa), identified by Citigroup; and MINT (Mexico, Indonesia, Nigeria and Turkey), coined by Fidelity Investments. Even O’Neill came back with the “Next Eleven” or N-11 (Bangladesh, Egypt, Indonesia, Iran, Mexico, Nigeria, Pakistan, the Philippines, Turkey, South Korea and Vietnam).

The rhetoric is familiar. The CIVETS countries are blessed with “diverse and dynamic” economies; the MINT nations enjoy “favorable demographics for at least the next 20 years”; and the N-11 “could potentially rival the G7 in terms of economic growth over time.”

But these categories reflect smart marketing and packaging of financial products rather than analytical originality or investment acumen. The main trait these countries share is that their economies are as volatile as their politics.

According to all indicators, the best acronym to invest in is still USA.

Moisés Naím, a scholar at the Carnegie Endowment for International Peace, is the chief international affairs columnist for El País and the author of “The End of Power.” Follow him on Twitter: @MoisesNaim.

Celebrity stylists

The entertainment business is brutal. So who can blame an actress for employing a stylist for every big event? I get it: I experienced red-carpet hoopla and the accompanying anxiety while at the 2010 Cannes Film Festival for the premiere of “Fair Game.” I borrowed a stunning dress from Giorgio Armani but went sans stylist. My fashion lifeline: Spanx.

Here’s a radical proposal: Ban the celebrity stylists, even for one high-profile event, and let’s see how the stars dress themselves, without professional help. Let them stand in front of their closets like the rest of us and figure out which shoes go best with which dress. And no borrowing of designer clothes or expensive jewels, which often come with their own bodyguard. Not for the people wearing them — for the jewels.

We treat celebrities as our royalty. If they had to dress themselves, maybe we would see them a bit more as the human beings they really are. After all, it is the displays of sheer individuality on the red carpet that we recall most vividly. Think Cher in a feather-and-sequin headdress at the 1986 Oscars, Björk with a swan around her neck in 2001 or Helena Bonham Carter in pretty much anything. No boring sweetheart necklines for these ladies.

A world without stylists might not be as tasteful, but it would definitely be more fun and realistic.

Valerie Plame Wilson is the author of “Blowback” and “Fair Game: How a Top CIA Agent Was Betrayed by Her Own Government.”

#firstworldproblems

The term “developing country” is a bit odd — as though societies start in the Stone Age and progress, inexorably, toward Las Vegas. It is better, though, than the old designation of “Third World” vs. “First.” Unfortunately, the “First World” has made a comeback, in the form of a self-indulgent and self-referential meme: first-world problems.

The phrase, which began as a Twitter hashtag but has infected offline conversations, is essentially a humblebrag disguised as a complaint. Bleating about traffic while en route to your beach house or venting that your favorite Moleskine notebook has been discontinued is a subtle way of flaunting your tastes and privilege. And no, your winking self-awareness doesn’t make it any less annoying.

Broadcasting your #firstworldproblems presumes that individuals in wealthy countries have troubles that individuals in poorer societies can’t possibly comprehend. It reinforces the false idea that there is no commonality across cultures. However, the human experience is more universal than different. Dry cleaners lose clothing, boyfriends do not call their girlfriends, Internet routers stop working, and products are sometimes out of stock — whether you’re in East Asia, East Africa or the Upper East Side.

Most dangerous, the term implies that no real problems exist in rich countries. The challenge of economic inequality, for example, was long overlooked in America, in part because of its self-regard as a wealthy nation. The United States revels in such exceptionalism — in spite of pay-to-play politics and high rates of executions, incarceration and deaths from gun violence. Now those are #firstworldproblems worth ranting about.

Dayo Olopade is the author of “The Bright Continent: Breaking Rules and Making Change in Modern Africa.” Follow her on Twitter: @madayo.

Low-fat products

With swimsuit season around the corner, May becomes the cruelest month, but it need not be. To ease your spring-slimming efforts, all you need to do is take one counterintuitive step: Purge the pantry of low-fat foods.

Yes, low-fat products make people fat. To replace the texture lost when fat is removed, food manufacturers use “fat replacers,” which are usually composed of carbohydrates — often just sugar.

We’ve been told for decades that eating fat makes us fat — and gives us heart disease — but the real culprits are carbohydrates. German scientists discovered the link in the 1920s. But U.S. researchers developed a fixation with dietary fat in the 1950s, and Americans have been obsessed with fat ever since. Today we consume more calories than we did in the early 1970s, although a smaller proportion of them come from fat. We are also eating at least 25 percent more carbohydrates — and we are fatter than ever.

A fat-free Caesar salad dressing will deliver about 12 times more carbs per serving than the regular version. “Light” Philadelphia cream cheese packs twice as many carbs and sugar as the full-fat kind. And at Starbucks, there’s no more carb-laden bakery item than the reduced-fat banana chocolate chip coffee cake, with 50 grams of sugar, equivalent to more than two Hershey’s milk chocolate bars. The regular Starbucks chocolate-marble loaf cake, with 24 grams of sugar, is no angel, but it’s still the better option.

The same is true for yogurt, cheese, ice cream, cookies, peanut butter, hot dogs and granola bars, among other items: The low-fat versions are almost always higher in carbohydrates.

Over the past decade, many clinical trials have established that a higher-fat diet is more effective than a low-fat one in fighting obesity and heart disease. Toast that with some cream in your coffee! It’s far more delicious and filling than nonfat milk — and it has much less sugar.

Nina Teicholz is the author of “The Big Fat Surprise: Why Butter, Meat and Cheese Belong in a Healthy Diet.” Follow her on Twitter: @bigfatsurprise.

Midterm elections

In Washington, the 2014 midterm election season is well underway. Actually, it has been since late 2013, when both parties started signaling that they were pretty much done with any serious legislating on immigration, trade or other challenges facing the republic. Never mind that a new class of lawmakers had just arrived in Washington in January 2013. Nope, it was time to start thinking about the midterms.

Holding nationwide elections for federal office every two years feeds the permanent-campaign mind-set that grips our elected officials and political reporters, too many of whom would rather dwell on generic congressional ballot surveys and Cook Political Report ratings than on actual lawmaking.

The two-year term requires new House members to start their fundraising for the next election as soon as they arrive in Washington. This winnows the pool of candidates — and not in a good way. “The skill of telemarketing does not translate very often into the skill of governing, so there are real implications for the kind of people you get on the other end,” Sen. Chris Murphy (D-Conn.), formerly of the House, observed last year.

And midterms aren’t exactly a display of democracy at its finest, given that far fewer voters turn out than in presidential years. For example, in the 2010 midterm elections, 89 million ballots were cast, compared with 129 million in 2012.

Yes, the founders believed that frequent elections would give House members, as James Madison put it, “immediate dependence on, and an intimate sympathy with, the people.” But there’s not much responsiveness to the public when more than 90 percent of House incumbents win their races.

So, onward with a constitutional amendment to expand House terms to four years. Voters would still be able to express midterm sentiments in elections for Senate and governors’ offices. Meanwhile, Washington could get back to work.

Alec MacGillis is a senior editor at the New Republic. He can be reached at amacgillis@tnr.com.

The post-apocalypse

Hollywood has always loved a disaster story. Lately, our heroes aren’t saving the world by drilling to the center of the Earth or engaging in robotic boxing matches with monsters from another universe. Instead, the world goes to hell by means of alien invasion, totalitarian government or radical income inequality.

The remedies for these post-apocalypses are often similar. They feature plucky teenage girls, such as archer Katniss Everdeen from “The Hunger Games” books and films; Tris Prior from Veronica Roth’s “Divergent” franchise; and Melanie, who acquires an alien parasite in “The Host,” the other hit from “Twilight” scribe Stephenie Meyer. These characters are often marginalized by their class status, but after they discover the truth about the governments or alien invaders that rule their worlds, and fight for social change, they become symbols of resistance. Also, there is smooching.

This trend has given audiences a welcome and overdue crop of young heroines and has helped establish young actresses, such as Jennifer Lawrence and Shailene Woodley, as action stars.

But post-apocalypse stories get repetitive awfully fast. There’s always another dictator, and another radically stratified or sorted society, on the next page or the next movie poster. Now, these stories tell us less about how we might respond to the loss of technology, what kind of government we really crave in a crisis and the ways women lead — and more about what sells.

Instead of letting young women come to the forefront once the world has gone to pieces, a real test for Hollywood would be to see if it can make a hit out of a story in which a young woman staves off disaster.

Alyssa Rosenberg writes about culture for the Washington Post blog Act Four. Follow her on Twitter: @AlyssaRosenberg.

Presidents’ Day

Presidents’ Day is more than a little bizarre. Why would a country founded on the rejection of monarchy devote a day to the celebration of the office of the presidency and those who have occupied it?

The president was never meant to run the country. There’s a reason he doesn’t get pride of place in the Constitution: Article II deals with the presidency, right after Article I establishes Congress. Of course, we have expanded presidential authority pretty far since the founding. As Gene Healy, author of “The Cult of the Presidency,” writes: “The modern president is America’s shrink, a social worker, our very own national talk show host. He’s also the Supreme Warlord of the Earth.” Discarding Presidents’ Day would not undo that deplorable trend, but at least we’d no longer be cheerleaders for it.

Getting rid of Presidents’ Day would not be difficult. All we would have to do is start calling the third Monday of February by its proper name under federal law: George Washington’s Birthday.* The father of our country is worth a holiday, not least because he adopted the title “Mr. President” rather than “Your Majesty” or some other exalted term. Presidents’ Day undercuts that republican modesty.

And making the shift would have one other advantage: eliminating a perennial punctuation problem. Where does that apostrophe go?

Ramesh Ponnuru, a senior editor for National Review, is a visiting fellow at the American Enterprise Institute. He can be reached at ramesh.ponnuru@aei.org.

* Because of an editing error introduced without the author’s knowledge or permission, the original published version of this Spring Cleaning contribution gave the mistaken impression that the author did not know that the third Monday in February is officially designated by Congress as George Washington’s Birthday and is only colloquially known as Presidents’ Day. The author’s text has been restored.

Status updates

What is your status? Please do share. Make sure to inform us of what you are eating or reading or doing at this very moment. Convince us you are happy and fulfilled. Otherwise, how would we know?

I’m not the first person to decry how publicly self-involved we have become. And I’m certainly no less addicted to my smartphone than the average American multitasker, as my wife will attest. But there is a fine line between relying on social media for information and communication — and using it to broadcast your every experience.

Have I interrupted a date with my wife to keep up with the latest happenings in Ukraine? Yes. Have I done so to Instagram my salad because (OMG!) it has currants in it? Never.

Does that make me a social-media elitist? Probably.

But I am concerned about the way our obsession with the status update has transformed reality into something else entirely. When we regularly step out of the present in order to record it, filter it and then share it with people we do not care enough about to actually be around, we are removing ourselves from the moments we’re documenting. Our filtered lives are subsuming our real ones.

So by all means enjoy that salad. But stop documenting it. No one needs a press conference about your appetizer. You are not news.

Reza Aslan’s most recent book is “Zealot: The Life and Times of Jesus of Nazareth.” Follow him on Twitter: @rezaaslan.