‘Unconscious bias’

By Kara Swisher

You can’t throw a hammer in Silicon Valley these days and not hit a gender or race bias controversy. And believe me, I’d like to throw a lot of hammers, aimed squarely at the still-thick heads of most tech leaders on this most important of topics. The problem is everywhere: Uber. Oracle. Google. And VC firms, so many VC firms.

My biggest gripe is about their convoluted and costly efforts to address lagging hiring and pay for women and people of color. These self-described masters of quantification have all kinds of hurt-feelings excuses. “But the pipeline!” they cry, as if people were oil and talent was a dwindling resource that couldn’t be made again and again.

The worst excuse is what is widely called “unconscious bias” — bias that kicks in automatically, with our supposedly unthinking brains making often-inaccurate snap judgments.

While I am fully aware of the science behind the concept — which basically boils down to the fact that we are all beasts at heart — it’s pure laziness by some of the world’s smartest and most innovative people to pretend they are unconscious of something so glaringly clear. It both abdicates the responsibility of leaders and fobs it off on training and classes that never seem to solve the problem, no matter how much money is spent. Studies suggest these programs may actually backfire, conveying that stereotyping is normal and, therefore, okay.

Here’s a suggestion for tech execs: Pick up your head and look. If you see 10 white men on a board of 10, or an engineering staff that’s 80 percent white and male, with no faces unlike your own, you’re not building a meritocracy. Instead, it’s a mirror-tocracy, and what’s reflecting back at you is neither pretty nor unconscious.

Twitter: @karaswisher

Kara Swisher is a technology journalist and executive editor of Recode.

Healthy substitutes

By Nina Teicholz

Government nutrition guidelines and magazine advice columns have long promoted healthy substitutes for everyday foods. Whole industries have been built around alternate foods that are supposed to make us feel better and live longer. But in many cases, the healthiest choice is to forgo the “healthy” substitutes.

Consider the egg-white omelet. We were told for decades to avoid yolks and limit our dietary cholesterol to help protect against heart disease. Yet in 2015, the U.S. dietary guidelines dropped the daily cap on cholesterol. It turns out that studies since the 1950s had found that dietary cholesterol had little meaningful effect on blood cholesterol. What a shame for all of those delicious omelets we never got to eat. And, more seriously, for all the vitamins we missed — egg yolks are far more nutrient-dense than the whites, with super-rich amounts of biotin, choline and lutein.

It’s the same story with low-fat foods: For decades we’ve snacked with abandon on low-fat cookies and pretzels, exactly as the American Heart Association advised. Yet health officials no longer recommend a limit on total fat — that was more non-fact-based advice, it turns out. A low-fat diet is now “associated with dyslipidemia,” according to the federal Dietary Guidelines Advisory Committee, which means it’s linked to heart disease. Whoops! Hello, guacamole.

Agave syrup, too, can go. Agave is marketed as a natural sweetener because it comes from a plant. But so does sugar — from sugar cane or sugar beets. And both sweeteners are a combination of glucose and fructose. Agave happens to have far more fructose, which is directly implicated in fatty liver disease.

And soy milk? It’s extracted from soybeans using pressure, heat and hexane, a solvent. The resulting rancid mixture must be steamed to eliminate bad odors, bleached to remove the gray color, and then enhanced with sweeteners, artificial colors and synthetic vitamins. No, thanks.

Twitter: @bigfatsurprise

Nina Teicholz is the author of “The Big Fat Surprise: Why Butter, Meat and Cheese Belong in a Healthy Diet.”

Cropped pants

By Tim Gunn

My view of what’s in your closet is accepting and democratic: I don’t care what you wear, provided that you accept responsibility for wearing it. But there are some items for which nobody should shoulder responsibility, including leggings-as-pants (how? why?) and the dreaded dropped-crotch trouser (horrors). Still, my No. 1 offender is the cropped pant. It’s fashion’s Bronx cheer.

Because the only fit issue is at the waist (rather than the length, too), these monstrosities are easy to wear off the rack, and I see them more and more. Universally unflattering, they are neither here nor there; they’re too long to be shorts and too short to be capris, with which they are often confused. They are loose and fall straight from the waist at the widest part of the hip, unlike capris, which are snug and slightly tapered. The male version is misleadingly referred to as “manpris.” Balderdash!

The key to getting fashion right is the harmony and balance of silhouette (your true shape), proportion (think of yourself as divided into thirds from shoulder to ankle) and fit (neither voluminous nor tight), in concert with color and pattern. In terms of proportion, your thirds are the following: shoulder to waist, waist to knee, knee to ankle. Your apparel should conform to those demarcations. Proper shorts will fall to just above the knee. Proper pants will extend roughly to the ankle.

The cropped pant wreaks havoc with proportion. Because the hem falls between the knee and the ankle, usually at the widest part of the calf, the cropped pant succeeds in making you look shorter. I don’t know anyone, female or male, who strives for that goal.

Worn thoughtfully and strategically, clothes create a positive optical illusion, suggesting that you are longer and leaner than you actually are. Worn carelessly, clothes — particularly cropped pants — can do the opposite.

Twitter: @TimGunn

Tim Gunn is a design educator, author and co-host of “Project Runway.”

Countdown clocks

By Tom Cotton

Just 364 days, 23 hours and 59 minutes to go until The Washington Post publishes its 2018 Spring Cleaning issue. In the meantime, we ought to ditch the countdown clock.

These days, it seems like news channels are always counting down to something. But I can’t quite agree with the networks about what’s considered a clock-worthy “event.” New Year’s? Sure. A presidential address? Possibly. Rachel Maddow’s Trump tax return “exposé”? Hardly. In fact, cable news networks use the clock to plug so much of their own programming that they end up promoting themselves far more than actual news. Consider: Last year, CNN hyped one of its presidential debates with a clock that counted down to 8:30 p.m. But just as viewers tuned in, the network aired 30 more minutes of programmed punditry — and then started the debate for real at 9 p.m. Like so many other cable news clocks, it was a countdown to a letdown.

But there’s a more serious point here. A well-informed public is essential to democratic self-government. And the countdown clock reflects a frenzied, over-caffeinated news culture that ill serves a vigilant citizenry. It lowers important topics to the realm of entertainment and raises quotidian trivia to the status of “breaking news” — which is often neither breaking nor news. That makes it harder for us to think about what’s going on in the world in a deliberate fashion. It plays into the hands of politicians, who benefit from a distracted public. It defeats a main purpose of reporting the news — to inform and educate.

BREAKING: And we would be better off without it.

Twitter: @SenTomCotton

Tom Cotton is a Republican senator from Arkansas.

No reservations

By Tom Sietsema

It’s been official for a while now: Washington is a great place to eat. Across the city and at every price point, food fans can find cuisine to talk up.

Something less appetizing is also true of the District. Customers are increasingly asked to stand in line for the opportunity to taste the handiwork of top chefs. On any given night, queues stretch from the doors of Bad Saint, Little Serow and Rose’s Luxury, among many other fashionable food destinations, for the flavors of the Philippines, northeastern Thailand and contemporary America. None of these tastemakers take reservations.

Restaurants that hew to this policy say it frees them from the tyranny of latecomers and no-shows — bad guest behavior that eats into their profits. The practice also has a democratic ring to it, since theoretically anyone has a shot at a seat. Rank and money don’t carry much clout in restaurants that don’t let you save spots.

The problem is that whole swaths of us are relegated to dreaming about clams and sausage in XO sauce rather than experiencing it firsthand. If you’re a person of a certain age — unable, say, to leave work early, or go out after dark, or stand for long — too bad. These restaurants are not for you. Nor are they welcoming of suburb dwellers, who run the risk of coming into the city for dinner and not being fed, or parents of young children, who hesitate to shell out babysitter money for time spent outside a dining room.

No-reservations restaurants defend themselves by saying not all establishments are meant for everybody. True, a steakhouse is the wrong place for a vegetarian to reach nirvana. But as someone whose first priority is diners, I’d like to point out that “restaurant” is derived from the French word for “restore,” and restaurants that don’t let you plan in advance are anything but fortifying.

Twitter: @tomsietsema

Tom Sietsema is the food critic for The Washington Post.

Playoffs

By Dan Steinberg

For six months last year, the Washington Capitals were the best hockey team in the world. For two weeks in the early spring, they weren’t. So their season ended after a second-round playoff loss to the Penguins, and the LOL-L-L-L narrative persisted: This was a Washington team that just wasn’t good enough. 

Left mostly unsaid: Actually, they were good enough! They won 12 percent more games than any other team. In any rational league, they would have been rewarded for their season-long excellence with something grander than a second-rate trophy and a dumb commemorative T-shirt destined for the Salvation Army. 

But Americans remain wedded to their maddening playoffs — just as American owners remain wedded to postseason ticket, merchandise, hot dog and beer sales. Short playoff series are rocked by more statistical noise than a Wisconsin presidential poll, yet we bestow “greatness” on teams that get hot for a few weeks at the right time. 

In no league is this as cruel as the NHL, where just twice in the past 13 years has the most successful regular-season team celebrated a Stanley Cup title. It’s self-defeating for the sport, too; why care about a result in October or November when lasting judgments are formed over coin flips in May and June? 

But it isn’t just hockey, as Washingtonians well know. In 2012, the Nationals won the most games in baseball but lost a first-round playoff series to St. Louis. Then we all had to pretend that the Cardinals bled liquid bravery and the Nats were fatally flawed, rather than accepting the obvious: Over a short playoff series, existential randomness often plays a starring role. 

The solution is simple. Make like domestic European soccer leagues and give the biggest trophy to the best team, even if it limits hot dog sales. The chances of that happening? About as good as the chances of the 2017 Capitals — who again boast the best record in hockey — finally winning a Stanley Cup.

Twitter: @dcsportsbog

Dan Steinberg is a sports columnist for The Washington Post.

Self-care

By Amanda Erickson

If you want empty affirmation at any price, peruse BuzzFeed’s list of “self-care tips so extra they just might work”: “Buy yourself a fancy-ass robe and sit around drinking tea out of [a] goblet like some decadent royal.” “Get yourself a whole cake.” “Book someone to deep clean your home, go out for a massage while it’s happening, and then come back and enjoy pretending that you’re a relaxed human who actually has their life together.”

In other words: Self-care is for people who can afford fancy-ass robes, cleaning services, massages and goblets. A once-radical act has become a marketing opportunity.

In the 1960s, therapists and academics coined “self-care” in part to help trauma therapists, social workers and activists avoid burnout by talking through their feelings and connecting with others in similar circumstances. In the ’70s, self-care became more overtly political as a way for people on the margins to fortify themselves in an oppressive system. In a 1988 book, black poet Audre Lorde wrote that “caring for myself is not self-indulgence, it is self-preservation, and that is an act of political warfare.”

But as the term has made its way into the mainstream, it has lost its connection to radical politics. Folded into commercials and marketing guides, and used to sell products from “wellness retreats” to Korean skin-care routines, self-care has become something you deserve when you’re feeling blue, no matter the reason.

In this new understanding of self-care, keeping yourself strong isn’t about your role in a movement: It’s just about you. As feminist author Laurie Penny wrote in the Baffler last year, the “obsessive ritualization of self-care comes at the expense of collective engagement, collapsing every social problem into a personal quest for the good life.”

The modern concept of self-care is no longer useful; it has become shallow and selfish. Let’s get rid of it.

Twitter: @AmandaWaPo

Amanda Erickson writes about foreign affairs for The Washington Post.

Tweetstorms

By Jeff MacGregor

Made famous (or maybe notorious) by the likes of thinker and essayist Jeet Heer, perfected by bumptious billionaire Mark Cuban and Eric “Guys, it’s time for some game theory” Garland, the tweetstorm is now normcore.

Get rid of it.

I refer, of course, to the long, linked, numbered threads of Twitter posts strung together to form an essay, one scrolling sentence fragment at a time. You’ll find these threads on dog Twitter and cat Twitter, the kids are alt-right Twitter, Oxford comma Twitter and on and on, through the 140-character infinities of the Twitterverse.

Back in the day — two years ago — Twitter was all non sequitur and pictures of sandwiches, and if we wrote 27 tweets on a single topic we expected you to read them up the page in reverse until some stranger ran them through Storify.

But brevity is the soul of twit, so if you want to write an essay, maybe just write an essay and send us the link, k?

cuz as alwys pnctuation and cap’zation are the rl vctms as we dscvr & dply nw cntrctns 4 mkng sns in lmtd spcs

Or maybe these threaded tweets are simply serial telegrams stop an old idea on a new platform stop in the same way the top 10 list really began with the stele of Hammurabi stop.

Or old-time tabloid gossip columns in which every sentence was a new paragraph, and starcrost woosome twosomes were seen dining justthisclose at Ciro’s or Sardi’s or the Stork Club.

Michel de Montaigne, 16th-century inventor of the personal essay, would be proud of its persistence and adaptability. Unless he was trying to read one on the bus, on his phone, in 38 parts.

In which case — like trying to read the Lord’s Prayer inscribed on the head of a pin — his question for the engraver wouldn’t be how, but why?

Twitter: @Jeff_MacGregor

Jeff MacGregor is a writer-at-large for Smithsonian Magazine.

Wedding registries

By Caitlin Flanagan

According to lore, the wedding registry was born in the 1920s, at Marshall Field’s in Chicago. It was simplicity itself: A young woman could list her china and silver patterns, and guests could select a gift they knew she’d like. Was it a bit gauche to direct people to buy you a particular present? Probably. But it was the ’20s. If a girl wanted something, sometimes she had to take matters into her own hands.

The registry as we have come to know it, however, was born in 1993, when some marketing whiz at Target realized a new use for the barcode scanner. Engaged couples could walk through the store scanning things they might need or want: towels, sheets, smart TVs. The technology immediately reshaped private behavior and social mores. Zapping things with the scanner is fun! It’s like a shopping spree with no bill and an element of surprise — which of these things will actually arrive as gifts? Today, the creation of a vast and highly specialized registry (at high-end housewares stores, Walmart, department stores) is as much a part of wedding custom as pretending you’re Lutheran to get the church you want.

You can’t be invited to many weddings without realizing there’s a lot of junk on registries. Couples zap far more than they possibly need, without ranking items by level of desire — so we log on, find a good price point and plug in our credit card numbers.

The function of a wedding gift is to let the couple know: We support you in this new life together. The function of the online wedding registry is to wedge corporate America into that exchange of goodwill. Better to Venmo some cash and let them get what they actually want — perhaps a chance to pay down their credit cards or student loans — than to add more weight to the marital landfill.

Twitter: @CaitlinPacific

Caitlin Flanagan is the author of “Girl Land” and a contributing editor to the Atlantic.

College football

By Patrick Hruby

Imagine you’re running a national university. Your athletic department has a modest proposal. It would like to add a varsity sport that has a tortured relationship with academics, carries a significant risk of physical injury and of triggering litigation against your institution — and will invariably result in about 100 of your male students being hit in the head, repeatedly.

Thanks, but no thanks, you’d certainly say. What kind of school needs a boxing team in 2017?

Why do we treat football any differently? Universities are supposed to advance knowledge, nurture young minds and protect students. Football runs contrary to all three. A growing body of scientific evidence suggests that the sport can be very bad for the brain; it doesn’t take peering through a neuropathologist’s microscope to know it also can be bad for bones and ligaments, leaving participants with lifetime ailments and expenses. Through the antitrust-law-violating sham otherwise known as amateurism, big-time campus football exploits the labor of uncompensated athletes, turning hardworking makers into government subsidy takers by annually transferring billions to the coaches and administrators who call the shots. (Is it really surprising that the on-field workforce is predominantly African American, while management is overwhelmingly white?) Meanwhile, the system too often fails to deliver decent educations to the same players it pickpockets: Half of the black male players in the top programs don’t graduate within six years.

Then there are the sport’s ancillary benefits: sex assault scandals, academic malfeasance, class-action concussion suits, strength coaches who make $600,000 and Xanadu-like football facilities. Oh, and because athletic department donations and college sports season-ticket purchases are largely tax-exempt, everyone else in society is helping to pick up the tab.

When the University of Chicago dropped its powerhouse football program in 1939, school President Robert Maynard Hutchins argued that the sport had become a distraction. That has never been more true than it is today.

Twitter: @patrick_hruby

Patrick Hruby is a D.C.-based journalist and contributing editor at Vice Sports.