As you probably know, Facebook has been under fire for the way it edits its Trending Topics. Gizmodo reported earlier this month that trending news editors at the social network had purposefully suppressed news from a conservative political viewpoint. Facebook confirmed that chief executive Mark Zuckerberg is going to meet with top conservatives in California — including Glenn Beck and Arthur Brooks — in response to the claims, which Facebook has denied.

Although many pixels have been spilled over what Facebook did, with what intent and to what effect, the real question, to me, is what we want Facebook to be in the first place.

Do we want Facebook to act as a news site? It certainly never started out that way — it was supposed to give you headlines about your friends, not world events. To some extent, that is still what Facebook is; impartiality, a watchword for American journalism, has never been one for Facebook. In fact, it actively didn’t want to be impartial — it wanted to be personalized. Liberal users might be more interested in liberal news. Conservatives might want to see conservative news. Facebook doesn’t care one way or the other, as long as people see what they want.

Yet we, as users, made Facebook into a news source. According to a recent study from the American Press Institute, 51 percent of Americans get their news from social media sites and, the study said, Facebook is the most-consulted network.

And it's adapted according to our habits. When Facebook added the Trending Topics feature in 2014, it didn't really fit with what most people thought of as Facebook's mission as a social network. Right off the bat, users criticized Facebook for showing fluffy news — the gossipy, scandalous, “you won’t believe” stories we all profess to hate but click on anyway. So, over time, the company has crafted guidelines to promote “real news” — or at least more timely news — on the site.

Facebook turned to curation, fueled in part by humans and in part by machines, to make things seem fair. The problem is that while we like to pretend data is neutral, algorithms — because they are written by humans — have bias.

If we wanted a pure, unadulterated, democratic link machine, after all, there is always Reddit. Reddit does no editorial curation, and it prides itself on the fact. The posts that get the most upvotes rise to the top of the site as the most noteworthy things on the Internet. And at the time of writing, the very best thing on the Internet is a picture of a 17-year-old cat.
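That kind of uncurated, vote-driven ranking is, in spirit, trivial to write down. A minimal sketch in Python — the post titles and vote counts here are invented for illustration, not real Reddit data:

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    upvotes: int
    downvotes: int

def rank(posts):
    """Order posts purely by net votes -- the crowd is the only editor."""
    return sorted(posts, key=lambda p: p.upvotes - p.downvotes, reverse=True)

feed = rank([
    Post("Breaking: world event", 340, 120),
    Post("My 17-year-old cat", 9800, 150),
    Post("Local election results", 85, 10),
])
print(feed[0].title)  # prints "My 17-year-old cat"
```

In practice even Reddit's front page folds a time-decay term into this kind of score so that old favorites don't calcify at the top, but the principle stands: the votes, not an editor, decide what leads.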

That’s genuinely great, in its own way. But it’s probably not what you want to see in your list of headlines for the day. And taken to extremes, an uncurated feed could be genuinely upsetting. What if a group mounted a targeted campaign to push something awful, such as a gruesome crime scene photo, into Trending Topics? It would be a legitimate, unvarnished reflection of what was spiking on the site. But people would be screaming for curation.

Even Reddit, facing criticism of its own, is taking steps to prune that wild thicket of opinion into something more contained. Earlier this month, I spoke with Reddit co-founder Alexis Ohanian about an analysis that purported to show Reddit proved “Godwin’s Law” — the anecdotal but deeply enshrined belief that all Internet discussions will eventually devolve into comparisons to Hitler and the Nazis.

Apart from noting that Reddit’s own independent analysis showed that “Tom Brady” was referenced as many times as “Adolf Hitler” across Reddit’s threads, Ohanian said his company walks a fine line. It’s never been so easy to see what people are talking about around the world at any given moment — and, on the flip side, so difficult to parse the signal from the noise. “We take this position we have as leaders in the community very seriously, and are devoted to grow as a platform that promotes free expression and is a worldwide community,” Ohanian said.

On a smaller scale, that’s what Facebook aimed to do with Trending Topics — police a worldwide conversation. It didn’t aim to become a gatekeeper for the news; that’s a side effect. We, as users, might also recognize that it’s grappling with something it was never really designed to do.

None of this is to say Facebook doesn’t deserve some of the criticism it’s getting now. It saw how users were using the site, capitalized on those behaviors and pursued a larger role in media. And it’s not stopping — Facebook wants to be everything from our personal assistant to our customer service representative. In that ambitious expansion, Facebook risks more challenges like this one, and more responsibilities: every new feature seems to bring the tech giant into potential competition with a new industry.

So yes, Facebook should learn some things. Overall, this controversy is a good thing: it can be used to improve this particular product, and perhaps to give Facebook pause before it wades into its next arena.

But it should also reflect something back to us about how we use the network, what we want out of it and what it can reasonably deliver.