
Mitch Daniels, a Post contributing columnist, is president of Purdue University and a former governor of Indiana.

Isn’t technology wonderful? At Purdue University, the same IT infrastructure that enables us to manage student assignments and grades, operate residential and dining facilities, and support a leading community of scientific researchers produces as a byproduct a massive amount of fascinating information. We know where each student is anytime — which is virtually all the time — their mobile devices are connected to our WiFi network. When they enter their dorm, or dining court, or recreational facility, they swipe in, and a machine captures the time and place. Whether they’re in class or in their rooms, a machine knows when they’re online and where they’re going while there. Forget that old ominous line, “We know where you live.” These days, it’s, “We know where you are.”

University people are curious by nature, and much of today’s “Big Data” era was born at our school. So it’s only natural that we would want to delve into this treasure trove of information in search of illuminating patterns — especially those that might prove helpful to those same students, whose academic success is the heart of our mission.


Does the data say that too many days away from campus, or too many absences from class, or too much in-class browsing of websites unrelated to the course, or too few visits to the gym, correlates with lower grades? Does eating meals with the same people day after day appear to help scholastic performance? If so, shouldn’t we bring this to the students’ attention, for their own good?

For the past two years, virtually every entering Purdue freshman (there is an opt-out option that few exercise) has been given a mobile application through which the university sends them personalized information about ways to improve their chances of academic success.

So far, the information employed and the ways we’re using it have not seemed at all problematic. “Is that combination of courses you chose a historically tough one? Here’s where you can find a tutor.” “Did you know that students who wait as long as you did to sign up for courses are more likely to struggle? The registrar’s office opens at 8 a.m. tomorrow.”

But that’s today. With the best of motives, schools like ours will feel the urge to use more and more personal data, through more and more insistent tactics, all in the “best interest of the students.”

Much has been written in recent years about the possibilities for "nudging" people to do things they otherwise might not choose to do. In particular, would-be social engineers who have run into trouble, and often political backlash, when trying to order people to change their behavior have looked longingly at more subtle means of influence.

So when does "nudge" come to "shove"? We don't have to theorize about the far end of that spectrum. The future is now, in the form of China's new "social credit system," already in effect for volunteers and becoming mandatory in 2020. Citizens conforming to governmentally approved behaviors will earn a high numerical rating; nonconformists can expect unhappy consequences. Those with high scores will enjoy a multitude of preferences, ranging from VIP hotel rooms and air travel to better schools for their children. Paying your bills, or spending money on work clothes instead of video games, will be worth points. But to protect your high rating, be sure to say nice things, and never, never skeptical things, about the government in your digital life, and avoid being linked to others who do. All of this is designed to deliver, in author Rachel Botsman's apt coinage, "gamified obedience."

This subject needs a highly public examination sooner rather than later. The data is here, it is in fact "Big," and it calls out to the sincerely curious to be analyzed and utilized for good, in all the institutions suddenly knowledgeable about the previously most private aspects of our daily lives.

Somewhere between connecting a struggling student with a tutor and penalizing for life a person insufficiently enthusiastic about a reigning regime, judgment calls will be required and lines of self-restraint drawn. People serene in their assurance that they know what is best for others will have to stop and ask themselves, or be asked by the rest of us, on what authority they became the Nudgers and the Great Approvers. Many of us will have to stop and ask whether our good intentions are carrying us past boundaries where privacy and individual autonomy should still prevail.