Daniel W. Drezner is a professor of international politics at the Fletcher School of Law and Diplomacy at Tufts University and a regular contributor to PostEverything.

Willis Ware, a former Rand Corp. engineer who helped build early computers in the 1940s and ’50s, poses for a photo, location not known, in 1962. (AP Photo/Rand Corp.)

I had the good fortune on Wednesday to hear economist Robert Gordon talk about his magnum opus, “The Rise and Fall of American Economic Growth.” Gordon has a somber tale to tell. He argues that U.S. economic growth ain’t what it used to be, and that ain’t gonna change over the next 25 years. This is due to myriad headwinds such as demographic slowdowns, rising inequality, fiscal constraints, and — most important — the failure of newer technologies to jumpstart economic growth the way that the Second Industrial Revolution did.


It’s his last point, about the effect of information technology on productivity, that prompts the fiercest debate. Economists are arguing over whether the visible innovations in the information sector are yielding productivity advances that go undetected in the official statistics. On the one hand, the aggregate data suggest a serious productivity slowdown over the past decade. On the other hand, Google’s chief economist, Hal Varian, insists that “there is a lack of appreciation for what’s happening in Silicon Valley, because we don’t have a good way to measure it.”

Surely, there are sectors, such as higher education, in which technological innovations can yield significant productivity gains, right? All that talk about MOOCs and flipped classrooms and the like will make a difference in productivity, yes?

As an optimist, I’ve long resisted Gordon’s argument — but this is one area where I’m beginning to suspect that he’s right and Silicon Valley is wrong.

I’ve been teaching for close to 20 years now. During that time, the IT revolution has fundamentally transformed what I do on a day-to-day basis. It is massively easier for me to access data that helps inform my classes. Broadcasting a video or audio clip to my students has become trivially easy. I’ve Skyped in as a guest lecturer for numerous colleagues. Course websites have made it far easier for me to communicate with my students, and for them to communicate with me. There is no denying that on some dimensions, technological change has made it much easier for me to do my day job.

And yet, over the past decade, I have also gone in a more Luddite direction. After a laissez-faire policy on laptops in my classrooms for my first decade of teaching, I have pretty much banned them. I knew that taking notes by hand is much, much better for learning than taking notes on a computer (the latter allows the student to transcribe without thinking; the former forces the student to cognitively process what is worth writing down and what is not), but I figured that was the student’s choice. The tipping point for me was research showing that open screens in a classroom distract not just their users but also the students sitting nearby. So I went all paternalistic and decided to eliminate them from my classroom. The effect was immediate: my students were more engaged with the material.

My lectures are pretty low-tech as well. I use videos in class on occasion, but I usually show them at the start and then move into the lecture. Otherwise the lights have to be dimmed, which is an invitation for students to tune out. Similarly, I don’t use PowerPoint for my notes, because that just invites students to transcribe the points on the slides without thinking about them.

One could argue that Skyping in as a guest lecturer, or broadcasting a superstar professor into other universities, could improve the quality of the classroom experience. But I doubt it. Speaking from experience, lecturing remotely is a radically imperfect substitute for interacting in the same physical space. A mediocre but in-the-flesh professor still provides a superior educational environment to a remote lecturer watched on a screen.

There has been one innovation over the past generation that has made my in-class teaching better: the whiteboard is a vast improvement over the blackboard. Otherwise, I have become warier of new technologies in the classroom.

Maybe this is just me being a Luddite, and, as digital natives, millennial professors will figure out how to properly exploit information technologies in the classroom. And outside of the classroom, I’m a pretty big fan of these new technologies.

But when it comes to higher education, I think Gordon is right and Varian is wrong. There are gains to be wrung from technological innovation — but they’re much more limited than Silicon Valley wants you to believe.