Manipulating media to manipulate voters is dishonest enough, but this case is especially offensive: Mr. Barkan has amyotrophic lateral sclerosis. He speaks using a computerized voice that lends itself to the sort of splicing editors must engage in to put words in someone’s mouth. A spokeswoman for Mr. Scalise told The Post that the video had been “condensed . . . to the essence of what he was asking,” and that the congressman’s team believes Mr. Biden’s “position and answer is clear regardless.” But if something is clear without artifice, doesn’t altering reality only make a muddle? Misrepresenting Mr. Biden’s position with hyperbole is one thing; misrepresenting it by showing him answering a question that was never asked is another. Mr. Scalise gave some ground Monday morning, when he said on “Fox & Friends,” “Look, it shouldn’t have been edited.”
The uproar over “deepfakes” often obscures the threat of “cheap fakes” like this one, which are easier both to manufacture and to defend. White House social media director Dan Scavino underscored the point on Monday, when he shared a video edited to falsely show Mr. Biden falling asleep during a televised interview. Twitter, to its credit, labeled both smears as manipulated media under a policy it rolled out early this year. Facebook introduced rules on the same subject around the same time.
Observers have wondered where these sites would draw the tricky line between editing for concision or dramatic effect and editing deliberately to deceive; this week offered an example of a company coming down on the right side of it. But the events also portend a blizzard of deliberate deception to come. Our politicians should have an interest in preserving truth in public debate. When they don’t, platforms must be ready to keep them honest.