“THESE ARE not my words,” activist Ady Barkan tweeted to Rep. Steve Scalise (R-La.) on Sunday. “I have lost my ability to speak, but not my agency or thoughts. You and your team have doctored my words for your own political gain.” He was right.

The minority whip shared a now-deleted video on Twitter that purported to show Democratic presidential nominee Joe Biden voicing his support for defunding the police — a position the candidate doesn’t hold, though his opponents have spent millions of dollars pretending that he does. “But do we agree that we can redirect some of the funding?” Mr. Barkan had asked Mr. Biden after the latter laid out a series of nuanced reforms. The montage Mr. Scalise posted appended language from earlier in the interview to the end of this question: “for police.” In that light, Mr. Biden’s answer of “yes, absolutely” reads as a much more direct endorsement of a proposal from which he actually has taken care to distance himself.

Manipulating media to manipulate voters is dishonest enough, but this case is especially offensive: Mr. Barkan has amyotrophic lateral sclerosis. He speaks using a computerized voice that lends itself to the sort of splicing editors must engage in to put words in someone’s mouth. A spokeswoman for Mr. Scalise told The Post that the video had been “condensed . . . to the essence of what he was asking,” and that the congressman’s team believes Mr. Biden’s “position and answer is clear regardless.” But if something is clear without artifice, doesn’t altering reality only make a muddle? Misrepresenting Mr. Biden’s position with hyperbole is one thing; misrepresenting it by showing him answering a question that was never asked is another. Mr. Scalise gave some ground Monday morning, when he said on “Fox & Friends,” “Look, it shouldn’t have been edited.”

The uproar over “deepfakes” often obscures the threat of “cheap fakes” like this one, which are easier both to manufacture and to defend. White House social media director Dan Scavino underscored the point on Monday, when he shared a video edited to falsely show Mr. Biden falling asleep during a televised interview. Twitter, to its credit, labeled both smears as manipulated media under a policy rolled out early this year. Facebook introduced rules on the same subject around the same time.

Observers have wondered where these sites would draw the tricky lines between editing for concision or dramatic effect and editing deliberately to deceive; this week gives us an example of a company coming down on the right side. But the events also portend a blizzard of deliberate deception to come. Our politicians should have an interest in preserving truth in public debate. Yet when they don’t, platforms must be ready to keep them honest.