During peak hours, stock traders use complex algorithms to generate millions of messages per second about what they’re buying, selling or canceling.

But a spate of high-profile technology glitches has proved that these algorithms don’t always interact without a hitch, and regulators are exploring whether they should force market players to test their technologies before unleashing them on the world.

On Wednesday, Nasdaq plans to unveil a product that it hopes will show regulators that markets can police themselves. Starting early next year, the system should allow traders to replay and interact with historical market data so they can test their algorithms in real-world conditions.
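The core idea behind such a system can be sketched in a few lines: recorded market events are streamed, in order, to a trading algorithm so its behavior can be observed against real historical conditions. This is an illustrative sketch only; the names (`Tick`, `replay`, the strategy callback) are assumptions for demonstration, not Nasdaq's or Tradeworx's actual interface.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Tick:
    timestamp: float   # seconds since the session opened
    symbol: str
    price: float

def replay(history: List[Tick], strategy: Callable[[Tick], str]) -> List[str]:
    """Stream historical ticks to the strategy in time order; collect its orders."""
    orders = []
    for tick in sorted(history, key=lambda t: t.timestamp):
        action = strategy(tick)      # e.g. "buy", "sell", or "hold"
        if action != "hold":
            orders.append(f"{action} {tick.symbol} @ {tick.price}")
    return orders

# A trivial example strategy: buy whenever the price dips below a threshold.
def dip_buyer(tick: Tick) -> str:
    return "buy" if tick.price < 100.0 else "hold"

history = [Tick(0.0, "XYZ", 101.0), Tick(1.0, "XYZ", 99.5), Tick(2.0, "XYZ", 100.5)]
print(replay(history, dip_buyer))  # → ['buy XYZ @ 99.5']
```

The value of replaying real data rather than synthetic feeds is that an algorithm is exercised against the same bursts, gaps, and price swings it would have faced live.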

Until recently, the issue was moot. The historical view was that the government did not need to intervene because firms had strong financial incentives to ensure that their systems worked properly, several market experts said.

“If one brokerage’s system failed, that firm would bear the cost of the failure,” said Charles Jones, a professor at Columbia University’s business school.

But that was before automated trading took hold and before the market grew to include 13 U.S. exchanges and many more alternative trading venues, bringing with it what Securities and Exchange Commission Chairman Mary Jo White recently described as systems that “can fail or operate in unexpected and unintended ways.”

One of those failures was the “flash crash” of May 6, 2010, when the stock market plunged nearly 1,000 points in minutes and then whipsawed back up. Last year, runaway trades linked to faulty computers at Knight Capital caused another major disruption.

The SEC reacted by approving a series of measures, including the adoption of “circuit breakers” and other mechanisms that the exchanges can use to halt trading when the price of a stock moves too far, too fast. The agency also started requiring trading firms with direct access to the exchanges to put in place controls to prevent erroneous trades.
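A single-stock circuit breaker of the kind described above can be illustrated with a rolling-window price check. This is a simplified sketch under assumed thresholds, not the SEC's actual rule: it halts trading when a stock moves more than `limit_pct` percent within `window` seconds.

```python
from collections import deque

class CircuitBreaker:
    """Halt trading if the price moves too far, too fast (illustrative only)."""

    def __init__(self, limit_pct: float = 5.0, window: float = 300.0):
        self.limit_pct = limit_pct        # maximum allowed move, in percent
        self.window = window              # rolling window, in seconds
        self.prices = deque()             # (timestamp, price) pairs in the window

    def on_trade(self, timestamp: float, price: float) -> bool:
        """Record a trade; return True if trading should be halted."""
        self.prices.append((timestamp, price))
        # Drop trades that have aged out of the rolling window.
        while self.prices and timestamp - self.prices[0][0] > self.window:
            self.prices.popleft()
        reference = self.prices[0][1]
        move = abs(price - reference) / reference * 100.0
        return move > self.limit_pct

cb = CircuitBreaker(limit_pct=5.0, window=300.0)
print(cb.on_trade(0.0, 100.0))   # False: no move yet
print(cb.on_trade(60.0, 94.0))   # True: a 6% drop within the 5-minute window
```

The design choice is that the comparison point is the oldest trade still inside the window, so a gradual drift triggers no halt while an abrupt plunge does.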

But the glitches kept coming. More recently, a software malfunction led Nasdaq to halt trading for most of the afternoon on Aug. 22. Last month, trading on the nation’s options exchanges was briefly interrupted because of problems with a software system administered by the New York Stock Exchange.

The SEC has proposed that the exchanges abide by certain minimum testing standards for their technology and coordinate their testing with trading firms.

Against that backdrop, Nasdaq is unveiling its market simulation product, developed by a New Jersey high-frequency trading firm called Tradeworx — the same firm that began streaming real-time trade data from the exchanges to SEC headquarters this year.

Manoj Narang, chief executive of Tradeworx, said it’s only a matter of time before regulators begin mandating testing requirements for the exchanges and their customers.

A Tradeworx subsidiary has been providing the simulator technology to its clients since 2009. Nasdaq’s customers should soon gain remote access to the same technology, which will be housed at Nasdaq’s data center in New Jersey.

“It’s a way for industry to take the lead in an area before regulators firm up their decisions on what they want to do next,” Narang said.

Eric Noll, an executive vice president at Nasdaq, said the product “is recognition that there needs to be a robust and efficient way for firms to test their algorithms.”