Facebook used to tell its programmers to "move fast and break things."

Who knew that one of those things could be the integrity of elections?

Yet we're now learning that information about 50 million Facebook users was misused by Cambridge Analytica, a political consultancy with ties to President Donald Trump that specializes in manipulating voter opinions.

As reported in the New York Times and the U.K.'s Guardian, information about millions of Facebook users ended up at the firm, which uses such data to psychologically profile voters and influence them with pinpointed messages.

Steve Bannon oversaw the firm's Facebook data-gathering before leading Trump's campaign and serving as chief White House strategist. Among other things, Cambridge Analytica took credit for spreading the campaign's "Crooked Hillary" theme.

The firm used data that a researcher collected with Facebook surveys, then inappropriately disclosed. The firm's influence on the 2016 election is unclear, but the situation shows how polluted the river of information has become on social media, where two-thirds of Americans now get news.

Obviously, this should prompt more aggressive regulation of Facebook's core business, which is amassing user data to precisely target advertising. Stronger rules and penalties are needed for companies that leak data, effectively force users to share personal information and have weak controls over how their data is used.

This is not just a technical privacy issue, though. It's also a civics problem, which is harder to solve.

Facebook's targeting system is used by political campaigns and foreign governments seeking to disrupt elections and sow discord. Facebook is stepping up enforcement but must be more transparent in reporting such activity and how it screens malicious messages.

Still, the onus is ultimately on the public to be critical consumers of news, so that it is less vulnerable to the psychological manipulation enabled by companies like Facebook.

Facebook's debacle should also prompt a discussion about how to strengthen democracy to withstand the dangers of a few giant, opaque companies becoming arbiters of knowledge of current affairs, with enormous influence over voter opinions.

Our government and election rules adapted to the emergence of radio, TV and the internet. They should be resilient enough to withstand the rise of megacompanies using secret algorithms to manipulate the flow of information.

Facebook CEO Mark Zuckerberg belatedly admitted mistakes were made and vowed to better protect user data. The company did crack down on the Cambridge Analytica leak several years ago.

Even so, the company's repeated failures to uphold privacy promises and shortcomings in controlling data it shares demand a stronger response.

Members of Congress have rightly called for hearings. The Federal Trade Commission is looking at whether Facebook violated a 2011 consent decree that was supposed to prevent privacy violations.

This is an opportunity for U.S. regulators to catch up to peers in Europe, who have responded more assertively to privacy, taxation and antitrust concerns involving U.S. tech giants.

Penalties and more privacy safeguards are in order.

FROM AN EDITORIAL IN THE SEATTLE TIMES