Myth: 2014 marked the beginning of the end for pro football

The terrible headlines for the National Football League began in the first month of 2014, and they came from the nation's highest office. "I would not let my son play pro football," President Obama said in an interview with the New Yorker published in January. The statement set a fitting tone.

The NFL faced continual self-made crises this year, tumult that led to predictions of a death spiral for the nation's most powerful sports league. But to believe that 2014 was the beginning of the end for professional football would be to ignore historical precedents, the NFL's staggering foothold in American culture and the continued willingness of its players to take the field.

In 1905, after a spate of football-related deaths and amid cries to ban the game, President Teddy Roosevelt convened coaches and college presidents at the White House to change the rules and resuscitate the sport. Football has survived existential threats ever since. If anything, 2014 proved the NFL immune even to the worst it can inflict on itself.

As 5,000 retired players sued the league, the NFL admitted in federal court filings that nearly one in three former players will develop cognitive problems. Baltimore Ravens running back Ray Rice knocked out his fiancée in a casino elevator, and NFL Commissioner Roger Goodell bungled Rice's disciplinary proceedings so thoroughly that a former federal judge, acting as arbitrator, vacated Rice's indefinite suspension. Video of the incident blew the controversy wide open, exposing a league more interested in polishing its image than in administering justice. As the scandal mushroomed, Minnesota Vikings star running back Adrian Peterson was charged with child abuse for, among other acts, whipping his 4-year-old son in the scrotum with a stick.

The year's scandals all but dared fans to turn away — and they watched in droves. The highest-rated network show of the year was NBC's "Sunday Night Football." The highest-rated cable show? ESPN's "Monday Night Football." According to the Fantasy Sports Trade Association, roughly 33 million people played fantasy football this year.

Health risks haven't turned off viewers, and despite Obama's sentiment, they have done little to stifle participation, either. It's true that between 2010 and 2012, participation in Pop Warner youth football plummeted 9.5 percent, believed to be the largest two-year decline in the program's history. But since 2012, participation has remained level at roughly 225,000 kids, Pop Warner spokesman Josh Pruce said.

The allure of playing football matches the appeal of watching the sport. Chris Conte, a Chicago Bears defensive back educated at the University of California, said recently that he would "rather have the experience of playing and, who knows, die 10, 15 years earlier" than not be able to play in the league.

"There is something about the game," Denver Broncos wide receiver Wes Welker, who returned to the NFL at age 33 after suffering three concussions in 10 months, told ESPN the Magazine. "There's nothing like that competitiveness. That feeling and that rush — you can't really get it anywhere else."

The American public has sided with Welker, all the way up to the White House. As Obama was conceding that he would bar his hypothetical son from football, he was watching television. The Miami Dolphins-Carolina Panthers game was on.

Adam Kilgore covers national sports.

Myth: American race relations deteriorated

It's easy to see where this impression comes from. The name "Ferguson" conjures scenes of urban unrest — Molotov cocktails, armored vehicles, looting — that we hadn't seen in this country in many years. The tragic police killings of black men there and on Staten Island galvanized a protest movement that some are now seeking to implicate in the execution-style shootings of two police officers in Brooklyn. An NBC News poll showed that 57 percent of Americans believe that race relations are "fairly bad" or "very bad," the most pessimistic reading since 1995 — after the O.J. Simpson trial.

In the grand scheme of things, though, this is all wrong, for two reasons.

First, polling by Gallup shows that while concern about race relations as the nation's most pressing problem did spike in 2014, it began to inch up — from basically zero — almost three years ago. My reading is that it was George Zimmerman's killing of Trayvon Martin, in February 2012, that prompted people to start digging trenches on both sides of the issue we're still arguing about: whether society unfairly devalues the lives of African-American boys and men.

Second, and more important, is the fact that this painful, spasmodic pattern — neglect, followed by outrage and/or panic, followed by more neglect — is the way we deal with race in this country and even make progress. Heaven help us.

I would never minimize the racism that endures in American society or the struggle that remains to wipe it out. I'm talking about not only structural inequalities that have never been properly addressed, such as the wealth gap and ingrained patterns of segregated housing, but also racist attitudes — conscious or unconscious. That said, to equate where we are now with where we were when I was growing up in Jim Crow South Carolina is laughable. If not for the triumph of the civil rights movement, I don't know where I would be, but there is no way on God's Earth I would be writing a column for the Washington Post.

Since the 1970s, race has been a secondary concern for many Americans — until something happens to focus their attention: The 1992 riots in Los Angeles after the acquittal of the police officers who beat Rodney King. The election of Obama. The arrest of Harvard professor Henry Louis Gates Jr. The racist rants of Cliven Bundy and Donald Sterling. The killings of Michael Brown and Eric Garner.

On rare occasions, as on Inauguration Day in 2009, we embrace and tell one another how far we've come. Far more often, we argue — passionately, bitterly — until we tire ourselves out or something else catches our attention. Forty years of polling data strongly suggest that the supposed worsening in race relations is just another transient spike.

Eugene Robinson is an opinion columnist.

Myth: 2014 was the year privacy died

Credit card numbers from Home Depot. Intimate celebrity shots from iCloud accounts. Thousands of confidential documents from Sony Pictures Entertainment. Nearly every month this year, it seemed, there was a fresh instance of personal information going public. It was a year of unprecedented hacks, but it was also a year of new debates about privacy and new efforts to safeguard it.

Emboldened by Edward Snowden's surveillance revelations, Apple and Google declared that they would encrypt most of their devices by default, making it impossible for the companies to unlock them for police — even in the face of search warrants. The hacking attacks on Home Depot and Target prompted credit card firms and retailers to make more-secure credit cards (and credit card terminals) available to the average shopper. And the U.S. Supreme Court ruled that police officers need warrants to search the cellphones of anyone they arrest. Even for people who do make a privacy mistake, the controversial "right to be forgotten" ruling in Europe — which allows people to ask Google, Bing and other search engines to scrub unflattering pages from their search results — shows that there's some support out there for letting people's pasts stay in the past.

We still have a long way to go to understand how privacy functions in a digital age. A recent study from the Pew Internet and American Life Project showed that more than half of Internet users think privacy policies promise that personal data will remain confidential; in truth, privacy policies often list all the ways that your information is being shared.

But the average person seems to be starting to pay more attention to how the information he or she shares online — on social media and with technology companies — is being used. Privacy has now become a selling point. The rise of ephemeral and anonymous messaging apps such as Snapchat and Whisper shows that there's a desire to share more wisely. (Though you shouldn't expect anything on the Internet to be truly private.)

In a separate survey, Pew also found that more people than ever are taking active steps to protect their privacy online. A whopping 86 percent of Internet users said they've done things such as clearing their browsing histories or deleting cookies, the small tracking files the advertising industry uses to glean information about them. (When was the last time you heard about 86 percent of people doing anything?) Even more encouraging, most teens say they take steps to lock down certain kinds of online information.

And there's growing support for keeping some experiences offline altogether. Chef Spike Mendelsohn told the Washington City Paper that photos are not allowed in his D.C. speakeasy, the Sheppard, out of respect for others' privacy.

Hayley Tsukayama is a technology reporter.

Myth: 2014 was the year the economy came roaring back

There was something different in the way President Obama talked about the economy in one of his weekly radio addresses this month. Something about the way he talked up "the strongest year for job growth since the 1990s." Some hint of — could it be? — swagger. "The six years since the financial crisis have demanded hard work and sacrifice on everyone's part," Obama said. "But as a country, we have every right to be proud of what we've got to show for it."

It's certainly true that 2014 has been a relatively good year for the economy, especially lately. Gross domestic product grew particularly fast in the third quarter — at a 5 percent clip, the strongest in more than a decade — and employers added more than 300,000 jobs in November. But the key word is "relatively." The economy looked pretty good this year largely because it has looked so bad in the recent past.

If you consider economic growth going back to the 1980s, you'll see that growth in the past few years has been steady — but slower than it was in the music-video era. The boom years of the 1980s and '90s saw annual GDP growth above 4 percent. Annual growth has yet to crack 3 percent in this recovery, and 2014 probably won't be an exception once all the numbers are in.

This year probably wasn't a big economic springboard for future expansion, either. Forecasters generally expect U.S. growth to hover around 3 percent for the next couple of years. But there are reasons — all related to government and central bank policy — to fear that's overly optimistic. The economy wasn't slowed by any government shutdowns, tax increases or big spending cuts coming out of Washington in 2014. That could change next year as Obama spars with Republicans, who will be in control of both the House and the Senate. The Federal Reserve appears set to begin raising interest rates in 2015 after years of keeping them near zero and running a variety of efforts to juice the recovery. That could boost an already strengthening dollar, which would hurt exports — and growth.

And then there's the matter of wages: for most folks, they haven't gone up in a long time. At the end of this year, economists were grasping at data points suggesting that broadly shared salary increases could finally be around the corner. But they haven't appeared yet, a fact that even Obama more or less acknowledged in his crowing address. There's a connection here. The typical worker's paycheck buys no more today, in real terms, than it did in 1966. Until a lot more people's paychecks get a lot fatter, it's going to be hard to convince Americans that, as their president put it, their economic "resurgence is real."

Jim Tankersley is the editor of Storyline.

Myth: 2014 was the year the Arab Spring ended

In May 2011, President Obama delivered a speech hailing the "extraordinary change taking place" in the Arab world. In Egypt, Tunisia, Yemen, Libya and even Syria, ordinary people were taking to the streets and calling for democracy. In some places, entrenched autocrats had fallen. "After … the first shout," Obama said, quoting a young Syrian, "you feel dignity."

A few months later, a senior White House official even said he believed that the departure of Syrian President Bashar Assad was "inevitable."

But in 2014, the United States started bombing extremist militants fighting the Assad regime. The bloom of the Arab Spring had already faded; this year, it seemed, brought the unforgiving Arab Winter.

In Libya and Yemen, civil wars and political violence made some long for the stability of tyrants. In Egypt, Abdel Fatah al-Sissi, a former army chief, went from architect of a coup to president. He has consolidated power with Orwellian ruthlessness: Journalists have been jailed; Islamists banned. A popular TV satirist faces millions of dollars in fines after a court ruled that the "nation doesn't need satirical shows."

Meanwhile, the coalition of Arab countries assembled by the Obama administration to fight the Islamic State extremists also happens to be the bulwark of Arab authoritarianism: Sunni kingdoms, led by Saudi Arabia, that watched in horror in 2011 as uprisings unseated friendly dictators nearby.

So it's easy to see why many, including some in Washington, have come to regard the Arab Spring as an exuberant mistake, one that allowed political Islam to gain sway and muddled U.S. strategic interests in the region.

In one country, though, the Arab Spring survives. Tunisia, where the uprisings began, has experienced a fitful transition, but democracy has endured. Though often at odds, secular parties and Islamists alike have accepted the results of this month's presidential runoff.

Tunisia is not the exception but the blueprint — a country with a relatively robust civil society and political parties that have recognized the value of an inclusive system.

The gains are fragile, but that should surprise no one: building democracy after almost half a century of authoritarian rule is not easy. In Latin America, it took decades.

The desire for dignity and representation voiced in 2011 across the Arab world has not disappeared. Obama was right to embrace that spirit of change. It's just going to take a while.

Ishaan Tharoor covers foreign affairs for the WorldViews blog.