Since the 1990s, U.S. law enforcement has expressed concern about "going dark," roughly defined as the inability to access encrypted communications or data even with a court order. Silicon Valley companies are rolling out encrypted products that allow users alone to access their data, and in the wake of the Paris and San Bernardino, Calif., terrorist attacks, law enforcement officials argue that their fears are being realized. The FBI is engaged in a public battle with Apple over access to data stored on the iPhone of one of the San Bernardino attackers and cautions that encrypted messaging apps could hinder the bureau's ability to uncover terrorism plots.

To prevent future attacks, law enforcement has urged U.S. tech giants to build "backdoors" or "front doors" into their products — essentially, the technical ability to decrypt communications pursuant to a warrant. Silicon Valley and computer scientists counter that any solution allowing someone other than the data's owner to decrypt communications amounts to a flaw that criminals and state actors could exploit, and thus weakens security for everyone. Moreover, proponents of encryption point out that numerous countries and groups have developed their own encrypted products and services, meaning anti-encryption policies would only hurt the competitiveness of U.S. companies without giving investigators access to much of the communications they seek. Despite technologists' insistence that the problem is intractable, U.S. officials maintain that a technological workaround exists and have sought to compel tech companies to help break into encrypted devices.

Over the past year, both sides have repeated their talking points, and the debate has been dominated by absolutists. Some cybersecurity experts and privacy advocates are loath to concede that "going dark" is a problem at all, while many in law enforcement are scornful of what they see as decisions motivated by business interests and remain adamant that anything less than a real-time, on-demand decryption capability is unacceptable.

It does not have to be like this. There are solutions that allow law enforcement to gather the evidence it needs without introducing encryption backdoors. Here are three worthy of consideration:

First, Congress could empower law enforcement to exploit existing security flaws in communications software to access the data it needs. Put simply, law enforcement should have the ability to hack into a suspect's smartphone or computer with a court order, such as a warrant.

It's no secret that software is riddled with security flaws. As some prominent computer security experts have argued, such lawful hacking would allow authorities to use existing vulnerabilities to obtain evidence instead of creating new backdoors. Although this would entail law enforcement adopting the same techniques as criminals, tight judicial oversight would ensure that lawful hacking is employed responsibly, much like the restrictions that already apply to wiretapping.

Second, the executive branch should explore the possibility of developing a national capacity to decrypt data for law enforcement purposes. The challenge of "going dark" affects state and local law enforcement the most: They are the least likely to have the resources and technical capabilities to decrypt data relevant to an investigation. Creating a national decryption capability, housed within the FBI and drawing upon the expertise of the National Security Agency, would provide assistance to state and local law enforcement, similar to what the FBI provides for fingerprint and biometric data.

Third, and most important, law enforcement needs to improve its tech literacy. It confronted a problem akin to "going dark" in the 1990s, when organized-crime suspects started using disposable phones that hampered wiretaps. Law enforcement adapted its procedures, and arrests and prosecutions of organized-crime suspects continued.

Running into an encrypted communication does not necessarily mean an evidence trail will go cold. Encryption can occur on a device, in transit and in cloud storage, and data encrypted at one of those stages are not necessarily encrypted at the other two. For example, Apple can access the contents of an encrypted iPhone if it has been backed up to iCloud, Apple's cloud storage service. Recognizing how and when encryption occurs, and the different security offerings of the most popular service providers, may help law enforcement access data. Better tech literacy might have averted the current Apple-FBI fight: The FBI could have obtained more information from the San Bernardino attacker's iPhone if it had not hastily ordered the county to reset his iCloud password.

These three proposals will not be fully acceptable to either technologists or law enforcement. Some in the technology community will recoil at the idea of the NSA supporting domestic law enforcement, while some in law enforcement will resent having to continually keep pace with Silicon Valley's offerings. But a one-size-fits-all solution was never likely. The debate has been going on in one form or another for more than 20 years. It's time to consider some realistic solutions.

Adam Segal is the director and Alex Grigsby is the assistant director of the digital and cyberspace policy program at the Council on Foreign Relations. Segal is the author of "The Hacked World Order." They wrote this article for the Washington Post.