By now, nearly everyone in the Western world has heard of the ongoing campaign by many law enforcement and intelligence agencies to get legislation requiring technology vendors, and in particular manufacturers of mobile devices, to offer back doors and/or off-by-default encryption. While this battle seems to be cyclical, rearing its head every few years, the latest push has come in response to the terror attacks in Paris, France and San Bernardino, CA, with the most recent development being a Federal court judge ordering Apple to produce some mechanism to defeat the encryption (or, at the very least, prevent the auto-wipe from kicking in) to aid the FBI in accessing the device.
Many sources have already discussed the legal, moral, and technical arguments both for and against such mandates. However, I’d like to explore them in a different context: specifically, what such mechanisms might mean for certifying commercial products for use in the Federal government, on both classified and unclassified networks.
The Federal government is encouraged to leverage commercial off-the-shelf (COTS) solutions wherever it can, as this reduces costs compared to contracting bespoke solutions for common problems that industry has already addressed. Of course, due to the sensitive nature of the work the government does, and of the data it possesses, it needs products to meet certain minimum security baselines. This is where Common Criteria (CC), Commercial Solutions for Classified (CSfC), the Unified Capabilities Approved Products List (UC APL), and FIPS come in.
While it is common for many commercial products to have a “FIPS mode” (owing to restrictions on approved algorithms and cipher suites), it is less common to see a dedicated “Common Criteria mode.” Many vendors find it more convenient simply to ensure that their product lines up with the Protection Profile requirements, which serve as a sort of high-level, categorical threat model. Building on these requirements, CSfC helps provide assurance that commercial products are suitably safe for use in higher-security environments.
Looking at the Security Problem Definitions and Security Objectives in the Protection Profile for Mobile Device Fundamentals, version 2.0, we see that many of the identified threats and objectives deal with exactly the issues currently being raised in the Paris and San Bernardino cases:
- (T.EAVESDROP) An attacker is positioned on a wireless communications channel or elsewhere on the network infrastructure. Attackers may monitor and gain access to data exchanged between the Mobile Device and other endpoints.
- (T.PHYSICAL) The loss or theft of the Mobile Device may give rise to loss of confidentiality of user data including credentials. These physical access threats may involve attacks which attempt to access the device through external hardware ports, through its user interface, and also through direct and possibly destructive access to its storage media. The goal of such attacks is to access data from a lost or stolen device which is not expected to return to its user.
- (O.COMMS) To address the network eavesdropping and network attack threats described in Section 3.1, concerning wireless transmission of Enterprise and user data and configuration data between the TOE and remote network entities, conformant TOEs will use a trusted communication path.
- (O.STORAGE) To address the issue of loss of confidentiality of user data in the event of loss of a Mobile Device (T.PHYSICAL), conformant TOEs will use data-at-rest protection. The TOE will be capable of encrypting data and keys stored on the device and will prevent unauthorized access to encrypted data.
- (O.AUTH) To address the issue of loss of confidentiality of user data in the event of loss of a Mobile Device (T.PHYSICAL), users are required to enter an authentication factor to the device prior to accessing protected functionality and data. … Repeated attempts by a user to authorize to the TSF will be limited or throttled to enforce a delay between unsuccessful attempts.
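The throttling behavior described in O.AUTH can be illustrated with a minimal sketch. This is not any vendor's actual implementation; the class name, policy constants, and exponential-delay scheme here are hypothetical choices for illustration:

```python
import time

MAX_FAILURES = 10   # hypothetical wipe threshold
BASE_DELAY = 1.0    # seconds; the enforced delay doubles with each failure

class LockScreen:
    """Toy model of throttled device authentication (illustrative only)."""

    def __init__(self, correct_pin):
        self._pin = correct_pin
        self._failures = 0
        self._unlock_at = 0.0  # timestamp before which attempts are rejected

    def try_unlock(self, pin, now=None):
        now = time.monotonic() if now is None else now
        if now < self._unlock_at:
            return "throttled"            # delay between attempts enforced
        if pin == self._pin:
            self._failures = 0
            return "unlocked"
        self._failures += 1
        if self._failures >= MAX_FAILURES:
            return "wiped"                # a real TOE would destroy its keys here
        # exponential backoff: 1s, 2s, 4s, ... after successive failures
        self._unlock_at = now + BASE_DELAY * 2 ** (self._failures - 1)
        return "denied"
```

The design point the PP is after: throttling makes online brute-force of a short PIN impractically slow, and the wipe threshold caps the total number of guesses an attacker with the physical device ever gets.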
Among the SFRs themselves, we also find FCS_CKM_EXT.5, which deals with TSF wipe, requiring that protected data be securely erased, including destruction of the keys.
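The core of key destruction is overwriting the key material in place before the memory is released. A minimal Python sketch of the idea follows; real implementations live in kernel or firmware code and must also contend with compiler optimizations, swap, and flash wear-leveling, none of which this toy example addresses:

```python
import ctypes

def zeroize(buf: bytearray) -> None:
    """Overwrite a key buffer in place so no plaintext copy remains
    in that memory (the essence of cryptographic key destruction)."""
    ctypes.memset((ctypes.c_char * len(buf)).from_buffer(buf), 0, len(buf))

key = bytearray(b"0123456789abcdef")  # hypothetical 128-bit key
zeroize(key)
assert all(b == 0 for b in key)
```

This is also why encrypted storage makes a fast wipe possible: destroying a few hundred bits of key material renders the entire encrypted data set unrecoverable, with no need to overwrite the whole disk.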
Looking at the Data At Rest (DAR) Capability Package v2.8, IAD specifically mentions backdoors as a potential threat in the supply chain of the device:
- Threat actions include manufacturing faulty or counterfeit parts of components that can be used to disrupt system or network performance, leaving open back doors in hardware that allow attackers easy ways to attack and evade monitoring, as well as easy ways to steal data or tamper with the integrity of existing/new data
This same language is echoed in the Mobile Access CP v1.1.
The language in the CC and CSfC documents essentially identifies everything the FBI is asking for as a potential threat which must be countered in any commercial product that wants to be available for use by the government. This helps explain why the current DIRNSA, Adm. Michael Rogers, recently made statements in favor of strong cryptography.
Of course, the NSA hasn’t always taken such stances. Many people will remember the Dual_EC_DRBG fiasco from several years ago, which recently reared its ugly head again in the Juniper ScreenOS incident (CVE-2015-7756), wherein a different, malicious Q-value had been inserted into the code, with a separate bug leveraged to keep the 3DES-based post-processing from taking place. Once a backdoor is in place, it is there for everyone, no matter what color hat the person taking advantage of it is wearing. For another example, we can look back to the Athens Affair, wherein lawful-intercept mechanisms of the kind mandated for CALEA compliance were hijacked by still-unknown perpetrators to spy on high-level officials in the Greek government.
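For readers wondering why the Q-value matters, here is a rough sketch of the Shumow–Ferguson observation about Dual_EC_DRBG (notation mine). With public curve points $P$ and $Q$, each step of the generator computes

$$s_{i+1} = x(s_i P), \qquad r_i = x(s_{i+1} Q) \text{ (truncated by 16 bits)}.$$

Anyone who knows a scalar $d$ with $P = dQ$ can take a single output $r_i$, enumerate the roughly $2^{16}$ candidate points $A$ with truncated $x(A) = r_i$, and compute

$$x(dA) = x(s_{i+1}\, dQ) = x(s_{i+1} P) = s_{i+2},$$

that is, recover the generator's next internal state and predict all future output. Swapping in your own $Q$ (for which you know $d$) swaps in your own backdoor.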
In order to meet some hypothetical legislative requirement for either backdoors or weakened encryption, vendors might have to maintain split code bases, and possibly split build environments, to make sure that weakened code doesn’t accidentally make it into the “good” code base. Many vendors may choose not to go through the trouble or incur the costs, limiting the COTS options available to the government.
Additionally, assuming that vendors do play by the split rules, would that mean additional assurance activities requiring labs to verify that the backdoors are not present in the code, and that the binaries submitted for evaluation were actually built from that code? This would almost certainly require disclosing the nature of the backdoor mechanisms to the evaluation labs, widening the circle of people with knowledge of each product’s specifics, and would surely slow down the evaluation process once companies had re-architected their products and build systems to support the new mandates.
Even if we ignore the real effects on the privacy and security of private-sector users of weakening encryption or adding backdoors to systems, exposing everyone to greater risk from “cyber” crime, we are left with a logical inconsistency: the Federal government wants to be able to purchase commercial products with strong cryptographic systems, while certain parts of that same government do not want those products to be produced as-is, thereby potentially limiting their own options.
Then again, if the key-escrow back door in the MIKEY-SAKKE VoIP protocol designed for the UK government by GCHQ is any precedent, maybe the FBI is willing to follow suit, no matter the economic, social, or security consequences.