Electronic Notes in Theoretical Computer Science 265 (2010) 97-122

Bayesian Authentication: Quantifying Security of the Hancke-Kuhn Protocol

Dusko Pavlovic 1,2

Kestrel Institute and Oxford University

Catherine Meadows 1,3

Naval Research Laboratory


Abstract

As mobile devices pervade physical space, the familiar authentication patterns are becoming insufficient: besides entity authentication, many applications require, e.g., location authentication. Many interesting protocols have been proposed and implemented to provide such strengthened forms of authentication, but there are very few proofs that such protocols satisfy the required security properties. In some cases, the proofs can be provided in the symbolic model. More often, various physical factors invalidate the perfect cryptography assumption, and the symbolic model does not apply. In such cases, the protocol cannot be secure in an absolute logical sense, but only with a high probability. But while probabilistic reasoning is thus necessary, the analysis in the full computational model may not be warranted, since the protocol security does not depend on any computational assumptions, or on attacker's computational power, but only on some guessing chances.

We refine the Dolev-Yao algebraic method for protocol analysis by a probabilistic model of guessing, needed to analyze protocols that mix weak cryptography with physical properties of nonstandard communication channels. Applying this model, we provide a precise security proof for a proximity authentication protocol, due to Hancke and Kuhn, that uses probabilistic reasoning to achieve its goals.

Keywords: security protocol, pervasive authentication, symbolic model, Bayesian reasoning, distance bounding

1 Introduction

Two paradigms of security. Traditionally, two paradigms have been used for proving protocol security. The first one, captured by the symbolic model, commonly known as "Dolev-Yao", describes both protocol and attacker in terms of an algebraic

1 Supported by ONR.



1571-0661/$ - see front matter, Published by Elsevier B.V. doi:10.1016/j.entcs.2010.08.007

theory [14]. While this has been criticized as crude, it is often highly effective and easily automated. The other paradigm, captured by the computational model, usually relies on some notion of indistinguishability from the point of view of a computationally limited attacker [18]. Recently, a lot of research [3,32], starting with [1], has been devoted to drawing the two paradigms closer together. This strategy has generally been to rely upon crypto-algorithms that themselves satisfy strong enough definitions of security, so that, if used in the proper way, they can be treated as Dolev-Yao "black boxes".

Problem of pervasive security. However, there is an emerging class of security protocols for which it seems difficult to bring these two paradigms together. Such protocols arise in heterogeneous networks of diverse computational and communication devices, with mixed type channels between them [34]. Nowadays ubiquitous, such networks can be viewed as a realization of Doug Engelbart's visionary idea of smart space and pervasive computation [16]. The spatial aspects of computation give rise to a new family of security problems, where the standard authentication requirements need to be strengthened by proofs of spatial proximity. In some cases, it has been possible to refine symbolic methods to get stronger proofs [23,30]. But there are other cases that resist symbolic analysis. One such case is the Hancke-Kuhn distance bounding protocol [21], which we analyze in the present paper. The protocol consists of a timed challenge-response exchange in which a prover Peggy needs to convince a verifier Victor that she is in the vicinity. Peggy's rapid response to Victor's challenge is implemented using a rapidly computable function. The requirement that the function must be rapidly computable turns out to weaken it cryptographically. One of the main requirements of cryptographic strength is diffusion: for a boolean function, each bit of the output should depend on each bit of the input. But such a function is not rapidly computable. Conversely, an on-line function, which produces its output while still receiving its input, is easier to compute, but cannot be cryptographically strong. So there is a tradeoff between cryptographic strength and rapid computability. We explore this tradeoff in Sec. 5, and quantify the information leakage of on-line functions. The Hancke-Kuhn protocol is based on such a function.

Already in the original presentation [21] of their protocol, Hancke and Kuhn wrote down an estimate of the attacker's chance to guess a response bit. However, besides attempting to guess some bits of the response, the attacker may also attempt to guess the secret on which the response is based. Moreover, he may attempt his guesses directly, or make use of the responses stored from other sessions. Last but not least, he may collude with Peggy. Towards a precise security proof, the diverse strategies available to the attacker must be evaluated together, and exhaustively. This requires a formal model of protocol execution.

Bayesian security. But what model to use? The symbolic model cannot be used because the perfect cryptography assumption is not validated by the on-line function, which is the central feature of the protocol. On the other hand, the cryptographic strength and weakness of this function, and the resulting security and insecurity of the protocol, have nothing to do with any computational assumptions, or with the computational power of the adversary: they only depend on guessing chances, which cannot be essentially increased by computational power. Thus using the computational model does not contribute to the analysis of the central feature of the protocol, although it does apply to any implementation.

The most natural model for analyzing the Hancke-Kuhn protocol that we came up with extends the symbolic model by a rudimentary probabilistic theory of guessing. It retains the perfect cryptography assumption for the standard cryptographic primitives used in the protocol, in particular for the keyed hash function. In a probabilistic context, though, the perfect cryptography assumption means that the output distributions of the relevant cryptographic primitives are statistically indistinguishable from the uniform distribution. Assuming this for the hash function used in the protocol brings us close to the random oracle assumption, often used in computational analyses [4]. There is a sense in which the random oracle assumption can be construed as the probabilistic version of the perfect cryptography assumption.

In summary, we contend that the simplest model capturing the central features of the Hancke-Kuhn authentication protocol must be probabilistic, but need not be computational. The probabilistic model that we propose is an extension of the symbolic theories used in our previous work [22,8,24]. On the other hand, a version of the standard computational model can be obtained as an extension of this probabilistic model (by distinguishing a submonoid of feasible functions within our monoid of randomized boolean functions). It should be noted that these logical maps between the models go in the opposite direction from those in the explorations of the computational soundness of the various fragments of the symbolic model [1,3,32]. In such explorations, the symbolic languages are mapped (interpreted) in the computational language; here, a more concrete model is mapped onto a more abstract model, which is its quotient, just like blocks of low-level code are mapped onto the expressions of a high-level programming language, or like more concrete state machines are mapped onto more abstract state machines [25,26]. It follows that anything proven about the abstract model remains valid about its more concrete implementations: e.g., the Bayesian reasoning about secrecy remains valid in the computational model — provided that the assumed randomness of the hash function can be validated. This proviso is, of course, not satisfied in practice, since cryptographic hash functions are not truly random. The task, thus, remains to strengthen or refine the reasoning so as to be able to discharge such unrealistic assumptions. This logical strategy was discussed in [22,8]. While not widely accepted in security, this is a standard approach to refinement-based software development: e.g., Euclid's algorithm is usually described assuming the ring of integers; but the assumption that there are infinitely many integers must be discharged before the algorithm is implemented in a real computer.

The space does not allow us to delve into the details of this approach, as applied to security. They will be presented elsewhere. In the present paper, we attempt to present a very special instance of this approach, where a modest probabilistic extension of the symbolic model suffices for the problem at hand — yet it leads to an essentially different reasoning framework, with Bayesian derivations instead of logical ones. The resulting technical divergence, mitigated by the conceptual guidance from the underlying simpler model, should be viewed as one of the main features of the incremental approach, pursued in the Protocol Derivation Logic (PDL) [22,8,24]. In [23], PDL was already used to analyze distance bounding protocols similar to Hancke-Kuhn's, and for reasoning about pervasive security in general. An interesting feature of the current probabilistic extension of PDL is that the concept of guards, originally developed for reasoning about secrecy [24], now provides a crucial stepping stone into our analysis of guessing chances, and of the concrete authentication guarantees in the Hancke-Kuhn protocol in Sec. 6, as well as in the abstract view of symbolic authentication in Thm. 3.4.

Related work. As already mentioned, the closest relative of the PDL formalism, underlying this work, and briefly summarized in Sec. 3, is PCL [15,11,10]. Both formalisms owe a lot to strand spaces [17], in spirit, and in execution models, although the logical methods diverge. Our probabilistic extension of PDL is predated by the probabilistic extension of PCL in [12], and by the probabilistic extension of strand spaces in [20]. But each of the three probabilistic approaches has a different intent, and a completely different implementation, conceptually and technically. It would be interesting to explore these differences more closely, as some tasks may yield to combined modeling methods.

Paper outline. The paper continues with a review of distance bounding authentication, and a description of the Hancke-Kuhn protocol. In Sec. 3 we provide a brief overview of the derivational method of protocol analysis, and of PDL. We also recall the algebraic notions of derivability and guards, originally used for derivational analyses of secrecy, and here adapted for authenticity. The probabilistic versions of these notions are introduced in Sec. 4, and then used to model guessing. The gathered tools are then put to use. In Sec. 5, we analyze the information leakage of on-line functions in general, and characterize the Hancke-Kuhn function among them. In Sec. 6, we quantify the authentication achieved in the Hancke-Kuhn protocol. Sec. 7 closes the paper with a summary of the results and a discussion of the extensions. All proofs are in the Appendix.

2 The Hancke-Kuhn protocol

2.1 Background

In a man-in-the-middle attack on a challenge-response protocol, the attacker relays messages, sometimes modified, between the legitimate participants. If resending a message takes time, the legitimate participants may observe slower traffic. This has been proposed as a method to prevent man-in-the-middle attacks. In particular, the challenger can measure the presumed round trip of his challenge and of the responder's response, and compute a maximal distance of the responder, assuming an upper bound on the message velocity. This can assure the authenticity of the response, if it is known that the attacker cannot be too close. This is the idea of distance bounding [13,5]. The early security analyses of distance bounding protocols go back to the early 1990s [6]. The interest in this type of authentication re-emerged recently, with the task of device pairing and a genuine need for proximity authentication in pervasive networks. From the outset, the basic idea of distance bounding was to combine some cryptographic authentication tools, such as hashes or signatures, with a physical constraint, such as the limited speed of message exchange. Most distance bounding protocols [6,7,23] implement this combination by using two channel types: the standard network channels for the cryptographic authentication, and the timed channels for the rapid response. The Hancke-Kuhn protocol [21] stands out by its simplicity, and by the fact that both cryptographic data and the rapid response are sent on the timed channel. This, however, comes at the price of information leakage, which makes the security analysis interesting.

2.2 The protocol

As mentioned before, the goal of the Hancke-Kuhn protocol is that the prover Peggy proves to the verifier Victor that she is nearby. It is assumed that Peggy and Victor share a long term secret s, and a public hash function H. The relevant security requirement from H will turn out to be a version of the range preimage resistance [29]. The simplest way to present a protocol session is to view it in two stages.

In the first stage, Peggy and Victor exchange values a and b, which can be predictable for the attacker, but must never be reused by Peggy and Victor in more than one protocol session. The values a and b can thus be viewed as counters.



Fig. 1. Hancke-Kuhn protocol: Second Stage

In the second stage, Peggy and Victor both form the hash h = H(s :: a :: b) and proceed with the exchange on Fig. 1. If Victor's challenge x = (x_i) ∈ Z_2^ℓ is a bitstring of length ℓ, then the hash h should be 2ℓ bits long, which we view as a concatenation h = h^(0) :: h^(1) ∈ Z_2^{2ℓ} of two strings of ℓ bits each. The function ⊞ : Z_2^ℓ × Z_2^{2ℓ} → Z_2^ℓ is defined bitwise for i = 1, 2, ..., ℓ by

(x ⊞ h)_i = h^(x_i)_i     (1)

To summarize Fig. 1,

• Victor generates a random bitstring x of length ℓ, and sends each bit x_i of x, recording its sending time.

• To each bit x_i Peggy responds with h^(0)_i if x_i = 0, and with h^(1)_i if x_i = 1.

• Victor records the time at which he receives Peggy's i-th response bit. He knows h as well, and can check that these responses are correct. If only he and Peggy know h, then the responder must be Peggy. He then uses the times between sending the challenges and receiving the responses, together with the velocity of the message signal, to compute his distance from Peggy.
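As a sanity check, the second-stage exchange is easy to prototype. The sketch below is ours, not the paper's: SHA-256 stands in for the shared hash H, the secret and counters are byte strings, and bits are modeled as characters.

```python
import hashlib

def hk_hash(s: bytes, a: bytes, b: bytes, ell: int) -> str:
    """Stand-in for h = H(s :: a :: b), truncated to 2*ell bits."""
    digest = hashlib.sha256(s + a + b).digest()
    bits = ''.join(f'{byte:08b}' for byte in digest)
    return bits[:2 * ell]

def respond(h: str, x: str) -> str:
    """Peggy's rapid response: bit i of the reply is h^(x_i)_i."""
    ell = len(x)
    h0, h1 = h[:ell], h[ell:]              # h = h^(0) :: h^(1)
    return ''.join(h1[i] if x[i] == '1' else h0[i] for i in range(ell))

# One session: s is the long-term secret, a and b are single-use counters.
s, a, b, ell = b'shared-secret', b'a1', b'b1', 16
h = hk_hash(s, a, b, ell)
x = '1010110011100011'                     # Victor's random challenge
reply = respond(h, x)
assert reply == respond(h, x)              # Victor recomputes h and checks the reply
```

Note that respond reads each challenge bit independently, which is what makes the reply computable on-line, bit by bit.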

2.3 Discussion

Leaking information to the attacker. The crucial component of the protocol is the Hancke-Kuhn function ⊞. Its main feature is that it is rapidly computable, as efficiently as the exclusive or ⊕. It is thus as suitable for timed authentication as ⊕, but it also leaks information, although less than ⊕: while x and x ⊕ g allow extracting g, because g = x ⊕ (x ⊕ g), x and x ⊞ h allow extracting only half of the bits of h. However, it is easy to see from (1) that from x, x ⊞ h, and moreover (¬x) ⊞ h, where ¬x is the bitwise complement of x, the attacker can extract all of h. That is why Peggy and Victor must not reuse their counters. If h = H(s :: a :: b) can be used in two responses, then an attacker can challenge Peggy twice, first with x and then with ¬x, and thus get x ⊞ h and (¬x) ⊞ h as the two responses. From this, he can extract h and impersonate Peggy to Victor. Even if the counters are never reused, the fact that half of the response bits can be acquired by an attacker needs to be carefully examined, and his chances to guess the rest evaluated.
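The extraction attack described above can be checked mechanically. The helper extract_h below is our illustration: given the responses to x and to its complement ¬x, it reassembles every bit of h.

```python
def respond(h: str, x: str) -> str:
    """Peggy's response: bit i is h^(x_i)_i, with h = h^(0) :: h^(1)."""
    ell = len(x)
    h0, h1 = h[:ell], h[ell:]
    return ''.join(h1[i] if x[i] == '1' else h0[i] for i in range(ell))

def extract_h(x: str, r1: str, r2: str) -> str:
    """Recover all of h from r1 = x (+) h and r2 = (not x) (+) h."""
    ell = len(x)
    h0, h1 = ['?'] * ell, ['?'] * ell
    for i, (xi, b1, b2) in enumerate(zip(x, r1, r2)):
        if xi == '1':
            h1[i], h0[i] = b1, b2          # r1 saw h^(1)_i, r2 saw h^(0)_i
        else:
            h0[i], h1[i] = b1, b2
    return ''.join(h0) + ''.join(h1)

h = '0110100111001010'                     # a 2*ell-bit hash, ell = 8
x = '10110010'
xc = ''.join('1' if c == '0' else '0' for c in x)   # the complement challenge
assert extract_h(x, respond(h, x), respond(h, xc)) == h
```

This is exactly why a counter pair (a, b), and hence a hash h, must never serve two sessions.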

Overlooked assumption. Hancke and Kuhn's estimate that the probability that an attacker may succeed in impersonating Peggy is (1/2)^{|x|} relies on the implicit assumption that |x| < |s|. Otherwise, if |x| > |s|, the attacker has better odds to guess s than x. In practice, of course, the assumption |x| < |s| is usually satisfied, because the secret s is usually at least 256 bits long, while the challenge x may be shorter. Strictly speaking, though, the impression that the protocol's security only depends on the length of the challenge x is not correct, since a short secret s would make it vulnerable.

Dishonest prover and the kernel. Another interesting weakness is that the value of Peggy's i-th response bit (x ⊞ h)_i does not depend on x_i if h^(0)_i = h^(1)_i. A dishonest Peggy can thus analyze the hash h and respond without waiting for x_i whenever h^(0)_i = h^(1)_i. If the response time is averaged, she is likely to appear closer to Victor than she really is.

Since Victor's counter b is predictable, Peggy can attempt to choose her own counter a to maximize the size of the kernel K_h of h = H(s :: a :: b), defined

K_h = {i ≤ ℓ | h^(0)_i = h^(1)_i}     (2)

The larger the kernel, the closer Peggy can appear to Victor. However, the problem of finding a value a such that, for a fixed s and b, the image H(s :: a :: b) has a desired property is a version of the range preimage problem [29]. The assumption that H is a hash function, and in particular that it is a one-way function, implies that dishonest Peggy's advantage in finding a preimage a such that H(s :: a :: b), given s and b, falls within a desired range of strings with a large kernel, is negligible. This means that the dishonest prover's manipulation of the kernel is infeasible.
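The kernel of eq. (2) is directly computable from h. The sketch below (ours, reusing the character-string encoding) marks the positions where a dishonest Peggy could answer before x_i arrives.

```python
def kernel(h: str) -> set:
    """K_h = { i | h^(0)_i = h^(1)_i }: positions where the reply ignores x_i."""
    ell = len(h) // 2
    h0, h1 = h[:ell], h[ell:]
    return {i for i in range(ell) if h0[i] == h1[i]}

# h^(0) = '0110' and h^(1) = '0011' agree at positions 0 and 2:
assert kernel('01100011') == {0, 2}
```

For a truly random h, each position lands in the kernel with probability 1/2, so the expected kernel size is ℓ/2; the one-wayness of H is what prevents Peggy from doing better than chance.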

Further ad hoc observations get more complicated, without providing any definite assurances. This demonstrates the need for a rigorous analysis within a formal model.

Modeling the essence of the Hancke-Kuhn protocol. The assumption that H is a one-way function will turn out to be the only point where the security of the Hancke-Kuhn protocol depends on computation. All other attack strategies only involve guessing chances. To show this, in the following sections we introduce a probabilistic (Bayesian) protocol model, which strictly extends the standard algebraic (symbolic) model, and is a strict fragment of the standard computational model. The hash H is modeled as a randomized function, as defined in Sec. 4. The perfect cryptography assumption of the symbolic model lifts in our Bayesian model to the assumption that the hashes are truly random, which is, of course, analogous to the random oracle assumption in the computational model. It allows us to abstract away the generic and negligible vulnerabilities, and to focus on the interesting aspects of the security of the Hancke-Kuhn protocol, achieved in spite of the cryptographic weakness of the ⊞ function as its central feature.

3 Algebraic protocol models

We analyze the Hancke-Kuhn protocol by the derivational method. The varied versions of this method have been applied to many protocols [15,22,8,11,10]. While the algebraic protocol model suffices in most cases, the Hancke-Kuhn protocol requires an evaluation of guessing chances. We attempt to find a simple model that will allow this.

3.1 Message algebras

In the Dolev-Yao protocol model, messages are represented as terms of a free algebra of encryption and decryption operations [14]. More general algebraic models allow additional operations, and additional equations [9]. Recall that an algebraic theory is a pair (O,E), where O is a set of finitary operations (given as symbols with arities), and E a set of well-formed equations (i.e. where each operation has a correct number of arguments) [19].

Definition 3.1 An algebraic theory T = (O, E) is called a message theory if O includes a binary pairing operation ⟨−, −⟩, and the unary operations π1 and π2 such that E contains the equations π1⟨u, v⟩ = u, π2⟨u, v⟩ = v, and ⟨⟨x, y⟩, z⟩ = ⟨x, ⟨y, z⟩⟩. A message algebra is a polynomial extension T[X] of a T-algebra T.

Remarks. The third equation implies that there is a unique n-tupling operation for every n. The first two imply that the components of any tuple can be recovered. A polynomial extension T[X] is the free T-algebra generated by adjoining a set of indeterminates X to a T-algebra T [19, §8]. The elements x,y,z ... of X are used to represent nonces and other randomly generated values. This is justified by the fact

that indeterminates can be consistently renamed: nothing changes if we permute them. That is just the property required from the random values generated in a run of a protocol. 4

3.2 Protocol models

There are several protocol modeling formalisms that can be used for protocol derivations. The process calculus in [15,11] was designed specifically for this purpose. Strand spaces [17] were designed for a different purpose, but they can be adapted for protocol derivations too. In [22,8,24] we used partially ordered multisets (pomsets) of actions [27], which allow simple tool support [2]. We stick with this approach, but the subtle (or in some cases not so subtle) differences between these approaches are of no consequence here. For completeness, we provide a brief overview. For more detail, the reader may want to consult some of the mentioned references.

In all cases, the set of actions A is generated over the message algebra T[X] by a grammar allowing each term t ∈ T[X] to be sent in the action ⟨t⟩ ∈ A, and received in the action (t) ∈ A. Moreover, an indeterminate x ∈ X can be introduced into a protocol by the binding action (νx) ∈ A, which is read as "generate fresh x".


[Figure: Victor performs the actions ⟨c_VP x⟩ and (r_VP x); Peggy performs the actions ((c_VP x)) and ((r_VP x))]

Fig. 2. CR template

Fig. 2 shows the abstract challenge-response protocol template, where the verifier Victor authenticates the prover Peggy. It is assumed that only Peggy is able to transform the fresh challenge c_VP x into the response r_VP x. This assumption is construed as a constraint on the operations c_VP and r_VP. The actions ⟨⟨t⟩⟩ and ((t)) are syntactic sugar for "send (resp. receive) a message from which anyone can extract t".

3.3 Views, derivability and guards

As usual, the communication channels are assumed to be controlled by the attacker: she observes all sent messages, and controls their delivery. However, she may not

4 Of course, this is not the only requirement imposed on nonces and random values. The other requirement is that they are known only locally, i.e. by those principals who generate them, or who receive them unencrypted. This requirement is not formalized within the algebra of messages, but by the binding rules of process calculus or actions by which the messages are sent [11,24].

be able to invert all operations, and she has no insight into the fresh or secret data of other principals. Hence the different views of the various protocol participants.

A state σ reached in a protocol execution is a lower closed pomset of actions executed up to that point, with an assignment of values to principals' local variables, which they use to store messages and their local computations. The view Γ_P of a principal P at a state σ consists of all terms that P may have observed up to σ, and all terms that she could derive from those. Formally, this last clause means that Γ_P is upper closed under the derivability relation

Γ ⊢ t ⟺ ∃ω ∈ O(n). ∃s_1, ..., s_n ∈ Γ. t = ω(s_1, ..., s_n)     (3)

where Γ ⊆ T[X] is a finite set of terms, O(n) is the set of well-formed n-ary operations in the signature O, and the equation t = ω(s_1, ..., s_n) is derivable from E.
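Relation (3) can be prototyped for a toy message theory containing only pairing and the projections π1, π2 (our minimal signature, not the paper's full algebra). Since the closure under pairing is infinite, the sketch bounds the saturation:

```python
def derives(gamma, t, max_rounds=3):
    """Check Gamma |- t by saturating gamma under pairing and projections.
    Pairs are Python 2-tuples; max_rounds bounds the (in principle infinite) closure."""
    known = set(gamma)
    for _ in range(max_rounds):
        if t in known:
            return True
        new = set()
        for u in known:
            if isinstance(u, tuple) and len(u) == 2:
                new.update(u)                  # pi1<u, v> = u,  pi2<u, v> = v
        for p in known:
            for q in known:
                new.add((p, q))                # pairing <p, q>
        if new <= known:
            break
        known |= new
    return t in known

assert derives({('n', 'k')}, 'n')              # a projection recovers the component
assert derives({'n', 'k'}, (('n', 'k'), 'n'))  # pairing builds nested tuples
assert not derives({'n'}, 'k')                 # k is not derivable from n alone
```

The bounded saturation suffices here because every interesting derivation in this toy signature is short; a real tool would search goal-directed instead.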

Authentication by challenge-response

The challenge-response protocol in Fig. 2 validates authentication if Victor is justified in drawing a global conclusion from his local observation: i.e., having observed his own actions on the left, Victor should have good reasons to conclude that Peggy must have performed her actions on the right, and that all these actions should be ordered as on the figure. Intuitively, this conclusion of Victor's can be justified by the assumptions that

(i) anyone who originated the response r_VP x had to previously receive the challenge c_VP x, which could only happen after Victor sent this challenge;

(ii) no one could produce r_VP x without knowing the secret s_VP, so it must be Peggy.

This last conclusion is based on the assumption that only Peggy knows s_VP, or only Peggy and Victor. In both cases, Victor's reasoning is the same, because he knows that he did not send r_VP x.

Using the derivability relation, these informal justifications can be refined into slightly more formal proof obligations in terms of (3), as follows. For any set of principals Π, it is required that

(i) whenever there is a derivation s ⊢ r_VP x, then there must also be a derivation s ⊢ c_VP x, for any set of terms s observed by Π in a run of CR before r_VP x is sent;

(ii) whenever there is a derivation s ⊢ r_VP x, then there must also be a derivation s, c_VP x ⊢ s_VP, for any set of terms s known to Π in a run of CR before r_VP x is sent.

This type of authentication reasoning can be formalized using the notion of guards from [24].

Definition 3.2 We say that a set of sets of terms G algebraically guards a term t with respect to a set of terms Υ, and write G guards t within Υ, if for all Γ ⊆ Υ holds

Γ ⊢ t ⟹ ∃g ∈ G. Γ ⊢ g     (4)

where Γ ⊢ g means that Γ ⊢ u holds for every u ∈ g.

Explanation. We say that, in a context Υ, G guards t if every computation path to t leads through some element of G. In other words, if Γ allows computing t, then it is "because" it allows computing some of t's guards from G.

Example. Let Υ = (DH) be the set of terms that may become known to the participants and eavesdroppers of a run of the Diffie-Hellman protocol. Then

{{x, g^y}, {y, g^x}} guards g^{xy} within (DH)

Note that g^{xy} can be derived not only from {x, g^y} and {y, g^x} but also from {g, x, y} and {g, xy}; however, neither of these sets can occur in a run of the Diffie-Hellman protocol between two honest principals, so they are not contained in the set Υ = (DH).
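For a small finite fragment of the DH context, condition (4) can be checked by brute force over all subsets. The universe and the derivability predicate below are our toy model, not part of the paper:

```python
from itertools import chain, combinations

GUARDS = [{'x', 'g^y'}, {'y', 'g^x'}]

def derives_gxy(gamma):
    """Toy derivability of g^xy: exponentiate the other party's half, or start from g, x, y."""
    return ({'x', 'g^y'} <= gamma or {'y', 'g^x'} <= gamma
            or {'g', 'x', 'y'} <= gamma or 'g^xy' in gamma)

def guards_hold(universe):
    """(4): every subset of the context that derives g^xy must contain some guard set."""
    subsets = chain.from_iterable(combinations(universe, r) for r in range(len(universe) + 1))
    return all(any(g <= set(s) for g in GUARDS)
               for s in subsets if derives_gxy(set(s)))

assert guards_hold(['g', 'g^x', 'g^y'])                # honest-run context: exponents never leak
assert guards_hold(['g', 'g^x', 'g^y', 'x'])           # even with x, every derivation hits a guard
assert not guards_hold(['g', 'g^x', 'g^y', 'x', 'y'])  # {g, x, y} derives g^xy past the guards
```

The last assertion mirrors the remark above: {g, x, y} is a derivation route that bypasses both guards, which is precisely why it must be excluded from Υ = (DH).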

Definition 3.3 Let Q be a protocol run, and A a set of actions in Q. The term context is the set

Q(A) = ⋃_{P ∈ Π} Γ_P ∪ Γ_P^A

where Π is the set of principals engaged in the run, Γ_P is the set of terms known to a principal P initially, and Γ_P^A is the set of terms known to P before any of the actions a ∈ A are executed in Q.

Using the guard relation, we can prove that the challenge-response protocol validates authentication.

Theorem 3.4 Let Q be a run of the challenge-response protocol on Fig. 2. Suppose that the functions c_VP and r_VP satisfy

{{c_VP x, s_VP}} guards r_VP x within Q(r_VP x)

where s_VP is a secret known only to Peggy (and possibly to Victor). Then Victor is justified in drawing the following global conclusion from his local observations:

V : (νx)_V ▷ ⟨c_VP x⟩_V ▷ (r_VP x)_V
⟹ (νx)_V ▷ ⟨c_VP x⟩_V ▷ ((c_VP x))_P ▷ ((r_VP x))°_P ▷ (r_VP x)_V     (cr)

where the relation a ▷ b says that action a occurs before action b, and ((m))°_P denotes the first time P sends message m after creating it.

The proof of this theorem is obtained by expanding the definition of the guard relation and analyzing the term context of the challenge-response protocol. Several examples of reasoning with this relation can be found in [24].

Comment about perfect cryptography. The algebraic guard relation is based on the assumption that a term can only be derived algebraically, using the given operations and equations. A term t thus either lies in a subalgebra generated by a set of terms Γ, or not, and we have Γ ⊢ t ∨ Γ ⊬ t. This means that the attacks on the implementation of the term t are abstracted away. In particular, we assume that it is impossible to cryptanalyze the bitstrings representing t, and to derive t by accumulating partial information about it. In other words, we assume perfect cryptography.

Moreover, we assume that the algebraic derivations s h t only use the equations specified in the given algebraic theory T = (O,E). This means that the message algebra T is assumed to be a free T-algebra, or that it is computationally unfeasible for the attacker to find any additional equations that T satisfies, not specified in the theory T, and to use them in his derivations. This is roughly the pseudo-free algebra assumption [28].

Can we apply Thm. 3.4 to the Hancke-Kuhn protocol? The Hancke-Kuhn protocol on Fig. 1 is obviously a timed version of the challenge-response template from Fig. 2, for which Thm. 3.4 provides a general security claim. If the guard condition holds, then the Theorem yields the security of the Hancke-Kuhn protocol.

In the algebraic model, the attacker at a given state either knows a term, or not. As explained in Sec. 2, the attacker on the Hancke-Kuhn protocol may always obtain half of the bits of the secret shared by Victor and Peggy by challenging her. Does this mean that the attacker gets to know the secret? If not, then the guard condition is satisfied. To apply Thm. 3.4, we should thus set up the algebraic model so that a term is known only when all of its bits are known.

However, the same security proof would also hold for a modified version of the Hancke-Kuhn protocol, e.g. where x ⊞ h = h^(0) if x = a, and x ⊞ h = h^(1) otherwise, for some fixed a ∈ Z_2^ℓ. The attacker still cannot algebraically derive the term x ⊞ h without x, because this term still depends on x. The guard condition holds, and thus the protocol is algebraically secure. In reality, though, the attacker who always responds with h^(1) will succeed with a probability greater than 1 − 2^{−ℓ}, assuming that the challenge x is drawn uniformly. The algebraic security of the Hancke-Kuhn type of protocols is thus not very realistic.
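A quick simulation (ours) makes the failure of this modified variant concrete: an attacker who ignores the challenge and always replies h^(1) wins whenever x ≠ a.

```python
import random

def modified_response(h0, h1, x, a):
    """The modified (insecure) variant: the whole reply is h^(0) iff x equals the fixed a."""
    return h0 if x == a else h1

random.seed(0)
ell, trials, wins = 8, 10_000, 0
h0, h1 = '00000000', '11111111'
a = '10101010'                               # the fixed string a in Z_2^ell
for _ in range(trials):
    x = ''.join(random.choice('01') for _ in range(ell))
    wins += modified_response(h0, h1, x, a) == h1   # the blind reply h^(1)
print(wins / trials)   # close to the expected 1 - 2^-8
```

The genuine ⊞ avoids this collapse because each response bit depends on the corresponding challenge bit individually, rather than on a single global comparison.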

4 Protocol models with guessing

In this section we propose a probabilistic refinement of the guard relation, which captures and quantifies just the partial information leaks, like the one in the Hancke-Kuhn protocol, without adding any unnecessary conceptual machinery.

4.1 Implementing and guessing messages

In order to reason about the feasibility of the algebraic operations on messages, and about guessing, we consider the implementations of the messages t ∈ T in an algebra Q of strings, which carries the structure of a message T-algebra, and moreover a monoid of randomized functions.

For concreteness, we assume that Q = Z₂* is the set of bitstrings. However, any graded free monoid would do, since the only operations that we use are concatenation and length.

4.1.1 Implementing messages

Let H be a partially ordered set. We call an infinitely increasing chain h₀ < h₁ < h₂ < ⋯ in H an H-tower. We denote by H^ω the set of towers in H. Any free monoid Q is partially ordered by the prefix relation

a ⊑ b ⟺ ∃c ∈ Q. a :: c = b

where a :: c denotes the concatenation of the strings a and c. We call Q-towers streams. They are just infinite sequences of strings, strictly extending each other: a stream is a sequence a = {a_ℓ}_{ℓ∈N} ∈ Q^N such that a_ℓ ⊑ a_{ℓ+1} for all ℓ. A stream a is called an ℓ-stream if the length of its ℓ-th element is exactly |a_ℓ| = ℓ. The set of streams through Q is denoted by Q^ω.

N can be viewed as a special case, since a natural number can be viewed as a string of 1s. The set N^ω then consists of the strictly increasing sequences of natural numbers.

Definition 4.1 Let X be a set of indeterminates. Its strength is a map |−| : X → N^ω, assigning to each indeterminate x, for each value of the security parameter ℓ ∈ N, the required length |x|_ℓ ∈ N.

An environment is a partial map η : X ⇀ Q^ω such that |η(x)_ℓ| = |x|_ℓ whenever η(x)_ℓ is defined.

An implementation of a T-algebra T is an injective T-algebra homomorphism ⟦−⟧ : T ↣ Q^ω.

An environment and an implementation induce a T-algebra homomorphism ⟦−⟧_η : T[X_η] → Q^ω, where X_η ⊆ X is the domain of definition of η. We call this homomorphism an implementation too whenever it is injective.

The implementation of the algebra T assigns a unique string to each term. By definition of the polynomial algebra T[X_η], every algebra homomorphism T → U to another algebra U, together with a function X_η → U, induces a unique algebra homomorphism T[X_η] → U.

Since any algebraic operation on Q lifts to a pointwise operation over any power Q^N, it also lifts to streams. So Q^ω is also a T-algebra, and a monoid for (elementwise) concatenation. 5

Notation. When confusion seems unlikely, we ignore the difference between the indeterminates x, y, … ∈ X and their environment values η(x), η(y), … ∈ Q^ω.

4.1.2 Randomized functions

Consider the set of partial functions

R = { f : Q × Q ⇀ Q | ∀a ∀p₁ ∀p₂. f(p₁, a)↓ ∧ f(p₂, a)↓ ⟹ |p₁| = |p₂| }

where f(p, a)↓ means that f is defined on (p, a), and |p| is the length of the bitstring p. The set R is a monoid with the composition operation

f ∘ g (p₂ :: p₁, a) = f(p₂, g(p₁, a))

5 Grading is not an algebraic operation, and it does not lift: the length of each stream is infinite.

and with the function ι(ο, a) = a as the unit, where ο denotes the empty string. We interpret the elements of R as randomized functions over Q: the first argument p represents the random seed, and the second argument a is the actual input. The output f a can then be viewed as a random variable with the probability distribution

Prob(f a = b) = #{p | f(p, a) = b} / 2^r    (5)

where r is the length of all p for which f(p, a) is defined. Leaving the seed implicit, we denote randomized functions, as presented in R, in the form f : Q → Q.
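Equation (5) can be checked by brute force: for a randomized function presented with an explicit seed, the probability of each output is just the fraction of seeds producing it. The following sketch (with an illustrative XOR-based example, not taken from the paper) enumerates all r-bit seeds.

```python
from itertools import product

def prob_output(f, a, r, b):
    """Prob(f a = b) as in (5): the fraction of the 2**r seeds p
    for which f(p, a) = b."""
    hits = sum(1 for bits in product('01', repeat=r)
               if f(''.join(bits), a) == b)
    return hits / 2 ** r

def xor_with_seed(p, a):
    """A toy randomized function: XOR the input bitstring with the seed."""
    return ''.join('1' if u != v else '0' for u, v in zip(p, a))
```

Since exactly one 3-bit seed maps a given input to a given output, every output has probability 1/8, and the probabilities over all outputs sum to 1.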

Definition 4.2 A stream of functions is a sequence f = (f_ℓ)_{ℓ∈N} ∈ R^N which is monotone, in the sense that for all streams a, p ∈ Q^ω and every ℓ ∈ N it holds that

f_ℓ(p_ℓ, a_ℓ)↓ ∧ f_{ℓ+1}(p_{ℓ+1}, a_{ℓ+1})↓ ⟹ f_ℓ(p_ℓ, a_ℓ) ⊑ f_{ℓ+1}(p_{ℓ+1}, a_{ℓ+1})

We denote the monoid of streams of functions by R^ω.

4.1.3 Indistinguishability

Surviving the flood of negligible factors. Every subterm of every term in every security protocol can in principle be guessed. Such probabilities are usually tolerably small: they are negligible functions of some security parameter ℓ. In probabilistic analyses, it is often convenient to ignore such events of negligible probability. In a protocol analysis, tracking all terms and subterms that can be guessed with a negligible probability can lead to a lengthy list, without revealing anything non-negligible. In this section, we provide an underpinning for formal probabilistic reasoning up to negligible factors.

The frequencies of events are established by repeated sampling. The number of samples needed for a reasonable estimate depends on the a priori chance that the event will occur. If this chance is 1 in n, then the number of needed samples is an increasing function of n.

When sampling a stream a = (a_ℓ)_{ℓ∈N}, we assume that a reasonable number of samples should not be greater than q(ℓ), where q is a function from a rig⁶ Q ⊆ N^N. In cryptography it is customary to take Q = N[x], the polynomials with non-negative integer coefficients. Streams are thus sampled a polynomial number of times. If the probability that the difference between a_ℓ and b_ℓ will be detected in q(ℓ) samples remains small for all ℓ, then a = (a_ℓ)_{ℓ∈N} and b = (b_ℓ)_{ℓ∈N} are considered indistinguishable. In other words, a and b are indistinguishable if the probability that a_ℓ and b_ℓ are different is less than 1/q(ℓ) for all q ∈ Q. Now we formalize this intuition.

Definition 4.3 A function ν : N → [0,1] is said to be Q-negligible if it converges to 0 faster than 1/q for every q ∈ Q, i.e.

∀q ∈ Q ∃n ∈ N ∀ℓ > n. ν(ℓ) < 1/q(ℓ)

6 A rig Q is a "ring without the negatives": it consists of two commutative monoid structures, (Q, +, 0) and (Q, •, 1), such that x • (y + z) = x • y + x • z and x • 0 = 0.

The set of Q-negligible functions is denoted by Q̃. The ordering on streams a, b ∈ [0,1]^N is defined up to negligible functions, i.e.

a ≤ b ⟺ ∃ν ∈ Q̃ ∀ℓ. a_ℓ ≤ b_ℓ + ν(ℓ)

We say that a, b ∈ [0,1]^N are Q-indistinguishable, and write a ∼ b, if a ≤ b and b ≤ a, or equivalently

a ∼ b ⟺ ∃ν ∈ Q̃ ∀ℓ. |a_ℓ − b_ℓ| ≤ ν(ℓ)

Assumption, examples. For simplicity, we take Q to be the rig N[x] of polynomials with non-negative integer coefficients, as is customary in cryptography. Then, e.g., for a = {2^{-ℓ}}_{ℓ∈N} and b = {ℓ^{-2}}_{ℓ∈N} it holds that a ∼ 0 but b ≁ 0, where 0 is viewed as the constant sequence.
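This distinction can be checked numerically on a finite tail of the streams (a heuristic check, not a proof): 2^{-ℓ} drops below every inverse polynomial on the tail, while ℓ^{-2} fails already against q(ℓ) = ℓ².

```python
def below_inverse_polys(stream, degrees=(1, 2, 3), start=64, end=256):
    """Empirically test stream(l) < 1/l**d for each degree d,
    on the finite tail l = start, ..., end - 1."""
    return all(stream(l) < 1.0 / l ** d
               for d in degrees for l in range(start, end))

negligible = lambda l: 2.0 ** -l       # like a: indistinguishable from 0
inverse_square = lambda l: l ** -2.0   # like b: fails already for d = 2
```

Here `below_inverse_polys(negligible)` holds, while `below_inverse_polys(inverse_square)` does not, since ℓ^{-2} is never strictly below ℓ^{-2} itself.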

Definition 4.4 Streams of functions f and g are indistinguishable if the sequences Prob(f a = b) and Prob(g a = b) are indistinguishable for all streams a, b ∈ Q^ω. We abbreviate

f ∼ g ⟺ ∀a, b ∈ Q^ω. Prob(f a = b) ∼ Prob(g a = b)

Definition 4.5 A flow is an equivalence class of indistinguishable streams of randomized functions. The flow monoid R is thus

R = R^ω / ∼

4.2 Probabilistic derivability

In contrast with the algebraic derivability relation from Sec. 3.3, the probabilistic derivability relation does capture partial information leaks, using the implementations of the terms. While S ⊬ Θ may hold because some t ∈ Θ is not algebraically derivable from S, it may still be easy to guess many bits of information about Θ from S. We formalize this by saying that for some stream of randomized functions f ∈ R^ω, Prob(f⟦S⟧ = ⟦Θ⟧) is high. By assumption, the messages Θ are easily decoded from their implementations ⟦Θ⟧. So if some f is likely to output ⟦Θ⟧ on the input ⟦S⟧, then the chance to derive Θ from S is high. This is what we capture by the following randomized derivability relation, which quantifies the guessing chance.

Let X(S) ⊆ X be the set of indeterminates that occur in S. Any minimal environment η in which ⟦S⟧_η is defined must be defined over X(S). Since for each ℓ the required number of bits for each x ∈ X(S) is fixed to |x|_ℓ, each η must select the same number of bits

|X(S)|_ℓ = Σ_{x∈X(S)} |x|_ℓ

So there are 2^{|X(S)|_ℓ} environments to interpret S for the security parameter ℓ. Our chance to guess Θ from S is the probability that a flow f ∈ R will output ⟦Θ⟧_η when given the input ⟦S⟧_η, for a random choice of η. Hence the following definition.

Definition 4.6 The guessing chance ⟦S ⊢ Θ⟧ is the stream of probabilities

⟦S ⊢ Θ⟧_ℓ = ⋁_f #{η | f_ℓ⟦S⟧_η = ⟦Θ⟧_η} / 2^{|X(S)|_ℓ}    (6)

viewed up to indistinguishability. We abbreviate ⟦∅ ⊢ Θ⟧ to ⟦Θ⟧.

Since the functions in the sequence (f_ℓ)_{ℓ∈N} compute on the streams ⟦S⟧_η, together they form a stream of functions f ∈ R^ω, i.e. a flow with f⟦S⟧ = ⟦Θ⟧.

Examples. For any closed term t ∈ T, i.e. such that X(t) = ∅, it holds that ⟦t⟧ = 1. To see this, note that ⟦t⟧ is given in the empty environment η₀, and thus X(t) = ∅ implies |X(t)|_ℓ = 0 for all ℓ. The supremum in (6) is reached at the constant function stream f() = ⟦t⟧, and gives ⟦t⟧ = #{η₀ | f() = ⟦t⟧}/2⁰ = 1.

On the other hand, for every x ∈ X it holds that ⟦x⟧ ∼ 0. There are exactly 2^{|x|_ℓ} environments η_x defined on x alone. To guess x without any inputs, we need a constant flow f such that f() = ⟦x⟧ = η_x(x), i.e. a constant stream of functions f_ℓ() = η_x(x)_ℓ. Whichever f we may choose, exactly one environment will give f() = η_x(x). So for every constant flow f it holds that #{η_x | f() = η_x(x)}/2^{|x|_ℓ} = 2^{-|x|_ℓ}. The supremum in (6) is thus reached for all constant f ∈ R, and ⟦x⟧_ℓ = 2^{-|x|_ℓ}. But this sequence is indistinguishable from 0, as pointed out after Def. 4.3.

4.2.1 Subbayesian reasoning and advantage

Proposition 4.7 For all sets of terms S, Γ, Θ it holds that

⟦S ⊢ Γ⟧ · ⟦S, Γ ⊢ Θ⟧ ≤ ⟦S ⊢ Γ, Θ⟧    (7)

When ⟦Γ⟧ > 0, it follows that

⟦Γ ⊢ Θ⟧ ≤ ⟦Γ, Θ⟧ / ⟦Γ⟧    (8)

The inequalities become equalities if S and Θ have no indeterminates in common.

Definition 4.8 The advantage provided by a set of terms S in computing the terms Θ is the value

Adv⟦S ⊢ Θ⟧ = ⟦S ⊢ Θ⟧ − ⟦Θ⟧

When this advantage is zero, we say that Θ is flow independent of S, and write

⟦S ⊥ Θ⟧ ⟺ Adv⟦S ⊢ Θ⟧ = 0 ⟺ ⟦S ⊢ Θ⟧ = ⟦Θ⟧

4.3 Probabilistic guards

The idea of the guard relation is that a term t is guarded by the guards from G if, whenever t is derived, at least one of the guards Γ ∈ G is also derived. In the algebraic model, this was simple enough to state by Definition 3.2. When t can be guessed, this crude statement needs to be refined: the event that t is guessed must be preceded by the event that some Γ ∈ G is guessed.

Definition 4.9 We say that a set of sets of terms G guards (against guessing) a term t with respect to a set of terms Υ, and write G guards t within Υ, if for all S ⊆ Υ such that Adv⟦S ⊢ t⟧ > 0 it holds that

⟦S ⊢ t⟧ ≤ ⋁_{Γ∈G} ⟦S ⊢ Γ⟧ · ⟦S, Γ ⊢ t⟧    (9)

Explanation. In the algebraic case, (4) captured the intuition that G guards t if all computational paths to t lead through some Γ ∈ G, assuming the context C. The above definition extends this intuition to computational paths with guessing. If we get any help from S to guess t, then that help is not greater than the help we get from S to guess some guard Γ ∈ G of t first, and then to guess t from this guard. Applied to message theories with trivial implementations (e.g. with Q = 1), Def. 4.9 boils down to Def. 3.2, in the sense that the guessing chance is then always constantly 0 or constantly 1, and (9) reduces to (4).

To simplify notation, we elide the environment subscripts from ⟦−⟧_η whenever η is inessential for the argument.

5 Partitioned functions and ⊛

In this section we analyze a class of quickly computable functions, like the one used in the Hancke-Kuhn protocol. One way to ensure that a function is quickly computable is to require that the bit dependency of its outputs from its inputs must be partitioned: the i-th block of output bits should only depend on the i-th block of input bits. Since in this section we are dealing with purely random input, our results are presented in terms of streams, not flows.

Definition 5.1 We say that a boolean function f : Z₂^m → Z₂^n is partitioned when

m = m₁ + m₂ + ⋯ + m_ℓ,  n = n₁ + n₂ + ⋯ + n_ℓ,  f = f₁ :: f₂ :: ⋯ :: f_ℓ

where the components f_i : Z₂^{m_i} → Z₂^{n_i}, for i = 1, 2, …, ℓ, are independent of the inputs and the outputs of all the other component functions, in the sense that ⟦x_ī, f_ī(x_ī) ⊥ f_i(x_i)⟧, where ī = {j ≤ ℓ | j ≠ i}.

Clearly, a boolean function receiving its input string sequentially can already return the i-th block of its outputs while still receiving the (i+1)-st block of the inputs. Unfortunately, this convenient property also decreases the cryptographic strength of the function, which requires that each bit of the output depend on each bit of the input [33]. In particular, knowing a value f(z) of a partitioned function increases the chance of guessing f(x). We make this precise in the next section.

5.1 Guessing partitioned functions

Proposition 5.2 (a) Let f be a randomized partitioned function, and let x, z ∈ Z₂^m be fixed bitstrings with a common block x_i = z_i ∈ Z₂^{m_i}. Then ⟦x, z, f(z) ⊢ f(x)⟧ ≥ 2^{n_i − n}.

(b) Let f : Z₂^ℓ → Z₂^ℓ be randomized bitwise partitioned, i.e. m_i = n_i = 1 for all i ≤ ℓ. Then ⟦x, z, f(z) ⊢ f(x)⟧ ≥ 2^{-Δ(x,z)}, where Δ(x,z) = #{i | x_i ≠ z_i} is the Hamming distance.
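Prop. 5.2(b) can be illustrated by simulation (a sketch with illustrative parameters): for a bitwise partitioned f with fresh random bit tables, the natural strategy of copying f_i(z_i) wherever x_i = z_i and flipping a coin elsewhere succeeds with frequency about 2^{-Δ(x,z)}.

```python
import random

def hamming(x, z):
    """Hamming distance: the number of positions where x and z differ."""
    return sum(a != b for a, b in zip(x, z))

def guess_rate(x, z, trials=20000, seed=1):
    """Simulate guessing f(x) from x, z and f(z), for a random bitwise
    partitioned f given by independent bit tables f_i(0), f_i(1)."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        f0 = [rng.getrandbits(1) for _ in x]      # the bits f_i(0)
        f1 = [rng.getrandbits(1) for _ in x]      # the bits f_i(1)
        fx = [f1[i] if x[i] else f0[i] for i in range(len(x))]
        # copy the known bit where x_i == z_i, flip a coin elsewhere
        guess = [(f1[i] if z[i] else f0[i]) if x[i] == z[i]
                 else rng.getrandbits(1) for i in range(len(x))]
        wins += guess == fx
    return wins / trials

x, z = [0, 1, 1, 0], [0, 1, 0, 1]     # Hamming distance 2
```

For this pair the rate hovers around 2^{-2} = 0.25.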

A consequence of Prop. 5.2 is that a proximity authentication protocol, implemented using a partitioned function R to compute the response r_{VPx} = R(s_{VP}, c_{VPx}), cannot be secure in an absolute sense, because the response may be guessed with a non-negligible probability from the other responses r_{VPz}. Moreover, it seems that the attacker can always obtain some other responses r_{VPz} by impersonating Victor and issuing challenges c_{VPz}.

Lemma 5.3 A randomized boolean function f : Z₂^ℓ → Z₂^ℓ is bitwise partitioned if and only if for every x ∈ Z₂^ℓ it holds that

f(x) = x ⊛ (f(0^ℓ) :: f(1^ℓ))    (10)

where ⊛ is the Hancke-Kuhn function (1), and 0^ℓ, 1^ℓ ∈ Z₂^ℓ are the strings of 0s and 1s, respectively.
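Lemma 5.3 can be checked exhaustively for small ℓ. The sketch below (the list-based encoding and the concrete per-bit tables are ours, purely for illustration) implements the Hancke-Kuhn function (1) and a generic bitwise partitioned function.

```python
from itertools import product

def hk(x, h):
    """Hancke-Kuhn function (1): bit i of the response is h(0)_i if x_i = 0
    and h(1)_i if x_i = 1, where h = h(0) :: h(1)."""
    ell = len(x)
    h0, h1 = h[:ell], h[ell:]
    return [h1[i] if x[i] else h0[i] for i in range(ell)]

def partitioned(x, tables):
    """A bitwise partitioned f: output bit i is tables[i][x_i]."""
    return [tables[i][b] for i, b in enumerate(x)]

tables = [(1, 0), (0, 0), (1, 1)]     # arbitrary per-bit tables f_i
ell = len(tables)
h = partitioned([0] * ell, tables) + partitioned([1] * ell, tables)
ok = all(partitioned(list(x), tables) == hk(list(x), h)
         for x in product((0, 1), repeat=ell))
# ok is True: f(x) = x ⊛ (f(0^ℓ) :: f(1^ℓ)) for every x
```

The same exhaustive check passes for any choice of the per-bit tables, which is exactly the "if" direction of the lemma.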

Bitwise partitioned functions with a minimal guessing probability can now be completely characterized: they turn out to be precisely the Hancke-Kuhn functions (1) for which the values at 0^ℓ and at 1^ℓ are independent.

Proposition 5.4 Suppose that f : Z₂^ℓ → Z₂^ℓ is a randomized bitwise partitioned function such that ⟦x ⊥ f(0^ℓ) :: f(1^ℓ)⟧. Then for fixed z, x ∈ Z₂^ℓ

⟦x, z, f(z) ⊢ f(x)⟧ = 2^{-Δ(z,x)}    (11)

if and only if for every i ≤ ℓ it holds that

⟦f_i(0) ⊥ f_i(1)⟧ and ⟦f_i(1) ⊥ f_i(0)⟧    (12)

Remark. In a sense, x ⊛ (−) : Z₂^{2ℓ} → Z₂^ℓ is thus a "one-and-a-half-way function", since x ⊛ h discloses only one half of the bits of h.

On the other hand, (−) ⊛ h : Z₂^ℓ → Z₂^ℓ is not only an example of a bitwise partitioned function, satisfying the needs of the Hancke-Kuhn protocol, but a canonical way to represent such functions.

5.2 Guessing x ⊛ h

We now consider the probability of guessing x ⊛ h given various sorts of information that may be learned in the Hancke-Kuhn protocol.

Definition 5.5 (a) For x ∈ Z₂^ℓ and I ⊆ ℓ = {0, 1, 2, …, ℓ−1} we define x^{⊙I} to be the string obtained by replacing, for all i ∈ I, the bits x_i with a "wild card" ⊙:

x^{⊙I}_j = ⊙ if j ∈ I, and x^{⊙I}_j = x_j otherwise.

(b) For h = h(0) :: h(1), where h(0), h(1) ∈ Z₂^ℓ, we define the kernel K_h to be the set of places where its first and its second half coincide, i.e.

K_h = {i ∈ ℓ | h(0)_i = h(1)_i}.

We make use of these definitions in the following.

Proposition 5.6 Suppose that h is the concatenation of two constant ℓ-bit streams, and x is a uniformly distributed ℓ-bit stream. Then

(a) ⟦h ⊢ x ⊛ h⟧_ℓ = 2^{|K_h| − ℓ}

(b) ⟦x, h ⊢ x ⊛ h⟧_ℓ = ⟦x^{⊙K_h}, h ⊢ x ⊛ h⟧_ℓ
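Prop. 5.6(a) can be verified exhaustively for a small h (the concrete h below is an arbitrary illustration): the best guess of the response from h alone succeeds with probability 2^{|K_h|−ℓ}, since the bits in K_h are forced and the remaining bits of x are fair coins.

```python
from itertools import product

def hk(x, h0, h1):
    """Hancke-Kuhn response for challenge x and token halves h0, h1."""
    return tuple(h1[i] if x[i] else h0[i] for i in range(len(x)))

def kernel(h0, h1):
    """K_h: the positions where the two halves of h agree."""
    return {i for i in range(len(h0)) if h0[i] == h1[i]}

def best_guess_prob(h0, h1):
    """Chance of guessing x ⊛ h from h alone, for uniformly random x:
    the best strategy outputs the most frequent response value."""
    ell = len(h0)
    counts = {}
    for x in product((0, 1), repeat=ell):
        r = hk(x, h0, h1)
        counts[r] = counts.get(r, 0) + 1
    return max(counts.values()) / 2 ** ell

h0, h1 = (1, 0, 1, 1), (1, 1, 0, 1)   # the halves agree at positions {0, 3}
```

Here |K_h| = 2 and ℓ = 4, so best_guess_prob(h0, h1) equals 2^{2−4} = 0.25.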

The following proposition concerns the problem of deriving x ⊛ h from z ⊛ h for some z ≠ x.

Proposition 5.7 Let h be the concatenation of two uniformly distributed ℓ-bit streams, let x be a uniformly distributed ℓ-bit stream, and let z be any ℓ-bit stream. Then the following holds:

⟦z ⊛ h ⊢ x ⊛ h⟧_ℓ = ⟦z, z ⊛ h ⊢ x ⊛ h⟧_ℓ = (3/4)^ℓ

6 Security of Hancke-Kuhn

We quantify the security of the Hancke-Kuhn protocol by evaluating Prob(crp), i.e. the probability that the sequence of events in a complete protocol run validates the following reasoning of Victor's:

V : (νx)_V ≺ t(x)_V ≺ t(x ⊛ h)_V
⟹ (νx)_V ≺ t(x)_V ≺ (x)_P ≺ (x ⊛ h)_P ≺ t(x ⊛ h)_V    (crp)

corresponding to the run on Fig. 1. In order to evaluate this probability, we analyze the probability that (crp) fails. How can it happen that Victor observes a satisfactory sequence of his own actions

V = (νx)_V ≺ t(x)_V ≺ t(x ⊛ h)_V    (13)

but that the desired run

O = t(x)_V ≺ (x)_P ≺ (x ⊛ h)_P ≺ t(x ⊛ h)_V    (14)

did not take place? There are just two possibilities:

A: the responder does not know the secret s, i.e. he is the Attacker,

E: the responder knows the secret s, i.e. he is Peggy, but the response is sent Early, without receiving the challenge.

The remaining case, that the responder is Peggy, and that she responds to the challenge, is just the event O. Thus ¬O = A ∪ E. It follows that

Prob(crp) = Prob(O|V) = 1 − Prob(A ∪ E|V) ≥ 1 − Prob(A|V) − Prob(E|V)    (15)

The (in)security of the Hancke-Kuhn protocol thus boils down to evaluating Prob(A|V) and Prob(E|V). The following lemmas and propositions show that these probabilities are negligible. The proofs are in the Appendix.

Response token. Recall that Peggy's response token h = H(s :: a :: b) is derived from the shared secret s, Peggy's counter a, and Victor's counter b, using a secure public hash function H. In this section, h abbreviates H(s :: a :: b).

Assumption 6.1 The above decomposition of ¬O as A ∪ E is valid only if h = H(s :: a :: b) is such that

• |s| ≫ |x|, i.e. the attacker's chance to guess the secret s is negligible compared with his chance to guess the challenge x;

• the counters a and b are never reused (although they may be predictable).

Otherwise, the attacker may guess h, and ¬O may not be covered by A ∪ E.

6.1 Guards in undesired runs

In order to evaluate Prob(crp), we need to determine the probability that the correct response x ⊛ h is guessed in the undesired runs A and E. Towards this goal, we explore what can be guessed in the term contexts (cf. Def. 3.3) A(x ⊛ h) and E(x). The following lemmas simplify this question.

Lemma 6.2 (a) Let A be an attack run with a long term secret s, Peggy's counter a, Victor's counter b, and the Attacker's challenge z, for which he obtains the response z ⊛ h, where h = H(s :: a :: b). Then for any S ⊆ A(x ⊛ h) it holds that

⟦S ⊢ x ⊛ h⟧ = ⟦S ∩ {s, a, b, x, z, z ⊛ h} ⊢ x ⊛ h⟧

(b) Let E be a run with a long term secret s, Peggy's counter a, Victor's counter b, and where Peggy responds early. Then for any S ⊆ E(x) it holds that

⟦S ⊢ x ⊛ h⟧ = ⟦S ∩ {s, a, b} ⊢ x ⊛ h⟧

Lemma 6.3 For h = H(s :: a :: b) and Y ⊆ {z, z ⊛ h} it holds that

⟦x ⊛ h⟧_ℓ = ⟦x, z ⊢ x ⊛ h⟧_ℓ = 2^{-ℓ}    (16)

⟦a, b, s, x^{⊙K_h} ⊢ x ⊛ h⟧ = 1    (17)

⟦a, b, s, x, Y ⊢ x ⊛ h⟧ = 1    (18)

Proposition 6.4 {{s}, {z ⊛ h}} guards x ⊛ h within A(x ⊛ h).

Proposition 6.5 {{x^{⊙K_h}}} guards x ⊛ h within E(x).

The guards displayed in the preceding propositions will now be used to evaluate Prob(V|A) and Prob(V|E), i.e. the probabilities that the authentication may fail because the Attacker breaks it, or because Peggy succeeds in responding Early.

6.2 Bounds on undesired runs

Proposition 6.4 and the definition of probabilistic guards say that, for a given challenge x, the probability that an Attacker can violate authentication is bounded above by

⟦Φ ⊢ s⟧ · ⟦Φ, s ⊢ x ⊛ h⟧  or by  ⟦Φ ⊢ z ⊛ h⟧ · ⟦Φ, z ⊛ h ⊢ x ⊛ h⟧

where Φ = {a, b, x, z}. The first quantity is clearly negligible. We must show the same for the second.

Likewise, Proposition 6.5 implies that the probability that Peggy can respond Early is bounded above by

⟦s, a, b ⊢ x^{⊙K_h}⟧ · ⟦s, a, b, x^{⊙K_h} ⊢ x ⊛ h⟧

Note that in the attack run A, the Attacker cannot learn x until after she has created z. The distribution of z is thus independent from that of x.

Proposition 6.6 Suppose that the Attacker, before receiving Victor's challenge x, can pick her own challenge z and obtain a single response z ⊛ h. Then the stream of expected probabilities Prob(V|A) that the Attacker can deceive Victor by guessing x ⊛ h is indistinguishable from the stream of probabilities p defined by

p_ℓ = Σ_{x∈Z₂^ℓ} 2^{-ℓ} ⟦x, z, z ⊛ h ⊢ x ⊛ h⟧_ℓ = (3/4)^ℓ

This means that Prob(V|A) is negligible.

Proposition 6.7 The stream of expected probabilities Prob(V|E) that Peggy can deceive Victor by guessing and sending her response before she receives the challenge is indistinguishable from the stream q defined by

q_ℓ = Σ_{h∈Z₂^{2ℓ}} Σ_{x∈Z₂^ℓ} 2^{-3ℓ} ⟦h ⊢ x ⊛ h⟧_ℓ = (3/4)^ℓ

This means that Prob(V|E) is negligible.

Note in particular that this means that in both cases the stream of probabilities is indistinguishable from zero, since the stream ((3/4)^ℓ)_{ℓ∈N} is itself indistinguishable from zero.

The final result is obtained by putting Propositions 6.6 and 6.7 together.

Theorem 6.8 Suppose that the Hancke-Kuhn protocol is realized in such a way that it satisfies Assumption 6.1, and does not always fail for trivial reasons: i.e., there are some sessions with an honest prover Peggy and an honest verifier Victor. Formally, this means that there are C, D ∈ (0,1) such that

• Prob(A), Prob(E) < C, i.e. not every response is from an Attacker, or too Early,

• Prob(V) > D, i.e. Victor sometimes observes a satisfactory run and accepts.

Then Prob(crp) is indistinguishable from 1. In other words, the Hancke-Kuhn protocol achieves authentication almost certainly.

D. Pavlovic, C. Meadows / Electronic Notes in Theoretical Computer Science 265 (2010) 97—122

7 Conclusion

We have presented a framework for extending algebraic cryptographic models to probabilistic models and used it to construct a probabilistic extension of the Protocol Derivation Logic. We have illustrated it by applying it to an analysis of the Hancke-Kuhn distance bounding protocol. We expect that it will be useful in the analysis of many other protocols that rely on weak cryptography to take advantage of non-standard communication channels.

We should also point out that the potential applications of our framework go far beyond purely probabilistic extensions. The main thing that needs to be done to make our framework applicable to computational models is to define a notion of feasibly computable functions, so that guessing probability can be defined in terms of feasible function streams instead of all possible function streams. We have defined such a notion and are currently investigating its applications to protocols. In future work, we expect to present a more general framework that can incorporate a wide range of methods of cryptographic reasoning.

Acknowledgements

We are grateful to Joshua Guttman, John Mitchell, Mike Mislove, and to several anonymous referees for careful reading of earlier versions of this paper, and for valuable suggestions towards improvements in presentation.

References

[1] M. Abadi and P. Rogaway. Reconciling two views of cryptography (the computational soundness of formal encryption). J. of Cryptology, 15(2): 103-127, 2002.

[2] Matthias Anlauff, Dusko Pavlovic, Richard Waldinger, and Stephen Westfold. Proving authentication properties in the Protocol Derivation Assistant. In Pierpaolo Degano, Ralph Kusters, and Luca Vigano, editors, Proceedings of FCS-ARSPA 2006. ACM, 2006. to appear.

[3] Michael Backes, Dennis Hofheinz, and Dominique Unruh. Cosp: a general framework for computational soundness proofs. In Ehab Al-Shaer, Somesh Jha, and Angelos D. Keromytis, editors, ACM Conference on Computer and Communications Security, pages 66-78. ACM, 2009.

[4] M. Bellare and P. Rogaway. Random oracles are practical: a paradigm for designing efficient protocols. In CCS '93: Proceedings of the 1st ACM conference on Computer and communications security, pages 62-73, New York, NY, USA, 1993. ACM.

[5] Thomas Beth and Yvo Desmedt. Identification tokens - or: Solving the chess grandmaster problem. In CRYPTO '90: Proceedings of the 10th Annual International Cryptology Conference on Advances in Cryptology, pages 169-177, London, UK, 1991. Springer-Verlag.

[6] Stefan Brands and David Chaum. Distance-bounding protocols. In EUROCRYPT '93: Workshop on the theory and application of cryptographic techniques on Advances in cryptology, pages 344-359, Secaucus, NJ, USA, 1994. Springer-Verlag New York, Inc.

[7] S. Capkun and J. P. Hubaux. Secure positioning in wireless networks. IEEE Journal on Selected Areas in Communication, 24(2), February 2006.

[8] Iliano Cervesato, Catherine Meadows, and Dusko Pavlovic. An encapsulated authentication logic for reasoning about key distribution protocols. In Joshua Guttman, editor, Proceedings of CSFW 2005, pages 48-61. IEEE, 2005.

[9] V. Cortier, S. Delaune, and P. Lafourcade. A survey of algebraic properties used in cryptographic protocols. J. Comput. Secur., 14(1):1-43, 2006.

[10] A. Datta, A. Derek, J. Mitchell, and A. Roy. Protocol composition logic (PCL). Electron. Notes Theor. Comput. Sci., 172:311-358, 2007.

[11] Anupam Datta, Ante Derek, John Mitchell, and Dusko Pavlovic. A derivation system and compositional logic for security protocols. J. of Comp. Security, 13:423-482, 2005.

[12] Anupam Datta, Ante Derek, John C. Mitchell, Vitaly Shmatikov, and Mathieu Turuani. Probabilistic polynomial-time semantics for a protocol security logic. In Luis Caires, Giuseppe F. Italiano, Luis Monteiro, Catuscia Palamidessi, and Moti Yung, editors, ICALP, volume 3580 of Lecture Notes in Computer Science, pages 16-29. Springer, 2005.

[13] Y. Desmedt. Major security problems with the "unforgeable" (Feige-)Fiat-Shamir proofs of identity and how to overcome them. In Securicom 88, 6th worldwide congress on computer and communications security and protection, pages 147-159, Paris, France, March 1988.

[14] Danny Dolev and Andrew C. Yao. On the security of public key protocols. Information Theory, IEEE Transactions on, 29(2):198-208, 1983.

[15] Nancy Durgin, John Mitchell, and Dusko Pavlovic. A compositional logic for proving security properties of protocols. J. of Comp. Security, 11(4):677-721, 2004.

[16] Douglas Engelbart. Augmenting human intellect: A conceptual framework. October 1962.

[17] Javier Thayer Fabrega, Jonathan Herzog, and Joshua Guttman. Strand spaces: What makes a security protocol correct? Journal of Computer Security, 7:191-230, 1999.

[18] Oded Goldreich. Foundations of Cryptography. Volume I: Basic Tools. Cambridge University Press, 2000.

[19] George A. Gratzer. Universal Algebra. Van Nostrand, Princeton, N.J., 1968.

[20] Joshua D. Guttman, F. Javier Thayer, and Lenore D. Zuck. The faithfulness of abstract protocol analysis: Message authentication. Journal of Computer Security, 12(6):865-891, 2004.

[21] Gerhard P. Hancke and Markus G. Kuhn. An RFID distance bounding protocol. In SECURECOMM '05: Proceedings of the First International Conference on Security and Privacy for Emerging Areas in Communications Networks, pages 67—73, Washington, DC, USA, 2005. IEEE Computer Society.

[22] Catherine Meadows and Dusko Pavlovic. Deriving, attacking and defending the GDOI protocol. In Peter Ryan, Pierangela Samarati, Dieter Gollmann, and Refik Molva, editors, Proceedings of ESORICS 2004, volume 3193 of Lecture Notes in Computer Science, pages 53-72. Springer Verlag, 2004.

[23] Catherine Meadows, Radha Poovendran, Dusko Pavlovic, LiWu Chang, and Paul Syverson. Distance bounding protocols: authentication logic analysis and collusion attacks. In R. Poovendran, C. Wang, and S. Roy, editors, Secure Localization and Time Synchronization in Wireless Ad Hoc and Sensor Networks. Springer Verlag, 2006.

[24] Dusko Pavlovic and Catherine Meadows. Deriving secrecy properties in key establishment protocols. In Dieter Gollmann and Andrei Sabelfeld, editors, Proceedings of ESORICS 2006, volume 4189 of Lecture Notes in Computer Science. Springer Verlag, 2006.

[25] Dusko Pavlovic and Douglas R. Smith. Composition and refinement of behavioral specifications. In Automated Software Engineering 2001. The Sixteenth International Conference on Automated Software Engineering. IEEE, 2001.

[26] Dusko Pavlovic and Douglas R. Smith. Guarded transitions in evolving specifications. In H. Kirchner and C. Ringeissen, editors, Proceedings of AMAST 2002, volume 2422 of Lecture Notes in Computer Science, pages 411-425. Springer Verlag, 2002.

[27] Vaughan Pratt. Modelling concurrency with partial orders. Internat. J. Parallel Programming, 15:33-71, 1987.

[28] Ronald L. Rivest. On the notion of pseudo-free groups. In Moni Naor, editor, TCC, volume 2951 of Lecture Notes in Computer Science, pages 505-521. Springer, 2004.

[29] Phillip Rogaway and Thomas Shrimpton. Cryptographic hash-function basics: Definitions, implications, and separations for preimage resistance, second-preimage resistance, and collision resistance. In Bimal K. Roy and Willi Meier, editors, Proceedings of FSE, volume 3017 of Lecture Notes in Computer Science, pages 371-388. Springer, 2004.

[30] Patrick Schaller, Benedikt Schmidt, David Basin, and Srdjan Capkun. Modeling and verifying physical properties of security protocols for wireless networks. In Proceedings of the IEEE Computer Security Foundations Symposium. IEEE Computer Society Press, 2009.

[31] Steve Selvin. On the Monty Hall problem. American Statistician, 29(3):134, August 1975. (Letter to the editor.)

[32] Véronique Cortier, Steve Kremer, and Bogdan Warinschi. A survey of symbolic methods in computational analysis of cryptographic systems. J. of Automated Reasoning, 2010. To appear.

[33] A. F. Webster and Stafford E. Tavares. On the design of S-boxes. In Hugh C. Williams, editor, Proceedings of CRYPTO 1985, volume 218 of Lecture Notes in Computer Science, pages 523-534. Springer, 1985.

[34] Ford Long Wong and Frank Stajano. Multichannel security protocols. IEEE Pervasive Computing, 6(4):31-39, 2007.

A Appendix: The Proofs

Proof of Prop. 4.7. Let f_ℓ and g_ℓ be randomized functions. Consider the sets F = {χ_ℓ | f_ℓ⟦S⟧_{χ_ℓ} = ⟦Γ⟧_{χ_ℓ}} and G = {η_ℓ | g_ℓ⟦S, Γ⟧_{η_ℓ} = ⟦Θ⟧_{η_ℓ}}.

Claim 1. If for x, y ∈ X(S, Γ) and η_ℓ such that g_ℓ⟦S, Γ⟧_{η_ℓ} = ⟦Θ⟧_{η_ℓ} it holds that η_ℓ(x) = η_ℓ(y), then for η̄_ℓ, which is equal to η_ℓ everywhere except that η̄_ℓ(x) ≠ η̄_ℓ(y), it holds that ḡ_ℓ⟦S, Γ⟧_{η̄_ℓ} = ⟦Θ⟧_{η̄_ℓ}, for ḡ modified accordingly. (Intuitively, separating two pieces of the input can only provide more information, not less.)

Claim 2. If f_ℓ⟦S⟧_{χ_ℓ} = ⟦Γ⟧_{χ_ℓ} and dom(χ_ℓ) ⊆ dom(η_ℓ), with χ_ℓ(x) = χ_ℓ(y) whenever η_ℓ(x) = η_ℓ(y), then f_ℓ can be precomposed with a permutation to yield f̄_ℓ with dom(f̄_ℓ) ⊆ dom(η_ℓ) and f̄_ℓ⟦S⟧_{η_ℓ} = ⟦Γ⟧_{η_ℓ}.

The consequence of these claims is that we can modify f_ℓ and g_ℓ to f̄_ℓ and ḡ_ℓ so that #F = #F̄ and #G = #Ḡ.

Now let h_ℓ(x) = f̄_ℓ(x) :: ḡ_ℓ(x :: y). Since thus h_ℓ⟦S⟧_{η_ℓ} = (f̄⟦S⟧_{η_ℓ}) :: (ḡ(⟦S⟧_{η_ℓ} :: f̄⟦S⟧_{η_ℓ})) = ⟦Γ, Θ⟧_{η_ℓ} holds, we have

(#{η_ℓ | f̄_ℓ⟦S⟧_{η_ℓ} = ⟦Γ⟧_{η_ℓ}} / 2^{|X(S,Γ,Θ)|_ℓ}) · (#{η_ℓ | ḡ_ℓ⟦S, Γ⟧_{η_ℓ} = ⟦Θ⟧_{η_ℓ}} / 2^{|X(S,Γ,Θ)|_ℓ}) ≤ #{η_ℓ | h_ℓ⟦S⟧_{η_ℓ} = ⟦Γ, Θ⟧_{η_ℓ}} / 2^{|X(S,Γ,Θ)|_ℓ}

The inequality ⟦S ⊢ Γ⟧ · ⟦S, Γ ⊢ Θ⟧ ≤ ⟦S ⊢ Γ, Θ⟧ then follows by observing that

#{η_ℓ | f̄_ℓ⟦S⟧_{η_ℓ} = ⟦Γ⟧_{η_ℓ}} / 2^{|X(S,Γ,Θ)|_ℓ} = #{χ_ℓ | f_ℓ⟦S⟧_{χ_ℓ} = ⟦Γ⟧_{χ_ℓ}} / 2^{|X(S,Γ)|_ℓ}    □

Proof of Prop. 5.2. For (a), x_i = z_i yields f_i(x_i) = f_i(z_i), so we only need to guess at most n − n_i bits. For (b), x_i and z_i are single bits, and ℓ − Δ(x, z) of them are equal, so we only need to guess at most Δ(x, z) bits. □

Proof of Lemma 5.3. (f(x))_i = f_i(x_i) = (x ⊛ (f(0^ℓ) :: f(1^ℓ)))_i holds by the definition of bitwise partitioned functions at the first step, and by (1) at the second step. □

Proof of Prop. 5.4. Assumptions (12) say that the inequality x_i ≠ z_i implies ⟦f_i(z_i) ⊢ f_i(x_i)⟧ = ⟦x_i ⊢ f_i(x_i)⟧. On the other hand, by definition, the components of a partitioned function are mutually independent. Hence

⟦x, z, f(z) ⊢ f(x)⟧ = Π_{i=1}^{ℓ} ⟦x, z, f(z) ⊢ f_i(x_i)⟧ = Π_{x_i≠z_i} ⟦x_i ⊢ f_i(x_i)⟧ = Π_{x_i≠z_i} 1/2 = 2^{-Δ(z,x)}

The other way around, using (11) at the second step, we get

Π_{i=1}^{ℓ} ⟦x, z, f(z) ⊢ f_i(x_i)⟧ = ⟦x, z, f(z) ⊢ f(x)⟧ = 2^{-Δ(z,x)} = Π_{x_i≠z_i} ⟦x_i ⊢ f_i(x_i)⟧

which, together with the componentwise independence, yields (12). □

Proof of Prop. 5.6. Note that for each i ∈ K_h, the bit (x ⊛ h)_i = h(0)_i = h(1)_i does not depend on x_i. This means that x ⊛ h depends on x only through x^{⊙K_h}. □

Proof of Prop. 5.7. Guessing x ⊛ h from z and z ⊛ h can be modeled as a version of the Monty Hall problem [31], where Monty randomly selects x and h and the contestant chooses z. Monty then announces z ⊛ h and the contestant guesses x ⊛ h.

Since the bits of x ⊛ h are independent, it is enough to consider the case ℓ = 1. Monty then flips three fair coins to pick the secret bits x, h(0), and h(1), while the contestant picks a bit z. Monty then announces z ⊛ h = h(z). Should the contestant now guess that x ⊛ h = z ⊛ h, or should he switch to x ⊛ h = ¬(z ⊛ h)?

Denote by q the probability that the contestant picks x ⊛ h = z ⊛ h. If h(0) = h(1), the contestant wins with this choice, because the value x ⊛ h is the same for every x. Since h(0) and h(1) were randomly chosen, Prob(h(0) = h(1)) = 1/2. Otherwise, if h(0) ≠ h(1), then x ⊛ h = z ⊛ h holds if and only if x = z. Since x is random, Prob(x = z) = 1/2, and hence Prob(h(0) ≠ h(1) ∧ x = z) = 1/4, because h(0), h(1) and x are independent.

The probability that the contestant will make a correct guess is thus

q · (Prob(h(0) = h(1)) + Prob(h(0) ≠ h(1) ∧ x = z)) + (1 − q) · Prob(h(0) ≠ h(1) ∧ x ≠ z) = 3q/4 + (1 − q)/4 = (2q + 1)/4

To maximize this probability, the contestant needs q = 1, and should thus stick with Monty's bit z ⊞ h.

The proof for [z ⊞ h ⊢ x ⊞ h] differs just in the detail that z is not chosen by the contestant, but obeys some unknown distribution. However, x is still independent of z. Thus for some p, Prob(x = z) = Prob(x = 0) · Prob(z = 0) + Prob(x = 1) · Prob(z = 1) = (1/2)·p + (1/2)·(1 − p) = 1/2. □
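The ℓ = 1 case above can be replayed exhaustively. The sketch below enumerates Monty's three coins and evaluates the stick strategy (q = 1); ⊞ is again modeled as selection:

```python
from itertools import product

def stick_win_rate(z):
    """Enumerate Monty's coins x, h(0), h(1); the contestant sticks with x ⊞ h = z ⊞ h."""
    wins, total = 0, 0
    for x, h0, h1 in product((0, 1), repeat=3):
        h = (h0, h1)
        wins += h[x] == h[z]    # announced z ⊞ h = h(z) equals the secret x ⊞ h = h(x)?
        total += 1
    return wins / total

print(stick_win_rate(0), stick_win_rate(1))   # → 0.75 0.75, the 3/4 of the proof
```

The rate is 3/4 for either value of z, which matches the final remark that the distribution of z does not matter.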

Proof of Lemma 6.2(a). By assumption, the outputs of the hash function H are indistinguishable from random strings, and thus satisfy [H(u) ⊥ H(v)] for all u ≠ v.

Recall that A(x ⊞ h) is the union of the contexts observed by the possible participants in the run A, before x ⊞ h is known. Besides s, known by Victor and Peggy, and a, b and x, announced publicly but never reused, the context A(x ⊞ h) thus also contains a single additional challenge z, issued by the Attacker, and the corresponding response z ⊞ h (provided by Peggy before she receives Victor's challenge x).

Moreover, the Attacker may issue a family Y ⊆ Z_2^ℓ of additional challenges to Peggy, and construct a list {b_y}_{y∈Y} of the future values of Victor's counter. To each new challenge, Peggy will respond with y ⊞ h_y, where each response token h_y = H(s :: a_y :: b_y) is derived using a new value a_y of the counter. By assumption, [h_y ⊥ h] holds for all y. Independently of the distance between the challenges in Y and the challenge x, the responses y ⊞ h_y will provide no information about x ⊞ h. In summary, the term context A(x ⊞ h) is thus

{s, a, b, x, z, z ⊞ h} ∪ {y, a_y, b_y, y ⊞ h_y | h_y = H(s :: a_y :: b_y) ∧ y ∈ Y}

for some Y ⊆ Z_2^ℓ, where the assignment y ↦ a_y is injective, and y ↦ b_y is arbitrary. The assumption about H implies [y, a_y, b_y, y ⊞ h_y ⊥ x ⊞ h], which further tells that for any S ⊆ A(x ⊞ h)

{s, a, b, z, z ⊞ h} ∩ S = ∅ ⟹ [S ⊥ x ⊞ h]

and we are done.

The proof of 6.2(b) is analogous, but slightly simpler, elaborating the fact that obtaining one challenge tells nothing about another one. □

Proof of Lemma 6.3. Since h is indistinguishable from random, the bits of any substring of h are indistinguishable from independent. The probability of guessing any chosen substring of length k in h is thus indistinguishable from 2^{−k}. In particular, the probability of guessing x′ ⊞ h′, for a chosen substring x′ of x and the corresponding substring h′ of h, is indistinguishable from 2^{−|x′|}. Knowing which substring is being guessed presents no advantage, and thus [x′ ⊢ x′ ⊞ h′] = 2^{−|x′|}.

Equations (17) and (18) follow from Prop. 5.6. □

Proof of Prop. 6.4. The claim follows from the fact that each set S ⊆ A(x ⊞ h) such that Adv [S ⊢ x ⊞ h] > 0 satisfies at least one of the following inequalities:

[S ⊢ x ⊞ h] ≤ [S ⊢ s] · [S, s ⊢ x ⊞ h] (A.1)

[S ⊢ x ⊞ h] ≤ [S ⊢ z ⊞ h] · [S, z ⊞ h ⊢ x ⊞ h] (A.2)

According to Lemma 6.2(a), for each subset S of A(x ⊞ h) such that a ∈ S, it suffices to consider the set S ∩ {s, a, b, x, z, z ⊞ h}. Once the problem is reduced this far, the rest follows by case analysis, using Lemma 6.3. □

Proof of Prop. 6.5. The claim is that each S ⊆ E(x) such that Adv [S ⊢ x ⊞ h] > 0 satisfies

[S ⊢ x ⊞ h] ≤ [S ⊢ x↾∁K_h] · [S, x↾∁K_h ⊢ x ⊞ h] (A.3)

Lemma 6.2(b) says that it suffices to consider S ∩ {s, a, b} if a ∈ S. Thus, we only need to consider the subsets of {s, a, b}, and since b is deterministic, this reduces to the subsets of {s, a}. The assumption that the stream h is indistinguishable from random implies [S ⊢ x ⊞ h] = 2^{−ℓ} whenever S is a proper subset of {s, a}. So (A.3) holds trivially in that case. For S = {s, a}, using Prop. 5.6 and Lemma 6.3, we have [S ⊢ x ⊞ h] = [S ⊢ x↾∁K_h] = 2^{|K_h|−ℓ}, and on the other hand [S, x↾∁K_h ⊢ x ⊞ h] = 1. Hence (A.3). □

Proof of Prop. 6.6. Since x is uniformly distributed, i.e. Prob(x) = 2^{−ℓ} for every x ∈ Z_2^ℓ by assumption, and [x, z, z ⊞ h ⊢ x ⊞ h] = 2^{−Δ(z,x)} by (11), it follows that

∑_{x ∈ Z_2^ℓ} 2^{−ℓ} · [x, z, z ⊞ h ⊢ x ⊞ h] = 2^{−ℓ} · ∑_{i=0}^{ℓ} (ℓ choose i) · 2^{−i} = 2^{−ℓ} · (3/2)^ℓ = (3/4)^ℓ

since for a fixed z there are exactly (ℓ choose i) challenges x with Δ(z, x) = i. □
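The closing sum can be checked numerically; a minimal sketch, where `l` plays the role of ℓ:

```python
from math import comb

def avg_guess_prob(l):
    """Average of 2^{-Δ(z,x)} over uniform x: Δ(z,x) = i for C(l, i) values of x."""
    return 2**-l * sum(comb(l, i) * 2**-i for i in range(l + 1))

for l in (1, 8, 32):
    assert abs(avg_guess_prob(l) - 0.75**l) < 1e-12   # equals (3/4)^l
print(avg_guess_prob(8))
```
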

Proof of Prop. 6.7. By hypothesis the token h = H(s :: a :: b) is indistinguishable from a random value. Since [s, a, b ⊥ x] also holds by assumption, [s, a, b ⊢ x ⊞ h] = [h ⊢ x ⊞ h] follows, because s, a, b can only be useful to derive h = H(s :: a :: b). But Prop. 5.6(a) then implies that [h ⊢ x ⊞ h] = 2^{k−ℓ}, where k = |K_h|. The expected chance that Peggy guesses x ⊞ h is averaged over the possible values of h, and hence

∑_{h ∈ Z_2^{2ℓ}} 2^{−2ℓ} · [h ⊢ x ⊞ h] = 2^{−2ℓ} · ∑_{k=0}^{ℓ} (ℓ choose k) · 2^ℓ · 2^{k−ℓ} = 2^{−2ℓ} · 3^ℓ = (3/4)^ℓ

since there are (ℓ choose k) · 2^ℓ tokens h with |K_h| = k. □
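The averaging step can also be verified by enumerating all 2^{2ℓ} tokens h for small ℓ. In the sketch below h is modeled as the pair of selection tables h(0), h(1):

```python
from itertools import product

def expected_early_guess(l):
    """Average over all tokens h = (h0, h1) of the early-guess chance 2^{|K_h| - l}."""
    total = 0.0
    for h0 in product((0, 1), repeat=l):
        for h1 in product((0, 1), repeat=l):
            k = sum(a == b for a, b in zip(h0, h1))   # k = |K_h|
            total += 2.0 ** (k - l)
    return total / 4**l                               # there are 2^{2l} tokens h

for l in (1, 2, 5):
    assert abs(expected_early_guess(l) - 0.75**l) < 1e-12   # equals (3/4)^l
print(expected_early_guess(5))
```
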

Proof of Thm. 6.8. By (15), to prove the Theorem, it suffices to show that both Prob(A|V) and Prob(E|V) are negligible. Bayes' theorem and the hypotheses imply

Prob(A|V) = (Prob(V|A) · Prob(A)) / Prob(V) ≤ (Prob(V|A) · C) / Θ

Since Prob(V|A) is negligible by Prop. 6.6, Prob(A|V) is negligible too. The fact that Prob(E|V) is negligible follows in the same way from Prop. 6.7. □
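The Bayes step can be illustrated numerically. All concrete numbers below are hypothetical placeholders for the bounds C (on the prior Prob(A)) and Θ (on Prob(V)) in the hypotheses:

```python
def posterior(p_v_given_a, p_a, p_v):
    """Bayes' theorem: Prob(A|V) = Prob(V|A) * Prob(A) / Prob(V)."""
    return p_v_given_a * p_a / p_v

l = 32
p_v_given_a = 0.75 ** l     # Prob(V|A) = (3/4)^l, as in Prop. 6.6
C, Theta = 0.5, 0.1         # hypothetical bounds: Prob(A) <= C, Prob(V) >= Theta

# For any admissible prior and verification rate, the posterior stays below the bound:
assert posterior(p_v_given_a, 0.3, 0.2) <= p_v_given_a * C / Theta
print(p_v_given_a * C / Theta)   # negligible in l
```
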