Results 11 - 20 of 22
Non-viability deductions in arc-consistency computation
- in "Proc. of the Nineteenth International Conference on Logic Programming (ICLP 2004)", Lecture Notes in Computer Science, Springer-Verlag, 2004, p. 343–355. Internal Reports
"... Abstract Arc-Consistency (AC) techniques have been used extensively in the study of Constraint Satisfaction Problems (CSP). These techniques are used to simplify the CSP before or during the search for its solutions. Some of the most efficient algorithms for AC computation are AC6++ and AC-7. The no ..."
Cited by 2 (0 self)
Abstract Arc-Consistency (AC) techniques have been used extensively in the study of Constraint Satisfaction Problems (CSP). These techniques are used to simplify the CSP before or during the search for its solutions. Some of the most efficient algorithms for AC computation are AC6++ and AC-7. The novelty of these algorithms is that they satisfy the so-called four desirable properties for AC computation. The main purpose of these interesting properties is to reduce as far as possible the number of constraint checks during AC computation while keeping a reasonable space complexity. In this paper we prove that, despite providing a remarkable reduction in the number of constraint checks, the four desirable properties do not guarantee a minimal number of constraint checks. We therefore refute the minimality claim in the paper introducing these properties. Furthermore, we propose a new desirable property for AC computation and extend AC6++ and AC-7 to consider such a property. We show theoretically and experimentally that the new property provides a further substantial reduction in the number of constraint checks.
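To make the notion of a "constraint check" concrete, here is a minimal sketch of a basic AC-3 style arc-consistency pass (not the AC6++ or AC-7 algorithms studied in the paper); every evaluation of the binary constraint counts as one check, which is the quantity the desirable properties aim to minimise.

```python
# Illustrative sketch only: a basic AC-3 style arc-consistency pass, not the
# AC6++/AC-7 algorithms discussed in the paper. Each call to the binary
# constraint counts as one "constraint check".
from collections import deque

def revise(domains, constraint, x, y, counter):
    """Remove values of x that have no support in y under `constraint`."""
    removed = False
    for vx in list(domains[x]):
        supported = False
        for vy in domains[y]:
            counter[0] += 1              # one constraint check
            if constraint(x, vx, y, vy):
                supported = True
                break
        if not supported:
            domains[x].remove(vx)
            removed = True
    return removed

def ac3(variables, domains, neighbours, constraint):
    """Enforce arc consistency; return (domains, number_of_checks)."""
    counter = [0]
    queue = deque((x, y) for x in variables for y in neighbours[x])
    while queue:
        x, y = queue.popleft()
        if revise(domains, constraint, x, y, counter):
            if not domains[x]:
                break                    # a domain wiped out: no solution
            queue.extend((z, x) for z in neighbours[x] if z != y)
    return domains, counter[0]

# Tiny example: two variables that must take different values.
doms = {"a": {1, 2}, "b": {2}}
nbrs = {"a": ["b"], "b": ["a"]}
print(ac3(["a", "b"], doms, nbrs, lambda x, vx, y, vy: vx != vy))
```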
Modelling and detecting the cascade vulnerability problem using soft constraints
- in: Proceedings of ACM Symposium on Applied Computing (SAC-2004), ACM, 2004
"... Establishing network security is based not just on the security of its component systems but also on how they are configured to interoperate. In this paper we consider how soft constraints provide an approach to detecting the cascade vulnerability problem: whether system interoperation provides circ ..."
Cited by 2 (2 self)
Establishing network security is based not just on the security of its component systems but also on how they are configured to interoperate. In this paper we consider how soft constraints provide an approach to detecting the cascade vulnerability problem: whether system interoperation provides circuitous or cascading routes across the network that increase the risk of violation of multilevel security. Taking the constraints approach means that we are building on techniques that have proven success in solving large-scale problems from other domains.
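As a rough illustration of the cascade idea, and not the paper's soft-constraint model, the sketch below assigns each interconnection a hypothetical "difficulty of compromise" weight and flags any end-to-end route whose combined difficulty falls below a stated policy threshold.

```python
# Hypothetical sketch, not the paper's formal model: each interconnection
# carries a "difficulty of compromise" weight, weights are combined along a
# path by addition, and any route whose end-to-end difficulty undercuts the
# required threshold is reported as a potential cascade.
import heapq

def cheapest_path(edges, src, dst):
    """Dijkstra over (node -> [(neighbour, difficulty)]) adjacency lists."""
    dist = {src: 0}
    heap = [(0, src)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == dst:
            return d
        if d > dist.get(node, float("inf")):
            continue
        for nxt, w in edges.get(node, []):
            nd = d + w
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(heap, (nd, nxt))
    return float("inf")

# Illustrative network: each system is individually acceptable, but the
# composed route A -> B -> C is easier to traverse than policy allows.
network = {"A": [("B", 2)], "B": [("C", 2)]}
required_difficulty = 5
if cheapest_path(network, "A", "C") < required_difficulty:
    print("cascade vulnerability: an interoperation route undercuts policy")
```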
Semiring-based soft constraints
"... Abstract. The semiring-based formalism to model soft constraint has been introduced in 1995 by Ugo Montanari and the authors of this paper. The idea was to make constraint programming more flexible and widely applicable. We also wanted to define the extension via a general formalism, so that all its ..."
Cited by 1 (1 self)
Abstract. The semiring-based formalism to model soft constraints was introduced in 1995 by Ugo Montanari and the authors of this paper. The idea was to make constraint programming more flexible and widely applicable. We also wanted to define the extension via a general formalism, so that all its instances could inherit its properties and be easily compared. Since then, much work has been done to study, extend, and apply this formalism. This paper gives a brief summary of some of these research activities.
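A minimal sketch of the formalism under one concrete instance, the fuzzy c-semiring ⟨[0,1], max, min, 0, 1⟩: soft constraints map assignments to preference levels, the semiring "times" (min) combines constraints, and the semiring "plus" (max) compares complete assignments. The variables, domains, and toy constraints below are illustrative only.

```python
# Fuzzy c-semiring instance <[0,1], max, min, 0, 1> of semiring-based soft
# constraints: combine constraints with min, compare assignments with max.
from itertools import product

def combine(constraints, assignment):
    """Semiring 'times' (min in the fuzzy instance) over all constraints."""
    return min(c(assignment) for c in constraints)

def best_level(variables, domains, constraints):
    """Semiring 'plus' (max) over all complete assignments."""
    best, best_assignment = 0.0, None
    for values in product(*(domains[v] for v in variables)):
        assignment = dict(zip(variables, values))
        level = combine(constraints, assignment)
        if level > best:
            best, best_assignment = level, assignment
    return best, best_assignment

domains = {"x": [0, 1, 2], "y": [0, 1, 2]}
constraints = [
    lambda a: 1.0 if a["x"] < a["y"] else 0.3,   # prefer x < y
    lambda a: 1.0 - 0.2 * a["y"],                # prefer small y
]
print(best_level(["x", "y"], domains, constraints))
```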
Symmetries of Nonlinearity Constraints
"... Abstract. We find symmetries for constraints that model the nonlinearity requirements of a discrete function f: Z2n → Z2m(n> m). Such constraints are very important, as the functions are employed in generating deterministic but difficult-to-analyze permutations used in symmetric cryptographic sys ..."
Abstract. We find symmetries for constraints that model the nonlinearity requirements of a discrete function f: Z_2^n → Z_2^m (n > m). Such constraints are very important, as the functions are employed in generating deterministic but difficult-to-analyze permutations used in symmetric cryptographic systems. There, such functions are referred to as Substitution Boxes (S-boxes). The nonlinearity is a complex requirement that has been traditionally formulated using a set of criteria (that we interpret as new constraints). Most of these constraints are found to exhibit symmetries that can be exploited for reducing the size of the search space, and for efficiently generating new solutions. Among the discovered symmetries, a bit inversion symmetry (a special case of the value reversal symmetry) and a rotational symmetry (a special case of variable symmetry) are found to apply to all studied nonlinearity constraints without affecting their security metric, and to quadruple the efficiency of solvers. Theoretical and experimental results on symmetry are reported.
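The following sketch (with an arbitrary 3-bit S-box, not one from the paper) computes nonlinearity via Walsh coefficients and shows the bit-inversion symmetry in action: XOR-ing the S-box output with a fixed constant leaves the nonlinearity unchanged.

```python
# Sketch under stated assumptions: a straightforward Walsh-transform
# computation of S-box nonlinearity, used only to illustrate the kind of
# bit-inversion symmetry the paper describes. The 3x3 S-box is arbitrary.
def parity(x):
    return bin(x).count("1") & 1

def nonlinearity(sbox, n, m):
    """Nonlinearity of an n-bit -> m-bit S-box via Walsh coefficients."""
    worst = 0
    for b in range(1, 1 << m):            # nonzero output masks
        for a in range(1 << n):           # all input masks
            walsh = sum((-1) ** (parity(b & sbox[x]) ^ parity(a & x))
                        for x in range(1 << n))
            worst = max(worst, abs(walsh))
    return (1 << (n - 1)) - worst // 2

sbox = [3, 6, 0, 5, 7, 1, 4, 2]           # arbitrary 3x3 example
flipped = [y ^ 0b101 for y in sbox]       # bit inversion: XOR a fixed constant
print(nonlinearity(sbox, 3, 3), nonlinearity(flipped, 3, 3))   # equal values
```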
Soft constraint programming to analysing security protocols
- DOI: 10.1017/S1471068404002121
"... Security protocols stipulate how the remote principals of a computer network should interact in order to obtain specific security goals. The crucial goals of confidentiality and authentication may be achieved in various forms, each of different strength. Using soft (rather than crisp) constraints, w ..."
Security protocols stipulate how the remote principals of a computer network should interact in order to obtain specific security goals. The crucial goals of confidentiality and authentication may be achieved in various forms, each of different strength. Using soft (rather than crisp) constraints, we develop a uniform formal notion for the two goals. They are no longer formalised as mere yes/no properties as in the existing literature, but gain an extra parameter, the security level. For example, different messages can enjoy different levels of confidentiality, or a principal can achieve different levels of authentication with different principals. The goals are formalised within a general framework for protocol analysis that is amenable to mechanisation by model checking. Following the application of the framework to analysing the asymmetric Needham-Schroeder protocol (Bella and Bistarelli 2001; Bella and Bistarelli, to appear), we have recently discovered a new attack on that protocol as a form of retaliation by principals who have been attacked previously. Having commented on that attack, we then demonstrate the framework on a bigger, widely deployed protocol consisting of three phases, Kerberos.
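A hypothetical, much-simplified illustration of graded rather than yes/no confidentiality: each exposure of a message is given a level, and the message's overall confidentiality is the weakest of them. This only conveys the "extra parameter" idea, not the paper's semiring framework.

```python
# Hypothetical illustration: confidentiality as a graded level rather than a
# boolean property. Level names and values are made up for the example.
LEVELS = {"public": 0, "weak": 1, "strong": 2, "private": 3}

def confidentiality(exposures):
    """Combine per-exposure levels pessimistically (take the weakest)."""
    return min(exposures, key=lambda lvl: LEVELS[lvl])

# A nonce sent under encryption ("strong") but later quoted in clear ("public")
print(confidentiality(["strong", "public"]))   # -> "public"
```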
A Constraint Based Framework for Dependability
"... Abstract. An integrity policy defines the situations when modification of information is authorized and is enforced by the security mechanisms of the system. However, in a complex application system it is possible that an integrity policy may have been incorrectly specified and, as a result, a user ..."
Abstract. An integrity policy defines the situations when modification of information is authorized and is enforced by the security mechanisms of the system. However, in a complex application system it is possible that an integrity policy may have been incorrectly specified and, as a result, a user may be authorized to modify information that can lead to an unexpected system compromise. In this paper we propose a scalable and quantitative technique that uses constraint solving to model and analyze the effectiveness of application system integrity policies.
Believing the Integrity of a System (Invited Talk)
- ARSPA 2004 Preliminary Version
"... An integrity policy defines the situations when modification of information is authorised and is enforced by the protection mechanisms of a system. Traditional models of protection tend to define integrity in terms of ad-hoc authorisation techniques whose effectiveness are justified more on the basi ..."
An integrity policy defines the situations when modification of information is authorised and is enforced by the protection mechanisms of a system. Traditional models of protection tend to define integrity in terms of ad-hoc authorisation techniques whose effectiveness is justified more on the basis of experience and "best practice" than on any theoretical foundation. In a complex application system it is possible that an integrity policy may have been incorrectly configured, or that the protection mechanisms are inadequate, resulting in an unexpected system compromise. This paper examines the meaning of integrity and describes a simple belief logic approach for analysing the integrity of a system configuration.
A Process Calculus for Universal Concurrent Constraint Programming: Semantics, Logic and Application
"... (utcc) process calculus; a generalisation of Timed Concurrent Constraint Programming. The utcc calculus allows for the specification of mobile behaviours in the sense of Milner’s π-calculus: Generation and communication of private links. We first endow utcc with an operational semantics and then wit ..."
This paper introduces the utcc process calculus, a generalisation of Timed Concurrent Constraint Programming. The utcc calculus allows for the specification of mobile behaviours in the sense of Milner’s π-calculus: generation and communication of private links. We first endow utcc with an operational semantics and then with a symbolic semantics to deal with problematic operational aspects involving infinitely many substitutions and divergent internal computations. The novelty of the symbolic semantics is to use temporal constraints to finitely represent infinitely many substitutions. We also show that utcc has a strong connection with Pnueli’s Temporal Logic. This connection can be used to prove reachability properties of utcc processes. As a compelling example, we use utcc to exhibit the secrecy flaw of the Needham-Schroeder security protocol.
Private and Efficient Stable Marriages (Matching)
, 2006
"... We provide algorithms guaranteeing high levels of privacy by computing uniformly random solutions to stable marriages problems. We also provide efficient algorithms extracting a nonuniformly random solution and guaranteeing t-privacy for any threshold t. The most private solution is expensive and is ..."
We provide algorithms guaranteeing high levels of privacy by computing uniformly random solutions to stable marriages problems. We also provide efficient algorithms extracting a nonuniformly random solution and guaranteeing t-privacy for any threshold t. The most private solution is expensive and is based on a distributed/shared CSP model of the problem. The most efficient version is based on running the Gale-Shapley algorithm after shuffling the men (or women) in the shared secret description of the problem. We introduce an efficient arithmetic circuit for the Gale-Shapley algorithm that can employ a cryptographic primitive we propose for vector access with an arbitrary number of participants.
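For reference, a plain (non-private) Gale-Shapley sketch; the paper's contribution lies in evaluating this kind of computation over secret-shared data with the proposers shuffled first, which is not reproduced here.

```python
# Plain Gale-Shapley proposer/acceptor matching, illustrative only.
def gale_shapley(men_prefs, women_prefs):
    """men_prefs/women_prefs: dicts mapping each person to an ordered list."""
    free_men = list(men_prefs)
    next_choice = {m: 0 for m in men_prefs}          # next woman to propose to
    rank = {w: {m: i for i, m in enumerate(p)} for w, p in women_prefs.items()}
    engaged = {}                                      # woman -> man
    while free_men:
        m = free_men.pop()
        w = men_prefs[m][next_choice[m]]
        next_choice[m] += 1
        if w not in engaged:
            engaged[w] = m
        elif rank[w][m] < rank[w][engaged[w]]:        # w prefers m
            free_men.append(engaged[w])
            engaged[w] = m
        else:
            free_men.append(m)
    return {m: w for w, m in engaged.items()}

men = {"m1": ["w1", "w2"], "m2": ["w1", "w2"]}
women = {"w1": ["m2", "m1"], "w2": ["m1", "m2"]}
print(gale_shapley(men, women))   # {'m2': 'w1', 'm1': 'w2'}
```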
A Constraint-based Framework for the Cascade Vulnerability Problem
, 2004
"... Establishing network security is based not just on the security of its component systems but also on how they are configured to interoperate. In this paper we consider how soft constraints provide an approach to detecting the cascade vulnerability problem: whether system interoperation provides circ ..."
Establishing network security is based not just on the security of its component systems but also on how they are configured to interoperate. In this paper we consider how soft constraints provide an approach to detecting the cascade vulnerability problem: whether system interoperation provides circuitous or cascading routes across the network that increase the risk of violation of multilevel security. Taking the constraints approach means that we are building on techniques that have proven success in solving large-scale problems from other domains.