Posts Tagged ‘Cryptography’

Limitations of Modern Cryptosystems

Before exploring quantum key distribution, it is important to understand the state of modern cryptography and how quantum cryptography may address its current limitations. Because public key cryptography involves complex calculations that are relatively slow, it is employed to exchange keys rather than to encrypt voluminous amounts of data. For example, widely deployed schemes such as RSA and Diffie-Hellman key negotiation are typically used to distribute symmetric keys among remote parties. However, because asymmetric encryption is significantly slower than symmetric encryption, many institutions prefer a hybrid approach: a public key system secures the initial exchange of a symmetric key, and the shared key then carries the bulk of the traffic. This approach exploits the speed and performance of a symmetric key system while leveraging the scalability of a public key infrastructure.
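
As a concrete illustration of this hybrid pattern, the Python sketch below (assuming the pyca/cryptography package is installed; any comparable library would do) wraps a fresh AES session key under an RSA public key and then uses the session key for the bulk encryption. It is a minimal sketch, not a complete protocol.

    from os import urandom
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    # Slow asymmetric step: the recipient's RSA key pair wraps a fresh session key.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    session_key = AESGCM.generate_key(bit_length=256)
    wrapped_key = private_key.public_key().encrypt(session_key, OAEP)

    # Fast symmetric step: the session key encrypts the bulk of the data.
    nonce = urandom(12)
    ciphertext = AESGCM(session_key).encrypt(nonce, b"voluminous amounts of data", None)

    # The recipient unwraps the session key, then decrypts the traffic.
    recovered = private_key.decrypt(wrapped_key, OAEP)
    assert AESGCM(recovered).decrypt(nonce, ciphertext, None) == b"voluminous amounts of data"

The slow RSA operation touches only the 32-byte session key; everything voluminous is handled by the much faster symmetric cipher.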

However, public key cryptosystems such as RSA and Diffie-Hellman are not backed by concrete mathematical proofs of security. Rather, these algorithms are considered reasonably secure based on years of public scrutiny of the underlying problem of factoring large integers into their primes, which is said to be “intractable”. In other words, by the time the encryption algorithm could be defeated, the information being protected would have already lost all of its value. The power of these algorithms thus rests on the fact that, given today’s computer processing power, there is no known mathematical operation for quickly factoring very large numbers.
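
To see why factoring is regarded as intractable, consider naive trial division, sketched below in Python purely for illustration: its running time grows roughly with the square root of n, so it factors a toy modulus instantly but is hopelessly out of reach for a 2048-bit RSA modulus of more than 600 decimal digits.

    def trial_division(n: int) -> list[int]:
        """Return the prime factorization of n (n >= 2) by trial division."""
        factors, d = [], 2
        while d * d <= n:
            while n % d == 0:
                factors.append(d)
                n //= d
            d += 1
        if n > 1:
            factors.append(n)
        return factors

    print(trial_division(3233))   # [53, 61], a toy RSA-style modulus 53 * 61
    # A 2048-bit modulus would require on the order of 2**1024 divisions.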

Second, there is uncertainty over whether a method may be developed in the future, or perhaps already exists, that can factor large numbers into their primes in a timely manner. At present, there is no proof that developing such a factoring method is impossible. Public key systems are thus vulnerable to the possibility of such a discovery, which would have a significant effect on the assumption that the algorithm is mathematically intractable. This uncertainty poses a potential risk in areas such as national security and intellectual property, which require perfect security.

In sum, modern cryptography is vulnerable both to advances in computing power and to progress in mathematics that could quickly reverse one-way functions such as the factoring of large integers. If a fast factoring method were published, or if computing became powerful enough to defeat public key cryptography, then businesses, governments, militaries, and other affected institutions would have to spend significant resources assessing the risk of damage and potentially deploying a new and costly cryptographic system quickly.


Security as a New Dimension in Embedded System Design

Embedded systems, which will be ubiquitously used to capture, store, manipulate, and access data of a sensitive nature, pose several unique and interesting security challenges. Security has been the subject of intensive research in the areas of cryptography, computing, and networking. However, security is often misconstrued by embedded system designers as the addition of features, such as specific cryptographic algorithms and security protocols, to the system. In reality, it is an entirely new metric that designers should consider throughout the design process, along with other metrics such as cost, performance, and power.

Security in one form or another is a requirement for an increasing number of embedded systems, ranging from low-end systems such as PDAs, wireless handsets, networked sensors, and smart cards, to high-end systems such as routers, gateways, firewalls, storage servers, and web servers. The technological advances that have spurred the development of these electronic systems have also ushered in seemingly parallel trends in the sophistication of security attacks, and the cost of insecurity in electronic systems can be very high. For example, it was estimated that the “I Love You” virus caused nearly one billion dollars in lost revenues worldwide.
With the increasing proliferation of such attacks, it is not surprising that a large number of users in the mobile commerce world (nearly 52% of cell phone users and 47% of PDA users, according to a survey by Forrester Research) feel that security is the single largest concern preventing the successful deployment of next-generation mobile services. With the evolution of the Internet, information and communications security has gained significant attention. For example, various security protocols and standards, such as IPSec, SSL, WEP, and WTLS, are used for secure communications. While security protocols and the cryptographic algorithms they contain address security from a functional perspective, many embedded systems are constrained by the environments they operate in and the resources they possess. For such systems, several factors are moving security from a function-centric concern to a system architecture (hardware/software) design issue.


Securing against software attacks

Software is a central and critical aspect of the computer (and embedded system) security problem. Software defects with security ramifications — including implementation bugs such as buffer overflows and design flaws such as inconsistent error handling — promise to be with us for years. All too often, malicious intruders can hack into systems by exploiting software defects. Moreover, Internet-enabled software applications present the most common security risk encountered today, with software’s ever-expanding complexity and extensibility adding further fuel to the fire.

Software security’s best practices leverage good software engineering practice and involve thinking about security early in the software development life cycle (SDLC), knowing and understanding common threats (including language-based flaws and pitfalls), designing for security, and subjecting all software artifacts to thorough, objective risk analyses and testing.

Security is an emergent property of a software system. This is a subtle point often lost on development people, who tend to focus on functionality. Obviously, there are security functions in the world, and most modern software includes security features, but adding features such as SSL (for cryptographically protecting communications) does not present a complete solution to the security problem. A security problem is more likely to arise because of a problem in a standard part of the system (e.g., the API to the server) than in some given security feature. This is an important reason why software security must be part of a full lifecycle approach. Just as you cannot test quality into a piece of software, you cannot spray-paint security features onto a design and expect it to become secure.

Figure 1: Software security best practices applied to various software artifacts in the software development life cycle (SDLC)

As practitioners become aware of software security’s importance, they are increasingly adopting and evolving a set of best practices to address the problem. There is no substitute for working software security as deeply into the development process as possible and taking advantage of the engineering lessons software practitioners have learned over the years. Figure 1 specifies one set of best practices and shows how software practitioners can apply them to the various software artifacts produced during software development. Although the artifacts are shown laid out in a linear sequence, most organizations follow an iterative approach, which means that best practices will be cycled through more than once as the software evolves.

Software security best practices apply at various levels:

  1. The requirements level: Security requirements must cover both overt functional security (e.g., the use of applied cryptography) and emergent characteristics.
  2. The design and architecture level: A system must be coherent and present a unified security architecture that takes into account security principles (such as the principle of least privilege).
  3. The code level: Static analysis tools — tools that scan source code for common vulnerabilities — can discover implementation bugs at the code level (a toy illustration follows this list).
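
As a sketch of code-level analysis (not any particular commercial tool), the following Python snippet walks a program’s abstract syntax tree and flags calls to eval(), one classic injection-prone construct; real static analyzers apply hundreds of such rules.

    import ast

    SOURCE = "result = eval(input())   # dangerous: runs arbitrary user input\n"

    class EvalFinder(ast.NodeVisitor):
        """Flag direct calls to eval(), a classic code-injection vulnerability."""
        def visit_Call(self, node: ast.Call) -> None:
            if isinstance(node.func, ast.Name) and node.func.id == "eval":
                print(f"line {node.lineno}: call to eval() -- possible code injection")
            self.generic_visit(node)

    EvalFinder().visit(ast.parse(SOURCE))   # -> line 1: call to eval() ...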

Risks crop up during all stages of the software life cycle, so a constant risk analysis thread, with recurring risk tracking and monitoring activities, is highly recommended. Security testing must encompass two strategies: testing security functionality with standard functional testing techniques, and risk-based security testing based on attack patterns and threat models. Penetration testing is also useful, especially if an architectural risk analysis is specifically driving the tests.


Future Benefits of Quantum Computers

1. Cryptography and Peter Shor’s Algorithm

In 1994, Peter Shor (Bell Laboratories) discovered the first quantum algorithm that, in principle, can perform efficient factorization. This is a complex application that only a quantum computer could carry out. Factoring is one of the most important problems in cryptography: the security of RSA public key cryptography, which underpins systems such as electronic banking, depends on factoring being a hard problem. The many useful features of a quantum computer have led scientists to put greater effort into building one, since breaking current encryption that would take almost centuries on existing computers might take only a few years on a quantum computer.
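
The quantum speed-up in Shor’s algorithm lies entirely in finding the order r of a random base a modulo N; the surrounding arithmetic is classical. The Python sketch below finds the order by brute force (so it only handles toy numbers) purely to show how a known order yields a factor.

    from math import gcd

    def order(a: int, n: int) -> int:
        """Smallest r > 0 with a**r = 1 (mod n), found by brute force."""
        r, x = 1, a % n
        while x != 1:
            x = (x * a) % n
            r += 1
        return r

    def shor_classical(n: int, a: int):
        if gcd(a, n) != 1:
            return gcd(a, n)              # lucky guess: a shares a factor with n
        r = order(a, n)                   # the step a quantum computer speeds up
        if r % 2 == 1 or pow(a, r // 2, n) == n - 1:
            return None                   # unlucky choice of a; retry with another
        return gcd(pow(a, r // 2) - 1, n)

    print(shor_classical(15, 7))          # 3, since 15 = 3 * 5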

2. Artificial Intelligence

It has been mentioned that quantum computers will be much faster and will consequently perform a large number of operations in a very short period of time. This increased speed of operation will help computers to learn faster, even when using one of the simplest learning methods: the mistake-bound model of learning. A toy example of this model follows.
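
The sketch below (ordinary Python, nothing quantum) implements the classic elimination algorithm, which learns a monotone conjunction over n boolean attributes while making at most n mistakes; the point here is only that faster hardware runs such online learners faster.

    def learn_monotone_conjunction(examples, n):
        """Elimination algorithm: at most n mistakes on n boolean attributes."""
        hyp = set(range(n))                  # start with the AND of all attributes
        mistakes = 0
        for x, label in examples:
            pred = int(all(x[i] for i in hyp))
            if pred != label:                # mistakes occur only on positives
                mistakes += 1
                hyp -= {i for i in hyp if x[i] == 0}   # drop falsified attributes
        return hyp, mistakes

    # Target concept: x0 AND x2.
    data = [((1, 1, 1), 1), ((1, 0, 1), 1), ((0, 1, 1), 0), ((1, 1, 0), 0)]
    print(learn_monotone_conjunction(data, 3))    # -> ({0, 2}, 1)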

3. Other Benefits

High performance will aid the development of complex compression algorithms, voice and image recognition, molecular simulations, true randomness, and quantum communication. Randomness is important in simulations, and molecular simulations matter for developing applications in chemistry and biology. With quantum communication, both receiver and sender are alerted when an eavesdropper tries to intercept the signal, and quantum bits allow more information to be communicated per bit. Quantum computers thus make communication more secure.


The DARPA Quantum Network

The DARPA Quantum Network aims to strengthen QKD’s performance in these weaker areas. In some instances, this involves the introduction of newer QKD technologies; for example, we hope to achieve rapid delivery of keys by introducing a new, high-speed source of entangled photons. In other instances, we rely on an improved system architecture to achieve these goals; thus, we tackle distance- and location-independence by introducing a network of trusted relays. Whereas most work to date has focused on the physical layer of quantum cryptography – e.g. the modulation, transmission, and detection of single photons – our own research effort aims to build QKD networks. As such, it is oriented to a large extent towards novel protocols and architectures for highly secure communications across a heterogeneous variety of underlying QKD links.
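
For readers unfamiliar with the key distribution step itself, the toy Python simulation below sketches BB84-style basis sifting, the logic at the heart of a single QKD link; real systems operate on single photons and add error estimation and privacy amplification.

    import random

    n = 32
    alice_bits  = [random.randint(0, 1) for _ in range(n)]
    alice_bases = [random.choice("+x") for _ in range(n)]   # encoding bases
    bob_bases   = [random.choice("+x") for _ in range(n)]   # measurement bases

    # Bob's measurement: correct when the bases match, a coin flip otherwise.
    bob_bits = [bit if ab == bb else random.randint(0, 1)
                for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

    # Sifting: bases are compared over a public channel; mismatches are discarded.
    key_alice = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
    key_bob   = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
    assert key_alice == key_bob   # agrees when no eavesdropper disturbed the link
    print(f"sifted key of {len(key_alice)} bits: {key_alice}")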

Figure 1. A Virtual Private Network (VPN) based on Quantum Key Distribution

Our security model is the cryptographic Virtual Private Network (VPN). Conventional VPNs use both public-key and symmetric cryptography to achieve confidentiality and authentication/integrity. Public-key mechanisms support key exchange or agreement and authenticate the endpoints. Symmetric mechanisms (e.g. 3DES, SHA1) provide traffic confidentiality and integrity. VPN systems can thus provide confidentiality and authentication/integrity without trusting the public network interconnecting the VPN sites. In our work, existing VPN key agreement primitives are augmented or completely replaced by keys provided by quantum cryptography. The remainder of the VPN construct is left unchanged; see Fig. 1. Our QKD-secured network is thus fully compatible with conventional Internet hosts, routers, firewalls, and so forth.

At the time of writing, we are slightly over one year into a projected five-year effort to build the full DARPA Quantum Network. In our first year, we have built a complete quantum cryptographic link, a QKD protocol engine with a working suite of QKD protocols, and have integrated this cryptographic substrate into an IPsec-based Virtual Private Network. This entire system has been continuously operational since December 2002, and we are now in the process of characterizing its behavior and tuning it. In coming years, we plan to build a second link based on two-photon entanglement, and to build various forms of end-to-end networks for QKD across a variety of kinds of links. We expect the majority of our links to be implemented in dark fiber, but some may also be implemented in free space, either in the lab or outdoors.


The use of the random oracle model in cryptography

Possibly the most controversial issue in provable security research is the use of the random oracle model (Bellare & Rogaway 1993). The random oracle model involves modelling certain parts of cryptosystems, called hash functions, as totally random functions about whose internal workings the attacker has no information. This theoretical model vastly simplifies the analysis of cryptosystems and allows many schemes to be ‘proven’ secure that would otherwise be too complicated to prove secure.

A hash function is a keyless algorithm that takes arbitrary-length inputs and outputs a fixed-length hash value or hash. There are several properties that one would expect a hash function to exhibit, including pre-image resistance (given a random element of the output set, it should be computationally infeasible to find a pre-image of that element) and collision resistance (it should be computationally infeasible to find two elements that have the same hash value). However, there are many more properties that we might require of a hash function depending on the circumstances. For example, it might be hoped that if the hash function is evaluated on two related inputs, then the outputs will appear unrelated.
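
These properties are easy to observe concretely with a standard hash function such as SHA-256 (Python’s hashlib is used below for illustration): the output length is fixed regardless of input length, and changing a single character of the input yields an apparently unrelated digest.

    import hashlib

    # Arbitrary-length input, fixed-length (256-bit) output.
    print(hashlib.sha256(b"arbitrary-length input").hexdigest())

    # Changing a single character yields an apparently unrelated digest.
    print(hashlib.sha256(b"arbitrary-length input").hexdigest())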

From a provable security point of view, hash functions present a difficult problem. They are usually developed using symmetric techniques, either as standalone algorithms or based on the use of a block cipher. Thus it is difficult to apply the reductionist theory of provable security to them, because there are no natural candidate problems to which we may reduce their security. There are constructions of hash functions from block ciphers for which it can be proven that the hash function has certain properties (such as pre-image and collision resistance) as long as the underlying block cipher is indistinguishable from a random permutation. However, it is impossible for any publicly known function to produce outputs that appear independent when evaluated on two known inputs.
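
One such block-cipher construction is the Davies-Meyer compression function, h' = E_m(h) XOR h, iterated over message blocks. The sketch below (assuming the pyca/cryptography package, with AES-256 playing the role of E and padding/length-strengthening omitted for brevity) is illustrative only, not a production hash.

    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    def davies_meyer(blocks, iv: bytes = bytes(16)) -> bytes:
        """Iterate h' = E_m(h) XOR h over 32-byte message blocks (no padding)."""
        h = iv
        for m in blocks:                          # each block is used as an AES-256 key
            enc = Cipher(algorithms.AES(m), modes.ECB()).encryptor()
            e = enc.update(h) + enc.finalize()    # E_m(h) on the 16-byte state
            h = bytes(x ^ y for x, y in zip(e, h))
        return h

    print(davies_meyer([b"A" * 32, b"B" * 32]).hex())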

The random oracle model attempts to overcome our inability to make strong statements about the security of hash functions by modelling them as completely random functions about which an attacker has no information. The attacker (and all other parties in the security model) may evaluate such a random hash function by querying an oracle. The original interpretation of this simplification was that it heuristically demonstrated that a cryptosystem was secure, up to attacks against the system that may be introduced via the use of a specific hash function. Equivalently, it was thought that a proof of security in the random oracle model meant that, with overwhelming probability, the cryptosystem was secure when instantiated with a randomly chosen hash function.

This interpretation of the random oracle model is correct up to a point. It is possible to construct families of efficient hash functions for which it is computationally infeasible to differentiate between access to an oracle which computes a randomly selected hash function from the family, and access to an oracle which computes a truly random function. If such a hash function is used in place of the random oracle, then we can be sure that the scheme is secure against attackers whose only interaction with the hash function is to directly compute its output on certain inputs.
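
In security proofs the random oracle is typically realized by lazy sampling, as in the Python sketch below: each fresh query receives an independent, uniformly random answer, and repeated queries are answered consistently.

    import os

    class RandomOracle:
        """Lazily sampled random function: fresh queries get uniform answers,
        repeated queries are answered consistently."""
        def __init__(self, out_len: int = 32) -> None:
            self.table: dict[bytes, bytes] = {}
            self.out_len = out_len

        def query(self, x: bytes) -> bytes:
            if x not in self.table:               # first query: sample a fresh value
                self.table[x] = os.urandom(self.out_len)
            return self.table[x]

    oracle = RandomOracle()
    assert oracle.query(b"m") == oracle.query(b"m")   # consistency across queries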

The one key difference between the random oracle model and the use of a hash function selected at random from a random-looking function family is that in the latter case the attacker is given access to a description of a Turing machine that can compute the hash function, while in the former the attacker is not. This led to the cataclysmic result of Canetti et al. (2004), who demonstrated that it is possible to have a scheme that is provably secure in the random oracle model and yet insecure when the random oracle is replaced with any hash function. The trick Canetti et al. employ is to use knowledge of the Turing machine that computes the hash function like a password that forces the cryptosystem to release sensitive information.
