
Cryptography

The central problem in cryptography is that of transmitting information from a point A to a point B over a possibly insecure channel, in such a way that the original message can be recovered only by its rightful recipients. The participants in the transaction are Alice, the originator of the message; Bob, the receiver; and Oscar, an opponent who wishes to gain unauthorized access to the message. Figure 2 is essentially the diagram Shannon used to analyze the security of a communications system employing a cipherer (encoding algorithm) and a decipherer (decoding algorithm) controlled by a key [28,29].

Figure 2: Secret Communications
\includegraphics[height=2in, width=4in]{c2.eps}

The key is the only piece of information Oscar does not have. It controls the operation of both the encoding and the decoding algorithms: the transformation of the original text into ciphertext and back. A basic requirement in cryptography is that the strength of the cryptosystem reside exclusively in the key and not in the encoding/decoding algorithms; these could be public knowledge without compromising the system [30].

One way this problem can be solved is by means of chaotic dynamical systems. There are several methods by which the original message can be combined with a chaotic sequence to produce a seemingly random, disordered ciphertext. However, most of these methods operate directly on an analog signal and produce analog cryptosignals that require a wide band to be transmitted. If only one signal is being encoded this is a waste of resources, since the cryptosignal then requires more bandwidth than the original signal even though both carry essentially the same information [24].

A better solution is to operate directly on a discrete space and generate chaotic sequences over a finite alphabet that can be combined with the original (digital) message to produce the (discrete) ciphertext. Since we do not want to increase the bandwidth needed to transmit the ciphertext (it should require the same bandwidth as the original text), any transformation we apply is either a permutation of the symbols in the original text or some sort of invertible substitution. In the latter case, each digit or letter of the ciphertext is calculated from the original text and the key (which determines the chaotic sequence used in encoding/decoding) as follows:


\begin{displaymath}
c_i = C_k(t_i) = t_i + k \pmod{m}
\end{displaymath} (19)

where $c_i$ is the $i$-th digit of the ciphertext, $t_i$ the $i$-th digit of the original text, and $k$ the current value of the chaotic sequence. We also assume the alphabet has size $m$.

Decoding is simply performed by the inverse operation of (19):


\begin{displaymath}
t_i = c_i - k \pmod{m}
\end{displaymath} (20)

This is possible for Bob because he knows the key and can reconstruct the chaotic sequence appearing in (20). It is very difficult for Oscar, on the other hand, because he lacks information on the initial conditions of the chaotic system that produced $k$. In fact, the real key is the set of initial conditions that gives rise to the long sequence of values $k$, called a running key, used in (19) and (20).
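The substitution of (19) and its inverse (20) can be sketched as follows. This is a minimal illustration, not the authors' implementation; the function names and the choice of integer lists for text and key are assumptions, and the running key is taken as given.

```python
# Sketch of the cipher in Eqs. (19)-(20): modular addition of a
# running key, assuming an alphabet of size m whose symbols are
# represented as integers 0..m-1. The running key itself would be
# generated from the chaotic system (see Eq. (21)).

def encode(text, running_key, m=26):
    """c_i = t_i + k_i (mod m), as in Eq. (19)."""
    return [(t + k) % m for t, k in zip(text, running_key)]

def decode(cipher, running_key, m=26):
    """t_i = c_i - k_i (mod m), as in Eq. (20)."""
    return [(c - k) % m for c, k in zip(cipher, running_key)]
```

Because subtraction undoes addition modulo $m$, decoding with the same running key recovers the original text exactly, which is what makes the substitution invertible for Bob.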

Thus Alice, Bob, and Oscar all know exactly what chaotic system is being used, but only Alice and Bob share the knowledge of the exact initial conditions of the system; this constitutes the key. Oscar may try to guess these conditions, but his probability of success is inversely proportional to $2^n$, where $n$ is the number of effective bits used to describe the initial conditions. Moreover, even a close guess would not help Oscar much, because the chaotic system's sensitivity to initial conditions amplifies any initial discrepancy within a few iterations.

To illustrate this method we use as our chaotic system the quadratic map:


\begin{displaymath}
x_{k+1} = x_k^2 - 2
\end{displaymath} (21)

It can be proven that (21) is essentially the same as (12) with $\lambda = 4$, the value at which the logistic map exhibits fully developed chaos [31]. The key is the initial point $x_0$.

Figure 3: $k$-gram entropy
\includegraphics[height=3in, width=4in]{c3.eps}

Figure 4: Entropy difference
\includegraphics[height=3in, width=4in]{c5.eps}

Figures 3 and 4 show the effect of applying the encoding algorithm to a series of twelve 10000-character text fragments extracted from literary works written in English [24]. We can observe in Figure 3 that the results for fifteen different keys are almost identical: at every level, the complexity of the ciphertext is greater than that of the original text. Figure 4 shows only the difference in complexity between the ciphertext and the original text, which makes it easier to see that low-order $k$-grams benefit most from this increase in complexity. In short: the ciphertext appears more complicated than the original text at every scale, but especially at the scale of small letter groupings.

Here we have used the $k$-gram entropy as our complexity measure (a $k$-gram being a group of $k$ letters):


\begin{displaymath}
H_k = -\sum_{i} P_{ki} \log_2 (P_{ki})
\end{displaymath} (22)

where $P_{ki}$ is the frequency of the $i$-th $k$-gram.
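Equation (22) can be computed directly from a text by counting overlapping $k$-grams and summing over their observed frequencies. This is a minimal sketch under that reading; the function name is an assumption.

```python
from collections import Counter
from math import log2

# Sketch of Eq. (22): H_k = -sum_i P_{ki} log2(P_{ki}), where P_{ki}
# is the observed frequency of the i-th k-gram (a group of k letters).

def kgram_entropy(text, k):
    """Empirical k-gram entropy of a string, in bits."""
    grams = [text[i:i + k] for i in range(len(text) - k + 1)]
    n = len(grams)
    counts = Counter(grams)
    return -sum((c / n) * log2(c / n) for c in counts.values())
```

For $k = 1$ this reduces to the ordinary letter entropy: a text using one symbol has entropy 0, and a text using $m$ equally frequent symbols has entropy $\log_2 m$, the maximum a ciphertext over that alphabet can approach.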


root 2001-01-22