
Rule 30 encryption


  • bascule
    replied
    Re: Rule 30 encryption

    Originally posted by Thorn View Post
    Neat stuff, bascule. Is there some maximum message length of the clear text that is to be avoided because after that the algorithm makes the key length unwieldy?
    The amount of memory needed to run the cipher scales O(n) with the message length. That's the real problem.

    You could try to get around this by taking some aperiodic snapshot of a section of the CA and using that as a new key; however, I suspect that could open up an attack vector.



  • Thorn
    replied
    Re: Rule 30 encryption

    Neat stuff, bascule. Is there some maximum message length of the clear text that is to be avoided because after that the algorithm makes the key length unwieldy?



  • bascule
    started a topic Rule 30 encryption

    Rule 30 encryption

    Cellular automata (CA) are systems of bits with specific rules for flipping bits on and off based on their neighbors. Run one over time and interesting patterns appear. The most famous is Conway's Game of Life, which is pretty much ubiquitously known among geeks, to the point that some jerkass proposed the Life glider be the official "geek symbol".

    Anywho, supergenius Stephen Wolfram studied CA in depth, starting with simple 1-dimensional ones which evolve over time (i.e. one dimension of space, one dimension of time). Contrast with Life, which has two dimensions of space and one of time.

    Anyway, one of these 1d CAs exhibited patterns he considered to be statistically random. It's known as rule 30, and it maps each of the 8 possible input states (the 3-bit neighborhood of a cell) to a single output bit: the state the center cell will be in on the next iteration. Here's a diagram of what inputs map to what outputs:
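    The mapping can also be written out as a lookup table; rule 30 gets its name from reading the output column as a binary number (00011110 = 30). A quick Python sketch (my own illustration, names are mine):

    ```python
    # Rule 30: the next state of a cell depends on the 3-bit neighborhood
    # (left, center, right). Reading the outputs for inputs 111 down to 000
    # gives the binary number 00011110 = 30, hence the name.
    RULE_30 = {
        (1, 1, 1): 0,
        (1, 1, 0): 0,
        (1, 0, 1): 0,
        (1, 0, 0): 1,
        (0, 1, 1): 1,
        (0, 1, 0): 1,
        (0, 0, 1): 1,
        (0, 0, 0): 0,
    }

    def rule30_bit(left, center, right):
        # Equivalent closed form: left XOR (center OR right)
        return left ^ (center | right)

    # Sanity check: the closed form matches the lookup table.
    assert all(rule30_bit(*k) == v for k, v in RULE_30.items())
    ```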



    This can be used to make a stream cipher. Here's the basic description of how:

    1. Generate a random key of the desired length.

    2. Use this to obfuscate the first key-length worth of data (something a little bit better than a straight up XOR is handy here).

    3. Pad the beginning and end of the key with a 0.

    4. Compute the next iteration of rule 30.

    5. Use this to obfuscate the next (key length + 2) bits worth of data.

    6. Rinse, repeat (go back to step 3).
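    The steps above can be sketched as a toy in Python. This is my own illustration, not the paper's construction: it uses a plain XOR for the obfuscation step (the post itself suggests something a bit stronger), and the helper names are mine.

    ```python
    import secrets

    def rule30_step(cells):
        # Pad with a 0 at each end, then apply rule 30 across the padded row
        # (cells beyond the edges count as 0), so the row grows by 2 bits.
        p = [0] + cells + [0]
        get = lambda i: p[i] if 0 <= i < len(p) else 0
        return [get(i - 1) ^ (get(i) | get(i + 1)) for i in range(len(p))]

    def rule30_encrypt(bits, key_bits):
        out = []
        row = list(key_bits)
        pos = 0
        while pos < len(bits):
            chunk = bits[pos:pos + len(row)]
            # Toy obfuscation: a straight XOR with the current CA row.
            out += [b ^ k for b, k in zip(chunk, row)]
            pos += len(chunk)
            row = rule30_step(row)  # keystream row grows by 2 bits per pass
        return out

    key = [secrets.randbits(1) for _ in range(8)]  # random starting key
    msg = [1, 0, 1, 1, 0, 0, 1, 0] * 4
    ct = rule30_encrypt(msg, key)
    assert rule30_encrypt(ct, key) == msg  # XOR is symmetric: decrypt = encrypt
    ```

    Note how the memory use mirrors bascule's point below: the CA row that serves as the keystream never stops growing.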

    Here's a paper on it:

    http://www.cs.indiana.edu/~dgerman/2...nce/dgelbm.pdf

    So what's the magic secret? Why can such a simple algorithm provide high-security encryption?

    The growing key length is really the problem: the statistical randomness comes from the ever-growing pattern, and the key picks up two bits with every iteration.

    For this reason it's not really a practical form of streaming encryption compared to your average block cipher. But even so, it's mind-boggling that such a simple algorithm can provide high-security streaming encryption.

    Here's what Rule 30 actually looks like, plotted over time:
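    You can reproduce the plot in a terminal; a small sketch, starting from a single live cell (helper names are my own):

    ```python
    def rule30_row(cells):
        # Next generation: each cell becomes left XOR (center OR right),
        # with cells beyond the edges treated as 0.
        get = lambda i: cells[i] if 0 <= i < len(cells) else 0
        return [get(i - 1) ^ (get(i) | get(i + 1)) for i in range(len(cells))]

    def print_rule30(steps=16):
        # Single live cell in the middle, wide enough that the pattern
        # never reaches the edges within `steps` generations.
        row = [0] * steps + [1] + [0] * steps
        for _ in range(steps):
            print(''.join('#' if c else ' ' for c in row))
            row = rule30_row(row)

    print_rule30()
    ```

    This prints the familiar chaotic triangle: an orderly left edge and a turbulent interior, which is where the statistical randomness comes from.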



    More on Rule 30:

    http://en.wikipedia.org/wiki/Rule_30