Smartworm

  • #1

    I'm pretty certain someone has thought of this before, but has anyone imagined the potential of seed AI + an Internet worm?

    One of the biggest problems facing AI researchers at present is the computational complexity of the brain: modeling all of the neurons in the human brain and their dense connectivity structure would require at least several petaflops of computational power.

    The thing is, that much compute is obviously out there now, if a seed AI program could build a botnet out of every single computer on the Internet.

    The nice thing about a brain is that it scales; many of the structures (including ones assumed to comprise the machinery of consciousness, such as the "columns" of the cerebral cortex) are merely slightly varied repetitions of the same basic pattern.

    A seed AI program contained within an Internet worm, armed with a small set of initial exploits to get it going, would take over successively more computers. But unlike other worms, its intelligence would increase with each computer it infected and linked back into the global neural network.

    It could then start devising its own exploits and taking control of more and more computers, each of which further increases its intelligence. Never mind the fact that as a seed AI it's been specifically created as a self-improving algorithm (à la Neuromancer).

    I guess the hard part would be hitting the "critical mass" of enough infected hosts for the "Smartworm" to be intelligent enough to begin designing its own exploits and/or self-improving its own intelligence. (Some rough numbers on the compute side are sketched below.)
    45 5F E1 04 22 CA 29 C4 93 3F 95 05 2B 79 2A B0
    45 5F E1 04 22 CA 29 C4 93 3F 95 05 2B 79 2A B1
    [ redacted ]
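    A rough back-of-envelope check on the petaflops figure above and the host count it implies, in Python. Every constant below (neuron count, synapses per neuron, firing rate, FLOPs per synaptic event, per-host throughput) is an assumed ballpark number, not a measurement, so read the output as an order-of-magnitude sketch only.

    Code:
    # Back-of-envelope estimate: sustained compute for a naive whole-brain
    # simulation, and how many commodity PCs that implies. Every constant
    # here is an assumed ballpark figure, not a measurement.

    NEURONS = 8.6e10            # ~86 billion neurons (commonly quoted estimate)
    SYNAPSES_PER_NEURON = 7e3   # assumed average synapses per neuron
    FIRING_RATE_HZ = 10         # assumed average firing rate
    FLOPS_PER_EVENT = 10        # assumed floating-point ops per synaptic event

    brain_flops = NEURONS * SYNAPSES_PER_NEURON * FIRING_RATE_HZ * FLOPS_PER_EVENT

    PC_FLOPS = 5e9              # assumed sustained throughput of one desktop PC

    print(f"Naive brain model: {brain_flops:.1e} FLOPS "
          f"(~{brain_flops / 1e15:.0f} petaflops)")
    print(f"Hosts needed at {PC_FLOPS:.0e} FLOPS each: {brain_flops / PC_FLOPS:.1e}")

    The exact numbers don't matter much; the point is that the total lands within a couple of orders of magnitude of a very large botnet, assuming (unrealistically) that network latency between hosts costs nothing.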

  • #2
    Originally posted by bascule
    I'm pretty certain someone has thought of this before, but has anyone imagined the potential of seed AI + an Internet worm?

    [...]


    But wouldn't it be interesting if the "creator" of this worm had something to do with the CCortex Project?

    Hmm, sounds a lot like the plot of T3.


  • #3
    Originally posted by CP99
    Hmm, sounds a lot like the plot of T3.
    Kind of. T3 had a worm which they figured they could somehow stop by unleashing SAI upon the Internet. What I'm proposing here is a way any yokel could (somewhat maliciously) brew up a seed AI in their basement (provided they were a genius and had access to something like Artificial Development's CorticalDB).

    Originally posted by CP99
    But wouldn't it be interesting if the "creator" of this worm had something to do with the CCortex Project?
    As outlined above, he could simply be one of their clients...


  • #4
    Originally posted by bascule
    I'm pretty certain someone has thought of this before, but has anyone imagined the potential of seed AI + an Internet worm?

    [...]
    Read "The Shockwave Rider" by John Brunner (1975). He used a similar worm as a plot device.
    Thorn
    "If you can't be a good example, then you'll just have to be a horrible warning." - Catherine Aird


  • #5
    Would this not be another example of a "polymorphic worm", in which some copies have more intelligence than others?


  • #6
    Originally posted by cindy
    Would this not be another example of a "polymorphic worm", in which some copies have more intelligence than others?
    Polymorphic code doesn't necessarily imply any functional change whatsoever. It's more like how in DNA multiple codons code for the same amino acid: you can have genes where no two codons are alike, yet they still code for the same protein sequence. (A toy illustration of that analogy is below.)
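    A tiny Python sketch of that codon analogy. The three-codon table is a genuine subset of the standard genetic code; the two "genes" are made up for illustration.

    Code:
    # Two "genes" with no codon in common that still translate to the same
    # protein, just as polymorphic encodings of a program can differ
    # byte-for-byte while behaving identically.

    CODON_TABLE = {
        "CUU": "Leu", "CUC": "Leu", "CUA": "Leu", "CUG": "Leu",
        "UCU": "Ser", "UCC": "Ser", "UCA": "Ser", "UCG": "Ser",
        "GGU": "Gly", "GGC": "Gly", "GGA": "Gly", "GGG": "Gly",
    }

    def codons(gene):
        """Split an RNA string into three-letter codons."""
        return [gene[i:i + 3] for i in range(0, len(gene), 3)]

    def translate(gene):
        """Map each codon to its amino acid."""
        return [CODON_TABLE[c] for c in codons(gene)]

    gene_a = "CUUUCUGGU"   # Leu-Ser-Gly, one choice of codons
    gene_b = "CUGUCAGGC"   # Leu-Ser-Gly, entirely different codons

    assert not set(codons(gene_a)) & set(codons(gene_b))   # no codon shared
    assert translate(gene_a) == translate(gene_b)          # same "behaviour"
    print(translate(gene_a))   # ['Leu', 'Ser', 'Gly']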


  • #7
    Could be interesting to see. Basically it would create its own P2P network, which it would use to communicate and build up info. A lot of people stay infected for months, if not years, so it could automatically detect this and create "super nodes" to keep master copies of the AI database and update each one, while minor databases would spread across newly infected machines. Would be fun to write something like that and throw it on an isolated network to see what happens. (A rough sketch of the super-node idea is below.)
    Red Squirrel
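    A minimal simulation of that super-node scheme, assuming a made-up node model, uptime threshold, and update rule (no networking, no real hosts):

    Code:
    import random
    from dataclasses import dataclass

    # Long-lived nodes get promoted to "super nodes" that hold the master
    # copy of a shared database; everyone else syncs from a super node.
    # Every number here is an arbitrary assumption for illustration.

    @dataclass
    class Node:
        name: str
        uptime_days: float      # how long this machine has stayed online
        db_version: int = 0     # version of the shared database it holds
        is_super: bool = False

    UPTIME_THRESHOLD = 30.0     # assumed promotion cut-off, in days

    def promote_super_nodes(nodes):
        """Mark long-lived nodes as super nodes (assumed promotion rule)."""
        for n in nodes:
            n.is_super = n.uptime_days >= UPTIME_THRESHOLD
        return [n for n in nodes if n.is_super]

    def publish_update(nodes, new_version):
        """Super nodes take the master copy first; the rest sync from one."""
        supers = [n for n in nodes if n.is_super]
        if not supers:
            return              # no stable node to hold a master copy yet
        for n in supers:
            n.db_version = new_version
        for n in nodes:
            if not n.is_super:
                n.db_version = random.choice(supers).db_version

    random.seed(0)
    swarm = [Node(f"host{i}", random.expovariate(1 / 20)) for i in range(50)]
    supers = promote_super_nodes(swarm)
    publish_update(swarm, new_version=1)
    print(f"{len(supers)} super nodes out of {len(swarm)}, "
          f"database versions present: {sorted({n.db_version for n in swarm})}")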


  • #8
    Indeed a nice idea to try something like that on an isolated network. But I also find it dangerous. What if the writer of such code used it to do harm? Or (maybe a bit too sci-fi) the worm itself turns evil, since once it gets smarter it would be able to think for itself.

    Anyway, would be fun to see :P


  • #9
    What if the worm was very small and could lie dormant for years? If it wasn't picked up on for a long time (because it doesn't do anything yet), it could grow. Then, if someone found out it was being looked into, an "ignition" file, like a second half, could complete it. It would then access the net and reach that "critical mass" mentioned above, by downloading any updated files that were designed during the downtime. Hell, Bill Gates could do this today if he had a reason for it.

    This is just above the moron level for a theory, but I'm just throwing it out there.

    I think MyDoom was found before it started working, right?

    Another problem is that if it takes so much to get close to a human's thinking capability, it may not grow strong enough to overpower man, or even a group of men (or women, for politically correct reasons). A toy model of the "critical mass" timing is sketched below.
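    A toy growth model of that "critical mass" question: logistic spread of infected hosts with a constant clean-up rate, checked against the brain-scale compute estimate from the first post. Every parameter (host pool, spread rate, clean-up rate, per-host FLOPS) is an illustrative assumption, not a prediction.

    Code:
    # Logistic spread with clean-up: does the aggregate compute of the
    # hypothetical worm ever cross a brain-scale threshold? All numbers
    # are illustrative assumptions.

    TOTAL_HOSTS = 600e6     # assumed reachable population of Internet hosts
    SPREAD_RATE = 0.9       # assumed new infections per infected host per day
    CLEANUP_RATE = 0.3      # assumed fraction of infected hosts cleaned per day
    PC_FLOPS = 5e9          # assumed sustained FLOPS per commodity host
    BRAIN_FLOPS = 6e16      # brain-scale target from the earlier estimate

    infected = 1_000.0      # assumed initial seeding
    for day in range(1, 61):
        new = SPREAD_RATE * infected * (1 - infected / TOTAL_HOSTS)
        cleaned = CLEANUP_RATE * infected
        infected = max(infected + new - cleaned, 0.0)
        if infected * PC_FLOPS >= BRAIN_FLOPS:
            print(f"Crosses the brain-scale threshold on day {day} "
                  f"with ~{infected:.1e} hosts")
            break
    else:
        print(f"No critical mass within 60 days (~{infected:.1e} hosts)")

    Nudge SPREAD_RATE down or CLEANUP_RATE up and the outcome flips quickly, which is really the whole "critical mass" argument in one loop.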


  • #10
    If I had the knowledge of AI I'd try this out on a (very) isolated network. Sounds like it could be fun.

    Then you create another one that has to try and defeat the bad one. WW2, now available in a VMware session.
    Red Squirrel
