Wires in your brain?


  • Wires in your brain?

    Would you accept an implant that allowed your brain to communicate with digital networks bidirectionally at the level of abstract thought? Obviously the security implications are, well, nuts, but if consciousness were augmented in such a manner, wouldn't the flaws in human consciousness itself, from which security vulnerabilities stem, be eliminated as well?

  • #2
    Originally posted by bascule
    Would you accept an implant that allowed your brain to communicate with digital networks bidirectionally at the level of abstract thought?
    This reminds me of a section of The Hitchhiker's Guide to the Galaxy where the people of a planet are cursed with the ability to read each other's minds and, as a result, find they cannot have privacy. Their solution? If they always talk, then they cannot think about private things.

    if consciousness were augmented in such a manner, wouldn't the flaws in human consciousness itself from which security vulnerabilities stem be eliminated as well?
    Two words: "mental illness." Imagine what would happen to a population when at least one person with mental problems is included. Consider those studies which state that men think about sex at least once every 15 minutes, 10 minutes, 5 minutes, or 1 minute.

    Our brains were not designed (through evolution) to work this way, with other brains directly passing conscious thought. I would be worried that we would not have mechanisms to control the conscious thoughts we received or sent, especially while dreaming.

    There would also be an issue where a few nodes would understand the security implications of specific bits of code and be hammered by the many nodes that were clueless.

    Also, many security problems exist in "border cases", where it is not a single section of code but perhaps how the code is used elsewhere, or in combination with other code, that leads to unexpected problems.
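
    A minimal sketch of that kind of border case in C (the function names and record size are invented for illustration): each function looks fine on its own, but combined, a large count wraps the multiplication, so the buffer is smaller than the writer assumes.

      #include <stdlib.h>

      /* Looks fine alone, but on a 32-bit size_t a huge count wraps
       * count * 64 and returns a buffer far smaller than intended. */
      char *alloc_records(size_t count)
      {
          return malloc(count * 64);
      }

      /* Also looks fine alone, but trusts that buf really holds
       * count records, so it writes past the wrapped allocation. */
      void fill_records(char *buf, size_t count)
      {
          for (size_t i = 0; i < count; i++)
              buf[i * 64] = 0;
      }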



    • #3
      Hell no, I wouldn't put that shit in my head. Dude, haven't you ever seen Ghost in the Shell? If someone sends your computer a virus, you can get that fixed. I don't think it's that easy with your brain.



      • #4
        Originally posted by bascule
        Would you accept an implant that allowed your brain to communicate with digital networks bidirectionally at the level of abstract thought? Obviously the security implications are, well, nuts, but if consciousness were augmented in such a manner, wouldn't the flaws in human consciousness itself, from which security vulnerabilities stem, be eliminated as well?
        At first thought, yes, the ability to control computers with just a thought is really intriguing. But then, as you brought up, one would have to worry about the security of it. I would say 'yes', AS LONG AS someone could prove to me that it was a 'perfect' system. As there is no such thing as true perfection, this could not be proven, so it would have to be 'no'. And by 'perfect' I mean un-hackable, which we know is also improbable. I guess for safety reasons I would have to say 'no', no matter how tempting it may be.

        As for security vulnerabilities? I believe we would look at the code, implementation, and hardware much more closely. Technically we would still be human, and therefore still capable of mistakes. But then again, a human being is the by-product of their thoughts, among other things, and if we were to alter the place our thoughts come from, then maybe we wouldn't be human at all.
        I need a cigarette.
        Good question though. What made you think of it?
        The only stupid question is the one that you don't ask.
        Or the one that ends up in /dev/null.



        • #5
          I'd say the majority of security vulnerabilities arise from two things:

          1. It's hard to remember stuff
          2. Math is hard

          If such a device were to make machine language as intuitive as any other form of thinking or expression, and memory and mathematical skills were "perfect", then why would security vulnerabilities continue to arise?

          Conversely, I suppose such a device would make it substantially easier to find complex race conditions and other problems which would otherwise be beyond the ability of the human mind to decipher...
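
          For instance, the classic time-of-check-to-time-of-use race, sketched here in C (the function and file handling are invented for illustration): the check and the use are each correct in isolation, but another process can act in the window between them.

            #include <fcntl.h>
            #include <unistd.h>

            /* Between access() (time of check) and open() (time of use),
             * another process can swap path for a symlink to something
             * sensitive, so the permission check proves nothing. */
            int append_log(const char *path, const char *msg, size_t len)
            {
                if (access(path, W_OK) != 0)
                    return -1;
                /* race window: path can be replaced right here */
                int fd = open(path, O_WRONLY | O_APPEND);
                if (fd < 0)
                    return -1;
                (void)write(fd, msg, len);
                close(fd);
                return 0;
            }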



          • #6
            Yes, but aren't we talking about two different things here: the COMMUNICATION side and the UTILIZATION/COMPUTATIONAL side? Meaning, just because you can communicate directly with a machine does not imply your ability to understand signals from that machine. For instance, I can blast a small, slow processor with all kinds of data, but just because I have a pipe to it does not imply that the data will be used or understood properly...

            Would a certain amount of 'adaptation' take place? Surely.

            As far as the hacking goes, is even a non-wired brain safe? What about social engineering? What about hypnosis? Surgery to alter the brain?

            You're right, there is no 'perfect' system (as you defined perfect, meaning non-hackable). But does one really exist now? A bullet to the head is one hell of a system crash. (I'm just saying~)

            The only criterion I would have for doing some type of augmentation like that would be a demonstration of improvement: baby steps to artificial evolution, if you will... and didn't Sony just take out some strange patents like this?

            LosT



            • #7
              Originally posted by bascule
              I'd say the majority of security vulnerabilities arise from two things:

              1. It's hard to remember stuff
              2. Math is hard
              Which is "trusting user data" without checking?
              Which is trusting your compiler to not leave too many clues about your encryption cipher implementation?
              Consider design decisions.
              Consider management's input: "Must be backwards compatible" or "Must meet export restrictions for an International Market, and Import restrictions of country XYZ."
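
              To make that first point concrete, a minimal sketch in C (the names and buffer size are made up for illustration): neither forgetfulness nor bad math causes this; the coder simply trusted the length of user-supplied data.

                #include <string.h>

                /* Trusts attacker-controlled input: anything longer than
                 * 31 bytes smashes the stack. Neither memory nor math is
                 * the problem here. */
                void handle_request(const char *user_input)
                {
                    char buf[32];
                    strcpy(buf, user_input);
                }

                /* The bounded version costs two lines. */
                void handle_request_safe(const char *user_input)
                {
                    char buf[32];
                    strncpy(buf, user_input, sizeof(buf) - 1);
                    buf[sizeof(buf) - 1] = '\0';
                }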

              If such a device were to make machine language as intuitive as any other form of thinking or expression, and memory and mathematical skills were "perfect", then why would security vulnerabilities continue to arise?
              Consider the notion that even the best coders have a per-project limit, of about 10k lines of code, at which they can still understand all of the pieces at the same time.

              Go on to consider that with higher-level languages we have abstractions that give us fewer lines of code to complete the same tasks after compilation or interpretation. (You are talking about machine language.)

              I am not so sure the device would make our math better.

              A sharing of thoughts does not increase the maximum capacity we have to fully understand all the parts of a project, which stays at around 10k lines of code.

              Now, consider the number of lines of code in MS Windows XP (commonly reported to be on the order of 40 million) and estimate how many lines of machine code that would be.

              Consider what would be required for a person to understand the whole project.

              You apply an additive effect to "help" the process, but don't apply it to errors. The Mythical Man-Month shows us the effects of adding people to a project: it does not lead to linear growth in code productivity, and adding more brains to a project can slow it down. Allowing bad ideas to pass directly? Such effects would be additive as well.

              I think the best application of transmitting thoughts would be education.



              • #8
                What about the nanotechnology that is being tested today?

                http://www.zyvex.com/nanotech/nanotechAndMedicine.html

                Little machines that a doctor injects into your system to perform tissue repair, etc.

                This to me is also a scary concept. What if you cannot rid them from your system and they are not performing the correct function?

                I have seen firsthand the incompetence of doctors with technology, such as their wireless deployments and the newest Bluetooth devices on which some carry all their patient information without a second thought about security or privacy. So injecting little nanobots into my system? No thanks. Not yet, anyway.



                • #9
                  Security is always a trade-off... as a smart guy says... (Bruce Schneier)

                  Convenience VS Security

                  Most of the time, Complexity VS Simplicity can also be seen...

                  Privacy VS Security (especially in a post-9/11 world...)

                  So... wired directly to my computer? A very cool thought... The implications of such a big idea/change are beyond our level of comprehension...

                  Connecting technology and electronic devices to biological devices (i.e., people/animals) is always going to be... 'interesting'... Questions of religion, morals, security, abuse, and the future are waiting to strike on a topic such as this...
                  Nice job though... fun to dream. (Nightmare or not, in this case it's up to you...)
                  The only constant in the universe is change itself



                  • #10
                    Read-only?

                    As far as I know, this kind of technology currently only goes as far as controlling computers using "brain waves"; it is not yet "downloading" stuff into your brain, or your brain actually executing code. Now...

                    Originally posted by Dr. Z2A
                    Hell no, I wouldn't put that shit in my head. Dude, haven't you ever seen Ghost in the Shell? If someone sends your computer a virus, you can get that fixed. I don't think it's that easy with your brain.
                    Actually, this is exactly the kind of thing I think about when I hear about full brain/machine interaction. I've yet to hear of or see a truly unhackable system, so I do not see any guarantee that a system such as this wouldn't "crash" and make my brain execute something weird. Not to mention what that would do to your brain! I do not want to lose memories, or do things I do not want to do, because of, say, checking my e-mail.

                    Yes, having this direct interaction can be great. But it opens up a whole can of worms, for a security breach in a system like this can bring severe consequences.

                    As for the math machine, well, if it could be done, that might be good, as long as the device is only accessible by my brain. (Oh, I wish I'd had that when I took multivariable calculus!)
                    "Programming in Visual Basic is like making a building out of LEGOs. Use C, the king of programming languages!"

                    0x029A
                    The number of the Beast!



                    • #11
                      Originally posted by danix

                      I do not want to lose memories, or do things I do not want to do, because of, say, checking my e-mail.

                      (Oh, I wish I'd had that when I took multivariable calculus!)

                      What about the ability to erase bad memories?

                      And multivariable calc isn't that bad. :)

                      LosT



                      • #12
                        Originally posted by LosT
                        What about the ability to erase bad memories?

                        And multivariable calc isn't that bad. :)

                        LosT
                        But is it really a good idea to erase bad memories? You have to consider that you learned things from those experiences. And if one could erase bad memories, someone could go and mind-wipe you completely (replace your memories with fakes, maybe?).

                        Little machines that a doctor injects into your system to perform tissue repair, etc.

                        This to me is also a scary concept. What if you cannot rid them from your system and they are not performing the correct function?
                        I think they could probably make a vaccine for that kind of situation. But who knows; they could really fuck you up if someone didn't set it up right.



                        • #13
                          Originally posted by Dr. Z2A
                          Hell no, I wouldn't put that shit in my head. Dude, haven't you ever seen Ghost in the Shell? If someone sends your computer a virus, you can get that fixed. I don't think it's that easy with your brain.
                          I'm assuming you are joking here. :-)

                          Here is an interesting point. I'd argue that humans have had mental viruses for as long as we have had history:

                          Consider ideas that are passed from one human to others through speech or writing. Some mental viruses are strong enough to sway large groups of people to share a common goal that was not common before the idea infected their minds.

                          Take popular commercials as an example: catchy songs or phrases that cause people to repeat them ("Can you hear me now?"), or popular generational phrases ("All your base are belong to us."). Meanwhile, the strongest ideas, the ones that can push people to kill or die, can be found in [idea omitted] and [idea omitted], which aren't usually discussed here because of what happens when their payloads are activated.

                          It seems we get patched against many of these viruses through education, rational thought, logical analysis, and understanding.

                          How might this change if we were able to receive new ideas that bypass our filters?



                          • #14
                            Originally posted by TheCotMan
                            I'd argue that humans have had mental viruses for as long as we have had history
                            It sounds like TheCotMan has read/would like Neal Stephenson's Snow Crash...



                            • #15
                              Originally posted by TheCotMan
                              I'm assuming you are joking here. :-)

                              Here is an interesting point. I'd argue that humans have had mental viruses for as long as we have had history:

                              Consider ideas that are passed from one human to others through speech or writing. Some mental viruses are strong enough to sway large groups of people to share a common goal that was not common before the idea infected their minds.

                              Take popular commercials as an example: catchy songs or phrases that cause people to repeat them ("Can you hear me now?"), or popular generational phrases ("All your base are belong to us."). Meanwhile, the strongest ideas, the ones that can push people to kill or die, can be found in [idea omitted] and [idea omitted], which aren't usually discussed here because of what happens when their payloads are activated.

                              It seems we get patched against many of these viruses through education, rational thought, logical analysis, and understanding.

                              How might this change if we were able to receive new ideas that bypass our filters?

                              Well, I wasn't joking, first of all. And I wasn't talking about viruses in the form of ideas. This stuff uses technology connected to your brain, right? I was talking about a virus for the tech that would have adverse effects on your brain. An idea you can refuse to go along with; you have a choice about it, and an idea usually has to be a good one to persuade us. I meant that the technology would be used to "fry" your brain or something like that (with you having no choice in the matter).

