Would you accept an implant that allowed your brain to communicate with digital networks bidirectionally at the level of abstract thought? Obviously the security implications are, well, nuts, but if consciousness were augmented in such a manner, wouldn't the flaws in human consciousness itself, from which security vulnerabilities stem, be eliminated as well?
Wires in your brain?
-
Originally posted by bascule
Would you accept an implant that allowed your brain to communicate with digital networks bidirectionally at the level of abstract thought?
if consciousness were augmented in such a manner, wouldn't the flaws in human consciousness itself from which security vulnerabilities stem be eliminated as well?
Our brains were not designed (through evolution) to pass conscious thought directly to other brains this way. I would be worried that we would have no mechanisms to control the conscious thoughts we received or sent-- especially while dreaming.
There would also be an issue where a few nodes would understand the security implications of specific bits of code, and be hammered by the many nodes that were clueless.
Also, many security problems exist in "border cases" where it is not a single section of code, but perhaps how the code is used elsewhere or in combination with other code that leads to unexpected problems.
-
Originally posted by bascule
Would you accept an implant that allowed your brain to communicate with digital networks bidirectionally at the level of abstract thought? Obviously the security implications are, well, nuts, but if consciousness were augmented in such a manner, wouldn't the flaws in human consciousness itself from which security vulnerabilities stem be eliminated as well?
I need a cigarette.
Good question though. What made you think of it?
The only stupid question is the one that you don't ask.
Or the one that ends up in /dev/null.
-
I'd say the majority of security vulnerabilities arise from two things:
1. It's hard to remember stuff
2. Math is hard
If such a device were to make machine language as intuitive as any other form of thinking or expression, and memory and mathematical skills were "perfect", then why would security vulnerabilities continue to arise?
I suppose conversely such a device would make it substantially easier to find complex race conditions and other problems which otherwise would be beyond the ability of the human mind to decipher...
45 5F E1 04 22 CA 29 C4 93 3F 95 05 2B 79 2A B0
45 5F E1 04 22 CA 29 C4 93 3F 95 05 2B 79 2A B1
[ redacted ]
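The race conditions bascule mentions are exactly the kind of bug that is hard for an unaugmented mind to spot. As a minimal sketch (hypothetical, not from the thread), here is the classic "lost update" replayed deterministically: two workers each read a shared counter before either writes back, so one increment silently disappears.

```python
# Deterministic replay of a "lost update" race condition.
# Two workers each perform a read-modify-write on a shared counter;
# with this unlucky interleaving, one increment is lost.

counter = 0

def read():
    return counter

def write(value):
    global counter
    counter = value

# Unlucky interleaving: both workers read before either writes.
a = read()      # worker A reads 0
b = read()      # worker B also reads 0
write(a + 1)    # worker A writes 1
write(b + 1)    # worker B writes 1, clobbering A's update

print(counter)  # 1, not the expected 2: one increment vanished
```

In a real multithreaded program the interleaving is nondeterministic, which is what makes such bugs so hard to reproduce and reason about.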
-
Yes, but aren't we talking about two different things here: the COMMUNICATION side and the UTILIZATION/COMPUTATIONAL side? Just because you can communicate directly with a machine does not imply you can understand the signals from that machine. For instance, I can blast a small, slow processor with all kinds of data, but having a pipe to it does not imply that the data will be used or understood properly....
Would a certain amount of 'adaptation' take place? Surely.
As far as the hacking goes, is even a non-wired brain safe? What about social engineering? What about hypnosis? Surgery to alter the brain?
You're right, there is no 'perfect' system (as you defined perfect meaning non-hackable)- But does one really exist now? A bullet to the head is one hell of a system crash. (I'm just saying~)
The only criteria I would have about doing some type of augmentation like that would be a demonstration of improvement- baby steps to artificial evolution if you will....and didn't Sony just take out some strange patents like this?
LosT
-
Originally posted by bascule
I'd say the majority of security vulnerabilities arise from two things:
1. It's hard to remember stuff
2. Math is hard
Which of those two covers trusting your compiler not to leave too many clues about your encryption cipher implementation?
Consider design decisions.
Consider management's input: "Must be backwards compatible" or "Must meet export restrictions for an International Market, and Import restrictions of country XYZ."
If such a device were to make machine language as intuitive as any other form of thinking or expression, and memory and mathematical skills were "perfect", then why would security vulnerabilities continue to arise?
Go on to consider that with higher-level languages we have abstractions that let us complete the same tasks in fewer lines of code after compilation or interpretation. (You are talking about machine language.)
I am not so sure the device would make our math better.
A sharing of thoughts does not increase the maximum capacity we have to fully understand all the parts of a project in under 10k lines of code.
Now. Consider the number of lines of code in MS Windows XP and estimate how many lines of machine code that would be.
Consider what would be required for a person to understand the whole project.
You apply an additive effect to "help" the process, but don't apply it to errors. The Mythical Man-Month shows us the effects of adding people to a project: it does not lead to linear growth in code productivity. Adding more brains to a project can slow it down. And allowing bad ideas to pass directly? Such effects would be subject to the same additive effect as well.
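The Mythical Man-Month point can be made concrete with a little arithmetic (an illustrative sketch, not from the thread): the number of pairwise communication channels among n people grows quadratically, so each added brain adds coordination cost faster than it adds capacity.

```python
# Brooks's law intuition: pairwise communication channels among
# n people grow as n*(n-1)/2, i.e. quadratically in team size.

def channels(n: int) -> int:
    """Number of pairwise communication channels in a team of n people."""
    return n * (n - 1) // 2

for n in (2, 5, 10, 50):
    print(f"{n:>3} people -> {channels(n):>5} channels")
# 2 people have 1 channel; 50 people have 1225.
```

Directly networked brains would multiply the bandwidth of each channel, but not reduce the number of channels that have to stay coherent.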
I think the best application of transmitting thoughts would be education.
-
What about the nanotechnology that is being tested today?
http://www.zyvex.com/nanotech/nanotechAndMedicine.html
Little machines that a doctor injects into your system to perform tissue repair etc..
This to me is also a scary concept. What if you cannot rid them from your system and they are not performing the correct function?
I have seen first hand the incompetence of doctors with technology, such as their wireless deployments and their newest Bluetooth devices on which some carry all their patient information without a second thought for security or privacy. So, injecting little nanobots into my system? No thanks. Not yet, anyway.
-
Security is always a trade-off... as a smart guy (Bruce Schneier) says...
Convenience vs. Security
Most of the time, Complexity vs. Simplicity can also be seen...
Privacy vs. Security (esp. in a post-9/11 world...)
So... wired directly to my computer? A very cool thought.... The implications of such a big idea/change are beyond our level of comprehension....
Connecting technology and electronic devices to biological devices (i.e. people/animals) is always going to be... 'interesting'... Questions of religion, morals, security, abuse and the future are waiting to strike on a topic such as this...
Nice job though... fun to dream. (Nightmare or not in this case, it's up to you...)
The only constant in the universe is change itself
-
Read-only?
From what I know, this kind of technology currently means controlling computers using "brain waves"-- not yet "downloading" stuff into your brain, or your brain actually executing code. Now..
Originally posted by Dr. Z2A
Hell no i wouldnt put that shit in my head. Dude havent u ever seen ghost in the shell? If someone sends your computer a virus u can get that fixed. I dont think its that easy with your brain.
Yes, having this direct interaction can be great. But it opens up a whole can of worms, for a security breach in a system like this can bring severe consequences.
As for the math machine, well, if it could be done, that might be good as long as the device is only accessible by my brain. (Oh, I wish I had that when I took multivariable calculus!)
"Programming in Visual Basic is like making a building out of LEGOs. Use C, the king of programming languages!"
0x029A
The number of the Beast!
-
Originally posted by danix
I do not want to lose memories or do things I do not want to do because of, say, checking my e-mail.
(oh, I wish I had that when I took multivariable calculus!)
What about the ability to erase bad memories?
And multivariable calc isn't that bad. :)
LosT
-
Originally posted by LosT
What about the ability to erase bad memories?
And multivariable calc isn't that bad. :)
LosT
-
Originally posted by Dr. Z2A
Hell no i wouldnt put that shit in my head. Dude havent u ever seen ghost in the shell? If someone sends your computer a virus u can get that fixed. I dont think its that easy with your brain.
Here is an interesting point. I'd argue that humans have had mental viruses for as long as we have had history:
Consider ideas that are passed from one human to others through speech or writing. Some mental viruses are strong enough to sway large groups of people to share a common goal that was not common before the idea infected their minds.
Popular commercials are an example... catchy songs or phrases which cause people to repeat them. ("Can you hear me now?") Popular generational phrases. ("All your base are belong to us.") The strongest ideas-- ideas that can push people to kill, or die-- can be found in [idea omitted] and [idea omitted], which aren't usually discussed here because of what happens when their payloads are activated.
It seems we get patched against many of these viruses through education, rational thought, logical analysis and understanding.
How might this change if we were able to receive new ideas that bypass our filters?
-
Originally posted by TheCotMan
I'm assuming you are joking here. :-)
Here is an interesting point. I'd argue that humans have had mental viruses for as long as we have had history:
Consider ideas that are passed from one human to others through speech or writing. Some mental viruses are strong enough to sway large groups of people to share a common goal that was not common before the idea infected their minds.
Popular commercials are an example... catchy songs or phrases which cause people to repeat them. ("Can you hear me now?") Popular generational phrases. ("All your base are belong to us.") The strongest ideas-- ideas that can push people to kill, or die-- can be found in [idea omitted] and [idea omitted], which aren't usually discussed here because of what happens when their payloads are activated.
It seems we get patched against many of these viruses through education, rational thought, logical analysis and understanding.
How might this change if we were able to receive new ideas that bypass our filters?
Well, I wasn't joking, first of all. And I wasn't talking about viruses in the form of ideas. This stuff uses technology connected to your brain, right? I was talking about a virus for the tech that would have adverse effects on your brain. An idea you can block yourself from going along with; you have a choice about it, and it usually has to be a good idea to persuade us. I meant that technology could be used to "fry" your brain or something like that (with you having no choice in the matter).