Technology Evaluation

Here's something to throw out to the corporate folks. I was recently tasked with developing a technology evaluation program here at my job. Since my role is security focused, it's going to be tuned toward evaluating security-related technology. One of the things I wanted to do was follow some industry 'best practices' for evaluating technology, but I'm having a hard time finding any. I've seen a few folks refer to industry standards, but I can't seem to track down anything hard and fast that says 'A good idea when evaluating enterprise technology is ______'. So, I figured I'd toss it out here: how do you evaluate technology and products in your business environment?
-
Take it apart, put it back together, break it, fix it. If after all that there are no pieces left over, keep it.
Most evaluations have to be tailored to the business's specific needs. I know of no 'general' best practices, but I'm not the most experienced in that area either.
-
I am not corporate, but...
Many of the security evaluations I have seen are from the perspective of:
"What is the cost of not using this technology"
and applying those costs to a risk multiplier for comparison against the cost of implementing and maintaining the technology being considered. This allows for prioritization of purchases.
Add reliability and availability to your evaluation, since the cost of ensuring these will include system rebuilds, man-hours of work, and backup software/hardware.
These costs can come up when there is a security violation: the system needs to be rebuilt, man-hours are lost by the people who need services provided by the system while it is down, and data/man-hours since the last backup are lost.
If you must ensure data privacy, then there are likely costs to information leakage, which can sometimes be added to the above. E.g., do you have to pay for customers'/employees' "free" credit checks if your technology exposes SSN/CCN information?
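To make that cost comparison concrete, here is a minimal sketch using the classic quantitative risk formulas (SLE = asset value x exposure factor, ALE = SLE x ARO). Every figure below is a hypothetical example, not real data:

Code:
def ale(asset_value, exposure_factor, annual_rate_of_occurrence):
    """Annualized Loss Expectancy = (asset value * exposure factor) * ARO."""
    single_loss_expectancy = asset_value * exposure_factor
    return single_loss_expectancy * annual_rate_of_occurrence

# Cost of NOT using the technology: expected yearly loss from the risk it mitigates.
ale_without = ale(asset_value=500_000, exposure_factor=0.4, annual_rate_of_occurrence=0.5)

# Residual risk if it is deployed (it rarely drops to zero).
ale_with = ale(asset_value=500_000, exposure_factor=0.4, annual_rate_of_occurrence=0.05)

annual_cost = 20_000 + 15_000  # hypothetical license/hardware plus man-hours to run it

# Positive value = the purchase pays for itself; rank candidate purchases by this.
net_annual_benefit = (ale_without - ale_with) - annual_cost
print(f"ALE without control: ${ale_without:,.0f}")
print(f"ALE with control:    ${ale_with:,.0f}")
print(f"Net annual benefit:  ${net_annual_benefit:,.0f}")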
What you describe is something that we really need to do at my work, but never seem to have the time for. If you get a solution in private, I'd appreciate seeing what you find too.
-
Unfortunately I don't have a "methodology" for this type of testing, but like Renderman said, just start hammering at it and see what it does. If it is to go into a specific environment, such as healthcare, banking, or federal, you can always fall back on evaluating it against HIPAA, GLBA, NIST 800-53(A), etc.
I usually just start at the first function of a device and keep a network traffic sniffer going the entire time. I would never have thought that a custom VPN client would send a junk "ping" to a DSL address in Canada to check heartbeat before offering the "Network is Unavailable" message. ;-)
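If it helps, here is a minimal sketch of that kind of always-on sniffer using scapy (pip install scapy); run it with capture privileges, and note that the device address is a placeholder, not a real target:

Code:
from scapy.all import sniff, IP

def log_packet(pkt):
    """Print a one-line summary of everything the device under test sends or receives."""
    if IP in pkt:
        print(f"{pkt[IP].src} -> {pkt[IP].dst}  proto={pkt[IP].proto}  len={len(pkt)}")

# Filter on the device under test so unrelated traffic doesn't drown you out;
# 192.0.2.50 is a placeholder (TEST-NET) address for the box being evaluated.
sniff(filter="host 192.0.2.50", prn=log_packet, store=False)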
So, sorry for no formal framework for testing, just the regulatory requirements of the specific field. I would be interested if you do turn something up, however.
-
I am on the same side of the fence as TheCotMan. However, I did find something you may be looking for, noid.
You may request a whitepaper, entitled BEST PRACTICES for Enterprise Network Security, from tradepub.com.
The online form is fairly long, but it may be worth it.
-
Originally posted by TheCotMan: I am not corporate, but...
You smell corporate to me.
Many of the security evaluations I have seen are from the perspective of:
Risk Analysis
"What is the cost of not using this technology"
and applying those costs to a risk multiplier for comparison against the cost of implementing and maintaining the technology being considered. This allows for prioritization of purchases.
ROI
Add reliability and availability to your evaluation, since the cost of ensuring these will include system rebuilds, man-hours of work, and backup software/hardware.
These costs can come up when there is a security violation: the system needs to be rebuilt, man-hours are lost by the people who need services provided by the system while it is down, and data/man-hours since the last backup are lost.
RA
If you must ensure data privacy, then there are likely costs to information leakage, which can sometimes be added to the above. E.g., do you have to pay for customers'/employees' "free" credit checks if your technology exposes SSN/CCN information?
Test Cases - QA
What you describe is something that we really need to do at my work, but never seem to have the time for. If you get a solution in private, I'd appreciate seeing what you find too.
_____________________________
Evaluation:
Make a list of all the product's functions - ALL of them
Create test cases for all the functions
Did it work, does it still work?
And then there are always the negative test cases; a rough sketch of this in code follows below.
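For illustration, here is a minimal sketch of that checklist as code, using Python's unittest; the product module and its authenticate function are hypothetical stand-ins for whatever API the device under evaluation exposes:

Code:
import unittest
from product import authenticate  # hypothetical API of the device under test

class TestAuthentication(unittest.TestCase):
    def test_valid_login_works(self):
        # "Did it work, does it still work?" - the positive case.
        self.assertTrue(authenticate("alice", "correct-password"))

    def test_wrong_password_rejected(self):
        # The negative test case: the function must also fail safely.
        self.assertFalse(authenticate("alice", "wrong-password"))

    def test_empty_credentials_rejected(self):
        # Boundary/negative input: empty strings, malformed input, etc.
        self.assertFalse(authenticate("", ""))

if __name__ == "__main__":
    unittest.main()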
-
In addition to the points made above:
One thing I'm keen on is getting the 'does it do what it claims on the box' test over and done with as rapidly as possible. We know how the software is meant to work, so inferring whether or not it's doing that from the evaluation guide or manual is easy. What's harder is finding the cases where it breaks.
The problem is how to roll this into an evaluation program that suits all cases: does the product run on a client/server model with an agent on each desktop reporting back to a central server, what layers of traffic is it looking at, what does it hook if it does need to run on the desktop, etc.?
Quite honestly, I don't know that there necessarily *is* a way to write an all-in-one testing guide. At best you can come up with something that details necessary hardware, impact on system and network resources, budgetary considerations/ROI, projected extent of failure if it breaks, and so forth. My gut feeling is that you want to keep this document as broad as possible in the actual areas of testing while narrowing down the practicalities of implementing said solution to within a reasonable degree. That way, if you get into the testing phase and it turns out to be completely unworkable, you've got some CYA on your end by being able to say, 'hey, the vendor claims it does this, but it really doesn't, it keeps crashing,' etc.
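One possible shape for such a deliberately broad document is a weighted scorecard; the criteria and weights below are hypothetical examples to be tuned to your environment, not an industry standard:

Code:
# Weighted evaluation scorecard; every criterion and weight here is a
# hypothetical example covering the broad areas mentioned above.
CRITERIA = {
    "does_what_the_box_claims": 0.30,
    "system_and_network_impact": 0.20,
    "hardware_requirements_met": 0.15,
    "budget_and_roi": 0.20,
    "failure_blast_radius": 0.15,  # projected extent of failure if it breaks
}

def score(product_scores):
    """Weighted total from per-criterion scores of 0-10."""
    return sum(CRITERIA[c] * product_scores[c] for c in CRITERIA)

vendor_a = {"does_what_the_box_claims": 6, "system_and_network_impact": 8,
            "hardware_requirements_met": 9, "budget_and_roi": 7,
            "failure_blast_radius": 5}
print(f"Vendor A: {score(vendor_a):.1f} / 10")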
-
The way I've evaluated technology is to have meetings with sales reps from all the companies we're looking at purchasing from, letting them know who they're competing with and what the competitors are offering.
This is usually enough to get them to set up demo units that you can do testing on. You can then evaluate several environments firsthand and pick the one that works best (for the price).