Saturday, September 29, 2007

Black Boxes and Trust

I'm sure you've heard the saying "you wouldn't buy a car that had the hood sealed shut, would you?", usually followed by an open source zealot explaining why that analogy applies to software. Well, I actually do agree with the analogy. Anton Chuvakin put it into better words than I ever could in this blog post.

Every single day, very large and important organizations rely on software to keep themselves running (hospitals, infrastructure control, intelligence agencies, the military, and so on). Yet almost none of these organizations are legally allowed to see the source code of that software. There is just absolute blind trust in its ability to work correctly and be reliable, not to mention secure.

Where is the proof that this software isn't full of backdoors, vulnerabilities, logic bugs or worse? Organizations like those above need to start asking (demanding) that their vendors provide some real proof that the source code or binary was audited by a third party, i.e. not the original developers of the software. This proof works both ways. It gives the company the chance to say "hey - we can't catch all the bugs, but we did our best, and that's why you should choose us over our competition". And customers can place a little more trust in the investment they just made, because now they know their vendor went further than the competition to produce a better quality product.

Let's take Windows Vista for example: many hackers have audited its source code while on Microsoft's payroll. This is a good thing, and Microsoft can now say to customers "YES, we did audit our code after development", which is a lot more than most other vendors out there can say. The flip side to this argument is open source. Just because the source is open doesn't mean people have reviewed it for vulnerabilities (download a random SourceForge project and you will understand what I mean). But on the other hand, it does give customers/users the ability to inspect the software they are relying on so heavily.

How many of you can honestly say the software products your company relies on have been audited by a third party?


0kn0ck(Aditya K Sood) said...

The point is very generic from a security context. The issue is one of blind trust. But at the same time, if a company reveals its source code, its business benchmarks are going to be affected. Moreover, it's a corporate asset.

But a better solution is required.

Robert Hensing said...

This falls under the realm of 'Software Assurance' IMHO, and I think this is going to be the next big thing in (at least) government software procurement . . . the government will lead the way over the private sector by demanding more secure software. Microsoft already has various source code sharing programs . . . many universities get access to the Windows source code, as do various governments. Check out the Government Security Program.

In addition, Microsoft has published 'The Security Development Lifecycle' book, which basically describes in detail the process they go through to write secure code. How many other large software vendors have done something like that?

I agree - 3rd party and even in-house penetration testing is a must . . . Microsoft gave their 3rd party pen-testers full access to the source code to improve their bug finding capability. :)

Travis said...


I like to hear this argument repeated every so often, and you put a nice take on it. Open source doesn't mean it's more secure, but it sure is easier to secure code that you can see. L4L (Linux for life).


Doug said...

I'm actually of two minds about this topic. No, of course I would not buy a car with the hood sealed shut, but if I were an auto manufacturer I would feel pretty confident that no one is going to build a copy of my engine and put me out of business. If I were selling software and came up with a really great idea/script/whatever I would be a little more nervous at letting just anyone see it, as it's much easier to copy/paste than to construct an internal combustion engine...

That being said, I can also see the other side of the argument: 'why should I have to buy (possibly) buggy/backdoored/insecure software?' The answer, of course, is that you shouldn't have to, especially if it is going to cost you money to 'fix the glitch' or the software is running healthcare or national defense related infrastructure.

IMHO, a number of things need to happen to work this out:
1) Establishment of TRUSTED security firms that audit software (esp. big-time stuff such as OSes and critical infrastructure software). I'm thinking something analogous to a Certificate Authority which 'everyone' mutually trusts. We're talking beyond NDAs and into security-clearance-for-every-employee type stuff.

2) This type of audit (or any kind, for that matter) is not going to come cheap, so there have to be incentives or penalties to make companies go through with it. First major bug found equals a hefty fine, or better yet, make the company legally liable for lost identities, wages, etc.

3) For a large company, I imagine it would be very expensive and time consuming to revamp your entire software development life cycle to include a security audit at the proper time, such that changes can be made without adversely affecting the software's time to market. OK, so if you don't take that course, you must have your software audited by trusted security firm XYZ Inc. for less than the cost of doing it yourself (also, hopefully, getting it done more thoroughly). If you do not comply, see #2 above.

I realize that this is likely a preaching-to-the-choir situation, but there's my two cents on the matter. Until laws are passed and extreme penalties are put in place, companies are going to go the cheapest route and just push their software out. Then MAYBE they'll fix a bug if someone finds one, but they have no incentive to, other than risking the ire of the security community (ha), especially if they hold some form of monopoly on a certain software niche.

betabug said...

You need to buy a new car sometimes... many new cars (Audi comes to mind) these days come with engine compartments that are essentially sealed to everybody but repair shop personnel. Open the hood and you get access to fill up water and oil only; a special key is required for anything else.

OTOH maybe just get a new analogy, I think those cars suxx.