Software Legal Liability

Should Technology Vendors be Liable for What Hackers Do?


Should Law Enforce End User License Agreements (EULAs)?

My colleague at the SANS Institute, David Rice, has published an important and learned book, Geekonomics: The Real Cost of Insecure Software. This book could be for the software industry what Ralph Nader’s Unsafe At Any Speed was for the automobile industry in the 1960s. Nader’s book contributed to the legal movement requiring auto manufacturers to make safer products.

Geekonomics argues that computers would be more secure if software publishers were held legally liable for distributing faulty software. The book is destined to influence software law. David backs his argument with outstanding research and cogent explanations. He says software publishers should not be able to use end user license agreements (EULAs) to immunize themselves from liability for their mistakes.

I learned a lot from the book, and I highly recommend it.

How to Judge Software Security?

As I studied the book, a question kept occurring to me: Legally speaking, how do we judge whether a software product is good or bad? In other words, what should be the standard for evaluating when a software publisher has failed so miserably that the publisher should be penalized through the mechanics of our judicial system?

On page 69 Geekonomics says the industry has known since the 1960s how to write “secure” software. But the book does not tell me what secure software development entails, and it does not tell me what secure software looks like when it is released.

Do We Want Software That Secure?

Will “secure” software look, function, and cost like an M1 Abrams tank produced by a military contractor? Do I want that kind of software?

Secure Software?

The book argues in considerable detail that present software is bad. But as a consumer and small business proprietor who bought his first PC (and installed his first software) in 1987, I confess that I am absolutely dazzled by the software made available to me over the past 20 years! I love, for example, all the latest Web 2.0 stuff. I am thrilled to have all the new functionality offered to me in rapid succession. My assessment of the vast array of software I have used seriously is that it is really good. It has enabled me to be productive (and have fun) beyond any dreams I could have had in the 1970s.

I realize hackers can break into the software installed on my PCs, damage my PCs, deface my blog, and steal my credit card data. But the worst-case scenario is that I have to back up my data, buy a new PC every once in a while (the cost of which has progressively decreased over the years), and monitor things like my credit card statements and my status at the credit bureaus. I accept that no product can be perfect. And I accept that I must order my life to account for the security imperfections in software.

Little of the software I deal with personally impacts my physical safety.

On balance, I am very, very pleased with a great deal of the software available to me today and across the years. But Geekonomics says present software is really bad.

By What Standard Should Law Judge Software?

Hence, my esteemed colleague, David, and I come to this topic with two different value standards. For purposes of law, how do we know who is right?

More particularly, in a court of law, how do we know whether a software product is good enough (albeit not perfect) that the publisher can avoid having to pay money as a consequence of distributing the software to the public? If we can't answer this question with some precision, then our prolific software industry will be stunted.


  1. I don't really think this is a legal issue. You say
    "Little of the software I deal with personally impacts my physical safety."

    You should read up on incidents like the following, where it is known that innocent people's lives have been destroyed by the lack of security in current internet communication.

  2. The author has no common sense. Why is it that when we take something and turn it into a bunch of digital bits, people get confused?

    Do we hold the makers of a gun liable for what their customers do? No, it is the customer who is stupid and misused the gun.

    Now let's turn that into digital bits.

    Instead of "do we hold the makers of software liable for what their customers do? No, it is the customer who is stupid and misused the software," we somehow get "the software provides tools that can be abused, and the makers should be held liable for any abuse."

    If such laws were enacted, we would see a lot of current software innovation flat-line, because a lot of progress is and has been made by people who can't afford to be sued. The big corporations won't care because they'll just copy each other anyway.