
I ran across an article the other day with the title “Complex Behavior from Simple Systems” or something like that. This didn’t make sense to me. Aren’t all complex systems built up from simpler systems? Microprocessors are built from AND and OR gates, large software systems are built from simple variables and assignments, bridges are built from steel and rivets. What was so noteworthy about this?

It turns out that, in the realm of continuous systems, the terms “complex behavior” and “simple system” are well defined. The article was about how chaotic behavior can arise from simple differential equations.
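The classic illustration of this (an assumption on my part, since I don’t know which system the article actually used) is the logistic map, a one-line difference equation that is the discrete cousin of those simple differential equations. A minimal sketch:

```python
# Sketch: chaotic behavior from a one-line system, the logistic map
# x_{n+1} = r * x_n * (1 - x_n). (An assumption for illustration; the
# article may have used a different system, e.g. the Lorenz equations.)

def logistic_orbit(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Two starting points differing by one part in a billion...
a = logistic_orbit(0.200000000)
b = logistic_orbit(0.200000001)

# ...drift exponentially apart: sensitive dependence on initial
# conditions, the hallmark of chaos, from a "simple system".
print(max(abs(x - y) for x, y in zip(a, b)))
```

One multiplication and one subtraction per step, and yet the long-term behavior is effectively unpredictable.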

However, hardware and software are examples of discrete systems, where these terms are not as well defined. This made me realize that if I am going to talk about complex systems, I had better define what I mean.

In computer science, there are many definitions of complexity. There is computational complexity, which is probably the most familiar and the most relevant to the design of software. There is algorithmic complexity, which may be more relevant to hardware, but is not widely used. None of these really capture the notion of complexity we mean when we talk about designing large, complex systems. Also, all of these computational metrics share one problem: they don’t actually tell you whether something is complex. They simply return a value, and it is up to you to decide, based on that value, whether the measured system is complex.
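To make that last point concrete, here is a sketch using one such metric: a rough approximation of McCabe’s cyclomatic complexity, computed by counting branch points in a Python function. The threshold of 10 in the last line is a common rule of thumb, not part of the metric; the metric itself only hands you a number.

```python
# A complexity metric returns a value; deciding whether that value
# means "complex" is up to you. Rough cyclomatic complexity:
# 1 + the number of decision points in the code.
import ast

BRANCHES = (ast.If, ast.For, ast.While, ast.And, ast.Or,
            ast.ExceptHandler, ast.IfExp)

def cyclomatic_complexity(source):
    """Approximate McCabe complexity of a piece of Python source."""
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, BRANCHES) for node in ast.walk(tree))

score = cyclomatic_complexity("""
def classify(x):
    if x < 0:
        return "negative"
    elif x == 0:
        return "zero"
    return "positive"
""")

# The metric says 3. Whether 3 is "complex" is a judgment call,
# conventionally made against an arbitrary threshold like 10.
print(score, "complex" if score > 10 else "simple")
```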

The question, then, is: what is a useful definition of a complex system? I am going to use a definition that appears somewhat facetious:

A complex system is one that has bugs when it ships.

This doesn’t seem useful, because it appears to say that you cannot call something complex until after it ships. But the reality is that all large systems ship with bugs. (If anyone has shipped a large software or hardware system that is bug free, you shouldn’t be reading this blog; you should be writing it.) If you are working on a multi-million-line-of-code project or a multi-million-gate project, we both know there are going to be bugs when it ships. They may be minor, but they will exist, so my definition, I think, is fairly clear cut.

But this is still a subjective definition, and there doesn’t seem to be any way of avoiding that. In fact, many aspects of engineering complex systems are subjective. I have seen that one of the greatest disconnects causing quality and schedule issues in large projects is the expectation that there are objective metrics when, in fact, there are only subjective judgements. When asked a question like “have you found all the bugs yet?”, it is useful to remember the following principle of designing complex systems:

everything is subjective

We still need to answer the question: is my definition of a complex system useful? We can break the problem into two parts: 1) how hard is it to avoid putting bugs into the system in the first place, and 2) how hard is it to find the bugs you did put in? If a system has no bugs when it ships, then no matter how hard it was to avoid putting bugs in, it must have been easy to find the ones that got in. Therefore, verification complexity is the key metric in determining whether a system is complex by my definition. OK then, how do we define verification complexity? Well, I hate to tell you this, but it’s subjective too. That, however, is the topic of another post.


