Solid DNA blog

Blog about stuff on Solid Edge CAD software

Making Estimates in an Uncertain World – Part 1 of 2

This material is derived from the book Making Robust Decisions by David G. Ullman (2006).


All decisions are based on the experience of past performance, assessments of the current situation, and visions of the future. Where past performance may be known, the present is obscured by its immediacy and the future is a best guess. In other words, very little is known with certainty, everything else is an uncertain estimate. The robustness of any decision and the risk incurred in making it can only be as good as the estimates upon which it is based.

In this and the next newsletter, we will explore the concept of estimation and develop some guidelines for making estimates used in decision-making. In this first part we will look at uncertainty and anchoring as they relate to estimates. In the second part, we will explore how estimation often conflicts with uncertainty, how higher uncertainty is often taken to indicate less knowledge, how optimistic hindsight clouds estimates, common and special cause uncertainty, and the Law of Designed Behavior.

First, consider a simple experiment that we have used in many presentations; the results given here are typical. A group was given a list of dirty dishes and asked to estimate how long it would take to clean them up. The mean value was 32 minutes with a standard deviation of 10 minutes. Assuming a normal distribution, this implies a 50 percent chance that it will take longer than 32 minutes and a 50 percent chance that it will take less. Further, there is an 84 percent probability that the dishes will take less than 42 minutes (the mean plus one standard deviation, or 32 + 10 minutes) and a 90 percent chance that they will take less than 45 minutes. For this data the standard deviation is nearly one-third of the mean time. Further, if data from only “experts” is used (those who have done the dishes many times in the last month), the values change very little.
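These percentages can be checked with a short calculation. The sketch below assumes the normal distribution described above (mean 32 minutes, standard deviation 10 minutes); the function and variable names are illustrative, not from the original survey.

```python
from math import erf, sqrt

def normal_cdf(x, mean, sd):
    """Probability that a normally distributed value falls below x."""
    return 0.5 * (1.0 + erf((x - mean) / (sd * sqrt(2.0))))

MEAN, SD = 32.0, 10.0  # minutes, from the dish-washing survey

# Chance of finishing within the mean plus one standard deviation (42 min)
p42 = normal_cdf(42, MEAN, SD)   # ~0.84

# Chance of finishing within 45 minutes
p45 = normal_cdf(45, MEAN, SD)   # ~0.90

print(f"P(done in 42 min) = {p42:.2f}")
print(f"P(done in 45 min) = {p45:.2f}")
```

Running it reproduces the figures in the text: about an 84 percent chance of being done in 42 minutes and about 90 percent in 45 minutes.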

How does that make you feel about estimates for more complex tasks? With this experiment in mind, how confident will you feel the next time a colleague or a vendor says, “It will take about two weeks”? What does such an estimate mean? And if you asked another person who gave you a different value, what could you say about these two values? It gets even worse.

There is a further qualifying factor to consider. The statement, “it will take two weeks,” does not convey whether this is a 50 percent certain estimate (half the time it is expected to take longer, half shorter), a 90 percent certain estimate (with a 90 percent chance it will be done within the two weeks), or some other percentage. In fact, when the question is given to another group of people and worded as “estimate how many minutes so that you are *90 percent sure* you will be finished cleaning up the kitchen” (emphasis added here; not in the original questionnaire), the mean is 29 minutes. If worded “estimate how many minutes so that you are *50 percent sure* you will be finished cleaning up the kitchen,” it is 18 minutes. Both of these had an 8-minute standard deviation. This implies that most people (at least in this sample) cover their rear ends by giving a 90% certain estimate, but you never know.
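Incidentally, a 50% and a 90% estimate together pin down an implied normal distribution: the 50% estimate is the median (and mean), and the 90% estimate sits about 1.28 standard deviations above it. A minimal sketch using the 18- and 29-minute figures above; the z-value is the standard normal 90th percentile, and the function name is my own, not from the original study.

```python
# Back out an implied normal distribution from a 50% and a 90% estimate.
Z90 = 1.2816  # 90th percentile of the standard normal distribution

def implied_distribution(p50, p90):
    """Return the (mean, standard deviation) implied by the two estimates."""
    mean = p50                  # the 50% estimate is the median = mean
    sd = (p90 - p50) / Z90      # the 90% estimate = mean + Z90 * sd
    return mean, sd

mean, sd = implied_distribution(p50=18, p90=29)
print(f"implied mean = {mean:.1f} min, implied sd = {sd:.1f} min")
```

The implied standard deviation works out to about 8.6 minutes, reassuringly close to the 8-minute figure the survey actually measured.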

There are two conclusions from this:

  1. There is uncertainty in all estimates, and you had better know how much, as it will affect your decisions.
  2. Each estimate is made with an assumed confidence level (50%, 90%, or ??), and you need to know this as well.

Finally, if the original question is prefaced with “your partner has told you that the kitchen needs to be clean in 15 minutes,” the resulting distribution has a mean of 17 minutes and a standard deviation of only 5 minutes. There is the same number of dirty dishes, but the estimated time is nearly halved. This effect is called anchoring. Anchoring can happen in subtle ways. Let’s say you are bidding on a project and you have been led to believe that the customer has a budget of, say, $10,000. You are now anchored to this value and will try to force your project to fit it.

  1. Estimates can be easily anchored.

This effect can be even subtler. In another experiment (not done by RDI), 150 students watched a one-minute film about a car accident. They were then asked “How fast were the cars going when they _____?” The blank was filled in with one of five different words, and each of the five versions of the question was handed out to thirty students. Their average responses were: “smashed”—40.8 mph, “collided”—39.3 mph, “bumped”—38.1 mph, “hit”—34.0 mph, and “contacted”—31.8 mph. The more dramatic the word, the higher the students’ estimates of speed.

  1. Estimates can vary wildly depending on how you ask the question.

Finally, what if an analyst runs a detailed simulation and tells you that the system will cost $X to manufacture, or weighs Y kg, or achieves Z speed? These single-valued estimates have two sources of uncertainty. First, they are based on other uncertain pieces of information, and the resulting estimate can be no more accurate than the information on which it was based. Second, the uncertainty depends on the fidelity of the model or simulation used to compute the cost, weight, or performance. High-fidelity simulations still reflect the uncertainty of the input information; low-fidelity (back-of-the-envelope) estimates amplify it further. All estimates compound the uncertainty of the input information through the fidelity of the model. So be suspicious any time anyone gives you a single-valued estimate. Ask for at least three values: high, low, and most likely. If they can’t give you these numbers, be very suspicious of the estimate.

  1. Estimates are dependent on other uncertain estimates and data, and this uncertainty is compounded in the fidelity of the model or simulation.

So, what can you do to get the best possible estimates? Here are some guidelines:

  1. Realize that all estimates are uncertain. Uncertainty can be managed, but not eliminated, no matter how expert the source or how much analysis went into the estimate.
  2. Don’t anchor estimates. This is very hard to achieve, but to avoid anchoring, ask for estimates without preloading them with time, cost, or performance targets. In other words, use phrases like “If cost were no object…” or “If we had an unlimited budget…”
  3. For time or cost, ask for 90% estimates. Realize that the average will be about half this value, and remember that in reality the actual time or cost will exceed the average 50% of the time. It seems silly to say, but we easily lose sight of this.
  4. For performance estimates, ask for high, low, and most likely values.
  5. Have peers or experts review the estimates, their uncertainty, and the reasoning behind them.


In the first part of this newsletter, the concept of anchoring was discussed.

When it comes time to choose your CAD software and you feel you are being anchored by all sorts of techniques, look at this fact sheet to put things back in context.

Siemens PLM fact sheets

So if a particular brand of software claims to control the CAD world, think twice about their assessments and remember that we license technology too 🙂
