Softening the strict requirement of optimality can make intractable problems tractable. Put another way: it is more important to quickly narrow the search to a "good enough" subset of solutions than to compute the "perfect" one. Ordinal (which is better) comes before cardinal (the value of the optimum).
Compare the two scenarios presented below:

Getting the best decision for certain – cost = $1M

Getting a decision within the top 5% with probability 0.99* – cost = $1M/x

In real life, we often settle for such a tradeoff with x = 100 to 10,000.
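The x = 100 to 10,000 range is plausible under a simple sampling model. As a sketch (my own illustration, not part of the original post): if candidate solutions are drawn blindly and independently, the number of draws needed for at least one to land in the top 5% with probability 0.99 turns out to be small.

```python
import math

def samples_needed(top_fraction=0.05, confidence=0.99):
    """Independent blind samples needed so that at least one lands in the
    top `top_fraction` of solutions with probability `confidence`.
    P(every sample misses) = (1 - top_fraction)**n, so we need
    (1 - top_fraction)**n <= 1 - confidence."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - top_fraction))

print(samples_needed())  # -> 90: ninety blind samples suffice
```

Ninety samples for a 99% chance of a top-5% solution, versus potentially millions of evaluations to certify the single best one, is exactly the kind of cost ratio the tradeoff above describes.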

For systems that are not life-threatening, the focus should be on fast fault detection and mitigation (discover and recover) rather than on exhaustively testing every possible scenario. Exhaustive testing may make the system perfect, but at such a cost that forward progress turns glacial. At that point, your quicker, nimbler opponents will run over you with their faster product cycles.

People constantly deceive themselves into thinking that by paying much more they are getting a 100% solution. In reality, you never get a true 100% solution, so it's really a choice between different levels of less than perfect.

*Under independent sampling, the standard error decreases as 1/sqrt(n) (the variance decreases as 1/n). Each order-of-magnitude increase in certainty therefore requires two orders of magnitude more sampling. Going from p = 0.99 to near-certainty (p = 0.99999), a 1,000-fold reduction in failure probability, implies a 1,000,000-fold increase in sampling cost.
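As a quick sanity check on the footnote's arithmetic (a sketch of my own, applying the rule that sampling cost grows as the square of the required error reduction):

```python
# Standard error of an estimate scales as 1/sqrt(n), so the sampling
# cost scales as the square of the required error reduction.
p_now, p_target = 0.99, 0.99999

error_ratio = (1 - p_now) / (1 - p_target)  # ~1,000-fold error reduction
cost_multiplier = error_ratio ** 2          # ~1,000,000-fold more samples

print(round(error_ratio), round(cost_multiplier))
```

A thousand-fold tightening of the failure probability costs a million-fold more samples, which is why the last few nines are so expensive.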

This entry was posted on Friday, September 24th, 2010 at 18:31 and is filed under Stochastic. You can follow any responses to this entry through the RSS 2.0 feed.
You can leave a response, or trackback from your own site.

I’ve had discussions recently that made me consider how fear drives some people to ignore probability, and then to incur additional cost/complexity to solve for improbable scenarios. Especially once improbable scenario (A) has occurred, fear drives the invalid conclusion that P(A)=1.00, and the subsequent decision to defer current work in order to prevent recurrence of (A). Obviously, not referring to cases where (A) is caused deliberately.
