Tuesday, April 26, 2022

The value of statistics

One of the mental tools we use to navigate through life is to create mental maps of Cause==>Effect. That is: If I do THIS, then THAT happens.

We continuously make IF choices and occasionally update our expected THEN THAT results.

Most often, the actual result is close enough to the expectation that we do not bother to fine-tune the expected results in our mental maps.

We rarely "update" our maps frequently enough or far enough.

There are two kinds of errors we can make when updating our mental maps to incorporate new information: we can fail to update them (the typical error) or we can over-update them. In either case, the cost of obtaining the new information is wasted.

Not updating the map is seductive. We don't have to admit that our prior-belief was flawed. We don't have to deal with the uncertainty of a new setting.

Excessively zealous updates create instability and non-convergent solutions...or non-solutions. There are probably evolutionary reasons why humans avoid frequent updates.

The kernel of the problem is that there is no simple, intuitive way to "feel" the proper amount of adjustment to make in our Cause==>Effect tables.
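
For the programming-inclined, here is a minimal sketch of that tradeoff. It assumes a simple exponential-smoothing rule (estimate += gain x error) with made-up numbers; the "gain" plays the role of how much you let each new observation move your mental map.

```python
# Toy illustration of "how much to update": nudge an estimate toward each new
# observation by a fixed gain. The true value, noise level and gains are invented.
import random

random.seed(1)

def track(gain, true_value=255.0, start=240.0, noise=2.0, steps=30):
    """Nudge a stale estimate toward noisy observations; return the whole history."""
    estimate, history = start, []
    for _ in range(steps):
        observation = true_value + random.gauss(0, noise)
        estimate += gain * (observation - estimate)
        history.append(estimate)
    return history

for gain in (0.01, 0.3, 1.9):
    tail = track(gain)[-10:]  # where the estimate sits late in the run
    print(f"gain {gain:>4}: last 10 estimates span {min(tail):.1f} to {max(tail):.1f}")
# gain 0.01 has closed only a fraction of the 240 -> 255 gap (under-updating),
# gain 0.3  has settled close to the true value of 255,
# gain 1.9  keeps swinging far beyond the +/-2 scatter of the data (over-updating).
```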

That is where statistical thinking has a place.

An example:

Suppose you stumbled into a process where the outputs were:

250, 250, 250, 255, 250, 250, 250, 255, 254...

Suppose you were given the task of adjusting the process. Would you average the data to figure out how much to adjust the dial? Would you do something else?

Like much of what happens in life, it depends.

Suppose you were on patrol and at deer-thirty you started getting incoming fire and you could see muzzle-flashes at 250 degrees and 255ish degrees. Firing into the pucker-brush at 251.56 degrees will accomplish nothing. The firing positions are at 250 and 255ish degrees. Incidentally, 1.56 degrees translates to about eight feet at 100 yards.

Statistical thinking suggests that you are at more risk from the shooter at 250 degrees so you might put cover between you and that source of incoming fire and then wait for a muzzle-flash at 255 degrees before sending one downrange.

It may be just one shooter shuttling between two firing positions and he might be favoring the 250 position because he has better cover there.

It is counter-intuitive to NOT shoot at the average position. It is even more counter-intuitive to first neutralize the position that seems to pose the least risk. But that is where statistical thinking can take you.
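
A few lines of Python, using only the bearings listed above, show the gap between the average and the actual firing positions:

```python
# The bearings from the example (degrees), exactly as listed in the post.
from statistics import mean
from collections import Counter

bearings = [250, 250, 250, 255, 250, 250, 250, 255, 254]

print(f"average bearing: {mean(bearings):.2f} degrees")  # ~251.56: empty pucker-brush
print("flashes per bearing:", Counter(bearings))         # Counter({250: 6, 255: 2, 254: 1})
# Six of the nine muzzle-flashes came from 250 degrees, which is why that
# position looks like the greater risk; the 254/255 flashes are the second position.
```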

Another example:

There were two sources of a given component that went into an automobile body. The dimensional differences between the two sources carried through to the finished body, so workers were hired to adjust the fit of the hood, doors and trunk-lid of the finished product.

The factory chose to set its equipment to a happy medium between the two sources and, as a result, had to "fit" every vehicle.

It might have been smarter to set the equipment so that vehicles built with parts from one source came out perfect, and to "fit" only the vehicles built with parts from the other source. The difference in time and effort between moving swing-metal 1.0mm and 2.0mm was negligible. The only risk was chipping the paint.
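
A toy comparison makes the argument concrete. The 0.0mm/2.0mm part offsets, the 50/50 supplier mix and the 0.5mm "nobody notices" tolerance below are invented for illustration; only the conclusion (every car fitted versus half the cars fitted) tracks the example.

```python
# Invented numbers: parts from source A sit at nominal, parts from source B sit
# 2.0 mm off, the mix is 50/50, and gaps within 0.5 mm need no hand-fitting.
source_offset_mm = {"A": 0.0, "B": 2.0}
source_share = {"A": 0.5, "B": 0.5}
tolerance_mm = 0.5

def fraction_needing_fit(equipment_setting_mm):
    """Fraction of vehicles whose swing-metal must be hand-fit for a given setting."""
    return sum(share for source, share in source_share.items()
               if abs(source_offset_mm[source] - equipment_setting_mm) > tolerance_mm)

print("happy medium (set to 1.0 mm):  ", fraction_needing_fit(1.0))  # 1.0 -> every vehicle fitted
print("match source A (set to 0.0 mm):", fraction_needing_fit(0.0))  # 0.5 -> only the B-part vehicles
```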

If in doubt, plot it out

Man really is an analog beast. Digits or numbers are almost meaningless. We can manipulate them but they really only sing to the idiot savants. Plot the data. Make a picture. It will speak to you.

Even if you hate statistics.
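
For the plotting advice, a quick sketch (assuming matplotlib is handy) of the bearing data from the first example:

```python
# "If in doubt, plot it out": a quick picture of the bearings from the first example.
from collections import Counter
import matplotlib.pyplot as plt

bearings = [250, 250, 250, 255, 250, 250, 250, 255, 254]
counts = Counter(bearings)

plt.stem(list(counts.keys()), list(counts.values()))
plt.axvline(sum(bearings) / len(bearings), linestyle="--", color="gray",
            label="average (the empty brush at ~251.6)")
plt.xlabel("bearing (degrees)")
plt.ylabel("muzzle-flashes observed")
plt.legend()
plt.show()
# The two clusters, and the emptiness in between, jump out of the picture
# in a way the raw digits never will.
```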


5 comments:

  1. Sales Technique analysis and fine tuning is another example of that. That is what makes a super salesman.---ken

  2. "but they really only sing to the idiot savants"
    Truer words have never been spoken.


  3. Best thing I ever learned in Calculus was how to sketch a function. Taught me to "draw it out" on most every issue I have bumped into that didn't just sing out its own answer.

  4. In the example ... the solution is not to return fire and expose your position. Call in a fire mission and keep your head down.

    When I read about the two error-types in updating mental maps, I realize that even trying to update gradually but continuously is prone to both error-types. Life is an exercise in continuous adaptation, not just to inaccurate mental maps, but to conditions that change the process our mental maps are meant to model. Humans are natural model-predictive controllers, but the model is always subject to refinement or replacement. How much may depend on what data you have, what data you don't have, and what are the risks of being wrong.

  5. One problem with mental models is understanding whether the external process is discrete or continuous and where Mean, Mode, and Median values come into play. This example is a great way to show how the three values can be different. We get lazy assuming everything is a Gaussian distribution.


Readers who are willing to comment make this a better blog. Civil dialog is a valuable thing.