TED Talk: How To Ask Good Questions

Why ask questions? Sometimes being able to ask a good question is more important than finding a good answer.

If the video above does not work, then try How To Ask Good Questions: David Stork at TEDxStanleyPark

What makes a question the best?

  • Clearly stated and unambiguous
  • There must be a solution
  • Solution method exists
  • Improved solution methods will likely be useful
  • Extremal
  • Goes to the heart of the issue
  • The “right” level
  • Leads to new questions

General techniques

  • Isolate components
  • Consider all attributes and combinations of attributes
  • Explore missing aspects
  • Consider extreme cases
  • How does X depend on Y?
  • How to measure?
  • Transitions from state 1 to state 2
  • Invert things
  • Who, what, where, why, when?
  • Analogies
  • Different “languages” (math, code)
  • Different disciplines

Why you think you’re right — even if you’re wrong

Motivated reasoning, aka the soldier mindset:

This is the phenomenon in which our unconscious motivations (our desires and fears) shape the way we interpret information. Some information and ideas feel like our allies, and we want them to win; we want to defend them. Other information and ideas are the enemy, and we want to shoot them down.

The scout mindset, by contrast, is curious, open to new ideas, and grounded: willing to change one’s mind based on new information. We should be proud of having changed our minds when new data shows us to have been wrong.

If the above video does not work, then try Julia Galef: Why you think you’re right — even if you’re wrong.

TED Talk: For argument’s sake

Daniel H. Cohen makes an interesting case that:

  1. We equate arguing with war, such that there are winners and losers.
  2. The loser is the one who makes a cognitive improvement, so the loser actually gains the most.

So, we should strive to lose. “It takes practice to become a good arguer from the perspective of benefitting from losing.”

My personal observation is that, whether I win or lose an argument, explaining my position requires:

  1. Developing one’s Theory of Mind to arrive at how someone else understands the world.
  2. Tailoring the argument so that the other(s) understand the position.

These explanations help expose both strengths and weaknesses in the position. In order to “win”, I have to shore up the weaknesses. That is a cognitive gain. Is it greater than the gain of the loser who changed their mind? Maybe.

If the above video does not work, then try Daniel H. Cohen: For argument’s sake.

I love logic.

Argumentative Theory of Reasoning

I posted a web comic poking fun at the irrational fear of the ocean. My carefulness last weekend maybe kept me from getting stung by jellyfish, and it definitely kept me from stepping on a stingray or skate. I saw no sharks. But then, “absence of evidence is not evidence of absence.” 🙂

[Image: Dr. Jonathan Haidt, NYU — Bob Howard, Village Square]

After some comments, I eventually deleted the post because I was tired of the arguing over whether fear is rational or irrational. (It is both, which is why I thought the comic funny and posted it, but obviously this was not the correct audience.) I keep to myself more these days to avoid arguing about politics. There has been a temptation to leave Facebook altogether to get away from the madness. Something I will not tolerate is that kind of thing on my own posts: I tell people to stop, and if anyone eggs it on, then I delete the post.

In Jonathan Haidt‘s Edge talk A New Science of Morality (Part 1), he alerts us to:

According to Mercier and Sperber, reasoning was not designed to pursue the truth. Reasoning was designed by evolution to help us win arguments. That’s why they call it The Argumentative Theory of Reasoning. [1]

My own Confirmation Bias screams that this absolutely must be the most true thing I have read this decade. Several posts on this blog demonstrate my fascination with people trusting their ideology over the facts. It makes sense in an environment where people are mainly looking to prove themselves correct. Someone can be completely reasonable, but if the other party has made up their mind, there is no changing it. The flow of information only eventually serves up something that supports their view, which they will seize upon.

As a Behavioral Economics fanboy, I am very much on board with the idea that humans are extremely imperfect reasoners. To label anyone, even Neil deGrasse Tyson, as very rational strikes me as irrational. It will be difficult to refrain from using “reasonable” as a pejorative for someone who has stopped thinking beyond supporting their own view.

Memetic Straw Men

Well, calling the current political, social, or even gaming discourses “debates” is probably too generous. That implies discussion, which means an attempt at listening to the other side, if only to hear their point of view well enough to counter it. At this point, much of what I see is the use of memes to perpetuate Straw Man fallacies.

It seems like memes are perfectly positioned for this purpose. They are cute and funny, which leads people who back the ideology in them to see them as non-threatening, so they share them more easily. Being very topical, they get the shot at the enemy across in an amusing way.

A couple examples:

[Meme: “I did not have textual relations with that server”]
[Meme: “Congratulations! Your liberal butthurt just made Trump a little bit stronger.”]

None of this seems to be about convincing people to change their opinion so much as attacking each other. This whole thing is disappointing.

Research BEFORE reacting

A friend posted this article on Facebook, Everything wrong with this country happened this morning on my Facebook page, which showed an image with the original erroneous claim. The reactions to it were agreement with the bogus claim, which was extremely sad because the originator of the claim now refutes it. The whole point of the article seems to be that people have lost the ability to see something, research the accuracy of the information for themselves, and make a decision about it. Instead, people see things which evoke a feeling and react to the emotion instead of taking the time to verify. That holds even when the thing in question is trying to point out that they are falling for a claim corrected over a decade ago; the false version resonates so strongly that people perpetuate it, because ideology trumps facts.

You need evidence.  You must go back somewhere in our objective world of definable objects and time frames and get EVIDENCE before you have an emotional reaction to something.  Not ONE SINGLE PERSON went and researched.  They had their opinions ready when the manufactured reality presented itself.  They gained more satisfaction from expressing their world view than searching for the truth.  This is the problem we’re having.  This is the core of the problem America is having.  If we just searched for objective truth, if we stopped our anger or our emotions for a singular second we wouldn’t have Iraq wars and Afghanistan wars and we’d have an equitable economic system that brought about prosperity to all.

I think a lot about this kind of thing and have written several posts about it.

P.S. Snopes and Google are your friends.

Unknown Knowns

Yesterday’s post mentioned unknown unknowns. When I first heard this matrix, it pained me that one of the quadrants was missing. Over the years, I have thought about that missing quadrant and what it might mean.

Donald Rumsfeld in 2002 talking about weapons of mass destruction in Iraq a year before the invasion:

Reports that say that something hasn’t happened are always interesting to me, because as we know, there are known knowns; there are things we know we know.

We also know there are known unknowns; that is to say we know there are some things we do not know.

But there are also unknown unknowns – the ones we don’t know we don’t know.

And if one looks throughout the history of our country and other free countries, it is the latter category that tend to be the difficult ones.

What is interesting about this matrix is the lack of an “unknown known”: things we know, but we do not know that we know them.

  • Known knowns = evaluated risks. There is confidence that they can be handled.
  • Known unknowns = poorly evaluated risks. There is no confidence in handling them.
  • Unknown unknowns = unevaluated risks.

Unknown knowns, I think, are evaluated risks which are ignored. They are our blind spots. The knowledge is there, but not used: either we disagree with the assessment, or we think they are too trivial to matter, or we lie to ourselves about them. In any case, they are left out of the calculus or justification of a decision. Possibly a high-level administrator never sees them when making the decision.
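To make the four quadrants concrete, here is a minimal sketch in Python (my own illustration, with hypothetical names; not from Rumsfeld or ITIL) that classifies a risk along the two axes: whether the knowledge exists, and whether we are aware of our state of knowledge.

```python
def classify_risk(have_knowledge: bool, aware_of_state: bool) -> str:
    """Map the two axes of the knowns/unknowns matrix to a quadrant."""
    if have_knowledge and aware_of_state:
        return "known known"      # evaluated risk; confidence it can be handled
    if not have_knowledge and aware_of_state:
        return "known unknown"    # poorly evaluated risk; no confidence yet
    if not have_knowledge and not aware_of_state:
        return "unknown unknown"  # unevaluated risk
    return "unknown known"        # knowledge exists but is ignored: a blind spot

# The quadrant Rumsfeld left out: we have the knowledge but are blind to it.
print(classify_risk(have_knowledge=True, aware_of_state=False))  # unknown known
```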

Too much information overwhelms decision making. Too little information risks a bad decision. Deciding which information is the right information is itself a decision. 🙂

Data > Information > Knowledge > Wisdom (ITIL)

Sunk Cost Fallacy

In economics, a sunk cost is any past cost that has already been paid and cannot be recovered. For example, a business may have invested a million dollars into new hardware. This money is now gone and cannot be recovered, so it shouldn’t figure into the business’s decision making process.

… from How the Sunk Cost Fallacy Makes You Act Stupid.
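To see why in numbers, here is a minimal sketch with invented figures (mine, not the article’s). Including the sunk cost subtracts the same amount from every option, so it can never change which option is best.

```python
# Toy example of the sunk cost fallacy; all figures are invented for illustration.
sunk = 1_000_000  # already spent on hardware; unrecoverable either way

# Only future cash flows for each option matter to the decision.
options = {
    "continue project": {"future_cost": 400_000, "future_revenue": 500_000},
    "switch to alternative": {"future_cost": 300_000, "future_revenue": 600_000},
}

def net_future(option: dict) -> int:
    """Net value looking forward only; the sunk cost is deliberately excluded."""
    return option["future_revenue"] - option["future_cost"]

best = max(options, key=lambda name: net_future(options[name]))
print(best)  # switch to alternative

# Sanity check: adding the sunk cost shifts every option's total by the same
# -1,000,000, so the ranking (and the right choice) is unchanged.
assert max(options, key=lambda n: net_future(options[n]) - sunk) == best
```

The point is purely arithmetic: a constant subtracted from every option cannot change their ranking.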

The article goes on to describe common situations where we fall for it. Its suggested solution of making a pro-and-con list did not really impress me. Really, the solutions are:

  • Be willing to cut losses and run. The Cull and Surrender post is about being willing to cut out things not worth the time.
  • Actively expose mistakes. Embarrassment about being wrong or having made a mistake keeps us stuck in that bad place. On Being Wrong.
  • Act like the present is all there is. Past experience contributes to making a decision. But the present case should be handled as a new, independent situation and not a continuation.