Trusting a black box

In Everything Bad Is Good For You, Steven Johnson wrote about how video games force us to figure out the rules of a built world. We are not just exploring a virtual space; we are building a mental map of cause and effect.

That mental map is why the memes land: someone spots something random in the real world that looks like a video game glitch, and the joke is that the games prepared us for it.

Anything whose rules we don't control works the same way. Say I have a car that estimates its remaining range. It says I have 11 miles before it runs out of gas, and the fans are on full, so I watch the miles drop faster than they should. I start to doubt I really have 11 miles, and the gas station where I can get $0.40 off is 7 miles away. I might make it there, and I might not. So I put a gallon in. The range doesn't budge.

Do I still have 11 miles? Surely I have more, but how do I know that I do? Can I trust it?

Opaque rules obscure causation. The whole point of the tool is to let me predict when to take action. More gasoline SHOULD cause more actual range, which should cause the gauge to show more range. Filling up the tank soon afterward did show the maximum range, like it should. But that one event eroded my trust, and now I worry about whether I can believe the gauge even when it shows plenty of gas.
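One plausible explanation (an assumption on my part, not anything the carmaker documents) is that the gauge smooths its fuel reading so heavily that a single gallon barely registers, while a full tank does. Here is a minimal sketch of that idea; the class names, the 30 mpg figure, and the smoothing factor are all hypothetical:

```python
# Hypothetical sketch: a range estimator that smooths the raw fuel-level
# reading, so small additions (one gallon) barely move the display while
# a full tank eventually does. All numbers here are invented.

def estimated_range(fuel_gallons, mpg=30.0):
    """Naive range estimate: fuel on board times assumed miles per gallon."""
    return fuel_gallons * mpg

class SmoothedGauge:
    """Gauge that blends each new fuel reading into a running estimate."""
    def __init__(self, initial_gallons, alpha=0.02):
        self.level = initial_gallons  # smoothed fuel estimate, in gallons
        self.alpha = alpha            # how much weight a new reading gets

    def update(self, raw_reading):
        # Move the smoothed level only a small fraction toward the raw reading.
        self.level += self.alpha * (raw_reading - self.level)
        return estimated_range(self.level)

gauge = SmoothedGauge(initial_gallons=0.37)   # roughly 11 miles at 30 mpg
print(int(gauge.update(0.37)))                # steady state: displays 11
print(int(gauge.update(1.37)))                # add one gallon: still displays 11
```

Under this (assumed) smoothing, the display truncates 11.7 miles back down to 11, which would match the experience exactly: a real gallon went in, the math changed, and the number I was supposed to trust did not.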

P.S. The gas gauge did not move either.