One of the failures in epistemological thinking that I've noted is that of failing to weight outcomes. Of course, we could be dreaming, hallucinating, stuck in an epistemology machine, etc. How can you disprove the possibility? You can't. However, as in Pascal's Wager, different assumptions have different weights, depending upon our valuative hierarchy.
(Pascal was wrong in his analysis, because he failed to balance the infinite rewards and punishments of Heaven and Hell with the infinite possible number of hypothetical differing Gods with all sorts of conflicting demands.)
When we decide to stop for a red light, it's within the context of a vast assumed value structure. There are times when we wouldn't stop, such as if we were being chased by gang members firing a machine gun at us. The risk of running the red light and possibly having a serious accident would be outweighed by the other concern. However, we don't generally question whether the light is actually red, or even there at all, or whether, if we just kept going, we might simply sail on into a glorious wish-fulfillment fantasy. Each of us has plenty of inductive experience to the effect that the payoff of such an attitude is chancy at best.
Assuming that you are awake, not hallucinating and in contact with a real reality, not someone's simulation, has the payoff that if you're correct in that assumption, then you are much more likely to achieve what you want than if you assume the contrary and act as though your actions have no real consequences. If you are incorrect, and in fact are dreaming, for example, the consequences are generally minimal.
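This asymmetry of payoffs can be made concrete with a toy decision matrix. The numbers and probabilities below are purely illustrative assumptions of mine, not anything from the argument itself; the point is only that under almost any reasonable weighting, acting as though reality is real dominates:

```python
# Toy expected-utility sketch of the "assume reality is real" wager.
# All credences and payoffs here are hypothetical, chosen for illustration.

# States of the world and our (assumed) credences in them.
states = {"real": 0.99, "dream": 0.01}

# payoffs[action][state]: utility of taking `action` when `state` obtains.
payoffs = {
    "act_as_if_real":  {"real": 10,   "dream": -1},  # minimal loss if dreaming
    "act_as_if_dream": {"real": -100, "dream": 5},   # disastrous if reality is real
}

def expected_utility(action):
    """Probability-weighted payoff of an action across all states."""
    return sum(states[s] * payoffs[action][s] for s in states)

best = max(payoffs, key=expected_utility)
print(best)  # act_as_if_real
```

Even if the credence assigned to "dream" is raised substantially, the enormous downside of acting as though actions have no consequences keeps "act as if real" on top, which is the weighting the essay describes.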
We make a huge number of explicit and implicit assumptions, based on both induction and deduction, because of the need to act. Each of these assumptions is inherently a gamble, based on the weighting of many other assumptions. But it's not arbitrary. Or so we assume - because the contrary assumption, that it is arbitrary, leads nowhere. Once again, decision theory rules.