Rebirth of Reason


Post 0

Monday, December 15, 2008 - 1:02pm

A more informed decision is a more rational decision, is it not? But too much information can lead to information overload, which in turn can lead to a kind of option paralysis, or to anxiety or remorse over decisions. Also, information gathering is time-dependent, and if we are going to use information to make decisions, we need to pick a cut-off point where the information gathering stops. Oftentimes, this means there's information out there that we will not take into consideration. So I'm curious: how much information, and at what point, rationally, should we say "enough"?

Also, is there a moral obligation to improve our rationality when possible? If not, why not? If so, would you say this includes an obligation to use artificial aids, such as caffeine, that help improve the ability to focus?


Sanction: 6, No Sanction: 0
Post 1

Monday, December 15, 2008 - 3:14pm
Perhaps the cut off point would be the moment you're convinced. If you're rational and convinced, chances are that you'll be right.

If a life is valued, and improved thought can better your life, then there is a moral obligation to better your thought process.

As far as the caffeine goes...caffeine'll kill ya! My body's health also has value to me.

Post 2

Tuesday, December 16, 2008 - 1:23am
Not to imply that you are, Jordan, but this question reminds me of skepticism.

'I can't know everything so how the hell can I know anything?'

Louis, I think, has the right idea. Once you've become convinced of something, rationally, then the job is done. You might need more information later, but if you have found what you need to find and all signs point a certain way then go that way.

It does not matter that life might exist on another planet or that the deli only serves ham on rye Tuesday after 11am if I am looking for a job. The amount of awareness required depends on the decision to be made.

I like coffee, though I'm not sure if it gives me any edge. I depend on cigarettes for that.

Post 3

Tuesday, December 16, 2008 - 6:24am
Put the priority more on goal attainment. Experiment with various lengths of time spent on information gathering and compare how long it takes, in total, to accomplish your goals. Then use that information, with induction, to predict the optimal amount of time to spend on learning.

Sanction: 6, No Sanction: 0
Post 4

Tuesday, December 16, 2008 - 8:36pm

A more informed decision is a more rational decision, is it not?
No, not necessarily -- for instance, if you have integrated things improperly. An example is Malcolm Gladwell, who has tons of information about the history of successful folks -- but doesn't understand how they got to be that way. He has a theory about how all of this historical information fits together -- but it is a wrong theory. Because he is hampered by his own pet theory of social engineering, more information won't help him become more rational. He doesn't need to become more knowledgeable; he needs to become more wise.

I wrote about Gladwell's thinking errors last month here, and David Brooks of the New York Times wrote about them this month here.

Also, Rand would say that you should exercise an unbreached rationality, regardless of your intelligence and regardless of the body of knowledge with which you are working:

Man has a single basic choice: to think or not, and that is the gauge of his virtue. Moral perfection is an unbreached rationality -- not the degree of your intelligence, but the full and relentless use of your mind, not the extent of your knowledge, but the acceptance of reason as an absolute.

So I'm curious: how much information, and at what point, rationally, should we say "enough"?
As a former teacher I think an example would be worth 1000 words here. Here's one:

In food science, destroying the bacteria present in food is important for human health. Food scientists went through a great deal of work to discover how much heat it takes to get a "log-kill" of bacteria in a food sample -- a tenfold reduction. If you heat a food sample (such as what's in a can of food) enough for one log-kill, you destroy 90% of the bacteria present in that food (and 10% survive the heat). Heat it more and you can kill 99% of all the bacteria in the food -- but that's not enough. Still more heat gets you to 99.9%. Some folks might say that ought to be enough to make the food product safe and ready for the shelves of a grocery store -- but even that's not enough.

Now, I'm not a food scientist, but I worked alongside them in a lab. Here is how they determined how much heat to use. It was explained to me that the heat canned foods have to go through would not only bring the expected level of bacteria down to a single cell, but -- if you extrapolate -- down to a hundredth of a single cell (which is 2 "log-kills" past what it takes to kill all but a single cell).

If they had done things differently, leaving a single live cell of bacteria in a food -- then that single cell could divide, and its daughter cells could divide, up to the point where they would become numerous enough to harm folks eating the food. The amount of heat required was dictated by reality -- because of the nature of cell division, food manufacturers ought to heat food enough to kill every last cell. Nature dictates how much heat is right: enough to bring the probability that even a single cell of bacteria survives down to something negligible.
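The extrapolation above can be sketched numerically. Here is a minimal Python illustration of the log-kill arithmetic (the cell counts are invented for illustration -- they are not from the post or any food-safety standard):

```python
# Sketch of the log-kill arithmetic: one log-kill is a tenfold (90%)
# reduction in surviving cells. Cell counts are invented for illustration.

def surviving_cells(initial_cells, log_kills):
    """Expected number of cells left after the given number of log-kills."""
    return initial_cells * 10.0 ** (-log_kills)

start = 1_000_000  # hypothetical starting population in one can

# Six log-kills take a million cells down to a single expected survivor.
print(surviving_cells(start, 6))

# Two log-kills beyond that leave a hundredth of a cell -- read as roughly
# a 1-in-100 chance that even one cell survives the heating.
print(surviving_cells(start, 8))
```

Because each log-kill divides the population by ten, "a hundredth of a cell" is just the single-survivor level carried two more steps down the same exponential curve.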

The nature of cell life dictates how much you need to know about heating food before you can eat it -- though the notion of taking the heat up to two "log-kills" past where a single cell survives seems somewhat arbitrary. Another example -- one where there's no room for this arbitrary decision-making -- might be better:

"Two-Face" from the Batman series is pointing a gun at your face and asking you to call a coin-flip heads or tails. If you guess wrong, he pulls the trigger. You ask him if it would be all right to see a few coin flips (in order to determine whether the coin is fair -- or whether it is two-headed or two-tailed). If you discover it is a two-headed coin, then you can safely call "heads" and you'll get to live. Here is when you would have enough information for certain conclusions ...

1) if you see a "heads" flip and also see a "tails" flip -- then you know that the coin is not two-headed or two-tailed; you no longer need any more information than that

2) if you somehow see both sides of the coin while it is in the process of getting flipped -- then you know if the coin is two-headed or two-tailed (hopefully, it is one of those, so that you can declare to "Two-Face" that you are ready to call it!); you no longer need any more information than that

3) if you never see the coin in the air, but only the results of the flips, and they've always been heads, or they've always been tails -- then you will have to determine how many flips you will need to see before you are comfortable calling the coin.

I, personally, would be willing to go ahead and get the thing over with after about one to ten thousand flips, all with the same outcome.

More flips might make me more sure, but I will also starve to death if I sit there with Two-Face's gun to my head, as he tirelessly and joyfully flips his precious coin. In the interest of not carrying it out to the point where I starve to death -- which would defeat my purpose of staying alive -- I would want to know the probability of a fair coin coming up the same way, 1000 times in a row.
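That probability is easy to pin down. A quick Python check (mine, not the poster's): with a fair coin, the first flip can land either way, and each of the remaining 999 flips must match it, so the chance of 1000 identical outcomes is (1/2)^999:

```python
import math

# Probability that a fair coin shows the same face 1000 times running:
# the first flip is free; each of the other 999 must match it.
p_same = 0.5 ** 999

# Express it as an order of magnitude: about 10 to the minus 301,
# i.e. far smaller than any realistic chance of starving mid-count.
print(math.log10(p_same))
```

At roughly 1 in 10^300, even a thousand identical flips already makes "the coin is rigged" overwhelmingly more probable than a fair-coin coincidence.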

I would balance that probability against the probability of starving to death, or against whatever other time concern might make me ready to take my chances, and weigh the gravity of guessing wrong against my other time concerns (such as not showing up for work, or not paying a bill on time). It's a moral calculus of: [confidence] X [utility] X [value] -- much like the one used in rational decision-making in medicine.
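That calculus can be sketched as a simple expected-value comparison. A toy Python sketch follows (the function name and every number are invented for illustration, not taken from the post or from any medical decision-making standard):

```python
# Toy version of the [confidence] x [utility] x [value] calculus:
# score each option as the product of the three factors.
# All numbers are invented for illustration.

def expected_worth(confidence, utility, value):
    """Product of how sure you are the option works, how useful it is
    if it works, and how much the outcome matters to you."""
    return confidence * utility * value

# Option A: call the coin now, already very confident it's two-headed.
call_now = expected_worth(confidence=0.99, utility=1.0, value=100)

# Option B: wait for thousands more flips; a sliver more confidence,
# but the delay (hunger, missed work) halves the option's usefulness.
keep_waiting = expected_worth(confidence=0.999, utility=0.5, value=100)

print(call_now > keep_waiting)  # True: the extra certainty isn't worth the delay
```

The point of the multiplication is that a tiny gain in one factor cannot rescue an option whose other factors have collapsed -- exactly the trade-off between one more reassuring flip and slowly starving at gunpoint.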
Also, is there a moral obligation to improve our rationality when possible? If not, why not?
I think there's a moral obligation to improve your rationality when you become aware that you are capable, that your method has utility, and that extra rationality has value -- but that's three things that you have to become aware of, before it becomes obligating.

I am one of the "unlucky ones" who simultaneously has confidence in his capability to affect outcomes, has identified the utility of certain methods, and sees the value of extra rationality. Now aware of these three key things, it has become my moral obligation to improve my rationality when possible, which leads to your final question:

If so, would you say this includes an obligation to use artificial aids, such as caffeine, that help improve the ability to focus? 
I would say that there's an obligation to use methods that improve your focus when you really need to focus. For instance, I would not walk into a courtroom to defend myself without first getting a good night's sleep (if possible), without a chemical -- like caffeine -- that promotes the release of, or sensitivity to, "brain adrenaline" (norepinephrine), and without a high-protein breakfast -- which has been shown to improve mental performance.

However, there is no obligation to do these things before you're planning an afternoon nap, or when you are having a lazy-day of zoning out in front of the TV.


(Edited by Ed Thompson on 12/16, 8:43pm)
