Hi Daniel,
I think we may be talking past one another, but I'll give this another shot.
Firstly, before we address Plato et al., can we just agree on the language issue, which is Objectivist usage vs. typical usage? That what we would call in our everyday, colloquial language "approximate" or "rough" measurement, Ayn Rand would call - in the "philosophic-speak" of the IOE passage quoted - "absolutely precise" or "exact"?
This is how people talk, yes. It doesn't mean that they're correct in doing so-- it just means that they haven't really thought about this kind of thing in such depth.
Well of course I agree. The two things [between which there is a relationship in measurement] ... are:
1) the *physical* object we are measuring
2) the *abstract* "unit" we are measuring with, which we express in another physical thing like a ruler.
No, you have to have a physical thing to measure length, otherwise there's no way to perform a measurement. Relationships between things have to be commensurable. The only reason we can talk about an abstract "centimeter" is because we have material standard lengths of centimeters and pre-measured rulers that we can use as our units. It is with these that we perform our actual measurement, not with an abstract idea. Here, you're taking the idea of the centimeter as primary, to which all of our physical copies are mere approximations. Platonism.
(Now, you too would agree that the "unit" involved is abstract, given that it is a "concept" (i.e., the metric system), and concepts are formed by "abstracting" from reality. Thus, while not an abstraction pre-existing humans a la Plato - which I don't agree with either - it is still an abstract product of our conscious process. Yes?)

This is true of the concept "centimeter," yes. However, the concept symbolized by the word "centimeter" cannot be used to measure a piece of wood. You need something that has the length of a centimeter in order to do that.
Well, I believe that Plato's basic insight was roughly correct in this respect ...

Heh.
Once again, correct, but slightly beside the point. The problem is not the necessity of standards, but the *application* of a standard - which is a unit, which is abstract - to the *physical* object in question.
There may be an equivocation on the word "unit" going on here. Rand uses the term in two ways: as the referent of a concept, and as a rigid standard of length with which one performs measurements. They are related ideas-- read ItOE to see how.
To clear up the distinction here, the concept "centimeter" refers to all material objects which have length identical to a centimeter (essentially a convention); these are the units of the concept "centimeter." The actual objects which have length equal to a centimeter are used as units of measurement by counting how many of them are necessary to get just past the end of some object to be measured.
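To make that counting procedure concrete, here is a minimal sketch in Python (the function name and the sample numbers are mine, purely illustrative, not anything from Rand or ItOE):

    import math

    def measure_in_units(object_length, unit_length):
        # Count how many copies of the unit, laid end to end, it takes
        # to get just past the end of the object being measured.
        return math.ceil(object_length / unit_length)

    # An object 4.2 unit-lengths long takes 5 units to get just past its end:
    print(measure_in_units(4.2, 1.0))  # -> 5

The point of the sketch is that the result is a count of whole units, not a continuous reading.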
Rand acknowledges that this standard - let's say 1.5mm for argument's sake - cannot be exactly physically measured.
Not in terms of millimeters, no. However, by accepting this object as a unit, you don't care how many millimeters long it is-- you are seeking to measure other objects in terms of it.
Here, you are really saying that you have an object with extension, and it can be measured as 1.5mm with an error of 0.1mm. However, if you measure something with respect to the object you are given, it doesn't matter how long it is with respect to any other units. That one can measure this chosen unit with respect to a standard millimeter unit doesn't make the length of that standard uncertain or otherwise unsuitable for measurement-- you just have to specify that your unit of measurement is that object, and not a millimeter.
Now, conversions between units are possible, of course, but unless the conversion in question is a subdivision or a multiple of a previously chosen unit, one must be careful to assign an appropriate error to the conversion. If one accepts that 1 mile is exactly 1.6 km, then one's probe will miss the planet Mars entirely. ;-)
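To put a rough number on that: the international mile is defined as exactly 1.609344 km, so treating it as 1.6 km introduces a relative error of about 0.58%. A back-of-the-envelope check in Python (the distance and radius figures below are rough values I'm supplying for illustration, not anything from this discussion):

    MILE_IN_KM_EXACT = 1.609344   # international mile, exact by definition
    MILE_IN_KM_ROUGH = 1.6        # the "1 mile is exactly 1.6 km" assumption

    relative_error = abs(MILE_IN_KM_EXACT - MILE_IN_KM_ROUGH) / MILE_IN_KM_EXACT
    earth_mars_km = 5.5e7         # Earth-Mars distance at a close approach, roughly
    mars_radius_km = 3.39e3       # Mars's radius, roughly

    drift_km = relative_error * earth_mars_km
    print(f"relative error: {relative_error:.4%}")    # ~0.58%
    print(f"drift over the trip: {drift_km:.3g} km")  # ~3.2e5 km

That's roughly 94 Mars radii of drift - more than enough to miss the planet.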
So she says, ok, let's just say it's between 1mm and 2mm (or between 1.49mm and 1.51mm or whatever microscopic amounts), then we'll call that "exactly" - then we'll be being "absolutely precise"!!

Yes, but the purpose of measurement isn't to describe every unit in terms of every other unit. It's to take a unit and measure something else in order to get information about that thing.
You seem to be treating a meter length as more fundamental than a random iron rod here, when metaphysically both are simply lengths, and hence both are epistemologically equally suitable for measurement. The fact that we've all agreed on the metric system as a standard of linear measurement does not grant the millimeter any special metaphysical status.
Now, it's not a bad solution on the face of it. But there's one problem with it that she and "Professor E" (is he Lenny? I can't remember) have overlooked.
That is, the defining points of the range - 1mm and 2mm - *cannot be established any more exactly than the measurement in between them!*

Not if one understands the context of measuring with a millimeter length. (Also, I strongly suspect that Prof. E is Dr. Peikoff, but I don't know. Anyone else know?)
They too must be approximated in the same fashion we have approximated 1.5mm.

Approximated with respect to what? You don't have to do any such approximation if you are using a millimeter standard to begin with-- as I mentioned before, all you're doing then is counting the number of lengths it takes to get just past the length of whatever you're measuring. This is counting: no wiggle room, all discrete units. Precise.
Nate T.