Not that social conventions and mathematics are necessarily mutually exclusive.
Math was used to formulate the decadal system and the calendar--just not the math you want to apply. That argument is specious. If you don't understand, don't assume; it looks foolish.
Now, actually using this particular math is a social convention. We didn't need to use this math, but we have, for a long time. However, language provides labels, and we can find useful ways to generalize from those labels and form social conventions from that generalization. Then it's possible to use *both* conventions at once and have them interact badly. That's abstract. Let's exemplify it.
English counts using simplex numerals from 1-10. At 11 it looks simplex, and is. But historically it's "one left over"--you count using your fingers, and if there's one left uncounted you've got 11. 12 is "two left over". Apparently "three left over" was a bit much for the Common Germanic speakers, so it's "3 added to 10", threotene or some such form, which we've inherited as "thirteen". The same goes for 14-19. That "tene", now "teen", is handy. We generalize 13-19 and get teens. Translating "teenager" into other languages is a bitch because the word is based on a peculiarity of English, one that most languages don't have and we only have because some people living in N. Europe in 500 BCE decided to count 11 and 12 oddly. Still, having "teens" is a useful social convention, at least in English. I asked my wife earlier today if she thought of 18- and 19-year-olds as teenagers. She knows a bit of semantics, and they're not prototypical teenagers. She said "no, not really". Then she said they are anyway. That sums it up nicely.
We have another convention. At age 18 you become responsible for yourself. You can marry at 18 and 19, own property, do all kinds of things. You're not a minor anymore; you've reached your majority. Or, since that sounds a bit old-fashioned, you're an adult.
So a couple of years ago the CDC published the teen birthrate and noted a spike in infants born to teens in the US. It was easy to assume that meant out-of-wedlock births, but it didn't; it was easy to assume it meant "babies having babies," but it didn't. Enough 18- and 19-year-olds married and had kids (whether conception occurred before the nuptials is beside the point) that it made for misleading statistics. The category "teen births" had to be kept for practical reasons. The prototypical teen should not be having kids because that, statistically, has nasty correlations with future problems for the kids; to say that a married 19-year-old woman shouldn't have kids is a stretch--the correlations don't hold so neatly there. Still, it's usually a useful convention to track things by teens versus 20s, and usually better than using the minor/adult classification. Two conventions, and the usually useful one briefly failed to be useful.
So we can have two conventions in use at the same time to cover the same ground. One is rooted in something other than informal social conventions--the minor/adult distinction is grounded in law--and the other is rooted in superficial linguistic forms (note that the legal one is still a social convention, but not as quirky as the language-idiosyncratic idea of "teens").
Let's look at time.
First, we enumerate things. I have 3 apples; let's say I number them in the order I received them: apple 1, apple 2, apple 3. (We don't do this with ages, but who needs consistency?) Let's say I'm hungry. If I eat 3 apples out of the 3 (eating the 3rd first, the 2nd second, and the 1st last), i.e., 3 - 3, I have zero apples. Crucially, I do *not* have apple 0 after eating my three apples. I'm subtracting things and counting what's left; I'm not subtracting numbers. Oddly, try this: 3 - 2.5. What do I have left? Half of apple 1. So I still have something I can point to and say "this is apple 1, my first apple." I do not say I have "apple 1/2". If it's easier, think of "apple 1" as a formal way of writing "first apple", and "year 2010" as a way of writing "two-thousand-tenth year".
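If it helps to see the labels-versus-quantities point laid out, here's a quick sketch in Python (the apple bookkeeping and the eat() helper are just my illustration, nothing official):

    # Apples carry ordinal labels starting at 1; the *count* of apples is a
    # separate cardinal number. Eating changes the count, never the labels.
    def eat(apples, amount):
        """Consume `amount` apples' worth, highest-numbered apple first."""
        for label in sorted(apples, reverse=True):
            if amount <= 0:
                break
            bite = min(amount, apples[label])
            apples[label] -= bite
            amount -= bite
        return {label: left for label, left in apples.items() if left > 0}

    three_apples = {1: 1.0, 2: 1.0, 3: 1.0}   # label -> fraction of that apple left
    print(eat(dict(three_apples), 3))     # {}        -> zero apples, but no "apple 0"
    print(eat(dict(three_apples), 2.5))   # {1: 0.5}  -> still apple 1, just half of it

Zero shows up as the count of what's left, never as a label on an apple.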
I could write "QED" and stop. There's math involved. Just not your math. Using it is a social convention, of course, and I'm not done with this in any event. Not by a long shot, because there's a useful social point to be made after all the falderol about social conventions and mathematics.
When I have no apples, I have no apples. If I extend it to years, it's "2010 years - 2010 years", so I don't have year 0. I have zero years. Zero years, at most? Isn't that a bit nonsensical? Time existed before year 1, surely. But time doesn't come in discrete units that started at 12:00 a.m. on 1/1/1. Still, let's keep using apples.
If I have 0 apples and suddenly want to nibble an apple, I can borrow one. Do I call it apple 0? No. It's an apple owed. I'm in the hole for an apple. If I'm going to document it, number it, list it on my fruit ledger, it's apple -1. Whether all or part is consumed, it's apple -1 that I have, that I owe back to somebody. There is no apple 0, unless I'm in a computer science class or am otherwise consciously indexing from zero, trying to label my first 10 apples using just one digit. (Then I'm likely to count 8, 9, a, b, c, d, e, f as well.)
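For what it's worth, here's the ledger version of that, sketched in Python (the ledger layout and the next_label() helper are mine, purely for illustration):

    # A fruit ledger with no apple 0: owned apples get labels 1, 2, 3, ...
    # and borrowed apples get -1, -2, ...; the label 0 is simply never used.
    def next_label(ledger, borrowed=False):
        if borrowed:
            return min((l for l in ledger if l < 0), default=0) - 1   # -1, -2, ...
        return max((l for l in ledger if l > 0), default=0) + 1       #  1,  2, ...

    ledger = {}
    for _ in range(3):
        ledger[next_label(ledger)] = "owned"             # apples 1, 2, 3
    ledger[next_label(ledger, borrowed=True)] = "owed"   # apple -1
    print(sorted(ledger))   # [-1, 1, 2, 3] -- no 0 anywhere

    # The zero-based habit is the computer-science convention, not arithmetic:
    print(list(enumerate("abcdefghij")))  # indices 0-9 label ten items with one digit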
Shifting back to time: if I'm at a hypothetical 12:01 a.m. on 1/1/1 and want to move back to that moment just before midnight, where I'm forced to be if I force the clock backwards to remove every last bit of year 1 (since time is continuous, or at least appears so in normal life), where am I? I'm before year 1 but I'm still in a year--I haven't fallen out of time--and so that year has to be counted. I don't count 0, as a rule. It's the same as with apples: I'm a bit in the hole, having taken a bite out of a piece of time that's not in positive territory. If 1 means "common era"--the first year of the common era is year 1--then I've moved to "before common era". Simply because I'm counting things. If 12:01 a.m. on 1/1/1 counts as year 1 because it's in year 1, then the 11:59 p.m. two minutes earlier has to be 11:59 p.m. on 12/31/"-1", or 1 BCE/BC. 0 is where there is no time and, because of the nature of reality (one tends to choose one's math to suit what one's modelling), there is no point at which I can say, "Aha, zero!" Unlike when I can say I have no apples.
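Here's the same reasoning as a little Python sketch; the year_label() function and the idea of measuring moments as a signed offset from midnight starting 1/1/1 are my own framing of it:

    import math

    def year_label(offset_years):
        """Label the year containing a moment measured in (possibly fractional)
        years from 12:00 a.m. on 1/1/1. Any moment at or after that instant
        falls in 1 CE, 2 CE, ...; any moment before it falls in 1 BCE, 2 BCE, ...
        There is no year 0 to land in."""
        if offset_years >= 0:
            return f"{math.floor(offset_years) + 1} CE"
        return f"{math.ceil(-offset_years)} BCE"

    print(year_label(0.0))       # 1 CE     -- midnight starting 1/1/1
    print(year_label(2009.5))    # 2010 CE  -- mid-2010
    print(year_label(-0.00001))  # 1 BCE    -- a sliver before midnight, 1/1/1
    print(year_label(-1.5))      # 2 BCE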
0 is what you have, the quantity you have, when you have none. But after moving to a fraction of a second before 1 CE/AD starts, I'm still in some year. And that year gets counted. That's the way *that* kind of math works. In all fairness, numbering things from 0 really is a recent kind of thing. And think about it: if you number your first apple "apple 0", then if you take away that apple you get nothing. You'd probably want to write it "0 - 1 = 0".
So the lack of year 0 isn't a problem. It's entailed, actually, by the way we number things. And that nasty 0 - 1 = 0 business. (Yeah, I know it's mixing different kinds of terms as though you could perform a mathematical operation on them, but it's funny to look at.)
Let's keep going. Now, it's a social convention to group years in sets of 10. That's a decade. A trite definition, to be sure. Let's assume that we start with 2010 as the first year in a decade. It's not hard to see that 20 years is two decades, 200 years is 20 decades, and 2000 years is 200 decades. So if 2010 starts a decade, we zap out 200 decades and wind up with the year 10 starting a decade. No problem, in principle.
Do we want to exhaustively include all years in decades? I mean, what about the "rump" years of 1-9? They're not a decade; they're a nonad. Do we have a social convention that we start grouping the years into decades from 1 or from, say, 2010? Which do you want to do *mathematically*? I'd start with 1 and count exhaustively. Then 2010 ends a decade, because the first decade was 1-10. A lot of people like that convention. It's simple: you start counting and every 10 you say, "Ah, the end of a decade." If we do that, though, saying 2010 starts a decade *is* a problem. It's a useful social convention because every year is in a decade, with no 9-year decades there to screw up the count. (I mean, if the first decade is odd, why not the 198th? Then 2009 would start "the" decade.)
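Since I keep waving my hands, here's a side-by-side sketch of the two conventions in Python (the function names are mine, not anybody's standard):

    def decade_by_label(year):
        """The surface-form convention: a decade is the years that share
        everything but their last digit--2010-2019, 1990-1999, ...--plus a
        nine-year "rump" running from 1 through 9."""
        start = (year // 10) * 10
        return (max(start, 1), start + 9)

    def decade_by_count(year):
        """The count-from-1 convention: decades are 1-10, 11-20, ...,
        2001-2010, so every decade has exactly ten years."""
        start = ((year - 1) // 10) * 10 + 1
        return (start, start + 9)

    print(decade_by_label(2010))   # (2010, 2019) -- 2010 starts a decade
    print(decade_by_count(2010))   # (2001, 2010) -- 2010 ends a decade
    print(decade_by_label(5))      # (1, 9)       -- the nine-year nonad
    print(decade_by_count(5))      # (1, 10)

Both are perfectly well-defined; they just disagree about years like 2010, which is the whole point.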
On the other hand, just as we have oddities with "teens" based on the superficialities of language mapping in some useful way onto adolescence and adolescents, so we have our numbering system where the 20s, 30s, 40s, 50s... all form neat kinds of similar units. And, not accidentally, they're all decades--units of 10, and all based on the penultimate digit in the year designation. They form a handy class. All of the "nulevye" in Russian, all the "zeroeths", for instance; the "ninetieths", i.e., the '90s. Russian's consistent because you can just as easily talk about the antepenultimate digit and refer, say, to the eight-hundredths (i.e., the 1800s) or use two digits and refer to the eight-hundred-seventieths (the 1870s). It's a handy social convention. Russian nicely has "tenths", so they can refer to the period of, say, 2010-2019 in a consistent way (they don't have the eleven/twelve vs. 13-19 split in linguistic form). English doesn't handle those numbers consistently, so that's going to be interesting: Will 2010-2019 be the "pre-teens" or "tweens", while 2013-2019 are the "teens"? Will we oddly refer to 2010-2019 as the "tens"? (Note that we skipped from the "gay '90s" to the "roaring '20s" probably not by accident, lacking handy terms for 1900-1919.)
Of course, this is a *different* social convention. As with the contrasting teen/20s versus minor/adult conventions, there are two distinct and competing conventions or definitions at play here.
The question that's usually posed is infelicitous--the word used in discourse pragmatics for "ill-formed" or, well, improper: "When does *the* decade start?" That "the" presupposes that we have a common frame of reference, that we know what "the decade" refers to even as we ask what it refers to. It presupposes, in this case, that we have the same definition--the same social convention--in play. But there are two. One starts decades at year 1 and insists that all years be part of a well-formed decade; the other bases decade divisions on surface linguistic forms--a labeling scheme built around the decades that can easily be so labeled.
The question imposes an absolute unity on an obvious dichotomy, and forces an "if you're not with me you're against me" mentality that's inappropriate. Then it compels us to try to force reason and things like mathematics to rally to our side, as though forcing things that way made any kind of sense. Really, "2010 - 2010 = 0". (Even look at your graph: OK, 0 to 1 is the first unit greater than 0, 0 to -1 is, I guess, the -1th unit, the first unit less than zero--now, find me the zeroeth unit, the one, I guess, between 0 and -0? Yep, as though forcing things that way made any kind of sense.)
For some purposes, by one social convention, 1990--i.e., 12/31/1990--concluded the decade that began on 1/1/1981. For other purposes, by a different social convention, 1/1/1990 started the decade that ended on 12/31/1999.
Without context, without definitions and some background, there is no "the decade". It's a bit like talking about "Dave"--surely you know one, and when I say, "Hey, did Dave call you?" a particular Dave comes to mind. If I continue, "If he calls again, tell him and his wife Carmelita hi and ask if he still has a vizsla," it's likely to be a bit jarring. It's improbable we know any of the same Daves. It would be silly to assume we have some "shared Dave."
Arguing about which is the One True Social Convention is pointless, as well--it's like asking which of us knows the "real" Dave. We live with the quirkiness of acting like teenagers are all minors yet being forced to call an adult 19-year-old a teenager. And yet most people are convinced that *their* social convention must necessarily be everybody else's social convention.
Diversity's fine, and to be encouraged, as long as we're all of my opinion.
Meh.