Across billions of neurons, mapped not quite chaotically, but with only that structure which provides for rudimentary autonomic function and the building blocks for more. For cognition, for personhood. As we receive stimuli, we change the structure of this web of pathways. Some grow stronger, others weaken and atrophy.
Patterns develop. Stimuli arriving in near-simultaneity become linked structurally. Impressions are formed, impressions in the physical structure of the brain, affecting the strength and reach of signal passage. The brain grows in sophistication, in connectedness. Basic if not simple functions, those coded for in DNA, are used as the building blocks of yet higher connections and functions. We bend the brain’s capacity to detect visual features, like vertical, horizontal, and diagonal lines, into the ability to recognize a chair or a face. We feel pain, associate it with the visual and spatial attributes of the stovetop, and learn to keep our fingers away from it.
We begin with simple atomic units, and build.
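For the mechanically inclined: what those paragraphs gesture at is roughly Hebbian learning. Here is a toy sketch in Python; the unit count, learning rate, and decay are illustrative numbers of mine, not claims about any actual brain.

import numpy as np

# Caricature of Hebbian learning: links between units that fire
# together are strengthened, and every link decays a little each
# step, so pathways that go unused weaken and atrophy.

rng = np.random.default_rng(0)
n_units = 8                             # a tiny web of pathways
weights = np.zeros((n_units, n_units))  # connection strengths
learning_rate = 0.1
decay = 0.01

for _ in range(1000):
    # A random spray of stimuli; units 2 and 3 always arrive together,
    # standing in for the sight of the stovetop and the pain with it.
    activity = (rng.random(n_units) > 0.7).astype(float)
    activity[2] = activity[3] = 1.0

    weights += learning_rate * np.outer(activity, activity)  # fire together, wire together
    weights *= 1.0 - decay                                   # unused links atrophy

np.fill_diagonal(weights, 0.0)
print(weights[2, 3], weights.mean())  # the 2-3 link dwarfs the average

Run it, and the connection between the stovetop and the pain ends up far stronger than the background links: near-simultaneity, made structure.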
Some few layers beyond, after spending years bathed in sensory data, sorting it by building stronger pathways when they’re helpful and diminishing others when not, the brain has a refined capacity for parsing impressions of the world. It develops a sensory algebra for referring to the world through the senses. We begin to see that the world includes us, that we are in the world, and refine our reference frame such that by echoing our worldview back against the world, we can see our shadow upon it. We are now self-aware.
Thereafter, we have larger units built upon all those smaller ones. We are a unit, an algebraic element, as is the seen world, the heard world, the felt world, the smelled world, the tasted world. We are now world-aware, in a local sense, as that locality follows us.
At some point, before or after, we find a curious application of these patterns: they can be encoded and transferred. We call this language, including but not limited to the grammatical spoken and written languages. With this phenomenon arrives the possibility of creating algebraic elements which are actually only references to others, and the ability to combine these references into yet larger references, systematically. What the world is is reshaped as we fold sensory data into linguistic structures, as we see a dog and understand her to be a “dog” and then read Old Yeller.
In this way, electrical signals passing across synaptic gaps, down myelin-sheathed axons, across networks of fibers and gaps, form higher-order structures, which in turn form yet higher-order structures, and so on, until our “instincts” (whatever those are) are joined by our cognitive machinery to create our world. Just as a brick house is made of walls, which are made of brick, which are made of compacted clay and sediment, and so on, the space which our cognition occupies evolves from more fundamental blocks.
We may, and many people do, call this process or its result “analogy.” I describe it as the inference of structure via the combination of available cognitive elements, at least for now. It forms the basis for subsequent discussion.
I’m still trying to figure out what I think about that. I started in February of 1998, studying physics at Miami University (Ohio), and soon after decided that I might as well pursue a dual math/physics major. Some years later, logistical practicality prevailed, and I chose to study only mathematics. I’m still skeptical that I ever did.
I’d come to define myself, in part, in terms of this studenthood. Between first registration and final grade tally, I married twice and divorced once. I saw two daughters come into the world, and one leave it. I bought a house, divested myself of that house, and bought another. I accepted the job I’ve had for a decade, and which I’ve spent the last few hoping to leave. Always, actively or in that part of my forebrain where ego and regret collide, I was a student.
And now I’m not. Now I’m “done.” I expect my diploma in the mail soon, and I will hang it in our office proudly. Though…though it’ll represent not so much a goal reached, but the closing of a personal epoch. An end to a struggle I never fully joined. A curtain call for the doppelgänger, the ostensible student. And a beginning of my real education.
Not that I haven’t learned anything; that’d be a bit melodramatic and glib. I’ve learned that the Stewart calculus text is a credible device (a) for teaching various recipes for mathematical cooking, e.g., L’Hôpital’s Rule; and (b) for scaring students into believing they’re “bad at math.”
I’ve learned enough of Riemann and Cauchy to have a first-order approximation of how much yet there is to know. I’ve learned some of the basic lexicon of mathematics, of continuity and metric spaces, of normal subgroups and homomorphisms. With a little review, I could even do something with them.
Which is to say, I’ve also learned what it is to truly study a subject. Well, actually, I suppose I can’t claim that: a recursively enumerable set isn’t necessarily a recursive set, which is to say that my knowing that I did not truly study a subject doesn’t confer on me knowledge of what it is to study it. Another first-order approximation, then. But useful nonetheless.
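For anyone who hasn’t met the jargon: the standard witness for that claim is the halting set, written here in textbook notation; it’s the canonical example, not anything out of my own coursework:

\[
  K = \{\, e \mid \text{program } e \text{ halts on input } e \,\}
\]

K is recursively enumerable: to semi-decide membership, run program e on input e and accept if it ever halts. But K is not recursive: a total decision procedure for K would solve the halting problem, which Turing proved impossible. The ability to enumerate a set doesn’t confer the ability to decide it, which is exactly the work the metaphor is doing above.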
Now there is but to finally carry the mantle of student, freed from the illusion that completing repetitive problem sets on line integrals or expectation values actually constitutes more than a cheap rip-off of mathematics. What I’ve gleaned from this at-once trivial and enlightening realization applies not only to my development, but also to that of my children. What I’ve gleaned is that, even if these cobblestone streets were paved with good intentions, what we generally refer to as “education” is at its very best vocational training, and is at its very worst a commodification of our time and attention, of our will to power.
I failed as a student in more ways than I have the patience to note at the moment. Among this litany is my failure to form any academic community. I’m not sure how much I missed out on, but I bet it’s substantial. I’m a product of my passive-aggressive heuristic for dealing with people, characterized by idolizing my antisocial tendencies and then lamenting my lack of human connection. This failure as a student is my failure as a person, notwithstanding the logistical challenges a parent-student faces.
I’m officially done with this particular rant. I’ve really drained myself by alternating self-congratulation and “Woe is me.” I created what Carl Rogers might call a self-image at odds with reality: every Einstein quote, every time I watched Good Will Hunting, every browsing of Carl Friedrich Gauss’s or Terence Tao’s Wikipedia pages, contributed to the fantasy that I would, or more importantly, should, be similarly prodigious and capable. My particular knot of neurons and ganglia translated that information into a proscription against hard work, as if I shouldn’t need to work hard, as if instead I should need only to surround myself with the trappings of genius to free genius. And that, that’s the most important thing I’ve learned.
My most ambitious academic product is stored here (LaTeX file here). It’s an interesting, short paper I all but transcribed from my faculty advisor’s notes. I did learn a bit about computability, and LaTeX, and I enjoyed the process. Now I’m moving on, perhaps to study mathematics.
I get lost in a flesh-colored sea of mundanity, and feel powerless by virtue of membership. Alternately, the slightest brush of sentiment, a sudden memory of a childhood contemplation, a simple courtesy, each can bring me to sentimentality and emotional vulnerability. I may be awed by a feat of community, only to be horrified by the quickness of depravity. Hot and cold.
Hidden somewhere in my DNA is the code whose execution makes me need community. I don’t pretend to understand it, and I’m nearing the end of my too-cool aloofness toward the idea of sharing my concern with others, even putting theirs above mine. Each time I roll my eyes at Christian goodness expressed as a bumper sticker, I’m hoping someone sees it. I want to share my weak outrage, cleverly if possible.
I suffer the fools, nodding with a forced grin, or, more frequently these days, a noncommittal nonchalance, a disaffection. I try not to enable their insipidity. My working hypothesis is that I can condition people to spare me their American Idol gossip and formulaic well wishes, this hypothesis underpinned by the presumption that they proffer the former to justify their interest, and the latter at the behest of perceived cultural obligation. Neither, then, is anything better than the misapplication of earnestness, a motive earnestness powering a crass social algorithm.
Whatever the quality of the hypothesis, I am not yet quite the good scientist. While you may rest your willing suspension of disbelief in the meaninglessness of our entire venture on a fetish for familial melodrama, I am just as easily duped into giving a damn by the ignorant wagging of a small dog’s tail. There’s no sound nihilistic conclusion that doesn’t undermine the value of both.
In our best moments, people and I dance like we’ve been at it for years, like we took lessons and we know what we’re doing. We finally cave and celebrate Christmas, because, damn it, it’s as good an excuse as any to get together and laugh and talk about the good old times and make yet another promise to stay in touch and yes I mean it this time. They’ve got a way of confusing me, people do, making me forget for a moment the difficulty of thinking deeply about continued existence without concluding that it’s ludicrous. I can get lost in that attachment. I can want a social life, a life filled with people and their straining against credulity to believe in the President or their political parties or how wonderful their children are. I can get lost in it, because I want their illusions to be true, because it makes those first steps out of bed each morning easier to take.
Convinced that I can circumnavigate banality by choosing to associate with like-minded folks, I seek them out, find them, and usually make them regret having admitted to a shared interest. I have gone so far as to bemoan the absence of this social cloud, blaming my fractured attentiveness to fields of my interest on the fact that I’ve no one to ask for help and no one to whom to offer it. If only I could surround myself with ambient similarity of intent and disposition, I suppose, I’d finally earn my pride.
There’s logic in that, of course. There are fewer things any one of us can do better alone than there are things better suited to multiple effort, and individual aptitude doesn’t scale well. And maybe my American pedigree comes with too great a focus on not only genius, but on solitary genius, the single brilliant point in all the surrounding murky darkness. Too great, as I’ve only just seen my way through it and started learning that genius doesn’t exist, that the question isn’t even relevant to any substantive success. No, whatever our natural aptitude, it’s our work that achieves; and work is about the goal, independent of whether we reached out for help or soldiered on in private.
I have this idea, right? I’ve tinkered with it, and I’m convinced it’s a winner. It’s about a coffee house, but not Starbucks or that local shop you go to to keep it real. No, this is a shop with coffee, and healthy snacks, and white- and/or chalkboards, and computers loaded with fancy math software. And books—oh, the books. Shelves of canonical tomes, of Newton and Euclid and Euler and Gauss. Lesser works, too, but nothing without rigor. The real stars, though, are the customers.
They’re math geeks, physics geeks, engineers in fact or aspiration, hackers and crypto wonks, and the taggers-along who dig the nerd scene. You could float from one table to the next and overhear all manner of esoteric enthusiasm; and you could watch the hasty cycle of writing, erasing, lip-chewing, and writing at the boards, two or three contributors locked on a proof or a messy optics problem. All the reference material you’d ever need, all the theorems and algorithms and particle masses, are sitting on the shelves, in multiplicity. You could just soak it in, and maybe learn a thing or two.
Or, rather, I could. Ambient similarity of intent and disposition + ready access to coffee and geeks. And I own the place, which means all the business is my business; and it means that I’ve made it out of the cube farm and into the fold. No longer on part-time terms, I’m a full-time, fully vested member of that community. I’d call this place The Teachers Lounge.
Despite the eagerness with which I hang it, there’s a window behind all that window dressing, and it opens out into a void. All our efforts, all our cares, all our lofty endeavors, all our conniving and violence and pity, all of it breaks down to a single concern: survival. Depending on the scope, survival of the individual or survival of the species; but each of our urges to survive arises from similar genetic heritage, so the former is just an instance of the latter. How, then, can survival actually mean anything? If we matter to more than ourselves, I have yet to see how. Put those things together, that all our business is merely survival and that our existence bears no objective worth, and our business is not itself imbued with intrinsic value. We must give it any value it has.
But the form of this value varies across cultures, measured nationally or racially or ethnically or socioeconomically, so we can’t even provide a sake for life other than life itself. This value is for each of us to assign if it is to be nontrivial, which is problematic. It’s problematic because the shared vision, the community, is in its sparest incarnation a search for authenticity, for an authority of evaluation; if each of us can count on no one but ourselves to define our value, then the community offers false hope at best, confusion or worse otherwise.
I’ll skirt an invocation of Godwin’s law and instead generically assert that history provides us many glaring examples of vicarious valuation gone wrong. Wherever there is a prostration to the wisdom of crowds, there stands a good chance for disaster and mayhem, though we might merely suffer banality instead. And we’re back where we started.
I have a few projects in mind that revolve around a surprisingly resilient interest in sharing something with a group of people. Most are more immediately tenable than The Teachers Lounge. I’ve decided—again, but more resolutely now—to give this community thing a serious whirl. I’m anticipating making use of the extra computing power; anything my brain can crunch, mine and someone else’s together might crunch better and/or faster. And there are some prospects for sustainable business models among them, too. But most of all, I’m interested in collecting data and testing my hypotheses. Maybe I’m wrong; maybe there’s some kind of meaning hidden away in the spaces between us. I don’t think so, I’m not expecting much, and I’m not quite beyond feeling I’m selling out. Alas, there is but to give it a shot.
Upon a cursory reading, Chesterton also seems to make several comments to this effect, though madness, not simple despair, is his result. I concur: it’s quite daunting to have none but yourself to rely upon for guidance and whatever “purpose” can be had.
It doesn’t follow, however, that the truth cannot be dire. Invoking fruitlessness or direness as illustrations or proof of absurdity would seem relevant only if we take as given a need for life, a purpose in it other than as a means to excrete and feed other life by dying. That effectively cuts to the heart of the issue, and begs the question of whether or not humanity exists to serve a designed purpose.
Furthermore, to support a designed Reason via despair in its absence is tantamount to “finding religion” as a means of escapism, and it should be insulting—to the believer. It’s said that “there are no atheists in foxholes”; and I say that grenades don’t make believers. More than mere delusion, it appears circular: believing in a higher authority as an alternative to a dire life in its absence presupposes that the universe gives a fuck. Again, that assumes the very question at issue, and is unconvincing.
Of course life can be desperate. It has, does, and by all indications, will, from time to time, become so. I counter the claim that it must be so in the absence of an extrahuman power, man-jerk or otherwise. I find it unfortunate, both that the believer is oft characterized as a zealous mystic, ignorant of fact; and that the nonbeliever is characterized as suffering from a melancholic humor, lost and listless.
To say that death is the only viable option in a world without an intrinsic, enterprising purpose is to say that life lived for its own sake is not viable. If a morality devoid of extrahuman stewardship requires no distinction between raping and not raping, then there can’t be any differentiation between the choice to live and the choice to die. Even if the world of the nonbeliever is this absurd, there still is no de facto reason for suicide. Even if life is nothing more than a protracted process of cellular division and decay, of conversion of energy into food into energy, why is living not a justifiable choice?
Consider, then, that even the nonbeliever can accommodate a belief in something grander than himself: an elegant proof; a refinement of a physical theory; the next quirky thing your son says. All are informing, all contribute to making you more than you were before them. Even oppression, persecution, and assault are potentially additive, if variably unpleasant. Those who don’t believe it was created are not necessarily restricted from admiring the natural world—this is no one’s special province.
I awake each morning and rise, not because today will be better than yesterday; but because today is today, and is not yesterday.
And if it’s not, you’re doing it wrong. Which represents the greater hubris: that we have found structures in the universe whose origins we can understand only if they were created by a higher consciousness; or that, given an intelligent creator, whoever that is must’ve received, like, a C at best in ENG302: Creating Things That Work Well to Populate Your New Universe?
The first point is rather subtle, or seems so given the enthusiasm with which the standard of intelligent design is being raised. The creative capacity of the human species is lauded as one of the characteristics that significantly separate us from other species, and is fundamental to viewing humans as undoubtedly greater. We look to our semiconductor chipsets, our skyscrapers, our movable type, our perfect novels, and our weapons of mass destruction and proclaim, awfully, “Holy shit, we own.” To us, these results of our creative endeavors are without the bounds of nature.
No surprise, then, that upon observing some phenomenon in nature which bears complexity strikingly similar to that of the creations wrought by our hands and minds, we conclude said phenomenon is the product of some creative force. Of course that force would have to be intelligent, must be conscious and create with intent, because that’s what we do. That force, though, is not human.
Let’s assume there is an intelligent designer. It’s no feat whatsoever to find fault with this designer’s product (though this is by no means perfectly demonstrative of the claim, as some responses note). I have also heard, more than once and from more than one source, that engineers interested in the topic generally consider the human body to be poorly engineered. Our only saving grace, it would seem, is our intelligence, as, without it, we surely would not prosper as we do. However, a useful argument can be framed which would call even this conclusion into question. It is a long way from self-evident that we are, in fact, the product of an intelligent designer, if a product of any designer we are.
Proponents of intelligent design might have their bets hedged. It’s likely that someone, upon reading the preceding paragraph, would respond, “Well, sure, maybe the coccyx seems ill-planned; but you can’t claim to know the Creator’s plan! You’re overstepping the bounds of your understanding, sir, and I’ll have none of that.” To which I would reply, “I was only following your lead, my fellow, given that you’ve deemed your understanding of nature to be quite conclusive on the matter of what occurs naturally and what must be created to exist.”
“Further,” I’d add, straightening my smoking jacket, “it might appear to the wary observer that it is you who has overstepped the bounds of your understanding more grossly, decided as you have that whatever you might make with some hardware and a free afternoon is beyond the means of the remaining entirety of the universe…except an agent fashioned after humans. Could it be, rather, that ‘creation’ is not the province of intelligence at all, but instead just one other dynamic of the universe, like entropy? Is it really so odd to think that we ‘create’ because we are wholly within nature, inextricably, and are no less an agent of nature, an organ, than gravity or stars or diseases; and, further, is it really so odd to think that, as an organ of nature, we should propagate a dynamism found elsewhere in nature, producing output which had not been observed prior? What if ‘creation’ is a product of the universe, and not vice versa?”
How different, really, are these conceptions? What is really so different about explaining the development of some observed phenomena with an appeal to this Greatness over that one? We may debate the details but not the Greatness. That is, no one really presumes to understand everything about everything, or even everything about much of anything; and whether they’re particle physicists or pastors (or both), for each contributor to this large discussion there is a threshold beyond which we seem commonly to say, “Well, beyond that, we just don’t know.”
Carlin died in June of 2008, at the age of 71. It’s hard to estimate Carlin’s value in Western culture; his Seven Words are, today, somewhat less powerful than they were in 1972, and whatever his one-time popularity, their deflation is not significantly his doing, I think. At his height, TV love scenes still involved the “one foot on the floor” rule, so for many the Carlin spiel was apparently quite powerful.
And while the Carlin canon pivots on those seven words, they were only part of his larger social commentary machine. No doubt the Richard Pryors and Steven Wrights of the world owe some small debt to George Carlin. As social creatures who, if unwittingly, make use of his exploits, we, too, owe him that debt. I often found his delivery a little too fond of itself, but maybe that was part of the bit.
However his cultural value is estimated, he seems one of the last of a dying breed, if not necessarily a dying species. I don’t get out much, but I don’t recall hearing of anyone with a similar currency, in terms of popularity and stickiness, who challenges social norms as a matter of principle. The closest might be Dave Chappelle, off the top of my head, but that’s not where I keep my good stuff, so I’m not sure how much you should invest in that idea. Oprah Winfrey notwithstanding, the popular culture has changed over these decades of Carlin’s popular decline, and any market research company (including mine) will tell you that it’s increasingly difficult to intercept people’s attention. Some have called George Carlin a “hero”; I’m not sure anyone can become that kind of hero in a culture that cherishes its schisms and disaffection.