We begin as pulses of electricity.

Across billions of neurons, mapped not quite chaotically, but with only that structure which provides for rudimentary autonomic function and the building blocks for more. For cognition, for personhood. As we receive stimuli, we change the structure of this web of pathways. Some grow stronger, others weaken and atrophy.

Patterns develop. Stimuli arriving in near-simultaneity become linked structurally. Impressions are formed, impressions in the physical structure of the brain, affecting the strength and reach of signal passage. The brain grows in sophistication, in connectedness. Basic if not simple functions, those coded for in DNA, are used as the building blocks of yet higher connections and functions. We bend the brain’s capacity to detect visual features, like vertical, horizontal, and diagonal lines, into the ability to recognize a chair or a face. We feel pain, associate it with the visual and spatial attributes of the stovetop, and learn to keep our fingers clear of it.

We begin with simple atomic units, and build.

Some few layers beyond, after spending years bathed in sensory data, sorting it by building stronger pathways when they’re helpful and diminishing others when not, the brain has a refined capacity for parsing impressions of the world. It develops a sensory algebra for referring to the world through the senses. We begin to see that the world includes us, that we are in the world, and refine our reference frame such that by echoing our worldview back against the world, we can see our shadow upon it. We are now self-aware.

Thereafter, we have larger units built upon all those smaller ones. We are a unit, an algebraic element, as is the seen world, the heard world, the felt world, the smelled world, the tasted world. We are now world-aware, in a local sense, as that locality follows us.

At some point, before or after, we find a curious application of these patterns: they can be encoded and transferred. We call this language, including but not limited to the grammatical spoken and written languages. With this phenomenon arrives the possibility of creating algebraic elements which are actually only references to others, and the ability to combine these references into yet larger references, systematically. What the world is is reshaped as we fold sensory data into lingual structures, as we see a dog and understand her to be a “dog” and then read Old Yeller.

In this way, electrical signals passing through neuron gaps, down myelin-sheathed neuron fibers, across networks of fibers and gaps, form higher-order structures, which in turn form yet higher-order structures, and so on, until our “instincts” (whatever those are) are joined by our cognitive machinery to create our world. Just as a brick house is made of walls, which are made of brick, which are made of compacted clay and sediment, and so on, the space which our cognition occupies evolves from more fundamental blocks.

We may, and many people do, call this process or its result “analogy.” I describe it as the inference of structure via the combination of available cognitive elements, at least for now. It forms the basis for subsequent discussion.

People and me, we’re a hot and cold thing.

I get lost in a flesh-colored sea of mundanity, and feel powerless by virtue of membership. Alternately, the slightest brush of sentiment, a sudden memory of a childhood contemplation, a simple courtesy, each can bring me to sentimentality and emotional vulnerability. I may be awed by a feat of community, only to be horrified by the quickness of depravity. Hot and cold.

Hidden somewhere in my DNA is the code whose execution makes me need community. I don’t pretend to understand it, and I’m nearing the end of my too-cool aloofness toward the idea of sharing my concern with others, even putting theirs above mine. Each time I roll my eyes at Christian goodness expressed as a bumper sticker, I’m hoping someone sees it. I want to share my weak outrage, cleverly if possible.

I suffer the fools, nodding with a forced grin, or, more frequently these days, a noncommittal nonchalance, a disaffection. I try not to enable their insipidity. My working hypothesis is that I can condition people to spare me their American Idol gossip and formulaic well wishes, this hypothesis underpinned by the presumption that they proffer the former to justify their interest, and the latter at the behest of perceived cultural obligation. Neither, then, is anything better than the misapplication of earnestness, a motive earnestness powering a crass social algorithm.

Whatever the quality of the hypothesis, I am yet not quite the good scientist. While you may rest your willing suspension of disbelief in the meaninglessness of our entire venture on a fetish for familial melodrama, I am just as easily duped into giving a damn by the ignorant wagging of a small dog’s tail. There’s no sound nihilistic conclusion that doesn’t undermine the value of both.

On Community

In our best moments, people and I dance like we’ve been at it for years, like we took lessons and we know what we’re doing. We finally cave and celebrate Christmas, because, damn it, it’s as good an excuse as any to get together and laugh and talk about the good old times and make yet another promise to stay in touch and yes I mean it this time. They’ve got a way of confusing me, people do, making me forget for a moment the difficulty of thinking deeply about continued existence without concluding that it’s ludicrous. I can get lost in that attachment. I can want a social life, a life filled with people and their straining against credulity to believe in the President or their political parties or how wonderful their children are. I can get lost in it, because I want their illusions to be true, because it makes those first steps out of bed each morning easier to take.

Convinced that I can circumnavigate banality by choosing to associate with like-minded folks, I seek them out, find them, and usually make them regret having admitted to a shared interest. I have gone so far as to bemoan the absence of this social cloud, blaming my fractured attentiveness to fields of my interest on the fact that I’ve no one of whom to ask for help and no one to whom to offer it. If only I could surround myself with ambient similarity of intent and disposition, I suppose, I’d finally earn my pride.

There’s logic in that, of course. There are fewer things any one of us can do better alone than there are things better suited to multiple effort, and individual aptitude doesn’t scale well. And maybe my American pedigree comes with too great a focus on not only genius, but on solitary genius, the single brilliant point in all the surrounding murky darkness. Too great, as I’ve only just seen my way through it and started learning that genius doesn’t exist, that the question isn’t even relevant to any substantive success. No, whatever our natural aptitude, it’s our work that achieves; and work is about the goal, independent of whether we reached out for help or soldiered on in private.

The Teachers Lounge

I have this idea, right? I’ve tinkered with it, and I’m convinced it’s a winner. It’s about a coffee house, but not Starbucks or that local shop you go to to keep it real. No, this is a shop with coffee, and healthy snacks, and white- and/or chalkboards, and computers loaded with fancy math software. And books—oh, the books. Shelves of canonical tomes, of Newton and Euclid and Euler and Gauss. Lesser works, too, but nothing without rigor. The real stars, though, are the customers.

They’re math geeks, physics geeks, engineers in fact or aspiration, hackers and crypto wonks, and the taggers-along who dig the nerd scene. You could float from one table to the next and overhear all manner of esoteric enthusiasm; and you could watch the hasty cycle of writing, erasing, lip-chewing, and writing at the boards, two or three contributors locked on a proof or a messy optics problem. All the reference material you’d ever need, all the theorems and algorithms and particle masses, are sitting on the shelves, in multiplicity. You could just soak it in, and maybe learn a thing or two.

Or, rather, I could. Ambient similarity of intent and disposition + ready access to coffee and geeks. And I own the place, which means all the business is my business; and it means that I’ve made it out of the cube farm and into the fold. No longer on part-time terms, I’m a full-time, fully vested member of that community. I’d call this place The Teachers Lounge.

The Nothing

Despite the eagerness with which I hang it, there’s a window behind all that window dressing, and it opens out into a void. All our efforts, all our cares, all our lofty endeavors, all our conniving and violence and pity, all of it breaks down to a single concern: survival. Depending on the scope, survival of the individual or survival of the species; but each of our urges to survive arises from similar genetic heritage, so the former is just an instance of the latter. How, then, can survival itself actually mean anything? If we matter to more than ourselves, I have yet to see how. Put those things together—that all our business is merely survival, and that our existence bears no objective worth—and our business is not itself imbued with intrinsic value. We must give it any value it has.

But the form of this value varies across cultures, measured nationally or racially or ethnically or socioeconomically, so we can’t even provide a sake for life other than life itself. This value is for each of us to assign if it is to be nontrivial, which is problematic. It’s problematic because the shared vision, the community, is in its sparest incarnation a search for authenticity, for an authority of evaluation; if we can each count on no one but ourselves to define our value, then the community offers false hope at best, confusion or worse otherwise.

I’ll skirt an invocation of Godwin’s law and instead generically assert that history provides us many glaring examples of vicarious valuation gone wrong. Wherever there is a prostration to the wisdom of crowds, there stands a good chance for disaster and mayhem, though we might merely suffer banality instead. And we’re back where we started.

Alas

I have a few projects in mind that revolve around a surprisingly resilient interest in sharing something with a group of people. Most are more immediately tenable than The Teachers Lounge. I’ve decided—again, but more resolutely now—to give this community thing a serious whirl. I’m anticipating making use of the extra computing power; for anything my brain can crunch, mine and someone else’s might do it better and/or faster. And there are some prospects for sustainable business models among them, too. But most of all, I’m interested in collecting data and testing my hypotheses. Maybe I’m wrong, maybe there’s some kind of meaning hidden away in the spaces between us. I don’t think so, I’m not expecting much, and I’m not quite beyond feeling I’m selling out. Alas, there is but to give it a shot.

Even with a balanced coin, it’s no mean feat to make sense of the world.

Upon a cursory reading, I find that Chesterton also makes several comments to this effect, though madness, not simple despair, is his result. I concur: it’s quite daunting to have none but yourself to rely upon for guidance and whatever “purpose” can be had.

It doesn’t follow, however, that the truth cannot be dire. Invoking fruitlessness or direness as illustrations or proof of absurdity would seem relevant only if we take as given a need for life, a purpose in it other than as a means to excrete and feed other life by dying. That effectively cuts to the heart of the issue, and begs the question of whether or not humanity exists to serve a designed purpose.

Furthermore, to support a designed Reason via despair in its absence is tantamount to “finding religion” as a means of escapism, and it should be insulting—to the believer. It has been said that there are no atheists in foxholes; and I say that grenades don’t make believers. More than mere delusion, it appears circular: believing in a higher authority as an alternative to a dire life in its absence presupposes that the universe gives a fuck. Again, that points back to the question itself, and is unconvincing.

Of course life can be desperate. It has, does, and by all indications, will, from time to time, become so. But I counter the claim that it must be so in the absence of an extrahuman power, man-jerk or otherwise. I find it unfortunate both that the believer is oft characterized as a zealous mystic, ignorant of fact, and that the nonbeliever is characterized as suffering from a melancholic humor, lost and listless.

To say that death is the only viable option in a world without an intrinsic, enterprising purpose is to say that life lived for its own sake is not viable. If a morality devoid of extrahuman stewardship requires no distinction between raping and not raping, then there can’t be any differentiation between the choice to live and the choice to die. Even if the world of the nonbeliever is this absurd, there still is no de facto reason for suicide. Even if life is nothing more than a protracted process of cellular division and decay, of conversion of energy into food into energy, why is living not a justifiable choice?

Consider, then, that even the nonbeliever can accommodate a belief in something grander than himself: an elegant proof; a refinement of a physical theory; the next quirky thing your son says. All are informing, all contribute to making you more than you were before them. Even oppression, persecution, and assault are potentially additive, if variably unpleasant. Those who don’t believe it was created are not necessarily restricted from admiring the natural world—this is no one’s special province.

I awake each morning and rise, not because today will be better than yesterday; but because today is today, and is not yesterday.

The tailbone’s connected to the funny bone.

And if it’s not, you’re doing it wrong. Which represents the greater hubris: that we have found structures in the universe whose origins we can understand only if they were created by a higher consciousness; or that, given an intelligent creator, whoever that is must’ve received, like, a C at best for ENG302: Creating Things That Work Well to Populate Your New Universe?

The first point is rather subtle, or seems so given the enthusiasm with which the standard of intelligent design is being raised. The creative capacity of the human species is lauded as one of the characteristics that significantly separate us from other species, and is fundamental to viewing humans as undoubtedly greater. We look to our semiconductor chipsets, our skyscrapers, our movable type, our perfect novels, and our weapons of mass destruction and proclaim, awfully, “Holy shit, we own.” To us, these results of our creative endeavors are without the bounds of nature.

No surprise, then, that upon observing some phenomenon in nature which bears complexity strikingly similar to that of the creations wrought by our hands and minds, we conclude that said phenomenon is the product of some creative force. Of course that force would have to be intelligent, must be conscious and create with intent, because that’s what we do. That force, though, is not human.

Let’s assume there is an intelligent designer. It’s no feat whatsoever to find fault with this designer’s product (though this is by no means perfectly demonstrative of the claim, as some responses note). I have also heard, more than once and from more than one source, that engineers interested in the topic generally consider the human body to be poorly engineered. Our only saving grace, it would seem, is our intelligence, as, without it, we surely would not prosper as we do. However, a useful argument can be framed which would call even this conclusion into question. It is a long way from self-evident that we are, in fact, the product of an intelligent designer, if a product of any designer we are.

Proponents of intelligent design might have their bets hedged. It’s likely that someone, upon reading the preceding paragraph, would respond, “Well, sure, maybe the coccyx seems ill-planned; but you can’t claim to know the Creator’s plan! You’re overstepping the bounds of your understanding, sir, and I’ll have none of that.” To which I would reply, “I was only following your lead, my fellow, given that you’ve deemed your understanding of nature to be quite conclusive on the matter of what occurs naturally and what must be created to exist.”

“Further,” I’d add, straightening my smoking jacket, “it might appear to the wary observer that it is you who has overstepped the bounds of your understanding more grossly, decided as you have that whatever you might make with some hardware and a free afternoon is beyond the means of the remaining entirety of the universe…except an agent fashioned after humans. Could it be, rather, that ‘creation’ is not the province of intelligence at all, but instead just one other dynamic of the universe, like entropy? Is it really so odd to think that we ‘create’ because we are wholly within nature, inextricably, and are no less an agent of nature, an organ, than gravity or stars or diseases; and, further, is it really so odd to think that, as an organ of nature, we should propagate a dynamism found elsewhere in nature, producing output which had not been observed prior? What if ‘creation’ is a product of the universe, and not vice versa?”

How different, really, are these conceptions? What is really so different about explaining the development of some observed phenomena with an appeal to this Greatness over that one? We may debate the details but not the Greatness. That is, no one really presumes to understand everything about everything, or even everything about much of anything; and whether they’re particle physicists or pastors (or both), for each contributor to this large discussion there is a threshold beyond which we seem commonly to say, “Well, beyond that, we just don’t know.”

Some folks deified George Carlin, which I find wholly ironic.

Carlin died in June of 2008, at the age of 71. It’s hard to estimate Carlin’s value to Western culture; his Seven Words are, today, somewhat less powerful than they were in 1972, and no matter his one-time popularity, their deflation is not significantly his doing, I think. At his height, TV love scenes still involved the “one foot on the floor” rule, so for many the Carlin spiel was apparently quite powerful.

And while the Carlin canon pivots on those seven words, they were only part of his larger social commentary machine. No doubt the Richard Pryors and Steven Wrights of the world owe some small debt to George Carlin. As social creatures who, if unwittingly, make use of their exploits, we, too, owe him that debt. I often found his delivery a little too fond of itself, but maybe that was part of the bit.

However his cultural value is estimated, he seems one of the last of a dying breed, if not necessarily a dying species. I don’t get out much, but I don’t recall hearing of anyone with a similar currency, in terms of popularity and stickiness, who challenges social norms as a matter of principle. The closest might be Dave Chappelle, off the top of my head, but that’s not where I keep my good stuff so I’m not sure how much you should invest in that idea. Oprah Winfrey notwithstanding, the popular culture has changed over these decades of Carlin’s popular decline, and any market research company (including mine) will tell you that it’s increasingly difficult to intercept people’s attention. Some have called George Carlin a “hero”; I’m not sure anyone can become that kind of hero in a culture that cherishes its schisms and disaffection.

Does prayer beget opposable thumbs?

The University of Oxford has apparently taken £2M toward answering “Why do we believe in God?” From the Times article:

They will not attempt to solve the question of whether God exists but they will examine evidence to try to prove whether belief in God conferred an evolutionary advantage to mankind. They will also consider the possibility that faith developed as a byproduct of other human characteristics, such as sociability.

I found this in a discussion about the study in the Galilean Library, which is a wonderful resource for thoughtful, and almost more importantly, civil discussion across the philosophical spectrum. I added a (typically long-winded) point of consideration, summarized by saying that, insofar as “God” might just be a label for the collection of “all things we think we don’t know, or which are currently ineffable and beyond our scrutiny,” any such study, however expensive, really aims to investigate just one in a class of cultural metaphors, models, for things we want to understand.

There are undoubtedly flaws in viewing all our cognitive endeavors as merely the mapping of sensation onto lingual metaphors, as it is certain to be a crude approximation, if a valid approximation at all; but just as in the case of the infinite square well, first-order, crude models work well as insertion points to broader and/or deeper analysis. And lest it be thought that I’m of the same “science != religion” mindset I was just a few short years ago, here’s a hopefully fruitful excerpt.

Of course, science itself probably holds no special claim to some objectivity, some circumnavigation of our use of metaphors as models. Just because scientific models use polynomial notation, and are algebraically derivable, and embed quite a lot of effective structure in the mathematical language they inhabit, doesn’t mean they’re anything more than just intricate metaphors. Niels Bohr developed a pretty model for the atom, but this perfect picture has since been rebutted. It worked, though, and, in many contexts, is a fine picture of atomic structure. Physics students still make use of metaphors in university, e.g. the infinite square well. These are simplified pictures used to motivate a subtle concept, before their complexity is increased on the way to deeper understanding.

The specific question about whether belief in “God” conferred/confers an evolutionary benefit is interesting, and something I’ve pondered for a while. In general, it seems that people with some kind of workable model prosper to a greater degree than those without one, independent of the model’s approximation of “truth.” This is related, I think, to the results of a study concluding that those able to deceive themselves tended to be happier than those who weren’t, given that (a) models are only ever approximate, and (b) the utility of a model is probably proportionate to how strongly we believe in its explanatory power.

I need help, and I need to help.

Of all the memes wriggling across The One True Web (that social network that includes and exceeds the digital), I suggest that the greatest compels some to describe helpful normative domains, and compels others to seek these out.

I came across the following, via kottke.org, and while it’s not quite as concise as other efforts, “Taleb’s top life tips” nevertheless makes for interesting brain fodder. I’d care to hear which of these are particularly interesting to you.

  1. Scepticism is effortful and costly. It is better to be sceptical about matters of large consequences, and be imperfect, foolish and human in the small and the aesthetic.
  2. Go to parties. You can’t even start to know what you may find on the envelope of serendipity. If you suffer from agoraphobia, send colleagues.
  3. It’s not a good idea to take a forecast from someone wearing a tie. If possible, tease people who take themselves and their knowledge too seriously.
  4. Wear your best for your execution and stand dignified. Your last recourse against randomness is how you act—if you can’t control outcomes, you can control the elegance of your behaviour. You will always have the last word.
  5. Don’t disturb complicated systems that have been around for a very long time. We don’t understand their logic. Don’t pollute the planet. Leave it the way we found it, regardless of scientific “evidence”.
  6. Learn to fail with pride—and do so fast and cleanly. Maximise trial and error—by mastering the error part.
  7. Avoid losers. If you hear someone use the words “impossible”, “never”, “too difficult” too often, drop him or her from your social network. Never take “no” for an answer (conversely, take most “yeses” as “most probably”).
  8. Don’t read newspapers for the news (just for the gossip and, of course, profiles of authors). The best filter to know if the news matters is if you hear it in cafes, restaurants… or (again) parties.
  9. Hard work will get you a professorship or a BMW. You need both work and luck for a Booker, a Nobel or a private jet.
  10. Answer e-mails from junior people before more senior ones. Junior people have further to go and tend to remember who slighted them.

Additionally, though within a more focused scope (and from the same source), Kurt Vonnegut advises on how to write with style. Herewith, a full reproduction. (I’m open to suggestions if reproducing these here, even with proper credits, breaks with good sense.)

In Sum:

  1. Find a subject you care about
  2. Do not ramble, though
  3. Keep it simple
  4. Have guts to cut
  5. Sound like yourself
  6. Say what you mean
  7. Pity the readers

I find, for some of us, that the first is surprisingly difficult. I rather have a hard time choosing a subject I care most about; it’s my utter failure at committing to any one subject that leads to my abject frustration in making headway in any of them.

Mr. Kottke seems to revel in this meme. You can find similar material throughout his archives.

There’s an invisible hand beyond economics.

Some time ago, I read two pieces whose synthesis might be interpreted to say that all we can hope for in life is to concoct an artifice which distracts us from the impending and concurrent oblivion of purposelessness which would otherwise crush us. Herewith my earnest and hopefully readable attempt at such a synthetic enlightenment.

The Yin

From Whiskey by way of the singular Hydragenic:

… creativity is essentially an overwhelming presence of awareness, and may very well be mindfulness, and could be a form of meditation, or it could be more like lucid dreaming (outside of dreaming – as in, lucid wakefulness), or it could be the state attained through the creative mind, which seems to be on a whole different level of consciousness altogether.

Hydragenic responds “intuitively.”

Yes, yes, yes. And can we throw the word ‘otherness’ in there somehow as well? Creativity as an approaching of the divine via a process of lucid mindfulness that allows us to appreciate, however briefly and superficially, the intrinsic strangeness of everything other than oneself.

Hg soaks in this for a bit, and illuminates some deeper reflection, steeped over years and an embrace of the strange.

We’re all looking for meaning in life, one way or another. I’ve come to the conclusion that meaning comes from creativity. Creativity in its broadest sense: the bees in the garden gathering pollen to make honey, friends and family making relationships and babies, businesses concocting fascinating products, singers pulling together words and melodies, painters filling canvases with dreams and desires.

Thus, so far, we get the picture that meaning is a consequence of, and not a cause of nor reason for, our existence. A search for meaning, then, should not be a search for some pocket of import somewhere in the aether, should not be an effort to distill some back-of-the-universe answer to the question “What is the purpose of life?” Rather, we have but to create our meaning, and live our meaning.

Where, in other hands, this sort of inspection might yield a gaudy melodrama, Hg finds empowerment, in the attribution of meaning to those who would build their own.

To discover something is to encounter its essence, which the dictionary describes as “the basic, real, and invariable nature of a thing or its significant individual feature or features”. Its true identity, in other words: what makes it different to everything else. It strikes me that art – creativity – is the process of divining and defining uniqueness. It’s fine to make connections between things, but ultimately those things are separate.

Does that sound too bleak? I see it as strength, as infinite richness. Too abstract? We all encounter art on a daily basis, in one form or another. Too solipsistic? I can’t dispute that: all I am is all I am.

The Yang

Jared Christensen has touched on a parallel set of observations. Though mostly specific to the software industry (my own feeble gross overgeneralization of a culture, but it suffices), Jared’s piece provides a subtle but explicit hook to abstraction. And, anyway, the “software industry” is just an instance of a class of social dynamic systems, which allows for a natural comparison across common points of structure.

Sometimes I look around at the state of software, and systems in general, and wonder if they are run by [agents of chaos (my paraphrasing)]. Or perhaps the chaos began by human fallibility, but now the mess is willfully maintained in order to feed this ecosystem that thrives on the system failure. Do companies actually put overly complex, mildy [sic] destructive products out into the market, intentionally giving rise to and continuing to feed an ecosystem of other companies that thrive on repairing the damage? Are some systems designed to be so irritating and complex that whole industries must be erected to make sense of it (*cough* taxes)? Is broken the preferred state for the makers of some products and systems we interact with every day? And does the ecosystem have the power to perpetuate the failure, supplanting the creator’s will to rectify the problems?

Surely, you’ve heard arguments of similar logical structure applied to government, to law enforcement, to lawyers, to the profession of educators, to book publishers, and to practically innumerable other facets of our shared lives. Each year, hundreds of high school and university departments order the nth edition of Larry Hackajob’s monotonously uninspiring text for <some class>, not because it’s an improvement, but due to a self-propagated system of kickbacks and peer pressure designed not only to justify this subsequent edition, but to penalize the frugality of just using last year’s edition. Western societies, at least, and most industrialized ones, require work to justify the creation of jobs to buoy the societies themselves. Consider pieces of Roosevelt’s New Deal, which included allocations of the federal budget explicitly for the creation of jobs, not because there was a pressing need for any of them so much as to start the nation’s long economic repair. We might usually consider work to include an intrinsic necessity, but, while an indefatigably noble cause, and more, ethically necessary, these programs were artifacts. The theme echoes through time.

Unity

Drawing them together, I can’t help but conclude that all our efforts, all our endeavors, are not dissimilar from those Depression-era policies nor from the willful messes of agents of chaos. If, as Hg posits, meaning is our product and not the other way ’round, then we build our own temples to chaos and fabricate our need to aspire and achieve, no matter the ambition or arena. Assuming this, there’s absolutely no wonder so many of us are lost, listless, and sometimes paralyzed by the prospect of choosing a course.

Once, quite without meaning to, I advised someone to embrace just such a notion. When asked how to decide the rightness of an act or decision, I said something like, “Well, I don’t know, it’s hard to say; but maybe we just have to decide what we would want to be right, and stay as close to that as we can.”

I care about the answer non-academically, in a real, I-have-an-assignment-and-it-is-to-live-and-I-don’t-want-to-fail kind of way. As I told Hg, though, in a comment:

Truth be told, I’m a little lost on this question. There’s no greater sensibility to leaving the world than staying, so at the very least, an inclination to survive keeps me going. But if the most we can hope for is to revel in the possibilities of our creative efforts (in your quite useful, broad notion), that’s kind of just a dressed-up hedonism…or no?

Möbius

There’s no manual I’m aware of, except the one we’re writing and editing. Since any sense of meaning is necessarily a social one, I’m curious how others find their materials and what grammar they use to construct sense and order (unabashedly subjective, those) in life. So share.

I’m afraid to write. I’m afraid to sketch. I’m afraid to math. Sometimes, I’m afraid to put on a shirt.

This isn’t a story about neurosis. This isn’t about *phobia. This is a story about how it’s wrong to want to be right. That I’m concerned about cliché is only stronger evidence.

I sweat a little in the intimidating glare of a blank canvas of any material, e.g., canvas, paper, LCD, blackboard. Each new stroke, each new keypress, seems only to draw me closer to an imminent and unavoidable failure. I would not have honestly written that in 1989, when I claimed I would write my own exhaustive dictionary of the universe. Somewhere in the interim, I came to keep this fear as a daily companion. Think The Last Unicorn, who—upon learning of regret—slowly loses her magic. Except my case might be so common as to be archetypal.

I understand the practical need for a division of things into piles labeled “right” and “wrong,” especially if they don’t include moral things; that’s quite the nasty bog there, and I’ll (continue to) come back to it sometime soon. But think back to your grade-school days, to those waking hours when The Possible World was opened to you only by traversing The World of Correct Answers. While your teachers hopefully taught you of gerunds and multiplication tables and sea anemones and General Robert E. Lee, what you learned most of all was how to fail. Long after you’ve forgotten that 14 x 14 = 196, you remember how little you enjoyed school. You remember feeling that something wasn’t quite right, that you didn’t fit. You remember feeling that you were missing something, some special handbook, a leathery tome of gilt-edged pages with all the answers to both odd- and even-numbered problems, and an entire chapter dedicated to the best mnemonic rhymes.

Maybe you did quite well in school, actually; more’s the pity. That may only mean that you embraced the dichotomy, and that may only mean you’re going to fall harder when it fails to provide footing. The world is just as easily divided into “right” and “wrong” as a Möbius strip is divided into “top” and “bottom”: it kind of works, until you look a little closer.

We were ending when we should have been beginning. Maybe it’s inevitable that our pedagogy mirrors Darwinian ruthlessness, given that that’s our behavioral role model; and professional teachers merely parrot the species’ generalized expectations, so it’s not at their feet that I lay this. Our great^n-grandparents and their ancestors laid the gravel, the cobblestones, and the pavement on which we tread, and laid it with the best of intentions. It’s still a road to hell; anecdotally, I feel that a world population of more than six billion ever-more-densely-distributed humans ensures that, though the roads are necessarily wider, we’re even more likely to stay on them than to walk in the grass with bare feet. Like the early universe, in which symmetries broke and things started clumping together and the fundamental forces split and spent the next 13-odd billion years arguing for custody.

This isn’t a story about elementary school, though, and it’s not about population control. It’s a story about my intimate friendship with Fear, and how it’s ruined my friendship with Creativity; Creativity’s pretty laid back, requiring but one thing of her closest pals: bravery. The bravery to feel stupid and keep coming back, to stop trying what works and try something else, to be wrong in the face of an easier rightness. We’re distant friends anymore, though I try to call when I can.

Thing is, Fear, quite the diligent acquaintance, got me smoking and now borrows all my cigarettes. It used to be about the music, man, but that’s over and we don’t even have any groupies. Or a band. Or a smoking habit. Or any substantive writing or sketchworks or mathematical intuitions from the past 15 years.

I called Creativity yesterday and left a message. I rambled for a while, nervously, a little too plaintively, but she always liked that about me anyway. Afterwards, I dithered for twenty minutes over which of two grey T-shirts to wear.

Let me put you in your place.

God loves assholes. Seriously. Any god worth frothing over gets a stiffy when they see people acting on their prerogatives. Any god worth painstakingly describing in illuminated text TiVos premeditated murderers on the street, pacifists in jail, and that crazy man at the end of your street who wants to sell you a brand new hot tub that’s still in the wrapper and worth all of $2,500 only he’s not asking that much he just needs to get rid of it to help Brian liquidate since he’s losing his house. (True story.)

Cognition Ignition

I see people as systems with two primary sets of components:

  1. earnest motive power
  2. instruction sets

If you’ve ever tried to write a computer program, then you’ll remember how poorly your first five attempts matched your intent. The computer executed your code as precisely as it was written; you just didn’t code worth a damn. So it is with people: your actions begin as an earnestness, a motivation to do something, whatever that is; and you follow a set of rules, an instruction set, applying the energy of that earnest motive power.
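To make that gap concrete, here’s a hypothetical first attempt of my own invention, in Python. The machine does exactly what the code says, which is not what the coder meant:

```python
# Intent: compute the average of some test scores.
def average(scores):
    total = 0
    for s in scores:
        total += s
    return total // len(scores)  # bug: // truncates, so 90.5 becomes 90

# The computer follows the instructions precisely; the instructions are wrong.
print(average([90, 91]))  # prints 90, though the average is 90.5

# The earnest motive was fine all along; only the instruction set needed fixing.
def average_fixed(scores):
    return sum(scores) / len(scores)

print(average_fixed([90, 91]))  # prints 90.5
```

The motive power was never the problem; the rules it was poured through were.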

I can’t imagine a way that people truly act outside their earnest drive. You can’t fake motivation. However, by virtue of your lazy amassing of disorganized rules, you can

  1. render useless any hope of understanding your drive
  2. apply your motivation sloppily and with unforeseen effects
  3. dupe yourself into thinking you’re following the One True Instruction Set when such doesn’t exist
  4. abstract yourself from well-suited aspirations

To varying degrees, assholes evade these pitfalls. Now, there are the misguided assholes, who don’t realize it’s not really the way for them; you may even be one of them. Odds are, though, you’re not a real asshole; you take the shit you’re given, you say “Please” and “Thank you” even when you’re not sure you need to bother. And you can’t remember the last time you really pissed someone off. Honestly, that’s a skill set you need to develop; it’s not until you’ve pissed someone off that they’re truly invested anyway.

It’s Time

So, close your browser, stop reading CNN updates about protests against warm milk, get up, and go tell Ted to stop bragging about his gastrointestinal health. Tell him that no matter how regular he is, he’s still full of shit and couldn’t hack it in the real world outside his gym-and-latte set. There are millions of Teds, so you’re not really wasting a valuable commodity by enraging him to impotent silence.