The Drams Reliquary
An Experiment in Elemental Maturity
Tuesday, June 29, 2010
Escapism at Fault
Most of my preferred forms of entertainment involve escapism. When given the downtime and opportunity I am no stranger to, say, sitting in a movie theatre and fully suspending my disbelief, rolling up a decent Dungeons & Dragons druid at table complete with awkwardly named animal companion, or finding that spot under a tree in the sunlight that allows me to not only revisit the carefree days of my youth, but also sink my teeth into any well-crafted thrill betwixt O. Henry and Dan Brown.

Yet, I find myself more and more frequently, more and more deeply, desiring richer forms of decompression. There was a time when traveling was the stress reliever I’d consistently swear by. Camping, while escapist in the sense of disconnecting from the modern world, was also proactive in engaging the natural one. I feel almost silly having to write that “I do remember going to parties.”

It happens. It’s life. We mature, and as we do we grow into different sets of priorities and responsibilities. We do so as our oldest friends might move away, as our work colleagues might never join us outside of the office, and as our original support systems give way to new ones that can tend to focus on that set of priorities rather than on you, the “needy” individual who’d arranged them.

And when this happens, it is no wonder why downtime might get more readily fractured and spent in microcosm. It is no surprise that chosen forms of entertainment for a contemporary adult are steered, perhaps for the first time in life, toward activities that involve lesser and lesser planning, lesser and lesser engagement with people, lesser cost, lesser durations, and frequently lesser return on your attempt to tame the stressors. That tropical island is far away and you can only afford the group rate, but television is right there. Organized volunteering can feel like work, but there’s an Entenmann’s cake on the counter within arm’s reach. What beauty could possibly be experienced at that poetry reading when you’ve got the ultimate beauty of your significant other right there, in-house? Let’s get real. What decent and career-driven parent has time for an underwater spelunking hobby?

So we read in half-hour clips. We flip on the video game and keep a steady finger poised on the pause button, prepping for the twenty interruptions that will likely befall our only biweekly block of “alone time.” We pretend that sleep is one hundred percent recuperative and that a Saturday stroll around the block is getting us somewhere.

These truisms I bring up for discussion are nothing new. Our entire Generation X and those coming up behind us have been aware of the dangers of poor stress relief decisions for a lifetime. Studies continue to show that the results of doing nothing about stress are particularly severe, a severity that increases with the number and complexity of stressors in our environments. So, in a sense, these words, written to re-familiarize readers with a problem that has not gone away, could be considered a case against escapism and for what I’ll call “engagism.” Written as such and supported by study, it would seem important not only to create the downtime in heftier spans, but to use that time in a single, uninterrupted and engaged practice that relieves pressures. It’s not that you have the vacation; it is what you do with it. The physical toll of stress takes a lot more to undo than most realize.

Still, I am somewhat more compelled to bring up a mental toll that my personal stress level has helped into being. Like the cause and effect written about above, we are also familiar with the common mental tolls of poorly exercised decompression mechanisms. Inability to concentrate, memory difficulties, lack of attention span, more frequent mistakes, verbal exchanges that take place in anger, sleep difficulties, disengaged behavior, some forms of depression: they are all mental products related to elevated stress. Everybody is further familiar with the proverbial extremes of this state. We speak of people who “crack” or “lose it,” indicating our belief in a personal threshold for mental stress that, when surpassed, has the power to change a person forever after.

I, like many, have been happy enough to accept both these common notions and these extreme notions of mental stress without further investigation. However, whereas I once might have thought those lists to be quite complete, when looking in on myself, I see what I believe is a disregarded and very dangerous mental reaction to stress.

Having opted so often for escapism over engagism in my grown-up entertainment selections, I’ve made that escapism a very, very commonplace response to my environment. It is so well practiced as my go-to idea of “fun” that I no longer think about it, re-examine it, question it, or challenge it. I’ve nullified choice. I’ve become my own Pavlov’s dog. Layer that well-burned neural pathway onto both the fact that the escapism doesn’t always yield fun and the fact that it is practiced in microcosm, with interruptions and split focus, and you start to see that the go-to idea no longer even serves the purpose of escaping. Escapism itself becomes less of a device for de-stressing and more of a cyclical form of thought that goes by without notice. Not really used for fun anymore, it’s now a familiar thought pattern that gets misapplied to almost anything else I might be too stressed to examine properly. In my case, as might be the case with many men, I’ve attached that thought process to my unfounded excuses.

What do I mean? I’ve realized very recently that, while I’ve always known I am far from perfect, I am also in no way living up to the current best that I can be. In my youth, this was an imperative. Always being the best I could be and however that stacked up against life was my prime directive. I understood it as my raison d’être through the touchier, feelier parenting techniques of my elders who dictated, “Win or lose, so long as you’ve done your best, that’s all that matters.” Being the best I could be was a self-evident truth that encompassed not only the clear hope of always expanding that possibility, creating new personal bests, but in a large way my core identity. It was a concept that simultaneously spoke to me as an individual (as my best would differ from that of others) and as a shared experience (in that so many would also try to be the best they could be).

Somehow, at some point, I’ve relaxed that concept. I’ve become lazy about it. And it’s my unexamined, escapist thought process that has allowed me to persist under the delusion that this is okay. See, whereas some husbands might take their escapism to a practiced extreme, devoting entire weeks to televised sports, video games, and as much food and sex as they can muster, trying to offset a rough spot at work, I’ve taken it to a vicious cycle in the mind.

Escapism, as a thought process, has become so second nature to me that my mind voluminously wanders into visions of me at my best, my best foot forward as a husband, father, friend, brother and cousin. These are visions that are currently fictional. I am picturing myself doing things that I know I can do. I’ve done them before. I envision myself doing new things, unique things, groundbreaking, life-altering things. I know I can accomplish those. Yet, all of a sudden, that vision feels like enough. I’m not actually performing; instead I’m accessing the ideas more frequently, as if the ideas alone could have a direct impact on building a better life. The sequences play over and over in my head and I oddly derive pleasure from them as I would from screening a blockbuster summer film or going to a concert. It seems my psyche and I have comfortably jumped to the false conclusion that, like other escapist devices, I can access this “best me” readily. As quickly as I might reach for a video game controller, so too can I change overnight into the best husband I can be. Well, what are the odds of that? Grabbing the controller is near zero effort. Being the best husband I can be is never-ending, prioritized, altruistic effort, a drastic change to contributions tomorrow that I’m not even remotely making today. The video game eventually gets shut off. The best me cannot. The best me shut off a long time ago and, apart from elevating awareness via this blog, look at the horrible place I took it to.

Sure, some might contend that even this lesser version of me is somehow better than the bests of select others, but it’s not about comparisons. It’s about selfhood. In losing track of my best, I lose my identity and with that go my roles, my relationships, and my destiny. I share because I think a lot of people may actually be experiencing something similar, and I would ask for your personal advice. As a self-proclaimed deep thinker, I’d hoped that mere acknowledgment of what I was going through would help to overcome it. Thus far, that’s proven not to be the case against this particular malaise.

My search, though still in its infancy, has revealed to me the concrete necessity of motivators. Knowing what needs to be done cannot lead to accomplishment in the absence of working motivators. And therein lies the defeatist in me. I look and I look and I look and, while I am astonishingly impassioned to be my best version of a husband, to be my best version of a father, as of yet I find nothing that honestly motivates my needed change. I haven’t found one that works for me anymore. I currently experience more motivation to escape to the mental images of my top-notch self than to become that self. It’s slowly killing me.

Sunday, June 27, 2010
Two Breeds of Confusion
It’s intriguing to note that our free society’s continued push to further define and acknowledge commonalities shared by all citizens somehow fails to consider any basic traits deemed negative by that same society. Confusion is among these. After all, even the most practiced thinkers cannot claim to have escaped the occasional state of outright confusion. Being confused is a trait we all share.

I suspect we ignore this widely shared facet of the human condition, perhaps, because it contains little or no legislative value. Seriously, how would society benefit from, say, a “right to be confused?” No, confusion gets dismissed by philosophers and logicians alike and we carry that “permission” to dismiss it very deeply into our everyday lives. We ignore the concept of confusion, even fear it, fear how we might be freshly perceived if suddenly caught in a moment of confusion. Will it ruin all future hopes of a fruitful career? Will our friends abandon us if they find us to be confused more often than not? Will adversaries use such moments as opportunities to pounce? The sometimes paralytic fear of even acknowledging this common, nascent brain response leads otherwise smart people to sit in ignorant silence rather than simply posing questions to navigate themselves into the know.

I, for one, think it is beneficial to examine confusion, at least as much as it is to examine any play of the mind. Doing so is not an easy task. Examination is a cerebral undertaking and by common definition confusion is exactly when such mental efforts fail any demarcation of sense. How do you employ a technique to examine a state wherein that very technique must be absent? Dissecting confusion, one gets the feel of an “X” that cannot be solved for in an equation. The “X” gets moved around, back and forth, as different techniques are applied to either side of the formula, still ultimately resulting in a function of “X,” rather than any new or exact knowledge about the value of “X” itself. Confusion, like the unknown, is a negative space in the thought process, most difficult to penetrate. It is like trying to study pure chaos.

Since appropriately penetrating confusion on its own terms, then, is unlikely if not impossible, I start right at the edge of the issue, the brink. I consider the cause(s) of confusion. It is possible to derive at least part of the nature of confusion by examining its causes. In my own attempt, I’ve noted that the many causes I can identify tend to fit into two major categories, which implies then that there might be two (or more) different types of confusion.


Common Confusion

First, there is the confusion precipitated when the presented information is too new or too plentiful to process. In a sense, this is when the subject matter is “above” us, “over our heads.” There is often an unacknowledged, educational gap between what we know on a topic and what a speaker knows on that topic.

This is confusion in our animal sense. Examine a wolf being systematically driven off by ranchers. That wolf might “smartly” learn to avoid tire tracks, fences, and daylight raids, learning ever more in its collected associations with humankind. But when the effort to thwart it is concerted and suddenly the wolf is faced with the sound of loud gunfire and engines from multiple directions, its practiced mental and physical behaviors get befuddled. The wolf might dart right and left in short sprints, freeze, growl, run toward the noise. It can exhibit unpredictable, confused behavior. We suffer this as well. We share this brand of confusion with much of the animal kingdom. When faced with possible danger for, perhaps, the first time in one’s life, many of our reactions directly and immediately contradict not only our own practiced behaviors, but also the most reasonable escape tactics. It is why we have self-defense classes: to train our bodies and minds to overcome that particular neurological short circuit.

Confusion in a conversation or debate, when lives are not on the line, can still be a direct outgrowth of this blitzed string of behaviorally questionable reactions. We relate our understanding of the world around us to self-preservation because we see ourselves as more than the physical. We define the self as a deeper set of intangible qualities, among them our ability as humans to be smart. In the absence of that understanding, we feel threatened. We can feel threatened by people smarter than we are. Presented with a subject we’ve never before examined, or hearing a subject communicated in a poor or contradictory manner, our thoughts can race all around for an explanation, sometimes so wildly that we freeze up, shut down, retreat, or check out of the dialogue. Neurologically, this is little different from the mammalian fear response: synapses firing in brand new, overproduced fashions, processing so many possible courses of action at once that none are mindfully prioritized as the best solution. We get stuck. We simply fear appearing stupid to others, and in our reaction to that fear we stop listening, stop learning. It is easier in the moment to “conclude” that figuratively sticking our heads in the sand will do a better job of preserving the self than to acknowledge our own confusion and better the self, long term, by choosing to learn from what another has to offer. This is undoubtedly the most common form of confusion, both because no one is exempt from their animal origins and because of the high frequency with which EVERYBODY experiences being the less knowledgeable participant on any number of given topics.


Thinkers' Confusion

The second, less evident cause of confusion seems to be of a more elusive nature. It is a more mature classification of confusion that, while it exists, lies beyond the purely animal and reactionary form described above. After all, there exist precepts, as in sociology, that describe the evolution of human intellect as, in some ways, a developmental measure to counteract instinct. That is the seemingly material purpose of human thought…overcoming, overcoming all. It’s reasonable to assume, then, that humans have a commonplace capacity to overcome even their own animal confusion.

Humans, through choice and effort, can mature to a point where they do not fear sounding silly, losing face, impacting their social reputation. They can develop into an arena of thought wherein, while still learning, they are fully comfortable with openly exploring and expressing why they, themselves, might be incorrect. They do not fear attracting a stupidity label while expressing thoughts or questions because those thoughts, those ideas, are seen as entities unto themselves, not part of the mind, the soul, the self. Ideas and knowledge are out there, cycling through the human condition free of charge. They are fruits of humanity ready to be plucked and gathered, but never viscerally owned by any one person or another. Knowledge can be shared, perhaps it can only be shared, and those who come to understand this possibility can pattern their mindsets in distinct opposition to animal confusion. They openly risk revealing their intellect as wanting at every turn so as to gain an even minutely greater perspective in the exchange.

Make no mistake. When I identify persons practicing the free abandon of any concern over labels, I do not include those who’ve done so through reckless abandon, people who care little about what others think, but who allow their perspective to stop at that one, solitary conclusion. We’ve generational throngs of non-thinkers and immature thinkers who isolate themselves from new ideas, constructive concepts, and assertions over which they’ve long ago decided they’ll not entertain any further input. These are folks who’ve passively identified the richness to be found in a fearless intellect, but who do little or none of the work to arrive there. They hide behind the sometimes righteous, but always too convenient notion that they need not at all concern themselves with what others think of them. Certainly, it is difficult, if not impossible, to conduct one’s life structured to please all peers. I’m no advocate of allowing others to define you. However, this be-all, end-all refuge stated as, “Why should I care what anyone else thinks of me?” all too often gets misapplied. It is wielded as license to ignore others, to dismiss their perspectives, to disallow contradictory thought or superior arguments from creeping into one’s view of the world. They view their idea as their possession, their own, part of the self while others who perceive ideas as free-for-all pose a threat to that comparatively myopic existence. The conscious practice of ignorance fails any measure of maturity.

To illustrate this second type of confusion, I am instead grouping together thinkers who’ve so often put themselves upon the riskier path of looking foolish in mere hopes of widening their intellectual experience, that they’ve completely desensitized themselves to any embarrassment involved with “sounding stupid.” These are the folks who’ve walked the mental walk, long term, as opposed to those who’ve shut off in a single, uninformed decision. They’ve matured so far through inevitable animal confusion that they’ve ceased to experience it without navigating through to new understanding. They will continue to come across that animal confusion. There will always be another person with newer or greater, even contradictory information to offer. Yet they’ve managed to separate out any mammalian fear response from their reaction to that broader insight. The synapses do not short circuit or fail to prioritize. Instead, these listeners build bridges of understanding. Asking questions and taking all answers at face value, they learn when unprepared, when unready. Particularizing each sequence of challenge and response until such time as they can properly assimilate the external information into a furtherance of their own global comprehension, these mature thinkers persist and probe, revel and celebrate, laud the very ideas that would otherwise put them to shame.

This mental practice, however, comes not without its price. For while the mature thinkers of this persuasion look ever onward and actually seek out those with knowledge superior to their own (most new and desired confusions now proving barely a bump in the road to broader understanding), these same persons inexorably alienate themselves from droves of very smart people who cannot make the same leap. A line is unwittingly drawn in the sand. Their own personal development is separatist in nature. They’ve ascended to a thought process in which many others either cannot or will not engage. It is from this delineated talent pool that we create experts, innovators, and world-bettering deciders. This “club” is not exclusive to notable names either. There is no PhD required. Anyone, any person who both remains on this expansive mental journey and simply dedicates time to thinking, to observation, to experimentation, to questions, to challenges, to enrichment, to research, to balance, to fact, to expression, to debate, anyone can share in that enlightenment. Everyone thinks. But as a group, these better practiced thinkers tend to be viewed as the smartest among us. They are those we’d ask for advice, those whose warnings we’d heed. They are the people we look to for inspiration and the ones whose answers we most trust.

So what is the second type of confusion? If these thinkers are so mature, so “unaffected” by animalistic bewilderment, so equipped to envelop information provided by thinkers who’ve surpassed even them, what could possibly send such minds reeling? The second grade of confusion stems from the comparatively uninformed assertions offered by those who could not make the cognitive leap. Grouped together, broad thinkers are collectively geared toward expanding their existing comprehension, changing perception to fit what they cannot disprove in the moment. As a group, however, they are often bereft of the tools best used to properly address a lesser informed viewpoint. By their very defined drive, the expansive thinkers must always presume that newly encountered information can lead to betterment, that their understanding can somehow be expanded to fit the novel data. The source of the information is less relevant. So, ironically, the truer thinker is veritably forced to, at least momentarily, treat morons on a par with geniuses. They must treat all speakers in between as if that invisible line in the sand represents nothing. Such a brief necessity frequently causes a type of confusion all its own.

It means that any human being offering information unfamiliar to the expansive thinker can insert even the smallest, completely fictional detail into an exchange, and the broad thinker must then expansively re-examine everything s/he has ever come to know in an attempt to meaningfully comprehend the unfounded comment without dismissing it out of hand. The comment is an unintended monkey wrench. The less factual the detail, the greater the confusion. Hard-core, proven, practiced knowledge bases within the broader thinker’s repertoire are self-challenged, circling through proof after proof, example after example, modifier after modifier, trying to locate and explain the very legitimacy that the speaker could not lend his/her own comment. Broad thinkers must presume they’ve missed something in their growth and find all the indicated holes in some Swiss cheese upbringing. Yes, the thinker can engage in a line of questioning to navigate through as before, but what is mentally blueprinted as a through line to greater wisdom in this case goes instead through to acknowledging the information as false. Immature thinkers dismiss. Mature thinkers disprove.

So there are, at least, two kinds of confusion. The first, a generalized, animal reaction to what we do not understand. The second, a cognitive attempt to navigate through the maze of our own examined comprehension to a new, suggested exit that does not exist. All humans have the capacity to experience both forms, but only a select group will journey far enough to recognize the difference between the two.


What To Do With Confusion

Why is it important to break up the subject of confusion in this way? Inappropriately, the instilling of confusion in a debate, disagreement, or argument is often used as a tactic. It is presumed that confusion is an equalizer, that it is a shared and useful tool in the absence of some rule or etiquette against it. The presumption goes, if you are confused, all you need do to end the argument is to confuse the other person as well. The presumption furthers, if you hold your own and confuse your opponent, you win. Explaining the duality of confusion reveals this commonplace presumption as incorrect. It shows that one side of a disagreement can be severely lesser examined, lesser informed, and lesser justified despite that point’s asserter being able to momentarily confuse a contravening speaker. It emphasizes that just as one noted expert should be given far more attention on his/her studied subject than the lodger of a random opinion, so too should the expert’s brief confusion, if present, be regarded with far less importance than a similar moment of pause on the part of someone who’s failed to fully explore the content.

It is a hierarchy. One form of confusion clearly trumps the other and is not nearly the “gotcha” that we presume it to be. Think this is a matter for formalized debate? Think again. Allowing plenty of room for multiple perspectives to be of simultaneous merit, how often does any disagreement in the home, the office, or everyday life pit two fighters of perfectly equal mental adeptness (on a subject) against each other? Rarely, if ever. That said, two narrow thinkers might never reach agreement, their arguments ending in a huff. Two broad thinkers might always reach agreement, both prepped to find resolution and understanding. A narrow thinker versus a broad thinker might instead keep the debate rolling forward forever, the former constantly hoping to gain that elusive and everlasting win by tripping up the latter through means of tactic over content. Sorry, small minds. You can lose out even to sheer confusion.

Monday, May 17, 2010
Unlocking Ethic: Coining “Propulsive Morality”
I never grew attached enough to the apartment that I rent to refer to it with a tricky tongue. I don’t call it a domicile or dwelling, even in jest. I barely utter the lighter words “my place” or “home.” Attaching these ideas to my over-priced and under-modernized unit just seems too permanent, as if the words themselves could wash away all hopes of future homeownership or dreams of one day retiring to an outdoor area with greater square footage than a shoe.

That doesn’t change the fact, however, that it is this very apartment that my miraculous daughter will always know as her first home. She came to us, what seems like yesterday, more beautiful than autumn dusk and as smart as those who’d stop to ponder it, sprouting all her roots here as she instantaneously shot up to age two. And it is in this crumbling box, doubling as her perfect castle, that she and I might never have a fruitful dialogue on thought if I should fail to make time to think. What I think about today is this veritable cubicle with closets and how, while it may have perturbed me before, I dislike it anew for calling my parenthood into question.

We’ve a city. There are door locks. Ours are many, five if you total entryway and apartment. Enter the landlord with all good intentions and a Stanley screwdriver, adding a sixth lock to the storm door, free of charge and without a request to do so. Safety six, convenience zero; I can now get to the second-story roof of my building more easily than I can get from my porch to my kitchen.

Pure force of habit drove me to initially overlook the new lock, leaving it unclasped more than once. And while we as a family were no less safe than the day before lock six had joined the keychain puzzle, my “mistake” somehow sanctioned a multidirectional barrage of snide commentary. “You musts,” and “Don’t forgets,” and “How could yous,” and “That’s irresponsibles” flurried down on me like my own personal ticker tape parade of shame. The neighbor, the landlord, the mother, the landlord’s spouse, the friend, the friend of the friend, the dumpster-diving passer-by; they all precipitated a collective onus, attributing an importance to that quarter-inch keyhole and verbally flailing me for dismissing the same.

That ambient yapping was easy enough to ignore. I was aware of my “oversight” and actively trying to change my behavior. Plus, most of the reasons offered as to why “I must” lock the new lock sounded like little more than, “Because it’s a lock. That’s what locks do.”

They weren’t thinking. I really wasn’t thinking. Had it stopped there, I’d have nothing to contribute to a blog about mature thought. Then the nightmares started.

For my part, I was soon tackling the subject solo. It doesn’t take a complex game of word association to understand how my mind went from lock to security, security to crime, crime to worst case scenario. If ever you plan to be a new parent, any parent, I suggest you take an active mental hiatus from working worst case scenarios into your thought process. The imagery your own mind can muster is horrifying enough, to say nothing of the knowledge that real life acts matching that imagery are how locks leapt onto doors in the first place, warranted or not. Yet as horrifying as those mental flashes were, nothing prepared me for what I thought of next.

While I believe it best never to blame oneself for another person’s misdeeds, I suddenly realized that I truly don’t know what I would do if ever horror visited our “home” after I had locked all but that final lock. That lock could be the deterrent. That last lock could have bought time enough to make a difference. The jiggle at that lock could be the one that would wake me or the task that would re-time an intruder’s entrance to police patrols. That lock, in essence, could be loosely viewed, with no crime committed, as the mindful difference between a good parent and a bad one.

So it raises the thinker’s question: where does it end? If the landlord decides to put 38 locks on the door, am I a bad parent if I only lock 37 of them? I think most people would say “no,” but post-emergency I would know, very clearly, that I didn’t do EVERYTHING I could have to prevent disaster. I’m well aware and in full acceptance of the notion that a crook unstopped by 10 locks is unlikely to be deterred by an 11th. Yet, while that would seem to place full fault squarely upon the deviant, I ask if there wouldn’t be a small portion of that fault that would factually be my own. Perhaps that small portion is not fault, but simply contribution. I might never cognitively or emotionally reconcile my contribution of ignoring any one lock. Sure, mathematically, my contribution reads less and less with every extra lock we add to the equation, but where is the threshold? At what point can an emotionally mature individual determine through all acceptable measures that one step shy of X is prudent, but one beyond is overkill?
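
Purely as an aside, and as a toy model of my own invention rather than anything the paragraph above asserts, here is a minimal Python sketch of that shrinking arithmetic: assume, crudely, that skipping one lock out of n counts as leaving 1/n of the available precaution undone.

    # Toy model (my assumption, not the post's claim): one ignored lock out
    # of n leaves 1/n of the available precaution undone.
    for n in (5, 6, 10, 38, 100):
        contribution = 1 / n
        print(f"{n:>3} locks, one left unclasped -> {contribution:.3f} of the precaution undone")

    # The fraction shrinks with every added lock but never reaches zero,
    # which is exactly why the arithmetic alone never hands back a threshold.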

I refer to the elusive nature of this presumably common sense threshold in our thinker’s question as “propulsive morality.” I choose the phrase because it illustrates how the internal and external measures of ethic are stretched ever outward, like an unending and irreversible vector away from manageable criteria into the chaotically complex through a practiced compulsion to add a theoretical plus one. “Propulsive morality,” by whatever name, exists and is problematic at its core. It seems to snake through culture generationally, perhaps spurred on by litigation, politicking, propaganda, poor forensic practice, isolationism, nihilism, cognitive lock, oppression, reality television, psychological scotomata, or immaturity.

Let’s examine. “Propulsive morality” first presumes that a given situation (five locks) succeeds at an agreeable measure of ethic so long as any arguably mitigating addition to that situation (a sixth lock) remains theoretical. In this state, both parties seem to be on common ground and therefore each can be egalitarian in her/his treatment of the other. “You lock all your locks regularly. Great!” However, that mutual nicety is subject to an enormous loophole. When two or more parties start out in such theoretical agreement, it takes only a single individual to later physically and autonomously add in the formerly theoretical next step. “We agreed on that yesterday, yes, but I just added this new lock and you should see it my way now too. You’d be wrong not to use it.” Parties never require of themselves the related compromise defining a needed threshold, a distant point down the chain of their new disagreement that would once again bring them together, both acquiescing that a tiny step further would prove ridiculous or needlessly redundant. So, with that personal requirement absent, loose judgments are given license to propel and propel, ad infinitum, until a negative aspersion can be cast. As a missile seeks heat, “propulsive morality” seeks disparagement. It stretches as far as it must to reach its predestined marginalization and thereafter frames the breadth of that stretch, falsely, as an effort comparable to hard-core logic.

The aspersions come from others, yes, predominantly, but they can also wind their way into the psyche when one takes stock of oneself, sometimes for no other reason than the fact that, perhaps, no one has pointed out this flawed practice to the thinker.

What I am here calling “propulsive morality” seems to warrant so potent a description because such disconcerting words are only a fraction of the lasting, negative impact that can be experienced each time this default “methodology” is employed. I may have simply taken note, egotistically, as I would a pet peeve, and tried to coin a phrase to describe it. Yet I am certain there are philosophy experts, psychologists, and ethicists out there who can offer great insight on what it is that I am describing. Perhaps the practice already has a name with great study devoted to it; but try cross-referencing my meager description in a search engine or card catalog. I do not share the vocabulary of the established social sciences.

Locks-wise, legally, I’m covered. Culpability aptly befalls a theoretical intruder. Ethically? I’m not exactly certain. I, myself, would judge me as a bad parent if I had regularly forced my daughter to miss just one meal a day, despite feeding her all the others. I, myself, would judge me as a bad parent if I let her stew in a dirty diaper just once a day, despite changing all the others. Why not the sixth lock? The thirty-eighth? Bars? A handgun? A shotgun? A force field? No. Until I hunt down and assimilate greater insights from those studied on the subject, all my locks get locked, all the time, no matter how plentiful, no matter how redundant or inconvenient. It serves my notions of fatherly responsibility, of course. However, that decision also thrusts me into this undefined mind game where I need to play along, endlessly, despite the fact that I disagree. I’m sorry, but when do we take stock of the danger of that level of compromise? Is it more dangerous to my daughter that a sixth of six locks not get clasped on occasion, or that she might learn from me the less than artful life lesson of giving in? I’m guessing that mature minds would view the latter as more impactful, agreeing on substance. Yet, if the latter is “in fact” more dangerous to her, why then does a good-parent label get lent to the giving in, and a bad-parent label to the person who questions how far compromise should reach?

Thursday, January 21, 2010
Harvesting Fact from the Realm of Opinion
It was asked, "What was the most important thing that happened in history?"

The question was answered and the answer was sheer opinion.

It was then asked, "What was the most important thing to happen in the history of history?"

Again the question was answered, this time to a few raised eyebrows, and again the answer was clearly an opinion.


There was silence, the panel stumped. It seemed that having asked these questions of a sample group of 5,000 people was getting them nowhere. They hadn't expected anything other than opinions when they'd set out in search of a factual importance, but by now they thought they would have at least been surprised with a revelation or two somewhere in those 10,000 answers. It seemed time to give up.

Brady was not the most educated among them. He didn't have a gaggle of extra letters bending his signature over the end of the page, but he was the most mature thinker among them, open to learning during the process even if the process took them to an unexpected goal. It was Brady who broke the silence, not with a "eureka!" or a "j'accuse!" but with a simple comment.

Said Brady, "The fact is that the two answers have to be different."

------------------------------------------------------------------------------

The Explanation

In life’s archaic arena, fact-mongers rumble with opinion-mongers to the figurative death. Better-rounded humans have become a rarer breed, found neither on the arena floor nor in the audience. It tries the nerves.

While I prefer matters of fact myself, I feel compelled to remind my fellow fact enthusiasts that some full-fledged facts are not arrived at through the rigors of the scientific method, nor crisply via mathematical proof. Einstein, for example, while illustrating most of his “discoveries” using math, took math to a place outside of then-established proofs, and he additionally could not yet harness the massive energies needed to practically “test” his own theories on gravity and time. Some of his “facts,” facts that hold true today and after testing, were then, for want of a simpler description, identified by layering one complex idea onto another.

Leave test tubes and Pythagoras aside for the moment. Fact can come from a comparison of pure idea to pure idea, if treated critically, and that is a living hell for the factual, data-driven, examples-rich debater. Why? Because among ideas there are opinions, and that means that while we stand there and cry foul over a “lesser” arguer asserting opinion as if on equal forensic par with fact, to remain the superior or the matured debater, we must at least acknowledge this “opiniondom-to-fact” possibility, however remote.

Deriving fact from the opinion domain can be illustrated, in part, by posing two strenuously similar questions that are both matters of opinion, and drawing out from their answers a concerted result.

My example is thus. If I were to ask, “What was the most important thing to happen in history?” answers would vary. Answers to this question are clearly a matter of opinion. If I were to instead ask, “What was the most important thing to happen in the history of history?” the answers would similarly vary. The second question is also clearly within the realm of opinion-only responses.


The mature examiner, however, can make the short leap to fact from there. There is fact that can be extracted by comparing the two answers. The easy leap gets stated, “It is a fact that both those answers MUST be opinions.” Children can make that leap. The more difficult leap is the realization that those two answers, while yielded from highly similar queries, MUST be different answers. This is a FACT.

History is the period of time from when humankind started recording events to whatever future time it stops doing so. Though used loosely in common speech, the word “history” does not include the Big Bang, the beginning of life on Earth, dinosaurs, or Pangaea. Those are prehistoric, pre-history, before history. History itself is basically the timeline of “stuff” written down. It is the collection of every event captured since writing began. Therefore, the “history of history” is, by contrast, a collection of only the events that impacted the recording process.

So, while a person’s opinion might be that the most important thing to have happened in history was, say, FDR getting elected President of The United States, that same person’s opinion on the most important thing to have happened in the history of history would be more like alphabets or the printing press or television. Despite the similarity in the questions and despite the very loose and overlapping usage of the term “history” in both questions, the answers MUST be different. Must equals fact, and facts can be built upon in furtherance of the discussion.

------------------------------------------------------------------------------

Common Rebuttals and the Points that Address Them

A quick, gut reaction might find you thinking, “The example is poor. Any two different questions will have different answers. That ‘fact’ is not such a leap.”
  • Well, first of all, unlike the example, this knee-jerk reaction doesn’t always hold true. Very different questions can sometimes have the exact same answers, especially when left open to the boundless realm of human opinion. “What is your favorite color?” can have the same answer as “What color should the city paint the bridge?”, two far more disparate questions than those in the “history” example. Questions that do not even share a single key word can possibly lead to the same answer. “What does this department need to do to win the company’s Best Practice Award?” will likely not, but CAN, have the exact same answer as, “How should middle-management innovate to increase the profit margin?” The point is, almost any two diverse questions can have the same answer and can have different answers. There is no MUST that can be applied throughout, and therefore no fact to the assertion that “any two questions will have different answers.” The gut reaction is an uninformed one that fails to see the importance of when a MUST is present.

  • The second pitfall of the assertion, besides its fervent untruth, is the asserter’s failure to understand how infinitely similar the questions must be for the sake of the example, the experiment. A person would be hard pressed to create two different opinionated questions on history that are closer in meaning than the two cited in the example. Yes, they are, in essence, different questions, but they are different questions chosen to be as similar as they can possibly be. Their difference in both language and meaning is but a scintilla of deviation. These sample questions are not proximal in expression merely because they repeat words, but because of how the implied meaning of the repeated word so slightly changes the intent of the sentence. The example could have alternately used “the timing of time” or “the burden of burden” or even the pop culture mainstay “I like him, but I don’t like him, like him” to illustrate. Selecting words purposefully to create an implied overlap, as would a poet, is a device that draws the perceived similarity between two sentences closer than it would be through repetition alone. We really couldn’t say that a sentence like “My dog is tired” is closer in meaning to “My dog’s dog is tired” than “history” is to the “history of history.” Two dogs are completely separate entities, while the two timelines can and do overlap. The point is that there is something to be learned when two questions so close in sound and in meaning MUST yield different answers.


It has also been argued that the example’s conclusion is false, that the two answers need not necessarily prove different. Think of a person, say, a learned professional historian specializing in periods captured mostly by the printing press. Could s/he have the same answer to both questions? After all, while the printing press captures history in the form of written text distributed en masse, its creation and first-time use was an “event” in history too. Could a single person be of the opinion that the greatest thing to happen in history was the printing press and also be of the opinion that the greatest thing to happen in the history of history was the printing press?

  • While free thinkers tend to dislike statements that would quash positivism, the answer is no. If a person claims to be of this opinion, s/he has misunderstood the depth of the sample questions and given an incorrect response, an answer offered without the critical examination necessary for a mature interchange. The system or mechanism that records an event can never be as important as or more important than the event that it records. Such would defy reason. The recording system is subsequent to the event being recorded. The event has to be important enough for people to make the effort to commit that event to posterity. The effort, while frequently important in and of itself, cannot be as important as the event(s) that drove the effort into being. If it could, we would be perfectly happy to laud the invention of printing presses that never printed anything. We don’t do that. Both logic and common sense concur that an unused invention is akin to failure, not to importance. So, while a recording system like broadcast television might ostensibly be more important than just some of the things we record on it, like a dog food commercial, it could not be the MOST important thing in history because, at the very least, it would still be subsequent to the one most important event it had ever delivered into people’s homes.

Critics have also claimed that the example is unfinished. Some agree that this is an illustration of fact from the field of opinions, but contend that its avowal is not founded. The example asserts that a FACT like this can be built upon, yet it fails to profess just how to build.

  • This is a separate issue. The example illustrates how fact can be derived from an inherently opinion-driven exercise. The idea that all facts or that only facts can be built upon to reach valid conclusions is a separate proof. Yet, if the citing of further examples might help to better illustrate the “history” example, I offer the following builds as a start.

  1. I can build on the FACT that these two answers MUST be different by dropping them into logical modifiers. IF a person’s two answers MUST be different, AND they are NOT, THEN the conclusion(s) that person will draw on that subject will be INCORRECTLY reached. (A short sketch of this conditional appears just after this list.)
  2. I can build on the FACT that these two answers must be different by allowing the content to better inform my other opinions or to show better proofs of other facts.
  3. I can build on the FACT that the two answers must be different by gaining a properly vetted statistic on how many people answer the question incorrectly and I can use that statistic to further an argument in, say, a court of law.
  4. I can even build on the FACT that these two answers must be different as a philosophical illustration of human limitation, compared to, say, postulating the existence of a deity who might otherwise state, “I am that I am.” The two human answers MUST be different while the god’s or perceived god’s answers CAN be “magically,” but understandably, free from that FACTUAL limitation.
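
To make the first build concrete, here is a minimal Python sketch of that IF/AND/THEN modifier. It is purely illustrative; the function and variable names are my own and not part of the argument above.

    # A toy check of build #1: IF the two answers MUST differ AND a respondent's
    # answers do NOT, THEN any conclusion built on that answer was incorrectly reached.
    MUST_DIFFER = True  # the FACT derived in the example above

    def conclusion_validly_reached(answer_history, answer_history_of_history):
        answers_differ = answer_history != answer_history_of_history
        return not (MUST_DIFFER and not answers_differ)

    # The hypothetical historian who names the printing press both times:
    print(conclusion_validly_reached("printing press", "printing press"))  # False
    print(conclusion_validly_reached("FDR's election", "printing press"))  # True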


Another critique of the relationship between the two sample answers states that I make a falsely phrased allegation. The criticism notes that the FACT asserted is not taken directly from an actual opinion which might read, “The Golden Age of Greece was the most important thing to have ever happened in history.” This criticism highlights the notion that if one does not draw her/his factual conclusion from THE OPINIONATED ANSWERS, then one cannot claim, in this way, that facts can be drawn from opinions at all.

I appreciate this rebuttal. It points out that while the answers MUST be different and while the answers MUST be opinions, the formulation of these resulting MUSTS comes instead from the greater logical construct of the example, a sequence of ideas that is already structured and therefore already a practical, working, formulaic model. The rebuttal is metacognitive. In short, the retort claims that I am actually making facts from facts and not from opinions. I’ve two counters to consider, however.

  • First, you’ll please note that nowhere herein is it stated that facts are harvested from opinions, but rather from the realm of opinion. Of course that sounds like splitting hairs until one considers that this infinitesimal difference in phrasing is the only minor adjustment necessary to include the entire argument in the otherwise powerfully exclusive arena of fact. Like the adjoining “history of history,” the curious pair “opinion” and “realm of opinion” inextricably overlap, whilst implying just enough contrast to validly launch opinions into discourses that would otherwise dismiss them. No rational person is going to look at a question like, “What’s the most important thing in history?” and pretend not to understand exactly how the phrase relates to opinion. While the construct of comparison is formulaic, there remains an inherent link to pure opinion when the comparison is made over content strategically chosen to reach opiniondom.

  • My second counter to this rebuttal is that if one could prove that I was deceptively “making facts from facts,” and was therefore asserting myself poorly when I include the word “opinion,” that person would have proved my point for me. It is a widely held belief that fact CANNOT come from the realm of opinion. I wish to show that it can. If one’s rebuttal starts by assuming there are first-place facts from which I draw my second-place facts, then, on inherently opinionated content, those first-place facts make the case for me. I need not make my argument. The history of history never comes into play. If you look at two such opinion-driven questions and claim that I am starting with facts, then you have just practiced the cognitive journey that I claim you can. You’ve just done what I said you could do, nullifying your case for stating it cannot be done that way.

------------------------------------------------------------------------------

Ramifications


My case for the facts that can be harvested from the realm of opinion is not, of course, license for everyone with an opinion to view their idea as automatically and equally comparable to established facts. One of my favorite quotes, from Dr. Carl Sagan, illustrates why this cannot be done: “The well-meaning contention that all ideas have equal merit seems to me little different from the disastrous contention that no ideas have any merit.” Should you prefer more common terminology, Sagan’s quote is very similar to an idea expressed in Disney/Pixar’s The Incredibles, which contends, “If everybody is SUPER, then nobody is.” No, opinions themselves, by their very nature, stand to be ruled out by facts during verbal conflict. I give opinions no such license.

Rather, as so frequently seems the case, drawing fact from opiniondom creates more work for the thinkers in the debate. It means that to retain the mature status through which a thinking debater or arguer hopes to communicate, an acknowledgement of this opiniondom-to-fact possibility must be ever-present. It means a debater seasoned with this understanding might be doing the mental work for both of the speakers, looking out for a conflux of opinionated statements uttered by one's counterpart which could, even accidentally, yield a mitigating fact. In essence, to be mature, while you lay out the process of how two plus two equals four to your listener, you must simultaneously be receiving how they make ten plus six equal four, even if you have to silently do the math for them (the math in this case being hours on a standard clock, a numbers system all its own). It is not enough to “win” or to reach agreement while standing in the face of a “lesser” arguer. It only approaches “enough” when “winning” or reaching agreement after having stood instead in the face of every related and findable fact, not just the few that another, single person was equipped to offer.
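
For anyone who wants the clock math spelled out, here is a minimal Python sketch, purely for illustration (the function name is my own), of how ten plus six can legitimately land on four once the number system is the twelve-hour clock rather than the ordinary integers.

    # In ordinary integer arithmetic, ten plus six is sixteen, never four.
    assert 10 + 6 == 16

    def clock_add(hour, hours_later, cycle=12):
        # Add hours on a standard clock face, wrapping past 12 back around.
        result = (hour + hours_later) % cycle
        return cycle if result == 0 else result  # a clock face reads 12, not 0

    # Six hours after 10 o'clock, the clock reads 4 o'clock: ten plus six "equals" four.
    assert clock_add(10, 6) == 4
    # Two plus two, meanwhile, still lands on four in either system.
    assert clock_add(2, 2) == 4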

Opening oneself up to facts garnered from the realm of opinion is the more mature path in that it acknowledges the discourse as a journey, a journey through which further learning can happen. Alternatively, a discourse in which neither party budges from an original standpoint illustrates a mental lock or cognitive lock indicative of immaturity. By definition, the person learning is more mature than the person not learning. That means fact-based thinkers are beholden to this possible concurrence, even at the expense of doing all the fair work for both arguers. Just as it has been postulated that three monkeys banging away on typewriters, if given until infinity, would eventually write Hamlet (by accident), so too does the collectively infinite realm of human opinion yield entire crops of useful facts (sometimes by accident). The mature debater is the one who leaves the conflict wiser. The wiser person is the one who seeks out all ways to disprove the self.

Monday, January 4, 2010
Global Climate Change: Not A Line Item Debate
His was a curious comment, angling out there in cyberspace as if to purposely instigate a tweeting battle. It did. Hurrah, bold typist. I’d been aware for some time that Go-Text-Charlie did not put any stock in global climate change, and I was additionally cognizant of the fact that each beer in Charlie added a new level of anger to his expressiveness on the subject whilst prying him ever more emotionally open to tell you why he FELT climatological results were an international conspiracy.

I guess, described in that way, it might be easy to dismiss Charlie. His single chat board opinion seemed a fringe venture, whatever one’s political bent. Yet, what you don’t know about Go-Text-Charlie is his decades spent as No-Text-Charlie, years replete with informed ideas, thoughts, opinions, and feelings largely kept to himself. They were and still are years highly steeped in logic and reason, selflessness and intellect. Charlie mulled over all his innermost undertakings as if locked in a vault with nothing but math, method, and Descartes to guide him. Those rare, whispered assertions he would share were always reserved for a privileged few listeners who’d marvel at what his mind and his untainted practices had accomplished.

Newly unconcerned about “backlash,” he posted publicly, and disappointingly. His novelty comment claimed that climatology was bereft of scientific method. The fight was picked, the sides chosen, and the barrage of point and counterpoint that flooded hurtful ones and zeros through the ether astounded me.

Amidst that chaotic throe I found myself genuinely uninterested in adding to what passed for “debate.” Charlie means something to me. I was certain that I could set up a proof to dissuade him from his stance, convincingly, but I didn’t want to throw in with the scrappers and the moaners or get lost in the torrent. I needed to mature away from the word skirmish and the infuriating fact that every reply to Charlie’s comment was little more than a platitude, a sound bite heard a thousand times over, or a partisan campaign motto. Not one contributor, not even Charlie, offered anything more than what s/he’d heard elsewhere. The “argument” was definitively immature at best.

My attempt at maturity, I share. I offer it, hopefully, as a sobering aspect to this supposed “debate.” Charlie is not out there generating his own climate studies, gathering his own data, conducting his own experiments scientifically. And, to be fair, Charlie, neither am I. In this regard we are equals, equally versed in what we don’t know. Given that we don’t know, we have chosen to listen to others for our information. I think it logically sweeps away all the open sermonizing and pat generic responses attached to your comment when we realize that our choice resolves the issue. You have chosen to listen to the wise politicians, the studied pundits, the talk radio icons, and the popular hardliners who fervently disagree with the congruent results of independent global studies. I have chosen to listen to the people who conducted the studies.
