Archive for November, 2013

By Tricia Lunt, English Faculty.

I recently got my first Smart Phone, a development that stunned my friends, some of whom thought I’d never get one. I didn’t really even decide to get a Smart Phone; my cell phone plan expired, and I was eligible for an upgrade. I’m certainly not a Luddite, so I’ll take any upgrade available. The phone is nice, certainly new and shiny. I am still determining how it works, and suppose I will be for some time. However, I never expect technology to “change” my life; at best, new technology might be able to provide tools to simplify tasks and augment human capacity (of course, technology has been known to serve the opposite purposes, too).

I like my Smart Phone just fine, and the newness of it means that the camera performs better than the digital one I’ve had for at least a decade. When comparing Smart Phones before purchasing one, I prioritized the camera because I love taking what I call “Fun Family Photos,” posting the best online, and printing out the truly great ones to fill magnetic frames on my refrigerator or to give as gifts to friends and family. For as many pictures as there are online, the scarcity of printed photos seems rather odd, and I do my best to preserve the tradition of displaying pictures of loved ones in my home. New technology does not require abandoning old ways.

I don’t invest much time or money in technology. There are two primary causes for this peculiar behavior. First of all, I don’t have much discretionary income. If you’ve been told that academia is overflowing with high salaries and annual bonuses, you were misinformed. Thus, in large part, the small number of technological devices I own stems from a lack of purchasing power, and also a desire to use my limited funds for things I consider more valuable, typically travel and visits with friends.

More fundamentally, I don’t gain much from the time I spend online. In the essay “Is There a There in Cyberspace,” John Perry Barlow addresses the often unmentioned limitations of the online world: “missing entirely, [are things] like body language, sex, death, tone of voice, clothing, beauty (or homeliness), weather, violence, vegetation, wildlife, pets, architecture, music, smells, sunlight, and that ol’ harvest moon. In short, most of the things that make my life real to me.” Interacting with people, for all their messiness and complications, is central to my well-being, and my relationship to the natural world informs a spiritual awareness of my place in the universe. Authentic experiences are my priority.

I like to limit my screen time, essentially because I’d rather be doing something else. I use Facebook, but not every day, and usually with the intention of maintaining and developing connections with the people in my life. I update my status to share the highlights of life with people who care. This is not to say that the way I use Facebook, or technology as a whole, is the “right” way and everyone else has it wrong; rather, it is the right balance for me. The brilliant Sherry Turkle, founder of MIT’s “Initiative on Technology and Self” and author of Alone Together: Why We Expect More from Technology and Less from Each Other, has proffered the notion of a “digital diet,” which involves taking an honest assessment of the amount of technology that might be reasonable in each particular life. Naturally, a computer programmer ought to expect to spend a lot of time in front of a computer, while a forest ranger should anticipate spending far less, though still some. For the most part, I need only spend time in front of a computer at work, thus eliminating the need for a home computer.

My concerns about the excessive use of technology arise primarily from the amount of time my students spend texting friends, checking social media, and browsing images online, all while they are supposed to be paying attention in class. Again, I am not suggesting that these activities in themselves are wrong, but the pervasiveness of the behaviors troubles me. Many of my students simply don’t seem capable of turning away from their devices. When I am in class teaching, my phone is always off, locked in my desk. Naturally, I have plenty of friends I could contact, but what I can do and what I should do are different things, a reality of adulthood that an over-reliance on technology tends to undermine. In class, I need to focus on the lesson, the students, the time and space we have together, and what we can accomplish together. I use technology in the classroom, but in a way that supports our joint purpose: to develop intellectual capacity. I’m also a proponent of online course materials because printing out copies of a ten-page syllabus isn’t necessary. However, I absolutely believe that paying attention to the people who populate our world (classmates, colleagues, neighbors, and family) is a fundamental human activity. We must not allow technology to distract us from the people in our midst.

When I see my students (or friends, or myself) turning to online social networks or texting to fill a void, I sense the vibrations of a wordless cry for help, subliminally broadcasting the central human need for connection, but in a dissociative way. “Entertain me, distract me,” we beg of our machines, oftentimes because engaging with what is in front of us, be it people or problems, requires more effort and investment. See Louis CK’s insightful and heart-rending rant. Sadly, technology can facilitate a turning away from one another, making us all feel more alone.

I try to resist the technology trap. I seek to be a constructive, critical user of advanced technology, endeavoring to master technological skills that can enhance my professional and personal life and support my relationships, never permitting technology to master me.


By Michael Stelzer Jocks, History Faculty.

Every national community has its dates of remembrance.  In the secular religion that is nationalism, these are the high-holy days of each year.  They may be days of celebration, or they may be days of mourning. They are always to be days of reflection. The American calendar is marked with a number of such dates. July 4th is a date of great joy, whereas December 7th is a date that has, most assuredly, lived in infamy.  The only thing that can overshadow a day of tragedy is a more recent example of national pain.  Thus, for most Americans today, December 7th has slowly given up its power to September 11th.

Do notice that these dates need no year to jog our collective national memory.  July 4th goes hand in hand with 1776.  That infamous December 7th took place in 1941.  September 11th will always, in some way, be a Tuesday morning in 2001.

Of course, I write this on a day that is an American holiday. November 11th is Veteran’s Day, but I think it is safe to assume that the particular date rings few, if any, national memory bells.  Though few Americans realize it, November 11th was not chosen at random to recognize our veterans.  As many Europeans will relate, the 11th day of November should always be equated with one particular year: 1918.  On that day, the armistice ending the “Great War” came into effect.


But, here is a question to ponder.  What if one day marked numerous events of national significance in one people’s history, both positive and negative?  Which year would a nation equate with that particular date? You may need to ask a German to discover an answer.  You see, November 9th is a recurring date of significance for the German nation. This date marked turning points in German, and, quite honestly, world history, in the years 1918, 1923, 1938, and 1989.

On November 9th, 1918, after four years of war, Kaiser Wilhelm, the emperor of Germany, abdicated his throne.  For many Germans, this political transformation was a surprising revelation that the war was all but lost.  For the Social Democrats, the abdication was an opportunity to create radical liberal reforms, in the hopes of making a new Germany.  For those on the left, November 9th was the symbolic first day of the Weimar Republic. To those on the radical right, this date would also mark the first instance of leftist (read: oftentimes Jewish) betrayal against the nation’s war effort.

On November 9th, 1923, a racist, militaristic political party known as the NSDAP, or Nazis, attempted to forcibly overthrow the Weimar government.  The so-called ‘Beer Hall Putsch’ was largely conceived and directed by Adolf Hitler, the young leader of the Nazis. Of course, the putsch was not successful. Hitler was sentenced to jail for a couple of years. But, while in prison, the ex-corporal would restructure the Nazi party, hoping for another national crisis that would lead to electoral victories for his organization.

On November 9th, 1938, the now ‘Führer’ Adolf Hitler, with his Propaganda Minister Joseph Goebbels, orchestrated a massive state-sanctioned pogrom against the German Jewish community.  During the evening of November 9th, and into November 10th, hundreds of synagogues were burned to the ground, roughly 100 German Jews were murdered or committed suicide, thousands of Jewish businesses and homes were ransacked and destroyed, and about 6,000 German Jews were sent to concentration camps.  In the weeks afterwards, the German Jewish community was ordered to pay a fine of one billion Reichsmarks to repair the damages.  Kristallnacht was a symbol of the ever-increasing radicalism of Nazi anti-Jewish measures that would eventually culminate in the Holocaust.


On November 9th, 1989, it seemed that the German people had had enough of the tragedies associated with this day.  24 years ago, thousands of West and East Berliners took to the streets, meeting at the Berlin Wall and starting to dismantle the concrete symbol of Communist repression.  The world was amazed as young and old alike took sledgehammers to the physical border between east and west. If you so choose, November 9th could now be a date that represents friendship and freedom.


German historian Michael Stürmer has labeled the 20th century ‘the German century’.  If this is the case, no date on the calendar formed and transformed our previous century of tragedy and triumph like November 9th.

By Peter Stern, Philosophy Faculty

A perhaps well-known line–my dear, dear Turtlettes–and more importantly, a favorite (of mine) Beatles song line about getting famous. Baby you can drive my car, yes you’re gonna be a star, baby you can drive my car–and baby I love you. Well, forget the last phrase about the love thing. It’s really all about fame, or what used to be called fame, or being famous, and thus, in case you don’t quite make it, almost being famous.

 


But I mention fame and even the Beatles because the question I’d like to explore and briefly digress upon (after all, brevity is the soul of wit) is whether our society today is significantly different from past societies with respect to the way it accords respect, recognition, or old-fashioned fame. Thus I propose we meditate for a few moments on the terms most frequently used in today’s world for well-known people who gain recognition, namely star, superstar, or celebrity.

To my way of thinking, we live in a celebrity-besotted society. Our public life, or at least our concern with what gets the most public attention, centers on celebritydom or the weather. I’m going to push the weather to the back burner and simply concentrate on celebritydomitis.

Let’s start by defining our terms, and the first term to define is celebrity. How should we define it? Well, let’s say a celebrity is a very well-known person who has achieved renown by doing something unusual. Usually the special achievement is related to the world of sports or entertainment, but it can also come about through a breakthrough achievement in the fields of business or technology or even politics. Thus names like Bill Gates and Steve Jobs are reasonably well known even in worlds outside their own areas of expertise. Politicians such as Bill and Hillary Clinton are also easily identified and probably known by more people than Mr. Gates or Mr. Jobs. Obviously this holds for President Obama as well.

Celebrityhood, however, isn’t simply a function of achievement, though achievement of a distinctive kind is often a key element in attaining celebrity status. Along with achievement, the term implies a mysterious element of glamour that fascinates the mind and leads people into the world of fantasy, where they can wonder and indeed fantasize about their favorite celebrity’s life. Celebritydom requires the full investment of a person’s id, ego, and superego, and celebrity status almost implies a kind of obsessional interest. Not that one must constantly obsess about the celebrity, but rather that the celebrity is capable of eliciting this kind of response.

Now as a human being, but more importantly, as a political observer and student of politics, what I find most remarkable about the development of celebrityhood is that it emerges in probably the most egalitarian society the world has ever seen. And also the most upwardly mobile. I’m reminded of the famous distinction first coined by Ferdinand Tönnies, a well-known German sociologist writing at the beginning of the 20th century. He claimed history showed the development of two kinds of societies: the first he called gemeinschaft, meaning that a person’s status was primarily determined by family or birth and society was organized hierarchically; the second he called gesellschaft, which defined people on the basis of their individual achievement and created a far more egalitarian structure of society. In the first type of society, upward mobility was relatively infrequent, while in the second type, upward mobility was built into the system and occurred routinely.

Our egalitarian, and we should add, very democratically based society clearly falls under the heading of a gesellschaft type of social system. Here, everyone is assumed to be created equal, though since people’s levels of achievement can differ, they can enjoy unequal degrees of social status. But again, the key point is that the justification for difference is tied to individual achievement. Thus in a radically egalitarian society difference can gain recognition, but the principle upon which it’s based is equality. You earn it or you don’t deserve it. The theoretical default position remains egalitarian. Society’s bedrock principle is the acknowledgement that we’re all created equal, whether we’re president of the United States or living in a homeless shelter.

To me, the phenomenon of the celebrity takes on a special status today because, in a radically egalitarian society like the one we now live in, it suggests that the principle of equality isn’t sufficiently strong to hold society together. Equality may be politically correct, but from a psychological standpoint, it can’t work. Why not? Because it’s too boring. It hath no relish of salvation in it. A standard uniformity leaves the average individual exhausted, flat, and dispirited. The soul needs some excitement and adventure. Even feeding it some mindless entertainment such as we see on reality TV beats a state of simple equality. Or to borrow a thought from Mr. Dostoevsky, an old-fashioned Russian traditional modernist who wrote, among a great many other works, Notes from the Underground: for people to be happy, they need magic, miracles, and authority.

By Paul Gaszak, English Faculty

(This post follows somewhat thematically with the wonderful posts written this week by Michael Stelzer Jocks and Tricia Lunt. Read those, too!)

The first time I heard of the internet was in the early 1990s while my family was at the Rosemont Convention Center. My parents were at a Pet Expo on the main floor, and my older brother and I sneaked upstairs to a small computer show.

My 8-year-old brain didn’t retain much from that afternoon, except a foggy memory of standing still and listening intently to a “grown up” (who was probably 23) tell us about this amazing computer program called Prodigy that allowed people to do amazing things like order items from Service Merchandise. My mind was absolutely blown. I could now order a blender, a vacuum cleaner, and a Philips CD-i without leaving my bedroom!

I linked out to info on “Prodigy,” “Service Merchandise,” and “Philips CD-i” for everyone under 30.

Twenty-three years later, this internet thing has really caught on! It’s WAY more than just a fancy alternative to mail order catalogs. And forget being “home” for internet access; lots of us have the internet in the palm of our hands – literally – with smartphones.

The extraordinary advancements in computers and mobile technology make me proud of human ingenuity, and I’m giddy to see what technology will come next.

But then there are days like yesterday, when I was stuck in traffic for three hours, that make me start to lose faith in humankind’s collective ability to be, ya know, smart n’ stuff.

My drive to work on a normal day takes about 50 minutes. A wee bit long, but not unreasonable. However, yesterday, rain coupled with car accidents turned I-57 and I-94 heading into Chicago into a stream of idle metal boxes. Rather than arrive an hour early to work, as planned, I arrived an hour late.


Like all people, I get angry in traffic. I’m angry at whoever caused the accident. I’m angry at all the other people around me for clogging the road by deciding they should also go to work today. I’m angry at myself for not clairvoyantly predicting this dilemma and setting out from home even earlier than I did.

On top of all of that, I’m angry at us humans, because whenever I’m stuck in traffic staring at a sea of brake lights, I always think the same thing: We’ve got to be able to do better than this.

Ford Model T

Automobiles have been around for nearly 130 years, and started becoming common over 100 years ago with the Ford Model T. A century later, we’re still driving around – sticking it out to the bitter, expensive, polluting, trafficky end. Sure, cars have improved, but they’re still cars. This is humankind’s brilliant solution to the simple problem of how to get from Point A to Point B: sit on top of four wheels and roll around slowly and inefficiently.

There has to be a better answer.

In the early 2000s, I overheard two of my college professors discussing a secret project that was in the news. Apparently, this project was for a new invention that would revolutionize transportation. I did some research, but everything about the project was kept extraordinarily quiet, except the deafening buzz surrounding the product’s unveiling. I imagined it could be the flying car, or personal spaceships, or teleportation. But what was it?

The Segway.

11 1/2 out of 13 tourists agree that it’s possible to raise both hands while riding a Segway.

Oh how revolutionary it is! Now tourists can take “walking” tours at 1.5x their normal speed! And mall cops can glide effortlessly between Auntie Anne’s and Mrs. Fields, all while striking two-wheeled terror into the hearts of the restless, mall-roaming youths.

But, I suppose I should at least give credit to the Segway for being something slightly different, because therein lies the true difficulty in creativity and innovation: nobody has thought of it yet. It’s easy to propose changes to what currently exists: make cars and trains and planes faster, make their fuel cleaner, make them more comfortable. Build bigger roads, build better rails. All of that may help, and all of it may improve our situation, but none of it is the ultimate answer to transportation. Inventing an ultimate answer from scratch – on any issue – is much more difficult.

Nonetheless, how is it that we can go from Prodigy to iPhones in under a quarter-century, but have been stuck with cars for more than a century? We humans are capable of such magnificent ingenuity, and yet simultaneously, we can be so creatively bankrupt as to accept never-ending brake lights as a solution to anything other than how to raise someone’s blood pressure.

C’mon, humanity. We’re smarter than this. We’ve got to be able to do better.

I hope.

By Tricia Lunt, English Faculty.

Among the many wonderful things that occurred at the wedding this past weekend (wedding post forthcoming) was the opportunity to talk to countless new people. One endearing woman kept pressing me for details about life in Chicago, particularly related to walking in Chicago. Did I really walk everywhere? Yes. So far? Yes, the bride and groom’s home (lovingly nicknamed McTedros Manor) is one mile from my apartment (Tricia’s Treehouse), and yes, I walk there often, and yes, I tend to walk or bike many miles every day. Her questions were amusing because her responses tended toward incredulity. She simply could not imagine a life filled with so much walking, while I feel exactly the opposite.

In addition to the loveliness associated with any walk through my neighborhood, there is always the potential of encountering a friend on the street. I regularly see Bryce out playing with his grey speckled dog, Trapper. I stop to chat with Gregg & Maddie while they are walking to the diner. I meet members of the Urban Family walking to and from the train, and we walk each other home, just because we can.

One of the most enjoyable ways to walk through my neighborhood is with my friend, Maria. She owns “the bar,” The Whirlaway Lounge (I may have mentioned it before), the one most beloved in the neighborhood. She is lovingly referred to as “the sweetest woman on the planet,” which is a damn-near perfect portrayal. Walking the neighborhood with her is especially delightful, because we can’t get two blocks before someone stops her to say “Hello.” The sense of community experienced while walking in my Chicago neighborhood is unrivaled.

When I was speaking to my friend Clark recently, he said, “So, tell me about your new apartment.”

I thought for a moment and said, “I really love my new walk to the train.”

“How will you like it in the winter?” he asked.

I replied, “Oh, I don’t mind. I like how quiet it is in the city on a snowy morning.”

I’m certain that I read an article about the benefits of a pedestrian lifestyle, but I can’t find it. The best I can find is an article that warns against the decline in walking (particularly in the United States), called “The Crisis of Pedestrianism,” from Slate.

Additionally, Ray Bradbury’s 1951 short story “The Pedestrian” shows the clairvoyance of the best writers. Here is an animated video inspired by the tale.

Like Bradbury’s character, I walk “to see.” I think of all the wonderful things I’ve seen on walks throughout North America; the leafy boulevards of Toronto, the red brick lanes in Boston, the elegant monuments in Washington DC, the bands playing on cobblestone streets in New Orleans, the relentless hills of San Francisco, the stucco enclaves in Old San Juan, and the shady cafes in Puerto Vallarta all hold special memories. I think of the immeasurable stretches of gorgeous beaches I’ve strolled down, and the thought of them makes me sigh.

In Europe, I crossed breathtaking bridges. On my first solo trip, I walked The Millennium Bridge in London from The Tate Modern to St. Paul’s Cathedral. I’ve also traversed bridges that have stood for hundreds of years: The Charles Bridge in Prague is a sentimental favorite, but the small stonework bridges in rural Ireland are endlessly appealing, especially when they span waterways as impossibly Seussian as the River Sneed.

Walking makes daily life seem more like traveling in my own town. If I pass a building or shop or café that seems interesting, I act like I would on vacation, and go inside. It was in this way that my friends (and newly married couple) Hanna and Ryan discovered an inviting new coffee shop just north on Milwaukee Avenue, eminently comfortable, and far less crowded than the ones just a mile south.

I derive inestimable pleasure from starting and ending my day with a fine walk. This morning I passed at least ten stunning crimson and auburn and gold trees, and I stopped to admire them, the individual trees. Walking affords me the time to look at individual trees. I bear witness to the slow progression of time and feel the seasons move themselves along at a walking pace. The seasons don’t rush to meet each other; they go slow.

Walking enlivens my body and mind. The first Flaneurs were great walkers, so it is only fitting that I am one, too.

 

By Michael Stelzer Jocks, History Faculty.


In my ‘Comparative Worldviews’ class, I enjoy asking my students if they think the story of humanity is one of progression, or decline.  A simple, but incredibly broad question to be sure.  Usually students will reply with some excellent nuanced answers, pointing out that such a simple dualistic question glosses over the complexities of our modern world.  Most point out that humanity has progressed, and continues to progress, in areas such as medicine, science and technology.  Though surrounded by it their whole lives, my students appreciate how quickly technology is advancing. However, some rightly point out that progression in one area of life can lead to decline in another.  It may be surprising to those who don’t interact with ‘millennials’ on a daily basis, but I find that most students feel that the progression of information and communication technology they have lived through has had radically negative social repercussions.

The above staged photo encapsulates the problem my students have with information technology.  I have heard the majority of young adults I teach argue that, though modern handheld computers provide us a deluge of instantaneous information, they are ‘also killing human interaction’.  In this belief, they are by no means alone.  It is almost becoming a cliché to state that cell-phones, texting, social media and constant internet access drive a wedge between humans, causing all sorts of existential threats. Texting causes a loss of spelling and grammar rules! Cellphones destroy interpersonal communication! Social media increases the opportunities for lying and narcissism! Cell phones destroy human empathy!   Humanity is evidently doomed if we keep going down the road we are travelling.

And yet… let’s look at a couple more pictures.


Two elderly couples reading newspapers

Now, what do you think of when you look at these two photographs?  I am going to make an assumption about your conclusions.  These pictures provide generally positive emotions, correct?  The photo of the young couple enjoying a leisurely read outdoors seems relaxing and romantic.  The picture on the right, with the two elderly couples, has a timelessly quaint aura.   Perhaps these husbands and wives have had this ritual of sitting on a park bench, reading the daily newspaper, for years, if not decades.  What could be more traditional; what could be more human?

These two photos are the antithesis of the top photo, right?

Not at all. These three pictures are more similar than different. Two people sitting at a table on their separate smartphones are doing essentially the same thing as the old couples sitting on the bench reading their respective papers. All of these people are socially isolated with an individually hand-held communication tool. What difference is there if the loving pair in the grass have a couple of novels, or a couple of iPhones?  The quality of their reading material may be the only difference; and even then, with e-readers, this may not be the case.  Both are lost in another world, one digital, the other paper-based.

And, yet, we do see a difference; on an emotional, visceral level, it just seems different.  But, why? Why is the first photo seen as dangerous and distasteful for the future health of all humanity, while the others are sweet, charming and heartwarming?  When I asked my students this question, one young woman stated that texting requires technology, and hence, the top picture is different.

But, wait!  Books are a technology as well.   The written word itself is a technology.  Neither is natural; they are both human cultural inventions. Mass-produced, hand-held books are only 500 or so years old.  The written word is about 10 times older. Over the centuries, these technologies have changed, but usually quite slowly; the change has seemed organic, even glacial, to someone living in our times of radical technological advancements.   But go back to any year before Gutenberg’s press, and you will discover a world of communication that is almost unrecognizable. After that radical invention made books a mass-produced commodity, ‘Chicken Littles’ predicted doom as a result.  Such warnings were even applied to the written word. Plato tells us that Socrates, who never wrote anything down, warned that the written word was dangerous since it,

“will create forgetfulness in the learners’ souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves. The specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality.”

I assume that in 50 years, if cellphones are still with us, pictures such as the one found at the top of this post will be seen as quaint and charming. There will undoubtedly be a new communication technology invented that will be blamed for the inevitable fall of all human interaction, or Western Civilization….or something. I kind of can’t wait to see that new technology.

By Peter Stern, Philosophy Faculty

My very dear Flaneuring Turtles, have you ever felt behooved? If you haven’t, do you ever wish you had? This question burns in my sweet breast, for I’m looking for soul mates, in this case people who have felt at one time or another quite or even very behooved, because I believe I’ve been smitten with a feeling of behoovement. Never having experienced being behooved before, I’m feeling both elated and a bit uncomfortable, unsure if it’s really behoovement I’m feeling.

Amidst this uncertainty, I’m going to proceed on the assumption that I am, indeed, feeling behooved, and what I’m feeling behooved about is my wish to provide some expert thoughts on politics and political activity, given the increased acrimony our political system seems to be generating and given that my area of academic expertise is supposed to be political science.

The simple point I wish to make, which hopefully will help clarify the muddy debates currently raging abroad in the land, is that the principles upon which our country is based are exceedingly complex, and so we shouldn’t be surprised if at various times in our history we find our politics rife with controversy.

On this, my maiden voyage out, I’ll briefly take up only one such principle, namely the core idea our political system depends on: equality, the principle first expressed in the Declaration of Independence, one of our country’s most important founding documents. There it says that it’s a self-evident truth that all men are created equal. Before proceeding further, let’s rid ourselves, at least for now, of one possible controversy and agree that the word “men” means human beings, or all men and women.

Now let’s examine this statement more closely, naively asking ourselves if we think this statement is really, really true. For instance, let’s look at the author of the Declaration, Thomas Jefferson. Does this man appear, today or way back in his time, to be every other person’s equal? Well, what about IQ? Probably Jefferson’s IQ was higher than that of most folks living in Virginia in the year 1776, especially in July of 1776, and it’s a good guess it’s higher than most people’s IQ even today. I mean, honesty compels me to admit it’s a lot higher than my IQ, seems to me.

Thomas Jefferson

But also, Jefferson was much taller than I am; he was a far better writer, thinker and, overall, a much more creative person than I can claim for my poor person. So trying to understand how I’m Jefferson’s equal presents a challenge to me, and to you too, my dear Turtle Dove, for you’re going to have at least as difficult a time as I’ve had showing how we (Jefferson and I) were created equal. And for all I know he was also created a better athlete than I am, not to mention being better looking while enjoying a better sense of humor than I have.

Indeed if we look around the room–any room– we’re likely to find people who are more creative, more intelligent, more athletic, and better looking than we are and maybe a fair number also enjoy better health than we do. Yet we might also notice that we’re ahead of the pack in a number of these areas. So a not surprising conclusion we might come to when we think about equality in terms of the gifts we’re created with is that their distribution isn’t equal, the Declaration of Independence notwithstanding.

But to fully appreciate the complex nature of equality we also need to survey our world in terms of the conditions in which we live our lives. And we’ll quickly find these conditions, like the gifts we’re born with, don’t seem equal. In fact, they display a remarkable degree of inequality. For instance, we see, hear, and read about large inequalities of wealth, health, status, power, recognition, and achievement.

If all the above is true, cold hard logic would probably force us to conclude that basing a political system on the concept of equality would prove a very difficult undertaking, which I believe it is. And it wouldn’t be surprising to find that many issues people care about become very controversial, because in many important ways people aren’t created equal. Consequently, treating people equally isn’t always such an easy thing to do. Moreover, determining exactly how equal people’s living conditions should be is also difficult, even for people who are created with far more intelligence and creativity than the majority of folks (myself included) seem to enjoy and make use of.