
By Michael Stelzer Jocks, History Faculty.

One of the biggest entertainment stories of the last week was the announcement that Stephen Colbert will be taking over for David Letterman as the host of The Late Show next year. This news has created some waves, and not just on the entertainment pages. Many fans of Colbert have expressed concern that he will no longer be playing the role of ‘Stephen Colbert of The Colbert Report‘, and instead will be simply ‘the real Stephen Colbert’. ‘Report’ watchers want Colbert to retain his egotistical, self-loving, arrogant conservative political commentator persona in his new role. It looks like they are going to be in for a disappointment.

For others though, that fake conservative persona is exactly why they feel Colbert should not be allowed to take over for Letterman. Colbert is obviously liberal, and most of his satire has been directed towards the more conservative talking-heads in today’s Washington and in the world of cable news. Hence, Colbert’s humor can make a good many influential figures squirm. Bill O’Reilly, upon whom Colbert’s character is most obviously based, recently called Colbert a ‘deceiver’ and an ‘ideological fanatic’. Being on CBS will give Colbert a much larger sounding board, and this is frightening to many like ‘Papa Bear’. Thus, after the announcement that Colbert would be succeeding Letterman, Rush Limbaugh vituperatively claimed that Colbert’s hire meant that ‘CBS had declared war on the heartland of America.’ This may have been classic Limbaugh hyperbole; or, perhaps he actually believes all Colbert watchers live exclusively in New York and San Francisco. Either way, he loses here.

Personally, I love Colbert. Whether he is ‘The Stephen Colbert of The Colbert Report‘ or the ‘real’ Stephen Colbert makes little difference to me; he is funny as hell either way. But, I do have one concern about his move to CBS, and that is this: What is going to happen to Colbert’s guest line-ups? Colbert’s choice of guests over the years on ‘The Report’ has been nothing short of revolutionary. Learning from his big-brother program ‘The Daily Show with Jon Stewart’, Colbert provides a breath of fresh intellectual air amongst the staid landscape of mindless TV talk shows. While most talk shows interview celebrities, or cutesy human-interest guests, Colbert (and Stewart) have continually provided their audiences with a wide range of less famous, but more important guests. Sure, Will Ferrell will sit down with Stephen one night; but on the next night, Jane Goodall will stop by. Brad Pitt on Monday; Tuesday and Wednesday, Neil deGrasse Tyson and Steven Pinker.

I am concerned these eclectic, intellectual guest line-ups will be lost with Colbert taking over for Letterman. Just take a look at who the headlining guests on The Late Show have been during the last three months:

  • Michael Strahan
  • Julia Roberts
  • Drew Brees
  • Dr. Phil
  • George Clooney
  • Jack Hanna
  • Tom Selleck
  • Etc, etc.

Now, here is a short sample of Colbert’s guests during the same period:

  • Scott Stossel (Journalist for The Atlantic/Author of My Age of Anxiety.)
  • Michael Chabon (Author of The Amazing Adventures of Kavalier &amp; Clay/On to speak about Ernest Hemingway)
  • Patricia Churchland (Neurophilosopher)
  • Drew Brees (Hey, a match!)
  • Paul Krugman (Princeton Economist/NY Times contributor)
  • Brian Greene (Physicist)
  • Alexander Payne (Director of Nebraska)
  • Simon Schama (Historian)

If he keeps such guests, Colbert’s move could be a radical change for American late-night.  If he doesn’t, and becomes just another Leno or Letterman, viewers will have lost more than simply his fake conservative persona.



By Michael Stelzer Jocks, History Faculty.

When it comes to my pop culture proclivities, I’ve always been a bit of an Anglophile. My late high-school, early college years were when my English-mania reached its climax. Though embarrassing now, I felt it absolutely appropriate to dress like, and style my long-lost hair in the manner of, my favorite maudlin and/or ironic English singers. In high school, it was a pompadour and t-shirt with blazer, a la 1987 Morrissey. By college, it was a shaggy moptop with stripy sweaters a la Damon Albarn of Blur. At the time I thought I could pull this off.

By the time I graduated from college, I honestly didn’t have the energy any longer to style my hair in a particular fashion. Plus, I had my girlfriend, and eventual wife, who rightly felt the look went from a cute sign of style to something much closer to pretension. I came to realize that there is a certain point when putting daisies in your back pocket and carrying around a volume of Oscar Wilde is just sad.

An inside joke for fans of the Smiths.

Though my fashion changed, my musical tastes still focused upon the English pop music of the 1980s and 1990s. In high school, my cohorts were obsessed with the Nirvana/Pearl Jam/Smashing Pumpkins grunge movement; I enjoyed a more dour line-up of The Cure, Depeche Mode and The Smiths. By college, people were fighting over West Coast vs. East Coast rap; I was concerning myself with the Oasis vs. Blur quarrel (Blur is MUCH better, by the way). Some of my university acquaintances spent their nights listening to Phish, The Grateful Dead, and Blues Traveler; I…well, I just couldn’t stand that crap. Still can’t.

When we moved to Chicago 15 years ago, my Britpop obsession had cooled considerably. Now in my mid (okay, late) 30s, I thought my Anglophilia had finally died. Then, a couple weeks ago, an import from the Islands rejuvenated my love. But, it wasn’t music this time.

One evening, I was looking for a good historical documentary to watch on Netflix. Not much there. Figured I might as well check out PBS. Nope, nothing really on. Finally, I realized, ‘of course! Youtube!’ Sure enough, searching ‘historical documentary’ got me quite a list of shows to watch (835,000 hits, to be exact). How to decide? Well, I quickly learned to look for three letters: BBC. Youtube was awash in BBC historical programs. After watching a couple, I was amazed at their quality and seriousness. For instance, I found a wonderfully intriguing three-part documentary written and hosted by one of my favorite historians, Mary Beard. Beard is a classicist at Cambridge, and though she doesn’t write about my specialities, she is wonderful at humanizing the people of the ancient world. Just take a look at how she deals with Roman toilets (yes, toilets) in this scene from her series ‘Meet The Romans’ (jump to 24 mins):

Now tell me that is not interesting! Nothing at all fancy about the production; no special effects; no actors; nothing ‘EXTREME’ or ‘SHOCKING’ or pseudohistorical. Just an expert telling the viewer about a time period and a long-gone people she loves. Such shows are stirring my Anglophile nature once again. But, I must be honest. This love is mixed with a serious degree of jealousy. I mean, why can’t we produce works like this in America? Instead, we have The History Channel. AAARRRGGGHHH! How I hate The History Channel. Let’s just take a look at what the History Channel has on its two stations in the next couple days, shall we? Oh, great, ‘Swamp People’! Hey, ‘Ancient Aliens’! How historically challenging! Wait a minute, don’t forget ‘Jurassic Fight Club’! Sounds like a really enlightening program.

I just can’t figure out why Americans have such a limited understanding of history….Wait, what was that? ‘Pawn Stars’ is on the History Channel at 6:30? Never mind.

By Michael Stelzer Jocks, History Faculty. 

If there is anything I have learned from studying history the last twenty years (my goodness, I can’t believe it has been that long since I began my undergraduate studies), it is that the past affects every aspect of our lives. This took me a while to grasp, since as a teenager and twenty-something, I assumed my worldview was a self-created thing; I thought that I had the power to pick and choose what I wanted from the ideas and memories of yesteryear. Studying history in all its guises has made me see that I was a foolish kid. All our lives are molded by the most idiosyncratic remnants of days long gone.

With this in mind, let me give you an odd historical example illustrating how mentalities don’t die, though humans do.




François-Marie Arouet, better known as Voltaire, died May 30th, 1778.  A playwright, philosopher, novelist, political thinker, and much more, Voltaire was, and still is, understood as being a giant of the 18th century era known as the Enlightenment.  Though not an outspoken political radical, Voltaire was a champion of revolutionary cultural ideals.  Most infamously in his day, he was an often harsh critic of organized religion and, specifically, the Catholic Church.  Here is one of many of his anti-clerical statements:

Every sensible man, every honest man, must hold the Christian sect in horror. ‘But what shall we substitute in its place?’, you say. What? A ferocious animal has sucked the blood of my relatives. I tell you to rid yourselves of this beast, and you ask me what you shall put in its place?

In 18th century Europe, holding such opinions, much less stating them, was a dangerous proposition. Voltaire played with fire, which made him one of the most admired, most feared, and most despised men of European letters at the time of his death. It would take eleven years, and the anti-clerical French Revolution, to redeem Voltaire’s memory.

The Revolution of 1789 and its adherents waxed and waned in their feelings towards religion.  Some were outright atheists.  Some were deists.  Some were romantic Christians.  As a whole however, the Revolution as a political movement would try to control religion, either by making the church subservient to the nation, or even by transforming the revelatory nature of Christianity into the naturally rational cult of a faceless Supreme Being.  Hence, by 1791, Voltaire was transformed from being a dangerous, though popular rebel, to a nationally recognized prophet of the French nation.

The French Revolution and the French nation had martyrs and saints. Voltaire would become the latter. He didn’t die for the cause, but he did face persecution for his beliefs by a ‘tyrannical’ French pre-revolutionary state, and he would need to be recognized as such. What better way to do so than moving his mortal remains to the Revolutionary state’s temple, the Pantheon? Nothing really new to all this hullabaloo. Each nation recognizes those early forebears and seers who foreshadowed the nation. America is no different. Think Lincoln, Washington, and their respective monuments. However, this story veers in an unexpected direction. Friends and enemies of the Revolution began to fight regarding Voltaire’s state of decomposition.


Moving Voltaire to the Pantheon

As the French historian Antoine de Baecque points out in his book, Glory and Terror: Seven Deaths Under the French Revolution, the state of Voltaire’s remains was controversial. After disinterring the body of the great man, two conflicting sets of rumors began to spread. Amongst the friends of the Revolution, it soon became gospel that Voltaire’s body was perfectly preserved, 13 years after being buried (he had been embalmed, so this makes some sense). But there was more: The Voltaire lovers relayed seemingly miraculous stories. Not only were Voltaire’s remains perfectly preserved, but they also smelled….good. The body was not decomposed, and had a sweet bouquet. On the other hand, the enemies of the Revolution and the haters of Voltaire gossiped the opposite. Voltaire was actually a disgusting, rotted piece of decomposed flesh that was embarrassingly earthly. The smell of the remains in this story, instead of being sweet, was far worse than one would expect. It was as if the infidel’s remains had the whiff of hellish brimstone about them.

What in the world was this all about? Well, to understand this ghoulish argument, we need to realize that this discourse of bodily remains was much older than Voltaire. The Catholic church, going all the way back to its earliest days, argued for the incorruptibility of their saints’ bodies. It would be proof of sacredness if a saint’s body was incorruptible; it would be a sign of God’s love if the dead saint smelled not rancid, but delightful. So, when the argument over Voltaire’s body arose, it was done so in the discourse of Catholicism. What the what? Superstition’s most famous enemy was now being turned into a saint by those whom he influenced. History does indeed repeat itself.

I love this story for two reasons. First, it is just a weird and unforgettable tale, showing the strange beliefs of humans. Second, and more importantly, it is a perfect example of what effect the past can have on all of us. Even the French Revolutionaries, those who hoped to create the world anew, and in many ways did so, still could not escape their bygone forerunners. They were locked into a rut of history. You and I are no different.

By Michael Stelzer Jocks, History Faculty.

My youngest daughter turned five last October. For her birthday, her aunt and uncle, my sister-in-law and brother, got her a funky pair of pink-rimmed glasses. She was extremely excited, and so was her older sister. The seven-year-old sis instantly knew what she wanted for her upcoming birthday. ‘I want some glasses just like that!’

When December rolled around, said older daughter got a package in the mail from said aunt and uncle.  Sure enough, inside was a new pair of glasses.  Happy day!

Neither of my girls need glasses to read or to see far away (unlike their parents), and so these glasses are simply fashion accessories. They wear them some days, and not others.  Often, when they want to ‘dress up’ fancy, they will break out their frames.  Wearing them to school, or preschool is all about the image.

I would be remiss not to point out how wonderful I find this. The perception surrounding glasses seems to be evolving from when I was a kid. Back then, there was a stigma to wearing glasses, and that stigma was an American tradition. It was so common that you can even find the normalization of this stigma in children’s books of my era. Take for instance Marc Brown’s book Arthur’s Eyes, in which Arthur the Aardvark needs to get glasses. The first day he shows up at the bus stop with his new eye-wear, his friends laugh at him. His best friend Buster even calls him a ‘freak’. In 1979, when this book was published, glasses were obviously a symbol of the social outsider that everyone, even children, could recognize. If my daughters’ friends and classmates are any indication, this traditional stigma is dissipating among kids today.

What a revolutionary change  this could be for American culture!  Just look at the twentieth-century outsider terms for those who wore glasses: Nerds, geeks, and eggheads.  These people were outsiders in schools, at parties and within pop-culture because they were intellectuals. Glasses=brainiacs=social outcasts. Perhaps now this stereotype is transforming. Perhaps being smart is becoming, dare I say it, cool?

I hope so, but I want glasses to remain a perceived sign of intelligence, since the psychological process called ‘enclothed cognition‘ may make this perception into a reality. Put simply, ‘enclothed cognition’ studies have found that wearing certain clothes can have positive or negative effects on cognitive processes. Wearing a lab coat can make people think more clearly; wearing exercise clothes will make people want to work out more. As far as I know, studies have never been done regarding the effect of wearing glasses on our cognitive processes. But, it seems only logical that the perception that glasses make you look smarter will make you feel smarter, which, in turn, will actually make you smarter.


Are glasses going to remain cool, or is this just a fad?  I don’t know. All I know is that I will keep pushing my kids to wear glasses, even if they never need them for medical reasons. They and their friends may or may not think it makes them look smarter; there is no question in my mind it makes them look cute.

By Michael Stelzer Jocks, History Faculty.

Each of the last three terms, I have taught RMU students about the Holocaust. I created this course on history’s most infamous genocide, and it is, compared to most of the survey history classes our students take, extremely detailed. To properly cover such a topic within 10 weeks is quite challenging. One hurdle to face is the seemingly simple question: Where to begin? Should the course focus solely upon the Twentieth Century? Or, should it range back to the earliest days of European Antisemitism; perhaps even back to the break of Christianity from Judaism? It is a difficult issue, but, after teaching the course numerous times, I have a methodology. The first class in the course focuses upon Christian Antisemitism and anti-Judaism from the earliest days, down to the beginnings of the early modern European world (circa 1600).


Antisemitism as a term was first used by anti-Jewish political parties

Obviously, this is a great deal of information to dole out to students in 90 minutes, and though I think I have gotten pretty good at painting with a broad historical analytical brush, I recently realized I faced a problem in this initial class. The first couple times I taught the course, I quickly jumped into the history of Antisemitism, using the term Antisemitism over and over during my first lecture. Most students seemed interested, and appeared to recognize the word. Then, maybe a year ago, when I mentioned Antisemitism for the first time in class, I noticed a furrowed brow or two among my students. Hmmm. Why the confusion? Then, it struck me: These students don’t recognize the term. Sure enough, when I asked my students who knew what Antisemitism was, I only saw a tentative smattering of hands. My mind zoomed back to my previous courses. What if the vast majority of my students had NO idea what I meant in any of those classes when I first used the term Antisemitism?

I jumped into action.  I needed to clearly define the term.  Or, better yet, I would ask my students to find a definition for me.

Understand that I write this not as a critique of my students, but as a critique of myself. I had been making the worst assumption a teacher can make. I lazily figured that my students have the same information in their heads that I do. The power of this classroom incident really struck home for me recently when I stumbled upon a wonderful, important article in The Atlantic titled, “To Read Dickens, It Helps to Know About French History and the Bible.” Jessica Lahey, the writer of the article, is a middle-school teacher. She realized that for her students to really understand, and hence, enjoy Dickens’ classic A Tale of Two Cities, they would need to be ‘culturally literate’ in terms of French 18th century history and the New Testament. To provide this cultural background, Lahey now begins each of her classes with important terms and ideas that will clarify the necessary material for that day.

Lahey does this for her 8th graders, but, this is not something that should be exclusive to age or grade level. Such introduction to ‘cultural literacy’ is a constant of thorough education. Without it, the student suffers. However, it often must be handled with kid gloves. The introduction of ‘cultural literacy’ should never be done in a spirit of elite superiority. Let me give one personal anecdote to prove my point. I particularly remember a graduate school instructor of mine who often portrayed the students’ lack of cultural literacy as an incredible failure on their parts. One example: In his 19th century German history course, this grad professor asked me and the rest of the students about a Greek history reference we stumbled upon in a work by Nietzsche (I think). No one in the class recognized the reference. Our professor was visibly dismayed.

He huffed his frustration, mentioning that the writer was obviously referring to ‘Thermopylae’ and the 300 Spartans who died there facing a vastly greater Persian force. (This classroom incident took place several years before the hit film 300 was released.) I and my classmates felt inadequate. According to him, we SHOULD have known about Thermopylae, and the fact that we did not illustrated an unforgivable ignorance. Imagine how my classmates and I responded to questions from that point on. There was always a concern of looking ‘dumb’, and facing a dismissive smirk from ‘the expert.’

I realize now that incidents like this happen on an everyday basis in a college classroom. Of course, this does not mean every professor reacts to a lack of cultural literacy in the way my professor did.  But, if we assume all our students understand a term or idea that we are familiar with, we have taken a step on that slippery slope.  Of course, some in the class do have the recognition of cultural ideas and terms from day one.  Those students will most likely be the ‘hand-raisers’.  They will ask the questions, and become invested in the class.  This is wonderful.  But what if most of the class is instantly alienated by an assumption of cultural literacy? This silent majority may lose hope, and/or interest.  Many will feel the way I felt about not recognizing the word ‘Thermopylae’.  Can they overcome this feeling? Will they take it in stride?  This is the question, and it will mean failure or success for many.

I don’t know about you, but I want all my students to be successful.


By Michael Stelzer Jocks, History Faculty.

Last week, the New York Times ran an amazing story. Evidently, a team of researchers has spent the last few years developing a ‘genetic atlas’ of the world. What is a ‘genetic atlas’ you might ask? Put simply, the researchers have been collecting and comparing the genomes of people living in many parts of the world, all the while finding similarities and shared genetic markers between seemingly disparate communities. Our DNA tells the story of human history, and surprise, surprise, it is pretty messy (the history, not the DNA). Shared genome sequences point to, in scientific lingo, ‘mixing events’, and

Some of the hundred or so major mixing events they describe have plausible historical explanations, while many others remain to be accounted for. For instance, many populations of the southern Mediterranean and Middle East have segments of African origin in their genomes that were inserted at times between A.D. 650 and 1900, according to the geneticists’ calculations. This could reflect the activity of the Arab slave trade, which originated in the seventh century, and the absorption of slaves into their host populations.

Two things stick out to me most with this amazing, exciting research. First, the findings of this study, and many others of the same ilk, are continually clouding our ideas about race. This is especially so for Americans, who historically have portrayed race as absolute, and physically evident. Historians realize that notions such as ‘white’ and ‘black’ have culturally metamorphosed over the years, and that race as a definitive genetic category is socially constructed. But to the average American born within the twentieth century, racial categories are non-negotiable. You are either ‘white’ or ‘black’ or ‘Asian’, or something else. Hence, when last quarter one of my students who was raised in Bulgaria mentioned to the class that she does not consider herself to be ‘white’, though she fits the ‘Caucasian’ physical bill, many of my students were dumbfounded. Since they were born in America, they believe her whiteness not to be a choice; it is a mark of her biological essence.

Studies such as the ‘genetic atlas’ throw such ideas for a loop. As the quote above illustrates, a white-skinned Italian-American student may have a genome made up of Middle Eastern, African and European portions. Though twenty-first century Americans would consider him/her white, on what do we base such a notion? Do we simply go upon the highest percentage of DNA for racial grouping? Well, American history has generally said no to this solution. Race, specifically ‘blackness’, and hence necessarily ‘whiteness’ as well, is not based upon majority genome markers. As Professor F. James Davis explains:

To be considered black in the United States not even half of one’s ancestry must be African black. But will one-fourth do, or one-eighth, or less? The nation’s answer to the question ‘Who is black?’ has long been that a black is any person with any known African black ancestry. This definition reflects the long experience with slavery and later with Jim Crow segregation. In the South it became known as the “one-drop rule,” meaning that a single drop of “black blood” makes a person a black. It is also known as the “one black ancestor rule,” some courts have called it the “traceable amount rule,” and anthropologists call it the “hypo-descent rule,” meaning that racially mixed persons are assigned the status of the subordinate group. This definition emerged from the American South to become the nation’s definition, generally accepted by whites and blacks. Blacks had no other choice….

As Davis points out, the ‘one drop rule’ became central to identifying power and status in the dark days of slavery and Jim Crow. Ironically enough, such a definition of ‘hypo-descent’ was necessary for American slave-owners since they themselves were consistently ‘mixing’ with their African-American chattel. Though the ‘Virginian Luxuries’ sign was meant to critique the practice, it illustrates the well-known fact that slave-owners (male only) were allowed, and sometimes encouraged, to take a slave mistress. Though never truly consensual, these interracial couplings produced thousands of ‘mulatto’ children. It was all-important to identify who was, and who was not, a slave.

Thus, I come to the second striking aspect of the ‘genetic atlas’ study. Notice from the initial quote above what historical events caused the genetic mixing? It was usually the worst aspects of human history. Slavery, wars, and the growth of empires caused human genomes to splice in all different directions. The history of American genetic ‘mixing’ events in the Colonial and early Republican periods was nothing new to the human experience. American slavery was similar to Roman wars of conquest; or Mongolian empire building; or the Arab slave trade. Each was based upon unequal power dynamics, with one people being the exploiter, and the other the exploited. Exploitation of labor, and exploitation of sex. Our genomes display the continually violent, often horrendous tale of human historical misery.

But, let’s look for a more positive side of this research, shall we?

Maybe, just maybe, we are witnessing the birth of a new, more peaceful ‘genetic atlas.’ The twenty-first century may be the first time that human-kind is mixing ‘racial’ genetic traits voluntarily and equally.  Just look at America today. What was once a taboo ‘mixing event’ is becoming something common and accepted.   Just in the last decade, there has been a 28% growth in interracial/ethnic marriages in the US.  At this point, around 10% of married couples are interracial. The number is even higher for non-married couples (18%).  As these couples have children, and their children grow up, and meet partners themselves, interracial numbers will only grow. The vast majority of Americans have no problem with this development. Is America specifically breaking racial ground? Is the genetic atlas of the 21st century going to be consensually complex?

You may say I am being naive, and maybe I am.  You may say that America is still a racialized society, and you would be right.  You may say that American racism is alive and well, and I would sadly agree with you. Racism is thriving in America.  But, perhaps race is slowly perishing.

It’s a start.

By Michael Stelzer Jocks, History Faculty. 

Though I study history, which is undoubtedly the ‘softest’ of the social sciences, I enjoy reading about and keeping track of ‘harder’ sciences. Yes, even physics. But, to be honest, most physics of today I just barely grasp. It is just too complex, specialized, and, to my eyes, arcane. It is intriguing to ‘learn’ about the newest discoveries in the world of quanta, or the most exciting theories about ‘multiverses’, but it is really too, excuse the pun, ‘far out’ for me. I took physics in high school, and an introductory astronomy class in undergrad, and even at that rudimentary level the mathematics and scientific jargon were just too complex for my liberal arts brain.

On the other hand, I can wrap my head around biology. Now, don’t get angry, all you biologists/biology students. I am not saying I could easily become a specialist in your field. I most definitely could not. However, biology is much more understandable to a layman like me. Maybe this is simply because I can see my role, as a human lifeform, in the biologist’s world more than the physicist’s world. Or, maybe it is because of biology’s founder, Charles Darwin, and his theory of natural selection.

Darwin’s incredibly important and influential theory is explainable, in a very rough-and-ready way, within minutes to even the most science-phobic folk. The theory itself makes sense on a rational level. We, as human beings, can imagine evolutionary theory even if we don’t completely understand the process. Look at a wolf, then look at a dog. See the similarities? Yeah, that is because they are related. Wolf evolved to dog. When? Well, we are still trying to figure that out. Why? Yeah, that is still a question too. But, wolf slowly evolving into dog is imaginable, even to a child. Now, try to quickly illustrate Newtonian, Einsteinian, or quantum physics to said child. On second thought, don’t. It’s just not gonna work.

We live in a strange world, where up is often down, and black is often white. The world of astrophysics, no matter how immensely strange, foreign and arcane, is blindly accepted by most anyone who calls themselves sane. Just imagine the heliocentric solar system. The vast majority of Americans faithfully argue that the Sun is at the center of our world. Only the crackiest of crack-pots would take the opposite stance. Yet, why is this so? Can Americans prove we live in a heliocentric system in which our planet is spinning in an elliptical orbit at an enormous rate of speed around a giant ball of burning gas? Have they seen the Earth revolving? No, they have not. So, why believe? Well, because scientists tell us so, and they’re the experts.

Antithetically, Darwin’s much more obvious theory of evolution by natural selection is always fighting an uphill battle for acceptance throughout the world, but especially in America. This is doubly true when it comes to the touchy subject of human evolution. A very large minority of Americans believe with all their might that humans were created in our present state, and have no relation to other primates. 98.7% of DNA shared with chimps be damned! Same physiological features be damned! Constantly growing fossil record be damned! For those Americans, it doesn’t matter what biology or Ockham’s Razor says. They just aren’t buying it.

Obviously, the reasons for this distaste for Darwin’s incredibly well-documented theory are complex and controversial, and I really don’t want to touch on them in this post. What I do want to illustrate is the exciting news these people are missing. I want to let them in on some recent findings in the field of human evolution that are blowing collective scientific minds. The seemingly daily breakthroughs, theories and discoveries in the worlds of anthropology, paleontology and human genetics are, to put it mildly, awe-inspiring. See here:

These are simply a few stories about human evolution to appear in the last couple of months.  Yes, you read that correctly: MONTHS. Not accepting Darwin’s theory, and hence, most likely human evolution, makes all these stories moot.  To put it bluntly, if you are in that large minority of Americans, you are really missing out on a great deal of our amazing world. I suggest you celebrate Darwin’s 205th birthday by reading these attached articles, and then, analyze your worldview.

Go on. Darwin will wait.

By Michael Stelzer Jocks, History Faculty. 

Martin Luther King, a rabble-rousing civil disobedient, is now an American national hero.  This statement is obvious.  It is fact.  But, the lionization of MLK in America today elevates him beyond simply the level of hero. For the vast majority of the country, he is part of an even more exclusive pantheon of great Americans.  Paradoxically, we can see this by the use, and misuse, of MLK’s name and memory.

Watch the news.  Listen to the political talk-show hacks.  Use C-Span to spy on Congress as they argue over some arcane issue.  If Martin Luther King’s name comes up in any of these arenas, it is usually because someone is calling upon his memory to harden their argument into a moral imperative.  Or, alternatively, MLK’s memory and beliefs will be used to differentiate a political enemy’s ideals from those of the great Civil Rights leader. In other words, a sanitized, sanctified version of Martin Luther King has become a political weapon.  ‘What would MLK say/think about this?’ constantly gets thrown out into the public realm, leading to such ridiculously unanswerable questions as ‘What would MLK think about assault weapon bans?’, or, ‘What would MLK believe about the Chick-Fil-A boycott?’  The best question, but the one that is never asked, is: ‘What would Martin Luther King think about all these “What would MLK think” queries?’

Though sometimes absurd, or even distasteful, this usage of MLK’s message and life places him into exclusive company.  Only a handful of American historical figures are appropriated by the political left and right in this way. In fact, only the nation’s ‘founders’ are called upon as often as King and his legacy.

When the moniker ‘the founders’ gets thrown around in today’s political culture, it usually refers to a small sampling of men who signed the Declaration of Independence, fought the Revolution, and created the Constitution. Though usually not stated outright, it is safe to assume Washington, Adams, Jefferson, Madison, Hamilton, and Franklin are the big six.  Though historians will tell you that these men disagreed constantly and vociferously about the meaning of America, twenty-first century Americans gloss over such complexities.  When ‘the founders’ are spoken of as a homogeneous bunch, it is usually to justify our political proclivities, or attack political enemies.  “What would the founders say about Obamacare?” “What would the founders think about waterboarding?” Picking and choosing the quotes of Jefferson or Franklin that suit their needs, media personalities and political figures utilize ‘the founders’ to fight today’s political battles.

MLK is now part of this national pantheon. But, in one way at least, MLK is an even more evocative symbol than Jefferson, Adams or Washington. King’s image and visage resonate so brightly not just because of his life, but also his death.  Unlike ‘the founders’, MLK is a national martyr.  He died for what we understand today as being the best of American ideals.  Though ‘the founders’ fought to create the nation, and their lives were often in danger, none of them made the greatest sacrifice for the new republic.  (Of course, Hamilton is the exception. He died a relatively young man in a violent manner, killed by Aaron Burr in a duel. But, to our twenty-first century eyes, this death, though romantic, was not for the nation, but only for Hamilton’s individual pride and honor.) Most of the first generation of American heroes passed away quietly in their beds. They had cleared their own, and the nation’s, hurdles while alive.  They lived to see their dreams made real. MLK died before he reached his ‘promised land.’

But, martyrs die so that others may live.  Martyrology means that King’s death caused our collective rebirth. This places MLK in an even more exclusive club.  It could be argued there is only one other member: Abraham Lincoln.  Both King and Lincoln fit the definition of martyrs as they both died so that others could thrive and survive.  Both American heroes foresaw the future far before their contemporaries, and died for this prescience.

As our nation is at fault for the death of these two men, the least we can do is celebrate their births. 

By Michael Stelzer Jocks, History Faculty.

I am writing this down here at the Turtle in order to avoid any excuses.  I have a goal for the new year, and I really want to accomplish it. This is something I have wanted to do since I finished up with graduate school about 10 years ago, but I just never found time.  No more rationalizations.  This year, I do it.

Yes, yes, another New Year’s resolution post. I can feel cyberspace sigh with annoyance.  But wait!  Perhaps this will interest you?  I have never, ever made a New Year’s resolution before the one I am about to share.  This is a first.

So what do I resolve?  What will I accomplish?  Well….It may not seem exciting to a lot of people, but here it goes….drum roll please….

This year, I will keep an online, private journal recording each new book I read. Each entry will be a small synopsis of said book, and if called for, memorable quotes from the text.

*Crash, crash, crash* (Sound of cymbals)

Now, why am I resolving to do this?  Honestly, and this is not some pretentious claim, I read such a large volume of books at this point that I often forget what I have read. Don’t you think this pretty much defeats the whole purpose of reading? I do. You know you have a problem when you go to the library, or bookstore, and see a book that interests you, and yet you wonder, “did I already read that?”  I was looking through some books the other day and I stumbled upon Francois Furet’s “Interpreting the French Revolution.” ‘Oh, that sounds excellent,’ I thought.  But, then, as I pondered, I could not, and I still cannot, remember if I ever read it.  How infuriating.

Oh, and there is one other reason to keep a ‘books read’ journal.  Though many people think I read the ever-growing stack of books on my desk at RMU from cover to cover, I have a confession.  A good number of books piled up at work never get completely read.  Some of those mighty tomes only get a good skimming.  I will begin a book, find it boring, poorly written, or not about what I thought it was about, and put it down.  Nonetheless, even when that is the case, there are usually a couple of pages in each book that have some value to me.  I hope fulfilling this journal resolution will help me keep track of those partially finished, or quickly dismissed, books for my records.  I need to get this book obsession under control.

I know. Not the most world-shattering resolution, but there it is.  Welcome to the 2014 Flaneur’s Turtle.

By Michael Stelzer Jocks, History Faculty.

Anyone who knows Peter Stern, knows that he has a way with words.  As you can tell from many of his Turtle posts, and as many of his coworkers would readily admit, Peter can be quite loquacious.  But, that does not mean Peter is not wisely pithy when it suits him.  So, for your reading enjoyment, I give you some of the best Peter Stern-isms of the last year, as witnessed by myself and my fellow Turtle-ite Paul Gaszak.

Peter is the master of analogy:

 On the idea of Wrigley Field adding a jumbotron: “It’s like someone whipping out their genitals in Holy Name Cathedral.”

Peter does not view the world in simple dualistic categories:

RMU Student: I’m a failure.

Dr. Stern: That’s not true. You’re just not a success.

Peter displaying his Socratic wisdom:

Gerry Dedera: Peter, you couldn’t be more wrong.

Peter Stern: I couldn’t be more wrong? Just wait a minute.

Peter contemplating his own photographic image:

I don’t believe that’s me at all; I think it looks much more like Marcel Proust after letting his hair grow out a bit and turn gray.

Peter using humor to illustrate society’s prejudices:

Carol Bivin: Why is it that a woman has never been elected President?
Peter: Well, let me tell ya, sweetie pie….

Peter as instructor:

Paul Gaszak: I told your students to move to the back row, because you educate with such force that there is a blast radius.
Peter: The force was so great that it actually pushed them out the door.

Peter as critic of our society’s obsession with physical beauty:

Paul Gaszak (A runner extraordinaire): Did you make a big deal of Cynthia’s birthday yet?

Peter: We are going to celebrate later when you’re out narcissistically jogging or whatever it is you selfishly do.

Peter being Peter:

Paul Gaszak: I thought we’d be the birthday strippers (for Cynthia’s party). You want to be the policeman or the fireman?

Peter: I can do both.

Happy Holidays, from the Turtle!