Tag Archives: Wednesday Blog

Art

Photo: Tom Kane at Immersive Van Gogh Kansas City
This week, how art impacts how we see the world around us.

Immersive Van Gogh: https://www.kansascityvangogh.com
Claude Monet, Boulevard des Capucines (1873): https://art.nelson-atkins.org/objects/17852/boulevard-des-capucines
George Caleb Bingham's Catalog: https://www.binghamcatalogue.org
Thomas Hart Benton's art at the Nelson-Atkins: https://art.nelson-atkins.org/people/2320/thomas-hart-benton/objects
"Hard Times Come Again No More" by The Chieftains and Paolo Nutini: https://youtu.be/uPqjQTkEA6g

On Sunday, I went with my parents to see the Immersive Van Gogh exhibit that’s been touring around the globe for the past few years. I first heard about it when I was in Paris in May 2018 and thought about going to see it there, but ended up not making the visit. So, the following year, when it was announced that Immersive Van Gogh would be coming to Kansas City, I jumped on the opportunity and bought tickets for my family to attend.

Then the world changed in what now seems like a prolonged moment as the COVID-19 Pandemic took hold around the globe. The exhibit opening was delayed in Kansas City, and it began to slip from my mind for the next couple years as the storms that shadowed the last few years of the 2010s burst into the troubled times that have been the hallmarks of the 2020s thus far.

So, after years of anticipation, when I finally entered the Immersive Van Gogh exhibit this past Sunday afternoon, the sights and sounds combined for a truly awe-inspiring experience. We entered the gallery as Edith Piaf’s “Non, je ne regrette rien” burst over the speakers to the bright yellow hues of the fields of Provence as observed 140 years ago by the artist’s eyes. I took a seat on the floor with my back to a mirror-covered pillar and watched as the images danced across the walls and floor surrounding me.

The exhibit inspired a question: does our art influence how we perceive the world around us, and, more importantly for a historian, does the art of past generations influence how we today perceive the light and color and nature of past periods? Take the Belle Époque, the age of Impressionists like Monet and Post-Impressionists like Van Gogh: do we understand and think of the daily reality of that period in a way that is colored by the works of those artists? There is a Monet painting in the Nelson-Atkins French collection here in Kansas City of the Boulevard des Capucines, which dates to 1873. It shows the hustle and bustle of the French capital in a manner that is both of its own time and seemingly timeless in how modern it appears. This extends in my own mind to the point that I’ve imagined the same scene whenever I’ve happened to walk down that same boulevard in the last few years.

On the other hand, the images that exist of Kansas City from the nineteenth century are largely dominated by black-and-white photographs and the odd painting by the likes of our first great local artist, George Caleb Bingham (1811–1879). So, for how many of us are our ideas of, say, the Civil War largely just in black and white, even though the reality was in the same vibrant color we see today? Even in my own life, I’ve found that there’s a slight hint of faded color in my memories of the earliest days of my life, perhaps influenced by the technology available in the color photography of the 1990s, which is noticeably less radiant than the colors available today in our digital images.

George Caleb Bingham, Canvassing for a Vote, 1852, (92.71 x 105.41 cm), Nelson-Atkins Museum of Art

Art at its most fundamental level is a means of communication. It transmits memories from the creator, a historian of their own sort, to their patrons in posterity. Whether that art is expressed in painting or sculpture, sketching or cartoons, music or poetry, theatre or film, or in every form of literature, fiction and non-fiction alike, it is still at its core a transmission of knowledge and information. Through art the dead are able to speak to us still. In art we can experience something of the world that others lived, what they saw and heard and thought. In the paintings of Thomas Hart Benton (1889–1975), in my opinion the greatest Kansas City artist to date, we can see echoes of American life as he understood it in the first half of the twentieth century. I can truly say that his art has influenced how I understand the Depression, World War II, and the postwar years; having grown up in Kansas City going to the Nelson and the Truman Library, I saw his art far more often than many other Americans might have. Through his paintings, Benton communicated ideas about what it means to be American, and about the place of the Midwest in general, and this part of Missouri in particular, in the wider fabric of this diverse country of ours.

Thomas Hart Benton, Hollywood, 1937-1938, (156.53 x 227.01 cm), Nelson-Atkins Museum of Art

So, what sort of message will we in the 2020s leave for future generations? What do we want to communicate to them? In the last couple of weeks, I’ve been thinking of the Stephen Foster song “Hard Times Come Again No More.” Written in 1854, at a time when my home region was embroiled in the Border War known more commonly as Bleeding Kansas, one of the last preludes to the American Civil War of the 1860s, “Hard Times” has always seemed to me a song not of the nineteenth century but of the Great Depression, something I could imagine being sung by farmers fleeing the Dust Bowl here in the prairies for new lives elsewhere. Still, the fact that the stories surrounding that song can speak to different times with common troubles speaks to the power of art. Maybe it’s high time we restored “Hard Times” to the charts; after all, what better description of the present could there possibly be?

Times of Trial and Hope

Photo by Pok Rie on Pexels.com

Times of Trial and Hope Wednesday Blog by Seán Thomas Kane

This week, living in exciting times often turns out to be a bit unlucky, from a certain point of view.

To say we’ve been living in boring times would be the lie of the decade. The twenty-first century has proven to have nearly as many pitfalls and joys as the twentieth, albeit pitfalls of varying types. We’ve avoided the great cataclysmic chasm of world war so far, but that portal to the Underworld remains visible off in the distance. How close our collective human path comes to its shadows remains ours to decide. What we have that our ancestors didn’t is their collective memories of the century now past to ensure we avoid some of the same mistakes they made in their wanderings through life.

As a child I had many dreams about what my future would bring: what sort of job or jobs I’d have, who I’d spend my adulthood with and the kids we’d have together, the joys and griefs that would come with the waxing and waning of our days. I figured my adulthood would be straightforward, that I wouldn’t have any trouble finding work or building a life for myself; after all, that’s how my parents’ lives seemed to me. Yet in the last decade, as I’ve entered my twenties and now see the great stone gateway leading to my thirties near the horizon, I have to laugh at those juvenile ideas of what my life would be like.

The last decade has been tough, and everything that I’ve done that I feel truly confident about has yet to really translate into a stable long-term career. This is a sentiment that I doubt I’m alone in expressing. For all the benefits of our modern world and its advances, for all the threats of the past seeming to be in the past (until they reawaken like the zombies that dominate our popular culture), many in my generation remain stuck, having trouble finding work or finding that the industries we’re interested in working in are “broken” or simply aren’t hiring.

I’ve been very frustrated at how the whole hiring process hasn’t worked for me yet, no matter how hard I try and how many applications I send in. It often feels like there’s some lesson that I missed in my close to 27 years of schooling about how to get a job. It makes me angry when the response from people around me is “oh, don’t worry, you’ll get something eventually” or any other related phrase or saying that isn’t constructive or helpful.

Today’s title comes from President Truman’s memoirs, the stories of his starts and stops as he tried to build his own career a century ago here in Jackson County, Missouri. I’ve always felt that I could relate to Truman more than many other presidents, perhaps because he came from the same area as me, or because he was good friends with some of my more distant relatives. Whatever the case, Truman’s words ring true to me. These are truly times of trial and hope, and I think despite how trying the current time may be, we have to keep up hope that we can drag ourselves out by our fingernails if we have to and into a better tomorrow.

The best solution in my Sisyphean task of applying for jobs is to keep rolling that boulder up the hill and hope that eventually I’ll make it through. The funny thing is that the security graduate school provides, with a stipend and a position in an institution, hides the fact that in many ways the activities central to graduate school are also Sisyphean: wrangling together support for your work, all with the hope, however dim it may be, that that work will ensure you a job once you’ve earned your credentials.

It’s hard to be hopeful right now in 2022, and there’s so much to be worried about. I don’t really have a positive spin to put on this one because I’m still not sure what that positive spin might possibly be. All I’ll say is that it’s up to us to figure out a solution to move from our times of trial and hope to our times of decisions and maybe eventually into a new age of optimism.

Culture

This week, some thoughts on what keeps a culture alive.

I really enjoy going to concerts and hearing all sorts of musicians from all around the globe perform. I’ve been lucky enough to attend some historic concerts, such as the 2012 performance of the Cuban National Symphony Orchestra at the Kauffman Center in Kansas City. It was their first performance in the United States since the beginning of the embargo on Cuba in the 1960s. In that moment I could feel echoes of a vibrant and lively culture living alongside my own in the same moment in time. The way the musicians put their own spin, their own rhythm on Gershwin’s Rhapsody in Blue to make it sound Cuban was wonderful to hear. Still, the setting of the concert hall where there is a clear physical barrier between the performers who stand or sit on a stage frozen in place with an audience watching all around does sometimes give me chills. 

There have been many other concerts that I’ve attended where music or stories from particular cultures are played or told, relayed to us, an audience foreign to those artists’ culture. Yet the way they’re transmitted, removed from the physical surroundings of those artists’ homes, placed on a sterile stage where we can hear and see everything in perfect clarity, takes something out of the performance. It worries me that in America we’ve come to expect that particularly “cultural” things, especially if they’re foreign or “ethnic,” ought to be neatly packaged in an expected format and setup. That way they can be interpreted through our own cultural lens, in such a way that if I see a bottle of Italian seasonings in the grocery store, I’ll be able to tell first and foremost that those are seasonings. Yet in that standardization of culture to fit a set mold, into which all its many and contradictory elements can be poured and melted down, we lose a great deal of the memories and the life that those cultural artifacts embodied.

My own people, the Irish Americans, have an interesting relationship with this. We still maintain some elements of a culture distinct from other European-descended ethnic groups in the United States, though I know some reading or listening to this might object to my use of the word “ethnic” to describe a community so assimilated into mainstream American society. The problem for me arises in trying to answer how our culture is still even partially Irish. Here in Kansas City, we have plenty of people who play Irish music, myself included, as well as a couple of local Irish dance schools. We even have a local Irish Center where classes in the Irish language are taught. Yet when most of those cultural milestones are performed, it is often in delineated places and situations where they are expected, say at the Kansas City Irish Fest, rather than more organically as a regular part of daily life.

One of the great exceptions to this rule is with music, after all some of my favorite concert memories have been sitting in on the jam sessions at the Irish Center and at other venues around town, even in the homes of friends. There, an element of our Irish American culture is still being performed organically, like a group of friends getting together one evening for a party. It just so happens that instead of playing the Top 40 Hits at that party they might pull out their own instruments and play their own top 40 for themselves.

Culture is fundamentally performative, and to survive it must flourish organically in the setting where it exists. This past weekend I had the honor of serving as a groomsman for one of my best friends who is Greek Orthodox. The wedding took place in his church, and clearly seemed to be an unfamiliar ceremony to many of those present who weren’t themselves Greek (myself included). Still, I found the chanted prayers and hymns––most of which were performed in English––to be fairly easy to learn, and after the first one I was able to add my own voice to the congregation. Later that evening the typical wedding reception DJ hits were freely interspersed with Greek dances, which likewise for the average participant were far easier to pick up on than any of the Irish dances I’ve done over the years. There’s a culture that’s still vibrant in how ordinary its performances tend to be for the people who live it every day.

It struck me that the idea of having Irish music played at an Irish American wedding reception would probably be met with shock; after all, most of us don’t know the steps for the jigs and reels that make up Irish dance today. Furthermore, someone is bound to be annoyed by some inconsistency with what is properly Irish American, meaning that one’s Irish American cultural practices end up so highly regulated that they become harder than necessary to follow. I for one would love to try to introduce some elements of jazz into the jigs, reels, and airs that I’ve learned to play on the tin whistle, and if I do ever get around to joining in a specifically Irish dance again, you can bet that I’ll let my arms move more freely than is expected.

I worry that a culture which isn’t performed as a daily routine will gradually become fossilized. Such a culture, if confined only to special events that aren’t expected or normal for the everyday, will surely die, leaving its participants poorer as a result. Perhaps one of the greatest differences between Greek American culture and Irish American culture is that our ethnic church, Roman Catholicism, did not preserve the Irish language as the language of liturgy and spirituality. Rather, Catholic priests continued to say the Mass in Latin until the 1960s, by which point so few Irish Americans still spoke Irish that there are hardly any Irish Masses performed here in the United States today. I’ve only ever been to one Mass that was done entirely in Irish. It was said on a stage at the Dublin Irish Festival outside Columbus, Ohio, where we the congregation looked on as if gazing at some exotic ceremony stuck out of place and out of time. That most essential element of any culture, the way in which it speaks and sings and laughs and cries, its language, is vital to that culture’s survival. And in a country where we make up a good portion of English-speaking Catholics, our Church has assimilated faster than our hopes for a distinct Irish American culture may have wanted.

Childhood

My Mom and I, Thanksgiving 1997
This week, reflections on childhood, from my own personal experience.

One of my favorite types of daydreams is to imagine my current self talking to my younger self. It could be me as a junior in high school, or me watching off to the side as my three- and four-year-old self began to conceive of the world around me. I can remember what my younger self was thinking at any given time, so the “script,” if you want to call it that, is one of the easiest ones to write. And in the past, as is still the case today, I often wondered about my future self: who I’ll become as I continue in my life, who I’ll meet, where I’ll live, who I will be in my future.

A couple of weeks ago I found myself thinking about all this for a good three hours while I was flying east from Kansas City to Newark on my return trip to Binghamton from a wonderful Easter weekend at home. As I watched the prairies of the Midwest give way to the cloud-covered Appalachians, I kept thinking about who I dreamed I would become at varying stages of my life. Would my six-year-old self, whose life changed dramatically when he moved with his parents from Chicago to Kansas City, be proud of how my 29-year-old self has found a way into a career where he still looks back at what he loved to think about as a kid? Would my moody teenage self be happy knowing that he was still going to be single at the end of the following decade? And what about my more recent past? Would the Seán who lived in London for a year and learned so much about the world in those months abroad be proud or scared at how tough the next seven years of his life were going to be?

I wonder, and maybe this is a conversation better posed to psychologists, are we still the same people who we were as kids? Or do we transform or change the shape of our personality through our lived experiences, through our joys and sorrows? I remember thinking about myself more simplistically as a child, a time when the things that I was proud to be a part of conjured up images like the Space Shuttle or other marvels of the modern world. As I learned more about myself, I found more and more things that were new to me that I could attach myself to, that I could find some connection to. My interest in Ancient Rome was born out of a conversation with my Mom when I first remember hearing my Church called Roman Catholic. I knew where Rome was (I memorized the globe at a very early age) so the idea that I was a part of something so rooted to something so ancient was thrilling. 

Similar things happened at a time that I can’t pin down, when I began to understand and listen to the stories about my Irish ancestors. It’s funny, I remember only one person from that generation, my grandfather’s Aunt Catherine, who died in 2000 when I was seven. I remember her accent puzzling me, but I bet if I sat with her today I’d be able to understand her perfectly well, now that I’ve got nearly 30 years of listening to people from Mayo under my belt. On my Mom’s side I have only one memory of my great-grandmother, my GG as I always called her. I must have been very young, but I remember waking up from a nap in what’s now the computer room in my grandmother’s house in Kansas City and going out into the living room where she was sitting with my grandmother. All I remember her saying was “Hi, Seán.” When I told my Mom about this memory recently, she said she must have carried me out because my GG died before I could walk, meaning this could well be one of my very first memories.

Still, when I think back to all those moments, as if looking down the long string of a double bass, I wonder if the guy whose eyes saw those moments, whose ears heard those sounds, whose nose smelled those smells (for good or bad) was the same guy who I am today. If I can say anything definitive, it’s that the one constant among all those memories isn’t necessarily how they were framed or what I was thinking or feeling in each moment. It’s that the same internal monologue was going in the same voice that I still think in today. A few weeks ago, I wrote about the first time I recognized my conscious thoughts, something that a lot of people said was a profound idea. I asked if that was the moment best described as “In the beginning” for me, with everything else I ever have come to know or will come to know happening afterwards in the order that I discovered those things. Today I want to add onto that dogpile of a question and ask, which part of my past is most influencing my present, and by extension my future? I think the best way to look at answering this question is through the lens of nostalgia.

The truth is I’m not sure which of the reflections of my past that live on in my memory is the one that I’m most nostalgic for. There are echoes of all of those shadows in my life and my work today: the deep passion for natural history and the natural world in general that drove my six-year-old self, whose favorite places in the world included the Field Museum and the Brookfield Zoo, or the teenage reflection who loved his Latin classes more than any other and really wanted to be doing better at them but just didn’t have the patience to stop overthinking things. I think those teenage loves drove me into my adulthood; after all, as much as I loved spending my time in London’s Natural History Museum, it was the British Museum where I dreamed of working when I decided to do either a Classics or a History PhD in 2016. It took me a few years to get into a program, by which time I’d settled not on Ancient Rome but on the Renaissance, before building my own field from the ground up, as a kid with my childhood would tend to do, to become a Historian of Renaissance Natural History.

As it happens, this whole idea of a hyper-individualized vision of a historical timeline, beginning with a person’s first consciousness of something, could be useful in my work. After all, one of the great debates in the history of how the Americas were approached by Europeans during the Renaissance is whether it’s right to say they “encountered” or “discovered” these continents. I usually prefer to say “encountered,” seeing as there were already generations of people reaching back into the Ice Age who had called these continents home. Still, if we think about this question less in the scope of all of human experience and more in the limited view of how one set of humans, one branch of the family isolated from the others by circumstance, understood the Americas when they reached those shores in 1492, then the word “discover,” from the Latin “discooperiō” meaning “to expose” or “to lay bare,” does fit the experience of the many peoples of Europe in first learning of the existence of another series of worlds across the Ocean Sea that they came to call America. But our history is the history of the creation of our modern world, a global world defined by shrinking borders and a growing sense that we’re all in this together, and for that world this isolated story of one perspective “discovering” the fact that other people had already made it to first base merely makes the discoverer a shortsighted pitcher. Without all the caveats and framing, the idea doesn’t work. It speaks to the warning that it’s best not to think of whole groups of people in the same context that we’d use to think of just one guy.

So, with that out of the way, do I think my younger self would be proud of who I am today? In some ways, yes; after all, I’m sticking to doing something that I love despite the odds and the circumstances of our world in 2022 seeming to be stacked against me making a living out of being a Historian of Renaissance Natural History. I may not be working at the Field Museum or at one of the other wonderful natural history museums in this country or beyond, but I’d say that’s still a possibility. Nonetheless, I imagine that teenage Seán would be a bit more forlorn knowing he’d still be single all these years later. Teenage moodiness can cast a shadow even from the confined distance of your memories. I think the moral here, if there is one, is that there’s always room for improvement, right? And at the end of the day, as my undergrad self, the triple major in History, Philosophy, and Theology with double minors in French and Music, would like to say, “Anything is possible.” So, if I could go for three and a half years without a lunch break trying to earn 3 majors and 2 minors in 4 years, then I can get a job doing something I love and maybe figure out the personal life while I’m at it too.

Homo Sapiens

A particularly bumbling specimen of the species.
This week, a bit of self-reflection. The Man from Earth website The Man from Earth trailer

On Monday last week, I sat down to watch the 2007 film The Man from Earth for the second time. You may remember hearing or even reading my reflections provoked by that film. I said I’d probably watch the sequel, The Man from Earth: Holocene, soon, and, well, I did just that. Compared to the original, Holocene lacks some of the powerful dialogue and gripping storytelling. The Man from Earth felt like a story being told in real time, while Holocene, its sequel, seemed more like a TV pilot that was turned into a feature. Both films feature some wonderful actors whom I recognize from the many Star Trek series I’ve seen in the last few years, notably John Billingsley and Richard Riehle in the first film, while the great Michael Dorn himself makes a wonderful appearance in the second.

If I were to draw any deep argument out of the second film, Holocene, it would be something to do with how we identify ourselves. We humans call ourselves Homo sapiens, a scientific designation that we’ve given to ourselves to distinguish us from our hominid cousins, including the Neanderthals. Homo is Latin for human; it is the genus that includes modern humans and our closest extinct relatives among the hominids, themselves a subset of the primates. Sapiens, on the other hand, is more interesting. It is a Latin participle based on the third-conjugation verb sapiō, sapere, which is used to mean many things, from “to taste” and “to have flavor” to the more innate concept of being able to sense or discern things, all of which is necessary for knowledge. Homo sapiens, then, means we distinguish ourselves from our hominid cousins by our ability to understand ideas. Now there’s evidence today that other early humans could think and create in ways that are similar to us, evidence, for example, that Neanderthals created art of some sort in ancient Europe, so in many ways the fact that we designate ourselves as Homo sapiens is as much a way of patting ourselves on the back as anything else.

This brings me back to The Man from Earth: Holocene. It’s a film that introduces its core conflict when a group of inquisitive undergrads start to wonder about their professor, who they soon realize is the same 14,000-year-old man from the first film. Only now he’s begun to age in slow but noticeable ways. This film made me question the idea that we are Homo sapiens, for the personifications of humanity in this film, the four students seeking the truth about their professor, make a series of terrible decisions that prove that, as book smart as they might be, they are clueless about so many other facets of life. Homo sapiens indeed.

In my own research I study the introduction of Brazilian flora and fauna into European natural history through the writings of several French explorers dating to the 1550s through the 1580s. And while I came into my research thinking I would have some fun writing about sloths and parrots and dyewood trees, I have found that the story I’m trying to tell is as much a warning to our present and our future as it is anything mundane about Renaissance natural history. There is a theory introduced late in The Man from Earth: Holocene called the Anthropocene, a concept widely discussed today. It argues that human interventions and influence upon nature have become so great that we have shifted the course of Earth’s natural development out of the Holocene, the current geological epoch defined by our planet’s warming over the past 11,650 years, which made for the perfect conditions for the development of life as we know it today, and into a new geological epoch in which we humans, the anthropoi in Greek, are now the prime movers of Earth’s natural course. In the film this becomes an understated note of caution, yet in my own research I find the Anthropocene to be a fundamental piece of the story of the European exploration, conquest, and colonization of the Americas, one largely ignored until recently.

We call ourselves discerning, we call ourselves wise, and yet we allow our own demands on nature to outstrip what nature can provide. It’s a curious balance we need to maintain, one which I am just as guilty of destabilizing as anyone else. It’s curious to me that we call ourselves wise when we think of all we have done with our home. We are one of maybe only a handful of species (leaving room here for other hominids at least) that has created beautiful art and weapons of mass destruction all with the same innate tool: our brains. We have just as much an ability to love as to fear, and in a given day I think it’s safe to say we act on those emotions without often really realizing it. 

Through it all we’ve survived and thrived on this planet of ours. There are 7.9 billion of us today, and while our population growth is a marvel of our ingenuity and ability to adapt to everything that this planet has had to offer so far, our own exponential growth may be the thing that drives the planet to the point of no longer being able to care for us the way we have been. If we don’t eat, we starve, yet if we eat too much we will run out of food and then starve. As the Man from Earth himself said in the first film, it’s the species that live in balance with nature that survive.

I argue in my dissertation that the Anthropocene really began when two different gene pools of life, one Afro-Eurasian, the other American, intersected on a large scale for the first time in thousands of years, following Columbus’s accidental stumbling upon land and people on this side of the Atlantic in 1492. That was the moment when human endeavors began to triumph over natural barriers, when a new global world was first conceived out of the collective products of a series of old worlds on every inhabited continent. It’s fair to call ourselves sapiens, discerning and wise, for the fact that it was humans who bridged that gap through innovation and technology. Yet it’s also fair to say that it was humans too whose innovation and technology created the great climate crisis we find ourselves in now. While the pessimists among us would end the story there, in a way that is in vogue to do these days, I want to continue the story, to contribute a verse to the poetry of life, and to say to you here and now that it will be our innovation and technology, our discerning and wise nature, that will figure out a way out of this crisis and that will lead us to adapt again to a new life in this new world we’ve created in our own image.

The Man from Earth speaks to me of the potential of humanity and of how, at the end of the day, we’re still just telling each other the same sorts of stories around a campfire. Like our ancient ancestors before us, we see what we know and imagine what could be out there beyond the light of our knowledge. Unlike our ancestors, we today are comfortable in a world we’ve created for ourselves, or at least some among us are comfortable in that world. We don’t need to innovate quite so greatly as past generations; we can let our minds become lazy and unimaginative. Like the bigwigs of every era just before a storm, we can be content and let the tides overcome us, but some among us will be hit more fiercely by those tides than others, and they’ll be the ones to stand up and say we can do better for ourselves. We will always stumble and fall, like those four characters from The Man from Earth: Holocene, but we will always find a way of getting back up.

At the end of the day, we’ve created this new world where we are at its center, the keystone species around which all others exist in a new balance. I personally find that balance more precarious than I’d like, and personally I’d rather not be the one holding the entire balance of nature up like some modern Atlas. Yet over generations of decisions for good or ill this is what we’ve decided to do, and who we’ve decided to become. All we can do now is live up to the task and make the burden less strenuous for our descendants.

The Fog of Time

Photo by wsdidin on Pexels.com
This week, some thoughts inspired by the 2007 science fiction film The Man from Earth, written by Jerome Bixby. The Man from Earth website The Man from Earth trailer

When I was an undergrad, I watched a lot of really neat films and TV shows. It’s something that sticks out to me from those years as distinct, something that I’ve continued and recreated from time to time as my mood allows. In the Fall of 2013 I saw a movie on Netflix that piqued my curiosity, called The Man from Earth, that I’d never really forgotten. It’s not really an action-packed story in the modern sense; rather, it’s 90 minutes of dialogue, discussion, and good old-fashioned storytelling about a professor who is leaving his job, home, and friends after 10 years. The reason: he’s learned over the 14,000 years of his life that moving on every decade or so is a good policy. Yep, John Oldman, the Man from Earth himself, is a Cro-Magnon.

For some of you hearing or reading this, the plot of The Man from Earth will be all too familiar. It became something of a quiet hit among certain crowds; one reviewer even called it “intellectual sci-fi.” While I was home over one of my recent breaks from working on my dissertation, I bought an HD copy of it on YouTube, figuring I’d like to re-watch this particular classic of my early 20s. It took me a few months, but on Monday night this week, while I was keeping an eye on my students’ term papers streaming in before the 11:59 pm deadline, I sat down and watched The Man from Earth all over again.

What stands out to me the most about my memories of watching this film for the first time nine years ago is how it unsettled me a bit, as it did the expert characters on the screen. The very idea seems counter to all that experience has taught us. “People die!” as the goddess Persephone cries to Orpheus in another film I purchased on YouTube. Yet somehow, in the logic of The Man from Earth, the title character, Dr. John Oldman, had been able to live for 140 centuries and still, after all that time, appear to be only around 35.

This time I came away from my second viewing of The Man from Earth thinking less about the literal story and more about the ideas it proposes. For one thing, I found myself thinking of the perspective that such a man would have of nature and reality. My own perspective is fundamentally framed by my upbringing: it’s conditioned by the powerful forces of my Catholic faith, my Irish American traditions, my personal ethics, and by learning about Creation as both an Act of God and, scientifically, the product of a Big Bang and billions and billions of years of evolution.

On the other hand, John Oldman––the fourteen-thousand-year-old Cro-Magnon––reflects and recognizes a different sort of perspective, one born out of the entirety of human experience, one where the world began as what was knowable from the furthest reaches of the light of the nearest campfire inside a cave. For him, the Big Bang and all the creation stories we’ve been telling ourselves could well have happened after his first memories, his first inklings of reality and existence. To him those were all things learned over time, and their existence only became tangible once they were learned. If you think about it, no one really knows a thing before they have that first spark of an idea that it could be possible. Today, I believe that anything is possible. That belief is grounded in my faith in an omnipotent and omniscient Divinity, yet it’s also equally held up by the course of human intellectual history, by how we keep finding out that more and more things are real, and by extension, at the edge of our knowledge, that more and more things are possible. How wondrous is that?!

On Sunday evening, in honor of Yuri’s Night, an annual celebration held around 12 April to commemorate Yuri Gagarin’s first spaceflight, the first time a human left our home planet, I decided to watch a space-themed film. Last year I was lucky enough to join the global livestream party and hear all sorts of neat panelists talk about the past, present, and future of space exploration. This year, though, that wasn’t an option, so I improvised and put on the 1983 classic The Right Stuff. For those of you who aren’t familiar with it, this is a film celebrating the Mercury 7 astronauts, the first Americans to leave the Earth and, among them, the first Americans to orbit our planet. The Right Stuff begins with Chuck Yeager’s triumphant 1947 flight that proved the sound barrier (Mach 1) could be broken. Before he flew his Bell X-1 faster than the speed of sound, no one was entirely sure it could be done. That flight pushed the edge of our knowledge out of a pure hope that supersonic travel could be possible.

Maybe then, on a human scale, we should think of our place in the history of the Cosmos less as a story beginning with the Big Bang, which, as scientific as it is, bears some resemblance in how its story is told to the other great Creation stories out there, and more with that first human spark that signaled to our conscious thoughts that we exist. Maybe the beginning of the human story ought to be around a campfire in a cave somewhere, or maybe this leaves room for multiple human stories, each threads that broke off from one another until, as in our present time, there are 7.9 billion such threads living, all vibrant and emotional and passionate each in their own way.

When I think back to my beginnings, to what I can ascertain as my first definite memory, that first moment when I could recognize the internal narrative that keeps me going even now, I think of a day trip my parents and I took when I was around 3 or so from our home in the Chicago suburbs to the farm where my Mom’s grandfather grew up in Sheffield, Illinois. I remember sitting in the back seat, on the passenger side, watching out the window as we drove past new suburbs and new neighborhoods being built as we headed west. For me, so much that I consider etiological, that I consider origin-stories to my own, whether Genesis or the Big Bang or the many stories keeping my ancestors’ memories alive even when the people involved are long dead, comes after that moment in my own memory. So, to my own individual, personal recollection of history, of reality, of the Cosmos in all its wonder, all of it comes after that one moment I can remember from 27 years ago.

What does this mean for how we understand our place in nature? I think if it changes anything, it ought to follow another line of wisdom from The Man from Earth: the species that lived in balance with nature tended to be the ones who survived. Maybe we need to balance ourselves with our worlds. What I mean to say is maybe we should allow room for our own individual views of things while acknowledging there’s a greater truth to be found in the collective knowledge and wisdom of humanity. There is an inherent fog surrounding our understanding of time; after all, we can only ever really see what’s happening right in front of us at any given moment. We can remember with growing haziness what happened in the past, and we can yearn for possible futures that are equally fuzzy in our imaginations today.

This time around I was delighted to see that a sequel was made called The Man from Earth: Holocene back in 2018. It stars the same actor, David Lee Smith, as John Oldman. I think I might watch it tomorrow, and who knows maybe you’ll hear more from me about this story next week. For now, keep imagining because that’s what allows for the improbable to become the possible.

The Man from Earth (2007) trailer

Sandwiches

Photo by Brigitte Tohm on Pexels.com
This week, to conclude what I’ve been saying.—Click here to support the Wednesday Blog: https://www.patreon.com/sthosdkane—Sources: [1] “Signs,” Wednesday Blog 1.10. [2] “On Servant Leadership,” Wednesday Blog 6.15. [3] Percy Bysshe Shelley, “Ozymandias,” Poetry Foundation.

Like many people I love a good sandwich. My tastes are a bit austere, a bit simple for most. I’ll alternate between one of two options: ham and provolone with maybe some lettuce and rarely a tomato slice or two, or roast beef and Swiss. More often than not, lunch is a meal I tend to eat on the run, either in between things in my apartment or on the go between meetings and lectures at the university. There are also plenty of lunches where I’ve ended up going through the drive-thru at one of the local outlets of the national burger chains, you know what I mean, or made a stop for a burrito at Chipotle, but beyond all of those options nothing beats a good sandwich decked out on wonderful bread.

Some of the best sandwiches I’ve ever had were in France. There my preference is the rather plain jambon-beurre, ham and butter on a baguette. Writing this now that doesn’t seem like such a bad idea to try making in my apartment in Binghamton one of these days. You could go as fancy as you’d like with your lunch, have all sorts of sauces and toppings on your sandwiches, and bravo for you with your preference. Yet what I’ve ended up settling on here is putting my ham and provolone or roast beef and Swiss on a bagel, often a raisin bagel mildly toasted and buttered to perfection.

Sandwiches are good topics for podcasts because eventually everyone gets hungry. There’s a chance this episode might rise above my average of 10 listeners, all because I tag the blog post version of it with the keyword “sandwich.” There are plenty of sandwich-themed short videos and other entertainments out there, take British motoring journalist James May’s bunker kitchen on Food Tribe; while I enjoy Mr. May’s commentaries for their insight and humor, I appreciate seeing how different his tastes are from my own.

There are all sorts of commentaries about sandwiches and what they mean below the surface. Some say they are symbols of how our culture has developed a need to deal with human necessities on the go; we “eat when [we’re] hungry,” to borrow a phrase from an Irish folk song about moonshiners. There are also the endless debates about sandwiches: what makes a sandwich, how we define sandwiches, are hot dogs sandwiches? The jury is still out on that one, no doubt enjoying their own bready concoctions in the jury room. The sandwich is one thing too good for even the French to ignore, adopting it with its pure English name despite a general cultural distrust of anglicismes. So, here’s to the sandwich, the humblest crust upon which we stack our hopes and dreams, the object of our fancies, and the delight of a quick lunch.

The Syntax of Internet Culture

Photo by fauxels on Pexels.com

The Syntax of Internet Culture Wednesday Blog by Seán Thomas Kane

This week, some things I've noticed about how people communicate online.

I’ve had access to computers for as long as I can remember. My parents work in the tech world, so naturally I was probably one of the first people in my class to have an email address. I still remember that first email I sent; it was to my Aunt Jennie in Kansas City. Even then, at the spry young age of 3 or 4, I was already growing into a typical Midwesterner: I asked her about the weather where she was. Over the years my access to the internet has only increased to the point where today it is ubiquitous. I’m rarely, if ever, away from a data signal, and any simple question I have can be easily answered by a quick question to Siri or a Google search.

The amount of technology in our lives today is sometimes scary. The fact that it surrounds us at all times, in all places makes us all the more dependent on it. What’s more, it’s changed the way we talk, the way we solve problems, and quite possibly the way we think too. In the last couple years, the majority of my time online has shifted from being spent reading long-form articles on the New York Times, the BBC, and such to watching videos, both long and short, on platforms like YouTube and Instagram. In the early years of YouTube, content on that platform tended to be much more diffuse with different creators crafting different sorts of videos in their own style, yet as that frontier continues to be settled YouTube videos have become more standardized. There’s the catchy title that’s supposed to get the algorithm to convince you to watch the video. There’s the introduction, the body, and the conclusion, demonstrating how so much YouTube content is essentially an extension of the essay. 

And of course, there’s the sponsored content thrown in there for good measure. I admire quite a few of these YouTubers and have a handful that I’ll watch on a regular basis. I even tried publishing short history videos on YouTube a few years ago; they’re still out there, but I found the work needed to get those videos out was simply too much for my production abilities and schedule. This podcast has ended up being a happy medium for me, something that I can write, record, edit, and release in a couple hours on a weekday afternoon. Of all the transformational aspects of our current time, YouTube and podcasts, and the democratization of knowledge that they embody, have to be some of the most critical.

Then there are the shorter videos, pure mind-numbing entertainment. I tend to have a soft spot for cat and dog videos on Instagram, many of which were originally made for TikTok, one platform I continue to avoid. There seem to be a few usual tropes and themes that run through all of these: identical music, identical storylines, say a cat or a dog doing something silly. Then there are the videos that try to express situational emotions, that take the subtext of life and turn it into a loud and proud declaration of what the person on camera is thinking or, more often, feeling. I feel that these sorts of videos are an outgrowth of the memes that I’ve seen on Facebook in particular for over a decade now, memes that often include the horrendously worded phrase “be like…,” as one example. If anything marks out the syntax, the sentence structure, of English internet culture most clearly, it’s the disregard for grammar and the fluidity of English. On the one hand it has a tendency to annoy me, yet on the other hand I recognize that this is likely the development of new forms of English that will be how this language is expressed and used as our current century continues. After all, my own English is the product of both generations of immigrant interpretations of this language and official dictates of varying degrees of linguistic validity.

The one great problem with internet culture is how much content is processed and released at any given moment. After the tenth video using the same song to varying degrees of effectiveness, I get even more annoyed than I already was at the whole conversation underway. This Sunday and Monday, for example, I only lasted half an hour scrolling through Twitter and Instagram before I was annoyed at all the memes trying to interpret excessively diffuse meanings from Will Smith’s altercation with Chris Rock at the Oscars. That’s the beauty of more traditional forms of media: they limit how many voices are speaking at once. As anyone who has sat through endless Zoom calls over the last two years will know, the signal connecting everyone attending can only pick up one voice at a time, and as much as we want to believe we can multitask, that’s simply not the case.

I’ve thought about dropping some of my social media accounts. I’ve been on Facebook, Twitter, and Reddit for a decade now, and on Instagram for almost as long. I recognize the ability of social media to distract from work and, more importantly, from living my own life rather than watching other people and their pets do silly things online. I still see some utility in social media, though: it’s the primary way that I promote this blog and podcast, and I still have thoughtful conversations every so often with other intelligent people over news articles or essays that I’ll post online. There have even been opportunities I’ve taken because I saw an announcement or some other listing online. But compared to the overwhelming cacophony of the internet, and to the things that really make me happy, I’ve come to the conclusion that I’d be happy if I did drop some of my social media accounts.

In short, our ability to communicate without boundaries has expanded far faster than any guidelines for doing so safely and civilly have been put in place. There is so much potential in the internet; we just have to recognize that, like with everything else, we need to keep that space tidy, and that we need to find a balance so we can live full and fruitful lives while enjoying the benefits of this greatest creation of our global world.

Bad Practices in Baseball Broadcasting

Wrigley Field from the press box.

Bad Practices in Baseball Broadcasting Wednesday Blog by Seán Thomas Kane

This week, how baseball broadcasting today is a tale in what not to do. All musical performances included are my own.

As you’ll have gathered from last week’s episode, I’m a big baseball fan. I always have been, and probably always will be. Baseball was the first sport I watched as a kid, the first I played (Kindergarten T-Ball), and the one that I have spent the most time watching both in the stadium and at home on TV. Growing up in the 90s and 2000s baseball was one of a handful of things that were just normal to have on the TV or the radio during the day in the background. No matter which major league city you were in, the local team or teams would probably be on the airwaves on any given Spring or Summer afternoon or evening. 

As a lifelong Cub fan, I was lucky after my parents & I moved to Kansas City in June 1999 to be able to still watch the Cubs live on WGN’s national superstation. Those broadcasts became one real big constant in my young life, something I even introduced to my cousins on occasion during those long summers in the early 2000s when I spent the day at their house. They were all Royals fans first and foremost, but I distinctly remember one particularly exciting game from Wrigley when we all were buzzing with excitement in front of the TV watching an especially close day game, cheering & celebrating when the Cubs won with a walk-off home run in the bottom of the 9th.

That all began to change in the early 2010s, when baseball began to move from the ubiquitous over-the-airwaves channels to special sports channels that you either got through your cable package or that were only available at special request. The Cubs left WGN in 2015, and one of my last day-to-day links with my original hometown went with the last of those broadcasts. I didn’t notice it at first; in September 2015 I moved across the water to London to do a master’s degree in International Relations and Democratic Politics at the University of Westminster, and only rarely got to see baseball at the odd American restaurant in the British capital. When I returned to the U.S. the following year, the Cubs were almost always on MLB Network or one of the other regular baseball broadcasters, mostly ESPN and Fox Sports. It was 2016, the year when the drought was finally lifted (see last week’s episode for an emotional recounting of the night of Game 7). In the following years I was able to see the Cubs fairly regularly on national TV, and the Kansas City Royals, my favorite American League team, on the local Fox Sports Kansas City broadcasts on a daily basis, but as the 2010s ended all that began to change yet again.

Around the same time as the beginning of the pandemic in this country in March 2020, the news broke that the Fox Sports regional networks were being sold off as a part of the Disney acquisition of 21st Century Fox. Federal antitrust regulators ruled that Disney couldn’t control both Fox Sports and ESPN; that’d be a monopoly, so Fox Sports was up for grabs to the highest bidder. That bidder turned out to be Sinclair, America’s shadiest right-wing-owned media conglomerate, the Hearst of the 21st century, the true Charles Foster Kane. I wasn’t happy from the beginning about this; Sinclair had been caught red-handed making their local news anchors read a prepared statement that sounded way too much like propaganda for my liking, and nearly anything they touched seemed to be weaponized to benefit their own ideals and mission. So, when Sinclair announced that the Fox Sports naming rights had been sold to Bally’s, the casino chain, I wasn’t totally surprised. One thing to get out of the way: I’m fine with legalized sports betting; I’m just annoyed with how gaudy and grotesque its advertising often tends to be, and frankly I don’t want any part in it. What frustrated me the most was that as a part of the deal Sinclair decided to get greedy, as robber barons are known to do, and raise their rates to the point that Fox Sports, now Bally Sports, was cut from most TV providers’ channel listings, especially from streaming TV providers like YouTube TV and Sling. For the first time in my life, I couldn’t watch baseball, whether the Cubs or the Royals, on local TV.

I recognize that this isn’t a serious societal problem on its own. Having professional baseball off the airwaves for a good portion of the population isn’t going to cause people to starve or to lose their homes or their jobs; it isn’t a matter of public education or human rights. Compared to those problems this is insignificant. Culturally, though, in a country that is largely isolated from the global sports market, baseball remains our national pastime. It’s something that developed as our country grew, a sport that came into its own after the Civil War with teams that have existed as long as some communities in this country have. My own Chicago Cubs have an old heritage in professional baseball dating back to around 1870. They joined the National League at its founding in 1876 and have stuck around in the same city ever since. For the first 30 to 40 years, professional baseball was seen by spectators in the stands and reported on in the newspapers. In the next few decades, with the invention of radio, it was broadcast around the country alongside sports like boxing to homes and businesses from Atlantic to Pacific. Following World War II, it began to be seen on TV screens, with greats like Jack Brickhouse calling games from the press box at Wrigley Field. Some of my fondest memories of baseball are the most mundane ones, like the times I’d sit in my grandparents’ kitchen watching the Cubs with my grandmother, or the summer evenings in recent years when my parents and I would sit around in our living room in Kansas City watching as the Royals played fun small ball, outwitting heavy-hitting teams with their base running, base stealing, and excellent fielding. Memories like those are what companies like Sinclair are burying deep in the ever-receding past.

A year ago, Sinclair announced they’d have a streaming service ready to go for the 2022 baseball season called Bally Sports+ (because every streaming service is called “so and so +” for some reason). The report said it would cost around $23 a month, or $184 per baseball season, including Spring Training and the playoffs (March-October). This would cater to people like me who don’t have traditional cable packages, owing to their exorbitant prices of around $80-120 a month, and would fill the void that Sinclair themselves created to gouge the market during the 2020 and 2021 seasons. Well, Spring Training has begun for 2022 and Bally Sports+ is nowhere in sight. I was lucky enough on Monday afternoon this week to catch a Royals Spring Training game against the Angels, but that feels more like a chance encounter than a solid resolution to the problem.

One potential solution would be to build off the legal exception that the 1922 Supreme Court case Federal Baseball Club v. National League made for Major League Baseball to be exempt from the Sherman Antitrust Act, the very act whose enforcement opened the door for Sinclair to buy Fox Sports in the first place. In the present case, I’d argue that baseball is an exception to the rule because it’s a deep-rooted part of American culture. As such it ought to be available to watch over-the-air, exempt from any special cable or streaming packages, exempt from being hogged by the greedy arms of broadcasting conglomerates like Sinclair. Unlike the NFL, baseball’s closest competitor, Major League Baseball’s season doesn’t lend itself well to an all-national broadcast schedule. We’re talking about 24 weeks between April and the end of September with 162 games being played by 30 teams, or 2,430 games in a season. A solely national broadcast system like the NFL’s simply wouldn’t work for MLB. Instead, local broadcasts should be prioritized, and broadcasting companies should be incentivized to put the viewers first.
The alternative, if baseball isn’t easy for people to watch on a regular basis, is for the sport to decline in popularity, something that hurts companies like Sinclair anyway. I only hope that Sinclair’s executives realize it’s to their benefit to let people watch baseball at a fair price.

My Dad and I in front of the Wrigley Field Marquee in December 2016

National Mall

Mr. Lincoln
This week, I want to tell you about a trip I took last weekend to Washington, D.C. Links: The Smithsonian's Futures Exhibit: https://aib.si.edu/futures/ The Planetary Society's Sailing the Light documentary premiere live stream: https://youtu.be/NnKsHlV1NhA

Of all the cities in the east, Washington remains my favorite. Its place at the emotional heart of our republic, the center of the Union that my lifelong hero President Lincoln fought to preserve, makes me yet another Mr. Smith every time I return to the capital. This week I made such a trip back to some of my favorite museums, some powerful monuments, and some good weather after months of cold and snow in Upstate New York. I decided that I wanted to make my trip a bit of an adventure and chose to drive down from Binghamton rather than fly, the easier option. This led me on an occasionally tense journey down Interstates 81 and 83 through Pennsylvania and Maryland to the BWI Amtrak station, where I decided to leave my car for the weekend, figuring it’d be better not to try to drive and park in the District if possible.

Arriving in D.C. on the Acela, currently this country’s fastest passenger train, something the train nerd in me specifically chose to do, I had a similar arrival to Jimmy Stewart’s Mr. Smith at Union Station, with its high vaulted ceilings designed by Chicago’s own Daniel Burnham over a century ago. Unlike Smith I didn’t see the Capitol dome from the station, instead looking downward trying to find the nearest Metro station to get to my hotel.

Seeing the monuments at night is always a special treat. As elegant as they are in the daytime, and some like the Vietnam Memorial are better seen under the Sun, there’s a special artistry in seeing the work of sculptors and architects illuminated with floodlights. That’s how I saw the Washington, Lincoln, and MLK Memorials, lit up solemnly. Mr. Lincoln and Dr. King looked as though they were great titans of antiquity in the glow of their memorials’ lights. 

At this time in our history, Lincoln’s struggle to save the Union and end slavery in this country once and for all seems all the more present. In the week since my last post (episode, for those listening), the Russian military has invaded Ukraine. I alluded to those threats last week, but now those threats have become a living nightmare for the Ukrainian people and a great storm cloud over the rest of Europe that threatens to engulf all of humanity. How do we embrace the true and righteous words of Mr. Lincoln to do the right thing and feel no evil towards others, even those like President Putin who have so brutally attacked their neighbors? I don’t have an answer to that question yet, nor am I certain that I ever will. But today’s observance, Ash Wednesday for us Catholics, fits well into this narrative as an annual reminder: “Remember you are dust, and to dust you shall return.”

I spent a good deal of time on Saturday in several of the Smithsonian museums, returning to the Natural History Museum that I visited in July to double check a label for a sloth for my dissertation, and revisiting an old favorite in the Asian Art Museum. I also visited the American History Museum for the first time in over a decade and enjoyed it quite a bit more. The previous time in 2011 it seemed to be sparse in actual history, yet this time I could notice the nuance in the stories it told in the objects on display in what little space it had available.

The most insightful museum visit, though, was to the Futures exhibit currently housed in the Arts and Industries Building on the south side of the Mall next to the Smithsonian Castle. This exhibit, which asks visitors to imagine how our future could improve human life, offered a much-needed antidote to the troubles of the world. There were examples of carbon-neutral and renewable building techniques and materials, electric cars, air taxis, and hyperloops. There was a new model of a space suit that was far less bulky than those used by astronauts today and a model of LightSail 2, a spacecraft sent into orbit by the Planetary Society, a space advocacy organization of which I’m proud to say that I’m a member. There is no one future but many for us to choose from. It’s up to us to determine how we want our future to be written, to be designed, to be imagined.

The National Mall is the emotional heart of this country. It speaks to me of generations of memory, passion, and possibility. On this trip as well though I could imagine myself there in the future, introducing the next generation and later generations to come to that heart, to the ideals and hopes and dreams of this republic. Now at the end of my 20s, my visits to the capital mean something different to me than they did in the last decade. They represent my own future, its infinite possibilities, and how I might be able to do my part, however small it may be, to influence and improve upon our experiences.