Tag Archives: philosophy

The author on a blue background wearing Apple AirPods.

On Machinery



This week, for the penultimate post of the Wednesday Blog, how machinery needs constant maintenance to keep functioning.


I am just old enough to remember life before the ubiquity of computers. I had access to our family computer for as long as I can remember, and to my grandparents’ computer at their condo when we stayed with them in the Northwest Suburbs of Chicago. Yet even then my computer usage was often limited to idle fascination. I did most of my schoolwork by hand through eighth grade, only switching from writing to typing most of my work when I started high school and was issued a MacBook by my school. I do think that a certain degree of whimsy and humanity has faded from daily life as we’ve so fully adopted our ever newly invented technologies. Those machines can do things that in my early childhood would’ve seemed wondrous. Recently, I thought about how, if I did not know how powerful and far-reaching my computer is as a vehicle for my research and general curiosity, I would be happy, delighted in fact, if it could perform just one function: say, looking up any street address in the United States as a device connected to the US Postal Service’s database. That alone would delight me. Yet that is the function of not just one application on my computer but merely one of many functions of several such programs I can load on this device, and I can look up addresses not only in the United States but in any country on this planet.

With the right software downloaded onto this computer I can read any document printed or handwritten in all of human history and leave annotations and highlights without worrying about damaging the original source. Surekha Davies wrote warmly in favor of annotating in her newsletter this week, and I appreciated her take on the matter.[1] In high school, I was a bit of a prude when it came to annotating; I found the summer reading assignments in my freshman and sophomore English classes to be almost repulsive because I see a book as a work of art crafted by its author, editor, and publisher in a very specific way. To annotate, I argued, was like drawing a curlicue mustache on the Mona Lisa, a crude act at best. Because of this I process knowledge from books differently. I now often take photos of individual pages and organize them into albums on my computer which I can then consult if I’m writing about a particular book, in much the same fashion as when I’m in the archive or special collections room looking at a historical text.

All of these images can not only be sorted into my computer’s photo library, stored in the cloud and accessible on my computer and phone alike, but also be merged together into one common PDF file, the main file type I use for storing primary and secondary sources for my research. With advances in artificial intelligence, I can now use the common top-level search feature on my computer to look within files for specific characters, words, or phrases with varying levels of accuracy. This is something that was barely getting off the ground when I started working on my doctorate six years ago, and today it makes my job a lot easier; the file folder containing all of the peer-reviewed articles I’ve used in my research since 2019 alone holds 349 files and is 887.1 MB in size.
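For the technically curious, that merging step can be scripted in a few lines of Python with the Pillow imaging library. This is only a rough sketch of one way such a merge might be done, not my actual workflow, and the folder and file names here are hypothetical stand-ins:

# A minimal sketch: combine photos of book pages into a single PDF.
# Assumes the page photos sit in a folder called "book_pages" (hypothetical name)
# and are numbered so that alphabetical order matches page order.
from pathlib import Path
from PIL import Image

page_files = sorted(Path("book_pages").glob("*.jpg"))
pages = [Image.open(p).convert("RGB") for p in page_files]  # PDF pages need RGB images

if pages:
    # Save the first image as a PDF and append the rest as further pages.
    pages[0].save("merged_book.pdf", save_all=True, append_images=pages[1:])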

Our computers are merely the latest iterations of machines. The first computer, Charles Babbage’s (1791–1871) calculating machine, worked in a fairly similar fashion to our own, albeit built of mechanical levers and gears where ours have intricate electronics. I, like many others, was introduced to Babbage and his difference engine by seeing the one on display in the Science Museum in London. This difference engine was a mechanical calculator intended to compute mathematical functions. Blaise Pascal (1623–1662) and Gottfried Wilhelm Leibniz (1646–1716) both developed similar mechanisms in the seventeenth century, and, older still, the second-century BCE Antikythera mechanism of the Ancient Greeks could complete some of the same functions. Yet across all of these the basic idea that a computer works in mathematical terms remains the same even today. For all the linguistic foundations of computer code, the functions of any machine boil down to the binary operations of ones and zeros. I wrote last year in this blog about my befuddlement that artificial intelligence has largely been created on verbal linguistic models and was only in 2024 being trained on mathematical ones.[2] Yet even then those mathematical models were understood by the A.I. in English, making their computations fluent only in one specific dialect of the universal language of mathematics and their functionality of little use to the vast majority of humanity.

Yet I wonder how true that last statement really is. After all, I, a native English speaker with recent roots in Irish, learned grammar, like many generations of my ancestors, through learning to read and write in Latin. English grammar generally made no sense to me in elementary school; it is, after all, very irregular in a lot of ways. It was only after my introduction to a very orderly language, the one for which our Roman alphabet was first adapted, that I began to understand how English works. The ways in which we understand language in a Western European and American context rely on the classical roots of our pedagogy, influenced in their own time by medieval scholasticism, Renaissance humanism, and Enlightenment notions of the interconnectedness of the individual and society alike. I do not know how many students today in countries around the globe are learning their mathematics through English in order to compete in one of the largest linguistic job markets of our time. All of this may well be rendered moot by the latest technological leap announced by Apple several weeks ago: their new AirPods will include a live translation feature, acting as a sort of Babel Fish or universal translator, depending on which science fiction reference you prefer.

Yet those AirPods will break down eventually. They are physical objects, and nothing which exists in physical space is eternal. Shakespeare wrote it well in The Tempest that

“The solemn temples, the great globe itself,

Yea, all which it inherit, shall dissolve,

And, like this insubstantial pageant faded,

Leave not a rack behind. We are such stuff

As dreams are made on, and our little life

Is rounded with a sleep.” (4.1.170-175)

For our machines to last, they must be maintained, cleaned, given breaks just like the workers who operate them, lest they lose all stamina and face exhaustion most grave. Nothing lasts forever, and the more those things are allowed to rest and recuperate the more they are then able to work to their fullest. So much of our literature from the last few centuries has been about fearing the machines and the threat they pose. If we are made in the Image of God, then machines, our creation, are made in the image of us. They are the products of human invention and reflect back to us ourselves, yet without the emotion that makes us human. Can a machine ever feel emotion? Could HAL 9000 feel fear or sorrow? Could Data feel joy or curiosity? And what of the living beings who in our science fiction retrofitted their bodies with machinery, in some cases to the extent that they became more machine than human? What emotion could they then feel? One of the most tragic reveals for me in Doctor Who was that the Daleks (the Doctor’s main adversaries) are living beings who felt so afraid and threatened that they decided to encase the most vital parts of their physical bodies in wheelchair tanks, shaped like pepper shakers no less, rendering them resilient adversaries for anyone who crossed them. Yet what remained of the being inside? I urge caution with suggestions of the metaverse or other technological advances that draw us further from our lived experiences and more into the computer. These allow us to communicate, yet real human emotion is difficult to express beyond living, breathing, face-to-face interactions.

After a while these machines which have our attention distract us from our lives and render us blind to the world around us. I liked to bring this up when I taught Plato’s allegory of the cave to college freshmen in my Western Civilization class. I would conclude the lesson by remarking that in the twenty-first century we don’t need a cave to isolate ourselves from the real world; all we need is a smartphone and a set of headphones, and nothing else will exist. I tried to make this humorous, in an admittedly dark fashion, by reminding them to at least keep the headphones on a lighter mode so they can hear their surroundings, and to look up from their phone screens when crossing streets lest they find themselves flattened like the proverbial cartoon coyote on the front of a city bus.

If we focus too much on our machines, we lose ourselves in the mechanism; we forget to care for ourselves and attend to our needs. The human body is the blueprint for all human inventions, whether physical ones like the machine or abstract ones like society itself. As I think further about the problems our society faces, I conclude that at the core there is a deep neglect of the human at the heart of everything. I see this in the way that disasters are reported on in the press: often the financial toll is covered before the human cost, clearly demonstrating that the value of the dollar outweighs the value of the human. In surrendering ourselves to our own abstractions and social ideals we lose the potential to change our course, repair the machinery, or update the software to a better version with new security patches and fixes for glitches old and new. In spite of our immense societal wealth, our ever-advancing scientific frontier, and our technological achievements we still haven’t gotten around to solving hunger, illiteracy, or poverty. In spite of our best intentions our worst instincts keep drawing us into wars that only a few of us want.

The Mazda Rua, my car, is getting older, and I expect if I keep driving it for a few years or more it’ll eventually need more and more replacement parts until it becomes a Ship of Theseus; yet is not the idea of a machine the same even if its parts are replaced? That idea is the closest I can come to imagining a machine having a soul as natural things like us have. The Mazda Rua remained the Mazda Rua even after its brakes were replaced in January and its slow-leaking tire was patched in May. As it moves into its second decade, that old friend of mine continues to work in spite of the long drives and all the adventures I’ve put it through.

Our machinery is in desperate need of repair, yet a few see greater profit in dysfunction than they figure they would get if they actually put in the effort, money, and time to fix things. If problems are left unattended for long periods of time, they will eventually lead to mechanical failure. The same is true for the machinery of the body and of the state. Sometimes a good repair is called for: reform to the mechanisms of power which will make the machine work better for its constituent parts. In this moment that need for reform is being met with the advice of a bad mechanic looking more at his bottom line than at the need of the mechanism he’s agreed to repair. Only on this level, the consequences of mechanical failure are dire.


[1] Surekha Davies, “Walter Raleigh’s headless monsters and annotation as thinking,” in Strange and Wonderous: Notes from a Science Historian, (6 October 2025).

[2] “Asking the Computer,” Wednesday Blog 5.26.


The Lotus-Eaters


Photo: Ziziphus lotus, © Juan Valentín CC BY-NC 4.0 https://www.inaturalist.org/photos/427040191. No modifications made. Available under public license.


This week, comparing the benefits of pleasure with the rewards of good work.


A recurring challenge of my life is finding a good work-life balance. Perhaps central to this conundrum is the fact that I simply enjoy the work that I do, so I’m more willing to approach something work-related at all hours because it brings me joy. There are plenty of things that I need to do with my time, and plenty more that I know I will someday accomplish, yet I feel less pressed to push through any weariness or writer’s block to finish a given project today than I have in the past. For most things, I have a wide enough gap leading up to project deadlines that I can afford to work as I will on a given project. This is a luxury of the moment, one which was foreign to me even a year ago, and I know well that the ample time I have now is a singular moment in my life that will likely not come again often. So, as long as I have the time to spend working on the Wednesday Blog and the handful of articles and book chapters that I’m writing, I’ll use that time to the best of my ability.

Each of us operates within the structures of our civilization, and within the cultural edifices built up over millennia that define our very identities. No one exists in true solitude; everyone comes from somewhere. There are plenty of stories of people loosening the burdens of life for splendid abandon. Life is hard for all of us; one of the great unifying factors of the human experience is struggle. I doubt that either the richest or the poorest people alive today are fully happy and content in their present state. There are certainly things I would like to change about my life, things that I’m now approaching with the same resolve that I dedicate to my work. I see that among my family and friends too: such potent dedication to completing tasks difficult and easy alike that, when all is said and done, the doer can rest proud of their work.

Still, there is value to taking time to rest. I’ve developed a bad habit of sitting at my desk until I’m so tired that I can’t sit up straight, or even to the point that I find one eye closing so that I can keep reading with the other. These make for good stories, but they’re bad habits overall. It seems to me like there’s so much to learn and not enough time to take it all in. We Americans are particularly bad at our work-life balance. While we have a strong work ethic in this country, we don’t give ourselves enough time to enjoy the fruits of our labor. I now work at some of the places where I would otherwise go to rest, places like the Kauffman Center for the Performing Arts. When I returned to Kansas City in December 2022, I was a frequent patron of the Kansas City Symphony’s performances, until March 2023 when I signed on as a Team Captain of the Volunteer Usher Corps. Now I work at the Kauffman Center, and while I don’t get to relax and soak in the music there anymore, I’m proud of the work that I do, and I work with people whom I genuinely enjoy being around. In fact, working at the Kauffman Center has magnified the value of my historical research and writing even more. That’s what I love most in all the things that I do because it’s what I’m best at, and it’s through academia that I’ve met some of the people I most admire in all the world. For the last two months, when I singularly devoted my attention to researching, writing, and editing a new and better introduction to my dissertation, I poured all my effort and energy into the task, and the work shows it. Yet I also drained myself of that same strength and realized that the working hours I kept four years ago, when I was reading 12 hours a day in preparation for my comprehensive exams, were no longer tenable. Life moves on, and with the changes in my life my stamina for these sorts of long hours has changed too. I’m doing a lot more now than I was during the height of the pandemic in January, February, and March of 2021. Thus, it’s reasonable to say that I cannot do quite as much of the same things that I once did.

There are times when I can get so caught up in what it is I’m doing in the moment that I miss the world going by. I mourn a little bit how fast 2025 has been for me; there are things I wish I had done in the first half of this year that I failed to do for one reason or another. Often those reasons were out of my control. Yet they remain monuments to things that could have been. In other cases, though, those things are goals from which I turned away after finding better things to pursue. I’ve learned that I must remain open to change, flexible in my ways of living and doing things. How many times have I thought I was done with my dissertation only to be told that there was still more work to do? I know that endeavor defines my career and will continue to do so as long as I’m contributing to the scholarship of Renaissance natural history. Still, at times the idea of abandoning my efforts and falling into a state of rest has its appeal. At this moment, I would appreciate a vacation, even if only 24 hours away from my work. I took some time to enjoy the friendly company of my brother Hibernians and their families, and my Gaeilgeoir friends, this weekend at the Kansas City Irish Fest. It was lovely using that time to be with people whose company I enjoy, yet it was just as great a joy to return to my work this week, and especially, now that I’ve finished this round of work on my dissertation’s introduction, to return to editing my translation of André Thevet’s 1557 book Les Singularitez de la France Antarctique. I had a delightful day spent reading through the Loeb Classical Library and the Perseus database hunting down Thevet’s Greek and Roman references on the geography, ethnography, and zoology of Sub-Saharan Africa.

The legacy of those ancient authors lies heavy on the European perception of their southern neighbors. The Greeks especially perceived Libya, their name for Africa, as the great desert landmass on the southern edge of their world. Thevet wrote that Libya was named by the Greeks for the southwestern wind, or Lips (Λίψ), a notion he got from Aristotle’s book the Situations and Names of Winds.[1] Thus, while Libya was the Greek name for Africa as a whole in antiquity, the fact that the name was associated more with the southwest than the south suggests that their notion of Libya lay west of Egypt, in the general vicinity of the country known today as Libya. Further west along the Mediterranean coast of Africa lay an island where Homer records that Odysseus’s ship made a beachhead, borne by the north wind across what Robert Fagles translates as “the fish-infested sea.” On the tenth day “our squadron reached the land of the Lotus-eaters,” whom Homer described as “people who eat the lotus, mellow fruit and flower.” Odysseus’s crewmen “snatched a meal by the swift ships” and found as “they mingled among the natives” that they “lost all desire” to do their duties

“much less return

their only wish to linger there with the Lotus-eaters,

grazing on lotus, all memory of the journey home

dissolved forever.”[2]

The lotus-eaters of the Odyssey live in bliss induced by the plant. Their worries carried far away, they could bask in the glow of their sun and live out their days in a sense of peace. Yet Odysseus saw in this idyll a great distraction from what must be done: he and his crew still needed to return home to Ithaca. The king in his wisdom continued his story,

“But brought them back, back

To the hollow ships, and streaming tears––I forced them,

Hauled them under the rowing benches, lashed them fast

And shouted out commands to my other, steady comrades:

‘Quick, no time to lose, embark in the racing ships!’––

So none could eat the lotus, forget the voyage home.”[3]

The danger lay less in an immediate threat to life and limb than in a threat to mission, to vocation. Odysseus knew his charge was to shepherd as many of his men home as he could; what a tragedy it was that after all his efforts he returned home alone. The threat of the lotus-eaters lay in their carefree abandonment of the need for self-preservation. Had the King of Ithaca and his men stayed on the island, they would eventually have faded in body and in spirit, dying not in war but by growing stale and wasting away slowly until they had not even their memory to keep alive. Too much of a good thing becomes a bad thing, just as everything changes over the long dance of time.

Moderation then is the best way of living, to do things such that we humans not only survive but thrive in the conditions in which we find ourselves. Aristotle expresses this best in his Nicomachean Ethics: for every sort of action or feeling there is an excess and a deficiency, and between them a mean, which is the moral virtue. Thus, the lotus-eaters lived in a state of self-indulgent excess, born from their love of the lotus plant and the way it can make all their troubles disappear.[4] Aristotle argued that “temperance and profligacy are concerned with those pleasures which man shares with the lower animals, and which consequently appear slavish and bestial.”[5] It is human to have passions, desires, and urges to do one thing over another, yet it is an entirely different thing to give in to those passions and abandon control over one’s own life. I think it is a greater sorrow to give up this control thoughtlessly than it is to have that control taken from you, even if the act of subjugation remains in the eye of the subduer and is only as powerful as society wills it to be. This is something we too often forget: so many of the bad things that go on in our world are things of our own making. We choose to allow rampant gun violence in our country, or to let the institutions of our democracy crumble, or to let people go hungry, die from treatable diseases, and remain illiterate, all because people in positions of power benefit from having others in need. I suspect that we don’t have to live like this. Perhaps the root of these societal woes comes from an understandable inability to comprehend death, that final act of life which often is so very unfair to the dying and those left behind. So long as that greatest inequity exists, why should we bother trying to fix our own problems?

Dear reader, I’ve been writing this Wednesday Blog now for four and a half years, and I’ve always said that my one rule for this publication is that I will end it once it’s no longer fun to write. Just before the pandemic, during a family gathering, one of my uncles remarked that he had no interest in retiring soon because he loves the work he does. This struck me because it explains why I’ve stuck around in academia in spite of all the trouble I’ve been through in these past few years. I do this work because I love it; I write because I enjoy writing, and I’m writing to you today to suggest that we could make our world a better place to live for ourselves and our children and grandchildren who’ll come after us; we just have to leave the island and its lotuses, climb back into our boat, and set out onto the fish-infested sea again. For all that I’ve learned about a great many topics, I still often need reminding to do basic things like stop reading or writing late at night and go to bed. I suspect that’s the case for most of us, that we get caught up in the worries or passions of the day and lose sight of the good things that we can do to really find true peace. Here in the United States, the first big step that we ought to take is to reconsider how we prioritize work to such a degree that it becomes life itself. We ought to work to live, not live to work. On this Labor Day week that’s as good a starting place as any.


[1] André Thevet, Les Singularitez de la France Antarctique, (Antwerp, 1558), 4v; Aristotle, Situations and Names of Winds 973b, 12–13.

[2] Homer, Odyssey 9.106–110, trans. Robert Fagles, (Penguin, 1996), 214.

[3] Homer, Odyssey 9.110–117, trans. Fagles, 214.

[4] Aristotle, Nicomachean Ethics 1118a.

[5] Aristotle, Nicomachean Ethics 1118a, 8.


A macaw

On Skepticism



This week, I express my dismay at how fast time seems to be moving for me of late and how it reflects the existence of various sources of knowledge in our world.


I first noticed the passage of time on my tenth birthday; that is to say, I remember remarking on how, from that day on for the rest of my life, I would no longer be counting my years in single digits. I remember distinctly the feeling of surprise at this, a sense that I could never go back to my earliest years. That was especially poignant for me as those first six years lived in the Chicago suburbs held a nostalgic glow in my memory then as they do now. In those early years I felt that time moved slowly; I remember once as a kid I fretted over a three-minute cooking timer, worrying that I would be unable to stand there and watch the flame over which I was cooking eggs for a full three minutes. Today that sounds silly, yet I believe it is vital to remember how I felt all those years ago lest I lose my empathy with my past self or anyone else I may encounter with similar concerns over things I see as minute.

Soon after my tenth birthday, I found a new method of getting through things that I found tedious or even odious to endure. I realized that if I tricked myself into enjoying the moment, the tedium would pass by quicker than if I wallowed in my annoyance and misery. Perhaps there was a degree of pessimism in this realization: that the good moments don’t seem to last as long as the bad ones in my recollection of things, or that it’s in fact easier to remember the bad more than the good. This is something I’ve been struggling with lately: when I find my thoughts sinking to these depths of my greatest uncertainty and grief, I need to remind myself of all the good in my life. Time seems to move faster today than it did before. The days fly by more than linger, and there’s always something new or old that I need to do. I’ve long thrived on work, a trait I inherited from my parents. Often my happiest days are those spent dedicated to a specific task; those days are made happy by my sense of accomplishment once the task has progressed or even is done. I’ve learned to accept that good things won’t often be finished in a day. I’ll push myself instead to do as much as I feel I can do in the span of a day and see where that leaves me when I go to bed at night. With the new introduction to my dissertation this meant that it took me nine days to write all 105 pages of it. This is one of those times when I feel that I’m on a roll, in my writer’s paradise, when I can write and write and write and not run out of ideas to commit to paper.

Yet I worry about that quicker passage of time because I feel that there are fewer things that I’m able to do in a given day than I would like. I sacrifice rest sometimes in order to see a project to completion, or I choose to try and find a balance between my work and the rest of my life only to see one side or the other overwhelm its counterpart, leaving me feeling unfulfilled when I retire for the night. I do worry that the time I’m afforded is limited, and that I’m not going to do everything I want to undertake. There are plenty of things I want to write, so much I want to say, yet so little time in a given day to say it. I’m still young, just a few weeks over halfway to my 33rd birthday. I have this lingering feeling that there’s so much that I want to do with the life I have and an indeterminate amount of time with which to do those things. Am I content with what I’ve done with my life so far? Yes. Is there so much more I want to do? Absolutely.

I suspect this shock at time moving faster is my own realization of my mortality. Everything has a beginning and an end; the mystery lies in not knowing either terminus directly. How many of us can remember our own birth? I certainly can’t. By the same token we can’t necessarily interview the dead after they’ve shuffled off this mortal coil because, in the words of Dr. McCoy, they’re dead. Thus, we remain doubters of our own mortality, our limits. I often hear older friends talk about how the young feel invincible and immortal and make mistakes which reinforce that sentiment of invincibility, all while, if they’re particularly bold or just unlucky, asserting their mortality with a sudden abandon. Our doubts are aimed at established sources of knowledge, authorities whose precepts we feel no particular duty to abide by, even if we begrudgingly accept them out of bare necessity. I see enough people every day ignore pedestrian crossing lights, even though those lights are there on the city’s authority to protect us pedestrians when crossing the streets that we’ve abdicated to vehicles. It usually leaves me at least frustrated at the ignorance of the driver, at most even angry when I’ve gotten close to being hit by such an ignoramus.

Skepticism is a significant marker in Renaissance studies as a transitional element from the classically inspired scholarship of the fifteenth and sixteenth centuries into the empirical knowledge-making that traditionally we’ve said was emblematic of the Scientific Revolution. I have many colleagues who are working now on disproving the existence of that Scientific Revolution; I admire that cause and yearn to read what they’re writing, even though one of my stock courses to teach is called “the Scientific Revolution: 1500-1800.” Ada Palmer calls Michel de Montaigne, in some ways the inspiration for my Wednesday Blog, “the avatar of this moment” when skepticism became a driving force in Renaissance thought.[1] I argue in my dissertation that the American experience drove the course of skeptical thought in the Renaissance; all the things which André Thevet called singular in the Americas represented a dramatic break from classical standards of knowledge which required a new epistemology to explain them.[2] The key here is that we should never be complacent that our current knowledge is all there is to know; after all, a well-lived life is a life spent learning. I’m skeptical about many things and have a drive to continue learning, to continue exploring. Curiosity hasn’t killed this cat yet.[3]

I find then that my time is best spent in pursuit of this knowledge, and as much as one can learn alone in the solitude of one’s study, reading and thinking quietly to oneself like a monk, it is far better to learn in communion with others. Since the pandemic began, I’ve grown particularly fond of Zoom lectures, webinars, and workshops, as much for the expertise on show as for the community they build. Even if we only communicate through these digital media I still look forward to seeing these people, to experiencing that one part of life with them. We learn so that we might have richer experiences of our own lives, so that we might find comfort in our knowledge, so that we might, in Bill Nye’s words, “change the world.” In the time afforded to me I want to learn more than anything else: to learn about the people around me, about our common heritage, about what our future may hold, and about myself. If I can do that, then when I am “no more, cease to be, expired and gone to meet my maker, become a stiff, bereft of life and resting in peace” I’ll be content in my leave-taking. Hopefully, unlike the dead parrot, they won’t nail me to my perch like Bentham’s auto-icon, which greets knowledge-seekers in the South Cloisters of University College London, though that could be a rather humorous way to go.


[1] Ada Palmer, Inventing the Renaissance: The Myth of a Golden Age, (University of Chicago Press, 2025), 603.

[2] If this word epistemology leaves you confused, have no fear, for my own benefit as well I wrote a blog post explaining this word alongside two of its compatriots. “Three Ologies,” Wednesday Blog 6.6.

[3] Meow.


A figure from Raphael's "The School of Athens" variously identified as Francesco Maria della Rovere, Pico della Mirandola, or Hypatia of Alexandria.

On Knowledge

Art: Raphael, The School of Athens (1509–1511), Apostolic Palace, Vatican Museums, Vatican City. Public Domain.


This week, I want to address how we recognize knowledge in comparison to the various fields of inquiry through which we refine our understanding of things.


Lately my work has been dedicated to a thorough review of the historiography within which I’m grounding my dissertation. I wrote about this two weeks ago in an essay titled “On Writing.”[1] My research is historical, yet it touches on secondary literature which operates within various fields within the discipline of history. These include Renaissance history, and its larger sibling early modern history, the history of cartography, the history of animals, the history of botany, and more broadly the history of early modern science. Methodologically, I owe a great deal to two great twentieth-century Francophone anthropologists, Alfred Métraux (1902–1963) and Claude Lévi-Strauss (1908–2009). While Métraux and Lévi-Strauss aren’t considered directly in the historiographic section of the new introduction that I’m writing for my dissertation, which is limited to sources published since the millennium, they nevertheless stand tall in the background of my history.

Today we often talk within academia about a desire for interdisciplinarity in our work and our research. We’ve found ourselves too narrowed by our ever-shrinking fields and seek greener common pastures for grazing, as our intellectual and pastoral ancestors alike once knew. In my case, this interdisciplinarity lies more in my efforts to incorporate historical zoology into my work, an approach which seeks to use zoological methodology and theory to explain historical animals. I have friends who study many things. Among them is one whose passion for history, classics, and mathematics has come together to craft a dissertation which seeks to demonstrate the intersections between those three to better understand the great transitions in human inquiry. Another seeks to follow the medical connections across oceans between disparate regions in the Americas and Europe, connections that nevertheless existed even if they seem remarkable today. Still more, I have a friend who applies basic economic need to explain a complex diplomatic situation that once existed between the Venetian Republic and the Ottoman Empire in the Adriatic Sea. All of these historians of whom I write are applying a degree of interdisciplinarity to their work that reflects their own disparate interests and curiosities. In early modern history we talk about curiosities as objects which were collected from disparate and exotic lands into cabinets to display the erudite collector’s prestige and wealth. I say our curiosity is something to be collected by those worthy archives, libraries, museums, or universities that will employ us in the near future, and for us to feed with new ideas and avenues of investigation so that we will never be bored with life.

In all of these things, there is an underlying genre of knowledge which I am addressing. I’ve written thus far about history alone, yet it is the same for the anthropologists, astronomers, planetary scientists, and physicists whom I know. Likewise for the literature scholars and the linguists. Our fields of inquiry all grow on the same planet that comprises our collected knowledge. In English, this word knowledge is somewhat nebulous. To me, it says that we know things broad or specific. In London, for instance, the Knowledge is the series of tests which new cabbies must complete in order to learn every street within a certain radius of Charing Cross. The Latin translation of this word, scientia, makes things even more complicated, as that is the root of the English word science. Thus, when we refer to Renaissance science, there is always a caveat in the following sentence explaining that “this is not science as we know it but a sort of protoscience.” I was advised, similarly, after a particularly poorly received presentation at a workshop at the Museum of Natural Sciences in Brussels in October 2023 that I shouldn’t refer to “sixteenth-century conservation” because no such concept existed at the time; instead, it would be better to discuss a “genealogy of conservation.” This sense that modern terms, in use since the Enlightenment of the eighteenth century, ought not to be pulled further back into the past loses, I think, some of the provenance of those terms and of how the Enlightenment philosophes first came across them.

I find it telling that the Ancient Greek translation of knowledge, γνῶσις (gnôsis), is a word with which I’m more familiar from theology and the concept of Gnosticism, whereas scientia reminds me of philosophy and the other fields of inquiry which grew from that particular branch of the tree of human curiosity. One might even say that philosophy and theology are a pair, siblings perhaps? They seek to understand similar things: on the one hand an inquiry into thought, and ideally wisdom, and on the other a search for the nature of the Divine, which at least in my Catholicism we can know because we are made in the Image of God. The division here between the Ancient Greek term being affiliated with faith and the Latin one with reason speaks, I think, to the Latin roots of my own education in Catholic schools and at a Jesuit university, where I learned about Plato and Aristotle, yet I recognized Aristotle’s Historia animalium (History of Animals) by its Latin name, by which it was generally known in Western Europe for centuries before the rise of vernacular scholarship, rather than by its Greek original Τῶν περὶ τὰ ζῷα ἱστοριῶν (Ton peri ta zoia historion). Note that the English translation of this title, History of Animals, reflects the Latin cognate of ἱστοριῶν rather than the better English translation of that Greek word, Inquiry.

Added onto these classical etymologies, in my first-semester Historiography class at Binghamton University I was introduced to the German translation of scientia, γνῶσις, and knowledge: Wissenschaft. The word struck me immediately because I saw the German cognate for the English word wizard in its prefix, and because I knew that the -schaft suffix tends to translate into English as -ship. Thus, my rough Anglicization of Wissenschaft renders Wizardship, which is rather nifty. Yet this word Wissenschaft was instead understood in the nineteenth century as a general word which could be translated into English as science. This is important for us historians trained in the United States because our own historiographic tradition, that is, our national school of historians, traces its roots back to German universities in the early and middle decades of the nineteenth century. I remember long sessions of my historiography class at UMKC discussing the works of Leopold von Ranke (1795–1886), the father of research-based history. I felt a sense that this concept of Wissenschaft seemed relatable, and as it turned out that was because Irish has a similar concept.

Whereas in English we tack the suffix -ology onto any word to make it the study of that word, in Irish you add the suffix -ocht. So, geology is geolaíocht and biology is bitheolaíocht. Yet note with the second example that the suffix is not just -ocht but an entire word, eolaíocht. This is the Irish word for science, added onto the end of bitheolaíocht to demonstrate that this word refers to the study of bith-, a combining form of the word beatha, meaning life. So, biology then is the science of life itself. Powerful stuff. I appreciate that Irish linguists and scholars have sought overall to preserve our language’s own consistency with its scientific terminology. It means that these fields of study, these areas of knowledge, can exist purely within the purview of the Irish language without any extra need to recognize that their prefixes or suffixes come from Latin, Greek, or English. There are some exceptions of course: take zó-eolaíocht, the Irish word for zoology, which effectively adopts the Greek word ζῷον, perhaps through the English zoo, into Irish. Would it not have been just as easy for whoever devised this hyphenated word to instead write ainmhíeolaíocht, translated into English as the science of animals? Here, though, I see more influence from English, because this language adopts as much as it can from other languages out of prestige and a desire for translingual communicability. As an English speaker, I often find scholarly works easier to read because we share common etymologies for our words relating to knowledge. English’s science, geology, biology, and zoology are French’s science, géologie, biologie, and zoologie. In English, we drop any pretense of Englishness to clothe ourselves in a common mantle familiar to colleagues from related cultures around the globe. In academia this is to our mutual benefit; after all, so much of our work is international. I’m regularly on webinars and Zoom calls with colleagues in Europe, for instance. I believe this is the lingering spirit of the old scholarly preference for Latin as a lingua franca, which at least to me seems close enough in the past to be tangible, yet realistically it has surely been a very long time since any serious scholarly work beyond classics was published in Latin for the benefit of a broad translingual readership.

I for one admire the Irish word eolaíocht and its root eolas, which translates into English as knowledge, that is, an awareness of things, because eolaíocht represents a universal concept while retaining its own native nature. So often in my research I am discussing the early assimilation of indigenous cosmovisions, to borrow a Spanish word put to good use by Surekha Davies in her latest book, into the nascent global world centered on Europe.[2] I see how these cosmic conceptions faded until they were rendered in Gothic or Latin letters on the voluminous pages of encyclopedic Renaissance general and natural histories, which remain among the most often cited primary sources for these indigenous cultures, cultures who, Marcy Norton argued in her 2024 book The Tame and the Wild: People and Animals After 1492, had their own classical past made remote from their colonial present by European contact, conquest, and colonization.[3] Seeing these indigenous perspectives fade into their categorized and classified statuses within the cosmos defined by Europe’s natural philosophers, I feel fortunate that my own diaspora (which was also colonized) has retained this element of our individual perspective.

I first came across the -ocht suffix in the word poblacht, the Irish word for republic. A famous story from the birth of the Irish Free State during the Anglo-Irish Treaty negotiations in 1921 tells of British Prime Minister David Lloyd George, a Welsh speaker, remarking to Michael Collins, an Irish speaker, that their choice of a republic was unusual because none of the Celtic languages naturally have a word for republic. That word evokes its Roman roots in the ancient Res publica Romana, the Roman Republic, whose northward expansion across the Alps led to the gradual death of the Continental Celtic languages, whose speakers’ descendants today are largely the Western Romance speakers of French, Romansh, Occitan, Catalan, Spanish, Galician, and Portuguese, among others. Romance languages are noted for their common descent from Latin, whence they all derive variations on the Latin word scientia; English gets science through Old French. “How are you going to name your new government in the Irish language?” Lloyd George asked. Collins replied something along the lines of “a kingdom is called a ríocht, so this government of the people (pobal) will be called a poblacht.” Thus, the Republic of Ireland is named in Irish Poblacht na hÉireann. Naturally, this word pobal derives from the Latin populus, so the shadow of Rome hovers even over unconquered Hibernia. Yet that is another topic for a different essay.

Let me conclude with a comment on the difference between knowledge and wisdom, as I see it. The former is far more tangible. We can know things through learning, embodied best in living and in reading. I know, for instance, to look both ways before crossing a street because plenty of people in the last 140 years have been hit by cars, buses, and trucks, and you can never be too careful. Likewise, I know everything I do about the things I study through reading what others have written about these topics. It’s my job then to say what I will. In Whitman’s words made immortal by our recitation, the answer to the eternal question is “that the powerful play goes on, and you may contribute a verse.” That’s history, people! Reading the powerful play of what others have written and summoning up the courage to take the podium and have your say. I first heard this particular poem, as did many in my generation, recited by Robin Williams in the 1989 film Dead Poets Society. Knowledge is the recitation of these facts we’ve learned. Wisdom is understanding how these facts fit together and speak to our common humanity. What makes us human? I believe it’s as much what we know as what we remain ignorant of. Our ignorance isn’t always a curse; rather, it’s another foggy field we’ve yet to inquire about, a place where someone’s curiosity will surely thrive someday. It is another evocation of eolas still to come in our long human story. How wondrous is that?


[1] “On Writing,” Wednesday Blog 6.27.

[2] Surekha Davies, Humans: A Monstrous History, (University of California Press, 2025).

[3] Marcy Norton, The Tame and the Wild: People and Animals After 1492, (Harvard University Press, 2024), 307.


Montaigne and the Ages of Life




This week, reflections on Michel de Montaigne’s perception of his changing character throughout his life.


I’m currently reading Philippe Desan’s biography of Michel de Montaigne, the French philosopher and statesman and the father of the essay. Montaigne is an influence for me in how the Wednesday Blog has developed over the last four years that I’ve been writing this weekly. He is also one of the figures in the orbit of my dissertation, and one of the most important sources for critical analysis of the events which I describe in that doctoral work. Philippe Desan, in turn, is one of the most prolific Montaigne scholars of our time, if not the most prolific. So, it’s been a delight to read his biography of this man whom I’ve gotten to know, however faintly, through the frame of his Essays in my research.

Most of my work deals with his famous essay “On the Cannibals,” found in Volume 1 of that three-volume collection. “Des cannibales,” as it’s known in its original French, was published in the first collection of Montaigne’s essays in 1580, and it’s this collection in which I’ve been the most invested in my work. The cannibals of Montaigne’s focus speak to questions of humanity and human dignity which I pose in my dissertation, which is titled “Understanding the Sauvage in André Thevet’s Brazil: 1555-1590.”

Yet it is in the third volume of the Essays where Desan established a crucial connection between Montaigne the man and Montaigne the humanist of the late Renaissance, preserved in the amber of his words. In the essay titled “On Vanity” Montaigne poses a fascinating self-reflection, looking back at his life as he remembered it and at who he was at the time he wrote that particular essay near the end of his days. Quoting here from Donald Frame’s 1965 translation, Montaigne wrote that in the years since he published his first edition of essays in 1580 “I have grown older by a long stretch of time; but certainly I have not grown an inch wiser.” Here, whether out of humility or in refutation of Aristotle’s maxim that age and experience beget wisdom, Montaigne sees himself in the same light as before. Despite this, Montaigne continued to observe that “myself now and myself a while ago are indeed two; but when better, I simply cannot say.”[1] It struck me that the essayist could see such a simple yet profound difference between himself as he was when he first wrote and published his magnum opus and the man he was when publishing his third and final volume of essays nearly a decade later.

From my earliest days of extensive writing in my high school years I found myself looking ahead to a time late in my life when I would return to the places of my teenage youth and reflect on what once was and who I’d become. I suppose there’s some vanity of my own in having this profound sense of legacy even from what was then quite an early point in my life. Still, in recent weeks I’ve been reintroduced to younger versions of myself as my family carries out a spring cleaning and we’ve found decades-old boxes of photographs and postcards that I still remember taking and sending yet which haven’t seen the light of day since their capture. I was humbled and heartened to see in particular how loved was the boy I once was, and how inventive and imaginative he could be. Looking at these photos, especially from around my family’s great move from Chicago to Kansas City in 1999, I remember each and every one of them being taken. I remember the sights and sounds, the smells, the prairie winds, and the things I was thinking in those first days of my life in Kansas City. These memories have always been there in my mind, yet the subsequent quarter-century has piled many more atop them so that they are now rendered foundations for the memories that comprise me today.

I suspect these days spent poring over decades-old photos removed some sort of mental block I’d put up out of stress that’s kept my imagination in check in recent months. I longed to have the same expansive dreams and wandering thoughts that’ve populated so much of my consciousness, and now again I find it easy to tap into that deep reservoir which too is built into my memories yet also grows out from them into things which are wondrous and extend beyond the limits of reality toward the possible. Am I then wiser than I was when first I began writing essays in my adolescent and early teenage years? I’d like to think so, at least in some respects. I have a sense of calm today which was lacking in earlier years, and while the stresses of my life are great, as they are for all of us, I know how to accept them and tamp down some of their effects.

Yet in so many ways I do feel that I too am a different person from the kid who moved west all those years ago. Likewise, I see a clear distinction between the student starting high school in the years after the turn of the millennium and imagining his future in the last decades of this century. I’ve learned to live more in the moment in which I find myself, to influence that moment to fit what I aspire it to be. A complex turn of this answer is to consider all the potential lives I might have led, a thought experiment which I’ve considered developing into a short story with some sort of science fiction shenanigans. In one version of this, an elevator breaking down simultaneously across parallel realities, as a sort of mirror image, results in contemporary alternative versions of myself ending up stuck in the same elevator all at once. I could see it either being a bit of a laugh-fest as one version of myself attempted to outwit the others, or a simmering cauldron of irritation.

What all this speaks to is the complexity of our personalities. We are all multifaceted, with so many different competing thoughts and desires and inclinations and perceptions. I’ve thought more recently that perhaps my academic career would be further along if I limited myself to focusing only on my research, yet then again, I’ve always had multiple irons in the fire, so why would I stop doing all these different things now? The Wednesday Blog for one remains a sort of release valve for me to write about things which I’m curious about yet which don’t directly relate to my research. I look to my colleagues, and I see people with similar interests and in some cases similar paths they’ve taken to get to where they are today. Several days ago, when I was dwelling in a particular bout of melancholy, thinking about the long winter that has grayed the skies over my own doctoral candidacy compared to my peers’, I felt a sense of pride at noticing just how I’ve persisted in my efforts and my work in spite of all the challenges which the last six years have brought. Perhaps it is this combination of trial and hope which forms a person; it’s what formed me into the historian I am today.

When I started writing the Wednesday Blog in March 2021 I did so because I felt such a profound sense of nostalgic hope at one particular memory that surfaced after a sleepless night amid my comprehensive exam studying that I felt compelled to share it with the world. I know for a fact that I am a different person today than I was four years ago when I wrote that blog post about an Air France commercial I remembered seeing on ITV and Channel 4 five years before when I lived in London. The difference lies in the added layers of experience laid by all the trials which I’ve endured and the hopes which’ve kept me going. When I had such tremendous trouble unlocking my imagination and letting myself daydream in the latter months of 2024, I recognized that I am happier when I allow my mind to wander and craft stories that no one else will ever know. These are often stories of the future I hope I might live and the wonders I might come to know and explore. That imagination, that connection with my own consciousness, is the thread that runs all throughout my life and connects these different versions of myself that I’ve grown into and out of with the passing of time.

When Montaigne picked up a copy of the 1588 edition of his Essais, containing all three volumes of musings, he took a pen to it and steadily began correcting things he found beneath the standards he’d developed at that late moment in his life. I don’t often read my own writing after I’ve finished editing a document. I’ll occasionally return to an old blog post when I’m referencing it in a newer one, and, even more occasionally, if I’ve cited a source before in a previous paper, I’ll open that paper to aid me in citing the same source again in the research project of the moment. Yet I rarely sit down just to read my own writing. The last time I did, I ended up switching from a PDF file back to the Word document version so I could edit as I read. In fact, when I was moving into my apartment in Binghamton in August 2019, I found an essay I wrote in my sophomore year of high school when I was 15 years old. It was a nearly 20-page essay that attempted to summarize the history of religion in Britain and Ireland from the Stone Age to St. Patrick. Reading it then at the start of my doctorate and thinking about it as an essay that I might grade, I would’ve given it a low B- or maybe a C+.

I need to remember that my old writing fits into a particular time and place in my life and ought to remain in that setting for as long as I can muster the strength not to try to refine it further. These ages in my life mirror those in everyone else’s, and I hope that as I dream about the ages to come, I will be able to share them and live them to their fullest potential. Montaigne died in September 1592, almost 400 years before my own birth. By that point, he’d made his name in politics and in philosophy. The Wednesday Blog is essentially my collection of essays of varying length and quality. I hope that when I wander off in my own time, my life in all its ages will have been as fulfilled and prolific as that of the great essayist.


[1] Montaigne, Essais (EB) 3.9.433r, Frame, 736.


Three Ologies

This week, talking through three terms I’ve historically had trouble understanding: epistemology, ontology, and teleology.—Click here to support the Wednesday Blog: https://www.patreon.com/sthosdkane


This week, talking through three terms I’ve historically had trouble understanding.


A major turning point in my life came at the end of 2014 when I decided to drop my philosophy major to a minor and not take the final class that I needed to complete that major. The class in question was Continental Philosophy, and it remains one of those decisions that I regret because it closed some doors for me in the long run even while it seemed like a reasonable decision in the short term. A year later, now working on my master’s degree in International Relations and Democratic Politics at the University of Westminster, I was reminded daily that I really should’ve just taken that last class because so much of what we were studying was based in continental philosophy.

I initially pursued a triple major in History, Philosophy, and Theology and a double minor in French and Music at Rockhurst University. I was quite proud of the fact that, through my seven completed semesters at Rockhurst, I’d been able to juggle those three majors and the two minors while still having an active and fulfilling social life on campus. I went into Rockhurst with several vague ideas of what I might want to do with these degrees when I was finished; notably, I remember both considering doing a Ph.D., likely in History, and possibly going from Rockhurst either into the Jesuit novitiate or into a diocesan Catholic seminary to become a priest. The first four years of Catholic seminary consist of that philosophy bachelor’s degree, so it felt like a good idea to undertake it at Rockhurst and keep the door open.

Now, ten years after I would’ve finished my undergraduate degree with that philosophy major, I realize that even as I continued to consider holy orders, I may well have begun to close that door in my early twenties, not feeling that the priesthood was the right fit for me in spite of what many people have said. Even then, most of the other professions I’ve considered have been shrinking in one way or another in my lifetime. It feels here, as in so many other aspects of my life, that I was born at a high point in our society’s capacity to consider the arts, the humanities, and even the sciences, and that as I’ve gotten older that capacity has diminished time and again. Even while I continue to be frustrated to remain in these wilderness years, I nevertheless continue to learn and to grow in my understanding of what is possible for me to do in my career.

In the last seven years I’ve reasserted myself as a historian first and foremost, settling into the Renaissance as my period of study in late 2017 and gradually shifting from the history of Englishwomen’s education to the history of translation to now the history of natural history. Yet all of these subjects lie under the common umbrella of intellectual history. The history I craft tends to speak toward French notions of mentalités and perception, while the economics I still occasionally encounter in my work speaks to Max Weber’s notions of capitalism as a broader Cross-Channel enterprise including Brittany and Normandy alongside England, Picardy, Flanders, and the Dutch Republic. I’m beginning to try out a new method of writing history that draws on the natural sciences to better understand the animals and other natural things described by my Renaissance cosmographers and natural historians.

Amid all of this, three words continue to appear, three words whose meanings I have often had trouble remembering. These three are epistemology, ontology, and teleology. In spite of my training in Ancient Greek, I still have trouble keeping them apart. They represent three central tenets of philosophy which help make sense of how we understand things. It may not sound like the strongest topic for a riveting podcast episode, but for those of you listening, bear with me.

Descartes’s tomb, photo by the author.

Epistemology is the theory of knowledge. It distinguishes justified belief from mere opinion. This theory of knowledge considers propositions about facts, the practices which form knowledge, and the familiarity with an object that allows a subject to know it. The Greek word episteme (ἐπιστήμη) translates into English as both knowledge and science. Science itself is a word which at its core refers to knowledge, for the root Latin verb sciō means “to know.” We know, for instance, that we exist because we can recognize our existence, in Descartes’s famous words, “I think, therefore I am.” I made a point of visiting Descartes’s tomb in the Abbey Church of St. Germain-des-Prés when I was in Paris in October 2023 because so much of my own philosophy is Cartesian in its origins. I reject the suggestion that we could be living in a simulation on the grounds that nothing we can know or perceive inclines us to accept it.

The second of these words is ontology, a branch of metaphysics dealing with the nature of being. The word derives from the present participle of the Greek to-be verb εἰμί. I stand by my assertion that the life we are living is real, and we can recognize it as such in large part because the best explanation I’ve found for the course our history has taken relies on us having the freedom to decide the courses of our own lives. That free will explains how a society can seem to take steps backward even while the chaos those retreats cause is to the society’s detriment. The method I am developing in my research to understand the nature of historical animals using modern scientific research is ontological in character. I can test whether this method will work by applying it to particular individual animals who appear in the historical record, eliminating candidate species one by one until only the animal’s own species remains. In this search for the nature of these animals I hope to show that the historical past, before the development of the scientific method in the seventeenth century, is valuable to the natural sciences as a means of understanding the longer-term nature of other animals during the period in which human influence upon nature was growing toward the Anthropocene in which we find ourselves today.
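
To make the shape of that elimination concrete, here is a minimal sketch, in Python and purely for illustration; the animal description, the traits, and the candidate species below are assumptions of mine, not drawn from my actual sources or workflow.

# A hedged sketch: strike out any candidate species whose expected traits
# conflict with what a (hypothetical) Renaissance description records.
# All of the data here is illustrative.

described_traits = {"marine", "tusked", "herbivorous"}  # traits in the account

candidates = {
    "walrus":  {"marine", "tusked", "carnivorous"},
    "manatee": {"marine", "herbivorous"},
    "dugong":  {"marine", "tusked", "herbivorous"},
}

# Keep only candidates that exhibit every trait the source describes;
# whatever survives is the pool the historian still has to weigh by hand.
surviving = {
    name
    for name, traits in candidates.items()
    if described_traits <= traits
}

print(sorted(surviving))  # -> ['dugong']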

I like to think of ontology in the linguistic context of how the copular verb to be appears in our literature. Think, for instance, of how God is identified in the Bible. In the story of the burning bush, the Divine is named “I Am that I Am,” the purest expression of existence. For this reason, when I was an undergraduate in my theology major, I began to refer to God as the Divine Essence, owing to the root of essence in the Latin copular verb. English hangs far more states of being directly on the verb to be than Irish does. Where in English I might say “I am sad,” in Irish I would say “sadness is upon me,” or “Tá brón orm.”

The third of these words is teleology. This is the explanation of phenomena in terms of the purpose they serve rather than of the causes by which they arise. Τέλος (telos) is the Greek word for an end, an aim, or a goal. The purpose of something’s existence, then, is the focus here. I do question the idea that we have a specific purpose in life, perhaps because mine has not gone quite how I expected. In my Catholicism, the most teleological concept we retain is the idea of a vocation, whether to holy orders, to marriage, or to the single life. The teleology at play here speaks to some sense of destiny, which I feel stands in opposition to our free will. Perhaps there is some purpose to life, set at its initial conception in the first moments that matter began to form in the void that became our Universe, yet I do not believe that I can perceive any intended influence beyond the flick of the first domino at the Big Bang. We may not even be sure that the Big Bang was the beginning of everything; after all, there had to be energy built up to cause such a tremendous explosion in the first place. In a theological view I would point to the Incarnation of Jesus as an example of telos in our history; I am a Catholic after all. My lingering question is where that theological teleology should interact with the other ways of knowing.

I’ve written here before about my view that belief and knowledge are two distinct yet interrelated things. One must believe in one’s senses to know, yet there are things which one can believe without knowing and which one cannot know without believing. The prime example of this is God; “I believe in One God” is something I say every week at Mass in the Creed, “Credo in unum Deum” in the Latin original of our Roman Missal. Yet God alone is a tremendous challenge to know, because God is both paradoxical and far greater than the extent of my knowledge. For this reason we had the Incarnation, as we recite in the Creed:

“I believe in one Lord Jesus Christ,

The Only Begotten Son of God,

Born of the Father before all ages.”

For God to be knowable, God needed to come down to our human level in the person of Jesus, God the Son. This was Jesus’s telos: to be known, to be heard, and, as we believe, to restore faith in God and cleanse humanity of original sin. Here there is a collision of belief and knowledge; something clearly happened about 2,000 years ago, because a new profusion of faith occurred, beginning in Judaea and spreading around the Mediterranean World of the Roman Empire and beyond to become Christianity. That new religion adapted to fit the cultures it encountered, so as to be more acceptable to its new converts. Today that collision continues in the Eucharist, the most sacred of all seven sacraments, in which we Catholics, alongside our Orthodox brothers and sisters, believe that God becomes flesh again in the sacramental bread and wine. Can we know that it happens? Not by any scientific measure, yet something does happen. That something is perceptible through belief, and it is the Great Mystery of the Faith that has kept me in the Catholic Church in spite of the ecclesiastical politics and divisions of our time.

My Irish Gaelic ancestors understood Christianity in their own way, aspects of which survive into the present day. That collision of belief and knowledge shows in some lingering folk belief, or superstition if you will, that I’ve inherited about particular days in the calendar when the worlds of the living and the dead draw close. We see this most clearly in the old Gaelic calendar on Samhain, which developed through Catholicism into Halloween, the Day of the Dead, and All Souls’ Day around the beginning of November. I see All Saints’ Day fitting into this as well; after all, the Saints are our honored dead all the same. Likewise, Bealtaine, the celebration of the coming of Summer at the beginning of May, coincides with the Catholic celebration of the Crowning of Mary, something I attended at Rockhurst on several occasions.

What in all of this can I actually know? I know the stories that have survived from before St. Patrick and the coming of Christianity to my ancestors 15 centuries ago, even if those stories are Christianized in some way or another. I know this just as much as I know that Jesus existed in the first century CE, because the effects of these stories appear in the lives and histories remembered down the generations. If these stories have any teleology, it’s to teach us the lessons about life that our ancestors learned so that we might not have to face the same trouble all over again. The folly of humanity is that we are resistant to having a clear purpose or end to our aims. Through our free will we know that there are always many options to choose between.

I don’t know if I made the right choice in dropping that philosophy major at the last moment. In many respects, it was a poor decision. I learned from that experience and many others in my early life to stick with things until their conclusion. That lesson has been tested enough to grow beyond mere opinion, through belief, into something verifiable. When I look at my prospects in my doctoral program, I always decide to stick with it because I don’t yet know what my prospects will be like once I’ve earned the degree, whereas I do know what they are with two master’s degrees and a bachelor’s degree to my name. I have gained a great deal of epistemic experience through all these memories that have informed the nature of my character. Yet where they lead I cannot say, for the purpose of my life is something I continue to decide day by day.


Galileo Galilei pictured in his early 40s c. 1600.

Return to Normalcy

Over the last week, I've been thinking about the standards we define to cast a model of normality, or in an older term normalcy. This week then, I try to answer the question of what even is normal? — Click here to support the Wednesday Blog: https://www.patreon.com/sthosdkane


Over the last week, I’ve been thinking about the standards we define to cast a model of normality, or in an older term normalcy. This week then, I try to answer the question of what even is normal?


One of the great moments of realization in my life to date came when it occurred to me that everything we know exists in our minds in relation to other things, which is to say that nothing exists in true isolation. The solar eclipse I wrote about last week was phenomenal because it stood in sharp contrast to what we usually perceive of the Sun: its warmth, and a brightness which both ensures the longevity of life and can fry anything that stares at it for too long. So too, we often recognize the people around us in contrast to ourselves. Everyone else is different in the ways they walk, the ways they talk, the ways they think and feel. We are our own control in the great experiment of living our lives, the Sun around which all the planets of our solar system orbit.

There is a great hubris in this realization. As a Jesuit ethics professor at Loyola said to my Mom’s class one day, in a story she often recounts, no one acts selflessly; there’s always a motive for the things we do. That motive seems to be driven in part by our desire to understand how different things work, how operations can function outside the norm of our own preferences or how we would organize them. I might prefer to sort the books on my shelves by genre, subgenre, and then author; history would have its own shelf, with the history of astronomy in its own quadrant of that shelf and Stillman Drake’s histories of Galileo set before David Wootton’s Galileo: Watcher of the Skies. Yet at the same time I could choose to add another sublevel of organization in which each author’s titles are displayed not alphabetically but by publication date. So Stillman Drake’s Two New Sciences of 1974 would be placed before his 1978 book Galileo at Work.
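
For what it’s worth, that layered scheme maps neatly onto a multi-level sort key. Below is a minimal sketch, in Python and purely for illustration, using the titles mentioned above; the 2010 date I’ve attached to Wootton’s book is my own addition.

# Sorting a small shelf by genre, then author, then publication year,
# mirroring the layered scheme described above.
books = [
    ("History of astronomy", "Wootton, David", 2010, "Galileo: Watcher of the Skies"),
    ("History of astronomy", "Drake, Stillman", 1978, "Galileo at Work"),
    ("History of astronomy", "Drake, Stillman", 1974, "Two New Sciences"),
]

# A tuple key compares element by element: genre first, author second,
# year last, so Drake's 1974 volume lands before his 1978 one.
shelf_order = sorted(books, key=lambda b: (b[0], b[1], b[2]))

for genre, author, year, title in shelf_order:
    print(f"{author} ({year}): {title}")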

This shelving example may seem minor, yet one can find greater divergence in book sorting than these small changes here or there. My favorite London bookseller, Foyle’s on Charing Cross Road, was famous for many decades for the eccentricity of organizing the books on its shelves by genre, and then not by author but by publisher. This way, all the black Penguin spines would sit side by side, giving a uniform look to the shelves of that establishment. It is pleasant to go into Foyle’s today and see on the third floor all the volumes of Harvard’s Loeb Classical Library side by side, the green Greek volumes contrasting with the red Roman ones on the shelves below. To have books organized by publisher when the average reader is more interested in searching for a particular author seems silly, yet that was the norm in Foyle’s for a long time, until the current ownership purchased the business.

Our normal is so remarkably defined by our lived world. In science fiction, bipedal aliens who have developed societies and civilizations are called humanoid, in a way which isn’t all that dissimilar from how the first generations of European explorers saw the native peoples of the Americas. André Thevet wrote in his Singularitez, the book which I’ve translated, that the best way he could understand the Tupinambá of Brazil was by comparing them to his own Gallic ancestors at the time of Caesar’s conquest of Gaul in the first century BCE. Even then, an older and far more ancient normal, a time when he perceived his own people as living beyond civilization, was needed to make sense of the Tupinambá. The norms of Thevet’s time, declarations of the savagery of those he saw as less civilized for one, are today abnormal. Thus our sense of normal changes with each generation. For all his faults and foibles, Thomas Jefferson got that right: in a September 1789 letter to James Madison, he argued that “by the law of nature, one generation is to another as one independent nation to another.” The norms of one generation will both build upon and reject those of their predecessors.

At the same time that we continue to refer to the aliens of fiction in contrast to ourselves, we have also developed systems for understanding the regularities of nature that build upon the natural world of our own planet. The Celsius scale of measuring temperature is based on the freezing point of water. The Fahrenheit scale which we still use in the United States likewise came to be defined by its degrees, with 180 degrees separating the boiling point (212°F) and the freezing point (32°F) of water, the source of all life on our own planet and a necessary piece of the puzzle of finding life on other planets. I stress here that such water-based life would be Earthlike in nature, as it would share this common trait with our own. So again we’re seeing the possibility of other life in our own image. Celsius and Fahrenheit, then, are less practical as scales of measurement beyond the perceived normalcy of our own home planet. It would be akin to comparing the richness of the soils of Mars to those of Illinois or Kansas by taking a jar full of prairie dirt on a voyage to the Red Planet. To avoid this terrestrial bias in our measurements, scientists have worked to create a temperature scale divorced from our normalcy; the most famous of these is the Kelvin scale, devised by Lord Kelvin (1824–1907), a longtime Professor of Natural Philosophy at the University of Glasgow in Scotland. Kelvin’s scale sets its zero at absolute zero, the coldest temperature possible. Today the Celsius scale is officially defined by its relation to the kelvin in the International System of Units, and Fahrenheit in turn by its relation to Celsius, while the freezing point of water remains 0°C or 32°F.
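
For the record, the relationships among the three scales reduce to a pair of short formulas; a minimal sketch, in Python and purely for illustration, simply restates the fixed points mentioned above.

# Conversions implied by the fixed points above: 100 Celsius degrees span
# the same interval as 180 Fahrenheit degrees, and the kelvin is the same
# size as the Celsius degree with its zero shifted down to absolute zero.
def celsius_to_fahrenheit(c):
    return c * 9 / 5 + 32

def celsius_to_kelvin(c):
    return c + 273.15

for label, c in [("freezing point of water", 0), ("boiling point of water", 100)]:
    print(f"{label}: {c} °C = {celsius_to_fahrenheit(c)} °F = {celsius_to_kelvin(c)} K")
# freezing point of water: 0 °C = 32.0 °F = 273.15 K
# boiling point of water: 100 °C = 212.0 °F = 373.15 K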

In this sense, the only comparison that can be made between these scales comes through our knowledge of mathematics. Galileo wrote in his 1623 book Il Saggiatore, often translated as The Assayer, that nature, in Stillman Drake’s translation, “cannot be understood unless one first learns to comprehend the language and interpret the characters in which it is written. It is written in the language of mathematics, and its characters are triangles, circles, and other geometrical figures.”[1] I love how, in science fiction, the question of interplanetary communication between humanity and our visitors is often answered mathematically, like the prime numbers running through Carl Sagan’s Contact, which tell the radio astronomers listening in that someone is really trying to talk to them from a distant solar system. There one aspect of our own normalcy can act as a bridge to another world’s normalcy, evoking a vision of a cosmic normal which explains the nature of things in a way that would have made Lucretius take note.

I regret that my own mathematical education is rather limited, though now in my 30s I feel less frustration toward the subject than I did in my teens. Around the beginning of the pandemic, when I was flying between Kansas City and Binghamton and would run out of issues of National Geographic and Smithsonian to read, I would sit quietly and try to think through math problems in my mind. Often these would be questions of converting our U.S. standard units into their metric equivalents, equivalents which, I might add, are now used to define those U.S. standard units. I’ve long tried to adopt the metric system myself, figuring it simply made more sense, and my own normal for thinking about units of measurement tends to be more metric than imperial. That is, I have an easier time imagining a centimeter than I do an inch. I was taught both systems in school, and perhaps the decimal ease of the metric system was better suited to me than the fractional conversions of U.S. standard units, close cousins of the Imperial units once used throughout the British Empire.
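
That last point, that the customary units are now fixed by exact metric values, can be stated in a couple of lines; a minimal sketch, in Python and purely for illustration:

# Since the 1959 international yard and pound agreement, the inch and the
# pound have been defined exactly in metric terms.
INCH_IN_CM = 2.54          # exact, by definition
POUND_IN_KG = 0.45359237   # exact, by definition

def inches_to_cm(inches):
    return inches * INCH_IN_CM

def pounds_to_kg(pounds):
    return pounds * POUND_IN_KG

# The sort of in-flight mental arithmetic described above.
print(inches_to_cm(12))   # one foot = 30.48 cm
print(pounds_to_kg(10))   # ten pounds is roughly 4.54 kg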

In his campaign for the Presidency in 1920, Republican Warren G. Harding used the slogan “Return to Normalcy.” Then and ever since, commentators have questioned what exactly Mr. Harding meant by normalcy. I think he meant he wanted to return this country to what life had been like before World War I, which we entered fashionably late. I think he also meant a return to the sort of societal values more familiar at the turn of the twentieth century, values perhaps better suited to the Gilded Age of the decades following the Civil War, which in some respects were still present among his elite supporters. I remember laughing with the rest of the lecture hall at the presentation of this campaign slogan: what a silly idea it was to promise a return to an abstract concept that’s not easily definable. Yet there is something comforting about the idea of there being a normal. I’ve looked for these normalcies in the world and seen glimpses of them here or there. Perhaps by searching for what we perceive as normal, we are searching within our world for things we have crafted in our own image. We seek to carry on what we have long perceived as works of creation, the better to leave our own legacy for Jefferson’s future generations to use as foundations for their own normal.


[1] Galileo Galilei, Discoveries and Opinions of Galileo, (Garden City, NY: Doubleday Anchor Books, 1957), 238.