Tag Archives: Science

A figure from Raphael's "The School of Athens" variously identified as Francesco Maria della Rovere, Pico della Mirandola, or Hypatia of Alexandria.

On Knowledge

This week, I want to address how we recognize knowledge in comparison to the various fields of inquiry through which we refine our understanding of things. — Click here to support the Wednesday Blog: https://www.patreon.com/sthosdkane

Art
Raphael, The School of Athens (1509–1511), Apostolic Palace, Vatican Museums, Vatican City. Public Domain.

Sources
“On Writing,” Wednesday Blog 6.27.
Surekha Davies, Humans: A Monstrous History (University of California Press, 2025).
Marcy Norton, The Tame and the Wild: People and Animals After 1492 (Harvard University Press, 2024), 307.
Dead Poets Society (1989), "What will your verse be?" Video on YouTube.


This week, I want to address how we recognize knowledge in comparison to the various fields of inquiry through which we refine our understanding of things.


Lately my work has been dedicated to a thorough review of the historiography within which I’m grounding my dissertation. I wrote about this two weeks ago in an essay titled “On Writing.”[1] My research is historical, yet it draws on secondary literature from several fields within the discipline of history. These include Renaissance history and its larger sibling, early modern history; the history of cartography; the history of animals; the history of botany; and, more broadly, the history of early modern science. Methodologically, I owe a great deal to two great twentieth-century Francophone anthropologists, Alfred Métraux (1902–1963) and Claude Lévi-Strauss (1908–2009). While Métraux and Lévi-Strauss aren’t considered directly in the historiographic section of the new introduction that I’m writing for my dissertation, which is limited to sources published since the millennium, they nevertheless stand tall in the background of my history.

Today in academia we often talk about a desire for interdisciplinarity in our work and our research. We’ve found ourselves too narrowed by our ever-shrinking fields and seek greener common pastures for grazing, as our intellectual and pastoral ancestors alike once did. In my case, this interdisciplinarity lies mostly in my efforts to incorporate historical zoology into my work, an approach which uses zoological method and theory to explain historical animals. I have friends who study many things. Among them is one whose passion for history, classics, and mathematics has come together in a dissertation which seeks to demonstrate the intersections among those three fields to better understand the great transitions in human inquiry. Another follows the medical connections across oceans between disparate regions in the Americas and Europe, connections which existed even if they seem remarkable today. Still another friend applies basic economic need to explain a complex diplomatic situation that once existed between the Venetian Republic and the Ottoman Empire in the Adriatic Sea. All of these historians are applying a degree of interdisciplinarity to their work that reflects their own disparate interests and curiosities. In early modern history we talk about curiosities as objects collected from disparate and exotic lands into cabinets to display the erudite collector’s prestige and wealth. I say our curiosity is something to be collected by the worthy archives, libraries, museums, or universities that will employ us in the near future, and something for us to feed with new ideas and avenues of investigation so that we will never be bored with life.

In all of these things, there is an underlying genre of knowledge which I am addressing. I’ve written thus far about history alone, yet it is the same for the anthropologists, astronomers, planetary scientists, and physicists whom I know. Likewise for the literature scholars and the linguists. Our fields of inquiry all grow on the same planet of our collected knowledge. In English, this word knowledge is somewhat nebulous. To me, it says that we know things broad or specific. In London, for instance, the Knowledge is the series of tests which new cabbies must complete in order to learn every street within a certain radius of Charing Cross. The Latin translation of this word, scientia, makes things even more complicated, as that is the root of the English word science. Thus, when we refer to Renaissance science, there is always a caveat in the following sentence explaining that “this is not science as we know it but a sort of protoscience.” I was advised, similarly, after a particularly poorly received presentation at a workshop at the Museum of Natural Sciences in Brussels in October 2023 that I shouldn’t refer to “sixteenth-century conservation” because no such concept existed at the time; instead, it would be better to discuss a “genealogy of conservation.” This insistence that modern terms, in use since the eighteenth-century Enlightenment, ought not to be pulled further back into the past loses, I think, some of the provenance of those terms and of how the Enlightenment philosophes first came across them.

I find it telling that the Ancient Greek translation of knowledge, γνῶσις (gnôsis), is a word with which I’m more familiar from theology and the concept of Gnosticism, whereas scientia reminds me of philosophy and the other fields of inquiry which grew from that particular branch of the tree of human curiosity. One might even say that philosophy and theology are a pair, siblings perhaps? They seek to understand similar things: on the one hand an inquiry into thought, and ideally wisdom, and on the other a search for the nature of the Divine, which at least in my Catholicism we can know because we are made in the Image of God. The division here, between the Ancient Greek term being affiliated with faith and the Latin one with reason, speaks I think to the Latin roots of my own education in Catholic schools and at a Jesuit university, where I learned about Plato and Aristotle. Indeed, I recognized Aristotle’s Historia animalium (History of Animals) by the Latin name under which it was generally known in Western Europe for centuries before the rise of vernacular scholarship, rather than by its Greek original, Τῶν περὶ τὰ ζῷα ἱστοριῶν (Ton peri ta zoia historion). Note that the English translation of this title, History of Animals, reflects the Latin cognate of ἱστοριῶν rather than the better English translation of that Greek word, Inquiry.

Added onto these classical etymologies, in my first-semester Historiography class at Binghamton University I was introduced to the German translation of scientia, γνῶσις, and knowledge: Wissenschaft. The word struck me immediately because I saw in its prefix a cognate of the English word wizard, and because I knew that the -schaft suffix tends to translate into English as -ship. Thus, my rough Anglicization of Wissenschaft renders Wizardship, which is rather nifty. Yet in the nineteenth century Wissenschaft was instead a general word which could be translated into English as science. This is important for us historians trained in the United States because our own historiographic tradition, that is, our national school of historians, traces its roots back to German universities in the early and middle decades of the nineteenth century. I remember long sessions of my historiography class at UMKC discussing the works of Leopold von Ranke (1795–1886), the father of research-based history. I felt that this concept of Wissenschaft seemed relatable, and as it turned out, that was because Irish has a similar concept.

Whereas in English we tack the suffix -ology onto any word to make it the study of that word, in Irish you add the suffix -ocht. So, geology is geolaíocht and biology is bitheolaíocht. Yet note with the second example that the suffix is not just -ocht but an entire word, eolaíocht. This is the Irish word for science, added onto the end of bitheolaíocht to demonstrate that this word refers to the study of bith-, a combining form of the word beatha, meaning life. So, biology then is the science of life itself. Powerful stuff. I appreciate that Irish linguists and scholars have sought overall to preserve our language’s own consistency in its scientific terminology. It means that these fields of study, these areas of knowledge, can exist purely within the purview of the Irish language without any extra need to recognize that their prefixes or suffixes come from Latin, Greek, or English. There are some exceptions, of course: take zó-eolaíocht, the Irish word for zoology, which effectively adopts the Greek word ζῷον, perhaps through the English zoo, into Irish. Would it not have been just as easy for whoever devised this hyphenated word to instead write ainmhíeolaíocht, translated into English as the science of animals? Here, though, I see more influence from English, because this language adopts as much as it can from other languages out of prestige and a desire for translingual communicability. As an English speaker, I often find scholarly works easier to read because we share common etymologies for our words relating to knowledge. English’s science, geology, biology, and zoology are French’s science, géologie, biologie, and zoologie. In English, we drop any pretense of Englishness to clothe ourselves in a common mantle familiar to colleagues from related cultures around the globe. In academia this is to our mutual benefit; after all, so much of our work is international. I’m regularly on webinars and Zoom calls with colleagues in Europe, for instance.
I believe this is the lingering spirit of the old scholarly preference for Latin as a lingua franca, an era which at least to me seems close enough in the past to be tangible. Yet realistically it has surely been a very long time since any serious scholarly work beyond classics was published in Latin for the benefit of a broad translingual readership.

I for one admire the Irish word eolaíocht and its root eolas, which translates into English as knowledge, that is, an awareness of things, because eolaíocht represents a universal concept while retaining its own native nature. So often in my research I am discussing the early assimilation of indigenous cosmovisions, to borrow a Spanish word put to good use by Surekha Davies in her latest book, into the nascent global world centered on Europe.[2] I see how these cosmic conceptions faded until they were rendered in Gothic or Latin letters on the voluminous pages of encyclopedic Renaissance general and natural histories, which remain among the most often cited primary sources for these indigenous cultures, cultures which, Marcy Norton argued in her 2024 book The Tame and the Wild: People and Animals After 1492, had their own classical past made remote from their colonial present by European contact, conquest, and colonization.[3] Seeing these indigenous perspectives fade into their categorized and classified statuses within the cosmos defined by Europe’s natural philosophers, I feel fortunate that my own diaspora (which was also colonized) has retained this element of our individual perspective. I first came across the -ocht suffix in the word poblacht, the Irish word for republic. A famous story from the birth of the Irish Free State during the Anglo-Irish Treaty negotiations in 1921 tells of British Prime Minister David Lloyd George, a Welsh speaker, remarking to Michael Collins, an Irish speaker, that their choice of a republic was unusual because none of the Celtic languages naturally has a word for republic. That word evokes its Roman roots in the ancient Res publica Romana, the Roman Republic, whose northward expansion across the Alps led to the gradual death of the Continental Celtic languages, whose speakers’ descendants today are largely the Western Romance speakers of French, Romansh, Occitan, Catalan, Spanish, Galician, and Portuguese, among others.
Romance languages are noted for their common descent from Latin, whence they all derive variations on the Latin word scientia; English gets science through Old French. “How are you going to name your new government in the Irish language?” Lloyd George asked. Collins replied something along the lines of “a kingdom is called a ríocht, so this government of the people (pobal) will be called a poblacht.” Thus, the Republic of Ireland is named in Irish Poblacht na hÉireann. Naturally, this word pobal derives from the Latin populus, so the shadow of Rome hovers even over unconquered Hibernia. Yet that is another topic for a different essay.

Let me conclude with a comment on the difference between knowledge and wisdom, as I see it. The former is far more tangible. We can know things through learning, embodied best in living and in reading. I know, for instance, to look both ways before crossing a street because plenty of people in the last 140 years have been hit by cars, buses, and trucks, and you can never be too careful. Likewise, I know everything I do about the things I study through reading what others have written about these topics. It’s my job then to say what I will. In Whitman’s words, made immortal by our recitation, the answer to the eternal question is “that the powerful play goes on, and you may contribute a verse.” That’s history, people! Reading the powerful play of what others have written and summoning up the courage to take the podium and have your say. I first heard this particular poem, as did many in my generation, recited by Robin Williams in the 1989 film Dead Poets Society. Knowledge is the recitation of these facts we’ve learned. Wisdom is understanding how these facts fit together and speak to our common humanity. What makes us human? I believe it’s as much what we know as what we remain ignorant of. Our ignorance isn’t always a curse; rather, it’s another foggy field we’ve yet to inquire about, a place where someone’s curiosity will surely thrive someday. It is another evocation of eolas still to come in our long human story. How wondrous is that?


[1] “On Writing,” Wednesday Blog 6.27.

[2] Surekha Davies, Humans: A Monstrous History (University of California Press, 2025).

[3] Marcy Norton, The Tame and the Wild: People and Animals After 1492 (Harvard University Press, 2024), 307.


Asking the Computer

This week, I address news that the latest version of ChatGPT will help with your math problems. — Links: New York Times, 12 Sep. 2024, Cade Metz, "OpenAI Unveils New ChatGPT That Can Reason Through Math and Science." Eddie Burback, 1 Sep. 2024, "AI is here. What now?" YouTube.


This week, I address news that the latest version of ChatGPT will help with your math problems.


I’ve used ChatGPT on occasion, mostly to test the system and see what it will do if I prompt it about very particular things. What does it know about André Thevet (1516–1590), or about the championship run of my beloved Chicago Cubs from the ’80s, the 1880s that is? I even asked it questions in Irish once and was startled to see it reply with perfect Irish grammar, better than Google Translate manages. I’ve occasionally pulled up my ChatGPT app to ask about the proper cooking temperatures of beef, pork, or chicken rather than typing those questions into Google, and in one instance I used it to help me confirm a theory I had, based on the secondary literature in its database, for a project I was writing. The one thing that I would’ve expected ChatGPT to be best at from the start is logical questions, especially in mathematics.

There are clear rules for math, except that in America its informal name is singular while in Britain it retains its inherent plurality: maths. As much as I acted out a learned frustration and incomprehension when posed with mathematical questions in elementary, middle, and high school, I appreciate its regularity, the way in which it operates on a universal and expected level. Many of the greatest minds throughout human history have seen math as a universal language, one which they could use to explain the world in which we live and the heavens we see over our heads. The History of Science is as much a history of knowledge as it is the history of the development of the Scientific Method, a tool which has its own mathematical regularity. All our scales and theorems and representations of real and unreal numbers reflect our own interpretation of the Cosmos, and so it is logical that an advanced civilization like our own (if I may be so bold) would have developed its own language for these same concepts, which are inherent in our universe. Carl Sagan took this idea further in his novel and later film Contact, in which the alien signal coming from Vega is mathematical in nature.

Often, the lower numbers are some of the easiest words in a language for learners to pick up. The numbers retain their similarities across the Indo-European languages to the extent that they were used as early evidence that the Irish trí, the English three, and the Latin trēs are related to the Sanskrit trī (त्रि) and the Farsi se (سه). The higher the numbers go, the more complicated they get, of course. An older pattern in Irish which I still use is to count higher numbers as four and fifty, or ceathair is caoga, which is similar to the pattern used in modern German, and something that sounds far more King James Bible in English. I love the complexity of the French base-twenty counting system, where the year of my birth, 1992, is mille neuf cent quatre-vingt-douze, or one thousand nine hundred four-twenties and twelve. Will the Belgian and Swiss word nonante, referring to the same number as quatre-vingt-dix, ultimately win out in the Francophonie? Peut-être.
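The vigesimal logic behind quatre-vingt-douze can be shown with a few lines of arithmetic. This is my own illustrative sketch, not a linguistic tool:

```python
# Decompose a number into twenties plus a remainder, mirroring the
# base-twenty structure of French numbers like quatre-vingt-douze.
def twenties_decomposition(n: int) -> str:
    """Express n as (count of twenties) x 20 + remainder."""
    twenties, rest = divmod(n, 20)
    return f"{n} = {twenties} x 20 + {rest}"

print(twenties_decomposition(92))  # 92 = 4 x 20 + 12 -> quatre-vingt-douze
print(twenties_decomposition(90))  # 90 = 4 x 20 + 10 -> quatre-vingt-dix
```

The same decomposition read aloud, four-twenties and twelve, is exactly how the French number is built.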

I was surprised to read in the New York Times last Friday that the latest version of ChatGPT, called OpenAI o1, was built specifically to fix prior shortcomings that kept the program from solving mathematical problems. Surely this would be the first sort of language that one would teach a computer. As it turns out, no. Even now, OpenAI o1’s mathematical capabilities are limited to questions posed to it in English. So, as long as you have learned the English dialect of the language of mathematics, you can use this computer program to help you solve questions in the most universal of languages.

It reminds me of the bafflement I felt upon first seeing TurnItIn’s grammar correction feature, the purple boxes on TurnItIn’s web interface. For the uninitiated, TurnItIn is the essay grading and plagiarism detection system that most academic institutions I’ve studied and taught at in the last 15 years use as a submission portal. I was proud to program into my Binghamton TurnItIn account several hotkeys that saved me retyping the same comment on 50 student essays every time a deadline passed. Thousands of essays later, I can squarely say these hotkeys saved my bacon time and time again. Like legal documents, especially the medieval and early modern kind that I’ve read and written about in my studies, student essays are formulaic and predictable in their character.

The same goes for math: even with the basic understanding that I have (I only made it as far as Algebra II), the logic, when explained well, is inherent in the subject. Earlier in my doctoral studies, beginning in 2020, my two-sided approach to developing my character and intellect beyond my studies came in the form of first signing up for Irish classes again, and second picking up where I had left off with my mathematical studies in college by trying my hand at a beginner physics course. I’m sad to say I really haven’t had the time to devote to this mathematical pursuit as much as I would like. Perhaps I will be able to work it in someday; alas, I also have to eat and sleep, and I’ve learned my attention will only last for so long. I too, dear reader, am only human.

Yet this is where OpenAI o1 differs from the average bear, for it is decidedly not human. How would we successfully communicate with a non-human entity or being when we have no basis for conversation to start with? The good thing about o1 and other AI programs is that these are non-human minds which we are creating in our own image, ever the aspirants wrestling with the greater Essence from beyond this tangible Cosmos we inhabit. We can form o1 and its kind in the best image of our aspirations: a computerized mind that can recognize both empathy and logic and reflect those back to us in its answers to our questions. In the long run, I see o1’s descendants as the minds of far more powerful computers that will help our descendants explore this solar system and perhaps even beyond.

From the first time I saw it at work, I saw in ChatGPT a descendant of the fictional computers of Starfleet’s vessels, whose purpose in being is to seek out new life and new civilizations and to boldly go where no one has gone before. Perhaps that future where humanity has built our utopia in this place, our planetary home, will be facilitated by AI. Perhaps, if we use it, build it, and train it right.

That said, the YouTuber Eddie Burback made a video several weeks ago about how he has seen AI put to use in his daily life in Los Angeles. In it, from the food delivery robots to his trips in several self-driving Waymo cars (manufactured by Jaguar), to his viewing of several AI films, Burback concluded that AI at this moment in 2024 is a net negative on human creativity and could remove more of the human element from the arts. I have seen far more AI-generated images appear on my Instagram and Pinterest in the last year. I like Eddie’s videos; they may be long, but they are thorough and full of emotion, heart, and wit. They do a great service to their viewers by taking a long look at the world as he perceives it. I see much of the same thing, yet as the good Irish Catholic Cub fan that I am, I hold out hope for what today seems impossible to some: AI used morally, for the future improvement of our species and our advancement out of this adolescence in our story. I believe this is possible because I believe in us, and because once this Wild West phase of the new Information Age settles down, we will see better uses of our new technologies develop, even as they continue to advance faster, higher, and stronger with each passing day.



Galileo Galilei pictured in his early 40s c. 1600.

Return to Normalcy

Over the last week, I've been thinking about the standards we define to cast a model of normality or, in an older term, normalcy. This week then, I try to answer the question: what even is normal? — Click here to support the Wednesday Blog: https://www.patreon.com/sthosdkane


Over the last week, I’ve been thinking about the standards we define to cast a model of normality or, in an older term, normalcy. This week then, I try to answer the question: what even is normal?


One of the great moments of realization in my life to date was when it occurred to me that everything we know exists in our minds in relation to other things, that is to say that nothing exists in true isolation. The solar eclipse I wrote about last week was phenomenal because it stood in sharp contrast to what we usually perceive as the Sun’s warmth, and a brightness which both ensures the longevity of life and can fry anything that stares at it for too long. So too, we recognize the people around us often in contrast to ourselves. Everyone else is different in the ways they walk, the ways they talk, the ways they think and feel. We are our own control in the great experiment of living our lives, the Sun around which all the planets of our solar system orbit.

There is a great hubris in this realization. As a Jesuit ethics professor at Loyola said to my Mom’s class one day, in a story she often recounts, no one acts selflessly; there’s always a motive for the things we do. That motive seems to be driven in part by our desire to understand how different things work, how operations can function outside the norm of our own preferences or how we would organize them. I might prefer to sort the books on my shelves by genre, subgenre, and then author; history would have its own shelf, with the history of astronomy in its own quadrant of that shelf and Stillman Drake’s histories of Galileo set before David Wootton’s Galileo: Watcher of the Skies. Yet at the same time I could choose to add another sublevel of organization where each author’s titles are displayed not alphabetically but by publication date. So, Stillman Drake’s Two New Sciences of 1974 would be placed before his 1978 book Galileo at Work.

This shelving example may seem minor, yet one can find greater divergence in book sorting than these small changes here or there. My favorite London bookseller, Foyle’s on Charing Cross Road, was famous for many decades for the eccentricity of organizing the books on its shelves by genre, yet then not by author but by publisher. This way, all the black Penguin spines would sit side by side, giving a uniform look to the shelves of that establishment. It is pleasant to go into Foyle’s today and see on the third floor all the volumes of Harvard’s Loeb Classical Library side by side, the green Greek volumes contrasting with the red Roman ones on the shelves below. Yet to have books organized by publisher, when the average reader is more interested in searching for a particular author, seems silly. Still, that was the norm at Foyle’s for a long time, until the current ownership purchased the business.

Our normal is so remarkably defined by our lived world. In science fiction, bipedal aliens who have developed societies and civilizations are called humanoid, in a way which isn’t all that dissimilar from how the first generations of European explorers saw the native peoples of the Americas. André Thevet wrote in his Singularitez, the book which I’ve translated, that the best way he could understand the Tupinambá of Brazil was by comparing them to his own Gallic ancestors at the time of Caesar’s conquest of Gaul in the first century BCE. Even then, an older and far more ancient normal, of a time when he perceived that his own people lived beyond civilization, was needed to make sense of the Tupinambá. The norms of Thevet’s time, declarations of the savagery of those whom he saw as less civilized for one, are today abnormal. Thus, our sense of normal changes with each generation. For all his faults and foibles, Thomas Jefferson got that right: in a September 1789 letter to James Madison, Jefferson argued that “by the law of nature, one generation is to another as one independent nation to another.” Thus, the norms of one generation will both build upon and reject those of their predecessors.

At the same time that we continue to refer to the aliens of fiction in contrast to ourselves, we have also developed systems of understanding the regularities of nature that build upon the natural world of our own planet. The Celsius scale of measuring temperature is based on the freezing point of water. The Fahrenheit scale, which we still use in the United States, came to be defined by its degrees, with 180 degrees separating the boiling (212°F) and freezing (32°F) points of water, the source of all life on our own planet and a necessary piece of the puzzle of finding life on other planets. I stress here that such water-based life would be Earthlike in nature, as it shares this same common trait with our own. So, again we’re seeing the possibility of other life in our own image. Celsius and Fahrenheit, then, are less practical as scales of measurement beyond the perceived normalcy of our own home planet. It would be akin to comparing the richness of the soils of Mars to those of Illinois or Kansas by taking a jar full of prairie dirt on a voyage to the Red Planet. To avoid this terrestrial bias in our measurements, scientists have worked to create a temperature scale divorced from our normalcy. The most famous of these is the Kelvin scale, devised by Lord Kelvin (1824–1907), a longtime Professor of Natural Philosophy at the University of Glasgow in Scotland. Kelvin’s scale begins at absolute zero, the coldest possible temperature. Today, the Celsius and Fahrenheit scales are both officially defined in the International System of Units by their relations to the Kelvin scale, while still placing the freezing point of water at 0°C or 32°F.
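The relationships among these three scales can be written out as simple formulas. Here is a minimal sketch of my own, illustrating the SI relationships described above:

```python
# Celsius is a fixed offset from Kelvin; Fahrenheit is a rescaled Celsius
# (180 Fahrenheit degrees span the 100 Celsius degrees between freezing
# and boiling).

def kelvin_to_celsius(k: float) -> float:
    """Celsius is defined by a fixed offset from the Kelvin scale."""
    return k - 273.15

def celsius_to_fahrenheit(c: float) -> float:
    """Fahrenheit spans 180 degrees over Celsius's 100, offset by 32."""
    return c * 9 / 5 + 32

print(kelvin_to_celsius(0))        # absolute zero: -273.15 °C
print(celsius_to_fahrenheit(0))    # water freezes: 32 °F
print(celsius_to_fahrenheit(100))  # water boils: 212 °F
```

Note that the 180-degree span the Fahrenheit scale was built around falls out of the arithmetic: 212 minus 32.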

In this sense, the only comparison that can be made between these scales comes through our knowledge of mathematics. Galileo wrote in his 1623 book Il Saggiatore, often translated as The Assayer, that nature, in Stillman Drake’s translation, “cannot be understood unless one first learns to comprehend the language and interpret the characters in which it is written. It is written in the language of mathematics, and its characters are triangles, circles, and other geometrical figures.”[1] I love how the question of interplanetary communication in science fiction between humanity and our visitors is often answered mathematically, like the prime numbers running through Carl Sagan’s Contact, which tell the radio astronomers listening in that someone is really trying to talk to them from a distant solar system. There, one aspect of our own normalcy can act as a bridge to another world’s normalcy, evoking a vision of a cosmic normal which explains the nature of things in a way that would have made Lucretius take note.

I regret that my own mathematical education is rather limited, though now in my 30s I feel less frustration toward the subject than I did in my teens. Around the time of the beginning of the pandemic, when I was flying between Kansas City and Binghamton and would run out of issues of National Geographic and Smithsonian to read, I would sit quietly and try to think through math problems in my mind. Often these would be questions of conversion from our U.S. customary units into their metric equivalents, equivalents which, I might add, are used to define those U.S. customary units. I’ve long tried to adopt the metric system myself, figuring it simply made more sense, and my own normal for thinking about units of measurement tends to be more metric than imperial. That is, I have an easier time imagining a centimeter than I do an inch. I was taught both systems in school, and perhaps the decimal ease of the metric system was better suited to me than the fractional conversions of U.S. customary units, close cousins of the Imperial units once used throughout the British Empire.
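That metric anchoring is literal: since the international yard and pound agreement of 1959, the inch has been defined as exactly 2.54 centimeters. A tiny sketch of my own showing the conversion:

```python
# The inch is defined in terms of the metric system: exactly 2.54 cm.
CM_PER_INCH = 2.54  # exact by definition since 1959

def inches_to_cm(inches: float) -> float:
    """Convert inches to centimeters using the exact definition."""
    return inches * CM_PER_INCH

print(inches_to_cm(1))   # 2.54 cm
print(inches_to_cm(12))  # one foot, defined as exactly 30.48 cm
```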

In his campaign for the Presidency in 1920, Republican Warren G. Harding used the slogan “Return to Normalcy.” Then and ever since, commentators have questioned what exactly Mr. Harding meant by normalcy. I think he meant he wanted to return this country to what life had been like before World War I, which we entered fashionably late. I think he also meant a return to a sort of societal values which were more familiar to the turn of the twentieth century, values perhaps better suited to the Gilded Age of the decades following the Civil War which in some respects were still present among his elite supporters. I remember laughing with the rest of the lecture hall at the presentation of this campaign slogan, what a silly idea it was to promise to return to an abstract concept that’s not easily definable. Yet, there is something comforting about the idea of there being a normal. I’ve looked for these normalcies in the world and seen some glimpses of it here or there. Perhaps by searching for what we perceive as normal, we are searching within our world for things we have crafted in our own image. We seek to carry on what we have long perceived as works of creation, the better to leave our own legacy for Jefferson’s future generations to use as foundations for their own normal.


[1] Galileo Galilei, Discoveries and Opinions of Galileo, trans. Stillman Drake (Garden City, NY: Doubleday Anchor Books, 1957), 238.



Eclipse simulation using Stellarium

The Eclipse

This Monday, North America experienced its second total solar eclipse in the last decade. — Click here to support the Wednesday Blog: https://www.patreon.com/sthosdkane


This Monday, North America experienced its second total solar eclipse in the last decade.


I remember being over-the-moon excited as we began preparing for the total solar eclipse of August 2017. Several weekends before the eclipse, my parents and I drove north from Kansas City into the path of totality to scout out places where we might travel on Eclipse Day to see the phenomenon for ourselves. Eclipse Day 2017 also happened to be my first day as a history graduate student at the University of Missouri-Kansas City. That morning a sudden summer thunderstorm rolled through Kansas City, and as the day continued, the clouds persisted in our skies. When the moment of totality arrived around 12:55 pm on 21 August, we watched it through darkened clouds and heard the birds and insects around us revert to their nocturnal states and songs.

I was excited to have experienced a total solar eclipse yet disappointed that I wasn’t able to see it. So, when the prospect of traveling for this week’s total solar eclipse appeared, I seriously considered going afield to Texas to observe it. That trip didn’t end up working out because of a series of scheduling conflicts, and seeing that the cloud forecast called for most places along the path of totality across North America to be obscured, I decided instead to stay here in Kansas City and observe our partial solar eclipse. At its greatest extent, the April 2024 solar eclipse reached about 90.5% coverage here. I was able to see that extent, yet the feel of it was quite different from the 100% totality of seven years ago. While observing the eclipse at home, we watched Everyday Astronaut and the Planetary Society’s live broadcast from the Society’s Eclipse-o-rama event in Fredericksburg, Texas, and what they experienced was far more dramatic than what we observed. I do regret not travelling for this eclipse, yet given how circumstances fell, I’m glad I chose to stay home all the same.

This concept of an eclipse is one that speaks to me astronomically (as a big space nerd), historically, and linguistically. Eclipses are phenomena that have made their mark on the psyche of more than just us humans; note how the birds began singing their twilight songs when the Moon passed in front of the Sun. I have never put much theological potency into eclipses, because we have been able to predict their occurrences with increasing accuracy for generations now. Religion, in many ways, relies on our perceptions of things. Some see in an eclipse a threat to divine order in the Cosmos. This view reminds me of Mozart’s final opera, nearly my favorite of his works, Die Zauberflöte (The Magic Flute), in which the Queen of the Night is defeated by Sarastro, the high priest of the Temple of the Sun. Sarastro proclaims victory for the good and right, singing:

Die Strahlen der Sonne

Vertreiben die Nacht.

Zernichtet der Heuchler

Erschlichende Macht.

The rays of the sun

Drive away the night.

Destroyed is the hypocrites’

Surreptitious power.

(Source: Aria-Database.com, trans. Lea Frey)

Sarastro’s triumphant finale in Die Zauberflöte sung by Josef Greindl with the RIAS Symphonie-Orchester Berlin.

The divine hand is better seen in the wisdom of devising a means to mathematically ascertain the revolutions of these celestial orbs, to borrow the title of Copernicus’s magnum opus, De revolutionibus orbium coelestium. In our ability to ascertain our surroundings and to make sense of nature, we see a loving design.

Still, knowledge of the movements of the Sun, planets, moons, and stars across our night skies has had its impact on our history. During his fourth voyage, on 1 March 1504, after nine months stranded in Jamaica, Christopher Columbus (1451–1506) used his knowledge of eclipses, drawn from an almanac he had brought with him written by the Castilian Jewish astronomer Abraham Zacuto (1452–c. 1515), to persuade the Taíno cacique of that part of Jamaica to give Columbus’s men food and provisions. Columbus wrote in his journals that he pointed at the Moon and told the Taíno that “God caused that appearance, to signify his anger against them for not bringing the food” to Columbus and his men.[1] Several years ago, when discussing this story with a friend and fellow Renaissance historian, I decided to use the Stellarium astronomy program to simulate this lunar eclipse as Columbus and those with him in Jamaica saw it. Our ability to track the movements of these celestial orbs is good enough that our computers can show exactly what was visible in the night sky (barring any atmospheric effects) at any moment in the past or future.

My simulation of the March 1504 Lunar Eclipse as seen from St. Ann’s Bay, Jamaica using Stellarium.

This ability to calculate the dates and locations of eclipses comes in handy when researchers look at mentions of eclipses in ancient literature and seek to date the events of the stories. Plutarch and Heraclitus both argued that the Odyssey contains “a poetic description of a total solar eclipse,” which the astronomers Carl Schoch and P. V. Neugebauer proposed matched an eclipse that occurred over the Ionian Sea on 16 April 1178 BCE; a more recent article in the Proceedings of the National Academy of Sciences by Constantino Baikouzis and Marcelo O. Magnasco revisits this proposition, noting the difficulty of finding exact matches given the centuries of the Odyssey’s transmission through oral tradition before it was written down.[2] Still, that eclipses are so readily discernible and measurable with our mathematics speaks to the potential that they could be used to date moments long remembered only in heroic literature, like Odysseus’s return to Ithaca in Book 20 of the Odyssey (20.356–57). In this effort, where others divine gods, we make tools of the Sun and Moon to better understand ourselves.
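That measurability rests on regular cycles. The best known is the saros, a period of about 6,585.32 days (roughly 18 years and 11 days) after which the Sun, Moon, and lunar nodes return to nearly the same geometry and a similar eclipse recurs. As a hedged sketch (the 18:18 UT timestamp for the 2024 eclipse’s greatest extent is an approximate figure taken from published tables), one saros applied to this April’s eclipse lands on the next member of the same series:

```python
from datetime import datetime, timedelta

# The saros cycle: ~6585.3211 days between successive eclipses in the
# same series (the extra third of a day shifts each successive eclipse
# about 120 degrees westward in longitude).
SAROS_DAYS = 6585.3211

# Greatest eclipse of the 8 April 2024 total solar eclipse,
# approximately 18:18 UT (approximate published value).
eclipse_2024 = datetime(2024, 4, 8, 18, 18)

# One saros later: the total solar eclipse of 20 April 2042.
next_in_series = eclipse_2024 + timedelta(days=SAROS_DAYS)
print(next_in_series.date())  # 2042-04-20
```

This back-of-the-envelope arithmetic only predicts when an eclipse in the series recurs, not where it will be visible; that westward shift of a third of a rotation each cycle is why tools like Stellarium, which model the full geometry, are needed to reconstruct what a particular observer actually saw.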

The way we describe an eclipse speaks to our culture’s relationship with the phenomenon. Our Modern English word derives from the same word in Old French, which developed from the Latin eclīpsis, in turn borrowed from the Ancient Greek ἔκλειψις (ékleipsis), from the verb ἐκλείπω (ekleípō), meaning “to abandon, go out, or vanish.” Eclipse eclipsed the Old English word āsprungennes, which derives from the past participle of the verb āspringan, meaning “to spring up, to spread out, to run out, to cease or fail.” As an adjective, āsprungen meant that something was defunct or deficient, so perhaps this sense of an eclipse meant that it seemed for a moment as though the Sun had run out of energy and ceased to burn? Again, this speaks to the idea that nature had limits as humanity does, to an older understanding of nature from the perspective of a limited human lifespan.

In Irish, there is the Hellenic loanword éiclips, yet there is an older Gaelic word that means the same thing: urú. Usually students of the Irish language will learn of urú in the context of Irish grammar, where an urú, or eclipsis, is an initial mutation in which a new consonant is written before a word’s original first consonant and “eclipses” its sound. So, in that sense the word gets eclipsed by this urú, which preserves some of the integrity of the language. Monday’s eclipse then was less an urú focail (word eclipse) and more an urú gréine (solar eclipse). That both the Sun and the words we speak in Irish can be eclipsed makes this astronomical phenomenon all the more ordinary and measurable.

We use this word eclipse beyond astronomy in many cases; it seems to me today that the old guard of the Republican Party has been eclipsed by an orange political pulsar whose violent rhetoric and chaotic behavior have eaten away at their party’s support over these last eight years, not unlike a pulsar discovered by NASA’s Swift and Rossi X-ray Timing Explorer satellites in 2007. An eclipse is something wondrous to behold yet ordinary in how readily we can predict when it will appear. Eclipses have given us a wealth of cultural associations that continue to influence how we see our world.

On Monday then, when the sky began to darken as the Moon passed in front of the Sun, I noticed that the color spectrum I’ve always known began to change. Before my eyes the colors seemed to take on a sort of metallic glow, as if the light illuminating them were shifting into a spectrum unnatural to the world I’ve known. The Sun is fundamental to how we understand the world around us. Its light is what illuminates our senses, and without it, or even with partial changes to its glow, we would find ourselves observing a very different world.


[1] Christopher Columbus, “The Fourth Voyage,” in Select Letters of Christopher Columbus: With Other Original Documents Relating to the Four Voyages to the New World, trans. and ed. R. H. Major (London: Hakluyt Society, 1847), 226.

[2] Constantino Baikouzis and Marcelo O. Magnasco, “Is an Eclipse Described in the Odyssey?” Proceedings of the National Academy of Sciences 105, no. 26 (2008): 8823–8828, nn. 1, 12–14.


Belief & Science

Photo by Felix Mittermeier on Pexels.com

I often stop myself mid-thought when considering questions of truth to ask whether I believe in something or know of something. The distinction here is rather simple: knowledge is founded upon evidence, upon scrutiny & careful consideration of the facts of a case. Belief, on the other hand, is more of a gut feeling; it’s something we can discern but never really know until that feeling is backed up by fact-based evidence. Of all the forms of knowledge we have yet devised, perhaps the most precise and useful is science, or rather the Scientific Method, which is fundamental to understanding the most innate truths of our world.

Both belief & science are built on a certain degree of faith. If the hard facts found in scientific inquiry are the bricks used to construct a house for our collected wisdom built up over every generation, then faith is the mortar that keeps those bricks together. You have to have faith in your senses, in your reasoning, and in the methods and tools you use to come to your scientific conclusions. Similarly, faith is necessary to believe, faith in an idea, in a hope, yes even in a dream of eternity. I’ve been using the English for these ideas so far, but now I think it might be useful to dive into the Latin, which will give us a better idea of how these concepts of belief, science, and even faith, interact in our Modern English.

In Latin, the verb crēdō fits my own understanding of belief best. This verb refers to the action of believing and trusting in something, for belief is inherently an active thing. It is the origin of our English word creed, and in fact is the opening word of the Latin version of the Nicene Creed. Something is credible because it can be believed, and so perhaps there is a certain degree of belief necessary and inherent in science, whose facts and statements have enough credit to be considered irrefutable.

Science is itself an English adaptation of the Latin word scientia, which had its origins in the Late Roman Republic as an abstract noun formed from the present active participle sciēns, a form of the verb sciō meaning “to know,” “to understand,” or “to be able to.” Sciō is a practical sort of knowing; it refers to a manner of knowledge that can be tested, reviewed, and proven. Science relies on these proofs to survive and flourish, yet moreover science relies on the tools used to know being credible in their utility. You wouldn’t use a dull knife to cut meat, let alone blunted senses or scientific instruments to probe the fullness of our perceivable reality.

I have a deep admiration for many of the great scientific thinkers of the last few generations, and I frequently mention such great public science educators as Drs. Carl Sagan and Neil deGrasse Tyson, and my generation’s favorite science teacher, Bill Nye, as people whose curiosity and intellect I look to for inspiration. It is striking, then, that someone like me, who does believe in God and is a practicing Catholic, would be so admiring of thinkers who are themselves profoundly atheistic in their worldview. Yet I understand where they’re coming from: the existence of God cannot be proven through science; that much is indisputable, and to say otherwise would likely diminish the power and vitality of my own faith. I don’t mind that God cannot be proven real or otherwise, for the simplest summation of God in Christian theology as I was taught it, largely from the Latin Catholic and Greek Orthodox perspectives, is that God is a paradox. God can only really be approached through belief, through the hope that one might be doing things as some original Creator hoped they would turn out, because in my tradition Free Will is fundamental to Creation.

I think of God in terms of a Divine Essence, certainly not physical, let alone personal in the way we as humans could fully understand a person sitting across a table from us. I wonder, then, whether we can say that we know God, for knowledge relies on those same proofs born of scientific inquiry. I’m not sure there, and I hesitate to talk about a personal relationship with God, because how does one really go about talking to or feeling for someone who can’t be discerned by our methods or means? In the end, if God exists, as I believe, then that existence rests on belief, guided by faith, in Latin fidēs, a word that can also mean reliance, trust, confidence, or a promise that the thing you believe in will hold true, whether it be the accuracy of the Webb telescope in showing us the rings of Neptune in greater detail than ever before seen, or the existence of a God who created all things at some moment deep beyond the furthest reaches of our known past.

I used to think that one could place God’s act of creation at the moment of the Big Bang; after all, the image of a great explosion fit neatly with a certain idea of an outpouring of Divine Love, caritas in Latin, so central to the writings of many of the mystics of the Church. Yet some cosmologists now propose that the Big Bang emerged from a state of pure energy that preceded the beginning we can measure. It makes me wonder whether we will learn more about those earliest moments as time goes on, whether today’s and tomorrow’s cosmologists will find new truths, determined by their own proofs, about what might have happened when all the matter in our Universe was compressed into a minute area of tremendous density.

It seems fair to me to argue then that the moment of Creation did take place, and that at some point our own abilities as humans, all our own wisdom, ingenuity, and cleverness, will reach its limit. Thankfully our scientific tools have yet to reach that limit, and I doubt that limit will be reached in a good long while. It is in our nature as humans to continue pushing the boundaries of our knowledge, first beyond the campfires our ancestors gathered around on the long cold nights of the Last Glacial Period which ended somewhere around 11,700 years ago, then as we learned to plant crops and live sedentary lives, building villages, towns, and later cities where we settled. 

As our ancestors continued to develop their societies, they continued to fill in the edges of what became their maps, pushing what they came to call Terra incognita (unknown land) further and further out to the periphery, until 500 years ago the disparate human family was reconnected through our ingenuity and technology, transforming the oceans that were once barriers into bridges we today can cross with ease. In the last seventy years those boundaries have begun to be pushed upward and outward from our home planet and into the stars beyond. We are explorers driven by our desire to understand the unknown, to see over the next horizon. Yet at the core of all that exploring we have continued to explore ourselves, to look within and ask deep questions about who we are, not just as physical beings made of flesh, blood, and bone but as individuals, personalities each distinct from the rest. It is this exploration of the self that continues to drive our desire for some greater truth than we can know, a memory of a Creator who began our long and winding story billions of years before we evolved into the species we are today: Homo sapiens, discerning humans.

In times now past, our ancestors often turned to belief rather than science to answer their questions, to find truths behind the mysteries they faced in their lives. Ideas of monsters, magic, and spirits out for good or ill were born from that worldview. Today, many of those same phenomena can be readily explained using the tools our sciences have provided, yet still there are limits to our reason, for there are limits to what we as rational beings are capable of. The fullness of God, as I believe in such a Divine Essence, is beyond that reasoning, something reliant on my belief, supported by my faith, my reliance on the possibility of the wisdom that such a Word, to borrow from St. John’s Gospel, promises. That belief is far from scientific, yet it is reinforced by my faith that we as humans can make sense of the reality in which we exist through our own tools, our Scientific Method.

Optimism and Belief

Cloud-line

In my life, there have been two things standing as constants: optimism and belief. I have embraced these two guiding principles, and striven in due course to live a better life as a part of the wider human community through them. For me, my faith as a Catholic and as a Christian is an inherently positive one; it is a faith in Resurrection, in Union with the Divine Essence, in the fulfilment of the circle and restoration of humanity to paradise.

Yet to allow this faith to persist I have found myself inherently optimistic, always expecting the best from people and looking at even the darkest of situations with the hope required to believe in something greater than Reality. True, this is blind faith, something entirely counter to the principles of our scientific age, yet in the end is not blind faith equally necessary in a scientific setting? After all, we have yet to learn all there is to know about nature; our sciences are as yet unfinished in amassing the totality of reality. Therefore, if we are to accept science as an effective and prosperous measure of nature, then we must also accept that that measure is man-made and limited in its scope.

I see those things measured by science each and every day, and I am in awe of their wonder. I see how the Sun rises in the east and sets in the west, how the stars circle in the sky as the year passes. I hear the wind bristling through the leaves of the trees, and across the tall grass prairies. I have known what it means to be caught on the beach at high tide, and to be at the mercy of the awesome tempestuous power of lightning. Past generations might well have worshiped these forces of nature, seen them as gods like Zeus, Taranis, or Ukko, yet I see them as terrestrial, as natural, as real. The true force, the veritable essence to be worshiped is far greater than even the rolling thunder or bristling lightning.

In these circumstances I am reminded of the American hymn How Great Thou Art, yet in the smallest of moments too I am reminded of God’s coming to Elijah on the softest breath of wind in the cave. Divinity and the essence that made all that we know and love is so far beyond our own understanding, yet in that realisation I find my peace.

Often it can be said that I find my belief renewed through music, through that purest, most mellifluous of sound. Some of the most sacred moments of my life, the most moving moments in the story of my belief have come in moments of music, from operas like Mozart’s Die Zauberflöte to the Pilgrim’s Chorus in Wagner’s Tannhäuser to great orchestral outbursts of emotion as in Stravinsky’s Firebird and most all of Mahler’s symphonies; yet equally spiritually potent for me are the more recently composed naturalistic Mass settings that I sang with the Rockhurst University Chorus while an undergraduate student there from 2011 to 2015. Music has long been said to be the Voice of the Heavens, and certainly it has appeared to be so to me.

Yet what I find the most fulfilling to my belief in the Divine is humanity. In the Christian tradition we believe that humanity was “Created in the Image and Likeness of God.” For me, this means that our souls particularly were made in the Divine Image, but that our bodies also have Divine inspiration. When I see humanity, with all our faults, all our problems, all our pain and anguish, I can’t help but be swept off my feet in grief. Yet at the end of the day I always remember the old adage echoed by Little Orphan Annie, “Tomorrow will be a brighter day.”

I believe that one day that will come true, that one day all will be sorted out in our capitals, our courts, our executive palaces. I believe that one day we will march through our cities, not in protest or in anger, not out of anguish or to alleviate our suffering, but because we are celebrating that most essential characteristic of our humanity: liberty. I believe that someday all humanity will walk together, singing in unison, a multitude of voices, of languages, of cultures and creeds making one song. I believe in optimism, and I am optimistic about my belief.