Tag Archives: Dissertation

The author on a blue background wearing Apple AirPods.

On Machinery



This week, for the penultimate post of the Wednesday Blog, how machinery needs constant maintenance to keep functioning.


I am just old enough to remember life before the ubiquity of computers. I had access to our family computer for as long as I can remember, and to my grandparents’ computer at their condo when we stayed with them in the Northwest Suburbs of Chicago. Yet even then my computer usage was often limited to idle fascination. I did most of my schoolwork by hand through eighth grade, only switching from writing to typing most of my work when I started high school and was issued a MacBook by my school. I do think that a certain degree of whimsy and humanity has faded from daily life as we’ve so fully adopted our ever newly invented technologies. Those machines can do things that in my early childhood would’ve seemed wondrous. Recently it occurred to me that, had I never known how powerful and far-reaching my computer is as a vehicle for my research and general curiosity, I would be happy, delighted in fact, if my computer could perform just one function, say the ability to look up any street address in the United States as a device connected to the US Postal Service’s database. That alone would delight me. Yet that is not the sole function of one application on my computer but merely one of many functions of several such programs I can load on this device, and I can look up addresses not only in the United States but in any country on this planet.

With the right software downloaded onto this computer I can read a digitized copy of nearly any document printed or handwritten in human history and leave annotations and highlights without worrying about damaging the original source. Surekha Davies wrote warmly in favor of annotating in her newsletter this week, and I appreciated her take on the matter.[1] In high school, I was a bit of a prude when it came to annotating; I found the annotated summer reading assignments in my freshman and sophomore English classes almost repulsive because I see a book as a work of art crafted by its author, editor, and publisher in a very specific way. To annotate, I argued, was like drawing a curlicue mustache on the Mona Lisa, a crude act at best. Because of this I process knowledge from books differently. I now often take photos of individual pages and organize them into albums on my computer which I can then consult if I’m writing about a particular book, in much the same fashion as when I’m in the archive or special collections room looking at a historical text.

All of these images can not only be sorted into my computer’s photo library, stored in the cloud and accessible on my computer and phone alike, but also be merged together into one common PDF file, the main file type I use for storing primary and secondary sources for my research. With advances in artificial intelligence, I can now use the top-level search feature on my computer to look within files for specific characters, words, or phrases with varying levels of accuracy. This is something that was barely getting off the ground when I started working on my doctorate six years ago, and today it makes my job a lot easier; the file folder containing all of the peer-reviewed articles I’ve used in my research since 2019 alone holds 349 files and is 887.1 MB in size.
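For the curious, the merging step can be sketched in a few lines of Python with the Pillow imaging library; this is a minimal sketch assuming a hypothetical folder named book_photos whose JPEG filenames sort in page order, an illustration of the workflow rather than any particular tool I use.

```python
# Minimal sketch: merge a folder of page photographs into a single PDF.
# Assumes a hypothetical folder "book_photos" of JPEGs named in page order.
from pathlib import Path

from PIL import Image

pages = [Image.open(p).convert("RGB")
         for p in sorted(Path("book_photos").glob("*.jpg"))]
pages[0].save("book.pdf", save_all=True, append_images=pages[1:])
```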

Our computers are merely the latest iterations of machines. Charles Babbage’s (1791–1871) Difference Engine, often called the first computer, worked in a fashion fairly similar to our own machines, albeit built of mechanical levers and gears where ours have intricate electronics. I, like many others, was introduced to Babbage and his difference engine by seeing the machine on display in the Science Museum in London. This difference engine was a mechanical calculator intended to compute mathematical functions. Blaise Pascal (1623–1662) and Gottfried Wilhelm Leibniz (1646–1716) both developed similar mechanisms in the seventeenth century, and the still older Antikythera mechanism, built by Greek craftsmen in the second century BCE, could complete some of the same functions. Yet across all of these the basic idea that a computer works in mathematical terms remains the same even today. For all the linguistic foundations of computer code, the functions of any machine boil down to the binary operations of ones and zeros. I wrote last year in this blog about my befuddlement that artificial intelligence has largely been created on verbal linguistic models and was only in 2024 being trained on mathematical ones.[2] Yet even then those mathematical models were understood by the A.I. in English, making their computations fluent in only one specific dialect of the universal language of mathematics and their functionality mostly useless for the vast majority of humanity.
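The mechanical principle is easy to demonstrate. The Difference Engine embodied the method of finite differences: prime the machine with a polynomial’s initial differences, and every further value in its table follows by addition alone, no multiplication required. Here is a minimal Python sketch of the idea, using x² + x + 41, a polynomial Babbage reportedly favored in demonstrations:

```python
# Method of finite differences, the principle behind the Difference Engine:
# after priming with initial differences, each new value needs only addition.
def tabulate(f, x0, steps):
    d0 = f(x0)                               # current table value
    d1 = f(x0 + 1) - f(x0)                   # first difference
    d2 = f(x0 + 2) - 2 * f(x0 + 1) + f(x0)   # second difference (constant for a quadratic)
    values = []
    for _ in range(steps):
        values.append(d0)
        d0 += d1  # one turn of the "addition wheels"
        d1 += d2
    return values

print(tabulate(lambda x: x * x + x + 41, 0, 5))  # [41, 43, 47, 53, 61]
```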

Yet I wonder how true that last statement really is. After all, I, a native English speaker with recent Irish roots, learned grammar like many generations of my ancestors through learning to read and write in Latin. English grammar generally made no sense to me in elementary school; it is, after all, very irregular in a lot of ways. It was only after my introduction to a very orderly language, the one for which our Roman alphabet was first adapted, that I began to understand how English works. The ways in which we understand language in a Western European and American context rely on the classical roots of our pedagogy, influenced in their own time by medieval scholasticism, Renaissance humanism, and Enlightenment notions of the interconnectedness of the individual and society alike. I do not know how many students today in countries around the globe are learning their mathematics through English in order to compete in one of the largest linguistic job markets of our time. All of this may well be rendered moot by the latest technological leap announced by Apple several weeks ago: their new AirPods will include a live translation feature, acting as a sort of Babel Fish or universal translator depending on which science fiction reference you prefer.

Yet those AirPods will break down eventually. They are physical objects, and nothing which exists in physical space is eternal. Shakespeare wrote it well in The Tempest that

“The solemn temples, the great globe itself,

Yea, all which it inherit, shall dissolve,

And, like this insubstantial pageant faded,

Leave not a rack behind. We are such stuff

As dreams are made on, and our little life

Is rounded with a sleep.” (4.1.170-175)

For our machines to last, they must be maintained, cleaned, given breaks just like the workers who operate them, lest they lose all stamina and face exhaustion most grave. Nothing lasts forever, and the more those things are allowed to rest and recuperate the more they are then able to work to their fullest. So much of our literature from the last few centuries has been about fearing the machines and the threat they pose. If we are made in the Image of God, then machines, our creation, are made in the image of us. They are the products of human invention and reflect back to us ourselves, yet without the emotion that makes us human. Can a machine ever feel emotion? Could HAL 9000 feel fear or sorrow? Could Data feel joy or curiosity? And what of the living beings who in our science fiction retrofitted their bodies with machinery, in some cases to the extent that they became more machine than human? What emotion could they then feel? One of the most tragic reveals for me in Doctor Who was that the Daleks (the Doctor’s main adversaries) are living beings who felt so afraid and threatened that they decided to encase the most vital parts of their physical bodies in wheelchair tanks, shaped like pepper shakers no less, rendering them resilient adversaries for anyone who crossed them. Yet what remained of the being inside? I urge caution with suggestions of the metaverse or other technological advances that draw us further from our lived experiences and more into the computer. These allow us to communicate, yet real human emotion is difficult to express beyond living, breathing, face-to-face interactions.

After a while these machines which have our attention distract us from our lives and render us blind to the world around us. I liked to bring this up when I taught Plato’s allegory of the cave to college freshmen in my Western Civilization class. I would conclude the lesson by remarking that in the twenty-first century we don’t need a cave to isolate ourselves from the real world; all we need is a smartphone and a set of headphones, and nothing else will exist. I tried to make this humorous, in an admittedly dark fashion, by reminding them to at least keep their headphones on a mode that lets in outside sound so they can hear their surroundings, and to look up from their phone screens when crossing streets lest they find themselves flattened like the proverbial cartoon coyote on the front of a city bus.

If we focus too much on our machines, we lose ourselves in the mechanism; we forget to care for ourselves and attend to our needs. The human body is the blueprint for all human inventions, whether physical ones like the machine or abstract ones like society itself. As I think further about the problems our society faces, I conclude that at the core there is a deep neglect of the human at the heart of everything. I see this in the way that disasters are reported on in the press: often the financial toll is covered before the human cost, clearly demonstrating that the value of the dollar outweighs the value of the human. In surrendering ourselves to our own abstractions and social ideals we lose the potential to change our course, repair the machinery, or update the software to a better version with new security patches and fixes for glitches old and new. In spite of our immense societal wealth, our ever-advancing scientific frontier, and our technological achievement we still haven’t gotten around to solving hunger, illiteracy, or poverty. In spite of our best intentions our worst instincts keep drawing us into wars that only a few of us want.

The Mazda Rua, my car, is getting older, and I expect if I keep driving it for a few years or more it’ll eventually need more and more replacement parts until it becomes a Ship of Theseus. Yet is not the idea of a machine the same even if its parts are replaced? That idea is the closest I can come to imagining a machine having a soul as natural things like us have. The Mazda Rua remained the Mazda Rua even after its brakes were replaced in January and its slow-leaking tire was patched in May. As it moves into its second decade, that old friend of mine continues to work in spite of the long drives and all the adventures I’ve put it through. Our machinery is in desperate need of repair, yet a few see greater profit in dysfunction than they figure they would get if they actually put in the effort, money, and time to fix things. If problems are left unattended for long periods of time they will eventually lead to mechanical failure. The same is true for the machinery of the body and of the state. Sometimes a good repair is called for: reform to the mechanisms of power which will make the machine work better for its constituent parts. In this moment that need for reform is being met with the advice of a bad mechanic looking more at his bottom line than at the need of the mechanism he’s agreed to repair. Only on this level the consequences of mechanical failure are dire.


[1] Surekha Davies, “Walter Raleigh’s headless monsters and annotation as thinking,” in Strange and Wonderous: Notes from a Science Historian, (6 October 2025).

[2] “Asking the Computer,” Wednesday Blog 5.26.


A photograph of the Parade of African Mammals in the Grand Gallery of Evolution at the National Museum of Natural History in Paris taken by the author from the 3rd floor.

On Systems of Knowing



This week, I argue that we must have some degree of artifice to organize our thoughts and recognize the things we see in our world.


Near the end of June, on a Sunday afternoon visit to the Barnes & Noble location on the Plaza here in Kansas City to pick out books to gift to family, I espied a copy of Jason Roberts’s new paperback Every Living Thing: The Great and Deadly Race to Know All Life. In the Plutarchan model it is a twenty-first century Parallel Lives of Carl Linnaeus (1707–1778) and Georges-Louis Leclerc, Comte de Buffon (1707–1788), two of the eighteenth century’s most prolific naturalists. I saved it as fun reading for when I felt I’d done enough of my proper historical work. That moment came after I finished writing the first draft of the new introduction to my dissertation, a rather large addition to my doctoral study which is mostly historiographic in nature.[1] I’ve been reading Roberts’s book in my free time and delighting in the vibrant portraits he paints of the two men in question. I am a new Fellow of the Linnean Society of London, elected in January 2025, and so I arrived at this particular book with a happy perspective on Linnaeus, whose Systema Naturae is cited in my dissertation as the first identification of the three-toed sloth by the genus Bradypus. At the same time, I’ve referenced Buffon’s Histoire Naturelle far more frequently in those moments when I’m following the legacy threads of my own Renaissance naturalists into the Enlightenment. After all, Buffon cited Thevet on several occasions where the savant referred to the same animals which the earlier cosmographer had described two centuries before.

In spite of my own Linnean affiliation, and my use of Buffon’s corpus in the earliest stages of my broader historiography, I am still largely unfamiliar with these two men. I first knew of Buffon for his famous comments presuming the diminutive nature of American animals when compared with their Afro-Eurasian counterparts, to which Thomas Jefferson retorted by sending Buffon evidence of an American moose.[2] I also know very little about Linnaeus; most of what I know of the Swede comes from lectures presented online at the Linnean Society, including one given in May by Staffan Müller-Wille, Professor in the History and Philosophy of the Life Sciences at Cambridge, about Linnaeus’s Lapland diary from his northern expedition in 1732.[3] There is a new biography of Linnaeus by Gunnar Broberg titled The Man Who Organized Nature: The Life of Linnaeus which I have my eye on yet haven’t gotten a copy of quite yet. So, reading Roberts’s book is a quick introduction for me to this man, most influential for me through his method of binomial taxonomy, which has appeared time and again here in the Wednesday Blog. Yet this system followed after Linnaeus’s earlier alphabetical system for identifying plants by sexual characteristic. The basic premise here is that if there are 26 letters in the alphabet, we can use that familiar framework to organize other complicated concepts for easy recognition. Linnaeus used this to categorize plants by their male and female sexual characteristics in his 1730 booklet Praeludia Sponsaliorum Plantarum, or Prelude to the Betrothal of Plants.[4] Therefore, Linnaeus could go around the botanical garden at the University of Uppsala in 1730 and quickly identify a plant as a J plant or a G plant. First reading this, I thought of the way that letters are used by the Federal Reserve System to identify specific regional branches. Thus, J represents the Federal Reserve Bank of Kansas City and G the Federal Reserve Bank of Chicago.
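The Federal Reserve’s letters work because a letter’s position in the alphabet is its district number, a small illustration of the same principle of using a fixed, familiar ordering as a ready-made index. A minimal sketch:

```python
# Federal Reserve district letters encode alphabetical position:
# A is the 1st district (Boston), L the 12th (San Francisco).
def district_number(letter: str) -> int:
    return ord(letter.upper()) - ord("A") + 1

print(district_number("J"))  # 10, Kansas City
print(district_number("G"))  # 7, Chicago
```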

I like the idea behind Linnaeus’s alphabetic system, yet having only 26 categories to describe the entire plant kingdom seems doomed to be flawed, as it relies on a belief that all the plants known to exist are the only ones that exist, that there’s nothing new under the Sun to be discovered. Roberts frames this in a biblical context, describing how Olof Celsius (1670–1756), one of Linnaeus’s first professors, met the young Linnaeus while working on a project called the Hierobotanicum, or Priestly Plants, intended to be a compendium of all 126 plants mentioned in the Old and New Testaments.[5] Why would Linnaeus need more than 26 categories to contain all the plants known to the Ancients and to the Bible? Naturally, the flaws were apparent from the start: this was a system of knowing which originated in the more arid landscape of the Levant being applied to the cooler and damper climate of Sweden. I’ve noticed this in my own life, how many cultural elements which we practice in the United States, notably the seasons, better fit the natural climate of New England and England proper than they do here in the Midwest with its far more variable conditions depending on the time of year, or even the given hour. Roberts deconstructs Linnaeus’s early efforts near the end of Part I of his book when he describes Linnaeus’s first scholarly collision with Buffon after the Frenchman’s appointment by Louis XV as Intendant of the Jardin du Roi in Paris.[6] In a debate which Roberts calls “the Quarrel of the Universals” Linnaeus argued that species could be recognized from individual type specimens while Buffon countered that this ran the great risk of minimizing the diversity of life and eliminating potential variations in nature.

This got me thinking about systems of knowing, so I decided to render the title of the original file for this blog post that you’re now reading (or listening to) De Systemarum Scientis, in the full Latinate tradition of my own scholarship, or “On Systems of Knowing” in English. Why is it, for instance, that our Roman alphabet begins with A and ends with Z? The first half of that question is easier to answer: the Romans adapted our alphabet from the Greeks, who started theirs off with α alpha, β beta, thus the noun alphabet itself. Yet the Greek alphabet ends with ω omega rather than ζ zeta, so why does ours end with Z? What I’ve heard about this is that the Greek letters that were adopted into the Roman alphabet were tacked onto the end of the line, or at least this is what I remember being taught when I learned to recite the alphabet in French in my undergraduate years. French calls the letter Y i grec, or the Greek i. Likewise, everyone except us Americans calls the final letter of the Roman alphabet some variation of zed, which is a shortening of the Greek zeta. This better reflects that letter’s original sound in Greek, just as the cursive lowercase z is the lowercase Greek ζ adopted straight into the Roman alphabet without any major changes.

So, when it comes to the organization of our knowledge there are things that we know in this same alphabetical order or in relation to it. Because the Roman alphabet is written left to right, we know that when it’s used to set up a coordinate system on a printed map, A will always appear at the top left, orienting the way the map should be held. Likewise, a reader can quickly scan through an index in any language written in the Roman alphabet by following along with the order of the letters. How individual languages index objects from that point on differs, but the foundational element remains the same. The Roman alphabet works best for Latin, the language for which it was originally developed, so its phonetic values tend to be adapted depending on which language is using it. This is why English uses the letter W to represent a [w] sound while German, and French in loanwords, uses W to represent a [v] sound. Meanwhile, Irish represents the [w] and [v] sounds with two digraphs, bh and mh, each of which can represent either sound depending on the context; typically bh represents [v] while mh represents [w]. The reasoning behind this is that when the Roman alphabet was adapted by Latin speakers to fit Old Irish in the fifth and sixth centuries CE, they approximated the phonology of their Latin in rendering the Roman alphabet usable for Irish. So, to these monks the Irish [v] sound in a Gaelic name like Medbh sounded enough like how the letter b was used at the time that they used that letter to approximate this [v] sound. It’s notable to me that in Modern Greek the letter β is today pronounced vita, and in the Cyrillic alphabet the letter В represents this same [v] sound while the letter Б represents the [b] sound that we English speakers associate with the letter B. Cyrillic and its predecessor the Glagolitic alphabet were being developed around the same time that the Roman alphabet began to be used for Irish, so there must’ve been something going on with the pronunciation of people’s Bs becoming closer to Vs in late antiquity. Thus, the ways in which our alphabets represent specific sounds today reflect the prestige dialects of our two classical languages, Latin and Greek, as they were spoken over a millennium ago.
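That shared, fixed ordering is what makes an index fast to consult: a sorted list can be searched by repeated halving rather than read page by page. A minimal Python sketch, with a hypothetical list of index entries:

```python
import bisect

# A sorted index exploits the alphabet's fixed order: finding an entry
# takes O(log n) comparisons instead of a scan of every page.
entries = ["Antikythera", "Babbage", "Buffon", "Leibniz",
           "Linnaeus", "Pascal", "Thevet"]
i = bisect.bisect_left(entries, "Linnaeus")
print(i, entries[i])  # 4 Linnaeus
```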

Consider then how we distinguish technical, scientific, or artistic terminology depending on the prestige language of that field. History has largely become a vernacular field, where we adapt terms familiar enough to the non-professional to initiate them into what Ada Palmer calls the History Lab. Yet often these terms will have etymologies beyond English itself. Consider the word photograph, or its more common shortened form photo. This word comes purely from Greek, the classical language more associated with science and technology. It blends φωτο-, the combining form of φῶς (phôs), or light, with the suffix -γρᾰ́φος, from the verb γρᾰ́φω meaning to draw, sketch, or write. So, photography at its core is light writing. Neat! The word photography entered English from the French photographie, that etymology referring to the French origins of the art and craft of photography itself in the middle of the 1820s. Yet the linguists who modernized Irish a century ago decided to favor indigenous terminologies, rendering this word grianghraf, using the Irish word grian for Sun instead of a variation of φωτο- (light) while adopting the Greek -γρᾰ́φος suffix to center this new Irish conception of the term within the same technological corpus as the English photograph. While it was consequential to have a particular Irish name for this technology, one that elevated the Irish use of photography as equal to any other culture’s and made it particular within the Irish language, it still remains rooted in the same Western tradition of grounding our names for scientific and technical things in Greek.

Language directly influences how we know things because it is the vehicle by which we recognize those things around us. I know that a photograph is something made by “light writing,” therefore I will also recognize that anything else beginning with “photo” refers to “light” and that anything ending with “graph” refers to some form of record or writing. I come from a culture where light is connected with goodness and dark with ill. Likewise, I think of blue and green as happier colors rather than red or orange, which are angrier colors. There is safety in light; in the daytime we can see people or things coming toward us more easily than in the dark of night. At the Easter Vigil the celebrant lights the Paschal Flame which is then passed around the church so that we all share in the Light of Christ (Lux Christi) returned to the world with the Resurrection. The central question in my dissertation is linguistic: what did André Thevet (1516–1590) mean when he referred to the Americas overall as sauvage? This French word translates into English as both savage and wild, yet I chose to retain the original French to better represent the original concept, which encompasses both English senses. This word was not necessarily racial in the modern sense; rather, Thevet used sauvage to describe people, places, and things which existed beyond civilization. That word, civilization, betrays its own original meaning: city life. Thevet himself understood the sauvage to be the antonym of this city life. I describe it in the introduction to my dissertation in terms of light and dark, following the cultural connotations already illuminated: the city is the sun whence radiates the light of civilization. The further one goes from that sun, the darker things become and the less civilized they remain. Thevet’s sauvage existed at that furthest extreme in the dark. I imagine the character of Gollum in this sort of darkened existence, deep beneath the Misty Mountains, uninterested in light save for the Ring of Power which consumed his day, rendering it eternal night. In the literature of Thevet’s era a fine sauvage characterization is Caliban in Shakespeare’s Tempest, wild as the waters which wrecked King Alonso and his men on the island in Act 1 of that play.

Roberts notes how these linguistic attributes influenced Linnaeus’s systemization of humanity in the 1735 edition of his Systema Naturae. The Swede divided humanity into four subcategories described by color over any other facet.[7] Roberts spends the following five pages questioning Linnaeus’s methodology, asking “why four?” and why these specific colors. There is some historical context for Linnaeus’s choice to define Black Africans by color; even Thevet referred to the varied peoples of Africa as “black” in his Singularitez de la France Antarctique. Thevet hints at a possible environmental cause for blackness, writing that the peoples “of Barbary” who are “the blackest” are “of the same manners and conditions as their region is hotter than others.”[8] Thevet’s understanding of African geography is somewhat uncertain, so his definition of Barbary may not align with the Berbers from whom the Barbary Coast of the Maghreb was named. Still, it hints at an understanding that the hotter, or more torrid, the climate got, the darker the skin of the people would become. Roberts notes that the Portuguese were the first to use the “word negro to signify African origin or descent” in the middle of the sixteenth century.[9] This makes sense considering the Portuguese were the first European power to sail down the West African coast in the fifteenth century. That Roberts notes this Portuguese definition of blackness first appears in the middle of the sixteenth century likely refers to Damião de Góis’s (1502–1574) Chronica do Felicissimo Rei Dom Emanuel of 1566 to 1567, an early source that I’ve consulted for information on the voyages of Vasco da Gama (d. 1524).[10] Geraldine Heng, the leading authority on medieval notions of race, wrote in her 2018 book The Invention of Race in the European Middle Ages that blackness was already well established as an element in religious and secular iconography by the beginning of the First Age of Exploration.[11] Roberts concludes his discussion of this particular racial element of Linnaeus’s great contribution to taxonomy by sullenly noting that it is thanks to Linnaeus that this cultural association of blackness with darkness was given scientific credence, which continues to support racist ideologies to this day.[12]

How do we use our own words to describe things to which they are not suited, in turn transforming the nature of those things so that they may become part of our own world? My research is most interested in understanding how those things at the boundaries of knowledge were understood by André Thevet using the tools afforded to him during the French Renaissance of the sixteenth century. Thevet used the word sauvage to do this and to create a category of life against which he could measure and proclaim the existence of something civilized closer to home. Michael Wintroub, Professor Emeritus of Rhetoric at UC Berkeley, wrote in his 2006 book A Savage Mirror that Thevet’s countrymen sought to “civilize the barbarians” to make up for an insecurity they felt at being called barbarians themselves by Italian intellectuals at the turn of the sixteenth century during the French invasion of Italy under King Charles VIII (r. 1483–1498).[13] As long as there was someone else whom the French could look down upon beyond their own cities, they felt secure in their own civility. Yet the sauvage exists within a larger framework of singularities, a word which is central to Thevet’s cosmography. Thevet used the word singularity to describe those things which were exotic, wondrous, and immensely collectable in his eye, and hopefully in the eyes of potential readers who would buy his books. I see various layers and categories of singularities in Thevet’s cosmography; for instance, he only included images of certain animals in the aforementioned Singularitez of 1557. The sloth and toucan were depicted as well as described, yet the mysterious Ascension Island aponar remained a bird worthy only of a textual description. This suggests that some things were more singular than others, or more worthy of attention and the money needed to produce these woodcut images. These systems of knowing framed around the singularity are the subject about which I intend to write my first academic monograph. Classifying something as singular gives it an appeal which sets it apart from both the civil and the sauvage, as belonging to a higher level of category which can include both the urbane and the agrestic.

Jason Roberts describes Buffon’s and Linnaeus’s mutual missions to make something of themselves and to rise above their provincial origins to the heights of society. I laughed out loud reading Roberts’s introduction to Linnaeus’s character, what felt like an iconoclasm of sorts for this Fellow of the Linnean Society: “Carl Linnaeus was a Swedish doctor with a diploma-mill medical degree and a flair for self-promotion, who trumpeted that ‘nobody has been a greater botanist or zoologist’ while anonymously publishing rave reviews of his own work.”[14] Buffon by contrast took advantage of a golden opportunity to build his own demi-paradise at his manor in the Burgundy countryside until his good reputation as a botanist brought him to royal attention and the appointment as Intendant of the Jardin du Roi.[15] The Jardin des Plantes, as Buffon’s charge is today known, is perhaps a better place to conclude than most. Situated in the Fifth Arrondissement across Boulevard de l’Hôpital and Rue Buffon from Gare d’Austerlitz, the Jardin is an urban oasis created for the purpose of crafting systems of knowing. Its original intent was to serve as a medicinal garden existing beyond the purview of the Sorbonne, Paris’s sole licensed teaching medical school in the seventeenth century.[16] I’ve spent several happy hours wandering through the Jardin, home to the Muséum National d’Histoire Naturelle’s Grande Galerie de l’Évolution, the Galerie de Paléontologie et d’Anatomie comparée, and the Ménagerie du Jardin des Plantes, which was home to Paris’s first resident giraffe, whose story is delightfully told by Michael Allin in his 1998 book Zarafa: A Giraffe’s True Story, from Deep in Africa to the Heart of Paris.[17] While Allin’s heroine Zarafa is not today on display in the Grande Galerie de l’Évolution (she is instead to be found in the Muséum d’Histoire naturelle de La Rochelle), the taxidermy in the Parade of African Mammals that is the centerpiece of the Grande Galerie represents a system of knowing animal life in itself.

An elephant leads the parade, followed by hippopotami, zebras, and giraffes, with two such camelopards standing erect at the center of the procession, their long necks rising toward the upper galleries. Behind them come the horned mammals, rhinoceroses, and at the rear a crouching lion watching its prey. This is a system that Buffon would have appreciated more than Linnaeus, one which represents the nature of individual beings more than species. Each stuffed specimen seems to have its own character, its own personality. They look about as one would expect them to in life. The great artifice of this is the idea of a parade itself, a very human notion indeed, and one that is infrequent enough to be nearly singular in character, a reason for a day out, worth putting in the social calendar of a city, town, or village no matter how large or small. A parade is its own system of knowing.


[1] For my recent essays referring to this current historiographic project see “On Sources,” Wednesday Blog 6.22, “On Writing,” Ibid., 6.27, and “On Knowledge,” Ibid., 6.29.

[2] Lee Alan Dugatkin, Mr. Jefferson and the Giant Moose, (University of Chicago Press, 2009).

[3] Staffan Müller-Wille, “Linnean Lens | Linnaeus’ Lapland Journey Diary (1732),” moderated by Isabelle Charmantier, virtual lecture, 12 May 2025, by the Linnean Society of London, YouTube, 1:04:18, link here.

[4] Jason Roberts, Every Living Thing: The Great and Deadly Race to Know All Life, (Random House, 2024), 45–49.

[5] Roberts, 20.

[6] Roberts, 115–125.

[7] Roberts, 109.

[8] André Thevet, Les Singularitez de la France Antarctique, (Antwerp, 1558), 16r–16v. The translation is my own.

[9] Roberts, 109.

[10] Damião de Góis, Chronica do Felicissimo Rei Dom Emanuel, 4 vols., (Lisbon, 1566–1567).

[11] Geraldine Heng, The Invention of Race in the European Middle Ages, (Cambridge University Press, 2018), 190.

[12] Roberts, 110.

[13] Michael Wintroub, A Savage Mirror: Power, Identity, and Knowledge in Early Modern France, (Stanford University Press, 2006), 42.

[14] Roberts, xii.

[15] Roberts, 107.

[16] Roberts, 96–98.

[17] Michael Allin, Zarafa: A Giraffe’s True Story, from Deep in Africa to the Heart of Paris, (Delta, 1998).


A macaw

On Skepticism



This week, I express my dismay at how fast time seems to be moving for me of late and how it reflects the existence of various sources of knowledge in our world.


I first noticed the passage of time on my tenth birthday; that is to say, I remember remarking on how from that day on, for the rest of my life, I would no longer be counting my years in single digits. I remember distinctly the feeling of surprise at this, a sense that I could never go back to my earliest years. That was especially poignant for me as those first six years lived in the Chicago suburbs held a nostalgic glow in my memory then, as they do now. In those early years I felt that time moved slowly; I remember once as a kid I fretted over a three-minute cooking timer, worrying that I would be unable to stand there and watch the flame over which I was cooking eggs for a full three minutes. Today that sounds silly, yet I believe it is vital to remember how I felt all those years ago lest I lose my empathy with my past self or anyone else I may encounter with similar concerns over things I see as minute.

Soon after my tenth birthday, I found a new method of getting through things that I found tedious or even odious to endure. I realized that if I tricked myself into enjoying the moment, the tedium would pass by more quickly than if I wallowed in my annoyance and misery. Perhaps there was a degree of pessimism in this realization: that the good moments don’t seem to last as long as the bad ones in my recollection of things, or that it’s in fact easier to remember the bad than the good. This is something I’ve been struggling with lately: when I find my thoughts sinking to these depths of my greatest uncertainty and grief, I need to remind myself of all the good in my life. Time seems to move faster today than it did before. The days fly by more than linger, and there’s always something new or old that I need to do. I’ve long thrived on work, a trait I inherited from my parents. Often my happiest days are those spent dedicated to a specific task; those days are made happy by my sense of accomplishment once the task has progressed or even is done. I’ve learned to accept that good things won’t often be finished in a day. I’ll push myself instead to do as much as I feel I can do in the span of a day and see where that leaves me when I go to bed at night. With the new introduction to my dissertation this meant that it took me 9 days to write all 105 pages of it. This is one of those times when I feel that I’m on a roll, in my writer’s paradise, when I can write and write and write and not run out of ideas to commit to paper.

Yet I worry about that quicker passage of time because I feel that there are fewer things that I’m able to do in a given day than I would like. I sacrifice rest sometimes in order to see a project to completion, or I choose to try to find a balance between my work and the rest of my life only to see one side or the other overwhelm its counterpart, leaving me feeling unfulfilled when I retire for the night. I do worry that the time I’m afforded is limited, and that I’m not going to do everything I want to undertake. There are plenty of things I want to write, so much I want to say, yet so little time in a given day to say it. I’m still young, just a few weeks past halfway to my 33rd birthday. I have this lingering feeling that there’s so much that I want to do with the life I have and an indeterminate amount of time with which to do those things. Am I content with what I’ve done with my life so far? Yes. Is there so much more I want to do? Absolutely.

I suspect this shock at time moving faster is my own realization of my mortality. Everything has a beginning and an end; the mystery lies in not knowing either terminus directly. How many of us can remember our own birth? I certainly can’t. By the same token we can’t necessarily interview the dead after they’ve shuffled off this mortal coil because, in the words of Dr. McCoy, they’re dead. Thus, we remain doubters of our own mortality, our limits. I often hear older friends talk about how the young feel invincible and immortal and make mistakes which reinforce that sentiment of invincibility, all while, if they’re particularly bold or just unlucky, confirming their mortality with a sudden abandon. Our doubts are aimed at established sources of knowledge, authorities to whom we feel no particular duty to abide even if we begrudgingly accept their precepts out of bare necessity. I see enough drivers every day ignore pedestrian crossing lights, even though those lights are there on the city’s authority to protect us pedestrians when crossing the streets that we’ve abdicated to vehicles. It usually leaves me at least frustrated at the ignorance of the driver, at most even angry when I’ve gotten close to being hit by such an ignoramus.

Skepticism is a significant marker in Renaissance studies as a transitional element from the classically inspired scholarship of the fifteenth and sixteenth centuries into the empirical knowledge-making that traditionally we’ve said was emblematic of the Scientific Revolution. I have many colleagues who are working now on disproving the existence of that Scientific Revolution; I admire that cause and yearn to read what they’re writing even though one of my stock courses to teach is called “the Scientific Revolution: 1500–1800.” Ada Palmer calls Michel de Montaigne, in some ways the inspiration for my Wednesday Blog, “the avatar of this moment” when skepticism became a driving force in Renaissance thought.[1] I argue in my dissertation that the American experience drove the course of skeptical thought in the Renaissance; all the things which André Thevet called singular in the Americas represented a dramatic break from classical standards of knowledge which required a new epistemology to explain them.[2] The key here is that we should never be complacent that our current knowledge is all there is to know; after all, a well-lived life is a life spent learning. I’m skeptical about many things and have a drive to continue learning, to continue exploring. Curiosity hasn’t killed this cat yet.[3]

I find then that my time is best spent in pursuit of this knowledge, and as much as one can learn alone in the solitude of one’s study, reading and thinking quietly like a monk, it is far better to learn in communion with others. Since the pandemic began, I’ve grown particularly fond of Zoom lectures, webinars, and workshops as much for the expertise on show as for the community they build. Even if we only communicate through these digital media I still look forward to seeing these people, to experiencing that one part of life with them. We learn so that we might have richer experiences of our own lives, so that we might find comfort in our knowledge, so that we might, in Bill Nye’s words, “change the world.” In the time that I have afforded to me I want to learn more than anything else, to learn about the people around me, about our common heritage, about what our future may hold, and about myself. If I can do that, then when I am “no more, cease to be, expired and gone to meet my maker, become a stiff, bereft of life and resting in peace” I’ll be content in my leave-taking. Hopefully unlike the dead parrot they won’t nail me to my perch like Bentham’s auto-icon, which greets knowledge-seekers in the South Cloisters of University College London, though that could be a rather humorous way to go.


[1] Ada Palmer, Inventing the Renaissance: The Myth of a Golden Age, (University of Chicago Press, 2025), 603.

[2] If this word epistemology leaves you confused, have no fear, for my own benefit as well I wrote a blog post explaining this word alongside two of its compatriots. “Three Ologies,” Wednesday Blog 6.6.

[3] Meow.


A figure from Raphael's "The School of Athens" variously identified as Francesco Maria della Rovere, Pico della Mirandola, or Hypatia of Alexandria.

On Knowledge



This week, I want to address how we recognize knowledge in comparison to the various fields of inquiry through which we refine our understanding of things.


Lately my work has been dedicated to a thorough review of the historiography within which I’m grounding my dissertation. I wrote about this two weeks ago in an essay titled “On Writing.”[1] My research is historical, yet it touches on secondary literature which operates within various fields within the discipline of history. These include Renaissance history and its larger sibling early modern history, the history of cartography, the history of animals, the history of botany, and more broadly the history of early modern science. Methodologically, I owe a great deal to two great twentieth-century Francophone anthropologists, Alfred Métraux (1902–1963) and Claude Lévi-Strauss (1908–2009). While Métraux and Lévi-Strauss aren’t considered directly in the historiographic section of the new introduction that I’m writing for my dissertation, which is limited to sources published since the millennium, they nevertheless stand tall in the background of my history.

Today we often talk within academia about a desire for interdisciplinarity in our work and our research. We’ve found ourselves too narrowed by our ever-shrinking fields and seek greener common pastures for grazing, as our intellectual and pastoral ancestors alike once knew. In my case, this interdisciplinarity lies more in my efforts to incorporate historical zoology into my work, an approach which uses zoological methods and theory to explain historical animals. I have friends who study many things. Among them is one whose passion for history, classics, and mathematics has come together to craft a dissertation which seeks to demonstrate the intersections between those three to better understand the great transitions in human inquiry. Another seeks to follow the medical connections across oceans between disparate regions in the Americas and Europe, connections that existed even if they seem remarkable today. Still more, I have a friend who applies basic economic need to explain a complex diplomatic situation that once existed between the Venetian Republic and the Ottoman Empire in the Adriatic Sea. All of these historians of whom I write are applying a degree of interdisciplinarity to their work that reflects their own disparate interests and curiosities. In early modern history we talk about curiosities as objects which were collected from disparate and exotic lands into cabinets to display the erudite collector’s prestige and wealth. I say our curiosity is something to be collected by those worthy archives, libraries, museums, or universities that will employ us in the near future, and for us to feed with new ideas and avenues of investigation so that we will never be bored with life.

In all of these things, there is an underlying genre of knowledge which I am addressing. I’ve written thus far about history alone, yet it is the same for the anthropologists, astronomers, planetary scientists, and physicists whom I know. Likewise for the literature scholars and the linguists. Our fields of inquiry all grow on the same planet that is our collected knowledge. In English, this word knowledge is somewhat nebulous. To me, it says that we know things broad or specific. In London, for instance, the Knowledge is the series of tests which new cabbies must complete in order to learn every street within a certain radius of Charing Cross. The Latin translation of this word, scientia, makes things even more complicated, as that is the root of the English word science. Thus, when we refer to Renaissance science, there is always a caveat in the following sentence explaining that “this is not science as we know it but a sort of protoscience.” I was advised, similarly, after a particularly poorly received presentation at a workshop at the Museum of Natural Sciences in Brussels in October 2023 that I shouldn’t refer to “sixteenth-century conservation” because no such concept existed at the time; instead, it would be better to discuss a “genealogy of conservation.” This sense that modern terms, in use since the Enlightenment of the eighteenth century, ought not to be pulled further back into the past loses, I think, some of the provenance of those terms and of how the Enlightenment philosophes first came across them.

I find it telling that the Ancient Greek translation of knowledge, γνῶσις (gnôsis), is a word with which I’m more familiar from theology and the concept of Gnosticism, whereas scientia reminds me of philosophy and the other fields of inquiry which grew from that particular branch of the tree of human curiosity. One might even say that philosophy and theology are a pair, siblings perhaps? They seek to understand similar things: on the one hand an inquiry into thought, and ideally wisdom, and on the other a search for the nature of the Divine, which at least in my Catholicism we can know because we are made in the Image of God. The division here between the Ancient Greek term being affiliated with faith and the Latin one with reason speaks, I think, to the Latin roots of my own education in Catholic schools and at a Jesuit university, where I learned about Plato and Aristotle, yet I recognized Aristotle’s Historia animalium (History of Animals) by the Latin name by which it was generally known in Western Europe for centuries before the rise of vernacular scholarship rather than by its Greek original Τῶν περὶ τὰ ζῷα ἱστοριῶν (Ton peri ta zoia historion). Note that the English translation of this title, History of Animals, better reflects the Latin cognate of ἱστοριῶν than the better English translation of that Greek word, Inquiry.

Added onto these classical etymologies, in my first-semester Historiography class at Binghamton University I was introduced to the German translation of scientia, γνῶσις, and knowledge. Wissenschaft struck me immediately because I saw the German cognate for the English word wizard in its prefix, and because I knew that the -schaft suffix tends to translate into English as -ship. Thus, my rough Anglicization of Wissenschaft renders Wizardship, which is rather nifty. Yet this word Wissenschaft was instead used in the nineteenth century as a general word which could be translated into English as science. This is important for us historians trained in the United States because our own historiographic tradition, that is, our national school of historians, traces its roots back to German universities in the early and middle decades of the nineteenth century. I remember long sessions of my historiography class at UMKC discussing the works of Leopold von Ranke (1795–1886), the father of research-based history. I felt a sense that this concept of Wissenschaft seemed relatable, and as it turned out that was because Irish has a similar concept.

Whereas in English we tack the suffix -ology onto any word to make it the study of that word, in Irish you add the suffix -ocht. So, geology is geolaíocht and biology is bitheolaíocht. Yet note with the second example that the suffix is not just -ocht but an entire word, eolaíocht. This is the Irish word for science, added onto the end of bitheolaíocht to demonstrate that this word refers to the study of bith-, a combining form of the word beatha, meaning life. So, biology then is the science of life itself. Powerful stuff. I appreciate that Irish linguists and scholars have sought overall to preserve our language’s own consistency with its scientific terminology. It means that these fields of study, these areas of knowledge, can exist purely within the purview of the Irish language without any extra need to recognize that their prefixes or suffixes come from Latin, Greek, or English. There are some exceptions of course: take zó-eolaíocht, the Irish word for zoology, which effectively adopts the Greek word ζῷον, perhaps through the English zoo, into Irish. Would it not have been just as easy for whoever devised this hyphenated word to instead write ainmhíeolaíocht, translated into English as the science of animals? Here though I see more influence from English, because this language adopts as much as it can from other languages out of prestige and a desire for translingual communicability. As an English speaker, I find scholarly works often easier to read because we share common etymologies for our words relating to knowledge. English’s science, geology, biology, and zoology are French’s science, géologie, biologie, and zoologie. In English, we drop any pretense of Englishness to clothe ourselves in a common mantle familiar to colleagues from related cultures around the globe. In academia this is to our mutual benefit; after all, so much of our work is international. I’m regularly on webinars and Zoom calls with colleagues in Europe, for instance. I believe this is the lingering spirit of the old scholarly preference for Latin as a lingua franca, which at least to me seems close enough in the past to be tangible, yet realistically it has surely been a very long time since any serious scholarly work beyond classics was published in Latin for the benefit of a broad translingual readership.

I for one admire the Irish word eolaíocht and its root eolas, which translates into English as knowledge, that is, an awareness of things, because eolaíocht represents a universal concept while retaining its own native nature. So often in my research I am discussing the early assimilation of indigenous cosmovisions, to borrow a Spanish word put to good use by Surekha Davies in her latest book, into the nascent global world centered on Europe.[2] I see how these cosmic conceptions faded until they were rendered in Gothic or Latin letters on the voluminous pages of encyclopedic Renaissance general and natural histories, which remain among the most often cited primary sources for these indigenous cultures, cultures which, Marcy Norton argued in her 2024 book The Tame and the Wild: People and Animals After 1492, had their own classical past made remote from their colonial present by European contact, conquest, and colonization.[3] Seeing these indigenous perspectives fade into their categorized and classified statuses within the cosmos defined by Europe’s natural philosophers, I feel fortunate that my own diaspora (which was also colonized) has retained this element of our individual perspective. I first came across the -ocht suffix in the word poblacht, the Irish word for republic. A famous story from the birth of the Irish Free State during the Anglo-Irish Treaty negotiations in 1921 tells of British Prime Minister David Lloyd George, a Welsh speaker, remarking to Michael Collins, an Irish speaker, that their choice of a republic was unusual because none of the Celtic languages naturally has a word for republic. That word evokes its Roman roots in the ancient Res publica Romana, the Roman Republic, whose northward expansion across the Alps led to the gradual death of the Continental Celtic languages, whose speakers’ descendants today are largely the Western Romance speakers of French, Romansh, Occitan, Catalan, Spanish, Galician, and Portuguese, among others. Romance languages are noted for their common descent from Latin, whence they all derive variations on the Latin word scientia; English gets science through Old French. “How are you going to name your new government in the Irish language?” Lloyd George asked. Collins replied something along the lines of “a kingdom is called a ríocht, so this government of the people (pobal) will be called a poblacht.” Thus, the Republic of Ireland is named in Irish Poblacht na hÉireann. Naturally, this word pobal derives from the Latin populus, so the shadow of Rome hovers even over unconquered Hibernia. Yet that is another topic for a different essay.

Let me conclude with a comment on the difference between knowledge and wisdom, as I see it. The former is far more tangible. We can know things through learning, embodied best in living and in reading. I know, for instance, to look both ways before crossing a street because plenty of people in the last 140 years have been hit by cars, buses, and trucks, and you can never be too careful. Likewise, I know everything I do about the things I study through reading what others have written about these topics. It’s my job then to say what I will. In Whitman’s words made immortal by our recitation, the answer to the eternal question is “that the powerful play goes on, and you may contribute a verse.” That’s history, people! Reading the powerful play of what others have written and summoning up the courage to take the podium and have your say. I first heard this particular poem, as did many in my generation, recited by Robin Williams in the 1989 film Dead Poets Society. Knowledge is the recitation of these facts we’ve learned. Wisdom is understanding how these facts fit together and speak to our common humanity. What makes us human? I believe it’s as much what we know as what we remain ignorant of. Our ignorance isn’t always a curse; rather, it’s another foggy field we’ve yet to inquire about, a place where someone’s curiosity will surely thrive someday. It is another evocation of eolas still to come in our long human story. How wondrous is that?


[1] “On Writing,” Wednesday Blog 6.27.

[2] Surekha Davies, Humans: A Monstrous History, (University of California Press, 2025).

[3] Marcy Norton, The Tame and the Wild: People and Animals After 1492, (Harvard University Press, 2024), 307.


The author pulling a face at the camera.

On Writing

This week, some words about the art, and the craft, of writing.—Click here to support the Wednesday Blog: https://www.patreon.com/sthosdkane—Links in this episode: Patrick Kingsley, Ronen Bergman, and Natan Odenheimer, “How Netanyahu Prolonged the War in Gaza to Stay in Power,” The New York Times Magazine, (11 July 2025); John McWhorter, “It’s Time to Let Go of ‘African American’,” The New York Times, (10 July 2025); Bishop Mark J. Seitz, D.D., “‘The Living Vein of Compassion’: Immigration & the Catholic Church at this moment,” Commonweal Magazine, (June 2025), 26–32; “On Technology,” The Wednesday Blog 5.2; “Artificial Intelligence,” The Wednesday Blog 4.1.


This week, some words about the art, and the craft, of writing.


In the last week I’ve been hard at work on what I hope is the last great effort toward completing my dissertation and earning my doctorate. Yet unlike so much of that work, which currently stands at 102,803 words across 295 U.S. Letter-sized pages inclusive of footnotes, front matter, and the rolling credits of my bibliography, I now sit at my desk day in and day out not writing but reading, intently and thoroughly, books that I’ve read before yet now need to refresh myself on as their arguments pertain to the subject of my dissertation: that André Thevet’s use of the French word sauvage, which can be translated into English as either savage or wild, is characteristic of the manner in which the French understood Brazil, the site of France’s first American colony, and the Americas overall within the broader context of French conceptions of civility in the middle decades of the sixteenth century. I know, it’s a long sentence. Those of you listening may want to rewind a few seconds to hear it again. Those of you reading can do what my eyes do so often, darting back and forth between lines.

As I’ve undertaken this last great measure, I’ve dedicated myself almost entirely to completing it, clearing my calendar as much as I reasonably can to finish this job and move on with my life to what I am sure will be better days ahead. Still, I remain committed to exercising, usually a 5 km walk around the neighborhood for an hour each morning, and to the occasional break that lets my mind mull over the things I’ve read while I distract myself with something else. That distraction has mostly been found on YouTube ever since I started high school and had a laptop of my own. This week, I was planning to write a blog post comparing the way my generation embraced the innovation of school-issued laptops in the classroom with the way that, starting next month, schools and universities across this country will introduce artificial intelligence tools to their classrooms. I see the benefits, and I see tremendous risks as well, yet I will save that for a lofty second half of this particular essay.

I’ve fairly well trained the YouTube algorithm to show me the sorts of videos I tend to enjoy most. Opening it now, I see a segment from this past weekend’s broadcast of CBS Sunday Morning, several tracks from classical music albums, a clip from the Marx Brothers’ film A Night at the Opera (the source of my favorite Halloween joke), and a variety of comic videos, from Conan O’Brien Needs a Friend to old Whose Line is it Anyway clips. Further down are the documentary videos I enjoy from history, language, urbanist, and transportation YouTubers. Yet in the last week or so I’ve been seeing more short videos of a minute or less with clips from Steven Spielberg’s 2012 film Lincoln. I loved this film when I saw it that Thanksgiving at my local cinema. As longtime readers of the Wednesday Blog know, I like to call Mr. Lincoln my patron saint within the American civic religion. To a young boy in Illinois in the ‘90s, he was the hero from our state who had saved the Union and led the fight to abolish slavery during the Civil War 130 years before. Now, 30 years later and 160 years out from that most horrific of American wars, I decided to watch the film again for the first time in a decade. In fact, I’m writing this just after watching it, so some of the inspiration from Mr. Lincoln’s lofty words, performed by the great Daniel Day-Lewis, might rub off on my writing just enough to make something inspirational this week before I return in the morning to my historiography reading.

Mr. Lincoln knew what every writer has ever known: that putting words to paper preserves them longer than uttering even the longest string of syllables can. What I mean to say is, they’ll remember what you had to say longer if you write it down. He knew for a fact that the oft-quoted and oft-mocked maxim that the pen is mightier than the sword is the truth. After all, a sword can take a life, as so many have done down our history and into our deepest past to the proverbial Cain, yet pens give life to ideas that outlive any flesh and bone. I believe writing is the greatest human invention because it is the key to immortality. Through our writing, generations from now people will seek to learn more about us in our moment in the long human story. I admit a certain boldness in my thinking about this; after all, I’ve seen how the readership and listener numbers for the Wednesday Blog ebb and flow, and I know full well that there’s a good chance no one will read this in the week I publish it. Yet I hold out hope that someday there’ll be some graduate student looking for something to build a career on who might just stumble across my name in a seminar on a sunny afternoon and think “that sounds curious,” only to then find some old book of my essays called The Wednesday Blog, and then that student will be reading these words.

I write because I want to be heard, yet I’ve lived long enough to know that it takes time for people to be willing to listen; that’s fair. I’ve got a stack of newspaper articles on the affairs of our time growing while my attention is drawn solely to my dissertation. I want, for instance, to read the lengthy and thorough piece by New York Times reporters Patrick Kingsley, Ronen Bergman, and Natan Odenheimer on how Israeli Prime Minister Netanyahu “prolonged the War in Gaza to stay in power,” which was published last Friday.[1] I also want to read John McWhorter’s latest opinion column, “It’s Time to Let Go of ‘African American’”; I’m always curious to read about suggestions in the realm of language.[2] Likewise, there are sure to be fascinating and thoughtful arguments in the June 2025 issue of Commonweal Magazine, like the article titled “‘The Living Vein of Compassion’: Immigration & the Catholic Church at this moment” by Bishop Mark Seitz, DD, of the Diocese of El Paso.[3] I’m always curious to read what others are writing because often I’ll get ideas from what I read. There was a good while there at the start of this year when I was combing through the pages of Commonweal looking for short takes and articles I could respond to with my own expertise here in the Wednesday Blog. By writing we build a conversation that spans geography and time alike. That’s the whole purpose of historiography; it’s more than just a literature review, though that’s often how I describe what I’m doing now to family and friends outside of my profession who may not be familiar with the word historiography, or staireagrafaíocht as it is in Irish.

Historiography is writing about the history that’s already been written. It’s a required core introductory class in every graduate history program I’m familiar with; I took such a class four times, between my undergraduate senior seminar (the Great Historians), our introductory Master’s seminar at UMKC (How to History I), and twice at Binghamton in courses titled Historiography and On History. The former at Binghamton was essentially the same as UMKC’s How to History I, while the latter was taught by my first doctoral advisor and friend, Dr. Richard Mackenney. He challenged us to read the older histories going back to Herodotus and consider what historians in the Middle Ages, Renaissance, Enlightenment, and nineteenth century had to say about our profession. Looking at it now, the final paper I wrote for On History was titled “Perspectives from Spain and Italy on the Discovery of the New World, 1492–1550.” I barely remember writing it because it was penned in March and April 2020, as our world collapsed under the awesome weight of the Coronavirus Pandemic. Looking through it, I see how the early stages of the pandemic limited what I could access for source material. For instance, rather than rely on an interlibrary loan copy of an English translation, perhaps even a more recent edition, of Edmundo O’Gorman’s The Invention of America, I was left working with the Spanish original that had been digitized at some point in the last couple of decades. Likewise, I relied on books I had on hand in my Binghamton apartment, notably the three volumes of Fernand Braudel’s Civilization and Capitalism, in this case in their 1984 English translations. I wrote this paper and then forgot about it amid all the other things on my mind that spring, only now to read it again. So, yes, I can say to the scared and lonely 27-year-old who wrote it five years ago that someone did eventually read it after all.

What’s most delightful about reading this paper again is that I’m reminded of when I first came across the names of several fellow historians whom I now know through professional conferences and have confided in for advice on my own career. The ideas first written in the isolation of lockdown have begun to bear fruit in the renewed interactions of my professional life half a decade later. What more will come of those same vines planted in solitude as this decade continues into its second half? Stretching that question further back in my life, I can marvel at the friendships I’ve cultivated with people I met in my first year of high school, now 18 years ago. That year, 2007, we began our education at St. James Academy, where many of us were drawn by the promise of each student getting their own MacBook to work on. I wrote here in March 2024 about how having access to that technology changed my life forever.[4] So, in the last week, when I read in one of my morning email newsletters from the papers about the soon-to-be introduction of artificial intelligence to classrooms across this country, heralded in much the same way that laptops in classrooms were heralded as the great new innovation in my youth, I paused for a few moments longer before turning to my daily labor.

I remain committed to the belief that having access to a laptop was a benefit to my education; in many ways it played a significant role in shaping me into the person I am today. I wrote 14 plays on that laptop in my four years of high school, and many of my early essays to boot. I learned how to edit videos and audio, and I still use Apple products today because I was introduced to them at that early age. It helps that the Apple keyboard comes with easy ways to type accented characters like the fada in my name, Seán. Still, on a laptop I was able to write much the same as I had throughout my life to that point. I began learning to type when I was 3 years old and mastered the art in my middle school computer class. When I graduated on to my undergraduate studies, though, I found I could remember notes far better when I took them by hand than when I typed them. This is crucial to my story: the notes that I took in my Renaissance seminar at UMKC in Fall 2017 were written by hand, in French no less, and so when I was searching for a dissertation topic involving Renaissance natural history in August 2019, I remembered writing something about animals in that black notebook. Would I have remembered it so readily had I typed those notes out? After all, I couldn’t remember the title of that term paper I wrote for On History in April 2020 until I reopened the file just now.

Artificial intelligence is different from giving students access to laptops because, unlike our MacBooks in 2007, A.I. can type for the student, not only through dictation but by suggesting a topic, a thesis, a structure, and supporting evidence all in one go. Such a mechanical suggestion is not inherently a suggestion of quality, however, and herein lies the problem. I’ve read a lot of student essays in the years I’ve been teaching, some good, some bad. Yet almost all of them were written in that student’s own voice. After a while the author’s voice becomes clear; with my current round of historiography reading, I’m delighting in finding that some of these historians whom I know write in the same manner that they speak, without different registers between the different formats. That authorial voice is more important than the thesis because it at least shows curiosity, and through it the individual personality of the author can shine past the typeface’s uniformity. Artificial intelligence removes the sapiens from us Homo sapiens and leaves our pride in merely being the last survivor of our genus rather than in being the thinkers who sought wisdom. Can an artificial intelligence develop wisdom? Certainly, it can read works of philosophy both illustrious and indescribably dull, yet how well can it differentiate between those twin categories to give a fair and reasoned assessment of questions of wisdom?

These are some of my concerns with artificial intelligence as it exists today, in July 2025. I have equally pressing concerns that we’ve developed this wondrous new tool before addressing how it will affect our lived organic world through its environmental impact. With both of these concerns in mind, I’ve chosen to refrain from using A.I. for the foreseeable future, a slight change in tone from the last time I wrote about it in the Wednesday Blog on 7 June 2023.[5] I’m a historian first and foremost, yet I suspect, based on the results when you search my name on Google or any other search engine, that I am better known to the computer as a writer, and in that capacity I don’t want to see my voice, as soft as it already is, quieted further by the growing cacophony of computer-generated ideas that would make Aristophanes’ chorus of frogs croak. Today, that’s what I have to say.


[1] Patrick Kingsley, Ronen Bergman, and Natan Odenheimer, “How Netanyahu Prolonged the War in Gaza to Stay in Power,” The New York Times Magazine, (11 July 2025).

[2] John McWhorter, “It’s Time to Let Go of ‘African American’,” The New York Times, (10 July 2025).

[3] Bishop Mark J. Seitz, D.D., “‘The Living Vein of Compassion’: Immigration & the Catholic Church at this moment,” Commonweal Magazine, (June 2025), 26–32.

[4] “On Technology,” The Wednesday Blog 5.2.

[5] “Artificial Intelligence,” The Wednesday Blog 4.1.


Essay Writing

Photo by picjumbo.com on Pexels.com

I’m at a rather fun point in my doctoral studies. Today, I get to spend my days working on my dissertation and serving as a Teaching Assistant for a class. In my TA duties here at Binghamton, I’ve been assigned two sections of 25 students each, so when they submit essays, I find myself grading 50 of those in the course of a few days. It’s a lot of work, and in the moment the grading inspires a variety of emotions in me, from joy at a wonderfully written essay to disappointment at one that could’ve done better with just a little more effort.

One of the greatest boons to my work as a teacher is that now, after 12 years in college and 27 years overall as a student, I’ve finally made sense of how to write an essay. This word, used so often in academic writing, never really clicked for me. I knew I was supposed to write a research paper that had an introduction, body, and conclusion, but I never really got the structure my teachers and professors were going for beyond that. It took my own transition from student to teacher for me to really understand that an essay is an extended argument.

It also took starting to study Renaissance French Humanism and Natural History for me to really understand the origins of the essay with Michel de Montaigne’s 1580 book of Essays, where the term originated. These were reflections on a variety of topics, from children’s education to cannibalism and everything in between. My own Wednesday Blog is in some ways a nod to Montaigne in format. Montaigne’s essays sought to describe his world as he saw it and understood it, in all its rich detail and complexity.

The academic essays that I write, from the quick 3-to-5-page papers I used to write for my undergrad history classes at Rockhurst to my dissertation, which in many aspects is itself a long essay, all have the same core structure and spirit. Yes, at its barest bones the essay is made up of an introduction, body, and conclusion, but there’s so much more rich detail to a good essay than just that.

This semester it really occurred to me that the introduction ought to be made up of three main things: something to catch the reader’s attention, a thesis statement laying out the essay’s argument, and a brief summary of the main points, with some context from the existing literature included if you’re writing at a graduate or professional level. Any of my colleagues reading or listening to this will either find their eyes glazing over here or will instead be laughing that I only really figured this out this late in the game, as I was writing my dissertation.

The body is more than just the main points of the essay; it’s the real meat of the work, the rich quotes and analyses of the sources, the connections made to other works in all their intricate splendor, the quotable lines that help the essay stick in the reader’s memory and leave them looking forward to reading more of your work in the future. I would still say, though, that the thesis remains more important than the body; after all, the body isn’t going to make sense without a strong central argument, a beating heart at the core of the entire work. I often tell my students this and have even begun advising them to underline their thesis statements to help them keep that heart in mind as they continue to flesh out the rest of their written creation.

Finally, there’s the conclusion. It’s a summary of the main points of the essay, a restating of the thesis with the memory of the body fresh in the reader’s mind. The conclusion is a chance to leave your reader with a really strong impression of your essay, something to leave them wanting more.

For a while now I’ve often thought of the essay as a form functioning for scholarship and literature as the symphony functions for music. In both cases there are different styles and methods of elevating the pure form into an art that reflects the writer’s or composer’s personality and craft, that leaves the audience feeling something different, something they can best describe as emblematic of that work’s author. In symphonic music there are clear distinctions between the classical composers like Haydn & Mozart and the romantic composers like Beethoven & Brahms. Sometimes, the best way to end an essay is to borrow an idea from the romantic symphonies and even the romantic operas of composers like Gounod and Wagner: let the main themes finish and then have a sigh to really round things off. If you listen to the finale of Wagner’s Götterdämmerung, you’ll hear this very sigh, as if all the energy built up in the composition over the last few hours is nearly extinguished but has one last breath. If you can write an essay like that, then bravo.

I’m writing this in the midst of an extensive round of edits to my dissertation, going line by line making corrections and clarifications and all around tightening my work until it really elevates the core form of my thesis. In the last year since I started writing this dissertation, I’ve learned a great deal about how to do this job, and I hope in future to avoid some of the great pitfalls I’ve caught myself up in time and again in my studies.

If you’re a fellow academic, or interested in academic writing, I highly suggest you go listen to my friend Kate Carpenter’s podcast Drafting the Past, which is all about the process of writing history. It’s a wonderful service to the profession that Kate’s doing. Enjoy your week!

Doctoral Study

Photo by Ricardo Esquivel on Pexels.com

I was 27 when I arrived here in Binghamton at the start of August 2019. I made a big move out here, with immense help from my parents, and set up shop in a good-sized one-bedroom apartment that’s remained my sanctuary in this part of the country ever since. I’d wanted to continue my education up to the PhD since my high school days, and it’s a plan I’ve stuck with through thick and thin. After a false start in my first attempt to apply to PhD programs in 2016, which led to two wonderful years working on a second master’s in History at the University of Missouri-Kansas City (UMKC), I applied again, now far better positioned for a PhD program, and ended up here through the good graces and friendly insight of several people to whom I’m quite grateful.

Arriving in Binghamton, though, I found the place very cold and quite lonely. In recent months I’ve begun to think more and more about getting rid of some of my social media accounts, only to then remember that Facebook, Instagram, and Twitter have been some of my greatest lines of communication with friends and family back home in Kansas City and elsewhere around the globe throughout these last three years. That first semester was tough, very tough, and while the second semester seemed to get off to a good start, it was marked by the sudden arrival of the Coronavirus Pandemic and the end of my expectations for these years in Binghamton.

I spent about half of 2020 and 2021 at home in Kansas City, surrounded by family and finding more and more things to love about my adopted hometown with each passing day. When I was in Binghamton, it was to work: in Fall 2020 to complete my coursework, and in Spring 2021 to prepare for my Comprehensive Exam and Dissertation Prospectus defense. I still did a good deal of the prospectus work at home rather than here, though the memories of those snowy early months of 2021, reading for the comps at this desk where I sit now, always come to mind when I’m in this room.

As the Pandemic began to lessen in Fall 2021 and into the start of this year, I found myself in Binghamton on a more regular basis. There was something nice about that; sure, I wanted to be home with my family, but I also felt like I was getting a part of the college experience of going away for a few years to study, reminiscent of the year I spent working on my first MA at the University of Westminster in London. I started to venture further afield in the Northeast again, traveling once more to Boston, New York, Philadelphia, Baltimore, and Washington. When I first decided to come here, one of the things I resolved was that I’d take the opportunity of being in the Northeast to see as much of this region as possible.

2022 saw another transition: I wasn’t in one of the newer cohorts in my department anymore. Now, in Fall 2022, I’m one of the senior graduate students. It’s a weird thing to consider, seeing as it felt like 2020 and 2021 passed without the usual social life of the history graduate students here, thanks to the ongoing pandemic. I also began to look more seriously at my future, applying for jobs in cities across this country, and even looking again at some professorships, something I doubted for a while would be an option for me. If there’s anything about life that I’ve learned over the past three years spent here, it’s that you always need to have things beyond your work to look forward to, whether that be a long walk in the woods on the weekends, a day trip to somewhere nearby, or even the latest episode of your favorite show in the evenings. Doing this job without having a life beyond it is draining.

For me, the best times here in Binghamton were in Fall 2021 and Spring 2022, when I truly began to feel like I had a place here that I’d made my own. I was confident in my work, happy with how my TA duties were going, and really enjoying my free time as I began to spend my Friday evenings up at the Kopernik Observatory and Sundays at the Newman House, the Catholic chapel just off campus. I was constantly reading for fun as well, something I’d lost in 2020, when I even fell behind on the monthly issues of my favorite magazines, National Geographic and Smithsonian. There were many weeknights I’d spend out having dinner alone, reading natural history, science fiction, anthropology, and astronomy books.

It’s interesting looking back on myself from six years ago, when I was in London, to the months that summer when I decided I wanted to get back into history after a year studying political science. My motivation was to earn a job at one of the great museums I’d spent countless hours in during that year in the British capital. While I studied for my MA in International Relations and Democratic Politics, I was still spending my free time looking at Greek and Roman statuary, wandering the halls of Hampton Court, or watching hours of history documentaries on BBC Four in the evenings. And now that I’m back in history, as much as I appreciate and love what I do, I find my free time taken up by science documentaries and books.

It’s important, if you want to get your PhD in the humanities or social sciences, to figure out why you want to do this before you start. Have a plan in mind, have a big research question in mind, and focus your attention on that question. My own story has many twists and turns, from an interest in my early 20s in democratic politics to a brief dalliance with late republican Roman history before settling into the world of English Catholics during the Reformation. I ended up where I am today because of another series of events that led me to move from the English Reformation to the French Reformation, and from studying education to natural history. So, here I am, a historian of the development of the natural history of Brazil between 1550 and 1590, specifically focusing on three-toed sloths. In a way there are echoes of all the work I’ve done to date in what I’m doing now; thus, as particular as this topic is, it makes sense in the course of my life as a scholar.

A month from now will be my 30th birthday, a weird thing to write, let alone say aloud. My twenties have been a time of exploration, of both the world around me and of myself. When I look at my photo on my Binghamton ID card, the best way to describe my appearance would be grumpy yet optimistic. Just as I was a decade ago as a sophomore in college, so I am today: looking ahead to the next decade with excited anticipation of what it’ll bring, and hopeful that all the work I’ve done in this decade will find its reward in the next.

Me upon arrival in Binghamton, August 2019.

Patience

Isidor Kaufmann (1853–1921), “Waiting room at the Court”, 1888
This week, how I've learned that patience really is a virtue.

I’ll freely admit that I’m a pretty impatient guy. I feel the most rewarded when I’m able to solve problems quickly and efficiently, and throughout my life I’ve never really enjoyed dealing with long-term questions. As I’ve gotten older, though, my impatience has mellowed out; I’m more willing now to let myself take a day to relax and think rather than trying to force myself to write a page a day or read a book in an afternoon.

In the last few years, with this global pandemic, it’s really begun to occur to me that there is far more outside of my control than within it, a lot more that I simply can’t do anything about. Sure, I’m not naïve enough to think I could single-handedly stop the looming war clouds hovering over Ukraine or solve the climate crisis. Those are big problems that are going to be solved by a whole host of people, likely over generations of hard work. And even in my own work as a historian, and especially as a teacher, I’ve come to learn that as much as I’d like my students to follow directions to the letter, the best I can do is make sure those directions are clear and concise and then let them go down the road I’ve laid. They’re adults, after all; it’s up to them how they want to perform in my classroom.

At this moment in my work, I’m writing my doctoral dissertation. The working title is “Trees, Sloths, and Birds: Brazil in Sixteenth-Century Natural History”. It’s a bit of an oddball of a topic, a combination of many different ideas and fields that I’ve been interested in since childhood. As of today, I have one of my six chapters written, and I’m glad to be in the position I am. But looking ahead at the second chapter, the one I’m going to start writing in the next week, I’ve got to admit it’s daunting to imagine how I’m going to make it work. And that’s the key to this project and to every project any of us will ever attempt: we have to be able to imagine doing it before we actually do it. So, now in my doctoral studies, I’ve learned the benefit of professional patience.

Today is one of those days when I intended to get more done than I actually have. I did make a very loose outline of the chapter I’m about to start writing, with some questions about which order the sections should go in, and I’ve made some headway in coordinating the primary and secondary sources I’m using in my thinking about this chapter. Unlike Mozart in Peter Shaffer’s play Amadeus, I don’t have a fully written draft of this chapter already done in my mind, just waiting to pass through my hands and the keyboard into the word processor on my computer. Instead, I’ve got a loose collection of ideas and an understanding that in a little while, whether it be hours, days, or weeks, I’ll start crafting those into sentences and paragraphs.

That’s my writing process today. It’s less a mad dash to the finish and more a leisurely stroll through the different interrelated ideas I’ve got, until they’ve come together in a convincing argument that I’m willing to send around to interested parties. Patience is a virtue, and while I’m thinking through what this chapter will look like, I’m happy to sit and wait for a good result, knowing that eventually it too will pass.