Tag Archives: The Tempest

The author on a blue background wearing Apple AirPods.

On Machinery



This week, for the penultimate post of the Wednesday Blog, how machinery needs constant maintenance to keep functioning.


I am just old enough to remember life before the ubiquity of computers. I had access to our family computer for as long as I can remember, and to my grandparents’ computer at their condo when we stayed with them in the Northwest Suburbs of Chicago. Yet even then my computer usage was often limited to idle fascination. I did most of my schoolwork by hand through eighth grade, only switching from writing to typing most of my work when I started high school and was issued a MacBook by my school. I do think that a certain degree of whimsy and humanity has faded from daily life as we’ve so fully adopted our ever newly invented technologies. Those machines can do things that in my early childhood would’ve seemed wondrous. Recently, it occurred to me that if I did not know how powerful and far-reaching my computer is as a vehicle for my research and general curiosity, I would be happy, delighted in fact, if it could perform just one function: say, looking up any street address in the United States through a connection to the US Postal Service’s database. That alone would delight me. Yet that is not the work of one dedicated application; it is merely one of many functions of several programs I can load on this device, and I can look up addresses not only in the United States but in any country on this planet.

With the right software downloaded onto this computer I can read any document printed or handwritten in all of human history and leave annotations and highlights without worrying about damaging the original source. Surekha Davies wrote warmly in favor of annotating in her newsletter this week, and I appreciated her take on the matter.[1] In high school, I was a bit of a prude when it came to annotating; I found the summer reading assignments in my freshman and sophomore English classes almost repulsive because I saw a book as a work of art crafted by its author, editor, and publisher in a very specific way. To annotate, I argued, was like drawing a curlicue mustache on the Mona Lisa, a crude act at best. Because of this I process knowledge from books differently. I now often take photos of individual pages and organize them into albums on my computer, which I can then consult if I’m writing about a particular book, in much the same fashion that I use when I’m in the archive or special collections room looking at a historical text.

All of these images can now not only be sorted into my computer’s photo library, stored in the cloud and accessible on my computer and phone alike, but also be merged together into one common PDF file, the main file type I use for storing primary and secondary sources for my research. With advances in artificial intelligence, I can now use the system-wide search feature on my computer to look within files for specific characters, words, or phrases with varying levels of accuracy. This is something that was barely getting off the ground when I started working on my doctorate six years ago, and today it makes my job a lot easier; the folder containing all of the peer-reviewed articles I’ve used in my research since 2019 alone holds 349 files and comes to 887.1 MB.
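
As a concrete illustration of that merging step, here is a minimal sketch in Python, assuming the Pillow imaging library; the folder and file names are hypothetical, and this is not a description of my actual tools or workflow.

# A minimal sketch, assuming the Pillow library (pip install Pillow).
# The folder and output names below are hypothetical examples.
from pathlib import Path
from PIL import Image

def merge_photos_to_pdf(folder: str, output: str) -> None:
    """Combine every JPEG in a folder, in filename order, into one multi-page PDF."""
    pages = [Image.open(p).convert("RGB") for p in sorted(Path(folder).glob("*.jpg"))]
    if pages:
        # Pillow writes a multi-page PDF when save_all and append_images are given.
        pages[0].save(output, save_all=True, append_images=pages[1:])

merge_photos_to_pdf("photos/book_pages", "book_notes.pdf")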

Our computers are merely the latest iterations of machines. The first computer, Charles Babbage’s (1791–1871) counting machine, worked in a fashion fairly similar to our own, albeit built of mechanical levers and gears where ours have intricate electronics. I, like many others, was introduced to Babbage and his difference engine by seeing the original in the Science Museum in London. This difference engine was a mechanical calculator intended to compute mathematical functions. Blaise Pascal (1623–1662) and Gottfried Wilhelm Leibniz (1646–1716) both developed similar mechanisms in the seventeenth century, and the still older Antikythera mechanism of the second century BCE could complete some of the same functions. Yet across all of these, the basic idea that a computer works in mathematical terms remains the same even today. For all the linguistic foundations of computer code, the functions of any machine boil down to the binary operations of ones and zeros. I wrote last year in this blog about my befuddlement that artificial intelligence has largely been created on verbal linguistic models and was only in 2024 being trained on mathematical ones.[2] Yet even then those mathematical models were understood by the A.I. in English, making their computations fluent in only one specific dialect of the universal language of mathematics and their functionality mostly useless for the vast majority of humanity.
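
I don’t go into how the difference engine actually computed its tables, but the underlying method of finite differences is simple enough to sketch: a polynomial can be tabulated by repeated addition alone, which is what the machine’s gears mechanized. A rough illustrative example in Python (my own sketch, not drawn from any source cited here):

# Tabulate f(x) = x^2 + x + 41 by the method of finite differences:
# each new value comes from additions alone, no multiplication needed.
def difference_table(f0, d1, d2, steps):
    values = [f0]
    f, d = f0, d1
    for _ in range(steps):
        f, d = f + d, d + d2   # add the running first difference, then update it
        values.append(f)
    return values

# For f(x) = x^2 + x + 41: f(0) = 41, first difference = 2, second difference = 2.
print(difference_table(41, 2, 2, 5))   # [41, 43, 47, 53, 61, 71]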

Yet I wonder how true that last statement really is. After all, I, a native English speaker with recent roots in Irish, learned grammar, like many generations of my ancestors, by learning to read and write in Latin. English grammar generally made no sense to me in elementary school (it is, after all, very irregular in a lot of ways), and so it was only after my introduction to a very orderly language, the one for which our Roman alphabet was first adapted, that I began to understand how English works. The ways in which we understand language in a Western European and American context rely on the classical roots of our pedagogy, influenced in their own time by medieval scholasticism, Renaissance humanism, and Enlightenment notions of the interconnectedness of the individual and society alike. I do not know how many students today in countries around the globe are learning their mathematics through English in order to compete in one of the largest linguistic job markets of our time. All of this may well be rendered moot by the latest technological leap announced by Apple several weeks ago: their new AirPods will include a live translation feature, acting as a sort of Babel Fish or universal translator, depending on which science fiction reference you prefer.

Yet those AirPods will break down eventually. They are physical objects, and nothing which exists in physical space is eternal. Shakespeare wrote it well in The Tempest that

“The solemn temples, the great globe itself,

Yea, all which it inherit, shall dissolve,

And, like this insubstantial pageant faded,

Leave not a rack behind. We are such stuff

As dreams are made on, and our little life

Is rounded with a sleep.” (4.1.170-175)

For our machines to last, they must be maintained, cleaned, and given breaks, just like the workers who operate them, lest they lose all stamina and face exhaustion most grave. Nothing lasts forever, and the more those things are allowed to rest and recuperate, the more they are able to work to their fullest. So much of our literature from the last few centuries has been about fearing the machines and the threat they pose. If we are made in the Image of God, then machines, our creation, are made in the image of us. They are the products of human invention and reflect ourselves back to us, yet without the emotion that makes us human. Can a machine ever feel emotion? Could HAL 9000 feel fear or sorrow; could Data feel joy or curiosity? And what of the living beings who in our science fiction retrofitted their bodies with machinery, in some cases to the extent that they became more machine than human? What emotion could they then feel? One of the most tragic reveals for me in Doctor Who was that the Daleks (the Doctor’s main adversaries) are living beings who felt so afraid and threatened that they decided to encase the most vital parts of their physical bodies in wheelchair tanks, shaped like pepper shakers no less, rendering them resilient adversaries for anyone who crossed them. Yet what remained of the being inside? I urge caution with suggestions of the metaverse or other technological advances that draw us further from our lived experiences and deeper into the computer. These allow us to communicate, yet real human emotion is difficult to express beyond living, breathing, face-to-face interactions.

After a while these machines which hold our attention distract us from our lives and render us blind to the world around us. I liked to bring this up when I taught Plato’s allegory of the cave to college freshmen in my Western Civilization class. I would conclude the lesson by remarking that in the twenty-first century we don’t need a cave to isolate ourselves from the real world; all we need is a smartphone and a set of headphones, and nothing else will exist. I tried to make this humorous, in an admittedly dark fashion, by reminding them to at least keep the headphones on a lighter setting so they could hear their surroundings, and to look up from their phone screens when crossing streets lest they find themselves flattened like the proverbial cartoon coyote on the front of a city bus.

If we focus too much on our machines, we lose ourselves in the mechanism; we forget to care for ourselves and attend to our needs. The human body is the blueprint for all human inventions, whether physical ones like the machine or abstract ones like society itself. As I think further about the problems our society faces, I conclude that at the core there is a deep neglect of the human at the heart of everything. I see this in the way disasters are reported in the press: often the financial toll is covered before the human cost, clearly demonstrating that the value of the dollar outweighs the value of the human. In surrendering ourselves to our own abstractions and social ideals we lose the potential to change our course, repair the machinery, or update the software to a better version with new security patches and fixes for glitches old and new. In spite of our immense societal wealth and our ever advancing scientific and technological achievement, we still haven’t gotten around to solving hunger, illiteracy, or poverty. In spite of our best intentions our worst instincts keep drawing us into wars that only a few of us want.

The Mazda Rua, my car, is getting older, and I expect that if I keep driving it for a few more years it’ll eventually need more and more replacement parts until it becomes a Ship of Theseus. Yet is not the idea of a machine the same even if its parts are replaced? That idea is the closest I can come to imagining a machine having a soul as natural things like us have. The Mazda Rua remained the Mazda Rua even after its brakes were replaced in January and its slow-leaking tire was patched in May. As it moves into its second decade, that old friend of mine continues to work in spite of the long drives and all the adventures I’ve put it through. Our machinery is in desperate need of repair, yet a few among us see greater profit in dysfunction than they figure they would get if they actually put in the effort, money, and time to fix things. If problems are left unattended for long periods of time, they will eventually lead to mechanical failure. The same is true for the machinery of the body and of the state. Sometimes a good repair is called for, reform to the mechanisms of power which will make the machine work better for its constituent parts. In this moment that need for reform is being met with the advice of a bad mechanic looking more at his bottom line than at the needs of the mechanism he’s agreed to repair. Only at this level, the consequences of mechanical failure are dire.


[1] Surekha Davies, “Walter Raleigh’s headless monsters and annotation as thinking,” in Strange and Wonderous: Notes from a Science Historian, (6 October 2025).

[2] “Asking the Computer,” Wednesday Blog 5.26.


A photograph of the Parade of African Mammals in the Grand Gallery of Evolution at the National Museum of Natural History in Paris taken by the author from the 3rd floor.

On Systems of Knowing



This week, I argue that we must have some degree of artifice to organize our thoughts and recognize the things we see in our world.


Near the end of June, on a Sunday afternoon visit to the Barnes & Noble location on the Plaza here in Kansas City while we were picking out books to gift to family, I espied a copy of Jason Roberts’s new paperback Every Living Thing: The Great and Deadly Race to Know All Life. In the Plutarchan model, it is a twenty-first-century Parallel Lives of Carl Linnaeus (1707–1778) and Georges-Louis Leclerc, Comte de Buffon (1707–1788), two of the eighteenth century’s most prolific naturalists. I saved it as fun reading for once I thought I’d done enough of my proper historical work. That moment came after I finished writing the first draft of the new introduction to my dissertation, a rather large addition to my doctoral study which is mostly historiographic in nature.[1] I’ve been reading Roberts’s book in my free time and delighting in the vibrant portraits he paints of the two men in question. I am a newer Fellow of the Linnean Society of London, elected in January 2025, and so I arrived at this particular book with a happy perspective on Linnaeus, whose Systema Naturae is cited in my dissertation as the first identification of the three-toed sloth by the genus Bradypus. At the same time, I’ve referenced Buffon’s Histoire Naturelle far more frequently in those moments when I’m following the legacy threads of my own Renaissance naturalists into the Enlightenment. After all, Buffon cited Thevet on several occasions where the savant referred to the same animals the earlier cosmographer had described two centuries before.

In spite of my own Linnean affiliation, and my use of Buffon’s corpus in the earliest stages of my broader historiography, I am still largely unfamiliar with these two men. I first knew of Buffon for his famous presumption of the diminutive nature of American animals when compared with their Afro-Eurasian counterparts, to which Thomas Jefferson retorted by sending Buffon evidence of an American moose.[2] I also know very little about Linnaeus; most of what I know of the Swede comes from lectures presented at the Linnean Society online, including a recent lecture given in May by Staffan Müller-Wille, Professor in the History and Philosophy of the Life Sciences at Cambridge, about Linnaeus’s Lapland diary from his northern expedition in 1732.[3] There is a new biography of Linnaeus by Gunnar Broberg titled The Man Who Organized Nature: The Life of Linnaeus which I have my eye on yet haven’t gotten a copy of. So, reading Roberts’s book is a quick introduction to the man who is most influential for me through his method of binomial taxonomy, which has appeared time and again here in the Wednesday Blog. Yet this system followed Linnaeus’s earlier alphabetical system for identifying plants by sexual characteristic. The basic premise is that if there are 26 letters in the alphabet, we can use that familiar framework to organize other complicated concepts for easy recognition. Linnaeus used this to categorize plants by their male and female sexual characteristics in his 1730 booklet Praeludia Sponsaliorum Plantarum, or Prelude to the Betrothal of Plants.[4] Therefore, Linnaeus could go around the botanical garden at the University of Uppsala in 1730 and quickly identify a plant as a J plant or a G plant. First reading this, I thought of the way letters are used by the Federal Reserve System to identify specific regional branches: J represents the Federal Reserve Bank of Kansas City and G the Federal Reserve Bank of Chicago.

I like the idea behind Linnaeus’s alphabetic system, yet having only 26 categories to describe the entire plant kingdom seems doomed to be flawed, as it relies on a belief that all the plants known to exist are the only ones that exist, that there is nothing new under the Sun to be discovered. Roberts frames this in a biblical context, describing how Olof Celsius (1670–1756), one of Linnaeus’s first professors, was working on a project called the Hierobotanicum, or Priestly Plants, intended to be a compendium of all 126 plants mentioned in the Old and New Testaments, when he met the young Linnaeus.[5] Why would Linnaeus need more than 26 categories to contain all the plants known to the Ancients and to the Bible? Naturally, the flaws were apparent from the start in using a system of knowing which originated in the more arid landscape of the Levant rather than in the cooler and damper climate of Sweden. I’ve noticed this in my own life: many cultural elements we practice in the United States, notably the seasons, better fit the natural climate of New England and England proper than they do here in the Midwest, with its far more variable conditions depending on the time of year, or even the given hour. Roberts deconstructs Linnaeus’s early efforts near the end of Part I of his book when he describes Linnaeus’s first scholarly collision with Buffon after the Frenchman’s appointment by Louis XV to the position of Intendant of the Jardin des Plantes in Paris.[6] In a debate which Roberts calls “the Quarrel of the Universals,” Linnaeus argued that species could be recognized from individual type specimens while Buffon countered that this ran the great risk of minimizing the diversity of life and eliminating potential variations in nature.

This got me thinking about systems of knowing, and thus I decided to render the title of the original file for this blog post that you’re now reading (or listening to) as De Systemarum Scientis, in the full Latinate tradition of my own scholarship, or “On Systems of Knowing” in English. Why is it, for instance, that our Roman alphabet begins with A and ends with Z? The first half of that question is easier to answer: the Romans adapted our alphabet from the Greeks, who started it off with α alpha and β beta, thus the noun alphabet itself. Yet the Greek alphabet ends with ω omega rather than ζ zeta, so why does ours end with Z? What I’ve heard is that the Greek letters adopted into the Roman alphabet were tacked onto the end of the line, or at least this is what I remember being taught when I learned to recite the alphabet in French in my undergraduate years. French calls the letter Y y-grec, or the Greek i. Likewise, everyone except us Americans calls the final letter of the Roman alphabet some variation of zed, which is a shortening of the Greek zeta. This better reflects that letter’s original sound in Greek, just as the cursive lowercase z is essentially the lowercase Greek ζ adopted straight into the Roman alphabet without any major changes.

So, when it comes to the organization of our knowledge, there are things that we know in this same alphabetical order or in relation to it. Because the Roman alphabet is written left to right, we know that when it’s used to set up a coordinate system on a printed map, A will always appear at the top left, orienting the way the map should be held. Likewise, a reader can quickly scan through an index in any language written in the Roman alphabet by following along with the order of the letters. How individual languages index objects from that point on differs, but the foundational element remains the same. The Roman alphabet works best for Latin, the language for which it was originally developed, so its phonetic values tend to be adapted depending on which language is using it. This is why English uses the letter W to represent a [w] sound while German, and French in loanwords, uses W to represent a [v] sound. Meanwhile, Irish represents the [w] and [v] sounds with two digraphs, bh and mh, each of which can stand for either sound; typically bh represents [v] while mh represents [w], but it depends on context. The reasoning behind this is that when the Roman alphabet was adapted by Latin speakers to fit Old Irish in the fifth and sixth centuries CE, they approximated the phonology of their Latin in rendering the Roman alphabet usable for Irish. So, to these monks the Irish [v] sound in a Gaelic name like Medbh sounded enough like how the letter b was pronounced at the time that they used that letter to approximate it. It’s notable to me that in Modern Greek the letter β is today pronounced vita, and in the Cyrillic alphabet the letter В represents this same [v] sound while the letter Б represents the [b] sound that we English speakers associate with the letter B. Cyrillic and its predecessor, the Glagolitic alphabet, were developed a few centuries after the Roman alphabet began to be used for Irish, and in both cases there must have been something going on with the pronunciation of people’s Bs drifting closer to Vs in late antiquity. Thus, the ways in which our alphabets represent specific sounds today reflect the prestige dialects of our two classical languages, Latin and Greek, as they were spoken over a millennium ago.

Consider then how we distinguish technical, scientific, or artistic terminology depending on the prestige language of that field. History has largely become a vernacular field, where we adapt terms that will be familiar enough to the non-professional to initiate them into what Ada Palmer calls the History Lab. Yet often these terms will have etymologies beyond English itself. Consider the word photograph, or its more common shortened form photo. This word comes purely from Greek, the classical language more associated with science and technology. It joins φωτο-, the combining form of φῶς (phôs), or light, with the suffix -γρᾰ́φος, from the verb γρᾰ́φω, meaning to draw, sketch, or write. So, photography at its core is light writing. Neat! The word photography entered English from the French photographie, an etymology that reflects the French origins of the art and craft of photography itself in the middle of the 1820s. Yet the linguists who modernized Irish a century ago decided to favor indigenous terminologies, rendering this word grianghraf, using the Irish word grian for Sun instead of a variation of φωτο- (light) while adopting the Greek -γρᾰ́φος suffix to center this new Irish conception of the term within the same technological corpus as the English photograph. While it was consequential to have a particular Irish name for this technology, one that elevated Irish photography as equal to any other culture’s and made it particular within the Irish language, the word still remains rooted in the same Western tradition of grounding our names for scientific and technical things in Greek.

Language directly influences how we know things because it is the vehicle by which we recognize the things around us. I know that a photograph is something made by “light writing,” and therefore I will also recognize that anything else beginning with “photo” refers to “light” and that anything ending with “graph” refers to some form of record or writing. I come from a culture where light is connected with goodness and dark with ill. Likewise, I think of blue and green as happier colors than red or orange, which strike me as angrier. There is safety in light; in the daytime we can see people or things coming toward us more easily than in the dark of night. At the Easter Vigil the celebrant lights the Paschal Flame, which is then passed around the church so that we all share in the Light of Christ (Lux Christi) returned to the world with the Resurrection. The central question in my dissertation is linguistic: what did André Thevet (1516–1590) mean when he referred to the Americas overall as sauvage? This French word translates into English as both savage and wild, yet I chose to retain the original French to better represent the original concept, which encompasses both English words. The word was not necessarily racial in the modern sense; rather, Thevet used sauvage to describe people, places, and things which existed beyond civilization. That word, civilization, itself betrays its original meaning, that is, city life. Thevet himself understood the sauvage to be the antonym of this city life. I describe it in the introduction to my dissertation in terms of light and dark, following the cultural connotations already illuminated: the city is the sun whence radiates the light of civilization. The further one goes from that sun, the darker things become and the less civilized they remain. Thevet’s sauvage existed at that furthest extreme in the dark. I imagine the character of Gollum in this sort of darkened existence, deep beneath the Misty Mountains, uninterested in light save for the Ring of Power which consumed his days, rendering them eternal night. In the literature of Thevet’s era a fine sauvage characterization is Caliban in Shakespeare’s Tempest, wild as the waters which wrecked King Alonso and his men on the island in Act 1 of that play.

Roberts notes how these linguistic attributes influenced Linnaeus’s systemization of humanity in the 1735 edition of his Systema Naturae. The Swede divided humanity into four subcategories described by color over any other facet.[7] Roberts spends the following five pages questioning Linnaeus’s methodology, asking “why four?” and why these specific colors. There is some historical context for Linnaeus’s choice to describe Black Africans by color; even Thevet referred to the varied peoples of Africa as “black” in his Singularitez de la France Antarctique. Thevet hints at a possible environmental cause for blackness, writing that the peoples “of Barbary” who are “the blackest” are “of the same manners and conditions as their region is hotter than others.”[8] Thevet’s understanding of African geography is somewhat uncertain, so his definition of Barbary may not align with the Berbers from whom the Barbary Coast of the Maghreb took its name. Still, it hints at an understanding that the hotter, or more torrid, the climate became, the darker the skin of the people would become. Roberts notes that the Portuguese were the first to use the “word negro to signify African origin or descent” in the middle of the sixteenth century.[9] This makes sense considering the Portuguese were the first European power to sail down the West African coast in the fifteenth century. That Roberts places this Portuguese definition of blackness in the middle of the sixteenth century likely refers to Damião de Góis’s (1502–1574) Chronica do Felicissimo Rei Dom Emanuel of 1566 to 1567, an early source that I’ve consulted for information on the voyages of Vasco da Gama (d. 1524).[10] Geraldine Heng, the leading authority on medieval notions of race, wrote in her 2018 book The Invention of Race in the European Middle Ages that blackness was already well established as an element in religious and secular iconography by the beginning of the First Age of Exploration.[11] Roberts concludes his discussion of this particular racial element of Linnaeus’s great contribution to taxonomy by sullenly noting that it is thanks to Linnaeus that this cultural connotation of blackness with darkness was given scientific credence, which continues to support racist ideologies to this day.[12]

How do we use our own words to describe things to which they are not suited, in turn transforming the nature of those things so that they may become part of our own world? My research is most interested in understanding how those things at the boundaries of knowledge were understood by André Thevet using the tools afforded to him during the French Renaissance of the sixteenth century. Thevet used the word sauvage to do this and to create a category of life against which he could measure and proclaim the existence of something civilized closer to home. Michael Wintroub, Professor Emeritus of Rhetoric at UC Berkeley, wrote in his 2006 book A Savage Mirror that Thevet’s countrymen sought to “civilize the barbarians” to make up for an insecurity they felt at being called barbarians themselves by Italian intellectuals at the turn of the sixteenth century during the French invasion of Italy under King Charles VIII (r. 1483–1498).[13] As long as there was someone else whom the French could look down upon beyond their own cities, they felt secure in their own civility. Yet the sauvage exists within a larger framework of singularities, a word which is central to Thevet’s cosmography. Thevet used the word singularity to describe those things which were exotic, wondrous, and immensely collectable in his eye and, he hoped, in the eyes of potential readers who would buy his books. I see various layers and categories of singularities in Thevet’s cosmography; for instance, he only included images of certain animals in his book of that name, the aforementioned Singularitez of 1557. The sloth and toucan were depicted as well as described, yet the mysterious Ascension Island aponar remained a bird worthy only of a textual description. This suggests that some things were more singular than others, or more worthy of attention and of the money needed to produce these woodcut images. These systems of knowing framed around the singularity are the subject about which I intend to write my first academic monograph. Classifying something as singular gives it an appeal which sets it apart from both the civil and the sauvage, as belonging to a higher category which can include both the urbane and the agrestic.

Jason Roberts describes Buffon’s and Linnaeus’s mutual missions to make something of themselves and to rise above their provincial origins to the heights of society. I laughed out loud reading Roberts’s introduction to Linnaeus’s character, what felt like an iconoclasm of sorts for this Fellow of the Linnean Society. “Carl Linnaeus was a Swedish doctor with a diploma-mill medical degree and a flair for self-promotion, who trumpeted that ‘nobody has been a greater botanist or zoologist’ while anonymously publishing rave reviews of his own work.”[14] Buffon by contrast took advantage of a golden opportunity to build his own demi-paradise at his manor in the Burgundy countryside until his good reputation as a botanist brought him to royal attention and the appointment as Intendant of the Jardin du Roi.[15] The Jardin des Plantes, as Buffon’s charge is today known, is perhaps a better place to conclude than most. Situated in the Fifth Arrondissement across Boulevard de l’Hôpital and Rue Buffon from Gare d’Austerlitz, the Jardin is an urban oasis created for the purpose of crafting systems of knowing. Its original intent was to serve as a medicinal garden existing beyond the purview of the Sorbonne, Paris’s sole licensed teaching medical school in the seventeenth century.[16] I’ve spent several happy hours wandering through the Jardin, home to the Muséum National d’Histoire Naturelle’s Grande Galerie de l’Évolution, the Galerie de Paléontologie et d’Anatomie comparée, and the Ménagerie du Jardin des Plantes, which was home to Paris’s first resident giraffe, whose story is delightfully told by Michael Allin in his 1998 book Zarafa: A Giraffe’s True Story, from Deep in Africa to the Heart of Paris.[17] While Allin’s heroine Zarafa is not today on display in the Grande Galerie de l’Évolution (she is instead to be found in the Muséum d’Histoire naturelle de La Rochelle), the taxidermy in the Parade of African Mammals that is the centerpiece of the Grande Galerie represents a system of knowing animal life in itself.

An elephant leads the parade, followed by hippopotami, zebras, and giraffes, with two such camelopards standing erect, their long necks rising toward the upper galleries at the center of the procession. Behind them come the horned mammals, rhinoceroses, and at the rear a crouching lion watching its prey. This is a system that Buffon would have appreciated more than Linnaeus, one which represents the nature of individual beings more than species. Each stuffed specimen seems to have its own character, its own personality. They look much as one would expect they did in life. The great artifice of this is the idea of a parade itself, a very human notion indeed, and one that is infrequent enough to be nearly singular in character, a reason for a day out, worth putting in the social calendar of a city, town, or village no matter how large or small. A parade is its own system of knowing.


[1] For my recent essays referring to this current historiographic project see “On Sources,” Wednesday Blog 6.22, “On Writing,” Ibid., 6.27, and “On Knowledge,” Ibid., 6.29.

[2] Lee Alan Dugatkin, Mr. Jefferson and the Giant Moose, (University of Chicago Press, 2009).

[3] Staffan Müller-Wille, “Linnean Lens | Linnaeus’ Lapland Journey Diary (1732),” moderated by Isabelle Charmantier, virtual lecture, 12 May 2025, by the Linnean Society of London, YouTube, 1:04:18, link here.

[4] Jason Roberts, Every Living Thing: The Great and Deadly Race to Know All Life, (Random House, 2024), 45–49.

[5] Roberts, 20.

[6] Roberts, 115–125.

[7] Roberts, 109.

[8] André Thevet, Les Singularitez de la France Antarctique, (Antwerp, 1558), 16r–16v. The translation is my own.

[9] Roberts, 109.

[10] Damião de Góis, Chronica do Felicissimo Rei Dom Emanuel, 4 vols., (Lisbon, 1566–1567).

[11] Geraldine Heng, The Invention of Race in the European Middle Ages, (Cambridge University Press, 2018), 190.

[12] Roberts, 110.

[13] Michael Wintroub, A Savage Mirror: Power, Identity, and Knowledge in Early Modern France, (Stanford University Press, 2006), 42.

[14] Roberts, xii.

[15] Roberts, 107.

[16] Roberts, 96–98.

[17] Michael Allin, Zarafa: A Giraffe’s True Story, from Deep in Africa to the Heart of Paris, (Delta, 1998).


A picture of the great clock at Kansas City Union Station at night.

The Poetics of Finality




This week, some words about endings.


On the morning of Flag Day, I went to the Linda Hall Library with my parents to see the classic 1951 science fiction film The Day the Earth Stood Still. I knew about this film, but this was my first time seeing it. Besides the story, what struck me most about this film was its tone, pacing, and overall character. After I finished my other two events of the day, the Plaza No Kings Rally, where I watched a crowd of 11,000 people rally for democracy, and Mass that afternoon, I returned home tired yet eager to find that same tone. I went looking for it in Rod Serling’s classic series The Twilight Zone. Released between 1959 and 1964 in its first incarnation, this series had scared me a bit the previous times I’d sat down to watch an episode or two. It has an air of fear to it that is reminiscent of the reasons why I generally stay away from horror films. And yet on closer inspection, Serling’s stories tell of something far less frightening than I first imagined, because it’s a theme with which I’m all too familiar.

I came to know more about Mr. Serling indirectly when I moved to his hometown, Binghamton, New York, to undertake my doctoral studies in August 2019. His image isn’t all over town, but it’s a visible reminder of Binghamton’s history and place in the fabric of American culture. In fact, many of the stories that I’ve now watched in The Twilight Zone fit the character of that interior part of the Northeast, where I lived from August 2019 to December 2022, quite well. In some ways, not too much of the built environment has changed from Serling’s day 60 years ago. Still, I noticed time and again how the optimism of that postwar era had faded. The same town was there, but some of the energy it once knew was long gone. Having lived my life to date in Chicago, Kansas City, and London, all cities with layers of history and memory, I’ve seen how the current generations have chosen to craft their own layer.

London is a city that holds mementos of its ancient and medieval past while being largely built in the form of its eighteenth-, nineteenth-, and twentieth-century growth at the height of the British Empire. Yet today there are enough futuristic buildings in the capital that it stood in for the space-age galactic capital of Coruscant in the latest Star Wars series, Andor. I delighted in seeing familiar places from the Barbican Estate and Canary Wharf in the show.

Chicago has some of the same American character as Binghamton and the Northern states as a whole, a common history. Yet Chicago is the powerhouse of this country, the beating heart of our transportation network, the real crossroads of this nation. Where other industrial cities on the Great Lakes faltered in the 1970s, 1980s, and 1990s, Chicago instead continued to power on thanks to its sheer size and the diversity of its industry. Today, it has a very particular character which I believe makes it the most American city this country has to offer for its marriage of American settler culture and all the different indigenous, migrant, and immigrant communities that make America the patchwork of peoples in one great republic that it is.

Kansas City, meanwhile, saw more of the downturn owing to its smaller size, and some of its traditional industries haven’t translated as well into the current information revolution. Kansas City once thrived as another great railway hub: the Gateway to the Southwest, the last major Midwestern metropolis along the Santa Fe Railroad as it drove across the prairies toward New Mexico, Arizona, and Southern California. Today, our interstate highways direct traffic through Kansas City more from Texas, Colorado, the Dakotas, Iowa, Minnesota, and points east than in the old northeast-to-southwest alignment of the rails. Recently, while I was in downtown Kansas City, I remarked on how underwhelmed I felt visiting there for the first time after the busyness and thrill of going with my parents down to the Loop on weekends when we still lived in Chicago. Kansas City, however, has seen a renaissance of its own in the last twenty-five years that has filled in many of the gaps left by urban renewal and restored this city’s vitality. That more than anything else made my move to Binghamton a tremendous culture shock: going from a growing city to one that was a shadow of its former self, struggling to invest in its future.

In every Twilight Zone episode there seems to be a fearsome unknown menace looming over the story, something whose effects the characters can perceive yet which they can’t quite see. Yet if there is any common thread to this menace, it’s that it is a fear of the unknown. In “The Time Element,” the original pilot that launched The Twilight Zone, the rational psychoanalyst who serves as foil to the main character, a man trapped in his dreams, concludes through logic that the patient’s dreams of going back in time from 1958 to Pearl Harbor on December 6, 1941 cannot be real, because any incident that happened in that dreamed 1941, if real, would have affected the patient’s life in 1958. Yet reason proves unequipped to address the irrational: how can it explain what it intrinsically is not? I’ve argued time and again here in the Wednesday Blog that this is where there exists room for belief in a life lived rationally. Still, having watched a fair number of Mr. Serling’s stories now, I think I can say something about this menace’s true character.

There is an intrinsic fear that comes with the knowledge that we do have an ending. On a biological level, our bodies can only continue working for so long. We drift apart from our lives as they were in one moment or another, apart from friends whom we admired and loved in a given moment, apart from jobs that consumed our waking and sleeping thoughts, apart from situations which challenged us to become better versions of ourselves. Yet all those lived moments will continue on in our memory, at least for a time. I was stunned to find how well I could remember very particular moments in minute detail earlier this year when prompted by a sudden and wonderful realization about how I want to live in my life to come. Even the smallest of details that my senses perceived were there, locked away. The antidote to any fear is joy, and for me it was the most radiant joy I’ve felt in years which unlocked those memories of the moments which led to that jubilation. Still, fear in moderation is a good counsel, a wise friend. It’s what makes me watch for traffic when I’m crossing the street here in Kansas City, or what advises me to make certain decisions over others at a very fundamental level to keep me alive. This is one interpretation of what the infamous tree in Genesis portended: that once humanity ate its fruit we would never again be innocent of seeing flaws in the beauty of nature and in the beauty of ourselves.

Over the weekend, then, I went to see the new Stephen King film The Life of Chuck, starring Tom Hiddleston as Charles Krantz. I particularly grew to like young Chuck’s grandfather, played by Mark Hamill. If I were to compare Stephen King’s writing to any other American storyteller of the last century, it would be Rod Serling. Both tell stories of this same menacing fear. Yet in King’s Life of Chuck, the monster revealed in the last scene is so familiar, ordinary, and known to us all that I saw it less as a menace and more as a companion. There is intense poetry around endings in both Serling’s Twilight Zone and King’s Life of Chuck. They tell us that the finality of moments in our lives, and of our lives altogether, gives them greater meaning and purpose. I’ve found in the various projects and events I’ve helped organize that we get more done when we have goals we’re trying to achieve and a timeline by which we want to achieve them. I often work better when I have deadlines because if I begin to feel impatient at how long something might take, I know there’s an end date to look forward to. I feel that about little things but not the big ones, not the experiences that’ll one day make for good stories, or about my life itself.

I for one don’t want to live forever; I worry that would take some of the meaning out of my life. I would like to be remembered for my writing, for being a good person, for the history I research and leave for generations of graduate students to muddle through in their coursework. On a recent digital security Zoom call that I attended, we were asked to search our names on several search engines and see what came up. Should there be anything we didn’t want searchable, we could then get it removed. I was delighted to see that after my website, social media profiles, and various conference programs came page after page of essays published here on The Wednesday Blog. I suppose that’s one benefit of writing this weekly for the last four years: my thoughts written here will be remembered at least by the search engines. Yet I think the Wednesday Blog will have more meaning when I decide to set it aside and turn my staff to other facets of “my so potent art,” to borrow from Prospero. Because then anyone curious enough to glance through these pages will be able to see them in their totality and know that these essays are artifacts of the time when they were written, in the early 2020s, during a period of doctoral study that feels so very close to ending.

This is not the last time you’ll hear from me on the Wednesday Blog; rather, I’ve decided to end my weekly publication of this blog at the end of the current season. This is Season 5 of the podcast, or Book 6 of the blog itself. I feel that it’s had a wonderful run, and it’s been a great outlet for me while I’m biding my time as my career slowly begins. Yet now I’ve got so much more writing to do, from new research papers to submit for peer review to book reviews, that it’ll be nice to take this off my docket. This is the 25th issue of this season, and I have a further 15 issues planned before the end. Thank you to all my readers over the last four years and all my listeners over the last three. I hope this will be an ending worthy of your curiosity.