
[Image: The author on a blue background wearing Apple AirPods.]

On Machinery



This week, for the penultimate post of the Wednesday Blog, how machinery needs constant maintenance to keep functioning.


I am just old enough to remember life before the ubiquity of computers. I had access to our family computer for as long as I can remember, and to my grandparents’ computer at their condo when we stayed with them in the Northwest Suburbs of Chicago. Yet even then my computer use was often limited to idle fascination. I did most of my schoolwork by hand through eighth grade, only switching from writing to typing most of my work when I started high school and was issued a MacBook by my school. I do think that a certain degree of whimsy and humanity has faded from daily life as we’ve so fully adopted our ever newly invented technologies. These machines can do things that in my early childhood would’ve seemed wondrous. Recently, it occurred to me that if I didn’t know how powerful and far-reaching my computer is as a vehicle for my research and general curiosity, I would be happy, delighted in fact, if it could perform a single function, say, looking up any street address in the United States as a device connected to the US Postal Service’s database. That alone would delight me. Yet that is not the work of one dedicated application on my computer but merely one of many functions of several programs I can load onto this device, and I can look up addresses not just in the United States but in any country on this planet.

With the right software downloaded onto this computer I can read any document printed or handwritten in all of human history and leave annotations and highlights without worrying about damaging the original source. Surekha Davies wrote warmly in favor of annotating in her newsletter this week, and I appreciated her take on the matter.[1] In high school, I was a bit of a prude when it came to annotating; I found the summer reading assignments in my freshman and sophomore English classes almost repulsive because I see a book as a work of art crafted by its author, editor, and publisher in a very specific way. To annotate, I argued, was like drawing a curlicue mustache on the Mona Lisa, a crude act at best. Because of this I process knowledge from books differently. I now often take photos of individual pages and organize them into albums on my computer, which I can then consult if I’m writing about a particular book, in much the same fashion that I work when I’m in the archive or special collections room looking at a historical text.

All of these images can not only be sorted into my computer’s photo library, stored in the cloud and accessible on my computer and phone alike, but also be merged into one common PDF file, the main file type I use for storing primary and secondary sources for my research. With advances in artificial intelligence, I can now use the top-level search feature on my computer to look within files for specific characters, words, or phrases with varying levels of accuracy. This is something that was barely getting off the ground when I started working on my doctorate six years ago, and today it makes my job a lot easier; the file folder containing all of the peer-reviewed articles I’ve used in my research since 2019 alone holds 349 files and is 887.1 MB in size.

Our computers are merely the latest iterations of machines. The first computer, Charles Babbage’s (1791–1871) calculating machine, worked in a fashion fairly similar to our own, albeit built of mechanical levers and gears where ours have intricate electronics. I, like many others, was introduced to Babbage and his difference engine by seeing the original in the Science Museum in London. This difference engine was a mechanical calculator intended to compute mathematical functions. Blaise Pascal (1623–1662) and Gottfried Wilhelm Leibniz (1646–1716) both developed similar mechanisms in the seventeenth century, and the still older ancient Greek Antikythera mechanism of the second century BCE could complete some of the same functions. Yet across all of these, the basic idea that a computer works in mathematical terms remains the same even today. For all the linguistic foundations of computer code, the functions of any machine boil down to the binary operations of ones and zeros. I wrote last year in this blog about my befuddlement that artificial intelligence has largely been created on verbal linguistic models and was only in 2024 being trained on mathematical ones.[2] Yet even then those mathematical models were understood by the A.I. in English, making their computations fluent in only one specific dialect of the universal language of mathematics and their functionality mostly useless for the vast majority of humanity.

Yet I wonder how true that last statement really is. After all, I, a native English speaker with recent roots in Irish, learned grammar, like many generations of my ancestors, through learning to read and write in Latin. English grammar generally made no sense to me in elementary school; it is, after all, very irregular in a lot of ways, and so it was only after my introduction to a very orderly language, the one for which our Roman alphabet was first adapted, that I began to understand how English works. The ways in which we understand language in a Western European and American context rely on the classical roots of our pedagogy, influenced in their own time by medieval scholasticism, Renaissance humanism, and Enlightenment notions of the interconnectedness of the individual and society alike. I do not know how many students today in countries around the globe are learning their mathematics through English in order to compete in one of the largest linguistic job markets of our time. All of this may well be rendered moot by the latest technological leap, announced by Apple several weeks ago: their new AirPods will include a live translation feature, acting as a sort of Babel fish or universal translator, depending on which science fiction reference you prefer.

Yet those AirPods will break down eventually. They are physical objects, and nothing which exists in physical space is eternal. Shakespeare put it well in The Tempest:

“The solemn temples, the great globe itself,

Yea, all which it inherit, shall dissolve,

And, like this insubstantial pageant faded,

Leave not a rack behind. We are such stuff

As dreams are made on, and our little life

Is rounded with a sleep.” (4.1.170–175)

For our machines to last, they must be maintained, cleaned, given breaks just like the workers who operate them, lest they lose all stamina and face exhaustion most grave. Nothing lasts forever, and the more those things are allowed to rest and recuperate, the more they are able to work to their fullest. So much of our literature from the last few centuries has been about fearing the machines and the threat they pose. If we are made in the Image of God, then machines, our creation, are made in the image of us. They are the products of human invention and reflect ourselves back to us, yet without the emotion that makes us human. Can a machine ever feel emotion? Could HAL 9000 feel fear or sorrow? Could Data feel joy or curiosity? And what of the living beings who in our science fiction retrofitted their bodies with machinery, in some cases to the extent that they became more machine than human? What emotion could they then feel? One of the most tragic reveals for me in Doctor Who was that the Daleks (the Doctor’s main adversaries) are living beings who felt so afraid and threatened that they decided to encase the most vital parts of their physical bodies in wheelchair tanks, shaped like pepper shakers no less, rendering them resilient adversaries for anyone who crossed them. Yet what remained of the being inside? I urge caution with suggestions of the metaverse or other technological advances that draw us further from our lived experiences and deeper into the computer. These allow us to communicate, yet real human emotion is difficult to express beyond living, breathing, face-to-face interactions.

After a while these machines which have our attention distract us from our lives and render us blind to the world around us. I liked to bring this up when I taught Plato’s allegory of the cave to college freshmen in my Western Civilization class. I would conclude the lesson by remarking that in the twenty-first century we don’t need a cave to isolate ourselves from the real world; all we need is a smartphone and a set of headphones, and nothing else will exist. I tried to make this humorous, in an admittedly dark fashion, by reminding them to at least keep their headphones on a lighter mode so they could hear their surroundings, and to look up from their phone screens when crossing streets lest they find themselves flattened like the proverbial cartoon coyote on the front of a city bus.

If we focus too much on our machines, we lose ourselves in the mechanism; we forget to care for ourselves and attend to our needs. The human body is the blueprint for all human inventions, whether physical ones like the machine or abstract ones like society itself. As I think further about the problems our society faces, I conclude that at the core there is a deep neglect of the human at the heart of everything. I see this in the way disasters are reported in the press: often the financial toll is covered before the human cost, clearly demonstrating that the value of the dollar outweighs the value of the human. In surrendering ourselves to our own abstractions and social ideals we lose the potential to change our course, repair the machinery, or update the software to a better version with new security patches and fixes for glitches old and new. In spite of our immense societal wealth, ever advancing science, and technological achievement, we still haven’t gotten around to solving hunger, illiteracy, or poverty. In spite of our best intentions, our worst instincts keep drawing us into wars that only a few of us want.

The Mazda Rua, my car, is getting older, and I expect that if I keep driving it for a few years or more it’ll eventually need more and more replacement parts until it becomes a Ship of Theseus. Yet is not the idea of a machine the same even if its parts are replaced? That idea is the closest I can come to imagining a machine having a soul as natural things like us have. The Mazda Rua remained the Mazda Rua even after its brakes were replaced in January and its slow-leaking tire was patched in May. As it moves into its second decade, that old friend of mine continues to work in spite of the long drives and all the adventures I’ve put it through.

Our machinery is in desperate need of repair, yet a few among us see greater profit in dysfunction than they figure they would get if they actually put in the effort, money, and time to fix things. If problems are left unattended for long periods of time, they will eventually lead to mechanical failure. The same is true for the machinery of the body and of the state. Sometimes a good repair is called for: reform to the mechanisms of power which will make the machine work better for its constituent parts. In this moment that need for reform is being met with the advice of a bad mechanic looking more at his bottom line than at the needs of the mechanism he’s agreed to repair. Only on this level, the consequences of mechanical failure are dire.


[1] Surekha Davies, “Walter Raleigh’s headless monsters and annotation as thinking,” in Strange and Wonderous: Notes from a Science Historian (6 October 2025).

[2] “Asking the Computer,” The Wednesday Blog 5.26.


[Image: The author pulling a face at the camera.]

On Writing



This week, some words about the art, and the craft, of writing.


In the last week I’ve been hard at work on what I hope is the last great effort toward completing my dissertation and earning my doctorate. Yet unlike so much of that work, which currently stands at 102,803 words across 295 U.S. Letter-sized pages, inclusive of footnotes, front matter, and the rolling credits of my bibliography, I now sit at my desk day in and day out not writing but reading, intently and thoroughly, books that I’ve read before yet now need to refresh myself on, as their arguments pertain to the subject of my dissertation: that André Thevet’s use of the French word sauvage, which can be translated into English as either savage or wild, is characteristic of the manner in which the French understood Brazil, the site of France’s first American colony, and the Americas overall within the broader context of French conceptions of civility in the middle decades of the sixteenth century. I know, it’s a long sentence. Those of you listening may want to rewind a few seconds to hear that again. Those of you reading can do what my eyes do so often, darting back and forth between lines.

As I’ve undertaken this last great measure, I’ve dedicated myself almost entirely to completing it, clearing my calendar as much as I see reasonable to finish this job and move on with my life to what I am sure will be better days ahead. Still, I remain committed to exercising, usually 5 km walks around the neighborhood for an hour each morning, and to the occasional break for my mind to think about the things I’ve read while I distract myself with something else. That distraction has, ever since I started high school and had a laptop of my own, most often been found on YouTube. This week, I was planning on writing a blog post comparing the way my generation embraced the innovation of school-issued laptops in the classroom with the way that, starting next month, schools and universities across this country will be introducing artificial intelligence tools to classrooms. I see the benefits, and I see tremendous risks as well, yet I will save that for a lofty second half of this particular essay.

I’ve fairly well trained the YouTube algorithm to show me the sorts of videos I tend to enjoy most. Opening it now I see a segment from this past weekend’s broadcast of CBS Sunday Morning, several tracks from classical music albums, a clip from the Marx Brothers’ film A Night at the Opera, the source of my favorite Halloween joke, and a variety of comic videos, from Conan O’Brien Needs a Friend to old Whose Line Is It Anyway? clips. Further down are the documentary videos I enjoy from history, language, urbanist, and transportation YouTubers. Yet in the last week or so I’ve been seeing more short videos of a minute or less with clips from Steven Spielberg’s 2012 film Lincoln. I loved this film when I saw it that Thanksgiving at my local cinema. As longtime readers of the Wednesday Blog know, I like to call Mr. Lincoln my patron saint within the American civic religion. To a young boy in Illinois in the ’90s, he was the hero from our state who had saved the Union and led the fight to abolish slavery during the Civil War 130 years before. Now, 30 years later and 160 years out from that most horrific of American wars, I decided to watch that film again for the first time in a decade. In fact, I’m writing this just after watching it so that some of the inspiration from Mr. Lincoln’s lofty words, performed by the great Daniel Day-Lewis, might rub off on my writing just enough to make something inspirational this week before I return in the morning to my historiography reading.

Mr. Lincoln knew what every writer has ever known: that putting words to paper preserves them longer than the utterance of even the longest string of syllables can last. What I mean to say is, they’ll remember what you had to say longer if you write it down. He knew for a fact that the oft-quoted and oft-mocked maxim that the pen is mightier than the sword is the truth. After all, a sword can take a life, as so many have done down our history and into our deepest past to the proverbial Cain, yet pens give life to ideas that outlive any flesh and bone. I believe writing is the greatest human invention because it is the key to immortality. Through our writing, generations from now people will seek to learn more about us in our moment in the long human story. I admit a certain boldness in my thinking about this; after all, I’ve seen how the readership and listener numbers for the Wednesday Blog ebb and flow, and I know full well that there’s a good chance no one will read this in the week I publish it. Yet I hold out hope that someday there’ll be some graduate student looking for something to build a career on who might just stumble across my name in a seminar on a sunny afternoon and think “that sounds curious,” only to then find some old book of my essays called The Wednesday Blog, and then that student will be reading these words.

I write because I want to be heard, yet I’ve lived long enough to know that it takes time for people to be willing to listen; that’s fair. I’ve got a stack of newspaper articles on the affairs of our time growing while my attention is drawn solely to my dissertation. I want, for instance, to read the work of New York Times reporters Patrick Kingsley, Ronen Bergman, and Natan Odenheimer in a lengthy and thorough piece on how Israeli Prime Minister Netanyahu “prolonged the War in Gaza to stay in power,” which was published last Friday.[1] I also want to read John McWhorter’s latest opinion column “It’s Time to Let Go of ‘African American’”; I’m always curious to read about suggestions in the realm of language.[2] Likewise there are sure to be fascinating and thoughtful arguments in the June 2025 issue of Commonweal Magazine, like the article titled “‘The Living Vein of Compassion’: Immigration & the Catholic Church at this moment” by Bishop Mark Seitz, D.D., of the Diocese of El Paso.[3] I’m always curious to read what others are writing because often I’ll get ideas from what I read. There was a good while there at the start of this year when I was combing through the pages of Commonweal looking for short takes and articles I could respond to with my own expertise here in the Wednesday Blog. By writing we build a conversation that spans geography and time alike. That’s the whole purpose of historiography: it’s more than just a literature review, though that’s often how I describe what I’m doing now to family and friends outside of my profession who may not be familiar with the word historiography, or staireagrafaíocht as it is in Irish.

Historiography is writing about the history that’s already been written. It’s a required core introductory class for every graduate history program that I’m familiar with; I took that class four times, between my undergraduate senior seminar (the Great Historians), our introductory master’s seminar at UMKC (How to History I), and two courses at Binghamton titled Historiography and On History. The former at Binghamton was essentially the same as UMKC’s How to History I, while the latter was taught by my first doctoral advisor and friend, Dr. Richard Mackenney. He challenged us to read the older histories going back to Herodotus and consider what historians in the Middle Ages, Renaissance, Enlightenment, and nineteenth century had to say about our profession. Looking at it now, the final paper I wrote for On History was titled “Perspectives from Spain and Italy on the Discovery of the New World, 1492–1550.” I barely remember writing it because it was penned in March and April 2020 as our world collapsed under the awesome weight of the coronavirus pandemic. Looking through it, I see how the early stages of the pandemic limited what I could access for source material. For instance, rather than rely on an interlibrary loan copy of an English translation, perhaps even a more recent edition, of Edmundo O’Gorman’s The Invention of America, I was left working with the Spanish original that had been digitized at some point in the last couple of decades. Likewise, I relied on books I had on hand in my Binghamton apartment, notably the three volumes of Fernand Braudel’s Civilization and Capitalism, in this case in their 1984 English translations. I wrote this paper and then forgot about it amid all the other things on my mind that spring, only to read it again now. So, yes, I can say to the scared and lonely 27-year-old who wrote it five years ago that someone did eventually read it after all.

What’s most delightful about reading this paper again is that I’m reminded of when I first came across the names of several fellow historians whom I now know through professional conferences and have confided in for advice on my own career. The ideas first written in the isolation of lockdown have begun to bear fruit in the renewed interactions of my professional life half a decade later. What more will come of those same vines planted in solitude as this decade continues into its second half? Stretching that question further back in my life, I can marvel at the friendships I’ve cultivated with people I met in my first year of high school, now 18 years ago. That year, 2007, we began our education at St. James Academy, where many of us were drawn to the promise of each student getting their own MacBook to work on. I wrote here in March 2024 about how having access to that technology changed my life forever.[4] So, in the last week, when I read in one of my morning email newsletters from the papers about the soon-to-be introduction of artificial intelligence to classrooms across this country, in much the same way that laptops in classrooms were heralded as the new great innovation in my youth, I paused for a few moments longer before turning to my daily labor.

I remain committed to the belief that having access to a laptop was a benefit to my education; in many ways it played a significant role in shaping me into the person I am today. I wrote 14 plays on that laptop in my four years of high school, and many of my early essays to boot. I learned how to edit videos and audio, and I still use Apple products today because I was introduced to them at that early age. It helps that the Apple keyboard comes with easy ways to type accented characters like the fada in my name, Seán. Still, on a laptop I was able to write much the same as I had throughout my life to that point. I began learning to type when I was three years old and mastered the art in my middle school computer class. When I moved on to my undergraduate studies, though, I found that I remembered notes far better when I took them by hand than when I typed them. This is crucial to my story: the notes that I took in my Renaissance seminar at UMKC in Fall 2017 were written by hand, in French no less, and so when I was searching for a dissertation topic involving Renaissance natural history in August 2019, I remembered writing something about animals in that black notebook. Would I have remembered it so readily had I typed those notes out? After all, I couldn’t remember the title of that term paper I wrote for On History in April 2020 until I reopened the file just now.

Artificial intelligence is different from giving students access to laptops because, unlike our MacBooks in 2007, A.I. can type for the student, not only through dictation but by suggesting a topic, a thesis, a structure, and supporting evidence all in one go. Such a mechanical suggestion is not inherently a suggestion of quality, however, and here lies the problem. I’ve read a lot of student essays in the years I’ve been teaching, some good, some bad. Yet almost all of them were written in that student’s own voice. After a while the author’s voice becomes clear; with my current round of historiography reading, I’m delighting in finding that some of these historians whom I know write in the same manner that they speak, without different registers between the different formats. That authorial voice is more important than the thesis because it at least shows curiosity, and through it the individual personality of the author can shine through the typeface’s uniformity. Artificial intelligence removes the sapiens from us Homo sapiens and leaves our pride in merely being the last survivor of our genus rather than in being thinkers who sought wisdom. Can an artificial intelligence develop wisdom? Certainly, it can read works of philosophy both illustrious and indescribably dull, yet how well can it differentiate between those twin categories to give a fair and reasoned assessment of questions of wisdom?

These are some of my concerns with artificial intelligence as it exists today in July 2025. I have equally pressing concerns that we’ve developed this wondrous new tool before addressing how it will impact our lived organic world through its environmental impact. With both of these concerns in mind I’ve chosen to refrain from using A.I. for the foreseeable future, a slight change in tone from the last time I wrote about it in the Wednesday Blog on 7 June 2023.[5] I’m a historian first and foremost, yet I suspect, based on the results when you search my name on Google or any other search engine, that I am better known to the computer as a writer, and in that capacity I don’t want to see my voice, as soft as it already is, quieted further by the growing cacophony of computer-generated ideas that would make Aristophanes’ chorus of frogs croak. Today, that’s what I have to say.


[1] Patrick Kingsley, Ronen Bergman, and Natan Odenheimer, “How Netanyahu Prolonged the War in Gaza to Stay in Power,” The New York Times Magazine (11 July 2025).

[2] John McWhorter, “It’s Time to Let Go of ‘African American’,” The New York Times (10 July 2025).

[3] Bishop Mark J. Seitz, D.D., “‘The Living Vein of Compassion’: Immigration & the Catholic Church at this moment,” Commonweal Magazine (June 2025), 26–32.

[4] “On Technology,” The Wednesday Blog 5.2.

[5] “Artificial Intelligence,” The Wednesday Blog 4.1.