On discarding books

At my day job, we are shifting offices, or at least we are planning to within the next year or so. My shared space, with three full-height bookshelves, will be replaced by a single office of my own, albeit one with far less square footage. I recently got to see a video of the typical new office space, and it’s barely more than a closet, and, so I am told, there will be a single bookcase inside, just half height.

If I do the math right, I have more than six hundred books, plus an additional ten linear feet of files in file boxes, plus maybe half again as much in historical materials I use for research and other tasks. Those boxes alone would fill up all the anticipated shelf space I will have. Obviously, something has to give. Much of it may have to return home, but that puts it largely out of reach when I am actually working. What can I keep? How much can stay? And what do I do with the rest? These are questions that plague me.

The result is that I’ve been cleaning and sorting, and this has gone on in several rounds. The first pass was to remove every duplicate book, as well as every book that, if I am honest, I will never look at or read again. This was progress; it meant removing a couple hundred titles. It also gave me an excuse to clean and organize what was left, until I had a single shelf level dedicated to all of my theory books, two for fiction, literature, and essays, two for art books, one for books on writing, and so on. I rather liked how this all worked out.

Even though it was painful to stack books that I intended to part with, I also felt like I was letting things go, like I was clearing out and making changes. A book, when it comes into your life, changes it even if you don’t read it. It is first and foremost a presence on the shelf, acting at a minimum as a statement that you aspire to be a person who has read and valued that work. If you read it, and you decide to keep it, it expresses, I think, a sense of sentiment or perhaps intellectual obligation, as if the book deserves its space because it helped you think through something. I once had a professor, a bit of an unreconstructed hippie, one foot in retirement, with whom I took a small seminar class. There were just four or five students, and we met in her office each Friday and, after about an hour of discussion, we would move to a nearby campus restaurant and sit on the terrace and eat lunch together while continuing our conversation. It felt like the peak of civilization. And I recall that she once suggested that the books we had on our shelves, like all the things we had in our offices and rooms and bedrooms, were part of our brain, they were part of our cognition. “If you look across the room, and see that thing, and think something in response, isn’t it part of your cognitive process?” This was like nothing I had considered before. It changed me.

By extension, when you discard a book, you are letting go of the part of yourself that needed it, you are, in some way, moving, growing, altering yourself. So even as I grumbled about the lack of space in my future, I experienced an unexpected sense of freedom from the process, a sense of possibility. And the act of deciding what books to discard was punctuated, at times, by finding duplicates of good books, books that I saw and thought, oh, I know someone will like finding that.

This process, however, is not a single-pass thing. As I ordered and re-ordered my shelves, I realized that I still had too many books for the future move, but more than that, I began to have doubts about some of those that had survived my first pass. I reached out to touch my fingers against the spines—why is this tactile move seemingly enlightening? I don’t know, but there is, perhaps, an emotional quality to making these decisions. As I felt each book, I realized that there were many that sat on my shelf out of obligation rather than admiration. This book is here because I might need to reference it one day. This book is here because it is widely regarded as a key text on such and such topic. This one is here because I put my name in it long ago, this one because it was a gift. And this one I’ve never read, but what if I need this book some day? It was as if I had turned my shelves into a Victorian china cabinet, filled with plates and bowls and cups that almost never get used, and knick-knacks and vases and serving ware that is valuable, or might one day be valuable, but that in practical terms just sits there taking up space.

The result was another round of introspection, doubt, and guilt. But also, some clarity. Aren’t there books here that I would never part with, books here that mean something more?

A new experiment. What, from each shelf, are the books that must come with me, that must be in my new space? Again, running my hand down the spines on each shelf, it wasn’t hard to find these titles. Some were collections of essays. Some were history books, some art monographs. A surprising number were books of social theory and criticism—I’ve never thought of myself as a theory person, and yet, there’s Fisher and Benjamin and Barthes and a bunch of others, somehow on my shelf of personal classics. And now, as I look at the gaps I’ve made, I have new thoughts. First, I am saddened at how much I have wrecked my still quite young organization system, so much so that I consider putting everything back the way it was. Second, I look at all the shelves that have spaces where once my favorite titles had sat, and I feel ambivalence about the titles that remain. How many here are due to completism, or a sense of obligation?

Crap. I left Harvey on the regular theory shelf. I get up, I move it over to the “keepers” shelf, and marvel at another book of theory.

Have I really changed that much?

I once placed a line in a dating app, something to the effect of “valuing the material over the theoretical.” I know, I sound like such a catch, right? But my thinking was to always make my profiles say something about who I was, about my values, my interests, about the part of me that isn’t a face, that is more than a heartbeat and a body. And I remember, once, that I got a random message from someone who had read the profile and reached out because they were mildly tweaked by that line, and wanted to challenge me, to demand an explanation of me. I don’t know what I responded, not word for word, but the gist was that I had spent too much time digging through the evidence of bloodshed and harm to have much patience for the kind of theory-spouting performativity that fills so much discourse, both online and in the academy.

He didn’t block me, but he certainly didn’t continue our conversation.

And now, the books that I want to keep, the books I am putting into my mental suitcase to take with me when I have to move, have a not insignificant number of theory titles.

Some of this must be age. Some of this, I think, must be the crisis, too. To live now, much less to begin to feel some weight of age now, is to try and cut through the endless bullshit of daily life in a time of chaos and anger and spectacle, to try and find the things worth holding onto. I find, more and more, that the process of doing things outweighs the outcomes, that clarifying what I think and what I think I want to do matters as much as the doing. Are my theoretical texts just religious books, stop-gap crutches for a lack of the divine? Perhaps, but what I take from them, what these books seem to actually, actively be saying to me from across the room is that why matters. As much as when, and where, and what, and whom.

And now I am left with two levels of shelf with the books I really need, and the knowledge that, soon enough, I’m going to have to ask myself what I really think about the other titles that remain.

Oppenheimer is science fiction: Myth, the Anthropocene, and Ursula K. Le Guin

A portrait view of Cillian Murphy, the actor playing J. Robert Oppenheimer. He is shown head and shoulders, facing the camera, but looking slightly to the left. Behind him the scene is blurry, but it is probably the desert town of Los Alamos, New Mexico, built specifically at the real Oppenheimer’s request to house part of the U.S. atomic weapons program during World War Two.
The titular character of J. Robert Oppenheimer (Cillian Murphy). Universal Pictures

What kind of movie is Christopher Nolan’s Oppenheimer? Ostensibly, it is about the real J. Robert Oppenheimer (1904-1967), the man who led the Manhattan Project during World War Two and whom many see as the “father” of the first atomic bomb. Released last week, it has mostly been lauded by critics—Wendy Ide called it a “volatile biopic” and a “towering achievement” in The Guardian, while Anthony Lane (writing for the New Yorker) is more reserved, calling the film “challenging” because of its “glut of political paranoia.”

A screenshot from Twitter showing Sam Altman saying "i was hoping that the oppenheimer movie would inspire a generation of kids to be physicists but it really missed the mark on that. let's get that movie made! (i think the social network managed to do this for startup founders.)" This is followed by a reply by Elon Musk saying "indeed." Another reply below this, from Clement Mihailescu, says "Haven't watched Oppenheimer yet, but I can definitively say that The Social Network was indeed the first trigger to my interest in entrepreneurship."
Atomic weaponry should inspire! Screenshot from X, the site formerly known as Twitter.

Meanwhile, on social media, some have worried that the film would fall back on “great man” tropes to lionize the titular character of the physicist Oppenheimer, while others complained the opposite, with OpenAI CEO Sam Altman stating the film didn’t do enough to “inspire a generation of kids to be physicists” in the way that “The Social Network managed to do… for startup founders.” What “kids” would be doing watching Oppenheimer (rated R) is just the first of several curiosities piqued by Altman’s complaint, but let’s set this aside for a moment.

You don’t order a hamburger at a hotdog stand. Nolan’s film is not a documentary, it is not a work of history, and it is not a recruitment video. Yet it would be equally a mistake to call this, as many have, a “biopic,” and then expect it to accurately represent the ins and outs of the life of the real J. Robert Oppenheimer. Those looking for a documentary have missed the thread. 

Nolan makes films that tend toward the dark and the fantastic. These films play with unusual pacing and matters of perspective, such as Memento (2000). They use speculative devices to drive the plot, as with a cloning machine made by a fictionalized Nikola Tesla in The Prestige (2006), dream penetration in Inception (2010), a wormhole in Interstellar (2014), time travel in Tenet (2020), and so on. We see, in these films, the destruction wrought by the creation of new technologies, by the scientific shortcuts that seem to offer great potential, but instead become avenues of unforeseen and often terrible ends. Nolan is famously ambivalent towards technology, an adherent of analog film processes in an era dominated by digital imaging. (Much of Oppenheimer was shot on 70 millimeter IMAX film stock produced by Kodak, and to shoot scenes in monochrome, Nolan had custom 70mm black-and-white film manufactured.) And in his Batman trilogy (Batman Begins, 2005; The Dark Knight, 2008; The Dark Knight Rises, 2012), all the gadgets in the world cannot clear up the conundrum that the vigilante hero and the villain are almost indistinguishable. Of all of the films that Nolan has both written and directed, only one, Dunkirk (2017), is historically themed, but even then, there’s the suspense of watching an elaborate magic trick: How did the British Army get away from the Nazis after the failure of the Battle of France in 1940? I am not particularly fond of Nolan’s films, in part because they are all heavy-handed, imposing a deeply gloomy worldview, and for all their supposed focus on ambiguity, they seem all too definitive about nihilism. One thing is undeniable, however: Nolan doesn’t much care for filmic norms.


A 17th century painting by Rubens showing the mythical Prometheus. He is naked, chained upside down to a rock. The landscape is dark and moody. An eagle stands on his head, and is reaching down with its beak to tear into the torso of Prometheus, and there is some blood.
Rubens, Prometheus Bound. Oil on canvas, c. 1612. Philadelphia Museum of Art.

As for Oppenheimer? Nolan’s film deploys some of the tropes of the “biopic” form, yet despite these presumed ties to a real life, this is science fiction. Nolan builds, in part, on a biography of the real J. Robert Oppenheimer—more on this in a moment—but he is also building on older stories, namely the ancient Greek myth of Prometheus, who stole fire from the gods and gave it to humanity. The Prometheus metaphor is twisted throughout the film, starting with the opening epigraph, which states:

Prometheus stole fire from the gods and gave it to man.
For this he was chained to a rock and tortured for eternity.

This is influenced, in part, by the title of Kai Bird and Martin J. Sherwin’s Pulitzer Prize-winning 2005 biography, American Prometheus: The Triumph and Tragedy of J. Robert Oppenheimer, a book that served as one of Nolan’s chief sources. (The end credits name the book early and prominently.) Nolan’s script, in turn, has its characters directly mention the mythic Prometheus more than once.

The Prometheus framing is a bit overwrought. As the film progresses, we see how, after the Second World War is over, Oppenheimer loses his security clearance, his job, and much of his reputation during the 1950s “red scare” years. This is a meaningful injury to Oppenheimer (both the real physicist and the fictional version played by Cillian Murphy). It is, however, hardly the fate of the original, immortal Prometheus, who tradition tells us was chained to a rock in the Caucasus Mountains, where an eagle pecked at his torso and ate his liver, over and over and over again, presumably forever. (According to at least one legend, however, the demigod Heracles later freed him.)

But whether Oppenheimer deserves to be regarded alongside Prometheus is beside the point. This is filmmaking, and more than that, epic filmmaking. Nolan’s Oppenheimer is not the real Oppenheimer, he is a mythic figure, no more bound by the actual events of the past than is Prometheus himself.


A still from Christopher Nolan’s film, Oppenheimer. The scene shows a large, dark, spherical device at center left, with many wires sticking out of it—this is meant to be a depiction of the atomic bomb tested at the Trinity site in New Mexico. Beside it stands a shadowy figure, actor Cillian Murphy, depicting J. Robert Oppenheimer.
Nolan’s depiction of “the gadget,” ready for its test at the Trinity Site. Universal Pictures

Central to the plot is “the gadget,” as the scientists euphemistically call the bomb, and a fateful question that each wrestles with: is it inevitable? Early in the film, Oppenheimer says that “we have no choice” but to build a bomb, noting that while he had doubts about whether the Americans could be trusted with it, he was sure the Nazis could not be. If Oppenheimer and the rest of the Manhattan Project had not made it, would Werner Heisenberg have done so for the Nazis? Or someone else, just later? In science and technology, where does responsibility lie? Later in the film, shortly before the now famous test explosion called Trinity, Oppenheimer asks his team “Is there anything else that might stop us?” It’s not clear who the “us” actually is. 

As the film is titled Oppenheimer, it focuses mostly on the perspective of this single figure, and that implies a personal response to such questions. What is it like to become involved in the terror of things larger than yourself? Who controls action, or does anyone? Is there anything an individual can do other than become swept along with events, to join in? Cillian Murphy deserves a lot of credit here, for Oppenheimer’s obsession with such questions is made visible less through dialogue than through a simmering anguish that the actor wears on his face in nearly every shot. (For a supposedly dialogue-heavy film there’s a hell of a lot of simmering, silent glances—did Murphy learn from the Mark Rylance school of acting?) Late in the film, after the war, Oppenheimer goes to see President Truman. Rhyming with one of the real Truman’s famous aphorisms, Gary Oldman’s take on the president essentially tells Oppenheimer that “the buck stops here,” noting that history will judge the person who ordered dropping the bomb, not who built it. In the process, though, this fictional Truman misses the point: Oppenheimer isn’t interested in where the blame should go, but in how the decision was ever made, in whether the creation of the gadget was truly his “miracle” (as the character at one point describes it) or merely an inevitable chain reaction.

Such chain reactions lie at the heart of the film. At one point, the filmic Edward Teller hands around a set of calculations suggesting one possibility: that the ignition of an atomic fission bomb would create an unstoppable chain reaction, one that would eventually grow to ignite the atmosphere and end all life on the planet. Disturbed, Oppenheimer takes the calculations to Albert Einstein (Tom Conti) to have them checked, and then again to Hans Bethe (Gustaf Skarsgård), one of the project team. Bethe concludes the odds are “near zero,” but cannot be ruled out by math alone. In the film, General Leslie Groves (Matt Damon), the Manhattan Project’s military head, does not learn this until just hours before a practical test of “the gadget” in the New Mexico desert.

Groves is apoplectic. Oppenheimer asks Groves what he wants, and Groves responds “zero would be nice!” The project proceeds. The test is successful, and Oppenheimer watches his creations carted off in crates, bound for use over Japan, where the American government will use them to kill hundreds of thousands, in a grim calculus that holds that brutal and indiscriminate slaughter now will somehow save lives later. If the Prometheus allegory holds, then here we are seeing what humanity has since done with divine fire.


a close view of the profile of the head of J. Robert Oppenheimer (played by Cillian Murphy). He is inside a bunker, facing towards a small porthole window. He wears thick protective eye goggles. His face is brightly lit, almost pure white, from a source outside the window. The performance is meant to depict the shocking blast of light from the explosion of the test weapon at Trinity.
Progress isn’t always progress. Universal Pictures.

The power Oppenheimer and his team unleash at Trinity marks the moment the Anthropocene became unstoppable. Historical films are never historical. They are always a product of contemporary culture, and no matter how authentic the costumes and sets and research, they are about the now. Oppenheimer’s story is about the birth of the atomic bomb, but its message is about us. It is about fossil fuels and climate change, rack-rent digital “enshittification” and radical income inequality, predictive language models (“AI”) and the future of work. It is about what obsession with technological “progress” can bring us: death, destruction, tragedy.

And this is what makes Oppenheimer science fiction. As the late great Ursula K. Le Guin once wrote, science fiction isn’t about the future, it’s about the narrative potential of a world of possibilities, worlds where

what we do see is the stuff inside our heads. Our thoughts and our dreams, the good ones and the bad ones…. when science fiction is really doing its job that’s exactly what it’s dealing with.

Ursula K. Le Guin, “Science Fiction and the Future,” 1985.

The bomb at the heart of Oppenheimer, then, isn’t the “gadget” that was blown up in a stretch of New Mexico desert at 5:29 a.m. MWT on July 16, 1945, nor the two bombs deployed over Hiroshima and Nagasaki. Nor is it the “bigature” version of the Trinity test that Nolan’s crews constructed from propane and gasoline and magnesium and aluminum. The real bomb is the one inside the human mind, inside of our minds. It is the “bad dreams” that Le Guin says science fiction can show us.

Is it any wonder that Sam Altman (or Twitter dictator Elon Musk) doesn’t like this film?

Searching for the Real McGuffin: Indiana Jones and the Dial of Destiny

A still from Indiana Jones and the Dial of Destiny (James Mangold, 2023), showing actor Harrison Ford disguised as a German officer during World War Two. Ford's face has been digitally retouched to appear younger than the 80-year-old actor actually does today.
Above: Harrison Ford looks pretty good for 80, doesn’t he? Thanks, Photoshop! Still from Indiana Jones and the Dial of Destiny (James Mangold, 2023).

Is Indiana Jones still relevant? The character was originally conceived of as a pastiche of numerous 1930s and 1940s adventure heroes, part Humphrey Bogart in Treasure of the Sierra Madre, part Ronald Colman in Lost Horizon. Despite these temporal allusions—themselves part of the filmic world, not the historical one—the character of Henry Jones, Junior, or “Indiana” Jones, or just plain “Indy,” is distinctly a product of nostalgia. He is not of the 1930s, but of the 1980s. Does the third decade of the 21st century really need the return of Indiana Jones? The release of Indiana Jones and the Dial of Destiny late last month (June 2023) seems to say: we’re getting Indy again whether we want him or not.

I don’t ask this easily. Thanks to a box set of the original trilogy, bought on discount at a Payless Drugs, Indy was a companion in my adolescent search for self, available any time I wanted to insert a tape into the VCR. I played the third film, Indiana Jones and the Last Crusade, so many times that eventually my mother would roll her eyes and say “not that again.” Yet the world of Last Crusade was an exciting, stylish one, made for the pre-pubescent that I then wished to remain. It was filled with machines—teak Chris-Craft speed boats, silvery Zeppelins, biplanes, fighter planes, tanks. Its plots were structured as constant movement between distant places—Peru, Nepal, Egypt, India, Italy, the Middle East. It was a world that stood in stark contrast with the glass-block, jewel-toned, postmodern malaise of my suburban reality. It was a world of high stakes hooked to esoteric mysteries from the past, a world where what and how the hero thought was as important as who and how he fought.

More broadly, Indy served as an American fantasy of self-image. In two of the three original films, Raiders of the Lost Ark (1981) and Last Crusade (1989), Indy’s antagonists were Nazis. During one scene in the latter film, Indy’s father Henry (Sean Connery) calls the Nazis “the slime of humanity.” For the movie writers, the Nazis were the ideal adversary, one that could be shot, stabbed, driven over, and blown up without remorse. This is Indy as Captain America, but with a subdued palette.

Yet Indy’s Nazi adversaries were more than merely the ultimate NPC enemy. They had wider cultural ramifications. When the real Hitler had marched his armies across Central Europe, Scandinavia, and France, had bombed England from the air, and had begun to mass murder sexual, religious, and racial minorities, the United States had done nothing, maintaining political neutrality. Such policies all too often made space for Americans to give political comfort to the Nazi regime. The most visible example came in early 1939, a year after the Nazis had forcibly seized Austria, and just a month before they would repeat this “annexation” in Czechoslovakia. On February 20th of that year, more than twenty thousand American citizens filled Madison Square Garden, the most prestigious sporting venue in the most important American city. These were members of the Amerikadeutscher Volksbund, or the German-American Bund, a society of American citizens who saw no contradiction between the fascism of Nazi Germany and the future of the United States, and they put on a Nazi rally, praising Hitler and George Washington in the same breath. Elsewhere? As scholars have since established, anti-black “Jim Crow” race laws throughout the American South and West served as a legal pattern for many of the race laws the Nazis crafted to justify and carry out the Holocaust. By pitting Indiana Jones against the Nazis in two of its three original 1980s blockbusters, the franchise wrote over, retconned, the historical ambivalence of Americans toward the rise of Nazism. In these films, we were always the good guys, we were always on the side of the angels.

The myth of Indy the Nazi fighter became so strong that it nearly destroyed the franchise. It took more than twenty years for another film to be released, 2008’s Indiana Jones and the Kingdom of the Crystal Skull, set in 1957. For antagonists, the writers replaced the Nazis with Soviet spies. Turning Indy the anti-fascist into Indy the fighter of reds under the bed was so far-fetched that the filmmakers themselves didn’t really believe in it, preposterously arming their ultimate villain, the KGB agent Irina Spalko (an ill-used Cate Blanchett), with a sword rather than a gun. Though it did relatively well at the box office, Kingdom was a muddled film, receiving mixed reactions from both critics and fans, and somehow making the racist and misogynist Temple of Doom seem better by comparison. (As ever, the petulant George Lucas told us so.) Was Kingdom an archaeological adventure like those Indy had been on so far? Or was it an homage to another Spielberg-directed film, 1977’s Close Encounters of the Third Kind? Was it celebrating and leaning into Cold War red scare tropes, or making fun of them? Was Kingdom meant to be the end of Indy’s career and a springboard for his successor, his newly discovered son “Mutt” Williams (Shia LaBeouf), or to be the first of several new Indy films? Nobody, not even the filmmakers, knew what Kingdom was actually about or for—other than cashing in on the Indy intellectual property, anyway—but most of all it was an Indy film that wasn’t about fighting the Nazis, and as a consequence seemed to lack a certain spark.

The bad taste of Kingdom made me hesitant to go see Indiana Jones and the Dial of Destiny, the recently released fifth (and supposedly final) installment of the Indy franchise. As when Kingdom came out, some have wondered whether we needed this latest film, or whether Harrison Ford (who is now eighty) is too old to play Indy. A better question: Do we need to see Indy fighting Nazis again? In 2008, the answer to this was mired only in the aesthetics of fandom. In 2023, however, there is far more salience to the question.

The answer? Nazis we get. Although Dial is set in 1969, our main villain, Doctor Voller (Mads Mikkelsen), is a Nazi… and a very Americanized one. A riff on the real Wernher von Braun—the former Nazi rocket scientist who helped NASA reach the moon in 1969—Voller has distinct and bitter thoughts about the outcome of the war. As Voller tells a Black hotel steward early in the film, “You didn’t win the war. Hitler lost it.” With help from the Central Intelligence Agency, Voller sets out, struggling against Indy, to recover a highly fictionalized version of Archimedes’ Antikythera mechanism, which he believes will enable time travel. Presumably unbeknownst to his American sponsors—we never quite get full resolution about this fact—Voller’s goal with the Antikythera is to go back in time to kill Hitler and replace him with an actually competent Nazi leader, presumably Voller himself. The film becomes a meditation on Nazism and American complicity, with the CIA helping Voller to acquire the device, even at the cost of civilian lives.

Meanwhile, Indy finds himself fighting alone, surrounded by people who dismiss his warnings about Nazis as hyperbole or paranoia. Nobody believes the Nazis are a thing anymore, even though they are right there, and everybody can see them! The only person who really understands the stakes is the eighty-year-old man who everyone thinks is a little too old and too past it to crack his whip, or ride a horse, or climb up a cave wall. As the plot unfolds, however, the companions who doubted Indy begin to see what he saw, begin to realize that, all along, the old man had been right. (If you squint hard enough, the whole film starts to resemble a commentary on the Biden 2024 presidential campaign.) Though I won’t spoil how, it should be obvious that, old as he is, and doubted as he is, Indy (with a little help from his friends) once more succeeds in defeating the Nazis, because that is simply what Indy does, it is who he is.

By the end of Dial of Destiny, we come to realize that just as the first three films were about the 1980s they were made in rather than the 1930s they were set in, Dial is about now, not the 1960s, and its main thesis is that, faced with old troubles, generational reconciliation is the way out. Despite his in-universe biography, Indy is a “boomer.” He’s had his nuclear family (wife Marion, son Mutt), he’s had the financial security of an institutional job, he has a pension and is presented with a literal gold timepiece on his retirement. He is old and stuck in his ways and does not understand contemporary culture, he yells at neighbors to turn down their music, and he wishes he could escape the complexities of the present and immerse himself—literally—in the past. His goddaughter Helena (Phoebe Waller-Bridge), meanwhile, is the avatar of a millennial. She is single, with no home, no husband, no kids, but has fictive kin in the form of Teddy (Ethann Isidore), a teenage fellow drifter who is one part brother, one part son. Unlike Indy, Helena jumps from job to job (many of them only questionably legal), scrapes by with her wits, is comfortable with precarity, and most of all is not part of any institutional inside. She is, in other words, a gig worker. (Helena’s assertiveness is voiced in soft-pedaled variations of feminist rhetoric, but it’s never clear if Helena is actually an empowered female character, or merely a totem of one installed to make the film seem more contemporary.) Helena’s in-betweenness in all things queer-codes her—despite her frequent leering at male bodies of many ages—and it sets Indy on edge, making him doubt her almost incessantly. Yet, by the end of the film—in part due to Helena’s persuasive use of physical violence—she and Indy are reconciled to being on the same team, to fighting together rather than against each other and to seeing family—both born and fictive—as the site of meaning and contentment, the thing worth fighting for. Reconciliation, not relics, is the film’s real McGuffin.

What, then, does Indiana Jones and the Dial of Destiny really try to tell us? Despite the technological marvel of “de-aging” Harrison Ford for an extended opening sequence set during the Second World War, and despite the trope of the titular “Dial of Destiny” that allows for time travel, the film actually argues the inverse. There is no magic. There is no going back to change the past, not for the Nazis, not for Indy, and not for us. The film ends by reuniting Indy with a wife who grieves for their lost son, with Sallah and his loud, cute children, with a goddaughter who annoys the crap out of him. All Indy can fight for is all that we can fight for as well, an imperfect present and the determination to keep going, even if it gives us aches and pains.

Indiana Jones and the Dial of Destiny is not the best Indy film, nor one that I see myself watching over and over again, but it may be the most relevant Indy film ever made, replacing a fantasy of how we wish we were with a vision of how we might be.