Fifty years ago next month, in the early afternoon of Nov. 22, 1963, my seventh-grade class was calculating sums in base-two, or binary code, a facet of the “new math.”
We were assured that such manipulation of ones and zeros was how computers worked, and while that didn’t quite tally yet for most of us, we were aware the “new math” could be traced to one of the most stunning events of our lives to that point — the Russian Sputnik beeping around the planet in October 1957. An alarmed President Dwight Eisenhower issued a call for more scientists and engineers, and in my mind binary code and the Soviet Union are linked — along with President John F. Kennedy’s rallying cry to Congress in May 1961: “I believe this nation should commit itself to achieving the goal, before this decade is out, of landing a man on the moon and returning him safely to the Earth.”
During our avant-garde arithmetic lesson, the announcement arrived: President Kennedy had been shot in Dallas. I don’t recall if we were informed he was dead, but we were dismissed from school. Some classrooms spontaneously cheered, scandalizing teachers. It was conceded the kids were merely gladdened by release from class, not by the fate of JFK, but I suppose it wounded our teachers either way.
I walked a mile across town to home and found my mother weeping in front of the television set. She was Roman Catholic; Kennedy’s portrait hung in our house. Her enthusiasm had inspired me and some pals to fashion our own pro-JFK fliers during the 1960 election campaign, inserting them under the windshield wipers of neighborhood cars — as if Kennedy needed any help in a northeastern Minnesota mining town in those politically blue days.
My mother’s tears had impact, not least I think because she wept before the television, the magic fantasy box that was still a relative novelty in our house. Less than 48 hours later she and I gasped together as witnesses of Lee Harvey Oswald being shot to death on the screen. I was shocked and strangely thrilled. The 24-hour news cycle was decades in the future, but I believe that inadvertently televised killing was a seed.
Oswald’s strong Russian connection was highlighted immediately after the assassination, and many felt the stab of fear we’d known a year before during the Cuban Missile Crisis. Had the Soviets retaliated against Kennedy? We perceived JFK as the unalloyed moral victor of the Cuba showdown, being ignorant of realpolitik deals made with Nikita Khrushchev, and the initial recklessness of the American response. Would Lyndon Johnson have to nuke the USSR?
Mercifully, that moment quickly passed, and our national angst soon transferred to conspiracy theories that ballooned to Hollywood proportions and continue to augment political grist mills and bank accounts to this day. Historically, it does matter who killed Kennedy and why, and in my opinion Oswald likely pulled the trigger, though I’m not certain he acted alone. After all, even JFK’s brother Robert, attorney general, asked John A. McCone, director of Central Intelligence: “Did the CIA kill my brother?”
The fact that Oswald was promptly murdered in police custody is certainly suspicious. That suspicion, however, is partly a product of time passed. The day Jack Ruby killed Oswald, my mother and I were accepting of personal grief and longing for revenge as a believable motive — because that’s how we felt.
For the American high school generation of 1963, those postassassination days delivered an irremediable blow to collective innocence and spawned a germ of cynicism. As journalist Lance Morrow wrote on the 20th anniversary in 1983: “The real 1960s began on the afternoon of November 22, 1963 … [It] came to seem that Kennedy’s murder opened some malign trap door in American culture, and the wild bats flapped out.”
The subsequent assassinations of Robert Kennedy and Martin Luther King — five years later and two months apart — seemed almost foreordained and tragically hackneyed, a network TV version of regrettable grit, as sure as “death and taxes.”
Still, big news was not yet entertainment in 1963. Technology was not yet in play to trivialize shared experience via homogenized, mythmaking images ceaselessly funneled through multiple channels, images with the leisure to short-circuit individual reflection.
Our cumbersome pencil-and-paper calculations of binary code, now multiplied quadrillions-fold, have long since trumped the power of a Mannlicher-Carcano bolt-action rifle. (How quaint it seems that Kennedy’s shooter did not wield a semiautomatic; imagine the rich trove of ballistics analysis we’ve been cheated of because there weren’t 10 or 12 shots.)
Is our current cataract of information the reason why JFK’s murder seems more real to me than the 9/11 attacks, or the bagging of Osama bin Laden? Is it a simple equation: the more you have of something the less you value it? The fainter impression it makes? Has instant information, gilded with glitz and fashioned into scripted narratives with beginning, middle, end — plus time for advertisement and space for overlays of sanctioned meaning — corrupted knowledge?
That gray, early winter afternoon so long ago, I left school and walked for 20 minutes in chilly, wet weather. By late evening the TV screen hosted a test pattern and was turned off. We went to bed and slept, perhaps dreamed our own dreams. Next morning, we resumed hard-wired connections with daylight and faces, waiting (!) for the next newscast, or for the afternoon newspaper.
These recesses in the coverage, oasis-islands in the ocean of information, preserved the separation required for comprehension, like the pregnant pause in a recitation of poetry, or the moment when a good teacher goes silent, raises an eyebrow, and says, “Any questions?”
There are few genuine queries in the deluge of ones and zeros, mostly just unsolicited answers that fit the format of bytes, or the imperious needs of Wall Street and Madison Avenue.
I’m not saying ignorance is bliss, but our level of useful awareness is not directly related to the amount of information we consume. An inundation of data becomes clutter, or in the terms of the iron-mining culture I grew up in: overburden, the waste rock concealing valuable minerals.
It’s about mindfulness. As Mark Muesse wrote: “Mindfulness is the skill of being deliberately attentive to one’s experience as it unfolds — without the superimposition of our usual commentary and conceptualizing.” (Or that of others.) In popular parlance: Be here now.
If computers were sentient, the processing of binary code would be an exemplar of mindfulness meditation — focusing on one thing without judgment, a paragon of equanimity. An open laptop would be the lotus position. Big Blue would be the Buddha. Fortunately, our machines are not conscious, for I suspect many science fiction writers are correct to assume that if computers were truly intelligent they would probably dispense with entities as messy and incalculable as human beings.
It resonates with me that I was working with machine language when I heard that Kennedy was shot. I had little concept of the implications of either computers or a president’s death. Fifty years later, our society is at risk of anarchy and mass death if a cyberattack mangles our software. In 1963, creating such chaos would have required a nuclear strike.
As for JFK’s demise, Earl Warren, chief justice of the United States Supreme Court and head of the eponymous commission that investigated the crime, said, “We may not know the whole story in our lifetime.” I don’t believe I do. It bugs me a little, and even more so when I realize that given the morass of information, speculation and opinion — all magnified by the ones and zeros, switches open and switches closed — it seems possible no one ever will. The oasis-islands have washed out with the tide.
Shortly after the assassination, Marguerite Oswald, mother of Lee Harvey, said, “If my son killed the president, he would’ve said so. That’s the way he was brought up.” Maybe, maybe not. We all understand parents can be delusional regarding the character of their children. But a half-century on, her words stand in poignant isolation. Are they true? Well, at least she didn’t start a blog.
Seven centuries ago, an English theologian and philosopher called William of Occam, who contested the worldly power of the pope and who “claimed that purely intellectual abstractions are not valid knowledge and that reasoning must be based on experimental proof,” stated a principle that gained fame as Occam’s razor. He wrote: “Entities [means employed to explain phenomena] should not be multiplied beyond what is needed.”
The Warren Commission was officially satisfied with the simplest explanation of the assassination: Lee Harvey Oswald acted alone. Are many of us not satisfied because it’s too simple? How could a lone, flawed commoner bring down the King of Camelot? A screenwriter would never buy it.
Peter M. Leschak, of Side Lake, Minn., is the author of “Ghosts of the Fireground,” “Letters from Side Lake” and other books.
The Opinion section is produced by the Editorial Department to foster discussion about key issues. The Editorial Board represents the institutional voice of the Star Tribune and operates independently of the newsroom.