Finally, some good news for the absent of mind. Forgetfulness is not a sign of early dementia and, no, you are not suffering from information overload. Turns out your brain is just doing its job. What's more, being forgetful may mean you're blessed (or cursed) with exceptional smarts.

Even the human brain, as superior as it is, can't do everything. In the interest of advancing the species through the kind of problem-solving that produces paradigm shifts like fire-building and the wheel, evolution seems to have randomly bestowed upon some people an uncommon ability to shed the small stuff. These people may or may not be regarded as smart by their peers in the "real" world — they aren't necessarily the first with the right answer on spelling tests, and they may not know how to do things as brain-dead simple as keeping a calendar and finding their keys. But they do score the highest on IQ tests.

I am one of those dysfunctional "smart" people. When I was 9, my mom broke the news (and swore me to secrecy) that I had scored high on an IQ test. The parochial school I attended was bursting at the seams. A few future fourth-graders had been picked, based on that IQ test, to share a classroom with kids a year older. We were being groomed to skip a grade the following year.

The experiment — the hope that I would benefit from the knowledge being imparted on the fifth-graders' side of the classroom — failed miserably. I spent fourth grade in a fog, unable to focus on my own lessons, much less the older kids'. My parents moved me to an all-girls school the following year. I arrived hopelessly behind in math, with barely legible penmanship and a lot of catching up to do.

What was no secret within my family was a set of disturbing traits they did not associate with intelligence. I was feisty, fearful and extremely forgetful. Of course, we now know that "extreme" behavior goes with the territory when you have a high IQ. New brain science is finally figuring out why.

In a paper published in the journal "Neuron," University of Toronto neurobiologists Paul Frankland and Blake Richards lay out their thesis: "Based on principles from machine learning and computational neuroscience, we propose that it is the interaction between these two processes (i.e., persistence × transience) that optimizes memory-guided decisionmaking in changing and noisy environments. ...[O]nly by combining persistence (remembering) and transience (forgetting) can individuals exhibit flexible behavior and generalize past events to new experiences."

Those "noisy environments" include such places as chaotic classrooms. The "intelligent" brain not only tunes out the noise, it deletes it altogether. It has no way of knowing, especially if it belongs to a 9-year-old, what is important noise and what isn't, but it doesn't respond well to being told. It must decide on its own, through trial and error.

In artificial intelligence, this principle is called "regularization," and it works (or doesn't) by favoring simple computer models that keep core information and drop specific details, allowing for wider application. Memories in the brain work in a similar way. When we remember only the gist of an encounter as opposed to every detail, this controlled forgetting of insignificant details creates simple memories that are more effective at predicting new experiences.
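
For the technically curious, here is a rough sketch in Python of the idea (a toy illustration of my own; the data, penalty values and function names are invented, and none of it comes from the Frankland-Richards paper). A flexible model is fit to noisy observations twice, once with no penalty on detail and once with a penalty that forces it to "forget" the wiggles; the forgetful version usually predicts new cases better.

```python
import numpy as np

rng = np.random.default_rng(1)

# "Experience": a handful of noisy observations of a simple underlying trend.
x = np.linspace(0, 1, 15)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, size=x.shape)

# Polynomial features: a model flexible enough to chase every wiggle.
degree = 9
X = np.vander(x, degree + 1)

def fit_with_penalty(X, y, penalty):
    """Least squares with an L2 penalty (ridge regression).
    penalty = 0 keeps every detail; a larger penalty 'forgets' them."""
    n_features = X.shape[1]
    # pinv tolerates the near-singular matrix that appears when penalty = 0.
    return np.linalg.pinv(X.T @ X + penalty * np.eye(n_features)) @ (X.T @ y)

# Fresh situations drawn from the same world, to test generalization.
x_new = np.linspace(0, 1, 300)
y_new = np.sin(2 * np.pi * x_new)
X_new = np.vander(x_new, degree + 1)

for penalty in (0.0, 1e-3, 1e-1):
    w = fit_with_penalty(X, y, penalty)
    err = np.mean((X_new @ w - y_new) ** 2)
    print(f"penalty={penalty:g}  error on new data={err:.3f}")
# A modest penalty usually beats no penalty at all: discarding detail
# ("transience") tends to improve predictions about new experiences.
```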

In critical ways, humans do a better job of regularizing than do the machines we program to mimic our intelligence. "Overfitting" occurs when a model clings so tightly to the particulars of the data it was trained on that it loses the aforementioned "gist" and misreads anything new. Driverless cars may someday be programmed to operate at 99 percent accuracy, but that won't make them perfectly safe. Physician Atul Gawande, writing in the New Yorker recently, complains that expensive computer software intended to free up time for doctors to spend with their patients has backfired, creating endless redundancies and needless complexities.
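
Overfitting is easy to see in miniature. The sketch below (again a toy of my own invention, not anyone's actual trading, driving or medical model) fits an extravagantly flexible curve to a dozen noisy data points; it reproduces the past it memorized almost perfectly and then stumbles on data it has never seen.

```python
import numpy as np

rng = np.random.default_rng(2)

# A dozen past observations: a gentle upward trend plus noise
# (think of a short run of ever-rising housing prices).
x_past = np.linspace(0, 1, 12)
y_past = 1.5 * x_past + rng.normal(0, 0.2, size=x_past.shape)

# An overfit model: flexible enough to thread nearly every past point.
overfit = np.polyfit(x_past, y_past, deg=9)

# The future it never saw, including values just beyond its experience.
x_future = np.linspace(0, 1.2, 100)
y_future = 1.5 * x_future

past_err = np.mean((np.polyval(overfit, x_past) - y_past) ** 2)
future_err = np.mean((np.polyval(overfit, x_future) - y_future) ** 2)

print(f"error on the past it memorized: {past_err:.4f}")   # small
print(f"error on the future:            {future_err:.4f}")  # typically far larger
```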

And we all know what happened when mortgage bankers replaced common sense with algorithms that promised housing prices would always go up. Wrong-o.

In Michael Lewis' somewhat romanticized version of the crisis, "The Big Short," a brilliant but notoriously "odd" human figured out that the computers were wrong by combining an ability to interpret raw data with a deep knowledge of how bankers think. He connected the dots and came up with the contrarian view that made him a billionaire: The banking industry was knowingly rigging the system. (See Charles Ferguson's book "Predator Nation" — his documentary on this topic, "Inside Job," won an Oscar in 2011 — to learn the gory details that most Americans are still ignorant of; it doesn't take exceptional smarts to conclude that the lack of criminal prosecutions of these same corrupt bankers represents the most dazzling failure of the Obama administration.)

In short, even supersmart computers are as yet unable to perform the sort of transcendent problem-solving that enabled Copernicus to see that just because the sun appears to circle the Earth doesn't mean it does, and Newton to come up with the counterintuitive concept of an invisible force pulling apples to the Earth, and Einstein to figure out how light lets us live comfortably in a dimension of reality that doesn't actually exist.

Why are only some people smart in this way? Luck of the draw. Remember, evolution is not intelligent design. Random mutations either prove their worth in a given environment, or don't. The vast majority don't. One such mutation (or, more likely, an accumulation of them) granted an ape-like ancestor self-awareness. This trait grew and flourished in the form of man. It is what distinguishes us from other species — the magic bullet that protects us from (and gives us an advantage over) species of superior size and strength.

Einstein was as memorable to some for his absence of mind as he was for his brilliance. Lots of smart people fly under the radar for just this reason. It wasn't modesty that prevented me from shouting out the right answer before anyone else. I had no clue what x was in 4x - 5 = 15. My mind was elsewhere in math class. But my teachers appreciated my curiosity and imagination, and in a small school I had the luxury of learning collaboratively — educating myself, with my teachers' and classmates' encouragement and help, by writing long essays and reports that I assigned to myself and presented every morning, by composing songs and directing plays that my friends and I improvised over recess, and by being generally disruptive (I spent a lot of time in detention study hall) because after those earlier years in a crowded classroom with a teacher who didn't know my name, I would do just about anything for attention.

My teachers let me "forget" to my heart's content. And the absence of boys meant I could be disheveled and well-liked at the same time. Girls don't care what girls look like. Girls just want to have fun. I was fun. Adolescence came as a rude awakening. The "male factor" eventually ruined my blissful state of being myself. Boys wanted none of my goofball antics. With lightning speed, I learned my place in a man's world and turned myself into someone I saw in magazines like "Seventeen" — pretty and shy, always laughing at boys' not-very-funny jokes.

My mom's early-childhood training in the feminine wiles paid off; her concern over what would become of a girl with a high IQ turned out to be right on the money. I've struggled to find a balance between "the good girl" and the renegade all my life.

Of course, whether we're male or female, it behooves us all to learn how to fit in, and that includes being on time, combing your hair, being on time, sewing missing buttons back on your shirt, being on time, and, speaking of shirts, tucking them in. Oh, and did I mention being on time?

There's no excuse for inconveniencing others, no matter what sort of brain you were born with. Such friends as I still have all know that while I may be unreliable, I do have a moral compass: They will have dinner on me at the restaurant of their choice if I keep them waiting even 10 minutes at the coffee shop.

One last thing about that long-ago IQ test. I firmly believe in the adage "use it or lose it." Intelligence is highly overrated, in that our brains are far more similar than different and we all have the freedom and the smarts to seek honest answers to life's persistent questions. We all know, deep down, whether we're looking reality square in the eye or falling back on conventional wisdom. Because intelligence is about creative thinking, a photographic memory can even be a handicap. Attitude of mind is what matters; then comes habit, and the latter is entirely fluid.

I try hard to keep my smarts, such as they are, focused on imagining "what if?" and not "what do the neighbors think?" because I believe that intelligence is a process, not a collection of cells or genes, and that it is, like all things neurobiological, "plastic." I believe, in other words, that intelligence can itself be "taught." That's what philosophers like Plato and Socrates were doing with their students: teaching them not what to think but how.

Freedom of thought is, I believe, imperiled in America. We have become the land of top-down, "my way or the highway" groupthink. It is why the mortgage crisis happened. It is why slogans and sound bites are dangerous. Partisanship is the new religion in that it fosters blind faith, which fosters a herd mentality, which thrives not on reason but anger and fear. Intelligence requires just the opposite to flourish: collaboration. "Yes, and" instead of "but, no." The German philosopher Hegel is associated with dialectical thinking as thesis, antithesis, synthesis. Collaborative thinking doesn't seek "the right answer" because there seldom is one. Everything is relative, as Einstein discovered.

There's nothing wrong with forgetting when judged from this perspective and actually a lot to be said for it beyond its contribution to quantum physics. It resolves the paradox of the absent-minded professor who keeps losing his keys. The old thinking says he only seems to be forgetful but is simply self-absorbed. The new thinking cuts him slack. Those of us who are sick and tired of apologizing profusely for forgetting a birthday are as blameless as a blind man who bumps into someone he doesn't see.

Society, of course, couldn't care less. Would you rather be blind or brilliant? Boohoo, Ms. Smarty-pants. It must be rough. Well, it takes more courage to admit that you like to think for yourself and are good at it than it does to feign humility and cite (pass the buck to) "the experts."

My advice to the absent-minded professor, then, is this: Count your blessings and soldier on. Ransack the house, rummage through pockets, retrace your steps, all the usual protocols when something just goes poof.

But don't bother jogging your memory. Those keys are gone.

Bonnie Blodgett, of St. Paul, specializes in environmental topics. She's at bonnieblodgett@gmail.com.