I’d like to begin by thanking the Knight Foundation for the invitation to speak. It’s a profound honor to be here, before this organization dedicated to journalistic excellence.

I’ve been asked to give a talk about decision-making. I’m going to focus today on bad decisions, on the causes and repercussions of failure. The failure I’ll be talking about is my own.

For those who do not know who I am, let me give you a brief summary. I am the author of a book on creativity that is best known because it contained several fabricated Bob Dylan quotes. I committed plagiarism on my blog, taking, without credit or citation, an entire paragraph from the blog of Christian Jarrett. I lied, repeatedly, to a journalist named Michael Moynihan to cover up the Dylan fabrications.

My mistakes have caused deep pain to those I care about. I am constantly remembering all those people I’ve hurt and let down – friends, family, colleagues. My wife, my parents, my editors. I think about all the readers I’ve disappointed, people who paid good money for my book and now don’t want it on their shelves.

I have broken their trust. For that, I am profoundly sorry. It is my hope that, someday, my transgressions might be forgiven.

I could stop here. But there is a reason I want to talk today about my failures. I am convinced that unless I talk openly about what I’ve learned so far – unless I hold myself accountable in public – then the lessons will not last. I will lose the only consolation of my failure, which is the promise that I will not fail like this again. That I might, one day, find a way to fail better.

The lessons have arrived in phases. The first phase involved a literal reconstruction of my mistakes. I wanted to have an accounting, in my head, of how I fabricated those Dylan quotes. I wanted to understand the mechanics of every lapse, to relive all those errors that led to my disgrace. I wanted to understand so that I could explain it to people, so that I could explain it, one day, in a talk like this. So that I could say that I found the broken part and that part has a name. My arrogance. My desire for attention. My willingness to take shortcuts, provided I don’t think anyone else will notice. My carelessness, matched with an ability to excuse my carelessness away. My tendency to believe my own excuses.

But then, once I came up with this list of flaws, and once I began to understand how these flaws led to each of my mistakes, I realized that all of my explanations changed nothing. They cannot undo what I’ve done, not even a little. A confession is not a solution. It does not restore trust. Not the trust of others and not the trust of myself. At worst, my detailed explanations sounded like an excuse, a distraction from the more important reality I needed to confront.

Because my flaws – these flaws that led to my failure - are a basic part of me. They are as fundamental to my self as those other parts I'm not ashamed of. This is the phase that comes next, the phase I’m in now. It is the slow realization that all the apologies and regrets are just the beginning. That my harshest words will not fix me, that I cannot quickly become the person I need to be. It is finally understanding how hard it is to change.

Character, Joan Didion wrote, is the willingness to accept responsibility for one’s own life. For too long, I did not accept responsibility. And by not accepting responsibility – by pretending that all of my errors were accidents, that my carelessness was not a choice – I kept myself from getting better. I postponed the reckoning that was needed. I did not learn who I was until it was too late.

There is no secret to good decision-making. There is only the obvious truth: We either confront our mistakes and gain a little wisdom, or we don’t and remain fools.

What I’d like to talk about today is how I’m attempting to confront my mistakes in the future. I don’t have any wisdom, just a story that gives me a small measure of comfort. It’s a hard story to tell because it’s not about me. But I want to tell it anyway, because it has helped me understand what I need to do next.

The story is about forensic science. At the time my career fell apart, I was working on an article about the mental flaws that plague forensic researchers. These are flaws that, if left unchecked, can lead to mistaken matches and wrongful convictions. My article focused largely on the research of Dr. Itiel Dror, a cognitive neuroscientist at University College London. His most widely cited work involves fingerprint analysis. Dror has repeatedly shown that when fingerprints are unclear – and prints lifted from crime scenes are often full of ambiguity – it’s possible to get forensic scientists to alter their conclusions by telling them a different story about the prints. (Similar problems have been found in hair samples, bite marks, even complex genetic evidence.) In one of his experiments, Dror showed that four out of six experienced forensic examiners reversed their verdicts when presented with different contextual information. The prints remained the same, but that didn’t matter. He could trick them into changing their minds.

It’s important to note, of course, that Dror isn’t alleging conscious bias. He doesn’t believe that these forensic scientists are intentionally switching their verdicts to bolster the prosecution. Rather, he argues that they are victims of their hidden brain, undone by flaws so deep-seated they don’t even notice their existence. The examiners think they are seeing it straight, observing the print as it is. But we see nothing straight. Everything is a little crooked.

It’s easy to brush these shortcomings aside, to insist, quite accurately, that the overwhelming majority of forensic testimony is valid and true.

But you know how it goes: close is not close enough. If we are not prepared to deal with our mistakes – if we try to hide them away, as I did – then even minor errors can become catastrophes.

This is what happened to the FBI forensics lab. Three days after the terrorist bombings in Madrid on March 11, 2004, the FBI received a print lifted from a plastic bag full of detonators found in a stolen van near one of the bombing sites. This print was immediately entered into the FBI fingerprint database, the largest biometric database in the world. A few hours later, the computer generated an initial list of 20 possible matches. After looking at all the possibilities, an FBI forensic scientist concluded that one of these prints was an exact match. A second FBI examiner confirmed this conclusion. The print in question belonged to Brandon Mayfield, a lawyer living in Portland, Oregon.

The FBI immediately opened an intensive investigation of Mayfield, obtaining warrants for electronic surveillance and physical searches of his property. The detectives soon discovered that Mayfield was a Muslim, married to an Egyptian immigrant, and had represented a convicted terrorist in a child custody dispute. On May 6, the FBI arrested Mayfield as a “material witness” in the Madrid bombings. Because there was no corroborating evidence – no flight records, no eyewitness accounts, no link between Mayfield and the suspects – the United States District Court appointed a third expert to review the forensic match. On May 19, the expert confirmed the judgment of the FBI: both prints came from the same person. Mayfield was sent to the Multnomah County Detention Center, where he was placed in solitary confinement and kept in his cell for 22 hours a day.

But the scientists were wrong. Mayfield’s print was not a match. In fact, it wasn’t even close. In retrospect, the certainty of the FBI examiners seems hard to understand. For one thing, the entire upper left quadrant of the original print failed to match Mayfield’s fingerprint. What’s worse, the print from the crime scene was taken from a right middle finger, even though it was matched to a print from Mayfield’s left index finger. It’s for these reasons that the Spanish National Police advised the FBI, nearly six weeks before Mayfield was formally exonerated, that he could not be linked to the crime scene, that their match was a mistake. Needless to say, the FBI ignored this warning.

In the wake of the Mayfield failure, the FBI had several options. They could have denied the systematic nature of the problem, insisting that the Mayfield case was a mere aberration.

Or perhaps the FBI could have acknowledged the remote possibility of error and decided that a little education was more than enough. Maybe Dror could have given a lecture on cognitive bias to the Bureau scientists.

This is a tempting solution. I have been tempted by this solution, by the possibility that if I simply research the psychology of deceit, that if I investigate the neuroscience of broken trust, then I can find a way to fix myself, that the abstract knowledge will be some kind of cure. But such knowledge is not enough. I know this from personal experience.

The month before I resigned from my job, I conducted an interview with Dan Ariely, a behavioral economist at Duke. We were talking about his new book on dishonesty. The essential premise of the book is that tiny falsehoods are everywhere; the human mind is a confabulation machine.

Here’s a typical experiment that Ariely conducted on unsuspecting MIT students. He gave the students a series of puzzles and told them that they would get 50 cents for each correct answer. One group of students was randomly assigned to the control condition, in which it was impossible to cheat. The second group, in contrast, was allowed to check their own work and then, after shredding the answer sheet, report their results.

What did Ariely find? While those in the control condition solved, on average, four of the problems, those who graded their own work reported six correct solutions. Furthermore, this spike in scoring did not result from a few bad apples. Instead, the improved performance was the result, Ariely writes, of “lots of people who cheated by just a little bit.”

I believed in Ariely’s hypothesis, which is that it’s all too easy to shade the truth, to report the wrong answers, to find yourself engaged in the dirty business of rationalization. During our conversation, I asked Dan several questions about his dishonest subjects, always using the reassuring detachment of the third person. I failed to appreciate my own hypocrisy.

The self-blindness on display here is emblematic of a larger pattern, a consistent asymmetry in the ways in which I noticed error. I never had a problem contemplating the flaws and bad decisions of others; I was not lacking for things to criticize. Oh, the comedy of human folly! What ridiculous people these other people are.

But I was totally incapable of applying this same standard to my own life.

My failures were my fault alone. But I’ve come to believe that, if I’m going to regain some semblance of self-trust, then I need the help of others. I need my critics to tell me what I’ve gotten wrong, if only so that I can show myself that I’m able to listen. That is the test that matters – not the absence of error, but a willingness to confront it.  

And this leads me back to the FBI. After being prodded by the Justice Department, the FBI recognized that apologies, re-training, even a big legal settlement, were not enough. If the Bureau was going to prevent another Mayfield-type failure from occurring, then it needed to fundamentally reform its scientific culture.

These reforms are why I’m still interested in this story. Not because of the mistakes, but because of the reckoning.

The most important reforms at the FBI have to do with the ways in which disagreements and contradictions – those troubling signals that someone, somewhere, might have made a mistake – are dealt with by the lab. While such disagreements were routinely dismissed before Mayfield – two forensic scientists within the FBI disagreed with the certainty of the original match, but their doubts were never taken seriously – the new standard operating procedures require that all “technical disagreements” among forensic scientists, and between the FBI and other law-enforcement agencies, be fully acknowledged. The scientists are supposed to sit down and calmly discuss the inconsistencies in their verdicts, going over each whorl of skin and what it might mean. Their differences are documented in writing and then passed along to the Unit Chief. If the disagreement cannot be resolved, it triggers yet another analysis, done this time by an independent examiner. While the pre-Mayfield culture at the FBI rarely encountered disagreements – verifiers almost always verified the original conclusion – the reforms have done a lot to change that. According to a 2011 report from the Office of the Inspector General in the Justice Department, forensic analysts now contradict each other all the time. They know that the only way to avoid big failures is to confront every little one.

There’s a phrase that appears again and again in the Justice Department report: standard operating procedure. The investigators spend many pages reviewing these procedures in tedious detail, how forensic scientists are supposed to go over each ridge and in what order, what the procedures are for dealing with uncertainty or disagreement, even how analysts are supposed to document their evidence in the case files. When I first read this report nearly a year ago, all this talk of standard operating procedures struck me as bureaucratic nonsense. The failures of forensics reflect deep-seated biases, fundamental cognitive weaknesses. How could a rewritten manual possibly help?

But you know what? I’ve come to appreciate this fixation on standard operating procedures.

That is how, one day, I will restore a measure of the trust that I have lost. Not with the arrangement of words, not with the apology, but with the commitment to a set of decent rules. To not have these rules – and it doesn’t matter if the rules are a natural instinct or a paper manual – is to expose myself to the possibility of indifference. It is to slip down a slope and not even notice.

There is a wonderful section in Charles Darwin’s autobiography where he writes about his “golden rule.” The rule is simple: Whenever Darwin encountered a “published fact” or “new observation” that contradicted one of his beliefs, he forced himself to “make a memorandum of it without fail and at once.” Why did Darwin do this? Because he had “found by experience that such facts and thoughts” – those inconvenient ideas - “were far more apt to escape from the memory than favorable ones.”

This really is the golden rule. It begins with a recognition of inherent weakness, but contains this weakness with a conscious habit, something that Darwin does “without fail.”

I clearly need a new list of rules, a stricter set of standard procedures. If I’m lucky enough to write again, then whatever I write will be fact-checked and fully footnoted. It doesn’t matter if it’s a book or an article or the text for a speech like this one. Every conversation with a subject will be tape recorded and transcribed. If the subject would like a copy of their transcript, I will provide it. There is, of course, nothing innovative about these procedures. The vast majority of journalists don’t need to be shamed into following them. But I did, which is why I also need to say them out loud.

Such a writing process will take a discipline I don’t yet have, but that’s why there are standard operating procedures. They are there for those days when I’m not strong enough to look at my errors, when I’d rather pretend they don’t exist. I need the rules because I know that simply knowing is not enough. That listing my failures or hearing about the failures of others won’t prevent me from doing the same thing. These procedures are my fail-safe.

In the past, I have quoted a line from the physicist Niels Bohr. An expert, he said, is a man who has made all the mistakes which can be made in a very narrow field. Bohr is right: we learn how to get it right by getting it wrong again and again.

But here’s the crucial addendum, which I failed to appreciate: screwing up is not enough. Because I certainly made lots of mistakes – I just tried not to pay attention to them. I did my best to look the other way.

And that’s why I need my new rules. Writing about science, about the hard struggle for truth, has always been a profound privilege. When I lost the trust of readers, I lost that privilege.

These rules are my attempt to make sure that never happens again. Their details are a kind of self-portrait, a summary of all those weaknesses I’d rather ignore. If nothing else, my new procedures are a mark of my mistakes, a reminder that whatever I do next will be shadowed by what I’ve done.

There are days when that strikes me as a tragic fact. And then there are days when I see it as a necessary burden, when I can imagine my failure as a kind of protection, like an antibody in the bloodstream after a disease.

I’d like to end with a quote from Bob Dylan, one he actually said: “She knows there’s no success like failure and that failure’s no success at all.” I used to believe that this verse was inherently inscrutable, a hollow riddle. But now I see that the words are literal; there is no riddle at all. Because success does require failure. It requires that we struggle and screw up and keep going. That we learn from what we cannot do well.

But that is a cliché. The poetry of Dylan’s line exists in the inversion, for even as he insists on the necessity of failure, he acknowledges that every failure is still a failing. It is a lapse, a shortcoming, a regret that wakes us up in the darkest parts of the night. The power of the verse is inseparable from this honesty, for Dylan manages to compress a brutal fact of life – failure is both necessary and terrible – into a few seconds of mournful singing. He is speaking a difficult truth, which is really the only kind.

I have learned a difficult truth about myself. I have learned about parts of me that I tried for too long not to see. But entangled with that truth is the possibility of improvement. Not redemption, not forgiveness. Just the mere possibility of improvement. The hope that, one day, when I tell my young daughter the same story I’ve just told you, I will be a better person because of it. More humble. More careful. Less tempted by shortcuts and my own excuses. 

I will tell my daughter that my failure was painful, but that the pain had a purpose. The pain showed me how I needed to change.