Xi in China, Putin in Russia, Trump in the United States[1]—it feels as though the stock of truth is trading at an all-time low. What factors reduced it to a penny stock, and how can it be turned around? This article lays the foundation by analyzing the intrinsic value of truth, while the following article When is Lying Justified? derives a model for when we may deviate from the truth. The subsequent post Why is Truth Having a Hard Time? explores the reasons for the recent decline of the value of truth in society, and How to Save the Truth provides a list of concrete measures to raise the importance of truth again. (Get alerted when they are released.)
But first, in the spirit of truthfulness, it’s important to corroborate the examples above. Trump’s case is perhaps the most straightforward, as his statements have been systematically tracked and quantified: The Washington Post documented 30,573 false or misleading claims (not explicitly called lies, as that would imply intent) during his first term—an average of about 21 per day, giving rise to a world of so-called “alternative facts” and “post-truth politics.” The situation is no better with Xi and Putin. Many of their statements on international matters have been thoroughly debunked—for example, Xi’s claims regarding the treatment of Uyghurs in Xinjiang and the origins of COVID-19, and Putin’s misrepresentation of NATO and the war in Ukraine. Fewer falsehoods about domestic affairs come to light in their cases, but only due to the severe suppression of free speech. In fact, this very suppression might be the clearest indicator of dishonesty: if you need to suppress voices, you’re not on the side of truth.
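As a quick sanity check of that daily average, here is a minimal Python sketch (assuming a standard four-year term, January 20, 2017 to January 20, 2021):

```python
# Sanity check: 30,573 documented claims over one four-year term.
# Term boundaries are an assumption (standard U.S. inauguration dates).
from datetime import date

claims = 30_573
term_days = (date(2021, 1, 20) - date(2017, 1, 20)).days  # 1461 days

print(f"{claims / term_days:.1f} claims per day")  # -> 20.9, i.e. about 21
```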
To dispel any comfortable explanations right away, the issue doesn’t lie with these three individuals alone. They didn’t seize power through political coups, now ruling over a truth-seeking majority. Trump’s stance on truth—evident both directly through his well-documented falsehoods and indirectly through his felonies and disregard for democratic values[2]—was widely visible long before Americans freely voted him into a second term in 2024. Similarly, Putin’s approval ratings are reported to hover around 85%, a 15-point increase following his invasion of Ukraine.[3] Xi’s ratings aren’t much lower either.[4] Citing state propaganda as an excuse doesn’t hold up: everyone in Russia and China knows exactly what happens if they speak out openly and that corruption—which thrives on lies and dishonesty—is not just widespread but deeply embedded in the system. It’s easy to blame the leaders, but the uncomfortable truth is that it’s not them—it’s us, the people.
However, have things really gotten worse in this respect? Human nature probably hasn’t changed recently. The feeling that truth is at an all-time low—implying that it used to be higher—may itself be deceptive. Could we be suffering from a distorted perception of reality, the classic “good old days” syndrome? Perhaps the central question of this article series, “How to make people care about truth again,” suffers from the same flaw as “Make America Great Again”: we’d need to specify what “again” actually refers to, and doing so might reveal that the perception of a better past is just an illusion.
The first approach to answering this question would be to look for scientific research. Unfortunately, there seems to be a lack of longitudinal studies on the subject. Most analyses of deception focus on what people lie about, who they lie to, and why they lie.[5] Some studies do examine lying frequency, concluding that small groups of “prolific liars” account for the majority of lies spread.[6] However, none have analyzed overall lying behavior over extended periods. Later, it will be argued that filling this gap is crucial—after all, we can only improve what we can measure—but for now, we need to explore other ways to approach the question.
Returning to the example of politicians, it’s fair to say that the predecessors of all the mentioned figures—Jiang and Hu in the case of Xi, Gorbachev and Yeltsin in the case of Putin, and Bush and Obama in the case of Trump—had a healthier relationship with truth. Of course, this is not to say they were angels of truth—far from it. It’s just a comparison, and if we end up with politicians who lie less, we should see it as a positive development.[7] Moreover, the issue isn’t limited to these three countries. The global rise of populism reflects this trend for several reasons. Populist rhetoric often appeals to emotions, fears, and hopes rather than empirical evidence (“what feels true is true”). It simplifies complex issues to create “us vs. them” narratives, often omitting inconvenient truths or distorting facts. It delegitimizes mainstream media, experts, institutions, and science as part of the “corrupt elite,” blurring the lines between truth and falsehood. It frequently relies on a cult of personality, where personal narratives are fabricated or exaggerated to enhance appeal. And it often comes with authoritarian tendencies, as leaders seek to control the narrative through lies and propaganda. All of this points to a declining valuation of truth worldwide.
The phrasing “points to a declining valuation of truth” is deliberate, because these developments don’t necessarily prove that truth itself is no longer valued. Theoretically, people could still value truth, but other factors might be considered more important. This could explain voting behavior that appears anti-truth but isn’t necessarily so—for example, someone might think, “I don’t like Trump’s lies either, but I believe he’ll be good for the economy.”[8] However, this explanation—an easy way out for preserving the idea that people still value truth—doesn’t hold up. Studies have shown that fact-checking Trump’s statements had no noticeable impact on voter behavior.[9] Instead, false claims are often either rationalized as true or dismissed with thoughts like, “All politicians lie,” implying that once lies are told, their quantity or severity no longer matters.
The normalization of conspiracy theories—such as QAnon or claims of widespread election fraud during the 2020 U.S. elections—provides further evidence of the decline in the value of truth. Once fringe beliefs, conspiracy theories have entered mainstream political discourse, indicating a shift in the public’s threshold for credible evidence. Studies showing that many individuals struggle to distinguish between factual statements and opinions support this picture.[10] Beyond this, numerous other developments make it highly plausible that disinformation has risen in recent years (as analyzed in the next post, as well as RAND’s excellent report on Truth Decay). For now, for the sake of truth, there’s no way around admitting it: there is indeed a growing disregard for truth.
Truth may seem like the default starting position, where deviations—such as lies—need to be explained or justified. However, the opposite is the case: we weren’t born with a built-in commitment to truth, but learned this concept over many years. It comes at a cost: thinking critically takes effort, often slowing down decision-making and requiring self-discipline, unlike the much easier “anything goes” mindset. So, to understand why truth is under pressure today, we first need to step back and consider why we value it at all.
Many reasons for valuing truth can be summed up as benefits from better models of reality. Truth is grounded in facts, and knowing the facts helps us navigate the world. Either there is a lion behind the hill, or there isn’t—it’s very useful to know. Truth also enables us to learn: we take action, reality provides feedback, and we adjust our future decisions accordingly. This isn’t just a rational process—our emotions are directly involved as well. We feel good when we improve our models, even if we haven’t yet used them for anything. Einstein described his realization that a person in free fall does not feel their own weight as the “happiest thought of his life.” This emotional reward likely evolved because, over time, organisms with more accurate models of reality had a survival advantage. As a result, nature “wired” us to feel immediate satisfaction from discovering truth, even before we put it to use.
This value comes not only from better individual models but also from society’s collective models: the scientific endeavor is our collaborative effort to get closer to the truth. We owe almost everything to it: our technology, higher life expectancy, reduced violence, and much more, resulting in an extremely high quality of life compared to historical standards. We must always remind ourselves of this; it’s incredibly easy to take our current situation for granted and forget that only the continued quest for truth made it all possible.
The second main group of benefits is social benefits—how others react to the way we handle truth. People don’t like being lied to because it makes their own quest for truth more difficult (exceptions are discussed further below), so they often punish dishonesty. These punishments can take many forms, from parents applying physical discipline or taking away a child’s privileges to friends distancing themselves—after all, no one seeks the company of a compulsive liar. As social animals, we learn these rules quickly, making social benefits just as important as the benefits of improved models.
Building on these two pillars are a range of effects that further reinforce the value of truth. Once we get a taste of it, our appetite only grows. For example, if we improve our models by learning about science, we become even more eager to close knowledge gaps—like a stamp collector with only a few missing pieces to complete a collection (at least it feels that way, which is all that matters). Similar reinforcing effects occur with social benefits. A society that highly values truth tends to punish those who go against it more severely—for example, through ostracization—encouraging even more people to stick to the truth. The same happens within ourselves: once we understand the value of truth, lying can feel even worse, as it brings guilt. Closely related to this, being honest with ourselves is another important dimension of truth—we don’t feel good if we can’t look into the eyes of the person in the glass.[11]
All the above are good, plausible, and solid reasons for sticking to the truth, right? Yes, yes, and no—they are indeed good and plausible, but they are not solid. If we read carefully, none of these points value truth intrinsically; they all treat it as a means to an end. While they make a strong case for why truth benefits us, they inevitably raise the question of whether we should stay loyal to the truth if circumstances happened to be a little different.
This question is far less hypothetical than it sounds—we lie all the time. For those who doubt this, the following statements might help refresh the memory: “I’m busy,” “Your email was in my junk folder,” “Your baby is cute,” “This meal is delicious,” “You’re improving a lot,” “I didn’t spend much on it,” “I like that thing you created,” “You don’t look fat in those pants at all,” “I love you,” “What a nice gift,” and many others. Lying also tends to increase in certain situations, with dating being a prime example. Or, what about the simple “I’m doing fine”? When grieving the loss of a loved one, answering truthfully can open the door to a conversation that feels too difficult to have. In such cases, most would agree that “I’m fine” is an acceptable response, even if it isn’t true.
In ways like this, we bend the truth all the time. This article just did it as well: “bending the truth” is itself a euphemism, for whatever is bent is no longer the truth but a deviation from the facts. Similarly, we tend to smooth the edges—another phrase that, like many euphemisms, involves a degree of dishonesty, as George Carlin aptly pointed out. Such smoothing can be very useful, as it lubricates social interactions. Yet, once again, it deviates from the truth.
The classic way to defend such lies is by calling them “white lies”—small distortions of the truth that are considered acceptable because they supposedly cause no real damage. However, white can quickly turn into grey or black (or even blue), and, like all colors, these shades are in the eye of the beholder. The potentially slippery slope of white lies will be explored in When is Lying Justified?
Moreover, we don’t only accept white lies. If we were living in Germany in 1943, and the Nazis asked whether we were hiding Jews in the attic, we would hopefully not adhere to Immanuel Kant’s categorical imperative, which insists on telling the truth at all times. Other examples of lying to protect someone include withholding the truth from a suicidal person about their spouse’s infidelity or taking the blame for something to shield another (“I took the $100”). Likewise, if our 100-year-old terminally ill grandmother finds emotional comfort in believing she will soon be reunited in heaven with her recently deceased husband, responding with “Granny, stop talking nonsense” wouldn’t feel right. If we have some heart, not only would we not contradict, but might even gently affirm her belief with an occasional, perhaps softened, “yes.”[12]
We also frequently lie to ourselves. When we don’t get what we want, a nonchalant “I didn’t want it anyway” can help us move on more quickly. At the very least, we focus on the negative aspects of what we didn’t get (or lost) or the positive aspects of not getting it (or losing it), distorting the true picture. If we were to criticize this kind of thinking, the question arises: what’s the alternative? Dwelling on past disappointments and getting upset over things we can no longer change only drains our energy. Instead, staying positive and looking ahead helps us tackle future challenges. It’s hard to condemn this mindset. For the purpose of this article, however, the key point is that, once again, we’re deviating from the truth.
Truth is often avoided for a simple reason: it can hurt—a lot. In fact, it may be what hurts us most psychologically (we’re not talking about physical pain here). This becomes especially clear when someone criticizes us with entirely made-up claims; we can often shrug them off or even laugh because they’re so absurd. But when they expose an uncomfortable truth, it hits a nerve. As the 13th-century Persian poet Rumi said, “The wound is the place where the Light enters you.” That’s why it’s so hard to face the truth about ourselves—and why speaking the truth about others can be dangerous. Literature is full of characters who suffered for telling the truth (e.g., Oedipus, Prometheus)[13], although we certainly don’t need fiction to find such cases. Most people forgive us our shortcomings, mistakes, and much else (more than we tend to think[14])—but very few forgive us for speaking the truth.
In most cases, truth is also for sale. If we were offered $100 to tell an innocuous lie, would we agree to it? If we decline out of steadfast noble principles, let’s put them to the test: what about $1,000? What about a billion? With a billion dollars, we could do immense good in the world—would refusing this offer not be the more morally questionable decision? Almost everything has a price[15], including truth, and for good reason—raising, once again, the question of how much truth there really is in the principle of always telling the truth.
Other cases where lies are commonplace and widely accepted include childhood myths (Santa Claus, the Tooth Fairy), lies as self-defense (such as in cases of extortion), and comical statements made in jest (“This is the best article on truth ever”). Lies are also used for positive surprises (e.g., pretending to forget a birthday to set up a surprise party) and to bypass bureaucratic hassles (e.g., exaggerating symptoms to receive quicker medical attention in an overwhelmed healthcare system). These are just a few examples of how lying is normalized in everyday life.
There are also cases that don’t involve outright lies but still conceal the truth—such as shielding children from the details of a horrific accident.[16] Or what about wearing clothes? It’s literally a cover-up. Of course, we certainly wouldn’t accuse someone of hiding the truth just because they’re dressed. But this isn’t about blame; it’s simply another example of how we take measures to keep certain truths hidden—something society not only accepts but actively expects from us. When it comes to exposing ourselves—not just physically, but emotionally—there are often legitimate reasons to hold back the truth.
What about the growing reverence for actors—who received hardly any social recognition 100 years ago[17]—whose main job is to pretend to be someone they are not? Of course, actors shouldn’t be accused of deception: there is a mutual understanding that they are portraying a role. In fact, their goal is often to convey that role as truthfully and realistically as possible, revealing deeper truths about the subject they portray. In other words, deviating from literal truth can sometimes serve to reveal more profound truths, much like fairy tales or fiction often do. Such cases—where truth may be sacrificed to achieve a greater truth—will be elaborated later. For now, this provides yet another example of how deviation from truth is not only commonplace but widely accepted.
Considering the wide range of ways we deviate from the truth, is there any way to preserve the idea of its intrinsic value? One attempt might be to argue that all the cases mentioned are merely exceptions to an otherwise stable rule. But how true can a rule be if it has so many exceptions? In fact, it takes far less to make us skeptical—in science and rational thinking, a single exception is enough to falsify a rule, as will be explored in On Rules and Laws.
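For readers who prefer it stated formally, this asymmetry is a textbook result of first-order logic (a standard formalization, added here purely for illustration):

```latex
% A single counterexample refutes a universal rule:
\exists x\, \neg P(x) \;\Longrightarrow\; \neg \forall x\, P(x)
% while any finite number of confirmations fails to establish it:
P(a_1) \land P(a_2) \land \dots \land P(a_n) \;\not\Longrightarrow\; \forall x\, P(x)
```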
To emphasize the above point even further, let’s look at some cases where people go to great lengths to avoid knowing the truth. A striking example is the story of televangelist Peter Popoff, who astonished audiences by claiming to receive divine revelations about their personal details, illnesses, and addresses. However, skeptic James Randi and his team exposed the truth: Popoff was using an earpiece to receive information from his wife, who had gathered it beforehand from prayer cards filled out by attendees. The reaction of Popoff’s followers was astonishing. Instead of directing their anger at him, many turned on those who had exposed the scam—feeling their hopes had been shattered.
Another revealing story is that of psychologist Ray Hyman, who participated in a test of applied kinesiology—a method in which practitioners claim to determine whether a substance is “good” or “bad” for a person based on muscle resistance. In the test, practitioners placed drops of either glucose or fructose on volunteers’ tongues, then pushed down on their arms to see if they could resist. If the volunteer’s arm held firm, the sugar was deemed “good”; if not, it was considered “bad.” The method is well known in alternative-medicine circles but has no scientific basis. To evaluate its validity, researchers conducted double-blind tests—neither the practitioners nor the volunteers knew which type of sugar was being used. The results were clear: there was no connection between muscle resistance and the type of sugar given. When these findings were announced, the head practitioner turned to Dr. Hyman and said, “You see, that is why we never do double-blind testing anymore. It never works!” At first, Dr. Hyman thought he was joking. But from the practitioner’s perspective, the conclusion was entirely logical: since he already “knew” applied kinesiology worked, the only possible explanation was that there was something wrong with the scientific method.[18]
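To make the logic of the double-blind evaluation concrete, here is a minimal Python sketch with invented counts (the source does not report the raw data). If sugar type truly influenced muscle resistance, randomly reshuffling the sugar labels should rarely reproduce a difference as large as the observed one:

```python
# Permutation test on hypothetical double-blind results (invented numbers).
import random

# (sugar_given, arm_held_firm) per trial; neither side knew the sugar type.
trials = ([("glucose", True)] * 24 + [("glucose", False)] * 26 +
          [("fructose", True)] * 25 + [("fructose", False)] * 25)

def firm_rate(data, sugar):
    results = [held for s, held in data if s == sugar]
    return sum(results) / len(results)

observed_gap = abs(firm_rate(trials, "glucose") - firm_rate(trials, "fructose"))

# If the labels are irrelevant, shuffling them should often produce
# a gap at least as large as the one observed.
random.seed(0)
labels = [s for s, _ in trials]
outcomes = [h for _, h in trials]
hits = 0
n_perm = 10_000
for _ in range(n_perm):
    random.shuffle(labels)
    shuffled = list(zip(labels, outcomes))
    gap = abs(firm_rate(shuffled, "glucose") - firm_rate(shuffled, "fructose"))
    if gap >= observed_gap:
        hits += 1

print(f"observed gap: {observed_gap:.2f}, p = {hits / n_perm:.2f}")
# A large p-value means the data look like pure chance: no connection.
```

With counts like these, the p-value comes out near 1—precisely the “it never works” outcome the head practitioner complained about.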
Such reactions, however, are only surprising to outsiders—those who fail to grasp what’s really happening. If we were in a situation where such a belief brought significant value, chances are we’d respond in the same way. Similarly, many people’s circumstances—shaped by their genes, upbringing, personal experiences, and social circle—naturally position them to see belief in a higher power as a source of immense benefit. This can mean an endless source of hope or a way to transform the fear of death into anticipation of eternal bliss. Arguably even more important is the alignment with one’s social circle. Abandoning a deeply ingrained belief can feel like breaking ties with almost everything close to one’s heart—one’s past convictions, friends, extended family, and most painfully, one’s parents. Turning away from them may feel almost impossible, considering they gave us life and, hopefully, a great deal of love as well. In such circumstances, who wouldn’t live the hell out of that story of heaven?
Many so-called “rational” thinkers struggle to understand this—and in doing so, become a little less rational themselves. It’s amusing to hear them say they can’t understand why people think the Earth is 6,000 years old, only to conclude that those believers are simply irrational. But they’ve admitted it themselves: they don’t understand it. And as always, when we don’t understand something, we should think about it until it makes perfect sense to us. We live in a world of cause and effect, and if people hold views that seem irrational, there must be a perfectly logical—or rational—reason behind them. Only by understanding the problem deeply—which often requires stepping into the opposite mindset (something many people find difficult)—can we engage with it meaningfully. By simply dismissing ostensibly irrational belief systems—or worse, calling their believers “stupid”[19]—the so-called rational person becomes the truly irrational.[20]
It’s easy to make this mistake when we overlook how different our incentive systems are. Many of us have gone through a long educational system that repeatedly rewarded truthfulness and penalized deviation—an experience that others may not have had to the same extent. We’ve built a social circle where expressing flat Earth views or belief in miracles would get us ostracized—or at least met with strange looks—while for others, it’s the exact opposite. On top of that, there are cumulative effects: in circles where truth is highly valued, it can even become a kind of “cult”—where knowing more facts than others places one higher in the hierarchy. This is something we simply don’t experience—nor strive for—if our social circle doesn’t reward it and if we gave up on being the most erudite person in the room a long time ago.
Incidentally, a key reason we believe in the importance of truth is that everybody seems to agree on it. Of course, what the truth is is often hotly debated, but its importance itself is rarely questioned. Ironically, this message is often louder among those who are furthest from the truth. An internet search for “flat earth truth” returns more hits than “round earth truth.” Similarly, a search for “seeking truth” leads to a multitude of religious sites, each emphasizing truth repeatedly. However, when someone declares “This is the truth” over and over, it should make us skeptical. A genuine truth seeker focuses on building arguments and presenting evidence, trusting that their reasoning will naturally lead people to the right conclusion—not insistently reinforcing a desired outcome like a relentless litany. This may be a form of offensive defense—anticipating the inevitable objection that their claim contradicts facts and aiming to make themselves and others believe it through sheer repetition (which can be highly effective) or at least blurring the lines between truth and falsehood.[21]
As Jeremy Bentham pointed out, we are steered by the masters of pleasure and pain, and in a way, prisoners within the cage of our incentive system. To illustrate this, let’s consider an extreme example: Imagine we were offered every conceivable good—paradise on earth, reunion with lost loved ones, the undoing of every past wrong, and eternal bliss for ourselves and every living being on the planet. All of this would be ours in exchange (there’s always a catch!) for a single hour of torture. It would be difficult to argue against accepting such a deal. However, imagine we couldn’t commit in advance but were instead asked repeatedly during the hour of suffering whether we wanted to quit, ending the pain immediately. If the torture was “done right,” no one would be able to endure it.
While sticking to the truth involves much weaker rewards and punishments, the same principle holds. We are bound by our incentive system, and we value the truth only insofar as it serves our goals. When the truth aligns with our incentives, we use it—when it doesn’t, we don’t. In this respect, there is hardly any difference between people; the difference lies in how our incentive systems happen to be shaped.
Many rational, truth-seeking readers may not like the above because it seems to diminish—or even dismantle—the importance of truth. So what, then, is the conclusion? Can we simply lie whenever we want, deflecting any criticism by saying, “It’s just a result of the incentive system I happen to be trapped in”? Can that be right?
First, disliking what has been said is irrelevant—it’s not about what readers of this article like or dislike. This type of reasoning is precisely what rational people criticize when others believe or spread falsehoods based on emotions. This article seeks to explore the truth by putting all facts on the table. We should remain objective, verify whether these facts are indeed the facts, and think about what conclusions can be drawn from them.
Second, sometimes it is wiser to tear down a structure—whether a physical building or the construct of our attitude toward truth—to rebuild it on a stronger foundation. Telling people to always stick to the truth is not built on solid ground: they will quickly respond by listing the many ways we frequently deviate from the truth, causing us to lose our footing rather quickly. We need to think more deeply about better guiding principles for handling truth—which is what this article series aims to explore.
Third, showing understanding for why people deviate from the truth can be key to addressing the issue. The opposite approach—dismissing others with “they’re just bad”—does nothing to advance progress. Quite the opposite: approaching people this way only triggers their defenses and ensures they won’t change their views. The key lies in understanding their situation (and even showing empathy) while helping them see greater value in truth. At first, this may sound a little naïve or overly soft, but it isn’t. Punishments—including drastic ones—can be just as valid a tool to “help” people see more value in truth as rewards. More about this in How to Save the Truth.
Fourth, the decision about which actions to take isn’t solely determined by our incentive system, so we cannot use this as an excuse to lie. The time horizon of our actions’ implications also plays a key role: two people can have identical incentive systems yet think very differently about the balance between short- and long-term goals. Those who can delay short-term gratification in favor of long-term gains will inevitably reach different conclusions about what to do. This is especially relevant in the context of truth, as understanding and valuing truth often require a long-term perspective. If we’re concerned only with the next 10 seconds of our lives, we won’t care that a lion is lurking behind the hill. From this perspective, the rise of short-term gratification tools—such as the internet providing instant pleasure through funny cat videos, pleasing tweets, or porn—may have contributed to the erosion of long-term thinking and, consequently, a diminished regard for truth.
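A toy discounting model makes the role of time horizon concrete (all payoffs and discount factors are invented for illustration): identical facts lead a long-horizon thinker to act on an unpleasant truth, and a heavily present-biased thinker to ignore it.

```python
# Toy model: the value of acting on an unpleasant truth under
# exponential discounting. All numbers are invented for illustration.

def discounted_value(payoffs, discount):
    """Sum of payoff_t * discount**t over future periods t."""
    return sum(p * discount**t for t, p in enumerate(payoffs))

# Acting on the truth: a small cost now (the detour), a big benefit later.
act_on_truth = [-10, 0, 0, 0, 100]
# Ignoring it: comfortable now, catastrophic later (the lion).
ignore_truth = [5, 5, 5, 5, -200]

for discount in (0.95, 0.30):  # long-horizon vs. heavily present-biased
    v_act = discounted_value(act_on_truth, discount)
    v_ignore = discounted_value(ignore_truth, discount)
    choice = "act on the truth" if v_act > v_ignore else "ignore the truth"
    print(f"discount {discount}: act={v_act:7.1f}, ignore={v_ignore:7.1f} -> {choice}")
```

With a discount factor of 0.95 the future payoff dominates and acting on the truth wins; at 0.30 the distant catastrophe is discounted into irrelevance and ignoring the truth becomes the “rational” choice.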
Closely related to this, a key question is whether our incentive system accounts for the interests of others in some way. If we care only about ourselves, then our time horizon for our actions—which may impact many future generations—shrinks to however many years we expect to have left. An 80-year-old politician who only thinks of himself may consider only the short-term implications of his actions—perhaps the next 10 to 15 years—and find it much easier to deny facts that will materialize further in the future, such as climate change.[22] The relationship between truth, long-term thinking, and egoism will be elaborated further in Why is Truth Having a Hard Time?.
There’s also a fifth point that needs to be mentioned in the context of our discomfort with denying truth any intrinsic value. A key distinction in ethical decision-making is the difference between theory and practice. To analyze something clearly—without cluttering our minds—we first need to examine it in isolation, from a purely theoretical, ceteris paribus (all other things being equal) perspective. In the second step, we consider how our assessment changes in practice, where many other factors come into play. It’s like an experiment in a laboratory, shielded from outside influences, where the subject is treated as if in a vacuum to understand its core characteristics. This distinction is discussed in greater detail in the concept of “The Wall of the 1,000 Filters” in the free eBook The History, Present and Future of Happiness (from page 92). So far, the analysis in this article has remained purely theoretical. Thus, stating that truth lacks intrinsic value does not imply that we can ignore it in practice—in fact, adhering to truth may still be a highly useful rule. Paradoxically, even if we conclude that truth has no intrinsic value in theory, in practice, it may still be beneficial to claim that it does—which, ironically, would be a lie in itself but potentially a useful one. More about this in When is Lying Justified?
Another way to make it clear that truth cannot have truly intrinsic value is the following: Let’s imagine we ask someone why they believe truth has intrinsic value. One type of response might be that it cannot be questioned—a simple rule we must follow, passed down by parents, philosophers, or scripture. This “someone else said so” or “just because” type of reasoning clearly cannot be a valid argument (as will be elaborated in On Rules and Laws). Another response might be: “Truth is valuable because…” followed by an explanation of what it aims to achieve. At that point, the specific reasons no longer matter—because by answering in this way, one implicitly acknowledges that truth serves a deeper objective, making it non-intrinsic.
Assigning truth intrinsic value is not only incorrect in theory but also poses real-life dangers in practice. Since this idea can be easily refuted, it risks undermining the very value we derive from truth. For example, if we emphasize the absolute importance of never lying and then inevitably get caught telling an innocuous lie (as discussed earlier), some may take this as justification to reject truth altogether—failing to stick to it when it truly matters.
Another danger arises when sticking to the truth is used as a justification to cause harm for personal gain. For instance, let’s revisit the case where Nazis ask where Jews are hidden. At the time, many people had a personal interest in denouncing Jews—whether to seize their property, retaliate over past conflicts, or act out of jealousy toward the wealth they had accumulated through hard work. Such individuals could expose the Jews’ location and justify it by saying, “I don’t want to lie! I’m just sticking to the truth!” In other words, they hide behind the truth. Rather than relying on rigid rules, we must take responsibility for considering whether our actions do good or cause harm. To answer this chapter’s title question, “The End of Accountability?”—No, this is not the end of accountability, but the beginning.
The bottom line is this: The truth about truth is that it’s not about truth. Does that mean we can ignore truth altogether? No—but it is crucial to understand why, and when deviating from the truth may be justified. The next post, When is Lying Justified?, aims to build a model and decision tree to answer these questions.
—
[1] The reference to political figures in this article, particularly Donald Trump, is not an endorsement or condemnation of any political stance. Trump is highlighted primarily due to his visibility and the extensive documentation of his statements, making him a widely recognized and illustrative example. Whether his overall actions serve the reader’s interests, the United States, or the world at large is a separate debate and beyond the scope of this article. Readers may argue that he is the best option overall. However, claiming that Trump consistently values truth or prioritizes interests beyond his own—an essential factor in adherence to truth, as discussed below—would be difficult to support with evidence.
[2] Democratic values often align with respect for truth, as democracy is built on informed decision-making, accountability, and the rule of law. Moreover, debate is central to any democratic system—and meaningful debate requires a foundation of facts.
[3] Naturally, in an environment dominated by state-controlled media and suppression of dissent, obtaining truly reliable approval ratings is challenging. The reported 85% approval rating comes from the Levada Center, a Russian polling organization that claims to be independent. While such figures should be met with skepticism, there are reasons to believe they are not entirely fabricated—unlike the likely inflated numbers from state-run agencies such as VTsIOM. Notably, Levada has previously reported lower approval ratings for Putin, which would not align with the Kremlin’s interests if it were merely a propaganda tool. Furthermore, independent international studies, such as Russian Public Opinion in Wartime by NORC at the University of Chicago, have also found strong public support for Putin, lending credibility to these findings. For additional context, see Wikipedia’s entry: Public Image of Vladimir Putin.
[4] Finding reliable statistics on Xi Jinping’s popularity is even more difficult than for Putin, given China’s tight control over information, state-run media, censorship, and the absence of independent polling. A 2014 survey by the Harvard Kennedy School’s Ash Center for Democratic Governance and Innovation found that Xi received a composite rating of 8.7 out of 10, suggesting strong domestic support. However, such high approval ratings may be influenced by a lack of truly anonymous polling. Studies using indirect questioning methods indicate that while overall approval remains significant, it likely falls within the 50–70% range when anonymity is assured.
[5] Serota, K. et al. (2010): “The Prevalence of Lying in America: Three Studies of Self-Reported Lies”
[6] Serota, K. et al. (2021): “Unpacking variation in lie prevalence: Prolific liars, bad lie days, or both?”
[7] This isn’t sarcastic politician bashing; there’s a good reason for their lying—the issue is in the system—as elaborated further in Why is Truth Having a Hard Time?.
[8] Such sacrificing of fundamental values for perceived short-term gains has often proven catastrophic. For example, in 1930s Germany, many acknowledged but downplayed Hitler’s antisemitic rhetoric, telling each other, “I don’t like what he says about the Jews either, but look at what he’s done for the economy!” More on this kind of ethical surrender later.
[9] For example, see Nyhan, B. et al. (2019): “Taking Fact-checks Literally But Not Seriously? The Effects of Journalistic Fact-checking on Factual Beliefs and Candidate Favorability.” Moreover, fact-checks delivered in a confrontational manner may even have reverse effects, reinforcing false beliefs; see When Fact-Checks Backfire.
[10] For example, see Study: Americans struggle to distinguish factual claims from opinions amid partisan bias, Pew study finds Americans can’t tell fact from opinion and 86% of American 15-year-olds can’t distinguish fact from opinion. Can you?
[11] The Guy in the Glass, a poem by Dale Wimbrow (1934):
When you get what you want in your struggle for pelf,
And the world makes you King for a day,
Then go to the mirror and look at yourself,
And see what that guy has to say.
For it isn’t your Father, or Mother, or Wife,
Whose judgement upon you must pass.
The feller whose verdict counts most in your life
Is the guy staring back from the glass.
He’s the feller to please, never mind all the rest,
For he’s with you clear up to the end,
And you’ve passed your most dangerous, difficult test
If the guy in the glass is your friend.
You may be like Jack Horner and “chisel” a plum,
And think you’re a wonderful guy,
But the man in the glass says you’re only a bum
If you can’t look him straight in the eye.
You can fool the whole world down the pathway of years,
And get pats on the back as you pass,
But your final reward will be heartaches and tears
If you’ve cheated the guy in the glass.
[12] In this context, another thought may make such a lie easier to accept: it won’t be around for much longer, dissolving along with the person who carries it in the not-too-distant future. It’s a harsh thought, but this article is about the truth—and truth can be harsh at times. This also highlights an important aspect: the impact of a lie depends on its context, duration, and reach. A lie told at the beginning of someone’s life can profoundly shape their world, while a lie near the end may simply offer comfort with little lasting consequence—as will be explored later.
[13] Compliments for showing interest in this footnote. In Sophocles’ Oedipus Rex, Oedipus relentlessly pursues the truth about the plague afflicting Thebes, only to discover that he himself is the cause—having unwittingly killed his father and married his mother. Though others, including his wife-mother Jocasta, beg him to abandon his search, he refuses, determined to uncover the truth. Once revealed, this knowledge leads to his downfall: Jocasta takes her own life, and Oedipus blinds himself in despair before going into exile. In Aeschylus’ Prometheus Bound, Prometheus suffers for speaking a different kind of truth—one that defies the gods. After stealing fire from Olympus and giving it to humanity, he also prophesies that Zeus’s rule is not as secure as the god believes. For daring to empower humans and for revealing a truth Zeus does not want to hear, Prometheus is chained to a rock, where an eagle eats his liver daily, only for it to regenerate each night—an eternal punishment for his defiance.
[14] We tend to underestimate how much others forgive us due to the “spotlight effect”—a psychological phenomenon in which individuals overestimate the extent to which their actions and appearance are noticed by others. This egocentric bias makes people feel as though they are under constant social scrutiny, even when they are not. For more on this, see Mu, F., Bobocel, D.R. (2019): “Why did I say sorry? Apology motives and transgressor perceptions of reconciliation” (Journal of Organizational Behavior).
[15] This is reminiscent of the anecdote of a man who arrives in a small town and hears a rumor that every woman there offers love for a price. Outraged, he asks, “What? Don’t you have any decent women in this town?” to which someone replies, “Oh, we do! But those are very expensive.”
[16] There are also other situations where we don’t speak the truth, yet it’s unclear whether they qualify as lies. For example, someone might say, “Christopher Columbus discovered America in 2013,” unintentionally mixing up the years 2013 and 1492. Conventionally, a lie requires intent to deceive, so this wouldn’t be considered a lie. However, by this logic, the question, “Is the statement that Christopher Columbus discovered America in 2013 a lie?” must be answered with, “It depends,” as it hinges on whether the statement was made deliberately to mislead—which also makes the question of whether it is a lie dependent on who made it. This highlights how tricky it can be to define lies. Regardless, the bottom line remains: if we include such cases, we speak even more falsehoods—intentionally or not—than previously considered. Our task is to reduce falsehoods overall, even when no bad intentions are involved. More on this in Why is Truth Having a Hard Time?
[17] See Kohansky, M. (1984): “The Disreputable Profession: The Actor in Society” (Greenwood Press)
[18] This story is retold in Derren Brown’s great book “Tricks of the Mind,” in the chapter “Anti-Science, Pseudo-Science and Bad Thinking.”
[19] To avoid any misunderstandings, this is not to suggest that stupidity doesn’t exist—it certainly does. The point above refers specifically to situations where an individual’s incentive system is structured in a way that logically explains actions that may not align with objective truth. However, this does not apply to cases where someone acts against their own incentive system due to a failure to grasp the consequences of their actions—such as supporting ideas, leaders, or policies that demonstrably harm them. In such instances, the term stupidity is entirely appropriate.
[20] This also applies to so-called “paradoxes”—which are not true paradoxes but rather reflections of our limited understanding of how they arise. For example, the existence of deeply religious scientists may initially seem contradictory, as science is typically associated with skepticism, evidence-based reasoning, and a willingness to revise beliefs, while religion is often linked to faith and adherence to spiritual doctrines. However, this perceived contradiction assumes that people apply the same style of thinking to every aspect of their lives—a premise that falls apart upon closer examination. In reality, people often compartmentalize different ways of thinking depending on context. If an individual’s incentive structure aligns such that scientific reasoning maximizes their professional success, while faith-based reasoning enhances their emotional well-being or sense of purpose, then this dualism is not only understandable but entirely logical.
[21] The blurring of lines between truth and falsehood is epitomized in the following anecdote. While generating cover images for this blog post, AI was used. After a few iterations, the AI began replacing the word “Truth” with “Trump” (see the conversation here). After the initial shock settled, the reasons behind this mistake were explored. The origin of this issue may be similar to the well-documented observation that AI-generated images of clocks almost always display the hands at 10:10 (as described here). Since AI relies on training data, and most pictures of clocks feature 10:10—a position commonly used in advertising (as explained here)—AI adopted it as a default pattern. In the case of “Truth” and “Trump,” the explanation could be similar: the word “Trump” has been extremely prevalent in media over the past few years, often appearing in discussions related to truth—whether in Trump’s own references to truth, reports on his stance toward truth, or the branding of “Truth Social.” However, despite its typographic similarity, truth (and also trust) is a fundamentally different concept—one that must never be conflated with Trump.
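A toy sketch of the suspected mechanism (the frequencies below are invented, not actual training data): a generator that merely reproduces training-data frequencies will default to the most common pattern, just as image models default to 10:10 clocks.

```python
# Toy illustration of frequency bias in generative models.
# Invented counts standing in for training-data label frequencies.
import random

random.seed(0)
training_counts = {"10:10": 90, "3:25": 4, "7:40": 3, "12:00": 3}

labels = list(training_counts)
weights = list(training_counts.values())
samples = random.choices(labels, weights=weights, k=1000)

most_common = max(set(samples), key=samples.count)
print(most_common)  # almost certainly "10:10"
```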
[22] Obviously, this isn’t referring to Donald Trump because, at the time of writing, he is 78 years old, not 80. In this context, it is worth recalling that lies can be justified when made in jest.