21 Lessons for the 21st Century (Yuval Noah Harari)

(All content reposted from Blinkist)

What’s in it for me? Futureproof yourself against the twenty-first century.


In an era of relentless change and uncertain futures, governments and individuals alike are grappling with technological, political and social issues unique to the twenty-first century. How should we respond to modern-day phenomena, such as frighteningly intelligent computers, globalization and the fake news epidemic? And what about the threat of terrorism – should we take action or take a deep breath and relax?

In these blinks, you’ll discover the answer to all these questions and more. You’ll learn how to futureproof your children by changing your approach to education, what robots and automation mean for the future of white-collar work and why the question of immigration is threatening to destroy twenty-first-century Europe.

Author Yuval Noah Harari has formulated some important lessons to help us deal with these fascinating times. These blinks seek to highlight the six most crucial ones.

In these blinks, you’ll also find out

how technological disruption led to Brexit;

why we have more to fear from cars than terrorists; and

why we need to teach our children less.


Computer technology is disrupting our financial, economic and political systems.


Throughout the twentieth century, three distinct political ideologies vied for world supremacy – communism, fascism and liberalism. Fast-forward to the late twentieth century, and liberalism, which celebrates democracy, free enterprise and individual freedoms, had emerged as the clear winner. But how will the West’s liberal-democratic system cope in the twenty-first century?

Disturbingly, its vital signs aren’t good – and the revolution in information technology is to blame.

From the 1990s onward, computer technology has arguably transformed our world more than any other force. But despite its massive impact, most politicians seem barely able to comprehend this innovation, and are even less capable of controlling it.

Just consider the world of finance. Computers have already made our financial system fiendishly complicated – so much so that very few humans are now able to understand how it works. As the twenty-first century continues and artificial intelligence advances, we may reach a stage where no human can make sense of financial data at all. The implications of this scenario for our political process are disturbing. Just imagine a future where governments have to wait patiently for algorithms to give them the green light on their budget or their tax-reform plans.

Unfortunately, for many twenty-first-century politicians, technological disruption isn’t at the top of the agenda. For instance, during the 2016 American presidential election, neither Donald Trump nor Hillary Clinton discussed the job losses that automation is likely to cause. In fact, disruptive technology was only really discussed in the context of the Hillary Clinton email scandal.

This wall of silence is causing many voters to lose faith in established governments. Ordinary people in liberal democracies across the Western world are feeling more and more irrelevant in this brave new world of artificial intelligence, globalization and machine learning. And this fear of becoming irrelevant has made them desperate to wield whatever political power they still have, before it’s too late. Not convinced? Just take a look at the political earthquakes of 2016. Both Brexit in the United Kingdom and Donald Trump’s election in the United States were supported by ordinary people, worried that the world and its dominant liberal political systems were leaving them behind.

Throughout the twentieth century, ordinary workers worried about their labor being exploited by economic elites. But these days, the masses are more afraid of losing their economic status in a high-tech economy that no longer needs their labor at all.


New discoveries in the field of neuroscience are enabling computers to take our jobs.


Although most experts agree that robotics and machine learning will change nearly all lines of work in the twenty-first century, we can’t predict what this change will look like. Will billions of people find themselves economically irrelevant within the next twenty years, or will automation result in wider prosperity and great new jobs for all?


Many optimists point to the industrial revolution of the nineteenth century, a time when fears that new machine technology would create mass unemployment were widespread. They note that, ever since, new technology has created a new job for each one it has made obsolete.


Unfortunately, there is good reason to assume that, in the twenty-first century, the impact of new technology on human employment will be much more destructive.


Just consider the fact that humans possess two sorts of abilities – cognitive and physical. In the previous industrial revolution, humans experienced competition from machines largely in the realm of purely physical abilities, while our cognitive abilities remained far superior to machines’. Therefore, even as manual jobs in industry and agriculture were automated, new jobs emerged that required the sort of cognitive skills particular to humans, such as analysis, communication and learning.


But in the twenty-first century, machines are getting smart enough to compete for these cognitive-based jobs, too.


Recently, neuroscientists have discovered that many of our choices, preferences and emotions are not the result of some magical human faculty, such as free will. Instead, human cognition comes from our brain’s ability to calculate different probabilities in the space of a split second.


These neuroscientific insights raise a troubling question: Will artificial intelligence eventually outperform people in professions requiring “human intuition,” such as law and banking? It’s highly probable. Computer scientists now know that what looked like impenetrable human intuition was really just our neural networks recognizing familiar patterns and making fast calculations about probabilities.


So, in the twenty-first century, computers might well be able to make banking decisions about whether or not to lend a customer money, as well as accurately predict whether a lawyer in a court case is bluffing or not. In other words, in the years ahead, even the most cognitively demanding jobs won’t be safe from automation.


Polarized debate over immigration is threatening to tear apart the European Union.

The world has never looked so small. The twenty-first century has ushered in changes unimaginable to our forebears. For instance, globalization has made it possible to meet people from all over the world. Unfortunately, it has also opened up new opportunities for conflict.

Indeed, as more of the world’s people cross borders in the hunt for better jobs and more security, our urge to expel, confront or assimilate strangers is putting our political ideologies and national identities to the severest of tests.

This immigration challenge is particularly pertinent in Europe.

In the twentieth century, the European Union was founded on the premise of overcoming cultural disparities between citizens of France, Germany and other European nations. But ironically, this political project may now collapse because of its failure to accommodate cultural distinctions between European citizens and new arrivals from the Middle East and Africa.

For instance, the growing number of new arrivals from these regions has sparked bitter debates among Europeans over issues of tolerance and identity.

Although it is broadly accepted that immigrants should make an effort to assimilate to their host country’s culture, how far this assimilation should go is a contentious subject. Some Europeans and political groups argue that new arrivals should cast off their previous cultural identities entirely, right down to their traditional styles of dress and their food taboos. These Europeans argue that immigrants arriving from a culture that is, say, deeply patriarchal and religious into a liberal European society should adopt their hosts’ feminist and secular norms.

In contrast, pro-immigration Europeans contend that since Europe is already highly diverse, with a wide range of values and habits represented among its native peoples, it is unfair to expect immigrants to assimilate to some abstract collective identity that most Europeans themselves don’t even relate to. These Europeans argue that we shouldn’t expect Muslim immigrants to convert to Christianity when the majority of British people don’t attend church themselves. And they question why immigrants from the Punjab should have to forgo their traditional curries in favor of fish and chips, given that most native Brits are more likely to be found in a curry house on a Friday night than in a fish-and-chip shop.

Ultimately, the issue of immigrant assimilation is far from clear-cut. Therefore, the lesson for the twenty-first century is that this debate shouldn’t be framed, as it often is, as a moral struggle between “fascist” anti-immigrationists and pro-immigrationists promoting the “suicide” of European culture. Instead, immigration should be discussed rationally, as both political viewpoints have some legitimacy.


Terrorist groups like al-Qaeda are masters of manipulation.


No one is better at playing mind games than twenty-first-century terrorists. Since the 9/11 attacks in 2001, terrorists have killed roughly 50 people in the European Union each year, and about ten in the United States.

Now consider that, over the same period, traffic accidents have killed around 80,000 Europeans and 40,000 Americans every year. Clearly, our roads pose a far greater hazard to our lives than terrorists do, so why are most Westerners more scared of terrorism than of driving?

Terrorism is a strategy typically employed by weak and desperate parties. It aims to change the political situation by sowing fear in the hearts of the enemy rather than by causing material damage, which terrorists usually aren’t strong enough to do. Although terrorists typically kill very few people overall, the twenty-first century has taught us that their campaigns can be ruthlessly effective.

For instance, although al-Qaeda’s 9/11 attacks killed 3,000 Americans and caused terror on the streets of New York, they inflicted very little damage on America as a military power. Post-attack, America had exactly the same number of soldiers, ships and tanks as it had before, and the country’s roads, communication systems and railways were unharmed. But the enormous audiovisual impact of the Twin Towers collapsing was enough for the nation to seek massive retribution. The terrorists wanted to cause a political and military storm in the Middle East, and they got one. Just days after the attacks, George W. Bush declared a war on terror, soon leading to the invasion of Afghanistan, the consequences of which still reverberate in the region today.

So how did this weak group of terrorists, with few military resources at their disposal, manage to manipulate the world’s greatest power into such disproportionate retaliation?

To answer this question, it's useful to think of terrorist groups like al-Qaeda as a fly buzzing around a china shop. This fly wants to break something, but it’s not strong enough to even move a teacup. However, it has a better idea. Standing in this china shop is a massive bull, and if the fly can buzz in his ear and annoy him, the bull, in his attempts to kill the fly, might eventually break everything himself. In the case of 9/11 and the war on terror, the Islamic extremist fly succeeded, and the United States bull, driven by anger and fear, all but destroyed the Middle Eastern shop. Today, the fundamentalists are flourishing amid the carnage left behind.

The lesson for the twenty-first century? Terrorists win when mighty governments overreact.


Twenty-first-century humans are far more ignorant than we realize.


For centuries, liberal societies have placed a huge amount of trust in the ability of individuals to think and act rationally. In fact, our modern societies are founded on the belief that each human adult is a rational, independent agent. For instance, democracy is based on the notion that voters will know what is best. Our system of free-market capitalism is premised on the idea that customers are never wrong. And our liberal system of education instructs pupils to engage in independent thinking.


But in the twenty-first century, placing so much faith in our ability to act rationally is a grave mistake. Why? Because modern humans, as individuals, know appallingly little about how the world actually works.


People in the Stone Age knew how to hunt, turn animal skins into clothes and get a fire going. Modern man is far less self-sufficient. The problem is that, even though we require experts to fulfill almost all our needs, we falsely think that, on an individual level, we know much more than our Stone Age ancestors.


For instance, in one experiment, participants were asked whether they understood how zippers work. Although most participants confidently replied that they did, when they were asked to elaborate on this knowledge, most were revealed to be clueless about how this everyday mechanism actually works.


The lesson for the twenty-first century? Modern man often falls prey to what scientists have dubbed “the knowledge illusion.” That is, individuals tend to believe they understand a lot simply because they treat the knowledge that other people possess – for instance, how a zipper functions – as though they possessed it, too.


The consequence of the knowledge illusion is that individuals, such as voters and government officials, fail to appreciate just how complex the world really is and how ignorant they are of that complexity.


Thus, we see individuals who know almost nothing about the field of meteorology proposing climate change policies, or politicians forcefully espousing solutions to conflicts in Ukraine or Iraq, even though they couldn’t find these countries on a map.


So next time someone gives you their opinion, dig a little deeper to find out how much they really know about the subject in question. You might be surprised.


Twenty-first-century schools need to give students less information and more critical thinking abilities.

A child born in the year of writing will be in his or her thirties in 2050 and will hopefully still be alive in 2100. But what sort of education would help this child prosper well into the next century?

For children of the twenty-first century to flourish and become capable adults, we need to radically rethink our schooling system. In other words, the schools that got us here won’t get us there.

Currently, schools tend to place too much emphasis on cramming their students with information. This approach made a lot of sense in the nineteenth century, because information tended to be scarce. Radio and television did not yet exist, and in many places there were no daily newspapers or public libraries. Additionally, even the information that did exist was regularly subject to censorship; in many countries, there was little reading material in circulation apart from religious texts and novels. Consequently, when the modern schooling system was introduced, with its focus on imparting the essential facts of history, geography and biology, it represented a huge improvement for most ordinary people.

But living conditions are very different in the twenty-first century, and our educational systems are now hopelessly antiquated.

In today’s world, we are flooded with information – if anything, too much of it – and most governments no longer attempt to censor it. People all over the world have smartphones and could spend all day, every day, perusing Wikipedia, catching up on TED talks and taking online courses, if they had the time and desire to do so.

The problem for modern man is no longer a scarcity of information but the flood of misinformation that comes with it. Just consider all the fake news that many of us wade through each time we browse our social media feeds.

In response to this information overload, schools should stop shoving even more data down children’s throats. Instead, twenty-first-century children need to be taught how to make sense of the vast amount of information that bombards them on a daily basis. They need to learn how to distinguish between important information and irrelevant, or downright fake, news. In the twenty-first century, information will always be at our fingertips. The truth, however, will be harder to find.


Final summary


The key message in these blinks:

In this century of constant technological and political upheaval, we can prepare ourselves for the future by acknowledging our own ignorance in the face of increasing complexity, and discussing hot political topics, such as immigration, with calm rationality. We can also futureproof ourselves by learning to tell the difference between real and fake news. Although the twenty-first century has brought fears of terrorism and mass unemployment, we should remember that, ultimately, the key to our prosperity and security remains in our own hands.

Actionable advice:

Truth doesn’t speak to power.

It’s easy to assume that powerful leaders always have the inside track on situations, or know the truth about what other people think. But the reality is that great leaders are often less well informed than the average person. Why? Because, as people become more and more powerful, those around them become less and less likely to divulge hard-hitting truths to them. Instead, those around these leaders become more concerned with flattering them and with ensuring they themselves don’t say anything inappropriate or confusing during the short time they have the leader’s ear. Therefore, if you want the truth, try hanging around on the periphery of power, rather than at its center. You just might learn something.
