Out of Character Part 4

Harry Watanabe opened a small gift shop in Omaha, Nebraska, in 1932. His inventory was unique, consisting mostly of trinkets from Japan. His business, which he eventually named Oriental Trading Company, soon expanded into seventeen shops throughout the Midwest. Harry had two children, Terrance and Pam, and in keeping with Japanese tradition, Harry dreamed that one day Terrance would take over for him as head of the family business. In 1977 his dream was realized: Terrance became president of the company. Moving the focus of production to party supplies and favors, Terrance oversaw the rise of a business that would eventually serve 18 million customers, churn out 25,000 products, employ 3,000 workers, and earn $300 million.

You know the sort of trinket we're talking about: spider rings, rubber bouncy balls, key chains, and those miniature pink muscle men that expand when placed in water. Americans have been rooting through cereal boxes in search of just such prizes for decades. For us, these small plastic delights have been the surprise found in the hollow center of the traditional Italian uova di Pasqua (chocolate Easter eggs) passed around at our own family Easter celebrations for as long as we can remember. Indeed, a browse through the Oriental Trading Company's online catalog will surely be a walk down memory lane for any child raised in the last half century.

Terrance was incredibly devoted to the success of the company, so much so that his friends and family couldn't help noting how he was never able to maintain a close romantic relationship. How proud Harry must have been of Terrance, making such sacrifices to devote his life to the family business. How happy Terrance must have been to be the source of such pride. And indeed, how much pleasure Terrance himself must have taken in his own professional and financial success. How surprising it is, then, that in 2000, after shepherding the family business so responsibly for over two decades, Terrance sold the company and proceeded to blow through most of his hard-earned proceeds at Vegas casinos. We're not talking just a few thousand dollars. Terrance Watanabe lost a mind-blowing $127 million in a single year. Doesn't it seem quite strange that someone so successful, who had earned a fortune by making intelligent, calculated decisions about costs and benefits, could so foolishly fall prey to the lure of the flashing casino lights?1

You may be tempted to lump Terrance, who clearly had developed a gambling problem, in with all other addicts. And depending upon your views of addiction, you might see him as weak, unable to overcome the temptation of winning on the next hand. You might see compulsive gambling as a character flaw, signaling a type of person who is unreliable, untrustworthy, and certainly not the sort with whom one would want to conduct business. But is the person who loses $127 million at the casino so different from the person who plays the stock market or dabbles in real estate? The difference between these types of individuals, we'll argue, isn't so much in their character (after all, all three of these activities are high-stakes gambling) as in their sensitivities to risk and reward. As we intend to show in this chapter, our perceptions of risk and probability can change on a dime and are subject to the push and pull of dueling forces in the mind. If you acknowledge this fact, then suddenly the woman pouring her weekly paychecks into the slot machines in Atlantic City might not seem as deviant or flawed as you think.

Consider an entirely different kind of situation involving a risk that most people have personally experienced: flying on airplanes. Surely you've heard the statistic that you're more likely to get into a fatal accident in your car on the way to the airport than to be killed in a plane crash, but for the many people who fear flying, this fact doesn't always provide much comfort. Though we may know that the probability of a plane crash is low, our intuitions are harder to convince. That's because when emotions run high, our assessments of probability and risk are skewed by all kinds of cognitive biases. For example, studies show that after a high-profile plane crash hits the headlines, people estimate the likelihood of being killed in a crash as being much higher than they might have the day before. Again, rationally, this doesn't make sense. The likelihood of dying in a crash on March 26, 1977, was almost exactly the same as the likelihood of dying in a crash on March 28, 1977, but it probably didn't feel that way to the many people who watched the footage of the 583 bodies being pulled from the wreckage after two 747s collided at the Tenerife airport in the Canary Islands on March 27. Indeed, much research has shown that simply being able to recall something easily and vividly, like a recent and well-publicized tragedy, makes it suddenly seem more likely to occur again even though the odds have not objectively changed.2

To understand why this irrational fear persists, think about the relative visibility of plane crashes vs. car crashes. Every day most of us see hundreds of cars safely traveling through city streets. Very occasionally we'll witness an accident, but we witness vastly more safe trips than crashes. Not the case with airplanes. Unless you're an air traffic controller, chances are you've been exposed to a fairly high ratio of accidents to safe landings. After all, every time a plane crashes, the images are splashed all over the news for days, sometimes weeks, but we don't see the thousands and thousands of planes that take off and land safely every day. In short, we are selectively exposed to the catastrophes. And these vivid images are seared into our brains, creating expectations of harm that are detached from the statistical realities.

These same kinds of cognitive errors are at work when someone such as Terrance Watanabe steps up to the craps table. In the same way that rational judgments of the probability of a crash take a backseat when evaluating the safety of air travel, the logical probabilities involved in losing or winning at the craps table can be lost on gamblers. When we gamble, we tend to focus on the possibility, not the probability, of winning, much the way fliers focus on the possibility, not the probability, of dying in a fiery crash.

It turns out that almost everyone, not just compulsive gamblers and fearful fliers, can be biased when it comes to judging probability and weighing the potential for risks and rewards. This fact certainly flies in the face of rational models of human decision making, which suggest that people make decisions about risk by carefully and methodically calculating the likelihood of possible outcomes. But if you haven't guessed yet, we and other psychologists of our ilk don't put much stock in such models. Sure, it would be nice if decisions were made by making use of all available information and rationally weighing all the costs and benefits. If this were the case, then our decisions about whether to play the next hand, get on that transatlantic flight, risk taking that shortcut through a bad neighborhood, or skip the birth control this time would generally turn out all right. But unfortunately, the ant and the grasshopper are not truth seekers: each recruits all the psychological ammunition it can to convince you to go all in with a pair of twos, pop four Xanax to get you through the trip home, make that condom seem too far away to reach for, or have you reach for the Purell gel every time you shake a hand.

Risk and distance: Smelling the cookies makes them harder to resist.

Food and sex. These are two things that are pretty much universally enjoyed. But they are also two things that consistently cause us to make errors in judgments of risk. Just as with gambling, when it comes to food and sex, what seem to be failures of will, like eating that second piece of cake or cheating on one's significant other, often actually boil down to our inability to accurately weigh the short-term rewards of our actions (e.g., satisfying a sweet tooth or a carnal urge) against the long-term risks (e.g., weight gain or ruining a relationship). When you're considering whether or not to add extra cheese to that pizza or buy those reduced-fat Wheat Thins, chances are you don't often stand there calculating the long-term risks involved in eating too much salt or fat, right? Similarly, if a partner tells you, in the heat of the moment, to forget protection and get on with it already, are you going to stop and rationally evaluate his or her sexual history? No. You have an urge, and you act on it. In other words, when it comes to food and sex, short-term pleasure seems to win every time. But are these urges to engage in risky behavior rooted only in our brains, or are they also sensitive to cues in our external environment?

That's what Peter Ditto and his colleagues at the University of California wanted to find out.3 More specifically, Ditto and his team were interested in the extent to which people's sensitivity to risk hinges on the proximity of reward. In one experiment, they told participants that they would be playing a game of chance. If they won, they would get some freshly baked chocolate chip cookies that were waiting in the next room. If they lost, they would have to spend an extra thirty minutes filling out boring questionnaires. The rules of the game were as follows: Participants would pick a card from one of four decks of ten cards. Each card would be either a win or a loss. But different decks would have different odds of winning, and participants would be told these odds before they drew, at which point they could choose whether or not to play. The experimenters wanted to see how many people would choose to play the game at the varying levels of risk.

Now, if participants were at all sensitive to objective information about risk, then the results should be obvious: more people would choose to play the game when the risk of losing was lower (or the odds of winning were higher). And this is indeed what happened. But wait: the experiment wasn't over yet. Now the researchers wanted to see what would happen when the rewards stayed the same but were brought a little closer to home. So they conducted the experiment a second time. Here, instead of simply telling the participants that they could win cookies, they set up a small oven in their lab and actually baked the cookies right in front of the subjects. Would sitting in that room, with the cookies turning golden in front of them and the smell of freshly baked deliciousness wafting through the air, change their decision making? Yes. As the experimenter slipped on an oven mitt and pulled the hot tray from the oven to let the morsels cool ever so slightly, somehow the participants' willingness to take risks miraculously skyrocketed. As suspected, the temptation to gamble, even when the odds of winning were low, was now too much to resist. Participants' inner grasshoppers wanted those damn cookies, and they wanted them bad: "To hell with the possibility of consequences later! I have a chance to win chocolate now!" With this voice echoing in their subconscious, just as many people chose to play the game when the deck was stacked against them as when the odds of winning were high. It seemed, the researchers concluded, that making the reward more vivid and immediate can overwhelm the ability to weigh risks rationally. In other words, when the reward looms close, it becomes much harder to resist.
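For readers who like to see the arithmetic, the trade-off the participants faced can be sketched as a simple expected-utility calculation. The utility numbers below are illustrative assumptions of ours, not figures from Ditto's study; the point is only what a coldly rational chooser would do at each level of odds.

```python
# Expected utility of playing one round of the cookie gamble.
# Utility numbers are illustrative assumptions, not values from the study.

def expected_utility(p_win, u_win, u_loss):
    """Return p_win * u_win + (1 - p_win) * u_loss."""
    return p_win * u_win + (1 - p_win) * u_loss

U_COOKIES = 10    # assumed pleasure of winning fresh-baked cookies
U_SURVEYS = -15   # assumed pain of thirty extra minutes of questionnaires

for p_win in (0.9, 0.5, 0.1):
    eu = expected_utility(p_win, U_COOKIES, U_SURVEYS)
    decision = "play" if eu > 0 else "pass"
    print(f"p(win) = {p_win:.1f}  EU = {eu:+.1f}  -> {decision}")
```

On this toy accounting, a rational chooser plays only when the odds are favorable. What the smell of warm cookies appears to do is inflate the subjective value of winning until even the worst deck looks worth playing.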

Our perceptions of risk seem to be similarly swayed when we make decisions about sex. When you show men pictures of women and ask them to gauge the odds of contracting a disease from the women, the more physically attractive the woman, the lower the estimate.4 When you think about it, this is completely irrational; after all, if anything, a sexier woman should be expected to have had more partners and therefore more opportunities to contract a disease. But we don't take the time to assess this logically when such an alluring immediate reward (sex with a gorgeous woman) is on the line. In a similar demonstration of the power of immediate rewards, another study showed that men who were sexually aroused reported being more willing to engage in risky sexual behavior than men who weren't aroused.5 It's not that any of these men were inherently bigger risk takers; it was that when the visual and sensory cues of sexual opportunity are there, the desire for immediate pleasure takes over, turning even the most responsible guy into a carefree Lothario.

So the more appealing and immediate the reward, the more we instinctively ignore or downplay the risks involved. This may not seem particularly shocking. We've probably all been in situations where we're more than willing to throw caution to the wind in pursuit of something or someone we really wanted. But as we're about to see, when simple shifts in our environment completely blind us to the long-term consequences of our actions, the results can be pretty surprising, and often disastrous.

Risky business: Feeling your way.

It turns out that many of the most important decisions of our lives, as well as the ones that seem to have direct implications for our character, are rooted in our subconscious assessments of risk. Whether he's aware of it or not, a smoker's decision about whether to quit is directly related to his belief about the odds that smoking causes cancer. Similarly, a voter's support for a policy geared toward, say, ending workplace discrimination and harassment will hinge on her judgment of how frequently these types of offenses occur. At first this may appear fine; after all, these people are grown-ups and free to make their own decisions. But the problem is, as shown by the studies described above, people rarely make these decisions rationally, although they like to believe they do. Rather, they allow emotional cues to override logic, which, more often than not, results in flawed decisions or judgments. We teamed up with colleagues Richard Petty and Duane Wegener at Ohio State and Derek Rucker at Northwestern's Kellogg School of Management to look at how small shifts in people's emotional states affect their assessments of risk and reward. If we were correct in thinking that simple changes in mood could alter your view of what awaited you behind the next door, the implications for our lives could be profound. For example, what if a smoker were suddenly less willing to go to a cancer screening because her good feelings about a recent job promotion made her underestimate her risk of cancer? Or what if a star athlete's elation after winning a big game inured him to the risks of unprotected sex?

To see how this might work, let's conduct a simple thought experiment. Imagine the scene during Hurricane Katrina. Think about the thousands of people desperately scrambling through the storm to find shelter, leaving their homes and, in many cases, their friends and family behind as they fought for survival. Picture those children who clung to their pets, only to be torn away by rescue workers and forced to leave the dogs and cats to certain doom. Think of the overwhelming grief of far-flung friends and relatives as they slowly received word of the loved ones they lost. Feeling a little sad yet? Now, answer this question: of the four million people in the United States who will propose marriage to someone this year, how many will be refused by the person they love?

This is more or less the exercise we put our participants through in our study. We had them read a news story that was intended to elicit a particular emotional state, such as sadness or anger, and then asked them to predict the likelihood of various other events. We overwhelmingly found that feeling sad or angry, simply from reading about an event such as a natural disaster or an anti-American protest in Iraq, was all it took to color their judgments about the odds of completely unrelated events occurring. It wasn't that hearing about an event such as a plane crash made them think plane crashes were more likely; it was that their emotional state swayed their general perception of the world around them. When people felt sad, they believed tragedy to be more prevalent; for example, they estimated that there were higher numbers of children starving in Romanian orphanages and brides being left at the altar. By the same token, people who were feeling angry overestimated the frequency of infuriating events, such as being screwed over by a used-car salesperson or being stuck in traffic.6

It may seem disconcerting at first to learn that not only do we fail to use logic when weighing probabilities but feelings and moods that have absolutely nothing to do with the decision being made can bias our judgments. But don't fret. It turns out that this tendency to overestimate risks can actually have its advantages, evolutionarily speaking.

Consider the following example: You're walking through the savannah with some of your family in search of a little breakfast. You come across a type of animal you've never seen before. It has dark brown fur with a white stripe down its spine. As you approach, it lunges at your merry band, sinking its teeth into your eldest daughter's neck and killing her. Now let's say we asked you the probability that the next animal you see with dark brown fur and a white stripe down its spine would be dangerous. You'd probably say 100 percent, and that's the most rational guess you could make, since the single dark-furred, white-striped animal you've encountered proved to be dangerous.

Now, let's say you accidentally happen upon another one of these creatures. This time the animal sits there peacefully, even assuming a deferential posture as you pass. Again we ask you, what is the probability that the next animal with dark brown fur and a white stripe down its spine will be dangerous? You'd probably pause. Rationally, your answer should be 50 percent, since as of this moment, one of two has proved dangerous. But your gut says something different. It's true that it is no longer reasonable to expect that all individuals of this species are dangerous, but on an intuitive level you know it's better to be safe than sorry. In your heightened emotional state, the cost of taking a longer path to avoid the brown and white critter is far less than the risk of losing another life. And in this case, your intuitive mind is right. While avoiding all animals with dark fur and white stripes would be an irrational calculation rooted in emotion (namely, fear), it is also an adaptive one.
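The "better safe than sorry" intuition can also be put in expected-cost terms. The cost figures here are illustrative assumptions; the point is only that when one outcome is catastrophic, even the "rational" 50 percent danger estimate makes the small fixed cost of a detour the cheaper bet.

```python
# Expected cost of approaching the striped animal vs. detouring around it.
# All cost figures are illustrative assumptions.

P_DANGEROUS = 0.5      # rational estimate: one attack in two encounters
COST_ATTACK = 1000.0   # catastrophic loss if the animal attacks
COST_DETOUR = 5.0      # small fixed cost of taking the longer path

expected_cost_approach = P_DANGEROUS * COST_ATTACK  # 0.5 * 1000 = 500.0
expected_cost_detour = COST_DETOUR                  # 5.0

# The detour wins by two orders of magnitude, even though fear,
# not arithmetic, is what actually drives the decision.
print(expected_cost_approach, expected_cost_detour)
```

Notice that the detour stays the better bet even if the true danger probability were far lower than 50 percent, which is why an emotionally inflated risk estimate still produces adaptive behavior here.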

Of course, this isn't just true in the jungle. In modern life too, listening to intuition and being more sensitive to the possibility of harm will serve you better on average than evaluating each individual situation rationally and objectively, particularly in situations that require rapid decisions for which you have incomplete information. It's hard, if not impossible, to know the odds involved in any given risk. What is the probability that you will get attacked if you walk down your own street? If you had asked Kitty Genovese this question early on the night of March 13, 1964, she probably would have said it wasn't that high. But she was attacked. And she was killed. What are the chances you will get sick if you share a cup or if you eat a serrano pepper? Again, probably not that high. But tell that to the college students who contracted swine flu or fell victim to the salmonella outbreak of 2008.

The point is that our past experiences play a large role in our assessment of risk, perhaps an even bigger role than our mood or proximity to reward. When we undergo a painful experience, the desire to prevent such a thing from ever happening again can be so strong that we'd rather ignore the probabilities and just play it safe. If that means you have to avoid serrano peppers for a year, so be it. Our intuitive systems don't give much credence to that old maxim about lightning never striking the same place twice.

At the same time, having missed out on a reward in the past can make us more willing to take a risk in the future. For example, if you fold your hand in a poker game and the next card that's turned is the one you were waiting for, it's hard to convince yourself you made the right decision. Now the money you could have won is staring you in the face, coaxing you to go for it the next time and put it all on the line.

Studies such as ours have shown that not only does feeling sad or angry lead people to overestimate the prevalence of tragic or infuriating events, but feeling happy also makes people more likely to overestimate the likelihood of positive events. This too is adaptive. How? Because it might compel you to take a chance on something you otherwise wouldn't have. Take a promotion, for example. Let's say only 10 percent of the people in your company get promoted to the next level. Logic and reason would tell you these are terrible odds and that you shouldn't even bother trying. But what if on one particularly sunny and cheerful morning your gut tells you just to go ask for that promotion even if, logically speaking, it's a fool's errand? What often seems like a fool's errand isn't, and if you put in the effort, you may just be rewarded. Sometimes you have to be in it to win it. So it can often be better to listen to our intuition and play the possibilities rather than the probabilities.

But if following our intuition often leads to better outcomes in the long run, how does this explain Terrance Watanabe's gambling losses? It seems as though he had the opposite problem. Terrance wasn't in a situation where he had to make split-second decisions. The massive losses at the casinos unfolded over time. The answer is not that Terrance was coolly miscalculating the risks. Instead, like the people who were more likely to gamble when they could smell the warm cookies, he was overly focused on the immediate reward. Each time he bet, the possibility that the next spin of the roulette wheel or the next turn of the card would win him the jackpot was so seductive, it blocked out all rational concerns about his long-term financial well-being or his family's reaction to his blowing their nest egg on a few rolls of the dice. When we think about judgments of risk and reward in terms of the battle between the ant and the grasshopper, Terrance's behavior and phenomena like it begin to make a lot more sense. The desires to avoid immediate losses and to obtain immediate rewards, whether in the savannah or in the poker room, all stem from the psychological processes geared toward our short-term interests. The processes that govern long-term interests are the voices in the back of our head advising us to forget about what's in front of our eyes and focus on what will be there much later on. And as we know, these are the voices that so often go ignored.

So we see that gambling, or taking risks, is less about our "character" and more about situation and circumstance: our past experiences, our moods and emotions, and the visibility of rewards in that moment. The variability of all these factors is exactly what makes us seem to be daredevils one minute and straight arrows the next. When it comes to risk, our decisions are under the control of the ant and grasshopper, with important implications for how we are judged by those around us. In fact, understanding the processes underlying risk taking provides a compelling explanation for why we consider some types of people valiant heroes and others meek cowards.

Eyes on the prize.

In our culture, heroes tend to be risk takers: the general who orders a daring assault to win a battle, the investor who makes a wild gamble and ends up with a windfall, the politician who puts his career on the line to champion a noble cause. But why do we have so much respect for those who run headfirst into danger, who don't think twice before acting? Why is this considered so heroic, whereas careful, cautious, and reasoned behavior isn't?

The answer to this conundrum lies in an unlikely place: sports. Ask yourself why few figures in sports are more beloved than the underdog. It's because people are fascinated by those who "beat the odds." As any Red Sox fan will tell you, no moment in recent sports history comes close to the thrill of seeing the 2004 team come back from a three-games-to-none deficit to beat their long-standing rival, the New York Yankees, in the American League Championship Series. But this thrill wasn't just about the win. The victory was icing on the cake. This defiance of odds and expectations, the unlikely becoming reality, is what captured our hearts. The marketers at major television networks are well aware of this fact, which is why it seems impossible to watch a sporting event or even a reality show such as Dancing with the Stars without being bombarded with information about the unlikely circumstances from which particular athletes or contestants emerged. Indeed, it's become increasingly difficult to tell the difference between coverage of an Olympic event and a heartwarming biopic. The announcers know that what is likely isn't interesting (the record-breaking quarterback with a twelve-game winning streak throws another touchdown, yawn); it's the unlikely that gets the ratings.

Case in point: During the 2010 NCAA college basketball tournament, the Butler Bulldogs knocked out a series of higher-ranked opponents on their way to a national championship showdown with the heavily favored Duke Blue Devils. It was painted as a David vs. Goliath matchup, and the nation was captivated. Even those who had absolutely no interest in college basketball were tuning in to see the drama unfold. Butler lost by two points after their last-second shot clanked off the rim, but no one cared all that much about the outcome; the nation loved the Bulldogs for the mere fact that they'd gotten there by beating the odds. Taking on odds that seem insurmountable may be the key to being seen as a hero.

Let's see how this psychological bias for the unlikely plays out in another competitive context: Wall Street. There is perhaps no group of individuals toward whom more vitriol and scorn have been directed over the past several years than Wall Street traders (or the greedy, callous, irresponsible, money-hungry leeches, as they're usually referred to). But it turns out that the psychological processes that cause us to root for the underdog (this attraction to beating the odds) might be the exact same ones that are responsible for the risky investment strategies that most likely contributed to the 2008 economic collapse.

Wall Street traders feel the same way about the high-stakes game of buying and selling that most people feel about sex and warm cookies: they like it. They like it a lot. To see just how much, Brian Knutson, a neuroscientist at Stanford, put traders into fMRI machines. Not surprisingly, when the traders were making high-risk decisions, the pleasure centers of their brains lit up like Christmas trees. And the riskier the decisions became (i.e., the worse the odds), the more pleasure they brought the traders.7 In a way, the same thing is true for sports fans: the less likely the dark horse is to win, the more excited we are just to watch them play. And the less likely it is that a firefighter will come out alive from a burning building, the more praise we heap upon them if they survive.

So the next time you curse those bankers on Wall Street and wonder how they could possibly be so indifferent to the risks they were taking and the choices they were making, remember the pleasure you take in seeing Cinderella stories unfold. Sure, rooting for Seabiscuit doesn't have the same consequences as gambling away millions of dollars of taxpayers' money, but the psychology behind it is much the same. And, by the same token, the next time you're tempted to judge someone such as Terrance Watanabe for gambling away his family's fortune, remember that the mental mechanisms that bring you so much joy in the fortunes of unlikely winners are much the same as those that repeatedly drove him to bet thousands of dollars on a measly pair of twos. Again, when we look at risk in terms of the battle between the ant and the grasshopper, what seem at first glance to be deficiencies in character suddenly become a little more understandable after all.

Supermen and scaredy-cats.

So if risk takers are heroes, then what about those who avoid risk at all costs? What about the cowards? To understand how common an aversion to risk can be, and why it is rooted in a fundamental property of the mind, let's first consider the little-known eccentricities of a famous figure: Charles Darwin. Darwin was nothing if not meticulous. One particularly interesting and little-mentioned detail about his travels is that not only did he keep detailed logs of the many species he encountered, he also kept a detailed log of his flatulence and bowel movements (as well as daily records of the severity and frequency of his tinnitus, or the ringing in his ears).8 His writing on this matter certainly does not rank up there with On the Origin of Species, but it was something he evidently expended a considerable amount of time on. After all, he was known to be a hypochondriac. Hypochondria is a classic example of the human tendency to overestimate the possibility of immediate risks in our environment. If we were to ask Darwin or any other hypochondriac the likelihood that his stomach grumblings were symptoms of a serious ailment, he would most likely say close to 100 percent. Clearly, this would be inaccurate, but when our minds are always attuned to danger, we see it wherever we look.

This kind of mentality takes many forms. Agoraphobics confine themselves to their homes because they've overestimated the risks they perceive in the outside world. Hoarders can't bear to throw anything away because they can't risk not having that old flowerpot when they need it. Of course, these are extreme situations, but in milder forms, risk aversion is actually an extremely common psychological trait.

Consider the following example. If we were to ask whether you'd rather have $50 right now or flip a coin for the chance to win $100, which would you choose? If you're like most people, you'd go with the former, and this makes sense. Though the expected outcome of each decision is the same ($50), there is risk involved in the coin toss; you might end up with nothing. But what if we asked whether you'd rather have $40 or flip the coin for a chance at $100? Logically, if you calculated the risk, the odds of the coin toss would be in your favor, but you'd probably take the guaranteed $40 anyway.9 This is an example of irrational risk aversion, also known as loss aversion, and most people experience it in one way or another. We seem to be wired to avoid immediate losses, even when it means sacrificing potential long-term gain. Yet as we've noted, in our culture this kind of behavior is often construed as a weakness in character. In fact, we reserve a word for those who avoid any kind of risky behavior: cowards.
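The arithmetic behind the two offers is worth spelling out: the coin flip's expected value stays fixed at $50 no matter what sure amount it's paired against, which is exactly what makes the preference for the guaranteed $40 irrational in a strict expected-value sense.

```python
# Expected value of a fair coin flip for $100 vs. the two sure offers.

def coin_flip_ev(prize, p_win=0.5):
    """Expected value of a gamble paying `prize` with probability `p_win`."""
    return p_win * prize

ev_flip = coin_flip_ev(100)  # 0.5 * $100 = $50.0

print(ev_flip == 50.0)  # flip vs. sure $50: identical expected value
print(ev_flip > 40)     # flip vs. sure $40: the flip is worth $10 more
```

A risk-neutral chooser is indifferent in the first case and takes the flip in the second; most real people take the sure $40 anyway, which is the loss aversion described above.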

Many of us feel like cowards at some point in our life. When we can't muster up the will to go talk to that person we've been eyeing all night, for fear of being rejected. When we'd prefer to keep all our money in savings accounts (or under our mattresses) so we don't lose it all in the stock market. When we refuse to walk home alone in the dark for fear of being mugged. When we don't let our kid eat that candy bar with the slightly torn wrapper in case it has a razor blade inside. These fears may not be rational, and they certainly aren't sexy, but again, they can be adaptive. In the long run, cowards are less likely to get rejected, lose their nest eggs, get mugged, and feed their kids razor blades. Which brings us back to the question at the heart of the chapter. What makes a person a risk taker in one context, and a coward in another? Once again, we see it has to do with our subjective understanding of the risks involved. Consider the child of a lifelong firefighter. Every day he sees Dad leave the house in the morning to go extinguish burning buildings and then come home safe and sound. Might this child grow up with a different idea about the risks associated with running into burning buildings than a child of a firefighter who died in a blaze? Of course he would. As we saw when we talked about the irrational fear of air travel, experience and exposure powerfully sway our perceptions of risk. So would the former child be more willing, later in life, to climb a fire escape to pull a baby out of a fourth-floor window than the latter child? Probably. But would that mean he's a braver person, a person of better character? A hero instead of a coward? Well, not necessarily.

The point is that "heroes" aren't necessarily braver people; they may simply have different estimates of the probabilities involved with the events. If you don't buy this, then you may have to reevaluate your opinions about adolescents, especially boys. Most people (over the age of eighteen, at least) would not agree that teenagers are necessarily more courageous or heroic than adults. But research has found that they certainly are less risk-averse.10 Suggest to a fifteen-year-old boy that the two of you grab your skateboards and careen down the steps of city hall and he'd probably give you a high five, whereas most adults would look at you like you'd lost your mind. This isn't just because most adults look ridiculous on a skateboard. It's because the teen and the adult are wired to think differently about the risks involved. Research has shown that the teen brain hasn't fully developed the ability to generate what psychologists call "counterfactuals." In other words, teens lack the cognitive ability to imagine the potential consequences of their actions (i.e., the skateboard going into the street and its rider getting flattened by an oncoming bus). And if a teenager can't even envision breaking his neck by skateboarding down a steep staircase, then how can he accurately assess the risk that it might happen? How could he be considered a hero for taking a risk he can't even fathom? So whether we act like heroes or cowards is not as much a matter of character as people tend to think it is. When the grasshopper is in charge, it can turn us into heroes, addicts, or cowards, depending on the context.

Tomorrow is always a day away.

Imagine that on the table in front of you are four decks of cards. You know only two things about these decks. First, every card will have a number on it that represents the amount of money you will either win or lose, depending on the card. Second, the cards differ among the decks. But what you don't know is that in this game, known as the Iowa Gambling Task, some decks have better odds than others. The risky decks offer greater potential payoffs but have more "loss" cards; the safe decks offer smaller payoffs but at a more constant rate. But again, you know none of this, at least not yet. So how do you decide which deck to choose from?

When people play this game, at first they use trial and error; they pick from the different decks more or less randomly and see what happens. After about forty or fifty trials, however, they have developed a pretty good sense of which decks are safe and which are not, and then begin to choose cards almost entirely from safe ones. Why? They know that the game is going to go on for a while and therefore that their ultimate profit will be determined over the course of the game, not just on the next draw. In other words, somehow the systems of the ant kick in and shift people's attention away from short-term wins and onto the accrual of money over the long haul.
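The structure of the task, and the shift from exploration to favoring the safe decks, can be sketched as a tiny simulation. The payoff numbers and the averaging agent below are assumptions of ours for illustration, loosely inspired by (not taken from) the actual task:

```python
import random

# Hypothetical deck parameters (assumed, not the original study's values):
# risky decks offer bigger wins but net out negative per draw; safe decks
# offer smaller wins but net out positive.
DECKS = {
    "A": {"win": 100, "loss": -250, "p_loss": 0.5},   # risky: EV = -25
    "B": {"win": 100, "loss": -1250, "p_loss": 0.1},  # risky: EV = -25
    "C": {"win": 50, "loss": -50, "p_loss": 0.5},     # safe:  EV = +25
    "D": {"win": 50, "loss": -250, "p_loss": 0.1},    # safe:  EV = +25
}

def expected_value(deck):
    """The long-run average payoff of one draw from a deck."""
    d = DECKS[deck]
    return d["win"] + d["p_loss"] * d["loss"]

def draw(deck, rng):
    """One card: the win, minus a loss that hits with probability p_loss."""
    d = DECKS[deck]
    return d["win"] + (d["loss"] if rng.random() < d["p_loss"] else 0)

def play(trials=100, explore=0.1, seed=1):
    """A naive player: try every deck once, then mostly pick whichever deck
    has the best running average, exploring at random now and then."""
    rng = random.Random(seed)
    totals = {k: 0.0 for k in DECKS}
    counts = {k: 0 for k in DECKS}
    picks = []
    for _ in range(trials):
        untried = [k for k in DECKS if counts[k] == 0]
        if untried or rng.random() < explore:
            deck = rng.choice(untried or sorted(DECKS))
        else:
            deck = max(DECKS, key=lambda k: totals[k] / counts[k])
        totals[deck] += draw(deck, rng)
        counts[deck] += 1
        picks.append(deck)
    return picks

picks = play()
second_half = picks[len(picks) // 2:]
safe_share = sum(d in ("C", "D") for d in second_half) / len(second_half)
print(f"share of late picks from the safe decks: {safe_share:.2f}")
```

Like the human players, the simulated one tends to drift toward the safe decks once the occasional big losses on A and B make themselves felt; the per-draw expected values (+25 vs. -25) are what the ant is implicitly tracking.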

Intuitively this makes sense. Imagine playing the game again, but this time the experimenters have placed sensors on your skin that can gauge your arousal level by measuring increases in perspiration. That is, they can literally see you sweat. When Antoine Bechara and his colleagues did this, they found something fascinating: around the tenth card draw-long before you have any conscious inkling of which decks are risky-you begin to show anxiety (as measured by arousal level) each time your hand reaches to draw from what you will only later consciously realize is a risky deck. You're nervous, but you aren't even aware of it.11 This is a compelling demonstration of the ant at work. It acts as a silent statistician, calculating the risks and rewards associated with each deck and trying to steer you one way or the other based not on each individual flip but on the effect multiple flips will have over the long term. Left to the devices of the grasshopper, people might continue to flip from a deck from which they get immediate positive feedback or avoid a deck from which they've just been burned. But remember, the ant is focused on the probability, not the possibility, of rewards. After all, playing the probabilities is the key to success over the long term. The power of the Bechara study lies in its demonstration of just how subtly, how deeply below our level of consciousness, the ant can work. Clearly, we know on an intuitive level which of the decks are risky, otherwise we wouldn't be experiencing that anxiety. It takes us thirty more rounds-300 percent longer-to be able to consciously report this knowledge and adjust our behavior to minimize losses. Why? The grasshopper doesn't go down without a fight. The impulse to avoid immediate harms and gravitate toward immediate gains competes with the anxiety generated by the ant. In this kind of controlled situation, over time the scales tip toward long-term concerns and the players wise up.
But in the real world, unfortunately, this isn't always the case. For many of the most important decisions in our lives, sometimes the ant needs a little help.

To your health.

Earlier in the chapter we talked about how our perceptions of risk can impact health-related decisions such as whether or not to quit smoking or go for cancer screenings. In both cases we make these choices by subconsciously weighing the short-term benefits against the long-term risks. In the case of smoking, it's the pleasure of cigarettes vs. the risk of cancer. With the screenings, it's the reward of avoiding all that unpleasant poking and prodding (and the worry about receiving bad news) vs. the risk that a disease will go undetected. In their best-selling book Nudge, Richard Thaler and Cass Sunstein talk about how, by understanding the ways in which people think irrationally, we can help nudge them toward healthier, more responsible, and more productive behaviors.12 Building on that idea, how can we use what we know about the psychology of risk taking to encourage people to be more responsible in looking after their health? In other words, how do we get people not only to hear the voice of the ant telling them to focus on the long term but actually to heed it? When it comes to our health, it's not enough to intuitively know those risks are there, like the players in the early rounds of the Iowa Gambling Task did. We have to actually act on them!

If you still believe that focusing disproportionately on risks makes you a coward, consider the following field experiment. Yale psychologist Peter Salovey was interested in how to get more women to go for mammograms. He quickly realized that in order to voluntarily subject themselves to the unpleasant procedure, women would have to judge the long-term risks of not going (cancer, possibly death) as being greater than the short-term costs of going (the hassle of going to the doctor, the physical discomfort of the X-ray, the mental anguish of worrying about a bad result, and so on). Logically, this seems like a no-brainer, but we shouldn't have to tell you at this point that logic has little to do with it. Manipulating mental and physical discomfort would be tricky, so Salovey and his team decided to focus on the risk part of the equation. They teamed up with a local phone company to recruit women in the New Haven area to come into his lab and watch short public service announcements on their lunch break. The announcements were of two types. Both urged women to get mammograms, but one video talked about the benefits of mammography (e.g., finding a tumor early increases survival odds); the other talked about the risks (e.g., not finding a tumor early can lead to death).

This seems like a trivial difference, but it actually turned out to have a huge impact on the women's decisions. Those who were made to focus on the long-term risks rather than the benefits were much more likely to later act responsibly and go for a screening. Why? Simple. When the announcement was framed in such a way that the ultimate long-term consequence was front and center, the ant suddenly couldn't be ignored. Here again we see how the gambles we take, even the big ones such as whether we're willing to risk our long-term health for short-term conveniences, can be greatly influenced by small and subtle differences.13 This may make it sound as if all would be well with the world if we always listened to the ant and focused on the long term. We might not have as much fun, but we'd be responsible and better off in the end, right? Well, that's true when it comes to our health, since the stakes are so high. But in other situations that rule of thumb doesn't always work because, as we've learned, the ant's foresight isn't always 20/20.

Ask any new professor what's the worst thing that can happen to his or her career and nine out of ten will give you this answer: being denied tenure. To avoid that future horror, they will make great sacrifices: working twenty-hour days, not spending as much time with their families as they'd like, letting their teaching responsibilities slide, and so on (trust us, we've seen it). But as work by Dan Gilbert and his colleagues has shown, all this extra effort may not, in the end, be justified.14 Sure, being denied tenure is bad, but when Gilbert assessed the actual levels of unhappiness among professors who had been denied tenure, it quickly became clear that they were actually a lot happier than their younger selves would have predicted. And as Gilbert's team has shown, this type of prediction error for happiness is quite pervasive; we're as bad at predicting future happiness about all kinds of long-term rewards-everything from wealth to the outcome of an election and more-as we are at predicting risk. It's hard to make decisions regarding our long-term welfare if we can't accurately predict what will make us better off. Here again, neither intuition nor rationality always provides the answer.

So what does this all mean? Our decisions and behaviors are guided in large part by what our minds and circumstances trick us into believing about relative risks and rewards. Add to this the fact that our estimations of risks and rewards not only are very frequently flawed but are also quite fluid, and the mechanisms shaping character quickly become more complex. Once we come to grips with these dueling forces and how they can sway us-once we realize that we too are just one or two big poker wins away from a whole lot more losses-then we can start making better decisions about when to gamble and when to play it safe.

8 / TOLERANCE VS. BIGOTRY.

Why sometimes we just can't help hating "them"

Recorded aboard an Apache helicopter, July 12, 2007, in Baghdad, Iraq:

2:11 All right, we got a guy with an RPG [rocket-propelled grenade].

2:13 I'm gonna fire.

2:15 No, hold on. Let's come around. Behind buildings right now from our point of view.

2:43 You're clear.

2:49 Let's shoot.

2:50 Light 'em up!

2:52 Come on, fire!

3:15 I got 'em!

3:40 Got a bunch of dead bodies lying there.

4:31 Oh yeah, look at those dead bastards.1

These words document the last minutes in the life of Namir Noor-Eldeen. As his name might suggest, Noor-Eldeen was Iraqi, but he was not an enemy combatant. To the contrary, he was one of the top Reuters freelance photographers documenting the American and Iraqi governments' efforts to root out insurgents in Baghdad and Mosul. This day, Noor-Eldeen, along with his Reuters driver, Saeed Chmagh, was taking photos in a Baghdad neighborhood where the Apache helicopter team was searching for insurgents. Noor-Eldeen had just snapped some pictures using his telephoto lens and was showing a few others the shots he had taken. By all accounts, he was calm even as the helicopter circled above him. After all, why should he worry? He was a photographer, not a militiaman.

The scene on the copter, however, was not so calm. The gunners had mistakenly identified Noor-Eldeen's lens and camera as an RPG and were circling to get him in their sights and take him out. By the time he realized that the copter gunners were aiming right at him, it was too late. Noor-Eldeen and his companions, none of whom had any weapons, were gunned down in a bloody massacre.

The release of this video (which the army fought for years to keep under wraps) has stoked much debate. If you listen carefully, you can hear the soldiers voicing hopes that the Iraqis will pick up a weapon (even though there weren't any there) so that, according to the rules of engagement, they could hit them with another round of machine gun fire or missiles. This series of events has led to public outcry against these military personnel. How could they mistake a camera for an RPG? With the level of training they'd had, how could they not recognize that Noor-Eldeen had done nothing to suggest he might be a combatant other than look Iraqi? The answer many have come up with is that these soldiers must simply be bigots-hungry for the blood of any and all Iraqis.

Although this view might seem tenable at first blush, on further analysis it really doesn't appear to hold water. The American soldiers often fought side by side with the Iraqi forces, sometimes putting their lives in one another's hands-not something you'd do with people you despised. Plus, this tragedy was just one isolated incident-a freak accident. If the soldiers were really prejudiced against all Iraqis, wouldn't there have been many more incidents like this one? And for whatever it's worth, the army's own internal investigation found no evidence of prior bias or an inclination to shoot before identifying the target. It was a tragic event but, at least according to the army, an unavoidable one that is part of the cost of war.

Still, when we read this story we couldn't help wondering whether the triggers would have been pulled so quickly if the man with the camera had been named Smith instead of Noor-Eldeen. If his skin had been lighter, if he had been blond, would there have been a little more hesitation, or at least a better attempt to verify whether what he was holding was in fact a weapon before the Americans opened fire? It's not that we believe the soldiers consciously shot the man just because they thought he looked Iraqi. But, as we've seen many times before in this book, what a person consciously thinks doesn't always dictate what he actually does.

Prejudice is one of the most reviled of human tendencies. Few of us would look at a bigot and say, "Now there's a guy with good character." Yet as psychologists, we can't help wondering: if prejudice is so bad, why has it stuck around for this long? As far as anyone can tell, stereotypes and prejudice appear to be as old as civilization itself. To have endured this test of time, there must be something that can sometimes be adaptive about them, something that, historically speaking, served a purpose, even if not a noble one. We realize this might not be the most popular argument in this book, but if you want to understand how to prevent bigotry from emerging, you have to understand the basis for why the mind engages in it in the first place. As part of this process, then, we intend to show that the question of whether prejudice is "good" or "bad" isn't always so (for lack of a better phrase) black and white. Which is why, as we will show you, when the circumstances are ripe, any of us, ourselves included, can act like a bigot no matter how fair and unbiased we believe our character to be. In fact, most of us, if placed in the situation of the soldiers on that Apache helicopter, probably would have acted similarly. Whether we like it or not, and whether we believe that prejudice is something we should all strive to overcome (which the two of us do personally believe), the human mind is wired for it-and this can influence people's behavior to an extent that you wouldn't believe.

I know I saw a gun.

Imagine you're a New York City police officer scanning the neighborhood for a suspected felon. You see a man who might match the physical description and you begin to approach him. You are white. He isn't. As you're approaching, he turns to duck into the doorway of a nearby building. You identify yourself as a police officer, and as you do so, the individual reaches into his pocket and begins to turn toward you. You direct your gaze toward his hand, and you see he is holding a dark object. What do you do? Do you shoot or do you wait? You'd probably think that the answer most likely depends on whether the object he's taking out of his pocket looks like a gun. But that's not the whole story. You see, just how much that object (whatever it may be) resembles a gun depends a lot on who is holding it.

The notion that a mere error in perception can lead us to shoot an innocent man might seem (understandably) a bitter pill to swallow. Yet it's exactly what the psychologist Joshua Correll and his colleagues have convincingly shown in a series of inventive experiments that re-create the scenario above. Here's how it worked.

You sit down in front of a computer screen with two buttons in front of you, one labeled "shoot" and the other "don't shoot." The experimenter informs you that you'll see images of different street scenes flash on the screen in front of you-a city intersection, an alley, a parking lot, etc.-and every so often a man will appear in some of the scenes as well. The man will always be holding something-a wallet, a cell phone, or a gun. Your job is to "shoot" men who are holding guns by pressing the shoot button as fast as you can, just as you would if your life were actually on the line. If the man isn't holding a gun, you have to push the "don't shoot" button just as quickly.

We can all agree that if people took their time, no one would make any errors and no one would spot a gun where there wasn't one. A gun, after all, looks very different from a wallet. But we can also agree that when people need to identify the object in under a second, mistakes become a little more likely. But here's the kicker. Yes, Correll's participants made errors, just as you might expect given the time they had to make the decision, but their errors weren't random. Not by a long shot. His participants (all of whom were white) were much more likely to mistakenly identify a phone or a wallet as a gun, and therefore to shoot, when the man holding it was African American. The reverse pattern held when the man was white.2 It seemed the participants' minds were engaging in some racial profiling on the intuitive level.

Now, these participants weren't bigots. They espoused no racial prejudices and had no history of acting in a discriminatory way. Yet here they were, deciding to shoot a black man much more readily than a white one. Sure, it was just an experiment, but the fact of the matter is that these same biases play out in real life. In fact, Correll based this experiment on a real-world tragedy you may recall from the headlines: the death of Amadou Diallo, a twenty-three-year-old Guinean immigrant to New York City. Diallo wasn't the criminal the police were looking for on the evening of February 4, 1999. He was an innocent guy selling wares on the street to make money for college. Yet as the police approached him, because they thought he might be the man they were after, he got scared and fled (as many in his situation might do), entering a nearby building. They ordered him to stop, and he began to turn around, reaching into his pocket for his wallet so that he could prove to them who he was. Unfortunately, however, the policemen were certain that the emerging wallet was a gun-a split-second mistake that resulted in Diallo falling to the ground with nineteen bullets lodged inside him.

But is it really fair to call this prejudice? In all these cases-the cops, the soldiers, the research participants-everyone thought they saw a gun. Wouldn't you shoot to protect yourself? Of course you would; almost anyone would if they thought they saw a gun. But that's exactly the point. Whether you think you see a gun isn't just determined by what's actually in front of your eyes. It's also influenced by the battle going on behind them.

The quick, the fair, and the dead.

It's human nature that whenever we meet someone new, our mind automatically and immediately categorizes him or her in some way: old or young, white or black, gay or straight, Christian or Muslim, liberal or conservative, and so on. A major reason for this rapid categorization is the mind's desire to predict what is likely to happen next. Interactions with other people usually portend one of two things: rewards or costs. So, beginning the interaction with some knowledge-any knowledge-about the others involved can help you predict what's coming and, thereby, adjust your actions accordingly. As we discussed in Chapter 5 on cruelty and compassion, we tend to categorize people by lumping them into groups that can be defined as similar to or different from us: us vs. them. When another person is in the "us" group, we feel comfortable. We assume we know what they're like, because their goals and interests are similar to ours. They're brethren who will help us. However, when the person is "them," we're a little more wary. The odds of incompatible goals and strife become higher.

Historically speaking, human social life has always involved competition and conflict between groups. The end result is that the mind has evolved to be quite sensitive to signs of group affiliation. Yes, forming a friendship or partnership with someone from another group holds the potential to offer rewards, but it also holds the potential to end in competition, conflict, or worse. The systems of the ant and grasshopper know this well and work to shape your views and judgments accordingly. They both want to keep you alive and let you thrive; they just go about it differently.

The long-term systems, in their efforts to build benefits for the future, try to tip the mind toward further exploring the potential of each individual. They urge us to try to learn what he or she is like and not to jump to conclusions. The ant knows that making a rapid decision about what someone is like based on the color of their skin or other marker can lead to missed opportunities. For the short-term systems, however, it's better to be wrong than to be dead. What matters most to the grasshopper is surviving right here and now, and given that the interests of different groups do often conflict, it may make sense to use a quick and dirty guess for what the person in front of us is likely to do. In other words, to use the only information we may have regarding a new person: stereotypes.

For better or worse (and often it's for worse), stereotypes provide the mind with a guess about what specific people are like. But if stereotypes are so bad, it raises an interesting question: why does the mind use them? The answer is simple: to help us make sense of the people around us. You see, stereotypes aren't inherently biased or maladaptive. They are just concepts that we use to categorize people in our social world, just as we use concepts to categorize objects in our physical world. For example, just as we know that chairs have four legs and are meant for sitting, we "know" that Italians are brilliant and irresistibly attractive. (What did you expect from two guys named DeSteno and Valdesolo?) In the absence of any other information, the mind uses these concepts to make predictions about new objects or people. For example, if we tell you something is a chair, you know you can sit on it even if it looks really strange (remember those chairs that were shaped like giant human hands?). Similarly, if you know you're going on a blind date with an Italian, you can expect it's going to be great. It's true that not all chairs have four legs and not all Italians are brilliant. But on average, if stereotypes are working correctly, most members of a category have the relevant features of the stereotype, and so our minds can use stereotypes as shortcuts to give predictive order to our world.

Now, while you may have accepted our chair example, you may have sensed some bias or self-interest creep in with the example about Italians. Our stereotype about Italians may not be the same as yours, which brings up an important question: how do we learn stereotypes in the first place? Usually it's in one of two ways: either someone plants an idea in our head about what people in group X are like (whether by telling us explicitly or by treating them certain ways), or we repeatedly see members of group X acting in specific ways and we extrapolate from that impression. For instance, back on the ancestral savannah, if every time you saw a member of the Mib tribe, they bludgeoned you, you would begin to avoid them at all costs, or to attack them before they hit you first. It certainly might be true that not every Mib would take a swing at you, but it might be safer to assume they would and avoid serious injury as opposed to taking a risk by conversing with them. Hence the potential benefit of stereotypes and prejudice. Of course, this strategy will not help you in terms of long-term peacemaking. Finding that one Mib who might well be interested in resolving hostilities between his group and yours could lead to great long-term benefits. But being wrong could also lead to broken teeth. Thus you see the contest between the two mental mechanisms playing out.

There is one more kink in the system, though. This is the one that often makes stereotypes so pernicious. Now that we're no longer on the savannah, what we see of group X can be very misleading. In these days of 24/7 media, what we learn about group X is often what the media decides to show us. If on any given day ninety-eight Italian men put in a solid day's work but one is indicted for being a mob boss and one commits a murder, which two stories will probably show up on the six o'clock news? The same goes for any other ethnic or social group. Unless we live in a cave, much of what we learn of other groups comes from the tragic or salacious stories we see on television. Back on the savannah, what we saw, we saw with our own eyes. So if Mibs were frequently violent, then the best guess the mind could make on encountering a Mib (at least in a statistical sense) was that he or she was going to be violent. But in today's sensationalist, media-saturated culture, what we see tends to reflect not the statistical realities but rather what is most "interesting" or aberrant. Yet the intuitive mind still uses that information to generalize.

This fact is why stereotyping people (as opposed to chairs) can be so problematic. Because our minds have been wired over thousands of years of evolution to take in small bits of information and generalize it to all members of a group, the usefulness of stereotypes can vary widely depending on the accuracy of the information. Even when we rationally know that all Iraqis aren't terrorists, or that all African Americans aren't criminals, or that all Italians aren't brilliant, our intuitive biases, irrespective of whether we endorse them, can shape what we think, what we see, and even what we do. If the officers confronting Diallo had waited an extra second or two-enough time to process the information in front of them instead of just relying on intuition-they might never have made such a horrible mistake. What may be most surprising, however, is not only that our subconscious prejudices impact our behavior in unfair and dangerous ways but also that they have the potential to emerge seemingly from thin air.

Red, blue, I hate you.

On the afternoon of April 5, 1968, the water fountain near Jane Elliott's third-grade classroom in Riceville, Iowa, was suddenly off-limits to students with blue eyes. Elliott had just told her eight-year-old students that blue-eyed people don't have as much melanin as brown-eyed ones, and that was important because melanin was responsible for intelligence and other good qualities. "Brown-eyed people are the better people in this room. They are cleaner and they are smarter," she said. "Blue-eyed people sit around and do nothing."3 This was the pretext for one of the most famous and shocking examples of how quickly and arbitrarily prejudice can rear its ugly head. The evening before Elliott told her tale, Dr. Martin Luther King Jr. had been assassinated in Memphis. Now Elliott was desperate to teach her young pupils a lesson about prejudice. So she told them this fib about the superiority of the brown-eyed children. But it didn't stop there. She then proceeded to spend the remainder of the day praising the "brownies" over the "blueys," as she called them. It didn't take the children long to chime in. In just a few short hours, when a blue-eyed student got a math problem wrong on the board, the students said it was because he was a bluey. When a blue-eyed girl had to use a paper cup instead of drinking from the water fountain, a brown-eyed boy told his friend this was to make sure the brownies didn't catch anything.

This event is fascinating for several reasons. The foremost, though, is that it shows how readily the human mind-at least the young, relatively unformed human mind-will discriminate. These kids had all been friends. They had no history of any type of cliquishness or infighting. Yet all it took was an authority figure to give them a seemingly believable reason for why one group might be better than the other, and lines were quickly drawn in the sand. Suddenly even brown-eyed kids who were usually a bit quiet and timid were scoffing at their supposed inferiors. And to make matters worse, the exact same pattern of prejudice repeated itself on the next day-this time in the opposite direction-after Elliott informed the class that she had made a mistake: it was less melanin, and therefore blue eyes, that was associated with desirable qualities. Now it was the "brownies," according to Elliott, who were inferior. And the class bought right into it.

This demonstration was one of the first-and most resonant-to suggest that the capacity for prejudice lurks within everyone. Yet on the face of it there are several reasons to suspect that this view of character is too dismal. First, these were little kids, and kids are impressionable. They will believe whatever you tell them, especially if the person doing the telling is an authority figure. These kids, then, probably accepted the "facts" about melanin and eye color because their teacher told them it was true. Similarly, they discriminated against the blueys or brownies because their teacher did. So there's no reason to think that adults would ever act this way, right?

We decided to find out. In this case we were joined by Nilanjana Dasgupta from the University of Massachusetts at Amherst, one of the foremost experts on the fluidity of prejudice. If what we all suspected was correct, under the right circumstances prejudice could emerge in anyone. And if prejudice is really a function of the battle of mental systems, like so many other aspects of character we've discussed thus far, it should crop up even if you don't have any preexisting stereotypes or biases about the group in question. No explanations for why one group is more worthy are needed. No

Out of Character, Part 4. Author: David DeSteno.