
Learning how consumers think

By Cass R. Sunstein

In recent decades, psychologists and economists have produced a flood of new findings about how human beings think and act. Those findings offer compelling lessons about how to change people’s behavior.


Governments have taken notice—and so has the private sector. There are terrific opportunities here, but also real risks.  

Behavioral scientists have established, for example, that people are greatly affected by “default rules,” which establish what happens if they do nothing at all. If employers automatically enroll employees in savings plans (while allowing them to opt out), participation rates will be far higher than if employers ask employees whether they want to opt in.

The US Department of Agriculture has found that if poor children are automatically enrolled in programs that provide free school meals, it can get millions of kids into those programs. Alert to the power of default rules, several states have become keenly interested in automatic voter registration, by which eligible citizens become registered voters unless they explicitly say that they don’t want to be.

Behavioral science has also shown that people have limited attention, that they dislike losses far more than they like equivalent gains, and that unless information is made salient, they might ignore it, even if it is quite important.

In these circumstances, simple reminders significantly increase the likelihood that people will take necessary medicines, save money, and teach their children to read.

Consumers lose a lot less money when credit card bills contain clear information about late fees and overuse charges. Because people abhor losses, a small tax on disposable bags will dramatically cut their use.

In cases of this kind, behaviorally informed nudges deserve a big round of applause. But other uses of the same techniques, exploiting people’s behavioral biases, should make us a lot more nervous.

Consider the widely discussed report that Uber Technologies Inc. and Lyft Inc., the ride-hailing companies, are aggressively exploring how to use behavioral science to get their drivers to do what they want.

In a clever experiment, Lyft showed one group of inexperienced drivers that they would make a lot more money by moving their work from a slow time, such as Tuesday morning, to a busy time, such as Friday night. It showed another group how much money they would lose by sticking with the slow times.

The result was straight out of Behavioral Science 101: Because people dislike losses, the second approach was more effective in getting drivers to work more during busy times. Intriguingly, Lyft elected not to use that approach, on the ground that it would be too manipulative.

But ride-hailing companies have eagerly enlisted other behaviorally informed approaches. One of the smartest, now used by both Uber and Lyft, is called “forward dispatch.” Before the current ride ends, drivers are automatically assigned a new one.

That’s good for passengers, because their waiting time is shorter. It’s terrific for the companies, because it encourages drivers to stay on the road. It’s less clear that it’s good for drivers, for whom it creates a clear default: Keep working.

In a similar vein, many companies use a strategy known as “negative option marketing,” which means that unless consumers actually take action, they will be assumed to continue to want some good or service—and to pay for it.

For example, you might choose to subscribe to a magazine for a year, but the subscription is automatically renewed, so you can end up paying for it for a decade or more, even if you don’t much like it. Or a company will give you a service for free during an introductory period—but if you don’t cancel it, you’ll find yourself paying a healthy amount every month thereafter.

As companies keep learning more about behavioral science, and as it becomes easy to accumulate massive amounts of data about each of us, self-interested nudging will become both more common and more personalized.

Aware of the limited nature of attention and the power of loss aversion, casinos have become the Jedi Knights of behavioral manipulation, focusing customers on the possibility of riches (and showing them new opportunities around every corner). Every day, banks are using behaviorally informed techniques to persuade their customers to obtain “overdraft protection,” which often turns out to be immensely expensive loans. If you hope to profit from impulse purchases, you would be smart to put high-calorie, sugary foods right near the cash register—and a lot of supermarkets are smart.

All this raises big questions: When is nudging ethical? When is it acceptable to take account of, and perhaps to exploit, people’s behavioral biases?

The first way to answer those questions is to ask whether people are being helped or hurt.

Many of the best nudges are like GPS devices: They make it easier for people to get where they want to go. If you are reminded that a bill is due, or that you have a doctor’s appointment this week, you aren’t likely to complain.

Most employees are happy to be defaulted into savings programs, and poor children aren’t exactly objecting to free school meals. But we shouldn’t approve if people are being defaulted into a product or a program that hurts them, whether it involves extra pounds, impaired health or hidden fees.  

It’s also necessary to ask whether behavioral science is being used to manipulate people.

Most nudges are anything but manipulative. On the contrary, they are explicitly designed to increase transparency and to boost people’s capacity for agency—as, for example, by informing consumers about late fees, health risks or the energy efficiency of household appliances. Many nudges are widely publicized, as in the case of automatic voter registration.

Uses of behavioral science become far more troublesome when they are hidden from their targets, or when their users are exploiting people’s unconscious biases for their own profit.

If companies default customers into expensive services they don’t need, and use jargon or fine print to obscure what they’re doing, there’s a big problem. The same is true if employers enlist behavioral science to trick employees into working longer hours, without giving them a fair opportunity to make that decision on their own.

We are sorely in need of an ethics of nudging. We should start by insisting that in this era of behavioral science, it is more, not less, important for private institutions to treat people with respect.

Here, as elsewhere, government’s regulatory role should be cautious. But consumer and worker protection laws already provide protection against lying and deception. For federal and state regulators, it is time to consider the possibility that egregious forms of manipulation, exploiting people’s behavioral biases to their detriment, belong in the same category. 
