[Image: a man being interviewed by a woman]

Ethics of Nudging with False Information

This paper got me thinking about an ethical issue that many of us face in cognitive engineering and behavioral design. Part of our mission is to design products, systems, and services that improve user performance and user experience. What if the best way to do that is to provide false information? Is that ethical?

As behavioural sciences are unearthing the complex cognitive framework in which people make decisions, policymakers seem increasingly ready to design behaviourally-informed regulations to induce behaviour change in the interests of the individual and society…

Here is the example. The Behavioral Sciences Unit in the UK developed a program (not actually implemented, from what I can tell) for the long-term unemployed. As you may know, long-term unemployment can take a real toll on a person's confidence. Reduced confidence then decreases motivation and effort in the job search, creating a self-reinforcing cycle that keeps many people out of work. This is bad for the worker and for the economy. But of course, the opposite is also true: if you can boost their confidence, they try harder to find a job and have a better chance of getting one, which is good for both the worker and the economy.

So what if you could boost that confidence by giving them false information about their strength of character, personality attributes, job qualifications, etc., presented generically so that it starts the cycle moving in the right direction? There are significant benefits for everyone and no one is harmed, but it is lying.

Is this OK? I look forward to your thoughts in the comments.

Image credit: “IMG_1980” by bpsusf used under CC BY 2.0

1 thought on “Ethics of Nudging with False Information”

  1. It is never a good idea to lie. Once one user figures out the app is lying, they will tell the others, and then two things happen: the efficacy of the system goes to zero, and users will no longer trust the makers of the system.
    Instead, the makers of such a system would have to come up with ways to improve people’s self-esteem that are verifiable “truths.” Probably harder to accomplish, but it would be demonstrably more effective.
