People can be forgiven for feeling depressed these days. The future touted by global leaders is dire.
The 2021-2022 United Nations Human Development Report states that living standards have declined in nine out of ten countries around the world since 2020 and that multiple crises “are hitting us at the same time and interacting to create new threats and make us feel insecure, unsafe, and untrusting of each other.”
The solution, according to the U.N., is to “recognize and treat the global mental health crisis that undermines human development and recognize the polarization that is more deeply dividing us, making collective responses harder.”
Echoing the U.N.’s narrative, World Economic Forum (WEF) founder Klaus Schwab stated that the world is facing “unprecedented multiple crises” today. To combat the “global mental health crisis,” the WEF’s UpLink program—a platform to help companies that support the U.N.’s Sustainable Development Goals (SDGs)—presented its remedy for depression and dissent: artificial intelligence (AI).
At the recent WEF Davos summit, a company called Wysa demonstrated its phone app, which uses AI to provide psychological counseling.
“This is a person coming into the app and starting to talk about things that are not necessarily about depression, just about their feelings,” said Jo Aggarwal, Wysa’s CEO for India, displaying an example of an AI text therapy session. “AI is helping this person reframe what they’re thinking, it’s helping them open up,” she said. “People open up to AI three times faster than they do to a human therapist.”
The Wysa app currently has about 5 million users in more than 30 countries, Aggarwal said. “232 people have written us to say that they’re only alive today because they found this app.”
According to Wysa, many companies, including Accenture and Swiss Re, are choosing to use its app, and schools are adopting it as well.
“Teenagers have been our first cohort,” Aggarwal said. “About 30 percent of our users are young people under the age of 25. We do have a cutoff: above 13.”
Numerous trials were used to test and refine the program.
“We built this for three years iteratively,” Aggarwal said, adjusting the program when users raised concerns about it. Some concerns were about the “power differential” created by the app, particularly from younger users, who said: “I don’t want to reframe a negative thought because that’s the only control I have in this situation.”
“Then, we changed how the clinicians told us what else we could say to them,” she said.
Adjusting Children’s Minds
This program coincides with another U.N. effort to adjust children’s minds in favor of the U.N.’s SDGs, called Social and Emotional Learning (SEL). SEL is embedded in the curriculum at most public and private schools throughout the United States and other countries today.
In a report titled “SEL for SDGs: Why Social and Emotional Learning is Necessary to Achieve the Sustainable Development Goals,” the U.N. argues that children are suffering from “cognitive dissonance” that arises when what they see around them conflicts with the progressive ideology presented by their teachers, or when concepts like systemic racism and intersectionality prove to be self-contradictory.
The U.N. says that, for children, “dissonance is unpleasant—the aversive arousal state is because inconsistent cognitions impede effective and unconflicted actions.” In other words, cognitive dissonance allows for questioning of U.N.-approved concepts and may result in children having second thoughts about taking action in support of the SDGs.
“The dual potential of dissonance to undermine development goals by enabling compromise and inactions,” the report states, “necessitates appropriate dissonance management for the attainment of development goals. We posit two specific avenues, emotional resilience and prosocial behavior, for managing dissonance and attainment of the SDGs.”
What the U.N. considers psychological problems are not just giving children headaches; according to the WEF, they are also harming the productivity of “human capital.”
According to a World Health Organization (WHO) report, 12 billion workdays are lost each year due to depression and anxiety, costing the global economy about $1 trillion. The report adds that 15 percent of the world’s workforce has a mental disorder and that among the causes are “bullying and psychological violence (also known as ‘mobbing’).”
According to Wysa, global mental health is deteriorating at an alarming rate: one in eight people suffer from a mental health disorder today, there has been a 25 percent increase in “major depressive disorders,” 42 percent of employees they polled said their mental health has declined recently, and one-third of employees are “suffering from feelings of sadness and depression.”
Pandemic lockdowns, which the WEF supported, appear to be the number one culprit, though declining living standards driven by fuel and food shortages, in the wake of the WEF’s net-zero carbon emissions campaign, are also a key factor.
Risks Around Brain Data
Regarding the pros and cons of AI therapy, a report in Psychology Today says the upside is that patients can get therapy whenever they want and pay less. In addition, “machine learning could lead to the development of new kinds of psychotherapy.”
The downside is that patients may worry that “data from their encounters will be used for marketing, including targeted ads, spying or other nefarious purposes. There might even be concerns that the data might be hacked and even exploited for ransom.”
A WEF report titled “4 ways artificial intelligence is improving mental health therapy” says that one of the ways AI is “helping” is by monitoring patient progress: tracking what it calls “change-talk active” statements uttered by patients, “such as ‘I don’t want to live like this anymore,’” and also “change-talk exploration,” “where the client is reflecting on ways to move forward and make a change.”
“Not hearing such statements during a course of treatment would be a warning sign that the therapy was not working,” the WEF writes. “AI transcripts can also open opportunities to investigate the language used by successful therapists who get their clients to say such statements, to train other therapists in this area.”
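The WEF report does not describe how such transcript monitoring is implemented. As a rough illustration only, a simple keyword-based classifier along the following lines could flag sessions in which no change-talk appears; the pattern lists and function names here are hypothetical, and a production system would presumably use a trained language model rather than keyword matching.

    import re

    # Hypothetical keyword patterns -- the WEF report does not publish any
    # vendor's actual model; these lists are illustrative stand-ins.
    CHANGE_TALK_ACTIVE = [
        r"don'?t want to live like this",
        r"i need to change",
        r"i can'?t go on this way",
    ]
    CHANGE_TALK_EXPLORATION = [
        r"maybe i could",
        r"what if i",
        r"i'?ve been thinking about changing",
    ]

    def label_utterance(text: str) -> str:
        """Label one client utterance from a session transcript."""
        lowered = text.lower()
        if any(re.search(p, lowered) for p in CHANGE_TALK_ACTIVE):
            return "change-talk active"
        if any(re.search(p, lowered) for p in CHANGE_TALK_EXPLORATION):
            return "change-talk exploration"
        return "other"

    def session_lacks_change_talk(transcript: list[str]) -> bool:
        """True when no change-talk appears -- the 'warning sign' the
        WEF report says would suggest the therapy is not working."""
        return all(label_utterance(u) == "other" for u in transcript)

    # Example: check a short transcript for clinician review.
    print(session_lacks_change_talk([
        "I had a rough week.",
        "Maybe I could try the breathing exercise again.",
    ]))  # False: exploration language is present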
Questions from WEF attendees at Wysa’s presentation included whether AI therapy apps could be programmed to suggest certain “values such as service and community,” and whether the app uses “AI emotion recognition algorithms to see the condition of the voice” and assess how distressed a patient might be.
Aggarwal responded that “when we analyze their voice, people began to feel less safe.”
“If we use their data to say, looks like you didn’t sleep very well last night based on their phone, they will start feeling less safe; they would say, ‘Oh, somebody’s tracking me.’ So we took all that cool AI out and gave them what they needed to be able to feel this was private, this was safe.”
Voice recognition programs may be added in the future, however, when it can be done in what the app’s owners consider a “clinically safe way.”
Wysa has worked to create an app that is “truly equitable, so that a person in Sub-Saharan Africa could access it as much as someone working at Goldman Sachs,” Aggarwal said. For some languages, such as French and German, there is a “for-profit track” to use the app; for others, like Hindi, there is a “non-profit track.”
Aggarwal explained that she herself had suffered from depression, and that her inspiration was to create an app that could help others.
“I wanted something that would guide me through how to restructure the negative thoughts, all the evidence-based techniques that I could feel supported,” she said. “So when you think about AI, don’t think about it as another entity, think about it as your own resource to work through things in your own head.”