
yaesu

(8,020 posts)
Fri Jan 28, 2022, 03:28 PM Jan 2022

Suicide hotline shares data with for-profit spinoff, raising ethical questions

Crisis Text Line is one of the world’s most prominent mental health support lines, a tech-driven nonprofit that uses big data and artificial intelligence to help people cope with traumas such as self-harm, emotional abuse and thoughts of suicide.

But the data the charity collects from its online text conversations with people in their darkest moments does not end there: The organization’s for-profit spinoff uses a sliced and repackaged version of that information to create and market customer service software.

Crisis Text Line says any data it shares with that company, Loris.ai, has been wholly “anonymized,” stripped of any details that could be used to identify people who contacted the helpline in distress. Both entities say their goal is to improve the world — in Loris’ case, by making “customer support more human, empathetic, and scalable.”

In turn, Loris has pledged to share some of its revenue with Crisis Text Line. The nonprofit also holds an ownership stake in the company, and the two entities shared the same CEO for at least a year and a half. The two call their relationship a model for how commercial enterprises can help charitable endeavors thrive.

https://www.politico.com/news/2022/01/28/suicide-hotline-silicon-valley-privacy-debates-00002617

Data mining of suicide hotlines is a new all-time low for capitalism in the USA.

8 replies
Suicide hotline shares data with for-profit spinoff, raising ethical questions (Original Post) yaesu Jan 2022 OP
This message was self-deleted by its author traitorsgalore Jan 2022 #1
Big Tech motto: "If we can monetize you, we WILL" dalton99a Jan 2022 #2
Hot lines and other programs that aid people anonymously.... jaxexpat Jan 2022 #3
model for how commercial enterprises can help charitable endeavors thrive. yaesu Jan 2022 #4
Reminds me of the episode of Dexter with the evil psychiatrist. n/t Coventina Jan 2022 #5
Wrong on so many levels. Phoenix61 Jan 2022 #6
Horrifying. K&R for visibility. crickets Jan 2022 #7
This is just proof that when you think capitalism can't possibly sink any lower . . . wackadoo wabbit Jan 2022 #8

Response to yaesu (Original post)

dalton99a

(81,455 posts)
2. Big Tech motto: "If we can monetize you, we WILL"
Fri Jan 28, 2022, 03:32 PM Jan 2022
For Crisis Text Line, an organization with financial backing from some of Silicon Valley’s biggest players, its control of what it has called “the largest mental health data set in the world” highlights new dimensions of the tech privacy debates roiling Washington: Giant companies like Facebook and Google have built great fortunes based on masses of deeply personal data. But information of equal or greater sensitivity is also in the hands of nonprofit groups that fall outside federal regulations on commercial businesses — with little outside control over where that data ends up.

Ethics and privacy experts contacted by POLITICO saw several potential problems with the arrangement.

Some noted that studies of other types of anonymized datasets have shown that it can sometimes be easy to trace the records back to specific individuals, citing past examples involving health records, genetics data and even passengers in New York City taxis.

Others questioned whether the people who text their pleas for help are actually consenting to having their data shared, despite the approximately 50-paragraph disclosure the helpline offers a link to when individuals first reach out.

The nonprofit “may have legal consent, but do they have actual meaningful, emotional, fully understood consent?” asked Jennifer King, the privacy and data policy fellow at the Stanford University Institute for Human-Centered Artificial Intelligence.

jaxexpat

(6,818 posts)
3. Hot lines and other programs that aid people anonymously....
Fri Jan 28, 2022, 03:36 PM Jan 2022

Are not funded by taxes or user fees. If anonymity is maintainable, the partnership may be beneficial for all. Almost every service in the world is a balance between slave wages and kleptocracy. If this sustains a balance for a time, then all may be well.

yaesu

(8,020 posts)
4. model for how commercial enterprises can help charitable endeavors thrive.
Fri Jan 28, 2022, 03:37 PM Jan 2022

Well, if it's being fueled by corporate money made from a nonprofit enterprise, how in the hell can it be called a charity?

Phoenix61

(17,003 posts)
6. Wrong on so many levels.
Fri Jan 28, 2022, 03:44 PM Jan 2022

They are using volunteer labor to generate data they sell for a profit. It seems the volunteers were not aware this was happening. Gotta wonder where the money is going.

wackadoo wabbit

(1,166 posts)
8. This is just proof that when you think capitalism can't possibly sink any lower . . .
Fri Jan 28, 2022, 07:27 PM Jan 2022

it'll prove you wrong.
