General Discussion
Suicide hotline shares data with for-profit spinoff, raising ethical questions
Crisis Text Line is one of the world's most prominent mental health support lines, a tech-driven nonprofit that uses big data and artificial intelligence to help people cope with traumas such as self-harm, emotional abuse and thoughts of suicide.
But the data the charity collects from its online text conversations with people in their darkest moments does not end there: The organization's for-profit spinoff uses a sliced and repackaged version of that information to create and market customer service software.
Crisis Text Line says any data it shares with that company, Loris.ai, has been wholly anonymized, stripped of any details that could be used to identify people who contacted the helpline in distress. Both entities say their goal is to improve the world: in Loris' case, by making customer support "more human, empathetic, and scalable."
In turn, Loris has pledged to share some of its revenue with Crisis Text Line. The nonprofit also holds an ownership stake in the company, and the two entities shared the same CEO for at least a year and a half. The two call their relationship a model for how commercial enterprises can help charitable endeavors thrive.
https://www.politico.com/news/2022/01/28/suicide-hotline-silicon-valley-privacy-debates-00002617
Data mining of suicide hotlines is a new all-time low for capitalism in the USA.
Response to yaesu (Original post)
traitorsgalore
This message was self-deleted by its author.
dalton99a
(81,455 posts)
Ethics and privacy experts contacted by POLITICO saw several potential problems with the arrangement.
Some noted that studies of other types of anonymized datasets have shown that it can sometimes be easy to trace the records back to specific individuals, citing past examples involving health records, genetics data and even passengers in New York City taxis.
Others questioned whether the people who text their pleas for help are actually consenting to having their data shared, despite the approximately 50-paragraph disclosure the helpline offers a link to when individuals first reach out.
"The nonprofit may have legal consent, but do they have actual meaningful, emotional, fully understood consent?" asked Jennifer King, the privacy and data policy fellow at the Stanford University Institute for Human-Centered Artificial Intelligence.
jaxexpat
(6,818 posts)
Are not funded by taxes or user fees. If anonymity is maintainable, the partnership may be beneficial for all. Almost every service in the world is a balance between slave wages and kleptocracy. If this sustains a balance for a time, then all may be well.
yaesu
(8,020 posts)
Well, if it's being fueled by corporate money made from a nonprofit enterprise, how in the hell can it be called a charity?
Coventina
(27,104 posts)
Phoenix61
(17,003 posts)
They are using volunteer labor to generate data they sell for a profit. It seems the volunteers were not aware this was happening. Gotta wonder where the money is going.
crickets
(25,963 posts)wackadoo wabbit
(1,166 posts)it'll prove you wrong.