
Relying on ChatGPT for companionship? It might be making you lonelier. A new pair of studies from MIT Media Lab and OpenAI found that frequent chatbot users experience more loneliness and emotional dependence.
ChatGPT might be making its most loyal users lonelier.
According to a pair of recent studies from OpenAI and MIT Media Lab, frequent, sustained use of ChatGPT may be linked to higher levels of loneliness.
“Overall, higher daily usage—across all modalities and conversation types—correlated with higher loneliness, dependence, and problematic use, and lower socialization,” the researchers said in an abstract for the two parallel studies.
The studies set out to investigate the extent to which interactions with ChatGPT impacted users’ emotional health, with a focus on the use of the chatbot’s advanced voice mode.
They included a randomized controlled trial (RCT) by MIT, in which 1,000 participants used ChatGPT over four weeks, and an automated analysis of nearly 40 million ChatGPT interactions conducted by OpenAI.
Across the studies, researchers found that those with stronger emotional attachment tendencies tended to experience more loneliness. In contrast, those with a higher level of trust in the chatbot experienced more emotional dependence.
They also suggested that “power users” were most likely to think of the chatbot as a “friend” or consider it to have humanlike emotions.
The studies also found that “personal” conversations with the chatbot were correlated with higher levels of loneliness among users.
“Results showed that while voice-based chatbots initially appeared beneficial in mitigating loneliness and dependence compared with text-based chatbots, these advantages diminished at high usage levels, especially with a neutral-voice chatbot,” the researchers said.
Representatives for OpenAI did not immediately respond to a request for comment, made outside normal working hours.
AI for companionship
ChatGPT has around 400 million weekly active users worldwide, with a growing number turning to the bot for personal advice and companionship.
For some, it has become a popular substitute for therapy, despite warnings from health professionals against this use case. According to a 2024 YouGov survey, just over half of young Americans aged 18 to 29 felt comfortable speaking to an AI about mental health concerns.
Another study even suggests that OpenAI’s chatbot gives better personal advice than professional columnists.
While some users say the bot helps ease loneliness, interactions with AI chatbots have come under increasing scrutiny for their potential negative effects. Companies whose AI products are aimed primarily at companionship, such as Replika and Character.ai, have been the most affected.
Character.ai is currently facing two separate lawsuits concerning interactions with minors, while Replika has drawn the scrutiny of Italian regulators.
The researchers said that the new findings “underscore the complex interplay between chatbot design choices (e.g., voice expressiveness) and user behaviors (e.g., conversation content, usage frequency)” and called for further work to investigate “whether chatbots’ ability to manage emotional content without fostering dependence or replacing human relationships benefits overall well-being.”