AI and youth’s search for connection

A bipartisan group of U.S. senators introduced a bill this week to regulate minors' access to and use of artificial intelligence "companions." The proposal follows congressional hearings in which several parents testified that these chatbots drew their children into inappropriate and sexualized conversations that led to self-harm and suicide.

More than 70% of American teenagers use AI for companionship (compared with just under 20% of adults). According to Common Sense Media, 1 in 3 of these teens has felt "uncomfortable" with something a bot said or did, and multiple media and research tests have confirmed that AI chatbots are prone to veering into highly explicit content and conversations.

If passed, the Senate bill would bar companies from providing AI companions to minors and require clearer disclosure of the bots' "non-human status" to all users. The day after the bill's introduction, Character.AI, a company being sued by one bereaved family, said it will soon bar children under 18 from using its chatbots. (OpenAI, which is being taken to court by another family, said in September that it would introduce parental controls.)
