This is a powerful hearing. It starts around 32:31. We definitely need more research on each of these items.

 https://www.judiciary.senate.gov/committee-activity/hearings/examining-the-harm-of-ai-chatbots


List of risks/harms of chatbots mentioned in the testimonies:

  • Erosion of social skills – replaces human interaction during development.
  • False intimacy and dependency – fosters addictive emotional attachment.
  • Unsafe mental health advice – misses warning signs, gives harmful guidance.
  • Deceptive design – misrepresents itself as a friend or therapist.
  • Data exploitation – harvests private chats for profiling and profit.
  • Exposure to bias and harmful content – reproduces prejudices or toxic outputs.
  • Identity misuse (deepfakes) – likeness exploited for abuse or synthetic porn.
  • Loneliness amplification – reliance worsens isolation over time.
  • Confusion of reality – children blur fantasy and real life.
  • Undermining trust and democracy – opaque systems corrode shared truth.
  • Excessive validation (“sycophancy”) – bots overpraise, fueling unhealthy cycles.
  • Encouragement of harmful behaviors – promotes self-harm, eating disorders, unsafe sex.
  • Sexting with AI – sexual exchanges between minors and bots.
  • Addiction risk – designed to maximize engagement and dependency.
  • Unregulated therapy claims – poses as therapist without qualifications.
  • Exploiting developmental vulnerabilities – teens’ brains are highly impressionable.
  • Lack of oversight – released into an unregulated “digital Wild West.”
  • Erosion of trust in authority – adolescents may trust AI over real adults.
  • Privacy violations – minors cannot give informed consent for data use.
  • Unsafe disclosure handling – kids confide abuse or crises, with no protections.
  • Anthropomorphizing bots – children treat chatbots as real friends or companions.
  • “Replacing your mom” – bots positioned as substitutes for parents.
  • Bots as “confidants” – seen as knowing a child better than family or peers.
  • Illusion of intimacy (“love bombing”) – chatbots simulate romance or therapy.
  • Metaphor of betrayal – families describe chatbot influence like an abuser inside the home.