duxup 15 hours ago

I got what looked like an AI-powered therapy advertisement on YouTube recently.

It had that strange vibe, as if they were looking for vulnerable people to prey on, almost like gambling ads do.

ktallett 11 hours ago

Until it is regulated and can pass safety tests, this is sensible. For those struggling with psychosis or schizophrenic disorders, AI therapy could be incredibly harmful.

SilverElfin 14 hours ago

I hope this isn’t the start of states banning AI in nonsensical ways when it could be a great way to boost health and reduce healthcare costs. There is so much regulatory capture and broken incentives in the healthcare system.

  • McAlpine5892 10 hours ago

    Is there any evidence that LLMs are safe to deploy as practitioners in healthcare settings? Until there is significant evidence, we shouldn't allow them.

    ---

    My original ramble below:

    This is a Very HN Comment. The problem with healthcare in the US isn't that we don't let Sam Altman administer healthcare via souped-up text prediction machines. It's a disaster precisely because we let these greedy ghouls run the system under the guise of "saving money". In the end, privately insuring only a portion of the population costs significantly more than having the big, inefficient, bureaucratic government provide baseline insurance for everyone.

    The least bad healthcare systems in the world take out a significant amount of the profit motive. Not all regulation is good, but the US refuses to let go of the idea that all regulation is bad.

    If LLMs are to be used in healthcare, they should have an incredibly high bar of evidence to pass, just as doctors need to prove themselves before being certified. Right now, there's no such evidence that I'm aware of. Even with certification, we get bad doctors. What happens when an LLM advises a patient to kill themselves? Probably nothing. Corporations go unpunished in this country. At least bad practitioners are held responsible for their actions.

  • watwut 13 hours ago

    Have it pass the same set of tests as any other medical device.

    Programmers and startup owners constantly claim their two-week product will save the world, but most of the time it does not. That is fine when talking about household light management, but a half-baked AI therapist is as much a quack as any human fraudster.

    The only difference is that human fraudsters can be prosecuted, while companies and startups demand to be above the law.

    • SilverElfin 13 hours ago

      But why should things be locked down in the first place? Why do I need to go through doctors and insurance and all of this for simple diagnostic tests and obviously necessary prescriptions? It's so frustrating, especially having to repeat it every few months. My point is that we are already in a state of regulatory capture, and with this new era of technology we need to abandon that. Maybe not fully, but for many things.

      • watwut 9 minutes ago

        Yeah, no, a completely deregulated medical device and prescription market would be a disaster.

        > And with this new era of technology

        Our current era of technology allows us to generate automated bullshitters. That is fine for some applications, but not for ones where people can actually be harmed.

        Nothing about our current era generates trustworthy systems. And both our business leadership and our elite technical culture are ones where being sociopathic is an advantage. We created this world, but we do not have to pretend it is somehow meant to be helpful.

    • cestith 12 hours ago

      This article isn’t about AI acting as a medical device. It’s about it acting as a mental health practitioner.

      If you’re going to have it pass tests, those should be graduation requirements, clinical training, and licensing exams.

      • watwut 13 minutes ago

        > This article isn’t about AI acting as a medical device. It’s about it acting as a mental health practitioner.

        Which makes it a medical device.

        > If you’re going to have it pass tests, those should be graduation requirements, clinical training, and licensing exams.

        And obviously also loss of the license to practice if it breaks ethical norms or if there are other issues with it.