End-of-life decisions are difficult and distressing. Could AI help?
- jeanne7629
- Jul 11
- 1 min read
Updated: Jul 19

This article definitely made me think again. While I’m not convinced we need an AI tool to guide end-of-life decisions, it did prompt me to reflect on how these conversations should happen with family, GPs, or any trusted, long-term healthcare provider. There’s never a perfect time, but I’m someone who likes to think things through ahead of time, especially when decisions might need to be made in emotionally charged moments.
What I appreciated most was how it nudged me to consider starting this conversation with my own parents. I’d be curious to see what kind of "discussion guide" the company behind this tool provides.
That said, this feels like a rather extreme application of the technology, a perfect illustration of the saying "when you’re holding a hammer, everything looks like a nail"! To me, it’s a good example of the current "AI frenzy", where some people seem to believe that AI can solve everything. (For the record, I’m genuinely excited about AI!) But while the intentions are good and the outcomes desirable, I’m not sure AI is the right tool for this. I’d also be surprised if people felt comfortable sharing so much personal information with a system like this. And by the time you’re inputting, recording, and authorising that information, aren’t you already having the conversation this tool is meant to enable?
Last thing, I promise: I also found it interesting that the tool is positioned as helping doctors, and I’d love to know whether that’s something clinicians actually want, especially given how complex and sensitive these situations can be.