nah, i wouldn't use it like that, and i'd strongly recommend that no one does. there have been various pieces written about AI inducing what is essentially religious/spiritual psychosis in people, some of which have even led to divorce. but, even aside from that, replacing human interaction, companionship, or therapy with a hyper-advanced predictive text chat bot will do nothing but further alienate people, especially under capitalism.
i think using it to talk to someone that's deceased isn't quite as bad as the latter, but is still very bad. understanding and coping with mortality is something that humans have been doing the entire time we've existed, and a key part of the way we process death involves not being able to talk to the deceased person anymore. aside from completely removing that aspect of the experience of losing a loved one, what are the ethical implications of this, and how would one even feasibly "get their loved one into the LLM?" would you have to upload all of their diaries to some cloud-based AI service? would you have to painstakingly write their biography into the prompt field yourself?
i think a lot of people are using "AI" (i hate that it's even called that) for things it should never have been used for. use it as a tool to summarize your emails and make lists, not as a therapist or girlfriend.