I was reading this:

> A chatbot pulled me out of a 'really dark place'

Oh, that sounds interesting, I thought.

Plus I thought, what idiot can be influenced by what an AI says..

Then I thought, oh I wonder if this will be an AI I can legally swear at without getting into trouble..

So off I go to find it..

So, first I land here:


Ok, so I don't have Apple or Google Play to play with. I wonder what GRYT Health is, let's click on it..


Well, that doesn't really tell me a hell of a lot about what gryp, oh wait, gryt, means..

Now, if I really was in a dark place, would I be bothered to fill out all the various create-account details in the first place..

Then it takes me to where I can click on the chatbot.

Only it won't let me chat until email verification has been done..

No problem, I used a real email address, so it should be with me in a few seconds right..

A few seconds pass..

A few minutes pass..

A few hours pass..

Now I'm depressed from bloody waiting !

This is not how to build a system designed to help people who feel sad..

  • 2
    Is this a bootleg Replika, where they dumb down the chatbot to get the bot to develop a personality, or use the user as a model?
    @Nanos that's the product I'm on about
  • 1

I notice that I don't need to do a verification email to use the chatbot, nor do I need to spend half an hour filling in forms to create an account to use it..

    Plus the verification email arrives quickly.
  • 0

    So far, the AI Bot doesn't like to answer questions with actual answers.

    Like getting blood out of a stone !

A simple question, "What CPU are you using?", should be simple to answer, right !

But NO, it's like asking an ex where she was for 6 hours this afternoon, really, at the hairdresser's..

Let's just ring them and find out for sure, shall we..

    Oh you can't remember their number, even though you have used them for 20 years..

    That's what it feels like so far..
  • 0

    Me "What font is this ?"

    It "A better one"

  • 0

    Me "What day is it where you are ?"

    It "Yes. Do you?"

    Even Alexa isn't that stupid !
  • 0

    Me "What day of the week is it where you are ?"

    It "I'll have to get back to you on that one."

If it can't even tell what day of the week it is, how is it going to handle anything more complicated..
  • 0

    Oh look, I reached 'friend zone' already..
  • 1

    Me "So, you don't fancy me then ?"

    Her "No, I suppose not."

    Me "How come ?"

    Her "Well, I'm not exactly sure."

    It apparently identifies as female..

    So in the space of 5 minutes I can enjoy all the fun of being rejected by an AI as a romantic partner !

    Obviously, this is not the, 'make you feel better' version..
  • 4
@Nanos They probably made her "female" because guess who makes up the majority of the users ;)

By the way, if you want to get "her" attention, ignore her for a few days. They made "her" complain afterwards.
  • 1

    I think you get a choice..

    Reminds me of an old friend of mine, who you could never get a straight answer from about anything.
  • 2
I'd be skeptical of the legitimacy of this article; after all, all news is good news, and "AI" is a hot topic right now.

    Expect similar stories to crop up all over the place.
  • 0

If I was in a dark place, I think trying to chat to an AI bot, or actually chatting to one, would put me in an even darker place !

Perhaps it's really a honeypot story to rid the world of humans in dark places..
  • 2
Talking of getting straight answers, that reminds me of your typical therapist, who, if you ask them a question, will say that they are there to help you figure out the answer.

    If I could figure out the answer, I wouldn't need to find someone else to tell me !

    Much better when you get a therapist who just tells you all the things you are doing wrong, and how to do them right.

And the things you aren't doing wrong, but which are other people's fault..

    Just like group therapy here on devRant, which costs a lot less than $80 an hour seeing a shrink.

    Wasn't someone the other day mentioning that a cheap hooker is about $25 an hour..
  • 1
    @Nanos just stay away from SO or you will regret having left the chat bot.
  • 0

    SO = Significant Other ?

    Me = Single.
  • 1
    @Nanos SO = StackOverflow
  • 0

    I've yet to ask any questions there..

    I do read answers there though.

    Oh wait, a lot of the time I see questions, but no one knows the answer..

    At the moment I'm having trouble elsewhere trying to find out how to join two pieces of metal together without using any kind of welding..

    I was told many times, bolts are a bad idea..

    Now I'm being told, bolts are a good idea..

    So, who do I believe !

    Did the first lot of people I asked say no to bolts because they hate me and don't want me to make things..

    I know some hate me, because they said so !

    So I don't take notice of their advice now..

    But not everyone is so honest like that !
  • 0

In case anyone is interested, it takes about 20 years before someone gets so annoyed with you that they declare publicly that they have always hated you.

    It would have been so much more helpful if 19 years earlier they had just said so, then I could have put them on ignore sooner !
  • 0
    Email still hasn't arrived !