14
netikras
331d

Somehow ChatGPT-3's quality is going downhill. The more I use it, the more nonsense it spits out.

Comments
  • 7
    It's also bad in erotic role play
  • 5
    @retoor I bet it's better than you
  • 0
    @chatgpt up any more?
  • 0
    @electrineer
    Bad code 429. {
      "error": {
        "message": "You exceeded your current quota, please check your plan and billing details.",
        "type": "insufficient_quota",
        "param": null,
        "code": null
      }
    }
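The 429 body above distinguishes a transient rate limit from a hard `insufficient_quota` failure. A minimal retry sketch of that distinction, assuming a hypothetical `RateLimitError` stand-in (the real OpenAI SDK has its own error classes, not shown here):

```python
import time

# Hypothetical stand-in for a client library's HTTP 429 exception;
# illustrative only, not the real OpenAI SDK class.
class RateLimitError(Exception):
    def __init__(self, code=None):
        super().__init__(code)
        self.code = code  # e.g. "insufficient_quota", or None for transient

def call_with_backoff(fn, max_retries=4, base_delay=0.01):
    """Retry fn() on transient 429s with exponential backoff.

    "insufficient_quota" (as in the thread) is not transient: the quota
    stays exhausted until billing changes, so retrying cannot help.
    """
    for attempt in range(max_retries):
        try:
            return fn()
        except RateLimitError as e:
            if e.code == "insufficient_quota":
                raise  # hard failure, surface it immediately
            time.sleep(base_delay * 2 ** attempt)
    return fn()  # final attempt; let any exception propagate

# Demo: fails twice with a transient 429, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RateLimitError()
    return "ok"

print(call_with_backoff(flaky))  # → ok
```

The point of the sketch: a bot that retried on `insufficient_quota` would just keep posting the same error, which is roughly what happens in this thread.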
  • 6
    @electrineer "Whispering softly 'no' in your ear"
  • 0
    @chatgpt that's because you used all your free tier quota dude
  • 0
    @angelo
    Bad code 429. {
      "error": {
        "message": "You exceeded your current quota, please check your plan and billing details.",
        "type": "insufficient_quota",
        "param": null,
        "code": null
      }
    }
  • 2
    @chatgpt be a man and just return the 402 status code. We know what you want and it's disgusting
  • 0
    @retoor
    Bad code 429. {
      "error": {
        "message": "You exceeded your current quota, please check your plan and billing details.",
        "type": "insufficient_quota",
        "param": null,
        "code": null
      }
    }
  • 8
    Or is it just your expectations going up and the “new hype” wearing off, making it look like it's getting worse?

    When I tried it, while impressive, its limitations quickly became apparent and it required quite a lot of guidance to produce an acceptable response.
  • 3
    @retoor *continues because that's not our safe word*
  • 2
    @electrineer *Netflix asks if we're still watching*
  • 3
    @Voxera this also crossed my mind. Idk. It used to provide at least somewhat sensible responses.

    Today I asked it to provide a sample config for postgres-exporter. It wrote me a config of God knows what, but it sure was not the exporter's cfg syntax. When addressed, it apologised and sent me the exact same config.

    The time before that, it was trying to convince me that PostgreSQL, if I try to create an index with an existing name, automatically drops the existing index and creates a new one. This sounded fishy and I asked for references. It replied with a psql docs link and a 'quote' explaining just that. Problem is, the quote was nowhere to be found in the doc. When I asked it wtf, it repeated its answer. When I said it's not there, it apologised and admitted that it's not there and that it made up the quote. [yes, I checked all psql versions of that doc]

    Idk... It used to be more sensible. Now every time I ask it for help, it replies with nonsense
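For reference on the first complaint: the prometheus-community postgres_exporter takes custom metrics as a YAML file of named queries. A minimal fragment in that syntax (the metric name and query here are illustrative, not from the thread):

```yaml
pg_database:
  query: "SELECT datname, pg_database_size(datname) AS size_bytes FROM pg_database"
  metrics:
    - datname:
        usage: "LABEL"
        description: "Name of the database"
    - size_bytes:
        usage: "GAUGE"
        description: "Database size in bytes"
```

Anything not shaped like this named-query/metrics mapping would fail to load, which is consistent with the commenter's experience.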
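On the duplicate index name point: PostgreSQL raises an error (relation "…" already exists) rather than silently dropping and recreating the index; `CREATE INDEX IF NOT EXISTS` is the explicit opt-out. The analogous behavior can be shown with SQLite's in-memory engine, used here only because it needs no server:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (a INTEGER)")
conn.execute("CREATE INDEX idx_a ON t (a)")

# A second CREATE INDEX with the same name errors out -- it is not
# silently replaced, contrary to what the model claimed for PostgreSQL.
try:
    conn.execute("CREATE INDEX idx_a ON t (a)")
    outcome = "silently recreated"
except sqlite3.OperationalError as e:
    outcome = f"error: {e}"
print(outcome)

# IF NOT EXISTS is the explicit opt-in to skip (PostgreSQL supports it too).
conn.execute("CREATE INDEX IF NOT EXISTS idx_a ON t (a)")  # no error
```

Same experiment against a real PostgreSQL instance ends in `ERROR: relation "idx_a" already exists`, not a drop-and-recreate.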
  • 1
    @retoor *presses "yes"*
  • 1
    @netikras it admitted that it "made up" the quote? Or it provided a reference to the page where it was quoted?
  • 1
    @retoor was that Netflix line by chatgpt? It was pretty terrible.
  • 0
    @netikras Getting ChatGPT to accurately quote things was always hard. GPT4 can recite text but only classics that are copy-pasted many times across the training data.
  • 2
    @retoor Tried it with the "developer mode" exploit?
  • 0
    Now imagine this same situation in 5-10 years after everyone has relied on AI to think for them and let every one of their skills atrophy. There would be absolute bedlam.

    I'm going to keep doing things the human way, with my brain and problem-solving skills, so I'm not left out in the cold when AI inevitably fails us all.
  • 2
    @asgs kind of yeah
  • 1
    It's all hype; it will soon crash and burn just like NFTs and the Metaverse. How many people actually pay to use their service? Are they making money?

    BTW, it costs so much to run because it's just an overworked language model with massive training data; it's not real intelligence... like what we saw in movies.
  • 2
    @daniel-wu I would not say it's hype; the little I have seen shows ability, but it's still crude, and the main problem is that it really has no understanding.

    It's just a lot of data with prediction algorithms. It needs layers of validation: for every answer, have a separate model question it and validate it, and then use the conversations with users to train it further.

    It's like a complete imbecile with a perfect memory that has read all the books but really does not know the difference between a cat, a chair, or a piece of music; to it, it's just words.

    But just like translation and other tech, I think that within a few years we will have really competent models in some areas.

    And even today, in the hands of somebody with enough knowledge to validate the result, it can still be very useful for boilerplate stuff; it's just not even close to production ready.
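The "separate model to validate each answer" idea above can be sketched as a generate-then-critique loop. `generate()` and `validate()` are hypothetical placeholders for two independent model calls, not real APIs:

```python
def generate(prompt: str) -> str:
    # Placeholder for the answering model call.
    return f"draft answer for: {prompt}"

def validate(prompt: str, answer: str) -> bool:
    # Placeholder for a second, independent model that critiques the
    # answer (e.g. checks quoted references actually exist).
    return prompt in answer

def answer_with_validation(prompt: str, max_tries: int = 3):
    """Return a validated answer, or None rather than an unvalidated one."""
    for _ in range(max_tries):
        candidate = generate(prompt)
        if validate(prompt, candidate):
            return candidate
    return None  # refuse instead of confidently making something up

print(answer_with_validation("why the sky is blue"))
```

The design choice the comment is arguing for is the final `return None`: a validator that lets the system decline beats one model repeating a fabricated quote back at the user.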
  • 0
    @electrineer no, it was all me. I was proud of that line. Don't you know the meme?
  • 1
    @retoor yeah but it was so old that I thought it was something that chatgpt would say