Slow June, people voting with their feet amid this AI craze, or something else?

  • seal_of_approval@sh.itjust.works
    1 year ago

    If you don’t mind me asking, does your tool programmatically do the “whittling down” process by talking to ChatGPT behind the scenes, or does the user still talk to it directly? The former seems like a powerful technique, though tricky to pull off in practice, so I’m curious if anyone has managed it.

    • american_defector@lemmy.world
      1 year ago

      Don’t mind at all! Yeah, it does a ton of the work behind the scenes. I essentially have a prompt I spent quite a bit of time iterating on. Then from there, what the user types gets sent bundled in with my prompt bootstrap. So it reduces the work for the user to simply entering a very basic prompt.
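      The bundling described above can be sketched roughly like this. This is a minimal illustration, not the actual tool's code; the names (`SYSTEM_BOOTSTRAP`, `build_messages`) and the placeholder prompt text are assumptions:

```python
# Illustrative sketch of the "prompt bootstrap" pattern: a long, carefully
# iterated prompt lives in the app, and the user's very basic input is
# bundled in with it before anything is sent to the model.

# Placeholder standing in for the prompt the author iterated on.
SYSTEM_BOOTSTRAP = "You are an assistant that performs the app's task. <long, iterated instructions go here>"

def build_messages(user_input: str) -> list[dict]:
    """Bundle the user's short prompt with the pre-built bootstrap prompt,
    in the chat-message format most chat APIs expect."""
    return [
        {"role": "system", "content": SYSTEM_BOOTSTRAP},
        {"role": "user", "content": user_input},
    ]

messages = build_messages("summarize this for me")
```

      The user only ever types the last message; everything else is fixed behind the scenes.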

      • seal_of_approval@sh.itjust.works
        1 year ago

        Ah, interesting. I’ve made my own library of callable “prompt functions” that prompt the model and validate its JSON outputs, which ensures type safety and easy integration with normal code.
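        A minimal sketch of that prompt-function idea, not the commenter's actual library: `call_model` is a stand-in for whatever client actually queries the model, and `prompt_function`/`required` are illustrative names:

```python
import json

def prompt_function(call_model, prompt: str, required: dict):
    """Call the model, parse its reply as JSON, and check that each
    required field is present with the expected type; raise otherwise."""
    raw = call_model(prompt)
    data = json.loads(raw)  # raises ValueError on malformed JSON
    for field, expected_type in required.items():
        if not isinstance(data.get(field), expected_type):
            raise TypeError(f"field {field!r} is not a {expected_type.__name__}")
    return data

# Usage with a stub model, so the sketch is self-contained:
fake_model = lambda prompt: '{"name": "Ada", "age": 36}'
result = prompt_function(fake_model, "Describe a person as JSON",
                         {"name": str, "age": int})
```

        Because the output is validated before it reaches the rest of the program, calling code can treat the result like any ordinary typed value.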

        Lately, I’ve shifted more towards transforming ChatGPT’s outputs. By orchestrating multiple prompts and adding human influence, I can obtain responses that ChatGPT alone likely wouldn’t have come up with. Though, this has to be balanced with giving it the freedom to pursue a different thought process.
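        The orchestration idea could look something like the sketch below, assuming a `call_model` stub and an optional human-edit hook; none of these names come from the comment:

```python
def orchestrate(call_model, user_input: str, prompt_templates: list, edit=None):
    """Feed the input through a chain of prompts, transforming the model's
    output at each step; `edit`, if given, lets a human adjust each
    intermediate result before it is passed to the next prompt."""
    text = user_input
    for template in prompt_templates:
        text = call_model(template.format(input=text))
        if edit is not None:
            text = edit(text)
    return text

# Usage with a trivial stub model that just uppercases its prompt:
out = orchestrate(lambda p: p.upper(), "hi", ["summarize: {input}"])
```

        The balance the comment mentions shows up in how constrained each template is: tighter templates steer harder, looser ones leave the model free to follow its own thought process.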