7 comments

  • scottydelta 18 hours ago
    Open WebUI already provides this as a self-hosted web solution.

    One good feature I like is the ability to generate multiple responses from different models and merge them using one default model.
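
    Roughly, it's a fan-out/merge step. A minimal Python sketch of the idea (call_model here is a hypothetical stand-in for whatever provider API you have configured, not a real SDK call):

      def fan_out_and_merge(prompt, models, merge_model, call_model):
          # Ask every connected model the same question.
          drafts = {m: call_model(m, prompt) for m in models}
          # Hand all drafts to one default model to synthesize a single answer.
          merge_prompt = "Merge these answers into one response:\n\n" + "\n\n".join(
              f"[{name}]\n{text}" for name, text in drafts.items()
          )
          return call_model(merge_model, merge_prompt)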

    • natoucs 3 hours ago
      Very nice! Do you still get access to the frontends of the original LLM providers, and do you have to insert API keys?
      • scottydelta 3 hours ago
        You get access to a UI similar to ChatGPT's, and you connect the models you want to use by providing an API key.

        Once configured, you can choose between the models of all the providers you've connected from a dropdown in the chat.

  • benterix 1 day ago
    > Would love feedback from the HN community. What other features would make this more useful?

    A web app

    • scottydelta 3 hours ago
      Check out Open WebUI; it's a self-hostable web app that can connect to different providers and models.
    • natoucs 3 hours ago
      I would if I found a way to keep access to the frontends of each LLM provider while being on the web.
  • sidcool 1 day ago
    Make a web app pls. I'm not going to install a native app from an unknown source.
    • natoucs 3 hours ago
      That makes sense. But you can't access the native frontends if it's a web app.
      • scottydelta 3 hours ago
        Why do you need the native ChatGPT frontend specifically?

        There are apps that provide a similar frontend and use API keys from ChatGPT, Gemini, and others to offer all models under one web interface.

    • mmh0000 1 day ago
      Especially when it's just an Electron app.

      I don’t want to run your webpage in a web browser I have no control over.

      My normal browser has been tediously customized and tailored for my usability.

  • BeetleB 1 day ago
    Just FYI, Open WebUI has this feature built-in.
  • hotgeart 2 days ago
    Does it need an API key, or is it like an 'iframe' of the web version?
    • natoucs 2 days ago
      iframe - you got it - it embeds the web apps
      • nextaccountic 1 day ago
        A better feature would be to select one of the responses as the best one and use it as the context for all LLMs, as if it had been sent by each one.

        But this would require API access instead of embedding web apps
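
        With API access, a minimal sketch of the idea in Python (histories are plain per-model message lists; call_model is a hypothetical provider wrapper):

          def broadcast_round(histories, user_msg, call_model, pick_best):
              # Send the user message to every model with its own running history.
              replies = {}
              for model, history in histories.items():
                  history.append({"role": "user", "content": user_msg})
                  replies[model] = call_model(model, history)
              best = pick_best(replies)  # the user (or a model) picks the winner
              # Every history continues as if that model had written the winning reply.
              for history in histories.values():
                  history.append({"role": "assistant", "content": best})
              return best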

        • natoucs 3 hours ago
          Good idea! And yes, that's the issue.
  • browningstreet 1 day ago
    Ninja Chat offers this.
    • natoucs 3 hours ago
      It doesn't let you keep access to the native frontends.
  • unstatusthequo 1 day ago
    Now you just need to add a judge node that compares the responses, fact-checks them, and outputs the best of the three. Although this raises another issue: which model gets to be the judge?
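
    A minimal sketch of such a judge node in Python (call_model is a hypothetical provider wrapper, and the JSON verdict format is just an assumption):

      import json

      def judge(candidates, judge_model, call_model):
          # One model compares the candidate answers and names the winner.
          prompt = (
              "Compare these answers for factual accuracy and completeness. "
              'Reply with JSON like {"best": 0}.\n\n'
              + "\n\n".join(f"Answer {i}:\n{c}" for i, c in enumerate(candidates))
          )
          verdict = json.loads(call_model(judge_model, prompt))
          return candidates[verdict["best"]]
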
    • theoldgreybeard 1 day ago
      PewDiePie did something like this where all the AIs looked at each other's answers and voted on the highest-quality answer.

      Democracy!

      It worked pretty well until he updated them to know that poorly performing agents would get deleted and replaced. Then they started conspiring against him.

      (19:43 for relevant part) https://youtu.be/qw4fDU18RcU

    • irilesscent 1 day ago
      Make a jury of blind models, each making a case for the best response, and choose a random model to be the judge.
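
      Roughly, in Python (a sketch; call_model is a hypothetical provider wrapper):

        import random

        def blind_jury(prompt, models, call_model):
            judge_model = random.choice(models)  # a random judge each round
            advocates = [m for m in models if m != judge_model]
            # Cases are presented by number only, so the judge is blind.
            cases = [call_model(m, prompt) for m in advocates]
            ballot = ("Pick the best answer; reply with its number only.\n\n"
                      + "\n\n".join(f"{i}: {c}" for i, c in enumerate(cases)))
            return cases[int(call_model(judge_model, ballot).strip())]
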
    • mschulkind 1 day ago
      Just give me a day to vibe-code an interface to judge the judging models side by side...
    • adamisom 1 day ago
      Easy, let them all judge then? You guessed it...