I am experimenting with a new UI where typical profile settings are updated via chat instead of through dedicated frontend components. For example, instead of showing a component that lets users cancel their billing, they can just ask the bot.
I am wondering if it's possible to get an LLM (let's say GPT-3) to generate the GraphQL queries necessary to run these operations. I am thinking I could ingest my GraphQL schema into a vector database like Pinecone, then feed the retrieved context into the LLM so that it can generate the appropriate GraphQL query/mutation.
Is this feasible, or a bad idea?
I have only theorized about this so far. Roughly, the pipeline I have in mind looks like the sketch below.
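This is an untested sketch assuming OpenAI embeddings and the Pinecone Python client; the index name, chunking strategy, and prompt wording are all placeholders, and the index is assumed to already exist with a dimension matching the embedding model (1536 for text-embedding-3-small):

```python
# Untested sketch: retrieve the schema chunks most relevant to the user's
# message from Pinecone, then ask the model to write the GraphQL operation.
from openai import OpenAI
from pinecone import Pinecone

openai_client = OpenAI()                      # reads OPENAI_API_KEY from env
pc = Pinecone(api_key="YOUR_PINECONE_KEY")
index = pc.Index("gql-schema-chunks")         # hypothetical index name

def embed(text: str) -> list[float]:
    resp = openai_client.embeddings.create(
        model="text-embedding-3-small", input=text
    )
    return resp.data[0].embedding

def index_schema(sdl: str) -> None:
    # Naive chunking: one vector per blank-line-separated type definition.
    chunks = [c.strip() for c in sdl.split("\n\n") if c.strip()]
    index.upsert(
        vectors=[
            {"id": f"chunk-{i}", "values": embed(c), "metadata": {"sdl": c}}
            for i, c in enumerate(chunks)
        ]
    )

def generate_operation(user_message: str) -> str:
    # Pull only the schema fragments similar to the request, then prompt.
    hits = index.query(vector=embed(user_message), top_k=5, include_metadata=True)
    context = "\n\n".join(m["metadata"]["sdl"] for m in hits["matches"])
    prompt = (
        "Using ONLY the GraphQL schema fragments below, write the single "
        "query or mutation that satisfies the user's request.\n\n"
        f"Schema fragments:\n{context}\n\nRequest: {user_message}"
    )
    resp = openai_client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

# e.g. generate_operation("Cancel my billing subscription")
```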
2 Answers
This is not an answer, but I'm working on a similar issue. I was wondering if we could use a GraphQL introspection query to get just the fields needed to formulate the query. I've tried using a GraphQL agent, but I find that on more complex schemas it hallucinates non-existent fields even when the schema is provided as context.
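To illustrate the introspection idea, here is an untested sketch assuming a standard HTTP GraphQL endpoint; the keyword-overlap filter is just a placeholder (embeddings, as in the question, would likely work better):

```python
# Untested sketch: fetch the schema via introspection, then keep only the
# types whose names or fields overlap with the user's request.
import requests

INTROSPECTION = """
{
  __schema {
    types {
      name
      fields { name type { name kind ofType { name } } }
    }
  }
}
"""

def fetch_types(endpoint: str) -> list[dict]:
    resp = requests.post(endpoint, json={"query": INTROSPECTION})
    resp.raise_for_status()
    return resp.json()["data"]["__schema"]["types"]

def relevant_types(types: list[dict], user_message: str) -> list[dict]:
    # Placeholder heuristic: keep a type if the request mentions the type
    # name or any of its field names.
    words = set(user_message.lower().split())
    keep = []
    for t in types:
        if t["name"].startswith("__"):       # skip introspection meta-types
            continue
        names = {t["name"].lower()} | {
            f["name"].lower() for f in (t.get("fields") or [])
        }
        if words & names:
            keep.append(t)
    return keep
```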
Did you find a solution? I am also trying to solve a similar problem, and this feels like a good way to fit larger schemas inside a model's context limit.