Thanks for this helpful article. How big is the dataset, and how many tables can it cover after fine-tuning?
Thanks for this helpful article about fine-tuning GPT-3 for natural language to SQL conversion. For those interested in using ChatGPT, I recommend reading this article (blog.devart.com/how-to-use-chatgpt-to-write-sql-j…) about how to use ChatGPT to write SQL JOIN queries.
What would you suggest: fine-tuning on a custom dataset, or using a vector DB to pass the required metadata about the tables?
It's very difficult to fine-tune on thousands of tables across multiple databases.
We noticed that LLM hallucination cannot be reduced by fine-tuning alone.
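The vector-DB alternative raised above can be sketched as follows: rather than fine-tuning on every table, store each table's schema description in an index and retrieve only the relevant ones into the prompt at query time. The snippet below is a toy illustration in plain Python using a bag-of-words cosine similarity; in a real system you would use model embeddings and an actual vector database (e.g. FAISS or Pinecone), and the table names and schemas here are made up.

```python
import math
from collections import Counter

# Hypothetical schema descriptions for a few tables (toy example).
TABLE_DOCS = {
    "orders":    "orders table: order_id, customer_id, order_date, total_amount",
    "customers": "customers table: customer_id, name, email, signup_date",
    "products":  "products table: product_id, name, price, category",
}

def embed(text):
    """Toy bag-of-words 'embedding'; a real system would use an embedding model."""
    for ch in ",:_":
        text = text.replace(ch, " ")
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve_tables(question, k=2):
    """Return the k table names whose schemas best match the question."""
    q = embed(question)
    scored = sorted(TABLE_DOCS.items(),
                    key=lambda kv: cosine(q, embed(kv[1])),
                    reverse=True)
    return [name for name, _ in scored[:k]]

def build_prompt(question):
    """Inject only the retrieved schemas into the text-to-SQL prompt."""
    schemas = "\n".join(TABLE_DOCS[t] for t in retrieve_tables(question))
    return f"### Schemas:\n{schemas}\n### Question: {question}\n### SQL:"
```

Because only the retrieved schemas are sent with each request, this scales to thousands of tables without retraining, and grounding the model in the actual column names tends to help with hallucinated fields.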
Do you have code for transfer learning on text-to-SQL for GPT-3, so that it does its best at SQL generation?
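For the legacy GPT-3 fine-tuning workflow, the first step was preparing a JSONL file of prompt/completion pairs. A minimal sketch of that data-preparation step (the example pairs, file name, and separator/stop markers below are conventions and assumptions, not anything from the article):

```python
import json

# Hypothetical text-to-SQL training pairs; a real run would need many more.
examples = [
    {"question": "How many customers signed up in 2022?",
     "sql": "SELECT COUNT(*) FROM customers WHERE YEAR(signup_date) = 2022;"},
    {"question": "List the five most expensive products.",
     "sql": "SELECT name, price FROM products ORDER BY price DESC LIMIT 5;"},
]

def to_jsonl_records(pairs):
    """Convert question/SQL pairs into the prompt/completion shape the
    legacy OpenAI fine-tuning endpoint expected. The '###' separator and
    ' END' stop marker are common conventions, not requirements."""
    return [{"prompt": ex["question"] + "\n\n###\n\n",
             "completion": " " + ex["sql"] + " END"}
            for ex in pairs]

with open("text_to_sql_train.jsonl", "w") as f:
    for rec in to_jsonl_records(examples):
        f.write(json.dumps(rec) + "\n")
```

The resulting file would then be uploaded to the (now legacy) fine-tunes API; at inference time you send only the question plus separator as the prompt and use the stop marker to end generation.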
Hello sir, I am intrigued by your article and understand the topic much better now. However, I wanted to ask: is the task specific to a particular schema?
I see that you are not passing the fields in the prompt, so how is the model picking up which fields to use?
It's really good. It would be great if the author could share the training file he used to train the model.