Interesting analysis.
GPT-3 is the closest we have come to something like Jarvis from Iron Man, so that should be exciting.
I agree with you. I have been following its development on Twitter, and I must say I am very impressed by its capabilities. Going to dig deeper over the weekend.
Thanks for writing this.
I would recommend checking out "The Subtle Art of Priming" and other works by Carlos if you haven't yet. I find his content truly unique compared to everything else.
I have been hearing a lot about GPT-3 but never really understood its impact; I do now, thanks to your article.
I am glad that this helped you 😊
Great insight! I hope that invite reaches you soon so you can tell us more!
I am hoping for the same. Thanks for your kind words 😊
This is really amazing! Will check more around it.
Thanks for sharing!
I appreciate your kind words 😊
This is really interesting, and the first time I have heard of it. Thank you for sharing.
In a general sense, this simply feels like a smarter scaffolding tool and part of the Low Code movement. My impression of Low Code is that you're stuck with brittle, unmanageable systems, which might be fine for initial rapid prototyping but tend to require devs to actually build the thing from scratch for real-world usage.
I could not agree more. I think this would bring up a new league of no-code tools where you can pick up the code and tweak it anytime. The level of customization they would offer would be unheard of.
Also, the tool could adapt to those customizations and factor them into future development accordingly.
I read somewhere that priming a GPT-3 model is like teaching a child, a child who has the internet in its memory. That is the best analogy, in my opinion.
There are so many applications of this that I want to try out.
Utkarsh Bhimte great insight. In the "teaching a child" analogy, from what you gather, does each instance need to be independently "taught", or is there the idea of some sort of database of knowledge? I could see it going both ways. Teaching from scratch would be cumbersome, but a global database would be susceptible to bias or opinion, e.g. the "Ugly emoji" example from the video. What is "ugly"?
Blaine Garrett I cannot confirm it, as I have not experienced it first-hand, but I think the context can be input as an initial config and then saved as presets.
The bias in GPT-3 is also something that has been debated on Twitter for some time now; there have been some disturbing statements. That is the exact reason why it is in such a closed beta right now.
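To make that idea concrete, here is a minimal sketch of what "context as a saved preset" could look like with the GPT-3-era `openai` Python library. The `PRESETS` dict and the `complete` helper are hypothetical illustrations of the pattern, not an OpenAI feature; only `openai.Completion.create` is the actual API call.

```python
import openai  # GPT-3-era OpenAI Python client (v0.x Completion API)

openai.api_key = "sk-..."  # placeholder; use your own key

# A "preset" here is just a saved priming context: a few examples that
# steer the model before the real input. This dict is our own convention,
# not something built into the API.
PRESETS = {
    "emoji-describer": (
        "Describe each emoji in one word.\n"
        "Emoji: 😊\nWord: happy\n"
        "Emoji: 😢\nWord: sad\n"
    ),
}

def complete(preset: str, user_input: str) -> str:
    """Prepend the saved priming context to the user's input."""
    prompt = PRESETS[preset] + f"Emoji: {user_input}\nWord:"
    response = openai.Completion.create(
        engine="davinci",  # the original GPT-3 model
        prompt=prompt,
        max_tokens=5,
        temperature=0.0,
        stop="\n",
    )
    return response.choices[0].text.strip()

print(complete("emoji-describer", "😡"))  # e.g. "angry"
```

Since the priming text is just a string prefix, saving and reusing it as a preset is trivial, which is why I suspect tools built on GPT-3 will expose exactly this kind of reusable config.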
GPT is a very interesting invention. The notion that AI will replace humans is something I don't buy, at least not anytime soon. Indeed, it will be interesting to witness the future of work with inventions like this.