High quality content in here!
Thank you! 💛
right
I can't understand why you'd pay twice: once to use OpenAI and again for Bedrock. Why not set up this code locally or on a regular host and just call the OpenAI API?
I'm really trying to understand this.
Hi Alan! With the AWS free tier, I’d argue that it’s basically impossible to get charged even $0.01 if it’s only used by yourself 😊 (except for the costs by the LLMs in/output tokens).
Also, it’s mainly a learning project. It should demonstrate how easily you could start building a chat bot that’s deployed in the cloud 📚
I see. Currently, I've built my chatbot (Flask + OpenAI API) to respond to users' questions about the company I work for. It's already functioning in a testing phase. I'm using RAG to analyze company documents. For each question/response, I'm averaging $0.0006 using gpt-3.5-turbo. I expect this application to handle around 20,000 interactions per month, costing approximately $12 per month. I believe this is a very low cost. Do you think using Bedrock could be cheaper? Oh, I forgot to congratulate you on the post.
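As a quick sanity check of the figures in this comment, here is a minimal back-of-envelope sketch (the per-interaction cost and monthly volume are the commenter's own estimates, not measured values):

```python
# Rough monthly cost estimate from the comment above.
# Assumed inputs: ~$0.0006 per question/response with gpt-3.5-turbo,
# ~20,000 interactions per month.
cost_per_interaction = 0.0006   # USD, average per question/response
interactions_per_month = 20_000

monthly_cost = cost_per_interaction * interactions_per_month
print(f"Estimated monthly cost: ${monthly_cost:.2f}")  # → $12.00
```

This confirms the ~$12/month figure; actual spend would still vary with prompt and completion token counts, which is what the provider bills on.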