@plamen9
Building Oracle APEX applications. Writing about tech.
Creating apps built with ❤️ using Oracle APEX. PL/SQL dev. Exploring the AI and ML world. Keen on front-end, visual arts and new technologies. Love travel, sports, photography.
Thanks, CV Ranjith! What you need to do is create a new List of type Navigation Menu from Shared Components / Lists. Then, from Shared Components / Application Definition / User Interface, use the following settings to make the navigation display as tabs. When you open the application on mobile (or in a browser window no wider than 768px), the list items will be displayed at the bottom. In all other cases they will be displayed at the top.
This is only possible if you host your own LLM. In this case I'm running it locally, which allows me to inspect the full REST calls coming from the APEX AI Assistants. This is not possible if you use ChatGPT, Claude or any other public service. You can check this guide on how to set up an open-source LLM locally and use it in APEX - https://blog.apexapplab.dev/apex-in-the-ai-era
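For reference, a locally hosted model can be probed with the same kind of REST call the AI components send. A minimal sketch, assuming Ollama is running on its default port 11434 and a model named llama3 has been pulled (substitute your own model name):

```shell
# Send a chat completion request to a local Ollama instance.
# Ollama exposes an OpenAI-compatible endpoint at /v1/chat/completions.
curl http://127.0.0.1:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "llama3",
        "messages": [
          {"role": "user", "content": "Hello from APEX"}
        ]
      }'
```

Because the service runs on your machine, you can also watch its logs (for example by starting it with OLLAMA_DEBUG=1 ollama serve) to see every request APEX sends.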
Hi Ravi, I'm glad it worked for you. As you have discovered, Ngrok's free accounts have some limitations - they use the default URLs, which are too long. Ngrok also generates a new URL every time you restart the service, so you need to reconfigure any services that use it. But since you have Ollama on OCI and Ngrok is out of the equation, it's all good. Hope it performs well on OCI and the cost is not too high.
The Ngrok setup first. Try adding the host header when starting Ngrok:

```shell
ngrok http http://127.0.0.1:11434 --host-header '127.0.0.1:11434'
```

After starting the tunnel, open the Ngrok URL in the browser and, when a default page is displayed, click the "Visit Site" button. It only needs to be done once.

If the above doesn't work, you can try running it using a .yml file - here is the documentation (https://ngrok.com/docs/guides/site-to-site-apis-mtls/#update-ngrokyml). The file will look like this:

```yaml
version: 2
authtoken: YOUR-AUTH-TOKEN
tunnels:
  ollama:
    proto: http
    addr: http://127.0.0.1:11434/
    host_header: 127.0.0.1:11434
```

Pay attention to the host_header part - it should fix your 403 error.

As to Ollama, run `ollama list` in the Terminal. You will get a list of your models. You can get the model name from there and use it in your APEX AI Services configuration. This should resolve your second problem. When you use the newly configured AI Service in the AI Assistant component, for example, the model is chosen from a drop-down menu, so you can't enter a wrong name.
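Once the tunnel is up, it's worth confirming that requests actually reach Ollama through it before wiring it into APEX. A quick check, assuming a placeholder tunnel URL (replace it with the one Ngrok prints for your session):

```shell
# List the models behind the tunnel via Ollama's /api/tags endpoint.
# A JSON response here means the host header fix worked; a 403 means
# the header is still not being forwarded.
curl https://YOUR-TUNNEL.ngrok-free.app/api/tags
```

The same model names returned here are the ones you can enter in the APEX AI Services configuration.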