The Ngrok setup first:
Try adding the host_header when starting Ngrok:
ngrok http 127.0.0.1:11434 --host-header '127.0.0.1:11434'
After starting the tunnel, open the Ngrok URL in a browser; if the default interstitial page is displayed, click the "Visit Site" button. This only needs to be done once.
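Once the tunnel is up, you can sanity-check it from a terminal. The URL below is a placeholder - substitute your actual Ngrok forwarding URL. The ngrok-skip-browser-warning header tells Ngrok to skip the interstitial page for non-browser clients:

```
curl -H "ngrok-skip-browser-warning: true" https://YOUR-SUBDOMAIN.ngrok-free.app/api/tags
```

If the tunnel and host_header are correct, this returns the JSON list of your Ollama models instead of a 403.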
If the above doesn't work, you can run Ngrok from a .yml configuration file instead - the documentation is at ngrok.com/docs/guides/site-to-site-apis-mtls.
The file will look like this:
version: 2
authtoken: YOUR-AUTH-TOKEN
tunnels:
  ollama:
    proto: http
    addr: 11434
    host_header: 127.0.0.1:11434
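Assuming the file above is saved as your Ngrok configuration file (the default location varies by OS; ngrok config check prints the path your installation uses), the named tunnel can then be started with:

```
ngrok start ollama
```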
Pay attention to the host_header part - it should fix your 403 error.
As for Ollama, run ollama list in the Terminal to get a list of your installed models. Take the model name from that list and use it in your APEX AI Services configuration; this should resolve your second problem. Note that when you use the newly configured AI Service in, for example, the AI Assistant component, the model is picked from a drop-down menu, so you can't enter a wrong name there.
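If you prefer to check the model names programmatically, Ollama's /api/tags endpoint returns the installed models as JSON (the same information ollama list prints). A minimal sketch of extracting the names; the sample payload below is illustrative, not your actual model list - in practice you would fetch it from http://127.0.0.1:11434/api/tags:

```python
import json

# Illustrative sample of the JSON shape returned by GET /api/tags;
# fetch the real payload with urllib.request or requests.
sample_response = json.dumps({
    "models": [
        {"name": "llama3:latest"},
        {"name": "mistral:latest"},
    ]
})

def model_names(payload: str) -> list[str]:
    """Extract model names from an Ollama /api/tags JSON payload."""
    return [m["name"] for m in json.loads(payload)["models"]]

print(model_names(sample_response))  # -> ['llama3:latest', 'mistral:latest']
```

Any of the returned names is a valid value for the model field in the APEX AI Services configuration.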