We are currently transitioning our app from plain Docker containers to Kubernetes. In the old infrastructure, a single app ran in each container and logged to a file in a directory mounted from the host machine. We had two such machines behind a load balancer, with one instance of the container on each. The logs were then picked up by Filebeat running directly on the machine (not in the container), and all records were sent to a separate machine running an ELK (Elasticsearch, Logstash, Kibana) stack.
We will most probably use DigitalOceanʼs managed Kubernetes service. This means we wonʼt have direct access to the nodes, so running Filebeat on the node itself wonʼt be an option.
I have already looked up some possibilities, but I also wonder: how do other Hashnoders do it? How do you manage logs in a K8s environment?
Aravind
Software Engineer At Hasura, Hashnode Alumnus
DigitalOcean has a pretty good guide demonstrating this; itʼs detailed here:
digitalocean.com/community/tutorials/how-to-set-u…
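The general pattern such guides cover is to run the log shipper as a DaemonSet, so one collector pod runs on every node and reads the container log files the kubelet writes under `/var/log/containers` — no node access required. A minimal sketch of that pattern using Filebeat (the image tag, namespace, and config are illustrative assumptions, not taken from the linked tutorial):

```yaml
# Sketch only: one Filebeat pod per node tails container logs from the
# node's filesystem and ships them onward (e.g. to Elasticsearch).
# Names, namespace, and image version are placeholders.
apiVersion: apps/v1
kind: DaemonSet
metadata:
  name: filebeat
  namespace: kube-system
spec:
  selector:
    matchLabels:
      app: filebeat
  template:
    metadata:
      labels:
        app: filebeat
    spec:
      containers:
        - name: filebeat
          image: docker.elastic.co/beats/filebeat:8.13.4
          args: ["-c", "/etc/filebeat.yml", "-e"]
          volumeMounts:
            # Container logs written by the kubelet/container runtime
            - name: varlogcontainers
              mountPath: /var/log/containers
              readOnly: true
      volumes:
        - name: varlogcontainers
          hostPath:
            path: /var/log/containers
```

A real deployment would also need a ServiceAccount with RBAC permissions and a ConfigMap holding `filebeat.yml` with the output (Elasticsearch/Logstash) configured; the DigitalOcean tutorial walks through the equivalent setup in full.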