I want to do distributed video encoding for adaptive video streaming, using NodeJS and MPEG Dash! Any tips?

Vinay Bhinde
Dec 19, 2016

I am looking into implementing a video encoding solution for adaptive streaming. At the moment, the best way to do adaptive streaming is the MPEG-DASH streaming standard, as it is the only open standard positioned to be the future of adaptive streaming, and it is supported by major browser vendors and companies.

I am looking into creating a distributed encoding service wherein I can use 3-4 nodes to encode a source file into the required formats and bitrates, and then create an adaptive streaming video out of it. The idea is to divide the source file into smaller-duration chunks, split them among the worker nodes, which would encode those small pieces, and then re-assemble the encoded parts into a complete video.
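Roughly, I'm imagining the split and re-assembly steps looking something like this (an untested sketch from Node, using ffmpeg's segment muxer and concat demuxer; file names and the 10-second duration are just placeholders):

```js
const { execFile } = require('child_process');
const fs = require('fs');

// Split the source into ~10 s pieces without re-encoding.
// Note: with "-c copy" ffmpeg can only cut at keyframes, so the
// actual segment durations will only be approximate.
execFile('ffmpeg', [
  '-i', 'source.mp4',
  '-c', 'copy',
  '-f', 'segment',
  '-segment_time', '10',
  '-reset_timestamps', '1',
  'chunk_%03d.mp4',
], (err) => {
  if (err) throw err;
  console.log('source split into chunks');
});

// Later, once the workers return the encoded pieces, stitch them
// back together with the concat demuxer.
function concatChunks(chunkFiles, output, done) {
  const list = chunkFiles.map((f) => `file '${f}'`).join('\n');
  fs.writeFileSync('concat.txt', list);
  execFile('ffmpeg', [
    '-y', '-f', 'concat', '-safe', '0',
    '-i', 'concat.txt', '-c', 'copy', output,
  ], done);
}
```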

I plan to use Node on the server and ffmpeg to do the video encoding. I know it's one hell of a job, doing distributed video encoding (there are literally companies built on such products), but I do not plan to have a fully scalable service doing tons of video encoding per day, just a small service that can handle my workload of a few hours of video per day.
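For the per-chunk encoding on a worker, I'm picturing something along these lines (again just a sketch; the encodeChunk helper, the resolution, and the bitrate are made up for illustration):

```js
const { spawn } = require('child_process');

// Encode a single chunk to one rendition (height/bitrate are example values).
function encodeChunk(input, output, height, bitrate, done) {
  const ff = spawn('ffmpeg', [
    '-y', '-i', input,
    '-c:v', 'libx264', '-b:v', bitrate,
    '-vf', `scale=-2:${height}`, // keep aspect ratio, width divisible by 2
    '-c:a', 'aac', '-b:a', '128k',
    output,
  ]);

  // ffmpeg writes progress to stderr, which could be reported back to a master node.
  ff.stderr.on('data', (d) => process.stdout.write(d));
  ff.on('close', (code) => done(code === 0 ? null : new Error(`ffmpeg exited ${code}`)));
}

encodeChunk('chunk_000.mp4', 'chunk_000_720p.mp4', 720, '2500k', (err) => {
  if (err) throw err;
  console.log('chunk encoded');
});
```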

I wanted to know if anyone has ever tried to implement such a distributed setup using Node and queues such as ZeroMQ. Any tips on how one should go about designing such a system, and any learnings or gotchas you may have faced while building one, would be appreciated.
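For context, here's the kind of dispatcher/worker split I have in mind with ZeroMQ's PUSH/PULL sockets (an untested sketch, assuming the classic socket API of the zeromq npm package; host names, file names, and job fields are placeholders):

```js
// dispatcher.js – hands out (chunk, rendition) encoding jobs to the workers
const zmq = require('zeromq'); // classic API: zmq.socket / send / on('message')

const push = zmq.socket('push');
push.bindSync('tcp://*:5555');

const chunks = ['chunk_000.mp4', 'chunk_001.mp4', 'chunk_002.mp4'];
const renditions = [
  { height: 720, bitrate: '2500k' },
  { height: 480, bitrate: '1000k' },
];

// PUSH/PULL load-balances jobs round-robin across connected workers.
chunks.forEach((chunk) => {
  renditions.forEach((r) => {
    push.send(JSON.stringify({ chunk, height: r.height, bitrate: r.bitrate }));
  });
});
```

and the worker side:

```js
// worker.js – pulls jobs and runs ffmpeg for each one
const zmq = require('zeromq');
const { execFile } = require('child_process');

const pull = zmq.socket('pull');
pull.connect('tcp://dispatcher-host:5555'); // placeholder host

pull.on('message', (msg) => {
  const job = JSON.parse(msg.toString());
  const out = job.chunk.replace('.mp4', `_${job.height}p.mp4`);
  execFile('ffmpeg', [
    '-y', '-i', job.chunk,
    '-c:v', 'libx264', '-b:v', job.bitrate,
    '-vf', `scale=-2:${job.height}`,
    '-c:a', 'aac', '-b:a', '128k',
    out,
  ], (err) => {
    if (err) console.error('encode failed:', job.chunk, err.message);
    else console.log('encoded', out);
  });
});
```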