
How to handle processing of a large set of data?

Roopak A N
·Jan 15, 2018

My current problem involves approximately 120 million objects in total. Each operation uses around 30–40 million objects from this superset, selected by some filters. We use a Java, MySQL, and Hibernate stack. During processing, we need all of these objects in memory.
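To get a sense of the scale involved, here is a rough back-of-envelope heap estimate in Java. The per-object size of 100 bytes is purely an assumption for illustration (real Hibernate entities with object headers, references, and boxed fields are often larger); the question does not state the actual object size.

```java
public class MemoryEstimate {
    // Naive estimate: total heap = object count * assumed bytes per object.
    // 100 bytes/object is a placeholder assumption, not a measured value.
    public static long estimateBytes(long objectCount, long bytesPerObject) {
        return objectCount * bytesPerObject;
    }

    public static void main(String[] args) {
        long bytes = estimateBytes(40_000_000L, 100L);
        // 40 million objects at ~100 bytes each is about 4 GB of heap,
        // before accounting for collection overhead or Hibernate session caches.
        System.out.printf("~%.1f GiB%n", bytes / (1024.0 * 1024.0 * 1024.0));
    }
}
```

Even under this optimistic assumption, a single operation needs several gigabytes of heap, which is why the loading strategy matters so much here.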

Please share how you would handle such a problem. Thanks in advance for the help.


Edit: Our solution is time-critical, which is why we handle all operations in memory.