How should I handle processing of a large data set?
My current problem involves approximately 120 million objects in total. Every operation uses around 30-40 million objects from this superset, selected by some filters. We use a Java, MySQL, and Hibernate stack. While processing, we need all of these objects in memory.
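For context, here is a rough back-of-envelope heap estimate for such a working set. The 100-bytes-per-object figure is a hypothetical assumption; the real footprint depends on the entity's fields, object-header and alignment overhead, and any Hibernate proxy or collection overhead on top:

```java
// Rough heap estimate for holding the working set entirely in memory.
// bytesPerObject is an assumed average; a small entity with a few fields
// typically costs on the order of 50-150 bytes on the JVM heap.
public class HeapEstimate {
    static long estimateBytes(long objectCount, long bytesPerObject) {
        return objectCount * bytesPerObject;
    }

    public static void main(String[] args) {
        long objects = 40_000_000L;     // upper end of the stated working set
        long bytesPerObject = 100L;     // assumed average footprint per entity
        long total = estimateBytes(objects, bytesPerObject);
        System.out.printf("~%.1f GB of heap for %d objects%n",
                total / 1e9, objects);
    }
}
```

Under this assumption the working set alone lands in the multi-gigabyte range before GC headroom, which is why answers to questions like this often suggest lighter representations (primitive arrays or plain DTOs instead of fully managed Hibernate entities) when everything must stay resident.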
Please share how you would handle such a problem. Thanks in advance for the help.
Edit: Our solution is time-critical, which is why we handle all operations in memory.