I am trying to stream data from a very large Oracle database to MongoDB for archiving. I have implemented this with Spring Data JPA (Hibernate) and MongoDB, but after about 300,000 records I hit:

java.lang.OutOfMemoryError: GC overhead limit exceeded

I have tried various strategies to release the archived objects from memory, but the error is not going away.
This is what I am currently doing:
- I use Akka to create several actors that mine data from the Oracle database.
- Each actor is wired to a singleton DAO for the Oracle database, backed by a stateless Hibernate session (so there is no first-level caching of entities).
- I clear every list after it has been saved to Mongo (roughly the pattern in the sketch below).
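For context, this is roughly the shape of the DAO each actor uses. It is a sketch, not my exact code: `ArchiveRecord`, the `"archive"` collection name, and `BATCH_SIZE` are placeholders.

```java
import java.util.ArrayList;
import java.util.List;

import org.hibernate.ScrollMode;
import org.hibernate.ScrollableResults;
import org.hibernate.SessionFactory;
import org.hibernate.StatelessSession;
import org.springframework.data.mongodb.core.MongoTemplate;

public class ArchiveDao {

    private static final int BATCH_SIZE = 1000; // placeholder value

    private final SessionFactory sessionFactory;
    private final MongoTemplate mongoTemplate;

    public ArchiveDao(SessionFactory sessionFactory, MongoTemplate mongoTemplate) {
        this.sessionFactory = sessionFactory;
        this.mongoTemplate = mongoTemplate;
    }

    public void archive() {
        // StatelessSession bypasses the first-level cache, so Hibernate
        // does not retain entities between rows.
        StatelessSession session = sessionFactory.openStatelessSession();
        try {
            ScrollableResults results = session
                    .createQuery("from ArchiveRecord") // ArchiveRecord is a placeholder entity
                    .setFetchSize(BATCH_SIZE)
                    .scroll(ScrollMode.FORWARD_ONLY);

            List<Object> batch = new ArrayList<>(BATCH_SIZE);
            while (results.next()) {
                batch.add(results.get(0));
                if (batch.size() >= BATCH_SIZE) {
                    mongoTemplate.insert(batch, "archive");
                    batch.clear(); // drop references so the saved objects can be GC'd
                }
            }
            if (!batch.isEmpty()) {
                mongoTemplate.insert(batch, "archive");
                batch.clear();
            }
            results.close();
        } finally {
            session.close();
        }
    }
}
```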
I have tried increasing the memory allocated to the JVM, as well as adding the JVM option -XX:-UseGCOverheadLimit, but that has not helped.
Question: What is the best strategy for handling large volumes of data with Spring Data and MongoDB? How do I prevent the retention of saved objects that is currently hogging my memory? I would prefer not to just keep increasing the heap.
Thanks in advance.