I loaded a large dataset last night and GSE used about 58GB of memory. Nothing has been running for almost 24 hours, but “top” shows it still holds 58GB of memory. Will it eventually release this memory by itself? Or do I have to restart everything?
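As a side note, besides “top”, the resident set size can be read directly from /proc on Linux, which is handy for spot checks. A minimal sketch; the process-name pattern "gse" is an assumption and may need adjusting to match the actual TigerGraph process names on your machine:

```shell
# Read a process's resident memory (RSS, in kB) from /proc/<pid>/status.
# Linux-specific; VmRSS is the same figure "top" reports as RES.
rss_kb() {
  awk '/VmRSS/ {print $2}' "/proc/$1/status"
}

# "gse" as the match pattern is an assumption; adjust as needed.
pid=$(pgrep -f gse | head -n 1)
if [ -n "$pid" ]; then
  echo "GSE resident memory: $(rss_kb "$pid") kB"
fi
```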
An interesting question: why would you want it to release the memory?
I might be a bit old-fashioned, but my view is that database servers (of any kind) and in-memory analytical platforms should be running on a dedicated machine (or a cluster of dedicated machines) and have all the resources to themselves (except what the OS needs, of course). So, if they use all the memory and CPU, so much the better: they are maximising in-memory processing/analytics, the stuff we pay them for.
Obviously, this assumes that they use those resources efficiently, but that’s a different story.
So, why would you want TigerGraph to release the memory? Who or what else would need it?
In principle, I agree with you that database servers should be running on dedicated machines.
However, as I mentioned, we are developing a plugin for TG. The development machines are shared among a few developers for various purposes. It would be nice if TG could release some of its unused memory back to the OS.
I am running memory profiling of TG vs dataset sizes at the moment. I could be wrong, but my observation is that TG keeps all databases loaded in memory all the time, regardless of whether they are being used or not. If that is the case, I certainly want a way to “close” graphs I am not actively using, so that the memory can be freed for the graphs I am working on.
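For the profiling itself, one low-tech approach is to sample the resident set size at intervals and correlate the samples with dataset loads. A minimal Linux-only sketch (it reads /proc; the example PID lookup via "gse" is an assumption, since TigerGraph runs several differently named components):

```shell
# Print a timestamped RSS sample for a PID every N seconds until the
# process exits. Redirect the output to a file and compare runs across
# dataset sizes.
sample_rss() {
  pid="$1"
  interval="${2:-5}"    # seconds between samples, default 5
  while kill -0 "$pid" 2>/dev/null; do
    printf '%s %s kB\n' "$(date +%s)" \
      "$(awk '/VmRSS/ {print $2}' "/proc/$pid/status")"
    sleep "$interval"
  done
}

# Example usage (process pattern "gse" is an assumption):
#   sample_rss "$(pgrep -f gse | head -n 1)" 10 > gse_rss.log
```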
Ok, thanks for the clarification. I will check whether there are ways to “close” a graph or release its memory, and, indeed, how memory handling works in general.