Out of memory error in NetBeans Java project

I am using the NetBeans IDE for a project related to Natural Language Processing. When I run the project it fails with an "out of memory" error. My training data (input file) is 34 MB. I increased the heap size in netbeans.conf and in Project -> Properties -> Run -> VM arguments to 1024M, but it still shows the same error. My machine has 1 GB of RAM. I tried different VM argument values such as 1024m and 768m, but none of them worked.
My project uses JNI code: C functions are called from the Java programs. Please give suggestions on how to solve this problem.

NetBeans comes with a profiler that can help you find memory leaks in your application. Here is an article about how to use it: http://netbeans.org/kb/articles/nb-profiler-uncoveringleaks_pt1.html
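As a quick sanity check (a minimal sketch, not part of the original answer), you can print the heap limits the running JVM actually received; if the edit to netbeans.conf or the project's VM arguments did not take effect, maxMemory() will still show the default. Also keep in mind that memory allocated by the JNI/C side is native memory outside the Java heap, so raising -Xmx will not help if the native code is what runs out of memory.

// Minimal sketch: report the heap limits this JVM was actually started with.
public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        System.out.printf("max=%d MB, committed=%d MB, free=%d MB%n",
                rt.maxMemory() / (1024 * 1024),    // upper bound set by -Xmx
                rt.totalMemory() / (1024 * 1024),  // heap currently committed
                rt.freeMemory() / (1024 * 1024));  // free space within the committed heap
    }
}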

Related

Rascal MPL how to increase heap size?

I just stumbled upon this old post that mentions Java heap space and how to change its parameters in Eclipse (in the eclipse.ini file). How do I set these in the new VSCode environment?
The Eclipse environment is quite different from the VSCode environment. In Eclipse everything runs in a single JVM, so it was difficult for plugin writers to programmatically increase the heap size; this led to writing manual pages on the topic.
In VSCode we have a JVM for every Rascal process:
for every terminal REPL there is one JVM
for the Rascal LSP server there is one JVM
for the generic DSL-parametrized LSP server there is one JVM
there is one JVM for every deployed DSL server written in Rascal
And the extension code starts these JVMs, so we can control how much memory they receive. The latest release does this by assessing the total available memory on your machine and allocating a sumptuous amount for every process.
And so there is no longer a configuration option for the user, as we had to add for the Eclipse situation.
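The thread does not show the extension's actual sizing code; as a purely illustrative sketch of the idea (the class name and the quarter-of-RAM policy are assumptions, not the real rascal-language extension logic), a launcher could derive an -Xmx value from the machine's physical memory like this:

import com.sun.management.OperatingSystemMXBean;
import java.lang.management.ManagementFactory;

// Illustrative sketch only: pick a per-process heap as a fraction of physical RAM,
// clamped between a floor and a cap, and emit the flag a launcher would pass to the child JVM.
public class HeapSizer {
    static long pickHeapMb() {
        OperatingSystemMXBean os =
                (OperatingSystemMXBean) ManagementFactory.getOperatingSystemMXBean();
        long totalMb = os.getTotalPhysicalMemorySize() / (1024 * 1024);
        long candidate = totalMb / 4;                    // assumed policy: a quarter of RAM per JVM
        return Math.max(512, Math.min(candidate, 8192)); // clamp between 512 MB and 8 GB
    }

    public static void main(String[] args) {
        System.out.println("-Xmx" + pickHeapMb() + "m");
    }
}

On recent JDKs getTotalPhysicalMemorySize() is deprecated in favor of getTotalMemorySize(); either reports the machine's physical RAM.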

JDK 11 (and newer) DirectByteBuffer holds large off-heap memory even at startup

Our application uses a lot of DirectByteBuffer objects via NIO's FileChannel.map() and ByteBuffer.allocateDirect() to load and process files (e.g. DICOM). The code is written for Java 8 but compiled on Java 11.0.3. We profiled the application using both JMC 7.x and JxRay (which reports DirectByteBuffer memory specifically). JxRay reported that our application used a large amount of DirectByteBuffer (off-heap) memory, around 140 MB, even at startup, which is pretty unusual. Specifically, the JxRay report points to a jdk.internal.jimage.ImageReader$SharedImageReader object holding this large amount of memory.
So I created a small hello-world program without any reference to direct buffer classes or objects, and JxRay reported an almost identical result, which puzzles me. I contacted the JxRay team and they told me that on JDK 11 jdk.internal.jimage.ImageReader$SharedImageReader could have been initialized and allocated that much memory. JxRay does not report this issue on JDK 1.8, and they also said the heap dump format did not change between the JDK versions (8 vs 11). I'm posting this question in case somebody has encountered this issue or has knowledge of it.
Thanks
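No answer is quoted here; as a diagnostic aid (a minimal sketch, not from the original question), the JDK's buffer-pool MXBeans report how much direct and mapped memory the running JVM has allocated, which helps separate your own allocateDirect()/map() usage from whatever the runtime itself (for example the jimage reader) holds:

import java.lang.management.BufferPoolMXBean;
import java.lang.management.ManagementFactory;

// Minimal sketch: print the JVM's own accounting of its direct and mapped buffer pools.
public class BufferPools {
    public static void main(String[] args) {
        for (BufferPoolMXBean pool : ManagementFactory.getPlatformMXBeans(BufferPoolMXBean.class)) {
            // Typical pool names are "direct" and "mapped".
            System.out.printf("%-7s count=%d used=%d MB capacity=%d MB%n",
                    pool.getName(),
                    pool.getCount(),
                    pool.getMemoryUsed() / (1024 * 1024),
                    pool.getTotalCapacity() / (1024 * 1024));
        }
    }
}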

Optimal CLion VM memory settings for very large projects

I'm currently working on a fork of a VERY LARGE project with about 7-8 million lines of code and 100,000+ classes. The problem is, of course, that the indexer, or CLion in general, runs out of memory or becomes very slow and unresponsive.
I have already seen the blog entry https://blog.jetbrains.com/idea/2006/04/configuring-intellij-idea-vm-options/ which describes some memory settings, but it does not seem to fit my project setup.
My .vmoptions file looks like this:
-Xss20m
-Xms2560m
-Xmx20000m
-XX:NewSize=1280m
-XX:MaxNewSize=1280m
-XX:ReservedCodeCacheSize=2048m
-XX:+UseConcMarkSweepGC
-XX:SoftRefLRUPolicyMSPerMB=500
-ea
-Dsun.io.useCanonCaches=false
-Djava.net.preferIPv4Stack=true
-XX:+HeapDumpOnOutOfMemoryError
-XX:-OmitStackTraceInFastThrow
-Dawt.useSystemAAFontSettings=lcd
-Dsun.java2d.renderer=sun.java2d.marlin.MarlinRenderingEngine
I'm working on a machine with 128 GB of main memory and a 28-core Intel Xeon CPU, so resources should not be the problem.
Do you have any recommendations for the optimal memory settings?
I wrote a mail to JetBrains support and this was the answer:
The possibility to change how many cores should be used in CLion hasn't been implemented yet; we have a related feature request: https://youtrack.jetbrains.com/issue/CPP-3370. Please comment or upvote. Could you please capture a CPU snapshot so we can take a look at what is going on?
So it would be great if anybody who wants this feature would +1 it on JetBrains' YouTrack.
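One check worth doing before tuning further (a small sketch, not part of the thread): confirm which VM options the JVM actually picked up, since JetBrains IDEs may read a user-level copy of the .vmoptions file rather than the one next to the launcher. A plain Java process started with the same options can list them like this; an IDE typically also logs its JVM arguments at startup.

import java.lang.management.ManagementFactory;

// Minimal sketch: list the flags this JVM was started with, e.g. -Xmx, GC and code cache settings.
public class JvmArgs {
    public static void main(String[] args) {
        ManagementFactory.getRuntimeMXBean().getInputArguments().forEach(System.out::println);
    }
}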

Groovy: passing PermSize to the Groovy compiler from IntelliJ IDEA

I have a big Grails project (five modules + 2 custom, not pre-compiled plugins).
At first, compilation failed with an out-of-memory error in javac. I added these params:
-J-Xmx1024m -J-Xms512m -J-XX:MaxPermSize=2048m
Great, the first OOM is fixed.
Next, I get an OOM in the Groovy compiler. How do I pass memory params from IDEA to groovyc? Also, my application should work in non-forking mode.
You don't set the groovyc memory directly.
You should set/increase the memory settings of IntelliJ IDEA itself.
The documentation can be found here: https://www.jetbrains.com/idea/help/increasing-memory-heap.html
In general, in your IntelliJ installation's bin folder there are 2 files of interest: idea.exe.vmoptions and idea64.exe.vmoptions. You would adjust the settings in the file that matches the idea.exe or idea64.exe you use to launch IntelliJ.
Within that file you can adjust the heap memory for IntelliJ as a whole. My guess is that you are running out of perm gen space and should increase MaxPermSize. The settings would look similar to this (I have 16 GB of memory on my PC):
-Xms750m
-Xmx2g
-XX:MaxPermSize=350m
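If you want to confirm that it really is the permanent generation that fills up (a diagnostic sketch, not from the original answer; on Java 8+ the corresponding pool is Metaspace and MaxPermSize is ignored), the memory-pool MXBeans report per-pool usage:

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryPoolMXBean;

// Minimal sketch: print usage per memory pool; look for a "Perm Gen" pool (pre-Java-8)
// or "Metaspace" (Java 8+) to see whether class metadata is the pool being exhausted.
public class PoolUsage {
    public static void main(String[] args) {
        for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
            long usedMb = pool.getUsage().getUsed() / (1024 * 1024);
            long max = pool.getUsage().getMax(); // -1 means no limit is set
            System.out.printf("%-25s used=%d MB max=%s%n",
                    pool.getName(), usedMb,
                    max < 0 ? "unlimited" : (max / (1024 * 1024)) + " MB");
        }
    }
}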

Basics of Jmapping?

I've done some searching but couldn't find much really helpful info, so could someone explain the basics of Java memory maps? Where/how to use it, its purpose, and maybe some syntax examples (input/output types)? I'm taking a Java test soon and this could be one of the topics, but jmap has not come up in any of my tutorials. Thanks in advance.
Edit: I'm referring to the tool: jmap
I would read the man page you have referenced.
jmap prints shared object memory maps or heap memory details of a given process or core file or a remote debug server.
NOTE: This utility is unsupported and may or may not be available in future versions of the JDK. In Windows Systems where dbgeng.dll is not present, 'Debugging Tools For Windows' needs to be installed to have these tools working. Also, PATH environment variable should contain the location of jvm.dll used by the target process or the location from which the Crash Dump file was produced.
http://docs.oracle.com/javase/7/docs/technotes/tools/share/jmap.html
It's not a tool to be played with lightly. You need a good profiler which can read its output, as jhat is only useful for trivial programs. (YourKit works just fine for 1+ GB heaps.)
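For context (a sketch, not part of the original answer): the most common use of jmap is producing a heap dump, for example jmap -dump:live,format=b,file=heap.hprof <pid>, which a profiler then opens. The same kind of dump can also be triggered from inside the JVM via the HotSpot diagnostic MXBean, which is handy when attaching to the process from outside is not possible:

import com.sun.management.HotSpotDiagnosticMXBean;
import java.lang.management.ManagementFactory;

// Sketch: write a heap dump of the current JVM, comparable to an external `jmap -dump`.
public class HeapDump {
    public static void main(String[] args) throws Exception {
        HotSpotDiagnosticMXBean diag =
                ManagementFactory.getPlatformMXBean(HotSpotDiagnosticMXBean.class);
        // Second argument: true = dump only live (reachable) objects.
        diag.dumpHeap("heap.hprof", true);
        System.out.println("Wrote heap.hprof - open it in a profiler such as YourKit or Eclipse MAT");
    }
}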
