I'm trying to increase the C stack size in RStudio Server 0.99 on CentOS 6 by editing the /etc/rstudio/rserver.conf file as follows:
rsession-stack-limit-mb=20
But "rstudio-server verify-installation" returns this message:
The option 'rsession-stack-limit-mb' is deprecated and will be discarded.
If I put this setting in /etc/rstudio/rsession.conf instead, I get this message:
unrecognised option 'rsession-stack-limit-mb'
Can someone help me find the right configuration?
Thanks in advance
Diego
I guess you are using the free version of RStudio Server. According to https://github.com/rstudio/rstudio/blob/master/src/cpp/server/ServerOptions.cpp, it seems you need the commercial version if you'd like to manage memory limits in RStudio Server.
Or, you can use the "ulimit" command on CentOS, e.g., "ulimit -s 20000". Then run R from the Linux command line or in batch mode.
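As a minimal sketch (assuming a Bourne-style shell; "my_analysis.R" is a placeholder name, not from the original post), that workaround might look like this:

```shell
# Raise the soft stack limit for this shell session (the value is in KB,
# so 20000 is roughly 20 MB); child processes inherit the new limit.
ulimit -s 20000

# Then run R non-interactively so it picks up the new limit, e.g.:
#   R CMD BATCH my_analysis.R    # "my_analysis.R" is a placeholder script
```

Note that ulimit only affects the current shell and its children, so it must be run in the same session that launches R.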
Related
I'm currently using knife zero to provision the servers, and the commands are executed through Jenkins.
But recently I noticed that the process gets stuck halfway through without any error, always at the point where it starts executing the Java recipe.
Every time the process gets stuck, I have to reboot my system to get it running again.
The Java cookbook I'm using is from the Chef Supermarket:
https://supermarket.chef.io/cookbooks/java/versions/1.50.0
How should I debug this issue?
I would agree with #coderanger that this sounds like a symptom of a process that is waiting for user input before it can continue. Is it possible that you're trying to install Java 7 and haven't included the attribute accept_oracle_download_terms?
From the docs for that cookbook, this is explicitly noted as being required:
To install Oracle Java 7, note that when installing the Oracle JDK, the accept_oracle_download_terms attribute must be set.
After much investigation, I found that this was due to a lack of available RAM on the system while Chef was running.
I use the sudo sh -c 'echo 3 > /proc/sys/vm/drop_caches' command to clear the caches at the start of each run, and my script no longer gets stuck.
http://linuxinsight.com/proc_sys_vm_drop_caches.html
I have a job that needs to run a script on a remote computer. I'm doing so by using psexec via "Execute Windows batch command":
C:\PsExec.exe \\computername -u username -p password -accepteula c:\xxx.exe
When I run the job I get the following error:
c:\PsExec.exe is not recognized as an internal or external command
** PsExec.exe is located under c:\
Any ideas?
First, define the psexec.exe path in the PATH environment variable, or else place the psexec.exe file in C:\Windows\System32\.
PsExec.exe can be downloaded as part of the PSTools bundle:
https://download.sysinternals.com/files/PSTools.zip
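For illustration (C:\PSTools is a hypothetical unpack location, not from the original post), the Jenkins "Execute Windows batch command" step could either extend PATH for the step or call the executable by its full path:

```
REM Option 1: extend PATH for this build step, then call by name
set PATH=%PATH%;C:\PSTools
PsExec.exe \\computername -u username -p password -accepteula c:\xxx.exe

REM Option 2: call the executable by its full path directly
C:\PSTools\PsExec.exe \\computername -u username -p password -accepteula c:\xxx.exe
```

Either way avoids relying on the working directory Jenkins happens to use for the build step.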
One possible explanation is the version of PsExec.exe: 32-bit or 64-bit.
If you have the 32-bit one on a 64-bit machine, that command would indeed not be recognized. PsExec64.exe would.
I can see the age of this question, and my answer may not be relevant since I was technically trying to solve a different problem, but maybe this will help other people who are stuck.
c:\PsExec.exe is not recognized as an internal or external command
I was trying to disable the Maintenance Configurator with PSExec (my problem is the never ending maintenance bug) and kept running into the same error as the OP BUT I got PSexec64 to run this command:
C:\PsExec64.exe -s schtasks /change /tn "\Microsoft\Windows\TaskScheduler\Maintenance Configurator" /DISABLE
by checking the "Run this program as an administrator" option under the Compatibility settings for PsExec64.exe.
I don't know if this has solved my problem yet, but I think the OP would have been able to run his process if he had done this. Dear OP, did you ever solve it?
I have installed the NBIS software on Red Hat Linux in VMware, running as a guest OS under my Windows 7 system.
So far I have only run it on a single image, but now I need to run it over the entire database of 100 images at a time and get the extracted minutiae.
I use the command below:
/NBIS/src/bin/mindtct /NBIS/Test_4.1.0/mindtct/data/5_2.jpg /NBIS/output/5_2.xyt
Can anyone resolve my issue? What command should I use?
You can write a script to loop over all the images in your collection, or better yet, write a C program to wrap the mindtct functions, doing whatever you want to do within your new app. Take a look at the source for the binary mindtct in NBIS, especially the get_minutiae() function.
In the folder with your images you can use a bash script. This is the relevant part from mine: a simple for loop that converts all images with the extension .jp2 to .xyt files.
PHOTOTYPE="*.jp2"
SAVEPATH="path/to/save/folder"
for PIC in $PHOTOTYPE
do
  BASE="${PIC%.jp2}"  # mindtct treats the second argument as an output root and appends its own extensions
  echo "Processing mindtct -m1 $PIC $SAVEPATH/$BASE"
  mindtct -m1 "$PIC" "$SAVEPATH/$BASE"
done
I tried it on a Raspberry Pi running Raspbian:
./mindtct path/file.jpg path/output
and it produced 8 files:
.brw, .dm, .hcm, .lcm, .lfm, .min, .qm, .xyt
My understanding is that mindtct only extracts the minutiae; to compare two fingerprint images you then run the NBIS matcher, bozorth3, on the extracted .xyt files.
While generating a large site using the ToolTwist Controller, the server hangs. Using ps -ef I can see that there is an ImageMagick 'convert' command that never seems to finish. If I kill the convert process, the generate continues.
If I get the full convert command from the log file or using ps, I can run it from the command line with no problem. Each time I run the generate process in the Controller it gets stuck in a different place.
The hangs seem to be sporadic, occurring maybe only once every 1,000 images.
I'm running OSX 10.7.3 on a Macbook Pro.
This is a known bug in ImageMagick - see http://www.imagemagick.org/discourse-server/viewtopic.php?f=3&t=19962
The solution is to define an environment variable:
export MAGICK_THREAD_LIMIT=1
You'll need to do this before starting the Controller's tomcat server.
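A minimal sketch, assuming the Controller runs on a standard Tomcat installation started via the usual scripts (an assumption, since the original post doesn't show the startup procedure):

```shell
# Limit ImageMagick to a single thread so 'convert' cannot deadlock in its
# internal thread pool; exported variables are inherited by child processes.
export MAGICK_THREAD_LIMIT=1

# Then start the Controller's Tomcat as usual so it picks up the variable,
# e.g. (assuming a standard Tomcat layout with CATALINA_HOME set):
#   "$CATALINA_HOME/bin/startup.sh"
```

If Tomcat runs as a service, the export would instead go into its service environment file or setenv.sh, wherever the service's environment is defined.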
When compiling a latex document with 15 or so packages and about five includes, pdflatex throws a "too many open files"-error. All includes are ended with \endinput. Any ideas what might cause the error?
The error seems to depend on how many packages are used (no surprise...); however, this is not the first time I've used this many packages, and I've never encountered such an error before.
#axel_c: This is not about linux. As you may or may not know, LaTeX is also available on windows (which just happens to be what I'm using right now).
Try inserting
\let\mypdfximage\pdfximage
\def\pdfximage{\immediate\mypdfximage}
before \documentclass.
See also these threads from the pdftex mailing list:
Error message: Too many open files.
Too many files open
Type
ulimit -n
to get the maximum number of open files. To change it to e.g. 2048, type
ulimit -S -n 2048
What is this command giving you:
$ ulimit -n
You might want to increase it by editing /etc/security/limits.conf file.
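For illustration (the user name and numbers below are placeholders, not values from the original post), a per-user entry in /etc/security/limits.conf looks like this:

```
# /etc/security/limits.conf
# <domain>   <type>   <item>    <value>
youruser     soft     nofile    4096
youruser     hard     nofile    8192
```

The soft limit is what ulimit -n reports by default; a user can raise their own soft limit up to the hard limit without root.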
This could be caused by a low value in your 'max open file descriptors' kernel configuration. Assuming you're using Linux, you can run this command to find out current limit:
cat /proc/sys/fs/file-max
If the limit is low (say, 1024 or so, which is the default in some Linux distros), you could try raising it by editing /etc/sysctl.conf:
fs.file-max = 65536
Details may differ depending on your Linux distribution, but a quick google search will let you fix it easily.
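A short sketch of both the runtime change and the persistent one (65536 is just the example value from above; the write operations need root, so they are shown as comments):

```shell
# Read the current kernel-wide limit on open file descriptors.
cat /proc/sys/fs/file-max

# Raise it for the running system (requires root, lost on reboot):
#   sysctl -w fs.file-max=65536
# To make it permanent, add "fs.file-max = 65536" to /etc/sysctl.conf
# and reload the settings with:
#   sysctl -p
```

Note this is the system-wide ceiling; the per-process limit shown by ulimit -n is usually the one a single pdflatex run hits first.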