Optimizing command-line GIMP

I am running a Script-Fu macro using GIMP from the command line. However, it is quite slow to start up and run - about 20-25 seconds. I think a lot of this time is spent on startup - loading all the plugins and such. What are some ways to optimize GIMP on the command line? Is there any way to keep it always running?

Some promising options from the GIMP docs (some of which you may already be using):
--no-interface: Run without a user interface.
--no-data: Do not load patterns, gradients, palettes, or brushes. Often useful in non-interactive situations where start-up time is to be minimized.
--no-fonts: Do not load any fonts. This is useful to load GIMP faster for scripts that do not use fonts, or to find problems related to malformed fonts that hang GIMP.
--no-splash: Do not show the splash screen while starting.
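Put together, a headless batch run can look like this (a minimal sketch; the Script-Fu call is a placeholder for your own macro):

# skip the UI, data files, fonts, and splash; run the macro, then quit
gimp --no-interface --no-data --no-fonts --no-splash \
     --batch '(my-script-fu-macro "input.png" "output.png")' \
     --batch '(gimp-quit 0)'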
The GIMP FAQ:
The GIMP takes too long to load - how can I speed it up?
The main things are to make sure you are running at least version 1.0, and make sure you compiled with optimization on, debugging turned off, and the shared memory and X shared memory options turned on.
Or, buy a faster system with more memory. 8^)
This question on SuperUser addresses slow GIMP startup time in general and recommends:
Rebuild the font cache file by deleting C:\Documents and Settings\<username>\.fonts-cache1 and then opening GIMP.
Check for slow-loading plugins by starting up with --verbose and seeing where it hangs. Then remove problematic plugins by renaming them in C:\Program Files\GIMP-2.0\lib\gimp\<version>\plug-ins. Alternatively, remove all plugins by renaming the whole plug-ins folder.
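For example, the plugin check is just a matter of watching the startup log scroll by; the last plugin named before a long pause is the likely culprit:

gimp --verbose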

Not so much a solution as a different possibility for the future, but have you considered not using GIMP?
GIMP is first and foremost a GUI-based app. If you're doing a lot of repetitive image manipulation from the command line, you might be better off with a tool like ImageMagick that's designed expressly for such use. I don't know how complex your script-fu scripts are, or how easily they could be translated to ImageMagick's (admittedly complex) syntax, but you definitely wouldn't have problems with long startup time.
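For instance, a resize-and-convert pass that would need a Script-Fu macro in GIMP is a one-liner here (filenames and values are illustrative):

# convert writes a new file, leaving the original untouched
convert input.png -resize 800x600 -quality 90 output.jpg
# mogrify applies the same operation to many files, in place
mogrify -resize 50% -unsharp 0x1 *.png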

You could use the "Script-Fu Server".
Image window > Main menu > Filters > Script-Fu > Start Server.
You will be given a popup asking for the port to run it on. The same popup also provides "help", which describes the protocol the server uses.
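A hedged sketch of using it from a shell: start the server headless once, then send scripts over the socket without paying the startup cost again. The wire format (the byte 'G', a two-byte big-endian length, then the Script-Fu text) is the one the help popup describes; the plug-in-script-fu-server arguments shown here match GIMP 2.8 and may differ in other versions:

# start GIMP once, headless, with the Script-Fu server on port 10008
gimp -i -b '(plug-in-script-fu-server RUN-NONINTERACTIVE "127.0.0.1" 10008 "/tmp/script-fu.log")' &

# send a script to the running server: 'G', two length bytes, then the text
# (requires a printf that emits \xNN escapes, e.g. bash's builtin)
script='(gimp-version)'
len=${#script}
printf "G\x$(printf '%02x' $((len / 256)))\x$(printf '%02x' $((len % 256)))%s" "$script" | nc 127.0.0.1 10008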

Related

Can a LyX/LaTeX file be too large to create a PDF?

I've been working on my bachelor's thesis in LyX for about a month without encountering any problems, and today, all of a sudden, when creating a PDF, LyX just loads indefinitely and even asks me at some point if I want to stop the PDF creation since it is taking such a long time. Am I doing something wrong? I have about 100 pages, and the PDFs I created lately have been around 100 MB, since they hold a lot of very high-res images.
In case anyone is struggling with the "convert" functionality in LyX, here is some additional info:
Initially I struggled to get EPS files to load and display on screen, as well as to export to PDF. I saw that the latest LyX install already had all the "convert blah-blah $$i $$o" commands predefined, and it was still not working.
Here is what worked for me:
sudo mv /etc/ImageMagick-6/policy.xml /etc/ImageMagick-6/policy.xmlout
There are two parts here:
a) ImageMagick needs to be installed on the machine, as it provides most of the converters. The following terminal command checks whether ImageMagick is installed on your system:
identify -version
b) ImageMagick tools must be "allowed" to run, "convert" being one of them. You need to relax some default security policies for that; that is what the renaming of the policy file above does. Detailed information is given in an answer to this question on the Ubuntu forum.
Note - this security policy relaxation is not recommended for web-server machines; only desktop users should take the risk.
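If renaming the whole file feels too blunt, a narrower alternative (a sketch assuming the stock Ubuntu ImageMagick 6 policy.xml; check the file's contents first) is to relax only the EPS and PDF entries:

# re-enable just the EPS and PDF coders instead of dropping all policies
sudo sed -i 's/rights="none" pattern="EPS"/rights="read|write" pattern="EPS"/; s/rights="none" pattern="PDF"/rights="read|write" pattern="PDF"/' /etc/ImageMagick-6/policy.xml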

Atom runs slowly when editing a file over 500 lines

When I use Atom to edit JavaScript files, there are performance issues: if the script is longer than some number of lines, e.g. 500, scrolling the file or moving the cursor gets stuck. It shouldn't be a hardware problem, and 500 lines is also not a big amount. Is there something I can do to make Atom run smoothly when I edit a big file? Thanks.
As you can read in this article, this is an ongoing issue with Atom and is currently being dealt with by the team. I don't believe it has anything to do with computer performance.
I currently run an i7 machine and, when opening large (typically minified) files, the editor runs extremely slowly and, in some instances, crashes completely.
Hopefully we can see a resolution soon.
Finally, I found that the problem occurs because of a plugin, linter-jscs. 500 lines is not a big amount; after disabling this plugin, editing works properly again.
Have you considered the possibility that your machine may just be slow?
I understand this doesn't directly address your question, but if you're not bound to Atom you could experiment with other text editors. I personally recommend Visual Studio Code. Have a look:
https://code.visualstudio.com/download
Although you've posted a solution, it may be worth considering a package such as Timecop, which displays information about where time is spent while Atom loads. You can also check similar information in the Settings > Packages view, which lists how much time each installed package adds to the startup time (see the Flight Manual section on packages).
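Both steps can also be done from the command line with apm, Atom's package manager (linter-jscs is the package named in the answer above):

# profile where startup time goes
apm install timecop
# disable a slow package without uninstalling it
apm disable linter-jscs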

"Not enough space for environment" appears when executing an ".exe" file

I am trying to use an application called CLUT.exe, an old MS-DOS application that can be used to reindex NTX files for DBF databases.
(This is not the main topic; I am just writing this in case someone wants to test the app and doesn't trust the content.)
The problem starts when trying to run the command line version through console (cmd.exe) and this error appears:
C:\>CLUT.exe [arg1] [arg2] [arg3]
run-time error R6009
- not enough space for environment
So, according to what I've searched, this could be a possible solution:
http://support.microsoft.com/default.aspx?scid=kb;en-us;230205
but it doesn't work, and every alternative I found to solve this on the internet is the same.
Another alternative could be to right-click the .exe file, go to Properties, then the Memory tab, and increase the Initial environment memory from Auto to the max value, but that doesn't work either.
Well, I am stuck, and no "possible" solution is working for me. If someone is interested, knows more about this issue, and wants to test, you can download the application from here (click the green "Free Download" button):
http://www.filebasket.com/free/Development-Clipper-programming-language/clut-exe/13996.html
or directly from my DropBox:
https://dl.dropbox.com/u/15208254/stackoverflow/clut_214.rar
Just so you know, I am using Windows 7, and CLUT.exe is a Clipper-based app (an old programming language) that runs under the Windows console (cmd.exe).
Wikipedia does mention other DOS emulators but, oddly, doesn't mention Bochs.
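As a sketch of the emulator route, DOSBox runs the tool in its own DOS environment, which sidesteps the NTVDM environment-space limit entirely (paths and arguments are illustrative):

dosbox -c "mount c c:\dbf" -c "c:" -c "CLUT.EXE data.dbf index.ntx" -c "exit"

Each -c option is a command executed inside the emulator at startup.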
Reindexing NTX files is not a difficult thing to do, and can be done with tools other than CLUT. For example, many of the utilities listed on this part of Download32 could be used. Otherwise, you could write your own using Harbour Project or xHarbour. Or contact me off list and I'll cook up something in Clipper 5.3.
LATER
If I read the README correctly for CLUT, it's a replacement for the DBU utility that comes with Clipper 5.x. I can supply you with a build of that if you're unsuccessful with other approaches.

Delphi 7 - How can I find where my project is hanging the compiler?

I have a project in Delphi 7. It is rather large, consisting of 40-odd forms and frames.
Recently, the compiler only allows me to compile the project once so I can run it; on every re-compile the IDE hangs and I have to end the Delphi process. Before this occurs, my CPU goes to 50% (on a dual-core machine), so my deduction is that the compilation process has gone into an infinite loop. The executable it produces is not runnable and is usually at a fixed size after the hang.
I was wondering how I can go about finding where this inconsistency in my project is. Other projects do not suffer from this same issue.
You can use Process Explorer to discover what the compiler is doing (reading a file, or ...).
Check the QC 3807 issue.
Check the system resources - free disk space, memory. Clean the temp folder. Check the disk for errors. Do you have an antivirus running? If so, try turning it off.
Use "process of elimination", to see if it's something in your code.
First, make a backup of where you are, or save to your CVS (you ARE using version control, right? RIGHT? good.) Revert your branch to an earlier version where it worked. See if that works. If so, merge half of the changes from the present-day version. If that works, try the other half. Keep cutting things in half, and you'll find the code that causes the problem, by process of elimination.
Or, it may turn out to be something in the configuration. Carbonite may be your friend here.
You can either:
Enable "Compilation progress display" in the "Environment Options" window, in the "Preferences" tab.
Use the command-line compiler dcc32.exe to get detailed console output.
Both will let you know which file is hanging the compiler.
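For the command-line route, a minimal invocation looks like this (the project name is a placeholder); the compiler prints each unit as it compiles it, so the last name shown points at the culprit:

cd C:\Projects\MyApp
dcc32.exe MyProject.dpr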
Take a look at the great Delphi Speed Up tool, which allows you, for example, to abort CodeCompletion and HelpInsight with Esc or a mouse move.

Is exec a good programming solution to ant OutOfMemory issues?

This question requires a bit of backstory... At my company, we produce a set of PDF and HTML files. A very large set. The current build process (which I designed, in haste) is a Perl script that reads a set of files, where each file contains a new ant command to execute.
It is designed terribly.
Now, I'm trying to shift the entire project over to using ant for the majority of the tasks. Within a target, I can construct a list of files that need to be built, as either PDF or HTML. However, when I call the ant command to build each file, after about three builds (of, say, five), the entire process crashes with an OutOfMemory error. Furthermore, my buildlog.xml ends up being something like 20 megs--it concatenates every ant command's output into one giant log, since they are being called from a single target. With the earlier Perl solution, I was able to get a buildlog.xml for each ant command--simply save and rename the buildlog to something else.
Even if I set ant or java heap sizes in my user.properties, I still fail with an OOM eventually. I wonder if an appropriate solution is to call <exec> to launch a script that does what I described above: namely, call ant, rename the buildlog, and die--theoretically allocating and freeing up space better than one "giant" ant call. I am worried that I am heading down another "hacky" solution to a problem that's well-defined and can be entirely confined to ant. Then again, <exec> does exist for a reason, so should I not feel bad for using it?
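A hedged sketch of that wrapper idea (the file layout, target, and property names are invented for illustration): each iteration starts ant in a fresh JVM, so heap is reclaimed between builds and every run gets its own log instead of one giant buildlog.xml.

#!/bin/sh
# cap each JVM's heap; a fresh process per build means one build's
# memory can never leak into the next
export ANT_OPTS="-Xmx512m"
for f in build-lists/*.txt; do
  name=$(basename "$f" .txt)
  # -logfile gives every invocation its own buildlog
  ant -logfile "logs/$name.log" -Dbuild.list="$f" build-one
done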
As with most corporate software (at least software with deadlines and, if yours doesn't have them, please let me know where you work so I can try to get a job there), the first step is to get it working.
Then, worry about getting it working well.
For that first step, you can use any tool at your disposal, no matter how ugly you think it looks.
But you might want to make sure that the powers-that-be know that you've had to do all sorts of kludgy things to get it working for them, so that they allow you to hopefully fix it up before maintenance has to start on it. You probably don't want to be maintaining a hideously ugly code base or design.
We've unleashed such wonders on the world as applications that shut themselves down nightly to avoid memory leaks (leaving the OS to restart them), putting "questionable" code at the other end of a TCP socket so their crashing doesn't bring down the main application and, I'm sure, many other horrors that my brain has decided to remove all trace of.
