I have written a simple scraping program using PhantomJS and WebDriver, which essentially consists of the following steps:
Navigate to a page
Collect a number of links (30) from the page by XPath
For each link, navigate to that URL and collect a number of text elements from the page
I am not driving this from JavaScript; I am using WebDriver from C#.
My issue is that the PhantomJS.exe process reaches a memory usage of over 800 MB, and I can't understand why. The memory usage of my .NET exe (which launches PhantomJS.exe) stays around 12 MB throughout. I'm only using one instance and running all processing serially, so effectively one window or 'tab' as far as I am aware.
Has anybody come across similar issues using PhantomJS? Is there any reason it should behave this way?
I'm seeing some weird behavior with Razor: after rendering a web page of roughly 300 DIVs, each with some user info, rendered in a loop, the CPU continues to run at 100% single-core load for about 30 seconds. No I/O operations, no change in memory utilization, just burning CPU cycles.
The page renders data from the database, 300 records. It's not the database's fault; I checked by disabling DB access and replacing the records with dummy data, and got the same behavior. The page is rendered and displayed in the browser and no other requests are active, so the server-side code (at least my code) is idle.
UPDATE: The problem ONLY appears when the site is launched from within Visual Studio, regardless of whether it is hosted in IIS Express or IIS, both running .NET 4.5.1 and MVC 5.1.2. Opening the same site when devenv is not running makes the issue disappear.
Could anyone advise whether you have experienced a similar issue and how you dealt with it, and how I could identify the piece of code that's causing the problem?
SOLVED! It's the Browser Link!
http://blogs.msdn.com/b/webdev/archive/2013/06/28/browser-link-feature-in-visual-studio-preview-2013.aspx
Disabling it solves the issue.
Eventually it all came down to VS Browser Link (http://blogs.msdn.com/b/webdev/archive/2013/06/28/browser-link-feature-in-visual-studio-preview-2013.aspx).
It turns out that smaller web pages work just fine, but larger pages cause a disproportionately higher load on the web-server process, with part of the server still doing work after the page has been sent to the browser.
Disabling Browser Link solves the problem.
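If you want to make sure it stays off for a given site regardless of the Visual Studio toolbar state, Browser Link can also be disabled in web.config (a minimal sketch; vs:EnableBrowserLink is the documented appSettings switch):

    <configuration>
      <appSettings>
        <!-- Turns off Browser Link for this site, independent of the VS UI -->
        <add key="vs:EnableBrowserLink" value="false" />
      </appSettings>
    </configuration>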
According to Process Explorer / Task Manager, my application has a private working set of around 190 MB even when it is not performing any specific task, which is far more than I would expect it to need. Using FastMM I have verified that none of this is an actual memory leak in the traditional sense.
I have also read the related discussion here, which suggests using FastMM's LogMemoryManagerStateToFile(). However, the generated output states "21299K Allocated, 49086K Overhead", which combined (about 70 MB) is far less than Task Manager suggests.
Is there any way to find out what causes the huge difference, or might 190 MB even be an expected value for an application with ~15 forms? Also, is 70% overhead "bad", and is there any way of reducing that number?
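For reference, the FastMM call mentioned above is typically used like this (a minimal sketch; FastMM4 must be the first unit in the project's .dpr uses clause, and the file path here is illustrative):

    uses
      FastMM4; // FastMM4 itself must also be first in the .dpr uses clause

    procedure DumpMemoryState;
    begin
      // Writes the '...K Allocated, ...K Overhead' summary plus a
      // per-class allocation breakdown to the given file.
      LogMemoryManagerStateToFile('C:\temp\memstate.log', 'snapshot while idle');
    end;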
You can use VMMap from Sysinternals to get a complete overview of the virtual memory address space your process is using. This should allow you to work out the difference you are seeing between Task Manager and FastMM.
I doubt that FastMM reports, or even can report, sections such as Mapped File, Shareable, or Page Table, even though those sections do occupy private working set.
DDDebug can give you insight into memory allocation by objects in your app, and you can monitor changes live.
Test the trial version or check out the introductory video on the website.
I have an application that uses TWebBrowser to periodically navigate to a specific URL and extract some data. The app keeps running 24x7 and does a lot of navigating between pages.
The problem is that TWebBrowser has a well-known memory leak, in which every time you navigate to a new page, the memory used by the application increases. My app can easily use more than 2 GB of RAM after some time, and after navigating hundreds of times an 'Out of memory' or 'Out of system resources' exception is thrown; the only way to work around it is to restart the application.
The strange thing is that FastMM never shows these leaks. When I use my app for a few minutes and then close it, nothing is reported.
I've been searching for a solution to this problem for years (in fact since 2007, when I wrote the first version of my application). There are some workarounds, but none of them actually solves the problem. For me the only real workaround is to close and reopen the app periodically.
I already tested the SetProcessWorkingSetSize approach (sketched below), but it only shrinks the memory used by the app temporarily; after a few seconds the app is using a huge amount of memory again.
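A minimal sketch of that call, assuming Winapi.Windows (passing -1 for both sizes asks Windows to trim the working set):

    uses
      Winapi.Windows;

    procedure TrimWorkingSet;
    begin
      // Only evicts pages from the working set; they are paged straight
      // back in as soon as the app touches that memory again, which is
      // why the effect in Task Manager only lasts a few seconds.
      SetProcessWorkingSetSize(GetCurrentProcess, SIZE_T(-1), SIZE_T(-1));
    end;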
I also tried EmbeddedWB, but as it descends from TWebBrowser, it is plagued by the same issue.
By the way, I can't use a simple component like IdHTTP, because I need to do some JavaScript manipulation on the website visited.
Does anyone know if there REALLY is a solution to this problem?
QC#106829 describes one possible cause of memory leaks with TWebBrowser. Accessing Document (and any other properties implemented via TOleControl.GetIDispatchProp or TOleControl.GetIUnknownProp) causes leaks, because the getter calls AddRef without ever calling Release. As a workaround, you can manually call Release, patch the VCL (see here), or avoid the problematic properties altogether (for example, by using browser.DefaultInterface.Document instead of browser.Document).
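A minimal sketch of that last workaround (the unit names are for a recent Delphi, and the procedure itself is illustrative):

    uses
      SHDocVw, MSHTML, Vcl.Dialogs;

    procedure ShowPageTitle(WB: TWebBrowser);
    var
      Doc: IHTMLDocument2;
    begin
      // DefaultInterface returns the raw IWebBrowser2, so reading Document
      // this way bypasses TOleControl.GetIDispatchProp and its unmatched
      // AddRef (the leak described in QC#106829).
      Doc := WB.DefaultInterface.Document as IHTMLDocument2;
      if Doc <> nil then
        ShowMessage(Doc.title);
    end;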
I have a pretty big issue, although I only have the symptoms and a theory about the cause.
I have a C++ application under Windows 7 x64 that uses system calls to FFMPEG 0.7.13 to extract frames from videos. When running, the parent application maintains a nice, predictable memory footprint in memory profilers (Task Manager, RAMMap) of about 2 MB, and I can see the individual calls to FFMPEG come and go without incident. The trouble is that after about 100 calls to FFMPEG and 70,000+ PNGs created (no one directory has more than 1500 PNGs), the Windows memory page usage rises gradually from about 2.5 GB to over 7.0 GB, and the system is brought to its knees. The sum of the processes for all users is nowhere near the reported amount.
I thought it might be related to Windows Search indexing, so I turned off indexing for the output directories in question using SetFileAttributes() with FILE_ATTRIBUTE_NOT_CONTENT_INDEXED, and while that seems to work as advertised, it does not combat the issue at hand. My current working theory is that all of these extracted PNGs are either fully or partially memory-mapped, by FFMPEG or something else. I can also see the output PNGs under RAMMap's Physical Pages tab as standby mapped files.
Questions:
- Is there enough information here to possibly diagnose the exact problem?
- Do I have a way to combat this issue?
Thanks in advance...
I am writing my own text editor, and I was wondering how I can make it load faster. Notepad.exe, which comes with Windows, loads almost instantly, and it is a small application (67.5 KB on XP). I know my app is an MDI project, but it is only ~900 KB and it takes 5 seconds to load. I could write a DLL with all the bitmaps and load them from there, but I don't think that is the solution.
Does anyone have any ideas?
Thanks
In one of my projects I achieved a tremendous decrease in loading time by disabling the auto-creation of forms. Only the main form is created in the DPR; all the others are created when needed, as sketched below.
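A sketch of what that looks like in practice (the form and unit names are illustrative):

    // Project .dpr: only the main form is auto-created.
    begin
      Application.Initialize;
      Application.CreateForm(TMainForm, MainForm);
      Application.Run;
    end.

    // Secondary forms are created on first use instead:
    procedure TMainForm.ShowOptions;
    begin
      if OptionsForm = nil then
        OptionsForm := TOptionsForm.Create(Application);
      OptionsForm.Show;
    end;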
Often it's the perceived speed that matters rather than the actual speed. If you can get a splash screen up as quickly as possible and continue initializing while it's showing, people will see the application as faster.
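One way to wire that up in the .dpr (a sketch; TSplashForm is assumed to be a plain form with your logo on it):

    begin
      Application.Initialize;
      // Get something on screen before the expensive initialization runs.
      SplashForm := TSplashForm.Create(nil);
      try
        SplashForm.Show;
        SplashForm.Update; // force an immediate paint
        Application.CreateForm(TMainForm, MainForm);
        // ... other startup work ...
      finally
        SplashForm.Free;
      end;
      Application.Run;
    end.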
Another trick is to put most of your code into DLLs and run your program at Windows startup in a special invisible mode:
myprog.exe /sneaky
which may convince Windows to keep your DLLs in memory so that, the next time your application starts, it loads faster.
Or even stay running in memory in invisible mode and, when the user runs myprog.exe themselves, simply make yourself visible.
Yet another option is to use lazy-loading DLLs for the bulk of your functionality (we've used this under UNIX) so that code is only loaded when needed. This amortizes the loading cost over the total execution time rather than taking a big hit at startup.
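On the Delphi side, imports have supported delay loading since Delphi 2010 via the delayed directive, so the DLL is not mapped until the first call (a sketch; the DLL and routine names are made up):

    // heavystuff.dll is loaded on the first call to HeavyExport,
    // not at process startup.
    function HeavyExport(Value: Integer): Integer; stdcall;
      external 'heavystuff.dll' delayed;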
Those are some tricks I've heard of; there may be others.
All performance problems can be solved by looking at the code that is actually executed.
Guessing at the cause of a performance problem can leave you spinning your wheels for a long time. When you have a performance problem, you need to profile your code, and there are various tools for Delphi to help you do this.
Some of these are:
- AQtime (AutomatedQA)
- ProDelphi
- Sampling Profiler
These and other options were discussed in this Stack Overflow question.
There are various techniques for speeding up code once you have identified the problem areas. Since you have identified the area you want to improve, profile the start-up of your application.
You may find that you're creating things such as forms, resources, or other objects that don't need to be created at startup.
Applications often have more than one way they can be started. Since your application is a text editor, I suspect you have a command line where you can specify the file to edit; see the sketch below. Profiling the different ways your application can start is key to really knowing the full impact of a performance improvement.
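As one illustration, the file-on-the-command-line route is typically just this (a sketch; OpenFileInEditor stands in for your own loading code):

    // In the main form's OnCreate (or after Application.CreateForm):
    if ParamCount > 0 then
      OpenFileInEditor(ParamStr(1)); // hypothetical helper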
I noticed that my project loads E_SKU327.dll and E_DAUDF1.dll about 20 times. Those files belong to a shared printer (an Epson Stylus), so I removed the TPageSetupDialog from my form, and now it loads instantly :)
Problem solved :)
Try omitting the code in your startup and initialization sections and see if there's any improvement; then check which section is making your application load slower.
And if you are testing startup time by opening a text file, try replacing TMemo (if you are using it) with SynEdit; it will load text files a lot faster, even faster than Notepad ;-).
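The swap is close to drop-in, since both expose a TStrings-based Lines property (a sketch; SynEdit1 is an assumed TSynEdit on the form, and the path is illustrative):

    // Same call as with TMemo, but SynEdit only renders the visible
    // portion of the buffer, so large files open noticeably faster:
    SynEdit1.Lines.LoadFromFile('C:\big\logfile.txt');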