OpenCart import/export tool: out of memory

I'm having problems with a free OpenCart module and was hoping to get some help.
While using the import/export tool I'm getting the following error:
"Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 25165824 bytes) in /home3/haas12/public_html/breslovcenter.org/system/PHPExcel/Classes/PHPExcel/Style/Supervisor.php on line 126"
I only have about 700 items and my xlsx file is only 291 KB, yet the error message talks about 256 MB.
I created a php info file and it is at:
http://breslovcenter.org/phpinfo.php
Anyone have any ideas on how to fix this? Any help would be greatly appreciated. I'm guessing the problem is due to some bug that makes it leak memory. I'm kind of stuck and not sure what to do.

The file might be small, but PHP uses a lot of memory and processing to open an Excel file. While it does seem like a lot of memory, PHPExcel is well known for this issue, and you will either need to increase your memory limit or find a better way to import (there are numerous other ways to import; they're just not as convenient).

The error message:
Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 25165824 bytes) ...
This says that PHP is allowed to use at most 268435456 bytes (256 MB) of memory for the request. Your script then tried to allocate a further 25165824 bytes (24 MB), which pushed it over that limit (the exact amount may differ per request and depends on the size of the file being loaded).
PHPExcel, though an excellent PHP library for working with XLS(X) files, has one critical weakness: it requires a lot of resources, especially memory. On shared hosting with a 32 or 64 MB memory limit there is no chance of running it at all.
The solutions
If you are able to modify the memory limit for your PHP installation, do so. Open your php.ini file and search for the memory_limit setting; it currently reads memory_limit = 256M. Change it to, for example, memory_limit = 350M, or even 512M if you want to be completely sure this won't happen again. If you can modify PHP settings from within your scripts using ini_set(), that is even better, because the extra memory is then only granted to the import script rather than to every request: call ini_set('memory_limit', '350M'); as the first line of your import script.
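As a minimal sketch of what that might look like at the top of the module's import script (the file name import_export.php is only an assumption; use whatever script actually runs the import):

<?php
// Top of the import script (hypothetical path, e.g. admin/controller/tool/import_export.php).
// Raise the limit for this request only, before PHPExcel is loaded.
ini_set('memory_limit', '350M');
// Optionally allow a longer run time for big spreadsheets as well.
set_time_limit(300);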
If option one is not possible (you do not have the rights to access or modify the PHP settings on your hosting), the alternative is to export the XLS(X) file to CSV and import the data from the CSV instead, which is less convenient but uses far fewer resources.
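To illustrate why CSV is so much lighter, a plain PHP loop can stream the file row by row without ever holding the whole spreadsheet in memory. This is only a sketch; products.csv and the column layout are assumptions, not the module's actual import format:

<?php
// Stream a CSV export row by row; only one row is held in memory at a time.
$handle = fopen('products.csv', 'r');      // hypothetical CSV export of the xlsx file
if ($handle === false) {
    die('Could not open products.csv');
}
$header = fgetcsv($handle);                // first row: column names
while (($row = fgetcsv($handle)) !== false) {
    $product = array_combine($header, $row);
    // ... insert/update the product here with your own SQL or API calls ...
}
fclose($handle);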

As Jay Gilford says, PHPExcel is well known for this issue. You can try either of the following:
Editing the php.ini files
If your website is hosted on a shared server or you do not have access to the PHP configuration, you will need to amend two 'php.ini' files in your OpenCart installation: the first is in the root folder and the second is in the '/admin' folder. Change:
memory_limit = 64M
To:
memory_limit = 256M
If you're on a shared server there may be a limit imposed by your provider (from experience, 1&1's is around 80 MB) which would override these 'php.ini' files, in which case you may need to upgrade to a dedicated server or VPS if you want to increase your PHP memory limit beyond that.
Increasing the PHP memory limit on your server
If you have access to the server PHP Configuration you can increase the PHP memory limit directly on the server through your Control Panel or via SSH. You will most likely need to restart your server for the changes to take effect.
Of course, deleting some old products would do the trick and reduce memory usage, but you will hit the issue again once you are back up to the same number of products. Alternatively, you could try a different extension that is not so memory-hungry; however, the import/export functionality of this one still seems to be the best of its kind.
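If you are comfortable editing the module's import code, another option worth trying is PHPExcel's cell caching, which trades some speed for a much smaller memory footprint. This is a hedged sketch of the standard PHPExcel settings call placed before the workbook is loaded; it is not something the import/export module does out of the box, and the file name is a placeholder:

<?php
// Cache cells in compressed form in php://temp instead of keeping them all as objects in RAM.
// This must be called BEFORE the workbook is loaded.
$cacheMethod   = PHPExcel_CachedObjectStorageFactory::cache_to_phpTemp;
$cacheSettings = array('memoryCacheSize' => '8MB');
PHPExcel_Settings::setCacheStorageMethod($cacheMethod, $cacheSettings);

$reader   = PHPExcel_IOFactory::createReader('Excel2007');
$workbook = $reader->load('products.xlsx');   // hypothetical file name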

Related

How can I open or review a large txt file

I have a few large txt files I'm trying to load into a data warehouse. I get an error message with the offending row/line number, but I cannot open the txt file to review it because it's too large (2,413,060 KB). Someone suggested using the cmd option to do this, but I'm unsure how.
You can either use HJSplit to split the data into smaller pieces, or use a Vim-based editor such as gVim, which copes better with large files. It may also help to free up as much of your PC's RAM as possible.
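If you happen to have PHP available (as in the OpenCart question above), a tiny command-line sketch can also print just the lines around the offending one without loading the whole file; the file name and line number below are placeholders:

<?php
// inspect_line.php - print a few lines around a given 1-based line number of a huge text file.
// Usage: php inspect_line.php big.txt 2413060   (placeholders - substitute your own)
list(, $path, $line) = $argv;
$file  = new SplFileObject($path);          // streams the file, does not load it all into RAM
$start = max(0, (int)$line - 3);            // SplFileObject line positions are 0-based
$file->seek($start);
for ($i = 0; $i < 5 && !$file->eof(); $i++) {
    printf("%d: %s", $start + $i + 1, $file->current());
    $file->next();
}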

iOS Application uses too much storage space

Various users of our application have started to complain that it uses a lot of storage space on the phone. We added a data collector based on the files located within the application folder and their sizes, and found the following for a small number of users:
Preferences:{
files:{
"{bundle_identifier}.plist":"23.97479057312012",
"{bundle_identifier}.plist.0BTeiJo":"22.25380897521973",
"{bundle_identifier}.plist.1lT9kMO":0,
"{bundle_identifier}.plist.2HHwLSb":0,
"{bundle_identifier}.plist.2L9bkJR":0,
"{bundle_identifier}.plist.2xAnoy5":0,
"{bundle_identifier}.plist.3Qgyplk":0,
"{bundle_identifier}.plist.4SBpAox":"23.95059013366699",
"{bundle_identifier}.plist.4Xm8NvI":0,
"{bundle_identifier}.plist.5sPZPIi":0,
"{bundle_identifier}.plist.6GOkP57":0,
"{bundle_identifier}.plist.6SYZ1VF":"21.67253875732422",
"{bundle_identifier}.plist.6TJMV5r":"21.67211151123047",
"{bundle_identifier}.plist.6oNMJ0b":0,
"{bundle_identifier}.plist.7C1Kuvm":0,
"{bundle_identifier}.plist.7E3pmr4":0,
"{bundle_identifier}.plist.7ExLAx0":"21.70229721069336",
"{bundle_identifier}.plist.7GOPE3W":"18.70771026611328",
...
},
size:"960.2354183197021"
}
Can someone explain why these files (*.plist.*) appeared, and how to safely remove them and ensure they won't appear again?
P.S. I found logic in the project where we store dictionaries in NSUserDefaults (I know this is bad practice), but there is not much data there.
UPDATE:
I have discovered that the (*.plist.*) files are generated after a backup. Sometimes their size is 0, and sometimes it is the same as the original *.plist's size at backup time.
Now I need to know: is it safe to remove them?

How to prevent an SWF from being read from memory?

I'm now able to read an encrypted SWF from a resource into a stream, decrypt it and load it directly into memory.
Unfortunately, there are tools that scan memory and list/view/dump the plain SWF file used by the Flash Player. One such tool is SWF Vampire.
Even fake SWF signatures don't seem to be safe.
There was one tool, SWFkit, that hid all of this, but it no longer exists; I use the F-In-Box component with Delphi 7.
Is there a way to hide SWF files in memory, or otherwise prevent such tools from reading them?
Thanks.
Aren't there any read/write access protections on memory, so that only the program that reserved the memory can read and write to it?
I am not an expert in these languages, so here is a link in case it helps:
http://www.joachim-bauch.de/tutorials/loading-a-dll-from-memory/
You can destroy the Flash file's headers and some other parts after it has been loaded into memory. This prevents those tools from recognizing the file in memory, and even if one of them does manage to dump it to a file, the result should be broken.

Download large files with Ruby on Rails

My small project for internal use is a file-sharing portal (something like sharerapid); it will be used by about 100 people. I have a problem with downloading large files: small files (< 200 MB) download quickly, but the largest files block my server for 2-5 minutes. Maybe the problem is RAM; I have 2 GB. My code to download a file:
def custom_send(userfile)
  file = userfile.attachment.file.url.to_s.split("?").slice(0..-2).join("?")
  send_file "#{Rails.root.to_s}/public#{file}", filename: userfile.name, x_sendfile: true
end
I don't know where the problem is; in development mode on my localhost machine it is fine, but the problem appears on the public virtual server (Ubuntu 12).
What web server are you using? The most likely cause is that the request is blocking further requests in a single-threaded environment.
The best solution to your problem would be to host the files on Amazon S3 and link to them there. If the files must remain local, you could try something more like this:
http://www.therailsway.com/2009/2/22/file-downloads-done-right/

Does Windows.CopyFile create a temporary local file while source and destination are network shares?

I have a D2007 application that uses Windows.CopyFile to copy MS Word and PowerPoint files from one network folder to another network folder. Our organization is migrating to Windows 7 from Vista. One of my migrated users got an error message that displayed a partial local folder (C:\Users\(username)\...\A100203.doc) during the copy. Does the CopyFile function cache a local copy of the document when it is copying from one network folder to another network folder or is it a direct write? I have never seen this error before and the application has been running for years on Win95, Win 98, Win2000, WinXP and Vista.
Windows.CopyFile does NOT cache the file on your hard drive; instead, it instructs Windows to handle the copying of the file itself (rather than you managing the streams in your own program). The destination file is opened for output, and the source is simply read and written through a buffer. Essentially, the source file is spooled through system memory and straight onto the destination; at no point is an additional cache file created (that would slow copying down).
You need to provide more specific information about your error... such as either the text or an actual screenshot of the offending error message. This will allow people to provide more useful answers.
The user that launches the copy will require read access to the original and write access to the target, regardless of caching (if the user has read access to the file, then the file can be written to a local cache, so caching/no-caching is irrelevant).
It's basic security not to let someone copy files/directories between machines just because the security attributes of the two machines happen to be compatible.
There's little else to say without the complete text of the error message.
