I am working on an application where I have to store approximately 200 photos captured from the camera in the Documents directory. I know there has to be some limit on the use of the Documents directory, and that if used in excess it may result in low-memory situations. Can you tell me what the maximum usage limit is, if any? What different scenarios should be considered while doing this?
No, there is no limit; the limit is your device's available disk space. People will tell you there is a 2 GB limit, but that is wrong: that is the limit on the app's size.
No, it will not result in low-memory situations if used heavily, as long as you handle your objects properly, as you would in any app.
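A minimal sketch of that point, assuming nothing beyond Foundation (the 3 MB-per-photo estimate is just an illustrative number): check the remaining free disk space before writing a large batch of photos, so the app can fail gracefully instead of filling the disk.

```swift
import Foundation

// Sketch: the practical limit is free disk space, so check it before
// writing a large batch of photos and fail gracefully if it is low.
func freeDiskSpaceInBytes() -> Int64? {
    let attributes = try? FileManager.default.attributesOfFileSystem(forPath: NSHomeDirectory())
    return (attributes?[.systemFreeSize] as? NSNumber)?.int64Value
}

let estimatedNeed: Int64 = 200 * 3 * 1024 * 1024   // ~200 photos at roughly 3 MB each (illustrative)
if let free = freeDiskSpaceInBytes(), free < estimatedNeed {
    print("Not enough free space for all photos; warn the user or clean up first.")
}
```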
I'm using the TDengine database. The data will increase by 200 million per day, and the memory usage is growing by about 50 MB accordingly. I'm a little concerned about the amount of resources being used.
I have searched the development documentation and can't find any similar parameters to limit the memory footprint.
I wonder what mechanism is causing its memory to keep growing? Is there any way to limit it?
From the current list of "Realm Limitations":
Any single Realm file cannot be larger than the amount of memory your application would be allowed to map in iOS.
Does this mean that if I check ProcessInfo.processInfo.physicalMemory and it is smaller than FileManager.default.attributesOfItem(atPath:realmPath)[FileAttributeKey.size] (plus a variable amount to account for fragmentation etc), I should not try to open the Realm?
If the Realm file is too big for mmap to map, you should get a Swift error. So all you really need to do is try opening the Realm and catch any Realm.Error.addressSpaceExhausted errors.
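A minimal sketch of that approach, assuming a recent RealmSwift API (the function name is just for illustration):

```swift
import RealmSwift

// Sketch: attempt to open the Realm and handle the file being too large
// to memory-map.
func openRealmIfPossible(at fileURL: URL) -> Realm? {
    do {
        return try Realm(configuration: Realm.Configuration(fileURL: fileURL))
    } catch Realm.Error.addressSpaceExhausted {
        // The file could not be mapped into this process's address space.
        print("Realm file is too large to map in this process.")
        return nil
    } catch {
        print("Failed to open Realm: \(error)")
        return nil
    }
}
```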
The bigger problem is what to do once you know the file is too big. Our compaction on launch feature requires that the file be openable first, which rules it out (and is why we recommend that compact on launch be used to pre-empt this issue). We're working on ways to mitigate this problem.
mmap shouldn't depend upon the amount of free physical RAM you have (although some amount of RAM is required to map the file), nor is the limit that iOS imposes anywhere near the theoretical maximum. Finally, virtual memory limits operate on a per-process basis, meaning that the size of a Realm file you can open depends both on what other files have been mapped by that process and by how much memory that process is using for other things.
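Since compact-on-launch is the recommended pre-emptive measure, here is a minimal configuration sketch (the size and usage thresholds are arbitrary example values, not Realm's defaults):

```swift
import RealmSwift

// Sketch: compact the Realm on launch pre-emptively so the file never grows
// to an unmappable size. The 100 MB / 50% thresholds are example values.
let config = Realm.Configuration(shouldCompactOnLaunch: { totalBytes, usedBytes in
    let oneHundredMB = 100 * 1024 * 1024
    return totalBytes > oneHundredMB && Double(usedBytes) / Double(totalBytes) < 0.5
})

do {
    let realm = try Realm(configuration: config)
    _ = realm   // use the Realm as usual from here
} catch {
    print("Failed to open Realm: \(error)")
}
```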
I am making a somewhat complex app in which every detail is important. I have some questions:
1. How much storage do we have if we plan to save big files in Core Data or the cache?
2. What is the RAM limit on the iPhone? I'm looking for a table with detailed information about iOS devices on this, because I need to handle memory warnings and prevent app crashes.
3. Is it better to save images in the cache or in Core Data, assuming you have a lot of images (approximately 200-250)?
Thanks
1) I am not aware of any storage limit. Obviously, you will never get 64 GB or more, since no device is larger ;-). My wife's Facebook app consumes >5 GB at the moment... I suppose they did something wrong. The only important point is to fail gracefully (show a dialog, clean some space, ...) if the storage is full.
2) The RAM limit varies depending on the iPhone model and the currently running applications. Also, there are some iPods with less memory on the market. 30 MB should be pretty safe. The total physical memory of the device can be retrieved as described here, while the available RAM can be derived from that question (see the sketch after this list).
3) Maybe this is a good starting point. I would always write image data to the file system and just store the file name inside the database, as suggested here.
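A minimal sketch covering points 2 and 3, assuming a UIKit app (the function names and the 0.8 compression quality are illustrative):

```swift
import UIKit

// Point 2: total physical RAM of the device, in bytes.
let totalRAM = ProcessInfo.processInfo.physicalMemory

// Point 3 (sketch): write the JPEG bytes to the file system and keep only
// the file name in Core Data or another store.
func storeImage(_ image: UIImage) throws -> String {
    let fileName = UUID().uuidString + ".jpg"
    let documents = try FileManager.default.url(for: .documentDirectory,
                                                in: .userDomainMask,
                                                appropriateFor: nil,
                                                create: true)
    guard let data = image.jpegData(compressionQuality: 0.8) else {
        throw NSError(domain: "ImageStore", code: 1)
    }
    try data.write(to: documents.appendingPathComponent(fileName), options: .atomic)
    return fileName   // persist this string in the database instead of the image blob
}

func loadImage(named fileName: String) -> UIImage? {
    let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    return UIImage(contentsOfFile: documents.appendingPathComponent(fileName).path)
}
```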
Basically I'm trying to keep the memory use on my Nginx server under a certain amount, both because I'm insane (according to my friends) and because I want to save money. However, I'm worried ImageMagick may push it over the edge.
I'm using -limit area 20MiB and I've also tried -limit memory 15MiB -limit map 15MiB, but when checking the running process through top -c (with Shift-M) and ps aux, it sometimes shows considerably more memory in use than I've set in the limits. To give numbers, it may be using 35 MB or 40 MB instead of the 20-30 MB I would expect. I wouldn't be bothered by 2 or 3 MB, but that's quite a large offset.
I've been told the extra memory may be ImageMagick's overhead as it loads the interpreter, etc., but I'm not very familiar with Unix programs, so I haven't a clue in that department.
If anyone can explain why this is happening, that would be great. If it's a normal thing, fine; I'll just adjust things to take into account that it may use my limit plus a certain amount. But if it isn't, and the -limit parameter doesn't limit memory to a certain amount, what exactly is the point of having that parameter in ImageMagick?
Again thanks for your help in advance, it's much appreciated, as always.
According to the documentation, ImageMagick moves memory operations to mmap-ed files once the memory limit is exceeded, so it will start to swap if you have enough disk space; see the manual:
Excerpt from the manual for -limit:
The value for File is in number of files. The Disk limit is in gigabytes and the values for the other resources are in megabytes. By default the limits are 768 files, 1024 MB memory, 4096 MB map, and unlimited disk, but these are adjusted at startup time on platforms that can provide information about available resources. When the limit is reached, ImageMagick will fail in some fashion, or take compensating actions if possible. For example, -limit memory 32 -limit map 64 limits memory to 32 MB and memory map to 64 MB. When the pixel cache reaches the memory limit it uses memory mapping. When that limit is reached it goes to disk. If the disk has a hard limit, the program will fail.
The limits only affect ImageMagick's pixel cache. The program code, and anything the libraries / delegates may do to load or process the images, are not influenced by these settings at all.
You don't specify which column you're looking at in top; the proper one would be RES or RSIZE. With limits as small as 20 MiB, the program and library code will represent a significant fraction of the resident set size.
To verify that you're using the right units for your environment variables, use identify -list resource. If the size of the in-memory pixel cache (MAGICK_MEMORY_LIMIT) is insufficient for an image, an mmap-ed file is used (MAGICK_MAP_LIMIT), and if that limit is too low, a conventional disk file (MAGICK_DISK_LIMIT) is used instead. If all the limits are too low, ImageMagick will fail immediately with an error such as cache resources exhausted, Memory allocation failed, or corrupt image.
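For completeness, a hedged Swift sketch of driving convert with explicit limits from a server-side process (the /usr/bin/convert path, file names, and limit values are assumptions; the environment variables are the ones named above):

```swift
import Foundation

// Sketch: run ImageMagick's convert with explicit resource limits.
// The binary path and file names are assumptions for illustration only.
let convert = Process()
convert.executableURL = URL(fileURLWithPath: "/usr/bin/convert")   // assumed install path

// Environment variables are an alternative to the -limit flags below.
var environment = ProcessInfo.processInfo.environment
environment["MAGICK_MEMORY_LIMIT"] = "15MiB"
environment["MAGICK_MAP_LIMIT"] = "15MiB"
convert.environment = environment

convert.arguments = [
    "-limit", "memory", "15MiB",   // in-RAM pixel cache
    "-limit", "map", "15MiB",      // memory-mapped fallback
    "-limit", "disk", "1GiB",      // on-disk fallback before failing outright
    "input.jpg", "-resize", "800x800", "output.jpg"
]

do {
    try convert.run()
    convert.waitUntilExit()
} catch {
    print("Failed to launch convert: \(error)")
}
```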
Can anyone tell me the maximum size that can be downloaded from the web and stored locally?
The limit is 5 MB.
See also this S.O. post.
You could also have a look at this if you are interested in going beyond that limit.
EDIT: after your comment I see you are not referring to local web storage. Local web storage is a feature offered by HTML5 to store key-value pairs; it is what has the limitation I was mentioning.
As to your case, I don't think there is any restrictive limit on the amount of data you can download and store locally (in your Documents directory) in order to access it later. You can check this question, where answers range from a minimum of 2 GB of flash space to no limit at all, so you should be safe.
If you are referring to the limit on the total amount of data you can download over 3G connections (also called over-the-air download), it is 20 MB. You are only allowed to download a larger file when using Wi-Fi.
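As a hedged sketch of the download-and-store part (the URL is a placeholder), URLSession can keep a large transfer off cellular via allowsCellularAccess and move the result into the Documents directory:

```swift
import Foundation

// Sketch: download a large file and store it in the Documents directory.
// The URL is a placeholder; allowsCellularAccess = false keeps the transfer
// off cellular, matching the Wi-Fi-only note above.
let configuration = URLSessionConfiguration.default
configuration.allowsCellularAccess = false
let session = URLSession(configuration: configuration)

let remoteURL = URL(string: "https://example.com/large-file.zip")!
let task = session.downloadTask(with: remoteURL) { tempURL, _, error in
    guard let tempURL = tempURL, error == nil else { return }
    let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let destination = documents.appendingPathComponent(remoteURL.lastPathComponent)
    try? FileManager.default.moveItem(at: tempURL, to: destination)
}
task.resume()
```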