My php.ini values:
upload_max_filesize = 14000M
post_max_size = 14000M
If I increase the value above 14000M, $_POST can no longer be accessed, and with that value I can only upload a file of 1.5 GB; I can't upload a file of 2.14 GB.
Here I have three questions:
1. What should I do so that the $_POST array keeps working and I can also upload a file of 2.14 GB?
2. Why does $_POST stop working when I set the value to more than 14000M?
3. 14000M should mean 14 GB, shouldn't it? If so, why can't I upload a file of 2.14 GB?
I found the answer to my question after two days of work.
This is a bug in PHP that lets us set values like 14000M for these directives but does not let us upload a file of 14000 MB.
Reference: https://bugs.php.net/bug.php?id=35578
We can't upload files larger than 2047 MB, so the following values are meaningless:
upload_max_filesize = 14000M
post_max_size = 14000M
They are effectively capped at the maximum value:
upload_max_filesize = 2047M
post_max_size = 2047M
So you can now upload a file of at most about 1.99 GB.
upload_max_filesize and post_max_size are not the only settings that affect file uploads. Check this link.
The most important one is memory_limit: when you try to upload a big file, PHP can run out of memory.
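For reference, these are the php.ini directives that usually have to be raised together for large uploads; this is only a sketch and the values are illustrative, not recommendations:

upload_max_filesize = 2047M
post_max_size = 2047M          ; must be at least as large as upload_max_filesize
memory_limit = 512M            ; what the script itself may consume
max_execution_time = 600       ; seconds the script may run
max_input_time = 600           ; seconds PHP may spend parsing the request

Note that if the request body exceeds post_max_size, PHP discards it and both $_POST and $_FILES come back empty, which matches the symptom described in the question.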
I have had luck using G (gigabytes) in my php.ini file:
upload_max_filesize = 3G
post_max_size = 3G
Not sure if this will help with the $_POST issue.
There are many more limitations and pitfalls you have to check for; see the official PHP documentation: http://www.php.net/manual/en/features.file-upload.common-pitfalls.php
Anyway, note that 2 GB is also the limit of a signed 32-bit integer, so this problem might arise from other limits unrelated to the upload itself. Also, what is the maximum file size on the server filesystem? 2 GB is the limit on some filesystems.
Related
I'm using FileHandle to write a stream of bytes to an mp4 file.
My data source allows me to pass a bytesOffset, which is basically the current size of the file (if it is 0 the write starts from the beginning; if it is larger, the source continues saving until it reaches the end).
I want to implement restart functionality, but when the write gets interrupted, the file exists but its size is always 0 KB.
Do you know any way to solve this, or any library that could help me implement it?
You can use Shamik framework for that.
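Independent of any particular framework, the resume logic itself is small. Below is a minimal Swift sketch, not code from the question: it assumes the incoming bytes arrive as Data chunks and that the data source accepts the current file size as bytesOffset; the function name is made up.

import Foundation

/// Appends one chunk of bytes to the file at `url`, creating the file on first use.
/// Returns the new file size, which can be fed back to the data source as bytesOffset.
func appendChunk(_ chunk: Data, to url: URL) throws -> UInt64 {
    let fm = FileManager.default
    if !fm.fileExists(atPath: url.path) {
        // Create an empty file so FileHandle(forWritingTo:) can open it.
        _ = fm.createFile(atPath: url.path, contents: nil)
    }
    let handle = try FileHandle(forWritingTo: url)
    defer { handle.closeFile() }                 // always release the descriptor
    let resumeOffset = handle.seekToEndOfFile()  // whatever survived the last run
    handle.write(chunk)                          // append the new bytes
    return resumeOffset + UInt64(chunk.count)
}

A common cause of the 0 KB symptom is recreating the file on restart (which truncates it) instead of reopening it, or never closing/synchronizing the handle before the app is interrupted; the guard above only creates the file when it does not exist yet, and the size returned by seekToEndOfFile() is the offset to ask the data source to resume from.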
Right now I am printing all of my data into a PDF using FPDF, and that data contains pictures with large image sizes. Then XAMPP prompts with the text below. What is the solution so that I can proceed with printing? Is there a solution that does not involve changing the image sizes?
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 50933016 bytes) in C:\xampp\htdocs\techgirl\reports\fpdf.php on line 1449
I even tried to change the memory limit in my php.ini to 900M, but nothing happened.
Change the memory_limit option in your php.ini to your needs.
Then restart the Apache server from XAMPP.
Change memory_limit to -1 in php.ini, i.e. memory_limit = -1 (-1 means no memory limit).
You can find the php.ini file in bin/php/.
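If editing php.ini appears to have no effect, as in the question, it is worth confirming which php.ini XAMPP's PHP actually loads and what limit it reports. A quick check from the Windows command line, assuming the default C:\xampp install path:

C:\xampp\php\php.exe --ini
C:\xampp\php\php.exe -i | findstr /i "memory_limit"

Keep in mind that the web server can load a different php.ini than the command line; the output of phpinfo() in the browser shows the file Apache is using, and Apache has to be restarted after any change.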
I am using ESP8266 Arduino ConfigFile.ino as an example to store configuration settings on SPIFFS.
https://github.com/esp8266/Arduino/blob/master/libraries/esp8266/examples/ConfigFile/ConfigFile.ino
From this code segment, configFile cannot be larger than 1024 bytes:
size_t size = configFile.size();
if (size > 1024) {
  Serial.println("Config file size is too large");
  return false;
}
Why is 1024 bytes the limitation for config file size? If this is indeed a limitation, are there ways to overcome this limitation?
It's a limitation only in this particular example; it's meant to serve as a basis for you to start developing your own configuration-file code. Nothing is stopping you from creating a larger buffer for both the raw character data and the JsonBuffer. I have several configuration files on production devices around 10-20 KB with no issues to report.
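As an illustration, here is a rough sketch of such a load routine with the limit raised, assuming ArduinoJson 6, a /config.json on SPIFFS, and that SPIFFS.begin() has already been called; the field name and the 8 KB figure are made up:

#include <FS.h>           // SPIFFS file system
#include <ArduinoJson.h>  // assuming ArduinoJson 6

bool loadLargeConfig() {
  File configFile = SPIFFS.open("/config.json", "r");
  if (!configFile) {
    Serial.println("Failed to open config file");
    return false;
  }

  // Raise the limit from 1 KB to 8 KB (illustrative value).
  if (configFile.size() > 8192) {
    Serial.println("Config file size is too large");
    return false;
  }

  // Heap-allocated JSON document sized to match the larger limit.
  DynamicJsonDocument doc(8192);
  DeserializationError err = deserializeJson(doc, configFile);
  configFile.close();
  if (err) {
    Serial.println("Failed to parse config file");
    return false;
  }

  const char* serverName = doc["serverName"];  // made-up field name
  Serial.println(serverName ? serverName : "(serverName missing)");
  return true;
}

The practical ceiling on the ESP8266 is free heap rather than anything in SPIFFS, so a very large DynamicJsonDocument can still fail to allocate; check doc.capacity() or reduce the size if parsing starts failing on the device.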
I am working with a text file in iOS. The file size keeps increasing and has reached 10 MB. The problem is that accessing the content is becoming very slow. Is there any other way I can speed up the process? Also, I want the total character length/count of the text file. It is currently running on the main thread.
NSString* content = [NSString stringWithContentsOfFile:path usedEncoding:&encoding error:NULL];
There is more than one solution to your problem:
1. Reading a file of more than 1 MB
2. Reading a file of more than 1 MB
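The usual fix, whichever write-up you follow, is to move the read off the main thread and process the file in fixed-size chunks instead of loading it as one NSString. A minimal sketch in Swift rather than Objective-C, purely as an illustration; it assumes the file is UTF-8 and the function name is made up:

import Foundation

/// Reads a large text file in 1 MB chunks on a background queue and reports
/// the total character count on the main queue when it is done.
func countCharacters(atPath path: String,
                     chunkSize: Int = 1_048_576,
                     completion: @escaping (Int) -> Void) {
    DispatchQueue.global(qos: .userInitiated).async {
        guard let handle = FileHandle(forReadingAtPath: path) else {
            DispatchQueue.main.async { completion(0) }
            return
        }
        defer { handle.closeFile() }

        var pending = Data()   // bytes that may end in the middle of a character
        var count = 0
        while true {
            let chunk = handle.readData(ofLength: chunkSize)
            if chunk.isEmpty { break }                     // end of file
            pending.append(chunk)
            // Decode whenever the accumulated bytes are valid UTF-8; otherwise
            // keep them until the next chunk completes the split character.
            if let text = String(data: pending, encoding: .utf8) {
                count += text.count
                pending.removeAll(keepingCapacity: true)
            }
        }
        if let tail = String(data: pending, encoding: .utf8) {
            count += tail.count
        }
        DispatchQueue.main.async { completion(count) }
    }
}

Called as countCharacters(atPath: path) { total in ... } from the main thread, the UI stays responsive while the file is read. Note that Swift's count counts grapheme clusters, so the number can differ slightly from NSString's length, which counts UTF-16 code units.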
To keep the total size of my app down, I would like to distribute some very large databases as ZIP files and then unpack them as needed. Is there an easy way to determine the size of a file inside a ZIP, so I know in advance whether there is enough space on the device to even try to unzip it?
The ZIP file would contain only one file. Yes, I could measure the size in advance and hardcode it, but that's error-prone, as I update the databases on a regular basis.
There are a few third-party Objective-C libraries for working with ZIP files. These allow you to interrogate the contents of a ZIP file without needing to unzip it first.
Check out ZipKit or Objective-Zip.
You can also use zipzap like this:
NSUInteger wantedFileSize = NSNotFound;
ZZArchive* archive = [ZZArchive archiveWithURL:zipFileURL error:nil];
// wantedFileName is the name of the entry you are looking for inside the archive.
for (ZZArchiveEntry* entry in archive.entries)
    if ([entry.fileName isEqualToString:wantedFileName])
    {
        wantedFileSize = entry.uncompressedSize;
        break;
    }
This will be fast since it only scans the last blocks of the zip file for the uncompressed size of the entry you want, and doesn't do any decompression.
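To answer the "is there space on the device" part of the question: once you have wantedFileSize, compare it against the free space on the target volume. A small sketch in Swift (Objective-C has the equivalent attributesOfFileSystemForPath: on NSFileManager); the 50 MB margin is an arbitrary safety allowance for the temporary files created while unzipping:

import Foundation

// Returns true if the volume containing `directory` has at least `bytes` free,
// plus a little slack for the unzip operation itself.
func hasFreeSpace(for bytes: UInt64, in directory: String = NSTemporaryDirectory()) -> Bool {
    guard let attrs = try? FileManager.default.attributesOfFileSystem(forPath: directory),
          let free = (attrs[.systemFreeSize] as? NSNumber)?.uint64Value else {
        return false
    }
    let margin: UInt64 = 50 * 1024 * 1024
    return free > bytes + margin
}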