I'm converting an array of 60 images (each 320x320) into a GIF. Normally I use the third-party library NSGIF from GitHub, but I still get memory warnings and a crash when the GIF-creation process uses 80 images.
NSDictionary *fileProperties = @{(__bridge id)kCGImagePropertyGIFDictionary: @{
        (__bridge id)kCGImagePropertyGIFLoopCount: @0, // 0 means loop forever
    }
};
NSDictionary *frameProperties = @{(__bridge id)kCGImagePropertyGIFDictionary: @{
        //(__bridge id)kCGImagePropertyGIFDelayTime: @0.02f, // a float (not double!) in seconds, rounded to centiseconds in the GIF data
        (__bridge id)kCGImagePropertyGIFDelayTime: @0.06f,
    }
};
NSURL *documentsDirectoryURL = [[NSFileManager defaultManager] URLForDirectory:NSDocumentDirectory inDomain:NSUserDomainMask appropriateForURL:nil create:YES error:nil];
NSURL *fileURL = [documentsDirectoryURL URLByAppendingPathComponent:@"animated.gif"]; // URLByAppendingPathComponent: returns an NSURL, not an NSString
CGImageDestinationRef destination = CGImageDestinationCreateWithURL((__bridge CFURLRef)fileURL, kUTTypeGIF, ImageArray.count, NULL);
CGImageDestinationSetProperties(destination, (__bridge CFDictionaryRef)fileProperties);
for (NSUInteger i = 0; i < ImageArray.count; i++) {
    @autoreleasepool {
        UIImage *CaptureImage = [ImageArray objectAtIndex:i];
        CGImageDestinationAddImage(destination, CaptureImage.CGImage, (__bridge CFDictionaryRef)frameProperties);
    }
}
if (!CGImageDestinationFinalize(destination)) {
    NSLog(@"failed to finalize image destination");
}
else
{
    //[shareBtn setHidden:NO];
}
CFRelease(destination);
I want to make a GIF from 80+ images.
That actually depends on the duration and the frame rate of the GIF you are creating. With a GIF you can probably bring the frame rate down to 8-10 fps and still get something decent out of it.
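For example, at 10 fps each frame is displayed for 0.1 s, so (a minimal sketch, assuming the same CGImageDestination setup as in your code) the per-frame properties would become:
// ~10 fps: 0.1 s per frame, so far fewer frames for the same clip duration
NSDictionary *frameProperties = @{(__bridge id)kCGImagePropertyGIFDictionary: @{
        (__bridge id)kCGImagePropertyGIFDelayTime: @0.1f, // seconds, rounded to centiseconds in the GIF data
    }
};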
Otherwise you can always opt for a different library and hope it performs better.
I'm using the code below for GIF conversion with an image array, but I'm facing a memory issue when the array contains more than 420 images (approximately).
Here is my code:
-(void)createGifFromImages:(NSArray *)imageArray :(NSString *)filename
{
    NSDictionary *fileProperties = @{
        (id)kCGImagePropertyGIFDictionary: @{
            (id)kCGImagePropertyGIFLoopCount: @0, // 0 means loop forever
        }
    };
    NSURL *documentsDirectoryURL = [[NSFileManager defaultManager] URLForDirectory:NSDocumentDirectory inDomain:NSUserDomainMask appropriateForURL:nil create:YES error:nil];
    filename = [filename stringByAppendingString:@".gif"];
    fileURL = [documentsDirectoryURL URLByAppendingPathComponent:filename];
    NSLog(@"%@", fileURL);
    CGImageDestinationRef destination = CGImageDestinationCreateWithURL((__bridge CFURLRef)fileURL, kUTTypeGIF, imageArray.count, NULL);
    CGImageDestinationSetProperties(destination, (__bridge CFDictionaryRef)fileProperties);
    UIImage *image;
    for (int i = 0; i < imageArray.count; i++) {
        image = imageArray[i];
        NSDictionary *frameDurationProperties = @{
            (id)kCGImagePropertyGIFDictionary: @{
                (id)kCGImagePropertyGIFDelayTime: _durationArray[i % _durationArray.count], // a float (not double!) in seconds, rounded to centiseconds in the GIF data
            }
        };
        CGImageDestinationAddImage(destination, image.CGImage, (__bridge CFDictionaryRef)frameDurationProperties);
    }
    if (!CGImageDestinationFinalize(destination)) {
    }
    CFRelease(destination);
}
In this code I'm using the images at their default size.
How do I reduce my memory allocation?
And is there any way to reduce the size of the images without losing quality?
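Not an official fix, but one thing that stands out: unlike the first snippet above, this loop has no @autoreleasepool, so every frame's temporaries accumulate until the method returns. A minimal sketch of the same loop with the pool added:
for (int i = 0; i < imageArray.count; i++) {
    @autoreleasepool { // drain per-frame temporaries before the next iteration
        UIImage *image = imageArray[i];
        NSDictionary *frameDurationProperties = @{
            (id)kCGImagePropertyGIFDictionary: @{
                (id)kCGImagePropertyGIFDelayTime: _durationArray[i % _durationArray.count],
            }
        };
        CGImageDestinationAddImage(destination, image.CGImage, (__bridge CFDictionaryRef)frameDurationProperties);
    }
}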
I'm building an app that allows users to select photos and videos from their device and upload them to the server. I'm trying to get the file size (in bytes) of each selected item; can anyone help me out?
if ([dict objectForKey:UIImagePickerControllerMediaType] == ALAssetTypePhoto){ // image file
    if ([dict objectForKey:UIImagePickerControllerOriginalImage]){
        NSURL *urlPath = [dict objectForKey:@"UIImagePickerControllerReferenceURL"];
        item = [BundleItem itemWithPath:urlPath AndDescription:nil];
        item.itemImage = [dict objectForKeyedSubscript:UIImagePickerControllerOriginalImage];
        item.itemType = 1; // image
        item.itemSize = // what do I need here??
        [m_items addObject:item];
    }
} else if ([dict objectForKey:UIImagePickerControllerMediaType] == ALAssetTypeVideo){ // video file
    if ([dict objectForKey:UIImagePickerControllerOriginalImage]){
        NSURL *urlPath = [dict objectForKey:@"UIImagePickerControllerReferenceURL"];
        item = [BundleItem itemWithPath:urlPath AndDescription:nil];
        item.itemImage = [dict objectForKeyedSubscript:UIImagePickerControllerOriginalImage];
        item.itemType = 2; // video
        item.itemSize = // what do I need here??
        [m_items addObject:item];
    }
}
EDIT: Getting NSCocoaErrorDomain code 256 with videos:
NSURL *urlPath = [dict objectForKey:@"UIImagePickerControllerReferenceURL"];
item = [BundleItem itemWithPath:urlPath AndDescription:nil];
item.itemImage = [dict objectForKeyedSubscript:UIImagePickerControllerOriginalImage];
item.itemType = 2; // video
// Error container
NSError *attributesError;
NSDictionary *fileAttributes = [[NSFileManager defaultManager] attributesOfItemAtPath:[urlPath path] error:&attributesError];
NSNumber *fileSizeNumber = [fileAttributes objectForKey:NSFileSize];
long fileSize = [fileSizeNumber longValue];
item.itemSize = fileSize;
[m_items addObject:item];
For image data selection only:
item.itemImage = (UIImage *)[info valueForKey:UIImagePickerControllerOriginalImage];
NSData *imgData = UIImageJPEGRepresentation(item.itemImage, 1); // 1 is the maximum JPEG quality
NSLog(@"Size of image (bytes): %lu", (unsigned long)[imgData length]);
Hope this helps.
The method below is generalized and will work for both images and videos.
Something like this should take care of finding the file size of a selected image or video returned from the UIImagePickerController:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSURL *videoUrl = (NSURL *)[info objectForKey:UIImagePickerControllerMediaURL];
    // Error container
    NSError *attributesError;
    NSDictionary *fileAttributes = [[NSFileManager defaultManager] attributesOfItemAtPath:[videoUrl path] error:&attributesError];
    NSNumber *fileSizeNumber = [fileAttributes objectForKey:NSFileSize];
    long long fileSize = [fileSizeNumber longLongValue];
}
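Note that UIImagePickerControllerMediaURL is typically only populated for movies; for photos you usually get only the reference URL, which is an assets-library:// URL rather than a file path, so attributesOfItemAtPath: cannot read it (a plausible cause of the NSCocoaErrorDomain code 256 above). A hedged sketch that reads the byte count through ALAssetsLibrary instead (urlPath, item, and m_items are taken from the question's code):
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:urlPath resultBlock:^(ALAsset *asset) {
    // defaultRepresentation.size is the on-disk byte count of the asset
    item.itemSize = [[asset defaultRepresentation] size];
    [m_items addObject:item]; // assetForURL: is asynchronous, so finish the item here
} failureBlock:^(NSError *error) {
    NSLog(@"could not load asset: %@", error);
}];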
I am trying to programmatically create a GIF in iOS, using the following Stack Overflow question:
Create and export an animated gif via iOS?
My code looks like this:
// File parameters
const void *keys[] = { kCGImagePropertyGIFLoopCount };
const void *values[] = { (CFNumberRef) 0 };
CFDictionaryRef params = CFDictionaryCreate(NULL, keys, values, 1, NULL, NULL);
const void *keys2[] = { kCGImagePropertyGIFDictionary };
const void *values2[] = { (CFDictionaryRef) params };
CFDictionaryRef fileProperties = CFDictionaryCreate(NULL, keys2, values2, 1, NULL, NULL);
// URL to the documents directory
NSURL *documentsDirectoryURL = [[NSFileManager defaultManager] URLForDirectory:NSDocumentDirectory inDomain:NSUserDomainMask appropriateForURL:nil create:YES error:nil];
NSURL *fileURL = [documentsDirectoryURL URLByAppendingPathComponent:fileName];
// Object that writes GIF to the specified URL
CGImageDestinationRef destination = CGImageDestinationCreateWithURL((__bridge CFURLRef)fileURL, kUTTypeGIF, [arrayOfAllFrames count], NULL);
CGImageDestinationSetProperties(destination, fileProperties);
for (NSUInteger i = 0; i < [arrayOfAllFrames count]; i++) {
    @autoreleasepool {
        float delayTime = [[gifFramesDuration objectAtIndex:i] floatValue];
        NSDictionary *frameProperties = @{
            (__bridge id)kCGImagePropertyGIFDictionary: @{
                (__bridge id)kCGImagePropertyGIFDelayTime: [NSNumber numberWithFloat:delayTime] // a float (not double!) in seconds, rounded to centiseconds in the GIF data
            }
        };
        UIImage *myImage = [arrayOfAllFrames objectAtIndex:i];
        CGImageDestinationAddImage(destination, myImage.CGImage, (__bridge CFDictionaryRef)frameProperties);
    }
}
if (!CGImageDestinationFinalize(destination)) {
    NSLog(@"failed to finalize image destination");
}
CFRelease(destination);
CFRelease(fileProperties);
CFRelease(params);
However, once I try to add around 240 frames to the GIF file, the debugger throws the following error when CGImageDestinationFinalize gets called:
(923,0xb0115000) malloc: *** error for object 0xd1e7204: incorrect checksum for freed object - object was probably modified after being freed.
Could you please suggest a workaround, or how to avoid this malloc error?
First of all, try debugging your app using Instruments. You will probably notice that the problem is caused by the method:
GenerateFromRGBImageWu
I wondered whether the cause was my threading implementation, but it turns out that once you hit this kind of error, you should focus on resizing the images.
Once the images have been resized, the code published above will generate your GIF.
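The answer doesn't include the resizing code itself; a minimal sketch of a downscaling helper (the name and the 320x320 target are illustrative assumptions) that you would run on each frame before CGImageDestinationAddImage:
// Hypothetical helper: draw the image into a smaller bitmap context
UIImage *ResizedFrame(UIImage *image, CGSize targetSize) {
    UIGraphicsBeginImageContextWithOptions(targetSize, NO, 1.0);
    [image drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
    UIImage *resized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return resized;
}
Calling ResizedFrame(myImage, CGSizeMake(320, 320)) inside the @autoreleasepool block keeps both the working set and the encoder's per-frame buffers small.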
I have searched all over the web and cannot find a tutorial on how to use the SoundTouch library for beat detection.
(Note: I have no C++ experience prior to this. I do know C, Objective-C, and Java. So I could have messed some of this up, but it compiles.)
I added the framework to my project and managed to get the following to compile:
NSString *path = [[NSBundle mainBundle] pathForResource:@"song" ofType:@"wav"];
NSData *data = [NSData dataWithContentsOfFile:path];
player = [[AVAudioPlayer alloc] initWithData:data error:NULL];
player.volume = 1.0;
player.delegate = self;
[player prepareToPlay];
[player play];
NSUInteger len = [player.data length]; // Get the length of the data
soundtouch::SAMPLETYPE sampleBuffer[len]; // Create buffer array
[player.data getBytes:sampleBuffer length:len]; // Copy the bytes into the buffer
soundtouch::BPMDetect *BPM = new soundtouch::BPMDetect(player.numberOfChannels, [[player.settings valueForKey:@"AVSampleRateKey"] longValue]); // This is working (tested)
BPM->inputSamples(sampleBuffer, len); // Send the samples to the BPM class
NSLog(@"Beats Per Minute = %f", BPM->getBpm()); // Print out the BPM - currently returns 0.00 for errors per documentation
The song byte information passed to inputSamples(*samples, numSamples) confuses me.
How do I get these pieces of information from a song file?
I tried using memcpy(), but it doesn't seem to be working.
Does anyone have any thoughts?
After hours and hours of debugging and reading the limited documentation on the web, I modified a few things before stumbling upon this: you need to divide numSamples by numberOfChannels in the inputSamples() call.
My final code is like so:
NSString *path = [[NSBundle mainBundle] pathForResource:@"song" ofType:@"wav"];
NSData *data = [NSData dataWithContentsOfFile:path];
player = [[AVAudioPlayer alloc] initWithData:data error:NULL];
player.volume = 1.0; // optional to play music
player.delegate = self;
[player prepareToPlay]; // optional to play music
[player play]; // optional to play music
NSUInteger len = [player.data length];
soundtouch::SAMPLETYPE sampleBuffer[len];
[player.data getBytes:sampleBuffer length:len];
soundtouch::BPMDetect BPM(player.numberOfChannels, [[player.settings valueForKey:@"AVSampleRateKey"] longValue]);
BPM.inputSamples(sampleBuffer, len / player.numberOfChannels);
NSLog(@"Beats Per Minute = %f", BPM.getBpm());
I've tried this solution to read the BPM from MP3 files (using the TSLibraryImport class to convert to WAV) inside the iOS music library:
MPMediaItem *item = [collection representativeItem];
NSURL *urlStr = [item valueForProperty:MPMediaItemPropertyAssetURL];
TSLibraryImport *import = [[TSLibraryImport alloc] init];
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSURL *destinationURL = [NSURL fileURLWithPath:[documentsDirectory stringByAppendingPathComponent:@"temp_data"]];
[[NSFileManager defaultManager] removeItemAtURL:destinationURL error:nil];
[import importAsset:urlStr toURL:destinationURL completionBlock:^(TSLibraryImport *import) {
    NSString *outPath = [documentsDirectory stringByAppendingPathComponent:@"temp_data"];
    NSData *data = [NSData dataWithContentsOfFile:outPath];
    AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithData:data error:NULL];
    NSUInteger len = [player.data length];
    int numChannels = player.numberOfChannels;
    soundtouch::SAMPLETYPE sampleBuffer[1024];
    soundtouch::BPMDetect *BPM = new soundtouch::BPMDetect(player.numberOfChannels, [[player.settings valueForKey:@"AVSampleRateKey"] longValue]);
    for (NSUInteger i = 0; i <= len - 1024; i = i + 1024) {
        NSRange r = NSMakeRange(i, 1024);
        //NSData *temp = [player.data subdataWithRange:r];
        [player.data getBytes:sampleBuffer range:r];
        int samples = sizeof(sampleBuffer) / numChannels;
        BPM->inputSamples(sampleBuffer, samples); // Send the samples to the BPM class
    }
    NSLog(@"Beats Per Minute = %f", BPM->getBpm());
}];
The strange thing is that the calculated BPM is always the same value:
2013-10-02 03:05:36.725 AppTestAudio[1464:1803] Beats Per Minute = 117.453835
No matter which track was analyzed, i.e., regardless of the number of frames or the buffer size (here I used a 2K buffer, as in the SoundTouch example in the library's source code).
For Swift 3:
https://github.com/Luccifer/BPM-Analyser
And use it like:
guard let filePath = Bundle.main.path(forResource: "TestMusic", ofType: "m4a"),
      let url = URL(string: filePath) else { return "error occurred, check fileURL" }
BPMAnalyzer.core.getBpmFrom(url, completion: nil)
Feel free to comment!
In order to prevent lag in my app, I'm trying to compress images larger than 1 MB (mostly pics taken with the iPhone's normal camera).
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
NSData *imageSize = UIImageJPEGRepresentation(image, 1);
NSLog(@"original size %u", [imageSize length]);
UIImage *image2 = [UIImage imageWithData:UIImageJPEGRepresentation(image, 0)];
NSData *newImageSize = UIImageJPEGRepresentation(image2, 1);
NSLog(@"new size %u", [newImageSize length]);
UIImage *image3 = [UIImage imageWithData:UIImageJPEGRepresentation(image2, 0)];
NSData *newImageSize2 = UIImageJPEGRepresentation(image3, 1);
NSLog(@"new size %u", [newImageSize2 length]);
picView = [[UIImageView alloc] initWithImage:image3];
However, the NSLog I get outputs something along the lines of
original size 3649058
new size 1835251
new size 1834884
The difference between the 1st and 2nd compression is almost negligible. My goal is to get the image size below 1 MB. Did I overlook something, or is there an alternative approach to achieve this?
EDIT: I want to avoid scaling the image's height and width, if possible.
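One approach that respects that constraint (a sketch, not from the question; the function name and the 0.1 step are arbitrary choices) is to encode the original UIImage once per attempt, stepping compressionQuality down until the data fits the budget, instead of re-encoding an already-compressed image at quality 1:
// Hypothetical helper: lower JPEG quality until the payload fits maxBytes
NSData *JPEGDataUnderBudget(UIImage *image, NSUInteger maxBytes) {
    CGFloat quality = 0.9;
    NSData *data = UIImageJPEGRepresentation(image, quality);
    while (data.length > maxBytes && quality > 0.1) {
        quality -= 0.1;
        data = UIImageJPEGRepresentation(image, quality);
    }
    return data; // may still exceed maxBytes if quality bottoms out
}
For example, JPEGDataUnderBudget(image, 1024 * 1024) targets the 1 MB ceiling mentioned above.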
A couple of thoughts:
The UIImageJPEGRepresentation function does not return the "original" image. For example, if you employ a compressionQuality of 1.0, it does not, technically, return the "original" image, but rather it returns a JPEG rendition of the image with compressionQuality at its maximum value. This can actually yield an object that is larger than the original asset (at least if the original image is a JPEG). You're also discarding all of the metadata (information about where the image was taken, the camera settings, etc.) in the process.
If you want the original asset, you should use PHImageManager:
NSURL *url = [info objectForKey:UIImagePickerControllerReferenceURL];
PHFetchResult *result = [PHAsset fetchAssetsWithALAssetURLs:@[url] options:nil];
PHAsset *asset = [result firstObject];
PHImageManager *manager = [PHImageManager defaultManager];
[manager requestImageDataForAsset:asset options:nil resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    NSString *filename = [(NSURL *)info[@"PHImageFileURLKey"] lastPathComponent];
    // do what you want with the `imageData`
}];
In iOS versions prior to 8, you'd have to use assetForURL of the ALAssetsLibrary class:
NSURL *url = [info objectForKey:UIImagePickerControllerReferenceURL];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:url resultBlock:^(ALAsset *asset) {
    ALAssetRepresentation *representation = [asset defaultRepresentation];
    NSLog(@"size of original asset %llu", [representation size]);
    // I generally would write directly to a `NSOutputStream`, but if you want it in a
    // NSData, it would be something like:
    NSMutableData *data = [NSMutableData data];
    // now loop, reading data into buffer and writing that to our data stream
    NSError *error;
    long long bufferOffset = 0ll;
    NSInteger bufferSize = 10000;
    long long bytesRemaining = [representation size];
    uint8_t buffer[bufferSize];
    NSUInteger bytesRead;
    while (bytesRemaining > 0) {
        bytesRead = [representation getBytes:buffer fromOffset:bufferOffset length:bufferSize error:&error];
        if (bytesRead == 0) {
            NSLog(@"error reading asset representation: %@", error);
            return;
        }
        bytesRemaining -= bytesRead;
        bufferOffset += bytesRead;
        [data appendBytes:buffer length:bytesRead];
    }
    // ok, successfully read original asset;
    // do whatever you want with it here
} failureBlock:^(NSError *error) {
    NSLog(@"error=%@", error);
}];
Please note that assetForURL: runs asynchronously.
If you want a NSData with compression, you can use UIImageJPEGRepresentation with a compressionQuality less than 1.0. Your code actually does this with a compressionQuality of 0.0, which should offer maximum compression. But you don't save that NSData; rather, you use it to create a UIImage and then get a new UIImageJPEGRepresentation with a compressionQuality of 1.0, thus losing much of the compression you originally achieved.
Consider the following code:
// a UIImage of the original asset (discarding metadata)
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
// this may well be larger than the original asset
NSData *jpgDataHighestCompressionQuality = UIImageJPEGRepresentation(image, 1.0);
[jpgDataHighestCompressionQuality writeToFile:[docsPath stringByAppendingPathComponent:@"imageDataFromJpeg.jpg"] atomically:YES];
NSLog(@"compressionQuality = 1.0; length = %u", [jpgDataHighestCompressionQuality length]);
// this will be smaller, but with some loss of data
NSData *jpgDataLowestCompressionQuality = UIImageJPEGRepresentation(image, 0.0);
NSLog(@"compressionQuality = 0.0; length = %u", [jpgDataLowestCompressionQuality length]);
UIImage *image2 = [UIImage imageWithData:jpgDataLowestCompressionQuality];
// ironically, this will be larger than jpgDataLowestCompressionQuality
NSData *newImageSize = UIImageJPEGRepresentation(image2, 1.0);
NSLog(@"new size %u", [newImageSize length]);
In addition to the JPEG compression quality outlined in the prior point, you could also just resize the image. You can marry this with the JPEG compressionQuality, too.
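A sketch of marrying the two (the scale factor, quality, and function name are illustrative):
// Hypothetical helper: scale down, then JPEG-encode at reduced quality
NSData *ScaledJPEGData(UIImage *image, CGFloat scale, CGFloat quality) {
    CGSize newSize = CGSizeMake(image.size.width * scale, image.size.height * scale);
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 1.0);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return UIImageJPEGRepresentation(scaled, quality);
}
Halving both dimensions alone cuts the pixel count to a quarter, so even a moderate quality like 0.7 usually lands well under the sizes logged above.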
You cannot meaningfully compress the image again and again; if you could, everything could be compressed down to almost nothing.
One way to make your image smaller is to change its size, for example from 640x960 to 320x480, but you will lose quality.
My approach is to first apply UIImageJPEGRepresentation(image, 0.75) and then change the size, perhaps to two-thirds or half of the image's width and height.