CGDataProviderCopyData builds up in memory causing crash - iOS

Okay, so I'm downloading a bunch of large-ish images (~5 MB) from a server in pieces, then stitching the pieces together and rendering the total image from a byte array. However, I've realized that the data for each image is not being released, and consequently builds up, causing a memory warning and a crash of my app. I thought that because of my explicit (__bridge_transfer NSData *) cast, ARC would take care of releasing the object, but it's still proving to be a problem. In Instruments, objects called "CGDataProviderCopyData" of ~1 MB build up and are not discarded for each file that is being stitched into the whole image. Any ideas, or anyone who can steer me in the right direction? Much obliged.
// Create array to add all files into total image
NSMutableArray *byteArray = [[NSMutableArray alloc] initWithCapacity:(imageHeight * imageWidth)];
// Iterate through each file in files array
for (NSString *file in array)
{
    // Set baseURL for individual file path
    NSString *baseURL = [NSString stringWithFormat:@"http://xx.225.xxx.xxx%@", [imageInfo objectForKey:@"BaseURL"]];
    // Specify imagePath by appending file name to baseURL
    NSString *imagePath = [NSString stringWithFormat:@"%@%@", baseURL, file];
    // Change NSString --> NSURL --> NSData
    NSURL *imageUrl = [NSURL URLWithString:imagePath];
    NSData *imageData = [NSData dataWithContentsOfURL:imageUrl];
    // Create image from imageData
    UIImage *image = [UIImage imageWithData:imageData];
    CGImageRef cgimage = image.CGImage;
    size_t width = CGImageGetWidth(cgimage);
    size_t height = CGImageGetHeight(cgimage);
    size_t bpr = CGImageGetBytesPerRow(cgimage);
    size_t bpp = CGImageGetBitsPerPixel(cgimage);
    size_t bpc = CGImageGetBitsPerComponent(cgimage);
    size_t bytes_per_pixel = bpp / bpc;
    // Get CGDataProviderRef from cgimage
    CGDataProviderRef provider = CGImageGetDataProvider(cgimage);
    // This is the object that is not being released
    NSData *data = (__bridge_transfer NSData *)CGDataProviderCopyData(provider); // (__bridge_transfer NSData *) hands ownership of the copy to ARC, but it is still not discarded
    const UInt8 *bytes = (const UInt8 *)[data bytes];
    // Log which file is currently being iterated through
    NSLog(@"---Stitching png file to total image: %@", file);
    // Populate byte array with channel data from each pixel
    for (size_t row = 0; row < height; row++)
    {
        for (size_t col = 0; col < width; col++)
        {
            const UInt8 *pixel = &bytes[row * bpr + col * bytes_per_pixel];
            for (unsigned short i = 0; i < 4; i += 4)
            {
                __unused unsigned short red   = pixel[i];     // red channel - unused
                unsigned short green          = pixel[i + 1]; // green channel
                unsigned short blue           = pixel[i + 2]; // blue channel
                __unused unsigned short alpha = pixel[i + 3]; // alpha channel - unused
                // Create DICOM intensity value from intensity = (g * 256) + b
                unsigned short dicomInt = ((green * 256) + blue);
                // Convert unsigned short intensity value to NSNumber so it can be stored in the array as an object
                NSNumber *DICOMvalue = [NSNumber numberWithInt:dicomInt];
                // Add to image array (total image)
                [byteArray addObject:DICOMvalue];
            }
        }
    }
    data = nil;
}
return byteArray;
Running "Analyze" through Xcode doesn't show any apparent leaks either.

I took this code, almost verbatim, and did some more investigation. With the CFDataRef/NSData, I was able to see the problem you were seeing with the NSDatas not going away, and I was able to solve it by wrapping the portion of the code that uses the NSData in an @autoreleasepool scope, like this:
// Create array to add all files into total image
NSMutableArray *byteArray = [[NSMutableArray alloc] initWithCapacity:(imageHeight * imageWidth)];
// Iterate through each file in files array
for (NSString *file in array)
{
    // Set baseURL for individual file path
    NSString *baseURL = [NSString stringWithFormat:@"http://xx.225.xxx.xxx%@", [imageInfo objectForKey:@"BaseURL"]];
    // Specify imagePath by appending file name to baseURL
    NSString *imagePath = [NSString stringWithFormat:@"%@%@", baseURL, file];
    // Change NSString --> NSURL --> NSData
    NSURL *imageUrl = [NSURL URLWithString:imagePath];
    NSData *imageData = [NSData dataWithContentsOfURL:imageUrl];
    // Create image from imageData
    UIImage *image = [UIImage imageWithData:imageData];
    CGImageRef cgimage = image.CGImage;
    size_t width = CGImageGetWidth(cgimage);
    size_t height = CGImageGetHeight(cgimage);
    size_t bpr = CGImageGetBytesPerRow(cgimage);
    size_t bpp = CGImageGetBitsPerPixel(cgimage);
    size_t bpc = CGImageGetBitsPerComponent(cgimage);
    size_t bytes_per_pixel = bpp / bpc;
    // Get CGDataProviderRef from cgimage
    CGDataProviderRef provider = CGImageGetDataProvider(cgimage);
    @autoreleasepool
    {
        // This is the object that was not being released
        NSData *data = (__bridge_transfer NSData *)CGDataProviderCopyData(provider);
        const UInt8 *bytes = (const UInt8 *)[data bytes];
        // Log which file is currently being iterated through
        NSLog(@"---Stitching png file to total image: %@", file);
        // Populate byte array with channel data from each pixel
        for (size_t row = 0; row < height; row++)
        {
            for (size_t col = 0; col < width; col++)
            {
                const UInt8 *pixel = &bytes[row * bpr + col * bytes_per_pixel];
                for (unsigned short i = 0; i < 4; i += 4)
                {
                    __unused unsigned short red   = pixel[i];     // red channel - unused
                    unsigned short green          = pixel[i + 1]; // green channel
                    unsigned short blue           = pixel[i + 2]; // blue channel
                    __unused unsigned short alpha = pixel[i + 3]; // alpha channel - unused
                    // Create DICOM intensity value from intensity = (g * 256) + b
                    unsigned short dicomInt = ((green * 256) + blue);
                    // Convert unsigned short intensity value to NSNumber so it can be stored in the array as an object
                    NSNumber *DICOMvalue = [NSNumber numberWithInt:dicomInt];
                    // Add to image array (total image)
                    [byteArray addObject:DICOMvalue];
                }
            }
        }
        data = nil;
    }
}
return byteArray;
After adding that @autoreleasepool, I then commented out the part where you create NSNumbers and put them in the array, and I was able to see in the Allocations template of Instruments that indeed the CFData objects were now being released with each turn of the loop.
The reason I commented out the part where you create NSNumbers and put them in the array, is that with that code in there, you're going to end up adding width * height * 4 NSNumbers to byteArray. This means that even if the NSData was being released properly, your heap use would be going up by width * height * 4 * <at least 4 bytes, maybe more> no matter what. Maybe that's what you need to do, but it sure made it harder for me to see what was going on with the NSDatas because their size was being dwarfed by the array of NSNumbers.
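If the NSNumber boxing itself turns out to be the memory problem, one alternative (a sketch of my own, not part of the original answer, assuming a raw 16-bit buffer works for your downstream code) is to append the intensities to an NSMutableData instead of boxing each one:
// Sketch: accumulate raw 16-bit intensities instead of boxed NSNumbers
NSMutableData *intensities = [NSMutableData dataWithCapacity:imageHeight * imageWidth * sizeof(unsigned short)];
// ... inside the pixel loop, in place of the NSNumber/addObject lines:
unsigned short dicomInt = (green * 256) + blue;
[intensities appendBytes:&dicomInt length:sizeof(dicomInt)];
// ... later, read the values back without any per-element objects:
const unsigned short *values = (const unsigned short *)[intensities bytes];
NSUInteger count = [intensities length] / sizeof(unsigned short);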
Hope that helps.

Related

Pass a list of colors from Objective-C to React Native

I am trying to get a list of all the colors in an image in Objective-C. Note, I am COMPLETELY new to Objective-C - I've done some Swift work in the past, but not really Objective-C.
I pulled a library that more or less is supposed to pull all colors as part of its code. I've modified it to look like this (callback at the end is from React Native, path argument is just a string of the path):
RCT_EXPORT_METHOD(getColors:(NSString *)path options:(NSDictionary *)options callback:(RCTResponseSenderBlock)callback)
{
    UIImage *originalImage = [UIImage imageWithContentsOfFile:path];
    UIImage *image = [UIImage imageWithCGImage:[originalImage CGImage]
                                         scale:0.5
                                   orientation:UIImageOrientationUp];
    CGImageRef cgImage = [image CGImage];
    NSUInteger width = CGImageGetWidth(cgImage);
    NSUInteger height = CGImageGetHeight(cgImage);
    // Allocate storage for the pixel data
    unsigned char *rawData = (unsigned char *)malloc(height * width * 4);
    // Create the color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // Set some metrics
    NSUInteger bytesPerPixel = 4;
    NSUInteger bytesPerRow = bytesPerPixel * width;
    NSUInteger bitsPerComponent = 8;
    // Create context using the storage
    CGContextRef context = CGBitmapContextCreate(rawData, width, height, bitsPerComponent, bytesPerRow, colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    // Release the color space
    CGColorSpaceRelease(colorSpace);
    // Draw the image into the storage
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), cgImage);
    // We are done with the context
    CGContextRelease(context);
    // Determine the colours in the image
    NSMutableArray *colours = [NSMutableArray new];
    float x = 0;
    float y = 0;
    for (int n = 0; n < (width * height); n++) {
        int index = (bytesPerRow * y) + x * bytesPerPixel;
        int red = rawData[index];
        int green = rawData[index + 1];
        int blue = rawData[index + 2];
        int alpha = rawData[index + 3];
        NSArray *a = [NSArray arrayWithObjects:[NSString stringWithFormat:@"%i", red], [NSString stringWithFormat:@"%i", green], [NSString stringWithFormat:@"%i", blue], [NSString stringWithFormat:@"%i", alpha], nil];
        [colours addObject:a];
        y++;
        if (y == height) {
            y = 0;
            x++;
        }
    }
    free(rawData);
    callback(@[[NSNull null], colours]);
}
Now, this script seems fairly simple - it should iterate over each pixel and add each color to an array, which is then returned to React Native via the callback.
However, the response to the call is always an empty array.
I'm not sure why that is. Could it be due to where the images are located (they're at AWS, on S3), or something in the algorithm? The code looks right to me, but it's entirely possible that I'm missing something just due to unfamiliarity with Objective-C.
I ran your code in an empty project and it performs as expected using an image loaded from the assets library. Is it possible that the UIImage *originalImage = [UIImage imageWithContentsOfFile:path]; call uses an invalid path? You can easily validate that by simply logging the value of the read image:
UIImage *originalImage = [UIImage imageWithContentsOfFile:path];
NSLog(@"image read from file %@", originalImage);
If the image was not read properly from the file, you will get an empty colours array: the width and height will be zero, so there will be nothing to loop over.
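For instance, a minimal guard (hypothetical, not in the original code) right after reading the image would surface the problem instead of silently returning an empty array:
// Hypothetical early exit if the image could not be read from the path
if (originalImage == nil) {
    callback(@[@"could not load image at path", [NSNull null]]);
    return;
}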
Also, to avoid the array being modified after your function has returned, it is generally good practice to return a copy of the mutable object, or an immutable object (i.e. NSArray instead of NSMutableArray):
callback(@[[NSNull null], [colours copy]]);
Hope this helps
The issue was ultimately that the image download method was returning null - not sure why.
So I took this:
UIImage *originalImage = [UIImage imageWithContentsOfFile:path];
I changed it to this:
NSData * imageData = [[NSData alloc] initWithContentsOfURL: [NSURL URLWithString: path]];
UIImage *originalImage = [UIImage imageWithData: imageData];
And now my image downloads just fine and the rest of the script works great.
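One caveat worth adding (my aside, not part of the original answer): initWithContentsOfURL: blocks the calling thread, so for remote images an asynchronous fetch is usually preferable. A sketch using NSURLSession, assuming the same path string:
// Sketch: download the image data asynchronously instead of blocking the thread
NSURLSessionDataTask *task = [[NSURLSession sharedSession]
     dataTaskWithURL:[NSURL URLWithString:path]
   completionHandler:^(NSData *imageData, NSURLResponse *response, NSError *error) {
       UIImage *originalImage = imageData ? [UIImage imageWithData:imageData] : nil;
       // ... run the pixel loop here and invoke the React Native callback with the result ...
   }];
[task resume];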

imagemagick iOS - black border on animated gif

I'm creating a gif as below, but I always end up with a black line on one or more of the edges, as the attached image shows. How can I avoid this, please?
// Create gif
MagickWand *mw = NewMagickWand();
MagickSetFormat(mw, "gif");
for (int i = 0; i < [self.finalImageArray count]; i++)
{
    float interval = (100 / 8); // integer division: evaluates to 12
    MagickWand *localWand = NewMagickWand();
    UIImage *image = [self.finalImageArray objectAtIndex:i];
    NSData *dataObj = UIImagePNGRepresentation(image);
    MagickReadImageBlob(localWand, [dataObj bytes], [dataObj length]);
    MagickThumbnailImage(localWand, 320, 320);
    MagickSetImageDelay(localWand, interval);
    MagickAddImage(mw, localWand);
    DestroyMagickWand(localWand);
}
size_t my_size;
unsigned char *my_image = MagickGetImagesBlob(mw, &my_size);
NSData *gifData = [[NSData alloc] initWithBytes:my_image length:my_size];
The problem isn't with the creation of the gif; the black line is created earlier, when I scale the image.
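For what it's worth, a common cause (an assumption on my part, since the scaling code isn't shown) is drawing into a bitmap context whose size doesn't exactly match the draw rect, which leaves a row or column of undrawn pixels along an edge. Rounding the target size to whole pixels and drawing over the full context avoids that:
// Sketch: scale with an integral-sized context so no edge pixels are left undrawn
// (newWidth, newHeight, and sourceImage are placeholders for your own scaling code)
CGSize target = CGSizeMake(round(newWidth), round(newHeight));
UIGraphicsBeginImageContextWithOptions(target, NO, 1.0);
[sourceImage drawInRect:CGRectMake(0, 0, target.width, target.height)];
UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();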

Error decoding animated webp iOS

I've been struggling for two days to display an animated webp image in a UIImageView with no success whatsoever.
Mainly, the problem is in the decoding step of the file, which gives this error: VP8_STATUS_UNSUPPORTED_FEATURE.
I tried
https://github.com/seanooi/iOS-WebP
https://github.com/mattt/WebPImageSerialization
These projects provide code for creating a UIImage from webp files, and they work fine with non-animated images, but they both fail with the same error as above when attempting to decode images with animation.
My device is jailbroken, and checking the filesystem I saw that Facebook's Messenger app has some of its stickers in .webp format; in the License they also mention Google's "webp" library, so I'm sure it's somehow possible.
Managed to decode animated .webp using the code snippet at the top of this header, which also contains explanations of the data structures used.
static NSDictionary* DecodeWebPURL(NSURL *url) {
    NSMutableDictionary *info = [NSMutableDictionary dictionary];
    NSMutableArray *images = [NSMutableArray array];
    NSData *imgData = [NSData dataWithContentsOfURL:url];
    WebPData data;
    WebPDataInit(&data);
    data.bytes = (const uint8_t *)[imgData bytes];
    data.size = [imgData length];
    WebPDemuxer* demux = WebPDemux(&data);
    int width = WebPDemuxGetI(demux, WEBP_FF_CANVAS_WIDTH);
    int height = WebPDemuxGetI(demux, WEBP_FF_CANVAS_HEIGHT);
    uint32_t flags = WebPDemuxGetI(demux, WEBP_FF_FORMAT_FLAGS);
    if (flags & ANIMATION_FLAG) {
        WebPIterator iter;
        if (WebPDemuxGetFrame(demux, 1, &iter)) {
            WebPDecoderConfig config;
            WebPInitDecoderConfig(&config);
            config.input.height = height;
            config.input.width = width;
            config.input.has_alpha = iter.has_alpha;
            config.input.has_animation = 1;
            config.options.no_fancy_upsampling = 1;
            config.options.bypass_filtering = 1;
            config.options.use_threads = 1;
            config.output.colorspace = MODE_RGBA;
            [info setObject:[NSNumber numberWithInt:iter.duration] forKey:@"duration"];
            do {
                WebPData frame = iter.fragment;
                VP8StatusCode status = WebPDecode(frame.bytes, frame.size, &config);
                if (status != VP8_STATUS_OK) {
                    NSLog(@"Error decoding frame");
                }
                uint8_t *data = WebPDecodeRGBA(frame.bytes, frame.size, &width, &height);
                CGDataProviderRef provider = CGDataProviderCreateWithData(&config, data, config.options.scaled_width * config.options.scaled_height * 4, NULL);
                CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
                CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault | kCGImageAlphaLast;
                CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;
                CGImageRef imageRef = CGImageCreate(width, height, 8, 32, 4 * width, colorSpaceRef, bitmapInfo, provider, NULL, YES, renderingIntent);
                [images addObject:[UIImage imageWithCGImage:imageRef]];
                CGImageRelease(imageRef);
                CGColorSpaceRelease(colorSpaceRef);
                CGDataProviderRelease(provider);
            } while (WebPDemuxNextFrame(&iter));
            WebPDemuxReleaseIterator(&iter);
        }
    }
    WebPDemuxDelete(demux);
    [info setObject:images forKey:@"frames"];
    return info;
}
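To actually display the result (a usage sketch of my own, assuming the info dictionary produced above and a UIImageView called imageView), the decoded frames can be combined into a single animated UIImage:
// Sketch: turn the decoded frames into an animated UIImage
NSDictionary *info = DecodeWebPURL(url);
NSArray *frames = info[@"frames"];
// iter.duration is per-frame and in milliseconds
NSTimeInterval totalDuration = ([info[@"duration"] intValue] / 1000.0) * [frames count];
imageView.image = [UIImage animatedImageWithImages:frames duration:totalDuration];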

How to transform a byte array into a short array with Objective-C?

I'm developing a mobile application for iOS related to voice recording.
Because of that, I'm developing some different sound effects to modify the recorded voice, but I'm having trouble implementing some of them.
I'm trying to create an echo/delay effect, and I need to transform a byte array into a short array, but I have no idea how to do it in Objective-C.
Thanks.
This is my current source code to implement it, but since Byte is a very small type, applying the attenuation (which produces a float value) creates awful noise in my audio.
- (NSURL *)echo:(NSURL *)input output:(NSURL *)output{
    int delay = 50000;
    float attenuation = 0.5f;
    NSMutableData *audioData = [NSMutableData dataWithContentsOfURL:input];
    NSUInteger dataSize = [audioData length] - 44;
    NSUInteger audioLength = [audioData length];
    NSUInteger newAudioLength = audioLength + delay;
    // Copy bytes
    Byte *byteData = (Byte *)malloc(audioLength);
    memcpy(byteData, [audioData bytes], audioLength);
    short *shortData = (short *)malloc(audioLength / 2);
    // create a new array to store new modified data
    Byte *newByteData = (Byte *)malloc(newAudioLength);
    newByteData = byteData;
    for (int i = 44; i < audioLength - delay; i++)
    {
        newByteData[i + delay] += byteData[i] * attenuation;
    }
    // Copy bytes in a new NSMutableData
    NSMutableData *newAudioData = [NSMutableData dataWithBytes:newByteData length:newAudioLength];
    // Store in a file
    [newAudioData writeToFile:[output path] atomically:YES];
    // Set WAV size
    [[AudioUtils alloc] setAudioFileSize:output];
    return output;
}
Finally, I was able to finish my echo effect by implementing these four methods. I hope they will be useful for you.
Byte to short array
- (short *)byte2short:(Byte *)bytes size:(int)size resultSize:(int)resultSize{
    // calloc (rather than malloc) so the padding beyond size/2 starts zeroed;
    // the effect method accumulates into that tail region
    short *shorts = (short *)calloc(resultSize, sizeof(short));
    for (int i = 0; i < size/2; i++){
        shorts[i] = (bytes[i*2+1] << 8) | bytes[i*2];
    }
    return shorts;
}
Short to byte array
- (Byte *)short2byte:(short *)shorts size:(int)size resultSize:(int)resultSize{
    Byte *bytes = (Byte *)malloc(sizeof(Byte) * resultSize);
    for (int i = 0; i < size; i++)
    {
        bytes[i * 2] = (Byte)(shorts[i] & 0x00FF);
        bytes[(i * 2) + 1] = (Byte)(shorts[i] >> 8);
        shorts[i] = 0;
    }
    return bytes;
}
Effect
- (NSMutableData *)effect:(NSMutableData *)data delay:(int)delay attenuation:(float)attenuation{
    NSUInteger audioLength = [data length];
    // Copy original data in a byte array
    Byte *byteData = (Byte *)malloc(sizeof(Byte) * audioLength);
    memcpy(byteData, [data bytes], audioLength);
    // byte2short allocates the short buffer, including room for the delay tail
    short *shortData = [self byte2short:byteData size:(int)audioLength resultSize:(int)audioLength/2 + delay];
    // Array to store shorts
    short *newShortData = shortData;
    for (int i = 44; i < audioLength/2; i++)
    {
        newShortData[i + delay] += (short)((float)shortData[i] * attenuation);
    }
    Byte *newByteData = [self short2byte:newShortData size:(int)(audioLength/2 + delay) resultSize:(int)(audioLength + delay*2)];
    // Copy bytes to a NSMutableData in order to create new file
    NSMutableData *newAudioData = [NSMutableData dataWithBytes:newByteData length:(int)(audioLength + delay*2)];
    // dataWithBytes: copies, so the intermediate buffers can be freed
    free(byteData);
    free(shortData);
    free(newByteData);
    return newAudioData;
}
Echo effect
- (NSURL *)echo:(NSURL *)input output:(NSURL *)output{
    NSMutableData *audioData = [NSMutableData dataWithContentsOfURL:input];
    // we call the effect method, which returns an NSMutableData, and create a new file
    [[self effect:audioData delay:6000 attenuation:0.5f] writeToFile:[output path] atomically:YES];
    // We set the file's size (a method I have implemented)
    [[AudioUtils alloc] setAudioFileSize:output];
    return output;
}
There's no predefined function that will create a short array from a byte array, but it should be fairly simple to do it with a for loop
// create a short array
short *shortData = malloc(sizeof(short) * audioLength);
for (int i = 0; i < audioLength; i++)
{
    shortData[i] = byteData[i];
}
The code is not rigorously correct (meaning I didn't compile it, just wrote it here on the fly), but it should give you an idea on how to do it.
Also be aware that saving audio data with two bytes instead of one can give very different results when playing back, but I'll assume you know how to handle audio data for your specific purposes.

CGImageCreateWithPNGDataProvider giving blank image

Please check this code and help me out...
CGImageRef cRef = CGImageRetain(im.CGImage);
NSData *pixelData = (NSData *)CGDataProviderCopyData(CGImageGetDataProvider(cRef));
// return pointer to data
unsigned char *pixelBytes = (unsigned char *)[pixelData bytes];
// step through char data
for (int k = 0; k < [pixelData length]; k += 4) {
    // change accordingly
    pixelBytes[k] = pixelBytes[k];
    pixelBytes[k+1] = pixelBytes[k+1];
    pixelBytes[k+2] = pixelBytes[k+2];
    pixelBytes[k+3] = 255;
}
NSData *newPixelData = [NSData dataWithBytes:pixelBytes length:[pixelData length]];
CFDataRef imgData = (CFDataRef)pixelData;
CGDataProviderRef imgDataProvider = CGDataProviderCreateWithCFData(imgData);
CGImageRef throughCGImage = CGImageCreateWithPNGDataProvider(imgDataProvider, NULL, YES, kCGRenderingIntentDefault);
UIImage *newImage = [UIImage imageWithCGImage:throughCGImage];
NSLog(@"newImage: %@", newImage);
Some data is coming, but it is not getting added to the UIImageView; it is showing blank.
Problem 0)
You're mutating immutable data, which should be undefined behavior.
Problem 1)
Anyways, a PNG is a real file format, not a bitmap blob with a fixed sample format. PNG can represent images in many ways - your program just overwrites good data.
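To illustrate the distinction (a sketch of my own, assuming the bytes are raw RGBA8888 as the loop above implies - the bitmap info may need adjusting to match the source image), raw pixels should go through CGImageCreate, while CGImageCreateWithPNGDataProvider expects PNG-encoded file bytes:
// Sketch: wrap the raw RGBA bytes directly instead of treating them as a PNG file
CGDataProviderRef rawProvider = CGDataProviderCreateWithCFData((__bridge CFDataRef)newPixelData);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGImageRef rawImage = CGImageCreate(CGImageGetWidth(cRef), CGImageGetHeight(cRef),
                                    8,                           // bits per component
                                    32,                          // bits per pixel
                                    CGImageGetBytesPerRow(cRef), // bytes per row
                                    colorSpace,
                                    kCGBitmapByteOrderDefault | kCGImageAlphaLast,
                                    rawProvider, NULL, YES, kCGRenderingIntentDefault);
UIImage *fixedImage = [UIImage imageWithCGImage:rawImage];
CGImageRelease(rawImage);
CGColorSpaceRelease(colorSpace);
CGDataProviderRelease(rawProvider);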
