How to convert NSData object to CMSampleBuffer - ios

I have two apps: one converts CMSampleBuffer video data to an NSData object and transports it over the network. The other app receives that data; how can I convert the NSData object back into a CMSampleBuffer?
Here is how I currently convert a CMSampleBuffer to an NSData object:
CMBlockBufferRef blockBufferRef = CMSampleBufferGetDataBuffer(sampleBuffer);
size_t length = CMBlockBufferGetDataLength(blockBufferRef);
Byte buffer[length]; // stack VLA; for large video frames malloc may be safer
CMBlockBufferCopyDataBytes(blockBufferRef, 0, length, buffer);
NSData *data = [NSData dataWithBytes:buffer length:length];
Or is there a better way to transport the video data?
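One possible way back is to rebuild a CMBlockBuffer from the received bytes and wrap it in a new CMSampleBuffer. Below is a minimal sketch, assuming an uncompressed frame whose width, height, and pixel format are sent alongside the data (the function name is illustrative, and the timing info is lost in transit, so it is omitted here and would also need to be transmitted):
#import <CoreMedia/CoreMedia.h>

CMSampleBufferRef sampleBufferFromData(NSData *data, int32_t width, int32_t height)
{
    // Copy the received bytes into a block buffer owned by Core Media.
    CMBlockBufferRef blockBuffer = NULL;
    OSStatus status = CMBlockBufferCreateWithMemoryBlock(kCFAllocatorDefault,
        NULL, data.length, kCFAllocatorDefault, NULL, 0, data.length,
        kCMBlockBufferAssureMemoryNowFlag, &blockBuffer);
    if (status != kCMBlockBufferNoErr) return NULL;
    CMBlockBufferReplaceDataBytes(data.bytes, blockBuffer, 0, data.length);

    // Rebuild a format description; kCMVideoCodecType_422YpCbCr8 ('2vuy') is
    // only an example pixel format — use whatever your capture session produced.
    CMVideoFormatDescriptionRef format = NULL;
    CMVideoFormatDescriptionCreate(kCFAllocatorDefault, kCMVideoCodecType_422YpCbCr8,
        width, height, NULL, &format);

    // Wrap the block buffer in a sample buffer with no timing information.
    CMSampleBufferRef sampleBuffer = NULL;
    size_t sampleSize = data.length;
    CMSampleBufferCreate(kCFAllocatorDefault, blockBuffer, true, NULL, NULL,
        format, 1, 0, NULL, 1, &sampleSize, &sampleBuffer);
    CFRelease(blockBuffer);
    CFRelease(format);
    return sampleBuffer;
}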

Related

iOS: How to convert float array to short array?

How do I convert an audio float array to a short array?
Here is how to encode and decode a float with NSData:
encoding:
NSMutableData * data = [NSMutableData dataWithCapacity:0];
float z = ...;
[data appendBytes:&z length:sizeof(float)];
decoding:
NSData * data = ...; // loaded from bluetooth
float z;
[data getBytes:&z length:sizeof(float)];
A couple of things to note here:
You have to use NSMutableData if you are going to add things to the data object after creating it. The other option is to simply load the data all in one shot:
NSData * data = [NSData dataWithBytes:&z length:sizeof(float)];
The getBytes:length: method is for retrieving bytes from an NSData object, not for copying bytes into it.
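As for the conversion the title asks about, a common approach for audio samples in the [-1.0, 1.0] range is to scale by 32767 and clamp; a minimal sketch (the function name is illustrative):
// Scale-and-clamp conversion of float audio samples in [-1.0, 1.0] to int16_t.
void floatArrayToShortArray(const float *input, int16_t *output, size_t count)
{
    for (size_t i = 0; i < count; i++) {
        float scaled = input[i] * 32767.0f;
        if (scaled >  32767.0f) scaled =  32767.0f;  // clamp positive overflow
        if (scaled < -32768.0f) scaled = -32768.0f;  // clamp negative overflow
        output[i] = (int16_t)scaled;
    }
}
The resulting shorts can then be wrapped the same way as above, e.g. [NSData dataWithBytes:output length:count * sizeof(int16_t)].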

How to create a CMBlockBufferRef from NSData

I am struggling (getting memory errors, or apparently failing to deallocate memory correctly) to create a CMBlockBufferRef filled with data from an existing NSData (or NSMutableData).
I would be happy with a solution that copies data, but ideally I would be looking at a solution that would use the underlying NSData bytes and keep a strong reference to the NSData object until the CMBlockBuffer is deallocated.
For a read-only buffer that refers to the NSData's bytes (without copying, of course), I've found a way to achieve this.
static void releaseNSData(void *o, void *block, size_t size)
{
    // Balances the __bridge_retained cast below; ARC releases the NSData here.
    NSData *data = (__bridge_transfer NSData *)o;
    data = nil; // Assuming ARC is enabled
}

OSStatus createReadonlyBlockBuffer(CMBlockBufferRef *result, NSData *data)
{
    CMBlockBufferCustomBlockSource blockSource =
    {
        .version       = kCMBlockBufferCustomBlockSourceVersion,
        .AllocateBlock = NULL,
        .FreeBlock     = &releaseNSData,
        .refCon        = (__bridge_retained void *)data, // keeps the NSData alive
    };
    return CMBlockBufferCreateWithMemoryBlock(NULL, (uint8_t *)data.bytes, data.length, NULL, &blockSource, 0, data.length, 0, result);
}

NSImage and UIImage give different NSData representations

Scenario:
I have an image in the iPhone camera roll. I access it using ALAssetsLibrary and get an ALAsset object. I then get a UIImage and an NSData object from it using something like the following code.
ALAssetRepresentation *rep = [myasset defaultRepresentation];
CGImageRef iref = [rep fullResolutionImage];
if (iref)
{
    UIImage *largeimage = [UIImage imageWithCGImage:iref];
    NSData *data = UIImageJPEGRepresentation(largeimage, 1.0f);
}
I then copy the image from the camera roll to my Mac using Image Capture. In my Mac code I open the copied image with NSImage and try to get an NSData representation using the following code.
NSImage *image = [[NSImage alloc] initWithContentsOfURL:fileURL];
NSBitmapImageRep *imgRep = [[image representations] objectAtIndex:0];
NSData *data = [imgRep representationUsingType:NSJPEGFileType properties:nil];
Problem:
Unfortunately, the two NSData representations I get are very different. I want to get the same NSData representation in both cases (since it is the same file), so that I can hash the NSData objects and compare the hashes to conclude that the two are (possibly) the same image. Ideally I would want functions like the following:
//In iOS
-(NSData *) getDataFromALAsset:(ALAsset*)asset;
//or
-(NSData *) getDataFromUIImage:(UIImage*)image;
//In OS X
-(NSData *) getDataFromFileAtURL:(NSURL*)url;
//or
-(NSData *) getDataFromNSImage:(NSImage*)image;
such that the NSData representations I get in OS X and iOS are exactly the same, given that they come from the same source image.
What I have tried:
I have tried playing around with how I get the UIImage object from the ALAsset object, and I have tried UIImagePNGRepresentation (and the corresponding call for getting NSData in OS X). I have also tried different parameters for getting the representation in OS X, but nothing has come through. I have also tried creating a CGImageRef on both platforms, converting to bitmaps, and reading them pixel by pixel, and even those seem to be off (and yes, I do realise that NSBitmapImageRep has a different co-ordinate system).
I did eventually find a way to do what I wanted. The ALAssetRepresentation class's getBytes:fromOffset:length:error: method can be used to get an NSData object that is the same as [NSData dataWithContentsOfURL:fileURL] in OS X. Note that doing it from the UIImage is not possible, since UIImage performs some processing on the image. Here is what the requested functions look like.
//In iOS
-(NSData *) getDataFromALAsset:(ALAsset*)asset {
    ALAssetRepresentation *rep = [asset defaultRepresentation];
    Byte *buffer = (Byte *)malloc((size_t)rep.size);
    // fromOffset takes an integer offset; 0 reads from the start of the file
    NSUInteger buffered = [rep getBytes:buffer fromOffset:0 length:(NSUInteger)rep.size error:nil];
    NSData *assetData = [NSData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES];
    return assetData;
}
//In OS X
-(NSData *) getDataFromFileAtURL:(NSURL*)url
{
    return [NSData dataWithContentsOfURL:url];
}

Converting a series of bits into NSData object

How do I convert a series of 32 bits (representing 4 bytes), stored in an NSString, into a 4-byte NSData object in Objective-C?
For example, how can I convert the following string:
NSString *bitSeries = @"00000000000000000000000111101100";
into an NSData object with length precisely 4?
You can use strtoul() with base 2 to convert the string to an unsigned integer:
NSString *bitSeries = @"00000000000000000000000111101100";
uint32_t value = strtoul([bitSeries UTF8String], NULL, 2);
and then create an NSData object:
NSData *data = [NSData dataWithBytes:&value length:sizeof(value)];
NSLog(@"%@", data);
// Output: <ec010000>
Or, if you prefer big-endian byte order:
value = OSSwapHostToBigInt32(value);
NSData *data = [NSData dataWithBytes:&value length:sizeof(value)];
NSLog(@"%@", data);
// Output: <000001ec>
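If the bit string can be longer than 32 bits, one alternative is to assemble the bytes directly, eight bits at a time. A sketch, assuming the length is a multiple of 8 with the most significant bit first (the function name is illustrative):
// Converts a binary-digit string into NSData, one byte per 8 characters.
NSData *dataFromBitString(NSString *bits)
{
    NSMutableData *data = [NSMutableData dataWithCapacity:bits.length / 8];
    for (NSUInteger i = 0; i + 8 <= bits.length; i += 8) {
        NSString *byteBits = [bits substringWithRange:NSMakeRange(i, 8)];
        uint8_t byte = (uint8_t)strtoul([byteBits UTF8String], NULL, 2);
        [data appendBytes:&byte length:1];
    }
    return data;
}
For the example string this yields <000001ec>, matching the big-endian output above.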

How to update the EXIF of an ALAsset without changing the image?

I use setImageData:metadata:completionBlock: of ALAsset to update the EXIF (metadata) of an asset.
I just want to update the metadata, but this method requires image data as the first parameter. I used the code below to generate that data, but it modified my image (I checked the file size and file hash).
ALAssetRepresentation *dr = asset.defaultRepresentation;
UIImage *image = [UIImage imageWithCGImage:dr.fullResolutionImage scale:dr.scale orientation:(UIImageOrientation)dr.orientation];
NSData *data = UIImageJPEGRepresentation(image, 1);
Is there any other method I could use to update just the EXIF of an ALAsset? Or any way to generate the right image data for setImageData:metadata:completionBlock:?
I found a way to generate the image data. Code below:
Byte *buffer = (Byte *)malloc((size_t)dr.size);
NSUInteger k = [dr getBytes:buffer fromOffset:0 length:(NSUInteger)dr.size error:nil];
NSData *data = [NSData dataWithBytesNoCopy:buffer length:k freeWhenDone:YES];
With this data, setImageData:metadata:completionBlock: updates only the EXIF of the ALAsset, since the image bytes are passed through unchanged.
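Putting it together, the update call might look like this (newMetadata is a placeholder for the EXIF dictionary you want to write, and the asset must be editable, i.e. created by your app):
if (asset.editable) {
    [asset setImageData:data metadata:newMetadata completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error)
            NSLog(@"EXIF update failed: %@", error); // image bytes stay identical; only metadata changes
    }];
}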
