I'm building an app that allows users to select photos and videos from their device and upload them to the server. I'm trying to get the file size (in bytes) of each selected item. Can anyone help me out?
if ([[dict objectForKey:UIImagePickerControllerMediaType] isEqualToString:ALAssetTypePhoto]) { // image file
    if ([dict objectForKey:UIImagePickerControllerOriginalImage]) {
        NSURL *urlPath = [dict objectForKey:UIImagePickerControllerReferenceURL];
        item = [BundleItem itemWithPath:urlPath AndDescription:nil];
        item.itemImage = [dict objectForKey:UIImagePickerControllerOriginalImage];
        item.itemType = 1; // image
        item.itemSize = // what do I need here??
        [m_items addObject:item];
    }
} else if ([[dict objectForKey:UIImagePickerControllerMediaType] isEqualToString:ALAssetTypeVideo]) { // video file
    if ([dict objectForKey:UIImagePickerControllerOriginalImage]) {
        NSURL *urlPath = [dict objectForKey:UIImagePickerControllerReferenceURL];
        item = [BundleItem itemWithPath:urlPath AndDescription:nil];
        item.itemImage = [dict objectForKey:UIImagePickerControllerOriginalImage];
        item.itemType = 2; // video
        item.itemSize = // what do I need here??
        [m_items addObject:item];
    }
}
EDIT
Getting NSCocoaErrorDomain error 256 with videos:
NSURL *urlPath = [dict objectForKey:UIImagePickerControllerReferenceURL];
item = [BundleItem itemWithPath:urlPath AndDescription:nil];
item.itemImage = [dict objectForKey:UIImagePickerControllerOriginalImage];
item.itemType = 2; // video

// Error container
NSError *attributesError;
NSDictionary *fileAttributes = [[NSFileManager defaultManager] attributesOfItemAtPath:[urlPath path] error:&attributesError];
NSNumber *fileSizeNumber = [fileAttributes objectForKey:NSFileSize];
long fileSize = [fileSizeNumber longValue];
item.itemSize = fileSize;
[m_items addObject:item];
For image-only selection:
item.itemImage = (UIImage *)[info valueForKey:UIImagePickerControllerOriginalImage];
NSData *imgData = UIImageJPEGRepresentation(item.itemImage, 1); // 1 is the maximum JPEG quality
NSLog(@"Size of image (bytes): %lu", (unsigned long)[imgData length]);
Hope this will help you.
The method below is generalized; it will work for both images and videos. Something like this should take care of finding the file size of a selected image or video returned from the UIImagePickerController:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSURL *videoUrl = (NSURL *)[info objectForKey:UIImagePickerControllerMediaURL];

    // Error container
    NSError *attributesError;
    NSDictionary *fileAttributes = [[NSFileManager defaultManager] attributesOfItemAtPath:[videoUrl path] error:&attributesError];
    NSNumber *fileSizeNumber = [fileAttributes objectForKey:NSFileSize];
    long long fileSize = [fileSizeNumber longLongValue];
}
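One caveat: UIImagePickerControllerMediaURL is generally populated only for movies; for a still photo it is nil, and the assets-library:// reference URL is not a filesystem path, which is why NSFileManager fails on it with NSCocoaErrorDomain code 256. For photos, a sketch along these lines reads the asset's actual byte size through the (pre-iOS 8) AssetsLibrary framework; treat it as an illustration rather than drop-in code:

NSURL *referenceURL = [info objectForKey:UIImagePickerControllerReferenceURL];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:referenceURL resultBlock:^(ALAsset *asset) {
    // size of the original asset data, in bytes
    long long sizeInBytes = [[asset defaultRepresentation] size];
    NSLog(@"asset size (bytes): %lld", sizeInBytes);
} failureBlock:^(NSError *error) {
    NSLog(@"could not resolve asset: %@", error);
}];

Note that assetForURL runs asynchronously, so assign item.itemSize inside the result block.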
Related
I have created a custom camera using AVFoundation, now after capturing images, I need to save them in the iPhone's gallery.
I tried saving images with UIImageWriteToSavedPhotosAlbum but found that this does not save EXIF information.
To save EXIF information with an image, refer to the code below.
- (void)saveImageDataToPhotoAlbum:(NSData *)originalData
{
    NSDictionary *dataDic = [self getDataAndMetadata:originalData];
    ALAssetsLibrary *assetsLib = [[ALAssetsLibrary alloc] init];
    [assetsLib writeImageDataToSavedPhotosAlbum:dataDic[@"data"]
                                       metadata:dataDic[@"metadata"]
                                completionBlock:^(NSURL *url, NSError *e) {
                                    [self addToMyAlbum:url];
                                }];
}
- (NSDictionary *)getDataAndMetadata:(NSData *)originalData
{
    CGImageSourceRef cimage = CGImageSourceCreateWithData((CFDataRef)originalData, nil);
    NSDictionary *metadata = (NSDictionary *)CGImageSourceCopyPropertiesAtIndex(cimage, 0, nil);
    NSMutableDictionary *metadataAsMutable = [NSMutableDictionary dictionaryWithDictionary:metadata];
    metadataAsMutable[(NSString *)kCGImagePropertyGPSDictionary] = self.myGpsDic;

    NSMutableData *destinationData = [NSMutableData data];
    CGImageDestinationRef dest =
        CGImageDestinationCreateWithData((CFMutableDataRef)destinationData, CGImageSourceGetType(cimage), 1, nil);
    CGImageDestinationAddImageFromSource(dest, cimage, 0, (CFDictionaryRef)metadataAsMutable);
    CGImageDestinationFinalize(dest);

    CFRelease(dest);
    [metadata release];
    CFRelease(cimage);

    return @{ @"data" : destinationData, @"metadata" : metadataAsMutable };
}
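For context, a hedged sketch of how this save method might be fed from an AVFoundation capture; stillImageOutput and videoConnection are assumed names for your session's AVCaptureStillImageOutput and its video connection:

[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                               completionHandler:^(CMSampleBufferRef sampleBuffer, NSError *error) {
    if (!error) {
        // JPEG data that still carries the capture's EXIF metadata
        NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:sampleBuffer];
        [self saveImageDataToPhotoAlbum:jpegData];
    }
}];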
In my app, I want to retrieve photos and videos from the Photo Library, and then save them into my app's documents directory.
Following is my code:
- (UIImage *)getImageFromAsset:(ALAsset *)asset type:(NSInteger)nType
{
    ALAssetRepresentation *assetRepresentation = [asset defaultRepresentation];
    CGImageRef imageReference = [assetRepresentation fullResolutionImage];
    CGFloat imageScale = [assetRepresentation scale];
    UIImageOrientation imageOrientation = (UIImageOrientation)[assetRepresentation orientation];
    UIImage *iImage = [[UIImage alloc] initWithCGImage:imageReference scale:imageScale orientation:imageOrientation];
    return iImage;
}

- (UIImage *)getImageAtIndex:(NSInteger)nIndex type:(NSInteger)nType
{
    return [self getImageFromAsset:(ALAsset *)_assetPhotos[nIndex] type:nType];
}
......
for (NSIndexPath *index in _dSelected) {
    DLog(@"the selected index is %@", index);
    image = nil;
    image = [ASSETHELPER getImageAtIndex:index.row type:ASSET_PHOTO_FULL_RESOLUTION];
    NSString *name = [ASSETHELPER getImageNameAtIndex:index.row];
    NSString *filepath = [files stringByAppendingPathComponent:name];
    NSString *aliapath = [alias stringByAppendingPathComponent:name];
    aliapath = [aliapath stringByAppendingString:THUMBNAIL];
    DLog(@"the files is %@ the alias is %@", filepath, aliapath);
    image = nil;
}
If I retrieve just 20 or 30 photos it is fine, but if I retrieve too many (maybe 50), the app terminates due to memory pressure. I set the image to nil after each image, so I thought iOS would reclaim the memory on each pass through the loop. Why does the memory pressure happen?
ARC still needs to manage your memory. As long as you are inside your for loop, ARC never gets the chance to release the images. You need to put the body of your loop inside an autorelease pool.
for (NSIndexPath *index in _dSelected) {
    @autoreleasepool {
        DLog(@"the selected index is %@", index);
        image = nil;
        image = [ASSETHELPER getImageAtIndex:index.row type:ASSET_PHOTO_FULL_RESOLUTION];
        NSString *name = [ASSETHELPER getImageNameAtIndex:index.row];
        NSString *filepath = [files stringByAppendingPathComponent:name];
        NSString *aliapath = [alias stringByAppendingPathComponent:name];
        aliapath = [aliapath stringByAppendingString:THUMBNAIL];
        DLog(@"the files is %@ the alias is %@", filepath, aliapath);
        image = nil;
    }
}
This will let your memory get freed as you deal with each image.
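The loop above builds filepath but never actually writes the image; a minimal sketch of the save step inside the pool, assuming a JPEG at an illustrative 0.8 quality, could be:

NSData *jpegData = UIImageJPEGRepresentation(image, 0.8); // 0.8 is an assumed quality
NSError *writeError = nil;
if (![jpegData writeToFile:filepath options:NSDataWritingAtomic error:&writeError]) {
    DLog(@"failed to write %@: %@", filepath, writeError);
}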
I have searched all over the web and cannot find a tutorial on how to use the SoundTouch library for beat detection.
(Note: I have no C++ experience prior to this. I do know C, Objective-C, and Java. So I could have messed some of this up, but it compiles.)
I added the framework to my project and managed to get the following to compile:
NSString *path = [[NSBundle mainBundle] pathForResource:@"song" ofType:@"wav"];
NSData *data = [NSData dataWithContentsOfFile:path];
player = [[AVAudioPlayer alloc] initWithData:data error:NULL];
player.volume = 1.0;
player.delegate = self;
[player prepareToPlay];
[player play];

NSUInteger len = [player.data length]; // Get the length of the data
soundtouch::SAMPLETYPE sampleBuffer[len]; // Create buffer array
[player.data getBytes:sampleBuffer length:len]; // Copy the bytes into the buffer
soundtouch::BPMDetect *BPM = new soundtouch::BPMDetect(player.numberOfChannels, [[player.settings valueForKey:@"AVSampleRateKey"] longValue]); // This is working (tested)
BPM->inputSamples(sampleBuffer, len); // Send the samples to the BPM class
NSLog(@"Beats Per Minute = %f", BPM->getBpm()); // Print out the BPM - currently returns 0.00 for errors per documentation
The inputSamples(*samples, numSamples) arguments confuse me.
How do I get these pieces of information from a song file?
I tried using memcpy() but it doesn't seem to be working.
Anyone have any thoughts?
After hours and hours of debugging and reading the limited documentation on the web, I modified a few things before stumbling upon this: You need to divide numSamples by numberOfChannels in the inputSamples() function.
My final code is like so:
NSString *path = [[NSBundle mainBundle] pathForResource:@"song" ofType:@"wav"];
NSData *data = [NSData dataWithContentsOfFile:path];
player = [[AVAudioPlayer alloc] initWithData:data error:NULL];
player.volume = 1.0; // optional to play music
player.delegate = self;
[player prepareToPlay]; // optional to play music
[player play]; // optional to play music

NSUInteger len = [player.data length];
soundtouch::SAMPLETYPE sampleBuffer[len];
[player.data getBytes:sampleBuffer length:len];
soundtouch::BPMDetect BPM(player.numberOfChannels, [[player.settings valueForKey:@"AVSampleRateKey"] longValue]);
BPM.inputSamples(sampleBuffer, len / player.numberOfChannels);
NSLog(@"Beats Per Minute = %f", BPM.getBpm());
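One caveat worth flagging: player.data is the whole file, so for a WAV it begins with the RIFF/fmt header rather than audio samples. A canonical PCM WAV header is 44 bytes, though real files can carry extra chunks, so treat the offset in this sketch as an assumption:

NSUInteger headerSize = 44; // canonical PCM WAV header; verify against your files
[player.data getBytes:sampleBuffer range:NSMakeRange(headerSize, len - headerSize)];
BPM.inputSamples(sampleBuffer, (len - headerSize) / player.numberOfChannels);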
I've tried this solution to read the BPM from mp3 files (using the TSLibraryImport class to convert to wav) inside the iOS Music Library:
MPMediaItem *item = [collection representativeItem];
NSURL *urlStr = [item valueForProperty:MPMediaItemPropertyAssetURL];
TSLibraryImport *import = [[TSLibraryImport alloc] init];
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSURL *destinationURL = [NSURL fileURLWithPath:[documentsDirectory stringByAppendingPathComponent:@"temp_data"]];
[[NSFileManager defaultManager] removeItemAtURL:destinationURL error:nil];

[import importAsset:urlStr toURL:destinationURL completionBlock:^(TSLibraryImport *import) {
    NSString *outPath = [documentsDirectory stringByAppendingPathComponent:@"temp_data"];
    NSData *data = [NSData dataWithContentsOfFile:outPath];
    AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithData:data error:NULL];

    NSUInteger len = [player.data length];
    int numChannels = player.numberOfChannels;
    soundtouch::SAMPLETYPE sampleBuffer[1024];
    soundtouch::BPMDetect *BPM = new soundtouch::BPMDetect(player.numberOfChannels, [[player.settings valueForKey:@"AVSampleRateKey"] longValue]);

    for (NSUInteger i = 0; i <= len - 1024; i = i + 1024) {
        NSRange r = NSMakeRange(i, 1024);
        //NSData *temp = [player.data subdataWithRange:r];
        [player.data getBytes:sampleBuffer range:r];
        int samples = sizeof(sampleBuffer) / numChannels;
        BPM->inputSamples(sampleBuffer, samples); // Send the samples to the BPM class
    }
    NSLog(@"Beats Per Minute = %f", BPM->getBpm());
}];
The strange thing is that the calculated BPM is always the same value:
2013-10-02 03:05:36.725 AppTestAudio[1464:1803] Beats Per Minute = 117.453835
No matter which track it was, regardless of the number of frames or the buffer size (here I used a 2K buffer, as in the SoundTouch example in the library's source code).
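One possible culprit: SoundTouch defines SAMPLETYPE as float unless it was built with SOUNDTOUCH_INTEGER_SAMPLES. If your copy uses float samples, copying raw 16-bit PCM bytes straight into the buffer misinterprets every sample, which could plausibly produce a constant, meaningless BPM. A sketch of converting 16-bit PCM to floats first (the 1024-sample block and the 44-byte header skip are assumptions):

enum { kBlockSamples = 1024 };
SInt16 pcm[kBlockSamples];                    // raw 16-bit samples from the file
soundtouch::SAMPLETYPE floats[kBlockSamples]; // what a float build of SoundTouch expects
NSUInteger byteOffset = 44;                   // assumed canonical PCM WAV header size
while (byteOffset + sizeof(pcm) <= len) {
    [player.data getBytes:pcm range:NSMakeRange(byteOffset, sizeof(pcm))];
    for (int i = 0; i < kBlockSamples; i++) {
        floats[i] = pcm[i] / 32768.0f;        // scale to [-1, 1)
    }
    BPM->inputSamples(floats, kBlockSamples / numChannels);
    byteOffset += sizeof(pcm);
}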
For Swift 3:
https://github.com/Luccifer/BPM-Analyser
And use it like:
guard let filePath = Bundle.main.path(forResource: "TestMusic", ofType: "m4a"),
let url = URL(string: filePath) else {return "error occured, check fileURL"}
BPMAnalyzer.core.getBpmFrom(url, completion: nil)
Feel free to comment!
In order to prevent lagging in my app, I'm trying to compress images larger than 1 MB (mostly pics taken with the iPhone's normal camera).
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
NSData *imageSize = UIImageJPEGRepresentation(image, 1);
NSLog(@"original size %u", [imageSize length]);

UIImage *image2 = [UIImage imageWithData:UIImageJPEGRepresentation(image, 0)];
NSData *newImageSize = UIImageJPEGRepresentation(image2, 1);
NSLog(@"new size %u", [newImageSize length]);

UIImage *image3 = [UIImage imageWithData:UIImageJPEGRepresentation(image2, 0)];
NSData *newImageSize2 = UIImageJPEGRepresentation(image3, 1);
NSLog(@"new size %u", [newImageSize2 length]);

picView = [[UIImageView alloc] initWithImage:image3];
However, the NSLog I get outputs something along the lines of
original size 3649058
new size 1835251
new size 1834884
The difference between the 1st and 2nd compression is almost negligible. My goal is to get the image size below 1 MB. Did I overlook something/is there an alternative approach to achieve this?
EDIT: I want to avoid scaling the image's height and width, if possible.
A couple of thoughts:
The UIImageJPEGRepresentation function does not return the "original" image. For example, if you employ a compressionQuality of 1.0, it does not, technically, return the "original" image, but rather it returns a JPEG rendition of the image with compressionQuality at its maximum value. This can actually yield an object that is larger than the original asset (at least if the original image is a JPEG). You're also discarding all of the metadata (information about where the image was taken, the camera settings, etc.) in the process.
If you want the original asset, you should use PHImageManager:
NSURL *url = [info objectForKey:UIImagePickerControllerReferenceURL];
PHFetchResult *result = [PHAsset fetchAssetsWithALAssetURLs:@[url] options:nil];
PHAsset *asset = [result firstObject];
PHImageManager *manager = [PHImageManager defaultManager];
[manager requestImageDataForAsset:asset options:nil resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    NSString *filename = [(NSURL *)info[@"PHImageFileURLKey"] lastPathComponent];
    // do what you want with the `imageData`
}];
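Note that requestImageDataForAsset:options:resultHandler: may also call back asynchronously; if you need it to behave synchronously, pass a PHImageRequestOptions with its synchronous property set to YES.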
In iOS versions prior to 8, you'd have to use assetForURL of the ALAssetsLibrary class:
NSURL *url = [info objectForKey:UIImagePickerControllerReferenceURL];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:url resultBlock:^(ALAsset *asset) {
    ALAssetRepresentation *representation = [asset defaultRepresentation];
    NSLog(@"size of original asset %llu", [representation size]);

    // I generally would write directly to a `NSOutputStream`, but if you want it in a
    // NSData, it would be something like:
    NSMutableData *data = [NSMutableData data];

    // now loop, reading data into buffer and writing that to our data stream
    NSError *error;
    long long bufferOffset = 0ll;
    NSInteger bufferSize = 10000;
    long long bytesRemaining = [representation size];
    uint8_t buffer[bufferSize];
    NSUInteger bytesRead;
    while (bytesRemaining > 0) {
        bytesRead = [representation getBytes:buffer fromOffset:bufferOffset length:bufferSize error:&error];
        if (bytesRead == 0) {
            NSLog(@"error reading asset representation: %@", error);
            return;
        }
        bytesRemaining -= bytesRead;
        bufferOffset += bytesRead;
        [data appendBytes:buffer length:bytesRead];
    }

    // ok, successfully read original asset;
    // do whatever you want with it here
} failureBlock:^(NSError *error) {
    NSLog(@"error=%@", error);
}];
Please note that this assetForURL runs asynchronously.
If you want a NSData with compression, you can use UIImageJPEGRepresentation with a compressionQuality less than 1.0. Your code actually does this with a compressionQuality of 0.0, which should offer maximum compression. But you don't save that NSData; instead you use it to create a UIImage, and then you get a new UIImageJPEGRepresentation with a compressionQuality of 1.0, thus losing much of the compression you originally achieved.
Consider the following code:
// a UIImage of the original asset (discarding meta data)
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];

// this may well be larger than the original asset
NSData *jpgDataHighestCompressionQuality = UIImageJPEGRepresentation(image, 1.0);
[jpgDataHighestCompressionQuality writeToFile:[docsPath stringByAppendingPathComponent:@"imageDataFromJpeg.jpg"] atomically:YES];
NSLog(@"compressionQuality = 1.0; length = %u", [jpgDataHighestCompressionQuality length]);

// this will be smaller, but with some loss of data
NSData *jpgDataLowestCompressionQuality = UIImageJPEGRepresentation(image, 0.0);
NSLog(@"compressionQuality = 0.0; length = %u", [jpgDataLowestCompressionQuality length]);

UIImage *image2 = [UIImage imageWithData:jpgDataLowestCompressionQuality];

// ironically, this will be larger than jpgDataLowestCompressionQuality
NSData *newImageSize = UIImageJPEGRepresentation(image2, 1.0);
NSLog(@"new size %u", [newImageSize length]);
In addition to the JPEG compression quality outlined in the prior point, you could also just resize the image, and you can marry that with a lower compressionQuality as well.
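A common resize approach, sketched here with assumed values (half the dimensions, then a 0.7-quality JPEG); combine the two to get under a 1 MB budget:

- (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize
{
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 1.0); // scale 1.0: newSize is in pixels
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return scaled;
}

// usage: both the scale factor and the quality are illustrative
UIImage *smaller = [self imageWithImage:image scaledToSize:CGSizeMake(image.size.width / 2, image.size.height / 2)];
NSData *jpgData = UIImageJPEGRepresentation(smaller, 0.7);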
You cannot keep compressing an image over and over; if that worked, you could compress anything down to nothing. One way to make your image smaller is to change its size, for example from 640x960 to 320x480, but you will lose quality. I would first apply UIImageJPEGRepresentation(image, 0.75) and then change the size, perhaps to two-thirds or half of the image's width and height.
NSString *imgvalue = [[NSString alloc] initWithString:item.imgItem];
printf("\n img1 value is %s", [imgvalue UTF8String]);
cell.imageView.image = [UIImage imageNamed:@"unknown.jpg"];
if (imgvalue != nil)
{
    NSData *imageData;
    @try
    {
        printf("\n image value in image data is %s", [imgvalue UTF8String]);
        imageData = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:imgvalue]];
        printf("\n imageData length is %d", [imageData length]);
    }
    @catch (NSException *e)
    {
        //printf("\n exception message is %s", [[e description] UTF8String]);
    }
    @finally
    {
        UIImage *imageFromImageData = [[UIImage alloc] initWithData:imageData];
        //[image setImage:imageFromImageData];
        cell.imageView.image = imageFromImageData;
        [imageData release];
        [imageFromImageData release];
    }
}
After getting imgvalue I copied that URL, and when I checked it in the browser it showed me the image, but it doesn't get stored into the NSData. Please help me.
I checked your code and everything seems to be working.
Please check imageData with
NSLog(@"%u", [imageData length]);
Change this line
NSString *imgvalue=[[NSString alloc]initWithString:item.imgItem];
to
NSString *imgvalue = @"http://animals.catchsmile.com/cat-3/";
Using this I got data in the log. I used a static string for testing; you can use the item.imgItem value as well.
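As an aside, initWithContentsOfURL: is a synchronous network call and blocks whichever thread runs it, typically the main thread in a table view cell. A hedged sketch of moving the fetch onto a background queue with GCD (cell-reuse handling omitted for brevity):

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    NSData *imageData = [NSData dataWithContentsOfURL:[NSURL URLWithString:imgvalue]];
    dispatch_async(dispatch_get_main_queue(), ^{
        if (imageData) {
            cell.imageView.image = [UIImage imageWithData:imageData];
        }
    });
});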