PHAsset get original file name - iOS

I wonder if there is any way to get the original file name using PHAsset?
I use the following code to extract the file info:
[[PHImageManager defaultManager] requestImageDataForAsset:asset options:requestOption resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    entity.fileUrl = [info objectForKey:@"PHImageFileURLKey"];
    entity.filename = [[NSFileManager defaultManager] displayNameAtPath:[entity.fileUrl path]];
}];
However, it doesn't return the original name but a name in the format "img_123".
I've just checked the official Apple docs: a new class, PHAssetResource, has been introduced with the property originalFilename, which is available on iOS 9+. The problem is that I use the image picker library CTAssetsPickerController, which is based on the Photos framework; it returns the picked image as a PHAsset object. PS: I'm looking for a solution that is compatible with iOS 8 :).
Thank you!

On iOS 8 your solution is the right (and only) approach to get a filename at all.
On iOS 9 this works:
NSArray *resources = [PHAssetResource assetResourcesForAsset:asset];
NSString *orgFilename = ((PHAssetResource*)resources[0]).originalFilename;
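If you need to cover both OS versions from Swift, a rough sketch looks like this; the iOS 8 branch relies on the undocumented "PHImageFileURLKey" info key from the question, so treat it as best effort rather than a guaranteed API:
import Photos

func fetchFilename(for asset: PHAsset, completion: @escaping (String?) -> Void) {
    if #available(iOS 9.0, *) {
        // Documented path: ask the asset's resources for the original file name.
        let resources = PHAssetResource.assetResources(for: asset)
        completion(resources.first?.originalFilename)
    } else {
        // iOS 8 fallback: read the file URL out of the request info dictionary (undocumented key).
        PHImageManager.default().requestImageData(for: asset, options: nil) { _, _, _, info in
            let url = info?["PHImageFileURLKey"] as? URL
            completion(url?.lastPathComponent)
        }
    }
}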

A short way to get the file name with one line of code: PHAsset exposes a filename property via key-value coding.
NSString *fileName = [asset valueForKey:@"filename"];
NSLog(@"File name %@", fileName);
That's it.
Note: The accepted answer takes a lot of time to load a PHAsset, but it works.
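The same trick in Swift looks like this (a sketch; "filename" is not a documented key on PHAsset, so it could break in a future release):
// Relies on an undocumented KVC key on PHAsset – works in practice, not guaranteed.
let filename = asset.value(forKey: "filename") as? String ?? "unknown"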

I had to modify my code because it started returning nonsense names. My solution was to pick the resource based on the asset's mediaType and the resource's type, but maybe there is something easier:
extension PHAsset {
    var primaryResource: PHAssetResource? {
        let types: Set<PHAssetResourceType>
        switch mediaType {
        case .video:
            types = [.video, .fullSizeVideo]
        case .image:
            types = [.photo, .fullSizePhoto]
        case .audio:
            types = [.audio]
        case .unknown:
            types = []
        @unknown default:
            types = []
        }
        let resources = PHAssetResource.assetResources(for: self)
        let resource = resources.first { types.contains($0.type) }
        return resource ?? resources.first
    }

    var originalFilename: String {
        guard let result = primaryResource else {
            return "file"
        }
        return result.originalFilename
    }
}
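With this extension in place, usage is a one-liner:
let name = asset.originalFilename   // falls back to "file" if no resource is found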

Maybe you can use this method; it works on iOS 8 and above:
[asset requestContentEditingInputWithOptions:options completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
    CIImage *fullImage = [CIImage imageWithContentsOfURL:contentEditingInput.fullSizeImageURL];
    NSLog(@"%@", contentEditingInput.fullSizeImageURL);   // get url
    NSLog(@"%@", fullImage.properties.description);       // get {TIFF}, {Exif}
}];
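A rough Swift sketch of the same idea is below; it just reads the last path component of fullSizeImageURL, which is usually the on-disk file name rather than necessarily the original user-visible name:
let options = PHContentEditingInputRequestOptions()
asset.requestContentEditingInput(with: options) { contentEditingInput, _ in
    // lastPathComponent of the full-size image URL gives the on-disk file name
    let filename = contentEditingInput?.fullSizeImageURL?.lastPathComponent
    print(filename ?? "unknown")
}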

@holtmann's solution written in Swift:
let resource = PHAssetResource.assetResources(for: asset)
let filename = resource.first?.originalFilename ?? "unknown"

Related

How to read file as String in Flutter that has been written to filesystem using NSKeyedArchiver in iOS?

After updating an iOS native app with an app written in Flutter, I want to read a file from the filesystem on an iOS device using Dart. The file I want to read has been previously written to the filesystem using this Objective-C code:
- (void)setAccount:(FTAccountModel *)account {
    _account = account;
    NSString *path = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
    path = [path stringByAppendingPathComponent:AccountModelPath];
    if (account) {
        NSArray *array = @[account];
        [array writeToFile:path atomically:YES];
        [NSKeyedArchiver archiveRootObject:array toFile:path];
    }
}
I've tried the following approach in Flutter using the path_provider package:
final appDocDir = await getApplicationDocumentsDirectory();
final accountDataFile = File('${appDocDir.path}/$_iosAccountDataFile');
String contents = await accountDataFile.readAsString();
print("contents: $contents");
But I get an error when calling the readAsString() method:
FileSystemException: Failed to decode data using encoding 'utf-8', path = '/var/mobile/Containers/Data/Application/FBCB4862-E5EA-4C93-8C2E-3DF1F00A8645/Documents/AccountModel.data'
How to read file on iOS device using Dart and Flutter, that has been written using NSKeyedArchiver?
As of writing this answer, there are no plugins to read a file that has been previously written to the filesystem using NSKeyedArchiver on iOS. The way to read the file is to write custom platform-specific code.
So the iOS platform code, in Swift, will be something like the following:
private func callGetUserIdFromNativeApp(result: @escaping FlutterResult) {
    var account: FTAccountModel?
    let fm = FileManager.default
    let urls = fm.urls(for: .documentDirectory, in: .userDomainMask)
    if (!urls.isEmpty) {
        let file = urls[0].appendingPathComponent("Accounts.data", isDirectory: false)
        if (fm.fileExists(atPath: file.path)) {
            if let accountArray: [FTAccountModel] = NSKeyedUnarchiver.unarchiveObject(withFile: file.path) as? [FTAccountModel] {
                if (!accountArray.isEmpty) {
                    account = accountArray[0]
                }
            }
        }
    }
    if let userId: Int = account?.userId {
        result(String(userId))
    } else {
        result(nil)
    }
}
And the Flutter part will use a MethodChannel to invoke the native code:
static const MethodChannel _channel = const MethodChannel("CHANNEL_NAME");

static Future<String> getUserIdFromNativeIos() async {
  try {
    return await _channel.invokeMethod("METHOD_NAME");
  } catch (e) {
    return _failedString();
  }
}
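For completeness, here is a sketch of how the channel might be wired up on the iOS side (roughly in the AppDelegate); "CHANNEL_NAME" and "METHOD_NAME" are the same placeholders as above, and callGetUserIdFromNativeApp is the function shown earlier:
let controller = window?.rootViewController as! FlutterViewController
let channel = FlutterMethodChannel(name: "CHANNEL_NAME",
                                   binaryMessenger: controller.binaryMessenger)
channel.setMethodCallHandler { [weak self] call, result in
    if call.method == "METHOD_NAME" {
        // Forward the call to the native lookup shown above.
        self?.callGetUserIdFromNativeApp(result: result)
    } else {
        result(FlutterMethodNotImplemented)
    }
}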

How to read QR code from static image

I know that you can use AVFoundation to scan a QR code using the device's camera. Now here comes the problem: how can I do this from a static UIImage object?
Swift 4 version of @Neimsz's answer:
func detectQRCode(_ image: UIImage?) -> [CIFeature]? {
    if let image = image, let ciImage = CIImage.init(image: image) {
        var options: [String: Any]
        let context = CIContext()
        options = [CIDetectorAccuracy: CIDetectorAccuracyHigh]
        let qrDetector = CIDetector(ofType: CIDetectorTypeQRCode, context: context, options: options)
        if ciImage.properties.keys.contains((kCGImagePropertyOrientation as String)) {
            options = [CIDetectorImageOrientation: ciImage.properties[(kCGImagePropertyOrientation as String)] ?? 1]
        } else {
            options = [CIDetectorImageOrientation: 1]
        }
        let features = qrDetector?.features(in: ciImage, options: options)
        return features
    }
    return nil
}
How to use
if let features = detectQRCode(#imageLiteral(resourceName: "qrcode")), !features.isEmpty {
    for case let row as CIQRCodeFeature in features {
        print(row.messageString ?? "nope")
    }
}
And during execution this doesn't produce the "Finalizing CVPixelBuffer 0x170133e20 while lock count is 1" warning.
I used the following QRCode image (QRCode = https://jingged.com).
(Tested on the iPhone 6 simulator with iOS version 11.2.)
Output:
2018-03-14 15:31:13.159400+0530 TestProject[25889:233062] [MC] Lazy loading NSBundle MobileCoreServices.framework
2018-03-14 15:31:13.160302+0530 TestProject[25889:233062] [MC] Loaded MobileCoreServices.framework
https://jingged.com
The iOS API provides the CIDetector class from the Core Image framework.
CIDetector lets you find specific patterns in images, like faces, smiles, eyes, or in our case: QR codes.
Here is the code to detect a QRCode from a UIImage in Objective-C:
- (NSArray *)detectQRCode:(UIImage *)image
{
    @autoreleasepool {
        NSLog(@"%@ :: %@", NSStringFromClass([self class]), NSStringFromSelector(_cmd));
        NSCAssert(image != nil, @"**Assertion Error** detectQRCode : image is nil");

        CIImage *ciImage = image.CIImage; // assuming underlying data is a CIImage
        //CIImage *ciImage = [[CIImage alloc] initWithCGImage:image.CGImage];
        // to use if the underlying data is a CGImage

        NSDictionary *options;
        CIContext *context = [CIContext context];
        options = @{ CIDetectorAccuracy : CIDetectorAccuracyHigh }; // Slow but thorough
        //options = @{ CIDetectorAccuracy : CIDetectorAccuracyLow }; // Fast but superficial

        CIDetector *qrDetector = [CIDetector detectorOfType:CIDetectorTypeQRCode
                                                    context:context
                                                    options:options];
        if ([[ciImage properties] valueForKey:(NSString *)kCGImagePropertyOrientation] == nil) {
            options = @{ CIDetectorImageOrientation : @1 };
        } else {
            options = @{ CIDetectorImageOrientation : [[ciImage properties] valueForKey:(NSString *)kCGImagePropertyOrientation] };
        }
        NSArray *features = [qrDetector featuresInImage:ciImage
                                                options:options];
        return features;
    }
}
The returned NSArray* will contain CIFeature* objects if a QRCode is present and detected. If there is no QRCode, the NSArray* will be nil. If QRCode decoding fails, the NSArray* will have no elements.
To obtain the encoded string :
if (features != nil && features.count > 0) {
    for (CIQRCodeFeature *qrFeature in features) {
        NSLog(@"QRFeature.messageString : %@", qrFeature.messageString);
    }
}
As in @Duncan-C's answer, you can then extract the QRCode corners and draw an enclosing bounding box of the QRCode on the image.
Note:
Under iOS 10.0 beta 6, the call to [qrDetector featuresInImage:ciImage options:options], when using images coming from the cameraSampleBuffer, logs this internal warning (it runs smoothly but spams the console with this message, and I could not find a way to get rid of it for now):
Finalizing CVPixelBuffer 0x170133e20 while lock count is 1.
Sources:
Apple Dev API Reference - CIDetector
Apple Dev API Programming guide - Face detection
None of the answers here were extremely straightforward with regard to returning the decoded messages. I made a tiny extension that works well for me:
https://gist.github.com/freak4pc/3f7ae2801dd8b7a068daa957463ac645
extension UIImage {
    func parseQR() -> [String] {
        guard let image = CIImage(image: self) else {
            return []
        }
        let detector = CIDetector(ofType: CIDetectorTypeQRCode,
                                  context: nil,
                                  options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])
        let features = detector?.features(in: image) ?? []
        return features.compactMap { feature in
            return (feature as? CIQRCodeFeature)?.messageString
        }
    }
}
Core Image has the CIDetector class, with the CIDetectorTypeQRCode type for detecting QR codes. You can feed a Core Image detector either a still image or video.
That should meet your needs. See the Xcode docs for more info.
The GitHub repo iOS8-day-by-day from ShinobiControls includes a project, LiveDetection, that shows how to use CIDetectorTypeQRCode both from a video feed and from a still image. It looks like it hasn't been updated for Swift 2.0, and I wasn't able to get it to compile under Xcode 7.2.1, but the function performQRCodeDetection in the project DOES compile. (The compile problems are with code that handles all the horrible type-casting you have to deal with to handle CVPixelBuffers in Swift, which doesn't matter if all you want to do is find QR codes in static images.)
EDIT:
Here is the key method from that site (in Swift)
func performQRCodeDetection(image: CIImage) -> (outImage: CIImage?, decode: String) {
    var resultImage: CIImage?
    var decode = ""
    if let detector = detector {
        let features = detector.featuresInImage(image)
        for feature in features as! [CIQRCodeFeature] {
            resultImage = drawHighlightOverlayForPoints(image,
                                                        topLeft: feature.topLeft,
                                                        topRight: feature.topRight,
                                                        bottomLeft: feature.bottomLeft,
                                                        bottomRight: feature.bottomRight)
            decode = feature.messageString
        }
    }
    return (resultImage, decode)
}
If you need just a string, you can use code like this:
class QRToString {
    func string(from image: UIImage) -> String {
        var qrAsString = ""
        guard let detector = CIDetector(ofType: CIDetectorTypeQRCode,
                                        context: nil,
                                        options: [CIDetectorAccuracy: CIDetectorAccuracyHigh]),
              let ciImage = CIImage(image: image),
              let features = detector.features(in: ciImage) as? [CIQRCodeFeature] else {
            return qrAsString
        }
        for feature in features {
            guard let indeedMessageString = feature.messageString else {
                continue
            }
            qrAsString += indeedMessageString
        }
        return qrAsString
    }
}
Use the ZBar SDK to read a QR code from a static image.
ZBar-SDK-iOS
Please check this tutorial regarding integration of the ZBar SDK:
ZBar SDK Integration Tutorial
Then try to scan the static image.
Use the ZBar scanner class to scan your image.
Here is the documentation:
ZBarImageScanner.
For example, here is how to use the ZBar scanner class:
ZBarImageScanner *scandoc = [[ZBarImageScanner alloc] init];
NSInteger resultsnumber = [scandoc scanImage:yourUIImage];
if (resultsnumber > 0) {
    ZBarSymbolSet *results = scandoc.results;
    // Here you will get the result.
}
The link below will help you:
scaning-static-uiimage-using-ios-zbar-sdk
Objective-C
- (NSString *)readQR {
    CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeQRCode context:nil options:@{
        CIDetectorAccuracy : CIDetectorAccuracyHigh
    }];
    /// here you can replace `self` with your image value
    CIImage *ciImage = [[CIImage alloc] initWithImage:self];
    NSArray *features = [detector featuresInImage:ciImage];
    if ([features count] == 0) {
        return nil;
    }
    __block NSString *qrString = @"";
    [features enumerateObjectsUsingBlock:^(CIQRCodeFeature * _Nonnull obj, NSUInteger idx, BOOL * _Nonnull stop) {
        qrString = obj.messageString;
    }];
    return qrString;
}

How to check if a file exists using Swift, rewritten from Objective-C

How can I rewrite this Objective-C code in Swift?
NSString *filePath = @"/Applications/MySample.app";
if ([[NSFileManager defaultManager] fileExistsAtPath:filePath])
{
    // avoid open add friend
}
Regards.
Equivalent Swift 3 code:
let filePath = "/Applications/MySample.app"
if (FileManager.default.fileExists(atPath: filePath)) {
    // avoid open add friend
}
Swift 2:
let filePath = "/Applications/MySample.app"
if (NSFileManager.defaultManager().fileExistsAtPath(filePath))
{
    // avoid open add friend
}
Some years after the question was asked, I recommend taking "rewrite" literally and using the URL-related API:
let fileURL = URL(fileURLWithPath: "/Applications/MySample.app")
if let _ = try? fileURL.checkResourceIsReachable() {
    // file exists
}
let path = "/Applications/MySample.app"
let hasFile = FileManager().fileExists(atPath: path)
if hasFile {
    // use file
}
else {
    // possibly inform user the file does not exist
}

Load file in Today extension when device is locked

In my today extension with my device unlocked, this line of code works as expected, returning the data from the image path:
let imageData = NSData(contentsOfFile: path)
However when my device is locked with a passcode, it returns nil. Is there any way to access images in the file system when the device is locked? I can access UserDefaults just fine, but not files in the directory for my shared group. Here is how I am creating the path, calling imagePath, which is correctly populated with the path I expect in both cases:
func rootFilePath() -> String? {
    let manager = NSFileManager()
    let containerURL = manager.containerURLForSecurityApplicationGroupIdentifier(GROUP_ID)
    if let unwrappedURL = containerURL {
        return unwrappedURL.path
    }
    else {
        return nil
    }
}

func imagePath() -> String? {
    let rootPath = rootFilePath()
    if let uPath = rootPath {
        return "\(uPath)/\(imageId).png"
    }
    else {
        return nil
    }
}
I just figured it out! You need to set the file permissions accordingly:
NSFileManager *fm = [[NSFileManager alloc] init];
NSDictionary *attribs = @{NSFileProtectionKey : NSFileProtectionNone};
NSError *unprotectError = nil;
BOOL unprotectSuccess = [fm setAttributes:attribs
                             ofItemAtPath:[containerURL path]
                                    error:&unprotectError];
if (!unprotectSuccess) {
    NSLog(@"Unable to remove protection from file! %@", unprotectError);
}
In many cases you wouldn't normally want to do this, but because the information is intended to be viewed from the lock screen, I'm OK with removing file protection.
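A minimal Swift sketch of the same permission change, assuming containerURL is the shared app group container URL from the question:
do {
    // NSFileProtectionNone lets the extension read the file while the device is locked.
    try FileManager.default.setAttributes([.protectionKey: FileProtectionType.none],
                                          ofItemAtPath: containerURL.path)
} catch {
    print("Unable to remove protection from file! \(error)")
}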

How can I render a list of all iPhone contacts with Cordova (PhoneGap)?

I am trying to create an application which lists all contacts from the iPhone address book with the following code (CoffeeScript):
listContacts: ->
  options = new ContactFindOptions()
  options.filter = ''
  options.multiple = true
  fields = ["id", "photos", "name", "phoneNumbers"]
  navigator.contacts.find(fields, @onSuccess, @onError, options)

onSuccess: (contacts) ->
  console.log contacts.length

onError: (error) ->
  console.log error
This seems to work nicely for a bunch of contacts, but with 3000 the contacts never return. The funny thing is that it works perfectly in the iOS Simulator.
Are there any limitations to the number of contacts which can be retrieved?
I had the same problem with 300 contacts; it took around 5 minutes. After I patched it, it only takes 10 seconds.
Here is my pull request: https://github.com/phonegap/phonegap/pull/19
They have to generate a temp file for each picture, and they use a crazy loop to find a free file path. Something like:
do {
    filePath = [NSString stringWithFormat:@"%@/photo_%03d.jpg", docsPath, i++];
} while ([fileMgr fileExistsAtPath:filePath]);
Now I use mktemp and everything is faster.
If you don't need full-resolution pictures, you can also replace:
CFDataRef photoData = ABPersonCopyImageData(self.record);
by :
CFDataRef photoData = ABPersonCopyImageDataWithFormat(self.record, kABPersonImageFormatThumbnail);
I hope that'll help you!
Edit:
iOS will flush the temp directory each time you start the application:
You are responsible for deleting any temporary files that you created.
The system will clean them up at startup, but that could be a very
long time away.
From: http://cocoadev.com/wiki/NSTemporaryDirectory
If you don't want to slow down the bootstrap of your application, you should always use the same file path, based on the contact ID. You'll save cleanup and write time if the file already exists:
- (NSObject *)extractPhotos
{
    NSMutableArray *photos = nil;
    if (ABPersonHasImageData(self.record)) {
        //CFDataRef photoData = ABPersonCopyImageDataWithFormat(self.record, kABPersonImageFormatThumbnail);
        CFDataRef photoData = ABPersonCopyImageData(self.record);
        NSData *data = (__bridge NSData *)photoData;
        // write to temp directory and store URI in photos array
        // get the temp directory path
        NSString *docsPath = [NSTemporaryDirectory() stringByStandardizingPath];
        NSError *err = nil;
        int recordId = ABRecordGetRecordID(self.record);
        NSFileManager *fileMgr = [[NSFileManager alloc] init];
        NSString *filePath = [NSString stringWithFormat:@"%@/photo_%03d.jpg", docsPath, recordId];
        BOOL hasImage = NO;
        if ([fileMgr fileExistsAtPath:filePath]) {
            hasImage = YES;
        } else if ([data writeToFile:filePath options:NSAtomicWrite error:&err]) {
            hasImage = YES;
        }
        if (hasImage) {
            photos = [NSMutableArray arrayWithCapacity:1];
            NSMutableDictionary *newDict = [NSMutableDictionary dictionaryWithCapacity:2];
            [newDict setObject:filePath forKey:kW3ContactFieldValue];
            [newDict setObject:@"url" forKey:kW3ContactFieldType];
            [newDict setObject:@"false" forKey:kW3ContactFieldPrimary];
            [photos addObject:newDict];
        }
        CFRelease(photoData);
    }
    return photos;
}
Edit (08/01/2013):
FYI : merged in cordova : http://git-wip-us.apache.org/repos/asf/cordova-ios/commit/c6a1dbe3
First you have to add the plugin from the terminal command line:
$ cordova plugin add org.apache.cordova.contacts
Then in onDeviceReady you can call a method to open the contact list:
function chooseContact() {
    var options = new ContactFindOptions();
    options.fields = ["displayName", "name", "emails", "phoneNumbers"];
    navigator.contacts.chooseContact(onSuccess, options);
}

function onSuccess(id, contact) {
    console.log(JSON.stringify(contact));
}
