Caching Images in UICollectionViewCell in Swift (iOS)

I am trying to cache images to prevent them from reloading constantly and crashing the app. So, I went to look at Apple's implementation of image caching, which was written in Objective-C, and had to replicate it in Swift. But when I run it and try to cache, my app crashes with the error: fatal error: unexpectedly found nil while unwrapping an Optional value.
Swift Code
func updateCachedAssets() -> Void {
    let isViewVisible: Bool = self.isViewLoaded() && self.view.window != nil
    if !isViewVisible { return }
    // The preheat window is twice the height of the visible rect
    var preheatRect: CGRect = (self.collectionView?.bounds)!
    preheatRect = CGRectInset(preheatRect, 0.0, -0.5 * CGRectGetHeight(preheatRect))
    // Check if the collection view is showing an area that is significantly different to the last preheated area
    let delta: CGFloat = abs(CGRectGetMidY(preheatRect) - CGRectGetMidY(self.previousPreheatRect))
    if delta > CGRectGetHeight((self.collectionView?.bounds)!) / 3.0 {
        // Compute the assets to start caching and to stop caching
        let addedIndexPaths = NSMutableArray()
        let removedIndexPaths = NSMutableArray()
        self.computeDifferenceBetweenRect(self.previousPreheatRect, andRect: preheatRect, removedHandler: { (removedRect) -> Void in
            let indexPaths: NSArray = [self.collectionView!.aapl_indexPathsForElementsInRect(removedRect)]
            removedIndexPaths.addObjectsFromArray(indexPaths as [AnyObject])
        }, addedHandler: { (addedRect) -> Void in
            let indexPaths: NSArray = [self.collectionView!.aapl_indexPathsForElementsInRect(addedRect)]
            addedIndexPaths.addObjectsFromArray(indexPaths as [AnyObject])
        })
        //print("AssetAtIndex", self.assetsAtIndexPaths(addedIndexPaths))
        let assetsToStartCaching: NSArray = self.assetsAtIndexPaths(addedIndexPaths)
        let assetsToStopCaching: NSArray = self.assetsAtIndexPaths(removedIndexPaths)
        // Update the assets the PHCachingImageManager is caching.
        self.imageManager.startCachingImagesForAssets(assetsToStartCaching as! [PHAsset], targetSize: AssetGridThumbnailSize, contentMode: .AspectFill, options: nil)
        self.imageManager.stopCachingImagesForAssets(assetsToStopCaching as! [PHAsset], targetSize: AssetGridThumbnailSize, contentMode: .AspectFill, options: nil)
        self.previousPreheatRect = preheatRect
    }
}
Objective-C
- (void)updateCachedAssets {
    BOOL isViewVisible = [self isViewLoaded] && [[self view] window] != nil;
    if (!isViewVisible) { return; }
    // The preheat window is twice the height of the visible rect.
    CGRect preheatRect = self.collectionView.bounds;
    preheatRect = CGRectInset(preheatRect, 0.0f, -0.5f * CGRectGetHeight(preheatRect));
    /*
     Check if the collection view is showing an area that is significantly
     different to the last preheated area.
     */
    CGFloat delta = ABS(CGRectGetMidY(preheatRect) - CGRectGetMidY(self.previousPreheatRect));
    if (delta > CGRectGetHeight(self.collectionView.bounds) / 3.0f) {
        // Compute the assets to start caching and to stop caching.
        NSMutableArray *addedIndexPaths = [NSMutableArray array];
        NSMutableArray *removedIndexPaths = [NSMutableArray array];
        [self computeDifferenceBetweenRect:self.previousPreheatRect andRect:preheatRect removedHandler:^(CGRect removedRect) {
            NSArray *indexPaths = [self.collectionView aapl_indexPathsForElementsInRect:removedRect];
            [removedIndexPaths addObjectsFromArray:indexPaths];
        } addedHandler:^(CGRect addedRect) {
            NSArray *indexPaths = [self.collectionView aapl_indexPathsForElementsInRect:addedRect];
            [addedIndexPaths addObjectsFromArray:indexPaths];
        }];
        NSArray *assetsToStartCaching = [self assetsAtIndexPaths:addedIndexPaths];
        NSArray *assetsToStopCaching = [self assetsAtIndexPaths:removedIndexPaths];
        // Update the assets the PHCachingImageManager is caching.
        [self.imageManager startCachingImagesForAssets:assetsToStartCaching
                                            targetSize:AssetGridThumbnailSize
                                           contentMode:PHImageContentModeAspectFill
                                               options:nil];
        [self.imageManager stopCachingImagesForAssets:assetsToStopCaching
                                           targetSize:AssetGridThumbnailSize
                                          contentMode:PHImageContentModeAspectFill
                                              options:nil];
        // Store the preheat rect to compare against in the future.
        self.previousPreheatRect = preheatRect;
    }
}
The last code that ran before the crash, as shown by the debugger:
Swift
func assetsAtIndexPaths(indexPaths: NSArray) -> NSArray {
    if indexPaths.count == 0 { return [] }
    let assets = NSMutableArray(capacity: indexPaths.count)
    for indexPath in indexPaths {
        let asset = self.assetsFetchResults[indexPath.item] as! PHAsset
        assets.addObject(asset)
    }
    return assets
}
Objective-C
- (NSArray *)assetsAtIndexPaths:(NSArray *)indexPaths {
    if (indexPaths.count == 0) { return nil; }
    NSMutableArray *assets = [NSMutableArray arrayWithCapacity:indexPaths.count];
    for (NSIndexPath *indexPath in indexPaths) {
        PHAsset *asset = self.assetsFetchResults[indexPath.item];
        [assets addObject:asset];
    }
    return assets;
}
Error
fatal error: unexpectedly found nil while unwrapping an Optional value
The last line before the error is thrown (which is not an optional value) is
let asset = self.assetsFetchResults[indexPath.item] as! PHAsset, in the assetsAtIndexPaths method.
Any help would be appreciated. Thanks

You should safely unwrap the optional value.

If the crash is due to a missing value in the Optional, the following will protect against that condition:
if let asset = self.assetsFetchResults[indexPath.item] as? PHAsset {
    assets.addObject(asset)
}
Replace this code inside your for loop with it:
let asset = self.assetsFetchResults[indexPath.item] as! PHAsset
assets.addObject(asset)
This is called Optional Binding as found in the Swift book:
You use optional binding to find out whether an optional contains a value, and if so, to make that value available as a temporary constant or variable. Optional binding can be used with if and while statements to check for a value inside an optional, and to extract that value into a constant or variable, as part of a single action. if and while statements are described in more detail in Control Flow.
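Putting it together, a minimal sketch of the full method with the force cast replaced by optional binding (assuming, as in the question, that assetsFetchResults backs the grid):
func assetsAtIndexPaths(indexPaths: NSArray) -> NSArray {
    if indexPaths.count == 0 { return [] }
    let assets = NSMutableArray(capacity: indexPaths.count)
    for indexPath in indexPaths {
        // Conditional cast: entries that are missing or not PHAssets are
        // skipped instead of crashing on a force unwrap.
        if let asset = self.assetsFetchResults[indexPath.item] as? PHAsset {
            assets.addObject(asset)
        }
    }
    return assets
}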

Related

Can I detect whether a PHAsset / AVAsset is HDR video / Dolby Vision on iPhone 12?

I want to add an HDR icon to indicate that an asset is HDR, but I can't find any info to check whether a video is an HDR video recorded on an iPhone 12.
+ (BOOL)isHDRVideo:(AVAsset *)avasset {
    if (!avasset) {
        return NO;
    }
    __block BOOL isHDRVideo = NO;
    [avasset.tracks enumerateObjectsUsingBlock:^(AVAssetTrack * _Nonnull obj, NSUInteger idx, BOOL * _Nonnull stopTracks) {
        [obj.formatDescriptions enumerateObjectsUsingBlock:^(id _Nonnull obj, NSUInteger idx, BOOL * _Nonnull stopFormatDescriptions) {
            CMFormatDescriptionRef desc = (__bridge CMVideoFormatDescriptionRef)obj;
            NSDictionary *dic = (__bridge NSDictionary *)CMFormatDescriptionGetExtensions(desc);
            NSString *imageBufferColorPrimaries = dic[(__bridge id)kCVImageBufferColorPrimariesKey];
            if ([imageBufferColorPrimaries isEqualToString:(__bridge id)kCVImageBufferColorPrimaries_ITU_R_2020]) {
                *stopFormatDescriptions = YES;
                *stopTracks = YES;
                isHDRVideo = YES;
            }
        }];
    }];
    return isHDRVideo;
}
A better approach could be using avasset.tracks(withMediaCharacteristic: .containsHDRVideo)
Or
simpleVideo.tracks.contains { $0.hasMediaCharacteristic(.containsHDRVideo) }
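As a fuller sketch, assuming iOS 14+ (where .containsHDRVideo is available) and a hypothetical helper name:
import AVFoundation

// Hypothetical helper: true when the asset has at least one track tagged as HDR (iOS 14+).
func assetContainsHDRVideo(_ asset: AVAsset) -> Bool {
    return !asset.tracks(withMediaCharacteristic: .containsHDRVideo).isEmpty
}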
This post really helped me out, so I felt I had to share back what I've found, to provide a Swift-based approach for future readers.
One can extend AVAssetTrack to put all the logic where the CMFormatDescription is defined:
public extension AVAssetTrack {
    var isHDRVideo: Bool {
        guard
            self.mediaType == .video, // If it is not a video track, it is not HDR
            let cmFormatDescription = (self.formatDescriptions as? [CMFormatDescription])?.first, // Safely get the description
            let transferFunction = CMFormatDescriptionGetExtension(
                cmFormatDescription,
                extensionKey: kCVImageBufferTransferFunctionKey) as? String // It can be nil, so bind it safely
        else { return false }
        return [
            kCVImageBufferTransferFunction_ITU_R_2020,
            kCVImageBufferTransferFunction_ITU_R_2100_HLG,
            kCVImageBufferTransferFunction_SMPTE_ST_2084_PQ
        ].map { $0 as String }.contains(transferFunction)
    }
}
Then say you have an AVAsset called asset, you can do:
asset.tracks(withMediaType: .video).map { $0.isHDRVideo }
and build your logic on that.

How can I add a search option to UIPickerView in iOS?

I am using a UIPickerView in a UIActionSheet in my project. The picker view list shows names, and on selection I pass the corresponding id.
For 5-10 values this is OK, but suppose I have more than 100 values with their ids; then it takes too much time to scroll, find, and select a value.
So, I want to add a search option: after tapping my text field, the picker or popup appears, showing the full list at first, along with a search field so the user can type one or two characters and the list is filtered accordingly (autocomplete).
How can I implement this?
If anyone wants it in Swift...
Create a text field named txtSearch, and then in viewDidLoad add the two lines below:
self.txtSearch.delegate = self
self.txtSearch.addTarget(self, action: #selector(self.textFieldValueChanged(TextField:)), for: UIControlEvents.editingChanged)
func textFieldValueChanged(TextField: UITextField)
{
    if (TextField.text?.characters.count)! > 0 {
        self.filteredArray = self.searchInArray(srchArray: self.ArraywithFullData, withKey: "key value", Characters: TextField.text!)
    } else {
        self.filteredArray = self.ArraywithFullData.mutableCopy() as! NSMutableArray
    }
    self.pickerView.reloadData()
}
func searchInArray(srchArray: NSMutableArray, withKey: String, Characters: String) -> NSMutableArray
{
    let resultArray = NSMutableArray()
    for index in 0..<srchArray.count {
        let Dict = srchArray[index] as! [String: Any]
        if let stringmatcher = Dict[withKey] as? String {
            if stringmatcher.contains(find: Characters) {
                resultArray.add(Dict)
            }
        }
    }
    return resultArray
}
extension String {
    func contains(find: String) -> Bool {
        return self.range(of: find) != nil
    }
}
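For the picker to reflect the filtered results, its data source should read from filteredArray. A minimal sketch, assuming a single-component picker and the same placeholder "key value" key as above:
func numberOfComponents(in pickerView: UIPickerView) -> Int {
    return 1
}

func pickerView(_ pickerView: UIPickerView, numberOfRowsInComponent component: Int) -> Int {
    return filteredArray.count
}

func pickerView(_ pickerView: UIPickerView, titleForRow row: Int, forComponent component: Int) -> String? {
    // "key value" is the same placeholder key used in searchInArray above.
    let dict = filteredArray[row] as? [String: Any]
    return dict?["key value"] as? String
}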
And for Objective-C you have to do something like below. But don't forget to set the text field's delegate.
NSMutableArray *filteredArray;
NSMutableArray *ArraywithFullData;

self.txtSearch.delegate = self;
[self.txtSearch addTarget:self action:@selector(textFieldValueChanged:) forControlEvents:UIControlEventEditingChanged];

- (void)textFieldValueChanged:(UITextField *)TextField
{
    if (TextField.text.length > 0) {
        filteredArray = [self searchInArray:ArraywithFullData withKey:@"key value" andCharacters:TextField.text];
    } else {
        filteredArray = [ArraywithFullData mutableCopy];
    }
    [self.pickerView reloadData];
}

- (NSMutableArray *)searchInArray:(NSMutableArray *)srchArray withKey:(NSString *)key andCharacters:(NSString *)charecters
{
    NSMutableArray *resultArray = [[NSMutableArray alloc] init];
    for (int index = 0; index < srchArray.count; index++) {
        NSMutableDictionary *Dict = srchArray[index];
        if ([Dict valueForKey:key] != nil) {
            NSString *stringmatcher = [Dict valueForKey:key];
            if ([stringmatcher containsString:charecters]) {
                [resultArray addObject:Dict];
            }
        }
    }
    return resultArray;
}
Note: if the data is not in the form of an array of dictionaries, you can remove the dictionary-related code.

Error in PDKPin.m for Pinterest iOS SDK

I am currently using the iOS SDK's "getAuthenticatedUserPinsWithFields" method to return a user's pins:
let fields = ["id", "note", "url", "image"] as NSSet
PDKClient.sharedInstance().getAuthenticatedUserPinsWithFields(fields as Set<NSObject>,
    success: { (responseObject: PDKResponseObject!) -> Void in
I believe I traced the error to PDKPin.m initWithDictionary:
- (instancetype)initWithDictionary:(NSDictionary *)dictionary
{
    self = [super initWithDictionary:dictionary];
    if (self) {
        // _url = [NSURL URLWithString:dictionary[@"link"]];
        _url = [NSURL URLWithString:dictionary[@"url"]];
        _descriptionText = dictionary[@"note"];
        _board = [PDKBoard boardFromDictionary:dictionary[@"board"]];
        _creator = [PDKUser userFromDictionary:dictionary[@"creator"]];
        _metaData = dictionary[@"metadata"];
        _repins = [self.counts[@"repins"] unsignedIntegerValue];
        _likes = [self.counts[@"likes"] unsignedIntegerValue];
        _comments = [self.counts[@"comments"] unsignedIntegerValue];
    }
    return self;
}
The @"link" reference always returns nil, but using @"url" returns the URL of the pin. Is the SDK wrong in this case?
Thanks, Anita

Convert Swift method using extend and append functions to Objective-C

I have this Swift method that I want to convert to Objective-C. The method uses the extend function of Array; as far as I understand, it appends the contents of another array while the method calls itself recursively.
Swift:
/// the method to serialize all the objects
func serializeObject(object: AnyObject, key: String?) -> Array<HTTPPair> {
    var collect = Array<HTTPPair>()
    if let array = object as? Array<AnyObject> {
        for nestedValue: AnyObject in array {
            collect.extend(self.serializeObject(nestedValue, key: "\(key!)[]"))
        }
    } else if let dict = object as? Dictionary<String,AnyObject> {
        for (nestedKey, nestedObject: AnyObject) in dict {
            var newKey = key != nil ? "\(key!)[\(nestedKey)]" : nestedKey
            collect.extend(self.serializeObject(nestedObject, key: newKey))
        }
    } else {
        collect.append(HTTPPair(value: object, key: key))
    }
    return collect
}
What I've done so far in Objective-C.
- (NSArray *)serializeObject:(id)obj key:(NSString *)key
{
    NSMutableArray *collect = [NSMutableArray array];
    if ([obj isKindOfClass:[NSArray class]]) {
        NSArray *objArray = obj;
        if (obj) {
            for (id nestedObj in objArray) {
                [collect addObject:[self serializeObject:nestedObj key:[NSString stringWithFormat:@"%@[]", key]]];
            }
        }
    } else if ([obj isKindOfClass:[NSDictionary class]]) {
        NSDictionary *dict = obj;
        if (dict) {
            for (NSString *nestedKey in dict) {
                NSString *newKey = key != nil ? [NSString stringWithFormat:@"%@[%@]", key, nestedKey] : nestedKey;
                id nestedObject = [dict objectForKey:newKey];
                if (nestedObject) {
                    [collect addObject:[self serializeObject:nestedObject key:newKey]];
                }
            }
        }
    } else {
        [collect addObject:[[WEEHTTPPair alloc] initWithValue:obj andKey:key]];
    }
    return collect;
}
The goal is to get an NSArray of WEEHTTPPair objects for every key/value pair in the dictionary, but I lose the meaning of extend and append when translating to my Objective-C code. To me it looks like both add the object to the array, which is newly created anyway, but it's more that I lack the knowledge so far.
[EDIT]
The method is used as follows.
Swift.
/// convert the parameter dict to its HTTP string representation
func stringFromParameters(parameters: Dictionary<String,AnyObject>) -> String {
    return join("&", map(serializeObject(parameters, key: nil), { (pair) in
        return pair.stringValue()
    }))
}
I converted it to Objective-C, borrowing BlocksKit's map extension.
Objective-C.
- (NSString *)stringFromParameters:(NSDictionary *)parameters
{
    WEENSArrayBlocksKit *blockKit = [WEENSArrayBlocksKit new];
    NSArray *serializedParams = [self serializeObject:parameters key:nil];
    NSArray *arrayParams = [blockKit bk_map:serializedParams withBlock:^id(id obj)
    {
        // obj is an array without the desired results
        WEEHTTPPair *httpPair = obj;
        NSString *stringValue = nil;
        if (httpPair) {
            stringValue = [httpPair stringValue];
        }
        return stringValue;
    }];
    NSString *joinedString = [arrayParams componentsJoinedByString:@"&"];
    return joinedString;
}
When you do this:
[collect addObject:[self serializeObject:nestedObject key:newKey]];
You are adding an NSArray to your collect object. Instead, you want to add the objects contained within the response to collect:
[collect addObjectsFromArray:[self serializeObject:nestedObject key:newKey]];
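In Swift terms: extend (later renamed append(contentsOf:)) merges the elements of another array into the receiver, while append adds its argument as a single, possibly nested, element. A quick sketch of the difference:
var flat = [String]()
flat.append(contentsOf: ["a", "b"]) // like addObjectsFromArray: -> ["a", "b"]

var nested = [Any]()
nested.append(["a", "b"])           // like addObject: -> [["a", "b"]], one nested element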

How to get only images in the camera roll using Photos Framework

The following code also loads images that are located on iCloud or in Photo Stream. How can we limit the search to images in the camera roll only?
var assets = PHAsset.fetchAssetsWithMediaType(PHAssetMediaType.Image, options: nil)
After adding the Camera Roll and Photo Stream albums, Apple added the following PHAssetCollectionSubtype types in iOS 8.1:
PHAssetCollectionSubtypeAlbumMyPhotoStream (together with PHAssetCollectionTypeAlbum) - fetches the Photo Stream album.
PHAssetCollectionSubtypeSmartAlbumUserLibrary (together with PHAssetCollectionTypeSmartAlbum) - fetches the Camera Roll album.
Haven't tested if this is backward-compatible with iOS 8.0.x though.
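For example, a minimal sketch of fetching the Camera Roll through that subtype and then its image assets (modern Swift syntax):
let collections = PHAssetCollection.fetchAssetCollections(
    with: .smartAlbum, subtype: .smartAlbumUserLibrary, options: nil)
if let cameraRoll = collections.firstObject {
    let options = PHFetchOptions()
    options.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.image.rawValue)
    let assets = PHAsset.fetchAssets(in: cameraRoll, options: options)
    print("Camera Roll contains \(assets.count) images")
}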
Through some experimentation we discovered a hidden property not listed in the documentation (assetSource). Basically you do a regular fetch request, then use a predicate to filter the ones from the camera roll. The value should be 3.
Sample code:
// fetch all assets, then sub-fetch only the range we need
let fetchOptions = PHFetchOptions() // assumed declared elsewhere in the original answer
var results = NSMutableArray()
var assets = PHAsset.fetchAssetsWithMediaType(PHAssetMediaType.Image, options: fetchOptions)
assets.enumerateObjectsUsingBlock { (obj, idx, bool) -> Void in
    results.addObject(obj)
}
let cameraRollAssets = results.filteredArrayUsingPredicate(NSPredicate(format: "assetSource == %@", argumentArray: [3]))
results = NSMutableArray(array: cameraRollAssets)
If, like me, you were searching for Objective-C code, or weren't getting an answer for the new Photos framework because the results kept pointing at deprecated AssetsLibrary code, then this will help you:
Swift
func getAllPhotosFromCameraRoll() -> [UIImage] {
    // TODO: Add `NSPhotoLibraryUsageDescription` to info.plist
    PHPhotoLibrary.requestAuthorization { print($0) } // TODO: Move this line of code to somewhere before attempting to access photos
    var images = [UIImage]()
    let requestOptions: PHImageRequestOptions = PHImageRequestOptions()
    requestOptions.resizeMode = .exact
    requestOptions.deliveryMode = .highQualityFormat
    requestOptions.isSynchronous = true
    let fetchResult: PHFetchResult = PHAsset.fetchAssets(with: .image, options: nil)
    let manager: PHImageManager = PHImageManager.default()
    for i in 0..<fetchResult.count {
        let asset = fetchResult.object(at: i)
        manager.requestImage(
            for: asset,
            targetSize: PHImageManagerMaximumSize,
            contentMode: .default,
            options: requestOptions,
            resultHandler: { (image: UIImage?, info: [AnyHashable: Any]?) -> Void in
                if let image = image {
                    images.append(image)
                }
            })
    }
    return images
}
Objective-C
Global variables:
NSArray *imageArray;
NSMutableArray *mutableArray;
The method below will help you:
- (void)getAllPhotosFromCamera
{
    imageArray = [[NSArray alloc] init];
    mutableArray = [[NSMutableArray alloc] init];
    PHImageRequestOptions *requestOptions = [[PHImageRequestOptions alloc] init];
    requestOptions.resizeMode = PHImageRequestOptionsResizeModeExact;
    requestOptions.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
    requestOptions.synchronous = true;
    PHFetchResult *result = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:nil];
    NSLog(@"%d", (int)result.count);
    PHImageManager *manager = [PHImageManager defaultManager];
    NSMutableArray *images = [NSMutableArray arrayWithCapacity:[result count]];
    // assets contains PHAsset objects.
    __block UIImage *ima;
    for (PHAsset *asset in result) {
        // Do something with the asset
        [manager requestImageForAsset:asset
                           targetSize:PHImageManagerMaximumSize
                          contentMode:PHImageContentModeDefault
                              options:requestOptions
                        resultHandler:^void(UIImage *image, NSDictionary *info) {
                            ima = image;
                            [images addObject:ima];
                        }];
    }
    imageArray = [images copy]; // You can directly use the NSMutableArray images
}
If you use your own PHCachingImageManager instead of the shared PHImageManager instance, then when you call requestImageForAsset:targetSize:contentMode:options:resultHandler: you can set an option in PHImageRequestOptions to specify that the image must be local.

networkAccessAllowed (property)
A Boolean value that specifies whether Photos can download the requested image from iCloud.

Discussion: If YES, and the requested image is not stored on the local device, Photos downloads the image from iCloud. To be notified of the download’s progress, use the progressHandler property to provide a block that Photos calls periodically while downloading the image. If NO (the default), and the image is not on the local device, the PHImageResultIsInCloudKey value in the result handler’s info dictionary indicates that the image is not available unless you enable network access.
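A minimal sketch of that option, assuming an existing PHAsset named asset and an arbitrary target size:
let options = PHImageRequestOptions()
options.isNetworkAccessAllowed = false // NO is the default; stated here for clarity

PHCachingImageManager().requestImage(
    for: asset,
    targetSize: CGSize(width: 200, height: 200),
    contentMode: .aspectFill,
    options: options) { image, info in
        if image == nil, (info?[PHImageResultIsInCloudKey] as? Bool) == true {
            // The image exists only in iCloud and was not downloaded.
        }
}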
This can help. You can use your own data model instead of the AlbumModel I used.
func getCameraRoll() -> AlbumModel {
    var cameraRollAlbum: AlbumModel!
    let cameraRoll = PHAssetCollection.fetchAssetCollections(with: .smartAlbum, subtype: .smartAlbumUserLibrary, options: nil)
    cameraRoll.enumerateObjects({ (collection, _, _) in
        let fetchOptions = PHFetchOptions()
        fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
        fetchOptions.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.image.rawValue)
        let assets = PHAsset.fetchAssets(in: collection, options: fetchOptions)
        if assets.count > 0 {
            cameraRollAlbum = AlbumModel(name: collection.localizedTitle!, count: assets.count, collection: collection, assets: assets)
        }
    })
    return cameraRollAlbum
}
Here is the Objective-C version provided by Apple.
- (NSMutableArray *)getNumberOfPhotoFromCameraRoll:(NSArray *)array {
    PHFetchResult *fetchResult = array[1];
    int index = 0;
    unsigned long pictures = 0;
    for (int i = 0; i < fetchResult.count; i++) {
        unsigned long temp = 0;
        temp = [PHAsset fetchAssetsInAssetCollection:fetchResult[i] options:nil].count;
        if (temp > pictures) {
            pictures = temp;
            index = i;
        }
    }
    PHCollection *collection = fetchResult[index];
    if (![collection isKindOfClass:[PHAssetCollection class]]) {
        // return;
    }
    // Configure the AAPLAssetGridViewController with the asset collection.
    PHAssetCollection *assetCollection = (PHAssetCollection *)collection;
    PHFetchResult *assetsFetchResult = [PHAsset fetchAssetsInAssetCollection:assetCollection options:nil];
    self.assetsFetchResults = assetsFetchResult;
    self.assetCollection = assetCollection;
    self.numberOfPhotoArray = [NSMutableArray array];
    for (int i = 0; i < [assetsFetchResult count]; i++) {
        PHAsset *asset = assetsFetchResult[i];
        [self.numberOfPhotoArray addObject:asset];
    }
    NSLog(@"%lu", (unsigned long)[self.numberOfPhotoArray count]);
    return self.numberOfPhotoArray;
}
Where you can grab the following details:
PHFetchResult *fetchResult = self.sectionFetchResults[1];
PHCollection *collection = fetchResult[6];

value 1,6 used to get camera images
value 1,0 used to get screenshots
value 1,1 used to get hidden
value 1,2 used to get selfies
value 1,3 used to get recently added
value 1,4 used to get videos
value 1,5 used to get recently deleted
value 1,7 used to get favorites
Apple demo link
Declare your properties:
@property (nonatomic, strong) NSArray *sectionFetchResults;
@property (nonatomic, strong) PHFetchResult *assetsFetchResults;
@property (nonatomic, strong) PHAssetCollection *assetCollection;
@property (nonatomic, strong) NSMutableArray *numberOfPhotoArray;
I've been banging my head over this too. I've found no way to filter for only assets on the device with fetchAssetsWithMediaType or fetchAssetsInAssetCollection. I'm able to use requestContentEditingInputWithOptions: or requestImageDataForAsset: to determine whether the asset is on the device, but this is asynchronous and seems to use far too many resources to do for every asset in the list. There must be a better way.
PHFetchResult *fetchResult = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:nil];
for (int i = 0; i < [fetchResult count]; i++) {
    PHAsset *asset = fetchResult[i];
    [asset requestContentEditingInputWithOptions:nil
                               completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
        if ([[info objectForKey:PHContentEditingInputResultIsInCloudKey] intValue] == 1) {
            NSLog(@"asset is in cloud");
        } else {
            NSLog(@"asset is on device");
        }
    }];
}
If you don't want to rely on an undocumented API, look at [asset canPerformEditOperation:PHAssetEditOperationContent]. This only returns true if the full original is available on device.
Admittedly this is also fragile, but testing shows it works for all of the assetSource types (photostream, iTunes sync, etc).
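In Swift that check reads as follows (a sketch, assuming a PHAsset named asset):
// Only true when the full original is available on the device.
let isAvailableLocally = asset.canPerform(.content)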
