Why is PHAssetCollection count 0? (iOS)

I'm trying to get to grips with the new photo-editing capabilities in iOS 8's Photos framework. The documentation is very sparse, so I'd love some input on what might be going on.
I am trying to fetch ALL the images the user has saved. I run a fetch, but the result count keeps coming back as 0, and a strange error is printed along with it:
func initController()
{
    _userAlbums = PHCollectionList.fetchTopLevelUserCollectionsWithOptions(nil) // GET PERMISSION BEFORE DOING THIS
    println("Albums count is \(_userAlbums.count)") // error when printing this
}
This prints out
2014-10-27 17:43:50.254 appiOS[4854:732084] [PLLogging] ***** Error: logging directory does not exist
/var/mobile/Library/Logs/CrashReporter/DiagnosticLogs/
Albums count is 0
There are at least 100 images on the iPad I am using. Any idea what I am doing wrong?
Update:
Using
_userAlbums = PHAsset.fetchAssetsWithOptions(nil)
works.

There are at least 100 images on the iPad I am using. Any idea what I am doing wrong?
You are not doing anything wrong; your expectations are just a bit off. The statement "there are at least 100 images" points to a deeper misconception, because PHCollectionList.fetchTopLevelUserCollectionsWithOptions has nothing to do with images. It has to do with top-level user collections, and evidently your device doesn't have any of those.
But now go to the Photos app on your iPad and make a few albums. Those are top level user collections! So then run your app again. Assuming you've been granted permission to access the photo library, now your logging will result in a number larger than 0.
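As a minimal sketch in the same iOS 8-era Swift as the question (fetchAfterAuthorization is a hypothetical helper), request authorization first, then fetch; the collection fetch returns user-created albums, while PHAsset.fetchAssetsWithOptions(nil) returns the assets themselves:

import Photos

func fetchAfterAuthorization() {
    PHPhotoLibrary.requestAuthorization { status in
        if status == .Authorized {
            // User-created albums and folders; empty if the user has made none
            let userCollections = PHCollectionList.fetchTopLevelUserCollectionsWithOptions(nil)
            println("Top-level user collections: \(userCollections.count)")

            // Every asset in the library, regardless of album membership
            let allAssets = PHAsset.fetchAssetsWithOptions(nil)
            println("Assets: \(allAssets.count)")
        }
    }
}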

Related

iOS 11.4 not asking for privacy usage (Privacy - Motion Usage Description has been set)

I'm stumped: iOS 11.4 (15F79), iPhone 6, and I cannot get the app to ask for motion data. The Info.plist key has been set via the Xcode editor and double-checked by opening Info.plist in TextWrangler; I have also deleted the key and re-added it via TextWrangler.
<key>NSMotionUsageDescription</key>
<string>This app needs your Phones motion manager to update when the phone is tilted. Please allow this App to use your phones tilt devices</string>
I have deleted and reinstalled the app about 10 times and restarted the phone 5 times. I have checked through Settings and my app does NOT show up in Privacy > Motion & Fitness or anywhere else. I am using a free developer account; maybe that has something to do with it?
I created a new Xcode game template project and changed nothing apart from importing CoreMotion and adding this code.
(Edited: sorry, I forgot to say that I had started the instance; I just forgot to include it here, in case someone thinks that's the problem.)
let motionManager = CMMotionManager()

override func didMove(to view: SKView) {
    motionManager.startDeviceMotionUpdates()
    if motionManager.isDeviceMotionActive == true {
        motionManager.accelerometerUpdateInterval = 0.2
        motionManager.startAccelerometerUpdates(to: OperationQueue.current!, withHandler: {
            (accelerometerData: CMAccelerometerData!, error: NSError!) in
            let acceleration = accelerometerData.acceleration
            print(accelerometerData)
        } as! CMAccelerometerHandler)
    } else {
        print(CMMotionActivityManager.authorizationStatus().rawValue)
    }
}
which prints 0 to the console (an enum: case .notDetermined).
In my actual app it was 3 (same enum: case .denied).
As I've said, I have uninstalled, reinstalled, edited the plist via Xcode and TextWrangler (a code editor), tried different versions of the code above, tried the code in different places (in didMove(to:), at class level), and tried code from the Apple docs. I still haven't been asked the usage-description question, and the app keeps crashing.
I have looked for ways to get the alert to fire, along the lines of CLLocationManager.requestWhenInUseAuthorization(), but I cannot find a comparable CoreMotion version (I don't think there is one). I have created a new Swift file, imported Foundation and CoreMotion, and just put that code there, but still no alert asking for motion data.
I tried a single-view app template instead of a game template, thinking that might be the issue. Nope.
What do I do?
Any help Appreciated. Thanks
You are confusing two related but different classes.
CMMotionManager gives access to accelerometer, magnetometer and gyroscope data. It does not require any user permission as this information is not considered privacy related.
In your else clause you are checking the authorisation status of CMMotionActivityManager. This object reports the device motion type (walking, running, driving). This information is considered privacy related and when you create an instance of this class and request data from it, the permissions alert is displayed.
The reason your else branch is being triggered is that you are checking isDeviceMotionActive, which may well still be false when you check it immediately after calling startDeviceMotionUpdates (see the update below). Even if you used isAccelerometerActive you would have a problem, because you call startAccelerometerUpdates inside the if clause, which is never reached.
You probably meant to check isAccelerometerAvailable. If this returns false then there isn't much you can do; the device doesn't have an accelerometer.
Update
It doesn't make sense to check isDeviceMotionActive immediately after calling startDeviceMotionUpdates:
You know it's active; you just started it
I imagine the start up takes some time, so you could expect to get false if you check immediately.
Apple recommends that you do not have more than one observer in place for each type of motion data, so the purpose of checking is...Active is to ensure you don't call start... again if you have already done so.
If you only want raw accelerometer or gyroscope data then you don't need to call startDeviceMotionUpdates at all.
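A minimal sketch of the distinction, assuming a SpriteKit scene like the question's game template (the CMMotionActivityManager part is only there to show where the NSMotionUsageDescription alert actually comes from):

import CoreMotion
import SpriteKit

class GameScene: SKScene {
    // Keep strong references; updates stop if the managers are deallocated.
    let motionManager = CMMotionManager()
    let activityManager = CMMotionActivityManager()

    override func didMove(to view: SKView) {
        // Raw accelerometer data: no privacy prompt is involved.
        guard motionManager.isAccelerometerAvailable else { return }
        motionManager.accelerometerUpdateInterval = 0.2
        motionManager.startAccelerometerUpdates(to: .main) { data, _ in
            guard let data = data else { return }
            print(data.acceleration)
        }

        // Activity data (walking, running, driving) is privacy protected;
        // starting these updates is what triggers the NSMotionUsageDescription alert.
        activityManager.startActivityUpdates(to: .main) { activity in
            print(activity?.walking ?? false)
        }
    }
}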

Read HLS Playlist information to dynamically change the preferredBitRate of an Item

I'm working on a video app. We are changing from regular mp4 files to HLS; one of the many reasons for the change is that we have much more control over the bandwidth usage of videos (we load lots of other stuff in our player, so we need to optimize the experience as much as possible).
So, in iOS 10 AVFoundation introduced the ability to control the bandwidth using:
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:self.urlAsset];
playerItem.preferredForwardBufferDuration = 30.0;
playerItem.preferredPeakBitRate = 200000.0; // Remember this line
There's also a setting introduced in iOS 11, preferredMaximumResolution, to cap the maximum resolution of the item. We're using it, but we still need a solution for iOS 10 devices.
Having control over preferredPeakBitRate is nice, but there's a problem: not all the HLS sources are generated by us. Say we want to cap the resolution at 480p when the device is not on a Wi-Fi network; today I have no way to achieve that, because I won't always know how much bandwidth the 480p rendition of the selected HLS playlist needs.
One thing I was considering is reading the information inside the m3u8 file, to at least know which quality renditions my player can show and how much bandwidth each one needs.
One way to do this would be to download the m3u8 playlist as plain text and use a regex to read and process it. I'm trying to avoid that; I feel it should be far less difficult than this.
I cannot read this information from the asset's tracks, because a) I can't find it there, and b) the tracks are replaced dynamically when the quality changes (one track for every quality level).
So I don't know how to get this information. I've searched Google and Stack Overflow and can't find it. Can anyone help?
Here's an example of what I want to do. I have this example playlist:
#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=314000,RESOLUTION=228x128,CODECS="mp4a.40.2"
test-hls-1-16a709300abeb08713a5cada91ab864e_hls_duplex_192k.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=478000,RESOLUTION=400x224,CODECS="avc1.42001e,mp4a.40.2"
test-hls-1-16a709300abeb08713a5cada91ab864e_hls_duplex_400k.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=691000,RESOLUTION=480x270,CODECS="avc1.42001e,mp4a.40.2"
test-hls-1-16a709300abeb08713a5cada91ab864e_hls_duplex_600k.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1120000,RESOLUTION=640x360,CODECS="avc1.4d001f,mp4a.40.2"
test-hls-1-16a709300abeb08713a5cada91ab864e_hls_duplex_1000k.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1661000,RESOLUTION=960x540,CODECS="avc1.4d001f,mp4a.40.2"
test-hls-1-16a709300abeb08713a5cada91ab864e_hls_duplex_1500k.m3u8
And I just want to have that information available on an array inside my code, something like this:
NSArray<ZZMetadata *> *metadataArray = self.urlAsset.bandwidthMetadata;
NSLog(@"Metadata info: %@", metadataArray);
And print something like this:
<__NSArrayM 0x123456789> (
    <ZZMetadata 0x234567890> {
        trackId: 1
        neededBandwidth: 314000
        resolution: 228x128
        codecs: ...
        ...
    }
    <ZZMetadata 0x345678901> {
        trackId: 2
        neededBandwidth: 478000
        resolution: 400x224
    }
    ...
)
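For reference, the plain-text route mentioned above could look roughly like the sketch below (written in Swift, although the question's snippets are Objective-C; StreamVariant and parseMasterPlaylist are hypothetical names, and the simple regexes only pull out BANDWIDTH and RESOLUTION, ignoring quoted attributes such as CODECS):

import Foundation

// Hypothetical value type for one #EXT-X-STREAM-INF entry in a master playlist.
struct StreamVariant {
    let bandwidth: Int
    let resolution: String?
    let uri: String
}

func parseMasterPlaylist(_ text: String) -> [StreamVariant] {
    var variants: [StreamVariant] = []
    let lines = text.components(separatedBy: .newlines)
        .map { $0.trimmingCharacters(in: .whitespaces) }

    for (index, line) in lines.enumerated() where line.hasPrefix("#EXT-X-STREAM-INF:") {
        // The variant's URI sits on the line immediately after the tag.
        guard index + 1 < lines.count else { continue }
        let uri = lines[index + 1]

        // Pull out individual attributes with small regexes.
        func firstMatch(_ pattern: String) -> String? {
            guard let regex = try? NSRegularExpression(pattern: pattern),
                  let match = regex.firstMatch(in: line, options: [], range: NSRange(line.startIndex..., in: line)),
                  let range = Range(match.range(at: 1), in: line) else { return nil }
            return String(line[range])
        }

        let bandwidth = firstMatch("BANDWIDTH=(\\d+)").flatMap { Int($0) } ?? 0
        let resolution = firstMatch("RESOLUTION=(\\d+x\\d+)")
        variants.append(StreamVariant(bandwidth: bandwidth, resolution: resolution, uri: uri))
    }
    return variants
}

From there, the variant whose RESOLUTION matches the cap you want (480p in the example above) gives you a bandwidth figure you could assign to playerItem.preferredPeakBitRate on iOS 10.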

GIF using UIImage+animatedGif class

Using this class I am trying to load a GIF URL into a UIImageView.
The thing is, some URLs take 10 seconds to load, others 2 seconds.
I have tried almost everything, but the process is still too slow. 1 second would be good, but I have never managed to get there.
I have also tried UIWebView, which had its own issues.
Here is the code:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)) {
    let fileUrl = NSURL(string: "http://45.media.tumblr.com/6785bae27b8f888fe825f0ade95796a3/tumblr_noenkbeTSw1qjmwryo1_500.gif")
    let gif = UIImage.animatedImageWithAnimatedGIFURL(fileUrl!)
    dispatch_async(dispatch_get_main_queue()) {
        self.player.image = gif
    }
}
The problem with most of the GIF reading tools I have looked at is that they read all the data in at load time, allocate memory for all of the decoded frames, and hold all that uncompressed data in memory at the same time. This leads to runtime performance problems, and on large/long GIFs it will crash your app and possibly your device.
On the issue of loading time, there is not all that much you can do, since the data does need to be downloaded and read. You are also just assuming that the network cache is going to handle hitting the same GIF over and over without going to the network again, which may or may not work well for you.
For a solution that addresses these issues, see this SO question, or take a look at the Flipboard solution here.
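As a sketch of the caching point only, assuming the same UIImage+animatedGif category and Swift 2-era GCD as the question (it does not address the decode-everything-up-front memory issue, for which the linked solutions are the better route), you could keep the downloaded bytes on disk yourself so repeat loads skip the network:

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)) {
    let remoteUrl = NSURL(string: "http://45.media.tumblr.com/6785bae27b8f888fe825f0ade95796a3/tumblr_noenkbeTSw1qjmwryo1_500.gif")!
    let cachesDir = NSSearchPathForDirectoriesInDomains(.CachesDirectory, .UserDomainMask, true)[0]
    let localPath = (cachesDir as NSString).stringByAppendingPathComponent("cached.gif")

    // Download only if the bytes are not already on disk.
    if !NSFileManager.defaultManager().fileExistsAtPath(localPath) {
        guard let data = NSData(contentsOfURL: remoteUrl) else { return }
        data.writeToFile(localPath, atomically: true)
    }

    // Decode from the local copy; the network is only touched once per GIF.
    let gif = UIImage.animatedImageWithAnimatedGIFURL(NSURL(fileURLWithPath: localPath))
    dispatch_async(dispatch_get_main_queue()) {
        self.player.image = gif
    }
}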

iPhone returns photo data in a different way

I have imported contacts to an iPhone using CardDAV.
A strange thing happens with the PHOTO attribute.
I sent the photo data to the iPhone this way:
PHOTO;ENCODING=b;TYPE=JPG:/9j/4AAQSkZJRgABAQEAYABgAAD/2wBDAAgGBgcGBQgHBwcJCQgKDBQNDAsLDBkSEw8UHRofHh0aHBwgJC4nICIsIxwcKDcpLDAxNDQ0Hyc5PTgyPC4zNDL/2wBDAQkJCQwLDBgNDRgyIRwhMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjL/wAARCABqAKADASIAAhEBAxEB/8QAHwAAAQUBAQEBAQEAAAAAAAAAAAECAwQFBgcICQoL/8QAtRAAAgEDAwIEAwUFBAQAAAF9AQIDAAQRBRIhMUEGE1FhByJxFDKBkaEII0KxwRVS0fAkM2JyggkKFhcYGRolJicoKSo0NTY3ODk6Q0RFRkdISUpTVFVWV1hZWmNkZWZnaGlqc3R1dnd4eXqDhIWGh4iJipKTlJWWl5iZmqKjpKWmp6ipqrKztLW2t7i5usLDxMXGx8jJytLT1NXW19jZ2uHi4+Tl5ufo6erx8vP09fb3+Pn6/8QAHwEAAwEBAQEBAQEBAQAAAAAAAAECAwQFBgcICQoL/8QAtREAAgECBAQDBAcFBAQAAQJ3AAECAxEEBSExB`
and so on .....
While the iPhone returns it in this way:
PHOTO;ENCODING=b;TYPE=JPEG;X-ABCROP-RECTANGLE=ABClipRect_1&0&0&160&160&ke51PoQcz6s21p/Tl7rZyw==:
/9j/4AAQSkZJRgABAQEAYABgAAD/4QCARXhpZgAATU0AKgAAAAgABAEaAAUAAAABAAAAPgEbAA
UAAAABAAAARgEoAAMAAAABAAIAAIdpAAQAAAABAAAATgAAAAAAAABgAAAAAQAAAGAAAAABAAOg
AQADAAAAAQABAACgAgAEAAAAAQAAAKCgAwAEAAAAAQAAAGoAAAAA/9sAQwABAQEBAQEBAQEBAQ
EBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEB/9sA
QwEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQ
EBAQEBAQEBAQEB/8AAEQgAagCgAwERAAIRAQMRAf/EAB8AAAEFAQEBAQEBAAAAAAAAAAABAgME
BQYHCAkKC//EALUQAAIBAwMCBAMFBQQEAAABfQECAwAEEQUSITFBBhNRYQcicRQygZGhCCNCsc
EVUtHwJDNicoIJChYXGBkaJSYnKCkqNDU2Nzg5OkNERUZHSElKU1RVVldYWVpjZGVmZ2hpanN0
dXZ3eHl6g4SFhoeIiYqSk5SVlpeYmZqio6Slpqeoqaqys7S1tre4ubrCw8TFxsfIycrS09TV1t
fY2drh4uPk5ebn6Onq8fLz9PX29/j5+v/EAB8BAAMBAQEBAQEBAQEAAAAAAAABAgMEBQYHCAkK
C//EALURAAIBAgQEAwQHBQQEAAECdwABAgMRBAUhMQYSQVEHYXETIjKBCBRCkaGxwQkjM1LwFW
Jy0QoWJDThJfEXGBkaJicoKSo1Njc4OTpDREVGR0hJSlNUVVZXWFlaY2RlZmdoaWpzdHV2d3h5
eoKDhIWGh4iJipKTlJWWl5iZmqKjpKWmp6ipqrKztLW2t7i5usLDxMXGx8jJytLT1NXW19jZ2u
and so on .....
I need to compare the images, and I am unable to do so because of the photo data returned by the iPhone.
Note: every time, the iPhone returns the photo data differently even though the image is the same.
Does anybody know how the iPhone is transforming the image?
The client is just adding proper folding as per https://www.rfc-editor.org/rfc/rfc2426#section-2.6
In other words, you are not supposed to have lines longer than 75 characters in a vCard. You need to fold those.
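A minimal sketch of undoing that folding before comparing, in current Swift (unfoldVCard is a hypothetical helper; per the RFC, a CRLF followed by a space or tab marks a continuation of the previous line):

import Foundation

// Hypothetical helper: remove vCard line folding so the PHOTO value is a single
// base64 string again; compare the decoded bytes, not the folded text.
func unfoldVCard(_ vcard: String) -> String {
    return vcard
        .replacingOccurrences(of: "\r\n ", with: "")
        .replacingOccurrences(of: "\r\n\t", with: "")
        .replacingOccurrences(of: "\n ", with: "")
        .replacingOccurrences(of: "\n\t", with: "")
}

Once unfolded, the PHOTO value can be base64-decoded with Data(base64Encoded:) for comparison.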

Configuring multiple devices in PortAudio: Invalid device error

This query is regarding the PortAudio framework. A little background before I ask the question: I am working on a PortAudio application that outputs audio through a multichannel (8-channel) device. However, the device I am using does not expose itself as a single 8-channel device; instead it shows up in my device list as 4 stereo devices. While searching for an approach to handle this, I learned that the WinMME host API in PortAudio supports multiple devices.
I went through the appropriate header file ("pa_win_wmme.h") and followed the suggestions there, but I get the 'Invalid device' error (error number -9996) after calling Pa_OpenStream(). The header file does specify the right parameters to use when configuring multiple devices to avoid this error, but in spite of following them, I still get the error.
So I wanted to know whether anybody has faced a similar issue and whether I have missed or wrongly configured anything.
I am pasting the required snippets of code below for reference:
PaStreamParameters outputParameters;
PaWinMmeStreamInfo wmmeStreamInfo;
PaWinMmeDeviceAndChannelCount wmmeDeviceAndNumChannels;
...
...
outputParameters.device = paUseHostApiSpecificDeviceSpecification;
outputParameters.channelCount = 8;
outputParameters.sampleFormat = paFloat32; /* 32 bit floating point processing */
outputParameters.hostApiSpecificStreamInfo = NULL;
wmmeStreamInfo.size = sizeof(PaWinMmeStreamInfo);
wmmeStreamInfo.hostApiType = paMME;
wmmeStreamInfo.version = 1;
wmmeStreamInfo.flags = paWinMmeUseMultipleDevices;
wmmeDeviceAndNumChannels.channelCount = 2;
wmmeDeviceAndNumChannels.device = 3;
wmmeStreamInfo.devices = &wmmeDeviceAndNumChannels;
wmmeStreamInfo.deviceCount = 4;
outputParameters.hostApiSpecificStreamInfo = &wmmeStreamInfo;
The device id = 3 was obtained through
Pa_GetHostApiInfo( Pa_HostApiTypeIdToHostApiIndex( paMME ) )->defaultOutputDevice
I hope I have made the query clear enough. Will be happy to provide more details if required.
Thanks.
I finally figured out the mistake :-)
The configuration for multiple devices must be passed as an array. In the above case, wmmeDeviceAndNumChannels must be an array of 4, with each element's device field holding the device index of one of the 4 stereo devices and each channelCount remaining 2. outputParameters.channelCount still has to be the aggregate number of channels, i.e. 8. With this I was able to run the application with a single stream and, of course, without any errors related to an invalid device or invalid number of channels. :-)
Thanks.
Based on the code pasted above, it looks like you are trying to call open on a single 8-channel device. Instead you will have to get the Pa index of all four devices and call open 4 times. Once for each stereo device. You will then have 4 interleaved stereo streams to maintain. My guess is that changing channelCount = 8 to channelCount = 2 will allow the first stream to open.

Resources