I have an app that uses the MediaPlayer framework. I can read the playback state, and stop, play, skip, reverse, etc.
How can I update the BPM, the comments, and similar metadata?
I have tried a direct assignment on MPMediaItemPropertyBeatsPerMinute and I get the error:
Read-only variable is not assignable
I currently have a UITextField for collecting user input, and a UIButton to save the data.
What methods can be implemented to save/push the data in the UITextField to the track's metadata?
Any leads would be great!
No, this is not possible without jailbreaking. An app cannot write to the media library.
I am building an app where users can record videos, but recording certain events causes the app to be closed, and when the app is opened again all the data is lost. TikTok uses a similar approach: they save the data temporarily on device storage and show an alert when the app is opened again. I can't find a solution for what they are using to save data temporarily on the device.
This is how they are doing it - tiktok method
The question isn't clear to me, but I assume you want to record something and get it back when the application is relaunched.
The best way to do this:
First, continue recording until the app goes to the background.
Second, in applicationDidEnterBackground:
a) Save the file under a unique name in the documents directory or the temporary directory (the temporary directory is the better place for this), then save the file name to UserDefaults.
or
b) Instead of saving a file, save the Data itself to UserDefaults.
Third, in applicationDidBecomeActive:
a) For 2a, if a file name is present, fetch the file from the directory.
b) For 2b, if data is available in UserDefaults, show an alert that there is something to restore.
There are thousands of answers on Stack Overflow for each of those steps.
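A minimal Swift sketch of option (a), assuming the in-progress recording already lives at a known file URL; the function and UserDefaults key names here are hypothetical:

import Foundation

// Hypothetical key used to remember the stashed recording between launches.
let pendingRecordingKey = "pendingRecordingFileName"

// Call from applicationDidEnterBackground: move the in-progress recording
// into the temporary directory under a unique name and remember that name.
func stashRecording(at recordingURL: URL) throws {
    let fileName = UUID().uuidString + "." + recordingURL.pathExtension
    let destination = FileManager.default.temporaryDirectory.appendingPathComponent(fileName)
    try FileManager.default.moveItem(at: recordingURL, to: destination)
    UserDefaults.standard.set(fileName, forKey: pendingRecordingKey)
}

// Call from applicationDidBecomeActive: if a stashed recording exists,
// return its URL so the app can show a "restore your draft?" alert.
func stashedRecordingURL() -> URL? {
    guard let fileName = UserDefaults.standard.string(forKey: pendingRecordingKey) else { return nil }
    let url = FileManager.default.temporaryDirectory.appendingPathComponent(fileName)
    return FileManager.default.fileExists(atPath: url.path) ? url : nil
}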
I have an app that already uses CloudKit to pass information between the iPhone and Apple Watch app. I'd like to add the ability to let users share their data saved in iCloud with other users. I haven't found any great resources so far. My understanding is that Apple recommends setting up the share logic using UICloudSharingController, but this would need to be wrapped in a UIHostingController for SwiftUI. I found this answer, but the code posted uses a few custom properties, such as ObjectToShare, and I can't follow how they were set up. Do I have to use UICloudSharingController? Can I just have one user get their own shareURL and text it to another user? If so, how would user 1 get their own shareURL?
You don't have to use UICloudSharingController, no. You can set up your own CKShare object manually. It has a url property that is what a user needs to click on in order to gain access to the CKRecord you are sharing. So you can display the url in a custom way or pass it around however you like.
The CKShare docs show all the available properties:
https://developer.apple.com/documentation/cloudkit/ckshare
This article is a little old, but it helped me see the overall process of setting up a CKShare: https://kwylez.medium.com/cloudkit-sharing-series-creating-the-ckshare-40e420b94ee8
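As a rough Swift sketch of the manual route (assuming the root CKRecord already lives in a custom zone of the private database, which CloudKit sharing requires; the title value and function name here are illustrative):

import CloudKit

// Creates a CKShare for an existing root record and returns the share URL
// that user 1 can send to user 2 by any means (Messages, email, etc.).
func createShare(for rootRecord: CKRecord,
                 in database: CKDatabase,
                 completion: @escaping (URL?) -> Void) {
    let share = CKShare(rootRecord: rootRecord)
    share[CKShare.SystemFieldKey.title] = "My shared data" as CKRecordValue

    // The root record and its share must be saved in the same operation.
    let operation = CKModifyRecordsOperation(recordsToSave: [rootRecord, share],
                                             recordIDsToDelete: nil)
    operation.modifyRecordsCompletionBlock = { savedRecords, _, error in
        guard error == nil else { completion(nil); return }
        // After saving, the CKShare's url property is populated.
        let savedShare = savedRecords?.compactMap { $0 as? CKShare }.first
        completion(savedShare?.url)
    }
    database.add(operation)
}

The recipient opens that URL to accept the share; once accepted, the shared record appears in their shared CloudKit database.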
I'm working on adding the ability to use the share button to allow users to save files for use with my main application. Multiple file types (image, video, audio, pdf, etc) need to be supported.
A general use case would be:
The user takes a picture with the standard Camera app or audio recording using the Voice Memos app.
User clicks the Share button and selects my extension from the share list.
A dialog opens up giving the user the opportunity to give a description for the file.
The file is saved somewhere my main app (the containing app) can later access and process it.
I've been able to get to the point where I am prompted to share the file, but I have not been able to find a location to successfully save to that my main app can later read from. Is this even possible? Is there a better way to handle this scenario?
I am currently doing this using Xamarin so debugging is not supported (and logging is minimal). If someone has an answer in Objective C, that would at least help point me in the right direction.
There are a few things that you need to do.
First, your app and your app extension should belong to the same app group:
https://developer.apple.com/library/prerelease/ios/documentation/Miscellaneous/Reference/EntitlementKeyReference/Chapters/EnablingAppSandbox.html#//apple_ref/doc/uid/TP40011195-CH4-SW19
Then you can access the shared storage with something like this:
var groupUrl = NSFileManager.DefaultManager.GetContainerUrl ("stackoverflow.com.mygroup"); // the App Group identifier
Now you can access files in the directory pointed to by groupUrl.
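For reference, here is a rough native Swift sketch of the same idea (the Xamarin binding mirrors this API); the group identifier, folder name, and function names are hypothetical:

import Foundation

// Hypothetical App Group identifier; both the app and the extension must
// list it under the App Groups entitlement.
let groupID = "group.com.example.myapp"

// In the share extension: write the incoming file into the shared container.
func saveSharedFile(data: Data, name: String) throws -> URL {
    guard let containerURL = FileManager.default
        .containerURL(forSecurityApplicationGroupIdentifier: groupID) else {
        throw CocoaError(.fileNoSuchFile)
    }
    let inbox = containerURL.appendingPathComponent("Inbox", isDirectory: true)
    try FileManager.default.createDirectory(at: inbox, withIntermediateDirectories: true)
    let fileURL = inbox.appendingPathComponent(name)
    try data.write(to: fileURL)
    return fileURL
}

// In the containing app: list whatever the extension dropped off.
func pendingSharedFiles() -> [URL] {
    guard let containerURL = FileManager.default
        .containerURL(forSecurityApplicationGroupIdentifier: groupID) else { return [] }
    let inbox = containerURL.appendingPathComponent("Inbox", isDirectory: true)
    return (try? FileManager.default.contentsOfDirectory(at: inbox,
                                                         includingPropertiesForKeys: nil)) ?? []
}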
Xamarin's guide to creating Share extensions for iOS: http://developer.xamarin.com/guides/ios/platform_features/introduction_to_extensions/
I'm wondering if it's at all possible to access features in the main app using a Today extension in iOS 8. For example, can a media player send commands from its extension to the main app without opening the app (I know there is a framework for this, but it's just an example)? The only solution I can see is using URIs, but the problem is that this opens the app, which isn't the behavior I'm looking for.
So for a complete example using the media player:
There is an extension in the today screen that allows you to play/pause using a button.
The user presses the button and the app plays/pauses in the background without leaving the notification center.
Any way to achieve this behavior?
Code can be shared using an embedded framework; however, there are some backwards-compatibility issues.
Data can be shared using shared user defaults.
Both are explained in the Apple docs.
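For the data-sharing part, a minimal Swift sketch using a shared UserDefaults suite (the group identifier and key names are hypothetical, and note this only shares data; it does not wake the main app by itself):

import Foundation

// Both the app and the Today extension must belong to this App Group.
let sharedDefaults = UserDefaults(suiteName: "group.com.example.player")

// Today extension: record which button the user tapped.
func extensionDidTapPlayPause() {
    sharedDefaults?.set("togglePlayPause", forKey: "pendingPlayerCommand")
}

// Main app: pick up the pending command, e.g. in applicationDidBecomeActive.
func consumePendingCommand() -> String? {
    guard let command = sharedDefaults?.string(forKey: "pendingPlayerCommand") else { return nil }
    sharedDefaults?.removeObject(forKey: "pendingPlayerCommand")
    return command
}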
Flock is a reasonably new iOS app from the guys at Bump, which has an interesting feature. It somehow knows when photos have been taken by another app, notifies the user (in a notification center way), and asks the user to share them into an album. There are other interesting features of course, but I'm particularly interested in this feature for another app that I'm working on.
I can't see how the API facilitates this directly. I looked carefully through the notifications API documentation; apps can certainly register to show a notification to the user at a future date/time, and thus be opened by the user at that time, but I couldn't find any system notification for when a photo has been taken. The notifications API also allows server-generated notifications, but once again, I don't know how Flock's server side could know when the user has taken a photo in a different application.
I installed the app a couple of days back, and I only seem to get the notification when I have taken photos. It doesn't appear to be just a daily reminder.
Any ideas how Flock (and potentially other apps) manage to do this?
After a couple of days of observing the app's behaviour (and a lot of Googling and bumping into arcane discussions about file locking), I have a strong suspicion of how it works: Flock has simply registered for Significant-Change Location Service, which wakes the app and provides a small processing window when the user changes location. The documentation says:
At wake-up time, your app is put into the background and given a small amount of time to process the location data
I suspect that Flock is checking the image library at that point, and triggering a local notification if photos have been added. This squares with my experience that Flock gives me a local notification ~10 minutes after I leave home... which, in case any of the Bump/Flock devs are reading this, is just about the worst time for me to sort through my photos and share them in an album (perhaps I should use public transport more often).
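A rough Swift sketch of that suspected approach (the class name and the photo-check helper are hypothetical; the wake-up mechanism is the documented Significant-Change Location Service):

import CoreLocation

// Starts significant-change monitoring; iOS briefly wakes (or relaunches)
// the app when the device moves a significant distance, giving it a short
// background window to inspect the photo library and fire a local notification.
final class PhotoWakeMonitor: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()

    func start() {
        locationManager.delegate = self
        locationManager.requestAlwaysAuthorization()
        locationManager.startMonitoringSignificantLocationChanges()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        // We are now inside the short processing window described in the docs.
        checkForNewPhotosAndNotifyIfNeeded()
    }

    private func checkForNewPhotosAndNotifyIfNeeded() {
        // Hypothetical: compare the library's newest asset date against a value
        // stored at the last check, and post a local notification if it changed.
    }
}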
There are some other interesting SO answers here, here and here... but for the most part they discuss local notifications (which can only be scheduled for a particular time, and will always alert the user at that time, so aren't really background tasks) or the 600-second background processing window afforded to apps that have been shut down by the user (which is certainly a background task, but is clearly not fit for the purpose of running a background task once a day or some such).
The Bump devs have also provided some clues to the underlying architecture of the app here.
I haven't used this Framework personally, but I did stumble across this documentation when doing research for a client.
Apple Photos Framework Reference
Under Features & Concepts there is an entry for "Change Observing", which states:
Use the shared PHPhotoLibrary object to register a change handler for the photo entities you fetch. Photos tells your app whenever another app or device changes the content or metadata of an asset or the list of assets in a collection. PHChange objects provide information about object state before and after each change with semantics that make it easy to update a collection view or similar interface.
It appears that you can use the PHPhotoLibrary singleton's registerChangeObserver: to register a class of yours (one that adopts the PHPhotoLibraryChangeObserver protocol) and receive PHChange objects via the photoLibraryDidChange: method.
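A minimal Swift sketch of that registration (assuming photo library access has already been authorized; the class name is hypothetical):

import Photos

// Watches the photo library and reports newly inserted image assets.
final class LibraryObserver: NSObject, PHPhotoLibraryChangeObserver {
    private var fetchResult: PHFetchResult<PHAsset>

    override init() {
        // Fetch the assets we want to watch, newest first.
        let options = PHFetchOptions()
        options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
        fetchResult = PHAsset.fetchAssets(with: .image, options: options)
        super.init()
        PHPhotoLibrary.shared().register(self)
    }

    deinit {
        PHPhotoLibrary.shared().unregisterChangeObserver(self)
    }

    // Called (on a background queue) whenever another app or device changes the library.
    func photoLibraryDidChange(_ changeInstance: PHChange) {
        guard let changes = changeInstance.changeDetails(for: fetchResult) else { return }
        fetchResult = changes.fetchResultAfterChanges
        for asset in changes.insertedObjects {
            print("New photo added: \(asset.localIdentifier)")
        }
    }
}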