Dynamically changing app logo - iOS

This question is regarding iOS 10.3’s new feature that gives users the ability to customize the app logo they see on the home screen. (Check out the MLB At Bat app for reference, where they let the user pick which icon will be the app logo: http://m.mlb.com/apps/atbat)
According to my research, we need to submit all the possible logo options for Apple review. The user can then customize the logo using any of these options. Now, in my specific use case, I might not always want all the logo options to be available to all users. I need help figuring out how to control which logos are shown to which users.
E.g., if we have 10 images, for user A we may want to show only Image 1 and Image 2 to pick from, and for user B only Image 3 and Image 4 to pick from as their app logo. Is this possible? Thanks so much in advance!

You can control which icon is set using the setAlternateIconName(_:completionHandler:) method on UIApplication.
Example usage:
UIApplication.shared.setAlternateIconName("myImage", completionHandler: { error in
    // error is non-nil if, e.g., the name isn't declared as an alternate icon
    print("completed")
})
You can call this whenever it is appropriate in your application. So if you only want to show a few options, you can do so with your own views, and only call this method when needed, as in the sketch below.
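For illustration, here is a minimal sketch of per-user filtering. The icon names and the allowedIcons mapping are invented for this example, and every icon name must also be declared under CFBundleAlternateIcons in the app's Info.plist:
import UIKit

// Hypothetical per-user whitelist of alternate icon names.
let allowedIcons: [String: [String]] = [
    "userA": ["Image1", "Image2"],
    "userB": ["Image3", "Image4"]
]

func setIcon(_ name: String, forUser user: String) {
    // Ignore requests for icons this user is not allowed to pick.
    guard allowedIcons[user]?.contains(name) == true else { return }
    UIApplication.shared.setAlternateIconName(name) { error in
        if let error = error {
            print("icon change failed: \(error)")
        }
    }
}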
More information in the documentation here: https://developer.apple.com/reference/uikit/uiapplication/2806818-setalternateiconname
Here is another SO answer with helpful code and images: https://stackoverflow.com/a/41951096/6658553

Add all the icons to your application (each one must also be declared as an alternate icon in Info.plist); with the help of the method below you can manage which user sees which icons.
if ([user isEqualToString:@"user1"]) {
    [UIApplication.sharedApplication setAlternateIconName:@"icon1" completionHandler:^(NSError * _Nullable error) {
        if (error) {
            NSLog(@"---> error - %@", error.description);
        } else {
            NSLog(@"---> icon1");
        }
    }];
}
If you want to give the user an option to choose from the icons, you can write the code accordingly so that he/she can only choose from the filtered ones.


How to do simple A/B testing in iOS

I am looking to split my user base into 10 groups and show each group a different UI to see how they feel about it.
So each user group will always see a single type of UI.
I.e., let's say I have 10k users: when I roll out my next release, 1,000 users will see one UI, another 1,000 users will see another UI, and so on across all 10k users.
I know this can be done with the help of an A/B testing framework.
Basically I want to call one API at the launch of the app that returns a value between 1 and 10. I can then store it in my keychain, and the next time the app is launched I will check whether it's already in the keychain and skip the API call if it is.
So basically the API will know how many requests have come in, and it'll divide the users and send the right values back.
Based on the value in the keychain I will show a different UI; the A/B testing framework's job here would be giving me the value from 1 to 10, the API part.
There are so many A/B testing frameworks available online, but I couldn't find any framework that suits my needs.
Any help is appreciated!
The best approach would be to split the users into groups in the database and let the login API (or some other API) return a flag indicating which group each user belongs to, so you can show the UI accordingly.
But if that's not possible:
Then the simplest approach would be generating a random number between 1 and 10, keeping it in the keychain, and showing a particular UI for it. The next time you launch the app, look for the value in the keychain; if it's not there, create a new random value and store it. This way you will always show the same UI for that user.
This splitting approach is not 100% accurate, but I would say it's close enough.
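Here is a minimal Swift sketch of that idea; the service and account names are placeholders. (A nice side effect of the keychain over NSUserDefaults is that the value survives reinstalls, keeping the assignment stable.)
import Foundation
import Security

// Return this install's bucket (1-10), creating and persisting it on first call.
func abTestBucket() -> Int {
    let query: [String: Any] = [
        kSecClass as String: kSecClassGenericPassword,
        kSecAttrService as String: "com.example.abtest", // placeholder service name
        kSecAttrAccount as String: "bucket",
        kSecReturnData as String: true
    ]
    var result: AnyObject?
    if SecItemCopyMatching(query as CFDictionary, &result) == errSecSuccess,
       let data = result as? Data,
       let stored = Int(String(decoding: data, as: UTF8.self)) {
        return stored // reuse the bucket assigned on first launch
    }
    let bucket = Int(arc4random_uniform(10)) + 1 // uniform over 1...10
    var attributes = query
    attributes.removeValue(forKey: kSecReturnData as String)
    attributes[kSecValueData as String] = Data("\(bucket)".utf8)
    SecItemAdd(attributes as CFDictionary, nil) // errors ignored in this sketch
    return bucket
}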
arc4random_uniform
- (NSInteger)randomNumberBetween:(NSInteger)min maxNumber:(NSInteger)max
{
    return min + arc4random_uniform((uint32_t)(max - min + 1));
}
If you take a sample of these random numbers 10,000 times, you can see each number coming up 900-1,000 times, which is 9-10% and close enough:
for (int i = 0; i < 10000; i++) {
    NSLog(@"random:%ld", (long)[self randomNumberBetween:1 maxNumber:10]);
}
Seconds of Current time
You can take the seconds of the current date and time: if the second falls in 1-6, save value 1 in the keychain; for 7-12, save value 2; and so on, up to 54-60, for which you save value 10.
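In Swift that split might look like this (the 6-second buckets follow the description above):
import Foundation

// Map the current second (0-59) onto buckets 1-10 in 6-second ranges.
let second = Calendar.current.component(.second, from: Date())
let bucket = second / 6 + 1 // 0-5 -> 1, 6-11 -> 2, ..., 54-59 -> 10
print("Assigned variant \(bucket)")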
Others
You can consider splitting the users based on geography, country, or timezone; doing this also has its own pitfalls.
Like this you can devise your own strategy to split the users.
But if none of the suggestions above fits your criteria, the best approach would be to look for third-party A/B testing frameworks, though if this is going to be implemented at enterprise scale they might charge some money for it.
If I come across any such framework that provides this particular functionality as you asked, I will update it here.
I would like to attribute the credit for this answer to this post, as it pointed out Firebase Remote Config and A/B testing.
As the questioner asked, I will explain the steps involved in achieving it.
Configuration on server
1. Visit https://console.firebase.google.com/ and sign in with your Google account.
2. Choose Create project and click iOS.
3. Key in the app ID and a nickname and click Register app.
4. It'll show a link to download GoogleService-Info.plist; download it and drag & drop it into the project.
5. Choose Next.
6. It'll show "Run your app to verify installation"; you can choose to skip this step.
7. Choose Remote Config from the landing page.
8. Choose Add variable and enter a variable name of your choice (I entered ABTestVariationType), leave the value empty, and choose Publish changes.
9. Choose A/B testing from the sidebar, click Create Experiment, then choose Remote Config.
10. In the pop-up that appears, enter a name of your choice (I entered "A/B test POC") and some description about it; that's optional anyway.
11. Under target users choose your app ID, for the percentage of target users choose 100%, and click Next; it'll show the variants section.
12. In the variants section there will be a general category named Control group, loaded with 50% by default, and a variant box filled in with 50% plus an empty name box; you can enter any name there, but I would enter "variant 2". Now click Add a parameter 8 times, so each variant has 10%, and name all the variants; I would name them variant 3, variant 4, and so on up to variant 10.
13. In the same variants section click Add Parameter from Remote Config.
14. Now you can see a box appearing beside each variation parameter, where you can enter a unique value to identify each flavour. I would enter value 1 for the first variant, 2 for the second, and so on, finishing with value 10 for the last variant; then click Next.
15. The goal section appears; you can choose any goal, but I would choose Retention (15+ days). Click Review, click Start experiment, and in the prompt that appears choose Start again.
Integrating in the app
1. Add the following pods to your project:
pod 'Firebase/Core'
pod 'Firebase/RemoteConfig'
2. Drag and drop the GoogleService-Info.plist that was downloaded during the server configuration.
3. Initialise Firebase with the following boilerplate code:
@import Firebase;

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    [FIRApp configure];
    return YES;
}
4. Add the RcValues class, which is more boilerplate code, to your project:
#import "RcValues.h"
#import Firebase;
#implementation RcValues
+(RcValues *)sharedInstance
{
static RcValues *sharedInstance = nil;
static dispatch_once_t onceToken;
dispatch_once(&onceToken, ^{
sharedInstance = [[RcValues alloc] init];
});
return sharedInstance;
}
-(id)init{
self=[super init];
if(self)
{
[self AcivateDebugMode];
[self LoadDefaultValues];
[self FetchCloudValues];
}
return self;
}
-(void)LoadDefaultValues
{
[FIRRemoteConfig.remoteConfigsetDefaults:
#{#"appPrimaryColor":#"#FBB03B"}];
}
-(void)FetchCloudValues
{
NSTimeInterval fetchInterval=0;
[FIRRemoteConfig.remoteConfigfetchWithExpirationDuration:
fetchInterval completionHandler:^(FIRRemoteConfigFetchStatus
status, NSError *_Nullable error)
{
NSLog(#"error:%#",error);
[FIRRemoteConfig.remoteConfig activateFetched];
}];
}
-(void)AcivateDebugMode{ //
FIRRemoteConfig.remoteConfig.configSettings=debugSettings;
FIRRemoteConfigSettings *config = [[FIRRemoteConfigSettings alloc] initWithDeveloperModeEnabled:YES];
FIRRemoteConfig.remoteConfig.configSettings=config;
}
#end
5. Invoke the class in the app delegate's application:didFinishLaunchingWithOptions: method:
RcValues *rcValues = [RcValues sharedInstance];
This will download the key values for A/B testing.
6. Use the code below to get the A/B testing key from Firebase in your app:
self.flavourNumber.text = [FIRRemoteConfig.remoteConfig configValueForKey:@"ABTestVariationType"].stringValue;
Based on the key value you can show a different UI as you wish.
Firebase will take care of sending the right value, and you don't have to worry about dividing the users into groups yourself.
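As a sketch, the branching could be as simple as the following; the helper bodies are placeholders, and variant is the string fetched in step 6:
// Route to a UI based on the fetched variant value ("1" ... "10").
func showUI(forVariant variant: String) {
    switch variant {
    case "1":
        print("showing variant 1 UI") // e.g. load the storyboard for variant 1
    case "2":
        print("showing variant 2 UI")
    default:
        print("showing default UI") // control group or unexpected value
    }
}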
P.S
Please follow the tutorials below for more detailed info; this is just a summary. When I have free time I will try to summarise further or add more pictures to make it easier to understand, and if possible I will add a sample project on GitHub and link it here.
firebase-tutorial-ios-ab-testing
firebase-remote-config-tutorial-for-ios
Imagine changing fonts, colours, or some values in your iOS app without submitting a new build. It's pretty easy using Remote Config. These tutorials will teach you A/B testing, but before A/B testing I would recommend you have a look around Remote Config.

Not able to update profile pic in MFSideMenu

I am trying to update my profile picture in the side menu (MFSideMenu): as soon as I update the profile picture on the web server, I want to set it in my iOS app. The URL of the image stays the same, so I am unable to update it. I guess the problem is that every time I change the image, the unchanged image URL is loaded from the cache, so the picture doesn't change. Is there any way to download the changed image whenever it changes, without needing to change the image URL?
In the response I always get
imageUrl: "user_image" = "profile/local/images/user_id/myimage";
Accordingly, I update my model and fetch the information from the model when I open the side menu.
func menuAction() {
    self.getUserDetails()
}

func getUserDetails() {
    NotificationCenter.default.post(name: Notification.Name("NotificationIdentifier"), object: self)
    self.menuContainerViewController.toggleLeftSideMenuCompletion(nil)
}
This may be a quick fix, but it is not the best way.
Add a different query parameter to the URL each time you go to set (download) the image.
For example:
If your image URL is http://www.yourserver.com/images/my_image.png, then change it to http://www.yourserver.com/images/my_image.png?any_name=timestamp
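A small Swift sketch of that idea, using the placeholder URL from the example:
import Foundation

// Append a timestamp query parameter so each request bypasses the cached copy.
var components = URLComponents(string: "http://www.yourserver.com/images/my_image.png")!
components.queryItems = [
    URLQueryItem(name: "t", value: String(Int(Date().timeIntervalSince1970)))
]
let freshURL = components.url! // .../my_image.png?t=1700000000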

Export audio files via “open in:” from Voice Memos App

I have the exact same issue as "Paul" posted here: Can not export audiofiles via "open in:" from Voice Memos App - no answers have yet been posted on this topic.
Essentially what I'm trying to do is simple:
After having recorded a Voice Memo on iOS, I select "Open With" and from the popup that is shown I want to be able to select my app.
I've tried everything I can think of and experimented with LSItemContentTypes without success.
Unfortunately I don't have enough reputation to comment on the existing post above, and I'm getting quite desperate for a solution to this. Any help is hugely appreciated, even just to know whether it's doable or not.
Thanks!
After some experimentation and much guidance from this blog post ( http://www.theappguruz.com/blog/share-extension-in-ios-8 ), it appears that it is possible to do this using a combination of app extensions (specifically an Action Extension) and app groups. I'll describe the first part which will enable you to get your recording from Voice Memos to your app extension. The second part -- getting the recording from the app extension to the containing app (your "main" app) -- can be done using app groups; please consult the blog post above for how to do this.
Create a new target within your project for the app extension, by selecting File > New > Target... from Xcode's menu. In the dialog box that prompts you to "Choose a template for your new target:" choose the "Action Extension" and click "Next".
CAUTION: Do not choose the "Share Extension" as is done in the blog post example above. That approach is more appropriate for sharing with another user or posting to a website.
Fill in the "Product Name:" for your Action Extension, e.g., MyActionExtension. Also, for "Action Type:" I selected "Presents User Interface" because this is the way Dropbox appears to do it. Selecting this option adds a view controller (ActionViewController) and storyboard (Maininterface.storyboard) to your app extension. The view controller is a good place to provide feedback to the user and to give the user an opportunity to rename the audio file before exporting it to your app.
Click "Finish." You will be prompted to "Activate “MyActionExtension” scheme?". Click "Activate" and this new scheme will be made active. Building it will build both the action extension and the containing app.
Click the disclosure triangle for the "MyActionExtension" folder in the Project Navigator (Cmd-0) to reveal the newly-created storyboard, ActionViewController source file(s), and Info.plist. You will need to customize these files for your needs. But for now ...
Build and run the scheme you just created. You will be prompted to "Choose an app to run:". Select "Voice Memos" from the list and click "Run". (You will probably need a physical device for this; I don't think the simulator has Voice Memos on it.) This will build and deploy your action extension (and its containing app) to your device. and then proceed to launch "Voice Memos" on your device. If you now make a recording with "Voice Memos" and then attempt to share it, you should see your action extension (with a blank icon) in the bottom row. If you don't see it there, tap on the "More" button in that row and set the switch for your action extension to "On". Tapping on your action extension will just bring up an empty view with a "Done" button. The template code looks for an image file, and finding none does nothing. We'll fix this in the next step.
Edit ActionViewController.swift to make the following changes:
6a. Add import statements for AVFoundation, AVKit, and MobileCoreServices near the top of the file:
// the next two imports are only necessary because (for our sample code)
// we have chosen to present and play the audio in our app extension.
// if all we are going to be doing is handing the audio file off to the
// containing app (the usual scenario), we won't need these two frameworks
// in our app extension.
import AVFoundation
import AVKit
import MobileCoreServices // for the kUTType identifiers used below
6b. Replace the entirety of override func viewDidLoad() {...} with the following:
override func viewDidLoad() {
super.viewDidLoad()
// Get the item[s] we're handling from the extension context.
// For example, look for an image and place it into an image view.
// Replace this with something appropriate for the type[s] your extension supports.
print("self.extensionContext!.inputItems = (self.extensionContext!.inputItems)")
var audioFound :Bool = false
for inputItem: AnyObject in self.extensionContext!.inputItems {
let extensionItem = inputItem as! NSExtensionItem
for attachment: AnyObject in extensionItem.attachments! {
print("attachment = \(attachment)")
let itemProvider = attachment as! NSItemProvider
if itemProvider.hasItemConformingToTypeIdentifier(kUTTypeMPEG4Audio as String)
//|| itemProvider.hasItemConformingToTypeIdentifier(kUTTypeMP3 as String)
// the audio format(s) we expect to receive and that we can handle
{
itemProvider.loadItemForTypeIdentifier(kUTTypeMPEG4Audio as String,
options: nil, completionHandler: { (audioURL, error) in
NSOperationQueue.mainQueue().addOperationWithBlock {
if let audioURL = audioURL as? NSURL {
// in our sample code we just present and play the audio in our app extension
let theAVPlayer :AVPlayer = AVPlayer(URL: audioURL)
let theAVPlayerViewController :AVPlayerViewController = AVPlayerViewController()
theAVPlayerViewController.player = theAVPlayer
self.presentViewController(theAVPlayerViewController, animated: true) {
theAVPlayerViewController.player!.play()
}
}
}
})
audioFound = true
break
}
}
if (audioFound) {
break // we only handle one audio recording at a time, so stop looking for more
}
}
}
6c. Build and run as in the previous step. This time, tapping on your action extension will bring up the same view controller as before but now overlaid with the AVPlayerViewController instance containing and playing your audio recording. Also, the two print() statements I've inserted in the code should give output that looks something like the following:
self.extensionContext!.inputItems = [<NSExtensionItem: 0x127d54790> - userInfo: {
NSExtensionItemAttachmentsKey = (
"<NSItemProvider: 0x127d533c0> {types = (\n \"public.file-url\",\n \"com.apple.m4a-audio\"\n)}"
);
}]
attachment = <NSItemProvider: 0x127d533c0> {types = (
"public.file-url",
"com.apple.m4a-audio"
)}
Make the following changes to the action extension's Info.plist file:
7a. The Bundle display name defaults to whatever name you gave your action extension (MyActionExtension in this example). You might wish to change this to Save to MyApp. (By way of comparison, Dropbox uses Save to Dropbox.)
7b. Insert a line for the key CFBundleIconFile and set it to Type String (2nd column), and set its value to MyActionIcon or some such. You will then need to provide the corresponding 5 icon files. In our example, these would be: MyActionIcon.png, MyActionIcon@2x.png, MyActionIcon@3x.png, MyActionIcon~ipad.png, and MyActionIcon@2x~ipad.png. (These icons should be 60x60 points for iPhone and 76x76 points for iPad. Only the alpha channel is used to determine which pixels are gray; the RGB channels are ignored.) Add these icon files to your app extension's bundle, NOT the containing app's bundle.
7c. At some point you will need to set the value for the key NSExtension > NSExtensionAttributes > NSExtensionActivationRule to something other than TRUEPREDICATE. If you want your action extension to only be activated for audio files, and not for video files, pdf files, etc., this is where you would specify such a predicate.
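For example, a predicate along the following lines (adapted from the style of Apple's activation-rule examples; treat it as a starting point, not a tested rule) would activate the extension only for a single audio attachment:
SUBQUERY(extensionItems, $extensionItem,
    SUBQUERY($extensionItem.attachments, $attachment,
        ANY $attachment.registeredTypeIdentifiers UTI-CONFORMS-TO "public.audio"
    ).@count == $extensionItem.attachments.@count
).@count == 1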
The above takes care of getting the audio recording from Voice Memos to your app extension. Below is an outline of how to get the audio recording from the app extension to the containing app. (I'll flesh it out later, time permitting.) This blog post ( http://www.theappguruz.com/blog/ios8-app-groups ) might also be useful.
Set up your app to use App Groups. Open the Project Navigator (Cmd-0) and click on the first line to show your project and targets. Select the target for your app, click on the "Capabilities" tab, look for the App Groups capability, and set its switch to "On". Once the various entitlements have been added, click on the "+" sign to add your App Group, giving it a name like group.com.mycompany.myapp.sharedcontainer. (It must begin with group. and should probably use some form of reverse-DNS naming.)
Repeat the above for your app extension's target, giving it the same name as above (group.com.mycompany.myapp.sharedcontainer).
Now you can write the url of the audio recording to the app group's shared container from the app extension side. In ActionViewController.swift, replace the code fragment that instantiates and presents the AVPlayerViewController with the following:
let sharedContainerDefaults = NSUserDefaults.init(suiteName:
"group.com.mycompany.myapp.sharedcontainer") // must match the name chosen above
sharedContainerDefaults?.setURL(audioURL, forKey: "SharedAudioURLKey")
sharedContainerDefaults?.synchronize()
Similarly, you can read the url of the audio recording from the containing app's side using something like this:
let sharedContainerDefaults = NSUserDefaults.init(suiteName:
"group.com.mycompany.myapp.sharedcontainer") // must match the name chosen above
let audioURL :NSURL? = sharedContainerDefaults?.URLForKey("SharedAudioURLKey")
From here, you can copy the audio file into your app's sandbox, e.g., your app's Documents directory or your app's NSTemporaryDirectory(). Read this blog post ( http://www.atomicbird.com/blog/sharing-with-app-extensions ) for ideas on how to do this in a coordinated fashion using NSFileCoordinator.
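As a rough sketch (written with modern Swift naming; the group and key names are the ones chosen above), the hand-off might look like:
import Foundation

// Read the URL the extension wrote to the shared container and copy the
// file into the containing app's Documents directory.
let defaults = UserDefaults(suiteName: "group.com.mycompany.myapp.sharedcontainer")
if let audioURL = defaults?.url(forKey: "SharedAudioURLKey") {
    let docs = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let destination = docs.appendingPathComponent(audioURL.lastPathComponent)
    try? FileManager.default.copyItem(at: audioURL, to: destination) // errors ignored here
}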
References:
Creating an App Extension
Sharing Data with Your Containing App

Disable confirmation on delete request in PHPhotoLibrary

What I am trying to do is save videos to PHPhotoLibrary and then remove them when the upload to the client's remote server completes (basically, the photo library serves as temporary storage to add an additional layer of security in case anything at all fails; I already save my videos in the application's directory).
Problem:
The problem is that for this to work, everything has to happen without input from the user. You can write a video to the photo library like this:
func storeVideoToLibraryForUpload(upload : SMUpload) {
if PHPhotoLibrary.authorizationStatus() != PHAuthorizationStatus.Authorized {
// Don't write to library since this is disallowed by user
return
}
PHPhotoLibrary.sharedPhotoLibrary().performChanges({ () -> Void in
// Write asset
let assetRequest = PHAssetChangeRequest.creationRequestForAssetFromVideoAtFileURL(NSURL(fileURLWithPath: upload.nonsecureFilePath!)!)
let assetPlaceholder = assetRequest.placeholderForCreatedAsset
let localIdentifier = assetPlaceholder.localIdentifier
// Store local identifier for later use
upload.localAssetIdentifier = localIdentifier
}, completionHandler: { (success, error) -> Void in
....
})
}
And that works flawlessly: I get the local identifier and store it for later use. Unicorns and rainbows.
Now, when I want to remove that video immediately after the upload finishes, I call the following:
func removeVideoFromLibraryForUpload(upload : SMUpload) {
// Only proceed if there is asset identifier (video previously stored)
if let assetIdentifier = upload.localAssetIdentifier {
// Find asset that we previously stored
let assets = PHAsset.fetchAssetsWithLocalIdentifiers([assetIdentifier], options: PHFetchOptions())
// Fetch asset, if found, delete it
if let fetchedAssets = assets.firstObject as? PHAsset {
PHPhotoLibrary.sharedPhotoLibrary().performChanges({ () -> Void in
// Delete asset
PHAssetChangeRequest.deleteAssets([fetchedAssets])
}, completionHandler: { (success, error) -> Void in
...
})
}
}
}
This successfully deletes the video, BUT the user has to confirm the deletion first. That is a problem, as the backup scheme won't work then.
I obviously know why there is a confirmation (so you don't clear the entire user library, for example), but the thing is, my app made the video, so I thought there would be a way around it: as the "owner" I shouldn't need to confirm, or should at least have the option to disable the confirmation.
Thanks in advance!
TL;DR: How can I disable the confirmation on a delete request if my application created that content? (I don't want to delete anything else.)
Note: somebody will probably say this is a rather strange thing to do, but the application is distributed internally and there is a good reason to do it like this (the video content is too valuable to lose: even if the user deletes the application for some reason, or anything at all goes wrong, we need to be able to preserve the videos), so please don't question that and just focus your attention on the question :)
I cannot see a way to avoid the delete confirmation. It is an implementation detail of the Photos framework, similar to the way you cannot prevent the device from asking the user's permission to use the microphone when your app tries to use it, and is a matter of security & trust. Once you have saved an asset to the device photo library your app is no longer the owner of that asset, so as you noted in your question the device must of course ensure the app has the user's permission before it goes about deleting such data.
You can never entirely safeguard your users' data against their own unpredictable behaviour - if they decide to remove your app, or delete a particular asset from within Photos, it is up to them. I think your best option is to either put up with the built-in delete confirmation, or to provide a guide to your users that makes it clear that they should be careful to protect this important data by backing up their device, and not deleting the app!
If you did decide to stick to this approach, perhaps the best thing you could do is to prepare the user for the fact that their device may ask them for confirmation to delete a file that is being uploaded to your own servers. For example, put up your own modal alert just before trying to delete the asset. I wouldn't normally suggest that kind of approach for a public shipping app, but since you're only distributing internally it may be acceptable for your team.
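Something along these lines, inside the view controller that drives the upload (the wording is a placeholder, and removeVideoFromLibraryForUpload is the questioner's own method from above):
let alert = UIAlertController(
    title: "Free up space?",
    message: "iOS will ask you to confirm removing the uploaded video from your photo library.",
    preferredStyle: .alert)
alert.addAction(UIAlertAction(title: "Continue", style: .default) { _ in
    self.removeVideoFromLibraryForUpload(upload) // the deletion method shown in the question
})
alert.addAction(UIAlertAction(title: "Keep", style: .cancel, handler: nil))
present(alert, animated: true, completion: nil)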

App launched with custom URL scheme. How do I return data to the calling app when done?

I am taking an Android programming course at my university, but I have been allowed by the teacher to do iOS; I just have to implement the same projects. This project is to have two apps. The first app is a color picker from a previous assignment. The second app calls the color picker, lets the user choose a color, and when done returns it to the second app to be displayed.
I have defined a custom URL scheme in my ColorPicker, which works fine. In my second app I have a changeColor button with the following IBAction method:
- (IBAction)colorChangePressed:(UIButton *)sender {
    UIApplication *app = [UIApplication sharedApplication];
    BOOL found = [app openURL:[NSURL URLWithString:@"colorPicker://"]];
    if (found) NSLog(@"Resource was found");
    else NSLog(@"unable to locate resource");
}
This indeed launches the color picker app, and it behaves as expected. My question is: after the color has been selected, how do I return to the calling app with the selected color? I will add a Finished button in my color picker to be clicked when the user is done selecting the color, and I will capture the values I need, but I can't figure out how to get this data back to the calling app. Is there some protocol/delegate pattern I need to implement?
The complete code is on GitHub at https://github.com/jnels124/CS390H
Thanks in advance for any insight as to how to solve my problem.
You need both apps to have unique schemes. Encode the scheme of app1 and pass it as part of the app1 -> app2 URL. When app2 is finished, build an app2 -> app1 URL and use it to open app1, sending it the required information (encoded).
It is similar to putting a String extra with the name of the app1 Intent into the app2 Intent on Android, but instead of an Intent you use a URL and parse it as needed.
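A hedged sketch of the round trip in Swift, using the iOS 10+ open(_:) API; both scheme names are invented for illustration:
import UIKit

// app1 -> app2: open the picker and tell it which scheme to call back on.
var launch = URLComponents(string: "colorPicker://pick")!
launch.queryItems = [URLQueryItem(name: "callback", value: "colorChanger")]
UIApplication.shared.open(launch.url!)

// app2 -> app1: when done, build a URL on the callback scheme carrying the result.
var callback = URLComponents(string: "colorChanger://didPick")!
callback.queryItems = [URLQueryItem(name: "color", value: "#FF8800")]
UIApplication.shared.open(callback.url!)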
I defined a custom scheme in the other project as stated in the first answer, but I was unsure how to generate the query string in the called URL and return it to the calling application to be parsed. This was resolved in the following post:
Syntax for passing NSArray to other application with custom URL Scheme
