BlackBerry: Duplicate application menu items in UiApplication

I have a BlackBerry UiApplication, which registers some menu items in the standard Phone and Contacts applications. I need the menu items to be registered on phone startup, i.e., before my UiApplication is started.
I can achieve this if I configure my UiApplication to auto-run on startup, and register the menu items in my app initialisation code using ApplicationMenuItemRepository.
My problem is that every time my UiApplication is subsequently opened, my initialisation code is run again, and I get duplicate menu items in the Phone and Contacts app. ApplicationMenuItemRepository does not provide an API to check if they are already registered. Using a static boolean in my own code also does not help, presumably because different classloaders are used for each app instance.
Am I using the wrong approach here? Should I have a separate Application (to register Phone/Contacts menu items) and UiApplication (for my views)? That feels overly complex for my needs.

Use the Alternate Entry Point
Click on the project node.
Right-click and select Properties.
In the Properties window, select the Application tab.
Ensure the following options are checked: Auto-run on startup and System module (to register the thread with the system).
Create another project under the same folder as the original project. Right-click on the new project node and select Properties.
Select the Application tab and select Alternate CLDC Application Entry Point from the Project type drop-down menu. Select the name of the original project (for example: trafficreporter) from the Alternate entry point for drop-down menu. Also specify the arguments that would launch the application using this alternate entry point (for example: gui).
Modify the main() method of the original project as follows:
public static void main(String[] args) {
    if (args != null && args.length > 0 && args[0].equals("gui")) {
        // code to initialize the app
        theApp.enterEventDispatcher();
    } else {
        // code to launch the background thread
    }
}
Add your application icon file to this new "Entry Point" application and make it the ribbon icon.

Use the removeMenuItem() method when the user exits your application. It will work.
if (_serverMenuItem != null) {
    ApplicationMenuItemRepository.getInstance().removeMenuItem(
        ApplicationMenuItemRepository.MENUITEM_PHONE, _serverMenuItem);
}

If you want to add custom menu items to native applications, you can use the RuntimeStore to register your menu item, and then check for it when your code re-runs:
// ApplicationMenuItem is abstract, so supply run() and toString() in an anonymous subclass
ApplicationMenuItem ami = new ApplicationMenuItem(placement) { // some placement you want to use, e.g. 0x35090
    public Object run(Object context) {
        // invoked when the user selects the menu item
        return null;
    }
    public String toString() {
        return "My Menu Item"; // the label shown in the native menu
    }
};
ApplicationMenuItemRepository amir = ApplicationMenuItemRepository.getInstance();
RuntimeStore store = RuntimeStore.getRuntimeStore(); // get the store instance

// only add the item to the menu if it has not been registered already
if (store.get(ApplicationMenuItemRepository.MENUITEM_CALENDAR) == null) {
    try {
        store.put(ApplicationMenuItemRepository.MENUITEM_CALENDAR, ami);
    } catch (IllegalArgumentException e) {
        // id already in use: another instance registered it first
    }
    amir.addMenuItem(ApplicationMenuItemRepository.MENUITEM_CALENDAR, ami);
}

Related

How to unfocus from a WebView in UWP

I'm working on a UWP app that hosts a WebView which runs in a separate process.
var webView = new Windows.UI.Xaml.Controls.WebView(WebViewExecutionMode.SeparateProcess);
This results in a behavior where, if the WebView has focus, the containing app can't regain it by simply trying to focus a UI element.
The app supports keyboard shortcuts which may result in different elements getting the focus, but this doesn't work correctly when the focus is captured by the WebView. The target element appears to receive focus, but it's as if the process itself is not activated (since the real focus resides in a different process, I suppose...).
I'm currently trying to activate the app programmatically through protocol registration in an attempt to regain focus.
I added a declaration in the app manifest for a custom protocol mycustomprotocol coupled with the following activation overload
protected override void OnActivated(IActivatedEventArgs args)
{
    // IActivatedEventArgs has no Uri property, so cast to the protocol-specific type first
    var protocolArgs = args as ProtocolActivatedEventArgs;
    if (protocolArgs != null && protocolArgs.Uri.Scheme == "mycustomprotocol")
    { }
}
And the following code to invoke the activation:
var result = await Windows.System.Launcher.LaunchUriAsync(new Uri("mycustomprotocol:"));
This seems to work only on some computers; on others (not while debugging the app, only when executed unattached), instead of regaining focus, the app's taskbar icon just flashes orange.
I've created a sample project showing the problem and the semi-working solution here.
Any insight on any of this would be great.
I can reproduce your issue. I found that when the focus is switched with the mouse, it transfers to the TextBlock correctly, so you could work around this by simulating mouse input.
Use the following code instead of FocusTarget.Focus(FocusState.Programmatic):
InputInjector inputInjector = InputInjector.TryCreate();

var infoDown = new InjectedInputMouseInfo();
// adjust the mouse position to land on the target element
// by changing infoDown.DeltaX and infoDown.DeltaY
infoDown.DeltaX = 10;   // change to suit your layout
infoDown.DeltaY = -150; // change to suit your layout
infoDown.MouseOptions = InjectedInputMouseOptions.LeftDown;

var infoUp = new InjectedInputMouseInfo();
infoUp.DeltaX = 0;
infoUp.DeltaY = 0;
infoUp.MouseOptions = InjectedInputMouseOptions.LeftUp;

inputInjector.InjectMouseInput(new[] { infoDown, infoUp });
Note: If you use the input injection APIs, you need to add the inputInjectionBrokered capability to your Package.appxmanifest.
This is a restricted capability, however: an app that declares it can't be published to the Store, as it won't pass certification.
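For reference, the manifest entry would look something like this (a sketch; the rescap namespace declaration is shown in case your manifest doesn't already include it):
<Package
  ...
  xmlns:rescap="http://schemas.microsoft.com/appx/manifest/foundation/windows10/restrictedcapabilities"
  IgnorableNamespaces="rescap">
  ...
  <Capabilities>
    <rescap:Capability Name="inputInjectionBrokered" />
  </Capabilities>
</Package>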
I've been in discussions with a WebView software engineer. The problem is that the separate process still wants to own focus if you try to move the focus away from the webview. His solution is to ask the other process' web engine to give up focus with the following call:
_ = webView.InvokeScriptAsync("eval", new string[] { "window.departFocus('up', { originLeft: 0, originTop: 0, originWidth: 0, originHeight: 0 });" });
You can call it before trying to change the focus to your target. I ran various tests and it works consistently.
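Putting the two steps together, a minimal sketch (FocusTarget stands in for whatever XAML element you want focused, as in the question):
// first ask the WebView's out-of-process engine to relinquish focus
_ = webView.InvokeScriptAsync("eval", new string[] { "window.departFocus('up', { originLeft: 0, originTop: 0, originWidth: 0, originHeight: 0 });" });
// then move focus to the XAML element as usual
FocusTarget.Focus(FocusState.Programmatic);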

Export audio files via “open in:” from Voice Memos App

I have the exact same issue as "Paul" posted here: Can not export audiofiles via "open in:" from Voice Memos App - no answers have yet been posted on this topic.
Essentially what I'm trying to do is simple:
After having recorded a Voice Memo on iOS, I select "Open With" and from the popup that is shown I want to be able to select my app.
I've tried everything I can think of and experimented with LSItemContentTypes without success.
Unfortunately I don't have enough reputation to comment on the existing post above, and I'm getting quite desperate for a solution to this. Any help is hugely appreciated, even just to know whether it's doable or not.
Thanks!
After some experimentation and much guidance from this blog post ( http://www.theappguruz.com/blog/share-extension-in-ios-8 ), it appears that it is possible to do this using a combination of app extensions (specifically an Action Extension) and app groups. I'll describe the first part which will enable you to get your recording from Voice Memos to your app extension. The second part -- getting the recording from the app extension to the containing app (your "main" app) -- can be done using app groups; please consult the blog post above for how to do this.
1. Create a new target within your project for the app extension, by selecting File > New > Target... from Xcode's menu. In the dialog box that prompts you to "Choose a template for your new target:" choose the "Action Extension" and click "Next".
CAUTION: Do not choose the "Share Extension" as is done in the blog post example above. That approach is more appropriate for sharing with another user or posting to a website.
2. Fill in the "Product Name:" for your Action Extension, e.g., MyActionExtension. Also, for "Action Type:" I selected "Presents User Interface" because this is the way Dropbox appears to do it. Selecting this option adds a view controller (ActionViewController) and storyboard (MainInterface.storyboard) to your app extension. The view controller is a good place to provide feedback to the user and to give the user an opportunity to rename the audio file before exporting it to your app.
Click "Finish." You will be prompted to "Activate “MyActionExtension” scheme?". Click "Activate" and this new scheme will be made active. Building it will build both the action extension and the containing app.
4. Click the disclosure triangle for the "MyActionExtension" folder in the Project Navigator (Cmd-0) to reveal the newly-created storyboard, ActionViewController source file(s), and Info.plist. You will need to customize these files for your needs. But for now ...
5. Build and run the scheme you just created. You will be prompted to "Choose an app to run:". Select "Voice Memos" from the list and click "Run". (You will probably need a physical device for this; I don't think the simulator has Voice Memos on it.) This will build and deploy your action extension (and its containing app) to your device, and then proceed to launch "Voice Memos" on your device. If you now make a recording with "Voice Memos" and then attempt to share it, you should see your action extension (with a blank icon) in the bottom row. If you don't see it there, tap on the "More" button in that row and set the switch for your action extension to "On". Tapping on your action extension will just bring up an empty view with a "Done" button. The template code looks for an image file, and finding none does nothing. We'll fix this in the next step.
6. Edit ActionViewController.swift to make the following changes:
6a. Add import statements for MobileCoreServices, AVFoundation, and AVKit near the top of the file:
// MobileCoreServices provides the kUTType... constants used in viewDidLoad() below.
// the next two imports are only necessary because (for our sample code)
// we have chosen to present and play the audio in our app extension.
// if all we are going to be doing is handing the audio file off to the
// containing app (the usual scenario), we won't need these two frameworks
// in our app extension.
import MobileCoreServices
import AVFoundation
import AVKit
6b. Replace the entirety of override func viewDidLoad() {...} with the following:
override func viewDidLoad() {
    super.viewDidLoad()

    // Get the item[s] we're handling from the extension context.
    // For example, look for an image and place it into an image view.
    // Replace this with something appropriate for the type[s] your extension supports.
    print("self.extensionContext!.inputItems = \(self.extensionContext!.inputItems)")

    var audioFound :Bool = false
    for inputItem: AnyObject in self.extensionContext!.inputItems {
        let extensionItem = inputItem as! NSExtensionItem
        for attachment: AnyObject in extensionItem.attachments! {
            print("attachment = \(attachment)")
            let itemProvider = attachment as! NSItemProvider
            if itemProvider.hasItemConformingToTypeIdentifier(kUTTypeMPEG4Audio as String)
            //|| itemProvider.hasItemConformingToTypeIdentifier(kUTTypeMP3 as String)
            // the audio format(s) we expect to receive and that we can handle
            {
                itemProvider.loadItemForTypeIdentifier(kUTTypeMPEG4Audio as String,
                    options: nil, completionHandler: { (audioURL, error) in
                        NSOperationQueue.mainQueue().addOperationWithBlock {
                            if let audioURL = audioURL as? NSURL {
                                // in our sample code we just present and play the audio in our app extension
                                let theAVPlayer :AVPlayer = AVPlayer(URL: audioURL)
                                let theAVPlayerViewController :AVPlayerViewController = AVPlayerViewController()
                                theAVPlayerViewController.player = theAVPlayer
                                self.presentViewController(theAVPlayerViewController, animated: true) {
                                    theAVPlayerViewController.player!.play()
                                }
                            }
                        }
                })
                audioFound = true
                break
            }
        }
        if (audioFound) {
            break // we only handle one audio recording at a time, so stop looking for more
        }
    }
}
6c. Build and run as in the previous step. This time, tapping on your action extension will bring up the same view controller as before but now overlaid with the AVPlayerViewController instance containing and playing your audio recording. Also, the two print() statements I've inserted in the code should give output that looks something like the following:
self.extensionContext!.inputItems = [<NSExtensionItem: 0x127d54790> - userInfo: {
NSExtensionItemAttachmentsKey = (
"<NSItemProvider: 0x127d533c0> {types = (\n \"public.file-url\",\n \"com.apple.m4a-audio\"\n)}"
);
}]
attachment = <NSItemProvider: 0x127d533c0> {types = (
"public.file-url",
"com.apple.m4a-audio"
)}
7. Make the following changes to the action extension's Info.plist file:
7a. The Bundle display name defaults to whatever name you gave your action extension (MyActionExtension in this example). You might wish to change this to Save to MyApp. (By way of comparison, Dropbox uses Save to Dropbox.)
7b. Insert a line for the key CFBundleIconFile, set its Type to String (2nd column), and set its value to MyActionIcon or some such. You will then need to provide the corresponding 5 icon files. In our example, these would be: MyActionIcon.png, MyActionIcon@2x.png, MyActionIcon@3x.png, MyActionIcon~ipad.png, and MyActionIcon@2x~ipad.png. (These icons should be 60x60 points for iPhone and 76x76 points for iPad. Only the alpha channel is used to determine which pixels are gray; the RGB channels are ignored.) Add these icon files to your app extension's bundle, NOT the containing app's bundle.
7c. At some point you will need to set the value for the key NSExtension > NSExtensionAttributes > NSExtensionActivationRule to something other than TRUEPREDICATE. If you want your action extension to only be activated for audio files, and not for video files, pdf files, etc., this is where you would specify such a predicate.
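For example, to activate the extension only for audio attachments, a predicate along these lines might work (an untested sketch modeled on Apple's documented SUBQUERY examples; adjust the UTI to the formats you handle):
SUBQUERY (
    extensionItems,
    $extensionItem,
    SUBQUERY (
        $extensionItem.attachments,
        $attachment,
        ANY $attachment.registeredTypeIdentifiers UTI-CONFORMS-TO "public.audio"
    ).@count == $extensionItem.attachments.@count
).@count == 1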
The above takes care of getting the audio recording from Voice Memos to your app extension. Below is an outline of how to get the audio recording from the app extension to the containing app. (I'll flesh it out later, time permitting.) This blog post ( http://www.theappguruz.com/blog/ios8-app-groups ) might also be useful.
Set up your app to use App Groups. Open the Project Navigator (Cmd-0) and click on the first line to show your project and targets. Select the target for your app, click on the "Capabilities" tab, look for the App Groups capability, and set its switch to "On". Once the various entitlements have been added, click on the "+" sign to add your App Group, giving it a name like group.com.mycompany.myapp.sharedcontainer. (It must begin with group. and should probably use some form of reverse-DNS naming.)
Repeat the above for your app extension's target, giving it the same name as above (group.com.mycompany.myapp.sharedcontainer).
Now you can write the URL of the audio recording to the app group's shared container from the app extension side. In ActionViewController.swift, replace the code fragment that instantiates and presents the AVPlayerViewController with the following:
let sharedContainerDefaults = NSUserDefaults.init(suiteName:
"group.com.mycompany.myapp.sharedcontainer") // must match the name chosen above
sharedContainerDefaults?.setURL(audioURL, forKey: "SharedAudioURLKey")
sharedContainerDefaults?.synchronize()
Similarly, you can read the URL of the audio recording from the containing app's side using something like this:
let sharedContainerDefaults = NSUserDefaults.init(suiteName:
"group.com.mycompany.myapp.sharedcontainer") // must match the name chosen above
let audioURL :NSURL? = sharedContainerDefaults?.URLForKey("SharedAudioURLKey")
From here, you can copy the audio file into your app's sandbox, e.g., your app's Documents directory or your app's NSTemporaryDirectory(). Read this blog post ( http://www.atomicbird.com/blog/sharing-with-app-extensions ) for ideas on how to do this in a coordinated fashion using NSFileCoordinator.
References:
Creating an App Extension
Sharing Data with Your Containing App

How to make the iOS keyboard's return key submit input in Unity?

I have a Unity UI input field and a text box. When I use Input.GetKeyDown(KeyCode.Return), it only works in the OS X and PC builds and not in the iOS build. The iOS keyboard's Return key does nothing. I have tried the events too, but it doesn't work even then.
Can somebody please tell me the solution to this problem, if there is one?
While I can't think of a way to harness the return key directly on iOS, there is a way to do so with the "Submit" key using the TouchScreenKeyboard class in Unity.
Specifically, it has a variable TouchScreenKeyboard.done to indicate whether the user has pressed the "Submit" (or equivalent) button on any mobile device (iOS, Android, Windows Phone).
You can also check the wasCanceled variable to see whether the user canceled the input.
Example
using UnityEngine;

public class TouchKeyboardExample : MonoBehaviour {
    private TouchScreenKeyboard touchScreenKeyboard;
    private string inputText = string.Empty;

    void Start () {
        touchScreenKeyboard = TouchScreenKeyboard.Open(inputText, TouchScreenKeyboardType.Default);
    }

    void Update () {
        if (touchScreenKeyboard == null)
            return;

        inputText = touchScreenKeyboard.text;

        if (touchScreenKeyboard.done)
            Debug.Log("User typed in " + inputText);

        if (touchScreenKeyboard.wasCanceled)
            Debug.Log("User canceled input");
    }
}
I've never tried this on iOS, so I'll just guess here.
Are you using the new Unity UI that was introduced in Unity 4.6 / Unity 5? If so, you might want to use the UI EventSystem, which you probably have somewhere in the scene already (it is added automatically when you add a new Canvas object). If you don't have it in the scene, add it via menu GameObject->UI->Event System.
In the EventSystem game object, there's a component called Standalone Input Module, where you can then define the Submit Button property - which is mapped to Unity's Input Manager (Edit->Project Settings->Input).
On the individual UI element (i.e. the InputField in your case), you can now add an EventTrigger component, which can listen to the Submit event and call a custom method, even pass it some data (e.g. itself, as an InputField parameter of the method). A minimal code sketch is shown below.
You can also listen to many more events this way (select, hover, drag, etc).
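For illustration, here's a minimal sketch of catching the Submit event in script rather than through the EventTrigger inspector (the component name SubmitCatcher is mine; whether the mobile keyboard's Done key raises Submit depends on the input module, so treat this as a starting point):
using UnityEngine;
using UnityEngine.EventSystems;

// Implements ISubmitHandler so the EventSystem delivers the Submit event
// to this component - the same event an EventTrigger would listen for.
// Attach it to the InputField's GameObject.
public class SubmitCatcher : MonoBehaviour, ISubmitHandler
{
    public void OnSubmit(BaseEventData eventData)
    {
        var input = GetComponent<UnityEngine.UI.InputField>();
        if (input != null && input.text.Length > 0)
            Debug.Log("Submitted: " + input.text);
    }
}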
This works fine for me (PC/mobile), try it out:
this.yourInput.onSubmit.AddListener(delegate {
    if (this.yourInput.text.Length > 0) {
        // do something here after Enter (PC) or Done (mobile)
    }
});

Using offline map sources with MonoTouch's Route-me binding

I'm trying to get route-me to show an offline map which is bundled or downloaded after app installation. I'm using the route-me bindings sample project to get the work done just for now. I also use the mbtiles file from the original route-me repo's SampleMap project. I copied the file to the project's root directory and set its build action to BundleResource (that's what I thought would be appropriate). After that I changed the code to this:
public override void ViewDidLoad ()
{
    base.ViewDidLoad ();

    RMDBMapSource dbSource = new RMDBMapSource ("Philadelphia.mbtiles");
    MapView = new RMMapView (View.Frame, dbSource.Handle);
    MapView.AutoresizingMask = UIViewAutoresizing.FlexibleDimensions;

    if (UIScreen.MainScreen.Scale > 1.0)
        MapView.AdjustTilesForRetinaDisplay = true;

    Add (MapView);
}
But no luck: the app runs in the simulator but shows only a grey background, nothing more. So I need someone to help me and tell me what I'm doing wrong. I need to get it done this week, since next week is the deadline for the project. Any help would be appreciated.
I haven't actually used offline tiles myself, but based on this thread it looks like you might need to put the RMDBMapSource into an instance of RMMapContents instead of directly into the MapView. So I'm thinking it would be something like this with Xamarin:
RMDBMapSource dbSource = new RMDBMapSource ("Philadelphia.mbtiles");
MapView = new RMMapView(View.Frame, new IntPtr());
RMMapContents contents = new RMMapContents (MapView.Handle, dbSource.Handle);
That assumes you have a wrapper binding for RMMapContents too, which, from the looks of it, the bindings project does not have by default. You'd need to throw in a wrapper that at least defines the constructor, along the lines of the sketch below.
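For example, something like this in the binding project's ApiDefinition.cs (a rough sketch; the selector initWithView:tilesource: is my assumption from the route-me source, so verify it against RMMapContents.h):
// ApiDefinition.cs (binding project)
[BaseType (typeof (NSObject))]
interface RMMapContents
{
    // assumed selector; check RMMapContents.h in the route-me source
    [Export ("initWithView:tilesource:")]
    IntPtr Constructor (RMMapView view, RMDBMapSource tileSource);
}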
This page looks like it provides similar code (in Obj-C) towards the bottom.

How can I run a background service application and a UiApplication at the same time?

I want to run a background service application and a UiApplication at the same time.
Is it possible to create both in the same project, or do I need to create separate projects?
Actually, I am confused about how to call or start the background service in the event thread.
This is how you can set up an alternate entry point for your application:
A- Using the BlackBerry® Java® Plug-in for Eclipse®
After creating the project for the original application, create an alternate entry point to launch the application UI.
1- Double-click BlackBerry_App_Descriptor.xml within your project.
2- Check off System Module and Do not display the application icon on the BlackBerry home screen.
3-Click on the Alternate Entry Point tab.
4- Click the Add button.
5- Enter a title for the entry point and click OK.
6- Specify the application argument that would launch the application using this alternate entry point (for example: gui).
7- Modify the main() method of the original project as follows:
public static void main(String[] args) {
    if (args != null && args.length > 0 && args[0].equals("gui")) {
        // code to initialize the app
        theApp.enterEventDispatcher();
    } else {
        // code to launch the background thread
    }
}
B- Using the BlackBerry JDE
After creating the projects for the original application, you will have to create another project for the UI entry point. Assuming that the thread to be run exists in the same project as the original application, follow these steps:
1- Right-click the project node and select Properties.
2- In the Properties window, select the Application tab.
3- Verify the following options are checked: Auto-run on startup and System module (to register the thread with the system).
4- Create another project under the same folder as the original project. Right-click the new project node and select Properties.
5- Select the Application tab and select Alternate CLDC Application Entry Point from the Project type drop-down list. Select the name of the original project (for example, trafficreporter) from the Alternate entry point for drop-down list. Also specify the arguments that would launch the application using this alternate entry point (for example: gui).
6- Modify the main() method of the original project as follows:
public static void main(String[] args) {
    if (args != null && args.length > 0 && args[0].equals("gui")) {
        // code to initialize the app
        theApp.enterEventDispatcher();
    } else {
        // code to launch the background thread
    }
}
http://supportforums.blackberry.com/t5/Java-Development/Background-thread-for-push-notifications/td-p/563071
The BlackBerry dev forums are full of threads and sample code to accomplish this very thing.
Personally, I use the alternate entry point method: I run the background app as an auto-start UiApplication (with no icon) that never pushes a MainScreen but uses its own dispatch thread to throw up a dialog or similar notifications, and then, when the actual home icon is pressed/clicked, I launch the UI entry point to interact with the user.
