Ant build system for apps and SDKs

I am working with build.xml and trying to understand how the build works when there are multiple build.xml files, provided by both the app and an SDK.
For example, say the app and the SDK both depend on a base SDK and both call some methods in that base SDK, and I am trying to do some pre-processing for their calls to this base SDK (adding identification params). In what order does the build process happen? It seems like the build.xml in the SDK does not run if I compile from the app level; only the app's build.xml runs. Could anyone help explain what happens during this process? Greatly appreciated.
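For what it's worth, plain Ant does not chain build.xml files automatically: running ant at the app level executes only the app's build.xml, and the SDK's build.xml is ignored unless the app's build file explicitly delegates to it with the <ant> or <subant> task. A rough sketch of such a delegating app-level build.xml (the directory layout, target names, and jar path are assumptions for illustration, not taken from your project):

<project name="app" default="build">

    <!-- Delegate to the base SDK's own build.xml first -->
    <target name="build-base-sdk">
        <ant dir="../base-sdk" target="build" inheritAll="false"/>
    </target>

    <!-- Then compile the app against the freshly built SDK jar -->
    <target name="build" depends="build-base-sdk">
        <mkdir dir="bin"/>
        <javac srcdir="src" destdir="bin"
               classpath="../base-sdk/dist/base-sdk.jar"
               includeantruntime="false"/>
    </target>

</project>

With a layout like this, the order is: the app's build target runs, its dependency triggers build-base-sdk, which invokes the SDK's build.xml in its own directory, and only then does the app's own compilation step run. Any pre-processing you want to apply to calls into the base SDK has to be wired into whichever of those build files owns the sources being rewritten.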

Related

Adding and using an executable file in iOS at runtime

Suppose I have some external executable file (call it a .swift file) that is not linked with the Xcode project at compile time. That means I did not have any of those files in my project tree when I built the project.
For example, let's say I have a file called exc.swift. This file was not included when I built the project.
Is there any way that I can execute that file (exc.swift) at runtime?
On Android there is a way to do this using the DexClassLoader class, which is responsible for executing code not installed as part of an application.
The documentation for that class is here.
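For reference, loading external code with DexClassLoader on Android looks roughly like this (a sketch only; the file name, class name, and method name are hypothetical):

import android.content.Context;
import dalvik.system.DexClassLoader;
import java.io.File;
import java.lang.reflect.Method;

public class ExternalCodeRunner {
    // Loads a class from an external .dex file and invokes a method on it via reflection.
    public static void runExternalCode(Context context) throws Exception {
        // A dex file previously downloaded into the app's private storage (hypothetical name)
        File dexFile = new File(context.getFilesDir(), "external-code.dex");
        // Directory where the runtime can write the optimized dex
        File optimizedDir = context.getDir("dex_opt", Context.MODE_PRIVATE);

        DexClassLoader loader = new DexClassLoader(
                dexFile.getAbsolutePath(),      // path to the external code
                optimizedDir.getAbsolutePath(), // optimized dex output directory
                null,                           // optional native library search path
                context.getClassLoader());      // parent class loader

        // Hypothetical class and method names inside the external dex
        Class<?> clazz = loader.loadClass("com.example.external.Plugin");
        Object plugin = clazz.getDeclaredConstructor().newInstance();
        Method run = clazz.getMethod("run");
        run.invoke(plugin);
    }
}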
Is there an iOS equivalent of this, or is this achievable in any way?
If you are hoping to distribute the app that you are writing, then this is an absolute no-no. The App Store Review Guidelines clearly state "Apps that install or launch other executable code will be rejected", so no app that exhibits this functionality will ever get onto the App Store. You may be able to find or devise some kind of hack or workaround to get this kind of thing to work, but it will only ever be for your own amusement.

Modules in iOS with an AIR app

We have a problem in our company with an application and would like advice on an optimal solution. The point is that we have a tablet application made with Flex Mobile; our application can open modules at execution time, downloading them from a server. These modules open perfectly with AIR or Android, but on iOS it is not possible; it just doesn't work. Some solutions have occurred to us for keeping the extra functionality these modules provide; they are as follows:
1. Create a library for each client's extra functionality and import all of them into the main application project.
2. Create a single library with the functionality of all clients and then import it into the main application project.
3. Create as many native extensions (ANEs) as our different customers' functionality requires and import them into our application.
I would like to know which solution is optimal, because in the future we could have 100 customers, and too much functionality might slow down the application.
Thank you very much.
Apple does not allow dynamically loading executable code at runtime. Modules are executable code, so they need to be bundled at build time; only then can they be loaded from the bundle.
Otherwise you could circumvent App Store review and add arbitrary, potentially harmful code at execution time.

CanvasCamera for iOS PhoneGap / Cordova

First of all, I'm really new to Cordova and Xcode, and I'm trying to create an inline QR code scanner from an HTML5 app (or at least see if it's possible with this plugin).
I'm trying to follow the instructions from https://github.com/daraosn/Cordova-CanvasCamera, and I am unsure of the format or how to edit the config.xml in my project.
The instructions say:
"Edit your config.xml and add CanvasCamera into your Plugins list." but i dont know what this means or what format it should follow.
Also, when I add the plugins to the Plugin folder in the project, Xcode throws an error saying:
"'NSAutoreleasePool' is unavailable: not available in automatic reference counting mode".
I know that getUserMedia isn't supported in Safari on iOS, so this is pushing the boundaries a bit. If all else fails, I'll just use input type=file and access the camera that way.
The plugin you reference looks severely dated. My best guess is that, for your config.xml, you should add:
<plugin name="CanvasCamera" />
once you copied the files into your directory per those instructions.
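Note that newer Cordova versions (3.x and later) use a <feature> entry in config.xml instead of <plugin>; a sketch, assuming the plugin's native iOS class is also named CanvasCamera (check the plugin's .h/.m files for the real class name):

<feature name="CanvasCamera">
    <param name="ios-package" value="CanvasCamera" />
</feature>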
Also, you could check out https://github.com/donaldp24/CanvasCameraPlugin. You can install it by running "cordova plugin add https://github.com/donaldp24/CanvasCameraPlugin.git && cordova prepare".
Hope this helps.
I found a plugin (http://scandit.com) which is much better than what I was originally trying to achieve.
The only downside is that it's a premium service ($200/month); however, the upsides are: it's super easy to install, really fast at decoding (it doesn't need to focus, and shade doesn't matter), available on plenty of platforms, and has good documentation.
Another point is that it is a fullscreen camera plugin, which actually works better than the inline approach we initially wanted.
Hope it helps anyone else.

Why doesn't Xcode have an iOS framework option?

I've seen projects such as ios-universal-framework, but I want to know why Xcode does not natively support building an iOS framework. Is it some kind of legal issue? The static library option is not good enough, because I want to be able to use .dylib files in my framework.
A little background on what I want to do with a framework. I have a project that is generated from Unity3D, and when we update, we have to manually add back all of our project changes.
What I want is to use a framework that can store most of those external libraries and resources to make it easier to upgrade our project when updates are released.
From a security perspective, no code is allowed to be dynamically loaded, so only static libraries are allowed.
It is possible to create static pseudo-frameworks. Take a look at iOS-Universal-Framework on GitHub.
What you need is a PostprocessBuildPlayer script, as described in the build pipeline section of the Unity3D documentation.
You can manage the Xcode configurations in this pipeline using scripts like the Xcode Zerg.
I've used a Python script written by a guy called Calvin Rien that worked really well; if you want to know more about this script, this blog post should give you a hint.
To automate these steps, what you really need is to look for posts about continuous integration with Unity3D and iOS, like this one:
Unity3d: from commit to deployment onto tester devices in 20 min using Jenkins

Is it possible to use the MjSip API with BlackBerry?

I am trying to develop a VoIP application for BlackBerry. After a long search, I came to know about the MjSip API, but I have a doubt: is it possible to use this API in BlackBerry development to create a VoIP application? If anyone knows the answer, please help me.
Thanks, it's a nice project you found there! There is a J2ME version, MjSipME, and the only thing I can say for sure right now is that it compiles for BlackBerry without any errors.
UPDATE: You're right, there was a misunderstanding with the package/folder structure.
Steps to compile:
download mjsip2me_1.6.zip
create a BlackBerry project (I've used components 4.6)
in the project's src folder, create the following structure:
(screenshot of the required folder structure: http://img691.imageshack.us/img691/9311/structure.jpg)
extract mjsip2me_1.6.zip and copy the following files and folders:
from mjsip2me_1.6\src\org\zoolu\ to project\src\org\zoolu\
from mjsip2me_1.6\src\j2me\local\ to project\src\local
from mjsip2me_1.6\src\j2me\microtools\ to project\src\org\zoolu\microtools
from mjsip2me_1.6\src\j2me\net\ to project\src\org\zoolu\net
copy mjsip2me_1.6\src\j2me\ExceptionPrinter.java into project\src\org\zoolu\tools
copy mjsip2me_1.6\src\j2me\RotatingLog.java into project\src\org\zoolu\tools
refresh your project, clean and build
It's a J2ME MIDlet, so don't expect it to start on a BlackBerry device, but at least it compiles.
