I need to detect whether the user is moving the iPad closer to or further away from themselves, in order to zoom the view in or out. What's the best way of doing this (most probably using Core Motion)?
Also, I've already built the project with an external library using CocoaPods. How do I add more frameworks when Link Binary With Libraries now includes only one item, libPods.a?
I would use a native framework like Core Motion.
https://developer.apple.com/library/prerelease/ios/documentation/EventHandling/Conceptual/EventHandlingiPhoneOS/motion_event_basics/motion_event_basics.html
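Core Motion can't report the absolute distance to the user, and double-integrating acceleration to get position drifts within seconds, so a practical compromise is to watch user acceleration along the device's z-axis (perpendicular to the screen) and treat deliberate pushes or pulls as zoom gestures. Here's a minimal sketch of that idea; the class name, the 60 Hz rate, and the 0.1 g threshold are all illustrative choices, not tuned values.

```swift
import CoreMotion

// A minimal sketch: report deliberate pushes/pulls along the screen's
// z-axis so the caller can map them to zoom in/out.
final class ZoomMotionDetector {
    private let motionManager = CMMotionManager()

    /// Called with the z-axis user acceleration (in g) whenever it
    /// exceeds the jitter threshold; the sign tells you the direction
    /// (which way is "toward the user" depends on device orientation).
    var onZoom: ((Double) -> Void)?

    func start() {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
        motionManager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let motion = motion else { return }
            // userAcceleration already has gravity filtered out.
            let z = motion.userAcceleration.z
            if abs(z) > 0.1 { // ignore small hand jitter
                self?.onZoom?(z)
            }
        }
    }

    func stop() {
        motionManager.stopDeviceMotionUpdates()
    }
}
```

You'd then smooth the callback values and feed them into whatever drives your zoom (a UIScrollView's zoomScale, a camera transform, and so on).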
Related
I really want to make use of PencilKit in my React Native application. I just want to know whether I can do it, and if so, how?
TL;DR: As of 2/6/2021 no ready-made wrapper exists, so you will need to write bridge code between React Native (JS) and native code (Swift/Objective-C). There are big performance limitations with this approach; I recommend you create a native Swift app for your project.
I was also curious if this is available.
For those willing to use Swift, here's the sample shown during the demo.
For those that want to use the PencilKit / Core ML native libraries from React Native, you need to write bridge code between JavaScript (please use TypeScript) and the native code.
Here's more information on bridging and a guide.
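To make that concrete, here is a rough sketch of what the Swift half of such a bridge module can look like. Everything here is illustrative: the module name PencilAvailability is made up, and it assumes the usual React Native setup for Swift modules (a bridging header importing <React/RCTBridgeModule.h>, plus a companion Objective-C file that registers the module with RCT_EXTERN_MODULE / RCT_EXTERN_METHOD).

```swift
import Foundation

// A rough sketch of the Swift half of a React Native bridge module.
// Assumptions (not from the original post): the module name is made up;
// a bridging header imports <React/RCTBridgeModule.h>; a companion
// Objective-C file registers it via RCT_EXTERN_MODULE / RCT_EXTERN_METHOD.
@objc(PencilAvailability)
final class PencilAvailability: NSObject {

    // Nothing here touches UIKit, so main-queue setup isn't required.
    @objc static func requiresMainQueueSetup() -> Bool { false }

    // Exposed to JavaScript as a promise that resolves to a Bool.
    @objc func isPencilKitAvailable(_ resolve: @escaping RCTPromiseResolveBlock,
                                    rejecter reject: @escaping RCTPromiseRejectBlock) {
        if #available(iOS 13.0, *) {
            resolve(true)   // PencilKit ships with iOS 13+
        } else {
            resolve(false)
        }
    }
}
```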
In my case, I'll be building a note-taking app and it needs to be performant. Despite being a React / React Native developer, I'll be choosing Swift for this project due to those performance concerns.
A last point: you can use React Native and native code together, but this is more of a headache than an enabler. Airbnb used this approach for some time but moved away from it.
For anyone new to React Native, it's a great platform. I personally like to use it for simple (not graphically intensive) applications. You can also use it with the Expo tooling, which speeds up prototyping, but be warned that some functionality is unavailable; Bluetooth is one example.
Yes, I did it.
I created a native view for iOS PencilKit in React Native with Swift.
You can check basic examples of the native module in my repo.
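For anyone wanting a sense of what such a native view involves, the core of it is usually an RCTViewManager subclass that vends a PKCanvasView. The sketch below is not the code from the repo above, just an illustration under the same assumptions as before (Swift visible to React Native via a bridging header, and an Objective-C file registering PencilCanvasManager, which is a made-up name):

```swift
import UIKit
import PencilKit

// Illustrative only: assumes React Native's RCTViewManager is visible
// through the bridging header and that an Objective-C file registers
// this class with RCT_EXTERN_MODULE(PencilCanvasManager, RCTViewManager).
@objc(PencilCanvasManager)
final class PencilCanvasManager: RCTViewManager {

    // View creation has to happen on the main thread.
    override static func requiresMainQueueSetup() -> Bool { true }

    override func view() -> UIView! {
        guard #available(iOS 13.0, *) else { return UIView() }
        let canvas = PKCanvasView()
        // Attach the system tool picker so pens/erasers can be chosen.
        if let window = UIApplication.shared.windows.first,
           let toolPicker = PKToolPicker.shared(for: window) {
            toolPicker.setVisible(true, forFirstResponder: canvas)
            toolPicker.addObserver(canvas)
            canvas.becomeFirstResponder()
        }
        return canvas
    }
}
```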
I want to develop an iOS/macOS app which would use Metal for rendering. This is not supported as a “main window” by JUCE (yet). The reason I want to use JUCE is that it unifies access to audio interfaces, which is quite different between iOS and macOS (it also has a GUI component for configuring the audio source, which AudioKit doesn't have, which is why I don't want to use AudioKit).
I'm struggling to generate a project in JUCE that would have a Metal-backed view as the main component. So I'm thinking of instead making a library using JUCE, and then making two native apps for iOS and macOS which would both use that library. The library would be used to show the dialogs for configuring the audio input (and the other parameters I have, through custom components).
The question: is it possible to make a JUCE library capable of showing dialogs (on both iOS and macOS) to configure my app, and then use this library from native iOS and macOS apps that would have a Metal-backed main view?
Thanks in advance!
I am currently working on an augmented reality project. I would like to place some virtual objects on a human body, so I created an iOS face-tracking app (with OpenCV, C++) which I want to use as a plugin for Unity. Is there a way to build a framework from an existing iOS app? Or do I have to create a new Xcode project with a Cocoa Touch framework target and copy the code from the app into that framework? I am a little bit confused here. Will the framework have camera access?
My idea was to track the position of a face and send that position to Unity, so that I can place objects on it. But I do not know how to do that. Can anybody help?
Kind regards.
As far as I know, you need to build your Unity project and use assets like OpenCV, but that doesn't allow you to track the human body (without markers).
As for building a framework starting from an existing iOS app, that's the first time I've heard of that!
Is it possible to create an iOS library or framework using libgdx (RoboVM) that can be imported into Xcode?
Background:
One of my colleagues has created a 3D visualisation app as a libgdx project for Android and Windows desktop. It can be compiled to run on iOS using RoboVM. However, I would like to wrap extra native user interface elements around it using Xcode. I know it's possible to build the user interface programmatically via RoboVM, but I would be keen to investigate whether it's possible to bring the existing work into Xcode. I don't need to edit the 3D visualisation component, just add extra GUI elements around the 3D vis window. I thought compiling the libgdx (RoboVM) code to a framework or library might be a solution that could then be imported.
Yes, you can do it.
All you need is to create a method, say initRoboVM(). This will be called by your code when you want to initialize libgdx. You'll need to pass the app path in, which you can hardcode while testing.
initRoboVM() will need some modifications: namely, it should not call your Java app's main method (at least, that's what well-behaved libraries should avoid, IMO), and it should not call rvmShutdown.
You can get further information from here
Thanks :)
I asked the RoboVM team directly. Their answer: It's not a native function, but it certainly can be done.
The complete message...
Hi,
Sorry for the late reply. This use case is not something we're going to do now. It is possible, though, if you're prepared to do some patching of RoboVM. Search the RoboVM Google Group and you should find others who have managed to get this working.
We get this request every now and then, so we will add support for this eventually.
Regards, Niklas
I'm doing a series of book apps for a client. There are a lot of books in the series, and each one will be a separate app. Instead of changing every app each time the client wants something tweaked across all of them, like the position of a button, I'd like to make a universal "framework" (library?) that I can import into a project, just as I would one of the iOS SDK's frameworks. The framework would contain all the universal components of the apps, including a controller class I would subclass in each app for the app-specific behavior. Then, when I need to make changes to all of them, I could just change the code in the framework, and it would affect every app that uses it. I'd also like to be able to include common images and other media.
Do I want to use the "Cocoa Touch Static Library" template in Xcode? I also saw this project on GitHub: https://github.com/kstenerud/iOS-Universal-Framework; would that be a better fit for what I'm trying to do?
For your purposes, the simplest approach would be to set it up as a static library project. Then, for each application that needs it, drag the library project into your workspace and add the static library product as a dependent target.
As far as I am aware, the current leading method for building a framework on iOS is Jeff Verkoeyen's iOS-Framework.
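Whichever packaging you pick, the subclassing pattern described in the question is straightforward. Here's a minimal sketch (in Swift, with made-up names): the shared library ships a base controller holding everything common, and each book app overrides only the bits that differ.

```swift
import UIKit

// A sketch of the shared-controller pattern (names are made up).

// Lives in the shared library: everything common to every book app.
open class BookBaseViewController: UIViewController {

    /// Each app overrides this with its own book.
    open var bookName: String { "Untitled" }

    open override func viewDidLoad() {
        super.viewDidLoad()
        // Shared layout decisions (button positions, etc.) live here,
        // so a change in the library propagates to every app.
        title = bookName
    }
}

// Lives in one specific app target: only the differences.
final class MobyDickViewController: BookBaseViewController {
    override var bookName: String { "Moby-Dick" }
}
```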