I am searching for a tutorial or help on creating a Snapchat-lens-like feature on iOS, in Swift or Objective-C, using either Google Mobile Vision or the iOS Vision framework.
I want to achieve something like this Android example does:
https://www.raywenderlich.com/523-augmented-reality-in-android-with-google-s-face-api
It would be great if you could help me or suggest an open-source library with this capability.
Thanks.
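For context, the usual building block for such a lens on iOS is the Vision framework's face-landmark detection. A minimal sketch might look like the following; `detectFaceLandmarks` is a hypothetical helper name, and a real lens would feed live camera frames and render overlays rather than print results:

```swift
import UIKit
import Vision

// Detect faces and their landmarks (eyes, nose, mouth) in a still image.
func detectFaceLandmarks(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    let request = VNDetectFaceLandmarksRequest { request, error in
        guard let faces = request.results as? [VNFaceObservation] else { return }
        for face in faces {
            // Bounding box and landmark points are in normalized coordinates (0...1).
            print("face at \(face.boundingBox)")
            if let leftEye = face.landmarks?.leftEye {
                print("left eye points: \(leftEye.normalizedPoints)")
            }
        }
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

The landmark positions are what you would anchor lens graphics (glasses, hats, etc.) to, typically per frame from an `AVCaptureSession`.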
I am developing an app that uses turn-by-turn navigation, but due to cost constraints we cannot afford MapBox's pricing plans.
I came to know that OpenStreetMap is a good option here, and MapBox itself uses OpenStreetMap for some of its features. Has anybody used the MapBox SDK with OpenStreetMap data?
I also searched for other open-source libraries that use OpenStreetMap. The Skobbler SDK was the one I found, but it is no longer functional, as development has stopped.
I need help in finding a library that allows me to navigate in-app using OSM.
Please help.
Problem: After building MediaPipe and playing around with their example iOS apps, I was surprised that I could not find any comprehensive material on how to take their machine learning frameworks and integrate them into a custom iOS app.
Question: Am I missing something? Can somebody provide some insight into integrating their ML solutions into an iOS app so that the output can be used in a custom use case in Swift?
I have an iOS library and would like to integrate the Firebase SDK into it so that I can see crashes and other events that occur in the library. Is this doable?
Yes. It is usually best to include it from a statically linked library, but it is also possible from a dynamic library.
See details at https://github.com/firebase/firebase-ios-sdk/blob/master/docs/firebase_in_libraries.md.
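As a rough sketch of the static-linking setup with CocoaPods (the library name below is hypothetical; see the linked doc for the authoritative guidance):

```ruby
Pod::Spec.new do |s|
  s.name             = 'MyAnalyticsKit'   # hypothetical library name
  s.version          = '1.0.0'
  s.summary          = 'Example library that reports its own crashes via Firebase.'
  s.source           = { :git => 'https://example.com/MyAnalyticsKit.git', :tag => s.version.to_s }
  s.ios.deployment_target = '12.0'
  s.source_files     = 'Sources/**/*.swift'

  # Firebase pods are distributed as static frameworks, so the wrapping
  # library should be built as one too.
  s.static_framework = true
  s.dependency 'FirebaseCrashlytics'
end
```

The host app then picks up the Firebase dependency transitively when it adds your pod.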
Has anyone used Yuneec's SDK, or does anyone know anything about it? An SDK for drones opens up a lot of possibilities, but I want to see if anyone here has experience with Yuneec first.
http://developer.yuneec.com/
Yes! Yuneec has SDKs for both iOS and Android. I am currently looking into the Yuneec Android SDK to design a custom app, and so far it's been great! The SDK exposes a lot of features, and it looks like they are going to add more exciting features soon. The SDK provides APIs to control Yuneec drones and can be customized to your needs. You can definitely check out more here: https://developer.yuneec.com.
I am trying to figure out a way to do speech recognition / speech-to-text in Xamarin.iOS. I have been doing a lot of research, and so far it seems like this feature is only available for Xamarin Android development.
Does anyone have an idea of how to approach this? Any links to resources or projects that have implemented this would be much appreciated!
You can try OpenEars or Dragon speech recognition. There is no built-in speech recognition API for Xamarin.iOS that I am aware of.
https://github.com/oganix/MonoTouch-OpenEars
Has anyone created a MonoTouch binding for the Nuance Dragon Mobile Speech SDK for iOS?
Update
Since iOS 10 you can access the speech recognition APIs, documented below:
https://blog.xamarin.com/speech-recognition-in-ios-10/
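For reference, the native Swift usage of the iOS 10 Speech framework looks roughly like this; the Xamarin.iOS bindings expose the same types under C#. This is a sketch only, assuming the audio file URL is supplied by the caller and that `NSSpeechRecognitionUsageDescription` is set in Info.plist:

```swift
import Speech

// Request authorization, then transcribe a recorded audio file.
func transcribe(fileURL: URL) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              let recognizer = SFSpeechRecognizer(),
              recognizer.isAvailable else { return }

        let request = SFSpeechURLRecognitionRequest(url: fileURL)
        recognizer.recognitionTask(with: request) { result, error in
            // Partial results arrive as recognition progresses;
            // isFinal marks the completed transcription.
            if let result = result, result.isFinal {
                print(result.bestTranscription.formattedString)
            }
        }
    }
}
```

For live microphone input you would use `SFSpeechAudioBufferRecognitionRequest` fed from an `AVAudioEngine` tap instead.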
Android has long had its speech APIs available, documented below:
https://developer.xamarin.com/guides/android/platform_features/speech/