Need help regarding the Markerless Example using OpenCV for Unity

I am working on a project based on markerless AR, so I am using OpenCV for Unity together with the well-known Unity asset "Markerless Example using OpenCV for Unity". After importing both packages, I get one error related to the core module. I have tried everything I can think of, but now I need help so I can proceed further and reach my target.
Here is the error I am getting (screenshot attached).

This happens when the plugin was not installed or imported successfully, or when you are using an outdated version (the newer releases of OpenCV for Unity are much improved). Import the package again, or else manually import the classes from the core module in the scripts that report the error.

Related

Creating and exporting a Metal library

I see that Apple documentation usually focuses on creating Metal libraries that can then be loaded using the Metal API from Swift or Objective-C, but I am trying to create a library that provides utility Metal functions to projects that import it (so you would #include <MyLibrary/Header.h>). Unfortunately I can't find how to do this anywhere:
I see no clear way of doing this as part of a Swift package (which is what I would have preferred), since I would need a way to tell Xcode that the package contains a Metal library that it should link with the current project's library.
I also tried the "Metal Library" template in Xcode, but I don't understand how it should be used.
What is the correct way of doing this?

How to use OpenCV in Xamarin?

I am creating a Xamarin project to detect some objects in images.
I want to use the OpenCV library, but I can't wrap it for Xamarin.
I wrapped it for a Windows Forms application, but I cannot wrap it for Xamarin (I get compatibility errors).
I have tried to install Emgu.CV, but I got an error: unable to find version 3.4.3,
and I can't update it either.
I also tried to wrap OpenCV for Xamarin by downloading OpenCV from https://opencv.org/releases/,
but I didn't find the .jar file to add to my project.
So how can I use OpenCV in Xamarin, please?
Also, is object detection possible with OpenCV or not?
Luckily, some work has already been done by people before you. You can use the wrapper that has been created for Android, but for iOS you'll have to create the wrapper yourself.
Edit from 2022: for Android you can also take a look at this GitHub sample, amongst many others.
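Regarding the object-detection part of the question: OpenCV itself does ship object detection. A minimal sketch using the OpenCV Java API (which the Android wrapper binds) is shown below; the file paths are placeholders, and this only illustrates the underlying OpenCV API, not the Xamarin binding.

import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.core.MatOfRect;
import org.opencv.imgcodecs.Imgcodecs;
import org.opencv.objdetect.CascadeClassifier;

public class DetectSketch {
    static {
        // Desktop Java: load the native OpenCV library first.
        // On Android you would typically use OpenCVLoader.initDebug() instead.
        System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
    }

    // Returns how many objects a Haar/LBP cascade finds in an image.
    static int countDetections(String imagePath, String cascadePath) {
        Mat image = Imgcodecs.imread(imagePath);                          // load the input image
        CascadeClassifier detector = new CascadeClassifier(cascadePath);  // e.g. a face cascade XML
        MatOfRect detections = new MatOfRect();
        detector.detectMultiScale(image, detections);                     // run the detector
        return detections.toArray().length;
    }
}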

CardIO conflicts with OpenCV framework

I thoroughly enjoy using Card.IO, but in order for me to use it, it would have to be decoupled from its OpenCV .a files and instead link against the OpenCV framework. Most people have moved on from OpenCV 2 to OpenCV 3, and this library is stuck in the past. There appears to be no way to work around this, since the dependency is baked into the .a file (calling on the creators of Card.IO).
Has anyone else been able to work around this? Or is this library junk now if you use OpenCV?
Thanks,
Kevin

MapboxOptimizedTrips missing from package MapboxSDK.AndroidServices (Xamarin.Android)

I'm currently implementing Mapbox in a Xamarin.Android application. I'm looking at the examples from the Mapbox GitHub, specifically this one: https://github.com/mapbox/mapbox-android-demo/blob/master/MapboxAndroidDemo/src/main/java/com/mapbox/mapboxandroiddemo/examples/mas/OptimizationActivity.java, which I'm translating to C#, but now I'm facing a problem. In this example they use a class called "MapboxOptimizedTrips", which comes from 'com.mapbox.services.api.optimizedtrips.v1.MapboxOptimizedTrips'. In Xamarin I have imported all the libraries that Xamarin made for Mapbox, which are these three:
MapBoxSDK.Android
MapBoxSDK.AndroidServices
MapBoxSDK.JavaServices
None of these includes the Optimized Trips API related classes. Has anyone faced this issue before?
Screenshot of code
Have you read and followed this document from Mapbox?
The Mapbox Navigation library keeps changing and adding new features.
There are two DirectionsRoute classes in the Mapbox libraries,
and the DirectionsRoute returned by the Mapbox Optimization packages doesn't fit with what NavigationRoute does.
I think you have to go with the way you mentioned and call the API directly, or wait for Mapbox to evolve their library.
Cheers.
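If you do call the API directly, a minimal sketch in Java (the language of the Android demo being translated) could look like the following; the optimized-trips/v1 endpoint pattern and the shape of the JSON response are assumptions to be verified against the current Mapbox Optimization API documentation.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class OptimizedTripsClient {
    // coordinates: "lon,lat;lon,lat;..."  token: your Mapbox access token
    static String requestOptimizedTrip(String coordinates, String token) throws Exception {
        // Endpoint pattern assumed from the Optimization API docs:
        // optimized-trips/v1/{profile}/{coordinates}
        URL url = new URL("https://api.mapbox.com/optimized-trips/v1/mapbox/driving/"
                + coordinates + "?access_token=" + token);
        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
        connection.setRequestMethod("GET");

        StringBuilder body = new StringBuilder();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(connection.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                body.append(line);
            }
        }
        return body.toString(); // JSON describing the optimized trip
    }
}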

Implementing ASIFT in Android

I am new to both OpenCV and Android. I have to detect objects in my project, so I have decided to use ASIFT. However, the code they provide here is very lengthy: it contains lots of C files and it also doesn't have OpenCV support.
Some searching on SO itself suggested that it is easier to connect the ASIFT code to the OpenCV library, but I can't figure out how to do that. Can anyone help me with a link, or with the steps I should follow to add ASIFT to my OpenCV library, which I can then use in my Android application?
Also, I would like to know whether using the Android NDK with JNI to call the C files, or using the Android SDK with a binary package, would be the more suitable option for my Android project (object detection).
Finally, I solved my problem by using the source code given on the ASIFT developers' website. I combined all the source files into my own library using make, and then called the required function from the library using JNI.
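A minimal sketch of what the Java side of such a JNI binding can look like; the library name and the native method signature here are hypothetical illustrations, not the actual ASIFT API.

public class AsiftNative {
    static {
        // Loads libasift.so, i.e. the library built from the ASIFT sources with make (name assumed)
        System.loadLibrary("asift");
    }

    // Implemented in C, e.g. as Java_AsiftNative_matchImages in the native library;
    // returns the number of ASIFT matches between the two images (hypothetical signature)
    public static native int matchImages(String imagePath1, String imagePath2);
}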
It works, but execution takes approximately 2 minutes on an Android device. Does anyone have ideas about how to reduce the running time?
They used very simple and slow brute-force matching (just as a proof of concept). You can use the FLANN library and it will help a lot: http://docs.opencv.org/doc/tutorials/features2d/feature_flann_matcher/feature_flann_matcher.html
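A minimal sketch of FLANN-based matching with the OpenCV Java API, assuming floating-point descriptors such as SIFT/ASIFT produces; the class and variable names are only illustrative of the linked tutorial, not of the ASIFT code itself.

import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.core.MatOfDMatch;
import org.opencv.features2d.DescriptorMatcher;

public class FlannMatchSketch {
    // descriptors1/descriptors2: one descriptor per row, as produced by the feature extractor
    static MatOfDMatch matchDescriptors(Mat descriptors1, Mat descriptors2) {
        // FLANN's default KD-tree index expects CV_32F descriptors
        if (descriptors1.type() != CvType.CV_32F) descriptors1.convertTo(descriptors1, CvType.CV_32F);
        if (descriptors2.type() != CvType.CV_32F) descriptors2.convertTo(descriptors2, CvType.CV_32F);

        DescriptorMatcher matcher = DescriptorMatcher.create(DescriptorMatcher.FLANNBASED);
        MatOfDMatch matches = new MatOfDMatch();
        matcher.match(descriptors1, descriptors2, matches); // nearest-neighbour match, query -> train
        return matches;
    }
}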
