Sound Localization with a single microphone [closed] - iOS

I am trying to determine the direction of an audio signal using the microphone on an iPhone. Is there any way to do this? As far as I have read and attempted, it isn't possible. I have built extensive models with Keras, and even then determining the location of the sound is shaky at best because of the number of variables. So, leaving any ML aspects aside, is there a library or method to determine audio direction from an iOS microphone?

No, in general it shouldn't be possible (even with machine learning): you need at least two measurement points, and excellent timing, to determine a direction. You MIGHT be able to do something with multiple iPhones, but that would require very tight timing and some learning to determine where the phones are in relation to each other, and I doubt such a library already exists for the iPhone (existing libraries could be ported or adapted, though).
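To make the "two points plus timing" part concrete, here is a minimal Swift sketch (my own illustration, not from the answer) of the far-field TDOA formula you could use once you actually have two synchronized microphones. The mic spacing, speed of sound, and measured delay are all assumed inputs; a single iPhone microphone cannot provide the delay.

```swift
import Foundation

/// Far-field direction-of-arrival estimate from the time difference of arrival
/// (TDOA) between two microphones: sin(theta) = c * Δt / d.
/// - Parameters:
///   - delaySeconds: measured arrival-time difference between the two mics
///   - micSpacing: distance between the microphones in metres (assumed known)
///   - speedOfSound: roughly 343 m/s in air at room temperature
/// - Returns: bearing in degrees relative to the broadside of the mic pair
func bearing(delaySeconds: Double,
             micSpacing: Double,
             speedOfSound: Double = 343.0) -> Double {
    // Clamp to [-1, 1] so measurement noise cannot push asin out of range.
    let ratio = max(-1.0, min(1.0, speedOfSound * delaySeconds / micSpacing))
    return asin(ratio) * 180.0 / .pi
}

// Example: a 0.1 ms delay across a 10 cm baseline gives roughly a 20° bearing.
print(bearing(delaySeconds: 0.0001, micSpacing: 0.10))
```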

Related

Swift ARKit: detect the hand and nails, and place an object on the nails? [closed]

I would like to know how to detect a person's hand and nails, and then use ARKit to place an object on her nails. Frankly, I've been looking for information about this for several days on Google and haven't found anything that could help me. I would really appreciate it if you could help me! Thanks a lot in advance!
You may have to create a machine learning model using Apple's Create ML, trained on images of fingernails and hands, so your app can recognize fingernails and hands; you would then use Core ML to bring that recognition into ARKit, where you could use it to place the object on the nails and hands (see the sketch after the link below). I understand that can be a lot to do, so for a simpler start to solving your problem, Apple has native image recognition functions that you can start experimenting with. I'm not sure that necessarily solves your exact problem of recognizing fingernails and hands, but at least it's a start.
Check below
https://developer.apple.com/documentation/arkit/tracking_and_altering_images
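As a rough illustration of the Create ML / Core ML route described above, here is a short Swift sketch that runs a Vision request over an ARKit camera frame. `NailDetector` is a hypothetical object-detection model you would have to train yourself; everything else uses standard Vision and ARKit types.

```swift
import ARKit
import CoreML
import Vision

/// Runs a (hypothetical) nail-detection model over the current ARKit frame.
/// `NailDetector` stands in for a Core ML model trained with Create ML.
func detectNails(in frame: ARFrame) {
    guard let coreMLModel = try? NailDetector(configuration: MLModelConfiguration()).model,
          let visionModel = try? VNCoreMLModel(for: coreMLModel) else { return }

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        guard let nails = request.results as? [VNRecognizedObjectObservation] else { return }
        for nail in nails {
            // boundingBox is in normalized Vision coordinates; convert it to a screen
            // point and raycast into the AR scene to get a world position for your anchor.
            print(nail.labels.first?.identifier ?? "?", nail.boundingBox)
        }
    }

    // .right matches a portrait device orientation; adjust for your setup.
    let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage, orientation: .right)
    try? handler.perform([request])
}
```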

How to Verify People's faces Using Camera [closed]

I have a program in which I want to do face verification against people's faces previously stored in the app data, using the device's camera to capture the person currently standing in front of it, so the program can log the user in.
I have found some code, but I don't know how to use it; it uses OpenCVSharp.
Can anyone tell me how to do this verification in Xamarin.Forms?
P.S.: I've found some resources that aren't free, such as EmguCV; I want something that is free.
You can use Azure Cognitive Services:
https://azure.microsoft.com/en-gb/services/cognitive-services/computer-vision/
It has a free tier (more than enough if you have fewer than 100 users, I'd say).

Car travel time between two points in Swift [closed]

I am looking for something in Swift that can give me the travel time (by car) between two coordinates. On some other threads I have only seen suggestions to use external sources, but my question is: does Apple have a built-in feature to do this? Similarly, if there is not, can someone please link a few external sources, as I have not found any (probably because I don't know exactly what I am looking for)?
Thanks a lot.
I don't know of any Swift API solutions. Google has a great solution though. Check out Google's Distance Matrix API. It lets you calculate travel time for two coordinates with all sorts of extra options.
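For reference, here is a rough Swift sketch of calling the Distance Matrix API over its REST interface; `YOUR_API_KEY` is a placeholder, error handling is minimal, and the JSON keys follow the documented response shape as I understand it.

```swift
import Foundation
import CoreLocation

/// Fetches the driving time (in seconds) between two coordinates from
/// Google's Distance Matrix API and passes it to the completion handler.
func drivingTime(from origin: CLLocationCoordinate2D,
                 to destination: CLLocationCoordinate2D,
                 completion: @escaping (TimeInterval?) -> Void) {
    var components = URLComponents(string: "https://maps.googleapis.com/maps/api/distancematrix/json")!
    components.queryItems = [
        URLQueryItem(name: "origins", value: "\(origin.latitude),\(origin.longitude)"),
        URLQueryItem(name: "destinations", value: "\(destination.latitude),\(destination.longitude)"),
        URLQueryItem(name: "mode", value: "driving"),
        URLQueryItem(name: "key", value: "YOUR_API_KEY")   // placeholder
    ]

    URLSession.shared.dataTask(with: components.url!) { data, _, _ in
        guard
            let data = data,
            let json = try? JSONSerialization.jsonObject(with: data) as? [String: Any],
            let rows = json["rows"] as? [[String: Any]],
            let elements = rows.first?["elements"] as? [[String: Any]],
            let duration = elements.first?["duration"] as? [String: Any],
            let seconds = duration["value"] as? Double
        else { return completion(nil) }
        completion(seconds)
    }.resume()
}
```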

Better human detection from a UAV? [closed]

I am working on a project in which I am supposed to detect human beings in a live video stream that I get from a UAV's camera module. I do not need to draw any rectangles or boxes around detected subjects; I just need to reply with a yes or no. I am fairly new to OpenCV and have no prior experience.
What I have tried:
I started by training my SVM on HOG features. My team gathered a few images, with people in them, from a UAV we had. I then trained the SVM on crops of those people. We got unsatisfactory results when we used the trained detector on a video of people shot from the sky. Moreover, processing each frame turned out to be very slow, so the system became unusable (it did work on still images to some extent).
My question:
I wanted to know if there is some other technique, library, etc. I could try to achieve good results. Please point me to the next step.

Easiest way to play music at higher pitch in iOS? [closed]

I suppose there are a lot of tutorials everywhere on the proper way to play .mp3 music for your game on iOS, but is it possible to effectively play music at a higher pitch on iOS?
What I mean by effectively is without loading the .mp3 file as a whole into memory, the way you might when playing little bits of .wav files that are shorter than a few seconds. Unless this kind of preloading is necessary for doing that?
OpenAL has pitch shifting with AL_PITCH.
More info about AL_PITCH and other methods here:
Real-time Pitch Shifting on the iPhone
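If you would rather avoid OpenAL (which Apple has deprecated on iOS), a different route (my suggestion, not what the answer recommends) is AVAudioEngine with AVAudioUnitTimePitch; the engine streams the file from disk, so the whole .mp3 does not need to be preloaded. A minimal sketch:

```swift
import AVFoundation

/// Plays an audio file pitched up by the given number of cents, streaming it
/// rather than loading the whole file into memory.
func playPitched(url: URL, cents: Float = 400) throws -> AVAudioEngine {
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    let pitch = AVAudioUnitTimePitch()
    pitch.pitch = cents          // +400 cents = four semitones up, tempo unchanged
    // pitch.rate = 1.25         // optionally change speed too (closer to what AL_PITCH does)

    engine.attach(player)
    engine.attach(pitch)
    engine.connect(player, to: pitch, format: nil)
    engine.connect(pitch, to: engine.mainMixerNode, format: nil)

    let file = try AVAudioFile(forReading: url)
    try engine.start()
    player.scheduleFile(file, at: nil)
    player.play()
    return engine                // keep a strong reference so playback continues
}
```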
