Is there a way to create a stereo sine signal with AudioKit that is out of phase?

Is there an example available for AudioKit that shows how to create a left/right sine wave pair whose channels are offset by a specific phase (in degrees)? E.g., a phase shift of 90° between the two signals. I could not find one.
Many thanks.
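For reference, the underlying math is straightforward: the right channel is sin(2πft + φ) with φ = π/2 for 90°, which is equivalent to delaying one channel by φ/(2πf) seconds. A framework-agnostic numpy sketch of rendering such a stereo buffer (sample rate, frequency, and duration are arbitrary illustration values; this is not AudioKit API code):

```python
import numpy as np

SAMPLE_RATE = 44100          # arbitrary choices for illustration
FREQ = 440.0                 # Hz
PHASE_DEG = 90.0             # desired left/right phase offset
DURATION = 1.0               # seconds

t = np.arange(int(SAMPLE_RATE * DURATION)) / SAMPLE_RATE
phase = np.deg2rad(PHASE_DEG)
left = np.sin(2 * np.pi * FREQ * t)
right = np.sin(2 * np.pi * FREQ * t + phase)    # shifted by 90 degrees
stereo = np.stack([left, right], axis=1)        # (n_samples, 2) buffer

# The same offset expressed as a time delay is phase/(2*pi*f) seconds,
# here (90/360)/440 ≈ 0.568 ms -- so delaying one channel also works.
```

In an audio framework, the time-delay form is often the easier one to realize: pan two identical oscillators hard left and right and delay one of them by that amount.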

Related

Stereo baseline between two frames

I am trying to calculate the stereo baseline (possibly in cm) between two arbitrary frames, after 6DoF tracking has been performed. I would like to extract further information by comparing two frames, but I somehow need to figure out the distance between the camera position in frame 1 and, say, frame 10. Any suggestion on how to proceed?
I have done this in a more explicit, non-automated way with 3D tracking software that solves the camera: I physically measure the distance between the solved camera positions at frames 1 and 10. But I'm stuck conceptually on how to do this in ARCore.
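For the arithmetic itself: if each frame's 6DoF pose is available as a 4×4 camera-to-world transform in meters (in ARCore you can read the camera pose's translation, or convert the pose to a matrix), the baseline is just the Euclidean distance between the two translation vectors. A minimal numpy sketch with hypothetical pose matrices:

```python
import numpy as np

def baseline_cm(pose_a, pose_b):
    """Distance between two camera positions, given 4x4 camera-to-world
    transforms with translations in meters; returns centimeters."""
    t_a = pose_a[:3, 3]   # translation column of the homogeneous matrix
    t_b = pose_b[:3, 3]
    return float(np.linalg.norm(t_b - t_a)) * 100.0

# Hypothetical example: camera moved 10 cm along x between frames 1 and 10.
pose_1 = np.eye(4)
pose_10 = np.eye(4)
pose_10[0, 3] = 0.10
print(baseline_cm(pose_1, pose_10))  # -> 10.0
```

Since ARCore's tracking is metric (it fuses IMU data), no extra scale recovery should be needed.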

drake: trouble connecting plant output to visualizer input

The background is that I am controlling a quad tilt-wing with states x = [X, Y, Z, φ, θ, ψ, Ẋ, Ẏ, Ż, φ̇, θ̇, ψ̇] and control input u = [ω₁², ω₂², ω₃², ω₄², θ₁, θ₂, θ₃, θ₄], where the ωᵢ are the propeller speeds and the θᵢ are the tilt angles of the wings. Basically, I assume that I can control the tilt angles instantaneously, and I only care about the position and attitude of the body of the plane. Therefore, in my code, num_states is 12 and num_inputs is 8.
For visualization purposes, I made a URDF file in which I modeled the tilting wings as revolute joints. When I load the URDF in Drake with a floating base, the RigidBodyTree generates 20 states. This is not surprising: I guess 12 of the states are x = [X, Y, Z, φ, θ, ψ, Ẋ, Ẏ, Ż, φ̇, θ̇, ψ̇], and the extra 8 states come from the revolute joints, [θ₁, θ₂, θ₃, θ₄, θ̇₁, θ̇₂, θ̇₃, θ̇₄].
Just today, when I was building the C++ code, this raised a problem when I used:
builder.Connect(quad_tilt_wing->get_output_port(0), visualizer->get_input_port(0));
because 12 is not equal to 20.
So I am wondering how I should solve this. I am thinking about two ways:
(1) One is to modify my dynamics so that my states are x = [X, Y, Z, φ, θ, ψ, Ẋ, Ẏ, Ż, φ̇, θ̇, ψ̇, θ₁, θ₂, θ₃, θ₄, θ̇₁, θ̇₂, θ̇₃, θ̇₄] and my controls are u = [ω₁², ω₂², ω₃², ω₄², θ₁, θ₂, θ₃, θ₄]. But in this way, I cannot think of a way of writing the dynamics in the form ẋ = f(x, u).
(2) The other is that I am wondering whether I can divide the input port of the visualizer into two ports, one of size 12 and one of size 8, and then feed the size-12 port from my dynamics x and the size-8 port directly from the control u. Is this doable?
Or do you have a better way of addressing this?
Take a look at http://drake.mit.edu/doxygen_cxx/group__primitive__systems.html
You should be able to Demultiplex your input to get a signal that has just the theta values (which you want to patch into your visualization input). It's tempting to say you could Multiplex the two signals together into the signal you push to your visualizer input, but I think you want to insert the inputs in the middle of the vector. So I suspect you'll want to do something like a Multiplexer, then a (matrix) Gain that permutes your vector elements into the right order.
Or, since you are writing the plant yourself, you can just make a second output port that contains the vector information that you need for visualization. That's the approach used in e.g. https://github.com/RobotLocomotion/drake/blob/master/examples/rimless_wheel/rimless_wheel.h#L31
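For what it's worth, here is how that Demultiplexer/Multiplexer/matrix-Gain wiring might look. This is a sketch using Drake's Python bindings rather than the C++ API of the question, and the permutation layout is an assumption about the visualizer's state ordering (check your RigidBodyTree's actual layout); `plant` and `visualizer` stand in for the poster's own systems:

```python
import numpy as np
from pydrake.systems.framework import DiagramBuilder
from pydrake.systems.primitives import Demultiplexer, Multiplexer, MatrixGain

builder = DiagramBuilder()

# Split u = [w1^2..w4^2, theta1..theta4] (size 8) into two size-4 ports;
# output port 1 then carries just the tilt angles.
demux = builder.AddSystem(Demultiplexer(8, 4))

# Stack the 12 plant states with 4 tilt angles and 4 tilt rates into a
# 20-dim signal (a zero source could stand in for unmodeled tilt rates).
mux = builder.AddSystem(Multiplexer([12, 4, 4]))

# The visualizer wants the tilt angles in the middle of the vector
# (positions first, then velocities), so permute with a matrix Gain.
order = (list(range(0, 6))       # X, Y, Z, phi, theta, psi
         + list(range(12, 16))   # theta_1..4, inserted after the pose
         + list(range(6, 12))    # body velocities
         + list(range(16, 20)))  # thetadot_1..4
P = np.zeros((20, 20))
for row, col in enumerate(order):
    P[row, col] = 1.0               # output[row] = stacked_input[col]
permute = builder.AddSystem(MatrixGain(P))

builder.Connect(mux.get_output_port(0), permute.get_input_port(0))
# Remaining connections, with `plant` and `visualizer` as your systems:
# builder.Connect(plant.get_output_port(0), mux.get_input_port(0))
# builder.Connect(demux.get_output_port(1), mux.get_input_port(1))
# builder.Connect(permute.get_output_port(0), visualizer.get_input_port(0))
```

The second-output-port approach from the rimless_wheel example avoids all of this wiring, at the cost of the plant knowing about the visualizer's expected layout.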

OpenCV: method of measuring an object's angular displacement?

I would like to track an object using a webcam and compare every two consecutive frames to see the change in the object's angular displacement. I was thinking of first extracting the object from the image using background subtraction, then creating a bounding box around the object, and then measuring the angular displacement of the box. Is there any way to implement this in OpenCV?
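The pipeline described in the question is straightforward to prototype in OpenCV's Python bindings. A minimal sketch, with the caveat that cv2.minAreaRect reports angles only up to the box's symmetry, so frame-to-frame differences may need unwrapping:

```python
import cv2

cap = cv2.VideoCapture(0)                        # webcam
backsub = cv2.createBackgroundSubtractorMOG2()   # background subtraction
prev_angle = None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = backsub.apply(frame)
    # Keep the largest foreground contour as "the object".
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4 signature
    if contours:
        obj = max(contours, key=cv2.contourArea)
        (cx, cy), (w, h), angle = cv2.minAreaRect(obj)  # rotated bounding box
        if prev_angle is not None:
            # Angular displacement between consecutive frames.
            print(f"delta angle: {angle - prev_angle:.1f} degrees")
        prev_angle = angle
    if cv2.waitKey(1) == 27:                     # Esc quits
        break
cap.release()
```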
I have some experimental experience with this.
Try calculating motion vectors and look at the orientation and length correlations between the two frames. That will give you a neat idea of the type of movement, e.g.:
Motion (length correlation)
Direction (collective direction of a group of motion vectors)
Depth (direction and relative length of motion; e.g., if there is depth, the directions will be the same but the motion vectors will be shorter)
Rotation and scaling can also be seen in the group behavior of the motion vectors.
Try it, and if you want more clarification, write back.
Good Luck and Happy Coding.
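For concreteness, here is one way to compute those motion vectors in OpenCV (Python). The answer does not name a specific algorithm; dense Farneback optical flow is my stand-in:

```python
import cv2
import numpy as np

def motion_vectors(prev_frame, next_frame):
    """Dense motion vectors between two frames, as (magnitude, angle)."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    next_gray = cv2.cvtColor(next_frame, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                        pyr_scale=0.5, levels=3, winsize=15,
                                        iterations=3, poly_n=5,
                                        poly_sigma=1.2, flags=0)
    mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    return mag, ang

# Uniform translation shows up as one dominant angle; rotation shows up
# as angles that vary systematically around the object's center.
# mag, ang = motion_vectors(frame1, frame2)
# print("dominant direction (rad):", np.median(ang), "mean length:", mag.mean())
```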

iOS Camera Color Recognition in Real Time: Tracking a Ball

I have been looking around for a bit and know that people are able to track faces with Core Image and OpenGL. However, I am not sure where to start the process of tracking a colored ball with the iOS camera.
Once I have a lead on tracking the ball, I hope to create something that detects when the ball changes direction.
Sorry I don't have source code, but I am unsure where to even start.
The key point is image preprocessing and filtering. You can use the camera APIs to get the video stream from the camera. Take a snapshot from it, then apply a Gaussian blur (spatial smoothing), then a luminance average threshold filter (to get a black-and-white image). After that, morphological preprocessing (the opening and closing operators) would be wise, to remove the small noise. Then run an edge-detection algorithm (for example, a Prewitt operator). After these steps only the edges remain, and your ball should be a circle (if the recording environment was ideal). You can then use a Hough transform to find the center of the ball. Record the ball position, and in the next frame you only need to process the small part of the picture around the ball.
Another keyword to search for is blob detection.
A fast library for image processing (on the GPU, with OpenGL) is Brad Larson's GPUImage library: https://github.com/BradLarson/GPUImage
It implements all of the needed filters (except the Hough transform).
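For prototyping, the pipeline above translates almost step by step into OpenCV (Python here rather than GPUImage/iOS). In this sketch, Otsu thresholding stands in for the luminance average threshold, and cv2.HoughCircles runs its own internal Canny edge detection in place of the Prewitt step:

```python
import cv2
import numpy as np

def find_ball(frame, last=None, margin=80):
    """One frame of the pipeline described above, as an OpenCV sketch."""
    x0 = y0 = 0
    if last is not None:
        # Only process a small region around the last known position.
        lx, ly = last
        x0, y0 = max(0, lx - margin), max(0, ly - margin)
        frame = frame[y0:ly + margin, x0:lx + margin]

    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (9, 9), 2)        # spatial smoothing
    _, bw = cv2.threshold(blurred, 0, 255,             # black-and-white via
                          cv2.THRESH_BINARY | cv2.THRESH_OTSU)  # Otsu threshold
    kernel = np.ones((5, 5), np.uint8)
    bw = cv2.morphologyEx(bw, cv2.MORPH_OPEN, kernel)  # opening: drop noise
    bw = cv2.morphologyEx(bw, cv2.MORPH_CLOSE, kernel) # closing: fill holes
    circles = cv2.HoughCircles(bw, cv2.HOUGH_GRADIENT, dp=1.5, minDist=50,
                               param1=100, param2=30,
                               minRadius=5, maxRadius=120)
    if circles is None:
        return None
    cx, cy, r = circles[0][0]
    return int(cx) + x0, int(cy) + y0    # back to full-frame coordinates
```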
The tracking process can be defined as follows:
Take the initial coordinate and dimensions of an object with given visual characteristics (image features).
In the next video frame, find the same visual characteristics near the coordinate from the last frame.
"Near" means allowing for basic transformations relative to the last frame:
translation in each direction;
scale;
rotation.
How much these transformations vary is closely tied to the frame rate: the higher the frame rate, the nearer the position will be in the next frame.
Marvin Framework provides plug-ins and examples to perform this task. It is not compatible with iOS yet; however, it is open source, and I think you could port the source code easily.
This video demonstrates some tracking features, starting at 1:10.
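As a concrete illustration of "find the same visual characteristics near the last coordinate", here is a minimal OpenCV/Python sketch using plain template matching over a small search window; the function name and margin are illustrative, and Marvin's own plug-ins work differently:

```python
import cv2

def track_step(frame, template, last_xy, margin=40):
    """Search for `template` near last_xy, allowing small translations.
    Assumes the search window stays inside the frame."""
    x, y = last_xy
    th, tw = template.shape[:2]
    # Search window: template footprint grown by `margin` px on each side.
    x0, y0 = max(0, x - margin), max(0, y - margin)
    window = frame[y0:y0 + th + 2 * margin, x0:x0 + tw + 2 * margin]
    scores = cv2.matchTemplate(window, template, cv2.TM_CCOEFF_NORMED)
    _, best, _, best_xy = cv2.minMaxLoc(scores)
    # Translate the window-relative match back to frame coordinates.
    return (x0 + best_xy[0], y0 + best_xy[1]), best
```

This handles only the translation part of the transformations listed above; scale and rotation would need matching against scaled/rotated templates or a feature-based matcher.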

XNA value of Apply3D positions

I'm currently working on 3D positional audio in my 3D XNA game (using SoundEffectInstance); however, I'm having trouble finding the correct values for the positions of the emitters and listener.
I started out setting the position of my listener to the camera position (it's a first-person game), and the positions of the various emitters to the positions of the objects emitting the sound. Doing this muted the sound completely, compared to before I used the Apply3D method.
I did some experimenting with the values and found that after I made the position values much, much smaller, I started hearing the sound. My map spans values from 0 to 5000 in the x/z plane (with movement only between 0 and ~500 on the y axis), so the distance between the listener and an emitter will generally be large compared to the values at which I could hear anything at all (between 0 and 1).
Is there any way to control what "close" and "far away" mean for the SoundEffectInstance? Or am I supposed to normalize the distance values? I read through several guides on 3D sound, but I have not seen anything about normalizing or controlling the distance values.
After some more testing, I found that simply dividing the position values by a factor (around 500 seems right for me) gives a good result. I don't know if this is the way it's supposed to be done, but it seems to be working fine.
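The workaround amounts to a uniform rescaling of world coordinates into the roughly unit-scale space where the 3D audio math behaves sensibly. A language-neutral sketch of that arithmetic (Python for illustration; the game code would be C#, and WORLD_TO_AUDIO is just the experimentally found factor):

```python
# World units run ~0..5000, but the audio engine behaves as if "audible"
# distances are on the order of 0..1 units, so rescale uniformly.
WORLD_TO_AUDIO = 1.0 / 500.0   # tuning constant found by experiment

def to_audio_space(position):
    """Scale a world-space (x, y, z) position for listener/emitter use."""
    x, y, z = position
    s = WORLD_TO_AUDIO
    return (x * s, y * s, z * s)

# Applied to both sides before calling Apply3D:
# listener.Position = to_audio_space(camera_pos)
# emitter.Position  = to_audio_space(object_pos)
```

For what it's worth, XNA also exposes a static SoundEffect.DistanceScale property that rescales distance attenuation globally, which may be the intended knob for this.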
