Does eggplant support motion detect? - eggplant

I am trying to automate an application in which I have to check whether a video is playing in full-screen mode. Specifically, I need to verify that there is motion during playback. Can anyone please help me here? I tried capturing multiple frames, but that isn't an efficient approach, since the frames keep changing depending on the video.
Thanks
Sanket.

Eggplant doesn't currently have any built-in support for motion detection. If you are only verifying that a video is playing, you could check whether the color has changed at some key points within the video rectangle, for example:
-- Sample the color at a point inside the video frame, wait, then sample again
set color1 to colorAtLocation(someLocationWithinVideoFrame)
wait 1
set color2 to colorAtLocation(someLocationWithinVideoFrame)
-- If the color is identical after one second, assume nothing is playing
if color1 = color2 then
    set videoPlaying to false
else
    set videoPlaying to true
end if

Related

Different methods of displaying camera under SceneKit

I'm developing an AR application that can use a few different engines. One of them is based on SceneKit (not ARKit).
I used to make the SceneView background transparent and just display an AVCaptureVideoPreviewLayer under it. But this created a problem later: it turns out that if you use a clear backgroundColor for the SceneView and then add a floor node with diffuse.contents = UIColor.clear (a transparent floor), shadows are not displayed on it. And the goal for now is to have shadows in this engine.
I think the best way to get shadows working is to set the camera preview as SCNScene.background.contents. For this I tried using AVCaptureDevice.default(for: video). This worked, but it has one issue: you can't use the video format you want, because SceneKit automatically changes the format when the device is assigned. I even asked Apple for help using one of the two technical support requests you can send them, but they replied that for now there is no public API that would let me use this with the format I would like. On an iPhone 6s the format changes to 30 FPS, and I need 60 FPS. So this option is no good.
Is there some other way to assign the camera preview to the scene's background property? From what I've read, I can also use a CALayer for this property, so I tried assigning an AVCaptureVideoPreviewLayer, but this resulted in a black color only and no video. I updated the layer's frame to the correct size, but it still didn't work. Maybe I did something wrong, and there is a way to use this AVCaptureVideoPreviewLayer or something else?
Can you suggest some possible solutions? I know I could use ARKit, and I do for another engine, but for this particular one I need to keep using SceneKit.
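Not from the original thread, just a hedged sketch of one possible workaround: run your own AVCaptureSession (so you keep control of the device format and frame rate yourself), convert each frame's pixel buffer to an MTLTexture through a CVMetalTextureCache, and assign that texture to scene.background.contents every frame. The class name CameraBackgroundFeed and the wiring are assumptions, not code from the question.

import AVFoundation
import Metal
import SceneKit

// Sketch: pushes camera frames into an SCNScene background as Metal textures,
// so SceneKit never takes over the capture device or its format.
final class CameraBackgroundFeed: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "camera.background.feed")
    private var textureCache: CVMetalTextureCache?
    private weak var scene: SCNScene?

    init?(scene: SCNScene, device: MTLDevice) {
        self.scene = scene
        super.init()
        guard CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device, nil, &textureCache) == kCVReturnSuccess,
              let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera) else { return nil }

        session.beginConfiguration()
        session.addInput(input)
        let output = AVCaptureVideoDataOutput()
        output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
        output.setSampleBufferDelegate(self, queue: queue)
        session.addOutput(output)
        session.commitConfiguration()
        // The capture device is never handed to SceneKit, so camera.activeFormat /
        // frame duration (e.g. 60 FPS) stays under your control here.
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let cache = textureCache,
              let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        var cvTexture: CVMetalTexture?
        CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, cache, pixelBuffer, nil,
                                                  .bgra8Unorm, CVPixelBufferGetWidth(pixelBuffer),
                                                  CVPixelBufferGetHeight(pixelBuffer), 0, &cvTexture)
        guard let cvTexture = cvTexture, let texture = CVMetalTextureGetTexture(cvTexture) else { return }

        // scene.background is an SCNMaterialProperty, which accepts an MTLTexture as contents.
        DispatchQueue.main.async { [weak self] in
            self?.scene?.background.contents = texture
        }
    }
}

Whether updating background.contents every frame stays smooth at 60 FPS on an iPhone 6s is something you would have to profile; the sketch only shows that the per-frame assignment is possible.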

Overlaying Alpha Graphics in iOS (Swift)?

I'm curious whether anyone could help me conceptualize the best and most efficient approach to accomplish an effect wherein a video/image sequence is overlaid on top of a view. The best example would be the stock Weather app on iOS. A screenshot:
You can check out the animation here. I've done some testing and have found that the easiest approach is to create my graphics with an alpha channel in After Effects, then export them as a .png image sequence and use the following code:
// Build the animation from the exported PNG image sequence (150 frames)
myAnimation?.animationImages = [UIImage]()
for index in 0..<150 {
    let frameName = "image_\(index)"
    myAnimation?.animationImages?.append(UIImage(named: frameName)!)
}
myAnimation?.animationDuration = 5.0
myAnimation?.startAnimating()
The issue I keep facing is that, for a five-second animation loop, I'm looking at 150 frames of content. Creating these graphics at @3x and @2x resolutions, even with app thinning, makes my app bundle huge. Additionally, and more importantly, appending 150 images to the array seems to create a fair amount of lag when playing the content.
If I were to build a weather app, like the example above, I'd also be looking at dozens of different weather conditions and would need hundreds of individual frames.
Are there better ways of going about this? Thanks!
I would try using AVPlayer with an .mp4. Video with low contrast (like the Weather app) will help keep the size down and the impact minimal.
An actual working and optimal solution for alpha-channel videos in an iOS app can be found in this answer. Keep in mind that if you compose multiple videos over multiple backgrounds, that will inflate your app, since each video takes up space in the download; there is no way around that. Also, decoding all that PNG image data on the fly will burn a lot of CPU.
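Not from the original answer, just a minimal sketch of the AVPlayer route it suggests, using AVQueuePlayer plus AVPlayerLooper so the clip repeats; the asset name weather_overlay.mp4 and the view class are placeholders:

import AVFoundation
import UIKit

// Sketch: loop a small bundled .mp4 in a layer on top of other content.
// "weather_overlay" is a placeholder asset name, not from the original post.
final class OverlayPlayerView: UIView {
    private let player = AVQueuePlayer()
    private var looper: AVPlayerLooper?

    override init(frame: CGRect) {
        super.init(frame: frame)
        guard let url = Bundle.main.url(forResource: "weather_overlay", withExtension: "mp4") else { return }
        looper = AVPlayerLooper(player: player, templateItem: AVPlayerItem(url: url)) // keeps the clip repeating

        let playerLayer = AVPlayerLayer(player: player)
        playerLayer.frame = bounds          // in a real view, update this in layoutSubviews()
        playerLayer.videoGravity = .resizeAspectFill
        layer.addSublayer(playerLayer)
        player.play()
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}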

How to add filters for the removal of noise from a recorded audio file in iOS

I am making an iPhone app to record the human voice and play it back, just like Talking Tom. Everything works fine, but I am not able to figure out how to remove the background noise from the recorded audio file.
I know low-pass filtering is one option, but I don't know how to code it. Can anyone help me implement a low-pass filter, or something else, to remove the background noise from a recorded audio file in iOS?
I don't do iOS, but there is an example of a low-pass filter for the accelerometer on the iPhone at Accelerometer Low Pass Filtering, which you may be able to adapt for your use.
Basically, the filtered value at time x = unfiltered value at time x * alpha + filtered value at time x-1 * (1 - alpha),
and the filtered value at time 0 = unfiltered value at time 0 * alpha
(this assumes the filtered value at time -1 is 0).
You will need to set alpha by trial and error. Somewhere in the range 0.01 to 0.1 might be useful.
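Not from the answer itself; here is a minimal sketch of that same one-pole filter applied to an array of audio samples in Swift. The sample values and alpha are placeholders, and in a real app the samples would come from the recorded file (for example via AVAudioFile):

import Foundation

// Simple one-pole (exponential) low-pass filter, as described above:
// filtered[n] = alpha * input[n] + (1 - alpha) * filtered[n - 1]
func lowPassFiltered(_ samples: [Float], alpha: Float) -> [Float] {
    var output = [Float]()
    output.reserveCapacity(samples.count)
    var previous: Float = 0          // filtered value at time -1 assumed to be 0
    for sample in samples {
        let filtered = alpha * sample + (1 - alpha) * previous
        output.append(filtered)
        previous = filtered
    }
    return output
}

// Example: alpha must be tuned by trial and error (the answer suggests roughly 0.01 to 0.1).
let smoothed = lowPassFiltered([0.0, 0.4, 0.9, 0.3, -0.2], alpha: 0.05)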

Detect motion with iPad camera while doing other things

I have to have a video play back in a loop until I detect some motion (activity) with the front camera of the iPad.
The video does not need to be recorded or played back later, and I do not have to show the live camera video on the iPad.
Currently the user has to tap the screen to stop the video, but the customer wants a 'cool' motion-detection feature. I am not interested in face detection, just motion.
Are there any examples of this?
thanks,
EDIT
Well, currently the only workaround I've found is detecting luminance... Just grab an image from every frame (or every nth frame) and measure its luminosity, then do the same for an image from a later frame; if the difference is large enough, something has changed :-)
Just find a good threshold for that difference and you're ready to go...
Of course I would prefer a more robust solution...
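Not from the question; just a hedged sketch of the luminance workaround described in the edit, using a CIAreaAverage filter on frames delivered by an AVCaptureVideoDataOutput. The capture-session setup is omitted, and the 0.05 threshold and class name are illustrative:

import AVFoundation
import CoreGraphics
import CoreImage

// Sketch: compare the average brightness of successive camera frames and treat a
// big enough jump as "motion". Register an instance of this class as the
// sample-buffer delegate of an AVCaptureVideoDataOutput on a running session.
final class LuminanceMotionDetector: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let context = CIContext()
    private var previousLuminance: CGFloat?
    var onMotion: (() -> Void)?

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let image = CIImage(cvPixelBuffer: pixelBuffer)

        // CIAreaAverage reduces the whole frame to a single average-color pixel.
        guard let filter = CIFilter(name: "CIAreaAverage",
                                    parameters: [kCIInputImageKey: image,
                                                 kCIInputExtentKey: CIVector(cgRect: image.extent)]),
              let averaged = filter.outputImage else { return }

        var pixel = [UInt8](repeating: 0, count: 4)
        context.render(averaged, toBitmap: &pixel, rowBytes: 4,
                       bounds: CGRect(x: 0, y: 0, width: 1, height: 1),
                       format: .RGBA8, colorSpace: CGColorSpaceCreateDeviceRGB())

        // Rough perceived luminance from the averaged RGB values, normalized to 0...1.
        let luminance = (0.299 * CGFloat(pixel[0]) + 0.587 * CGFloat(pixel[1]) + 0.114 * CGFloat(pixel[2])) / 255
        defer { previousLuminance = luminance }

        // 0.05 is an arbitrary threshold; tune it for your lighting conditions.
        if let previous = previousLuminance, abs(luminance - previous) > 0.05 {
            onMotion?()
        }
    }
}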

Can I create a blue-screen effect with MPMoviePlayerController?

I am showing a video inline (not fullscreen) using MPMoviePlayerController. I am using this class because it is the only player I got working with a remote file (progressive download) rather than a local file.
Is there any way to create a blue-screen effect? What I basically mean is: decide on a certain RGB value and set the alpha of matching pixels to 0. Is it possible to perform any per-frame image processing with MPMoviePlayerController?
You cannot use MPMoviePlayerController for that kind of movie processing.
Still, there are ways to accomplish what you are asking for. You may use AVAssetWriter etc.
Check my answer on a similar question.
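Not part of the original answer (which points at AVAssetWriter); purely as an illustration of per-frame access with AVFoundation, here is a hedged sketch that attaches an AVPlayerItemVideoOutput to a player item and pulls pixel buffers that could then be chroma-keyed with Core Image before display. The URL and the keying step are placeholders:

import AVFoundation
import CoreImage
import QuartzCore

// Sketch: pull per-frame pixel buffers from a playing remote video so they can be
// processed (e.g. keyed against a chosen RGB value) before being drawn yourself.
let url = URL(string: "https://example.com/movie.mp4")!   // placeholder URL
let item = AVPlayerItem(url: url)
let output = AVPlayerItemVideoOutput(pixelBufferAttributes:
    [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
item.add(output)

let player = AVPlayer(playerItem: item)
player.play()

// Call this from a CADisplayLink callback (one tick per screen refresh).
func processCurrentFrame() {
    let time = output.itemTime(forHostTime: CACurrentMediaTime())
    guard output.hasNewPixelBuffer(forItemTime: time),
          let pixelBuffer = output.copyPixelBuffer(forItemTime: time, itemTimeForDisplay: nil) else { return }

    let frame = CIImage(cvPixelBuffer: pixelBuffer)
    // ... apply a chroma-key CIFilter here (for example a color cube built around the
    // chosen RGB value) and render the result into your own view or layer.
    _ = frame
}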
