How to show the current position (GPX file) while the HTML player is playing - openlayers-3

Well, here is the thing. I'm working on a project: an HTML5 player with a real-time GPS position shown on a map. That means I need to show the current position on the map while the video is playing. I have both the video file and the GPX file that belongs to it. I have already built the player part and successfully added a map below the player; the map can already show the track of the video. What I need to do next is show the position on the map (maybe a marker or icon that moves along the track) while the video plays, keeping the two synchronized. So is there any function or method in OL3 that can do this? What I have in mind is parsing the GPX file to extract the time and position data, then matching the video's current time against it, but that seems like a lot of calculation. I would appreciate it if you could help me out with this!

You can look at this example:
http://openlayers.org/en/v3.14.2/examples/feature-move-animation.html
Once you have calculated the path of your point, I suggest you use the map's postcompose event to keep the rendering smooth. Matching the video's current time to the GPX data is also less calculation than you might think: keep the trackpoints sorted by timestamp, binary-search for the current time on each update, and linearly interpolate between the two neighbouring points.

Related

How to detect main sound direction from AVAudioSession

I want to detect the main direction of the sound recorded from the iPhone. For example, I want to detect whether the sound comes from the side of the "front" or the "rear" camera.
https://developer.apple.com/documentation/avfoundation/avaudiosessiondatasourcedescription
This link describes how to set the data source, but not how to detect it in real time.
UPDATE:
Example use:
I start recording with the front and back cameras at the same time. I want to detect whether the audio comes from the front or the rear so I can switch cameras automatically.
Is there any way?
Thanks!
You can iterate over AVAudioSession.inputDataSources to check the available sources and find the one you want, then apply it with AVAudioSession.setInputDataSource(_:) (to pick a whole port instead, iterate availableInputs and pass one to setPreferredInput(_:)). If you only need to check the current input rather than set it, use AVAudioSession.currentRoute.inputs.
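For reference, a minimal Swift sketch of that enumeration; selectMicrophone is an illustrative helper name, and it assumes the audio session's category is already set and the session is active:

```swift
import AVFoundation

// Minimal sketch: pick the built-in mic data source whose orientation
// matches the camera in use. Assumes the session category (e.g.
// .playAndRecord) has already been set and the session is active.
func selectMicrophone(facing orientation: AVAudioSession.Orientation) {
    let session = AVAudioSession.sharedInstance()
    guard let sources = session.inputDataSources else { return }
    if let match = sources.first(where: { $0.orientation == orientation }) {
        do {
            try session.setInputDataSource(match)
        } catch {
            print("Could not set input data source: \(error)")
        }
    }
}

// Usage: record from the mic facing the same way as the rear camera.
selectMicrophone(facing: .back)
```

Note that this selects which mic you record from; to infer the direction a sound is coming from, you would still have to compare levels between the front- and back-facing data sources yourself.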

How to implement a camera taking GIF in iOS?

I want to implement a feature that enables users to make a GIF directly from their camera.
In detail: I want to show users a camera view and a record button. When the button is tapped, the camera starts to record video. Behind the scenes, however, the camera is actually taking photos at a constant rate, say one shot per 0.5 seconds. When the recording ends, we have an array of images and can join them into a GIF.
I think there might be two approaches:
1. Directly taking images: Use AVCaptureStillImageOutput's -captureStillImageAsynchronouslyFromConnection method. But it blocks the UI every time it is called.
2. Take a video and extract several images from it. I have checked video-recording libraries such as PBJVision and SCRecorder and noticed that taking video typically means writing data to a local mp4 file. I cannot figure out how to extract images at specific time intervals from a video file. Also, is there a way to keep the video in memory?
Could anyone help?
Creating a GIF:
Create and export an animated gif via iOS?
Convert Images to gif using ios
Extracting images from video:
Get a particular frame by time value using AVAssetReader
Similar here: Creating a Movie from Images
How do I export UIImage array as a movie?
You can use a library called 'Regift' by Matthew Palmer, which will convert video to GIF.
Here it is: https://github.com/matthewpalmer/Regift
You can also check out the following answer here on SO:
https://stackoverflow.com/a/28150109/3288936
Hope this will help! :)
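If you would rather extract the frames yourself instead of going through a library, here is a minimal sketch using AVAssetImageGenerator; videoURL and the 0.5 s interval are assumptions matching the question, and error handling is omitted:

```swift
import AVFoundation
import UIKit

// Minimal sketch: pull one frame per 0.5 s out of a recorded clip.
// "videoURL" is assumed to be the mp4 your recorder wrote to disk.
func extractFrames(from videoURL: URL, interval: Double = 0.5,
                   completion: @escaping ([UIImage]) -> Void) {
    let asset = AVAsset(url: videoURL)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true
    // Tighten the tolerances if you need frames at exact timestamps.
    generator.requestedTimeToleranceBefore = .zero
    generator.requestedTimeToleranceAfter = .zero

    let times = stride(from: 0.0, to: asset.duration.seconds, by: interval)
        .map { NSValue(time: CMTime(seconds: $0, preferredTimescale: 600)) }
    guard !times.isEmpty else { return completion([]) }

    var images: [UIImage] = []
    var remaining = times.count
    // The handler is called once per requested time, in request order,
    // on a background queue.
    generator.generateCGImagesAsynchronously(forTimes: times) { _, cgImage, _, _, _ in
        if let cgImage = cgImage {
            images.append(UIImage(cgImage: cgImage))
        }
        remaining -= 1
        if remaining == 0 { completion(images) }
    }
}
```

From there, the UIImage array can go into any of the image-to-GIF answers linked above.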

Show time value on recorded video in iPhone (Video Filtering)

I am currently working on an application in which I am supposed to display the recording time while video recording is going on. What's more interesting is that this recording time should also appear in the recorded video itself.
So I think I have two things to consider:
1.) Show a recording-time overlay while recording the video.
2.) The recording time should also be visible in the recorded video.
I know we can draw a static text overlay while recording the video, and that we can add a text layer to an already recorded video. But in my case, the time value has to change every second.
I have searched a lot on Google and Stack Overflow and tried different solutions, using GPUImage and using AVMutableVideoComposition with a CALayer, but they all handle static values only.
Maybe I am close to the solution but just not able to see it. Can anyone guide me on how to achieve this?
Any help would be greatly appreciated.
Thanks
Use AVAssetReader for the recorded video. Get each CMSampleBufferRef, read its timestamp, draw the time onto the sample buffer, and write the buffer out through an AVAssetWriterInputPixelBufferAdaptor. A similar approach works for video that is being recorded live.
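A minimal Swift sketch of that per-frame loop, under the assumption that you have already configured an AVAssetWriter and its pixel-buffer adaptor (writer setup and back-pressure handling are omitted):

```swift
import AVFoundation
import UIKit

// Minimal sketch: read decoded frames, stamp each with its presentation
// time, and hand it to an already-configured pixel-buffer adaptor.
func burnTimestamps(into asset: AVAsset,
                    adaptor: AVAssetWriterInputPixelBufferAdaptor) throws {
    guard let track = asset.tracks(withMediaType: .video).first else { return }
    let reader = try AVAssetReader(asset: asset)
    let output = AVAssetReaderTrackOutput(
        track: track,
        outputSettings: [kCVPixelBufferPixelFormatTypeKey as String:
                         kCVPixelFormatType_32BGRA])
    reader.add(output)
    reader.startReading()

    while let sample = output.copyNextSampleBuffer() {
        let time = CMSampleBufferGetPresentationTimeStamp(sample)
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sample) else { continue }

        CVPixelBufferLockBaseAddress(pixelBuffer, [])
        if let context = CGContext(
            data: CVPixelBufferGetBaseAddress(pixelBuffer),
            width: CVPixelBufferGetWidth(pixelBuffer),
            height: CVPixelBufferGetHeight(pixelBuffer),
            bitsPerComponent: 8,
            bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
            space: CGColorSpaceCreateDeviceRGB(),
            bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue |
                        CGBitmapInfo.byteOrder32Little.rawValue) {
            // Flip to UIKit's top-left origin so the text isn't upside down.
            context.translateBy(x: 0, y: CGFloat(CVPixelBufferGetHeight(pixelBuffer)))
            context.scaleBy(x: 1, y: -1)
            UIGraphicsPushContext(context)
            let seconds = Int(time.seconds)
            let label = String(format: "%02d:%02d", seconds / 60, seconds % 60)
            (label as NSString).draw(at: CGPoint(x: 20, y: 20), withAttributes: [
                .font: UIFont.boldSystemFont(ofSize: 36),
                .foregroundColor: UIColor.white])
            UIGraphicsPopContext()
        }
        CVPixelBufferUnlockBaseAddress(pixelBuffer, [])

        // Real code must wait for adaptor.assetWriterInput.isReadyForMoreMediaData.
        adaptor.append(pixelBuffer, withPresentationTime: time)
    }
}
```

For live recording, the same drawing can be done inside captureOutput(_:didOutput:from:) before appending each buffer.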

iOS: Draw on top of AV video, then save the drawing in the video file

I'm working on an iPad app that records and plays videos using AVFoundation classes. I have all of the code for basic record/playback in place, and now I would like to add a feature that allows the user to draw and make annotations on the video, something I believe will not be too difficult. The harder part, and something I have not been able to find any examples of, is combining the drawing and annotations into the video file itself. I suspect this part is accomplished with AVComposition, but I have no idea exactly how. Your help would be greatly appreciated.
Mark
I do not think that you can actually save a drawing into a video file in iOS. You could, however, save the drawing separately and synchronize it with the video using a transparent overlay view. In other words, say the user circled something at 3 minutes 42 seconds into the video; when the video is played back, you overlay the saved drawing onto it at the 3:42 mark. It's not what you want, but I think it is as close as you can get right now.
EDIT: Actually, there might be a way after all. Take a look at this tutorial. I have not read the whole thing, but it seems to cover the overlay function you need.
http://www.raywenderlich.com/30200/avfoundation-tutorial-adding-overlays-and-animations-to-videos
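The core technique in that tutorial is AVFoundation's Core Animation tool, which composites a CALayer over the video during export. A minimal sketch, assuming drawingLayer already contains the user's annotations (error handling and audio are omitted):

```swift
import AVFoundation
import UIKit

// Minimal sketch: bake a CALayer overlay into a video on export.
// "drawingLayer" is assumed to hold the user's annotations.
func export(asset: AVAsset, drawingLayer: CALayer, to outputURL: URL) {
    guard let track = asset.tracks(withMediaType: .video).first else { return }
    let size = track.naturalSize

    let videoLayer = CALayer()
    let parentLayer = CALayer()
    parentLayer.frame = CGRect(origin: .zero, size: size)
    videoLayer.frame = parentLayer.frame
    drawingLayer.frame = parentLayer.frame
    parentLayer.addSublayer(videoLayer)   // the video frames render here
    parentLayer.addSublayer(drawingLayer) // annotations composited on top

    let composition = AVMutableVideoComposition(propertiesOf: asset)
    composition.animationTool = AVVideoCompositionCoreAnimationTool(
        postProcessingAsVideoLayer: videoLayer, in: parentLayer)

    guard let session = AVAssetExportSession(
        asset: asset, presetName: AVAssetExportPresetHighestQuality) else { return }
    session.videoComposition = composition
    session.outputURL = outputURL
    session.outputFileType = .mp4
    session.exportAsynchronously {
        print("Export finished: \(session.status == .completed)")
    }
}
```

If an annotation should only appear at a certain moment (the 3:42 example above), attach a CAAnimation to the layer with its beginTime offset from AVCoreAnimationBeginTimeAtZero, since a beginTime of 0 would be interpreted as "now".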

Playing images sequentially in iOS

Suppose I have multiple frames from a video. I would like to play these frames in a movie player, but the frames can change at any point in time. So it should work like a callback: the player requests each frame, and the program provides the frame in response to that callback.
Is this possible in iOS?
Please guide me in the right direction to achieve this.
Thanks in advance
mia.
You are not going to be able to implement that type of approach using the built-in movie player. But if you just want to loop through video frames stored as PNG files in a directory, that would not be too hard to implement. You could take a look at this code as a starting point; the source is completely free and does what you need.
PNGAnimatorDemo.zip
If you want to do more advanced things, take a look at the AVImageFrameDecoder class in AVAnimator (google it to find out more).
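If you prefer to build the callback model yourself rather than use a library, here is a minimal sketch driven by CADisplayLink; FramePlayerView and frameProvider are illustrative names, not an existing API:

```swift
import UIKit

// Minimal sketch of the callback model the question asks for: a display
// link fires at a fixed rate and pulls each frame from a provider closure,
// so the frames can change at any time.
final class FramePlayerView: UIImageView {
    /// Called once per tick; return the next frame to show (or nil to hold).
    var frameProvider: ((TimeInterval) -> UIImage?)?

    private var displayLink: CADisplayLink?
    private var startTime: CFTimeInterval = 0

    func play(framesPerSecond: Int = 24) {
        startTime = CACurrentMediaTime()
        let link = CADisplayLink(target: self, selector: #selector(tick))
        link.preferredFramesPerSecond = framesPerSecond
        link.add(to: .main, forMode: .common)
        displayLink = link
    }

    // Call stop() to break the display link's strong reference to self.
    func stop() {
        displayLink?.invalidate()
        displayLink = nil
    }

    @objc private func tick(_ link: CADisplayLink) {
        let elapsed = CACurrentMediaTime() - startTime
        if let frame = frameProvider?(elapsed) {
            image = frame
        }
    }
}
```

Because the provider closure is consulted on every tick, the frames can be swapped out at any point, which matches the requirement in the question.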
