Custom image for the Dynamic Island on iOS

My app plays media on iOS and sets album art on [MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo when it does so. Unfortunately, this album art does not look good in the Dynamic Island, because the space it gets there is very small.
Is it possible to set a different image for the Dynamic Island? I considered creating a Live Activity, which would let me customize it, but that seems like an overkill solution, as the app doesn't need a Live Activity: the OS already covers all of the app's media needs.
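For reference, here is a minimal sketch of how the artwork is typically supplied (the function and artwork source are illustrative, not taken from my app). The MPMediaItemArtwork request handler receives the size the system asks for, which is the closest built-in hook for returning a different rendition at small sizes, but it still cannot target the Dynamic Island specifically:

```swift
import MediaPlayer
import UIKit

// Illustrative sketch: function name and artwork source are placeholders.
func updateNowPlayingInfo(title: String, artwork: UIImage) {
    var info: [String: Any] = [MPMediaItemPropertyTitle: title]

    // The request handler is called with the size the system wants.
    // Returning a size-specific image here is the only built-in way to
    // tailor the artwork, and the same artwork is used everywhere,
    // including the Dynamic Island.
    info[MPMediaItemPropertyArtwork] = MPMediaItemArtwork(boundsSize: artwork.size) { _ in
        artwork
    }

    MPNowPlayingInfoCenter.default().nowPlayingInfo = info
}
```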

Related

YouTube IFrame Player API Developer Policy Inquiries

I would like to ask about the developer policies of the YouTube IFrame Player API (https://developers.google.com/youtube/iframe_api_reference).
Our team is currently developing an iOS app that includes the embedded YouTube player.
We would like to check whether the use cases below violate the rules for the embedded YouTube player.
Would it be okay to use an invisible player (opacity set to zero) for the purpose of extracting thumbnail images from a video?
Thumbnails are taken by the invisible player by seeking to a specific time and capturing the view at that moment.
The thumbnails are used solely for a progress bar made up of sequential images (the green box in the image below), where each image represents a part of the video; there is no other purpose.
The images are generated and used only in the client's local environment and are not saved in our database.
Additionally, can we provide the thumbnails extracted by the above method with their backgrounds removed?
Would it be possible to hide the title bar that appears when the player is paused, by adjusting the CSS style or the view frame size?
Is there another way to hide the title bar?
It seems that other mobile app services in Korea also use the embedded player without a title bar, such as 'Cake' (https://apps.apple.com/kr/app/cake-케이크-영어회화/id1350420987).
We are developing a 'Metronome' feature based on BPM (beats per minute). For this we need the audio PCM data from the video, and we are currently examining three cases.
Is it allowed to extract the audio track from the video to get PCM data?
Is it allowed to access the device's audio buffer to get PCM data?
Is it allowed to record the audio with the mobile device to make an audio file?
Is it possible to provide a metronome function that repeatedly plays a specific sound simultaneously with the audio of the YouTube video?
Would it be okay to mirror the embedded player horizontally? We plan to provide a mirror mode so that users can practice dancing more conveniently.
Is it possible to provide a function that repeatedly plays a specific time range (e.g. from 00:05 to 00:10)?
Is it possible to provide a function that does not play immediately when the play button is pressed, but instead counts down for a certain period of time, such as 3 seconds, before playing?

Custom camera view on iOS from HTML5

I'm building an app for iOS that I'm currently developing with web technologies: HTML, CSS, jQuery, etc. I'm doing that because, at least initially, it's quicker for me to work with these languages.
My app will let users record a video, which I know has been possible with HTML5 since iOS 6, and upload it to my server. However, I would like to create a custom camera view, for example like the one in Snapchat, with the ability to limit the recording to a custom duration (for example a maximum of 20 seconds), with a progress bar, etc.
My first question is: from HTML5, is it possible to limit the duration of a video recording? For example, to a maximum of 20 seconds?
Second question: is it possible for me to keep developing my app with web languages and, when the user clicks the "record a video" button, have an event fired directly IN the iOS code? That way I could launch a custom camera view in native code and then send the video directly to my server from the iOS side.
Thanks for your help.
First question:
No, you can't create a custom camera view in HTML. This needs to be done in Swift or Objective-C on the device.
Second question:
I've read that the new WKWebView can receive messages from JavaScript.
This tutorial explains how your app can communicate with the JavaScript via delegate calls. It requires that your content is displayed NOT in a UIWebView but in a WKWebView (iOS 8 or newer required).
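As a rough sketch of that approach (the message name "recordVideo" and the class below are illustrative, not taken from the tutorial), the app registers a script message handler on the WKWebView, and the page posts a message when the button is tapped:

```swift
import UIKit
import WebKit

// Illustrative sketch: the message name "recordVideo" is an assumption.
// The web page would call:
//   window.webkit.messageHandlers.recordVideo.postMessage({ maxDuration: 20 });
class WebViewController: UIViewController, WKScriptMessageHandler {

    var webView: WKWebView!

    override func viewDidLoad() {
        super.viewDidLoad()

        // Register this controller to receive "recordVideo" messages from JS.
        let contentController = WKUserContentController()
        contentController.add(self, name: "recordVideo")

        let config = WKWebViewConfiguration()
        config.userContentController = contentController

        webView = WKWebView(frame: view.bounds, configuration: config)
        view.addSubview(webView)
    }

    // Called whenever the page posts to the "recordVideo" handler.
    func userContentController(_ userContentController: WKUserContentController,
                               didReceive message: WKScriptMessage) {
        guard message.name == "recordVideo" else { return }
        // Present a native camera view here (e.g. AVFoundation-based)
        // and upload the recorded file from native code.
    }
}
```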

HTML Recording video on iOS

I'm new to HTML coding, and I'm currently trying to build an iOS app like Snapchat that will take the user's camera feed and keep recording even if the user goes into the main menu or elsewhere. I'm looking for some HTML5 code that will let the main interface be just the back camera's output, with buttons that I'll overlay on top of it.
A few searches have led me here: http://www.html5rocks.com/en/tutorials/getusermedia/intro/
I have tried to make that work, but iOS does not support it.
I'm basically asking: how do I make an app that records video, with the camera output showing on screen from the start?
You could write a web app to do this, but not a native app (i.e. from the App Store). For that, you'll need to learn Objective-C or Swift, then take a look at the AVFoundation framework.
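For the native route, a minimal AVFoundation sketch might look like the following (it only shows a live preview from the back camera; actual recording would go through something like AVCaptureMovieFileOutput, and camera-permission handling is omitted):

```swift
import UIKit
import AVFoundation

// Minimal sketch of a native, full-screen camera preview with AVFoundation.
// Recording would additionally use AVCaptureMovieFileOutput; permission
// prompts (NSCameraUsageDescription, etc.) are omitted for brevity.
class CameraViewController: UIViewController {

    private let session = AVCaptureSession()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Use the back wide-angle camera as the video input.
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: .back),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        // Show the live camera feed full screen, behind any overlay buttons.
        let previewLayer = AVCaptureVideoPreviewLayer(session: session)
        previewLayer.frame = view.bounds
        previewLayer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(previewLayer)

        // startRunning() blocks, so kick it off away from the main thread.
        DispatchQueue.global(qos: .userInitiated).async {
            self.session.startRunning()
        }
    }
}
```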

How can we overcome the missing muted/volume property on HTML5 video in UIWebView/iOS Safari?

As many hybrid app developers know, Apple has decided to disallow setting the volume property of HTML5 video elements from JavaScript. The same applies to the muted property. The concept of muted videos that autoplay when scrolled into view, with the option of unmuting on tap, is growing increasingly popular (pioneered by Vine, Facebook, etc.). I'm trying to find a way around this limitation in my design. From what I've been able to read on the subject, there isn't any hack or solution that meets this design requirement of mine.
Here are my thoughts so far:
I could split the audio from the video into a separate stream, sync its current time with the video, and call play() when the user taps. However, iOS Safari/UIWebView does not support simultaneous audio/video streams. Thus, this is simply not an option.
I could encode two videos, one with sound and one without, and then swap the src on tap. However, this requires reloading the entire stream and also nearly doubles the amount of data required. The latency is noticeable. Thus, this won't be a viable solution.
I could embed a native AVPlayer element in the webview. However, this would be an overlay and would not be manageable from within the webview. Custom controls and UI interaction from within the DOM would not be possible. Thus, this is not an option.
I could simply disable the audio output of the app and dynamically switch it on whenever the user taps a video element. However, to my knowledge this is not possible. I could show the native software volume slider, but that would defeat the purpose of this whole thing.
Do you have any suggestions or ways around this limitation?
I managed to find an acceptable solution. I split each video into three files: one without audio, one without video, and one with both video and audio for desktop browsers/Android.
It seems that running simultaneous streams CAN work as long as they don't conflict with each other, which basically means a separate audio track and a video with no audio channels play just fine in unison.

Trying to create an Xcode Objective-C function that records a video capture of my UIView contents and saves to phone

I'm trying to create an Xcode Objective-C function that can be called from a button tap and will record the contents of a UIView and its subviews (or a fixed section of the screen, e.g. 320x320 in the center) and then allow the user to save the video to their iPhone camera roll. I also want to include the audio being played by the app at the time of the recording, e.g. background music and sound effects.
I'm having trouble finding any info on this, as there seem to be a lot of people trying to record their running app for external purposes like the App Store video preview. I need the video captured within the app, to be used as an app feature.
Does anyone know if this can be done or know a website or tutorial where I can learn what's needed? Thanks
I know this post is two years old, but for anybody who comes along who might need to record their iOS app's screens and save them to the phone's camera roll or even a specific URL, take a look at https://github.com/alskipp/ASScreenRecorder
I've tried it and it works! The frames per second aren't 60 so I don't know how well it would work if you were trying to record an action game, but it's still pretty awesome.
You can't do that with just one function; check out this project:
https://github.com/coolstar/RecordMyScreen
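As a more recent alternative to those projects (not mentioned in the answers above): since iOS 9, ReplayKit's RPScreenRecorder can record the app's screen together with its audio in a few lines, though it captures the whole screen rather than a single UIView, so cropping to a 320x320 region would still require post-processing. A minimal sketch:

```swift
import UIKit
import ReplayKit

// Alternative sketch using ReplayKit (iOS 9+): records the whole app screen
// plus app audio. It cannot restrict capture to one UIView, so cropping to a
// fixed region would need post-processing of the resulting video.
class RecordingViewController: UIViewController {

    private let recorder = RPScreenRecorder.shared()

    @IBAction func startTapped(_ sender: UIButton) {
        recorder.isMicrophoneEnabled = false   // capture app audio only
        recorder.startRecording { error in
            if let error = error { print("Failed to start recording: \(error)") }
        }
    }

    @IBAction func stopTapped(_ sender: UIButton) {
        recorder.stopRecording { [weak self] previewController, error in
            guard let self = self, let preview = previewController, error == nil else { return }
            DispatchQueue.main.async {
                // The preview controller lets the user trim the clip and
                // save it to the camera roll.
                preview.previewControllerDelegate = self
                self.present(preview, animated: true)
            }
        }
    }
}

extension RecordingViewController: RPPreviewViewControllerDelegate {
    func previewControllerDidFinish(_ previewController: RPPreviewViewController) {
        previewController.dismiss(animated: true)
    }
}
```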
