I was trying to get a native iOS player as a custom QML component and managed to do it thanks to this. However, I'm facing a problem with the z order of the component.
Component constructor:
MyVideoView::MyVideoView(QQuickItem *parent /*= 0*/)
    : QQuickItem(parent)
    , m_view(0)
{
    connect(this, SIGNAL(windowChanged(QQuickWindow*)), this, SLOT(onWindowChanged(QQuickWindow*)));
    connect(this, SIGNAL(visibleChanged()), this, SLOT(onVisibleChanged()));
}
onWindowChanged implementation:
void MyVideoView::onWindowChanged(QQuickWindow* window)
{
    if(!m_view) {
    }
    if (window != 0) {
        UIView *parentView = reinterpret_cast<UIView *>(window->winId());

        AVPlayer *_player;
        AVURLAsset *_asset;
        AVPlayerItem *_playerItem;
        AVPlayerLayer *m_playerLayer;

        _player = [[AVPlayer alloc] init];
        NSURL *baseURL = [[NSURL alloc] initWithString:@"http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8"];
        _asset = [AVURLAsset assetWithURL:baseURL];
        _playerItem = [AVPlayerItem playerItemWithAsset:_asset];
        [_player replaceCurrentItemWithPlayerItem:_playerItem];

        m_playerLayer = [AVPlayerLayer playerLayerWithPlayer:_player];
        m_playerLayer.frame = CGRectMake(this->x(), this->y(), this->width(), this->height());
        [parentView.layer addSublayer:m_playerLayer];
        [_player play];
    } else {
        [m_view removeFromSuperview];
    }
}
With this I can use the component in my application, which is an ApplicationWindow, but the issue is that the component is always on top, covering the whole application even if I set:
MyVideoView {
    z: -3
    width: 300
    height: 200
    x: 20
    y: 300
}
Or set the z of another component to e.g. 300.
I assume it's because of QQuickWindow or caused by UIView.
Edit: MyVideoView is placed inside an Item component
What I would want to achieve is:
either make it possible to set the component's z order
or get the component "behind" the application (creating a transparent part of my app so the video is visible; not the best solution, but I'm running out of options)
Is there any way to achieve one of those, or can it be done if the component is something other than a QQuickItem? The only part I actually need is the player layer, as I'll create a custom playback control interface.
QQuickWindow is a UIView. The individual QML items are not, so you can't place another UIView (or its layer) "inside" a QML application unless you structure the application as a top-level QQuickWindow with an additional child QQuickWindow, and then sandwich the video layer between those Qt Quick windows (QWindows).
With the hint from Tor, I managed to get it to work.
First I created another UIView, playerView, to place m_playerLayer inside, and then got the player behind the application:
[playerView.layer addSublayer:m_playerLayer];
[parentView.window addSubview: playerView];
[parentView.window sendSubviewToBack: playerView];
parentView.opaque = NO;
It's important to point out that without parentView.opaque = NO, the application would still have a background even with ApplicationWindow color: "transparent".
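For reference, here is a minimal sketch of how those pieces can sit together at the end of onWindowChanged (the frame handling is an assumption; adapt it to your item geometry):

// Sketch only: host the AVPlayerLayer in a plain UIView and push that view
// behind the Qt Quick content instead of layering it on top of parentView.
UIView *playerView = [[UIView alloc] initWithFrame:CGRectMake(this->x(), this->y(), this->width(), this->height())];
m_playerLayer.frame = playerView.bounds;
[playerView.layer addSublayer:m_playerLayer];

[parentView.window addSubview:playerView];
[parentView.window sendSubviewToBack:playerView];
parentView.opaque = NO; // otherwise the Qt Quick window still paints an opaque background

[_player play];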
Related
I'm using an MPMoviePlayerController.view as a background (think Spotify). A user can tap login or signup and is taken to the appropriate viewController, which has a clear background so that the moviePlayer.view remains the background (i.e., the user continues to see the video regardless of the currently active viewController) throughout the flow.
On some viewControllers the form needs to be lifted up so that the keyboard doesn't cover the field. I'm doing this using a transform.
The background video of the moviePlayer is set to repeat, so the video is on a continuous loop. Each time the video resets (the video status goes from 1 to 2 - paused to playing), the transform resets in the child viewControllers. My initial thought was that the view was being redrawn, but this doesn't appear to be the case based on logs (I put NSLogs in the drawRect of the views, but it's only ever called once at instantiation).
Has anyone come across this?
My setup in the root viewController:
// lazy load moviePlayer
-(MPMoviePlayerController *)moviePlayer
{
    if (_moviePlayer) return _moviePlayer;

    NSURL *videoURL = [[NSBundle mainBundle] URLForResource:@"resources.bundle/videos/auth_bg" withExtension:@"mp4"];
    _moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:videoURL];
    _moviePlayer.controlStyle = MPMovieControlStyleNone;
    _moviePlayer.scalingMode = MPMovieScalingModeAspectFill;
    _moviePlayer.repeatMode = MPMovieRepeatModeOne;
    _moviePlayer.shouldAutoplay = true;
    return _moviePlayer;
}
-(void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];
    self.moviePlayer.view.frame = self.view.frame;
    self.moviePlayer.view.hidden = false;
    // 'still' is an imageView of the first frame to show while video loading
    [self.navigationController.view insertSubview:self.moviePlayer.view aboveSubview:still];
}
I suspect this has to do with Auto Layout. I found a few other questions where views were being reset (one example here: https://stackoverflow.com/a/17849584/1542275). My solution was to adjust the layout constraint constants instead of transforming the view coordinates. Things are now staying put.
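As a hedged illustration of the constraint approach (a sketch only: formBottomConstraint is a hypothetical IBOutlet to the constraint pinning the form to the bottom of the view, and the controller is assumed to be registered for the keyboard notifications):

// Move the form by changing the constraint constant instead of applying a transform.
- (void)keyboardWillShow:(NSNotification *)note
{
    CGRect keyboardFrame = [note.userInfo[UIKeyboardFrameEndUserInfoKey] CGRectValue];
    self.formBottomConstraint.constant = keyboardFrame.size.height + 10.0;
    [UIView animateWithDuration:0.25 animations:^{
        [self.view layoutIfNeeded];
    }];
}

- (void)keyboardWillHide:(NSNotification *)note
{
    self.formBottomConstraint.constant = 10.0;
    [UIView animateWithDuration:0.25 animations:^{
        [self.view layoutIfNeeded];
    }];
}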
All of that said, I'm still not sure why the video restart is resetting the transforms.
I have a really simple audio player using AVAudioPlayer with (at this point) one song.
It is simple to control the volume of both channels (left and right) at the same time.
But is it possible to control only the left channel or only the right channel?
In this way I can set the left speaker a little louder than the right, or vice versa.
Take a look at the AVAudioPlayer class reference to answer this question.
You can see from the interface that you can't control the left/right channel directly, but you can set the pan from -1 (full left) to 1 (full right). Perhaps this will achieve what you are looking for?
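As a quick sketch of the pan approach (myUrl is assumed to point at your audio file):

// Bias playback toward the left channel; pan runs from -1 (full left) to 1 (full right).
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:myUrl error:nil];
player.pan = -0.5; // left channel noticeably louder than the right
[player prepareToPlay];
[player play];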
Using Core Audio will be much more difficult, but it should be able to handle this. Another possibility using AVAudioPlayer is to split the sound into its separate channels and play them separately. Then you can set the volume separately. There is an example in the documentation of how to ensure that two sounds play at exactly the same time. (And you'll want to set the pan for each sound separately so that each sound plays correctly.)
A less efficient method is to play two copies of the .mp3 at once. (Caveat: I haven't tested this code.)
// assumes you have url for the file you want to play
AVAudioPlayer *first = [[AVAudioPlayer alloc] initWithContentsOfURL:myUrl error:nil];
AVAudioPlayer *second = [[AVAudioPlayer alloc] initWithContentsOfURL:myUrl error:nil];

first.pan = -1;  // only play on left channel
second.pan = 1;  // only play on right channel

first.volume = 0.6;
second.volume = 1.0;

// start both copies at (almost) the same time
NSTimeInterval shortStartDelay = 0.01;
NSTimeInterval now = first.deviceCurrentTime;
[first playAtTime:now + shortStartDelay];
[second playAtTime:now + shortStartDelay];
I am trying to view a video which has an alpha channel (the background is transparent). The only problem is that I don't seem to get how to make the background of the player transparent. I know I have to use AVPlayer, but I can't access its .view property. How can I add it to the subview and add a layer?
NSString *path = [NSString stringWithFormat:@"%@%@", [[NSBundle mainBundle] resourcePath], @"/New Project 5.m4v"];
NSURL *filePath = [NSURL fileURLWithPath:path isDirectory:NO];
moviePlayer = [[AVPlayer alloc] initWithURL:filePath];
AVPlayerLayer* playerLayer = [AVPlayerLayer playerLayerWithPlayer:moviePlayer];
self.playerLayer.frame = self.view.bounds;
moviePlayer.view.alpha = 0.3;
[moviePlayer.layer addSublayer:playerLayer];
[moviePlayer play];
The iOS SDK does not properly support alpha channel video playback. That applies to AV Foundation as well as the Media Player framework. Video material that contains an alpha channel will not work as you expect it to when using Apple's API.
And as you actually show within your code, AVPlayer does not use a UIView as its surface for playing videos but a subclass of CALayer, AVPlayerLayer.
You will need to rethink your application design or choose a different playback SDK.
I have methods inside my opening controller class that instantiate a moviePlayer, which is set to 'autoPlay = NO';
I have added the moviePlayer.view as a subview of the controller's view, configured it, and created a full-screen button on top for starting the video. Since iOS 4.3 this has been working fine: the button is transparent and the first frame of the video showed through (which was a picture of a custom Automobile Auto-Start button).
Since iOS 6, I only get a black screen.
Clicking the image button does start the video as it should; it calls [moviePlayer play].
Has something changed that I have not taken into consideration?
I have provided the two sections of code I think are necessary.
#define INTRO_MOVIE @"Intro.mov"

-(void)viewDidLoad
{
    if(SHOULD_PLAY_INTRO_VIDEO) //Debug switch to ignore the intro video
    {
        // Prepare the movie and player
        [self configureIntroMoviePlayer];
        [self.view addSubview:moviePlayer.view];
        [self.view bringSubviewToFront:moviePlayer.view];

        // Add and Show the Start Button to start the App
        [self configureStartButton];
        [self.view addSubview:startButton];
    }
}
-(void)configureIntroMoviePlayer
{
    LOGINFO
    // Prepare the Intro Video
    NSString *pathToIntroVideo = [mainFilePath_ stringByAppendingPathComponent:INTRO_MOVIE];
    NSURL *URLToIntroVideo = [NSURL fileURLWithPath:pathToIntroVideo];
    moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:URLToIntroVideo];
    [moviePlayer setShouldAutoplay:NO];
    moviePlayer.view.frame = CGRectMake(0, -20, 1024, 768);
    [moviePlayer setControlStyle:MPMovieControlStyleNone];

    // fixing video brightness difference with iPad 2
    if(isIpad2)
    {
        moviePlayer.backgroundView.backgroundColor = [UIColor blackColor];
        moviePlayer.view.alpha = .99;
    }

    // Create the Skip button for cancelling the Intro Video
    skipIntro = [UIButton buttonWithType:UIButtonTypeCustom];
    skipIntro.showsTouchWhenHighlighted = YES;
    skipIntro.frame = CGRectMake(900, 20, 111, 57);
    [skipIntro addTarget:self action:@selector(skipIntroWasPressed) forControlEvents:UIControlEventTouchUpInside];
}
I am not sure why I got a -1 rating for this question for lack of research or clarity.
Maybe I do not know the proper usage of this forum.
I apologize.
I did find that adding [moviePlayer prepareToPlay] solved the problem. Like I said, it was odd that the first frame always showed up prior to iOS 6.
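For clarity, a minimal sketch of where that call can go, assuming the configureIntroMoviePlayer method shown above:

// Sketch: ask the player to load the first frame up front so it shows
// through the transparent start button again on iOS 6.
moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:URLToIntroVideo];
[moviePlayer setShouldAutoplay:NO];
[moviePlayer setControlStyle:MPMovieControlStyleNone];
[moviePlayer prepareToPlay]; // without this, iOS 6 shows a black view until play is called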
Have you tried:
[moviePlayer.view addSubview:startButton];
I'm writing an application where the user can record up to 6 video clips, each with a duration of 2 seconds. When the video clips are recorded the user can play with them using 6 buttons - one for each clip. The user can then record a movie by switching between the 6 clips. The problem is that I need near-instantaneous switching between the 6 clips when the user presses a button - otherwise the illusion of playing with the clips is lost. The functionality is somewhat similar to the app called CamBox in the App Store.
I first tried initializing every clip with an AVAsset in an AVPlayerItem in an AVPlayer every time the user pressed a button. The output of the player was directed at an AVPlayerLayer in my main view. The problem is that the time it takes to load and start playing is quite long, meaning that the video lags when the user presses the buttons in rapid succession.
I then decided to try to preload all the clips using 5 AVPlayers and 5 AVPlayerLayers. The 5 player layers are inserted into my main view, and when the user presses a button the currently playing AVPlayer is paused and rewound, and the currently visible AVPlayerLayer is hidden. The new AVPlayer is started and the corresponding AVPlayerLayer is shown. It works pretty well, being much faster than my first solution although not instantaneous, but the problem is that I can only preload 4 clips, meaning that when the user presses the buttons that play the last two, it lags big time. Below is my code to preload the clips:
-(void)loadVideos
{
    layers = [[NSMutableArray alloc] initWithCapacity:6];
    players = [[NSMutableArray alloc] initWithCapacity:6];
    for(int i = 1; i < 7; i++)
    {
        NSURL* fileURL = [NSURL fileURLWithPath:[self getFileName:i]];
        AVPlayerItem* avPlayerItem = [[[AVPlayerItem alloc] initWithURL:fileURL] autorelease];
        [avPlayerItem addObserver:self forKeyPath:@"status" options:0 context:nil];

        AVPlayer *avPlayer = [[[AVPlayer alloc] initWithPlayerItem:avPlayerItem] autorelease];
        [avPlayer addObserver:self forKeyPath:@"status" options:0 context:nil];
        [avPlayer addObserver:self forKeyPath:@"currentItem" options:0 context:nil];

        AVPlayerLayer* layer = [AVPlayerLayer playerLayerWithPlayer:avPlayer];
        layer.frame = self.playerView.bounds;
        [playerView.layer addSublayer:layer];

        [layers addObject:layer];
        [players addObject:avPlayer];
        layer.hidden = YES;
    }
}
The event handler for the 6 buttons looks like this:
- (IBAction)takeBtnClicked:(id)sender {
    int tag = ((UIButton*)sender).tag;

    AVPlayer* player;
    AVPlayerLayer* layer;

    if (layerIndex > -1) {
        player = [players objectAtIndex:layerIndex];
        layer = [layers objectAtIndex:layerIndex];
        [player pause];
        layer.hidden = YES;
        [player seekToTime:kCMTimeZero];
    }

    layerIndex = tag - 1;
    player = [players objectAtIndex:layerIndex];
    layer = [layers objectAtIndex:layerIndex];
    [player play];
    layer.hidden = NO;
}
I'm pretty sure that the limitation of 4 preloaded video clips is a hardware limitation, but what is the alternative? Does anybody have any ideas?
Thanks in advance.
See my answer for iphone-smooth-transition-from-one-video-to-another; it shows a library you can use to implement this logic, and an example app with 3 buttons that kick off animated clips. Each clip also has an associated sound effect.