Context:
I am trying to play multiple MIDI sequences in an iOS app using AVFoundation. The tracks are loaded from MIDI files, and I have successfully managed to play them one by one by loading them into an AVAudioSequencer. I also have an AVAudioUnitSampler object that is connected to an AVAudioEngine, and it successfully plays the selected instrument from the sound bank loaded into the sampler.
The setup works correctly if I create a new AVAudioSequencer each time I play a sound. However, if I try to reuse a sequencer after it has finished, it sounds as if it is no longer using the sampler's instrument.
I suspect that when I create the AVAudioSequencer objects they are automatically connected to the AVAudioEngine, but only the last object actually ends up connected to the sampler.
I've tried manually connecting the destinationAudioUnit of the tracks in the sequencer to the sampler, but then it doesn't play any sound at all. I also tried creating multiple samplers and connecting them all to the engine, but that didn't work either.
My main question is: What is the proper way of using multiple AVAudioSequencer objects with one AVAudioUnitSampler? Or do I need to create a sampler for each sequencer and connect them somehow?
Here's a very basic playground example of two sequencers. When I run it, sequencer B successfully plays the sound through the sampler, but A is not using the instrument.
import UIKit
import PlaygroundSupport
import AVFoundation
class MyViewController: UIViewController {

    let buttonA = UIButton(type: .system), buttonB = UIButton(type: .system)

    let engine = AVAudioEngine()
    lazy var sequencerA = AVAudioSequencer(audioEngine: engine)
    lazy var sequencerB = AVAudioSequencer(audioEngine: engine)
    let sampler = AVAudioUnitSampler()

    // UI setup
    override func loadView() {
        let view = UIView()
        view.backgroundColor = .white

        buttonA.setTitle("Sequencer A", for: .normal)
        buttonB.setTitle("Sequencer B", for: .normal)
        buttonA.addTarget(self, action: #selector(playSequencerA), for: .touchUpInside)
        buttonB.addTarget(self, action: #selector(playSequencerB), for: .touchUpInside)

        view.addSubview(buttonA)
        view.addSubview(buttonB)
        buttonA.frame = CGRect(x: 150, y: 200, width: 100, height: 100)
        buttonB.frame = CGRect(x: 150, y: 300, width: 100, height: 100)

        self.view = view
    }

    // Sound engine setup
    override func viewDidLoad() {
        super.viewDidLoad()

        engine.attach(sampler)
        engine.connect(sampler, to: engine.mainMixerNode, format: nil)

        let soundBankPath = playgroundSharedDataDirectory.appendingPathComponent("gs_instruments.dls")
        let midiA = playgroundSharedDataDirectory.appendingPathComponent("sfx_a.mid")
        let midiB = playgroundSharedDataDirectory.appendingPathComponent("sfx_b.mid")

        try! sampler.loadSoundBankInstrument(at: soundBankPath, program: 11, bankMSB: UInt8(kAUSampler_DefaultMelodicBankMSB), bankLSB: UInt8(kAUSampler_DefaultBankLSB))
        try! sequencerA.load(from: midiA, options: [])
        try! sequencerB.load(from: midiB, options: [])
        try! engine.start()
    }

    @objc public func playSequencerA() { play(sequencerA) }
    @objc public func playSequencerB() { play(sequencerB) }

    func play(_ sequencer: AVAudioSequencer) {
        if sequencer.isPlaying { sequencer.stop() }
        sequencer.currentPositionInBeats = 0
        try! sequencer.start()
    }
}
PlaygroundPage.current.liveView = MyViewController()
Edit:
After some additional experiments I suspect that the AVAudioEngine cannot have more than one AVAudioSequencer instance (or I'm still doing something wrong). As a workaround, I have created a separate AVAudioEngine object for each MIDI file that I need to play, each with its own sampler and sequencer attached, and that plays the sounds just fine. I'm pretty sure this solution is not optimal, so I would be glad for any tips on a better setup.
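Here is a rough sketch of that per-file workaround (one engine, sampler and sequencer per MIDI file), reusing the sound bank, program number and loading calls from the playground above:

import AVFoundation

// One engine + sampler + sequencer per MIDI file, as described in the edit above.
// The sound bank URL and program number are the same assumptions as in the playground code.
struct MIDIPlayer {
    let engine = AVAudioEngine()
    let sampler = AVAudioUnitSampler()
    let sequencer: AVAudioSequencer

    init(midiURL: URL, soundBankURL: URL) throws {
        engine.attach(sampler)
        engine.connect(sampler, to: engine.mainMixerNode, format: nil)
        try sampler.loadSoundBankInstrument(at: soundBankURL,
                                            program: 11,
                                            bankMSB: UInt8(kAUSampler_DefaultMelodicBankMSB),
                                            bankLSB: UInt8(kAUSampler_DefaultBankLSB))
        sequencer = AVAudioSequencer(audioEngine: engine)
        try sequencer.load(from: midiURL, options: [])
        try engine.start()
    }

    func play() {
        if sequencer.isPlaying { sequencer.stop() }
        sequencer.currentPositionInBeats = 0
        try? sequencer.start()
    }
}

Each MIDIPlayer is self-contained, so the sequencers never compete for the same sampler connection; the obvious cost is one running engine per file, which is exactly why I'm asking about a better setup.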
Related
When creating a custom video player using AVPlayer + AVPlayerLayer + AVPictureInPictureController on an iPhone running iOS 14 (beta 7), the video does not automatically enter picture-in-picture mode when the app enters the background after player.play() is called from a UIButton action.
The issue does not reproduce using AVPlayerViewController, which seems to indicate a problem with AVPictureInPictureController on iOS 14 in general, but I was wondering if anyone else had run into this problem and knows of any workarounds. I've also filed this problem with Apple as rdar://8620271.
Sample code.
import UIKit
import AVFoundation
import AVKit
class ViewController: UIViewController {

    private let player = AVPlayer(url: URL(string: "https://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4")!)
    private var pictureInPictureController: AVPictureInPictureController!
    private var playerView: PlayerView!
    private var playButton: UIButton!

    override func viewDidLoad() {
        super.viewDidLoad()

        playerView = PlayerView(frame: CGRect(x: 0, y: 44, width: view.bounds.width, height: 200))
        playerView.backgroundColor = .black
        playerView.playerLayer.player = player
        view.addSubview(playerView)

        playButton = UIButton(frame: CGRect(x: view.bounds.midX - 50, y: playerView.frame.maxY + 20, width: 100, height: 22))
        playButton.setTitleColor(.blue, for: .normal)
        playButton.setTitle("Play", for: .normal)
        playButton.addTarget(self, action: #selector(play), for: .touchUpInside)
        view.addSubview(playButton)

        pictureInPictureController = AVPictureInPictureController(playerLayer: playerView.playerLayer)

        do {
            let audioSession = AVAudioSession.sharedInstance()
            try audioSession.setCategory(.playback)
            try audioSession.setMode(.moviePlayback)
            try audioSession.setActive(true)
        } catch let e {
            print(e.localizedDescription)
        }
    }

    @objc func play() {
        player.play()
    }
}

class PlayerView: UIView {

    override class var layerClass: AnyClass {
        return AVPlayerLayer.self
    }

    var playerLayer: AVPlayerLayer! {
        return layer as? AVPlayerLayer
    }
}
The root cause of the problem ended up being twofold:
AVAudioSession.sharedInstance().setActive(true) must be called before the AVPictureInPictureController is initialised.
The frame size for the AVPlayerLayer must have an aspect ratio no greater than 16/9 (filed as a separate bug, rdar://8689203)
For iPads, the video must be the same width as the device (in any given orientation). No separate rdar, as Apple have acknowledged the other bug already.
(The 2nd issue is not present in the example above.)
Apple have acknowledged these bugs, and reported back to me that they have been / will be fixed (a rare case of a radar actually resulting in a reply!)
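For clarity, here is a minimal sketch of the corrected ordering from point 1, using the same properties as the sample above (the rest of the view setup is omitted):

override func viewDidLoad() {
    super.viewDidLoad()

    // ... playerView and playButton setup as in the sample above ...

    // 1. Configure and activate the audio session first.
    do {
        let audioSession = AVAudioSession.sharedInstance()
        try audioSession.setCategory(.playback)
        try audioSession.setMode(.moviePlayback)
        try audioSession.setActive(true)
    } catch {
        print(error.localizedDescription)
    }

    // 2. Only create the picture-in-picture controller after the session is active.
    pictureInPictureController = AVPictureInPictureController(playerLayer: playerView.playerLayer)
}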
Starting with iOS 14.2, Apple has exposed an API to start PiP when the app goes into the background:
if #available(iOS 14.2, *) {
    pictureInPictureController.canStartPictureInPictureAutomaticallyFromInline = true
}
Additionally, it is worth noting that Apple does not allow starting picture-in-picture without the user manually tapping a button; doing so will result in app rejection.
Your best bet is to use Apple's API mentioned above to avoid rejection.
I have this code for running the broadcast. Now I need a button in the app to stop the broadcast without going to the Notification Centre. Is that possible?
override func viewDidLoad() {
    super.viewDidLoad()
    // Do any additional setup after loading the view, typically from a nib.
    UIScreen.main.addObserver(self, forKeyPath: "captured", options: .new, context: nil)
}

func addRPkitVw() {
    let broadcastPickerView = RPSystemBroadcastPickerView(frame: CGRect(x: (holderVw.frame.width / 2) - 19, y: 0, width: 38, height: 38))
    holderVw.addSubview(broadcastPickerView)
    broadcastPickerView.backgroundColor = .clear
    broadcastPickerView.showsMicrophoneButton = true
}
I'm facing the same issue, but have you tried finishBroadcastWithError in RPBroadcastSampleHandler? It's a temporary solution, because an error popup is shown.
You have to pass a message to your extension whenever you need to stop the recording. Then, in your broadcast upload extension, when you receive the message, just call the finishBroadcastWithError function and pass your own error type.
For example: "Recording successfully stopped", etc.
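A minimal sketch of the extension side could look like this. How the message is passed is up to you; here a shared app-group flag is assumed, and the suite name, key and error values are only placeholders:

import ReplayKit
import CoreMedia

class SampleHandler: RPBroadcastSampleHandler {

    // Hypothetical app group shared between the host app and the extension.
    // The host app sets this flag when its in-app "Stop" button is tapped.
    private let sharedDefaults = UserDefaults(suiteName: "group.com.example.broadcast")

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
        if sharedDefaults?.bool(forKey: "shouldStopBroadcast") == true {
            let error = NSError(domain: "com.example.broadcast",
                                code: 1,
                                userInfo: [NSLocalizedDescriptionKey: "Recording successfully stopped."])
            // Ends the broadcast and shows the error message to the user.
            finishBroadcastWithError(error)
            return
        }
        // ... handle/upload the sample buffer as usual ...
    }
}

The host app's stop button would simply set that flag in the shared UserDefaults (and reset it again before starting a new broadcast).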
What I need to do:
record an audio file;
as it's recorded from the iPhone/iPad microphone it can be quiet, so I need to filter it to make it louder;
save the filtered recording.
I'm new to audio programming, but as far as I understand I need an "All Pass" filter (if not, please correct me).
For this task I've found two libs: Novocaine and AudioKit. Novocaine is written in C, so it's harder to use from Swift, so I decided to use AudioKit, but I didn't find an "All Pass" filter there.
Does anybody know how to implement this in AudioKit and save the filtered file? Thank you!
You have a few choices. For musical recordings I recommend AKBooster, as it purely boosts the audio; be careful how much you boost, otherwise you might cause clipping.
For spoken word audio I recommend AKPeakLimiter. It will give you the maximum volume without clipping. Set the attackTime and decayTime to lower values to hear a more pronounced effect.
The values of the sliders won't represent the values of the parameters until you move them.
import UIKit
import AudioKit

class ViewController: UIViewController {

    let mic = AKMicrophone()
    let boost = AKBooster()
    let limiter = AKPeakLimiter()

    override func viewDidLoad() {
        super.viewDidLoad()

        mic >>> boost >>> limiter
        AudioKit.output = limiter
        AudioKit.start()

        let inset: CGFloat = 10.0
        let width = view.bounds.width - inset * 2
        for i in 0..<4 {
            let y = CGFloat(100 + i * 50)
            let slider = UISlider(frame: CGRect(x: inset, y: y, width: width, height: 30))
            slider.tag = i
            slider.addTarget(self, action: #selector(sliderAction(slider:)), for: .valueChanged)
            view.addSubview(slider)
        }
        boost.gain = 1
    }

    @objc func sliderAction(slider: UISlider) {
        switch slider.tag {
        case 0:
            boost.gain = slider.value * 40
        case 1:
            limiter.preGain = slider.value * 40
        case 2:
            limiter.attackTime = max(0.001, slider.value * 0.03)
        case 3:
            limiter.decayTime = max(0.001, slider.value * 0.06)
        default: break
        }
    }
}
What is the most efficient way to add a GIF/video to the background of the landing screen (home screen or first view controller) of my app in Xcode, e.g. like apps such as Spotify, Uber, Instagram, etc.? Given that my app is universal, how would I make it fit accordingly?
Do you mean the first screen that is displayed after your app is launched? If so: unfortunately you can't have dynamic content there; you won't be able to use a GIF/video.
That said, if you have some app setup running on background threads that will take some time anyway, or if you simply want the user to wait longer before interaction so that you can display the GIF/video, you can make the static launch image match the first frame of the GIF/video and have your entry point be a view controller that displays the actual GIF/video. Because this delays the time to interaction, though, it is generally not recommended.
As for making it fit: as of iOS 8 Apple recommends using LaunchScreen.xib. With it you can use Auto Layout to achieve universality.
To add a video you can use MPMoviePlayerController, AVPlayer, or, if you're using SpriteKit, an SKVideoNode (a rough sketch follows below).
EDIT (in response to follow-up comments):
An NSURL is a reference to a local or remote file. This link will give you a decent overview. Just copy the movie in and follow that guide.
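If you go the SpriteKit route, a rough sketch in current Swift syntax might look like the following; "background.mp4" is a placeholder for a video file bundled with your app:

import UIKit
import SpriteKit

// A throwaway view controller that plays a bundled video as its background via SKVideoNode.
class VideoBackgroundViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // Host an SKView that fills the controller's view.
        let skView = SKView(frame: view.bounds)
        view.addSubview(skView)

        // Build a scene sized to the screen; aspectFill keeps it universal.
        let scene = SKScene(size: view.bounds.size)
        scene.scaleMode = .aspectFill

        // "background.mp4" is a placeholder name for a video in the app bundle.
        let videoNode = SKVideoNode(fileNamed: "background.mp4")
        videoNode.size = scene.size
        videoNode.position = CGPoint(x: scene.size.width / 2, y: scene.size.height / 2)
        scene.addChild(videoNode)

        skView.presentScene(scene)
        videoNode.play()
    }
}

Because the scene is created from the view's bounds and uses .aspectFill, the same code scales across device sizes.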
In addition to the MPMoviePlayerController solution Saqib Omer suggested, here's an alternative method that uses a UIView with an AVPlayerLayer. It has a button on top of the video as an example, since that's what you're looking for.
import AVKit
import AVFoundation
import UIKit

class ViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        // Start with a generic UIView and add it to the ViewController view
        let myPlayerView = UIView(frame: self.view.bounds)
        myPlayerView.backgroundColor = UIColor.blackColor()
        view.addSubview(myPlayerView)

        // Use a local or remote URL
        let url = NSURL(string: "http://eoimages.gsfc.nasa.gov/images/imagerecords/76000/76740/iss030-e-6082_sdtv.mov")! // See the note on NSURL above.

        // Make a player
        let myPlayer = AVPlayer(URL: url)
        myPlayer.play()

        // Make the AVPlayerLayer and add it to myPlayerView's layer
        let avLayer = AVPlayerLayer(player: myPlayer)
        avLayer.frame = myPlayerView.bounds
        myPlayerView.layer.addSublayer(avLayer)

        // Make a button and add it to myPlayerView (you'd need to add an action, of course)
        let myButtonOrigin = CGPoint(x: myPlayerView.bounds.size.width / 3, y: myPlayerView.bounds.size.height / 2)
        let myButtonSize = CGSize(width: myPlayerView.bounds.size.width / 3, height: myPlayerView.bounds.size.height / 10)
        let myButton = UIButton(frame: CGRect(origin: myButtonOrigin, size: myButtonSize))
        myButton.setTitle("Press Me!", forState: .Normal)
        myButton.setTitleColor(UIColor.whiteColor(), forState: .Normal)
        myPlayerView.addSubview(myButton)
    }
}
For playing a video, add the following code. Declare a class variable var moviePlayer: MPMoviePlayerController!. Then in your viewDidLoad():
let url: NSURL = NSURL(string: "YOUR_URL_FOR_VIDEO")!
moviePlayer = MPMoviePlayerController(contentURL: url)
moviePlayer.view.frame = CGRect(x: 0, y: 0, width: 200, height: 150)
self.view.addSubview(moviePlayer.view)
moviePlayer.fullscreen = true
moviePlayer.controlStyle = MPMovieControlStyle.Embedded
This will play the video. But to make it fit you need to add layout constraints. See this link for adding constraints programmatically.
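As a rough sketch (in current Swift syntax, since the snippet above predates layout anchors), pinning the player view to the edges of its superview could look like this:

// Pin the movie player's view to all four edges of the containing view.
moviePlayer.view.translatesAutoresizingMaskIntoConstraints = false
self.view.addSubview(moviePlayer.view)
NSLayoutConstraint.activate([
    moviePlayer.view.topAnchor.constraint(equalTo: view.topAnchor),
    moviePlayer.view.bottomAnchor.constraint(equalTo: view.bottomAnchor),
    moviePlayer.view.leadingAnchor.constraint(equalTo: view.leadingAnchor),
    moviePlayer.view.trailingAnchor.constraint(equalTo: view.trailingAnchor)
])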
import MediaPlayer

class ViewController: UIViewController {

    var moviePlayer: MPMoviePlayerController!

    override func viewDidLoad() {
        super.viewDidLoad()

        // Load the video from the app bundle.
        let videoURL: NSURL = NSBundle.mainBundle().URLForResource("video", withExtension: "mp4")!

        // Create and configure the movie player.
        self.moviePlayer = MPMoviePlayerController(contentURL: videoURL)
        self.moviePlayer.controlStyle = MPMovieControlStyle.None
        self.moviePlayer.scalingMode = MPMovieScalingMode.AspectFill
        self.moviePlayer.view.frame = self.view.frame
        self.view.insertSubview(self.moviePlayer.view, atIndex: 0)
        self.moviePlayer.play()

        // Loop video.
        NSNotificationCenter.defaultCenter().addObserver(self, selector: "loopVideo", name: MPMoviePlayerPlaybackDidFinishNotification, object: self.moviePlayer)
    }

    func loopVideo() {
        self.moviePlayer.play()
    }
}
https://medium.com/@kschaller/ios-video-backgrounds-6eead788f190#.2fhxmc2da
I'm trying to play a video using an MPMoviePlayerController. The setup is: I push a new view controller, set up the view and the movie player instance in viewDidLoad, and then use NSURLSession.sharedSession().dataTaskWithURL() to load the REST resource for the movie, which gives me the URL. In the completion block I set the contentURL of the movie player instance to this URL and call play(). However, the movie frame stays black. If I hardcode the contentURL in viewDidLoad, viewWillAppear, or viewDidAppear, the movie shows just fine.
The errorLog and accessLog are both nil.
So I assume something is wrong with the asynchronous URL loading and the assignment of the movie player's contentURL.
Setup: Swift, Xcode 6 beta, iOS 8.
Below are some code snippets:
class PresentationsViewController: UITableViewController {
    override func tableView(tableView: UITableView!, didSelectRowAtIndexPath indexPath: NSIndexPath!) {
        let presentationViewController = PresentationViewController(presentations[indexPath.row])
        navigationController.pushViewController(presentationViewController, animated: true)
    }
}

class PresentationViewController: UIViewController {

    var presentation: Presentation?
    var moviePlayer: MPMoviePlayerController?

    convenience init(_ presentation: Presentation) {
        self.init()
        self.presentation = presentation
    }

    override func viewDidLoad() {
        super.viewDidLoad()

        moviePlayer = MPMoviePlayerController()
        moviePlayer!.view.frame = CGRect(x: X, y: Y, width: W, height: H)
        moviePlayer!.movieSourceType = MPMovieSourceType.Unknown
        moviePlayer!.controlStyle = MPMovieControlStyle.Embedded

        NSURLSession.sharedSession().dataTaskWithURL(presentation!.url) { data, response, error in
            // Some JSON parsing etc.
            self.moviePlayer!.contentURL = self.presentation!.videoUrl
            self.moviePlayer!.prepareToPlay()
            self.moviePlayer!.play()
        }.resume()

        view.addSubview(moviePlayer!.view)
    }
}
I am not sure if this was a bug in the Swift betas or iOS 8 betas, but changing the code to use AVPlayer worked.
import AVFoundation
import AVKit
let playerViewController = AVPlayerViewController()

// In async block:
if let player = AVPlayer.playerWithURL(url) as? AVPlayer {
    playerViewController.player = player
}
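For completeness, a fuller sketch of that approach in current Swift syntax (the original snippets target the Xcode 6 / iOS 8 betas) might look like this. The hop back to the main queue is a general precaution, since the data task's completion handler runs on a background queue, and the JSON parsing is elided; the helper function and its names are only for illustration:

import AVFoundation
import AVKit
import UIKit

// Fetch the REST resource, then present an AVPlayerViewController for the resulting video URL.
func presentVideo(from url: URL, on viewController: UIViewController) {
    URLSession.shared.dataTask(with: url) { data, response, error in
        // ... parse the JSON and extract the actual video URL here ...
        let videoURL = url // placeholder: use the URL parsed from the response

        // UI work belongs on the main queue.
        DispatchQueue.main.async {
            let playerViewController = AVPlayerViewController()
            playerViewController.player = AVPlayer(url: videoURL)
            viewController.present(playerViewController, animated: true) {
                playerViewController.player?.play()
            }
        }
    }.resume()
}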