Filtering audio in AudioKit - iOS

What I need to do:
record an audio file;
since it's recorded from the iPhone/iPad microphone it can be quiet, so I need to filter it to make it louder;
save the filtered recording.
I'm new to audio programming, but as I understand it so far, I need an "All Pass" filter (if not, please correct me).
For this task I've found two libs: Novocaine and AudioKit. Novocaine is written in C, so it's harder to use from Swift, so I decided to use AudioKit, but I didn't find an "All Pass" filter there.
Does anybody know how to implement this in AudioKit and save the filtered file? Thank you!

You have a few choices. For musical recordings I recommend AKBooster, as it purely boosts the audio; be careful how much you boost, though, or you might cause clipping.
For spoken word audio I recommend AKPeakLimiter. It will give you the maximum volume without clipping. Set the attackTime and decayTime to lower values to hear a more pronounced effect.
Note that the slider values won't match the parameter values until you move them.
import UIKit
import AudioKit

class ViewController: UIViewController {

    let mic = AKMicrophone()
    let boost = AKBooster()
    let limiter = AKPeakLimiter()

    override func viewDidLoad() {
        super.viewDidLoad()

        mic >>> boost >>> limiter
        AudioKit.output = limiter
        AudioKit.start() // in newer AudioKit versions this is `try AudioKit.start()`

        // Four sliders: boost gain, limiter pre-gain, attack time, decay time.
        let inset: CGFloat = 10.0
        let width = view.bounds.width - inset * 2
        for i in 0..<4 {
            let y = CGFloat(100 + i * 50)
            let slider = UISlider(frame: CGRect(x: inset, y: y, width: width, height: 30))
            slider.tag = i
            slider.addTarget(self, action: #selector(sliderAction(slider:)), for: .valueChanged)
            view.addSubview(slider)
        }
        boost.gain = 1
    }

    @objc func sliderAction(slider: UISlider) {
        switch slider.tag {
        case 0:
            boost.gain = Double(slider.value) * 40
        case 1:
            limiter.preGain = Double(slider.value) * 40
        case 2:
            limiter.attackTime = max(0.001, Double(slider.value) * 0.03)
        case 3:
            limiter.decayTime = max(0.001, Double(slider.value) * 0.06)
        default: break
        }
    }
}
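To cover the save step from the question: below is a minimal sketch using AudioKit 4's AKNodeRecorder, which taps a node and writes what flows through it, so recording from the limiter captures the boosted/limited signal. The file name and error handling here are assumptions, not part of the answer above.

do {
    // Record the processed (boosted/limited) signal to a file in Documents.
    let file = try AKAudioFile(writeIn: .documents, name: "processed")
    let recorder = try AKNodeRecorder(node: limiter, file: file)
    try recorder.record()
    // ... later, when the recording should end:
    recorder.stop()
    // In a real app, keep `recorder` as a property so it outlives this scope.
} catch {
    print("Recording failed: \(error)")
}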

Related

How can I get an entity to constantly look at the camera in RealityKit similar to Billboard Constraint from SceneKit

I have an app that I am trying to update from SceneKit to RealityKit, and one of the features I am having a hard time replicating in RealityKit is making an entity constantly look at the camera. In SceneKit, this was accomplished by adding the following billboard constraint to the node:
let billboardConstraint = SCNBillboardConstraint()
billboardConstraint.freeAxes = [.X, .Y]
startLabelNode.constraints = [billboardConstraint]
This allowed the startLabelNode to rotate freely so that it constantly faced the camera without changing its position.
However, I can't seem to figure out a way to do this with RealityKit. I have tried the look(at:) method, which doesn't seem to offer the ability to constantly face the camera. Here is a short sample app where I have tried to implement a version of this in RealityKit, but it doesn't keep the entity constantly facing the camera the way SceneKit did:
import UIKit
import RealityKit
import ARKit

class ViewController: UIViewController, ARSessionDelegate {

    @IBOutlet weak var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.session.delegate = self
        arView.environment.sceneUnderstanding.options = []
        // Display a debug visualization of the mesh.
        arView.debugOptions.insert(.showSceneUnderstanding)
        // For performance, disable render options that are not required for this app.
        arView.renderOptions = [.disablePersonOcclusion, .disableDepthOfField, .disableMotionBlur]
        arView.automaticallyConfigureSession = false
        let configuration = ARWorldTrackingConfiguration()
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            configuration.sceneReconstruction = .mesh
        } else {
            print("Mesh Classification not available on this device")
            configuration.worldAlignment = .gravity
            configuration.planeDetection = [.horizontal, .vertical]
        }
        configuration.environmentTexturing = .automatic
        arView.session.run(configuration)
        // Prevent the screen from being dimmed to avoid interrupting the AR experience.
        UIApplication.shared.isIdleTimerDisabled = true
    }

    @IBAction func buttonPressed(_ sender: Any) {
        let screenWidth = arView.bounds.width
        let screenHeight = arView.bounds.height
        let centerOfScreen = CGPoint(x: screenWidth / 2, y: screenHeight / 2)
        if let raycastResult = arView.raycast(from: centerOfScreen, allowing: .estimatedPlane, alignment: .any).first {
            addStartLabel(at: raycastResult.worldTransform)
        }
    }

    func addStartLabel(at result: simd_float4x4) {
        let resultAnchor = AnchorEntity(world: result)
        resultAnchor.addChild(clickToStartLabel())
        arView.scene.addAnchor(resultAnchor)
    }

    func clickToStartLabel() -> ModelEntity {
        let text = "Click to Start Here"
        let textMesh = MeshResource.generateText(text, extrusionDepth: 0.001, font: UIFont.boldSystemFont(ofSize: 0.01))
        let textMaterial = UnlitMaterial(color: .black)
        let textModelEntity = ModelEntity(mesh: textMesh, materials: [textMaterial])
        textModelEntity.generateCollisionShapes(recursive: true)
        textModelEntity.position.x -= textMesh.width / 2
        textModelEntity.position.y -= textMesh.height / 2
        let planeMesh = MeshResource.generatePlane(width: textMesh.width + 0.01, height: textMesh.height + 0.01)
        let planeMaterial = UnlitMaterial(color: .white)
        let planeModelEntity = ModelEntity(mesh: planeMesh, materials: [planeMaterial])
        planeModelEntity.generateCollisionShapes(recursive: true)
        // Move the plane up to make it sit on the anchor instead of in the middle of the anchor.
        planeModelEntity.position.y += planeMesh.height / 2
        planeModelEntity.addChild(textModelEntity)
        // This does not always keep the planeModelEntity facing the camera.
        planeModelEntity.look(at: arView.cameraTransform.translation, from: planeModelEntity.position, relativeTo: nil)
        return planeModelEntity
    }
}

extension MeshResource {
    var width: Float {
        return bounds.max.x - bounds.min.x
    }
    var height: Float {
        return bounds.max.y - bounds.min.y
    }
}
Is the look(at:) function the best way to get the missing feature working in RealityKit, or is there a better way to have an Entity constantly face the camera?
I haven't worked with RealityKit much, but assuming the entity is a SceneKit node, you can set constraints on it and it will be forced to face the targetNode at all times. Provided that works the way you want, you may still have to experiment with how the node is initially created, i.e. what direction it is facing.
func setTarget() {
    node.constraints = []
    let vConstraint = SCNLookAtConstraint(target: targetNode)
    vConstraint.isGimbalLockEnabled = true
    node.constraints = [vConstraint]
}
I was able to figure out an answer to my question. Adding the following block of code allowed the entity to constantly look at the camera:
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    planeModelEntity.look(at: arView.cameraTransform.translation, from: planeModelEntity.position(relativeTo: nil), relativeTo: nil)
}
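Note that this assumes planeModelEntity has been stored as a property on the view controller (rather than staying local to clickToStartLabel()), so the ARSessionDelegate callback can reach it on every frame.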

How to animate GMSURLTileLayer

I have an array of the last 10 epoch times, and I want to animate a GMSURLTileLayer through these epochs.
I tried doing it with a for loop, but it is not working.
This is my code:
let epochs = [10, 20, 30, 40, 50]

private func configureRadarForGoogle(epoch: Int) {
    UIView.animate(withDuration: 1.0) {
        let url: GMSTileURLConstructor = { (x, y, zoom) in
            let urltemplate = "https://tilecache.rainviewer.com/v2/radar/\(epoch)/512/\(zoom)/\(x)/\(y)/2/1_1.png"
            return URL(string: urltemplate)
        }
        let layer = GMSURLTileLayer(urlConstructor: url)
        layer.zIndex = 5
        layer.map = self.mapView
    }
}

private func startAnimation() {
    for epoch in self.epochs {
        sleep(1)
        configureRadarForGoogle(epoch: epoch)
    }
}
Does anyone know a better solution? Thanks a lot.
I think you can use GMSMarkerLayer, a subclass of GMSOverlayLayer available on a per-marker basis, which allows animating several properties of its associated GMSMarker.
To animate, you can calculate the points and add them to two separate arrays, one for the latitude values (y) and one for the longitude values (x), and then use the values property of CAKeyframeAnimation to animate, as explained here.
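Another option, sketched below under the assumption that self.mapView and self.epochs exist as in the question: rather than blocking the main thread with sleep, drive the frames with a repeating Timer and swap the tile layer on each tick. The currentLayer/epochIndex bookkeeping is hypothetical.

private var currentLayer: GMSURLTileLayer?
private var epochIndex = 0

private func startTileAnimation() {
    Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true) { [weak self] timer in
        guard let self = self else { timer.invalidate(); return }
        let epoch = self.epochs[self.epochIndex % self.epochs.count]
        self.epochIndex += 1
        // Remove the previous frame's layer before showing the next one.
        self.currentLayer?.map = nil
        let constructor: GMSTileURLConstructor = { (x, y, zoom) in
            URL(string: "https://tilecache.rainviewer.com/v2/radar/\(epoch)/512/\(zoom)/\(x)/\(y)/2/1_1.png")
        }
        let layer = GMSURLTileLayer(urlConstructor: constructor)
        layer.zIndex = 5
        layer.map = self.mapView
        self.currentLayer = layer
    }
}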

Using multiple AVAudioSequencer with one or more AVAudioUnitSampler

Context:
I am trying to play multiple MIDI sequences in an iOS app using AVFoundation. The tracks are loaded from MIDI files, and I have successfully managed to play them one by one by loading them into an AVAudioSequencer. I also have an AVAudioUnitSampler object, connected in an AVAudioEngine, which successfully plays the selected instrument from the sound bank loaded in the sampler.
The setup works correctly if I create a new AVAudioSequencer each time I play a sound. However, if I try to reuse a sequencer after it has finished, it sounds like it's not using the sampler's instrument.
I suspect that when I create the AVAudioSequencer objects they are automatically connected to the AVAudioEngine, but only the last object gets connected to the sampler.
I've tried to manually connect the destinationAudioUnit of the tracks in the sequencer to the sampler, but then it doesn't play a sound at all. I also tried to make multiple samplers and connect them all to the engine, but that didn't work either.
My main question is: What is the proper way of using multiple AVAudioSequencer objects with one AVAudioUnitSampler? Or do I need to create a sampler for each sequencer and connect them somehow?
Here's a very basic playground example of two sequencers. When I run it, sequencer B successfully plays the sound through the sampler, but A is not using the instrument.
import UIKit
import PlaygroundSupport
import AVFoundation

class MyViewController: UIViewController {

    let buttonA = UIButton(type: .system), buttonB = UIButton(type: .system)
    let engine = AVAudioEngine()
    lazy var sequencerA = AVAudioSequencer(audioEngine: engine)
    lazy var sequencerB = AVAudioSequencer(audioEngine: engine)
    let sampler = AVAudioUnitSampler()

    // UI setup
    override func loadView() {
        let view = UIView()
        view.backgroundColor = .white
        buttonA.setTitle("Sequencer A", for: .normal)
        buttonB.setTitle("Sequencer B", for: .normal)
        buttonA.addTarget(self, action: #selector(playSequencerA), for: .touchUpInside)
        buttonB.addTarget(self, action: #selector(playSequencerB), for: .touchUpInside)
        view.addSubview(buttonA)
        view.addSubview(buttonB)
        buttonA.frame = CGRect(x: 150, y: 200, width: 100, height: 100)
        buttonB.frame = CGRect(x: 150, y: 300, width: 100, height: 100)
        self.view = view
    }

    // Sound engine setup
    override func viewDidLoad() {
        super.viewDidLoad()
        engine.attach(sampler)
        engine.connect(sampler, to: engine.mainMixerNode, format: nil)
        let soundBankPath = playgroundSharedDataDirectory.appendingPathComponent("gs_instruments.dls")
        let midiA = playgroundSharedDataDirectory.appendingPathComponent("sfx_a.mid")
        let midiB = playgroundSharedDataDirectory.appendingPathComponent("sfx_b.mid")
        try! sampler.loadSoundBankInstrument(at: soundBankPath, program: 11, bankMSB: UInt8(kAUSampler_DefaultMelodicBankMSB), bankLSB: UInt8(kAUSampler_DefaultBankLSB))
        try! sequencerA.load(from: midiA, options: [])
        try! sequencerB.load(from: midiB, options: [])
        try! engine.start()
    }

    @objc public func playSequencerA() { play(sequencerA) }
    @objc public func playSequencerB() { play(sequencerB) }

    func play(_ sequencer: AVAudioSequencer) {
        if sequencer.isPlaying { sequencer.stop() }
        sequencer.currentPositionInBeats = 0
        try! sequencer.start()
    }
}

PlaygroundPage.current.liveView = MyViewController()
Edit:
After some additional experiments, I suspect that an AVAudioEngine cannot have more than one AVAudioSequencer instance (or I'm still doing something wrong). As a workaround, I have created a separate AVAudioEngine object for each MIDI file I need to play; each has its own sampler and sequencer, and this plays the sounds just fine. I'm pretty sure this solution is not optimal, so I would be glad for any tips about a better setup.
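A minimal sketch of that workaround, bundling one engine, sampler, and sequencer per MIDI file; the MIDIPlayer helper type is hypothetical, and the program/bank values are copied from the playground above:

import AVFoundation

// Hypothetical helper: one engine + sampler + sequencer per MIDI file.
final class MIDIPlayer {
    let engine = AVAudioEngine()
    let sampler = AVAudioUnitSampler()
    lazy var sequencer = AVAudioSequencer(audioEngine: engine)

    init(midiFile: URL, soundBank: URL) throws {
        engine.attach(sampler)
        engine.connect(sampler, to: engine.mainMixerNode, format: nil)
        try sampler.loadSoundBankInstrument(at: soundBank, program: 11,
                                            bankMSB: UInt8(kAUSampler_DefaultMelodicBankMSB),
                                            bankLSB: UInt8(kAUSampler_DefaultBankLSB))
        try sequencer.load(from: midiFile, options: [])
        try engine.start()
    }

    func play() throws {
        if sequencer.isPlaying { sequencer.stop() }
        sequencer.currentPositionInBeats = 0
        try sequencer.start()
    }
}

Each sound then gets its own MIDIPlayer(midiFile:soundBank:) instance, at the cost of running one engine per file.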

I am building a ship game using Swift. Unable to detect collisions

I am building a ship game using Swift. The objective is to avoid the incoming stones and score as many points as you can as the level increases. The stones come from the opposite direction to hit the ship, but I am unable to detect collisions between the ship and a stone; the stone passes right through the ship. The ship can move to the left or to the right.
I used rect1.intersects(rect2) for the intersection check.
Thank you.
Here is ViewController.swift:
import UIKit

class ViewController: UIViewController {

    @IBOutlet weak var moveWater: MovingWater!

    var boat: UIImageView!
    var stone: UIImageView!
    var boatLeftRight: UILongPressGestureRecognizer!
    var tapTimer: Timer!
    var leftM: UInt32 = 55
    var rightM: UInt32 = 250
    var leftS: UInt32 = 35
    var rightS: UInt32 = 220

    func startGame() {
        boat = UIImageView(image: UIImage(named: "boat"))
        boat.frame = CGRect(x: 0, y: 0, width: 60, height: 90)
        boat.frame.origin.y = self.view.bounds.height - boat.frame.size.height - 10
        boat.center.x = self.view.bounds.midX
        self.view.insertSubview(boat, aboveSubview: moveWater)
        boatLeftRight = UILongPressGestureRecognizer(target: self, action: #selector(ViewController.leftRight(tap:)))
        boatLeftRight.minimumPressDuration = 0.001
        moveWater.addGestureRecognizer(boatLeftRight)
    }

    func leftRight(tap: UILongPressGestureRecognizer) {
        if tap.state == UIGestureRecognizerState.ended {
            if tapTimer != nil {
                self.tapTimer.invalidate()
            }
        } else if tap.state == UIGestureRecognizerState.began {
            let touch = tap.location(in: moveWater)
            if touch.x > moveWater.frame.midX {
                tapTimer = Timer.scheduledTimer(timeInterval: TimeInterval(0.005), target: self, selector: #selector(ViewController.moveBoat(time:)), userInfo: "right", repeats: true)
            } else {
                tapTimer = Timer.scheduledTimer(timeInterval: TimeInterval(0.005), target: self, selector: #selector(ViewController.moveBoat(time:)), userInfo: "left", repeats: true)
            }
        }
    }

    func moveBoat(time: Timer) {
        if let d = time.userInfo as? String {
            var bot2 = boat.frame
            if d == "right" {
                if bot2.origin.x < CGFloat(rightM) {
                    bot2.origin.x += 2
                }
            } else {
                if bot2.origin.x > CGFloat(leftM) {
                    bot2.origin.x -= 2
                }
            }
            boat.frame = bot2
        }
    }

    func movingStone() {
        stone = UIImageView(image: UIImage(named: "stones.png"))
        let stone2 = leftS + arc4random() % rightS
        stone.bounds = CGRect(x: 10, y: 10, width: 81.0, height: 124.0)
        stone.contentMode = .center
        stone.layer.position = CGPoint(x: Int(stone2), y: 10)
        stone.transform = CGAffineTransform(rotationAngle: 3.142)
        self.view.insertSubview(stone, aboveSubview: moveWater)
        UIView.animate(withDuration: 5, delay: 0, options: UIViewAnimationOptions.curveLinear, animations: { () -> Void in
            self.stone.frame.origin.y = self.view.bounds.height + self.stone.frame.height + 10
        }) { (success: Bool) -> Void in
            self.stone.removeFromSuperview()
            self.movingStone()
        }
    }

    func update() {
        if boat.bounds.intersects(stone.bounds) {
            // boat.image = ... // set a new image here
        }
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
        moveWater.backgroundStart()
        startGame()
        movingStone()
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }
}
I saw your own answer. This is indeed basic collision detection. You should still have a look at SpriteKit. I see that you are using a Timer, which is not the best way to go, since Timers will give you performance issues in the long run. It's a common mistake when you start game development.
Maybe you think that you can ensure a frame rate by setting a very fast timer. The thing is that timers are not consistent, and they have low priority too. This means your timer call will be delayed if something more important happens in the background. If your movement is based on that forced refresh rate, it will become choppy very quickly. The code you run in the timer might also get faster or slower depending on the logic you are using.
SpriteKit provides an update function that runs every frame and tells you the current system time at each frame. By keeping track of that value, you can calculate how much time passed between two frames and scale your movement accordingly to compensate for the time difference, as in the sketch below.
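A minimal sketch of that frame-delta pattern; the "boat" node name and the speed constant are assumptions, not from the question:

import SpriteKit

class GameScene: SKScene {
    private var lastUpdateTime: TimeInterval = 0
    private let boatSpeed: CGFloat = 120 // points per second; arbitrary value

    override func update(_ currentTime: TimeInterval) {
        // On the first frame there is no previous timestamp to diff against.
        let delta = lastUpdateTime > 0 ? currentTime - lastUpdateTime : 0
        lastUpdateTime = currentTime
        // Scale movement by elapsed time so speed is frame-rate independent.
        if let boat = childNode(withName: "boat") {
            boat.position.x += boatSpeed * CGFloat(delta)
        }
    }
}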
On top of that, SpriteKit offers a bunch of options for collision detection and movement. It integrates a very well made physics engine and collision detection system. It will also do collision detection on complex shapes, apply forces to bodies, etc.
I strongly suggest you follow the link to Ray Wenderlich's website given in the answer above. If you have the budget, you might also want to buy their book on how to make 2D games for Apple devices. I've read it cover to cover and I can say I love it. I can now do my own stuff with SpriteKit, and it's also a very good starter for newcomers to Swift.
Any reason you chose UIKit to make your game?
If you make a game you should really be using SpriteKit instead of UIKit.
Check Google and YouTube for SpriteKit tutorials; there are loads.
A really good start is this one, which teaches you the basics of the SpriteKit scene editor and how to do collisions etc.:
https://www.raywenderlich.com/118225/introduction-sprite-kit-scene-editor
I recommend that you do not continue like this.
Hope this helps.
I fixed this myself. It was easy, and I detected collisions without any SpriteKit.
func intersectsAt(tap2: Timer) {
    var f1: CGRect!
    var f2: CGRect!
    // Compare the presentation layers' frames: during a UIView animation the
    // model frame already holds the final value, while the presentation layer
    // tracks the on-screen position.
    f1 = boat.layer.presentation()?.frame
    f2 = stone.layer.presentation()?.frame
    if f1.intersects(f2) {
        stopGame()
    }
}
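For context, this check presumably runs on its own repeating timer alongside the animation; a hypothetical scheduling call in the question's Swift 3 style (the interval and the collisionTimer property are assumptions):

// Hypothetical: poll for collisions every 50 ms while the game runs.
collisionTimer = Timer.scheduledTimer(timeInterval: 0.05, target: self,
                                      selector: #selector(ViewController.intersectsAt(tap2:)),
                                      userInfo: nil, repeats: true)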

How to stop ViewController in Swift?

My ViewController keeps sending array updates even when I'm in another view. What can I do? Here is my code:
import UIKit
import CoreBluetooth

class ViewController: UIViewController {

    var audioVibe: AudioVibes!
    var superpowered: Superpowered!
    var displayLink: CADisplayLink!
    var layers: [CALayer]!
    var magnitudeArray: [UInt16] = [0, 0, 0, 0, 0, 0, 0]

    override func viewDidLoad() {
        super.viewDidLoad()
        // Setup 8 layers for frequency bars.
        let color: CGColorRef = UIColor(red: 0, green: 0.6, blue: 0.8, alpha: 1).CGColor
        layers = [CALayer(), CALayer(), CALayer(), CALayer(), CALayer(), CALayer(), CALayer(), CALayer()]
        for n in 0...7 {
            layers[n].backgroundColor = color
            layers[n].frame = CGRectZero
            self.view.layer.addSublayer(layers[n])
        }
        superpowered = Superpowered()
        // A display link calls us on every frame (60 fps).
        displayLink = CADisplayLink(target: self, selector: #selector(ViewController.onDisplayLink))
        displayLink.frameInterval = 1
        displayLink.addToRunLoop(NSRunLoop.currentRunLoop(), forMode: NSRunLoopCommonModes)
    }

    // Gets triggered when you leave the ViewController.
    override func viewWillDisappear(animated: Bool) {
        super.viewWillDisappear(animated)
        superpowered.togglePlayback()
        superpowered.moritz()
        //delete(superpowered);
    }

    func onDisplayLink() {
        // Get the frequency values.
        let magnitudes = UnsafeMutablePointer<Float>.alloc(8)
        superpowered.getMagnitudes(magnitudes)
        // Wrapping the UI changes in a CATransaction block like this prevents animation/smoothing.
        CATransaction.begin()
        CATransaction.setAnimationDuration(0)
        CATransaction.setDisableActions(true)
        // Set the dimension of every frequency bar.
        let originY: CGFloat = self.view.frame.size.height - 20
        let width: CGFloat = (self.view.frame.size.width - 47) / 5
        var frame: CGRect = CGRectMake(20, 0, width, 0)
        for n in 0...4 {
            frame.size.height = CGFloat(magnitudes[n]) * 2000
            frame.origin.y = originY - frame.size.height
            layers[n].frame = frame
            frame.origin.x += width + 1
        }
        // Set the magnitudes in the array.
        for n in 0...6 {
            magnitudeArray[n] = UInt16(magnitudes[n] * 32768)
        }
        // Update the array in the audioVibe class to trigger the sending command.
        audioVibe.magnitudeArray = magnitudeArray
        CATransaction.commit()
        // Dealloc the magnitudes.
        magnitudes.dealloc(8)
    }
}
I want it to stop doing things like audioVibe.magnitudeArray = magnitudeArray while its view is not visible. What can I do?
Thanks!
It looks like your display link isn't getting paused when you transition to other views.
You can pause the display link when transitioning away from this view and resume it when coming back, using the paused property on CADisplayLink: pause it in viewWillDisappear and resume it in viewWillAppear, as in the sketch below.
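A minimal sketch of that, in the same Swift 2 style as the question's code:

override func viewWillAppear(animated: Bool) {
    super.viewWillAppear(animated)
    displayLink.paused = false // resume the per-frame callback
}

override func viewWillDisappear(animated: Bool) {
    super.viewWillDisappear(animated)
    displayLink.paused = true // stop updates while this view is hidden
}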
In your viewDidLoad method you're creating a CADisplayLink and adding it to the run loop.
If you don't do anything, that display link will stay active when you push another view controller on top of it.
You should move the code that creates the display link and adds it to the run loop into your viewWillAppear method.
Then add code in viewWillDisappear that invalidates the display link (which also removes it from the run loop) and nils it out.
That way, your display link code starts when the view appears and stops when the view disappears, as in the sketch below.
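A sketch of this alternative, again matching the question's Swift 2-era code:

override func viewWillAppear(animated: Bool) {
    super.viewWillAppear(animated)
    displayLink = CADisplayLink(target: self, selector: #selector(ViewController.onDisplayLink))
    displayLink.frameInterval = 1
    displayLink.addToRunLoop(NSRunLoop.currentRunLoop(), forMode: NSRunLoopCommonModes)
}

override func viewWillDisappear(animated: Bool) {
    super.viewWillDisappear(animated)
    displayLink.invalidate() // invalidating also removes it from the run loop
    displayLink = nil
}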
