I've been developing an iOS app for the last year with AudioKit 4.0.4. Now that the app is working, I thought it was time to update my AudioKit library to a newer version.
I downloaded AudioKit 4.6 and simply swapped the older "AudioKit For iOS.xcodeproj" in my Xcode project for the new version. Everything built just fine, except that AudioKit.start() now has to be wrapped in a "try". No other changes were needed to get a successful build.
But now my app does not produce any sound.
Here is my code for starting AudioKit:
AKSettings.audioInputEnabled = true
mix = AKMixer()
AKSettings.playbackWhileMuted = true
AudioKit.output = mix
do {
    try AudioKit.start()
    print("----- AudioKit Started -----")
} catch {
    print("Error AudioKit.start")
}
do {
    try AKSettings.setSession(category: .playback, with: .mixWithOthers)
} catch {
    print("Error setSession mixWithOthers")
}
In addition to no audio, I am seeing these repeated messages in the console log:
----- AudioKit Started -----
2019-04-08 15:03:45.709359-0700 HarmonicChimes[2708:2212995] [avas] AVAudioSessionPortImpl.mm:56:ValidateRequiredFields: Unknown selected data source for Port Speaker (type: Speaker)
2019-04-08 15:03:45.711236-0700 HarmonicChimes[2708:2212995] [avas] AVAudioSessionPortImpl.mm:56:ValidateRequiredFields: Unknown selected data source for Port Speaker (type: Speaker)
These AV messages show on my iOS 12 device but not on iOS 11 and older. Some googling indicates these AV messages are Apple's problem, not AudioKit's, but I was not seeing them when running with AudioKit 4.0.4.
The no-sound problem is a showstopper! I have searched for "AudioKit no sound" but have not found anything that makes sense.
It would appear that 4.6 is not just a simple plug-in replacement for 4.0. Is there a new AudioKit API to get the sound started? My app's plist and capabilities are set to allow background operation; could that have something to do with this?
(I am using Xcode 10.1, macOS 10.13.6, and iOS 12.)
I posted the answer to this question on AudioKit's GitHub issues page, but for the record here: AKOscillator was on by default in the past (bad), and this has been fixed in the newest version. So, @WholeCheese has to add an osc.start() to his files. Next, I hope to do a screen share with him to solve the Audiobus issues.
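For illustration, the fix looks something like this (a sketch only; the oscillator name and signal chain here are assumed, not taken from the original project):
import AudioKit

// Build the signal chain: oscillator -> mixer -> output.
let osc = AKOscillator()
let mix = AKMixer(osc)
AudioKit.output = mix
do {
    try AudioKit.start()
} catch {
    print("AudioKit failed to start: \(error)")
}
// In newer AudioKit versions nodes no longer play by default,
// so the oscillator must be started explicitly.
osc.start()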
This is my first WatchKit App.
I am working on an app that makes an API call for an iCal file and displays the time remaining in two events.
I am working in Xcode 13.4.1 and created an iOS app with a watchOS app using Swift and storyboards. I set the deployment target for the iOS app to 15.5, which is the highest version of iOS that my current version of Xcode will let me set. The deployment target for the watchOS app is set to 6.2 because I have an Apple Watch Series 2, and the latest version of watchOS it supports is 6.3. I want the app to at least work on my watch. The code for the iOS app and watch app is pretty much the same. The app works as expected on the watch and phone simulators and on my physical iPhone 8 Plus. The problem is that the app does not work on my physical watch.
I created a custom class called API_Manager that handles the API call. I handle the response as one giant string because I am getting an iCal file, which I parse later in the class. url is defined above. Here is my API call for both the phone and watch app:
public func getData() throws {
    // api call
    do {
        // real url is defined above
        let url: String = ""
        savedData = try String(contentsOf: URL(string: url)!)
    } catch {
        print("there was an error with the api call")
        throw API_Error.apiCallError
    }
    // parsing data
}
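(For reference, the same fetch written against URLSession, which avoids blocking on the synchronous String(contentsOf:) call, would look roughly like this; the function name and completion shape below are assumptions, not code from the app:)
import Foundation

// A rough URLSession-based equivalent of the synchronous fetch above.
func getDataAsync(from urlString: String,
                  completion: @escaping (Result<String, Error>) -> Void) {
    guard let url = URL(string: urlString) else {
        completion(.failure(API_Error.apiCallError))
        return
    }
    URLSession.shared.dataTask(with: url) { data, _, error in
        if let data = data, let text = String(data: data, encoding: .utf8) {
            completion(.success(text))
        } else {
            completion(.failure(error ?? API_Error.apiCallError))
        }
    }.resume()
}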
getData() is called in the init function of the API_Manager class.
init() throws {
    // checking internet connection
    if !NetworkMonitor.shared.isConnected {
        throw API_Error.noInternet
    }
    do {
        try getData()
    } catch {
        throw API_Error.apiCallError
    }
}
I believe the root of the problem is the Internet connection. I check for an Internet connection before the API call, and if there is no connection, I throw a custom error. I used this video to create a NetworkMonitor class. This is the startMonitoring() function, which is called in applicationDidFinishLaunching() in the watch app's ExtensionDelegate and in application(_:didFinishLaunchingWithOptions:) in the AppDelegate.
public func startMonitoring() {
    monitor.start(queue: queue)
    monitor.pathUpdateHandler = { [weak self] path in
        self?.isConnected = path.status == .satisfied
        // getting connection type
        self?.getConnectionType(path)
    }
}
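The NetworkMonitor class itself isn't shown; for context, a minimal version consistent with the snippets above might look like this (a sketch, including only the members referenced elsewhere in the question):
import Foundation
import Network

final class NetworkMonitor {
    static let shared = NetworkMonitor()

    private let monitor = NWPathMonitor()
    private let queue = DispatchQueue(label: "NetworkMonitor")

    // Updated from the path handler whenever connectivity changes.
    private(set) var isConnected = false

    public func startMonitoring() {
        monitor.pathUpdateHandler = { [weak self] path in
            self?.isConnected = path.status == .satisfied
        }
        monitor.start(queue: queue)
    }
}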
I created a custom error that is thrown throughout the API_Manager if things don't work as expected. Here is how I created the custom errors:
enum API_Error: Error {
    case noInternet
    case apiCallError
}

extension API_Error: CustomStringConvertible {
    public var description: String {
        switch self {
        case .noInternet:
            return "Error: No internet connection"
        case .apiCallError:
            return "There was an error with the api call"
        }
    }
}
I created another enum that is used to display text on a label in the Interface Controller if a specific error is thrown.
enum API_Manager_Status {
    case active
    case noInternet
    case apiCallError
    case null // if the api manager is nil
}
Here is how I used the two enums in InterfaceController:
main() is called in the awake(withContext:) function.
func main() {
    do {
        apiManager = try API_Manager()
        apiManagerStatus = .active
    } catch API_Error.noInternet {
        apiManagerStatus = .noInternet
    } catch API_Error.apiCallError {
        apiManagerStatus = .apiCallError
    } catch {
        apiManagerStatus = .null
    }
    if apiManagerStatus == .noInternet {
        periodLabel.setText("Error")
        timeRemaining.setText("No Internet")
        print(API_Error.noInternet)
    } else if apiManagerStatus == .apiCallError {
        periodLabel.setText("Error")
        timeRemaining.setText("API Call")
        print(API_Error.apiCallError)
    } else if apiManagerStatus == .null {
        periodLabel.setText("Error")
        timeRemaining.setText("Other Error")
        print("Caught all errors")
    } else {
        // main code to display time remaining
    }
}
So I guess my questions are:
Why is my Apple Watch not getting an Internet connection?
How can I fix my app if it worked on the simulators and iPhone, but not watch?
Can the Apple Watch do an API call?
Do I need to configure my app to connect with the iPhone, given that my watchOS version is 6.2?
Sorry if this is a long question. This is my first time posting a question on Stack Overflow. I would appreciate any help; let me know if I need to add any more information to help clear things up.
I have tried many potential solutions to fix the problem, but none have worked:
I distributed the app on Test Flight and had my dad download it. He has an Apple Watch Series 4 running on watchOS 9.2, and the app still does not work on his watch.
I have restarted my phone and watch multiple times and even tried to run the app with my phone off, so the watch could connect to wifi.
I tried to install a watchOS 9 beta profile on my watch, knowing it would not work, but I still wanted to see what would happen.
I tried setting the build target to watchOS 8.5, which is the highest my version of Xcode would allow, to see if the problem was watchOS 6.2, and if it would work on my dad's watch.
Lastly, I tried adding privacy messages in the Info.plist for both the phone and the watch app. The messages did not display when I re-downloaded the app, and it still did not fix the issue.
(The messages I used were "Privacy - Nearby Interaction Usage Description", which does not work on watchOS 6.2 but which I kept in the Info.plist anyway, and "Privacy - Bluetooth Always Usage Description".)
I know Apple Watches are supposed to use the connected phone for Internet, and then Wi-Fi if a phone is not connected. My watch shows that my phone is connected, and I have also tried it with just Wi-Fi. I believe my watch can still get an Internet connection because I can still use Siri and even go to the App Store on my watch, both of which need an Internet connection.
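On question 4: the watch can make its own network requests on watchOS 6, but if the data should come through the phone instead, WatchConnectivity is the usual route. A minimal watch-side activation sketch (an assumption about how the app might be wired, not code from the question):
import WatchConnectivity

final class ConnectivityManager: NSObject, WCSessionDelegate {
    func activate() {
        guard WCSession.isSupported() else { return }
        WCSession.default.delegate = self
        WCSession.default.activate()
    }

    // Watch-side delegate requirement; once the session is active,
    // the watch app can exchange messages and data with the phone.
    func session(_ session: WCSession,
                 activationDidCompleteWith activationState: WCSessionActivationState,
                 error: Error?) {
    }
}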
I have made an app that tracks 2D body-motion data using the back camera. I followed Apple's example to create an ARView from the storyboard, and added this code in my viewDidAppear:
arView.session.delegate = self

// If the iOS device doesn't support body tracking, raise a developer
// error for this unhandled case.
guard ARBodyTrackingConfiguration.isSupported else {
    fatalError("This feature is only supported on devices with an A12 chip")
}

// Run a body tracking configuration.
let configuration = ARBodyTrackingConfiguration()
arView.session.run(configuration)
When installing from Xcode everything works perfectly, but the app crashes when installing from TestFlight. The crash log shows the last backtrace entry at arView.session.delegate = self.
I've found a workaround in case someone faces the same problem. According to Apple, this is a known issue in Xcode 11. To solve it, you need to manually add RealityKit to your target's Link Binary With Libraries build phase. Hope this helps.
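If editing the build phase isn't an option, referencing the framework from code may have the same effect via auto-linking (this is an assumption, not the documented fix):
// Somewhere in a source file compiled into the app target.
// Importing RealityKit should cause Xcode to auto-link the framework;
// the build-phase fix above remains the verified route.
import RealityKit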
Did something change with the latest version, 4.9, regarding MIDI input? It seemed to work well with 4.7, but now only MIDI out is working. Tested using iOS 12 and 13.
On startup I'm calling midi.openInput() and then midi.addListener(self), then using the delegate functions to receive messages.
Make sure you are correctly implementing the AKMIDIListener protocol. There were some recent changes that added a portID for the MIDI input and an offset for sample-accurate MIDI handling. Your method signatures for the protocol should include these new elements, like this:
func receivedMIDINoteOn(noteNumber: MIDINoteNumber,
                        velocity: MIDIVelocity,
                        channel: MIDIChannel,
                        portID: MIDIUniqueID? = nil,
                        offset: MIDITimeStamp = 0) {
If you still have:
func receivedMIDINoteOn(noteNumber: MIDINoteNumber,
                        velocity: MIDIVelocity,
                        channel: MIDIChannel) {
that won't be called anymore; you need the two new parameters. HTH
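Putting it together, a minimal 4.9-style receiver might look like this (a sketch; only the note-on callback is shown, and the shared AudioKit.midi instance is assumed):
import AudioKit

class MIDIReceiver: AKMIDIListener {
    let midi = AudioKit.midi

    func start() {
        midi.openInput()
        midi.addListener(self)
    }

    // Note the portID and offset parameters added in 4.9.
    func receivedMIDINoteOn(noteNumber: MIDINoteNumber,
                            velocity: MIDIVelocity,
                            channel: MIDIChannel,
                            portID: MIDIUniqueID? = nil,
                            offset: MIDITimeStamp = 0) {
        print("Note on \(noteNumber), velocity \(velocity), channel \(channel)")
    }
}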
@Uncle Kenny,
I don't believe the MIDI input issue is with AudioKit; the change seems to revolve around how Xcode 11, iOS 13, and macOS Catalina now handle (or fail to handle) MIDI. AudioKit 4.9 is the version that compiles with Xcode 11.1. Its MIDI library should be the same, but that could be the problem; Apple may have changed something without warning.
Can you get your MIDI controller to control any other MIDI app on iOS 13, such as Animoog or GarageBand? I can't trigger any of the Korg synths or GarageBand via my KMI QuNexus controller, and it worked without a hitch prior to iOS 13. That's why I don't believe the MIDI issues are limited to AudioKit. But I could be wrong.
As you may know, many music hardware and software companies are advising musicians not to upgrade to macOS Catalina or iOS 13 if they wish to keep their existing workflow or continue performing with external MIDI devices:
https://www.sweetwater.com/sweetcare/articles/macos-10-15-catalina-compatibility-list/
https://cdm.link/2019/10/ios-13-music/
https://www.korg.com/us/news/2019/0911/
Another oddity: the iOS 13 simulators in Xcode 11.1 don't include the necessary MIDI drivers to run MIDI-enabled apps successfully. Here's a workaround:
https://github.com/AudioKit/AudioKit/issues/1872#issuecomment-536223521
I recommend that you file a bug report about it. We all should, because this is a serious issue that appears to be breaking the MIDI experience on iOS and macOS. If there are new MIDI changes, Apple should be loud and clear about what those are.
https://developer.apple.com/bug-reporting/
I hope that this helps.
This problem is solved by updating to iOS 13.3, but as Aurelius points out, you also have to update your AKMIDIListener protocol methods.
I'm making a video stream app using VLC player.
I installed the mobileVLCKit-unstable pod and successfully streamed video, but playback stops within a minute.
I found that the VLC library's 'hardware decoding' option is on, but I can't find out how to turn it off.
This is what I tried:
myplayer = VLCMediaPlayer()
myplayer.media.addOptions(["network-caching": 1000]) // this is a hint
myplayer.media.addOptions(["hardware-decoding": false]) // I tried this, but it didn't work
myplayer.media.addOptions(["avcodec": false]) // I tried this, but it didn't work
I'm using Swift 4, Xcode 10.
You shouldn't use the unstable pod of MobileVLCKit anymore. It is no longer needed and will give you a very old and unstable version of the library, as we no longer update that pod. Just use the normal MobileVLCKit pod and try again.
Disabling hardware decoding will NOT solve your problem. Please post a debug log of the stable library so we can take a more detailed look.
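For the debug log, library-level logging can be switched on before creating the player; something like this (assuming the VLCLibrary debug properties, which may differ between releases):
import MobileVLCKit

// Enable verbose logging, then reproduce the stall and collect
// the console output for the report.
VLCLibrary.shared().debugLogging = true
VLCLibrary.shared().debugLoggingLevel = 3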
I solved this problem.
I added an option:
let options: [String] = ["--codec=avcodec"]
let player: VLCMediaPlayer = VLCMediaPlayer(options: options)
With this, I confirmed playback running for over 30 minutes.
Since upgrading to iOS 10 and Xcode 8, my iOS app has been throwing an error and crashing whenever I turn off the screen using the lock button. The error is:
*** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'error 561015905'
I'm not explicitly using Core Audio, or any audio at all. But I am using SceneKit, which I assume uses Core Audio.
Actually, this behavior doesn't seem to be related to my code at all. It happens on a brand-new, untouched SceneKit template! It doesn't happen in the simulator, but it happens consistently when testing with an iPhone 5. I haven't tried it with another model.
Steps to reproduce:
1. Create a new project in Xcode 8.0 using the "Game" template with SceneKit.
2. Set your team in the project editor for code signing.
3. Connect an iPhone 5 for testing.
4. Build and run the app.
5. Once it starts (and you see the rotating plane), hit the lock button. The error occurs and the app won't return from the lock screen.
Google results for the error message all seem to be from people actually using Core Audio or trying to play sound, which doesn't apply here.
What is this error and what can be done about it?
This is an Apple bug that has two workarounds while we wait for the fix in iOS 10.2:
(1) (worse) enable background audio
(2) (better) see the Apple message below
Message from Apple:
This is a known issue that will be fixed in 10.2. In the meantime another simpler workaround should work:
Trigger the audio engine creation yourself before entering the background (for example at setup).
You can trigger this simply by getting the audio engine from the SCNView:
scnView.audioEngine;
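In Swift, that one property access during setup is enough; a sketch assuming the template's GameViewController:
import UIKit
import SceneKit

class GameViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Touch the audio engine once so it is created before the app
        // can be backgrounded, avoiding the Core Audio error on lock.
        if let scnView = self.view as? SCNView {
            _ = scnView.audioEngine
        }
    }
}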