AudioKit 4.9 MIDI input no longer working on iOS

Did something change in the latest version, 4.9, regarding MIDI input? It seemed to work well with 4.7, but now only MIDI out is working. Tested on iOS 12 and 13.
On startup I'm calling midi.openInput() and then midi.addListener(self), then using the delegate functions to receive messages.

Make sure you are correctly implementing the AKMIDIListener protocol. There were some recent changes that added a portID to MIDI input events and an offset for sample-accurate MIDI handling. Your method signatures for the protocol should include these new parameters, like this:
func receivedMIDINoteOn(noteNumber: MIDINoteNumber,
                        velocity: MIDIVelocity,
                        channel: MIDIChannel,
                        portID: MIDIUniqueID? = nil,
                        offset: MIDITimeStamp = 0) {
    // handle the note-on event here
}
If you still have:
func receivedMIDINoteOn(noteNumber: MIDINoteNumber,
                        velocity: MIDIVelocity,
                        channel: MIDIChannel) {
that signature won't be called anymore; you need the two new parameters. HTH
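For reference, here is a minimal sketch of a conforming class, assuming the shared AudioKit.midi instance (substitute your own AKMIDI reference); the class name and the print statement are illustrative only:

import AudioKit

class MIDIReceiver: AKMIDIListener {
    let midi = AudioKit.midi

    func start() {
        midi.openInput()        // open all available MIDI input ports
        midi.addListener(self)  // register for AKMIDIListener callbacks
    }

    // Updated 4.9-style signature, including portID and offset
    func receivedMIDINoteOn(noteNumber: MIDINoteNumber,
                            velocity: MIDIVelocity,
                            channel: MIDIChannel,
                            portID: MIDIUniqueID? = nil,
                            offset: MIDITimeStamp = 0) {
        print("note on \(noteNumber), velocity \(velocity), channel \(channel)")
    }
}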

@Uncle Kenny,
I don't believe the MIDI input issue is with AudioKit; the change seems to revolve around how Xcode 11, iOS 13, and macOS Catalina are now handling (or not handling) MIDI. AudioKit 4.9 is the version that compiles with Xcode 11.1. Its MIDI library should be the same, but that could be the problem; Apple may have changed it without warning.
Can you get your MIDI controller to control any other MIDI app on iOS 13, such as Animoog or GarageBand? I can't trigger any of the Korg synths or GarageBand via my KMI QuNexus controller, and it used to work without a hitch prior to iOS 13. That's why I don't believe the MIDI issues are limited to AudioKit. But I could be wrong.
As you may know, many music hardware and software companies are advising musicians not to upgrade to macOS Catalina or iOS 13 if they wish to keep their existing workflow or continue performing with external MIDI devices:
https://www.sweetwater.com/sweetcare/articles/macos-10-15-catalina-compatibility-list/
https://cdm.link/2019/10/ios-13-music/
https://www.korg.com/us/news/2019/0911/
Another oddity is that the iOS 13 simulators in Xcode 11.1 don't include the MIDI drivers needed to run MIDI-enabled apps successfully. Here's a workaround:
https://github.com/AudioKit/AudioKit/issues/1872#issuecomment-536223521
I recommend that you file a bug report about it. We all should, because this is a serious issue that appears to be breaking the MIDI experience on iOS and macOS. If there are new MIDI changes, Apple should be loud and clear about what those are.
https://developer.apple.com/bug-reporting/
I hope that this helps.

This problem is solved by updating to iOS 13.3, but as Aurelius points out, you also have to update your AKMIDIListener method signatures.

Related

iOS: Concurrency is only available in iOS 15.0.0 or newer in protocol

I have an app whose deployment target is iOS 12.1, with many protocols defining functions with completion handlers, i.e.
protocol P {
    func f(_ completion: @escaping (String) -> Void)
}
I would like to replace all these with the new async/await iOS 15 syntax, for a better code readability:
protocol P {
    func f() async -> String
}
But when doing so, I get the error:
Concurrency is only available in iOS 15.0.0 or newer
What is a good solution for this, considering that I just cannot switch the deployment target from 12.1 to 15.0?
Thank you for your help
For other people who are facing this issue on a 15.0 target: this is likely related to your Xcode version. You need at least Xcode 13.2 to compile code that uses Swift Concurrency on versions older than iOS 15.
The short answer is "there is currently no solution." If you want your apps to run on iOS 12 and earlier, you can't use the async/await calls, unless you want to write two versions of all your async code: one that runs on iOS < 15, and another that runs on iOS ≥ 15.
As George mentions in his comment, Apple is trying to figure out how to "back-deploy" async/await support. If they are able to do that, you will be able to use the modern approach with older versions, but I would bet Apple will not go back as far as iOS 12.
Edit:
See Bradley's comment below. The best you will get is async/await support in iOS 13, if Apple is able to pull that off. From the link Bradley posted, iOS 12 definitely won't be supported.
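If you do end up supporting both, one way to avoid duplicating every implementation is to keep the completion-handler requirement and layer an availability-gated async convenience on top of it. A minimal sketch, assuming Xcode 13.2+ (which, per the link above, back-deploys concurrency down to iOS 13):

protocol P {
    func f(_ completion: @escaping (String) -> Void)
}

extension P {
    // Async wrapper over the completion-handler requirement;
    // callable only where the concurrency runtime is available.
    @available(iOS 13.0, *)
    func f() async -> String {
        await withCheckedContinuation { continuation in
            f { value in
                continuation.resume(returning: value)
            }
        }
    }
}

Callers on iOS 13+ can use the async version, while the iOS 12 code paths keep using the completion handler.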

Cocos2d-x app code not allowing volume control buttons to work on iPhone or iPad

I cannot confirm whether this started with a newer iOS version (perhaps 11 onwards), but the hardware volume buttons do not respond while using an app I am updating at present.
It uses SimpleAudioEngine. I have looked at some other Cocos2d-x projects of mine and tried using the SimpleAudioEngine source from those, with zero success.
The version of cocos2d-x is v2.1.1. In my HelloWorldScene.cpp I am calling:
void HelloWorld::StartSoundAction()
{
    if (soundflag == 0)
    {
        SimpleAudioEngine::sharedEngine()->playBackgroundMusic("background_music.mp3");
        soundflag = 1;
    }
    else
    {
        SimpleAudioEngine::sharedEngine()->stopAllEffects();
        SimpleAudioEngine::sharedEngine()->playEffect("choose_your_patient3.mp3");
    }
}
The music works, but I am unable to control the volume using the hardware keys on the iPhone X or the iPad Pro.
In most of my apps I use the same syntax, and it works using cocos2d-x version 3.17.2:
CocosDenshion::SimpleAudioEngine::getInstance()->stopBackgroundMusic();
CocosDenshion::SimpleAudioEngine::getInstance()->playBackgroundMusic("background4.mp3", true);
But I have only tested this on old iPhone SE hardware.

AudioKit 4.6 no sound

I've been developing an iOS app for the last year with AudioKit 4.0.4. Now that I have the app working, I thought it was time to update my AudioKit library to a newer version.
I have downloaded AudioKit 4.6 and merely swapped out the older "AudioKit For iOS.xcodeproj" in my Xcode project with the new version. Everything built just fine, except that AudioKit.start() now has to be wrapped with a "try". No other changes were needed to get a successful build.
But now my app does not produce any sound.
Here is my code for starting AudioKit:
AKSettings.audioInputEnabled = true
mix = AKMixer()
AKSettings.playbackWhileMuted = true
AudioKit.output = mix
do {
    try AudioKit.start()
    print("----- AudioKit Started -----")
} catch {
    print("Error AudioKit.start")
}
do {
    try AKSettings.setSession(category: AKSettings.SessionCategory.playback,
                              with: AVAudioSession.CategoryOptions.mixWithOthers)
} catch {
    print("Error setSession mixWithOthers")
}
In addition to no audio, I am seeing these repeated messages in the console log:
----- AudioKit Started -----
2019-04-08 15:03:45.709359-0700 HarmonicChimes[2708:2212995] [avas] AVAudioSessionPortImpl.mm:56:ValidateRequiredFields: Unknown selected data source for Port Speaker (type: Speaker)
2019-04-08 15:03:45.711236-0700 HarmonicChimes[2708:2212995] [avas] AVAudioSessionPortImpl.mm:56:ValidateRequiredFields: Unknown selected data source for Port Speaker (type: Speaker)
These AV messages show up on my iOS 12 device but not on iOS 11 and older. Some googling indicates these AV messages are Apple's problem, not AudioKit's, but I was not seeing them when running with AudioKit 4.0.4.
The no sound problem is a show stopper! I have searched for "AudioKit no sound" but not found anything that makes sense.
It would appear that 4.6 is not just a simple plug-in replacement for 4.0? Is there a new AudioKit API to get the sound started? My app's plist and capabilities are set to allow background operation; could that have something to do with this?
(I am using Xcode 10.1, macOS 10.13.6, and iOS 12.)
I posted the answer to this question on AudioKit's GitHub issues pages, but for the record here: AKOscillator was on by default in the past (bad), and this has been fixed in the newest version. So, @WholeCheese has to add an osc.start() to his files. Next, I hope to do a screen share with him to solve the Audiobus issues.
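In other words, something like this (a minimal sketch of the 4.6 behavior; the osc and mix names are illustrative):

import AudioKit

let osc = AKOscillator()
let mix = AKMixer(osc)
AudioKit.output = mix

do {
    try AudioKit.start()
} catch {
    print("AudioKit failed to start: \(error)")
}

// In 4.0.4 the oscillator produced sound as soon as AudioKit started;
// in 4.6 it stays silent until started explicitly.
osc.start()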

IOMobileFramebufferGetLayerDefaultSurface not working on iOS 9

My main question is: how can I reverse engineer a private API function that already exists but has been modified in a new version of iOS?
I have created an iOS application that records the screen content using IOSurface and IOMobileFramebuffer. The main functions used to open the framebuffer are IOMobileFramebufferGetMainDisplay(connect) and IOMobileFramebufferGetLayerDefaultSurface.
These functions have been used since the very first version of the app, and they have worked on all versions of iOS 7 and 8. However, on the latest iOS 9 beta (beta 5), IOMobileFramebufferGetLayerDefaultSurface does not work: it does not return 0, as it should when it successfully opens the framebuffer.
This other user on Stack Overflow seems to be experiencing the same issue: IOMobileFramebufferGetLayerDefaultSurface function failed on iOS 9. We have a reference to an IOMobileFramebufferConnection named _framebufferConnection and an IOSurfaceRef named _screenSurface. Here is the current code:
IOMobileFramebufferGetMainDisplay(&_framebufferConnection);
IOMobileFramebufferGetLayerDefaultSurface(_framebufferConnection, 0, &_screenSurface);
As stated before, these work perfectly on iOS 7-8, but on iOS 9, the second function crashes. I have also looked at the binaries with the symbols for both versions and compared them: the second parameter of the LDR is slightly different in iOS 9 compared to the iOS 8.4.1 binary. So, back to the main question: how can I reverse engineer IOMobileFramebufferGetLayerDefaultSurface, or see in what way it's actually been modified on iOS 9?
To answer the question of in what way it's actually been modified on iOS 9, I did some digging into IOMobileFramebufferGetLayerDefaultSurface on iOS 8 vs iOS 9 (GM). Here are the results of what I found:
Setup:
IOMobileFramebufferRef fb;
IOMobileFramebufferGetMainDisplay(&fb);
iOS 8 implementation:
Calls through to kern_GetLayerDefaultSurface
Which accesses underlying IOConnection
io_connect_t fbConnect = *(io_connect_t *)((char *)fb + 20)
To retrieve the IOSurfaceID via
IOSurfaceID surfaceID;
uint32_t outCount = 1;
IOConnectCallScalarMethod(fbConnect, 3, {0, 0}, 2, &surfaceID, &outCount)
Returns IOSurfaceLookup(surfaceID)
iOS 9 implementation:
Same steps as above, aside from the return
Then tries to retrieve a mach port to access the surface via
io_service_t fbService = *(io_service_t *)((char *)fb + 16)
mach_port_t surfacePort;
IOServiceOpen(fbService, mach_task_self(), 3, &surfacePort)
On success, return IOSurfaceLookupFromMachPort(surfacePort)
It is on the last step that IOServiceOpen returns error 0x2c7 (unsupported function). Notice that the third argument, specifying the type of connection, is 3 instead of the usual 0 used when opening the framebuffer service. It is almost certain that this new connection type has permission restrictions that prevent anyone but Apple from retrieving a mach port to access the IOMFB surface.
What's somewhat interesting is that the call to IOConnectCallScalarMethod still works to retrieve the ID of the IOMFB surface. However, it can no longer be accessed using IOSurfaceLookup because the surface is no longer global. It's a little surprising that it was global in the first place!
Hope this helps demystify why IOMFB can no longer be used to record the screen.
Source: my own use of LLDB with an iPhone 6 running iOS 8.4 and an iPhone 6 Plus running iOS 9 GM.
I believe @nevyn is correct. However, I would like to elaborate a bit more. I have looked into this exact issue extensively, and IOMobileFramebufferGetLayerDefaultSurface does return -536870201, while it should return 0 if it runs without any problems. This error code appears on the internet only in connection with generic QuickTime problems. It could be that Apple has indeed locked up the framework completely and an Apple-only entitlement is needed to access the framebuffer. We cannot add such entitlements ourselves, since they also have to be in the provisioning profile. I am currently trying to read the disassembly and do some reverse engineering work on the IOMobileFramebuffer binary to see if any of the parameters have changed since the last iOS version. I will surely update this answer if I discover anything. But if this is the case, I would suggest trying to find another method of capturing/recording the screen content.
-UPDATE-
There seems to be evidence that this is the case: if you read this, it shows the exact same error code, which means the function is "unsupported" and returns an IOKit error. At least we know what the error means now. However, I am still unsure how to fix it or make the function work. I will continue looking into this.
UPDATE 2
I have actually discovered a brand new class in iOS 9, "FigScreenCaptureController", which is part of the MediaToolbox framework! The strange thing, though, is: why would Apple include this only in iOS 9? Maybe there will be a way to record the display through this... I will be looking into this class more in depth very soon.
Not entirely correct - it's just a matter of an entitlement, as you can see if you dump the kext:
$ jtool -d __TEXT.__cstring 97.IOMobileGraphicsFamily.kext | grep com.apple
0xffffff80220c91a2: com.apple.private.allow-explicit-graphics-priority
If you self-sign (jtool --sign --ent) with this entitlement, everything works well.
This does mean that you can't use it on non-jailbroken devices. But with a jailbreak, the immense power is in your hands once more.
IOMobileFramebuffer is completely locked down on iOS 9 and cannot be used from non-Apple apps anymore. AFAICT, this closes the last private API for capturing the screen efficiently. ReplayKit is the only replacement, but it does not allow programmatic access to the actual video data.

MPMediaLibrary.DidChangeNotification not working

This question is related to Xamarin.iOS.
I have been trying for many days to get MPMediaLibrary.Notifications.ObserveDidChange to work, without success. I tried almost everything. Suspecting something bad with the Objective-C binding, I tried direct Objective-C calls too, using the Messaging API. Finally, I built a native library and made sure that it works by testing it with a pure Objective-C app. The native one with Objective-C works without problems. However, the same library, when used with Xamarin.iOS, doesn't get MPMediaLibraryDidChangeNotification. I created the built-in selector etc. within the native library so that I just call a C function without arguments, and it works in the Objective-C app. However, when used with Xamarin, the same doesn't work. I have taken care of calling beginGeneratingLibraryChangeNotifications().
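For reference, the native pattern my library wraps is essentially this (sketched here in modern Swift syntax for readability; my actual library is Objective-C):

import MediaPlayer

// Nothing is delivered unless change-notification generation is on.
MPMediaLibrary.default().beginGeneratingLibraryChangeNotifications()

// Keep a reference to the token for as long as notifications are wanted.
let token = NotificationCenter.default.addObserver(
    forName: .MPMediaLibraryDidChange,
    object: MPMediaLibrary.default(),
    queue: .main
) { _ in
    print("Media library changed")
}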
Some people may suspect that my selector/delegate is not being called because of incorrect use. However, every other notification is able to call my selector except this one, so syntax is not the issue, I suppose.
After all these efforts, I presume that there is something wrong in the Xamarin settings that is stopping me from getting MPMediaLibraryDidChangeNotification. I really don't know what exactly it is. So my question is: can you guys get this notification?
My test setup: iPhone 6 running iOS 8.0.2. Xamarin Studio 5.5.3 (build 6), Installation UUID: d84b8c6d-f992-4f19-8a35-c14bcd08420e. Runtime: Mono 3.10.0 (detached/e204655), GTK+ 2.24.23 (Raleigh theme), package version 310000023. Apple Developer Tools: Xcode 6.1 (6604), Build 6A1052d. Xamarin.iOS 8.4.0.16 (Indie Edition), Hash: 80e9ff7, build date 2014-10-22 15:09:12-0400.
Thanks, Vinay
For the record, I am posting the answer.
Since the 64-bit transition, MediaLibrary change notifications have stopped working for 32-bit apps. If you build your app for 64-bit iOS, everything is fine. However, 64-bit devices running 32-bit applications won't receive these notifications. I have tested this thoroughly on an iPhone 6. So I think this is an iOS bug, which Apple needs to rectify: all the music player applications on the App Store that are still 32-bit are unable to update their libraries anymore.
For Xamarin users, use the Unified API for proper notification support.
