I am using the following code to get the current SSID:
var dict = CaptiveNetwork.CopyCurrentNetworkInfo(curInterface);
string localSsid = dict [CaptiveNetwork.NetworkInfoKeySSID].ToString();
Console.WriteLine("Current Local SSID: " + localSsid);
However, the result is often out of date (on the order of minutes or more). I noticed that if I go into Settings on the iPad and manually switch to another network for just 5 seconds and then switch back, the code then captures the correct SSID.
I am running iOS 5.0 and MonoTouch 3.2.12. Is my code incorrect or is this a bug in MonoTouch or iOS?
MonoTouch.SystemConfiguration.CaptiveNetwork.CopyCurrentNetworkInfo is a direct p/invoke to Apple's CNCopyCurrentNetworkInfo.
IOW there's no caching of the data being done by MonoTouch, but from your description I assume iOS is doing some by itself. Not sure whether that would be considered a bug, but I encourage you to file a bug report with Apple.
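If you want to rule out the binding (or any caching on the managed side), you can call the underlying C API directly from a small native test. A minimal Objective-C sketch, assuming you just want to log the SSID of each supported interface:
#import <Foundation/Foundation.h>
#import <SystemConfiguration/CaptiveNetwork.h>

// Enumerate the supported Wi-Fi interfaces (usually just "en0") and log each SSID.
NSArray *interfaces = CFBridgingRelease(CNCopySupportedInterfaces());
for (NSString *interfaceName in interfaces) {
    NSDictionary *info = CFBridgingRelease(
        CNCopyCurrentNetworkInfo((__bridge CFStringRef)interfaceName));
    NSLog(@"SSID on %@: %@", interfaceName,
          [info objectForKey:(__bridge NSString *)kCNNetworkInfoKeySSID]);
}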
MonoTouch 3.2.12
That's likely 5.2.12 :-)
I have a cycript backboardd script that works great on iOS 7 for modifying an app to continue running in the background.
app = [BKProcess processForPid:$PID];
alive = [[BKProcessAssertion alloc] initWithReason:7 identifier:"AppKeepAlive"];
[alive setFlags:0xF];
[sc addAssertion:alive];
This is all that's needed. However, on iOS 8 this doesn't work, as BKProcess is now BKSProcess and BKProcessAssertion is now BKSProcessAssertion, and they have different methods.
There doesn't seem to be a way to attach the assertion to the app like on iOS 7.
Can someone please help me get this working under iOS 8?
There is. However, when I did this I simply used the initializer that takes the required PID:
- (id)initWithPID:flags:reason:name:withHandler:
Check out the BKSProcessAssertion header for reference.
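In cycript that comes out roughly like the sketch below. This is only an assumption-laden example based on the dumped header: the flags and reason values are simply carried over from the iOS 7 snippet above, the name string is arbitrary, and passing nil for the completion handler is a guess on my part.
// Acquire a keep-alive assertion for the target process directly by PID.
alive = [[BKSProcessAssertion alloc] initWithPID:$PID
                                           flags:0xF
                                          reason:7
                                            name:@"AppKeepAlive"
                                     withHandler:nil];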
If you want to see an implementation, check out MessageBox (now deprecated, but still useful as a reference).
My main question is, how can I reverse engineer a private API function that already exists, but has been modified in a new version of iOS?
I have created an iOS application to record the screen content using IOSurface and IOMobileFramebuffer. The main functions used to open the framebuffer are IOMobileFramebufferGetMainDisplay(connect) and IOMobileFramebufferGetLayerDefaultSurface.
These functions have been used since the very first version of the app, and they have worked on all versions of iOS 7 and 8. However, on the latest iOS 9 beta, which is beta 5, the function IOMobileFramebufferGetLayerDefaultSurface does not work. The function does not return 0, as it should when it successfully opens the framebuffer.
This other user on Stack Overflow seems to be experiencing the same issue: IOMobileFramebufferGetLayerDefaultSurface function failed on iOS 9. We have a reference to an IOMobileFramebufferConnection named "_framebufferConnection" and an IOSurfaceRef named "_screenSurface". Here is the current code:
IOMobileFramebufferGetMainDisplay(&_framebufferConnection);
IOMobileFramebufferGetLayerDefaultSurface(_framebufferConnection, 0, &_screenSurface);
As stated before, these work perfectly on iOS 7-8, but on iOS 9 the second function crashes. I have also looked at the binaries with the symbols for both versions and compared them. The second parameter of the LDR is slightly different in iOS 9 compared to the iOS 8.4.1 binary. So, back to the main question: how can I reverse engineer IOMobileFramebufferGetLayerDefaultSurface, or see in what way it's actually been modified on iOS 9?
To answer the question of "in what way it's actually been modified on iOS 9", I did some digging into IOMobileFramebufferGetLayerDefaultSurface on iOS 8 vs iOS 9 (GM). Here are the results of what I found:
Setup:
IOMobileFramebufferRef fb;
IOMobileFramebufferGetMainDisplay(&fb);
iOS8 Implementation:
Calls through to kern_GetLayerDefaultSurface
Which accesses underlying IOConnection
io_connect_t fbConnect = *(io_connect_t *)((char *)fb + 20)
To retrieve the IOSurfaceID via
IOSurfaceID surfaceID;
uint32_t outCount = 1;
IOConnectCallScalarMethod(fbConnect, 3, {0, 0}, 2, &surfaceID, &outCount)
Returns IOSurfaceLookup(surfaceID)
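Pieced together, the iOS 8 path looks roughly like the sketch below. This is only a reconstruction from the notes above: the +20 offset and selector 3 come from disassembly, and the IOMobileFramebuffer and IOSurface declarations are private, so everything here is an assumption rather than public API.
#include <stdint.h>
#include <mach/mach.h>
#include <IOKit/IOKitLib.h>

typedef void *IOMobileFramebufferRef;                        // opaque private type
typedef uint32_t IOSurfaceID;
typedef struct __IOSurface *IOSurfaceRef;
extern IOSurfaceRef IOSurfaceLookup(IOSurfaceID surfaceID);  // private on iOS at the time

static IOSurfaceRef CopyDefaultSurface_iOS8(IOMobileFramebufferRef fb) {
    // The user-client connection handle sits 20 bytes into the opaque object.
    io_connect_t fbConnect = *(io_connect_t *)((char *)fb + 20);

    uint64_t input[2] = {0, 0};
    uint64_t surfaceID = 0;
    uint32_t outCount = 1;
    // Selector 3 asks the IOMobileFramebuffer user client for the surface ID.
    if (IOConnectCallScalarMethod(fbConnect, 3, input, 2, &surfaceID, &outCount) != KERN_SUCCESS)
        return NULL;

    // On iOS 8 the surface was still global, so a plain lookup by ID succeeded.
    return IOSurfaceLookup((IOSurfaceID)surfaceID);
}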
iOS9 Implementation:
Same steps as above aside from the return
Then tries to retrieve a mach port to access the surface via
io_service_t fbService = *(io_service_t *)((char *)fb + 16)
mach_port_t surfacePort;
IOServiceOpen(fbService, mach_task_self(), 3, &surfacePort)
On success, return IOSurfaceLookupFromMachPort(surfacePort)
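And the corresponding iOS 9 path, as a sketch under the same assumptions and reusing the declarations from the previous snippet; the +16 offset and connection type 3 are taken from the disassembly notes above, and IOSurfaceLookupFromMachPort is likewise private.
extern IOSurfaceRef IOSurfaceLookupFromMachPort(mach_port_t port);  // private on iOS

static IOSurfaceRef CopyDefaultSurface_iOS9(IOMobileFramebufferRef fb) {
    // Surface-ID retrieval is unchanged; the difference is the extra IOServiceOpen step.
    io_service_t fbService = *(io_service_t *)((char *)fb + 16);

    mach_port_t surfacePort = MACH_PORT_NULL;
    // Connection type 3 is the new, restricted way to obtain a port for the surface;
    // this is the call that fails with kIOReturnUnsupported for non-Apple callers.
    if (IOServiceOpen(fbService, mach_task_self(), 3, &surfacePort) != KERN_SUCCESS)
        return NULL;

    return IOSurfaceLookupFromMachPort(surfacePort);
}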
It is on the last step that IOServiceOpen returns error 0x2c7 (unsupported function). Notice that the 3rd argument specifying the type of connection is 3 instead of the usual 0 when opening the framebuffer service. It is almost certain that this new connection type has permissions restrictions that prevent anyone but Apple from retrieving a mach port to access the IOMFB surface.
What's somewhat interesting is that the call to IOConnectCallScalarMethod still works to retrieve the ID of the IOMFB surface. However, it can no longer be accessed using IOSurfaceLookup because the surface is no longer global. It's a little surprising that it was global in the first place!
Hope this helps demystify why IOMFB can no longer be used to record the screen.
Source: My own use of LLDB with an iPhone6 running iOS 8.4 and an iPhone6+ running iOS9 GM
I believe @nevyn is correct, but I would like to elaborate a bit more. I have looked into this exact issue extensively, and IOMobileFramebufferGetLayerDefaultSurface does return -536870201 where it should return 0 if it runs without any problems. Searching for that error code online only turns up reports of generic problems with QuickTime. It may be that Apple has indeed locked the framework down completely and that an Apple-only entitlement is now needed to access the framebuffer. We cannot add such entitlements ourselves, since they also have to be present in the provisioning profile. I am currently reading the disassembly and doing some reverse-engineering work on the IOMobileFramebuffer binary to see whether any of the parameters have changed since the last iOS version, and I will update this answer if I discover anything. If this is the case, though, I would suggest finding another way to capture/record the screen content.
UPDATE
There seems to be evidence that this is the case: if you read this, it shows the exact same error code, which means the function is "unsupported" and returns an IOKit error. At least we know what the code means now. However, I am still unsure how to fix it or make the function work. I will continue looking into this.
UPDATE 2
I have actually discovered a brand-new class in iOS 9, "FigScreenCaptureController", which is part of the MediaToolbox framework! The strange thing, though, is why Apple would include this only in iOS 9. So maybe there will be a way to record the display through this... I will be looking into this class in more depth very soon.
Not entirely correct. It's just a matter of an entitlement, as you can see if you dump the kext:
$ jtool -d __TEXT.__cstring 97.IOMobileGraphicsFamily.kext | grep com.apple
0xffffff80220c91a2: com.apple.private.allow-explicit-graphics-priority
If you self-sign (jtool --sign --ent) with this entitlement, everything works well.
This does mean that on non-JB devices you can't use it. But with a jailbreak the immense power is in your hands once more.
IOMobileFramebuffer is completely locked down on iOS 9 and cannot be used from non-Apple apps anymore. AFAICT, this closes the last private API to capture the screen efficiently. ReplayKit is the only replacement, but does not allow programmatic access to the actual video data.
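For completeness, the ReplayKit replacement looks roughly like the sketch below (public iOS 9 API, called from within a view controller in this example); note that you only get a preview view controller back for the user to edit and share, not the raw frames.
#import <ReplayKit/ReplayKit.h>

// Start recording; iOS presents its own permission prompt to the user.
[[RPScreenRecorder sharedRecorder] startRecordingWithMicrophoneEnabled:NO
                                                               handler:^(NSError *error) {
    if (error) NSLog(@"Could not start recording: %@", error);
}];

// Later, stop recording and offer the built-in preview/share UI.
[[RPScreenRecorder sharedRecorder] stopRecordingWithHandler:^(RPPreviewViewController *preview, NSError *error) {
    if (preview) {
        [self presentViewController:preview animated:YES completion:nil];
    }
}];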
I am working on an iOS app. I want it to support iOS 7 and 8. It is going pretty nicely; however, there are lots of different parts of the app which use Apple APIs. Some of these APIs work in both iOS 7 and 8, but some of them are deprecated in iOS 8, so I went to the Apple developer site to see what to replace them with (new methods, etc.).
However, I now have the problem that the app works fine on iOS 8, but certain parts of it don't work properly on iOS 7, as I'm trying to use iOS 8 APIs (lol).
So I just wanted to know, what is the best way to implement code which works on iOS 8 and 7. I had a few ideas (below), but I'm not sure which is best:
IDEA 1
Whenever I have code which doesn't work on both OS versions, I use an if statement (which calls a macro) like so:
if (SYSTEM_VERSION_LESS_THAN(@"8.0")) {
// iOS 7 device. Use iOS 7 apis.
}
else {
// iOS 8 (or higher) - use iOS 8 apis.
}
IDEA 2
I was thinking about using ifdef definitions all around the app like so:
#ifdef __IPHONE_8_0
// iOS 8 code here....
#else
// iOS 7 code here....
#endif
Which way is better? I would have thought that the second idea is much faster and uses fewer resources, right?
Or are both my ideas rubbish? Is there a much better way about solving this problem?
Thanks for your time, Dan.
I don't suggest checking the OS version and writing code based on that. Instead, you should check whether the API is available or not.
To check whether a class is available:
Class checkClass = NSClassFromString(@"CheckingClass");
if (checkClass)
{
// Available
}
else
{
// Not Available
}
If you need to check whether a feature/method is available:
// Note: respondsToSelector: on a Class checks class methods; use
// instancesRespondToSelector: if yourMethod: is an instance method.
if ([checkClass respondsToSelector:@selector(yourMethod:)])
{
// Feature/ Method Available
}
else
{
// Feature/ Method Not Available
}
NOTE:
A deprecated API doesn't mean you shouldn't use it in the current version. It means it may be removed in a later version and is only guaranteed to work up to the current one.
The #ifdef approach won't work, because preprocessor statements are evaluated at compile time, but only at runtime do we know which iOS version we have to deal with.
You would use macros, for example, if you wanted to support Mac OS X and iOS with the same code, because you know at compile time whether the binary will be for Mac OS or iOS.
So in this case you need approach 1, or, even better, use respondsToSelector: to check for availability instead of testing the iOS version where possible.
However, because you are only dealing with deprecation warnings, you don't have to do anything and can simply continue using the deprecated APIs until the app no longer needs to support iOS 7.
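To make the runtime check concrete, here is one possible shape of it, using UIAlertController (iOS 8) versus UIAlertView (iOS 7) purely as an illustrative pair inside a view controller; substitute whichever APIs you actually need:
if (NSClassFromString(@"UIAlertController") != nil) {
    // iOS 8 and later: the new API is available.
    UIAlertController *alert = [UIAlertController alertControllerWithTitle:@"Hello"
                                                                   message:@"iOS 8 path"
                                                            preferredStyle:UIAlertControllerStyleAlert];
    [alert addAction:[UIAlertAction actionWithTitle:@"OK"
                                              style:UIAlertActionStyleDefault
                                            handler:nil]];
    [self presentViewController:alert animated:YES completion:nil];
} else {
    // iOS 7: fall back to the older API.
    [[[UIAlertView alloc] initWithTitle:@"Hello"
                                message:@"iOS 7 path"
                               delegate:nil
                      cancelButtonTitle:@"OK"
                      otherButtonTitles:nil] show];
}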
This question is related to Xamarin.iOS.
I have been trying for many days to get MPMediaLibrary.Notifications.ObserveDidChange to work, without success. I have tried almost everything. Suspecting something wrong with the Objective-C binding, I also tried direct Objective-C calls using the Messaging API. Finally, I built a native library and made sure that it works by testing it with a pure Objective-C app. The native version works without problems. However, the same library, when used with Xamarin.iOS, doesn't get MPMediaLibraryDidChangeNotification. I created a built-in selector etc. within the native library so that I just call a C function without arguments, and that works with the Objective-C app. However, when used with Xamarin, the same doesn't work. I have taken care of calling beginGeneratingLibraryChangeNotifications().
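For reference, the registration on the native side is just the standard pattern, roughly like this (a sketch, not my exact library code):
#import <MediaPlayer/MediaPlayer.h>

// Ask the media library to start emitting change notifications, then observe them.
[[MPMediaLibrary defaultMediaLibrary] beginGeneratingLibraryChangeNotifications];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(libraryDidChange:)
                                             name:MPMediaLibraryDidChangeNotification
                                           object:nil];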
Some people may suspect that my selector/delegate is not being called because of incorrect usage. However, every other notification is able to call my selector except this one, so syntax is not the issue, I suppose.
After all these efforts, I presume that there is something wrong in the Xamarin settings which is stopping me from getting MPMediaLibraryDidChangeNotification. I really don't know what exactly it is. So my question is: can you guys get this notification?
My test phone - iPhone6-8.0.2, Xamarin Studio Version 5.5.3 (build 6) Installation UUID: d84b8c6d-f992-4f19-8a35-c14bcd08420e Runtime: Mono 3.10.0 ((detached/e204655) GTK+ 2.24.23 (Raleigh theme) Package version: 310000023 Apple Developer Tools Xcode 6.1 (6604) Build 6A1052d Xamarin.iOS Version: 8.4.0.16 (Indie Edition) Hash: 80e9ff7 Branch: Build date: 2014-10-22 15:09:12-0400
Thanks, Vinay
For the record, I am posting the answer.
Since the 64-bit transition, the media library change notification is no longer delivered to 32-bit apps. If you build your app for 64-bit iOS, everything is fine. However, 64-bit devices running 32-bit applications won't receive these notifications. I have tested this thoroughly on an iPhone 6, so I think this is an iOS bug which Apple needs to rectify. All the music player applications on the App Store that are still 32-bit are unable to update their libraries anymore.
For Xamarin users, use the Unified API for proper notification support.
I can't record audio using the SpeakHere example app from Apple. When I run the app in the Simulator from within Xcode, it starts up normally, but when I press the record button, the error "Thread 1: EXC_BREAKPOINT (code=EXC_I386_BPT, subcode=0x0)" occurs.
The log message about the missing root view controller at app startup is already there BEFORE the above error occurs and it is probably not connected to my problem.
I have downloaded the SpeakHere example project from the linked website (see top of this question), opened the fresh download in Xcode and directly started the app. I did not modify any settings or any line of code. I've also searched Google and Stack Overflow for this problem and didn't find a solution, although this problem must be very common.
I use Xcode Version 4.5.2 (4G2008a) and a MacBook Pro from late 2009 with Mac OS X 10.8.
I've also had a friend try this on his computer and he has the very same problem. He has the same OS and his Xcode version is also 4.5.2.
I would try older Xcode versions next, but right now I don't want to download a few gigabytes for a trial-and-error approach on my connection.
Any help appreciated, including reports like "works for me with Xcode version ...". Thanks!
The problem occurs because in the method AQRecorder::StartRecord(CFStringRef inRecordFile), the function CFURLCreateWithString() fails and returns NULL. This is not detected, and later the code calls CFRelease() on this NULL pointer, which causes the EXC_BREAKPOINT.
CFURLCreateWithString() basically takes a URL string as input and returns a CFURL object. The problem here is that the input is not a URL string; it's simply a path on the local file system, without a file:// or similar prefix. For this reason, the function fails.
The solution is to replace the failing call to CFURLCreateWithString() with the related function CFURLCreateWithFileSystemPath(), which is designed to take a local file system path and convert it to a CFURL:
In the method AQRecorder::StartRecord(CFStringRef inRecordFile), replace or comment out the line
url = CFURLCreateWithString(kCFAllocatorDefault, (CFStringRef)recordFile, NULL);
and insert
url = CFURLCreateWithFileSystemPath(kCFAllocatorDefault, (CFStringRef)recordFile, kCFURLPOSIXPathStyle, false);
in its place.
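As a small extra safeguard (a sketch of my own, not part of the original SpeakHere sources), you could also check the result before it ever reaches CFRelease(), so a bad path fails with a readable message instead of EXC_BREAKPOINT:
CFURLRef url = CFURLCreateWithFileSystemPath(kCFAllocatorDefault,
                                             (CFStringRef)recordFile,
                                             kCFURLPOSIXPathStyle,
                                             false);
if (url == NULL) {
    fprintf(stderr, "Could not create a CFURL for the record file path\n");
    return;
}
// ... use url, then release it exactly once ...
CFRelease(url);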
url = CFURLCreateWithFileSystemPath(kCFAllocatorDefault, (CFStringRef)recordFile, kCFURLPOSIXPathStyle, false);
The code above made my Xcode 4.6 project compile and run in the Simulator, but it does not record my voice from my USB microphone. I tested my microphone in the GarageBand application and successfully recorded and played back my voice, yet here the dB meter does not move at all. Anyway, when I port it to a real device it works; it just can't record and play voice in my Simulator.