I'm building a mobile web app and trying to take advantage of the new WebRTC features that shipped with iOS 11, but I can't seem to access the microphone, at least from my Xcode iOS 11 simulator (iPhone X).
All requests to navigator.mediaDevices.getUserMedia({ audio: true }), and every other constraint combination I try, result in an OverconstrainedError with the message "Invalid constraint".
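For reference, here is a minimal reproduction of the call (the function name and logging are just illustrative):

// Minimal repro: request the microphone and log why it fails.
async function captureMic() {
  try {
    const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
    console.log('tracks:', stream.getAudioTracks().map(t => t.label));
  } catch (err) {
    // In the iOS 11 simulator this logs: OverconstrainedError Invalid constraint
    console.error(err.name, err.message);
  }
}
captureMic();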
I've also verified that the simulator has the correct input and output audio devices set, and that Safari has microphone and camera permissions in the iOS Settings.
To beat a dead horse, I have also enabled all of Safari's experimental features in the iOS Settings menu to make sure I hadn't missed anything.
Has anyone been able to get mobile Safari to collect microphone input in the simulator?
I am trying to enable "Designed for iPad" as a target for my working Flutter-based iOS app. My understanding is that the app runs the iPad version in a VM when the user is on an M1 (or other Apple Silicon) Mac.
The app uses just_audio, and all features work fine: I can load a file and play it (I am using a StreamAudioSource derived from an encrypted file downloaded to disk). But when I try to seek within the file, I get the error below from the iOS side, and the audio does not seek but continues playing from where it was (I see a buffering and then a ready state).
AQMEIO_HAL.cpp:736 kAudioDevicePropertyMute returned err 2003332927
There is very little information on this error apart from this fairly old Apple Developer Forums post:
https://developer.apple.com/forums/thread/672311
If I play a file as a stream (encrypted HLS), then seeking works exactly as expected.
Has anyone had any experience with just_audio targeted this way?
Thanks!
I'm using Xcode 10.1, and the checkbox for choosing whether to connect to my device through the network is missing from its device page (it was there in Xcode 9 and the Xcode 10 beta). Search "connect via network xcode" on Google Images if you don't know what I mean.
However, I must debug the offline flow of my app (which is written in React Native, by the way). Not just the no-internet-connection case, but turning off Wi-Fi and mobile data, which triggers a status change. Using the iPhone's developer settings I can make every network call fail (100% loss), but I cannot change the phone's internet status.
So I want the debugger to stay connected and either be able to:
debug the old-school way through the cable (if I turn off internet now, I get a red error screen and nothing is possible anymore), so I can disable Wi-Fi and mobile data,
or simulate that status change on the phone (observed roughly as in the sketch below).
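For context, this is roughly how the app observes that status change (a sketch assuming the @react-native-community/netinfo package, not our exact code):

import NetInfo from '@react-native-community/netinfo';

// Fires whenever connectivity changes, e.g. when Wi-Fi and
// mobile data are switched off on the phone.
const unsubscribe = NetInfo.addEventListener(state => {
  console.log('connected?', state.isConnected, 'type:', state.type);
});

// Later, when the listener is no longer needed:
// unsubscribe();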
Btw, I cannot use a simulator, since the app requires Bluetooth.
Thanks in advance!
Edit:
The checkbox is not there for older iPhone devices. With an iPhone 7s I do see the "Connect via Network" checkbox, but enabling or disabling it does not change the fact that the iPhone needs internet access to debug. If I disable internet on the phone, I get the following error:
OK, I've found the problem. We are testing on an iPhone 5, which is no longer officially supported by Apple, meaning Apple has decided to cut features for the iPhone 5 so you would buy a new one (wonderful strategy =/). Hence debugging without a cable is not supported on the iPhone 5.
We tested with an iPhone 6 from a colleague, and everything works fine.
Edit:
Although the checkbox is there and I can disable Wi-Fi for the connection, the moment I turn off internet on the phone the app crashes and says: "Runtime is not ready for debugging: make sure packager runtime is running"... so no solution yet...
Has anyone been able to connect wirelessly to QuickTime Player? I'm able to debug wirelessly using Xcode 9, but when I try to connect to QuickTime, the device doesn't appear in the list...
I've already gone to Devices and made sure "Connect via Network" is checked. This works fine if I use the Lightning cable.
This is what Apple says on the Xcode 9 page:
Cut the Cord
Choose any of your iOS or tvOS devices on the local network to install, run, and debug your apps – without a USB cord plugged into your Mac. Simply click the ‘Connect via Network’ checkbox the first time you use a new iOS device, and that device will be available over the network from that point forward. Wireless development also works in other apps, including Instruments, Accessibility Inspector, Quicktime Player, and Console.
Any ideas?
I'm using the Google Maps SDK for iOS to essentially highlight a width of coverage using GMSPolygon.
Running the iOS simulator, I get odd behavior. Using an iOS 9.3 build for all of the testing, the map properly highlights coverage, using the location simulated by the simulator, on the iPhone 5S, 6, and newer.
You can see this in this screen capture that I did: 5S Highlighting
But when using the iPhone 4S and 5, it does not draw the GMSPolygons! iPhone 5 Not Highlighting
A few things to note: I made sure that location permissions were authorized in all testing, and the same "City Run" simulation was used throughout. Again, the highlighting works fine on the 5S and newer devices running 9.3, but not on the 4S and 5 running 9.3. Is there a fundamental API change in CLLocationManager between the builds for these devices?
As discussed in the Simulator User Guide, there are some hardware and API differences in Simulator that may affect your app when testing.
In addition, please take note of these hardware features that are not simulated as of iOS 8.2:
Motion support (accelerometer and gyroscope)
Audio and video input (camera and microphone)
Proximity sensor
Barometer
Ambient light sensor
Aside from those, there are also API differences: Simulator APIs don't have all the features that are available on a device, such as:
Receiving and sending Apple push notifications
Privacy alerts for access to Photos, Contacts, Calendar, and Reminders
The UIBackgroundModes key
Handoff support
Please see the Simulator User Guide for more information.
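Since audio and video input aren't simulated, your web app can feature-detect an available microphone before calling getUserMedia. Here's a small sketch using the standard enumerateDevices API (whether the simulator reports any audioinput entries at all is an assumption worth verifying):

// Returns true only if an audio input device is reported.
async function hasMicrophone() {
  if (!navigator.mediaDevices || !navigator.mediaDevices.enumerateDevices) {
    return false; // the media-capture API isn't available at all
  }
  const devices = await navigator.mediaDevices.enumerateDevices();
  return devices.some(d => d.kind === 'audioinput');
}

hasMicrophone().then(ok => console.log('microphone available:', ok));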
I'm fiddling with an app, and I'm aware that there are developer-made apps that allow an iOS device to receive an audio stream from another iOS device or from iTunes. I'd like to implement this, ideally finding a method within Apple's guidelines that allows audio to be streamed. I've looked everywhere, but I can't find where to start. Any ideas, a place to start, or even a pointer in the right direction would be great.
Check out the AirSpeaker project on GitHub:
https://github.com/chenkaigithub/AirSpeaker
I was able to run it in the iOS 6.0 simulator and then use my iPhone to stream audio to it.
However, if I try to stream from iTunes 11 it does not work (iTunes lists the device in the AirPlay list but, on selection, fails with the error "airplay device is not compatible with this version of iTunes.").