I was unsure about this question's title phrasing, but I think it captures the problem as it is.
I've been trying to simply read CoreMotion data on the Watch, but as it turns out I can't get startDeviceMotionUpdatesToQueue to work: my handler is never called.
I tried running it on a custom background queue (NSOperationQueue()), still no luck.
I'm debugging on a real Apple Watch, not the simulator.
In my WKInterfaceController:
let manager = CMMotionManager()

override func awakeWithContext(context: AnyObject?) {
    super.awakeWithContext(context)
    let communicator = SessionDelegate()
    manager.deviceMotionUpdateInterval = 1 / 60
    manager.startDeviceMotionUpdatesToQueue(NSOperationQueue.mainQueue()) {
        (motionerOp: CMDeviceMotion?, errorOp: NSError?) -> Void in
        print("got into handler")
        guard let motion = motionerOp else {
            if let error = errorOp {
                print(error.localizedDescription)
            }
            assertionFailure()
            return
        }
        print("passed guard")
        let roll = motion.attitude.roll
        let pitch = motion.attitude.pitch
        let yaw = motion.attitude.yaw
        let attitudeToSend = ["roll": roll, "pitch": pitch, "yaw": yaw]
        communicator.send(attitudeToSend)
    }
    print("normal stack")
}
The output is:
normal stack
normal stack
(Yes, twice! I don't know why that happens either, but that is not the point; it must be something else I'm doing wrong.)
I'm posting this here because I have no clue where to look into this; it's driving me crazy.
Device Motion (startDeviceMotionUpdatesToQueue) is not available in watchOS 2 yet (deviceMotionAvailable returns false). The accelerometer can probably help you instead: use startAccelerometerUpdatesToQueue.
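For completeness, here is a minimal, untested sketch of that fallback in the same Swift 2 era style as the question, reusing the manager from awakeWithContext; only raw acceleration is available, not attitude (roll/pitch/yaw):

if manager.accelerometerAvailable {
    manager.accelerometerUpdateInterval = 1 / 60
    manager.startAccelerometerUpdatesToQueue(NSOperationQueue.mainQueue()) { data, error in
        guard let acceleration = data?.acceleration else {
            if let error = error {
                print(error.localizedDescription)
            }
            return
        }
        // Raw acceleration only; no attitude on watchOS 2.
        print("x: \(acceleration.x) y: \(acceleration.y) z: \(acceleration.z)")
    }
}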
I am able to detect when an external screen is connected, associate it with the appropriate windowScene, and add a view to it. Slightly hacky but approximately working (code for disconnection not included here), thanks to this SO question:
class ExternalViewController: UIViewController {

    // Assumed back-reference to the presenting ViewController (assigned below).
    weak var mainVC: UIViewController?

    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = .cyan
        print("external frame \(view.frame.width)x\(view.frame.height)")
    }
}

class ViewController: UIViewController {

    var additionalWindows: [UIWindow] = []

    override func viewDidLoad() {
        super.viewDidLoad()
        //nb, Apple documentation seems out of date.
        //https://stackoverflow.com/questions/61191134/setter-for-screen-was-deprecated-in-ios-13-0
        NotificationCenter.default.addObserver(forName: UIScreen.didConnectNotification, object: nil, queue: nil) { [weak self] notification in
            guard let self = self else { return }
            guard let newScreen = notification.object as? UIScreen else { return }

            // Give the system time to update the connected scenes
            DispatchQueue.main.asyncAfter(deadline: .now() + 0.05) {
                // Find matching UIWindowScene
                let matchingWindowScene = UIApplication.shared.connectedScenes.first {
                    guard let windowScene = $0 as? UIWindowScene else { return false }
                    return windowScene.screen == newScreen
                } as? UIWindowScene

                guard let connectedWindowScene = matchingWindowScene else {
                    NSLog("--- Connected scene was not found ---")
                    return
                    //fatalError("Connected scene was not found") // You might want to retry here after some time
                }

                let screenDimensions = newScreen.bounds
                let newWindow = UIWindow(frame: screenDimensions)
                NSLog("newWindow \(screenDimensions.width)x\(screenDimensions.height)")
                newWindow.windowScene = connectedWindowScene

                let vc = ExternalViewController()
                vc.mainVC = self
                newWindow.rootViewController = vc
                newWindow.isHidden = false
                self.additionalWindows.append(newWindow)
            }
        }
    }
}
When I do this in the iOS simulator, I see my graphics fill the screen as intended, but when running on my actual device, it appears with a substantial black border around all sides.
Note that this is not the usual border seen with the default display-mirroring behaviour: the 16:9 aspect ratio is preserved, and I do see different graphics as expected (a flat cyan color in my example code; normally I'm doing some Metal rendering that has slight anomalies that are out of scope here, although digging into them might turn up further clues).
The print messages report the expected 1920x1080 dimensions. I don't know UIKit very well and haven't been doing much active Apple development lately (I'm dusting off a couple of old side projects in the hope of using them to project visuals at a gig in the near future), so I don't know whether there is something to do with sizing constraints etc. that I might be missing; even so, it's hard to see why it would behave differently in the simulator.
Other apps installed from the App Store do show fullscreen graphics on the external display: Netflix shows fullscreen video as you would expect, and Concepts shows a different representation of the document than the one you see on the device.
So, in this instance the issue is to do with Overscan Compensation. Thanks to Jerrot on Discord for pointing me in the right direction.
In the context of my app, it is sufficient to add newScreen.overscanCompensation = .none in the connection notification delegate (actually, in the part that is delayed a few ms after that - it doesn't work if applied directly in the connection notification). In the question linked above, there is further discussion of other aspects that may be important in a different context.
This is my ViewController modified to achieve the desired result:
class ViewController: UIViewController {

    var additionalWindows: [UIWindow] = []

    override func viewDidLoad() {
        super.viewDidLoad()
        //nb, Apple documentation seems out of date.
        //https://stackoverflow.com/questions/61191134/setter-for-screen-was-deprecated-in-ios-13-0
        NotificationCenter.default.addObserver(forName: UIScreen.didConnectNotification, object: nil, queue: nil) { [weak self] notification in
            guard let self = self else { return }
            guard let newScreen = notification.object as? UIScreen else { return }

            // Give the system time to update the connected scenes
            DispatchQueue.main.asyncAfter(deadline: .now() + 0.05) {
                // Find matching UIWindowScene
                let matchingWindowScene = UIApplication.shared.connectedScenes.first {
                    guard let windowScene = $0 as? UIWindowScene else { return false }
                    return windowScene.screen == newScreen
                } as? UIWindowScene

                guard let connectedWindowScene = matchingWindowScene else {
                    NSLog("--- Connected scene was not found ---")
                    return
                    //fatalError("Connected scene was not found") // You might want to retry here after some time
                }

                let screenDimensions = newScreen.bounds

                ////// new code here --->
                newScreen.overscanCompensation = .none
                //////

                let newWindow = UIWindow(frame: screenDimensions)
                NSLog("newWindow \(screenDimensions.width)x\(screenDimensions.height)")
                newWindow.windowScene = connectedWindowScene

                let vc = ExternalViewController()
                vc.mainVC = self
                newWindow.rootViewController = vc
                newWindow.isHidden = false
                self.additionalWindows.append(newWindow)
            }
        }
    }
}
In this day and age, I find it pretty peculiar that overscan compensation is enabled by default.
I am trying to skip forward (move forward) by 10 seconds or skip backward (move backward) by 10 seconds in the Spotify player, but I am really confused about how to do it.
When I try to use this code, the playback position does not change:
// forward button action
@IBAction func moveFrdBtnAction(_ sender: Any) {
    SpotifyManager.shared.audioStreaming(SpotifyManager.shared.player, didSeekToPosition: TimeInterval(10))
}

// spotify delegate method seekToPosition
func audioStreaming(_ audioStreaming: SPTAudioStreamingController!, didSeekToPosition position: TimeInterval) {
    player?.seek(to: position, callback: { (error) in
        let songDuration = audioStreaming.metadata.currentTrack?.duration as Any as! Double
        self.delegate?.getSongTime(timeCount: Int(songDuration) + 1)
    })
}
We are making a music application using the same SDK on both platforms (Android & iOS). The seekToPosition method of the Spotify SDK works correctly in the Android version, but it is not working in the iOS one. The delegate method gets called, but the music stops.
Can you kindly let us know why this happens, and what we should do to make it work on iOS devices as well.
Can someone please explain how to solve this? I've tried, but with no results yet.
Any help would be greatly appreciated.
Thanks in advance.
I don't use this API, so my answer is based on your code and Spotify's reference documentation.
I think there are a few things wrong with your flow:
As Robert Dresler commented, you should (approximately) never call a delegate directly, a delegate calls you.
I'm pretty sure your action currently results in jumping to exactly 10 seconds, not by 10 seconds.
(As an aside, I'd suggest changing the name of your function moveFrdBtnAction to at least add more vowels)
Anyway, here's my best guess at what you want:
// forward button action
@IBAction func moveForwardButtonAction(_ sender: Any) {
    skipAudio(by: 10)
}

@IBAction func moveBackButtonAction(_ sender: Any) {
    skipAudio(by: -10)
}

func skipAudio(by interval: TimeInterval) {
    if let player = player {
        let position = player.playbackState.position // The documentation alludes to milliseconds but examples don't.
        player.seek(to: position + interval, callback: { (error) in
            // Handle the error (if any)
        })
    }
}

// spotify delegate method seekToPosition
func audioStreaming(_ audioStreaming: SPTAudioStreamingController!, didSeekToPosition position: TimeInterval) {
    // Update your UI
}
Note that I have not handled seeking before the start of the track, nor after the end which could happen with a simple position + interval. The API may handle this for you, or not.
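If you do want to guard against that, here is a hedged sketch of a clamped variant; it assumes metadata.currentTrack?.duration reports the track length in the same unit as playbackState.position (the question's code treats it as seconds):

func skipAudioClamped(by interval: TimeInterval) {
    guard let player = player else { return }
    // Assumption: duration and position are expressed in the same unit.
    let duration = player.metadata.currentTrack?.duration ?? TimeInterval.greatestFiniteMagnitude
    let target = min(max(player.playbackState.position + interval, 0), duration)
    player.seek(to: target, callback: { error in
        // Handle the error (if any)
    })
}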
You could take a look at the examples here: spotify/ios-sdk. In the NowPlayingView example they use seekForward15Seconds; maybe you could use that? If you still need 10 seconds, I have added a function below. The position is in milliseconds.
"position: The position to seek to in milliseconds"
docs
ViewController.swift
var appRemote: SPTAppRemote {
    get {
        return AppDelegate.sharedInstance.appRemote
    }
}

fileprivate func seekForward15Seconds() {
    appRemote.playerAPI?.seekForward15Seconds(defaultCallback)
}

fileprivate func seekBackward15Seconds() {
    appRemote.playerAPI?.seekBackward15Seconds(defaultCallback)
}

// TODO: Or you could try this function
func seekForward(seconds: Int) {
    appRemote.playerAPI?.getPlayerState({ (result, error) in
        guard let playerState = result as? SPTAppRemotePlayerState else { return }
        // playback position in milliseconds
        let currentPosition = playerState.playbackPosition
        let secondsInMilliseconds = seconds * 1000
        self.appRemote.playerAPI?.seek(toPosition: currentPosition + secondsInMilliseconds, callback: { (result, error) in
            guard error == nil else {
                print(error!)
                return
            }
        })
    })
}

var defaultCallback: SPTAppRemoteCallback {
    get {
        return { [weak self] _, error in
            if let error = error {
                self?.displayError(error as NSError)
            }
        }
    }
}
AppDelegate.swift
lazy var appRemote: SPTAppRemote = {
    let configuration = SPTConfiguration(clientID: self.clientIdentifier, redirectURL: self.redirectUri)
    let appRemote = SPTAppRemote(configuration: configuration, logLevel: .debug)
    appRemote.connectionParameters.accessToken = self.accessToken
    appRemote.delegate = self
    return appRemote
}()

class var sharedInstance: AppDelegate {
    get {
        return UIApplication.shared.delegate as! AppDelegate
    }
}
Edit 1:
For this to work you need to follow the Prepare Your Environment steps:
Add the SpotifyiOS.framework to your Xcode project
Hope it helps!
I am working on a Swift playground and I am trying to use this code to get the device motion.
@objc func update() {
    if let deviceMotion = motionManager.deviceMotion {
        print("Device Motion Yaw: \(deviceMotion.attitude.yaw)")
    }
}
However, it seems that device motion does not work in a Swift playground even though it works in iOS. How would I change a playground to support device motion? I am using an iPad running iOS 12 with the latest version of Swift Playgrounds, and a Mac for writing the code. I know the method gets called, and the code runs perfectly when I put it into an iOS app on both an iPad and an iPhone. How would I modify a playground to support this, since from my understanding it does not by default?
It is entirely possible; I've done it on several occasions. You'll need a CMMotionManager instance. There are many ways to do this, but I would recommend using a timer. Here is some example code, taken from Apple's developer documentation and modified to fit the question.
import Foundation
import CoreMotion

let motion = CMMotionManager()
var timer: Timer?   // assumed top-level property to keep the timer alive

func startDeviceMotion() {
    if motion.isDeviceMotionAvailable {
        // How often to push updates
        motion.deviceMotionUpdateInterval = 1.0 / 60.0
        motion.showsDeviceMovementDisplay = true
        motion.startDeviceMotionUpdates(using: .xMagneticNorthZVertical)

        // Configure a timer to fetch the motion data.
        timer = Timer(fire: Date(), interval: (1.0 / 60.0), repeats: true,
                      block: { (timer) in
            if let data = motion.deviceMotion {
                let x = data.attitude.pitch
                let y = data.attitude.roll
                let z = data.attitude.yaw
                // Use the data
            }
        })
        RunLoop.current.add(timer!, forMode: RunLoop.Mode.default)
    }
}

startDeviceMotion()
Either do that, or try something like this, also from the documentation:
// Assumed queue for motion updates; the original snippet refers to `self.queue`.
let queue = OperationQueue()

func startQueuedUpdates() {
    if motion.isDeviceMotionAvailable {
        motion.deviceMotionUpdateInterval = 1.0 / 60.0
        motion.showsDeviceMovementDisplay = true
        motion.startDeviceMotionUpdates(using: .xMagneticNorthZVertical,
                                        to: queue, withHandler: { (data, error) in
            // Make sure the data is valid before accessing it.
            if let validData = data {
                // Get the attitude relative to the magnetic north reference frame.
                let roll = validData.attitude.roll
                let pitch = validData.attitude.pitch
                let yaw = validData.attitude.yaw
                // Use the motion data in your app.
            }
        })
    }
}
I have created a subclass of SCNNode. It is made up of a few child nodes.
I have declared a method, soundCasual(), which adds an SCNAudioPlayer to the instance of this class. Everything works as expected and audio is played when this method is called. The method is called whenever the node is tapped (via a gesture recognizer).
Code:
class MyNode: SCNNode {

    let wrapperNode = SCNNode()
    let audioSource5 = SCNAudioSource(fileNamed: "audiofile.mp3")

    override init() {
        super.init()
        if let virtualScene = SCNScene(named: "MyNode.scn", inDirectory: "Assets.scnassets/Shapes/MyNode") {
            for child in virtualScene.rootNode.childNodes {
                wrapperNode.addChildNode(child)
            }
        }
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    func soundCasual() {
        DispatchQueue.global(qos: .userInteractive).async { [weak self] in
            if let audioSource = self?.audioSource5 {
                let audioPlayer = SCNAudioPlayer(source: audioSource)
                self?.wrapperNode.removeAllAudioPlayers()
                self?.wrapperNode.addAudioPlayer(audioPlayer)
            }
        }
    }
}
Issue within Instruments (Allocations)
When I profile my whole codebase (which includes several other things) in Instruments, I see that whenever I tap on that node, the allocation count of SCNAudioPlayer increases by one, and the increase is persistent. From the description of SCNAudioPlayer I assumed the player is removed after playback, in which case the increase should appear under transient allocations, but it does not work like that. That is why I tried removeAllAudioPlayers() before adding an SCNAudioPlayer to the node, as you can see in the code for soundCasual(), but the issue remains.
By the time this snapshot was taken, I had tapped on that node about 17 times, and Instruments shows 17 persistent allocations for SCNAudioPlayer.
Note: SCNAudioSource shows 10, as it should, since there are 10 audio sources I am using in the app.
And this is happening for all other SCNNodes in my application without fail.
Kindly help, as I am not able to understand what exactly I am missing.
EDIT
As recommended, I changed my init() to use SCNReferenceNode:
let path = Bundle.main.path(forResource: "Keemo", ofType: "scn", inDirectory: "Assets.scnassets/Shapes/Keemo")
if let path = path, let keemo = SCNReferenceNode(url: URL(fileURLWithPath: path)) {
    keemo.load()
}

func soundPlay() {
    DispatchQueue.global(qos: .userInteractive).async { [weak self] in
        if let audioSource = self?.audioSourcePlay {
            audioSource.volume = 0.1
            let audioPlayer = SCNAudioPlayer(source: audioSource)
            self?.removeAllAudioPlayers()
            self?.addAudioPlayer(audioPlayer)
        }
    }
}
Despite this, Instruments still shows the audio players as persistent allocations, even though checking node.audioPlayers confirms that at any point only one audio player is attached to the node.
EDIT
This issue appears even in the simple case where I use the boilerplate code of the default SceneKit app template in Xcode. Hence, I have reported it to Apple as a bug: https://bugreport.apple.com/web/?problemID=43482539
WORKAROUND
I am using AVAudioPlayer instead of SCNAudioPlayer. It is not exactly the same thing, but at least memory will not keep growing until it causes a crash.
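For reference, the workaround looks roughly like this; it is only a sketch, the file name and the tapPlayer property are illustrative, and you lose SceneKit's positional audio:

import AVFoundation
import SceneKit

class MyNode: SCNNode {
    // Keep a strong reference so the player isn't deallocated mid-playback.
    private var tapPlayer: AVAudioPlayer?

    func soundCasual() {
        guard let url = Bundle.main.url(forResource: "audiofile", withExtension: "mp3") else { return }
        tapPlayer = try? AVAudioPlayer(contentsOf: url)
        tapPlayer?.play()
    }
}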
I am not familiar with SceneKit, but from my experience with UIKit and SpriteKit I suspect that your use of wrapperNode and virtualScene is interfering with how the nodes get released (ARC).
I would try removing wrapperNode and adding everything to self (since self is an SCNNode).
Which node is actually used in your scene, self or wrapperNode? In your sample code wrapperNode is never added to self, so it may or may not actually be part of the scene.
Also, you should probably be using SCNReferenceNode instead of the virtual scene thing you're using.
!!! this code has not been tested !!!
class MyNode: SCNReferenceNode {

    let audioSource5 = SCNAudioSource(fileNamed: "audiofile.mp3")

    func soundCasual() {
        DispatchQueue.global(qos: .userInteractive).async { [weak self] in
            if let audioSource = self?.audioSource5 {
                let audioPlayer = SCNAudioPlayer(source: audioSource)
                self?.removeAllAudioPlayers()
                self?.addAudioPlayer(audioPlayer)
            }
        }
    }
}

// If you programmatically create this node, you'll have to call .load() on it
let referenceNode = SCNReferenceNode(url: referenceURL)
referenceNode?.load()
HtH!
If you haven't found the answer already, you need to remove the SCNAudioPlayer from the node once it has completed playing:
if let audioSource = self?.audioSourcePlay {
    audioSource.volume = 0.1
    let audioPlayer = SCNAudioPlayer(source: audioSource)
    self?.removeAllAudioPlayers()
    self?.addAudioPlayer(audioPlayer)
    audioPlayer.didFinishPlayback = {
        self?.removeAudioPlayer(audioPlayer)
    }
}
I have a client who wants to recognize when a user smacks their screen with their whole hand, like a high-five. I suspect that Apple won't approve this, but let's set that aside.
I thought of using a four-finger-tap recognizer, but that doesn't really cover it. The best approach would probably be to check whether the user is covering at least 70% of the screen with their hand, but I don't know how to do that.
Can someone help me out here?
You could use the accelerometer to detect the impact of a hand & examine the front camera feed to find a corresponding dark frame due to the hand covering the camera*
* N.B. a human hand might not be big enough to cover the front camera on an iPhone 6+
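A rough, untested sketch of the camera half of that idea in current Swift, where a very low average luma is treated as "lens covered"; the sampling stride and the threshold of 40 are arbitrary assumptions, and you still need camera permission plus the accelerometer check:

import AVFoundation

final class CoveredLensDetector: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    var onDarkFrame: (() -> Void)?

    func start() {
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        let output = AVCaptureVideoDataOutput()
        output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String:
                                kCVPixelFormatType_420YpCbCr8BiPlanarFullRange]
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
        guard session.canAddOutput(output) else { return }
        session.addOutput(output)
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let buffer = CMSampleBufferGetImageBuffer(sampleBuffer),
              kCVReturnSuccess == CVPixelBufferLockBaseAddress(buffer, .readOnly) else { return }
        defer { CVPixelBufferUnlockBaseAddress(buffer, .readOnly) }
        guard let base = CVPixelBufferGetBaseAddressOfPlane(buffer, 0) else { return }

        let width = CVPixelBufferGetWidthOfPlane(buffer, 0)
        let height = CVPixelBufferGetHeightOfPlane(buffer, 0)
        let bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(buffer, 0)
        let luma = base.assumingMemoryBound(to: UInt8.self)

        // Sparse average over the luma (Y) plane.
        var total = 0, count = 0
        for y in stride(from: 0, to: height, by: 8) {
            for x in stride(from: 0, to: width, by: 8) {
                total += Int(luma[y * bytesPerRow + x])
                count += 1
            }
        }
        if count > 0, total / count < 40 {
            DispatchQueue.main.async { self.onDarkFrame?() }
        }
    }
}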
Sort of solved it. Proximity + accelerometer works well enough. Multitouch doesn't work, as it ignores anything it doesn't consider a tap.
import UIKit
import CoreMotion
import AVFoundation

class ViewController: UIViewController {

    var lastHighAccelerationEvent: NSDate? {
        didSet {
            checkForHighFive()
        }
    }

    var lastProximityEvent: NSDate? {
        didSet {
            checkForHighFive()
        }
    }

    var lastHighFive: NSDate?
    var manager = CMMotionManager()

    override func viewDidLoad() {
        super.viewDidLoad()

        //Start disabling the screen
        UIDevice.currentDevice().proximityMonitoringEnabled = true
        NSNotificationCenter.defaultCenter().addObserver(self, selector: #selector(proximityChanged), name: UIDeviceProximityStateDidChangeNotification, object: nil)

        //Check for accelerometer
        manager.startAccelerometerUpdatesToQueue(NSOperationQueue.mainQueue()) { (data, error) in
            let sum = abs(data!.acceleration.y + data!.acceleration.z + data!.acceleration.x)
            if sum > 3 {
                self.lastHighAccelerationEvent = NSDate()
            }
        }

        //Enable multitouch
        self.view.multipleTouchEnabled = true
    }

    func checkForHighFive() {
        if let lastHighFive = lastHighFive where abs(lastHighFive.timeIntervalSinceDate(NSDate())) < 1 {
            print("Time filter")
            return
        }
        guard let lastProximityEvent = lastProximityEvent else { return }
        guard let lastHighAccelerationEvent = lastHighAccelerationEvent else { return }
        if abs(lastProximityEvent.timeIntervalSinceDate(lastHighAccelerationEvent)) < 0.1 {
            lastHighFive = NSDate()
            playBoratHighFive()
        }
    }

    func playBoratHighFive() {
        print("High Five")
        let player = try! AudioPlayer(fileName: "borat.mp3")
        player.play()
    }

    func proximityChanged() {
        if UIDevice.currentDevice().proximityState {
            self.lastProximityEvent = NSDate()
        }
    }
}
You can detect the finger count with multi-touch event handling; check this answer.
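For reference, a minimal sketch of that approach (the threshold of four touches is an assumption): with isMultipleTouchEnabled set on the view, override touchesBegan and inspect event.allTouches:

override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    super.touchesBegan(touches, with: event)
    // event.allTouches contains every finger currently on the screen,
    // not just the touches that began in this event.
    let fingerCount = event?.allTouches?.count ?? 0
    if fingerCount >= 4 {
        print("Possible flat-hand contact: \(fingerCount) touches")
    }
}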