I am using the following code to add the now playing artwork as a subview in my application.
override func viewDidLoad() {
    super.viewDidLoad()
    let artWork = musicPlayer.nowPlayingItem?.valueForProperty(MPMediaItemPropertyArtwork)
    let image = artWork?.imageWithSize(CGSizeMake(300, 300))
    let imageView = UIImageView(image: image)
    imageView.frame = CGRectMake(1, 1, 300, 300)
    self.view.addSubview(imageView)
}
Not only does the image not appear, but I also get this warning in the console:
moveCircleAround[2385:753430] BSXPCMessage received error for message: Connection interrupted
Can someone with knowledge of Swift please help me out with this one?
You won't be able to proceed in Xcode 7 beta. All interaction with MPMusicPlayerController is currently broken in iOS 9 beta. Use Xcode 6.4 and iOS 8.4 instead, until this is fixed.
Edit: Fixed in beta 5, so it's safe to return to Xcode 7 now.
There are a few cases where this code fails to get the artwork image:
musicPlayer doesn't yet have a nowPlayingItem.
The nowPlayingItem doesn't have an artwork or it's not downloaded yet.
The artwork doesn't contain an image at the size you specified.
Try putting a breakpoint at the top of the function, stepping through it, and checking which variable ends up nil.
Also, note that the BSXPCMessage error could be a hint that the second issue above is the cause. Maybe try a different song, and first check that it has artwork in your Music app.
(Note that you can simplify the code by using nowPlayingItem?.artwork instead of nowPlayingItem?.valueForProperty(MPMediaItemPropertyArtwork).)
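To see which of those cases you are hitting, here is a minimal sketch of a checked version (Swift 2 syntax, assuming the same musicPlayer property as in your code):
if let artwork = musicPlayer.nowPlayingItem?.artwork,
   image = artwork.imageWithSize(CGSizeMake(300, 300)) {
    let imageView = UIImageView(image: image)
    imageView.frame = CGRectMake(1, 1, 300, 300)
    self.view.addSubview(imageView)
} else {
    // Either there is no now-playing item, the item has no artwork,
    // or the artwork has no image at the requested size.
    print("Could not load artwork image")
}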
Related
I am using a PDFView to display images in my app (built using SwiftUI), simply for quick and easy pinch-to-zoom functionality. This worked perfectly in iOS 15, but since updating to iOS 16, the app freezes when attempting to load the image viewer (PhotoDetailView below). The issue persists across both the simulator and a physical device.
Here is the code I'm using:
import SwiftUI
import PDFKit
struct PhotoDetailView: UIViewRepresentable {
    let image: UIImage

    func makeUIView(context: Context) -> PDFView {
        let view = PDFView()
        view.document = PDFDocument()
        guard let page = PDFPage(image: image) else { return view }
        view.document?.insert(page, at: 0)
        view.autoScales = true
        return view
    }

    func updateUIView(_ uiView: PDFView, context: Context) {
        // empty
    }
}
When I run the code on iOS 16.0 in the simulator, I get 2 errors in the console:
[Assert] -[UIScrollView _clampedZoomScale:allowRubberbanding:]: Must be called with non-zero scale
[Unknown process name] CGAffineTransformInvert: singular matrix.
I have been able to isolate the issue to view.autoScales = true. If I print view.scaleFactor, I can see that it is 1.0 before the autoscale and 0.0 afterward (which is what appears to be prompting the errors). These errors also show up in the console when using iOS 15 in the simulator, but there the images load as expected.
When view.autoScales = true is commented out, the image loads, albeit at a size that is much larger than the device screen.
Does anyone have any idea what may be causing this? I'd really like to avoid having to build a custom viewer, since I'm just trying to let users quickly pinch to zoom on images.
I managed to resolve this issue. I'm posting my solution here, in case anyone else runs into the same type of problem.
The issue only occurred when using a NavigationLink to navigate to PhotoDetailView. This led me to look more closely at my navigation stack.
After some digging, I found that the problem was related to my use of .navigationViewStyle(.stack) on the NavigationView. I needed this modifier to get things to display correctly on iOS 15, but the same modifier was breaking my image viewer with iOS 16.
My solution was to create a custom container to replace NavigationView, which conditionally uses NavigationStack for iOS 16, or NavigationView + .navigationViewStyle(.stack) for iOS 15. That worked like a charm and my image viewer is back in action.
I found inspiration for my custom container here: https://developer.apple.com/forums/thread/710377
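For reference, here is a minimal sketch of what such a container might look like (the type name and exact layout are my own, not taken from the linked thread):
import SwiftUI

struct AdaptiveNavigation<Content: View>: View {
    private let content: Content

    init(@ViewBuilder content: () -> Content) {
        self.content = content()
    }

    var body: some View {
        if #available(iOS 16, *) {
            // iOS 16+: use the new NavigationStack
            NavigationStack { content }
        } else {
            // iOS 15: keep NavigationView with the stack style
            NavigationView { content }
                .navigationViewStyle(.stack)
        }
    }
}
You then wrap your content in AdaptiveNavigation { ... } instead of NavigationView { ... } and navigate to PhotoDetailView as before.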
I am using version 3.1.0 of the Google Maps SDK for iOS. When assigning the view to a GMSMapView in the loadView function, the app crashes with the following error: 'NSInvalidArgumentException', reason: 'GMSx_GMMClientPropertiesRequestProto.screenPixelDensity: Attempt to set an unknown enum value (0)'. It is really strange that this happens. I tried downgrading the Google Maps version, but it didn't help. In AppDelegate, I provide the API_KEY. Here is how I use Google Maps in my code:
lazy var mapView: GMSMapView = GMSMapView(frame: .zero)

override func loadView() {
    super.loadView()
    view = mapView
}
What is the reason for this error, and how can it be solved?
UPDATE-1:
I did a little experiment and tested on several devices with the following simple code:
override func viewDidLoad() {
    super.viewDidLoad()
    let mapView = GMSMapView()
    let dummyView = UIView()
    dummyView.backgroundColor = .purple
    view = mapView
}
This code successfully shows a map on an iPhone 7, but fails with the same error on an iPhone SE. Both devices are running iOS 12.3.1. It fails when initialising mapView, not when assigning it to view. Even when I set dummyView as the view, it crashes, simply because mapView has been initialised.
UPDATE-2:
I tried to avoid initialising the mapView variable myself: I created a storyboard, added a view, set its class to GMSMapView, and connected it to my view controller, then removed all of the above code from the VC. It worked on the iPhone 7 but failed on the iPhone SE with the same error.
UPDATE-3:
I created a new project and used the same API_KEY to show maps. It worked on the iPhone SE! That means the problem is not in the device; it is in the project. I still don't know where exactly, but maybe in the Pods.
Finally, after all the updates above, it occurred to me that I should delete the app from my phone and run it again. Then it worked! I don't know why. Possibly this error only occurs in the debug version of the app.
I am using localization in my app, and whenever I changed languages I was also running the following code:
UserDefaults.standard.setValue(lang, forKey: "AppleLanguages")
UserDefaults.standard.synchronize()
So after commenting it out and reinstalling my app, it worked!
When creating a new Swift Playground / .playgroundbook intended to be used in the iPad app, I often received the error message:
"Problem running playground. There was a problem encountered while running this playground. Check your code for mistakes."
I was able to track this issue down to adding certain subviews to my live view. To be more precise, my goal is to split a UIImage into multiple parts and create new UIImageViews for them:
for x in 0..<parts {
    for y in 0..<parts {
        // Create UIImageView with cropped image
        let pieceView = UIImageView(frame: CGRect(x: CGFloat(x) * singleSize.width,
                                                  y: CGFloat(y) * singleSize.height,
                                                  width: singleSize.width,
                                                  height: singleSize.height))
        let imageRef = image.cgImage!.cropping(to: CGRect(x: 0, y: 0, width: 100, height: 100))
        pieceView.image = UIImage(cgImage: imageRef!)
        // Add them to an array
        self.viewArray.append(pieceView)
    }
}
And that's where things become very tricky for me: adding 7 of these UIImageViews works without a problem, but as soon as I want to add 8 or more of them, the playground stops working and gives the error message "Problem running playground..." (see above).
What I tested so far:
Adding UIImageViews with the same image does not cause this problem
Cropping the UIImage in a background thread and adding the view on the main thread does not help either
Creating the UIImageViews without adding them to the live-view does not cause any problems
The code works well when executed in a Mac playground, no matter how many views are added
I experienced this kind of iPad Swift Playground run-time error while adding multiple UI elements.
The problem is caused by the playground's "Enable Results" setting, which is ON by default. "Enable Results" previews every in-line result in the viewer, and this makes the Swift playground crash when you produce many UI elements.
Try disabling "Enable Results". It worked for me.
I saw a similar issue, and the problem was just a nil being used incorrectly.
In order to get a clear error message, create a playground with the same files and run it on the Mac: this way you get more detailed information about what is going on, which makes the issue easier to solve.
Let me know if you run into any difficulties :)
This is probably a late answer, but I was experiencing the same issue during the last week, and today I finally figured out how to solve it.
I was running a piece of code where I add a background view like this:
func createView() {
    // gray background
    let marco = CGRect(x: 0, y: 0, width: 603, height: 825)
    vista = UIView(frame: marco)
    vista.backgroundColor = UIColor.gray
    vista.isUserInteractionEnabled = true
    vista.tag = 5
    PlaygroundPage.current.liveView = vista
    PlaygroundPage.current.needsIndefiniteExecution = true
}
But I had placed this at the beginning of the code.
What I changed is:
PlaygroundPage.current.liveView = vista
PlaygroundPage.current.needsIndefiniteExecution = true
Placing these at the end of all the code, just before the program starts running, fixed it. If you need more detail, let me know and I can share more information.
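To make the ordering concrete, here is a minimal sketch of the same setup with the live-view assignment moved to the very end (assuming the same vista view and that the playground imports PlaygroundSupport):
import UIKit
import PlaygroundSupport

// Build the view hierarchy first
let marco = CGRect(x: 0, y: 0, width: 603, height: 825)
let vista = UIView(frame: marco)
vista.backgroundColor = UIColor.gray
vista.isUserInteractionEnabled = true
vista.tag = 5

// ... create and add all of your subviews to vista here ...

// Only hand the finished view to the playground at the very end
PlaygroundPage.current.liveView = vista
PlaygroundPage.current.needsIndefiniteExecution = true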
UPDATE: I made an Xcode 7.3 sample project displaying the problem
Question:
I am transferring image frames as NSData from the iPhone (Obj-C) to the watch (Swift) via session:didReceiveMessage:replyHandler:. The frames are originally requested by the watch via session.sendMessage(myMessage, replyHandler:). I am converting that data back into images with UIImage(data: frame) and appending them to an array named images. I have a WKInterfaceImage named animationImage where I am able to load the frames and display them like so:
let frames = UIImage.animatedImageWithImages(images, duration: myDuration)
animationImage.setImage(frames)
animationImage.startAnimating()
The problem is that, no matter what value I put in myDuration, I always get the same speed (i.e. super fast):
These animations display properly on the phone:
What am I doing wrong?
Xcode Version 7.3 (7D175) with iOS 9.0 (deployment target)
EDIT:
This is what the docs say with respect to animating a watchOS WKInterfaceImage:
For animations you generate dynamically, use the animatedImageWithImages:duration: method of UIImage to assemble your animation in your WatchKit extension, and then set that animation using the setImage: method.
I can't explain why startAnimating() of WKInterfaceImage doesn't utilize the duration of the animated image, but it does seem to animate appropriately when using this:
animationImage.startAnimatingWithImagesInRange(NSMakeRange(0, images.count), duration: myDuration, repeatCount: 0)
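Put together with the code from the question, the whole sequence looks roughly like this (a sketch assuming the same images array and myDuration value):
let frames = UIImage.animatedImageWithImages(images, duration: myDuration)
animationImage.setImage(frames)  // load the frames into the interface image
// Start the animation with an explicit duration; repeatCount 0 repeats indefinitely
animationImage.startAnimatingWithImagesInRange(NSMakeRange(0, images.count),
                                               duration: myDuration,
                                               repeatCount: 0)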
Set the animation duration on the image view like this:
animationImage.animationDuration = myDuration
Hope this helps :)
Some background: I am just trying to write a simple program using Xcode 6 beta 7 in Swift to screenshot the iPhone after I press a button. It is done in SpriteKit, in the game scene. The background is a random png image plus the default "hello world" sample text. I programmatically add a pressable button (the default spaceship image is the button) in GameScene's didMoveToView function using the following code:
button.setScale(0.2)
screen.frame = CGRect(origin: CGPointMake(self.size.width/4, self.size.height/1.5), size: button.size)
screen.setImage(playAgainButton, forState: UIControlState.Normal)
screen.addTarget(self, action: "action:", forControlEvents: UIControlEvents.TouchUpInside)
self.view!.addSubview(screen)
This puts my pressable button on the screen, where it is linked to a function that takes a screenshot using this next code:
func action(sender: UIButton!) {
    UIGraphicsBeginImageContext(self.view!.bounds.size)
    self.view!.layer.renderInContext(UIGraphicsGetCurrentContext())
    let screenshot = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    UIImageWriteToSavedPhotosAlbum(screenshot, nil, nil, nil)
}
So this code does take a screenshot, but when I look in Photos, only the pressable button is shown and the rest of the image is white. The image is shown below:
Below is what I think the screenshot should look like, since this is what the screen looks like in the simulator (I just used some random images/text as the background):
Can someone explain to me why the screenshot code is not also capturing the background, and how to fix it?
I've looked online and have not seen any question whose solution fixes my problem. I saw some similar problems online and tried their fixes: I imported QuartzCore, CoreGraphics, and CoreImage, but with no luck. I also tried using the view controller and putting a UIButton on there with an IBAction to take the screenshot, but I still get the same white-background image. I've tried different background images with the same result.
I am fairly new to programming so, any help would be appreciated! Thank you in advance!
Have you set the background as a pattern background color? => Maybe this doesn't work as expected with renderInContext.
Is the background image rendered in self.view, either as a subview or in its drawRect: method? => If not, of course it will not be rendered.
The easiest way to get a screenshot is with the function
UIImage *_UICreateScreenUIImage();
You have to declare it first, because it's an undocumented function. I'm not sure if Apple will accept an app that contains a call to this function, but I think it will be okay (I know the review guidelines say you must not use undocumented functions, but I think they will not care in this case). It's available and working in iOS 5 - iOS 8 (I have tested it; I don't care about iOS 4).
Another thing that should also work is to render the whole screen instead of just one particular view:
UIGraphicsBeginImageContext(self.view!.window!.bounds.size)
self.view!.window!.layer.renderInContext(UIGraphicsGetCurrentContext())
let screenshot = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
UIImageWriteToSavedPhotosAlbum(screenshot, nil, nil, nil)
(Note that I replaced view! with view!.window! to get the one and only window instance.)