Cannot change duration of animated dynamic images in watchOS 2

UPDATE: I made an Xcode 7.3 sample project demonstrating the problem.
Question:
I am transferring image frames as NSData from the iPhone (Obj-C) to the watch (Swift) via session:didReceiveMessage:replyHandler:. The frames are originally requested by the watch via session.sendMessage(myMessage, replyHandler:). I am converting that data back into UIImage instances with UIImage(data: frame) and appending them to an array named images. I have a WKInterfaceImage named animationImage where I am able to load the frames and display them like so:
let frames = UIImage.animatedImageWithImages(images, duration: myDuration)
animationImage.setImage(frames)
animationImage.startAnimating()
The problem is that, no matter what value I pass for myDuration, I always get the same speed (i.e. super fast).
These same animations display properly on the phone.
What am I doing wrong?
Xcode Version 7.3 (7D175) with iOS 9.0 (deployment target)
EDIT:
This is what the docs say with respect to animating a watchOS WKInterfaceImage:
For animations you generate dynamically, use the animatedImageWithImages:duration: method of UIImage to assemble your animation in your WatchKit extension, and then set that animation using the setImage: method.

I can't explain why startAnimating() of WKInterfaceImage doesn't utilize the duration of the animated image, but it does seem to animate appropriately when using this:
animationImage.startAnimatingWithImagesInRange(NSMakeRange(0, images.count), duration: myDuration, repeatCount: 0)
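For context, a minimal sketch of the whole watch-side flow with that workaround might look like this (Swift 2-era syntax; session, myMessage, animationImage, and myDuration come from the question, while the "frames" reply key and the reply format are assumptions for illustration):

var images = [UIImage]()

session.sendMessage(myMessage, replyHandler: { reply in
    // Assume the phone replies with an array of PNG frames as NSData.
    if let frameData = reply["frames"] as? [NSData] {
        images = frameData.flatMap { UIImage(data: $0) }

        // Setting the animated image alone is not enough: startAnimating()
        // appears to ignore the embedded duration, so drive the animation
        // with the range-based API and pass the duration explicitly.
        self.animationImage.setImage(UIImage.animatedImageWithImages(images, duration: myDuration))
        self.animationImage.startAnimatingWithImagesInRange(
            NSMakeRange(0, images.count),
            duration: myDuration,
            repeatCount: 0) // 0 repeats indefinitely
    }
}, errorHandler: nil)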

Alternatively, set the animation duration on the image view, like so:
animationImage.animationDuration = myDuration
Hope this will help :)

Related

How do you animate a lock screen widget on iOS 16?

In iOS 16, Lock screen widgets allow you to add an Image to the WidgetView, like in accessoryCircular.
How do you animate this Image? Loading the Image from an animated UIImage with frames just displays the first frame.
Am I missing a startAnimating call, like you have to do with WKInterfaceObjects? Is there some additional step or value to set to get an Image inside a SwiftUI WidgetView to begin animating, or cycling through frames?
Examples of apps that have achieved this are https://www.livelywidget.com or Pixel Pal, so I know it's possible, but when I search for APIs to accomplish this, all that comes up are apps.

UIImageView: How to show scaled image?

I'm new to iOS app development and I began learning from this great tutorial:
Start Developing iOS Apps (Swift)
https://developer.apple.com/library/content/referencelibrary/GettingStarted/DevelopiOSAppsSwift/WorkWithViewControllers.html#//apple_ref/doc/uid/TP40015214-CH6-SW1
Everything looks great, but when I try to load an image from the "Camera roll" in the simulator, the image is shown oversized and fills the entire iPhone simulator screen instead of fitting into the UIImageView box as shown in the tutorial images.
The funny thing is that even when I download the completed project provided at the end of the lesson (see the bottom of the page), I get the same issue.
Googling around gave me the idea to insert the following:
// The info dictionary may contain multiple representations of the image. You want to use the original.
guard let selectedImage = info[UIImagePickerControllerOriginalImage] as? UIImage else {
    fatalError("Expected a dictionary containing an image, but was provided the following: \(info)")
}

// Set photoImageView to display the selected image.
photoImageView.image = selectedImage
photoImageView.contentMode = UIViewContentMode.scaleAspectFit

// Dismiss the picker.
dismiss(animated: true, completion: nil)
after loading the image from the camera roll, but it led to no results.
I'm using Xcode 9.2 with deployment target 11.2 and an iPhone 8 Plus as the simulator target.
Any help is appreciated.
It happens because the sample code only restricts the proportions of the UIImageView (the 1:1 constraint you added in the storyboard). That keeps your image view square, but it does not limit its size.
As the tutorial explains, you entered a placeholder size that is only kept until you set an image on the view. When you assign a new image to the UIImageView, it resizes to the size of that image (in your case, probably a much larger photo from the camera roll). That's probably why it's getting so large.
I think the easiest way to fix this is to add further constraints in the storyboard to limit the size (such as explicit width/height or edge constraints).
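If you'd rather do it in code than in the storyboard, a minimal sketch of the same idea (assuming the photoImageView outlet from the tutorial and an arbitrary 320-point size) would be:

// Pin the image view to an explicit size so it can no longer grow to match
// the intrinsic size of whatever photo is assigned to it.
NSLayoutConstraint.activate([
    photoImageView.widthAnchor.constraint(equalToConstant: 320),
    photoImageView.heightAnchor.constraint(equalToConstant: 320)
])
// With the size fixed, scaleAspectFit scales the photo down inside the box.
photoImageView.contentMode = .scaleAspectFit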

How to run an animated GIF directly by assigning it to WKInterfaceImage using WatchKit in Xcode?

Here is my doubt:
How do I run an animated GIF image directly by fetching it dynamically from a URL / NSData and assigning it to WKInterfaceImage?
I have been working on Apple Watch app development for the past few days and it's great. Currently I'm working on assigning GIF images to the image view. I successfully did this by adding the series of images statically in Xcode and running the animation by assigning it to a WKInterfaceImage.
Right now, WatchKit does not support playing GIF images directly. So although you can download the GIF image from a URL, you can't show it on the WatchKit interface. To show the GIF, you have to split the GIF file into several images and then animate those frames instead of showing the GIF file itself, as sketched below.
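A rough sketch of that frame-splitting approach, using ImageIO to decode the downloaded GIF data (Swift 2-era syntax; gifData, frameDuration, and interfaceImage are assumed names, and per-frame GIF delays are ignored in favor of one overall duration):

import ImageIO
import UIKit
import WatchKit

// Decode every frame of the GIF into a UIImage.
func framesFromGIF(gifData: NSData) -> [UIImage] {
    guard let source = CGImageSourceCreateWithData(gifData, nil) else { return [] }
    var frames = [UIImage]()
    for index in 0..<CGImageSourceGetCount(source) {
        if let cgImage = CGImageSourceCreateImageAtIndex(source, index, nil) {
            frames.append(UIImage(CGImage: cgImage))
        }
    }
    return frames
}

// Then, somewhere in the interface controller:
let frames = framesFromGIF(gifData)
let totalDuration = frameDuration * Double(frames.count)
interfaceImage.setImage(UIImage.animatedImageWithImages(frames, duration: totalDuration))
interfaceImage.startAnimatingWithImagesInRange(
    NSMakeRange(0, frames.count),
    duration: totalDuration,
    repeatCount: 0) // 0 repeats indefinitely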
Download the GIF image from the URL as NSData, then pass that NSData object to WKInterfaceImage's setImageData: method.

Programmatically Set UIImage Animation with WatchKit

I can't seem to figure out how to programmatically set a new image, via the outlet, and make it start animating.
Sequence
zeroEntering0.png
zeroEntering1.png
zeroEntering2.png
zeroEntering3.png
zeroEntering4.png
I imported the sequence of images into the Image.xcassets inside the WatchKit App
I can set the image in the interface builder to "zeroEntering" and set animating to "Yes" and it works correctly.
However, I want something more dynamic, I need a button press to choose a new animation sequence and start it off. If I try and set the image programmatically using the same name from the interface builder, the UIImage is nil.
What naming convention should I use when programmatically setting the UIImage: "zeroEntering", "zeroEntering0", "zeroEntering.png", or "zeroEntering0.png"?
I tried using the two non-nil options and the image did not animate and went black.
The answer is subtle and definitely got my wheels spinning for too long.
According to this beautiful article,
You should use setImageNamed(:) when the image you want to display is either cached on the watch or is in an asset catalog in the watch app’s bundle, and use setImage(:) when the image isn’t cached — this will transfer the image data to the Apple Watch over the air!
So, I kept my images in the asset catalog of the watch app, and switched to using:
[self.testImage setImageNamed:@"zeroEntering"];
[self.testImage startAnimatingWithImagesInRange:NSMakeRange(0, 4) duration:0.2 repeatCount:100];
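For reference, a Swift sketch of the same approach (assuming the testImage outlet and a range covering all five zeroEntering frames listed in the question) might be:

// The "zeroEntering" frames live in the watch app's asset catalog, so
// setImageNamed(_:) can resolve them by name without transferring any data.
testImage.setImageNamed("zeroEntering")
testImage.startAnimatingWithImagesInRange(
    NSMakeRange(0, 5), // zeroEntering0 through zeroEntering4
    duration: 0.2,
    repeatCount: 100)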
Set the image as [UIImage imageNamed:@"entering"], then call startAnimatingWithImagesInRange:duration:repeatCount:.
Check it out here: https://developer.apple.com/library/prerelease/ios/documentation/WatchKit/Reference/WKInterfaceImage_class/#//apple_ref/occ/instm/WKInterfaceImage/startAnimatingWithImagesInRange:duration:repeatCount:
Make sure to follow the tips here: https://developer.apple.com/watchkit/tips/

Screenshotting on iPhone in Swift only has a white background

Some background: I am just trying to write a simple program using Xcode 6 beta 7 in Swift to screenshot the iPhone after I press a button. It is done in SpriteKit, in the game scene. The background is a random PNG image plus the default "hello world" sample text. I programmatically put a pressable button (the default spaceship image is the button) in GameScene's didMoveToView function using the following code:
button.setScale(0.2)
screen.frame = CGRect(origin: CGPointMake(self.size.width/4, self.size.height/1.5), size: button.size)
screen.setImage(playAgainButton, forState: UIControlState.Normal)
screen.addTarget(self, action: "action:", forControlEvents: UIControlEvents.TouchUpInside)
self.view!.addSubview(screen)
This puts my pressable button on the screen, where it is linked to a function that takes a screenshot using this next code:
func action(sender: UIButton!) {
    UIGraphicsBeginImageContext(self.view!.bounds.size)
    self.view!.layer.renderInContext(UIGraphicsGetCurrentContext())
    let screenshot = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    UIImageWriteToSavedPhotosAlbum(screenshot, nil, nil, nil)
}
So this code does take a screenshot, but when I look in Photos, only the pressable button is shown and the rest of the image is white.
I think the screenshot should instead look like what the screen shows in the simulator (I just used some random images/text as the background).
Can someone explain to me why the screenshot program is not also taking a picture of the background and how to fix it?
I've looked online and have not found any question whose answer solves my problem. I saw some similar problems and tried out their fixes: I imported QuartzCore, CoreGraphics, and CoreImage, but with no luck. I also tried using the view controller and placing a UIButton there with an IBAction to take the screenshot, but I still get the same white-background image. I've tried different background images with the same result.
I am fairly new to programming so, any help would be appreciated! Thank you in advance!
Have you set the background as a pattern background color? Maybe this doesn't work as expected with renderInContext.
Is the background image rendered in self.view, either as a subview or in its drawRect: method? If not, of course it will not be rendered.
The easiest way to get a screenshot is with the function
UIImage *_UICreateScreenUIImage();
You have to declare it first, because it's an undocumented function. I'm not sure if Apple will accept an app that contains a call to this function, but I think it will be okay (I know the review guidelines say you must not use undocumented functions, but I think they will not care in this case). It's available and working in iOS 5 through iOS 8 (I have tested it; I don't care about iOS 4).
Another thing that should also work is to render the whole screen instead of just one particular view:
UIGraphicsBeginImageContext(self.view!.window!.bounds.size)
self.view!.window!.layer.renderInContext(UIGraphicsGetCurrentContext())
let screenshot = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
UIImageWriteToSavedPhotosAlbum(screenshot, nil, nil, nil)
(note that I replaced view! with view!.window! to get the one and only window instance.)
