I am a beginner programmer creating a game with iOS SpriteKit. I have a simple animated GIF (30 frames) saved as a .gif file. Is there a simple way (a few lines of code, similar to adding a regular .png through UIImage) to display this GIF in my game? I have done some research on displaying an animated GIF in Xcode, and most approaches involve importing extensive classes, most of which I don't think I need (I barely know enough to sift through them).
The way I think of it, GIFs are just like an animated sprite. So what I would do is load the GIF's frames as textures into an SKSpriteNode in a for loop and then tell it to repeat forever with an SKAction.
To be honest, I'm fairly new to this as well; I'm just trying to give my best answer. This is written in Swift, but I don't think it'll be too hard to translate to Objective-C.
var gifTextures: [SKTexture] = []
for i in 1...30 {
    gifTextures.append(SKTexture(imageNamed: "gif\(i)"))
}
gifNode.run(SKAction.repeatForever(SKAction.animate(with: gifTextures, timePerFrame: 0.125)))
Michael Choi's answer will get you half way there. The rest is getting the individual frames out of the gif file. Here's how I do it (in Swift):
func load(imagePath: String) -> ([SKTexture], TimeInterval?) {
    guard let imageSource = CGImageSourceCreateWithURL(URL(fileURLWithPath: imagePath) as CFURL, nil) else {
        return ([], nil)
    }
    let count = CGImageSourceGetCount(imageSource)
    var images: [CGImage] = []
    for i in 0..<count {
        guard let img = CGImageSourceCreateImageAtIndex(imageSource, i, nil) else { continue }
        images.append(img)
    }
    let frameTime = count > 1 ? imageSource.delayFor(imageAt: 0) : nil
    return (images.map { SKTexture(cgImage: $0) }, frameTime)
}
extension CGImageSource { // this was originally from another SO post for which I've lost the link. Apologies.
    func delayFor(imageAt index: Int) -> TimeInterval {
        var delay = 0.1
        // Get dictionaries
        let cfProperties = CGImageSourceCopyPropertiesAtIndex(self, index, nil)
        let gifPropertiesPointer = UnsafeMutablePointer<UnsafeRawPointer?>.allocate(capacity: 1)
        defer { gifPropertiesPointer.deallocate() }
        if CFDictionaryGetValueIfPresent(cfProperties, Unmanaged.passUnretained(kCGImagePropertyGIFDictionary).toOpaque(), gifPropertiesPointer) == false {
            return delay
        }
        let gifProperties: CFDictionary = unsafeBitCast(gifPropertiesPointer.pointee, to: CFDictionary.self)
        // Get delay time
        var delayObject: AnyObject = unsafeBitCast(
            CFDictionaryGetValue(gifProperties,
                Unmanaged.passUnretained(kCGImagePropertyGIFUnclampedDelayTime).toOpaque()),
            to: AnyObject.self)
        if delayObject.doubleValue == 0 {
            delayObject = unsafeBitCast(CFDictionaryGetValue(gifProperties,
                Unmanaged.passUnretained(kCGImagePropertyGIFDelayTime).toOpaque()), to: AnyObject.self)
        }
        delay = delayObject as? TimeInterval ?? 0.1
        if delay < 0.1 {
            delay = 0.1 // Make sure they're not too fast
        }
        return delay
    }
}
Note that I assume that each frame of the gif is the same length, which is not always the case.
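If the frames do vary, one option is to read every frame's delay via delayFor(imageAt:) above and then build an SKAction.sequence of setTexture/wait pairs instead of animate(with:timePerFrame:). A minimal sketch of just the clamping rule that delayFor(imageAt:) applies (pure Swift, no ImageIO; the sample delays are hypothetical):

```swift
import Foundation

// Clamp per-frame GIF delays the way delayFor(imageAt:) does:
// anything under 0.1 s is raised to 0.1 s so playback isn't too fast.
func normalizedDelays(_ delays: [TimeInterval]) -> [TimeInterval] {
    return delays.map { max($0, 0.1) }
}

let delays = normalizedDelays([0.02, 0.125, 0.2])
// delays == [0.1, 0.125, 0.2]
```

Each clamped delay would then become the duration of one wait action in the sequence.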
You could also pretty easily construct an SKTextureAtlas with these images.
I'm trying to align the screenshots emitted by RPScreenRecorder's startCapture method to logs saved elsewhere in my code.
I was hoping that I could just match CMSampleBuffer's presentationTimeStamp to the timestamp reported by CMClockGetHostTimeClock(), but that doesn't seem to be true.
I've created a small sample project to demonstrate my problem (available on Github), but here's the relevant code:
To show the current time, I'm updating a label with the current value of CMClockGetTime(CMClockGetHostTimeClock()) when CADisplayLink fires:
override func viewDidLoad() {
    super.viewDidLoad()
    // ...
    displayLink = CADisplayLink(target: self, selector: #selector(displayLinkDidFire))
    displayLink?.add(to: .main, forMode: .common)
}

@objc
private func displayLinkDidFire(_ displayLink: CADisplayLink) {
    timestampLabel.text = String(format: "%.3f", CMClockGetTime(CMClockGetHostTimeClock()).seconds)
}
And here is where I'm saving RPScreenRecorder's buffers to disk.
Each filename is the buffer's presentationTimeStamp in seconds, truncated to milliseconds:
RPScreenRecorder.shared().startCapture(handler: { buffer, bufferType, error in
    switch bufferType {
    case .video:
        guard let imageBuffer = buffer.imageBuffer else {
            return
        }
        CVPixelBufferLockBaseAddress(imageBuffer, .readOnly) // Do I need this?
        autoreleasepool {
            let ciImage = CIImage(cvImageBuffer: imageBuffer)
            let uiImage = UIImage(ciImage: ciImage)
            let data = uiImage.jpegData(compressionQuality: 0.5)
            let filename = String(format: "%.3f", buffer.presentationTimeStamp.seconds)
            let url = Self.screenshotDirectoryURL.appendingPathComponent(filename)
            FileManager.default.createFile(atPath: url.path, contents: data)
        }
        CVPixelBufferUnlockBaseAddress(imageBuffer, .readOnly)
    default:
        break
    }
})
The result is a collection of screenshots like this:
I'd expect each screenshot's filename to match the timestamp visible in the screenshot, or at least be off by some consistent duration. Instead, I'm seeing variable differences which seem to get worse over time. More confusing, I also sometimes get duplicates of the same screenshot. For example, here are the times from a recent recording:
| Visible in the screenshot | The screenshot's filename | Diff   |
| ------------------------- | ------------------------- | ------ |
| 360665.775                | 360665.076                | 0.699  |
| 360665.891                | 360665.092                | 0.799  |
| 360665.975                | 360665.108                | 0.867  |
| 360666.058                | 360665.125                | 0.933  |
| 360666.158                | 360665.142                | 1.016  |
| 360665.175                | 360665.175                | 0.000  |
| 360666.325                | 360665.192                | 1.133  |
| 360665.175                | 360665.208                | -0.033 |
| ...                       | ...                       | ...    |
The results are wild enough that I think I must be doing something exceptionally stupid, but I'm not sure what it is. Any ideas/recommendations? Or, ideas for how to better accomplish my goal?
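For reference, the Diff column above is plain subtraction of the filename time from the visible time, formatted to milliseconds the same way as the label and the filename:

```swift
import Foundation

// Diff = time visible in the screenshot - time in the screenshot's filename,
// rounded to milliseconds like both displays in the sample project.
let visible = 360665.775
let filename = 360665.076
let diff = String(format: "%.3f", visible - filename)
// diff == "0.699"
```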
I have an AVAudioPlayerNode looping a segment of a song:
audioPlayer.scheduleBuffer(segment, at: nil, options:.loops)
I want to get the current position in the song while it's playing. Usually this is done by calculating currentFrame / audioSampleRate, where
var currentFrame: AVAudioFramePosition {
    guard let lastRenderTime = audioPlayer.lastRenderTime,
          let playerTime = audioPlayer.playerTime(forNodeTime: lastRenderTime) else {
        return 0
    }
    return playerTime.sampleTime
}
However, when the loop ends and restarts, currentFrame does not reset; it keeps increasing, which makes currentFrame / audioSampleRate incorrect as the current position.
So what is the correct way to calculate the current position?
Good old modulo will do the job:
public var currentTime: TimeInterval {
    guard let nodeTime = player.lastRenderTime,
          let playerTime = player.playerTime(forNodeTime: nodeTime) else {
        return 0
    }
    let time = (Double(playerTime.sampleTime) / playerTime.sampleRate)
        .truncatingRemainder(dividingBy: Double(file.length) / Double(playerTime.sampleRate))
    return time
}
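The modulo arithmetic is easy to check in isolation. A sketch with plain numbers (the 44.1 kHz rate and the frame counts are hypothetical):

```swift
// Map an ever-growing sample time back into one loop of the segment.
func loopedPosition(sampleTime: Double, sampleRate: Double, fileLength: Double) -> Double {
    let elapsed = sampleTime / sampleRate       // seconds since playback started
    let loopDuration = fileLength / sampleRate  // seconds per loop
    return elapsed.truncatingRemainder(dividingBy: loopDuration)
}

// 100,000 frames into a 44,100-frame loop at 44.1 kHz: two full loops
// have passed, so the position is about 0.268 s into the third pass.
let t = loopedPosition(sampleTime: 100_000, sampleRate: 44_100, fileLength: 44_100)
```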
I'm creating an RPGGameKit using SpriteKit to help me develop my iOS games. Now that my player can move, I've added animations and an audio system.
I've run into a problem synchronizing textures and sounds, such as playing a step sound each time my player takes a step.
let atlas = SKTextureAtlas(named: "Walk")
let textures = atlas.getTextures() // I created an extension that returns textures of atlas
let walkingAnimation = SKAction.animate(with: textures, timePerFrame: 1)
So walkingAnimation will loop through the textures, changing the texture every second.
Now I want to play a walking sound whenever the texture changes.
I have looked at the SKAction and SpriteKit documentation, but there is no callback for this SKAction.
If you want to try to get this done with me or you have ideas of how to do it, please leave a comment.
Thanks :)
Try this:
let frame1 = SKAction.setTexture(yourTexture1)
let frame2 = SKAction.setTexture(yourTexture2)
let frame3 = SKAction.setTexture(yourTexture3)
//etc
let sound = SKAction.playSoundFileNamed("soundName", waitForCompletion: false)
let oneSecond = SKAction.wait(forDuration: 1)
let sequence = SKAction.sequence([frame1, sound, oneSecond, frame2, sound, oneSecond, frame3, sound, oneSecond])
node.run(sequence)
So, for now I'm going to do it like this:
let textures = SKTextureAtlas(named: "LeftStep").getTextures()
var actions = [SKAction]()
for texture in textures {
    let group = SKAction.group([
        SKAction.setTexture(texture),
        SKAction.playSoundFileNamed("Step.mp3", waitForCompletion: false)
    ])
    let sequence = SKAction.sequence([
        group,
        SKAction.wait(forDuration: 0.5)
    ])
    actions.append(sequence)
}
self.node.run(SKAction.repeatForever(SKAction.sequence(actions)))
Thanks @StefanOvomate
I've found myself in the same situation; currently I'm doing the below. From what I've read in the documentation and seen online, the only way to do it is to make the audio file's length match one rotation of the texture animation.
// The original snippet begins mid-function; `startMoving` is a hypothetical name for it.
func startMoving() {
    let walkAtlas = global.playerWalkAtlas
    var walkFrames: [SKTexture] = []
    let numImages = walkAtlas.textureNames.count
    for i in 1...numImages {
        let texture = "walk\(i)"
        walkFrames.append(walkAtlas.textureNamed(texture))
    }
    walking = walkFrames
    isWalking = true
    animateMove()
}

func animateMove() {
    let animateWalk = SKAction.animate(with: walking, timePerFrame: 0.05)
    let soundWalk = global.playSound(sound: .walkSound)
    let sequence = SKAction.sequence([soundWalk, animateWalk])
    self.run(SKAction.repeatForever(sequence), withKey: "isMoving")
}

func stopMoving() {
    self.removeAction(forKey: "isMoving")
    isWalking = false
}
I want to show a GIF image in a UIImageView, and with the code below (source: https://iosdevcenters.blogspot.com/2016/08/load-gif-image-in-swift_22.html; I don't fully understand all of it) I am able to display GIF images. However, the memory consumption seems high (tested on a real device). Is there any way to modify the code below to reduce the memory consumption?
@IBOutlet weak var imageView: UIImageView!

override func viewDidLoad() {
    super.viewDidLoad()
    let url = "https://cdn-images-1.medium.com/max/800/1*oDqXedYUMyhWzN48pUjHyw.gif"
    let gifImage = UIImage.gifImageWithURL(url)
    imageView.image = gifImage
}

override func didReceiveMemoryWarning() {
    super.didReceiveMemoryWarning()
    // Dispose of any resources that can be recreated.
}

// Migrator-generated helper: compares two optionals, treating nil as smaller.
fileprivate func < <T: Comparable>(lhs: T?, rhs: T?) -> Bool {
    switch (lhs, rhs) {
    case let (l?, r?):
        return l < r
    case (nil, _?):
        return true
    default:
        return false
    }
}
extension UIImage {
    public class func gifImageWithData(_ data: Data) -> UIImage? {
        guard let source = CGImageSourceCreateWithData(data as CFData, nil) else {
            print("image doesn't exist")
            return nil
        }
        return UIImage.animatedImageWithSource(source)
    }

    public class func gifImageWithURL(_ gifUrl: String) -> UIImage? {
        guard let bundleURL = URL(string: gifUrl) else {
            return nil
        }
        guard let imageData = try? Data(contentsOf: bundleURL) else {
            return nil
        }
        return gifImageWithData(imageData)
    }

    public class func gifImageWithName(_ name: String) -> UIImage? {
        guard let bundleURL = Bundle.main
            .url(forResource: name, withExtension: "gif") else {
            return nil
        }
        guard let imageData = try? Data(contentsOf: bundleURL) else {
            return nil
        }
        return gifImageWithData(imageData)
    }
    class func delayForImageAtIndex(_ index: Int, source: CGImageSource!) -> Double {
        var delay = 0.1
        let cfProperties = CGImageSourceCopyPropertiesAtIndex(source, index, nil)
        let gifProperties: CFDictionary = unsafeBitCast(
            CFDictionaryGetValue(cfProperties,
                Unmanaged.passUnretained(kCGImagePropertyGIFDictionary).toOpaque()),
            to: CFDictionary.self)
        var delayObject: AnyObject = unsafeBitCast(
            CFDictionaryGetValue(gifProperties,
                Unmanaged.passUnretained(kCGImagePropertyGIFUnclampedDelayTime).toOpaque()),
            to: AnyObject.self)
        if delayObject.doubleValue == 0 {
            delayObject = unsafeBitCast(CFDictionaryGetValue(gifProperties,
                Unmanaged.passUnretained(kCGImagePropertyGIFDelayTime).toOpaque()), to: AnyObject.self)
        }
        delay = delayObject as? Double ?? 0.1 // avoid crashing on a non-numeric value
        if delay < 0.1 {
            delay = 0.1
        }
        return delay
    }
    class func gcdForPair(_ a: Int?, _ b: Int?) -> Int {
        var a = a
        var b = b
        if b == nil || a == nil {
            if b != nil {
                return b!
            } else if a != nil {
                return a!
            } else {
                return 0
            }
        }
        if a < b {
            let c = a
            a = b
            b = c
        }
        var rest: Int
        while true {
            rest = a! % b!
            if rest == 0 {
                return b!
            } else {
                a = b
                b = rest
            }
        }
    }
    class func gcdForArray(_ array: Array<Int>) -> Int {
        if array.isEmpty {
            return 1
        }
        var gcd = array[0]
        for val in array {
            gcd = UIImage.gcdForPair(val, gcd)
        }
        return gcd
    }

    class func animatedImageWithSource(_ source: CGImageSource) -> UIImage? {
        let count = CGImageSourceGetCount(source)
        var images = [CGImage]()
        var delays = [Int]()
        for i in 0..<count {
            if let image = CGImageSourceCreateImageAtIndex(source, i, nil) {
                images.append(image)
            }
            let delaySeconds = UIImage.delayForImageAtIndex(Int(i), source: source)
            delays.append(Int(delaySeconds * 1000.0)) // Seconds to ms
        }
        let duration: Int = {
            var sum = 0
            for val: Int in delays {
                sum += val
            }
            return sum
        }()
        let gcd = gcdForArray(delays)
        var frames = [UIImage]()
        var frame: UIImage
        var frameCount: Int
        for i in 0..<count {
            frame = UIImage(cgImage: images[Int(i)])
            frameCount = Int(delays[Int(i)] / gcd)
            for _ in 0..<frameCount {
                frames.append(frame)
            }
        }
        let animation = UIImage.animatedImage(with: frames,
                                              duration: Double(duration) / 1000.0)
        return animation
    }
}
When I render the image as normal png image, the consumption is around 10MB.
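For what it's worth, the delay handling in animatedImageWithSource is easy to follow in isolation: the GCD of the millisecond delays is the finest uniform tick, and each frame is repeated delay / gcd times so animatedImage(with:duration:) can use a single frame duration. With hypothetical delays of 100 ms and 150 ms:

```swift
// gcd of the per-frame delays (in ms) gives the finest uniform tick;
// each frame is then repeated delay / gcd times.
func gcd(_ a: Int, _ b: Int) -> Int {
    var (a, b) = (a, b)
    while b != 0 { (a, b) = (b, a % b) }
    return a
}

let delays = [100, 150]                 // ms, hypothetical
let tick = delays.reduce(0, gcd)        // 50 ms
let repeats = delays.map { $0 / tick }  // [2, 3]: 5 tick-length frames in total
```

This also shows where memory can balloon: a pathological delay list with a tiny GCD produces many repeated frames.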
The GIF in question has a resolution of 480×288 and contains 10 frames.
Considering that UIImageView stores frames as 4-byte RGBA, this GIF occupies 4 × 10 × 480 × 288 = 5,529,600 bytes in RAM, which is more than 5 megabytes.
There are numerous ways to mitigate that, but only one of them puts no additional strain on the CPU; the others are mere CPU-to-RAM trade-offs.
The method I'm talking about is subclassing UIImageView and loading your GIFs by hand, preserving their internal representation (indexed image + palette). That would let you cut the memory usage roughly fourfold.
N.B.: even though a GIF may store a full image for each frame (which is the case for the GIF in question), many do not. On the contrary, most frames may contain only the pixels that changed since the previous one. Thus, in general, the internal GIF representation only allows displaying frames in order.
Other methods of saving RAM include e.g. re-reading every frame from disk prior to displaying it, which is certainly not good for battery life.
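The arithmetic above can be checked directly; the fourfold saving of an indexed representation comes from 1 byte per pixel instead of 4 (the dimensions and frame count are the ones stated for the GIF in question):

```swift
// Decoded RGBA frames: 4 bytes per pixel × frames × width × height.
let frames = 10, width = 480, height = 288
let rgbaBytes = 4 * frames * width * height     // 5,529,600 bytes, > 5 MB
// Indexed (8-bit palette) frames: 1 byte per pixel, a quarter of the RAM.
let indexedBytes = 1 * frames * width * height  // 1,382,400 bytes
```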
To display GIFs with less memory consumption, try BBWebImage.
BBWebImage decides how many image frames to decode and cache depending on current memory usage. If free memory is insufficient, only some of the image frames are decoded and cached.
For Swift 4:
// BBAnimatedImageView (a UIImageView subclass) displays animated images
imageView = BBAnimatedImageView(frame: frame)
// Load and display gif
imageView.bb_setImage(with: url,
                      placeholder: UIImage(named: "placeholder"))
{ (image: UIImage?, data: Data?, error: Error?, cacheType: BBImageCacheType) in
    // Do something when finish loading
}
I am working on my first application, a mathematical riddles app. The player can get a hint that reveals one of the variables; it's basically replacing one image with another. Sometimes I replace more than one image, so I use a loop that replaces all of them. I want the old image to fade out and be replaced with the new image, the answer. I would also like them to fade one after the other, meaning there is a small delay between one image replacement animation and the next.
func changeHintIcons() {
    var labelsArr = [
        [firstEquationFirstElemnt, firstEquationSecondElemnt, firstEquationThirdElemnt],
        [secondEquationFirstElemnt, secondEquationSecondElemnt, secondEquationThirdElemnt],
        [thirdEquationFirstElemnt, thirdEquationSecondElemnt, thirdEquationthirdElemnt],
        [fourthEquationFirstElemnt, fourthEquationSecondElemnt, fourthEquationThirdElemnt],
        [fifthEquationFirstElemnt, fifthEquationSecondElemnt, fifthEquationThirdElemnt]
    ]
    let col: Int = Int(arc4random_uniform(UInt32(gameDifficulty.stages[gameLevel].numberOfVariables)))
    let row: Int = Int(arc4random_uniform(UInt32(2))) * 2
    let var_to_show = current_equations[col][row]
    let image_name = "answer.num.\(var_to_show)"
    for i in 0..<current_equations.count {
        for j in 0..<current_equations[i].count {
            if current_equations[i][j] == var_to_show {
                var image_index = j
                if j > 0 {
                    image_index = j / 2 // Converting index
                }
                labelsArr[i][image_index]!.image = UIImage(named: image_name)! // Replacing the image
            }
        }
    }
}
One last thing: what if I want to use an animation instead of simply letting the image fade out? What are my options, and how can I implement them?
OK, I found the answer. Basically, Swift lets you create your own animation by showing a set of images one after the other. Follow these steps:
1. Copy the animation images to the assets folder
2. Create an array of UIImages
3. Do the same things as I did in the animate function
Main code -
var animationArray = createImageArray(total: 14, imagePrefix: "hint.animation")
animationArray.append(UIImage(named: imageHintAnswer)!)
animate(imageView: labelsArr[i][image_index]!, images: animationArray)
Functions -
func createImageArray(total: Int, imagePrefix: String) -> [UIImage] {
    var imageArray: [UIImage] = []
    for imageCount in 1..<total {
        let imageName = "\(imagePrefix).\(imageCount)"
        let image = UIImage(named: imageName)!
        imageArray.append(image)
    }
    return imageArray
}
func animate(imageView: UIImageView, images: [UIImage]) {
    imageView.animationImages = images
    imageView.animationDuration = 0.7
    imageView.animationRepeatCount = 1
    imageView.startAnimating()
}
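To get the "fade one after the other" effect from the original question, a common approach is to stagger each swap with a growing delay. A sketch: only the delay math is concrete here; the UIKit calls in the comments (DispatchQueue.main.asyncAfter plus UIView.transition with .transitionCrossDissolve) show how it might be wired up, and the view and image names are hypothetical:

```swift
// Start the i-th fade `gap` seconds after the previous one.
func staggeredDelays(count: Int, gap: Double) -> [Double] {
    return (0..<count).map { Double($0) * gap }
}

let delays = staggeredDelays(count: 3, gap: 0.3)  // [0.0, 0.3, 0.6]

// Hypothetical UIKit usage (not compiled here):
// for (i, view) in imageViewsToSwap.enumerated() {
//     DispatchQueue.main.asyncAfter(deadline: .now() + delays[i]) {
//         UIView.transition(with: view, duration: 0.4,
//                           options: .transitionCrossDissolve,
//                           animations: { view.image = newImage })
//     }
// }
```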