This is the code I have written so far to change the position of the view when dragged.
The view's center point is supposed to change while the UIPanGestureRecognizer is in the began or changed state, but the movement only happens when I let go of the gesture. I do not want that. I want the view to follow my drag, the way Notification Center and Control Center do.
Thanks in advance for the help.
class ViewController: UIViewController {
let maxY = UIScreen.main.bounds.height
lazy var slideUpView: UIView = {
let view = UIView(frame: CGRect(x: 0, y: maxY - 295, width: UIScreen.main.bounds.width, height: UIScreen.main.bounds.height / 3 ))
view.backgroundColor = .green
return view
}()
lazy var redImageView: UIView = {
let view = UIView(frame: CGRect(x: 0, y: maxY - 295, width: UIScreen.main.bounds.width, height: slideUpView.frame.height / 2 ))
view.backgroundColor = .red
return view
}()
override func viewDidLoad() {
super.viewDidLoad()
print("MaxX: \(UIScreen.main.bounds.maxX)")
print("Max Hight: \(UIScreen.main.bounds.height)")
print("MaxY: \(UIScreen.main.bounds.maxY)")
print("Max width: \(UIScreen.main.bounds.width)")
// MARK: add the views as subview
let viewsArray = [slideUpView, redImageView]
viewsArray.forEach{view.addSubview($0)}
// Add a UIPanGestureRecognizer to it
viewsArray.forEach {createPanGestureRecognizer(targetView: $0)}
}
// The Pan Gesture
func createPanGestureRecognizer(targetView: UIView) {
let panGesture = UIPanGestureRecognizer(target: self, action: #selector(handlePanGesture(recognizer:)))
panGesture.maximumNumberOfTouches = 1
targetView.addGestureRecognizer(panGesture)
}
@objc func handlePanGesture(recognizer: UIPanGestureRecognizer) {
if recognizer.state == .began || recognizer.state == .changed {
let translation = recognizer.translation(in: self.view)
print(recognizer.view!.center.y)
let newPos = CGPoint(x:recognizer.view!.center.x + translation.x, y: recognizer.view!.center.y + translation.y)
if insideDraggableArea(newPos) {
guard let targetedView = recognizer.view else {
print("Error: No View to handle")
return
}
targetedView.center.y = newPos.y
recognizer.setTranslation(.zero, in: targetedView)
}
}
}
private func insideDraggableArea(_ point : CGPoint) -> Bool {
// point.x > 50 && point.x < 200 &&
return point.y > (maxY * 0.27) && point.y <= maxY
}
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
if let touch = touches.first {
let position = touch.location(in: view)
print(position)
}
}
}
I also had this issue, at least in a playgrounds prototype...
I put all of the translation (that you have in the changed state) inside a UIViewPropertyAnimator defined within the handlePanGesture function, and then called .startAnimation() in the began/changed states.
It takes a bit of tweaking on the animation, but it's a lot smoother for me.
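Something like this, as a rough sketch based on the handlePanGesture from the question (the 0.1-second duration is just a starting value to tweak):
@objc func handlePanGesture(recognizer: UIPanGestureRecognizer) {
    guard let targetedView = recognizer.view else { return }
    if recognizer.state == .began || recognizer.state == .changed {
        let translation = recognizer.translation(in: view)
        let newY = targetedView.center.y + translation.y
        // A short animator each time the pan fires smooths the tracking
        let animator = UIViewPropertyAnimator(duration: 0.1, curve: .easeOut) {
            targetedView.center.y = newY
        }
        animator.startAnimation()
        recognizer.setTranslation(.zero, in: view)
    }
}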
This code works fine. The issue was the simulator I used yesterday. Everything works as I wanted when tested on an iPhone.
Related
I'm trying to remove the custom view from its superview in the completion block at the end of the animation, but the block is called immediately and the animation becomes abrupt. I managed to work around the problem in a not-very-good way: just adding a delay before removing the view.
Here is the function for animating the view:
private func animatedHideSoundView(toRight: Bool) {
let translationX = toRight ? 0.0 : -screenWidth
UIView.animate(withDuration: 0.5) {
self.soundView.transform = CGAffineTransform(translationX: translationX, y: 0.0)
} completion: { isFinished in
if isFinished {
self.soundView.removeFromSuperview()
self.songPlayer.pause()
}
}
}
The problem is in this line: self.soundView.removeFromSuperview()
When I call this function from inside the switch recognizer.state statement, the completion block executes early; when I call it from anywhere else, everything works correctly.
@objc private func soundViewPanned(recognizer: UIPanGestureRecognizer) {
let touchPoint = recognizer.location(in: view)
switch recognizer.state {
case .began:
initialOffset = CGPoint(x: touchPoint.x - soundView.center.x, y: touchPoint.y - soundView.center.y)
case .changed:
soundView.center = CGPoint(x: touchPoint.x - initialOffset.x, y: touchPoint.y - initialOffset.y)
if notHiddenSoundViewRect.minX > soundView.frame.minX {
animatedHideSoundView(toRight: false)
} else if notHiddenSoundViewRect.maxX < soundView.frame.maxX {
animatedHideSoundView(toRight: true)
}
case .ended, .cancelled:
let decelerationRate = UIScrollView.DecelerationRate.normal.rawValue
let velocity = recognizer.velocity(in: view)
let projectedPosition = CGPoint(
x: soundView.center.x + project(initialVelocity: velocity.x, decelerationRate: decelerationRate),
y: soundView.center.y + project(initialVelocity: velocity.y, decelerationRate: decelerationRate)
)
let nearestCornerPosition = nearestCorner(to: projectedPosition)
let relativeInitialVelocity = CGVector(
dx: relativeVelocity(forVelocity: velocity.x, from: soundView.center.x, to: nearestCornerPosition.x),
dy: relativeVelocity(forVelocity: velocity.y, from: soundView.center.y, to: nearestCornerPosition.y)
)
let timingParameters = UISpringTimingParameters(dampingRatio: 0.8, initialVelocity: relativeInitialVelocity)
let animator = UIViewPropertyAnimator(duration: 0.5, timingParameters: timingParameters)
animator.addAnimations {
self.soundView.center = nearestCornerPosition
}
animator.startAnimation()
default: break
}
}
I want the user to be able to swipe this soundView off the screen.
That's why I check where the soundView is while the user is moving it, so that if they move it near the edge of the screen I can hide the soundView with an animation.
Maybe I'm doing it wrong, but I couldn't think of anything else, because I don't have much experience. Could someone give me some advice on this?
I managed to solve it this way, but I don't like it:
private func animatedHideSoundView(toRight: Bool) {
let translationX = toRight ? 0.0 : -screenWidth
UIView.animate(withDuration: 0.5) {
self.soundView.transform = CGAffineTransform(translationX: translationX, y: 0.0)
}
DispatchQueue.main.asyncAfter(deadline: .now() + 0.5) {
self.soundView.removeFromSuperview()
self.songPlayer.pause()
}
}
You can see and run all code here: https://github.com/swiloper/AnimationProblem
Couple notes...
First, in your controller code, you are calling animatedHideSoundView() from your pan gesture recognizer every time you move the touch. It's unlikely that's what you want to do.
Second, if you call animatedHideSoundView(toRight: true) your code:
private func animatedHideSoundView(toRight: Bool) {
let translationX = toRight ? 0.0 : -screenWidth
UIView.animate(withDuration: 0.5) {
self.soundView.transform = CGAffineTransform(translationX: translationX, y: 0.0)
} completion: { isFinished in
if isFinished {
self.soundView.removeFromSuperview()
self.songPlayer.pause()
}
}
}
sets translationX to zero ... so when you then try to animate the transform, the animation takes no time because you're not changing the x.
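For example, something like this (a sketch; it assumes screenWidth is the full screen width, so the view ends up off-screen in either direction):
let translationX = toRight ? screenWidth : -(screenWidth + soundView.frame.width)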
Third, I strongly suggest that you start simple. The code you linked to cannot be copy/pasted/run, which makes it difficult to offer help.
Here's a minimal version of your UniversalTypesViewController class (it uses your linked SoundView class):
final class UniversalTypesViewController: UIViewController {
// MARK: Properties
private lazy var soundView = SoundView(frame: CGRect(x: 0, y: 0, width: 80, height: 80))
private let panGestureRecognizer = UIPanGestureRecognizer()
private var initialOffset: CGPoint = .zero
override func viewDidLoad() {
super.viewDidLoad()
view.backgroundColor = .systemYellow
panGestureRecognizer.addTarget(self, action: #selector(soundViewPanned(recognizer:)))
soundView.addGestureRecognizer(panGestureRecognizer)
}
private func animatedShowSoundView() {
// reset soundView's transform
soundView.transform = .identity
// add it to the view
view.addSubview(soundView)
// position soundView near bottom, but past the right side of view
soundView.frame.origin = CGPoint(x: view.frame.width, y: view.frame.height - soundView.frame.height * 2.0)
soundView.startSoundBarsAnimation()
// animate soundView into view
UIView.animate(withDuration: 0.5, delay: 0.0, options: .curveEaseOut) {
self.soundView.transform = CGAffineTransform(translationX: -self.soundView.frame.width * 2.0, y: 0.0)
}
}
private func animatedHideSoundView(toRight: Bool) {
let translationX = toRight ? view.frame.width : -(view.frame.width + soundView.frame.width)
UIView.animate(withDuration: 0.5) {
self.soundView.transform = CGAffineTransform(translationX: translationX, y: 0.0)
} completion: { isFinished in
if isFinished {
self.soundView.removeFromSuperview()
//self.songPlayer.pause()
}
}
}
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
// if soundView is not in the view hierarchy,
// animate it into view - animatedShowSoundView() func adds it as a subview
if soundView.superview == nil {
animatedShowSoundView()
} else {
// unwrap the touch
guard let touch = touches.first else { return }
// get touch location
let loc = touch.location(in: self.view)
// if touch is inside the soundView frame,
// return, so pan gesture can move soundView
if soundView.frame.contains(loc) { return }
// if touch is on the left-half of the screen,
// animate soundView to the left and remove after animation
if loc.x < view.frame.midX {
animatedHideSoundView(toRight: false)
} else {
// touch is on the right-half of the screen,
// so just remove soundView
animatedHideSoundView(toRight: true)
}
}
}
// MARK: Objc methods
@objc private func soundViewPanned(recognizer: UIPanGestureRecognizer) {
let touchPoint = recognizer.location(in: view)
switch recognizer.state {
case .began:
initialOffset = CGPoint(x: touchPoint.x - soundView.center.x, y: touchPoint.y - soundView.center.y)
case .changed:
soundView.center = CGPoint(x: touchPoint.x - initialOffset.x, y: touchPoint.y - initialOffset.y)
case .ended, .cancelled:
()
default: break
}
}
}
If you run that, tapping anywhere will animate soundView into view at bottom-right. You can then drag soundView around.
If you tap away from soundView frame, on the left-half of the screen, soundView will be animated out to the left and removed after animation completes.
If you tap away from soundView frame, on the right-half of the screen, soundView will be animated out to the right and removed after animation completes.
Once you've got that working, and you see what's happening, you can implement it in the rest of your much-more-complex code.
Edit
Take a look at this modified version of your code.
One big problem in your code is that you're making multiple calls to animatedHideSoundView(). When the drag gets near the edge, your code calls that... but then it gets called again because the drag is still "active."
So, I added a var isHideAnimationRunning: Bool flag so that positioning while dragging and positioning while the "hide" animation is running don't conflict.
A few other changes:
- instead of mixing transforms with .center positioning, get rid of the transforms and just use .center
- I created a struct with logically named corner points - makes it much easier to reference them
- strongly recommended: add comments to your code!
So, give this a try:
import UIKit
let screenWidth: CGFloat = UIScreen.main.bounds.width
let screenHeight: CGFloat = UIScreen.main.bounds.height
let sideSpacing: CGFloat = 32.0
let mediumSpacing: CGFloat = 16.0
var isNewIphone: Bool {
return screenHeight / screenWidth > 1.8
}
extension CGPoint {
func distance(to point: CGPoint) -> CGFloat {
return sqrt(pow(point.x - x, 2) + pow(point.y - y, 2))
}
}
// so we can refer to corner positions by logical names
struct CornerPoints {
var topLeft: CGPoint = .zero
var bottomLeft: CGPoint = .zero
var bottomRight: CGPoint = .zero
var topRight: CGPoint = .zero
}
final class ViewController: UIViewController {
private var cornerPoints = CornerPoints()
private let soundViewSide: CGFloat = 80.0
private lazy var halfSoundViewWidth = soundViewSide / 2
private lazy var newIphoneSpacing = isNewIphone ? mediumSpacing : 0.0
private lazy var soundView = SoundView(frame: CGRect(origin: .zero, size: CGSize(width: soundViewSide, height: soundViewSide)))
private lazy var notHiddenSoundViewRect = CGRect(x: mediumSpacing, y: 0.0, width: screenWidth - mediumSpacing * 2, height: screenHeight)
private var initialOffset: CGPoint = .zero
override func viewDidLoad() {
super.viewDidLoad()
view.backgroundColor = .yellow
// setup corner points
let left = sideSpacing + halfSoundViewWidth
let right = view.frame.maxX - (sideSpacing + halfSoundViewWidth)
let top = sideSpacing + halfSoundViewWidth - newIphoneSpacing
let bottom = view.frame.maxY - (sideSpacing + halfSoundViewWidth - newIphoneSpacing)
cornerPoints.topLeft = CGPoint(x: left, y: top)
cornerPoints.bottomLeft = CGPoint(x: left, y: bottom)
cornerPoints.bottomRight = CGPoint(x: right, y: bottom)
cornerPoints.topRight = CGPoint(x: right, y: top)
let panGestureRecognizer = UIPanGestureRecognizer()
panGestureRecognizer.addTarget(self, action: #selector(soundViewPanned(recognizer:)))
soundView.addGestureRecognizer(panGestureRecognizer)
// for development, let's add a double-tap recognizer to
// add the soundView again (if it's been removed)
let dt = UITapGestureRecognizer(target: self, action: #selector(showAgain(_:)))
dt.numberOfTapsRequired = 2
view.addGestureRecognizer(dt)
DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
self.animatedShowSoundView()
}
}
@objc func showAgain(_ f: UITapGestureRecognizer) {
// if soundView has been removed
if soundView.superview == nil {
// add it
animatedShowSoundView()
}
}
private func animatedShowSoundView() {
// start at bottom-right, off-screen to the right
let pt: CGPoint = cornerPoints.bottomRight
soundView.center = CGPoint(x: screenWidth + soundViewSide, y: pt.y)
view.addSubview(soundView)
soundView.startSoundBarsAnimation()
// animate to bottom-right corner
UIView.animate(withDuration: 0.5, delay: 0.0, options: .curveEaseOut) {
self.soundView.center = pt
}
}
// flag so we know if soundView is currently
// "hide" animating
var isHideAnimationRunning: Bool = false
private func animatedHideSoundView(toRight: Bool) {
// only execute if soundView is not currently "hide" animating
if !isHideAnimationRunning {
// set flag to true
isHideAnimationRunning = true
// target center X
let targetX: CGFloat = toRight ? screenWidth + soundViewSide : -soundViewSide
UIView.animate(withDuration: 0.5) {
self.soundView.center.x = targetX
} completion: { isFinished in
self.isHideAnimationRunning = false
if isFinished {
self.soundView.removeFromSuperview()
//self.songPlayer.pause()
}
}
}
}
@objc private func soundViewPanned(recognizer: UIPanGestureRecognizer) {
let touchPoint = recognizer.location(in: view)
switch recognizer.state {
case .began:
// only execute if soundView is not currently "hide" animating
if !isHideAnimationRunning {
initialOffset = CGPoint(x: touchPoint.x - soundView.center.x, y: touchPoint.y - soundView.center.y)
}
case .changed:
// only execute if soundView is not currently "hide" animating
if !isHideAnimationRunning {
soundView.center = CGPoint(x: touchPoint.x - initialOffset.x, y: touchPoint.y - initialOffset.y)
if notHiddenSoundViewRect.minX > soundView.frame.minX {
animatedHideSoundView(toRight: false)
} else if notHiddenSoundViewRect.maxX < soundView.frame.maxX {
animatedHideSoundView(toRight: true)
}
}
case .ended, .cancelled:
// only execute if soundView is not currently "hide" animating
if !isHideAnimationRunning {
let decelerationRate = UIScrollView.DecelerationRate.normal.rawValue
let velocity = recognizer.velocity(in: view)
let projectedPosition = CGPoint(
x: soundView.center.x + project(initialVelocity: velocity.x, decelerationRate: decelerationRate),
y: soundView.center.y + project(initialVelocity: velocity.y, decelerationRate: decelerationRate)
)
let nearestCornerPosition = nearestCorner(to: projectedPosition)
let relativeInitialVelocity = CGVector(
dx: relativeVelocity(forVelocity: velocity.x, from: soundView.center.x, to: nearestCornerPosition.x),
dy: relativeVelocity(forVelocity: velocity.y, from: soundView.center.y, to: nearestCornerPosition.y)
)
let timingParameters = UISpringTimingParameters(dampingRatio: 0.8, initialVelocity: relativeInitialVelocity)
let animator = UIViewPropertyAnimator(duration: 0.5, timingParameters: timingParameters)
animator.addAnimations {
self.soundView.center = nearestCornerPosition
}
animator.startAnimation()
}
default: break
}
}
private func project(initialVelocity: CGFloat, decelerationRate: CGFloat) -> CGFloat {
return (initialVelocity / 1000) * decelerationRate / (1 - decelerationRate)
}
private func nearestCorner(to point: CGPoint) -> CGPoint {
var minDistance = CGFloat.greatestFiniteMagnitude
var nearestPosition = CGPoint.zero
for position in [cornerPoints.topLeft, cornerPoints.bottomLeft, cornerPoints.bottomRight, cornerPoints.topRight] {
let distance = point.distance(to: position)
if distance < minDistance {
nearestPosition = position
minDistance = distance
}
}
return nearestPosition
}
/// Calculates the relative velocity needed for the initial velocity of the animation.
private func relativeVelocity(forVelocity velocity: CGFloat, from currentValue: CGFloat, to targetValue: CGFloat) -> CGFloat {
guard currentValue - targetValue != 0 else { return 0 }
return velocity / (targetValue - currentValue)
}
}
I am trying to implement a multi-row sequence of items (like the video editing sequence in Final Cut Pro or Adobe Premiere Pro, shown below).
While one can always implement it using UIScrollView and placing custom views manually, that would be tedious, particularly for reordering items, animating changes, and zooming across the timeline with a pinch gesture. Is it possible to implement it with UICollectionView using UICollectionViewCompositionalLayout and UICollectionViewDiffableDataSource? From the WWDC videos it seems almost everything is possible with compositional layout, but it isn't clear whether a timeline like this can be built with it. Maybe UICollectionView is not the right paradigm for this use case and one should use UIScrollView instead? Even with UIScrollView, managing things like dragging and reordering items, animating data source changes, trimming items, and zooming the content would still be issues. Any pointers to an existing code base that implements these features?
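For reference, here is a minimal sketch of the kind of compositional layout the question is considering: one horizontally scrolling section per track (item widths and spacings are placeholder assumptions). Note that orthogonally scrolling sections scroll independently of one another, so keeping all tracks in sync would still need extra work, which is part of why this may not map cleanly onto a timeline:
import UIKit

func makeTimelineLayout() -> UICollectionViewCompositionalLayout {
    return UICollectionViewCompositionalLayout { _, _ in
        // One fixed-width item per clip; a real timeline would derive widths from clip durations
        let itemSize = NSCollectionLayoutSize(widthDimension: .absolute(120),
                                              heightDimension: .fractionalHeight(1.0))
        let item = NSCollectionLayoutItem(layoutSize: itemSize)
        let groupSize = NSCollectionLayoutSize(widthDimension: .estimated(120),
                                               heightDimension: .absolute(60))
        let group = NSCollectionLayoutGroup.horizontal(layoutSize: groupSize, subitems: [item])
        let section = NSCollectionLayoutSection(group: group)
        // Each section (track) scrolls horizontally on its own
        section.orthogonalScrollingBehavior = .continuous
        section.interGroupSpacing = 2
        return section
    }
}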
Here is my playground code as a partial answer, for a simple empty iOS playground file. It should give you a basic idea of how to implement it using SpriteKit. I didn't add any animations, the scene so far has a fixed width, and the "camera" is also fixed and doesn't allow zooming yet. But I wanted to give you something so you can decide whether this is even the right solution for you.
import UIKit
import SpriteKit
import PlaygroundSupport
class MyViewController: UIViewController {
override func loadView() {
// Setting up a basic UIView as parent
let parentView = UIView()
parentView.frame = CGRect(x: 0, y: 0, width: 600, height: 600)
parentView.backgroundColor = .black
// Defining the SKView
let tracksSKView = SKView(frame: parentView.frame)
tracksSKView.ignoresSiblingOrder = false
// Options to debug visually
// tracksSKView.showsNodeCount = true
// tracksSKView.showsPhysics = true
// tracksSKView.showsFields = true
// tracksSKView.showsLargeContentViewer = true
// Defining our subclassed SKScene
let scene = GameScene(size: tracksSKView.bounds.size)
// Presenting and adding views and scenes
tracksSKView.presentScene(scene)
parentView.addSubview(tracksSKView)
self.view = parentView
}
}
//MARK: - Custom SKScene
class GameScene: SKScene {
let trackSize = CGSize(width: 2048, height: 120)
let tracksCount = 4
// Hardcoded clips, use your data source and update when a clip has been moved in any way.
let clips: [Clip] = [
Clip(name: "SongA", track: 1, xPosition: 0, lengh: 245),
Clip(name: "SongB", track: 2, xPosition: 200, lengh: 166, color: .blue),
Clip(name: "SongC", track: 3, xPosition: 200, lengh: 256, color: .red)
]
var touchingClip = false
var touchedClip = SKNode()
// Basically like loadView or viewDidLoad
override func didMove(to view: SKView) {
physicsWorld.contactDelegate = self
self.size = CGSize(width: 1024, height: 768)
self.name = "scene"
addTracks(amount: tracksCount)
addClips(clips: clips)
}
// Adding x amount of tracks.
func addTracks(amount: Int) {
for n in 0..<amount {
let trackNode = SKSpriteNode(color: n%2 == 0 ? .systemGray : .systemGray2, size: trackSize)
// Setting up the physical properties of the border of the track
trackNode.physicsBody = SKPhysicsBody(edgeLoopFrom: trackNode.frame)
trackNode.physicsBody?.restitution = 0.2
trackNode.physicsBody?.allowsRotation = false
trackNode.physicsBody?.affectedByGravity = false
trackNode.physicsBody?.isDynamic = false
// Positioning the track
trackNode.zPosition = -1
trackNode.position.y = frame.minY + trackSize.height / 2 + CGFloat(n) * trackSize.height
addChild(trackNode)
}
}
// Adding the Clip objects stored in an array.
func addClips(clips: [Clip]) {
for clip in clips {
let clipNode = SKSpriteNode(color: clip.color, size: CGSize(width: clip.lengh, height: Int(trackSize.height) - 20))
clipNode.position.x = clip.xPosition + CGFloat(clip.lengh / 2)
clipNode.position.y = frame.minY + (trackSize.height * CGFloat(clip.track)) + 1
clipNode.zPosition = 1
clipNode.physicsBody = SKPhysicsBody(rectangleOf: clipNode.frame.size)
clipNode.physicsBody?.affectedByGravity = true
clipNode.physicsBody?.allowsRotation = false
clipNode.physicsBody?.restitution = 0.2
addChild(clipNode)
}
}
//MARK: - User interaction
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
for touch in touches {
let location = touch.location(in: self)
// getting all nodes the user touched (visible and hidden below others)
let tappedNodes = nodes(at: location)
//getting the top node
if let node = tappedNodes.first {
touchedClip = node
touchingClip = true
}
}
}
override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
guard touchingClip else { return }
// Moving the clip (node) based on the movement of the touch. It's very basic and can look jittery. Using the animate methods would create better results.
for touch in touches {
let location = touch.location(in: self)
touchedClip.position = location
}
}
override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
touchingClip = false
}
}
//MARK: - Interaction in between object like collisions etc.
extension GameScene: SKPhysicsContactDelegate {
// handle different contact cases here
}
//MARK: - Clip object
struct Clip {
var name: String
var track: Int
var xPosition: CGFloat
var lengh: Int
var color: UIColor = .green
}
PlaygroundPage.current.liveView = MyViewController()
I've added a gesture recognizer so that a long press moves the clips, while a plain touch-and-pan now resizes the clip. Here is the new code:
import UIKit
import SpriteKit
import PlaygroundSupport
PlaygroundPage.current.liveView = MyViewController()
class MyViewController: UIViewController {
override func loadView() {
// Setting up a basic UIView as parent
let parentView = UIView()
parentView.frame = CGRect(x: 0, y: 0, width: 600, height: 600)
parentView.backgroundColor = .black
// Defining the SKView
let tracksSKView = SKView(frame: parentView.frame)
tracksSKView.ignoresSiblingOrder = false
// Options to debug visually
tracksSKView.showsNodeCount = true
tracksSKView.showsPhysics = true
tracksSKView.showsFields = true
tracksSKView.showsLargeContentViewer = true
// Defining our subclassed SKScene
let scene = GameScene(size: tracksSKView.bounds.size)
// Presenting and adding views and scenes
tracksSKView.presentScene(scene)
parentView.addSubview(tracksSKView)
self.view = parentView
}
}
//MARK: - Custom SKScene
class GameScene: SKScene {
let trackSize = CGSize(width: 2048, height: 120)
let tracksCount = 4
// Hardcoded clips, use your data source and update when a clip has been moved in any way.
let clips: [Clip] = [
Clip(name: "SongA", track: 1, xPosition: 0, lengh: 245),
Clip(name: "SongB", track: 2, xPosition: 200, lengh: 166, color: .blue),
Clip(name: "SongC", track: 3, xPosition: 200, lengh: 256, color: .red)
]
// Different interactions; I used a separate variable for each interaction instead of one so more can be added later.
var touchingClip = false
var movingClip = false
var resizingClip = true
var touchedClip = SKNode()
var location = CGPoint()
// Basically like loadView or viewDidLoad
override func didMove(to view: SKView) {
physicsWorld.contactDelegate = self
// Using the UI gesture recognizer in the case of a long press seemed easier than trying to figure out the gestures in the touches methods.
let longPressRecognizer = UILongPressGestureRecognizer(target: self, action: #selector(GameScene.longPress))
self.view!.addGestureRecognizer(longPressRecognizer)
// Adding tracks and clips
addTracks(amount: tracksCount)
addClips(clips: clips)
}
// Method that handles the long press
@objc func longPress(sender: UILongPressGestureRecognizer) {
if sender.state == .began || sender.state == .changed {
movingClip = true
resizingClip = false
} else {
movingClip = false
resizingClip = true
}
location = sender.location(in: self.view)
}
//MARK: - Setting up the tracks and clips
// Adding x amount of tracks.
func addTracks(amount: Int) {
for n in 0..<amount {
let trackNode = SKSpriteNode(color: n%2 == 0 ? .systemGray : .systemGray2, size: trackSize)
// Setting up the physical properties of the border of the track
trackNode.physicsBody = SKPhysicsBody(edgeLoopFrom: trackNode.frame)
trackNode.physicsBody?.restitution = 0.2
trackNode.physicsBody?.allowsRotation = false
trackNode.physicsBody?.affectedByGravity = false
trackNode.physicsBody?.isDynamic = false
// Positioning the track
trackNode.zPosition = -1
trackNode.position.y = frame.minY + trackSize.height / 2 + CGFloat(n) * trackSize.height
addChild(trackNode)
}
}
// Adding the Clip objects stored in an array.
func addClips(clips: [Clip]) {
for clip in clips {
let clipNode = SKSpriteNode(color: clip.color, size: CGSize(width: clip.lengh, height: Int(trackSize.height) - 20))
clipNode.name = clip.name
clipNode.position.x = clip.xPosition + CGFloat(clip.lengh / 2)
clipNode.position.y = frame.minY + (trackSize.height * CGFloat(clip.track)) + 1
clipNode.zPosition = 1
clipNode.physicsBody = SKPhysicsBody(rectangleOf: clipNode.frame.size)
clipNode.physicsBody?.affectedByGravity = true
clipNode.physicsBody?.allowsRotation = false
clipNode.physicsBody?.restitution = 0.2
clipNode.physicsBody?.isDynamic = true
addChild(clipNode)
}
}
//MARK: - User interaction
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
for touch in touches {
let location = touch.location(in: self)
// Avoid a force cast: the tapped node may be the scene or a track, not a clip sprite
guard let node = atPoint(location) as? SKSpriteNode else { continue }
if clips.contains(where: { $0.name == node.name }) {
touchedClip = node
touchingClip = true
}
}
}
override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
guard touchingClip else { return }
for touch in touches {
if resizingClip {
let resizeValue = touch.location(in: touchedClip).x - touch.previousLocation(in: touchedClip).x
// Checking that we're only adding width to the clip or trimming no more than the remaining width.
if resizeValue > 0 || (resizeValue < 0 && abs(resizeValue) < touchedClip.frame.size.width) {
let action = SKAction.resize(byWidth: resizeValue, height: 0, duration: 0.0)
action.timingMode = .linear
touchedClip.run(action)
}
}
}
}
override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
touchingClip = false
resizingClip = true
movingClip = false
}
//MARK: - Scene update
// Runs as long as scene is active once per frame (target of 60 frames per second)
override func update(_ currentTime: TimeInterval) {
// The moving needs to be done in the update method, the touches methods are unresponsive while the gesture recognizer is active.
if movingClip && touchingClip {
let newLocation = convertPoint(fromView: location)
let action = SKAction.move(to: newLocation, duration: 0.1)
action.timingMode = .easeInEaseOut
touchedClip.run(action)
}
// The physics body does not change when the clip node is resized. I'm updating it here.
if resizingClip && touchingClip {
touchedClip.physicsBody = SKPhysicsBody(rectangleOf: touchedClip.frame.size)
touchedClip.physicsBody?.affectedByGravity = true
touchedClip.physicsBody?.allowsRotation = false
touchedClip.physicsBody?.restitution = 0.2
touchedClip.physicsBody?.isDynamic = true
}
}
}
//MARK: - Interaction in between object like collisions etc.
extension GameScene: SKPhysicsContactDelegate {
// handle different contact cases here
}
//MARK: - Clip object
struct Clip {
var name: String
var track: Int
var xPosition: CGFloat
var lengh: Int
var color: UIColor = .green
}
Sources:
www.udemy.com/course/dive-into-spritekit (Pretty good, but not great)
designcode.io (Not recommended)
stackoverflow.com/questions/30337608/detect-long-touch-in-sprite-kit
as well as more SO and Apple Dev :)
What I am doing: I have a PDF view that contains one sample PDF.
On top of that, I add more than one signatory view (a custom view) when the user taps the add button in the navigation bar.
Scenario 1: When I add the first signatory view (custom view) on the PDF, it is added and I can drag/move that first signatory view (Signatory1) on the PDF. This works fine.
Scenario 2: When I add the second signatory view (Signatory2) on the PDF, it is added and I can drag/move it on the PDF. This also works fine, but now I can't move/drag the first signatory view (Signatory1).
Scenario 3: Similarly, when I add the third signatory view (Signatory3) on the PDF, it is added and I can drag/move it, but now I can't move/drag the first signatory view (Signatory1) or the second signatory view (Signatory2), and so on.
The problem is that I only have access to the current signatory (I can move/drag only the signatory view that was added most recently); I can't move/drag the older signatory views.
How can I move/drag any signatory I choose when I touch (drag/move) that specific signatory view on the PDF view?
Here is some of the code:
override func viewDidLoad() {
super.viewDidLoad()
// Do any additional setup after loading the view.
customView1 = SignatoryXibView(frame: CGRect(x: 30, y: 30, width: 112, height: 58))
customView2 = SignatoryXibView(frame: CGRect(x: 30, y: 30, width: 112, height: 58))
customView3 = SignatoryXibView(frame: CGRect(x: 30, y: 30, width: 112, height: 58))
customView4 = SignatoryXibView(frame: CGRect(x: 30, y: 30, width: 112, height: 58))
customView5 = SignatoryXibView(frame: CGRect(x: 30, y: 30, width: 112, height: 58))
loadPdf()
}
func loadPdf(){
if let path = Bundle.main.path(forResource: "appointment-letter", ofType: "pdf") {
if let pdfDocument = PDFDocument(url: URL(fileURLWithPath: path)) {
pdfView.displayMode = .singlePage // .singlePage //.singlePageContinuous //.twoUp
//by default display mode is - singlePageContinuous
pdfView.autoScales = true
pdfView.displayDirection = .vertical // .horizontal//.vertical
pdfView.document = pdfDocument
pdfView.autoresizingMask = [.flexibleWidth, .flexibleHeight, .flexibleTopMargin, .flexibleBottomMargin]
pdfView.zoomIn(self)
pdfView.autoScales = true
pdfView.backgroundColor = UIColor.white
// Important line
pdfView.usePageViewController(true, withViewOptions: [:])
currentPageNumberOfPdf = self.pdfView.currentPage?.pageRef?.pageNumber ?? 0
print("Total Pages in PDF : ",pdfDocument.pageCount);
self.pdfView.bringSubviewToFront(customView1!)
}
}
}
@IBAction func addSignatoryButtonClicked(_ sender: Any) {
signatoryCount = signatoryCount + 1
if signatoryCount == 1 {
customView1?.signatoryLabel.text = "Signatory \(signatoryCount)"
self.pdfView.addSubview(customView1!)
}
else if signatoryCount == 2 {
customView2?.signatoryLabel.text = "Signatory \(signatoryCount)"
self.pdfView.addSubview(customView2!)
}
else if signatoryCount == 3 {
customView3?.signatoryLabel.text = "Signatory \(signatoryCount)"
self.pdfView.addSubview(customView3!)
}
else if signatoryCount == 4 {
customView4?.signatoryLabel.text = "Signatory \(signatoryCount)"
self.pdfView.addSubview(customView4!)
}
else if signatoryCount == 5 {
customView5?.signatoryLabel.text = "Signatory \(signatoryCount)"
self.pdfView.addSubview(customView5!)
}
}
override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
let touch = touches.first
let touchLocation = touch?.location(in: self.pdfView)
// customView1?.center = touchLocation!
if signatoryCount == 1 {
customView1?.center = touchLocation!
return
}
else if signatoryCount == 2 {
customView2?.center = touchLocation!
return
}
else if signatoryCount == 3 {
customView3?.center = touchLocation!
return
}
else if signatoryCount == 4 {
customView4?.center = touchLocation!
return
}
else if signatoryCount == 5 {
customView5?.center = touchLocation!
return
}
// frame = view.convert(customView1!.frame, from: pdfView)
// print("touchesMoved \(frame!.dictionaryRepresentation)")
}
Here is the complete project source code
You got pretty close. You just need to apply the same approach as I suggested in your last question but respecting the superview's frame.
First, add these helpers to your project:
extension CGPoint {
static func +(lhs: CGPoint, rhs: CGPoint) -> CGPoint {
.init(x: lhs.x + rhs.x, y: lhs.y + rhs.y)
}
static func +=(lhs: inout CGPoint, rhs: CGPoint) {
lhs.x += rhs.x
lhs.y += rhs.y
}
}
extension UIView {
func translate(_ translation: CGPoint) {
let destination = center + translation
let minX = frame.width/2
let minY = frame.height/2
let maxX = superview!.frame.width-minX
let maxY = superview!.frame.height-minY
center = CGPoint(
x: min(maxX, max(minX, destination.x)),
y: min(maxY ,max(minY, destination.y)))
}
}
Second, get rid of the pan gesture recognizer and the corresponding method in your ViewController.
Third, change your SignatoryXibView pan gesture to the one below. This will translate the center of the view while respecting the frame of its superview:
@objc func pan(_ gesture: UIPanGestureRecognizer) {
translate(gesture.translation(in: self))
gesture.setTranslation(.zero, in: self)
setNeedsDisplay()
}
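To make the idea explicit: each SignatoryXibView owns its own pan recognizer, which is why every signatory stays draggable no matter how many are added. A rough sketch of the wiring (setUpPanGesture is a hypothetical helper name; the xib loading and label setup stay as they are in your project):
class SignatoryXibView: UIView {
    // ... existing xib loading and signatoryLabel setup ...

    func setUpPanGesture() {
        // The recognizer lives on the signatory view itself, so each added
        // signatory can be dragged independently of the others.
        let recognizer = UIPanGestureRecognizer(target: self, action: #selector(pan(_:)))
        addGestureRecognizer(recognizer)
    }

    @objc func pan(_ gesture: UIPanGestureRecognizer) {
        translate(gesture.translation(in: self))
        gesture.setTranslation(.zero, in: self)
        setNeedsDisplay()
    }
}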
sample project
I would like to get the rectangular position in the scene view of a tapped node so that I can use that position to animate another UIView from that position to a bigger size (exactly like in the Measure app):
I get the tapped node with this code, though how do I get the rect or the dimensions or whatever to make the smooth transition from the node to the view?
public override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
super.touchesBegan(touches, with: event)
guard let touch = touches.first else { return }
if (touch.view == self.sceneView) {
let viewTouchLocation:CGPoint = touch.location(in: sceneView)
guard let result = sceneView.hitTest(viewTouchLocation, options: nil).first else {
return
}
}
}
I've looked into using hitTest or smartHitTest on the scene view given the touch position but I can't really seem to be able to get the size/position of the smaller view.
Not exactly the answer you're after, but try it out (edited to animate the view back to the node and remove it):
var front = false
var boxView = UIView()
@objc func tap(_ sender: UITapGestureRecognizer) {
if !front{
let position = sender.location(in: sceneView)
let hitTest = sceneView.hitTest(position, options: nil)
if let hitNode = hitTest.first?.node{
self.boxView = UIView(frame: CGRect(x: position.x, y: position.y, width: 100, height: 40))
self.boxView.backgroundColor = .red
sceneView.addSubview(self.boxView)
UIView.animate(withDuration: 0.2) {
self.boxView.frame = CGRect(x: self.sceneView.frame.width/2 - 150, y: self.sceneView.frame.height/2 - 100, width: 300, height: 200)
}
front = true
}
}else{
let p = plane.convertPosition(plane.position, to: sceneView.scene.rootNode)
print(p, plane.position) // see difference between p and plane.position
let projectedPoint = sceneView.projectPoint(p)
let point = CGPoint(x: CGFloat(projectedPoint.x), y: CGFloat(projectedPoint.y))
UIView.animate(withDuration: 0.2, animations: {
self.boxView.frame = CGRect(x: point.x, y: point.y, width: 100, height: 40)
}) { (completed) in
self.boxView.removeFromSuperview()
}
front = false
}
}
plane here is a node in my test case. You might want to look into convertPosition and projectPoint.
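To get an actual rect rather than a single point, one option is to project the corners of the node's bounding box into view coordinates. Here is a rough sketch (screenRect(for:in:) is a hypothetical helper, not a SceneKit API); the resulting rect could seed the initial frame of the boxView before animating it to full size:
import SceneKit
import UIKit

// Projects the 8 corners of a node's local bounding box into the SCNView's
// coordinate space and returns the rect that encloses them.
func screenRect(for node: SCNNode, in sceneView: SCNView) -> CGRect? {
    guard let rootNode = sceneView.scene?.rootNode else { return nil }
    let (minV, maxV) = node.boundingBox
    let corners = [
        SCNVector3(x: minV.x, y: minV.y, z: minV.z), SCNVector3(x: maxV.x, y: minV.y, z: minV.z),
        SCNVector3(x: minV.x, y: maxV.y, z: minV.z), SCNVector3(x: maxV.x, y: maxV.y, z: minV.z),
        SCNVector3(x: minV.x, y: minV.y, z: maxV.z), SCNVector3(x: maxV.x, y: minV.y, z: maxV.z),
        SCNVector3(x: minV.x, y: maxV.y, z: maxV.z), SCNVector3(x: maxV.x, y: maxV.y, z: maxV.z)
    ]
    // Convert each corner to world space, then project it into the view
    let points = corners.map { corner -> CGPoint in
        let world = node.convertPosition(corner, to: rootNode)
        let projected = sceneView.projectPoint(world)
        return CGPoint(x: CGFloat(projected.x), y: CGFloat(projected.y))
    }
    let xs = points.map { $0.x }
    let ys = points.map { $0.y }
    return CGRect(x: xs.min()!, y: ys.min()!,
                  width: xs.max()! - xs.min()!, height: ys.max()! - ys.min()!)
}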
I am trying to create a custom slider that uses a UIButton as the thumb. Since I can't seem to find a way to use a UIView as the thumb in a slider, I decided to build a custom one. However, when I run it, the thumb is sluggish to respond on the initial drag. The following is the code so far. The gif below is running slower than actual speed but illustrates the point.
import Foundation
import UIKit
class CustomSlider: UIView, UIGestureRecognizerDelegate {
let buttonView = UIButton()
var minimumPosition = CGPoint()
var maximumPosition = CGPoint()
var originalCenter = CGPoint()
override init(frame: CGRect) {
super.init(frame: frame)
commonInit()
}
required init?(coder aDecoder: NSCoder) {
super.init(coder: aDecoder)
commonInit()
}
func commonInit() {
// add a pan recognizer
let recognizer = UIPanGestureRecognizer(target: self, action: #selector(handlePan(recognizer:)))
recognizer.delegate = self
addGestureRecognizer(recognizer)
backgroundColor = UIColor.green
buttonView.frame = CGRect(x: 0, y: 0, width: bounds.height, height: bounds.height)
buttonView.backgroundColor = UIColor.purple
self.addSubview(buttonView)
minimumPosition = CGPoint(x:buttonView.frame.width / 2, y: 0)
maximumPosition = CGPoint(x: bounds.width - buttonView.frame.width / 2, y: 0)
}
@objc func handlePan(recognizer: UIPanGestureRecognizer) {
if recognizer.state == .began {
originalCenter = buttonView.center
}
if recognizer.state == .changed {
let translation = recognizer.translation(in: self)
let newX = originalCenter.x + translation.x
buttonView.center = CGPoint(x: min(maximumPosition.x, max(minimumPosition.x, newX)), y: buttonView.frame.midY)
}
}
}
To make it look smoother, you should reset the translation of the pan and just add it continuously to the center offset:
@objc func handlePan(recognizer: UIPanGestureRecognizer) {
if recognizer.state == .began {
// originalCenter = buttonView.center
}
if recognizer.state == .changed {
originalCenter = buttonView.center
let translation = recognizer.translation(in: self)
let newX = originalCenter.x + translation.x
buttonView.center = CGPoint(x: min(maximumPosition.x, max(minimumPosition.x, newX)), y: buttonView.frame.midY)
recognizer.setTranslation(.zero, in: self)
}
}
Also, for making a custom control, you can check out this tutorial:
https://www.sitepoint.com/wicked-ios-range-slider-part-one/
It might be old and in Objective-C, but it gives the general idea of overriding UIControl, handling touches, and creating a custom control - which you can also extend with @IBDesignable and @IBInspectable.
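As a rough illustration of that approach (a sketch with made-up names, not the tutorial's code), the UIControl tracking methods replace the separate pan gesture recognizer, so the control responds to the touch immediately:
import UIKit

class MiniSlider: UIControl {
    private let thumb = UIView()
    private(set) var value: CGFloat = 0   // 0...1

    override init(frame: CGRect) {
        super.init(frame: frame)
        thumb.backgroundColor = .purple
        addSubview(thumb)
    }
    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    override func layoutSubviews() {
        super.layoutSubviews()
        // Square thumb positioned along the track according to the current value
        let side = bounds.height
        thumb.frame = CGRect(x: value * (bounds.width - side), y: 0, width: side, height: side)
    }

    override func beginTracking(_ touch: UITouch, with event: UIEvent?) -> Bool {
        // Only start tracking when the touch lands on the thumb
        return thumb.frame.contains(touch.location(in: self))
    }

    override func continueTracking(_ touch: UITouch, with event: UIEvent?) -> Bool {
        let x = touch.location(in: self).x
        let side = bounds.height
        value = max(0, min(1, (x - side / 2) / (bounds.width - side)))
        setNeedsLayout()
        sendActions(for: .valueChanged)   // notify targets like a built-in control
        return true
    }
}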
Hope it helps.
Finally figured it out. It only happens in the iPhone XR simulator. Tried the iPhone XS, among others, and they were smooth as glass.