I would like to ask about moving a UIImageView.
Let's imagine that a UIImageView is at location X1. If I want to move the image view to location X2 using a drag event, how can I implement it?
Please give me advice.
Thanks for reading.
I have referred to How to drag UIImageView using touches method.
import UIKit

class ViewController: UIViewController {

    @IBOutlet var imgView: UIImageView!
    var imgMarker: UIImage?
    var location = CGPoint()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view.
        imgMarker = UIImage(named: "marker.png")
        imgView.image = imgMarker
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        if let touch = touches.first {
            location = touch.location(in: self.view)
            imgView.center = location
        }
    }
}
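A commonly used alternative (not from the original post) is to attach a UIPanGestureRecognizer to the image view instead of overriding touchesMoved. A minimal sketch, assuming the outlet is connected in the storyboard:

import UIKit

class DragViewController: UIViewController {

    @IBOutlet var imgView: UIImageView!

    override func viewDidLoad() {
        super.viewDidLoad()
        imgView.isUserInteractionEnabled = true   // image views ignore touches by default
        let pan = UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:)))
        imgView.addGestureRecognizer(pan)
    }

    @objc func handlePan(_ gesture: UIPanGestureRecognizer) {
        // Move the view by the amount the finger moved since the last callback.
        let translation = gesture.translation(in: view)
        imgView.center = CGPoint(x: imgView.center.x + translation.x,
                                 y: imgView.center.y + translation.y)
        gesture.setTranslation(.zero, in: view)   // reset so the translation stays incremental
    }
}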
I'm trying to build an AR application in Swift using UIKit, ARCL, CoreLocation, and SceneKit. The AR view already runs well on the device, and each CoreLocation pin is placed successfully at its location.
The problem is that when I tap my object (the pin I have placed), nothing appears on the augmented reality user interface. The system works fine in the debug console: when I tap, it prints the name and description of each pin I press. But the text/description never shows up in the augmented reality UI.
Can you help me solve this? I want the text/description to appear on the AR UI when I tap the image button I created, called "pin".
Thank you.
Here is my code:
(Imports)
import UIKit
import ARCL
import CoreLocation
import SceneKit
(@IBOutlets)
@IBOutlet weak var myimageview: UIImageView!
@IBOutlet weak var Topcoord: UILabel!
@IBOutlet var contentView: UIView!
(Code: location)
var location = CLLocation(coordinate: CLLocationCoordinate2D(latitude: -6.7596, longitude: 107.6098), altitude: 2084)

let playButton = UIButton(type: .custom)
if let image = UIImage(named: "pin") {
    playButton.setImage(image, for: .normal)
    var annotationNode = LocationAnnotationNode(location: location, image: image)
    annotationNode.annotationNode.name = "tangkuban perahu"
    sceneLocationView.addLocationNodeWithConfirmedLocation(locationNode: annotationNode)
}
(Code: text/description)
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    if let touch = touches.first {
        let touchLocation = touch.location(in: sceneLocationView)
        let hitResults = sceneLocationView.hitTest(touchLocation)
        for result in hitResults {
            print("HIT:-> Name: \(result.node.description)")
            print("HIT:-> description \(String(describing: result.node.name))")
            func test(_ sender: Any) {
                Topcoord.text = String(describing: result.node.name)
            }
        }
    }
}
(Console output)
tangkuban perahu
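One thing worth noting in the snippet above is that test(_:) is declared as a nested function inside the loop and is never called, so the label assignment never executes. A minimal sketch (not from the original post, assuming Topcoord is connected in the storyboard) that sets the label directly from the hit test:

override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touch = touches.first else { return }
    let touchLocation = touch.location(in: sceneLocationView)
    let hitResults = sceneLocationView.hitTest(touchLocation, options: nil)
    // Assign the label text directly instead of wrapping it in a nested
    // function that is never invoked.
    if let name = hitResults.first?.node.name {
        Topcoord.text = name
    }
}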
I have an IBOutlet-connected UIView object (innerView) sitting on top of a view controller (UIViewController)'s view. I want to let the user rotate innerView with their finger, so I have a UIView subclass (TouchView) set as innerView's class. As the user rotates the view object, I want the view controller to receive a value from TouchView. What I have is the following.
// UIView
protocol TouchDelegate: class {
    func degreeChanged(degree: Double)
}

class TouchView: UIView {

    weak var delegate: TouchDelegate? = nil

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        let aTouch: AnyObject = touches.first! as UITouch
        let newLoc = aTouch.location(in: self.superview)
        ...
        ...
        let degree = { (radians: Double) -> Double in
            return radians * 180 / M_PI
        }
        self.delegate?.degreeChanged(degree: degree)
    }
}
// UIViewController
class ViewController: UIViewController, TouchDelegate {

    @IBOutlet weak var innerView: UIView!

    // MARK: - View
    override func viewDidLoad() {
        super.viewDidLoad()
        let touchView = TouchView()
        touchView.delegate = self
    }

    // MARK: - From TouchView
    func degreeChanged(degree: Double) {
        print(degree) // <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< no call
    }
}
Well, the view controller never receives a delegate call from TouchView. What am I doing wrong? Thank you for your help.
I see two problems:
1. You created a new instance of TouchView and never added it to your view controller's view hierarchy.
2. Your innerView is not an instance of TouchView, and its delegate is not set.
My approach to your situation would look something like this:
// UIViewController
class ViewController: UIViewController, TouchDelegate {

    @IBOutlet weak var innerView: TouchView!

    // MARK: - View
    override func viewDidLoad() {
        super.viewDidLoad()
        innerView.delegate = self
    }

    // MARK: - From TouchView
    func degreeChanged(degree: Double) {
        print(degree) // now called whenever TouchView reports a change
    }
}
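For completeness, here is a sketch of the TouchView side. The angle calculation (measured from the view's own centre with atan2) is an assumption, since that part of the original code was elided; note also that the delegate should receive a Double value rather than a closure:

import UIKit

class TouchView: UIView {

    weak var delegate: TouchDelegate? = nil

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        let location = touch.location(in: self)
        // Angle of the touch relative to the view's own centre, in radians.
        let radians = Double(atan2(location.y - bounds.midY, location.x - bounds.midX))
        // Convert to degrees and pass a plain Double to the delegate.
        delegate?.degreeChanged(degree: radians * 180 / Double.pi)
    }
}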
I am using Xcode 8.0 and Swift 3.
Here's the overview of my issue:
A scroll view with an image view named "insidepic" as a subview.
A duplicate of the image view named "outsidepic" is positioned outside the scroll view.
When "outsidepic" is tapped, touchesBegan fires.
When "insidepic" is tapped, the tap gesture fires, but neither touchesBegan nor touchesEnded fires.
Here's what I need to solve:
I need touchesBegan to fire when the scroll view is tapped. I have already set ".cancelsTouchesInView = false" on the gesture.
Furthermore, the zoom/pan gesturing on the scroll view needs to stay intact, so user interaction is enabled on both the scroll view and the image view inside it.
The attached image shows the layout and the view controller Swift file.
The yellow area is the scroll view (with "insidepic" inside).
But, for quick reference, here is my code:
import UIKit

var tap = UITapGestureRecognizer()

class ViewController: UIViewController, UIScrollViewDelegate, UIGestureRecognizerDelegate {

    @IBOutlet weak var scroller: UIScrollView!
    @IBOutlet weak var insidepic: UIImageView!
    @IBOutlet weak var outsidepic: UIImageView!

    override func viewDidLoad() {
        super.viewDidLoad()
        scroller.minimumZoomScale = 1
        scroller.maximumZoomScale = 6
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        tap = UITapGestureRecognizer(target: self, action: #selector(self.handleTap))
        tap.numberOfTapsRequired = 1
        tap.numberOfTouchesRequired = 1
        tap.delegate = self
        tap.cancelsTouchesInView = false
        scroller.addGestureRecognizer(tap)
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        print("Touches Began")
    }

    func handleTap() {
        print("Tap Gesture Received")
    }

    func viewForZooming(in scrollView: UIScrollView) -> UIView? {
        return insidepic
    }
}
Many thanks for any help or insight you can offer!
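One approach worth trying (not from the original post) is to subclass UIScrollView and forward the raw touches up the responder chain, so the view controller's touchesBegan still runs while the scroll view keeps its zoom and pan behaviour. A minimal sketch, assuming the scroll view's class is changed to this subclass in the storyboard:

import UIKit

class TouchForwardingScrollView: UIScrollView {

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Pass the touches to the next responder (ultimately the view controller)
        // in addition to the scroll view's own handling.
        next?.touchesBegan(touches, with: event)
        super.touchesBegan(touches, with: event)
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        next?.touchesEnded(touches, with: event)
        super.touchesEnded(touches, with: event)
    }
}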
I have a slider sliderLineSize and a variable lineSize in a ViewController. The UISlider sliderLineSize changes lineSize. However, lineSize is actually used in the drawRect method of the viewLine class, which is attached to a UIView.
Question:
How do I pass, or make accessible, the variable lineSize (which is set in the ViewController) to the viewLine class, where it is used in drawRect?
import UIKit

class ViewController: UIViewController {

    @IBOutlet weak var myView: UIView!
    @IBOutlet weak var myImageView: UIImageView!

    var lineSize: Int = 1

    override func viewDidLoad() {
        super.viewDidLoad()
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
    }

    override func touchesEnded(touches: Set<UITouch>, withEvent event: UIEvent?) {
        myImageView.alpha = 0.5
    }

    @IBAction func sliderLineSize(sender: UISlider) {
        lineSize = Int(sender.value)
    }
}

class viewLine: UIView {

    let path = UIBezierPath()
    var incrementalImage: UIImage?
    var previousPoint: CGPoint = CGPoint.zero
    var strokeColor: UIColor?

    required init?(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
    }

    override func drawRect(rect: CGRect) {
        incrementalImage?.drawInRect(rect)
        path.lineWidth = lineSize
        path.stroke()
    }

    override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
        let touch: AnyObject? = touches.first
        let currentPoint = touch!.locationInView(self)
        path.moveToPoint(currentPoint)
        previousPoint = currentPoint
        self.setNeedsDisplay()
    }

    override func touchesMoved(touches: Set<UITouch>, withEvent event: UIEvent?) {
        let touch: AnyObject? = touches.first
        let currentPoint = touch!.locationInView(self)
        let midPoint = self.midPoint(previousPoint, p1: currentPoint)
        path.addQuadCurveToPoint(midPoint, controlPoint: previousPoint)
        previousPoint = currentPoint
        path.moveToPoint(midPoint)
        self.setNeedsDisplay()
    }

    override func touchesEnded(touches: Set<UITouch>, withEvent event: UIEvent?) {
        self.drawBitmap()
        self.setNeedsDisplay()
        path.removeAllPoints()
    }

    func midPoint(p0: CGPoint, p1: CGPoint) -> CGPoint {
        let x = (p0.x + p1.x) / 2
        let y = (p0.y + p1.y) / 2
        return CGPoint(x: x, y: y)
    }

    func drawBitmap() {
        UIGraphicsBeginImageContextWithOptions(self.bounds.size, true, 1)
        strokeColor?.setStroke()
        if incrementalImage == nil {
            let rectPath: UIBezierPath = UIBezierPath(rect: self.bounds)
            UIColor.whiteColor().setFill()
            rectPath.fill()
        }
        incrementalImage?.drawAtPoint(CGPointZero)
        path.stroke()
        incrementalImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
    }
}
There are two main ways to do this.
Option 1:
Give your ViewLine class its own lineSize property:
class ViewLine: UIView {
    var lineSize = 1
}
Give ViewController a reference to the ViewLine instance, and use a property observer to update its lineSize whenever the value changes in ViewController:
class ViewController: UIViewController {

    // v~~~ be sure to connect this outlet in interface builder
    @IBOutlet weak var viewLine: ViewLine!

    var lineSize = 1 {
        didSet {
            viewLine.lineSize = lineSize
        }
    }
}
Now your ViewLine class will have its own lineSize property that can be accessed from within its drawRect method directly.
Option 2:
Give your ViewLine class a reference to ViewController:
class ViewLine: UIView {
    // v~~~ be sure to connect this outlet in interface builder
    @IBOutlet weak var controller: ViewController!
}
Now, in your drawRect method, replace path.lineWidth = lineSize with path.lineWidth = controller.lineSize.
Basically, one of your classes needs a reference to the other in order for them to be able to communicate.
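Note that UIBezierPath.lineWidth is a CGFloat, so the Int value needs an explicit conversion either way. A sketch of drawRect under Option 1, assuming the rest of the class stays as posted:

override func drawRect(rect: CGRect) {
    incrementalImage?.drawInRect(rect)
    // lineSize is the view's own property from Option 1.
    path.lineWidth = CGFloat(lineSize)
    path.stroke()
}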
You could make a singleton model class. A singleton can be accessed from anywhere. Here is how you can create a singleton class in Swift:
class ApplicationModel {
    class var sharedInstance: ApplicationModel {
        get {
            struct Static {
                static var instance: ApplicationModel? = nil
                static var token: dispatch_once_t = 0
            }
            dispatch_once(&Static.token, {
                Static.instance = ApplicationModel()
            })
            return Static.instance!
        }
    }
    var lineSize = 1
}
Inside ViewController:

override func viewDidLoad() {
    super.viewDidLoad()
    // Access the shared ApplicationModel
    // GET
    let lineSize = ApplicationModel.sharedInstance.lineSize
    // SET
    ApplicationModel.sharedInstance.lineSize = 5
}
Inside viewLine:

override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
    // Access the shared ApplicationModel
    // GET
    let lineSize = ApplicationModel.sharedInstance.lineSize
    // SET
    ApplicationModel.sharedInstance.lineSize = 5
}
Hope this helps!
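As a side note, in Swift 3 and later the dispatch_once pattern above is no longer needed: a static stored property is initialised lazily and atomically, so an equivalent sketch is simply:

class ApplicationModel {
    // Lazily and thread-safely created on first access.
    static let sharedInstance = ApplicationModel()
    private init() {}
    var lineSize = 1
}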
In Apple's developer documentation here, I came across this:
Instead of handling a gesture, you could choose to track and handle the “raw” touches that make up the gesture.
Examples in the documentation are given in Objective-C but not in Swift. How can I track and handle these "raw" touches in Swift globally, not just within an NSView?
My class:
import Cocoa
import Security
import AppKit

@NSApplicationMain
class AppDelegate: NSObject, NSApplicationDelegate {

    @IBOutlet weak var window: NSWindow!
    @IBOutlet weak var dropMenu: NSMenu!
    @IBOutlet weak var resetView: NSView!
    @IBOutlet weak var resetWindow: NSPanel!
    @IBOutlet weak var keyLabel: NSTextField!

    let statusItem = NSStatusBar.systemStatusBar().statusItemWithLength(-1)

    func applicationDidFinishLaunching(aNotification: NSNotification) {
        NSApp.setActivationPolicy(NSApplicationActivationPolicy.Accessory)
        let menuIcon = NSImage(named: "menuIcon")
        menuIcon?.template = true
        statusItem.image = menuIcon
        statusItem.menu = dropMenu
    }

    @IBAction func quit(sender: NSMenuItem) {
        NSApplication.sharedApplication().terminate(self)
    }

    override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
        for touch: AnyObject! in touches {
            let touchLocation = touch.locationInNode(self)
            // Use touchLocation, for example: button.containsPoint(touchLocation),
            // meaning the user has pressed the button.
        }
    }
}
Update: use NSEvent.
(touchesBegan with UITouch/UIEvent is iOS API; macOS is different and uses NSEvent.)
NSEvent is what you want to look into: you can use mouseDown, mouseUp, mouseMoved, etc. and then get the cursor point:
https://developer.apple.com/library/mac/documentation/Cocoa/Reference/ApplicationKit/Classes/NSResponder_Class/index.html#//apple_ref/occ/instm/NSResponder/mouseDown:
Swift Example of mouseDown:
Swift example of mouseDown:

override func mouseDown(event: NSEvent) {
    let point: NSPoint = event.locationInWindow
    print("X: \(point.x)")
    print("Y: \(point.y)")
}
Objective-C example:

- (void)mouseDown:(NSEvent *)event {
    NSLog( @"mouse down event: %@", event );
    NSPoint point = [event locationInWindow];
    NSLog( @"mouseDown location: (%f,%f)", point.x, point.y );
}
Hope this helps.
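If the events really need to be observed globally (outside the app's own views), one option worth looking at is NSEvent's event monitors. A minimal sketch in current Swift, not from the original answer:

import AppKit

final class MouseMonitor {

    private var globalMonitor: Any?
    private var localMonitor: Any?

    func start() {
        // Mouse-down events delivered to other applications.
        globalMonitor = NSEvent.addGlobalMonitorForEvents(matching: .leftMouseDown) { _ in
            print("global mouse down at \(NSEvent.mouseLocation)")
        }
        // Mouse-down events delivered to this application; return the event
        // unchanged so normal handling continues.
        localMonitor = NSEvent.addLocalMonitorForEvents(matching: .leftMouseDown) { event in
            print("local mouse down at \(event.locationInWindow)")
            return event
        }
    }

    func stop() {
        if let monitor = globalMonitor { NSEvent.removeMonitor(monitor) }
        if let monitor = localMonitor { NSEvent.removeMonitor(monitor) }
    }
}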