I'm using the new RoomPlan API from iOS 16, and the idea is to calculate every wall's area in square meters, or at least its width and height. Is there a way to calculate that using the object
CapturedRoom.Walls?
Here's the object that I'm supposed to use:
https://developer.apple.com/documentation/accelerate/simd_float3
Just iterate through capturedRoom.walls. Each wall is a CapturedRoom.Surface, and its dimensions property (a simd_float3) carries the width and height in meters:
let walls = capturedRoom.walls
var totalWallArea: Float = 0
for wall in walls {
    // dimensions.x is the wall's width, dimensions.y its height, in meters
    let wallArea = wall.dimensions.x * wall.dimensions.y
    totalWallArea += wallArea
}
print("Total wall area: \(totalWallArea) square meters")
Related
Using Apple's RoomPlan API, I managed to create a series of nodes representing the walls of a room. However, after applying their transforms, the yaw (eulerAngles.y) ends up being a seemingly random number, which makes it tricky to align other nodes that were not generated as part of the room.
My thought was to add all the walls as children of a parent node, then rotate the parent until the largest wall had an eulerAngles.y of π/2 (90 degrees). However, I am having trouble getting it to work right.
Code for generating the walls from a RoomPlan CapturedRoom
for scannedWall in roomScan.walls {
    // Generate new wall geometry
    let length = 0.01
    let width = scannedWall.dimensions.x
    let height = scannedWall.dimensions.y

    // Generate new SCNNode
    let newWallGeometry = SCNBox(
        width: CGFloat(width),
        height: CGFloat(height),
        length: CGFloat(length),
        chamferRadius: 0
    )
    newWallGeometry.firstMaterial?.transparency = 0.6

    let newWall = SCNNode(geometry: newWallGeometry)
    newWall.simdTransform = scannedWall.transform
    parentNode.addChildNode(newWall)
}
My initial thought was just to rotate the parent node by the difference between the longest wall's rotation and 90 degrees, but that does not seem to be correct:
parentNode.eulerAngles.y = (.pi/2 - longestWall.eulerAngles.y)
Any help would be greatly appreciated; I'm fairly new to Swift/SceneKit and am banging my head against a wall.
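One thing worth trying (a hedged sketch, not a verified fix): read the yaw straight from the captured wall's transform instead of the node's eulerAngles, since the Euler decomposition can fold a wall's yaw into a pitch/roll of π. This assumes each wall's rotation is essentially a pure rotation about Y, which RoomPlan walls generally are; the sign convention may need flipping in practice:
func yaw(of transform: simd_float4x4) -> Float {
    // for a pure rotation about Y, columns.2 == (sin(yaw), 0, cos(yaw), 0)
    return atan2(transform.columns.2.x, transform.columns.2.z)
}

if let longestWall = roomScan.walls.max(by: { $0.dimensions.x < $1.dimensions.x }) {
    parentNode.eulerAngles.y = (.pi / 2) - yaw(of: longestWall.transform)
}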
I am working in ARKit/RealityKit/Swift. If I keep a box near the camera it has a perfect size, but if I move it farther away, e.g. 1 or 2 meters, it looks smaller. I want its size to stay the same in the user's eyes/perception.
I have done some basic math on the object's Z coordinate, e.g. if z = -0.9 and size = 0.5 meters, what should the new size be at z = -2 or farther? But it does not work.
// Sample code to get an idea
let original = Position()
let originalSize: Float = 0.5
let newPosition = Position() // Assume it is far from the original position
let newSize = originalSize * newPosition.z / original.z // The new size is not right
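For what it's worth, a minimal sketch of one way to approach this in RealityKit, scaling the entity in proportion to its full distance from the camera rather than just its z value (arView and box are illustrative names for your ARView and entity; run this every frame, e.g. from a scene update subscription):
let cameraPosition = arView.cameraTransform.translation
let boxPosition = box.position(relativeTo: nil)
let distance = simd_distance(cameraPosition, boxPosition)

let referenceDistance: Float = 0.9 // the distance at which the box looks the right size
// scale up linearly with distance so the apparent (angular) size stays constant
box.scale = SIMD3<Float>(repeating: distance / referenceDistance)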
In SpriteKit, I can use touch locations to record "hits" on a target, where the center of the target, the "bull's eye", has the coordinates (0,0). After plenty of shooting, I fetch all hits as an array of CGPoints. Since the target is 500 x 500 points (SKScene, sks-file), all hits can have an x position from -250 to +250, and likewise for the y position.
In the attached photo, the hits are registered as points at around (150, 150).
The problem arises when I use the famous LFHeatMap (https://github.com/gpolak/LFHeatMap):
+ (UIImage *)heatMapWithRect:(CGRect)rect
                       boost:(float)boost
                      points:(NSArray *)points
                     weights:(NSArray *)weights;
LFHeatMap generates a UIImage based on the points, which I add to a UIImageView. The problem is that UIViews have the x and y axes arranged differently from SKScenes:
func setHeatMap() {
    let points = getPointsFromCoreData()
    let weights = getWeightsFromCoreData()

    var rect = CGRectMake(0, 0, 500, 500)
    rect.origin = CGPointMake(-250, -250)

    let image = LFHeatMap.heatMapWithRect(rect, boost: 1, points: points, weights: weights)
    heatMapView.contentMode = UIViewContentMode.ScaleAspectFit
    heatMapView.image = image
}
Shots that land lower on the target show up higher in the heat map.
How can I solve this? Either all the points have to be converted to the other coordinate system, or the coordinates of the CGRect used to make the heat map must be changed. How can this be done?
This was embarrassingly easy once the solution occurred to me.
Run a loop through the points array and multiply each point.y by -1...
Then all the values on the y-axis are correct.
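A minimal sketch of that conversion, assuming points is a [CGPoint] (if they come back from Core Data wrapped in NSValue, unwrap them first):
let convertedPoints = points.map { CGPoint(x: $0.x, y: -$0.y) }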
I'm building an app that features some graphical manipulation. I'm storing shapes as UIBezierPaths, and I want to allow users to touch points along the line to create saved locations. Using the wonderful answer to this question, and more specifically, this project, I'm able to place a point on a line knowing the percentage of its length the point rests on. This is half of my problem.
I want a way to take a point on a path, and derive the percent of its length.
My math-fu is extremely weak. I've studied bezier curves but I simply don't have the math to understand it.
I would humbly submit that "go back and learn geometry and trigonometry" is a correct answer, but sadly one I don't have time for at present. What I need is a way to fill in this method:
- (CGFloat)percentOfLengthAtPoint:(CGPoint)point onPath:(UIBezierPath*)path
Any help appreciated!
I have working code that solves my problem. I'm not particularly proud of it; the overall technique is essentially a brute-force attack on a UIBezierPath, which is kind of funny if you think about it. (Please don't think about it).
As I mentioned, I have access to a method that allows me to get a point from a given percentage of a line. I have taken advantage of that power to find the closest percentage to the given point by running through 1000 percentage values. To wit:
Start with a CGPoint that represents where on the line the user touched.
let pointA = // the incoming CGPoint
Run through the 0-1 range in steps of a thousandth. This is the set of percentages we're going to brute-force to see if we have a match. For each, we run pointAtPercentOfLength, from the linked project above.
var pointArray: [[String: Any]] = []
for i in 0...1000 {
    let value = CGFloat(i) / 1000
    let testPoint = path.pointAtPercentOfLength(value)
    let pointB = CGPoint(x: floor(testPoint.x), y: floor(testPoint.y))
    pointArray.append(["point": pointB, "percent": value])
}
That was the hard part. Now we take the resulting values and calculate the distance between each point and the touched point. The closest one is our winner.
// build the array we'll sort by distance so we can find the closest
var distanceArray: [[String: Any]] = []
for point in pointArray {
    distanceArray.append([
        "distance": self.distanceFrom(point["point"] as! CGPoint, point2: pointA),
        "point": point["point"] as! CGPoint,
        "percent": point["percent"] as! CGFloat
    ])
}
Here's the distance function, if you're interested:
func distanceFrom(_ point1: CGPoint, point2: CGPoint) -> CGFloat {
    let xDist = point2.x - point1.x
    let yDist = point2.y - point1.y
    return sqrt((xDist * xDist) + (yDist * yDist))
}
Finally, I sort the array by the distance of the values, and pick out the winner as our closest percent.
let ordered = distanceArray.sorted { ($0["distance"] as! CGFloat) < ($1["distance"] as! CGFloat) }
The first element of ordered is a little dictionary whose percent entry is the value we're after: the percentage of the line's length.
This is not pretty code, I know. I know. But it gets the job done and doesn't appear to be computationally expensive.
As a postscript, I should point to what appears to be a proper resource for doing this. During my research I read this beautiful article by David Rönnqvist, which included an equation for calculating the percentage distance along a path:
start⋅(1-t)³ + 3⋅c1⋅t⋅(1-t)² + 3⋅c2⋅t²⋅(1-t) + end⋅t³
I was just about to try implementing that before my final solution occurred to me. Math, man. I can't even brain it. But if you're more ambitious than I, and wish to override my 30 lines of code with a five-line alternative, everyone would appreciate it!
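If you do want to go that route, here is a minimal sketch of the formula above as code; it evaluates a point on a single cubic segment at parameter t, given its control points (turning that into an arc-length percentage across a whole UIBezierPath is the harder part):
func cubicBezierPoint(start: CGPoint, c1: CGPoint, c2: CGPoint, end: CGPoint, t: CGFloat) -> CGPoint {
    // start·(1-t)³ + 3·c1·t·(1-t)² + 3·c2·t²·(1-t) + end·t³
    let mt = 1 - t
    let a = mt * mt * mt
    let b = 3 * mt * mt * t
    let c = 3 * mt * t * t
    let d = t * t * t
    return CGPoint(x: a * start.x + b * c1.x + c * c2.x + d * end.x,
                   y: a * start.y + b * c1.y + c * c2.y + d * end.y)
}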
I think your approach is sound, but you could do this far more efficiently.
Instead of creating two arrays of dicts (with a thousand elements each) and then sorting one of them, just use a while loop to move from 0.0 to 1.0, calculate the distance to the touch point at each step, and keep track of the minimum distance.
For example:
var t: CGFloat = 0.0
let step: CGFloat = 0.001
var minDistance: CGFloat = -1.0
var minPoint: CGPoint = .zero
var minT: CGFloat = -1

while t < 1.0 {
    let point = path.pointAtPercentOfLength(t)
    let distance = self.distanceFrom(point, point2: pointA)
    if minDistance == -1.0 || distance < minDistance {
        minDistance = distance
        minPoint = point
        minT = t
    }
    t += step
}
print("minDistance: \(minDistance) minPoint: \(minPoint.x) \(minPoint.y) t: \(minT)")
I essentially want the "sprites" to stick together when they collide. However, I don't want the "joint" to be rigid; I essentially want the sprites to be able to move around as long as they are in contact with each other. Imagine two circles connected, and you can move one circle around the other, as long as it remains in contact.
I found this question: How to make one body stick to another moving object in SpriteKit, and a lot of other resources that explain how to make sprites stick upon collision, but they all use SKJoints, which are rigid and not really flexible.
I guess another way to phrase it would be to say that I want the sprites to stick, but I want them to be able to "slide" on each other.
Well, I can think of one workaround, but it wouldn't work with arbitrary (non-circular) shapes.
Sticking (pun unintended) with your circles example, what if you lock the movable circle's position relative to the center one?
let circle1 = ... // the center circle
let circle2 = ... // the movable circle
Knowing the width of both circles, you can enforce in the update function that the distance between their positions is exactly:
((circle1.frame.width / 2) + (circle2.frame.width / 2))
If you're up to it, here's some code to help you on your way.
override func update(currentTime: CFTimeInterval) {
    // distance between the two circle centers
    let distance = hypotf(Float(circle1.position.x - circle2.position.x), Float(circle1.position.y - circle2.position.y))
    // the distance the centers should keep: the two radii added together
    let radius = (circle1.frame.width / 2) + (circle2.frame.width / 2)

    if CGFloat(distance) != radius {
        // if the distance is less or more than the radius, push circle2 back onto the contact ring
        let pointA = circle1.position
        let pointB = circle2.position
        // angle of the vector from the center circle to the movable circle
        let angle = atan2(pointB.y - pointA.y, pointB.x - pointA.x)
        // convert the angle into a unit vector
        let vectorx = cos(angle)
        let vectory = sin(angle)
        // new coordinates from the vector, the radius and the center circle's position
        let x = pointA.x + radius * vectorx
        let y = pointA.y + radius * vectory
        circle2.position = CGPoint(x: x, y: y)
    }
}
You'll also need to write code to make sure the movable circle is, well, movable.
But this should work.
I haven't tested it yet, though, and I haven't even learned geometry, let alone trig, in school yet.
If I'm reading your question as you intended it, you can still use joints- just create actions with Inverse Kinematic constraints that allow rotation and translation around the contacting circles' joint.
https://developer.apple.com/library/prerelease/ios/documentation/SpriteKit/Reference/SKAction_Ref/index.html#//apple_ref/doc/uid/TP40013017-CH1-SW72
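A related alternative, sketched under the assumption that the two nodes are called circle1 and circle2: SKConstraint.distance(_:to:) keeps one node at a fixed distance from another while leaving it free to slide around the contact ring, which is essentially the behavior described in the question:
// keep circle2 exactly one combined radius away from circle1, but otherwise free to orbit it
let radius = (circle1.frame.width / 2) + (circle2.frame.width / 2)
let ring = SKRange(constantValue: radius)
circle2.constraints = [SKConstraint.distance(ring, to: circle1)]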