Lua raycasting anomaly

I have this new efficient raycasting code:
vectorX, vectorY = math.cos(r * math.pi/180), math.sin(r * math.pi/180)
wallDistanceX, wallDistanceY = v[3] - v[1], v[4] - v[2]
wallParallelX, wallParallelY = -wallDistanceY, wallDistanceX
rayX, rayY = v[1] - cam.x, v[2] - cam.y
Distance = ((rayX * wallParallelX) + (rayY * wallParallelY)) /
           ((vectorX * wallParallelX) + (vectorY * wallParallelY))
if (math.min(v[1],v[3]) <= (cam.x + (Distance * vectorX)) and math.max(v[1],v[3]) >= (cam.x + (Distance * vectorX))) and
   (math.min(v[2],v[4]) <= (cam.y + (Distance * vectorY)) and math.max(v[2],v[4]) >= (cam.y + (Distance * vectorY))) then
    table.insert(possibles, Distance)
else
    table.insert(possibles, 0)
end
This segment of code was converted to Lua from the answer to this post, but when the rays are drawn they look broken up, like this:
I've done my debugging, and it turns out that when a ray is tested for intersection with a line, a distance of 0 is returned (a distance of 0 meaning the ray supposedly didn't intersect a wall). There is a wall present, as you can see, and the minimap in the top-left corner shows just that.
Any ideas on why the intersection is not being detected? If you don't understand something, please ask.
r = angle of the ray
v = an entry of the wall list; v[1], v[2], v[3], v[4] are the x1, y1, x2, y2 of the line

Related

How to detect object clicked in WebGL?

Firstly, I am not using 3Js in my Orbits app because I encountered a number of limitations, including but not limited to issues with texture resolution and my requirement for complex lighting equations, but I would like to implement something like 3Js' raycaster to allow me to detect the object clicked by the user.
I'm new to WebGL, but an "old hand" in software development so I'm looking for some hints about where to start.
The approach is as follows:
You render your scene twice: once normally, which is displayed, and a second time with each object drawn in a unique colour, which is not displayed. Then you use gl.readPixels on the second render at the click position from the first and decode the colour to identify the object.
Now I have to implement it myself.
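The colour-to-object mapping is just integer packing, so the arithmetic is language-agnostic; here is a minimal sketch of it in Lua (the helper names and the 24-bit ID range are my own assumptions, not part of the answer):
-- Pack an object ID into an RGB colour for the hidden picking pass,
-- and recover the ID from the pixel values gl.readPixels returns.
function idToColor(id)          -- id in [0, 2^24 - 1]
    local r = math.floor(id / 65536) % 256
    local g = math.floor(id / 256) % 256
    local b = id % 256
    return r, g, b              -- use as the object's flat, unlit colour
end

function colorToId(r, g, b)     -- r, g, b are the 0-255 values read back
    return r * 65536 + g * 256 + b
end
Reserving one ID (say 0, rendered as black) for the background makes a miss distinguishable from a hit.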
Picking spheres
When picking spheres, or objects that are separated (not one inside another), you can use a simple point-to-ray distance to very quickly find the closest object.
Example
The function below returns a function that does the calculation. As it is only the closest object you are interested in, the distances can remain squared. The distance from the camera is held as a unit distance along the ray.
function distanceFromRay() {
    var dSqr, ox, oy, oz, vx, vy, vz;
    function distanceSqr(px, py, pz) {
        const ax = px - ox, ay = py - oy, az = pz - oz;
        const u = (ax * vx + ay * vy + az * vz) / dSqr;
        distanceSqr.unit = u;
        if (u > 0) { // is past origin
            const bx = ox + vx * u - px, by = oy + vy * u - py, bz = oz + vz * u - pz;
            return bx * bx + by * by + bz * bz; // dist sqr to closest point on ray
        }
        return Infinity;
    }
    distanceSqr.unit = 0;
    distanceSqr.setRay = function (x, y, z, xx, yy, zz) { // ray from origin x, y, z,
        // infinite length along xx, yy, zz
        ox = x; oy = y; oz = z;
        vx = xx; vy = yy; vz = zz;
        dSqr = vx * vx + vy * vy + vz * vz;
    };
    return distanceSqr;
}
Usage
There is a one-time setup call:
// setup
const distToRay = distanceFromRay();
At the start of a frame that requires a pick, calculate the pick ray and set it. Also initialize the minimum distance from the ray and the distance from the eye.
// at start of frame set pick ray
distToRay.setRay(eye.x, eye.y, eye.z, pointer.ray.x, pointer.ray.y, pointer.ray.z);
var minDist = maxObjRadius * maxObjRadius;
var nearestObj = undefined;
var eyeDist = Infinity;
Then, for each pickable object, get the distance by passing the object's center, and compare it against the best distance found so far in the frame, the object's radius, and the distance from the eye.
// per object
const dis = distToRay(obj.pos.x, obj.pos.y, obj.pos.z);
if (dis < obj.radius * obj.radius && dis < minDist && distToRay.unit > 0 && distToRay.unit < eyeDist) { // dis is squared, so compare against the squared radius
minDist = dis;
eyeDist = distToRay.unit;
nearestObj = obj;
}
At the end of the frame, if nearestObj is not undefined, it holds a reference to the picked object.
// end of frame
if (nearestObj) {
// you have the closest object
}

Get Rho and Theta from Hough-Transform opencvsharp?

I have the Hough transform implemented using OpenCvSharp (OpenCV), and I get the lines detected in my image in a console application/Windows Forms application:
lines = edgeImg.HoughLines2(storage, HoughLinesMethod.Probabilistic, 1, Math.PI / 180, 60, 100, 100);
for (int i = 0; i < lines.Total; i++)
{
    CvLineSegmentPoint segP = lines.GetSeqElem<CvLineSegmentPoint>(i).Value;
    double angle = Math.Atan2((segP.P2.Y) - (segP.P1.Y), (segP.P2.X) - (segP.P1.X)) * 180 / Math.PI;
    if (Math.Abs(angle) <= 60)
        continue;
    if (segP.P1.Y > segP.P2.Y + 20 || segP.P1.Y < segP.P2.Y - 20)
        src.Line(segP.P1, segP.P2, CvColor.blue, 2, LineType.AntiAlias, 0);
}
I have tried different methods for visualizing the rho-theta space. Since "HoughLinesMethod" does all the transformation internally, I have tried to get these values back from the x, y endpoints in the reverse way:
double angle = Math.Atan2(dy, dx) * 180 / Math.PI;
double theta = 90 - angle;
var thetaRad = theta*Math.PI/180;
double rho = (x1 * Math.Cos(thetaRad) + y1 * Math.Sin(thetaRad));
My first question is whether I need to compute two rho/theta pairs, one for x1,y1 and one for x2,y2, or whether calculating only one rho/theta is correct.
Second, how can I visualize them in the right format? (What I currently see is some random white dots at the top-left corner of my output image.)
Third, is it reasonable to recover the rho, theta values this way, or would you suggest performing the Hough transform myself and reducing the complexity? (I used the OpenCvSharp function for better, more efficient performance!)
Thanks!
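For reference, the rho/theta relationship above can be checked numerically: rho = x*cos(theta) + y*sin(theta) takes the same value at every point on the line when theta is the angle of the line's normal, so one rho/theta pair per segment is enough. A small sketch with made-up endpoints, written in Lua (the normal-angle choice here is one convention and may differ from the C# snippet above):
-- Hypothetical endpoints of one detected segment (values made up).
local x1, y1, x2, y2 = 120, 40, 180, 200
local dx, dy = x2 - x1, y2 - y1

-- Angle of the line's normal (use math.atan(dy, dx) on Lua 5.3+).
local theta = math.atan2(dy, dx) + math.pi / 2

local rho1 = x1 * math.cos(theta) + y1 * math.sin(theta)
local rho2 = x2 * math.cos(theta) + y2 * math.sin(theta)
print(rho1, rho2)  -- both endpoints give the same rho, so the whole line is one (rho, theta)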

Get points from a UIBezierPath

I drew the above BezierPath by doing:
// location is where the user touches screen.
// location will be the maximum of the graph
CGPoint origin = CGPointMake(xStart, 620.0);
CGPoint endpt = CGPointMake(xEnd, 620.0);
CGPoint midpt1 = midPointForPoints(origin, location);
CGPoint midpt2 = midPointForPoints(location, endpt);
UIBezierPath *path = [UIBezierPath bezierPath];
[path moveToPoint:origin];
[path addQuadCurveToPoint:location controlPoint:CGPointMake(midpt1.x, midpt1.y+50)];
[path addQuadCurveToPoint:endpt controlPoint:CGPointMake(midpt2.x, midpt2.y+50)];
[shapeLayer setPath:path.CGPath];
Now, I want to retrieve y-coordinates for certain x-coordinates that lie on the path. For example, given x = 0.0, I want to get y = 0.0, or given x = 300.0, y = 50.0.
I looked at some references like this question and sample code, but I am still not sure.
Update: basically, I want to do something like this.
Update:
Following @Fang's advice:
Given the equation
X = (1-t)^2*X0 + 2*t*(1-t)*X1 + t^2 *X2
I solve for t
t = ((2.0 * x0 - x1) + sqrt(((-2.0 * x0 + x1) ** 2.0)
- ((4 * (x0 - 2.0 * x1 + x2)) * (x0 - x)))) / (2.0 * (x0 - 2.0 * x1 + x2))
or
t = ((2.0 * x0 - x1) - sqrt(((-2.0 * x0 + x1) ** 2.0)
- ((4 * (x0 - 2.0 * x1 + x2)) * (x0 - x)))) / (2.0 * (x0 - 2.0 * x1 + x2))
Using this value, find Y that corresponds to X (we used X to find the above t value)
Y = (1-t)^2*Y0 + 2*t*(1-t)*Y1 + t^2 *Y2
Following the above equation, I am supposed to get the y-value of the point that lies on the Bezier curve, but I get a point that's far from the right one. Any further help will be very much appreciated.
Concern: I think one possible problem is that I am calling addQuadCurveToPoint() twice (with one control point each) instead of once with two control points. Does that mean I draw two Bezier paths and combine them? I am also looking at this to see what's wrong with my computation, and the only difference seems to be that he uses two control points when calling addQuadCurveToPoint().
Update after Fang's intense consulting:
- (float)getYFromBezierPath:(float)x location:(CGPoint)location ctrlpt1:(CGPoint)ctrlpt1 ctrlpt2:(CGPoint)ctrlpt2 endpt:(CGPoint)endpt {
    float yVal;
    float tVal;
    if (x <= location.x) {
        tVal = [self getTvalFromBezierPath:x x0Val:0.0 x1Val:ctrlpt1.x x2Val:location.x];
        yVal = [self getCoordFromBezierPath:tVal origin:0.0 p1Val:ctrlpt1.y p2Val:location.y];
    } else {
        // THIS PART IS THE PROBLEM //
        tVal = [self getTvalFromBezierPath:x x0Val:location.x x1Val:ctrlpt2.x x2Val:endpt.x];
        yVal = [self getCoordFromBezierPath:tVal origin:location.y p1Val:ctrlpt2.y p2Val:endpt.y];
    }
    return yVal;
}
- (float)getTvalFromBezierPath:(float)x x0Val:(float)x0 x1Val:(float)x1 x2Val:(float)x2 {
    float tVal = (x-x0)/(2*(x1-x0));
    return tVal;
}
- (float)getCoordFromBezierPath:(float)t origin: (float)origin p1Val: (float)p1 p2Val: (float)p2 {
    // tVal = (sqrt((-2.0 * x * x1) + (x * x0) + (x * x2) + pow(x1, 2) - (x0 * x2)) + x0 - x1) / (x0 - (2.0 * x1) + x2);
    return (pow((1-t),2) * origin) + (2 * t * (1-t) * p1) + (pow(t,2) * p2);
}
Last question:
For the second Bezier segment, the y-value should decrease as the t-value increases. Right now, the y-value keeps increasing. How should I fix this? After intensive debugging I haven't found why this is happening, because everything seems to conform to the documentation.
It should be possible to get points along a Bezier path, since addQuadCurveToPoint adds a quadratic Bezier segment to the path. So, the three control points of your first quadratic Bezier curve are (refer to the code in the original post):
P(0) = origin
P(1) = (midpt1.x, midpt1.y+50)
P(2) = location
You can compute as many points on this quadratic Bezier curve as you want by varying the parameter t from 0 to 1 by any small increment value as
C(t) = (1-t)^2*P(0) + 2*t*(1-t)*P(1) + t^2 *P(2)
To get the Y value for a given X value, you will have to solve for the t value corresponding to that X from this quadratic polynomial in t:
X = (1-t)^2*X0 + 2*t*(1-t)*X1 + t^2 *X2
where X0, X1 and X2 are the X coordinates of P(0), P(1) and P(2), which means X0=origin.x, X1=midpt1.x and X2=location.x.
From this, we can obtain a quadratic equation
(X0-2*X1+X2)t^2 + 2(X1-X0)*t + (X0-X) = 0
You can solve for t using the quadratic formula. If your X0, X1 and X2 values happen to make the coefficient of the t^2 term zero, you can solve for t directly as t = (X-X0)/(2*(X1-X0)).
Once you have the t value, you can easily evaluate the corresponding Y value.
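A minimal sketch of this procedure in code, solving the quadratic for t and then evaluating Y. It is written in Lua rather than Objective-C, and the function names and the tolerance are my own, not part of the answer:
-- Evaluate one coordinate of a quadratic Bezier at parameter t.
local function quadBezier(t, p0, p1, p2)
    return (1 - t) * (1 - t) * p0 + 2 * t * (1 - t) * p1 + t * t * p2
end

-- Solve (X0 - 2*X1 + X2)*t^2 + 2*(X1 - X0)*t + (X0 - X) = 0 for t in [0, 1].
local function tForX(x, x0, x1, x2)
    local a = x0 - 2 * x1 + x2
    local b = 2 * (x1 - x0)
    local c = x0 - x
    if math.abs(a) < 1e-9 then         -- degenerate case: equation is linear in t
        return -c / b
    end
    local disc = b * b - 4 * a * c
    if disc < 0 then return nil end    -- no real solution: x is not on this segment
    local r = math.sqrt(disc)
    for _, t in ipairs({ (-b + r) / (2 * a), (-b - r) / (2 * a) }) do
        if t >= 0 and t <= 1 then return t end
    end
    return nil
end

-- Y on the curve for a given X (nil if X is outside the segment).
local function yForX(x, p0, p1, p2)
    local t = tForX(x, p0.x, p1.x, p2.x)
    if t then return quadBezier(t, p0.y, p1.y, p2.y) end
end
For the second quad segment of the path in the question, p0 would be location, p1 the second control point and p2 endpt.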
CGPath is an opaque data type, i.e. in this case we can only get back the points we defined when creating the path. For the graph you create, for example, there are only three points that can be obtained.
As in the sample code, you obtain those points using CGPathApply. If you append the code below after your code, it will output those three points.
...
    [shapeLayer setPath:path.CGPath];
    NSMutableArray *keyPoints = [NSMutableArray array];
    CGPathApply(path.CGPath, (__bridge void *)keyPoints, getPointsFromBezier);
    NSLog(@"Points = %@", keyPoints);
}
// copied from the sample code.
void getPointsFromBezier(void *info, const CGPathElement *element)
{
    NSMutableArray *bezierPoints = (__bridge NSMutableArray *)info;
    CGPathElementType type = element->type;
    CGPoint *points = element->points;
    if (type != kCGPathElementCloseSubpath)
    {
        if ((type == kCGPathElementAddLineToPoint) ||
            (type == kCGPathElementMoveToPoint))
            [bezierPoints addObject:VALUE(0)];
        else if (type == kCGPathElementAddQuadCurveToPoint)
            [bezierPoints addObject:VALUE(1)];
        else if (type == kCGPathElementAddCurveToPoint)
            [bezierPoints addObject:VALUE(2)];
    }
}
So, in short, you cannot get every single coordinate on that curve from its x counterpart this way, as you require; you only get back the points that were added to the path.

Evenly arranging circular points in Lua

I managed to make points, drawn via the print function, appear in a circle shape and animated them in a constant rotation with the cycle variable. However, my attempt to auto-arrange the points into 1/number-of-points segments of the circle, so they are laid out evenly without inputting the angle of each one, instead seems to make them go around far more than 360 degrees, as the error feeds into itself.
For example, for 5 points I'd want each point in its own 1/5th of the 360 degrees, with even spacing on each side, which should make a regular pentagon shape if you joined up the dots.
function love.load() -- only run at startup
    cycle = 0
    points = 9 -- should work on any value
    radius = 0.5
    love.window.setMode(90, 90)
end

function love.update()
    cycle = cycle + 0.05
    if cycle >= 360 then
        cycle = 0 -- prevent huge values
    end
end

function love.draw()
    i = 0
    while i < points do
        x = radius * math.deg(math.sin(cycle + (360 * (i / points))))
        y = radius * math.deg(math.cos(cycle + (360 * (i / points))))
        -- cycle to move and i / points to auto arrange
        b = i / points
        b = round(b, 2)
        love.graphics.print(b, 33 + x, 33 + y)
        i = i + 1
    end
end

function round(num, idp) -- rounding function for display
    local mult = 10^(idp or 0)
    return math.floor(num * mult + 0.5) / mult
end
What currently happens:
In your loop, you're doing
x = radius * math.deg(math.sin(cycle + (360 * (i / points )) ) )
y = radius * math.deg(math.cos(cycle + (360 * (i / points )) ) )
whereas, you want to have:
b = i / points
local c = cycle + (360 * b) -- to lessen the computation cost
x = radius * math.sin( math.rad(c) )
y = radius * math.cos( math.rad(c) )
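For context, here is a sketch of the whole love.draw with those corrected lines dropped back in (the rest of the original program is assumed unchanged; note that radius is now in pixels, so the original 0.5 would need to be increased, e.g. to around 25, for the points to be visibly separated; that adjustment is my assumption, not part of the answer):
function love.draw()
    local i = 0
    while i < points do
        local b = i / points
        local c = cycle + (360 * b)        -- angle in degrees for this point
        local x = radius * math.sin(math.rad(c))
        local y = radius * math.cos(math.rad(c))
        love.graphics.print(round(b, 2), 33 + x, 33 + y)
        i = i + 1
    end
end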
You just needed to convert that 360 to radians using the Lua math.rad() function, as Lua's math.sin() and math.cos() functions take radians rather than degrees. The exclamation points have nothing to do with the code and are just there to draw your attention to the change.
x = math.deg(radius*math.sin(cycle + (!!!math.rad(360)!!! * (i / points )) ) )
y = math.deg(radius*math.cos(cycle + (!!!math.rad(360)!!! * (i / points )) ) )
You need to use cos for x and sin for y to get the right order:
x = radius * math.deg(math.cos(cycle + (360 * (i / points )) ) )
y = radius * math.deg(math.sin(cycle + (360 * (i / points )) ) )

Draw Perpendicular line to a line in opencv

I can better explain my problem with an image.
I have a contour and a line passing through that contour.
At the intersection point of the contour and the line, I want to draw a line perpendicular to the original line, extending up to a particular distance.
I know the intersection point as well as the slope of the line.
For reference, I am attaching this image.
If the blue line in your picture goes from point A to point B, and you want to draw the red line at point B, you can do the following:
Get the direction vector going from A to B. This would be:
v.x = B.x - A.x; v.y = B.y - A.y;
Normalize the vector:
mag = sqrt (v.x*v.x + v.y*v.y); v.x = v.x / mag; v.y = v.y / mag;
Rotate the vector 90 degrees by swapping x and y and negating one of them. A note about the rotation direction: in OpenCV, and in image processing in general, the axes are not oriented the Euclidean way; in particular, the y axis points down, not up. In the Euclidean convention, negating the final x (the initial y) rotates counterclockwise (the standard direction) and negating y rotates clockwise; in OpenCV it's the opposite. So, for example, to get a clockwise rotation in OpenCV: temp = v.x; v.x = -v.y; v.y = temp;
Create a new line at B pointing in the direction of v:
C.x = B.x + v.x * length; C.y = B.y + v.y * length;
(Note that you can make it extend in both directions by creating a point D in the opposite direction by simply negating length.)
This is my version of the function:
import math

def getPerpCoord(aX, aY, bX, bY, length):
    vX = bX - aX
    vY = bY - aY
    # print(str(vX) + " " + str(vY))
    if vX == 0 and vY == 0:
        # degenerate: A and B are the same point, so there is no direction to rotate
        return 0, 0, 0, 0
    mag = math.sqrt(vX * vX + vY * vY)
    vX = vX / mag
    vY = vY / mag
    # rotate 90 degrees: swap the components and negate one of them
    temp = vX
    vX = -vY
    vY = temp
    cX = bX + vX * length
    cY = bY + vY * length
    dX = bX - vX * length
    dY = bY - vY * length
    return int(cX), int(cY), int(dX), int(dY)
