Core Games Scripting: Line of sight - Lua

I'm working with a game engine called Core, whose scripts are written in Lua, and I'm having trouble writing a script to detect line of sight. I can't seem to find any help for scripting this in Core's API documentation, and as it is a relatively new engine I also can't find anyone else with this issue.
If anyone knows how to detect line of sight in a Core script, I would greatly appreciate it.

What I think you need is World.Raycast(), which takes Vector3s for the start and end points. There is an example here: https://docs.coregames.com/api/hitresult/#examples that gets a HitResult based on a starting position Vector3 and the camera direction:
local rayStart = player:GetViewWorldPosition()
local cameraForward = player:GetViewWorldRotation() * Vector3.FORWARD
local rayEnd = rayStart + cameraForward * 10000
local hitResult = World.Raycast(rayStart, rayEnd)
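Building on that example, a rough line-of-sight check could look like the sketch below; here target is a placeholder for whatever Player or CoreObject you want to test against, and it assumes (as in the HitResult docs) that World.Raycast() returns nil when nothing is hit and that hitResult.other references the first thing the ray struck:
local function hasLineOfSight(player, target)
    local rayStart = player:GetViewWorldPosition()
    local rayEnd = target:GetWorldPosition()
    local hitResult = World.Raycast(rayStart, rayEnd)
    -- nil means nothing blocked the ray; otherwise check that the first
    -- thing it hit is the target itself rather than a wall or other object
    return hitResult == nil or hitResult.other == target
end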

Related

Load player character to dummy

I'm pretty new to scripting, but I'm trying to make a lobby system. Basically, when players join they are assigned to one of 4 dummies (so 4 players max), so that the dummies look like the players. I'm getting very confused looking at the Roblox documentation, and I have little scripting experience.
I want to:
First, delete any dummies that are already loaded
Assign players to the dummies as they join, then remove dummies as they leave
Duplicate the dummies from replicated storage and parent them to the Workspace.
Just looking for some guidance; please don't write an entire script, as I'm trying to use this as a learning tool.
Here's the code so far, and now my head hurts haha
local function characterRig()
    for i, player in pairs(game.Players:GetPlayers()) do
        local dummy = game.ServerStorage["Dummy"..i]
        player.CharacterAutoLoads = false
        player.Character = dummy
        local playerHumanoid = player.Character.humanoid
        playerHumanoid:GetHumanoidDescriptionFromUserId()
        dummy.Parent = workspace
    end
end
game.Players.PlayerAdded:Connect(characterRig)
game.Players.PlayerRemoving:Connect(characterRig)
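One hint worth noting, as a sketch rather than a full script: PlayerAdded and PlayerRemoving both pass the specific Player to your handler, so you can work with just that one player instead of looping over everyone each time (the commented helper names are placeholders):
game.Players.PlayerAdded:Connect(function(player)
    -- assignDummyTo(player)  -- find a free dummy, clone/parent it, copy the player's look
end)
game.Players.PlayerRemoving:Connect(function(player)
    -- releaseDummyOf(player) -- free up or remove that player's dummy
end)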

How to make a player fly to a point in lua roblox?

I want my player to fly at a certain speed to a specified point after running the code. It should not be teleportation, but smooth movement, as in this video: https://youtu.be/_p7HmviCIF8?t=231
For your case here, if you want an exact travel time for every move, you're going to need to use TweenService.
First, you're going to reference where you're going. Let's say that our point is the CFrame value of an object.
Remember, whenever we want to move our character like this, we use CFrames and not Positions.
Next, you're going to make a TweenInfo, which is basically the set of parameters for the tween, e.g. the time to get to the point, the easing style it should have (Linear, Elastic, etc.), and so on.
Then you're going to need a table containing the property that needs to be changed; in this case we want the CFrame of the HumanoidRootPart to become the point we set.
Finally, we're going to make a new tween and have it tween our HumanoidRootPart's CFrame to the point's CFrame.
local TweenService = game:GetService("TweenService")
local TweeningInfo = TweenInfo.new(
    -- The time to get there here
)
local TargetValue = {
    CFrame = -- Point CFrame here.
}
local Tween = TweenService:Create(game.Players.LocalPlayer.Character.HumanoidRootPart, TweeningInfo, TargetValue)
Tween:Play()
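Filled in with example values, it could look like the sketch below; the Part name "TargetPoint", the 3-second duration, and the Linear easing style are all assumptions for illustration, and it is meant to run from a LocalScript:
local TweenService = game:GetService("TweenService")
local player = game.Players.LocalPlayer
local character = player.Character or player.CharacterAdded:Wait()
local hrp = character:WaitForChild("HumanoidRootPart")
-- hypothetical target: a Part named "TargetPoint" placed in the Workspace
local targetPart = workspace:WaitForChild("TargetPoint")
-- 3 seconds to arrive, moving at a constant (Linear) speed
local tweenInfo = TweenInfo.new(3, Enum.EasingStyle.Linear)
local goal = { CFrame = targetPart.CFrame }
local tween = TweenService:Create(hrp, tweenInfo, goal)
tween:Play()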
As for your video, that's just a script for an exploit like Synapse X, but there are a lot of Pastebin or free scripts in the Toolbox, or videos on that; or use the Adonis admin in Studio and run this in chat: :fly me "speed"

How to render from ob with OpenCV

In my Python NEAT code I am using OpenCV to downscale and convert to grayscale every frame of the environment. What I want to achieve is for OpenCV to open a window displaying the frame/video that it is processing.
In short, I want to watch the NEAT algorithm learning and evolving.
Because there are 3 environments running in parallel, I want OpenCV to display the frame/video of the one that is performing best right now.
I am working with the Python neat library to do some machine learning tasks. At the moment I am doing parallel learning with 3 threads in the Sonic the Hedgehog environment. I have tried simple OpenCV frame commands, but it just opens a black window.
net = neat.nn.FeedForwardNetwork.create(self.genome, self.config)
fitness = 0
xpos = 0
xpos_max = 0
counter = 0
imgarray = []
while not done:
    # self.env.render()
    ob = cv2.resize(ob, (inx, iny))
    ob = cv2.cvtColor(ob, cv2.COLOR_BGR2GRAY)
    ob = np.reshape(ob, (inx, iny))
    imgarray = np.ndarray.flatten(ob)
    actions = net.activate(imgarray)
    ob, rew, done, info = self.env.step(actions)
    xpos = info['x']
This is the part of the code that downscales the frame and converts it to grayscale.
It would be a bonus if it could only show the frame/worker that is doing best, based on the fitness value.
View full code here: https://gitlab.com/lucasrthompson/Sonic-Bot-In-OpenAI-and-NEAT/blob/master/neat-paralle-sonic.py
by lucasrthompson
The output that I expect is one window that shows the frame/video of the environment.
The built-in render
self.env.render()
pops up many windows with past and present versions of the environment.
Thanks.
I am writing my own NEAT implementation and I am also testing with OpenAI Gym.
You can use wrappers to record the video for you, and this will be the real video, without downscaling or changing colors:
import gym
from gym import wrappers

env_wrapped = gym.make('OpenAI-env-id')
env = wrappers.Monitor(env_wrapped, dir, video_callable=record_video_function)
Here "record_video_function" is a callable that returns true or false, depending on whether you want the episode to be recorded.
What I usually do to see the best performing genomes is:
Sort the genomes by fitness
Run the evaluation loop
If a last species champion is next, I change a global variable to True
In the "record_video_function" I return the value of this global variable, so if it's true it will enable the video recording for the episode
After the episode is over, I set this global variable back to False
So, with this I can see the best genome performers of the last generation. You can't see the best of the current generation because there's no way to know how they will perform. If the environment is deterministic, you would be able to see the best performance in the next generation. If it's stochastic, then it may no longer be the best.
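As a rough illustration of that flow (a sketch; the variable and function names are placeholders, the environment id is left as the same placeholder used above, and the Monitor API shown matches older gym versions):
import gym
from gym import wrappers

record_best = False  # flipped to True just before a champion genome is evaluated

def record_video_function(episode_id):
    # gym calls this before each episode; returning True enables recording for it
    return record_best

env_wrapped = gym.make('OpenAI-env-id')
env = wrappers.Monitor(env_wrapped, 'videos/', video_callable=record_video_function)

# inside the NEAT evaluation loop (placeholder names):
# for genome in sorted_genomes:
#     record_best = (genome is last_generation_champion)
#     evaluate(genome, env)
#     record_best = False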

How to use Tiled Maps/Tileset exported as Lua in Love2d

I have exported my game world (tiles + collision tiles) from Tiled as a Lua file. How do I integrate this into my Love2d game and determine which tile is a walkable path and which one is not?
If you look at this question (What code do I wrap around Lua code in C++ with LuaBind?) you will find an example of what the Lua file looks like. All it is is a file that, when run in Lua, returns a table containing all the data about your game world.
Assuming your map is called "mymap.lua", you would load it in one of the following ways:
local mymap = require("mymap")
local mymap = love.filesystem.load("mymap.lua")()
Then, to use it, you would do something like:
-- loads the sprites of the first tileset
local tileset1 = love.graphics.newImage(mymap.tilesets[1].image)
-- print the width and height of the first layer
print(mymap.layers[1].width, mymap.layers[1].height)
As for what each piece of the imported data means, you will have to work it out yourself.
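For the walkable/blocked part specifically, a sketch along these lines might be a starting point; it assumes the collision tiles live in a tile layer named "Collision" (adjust to your own layer name) and that a tile id of 0 means the cell is empty, which is how Tiled's Lua export encodes unset cells, so anything nonzero is treated as blocked:
-- find a layer by name in the exported map table
local function findLayer(map, name)
    for _, layer in ipairs(map.layers) do
        if layer.name == name then
            return layer
        end
    end
end

local collision = findLayer(mymap, "Collision")

-- tx, ty are 1-based tile coordinates
local function isWalkable(tx, ty)
    local index = (ty - 1) * collision.width + tx
    -- 0 means no collision tile was placed in this cell
    return collision.data[index] == 0
end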

How do I detect a collision with a polyline in the Lua LÖVE engine?

I am using the Lua LÖVE engine to make a simple game. However, I'm having a little trouble with collision.
I have a polyline through a set of points (representing the rocky ground) and a box that needs to collide with it, but I can't think of an easy implementation.
love.graphics.line( 0,60, 100,70, 150,300, 200,230, 250,230, 300,280, 350,220, 400,220, 420,150, 440,140, 480,340 )
If anyone could help with code snippets or advice, it would be much appreciated.
You need to create a more abstract representation of the ground, from which you can generate both the line for the graphics, and the body for the physics.
So for example:
--ground represented as a set of points
local ground = {0,60, 100,70, 150,300, 200,230, 250,230, 300,280, 350,220, 400,220, 420,150, 440,140, 480,340}
This may not be the best representation, but I will continue with my example.
You now have a representation of the ground; now you need a way of converting it into something your physics can understand (to do the collision) and something your graphical display can understand (to draw the ground). So you declare two functions:
--converts a table representing the ground into something that
--can be understood by your physics.
--If you are using love.physics then this would create
--a Body with a set of PolygonShapes attached to it.
local function createGroundBody(ground)
    --you may need to pass some additional arguments,
    --such as the World in which you want to create the ground.
end

--converts a table representing the ground into something
--that you are able to draw.
local function createGroundGraphic(ground)
    --ground is already represented in a way that love.graphics.line can handle,
    --so just pass it through.
    return ground
end
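For instance, if you go with love.physics, createGroundBody could be filled in roughly like this (a sketch, assuming a static body at the origin and a ChainShape built from the point list; the extra world argument is the physics World mentioned in the comment above):
local function createGroundBody(ground, world)
    -- a static body at the origin to attach the ground geometry to
    local body = love.physics.newBody(world, 0, 0, "static")
    -- a ChainShape follows the polyline without closing it into a loop
    local shape = love.physics.newChainShape(false, unpack(ground))
    love.physics.newFixture(body, shape)
    return body
end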
Now, putting it all together:
local groundGraphic = nil
local groundPhysics = nil

function love.load()
    groundGraphic = createGroundGraphic(ground)
    physics = makePhysicsWorld()
    groundPhysics = createGroundBody(ground)
end

function love.draw()
    --Draws groundGraphic, could be implemented as
    --love.graphics.line(groundGraphic)
    --for example.
    draw(groundGraphic)
    --you also need to draw the box and everything else here.
end

function love.update(dt)
    --where `box` is the box you have to collide with the ground.
    runPhysics(dt, groundPhysics, box)
    --now you need to update the representation of your box graphic
    --to match up with the new position of the box.
    --This could be done implicitly by using the representation for physics
    --when doing the drawing of the box.
end
Without knowing exactly how you want to implement the physics and the drawing, it is hard to give more detail than this. I hope that you use this as an example of how this code could be structured. The code that I have presented is only a very approximate structure, so it will not come close to actually working. Please ask about anything that I have been unclear about!
