I just started to use JSLint with Backbone. At the beginning of the project I create an object:
App = {
    Models: {},
    Views: {},
    Controller: {}
}
and get the error: "'App' was used before it was defined."
Then later I use it as:
App.Models.Task = Backbone.Model.extend({})
and at this point JSLint won't let me through. It says:
unexpected 'App'. App.Models.Task = Backbone.Model.extend({}) // Line 17, Pos 1
#8 Stopping. (7% scanned).
I've read that JSLint probably treats it as a critical error because it stopped, but it is not really an error. What should I do?
It should be var App (or window.App, depending on your goals; in my opinion, even global variables are better defined as locals first and then exported into the outer scope in a single statement). Otherwise JSLint (quite rightly) thinks that you either forgot to define this variable somewhere else or made a typo in its name.
How to get a string as the true value of tfx.orchestration.data_types.RuntimeParameter during pipeline execution?
Hi,
I'm defining a runtime parameter like data_root = tfx.orchestration.data_types.RuntimeParameter(name='data-root', ptype=str) for a base path, from which I derive many subfolders for various components, e.g. str(data_root)+'/model' as the model serving path in tfx.components.Pusher().
It was working like a charm before I moved to tfx==1.12.0: str(data_root) now produces a JSON dump.
To overcome that, I tried to define a runtime parameter for the model path, like model_root = tfx.orchestration.data_types.RuntimeParameter(name='model-root', ptype=str), and then feed it to the Pusher component the way I saw in many tutorials:
pusher = Pusher(model=trainer.outputs['model'],
                model_blessing=evaluator.outputs['blessing'],
                push_destination=tfx.proto.PushDestination(
                    filesystem=tfx.proto.PushDestination.Filesystem(base_directory=model_root)))
but I get a TypeError saying that tfx.proto.PushDestination.Filesystem does not accept a RuntimeParameter.
This completely breaks the existing setup, as I receive those parameters from an external client for each Kubeflow run.
Thanks a lot for any help.
I was able to fix it.
First of all, the docstring is not clear about which parameters of Pusher can be a RuntimeParameter and which cannot.
I finally looked at the __init__ definition of the Pusher component and saw that only the push_destination parameter can be a RuntimeParameter:
def __init__(
    self,
    model: Optional[types.BaseChannel] = None,
    model_blessing: Optional[types.BaseChannel] = None,
    infra_blessing: Optional[types.BaseChannel] = None,
    push_destination: Optional[Union[pusher_pb2.PushDestination,
                                     data_types.RuntimeParameter]] = None,
    custom_config: Optional[Dict[str, Any]] = None,
    custom_executor_spec: Optional[executor_spec.ExecutorSpec] = None):
Then I defined the component accordingly, using my RuntimeParameter:
model_root = tfx.orchestration.data_types.RuntimeParameter(name='model-serving-location', ptype=str)
pusher = Pusher(model=trainer.outputs['model'],
                model_blessing=evaluator.outputs['blessing'],
                push_destination=model_root)
Since the push_destination parameter is supposed to be a tfx.proto.pusher_pb2.PushDestination proto message, you then have to respect the associated schema when instantiating and running a pipeline execution, meaning the value should look like:
{'type': 'model-serving-location', 'value': '{"filesystem": {"base_directory": "path/to/model/serving/for/the/run"}}'}
Regards
So I need to secure my code.
My code is obfuscated and I'm using load(tostring(resultServerr))(), but someone can put load = print at the top and it will print my entire code.
I have to protect it from that kind of replacement, but I don't know how.
If you use Lua 5.2 or higher, you can provide your own sandboxed _ENV table as the fourth argument of load. If you use Lua 5.1 or 5.0, you can use setfenv, which works almost the same way as the new _ENV.
local func, err = load(unsafecode, nil, nil, {})
if not func then print(err) return end
func()
or
local func, err = load(unsafecode)
if not func then print(err) return end
setfenv(func, {})
func()
Here I use an empty table {} to prevent the sandboxed code from reading any globals, adding new ones, or overwriting existing ones. If you want to provide some functions, just add them to this table; they won't be removed from _G even if the sandboxed code changes them there.
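For example, a minimal sketch of a sandbox that exposes a small whitelist (which names to expose is up to you; print, pairs and math here are just an illustration):
local sandbox = { print = print, pairs = pairs, math = math } -- whitelist only what the chunk may use
local func, err = load(unsafecode, "untrusted", "t", sandbox) -- "t": accept only text chunks
if not func then print(err) return end
func() -- the chunk sees sandbox as its _ENV; _G stays untouched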
It depends on how and where you are using the Lua environment, and on what a normal user is allowed to do with Lua.
If only you control the startup process of the server or program and the user has no access to it, you can sandbox things at init time as Spar suggested, or do something like this:
init.lua:
function prepare()
    local load = _G.load -- capture the real load before user code can replace it
    _G.load_extension = function()
        local obfuscated = get_extension() -- fetches the obfuscated source
        load(obfuscated)() -- compile and run it with the captured load
    end
end
The original load is stored in a local variable, so the copy used by load_extension cannot be overwritten by later Lua scripts/functions.
If a normal user can modify the init scripts or can load external libraries (.dll/.so), you are out of luck: you cannot properly secure your code.
If the user has access to the content of resultServerr, you are also out of luck hiding it, as that variable can be passed to any other process/function, not only print or load.
Overwrite print with nil, and then it can't print.
You can create your own hidden print function if you still need that capability; just remember what you called it so you can access it later.
hidden_print = print
print = nil
print('this')
stdin:1: attempt to call a nil value (global 'print')
stack traceback:
stdin:1: in main chunk
[C]: in ?
hidden_print('that')
that
I am trying to make changes to the database using the changelog. Since I cannot guarantee that values currently exist for the specific code, but they could, I need to check for them in order to do either an insert or an update.
Here is what I have been testing, which doesn't appear to do anything. Any advice is welcome.
databaseChangeLog = {
    changeSet(author:'kmert', id:'tubecap-insert-update-1') {
        preConditions(onFail="WARN", onFailMessage:"Tube cap does not exist,skipping because it cannot be updated.") {
            sqlCheck(expectedResult='1', 'SELECT * FROM [ltc2_tube_cap] WHERE code=11')
        }
        grailsChange {
            change {
                sql.execute("""
                    UPDATE [ltc2_tube_cap]
                    SET [name] = 'White'
                    WHERE [code] = 11;
                """)
            }
            rollback {
            }
        }
    }
}
UPDATE: I got the changelog script running, but I am now getting the following error. I found the code in an online source; I cannot find much documentation on preconditions...
| Starting dbm-update for database hapi_app_user # jdbc:jtds:sqlserver://localhost;databaseName=LabTraffic;MVCC=TRUE;LOCK_TIMEOUT=10000
problem parsing TubeCapUpdate.groovy: No signature of method: grails.plugin.databasemigration.DslBuilder.sqlCheck() is applicable for argument types: (java.lang.String, java.lang.String) values: [1, SELECT * FROM ltc2_tube_cap WHERE code=11] (re-run with --verbose to see the stacktrace)
problem parsing changelog.groovy: No signature of method: grails.plugin.databasemigration.DslBuilder.sqlCheck() is applicable for argument types: (java.lang.String, java.lang.String) values: [1, SELECT * FROM ltc2_tube_cap WHERE code=11] (re-run with --verbose to see the stacktrace)
groovy.lang.MissingMethodException: No signature of method: grails.plugin.databasemigration.DslBuilder.sqlCheck() is applicable for argument types: (java.lang.String, java.lang.String) values: [1, SELECT * FROM ltc2_tube_cap WHERE code=11]
at grails.plugin.databasemigration.DslBuilder.invokeMethod(DslBuilder.groovy:117)
at Script1$_run_closure1_closure2_closure3.doCall(Script1.groovy:13)
at grails.plugin.databasemigration.DslBuilder.invokeMethod(DslBuilder.groovy:117)
at Script1$_run_closure1_closure2.doCall(Script1.groovy:12)
at grails.plugin.databasemigration.DslBuilder.invokeMethod(DslBuilder.groovy:117)
at Script1$_run_closure1.doCall(Script1.groovy:11)
at grails.plugin.databasemigration.GrailsChangeLogParser.parse(GrailsChangeLogParser.groovy:84)
at grails.plugin.databasemigration.DslBuilder.handleIncludedChangeLog(DslBuilder.groovy:747)
at grails.plugin.databasemigration.DslBuilder.createNode(DslBuilder.groovy:139)
at grails.plugin.databasemigration.DslBuilder.createNode(DslBuilder.groovy:590)
at grails.plugin.databasemigration.DslBuilder.invokeMethod(DslBuilder.groovy:117)
at Script1$_run_closure1.doCall(Script1.groovy:6)
at grails.plugin.databasemigration.GrailsChangeLogParser.parse(GrailsChangeLogParser.groovy:84)
at liquibase.Liquibase.update(Liquibase.java:107)
at DbmUpdate$_run_closure1_closure2.doCall(DbmUpdate:26)
at _DatabaseMigrationCommon_groovy$_run_closure2_closure11.doCall(_DatabaseMigrationCommon_groovy:59)
at grails.plugin.databasemigration.MigrationUtils.executeInSession(MigrationUtils.groovy:133)
at _DatabaseMigrationCommon_groovy$_run_closure2.doCall(_DatabaseMigrationCommon_groovy:51)
at DbmUpdate$_run_closure1.doCall(DbmUpdate:25)
Your syntax is incorrect for the sqlCheck preCondition.
sqlCheck(expectedResult:'1', 'SELECT * FROM [ltc2_tube_cap] WHERE code=11')
Notice that in your code the first argument is an assignment statement, expectedResult='1', when it should be a map entry, expectedResult:'1'.
I found the answer buried in this Jira page: https://jira.grails.org/browse/GPDATABASEMIGRATION-40 which, ironically, is about adding lots of DB-migration DSL examples to the documentation.
Make sure of the following. First, generate the migration file:
grails dbm-gorm-diff add-your-file-forupdate.groovy -add
Then inside your-file-forupdate.groovy you should see:
databaseChangeLog = {
changeSet(author:'kmert', id:'tubecap-insert-update-1') {
.
.
.
}
}
Then, the key point is to make sure this migration script is included so that it gets executed. Just manually add a line like the following to the end of grails-app/migrations/changelog.groovy:
include file: 'your-file-forupdate.groovy'
The changelog.groovy is always run from beginning to end, so make sure that you always add newly created migrations to the end.
Cheers! For more info, see this.
tl;dr: What design pattern allows you to split Lua code over multiple files that need to share some information without affecting the global table?
Background
It is considered bad form to create a library in Lua where requiring the library affects the global namespace:
--> somelib.lua <--
SomeLib = { ... }
--> usercode.lua <--
require 'somelib'
print(SomeLib) -- global key created == bad
Instead, it is considered a best practice to create a library that uses local variables and then returns them for the user to assign as they see fit:
--> somelib.lua <--
local SomeLib = { ... }
return SomeLib
--> usercode.lua <--
local theLib = require 'somelib' -- consumers name lib as they wish == good
The above pattern works fine when using a single file. However, this becomes considerably harder when you have multiple files that reference each other.
Concrete Example
How can you rewrite the following suite of files so that the assertions all pass? Ideally the rewrites will keep the same files on disk, with the same responsibilities for each file. (Rewriting by merging all the code into a single file is effective, but not helpful. ;)
--> test_usage.lua <--
require 'master'
assert(MASTER.Simple)
assert(MASTER.simple)
assert(MASTER.Shared)
assert(MASTER.Shared.go1)
assert(MASTER.Shared.go2)
assert(MASTER.Simple.ref1()==MASTER.Multi1)
assert(pcall(MASTER.Simple.ref2))
assert(_G.MASTER == nil) -- Does not currently pass
--> master.lua <--
MASTER = {}
require 'simple'
require 'multi'
require 'shared1'
require 'shared2'
require 'shared3'
require 'reference'
--> simple.lua <--
MASTER.Simple = {}
function MASTER:simple() end
--> multi.lua <--
MASTER.Multi1 = {}
MASTER.Multi2 = {}
--> shared1.lua <--
MASTER.Shared = {}
--> shared2.lua <--
function MASTER.Shared:go1() end
--> shared3.lua <--
function MASTER.Shared:go2() end
--> reference.lua <--
function MASTER.Simple:ref1() return MASTER.Multi1 end
function MASTER.Simple:ref2() MASTER:simple() end
Failure: Setting the Environment
I thought to solve the problem by setting the environment to my master table, with a self-reference. This does not work when calling functions like require, however, as they change the environment back:
--> master.lua <--
foo = "original"
local MASTER = setmetatable({foo="captured"},{__index=_G})
MASTER.MASTER = MASTER
setfenv(1,MASTER)
require 'simple'
--> simple.lua <--
print(foo) --> "original"
MASTER.Simple = {} --> attempt to index global 'MASTER' (a nil value)
You are giving master.lua two responsibilities:
It defines the common module table
It imports all of the submodules
Instead you should create a separate module for (1) and import it in all of the submodules:
--> common.lua <--
return {}
--> master.lua <--
require 'simple'
require 'multi'
require 'shared1'
require 'shared2'
require 'shared3'
require 'reference'
return require'common' -- return the common table
--> simple.lua <--
local MASTER = require'common' -- import the common table
MASTER.Simple = {}
function MASTER:simple() end
etc.
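For instance, reference.lua (taken from the question's files) would become:
--> reference.lua <--
local MASTER = require'common' -- import the common table
function MASTER.Simple:ref1() return MASTER.Multi1 end
function MASTER.Simple:ref2() MASTER:simple() end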
Finally, change the first line of test_usage.lua to use a local variable:
--> test_usage.lua <--
local MASTER = require'master'
...
The tests should now pass.
I have a systematic way to solve that problem. I have refactored your module in a Git repository to show you how it works: https://github.com/catwell/dont-touch-global-namespace/commit/34b390fa34931464c1dc6f32a26dc4b27d5ebd69
The idea is that you should have the sub-parts return a function that takes the main module as an argument.
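A minimal sketch of the idea, using the question's files:
--> simple.lua <--
return function(MASTER)
  MASTER.Simple = {}
  function MASTER:simple() end
end
--> master.lua <--
local MASTER = {}
require 'simple' (MASTER) -- each sub-module patches the table it is given
-- ...same for the other sub-modules...
return MASTER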
If you cheat by opening the source files from master.lua, appending a header and a footer, and using loadstring, you can even use them unmodified (only master.lua has to be modified, but it becomes more complex). Personally, I prefer to keep things explicit, which is what I have done here. I don't like magic :)
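That loadstring variant could look roughly like this (a sketch for Lua 5.1; here a single header line suffices, and the sub-module files are assumed to sit next to master.lua):
--> master.lua <--
local MASTER = {}
for _, name in ipairs{'simple', 'multi', 'shared1', 'shared2', 'shared3', 'reference'} do
  local f = assert(io.open(name .. '.lua', 'rb'))
  local src = f:read('*a')
  f:close()
  -- the header turns each file's MASTER accesses into a chunk argument
  local chunk = assert(loadstring('local MASTER = ...\n' .. src, '@' .. name .. '.lua'))
  chunk(MASTER)
end
return MASTER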
EDIT: it is very close to Andrew Stark's first solution, except I patch the MASTER table directly in the sub-modules. The advantage is that you can define several things at once, like in your simple.lua, multi.lua and reference.lua files.
We can solve the problem by changing the master file to modify the environment in which all required code is run:
--> master.lua <--
local m = {} -- The actual master table
local env = getfenv(0) -- The current environment
local sandbox = { MASTER=m } -- Environment for all requires
setmetatable(sandbox,{__index=env}) -- ...also exposes read access to real env
setfenv(0,sandbox) -- Use the sandbox as the environment
-- require all files as before
setfenv(0,env) -- Restore the original environment
return m
The sandbox is an empty table that inherits values from _G but that also has a reference to the MASTER table, simulating a global from the perspective of later code. Using this sandbox as the environment causes all later requires to evaluate their "global" code in this context.
We save the real environment for later restoration, so that we don't mess with any later code that might want to actually set a global variable.
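With the requires from the question filled in, the whole master.lua reads:
--> master.lua <--
local m = {}
local env = getfenv(0)
local sandbox = { MASTER = m }
setmetatable(sandbox, {__index = env})
setfenv(0, sandbox)
require 'simple'
require 'multi'
require 'shared1'
require 'shared2'
require 'shared3'
require 'reference'
setfenv(0, env)
return m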
The question concerns:
Not polluting the global space when making modules.
Making modules in such a way that they might be split into multiple files, for maintenance reasons, among others.
My solution to the above problem lies in tweaking the "return as table" idiom in Lua: instead of returning a table, you return a function that returns a table whenever state needs to be passed between sub-modules.
This works well for sub-modules that are entirely dependent on some root module. If they are loaded independently, the user has to know to call the module before using it, unlike every other module, whose collection of methods is ready to go from local a = require('a').
At any rate, this works like so:
--> callbacks.lua <-- (a sub-module)
return function(self)
  local callbacks = {}
  callbacks.StartElement = function(parser, elementName, attributes)
    local res = {}
    local stack = self.stack
    -- ...awesome stuff for about 150 lines...
  end
  return callbacks
end
To use it, you can...
local make_callbacks = require'callbacks'
self.callbacks = make_callbacks(self)
Or, better yet, simply call the return value of require when assigning the callback table to the parent module, like so:
self.callbacks = require'trms.xml.callbacks'(self)
Most often, I try not to do this. If I'm passing state or self between sub-modules, I find that I'm often doing it wrong. My internal policy is that if I'm doing something that is highly related to another file, I might be okay. More likely, I'm putting something in the wrong spot, and there is a way to do it without passing anything between modules.
The reason that I don't like this is that whatever I pass by table has methods and properties unseen in the file that I am working in. I'm not free to refactor the internal implementation of one of my files without breaking the others. So, I humbly suggest that this idiom is a yellow flag, but probably not a red one. :)
While this solves the problem of state-sharing without globals, it doesn't really protect the user from the accidental omission of local. If I may speak to that implied question...
The first thing that I do is remove access to the global environment from my module. Remembering that it's only available until I reset _ENV, resetting it is the first thing that I do, by packing only what is needed into a new _ENV table.
_ENV = {
  print = print,
  pairs = pairs, -- etc.
}
However, constantly re-typing everything that I need from Lua into each file is a giant, error-prone pain. To avoid this, I make one file in my module's base directory and use it as the home for the common environment of all of my modules and sub-modules. I call it _ENV.lua.
Note: I cannot use init.lua or any other root module for this purpose, because I need to be able to load it from the sub-modules, which are being loaded by the root module, which loads the sub-modules, which are...
My abbreviated _ENV.lua file looks something like the following:
--_ENV.lua
_ENV = {
  type = type, pairs = pairs, ipairs = ipairs, next = next,
  print = print, require = require, io = io, table = table,
  string = string, lxp = require"lxp", lfs = require"lfs",
  socket = require("socket"), lpeg = require'lpeg', -- etc.
}
return _ENV
With this file, I now have a common base from which to work.
All of my other modules load this first, using the following command:
_ENV = require'root_mod._ENV' --where root_mod is the base of my module.
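A sub-module then starts like this (a sketch; root_mod stands for the module's base directory, as above, and parse is just a placeholder name):
--> some_submodule.lua <--
_ENV = require'root_mod._ENV' -- from here on, only the whitelisted names are visible
local M = {}
function M.parse(s)
  print(s) -- print comes from the shared _ENV, not from _G
end
return M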
This facility was critical for me, for two reasons. First, it keeps me out of global space. If I see that I am missing something from the global environment _G (it took me a surprisingly long time before I saw that I didn't have tostring!), I can go back into my _ENV.lua file and add it. As a required file, it only gets loaded one time, so having it applied to all of my sub-modules is 0 calories.
Second, I find that it gives me everything that I really needed for using the "return module as table" protocol, with only a few exceptions where "return a function that returns a table" is needed.
TL;DR: Don't return the module, set package.loaded[...] = your_module as early as possible (can still be empty), then just require the module in submodules and it will be properly shared.
The clean way to do this is to explicitly register the module and not rely on require to implicitly register it at the end. The documentation says:
require (modname)
Loads the given module. The function starts by looking into the package.loaded table to determine whether modname is already loaded. If it is, then require returns the value stored at package.loaded[modname]. [This gets you the caching behavior that every file is run only once.] Otherwise, it tries to find a loader for the module. [And one of the searchers is looking for Lua files to run, which gets you the usual file loading behavior.]
[…]
Once a loader is found, require calls the loader with two arguments: modname and an extra value dependent on how it got the loader. (If the loader came from a file, this extra value is the file name.) If the loader returns any non-nil value [e.g. your file returns the module table], require assigns the returned value to package.loaded[modname]. If the loader does not return a non-nil value and has not assigned any value to package.loaded[modname], then require assigns true to this entry. In any case, require returns the final value of package.loaded[modname].
(emphasis, [comments] added by me.)
With the return mymodule idiom, the caching behavior fails if you have a loop in your dependencies – the cache is updated too late. (As a result, files may be loaded several times (you may even get endless loops!) and sharing will fail.) But explicitly saying
local _M = { } -- your module, however you define / name it
package.loaded[...] = _M -- recall: require calls loader( modname, something )
-- so `...` is `modname, something` which is shortened
-- to just `modname` because only one value is used
immediately updates the cache, so that other modules can already require your module before its main chunk returned. (Of course, at that time they can only actually use what's already been defined. But that's not usually a problem.)
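A minimal sketch of a dependency loop that this makes work (the file names are just for illustration):
--> a.lua <--
local A = {}
package.loaded[...] = A -- register before requiring anything else
local B = require 'b' -- b.lua may require 'a' and will get this table back
function A.f() return B end
return A
--> b.lua <--
local B = {}
package.loaded[...] = B
local A = require 'a' -- no endless loop: the cache already holds a.lua's table
function B.g() return A end
return B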
The package.loaded[...] = mymodule approach works in 5.1–5.3 (incl. LuaJIT).
For your example, you would adjust the start of master.lua, replacing
MASTER = {}
with
local MASTER = {}
package.loaded[...] = MASTER
and add the following as the first line of all other files:
local MASTER = require "master"
and you're done.
I'm using a graphics library that lets you program in Lua. I need A* pathfinding, so I found a library online. It's just one Lua file that does the pathfinding, plus one example file. The example file uses the object like this:
-- Loading the library
local Astar = require 'Astar'
Astar(map,1) -- Inits the library, sets the OBST_VALUE to 1
I run the script and everything works. So now I add the Astar.lua file to the path location where my graphics engine is running, do the same thing, and I get this error on the Astar(map, 1) line:
"attempt to call local 'AStar' (a number value)
Any ideas why I would be getting that error when I'm doing the same thing as the example that comes with this AStar lib?
Here is a little of the AStar file:
-- The Astar class
local Astar = {}
setmetatable(Astar, {__call = function(self,...) return self:init(...) end})
Astar.__index = Astar
-- Loads the map, sets the unwalkable value, inits pathfinding
function Astar:init(map,obstvalue)
  self.map = map
  self.OBST_VALUE = obstvalue or 1
  self.cList = {}
  self.oList = {}
  self.initialNode = false
  self.finalNode = false
  self.currentNode = false
  self.path = {}
  self.mapSizeX = #self.map[1]
  self.mapSizeY = #self.map
end
So note that when I run this from my graphics engine, require returns 1, but when run from the example it came with, it returns a table, which is what it should return. I'm not sure why it would only return 1 here.
How is Astar getting added to the package.loaded table for the example script, as opposed to your code?
QUICK LUA SYNTACTIC SUGAR REVIEW:
func 'string' is equivalent to func('string')
tabl.ident is equivalent to tabl['ident']
When you run a script using require('Astar'), this is what it does:
1. Checks whether package.loaded['Astar'] is a non-nil value. If it is, it returns this value; otherwise it continues down this list.
2. Runs through the filename patterns listed in package.path (and package.cpath), with '?' replaced with 'Astar', until it finds the first file matching a pattern.
3. Sets package.loaded['Astar'] to true.
4. Runs the module script (found via the path search above; for the sake of this example we'll assume it's not a C module) with 'Astar' as an argument (accessible as ... in the module script).
5. If the script returns a value, this value is placed into package.loaded['Astar'].
6. The contents of package.loaded['Astar'] are returned.
Note that the script can load the package into package.loaded['Astar'] as part of its execution and return nothing.
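So, as an alternative to returning the table, a module script could register itself while it runs (a sketch):
--> Astar.lua <--
local Astar = {}
package.loaded['Astar'] = Astar -- registered during execution
-- ...definitions...
-- no return statement needed: require('Astar') still yields the table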
As somebody noted in the comments above, your problem may come from loading the module as 'AStar' instead of 'Astar'. It's possible that Lua is loading the script under this string (since, on case-insensitive Windows, a search for a file named "AStar.lua" will open a file called "Astar.lua"), but the script isn't operating with it (it uses a hard-coded "Astar" instead of the "AStar" that Lua is loading the script under).
You need to add return Astar at the end of Astar.lua.
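That is, the end of Astar.lua should look like:
function Astar:init(map,obstvalue)
  -- ...as above...
end
return Astar -- hand the class table back to require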