Logger does not work in RestrictedPython script in Plone - printing

How do I print (log) from a .cpy file (Python)? I'm using Zope/Plone and I've just started with Python. I've tried this:
import logging
logger = logging.getLogger()
logger.info("hello plone")
But it doesn't work.
Thank you for your answer

The other answer here means that you cannot import arbitrary modules in RestrictedPython scripts, i.e. the through-the-web editable Plone scripts. These scripts run with end-user permissions, so they are not allowed to execute arbitrary Python code.
http://collective-docs.readthedocs.org/en/latest/security/sandboxing.html
You can use context.plone_log("mystring") in RestrictedPython scripts for logging.
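For example, in a through-the-web script (a .cpy or Script (Python) object) plone_log is already acquired through the context, so no import is needed; a minimal sketch:

# RestrictedPython script -- no imports required,
# plone_log is acquired from the site's plone_utils tool
context.plone_log("hello plone")

The message should then show up in your Zope event log.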

"It doesnt' work" is awfully vague, but your problem is probably a violation of the security sandbox imposed on Python used in scripts that may be edited through the web. "Restricted Python" limits your imports to modules that have been audited to assure that they don't have nasty side effects -- like dumping noise into logs. See http://wiki.zope.org/zope2/PythonScripts for details on Restricted Python.
The general solution to this kind of problem is to build your functionality in unrestricted Python in a Python package. A Zope named utility is the usual mechanism for providing this kind of functionality, and you'll be able to reach the utility's operations from restricted Python by traversing to the named utility.
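A minimal sketch of the unrestricted side of that idea, assuming a package of your own (the interface and class names below are made up for illustration):

# my/package/logger.py -- unrestricted code in a regular Python package
import logging

from zope.interface import Interface, implementer


class ILoggingService(Interface):
    def log(message):
        """Write a message to the event log."""


@implementer(ILoggingService)
class LoggingService(object):
    def log(self, message):
        logging.getLogger("my.package").info(message)

The class would then be registered as a named utility (typically with a <utility> directive in the package's configure.zcml) so that restricted code can reach it as described above.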

Related

What is the best way to package an Elixir CLI application?

Suppose I have a CLI application with subcommands and arguments (like application foo --bar baz). How can I package it for distribution without requiring the user to install Erlang?
I know there's the mix escript Mix task, but it builds a binary that still requires Erlang to be installed; also, the Mix reference states that escripts should be used only for development purposes.
mix release, however, produces redundant shell scripts that I don't want to see in the dist.
So, is there a way to make a standalone distributable of an Elixir CLI application?
P.S. This is actually my first experience with Elixir (and the whole Erlang ecosystem).
Elixir applications can be packaged as Erlang releases, which can include the Erlang VM. Since 1.9, Elixir supports building releases without extra tooling, using the included Mix task mix release; please check the documentation for the details: https://hexdocs.pm/mix/Mix.Tasks.Release.html
You might benefit from a quick look at this blog post for inspiration and a conceptual overview, keeping in mind that for a simple CLI app things are much simpler: https://www.cogini.com/blog/best-practices-for-deploying-elixir-apps/
Bakeware generates a single executable file. A CLI app example has been added as well.

Build the research directory and use it as an external package for a research project

I want to do some research on quantization/sparsification and would like to use the run_experiment.py script as a template. To do this cleanly, since research is not part of the pip package, I was wondering whether it is possible to build it myself and then reuse it as a dependency (run_experiment.py uses some functions from research). I am not sure, however, how to do that. I am not familiar with Bazel; I was able to install it and run the script, but that's all. Any guidance would be highly appreciated! Or if it's not possible, that would be good to know as well. Thank you for any advice in this matter.
EDIT:
I built something with Bazel and I have it in bazel-bin, but I don't know how to reuse it from my script as if I were doing it the plain Python way:
from research.compression import compression_process_adapter
or something similar in my script.
Using TFF for Federated Learning Research gives a rough introduction and suggestions on how to organize an experiment conceptually.
From there, seeing how "run scripts" are set up in the various sub-directories under tensorflow_federated/python/research/ might provide good examples. If there is a sub-directory that is close to what you want to accomplish, forking/copying it might be a good place to start.
For instance, tensorflow_federated/python/research/gans/experiments/emnist/run_experiments.py might be a useful example of how to set up an experiment grid. It iteratively runs tensorflow_federated/python/research/gans/experiments/emnist/train.py, which includes an example of how to import libraries under the research/ directory. Note that all of these use Bazel, and the dependencies for the imports are declared in the tensorflow_federated/python/research/gans/experiments/emnist/BUILD file.
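As a rough sketch of what such a declaration looks like (the target name and label below are illustrative, not copied from the actual BUILD file), a py_binary that is allowed to import from research/ lists the library as a Bazel dependency:

# BUILD file next to your run script (illustrative labels only)
py_binary(
    name = "run_experiment",
    srcs = ["run_experiment.py"],
    deps = [
        # hypothetical label for the research library you want to import
        "//tensorflow_federated/python/research/compression:compression_process_adapter",
    ],
)

With the dependency declared, running the target through bazel run puts the library on the import path; the exact import statement depends on how the repository lays out its packages.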
Finally, this script can be run with (from the git repo root directory):
bazel run -c opt tensorflow_federated/python/research/gans/experiments/emnist:run_experiments

How can I distribute my program if it has dependencies?

So I recently wrote a chat bot which relies on Lua and LuaSocket to respond to a Twitch stream's chat. It's very basic and has various files it reads/writes to. It runs from the local computer. I finally got it working perfectly and now I'm interested in potentially distributing it to streamers who would get the most practical use out of it.
But I can't just give them the files and lua script; they wouldn't be able to run it. They would need an interpreter and they would need to set up luasocket. With very little experience this is a very daunting task. Even I struggled to properly get luasocket working to make this bot.
So my question: Is there a way to package the lua interpreter and luasocket library such that I can give my bot to other people in an easy to use and practical manner? Preferably a .exe file, but really anything that doesn't require them to go out and set up the entire language and script dependencies on their own.

What is a "debian policy" in regards to lua?

So, you can see this "policy" at this URL https://packages.debian.org/wheezy/devel/lua5.1-policy-dev
What exactly is a "debian policy" in this sense?
This sounds like Windows "group policies", which have never seemed like a good use of the word policy to me.
I can find lots of policy manuals, but this use of policy seems to mean something different.
A good start is the README, which states:
lua5.1-policy
This debian packages contains the debian policy for libraries related
to the lua5.1 scripting language.
The lua5.1-policy package contains the policy, in txt and html
format.
The lua5.1-policy-dev package is meant to ease the packager life. It
can be declared in the the build-depends field of a lua library
package and It provides the following utility and CDBS class:
lua5.1-policy-create-svnbuildpackage-layout
Should be used to initialise an svn repository for a new package.
lua.mk to be included in your rules file
I would call it guidelines rather than policy, but that is the Debian way of naming things. In short, it helps you adhere to the rules, i.e. the Debian Packaging Policy.

Dart: Possible to exchange source code on the fly in a running system?

In this article it says: "The Dart VM reads and executes source code, which means there is no compile step between edit and run." Does that mean you can exchange source code on the fly in a running Dart system, like in Erlang? Maybe the compiler is removed from the runtime system and then this is no longer possible; that's why I'm asking.
Dart is run "natively" only in Dartium, which is a flavour of Chrome with the Dart VM. When you develop an application you still need to compile it to JavaScript. This way you get a fast development lifecycle and in the end you can compile the code to JS. Because it's compiled code, there is much more room for the compiler to run optimisations on it. So from my perspective, the compiler is still there and I don't think you would be able to replace code at runtime.
You can send around source code and run it, but it would need to be in a separate isolate. Isolates do have some relationship to Erlang concepts.
The Dart VM doesn't support hot swapping (called "live edit" in V8). However, based on mailing list discussions, it sounds like this is something the authors do want to support in the future.
However, as the others have mentioned, it is possible to dynamically load code into another isolate.
