How do Sightly HTML files access global implicit objects in AEM 6 - sightly

Unlike CQ5, where global.jsp is included with every component JSP, Sightly does not pull in any such dependency. How does it actually access all the global objects? What is the backend process? And how does Sightly code compile to Java?

How does Sightly code compile to Java?
The Sling Sightly API has two bundles to support this. The first step is to compile the Sightly source into an Abstract Syntax Tree (an AST represents the source in tree form, which is more convenient and reliable to analyse and modify programmatically than text-based source). This is done by the Apache Sling Scripting Sightly Compiler.
The next step is to convert (transpile) the Abstract Syntax Tree into Java source code. This is done by the Apache Sling Scripting Sightly Java Compiler bundle.
How does it actually access all the global objects?
To understand this you need to understand how script resolution occurs in Sling and how resources are resolved to scripts, which is the core of the Sling scripting engine. For the basics of ScriptEngine, look at the Java docs; the Sightly implementation is SightlyScriptEngine.
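As a reminder of what the generic ScriptEngine contract looks like, here is a minimal JSR-223 sketch using only the standard javax.script API (the JavaScript engine, assumed to be available on your JVM, and the binding name are placeholders for illustration, not the actual Sightly registration):

import javax.script.Bindings;
import javax.script.ScriptEngine;
import javax.script.ScriptEngineManager;
import javax.script.SimpleBindings;

public class ScriptEngineSketch {
    public static void main(String[] args) throws Exception {
        // look up an engine by name; Sling performs an equivalent lookup
        // once a resource has been resolved to a script
        ScriptEngine engine = new ScriptEngineManager().getEngineByName("JavaScript");

        // bindings play the same role as Sightly's global objects:
        // named values made available to the script at evaluation time
        Bindings bindings = new SimpleBindings();
        bindings.put("greeting", "Hello from the host");

        // eval receives both the script source and the bindings
        Object result = engine.eval("greeting + '!'", bindings);
        System.out.println(result); // prints: Hello from the host!
    }
}

SightlyScriptEngine plugs into this same mechanism; the interesting question is who fills the bindings, which is what follows.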
The way script resolution works is that the resource is adapted to a DefaultSlingScript; this is done by the SlingScriptAdapterFactory.
The SlingScriptAdapterFactory holds references to BindingsValuesProvider services, which are passed to the DefaultSlingScript. One implementation of BindingsValuesProvider is the AEMSightlyBindingsValuesProvider (you can see it as a service in /system/console/services), which provides the default objects.
The DefaultSlingScript is then responsible for invoking the SightlyScriptEngine and calling its eval method, which populates the default objects in the bindings and sets those bindings as a request attribute.
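To make the provider side concrete, here is a hedged sketch of a custom BindingsValuesProvider built on the generic Sling scripting API (the component class and the object it adds are made up for illustration; only the interface and its addBindings method come from the Sling API):

import javax.script.Bindings;
import org.apache.sling.scripting.api.BindingsValuesProvider;
import org.osgi.service.component.annotations.Component;

// Contributes extra objects to every script evaluation; the
// AEMSightlyBindingsValuesProvider does conceptually the same thing for
// the default Sightly globals (properties, resource, currentPage, ...).
@Component(service = BindingsValuesProvider.class)
public class MyGlobalsProvider implements BindingsValuesProvider {

    @Override
    public void addBindings(Bindings bindings) {
        // hypothetical object, made visible to scripts under this name
        bindings.put("appVersion", "1.0.0");
    }
}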

Related

Dart Angular 2 annotations: how do they work?

I'm currently playing with the Dart version of Angular 2.
I have seen that the library is using a lot of metadata annotations, @Component for example.
I would like to know how those directives work.
I went to http://www.dartlang.org. They explain how to define an annotation but not how to use it to construct an object as it is done in angular.io.
Could someone explain how the magic works?
In Dart, annotations by themselves don't do anything other than exist alongside the code element where they are added.
At runtime:
You can use dart:mirrors to query the imported libraries for elements like fields, functions, classes, and parameters, and to read these annotations off them.
dart:mirrors is discouraged for browser applications. In this case you can use the reflectable package with quite similar capabilities.
See also:
https://www.dartlang.org/articles/reflection-with-mirrors/
https://api.dartlang.org/1.14.1/dart-mirrors/dart-mirrors-library.html
https://stackoverflow.com/search?q=%5Bdart-mirrors%5D+annotations
At build time:
You can create a transformer and register it in pubspec.yaml to be run by pub serve and pub build.
In this case the Dart analyzer can be utilized to query the source files for annotations and, like Angular does, modify the source code in a build step to add/replace/remove arbitrary code.
For more details about transformers
- https://www.dartlang.org/tools/pub/assets-and-transformers.html
- https://www.dartlang.org/tools/pub/transformers/
- https://www.dartlang.org/tools/pub/transformers/examples/
- https://www.dartlang.org/tools/pub/transformers/aggregate.html
- https://pub.dartlang.org/packages/code_transformers

XTSE1650: net.sf.saxon.trans.LicenseException: Requested feature (xsl:import-schema) requires Saxon-EE

I use Java with saxonee-9.5.1.6.jar included in the build path. When I run, I get these errors at different times.
Error at xsl:import-schema on line 6 column 169 of stylesheet.xslt:
XTSE1650: net.sf.saxon.trans.LicenseException: Requested feature (xsl:import-schema)
requires Saxon-EE
Error on line 1 column 1
SXXP0003: Error reported by XML parser: Content is not allowed in prolog.
javax.xml.transform.TransformerConfigurationException: Failed to compile stylesheet. 1 error detected.
I opened the .xslt file in a hex editor and don't see any unexpected characters at the beginning, and
I use TransformerFactory in a different project without getting any error.
Check what the implementation class of tFactory is. My guess is it is probably net.sf.saxon.TransformerFactoryImpl - which is basically the Saxon-HE version.
When you use JAXP like this, you're very exposed to configuration problems, because it loads whatever it finds sitting around on the classpath, or is affected by system property settings which could be set in parts of the application you know nothing about.
If your application depends on particular features, it's best to load a specific TransformerFactory, e.g. tFactory = new com.saxonica.config.EnterpriseTransformerFactory().
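A minimal sketch of both checking which factory JAXP gave you and forcing the EE one, assuming saxonee-9.5.1.6.jar and a valid Saxon-EE license are on the classpath (the file names are placeholders):

import java.io.File;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class FactoryCheck {
    public static void main(String[] args) throws Exception {
        // first, see which implementation JAXP picked up from the classpath
        System.out.println(TransformerFactory.newInstance().getClass().getName());

        // then explicitly ask for the schema-aware Saxon-EE factory
        TransformerFactory tFactory = new com.saxonica.config.EnterpriseTransformerFactory();
        Transformer transformer =
                tFactory.newTransformer(new StreamSource(new File("stylesheet.xslt")));
        transformer.transform(new StreamSource(new File("input.xml")),
                new StreamResult(System.out));
    }
}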
I don't know whether your stylesheet expects the source document to be validated against the schema, but if it does, note that this isn't automatic: you can set properties on the factory to make it happen.
I would recommend using Saxon's s9api interface rather than JAXP for this kind of thing. The JAXP interface was designed for XSLT 1.0, and it's a real stretch to use it for some of the new 2.0 features like schema-awareness: it can be done, but you keep running into limitations.
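A short s9api sketch of the same kind of transformation, assuming Saxon-EE is installed and licensed (the file names are placeholders):

import java.io.File;
import javax.xml.transform.stream.StreamSource;
import net.sf.saxon.s9api.Processor;
import net.sf.saxon.s9api.XsltCompiler;
import net.sf.saxon.s9api.XsltExecutable;
import net.sf.saxon.s9api.XsltTransformer;

public class S9apiTransform {
    public static void main(String[] args) throws Exception {
        // true = licensed (EE) configuration, which enables xsl:import-schema
        Processor processor = new Processor(true);

        XsltCompiler compiler = processor.newXsltCompiler();
        XsltExecutable executable =
                compiler.compile(new StreamSource(new File("stylesheet.xslt")));

        XsltTransformer transformer = executable.load();
        transformer.setSource(new StreamSource(new File("input.xml")));
        transformer.setDestination(processor.newSerializer(System.out));
        transformer.transform();
    }
}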

run-time evaluation of values in DelphiWebScript

My delphi application runs scripts using JvInterpreter (from the Jedi project).
A feature I use is runtime evaluation of expressions.
Script Example:
[...]
ShowMessage(X_SomeName);
[...]
JvInterpreter doesn't know X_SomeName.
When X_SomeName's value is required the scripter calls its OnGetValue-callback.
This points to a function I handle. There I lookup X_SomeName's value and return it.
Then JvInterpreter calls ShowMessage with the value I provided.
Now I consider switching to DelphiWebScript since it has a proper debug-interface and should also be faster than JvInterpreter.
Problem: I didn't find any obvious way to implement what JvInterpreter does with its OnGetValue/OnSetValue functions, though.
X_SomeName should be considered (and actually is, most of the time) a variable which is handled by the host application.
Any Ideas?
Thanks!
You can do that through the language extension mechanism, which has a FindUnknownName method that lets you register symbols on the spot.
It is used in the asm lib module demo, and you can also check the new "AutoExternalValues" test case in ULanguageExtensionTests, which should be closer to what you're after.

working with Xtext not as a plugin

In general I have my DSL as a plugin and I want to create a new app that uses my DSL,
so I tried to write this code:
JsonParser p = new JsonParser();
IParseResult r = p.parse(new StringReader("{}"));
// once that works it will be the file data instead of {}
but when I do the parse, the node model builder is null and the following line throws an exception:
return doParse(ruleName, in, nodeModelBuilder.get(), 0);
and I'm not sure how to initialize nodeModelBuilder.
I'm sure I'm missing some steps, but I'm not quite familiar with the Xtext process.
Thanks!
You already read the following answer on the Eclipse forum: you need to create an IParser instance by injecting it, so that all of its dependencies get injected as well. The necessary bindings are described in your JsonRuntimeModule. Xtext uses Guice and these modules to glue everything together; this pattern is called dependency injection.
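A rough standalone sketch of that idea, assuming your grammar is called Json so that Xtext generated a JsonStandaloneSetup class (adjust the class names to whatever was generated for your DSL):

import java.io.StringReader;
import com.google.inject.Injector;
import org.eclipse.xtext.parser.IParseResult;
import org.eclipse.xtext.parser.IParser;

public class StandaloneParsing {
    public static void main(String[] args) {
        // the generated *StandaloneSetup wires up Guice, EMF and the grammar
        Injector injector = new JsonStandaloneSetup().createInjectorAndDoEMFRegistration();

        // let Guice build the parser so that nodeModelBuilder and the other
        // internals are injected instead of being null
        IParser parser = injector.getInstance(IParser.class);

        IParseResult result = parser.parse(new StringReader("{}"));
        System.out.println("syntax errors: " + result.hasSyntaxErrors());
    }
}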
... I want to create a new app that use my dsl
So you want to use your Json DSL in standalone mode.
My suggestion:
Create a minimal Eclipse IApplication with a CLI that reads and parses an input file. The advantage of an Eclipse IApplication is that you can easily deploy a headless version of your DSL runtime. [1]
Have a look at your JsonInjectorProvider and the ParseHelper [2] from Xtext's JUnit support for examples how to use your DSL and Xtext runtime in standalone mode.
[1] http://www.eclipsezone.com/eclipse/forums/t99762.html
[2] org.eclipse.xtext.junit.util.ParseHelper
You are not supposed to call the parser directly. See:
http://wiki.eclipse.org/Xtext/FAQ#How_do_I_load_my_model_in_a_standalone_Java_application.C2.A0.3F
The code should look like:
Injector injector = new MyDslStandaloneSetup().createInjectorAndDoEMFRegistration();
XtextResourceSet resourceSet = injector.getInstance(XtextResourceSet.class);
resourceSet.addLoadOption(XtextResource.OPTION_RESOLVE_ALL, Boolean.TRUE);
// use org.eclipse.emf.common.util.URI here, not java.net.URI
Resource resource = resourceSet.getResource(URI.createFileURI("/../../some.json"), true);
Model modelRootElement = (Model) resource.getContents().get(0);
Replace MyDsl with 'JsonParser' or 'Json' or whatever your DSL name is. Look for the class JsonStandaloneSetup or JsonParserStandaloneSetup in your DSL source code; this class is generated when you create the Xtext project (or when you run the workflow for the first time, I'm not sure now). Replace Model with whatever your root element type is; it must be an EObject subclass.
The parsing/validation/AST building is done by the resource.getContents() call. Not very intuitive, I know. It is because you have to initialize context, all sorts of contexts in fact: the Guice context, the EMF context, and perhaps others, all encapsulated in the StandaloneSetup (and RuntimeModule). The context is similar to a Spring application context.
You need to use StandaloneSetup to run in standalone mode.
See this tutorial for help

statically analysing Lua code for potential errors

I'm using a closed-source application that loads Lua scripts and allows some customization through modifying these scripts. Unfortunately that application is not very good at generating useful log output (all I get is 'script failed') if something goes wrong in one of the Lua scripts.
I realize that dynamic languages are pretty much resistant to static code analysis in the way C++ code can be analyzed for example.
I was hoping, though, that there would be a tool that runs through a Lua script and, for example, warns about variables that have not been defined in the context of a particular script.
Essentially what I'm looking for is a tool that for a script:
local a
print(b)
would output:
warning: script.lua(1): local 'a' is not used
warning: script.lua(2): 'b' may not be defined
It can only really be warnings for most things, but that would still be useful! Does such a tool exist? Or maybe a Lua IDE with a feature like that built in?
Thanks, Chris
Automated static code analysis for Lua is not an easy task in general. However, for a limited set of practical problems it is quite doable.
Quick googling for "lua lint" yields these two tools: lua-checker and Lua lint.
You may want to roll your own tool for your specific needs however.
Metalua is one of the most powerful tools for static Lua code analysis. For example, please see metalint, the tool for global variable usage analysis.
Please do not hesitate to post your question on Metalua mailing list. People there are usually very helpful.
There is also lua-inspect, which is based on metalua that was already mentioned. I've integrated it into ZeroBrane Studio IDE, which generates an output very similar to what you'd expect. See this SO answer for details: https://stackoverflow.com/a/11789348/1442917.
For checking globals, see this lua-l posting. Checking locals is harder.
You need to find a parser for Lua (one should be available as open source) and use it to parse the script into a proper AST. Use that tree and a simple variable visibility tracker to find out when a variable is or isn't defined.
Usually the scoping rules are simple:
- start with the top AST node and an empty scope
- look at the child statements for that node; every variable declaration should be added to the current scope
- if a new scope starts (for example via a { operator), create a new variable scope inheriting the variables of the current scope
- when a scope ends (for example via }), remove the current child variable scope and return to the parent
- iterate carefully
This will tell you which variables are visible where inside the AST. If you also inspect the expression AST nodes (reads and writes of variables), you can derive the information you are after.
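A small, parser-agnostic sketch of that visibility tracking in Java (the actual AST walk is omitted; the calls in main stand in for visiting the nodes of the example script above):

import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashSet;
import java.util.Set;

// Tracks variable visibility while walking an AST: push a scope when a
// block starts, pop it when the block ends, and search the stack to
// decide whether a name is defined at the point where it is read.
public class ScopeTracker {
    private final Deque<Set<String>> scopes = new ArrayDeque<>();

    public void enterScope()         { scopes.push(new HashSet<>()); }
    public void exitScope()          { scopes.pop(); }
    public void declare(String name) { scopes.peek().add(name); }

    public boolean isVisible(String name) {
        return scopes.stream().anyMatch(scope -> scope.contains(name));
    }

    public static void main(String[] args) {
        ScopeTracker tracker = new ScopeTracker();
        tracker.enterScope();                       // top of the script
        tracker.declare("a");                       // local a
        System.out.println(tracker.isVisible("a")); // true
        System.out.println(tracker.isVisible("b")); // false -> "'b' may not be defined"
        tracker.exitScope();
    }
}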
I just started using luacheck and it is excellent!
The first release was from 2015.
