How to define the step definitions location for Cucumber in IntelliJ 12 - Grails

I have my feature files in src/resources/com/features and my step definitions in src/main/java/com/step_definitions.
My tests all run correctly, but IntelliJ refuses to see where the step definitions are, even if I ask it to create a new one. Where is this configured?

I was just tearing my hair out with exactly the same problem (for the record my background is Java, Ruby, Cucumber and RubyMine but I'm completely new to IntelliJ and Cucumber-JVM).
In the Cucumber-JVM run configuration you must specify the package where the step definitions are stored in the glue field as mentioned in the IntelliJ documentation. IntelliJ - for me at least - does not seem to provide a default value.
To elaborate further, a very basic project looks like this:
Example
└───src
    ├───main
    │   └───java
    └───test
        ├───java
        │   └───com
        │       └───bensnape
        │           └───example
        │                   MyStepdefs.java
        └───resources
                example.feature
The glue value here would be com.bensnape.example.
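For reference, a minimal step definition class in that package might look like this (a sketch for Cucumber-JVM 1.1.x; the step text is illustrative):

package com.bensnape.example;

import cucumber.api.java.en.Given;

public class MyStepdefs {

    // A trivial step definition; IntelliJ resolves it through the glue package.
    @Given("^I have a working Cucumber-JVM setup$")
    public void iHaveAWorkingCucumberJvmSetup() {
        System.out.println("step executed");
    }
}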
Update
After playing with IntelliJ some more this morning, it seems that it does provide the glue value for you if you adhere to the Cucumber-JVM conventions - i.e. the features must live under src/test/resources/<package> and similarly, the steps must live under src/test/java/<package>.
Example project tree:
Example
└───src
    ├───main
    │   └───java
    └───test
        ├───java
        │   └───com
        │       └───bensnape
        │           └───example
        │                   MyStepdefs.java
        │
        └───resources
            └───com
                └───bensnape
                    └───example
                            example.feature

If you mark the folder as a (test) source root (right-click the folder -> Mark Directory As -> (Test) Source Root), it will work as well.
My grails project is set up like this:
test
├───cucumber
│   ├───steps
│   ├───support
│   └───features
└───unit
The unit folder was already marked as a test source root; after also marking cucumber as one, the step definitions were parsed correctly.

The default convention is to have step definitions defined in a step_definitions sub-folder under the features directory. The name of the sub-folder isn't important; it will work the same whatever the name is.
My guess is that an IDE would follow the same convention, and hence IntelliJ should execute the features correctly if the step_definitions folder is moved under the features folder.
The cucumber command takes a -r option to require files before executing the features. This option can be used to make it look for step_definitions in a non-conventional place.
I am guessing you may have -r src/main/java/com/step_definitions in your local configuration so that cucumber can see these step definitions when invoked from the command line.
Running cucumber --verbose shows where the command line is finding the step definition code.
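For example, with the paths from the question:

cucumber -r src/main/java/com/step_definitions src/resources/com/features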

You can set the glue location globally by opening "Edit Configurations -> Defaults -> Cucumber Java -> Glue" and add the package names.
(IntelliJ 12.1.4)

I was having the same problem with version 12.1.3. My folders and files are set out in the standard way as described by the other answers and marking the code as a test source did not resolve it.
Then I updated to 12.1.6 and the problem was fixed :)

Related

With Bazel's `http_archive` - is there a way to add already existing files to the extracted sources?

With Bazel I'm building an external library using http_archive together with
some patches which bring additional features:
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

http_archive(
    name = "some-lib",
    build_file = "@my-project//some-lib:BUILD.some-lib",
    url = "https://example/download/some-lib.tar.gz",
    sha256 = "8d9405baf113a9f25e4fb961d56f9f231da02e3ada0f41dbb0fa4654534f717b",
    patches = [
        "//some-lib/patches:01-add-additional-features.dif",
    ],
    patch_args = ["-p1"],
    patch_tool = "patch",
)
The file structure looks like this:
some-lib
├── BUILD
├── BUILD.some-lib
├── include/additional_features.h
├── some-lib.bzl
└── patches
    ├── 01-add-additional-features.dif
    └── BUILD
This basically works but I still struggle with adding include/additional_features.h
into the extracted source folder.
I first tried to just list the file in the filegroup I use to later run
configure_make like this:
filegroup(
    name = "all_srcs",
    srcs = glob(["**"]) + ["@my-project//some-lib:include/additional_features.h"],
)
then I'm getting
no such target '//some-lib:includes/additional_features.h': target 'includes/additional_features.h' not declared in package 'some-lib'; however, a source file of this name exists.
My next idea was to use the tools http_archive provides to make the file part of the source folder.
While you can use patches to modify the extracted folder, you'd need a dedicated dif file just to create the extra header file, which you would then have to generate in advance (i.e. via a dedicated Bazel rule), declare as a dependency, etc., which I'd like to avoid to keep things simple.
There is also patch_cmds which, next to patches, can be used to modify the extracted source folder by running arbitrary bash commands, so I tried something like this:
patch_cmds = [
    "cp @my-project//some-lib:include/additional_features.h include/",
],
but this does not work for me, I'm getting
Error in fail: Error applying patch command cp @my-project//some-lib:include/additional_features.h include/:
cp: cannot stat '@my-project//some-lib:include/additional_features.h': No such file or directory
So it looks like the syntax for specifying a path like I do with build_file does
not work with patch_cmds or that file can't be accessed at that specific stage.
Does one of the approaches I tried actually work and I just didn't use the right syntax?
What's the Bazel way to add (a bunch of) readily available files (i.e. files in the same repository as the Bazel rules I provide) to an http_archive based source directory?
Try putting exports_files(["include/additional_features.h"], visibility=["//visibility:public"]) in some-lib/BUILD, so that Bazel will let you reference source files from the //some-lib package in the external repository (and elsewhere).
I think the "no such target" error message even hints at exports_files, since it notes that a source file of this name exists.
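A minimal sketch of the two files involved, assuming the layout from the question (the my-project workspace name and the filegroup are taken from the question):

# some-lib/BUILD, in the main repository: make the header referenceable from elsewhere
exports_files(
    ["include/additional_features.h"],
    visibility = ["//visibility:public"],
)

# BUILD.some-lib, used as the build_file for the external repository
filegroup(
    name = "all_srcs",
    srcs = glob(["**"]) + ["@my-project//some-lib:include/additional_features.h"],
)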

How to generate package.xml automatically for salesforce source org

I would like to move from one Salesforce Dev org to another Dev org using the ANT Migration Tool. I would like to auto-generate a package.xml file that covers all custom fields, custom objects and other custom components, which would help me move easily from the source org to the target org without running into dependency issues.
There are a lot of possible answers here.
I would like to rank those answers.
The simplest way, I think, is to consider using Ben Edwards's Heroku service https://packagebuilder.herokuapp.com/
Another option is to use the npm module provided by Matthias Rolke.
To grab a full package.xml use force-dev-tool, see: https://github.com/amtrack/force-dev-tool.
npm install --global force-dev-tool
force-dev-tool remote add mydev user pass --default
force-dev-tool fetch --progress
force-dev-tool package -a
You will now have a full src/package.xml.
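For reference, the generated manifest is a standard metadata package.xml; a minimal one looks roughly like this (the types listed and the API version are illustrative):

<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <types>
        <members>*</members>
        <name>CustomObject</name>
    </types>
    <types>
        <members>*</members>
        <name>ApexClass</name>
    </types>
    <version>29.0</version>
</Package>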
Jar file provided by Kim Galant
Here's a ready-made Java JAR that you point at an org (through properties files) and tell which metadata types to look for; it then goes off, inventories your org, and builds a package.xml for you based on the types you've specified. It even has a handy feature that lets you skip certain things based on a regular expression, so you can easily exclude, e.g., managed packages or some custom namespaces (say you prefix a bunch of things that belong together with CRM_) from the generated package.
So a command line like this:
java -jar PackageBuilder.jar [-o <parameter file1>,<parameterfile2>,...] [-u <SF username>] [-p <SF password>] [-s <SF url>] [-a <apiversion>] [-mi <metadataType1>,<metadataType2>,...] [-sp <pattern1>,<pattern2>,...] [-d <destinationpath>] [-v]
will spit out a nice up-to-date package.xml for your ANT pleasure.
Another way is to use Ant: https://www.jitendrazaa.com/blog/salesforce/auto-generate-package-xml-using-ant-complete-source-code-and-video/
I had an idea to create a competing service to the aforementioned ones, but I dropped that project (I didn't finish the part that retrieves all the components from reports and dashboards).
There is an extension for VS Code that allows you to choose components and generate a package.xml file using point and click:
Salesforce Package.xml generator for VS Code
https://marketplace.visualstudio.com/items?itemName=VignaeshRamA.sfdx-package-xml-generator
I am affiliated with this free VS Code extension as its developer.

Dart scripts that invoke scripts by importing them

I have this setup:
├── bin
│   ├── all.dart
│   ├── details
│   │   ├── script1.dart
│   │   ├── script2.dart
│   │   └── ...
all.dart simply imports script1.dart and script2.dart and calls their main. The goal is to have a bunch of scripts under details that can be run individually. Additionally I want a separate all.dart script that can run them all at once. This will make debugging individual scripts simpler, yet still allowing all to run.
all.dart
import 'details/script1.dart' as script1;
import 'details/script2.dart' as script2;

main() {
  script1.main();
  script2.main();
}
script1.dart
main() => print('script1 run');
script2.dart
main() => print('script2 run');
So, this is working and I see the expected print statements when running all.dart, but I have two issues.
First, I have to softlink packages under details. Apparently pub does not propagate the packages symlinks down to subfolders. Is this expected, or is there a workaround?
Second, there are errors flagged in all.dart at the point of the second import statement. The analyzer error is:
The imported libraries 'script1.dart' and 'script2.dart' should not have the same name ''
So my guess is that since I'm importing other scripts as if they were libraries, and since they do not have a library script[12]; statement at the top, they both have the same name - the empty name?
Note: Originally I had all of these under lib and I could run them as scripts by specifying a suitable --package-root on the command line, even though they were libraries with a main. But then to debug I need to run in Dart Editor, which is why I'm moving them to bin. Perhaps the editor should allow libraries under lib with a main to be run as scripts, since they run outside the editor just fine? The actual difference between script and library seems a bit unnecessary (as other scripting languages allow files to be both).
How do I clean this up?
I'm not sure what the actual question is.
If a library has no library statement, then the empty string is used as its name.
Just add a library statement with a unique name to fix this.
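For example, giving each script an explicit, unique library name:

// script1.dart
library script1;
main() => print('script1 run');

// script2.dart
library script2;
main() => print('script2 run');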
Adding symlinks to subdirectories solves the problem with the imports for scripts in subdirectories.
I do this regularly.
It was mentioned several times on dartbug.com that symlinks should go away entirely, but I have no idea how long this will take.
I have never tried to put script files with a main in lib, but it is against the package layout conventions, and I guess this is why DartEditor doesn't support it.

How to generate language fragment bundles to localize Carbon products

Following the blog How to generate language fragment bundles to localize Carbon products by Tanya Madurapperuma, I am having the following problem: after generating the language bundles with the ant localize command, the bundles end up in the CARBON_HOME/repository/components/dropins/ folder, but when I run the tool the language does not change to Spanish. I would appreciate help in figuring out what I may be missing.
Note: All resources.properties files are translated into Spanish.
If you have the jars with the translated resources.properties files in your dropins folder, you need to restart the server and set the locale of your browser to Spanish.
The locale should be changed in the browser; the server will then pick the matching resources files to use.
UPDATE:
There are some problems here.
First, there's a bug when there are multiple directories in the resources directory. For now, make sure that you have only one directory inside the resources directory when you run the localize task.
You should have the properties files inside a directory named after the bundle, without the tree structure. So your resources directory should look like this:
../resources/
└── org.wso2.carbon.i18n_4.2.0
    ├── JSResources_es.properties
    └── Resources_es.properties
You need to include the locale code, _es, in your file names as shown above.
Also, the localize tool seems to append i18n at the end of the folder structure of the built jar. This works for UI bundles, but in the case of org.wso2.carbon.i18n it comes out as org/wso2/carbon/i18n/i18n. So open the built jar in the dropins folder and remove the extra i18n folder, so that the jar's tree structure looks like the following:
../repository/components/dropins/org.wso2.carbon.i18n.languageBundle_4.2.0.jar
├── META-INF
│   └── MANIFEST.MF
└── org
    └── wso2
        └── carbon
            └── i18n
                ├── JSResources_es.properties
                └── Resources_es.properties
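One way to do that repacking from the command line (a sketch using the standard jar tool; the paths and bundle name follow the example above):

cd repository/components/dropins
mkdir tmp && cd tmp
jar xf ../org.wso2.carbon.i18n.languageBundle_4.2.0.jar
# move the properties files up one level and drop the extra i18n folder
mv org/wso2/carbon/i18n/i18n/* org/wso2/carbon/i18n/
rmdir org/wso2/carbon/i18n/i18n
jar cfm ../org.wso2.carbon.i18n.languageBundle_4.2.0.jar META-INF/MANIFEST.MF org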
Did you get this to work?
The place where I suspect you might have gone wrong is the folder structure in the resources folder. (You can place your resource files anywhere and execute the command as ant localize -Dresources.directory=path_to_your_resources_directory.)
Also note that a resources folder should follow the proper naming convention of the OSGi bundle.
Ex: org.wso2.carbon.claim.mgt.ui_4.2.0 (this entire thing is the name of the folder)
If you still couldn't get this to work, mail your resources folder to tanyamadurapperuma@gmail.com

Include *.sty file from a super/subdirectory of main *.tex file

I want to share a LaTeX document via git with many other people.
Therefore we decided to put all the special .sty files that are not present in everyone's LaTeX installation into a resources directory. Ideally this directory would be a parent directory of the actual working directory.
How exactly can I import those style files?
It is important that even the dependencies of those remote styles are resolved with other remote styles.
You can import a style file (mystyle.sty) into your document in two ways:
If you have it in your path or in the same folder as the .tex file, simply include this line in your preamble: \usepackage{mystyle}
If you have it in a different folder, you can access it using its full path: \usepackage{/path/to/folder/mystyle}
That said, if you're not sure whether the style file is in everyone's installation, simply include it in the same directory and make sure you do git add mystyle.sty and track it along with the rest of your files (although most likely there won't be any changes to it). There is no need for a parent directory. But if you insist on a different directory, see option 2 above.
It would be better to use a subdirectory than a parent directory, as you can still call the file with \usepackage{subdir/mystyle} and be certain that you are invoking your own style file. However, if you escape out to the parent directory, you never know whether the other users have a similarly named folder that is not part of your package, which can result in errors.
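A minimal sketch of the subdirectory variant (the styles/ directory name is illustrative):

\documentclass{article}
% mystyle.sty is tracked in the repository under ./styles/
\usepackage{styles/mystyle}
\begin{document}
Hello, world.
\end{document}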
This probably isn't relevant to you any more, but here is another way to do what you want.
Set up your git repository like this:
mystyle.sty
project/
    makefile
    project.tex
and put \usepackage{mystyle} in the preamble of project.tex.
Compiling project.tex manually won't work, of course, because mystyle.sty is not in the same directory as project.tex.
However, if makefile contains something along the lines of:
project.pdf: mystyle.sty project.tex
	pdflatex project

mystyle.sty: ../mystyle.sty
	cp ../$@ $@
then running make from within the project directory will cause mystyle.sty to be copied to the correct place before project.tex is (this time successfully) compiled.
This way might seem a little bit over the top, but it does combine the best features of other methods.
If several projects in the same repository require mystyle.sty then having a common mystyle.sty sitting above them all makes more sense than having a copy in each project directory; all these copies would have to be maintained.
The compilation is portable, in the sense that if you gave me your copies of mystyle.sty and project.tex then I would (in theory at least) be able to compile manually without needing to modify the files you gave me.
For example, I would not have to replace \usepackage{/your/path/mystyle} with \usepackage{/my/path/mystyle}.
You can use Makefiles as suggested above. Another option is CMake. I didn't test it with parent directories.
If you have the following file structure:
├── CMakeLists.txt
├── cmake
│   └── UseLATEX.cmake
├── img
│   └── logo.jpg
├── lib
│   └── framed.sty
└── main.tex
You should have CMake installed; instructions are on CMake resources.
UseLATEX.cmake can be downloaded from here
then inside the CMakeLists.txt
╚═$ cat CMakeLists.txt
cmake_minimum_required(VERSION 2.6)
set(PROJECT_NAME_STR myProject)
project(${PROJECT_NAME_STR})
set(CMAKE_MODULE_PATH "${CMAKE_CURRENT_SOURCE_DIR}/cmake")
include(UseLATEX)
ADD_LATEX_DOCUMENT(main.tex
    IMAGE_DIRS img
    DEFAULT_PDF
    MANGLE_TARGET_NAMES)
Some example content for main.tex (note the image)
╚═$ cat main.tex
\documentclass{report}
\begin{document}
\begin{center}
\includegraphics[width=300px]{img/logo.jpg}
\end{center}
\end{document}
The lib directory has the *.sty files
You can now compile:
cd /directory/that/has/CMakeLists.txt/
mkdir build
cd build
cmake ..
make
you can then view main.pdf which is in the build directory.
When you use a TeX distribution that uses kpathsea, you can use the TEXINPUTS environment variable to specify where TeX looks for files. The variable needs to be used in the following way.
The paths in TEXINPUTS are separated by :. An empty path will include the default search paths, i.e., write just the colon. Two consecutive slashes mean that the directory and all its sub-directories are searched.
Thus, e.g., to build a file document.pdf which uses files in the current directory, all sub-directories of the resources directory and the default directories, you can use the following Makefile.
document.pdf: document.tex
	TEXINPUTS=.:./resources//: pdflatex document.tex
To speed up the filename lookup, you can build a ls-R database using the mktexlsr command.
For all the details on kpathsea take a look at the manual.
You can use latexmk and its facilities.
There is a feature documented under Utility subroutines (page 48 here in the latexmk documentation) which can update TEXINPUTS during a run. If you can consider using the .latexmkrc file to configure your chain and options, you can add ensure_path() to the file.
Here is an example:
# .latexmkrc
ensure_path('TEXINPUTS', './path/to/something//', '/full/path/to/something/else//')
# [...] Other options goes here.
$pdf_update_method = 3;
$xelatex = 'xelatex -synctex=1 -interaction=nonstopmode -file-line-error %O %S';
$pdf_previewer = 'start "%ProgramFiles%/SumatraPDF/SumatraPDF.exe" %O %S';
$out_dir = 'build/';
Notice the // at the end of a path; this will aid LaTeX in searching for files in the specified directory and in all its subdirectories.
Please note that while this is an amazing feature, you need to take good care with your naming scheme: if you use the same file name in several places, you can run into trouble when importing files with, say, \include{somefile}.
