Is this possible, and if so, how would I go about it?
I have some subgenerators which install Express, WordPress, or Drupal. They also let you name the install folder and the asset folders (css/js/images). I'd like to pass those folder names to the parent generator for templating.
Variables can be passed through options/arguments when using composition. This only flows from the parent to the children.
There's no way for the children to pass back values to the parent generator. This is by design.
A better way is to rely on configuration files like package.json, or to infer settings from the content of files in the destination folder. This is better because it means your generator is loosely coupled and can match the architecture of any existing project or any other generators your end user might want to compose with.
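For instance, the parent can read settings straight out of the destination folder instead of being told by a child. A minimal sketch, assuming the folder names ended up under a hypothetical config key in package.json:

    // in the parent generator (yeoman-generator class style)
    const Generator = require('yeoman-generator');

    module.exports = class extends Generator {
      writing() {
        // read settings left behind in the destination folder; fall back to defaults
        const pkg = this.fs.readJSON(this.destinationPath('package.json'), {});
        const assetsDir = (pkg.config && pkg.config.assetsDir) || 'assets';
        this.fs.copyTpl(
          this.templatePath('gulpfile.js'),
          this.destinationPath('gulpfile.js'),
          { assetsDir }
        );
      }
    };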
I am working on a project where we are rewriting the interfacing of an existing application, porting everything to Swagger/OpenAPI.
Right now, each feature has its own yml file, which is a standalone spec. But there are some drawbacks:
duplicated content in the yml files (e.g. models which could be shared across files)
duplicated program code (which is generated from those yml files).
having to process each yml file individually when using tools.
Ideally we would like to have a separate folder for each service, with the models and service description for that specific service close together, but separated from the other services. Of course there are also shared models, which we then want in a different folder (e.g. "/shared-models"). And finally we want all those files to be included by one main root yml file.
So, we have been looking at splitting/importing files with a $ref attribute. But it is tricky to come up with a full-scale file and folder structure, because the spec seems to allow usage of $ref in some places, but not all. You can't just split and structure files any way you like. So, we will probably need some kind of trade-off.
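For example, the kind of splitting we have in mind looks roughly like this (a sketch only, not a validated spec; file names and pointers are made up):

    # main.yml (root file)
    swagger: "2.0"
    paths:
      /users:
        $ref: "services/users/users.yml#/UsersPath"
    definitions:
      Address:
        $ref: "shared-models/address.yml#/Address"

    # services/users/users.yml
    UsersPath:
      get:
        responses:
          "200": { description: "OK" }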
I was especially wondering how other companies do this setup. (e.g. an example of a setup that uses an enterprise-level structure of Swagger files would be excellent.) We like to keep things simple and, whenever possible, according to standards or popular conventions.
(For clarity: my question is not: "how to use $ref")
I need to store Houdini *.hda files on a network share.
This folder needs to be sourced by all users.
Usually, for this kind of request, I use an environment variable in ~/houdini17.0/houdini.env, for example:
HOUDINI_TEMP_DIR="/my/custom/temp/path"
But the issue is that I can't find a solution for hda/otl files.
Adding it to HOUDINI_PATH="${HOUDINI_PATH};/my/custom/hda/path" or HOUDINI_OTLSCAN_PATH doesn't work, and worse, it seems to break other links, since a few other Houdini nodes aren't available anymore.
Can someone point me to the right environment variables?
Try using the $HSITE and/or $JOB environment variables. Houdini will scan subfolders of the paths defined by $HSITE and $JOB for all relevant files and folders, so you don't need to set a bunch of different env vars. You can mirror the folder structure found in C:\Users\username\Documents\houdini16.5.
Obviously, replace the Houdini version with yours. Also note that $HSITE needs to point to the folder that contains the houdini16.5 folder, not the folder itself. This way you can support multiple Houdini versions with a single env var.
http://www.sidefx.com/docs/houdini/basics/config.html
For example, if $HSITE=//myNetworkShare/Houdini, you would need this folder structure:

    //myNetworkShare/Houdini
        /houdini16.5
            /otls
            /scripts
            /python2.7libs
            /.....
Note you can only give $HSITE a single path.
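With that in place, each user's ~/houdini17.0/houdini.env would only need a single line along these lines (the share path is illustrative):

    # houdini.env
    HSITE = "//myNetworkShare/Houdini"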
When I create a project, I first create a project.json file (used internally, not as part of some package). Then for some projects I run yeoman with our own custom generators.
What I would like is for Yeoman to pick up my project.json and read that instead of prompting me for settings.
Is this possible, and if so, how do I do this? Basically I think I need to know:
How do I load the file from the project root into index.js (and defer to prompts if it doesn't exist)?
How do I bind the properties from the JSON to 'this'?
A Yeoman generator is just a searchable and composable Node.js project running within the context of Yeoman.
As such, it can do anything.
So, if you want to write a generator that reads a project.json configuration file to know what to generate, well, just do it!
http://yeoman.io/authoring/
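As a minimal sketch of both steps (the file name, prompt fields, and property names are assumptions, not a fixed convention):

    // index.js of your custom generator
    const Generator = require('yeoman-generator');

    module.exports = class extends Generator {
      async prompting() {
        // 1. try to load project.json from the project root (destination folder)
        const fromFile = this.fs.readJSON(this.destinationPath('project.json'), null);
        if (fromFile) {
          this.props = fromFile; // 2. bind the parsed JSON onto `this`
          return;
        }
        // 3. defer to prompts when the file doesn't exist
        this.props = await this.prompt([
          { type: 'input', name: 'projectName', message: 'Project name?' }
        ]);
      }

      writing() {
        // this.props.projectName is now available for templating
      }
    };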
I can use #+INCLUDE: to include an org file in another org file, which allows me to assemble, say, a website from various org files. I'm exporting with the C-c C-e exporter in org-mode 7.5.
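For example, a master file might be little more than a list of includes (file names are illustrative):

    #+INCLUDE: "chapters/intro.org"
    #+INCLUDE: "chapters/install.org"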
I could maintain a quite complex publication this way. This modular approach is quite common in, e.g. LaTeX and Texinfo publications.
However, links to images no longer work from the #+INCLUDEd org files. What seems to be happening is that the path to the images is resolved relative to the org file I am exporting from, rather than to the org file that actually references the image.
The only ways I can see to resolve this are to:
use a flat file structure; or
make the image paths relative to the including file (which I might not know in advance) rather than to the file that references them.
Neither of these is really sustainable.
How do I tell org to use the correct image path from its own relevant org file rather than the parent org file?
From what I know of the exporter, INCLUDE files are inserted into the document before export. Therefore the content is part of the document before the exporter starts following paths to resolve any links to files (images).
After a bit of testing, you will likely need to use absolute file paths. Since you move between Windows and Linux, your best bet would be to use a consistent scheme on both, starting from your home directory.
That way you can write the Org link as [[~/path/to/image.jpg]], which will work on both systems (assuming you have set %HOME% on Windows).
Option 1 is potentially an alternative (although I agree it wouldn't be ideal at all), whereas the second option would have obvious pitfalls if you INCLUDE the file in more than one future document.
I would like to create a simple file repository in Ruby on Rails. Users have their own accounts, and after logging in they can upload a file or download files previously uploaded.
The issue here is the security. Files should be safe and not available to anyone but the owners.
Where, in which folder, should I store the files, to make them as safe as possible?
Does it make sense to rename the uploaded files, store the names in a database, and restore them when needed? This might help avoid name conflicts, though I'm not sure if it's a good idea.
Should the files be stored all in one folder, or should they be somewhat divided?
rename the files, for one reason: you have no way to know whether today's file "test" is supposed to replace last week's "test" or not (perhaps the user had them in different directories)
give each user their own directory; this prevents performance problems and makes it easy to migrate, archive, or delete a single user
put metadata in the database and files in the file system
look out for code injection via file name
This is an interesting question. Depending on the level of security you want to apply I would recommend the following:
Choose a folder that is only accessible by your app server (if you chose to store in the FS)
I would always recommend renaming the files to a randomly generated hash (or an incrementally generated name like those used in URL shorteners; see the open-source implementation of rubyurl). However, I wouldn't store the files themselves in a database, because filesystems are built for handling files, so let the filesystem do that job. You should store the metadata in the database so you can set the right file name when the user downloads the file.
You should partition the files among multiple folders. This gives you multiple advantages. First, filesystems are not built to handle millions of files in a single folder, and operations that fetch all files from a folder take significantly more time. If you obfuscate the original file name, you can create one directory per leading character of the generated name and get a fairly even distribution of files per directory.
One last thing to consider is the possible collision of file names. A user should not be able to guess a filename from another user. So you might need some additional checks here.
Depending on the level of security you want to achieve you can apply more and more patterns.
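As a minimal sketch of the random-name-plus-partitioning idea (the method name and paths are illustrative, not from any particular library):

    require 'securerandom'
    require 'fileutils'

    # Store an uploaded IO under a random name, partitioned by its first characters.
    # Returns the token to persist in the database next to the original filename.
    def store_upload(io, upload_root)
      token = SecureRandom.hex(16)                 # unguessable, collision-resistant name
      dir = File.join(upload_root, token[0, 2], token[2, 2])
      FileUtils.mkdir_p(dir)                       # e.g. uploads/ab/cd/
      File.open(File.join(dir, token), 'wb') { |f| f.write(io.read) }
      token
    end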
Just don't save the files in the public folder; create a controller that will send the files instead.
How you want to organise things from that point on is your choice. You could make a subfolder per user. There is no need to rename from a security point of view, but do try to clean up the filename; spaces and non-ASCII characters make things harder.
For simple cases (where you don't want to distribute the file store):
Store the files in the tmp directory. DON'T store them in public. Then only expose these files via a route and controller where you do the authentication/authorisation checks.
I don't see any reason to rename the files; you can separate them into subdirectories based on the user ID. But if you want to allow uploading of files with the same name, then you may need to generate a unique hash or something for each file's name.
See above. You can partition them any way you see fit. But I would definitely recommend partitioning them and not lumping them in one directory.
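A sketch of the controller-based download both answers describe; the model, its columns, and the auth helper (e.g. Devise's authenticate_user!) are assumptions:

    class UploadsController < ApplicationController
      before_action :authenticate_user!        # swap in your own auth check

      def show
        # scoping the lookup to current_user enforces ownership
        upload = current_user.uploads.find(params[:id])
        send_file upload.disk_path,
                  filename: upload.original_name,
                  disposition: 'attachment'
      end
    end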