In org-mode, how do I keep the original path to images when using #+INCLUDE:?

I can use
#+INCLUDE:
to include an org file in another org file, which allows me to assemble, say, a website from various org files. I'm exporting with the C-c C-e exporter in org-mode 7.5.
I could maintain quite a complex publication this way. This modular approach is common in, e.g., LaTeX and Texinfo publications.
However, links to images no longer work from the #+INCLUDEd org files. What seems to be happening is that image paths are resolved relative to the org file I am exporting from, rather than the org file that actually references the image.
The only ways I can see to resolve this are to:
use a flat file structure; or
make the image paths relative to the including file (which I might not know in advance) rather than to the file that contains them.
Neither of these is really sustainable.
How do I tell org to resolve image paths relative to the org file that actually contains them, rather than the parent org file?

From what I know of the exporter, #+INCLUDE files are inserted into the document before export. The included content is therefore already part of the parent document before the exporter starts resolving paths to any linked files (images).
After a bit of testing, it looks like you will need to use absolute file paths. Since you move between Windows and Linux, your best bet is a consistent scheme on both systems, starting from your home directory.
That way you can write the Org link [[~/path/to/image.jpg]], which will work on both systems (assuming you have set %HOME% on Windows).
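For illustration, a minimal sketch (file and directory names here are hypothetical). The parent file contains:

    #+INCLUDE: "chapters/chapter1.org"

and chapters/chapter1.org references its image with a home-relative link:

    [[~/site/chapters/images/figure1.png]]

Because the link is anchored at the home directory, it resolves the same way whether chapter1.org is exported on its own or pulled into the parent document.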
Your option 1 is potentially an alternative (although I agree it wouldn't be ideal at all), whereas the second option has obvious pitfalls if you #+INCLUDE the file in more than one future document.

Related

Setup for multiple swagger API files

I am working on a project where we are rewriting the interfaces of an existing application, porting everything to swagger/openAPI.
Right now, each feature has its own yml file, which is a standalone spec. But there are some drawbacks:
duplicated content in the yml files (e.g. models which could be shared across files)
duplicated program code (which is generated from those yml files).
having to process each yml file individually when using tools.
Ideally we would like to have a separate folder for each service, with the models and service description for that specific service close together, but separated from the other services. Of course there are also shared models, which we then want in a different folder (e.g. "/shared-models"). And finally we want all those files to be included by one main root yml file.
So, we have been looking at splitting/importing files with a $ref attribute. But it is tricky to come up with a full-scale file and folder structure, because the spec seems to allow $ref in some places but not others. You can't just split and structure the files any way you like, so we will probably need some kind of trade-off.
I was especially wondering how other companies do this setup. (An example of a setup that uses an enterprise-level structure of swagger files would be excellent.) We like to keep things simple and, whenever possible, according to standards or popular conventions.
(For clarity: my question is not: "how to use $ref")
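For concreteness, the kind of layout described above might look like this (all names hypothetical; the spec only permits $ref at certain points, such as path items, schemas, parameters and responses, which is what forces the trade-offs):

    api/
      main.yml              root file that $refs the service paths
      shared-models/
        user.yml            models shared across services
      services/
        orders/
          service.yml       paths and operations for the orders service
          models.yml        models private to the orders service

main.yml would pull in path items by reference:

    paths:
      /orders:
        $ref: './services/orders/service.yml#/paths/~1orders'

and a service file would point at a shared model with a relative reference:

    schema:
      $ref: '../../shared-models/user.yml#/User'

(The ~1 is the JSON-pointer escape for / in the path name.)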

How to describe logical paths that include looking inside zip and tar.gz files?

I want to build a list of all the files on a large disk, including the files that are inside container formats like tar files and zip files.
So I'd like to know if there already exists a notation for describing what you might call "logical" filesystem paths, which includes looking inside zip and tar.gz files?
For example, if I had a directory named a.dir that includes the file b.zip, and that file was a compressed version of b.txt, then I'd imagine a notation describing the location of b.txt that would look something like
a.dir/b.zip/b.txt
However, I am not sure if that would always work, and it doesn't really tell you that b.zip is a zip file.
I am looking for a simple syntax that will identify common compression/container formats (zip, tar, tar.gz, tar.bz2, etc), handle nested compressed files, and handle compressed files with absolute or relative paths. Paths must uniquely identify one file (there can be no ambiguity), and it should work across a range of filesystems. Identifying symlinks would be a bonus.
I am NOT looking for a syntax that unix commands would understand or that programs would be able to open directly. The syntax does not need to explain how to access those files. (However, these are the obvious next steps.)
Thanks.
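For what it's worth, there is at least one established convention close to this: Java's jar: URLs and Apache Commons VFS separate an archive from an entry inside it with !/, e.g. zip:file:///a.dir/b.zip!/b.txt, which both marks b.zip as an archive and nests for archives within archives. Below is a minimal sketch of building such a list in Ruby, assuming the rubyzip gem (tar/gz support would be added along the same lines; descending into archives nested inside archives is left out):

    require 'zip'  # rubyzip gem

    # List "logical" paths under a directory, descending one level
    # into zip archives. "!/" separates an archive from its entries.
    def logical_paths(path)
      if File.directory?(path)
        Dir.children(path).flat_map { |c| logical_paths(File.join(path, c)) }
      elsif path.end_with?('.zip')
        names = Zip::File.open(path) { |zip| zip.entries.map(&:name) }
        [path] + names.map { |n| "#{path}!/#{n}" }
      else
        [path]
      end
    end

    puts logical_paths('a.dir')
    # a.dir/b.zip
    # a.dir/b.zip!/b.txt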

A list of professionally-useful and safe file types?

I have a system where users can upload, well, anything really - and these files are available to other users.
I need to come up with a list of file types that are genuinely needed by professionals in different industries that are safe from hacking/viruses, etc.
.doc .docx .gif .jpg .jpeg .mpg .mpeg .mp3 .odt .odp .ods .pdf .ppt .pptx .tif .tiff .txt .xls .xlsx .wav
What other file types do you know of that are both useful and safe?
Clarification
Many of the comments and responses are asking for a clearer definition of 'safe from hacking/viruses'. I ask the question with precisely that level of detail because I don't have as sophisticated an understanding of file types and their risks as many of you do. I would like guidance on 1) any file types that may keep my site more secure, and 2) if there are no 'safe' file types, any advice on how to move forward with a system that allows flexible uploading and sharing of files.
If indeed any malicious file can be packaged as a seemingly-safe file, how can I protect my users?
No filetype is safe if the program you open it with is badly (or carelessly or evil-y) written.
You can't assume that all files with a given extension are safe from 'viruses'.
I can easily rename a malicious executable to .doc and 'hack' your system.
EDIT:
There is no (simple?) way to check whether a user-uploaded file is malicious or not.
The app that you're creating is no different from any other file-sharing website out there (Rapidshare, Megaupload, etc.).
There is nothing stopping anyone from uploading malicious files to those websites.
Safe files do not exist. Is an ordinary text file safe? Take, for example, one with the content:
format c:
If some program can execute the content of the file... you get the idea.
So there are no safe files - only restrictions on RUNNING code (programs). (And I understand if people don't like this answer.) :)
For "useful" you'll need to ask your customers.
For safe, there's no such thing, because a file extension is just a part of the file name that suggests what type of file it is. It need not accurately represent the actual type, and it is easily manipulated.
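To at least catch a trivially renamed executable, you can check a file's leading "magic" bytes against its claimed extension rather than trusting the name. A small sketch in Ruby (only a few signatures shown; real signature tables are much longer):

    # Leading bytes expected for a few common formats.
    MAGIC = {
      '.pdf' => '%PDF'.b,
      '.gif' => 'GIF8'.b,
      '.jpg' => "\xFF\xD8\xFF".b,
      '.png' => "\x89PNG".b,
      '.zip' => "PK\x03\x04".b,  # .docx/.xlsx/.pptx are zip containers too
    }

    # True if the file's content starts with the signature expected
    # for its extension; unknown extensions are rejected outright.
    def matches_extension?(path)
      sig = MAGIC[File.extname(path).downcase]
      return false unless sig
      File.binread(path, sig.bytesize) == sig
    end

This only raises the bar: it says nothing about whether a well-formed PDF or DOC contains an exploit, which is exactly the point made above.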
Rather than protecting based on file type, I would get a third-party tool to virus-scan each file on upload, and reject those which are identified as positive.
The list is pretty endless! A quick search finds http://filext.com/alphalist.php?extstart=^A
Well, you can include all data files and exclude all executable/script files.
One list of executable file extensions is here: http://pcsupport.about.com/od/tipstricks/a/execfileext.htm
You may look at other sources to improve coverage.
Edit: for the second part of the question, addressing security:
It would be best to have a set of anti-malware programs installed on the server to check each submission - they are designed for this specialized task, so use them. In any case, no executable file is professionally useful, as long as people are not looking for crackware.
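For example, each upload could be handed to ClamAV's clamscan before it is accepted (assuming ClamAV is installed on the server; the command exits with status 0 when the file is clean and 1 when a signature matches):

    # Returns true only when clamscan exits with status 0 (clean).
    def clean?(path)
      system('clamscan', '--no-summary', '--quiet', path)
    end

On a busy system you would run the clamd daemon and call clamdscan instead, so the signature database is not reloaded on every upload.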

File repository in Ruby on Rails

I would like to create a simple file repository in Ruby on Rails. Users have their accounts, and after one logs in they can upload a file or download files previously uploaded.
The issue here is the security. Files should be safe and not available to anyone but the owners.
Where, in which folder, should I store the files, to make them as safe as possible?
Does it make sense to rename the uploaded files, store the names in a database and restore them when needed? This might help avoid name conflicts, though I'm not sure if it's a good idea.
Should the files be stored all in one folder, or should they be somewhat divided?
rename the files, if only because you have no way to know whether today's file "test" is supposed to replace last week's "test" or not (perhaps the user had them in different directories)
give each user their own directory, this prevents performance problems and makes it easy to migrate, archive, or delete a single user
put metadata in the database and files in the file system
look out for code injection via file name
This is an interesting question. Depending on the level of security you want to apply I would recommend the following:
Choose a folder that is only accessible by your app server (if you choose to store in the filesystem)
I would always recommend renaming the files to a randomly generated hash (or an incrementally generated name like those used in URL shorteners; see the open source implementation of rubyurl). However, I wouldn't store the files themselves in a database, because filesystems are built for handling files, so let them do the job. You should store the metadata in the database so you can restore the right file name when the user downloads the file.
You should partition the files among multiple folders. This gives you multiple advantages. First, filesystems are not built to handle millions of files in a single folder, and operations that try to get all files from a folder take significantly more time. If you obfuscate the original file name, you could create one directory for each leading letter of the name and get a fairly even distribution of files per directory (a sketch follows below).
One last thing to consider is the possible collision of file names. A user should not be able to guess a filename from another user. So you might need some additional checks here.
Depending on the level of security you want to achieve you can apply more and more patterns.
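A sketch of the rename-and-partition idea (SecureRandom and FileUtils are in Ruby's standard library; the storage root is hypothetical):

    require 'securerandom'
    require 'fileutils'

    STORE = '/var/app/uploads'  # hypothetical root, outside public/

    # Allocate an opaque, unguessable name and fan files out over two
    # directory levels, e.g. /var/app/uploads/ab/cd/abcd0123...,
    # so no single directory accumulates millions of entries.
    def allocate_path
      name = SecureRandom.hex(16)
      dir  = File.join(STORE, name[0, 2], name[2, 2])
      FileUtils.mkdir_p(dir)
      File.join(dir, name)
    end

The original filename and the allocated disk path then go into the database row, as suggested above.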
Just don't save the files in the public folder; create a controller that will send the files.
How you want to organise things from that point on is your choice. You could make a sub-folder per user. There is no need to rename from a security point of view, but do try to clean up the filename: spaces and non-ASCII characters make things harder.
For simple cases (where you don't want to distribute the file store):
Store the files in the tmp directory. DON'T store them in public. Then only expose these files via a route and controller where you do the authentication/authorisation checks.
I don't see any reason to rename the files; you can separate them out into sub directories based on the user ID. But if you want to allow the uploading of files with the same name then you may need to generate a unique hash or something for each file's name.
See above. You can partition them any way you see fit. But I would definitely recommend partitioning them and not lumping them in one directory.
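Putting those pieces together, the download action might look like the sketch below (Devise's authenticate_user!/current_user and the Upload model's columns are assumptions):

    class UploadsController < ApplicationController
      before_action :authenticate_user!  # e.g. Devise

      def show
        # Scoping the lookup to current_user is the authorisation
        # check: other users' records are simply never found.
        upload = current_user.uploads.find(params[:id])
        send_file upload.disk_path,  # stored outside public/
                  filename: upload.original_name,
                  disposition: 'attachment'
      end
    end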

Word 2010 automation with templates

I have written several applications in Delphi which use Word automation. The programs all use templates which are stored in a directory. In pre-2010 versions of Word, one would define the location of the templates in tools|options|file locations; the programs would pass the name of the template and Word would know where to find it.
My client has now moved to Office 2010, and as a result, Word cannot find the template when started by my programs. I haven't been able to find a similar dialog box in Word in which I can define the default directory for templates. How does one define such a directory?
Click File | Options | Advanced | File Locations and you get the same dialog as in older versions.
Instead of forcing your user to configure Word to define the location of templates, you might prefer to invoke word using /t switch.
/ttemplatename starts Word with a new document based on a template other than the Normal template.
>"%programfiles%\Microsoft Office\Office14\winword.exe" /t"c:\MYTEMPLATES\mytemplate.dotx"
Can't you just specify the full path when creating a new document? Why rely on a setting that can even be changed by the user? Put your templates in your own folder and specify the full path.
Word's path configuration is stored in its options. You can get the USER template folder via
Word.Application.Options.DefaultFilePath(WdDefaultFilePath.wdUserTemplatesPath)
(there are other options for that property too).
As far as I can tell, the template loading rules haven't changed from 2007 to 2010.
Generally speaking, if your add-in needs to load a template, you should specify the FULL path and file name of the template, but you can get the typical user path via the above.
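The calls are the same from any COM-capable language; here is a sketch using Ruby's standard win32ole, with a hypothetical template name:

    require 'win32ole'

    word = WIN32OLE.new('Word.Application')
    word.Visible = true

    # wdUserTemplatesPath = 2 in the WdDefaultFilePath enum
    user_templates = word.Options.DefaultFilePath(2)

    # Pass the FULL path when creating the document, rather than
    # relying on Word's configured file locations.
    word.Documents.Add(File.join(user_templates, 'mytemplate.dotx'))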
On the other hand, if you install the template into WORD\STARTUP, Word will automatically load it. That may not be what you need/want, though.
Finally, if your template doesn't/shouldn't change, it might be better to leave it in your PROGRAM FILES\appname folder and load it from there.
Generally speaking, +requiring+ users to change the FILE LOCATIONS in Word (or changing them programmatically) is a bad idea, just because so many people wouldn't have a clue, and those that do definitely DO NOT want add-ins changing those settings automatically!
