MAMP/WAMP - Switch back from real project URLs to localhost in all my files

Good morning,
Does anyone know how I could configure MAMP (or WAMP) to automatically change my project URLs to localhost, without having to search/replace inside my documents (an operation I suppose to be a bit risky, since it could alter my code)?
My goal is to develop locally while keeping the final, real URLs in my documents. I suppose a lot of you have encountered this issue one day :)
In other words, I would like to alternate between online and local more easily.
I'm a beginner, please bear with me.

For all the beginners, here's the thing. I've created a config.php file which contains constants: one config file for the local project folder and one for the online server folder.
Inside this config file, I've created a constant (constants are then available everywhere in the project) to define the main URL of the project, e.g.:
define('CST_MAIN_URL', 'http://www.myproject.com/'); // for the online config.php file
define('CST_MAIN_URL', 'http://localhost:8888/'); // for the local config.php file
(Note the trailing slash, so the constant can be concatenated directly with a file name.)
Thus, each header or redirection can work with that constant, like:
header('Location: ' . CST_MAIN_URL . 'index.php');
Then, some things have to be handled with RewriteEngine in your .htaccess file, for instance when a question mark or a trailing slash behaves differently under MAMP/WAMP than on your live server. Unfortunately, you need at least a basic understanding of regular expressions to master those URL rewritings.
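As a minimal sketch (the rule and file names here are purely illustrative, not from the original post), an .htaccess rewrite rule mapping a clean URL onto a query string might look like:

RewriteEngine On
# Map /article/42 to /article.php?id=42 (hypothetical URL scheme)
RewriteRule ^article/([0-9]+)/?$ article.php?id=$1 [L,QSA]

The optional /? at the end of the pattern is exactly the kind of detail that can differ between a local MAMP/WAMP setup and a live server.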
Hope it helps.

Related

What's the purpose of the public URL in Laravel?

When we set up Laravel, the default URL will have "public" in it.
Example: localhost/laravel/public/users
We know that we always want to remove the "public" from our URL, especially when we go to production.
I searched the internet, and it looks like removing the "public" part needs some work, like moving files to another folder, modifying .htaccess, etc.
My question is: what's the purpose of the "public" URL?
Why doesn't Laravel provide an easy way of removing the "public" URL?
I know Laravel is popular these days, but why give beginners a headache just to remove a part of the URL?
Let me know your thoughts.
The public directory holds all your files that should be accessible from the outside.
This way nobody can access any other file in your application.
How do you remove the "public" in your URL? Well, you don't "remove" it at all. Instead, you should point your domain (virtual host) directly at public (this is called the document root).
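For example, with Apache that would be a virtual host roughly like the following (the domain and paths are placeholders; adjust to your setup):

<VirtualHost *:80>
    ServerName mysite.example
    # Point the document root at Laravel's public directory (hypothetical path)
    DocumentRoot /var/www/laravel/public
    <Directory /var/www/laravel/public>
        AllowOverride All
        Require all granted
    </Directory>
</VirtualHost>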
If you can't do that (usually because you're on shared hosting with very restricted permissions), then and only then do you need to move files or use a workaround with .htaccess. However, if your host doesn't allow you to set the document root, I would consider switching to another one...
By the way: this is not really unique to Laravel. A lot of PHP frameworks do it; Zend Framework, Symfony, etc. all have a dedicated directory for files that should be accessible, separated from the framework core and your application.
If you're having trouble setting up your development environment correctly, you should try out Laravel Homestead. It's a pre-configured Vagrant box that makes it very easy to get your site up and running locally (and it also comes without public in your URL!)
When you make your site's root point to the public folder, e.g. www.mysite.com points to /path/to/laravel/public, there are a few advantages.
Your files outside of public, like your .env with your passwords, cannot be accessed by requests like www.mysite.com/../.env, and other common "exploits" are prevented just by taking this simple approach.
It's quite a common practice in other frameworks too, not only Laravel or PHP.

In org-mode, how do I keep the original path to images when using #+INCLUDE:?

I can use:
#+INCLUDE:
to include an org file in another org file, which allows me to assemble, say, a website from various org files. I'm exporting from the C-c C-e exporter in org-mode 7.5.
I could maintain a quite complex publication this way. This modular approach is quite common in, e.g. LaTeX and Texinfo publications.
However, links to images no longer work from the #+INCLUDEd org files. What seems to be happening is that the path to the images is taken as being from the org file that I am exporting from, rather than the actual org file that references the image.
The only ways I can see to resolve this are to:
use a flat file structure; or
make the image path relative to the referencing file (which I might not know in advance) rather than to the file itself.
Neither of these is really sustainable.
How do I tell org to use the correct image path from its own relevant org file rather than the parent org file?
From what I know of the exporter, INCLUDE files are inserted into the document before export. Therefore the content is part of the parent document before the exporter starts following paths to reach any linked files (images).
After a bit of testing, you will likely need to use absolute file paths. Since you move between Windows and Linux, your best bet would be to use a consistent scheme on both, starting from your home directory.
That way you can write the org link as:
[[~/path/to/image.jpg]], which will work on both systems (assuming you have set %HOME% on Windows).
Option 1 is potentially an alternative (although I agree it wouldn't be ideal at all), whereas the second option has obvious pitfalls if you INCLUDE the file in more than one future document.
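A minimal sketch of that layout (the file names are hypothetical): the parent file includes a chapter, and the chapter links to its image with a home-relative path, so the link resolves the same way no matter which file does the including:

# parent.org
#+INCLUDE: "~/book/chapters/chapter1.org"

# ~/book/chapters/chapter1.org
[[~/book/images/figure1.png]]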

How to migrate my files from one folder location to another with full user visibility

For years I've put my application data files in c:\MyCompany\MyDataFileFolder\App, where 'App' is my application name. I made this choice in the early days of Wild-West Windows, when Microsoft seemed to keep changing its own mind (My Documents, Documents, Program Data, etc.). As I've learnt more about how to do things correctly, and as Windows has now 'settled down' and is more picky about permissions, I'd like to move my files. Users have got used to where they are, though, and what I'd really like to do is implement something like Windows does with 'special folders', where there are several synonymous names. Thus, in my legacy folder I'd like to put something in (or change the folder to) an alias for the real location, which will now be somewhere under Program Data. This way the files are in a good place with the correct permissions, and if we run a utility expecting or modifying files in the 'old' place, this gets redirected transparently to the 'new' place (so a simple shortcut won't work).
Is this possible? Is there a recognised technique for this? I'm using Delphi XE2.
What you are looking for is either a Symbolic Link or a Reparse Point.
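For example, a junction for the folder from the question can be created from a command prompt (the target path is a placeholder; the old folder's contents must be moved to the new location first, since the link path must not already exist):

mklink /J c:\MyCompany\MyDataFileFolder\App "C:\ProgramData\MyCompany\App"

/J creates a directory junction, which works on older Windows versions and doesn't need the symbolic-link privilege; mklink /D would create a true directory symbolic link instead. Either way, programs that open the old path are redirected to the new one transparently, which is exactly what a shortcut can't do.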

File repository in ruby on rails

I would like to create a simple file repository in Ruby on Rails. Users have their accounts, and after one logs in they can upload a file or download files previously uploaded.
The issue here is the security. Files should be safe and not available to anyone but the owners.
Where, in which folder, should I store the files, to make them as safe as possible?
Does it make sense to rename the uploaded files, store the original names in a database, and restore them when needed? This might help avoid name conflicts, though I'm not sure if it's a good idea.
Should the files be stored all in one folder, or should they be somewhat divided?
rename the files, for one reason because you have no way to know whether today's file "test" is supposed to replace last week's "test" or not (perhaps the user had them in different directories)
give each user their own directory; this prevents performance problems and makes it easy to migrate, archive, or delete a single user
put metadata in the database and files in the file system
look out for code injection via file names
This is an interesting question. Depending on the level of security you want to apply, I would recommend the following:
Choose a folder that is only accessible by your app server (if you choose to store the files in the file system).
I would always recommend renaming the files to a randomly generated hash (or an incrementally generated name like those used in URL shorteners; see the open-source implementation of rubyurl). However, I wouldn't store the files themselves in a database, because file systems are built for handling files, so let them do that job. You should store the metadata in the database so you can set the right file name when the user downloads the file.
You should partition the files among multiple folders. This gives you multiple advantages. First, file systems are not built to handle millions of files in a single folder; operations that list all files in a folder take significantly more time. If you obfuscate the original file name, you could create one directory for each letter of the generated name and get a fairly well-distributed number of files per directory.
One last thing to consider is the possible collision of file names. A user should not be able to guess another user's file names, so you might need some additional checks here.
Depending on the level of security you want to achieve, you can apply more and more patterns.
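A sketch of the renaming and partitioning idea in Ruby (the names and the storage root are illustrative, not from the answer above):

require 'securerandom'

stored_name = SecureRandom.hex(16)          # random, unguessable name for the file on disk
subdir      = stored_name[0, 2]             # first two characters pick one of 256 buckets
path        = File.join('storage', subdir, stored_name)
# The original file name and the stored name/path go into the database as metadata.

Because the stored name is random, collisions are practically impossible and users cannot guess each other's file names.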
Just don't save the files in the public folder, and create a controller that will send the files.
How you want to organise them from that point on is your choice. You could make a subfolder per user. There is no need to rename the files from a security point of view, but do try to clean up the file names; spaces and non-ASCII characters make things harder.
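A minimal sketch of the controller idea from this answer (the model, the storage path, and the authentication helper are hypothetical):

class AttachmentsController < ApplicationController
  before_action :authenticate_user!  # e.g. Devise; any authentication check will do

  def show
    # Scoping the lookup to the current user enforces ownership
    attachment = current_user.attachments.find(params[:id])
    path = Rails.root.join('storage', attachment.stored_name)  # outside public/
    send_file path, filename: attachment.original_name, disposition: 'attachment'
  end
end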
For simple cases (where you don't want to distribute the file store):
Store the files in the tmp directory. DON'T store them in public. Then only expose these files via a route and controller where you do the authentication/authorisation checks.
I don't see any reason to rename the files; you can separate them out into subdirectories based on the user ID. But if you want to allow uploading files with the same name, then you may need to generate a unique hash or something for each file's name.
See above. You can partition them any way you see fit. But I would definitely recommend partitioning them and not lumping them in one directory.

How can I make a server log file available via my ASP.NET MVC website?

I have an ASP.NET MVC website that works in tandem with a Windows Service that processes file uploads. For easy maintenance of the site, I'd like the log file for the Windows Service to be accessible (to me only) via the website, so that I can hit http://myserver/logs/myservice to view the contents of the log file. How can I do that?
At a guess, I could either have the service write its log file in a "Logs" folder at the top level of the site, or I could leave it where it is and set up a virtual directory to point to it. Which of these is better - or is there another, better way?
Wherever the file is stored, I can see that there's going to be another problem. I tried out the first option (Logs folder in my website), but when I try to access the file via HTTP I get an error:
The process cannot access the file 'foo' because it is being used by another process.
Now, I know from experience that my service keeps the file locked for writing while it's running, but that I can still open the file in Notepad to view the current contents. (I'm surprised that IIS insists on write access, if that's what's happening).
How can I get around that? Do I really have to write a handler to read the file and serve it to the browser myself? Or can I fix this with configuration or some such?
PS. I'm using IIS7 if that helps.
Unfortunately, I'm afraid you'll have to write a handler that will open the file and return it to the client.
I've written an IIS Manager extension that displays server log files, and what I noticed is that even a simple
System.IO.File.OpenRead("")
can still run into the same problem and return the same error. It was kind of confusing.
In the end I used
System.IO.File.Open("", FileMode.Open, FileAccess.Read, FileShare.ReadWrite)
and I could easily open the file while the server was writing logs to it :)
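In an ASP.NET MVC controller, that could look roughly like this (the action name, the log path, and the role name are placeholders):

[Authorize(Roles = "Admin")]  // restrict this to yourself; the role is a placeholder
public ActionResult ServiceLog()
{
    var path = @"C:\Logs\myservice.log";  // hypothetical log location
    // FileShare.ReadWrite lets us read while the service keeps the file open for writing
    var stream = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.ReadWrite);
    return File(stream, "text/plain");  // MVC disposes the stream after sending the response
}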
I think the virtual directory is an "okay" solution if you add the directory (application) with READ ONLY rights, and perhaps "browse directory" too (so you can see the folder contents rendered by IIS).
(But once you do that, consider that you are also allowing anonymous access to that folder, unless you enable authentication, so watch out for "secret" contents of the log files that you might expose. Just a thought.)
Another approach, which I prefer myself, is to make an MVC/ASP.NET page that reads the folder in normal code, so that you have 100% control over what data is shown in the HTML.
You can open the files as TextStreams in read-only mode.
If it's a problem to gain access to the log folder, I would use the virtual directory with READ ONLY access and then write something that renders the log files as HTML on my screen, with my own level of detail. Perhaps even add some sort of "login" first. But it all depends on your security levels and the contents of the log files.
Is this meaningful to you? If not, please explain more, as I've been through this thought process a few times already in similar situations.
