I want to make URLs case-insensitive, for which I am already using CheckSpelling On, and it works fine.
In parallel, I also wanted to remove the extension from the URL, for which I applied:
RewriteEngine on
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}\.php -f
RewriteRule ^(.*)$ $1.php [NC,L]
It also worked.
But the two don't work together.
If I keep both in .htaccess, it starts giving error 300 ("Multiple Choices").
You're making it hard on yourself. It'd be easier if you do basic routing in your PHP. Just send anything to index.php that's not a directory and is either not a file or is a PHP file:
RewriteEngine on
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME} !-f [OR]
RewriteCond %{REQUEST_FILENAME} (?>.*)(?<=\.php) [NC]
RewriteRule ^(?!index\.php$). index.php [NS,L]
At the top of index.php, do something like this:
<?php
$url = explode('?', $_SERVER['REQUEST_URI'], 2);
$url = substr($url[0], 1);
if ($url) {
    $url = strtolower($url) . '.php';
    if (preg_match('#^[^./][^/]*(?:/[^./][^/]*)*$#', $url) && file_exists($url)) {
        # does not contain dotfiles, nor `..` directory traversal, so is a PHP file below the web root
        include $url;
    }
    else {
        # virtual URL doesn't exist:
        # send a 404 response code and serve a default page
        http_response_code(404);
        include '404.php';
    }
    exit;
}
# no virtual URL, continue processing index.php
Add a rel=canonical to each page's <head> that contains the lowercase URL (with any query string re-added) so you are not penalized for duplicate content.
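For example, near the top of each page you could emit something like this (a sketch; www.example.com stands in for your real host name):
<?php
// Canonical URL: lowercase the path, then re-append the original query string
$parts     = explode('?', $_SERVER['REQUEST_URI'], 2);
$canonical = 'https://www.example.com' . strtolower($parts[0]);
if (!empty($parts[1])) {
    $canonical .= '?' . $parts[1];
}
?>
<link rel="canonical" href="<?php echo htmlspecialchars($canonical); ?>">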
I have a Nuxt app deployed on Vercel, xpto.vercel.app, with client-specific routes:
xpto.vercel.app/client-a
xpto.vercel.app/client-b
xpto.vercel.app/admin
I have 3 domains. I don't know if this is possible, but is there any way to point each domain to its client-specific route (with only one project on Vercel)?
www.client-a.com => xpto.vercel.app/client-a
www.client-b.com => xpto.vercel.app/client-b
www.app-admin.com => xpto.vercel.app/admin
This is my current solution, but it's far from ideal, and requires FTP.
.htaccess
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.+)$ index.php [QSA,L]
</IfModule>
index.php
<?php
$project = "client-a";
$url = "https://xpto.vercel.app/" . $project . $_SERVER['REQUEST_URI'];
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$data = curl_exec($ch);
curl_close($ch);
echo $data;
?>
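One way to avoid hard-coding the project per domain (a sketch; the host-to-route map below simply mirrors the mapping above) is to derive it from the requested host, so the same index.php and .htaccess can be uploaded to all three domains:
<?php
// Map each incoming domain to its route on the Vercel deployment
$projects = [
    'www.client-a.com'  => 'client-a',
    'www.client-b.com'  => 'client-b',
    'www.app-admin.com' => 'admin',
];
$host    = $_SERVER['HTTP_HOST'] ?? '';
$project = $projects[$host] ?? 'client-a'; // fall back to a default route
$url = "https://xpto.vercel.app/" . $project . $_SERVER['REQUEST_URI'];
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects issued by Vercel
$data = curl_exec($ch);
curl_close($ch);
echo $data;
?>
It is still a proxy rather than a real multi-domain setup, but it removes the need to maintain a separate copy per domain.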
From Vercel Support:
Thank you for reaching out to Vercel Support.
Unfortunately, this is not possible in a single Vercel project.
Let us know if you have any further questions or concerns.
I am using Google's workbox-cli tool to precache some of the files on my website. Is it possible to set up the web server to return the following in the HTTP response header for all files by default:
cache-control: s-maxage=2592000, max-age=86400, must-revalidate, no-transform, public
But have the web browser use the following instead, only if the file is going to be precached by the service worker:
cache-control: s-maxage=2592000, max-age=0, must-revalidate, no-transform, public
So, I would like the service worker to change max-age=86400 into max-age=0 in the web server's response header before precaching the file. This makes the service worker fetch files whose revision in sw.js has changed from the web server instead of retrieving them from the local cache. Any files not managed by the service worker are cached for 86400 seconds by default.
Some background info
Currently, I am using the following bash script to set up my sw.js:
#!/bin/bash
if [ ! -d /tmp/workbox-configuration ]; then
mkdir /tmp/workbox-configuration
fi
cat <<EOF > /tmp/workbox-configuration/workbox-config.js
module.exports = {
"globDirectory": "harp_output/",
"globPatterns": [
EOF
( cd harp_output && find assets de en -type f ! -name "map.js" ! -name "map.json" ! -name "markerclusterer.js" ! -name "modal.js" ! -name "modal-map.html" ! -name "service-worker-registration.js" ! -name "sw-registration.js" ! -path "assets/fonts/*" ! -path "assets/img/*-1x.*" ! -path "assets/img/*-2x.*" ! -path "assets/img/*-3x.*" ! -path "assets/img/maps/*" ! -path "assets/img/video/*_1x1.*" ! -path "assets/img/video/*_4x3.*" ! -path "assets/js/workbox-*" ! -path "assets/videos/*" ! -path "de/4*" ! -path "de/5*" ! -path "en/4*" ! -path "en/5*" | sort | sed 's/^/"/' | sed 's/$/"/' | sed -e '$ ! s/$/,/' >> /tmp/workbox-configuration/workbox-config.js )
cat <<EOF >> /tmp/workbox-configuration/workbox-config.js
],
"swDest": "/tmp/workbox-configuration/sw.js"
};
EOF
workbox generateSW /tmp/workbox-configuration/workbox-config.js
sed -i 's#^importScripts(.*);$#importScripts("/assets/js/workbox-sw.js");\nworkbox.setConfig({modulePathPrefix: "/assets/js/"});#' /tmp/workbox-configuration/sw.js
sed -i 's/index.html"/"/' /tmp/workbox-configuration/sw.js
uglifyjs /tmp/workbox-configuration/sw.js -c -m -o harp_output/sw.js
On my Nginx webserver the following HTTP header is delivered by default:
more_set_headers "cache-control: s-maxage=2592000, max-age=0, must-revalidate, no-transform, public";
But, if the requested resource is not handled by the service worker, the default cache-control setting is overwritten:
location ~ ^/(assets/(data/|fonts/|img/(.*-(1|2|3)x\.|maps/|video/.*_(1x1|4x3)\.)|js/(map|markerclusterer|modal|service-worker-registration|sw-registration)\.js|videos/)|(de|en)/((4|5).*|modal-map\.html)) {
more_set_headers "cache-control: s-maxage=2592000, max-age=86400, must-revalidate, no-transform, public";
}
Problems with the current approach (see background info)
I have to keep track of the files and update nginx.conf correspondingly.
max-age=0 is also used for web browsers that don't support service workers, so they request the resources from the web server on each page visit.
1st Update
My desired precaching behaviour can be illustrated with two of the workbox strategies. I want the service worker to show the behaviour described in scenarios 1 and 2 below, even though cache-control: max-age=86400 is delivered in the HTTP header by the web server for an asset (e.g. default.js).
Scenario 1: revision in sw.js didn't change
The webpage is accessed, the sw.js file is retrieved from the web server due to max-age=0, and the web browser notices that the revision for default.js didn't change. In this case, default.js is retrieved from the precache cache.
Scenario 2: revision in sw.js did change
The webpage is accessed, the sw.js file is retrieved from the web server due to max-age=0, and the web browser notices that the revision of default.js changed. In this case, default.js is retrieved from the web server.
2nd Update
Basically, the desired strategy is similar to the network-first strategy, but step 2 is only taken if the revision of the file in sw.js has changed.
3rd Update
If I am not mistaken, there is already some work on this:
self.addEventListener('install', event => {
event.waitUntil(
caches.open(`static-${version}`)
.then(cache => cache.addAll([
new Request('/styles.css', { cache: 'no-cache' }),
new Request('/script.js', { cache: 'no-cache' })
]))
);
});
I don't think you have a comprehensive enough understanding of how service workers actually work.
You define one or many caches for a service worker to use. You specify what goes in which cache, whether to cache future requests, and so on.
The service worker then intercepts all network requests from the client and responds to them however you have programmed it to. It can return cached content if available, cached content first while updating over the network, network first with a copy to cache in case of no connection, cache images but nothing else, only cache GET requests, only cache certain domains or file types, and so on.
What it caches and for how long each cache is valid is entirely up to you and not influenced by server response headers at all. If you tell your service worker to make a fetch request for a resource then it will load that resource over the network, regardless of any headers or what is already cached locally.
You have total control over the entire caching process, which is very useful but has its own set of pitfalls.
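For example, a minimal hand-written fetch handler (not Workbox's API, just a sketch of the idea) that always goes to the network with the browser's HTTP cache bypassed, and only falls back to whatever has already been cached when the network fails:
self.addEventListener('fetch', event => {
  event.respondWith(
    // { cache: 'no-cache' } revalidates with the server, ignoring max-age=86400
    fetch(event.request, { cache: 'no-cache' })
      .catch(() => caches.match(event.request)) // offline fallback to the cached copy
  );
});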
I used s-max-age instead of s-maxage in the cache-control HTTP header, which led to some unexpected behaviour with my reverse proxy and workbox service worker. After the fix, the service worker is working as expected.
What's a simple way to grep just the dot files (.*) in the current directory ($HOME)? Using
grep target .*
returns a lot of nonsense:
grep: foo is a directory.
I don't want to see the nonsense.
grep has an option -s :
-s, --no-messages
Suppress error messages about nonexistent or unreadable files.
so you can just do grep -s 'target' .*
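If you would rather skip directories explicitly instead of silencing every message, GNU grep also has -d skip (--directories=skip):
grep -d skip 'target' .*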
Could you tell me why this rewrite does not pass the 2nd value? The tt value is passed into %1, but name is not passed into %2.
RewriteBase /
RewriteRule ^wedding-hair-and-make-up-images-([^/]*)-(.*)\.php$ franchisee-gallery.php?franchise_name=$1&id_tt=$2 [L]
RewriteCond %{THE_REQUEST} ^(GET|POST|HEAD|TRACE)\ /franchisee-gallery.php
RewriteCond %{QUERY_STRING} name=([^\&]*)
RewriteCond %{QUERY_STRING} tt=(.*)
RewriteRule ^franchisee-gallery.php$ wedding-hair-and-make-up-images-%1-%2.php? [R,L]
The %N refers to the matching groups of the last condition, so %1 is the match of this condition:
RewriteCond %{QUERY_STRING} tt=(.*)
http://httpd.apache.org/docs/2.0/mod/mod_rewrite.html#RewriteCond
RewriteCond backreferences: These are backreferences of the form %N (1 <= N <= 9), which provide access to the grouped parts (again, in parentheses) of the pattern, from the last matched RewriteCond in the current set of conditions.
For your case, you could rewrite the access to "franchisee-gallery.php" to another PHP script that performs the redirection (via an HTTP Location header). I can't figure out how to do this via Rewrite alone. Maybe with an external Perl rewriting program, using RewriteMap.
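That said, if the incoming links always send the parameters in a fixed name-then-tt order (an assumption — adjust the pattern if the order differs), you can capture both values in a single condition, so %1 and %2 both come from the same RewriteCond:
RewriteCond %{THE_REQUEST} ^(GET|POST|HEAD|TRACE)\ /franchisee-gallery\.php
RewriteCond %{QUERY_STRING} ^name=([^&]*)&tt=([^&]*)$
RewriteRule ^franchisee-gallery\.php$ wedding-hair-and-make-up-images-%1-%2.php? [R,L]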
Solution found! Using custom environment variables, it's possible to store the matched values in temporary variables!
RewriteEngine on
RewriteBase /
RewriteRule ^wedding-hair-and-make-up-images-([^/]*)-(.*)\.php$ franchisee-gallery.php?franchise_name=$1&id_tt=$2 [L]
RewriteCond %{REQUEST_URI} ^/franchisee-gallery.php
RewriteCond %{QUERY_STRING} name=([^&]*)
RewriteRule .* - [E=name:%1]
RewriteCond %{REQUEST_URI} ^/franchisee-gallery.php
RewriteCond %{QUERY_STRING} tt=([^&]*)
RewriteRule .* - [E=tt:%1]
RewriteCond %{THE_REQUEST} ^(GET|POST|HEAD|TRACE)\ /franchisee-gallery.php
RewriteRule ^franchisee-gallery.php$ wedding-hair-and-make-up-images-%{ENV:name}-%{ENV:tt}.php? [R,L]
How can I run the Yard server on a production server?
Maybe using some task?
I deploy with Capistrano, using Passenger and Nginx, and Jenkins (Hudson).
I found the simplest option to be just symlinking the generated docs folder from /public in my Rails app. You just need to be sure that the JS/CSS resources are accessible via the same path.
For example:
$ cd <railsapp>
$ ls
Gemfile
app/
..
public/
doc/ <- Folder that contains the html files generated by yard
$ cd public/
$ ln -s ../doc/ docs
This would serve your docs at /docs/index.html
The JavaScript-based search for classes/methods/files still works, since it is client-side. However, the search box that normally appears at the top won't be available with this method; I found the JavaScript-based search sufficient.
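If you want to regenerate those docs from a task (the question mentions Capistrano and Jenkins), YARD ships a Rake task you can hook into a deploy or build step. A minimal sketch, with assumed file globs — adjust them to your layout:
# Rakefile
require 'yard'

# Regenerates the HTML docs into doc/, which is symlinked from public/docs
YARD::Rake::YardocTask.new(:doc) do |t|
  t.files   = ['app/**/*.rb', 'lib/**/*.rb'] # assumed source layout
  t.options = ['--output-dir', 'doc']
end
Running rake doc (or plain yard doc) after each deploy then keeps /docs/index.html current.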
I use nginx and passenger, serving this tiny web app:
# ~/Documentation/config.ru
require 'rubygems'
require 'yard'
libraries = {}
gems = Gem.source_index.find_name('').each do |spec|
libraries[spec.name] ||= []
libraries[spec.name] << YARD::Server::LibraryVersion.new(spec.name, spec.version.to_s, nil, :gem)
end
run YARD::Server::RackAdapter.new libraries
Nginx virtual host:
# /opt/nginx/config/sites-enabled/gems.doc
server {
listen 80;
server_name gems.doc;
root /Users/your-user/Documentation/yard/public;
rails_env development;
passenger_enabled on;
}
More in this post: http://makarius.me/offline-rails-ruby-jquery-and-gems-docs-with
I use this shell script:
#!/bin/sh
# or your process here
PROCESS='ruby */yard server'
PID=`pidof $PROCESS`
start() {
yard server &
}
stop() {
if [ "$PID" ];then
kill -KILL $PID
echo 'yard is stopped'
fi
}
case "$1" in
start)
start
;;
stop)
stop
;;
restart)
stop
start
;;
*)
echo "Usage: $0 [start|stop|restart]"
;;
esac
And in Hudson: yard doc && ./yard.sh restart.