Based on 18022890, this is my code:
/* Register service provider */
// ...
$app->register(new TwigServiceProvider(), array(...));
$app->boot();
$app->register(new SecurityServiceProvider(), array(...));
/* Route collection */
// ....
/* Not found exception */
$app->error(function (\Exception $e, $code) use ($app) {
    if ($app['debug']) {
        return;
    }

    $templates = array(
        'error/'.$code.'.twig',
        'error/'.substr($code, 0, 2).'x.twig',
        'error/'.substr($code, 0, 1).'xx.twig',
        'error/default.twig',
    );

    return new Response($app['twig']->resolveTemplate($templates)->render(array('code' => $code)), $code);
});
FYI: I got the route collection from here.
Everything seems OK, except that http://silex.local/undefinedroute won't display my 404 error template. How can I handle this situation? Thanks.
I changed the registration order (Security before Twig) and removed $app->boot(); now my 404 error template shows up.
/* Register service provider */
// ...
$app->register(new SecurityServiceProvider(), array(...));
$app->register(new TwigServiceProvider(), array(...));
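For reference, here is a minimal sketch of the corrected bootstrap with that ordering (provider options elided; the error handler is unchanged from the question, and the use statements and $app->run() call are assumptions about the surrounding file):

use Silex\Application;
use Silex\Provider\SecurityServiceProvider;
use Silex\Provider\TwigServiceProvider;
use Symfony\Component\HttpFoundation\Response;

$app = new Application();

// Security first, Twig second, and no explicit $app->boot():
// Silex boots the providers itself when run() is called.
$app->register(new SecurityServiceProvider(), array(/* ... */));
$app->register(new TwigServiceProvider(), array(/* ... */));

/* Route collection */
// ...

/* Not found exception: same handler as above */
$app->error(function (\Exception $e, $code) use ($app) {
    if ($app['debug']) {
        return;
    }
    $templates = array(
        'error/'.$code.'.twig',
        'error/'.substr($code, 0, 2).'x.twig',
        'error/'.substr($code, 0, 1).'xx.twig',
        'error/default.twig',
    );
    return new Response($app['twig']->resolveTemplate($templates)->render(array('code' => $code)), $code);
});

$app->run();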
I am implementing a PWA in my project. I have set up serviceworker.js, and I am using Workbox for cache routing and strategies.
1- I add the offline page to the cache on the install event, when a user first visits the site:
/**
* Add on install
*/
self.addEventListener('install', (event) => {
  const urls = ['/offline/'];
  const cacheName = workbox.core.cacheNames.runtime;
  event.waitUntil(caches.open(cacheName).then((cache) => cache.addAll(urls)));
});
2- Catch and cache pages matching a specific regex, like these:
https://website.com/posts/the-first-post
https://website.com/posts/
https://website.com/articles/
workbox.routing.registerRoute(
  new RegExp('/posts|/articles'),
  workbox.strategies.staleWhileRevalidate({
    cacheName: 'pages-cache'
  })
);
3- Catch errors and display the offline page when there's no internet connection.
/**
* Handling Offline Page fallback
*/
self.addEventListener('fetch', (event) => {
  if (event.request.mode === 'navigate' ||
      (event.request.method === 'GET' && event.request.headers.get('accept').includes('text/html'))) {
    event.respondWith(
      fetch(event.request.url).catch((error) => {
        // Return the offline page
        return caches.match('/offline/');
      })
    );
  } else {
    // Respond with everything else if we can
    event.respondWith(
      caches.match(event.request).then((response) => {
        return response || fetch(event.request);
      })
    );
  }
});
Now this works so far if I visit, for example, https://website.com/contact-us/. But if I visit any URL within the scope I defined earlier for "pages-cache", like https://website.com/articles/231/, it does not return the /offline page, since it's not in the user's cache, and I get a regular browser error instead.
There seems to be an issue with how errors are handled when a specific caching route is registered with Workbox.
Is this the best method for an offline fallback? How can I catch errors from the '/articles' and '/posts' paths and display an offline page?
Please refer as well to this answer, where there's a different approach to applying the fallback with Workbox. I tried it too, with the same results. I'm not sure which is the correct approach for this.
I found a way to do it right with workbox.
For each route I would add a fallback method like this:
const offlinePage = '/offline/';
/**
* Pages to cache
*/
workbox.routing.registerRoute(
  /\/posts.|\/articles/,
  async ({event}) => {
    try {
      return await workbox.strategies.staleWhileRevalidate({
        cacheName: 'cache-pages'
      }).handle({event});
    } catch (error) {
      return caches.match(offlinePage);
    }
  }
);
If you use the network-first strategy instead, this is the method:
/**
* Pages to cache (networkFirst)
*/
var networkFirst = workbox.strategies.networkFirst({
  cacheName: 'cache-pages'
});

const customHandler = async (args) => {
  try {
    const response = await networkFirst.handle(args);
    return response || await caches.match(offlinePage);
  } catch (error) {
    return await caches.match(offlinePage);
  }
};

workbox.routing.registerRoute(
  /\/posts.|\/articles/,
  customHandler
);
More details in the Workbox documentation: Provide a fallback response to a route.
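For completeness, that documentation page also describes a global catch handler instead of wrapping each route; a minimal sketch (assuming the same precached '/offline/' page) could look like this:

// Sketch: whenever a registered route's handler throws (e.g. the network
// request fails), fall back to the precached offline page for navigations.
workbox.routing.setCatchHandler(({event}) => {
  if (event.request.mode === 'navigate') {
    return caches.match('/offline/');
  }
  return Response.error();
});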
I have a service worker for caching images. This service worker is only registered within the frontend template, but it still keeps spreading into my admin template.
This causes my forms to behave unpredictably, as the validation tokens are affected by it.
With some console.log calls I figured out that the install event is triggered before the requested page is reached, but I'm unable to determine the current/next URL there.
How can I prevent the service worker from spreading to the admin panel and interfering with those pages? I just want assets to be cached.
This is my service worker as far as that is relevant:
const PRECACHE = 'precache-v1.0.0';
const RUNTIME = 'runtime';
// A list of local resources we always want to be cached.
const PRECACHE_URLS = [
  "public",
  "media",
  "unify",
];
importScripts('./cache-polyfill.js');
// The install handler takes care of precaching the resources we always need.
self.addEventListener('install', function(event) {
  console.log('installing resources');
  event.waitUntil(
    caches.open(PRECACHE)
      //.then(cache => cache.addAll(PRECACHE_URLS))
      .then(self.skipWaiting())
  );
});
// The activate handler takes care of cleaning up old caches.
self.addEventListener('activate', function(event) {
  const currentCaches = [PRECACHE, RUNTIME];
  event.waitUntil(
    caches.keys().then(cacheNames => {
      return cacheNames.filter(cacheName => !currentCaches.includes(cacheName));
    }).then(cachesToDelete => {
      return Promise.all(cachesToDelete.map(cacheToDelete => {
        return caches.delete(cacheToDelete);
      }));
    }).then(() => self.clients.claim())
  );
});
// The fetch handler serves responses for same-origin resources from a cache.
// If no response is found, it populates the runtime cache with the response
// from the network before returning it to the page.
self.addEventListener('fetch', event => {
  // Skip cross-origin requests, like those for Google Analytics.
  if (event.request.method === "GET") {
    if (event.request.url.indexOf(PRECACHE_URLS) > -1) {
      console.log("fetching " + event.request.url + " by the service worker");
      event.respondWith(
        caches.match(event.request).then(cachedResponse => {
          if (cachedResponse) {
            return cachedResponse;
          }
          return caches.open(RUNTIME).then(cache => {
            return fetch(event.request).then(response => {
              // Put a copy of the response in the runtime cache.
              return cache.put(event.request, response.clone()).then(() => {
                console.log('cached: ' + event.request.url);
                return response;
              });
            });
          });
        })
      );
    } else {
      console.log("fetching " + event.request.url + " by service worker blocked, it's not a resource");
    }
  }
  return fetch(event.request);
});
The problem is most likely that your admin pages lie inside the SW's scope. This means that your SW controls, for example, everything under /, while your admin pages live under /admin/ or similar.
You can prevent the behaviour by checking the fetch requests your SW is intercepting. Something like:
if (event.request.url.match('^.*(\/admin\/).*$')) {
  return false;
}
This should be the first thing in the SW's fetch listener. It checks whether it received a request for something from the admin pages and then cancels out if it did. Otherwise, it continues normally.
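Applied to the fetch handler from the question, a minimal sketch could look like this (the /admin/ pattern is illustrative; adjust it to your actual admin path):

self.addEventListener('fetch', event => {
  // Bail out early for admin pages: by not calling event.respondWith(),
  // the browser handles the request as if no service worker were present.
  if (event.request.url.match('^.*(\/admin\/).*$')) {
    return;
  }

  if (event.request.method === "GET") {
    // ... the caching logic from the question goes here ...
  }
});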
In an AngularJS directive, the templateUrl parameter is defined dynamically.
'templates/' + content_id + '.html'
I don't want to establish rules to check whether the content_id value is valid; I'd rather manage it as a 404 error, i.e. if the template doesn't exist (the server returns a 404 error when loading the template), load template/404.html instead.
How can I do that?
Edited: The current answers suggest using a response error interceptor. In this case, how can I know that the response belongs to the loading of this template?
You will need to write a response error interceptor. Something like this:
app.factory('template404Interceptor', function($injector, $q) {
  return {
    responseError: function(response) {
      // Retry .html requests that come back 404 with the fallback template.
      if (response.status === 404 && /\.html$/.test(response.config.url)) {
        response.config.url = '404.html';
        return $injector.get('$http')(response.config);
      }
      return $q.reject(response);
    }
  };
});
app.config(function($httpProvider) {
  $httpProvider.interceptors.push('template404Interceptor');
});
Demo: http://plnkr.co/edit/uCpnT5n0PkWO53PVQmvR?p=preview
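Regarding the edit in the question ("how can I know that the response is to a loading of this template?"): one option is to narrow the URL test so only the directive's template requests are retried. A sketch, assuming the templates live under a templates/ folder and that templates/404.html is the fallback:

app.factory('template404Interceptor', function($injector, $q) {
  return {
    responseError: function(response) {
      // Only treat 404s for URLs under templates/ as missing directive
      // templates, and never retry the fallback template itself.
      if (response.status === 404 &&
          /templates\/.+\.html$/.test(response.config.url) &&
          response.config.url !== 'templates/404.html') {
        response.config.url = 'templates/404.html';
        return $injector.get('$http')(response.config);
      }
      return $q.reject(response);
    }
  };
});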
You can create an interceptor to monitor all requests made with the $http service and intercept any response errors. If you get a 404 status for any request, simply redirect the user to the error page (template/404.html in your case).
.factory('httpRequestInterceptor', function ($q) {
  return {
    'responseError': function(rejection) {
      if (rejection.status === 404) {
        // do something on error
      }
      return $q.reject(rejection);
    }
  };
});
You would need to push the interceptor to $httpProvider in your config function.
myApp.config(function ($httpProvider, $interpolateProvider, $routeProvider) {
  $httpProvider.interceptors.push('httpRequestInterceptor');
});
Here's the demo
Cheers!
I created a task to automate emailing a report in Symfony 1.4. Both this task and a related module for viewing on the web share a custom PHP class file. The task is able to pull in the data correctly, but I have not been able to get it to send out the email. I attempted to follow the official Symfony 1.4 documentation as well as a number of examples from a Google search, but none solved my problem. The terminal isn't displaying any errors either.
My Code:
<?php

require_once sfConfig::get('sf_lib_dir').'/vendor/sesame/reports/reports.class.php';
//use reports;

class reportsTask extends sfBaseTask
{
  protected function configure()
  {
    // // add your own arguments here
    // $this->addArguments(array(
    //   new sfCommandArgument('my_arg', sfCommandArgument::REQUIRED, 'My argument'),
    // ));

    $this->addOptions(array(
      new sfCommandOption('application', null, sfCommandOption::PARAMETER_REQUIRED, 'The application name'),
      new sfCommandOption('env', null, sfCommandOption::PARAMETER_REQUIRED, 'The environment', 'dev'),
      new sfCommandOption('connection', null, sfCommandOption::PARAMETER_REQUIRED, 'The connection name', 'doctrine'),
      new sfCommandOption('type', null, sfCommandOption::PARAMETER_OPTIONAL, 'The output type of the report', 'email')
      // add your own options here
    ));

    $this->namespace = '';
    $this->name = 'reports';
    $this->briefDescription = '';
    $this->detailedDescription = <<<EOF
The [reports|INFO] task does things.
Call it with:

  [php symfony reports|INFO]
EOF;
  }

  protected function execute($arguments = array(), $options = array())
  {
    $databaseManager = new sfDatabaseManager($this->configuration);
    $databaseManager->loadConfiguration();

    $reports = new reports();
    $output = $reports->buildReport($options['type']);

    switch ($options['type']) {
      case 'csv':
        echo $output;
        break;
      case 'email':
        $message = $this->getMailer()->compose($output['from'], $output['to'], $output['subject']);
        $message->setBody($output['body']['content'], $output['body']['type']);
        $message->attach(Swift_Attachment::newInstance($output['attachment']['content'], $output['attachment']['name'], $output['attachment']['type']));
        $this->getMailer()->sendNextImmediately()->send($message) or die('email failed to deliver');
        $output = array('status' => 'success', 'to' => $output['to']);
        // no break: fall through to log the result
      default:
        $this->logSection('results', json_encode($output));
    }
  }
}
The terminal command being attempted from the project root:
php symfony reports
Any answers leading me down the right path would be most helpful. Please keep in mind that I need to stay with version 1.4. The server is capable of sending emails, and my module version does just that when invoked by a URL. I need it to run on the command line, though, so I can set up a cron job.
I am trying to set up a simple API. I have a Controller that all other API controllers extend, and I have attached a dispatch listener like so. I created a test that will always fail: set the status code to 401 and return a message. However, ZF2 still calls the main Controller method and doesn't abandon the request from the preDispatch method. Can I build a proper response here and force ZF2 not to continue executing the requested route? I tried just adding an exit() statement, but then the client side receives an incomplete response.
protected function attachDefaultListeners()
{
    parent::attachDefaultListeners();
    $events = $this->getEventManager();
    $events->attach('dispatch', array($this, 'preDispatch'), 100);
    $events->attach('dispatch', array($this, 'postDispatch'), -100);
}

public function preDispatch(MvcEvent $e)
{
    $this->dm = $this->getServiceLocator()->get('doctrine.documentmanager.odm_default');

    // TODO: Check user and token from DB
    if (Cookie::isCookieSet()) {
        $cookie = Cookie::readCookie();
        if (empty($cookie->user)) {
            $this->getResponse()->setStatusCode(401);
            return new JsonModel(array('auth' => false, 'msg' => 'Try again'));
        }
        // Cookie not set, if we are authenticating, continue; otherwise return a failure
    } else {
    }
}
You need to return a Response object to short-circuit the process, not a ViewModel.
Try something like this:
$response = $this->getResponse();
$response->setStatusCode(401);
// setContent() expects a string, so encode the JSON payload yourself.
$response->setContent(json_encode(array('auth' => false, 'msg' => 'Try again')));
return $response;
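Put together with the question's preDispatch, a minimal sketch could look like this (the Content-Type header is optional, and the Cookie helper is the one from the question):

public function preDispatch(MvcEvent $e)
{
    $this->dm = $this->getServiceLocator()->get('doctrine.documentmanager.odm_default');

    if (Cookie::isCookieSet()) {
        $cookie = Cookie::readCookie();
        if (empty($cookie->user)) {
            $response = $this->getResponse();
            $response->setStatusCode(401);
            $response->getHeaders()->addHeaderLine('Content-Type', 'application/json');
            $response->setContent(json_encode(array('auth' => false, 'msg' => 'Try again')));

            // Returning a Response from a dispatch listener stops propagation,
            // so the requested action is never executed.
            return $response;
        }
    }
}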