I am trying to implement a service worker using Workbox for WebDAV requests. However, I cannot cache requests whose method is PROPFIND.
workbox.routing.registerRoute(
  new RegExp('http://xxxx/remote.php/webdav/(.*)'),
  workbox.strategies.cacheFirst({
    plugins: [
      new workbox.cacheableResponse.Plugin({
        statuses: [0, 200]
      })
    ]
  })
);
The responses are not being cached; nothing shows up when I look for cached requests in Cache Storage.
So how do I register these routes and cache their responses using Service Workers?
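For context on why nothing appears: Workbox routes only match GET requests unless a method string is passed as the third argument to registerRoute, and the Cache Storage API itself rejects cache.put() for non-GET requests, so a built-in strategy cannot store PROPFIND responses directly. Below is a minimal sketch of a possible workaround with a made-up cache name ('webdav-cache') and a synthetic GET cache key; it is a sketch under those assumptions, not a definitive fix:

workbox.routing.registerRoute(
  new RegExp('http://xxxx/remote.php/webdav/(.*)'),
  async ({ event }) => {
    const cache = await caches.open('webdav-cache'); // hypothetical cache name
    // Cache Storage refuses non-GET entries, so re-key the PROPFIND
    // request as a synthetic GET request before matching/putting.
    const keyUrl = new URL(event.request.url);
    keyUrl.searchParams.set('x-method', 'PROPFIND');
    const cacheKey = new Request(keyUrl.href);
    const cached = await cache.match(cacheKey);
    if (cached) {
      return cached;
    }
    const response = await fetch(event.request.clone());
    if (response.ok) {
      await cache.put(cacheKey, response.clone());
    }
    return response;
  },
  'PROPFIND' // registerRoute defaults to 'GET' without this
);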
I'm using the following strategy to cache images:
workbox.routing.registerRoute(
  /.*\.(?:png|jpg|jpeg|svg|webp|gif)/,
  new workbox.strategies.CacheFirst({
    cacheName: 'images',
    plugins: [
      new workbox.expiration.Plugin({
        maxEntries: 60,
        maxAgeSeconds: 2592000, // 30 days
        purgeOnQuotaError: false
      })
    ]
  }),
  'GET'
);
I'm not defining any other particular strategies.
When I try it in an index.html containing a GIF hosted by Giphy:
<img src="https://media.giphy.com/media/xUA7baWfTjfHGLZc3e/giphy.gif"/>
Workbox does seem to cache properly as long as I'm online:
Using CacheFirst to respond to 'https://media.giphy.com/media/xUA7baWfTjfHGLZc3e/giphy.gif'
If I disable my Wi-Fi and refresh the page offline, I hit the following errors:
workbox Network request for 'https://media.giphy.com/media/xUA7baWfTjfHGLZc3e/giphy.gif' threw an error. TypeError: Failed to fetch
Uncaught (in promise) no-response: The strategy could not generate a response for 'https://media.giphy.com/media/xUA7baWfTjfHGLZc3e/giphy.gif'. The underlying error is TypeError: Failed to fetch.
at CacheFirst.makeRequest (https://storage.googleapis.com/workbox-cdn/releases/4.3.1/workbox-strategies.dev.js:180:15)
GET https://media.giphy.com/media/xUA7baWfTjfHGLZc3e/giphy.gif net::ERR_FAILED
Am I missing something? Do I need to configure something more to be able to cache Giphy GIFs? Or is it actually a bug, or something that can't be solved?
Any help appreciated; thank you in advance.
My question is a duplicate of Workbox Cache First not caching properly
The answer is the one provided by Diego H Ferraz in https://stackoverflow.com/a/59040270/5404186
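The gist of that answer, as I understand it: Giphy's media responses are cross-origin, so the service worker sees them as opaque responses with a status of 0, and CacheFirst refuses to cache those unless explicitly allowed. A sketch of the adjusted route under that assumption:

workbox.routing.registerRoute(
  /.*\.(?:png|jpg|jpeg|svg|webp|gif)/,
  new workbox.strategies.CacheFirst({
    cacheName: 'images',
    plugins: [
      // Opaque cross-origin responses report status 0; allow them explicitly.
      new workbox.cacheableResponse.Plugin({
        statuses: [0, 200]
      }),
      new workbox.expiration.Plugin({
        maxEntries: 60,
        maxAgeSeconds: 2592000,
        // Opaque responses are stored with padded sizes, so letting the
        // cache be purged on quota errors is safer here.
        purgeOnQuotaError: true
      })
    ]
  }),
  'GET'
);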
I have implemented Workbox's cacheableResponse plugin to check a response header from the API.
However, it seems it does not cache API requests that have the x-is-cacheable header present on the response.
Here's how I implemented my service worker:
const cacheableResponse = new workbox.cacheableResponse.Plugin({
  statuses: [0, 200],
  headers: {
    // Header values are compared as strings, so use 'true' rather than true.
    'x-is-cacheable': 'true',
  },
});
// APIs
workbox.routing.registerRoute(
  new RegExp('https://my-api-url.here'),
  workbox.strategies.networkFirst({
    cacheName: 'api-cache',
    plugins: [
      cacheableResponse
    ]
  })
);
I can confirm that the API response has the x-is-cacheable: true header present and that it returns status code 200.
If I remove the headers option it works; however, I need to filter down to the specific APIs that I want to cache.
Does anyone have an idea why this solution does not work?
So, I just discovered on GitHub that my issue is related to CORS: for the X-Is-Cacheable check to work, I also needed to add Access-Control-Expose-Headers: X-Is-Cacheable to our API responses (the exact steps vary between codebases, so refer to your API framework's documentation on adding response headers).
reference: https://github.com/GoogleChrome/workbox/issues/2051
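For reference, a sketch of what that looks like on the server, assuming an Express backend (the middleware and its placement are illustrative, not the asker's actual code):

app.use((req, res, next) => {
  res.set('X-Is-Cacheable', 'true');
  // Without this, the browser hides X-Is-Cacheable from cross-origin
  // JavaScript, so Workbox's header check never sees it.
  res.set('Access-Control-Expose-Headers', 'X-Is-Cacheable');
  next();
});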
I am using session-based authentication in my Angular Universal app. The problem is that when an HTTP request is made from the Angular app, the backend (Node.js) doesn't access the ongoing session but creates a new one. You might think this is because of CORS, but only the initial load fails to access the session. When I open the app on a page that has a resolver or guard making an HTTP request, that request creates a new session. Navigating to other pages in the app then works: HTTP requests made after the initial load access the session. Likewise, if I start from a page with no resolvers or guards and then navigate to a page that has one and makes an HTTP request, that request accesses the session.
Here is how my session is set up in index.js:
// Assuming express-session and express-mysql-session (the requires were not
// shown in the original; options and config are defined elsewhere):
const session = require('express-session');
const MySQLStore = require('express-mysql-session')(session);

var sessionStore = new MySQLStore(options);
app.use(
  session({
    key: 'sessionStorage',
    store: sessionStore,
    secret: config.get('demoSess'),
    saveUninitialized: false,
    resave: false,
    name: 'demo',
    cookie: {
      maxAge: 60000,
      secure: false
    },
  })
);
const cors = require('cors');
app.use(cors({
  origin: [
    'http://localhost:4200'
  ],
  credentials: true
}));
And this is how the HTTP request is made from the frontend:
this.http.get(environment.apiUrl+'/server/page/auth', {withCredentials: true});
Is this how it should be? The backend runs on port 8080 and the frontend on 4200.
In app.module.ts I have imported TransferHttpCacheModule. If I remove it, I can see from the backend (when I console log something) that the first HTTP request is made twice: the first doesn't access the session and the second does. So if I console.log(req.session.userId) in /server/page/auth, I get undefined and then 1 on the next line. From what I read, something like this is normal, and transfer state is the way around it; as I understand it, TransferHttpCacheModule is basically the easy way to use TransferState. I also tried writing the TransferState logic into the resolver, and the outcome was the same: only one request is made, but that request won't access the session.
I am hoping I am missing something when making the HTTP request from the frontend, or that my session/CORS setup is missing something. At this point I am running out of ideas; any hint about what to check is welcome.
So I started rebuilding my authentication in Angular around localStorage. I ran into a problem there, and while searching for a solution I came across a tutorial mentioning isPlatformBrowser. That got me thinking: maybe Angular Universal is in some way making two requests, and those two requests are different, so I need to eliminate one of them. I ended up wrapping my HTTP request with if (isPlatformBrowser(this.platformId)) { } and so far it seems my problem is fixed.
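For completeness, a minimal sketch of that guard in a service; the AuthService name and checkAuth() method are hypothetical, and only the isPlatformBrowser wrapper reflects what I actually did:

import { isPlatformBrowser } from '@angular/common';
import { HttpClient } from '@angular/common/http';
import { Inject, Injectable, PLATFORM_ID } from '@angular/core';
import { Observable, of } from 'rxjs';
import { environment } from '../environments/environment';

@Injectable({ providedIn: 'root' })
export class AuthService {
  constructor(
    private http: HttpClient,
    @Inject(PLATFORM_ID) private platformId: Object
  ) {}

  checkAuth(): Observable<any> {
    // Only the browser has the session cookie; skip the server-side
    // rendering pass so it doesn't create a fresh session.
    if (isPlatformBrowser(this.platformId)) {
      return this.http.get(environment.apiUrl + '/server/page/auth', { withCredentials: true });
    }
    return of(null);
  }
}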
So I am deploying an Angular 5 app with a Rails 5 back-end. I can get the data to flow properly between the two locally, but when connecting to the deployed version of the API (which is on Heroku) I run into an authorization issue. The error is:
Failed to load https://my_api.herokuapp.com/data.json: No 'Access-Control-Allow-Origin' header is present on the requested resource.
Origin 'http://localhost:4200' is therefore not allowed access.
The response had HTTP status code 404.
Cross-Origin Read Blocking (CORB) blocked cross-origin response <URL> with MIME type application/json.
See <URL> for more details.
Is this something I need to change within the Rails API or in Angular? The deployed Rails API is essentially the same as the local version, so I'm not sure where the disconnect is coming from.
There are only two references to the API in Angular, and I connect to it the same way I do to the local server:
Angular, app.module.ts:
providers: [
  Angular2TokenService, AuthService, AuthGuard,
  // {provide: 'api', useValue: 'http://localhost:3000/'}
  {provide: 'api', useValue: 'https://my_api.herokuapp.com/data.json'}
]
Perhaps it's my use of Angular2TokenService?
Angular, environment.ts:
export const environment = {
  production: false,
  token_auth_config: {
    // apiBase: 'http://localhost:3000'
    apiBase: 'https://my_api.herokuapp.com/data.json'
  }
};
Thanks! Let me know of any suggestions you might have or if you need clarification.
It's an issue with CORS (cross-origin resource sharing). You can handle it by adding a callback in your API like below:
# Assuming this lives in ApplicationController, wired up so it runs on every request:
before_action :cors_set_access_control_headers

def cors_set_access_control_headers
  headers['Access-Control-Allow-Origin'] = ENV['SERVER_URL'] || '*'
end
where SERVER_URL is your front-end server's URL.
Alternatively, you can use the gem 'rack-cors', as suggested in the comments by @Kedarnag Mukanahallipatna.
I have a Spring backend through which I access my Elasticsearch cluster via a proxy-like endpoint. The request has to be authorized with a cookie.
I'm currently using Searchkit, which supports authenticating requests through the withCredentials flag. Is there a similar option for ReactiveSearch, or is there any other solution for authorizing the request with a cookie?
I could add: the backend exposes a Swagger client which runs on a different domain than my frontend client. That client "owns" the cookie, and thus I cannot read the cookie from my frontend client.
You can use the headers prop in ReactiveBase to pass custom headers along with the requests (link to docs). Since there is no withCredentials, you could read the cookies and set them in custom headers to verify the requests at the proxy middleware.
<ReactiveBase
  ...
  headers={{
    customheader: 'abcxyz'
  }}
>
  <Component1 .. />
  <Component2 .. />
</ReactiveBase>
Here is a sample proxy server, but it's in NodeJS.
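Since the linked sample isn't reproduced here, a rough sketch of such a proxy, assuming Express and http-proxy-middleware (both are my choices for illustration, not necessarily what the sample used):

const express = require('express');
const { createProxyMiddleware } = require('http-proxy-middleware');

const app = express();
app.use('/search', createProxyMiddleware({
  target: 'http://localhost:9200', // placeholder Elasticsearch URL
  changeOrigin: true,
  onProxyReq: (proxyReq, req) => {
    // Validate or forward the auth cookie here before the request
    // reaches Elasticsearch.
    proxyReq.setHeader('cookie', req.headers.cookie || '');
  }
}));
app.listen(8080);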
Okay, so it turns out ReactiveSearch uses fetch, and fetch wants credentials: 'include' for cookie authentication. This may not be placed in the headers that ReactiveSearch supplies; it must be placed on the root object of the request.
It's possible to do this by implementing beforeSend on ReactiveBase.
const Base = ({ children }) => {
  const onBeforeSend = props => {
    return {
      ...props,
      credentials: 'include',
    }
  }

  return (
    <ReactiveBase
      app="app-name"
      url="url"
      beforeSend={onBeforeSend}
    >
      {children}
    </ReactiveBase>
  )
}