Delete all Cookies in Electron desktop app - webview

I am using OAuth (Stack Overflow) in an Electron desktop app, and a webview loads the OAuth URL. I have a sign-out button in my app that should sign the user out of the Stack Overflow website and also out of the app. How can I do this?
How do I remove all session cookies from the webview in an Electron app?

You can remove cookies with Electron's cookies.remove() function (https://electron.atom.io/docs/api/cookies/#cookiesremoveurl-name-callback).
The trick is to reconstruct a URL from each cookie's domain, since remove() takes a URL rather than a domain.
import { session } from 'electron';

export default function deleteAllCookies() {
  session.defaultSession.cookies.get({}, (error, cookies) => {
    cookies.forEach((cookie) => {
      let url = '';
      // get prefix, like https://www.
      url += cookie.secure ? 'https://' : 'http://';
      url += cookie.domain.charAt(0) === '.' ? 'www' : '';
      // append domain and path
      url += cookie.domain;
      url += cookie.path;
      session.defaultSession.cookies.remove(url, cookie.name, (error) => {
        if (error) console.log(`error removing cookie ${cookie.name}`, error);
      });
    });
  });
}
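The snippet above uses the older callback-based cookies API that the linked docs describe. In newer Electron versions these methods return promises, so a rough equivalent (a sketch using the same URL-reconstruction trick, not tested against every Electron release) would be:

const { session } = require('electron');

// Hedged sketch for newer, promise-based Electron versions.
async function deleteAllCookies() {
  const cookies = await session.defaultSession.cookies.get({});
  for (const cookie of cookies) {
    // Domain cookies start with a leading dot, so prepend a host to get a parsable URL.
    const host = cookie.domain.charAt(0) === '.' ? `www${cookie.domain}` : cookie.domain;
    const url = `${cookie.secure ? 'https' : 'http'}://${host}${cookie.path}`;
    try {
      await session.defaultSession.cookies.remove(url, cookie.name);
    } catch (error) {
      console.log(`error removing cookie ${cookie.name}`, error);
    }
  }
}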

If you want all cookies cleared, this would be the most straightforward way.
const { session } = require('electron');

session.defaultSession.clearStorageData({ storages: ['cookies'] })
  .then(() => {
    console.log('All cookies cleared');
  })
  .catch((error) => {
    console.error('Failed to clear cookies: ', error);
  });
clearStorageData also supports more fine-grained requests (other storage types, a specific origin, and so on); see the Electron documentation for details.
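Since the question involves a webview, note that a webview tag with a partition attribute has its own session rather than defaultSession. A minimal sketch, assuming the tag is declared with partition="persist:oauth" (the partition name is just an example):

const { session } = require('electron');

// Clear cookies only for the webview's own session.
// Assumes <webview partition="persist:oauth" ...> in the renderer.
const oauthSession = session.fromPartition('persist:oauth');

oauthSession.clearStorageData({ storages: ['cookies'] })
  .then(() => console.log('webview cookies cleared'))
  .catch((error) => console.error('Failed to clear webview cookies:', error));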

Related

Supabase Auth - redirectTo not working for OAuth

I am switching from Flutter to Supabase and am running into an issue with Authentication. Although I can successfully launch the URL with the correct redirect value, I keep getting redirected to the site URL which should only be used for web, not iOS or Android. Below is the function I am using for Apple but this is happening with all other providers as well.
const isWeb = Platform.OS === "web";

const redirectTo = isWeb
  ? "https://web.example.com/login-callback/"
  : "com.example.react://login-callback/";

export const signInWithApple = async () => {
  const { data, error } = await supabase.auth.signInWithOAuth({
    provider: "apple",
    options: {
      redirectTo: redirectTo,
    },
  });

  if (error !== null) {
    console.log(error?.message);
    return "error";
  } else {
    console.log(data);
    Linking.openURL(data.url);
    return "success";
  }
};
The URL that gets logged before launching is correct, for example, LOG {"provider": "apple", "url": "https://api.example.com/auth/v1/authorize?provider=apple&redirect_to=com.example.react%3A%2F%2Flogin-callback%2F"}, but I always get redirected to something like https://web.example.com/#access_token=*****. I had a similar issue with Flutter, and that was because I had not added the additional redirect in Supabase but I already did that. I also confirmed that I have CFBundleURLSchemes set in the info.plist for iOS but that did not fix it.
IF SELF-HOSTING:
Check that you do not have spaces after commas in ADDITIONAL_REDIRECT_URLS.
Correct ✅ :
ADDITIONAL_REDIRECT_URLS="URL,URL,URL"
Incorrect ❌ :
ADDITIONAL_REDIRECT_URLS="URL, URL, URL"

How to share cookies between in-app browser and React Native (iOS)?

I am trying to implement SSO in a mobile app.
I am using react-native-inappbrowser-reborn to handle the SSO auth flow in an in-app browser. I can successfully authenticate inside the in-app browser. That is to say, I receive a session cookie and I can view the web version of the app from inside of the in-app browser. However, when I redirect back to the mobile application, none of my fetch requests include the session cookie! I have fetch configured with {credentials: 'include'}. CookieManager.getAll() returns an empty object.
I am experiencing this problem on iOS (v15.2), and I have yet to test on Android.
According to the documentation, I should be able to share the cookie set in the in-app browser with my React Native app.
I am using the following code, based on the react-native-inappbrowser-reborn documentation:
async function authenticate() {
  const url = 'http://localhost:3000/sso/login?redirect_uri=myapp://home';
  const deepLink = 'myapp://home';
  try {
    if (await InAppBrowser.isAvailable()) {
      InAppBrowser.openAuth(url, deepLink, {
        // iOS Properties
        ephemeralWebSession: false,
        // Android Properties
        showTitle: false,
        enableUrlBarHiding: true,
        enableDefaultShare: false,
      }).then(async (response) => {
        if (response.type === 'success' && response.url) {
          console.log('should see a new cookie set!');
          console.log(await CookieManager.getAll()); // Sadly, nothing in here
          const user = await fetch('http://localhost:3000/user', {
            credentials: 'include',
            headers: {
              Accept: 'application/json',
              'Content-Type': 'application/json',
            },
          });
          /* This is an authenticated endpoint which returns a 401,
             because for some reason the session cookie doesn't go along with the request */
        }
      });
    } else {
      throw new Error('login unsuccessful');
    }
  } catch (error) {
    throw new Error();
  }
}
Any bit of insight here would be immensely appreciated!

Is there any way to track an event using firebase in electron + react

I want to ask how to send an event using Firebase and Electron. A friend of mine has a problem using Firebase Analytics with Electron: Electron doesn't seem to send any event to the debug console. Looking at the network tab, the logging function doesn't send anything, even though the success message appears in the console. Can someone help me figure this out? Any workaround will do, since he said he already tried to implement the solutions from these topics:
firebase-analytics-log-event-not-working-in-production-build-of-electron
electron-google-analytics
This is the error I got when trying to use the solution in point 2.
For information, my friend used electron-react-boilerplate as the boilerplate.
The solutions above still failed. Can someone help me solve this?
EDIT 1:
As you can see in the images above, the first image is my friend's code; when you run it, it gives a very basic example like in the second image, with a button to send an event.
Just for information, he used this firebase package:
https://www.npmjs.com/package/firebase
You can intercept the HTTP protocol and serve your static content through the provided methods, which allows you to use http:// URLs for the content. That should make Firebase Analytics work, as described in the first linked question.
References
Protocol interception documentation.
Example
This is an example of how to serve the local app as if it were loaded over HTTP and simulate regular browser behaviour, so the bundled web application runs on the http protocol. This will allow you to add Firebase Analytics. HTTP request-body upload is only minimally supported, but you can extend it yourself depending on your goals.
index.js
const {app, BrowserWindow, protocol} = require('electron')
const http = require('http')
const {createReadStream, promises: fs} = require('fs')
const path = require('path')
const {PassThrough} = require('stream')
const mime = require('mime')

const MY_HOST = 'somehostname.example'

app.whenReady()
  .then(async () => {
    await protocol.interceptStreamProtocol('http', (request, callback) => {
      const url = new URL(request.url)
      const {hostname} = url
      const isLocal = hostname === MY_HOST

      if (isLocal) {
        serveLocalSite({...request, url}, callback)
      }
      else {
        serveRegularSite({...request, url}, callback)
      }
    })

    const win = new BrowserWindow()
    win.loadURL(`http://${MY_HOST}/index.html`)
  })
  .catch((error) => {
    console.error(error)
    app.exit(1)
  })
async function serveLocalSite(request, callback) {
  try {
    const {pathname} = request.url
    const filepath = path.join(__dirname, path.resolve('/', pathname))
    const stat = await fs.stat(filepath)

    if (stat.isFile() !== true) {
      throw new Error('Not a file')
    }

    callback(
      createResponse(
        200,
        {
          'content-type': mime.getType(path.extname(pathname)),
          'content-length': stat.size,
        },
        createReadStream(filepath)
      )
    )
  }
  catch (err) {
    callback(
      errorResponse(err)
    )
  }
}
function serveRegularSite(request, callback) {
  try {
    console.log(request)

    // Forward the request to the real remote host.
    const req = http.request({
      hostname: request.url.hostname,
      port: request.url.port,
      path: request.url.pathname + request.url.search,
      method: request.method,
      headers: request.headers,
    })

    // Upload support is minimal: write any buffered body chunks as-is.
    if (request.uploadData) {
      request.uploadData.forEach((item) => req.write(item.bytes))
    }

    req.on('error', (error) => {
      callback(
        errorResponse(error)
      )
    })

    req.on('response', (res) => {
      console.log(res.statusCode, res.headers)

      callback(
        createResponse(
          res.statusCode,
          res.headers,
          res,
        )
      )
    })

    req.end()
  }
  catch (err) {
    callback(
      errorResponse(err)
    )
  }
}
function toStream(body) {
  const stream = new PassThrough()
  stream.write(body)
  stream.end()
  return stream
}

function errorResponse(error) {
  return createResponse(
    500,
    {
      'content-type': 'text/plain;charset=utf8',
    },
    error.stack
  )
}

function createResponse(statusCode, headers, body) {
  if ('content-length' in headers === false) {
    headers['content-length'] = Buffer.byteLength(body)
  }

  return {
    statusCode,
    headers,
    data: typeof body === 'object' ? body : toStream(body),
  }
}
MY_HOST is any non-existent host (like something.example) or a host controlled by the admin (in my case it could be electron-app.rumk.in). This host serves as a replacement for localhost.
index.html
<html>
<body>
Hello
</body>
</html>
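Once the app is being served from an http:// URL this way, Firebase Analytics can be initialised in the renderer with the firebase npm package linked above. A minimal sketch, assuming the v9 modular API and placeholder config values (the event name and params are illustrative only):

import { initializeApp } from 'firebase/app';
import { getAnalytics, logEvent } from 'firebase/analytics';

// Placeholder config: use the values from your Firebase console project settings.
const firebaseConfig = {
  apiKey: '...',
  authDomain: '...',
  projectId: '...',
  appId: '...',
  measurementId: '...',
};

const firebaseApp = initializeApp(firebaseConfig);
const analytics = getAnalytics(firebaseApp);

// Illustrative event name and params.
logEvent(analytics, 'button_click', { source: 'electron-renderer' });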

What's the right way to implement offline fallback with workbox

I am adding PWA support to my project. I have set up serviceworker.js, and I am using workbox.js for cache routing and strategies.
1- I add the offline page to the cache on the install event, when a user first visits the site:
/**
 * Add on install
 */
self.addEventListener('install', (event) => {
  const urls = ['/offline/'];
  const cacheName = workbox.core.cacheNames.runtime;
  event.waitUntil(caches.open(cacheName).then((cache) => cache.addAll(urls)))
});
2- Catch & cache pages with a specific regex, like these:
https://website.com/posts/the-first-post
https://website.com/posts/
https://website.com/articles/
workbox.routing.registerRoute(
  new RegExp('/posts|/articles'),
  workbox.strategies.staleWhileRevalidate({
    cacheName: 'pages-cache'
  })
);
3- Catch errors and display the offline page, when there's no internet connection.
/**
 * Handling Offline Page fallback
 */
this.addEventListener('fetch', event => {
  if (event.request.mode === 'navigate' || (event.request.method === 'GET' && event.request.headers.get('accept').includes('text/html'))) {
    event.respondWith(
      fetch(event.request.url).catch(error => {
        // Return the offline page
        return caches.match('/offline/');
      })
    );
  }
  else {
    // Respond with everything else if we can
    event.respondWith(caches.match(event.request)
      .then(function (response) {
        return response || fetch(event.request);
      })
    );
  }
});
Now this works for me so far if I visit, for example, https://website.com/contact-us/, but if I visit any URL within the scope I defined earlier for "pages-cache", like https://website.com/articles/231/, it does not return the /offline page, since it's not in the user's cache, and I get a regular browser error.
There's an issue with how errors are handled when a specific caching route is registered with workbox.
Is this the best method for an offline fallback? How can I catch errors from the '/articles' and '/posts' paths and display an offline page?
Please refer as well to this answer, where there's a different approach to applying the fallback with workbox; I tried it as well, with the same results. Not sure which is the accurate approach for this.
I found a way to do it right with workbox.
For each route I would add a fallback method like this:
const offlinePage = '/offline/';

/**
 * Pages to cache
 */
workbox.routing.registerRoute(/\/posts.|\/articles/,
  async ({event}) => {
    try {
      return await workbox.strategies.staleWhileRevalidate({
        cacheName: 'cache-pages'
      }).handle({event});
    } catch (error) {
      return caches.match(offlinePage);
    }
  }
);
If you're using the network-first strategy instead, this is the method:
/**
 * Pages to cache (networkFirst)
 */
var networkFirst = workbox.strategies.networkFirst({
  cacheName: 'cache-pages'
});

const customHandler = async (args) => {
  try {
    const response = await networkFirst.handle(args);
    return response || await caches.match(offlinePage);
  } catch (error) {
    return await caches.match(offlinePage);
  }
};

workbox.routing.registerRoute(
  /\/posts.|\/articles/,
  customHandler
);
More details at workbox documentation here: Provide a fallback response to a route
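As an alternative to wrapping each strategy in try/catch, newer Workbox versions also offer a global catch handler. A hedged sketch, assuming a Workbox version with class-based strategies and that '/offline/' is already in a cache as in step 1:

// Register the normal route first (stale-while-revalidate as above).
workbox.routing.registerRoute(
  /\/posts|\/articles/,
  new workbox.strategies.StaleWhileRevalidate({ cacheName: 'cache-pages' })
);

// Whenever any registered handler throws (e.g. offline and not cached),
// fall back to the cached offline page for document navigations.
workbox.routing.setCatchHandler(async ({ request }) => {
  if (request.destination === 'document') {
    return (await caches.match('/offline/')) || Response.error();
  }
  return Response.error();
});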

Firefox webextension API - Kill cookies

How to kill specific cookies using the webextension API?
I can fetch the cookies using:
browser.cookies.getAll({domain: cookieDomain})
But to remove cookies, I require both the url and name,
browser.cookies.remove({name: cookie.name, url: cookie.domain})
And a domain cannot be passed as the url parameter of remove().
Also, the cookie object doesn't give me a URL.
Then, how do you remove specific cookies?
Thanks.
You should be able to construct the url by concatenating cookie.domain and cookie.path, and you get the protocol by checking cookie.secure:
const cookieName = cookie.name;
const cookieProtocol = cookie.secure ? 'https://' : 'http://';
const cookieUrl = cookieProtocol + cookie.domain + cookie.path;

browser.cookies.remove({name: cookieName, url: cookieUrl}).then(
  () => {
    console.log('Removed:', cookieName, cookieUrl);
  }
).catch(
  (aReason) => {
    console.log('Failed to remove cookie', aReason);
  }
);
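One caveat, borrowed from the Electron answer earlier on this page: domain cookies come back with a leading dot in cookie.domain, which makes the concatenated URL invalid. A small sketch that strips it first (same assumptions as the snippet above, i.e. a cookie object from browser.cookies.getAll()):

// cookie.domain can be ".example.com" for domain cookies; strip the leading
// dot before building the URL, otherwise the host is not a valid hostname.
const host = cookie.domain.startsWith('.') ? cookie.domain.slice(1) : cookie.domain;
const url = (cookie.secure ? 'https://' : 'http://') + host + cookie.path;

browser.cookies.remove({ name: cookie.name, url })
  .then(() => console.log('Removed:', cookie.name, url))
  .catch((reason) => console.log('Failed to remove cookie', reason));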
