I have some server-side code that depends on the User-Agent header. The code was working fine because we were setting a custom user agent on the client side, but after enabling a PWA on our site I no longer receive my custom User-Agent.
I just want to know: how do I set a custom User-Agent in a service worker?
I guess this should help:
self.addEventListener("fetch", event => {
  // Request headers on the incoming event are immutable, so build a
  // modified copy and clone the request with it instead of appending
  // headers in place.
  const headers = new Headers(event.request.headers);
  headers.set("User-Agent", "MyAwesomePWA/1.0.0");
  event.respondWith(fetch(new Request(event.request, { headers })));
});
Note that some browsers may still treat User-Agent as a forbidden header name and silently ignore this, so verify on the server that the header actually arrives.
Links:
Fetch in service workers
Headers in fetch
User agent header
I'm looking to make a web extension for Firefox that stores HTML pages and other resources in local storage and serves them for offline viewing. To do that, I need to intercept requests that the browser makes for the pages and the content in them.
Problem is, I can't figure out how to do that. I've tried several approaches:
The webRequest API doesn't allow fulfilling a request entirely - it can only block or redirect a request, or edit the response after it's been done.
Service workers can listen to the fetch event, which could do what I want, but calling navigator.serviceWorker.register in an add-on page (the moz-extension://<id> origin) results in an error: DOMException: The operation is insecure. Relevant Firefox bug
I could possibly set up the service worker on a self-hosted domain with a content script, but then it wouldn't be completely offline.
Is there an API that I missed that can intercept requests from inside a web extension?
I am trying to make a request to a website with a proxy using httparty like so:
def self.fetch_page_with_html_response(url, proxy_id)
  proxy = Proxy.find(proxy_id)
  request_options = {
    http_proxyaddr: proxy.url,
    http_proxyport: proxy.port,
    http_proxyuser: proxy.username,
    http_proxypass: proxy.password,
    headers: { "User-Agent" => proxy.user_agent }
  }
  HTTParty.get(url, request_options)
end
On certain websites my requests either hang or return an error page where the website blocks me from fetching it.
When I use these same proxy settings in my Chrome browser with an extension like SwitchyOmega, the requests go through fine and the page loads.
Is there any reason why the request would be blocked from my web server but not through my browser?
I even tested using the same user agent and provided the exact same headers my browser sends.
There could be a few reasons in your case.
First, check that the proxy itself works: send a GET request to https://api.ipify.org/ through your proxy in code. If it returns the proxy's IP address, the proxy works.
Second, disable JavaScript in Chrome's settings, then browse the website through the proxy and check whether it still loads correctly. Some websites render their HTML and CSS with JavaScript, which a plain server-side request won't execute.
Feel free to reply if you still need help.
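The first check can reuse the same HTTParty proxy options the question already builds; `proxy_options` is a hypothetical helper name, and HTTParty/Proxy come from the question's own code:

```ruby
# Build the same proxy options used in fetch_page_with_html_response;
# `proxy` is the Proxy record from the question.
def proxy_options(proxy)
  {
    http_proxyaddr: proxy.url,
    http_proxyport: proxy.port,
    http_proxyuser: proxy.username,
    http_proxypass: proxy.password
  }
end

# If this prints the proxy's IP address (not your server's), the proxy works:
# puts HTTParty.get("https://api.ipify.org/", proxy_options(proxy)).body
```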
I'm experimenting with ReactiveSearch and so far have tried the DataSearch and ResultList components. While looking over the props of the required ReactiveBase component, I see this:
<ReactiveBase
app="appname"
credentials="abcdef123:abcdef12-ab12-ab12-ab12-abcdef123456"
headers={{
secret: 'reactivesearch-is-awesome'
}}
>
<Component1 .. />
<Component2 .. />
</ReactiveBase>
If the app is already secured using Appbase.io and the credentials give my React app access to my ES cluster hosted there... what exactly could headers be used for? At first I thought username and password, but you wouldn't do that.
What would be some of the scenarios where I SHOULD/COULD use the headers prop?
The headers are added to each request sent to the url. Normally you wouldn't need these. But in production you might want to add a layer of proxy server between your elasticsearch cluster and the client side ReactiveSearch code, this is where headers can be helpful.
You could add authentication in the flow. For example, you could restrict the elasticsearch calls to authenticated users by sending an access token via the headers prop and then verifying it at the proxy server (example of proxy server).
You could also implement some custom logic by adding custom headers and a logic to handle them at the proxy server.
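As a rough sketch of the authentication scenario (all names below are assumptions, not part of the ReactiveSearch API): the client passes a token through the headers prop, e.g. headers={{ Authorization: 'Bearer <token>' }}, and the proxy server checks it before forwarding anything to Elasticsearch:

```javascript
// Hypothetical check a proxy server could run on each incoming request
// before forwarding it to the Elasticsearch cluster. `expectedToken`
// would come from your own auth system; this only shows the shape.
function isAuthorized(headers, expectedToken) {
  const auth = headers["authorization"] || "";
  if (!auth.startsWith("Bearer ")) return false;
  return auth.slice("Bearer ".length) === expectedToken;
}
```

Requests failing this check would get a 401 instead of ever reaching the cluster.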
Our iOS application has a WKWebView that communicates with our server by calling an https endpoint. The server works like a proxy: every call to our endpoint forwards the request to the destination site. For example, if we set our destination to https://www.google.com, the application translates that to https://server.com/http://www.google.com.
The problem we are trying to solve is intercepting all http/https calls after the original WKWebView request, including resource calls such as CSS and JavaScript files. We tried a custom scheme handler, but since we do not parse the HTML/CSS on the server side, we cannot rewrite the page to a custom scheme and therefore cannot intercept all http/https calls.
You can add the interception logic inside the webview: for example, have every request store its URL and a running call count inside a hidden element, then check that element's value periodically with the webview's evaluateJavaScript function.
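A minimal sketch of such an injected script, assuming you only need fetch calls (XMLHttpRequest could be wrapped the same way); instrumentFetch and the element id are made-up names:

```javascript
// Wrap fetch on the given object (window, in the WKWebView page) so every
// request URL and a running call count are recorded where native code can
// read them back via evaluateJavaScript.
function instrumentFetch(target) {
  const log = { urls: [], count: 0 };
  const originalFetch = target.fetch;
  target.fetch = function (input, init) {
    log.urls.push(typeof input === "string" ? input : input.url);
    log.count += 1;
    return originalFetch.call(target, input, init);
  };
  // In the page you would mirror `log` into a hidden element, e.g.:
  // document.getElementById("request-log").textContent = JSON.stringify(log);
  return log;
}
```

Native code could then poll that hidden element via evaluateJavaScript and diff against the last count it saw.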
I have two separate services communicating using AmqpProxyFactoryBean (the "client") and AmqpInvokerServiceExporter (the "server"). Now, I'd like to include some custom headers on every request made through the AMQP proxy and be able to access them on the "server". Is there any easy way I can achieve this?
Since AmqpClientInterceptor uses an AmqpTemplate to send and receive AMQP messages, you can provide any custom MessageConverter for that RabbitTemplate and populate additional headers from your toMessage() implementation.
However, I'm not sure you will be able to access those custom headers on the server side; there we end up just with RemoteInvocation.invoke().
So it seems you may ultimately arrive at the solution of passing the data as an additional RPC parameter.
On the other hand, such a custom header may be useful for other AMQP routing scenarios, where you can route the RPC message to more than just the RPC queue.
Consider using Spring Integration AMQP gateways instead of remoting over RabbitMQ; that way you have complete control over the headers passed back and forth.
If you don't want to use Spring Integration, you can use the RabbitTemplate sendAndReceive methods on the client and either the receiveAndSend or a listener container on the server.
Again, this gives you full control over the headers.