I'm running Gatling (gatling.io) to load test my server, but I would like to be able to view the calls so I can debug portions of the script. I know I can have it write all the logs to the console, but viewing the traffic in Fiddler is nicer.
I searched for a few hours until I found a solution, and this is by far the easiest: just modify the HTTP protocol configuration object in your Gatling Scala script to use Fiddler's proxy.
Just like this:
val httpConf = http
  .proxy(
    Proxy("127.0.0.1", 8888)
      .httpsPort(8888)
  )
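For completeness, here is a minimal simulation skeleton showing where that configuration plugs in; the scenario name and the URL are just placeholders, and this assumes a reasonably recent Gatling DSL:
import io.gatling.core.Predef._
import io.gatling.http.Predef._

class FiddlerDebugSimulation extends Simulation {

  // Route every request (HTTP and HTTPS) through Fiddler on 127.0.0.1:8888
  val httpConf = http
    .proxy(
      Proxy("127.0.0.1", 8888)
        .httpsPort(8888)
    )

  // Placeholder request against the server under test
  val scn = scenario("Debug via Fiddler")
    .exec(http("home").get("http://localhost:8080/"))

  // A single user is enough when you just want to inspect the calls
  setUp(scn.inject(atOnceUsers(1))).protocols(httpConf)
}
If you want to see the HTTPS bodies as well, Fiddler's HTTPS decryption also needs to be enabled.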
We are running a service behind an nginx proxy so that:
http://service-post:8080/swagger-ui.html is routed to public address https://host.com/services/post/swagger-ui.html
Or, to describe it from the other direction:
When nginx receives a request for https://host.com/services/post/swagger-ui.html, it strips the /services/post/ prefix and passes the request on to the post service at the /swagger-ui.html path.
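The relevant nginx config is essentially just a prefix-stripping proxy_pass, something like this (simplified; proxy headers and the rest of the server block are omitted):
location /services/post/ {
    # The trailing slash on proxy_pass makes nginx replace the matched
    # /services/post/ prefix with / before forwarding to the service.
    proxy_pass http://service-post:8080/;
}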
Before setting up anything (with default SpringDoc configuration) I can correctly see the swagger docs on http://service-post:8080/swagger-ui.html.
To set the paths for the public address on host.com, I am using:
springdoc.api-docs.path: /services/post/api-docs
springdoc.swagger-ui.path: /services/post/swagger-ui.html
springdoc.swagger-ui.configUrl: /services/post/v3/api-docs/swagger-config
However, it seems that this breaks it completely:
/swagger-ui.html, /api-docs and /v3/api-docs/swagger-config return 404 both for service-post:8080/* and https://host.com/services/post/*
The only thing that seems to work is https://host.com/services/post/swagger-ui/index.html, which shows the petstore documentation.
We are not using Spring Boot, just Spring MVC version 5.3.1.
So how do I set this up so that the original paths (e.g. /api-docs) are still handled, while the lookup is performed on the prefixed path (/services/post/api-docs)?
In the end I completely ignored the default redirect:
swagger-ui.html -> swagger-ui/index.html?url=/v3/api-docs
and implemented my own:
docs -> swagger-ui/index.html?url=MY_PREFIX/v3/api-docs
This way I don't need to change anything and everything works with default settings.
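Concretely, the custom redirect is just a tiny controller along these lines (the /docs path and the /services/post prefix are of course specific to our setup):
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.servlet.view.RedirectView;

@Controller
public class ApiDocsController {

    // Our own entry point: redirect to the bundled Swagger UI and point it
    // at the api-docs URL as it is seen from behind the proxy.
    @GetMapping("/docs")
    public RedirectView docs() {
        return new RedirectView("swagger-ui/index.html?url=/services/post/v3/api-docs");
    }
}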
It's all documented here:
https://springdoc.org/index.html#how-can-i-deploy-springdoc-openapi-ui-behind-a-reverse-proxy
If you are not using spring-boot, you can add the ForwardedHeaderFilter bean:
https://docs.spring.io/spring-framework/docs/current/javadoc-api/org/springframework/web/filter/ForwardedHeaderFilter.html
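For a plain Spring MVC app, one way to register that filter is in the servlet initializer, roughly like this (WebConfig is just a stand-in for your own @Configuration class):
import javax.servlet.Filter;
import org.springframework.web.filter.ForwardedHeaderFilter;
import org.springframework.web.servlet.support.AbstractAnnotationConfigDispatcherServletInitializer;

public class WebAppInitializer extends AbstractAnnotationConfigDispatcherServletInitializer {

    @Override
    protected Class<?>[] getRootConfigClasses() {
        return null;
    }

    @Override
    protected Class<?>[] getServletConfigClasses() {
        return new Class<?>[] { WebConfig.class };
    }

    @Override
    protected String[] getServletMappings() {
        return new String[] { "/" };
    }

    @Override
    protected Filter[] getServletFilters() {
        // Make Spring honour the Forwarded / X-Forwarded-* headers set by nginx
        return new Filter[] { new ForwardedHeaderFilter() };
    }
}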
I have this Telegram bot written in Lua that I am doing as a hobby for a language network. And I have been reading new messages via the getUpdates API call all the time. Now I want to rewrite it to use webhooks, but I have no experience with that whatsoever. I have googled but didn't find anything certain. I kinda feel that WSAPI is the library to use, but I am not sure. Moreover, I am not really sure I need any special library just for reading POST requests (which is all that the Telegram bot API uses). I tried using sockets:
local socket = require("socket")

-- Listen on all interfaces, port 9000
local server = assert(socket.bind("*", 9000))

-- Read whatever the client managed to send; if the receive times out
-- before the peer closes the connection, fall back to the partial data.
local function read(client, pattern, prefix)
  local data, emsg, partial = client:receive(pattern, prefix)
  if data then
    return data
  end
  if partial and #partial > 0 then
    return partial
  end
  return nil, emsg
end

while true do
  local client = server:accept()
  client:settimeout(3)
  local msg, err = read(client, '*a')
  if not err then
    print(msg)
  end
  client:close() -- close in every case so sockets are not leaked
end
The print(msg) here gives me the full POST request including headers, which I am probably able to parse (the body is supposed to always be JSON). I am not really that familiar with HTTP requests, though, and I'm not sure I can just throw away everything that comes before the first {.
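For what it's worth, this is roughly what I had in mind instead of searching for the first { (the body of an HTTP request starts after the first blank line):
-- Return everything after the first blank line, i.e. the HTTP request body.
local function extract_body(request)
  local header_end = request:find("\r\n\r\n", 1, true)
  if header_end then
    return request:sub(header_end + 4)
  end
  return nil
end
So in the loop above I would call extract_body(msg) and feed the result to a JSON decoder.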
My setup is Lua 5.2, Ubuntu x64 16.04 and Nginx. What I need to do is to receive and read POST requests, nothing more.
TL;DR: is it okay to parse the POST request I receive from the code above or am I missing something, like a library that'd make my life easier?
Thanks!
Maybe (hopefully) I'm missing something very simple, but I can't seem to figure this out.
I have a set of gRPC services that I would like to put behind an nghttpx proxy. For this I need to be able to configure my client with a channel on a non-root URL, e.g.
channel = grpc.insecure_channel('localhost:50051/myapp')
stub = MyAppStub(channel)
This wasn't working through the proxy right away (it just hangs), so I tested with a server bound directly to the sub-context.
server = grpc.server(executor)
service_pb2.add_MyAppServicer_to_server(
    MyAppService(), server)
server.add_insecure_port('{}:{}/myapp'.format(hostname, port))
server.start()
I get the following error:
E1103 21:00:13.880474000 140735277326336 server_chttp2.c:159]
{"created":"#1478203213.880457000","description":"OS Error",
"errno":8,"file":"src/core/lib/iomgr/resolve_address_posix.c",
"file_line":115,"os_error":"nodename nor servname provided, or not known",
"syscall":"getaddrinfo","target_address":"[::]:50051/myapp"}
So the question is: is it possible to create gRPC channels on non-root URLs?
As confirmed here, this is not possible. I will route traffic via subdomains in nghttpx.
I'm trying to pull a file stored on a password-protected FTP server onto my server (on Heroku).
The problem is that this FTP server doesn't have my production IP address on its whitelist (and I can't add it), so I have to go through a proxy to connect my Rails app to it.
I tried this code:
proxy_uri = URI(ENV['QUOTAGUARDSTATIC_URL'] || 'http://login:password@myproxy.com:9293')
Net::HTTP::Proxy(proxy_uri.host, proxy_uri.port, "login", "password").start('ftp://login:password@ftp.website.com') do |http|
  http.get('/path/to/myfile.gz').body
end
But my http.get fails with lookup ftp: no such host.
I also have this code for a plain FTP download, but I don't know how to make it work through a proxy:
ftp = Net::FTP.new('ftp.myftp.com', 'login', 'password')
ftp.chdir('path/to')
ftp.getbinaryfile('myfile.gz', 'public/myfile.gz', 1024)
ftp.close
Thanks in advance.
I realise that you asked this question over 6 months ago, but I recently had a similar issue and found that this (unanswered) question is the top Google result, so I thought I would share my findings.
mudasobwa's comment below your original post has a link to the net/ftp documentation which explains how to use a SOCKS proxy...
Although you don't mention a specific requirement for an HTTP proxy in your original post, it seems obvious to me that is what you were trying to use. As I'm sure you're aware, that makes the SOCKS documentation irrelevant here.
The following code has been tested on ruby-1.8.7-p357 using an HTTP proxy that does not require authentication:
file = File.open('myfile.gz', 'w')
# Open the HTTP session against the proxy itself...
http = Net::HTTP.start('myproxy.com', 9293)
# ...then request the full FTP URL through it (Ruby 1.8's two-value return)
resp, data = http.get('ftp://login:password@ftp.website.com')
file.write(data) if resp.code == "200"
file.close unless file.nil?
Source
This should give you a good starting point to figure the rest out for yourself.
To get you going, I would guess that you could use user:pass@myproxy.com for basic auth, or perhaps send a Proxy-Authorization header in your GET request.
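For example, something along these lines might work against a proxy that expects basic auth (untested; the hostnames and credentials are placeholders, and strict_encode64 needs Ruby 1.9+):
require 'net/http'
require 'base64'

# Open the session against the proxy itself, then ask it for the FTP URL,
# sending the proxy credentials in a Proxy-Authorization header.
http = Net::HTTP.start('myproxy.com', 9293)
req = Net::HTTP::Get.new('ftp://login:password@ftp.website.com/path/to/myfile.gz')
req['Proxy-Authorization'] = 'Basic ' + Base64.strict_encode64('proxyuser:proxypass')
resp = http.request(req)

File.open('myfile.gz', 'wb') { |f| f.write(resp.body) } if resp.code == '200'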
Whatever I do, I always get a
Symfony\Component\HttpKernel\Exception\NotFoundHttpException: "No route found for [...]"
in $crawler->text(), when I try to request an external URL with $crawler = $client->request('GET', 'http://anotherdomain.com');.
I want to do that because I'm using another virtualHost to render some pages with Symfony 1.2 and some others with Symfony 2.3.
I also tried to
$client = static::createClient(array(), array('HTTP_HOST' => 'anotherdomain.com'));
$client->followRedirects(true);
But it always tries to render it within Symfony 2.
It's not possible, because $client doesn't actually send any HTTP request (you may notice this if you run your "functional" tests with the web server disabled: they still work). Instead, it simulates an HTTP request and runs Symfony's normal dispatching.