Detecting network request - ruby-on-rails

When using gems for external services, sometimes I don't know whether a method makes a round trip to the service or just performs some local computation.
An example - I'm using an image hosting service (Cloudinary) that has a Ruby wrapper for their API. The following command returns a complete URL to a hosted image when given an image_id:
Cloudinary::Utils.cloudinary_url(image_id)
#=> "http://res.cloudinary.com/my_service/image/upload/v1/some_image_id"
I'd probably construct that URL manually to skip an HTTP request (if that's what happens).
So, to my question: is there any faster way than digging into the source code or unplugging my internet connection to detect whether a network request is made before a value is returned?

Cloudinary uses the RestClient gem, which allows you to enable global logging.
https://github.com/rest-client/rest-client#logging
To enable logging globally you can set RestClient.log with a Ruby Logger, or set an environment variable to avoid modifying the code (in this case you can use a file name, "stdout" or "stderr"):
$ RESTCLIENT_LOG=stdout path/to/my/program
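For a Rails app, the in-code option could be a small initializer along these lines (the file path is just an example; per the quote above, any Ruby Logger works):
# config/initializers/restclient_logging.rb (hypothetical location)
require "rest-client"
require "logger"

# Print every HTTP request RestClient makes; pointing this at Rails.logger
# instead would send the lines to the normal Rails log.
RestClient.log = Logger.new($stdout)
With that in place, run Cloudinary::Utils.cloudinary_url(image_id) in a console: a request line shows up only if an HTTP call is actually made through RestClient.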

Related

Getting "ECONNREFUSED" error when trying to upload to Wolkenkit Blob Server

I'm currently developing a Wolkenkit application which is run on my local machine.
I want to upload a file from the Wolkenkit app to the blob server (as documented here).
When sending a POST request from the server to https://local.wolkenkit.io:3001/, Node.js gives me the error ECONNREFUSED.
I've tested the POST request with another program and it works there. Any idea why it doesn't work from the wolkenkit application itself?
Thanks!
The Storing files sample you linked to shows code that is to be run in the browser, not in the backend itself. Of course, both should work, but there are a few minor differences you need to watch out for.
Fixing the host name
First, I suppose that local.wolkenkit.io in your case maps to 127.0.0.1, which is the default for wolkenkit. That means that when you try to connect to this domain from within a Docker container, the container does not try to call out to the blob storage container, but stays within itself. So, the first thing that needs to be fixed is the host name.
Basically, there are two options for this: You can either set up local.wolkenkit.io so that it resolves to the external IP address of your machine. This would work, but is pretty cumbersome. The other option is to directly address the appropriate container that is responsible for blob storage, by its internal name. The internal name is <name-of-your-app>-depot-file. So you need to replace https://local.wolkenkit.io:3001/ with https://<...>-depot-file.wolkenkit.io:3001/.
Fixing the port
Second, the port is wrong. This is because the blob storage service is internally running on port 3000, externally on 3001. So instead of https://<...>-depot-file.wolkenkit.io:3001/ you need to use https://<...>-depot-file.wolkenkit.io:3000/.
Once you have done this you should not get any more errors like ECONNREFUSED, since now the service can be found.
Fixing SSL issues
Third, since you are now connecting to the blob storage service using a different domain name, the SSL certificate doesn't match any more, since it was issued for local.wolkenkit.io. As a result, you will get SSL errors when trying to connect.
The simplest way to get around this is to disable any SSL checks (although this is also the most insecure way to handle it!). How to do this depends on the HTTP client module you are using. E.g., in request there is an option called strictSSL that you can set to false.
Of course, what you actually should do is either use a custom certificate that includes this domain name as well, or write a function that handles the certificate check and explicitly accepts the presented certificate in this case.
If you do all of this, things should work :-)
PS: I am one of the authors of wolkenkit. Thanks a lot for bringing up this issue, and we will take care of this in the future, to make storing blobs easier.

Stream remote file to client in ruby/rails 4/unicorn/nginx

I am trying to stream a file from a remote storage service (not s3 :-)) to the client using Ruby on Rails 4.2.
My server needs to stay in the middle to authenticate the client request, but also to build up the request to the remote storage service, since all requests to that service need to be authenticated using a custom header param. This makes it impossible to do a simple redirect_to and let the client download the file directly (but do let me know if this IS in fact possible using Rails!). Also, I want to keep the URL of the file hidden from the client.
Up until now I have been using a gem called ZipLine, but this also does not work, as it still buffers the remote file before sending it to the client. As I am using unicorn/nginx, this might also be due to a setting in either of those two that prevents proper streaming.
As per the Rails docs' instructions I have tried adding
listen 3000, tcp_nopush: false
to config/unicorn.rb but to no avail.
A solution might be to cache the remote file locally for a certain period and just serve that file. This would make some things easier but also creating new headaches like keeping the remote and cached files in sync, setting the right triggers for cache expiration, etc.
So to sum up:
1) How do I accomplish the scenario above?
2) If this is not an intelligent/efficient way of doing things, should I just cache a remote copy?
3) What are your experiences/recommendations in given scenario?
I have come across various solutions scattered around the interweb but none inspire a complete solution.
Thanks!
I am assuming the third-party storage service is accessible over HTTP. If you did consider using redirect_to, I assume the service also provides a means of per-download authorization, like a unique, expiring key in a header that does not expose your secret API keys, or an HMAC-signed URL with an expiration time as a param.
Anyhow, most cloud storage services provide this kind of file access. I would highly recommend letting the service stream the file. Your app should simply authorize the user and redirect to the service. Rails allows you to add custom headers while redirecting. It is discussed in the Rails guides.
10.2.1 Setting Custom Headers
If you want to set custom headers for a response then response.headers is the place to do it. The headers attribute is a hash which maps header names to their values, and Rails will set some of them automatically. If you want to add or change a header, just assign it to response.headers.
So your action code would end up being something like this:
def download
  # do_auth_check
  response.headers["Your-API-Auth-Key"] = "SOME-RANDOM-STRING"
  redirect_to url
end
Don't use up unnecessary server resources by streaming all those downloads through your app. We are paying cloud services to do that, after all :)
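If the storage service supports the HMAC-signed URL style mentioned above, the signing step on your side might look roughly like this; the payload format, parameter names and digest are assumptions and depend entirely on what the service expects:
require "openssl"

# Hypothetical signer: adjust payload, params and digest to the service's API.
def signed_download_url(base_url, path, secret, expires_in: 300)
  expires_at = Time.now.to_i + expires_in
  signature  = OpenSSL::HMAC.hexdigest(OpenSSL::Digest.new("SHA256"), secret, "#{path}:#{expires_at}")
  "#{base_url}#{path}?expires=#{expires_at}&signature=#{signature}"
end

url = signed_download_url("https://storage.example.com", "/files/report.pdf", ENV["STORAGE_SECRET"])
# then redirect_to url from the action, as in the snippet above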

Mock API Requests Xcode 7 Swift Automated UI Testing

Is there a way to mock requests when writing automated UI tests in Swift 2.0? As far as I am aware, the UI tests should be independent of other functionality. Is there a way to mock the response from server requests in order to test the behaviour of the UI dependent on the response? For example, if the server is down, the UI tests should still run. Quick example: for login, mock a failed password so that the UI shows an alert, whereas if the login is successful the next page should be shown.
In its current implementation, this is not directly possible with UI Testing. The only interface the framework has directly to the code is through its launch arguments/environment.
You can have the app look for a specific key or value in this context and switch up some functionality. For example, if the MOCK_REQUESTS key is set, inject a MockableHTTPClient instead of the real HTTPClient in your networking layer. I wrote about setting the parameters and NSHipster has an article on how to read them.
While not ideal, it is technically possible to accomplish what you are looking for with some legwork.
Here's a tutorial on stubbing network data for UI Testing I put together. It walks you through all of the steps you need to get this up and running.
If you are worried about the idea of mocks making it into a production environment for any reason, you can consider using a 3rd party solution like Charles Proxy.
Using the map local tool you can route calls from a specific endpoint to a local file on your machine. You can paste plain text into your local file containing the response you want it to return. Per your example:
Your login hits endpoint yoursite.com/login
In Charles, using the map local tool, you can route the calls hitting that endpoint to a file saved on your computer, e.g. mappedlocal.txt
mappedlocal.txt contains the following text:
HTTP/1.1 404 Failed
When Charles is running and you hit this endpoint your response will come back with a 404 error.
You can also use another option in Charles called "map remote" and build an entire mock server which can handle calls and responses as you wish. This may not be exactly what you are looking for, but it's an option that may help others, and it's one I use myself.

How to setup QuotaGuard Static for a Rails app hosted on heroku?

I'm trying to set up my heroku app to have a static IP using QuotaGuard (I know proximo is the other option, but it's pretty expensive).
I added the heroku QuotaGuard Static addon and got the two IPs it generates as well as the proxy URL.
What is my next step? (aka how do I tell my Rails app to use the proxy provided by QuotaGuard)
I see they have ruby code samples using REST-client and HTTParty, but do I put that somewhere like in the application.rb??
Most likely a bit too late to answer this question, but still.
Like you said, the first step to configuring QuotaGuard Static is provisioning the addon on Heroku (either via the Web Interface or the Heroku CLI). From there, you are able to get your two outbound IPs, and your proxy URL. The two IPs you were given should be whitelisted on whichever remote service you are trying to access.
As you mentioned, the documentation gives you a couple of samples using Rest Client for Ruby on Rails. This snippet can go pretty much anywhere you want to access a resource via the static IP addresses. Assuming you want to access a web service hosted on an Amazon EC2 instance with elastic IP 1.2.3.4, you would write:
# Route RestClient requests through the QuotaGuard Static proxy
RestClient.proxy = ENV["QUOTAGUARDSTATIC_URL"]
res = RestClient.get("http://1.2.3.4/yourWebService")
And from there, process the response stored in res appropriately. This code would go in, say, whichever controller method you'll be using to access the remote web service. In this case, you also need to add the RestClient gem to your controller, so at the top of that file you should also add require "rest-client". Don't forget to add the rest-client gem to your Gemfile.
Summing up, basically the snippets from the documentation go wherever it is you want to use the proxy to access a remote service requiring a fixed, whitelisted set of IP addresses.
Source: https://devcenter.heroku.com/articles/quotaguardstatic
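Since the question also mentions HTTParty: a rough equivalent there, assuming you wrap the remote service in a small client class, is to parse the proxy URL from the environment and hand its parts to http_proxy:
require "httparty"
require "uri"

class RemoteService
  include HTTParty
  base_uri "http://1.2.3.4"

  # Route this client's requests through the QuotaGuard Static proxy
  proxy_uri = URI.parse(ENV["QUOTAGUARDSTATIC_URL"])
  http_proxy proxy_uri.host, proxy_uri.port, proxy_uri.user, proxy_uri.password
end

res = RemoteService.get("/yourWebService")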

Delphi XE2 - How to get IP of a specified website?

I have a program which checks a PHP file on a web server to see if the user is verified. The PHP file runs through the DB and echoes "verified" if they are.
Now, people are easily bypassing the verification system by installing Xampp, routing my server to 127.0.0.1 in their hosts file, and then setting up a script that echoes "verified".
I want to be able to check the IP address of my domain to check if it is routing to 127.0.0.1.
How would I go about resolving the IP address of a domain through Delphi?
I used to use a similar hack to get around ICQ server-side verifications. Very convenient when I wanted to test alpha/beta builds that I was not invited to :-)
Indy, which ships with Delphi, has a TIdStack.ResolveHost() function, and a separate TIdDNSResolver component, which can both be used to get the domain's IP(s). It also has a TIdStack.LocalAddresses property to retrieve the local IPv4 addresses. Or you can just use the socket API gethostbyname() or getaddrinfo() functions directly, along with platform-specific APIs to enumerate the local IPs, like the GetAdaptersAddresses() function on Windows.
However, rather than having the PHP script simply echo plain-text back to your app, a much more secure option that does not require you to verify IPs is to have your app create a dynamically generated nonce value and send it to the PHP script, then have the script process it, hash it, whatever as needed using an algorithm that only you know, and then send it back to the app. The app can perform the same algorithm and compare the results. Unless someone takes the time to reverse engineer your app, they will not be able to reproduce your algorithm or fake its results with their custom Xampp scripts.
Even better, use SSL/TLS to encrypt your connection to your domain server, and give your domain server an SSL certificate that your app can verify before it exchanges any data with your PHP script. If you do just this much, you can continue using the plain-text echo since SSL/TLS will verify you are connected to your domain for you.
