Rails Paperclip S3 signed URL not working anymore

A couple of months ago I set up a Rails app. Uploaded pictures were saved in a private bucket on S3, and I could download them through an expiring URL. Now that is not working anymore, even though I did not change anything. What could be the reason? A timezone issue?
My generated link looks like this:
http:// [bucket] .s3-eu-west-1.amazonaws.com//original/image.jpg?AWSAccessKeyId= [AQCCESS_KEY] \u0026Expires=1408020974\u0026Signature= [signature]
With this link I tried to download a file at 10:56 am Central European Time on August 10th. Might there be a time zone issue? How do I read the Expires parameter in that URL?
Any idea what could be the problem here?
Thank you in advance!

This answers your question 'How do I read the Expires parameter in that URL?':
The Expires time is defined as Epoch time. In order to see what that time is, you can run the following ruby code:
require 'date'
DateTime.strptime("1408020974",'%s').to_s
which returns this:
"2014-08-14T12:56:14+00:00"
Hope it helps.

Okay, I found the problem:
As I said, the link looks like this:
http:// [bucket] .s3-eu-west-1.amazonaws.com//original/image.jpg?AWSAccessKeyId= [AQCCESS_KEY] \u0026Expires=1408020974\u0026Signature= [signature]
The problem is the \u0026, which has to be replaced with the "&" sign. I will think about how to solve the problem, but at least it is identified :)
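Since \u0026 is just the JSON/JavaScript escape sequence for the ampersand, one way to repair the generated string before handing it to the browser is a plain substitution (a minimal sketch; the URL here is a placeholder, not a real signed link):
signed_url = 'http://bucket.s3-eu-west-1.amazonaws.com/original/image.jpg?AWSAccessKeyId=KEY\u0026Expires=1408020974\u0026Signature=SIG'
# replace the literal escape sequence "\u0026" with a real ampersand
fixed_url = signed_url.gsub('\u0026', '&')
That said, the cleaner fix would be to track down where the URL is being JSON-escaped in the first place and use the raw value there.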

Related

Uploading App Images: "Invalid GeoJSON: Your routing app coverage file is invalid."

This question is not a duplicate of another question that asks about the same message in a different context. The context of this question is simply uploading screenshot images and getting the message.
Today, I had a new message when uploading images to App Store Connect:
Invalid GeoJSON: Your routing app coverage file is invalid.
This makes absolutely no sense since, at this time, I had not even chosen a build for the upload.
When I retried uploading the images, it worked. But unfortunately, the message appeared for each language and format.
Is this a bug by Apple or am I missing something? I would guess that uploading images has nothing to do with GeoJSON.
I used Safari. Others seem to have the problem with Chrome. So it seems to happen occasionally on all browsers.
I had this same problem today while uploading the App Store icon on the Preparing for Submission page. I solved it by removing "-" from my image name.
This is an unusual bug, and Apple might already be working on it. It is not tied to any specific browser; it occurs mostly when you try to upload more than one image at once.
Apple keeps maintenance work on its live site running continuously, so this is most likely a bug in that live site. It will probably be fixed soon.
For now, if you are having difficulties with screenshot uploads, try uploading them one by one rather than in bulk.
Important note:
I am stating this on the basis of the last few uploads I have done. The solution I have given is one I tried myself, and it worked well for me, so you can just try it out. I'm sure it's not a browser issue; it can occur on any browser.
It did not work for me even though I provided English file names; it kept giving the above error.
The only thing that worked for me was to remove all underscores. So instead of iphone_xs_max_1.png, it worked when I renamed it to iphone1.png and uploaded that.
Make sure the screenshot file names are in English.
Make sure the entire directory path (and folder names) is in English as well.
That worked for me.
I had the same bug today. Some of the images uploaded without problems, others didn't.
I was uploading in Chrome when I got the issue. Opening the site in Safari and uploading the images there solved the problem.
What solved this for me was removing strings of numbers and periods from the filenames. It appears the system is running the filenames through some kind of geocoder, and if there are strings of characters that could be interpreted as locations, it will error out.
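If you have a lot of screenshots to rename, something along these lines could automate it (a hypothetical Ruby helper, assuming the screenshots sit in a local screenshots/ folder; it is only a guess at which characters trigger the error):
require 'fileutils'

Dir.glob('screenshots/*.{png,jpg}').sort.each_with_index do |path, i|
  dir  = File.dirname(path)
  ext  = File.extname(path)
  base = File.basename(path, ext)
  # strip underscores, hyphens, and digit/period runs that might be read as coordinates
  clean = base.gsub(/[_\-]/, '').gsub(/[\d.]+/, '')
  clean = 'screenshot' if clean.empty?
  FileUtils.mv(path, File.join(dir, "#{clean}#{i + 1}#{ext}"))
end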
Make sure that after editing the image you save the file with an extension, like myimage.png or myimage.jpg.
In my case, I had forgotten to save the file with an extension after removing the alpha and transparency properties; there was no need to change browsers or anything else.

AWS S3 not showing images on heroku?

I built a Rails app and set up S3 and Paperclip together. So far, the images are being posted to my S3 account, but the live app is not actually showing them; it just shows the broken file icon.
Any idea why this is happening? Is it a Paperclip error? Is it Heroku? Is it my controller?
Here's the live app: http://petaluma-marin.herokuapp.com/Nutrition-Recipes
Here's my repo: https://github.com/Gcamara14/Recipe_app
Thanks!!
Your URL to the images is wrong. The URL for your second image is currently this:
http://s3.amazonaws.com/recipe-app-gio/recipes/images/000/000/009/medium/Screen_Shot_2017-05-30_at_1.19.49_PM.png?1496243164
What it needs to be is this:
http://s3-us-west-1.amazonaws.com/recipe-app-gio/recipes/images/000/000/009/medium/Screen_Shot_2017-05-30_at_1.19.49_PM.png?1496243164
Notice that instead of http://s3.amazonaws.com/... at the beginning, you need http://s3-us-west-1.amazonaws.com/...
Whenever I have issues with S3, I find it is easiest to go to the bucket, look at the path, and then inspect the image or asset and see if they match.
To give you a hint about what the issue might be: if you copy/paste the URL for a photo into your browser, you should see this message:
The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint.
Take a look at your paperclip_defaults. You are missing the s3_host_name, which should contain something like s3-us-west-1 (as mentioned in one of the prior answers).
It also looks like there is an issue already open in the Paperclip repo that should help you out (here).
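For reference, the relevant production configuration would look roughly like this (a sketch only; the bucket, region, and environment variable names are placeholders, and depending on your Paperclip and aws-sdk versions an s3_region option may also be needed):
# config/environments/production.rb
config.paperclip_defaults = {
  storage: :s3,
  s3_host_name: 's3-us-west-1.amazonaws.com',   # the missing piece: match your bucket's region
  s3_credentials: {
    bucket: ENV['S3_BUCKET_NAME'],
    access_key_id: ENV['AWS_ACCESS_KEY_ID'],
    secret_access_key: ENV['AWS_SECRET_ACCESS_KEY']
  }
}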

ERROR loading files on wxMaxima [iMac]

I hope you can solve this or at least tell me what to do about it, because I'm clueless. The thing is that once I've saved a .wxm file and then want to open it, wxMaxima shows this error and the app crashes:
Maxima encountered a Lisp error:
decoding error on stream
<SB-SYS:FD-STREAM for "socket 127.0.0.1:62607, peer: 127.0.0.1:4011"
{13F30991}
(:EXTERNAL-FORMAT :ASCII):
the octet sequence (195) cannot be decoded.
Automatically continuing.
To enable the Lisp debugger set debugger-hook to nil.
Thanks in advance.
P.S.: I run the latest Mac OS X version on my iMac.
Create a new file and write the following in it:
(setf sb-impl::*default-external-format* :utf-8)
(setf sb-alien::*default-c-string-external-format* :utf-8)
Save the file as .sbclrc in your home (user) folder.
I'm coming back to this post because I was affected by this today.
I am using Ubuntu 14.04 and the same bug appears. To me it is due to Maxima not being able to load anything other than ".mac" files, and has nothing to do with UTF-8/ASCII (I renamed a working file to .wxm and vice versa, and it stopped working / started working again).
I have also prepared a workaround:
The idea is to have a tool that translates your .wxm file to a .mac file just before you load it (it is actually a very easy bash script; a rough sketch of the idea appears after this answer).
So:
you put the wxm-to-mac.sh file into your path
then inside Maxima, instead of doing
load("foo.wxm")
you simply do
system("wxm-to-mac foo.wxm")$
load("foo.mac")$
Bear in mind that you should not edit the foo.mac file, because a routine might overwrite it afterward. Instead, keep editing the .wxm file.
Hope it helps someone.
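For what it's worth, here is a rough sketch of such a converter, written in Ruby rather than bash just to illustrate the idea, and assuming the usual wxMaxima layout where code cells sit between "/* [wxMaxima: input start ] */" and "/* [wxMaxima: input end ] */" comment markers (this is not the original author's script):
# usage: ruby wxm-to-mac.rb foo.wxm  ->  writes foo.mac
in_file = ARGV[0]
abort('usage: ruby wxm-to-mac.rb file.wxm') unless in_file
out_file = in_file.sub(/\.wxm\z/, '.mac')

inside_input = false
kept = File.readlines(in_file).select do |line|
  if line =~ /\[wxMaxima:\s*input\s+start\s*\]/
    inside_input = true
    false
  elsif line =~ /\[wxMaxima:\s*input\s+end\s*\]/
    inside_input = false
    false
  else
    inside_input   # keep only the Maxima code between the markers
  end
end

File.write(out_file, kept.join)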
Looks like the file has been saved with some non-ASCII characters (e.g. UTF-8) in it, but it is not read as UTF-8; that seems to be a bug in wxMaxima. Can you please post the offending .wxm file?
I was having the same problem. Since nobody here could give a straight and correct answer (at least none is marked with the green icon), I tried to look somewhere else. I couldn't find an answer that solved my problem, but then I remembered that I had used wxMaxima on a Mac before and it had worked pretty well. The last time I installed wxMaxima on a Mac and it worked was in September 2012, so I went to http://sourceforge.net, searched for the Maxima file that was available at that date, and found http://sourceforge.net/projects/maxima/files/Maxima-MacOS/5.18.1-MacOS/. I installed it and it is working fine. (About the problem: it is weird. Even if I created a new Maxima file, typed "a" and saved it, I could no longer open it, so I'm guessing it has nothing to do with ASCII or non-ASCII characters.) I have no idea why this error happens in the recent version of Maxima/wxMaxima, but it makes no sense that we have to install a previous version for it to work.
Anyway, it's working for me, and I hope it works for you too. Glad I helped :)
I had the same problem.
In my case, the name of the directory where the ".wxm" file was located contained Korean letters. I changed the directory name to an English one, and the problem was solved.
I hope this works for you, too.

Can't use tempurl with put method to upload

I have installed Keystone and Swift and have proxy-server.conf configured.
I can generate a temp URL to download an object without any problem.
But when I use the same approach (only changing the method to "PUT") to generate a temp URL to upload an object, I get a 401 error... The log says it can't find the authentication header...
I have tried every way I can think of to solve the problem, but no luck :<
It turned out to be a silly problem: the system clocks of the two servers were not in sync...
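For context, this is roughly how a Swift temp URL for a PUT is built with the standard tempurl middleware scheme (a sketch in Ruby; the host, account path, and key are placeholders). Because the expiry timestamp comes from the local clock and is part of the signed string, clock skew between servers can make a freshly generated URL appear expired or not yet valid:
require 'openssl'

method  = 'PUT'
expires = Time.now.to_i + 600                      # 10 minutes from *this* machine's clock
path    = '/v1/AUTH_account/container/object'
key     = 'my-temp-url-key'                        # the X-Account-Meta-Temp-URL-Key value

hmac_body = "#{method}\n#{expires}\n#{path}"
sig = OpenSSL::HMAC.hexdigest(OpenSSL::Digest.new('sha1'), key, hmac_body)

puts "https://swift.example.com#{path}?temp_url_sig=#{sig}&temp_url_expires=#{expires}"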

Net::FTPPermError (500 I won't open a connection to 10.10...... (only to 174.12........)

I have a Rails app deployed to Heroku. I have used paperclipftp to upload files to an FTP server, since Heroku doesn't offer many options for file uploads. So when I try to upload a file and save a record, I get this error:
Net::FTPPermError (500 I won't open a connection to 10.10...... (only to 174.12........)
I don't know why this is coming up. After some searching I learned that Heroku doesn't allow active FTP connections, so I tried to establish a passive connection by editing the paperclipftp file.
I added this line in its initialize block:
@ftp.passive = @ftp_credentials[:passive] if @ftp_credentials.has_key?("passive")
and passed passive: true in my YAML config file. But it still doesn't work.
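For reference, forcing passive mode with Ruby's Net::FTP directly looks roughly like this (a minimal sketch of plain Net::FTP usage, not the paperclipftp internals; host, credentials, and paths are placeholders):
require 'net/ftp'

Net::FTP.open('ftp.example.com', 'user', 'password') do |ftp|
  ftp.passive = true                                # passive mode, needed on Heroku
  ftp.putbinaryfile('tmp/photo.jpg', 'uploads/photo.jpg')
end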
Please help. Thanks in advance.
Have you found an answer to your problem? I'm facing the exact same issue and do not know how to get around it. A solution could be to use Amazon S3 to store your files. I will look in that direction and let you know.
Regards,
Luc
EDIT (28/03/11): S3 is definitely a great solution and very easy to set up. On top of that, it's really cheap if you do not have tons of pictures to upload.
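For anyone going that route, the model-level Paperclip side of an S3 setup is roughly this (a sketch; the attachment name, bucket, and environment variable names are placeholders):
# in the model
has_attached_file :photo,
  storage: :s3,
  s3_credentials: {
    bucket: ENV['S3_BUCKET_NAME'],
    access_key_id: ENV['AWS_ACCESS_KEY_ID'],
    secret_access_key: ENV['AWS_SECRET_ACCESS_KEY']
  }
validates_attachment_content_type :photo, content_type: /\Aimage\/.*\z/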
