I am using AWS S3 and CloudFront to store images and render them on a webpage. I have managed to generate a signed URL with CloudFront to serve S3 objects.
The problem is that when I attach that signed URL to an a tag's href attribute, hoping that clicking the link would display the image in the browser, the file is downloaded instead when a user clicks on the link.
How the photo key is generated
var photoKey = $ctrl.clientDetail["_id"] + '/' + fileName;
I create a subdirectory for each user and store the file in the respective subdirectory; e.g. a user whose _id is abc123 uploading photo.png gets the key abc123/photo.png.
My Upload Function in AngularJS
s3.upload({
    Key: photoKey,
    Body: file,
    ACL: 'public-read',
    Metadata: {
        'id': $ctrl.clientDetail["_id"],
        'phone': $ctrl.phone
    }
}, function(err, data) {
    if (err) {
        $scope.uploadedFail = true;
        console.log(err);
        genericServices.setErrorInfo($scope.configErrorAddDocAlert, addDoc_reason);
    } else {
        $scope.uploadedSuccess = true;
        genericServices.setSuccessInfo($scope.configSuccessAddDocAlert, addDoc_success);
    }
});
What I found interesting was that when I manually uploaded a file to a subdirectory through the AWS S3 console, the generated URL served the file in the browser.
How do I make it so that the object in the bucket is displayed in the browser rather than downloaded as a file?
For those who faced the same problem, here is the solution.
When uploading, specify the content-type.
S3 Upload Process modified
s3.upload({
    Key: photoKey,
    Body: file,
    ContentType: file.type,
    Metadata: {
        'id': $ctrl.clientDetail["_id"],
        'phone': $ctrl.phone
    }
}, function(err, data) {
    if (err) {
        $scope.uploadedFail = true;
        console.log(err);
        genericServices.setErrorInfo($scope.configErrorAddDocAlert, addDoc_reason);
    } else {
        $scope.uploadedSuccess = true;
        genericServices.setSuccessInfo($scope.configSuccessAddDocAlert, addDoc_success);
    }
});
Initially, the content type defaulted to application/octet-stream, which browsers treat as a file to download. With ContentType set from file.type (e.g. image/png), the signed URL renders the image inline instead.
Thanks to @michael-sqlbot for the comment.
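As a side note, objects that were already stored with the wrong type don't have to be re-uploaded. A minimal sketch, reusing the s3 client and photoKey from above (the bucket name and content type here are illustrative assumptions), that rewrites the metadata in place with copyObject:
s3.copyObject({
    Bucket: 'my-bucket',                  // assumption: your bucket name
    CopySource: 'my-bucket/' + photoKey,  // copy the object onto itself
    Key: photoKey,
    MetadataDirective: 'REPLACE',         // required so S3 applies the new ContentType
    ContentType: 'image/png'              // the correct type for this file
}, function(err, data) {
    if (err) console.log(err);
});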
Related
I am using this Ghost plugin to store image data on Google Drive. Recently, images have stopped loading, with an error page downloaded in place of the image.
The site is running as a containerized Ghost instance on Google Cloud Run, source here.
Do I need to open a support ticket somewhere to resolve this? The site in question is here.
EDIT: Here is the code used to access the saved content.
jwtClient.authorize(function(err, tokens) {
    if (err) {
        return next(err);
    }
    const drive = google.drive({
        version: API_VERSION,
        auth: jwtClient
    });
    drive.files.get({
        fileId: id
    }, function(err, response) {
        if (!err) {
            const file = response.data;
            const newReq = https
                .request(file.downloadUrl + "&access_token=" + tokens.access_token, function(newRes) {
                    // Modify google headers here to cache!
                    const headers = newRes.headers;
                    headers["content-disposition"] = "attachment; filename=" + file.originalFilename;
                    headers["cache-control"] = "public, max-age=1209600";
                    delete headers["expires"];
                    res.writeHead(newRes.statusCode, headers);
                    // pipe the file
                    newRes.pipe(res);
                })
                .on("error", function(err) {
                    console.log(err);
                    res.statusCode = 500;
                    res.end();
                });
            req.pipe(newReq);
        } else {
            next(err);
        }
    });
});
Your problem is related to file.downloadUrl. This field is not guaranteed to exist and is not meant to be used to download files.
The correct way to do this is to use the webContentLink property instead. You can look here for reference.
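For example, a minimal sketch reusing the drive client and id from the question (with the v3 API, webContentLink must be requested explicitly through fields):
drive.files.get({
    fileId: id,
    fields: "webContentLink"
}, function(err, response) {
    if (err) {
        return next(err);
    }
    // Redirect the browser to Drive's own download endpoint for the file.
    res.writeHead(302, { "Location": response.data.webContentLink });
    res.end();
});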
I am using the OneDrive REST API to upload a file into my OneDrive account. Below is the link to the Microsoft documentation for uploading a file.
https://learn.microsoft.com/en-us/onedrive/developer/rest-api/api/driveitem_createuploadsession?view=odsp-graph-online
Whenever I use the above API, the file gets uploaded into my account, but it is corrupted.
Below is my request object.
{
    method: "PUT",
    url: <upload URL>,
    processData: false,
    headers: {
        "Authorization": <access_token>,
        "Content-Disposition": 'form-data; name="metadata"',
        "Content-Type": "application/json; charset=UTF-8",
        "Content-Transfer-Encoding": "8bit"
    },
    formData: {
        file: {
            value: fs.createReadStream("Smile.png"),
            options: {
                filename: "Smile.png",
                contentType: null
            }
        }
    }
}
The file gets uploaded into the proper folder, but it's corrupted and I am unable to view it in my OneDrive account.
Can someone please help me with this?
The problem lies in how you pass the data in the body. I had the same issue and solved it by passing the image buffer directly (in your case, fs.createReadStream("Smile.png")) as the body, without any curly brackets {}.
My code:
const config = {
    headers: { Authorization: `Bearer ${token}` }
};
const bodyParameters = imageBuffer;
await axios.put(
    "https://graph.microsoft.com/v1.0/drives/{drive-id}/items/{item-id}:/{filename}:/content",
    bodyParameters,
    config
);
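For completeness, a minimal sketch of producing imageBuffer from a local file, assuming Node's fs module (the filename is taken from the question):
const fs = require("fs");
// Read the whole file into a Buffer; axios sends it as the raw request body.
const imageBuffer = fs.readFileSync("Smile.png");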
I implemented a Jersey REST service to download the zip file.
Now, I would like to use axios in the front end to download the zip file.
Everything is fine in Chrome on PC, but when tried with Safari on an iPad it opens a tab named "unknown".
I have searched some articles which mentioned that this may be related to iOS Safari compatibility,
e.g. https://caniuse.com/#feat=download
However, I also want to know if there is any method to present the downloaded file as "file.zip" in Safari.
Below is my code
Backend:
@GET
@Path("/getTestingReport")
@Produces("application/zip")
public Response getTestingReport() throws Exception {
    // set file (and path) to be downloaded
    File file = new File("C:/Users/abc/Desktop/test.zip");
    ResponseBuilder responseBuilder = Response.ok((Object) file);
    responseBuilder.header("Content-Disposition", "attachment; filename=\"MyJerseyZipFile.zip\"");
    return responseBuilder.build();
}
Frontend:
axios.get("report/getTestingReport").then((response) => {
console.log("response", response)
var blob = new Blob([response.data], { type: "application/zip" });
const url = window.URL.createObjectURL(blob);
const link = document.createElement('a');
link.href = url;
link.setAttribute('download', 'file.zip');
document.body.appendChild(link);
link.click();
}).catch((error) => {
console.error("error response", error.response)
});
May I have any suggestions?
I've created an app that allows users to upload their images to a Google Cloud Storage bucket - which is then used in social media sharing previews.
The image is uploaded directly to the bucket from the user's browser - using the Firebase API.
What I also want to do is, when an image is saved, automatically post that image on my app's Twitter feed.
The way I've done this is to use a Cloud Function trigger on Cloud Storage, which downloads the image and then uploads it via the Twitter API.
There's essentially an unnecessary double handling of traffic here. Is there a way to just give the Twitter API the public location of the file and have it source the file directly?
Here's my code for the current solution:
class Defferred {
    constructor() {
        const that = this;
        this.prom = new Promise((resolve, reject) => {
            that.resolve = resolve;
            that.reject = reject;
        });
    }
}

exports.onNewImage = functions.storage.object().onFinalize((object) => {
    const prom = new Defferred();
    bucket.file(object.name).download((err, file, response) => {
        if (err) {
            return prom.reject(err);
        } else {
            twitterClient.post('media/upload', {
                media: file
            }, (err, media, response) => {
                if (!err) {
                    let status = {
                        status: "Somebody created this at https://geoplanets.io #geometry #geometricart",
                        media_ids: media.media_id_string
                    };
                    twitterClient.post('statuses/update', status, (error, tweet, response) => {
                        if (!error) {
                            return prom.resolve(response);
                        } else {
                            return prom.reject(error);
                        }
                    });
                } else {
                    return prom.reject(err);
                }
            });
        }
    });
    return prom.prom;
});
Is there an alternative way of doing this that doesn't involve downloading the file? A good answer would highlight the relevant parts of the API documentation that show how I would go about working this out myself.
The Twitter node API doesn't have a way to simply pass a URL for media upload. The example they give shows what you're doing now: sending the full content with the request.
The node client is just a wrapper around the REST API, and if you read its docs, you'll see that you have to provide the file content directly in the POST.
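For instance, with the same twitterClient as in the question, the downloaded Buffer could also be sent base64-encoded through the media_data parameter of the v1.1 upload endpoint (a sketch, not tested against the question's exact client library):
twitterClient.post('media/upload', {
    media_data: file.toString('base64')  // file is the Buffer from bucket.file(...).download()
}, function(err, media, response) {
    // on success, attach media.media_id_string to statuses/update as before
});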
Yes!
We can upload media from the URL of a file by first streaming the file down.
First we make an Axios request to get a buffer of the file, then we pass it, along with the file type, using the twitter-api-v2 package (the same flow works against the REST API directly).
const client = new TwitterApi({
    appKey: CONSUMER_KEY,
    appSecret: CONSUMER_SECRET,
    accessToken: oauth_token,
    accessSecret: oauth_token_secret,
});

const url = 'URL OF THE FILE';
const downStream = await axios({
    method: 'GET',
    responseType: 'arraybuffer',
    url: url,
}).catch(function (error) {
    res.send({ error: error });
});

const mediaId = await client.v1.uploadMedia(downStream.data, { mimeType: 'image/png' }); // full MIME type, not just 'png'
const newTweet = await client.v1.tweet('Hello link tweet!', { media_ids: mediaId });
I have a Meteor application deployed with nginx.
I try to upload images from the application and save them on the server. On localhost, I save my images in the myapp/public/uploads folder. But when I deploy, this folder becomes myapp/bundle/programs/web.browser/app/uploads, so when I upload an image it is saved in a new myapp/public/uploads folder that I can't access. On localhost I access my images like this: localhost:3000/uploads/myImage.png, but when I request myAdress/uploads/myImage.png it resolves to the myapp/bundle/programs/web.browser/app/uploads folder and not the one where the images are saved (myapp/public/uploads).
This is my code to save images:
Meteor.startup(function () {
    UploadServer.init({
        tmpDir: process.env.PWD + '/app/uploads',
        uploadDir: process.env.PWD + '/app/uploads',
        checkCreateDirectories: true,
        uploadUrl: '/upload',
        // *** For renaming files on server
        getFileName: function(file, formData) {
            // currentTileId is a variable passed from publications.js
            var name = file.name;
            name = name.replace(/\s/g, '');
            return currentTileId + "_" + name;
        },
        finished: function(fileInfo, formFields) {
            var name = fileInfo.name;
            name = name.replace(/\s/g, '');
            insertionImages(name, currentTileId, docId);
        },
    });
});
So, do you know how I can save and access my images when the application is deployed? Maybe save the image in the myapp/bundle/programs/web.browser/app/uploads folder, or access the myapp/public/uploads folder with a URL.
This is what we do.
Use an external dir for uploads, say /var/uploads. Keeping the uploads in the public folder makes the Meteor app reload in the dev environment on any file upload.
Now, in development, use Meteor to serve these files at a certain URL. In production, use nginx to serve the same files at the same URL.
For Development
1) Symlink your upload dir to a hidden folder in public.
e.g.:
ln -s /var/uploads /path/to/public/.#static
2) Serve the hidden public folder via Meteor.
The URL /static will serve the folder public/.#static by using the following code on the server. Ref: How to prevent Meteor from watching files?
var fs = require('fs'), mime = require('mime');

WebApp.rawConnectHandlers.use(function(req, res, next) {
    var data, filePath, re, stats, type;
    re = /^\/static\/(.*)$/.exec(req.url);
    if (re !== null) {
        filePath = process.env.PWD + '/public/.#static/' + re[1];
        try {
            stats = fs.lstatSync(filePath);
            if (stats.isFile()) {
                type = mime.lookup(filePath);
                data = fs.readFileSync(filePath); // read the file itself; the original's second argument was an undefined variable
                res.writeHead(200, {
                    'Content-Type': type
                });
                res.write(data);
                res.end();
            }
        } catch (e) {
            // console.error(filePath, "not found", e); // eslint-disable-line no-console
            next();
        }
    } else {
        next();
    }
});
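With this in place, a file saved to the uploads dir, say /var/uploads/logo.png (an illustrative name), becomes reachable at localhost:3000/static/logo.png during development.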
For production
1) Use nginx for serving the upload dir
server {
    ...
    # Use "alias" (not "root") so /static/foo maps to /var/uploads/foo;
    # with "root", nginx would look for /var/uploads/static/foo instead.
    location /static/ {
        alias /var/uploads/;
    }
    ...
}
That's it. /static will serve the content of your uploads dir, i.e. /var/uploads.