I want to download some reports from Google Cloud Storage and I'm trying the Gcloud gem. I managed to connect successfully and I am now able to list my buckets, create one, etc.
But I can't find a way to programmatically get files from buckets that are shared with me. I got an address like gs://pubsite... and I need to connect to that bucket to download some files. How can I achieve that?
Do I need to have billing enabled?
In order to list all the objects in a bucket you can use the Google Cloud Storage Objects list API.
You need to provide the bucket name and you must have read access to the bucket in order to list its objects. You can try the API out before implementing it in your code.
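If you are sticking with the Gcloud gem, the same listing can be done with Bucket#files. A minimal sketch, assuming the shared bucket is called "pubsite" (placeholder) and your credentials already have read access:
require "gcloud"

gcloud  = Gcloud.new
storage = gcloud.storage

# List every object in the shared bucket (the bucket name is a placeholder).
bucket = storage.bucket "pubsite"
bucket.files.each do |file|
  puts file.name
end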
I hope that helps.
You do not need billing enabled to download objects from a GCS bucket. Operations on GCS buckets are billed to the project that owns the bucket. You only need to enable billing in order to create a new bucket.
Downloading a single file using the Gcloud gem looks like this:
require "gcloud"
gcloud = Gcloud.new
storage = gcloud.storage
bucket = storage.bucket "pubsite"
file = bucket.file "somefile.png"
file.download "/tmp/somefile.png"
There are some examples at http://googlecloudplatform.github.io/gcloud-ruby/docs/v0.2.0/Gcloud/Storage.html
Related
I need to store some images inside Firebase Storage and I was wondering if there's any way to add a record inside Realtime DB with the download url when adding an image inside Storage.
I don't want to do it necessarily from an app, is there a way to add a record to DB just by adding an image inside Storage from the console?
I was trying to retrieve the image URLs directly from Storage, but in some cases it turned out to be tedious, and I thought that putting the URLs in the DB would be easier. It was also a good opportunity to try Realtime DB.
You can use a Cloud Storage trigger for Cloud Functions that will add the URL to the database after a file is uploaded, like this:
const functions = require("firebase-functions");
const admin = require("firebase-admin");
admin.initializeApp();
exports.updateDb = functions.storage.object().onFinalize(async (object) => {
  // read the object data and add what you need to Firebase RTDB
  // ("uploads" is just an example path)
  await admin.database().ref("uploads").push({ bucket: object.bucket, path: object.name });
});
You cannot get the Firebase Storage download URLs with the ?token= parameter using the Admin SDK, so also check out How can I generate access token to file uploaded to firebase storage?.
If you don't want to use Cloud Functions, you can simply fetch the download URL and update the database directly from the client SDK.
The title says it all. I have a VM instance set up in my google cloud for generating some model data. A friend of mine also has a new account. We're both basically using the free credits Google provides. We're trying to figure out if there is a way that I can generate the data in my VM instance and then transfer it to my friend's GCS Bucket. He hasn't set up any buckets yet, so we're also open to suggestions on the type of storage that would help us do this task.
I realize I can set up a persistent disk and mount it to my own VM instance. But that isn't our goal right now. We just need to know if there is a way to transfer data from one Google account to another. Any input is appreciated.
There is a way to do this: have your friend create the bucket and give your email address permission to access it. Then from your VM you can use the gsutil command to copy the files to the bucket.
1) Have your friend create the bucket in the console.
2) In the Permissions section, he will click Add Member, add your email address, and grant you the Storage Object Creator role.
3) Then you SSH into your VM and use the following gsutil command to copy the files, for example: gsutil cp testfile.txt gs://friend_bucket (a Ruby alternative is sketched after these steps).
4) If you get a 403 error, you probably have to run gcloud auth login first.
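If you would rather do the copy from Ruby instead of gsutil, a minimal sketch with the Gcloud gem used in the first question could look like this; the bucket and file names are placeholders:
require "gcloud"

storage = Gcloud.new.storage
bucket  = storage.bucket "friend_bucket"

# Upload the local file into your friend's bucket under the same name.
bucket.create_file "testfile.txt", "testfile.txt"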
We are looking to support exporting photos from our S3 location to users' Dropbox. Currently I am using code like this:
@photo = Photo.find(id) # Photo.image has the attachment
@photo.image.copy_to_local_file(nil, 'tmp/png/temp.png') # get the file locally from S3
local_file = File.open('tmp/png/temp.png')
response = client.put_file('sample.png', local_file) # then copy it to Dropbox
The above method costs twice the bandwidth. Is there any way I can transfer the images directly from S3 to Dropbox without copying them locally?
Thanks in advance!
How about trying something like Mover and using their APIs?
https://mover.io/
http://support.mover.io/knowledgebase/articles/214572-how-to-transfer-or-backup-your-amazon-s3-buckets-t
Or you could try SME Storage (Storage Made Easy):
http://storagemadeeasy.com/
It's kind of ironic that Dropbox uses Amazon S3 to store all its files.
Or you could write your own streamer in Ruby and run it on an Amazon EC2 instance; it will be much faster since all the data stays within Amazon (a rough sketch follows below).
How do I HTTP post stream data from memory in Ruby?
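A minimal sketch of that streamer, assuming the modern aws-sdk-s3 gem and reusing the Dropbox client from the question; the bucket name and key are placeholders, and running it on EC2 keeps the S3 leg inside Amazon's network:
require "aws-sdk-s3"

# Pull the object body into memory (no temp file on disk) and hand it to Dropbox.
s3   = Aws::S3::Client.new(region: "us-east-1")
body = s3.get_object(bucket: "my-photos-bucket", key: "sample.png").body # StringIO

response = client.put_file("sample.png", body) # `client` is the Dropbox client from the question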
I would like to know if anyone is facing the same problem getting Rails asset files from an AWS S3 bucket,
and why it keeps showing Access Denied when I try to get the CSS uploaded by AssetSync.
Thank you very much.
By default, objects on S3 are "private" -- they are only accessible if you prove that you "own" them by providing some credentials in the query string.
To make the objects publicly accessible (ie, without having to sign the requests), you need to attach a policy to the bucket.
To add that permission, go to S3 on the AWS Management Console, click on your bucket, select Properties, and there you will see "Permissions". Try that.
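If you prefer to do it from Ruby rather than the console, one alternative sketch is to mark the compiled assets public-read through object ACLs with the aws-sdk-s3 gem; the bucket name and prefix are placeholders, and the bucket must not block public ACLs:
require "aws-sdk-s3"

s3     = Aws::S3::Resource.new(region: "us-east-1")
bucket = s3.bucket("my-assets-bucket")

# Make every compiled asset publicly readable so the browser can fetch it
# without a signed URL.
bucket.objects(prefix: "assets/").each do |summary|
  summary.object.acl.put(acl: "public-read")
end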
I am storing many images in Amazon S3,
using a ruby lib (http://amazon.rubyforge.org/)
I don't care about photos older than 1 week, so to free up space in S3 I have to delete those photos.
I know there is a method to delete the object in a certain bucket:
S3Object.delete 'photo-1.jpg', 'photos'
Is there a way to automatically delete images older than a week?
If there isn't, I'll have to write a daemon to do that :-(
Thank you
UPDATE: it is now possible, see Roberto's answer.
You can use the Amazon S3 Object Expiration policy
Amazon S3 - Object Expiration | AWS Blog
If you use S3 to store log files or other files that have a limited lifetime, you probably had to build some sort of mechanism in-house to track object ages and to initiate a bulk deletion process from time to time. Although our new Multi-Object deletion function will help you to make this process faster and easier, we want to go even farther.

S3's new Object Expiration function allows you to define rules to schedule the removal of your objects after a pre-defined time period. The rules are specified in the Lifecycle Configuration policy that you apply to a bucket. You can update this policy through the S3 API or from the AWS Management Console.
Object Expiration | AWS S3 Documentation
Some objects that you store in an Amazon S3 bucket might have a well-defined lifetime. For example, you might be uploading periodic logs to your bucket, but you might need to retain those logs for a specific amount of time. You can use Object Lifecycle Management to specify a lifetime for objects in your bucket; when the lifetime of an object expires, Amazon S3 queues the objects for deletion.
Ps: Click on the links for more information.
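If you want to set the rule from code rather than the console, a minimal sketch with the current aws-sdk-s3 gem (not the old amazon.rubyforge.org lib from the question) could look like this; the bucket name is a placeholder:
require "aws-sdk-s3"

s3 = Aws::S3::Client.new(region: "us-east-1")

# Expire every object in the bucket seven days after it was created.
s3.put_bucket_lifecycle_configuration(
  bucket: "photos",
  lifecycle_configuration: {
    rules: [{
      id: "expire-old-photos",
      status: "Enabled",
      filter: { prefix: "" }, # empty prefix = every object in the bucket
      expiration: { days: 7 }
    }]
  }
)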
If you have access to a local database, it's easy to simply log each image (you may be doing this already, depending on your application), and then you can run a simple query to retrieve the whole list and delete each of them. This is much faster than querying S3 directly, but it does require local storage of some kind.
Unfortunately, Amazon doesn't offer an API for automatic deletion based on a specific set of criteria.
You'll need to write a daemon that goes through all of the photos, selects just those that meet your criteria, and then deletes them one by one.
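A rough sketch of that daemon's core loop, assuming the modern aws-sdk-s3 gem rather than the old amazon.rubyforge.org lib, with the bucket name as a placeholder:
require "aws-sdk-s3"

s3     = Aws::S3::Resource.new(region: "us-east-1")
cutoff = Time.now - 7 * 24 * 60 * 60 # one week ago

# List every object in the bucket and delete the ones older than the cutoff.
s3.bucket("photos").objects.each do |summary|
  summary.delete if summary.last_modified < cutoff
end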