According to this documentation, https://developers.google.com/identity/oauth2/web/guides/use-token-model , which says:
Token expiration
By design, access tokens have a short lifetime. If the access token expires prior to the end of the user's session, obtain a new token by calling requestAccessToken() from a user-driven event such as a button press.
This steers us away from using a refresh token. That is Question 0: when should a refresh token be used?
My app is a web app and users stay signed in for a long time, longer than the access-token lifetime, which seems to be one hour. To make it a little easier to continually fetch fresh access tokens as they are required, I wrote the code below.
It is intended to enable me to make statements like:
const provider = new GoogleAccessTokenProvider(clientId)
const scopes = ["https://www.googleapis.com/auth/drive.metadata.readonly"]
provider.getTokenWithScopes(scopes)
.then(token => {
console.log(token)
}).catch(e => {
console.log(e)
})
It seems like I have to do a lot of uninteresting stuff in this code, so I have many questions about it, listed below:
Question 1 - It stores the token response from google.accounts.oauth2 in local storage. Is this the kind of thing Google intended us to do - store the access token locally?
Question 2 - It manages token expiry (with a ten-second buffer). Again, is this something Google intends us to concern ourselves with? Is there a lib that does this automatically?
Question 3 - It calls requestAccessToken() with prompt: 'none' when the stored token has the required scopes but has expired. This flashes up the Google consent dialog for a second, then it disappears. This seems to me to be where a refresh token would be handy.
Anyway, it works, but this flash-up of the consent dialog is not great UX - is this what Google intended us to do?
Question 4 - It calls initTokenClient for every requestAccessToken(), because scopes are bound to this call. Seems odd to me - is this what Google intended us to do?
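For comparison, the simpler shape I think the docs describe is a single token client created once, which only works if the scope set is fixed up front. A sketch only; onToken and button below are illustrative names, not part of my code:
const tokenClient = google.accounts.oauth2.initTokenClient({
    client_id: clientId,
    scope: "https://www.googleapis.com/auth/drive.metadata.readonly",
    callback: (tokenResponse) => {
        // hand the fresh access token to whatever needs it
        onToken(tokenResponse.access_token)
    },
})
// later, from a user-driven event such as a button press:
button.addEventListener("click", () => tokenClient.requestAccessToken())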
Anyway, here is the code:
class LocalStorage {
getItem(key: string): string | null {
return window.localStorage.getItem(key)
}
setItem(key: string, value: string): void {
return window.localStorage.setItem(key, value)
}
}
class TokenResponseStorage {
constructor(private readonly localStorage = new LocalStorage()) {
}
getItem(key: string): WrappedGoogleAccessTokenResponse | null {
const value = this.localStorage.getItem(key)
if (value === null) {
return null
}
return JSON.parse(value)
}
setItem(key: string, value: WrappedGoogleAccessTokenResponse): void {
this.localStorage.setItem(key, JSON.stringify(value))
}
}
interface WrappedGoogleAccessTokenResponse {
_type: "wrapped.google.access.token.response"
expiresAtMillis: number
tokenResponse: google.accounts.oauth2.TokenResponse
}
export class GoogleAccessTokenProvider {
private readonly expiryMarginMillis = 1000 * 10 // ten seconds before expiry
constructor(private readonly clientId: string,
private readonly storage = new TokenResponseStorage(),
private readonly storageKey = 'docsndata.google.access.token') {
}
async getTokenWithScopes(scopes: string[]): Promise<string> {
return new Promise(resolve => {
const wrappedToken = this.storage.getItem(this.storageKey)
if (wrappedToken && scopes.every(s => wrappedToken.tokenResponse.scopes.includes(s))) {
if (this._hasSufficientExpiry(wrappedToken)) {
resolve(wrappedToken.tokenResponse.access_token)
} else {
this._promptForToken(scopes, 'none').then(resolve)
}
} else {
this._promptForToken(scopes, 'consent').then(resolve)
}
})
}
private _hasSufficientExpiry(token: WrappedGoogleAccessTokenResponse): boolean {
return token.expiresAtMillis - this.expiryMarginMillis > Date.now()
}
private async _promptForToken(scopes: string[], prompt: "none" | "consent"): Promise<string> {
const that = this
return new Promise(resolve => {
const tokenClient = google.accounts.oauth2.initTokenClient({
client_id: this.clientId,
scope: scopes.join(' '),
callback: function (tokenResponse) {
that._storeTokenResponse(tokenResponse)
resolve(tokenResponse.access_token)
}
})
tokenClient.requestAccessToken({prompt})
})
}
private _storeTokenResponse(tokenResponse: google.accounts.oauth2.TokenResponse) {
const wrappedTokenResponse: WrappedGoogleAccessTokenResponse = {
_type: "wrapped.google.access.token.response",
expiresAtMillis: Date.now() + parseInt(tokenResponse.expires_in) * 1000,
tokenResponse
}
this.storage.setItem(this.storageKey, wrappedTokenResponse)
}
}
Setup
msal (in another file, passed using MsalProvider):
const msalInstance = new PublicClientApplication({
auth: {
clientId: <B2C-Application-ID>,
authority: "https://login.microsoftonline.com/<tenant-directory-id>",
redirectUri: "http://localhost:3000",
},
cache: {
cacheLocation: "sessionStorage",
storeAuthStateInCookie: false,
}
});
Import:
import * as msal from "@azure/msal-browser";
import {EventType, InteractionStatus} from "@azure/msal-browser";
import React, {createContext, FC, useState} from "react";
import {useIsAuthenticated, useMsal} from "@azure/msal-react";
import {AuthenticationContextType} from "../@types/authentication";
import {EndSessionRequest} from "@azure/msal-browser/dist/request/EndSessionRequest";
import jwtDecode, {JwtPayload} from "jwt-decode";
Variables:
const {instance, accounts, inProgress} = useMsal();
const isAuthenticated = useIsAuthenticated();
const [token, setToken] = useState<string | null>(null);
Login:
function loginRedirect() {
instance.loginRedirect({
scopes: ["User.Read"],
prompt: "select_account"
});
}
Acquire token:
function getToken(): string | null {
if (token) {
const decodedJwt = jwtDecode<JwtPayload>(token);
if (decodedJwt.exp && decodedJwt.exp * 1000 > Date.now()) {
return token; // Token is still valid
}
}
// If token is not available or not valid anymore, acquire a new one
if (instance.getActiveAccount() && inProgress === InteractionStatus.None) {
const accessTokenRequest = {
scopes: ["User.Read"],
account: accounts[0]
}
instance.acquireTokenSilent(accessTokenRequest)
.then(response => {
console.log(`access token: ${response.accessToken}`);
console.log(`id token: ${response.idToken}`);
setToken(response.accessToken);
return response.accessToken;
})
.catch(err => {
if (err instanceof msal.InteractionRequiredAuthError) {
return instance.acquireTokenPopup(loginRequest)
.then(response => {
setToken(response.accessToken);
return response.accessToken;
})
.catch(err => {
console.log(err);
})
} else {
console.log(err);
}
})
} else {
console.error("No account logged in to acquire token");
}
return null;
}
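(As an aside, separate from the validation problem below: because acquireTokenSilent is asynchronous, getToken above returns null even when a token is eventually acquired. A rough awaitable variant, reusing the same instance/accounts/setToken from above, could look like this — a sketch, not tested:)
async function getTokenAsync(): Promise<string | null> {
    if (!instance.getActiveAccount() || inProgress !== InteractionStatus.None) {
        console.error("No account logged in to acquire token");
        return null;
    }
    const accessTokenRequest = { scopes: ["User.Read"], account: accounts[0] };
    try {
        const response = await instance.acquireTokenSilent(accessTokenRequest);
        setToken(response.accessToken);
        return response.accessToken;
    } catch (err) {
        if (err instanceof msal.InteractionRequiredAuthError) {
            // Silent renewal failed, fall back to interactive acquisition
            const response = await instance.acquireTokenPopup(accessTokenRequest);
            setToken(response.accessToken);
            return response.accessToken;
        }
        console.log(err);
        return null;
    }
}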
Problem
I acquire two tokens (ID and access) from msal (see console logs). The ID token is validated successfully (on my API and on jwt.io) but my access token is not (neither on my API nor on jwt.io). Referring to this Microsoft documentation, I should use the access token to authenticate against an API.
As far as I can see, jwt.io does fetch the public key correctly from https://sts.windows.net/<tenant-directory-id>/discovery/v2.0/keys. This means this solution is either outdated or doesn't solve my problem. To make sure, I also tried to copy & paste the public key, which didn't work either.
I also found this solution, which didn't work for me either. Changing the scopes leads to an endless login loop.
Versions:
"#azure/msal-browser": "^2.28.3",
"#azure/msal-react": "^1.4.7",
"jwt-decode": "^3.1.2",
1. Scope
For requesting B2C access tokens you have to specify a valid scope. These are also set in Azure (Azure AD B2C -> App registrations -> your application -> Manage -> API permissions). There you have to specify a scope. While acquiring the tokens you have to specify these scopes like this:
const accessTokenRequest = {
scopes: ["https://<tenant-name>.onmicrosoft.com/<app-id>/<scope>"],
}
await instance.acquireTokenSilent(accessTokenRequest)
.then(response => {
setIdToken(response.idToken);
setAccessToken(response.accessToken);
})
.catch(async err => {
if (err instanceof msal.InteractionRequiredAuthError) {
await instance.acquireTokenPopup(accessTokenRequest)
.then(response => {
setIdToken(response.idToken);
setAccessToken(response.accessToken);
})
.catch(err => {
console.log(err);
})
} else {
console.log(err);
}
})
tenant-name: you can find this in the Application ID URI
app-id: your Application (client) ID
your-scope: could be something like Subscriptions.Read
A full example for a scope could be:
https://mydemo.onmicrosoft.com/12345678-0000-0000-0000-000000000000/Subscriptions.Read
2. Invalid token version
For me the problem was 1. Scope but maybe this does not solve the problem for others. Here is something else to try:
Following this article, the sts URL is used for the v1 endpoint. The documentation claims:
The endpoint used, v1.0 or v2.0, is chosen by the client and only impacts the version of id_tokens. Possible values for accessTokenAcceptedVersion are 1, 2, or null. If the value is null, this parameter defaults to 1, which corresponds to the v1.0 endpoint.
This means that the endpoint I used (v2.0 in my case) affected only the ID token, which is why it validated successfully. The access token was still v1, so its signature could not be validated.
Solution
To change the version, accessTokenAcceptedVersion needs to be set to 2 inside the Manifest. It is located at portal.azure.com -> Azure AD B2C -> App registrations -> your application -> Manage -> Manifest:
{
...
"accessTokenAcceptedVersion": 2,
...
}
Save the changes and wait. For me it took several hours for the change to be applied, and I had to apply solution 1. Scope as well. After that, the iss of new tokens should be https://login.microsoftonline.com/<tenant-directory-id>/v2.0 instead of the sts URI.
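To sanity-check which version you actually received, you can decode the access token client-side (inspection only, not signature validation), for example with the jwt-decode package already used above:
import jwtDecode, { JwtPayload } from "jwt-decode";

// Sketch: inspect (not validate) the claims of an acquired access token.
// `accessToken` is whatever acquireTokenSilent/acquireTokenPopup returned.
const claims = jwtDecode<JwtPayload & { ver?: string }>(accessToken);
console.log(claims.iss); // expect https://login.microsoftonline.com/<tenant-directory-id>/v2.0
console.log(claims.ver); // expect "2.0" once accessTokenAcceptedVersion has taken effect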
Can anyone provide or point at an AWS-CDK/typescript example provisioning an AWS AppConfig app/env/config consumed by a lambda please? Could not find anything meaningful online.
A little late to this party, but if anyone is coming to this question looking for a quick answer, here you go:
AppConfig Does Not Have Any Official L2 Constructs
This means that one has to work with L1 constructs currently vended here
This is ripe for someone authoring an L2/L3 construct (I am working on vending this custom construct soon - so keep an eye out on updates here).
This does not mean it's hard to use AppConfig in CDK; it's just that, unlike with L2/L3 constructs, we have to dig deeper into setting AppConfig up by reading their documentation.
A Very Simple Example (YMMV)
Here is a custom construct I have to setup AppConfig:
import {
CfnApplication,
CfnConfigurationProfile,
CfnDeployment,
CfnDeploymentStrategy,
CfnEnvironment,
CfnHostedConfigurationVersion,
} from "aws-cdk-lib/aws-appconfig";
import { Construct } from "constructs";
// 1. https://docs.aws.amazon.com/cdk/api/v2/docs/aws-cdk-lib.aws_appconfig-readme.html - there are no L2 constructs for AppConfig.
// 2. https://docs.aws.amazon.com/appconfig/latest/userguide/appconfig-working.html - this is the CDK code for this console setup guide.
export class AppConfig extends Construct {
constructor(scope: Construct, id: string) {
super(scope, id);
// create a new app config application.
const testAppConfigApp: CfnApplication = new CfnApplication(
this,
"TestAppConfigApp",
{
name: "TestAppConfigApp",
}
);
// you can customize this as per your needs.
const immediateDeploymentStrategy = new CfnDeploymentStrategy(
this,
"DeployStrategy",
{
name: "ImmediateDeployment",
deploymentDurationInMinutes: 0,
growthFactor: 100,
replicateTo: "NONE",
finalBakeTimeInMinutes: 0,
}
);
// setup an app config env
const appConfigEnv: CfnEnvironment = new CfnEnvironment(
this,
"AppConfigEnv",
{
applicationId: testAppConfigApp.ref,
// can be anything that makes sense for your use case.
name: "Production",
}
);
// setup config profile
const appConfigProfile: CfnConfigurationProfile = new CfnConfigurationProfile(
this,
"ConfigurationProfile",
{
name: "TestAppConfigProfile",
applicationId: testAppConfigApp.ref,
// we want AppConfig to manage the configuration profile, unless we need to source it from SSM or S3.
locationUri: "hosted",
// This can also be "AWS.AppConfig.FeatureFlags"
type: "AWS.Freeform",
}
);
// Update AppConfig
const configVersion: CfnHostedConfigurationVersion = new CfnHostedConfigurationVersion(
this,
"HostedConfigurationVersion",
{
applicationId: testAppConfigApp.ref,
configurationProfileId: appConfigProfile.ref,
content: JSON.stringify({
someAppConfigResource: "SomeAppConfigResourceValue",
//... add more as needed.
}),
// https://www.rfc-editor.org/rfc/rfc9110.html#name-content-type
contentType: "application/json",
}
);
// Perform deployment.
new CfnDeployment(this, "Deployment", {
applicationId: testAppConfigApp.ref,
configurationProfileId: appConfigProfile.ref,
configurationVersion: configVersion.ref,
deploymentStrategyId: immediateDeploymentStrategy.ref,
environmentId: appConfigEnv.ref,
});
}
}
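To consume this from a lambda, I wire the construct into a stack roughly like below. Treat it as a sketch: the ./app-config import path, the extension layer ARN and the IAM actions are placeholders you will need to adapt to your region/account:
import { Stack, StackProps } from "aws-cdk-lib";
import * as lambda from "aws-cdk-lib/aws-lambda";
import * as iam from "aws-cdk-lib/aws-iam";
import { Construct } from "constructs";
import { AppConfig } from "./app-config"; // the construct shown above

export class AppConfigStack extends Stack {
    constructor(scope: Construct, id: string, props?: StackProps) {
        super(scope, id, props);

        new AppConfig(this, "AppConfig");

        const handler = new lambda.Function(this, "ConfigConsumer", {
            runtime: lambda.Runtime.NODEJS_18_X,
            handler: "index.handler",
            code: lambda.Code.fromAsset("lambda"),
        });

        // AWS AppConfig Lambda extension layer; look up the ARN for your region
        handler.addLayers(
            lambda.LayerVersion.fromLayerVersionArn(
                this,
                "AppConfigExtension",
                "arn:aws:lambda:<region>:<account>:layer:AWS-AppConfig-Extension:<version>" // placeholder
            )
        );

        // Let the function (via the extension) fetch configuration
        handler.addToRolePolicy(new iam.PolicyStatement({
            actions: ["appconfig:StartConfigurationSession", "appconfig:GetLatestConfiguration"],
            resources: ["*"], // scope down to your AppConfig ARNs in real use
        }));
    }
}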
Here is what goes inside my lambda handler (please note that you need the AWS AppConfig Lambda extension layer added to your function; see more information here):
const http = require('http');
exports.handler = async (event) => {
const res = await new Promise((resolve, reject) => {
http.get(
"http://localhost:2772/applications/TestAppConfigApp/environments/Production/configurations/TestAppConfigProfile",
resolve
).on('error', reject);
});
let configData = await new Promise((resolve, reject) => {
let data = '';
res.on('data', chunk => data += chunk);
res.on('error', err => reject(err));
res.on('end', () => resolve(data));
});
const parsedConfigData = JSON.parse(configData);
return parsedConfigData;
};
EDIT: As promised, I released a new npm library: https://www.npmjs.com/package/cdk-appconfig. As of this update it is still pre-release, but time permitting I will release a 1.x soon. In the meantime this will allow people to check it out or fork it if need be. Please share feedback or open PRs if you'd like to collaborate.
I have no experience with AWS AppConfig, but I would start here and here.
I've been playing around with Stripe and would like to learn how to get ephemeral keys in the following way:
Back-end:
//Stripe API requirement for payment
exports.stripeEphemeralKey = functions.https.onCall((data, context) => {
const uid = context.auth.uid;
//Get values
admin.database().ref().child("users").child(uid)
.on("value", (snapshot) =>{
//Get user data
let user = snapshot.val()
//Log data
console.log("Create ephemeral key for:")
console.log(user)
//Create ephemeral key
stripe.ephemeralKeys.create(
{customer: user.customerid },
{stripe_version: '2018-11-08'}
)
.then((key) => {
console.log(key)
console.log("Succesful path. Ephemeral created.")
return "Testing"
})
.catch((err) => {
console.log("Unsuccesful path. Ephemeral not created.")
console.log(err)
return {
valid: false,
data: "Error creating Stripe key"
}
})
})
})
Client-side:
functions.httpsCallable("stripeEphemeralKey").call(["text": "Testing"]) { (result, error) in
print(result?.data)
}
I have tested this code by replacing the body of the stripeEphemeralKey with a simple "Testing" string and that returns just fine. But with the code above I just get Optional() back.
For testing I added lots of console logs. Firebase logs show the execution path gets to the "Succesful path. Ephemeral created." log, and furthermore I can actually see the ephemeral key I get back from stripe.
So, what is the correct way to get the ephemeral key in Swift for iOS using the onCall Firebase function?
The backend does what it should, but I can't seem to get the answer back.
Thank you.
The backend does not actually do what it should do. You're doing at least two things wrong here.
First, your callable function needs to return a promise that resolves with the value that you want to send to the client. Right now, your function callback isn't returning anything at all, which means the client won't receive anything. You have return values inside promise handlers, but you need a top-level return statement.
Second, you're using on() to read data from Realtime Database, which attaches a listener that persists until it's removed. This is almost certainly never what you want to do in a Cloud Function. Instead, use once() to get a single snapshot of the data you want to read, and act on that.
For my own reference, and those who might find this helpful:
//Stripe API requirement for payment
exports.stripeEphemeralKey = functions.https.onCall((data, context) => {
const uid = context.auth.uid;
return admin.database().ref().child("users").child(uid)
.once("value", (snapshot) =>{
console.log(snapshot.val() )
})
.then( (snap) => {
const customer = snap.val()
//Log data
console.log("Create ephemeral key for:")
console.log(customer)
//Create ephemeral key
return stripe.ephemeralKeys.create(
{customer: customer.customerid },
{stripe_version: '2018-11-08'}
)
.then((key) => {
console.log(key)
console.log("Succesful path. Ephemeral created.")
return {
valid: true,
data: key
}
})
.catch((err) => {
console.log("Unsuccesful path. Ephemeral not created.")
console.log(err)
return {
valid: false,
data: "Error creating Stripe key"
}
})
})
.catch( err => {
console.log("Unsuccesful path. Ephemeral not created.")
console.log(err)
return {
valid: false,
data: "Error gettting customerid"
}
})
})
The key seems to be to chain the initial database request with .then(), and to do our work by chaining the returns uninterrupted, since we are using functions that return promises. In particular, placing my work-code inside the callback on the original admin.database().ref().once() call did not work for me.
I am new to this kind of programming, so someone who knows more about this might be able to explain why better.
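For anyone who prefers async/await over the chained .then() calls, I believe the same flow can be written roughly as below (assuming admin and stripe are initialized as in the snippets above; a sketch, not tested):
exports.stripeEphemeralKey = functions.https.onCall(async (data, context) => {
    const uid = context.auth.uid;
    try {
        // Read the user record once instead of attaching a listener
        const snapshot = await admin.database().ref("users").child(uid).once("value");
        const user = snapshot.val();
        const key = await stripe.ephemeralKeys.create(
            {customer: user.customerid},
            {stripe_version: '2018-11-08'}
        );
        return {valid: true, data: key};
    } catch (err) {
        console.log(err);
        return {valid: false, data: "Error creating Stripe key"};
    }
});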
I'm working on uploading images and everything works great, but I have 100 pictures and I would like to show all of them in my View. I need the complete list of the images in a folder, but I cannot find any API for this.
Firebase SDKs for JavaScript release 6.1, iOS release 6.4, and Android release 18.1 all have a method to list files.
The documentation is a bit sparse so far, so I recommend checking out Rosário's answer for details.
Previous answer, since this approach can still be useful at times:
There currently is no API call in the Firebase SDK to list all files in a Cloud Storage folder from within an app. If you need such functionality, you should store the metadata of the files (such as the download URLs) in a place where you can list them. The Firebase Realtime Database and Cloud Firestore are perfect for this and allow you to also easily share the URLs with others.
You can find a good (but somewhat involved) sample of this in our FriendlyPix sample app. The relevant code for the web version is here, but there are also versions for iOS and Android.
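For illustration, a minimal sketch of that idea with the web SDK (v8 namespaced API; the /images path and field names are just examples) — write the download URL to the Realtime Database on upload, then list from the database:
// Upload a file, then record its download URL under /images so it can be listed later
async function uploadAndIndex(file: File) {
    const storageRef = firebase.storage().ref(`images/${file.name}`);
    const snapshot = await storageRef.put(file);
    const url = await snapshot.ref.getDownloadURL();
    await firebase.database().ref("images").push({ name: file.name, url });
}

// List everything that was indexed
async function listImageUrls(): Promise<string[]> {
    const snapshot = await firebase.database().ref("images").once("value");
    const urls: string[] = [];
    snapshot.forEach((child) => {
        urls.push(child.val().url);
    });
    return urls;
}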
As of May 2019, version 6.1.0 of the Firebase SDK for Cloud Storage supports listing all objects in a bucket. You simply need to call listAll() on a Reference:
// Since you mentioned your images are in a folder,
// we'll create a Reference to that folder:
var storageRef = firebase.storage().ref("your_folder");
// Now we get the references of these images
storageRef.listAll().then(function(result) {
result.items.forEach(function(imageRef) {
// And finally display them
displayImage(imageRef);
});
}).catch(function(error) {
// Handle any errors
});
function displayImage(imageRef) {
imageRef.getDownloadURL().then(function(url) {
// TODO: Display the image on the UI
}).catch(function(error) {
// Handle any errors
});
}
Please note that in order to use this function, you must opt-in to version 2 of Security Rules, which can be done by making rules_version = '2'; the first line of your security rules:
rules_version = '2';
service firebase.storage {
  match /b/{bucket}/o {
    match /{allPaths=**} {
      allow read, write: if request.auth != null; // example rule; adjust to your own requirements
    }
  }
}
I'd recommend checking the docs for further reference.
Also, according to setup (Step 5), this script does not work for Node.js, since require("firebase/app"); won't return firebase.storage() as a function. This only works when using import * as firebase from 'firebase/app';.
Since Mar 2017: With the addition of Firebase Cloud Functions, and Firebase's deeper integration with Google Cloud, this is now possible.
With Cloud Functions you can use the Google Cloud Node package to do epic operations on Cloud Storage. Below is an example that gets all the file URLs into an array from Cloud Storage. This function will be triggered every time something's saved to google cloud storage.
Note 1: This is a rather computationally expensive operation, as it has to cycle through all files in a bucket / folder.
Note 2: I wrote this just as an example, without paying much detail into promises etc. Just to give an idea.
const functions = require('firebase-functions');
const gcs = require('@google-cloud/storage')();
// let's trigger this function with a file upload to google cloud storage
exports.fileUploaded = functions.storage.object().onChange(event => {
const object = event.data; // the object that was just uploaded
const bucket = gcs.bucket(object.bucket);
const signedUrlConfig = { action: 'read', expires: '03-17-2025' }; // this is a signed url configuration object
var fileURLs = []; // array to hold all file urls
// this is just for the sake of this example. Ideally you should get the path from the object that is uploaded :)
const folderPath = "a/path/you/want/its/folder/size/calculated";
bucket.getFiles({ prefix: folderPath }, function(err, files) {
// files = array of file objects
// not the contents of these files, we're not downloading the files.
files.forEach(function(file) {
file.getSignedUrl(signedUrlConfig, function(err, fileURL) {
console.log(fileURL);
fileURLs.push(fileURL);
});
});
});
});
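Since Note 2 above skips the promise handling, a version with the promises wired up might look roughly like this (same trigger and the same requires as above; the folder path is still illustrative):
exports.fileUploaded = functions.storage.object().onChange(event => {
    const object = event.data;
    const bucket = gcs.bucket(object.bucket);
    const signedUrlConfig = { action: 'read', expires: '03-17-2025' };
    const folderPath = "a/path/you/want/its/folder/size/calculated";

    // Return the promise chain so the function only finishes once all URLs are collected
    return bucket.getFiles({ prefix: folderPath })
        .then(([files]) => Promise.all(files.map(file => file.getSignedUrl(signedUrlConfig))))
        .then(results => {
            // each getSignedUrl result is an array whose first element is the URL
            const fileURLs = results.map(([url]) => url);
            console.log(fileURLs);
            return fileURLs;
        });
});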
I hope this will give you the general idea. For better cloud functions examples, check out Google's Github repo full of Cloud Functions samples for Firebase. Also check out their Google Cloud Node API Documentation
Since there's no language listed, I'll answer this in Swift. We highly recommend using Firebase Storage and the Firebase Realtime Database together to manage lists of downloads:
Shared:
// Firebase services
var database: FIRDatabase!
var storage: FIRStorage!
...
// Initialize Database, Auth, Storage
database = FIRDatabase.database()
storage = FIRStorage.storage()
...
// Initialize an array for your pictures
var picArray = [UIImage]()
Upload:
let fileData = NSData() // get data...
let storageRef = storage.reference().child("myFiles/myFile")
storageRef.putData(fileData).observeStatus(.Success) { (snapshot) in
// When the image has successfully uploaded, we get its download URL
let downloadURL = snapshot.metadata?.downloadURL()?.absoluteString
// Write the download URL to the Realtime Database
let dbRef = database.reference().child("myFiles/myFile")
dbRef.setValue(downloadURL)
}
Download:
let dbRef = database.reference().child("myFiles")
dbRef.observeEventType(.ChildAdded, withBlock: { (snapshot) in
// Get download URL from snapshot
let downloadURL = snapshot.value() as! String
// Create a storage reference from the URL
let storageRef = storage.referenceFromURL(downloadURL)
// Download the data, assuming a max size of 1MB (you can change this as necessary)
storageRef.dataWithMaxSize(1 * 1024 * 1024) { (data, error) -> Void in
// Create a UIImage, add it to the array
let pic = UIImage(data: data!)
picArray.append(pic!)
}
})
For more information, see Zero to App: Develop with Firebase, and its associated source code, for a practical example of how to do this.
I also encountered this problem when I was working on my project. I really wish they provided an API method for this. Anyway, this is how I did it:
When you are uploading an image to Firebase storage, create an Object and pass this object to Firebase database at the same time. This object contains the download URI of the image.
trailsRef.putFile(file).addOnSuccessListener(new OnSuccessListener<UploadTask.TaskSnapshot>() {
@Override
public void onSuccess(UploadTask.TaskSnapshot taskSnapshot) {
Uri downloadUri = taskSnapshot.getDownloadUrl();
DatabaseReference myRef = database.getReference().child("trails").child(trail.getUnique_id()).push();
Image img = new Image(trail.getUnique_id(), downloadUri.toString());
myRef.setValue(img);
}
});
Later, when you want to download images from a folder, you simply iterate through the files under that folder. This folder has the same name as the "folder" in Firebase Storage, but you can name them however you want. I put the downloads in a separate thread.
@Override
protected List<Image> doInBackground(Trail... params) {
String trialId = params[0].getUnique_id();
mDatabase = FirebaseDatabase.getInstance().getReference();
mDatabase.child("trails").child(trialId).addValueEventListener(new ValueEventListener() {
@Override
public void onDataChange(DataSnapshot dataSnapshot) {
images = new ArrayList<>();
Iterator<DataSnapshot> iter = dataSnapshot.getChildren().iterator();
while (iter.hasNext()) {
Image img = iter.next().getValue(Image.class);
images.add(img);
}
isFinished = true;
}
@Override
public void onCancelled(DatabaseError databaseError) {
}
});
Now I have a list of objects containing the URIs to each image, and I can do whatever I want with them. To load them into an ImageView, I created another thread.
@Override
protected List<Bitmap> doInBackground(List<Image>... params) {
List<Bitmap> bitmaps = new ArrayList<>();
for (int i = 0; i < params[0].size(); i++) {
try {
URL url = new URL(params[0].get(i).getImgUrl());
Bitmap bmp = BitmapFactory.decodeStream(url.openConnection().getInputStream());
bitmaps.add(bmp);
} catch (MalformedURLException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
return bitmaps;
}
This returns a list of Bitmaps; when it finishes I simply attach them to the ImageView in the main activity. The methods below are @Override because I have interfaces created and listen for completion in other threads.
@Override
public void processFinishForBitmap(List<Bitmap> bitmaps) {
List<ImageView> imageViews = new ArrayList<>();
View v;
for (int i = 0; i < bitmaps.size(); i++) {
v = mInflater.inflate(R.layout.gallery_item, mGallery, false);
imageViews.add((ImageView) v.findViewById(R.id.id_index_gallery_item_image));
imageViews.get(i).setImageBitmap(bitmaps.get(i));
mGallery.addView(v);
}
}
Note that I have to wait for the List<Image> to be returned first and then start the thread that works on the List<Bitmap>. In this case, Image contains the URI.
@Override
public void processFinish(List<Image> results) {
Log.e(TAG, "get back " + results.size());
LoadImageFromUrlTask loadImageFromUrlTask = new LoadImageFromUrlTask();
loadImageFromUrlTask.delegate = this;
loadImageFromUrlTask.execute(results);
}
Hopefully someone finds it helpful. It will also serve as a guideline for myself in the future.
Combining some answers from this post and also from here, and after some personal research, for Node.js with TypeScript I managed to accomplish this using firebase-admin:
import * as admin from 'firebase-admin';
const getFileNames = () => {
return admin.storage().bucket().getFiles({ autoPaginate: false }).then(([files]: any) => {
const fileNames = files.map((file: any) => file.name);
return fileNames;
})
}
In my case I also needed to get all the files inside a specific folder from Firebase Storage. According to Google Cloud Storage, folders don't exist but are rather a naming convention. Anyway, I managed to do this (without saving each file's full path into the DB) by adding { prefix: `${folderName}`, autoPaginate: false } inside the getFiles function call, so:
...
const getFileNames = (folderName: string) => {
admin.storage().bucket().getFiles({ prefix: `${folderName}`, autoPaginate: false })
.then(([files]: any) => {
...
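Putting that together, a complete sketch of the folder variant (assuming firebase-admin is initialized with a default bucket; the folder name is illustrative):
import * as admin from 'firebase-admin';

// List the file names under a given folder/prefix in the default bucket
const getFileNames = async (folderName: string): Promise<string[]> => {
    const [files] = await admin.storage().bucket()
        .getFiles({ prefix: `${folderName}`, autoPaginate: false });
    return files.map(file => file.name);
};

// usage: getFileNames('images').then(names => console.log(names));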
You can list files in a directory of Firebase Storage with the listAll() method.
To use this method, you have to use at least this version of the Firebase Storage dependency:
'com.google.firebase:firebase-storage:18.1.1'
https://firebase.google.com/docs/storage/android/list-files
Keep in mind that you need to upgrade the Security Rules to version 2.
A workaround can be to create a file (e.g. list.txt) with nothing inside. On this file you can set custom metadata (that is a Map<String, String>) with the list of all the files' URLs. So if you need to download all the files in a folder, you first download the metadata of the list.txt file, then you iterate through the custom data and download all the files using the URLs in the Map.
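A rough sketch of that workaround with the web SDK, using getMetadata/updateMetadata on the placeholder file (the path and key names below are made up):
// The empty placeholder object whose custom metadata carries the URL map
const listRef = firebase.storage().ref("myFolder/list.txt");

// Writer side: add/refresh an entry whenever a file is uploaded
async function registerFileUrl(fileName: string, url: string) {
    const metadata = await listRef.getMetadata();
    const custom = metadata.customMetadata || {};
    custom[fileName] = url;
    await listRef.updateMetadata({ customMetadata: custom });
}

// Reader side: fetch the metadata once, then download each URL listed in it
async function listRegisteredUrls(): Promise<string[]> {
    const metadata = await listRef.getMetadata();
    return Object.values(metadata.customMetadata || {});
}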
One more way: use a Cloud Function to track every uploaded image and store its URL in the database.
exports.fileUploaded = functions.storage.object().onChange(event => {
const object = event.data; // the object that was just uploaded
const contentType = event.data.contentType; // This is the image MIME type
// Exit if this is triggered on a file that is not an image.
if (!contentType.startsWith('image/')) {
console.log('This is not an image.');
return null;
}
// Get the Signed URLs for the thumbnail and original image.
const config = {
action: 'read',
expires: '03-01-2500'
};
const bucket = gcs.bucket(event.data.bucket);
const filePath = event.data.name;
const file = bucket.file(filePath);
file.getSignedUrl(config, function(err, fileURL) {
console.log(fileURL);
admin.database().ref('images').push({
src: fileURL
});
});
});
Full code here:
https://gist.github.com/bossly/fb03686f2cb1699c2717a0359880cf84
For Node.js, I used this code:
const Storage = require('@google-cloud/storage');
const storage = new Storage({projectId: 'PROJECT_ID', keyFilename: 'D:\\keyFileName.json'});
const bucket = storage.bucket('project.appspot.com'); //gs://project.appspot.com
bucket.getFiles().then(results => {
const files = results[0];
console.log('Total files:', files.length);
files.forEach(file => {
file.download({destination: `D:\\${file.name}`}).catch(error => console.log('Error: ', error))
});
}).catch(err => {
console.error('ERROR:', err);
});
Actually this is possible, but only with a Google Cloud API instead of one from Firebase. That's because Firebase Storage is a Google Cloud Storage bucket, which can be reached easily with the Google Cloud APIs; however, you need to use OAuth for authentication instead of the Firebase one.
# In Python
import firebase_admin
from firebase_admin import credentials
from firebase_admin import storage
import datetime
import urllib.request

def image_download(url, name_img):
    urllib.request.urlretrieve(url, name_img)

cred = credentials.Certificate("credentials.json")
# Initialize the app with a service account, granting admin privileges
app = firebase_admin.initialize_app(cred, {
    'storageBucket': 'YOURSTORAGEBUCKETNAME.appspot.com',
})

url_img = "gs://YOURSTORAGEBUCKETNAME.appspot.com/"
bucket_1 = storage.bucket(app=app)

image_urls = []
for blob in bucket_1.list_blobs():
    name = str(blob.name)
    # print(name)
    blob_img = bucket_1.blob(name)
    X_url = blob_img.generate_signed_url(datetime.timedelta(seconds=300), method='GET')
    # print(X_url)
    image_urls.append(X_url)

PATH = ['Where you want to save the image']
for path in PATH:
    i = 1
    for url in image_urls:
        name_img = str(path + "image" + str(i) + ".jpg")
        image_download(url, name_img)
        i += 1
Extending Rosário Pereira Fernandes' answer, for a JavaScript solution:
Install the Firebase tools on your machine
npm install -g firebase-tools
On firebase init set JavaScript as default language
In the root folder of the created project, execute the npm installs:
npm install --save firebase
npm install @google-cloud/storage
npm install @google-cloud/firestore
... <any other dependency needed>
Add non-default dependencies on your project like
"firebase": "^6.3.3",
"#google-cloud/storage": "^3.0.3"
functions/package.json
{
"name": "functions",
"description": "Cloud Functions for Firebase",
"scripts": {
"lint": "eslint .",
"serve": "firebase serve --only functions",
"shell": "firebase functions:shell",
"start": "npm run shell",
"deploy": "firebase deploy --only functions",
"logs": "firebase functions:log"
},
"engines": {
"node": "10"
},
"dependencies": {
"#google-cloud/storage": "^3.0.3",
"firebase": "^6.3.3",
"firebase-admin": "^8.0.0",
"firebase-functions": "^3.1.0"
},
"devDependencies": {
"eslint": "^5.12.0",
"eslint-plugin-promise": "^4.0.1",
"firebase-functions-test": "^0.1.6"
},
"private": true
}
Create a sort of listAll function
index.js
var serviceAccount = require("./key.json");
const functions = require('firebase-functions');
const images = require('./images.js');
var admin = require("firebase-admin");
admin.initializeApp({
credential: admin.credential.cert(serviceAccount),
databaseURL: "https://<my_project>.firebaseio.com"
});
const bucket = admin.storage().bucket('<my_bucket>.appspot.com')
exports.getImages = functions.https.onRequest((request, response) => {
images.getImages(bucket)
.then(urls => response.status(200).send({ data: { urls } }))
.catch(err => console.error(err));
})
images.js
module.exports = {
getImages
}
const query = {
directory: 'images'
};
function getImages(bucket) {
return bucket.getFiles(query)
.then(response => getUrls(response))
.catch(err => console.error(err));
}
function getUrls(response) {
const promises = []
response.forEach( files => {
files.forEach (file => {
promises.push(getSignedUrl(file));
});
});
return Promise.all(promises).then(result => getParsedUrls(result));
}
function getSignedUrl(file) {
return file.getSignedUrl({
action: 'read',
expires: '09-01-2019'
})
}
function getParsedUrls(result) {
return JSON.stringify(result.map(mediaLink => createMedia(mediaLink)));
}
function createMedia(mediaLink) {
const reference = {};
reference.mediaLink = mediaLink[0];
return reference;
}
Execute firebase deploy to upload your cloud function
Call your custom function from your app
build.gradle
dependencies {
...
implementation 'com.google.firebase:firebase-functions:18.1.0'
...
}
kotlin class
private val functions = FirebaseFunctions.getInstance()
val cloudFunction = functions.getHttpsCallable("getImages")
cloudFunction.call().addOnSuccessListener {...}
Regarding the further development of this feature, I ran into some problems that can be found here.
I am using AngularFire and use the following to get all of the download URLs:
getPhotos(id: string): Observable<string[]> {
const ref = this.storage.ref(`photos/${id}`)
return ref.listAll().pipe(switchMap(list => {
const calls: Promise<string>[] = [];
list.items.forEach(item => calls.push(item.getDownloadURL()))
return Promise.all(calls)
}));
}
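Usage is then just a normal subscription from a component, for example (the id and field name are illustrative):
this.getPhotos("album-123").subscribe(urls => {
    this.photoUrls = urls; // bind in the template, e.g. with *ngFor
});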
I faced the same issue; mine is even more complicated.
Admin will upload audio and pdf files into storage:
audios/season1, season2.../class1, class 2/.mp3 files
books/.pdf files
Android app needs to get the list of sub folders and files.
The solution is to catch the upload event on Storage and create the same structure on Firestore using a Cloud Function.
Step 1: Manually create a 'storage' collection and 'audios'/'books' docs on Firestore
Step 2: Setup cloud function
Might take around 15 mins: https://www.youtube.com/watch?v=DYfP-UIKxH0&list=PLl-K7zZEsYLkPZHe41m4jfAxUi0JjLgSM&index=1
Step 3: Catch upload event using cloud function
import * as functions from 'firebase-functions';
import * as admin from 'firebase-admin';
admin.initializeApp(functions.config().firebase);
const path = require('path');
export const onFileUpload = functions.storage.object().onFinalize(async (object) => {
let filePath = object.name; // File path in the bucket.
const contentType = object.contentType; // File content type.
const metageneration = object.metageneration; // Number of times metadata has been generated. New objects have a value of 1.
if (metageneration !== "1") return;
// Get the file name.
const fileName = path.basename(filePath);
filePath = filePath.substring(0, filePath.length - 1);
console.log('contentType ' + contentType);
console.log('fileName ' + fileName);
console.log('filePath ' + filePath);
console.log('path.dirname(filePath) ' + path.dirname(filePath));
filePath = path.dirname(filePath);
const pathArray = filePath.split("/");
let ref = '';
for (const item of pathArray) {
if (ref.length === 0) {
ref = item;
}
else {
ref = ref.concat('/sub/').concat(item);
}
}
ref = 'storage/'.concat(ref).concat('/sub')
admin.firestore().collection(ref).doc(fileName).create({})
.then(result => {console.log('onFileUpload:updated')})
.catch(error => {
console.log(error);
});
});
Step 4: Retrieve list of folders/files on Android app using firestore
private static final String STORAGE_DOC = "storage/";
public static void getMediaCollection(String path, OnCompleteListener onCompleteListener) {
String[] pathArray = path.split("/");
String doc = null;
for (String item : pathArray) {
if (TextUtils.isEmpty(doc)) doc = STORAGE_DOC.concat(item);
else doc = doc.concat("/sub/").concat(item);
}
doc = doc.concat("/sub");
getFirestore().collection(doc).get().addOnCompleteListener(onCompleteListener);
}
Step 5: Get download url
public static void downloadMediaFile(String path, OnCompleteListener<Uri> onCompleteListener) {
getStorage().getReference().child(path).getDownloadUrl().addOnCompleteListener(onCompleteListener);
}
Note
We have to put a "sub" collection on each item since Firestore doesn't support retrieving the list of collections.
It took me 3 days to find the solution; hopefully it will take you 3 hours at most.
To do this with JS
You can append them directly to your div container, or you can push them to an array. The below shows you how to append them to your div.
1) When you store your images in Storage, create a reference to the image in your Firebase database with the following structure:
/images/(imageName){
description: "" ,
imageSrc : (imageSource)
}
2) When you load your document, pull all your image source URLs from the database rather than from Storage, with the following code:
$(document).ready(function(){
var query = firebase.database().ref('images/').orderByKey();
query.once("value").then(function(snapshot){
snapshot.forEach(function(childSnapshot){
var imageName = childSnapshot.key;
var childData = childSnapshot.val();
var imageSource = childData.imageSrc;
$('#imageGallery').append("<div><img src='"+imageSource+"'/></div>");
})
})
});
You can use the following code. Here I am uploading the image to Firebase Storage and then storing the image download URL in the Firebase database.
//getting the storage reference
StorageReference sRef = storageReference.child(Constants.STORAGE_PATH_UPLOADS + System.currentTimeMillis() + "." + getFileExtension(filePath));
//adding the file to reference
sRef.putFile(filePath)
.addOnSuccessListener(new OnSuccessListener<UploadTask.TaskSnapshot>() {
@Override
public void onSuccess(UploadTask.TaskSnapshot taskSnapshot) {
//dismissing the progress dialog
progressDialog.dismiss();
//displaying success toast
Toast.makeText(getApplicationContext(), "File Uploaded ", Toast.LENGTH_LONG).show();
//creating the upload object to store uploaded image details
Upload upload = new Upload(editTextName.getText().toString().trim(), taskSnapshot.getDownloadUrl().toString());
//adding an upload to firebase database
String uploadId = mDatabase.push().getKey();
mDatabase.child(uploadId).setValue(upload);
}
})
.addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(@NonNull Exception exception) {
progressDialog.dismiss();
Toast.makeText(getApplicationContext(), exception.getMessage(), Toast.LENGTH_LONG).show();
}
})
.addOnProgressListener(new OnProgressListener<UploadTask.TaskSnapshot>() {
@Override
public void onProgress(UploadTask.TaskSnapshot taskSnapshot) {
//displaying the upload progress
double progress = (100.0 * taskSnapshot.getBytesTransferred()) / taskSnapshot.getTotalByteCount();
progressDialog.setMessage("Uploaded " + ((int) progress) + "%...");
}
});
Now to fetch all the images stored in firebase database you can use
//adding an event listener to fetch values
mDatabase.addValueEventListener(new ValueEventListener() {
@Override
public void onDataChange(DataSnapshot snapshot) {
//dismissing the progress dialog
progressDialog.dismiss();
//iterating through all the values in database
for (DataSnapshot postSnapshot : snapshot.getChildren()) {
Upload upload = postSnapshot.getValue(Upload.class);
uploads.add(upload);
}
//creating adapter
adapter = new MyAdapter(getApplicationContext(), uploads);
//adding adapter to recyclerview
recyclerView.setAdapter(adapter);
}
@Override
public void onCancelled(DatabaseError databaseError) {
progressDialog.dismiss();
}
});
For more details you can see my post Firebase Storage Example.
In Swift
public func downloadData() async {
let imagesRef = storage.child("pictures/")
do {
let storageReference = try await storage.root().child("pictures").listAll()
print("storageReference: \(storageReference.items)")
} catch {
print(error)
}
}
Output
[
gs://<your_app_name>.appspot.com/pictures/IMG_1243.JPG,
gs://<your_app_name>.appspot.com/pictures/IMG_1244.JPG,
gs://<your_app_name>.appspot.com/pictures/IMG_1245.JPG,
gs://<your_app_name>.appspot.com/pictures/IMG_1246.JPG
]
Here is the reference
So I had a project that required downloading assets from Firebase Storage, and I had to solve this problem myself. Here is how:
1- First, make a data model, for example a class Choice {}. In that class, define a String variable called imageName, so it will look like this:
class Choice {
.....
String imageName;
}
2- From your database / Firebase database, hardcode the image names onto the objects, so if you have an image named Apple.png, create the object as:
Choice myChoice = new Choice(...,....,"Apple.png");
3- Now, get the link for the assets in your Firebase Storage, which will be something like this:
gs://your-project-name.appspot.com/
like this one
4- Finally, initialize your Firebase Storage reference and start getting the files in a loop, like this:
storageRef = storage.getReferenceFromUrl(firebaseRefURL).child(imagePath);
File localFile = File.createTempFile("images", "png");
storageRef.getFile(localFile).addOnSuccessListener(new OnSuccessListener<FileDownloadTask.TaskSnapshot>() {
    @Override
    public void onSuccess(FileDownloadTask.TaskSnapshot taskSnapshot) {
        // Dismiss progress dialog
    }
});
5- that's it
For Android the best practice is to use FirebaseUI and Glide.
You need to add this to your app-level Gradle file in order to get the library. Note that it already includes Glide!
implementation 'com.firebaseui:firebase-ui-storage:4.1.0'
And then in your code use
// Reference to an image file in Cloud Storage
StorageReference storageReference = FirebaseStorage.getInstance().getReference();
// ImageView in your Activity
ImageView imageView = findViewById(R.id.imageView);
// Download directly from StorageReference using Glide
// (See MyAppGlideModule for Loader registration)
GlideApp.with(this /* context */)
.load(storageReference)
.into(imageView);