Downloading from /tmp folder in Heroku - ruby-on-rails

I have a rake task that will generate a particular CSV file. I would like to be able to download that CSV file that is going to be placed in /tmp.
My application is hosted in Heroku. How can I download that CSV file?

If you just want to do a one-off download, you could try using heroku exec. The exec command lets you open an SSH connection to a dyno - https://devcenter.heroku.com/changelog-items/1112
First, figure out the file path. You can run bash, then use the usual commands like ls:
heroku ps:exec -a <myapp> bash
Next, use cat to read the file, and pipe the output to a local file:
heroku ps:exec -a <myapp> cat tmp/myfile.csv > mylocal.csv

The /tmp directory on Heroku is exactly that – temporary. Even if you store the file in /tmp, it won't be persisted long enough for any users to access it: the dyno filesystem is ephemeral and is wiped on every restart. Instead, you should look into an integrated storage solution like Amazon S3.
With that in place, your users should be able to access those CSV files directly from your storage host without needing to tie up any Heroku dynos/resources.

Why does it need to be placed in the tmp folder? If you're generating something worth downloading, it's an important file, not a temporary one.
An easy solution is to set up your rake task so that the file is saved into the public directory (or a subdirectory of the public directory).
Then you can open/download your export.csv using the
http://your-domain/[subdirectory-in-public-directory]/export.csv url

Files in the tmp directory are emptied every day. The tmp directory lives at:
/app/tmp
where /app is the application's root directory.
To download files from it, you can read the file, convert it to base64, and send it back to the client as a data URL:
Server:
let filePath = path.join(__dirname, '..', '..', 'tmp', fileName);
fs.readFile(filePath, { encoding: 'base64' }, function (err, data) {
  if (!err) {
    // mimeType, displayName and res come from the surrounding route handler
    let returnData = `data:${mimeType};base64,` + data;
    res.json({ fileName: fileName, displayName: displayName, base64: returnData });
  } else {
    console.log(err);
  }
});
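One caveat with this approach: base64 encoding inflates the payload by roughly a third, which adds up for large files. A quick check in Node:

```javascript
// base64 emits 4 output characters for every 3 input bytes,
// so the encoded payload is ~33% larger than the raw file.
const raw = Buffer.alloc(3 * 1024 * 1024); // stand-in for 3 MiB of file data
const encoded = raw.toString('base64');
console.log(encoded.length / raw.length); // 1.3333333333333333
```

For anything large, streaming the file to the client directly (for example with Express's res.download) avoids both the inflation and holding the whole file in memory.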
Client side:
function b64toBlob(dataURI) {
  // Derive the MIME type from the data URL rather than hard-coding one
  var mimeType = dataURI.split(',')[0].split(':')[1].split(';')[0];
  var byteString = atob(dataURI.split(',')[1]);
  var ab = new ArrayBuffer(byteString.length);
  var ia = new Uint8Array(ab);
  for (var i = 0; i < byteString.length; i++) {
    ia[i] = byteString.charCodeAt(i);
  }
  return new Blob([ab], { type: mimeType });
}

var blob = b64toBlob(res.data.base64);
var blobUrl = URL.createObjectURL(blob);
var link = document.createElement("a"); // Or maybe get it from the current document
link.href = blobUrl;
link.download = res.data.displayName;
document.body.appendChild(link);
link.click();
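The encode/decode round trip above can be sanity-checked outside the browser; in Node, Buffer plays the role of atob (the CSV content and MIME type here are made up for the example):

```javascript
const mimeType = 'text/csv'; // assumed type for a CSV export
const original = Buffer.from('id,name\n1,Alice\n', 'utf8');

// What the server sends:
const dataUrl = `data:${mimeType};base64,` + original.toString('base64');

// What the client recovers, and the type it can read back from the header:
const decoded = Buffer.from(dataUrl.split(',')[1], 'base64');
const parsedType = dataUrl.split(',')[0].split(':')[1].split(';')[0];
console.log(decoded.equals(original)); // true
console.log(parsedType); // text/csv
```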

Related

How to upload file in input file field using Selenium Webdriver when script execution is running on zalenium docker container browser?

I am using Selenium WebDriver for automation. Uploading files in WebDriver is done by simply using the sendKeys() method on the file input field.
Code snippet:
WebElement uploadElement = driver.findElement(By.id("uploadfile"));
// enter the absolute file path into the file input field
uploadElement.sendKeys("C:\\test.txt");
The above snippet works as expected when the script runs on a local machine, but it does not work when the script runs in a Zalenium docker container browser.
File upload is relatively simple, but slightly different when Docker is involved. You need to make sure you set a file detector (using the LocalFileDetector class) for the file you want to upload.
Refer below code snippet:
WebElement uploadElement = driver.findElement(By.id("uploadfile"));
LocalFileDetector detector = new LocalFileDetector();
File localFile = detector.getLocalFile("C:\\test.txt");
// setFileDetector is defined on RemoteWebElement, hence the cast
((RemoteWebElement) uploadElement).setFileDetector(detector);
// enter the absolute file path into the file input field
uploadElement.sendKeys(localFile.getAbsolutePath());
The above snippet will upload the file whether the script runs locally, on a remote grid, or in a Zalenium docker container.
This worked for me. There is no need to mount a volume to get this done
File file = new File(filePath); // my local file path where the file will be created
File tempDir = new File(System.getProperty("java.io.tmpdir", null), "uploadFile");
if (!tempDir.exists()) {
    tempDir.mkdir();
}
File fileToCreate = new File(tempDir, file.getName());
// `value` holds the base64-encoded content of the file to upload
byte[] bytes = Base64.getDecoder().decode(value.getBytes());
FileUtils.writeByteArrayToFile(fileToCreate, bytes);
Thread.sleep(3000);
RemoteWebDriver remoteDriver = new RemoteWebDriver(
        new URL("http://localhost:4444/wd/hub"), capabilities);
remoteDriver.setFileDetector(new LocalFileDetector());
remoteDriver.findElement(locator).sendKeys(fileToCreate.toString());

Docker: Can't read class path resource from spring boot application

Reading a classpath resource as,
try {
    final ClassPathResource classPathResource = new ClassPathResource(format("location%sGeoLite2-City.mmdb", File.separator));
    final File database = classPathResource.getFile();
    dbReader = new DatabaseReader.Builder(database).build();
} catch (Exception e) {
    System.out.println("Exception: " + e);
}
I've packaged this with docker using following Dockerfile,
FROM java:8
ADD build/libs/*.jar App.jar
CMD java -jar App.jar
But when running the application with docker run -p 8080:8080 app-image, I can hit the application endpoint, yet from the application logs I can see it fails to read the file:
Exception: java.io.FileNotFoundException: class path resource [location/GeoLite2-City.mmdb] cannot be resolved to absolute file path because it does not reside in the file system: jar:file:/App.jar!/BOOT-INF/classes!/location/GeoLite2-City.mmdb
Would appreciate any comments. Things to know before you comment:
- Running on Windows 10, IntelliJ 2018.2, JDK 8
- I can run the application successfully from IntelliJ as well as from the command line
- The file exists in the jar (I extracted the jar and checked)
Since you are using Spring Boot, you can try the following annotation for loading your classpath resource. It worked for me because I had the same exception. Be aware that the directory "location" must be under the src/main/resources folder:
@Value("classpath:/location/GeoLite2-City.mmdb")
private Resource geoLiteCity;
Without Spring Boot you could try:
try (InputStream inputStream = getClass().getClassLoader().getResourceAsStream("location/GeoLite2-City.mmdb")) {
    ... // convert to a file and do other stuff
}
(Note that when loading through a ClassLoader the resource name should not start with a slash.)
The earlier answers were also right that hard-coding "/" is fragile; for file-system paths, File.separator works irrespective of the OS.
Change
Change
(location\\GeoLite2-City.mmdb)
to
("location"+ File.separator +"GeoLite2-City.mmdb")
Refer to these for more:
https://www.journaldev.com/851/java-file-separator-separatorchar-pathseparator-pathseparatorchar
Difference between File.separator and slash in paths
I had the same issue: the file worked when running the Spring Boot app directly but not in Docker. My issue was resolved by using ClassPathResource for the resource and reading it as a stream using InputStreamReader.
Resource resource = new ClassPathResource("test-xyz.json");
InputStream inputStream = null;
try {
    inputStream = resource.getInputStream();
    Reader reader = new InputStreamReader(inputStream, "UTF-8");
    ....

Using electron-boilerplate to create an .exe for windows. It needs to run a .bat file. Once it's packaged, it doesn't run

Using the electron-boilerplate to create an .exe for Windows, it needs to run a .bat file. It works with npm start, but once it gets packaged with npm run release, it doesn't run the .bat.
This is my code for the function
const spawn = require('child_process').spawn;
const bat = spawn('cmd.exe', ['/c', 'Install.bat']);
bat.stdout.on('data', (data) => {
  var str = String.fromCharCode.apply(null, data);
  addLog(data);
  console.info(str);
});
bat.stderr.on('data', (data) => {
  var str = String.fromCharCode.apply(null, data);
  addLog(data, "error");
  console.error(str);
});
bat.on('exit', (code) => {
  console.log(`Exit ${code}`);
});
Already checked for child-process
When you run electron via npm start it will typically set the current working directory to the folder for the app (containing your package.json), so the relative name Install.bat is looked up in that folder.
After you build the app and run it, the current working directory might be somewhere else, for example C:\\ (on Windows). You can find the current working directory with process.cwd().
To find the app folder regardless of how the app is running, Electron provides electron.app.getAppPath().
So you can build an absolute path to the batch file like this:
const path = require('path');
const batPath = path.join(electron.app.getAppPath(), 'Install.bat');
const bat = spawn('cmd.exe', ['/c', batPath]);

Deploy Angular 2 app to Heroku

In the past I always bundled my Angular 1 and Rails apps together and typically used heroku, which has worked great for me. Now that I'm over to Angular 2 I want to separate out my Angular and Rails code. I've created a very basic Angular 2 app via the Angular-Cli, but I haven't been able to figure out how to deploy it to Heroku. I'm not using expressjs or anything like that. Anyone figure it out yet?
Ok I came up with a solution. I had to add a very basic PHP backend, but it's pretty harmless. Below is my process.
First setup a heroku app and Angular 2 app.
Create your heroku app
Set the heroku buildpack to heroku/php
heroku buildpacks:set heroku/php --app heroku-app-name
Create a project via Angular-Cli
Add an index.php file to /src with the below snippet
<?php include_once("index.html"); ?>
Add a Procfile to /src with the below snippet
web: vendor/bin/heroku-php-apache2
Add /deploy to the .gitignore
Now I used an npm package to push tarballs to heroku
Here's a simple package to upload the tarball, https://www.npmjs.com/package/heroku-deploy-tarball
npm i heroku-deploy-tarball --save
I'm also using tar.gz to create the tarball
npm i tar.gz --save
Then I created the deploy.js file at the root of my project with the following code. It first runs the buildCommand specified, then moves index.php and the Procfile into the dist folder, then tarballs the entire dist folder, which gets uploaded to heroku.
var deploy = require('heroku-deploy-tarball');
var targz = require('tar.gz');
var exec = require('child_process').exec;

var requestedTarget = process.argv[2];
if (!requestedTarget) {
  console.log('You must specify a deploy target');
  return;
}

var targets = {
  production: {
    app: 'heroku-app-name',
    tarball: 'deploy/build.tar.gz',
    buildCommand: 'ng build --prod'
  }
};

var moveCompressFiles = function (callback) {
  // Chain the steps so compression only starts once both copies are done
  exec('cp ./src/index.php ./dist/index.php', function (err) {
    if (err) return console.log(err);
    console.log('index.php was copied.');
    exec('cp ./src/Procfile ./dist/Procfile', function (err) {
      if (err) return console.log(err);
      console.log('Procfile was copied.');
      new targz().compress('./dist', './deploy/build.tar.gz', function (err) {
        if (err) return console.log(err);
        console.log('The compression has ended!');
        callback();
      });
    });
  });
};

console.log('Starting ' + targets[requestedTarget].buildCommand);
exec(targets[requestedTarget].buildCommand, { maxBuffer: 1024 * 500 }, function (error) {
  if (!error) {
    console.log(targets[requestedTarget].buildCommand + ' successful!');
    moveCompressFiles(function () {
      deploy(targets[requestedTarget]);
    });
  } else {
    console.log(targets[requestedTarget].buildCommand + ' failed.', error);
  }
});
Now just run node deploy production and it should deploy to heroku.
Edit
Just got word from heroku that they are working on an experimental buildpack that would allow for static sites like this. Here is the link to the build pack.

How to find path to the package directory when the script is running with `pub run` command

I am writing a package that loads additional data from the lib directory and would like to provide an easy way to load this data with something like this:
const dataPath = 'mypackage/data/data.json';
initializeMyLibrary(dataPath).then((_) {
// library is ready
});
I've made two separate libraries browser.dart and standalone.dart, similar to how it is done in the Intl package.
It is quite easy to load this data from the "browser" environment, but when it comes to the "standalone" environment, it is not so easy, because of the pub run command.
When the script is run with a simple $ dart myscript.dart, I can find the package path using the dart:io Platform.script and Platform.packageRoot properties.
But when the script is running with $ pub run tool/mytool, the correct way to load data should be:
detect that the script is running from the pub run command
find the pub server host
load data from this server, because there could be pub transformers and we can't load data directly from the file system.
And even if I want to load data directly from the file system, when the script is running with pub run, Platform.script returns /mytool path.
So, the question is: is there any way to detect that the script is running under pub run, and how do I find the host of the pub server?
I am not sure that this is the right way, but when I run a script with pub run, Platform.script actually returns http://localhost:<port>/myscript.dart. So, when the scheme is http, I can download using an HTTP client, and when it is file, I can load from the file system.
Something like this:
import 'dart:async';
import 'dart:io';

import 'package:path/path.dart' as ospath;

Future<List<int>> loadAsBytes(String path) {
  final script = Platform.script;
  final scheme = Platform.script.scheme;
  if (scheme.startsWith('http')) {
    return new HttpClient().getUrl(
        new Uri(
            scheme: script.scheme,
            host: script.host,
            port: script.port,
            path: 'packages/' + path)).then((req) {
      return req.close();
    }).then((response) {
      return response.fold(
          new BytesBuilder(),
          (b, d) => b..add(d)).then((builder) {
        return builder.takeBytes();
      });
    });
  } else if (scheme == 'file') {
    return new File(
        ospath.join(ospath.dirname(script.path), 'packages', path)).readAsBytes();
  }
  throw new Exception('...');
}
