Docker: user-uploaded images not displaying

I have a local web app running with Docker (PHP, Apache and MySQL), and everything works fine in the local environment. I deployed it to DigitalOcean; MySQL is running and the HTML is fine as well, EXCEPT that user-uploaded image files are not shown.
I have an admin system where the user can upload product images; the files are saved in the "asset" folder. There are no errors in the Chrome dev tools and no errors in the DigitalOcean logs either. The files are being uploaded, but I can't access them to show them to my users!
How can I do that?
Best
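For context, the usual way to make uploads like this survive and stay visible under Docker is to bind-mount the upload directory into the web container. A minimal docker-compose sketch, assuming a compose-based setup (the service name and paths are illustrative, not taken from the question):

services:
  web:
    build: .
    ports:
      - "80:80"
    volumes:
      # Bind-mount the host's asset folder over the path the app writes to, so uploads
      # are kept on the host and keep being served after the container is rebuilt
      # (paths are illustrative and depend on the Apache document root).
      - ./asset:/var/www/html/asset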

Related

How do I set up an Oracle Dynamo Admin Server?

I am confused by the Oracle documentation on how to set up the (ATG) Web Commerce product available on the edelivery website.
I would like to get to the point where I have properly set up the admin console.
Running the bin files on a server does not seem to work, for various reasons: either the installation finishes but nothing is working, or the installation endlessly asks for arbitrary input.
Also, I want to know whether it is possible to set up the server in Docker and/or on an Amazon Linux EC2 instance.
There are quite a number of steps involved in getting the ATG Admin Server up and running. These start with installing a JDK and an application server, and provisioning a database. Once you have gone through the installer (which you downloaded from the edelivery site), you need to go through a basic setup process using the CIM tool. The installation process (for ATG 11.3.1) is documented here, while the steps to set up a basic application are documented here.
Working through the steps in the CIM tool, you will end up with a deployable .ear file that you can copy to your application server. Once your application server has started, you will be able to access the Dynamo Admin server.
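For illustration, once the installer has finished, CIM is started from the installation's home/bin directory; a rough sketch, where the install path is an assumption:

cd /opt/ATG/ATG11.3/home/bin   # illustrative install directory; use your own
./cim.sh                       # launches the Configuration and Installation Manager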
As of version 11.3.1, ATG is officially supported on Docker. Since you compile your own .ear file and deploy it to an application server (such as WebLogic), Docker support won't necessarily provide you with an ATG image; it simply lets you run your compiled artefact in a Docker container. Most likely you will want to get hold of a WebLogic Docker image and deploy your ATG artefact there.
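As a rough illustration of that last point, a Dockerfile along these lines would layer the compiled artefact onto a WebLogic image; the base image tag, domain name, autodeploy path and .ear name are all assumptions that depend on your WebLogic setup:

# Sketch only: image tag, domain name, autodeploy path and .ear name are assumptions.
FROM container-registry.oracle.com/middleware/weblogic:12.2.1.4
# Copy the .ear produced by CIM into the domain's autodeploy directory,
# which is picked up automatically when the domain runs in development mode.
COPY ATGProduction.ear /u01/oracle/user_projects/domains/base_domain/autodeploy/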

AWS S3 Ruby aws-sdk file transfer using Pivotal Cloud Foundry

Help: How do you transfer a PDF file from S3 to a local desktop directory when the Rails app is deployed in Pivotal Cloud Foundry?
I'm using Ruby and the aws-sdk to download a PDF file from S3 to a directory on the local client machine.
s3 = Aws::S3::Client.new   # aws-sdk-s3 client
s3.get_object(bucket: s3_bucket, key: file_name, response_target: "#{Rails.root}/Downloads/#{file_name}")
The above Ruby code works: the file appears inside the Downloads directory on the Mac.
Problem: deploying the Rails app to Pivotal Cloud Foundry breaks where the PDF file ends up. It seems that get_object writes the PDF to a directory inside the Cloud Foundry container ("/home/vcap/app") rather than pulling it down to the user's remote computer, so the file is not available for that person to open.
Pivotal Cloud Foundry will not download any files onto your local machine by default.
When code runs on Pivotal Cloud Foundry, it only has access to the filesystem of the app instance (container).
If you wish to retrieve a file from an app instance, you can SCP it from the application instance. Here are the docs on how to do that.
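As a rough sketch of that SCP flow with the cf CLI (the app name, file path and system domain below are placeholders; check the docs for the exact form on your foundation):

cf ssh-code                                  # prints a one-time passcode used as the scp password
scp -P 2222 -o User=cf:$(cf app my-rails-app --guid)/0 \
    ssh.SYSTEM-DOMAIN:/home/vcap/app/report.pdf ./report.pdf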

Sublime Text 3 and Rails: manage an app on a remote server

I'm new to Sublime Text and recently discovered the wonderful SFTP plugin on Sublime Text 3, which lets you work with a local folder and sync it to a remote server directory.
I have some questions:
is it possible to sync the folder automatically once files are created or deleted, either locally or remotely, instead of clicking every time on Sync Local -> Remote or vice versa?
is it possible to launch rails or rake commands from the Sublime console on the remote server, and after that sync the local folder contents automatically?
My local machine runs Windows 10, whereas my remote server runs Ubuntu 16.04.
Thanks in advance
Is it possible to sync the folder automatically once files are created or deleted, either locally or remotely, instead of clicking every time on Sync Local -> Remote or vice versa?
Yes, but only for files changed locally. Check the upload_on_save setting.
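For reference, upload_on_save is a flag in the plugin's per-folder sftp-config.json; a minimal sketch (host, user and remote_path are placeholders):

{
    // sftp-config.json for the SFTP package; values below are placeholders
    "type": "sftp",
    "host": "example.com",
    "user": "deploy",
    "remote_path": "/var/www/myapp/",
    "upload_on_save": true
}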
Is it possible to launch rails or rake commands from the Sublime console on the remote server, and after that sync the local folder contents automatically?
You cannot run such commands through an (S)FTP client; you will need SSH access to the server for that.
A workaround would be to set up a listener on your server that triggers rails commands when files change.

How can I run a Jekyll powered website from HostGator?

I want to create a website with 'dynamic' pages for a web app. I can't use Node because HostGator charges more for a dedicated server (apparently required for Node). I'm looking for a solution such as Jekyll or RoR. Does anyone know how to spin up a simple web app powered by these in an environment like HostGator?
Thanks in advance.
Just build your Jekyll site locally and upload the _site/ directory into your website folder on your shared HostGator account. You could also use a git repo, push all the files from Jekyll's _site directory to master, and then pull them down via SSH into your HostGator website folder. Or create a git hook or some other automated way of updating the files.
It should be as easy as uploading a static HTML site, since that is what Jekyll produces.
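A rough sketch of that build-and-upload workflow from the local machine, assuming the HostGator plan allows SSH/rsync (the host, user and public_html path are placeholders; plain FTP or the cPanel file manager works just as well):

bundle exec jekyll build                                 # writes the static site into _site/
rsync -av _site/ user@yourdomain.com:~/public_html/      # copy the generated files to the web root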
If you are looking to build a Rails site, check out the HostGator guide here: http://support.hostgator.com/articles/specialized-help/technical/how-do-i-start-using-ruby-on-rails

How do I run Snap! without an internet connection?

I can run Snap! by visiting the website http://snap.berkeley.edu/snapsource/snap.html, but is there a way for me to run it when I don't have an internet connection?
Snap! can be downloaded and run locally, and when run locally an internet connection is not necessary. This may be useful for students who have a computer at home but no reliable internet. Below are the instructions to download and run it.
Run Snap! from http://snap.berkeley.edu/snapsource/snap.html#
Click on the Snap! logo in the upper-left of the app.
Choose “Download source” from the menu.
Save snap.zip locally on your computer.
Extract snap.zip.
Open snap.html in a web browser.
Every time you wish to run Snap!, you can do so by opening snap.html. No internet connection is required.
Note that you cannot use cloud storage when running Snap! locally from snap.html. Also, browser storage is not shared between the local copy and the Berkeley website.
To transfer projects from a locally-run (offline) Snap! to the Berkeley-hosted (online) Snap!, you'll need to export the project from the locally-run Snap! window and import it into the Berkeley-hosted Snap! window.
