I have written a Discord bot that downloads a file and then re-uploads it to Streamable. All the bot does is POST the video to Streamable with some auth, using the Node request module. The bot works on my EC2 instance, but I want to move it to Heroku, and once it is deployed to Heroku it can still download the file but it never uploads it to Streamable.
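For reference, the whole upload is one multipart POST with basic auth. Here is a minimal sketch with the request module, assuming Streamable's https://api.streamable.com/upload endpoint, a placeholder file path, and credentials read from environment variables:

    const fs = require('fs');
    const request = require('request');

    // Hypothetical env var names; on Heroku these must be set as config vars
    request.post({
      url: 'https://api.streamable.com/upload',
      auth: { user: process.env.STREAMABLE_USER, pass: process.env.STREAMABLE_PASS },
      formData: { file: fs.createReadStream('/tmp/video.mp4') },
    }, (err, res, body) => {
      if (err) return console.error('upload failed:', err);
      console.log('streamable responded:', res.statusCode, body);
    });

Logging the status code and body (or the error) on the Heroku dyno should at least show whether the POST fails or simply never runs.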
Related
I have created a Rasa chatbot that works properly on my system; chat responses below.
But when I deployed it on Heroku, the bot is not responding; chat responses below.
Here is my code (GitHub code repository) and the deployed app (Heroku link).
Can anyone tell me what the problem is?
I have used a Docker image to manage the dependencies, but I think my model is not properly deployed. I want to get answers from the model.
Your project relies on two open ports, which is not possible on Heroku: a dyno only gets the single port assigned through $PORT. Your web browser clients connect to your Flask app, and API calls then have to reach your Rasa action server.
One port is used for the Rasa action server.
One port is used for your Flask app.
Also, the Rasa action server declared in start_services.sh is never started.
Put your Rasa action server in a separate Heroku app, and point the API endpoint calls of your Flask app to that new Heroku app.
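A minimal sketch of that split, assuming a hypothetical second app named my-rasa-actions. Its Procfile starts the action server on the port Heroku assigns:

    web: rasa run actions --port $PORT

and the endpoints.yml of the main bot app points at that app's public URL:

    action_endpoint:
      url: "https://my-rasa-actions.herokuapp.com/webhook"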
I am looking for a way to manage my passwords/credentials while deploying my Rails application with Habitat, so that I don't have to commit my credentials into version control.
After researching a lot, I found a workaround for the credentials.
Habitat configuration, including credentials, is stored in a .toml file, and the service reads those values directly. If we do not want to commit that file, we need to send the TOML file to the server directly.
Or we can create data bags.
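One way to do the "send the TOML file to the server directly" step without committing it is Habitat's runtime configuration update. A minimal sketch, assuming a hypothetical service group my-rails-app.default and a secrets.toml kept out of version control:

    # secrets.toml (never committed), for example:
    #   [db]
    #   password = "s3cr3t"

    # push it to the running service group; "1" is the configuration version number
    hab config apply my-rails-app.default 1 secrets.toml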
A test that succeeds on Amazon Device Farm fails when it is run through Jenkins. Could someone tell me why this is happening, please? The error message given when the test is executed by Jenkins is: "FBSSystemService] Sending request to open "com.apple.test.WebDriverAgentRunnerRunner"
The .ipa and zip-with-dependencies.zip files are the same in both cases.
Assuming you're using the Device Farm plugin for Jenkins, you may also need to attach an IAM policy for S3 to whatever entity Jenkins runs as. Device Farm uses S3 through the SDK to store its uploads, which you can see if you try to upload something manually with the create-upload command. Using the IAM policy generator,
you can create a new policy with full access to Device Farm and the ability to do a PutObject on the specific bucket that create-upload returns.
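A minimal sketch of such a policy, with a hypothetical bucket name (the actual bucket appears in the presigned URL that create-upload returns):

    {
      "Version": "2012-10-17",
      "Statement": [
        { "Effect": "Allow", "Action": "devicefarm:*", "Resource": "*" },
        { "Effect": "Allow", "Action": "s3:PutObject",
          "Resource": "arn:aws:s3:::example-devicefarm-upload-bucket/*" }
      ]
    }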
Hope that helps
I have accessed my Google Sheets file using the google_drive gem with the command
GoogleDrive.saved_session("./config.json")
The config file contains the credentials to connect to Drive. On my local machine, the gem asks me to open a URL in the browser and paste back an authentication code.
I have done all this, and on localhost the code works fine.
But I have deployed my code to Heroku and I am facing an issue: the app crashes, with logs telling me to get the authentication code from the given URL and paste it.
I can get this authentication code successfully, but how do I paste it on Heroku? On localhost I was able to enter the code in the terminal. How can I make this work on Heroku?
Please help me.
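For context, what that prompt does is obtain a refresh token: after the interactive flow succeeds once, the gem writes the token back into config.json, and saved_session stops prompting as long as it is there. A rough sketch of the file after local authorization (every value is a placeholder):

    {
      "client_id": "1234567890-example.apps.googleusercontent.com",
      "client_secret": "example-client-secret",
      "scope": ["https://www.googleapis.com/auth/drive"],
      "refresh_token": "1/example-refresh-token"
    }

So one common approach is to complete the browser flow locally and deploy the already-authorized config.json (or rebuild it on boot from Heroku config vars) instead of relying on the interactive prompt.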
I am using a DigitalOcean droplet with Nginx + Passenger as the server. We are using the CarrierWave gem in Rails to upload images, resize/process them, and upload them to Amazon S3. It works perfectly fine in the local environment, but when I deploy to production the image uploading does not work.
Error:
We're sorry, but something went wrong.
The app is running on port 80.
I am not sure where to even look to debug the issue. The Passenger logs don't show any error for it either.
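For reference, the S3 side of that setup is usually just a fog-backed initializer. A minimal sketch, assuming the fog-aws storage backend, a hypothetical bucket name, and credentials taken from environment variables on the droplet:

    # config/initializers/carrierwave.rb
    CarrierWave.configure do |config|
      config.fog_provider = 'fog/aws'
      config.fog_credentials = {
        provider:              'AWS',
        aws_access_key_id:     ENV['AWS_ACCESS_KEY_ID'],
        aws_secret_access_key: ENV['AWS_SECRET_ACCESS_KEY'],
        region:                'us-east-1'            # placeholder region
      }
      config.fog_directory = 'example-upload-bucket'  # placeholder bucket
      config.storage = :fog
    end

If those environment variables are only set in your local shell, the production process will be running with nil credentials, which is one of the first things worth ruling out.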
You can check the Nginx logs.
For the access log look in '/var/log/nginx/access.log', and for the error log look in '/var/log/nginx/error.log'.
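For example, you can watch both files while reproducing the failed upload (these are the default Nginx log locations; adjust the paths if your server block overrides them):

    sudo tail -f /var/log/nginx/access.log /var/log/nginx/error.log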
Let me know if you need anything more.
You can have a look at the S3 logs as well, or at the network tab of your browser (enable "Preserve log"). There has to be an error somewhere ;)
Have you checked your IAM user policies? Make sure you are using an IAM user instead of the root AWS user/key for the S3 upload. Here is an example of a policy that allows anonymous uploads to your bucket. You surely don't want anonymous uploads; it is just an example policy, and your own policy requirements may well be more restrictive.
Amazon S3 bucket policy for anonymously uploading photos to a bucket
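A minimal sketch of a bucket policy along those lines, granting anonymous PutObject on a single prefix of a hypothetical bucket:

    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Sid": "AllowAnonymousUploads",
          "Effect": "Allow",
          "Principal": "*",
          "Action": "s3:PutObject",
          "Resource": "arn:aws:s3:::example-photo-bucket/uploads/*"
        }
      ]
    }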