I'm using the PayPal Permissions SDK gem: https://github.com/paypal/permissions-sdk-ruby
I got everything working, but it seems the gem only has built-in methods for requesting basic and advanced user info. However, I need to get transaction/payment data. What's the best way to go about this?
Do I use the tokens it returns to make requests manually via HTTParty or similar?
paypal.yml
test: &default
  # Credentials for REST APIs
  client_id: AZeKLB9rkwMumoPHGg_xG-sMOtEREDh3VeSP2cgbdorScFDkGBAoG2WQ0ZNtIgdKM6eaCfYqKmXzxDqS
  client_secret: ELXfwfA4wGU_PEwjIiYllCvs7gQCYsaWyN_yzXux5XSrHv6ZxGEUasnSIHvkAZ4-rYXTcvC4Igy097xo
  # Mode can be 'live' or 'sandbox'
  mode: sandbox
  # Credentials for Classic APIs
  app_id: APP-80W284485P519543T
  username: jb-us-seller_api1.paypal.com
  password: WX4WTU3S8MY44S7F
  signature: AFcWxV21C7fd0v3bYYYRCpSSRl31A7yDhhsPUU2XhtMoZXsWHFxu-RWy
  # # With Certificate
  # cert_path: "config/cert_key.pem"
  sandbox_email_address: kolbywebdev-facilitator@gmail.com
  # # IP Address
  # ip_address: 127.0.0.1
  # # HTTP Proxy
  # http_proxy: http://proxy-ipaddress:3129/
development:
  <<: *default
production:
  <<: *default
  mode: live
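For example, would a direct call to the classic NVP API, along the lines of the rough sketch below, be the right approach? This is only my guess: the sandbox endpoint, API version, and TransactionSearch parameters come from the classic NVP docs and use the first-party classic credentials from paypal.yml above; my understanding is that calls made on another account's behalf would additionally need an X-PAYPAL-AUTHORIZATION header built from the permissions tokens.

require "httparty"
require "cgi"

# Rough sketch: a classic TransactionSearch NVP call against the sandbox endpoint,
# using the classic API credentials from paypal.yml.
response = HTTParty.post(
  "https://api-3t.sandbox.paypal.com/nvp",
  body: {
    "METHOD"    => "TransactionSearch",
    "VERSION"   => "94",
    "USER"      => "jb-us-seller_api1.paypal.com",
    "PWD"       => "WX4WTU3S8MY44S7F",
    "SIGNATURE" => "AFcWxV21C7fd0v3bYYYRCpSSRl31A7yDhhsPUU2XhtMoZXsWHFxu-RWy",
    "STARTDATE" => "2016-01-01T00:00:00Z"
  }
)

# The NVP API answers with a URL-encoded string such as "ACK=Success&L_TRANSACTIONID0=...".
parsed = CGI.parse(response.body)
puts parsed["ACK"]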
I'm following the 'Running Rails on the Cloud Run environment' instructions and have hit a snag. I used their provided GitHub repo and the Google Cloud Shell, and I succeeded in launching the working application.
Now I am trying to integrate Cloud Run into my Rails template. While 'Deploying the app to Cloud Run' using the provided cloudbuild.yaml file, the build crashes during the database migration. I am using PostgreSQL. Here are the error details:
The error
"bundle exec rails db:migrate" ->
"ActiveRecord::ConnectionNotEstablished: could not connect to server: No such file or directory"
I think I've traced it to the database.yml file, where Google recommends this host:
production:
  <<: *default
  database: <%= ENV["PRODUCTION_DB_NAME"] %>
  username: <%= ENV["PRODUCTION_DB_USERNAME"] %>
  password: <%= Rails.application.credentials.gcp[:db_password] %>
  host: "<%= ENV.fetch("DB_SOCKET_DIR") { '/cloudsql' } %>/<%= ENV["CLOUD_SQL_CONNECTION_NAME"] %>"
It is unclear to me (I'm new to this) where this ENV.fetch("DB_SOCKET_DIR") comes from. Their GitHub repo holds a folder with templates for another kind of build, including an app.standard.yaml and a config/database_unix.yml, which I've tried integrating.
app.standard.yaml:
entrypoint: bundle exec rackup --port $PORT
runtime: ruby27
env_variables:
  SECRET_KEY_BASE: <SECRET_KEY>
  RAILS_ENV: production
  INSTANCE_UNIX_SOCKET: /cloudsql/<PROJECT-ID>:<INSTANCE-REGION>:<INSTANCE-NAME>
  DB_USER: <YOUR_DB_USER_NAME>
  DB_PASS: <YOUR_DB_PASSWORD>
  DB_NAME: <YOUR_DB_NAME>
beta_settings:
  cloud_sql_instances: <PROJECT-ID>:<INSTANCE-REGION>:<INSTANCE-NAME>
database_unix.yml:
# [START cloud_sql_postgres_activerecord_connect_unix]
unix: &unix
  adapter: postgresql
  # Configure additional properties here.
  # [END cloud_sql_postgres_activerecord_connect_unix]
  pool: 5
  timeout: 5000
  # [START cloud_sql_postgres_activerecord_connect_unix]
  # Note: Saving credentials in environment variables is convenient, but not
  # secure - consider a more secure solution such as
  # Cloud Secret Manager (https://cloud.google.com/secret-manager) to help
  # keep secrets safe.
  username: <%= ENV["DB_USER"] %> # e.g. "my-database-user"
  password: <%= ENV["DB_PASS"] %> # e.g. "my-database-password"
  database: <%= ENV.fetch("DB_NAME") { "vote_development" } %>
  # Specify the Unix socket path as host
  host: "<%= ENV["INSTANCE_UNIX_SOCKET"] %>"
# [END cloud_sql_postgres_activerecord_connect_unix]

development:
  <<: *unix

# Warning: The database defined as "test" will be erased and
# re-generated from your development database when you run "rake".
# Do not set this db to the same as development or production.
test:
  <<: *unix
  pool: <%= ENV.fetch("RAILS_MAX_THREADS") { 1 } %>
  database: <%= ENV.fetch("DB_NAME") { "vote_test" } %>

production:
  <<: *unix
  database: <%= ENV.fetch("DB_NAME") { "vote_production" } %>
Some other solutions I've seen mention this instead of host:
socket: "/cloudsql/project_id:us-central1:photo-album-production"
I tried this with no luck. I cloned the repo to my machine and re-ran the Cloud Run instructions, but I get the same migration error. Am I thinking about this wrong?
Check this out: https://cloud.google.com/sql/docs/postgres/connect-build.
In short, you'll need to run the Cloud SQL Auth Proxy to create a Unix socket in Cloud Build so your app can connect.
If you're using a private IP instance, you'll need to make sure you're using private pools.
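As a rough sketch (the image and instance names below are illustrative, not taken from the tutorial), a migration step in cloudbuild.yaml that wraps the command with the proxy can look something like this. The gcr.io/google-appengine/exec-wrapper helper starts the Cloud SQL Auth Proxy for the given instance, which creates the /cloudsql/<CONNECTION_NAME> socket your database.yml host line points at, and then runs the command inside your app image:

steps:
  # ... earlier steps build and push gcr.io/$PROJECT_ID/my-rails-app ...
  - id: "run migrations"
    name: "gcr.io/google-appengine/exec-wrapper"
    args:
      - "-i"
      - "gcr.io/$PROJECT_ID/my-rails-app"           # app image built in an earlier step (illustrative name)
      - "-s"
      - "$PROJECT_ID:us-central1:my-instance"       # Cloud SQL instance connection name (illustrative)
      - "-e"
      - "RAILS_ENV=production"
      - "--"
      - "bundle"
      - "exec"
      - "rails"
      - "db:migrate"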
I successfully connected the CASino gem to Active Directory using LDAP. The problem is that I need to gather data from AD that is located in several folders (OUs). How can I do this?
In my cas.yml I have this:
development:
  <<: *defaults
  frontend:
    sso_name: AUTHENTICATION CENTER
  authenticators:
    my_company_ldap:
      authenticator: LDAP
      options:
        host: LDAP_HOST
        port: 389
        base: OU=Users,OU=UserFolder1,OU=Root,DC=company,DC=com
        username_attribute: sAMAccountName
        encryption: false
        admin_user: ADMIN
        admin_password: admin
        extra_attributes:
          email: mail
          fullname: displayname
How can I connect to several folders in AD? I tried writing another base: inside the same authenticator, but only the last one is read. I also tried customizing the cas.yml values inside the controller, but then I need to restart the server to apply the changes. Can I make base: read several folders, not only UserFolder1?
I found the answer myself.
The solution is simply to add another authenticator configuration below the first one. That makes the configuration search several folders in AD.
Here's an example of the implementation:
development:
  <<: *defaults
  frontend:
    sso_name: AUTHENTICATION CENTER
  authenticators:
    my_company_ldap1:
      authenticator: LDAP
      options:
        host: LDAP_HOST
        port: 389
        base: OU=Users,OU=UserFolder1,OU=Root,DC=company,DC=com
        username_attribute: sAMAccountName
        encryption: false
        admin_user: ADMIN
        admin_password: admin
        extra_attributes:
          email: mail
          fullname: displayname
    my_company_ldap2:
      authenticator: LDAP
      options:
        host: LDAP_HOST
        port: 389
        base: OU=Users,OU=UserFolder2,OU=Root,DC=company,DC=com
        username_attribute: sAMAccountName
        encryption: false
        admin_user: ADMIN
        admin_password: admin
        extra_attributes:
          email: mail
          fullname: displayname
I'm having some trouble loading custom configuration data from a yml file. I've looked at a few resources and can't seem to get anywhere with it.
When I try to load some custom settings from the yml file I get an empty hash.
my application.rb contains:
config.myapp = config_for(:myapp)
my myapp.yml contains:
default: &default
  emails:
    support: test@myapp.com
    marketing: marketing@myapp.com
  address: 123 Test lane

production:
  <<: *default

development:
  <<: *default

test:
  <<: *default
When I call:
Rails.configuration.myapp
I get:
{}
Any thoughts on what the issue might be?
Thanks
It has something to do with the lifecycle of the Application object and where your config.myapp = config_for(:myapp) call sits.
I tried it locally and solved the problem by putting the call inside a config.before_initialize callback block in my config/application.rb file:
config.before_initialize do
  config.myapp = config_for(:myapp)
end
Then, in the Rails console, Rails.application.config.myapp correctly dumps the configuration parsed from the YAML file.
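For reference, with the YAML above the values come back with symbol keys (newer Rails versions wrap the result of config_for in ActiveSupport::OrderedOptions, so method-style access works too), so something like this should work in the console:

Rails.application.config.myapp[:emails][:support]
# => "test@myapp.com"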
I'm trying to send custom metrics to New Relic Insights, but unfortunately it is not working for my Rails app, which is already sending its default data to New Relic.
Steps to reproduce
I logged into the console of the working application and ran the following command:
NewRelic::Agent.record_metric('/Custom/MyCategory/MyMetric', 5)
Unfortunately it never appeared in the Insights Data Explorer.
The configuration in the application is the following:
common: &default_settings
  license_key: <MY_KEY>
  app_name: my_app
  log_level: info

development:
  <<: *default_settings
  app_name: executive_alerts (development)
  monitor_mode: false

test:
  <<: *default_settings
  app_name: executive_alerts (test)
  monitor_mode: false

staging:
  <<: *default_settings
  app_name: executive_alerts (staging)

production:
  <<: *default_settings
Thank you!
From the documentation for the New Relic agent:
http://www.rubydoc.info/github/newrelic/rpm/NewRelic/Agent#record_metric-instance_method
https://github.com/newrelic/rpm/blob/c252fad410a5d41a65a827d633338af609f8dff6/lib/new_relic/agent.rb#L143-L144
metric_name should follow a slash separated path convention. Application specific metrics should begin with Custom/.
Change your command from:
NewRelic::Agent.record_metric('/Custom/MyCategory/MyMetric', 5)
to:
NewRelic::Agent.record_metric('Custom/MyCategory/MyMetric', 5)
I have a problem with the seamless_database_pool gem.
development:
  adapter: jdbcmysql
  database: mydb_development
  username: read_user
  password: abc123
  pool_adapter: jdbcmysql
  port: 3306
  master:
    host: master-db.example.com
    port: 6000
    username: master_user
    password: 567pass
  read_pool:
    - host: read-db-1.example.com
      pool_weight: 2
    - host: read-db-2.example.com
It should read from the slave (read-db-1.example.com), right? But it's weird: it always reads from the master database (mydb_development).
Do you have any suggestions on how to configure this gem so that reads go to the slave database by default?
Thank you
Specify pool_weight: 0 in the master configuration.
By default, the master connection will be included in the read pool. If you would like to dedicate this connection only for write operations, you should set the pool weight to zero.
(from the seamless_database_pool plugin's README)
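With the configuration from the question, that would look roughly like this; only the master block changes, everything else stays as it is:

  master:
    host: master-db.example.com
    port: 6000
    username: master_user
    password: 567pass
    pool_weight: 0  # keep the master out of the read pool so reads default to the read_pool hosts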