Step by step guide to making a db connection in symfony 1.4

I'm new to symfony. My existing application was developed in symfony 1.4, and it doesn't use any db connection. Now I want to create a new MySQL database connection in this application.
How can I do that?

In settings.yml:
all:
  use_database: true
From the console, run this command:
php symfony configure:database "mysql:host=dbhost;dbname=yourdbname" dbuser dbpassword
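For reference, the configure:database task writes the connection settings into config/databases.yml; the generated file should look roughly like the sketch below (host, database name and credentials are placeholders):
all:
  doctrine:
    class: sfDoctrineDatabase
    param:
      dsn: mysql:host=dbhost;dbname=yourdbname
      username: dbuser
      password: dbpassword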

Please check all the steps to make a db connection in symfony 1.4.
Step 1:
File: config/ProjectConfiguration.class.php
Find:
---------
public function setup()
{
    $this->enablePlugins('owCorePlugin');
}
Replace with:
------------
public function setup()
{
    $this->enablePlugins('owCorePlugin');
    $this->enablePlugins('sfDoctrinePlugin');
}
Step 2:
File: config/databases.yml
You need to create this file and add the code below. Replace the capitalized placeholders with your details, and keep the indentation exactly as shown.
all:
  doctrine:
    class: sfDoctrineDatabase
    param:
      dsn: mysql:host=HOST_NAME;dbname=DB_NAME
      username: USERNAME
      password: PASSWORD
Step 3:
File: frontend/config/settings.yml
Find:
use_database: false
Replace with:
use_database: true
Step 4:
Run the following command to change the project configuration. Make sure you run it from the root folder of your project:
php symfony configure:database --name=doctrine --class=sfDoctrineDatabase "mysql:host=HOST_NAME;dbname=DB_NAME" USERNAME PASSWORD
Step 5: Build the schema, model, form and filter classes, and link the plugin assets:
$ php symfony doctrine:build-schema
$ php symfony doctrine:build-model
$ php symfony doctrine:build-forms
$ php symfony doctrine:build-filters
$ ln -s lib/vendor/symfony/lib/plugins/sfDoctrinePlugin/web web/sfDoctrinePlugin
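Once these tasks finish, you can verify that the connection works from any action or symfony task. The snippet below is a minimal sketch assuming a hypothetical User model generated from your schema:
// Hypothetical check: fetch a record through a generated Doctrine model.
$user = Doctrine_Core::getTable('User')->find(1);
echo $user ? $user->getId() : 'no record found';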

Related

Configuring Backup gem in Rails 5.2 - Performing backup of PostgreSQL database

I would like to perform a regular backup of a PostgreSQL database; my current intention is to use the Backup and Whenever gems. I am relatively new to Rails and Postgres, so there is every chance I am making a very simple mistake...
I am currently trying to set up the process on my development machine (Mac), but keep getting an error when trying to connect to the database.
In the terminal window, I have performed the following to check the details of my database and connection:
psql -d my_db_name
my_db_name=# \conninfo
You are connected to database "my_db_name" as user "my_MAC_username" via socket in "/tmp" at port "5432".
\q
I have also manually created a backup of the database:
pg_dump -U my_MAC_username -p 5432 my_db_name > name_of_backup_file
However, when I try to repeat this within db_backup.rb (created by the Backup gem) I get the following error:
[2018/10/03 19:59:00][error] Model::Error: Backup for Description for db_backup (db_backup) Failed!
--- Wrapped Exception ---
Database::PostgreSQL::Error: Dump Failed!
Pipeline STDERR Messages:
(Note: may be interleaved if multiple commands returned error messages)
pg_dump: [archiver (db)] connection to database "my_db_name" failed: could not connect to server: No such file or directory
Is the server running locally and accepting
connections on Unix domain socket "/tmp/pg.sock/.s.PGSQL.5432"?
The following system errors were returned:
Errno::EPERM: Operation not permitted - 'pg_dump' returned exit code: 1
The contents of my db_backup.rb:
Model.new(:db_backup, 'Description for db_backup') do
  ##
  # PostgreSQL [Database]
  #
  database PostgreSQL do |db|
    # To dump all databases, set `db.name = :all` (or leave blank)
    db.name = "my_db_name"
    db.username = "my_MAC_username"
    #db.password = ""
    db.host = "localhost"
    db.port = 5432
    db.socket = "/tmp/pg.sock"
    # When dumping all databases, `skip_tables` and `only_tables` are ignored.
    # db.skip_tables = ["skip", "these", "tables"]
    # db.only_tables = ["only", "these", "tables"]
    # db.additional_options = ["-xc", "-E=utf8"]
  end
end
Could you please suggest what I need to do to resolve this issue and perform the same backup through the db_backup.rb code?
In case someone else gets stuck in a similar situation, the key to unlocking this problem was the lines:
psql -d my_db_name
my_db_name=# \conninfo
I realised that I needed to change db.socket = "/tmp/pg.sock" to db.socket = "/tmp", which seems to have resolved the issue.
However, I don't understand why the path on my computer differs from the default, as I didn't do anything to customise the installation of any gems or the Postgres App.
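For clarity, this is the corrected database block in db_backup.rb (a sketch of the change described above, using the same placeholder names as the original config):
database PostgreSQL do |db|
  db.name     = "my_db_name"
  db.username = "my_MAC_username"
  db.host     = "localhost"
  db.port     = 5432
  db.socket   = "/tmp"  # was "/tmp/pg.sock"; \conninfo reports the socket directory as /tmp
end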

Neo4j - How to set DEV ENV to point to an AWS database

I created a new app in Rails using:
rails new myapp -m http://neo4jrb.io/neo4j/neo4j.rb -O
I did not execute the command:
rake neo4j:install[community-2.2.0,development]
since my database is already created, populated and hosted by an AWS server.
How can I set my Rails dev env to use the database on the AWS server?
When running the following command from the myapp folder on my local computer:
$ rails s -d
I am getting the error:
Expected response code 200 Error for request http://my-aws-server.com:7474/db/data/, 401, 401 (Neo4j::Server::Resource::ServerException)
I added these three lines in the file config/environments/development.rb:
config.neo4j.session_options = { basic_auth: { username: 'neo4j_user', password: 'neo4j_pass'} }
config.neo4j.session_type = :server_db
config.neo4j.session_path = 'http://my-aws-server.com:7474'
Problem solved.

RMySQL not working with a cnf file

I am trying to connect to a MySQL server through R, and it works perfectly with the following line:
con <- dbConnect(MySQL(), user="user", password="password",dbname="dbname", host="localhost", port=3306)
But I would like to use a cnf file so that my user/password credentials do not appear in my code, so I tried the following:
rmysql.settingsfile <- "mydefault.cnf"
rmysql.db <- "test_db"
drv <- dbDriver("MySQL")
con <- dbConnect(drv, default.file = rmysql.settingsfile, group = rmysql.db)
And this is how my cnf file looks:
[test_db]
user=user
password=password
database=dbname
host=localhost
port=3306
It is in the same folder as my R script, which is my current working directory. But I run into the following error:
Error in mysqlNewConnection(drv, ...) :
RS-DBI driver: (Failed to connect to database: Error: Access denied for user 'ODBC'#'localhost' (using password: NO)
)
Any suggestions, please?
Thanks so much
I had this problem very recently. RMySQL looks in the root directory for these files, so you need to fully qualify the location of the file, e.g.:
rmysql.settingsfile <- "/home/MD-Tech/mydefault.cnf"
or
rmysql.settingsfile <- "C:\\Users\\MD-Tech\\rfiles\\mydefault.cnf"
Two things could be going on.
First, the CNF file should be encrypted, so the password line should read password = ****. The MySQL documentation shows how to create such a file; the command below would create the login path used in your code:
shell> mysql_config_editor set --login-path=test_db --host=localhost --user=user --password
Press Enter without typing the password; you will then be prompted to enter it.
The second thing is that user = NULL and password = NULL are missing, as referenced in the src_mysql documentation:
rmysql.settingsfile <- "~/.mylogin.cnf"
rmysql.db <- "test_db"
drv <- dbDriver("MySQL")
con <- dbConnect(drv, default.file = rmysql.settingsfile, group = rmysql.db, user = NULL, password = NULL)
When you add these and run the code, you should be set.
Found something working at: https://www.r-bloggers.com/mysql-and-r/
It does not use a configuration file... but it works.
con <- dbConnect(MySQL(),
                 user = "me", password = "nuts2u",
                 dbname = "my_db", host = "localhost")
Yeah, getting this set up for the first time can be like pulling cats' teeth! Here is what I did while running R on a Droplet (Ubuntu 16.04, MySQL 5.7.16).
First, make sure you can at least log in successfully to MySQL through the terminal:
mysql -u kevin -p
Next, run R and verify that you can log in directly with dbConnect() using a user name and password:
library(RMySQL)
mydb <- dbConnect(MySQL(), user = 'kevin', password = 'ilovecats',
                  dbname = 'catnapdb', host = '127.0.0.1', port = 3306)
Edit your mysql.cnf text file and add a new group at the bottom (the exact name and location of this file will depend on your operating system and versions):
[whiskerpatrol]
user = kevin
password = ilovecats
host = 127.0.0.1
port = 3306
database = catnapdb
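Finally, connect from R using the new group. This is a minimal sketch; the cnf path shown is an assumption, so point default.file at wherever your mysql.cnf actually lives:
library(RMySQL)
# Uses the credentials stored in the [whiskerpatrol] group of the cnf file
con <- dbConnect(MySQL(), default.file = "/etc/mysql/my.cnf", group = "whiskerpatrol")
dbListTables(con)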

Unable to perform queries on mongodb started with --auth switch in rails 3 with mongoid.

Simplified case:
I create a new rails 3.2 project, without active record. I add mongoid 3.0.0.rc to the Gemfile and then rails g mongoid:config. I edit my mongoid.yml to look like the one I have posted below (except that hosts is now set to localhost:27068).
I have added an admin user to mongodb:
$ mongo localhost/admin
> db.addUser("myadmin", "adminpass")
I have also added a regular user to my database:
> use mydb
> db.addUser("myuser", "mypassword")
I confirm that I can connect to my database:
$ mongo localhost/mydb -u myuser -pmypassword
MongoDB shell version: 2.0.4
connecting to: localhost/mydb
> _
After that, I start mongod with --auth switch to force authentication:
$ mongod --auth --dbpath /my/db/path
Now that everything seems to be OK, I create some random scaffold like:
$ rails g scaffold User name email
and try to run the project in the browser: localhost:3000/users. BOOM! I'm hit with the error message posted below.
Is this a bug in mongoid? Or am I missing something?
Original Question
I'm unable to do anything on my MongoHQ hosted database in a Rails 3.2 project with mongoid 3 rc. A simple query for the login action gives me an error message like this:
The operation: #<Moped::Protocol::Query
#length=83
#request_id=3
#response_to=0
#op_code=2004
#flags=[]
#full_collection_name="mydb.users"
#skip=0
#limit=-1
#selector={"name"=>"Abbas"}
#fields=nil>
failed with error 10057: "unauthorized db:mydb lock type:-1 client [some ip]"
Here's what my mongoid.yml looks like:
development:
  sessions:
    default:
      database: mydb
      user: myuser
      password: mypassword
      hosts:
        - flame.mongohq.com:27068
      options:
        consistency: :strong
  options:
    include_type_for_serialization: true
So I must be doing something wrong. The db user is not marked as "Read-only" in the MongoHQ panel, and I'm NOT deploying to Heroku; I'm just testing on my localhost.
Any help is appreciated.

Deploying Rails App to AWS/EC2 Using Rubber

I have a question about using the Rubber gem to deploy a Rails app to EC2. When I go about running
cap rubber:create_staging
the following output repeats in a loop:
executing `rubber:_allow_root_ssh'
executing "sudo -p 'sudo password: ' bash -l -c 'cp /home/ubuntu/.ssh/authorized_keys /root/.ssh/'"
servers: ["witheld"]
. Failed to connect to witheld, retrying
I believe this may be an issue with my keypairs.
In terms of my key names, I have a private key called keyname (a plain text file) and a public key called keyname.pub in my config/rubber folder. My rubber.yml file lists:
key_name: keyname
key_file: "#{Dir[(File.expand_path('~') rescue '/root') + '/.ec2/*' + cloud_providers.aws.key_name].first}"
I'm pretty sure all other information is correct, but I obviously can't copy and paste it in. Any suggestions?
Your keys should be in the ~/.ec2/ folder, not in the config/rubber folder. Also make sure you remove the .pem extension from your private key file; the .pub extension stays on your public key file.
Also change key_name: [your private key file name here] in your rubber.yml file.
Based on your key_file: setting, rubber will look for these keys in the ~/.ec2 folder, so move them there.
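A quick sketch of the move described above (assuming the key pair is currently named keyname/keyname.pub under config/rubber):
mkdir -p ~/.ec2
cp config/rubber/keyname ~/.ec2/keyname          # private key, no .pem extension
cp config/rubber/keyname.pub ~/.ec2/keyname.pub  # public key keeps .pub
chmod 600 ~/.ec2/keyname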
