I am trying to run a stored procedure from a limited-permission login that has been granted execute permission on said stored procedure. The stored procedure accesses 2 databases that exist on the same server. When I execute the stored procedure I receive an error that states:
The server principal "LimitedUser" is not able to access the database "Database2" under the current security context.
Some background:
I have recently been tasked with migrating our 2 different database servers onto a single server. I have backed up and exported the necessary databases and restored them onto the new server. The old databases are MS SQL Server 2000 (for Database2) and MS SQL Server 2005 (for Database1, where the aforementioned stored proc is located).
I have found some leads suggesting that because I imported the databases, the owners were different and that would cause a problem, so I ran "exec sp_changedbowner 'sa'" on the 2 databases to ensure they had the same owner. I still got the same error when running the stored proc as LimitedUser. A lot of other examples on various forum sites deal with databases that are on different servers and have to use OPENQUERY commands; I do not believe this is necessary here.
When I run it as a user who has more admin permissions, the stored proc runs just fine. So my question is: what permissions should I be setting to allow this action for LimitedUser?
Thanks!
LimitedUser needs permissions in Database2 to do whatever the stored procedure is doing in that database; ownership chaining will only work within the same database (unless you enable the server option Cross Database Ownership Chaining, which I don't recommend, as it breaks down the database container as a security boundary).
So, for example, you have db1 and db2, and there is a stored proc in db1 that executes select * from db2.dbo.table1
For this you need LimitedUser to have:
execute permission on the procedure in the db1 database
select permission on table1 in db2 (see the sketch below)
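A minimal T-SQL sketch of those grants, assuming the procedure is dbo.MyProc in db1 (the procedure name is a placeholder) and that LimitedUser is the server login:

USE db1;
GRANT EXECUTE ON dbo.MyProc TO LimitedUser;

USE db2;
-- The "not able to access the database under the current security context" error
-- usually means the login is not mapped to a user in that database at all,
-- so create the mapping first if it does not exist yet.
CREATE USER LimitedUser FOR LOGIN LimitedUser;
GRANT SELECT ON dbo.table1 TO LimitedUser;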
I have a remote Firebird 3.0 server with a database. In this database there is a big table that clients query very often during their work. There are many clients and the internet connection is poor, so working with this table is painful. I made a local copy of this table via IBExpert into a temporary database, which is distributed with the client application.
But now some values in this table need to change (new values added and some old ones edited), so I need some kind of synchronization: copying the modified remote table to the client's local database.
The client application was built with Delphi Berlin 10.1, so the synchronization should be done in Delphi code.
Can you give me an idea of how to correctly synchronize such a big table, please?
You could fire POST_EVENT from triggers on the master database (after insert, update and delete) to notify client applications that there are changes.
Then your client would need to run a procedure (on the local DB) to do the sync. This could be done with EXECUTE STATEMENT ... ON EXTERNAL, along the lines of:
FOR EXECUTE STATEMENT ('SELECT ... FROM tablename WHERE tablename.modifiedon >= :last_sync')
ON EXTERNAL 'SERVER/PORT:DBPATH'
where :last_sync is the timestamp of the client's previous synchronization.
You should include the date of insert/update/delete (the modifiedon column used above) in the master DB.
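A minimal sketch of both pieces, assuming the table is called tablename with an integer key id, a value column some_value and a modifiedon timestamp (some_value, the trigger and event names, and the SYSDBA credentials are placeholders; adjust everything to your schema and server):

-- On the master database: notify connected clients that something changed.
CREATE TRIGGER trg_tablename_changed FOR tablename
AFTER INSERT OR UPDATE OR DELETE
AS
BEGIN
  POST_EVENT 'TABLENAME_CHANGED';
END

-- On the client's local database: pull rows changed since the previous sync.
-- The client application passes the last sync timestamp as the block's input parameter.
EXECUTE BLOCK (last_sync TIMESTAMP = ?)
AS
  DECLARE VARIABLE id INTEGER;
  DECLARE VARIABLE some_value VARCHAR(100);
  DECLARE VARIABLE modifiedon TIMESTAMP;
BEGIN
  FOR EXECUTE STATEMENT
        ('SELECT id, some_value, modifiedon FROM tablename WHERE modifiedon >= :last_sync')
        (last_sync := :last_sync)
      ON EXTERNAL 'SERVER/PORT:DBPATH'
      AS USER 'SYSDBA' PASSWORD 'masterkey'
      INTO :id, :some_value, :modifiedon
  DO
    UPDATE OR INSERT INTO tablename (id, some_value, modifiedon)
    VALUES (:id, :some_value, :modifiedon)
    MATCHING (id);
END

Note that deletions are not covered by this pull; they would need extra handling on the master, for example a soft-delete flag or a change-log table.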
What I'm trying to do is to transfer my Company model, which has a lot of associations (depots, products, users, owners, etc.), to another database (server). I've tried cloning the company but it doesn't get the associations. How exactly can I get the Company data and its children to the other database? I don't want to dump my data and restore it; I want to establish a connection between the two databases and be able to transfer what data I have from the first server to the second.
I'd suggest instead doing this at the database level rather than the application level. This would result in a more reliably faithful copy of the data.
For Postgres, you can do the following (found at https://www.google.com.tw/search?q=postgres+backup+and+restore&ie=utf-8&oe=utf-8):
Backup a local postgres database and restore to remote server using single command:
$ pg_dump dbname | psql -h hostname dbname
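If a full dump/restore is not what you want (the question is about transferring one Company and its children over a live connection between the two servers), another database-level option is the postgres_fdw extension; this is only a hedged sketch, and the host, database, credentials, schema, table and id values below are all placeholders:

-- Run on the target server's database (requires privileges to create extensions).
CREATE EXTENSION IF NOT EXISTS postgres_fdw;

CREATE SERVER source_srv
  FOREIGN DATA WRAPPER postgres_fdw
  OPTIONS (host 'old-db.example.com', port '5432', dbname 'old_production');

CREATE USER MAPPING FOR CURRENT_USER
  SERVER source_srv
  OPTIONS (user 'old_user', password 'old_password');

-- Expose the remote tables under a local schema...
CREATE SCHEMA IF NOT EXISTS source;
IMPORT FOREIGN SCHEMA public
  LIMIT TO (companies, depots, products, users)
  FROM SERVER source_srv INTO source;

-- ...then copy one company and its children with plain INSERT ... SELECT.
INSERT INTO companies SELECT * FROM source.companies WHERE id = 1;
INSERT INTO depots    SELECT * FROM source.depots    WHERE company_id = 1;
INSERT INTO products  SELECT * FROM source.products  WHERE company_id = 1;
INSERT INTO users     SELECT * FROM source.users     WHERE company_id = 1;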
I am currently writing RoR applications and deploying using Heroku.
Is there any way to connect directly to the DB using a connection string?
I guess what I am asking here is: can I connect directly to the DB, is there a connection string, how can I get the connection string, etc.? I want to be able to perform queries on the DB outside of the terminal I am developing in. My current solution is using
heroku db:pull 'anotherPOSTGRESQLdatabasesCONNECTIONstring'
and then performing queries on that database, but this is not a valid solution, because I am developing this application for users who are not code savvy, and they should be able to perform queries on the database without me or them using the terminal.
I think what you might be asking is available here: https://postgres.heroku.com/
Click "Try For Free", select the option indicating you already have a Heroku account, and log in with the correct credentials.
Now you can choose the database you want to perform actions on, and create "Data Clips" that your clients can run to get data reports etc. (Data Clips can be used to run arbitrary SELECT SQL commands).
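A Data Clip is essentially a saved SQL query; a hypothetical example (the users table and its columns are made up for illustration) could be:

-- Hypothetical report a non-technical user could run as a Data Clip
SELECT name, email, created_at
FROM users
WHERE created_at >= now() - interval '30 days'
ORDER BY created_at DESC;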
Overview:
I have written an application that allows a user to define a query, submit it to a server and view the results. The software can run on DB2 or MySQL.
Problem:
We've had issues in the DB2 version where a user has tried to run a query, and found that it has failed because their user profile has been disabled. In order to run a query on DB2 (on an IBM i), the user's profile name and password are provided in the connection string. Security on the server can specify that a user's profile is disabled after two or three incorrect logins.
Question:
I've debugged the application and found that the problem is down to the query being submitted twice. If the user's password is wrong, then of course, this is having the knock-on effect of disabling their profile.
On further inspection of the logs on the server (while debugging line by line), I found that the query is submitted to the server when you call TADOQuery.sql.add(), and again when the TADOQuery's active property is set to true (which is the point at which I would expect the query to be submitted). Here's an example of the code that I'm using to run the query:
adoqry.active := false;
adoqry.sql.clear;
adoqry.sql.add('SELECT * FROM SOMEDB.SOMETABLE');
adoqry.active := true;
My question is therefore quite simple:
1. Why does the TADOQuery.sql.add() method submit the query (when it should just be adding the sql to the TADOQuery's sql property)?
2. What can I do to prevent this? i.e. is there any way to prevent the sql being submitted when I call the add() method?
For those of you that would like extra information about the logs, the exit point logs on the IBM i show that when I call adoqry.sql.add in the above example, the query is run through the "Database Server-SQL Requests" exit point application, via function "Prepare and Describe". When I call adoqry.active := true in the above example, the same query goes through the same exit point application, but via the "Open/Describe" function.
If you're not familiar with the IBM i, don't worry about it - I'm just including that information as proof that I have traced the query being submitted twice. The real issue is with the TADOQuery's sql.add() processing.
From your description of your problem, I assume you specify the ConnectionString of the ADOQuery. Doing this combines the database login with the running of the query. You have found that this has undesirable side effects when the user's credentials are invalid.
Separate the database login from the query by using an ADOConnection. Specify the ConnectionString of the ADOConnection and assign the ADOConnection to the ADOQuery.Connection property. This way, you control the database login and can catch logins with bad credentials. Additionally the ADOConnection.Open method allows you to specify the username and password so you do not have to put them in the ConnectionString.
While this does not answer your specific questions, this approach will help you solve the problem of the user's profile being disabled by separating the login from the running of the query.
There is a Delphi application in which I am trying to connect to an Oracle database using the provider MSDAORA.1, but the connection fails. The Oracle error message that comes back is "Oracle error occurred, but error message could not be retrieved from Oracle".
I am able to connect to the database with the Oracle 10g client.
Connection String: Provider=MSDAORA.1;
User ID=murat;
Password = murat;
Data Source=(DESCRIPTION=(ADDRESS=(PROTOCOL=tcp) (HOST= INGPSP)(PORT=1521))(CONNECT_DATA=(SID=INGPSP)));
Persist Security Info=False;
Please provide your expert opinion on what the reason for this could be.
The service name seems to be missing from your address.
Set up a tnsnames.ora file and use its entry as the data source instead of the full descriptor you set as the Data Source parameter. Follow the steps available in the FAQ.
Or use a connection string like '//host[:port]/[service_name]' for your data source: //INGPSP:1521/ServiceName
For Oracle, both the Microsoft and the Oracle OLE DB providers are known to have issues with BLOBs. If you can, use another means of connection.
What strikes me as strange is that your HOST and SID are the same. The HOST is the name of the machine on your network and the SID is the database instance on that machine. I created the following ConnectionString for the PRD3 database on machine DB19 (there are multiple databases on DB19) on our network. I was able to connect to the database successfully with a real User ID and Password.
Provider=MSDAORA.1;
Password=123456;
User ID=abc;
Data Source="(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=db19)(PORT=1521))(CONNECT_DATA=(SID=prd3)))";
Persist Security Info=True
Normally the Data Source I use is the database name as defined in TNSNAMES.ORA. It is a lot less to type (fewer potential errors) and can be changed to another database without recompiling the program (such as switching between a development database and production database).
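If you are unsure which SID or service name to put in the descriptor (or in the TNSNAMES.ORA entry), a quick check, run on the database server itself, for example in SQL*Plus with a sufficiently privileged account, is:

-- Instance name (the SID) of the running database
SELECT instance_name FROM v$instance;

-- Service name(s) the database registers with the listener
SELECT value FROM v$parameter WHERE name = 'service_names';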