Currently, if I want to make a backup of a stored procedure in Microsoft SQL Server Management Studio 2008 R2, I right-click the stored procedure, choose Modify, change the ALTER PROC part to CREATE PROC, and append the word "backup" to the procedure's name. Is there a better way to do this? In a perfect world, I would like to be able to back up all the stored procedures in a database and keep them somewhere locally. I don't like how my list of stored procedures is getting sloppy (for lack of a better word) with all these backups I have made. If you can't tell, I am extremely new to writing stored procedures and want a safeguard for the existing procedures against any mistakes I might make.
Thanks in advance for all your help!
There are multiple ways to keep backups of your stored procedures apart from your live database. Here are just a few:
When you back up your database, all the stored procedures are included in that backup. If you need to revert to an older version, you can restore the backup to another database and script the procedure out to a new editor tab or a file. Hopefully you have separate live and test databases anyway, in which case you can often just script the stored procedure from the live database rather than restoring from backup.
You can script each version of your stored procedure to a separate file as you create it, appending a date to the file name. You can script all existing stored procedures by looking at the answer to this question (see the sketch just after these options).
You can use a version control product. I'm not sure if I'm allowed to point you to one here, but just do a search on "SQL source control" and you will find a very good one in the search results.
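For the scripting option above, here is a minimal sketch of how you might pull every procedure's definition out of the current database (assuming SQL Server 2005 or later, where the sys catalog views exist); you would then save each row's definition to its own dated file:

    -- One row per stored procedure in the current database.
    SELECT
        s.name AS schema_name,
        p.name AS procedure_name,
        m.definition
    FROM sys.procedures AS p
    JOIN sys.schemas AS s
        ON s.schema_id = p.schema_id
    JOIN sys.sql_modules AS m
        ON m.object_id = p.object_id
    ORDER BY s.name, p.name;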
I'm trying to write a shell script that will let users back up an Informix IDS database before using it, and roll it back (restore it) if they need to.
I know I can use ontape and onbar, but I don't know whether they would work for every database regardless of size, and to be honest, I don't know whether it would be safe for users to run a script that takes the DBNAME as an argument to back up or restore.
Using ON-Tape (ontape), you can back up the whole server, but not a single database. Using ON-Bar (onbar), you can back up one or more storage spaces (dbspaces, blobspaces, etc.) or the whole server. Therefore, if you locate the database in a separate dbspace and ensure no other database uses that dbspace, you can use ON-Bar to achieve a database-level backup. In other words, you must design your storage layout up front to allow for database-level recovery and restore.
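For example (a sketch; the database and dbspace names are purely illustrative, and the dbspace itself would be created beforehand with onspaces), you would place the database in its own dbspace when you create it:

    -- Informix SQL: create the database in a dedicated dbspace so that
    -- ON-Bar can back up and restore that dbspace, and hence this
    -- database, on its own.
    CREATE DATABASE app1_db IN app1_dbs WITH LOG;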
Running backups requires administrative privileges, which you should not grant casually. You will therefore need to design a backup and restore system that limits people to backing up only the databases you intend them to be able to back up. I have some views on how this can be done, but the result is complex.
Amongst other places, read the "Comparison of the ON-Bar and ON-Tape utilities" section of the Backup and Restore Guide documentation.
I am writing a script (a Rails runner, if it matters) that will run periodically. It uses a gem to query a SQL database. Because that database does not update existing rows, but merely adds new ones to reflect changes to data, the script will run a query that only finds objects with an id greater than the id that was in the database the last time the script was run.
In what file should that id be stored? Is it bad practice to store it in the script and have the script write over itself, and if so, why?
Store the ID in a separate file. Not only would the script writing over itself be more difficult to do correctly, but that practice would also be likely to confuse users, and could result in a whole host of other problems, such as additional friction when trying to version control the script or update it to a new version.
Under most circumstances, data and code should be separate.
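For what it's worth, once the last-seen ID lives in its own file, the incremental query described in the question stays trivial (a sketch; the table name is made up, and 42 stands in for the value read from the state file):

    -- Fetch only the rows added since the last run.
    SELECT *
    FROM change_log
    WHERE id > 42
    ORDER BY id;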
Can I edit or update the content of a stored procedure in Iron Speed? If I update it through SQL Server Management Studio and then rebuild my application in Iron Speed, will my updated stored procedure be deleted? Please help me with this; I badly need your ideas. Thank you.
As long as the stored proc name stays the same, this works. I've done it before with custom stored procedures: I edit the body of the proc in SQL Server Management Studio and then resync the database with ISD. Just keep the stored proc name the same.
I'm working on a project in which we have two versions of an MVC app, the live version and the dev version. I've made changes to the dev version and added tables and data, etc.
Is there any way to migrate these changes onto the live version without losing all data (i.e. without simply regenerating the database)?
I've already tried just rebuilding the database, but we lose all previously stored data (as, obviously, we are essentially deleting the old database and rebuilding it).
tl;dr
How do I migrate my dev version of an MVC app, along with any new tables, to the live version of the app, which is missing those models and tables?
Yes, it is possible to migrate your changes from your dev instance to your production instance; to do so you must create SQL scripts that apply the changes to your production database. You can write these scripts by hand or use tools to generate them for you. Either way, you will need scripts to update your database. (You could perform manual updates via your database's tooling, but that is undesirable: you want the updates to occur in a short time window, and you want them to happen reliably and repeatably.)
The easiest way I know of to do this is to use tools like SQL Compare (for schema updates) or SQL Data Compare (for data updates). These are from Redgate and cost a fair bit of money, but they are well worth their price, and most companies I've worked with are happy to pay for licenses; you may not want to shell out for them personally, though. To use these tools, you connect them to the source and destination databases, they analyze the differences between the two (schema or data), and they produce SQL scripts. Those scripts can then be run manually or by the tools themselves.
Ideally, when you work on your application, you should be producing these scripts as you go along. That way when it comes time to update your environments, you may simply run the scripts you have. It is worth taking the time to include this in your build process, so database scripts get included in your builds. You should write your scripts so they are idempotent, meaning that you can run them multiple times and the end result will be the same (the database updated to the desired schema and data).
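As an illustration, here is a sketch of an idempotent script (the table and column names are hypothetical); it checks before creating or altering anything, so running it twice leaves the database in the same state:

    -- Create the table only if it does not exist yet.
    IF OBJECT_ID(N'dbo.Orders', N'U') IS NULL
    BEGIN
        CREATE TABLE dbo.Orders (
            OrderID int NOT NULL PRIMARY KEY,
            CreatedOn datetime NOT NULL
        );
    END

    -- Add the new column only if it is not already there.
    IF COL_LENGTH(N'dbo.Orders', N'ShippedOn') IS NULL
    BEGIN
        ALTER TABLE dbo.Orders ADD ShippedOn datetime NULL;
    END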
One way of managing this is to create a DBVersions table in your database. This table records every update script that has been run. For example, you could have a table like the following (this is SQL Server 2008 dialect):
CREATE TABLE [dbo].[DBVersions] (
    -- Case (issue) number of the change the script belongs to.
    [CaseID] [int] NOT NULL,
    -- When the script was run against this database.
    [DateExecutedOn] [datetime] NOT NULL,
    CONSTRAINT [PK_DBVersions] PRIMARY KEY CLUSTERED (
        [CaseID] ASC
    )
) ON [PRIMARY]
CaseID refers to the case (or issue) number of the feature or bug that requires the SQL update. Your build process can check this table to see if a script has already been run; if not, it runs it. This is useful if you cannot write your scripts in a way that allows them to be run more than once. If all your scripts can be run an unbounded number of times, the table is not strictly necessary, though it can still be useful to avoid rerunning a large number of scripts on every deployment.
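For example, the guard around each script could look something like this (a sketch; case number 1234 is hypothetical):

    -- Run the update for case 1234 only if it has not been run before.
    IF NOT EXISTS (SELECT 1 FROM dbo.DBVersions WHERE CaseID = 1234)
    BEGIN
        -- ... the actual schema/data changes for case 1234 go here ...

        INSERT INTO dbo.DBVersions (CaseID, DateExecutedOn)
        VALUES (1234, GETDATE());
    END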
Here are links to the Redgate tools. There may be many other tools out there, but I've had a very good experience with these.
http://www.red-gate.com/products/sql-development/sql-compare/
http://www.red-gate.com/products/sql-development/sql-data-compare/
It depends on your deployment strategy, and this is more a workflow that your team needs to embrace. Regenerating the live database from scratch can take a while, depending on how big the database is, and I don't see a need for it in most scenarios.
You only need to separate database schema object scripts from data row scripts. The live database version should have its schema objects scripted out and stored in a repository. When developers work on new functionality, they make their changes against the database scripts in the repository, and if they need to change database rows, they check data row scripts into the repository as well. On each daily deployment, the live database can be compared against what is checked into the repository and updated to bring it in sync.
On our side we use tools such as RedGate Schema Compare and Data Compare to do the database migration from the dev version to our intended target version.
I have been put in charge of looking at setting up a build server for our office. We currently put all queries into stored procedures in SQL 2000 server. This is done manually and no SQL files are produced or put into SVN.
What I am after is a good way of dealing with having a build server that can get all the stored procs from a DB.
I am guessing this might not be possible or practical, and I am pretty sure it is not best practice. I realize one solution could be to start creating SQL script files and putting them into SVN so they can be picked up and dealt with.
You have answered your own question. Get these things into source control before you start digging yourself further into a hole you really don't want to be in.
Once done, an approach we have used successfully is to keep an initial snapshot set of scripts, plus version-numbered script folders for subsequent changes, with the overall database version number stored in a database table created specifically for that purpose. We then wrote a utility that assembles all the update scripts newer than the stored version number, runs them, and updates the version number. This was integrated with our build script, which an automated build ran against the dev DB. Schedules and so on are of course up to you.
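As a sketch of that pattern (the names are illustrative, and our actual utility was external code rather than pure SQL):

    -- Single-row table holding the overall database version number.
    CREATE TABLE dbo.DatabaseVersion (
        VersionNumber int NOT NULL
    );
    INSERT INTO dbo.DatabaseVersion (VersionNumber) VALUES (0);

    -- The utility reads the stored version number...
    SELECT VersionNumber FROM dbo.DatabaseVersion;

    -- ...runs every numbered script folder above it in order,
    -- then records the last folder applied:
    UPDATE dbo.DatabaseVersion SET VersionNumber = 42;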
Would strongly advise you to make all DB scripts safely repeatable as well.