I have a production SQL Azure database. Is there a way to run Migrations against it remotely, e.g. via PowerShell? I don't have access to my development machine at the moment, and by inspecting the response in the browser I can see that the error below is returned:
There is already an object named 'AspNetRoles' in the database.
Normally, I would do:
Add-Migration Initial -IgnoreChanges then Update-Database
Note: I want to either disable the migrations or run them remotely against the already-deployed database, whichever will be faster. I'd appreciate any help on this.
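In other words, I would like to run something like the following against the production database from wherever I happen to be, instead of from my dev machine (a rough sketch, assuming EF6 since -IgnoreChanges is an EF6 switch; the server, database and credentials below are placeholders):

# Baseline the existing schema so the deployed tables are not re-created
Add-Migration Initial -IgnoreChanges
# Point Update-Database directly at the remote SQL Azure database (placeholder connection string)
Update-Database -ConnectionString "Server=tcp:myserver.database.windows.net,1433;Database=mydb;User ID=myuser;Password=mypassword;Encrypt=True;" -ConnectionProviderName "System.Data.SqlClient"

Alternatively, Update-Database -Script would generate the SQL so it could be run through any client connected to the database.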
I have created a simple ASP.NET Core MVC application using EF Core and SQL Server. On the Windows development machine it is using LocalDB. I am trying to deploy to Azure App Service (Linux), and I have created an Azure SQL database.
Deploying from Visual Studio 2019, I have set the database as a dependency. In the publish profile settings I have selected the Azure SQL connection string for the database context I am using. I have also checked the EF Migrations option, and on deployment the script successfully created the tables for the application. I can connect to Azure SQL and see the tables. However, when I run the deployed application and try a database operation I get:
PlatformNotSupportedException: LocalDB is not supported on this platform
I can see from the docs various ways to set the connection string, but I would like to know what the publish wizard in Visual Studio 2019 is trying to do and why it is not working. I'm also unclear on where the password is stored: in the publish profile the password seems to be in the connection string as plain text, which is not good. I'd like to know how to get this right for production.
Update: I have fixed this for the moment by following the steps in the Linux tutorial, using the Azure CLI and running the following command:
az webapp config connection-string set --resource-group [myResourceGroup] --name [app name] --settings MyDbConnection='[connection-string]' --connection-string-type SQLServer
I am not sure of the security of this approach and plan to investigate further.
The publish wizard simply handles the database creation/migration for you. It doesn't modify your project, because that's (1) not its purpose, and (2) it can't make the configuration decision for you (i.e. whether to use appsettings, environment variables, etc.).
You need to provide the connection string in production via configuration, just as in development. Since you're deploying to an Azure App Service, the most logical place for that is the App Settings of the app in Azure. These are exposed to the application as environment variables. Simply use the same key you're using in development and set the production database connection string as its value.
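For example, one way to do that with the Azure CLI is an app setting whose key matches the connection string key the app already reads; the resource group, app name, key and connection string below are placeholders. App Service exposes app settings to the process as environment variables, and the default ASP.NET Core configuration maps ConnectionStrings__DefaultConnection onto ConnectionStrings:DefaultConnection:

# placeholder names/values; the double underscore maps to the ':' separator in configuration keys
az webapp config appsettings set --resource-group MyResourceGroup --name my-app --settings ConnectionStrings__DefaultConnection='Server=tcp:myserver.database.windows.net,1433;Database=mydb;User ID=myuser;Password=mypassword;Encrypt=True;'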
I use Code First and the app works well against the local database it generated.
But when I deploy to Azure, although the deployment succeeds, the tables are not created; I just get an empty database.
I excluded the local App_Data folder and chose to run Code First Migrations in the deployment options.
Any tips on what's wrong?
Have you configured your Azure deployment to replace the connection strings (via the publishing wizard), or are you using environment variables in your code? It doesn't sound like it. It sounds like you deployed with LocalDB, which does not work in Azure.
You need to do one of the following (there are more options, but these are easy to implement):
Configure your deployment process to update your web.config with your SQL Azure connection string (you can use config transformations or the deployment wizard)
Use Azure environment variables (App Service settings) that are picked up automatically when running in Azure, with your local values used when running locally (a sketch follows below)
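As a sketch of the second option (the resource group, app name and connection string below are placeholders): a connection string configured on the Web App is exposed to the site as an environment variable and, for .NET apps, is injected over the web.config entry with the same name at runtime, so the deployed site uses SQL Azure while your local build keeps using LocalDB:

# placeholder names/values; the setting name must match the connection string name in web.config
az webapp config connection-string set --resource-group MyResourceGroup --name my-app --connection-string-type SQLAzure --settings DefaultConnection='Server=tcp:myserver.database.windows.net,1433;Database=mydb;User ID=myuser;Password=mypassword;Encrypt=True;'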
My dry run attempt to restore a set of TFS databases failed because the admin console reports that Tfs_Configuration already exists and should be dropped before the restore:
TF400990: Database Tfs_Configuration exists on SQL instance
However, the admin console has an open connection to Tfs_Configuration, which prevents me from dropping the database. If I close the admin console, perform the drop, and try to re-launch the admin console, it fails because it cannot find the Tfs_Configuration database. So I'm stuck.
I can work around this by manually restoring the databases or by using the tfsrestore.exe tool (but that won't include the logs -- just the full backup).
So how is restore supposed to work from the admin console? My scenario is totally vanilla. Install TFS, back up, quiesce, restore... it seems like that should work.
Duh. I had to uninstall the application tier first.
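For anyone else who hits this, the step was roughly the following (a sketch, assuming a TFS 2010-era install in the default location; the path and exact switches vary by version). Once the databases are restored, you run the configuration wizard again with an application-tier-only configuration:

# assumes a default TFS 2010 install path; removes only the application tier, not the databases
cd "C:\Program Files\Microsoft Team Foundation Server 2010\Tools"
.\TFSConfig.exe setup /uninstall:ApplicationTier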
I've used Taps before on Heroku, but what is a good solution for non-Heroku Rails apps?
You could create a Capistrano task (or tasks) to run mysqldump against the source database, gzip the dump, then scp it to the destination and import it there with mysql.
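The pipeline such a task would wrap looks roughly like this (hosts, users, passwords and database names below are placeholders):

# dump and compress on the source host
ssh deploy@source.example.com "mysqldump -u appuser -p'secret' app_production | gzip -c" > app_production.sql.gz
# copy the dump to the destination host
scp app_production.sql.gz deploy@dest.example.com:/tmp/
# import it into the destination database
ssh deploy@dest.example.com "gunzip -c /tmp/app_production.sql.gz | mysql -u appuser -p'secret' app_production"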
I wrote a Capistrano recipe some time ago to sync a MySQL database and files between different environments: https://gist.github.com/111597
OK, there are some things you need to keep in mind. If you are using SQLite for development and MySQL/Postgres for production on the server, then syncing is almost impossible. On the other hand, if you are using the same DB engine in both places, you can use an administration tool like MySQL Administrator on your desktop to generate a backup file and upload it to the server, and vice versa.
Many hosting providers also offer phpMyAdmin to take backups and restore them on the server.
I am using a PostgreSQL database for my Heroku application.
I have a very large database on Amazon AWS, since Heroku was not providing the PostgreSQL database.
Now my client wants to switch from Heroku to EngineYard.
Can I use the same database (without taking a backup and then reloading it) for my application on EngineYard?
If yes, how can I use the existing Amazon AWS database with the new EngineYard application, and what are the steps?
You can, but only if you are using a dedicated database. From the Heroku database FAQ:
Shared Database
No, connecting to your database from machines outside of Heroku is not supported. We recommend that you encapsulate data access in an API to manipulate it.
Dedicated Database
It's possible to connect to our dedicated databases using our pg:ingress feature. Please see using the PG console for more information.
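In practice, for a dedicated database that looks roughly like this (the app name and database name below are placeholders; pg:ingress temporarily opens the database to direct connections from outside Heroku):

# placeholder app and database names; access is only opened for a limited window
heroku pg:ingress HEROKU_POSTGRESQL_RED --app my-app
psql "$(heroku config:get DATABASE_URL --app my-app)"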
The database connection string is available in the DATABASE_URL config. You can run
$ heroku config --long
to view it. However, it probably won't work if you use a shared database, because connections appear to be restricted to the Heroku network.
Surely this is just a case of getting the correct connection credentials for the DB regardless of where it's hosted?
For instance, if the DB is on Heroku, then ENV['DATABASE_URL'] gives you everything you need. All these details then go in your database.yml as normal (assuming you're using ActiveRecord).
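For reference, every field database.yml wants is already encoded in that URL (the value below is just a placeholder):

# anatomy of a DATABASE_URL (placeholder value)
#   postgres://myuser:mypassword@db.example.com:5432/myapp_production
#              ^user  ^password  ^host          ^port ^database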
For the record, Heroku do provide Postgres and it's part of their core business.