Exporting view definitions using dbatools from a Synapse serverless database

I am able to export them, but there are multiple schemas and I want to export only the views of a specific schema.
Below is the dbatools script I am using. How can I add a filter to the statement below to get only a specific schema's views?
Get-DbaDbView -SqlInstance $server -ExcludeSystemView | Export-DbaScript -FilePath C:\Scripts\Views.sql -Append
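One option is to filter on the Schema property of the objects Get-DbaDbView returns before piping them to Export-DbaScript. A minimal sketch, assuming the returned SMO view objects expose a Schema property (they do in dbatools) and using a hypothetical schema name 'sales':
# keep only views from the 'sales' schema, then script them out
Get-DbaDbView -SqlInstance $server -ExcludeSystemView |
    Where-Object { $_.Schema -eq 'sales' } |
    Export-DbaScript -FilePath C:\Scripts\Views.sql -Append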


Can we Scaffold DbContext from selected tables of an existing database?

As in previous versions of Entity Framework, is it possible in Entity Framework Core to reverse engineer only selected tables of an existing database and create model classes from them? The official ASP.NET site reverse engineers the entire database. In the past, as shown in this ASP.NET tutorial, old EF let you reverse engineer only the selected tables/views if you chose to.
You can solve the problem by using the dotnet ef dbcontext scaffold command with multiple -t (--table) parameters. It lets you specify all the tables that need to be imported (scaffolded). The feature was first described here.
It is possible to specify the exact tables in a schema to use when scaffolding a database and to omit the rest. The command-line examples that follow show the parameters needed for filtering tables.
.NET Core CLI:
dotnet ef dbcontext scaffold \
    "server=localhost;port=3306;user=root;password=mypass;database=sakila" \
    MySql.Data.EntityFrameworkCore -o sakila \
    -t actor -t film -t film_actor -t language -f
Package Manager Console in Visual Studio:
Scaffold-DbContext "server=localhost;port=3306;user=root;password=mypass;database=sakila"
MySql.Data.EntityFrameworkCore -OutputDir Sakila
-Tables actor,film,film_actor,language -f
The -f (force) flag will update the existing selected models/files in the output directory.
Scaffold-DbContext "Server=(localdb)\v11.0;Database=MyDB;Trusted_Connection=True;" Microsoft.EntityFrameworkCore.SqlServer -OutputDir Models -t User, Role -f
EF Core, MS SQL PM:
Scaffold-DbContext "server=PC\SQL2012;user=test;password=test123;database=student" Microsoft.EntityFrameworkCore.SqlServer -OutputDir student -Tables stu.names,stu.grades -f
For more reference, visit entityframework-core-scaffold
Package Manager Console (MySql)
Scaffold-DbContext "server=localhost;port=3306;user=root;password=yourpassword;database=sakila" MySql.EntityFrameworkCore -OutputDir Models -Tables actor,film,film_actor,language -f
Package Manager Console (MSSQL)
Scaffold-DbContext "Server=desktop-vd5sscb;Initial Catalog=databaseName;Integrated Security=True" Microsoft.EntityFrameworkCore.SqlServer -OutputDir Models -f
Package Manager Console (Sqlite)
Scaffold-DbContext "data source = yourdbname" Microsoft.EntityFrameworkCore.Sqlite -OutputDir Models -f
For Sqlite, the default db directory is your project folder... where the controller folders are located.
If you have n number of tables, your database design should group those tables into suitable schemas at design time.
For example: for a database "Company" you can have many tables; when you design the database, group these tables into schemas like Users, ProductA, ProductB, ProductC, etc.
Then, assuming you are working on the ProductA tables only, you can simply add the -Schemas flag and scaffold only the tables in ProductA, as shown below.
Another example: suppose you are working on an authorisation-based project and you want to implement identity auth with EF; then you can simply scaffold the "Users" schema instead of the product schemas and make your OAuth APIs work.
Scaffold-DbContext ... -Schemas Users
These are just a few use cases where you can use scaffolding effectively.
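For illustration, a fuller version of that command might look like the following; the connection string, database name, and output folder are placeholders, and -Schemas limits scaffolding to the listed schema(s):
Scaffold-DbContext "Server=.;Database=Company;Trusted_Connection=True;" Microsoft.EntityFrameworkCore.SqlServer -OutputDir Models -Schemas Users -f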
The parameter -Tables table1, table2, table3 works for me for multiple tables.
The -o Model parameter specifies the output folder into which the model is generated.
The -force parameter regenerates the model each time it is run, for example after a database update.
The -Context DbE parameter renames the database context class.
Package Manager Console:
Scaffold-DbContext name=ConnectionStrings:DbE Microsoft.EntityFrameworkCore.SqlServer -o Model -force -Tables T_Users_Of_Chat -Context DbE

How to use EF code-first migrations to generate scripts from PowerShell

I need to email our DBA when a deployment that uses EF6 code-based migrations goes out. I am able to use the migrate.exe tool with the verbose flag, through PowerShell, to get the scripts, but each command is truncated after 10222 characters. This usually only affects the model hash for the migration history. Does anyone know of a way to generate the full SQL script for EF6 migrations through PowerShell?
I figured out that migrate.exe wraps the ToolingFacade class, so I created the object, passing in the required variables and setting the verbose delegate. The nice thing about this is that I could call the ScriptUpdate function instead if I just wanted the SQL scripts.
# Load EF6 so the migration tooling types are available
[Reflection.Assembly]::LoadFrom("EntityFramework.dll") | Out-Null
$con = New-Object -TypeName System.Data.Entity.Infrastructure.DbConnectionInfo -ArgumentList @("constring", "System.Data.SqlClient")
# ToolingFacade is the class that migrate.exe wraps
$tools = New-Object -TypeName System.Data.Entity.Migrations.Design.ToolingFacade -ArgumentList @("dbcondllname", "dbcondllname", $null, "workingdir", $null, $null, $con)
# The verbose delegate receives each SQL statement as it runs
$tools.LogVerboseDelegate = {param($sql)
    Write-Verbose $sql -Verbose # dumps the SQL to the RM log
}
# Apply pending migrations: target $null = latest, force = $false
$tools.Update($null, $false)
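If you only want the script, a minimal sketch reusing the same $tools object: ToolingFacade.ScriptUpdate(sourceMigration, targetMigration, force) returns the full migration SQL as a string, so nothing is truncated by console logging (the output path here is a placeholder):
# Generate the SQL without applying it; $null..$null means from current to latest
$sql = $tools.ScriptUpdate($null, $null, $false)
$sql | Out-File -FilePath "C:\Scripts\migration.sql" -Encoding UTF8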

How to dump an entire neo4j database

Hi guys, how do I dump an entire database? I'm using the neo4j shell tools to export all the data. Here is my query:
export-cypher -r -b 10 -o /dump.cypher MATCH (n)<-[r]->(p) return n,r,p
but a few relationships are not being created.
You should look at the APOC Procedures plugin for Neo4j. It has export procedures which should accomplish exactly what you need.
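For example, a minimal sketch with APOC's Cypher export (the file name is a placeholder; writing to files may also require apoc.export.file.enabled=true in the Neo4j configuration):
CALL apoc.export.cypher.all("dump.cypher", {format: "cypher-shell"})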

How do you delimit multiple Neo4j Cypher queries in a script file?

I have programmatically generated a bunch of Cypher queries to populate a Neo4j database. I wanted to use the drag-and-drop feature of the database access page at port 7474 to load the statements. I can execute the individual statements just fine, but the statements in aggregate (delimited with ';') produce a syntax error.
You can use the neo4j-shell (Neo4jShell.bat) to run multiple statements separated by ;.
The shell lives in the bin directory of your Neo4j server, but is also available under localhost:7474/webadmin/#/console/.
By default it connects to a running server, but you can also specify a database directory:
bin/neo4j-shell -path test.db [-config conf/neo4j.properties] [-file import.cql]
And you can pass along a file to be read and executed (e.g. for import).
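For illustration, a hypothetical import.cql where each statement is terminated by a semicolon:
// create two nodes, then connect them; the trailing ; ends each statement
CREATE (a:Person {name: 'Alice'});
CREATE (b:Person {name: 'Bob'});
MATCH (a:Person {name: 'Alice'}), (b:Person {name: 'Bob'})
CREATE (a)-[:KNOWS]->(b);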
On Unix Systems you can also pipe to the shell:
cat import.cql | bin/neo4j-shell -path test.db
See Rik's blog for more fun with the shell; there is also http://www.neo4j.org/develop/shell

How to write stored procedures to separate files with mysqldump?

The mysqldump option --tab=path writes the creation script of each table to a separate file, but I can't find the stored procedures anywhere except in the screen dump.
I need the stored procedures in separate files as well.
The current solution I am working on is to split the screen dump programmatically. Is there an easier way?
The code I am using so far is:
#save all routines to a single file
mysqldump -p$PASSWORD --routines --skip-dump-date --no-create-info --no-data --skip-opt $DATABASE > $BACKUP_PATH/$DATABASE.sql
#save each table to its file
mysqldump -p$PASSWORD --tab=$BACKUP_PATH --skip-dump-date --no-data --skip-opt $DATABASE
Even if I add --routines to the second command, they will not get their own files.
I created a script that outputs each routine to a separate file.
https://gist.github.com/temmings/c6599ff6a04738185596
Example: mysqldump ${DATABASE} --routines --no-create-info --no-data --no-create-db --compact | ./seperate.pl
The files are output to the out/ directory.
$ tree
.
└── out
├── FUNCTION.EXAMPLE_FUNCTION.sql
└── PROCEDURE.EXAMPLE_PROCEDURE.sql
The mysqldump command does not support dumping stored procedures into individual files, but it is possible to do it using the mysql command:
mysql --skip-column-names --raw mydatabase -e "SELECT CONCAT('CREATE PROCEDURE `', specific_name, '`(', param_list, ') AS ') AS `stmt`, body_utf8 FROM `mysql`.`proc` WHERE `db` = 'mydatabase' AND specific_name = 'myprocedure';" 1> myprocedure.sql
For a more complete example using Windows Batch, see my answer to another question:
MySQL - mysqldump --routines to only export 1 stored procedure (by name) and not every routine
I think the answer is: it is not possible without post-processing.
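As a rough sketch of that post-processing (the exact layout of the routines dump varies by mysqldump version, so treat the awk pattern as a starting point, not a definitive solution): each routine in the dump is preceded by a DELIMITER ;; line, which can be used to start a new file.
# split the single routines dump into one numbered file per routine
mkdir -p out
mysqldump -p$PASSWORD --routines --no-create-info --no-data --skip-opt $DATABASE \
  | awk '/^DELIMITER ;;/ {f = sprintf("out/routine_%02d.sql", ++n)} f {print > f}'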
For what it's worth, this writes table definitions (not stored procedures):
mysqldump -u<username> -p<password> -T<destination-directory> --lock-tables=0 <database>
One snag I ran into: make sure the destination directory has enough permissions. I just did chmod 777 on it.
A note on this: MySQL writes the table structures to .sql files and the data to .txt files. I wish it would just do it normally.
