I am trying to import data from SQL Server to Power BI.
The data are generated by views, and the views are then called by stored procedures in order to get the results. How can I call the stored procedures from Power BI in order to populate the report?
When creating a data source from SQL Server, expand the Advanced options and write a SQL Statement to execute your stored procedure such as:
EXEC [dbo].[usp_NameOfYourStoredProcedure]
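If the procedure takes parameters, you can pass literal values in the same statement. A minimal sketch, where the parameter names (@StartDate, @Region) are only placeholders for illustration:
-- Hypothetical parameters; replace with whatever your procedure actually expects
EXEC [dbo].[usp_NameOfYourStoredProcedure] @StartDate = '2024-01-01', @Region = 'EMEA';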
I am trying to connect to my organisation's SQL database using Power Query to create some reports. I need to delete/edit some tables and join multiple tables to come up with the desired report output...
I don't want the changes or edits I make in Excel Power Query to be reflected in the live database, only in Excel.
The short answer is no, any button you press in the Power Query Editor interface does not modify the source database. I must admit that I have not found any page in the Microsoft Docs on Power Query that states this clearly. The page What is Power Query? states that:
Power Query is a data transformation and data preparation engine. Power Query comes with a graphical interface for getting data from sources and a Power Query Editor for applying transformations.
Other pages contain similarly general and vague descriptions but let me reassure you that any data transformation you carry out by using the Power Query Editor interface will not modify your SQL database. All you see in Power Query is a view of the source database.
Seeing as you are connecting to a SQL database, it is likely that query folding is activated. This means that when you remove a column (or row), this will update the SQL query used to extract the data from the database. That query is written as a single SELECT statement that can contain multiple clauses like GROUP BY and WHERE. Transformations that add data (e.g. Add Custom Column, Fill Down) are not included in the query; they are carried out only within the Power Query engine. You can read more about this in the docs.
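As an illustration of folding: if you remove a column and filter rows in the Power Query Editor, that work is pushed to the server as one SELECT statement, roughly like the sketch below (dbo.Orders and its columns are made-up names, not something from your database):
-- Rough shape of a folded query; table and column names are purely illustrative
SELECT [OrderID], [CustomerID], [OrderDate]
FROM [dbo].[Orders]
WHERE [OrderDate] >= '2024-01-01';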
How to edit a database with Power Query when native SQL queries are supported
That being said, you can actually edit a database from within Power Query if the database supports the use of native SQL queries, if you have write permission for the database, and if you edit and run one of the two M functions that let you write native SQL queries. Here is an example using the Sql.Database function:
Sql.Database("servername", "dbname", [Query = "DROP TABLE tablename"])
And here is an example using the Value.NativeQuery function:
let
    Source = Sql.Databases("servername"){[Name="dbname"]}[Data],
    #"Native Query" = Value.NativeQuery(Source, "DROP TABLE tablename")
in #"Native Query"
Unless you have changed the default Query Options, these functions should raise a warning message requiring you to permit running the query before it is executed.
This prevents you from modifying the database without confirmation, so any database modification cannot happen just by accident.
I verified this using Excel Microsoft 365 (Version 2108) on Windows 10 64-bit connected to a local SQL Server 2019 (15.x) database.
I have a requirement to extract SQL queries from a Snowflake stored procedure. I have decided to extract the SQL queries via the Snowflake-JDBC API.
I have analyzed the Java documentation of the Snowflake-JDBC API but unfortunately could not find any method to extract SQL queries from a stored procedure. I found a class named QueryExecDTO in the Snowflake-JDBC API, which has a getSqlText() method, but it is of no use for my purpose (I have to extract SQL from a stored procedure). I am also aware of the Snowflake JavaScript API's Statement object, which has a getSqlText() method to get the text of SQL queries, but it can only be used inside JavaScript as it is part of the JavaScript API.
Is there any way to extract SQL from a stored procedure using the Snowflake-JDBC API?
You would need to run something like:
select get_ddl('procedure', '*proc_name*(*arg list*)');
That gets you the text of the SP; you would then need to parse that text to extract the SQL statements.
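For example, the argument list has to match the procedure's signature by data type, not by argument name. A sketch with hypothetical names:
-- Hypothetical database, schema, procedure and signature; adjust to your own
select get_ddl('procedure', 'my_db.my_schema.load_sales(varchar, number)');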
If you just want to extract the SQL statements, that should be relatively straightforward; however, if you want to parse the statements to, for example, list the tables being used, then you are going to struggle.
Parsing SQL is incredibly complex (given how flexible the language is) which is illustrated by the fact that there are very few general SQL parsers available - and those that actually work are not cheap.
I have a situation where I need to get data from an Excel sheet and a SQL table, then store it in a single SQL Server table.
I have already completed the steps below:
1. In the control flow, I used an Execute SQL Task to fetch data from SQL and assign it to a variable.
2. Then I added a Data Flow Task to the control flow.
3. In the data flow, I added an Excel source and an OLE DB destination.
When I try to edit the OLE DB destination, I can only see the Excel sheet columns.
How do I get the variables into the OLE DB destination as well? Or is there a better approach?
You will need to add a merge join control inside of a data flow task.
Probably easier to change your execute sql task to be a second data source within your data flow task and then join on the key within a merge join control.
Be warned the merge join requires the data to be sorted. If the data is sorted on input, you can tell SSIS that by setting the input / output properties of the advanced editor. Otherwise, you can add a sort task after each data source.
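For instance, if the join key were a hypothetical CustomerID column, the SQL command of the second OLE DB source could already return the rows sorted, and you would then set IsSorted = True and SortKeyPosition = 1 on that output in the advanced editor instead of adding a Sort transformation:
-- Hypothetical table and key; the ORDER BY satisfies the merge join's sorted-input requirement
SELECT CustomerID, CustomerName, Region
FROM dbo.Customers
ORDER BY CustomerID;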
Suppose I have an application which fetches a custom XML packet from the server which represents a dataset. Then, suppose I wish to execute a SQL statement on that data via a dataset. What can I use to do this? I don't need to know the code necessarily, but just what to use to make this possible and a general explanation of how.
For example, I may fetch a list of customers in XML format from the server. Then, I can use any third-party parser to dump that XML data into some client dataset. Then, execute a query on that dataset, for example select * from customers where ZipCode = '12345' without fetching this data from the server again.
XML is not the only limitation, that's just an example. I might want to do the same to some application settings loaded from an INI file. Either way, the concept is that the original source of the data is unknown.
Whether the dataset stores its temporary data in memory or on disk doesn't matter, but it would be excellent if it could keep it on disk.
TXQuery (http://code.google.com/p/txquery/) is a component that provides a local SQL engine for executing SQL queries against one or more TDataSets. The only issue I have had with it is updating data via a TDBGrid of a query joining multiple tables (TDataSets), specifically which table is being updated.
AnyDac v6 (now FireDac) also has a local SQL engine. http://www.da-soft.com/anydac/docu/frames.html?frmname=topic&frmfile=Local_SQL.html
Edit: For the example SQL in your question, because it only involves a single table, you can do this with just a Filter on the dataset. For example:
ADataSet.Filtered := False;
ADataSet.Filter := 'ZipCode=' + QuotedStr('12345');
ADataSet.Filtered := True;
Such a feature can be done using a local database. You just insert the TDataSet result into a local in-memory (or file-based) stand-alone database, then you can use regular SQL queries on it, including JOIN.
You can for instance use SQLite3, or the free edition of NexusDB.
NexusDB embedded has the benefit of being a native Delphi database, so it sticks to the DB.pas TDataSet paradigm.
Another option is to use the so-called Virtual Table mechanism of SQLite3, which allows you to expose any data (even from TDataSet, XML, JSON or in-memory objects) to the SQLite3 engine, just like regular tables. Then you can run SQL statements on those "virtual" tables, including JOINs. With this approach, you do not need to INSERT the data into regular tables; the data remains in its original form. Of course, you will miss some performance features like indexes, which should be handled on the virtual table provider side. We use this feature as the database core of our mORMot ORM/SOA framework, and it is pretty powerful.
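Whichever local engine you pick, once the data is visible to it (either copied into local tables or exposed as virtual tables), you can run ordinary SQL against it, including joins across sets that came from completely different sources. The table and column names below are purely illustrative:
-- Illustrative only: "customers" might have come from XML, "settings" from an INI file
SELECT c.CustomerName, s.SettingValue AS DefaultDiscount
FROM customers c
JOIN settings s ON s.SettingName = 'DefaultDiscount'
WHERE c.ZipCode = '12345';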
The general process that you want to perform is complicated by the difference in data representation. SQL data is stored in tables made up of distinguishable records. XML is a structured representation of data, but in tree form rather than table/row form.
Each of these data forms may be qualified by a schema that provides a context for the data.
You have two general paths that you can follow:
Take the XML and, based on the schema, insert it into a set of interlinked tables, then perform the SQL query. If you have the schema, you can use code generators to make a parser, and then, based on the parse tree, you can insert into a local database with tables constructed on the fly. You can set up MySQL fairly easily from https://dev.mysql.com/doc/refman/5.7/en/installing.html and then, in your version of Delphi, make a connection to the database, fill it in first, then query it (see the SQL sketch after the second path below). This would satisfy your desire to have the data stored on disk. Unless you purge the tables when done, the data remains available in the local machine's database.
This seems like more work than:
Use XPath or XQuery and work directly on the XML. For this, a package like Saxon in your favorite environment, or expat in Python, would work nicely.
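For the first path, here is a rough SQL sketch of the load-then-query approach against a local database; the table layout is hypothetical and would normally be derived from the XML schema:
-- Hypothetical local staging table derived from the XML schema
CREATE TABLE customers (
    CustomerID INT PRIMARY KEY,
    CustomerName VARCHAR(100),
    ZipCode VARCHAR(10)
);

-- Rows parsed from the XML packet are inserted locally...
INSERT INTO customers (CustomerID, CustomerName, ZipCode)
VALUES (1, 'Acme Corp', '12345');

-- ...and then queried without another round-trip to the original server
SELECT * FROM customers WHERE ZipCode = '12345';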
Let me know if either of these paths seems as if it may be fruitful.
I'm using EF 4.1 (Code First). I need to add/update products in a database based on data from an Excel file. As discussed here, one way to achieve this is to use dbContext.Products.ToList() to force loading all products from the database, then use db.Products.Local.FirstOrDefault(...) to check whether a product from Excel exists in the database, and proceed accordingly with an insert or update. This is only one round-trip.
Now, my problem is that there are too many products in the database, so it's not possible to load all of them into memory. What's the way to achieve this without multiplying round-trips to the database? My understanding is that if I just do a search with db.Products.FirstOrDefault(...) for each Excel product to process, this will perform a round-trip each time, even if I issue the statement for the exact same product several times! What's the purpose of EF caching objects and returning the cached value if it goes to the database anyway?
There is actually no way to make this better. EF is not a good solution for this kind of task. You must know whether the product already exists in the database to use the correct operation, so you always need to do an additional query. You can group multiple products into a single query using .Contains (like SQL IN), but that only solves the existence check. The worse problem is that each INSERT or UPDATE is executed in a separate round-trip as well, and there is no way to solve this because EF doesn't support command batching.
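For reference, a grouped existence check with .Contains typically reaches the server as a single IN query, roughly like the sketch below (table and column names are only illustrative):
-- Rough shape of the query generated for a grouped .Contains check; names are illustrative
SELECT ProductCode, Name, Price
FROM Products
WHERE ProductCode IN ('A-100', 'A-101', 'A-102');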
Create a stored procedure and pass the product information to it. The stored procedure will perform an insert or update based on whether the record already exists in the database.
You can even use more advanced features like table-valued parameters to pass multiple records from Excel to the procedure in a single call, or import the Excel data into a temporary table (for example with SSIS) and process it all directly on SQL Server. Lastly, you can use bulk insert to get all records into a special import table and again process them with a single stored procedure call.
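A minimal T-SQL sketch of the table-valued-parameter approach; the type, procedure, and column names are assumptions for illustration, not anything that already exists in your database:
-- Hypothetical user-defined table type matching the Excel rows
CREATE TYPE dbo.ProductImportType AS TABLE
(
    ProductCode NVARCHAR(50) NOT NULL,
    Name        NVARCHAR(200),
    Price       DECIMAL(18, 2)
);
GO

-- Hypothetical procedure that inserts new products and updates existing ones in one call
CREATE PROCEDURE dbo.UpsertProducts
    @Products dbo.ProductImportType READONLY
AS
BEGIN
    SET NOCOUNT ON;

    MERGE dbo.Products AS target
    USING @Products AS source
        ON target.ProductCode = source.ProductCode
    WHEN MATCHED THEN
        UPDATE SET target.Name = source.Name,
                   target.Price = source.Price
    WHEN NOT MATCHED THEN
        INSERT (ProductCode, Name, Price)
        VALUES (source.ProductCode, source.Name, source.Price);
END;

From the application you would then pass the Excel rows to @Products in a single call, instead of one round-trip per product.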