BI Publisher: Add all columns to tabular report

My company is using BI Publisher for some data dumps and I know BI Publisher isn't really designed for that, but this is what I have to use.
I have two files with over 100 fields each. Is there a way to add every field to the report or do I have to add each field individually?

If you have a data model export, you can use the BI Publisher designer in Word to add a table and select multiple fields for the output. The wizard will generate the XDO code for you.

Use E-Text output. It is designed for EFT (Electronic Funds Transfer) in the Payments module, but you can use it to export a CSV (comma-separated values) file or a fixed-width file, both of which can be opened in Excel. The E-Text output is fully documented in the BI Publisher user guides. Some of the advanced stuff is a bit harder to accomplish, but it is not terribly difficult, and a simple CSV file should be quick and easy to create. You will still need to list every field; there's no "give me everything" command.
It's actually to your benefit to use E-Text for larger data extracts: when you use the Word RTF tables, the output Excel files are VERY large due to the way BI Publisher formats the cells.

Related

SAS Stored Procedure User Enters List

Is it possible to create a Stored Procedure in SAS Enterprise Guide that lets the user supply a list of values without having to type them in one at a time?
I use more Base SAS than EG, so I'm not an expert on Stored Procedures. Currently, an analyst in my area may have to search for a list of values like so:
012345678
123456789
231456789
091236574
439857345
120129038
230918239
....
....
N
and is using a Stored Procedure that was built to accept these values. However, this is not efficient, as the list can be >40 values and SAS will only allow you to enter them one at a time.
I've been messing around with the prompt manager for an hour or so and haven't had any luck. I've also tried 'User selects from a static list', using an Excel doc that I imported. That worked great ad hoc, but because the values will always be different, I can't figure out how to make EG first import the Excel doc, then bring up the prompt for her to select all the (new) values, and then run the rest of the program.
Also, it seems that I would have to change the 'Static Value List' in the prompt manager every time the doc was imported, even if the rest of the program were conditioned on the import of the Excel doc. I'm going to keep playing around with this, but I'm looking for ideas in case anyone has done this before.
Sounds like you want "select multiple values from a dynamic list". I suggest you read the Excel file that holds all the response options into a SAS dataset, then register that dataset in the SAS metadata server. When you create a dynamic prompt, you point it to the source SAS dataset that holds the response options. After creating the prompt, you can update the dataset any time you want (add/delete records), and the STP user will then see those updated response options in the prompt.
It may also be possible to register an Excel file in metadata instead of reading it into a SAS dataset. But I always try to limit Excel usage as much as possible.

Can I import multiple Excel ranges into SPSS?

I've got an absolutely horrible dataset in Excel (13,500 variables, a lot of them irrelevant to my purposes). I need to analyse it in SPSS, as I have a lot of data transformations to do... but SPSS 24 struggles with a dataset that size. Well... either SPSS struggles, or my work PC does.
Is there a way, when importing data from Excel, to import multiple ranges? Specifically, I want Column A (my unique identifier), then several other ranges (e.g. G:AC, DD:JJ, etc.).
/CELLRANGE=RANGE only seems to allow a single range.
If you have the appropriate ODBC driver for Excel, you can use the database reading facility in Statistics to select the fields. However, the catch with ODBC and Office is that the bitness has to match. If both are 32-bit, that's easy, but 64-bit Statistics would need 64-bit Office, and that's a world of hurt.
The short answer is no, you can't specify multiple ranges. What you ought to do is import the entire relevant range and then use SPSS commands to manipulate it as you desire, for example ADD FILES FILE with the DROP subcommand to delete irrelevant columns/variables.
I don't know of a way to directly import multiple ranges from Excel. You might try importing the file to an MS Access database first, then read the table from the database to SPSS using (in SPSS) "open database" --> "new query". The query editor will let you specify the fields you want to import.
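In the same spirit as the Access detour, a purely illustrative alternative is to trim the workbook down outside SPSS first, for example with Python and pandas, and then import the slimmed file. This is only a sketch and assumes pandas and openpyxl are installed; the file names and column letters are placeholders for your own workbook.
import pandas as pd

# usecols accepts Excel-style column letters and ranges, so we can keep
# just the unique identifier plus the ranges we actually need.
df = pd.read_excel("survey.xlsx", usecols="A,G:AC,DD:JJ")

# Write a plain CSV that SPSS can then read in one pass
# (GET DATA /TYPE=TXT, or File > Import Data).
df.to_csv("survey_trimmed.csv", index=False)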

How do I use Power BI Desktop with version control?

Greetings beloved comrades,
I am building a series of Power BI dashboards, and as they go into production I'd like to put them into TFS. However, due to the large datasets involved, some of these report files are quite large (1.6 GB).
It doesn't seem like a good idea to force TFS to store all of the actual data, when only the definition really matters.
Is there a simple way to remove the data from a .pbix file or save only the definition?
Edit: It looks like Microsoft has rendered this question obsolete with the introduction of Power BI templates. See the April Update for Power BI.
Nevertheless, the workaround in the answer could be used for other purposes.
I would add a "Parameters" Query (a table with a single row, created using Edit Queries / Edit Data) with a column called [Data Load], whose single row contains "Yes".
Then I would add a Filter step to the end of all the other Queries, referring to that "Parameters" Query. The Filter syntax would be:
Parameters{0}[Data Load] = "Yes"
That syntax is a bit obscure - it means:
Go to the Parameters Query, get the value from the 1st row of the [Data Load] column, and test whether it equals "Yes".
When you want to empty all the data from the .pbix file, edit the Source step in the "Parameters" Query and change the [Data Load] value to "No", Apply and Refresh.
I've built a working example of this which you can download from my OneDrive and try out:
http://1drv.ms/1AzPAZp
It's the file: Power BI Demo - Dynamically filter all data.pbix
Convert the .pbix files to .pbit files using the "Save As..." option, and then version those .pbit files in TFS using Visual Studio, with the files controlled on the server.
Another interesting approach: when you commit a .pbix, it is uploaded to Premium, the JSON metadata for the model is extracted, and that metadata is committed back to DevOps next to the .pbix. That way you can see a diff of the model metadata over time, including Power Query changes, measure changes, etc.
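On a related note, if you go the .pbit route from the answer above, you can pull a diffable copy of the model metadata out of the template yourself. This is only a rough Python sketch: it assumes the .pbit is a ZIP package containing a UTF-16 encoded "DataModelSchema" member (true of recent Power BI Desktop builds, but not a documented contract), and the file names are placeholders.
import json
import zipfile

def extract_model_schema(pbit_path, out_path):
    # A .pbit is a ZIP package; the model schema is stored as a JSON member.
    with zipfile.ZipFile(pbit_path) as z:
        raw = z.read("DataModelSchema")
    # The member is usually UTF-16; fall back to UTF-8 just in case.
    text = None
    for enc in ("utf-16", "utf-16-le", "utf-8-sig"):
        try:
            text = raw.decode(enc).lstrip("\ufeff")
            break
        except UnicodeDecodeError:
            continue
    schema = json.loads(text)
    # Re-serialize with stable key order so diffs in TFS/DevOps stay readable.
    with open(out_path, "w", encoding="utf-8") as f:
        json.dump(schema, f, indent=2, sort_keys=True)

extract_model_schema("Sales.pbit", "Sales.model.json")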

Syntax to extract variable labels from SPSS file

I have hundreds of SPSS .sav files. For each one I want to extract the variable NAMES and variable LABELS as a two-column table in a CSV file. I know I can do this by simply copying and pasting from the "Variable View" window, but I would really like to know how to do it using syntax. Is this possible?
Many thanks in advance for any help!
You might be interested in the GATHERMD extension command. It takes a wildcard for the file names and builds a dataset with three variables: the file name, the variable name, and the variable label. You could then just save that as a CSV file.
This command requires the Python Essentials available with your Statistics installation or via the SPSS Community website (www.ibm.com/developerworks/spssdevcentral).
Using native Statistics syntax, DISPLAY DICTIONARY and CODEBOOK might be helpful, but they won't give you all this information in one table.
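If you would rather not install the extension command, the same Python Essentials let you loop over the files yourself from a syntax window. This is only a rough sketch: the folder and output paths are placeholders, and on older releases the block starts with BEGIN PROGRAM. and runs Python 2 instead.
* Loop over every .sav file in a folder and write name/label pairs to one CSV.
BEGIN PROGRAM PYTHON3.
import csv, glob, spss

rows = [["file", "variable", "label"]]
for path in sorted(glob.glob(r"C:/sav_files/*.sav")):
    spss.Submit('GET FILE="%s".' % path)   # open the next data file
    for i in range(spss.GetVariableCount()):
        rows.append([path, spss.GetVariableName(i), spss.GetVariableLabel(i)])

with open(r"C:/sav_files/variable_labels.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)
END PROGRAM.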

Creating a data visualization site with Rails

I have a very large Excel spreadsheet that consists of a user name, a location, a date, and some numeric fields, for example:
User,location,date,value1,value2,value3
Steve,NYC,2012,9,1,3
Steve,NYC,2011,3,3,2
Steve,CA,2011,1,2,0
Michael,CA,2012,10,3,2
Michael,CA,2011,10,2,0
How would I go about organizing a rails site such that one can view all the values for a certain user?
For example,
/users/steve/all
would display all the values in descending order of date where user=steve.
/users/steve/nyc
would display all the values in descending order of date where user=steve and location=nyc.
I think I would need to create a users model and import all the data from the excel into the database, but I'm lost about how to do that.
The application, in essence, would be a simple data visualizer. Maybe I have to split this into separate models, with User has_many :locations and Location belongs_to :user, but I'm not sure. I want the data to be viewed in all sorts of ways: maybe I want to display all the users from a certain location, or view all the locations of a certain user, etc...
I suggest setting up your models within your Rails application first. Then you can just write a rake task, probably similar to the one in this question, or you can build it from scratch. There's also a RailsCast.
If you need to import directly from Excel (e.g. the Excel sheets are uploaded by a user), you can use one of the following gems to import data from an Excel sheet:
roo - reads the new and old Excel formats (+ others)
spreadsheet - reads and writes the old Excel format
If you only have this one Excel sheet, it will be far easier to simply export the data to CSV and follow the answers given in the Stack Overflow question mentioned above.
As for the second part of your question, on how to model your database, you already had the right idea. The easiest approach is to fully model what your domain looks like and transform the data accordingly.
