In order to join one of my tables to other tables, I had to use a Calculated Field to make the field format the same as in the other tables. However, when I go to join the tables, the new field created by the Calculated Field does not appear.
I know that I can export the modified data and import it again to solve the problem, but I'd like to know whether there is a simpler way to do this in Tableau.
Unfortunately, you cannot join on a calculated field in the data model. If your data is not too large, you can combine the tables in Tableau using data blending instead. Alternatively, create a view (or views) containing the calculated field in the source SQL database and join on that.
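As a minimal sketch of the view approach, assuming a hypothetical orders table whose order_id is stored as a number while the other tables store it as text; the view simply exposes the converted key so the join can be done on it directly:

create view orders_for_join as
select o.*,
       cast(o.order_id as varchar(20)) as order_id_text  -- match the text format used by the other tables
from orders o;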
I'm trying to use two DataTables, described below, to meet my requirement.
dtExcelData: This DataTable holds the data uploaded from the Excel file.
dtDbData: This DataTable holds data from the database.
The requirement is that I should validate dtExcelData before inserting it into the database. The dtExcelData DataTable has 39 columns with the headings column1, column2, ... column39, and the number of rows can range up to 400 (or even a little more).
I have to do validation as follows:
column6 and column22 together are considered the primary key. If a record with the same combination is already present in the database, it should NOT be inserted; it can simply be ignored. All other records should be inserted into the database.
After some analysis, I understood that the LINQ Except method can be used to meet this requirement.
I've tried a number of approaches, but have been unable to arrive at a proper solution.
I am looking for an approach like this:
dtExcelData.Except(dtDbData)
Can someone suggest a better approach?
I have a DirectQuery table (Weather) which is sourced from an Azure SQL server. I would like to join this with an Imported table (Buckles) from an Excel sheet sourced from SharePoint Online.
Both tables have a UID field that is made up of a concatenation between a SiteID and timestamp. The UID field is named differently for each table.
I have created a One-To-Many relationship between the two tables.
I have tried to create a new DAX table using a NATURALINNERJOIN on Weather and Buckles but I get this error:
"No common join columns detected. The join function 'NATURALINNERJOIN' requires at-least one common join column."
I am confident it is not a problem with the underlying data because I've created a new imported Excel table (Test) with a selection of the data from Weather and I'm able to successfully create the join on Test and Buckles.
Is the joining of DirectQuery and Imported tables supported? I feel like this may be a type casting issue, but as far as I can see, both UID fields are set as Text.
The UID field is named differently for each table.
I suspect this may be the issue. NATURALINNERJOIN looks for matching column names, and if the two tables have no common column names, an error is returned.
Note that if you create a calculated DAX table using a DirectQuery source, I don't think that table will still act like DirectQuery. If I understand correctly, it will be materialized into your model, and DAX that references that calculated table no longer points back to the SQL server (and consequently only updates when the calculated table is rebuilt).
I want to create only certain columns in my "dbgrid" at run-time and bind them to fields from another table (or the same field). How do you do that?
Illustration: I have 3 tables:
Student(IdStudent, NameStudent, ...),
Module(idModul, NameModule, ...),
Notes(idNote, idStudent, idModul, Note).
I want to show all Notes in one DBGrid, with the column names of the DBGrid being the names from the Module table. I have no idea how to do this.
Thanks.
You cannot do this directly with a DBGrid: a DBGrid has only one datasource, and a datasource has only one dataset. If you are using an SQL-compliant database, you should look into a join and/or crosstab to return a single dataset (I think this is what MartynA is talking about), for example the join sketched below. Alternatively, create a clientdataset at run-time and build it with the columns/data you want if you need a data-aware control, or use a stringgrid, listview, or treeview and build the whole thing by hand.
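As a rough sketch of the join approach, using the table and column names from the question (a crosstab/pivot would then turn the module names into grid columns):

select s.NameStudent,
       m.NameModule,
       n.Note
from Notes n
join Student s on s.IdStudent = n.idStudent
join Module m on m.idModul = n.idModul
order by s.NameStudent, m.NameModule;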
I'm trying to design my first data mart with a star schema from an Excel sheet containing information about Help Desk service calls. The sheet contains 33 fields with various kinds of information, and I can't identify the fact table because I want to do the reporting later based on different KPIs.
I want to know how to identify the fact table measures easily, and I have another question: can a fact table contain only foreign keys of dimensions and no measures? Thanks in advance, and sorry for my bad English.
You can have more than one fact table.
A fact table represents an event or process that you want to analyze.
The structure of the fact tables depends on the process or event that you are trying to analyze.
You need to tell us the events or processes that you want to analyze before we can help you further.
Can a fact table contain only foreign keys of dimensions and no measures?
Yes. This is called a factless fact table.
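For instance, a hypothetical coverage table recording which agent was on duty on which day could hold nothing but dimension keys; counting its rows gives the measure:

create table fact_agent_on_duty (
    date_key  int not null,
    agent_key int not null
);

select agent_key, count(*) as days_on_duty
from fact_agent_on_duty
group by agent_key;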
Let's say you want to do a basic analysis of calls:
Your full table might look like this:
CALL_ID
START_DATE
DURATION
AGENT_NAME
AGENT_TENURE (how long worked for company)
CUSTOMER_NAME
CUSTOMER_TENURE (how long a customer)
PRODUCT_NAME (the product the customer is calling about)
RESOLVED
You would turn this into a fact table like this:
CALL_ID
START_DATE_KEY
AGENT_KEY
CUSTOMER_KEY
PRODUCT_KEY
DURATION (measure)
RESOLVED (quasi-measure)
And you would have a DATE dimension table, AGENT dimension table, CUSTOMER dimension table and PRODUCT dimension table.
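A rough DDL sketch of that fact table; the names, types, and dimension key columns are illustrative only:

create table fact_call (
    call_id        int primary key,
    start_date_key int not null references dim_date (date_key),
    agent_key      int not null references dim_agent (agent_key),
    customer_key   int not null references dim_customer (customer_key),
    product_key    int not null references dim_product (product_key),
    duration       int,      -- measure, e.g. call length in seconds
    resolved       smallint  -- quasi-measure, 0 or 1
);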
Agile Data Warehouse Design is a good book, as are the ones by Kimball.
In general, the way I've done it (and there are a number of ways to do anything) is that categorical data is referenced with a foreign key in the fact table, but anything you want to perform aggregations on (typically money/integer/double columns) can live in the fact table as well. So, for example, a fact table might contain a hierarchy of types, such as product_category >> product_name, and it usually contains a time and/or location field as well; all of these would be referenced by a foreign key to a lookup table. The measure columns are usually integer- or money-based and are used in aggregate functions grouped by the other fields, like this:
select product_category,
       sum(measureOne) as measure_one_total
from facttable
where timeCol between X and Y
group by product_category
At one time a few years ago, I did have a fact table with no measure column, because the only measure I had was a count, which I computed dynamically by grouping on different dimensions in the fact table.
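For example, with a count-only measure the query is just a count grouped by whichever dimensions you need (table and column names are illustrative):

select p.product_name,
       count(*) as call_count
from fact_call f
join dim_product p on p.product_key = f.product_key
group by p.product_name;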
I have a pFIBDataSet (which works similarly to a BDE dataset) in which I need to make the following join selection:
select table.Name as name,
       table_1.Name as name_1,
       table_2.Name as name_2
from table
left join table table_1 on table.id = table_1.id
left join table table_2 on table.id = table_2.id
The fields name, name_1, and name_2 are linked to some data-aware edits. After modifying them (update, delete, insert operations), I want the name, name_1, and name_2 fields to be written back to the underlying tables. Based on the wiki Using_Multiple_Update_Objects_Index, I can use UpdateObjects or the OnUpdateRecord event.
The problem is that I don't understand how this needs to be implemented. I have the join select in the query; how do I define and work with the name_1 and name_2 fields? Can someone provide an example of this?
I know how to accomplish this using subqueries; I need to see how I can do it using UpdateObjects or OnUpdateRecord.
TpFIBUpdateObject works like a trigger on the client side. To make it work, set the following properties:
DataSet - dataset (master) to monitor
KindUpdate - Insert/Update/Delete - action to monitor
SQL - the command to execute when the action fires; params are taken from the DataSet
ExecuteOrder - AfterDefault/BeforeDefault - you probably need AfterDefault, on the master dataset
BUT, instead of using a lot of UpdateObject components and such a tangled approach, I recommend two alternative (read: better) ways:
Updatable view. It will work like a "virtual table". Create a view that joins these three tables and write BEFORE INSERT/UPDATE/DELETE triggers on it. In Delphi, use it like a regular table: select from the view, insert into it, update it, and delete from it. I suppose you need these tables linked together in many places anyway; a sketch follows below.
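A minimal sketch of the updatable-view approach, assuming for illustration three underlying tables T, T1, and T2 that each have an id and a name column (in an isql script, wrap the trigger in SET TERM):

create view v_names (id, name, name_1, name_2) as
select t.id, t.name, t1.name, t2.name
from T t
left join T1 t1 on t1.id = t.id
left join T2 t2 on t2.id = t.id;

create trigger v_names_bu for v_names
active before update as
begin
  -- push each edited column back to its own table
  update T  set name = new.name   where id = old.id;
  update T1 set name = new.name_1 where id = old.id;
  update T2 set name = new.name_2 where id = old.id;
end

Similar BEFORE INSERT and BEFORE DELETE triggers make the view fully updatable.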
Use EXECUTE BLOCK statements in your TpFIBDataSet SQLs, so that each Insert/Update/Delete updates all the tables in one batch, as sketched below.
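A minimal sketch of the EXECUTE BLOCK variant for the update case, using the same illustrative tables T, T1, and T2; the parameters are filled from the dataset's fields:

execute block (rec_id integer = :ID,
               p_name varchar(50) = :NAME,
               p_name_1 varchar(50) = :NAME_1,
               p_name_2 varchar(50) = :NAME_2)
as
begin
  -- update all three tables in one round trip
  update T  set name = :p_name   where id = :rec_id;
  update T1 set name = :p_name_1 where id = :rec_id;
  update T2 set name = :p_name_2 where id = :rec_id;
end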
Solution: in OnUpdateRecord, an update object must be created for each field coming from a joined table:
UpdateObjectVariable.DataSet := DataSet;
fill in the SQL text;
apply it.
After all update objects have been applied, UpdateAction := uaApplied; must be set.