Sourcing & Filtering - The field type must match the type of the field you are sourcing from (Text Area)

I created a custom record and I am trying to create a field (List/Record: Address Book) sourced from all the addresses of a Vendor field.
Here is the Vendor field.
Here is the Address Book field.
And here are the different tests and the results I got.
First configuration test
Result : "The field type must match the type of the field you are sourcing from (Text Area)."
Second configuration test
Result : "The list or record type of the field or source (Vendor) must match the list or record type of the field you are filtering by (Entity)."

The error is saying that "Address" is technically a text-area representation of the address, not an actual reference to the Address Book. If you make your "Delivery address" field of type Text Area and use configuration 1, then it will work.


How to list the types of an InfluxDB measurement's fields?

I have an InfluxDB database with a measurement named http_reqs. This measurement has several fields, which I can 'list' as follows (I see 11 fields):
SELECT * FROM http_reqs LIMIT 1;
Here the first line of the output is as follows:
time error error_code method name proto scenario status tls_version type url value
So I assume there are 11 fields in that 'measurement' http_reqs:
time
error
error_code
method
name
proto
scenario
status
tls_version
type
url
value
I want to know the 'type' of these fields.
For example:
Is the field status a string? Or is it an integer? A float? A boolean?
I hope my question is a bit clearer now.
I found this documentation and I can run
SHOW FIELD KEYS FROM http_reqs;
but it seems to list only 2 fields! The output is:
name: http_reqs
fieldKey fieldType
-------- ---------
url string
value float
Yes, that is exactly what I want for all of the fields! I can see that the field url has type string, and that the field value is of type float.
But 9 of the fields listed above seem to be missing. I see only two fields (url and value). I do not see the type of the field status, for example, and I want to know it.
I also can do the following query:
SHOW TAG KEYS FROM http_reqs
which gives this output:
name: http_reqs
tagKey
------
error
error_code
method
name
proto
scenario
status
tls_version
type
Interestingly, this query lists all 9 of the 'missing' fields (or tags, or keys, or whatever these things are). But this output does not tell me what type the element status is, for example. I want to know the type of the element status.
How can I see the types of each of the 11 elements of the 'measurement' http_reqs?
What you are calling fields is very likely InfluxDB fields plus InfluxDB tags in your case.
Inspect InfluxDB fields:
SHOW FIELD KEYS FROM http_reqs
Inspect InfluxDB tags:
SHOW TAG KEYS FROM http_reqs
Note that tag values are always stored as strings in InfluxDB, which is why SHOW TAG KEYS has no type column.
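To see the distinction in one place, here is a minimal Python sketch that merges the two listings into a single name-to-type table. The data below is transcribed from the question's query output; since InfluxDB tag values are always strings, every tag simply gets type "string":

```python
# Field keys and their types, as returned by SHOW FIELD KEYS:
field_keys = {"url": "string", "value": "float"}

# Tag keys, as returned by SHOW TAG KEYS (tags are always strings):
tag_keys = ["error", "error_code", "method", "name", "proto",
            "scenario", "status", "tls_version", "type"]

def element_types(fields, tags):
    """Merge fields (name -> type) and tags (always 'string') into one dict."""
    types = dict(fields)
    types.update({tag: "string" for tag in tags})
    return types

types = element_types(field_keys, tag_keys)
print(types["status"])  # string
print(types["value"])   # float
```

This also explains the counts in the question: 2 field keys plus 9 tag keys gives the 11 non-time columns seen in the SELECT output.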

Teradata: Get a specific part of a large variable text field

My first post (be kind).
PROBLEM: I need to extract the view name from a text field that contains a full SQL statement, so I can link the field to a different data source. There are two text strings that always exist on both sides of the target view name. I was hoping to use these as identifying "anchors", along with a substring, to bring in the view name text from between them.
EXAMPLE:
from v_mktg_dm.**VIEWNAME** as lead_sql
(UPPER CASE/BOLD is what I want to extract)
I tried using
SELECT
SUBSTR(SQL_FIELD,INSTR(SQL_FIELD,'FROM V_MKTG_TRM_DM.',19),20) AS PARSED_FIELD
FROM DATABASE.SQL_STORAGE_DATA
But I am not getting good results.
Any help is appreciated.
You can apply a Regular Expression:
RegExp_Substr_gpl(SQL_FIELD, '(v_mktg_dm\.)(.*?)( AS lead_sql)',1,1,'i',2)
This looks for the string between 'v_mktg_dm.' and ' AS lead_sql'.
RegExp_Substr_gpl is an undocumented variation of RegExp_Substr which simplifies the syntax for ignoring parts of the match; the final argument 2 selects the second capture group.
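The same extraction can be prototyped outside Teradata. Here is a minimal Python sketch of the equivalent regular expression (the SQL text is an illustrative sample; group 2 holds the view name):

```python
import re

# Illustrative SQL statement of the kind stored in SQL_FIELD.
sql_field = "select * from v_mktg_dm.VIEWNAME as lead_sql where 1=1"

# Capture what sits between 'v_mktg_dm.' and ' as lead_sql',
# matching case-insensitively; group 2 is the view name.
match = re.search(r"(v_mktg_dm\.)(.*?)( as lead_sql)", sql_field, re.IGNORECASE)
view_name = match.group(2) if match else None
print(view_name)  # VIEWNAME
```

The non-greedy `.*?` is important: it stops at the first ' as lead_sql' rather than swallowing the rest of the statement.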

Zoho Creator - Retrieving the actual text value of a Form's Lookup Dropdown field

I'm following the Zoholics videos on Deluge and used this statement to lookup the pizza price from the Inventory:
colPizzaDetails = Inventory[Pizza == input.Pizza && Size == input.Pizza_Size];
This error is displayed when trying to save the script:
In Criteria left expression is of type STRING and right expression is of type BIGINT and the operator == is not valid
In the video, the narrator was able to save the script. Looks like a BIGINT is stored in the Lookup Dropdown now instead of text as shown in the video.
Can you tell me how to alter the code statement so I can change input.Pizza to represent the actual Pizza Name text that the user sees on the screen?
You should convert the input result to a number using the toNumber() function, like this:
colPizzaDetails = Inventory[Pizza == input.Pizza && Size == input.Pizza_Size.toNumber()];
It seems the Inventory form has the Pizza field as a string field.
And in the current form, it seems the Pizza field is a lookup from the Pizza form.
Whenever you access a lookup field, you get the record ID in return. You can check this by logging it with the alert or info command, for example alert or info(input.Pizza) in your current form.
So you first need to fetch the pizza name from the Pizza form and then compare it in your query.
Let's say the pizza name field in the Pizza form is Pizza_Name;
this is how your code should look:
// fetch pizza_name : make sure you replace Pizza_Name
// with the Deluge name of the pizza name field in the Pizza form.
pizza_name = Pizza[ID == input.Pizza].Pizza_Name;
// compare pizza_name with the inventory pizza and
// fetch the inventory data.
colPizzaDetails = Inventory[Pizza == pizza_name && Size == input.Pizza_Size];
and this should work fine.
OR
you can also try
colPizzaDetails = Inventory[Pizza == input.Pizza.Pizza_Name && Size == input.Pizza_Size];
but that can give you this error in some cases:
Pizza is a lookup field and child fields cannot be accessed
so the first approach will work in all cases and is the preferred approach from a Zoho standards point of view.
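The two-step logic (resolve the lookup's record ID to a name, then filter the inventory by that name) can be sketched with plain Python dicts standing in for the Zoho forms. All data and names here are illustrative, not actual Zoho structures:

```python
# Hypothetical stand-ins for the Pizza and Inventory forms.
pizza_form = {101: "Margherita", 102: "Pepperoni"}  # record ID -> Pizza_Name
inventory = [
    {"Pizza": "Margherita", "Size": 12, "Price": 9.50},
    {"Pizza": "Pepperoni",  "Size": 12, "Price": 11.00},
]

def lookup_inventory(input_pizza_id, input_size):
    # Step 1: the lookup field holds a record ID, so resolve it to the name.
    pizza_name = pizza_form[input_pizza_id]
    # Step 2: compare the resolved name against the inventory rows.
    return [row for row in inventory
            if row["Pizza"] == pizza_name and row["Size"] == input_size]

rows = lookup_inventory(102, 12)
print(rows[0]["Price"])  # 11.0
```

Comparing the raw ID (102) directly against the name column would never match, which is exactly the type mismatch the original error reports.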

How to transform to Entity Attribute Value (EAV) using Spoon Normalise

I am trying to use Spoon (Pentaho Data Integration) to change data that is in typical row format to Entity Attribute Value format.
My source data is as follows:
My Normaliser is setup as follows:
And here are the results:
Why is the value for the CONDITION_START_DATE and CONDITION_STOP_DATE in the string_value column instead of the date_value column?
According to this documentation:
Fieldname: Name of the fields to normalize.
Type: Give a string to classify the field.
New field: You can give one or more fields where the new value should be transferred to.
Please check the "Normalizing multiple rows in a single step" section in http://wiki.pentaho.com/display/EAI/Row+Normaliser. According to this, you should have a group of fields with the same Type (pr_sl -> Product1, pr1_nr -> Product1); only in this case can you get multiple fields in the output (pr_sl -> Product Sales, pr1_nr -> Product Number).
In your case you can convert the dates to strings, then use the Row Normaliser with a single new field, and then use a formula, for example:
And then convert date_value back to a date.
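To make the target shape concrete, here is a minimal Python sketch of the row-to-EAV transform, including the idea of routing each value to a typed column (string_value or date_value). The row and field names are illustrative, not taken from the question's screenshots:

```python
from datetime import date

# One source row in typical wide format (illustrative field names).
row = {
    "PERSON_ID": 1,
    "CONDITION_START_DATE": date(2020, 1, 5),
    "CONDITION_STOP_DATE": date(2020, 2, 1),
    "CONDITION_NAME": "Flu",
}

def to_eav(row, key="PERSON_ID"):
    """Normalise a wide row into EAV triples, sending each value to a
    typed column (date_value for dates, string_value for everything else)."""
    out = []
    for attr, value in row.items():
        if attr == key:
            continue  # the entity key is repeated on every triple, not normalised
        triple = {key: row[key], "attribute": attr,
                  "string_value": None, "date_value": None}
        if isinstance(value, date):
            triple["date_value"] = value
        else:
            triple["string_value"] = str(value)
        out.append(triple)
    return out

eav = to_eav(row)
print(eav[0]["attribute"], eav[0]["date_value"])
```

A single Row Normaliser "New field" cannot split by type on its own, which is why the answer suggests normalising into one string column first and converting date_value afterwards.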

Mapping Values in Qlikview

In QlikView, I have an Excel sheet that I use to map USERNAME to a TEAM value. But every time I refresh the dashboard, new USERNAME values come up, and since they are not in the Excel sheet, these USERNAME values show up as their own values in the TEAM column. How would I make it so that any USERNAME that is not in the Excel sheet shows up as 'Unidentified' (or another value) in the TEAM column instead of showing up as its own separate value?
First of all, when posting a question here, if possible always include the source code so everybody has a clearer picture of your problem. Just saying.
On the topic ...
Use a mapping load in this case, supplying the third parameter. For example:
TeamMapping:
Mapping
Load
UserName,
Team
From
[User_to_Team_Mapping.xlsx] (ooxml, embedded labels, table is [Sheet1])
;
Transactions:
Load
Id,
Amount,
ApplyMap( 'TeamMapping', User, 'Unidentified') as Team
From
Transactions.qvd (qvd)
;
The third parameter of ApplyMap is the default value returned when the lookup value is not found in the mapping table (TeamMapping).
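ApplyMap's default-value behaviour is the same idea as a dictionary lookup with a fallback. A minimal Python analogy (the mapping data is illustrative):

```python
# Mapping table, as it would be loaded from the Excel sheet.
team_mapping = {"alice": "Sales", "bob": "Support"}

def apply_map(mapping, key, default):
    """Mimic QlikView's ApplyMap: return the mapped value, or the
    default when the key is missing from the mapping table."""
    return mapping.get(key, default)

print(apply_map(team_mapping, "alice", "Unidentified"))  # Sales
print(apply_map(team_mapping, "carol", "Unidentified"))  # Unidentified
```

Without the third parameter, ApplyMap returns the unmapped value itself, which is exactly why new usernames were appearing as their own TEAM values.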
