How to integrate Power BI in a Delphi desktop application

Has anyone integrated Microsoft Power BI into a Delphi application? I believe that I will need to embed a web page into a form, and I am OK with that; however, I can't see how you force a refresh or feed Power BI the run-time selection criteria.
It will be linked to a standard SQL Server database (not cloud based at the moment). I have the graph I want working on Power BI desktop.

I'm integrating it in WPF C# application. It's pretty much the same as in Delphi, but easier due to availability of ADAL library for C#.
If you want to display a report (or tile, or dashboard) based on the current selection in your application, you must provide this information to the report. You can save the selection (or information about it, such as primary key values) to a table in the database and build the report on that table. Add a session column to it, generate a unique session ID value on every save, and then filter the report to show only the data for your session.
To filter the embedded report, define a filter and assign it to the filters property of the embed configuration object that you pass to the JavaScript Power BI Client, or call the report.setFilters method. In your case, an IBasicFilter is enough. Construct it like this:
const basicFilter: pbi.models.IBasicFilter = {
    $schema: "http://powerbi.com/product/schema#basic",
    target: {
        table: "ReportTempTableName",
        column: "SessionId"
    },
    operator: "In",
    values: [12345],
    filterType: 1 // pbi.models.FilterType.BasicFilter
};
replacing 12345 with the unique session ID value that you want to visualize.
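As a sketch of how this could be wired up at run time (the helper name is illustrative; the table and column names are the ones assumed above):

```javascript
// Illustrative helper: builds the basic filter for a given session ID.
// "ReportTempTableName" and "SessionId" are the assumed table/column names.
function buildSessionFilter(sessionId) {
    return {
        $schema: "http://powerbi.com/product/schema#basic",
        target: { table: "ReportTempTableName", column: "SessionId" },
        operator: "In",
        values: [sessionId],
        filterType: 1 // pbi.models.FilterType.BasicFilter
    };
}

// With an embedded report object from the Power BI JavaScript client you
// would then apply it like this (browser-only, so shown as a comment):
// report.setFilters([buildSessionFilter(12345)]).catch(console.error);
```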
To prevent the user from removing the applied filter and seeing the data for all sessions, you may hide the filter pane:
var embedConfig = {
    ...
    settings: {
        filterPaneEnabled: false
    }
};

Related

Allow "Create New Filter View" feature on Google Sheet with some columns protected

I am working on a large Google Sheet that has many editors from multiple companies, so I have had to give permissions on each tab on a per column basis. The issue is, because most editors only have editor access to 10-20% of the columns, the "Create New Filter View" feature is not working for those users. I am finding Filter View access to be very finicky in general. Sometimes it works, sometimes it doesn't on a sheet with similar protections. Other times "Create New Filter View" is not an available option, but if you select an existing filter view and click "Duplicate" you can create one. Seems like there are a few weird bugs with Google Sheets and permissions that I can't pin down.
Any ideas on how to use Apps Script to unlock the protections on the "Create New Filter View" option? This could be for all editors or all viewers, either would work in this instance. Thanks!
Issue:
While a FilterView can be created in a script if the Advanced Sheets Service is enabled, the same issue you're experiencing while creating it in the UI would show up here: a user cannot create a FilterView if one of the columns is protected from this user.
Naturally, this same user cannot run a script to unprotect the corresponding columns either - what worth would a protection be if the users to which it applies could remove it!
Possible workarounds:
A way around this would be to have a script that runs under the authority -see getEffectiveUser()- of a user who can edit those protected columns. In most situations, this is the user that is triggering the script -see getActiveUser()-, but in certain situations, like an installable trigger or a web app that executes as the user who deployed it, that is not the case.
For example, you could install an onEdit trigger with a user who has access to all the columns in the desired FilterView.
Then, this would fire whenever any user edits the spreadsheet, but it would fire under the authority of the user who installed the trigger (and can access the protected columns), so the FilterView could be created.
In order to create the FilterView only for certain edits (for example, when a specific cell is edited with a specific value), you could check these conditions at the beginning of the onEdit function. And if you needed to pass more information for customization of the filter view (for example, which columns and rows, or which sheet), you could put that information in other cells and retrieve the corresponding values via getValue()/getValues().
For example, it could be something like this:
function fireOnEdit(e) {
  if (e.range.getA1Notation() === "E1" && e.value === "Create FilterView") {
    const ss = e.source;
    const spreadsheetId = ss.getId();
    const sheet = ss.getSheets()[0];
    const resource = {
      requests: [{ // batchUpdate expects an array of requests
        addFilterView: {
          filter: {
            range: {
              sheetId: sheet.getSheetId(),
              startRowIndex: 0,
              endRowIndex: 4,
              startColumnIndex: 0,
              endColumnIndex: 3
            }
          }
        }
      }]
    };
    Sheets.Spreadsheets.batchUpdate(resource, spreadsheetId);
  }
}
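Since the Sheets service and spreadsheet objects only exist inside Apps Script, the request-building part can be factored into a plain function that is easy to check on its own (the function name is illustrative; the range values mirror the example above):

```javascript
// Builds the batchUpdate resource that adds a filter view over the given
// half-open row/column index ranges of the sheet with the given sheetId.
function buildAddFilterViewResource(sheetId, startRow, endRow, startCol, endCol) {
    return {
        requests: [{
            addFilterView: {
                filter: {
                    range: {
                        sheetId: sheetId,
                        startRowIndex: startRow,
                        endRowIndex: endRow,
                        startColumnIndex: startCol,
                        endColumnIndex: endCol
                    }
                }
            }
        }]
    };
}

// Inside fireOnEdit you could then write:
// Sheets.Spreadsheets.batchUpdate(
//     buildAddFilterViewResource(sheet.getSheetId(), 0, 4, 0, 3),
//     spreadsheetId);
```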

How to get via Organisation service for Microsoft Dynamics the OptionSet value and Formatted value in different languages?

I have a custom .NET application to query and managing data on a Microsoft Dynamics CRM instance.
This application is multilingual and the user can change via a language switch the language of the application.
For the connection and actions I'm using the OrganizationService and CRMServiceClient from Microsoft.Xrm.Sdk. This is combined with dependency injection to pass the connection to our different classes.
With Ninject, the binding looks like:
Bind().To().WithConstructorArgument("crmConnectionString", "the connection string");
Querying and updating the data in Dynamics works, but we are not able to retrieve the OptionSet values and formatted values in the language the visitor has selected in the custom app. These are always in the same language, even when we change the culture of the thread before we call Dynamics.
How can we pass the current language / culture to the OrganizationService so that it knows in which language it has to retrieve the fields?
Someone told me that this is based on the account used to connect to the CRM. If that's indeed the case, it means that if we have 5 languages, we need 5 connection strings and 5 OrganizationService instances. How should I handle this in a good way in that case?
Thanks for your answers
The solution I implemented was to use CallerId. Before returning the client, I fill the CallerId with a Guid. The Guid is from a user configured with a specific language in Dynamics. Based on the language, I take a different user.
I don't know whether you can pass a culture to the OrganizationService; I think having different connection strings would work if you want to go that route.
However, you can query the CRM to retrieve the localized labels for the option set you want, as described here.
To sum it up, it uses a RetrieveAttributeRequest, passing the entity logical name and field name, and loops through the result to get the labels.
var request = new RetrieveAttributeRequest
{
    EntityLogicalName = "incident",
    LogicalName = "casetypecode"
};
var response = organizationService.Execute(request) as RetrieveAttributeResponse;
var optionSetAttributeMetadata = response.AttributeMetadata as EnumAttributeMetadata;
foreach (var option in optionSetAttributeMetadata.OptionSet.Options)
{
    Console.WriteLine($"Localized labels for option {option.Value}:");
    foreach (var locLabel in option.Label.LocalizedLabels)
    {
        Console.WriteLine($"Language {locLabel.LanguageCode}: {locLabel.Label}");
    }
    Console.WriteLine($"Localized description for option {option.Value}:");
    foreach (var locLabel in option.Description.LocalizedLabels)
    {
        Console.WriteLine($"Language {locLabel.LanguageCode}: {locLabel.Label}");
    }
}
The code in the link also adds caching of already-retrieved values, so that you only query the CRM once per option set.

Adobe DTM Pass Unix Timestamp to eVar

I'd like to pass the Unix timestamp to a hit level eVar in DTM. I would assume I could pass some Javascript like this:
function() {
var now = new Date();
return now.getTime();
}
However, I am not sure where to pass it in DTM. Would this be passed in the "Customize Page Code" editor in the Tool Settings or somewhere else?
You can create a Data Element of type Custom Code. Name it something like current_timestamp or whatever. The code should not be wrapped in the function declaration syntax (DTM already wraps it in a function callback internally). So just put the following in the code box:
var now = new Date();
return now.getTime();
Then, in your Adobe Analytics Tool Config (for global variables), or within the Adobe Analytics Config section of a Page Load, Event Based, or Direct Call Rule, choose which eVar you want to set, and for the value put %current_timestamp% (or whatever you named it, wrapped in % at the start and end; you should see it show up in a dropdown as you start typing % in the value field).
Alternatively, if you want to assign the eVar in a custom code box in one of those locations, you can use the following JavaScript syntax (assuming eVar1 in this example):
s.eVar1 = _satellite.getVar('current_timestamp');
Note that with this syntax, you do not wrap the data element name with %.
One last note: this is client-side code, and getTime() returns milliseconds since the Unix epoch (UTC), so the value itself does not depend on the visitor's timezone settings; a visitor from the US and a visitor from China hitting the page at the same instant will record the same timestamp, provided their device clocks are accurate.
The value does, however, depend on each visitor's clock being set correctly, and if you later convert the timestamp into a local date string you reintroduce timezone differences. To keep reports comparable, convert it to one fixed timezone when formatting (for example UTC, or whatever timezone your office or server uses).
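If you want a report-friendly value pinned to a single timezone, the Data Element could return an ISO 8601 UTC string instead of raw epoch milliseconds (a sketch; DTM does not require this, and the helper name is illustrative):

```javascript
// Converts a Date to an ISO 8601 string in UTC, so the same instant is
// represented identically for every visitor regardless of local timezone.
function toUtcIso(date) {
    return date.toISOString();
}

// As a DTM Custom Code Data Element body you would just write:
// return new Date().toISOString();
```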

Where in the Admin site of EventStore I can view my saving events?

By the way, how do you create a STREAM?
I use AppendToStreamAsync directly. Is this right, or shall I create a stream first and then append onto it?
I also tried performing some tests; using the methods below I can write events onto EventStore but can't read events from it.
And the most important question is: how do I view my saved events in the Admin site of EventStore?
Here is the code:
public async Task AppendEventAsync(IEvent @event)
{
    try
    {
        var eventData = new EventData(@event.EventId,
            @event.GetType().AssemblyQualifiedName,
            true,
            Serializer.Serialize(@event),
            Encoding.UTF8.GetBytes("{}"));
        var writeResult = await connection.AppendToStreamAsync(
            @event.SourceId.ToString(),
            @event.AggregateVersion,
            eventData);
        Console.WriteLine(writeResult);
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex);
    }
}
public async Task<IEnumerable<IEvent>> ReadEventsAsync(Guid aggregateId)
{
    var ret = new List<IEvent>();
    StreamEventsSlice currentSlice;
    long nextSliceStart = StreamPosition.Start;
    do
    {
        currentSlice = await connection.ReadStreamEventsForwardAsync(aggregateId.ToString(), nextSliceStart, 200, false);
        if (currentSlice.Status != SliceReadStatus.Success)
        {
            throw new Exception($"Aggregate {aggregateId} not found");
        }
        nextSliceStart = currentSlice.NextEventNumber;
        foreach (var resolvedEvent in currentSlice.Events)
        {
            ret.Add(Serializer.Deserialize(resolvedEvent.Event.EventType, resolvedEvent.Event.Data));
        }
    } while (!currentSlice.IsEndOfStream);
    return ret;
}
Streams are created automatically as you write events. You should follow the recommended naming convention though as it enables a few features out of the box.
await Connection.AppendToStreamAsync("CustomerAggregate-b2c28cc1-2880-4924-b68f-d85cf24389ba", expectedVersion, creds, eventData);
It is recommended to name your streams "category-id" (where category, in our case, is the aggregate name), as we are using the DDD+CQRS pattern:
CustomerAggregate-b2c28cc1-2880-4924-b68f-d85cf24389ba
The stream matures as you write more events to the same stream name. The first event's ID becomes the "aggregateID" in our case, and each new event ID after that is unique. The only way to recreate our aggregate is to replay the events in sequence; if the sequence fails, an exception is thrown.
The reason to use this naming convention is that Event Store runs a few default internal projections for your convenience (the documentation about them is rather convoluted):
$by_category
$by_event_type
$stream_by_category
$streams
By Category
"By category" basically means there is a stream, created by an internal projection, for each category; for our CustomerAggregate we subscribe to $ce-CustomerAggregate events, and we will see all events of that category regardless of their IDs. The event data contains everything we need thereafter.
We use persistent subscribers (small C# console applications) which are setup to work with $ce-CustomerAggregate. Persistent subscribers are great because they remember the last event your client acknowledged. So if the application crashes, you start it and it starts from the last place that application finished.
This is where Event Store starts to shine and stands out from other "event store" implementations.
Viewing your events
The example with persistent subscribers is one way to set things up using code.
You cannot really view "all" your data in the admin site. The purpose of the admin site is to manage projections, manage users, see some statistics, and get a recent view of streams and events only. (If you know the IDs you can construct the URLs as you need them, but you can't search for them.)
If you want to see ALL the data, use the RESTful API with something like Postman. Maybe there is third-party software that can present the data in a grid-like viewer, but I am unaware of any; it would probably also just hook into the REST API, and you could create your own visualiser this way quite quickly.
Again, back to code: you can also always read all events from 0 using one of the client libraries; incidentally, with DDD+CQRS you always read the aggregate's stream from 0 to rebuild its state. You can do the same for other requirements.
In some cases, looking at how to use snapshots makes replaying events a lot faster, if you have an extremely large stream to deal with.
Paradigm shift
Event Store has quite a learning curve and is a paradigm shift from conventional transactional databases. Event Store's best friend is CQRS; we use a slightly modified version of the CQRS Lite open source framework.
To truly appreciate Event Store you would need to understand DDD concepts and then dig into CQRS/ES - There are a few good YouTube videos and examples.

How to keep conversation data in MS Bot framework

I am working with Microsoft bot development framework, using its node.js sdk.
I have been looking for a way to save all the messages of a conversation. I set persistConversationData to true, and tried to access the conversationData using session.conversationData. However, it is empty.
1- Is there a builtin method to access all the messages within a conversation?
2- If persistConversationData is not for that, can anyone please explain its usage.
Thank you so much.
By default, messages will not be persisted by the Microsoft Bot Framework. For stateful operations, you can use the Bot State API the following ways:
Set userData. The persisted data will be available to the same user across different conversations.
Set conversationData. The persisted data will be available to all the users within the same conversation.
Set privateConversationData. The persisted data will be available to the given user in the given conversation.
Set dialogData for storing temporary information in between the steps of a waterfall.
According to the documentation, conversationData is disabled by default. If you want to use it, you have to set persistConversationData to true.
tl;dr You have to take care of persistence yourself, e.g.:
// ...
var bot = new builder.UniversalBot(connector, { persistConversationData: true });
bot.dialog('/', function (session) {
    // conversationData is an object, so store the list under a key
    var messages = session.conversationData.messages || [];
    messages.push(session.message.text);
    session.conversationData.messages = messages;
    session.save(); // persist conversationData via the Bot State service
});
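The accumulation step can also be pulled out into a small pure helper, which keeps the dialog body trivial and is easy to test in isolation (the helper name and the storage key are illustrative):

```javascript
// Returns a new array with the message text appended; handles the first
// call, when nothing has been stored in the conversation state yet.
function appendMessage(messages, text) {
    return (messages || []).concat([text]);
}

// In the dialog you would then write something like:
// session.conversationData.messages =
//     appendMessage(session.conversationData.messages, session.message.text);
```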
