Visual Studio 2019 Logic Apps Designer removing code

I have Visual Studio 2019 16.10.4 and Azure Logic Apps Tools for Visual Studio 2019 2.24.2.
I have created a Logic App by first going to the portal and building a simple skeleton trigger that listens on an Event Grid topic and connects using a Managed Identity.
I then copy the JSON over to my Visual Studio project.
Once in Visual Studio, if I deploy the ARM template, everything works as I expect: the API Connection, Logic App, and Event Grid trigger are all created, so I am happy that the contents of the ARM template and parameters file work. (Note that I deploy the ARM template using a simple PowerShell script rather than the one generated by the tool, but that should not matter.)
The issue I face is that when I open the Logic App in the Logic Apps designer, the tool seems to remove the managed identity code from the JSON and then spits out an error in the Output window.
Let me try to explain.
This is the code in the raw JSON file before I open it in the designer:
"$connections": {
"value": {
"azureeventgrid": {
"id": "[concat(subscription().id, '/providers/Microsoft.Web/locations/', parameters('LogicAppLocation'), '/managedApis/', 'azureeventgrid')]",
"connectionId": "[resourceId('Microsoft.Web/connections', parameters('azureeventgrid_1_Connection_Name'))]",
"connectionName": "[parameters('azureeventgrid_1_Connection_Name')]",
"connectionProperties": {
"authentication": {
"type": "ManagedServiceIdentity"
}
}
}
}
}
When I open it in the designer, I see this error message in the Output window:
The workflow connection parameter 'azureeventgrid' is not valid. The API connection 'azureeventgrid' is configured to support managed identity but the connection parameter is either missing 'authentication' property in connection properties or authentication type is not 'ManagedServiceIdentity'.
And when I then look at the raw JSON, this is what appears:
"$connections": {
"value": {
"azureeventgrid": {
"id": "[concat(subscription().id, '/providers/Microsoft.Web/locations/', parameters('LogicAppLocation'), '/managedApis/', 'azureeventgrid')]",
"connectionId": "[resourceId('Microsoft.Web/connections', parameters('azureeventgrid_1_Connection_Name'))]",
"connectionName": "[parameters('azureeventgrid_1_Connection_Name')]"
}
}
}
}
So it appears as if the tool is not happy with some part of my code and removes it.
I then do my work, close the designer, open the raw JSON, copy the removed connectionProperties back in, and deploy. So I do have a workaround, but it is a bit tedious to do this every time.
Is this a known issue? For example, the designer does not seem to let me create a Logic App with an Event Grid trigger that uses Managed Identity (which is why I started by creating a skeleton in the portal and copying the code over).

Related

Azure Web Job - Data Processing

In VS I've created an Azure Web Job. I see a boilerplate method:
static void Main()
{
    var host = new JobHost();
    // The following code ensures that the WebJob will be running continuously
    host.RunAndBlock();
}
Also a function method:
// This function will get triggered/executed when a new message is written
// on an Azure Queue called queue.
public static void ProcessQueueMessage([QueueTrigger("queue")] string message, TextWriter log)
{
    log.WriteLine(message);
}
Cool... however, I don't want to use Azure Queue or Blob storage. I don't need to pass in any data as arguments or trigger it.
I simply want a job that will run every hour and do some data processing. Specifically, hit a third-party API and load some data into my Azure DB.
What am I missing here?
EDIT
Should I just be using a vanilla console app in this situation and publishing it as an "Azure Web Job"?
You should just use a vanilla console app and deploy that as an Azure Web Job. See the steps below:
Right-click the web project in Solution Explorer, and then click Add > Existing Project as Azure WebJob. The Add Azure WebJob dialog box appears.
In the Project name drop-down list, select the Console Application project to add as a WebJob.
Complete the Add Azure WebJob dialog, and then click OK.
The Publish Web wizard appears. If you don't want to publish immediately, close the wizard. The settings that you've entered are saved for when you do want to deploy the project.
Source with screenshots: https://azure.microsoft.com/nl-nl/documentation/articles/websites-dotnet-deploy-webjobs/#convert
You can find more information about this here: https://azure.microsoft.com/nl-nl/documentation/articles/websites-dotnet-deploy-webjobs/.
On this page you can also read that a console application can be used as an Azure Web Job by adding the Microsoft.Web.WebJobs.Publish NuGet package and a webjob-publish-settings.json file.
Example webjob-publish-settings.json:
{
  "$schema": "http://schemastore.org/schemas/json/webjob-publish-settings.json",
  "webJobName": "WebJob1",
  "startTime": "2014-06-23T00:00:00-08:00",
  "endTime": "2014-06-27T00:00:00-08:00",
  "jobRecurrenceFrequency": "Minute",
  "interval": 5,
  "runMode": "Scheduled"
}
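For the hourly run you describe, the same file should work with "jobRecurrenceFrequency": "Hour" and "interval": 1. The console app itself can then stay minimal; here is a rough sketch of the kind of processing described in the question (the API URL, connection string name, and table name are hypothetical placeholders, not anything prescribed by WebJobs):

using System;
using System.Configuration;
using System.Data.SqlClient;
using System.Net;

// A plain console app deployed as a scheduled WebJob: no JobHost and no
// queue/blob triggers. The scheduler simply runs Main() on each occurrence.
class Program
{
    static void Main()
    {
        // Hypothetical third-party endpoint; replace with the real API.
        var json = new WebClient().DownloadString("https://api.example.com/data");

        // Hypothetical connection string name; point this at your Azure DB.
        var connStr = ConfigurationManager.ConnectionStrings["AzureDb"].ConnectionString;
        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand(
            "INSERT INTO RawImports (Payload, LoadedAt) VALUES (@p, @t)", conn))
        {
            cmd.Parameters.AddWithValue("@p", json);
            cmd.Parameters.AddWithValue("@t", DateTime.UtcNow);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}

Because there is no JobHost or trigger, nothing from the WebJobs SDK is needed at runtime; the scheduling comes entirely from the settings file above.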
When you want to add this Azure Web Job to an existing Azure Web App (website) project, you can link the WebJob by adding a webjobs-list.json file to the website project.
Example webjobs-list.json:
{
"$schema": "http://schemastore.org/schemas/json/webjobs-list.json",
"WebJobs": [
{
"filePath": "../ConsoleApplication1/ConsoleApplication1.csproj"
},
{
"filePath": "../WebJob1/WebJob1.csproj"
}
]
}

How to refresh updated opened work item from TFS 2013 server side plugin?

I am trying to create a TFS 2013 server-side plugin that will transition the work item state depending on certain fields. The fields are being updated correctly, but the change is not reflected while the work item is open in the client (Visual Studio Team Explorer). I have to manually press the refresh button to display the correct state.
Can I force refresh the displayed work item after state change from plugin?
The following code handles the work item changed event.
if (null != workItem)
{
    workItem.PartialOpen();

    // If a team leader has been assigned, move newly raised items to "Analyse".
    if (!workItem.Fields["ALMTool.FF.Team.Leader"].Value.Equals(string.Empty))
    {
        if (workItem.Fields["System.State"].Value.Equals("Raised"))
        {
            workItem.State = "Analyse";
        }
    }
    else
    {
        workItem.State = "Raised";
    }

    workItem.Save();
    workItem.Store.RefreshCache(true);
    //workItem.Close();
    workItem.SyncToLatest();
}
I had the same need a long time ago, but I was trying to achieve this in Web Access rather than Team Explorer. For Web Access, it was not possible, for a simple reason: my implementation was server-side, and the refresh operation actually happens client-side.
But I think Team Explorer will not be able to do it either, since TFS doesn't provide any library for driving UI operations like "open work item window", "open pending changes window", etc.
You've already done the SyncToLatest, and TFS will force the user to refresh before making any further change to the work item after your operation, but that refresh has to happen client-side, manually.
If you want to achieve this without the user refreshing manually, and Web Access is OK for you, check whether your implementation could be done using TFS Web Access extensions, which run client-side. You can find detailed information about them on Serkan's blog.
Beytan, yes, if I write a client-side plugin then I need to deploy the library to each client, which I was trying to avoid by implementing a TFS server-side plugin. But the server would be loaded down with the work item changed event handlers, so now I am investigating client-side add-ins to automate the state transitions. (Shouldn't be that hard :))

Visual Studio 2010 Controller changes only picked up after Rebuild in MVC 4 Project

I have to rebuild the project each time I make changes to the controllers for them to be picked up when running the project. Is there a setting in VS2010 that's causing this? I'm using [OutputCacheAttribute(VaryByParam = "none", Duration = 1, NoStore = true)] on each method, so I don't think it's the web server. The debugger doesn't even step into the controller unless I rebuild.
Any code (controllers, services, data-layer items) is compiled into a .dll file, which is what the server uses for processing. The raw text files (what you are editing) are not used by the server. Therefore, any change you make to a .cs or .vb file will always require a recompilation. This is also true when using the built-in VS web server (where VS typically yells at me with "You can't modify this file while the debugger is running" or "the source file does not match the current debug file").
You can, however, modify views on the fly without recompiling in most cases.

Creating and deploying a Windows service publishing pages in SDL Tridion

Our requirement is to schedule publishing of a content page at recurring intervals in a Tridion CMS application. We are currently using Tridion 2009 SP1.
As per the suggestion from the experts in: Tridion 2009 SP1: How to schedule a content page for a recurring publishing? we have created a simple C# console application that references the Tridion Interop DLLs as below:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Tridion.ContentManager.Interop.TDS;
using Tridion.ContentManager.Interop.TDSDefines;
using Tridion.ContentManager.Interop.msxml4;
using System.Configuration;

namespace SchedulePublish
{
    class Program
    {
        static void Main(string[] args)
        {
            // Please use your system's corresponding WebDAV URLs and TCM IDs wherever required. Below are just samples :)
            TDSE tdse = new TDSE();

            // Give some identity that has access rights on the Tridion UI
            string Identity = @"Domain Name\Username";
            tdse.Impersonate(Identity);
            tdse.Initialize();

            string targetTypeId = "tcm:0-1-65537";
            Publication Pub_Obj = (Publication)tdse.GetPublication("/webdav/30%20DIRECTV%20sites");

            XMLReadFilter Filter = new XMLReadFilter();
            Component CompObj = (Component)tdse.GetObject("/webdav/30%20DIRECTV%20sites/Home/System/xml/Knavigation.xml",
                EnumOpenMode.OpenModeView, Pub_Obj.ID, Filter);

            // Reads the scheduled publish date from app settings
            DateTime schedulePublishDate = Convert.ToDateTime(ConfigurationManager.AppSettings["SharedPath"].ToString());
            CompObj.Publish(targetTypeId, false, false, false, schedulePublishDate,
                DateTime.MinValue, DateTime.Now, true, EnumPublishPriority.High, false, 3);
        }
    }
}
As we are new to this, please provide pointers on how to implement the steps below:
1. The Tridion CMS servers do not have Visual Studio installed, so please suggest a way to run this application and verify that we are able to publish the content as required.
2. Host this application on the Tridion CMS server and schedule it to run at the desired intervals every week.
You don't need Visual Studio to run your new console app, simply compile it and copy the files to the CMS server.
When you run the application, you should see items appearing in your Publish Queue. If you don't, I would recommend adding some logging calls to your application so you can see where the code is failing (consider using Log4J.NET if you have not done logging before).
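For example, you could wrap the publish call from the question in a try/catch with some simple output (a minimal sketch; plain Console calls stand in for a real logging framework):

try
{
    CompObj.Publish(targetTypeId, false, false, false, schedulePublishDate,
        DateTime.MinValue, DateTime.Now, true, EnumPublishPriority.High, false, 3);
    Console.WriteLine("{0:u} Published {1} to target type {2}",
        DateTime.Now, CompObj.ID, targetTypeId);
}
catch (Exception ex)
{
    // The Task Scheduler can redirect this output to a file for later inspection.
    Console.Error.WriteLine("{0:u} Publish failed: {1}", DateTime.Now, ex);
    Environment.Exit(1); // non-zero exit code so a scheduled run shows as failed
}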
Once you have validated that it works as desired, the easiest way to schedule it is to create a task using the Windows Task Scheduler. There is no way to run such a task from within the CMS. Alternatively, you could convert your console app to a Windows service, but I think that would be overkill in this case.

How to load Crystal Report from database in MVC 4? [duplicate]

I am trying to run a Crystal Report from my web application, which was built using ASP.NET 4.0 and Visual Studio 2010. I have installed the following from the SAP site (http://www.businessobjects.com/jump/xi/crvs2010/us2_default.asp):
1) SAP Crystal Reports, version for Visual Studio 2010 - Standard EXE installation package which installs the software into the Visual Studio IDE.
2) SAP Crystal Reports runtime engine for .NET Framework 4 (64-bit)
I have a page called Reports.aspx in which I have a Crystal Report viewer control:
<CR:CrystalReportViewer ID="rptViewer" runat="server" AutoDataBind="true" />
In the Reports.aspx.cs file I have the following code:
protected override void OnPreRender(EventArgs e)
{
    ReportDocument report = new ReportDocument();
    var path = Server.MapPath("Reports/Sample.rpt");
    report.Load(path);
    report.SetDatabaseLogon("username", "password", "servername", "databasename");
    rptViewer.ReportSource = report;
}
On the report.Load(path) line I get the following error:
Unsupported Operation. A document processed by the JRC engine cannot be opened in the C++ stack.
How can I fix this?
I ran into the same problem; in my case the report path was not valid. Maybe you have the same problem: check whether Server.MapPath("Reports/Sample.rpt") is returning a valid path.
Make sure the report is in the App_Code folder, then initialize a new instance of it instead of initializing a ReportDocument and loading the report into it:
Sample report = new Sample();
This worked for me.
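Putting that together with the code from the question, the handler would look something like this (a sketch; Sample here is the strongly typed class that the Crystal tooling generates for Sample.rpt):

protected override void OnPreRender(EventArgs e)
{
    // Instantiate the generated report class instead of calling
    // ReportDocument.Load() with a file path.
    Sample report = new Sample();
    report.SetDatabaseLogon("username", "password", "servername", "databasename");
    rptViewer.ReportSource = report;
    base.OnPreRender(e);
}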
You will need to modify two properties in the .rpt files:
Build Action is set to "Embedded Resource" by default. Change it to "Content".
Copy to Output is set to "Do not copy" by default. Change it to "Copy always".
Rebuild, build the deployment package, and publish. Done!
NOTE: Below, the term "WebSite" refers to actual web site nodes in IIS, NOT a virtual directory within a web site.
Problem Root Cause: There is no "aspnet_client" folder accessible by the application.
This can happen for several reasons:
Since the SAP CR installer appears to install the aspnet_client folder in the ...\inetpub\wwwroot\ folder, if your Web Site physical path is NOT ...\inetpub\wwwroot, your application will not have access to the aspnet_client folder.
If the aspnet_client folder was moved or deleted from the top level of your web site's physical path, your IIS application will not have access to the folder.
Problem Solution (For Windows Server 2008 R2)
Go to the IIS manager on your server
Expand the tree view node for the WebSite running your application
Look at the level immediately under the web site node and ensure you see an "aspnet_client" folder.
If you do see the folder, then perhaps this root cause is not the cause of your problem.
If you do NOT see the folder, search the server's hard drive for it and COPY it to the Web Site's Physical path.
Right-click the Web Site node and click Refresh in the popup menu.
You should now see the aspnet_client folder at the level directly under your web site node and the reports in the application should work.
I ran into this when I converted a web site to a web application. The report would run fine on my dev machine, but not on the server. Then I realized the rpt file was missing on the server!
By default the report files were considered embedded resources and were not copied when the web application was published. I just changed them individually, republished the site, and all was well again.
I have also had a report fail to load when I had mistakenly left the report file open in the Crystal Reports designer.
This error is a real treat, and seems to have many possible antecedents. Fortunately I only wasted a day on my particular variation:
ReportDocument.Load() also makes a local temp copy. (This may only happen when loading from a network drive location; I didn't test the case of a local load.)
If the user context under which the load occurs does not have authority to create the temp file locally, Load will fail with the same very unhelpful error.
Also, I ended up diagnosing this with Process Monitor. It may be helpful for you as well.
Please make sure your .rpt report files are in their original folder. I got the same error after I "published" my MVC web site to IIS; I didn't realize that "Publish" didn't put the .rpt files in the package.
Re-install the Crystal Reports runtime engine on the server and set Build Action to "Content".
It worked perfectly for me!
