Refactoring features for ASP.NET Core on VS 2017 are not working very well for me… Am I doing something wrong? - asp.net-mvc

I am in the process of learning ASP.NET Core MVC. While experimenting, I noticed that when I try to refactor code using the built-in refactoring tools provided by Visual Studio, they don't work very well. For example:
Renaming a class via the Solution Explorer pane usually fails. When it succeeds, it fails to rename the class in the view declaration (e.g. @model ClassNameIsNotRenamed);
If I rename a controller or action, tag helper attributes such as asp-controller="NotRenamed" and asp-action="NotRenamed" do not get updated.
etc.
I don't want to get too detailed about what works and what does not; the point I am trying to make is that VS 2017 does not appear to do a good job when it comes to refactoring.
So my question is... am I asking too much of VS 2017? Is there something I can do to make refactoring work better?
Thanks.

Doing something like a rename requires that your project builds successfully. If you attempt to rename something like a class and the project either hasn't been built or failed to build, the rename will fail as well. This is fairly logical, as a rename refactor requires tracking references, which it can't do without the IL.
As far as your Razor views and controller/action references go, you're dealing with strings and/or code that is not compiled, so it generally doesn't participate in code-based refactoring. Certain operations like rename give you the option to search strings as well, which you can use to catch more of the places where things should be renamed, such as your controller/action references. However, that can also cause unwanted side effects if you happen to use the same string somewhere else in a different context (which is why string replacement is not done by default).
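For illustration, here is roughly what such view markup looks like (a sketch; the Products controller, Edit action, and Product model are made-up names). The asp-controller and asp-action values are plain attribute strings in the .cshtml file, so a C# symbol rename has nothing to bind them to:

@model MyApp.Models.Product
<!-- hypothetical example; renaming ProductsController in C# leaves the string "Products" below untouched -->
<a asp-controller="Products" asp-action="Edit" asp-route-id="@Model.Id">Edit</a>
<!-- only a string search-and-replace during the rename would catch this reference -->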

Related

Resx in Blazor WASM: What is "the issue" with the old static way of using the Resx Files?

[Disclaimer: I'm a long-time desktop developer slowly learning web and Blazor, so this might be a noob question.]
How come, when you try to find the best practice for doing localization in Blazor, the official MS docs (https://learn.microsoft.com/en-us/aspnet/core/blazor/globalization-localization?view=aspnetcore-5.0&pivots=webassembly) and various blogs tell you to do the following:
Add NuGet Package: Microsoft.Extensions.Localization
Register localization "builder.Services.AddLocalization();"
Add your resx Files
Inject an IStringLocalizer (@inject IStringLocalizer Loc)
And finally use the following in your Razor pages: @Loc["Greeting"] (see the sketch below)
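Put together, the recommended approach boils down to roughly this (a minimal sketch; the Greeting component name, the "HelloTitle" key, and the matching Greeting.resx file are assumptions, not from the docs verbatim):

// Program.cs (Blazor WebAssembly) — register the localization services
builder.Services.AddLocalization();

@* Greeting.razor — strings are resolved by key at runtime *@
@using Microsoft.Extensions.Localization
@inject IStringLocalizer<Greeting> Loc
<h1>@Loc["HelloTitle"]</h1> @* "HelloTitle" is a magic string; nothing checks it at compile time *@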
Sure, the above works, but to a desktop developer this feels like a massive step back in quality and "refactor-safety": the new way uses "magic strings" to reference the translations.
I've tested the "old way" on a Blazor page, which consists of just:
Adding a MyResource.resx
Letting it use the custom tool "PublicResXFileCodeGenerator" to make the .designer file
Simply referencing the translation using MyResource.MyTranslationKey;
It works, it is refactor-safe, and there is no need for injection or NuGet packages... It just works, but despite that, it is not the recommended way. My question is: why not? What is the drawback? (All the blogs and documentation fail to say why the new way is better.)
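For comparison, the "old way" described above ends up looking roughly like this in a page (sketch; the namespace of the generated MyResource class is an assumption):

@* Greeting.razor — strongly-typed access via the designer-generated class *@
@using MyApp.Resources  @* wherever the generated MyResource class lives; assumption *@
<h1>@MyResource.MyTranslationKey</h1>
@* renaming MyTranslationKey is an ordinary symbol rename the compiler can verify *@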
I think there are a number of disadvantages to using PublicResXFileCodeGenerator, which may have led to the current recommendations on how to support i18n capabilities in [Blazor] apps.
Note that this is just a list of reasons I personally came up with as possible causes for the current recommendations:
A: Visual Studio exclusiveness
The way PublicResXFileCodeGenerator generates files seems to be Visual Studio exclusive, at least from my perception over the last couple of years. Today's teams tend to use a variety of IDEs/editors to build software (e.g. VS, VS Code, Rider, WebStorm, etc.). Using IStringLocalizer works with all editors, even Notepad or Vim.
B: no default fallback
With the recommended way of accessing a translation, there is always a useful fallback, which is provided in the markup. That is not the case when using the generated types to access translation units.
C: no built-in support for interpolation
Using IStringLocalizer, there is a built-in, lightweight and formalized way of using format strings. It even encourages their use instead of manually stitching such values together, which is considered bad practice when translating software.
DO:
@inject IStringLocalizer<DemoPage> L
<h1>@L["Greetings, {0}", userName]</h1> <!--Greetings, Arthur-->
@code {
    string userName = "Arthur";
}
DON'T:
<h1>@DemoPageRessources.Greeting @userName</h1> <!--Greetings Arthur-->
@code {
    string userName = "Arthur";
}
This dictates the order of strings, which might be OK for one language but not for another. Achieving the same with the generated type is a bit more verbose and may even lead to runtime exceptions when there is no actual translation, I guess.
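To make that concrete, reordering with the generated type means falling back to string.Format with a format-string resource, roughly like this (sketch; GreetingFormat is a hypothetical resource entry holding "Greetings, {0}"):

<h1>@string.Format(DemoPageRessources.GreetingFormat, userName)</h1>
@* more verbose, and if the GreetingFormat entry is missing the generated property can
   come back null or fail at runtime — the runtime-exception risk mentioned above *@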

Dynamically resolving Assemblies without the file name

Yes, I've read the warning label, and I know that dynamically loading assemblies is somewhat discouraged. That said, I have an application that loads assemblies - that's just how it works. It works fine on Windows. Works fine on Windows CE. I need it to "work fine" on Android, even if it takes some massaging.
Basically the app is an engine that loads plug-in DLLs (we'll call them Adapters) that meet specific interfaces at run time. Under Windows, it even detects the appearance of a DLL at any point and goes and loads it - I'm fine if that's not going to work under Android.
What I'm having trouble getting working is having the Engine load an Adapter that it knew about at design/compile time, but without hard-coding the name of that Adapter into the Engine code. I'm fine with adding a reference to the Adapter so it doesn't get linked out, but I really, really don't want to have to add the DLL name every time, as the DLLs change with different deployments, and that would lead to a huge headache.
So I figured that if it's referenced, it would get into the APK, and I could use reflection to load it like this:
var asm = Assembly.Load("TheAdapterName.dll");
Initial tests show that this works for the Adapter if I just hard-code the name, but again, I really, really want to avoid that.
So I thought that maybe I could reflect through the references and extract the name, but oddly, not all references actually show up when I do that. So I do this:
var refs = asm.GetReferencedAssemblies().Select(a => a.Name).ToArray();
And I get back an array of 14 assembly names. But the assembly (asm) has 16 references, one of which is the Adapter plug-in I need to load. The Adapter is definitely there - heck I used Assembly.Load with the full name two lines above and it resolved.
I thought, ok, maybe I can figure out the "path" to the folder from which I'm running, and then look for DLLs there and load that way. Ha. After several hours of trying to figure out a way to get the path that would work under Debug and Release, I came up with nothing but more grey hair.
Sooooo...... any thoughts on how I might get the name of a DLL that I know is in my APK, but that I don't "know" the name of at build time? (I'm loading them and looking for interfaces via reflection to detect their "Adapterness".)
If those methods aren't working for you, then the only suggestion I can think of is to add a prebuild step which updates either a C# or an Assets file in order to provide the list you need.
Obviously this is extra work, but should be fully automated and is guaranteed to work no matter what platform changes get thrown at you.
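As a sketch of that idea (all names here are made up): the prebuild step could (re)generate a small C# file listing the adapter assembly names it finds, and the engine could then load them by simple name:

// KnownAdapters.generated.cs — (re)written by a prebuild step; contents are illustrative
internal static class KnownAdapters
{
    public static readonly string[] AssemblyNames =
    {
        "TheAdapterName",      // one entry per adapter DLL present at build time
        "AnotherAdapterName",
    };
}

// Engine start-up: load each known adapter and scan it for "Adapterness"
internal static class AdapterLoader
{
    public static void LoadAll()
    {
        foreach (var name in KnownAdapters.AssemblyNames)
        {
            var asm = System.Reflection.Assembly.Load(name);
            // ... reflect over asm.GetTypes() for types implementing the adapter interface ...
        }
    }
}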
As an aside, I also just looked at one of my mvx projects using reflector - it shows the same asm.GetReferencedAssemblies() list as your investigation reports: runtime-loaded plugins are not listed. I guess the GetReferencedAssemblies method reports only on assemblies actually used to import type references at the IL level - so if you reference an assembly in the csproj but don't import any of its types, it isn't listed as a reference in the compiled code.

How to reuse an MVC app

I've created a web app (mvc4) that I'd like to reuse in multiple projects. The site is an admin panel, but it may be extended and slightly modified in each project. I want to avoid copying the project over, because I'd like to be able to update each project to the latest version at the lowest possible cost.
So far I have tried 2 approaches:
a script that 'clones' the project by copying all the necessary things as well as altering others (GUIDs in assemblies, namespaces and things like that) - this works fine for extensibility and modification, but it's just a copy, so pushing 'updates' is a mess (I did it manually) and it does not scale.
portable areas from the MvcContrib project - this seemed like a good idea at first, but it turns out it's nice for simple scenarios and fails at more advanced use cases. It doesn't support localization (from resources embedded in the portable area), bundling and minification require a lot of hacks (MvcContrib is still on MVC 3), it's not possible (out of the box) to reuse shared views or Display/Editor templates from the portable area, and it looks like if I went further that way, more issues would come up.
Currently I'm thinking about 'just' branching each project from the core one. This would of course require the same changes (or at least a big subset of them) that were done in the script I mentioned earlier, and I'm afraid that if I try to pull updates from the core project the number of conflicts will render the whole approach unusable.
Does anyone have an idea on how I could tackle this problem?
I'd suggest creating a NuGet package of the MVC app and reusing it, so versioning and applying updates would be much easier. However, it takes a bit of work to make your code completely isolated from the code you want to add in the new project.

How to work with NopCommerce MVC as a team

We are currently looking at the newest version (2.60) of NopCommerce in MVC and will be integrating it pretty soon. We've downloaded the source code and paid the $20 for the User Guide documentation. The documentation is great! I mean, it is great in the sense that it explains how to deploy, install, and work with the UI front end and back end. This gives a good overall overview, but what it lacks is an explanation of how to work with NopCommerce as a team. What are the best practices, etc.?
As an example (or parallel), if you decide to work with Dotnetnuke as a team, you usually work in the following fashion:
Each developer downloads/installs Dotnetnuke locally on their machine.
You also download/install Dotnetnuke on a dedicated server (let's say dev-server).
As a developer, you work and create modules which you test locally within your Dotnetnuke installation.
Once a module is done, you package it (and any SQL scripts that come with it) into a zip file.
Once the package is ready, you upload/install that package on the dedicated server (dev-server).
This approach works great for Dotnetnuke and more importantly if you have a team of developers creating modules.
My question is how does a team work with NopCommerce MVC?
I'm assuming it is a bad idea to work directly within the source code, in case your team decides to modify core elements/source, which would make any upgrade to newer versions impossible (or introduce breaking changes).
I'm not sure if my parallel to Dotnetnuke is a correct one… but would anyone have any idea (or help me clarify) how a team works with NopCommerce MVC?
In addition, should the team only rely on creating plugins for NopCommerce and stay away from modifying the core or should this be irrelevant?
What about adding new objects in SQL (or modifying existing ones) should we prefix our objects in case an eventual NopCommerce MVC upgrade creates similar objects and/or overwrites them?
Thank you for helping me shed some light on this.
Sincerely
Vince
Plugins in NopCommerce are almost like modules in DNN. Depending on what you need to do, it is sometimes necessary to modify the core code.
What I've been doing for the services is to create a new class that inherits from the existing service and override the function I want to change, then create a new DependencyRegistrar class and set my new service classes as the implementations for the corresponding interfaces. Also make sure the Order property is 1 so that your DR class is loaded after the stock one. Since you're inheriting from the core class, any functions you didn't override are handled by the parent class. If I need to add a new function, I just modify the interface, put a stub in the stock class, and implement it in my own.
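A rough sketch of that pattern, using a product service as the example (type and member names are illustrative, the overridden method is assumed to be virtual as described above, and the exact IDependencyRegistrar signature and registration lifetime differ between nopCommerce versions):

// MyProductService.cs — inherit the stock service and override only what you need
public class MyProductService : ProductService
{
    // constructor: accept the same dependencies as ProductService and pass them to base(...)

    public override void UpdateProduct(Product product)
    {
        // custom behaviour here, then optionally defer to the stock implementation
        base.UpdateProduct(product);
    }
}

// MyDependencyRegistrar.cs — swap the implementation registered behind the interface
public class MyDependencyRegistrar : IDependencyRegistrar
{
    public void Register(ContainerBuilder builder, ITypeFinder typeFinder)
    {
        builder.RegisterType<MyProductService>().As<IProductService>().InstancePerLifetimeScope();
    }

    // Order = 1 so this registrar runs after the stock one (Order = 0) and wins
    public int Order { get { return 1; } }
}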
Views in the Nop.Web project can be overridden by Themes. The Admin stuff and the Web Controllers get trickier. I'm just modifying those files directly.
The Core and Data classes can be done using partial classes to add your new fields.
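The partial-class part might look like this (sketch; the file location and the new field are illustrative):

// added as a new file in the Nop.Core project, next to the stock entity definition
public partial class Product
{
    public string MyCustomField { get; set; }
}
// remember to add the matching column/mapping on the Nop.Data side as well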
In any case you will still need to merge changes with your solution when an update is released. My opinion is that you are better off writing clean, readable code now and bite the merge bullet when it comes.
I don't really worry about SQL scripts right now because I'm a single developer, but maybe you could add a folder for ALTER scripts and name them after the day they were created. Then each dev knows which scripts they need to run when they get latest.

Visual Studio: What approach do you use to 'template' plumbing for similar projects?

When building ASP.NET projects there is a certain amount of boilerplate, or plumbing that needs to be done, which is often identical across projects. This is especially the case with MVC and ALT.NET approaches. [I'm thinking of things such as: IoC, ORM, Solution structure (projects), Session Management, User Management, I18n etc.]
I would like to know what approach you find best for 'reusing' this plumbing across projects?
Have a 'master solution' which you duplicate and rename somehow? (I'm using this approach to a degree at the moment, but it's fairly messy. I'd be interested in how people do this 'better'.)
Mainly rely on Shared Library projects? (I find this appropriate for some things, but too restrictive for things that have to be customised)
Code generation tools, such as T4? (Similar to the approach used by SharpArchitecture - have not tried this myself)
Something else?
Visual Studio supports Custom Templates.
I definitely (mostly!) go for T4 templates in conjunction with a modified version of SubSonic 3. I kind of use the database to model my domain and then use the T4 templates to generate the model and associated controllers and views. It takes about 50-60% of the effort out and keeps a consistency in place.
I then work on overrides (partials) of the classes along with filters and extension methods to 'make the app'. Now that I'm familiar with the environment and what I'm doing, I can have a basic model with good plumbing in place in a very short space of time. More importantly, because I create a set of partial class files, I can regenerate all I want without losing any of my 'custom' coding.
It works for me anyway :)
You could do it the bearded, t-shirted, agile style: create a nice template and put it in source control. Then when you need a new project, you just check out the template.
For insanely fast MVC site setup, I use modified T4 templates (created with T4 Editor) with a lot of help from Oleg Sych's blogs for page generation (for your typical add/edit/index pages), combined with an awesome implementation of automated create-update-delete called MVCCrud (if LINQ to SQL is your preferred data access method).
Using modified T4 templates and MVCCrud you can create fully functional entities (Create/Edit/List/Delete) with error handling and intuitive error messages in about 4 minutes for each.
I create a new project using the new project wizard so that I get unique project GUIDs assigned. Then I would use "Add Existing Item" to copy items from similar projects if it made sense to do so.
I sometimes use a file diff tool to copy references from one project to another, otherwise I just add the references by hand. A file diff tool can also be used to include similar source files, but the underlying files have to be copied anyway, so I prefer "Add Existing Item".
I've used T4 to generate solution and project files, but that definitely seems like an edge case and not something that would normally be necessary. In that case, I'd probably wrap the T4 in a PowerShell-like script to create and populate the rest of the directory structure.
I use "shared libraries" pretty aggressively in general, but not specifically due to this scenario.
In general, I don't find myself reusing plumbing between projects much. It's probably more often that I hack away in one "prototype" project, then abandon it, and rebuild the project from scratch following the above approach and only bring over the "non-hacky" code.
I'm creating an MVC2 application template at http://erictopia.com. It will contain all the basic items I think should be in an MVC project. These include BDD specifications, an ORM (NHibernate and possibly LightSpeed), T4 templates, custom providers, ELMAH support, a CSS/JavaScript minifier, etc.
