Published Azure Code - asp.net-mvc

Is it possible to retrieve the published code from an Azure Cloud Service?
When I changed my TFS mapping, TFS wiped out the code I had written on my local machine. It converted the .csproj and .ccproj files to .csproj.user and .ccproj.user files, and it also removed the solution. I haven't checked anything in since February, so checking out loses three months' worth of code. I have access to some of the views, scripts, and .css files, but all .cs files are gone. I have tried the following.
Remote desktop into the published site.
Works, but all .cs files are compiled into a .dll; the code is lost and "obfuscated" when decompiling.
Wondershare data recovery.
Some files are found, but often in an unreadable format. Many are still missing.
Getting the blob in the vsdeploy folder in Azure Storage.
I have the blob. Now what? Is there a way to convert that back into a readable project?
Using the "Open from Azure Website" extension to load the project into Visual Studio via the publish settings file from the Azure Portal.
This works great for App Services, but I cannot find any trace of a .PublishSettings file for the cloud service. The Get-AzurePublishSettingsFile call from Windows PowerShell doesn't download the correct file. When using the extension I get an "Object reference not set to an instance of an object" exception. I have tested the extension with an App Service and it works perfectly.

If you're talking about web/worker roles in a cloud service, then no - you cannot retrieve deployed code. Code headed for a cloud service gets packaged up first by Visual Studio (or directly through command-line tools, or via Eclipse), and that entails compiling all of your code first. Source files are not included in the package (unless you've explicitly done something like setting "copy local" to true in the package, which I can't imagine anyone doing).
As far as what's in blob storage: If your .cspkg is still sitting in a blob, sure, you can download and examine it. But again, it'll just contain the same package that was built locally and uploaded during deployment.
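If you do pull the package down, a rough sketch of examining it locally might look like this (assuming a package built with a reasonably recent SDK, which is an unencrypted zip of zips; all file names below are placeholders):

    # Sketch: inspect a downloaded .cspkg locally (file names are placeholders).
    # Packages from SDK 1.7+ are plain zips; each role is typically packed inside
    # as a .cssx file, itself a zip holding the compiled site (approot) - DLLs,
    # views, and static content, but no .cs source.
    Copy-Item .\MyService.cspkg .\MyService.zip
    Expand-Archive .\MyService.zip -DestinationPath .\MyService_pkg    # needs PowerShell 5+

    # Unpack each role's .cssx the same way:
    Get-ChildItem .\MyService_pkg\*.cssx | ForEach-Object {
        Copy-Item $_.FullName "$($_.FullName).zip"
        Expand-Archive "$($_.FullName).zip" -DestinationPath "$($_.FullName)_contents"
    }

What you'll find inside is the compiled output for each role - assemblies, views, and static files - not the original .cs files.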
With Web Apps, your code will be available on the site's d:\ drive (under d:\home), since deployments are done via version control (unless you simply FTP something up).
With Virtual Machines (which sit in cloud services in the "classic" deployment model), you would have needed to push code to the VMs on your own (there's no built-in push-from-version-control). So again, unless you pushed source code to the VM, there's no way to retrieve said source code.
As for the code that was wiped out on your local machine, it might be worth looking into recovery/forensic disk tools (which it looks like you've started doing) to see if your code is still sitting around somewhere, hidden. But, really, how you go about hunting for deleted or overwritten files would be off-topic (or something to ask on Super User).

What is the purpose of "building" a .NET MVC application if it runs fine without it?

I have a simple MVC Web application in the .NET Framework. To run it, I can click the green arrow ("play" button) in Visual Studio, which does a "build" and starts a Web browser pointing to the application.
Or, I can just start up IIS Express with the proper command line options, and navigate to localhost:8080 in a browser and run the application without a "build".
What is the purpose of "building" the application in Visual Studio if it runs fine without it?
The simple answer is that it doesn't run without the build step; your assumptions are wrong.
However, Visual Studio continuously monitors your source files and compiles them, e.g. to be able to show intellisense suggestions and compiler errors while you type. This means that there are in fact compiled binaries based on your source somewhere, maybe just not in the bin folder under your project root (that somewhere might be in memory, or in some cache location on disk, depending on circumstances out of scope for this question).
It's also very likely that you've previously built your application, resulting in binaries in your bin folder, even if you didn't do it with the purpose of running the application right after. In either case, if you get it working with IIS Express it's because it can find compiled binaries somewhere, and run those.
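If you want to see this for yourself, a quick sanity check (project name and paths below are hypothetical) is to look at what's sitting in bin and when it was built, then rebuild explicitly and compare:

    # Hypothetical project name; web application projects compile into bin\ under the project root.
    Get-ChildItem .\bin\MyMvcApp.dll | Select-Object Name, LastWriteTime

    # Rebuild explicitly (run where msbuild.exe is on the PATH, e.g. a Developer Command Prompt),
    # then check the timestamp again - this is essentially what the "play" button does before launching.
    msbuild .\MyMvcApp.csproj /t:Build /p:Configuration=Debug
    Get-ChildItem .\bin\MyMvcApp.dll | Select-Object Name, LastWriteTime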
The main reason to have Visual Studio explicitly rebuild your app when you hit play, is to make sure that you're running the latest version of your code. Sure, it takes a few extra seconds every time you start the debugger, but it's nothing compared to the time you'd lose trying to track down a bug that you've already fixed in your code, but which still manifests in the running application, because the running application is an outdated version. (It also makes things like stepping through the code much less confusing, since, again, the source code on file will always be in sync with the running application.)

VirtualStore for delphi application

I use an OpenDialog to load a file into the application path. Is there any way to load the file into the %USERPROFILE%\AppData\Local\VirtualStore\Program Files\MyApplication folder instead? This is because users should not see the loaded file.
Windows will automatically show the "VirtualStore" copies of files to old applications. This is done to make sure that old applications built before UAC continue to run correctly. To turn off this behavior, you need to add an application manifest to your program. This will make Windows turn off the VirtualStore behavior for both files and registry entries.
Here is a good page that describes what is happening in detail:
http://www.codeproject.com/Articles/17968/Making-Your-Application-UAC-Aware
The manifest is an XML resource that can be embedded into the application. In terms of UAC, it serves two purposes. First, it tells the operating system that the application has been designed with UAC in mind and therefore should not have any folders or registry settings virtualized; if the application still attempts to access protected resources after making that declaration, the requests simply fail rather than being virtualized. Second, it allows the application to state the privilege level at which it needs to run, and whether it requires elevation.
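For reference, a minimal manifest of the kind described above looks roughly like this (the values shown are common defaults rather than anything specific to your project):

    <?xml version="1.0" encoding="UTF-8" standalone="yes"?>
    <assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
      <!-- Declaring a requestedExecutionLevel is what opts the process out of
           file and registry virtualization. "asInvoker" runs without elevation. -->
      <trustInfo xmlns="urn:schemas-microsoft-com:asm.v3">
        <security>
          <requestedPrivileges>
            <requestedExecutionLevel level="asInvoker" uiAccess="false"/>
          </requestedPrivileges>
        </security>
      </trustInfo>
    </assembly>

In Delphi the manifest is typically compiled into a resource and linked with a {$R ...} directive; the links below cover the Delphi-specific steps.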
There are several questions already on StackOverflow that deal with creating and adding an application manifest to Delphi 7 projects. Here is one link to get you started:
Delphi 7 vista / windows 7 manifest
Once you tell Windows, via the manifest, that you know about the new version of Windows, you will need to make sure that you are playing by the new rules and don't write data back to any of the protected locations.

debug ASP.NET MVC system code in Azure Compute Emulator

I'm developing an ASP.NET MVC (still v2) Azure web application. I've run into a problem that seems to require me to step through the MVC code itself. I can do this, but I also need to be able to inspect the code as I step, so I need to be running an unoptimized version (so as not to get the "Cannot obtain value of local or argument ..." message in the debugger).
The standard way of circumventing the optimizations (http://blogs.msdn.com/b/sburke/archive/2008/01/29/how-to-disable-optimizations-when-debugging-reference-source.aspx) doesn't seem applicable to running in the Compute Emulator.
I also tried this by creating a local debug build of the System.Web.Mvc project but my web role hangs when I try to start it in the Compute Emulator.
So ... Any help with either of the following would be much appreciated:
Running an Azure web app in Visual Studio (2010) so that it will ignore code optimizations in system dlls.
OR
Creating a local system debug build so that it can be referenced by an Azure web app being debugged in the Compute Emulator.
If the Azure Compute Emulator is giving you issues, you could run your MVC project using IIS Express. Just right-click the project and select Debug / Start new instance.
I was finally able to get unoptimized ASP.NET code while debugging in the Compute Emulator. The basic approach described on MSDN (http://msdn.microsoft.com/en-us/library/9dd8z24x%28v=vs.100%29.aspx) and elsewhere (http://martin.bz/blog/asp-net-mvc-source-debugging-the-easy-way, among others) is to put an .ini file that tells the JIT compiler not to optimize into the same directory as the DLL.
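For reference, the .ini those articles describe is named to match the assembly (System.Web.Mvc.ini in this case) and contains just:

    [.NET Framework Debugging Control]
    GenerateTrackingInfo=1
    AllowOptimize=0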
The first challenge was to determine just which directory that was; it finally dawned on me to watch the logs in the Compute Emulator UI and see where the DLL (in this case System.Web.Mvc) was loaded from.
The second challenge was getting the .ini file there. Windows Explorer didn't work because it uses a different way of viewing assembly caches that doesn't give you direct access to the files. One of the posts I read reminded me that using the Command Prompt might give me that access and it did. The last step was realizing, when the Command Prompt wouldn't permit me to move the .ini file into the assembly directory, that I needed to run Command Prompt as admin.
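In other words, something along these lines from an elevated prompt (the destination below is only a placeholder; use whatever folder the emulator logs showed the DLL being loaded from):

    # Run elevated. The GAC subfolder is a placeholder - substitute the path the
    # Compute Emulator logs reported for System.Web.Mvc.dll.
    Copy-Item .\System.Web.Mvc.ini "C:\Windows\assembly\GAC_MSIL\System.Web.Mvc\<version_publickeytoken>\"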
Once I could view variables while debugging, I pretty quickly realized where my bug was.

How to migrate Delphi or clone Delphi registry settings?

I have two PCs both with XE2. I thought that I had installed identically on both but have problems installing 3rd party packages on one while the other is just fine.
I want the same on both anyway. The easiest option would probably be just to "migrate" the working set-up by moving it into my Dropbox folder. Can I do that? If so, how?
If not, can I (easily) back up my registry settings on one machine and then import them on the other?
I suppose I could just sort out the problem on the one PC, but I am not having much luck so far. I would rather invest the time in only having one Delphi setup. And since I am moving lots of other stuff to Dropbox anyway ...
The tool for this is now built into Delphi XE8 and higher.
It's found here:
C:\Program Files (x86)\Embarcadero\Studio\20.0\bin\migrationtool.exe
Online documentation:
http://docwiki.embarcadero.com/RADStudio/Rio/en/Settings_Migration_Tool
Install CnPack wizards from http://www.cnpack.org
From the CnPack toolbar, select IDE Config Backup/Restore and save the resulting file somewhere safe.
Copy the components to the second Delphi machine. Keep the exact same directory structure.
I store my components as follows, which helps with backing up, moving, etc., but you can use your own structure:
D:\components_bds\DCU
D:\components_bds\BPL
D:\components_bds\ComponentsThemselves
Use the saved config file with CnPack's restore function to restore your components on the new machine.
This is also useful if you're testing components that you plan to remove later; keeping a backup of your installation, in case something goes wrong, can also save you time over a fresh Delphi installation if a hard drive dies. Keep a copy on a flash drive or somewhere safe.
You may also compare/diff the config files created by CnPack with a tool like Beyond Compare to see what the differences are and find out why third-party components give problems on one of the machines. It may be a Delphi registry/installation problem or a problem with the third-party components' paths. Components may need to be installed in a particular order; perhaps the install did not find a needed .dcu or DLL that it depends on.
I don't know of any way to do this with Dropbox. Here's an old post I made in the CodeGear newsgroups (related to Delphi 7, but, with the registry keys corrected, still applicable); hopefully it will help.
(It probably goes without saying, but back up the existing registry settings on the destination machine before starting by using RegEdit and exporting them, just in case. You'll at least be able to get back to the point you're at now if something goes wrong by deleting the imported entries and then importing the saved ones.)
You can't, without some difficulty anyway. (Especially if you have third-party components installed, as they may have placed files in the %SYSTEM% folder that you may not know about.)
You may be able to do it (when going from the old computer to a new computer running the same exact version of Windows!) by exporting the registry keys under HKCU\Software\Embarcadero and HKLM\Software\Embarcadero from the old machine, and then, after installing Delphi on the new machine (in the exact same folder location), importing that registry file.
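A sketch of that export/import step (file names are arbitrary; run from an elevated prompt on both machines):

    # On the old machine: export both Delphi registry hives to .reg files.
    reg export "HKCU\Software\Embarcadero" embarcadero-hkcu.reg
    reg export "HKLM\Software\Embarcadero" embarcadero-hklm.reg

    # On the new machine, after installing Delphi into the exact same folder:
    reg import embarcadero-hkcu.reg
    reg import embarcadero-hklm.reg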
Many of the compiler, linker, and other settings are configured on a per-project basis, and should transfer over when you move your source code to the new machine.
Third-party components are a problem, as I mentioned above. You may be able to get away with using the registry export/import if you copy each third-party component set from the old computer into exactly the same location on the new machine before importing the registry file. You'll probably have to track down some .BPL files that end up in $(BDS)\Bin and possibly other folders under the $(BDS) tree; the IDE will tell you about missing stuff when you try to start it. Make sure you answer "Yes" when asked if you want to try and load it again next time!
Most of my development is hobby stuff or wannabe releases. Instead of dying trying to move my XE2 Pro from my Dell Inspiron N7110 Win 7 machine to my new Win 10 SSD machine, I'm seriously considering switching to Lazarus. I've used Lazarus 2.x with Indy 10, ZeosLib, and Firebird and successfully created a working distributed internet system. I also created a Lazarus version of my XE2 Blackjack program. Compared to XE2, Lazarus (IMO) has only two weaknesses, and neither is a deal breaker for me. BTW, I have successfully duplicated Lazarus (with all installed components) from one machine to another simply by copying and pasting the Lazarus directory, and it works. Try that with Delphi.
Sam

IDAPI , BdeAdmin and Windows 7

After many months of postponing it, this week I finally started using a new Windows 7 Professional PC for actual development (which is 90% still done in Delphi 7, with some of these programs still using the Borland IDAPI to access Paradox files). The previous development PC was still an XP one.
Everything works except for one thing: somehow the settings of the IDAPI and BdeAdmin configuration files are messed up, or they are read/written in different locations. To be more precise, it looks like two configuration files are active.
It must have something to do with rights, or with settings being read/written in the wrong folder or registry location, but after searching for it for a couple of hours, I give up.
Has anyone had any problems with this before? And if so, hopefully, has anyone solved this problem?
Thx for any thoughts/solutions ...
My guess is it has something to do with the fact that Vista and Windows 7 don't allow programs to change files under the C:\Program Files folder. They create a copy of those changed files in a virtual store, the process is known as virtualization. The copies end up in the hidden appdata folder of the user account and can be found in the Local\VirtualStore\Program Files folder. The structure in that folder reflects the one in the actual Program Files folder.
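If you want to confirm that this is what's happening, a quick look at the corresponding VirtualStore path will show any virtualized copies (the BDE folder below is only an illustration; use whichever folder your IDAPI/BDE configuration actually lives in under Program Files):

    # Illustrative paths only - substitute the folder your BDE configuration actually lives in.
    Get-ChildItem "$env:LOCALAPPDATA\VirtualStore\Program Files (x86)\Common Files\Borland Shared\BDE" -ErrorAction SilentlyContinue
    Get-ChildItem "$env:LOCALAPPDATA\VirtualStore\Program Files\Common Files\Borland Shared\BDE" -ErrorAction SilentlyContinue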
Programs that access their files in the Program Files folder using a "hardcoded" path will always get the original - unchanged - file contents.
Solution: running the apps in a virtual XP system or upgrading the apps is probably your best bet.
You could try to run the apps elevated. That is: right click them and choose Run as Administrator. Please note that it isn't enough to be logged in as an administrator. Even administrators run all processes unelevated by default. Instead of right-clicking, you can also create a shortcut and set the Run as administrator for the shortcut - the checkbox for this is on the compatibility tab of the properties dialog. No guarantees though that this will alleviate the problem.
Since, IIRC, the D7 setup allows you to configure paths in multiple ways, maybe simply do a reinstall outside "Program Files"?
AFAIK this solves several Vista/Win7 problems.
