Installing Umbraco programmatically

I'm trying to install Umbraco without using the visual interface, in order to increase my productivity.
Currently my code looks like this:
var installApiController = new InstallApiController();
var installSetup = installApiController.GetSetup();
var instructions = new Dictionary<string, JToken>();

var databaseModel = new DatabaseModel
{
    DatabaseType = DatabaseType.SqlCe
};

var userModel = new UserModel
{
    Email = "my@email.com",
    Name = "My name",
    Password = "somepassword",
    SubscribeToNewsLetter = false
};

foreach (var step in installSetup.Steps)
{
    if (step.StepType == typeof(DatabaseModel))
    {
        instructions.Add(step.Name, JToken.FromObject(databaseModel));
    }
    else if (step.StepType == typeof(UserModel))
    {
        instructions.Add(step.Name, JToken.FromObject(userModel));
    }
}

var installInstructions = new InstallInstructions
{
    InstallId = installSetup.InstallId,
    Instructions = instructions
};

InstallProgressResultModel progressInfo = null;
do
{
    string stepName = "";
    if (progressInfo != null)
    {
        stepName = progressInfo.NextStep;
    }

    try
    {
        progressInfo = installApiController.PostPerformInstall(installInstructions);
    }
    catch (Exception e)
    {
        throw new Exception(stepName, e);
    }
}
while (progressInfo.ProcessComplete == false);
The code fails at this line: https://github.com/umbraco/Umbraco-CMS/blob/dev-v7.8/src/Umbraco.Web/Install/InstallSteps/NewInstallStep.cs#L47, and I believe it's because the ApplicationContext isn't updated between installation steps (e.g. it isn't refreshed after the database is created).
Is it possible to update the ApplicationContext manually after each step of the installation process, or do I have to trigger each installation step in a separate HTTP request?
The code works if I run each step in a separate HTTP request.
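For what it's worth, here is a rough sketch of the separate-request variant that does work: drive the install over HTTP so Umbraco rebuilds its application state between steps. The endpoint paths and the response deserialization below are assumptions based on the controller used above, not something verified against a specific Umbraco version:

// Sketch: perform the install by POSTing each step over HTTP, one request per step,
// so application state is refreshed between steps (endpoint paths are assumed).
// Requires the Microsoft.AspNet.WebApi.Client package for PostAsJsonAsync/ReadAsAsync.
using (var client = new HttpClient { BaseAddress = new Uri("http://localhost:8080/") })
{
    var setupJson = await client.GetStringAsync("install/api/GetSetup");
    // ... build the same InstallInstructions payload as above from setupJson ...

    InstallProgressResultModel progressInfo = null;
    do
    {
        var response = await client.PostAsJsonAsync("install/api/PostPerformInstall", installInstructions);
        response.EnsureSuccessStatusCode();
        progressInfo = await response.Content.ReadAsAsync<InstallProgressResultModel>();
    }
    while (progressInfo.ProcessComplete == false);
}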

Related

'Create copy of work item' via REST API for Azure DevOps?

I want to 'Create copy of work item', which is available via the UI, ideally via the API.
I know how to create a new work item, but the UI feature that carries over all the current parent links / related links and other details is quite useful.
The API for creating a work item is documented here: https://learn.microsoft.com/en-us/rest/api/azure/devops/wit/work%20items/create?view=azure-devops-rest-5.1
Any help would be greatly appreciated.
We cannot just copy a work item verbatim, because it contains system fields that have to be skipped. Additionally, your process may have rules that block some fields at the creation step. Here is a small example of cloning a work item through the REST API with https://www.nuget.org/packages/Microsoft.TeamFoundationServer.Client:
class Program
{
    // System fields to skip when copying
    static string[] systemFields = { "System.IterationId", "System.ExternalLinkCount", "System.HyperLinkCount", "System.AttachedFileCount", "System.NodeName",
        "System.RevisedDate", "System.ChangedDate", "System.Id", "System.AreaId", "System.AuthorizedAs", "System.State", "System.AuthorizedDate", "System.Watermark",
        "System.Rev", "System.ChangedBy", "System.Reason", "System.WorkItemType", "System.CreatedDate", "System.CreatedBy", "System.History", "System.RelatedLinkCount",
        "System.BoardColumn", "System.BoardColumnDone", "System.BoardLane", "System.CommentCount", "System.TeamProject" };

    // Unneeded fields to skip
    static string[] customFields = { "Microsoft.VSTS.Common.ActivatedDate", "Microsoft.VSTS.Common.ActivatedBy", "Microsoft.VSTS.Common.ResolvedDate",
        "Microsoft.VSTS.Common.ResolvedBy", "Microsoft.VSTS.Common.ResolvedReason", "Microsoft.VSTS.Common.ClosedDate", "Microsoft.VSTS.Common.ClosedBy",
        "Microsoft.VSTS.Common.StateChangeDate" };

    const string ChildRefStr = "System.LinkTypes.Hierarchy-Forward"; // there should be only one parent

    static void Main(string[] args)
    {
        string pat = "<pat>"; // https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/use-personal-access-tokens-to-authenticate
        string orgUrl = "https://dev.azure.com/<org>";
        string newProjectName = "";
        int wiIdToClone = 0;

        VssConnection connection = new VssConnection(new Uri(orgUrl), new VssBasicCredential(string.Empty, pat));
        var witClient = connection.GetClient<WorkItemTrackingHttpClient>();

        CloneWorkItem(witClient, wiIdToClone, newProjectName, true);
    }

    private static void CloneWorkItem(WorkItemTrackingHttpClient witClient, int wiIdToClone, string NewTeamProject = "", bool CopyLink = false)
    {
        WorkItem wiToClone = (CopyLink)
            ? witClient.GetWorkItemAsync(wiIdToClone, expand: WorkItemExpand.Relations).Result
            : witClient.GetWorkItemAsync(wiIdToClone).Result;

        string teamProjectName = (NewTeamProject != "") ? NewTeamProject : wiToClone.Fields["System.TeamProject"].ToString();
        string wiType = wiToClone.Fields["System.WorkItemType"].ToString();

        JsonPatchDocument patchDocument = new JsonPatchDocument();

        foreach (var key in wiToClone.Fields.Keys) // copy fields
            if (!systemFields.Contains(key) && !customFields.Contains(key))
                if (NewTeamProject == "" ||
                    (NewTeamProject != "" && key != "System.AreaPath" && key != "System.IterationPath")) // do not copy area and iteration into another project
                    patchDocument.Add(new JsonPatchOperation()
                    {
                        Operation = Operation.Add,
                        Path = "/fields/" + key,
                        Value = wiToClone.Fields[key]
                    });

        if (CopyLink) // copy links
            foreach (var link in wiToClone.Relations)
            {
                if (link.Rel != ChildRefStr)
                {
                    patchDocument.Add(new JsonPatchOperation()
                    {
                        Operation = Operation.Add,
                        Path = "/relations/-",
                        Value = new
                        {
                            rel = link.Rel,
                            url = link.Url
                        }
                    });
                }
            }

        WorkItem clonedWi = witClient.CreateWorkItemAsync(patchDocument, teamProjectName, wiType).Result;
        Console.WriteLine("New work item: " + clonedWi.Id);
    }
}
Link to full project: https://github.com/ashamrai/AzureDevOpsExtensions/tree/master/CustomNetTasks/CloneWorkItem
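For reference, the client call above is a wrapper over the REST endpoint linked in the question (a POST against _apis/wit/workitems/${type} with an application/json-patch+json body). Below is a rough raw-HttpClient sketch of the same call; the organization, project, and field values are placeholders, not taken from the answer above:

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class RawCreateExample
{
    static async Task Main()
    {
        string pat = "<pat>";
        string url = "https://dev.azure.com/<org>/<project>/_apis/wit/workitems/$Task?api-version=5.1";

        // JSON Patch body: one "add" operation per field to set on the new work item.
        string patchBody = "[ { \"op\": \"add\", \"path\": \"/fields/System.Title\", \"value\": \"Cloned item\" } ]";

        using (var client = new HttpClient())
        {
            // PAT authentication uses Basic auth with an empty user name.
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
                "Basic", Convert.ToBase64String(Encoding.ASCII.GetBytes(":" + pat)));

            var content = new StringContent(patchBody, Encoding.UTF8, "application/json-patch+json");
            var response = await client.PostAsync(url, content);
            Console.WriteLine(response.StatusCode);
        }
    }
}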

Large File upload to ASP.NET Core 3.0 Web API fails due to Request Body Too Large

I have an ASP.NET Core 3.0 Web API endpoint that I have set up to allow me to post large audio files. I followed these directions from the MS docs to set up the endpoint:
https://learn.microsoft.com/en-us/aspnet/core/mvc/models/file-uploads?view=aspnetcore-3.0#kestrel-maximum-request-body-size
When an audio file is uploaded to the endpoint, it is streamed to an Azure Blob Storage container.
My code works as expected locally.
When I push it to my production server in Azure App Service on Linux, the code does not work and errors with
Unhandled exception in request pipeline: System.Net.Http.HttpRequestException: An error occurred while sending the request. ---> Microsoft.AspNetCore.Server.Kestrel.Core.BadHttpRequestException: Request body too large.
Per advice from the above article, I have incrementally updated the Kestrel configuration with the following:
.ConfigureWebHostDefaults(webBuilder =>
{
    webBuilder.UseKestrel((ctx, options) =>
    {
        var config = ctx.Configuration;

        options.Limits.MaxRequestBodySize = 6000000000;
        options.Limits.MinRequestBodyDataRate =
            new MinDataRate(bytesPerSecond: 100,
                gracePeriod: TimeSpan.FromSeconds(10));
        options.Limits.MinResponseDataRate =
            new MinDataRate(bytesPerSecond: 100,
                gracePeriod: TimeSpan.FromSeconds(10));
        options.Limits.RequestHeadersTimeout =
            TimeSpan.FromMinutes(2);
    }).UseStartup<Startup>();
I also configured FormOptions to accept files up to 6,000,000,000 bytes:
services.Configure<FormOptions>(options =>
{
    options.MultipartBodyLengthLimit = 6000000000;
});
And I also set up the API controller action with the following attributes, per advice from the article:
[HttpPost("audio", Name="UploadAudio")]
[DisableFormValueModelBinding]
[GenerateAntiforgeryTokenCookie]
[RequestSizeLimit(6000000000)]
[RequestFormLimits(MultipartBodyLengthLimit = 6000000000)]
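As a cross-check (this is not from the original post), the body size limit can also be lifted for a single request from inside the action, via Kestrel's IHttpMaxRequestBodySizeFeature, as long as it runs before anything reads the body:

// requires: using Microsoft.AspNetCore.Http.Features;
// Sketch: clear the Kestrel body-size limit for this request only.
// IsReadOnly becomes true once the body has started being read.
var maxRequestBodySizeFeature = HttpContext.Features.Get<IHttpMaxRequestBodySizeFeature>();
if (maxRequestBodySizeFeature != null && !maxRequestBodySizeFeature.IsReadOnly)
{
    maxRequestBodySizeFeature.MaxRequestBodySize = null; // null = unlimited
}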
Finally, here is the action itself. This giant block of code is not indicative of how I want the code to be written but I have merged it into one method as part of the debugging exercise.
public async Task<IActionResult> Audio()
{
    if (!MultipartRequestHelper.IsMultipartContentType(Request.ContentType))
    {
        throw new ArgumentException("The media file could not be processed.");
    }

    string mediaId = string.Empty;
    string instructorId = string.Empty;

    try
    {
        // process file first
        KeyValueAccumulator formAccumulator = new KeyValueAccumulator();
        var streamedFileContent = new byte[0];

        var boundary = MultipartRequestHelper.GetBoundary(
            MediaTypeHeaderValue.Parse(Request.ContentType),
            _defaultFormOptions.MultipartBoundaryLengthLimit
        );
        var reader = new MultipartReader(boundary, Request.Body);

        var section = await reader.ReadNextSectionAsync();
        while (section != null)
        {
            var hasContentDispositionHeader = ContentDispositionHeaderValue.TryParse(
                section.ContentDisposition, out var contentDisposition);

            if (hasContentDispositionHeader)
            {
                if (MultipartRequestHelper
                    .HasFileContentDisposition(contentDisposition))
                {
                    streamedFileContent =
                        await FileHelpers.ProcessStreamedFile(section, contentDisposition,
                            _permittedExtensions, _fileSizeLimit);
                }
                else if (MultipartRequestHelper
                    .HasFormDataContentDisposition(contentDisposition))
                {
                    var key = HeaderUtilities.RemoveQuotes(contentDisposition.Name).Value;
                    var encoding = FileHelpers.GetEncoding(section);

                    if (encoding == null)
                    {
                        return BadRequest($"The request could not be processed: Bad Encoding");
                    }

                    using (var streamReader = new StreamReader(
                        section.Body,
                        encoding,
                        detectEncodingFromByteOrderMarks: true,
                        bufferSize: 1024,
                        leaveOpen: true))
                    {
                        // The value length limit is enforced by
                        // MultipartBodyLengthLimit
                        var value = await streamReader.ReadToEndAsync();

                        if (string.Equals(value, "undefined",
                            StringComparison.OrdinalIgnoreCase))
                        {
                            value = string.Empty;
                        }

                        formAccumulator.Append(key, value);

                        if (formAccumulator.ValueCount >
                            _defaultFormOptions.ValueCountLimit)
                        {
                            return BadRequest($"The request could not be processed: Key Count limit exceeded.");
                        }
                    }
                }
            }

            // Drain any remaining section body that hasn't been consumed and
            // read the headers for the next section.
            section = await reader.ReadNextSectionAsync();
        }

        var form = formAccumulator;
        var file = streamedFileContent;

        var results = form.GetResults();

        instructorId = results["instructorId"];
        string title = results["title"];
        string firstName = results["firstName"];
        string lastName = results["lastName"];
        string durationInMinutes = results["durationInMinutes"];

        //mediaId = await AddInstructorAudioMedia(instructorId, firstName, lastName, title, Convert.ToInt32(duration), DateTime.UtcNow, DateTime.UtcNow, file);

        string fileExtension = "m4a";

        // Generate Container Name - InstructorSpecific
        string containerName = $"{firstName[0].ToString().ToLower()}{lastName.ToLower()}-{instructorId}";

        string contentType = "audio/mp4";
        FileType fileType = FileType.audio;

        string authorName = $"{firstName} {lastName}";
        string authorShortName = $"{firstName[0]}{lastName}";
        string description = $"{authorShortName} - {title}";

        long duration = (Convert.ToInt32(durationInMinutes) * 60000);

        // Generate new filename
        string fileName = $"{firstName[0].ToString().ToLower()}{lastName.ToLower()}-{Guid.NewGuid()}";

        DateTime recordingDate = DateTime.UtcNow;
        DateTime uploadDate = DateTime.UtcNow;

        long blobSize = long.MinValue;

        try
        {
            // Update file properties in storage
            Dictionary<string, string> fileProperties = new Dictionary<string, string>();
            fileProperties.Add("ContentType", contentType);

            // update file metadata in storage
            Dictionary<string, string> metadata = new Dictionary<string, string>();
            metadata.Add("author", authorShortName);
            metadata.Add("title", title);
            metadata.Add("description", description);
            metadata.Add("duration", duration.ToString());
            metadata.Add("recordingDate", recordingDate.ToString());
            metadata.Add("uploadDate", uploadDate.ToString());

            var fileNameWExt = $"{fileName}.{fileExtension}";

            var blobContainer = await _cloudStorageService.CreateBlob(containerName, fileNameWExt, "audio");

            try
            {
                MemoryStream fileContent = new MemoryStream(streamedFileContent);
                fileContent.Position = 0;

                using (fileContent)
                {
                    await blobContainer.UploadFromStreamAsync(fileContent);
                }
            }
            catch (StorageException e)
            {
                if (e.RequestInformation.HttpStatusCode == 403)
                {
                    return BadRequest(e.Message);
                }
                else
                {
                    return BadRequest(e.Message);
                }
            }

            try
            {
                foreach (var key in metadata.Keys.ToList())
                {
                    blobContainer.Metadata.Add(key, metadata[key]);
                }

                await blobContainer.SetMetadataAsync();
            }
            catch (StorageException e)
            {
                return BadRequest(e.Message);
            }

            blobSize = await StorageUtils.GetBlobSize(blobContainer);
        }
        catch (StorageException e)
        {
            return BadRequest(e.Message);
        }

        Media media = Media.Create(string.Empty, instructorId, authorName, fileName, fileType, fileExtension, recordingDate, uploadDate,
            ContentDetails.Create(title, description, duration, blobSize, 0, new List<string>()),
            StateDetails.Create(StatusType.STAGED, DateTime.MinValue, DateTime.UtcNow, DateTime.MaxValue),
            Manifest.Create(new Dictionary<string, string>()));

        // upload to MongoDB
        if (media != null)
        {
            var mapper = new Mapper(_mapperConfiguration);
            var dao = mapper.Map<ContentDAO>(media);

            try
            {
                await _db.Content.InsertOneAsync(dao);
            }
            catch (Exception)
            {
                mediaId = string.Empty;
            }

            mediaId = dao.Id.ToString();
        }
        else
        {
            // metadata wasn't stored, remove blob
            await _cloudStorageService.DeleteBlob(containerName, fileName, "audio");
            return BadRequest($"An issue occurred during media upload: rolling back storage change");
        }

        if (string.IsNullOrEmpty(mediaId))
        {
            return BadRequest($"Could not add instructor media");
        }
    }
    catch (Exception ex)
    {
        return BadRequest(ex.Message);
    }

    var result = new { MediaId = mediaId, InstructorId = instructorId };

    return Ok(result);
}
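A side note on the buffering above (not part of the original code): streamedFileContent holds the whole upload in memory before it is wrapped in a MemoryStream and pushed to blob storage. If true streaming is the goal for multi-gigabyte files, the file branch could pipe the multipart section straight into the blob. This sketch assumes the container and blob names are already known at that point and that the same _cloudStorageService helper is used:

if (MultipartRequestHelper.HasFileContentDisposition(contentDisposition))
{
    // Hypothetical streaming variant: no intermediate byte[] or MemoryStream.
    var blob = await _cloudStorageService.CreateBlob(containerName, fileNameWExt, "audio");
    await blob.UploadFromStreamAsync(section.Body);
}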
I reiterate: this all works great locally. I do not run it in IIS Express; I run it as a console app.
I submit large audio files via my SPA app and Postman and it works perfectly.
I am deploying this code to an Azure App Service on Linux (as a Basic B1).
Since the code works in my local development environment, I am at a loss as to what my next steps are. I have refactored this code a few times, but I suspect that the problem is environment related.
I cannot find anything that mentions the App Service Plan tier as the culprit, so before I go out spending more money I wanted to see if anyone here had encountered this challenge and could provide advice.
UPDATE: I attempted upgrading to a Production App Service Plan to see if there was an undocumented gate for incoming traffic. Upgrading didn't work either.
Thanks in advance.
-A
Currently, as of 11/2019, there is a limitation with Azure App Service for Linux. Its CORS functionality is enabled by default and cannot be disabled, AND it has a file size limitation that doesn't appear to be overridden by any of the published Kestrel configurations. The solution is to move the Web API app to an Azure App Service for Windows, where it works as expected.
I am sure there is some way to get around it if you know the magic combination of configurations, server settings, and CLI commands, but I need to move on with development.

How can I get HttpClient to work with NTLM in a PCL with Android

I have the following code, which works on a Windows 10 phone in a PCL,
but I cannot get the same code to return OK on a Samsung S7 with Android 7.0.
The project is Xamarin.Forms.
The NuGet package version for System.Net.Http is 2.2.29.
I've included the same NuGet package in my UWP (Win10 phone) and Android projects.
I've also tried changing the user to "domain\user", "domain@user", and "user@domain".
var httpClientHandler = new System.Net.Http.HttpClientHandler()
{
    Credentials = credentials.GetCredential(new Uri(location), "NTLM")
};
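If credentials is a NetworkCredential, GetCredential(new Uri(location), "NTLM") simply returns that same credential. An alternative that explicitly ties the NTLM credentials to the endpoint is a CredentialCache; the following is only a sketch with placeholder values, not code from the question:

// Sketch: bind the NTLM credential to the site URI via a CredentialCache.
var credentialCache = new System.Net.CredentialCache();
credentialCache.Add(new Uri("http://apps.mysite.com/"), "NTLM",
    new System.Net.NetworkCredential("user", "pass", "domain"));

var handler = new System.Net.Http.HttpClientHandler
{
    Credentials = credentialCache,
    PreAuthenticate = true
};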
I've also tried an alternative way of setting httpClientHandler.Credentials:
var credentials = new NetworkCredential("user", "pass", "domain");
var location = "http://apps.mysite.com/api#/doit";
var httpClientHandler = new HttpClientHandler
{
    Credentials = credentials
};

using (var httpClient = new HttpClient(httpClientHandler, true))
{
    httpClient.DefaultRequestHeaders.Accept.Clear();
    httpClient.DefaultRequestHeaders.Accept.Add(new System.Net.Http.Headers.MediaTypeWithQualityHeaderValue("application/json"));
    httpClient.DefaultRequestHeaders.Add("X-FORMS_BASED_AUTH_ACCEPTED", "f");

    try
    {
        var httpResponseMessage = await httpClient.GetAsync(location);
        if (httpResponseMessage.StatusCode != System.Net.HttpStatusCode.OK)
        {
            // handle error
        }
        else
        {
            // do something
        }
    }
    catch (Exception)
    {
    }
    finally
    {
    }
}
Another strange thing: when I run this on Android, the code hits the await httpClient.GetAsync(location); call
and then immediately jumps to the finally block.
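Because the empty catch block swallows whatever GetAsync throws, a useful first debugging step (a sketch, not part of the original code) is to log the exception so the actual Android-side failure shows up in the device log:

try
{
    var httpResponseMessage = await httpClient.GetAsync(location);
    System.Diagnostics.Debug.WriteLine($"Status: {httpResponseMessage.StatusCode}");
}
catch (Exception ex)
{
    // Log the real failure instead of swallowing it.
    System.Diagnostics.Debug.WriteLine($"GetAsync failed: {ex}");
}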
I have a simple form with username and password Entry fields, plus an OK button.
All three controls are bound to a view model; the OK button via an ICommand.
This code and the view live in the PCL, which has a reference to Microsoft.Net.Http.
I have Android and Universal Windows Xamarin.Forms builds that consume the PCL.
Android properties: default HttpClient implementation, default SSL/TLS implementation; supported architectures: armeabi, armeabi-v7a, x86.
Android manifest permissions: Camera, Flashlight, and Internet.
private bool calcEnabled = false;
private ICommand okCommand;
private string message = string.Empty;
private string validatingMessage = "Validating!";
private string unauthorizedMessage = "Invalid Credentials!";
private string authenticatedMessage = "Validated";
private bool validating = false;

public ICommand OkCommand => okCommand ?? (okCommand = new Command<object>((o) => clicked(o), (o) => CalcEnabled));

protected async void clicked(object state)
{
    try
    {
        Validating = true;
        Message = validatingMessage;

        var credentials = new System.Net.NetworkCredential(Helpers.Settings.UserName, Helpers.Settings.Password, "www.domain.com");
        var location = "http://apps.wwwoodproducts.com/wwlocator#/information";
        var httpClientHandler = new System.Net.Http.HttpClientHandler()
        {
            Credentials = credentials.GetCredential(new Uri(location), "NTLM")
        };

        using (var httpClient = new System.Net.Http.HttpClient(httpClientHandler))
        {
            httpClient.DefaultRequestHeaders.Accept.Clear();
            httpClient.DefaultRequestHeaders.Accept.Add(new System.Net.Http.Headers.MediaTypeWithQualityHeaderValue("application/json"));
            httpClient.DefaultRequestHeaders.Add("X-FORMS_BASED_AUTH_ACCEPTED", "f");

            try
            {
                var httpResponseMessage = await httpClient.GetAsync(location);
                if (httpResponseMessage.StatusCode != System.Net.HttpStatusCode.OK)
                {
                    Message = unauthorizedMessage;
                }
                else
                {
                    Message = authenticatedMessage;
                    Messenger.Default.Send<bool>(true);
                }
            }
            catch (Exception)
            {
                Message = unauthorizedMessage;
            }
            finally
            {
                Validating = false;
            }
        }
    }
    catch (Exception)
    {
        throw;
    }
}

Using Neo4jUserManager

I'm new to Neo4j and I'm using Neo4j.AspNet.Identity for authentication and authorization, with Neo4jUserManager and Neo4jUserStore. But when I run the application, the Neo4jUserManager Create method throws a NullReferenceException and I don't know why. Can anybody help me?
Below is my code:
public class Neo4jUserManager : UserManager<ApplicationUser>
{
    public Neo4jUserManager(IUserStore<ApplicationUser> store)
        : base(store)
    {
    }

    public async Task SetLastLogin()
    {
        // Store.FindByIdAsync()
    }

    public static Neo4jUserManager Create(IdentityFactoryOptions<Neo4jUserManager> options, IOwinContext context)
    {
        var graphClientWrapper = context.Get<GraphClientWrapper>();
        var manager = new Neo4jUserManager(new Neo4jUserStore<ApplicationUser>(graphClientWrapper.GraphClient));

        // Configure validation logic for usernames
        // manager.UserValidator = new UserValidator<Neo4jUser>(manager)
        // {
        //     AllowOnlyAlphanumericUserNames = false,
        //     RequireUniqueEmail = true
        // };

        manager.PasswordValidator = new PasswordValidator
        {
            RequiredLength = 6,
            RequireNonLetterOrDigit = true,
            RequireDigit = true,
            RequireLowercase = true,
            RequireUppercase = true
        };

        // Configure user lockout defaults
        manager.UserLockoutEnabledByDefault = false;
        manager.DefaultAccountLockoutTimeSpan = TimeSpan.FromMinutes(5);
        manager.MaxFailedAccessAttemptsBeforeLockout = 5;

        // Register two factor authentication providers. This application uses Phone and Emails as a step of receiving a code for verifying the user
        // You can write your own provider and plug it in here.
        // manager.RegisterTwoFactorProvider("Phone Code", new PhoneNumberTokenProvider<Neo4jUser>
        // {
        //     MessageFormat = "Your security code is {0}"
        // });
        // manager.RegisterTwoFactorProvider("Email Code", new EmailTokenProvider<Neo4jUser>
        // {
        //     Subject = "Security Code",
        //     BodyFormat = "Your security code is {0}"
        // });

        // manager.EmailService = new EmailService();
        // manager.SmsService = new SmsService();

        var dataProtectionProvider = options.DataProtectionProvider;
        if (dataProtectionProvider != null)
        {
            manager.UserTokenProvider =
                new DataProtectorTokenProvider<ApplicationUser>(dataProtectionProvider.Create("ASP.NET Identity"));
        }

        return manager;
    }
}
To set up the graphClientWrapper, you can do this:
var gc = new GraphClient(new Uri("http://localhost.:7474/db/data"));
gc.Connect();
var graphClientWrapper = new GraphClientWrapper(gc);
var manager = new Neo4jUserManager(new Neo4jUserStore<ApplicationUser>(graphClientWrapper.GraphClient));
Also, as a side note, I would read up on how to use OWIN. It should have been obvious to you that you're trying to pull from an IOwinContext object, and that if you're getting null you should investigate how to set that up. I imagine less than 5 minutes on Google would have helped you. Or just looking at what calls that method would have shown you how Entity Framework was being set up in the template you're modifying.
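For completeness, here is a sketch of the OWIN wiring the answer is alluding to, using the standard Microsoft.AspNet.Identity.Owin CreatePerOwinContext pattern. The Startup shape below is an assumption (it mirrors the Entity Framework template), not code from the Neo4j.AspNet.Identity package; the point is that context.Get<GraphClientWrapper>() only returns something non-null if the wrapper was registered at startup:

public partial class Startup
{
    public void ConfigureAuth(IAppBuilder app)
    {
        var gc = new GraphClient(new Uri("http://localhost:7474/db/data"));
        gc.Connect();
        var graphClientWrapper = new GraphClientWrapper(gc);

        // Make the wrapper resolvable from the OWIN context, so that
        // context.Get<GraphClientWrapper>() inside Neo4jUserManager.Create is not null.
        app.CreatePerOwinContext(() => graphClientWrapper);

        // Register the manager factory; OWIN supplies the IdentityFactoryOptions
        // and IOwinContext arguments to Neo4jUserManager.Create per request.
        app.CreatePerOwinContext<Neo4jUserManager>(Neo4jUserManager.Create);
    }
}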

BreezeJS - issues with export and import

I am trying to implement the "Sandbox Editor" pattern:
function getClonedWorkItem(entityType, entityId) {
    var clonedManager = entityManager.createEmptyCopy();
    var oentity = entityManager.getEntityByKey(_mapEntityTypeToSingleType(entityType), entityId);

    // export it to the new manager
    var exportData = entityManager.exportEntities([oentity], true);
    clonedManager.importEntities(exportData);

    var cloned = clonedManager.getEntityByKey(_mapEntityTypeToSingleType(entityType), entityId);

    return {
        entity: cloned,
        __context: clonedManager
    };
}
The problem is that the cloned item doesn't have the extraMetadata under the aspect, so I cannot send it to the server for an update, as it fails in:
function updateDeleteMergeRequest(request, aspect, prefix) {
    var extraMetadata = aspect.extraMetadata;
    var uri = extraMetadata.uri || extraMetadata.id;
    if (__stringStartsWith(uri, prefix)) {
        uri = uri.substring(prefix.length);
    }
    request.requestUri = uri;
    if (extraMetadata.etag) {
        request.headers["If-Match"] = extraMetadata.etag;
    }
}
