Error: More than one package matched with specified pattern: please restrain the search pattern

As part of a requirement I am working on a YAML pipeline for a UI application that needs to be deployed to IIS on a VM. The IISWebAppDeploymentOnMachineGroup@0 task fails with the following error:
The error is: More than one package matched with specified pattern: please restrain the search pattern
Could someone please help me with this?
Here is my code:
- task: IISWebAppDeploymentOnMachineGroup@0
  displayName: 'IIS web app deploy'
  inputs:
    WebsiteName: '${{ parameters.WebsiteName }}'
    Package: '${{ parameters.Package }}'

# IIS web package parameter
- name: Package
  displayName: IIS Deploy Package
  type: string
  default: '$(Agent.BuildDirectory)\BuildArtifact*.zip'
I tried replacing the pattern with buildartifacts***.zip, but that didn't work as expected. Please let me know what needs to be changed in my code.
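One possible direction, as a sketch only: the wildcard BuildArtifact*.zip can match more than one .zip under $(Agent.BuildDirectory), so the parameter default needs to resolve to exactly one package. For example (the artifact folder and file name below are assumptions, not taken from the question):

- name: Package
  displayName: IIS Deploy Package
  type: string
  # Assumed artifact folder and file name; point this at the single .zip your build publishes
  default: '$(Agent.BuildDirectory)\BuildArtifact\WebApp.zip'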

Related

Jenkins Xray Integration - Jira Issue Type with wrong character

In my Jenkins pipeline I have a Jira/Xray integration step:
step([$class: 'XrayImportBuilder',
      endpointName: '/xunit',
      fixVersion: '1.0',
      importFilePath: '/MyFirstUnitTests/TestResults.xml',
      importToSameExecution: 'true',
      testExecKey: 'TSTLKS-753',
      serverInstance: '9146a388-e399-4e55-be28-8c65404d6f9d',
      credentialId: '75287529-134d-4s91-9964-7h740d8d2i63'])
Currently I'm getting the following error:
ERROR: Unable to confirm Result of the upload..... Upload Failed!
Status:400 Response:{"error":"Issue with key
\u0027TSTLKS-753\u0027 does not exist or is not of type Test
Execution."}
But my issue (TSTLKS-753) is of type "Test Execution".
It appears that the string "\u0027" is being added both as a prefix and as a suffix to my issue key when the pipeline runs.
I've looked this string up: \u0027 is the Unicode escape for a single quote (apostrophe).
I tried replacing it with double quotes, but I end up with the same error. I also tried removing the quotes.
In any case, if someone has already run into this error, please let me know. Thank you very much.
Can you confirm that the user you have configured in Jenkins for the Xray instance has access to the Jira project that contains your Test Execution issue?
Can you try to import it without specifying the testExecKey field, with importToSameExecution: 'false', and with the projectKey field set to something like projectKey: 'TSTLKS' (see the sketch below)?
If this last option returns an error (e.g. "project does not exist"), then it's for sure a permission issue, so you'll either need to use a different Jira user/password or fix the permissions on the Jira side.
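As a sketch only, reusing the endpoint, file path, and server/credential IDs from the question, that variant of the step would look like this:

step([$class: 'XrayImportBuilder',
      endpointName: '/xunit',
      importFilePath: '/MyFirstUnitTests/TestResults.xml',
      importToSameExecution: 'false',
      projectKey: 'TSTLKS',   // let Xray create the Test Execution in this project
      serverInstance: '9146a388-e399-4e55-be28-8c65404d6f9d',
      credentialId: '75287529-134d-4s91-9964-7h740d8d2i63'])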

Import Python functions into Serverless

I've got an issue with the following Serverless config.
This is my handler and the files/folders structure.
The issue is that after uploading my project to AWS, when I test my Lambda I get the following error:
Lambda execution fails: "errorMessage": "Unable to import module 'app_monitor': No module named 'monitoring'"
{
  "errorMessage": "Unable to import module 'src/app_monitor': No module named 'monitoring'",
  "errorType": "Runtime.ImportModuleError",
  "requestId": "bca3f67d-815f-452b-a2a6-c713ad2c6baa",
  "stackTrace": []
}
Have you got any clue how I can add this to the Serverless config?
First, a quick tip on troubleshooting: When I ran into such issues it was helpful to go to the AWS console, look at the lambda function, and see what the uploaded file structure looks like on that end. Is the monitoring folder there?
Moreover, in order to specify how a specific function is packaged, you have to explicitly state that you want it to be individually packaged and not follow the general rules of the project as a whole.
You should try to add:
app_monitoring:
  package:
    individually: true
    patterns:
      - 'src/**'
More documentation on packaging configuration here
You may also have better luck explicitly stating the patterns you need; I know I've had issues with globs in the past. So, for example, you can try:
patterns:
  - 'src/app_monitoring.py'
  - 'src/monitoring/get_lb.py'
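Putting the pieces together, a minimal serverless.yml sketch might look like the following (the service name, runtime, and handler function name are assumptions; only the src/app_monitor module path comes from the error message):

service: app-monitor            # assumed service name
frameworkVersion: '3'

provider:
  name: aws
  runtime: python3.9            # assumed runtime

functions:
  app_monitoring:
    handler: src/app_monitor.handler   # module path taken from the error; function name assumed
    package:
      individually: true
      patterns:
        - 'src/**'                     # ship the whole src tree so src/monitoring/ is included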

XCUITest pre-testing setup

I need to do a one-time pre-test setup before I run my XCUITest (automation test) cases.
Example of pre-test setup (this needs to be done once per test cycle; the output of the APIs below is used in all test cases):
Fetching the qTest access token
Fetching the qTest URLs from a remote config file.
From the docs I found that the testBundleWillStart method of the XCTestObservation protocol is the ideal place to do pre-test setup.
However, the testBundleWillStart method is not getting called or executed, while all of the XCTestObservation methods listed below are executed correctly:
testSuiteWillStart
testCaseWillStart
testCaseDidFinish
testSuiteDidFinish
I tried setting the Principal Class in the UITest Info.plist but no luck; it shows the following error:
Test bundle Info.plist at /var/containers/Bundle/Application/BEAFB0C2-43D5-4C90-B50F-B1FF1A16BC23/MyAppUITests-Runner.app/PlugIns/MyAppUITests.xctest/Info.plist specified MyAppUItests.TestObserver for NSPrincipalClass, but no class matching that name was found.
How can I get the method testBundleWillStart executed?
Any help would be appreciated.
Found the issue: my project name had a space, so replacing the space with _ worked.
Info about my project:
My project name: My App
My project test bundle name: My-AppUITests
Working Solution: (Replacing space with _ character)
<key>NSPrincipalClass</key>
<string>My_AppUITests.SOTestObserver</string>
Let me also share the things that seem right but still don't work:
Non-working #1: (Replacing space with - character)
<key>NSPrincipalClass</key>
<string>My-AppUITests.SOTestObserver</string> <!-- didn't work -->
Non-working #2: (Replacing project name with $(PRODUCT_NAME) environment variable)
<key>NSPrincipalClass</key>
<string>$(PRODUCT_NAME).SOTestObserver</string> <!-- didn't work -->
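For reference, a minimal sketch of what the principal-class observer itself might look like (the method body is an assumption; only the SOTestObserver name and the XCTestObservation callbacks come from the question and answer):

import XCTest

// Principal class: XCTest instantiates it once when the test bundle loads,
// which is early enough to register for testBundleWillStart.
final class SOTestObserver: NSObject, XCTestObservation {
    override init() {
        super.init()
        XCTestObservationCenter.shared.addObserver(self)
    }

    func testBundleWillStart(_ testBundle: Bundle) {
        // One-time pre-test setup, e.g. fetching the qTest access token
        // and the qTest URLs from the remote config (placeholder for real calls).
    }
}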

Image constraints with dataflow workers' boot image

I tried setting up a Dataflow streaming job using the "Pub/Sub topic to BigQuery" template. My org has an image constraint policy in place. According to the documentation for image constraints (https://cloud.google.com/compute/docs/images/restricting-image-access#limitations), any image used by a GCP service should not be affected by these constraints. However, the Dataflow workers fail to launch, citing image constraints as the reason. What is the correct way to set image constraints in such a scenario?
This is what the error looked like:
{
  insertId: "qnh47fd17tx"
  labels: {
    dataflow.googleapis.com/job_id: "job_id"
    dataflow.googleapis.com/job_name: "job_name"
    dataflow.googleapis.com/region: "us-central1"
  }
  logName: "projects/app/logs/dataflow.googleapis.com%2Fjob-message"
  receiveTimestamp: ""
  resource: {
    labels: {
      job_id: ""
      job_name: ""
      project_id: ""
      region: "us-central1"
      step_id: ""
    }
    type: "dataflow_step"
  }
  severity: "ERROR"
  textPayload: "Workflow failed. Causes: Step "setup_resource_disks_harness50" failed., Step setup_resource_disks_harness50: Set up of resource disks_harness failed, Unable to create data disk(s)., Unknown error in operation 'operation-1600084247324-5af44a52c2574-7f195f5c-376e0b61': [CONDITION_NOT_MET] 'Constraint constraints/compute.trustedImageProjects violated for project getmega-app. Use of images from project dataflow-service-producer-prod is prohibited.'."
  timestamp: ""
}
Since your project is using image constraints, you also have a trusted image policy configured, so only images sourced from the projects listed in that policy are allowed to start VMs across your organisation.
However, services such as Google Cloud Dataflow and Datalab use images from other Google projects to create VMs within your VPC, which means that you may encounter an error when launching a Dataflow templated job. This can easily be overcome by adding a few projects to your trusted image projects, as follows.
Using gcloud:
1 - Get the existing policy for your project:
gcloud beta resource-manager org-policies describe \
compute.trustedImageProjects --effective \
--project [PROJECT_ID] > policy.yaml
2 - Open the policy.yaml file in a text editor. You should see a file as below:
constraint: constraints/compute.trustedImageProjects
listPolicy:
  allowedValues:
    - projects/debian-cloud
    - projects/cos-cloud
  deniedValues:
    - projects/unwanted-images
3 - Modify the compute.trustedImageProjects constraint by adding the following projects:
projects/cos-cloud
projects/dataflow-service-producer-prod
projects/serverless-vpc-access-images
projects/windows-cloud
Notice that I have added all the projects that Google services may use to retrieve images and launch services. In your specific case, just adding projects/dataflow-service-producer-prod would be enough.
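For example, after the edit the policy.yaml would look like this (building on the sample file shown in step 2):

constraint: constraints/compute.trustedImageProjects
listPolicy:
  allowedValues:
    - projects/debian-cloud
    - projects/cos-cloud
    - projects/dataflow-service-producer-prod
    - projects/serverless-vpc-access-images
    - projects/windows-cloud
  deniedValues:
    - projects/unwanted-images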
4 - Apply the policy.yaml file to your project.
gcloud beta resource-manager org-policies set-policy \
--project [PROJECT_ID] policy.yaml
After performing these actions, you will be able to launch your templated Dataflow job. Lastly, you can use the Console to add the projects specified in step 3, as described in the documentation.
Note: be careful when sharing logs that may contain sensitive information such as project IDs or job IDs. This information should not be disclosed in public.

F#: unable to use Suave.Types for build script

Trying to use Tomas Petricek's build script: https://github.com/tpetricek/suave-xplat-gettingstarted/blob/master/build.fsx
I am getting an error that the namespace Types is not defined for open Suave.Types.
Is this some namespace that has since been deprecated?
I am not sure if that is an old version of the build script, but I am using the same build script myself and had to make a couple of changes: namely, remove the logging, use
open Suave.Http
instead of
open Suave.Types
and use these bindings:
bindings = [ HttpBinding.create Protocol.HTTP Net.IPAddress.Loopback 8083us ]
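For context, a minimal sketch of a server configuration that uses this binding (the startWebServer call and the OK response are assumptions, not part of the original build script):

open System
open Suave
open Suave.Http

// Minimal sketch: run Suave on the loopback interface, port 8083,
// using the current binding API instead of the removed Suave.Types.
let config =
    { defaultConfig with
        bindings = [ HttpBinding.create Protocol.HTTP Net.IPAddress.Loopback 8083us ] }

startWebServer config (Successful.OK "Hello from Suave")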
