Automating Hudson builds with Ant throwing 403

We have a Hudson server which deploys builds. We have a few services which we want to be able to remotely tell Hudson to deploy a certain build; these services are using Ant. So I'm trying to get it working, but I keep getting a 403 response when giving a build number like so:
<ac:post to="http://hostname:8080/hudson/job/test_release_indexes/build?"
         verbose="true" wantresponse="true">
    <prop name="token" value="indexes"/>
    <prop name="BUILDNUMBER" value="0354"/>
</ac:post>
This throws the 403. I've also tried passing it props for the username and password like so:
<ac:post to="http://srulesre2:8080/hudson/job/test_dartmouth_indexes/build?"
         verbose="true" wantresponse="true">
    <prop name="token" value="indexes"/>
    <prop name="BUILDNUMBER" value="0354"/>
    <prop name="username" value="test"/>
    <prop name="password" value="test"/>
</ac:post>
I've tried a hundred different variations on username and password, like j_username and j_password or user and pass, but nothing is working; I keep getting the same 403. The username and password are valid, because I can manually log in with admin privileges. Any ideas would be great.
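For reference, here is a minimal sketch of the same trigger as a plain HTTP request, assuming the job is parameterized and the instance accepts HTTP Basic authentication (on later Hudson/Jenkins versions a parameterized job is triggered via /buildWithParameters rather than /build; the exact endpoint and authentication scheme on your Hudson version may differ):
# Hedged sketch, not from the original post: trigger the build with curl,
# sending the credentials via HTTP Basic auth and the parameters in the query string.
curl -u test:test -X POST \
  "http://hostname:8080/hudson/job/test_release_indexes/buildWithParameters?token=indexes&BUILDNUMBER=0354"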

Can you do a view source on the Hudson login screen to see what fields the form takes? I don't have a running instance myself.

Related

For keycloak-19.0.3-legacy docker image, how can I enable redirect_uri legacy functionality?

I need to set the flags that enable the default (legacy) redirect_uri behavior for Keycloak 19.0.3-legacy.
However, nothing I've tried so far has worked.
We're using the standalone-ha.xml configuration file (not sure if this is the right place to configure this).
I need to set the following flags:
spi-login-protocol-openid-connect-suppress-logout-confirmation-screen=true
spi-login-protocol-openid-connect-legacy-logout-redirect-uri=true
https://www.keycloak.org/docs/19.0.3/upgrading/#openid-connect-logout-prompt
https://www.keycloak.org/docs/latest/upgrading/#openid-connect-logout
However, I run a standalone instance and don't run using kc.sh.
I've tried setting environment variables without success:
KC_SPI_LOGIN_PROTOCOL_OPENID_CONNECT_LEGACY_LOGOUT_REDIRECT_URI=true
KC_SPI_LOGIN_PROTOCOL_OPENID_CONNECT_SUPPRESS_LOGOUT_CONFIRMATION_SCREEN=true
and
KEYCLOAK_SPI_LOGIN_PROTOCOL_OPENID_CONNECT_LEGACY_LOGOUT_REDIRECT_URI=true
KEYCLOAK_SPI_LOGIN_PROTOCOL_OPENID_CONNECT_SUPPRESS_LOGOUT_CONFIRMATION_SCREEN=true
and
SPI_LOGIN_PROTOCOL_OPENID_CONNECT_LEGACY_LOGOUT_REDIRECT_URI=true
SPI_LOGIN_PROTOCOL_OPENID_CONNECT_SUPPRESS_LOGOUT_CONFIRMATION_SCREEN=true
and
LEGACY_LOGOUT_REDIRECT_URI=true
SUPPRESS_LOGOUT_CONFIRMATION_SCREEN=true
I've also tried adding it to a config file, but it doesn't seem to have been picked up from where it was put in the Dockerfile.
Dockerfile:
COPY conf.d/keycloak.conf /opt/jboss/keycloak/conf/keycloak.conf
and
COPY conf.d/keycloak.conf /opt/keycloak/conf/keycloak.conf
keycloak.conf
spi-login-protocol-openid-connect-suppress-logout-confirmation-screen=true
spi-login-protocol-openid-connect-legacy-logout-redirect-uri=true
and
suppress-logout-confirmation-screen=true
legacy-logout-redirect-uri=true
I also tried adding it to the docker-entrypoint.sh parameters:
exec /opt/jboss/tools/docker-entrypoint.sh $# -Dspi-login-protocol-openid-connect-suppress-logout-confirmation-screen=true -Dspi-login-protocol-openid-connect-legacy-logout-redirect-uri=true
and
(This one won't even start up; it fails stating that the parameters are invalid.)
exec /opt/jboss/tools/docker-entrypoint.sh $# --spi-login-protocol-openid-connect-suppress-logout-confirmation-screen=true --spi-login-protocol-openid-connect-legacy-logout-redirect-uri=true
Update 1/24/23
Tried updating standalone-ha.xml, but it seems to have been ignored:
<subsystem xmlns="urn:jboss:domain:keycloak-server:1.1">
    <web-context>auth</web-context>
    <providers>
        <provider>
            classpath:${jboss.home.dir}/providers/*
        </provider>
        <provider>
            module:org.keycloak.storage.ldap.LDAPSyncOnly
        </provider>
    </providers>
    <master-realm-name>master</master-realm-name>
    <scheduled-task-interval>900</scheduled-task-interval>
    <theme>
        <staticMaxAge>2592000</staticMaxAge>
        <cacheThemes>false</cacheThemes>
        <cacheTemplates>false</cacheTemplates>
        <welcomeTheme>${env.KEYCLOAK_WELCOME_THEME:keycloak}</welcomeTheme>
        <default>${env.KEYCLOAK_DEFAULT_THEME:keycloak}</default>
        <dir>${jboss.home.dir}/themes</dir>
    </theme>
    <!-- ... Bunch of other spi tags ... -->
    <spi name="login-protocol">
        <provider name="openid-connect" enabled="true">
            <properties>
                <property name="suppress-logout-confirmation-screen" value="true"/>
                <property name="legacy-logout-redirect-uri" value="true"/>
            </properties>
        </provider>
    </spi>
</subsystem>
Useful links:
https://github.com/keycloak/keycloak/blob/10b7475b0431ed380d45b840578bc666ecb3263a/services/src/main/java/org/keycloak/protocol/oidc/OIDCLoginProtocolFactory.java#L106-L121
Shows the warning message that will print to the logs if this is set correctly.
https://www.keycloak.org/server/configuration#_example_configuring_the_db_url_host_parameter
Shows alternate ways to configure keycloak.
https://github.com/keycloak/keycloak-containers/tree/19.0.3
https://quay.io/repository/keycloak/keycloak?tab=tags
We figured it out.
By adding the following CLI commands, we can properly update the high-availability config file to enable the legacy flags.
embed-server --server-config=standalone-ha.xml --std-out=echo
/subsystem=keycloak-server/spi=login-protocol:add
/subsystem=keycloak-server/spi=login-protocol/provider=openid-connect:add(enabled=true)
/subsystem=keycloak-server/spi=login-protocol/provider=openid-connect:write-attribute(name=properties.legacy-logout-redirect-uri,value=true)
/subsystem=keycloak-server/spi=login-protocol/provider=openid-connect:write-attribute(name=properties.suppress-logout-confirmation-screen,value=true)
stop-embedded-server
I don't know why this worked, but manually editing the standalone-ha.xml config didn't.
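For anyone applying this inside the 19.0.3-legacy image: one way to wire the CLI commands above into the container is a startup CLI script, assuming the WildFly-based image still executes *.cli files placed in /opt/jboss/startup-scripts at startup (the file name below is illustrative, and the .cli file contains exactly the embed-server commands shown above):
# Dockerfile (sketch): the script is applied to standalone-ha.xml when the container starts
COPY conf.d/legacy-logout.cli /opt/jboss/startup-scripts/legacy-logout.cli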

ANT-Task for FlexUnit for AIR Mobile project in Jenkins

I am building a mobile application with Flex 4.11.0 and AIR 4.0. My IDE is Flash Builder 4.7. I wrote a lot of unit tests, some of them using AIR features such as file system access.
I am trying to integrate the project into a CI job on Jenkins. I have an Ant script doing the following:
Compiling
Packaging for Android
Packaging for iOS
Generating ASDOC
What I want now is to write an ANT-Task to launch my unit tests and generate a report in XML or HTML which can be parsed by Jenkins afterwards.
I have tried the following:
- Followed the tutorial on http://tutorials.digitalprimates.net/flexunit/Unit-16.html and got the example to work. However, this is a Flash project and not an AIR project!
- Read the documentation on https://cwiki.apache.org/confluence/display/FLEX/FlexUnit+Ant+Task, downloaded and built the FlexUnit code from git@github.com:flexunit/flexunit.git to get the FlexUnit4AIRCIListener.swc
- Read a LOT of information on the internet from all over the place without finding an answer (I did find some hints, but a lot of the information is outdated or references dead links)
What I have so far is the following:
<taskdef resource="flexUnitTasks.tasks" classpath="${basedir}\libs\flexUnitTasks-4.1.0.jar" />
<target name="test">
    <echo>Testing...</echo>
    <echo>==========</echo>

    <!-- 1. Compile FlexUnit-Application -->
    <mxmlc file="${PROJECT.src}\FlexUnit.mxml" output="FlexUnit.swf">
        <load-config filename="D:\tools\sdk\flex\4.11.0_AIR4.0\frameworks\air-config.xml" append="true" />
        <source-path path-element="${PROJECT.src}" />
        <source-path path-element="${basedir}\test" />
        <library-path dir="${PROJECT.libs}" append="true">
            <include name="**/*.swc" />
            <include name="**/*.ane" />
        </library-path>
        <library-path dir="D:\tools\sdk\flex\4.11.0_AIR4.0\frameworks\libs\air" append="true">
            <include name="airglobal.swc" />
        </library-path>
        <compiler.verbose-stacktraces>true</compiler.verbose-stacktraces>
        <compiler.headless-server>true</compiler.headless-server>
    </mxmlc>

    <!-- 2. Run the compiled SWF -->
    <flexunit swf="FlexUnit.swf"
              player="air"
              timeout="180000"
              toDir="${OUTPUT.root}\flexUnit"
              haltonfailure="false"
              verbose="true"
              localTrusted="true"
    />

    <!-- 3. Generate readable JUnit-style reports -->
    <junitreport todir="${OUTPUT.root}\flexUnit">
        <fileset dir="${OUTPUT.root}\flexUnit">
            <include name="TEST-*.xml" />
        </fileset>
        <report format="frames" todir="${OUTPUT.root}\flexUnit\html" />
    </junitreport>
</target>
Here are the relevant parts of my FlexUnit.mxml-Application:
protected function onCreationComplete(event:FlexEvent):void
{
    core = new FlexUnitCore();
    core.addListener(new AirCIListener());
    core.run(currentRunTestSuite());
}

public function currentRunTestSuite():Array
{
    var testsToRun:Array = new Array();
    testsToRun.push(test.suites.CLXSatelliteTestSuite);
    return testsToRun;
}
Step 1 of the Ant target works (at least I get the FlexUnit.swf). However, launching the SWF in the <flexunit> task fails:
VerifyError: Error #1014: Class flash.filesystem::File could not be found.
Console output:
[flexunit] Generating default values ...
[flexunit] Using default working dir [D:\workspaces\flex\projects\clx-satellite]
[flexunit] Using the following settings for the test run:
[flexunit] FLEX_HOME: [D:\tools\sdk\flex\4.11.0_AIR4.0]
[flexunit] haltonfailure: [false]
[flexunit] headless: [false]
[flexunit] display: [99]
[flexunit] localTrusted: [true]
[flexunit] player: [flash]
[flexunit] port: [1024]
[flexunit] swf: [D:\workspaces\flex\projects\clx-satellite\FlexUnit.swf]
[flexunit] timeout: [180000ms]
[flexunit] toDir: [D:\workspaces\flex\projects\clx-satellite\deploy\flexUnit]
[flexunit] Setting up server process ...
[flexunit] Starting server ...
[flexunit] Opening server socket on port [1024].
[flexunit] Waiting for client connection ...
[flexunit] OS: [Windows]
[flexunit] Launching player:
[flexunit] Executing 'rundll32' with arguments:
[flexunit] 'url.dll,FileProtocolHandler'
[flexunit] 'D:\workspaces\flex\projects\clx-satellite\FlexUnit.swf'
[flexunit] The ' characters around the executable and arguments are
[flexunit] not part of the command.
[flexunit] Client connected.
[flexunit] Setting inbound buffer size to [262144] bytes.
[flexunit] Receiving data ...
[flexunit] Sending acknowledgement to player to start sending test data ...
[flexunit]
[flexunit] Stopping server ...
[flexunit] End of test data reached, sending acknowledgement to player ...
BUILD FAILED
D:\workspaces\flex\projects\clx-satellite\build.xml:148:
java.util.concurrent.ExecutionException: could not close client/server socket
When I include only a single test which does not use the File class, I still get a similar error (ReferenceError: Error #1065: Variable flash.desktop::NativeApplication is not defined.), but at least the tests run through and I get XML output. It seems to me like FlexUnit is not really compatible with AIR, although I use player="air" in the task.
Does anybody have a working example of running unit tests with FlexUnit for an AIR application (possibly a mobile application) through Ant?
Never mind, I figured it out myself and blogged about it on my personal blog:
http://www.tiefenauer.info/ci-for-flex-mobile-applications-part-3-performing-unit-tests/
I described a whole CI process there, just in case anyone has the same problem.
There is an Apache FlexUnit feature request for this here: Apache FlexUnit: FLEX-35090
Or you can get this feature by compiling your own FlexUnit Ant task from this fork of FlexUnit 4.1: the additionalCompilerOptions branch
The syntax supported by the custom FlexUnit Ant task is as follows:
<flexunit
    workingDir="${bin.loc}"
    toDir="${report.loc}"
    haltonfailure="false"
    verbose="true"
    localTrusted="true" >
    <!-- only supported with custom FlexUnit Ant tasks -->
    <additionalCompilerOption option="-define+=MY_CONST::foo,'BAR'" />
</flexunit>
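As a usage note, a constant defined through such a compiler option can be consumed with conditional compilation in the test application, for example to pull AIR-only suites into the run only when the constant is set. A minimal sketch, assuming a boolean constant such as -define+=CONFIG::airTests,true is passed instead of the string example above (the constant and suite names are illustrative, not from the answer):
// inside currentRunTestSuite(): this block is only compiled in when the
// compiler constant CONFIG::airTests is defined as true
CONFIG::airTests
{
    testsToRun.push(test.suites.FileSystemTestSuite); // illustrative AIR-only suite
}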

Consuming Web Service using 2 Way SSL using Orbeon client code

We are trying to consume a web service from Orbeon client code. Everything works fine with one-way SSL; however, we now wish to call the web service using two-way SSL. We are already able to call the web service over two-way SSL successfully from Java code using the Apache CXF framework.
I followed the steps outlined in the Orbeon Wiki.
Changes made in properties-local.xml
<property as="xs:anyURI"
          name="oxf.http.ssl.keystore.uri"
          value="/apps/property/ClientStore.jks"/>
<property as="xs:string"
          name="oxf.http.ssl.keystore.password"
          value="password"/>
<property as="xs:anyURI"
          name="oxf.url-rewriting.service.base-uri"
          value="http://localhost:8085/Orbeon"/>
<property as="xs:anyURI"
          name="oxf.fr.persistence.exist.uri"
          value="http://localhost:8085/fr/service/exist"/>
<property as="xs:anyURI"
          name="oxf.fr.persistence.exist.exist-uri"
          value="http://localhost:8085/exist/rest/db/orbeon/fr"/>
After implementing the changes outlined above we are getting the exception below:
ERROR XFormsServer - xforms-submit-error - setting throwable {throwable:
"javax.net.ssl.SSLPeerUnverifiedException: peer not authenticated
at com.sun.net.ssl.internal.ssl.SSLSessionImpl.getPeerCertificates(Unknown Source)
at org.apache.http.conn.ssl.AbstractVerifier.verify(AbstractVerifier.java:128)
at org.apache.http.conn.ssl.SSLSocketFactory.connectSocket(SSLSocketFactory.java:390)
at org.apache.http.conn.ssl.SSLSocketFactory.connectSocket(SSLSocketFactory.java:488)
at org.apache.http.conn.scheme.SchemeSocketFactoryAdaptor.connectSocket(SchemeSocketFactoryAdaptor.java:62)
at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:148
Java startup options are:
-Djavax.net.ssl.trustStorePassword=password
-Djavax.net.ssl.keyStore=/apps/property/DMClientStore.jks
-Djavax.net.ssl.keyStorePassword=password
-Djavax.net.ssl.trustStore=/apps/property/trustkeystore.jks
Questions:
Are these properties sufficient for enabling two-way SSL?
For Apache CXF we need to provide two keystores: one with the client certificate, and a truststore. Where do we configure both of these keystores for Orbeon?
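For reference, "SSLPeerUnverifiedException: peer not authenticated" generally means the remote server's certificate chain is not trusted by the client, so the service's certificate (or its issuing CA) needs to be present in the truststore referenced by -Djavax.net.ssl.trustStore, in addition to the client keystore configured above. A minimal sketch of importing it with keytool (the alias and certificate file name are illustrative):
# import the web service's certificate (or its CA) into the configured truststore
keytool -importcert -alias service-cert -file service.crt \
  -keystore /apps/property/trustkeystore.jks -storepass password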

Spring Security SAML - Failed to verify signature

I'm using the Spring Security SAML 2.0 sample webapp on Tomcat 7 and have modified it to try to get it to authenticate against a Ping Identity service. The webapp is talking to the service and it's sending back an assertion, but it's failing when trying to verify the signature, as shown by the debug output below:
- Attempting to verify signature and establish trust using KeyInfo-derived credentials
- Signature contained no KeyInfo element, could not resolve verification credentials
- Failed to verify signature and/or establish trust using any KeyInfo-derived credentials
- Attempting to verify signature using trusted credentials
- Failed to verify signature using either KeyInfo-derived or directly trusted credentials
- Validation of received assertion failed, assertion will be skipped
org.opensaml.xml.validation.ValidationException: Signature is not trusted or invalid
I understand that it's not able to verify the signature, and I have been given a certificate by the Ping Identity admins to use, but I'm unsure of how to include it in the application. I've tried adding it to the JKS (keystore) that comes with the sample application using the JDK's keytool program, but it can't seem to find it in there. I've also tried adding it to the service provider's metadata xml file like this:
<md:KeyDescriptor use="signing">
    <ds:KeyInfo xmlns:ds="http://www.w3.org/2000/09/xmldsig#">
        <ds:X509Data>
            <ds:X509Certificate>
                [Certificate is here...]
            </ds:X509Certificate>
        </ds:X509Data>
    </ds:KeyInfo>
</md:KeyDescriptor>
However, it still returns the same error.
Is there a specific place I should put the certificate in order to validate the signature? I'm relatively new to SAML and application security in general, so I apologise if I'm using the wrong terminology.
Finally got this to work. It turns out I'd missed a line of configuration in the security context file, and that (it appears) no X509 certificate definition was needed in the service provider's metadata XML file.
Basically I'd already imported the public key I'd been provided with into the existing JKS (using keytool), but I hadn't told the application to specifically use this. In order to do this, I had to go into the security context file (in my case "securityContext.xml") and add the following line to the ExtendedMetadata bean definition for the SP's metadata xml file:
<property name="signingKey" value="[alias of the provided key in the JKS goes here]"/>
Hence after this modification, the ExtendedMetadataDelegate bean definition looked like this:
<bean class="org.springframework.security.saml.metadata.ExtendedMetadataDelegate">
    <constructor-arg>
        <bean class="org.opensaml.saml2.metadata.provider.FilesystemMetadataProvider">
            <constructor-arg>
                <value type="java.io.File">classpath:security/[Path to SP metadata xml file].xml</value>
            </constructor-arg>
            <property name="parserPool" ref="parserPool" />
        </bean>
    </constructor-arg>
    <constructor-arg>
        <bean class="org.springframework.security.saml.metadata.ExtendedMetadata">
            <property name="alias" value="[SP alias goes here]" />
            <property name="signingKey" value="[alias of the provided key in the JKS goes here]"/>
        </bean>
    </constructor-arg>
</bean>
Hope this helps anyone who might be in a similar situation.
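For completeness, the keytool import mentioned above looks roughly like this, assuming the sample application's bundled keystore is used (the keystore path, alias, and certificate file name are illustrative); the alias chosen here is what goes into the signingKey property:
# import the certificate provided by the IdP admins; keytool prompts for the keystore password
keytool -importcert -alias ping-idp -file ping-idp.crt \
  -keystore src/main/resources/security/samlKeystore.jks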
In Spring Boot it can be configured in the assertingparty configuration:
spring:
  security:
    saml2:
      relyingparty:
        registration:
          yourrequestissuerid:
            assertingparty:
              verification:
                credentials:
                  - certificate-location: "classpath:idp.crt"

Passing input to Ant's <exec> task

I have an Ant script running a standard <exec> task after taking in an inputted password:
<input message="Password:" addproperty="password">
    <handler classname="org.apache.tools.ant.input.SecureInputHandler" />
</input>
<exec executable="/bin/sh" input="${password}" failonerror="true">
    <arg line='-c "myScript.sh"' />
</exec>
The script myScript.sh prompts the user for a password, and it was my understanding from the Ant documentation that input is supposed to relay input into whatever the <exec> task is executing, but instead I get (for entering the password foobar):
[exec] Failed to open /usr/local/foobar
which is followed by a stack trace from my script complaining about an incorrect password... so obviously I've misunderstood the documentation. Does anybody know how to handle prompted input from external scripts in Ant?
input="${password}"
This will try to read from the file ${password} and send the contents into your script. Try using:
inputstring="${password}"
instead. This will send the string itself instead of treating it as a filename.
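Putting it together, the snippet from the question with just that one attribute changed:
<input message="Password:" addproperty="password">
    <handler classname="org.apache.tools.ant.input.SecureInputHandler" />
</input>
<!-- inputstring feeds the literal property value to the child process's stdin -->
<exec executable="/bin/sh" inputstring="${password}" failonerror="true">
    <arg line='-c "myScript.sh"' />
</exec>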
