AWS Device Farm - extra data path in script - appium

My Appium script works perfectly on my local machine, but after moving it to AWS Device Farm it returns a parsing error because of one class file. I am trying to import data from an Excel file within this class file, and I think the error is caused by the path of the Excel file.
I uploaded the Excel data file as "extra data" in AWS, but I can't find out its location on the device.
public static void changeCity() throws InterruptedException {
    try {
        File src = new File("data1.xls");
        Workbook wb = Workbook.getWorkbook(src);
        Sheet sh1 = wb.getSheet(0);
        Sheet bugzillaUpdation = wb.getSheet("UtilityCredentials");
        // ...
    } catch (Exception e) {
        e.printStackTrace();
    }
}
Please help me to resolve the issue.
@jmp
I used JUnit and put the location as
File src = new File("/acme-android-appium/src/test/resources/data1.xls");
I am not clear about the XML file you mentioned above. Can you please explain how we can find the file in my script?
Please take a look at the attached image.

I work for the AWS Device Farm team.
To read a .xlsx or .csv file, the following two approaches can be used:
Make sure the Excel file is placed under src/test/resources/com/yourexcelfile.xlsx (or .csv).
This will place the file in the test jar file under the com folder.
Once that is confirmed, you should be able to read the file using one of the two code snippets below:
Without any external libraries:
java.net.URL resource = ClassLoader.getSystemResource("com/yourexcelfile.xlsx"); // no leading slash for ClassLoader lookups
File file = new File(resource.toURI());
FileInputStream input = new FileInputStream(file);
OR
If you are using Apache POI APIs for reading excel files:
InputStream ins = getClass().getResourceAsStream("/com/yourexcelfile.xlsx");
workbook = new XSSFWorkbook(ins);
sheet = workbook.getSheetAt(1);
ins.close();
Hope this helps.
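When loading a file from the classpath, it is easy to trip over the leading-slash rules: ClassLoader lookups must not start with '/', while Class#getResource treats a leading '/' as an absolute path and strips it before delegating. A quick pure-JDK sketch to check the semantics (java/lang/Object.class is used only because it is guaranteed to be findable):

```java
// Contrasts ClassLoader vs Class resource lookups; no test jar needed.
public class ResourcePathDemo {
    public static void main(String[] args) {
        // ClassLoader resource names are always absolute and must NOT start with '/':
        System.out.println(ClassLoader.getSystemResource("java/lang/Object.class") != null);  // true
        System.out.println(ClassLoader.getSystemResource("/java/lang/Object.class") != null); // false
        // Class.getResource accepts a leading '/' and strips it before delegating:
        System.out.println(ResourcePathDemo.class.getResource("/java/lang/Object.class") != null); // true
    }
}
```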

What happens if you try putting the file in the src/test/resources/ directory of the project? That way it would be built into the test jar, and Device Farm may have a reference to it then.
[update]
Tried this myself with the example project from the awslabs GitHub page [1]. I also optionally created a testng.xml file containing this XML:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "http://testng.org/testng-1.0.dtd">
<suite name="Default Suite">
    <test name="test">
        <classes>
            <class name="file.FindFile"/>
        </classes>
    </test>
</suite>
This file is located in /Referenceapp-Appium-Test/src/test/resources/testng.xml and is referenced from the pom.xml using this plugin:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>2.12.4</version>
    <configuration>
        <suiteXmlFiles>
            <suiteXmlFile>src/test/resources/testng.xml</suiteXmlFile>
        </suiteXmlFiles>
    </configuration>
</plugin>
The testng.xml is used here only to run the specific test rather than all of the tests in the example project. It is optional and included here FYI.
I then created a test.csv file in the same directory and created a new package 'file' with a test class inside of it 'FindFile':
package file;

import java.io.File;
import org.testng.annotations.Test;
import Pages.BasePage;
import io.appium.java_client.AppiumDriver;

public class FindFile extends BasePage {
    protected FindFile(AppiumDriver driver) {
        super(driver);
        // TODO Auto-generated constructor stub
    }

    @Test
    public static void changeCity() throws InterruptedException {
        try {
            File src = new File("/Referenceapp-Appium-Test/src/test/resources/test.csv");
            System.out.println("File found!!!");
        } catch (Exception e) {
            System.out.println(e);
        }
    }
}
So when I executed this in Device Farm with a random apk, it only executed the tests inside FindFile.java. When I looked at the Appium Java output I saw my println there, so that is how I know it works.
[TestNG] RUNNING: Suite: "Command line test" containing "1" Tests (config: null)
[TestNG] INVOKING: "Command line test" - file.FindFile.changeCity()
[Invoker 1121172875] Invoking file.FindFile.changeCity
File found!!!
[TestNG] PASSED: "Command line test" - file.FindFile.changeCity() finished in 35 ms
===== Invoked methods
FindFile.changeCity()[pri:0, instance:null] <static>
=====
Creating /tmp/scratchLT6UDz.scratch/resultsm_f_bN/Command line suite/Command line test.html
Creating /tmp/scratchLT6UDz.scratch/resultsm_f_bN/Command line suite/Command line test.xml
PASSED: changeCity
Hope that Helps
Best Regards
James
[1] https://github.com/awslabs/aws-device-farm-appium-tests-for-sample-app
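Rather than hard-coding the absolute workspace path as above (which will not exist on the Device Farm host), a classpath lookup with a filesystem fallback is more portable. A minimal sketch with hypothetical helper names (open, readCsv) and an in-memory CSV standing in for the real test.csv:

```java
import java.io.*;
import java.nio.charset.StandardCharsets;
import java.util.*;

public class CsvResource {

    // Hypothetical helper: prefer the copy packaged into the test jar,
    // fall back to a local filesystem path when running outside Device Farm.
    static InputStream open(String classpathName, String localPath) throws IOException {
        InputStream in = CsvResource.class.getResourceAsStream(classpathName);
        return in != null ? in : new FileInputStream(localPath);
    }

    // Naive CSV parsing (no quoting support), enough for simple test data.
    static List<String[]> readCsv(InputStream in) throws IOException {
        List<String[]> rows = new ArrayList<>();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(in, StandardCharsets.UTF_8))) {
            String line;
            while ((line = r.readLine()) != null) {
                rows.add(line.split(","));
            }
        }
        return rows;
    }

    public static void main(String[] args) throws IOException {
        // In-memory CSV stands in for src/test/resources/test.csv here:
        String csv = "city,zip\nBoston,02101";
        List<String[]> rows = readCsv(
                new ByteArrayInputStream(csv.getBytes(StandardCharsets.UTF_8)));
        System.out.println(rows.get(1)[0]); // prints Boston
    }
}
```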

Related

AtlasMap can't get target document from .adm file imported into java

I imported the target xml file into the AtlasMap Data Mapper UI as below:
<?xml version="1.0" encoding="UTF-8"?>
<ns:XmlOE xmlns:ns="http://atlasmap.io/xml/test/v2" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
    <ns:Address>
        <ns:addressLine1>1040 Notexisting St</ns:addressLine1>
        <ns:zipCode>01886</ns:zipCode>
    </ns:Address>
    <ns:Contact>
        <ns:fullName>Totton</ns:fullName>
        <ns:phoneNumber>123-456-7890</ns:phoneNumber>
        <ns:zipCode>01886</ns:zipCode>
    </ns:Contact>
</ns:XmlOE>
Then I exported it to a *.adm file and after that imported it into Eclipse.
Here is the log:
log_and_code
I run the main class and get errors.
If I use the *.adm file from the example project and re-export it from the AtlasMap Data Mapper UI, it works well.
The answer from @igarashitm:
You'd need to copy the Document ID of the imported target document from
the UI and use it as a key to retrieve the target document. Or simply use
getDefaultTargetDocument(). When you import a document, an auto-generated
Document ID with a GUID suffix is assigned, so the target document is no
longer XMLInstanceSource.
In my case getDefaultTargetDocument() works well.
I believe it is the Document ID he is referring to.

Customize XML report files in Spock

I am new to Spock and need to figure out whether I can customize the XML test report file generated by Spock. As far as I have been able to tell, I can enable generating a JSON report file in which I would have access to every test's start and end time.
I have integrated Spock with Jenkins and am able to see the generated test reports after each build. I am wondering whether there is a way to customize this report to include start and end time.
Is there any way by which I can
include my own defined parameters into the test results
have Jenkins to show also my defined parameters in the report
Here is an example of what I want to have
<?xml version="1.0" encoding="UTF-8"?>
<testsuite name="SpecName" tests="12" skipped="0" failures="2" errors="0" timestamp="2018-16-15T09:12:59" hostname="DESKTOP-VANP1TU" time="0.864">
    <properties/>
    <testcase name="FeatureName" classname="SpecName" time="0.116" startTime="2018-16-15T09:12:59" endTime="2018-16-15T09:12:59"/>
    ...
</testsuite>
As you can see, I have added two fields (startTime and endTime) to the report.
The spock-reports extension lets you define templates for your reports: https://github.com/renatoathaydes/spock-reports
With that at hand, you should also be able to define an XML template.
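For reference, spock-reports is configured through a properties file on the test classpath; the file location and key names below are paraphrased from the project's README and may differ between versions, so treat this as a sketch rather than a definitive configuration:

```properties
# src/test/resources/META-INF/services/com.athaydes.spockframework.report.IReportCreator.properties

# Use the template-based creator instead of the default HTML one
com.athaydes.spockframework.report.IReportCreator=com.athaydes.spockframework.report.template.TemplateReportCreator

# Template used for each spec, and the extension of the generated files
com.athaydes.spockframework.report.template.TemplateReportCreator.specTemplateFile=/templates/spec-template.xml
com.athaydes.spockframework.report.template.TemplateReportCreator.reportFileExtension=xml

# Where the reports end up (Jenkins can then pick them up from here)
com.athaydes.spockframework.report.outputDir=build/spock-reports
```

The template itself can then emit any attributes you like, including startTime and endTime.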

Importing external domain classes into Grails

I'm trying to create a sample project where the domain classes live in an external plain Groovy project and are then used in a Grails app (see https://github.com/ivanarrizabalaga/grails-domain-griffon):
book-svr
book-common
I'm also following the Grails guide on this (see http://grails.org/doc/latest/guide/hibernate.html), but the imported classes are not being recognized as domain classes.
The relevant parts:
External domain class:
package com.nortia.book

import grails.persistence.Entity

@Entity
class Book implements Serializable {
    private static final long serialVersionUID = 1L;

    String title
    String author

    static constraints = {
        title blank: false
        author blank: false
    }
}
build.gradle:
....
dependencies {
    // We use the latest groovy 2.x version for building this library
    compile 'org.codehaus.groovy:groovy:2.1.7'
    compile "org.grails:grails-datastore-gorm-hibernate4:3.0.0.RELEASE"
    compile "org.grails:grails-spring:2.3.7"
....
In the grails app,
hibernate.cfg.xml:
<?xml version='1.0' encoding='UTF-8'?>
<!DOCTYPE hibernate-configuration PUBLIC
        '-//Hibernate/Hibernate Configuration DTD 3.0//EN'
        'http://www.hibernate.org/dtd/hibernate-configuration-3.0.dtd'>
<hibernate-configuration>
    <session-factory>
        <mapping package='com.nortia.book'/>
        <mapping class='com.nortia.book.Book'/>
    </session-factory>
</hibernate-configuration>
BookController.groovy (I've tried a scaffolded and a coded controller, and both failed):
...
class BookController {
    static scaffold = Book
}
...
console.log (error):
ERROR ScaffoldingGrailsPlugin - Cannot generate controller logic for scaffolded class class com.nortia.book.Book. It is not a domain class!
Finally, suspicious log messages while initializing:
DEBUG cfg.Configuration - session-factory config [null] named package [com.nortia.book] for mapping
INFO cfg.Configuration - Mapping package com.nortia.book
WARN cfg.AnnotationBinder - Package not found or wo package-info.java: com.nortia.book
DEBUG cfg.Configuration - session-factory config [null] named class [com.nortia.book.Book] for mapping
INFO cfg.Configuration - Configured SessionFactory: null
So I'm wondering:
What's the missing piece?
According to the docs, it looks like external 'domains' must be Java classes, but that's not a good option for my purpose.
I haven't tried yet with a Grails binary plugin instead of a plain Groovy project; is that the way to go? (I need to use the domains in a Griffon project, which is why I opted for this approach first.)
Finally solved it by creating a 'sort of' binary plugin manually.
Let's take a look step by step:
book-domain
Tree structure
src
main
groovy
demo
Book.groovy
resources
META-INF
grails-plugin.xml
build.gradle
Book.groovy
package demo

import grails.persistence.Entity

@Entity
class Book {
    String title

    static constraints = {
        title blank: false
    }
}
grails-plugin.xml
<plugin name='book-domain' version='1.0' grailsVersion='2.3 > *'>
    <author>Ivan Arrizabalaga</author>
    <title>External domains</title>
    <description>An external domain plugin</description>
    <documentation>http://grails.org/plugin/book-domain</documentation>
    <type>demo.BookDomainGrailsPlugin</type>
    <packaging>binary</packaging>
    <resources>
        <resource>demo.Book</resource>
    </resources>
</plugin>
build.gradle
/*
 * This build file was auto generated by running the Gradle 'init' task
 * by 'arrizabalaga' at '5/26/14 12:34 PM' with Gradle 1.11
 *
 * This generated file contains a sample Groovy project to get you started.
 * For more details take a look at the Groovy Quickstart chapter in the Gradle
 * user guide available at http://gradle.org/docs/1.11/userguide/tutorial_groovy_projects.html
 */

// Apply the groovy plugin to add support for Groovy
apply plugin: 'groovy'
apply plugin: 'maven'

group = 'demo'
version = '1.0'

// In this section you declare where to find the dependencies of your project
repositories {
    // Use 'maven central' for resolving your dependencies.
    // You can declare any Maven/Ivy/file repository here.
    mavenCentral()
    mavenLocal()
}

// In this section you declare the dependencies for your production and test code
dependencies {
    // We use the latest groovy 2.x version for building this library
    compile 'org.codehaus.groovy:groovy-all:2.1.9'
    //compile "org.grails:grails-datastore-gorm-hibernate4:3.1.0.RELEASE"
    compile "org.grails:grails-datastore-gorm-hibernate:3.1.0.RELEASE"
    compile "commons-lang:commons-lang:2.6"

    // We use the awesome Spock testing and specification framework
    testCompile 'org.spockframework:spock-core:0.7-groovy-2.0'
    testCompile 'junit:junit:4.11'
}

task sourcesJar(type: Jar, dependsOn: classes) {
    classifier = 'sources'
    from sourceSets.main.allSource
}

task javadocJar(type: Jar, dependsOn: javadoc) {
    classifier = 'javadoc'
    from javadoc.destinationDir
}

artifacts {
    archives sourcesJar
    archives javadocJar
}
Using it
Now the project can be built (gradle install publishes it into your local Maven repo) and used (by declaring the proper dependency) in any given project.
If the project that uses the jar is a Grails app, it automatically turns the @Entity classes into real domain classes.
Hope it helps.

@Resource injection in jar in 'lib' of an ear; why doesn't that work?

I have simple ear (GF 4.0, JDK 7; sticking with EE6 for now)
The ear contains:
EJBJar
WAR
lib/Shared.jar
Shared has an @Qualifier (@UserDS) in it (it also has META-INF/beans.xml).
I have an @Produces producer like this:
package fhw.producers;

import fhw.qaulifiers.ListingDS;
import fhw.qaulifiers.UserDS;
import javax.annotation.Resource;
import javax.sql.DataSource;
import javax.enterprise.inject.Default;
import javax.enterprise.inject.Produces;

@Default
public class DataSourceProducer
{
    @Resource(lookup = "Member")
    private DataSource userDS;

    public DataSourceProducer()
    {
        System.err.println("DataSourceProducer.DataSourceProducer -- CONSTRUCTION");
    }

    @Produces
    @UserDS
    public DataSource getUserDataSource()
    {
        System.err.println("******DataSourceProducer.getUserDataSource; am I null? " + (null == userDS));
        return userDS;
    }
}
I have a simple EJB (it has a beans.xml) that uses it via:
@Inject
@UserDS
private DataSource userDS;
QUESTION: When I put DataSourceProducer in the EJB jar and deploy, my print statements come out, my @Resource resolves, and everything is fine. When I put DataSourceProducer in Shared.jar, the print statements still come out, but the @Resource didn't work and the EJB NPEs on the null DataSource returned by the producer method. In both tests the qualifier stayed in Shared.jar. I have no deployment descriptors anywhere (well, a web.xml for the WAR; all else is implicit).
Part of me thinks this makes a bit of sense; @Resource is sort of EE oriented (or no?) and should mostly make sense within an EE deployable.
On the other hand, why can't I have a handful of qualifiers and some producers in a shared JAR in the lib dir of an EAR that all EJB jars and WARs (in the EAR) can use?
Is there a way to make this work?
If you really want -- you can see an entire example here: https://github.com/fwelland/ResJect
I got the same issue on GF3, but the solution seems the same.
Remove the dependency from the lib directory and add it to the root of the ear. Then add the following to the application.xml:
<module><ejb>Shared.jar</ejb></module>
Tip: using the maven-ear-plugin you can automatically add dependencies as modules to your ear:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-ear-plugin</artifactId>
    <configuration>
        <displayName>...</displayName>
        <application-name>...</application-name>
        <defaultJavaBundleDir>lib</defaultJavaBundleDir>
        <!-- do not generate application.xml; we include it ourselves -->
        <generateApplicationXml>false</generateApplicationXml>
        <modules>
            <ejbModule>
                <groupId>...</groupId>
                <artifactId>Shared</artifactId>
                <bundleFileName>Shared.jar</bundleFileName>
            </ejbModule>
        </modules>
    </configuration>
</plugin>
Note that if you're using GlassFish 4, you're using Java EE 7, not Java EE 6. In order for your situation to work, you need to register your shared jar as a module in the application.xml so that the container knows to scan it.
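A minimal application.xml along those lines might look like the following sketch; the artifact and context-root names are placeholders for your own modules:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<application xmlns="http://java.sun.com/xml/ns/javaee"
             xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
             xsi:schemaLocation="http://java.sun.com/xml/ns/javaee
                                 http://java.sun.com/xml/ns/javaee/application_6.xsd"
             version="6">
    <display-name>my-ear</display-name>
    <!-- Register the shared jar as an EJB module so the container scans it -->
    <module><ejb>Shared.jar</ejb></module>
    <module><ejb>EJBJar.jar</ejb></module>
    <module>
        <web>
            <web-uri>MyWar.war</web-uri>
            <context-root>/myapp</context-root>
        </web>
    </module>
    <library-directory>lib</library-directory>
</application>
```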

Error in loading XSL files and DTD files in XSLT transformation

I am trying to create HTML files using XSLT; I use an XML file and XSL files to create the HTML file. Some other XSL files located in the same directory are included in the main XSL file using <xsl:include href="temp.xsl"/>.
The XSL files are located under "D:/XSL_Folder/".
I am running the Main.java file, which is located under D:/Workspace/Webapp_Project/.
When I try to create HTML files by passing "D:/XSL_Folder/root.xsl" and "D:/XML_Folder/data.xml" to Main.java as arguments, I get the following error while creating the Templates.
Templates lTemplates = TransformerFactory.newInstance().newTemplates(new StreamSource(new FileInputStream(lFileXSL)));
ERROR: 'D:\Workspace\Webapp_Project\temp.xsl (The system cannot find the file specified)'
FATAL ERROR: 'Could not compile stylesheet'
12:20:07 ERROR f.s.t.v.v2.dao.impl.DocUnitDaoImpl - Error while creating a new XslTransformerGetter. The path to the XSL may be wrong.
javax.xml.transform.TransformerConfigurationException: Could not compile stylesheet
at com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl.newTemplates(TransformerFactoryImpl.java:885) ~[na:1.7.0_13]
In the error report we can see that the parser is looking for the included XSL file in the project path (D:\Workspace\Webapp_Project), not in the path where root.xsl is located (D:/XSL_Folder/).
Can anyone suggest why the parser searches for the included XSL file in the project folder instead of the path where root.xsl is located, and how to fix this problem?
Code I'm using to create the HTML file from the XSL and XML files:
public static void simpleTransform(InputStream lXmlFileStream, File lXSLFile,
        StreamResult lHtmlResult, Map<String, String> lArguments) {
    TransformerFactory tFactory = TransformerFactory.newInstance();
    try {
        Transformer transformer =
                tFactory.newTransformer(new StreamSource(lXSLFile));
        for (Entry<String, String> lEntry : lArguments.entrySet()) {
            transformer.setParameter(lEntry.getKey(), lEntry.getValue());
        }
        transformer.setOutputProperty(OutputKeys.ENCODING, "UTF-8");
        transformer.transform(new StreamSource(lXmlFileStream), lHtmlResult);
    } catch (Exception e) {
        e.printStackTrace();
    }
}
You have tagged the question "saxon", and you have said you are using XSLT 2.0, but the error messages show that you are using Xalan. If you specifically want to use Saxon then the best way is to avoid using the JAXP classpath search and instantiate Saxon directly - in place of TransformerFactory.newInstance(), use new net.sf.saxon.TransformerFactory().
Supplying a File as the argument to StreamSource ought to be OK; but I would like to see how the File lXSLFile object is created. My suspicion would be that you have done something like new File ("root.xsl") and it has been resolved relative to the current directory.
Note that <xsl:include href="resolve-uri('temp.xsl')"/> will not help: the href attribute is a literal URI reference, not an XPath expression. The real fix is to give the StreamSource a system ID, for example new StreamSource(lFileXSL) instead of wrapping a bare FileInputStream; relative hrefs in xsl:include are then resolved against the location of root.xsl rather than the working directory.
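To illustrate the systemId point concretely, here is a self-contained sketch (hypothetical file names, pure JDK transformer) showing that a StreamSource built from a File resolves a relative xsl:include against the stylesheet's own directory:

```java
import java.io.*;
import java.nio.charset.StandardCharsets;
import java.nio.file.*;
import javax.xml.transform.*;
import javax.xml.transform.stream.*;

public class IncludeDemo {

    // Writes root.xsl (which includes temp.xsl) into a temp dir and runs a transform.
    static String run() throws Exception {
        Path dir = Files.createTempDirectory("xsl");
        Files.write(dir.resolve("temp.xsl"),
                ("<xsl:stylesheet version='1.0' xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
               + "<xsl:template match='/'>included</xsl:template></xsl:stylesheet>")
                .getBytes(StandardCharsets.UTF_8));
        Path root = dir.resolve("root.xsl");
        Files.write(root,
                ("<xsl:stylesheet version='1.0' xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
               + "<xsl:include href='temp.xsl'/></xsl:stylesheet>")
                .getBytes(StandardCharsets.UTF_8));

        // new StreamSource(File) sets the systemId, so the relative href in
        // xsl:include is resolved against root.xsl's directory, not the CWD.
        Templates templates = TransformerFactory.newInstance()
                .newTemplates(new StreamSource(root.toFile()));
        StringWriter out = new StringWriter();
        templates.newTransformer().transform(
                new StreamSource(new StringReader("<doc/>")), new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(run().contains("included")); // prints true
    }
}
```

Had root.xsl been loaded through a bare FileInputStream (no systemId), the same include would fail exactly as in the error report above.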
