SnakeYaml class not found exception - maven-3

When I parse config.yaml using SnakeYAML 1.14 I get a "class not found" exception. The following is the code used for parsing; I build the project with Maven.
import java.io.File;
import java.util.Scanner;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.yaml.snakeyaml.Yaml;

public class AppConfigurationReader {
    private static final String CONFIG_FILE = "config.yaml";
    private static String fileContents = null;
    private static final Logger logger = LoggerFactory.getLogger(AppConfigurationReader.class);

    public static synchronized AppConfiguration getConfiguration() {
        return getConfiguration(false);
    }

    public static synchronized AppConfiguration getConfiguration(Boolean forceReload) {
        try {
            Yaml yaml = new Yaml();
            if (null == fileContents || forceReload) {
                fileContents = read(CONFIG_FILE);
            }
            return yaml.loadAs(fileContents, AppConfiguration.class);
        }
        catch (Exception ex) {
            ex.printStackTrace();
            logger.error("Error loading fileContents {}", ex.getStackTrace()[0]);
            return null;
        }
    }

    private static String read(String filename) {
        try {
            return new Scanner(new File(filename)).useDelimiter("\\A").next();
        } catch (Exception ex) {
            logger.error("Error scanning configuration file {}", filename);
            return null;
        }
    }
}

I hit this too; it was due to an incorrect set of dependencies.
I had used:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot</artifactId>
</dependency>
when I should have used:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter</artifactId>
</dependency>
The difference is that the latter pulls in org.yaml:snakeyaml:jar:1.27:compile as a transitive dependency.
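If you are not using the Spring Boot starters at all, the equivalent fix is to declare SnakeYAML explicitly; the version below is simply the one mentioned above, and you can confirm what actually ends up on the classpath with mvn dependency:tree.

<dependency>
    <groupId>org.yaml</groupId>
    <artifactId>snakeyaml</artifactId>
    <version>1.27</version>
</dependency>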

Maybe I am a bit late to respond, but this may help others in the future.
This issue occurs when the class loader in use cannot load the target class, sometimes even though it is present on the classpath.
I encountered this issue, and it can be handled in this way.
package my.test.project;

import java.io.InputStream;

import org.yaml.snakeyaml.Yaml;
import org.yaml.snakeyaml.constructor.CustomClassLoaderConstructor;

public class MyTestClass {
    public static void main(String[] args) {
        InputStream input = MyTestClass.class.getClassLoader().getResourceAsStream("test.yml");
        Yaml y = new Yaml(new CustomClassLoaderConstructor(MyTestClass.class.getClassLoader()));
        TestConfig test = y.loadAs(input, TestConfig.class);
        System.out.println(test);
    }
}
You need to initialize the Yaml object with a CustomClassLoaderConstructor; it ensures the bean class is loaded with the class loader you specify before it is actually used internally.
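For completeness, TestConfig in this example is just a plain bean that SnakeYAML instantiates reflectively; the property names below are hypothetical and only need to match the keys in test.yml (for example name: my-app and port: 8080):

// Hypothetical bean for illustration only; SnakeYAML needs a no-arg constructor
// plus getters/setters that match the keys in test.yml.
public class TestConfig {
    private String name;
    private int port;

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public int getPort() { return port; }
    public void setPort(int port) { this.port = port; }

    @Override
    public String toString() {
        return "TestConfig{name=" + name + ", port=" + port + "}";
    }
}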

I ran into a similar error, but while dumping a file.
You can pass the fully qualified name of the class to the yaml.loadAs instruction.
For example, if AppConfiguration.class were in org.example.package1, you would write something like:
yaml.loadAs(fileContents, org.example.package1.AppConfiguration.class);

It seems the SnakeYAML classes are not included in your jar file. Use the Maven Assembly Plugin rather than the plain package phase, so that all the dependency jars get included.
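A minimal sketch of such a build configuration, assuming the standard jar-with-dependencies descriptor (adjust the bindings to your project):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-assembly-plugin</artifactId>
    <configuration>
        <descriptorRefs>
            <descriptorRef>jar-with-dependencies</descriptorRef>
        </descriptorRefs>
    </configuration>
    <executions>
        <execution>
            <id>make-assembly</id>
            <phase>package</phase>
            <goals>
                <goal>single</goal>
            </goals>
        </execution>
    </executions>
</plugin>

Running mvn package then additionally produces a *-jar-with-dependencies.jar that bundles SnakeYAML. The Maven Shade Plugin is an alternative that achieves the same result.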

Related

Azure Cosmos: Register stored procedure only if it does not already exist

I want to register and execute a stored procedure. I am using Spring + Java with Cosmos DB. Every time I stop my application and restart it, it tries to create a new sproc, and since it already exists in Cosmos DB it fails with the error below. Is there any option like "only create if not exists"? I am fetching the js file from the src/main/resources folder.
I am following the doc below to register the stored proc:
https://learn.microsoft.com/en-us/azure/cosmos-db/nosql/how-to-use-stored-procedures-triggers-udfs?tabs=java-sdk
@Configuration
public class StoredProcConfig {

    @Autowired
    @Qualifier(BeansConstants.PAYMENT_CONTAINER)
    CosmosContainer container;

    @Bean
    public CosmosStoredProcedureResponse registerSp() throws IOException {
        InputStream is = getFileFromResourceAsStream("storedProcedures/createStudent.js");
        CosmosStoredProcedureProperties definition = new CosmosStoredProcedureProperties("spCreateToDoItems",
                IOUtils.toString(is, StandardCharsets.UTF_8));
        return container.getScripts().createStoredProcedure(definition);
    }

    private InputStream getFileFromResourceAsStream(String fileName) {
        // The class loader that loaded the class
        ClassLoader classLoader = getClass().getClassLoader();
        // The stream holding the file content
        InputStream inputStream = classLoader.getResourceAsStream(fileName);
        if (inputStream == null) {
            throw new IllegalArgumentException("file not found! " + fileName);
        } else {
            return inputStream;
        }
    }
}
Error
Caused by: com.azure.cosmos.CosmosException: {"innerErrorMessage":"Message: {\"Errors\":[\"Resource with specified id, name, or unique index already exists.\"]}
Modify your registerSp() bean as below:
private static final Logger logger = LoggerFactory.getLogger(CosmosConfiguration.class);

@Bean
public CosmosStoredProcedureResponse registerSp() throws IOException {
    InputStream is = getFileFromResourceAsStream("storedProcedures/createStudent.js");
    CosmosStoredProcedureProperties definition = new CosmosStoredProcedureProperties("spCreateToDoItems",
            IOUtils.toString(is, StandardCharsets.UTF_8));
    return createStoredProcedureIfNotExists(definition);
}

public CosmosStoredProcedureResponse createStoredProcedureIfNotExists(CosmosStoredProcedureProperties definition) {
    try {
        // Try to read the stored procedure first; if it exists, reuse it.
        CosmosStoredProcedureResponse storedProc = container.getScripts().getStoredProcedure(definition.getId()).read();
        logger.info("found stored proc");
        return storedProc;
    }
    catch (CosmosException e) {
        // read() throws when the stored procedure does not exist yet, so create it.
        logger.info("stored proc not found, creating....");
        return container.getScripts().createStoredProcedure(definition);
    }
}
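A slightly stricter variant, assuming the v4 Java SDK where CosmosException exposes the HTTP status code, is to treat only a 404 as "not found" and rethrow everything else:

public CosmosStoredProcedureResponse createStoredProcedureIfNotExists(CosmosStoredProcedureProperties definition) {
    try {
        return container.getScripts().getStoredProcedure(definition.getId()).read();
    } catch (CosmosException e) {
        if (e.getStatusCode() == 404) {
            // Only a genuine "not found" should trigger creation.
            return container.getScripts().createStoredProcedure(definition);
        }
        throw e; // auth errors, throttling, etc. should not be swallowed
    }
}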

Configuring a Neo4j test container in Spring Boot with a @Configuration file

I'm trying to run Spock tests against a Neo4j container using the Testcontainers library. Previously, I used the test-harness library to run tests against an embedded Neo4j database, but the need to migrate to Testcontainers has come up. With test-harness I used a configuration file to include some beans necessary for callback functionality. Without these beans, some unit tests fail.
Previous test-harness method
@ContextConfiguration(classes = [
        Neo4jTestHarnessAutoConfiguration,
        Neo4jDriverAutoConfiguration,
        Neo4jDataAutoConfiguration,
        CustomNeo4jConfiguration
])
@ImportAutoConfiguration(classes = [
        Neo4jTestHarnessAutoConfiguration,
        Neo4jDriverAutoConfiguration,
        Neo4jDataAutoConfiguration
])
@Transactional
trait EmbeddedNeo4j {
    // Neo4j embedded database is created in Neo4jTestHarnessAutoConfiguration
}
CustomNeo4jConfiguration
@Configuration
@EntityScan(value = {
        "data.neo4j.nodes",
        "data.neo4j.relationships",
        "data.neo4j.queryresults"
})
@EnableNeo4jRepositories("data.neo4j.repository")
@EnableNeo4jAuditing(auditorAwareRef = "auditorAware")
public class CustomNeo4jConfiguration {

    @Bean
    public AuditorAware<String> auditorAware() {
        return new CustomAuditorAwareImpl();
    }

    @Bean
    public BeforeBindCallback neo4jCustomEntitiesCallback(AuditorAware<String> auditorAware) {
        return new CustomNeo4jEntitiesCallback(auditorAware);
    }

    @Bean
    public Neo4jConversions neo4jCustomConversions() {
        Set<GenericConverter> additionalConverters = Collections.singleton(new InstantStringConverter());
        return new Neo4jConversions(additionalConverters);
    }
}
The above method works fine. All tests run properly and all beans are created.
TestContainers attempt
@Testcontainers
@Transactional
trait EmbeddedNeo4j {

    @Shared
    static final Neo4jContainer<?> neo4jContainer = new Neo4jContainer<>("neo4j:4.3.9")
            .withReuse(true)

    @DynamicPropertySource
    static void neo4jProperties(DynamicPropertyRegistry registry) {
        neo4jContainer.start()
        registry.add("spring.neo4j.uri", neo4jContainer::getBoltUrl)
        registry.add("spring.neo4j.authentication.username", () -> "neo4j")
        registry.add("spring.neo4j.authentication.password", neo4jContainer::getAdminPassword)
    }
}
With the above, all tests successfully run against the test container. However, any test that needs the functionality of the beans in the CustomNeo4jConfiguration fails.
What I've tried
I've attempted to use the same @ContextConfiguration annotations in various combinations, such as...
@ContextConfiguration(classes = [
        Neo4jDriverAutoConfiguration,
        Neo4jDataAutoConfiguration,
        CustomNeo4jConfiguration
])
@ImportAutoConfiguration(classes = [
        Neo4jDriverAutoConfiguration,
        Neo4jDataAutoConfiguration
])
@Transactional
trait EmbeddedNeo4j {
    // TestContainer code here...
However, this fails with the following stack trace:
Failed to load ApplicationContext
java.lang.IllegalStateException: Failed to load ApplicationContext
at org.springframework.test.context.cache.DefaultCacheAwareContextLoaderDelegate.loadContext(DefaultCacheAwareContextLoaderDelegate.java:132)
at org.springframework.test.context.support.DefaultTestContext.getApplicationContext(DefaultTestContext.java:124)
at org.springframework.test.context.web.ServletTestExecutionListener.setUpRequestContextIfNecessary(ServletTestExecutionListener.java:190)
at org.springframework.test.context.web.ServletTestExecutionListener.prepareTestInstance(ServletTestExecutionListener.java:132)
at org.springframework.test.context.TestContextManager.prepareTestInstance(TestContextManager.java:248)
at org.spockframework.spring.SpringTestContextManager.prepareTestInstance(SpringTestContextManager.java:56)
at org.spockframework.spring.SpringInterceptor.interceptInitializerMethod(SpringInterceptor.java:43)
at org.spockframework.runtime.extension.AbstractMethodInterceptor.intercept(AbstractMethodInterceptor.java:24)
at org.spockframework.runtime.extension.MethodInvocation.proceed(MethodInvocation.java:101)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$prepare$2(NodeTestTask.java:123)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.prepare(NodeTestTask.java:123)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:90)
at java.base/java.util.ArrayList.forEach(ArrayList.java:1541)
at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141)
at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:148)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95)
at java.base/java.util.ArrayList.forEach(ArrayList.java:1541)
at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141)
at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95)
at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:54)
at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:107)
at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:88)
at org.junit.platform.launcher.core.EngineExecutionOrchestrator.lambda$execute$0(EngineExecutionOrchestrator.java:54)
at org.junit.platform.launcher.core.EngineExecutionOrchestrator.withInterceptedStreams(EngineExecutionOrchestrator.java:67)
at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:52)
at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:114)
at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:86)
at org.junit.platform.launcher.core.DefaultLauncherSession$DelegatingLauncher.execute(DefaultLauncherSession.java:86)
at org.junit.platform.launcher.core.SessionPerRequestLauncher.execute(SessionPerRequestLauncher.java:53)
at org.gradle.api.internal.tasks.testing.junitplatform.JUnitPlatformTestClassProcessor$CollectAllTestClassesExecutor.processAllTestClasses(JUnitPlatformTestClassProcessor.java:99)
at org.gradle.api.internal.tasks.testing.junitplatform.JUnitPlatformTestClassProcessor$CollectAllTestClassesExecutor.access$000(JUnitPlatformTestClassProcessor.java:79)
at org.gradle.api.internal.tasks.testing.junitplatform.JUnitPlatformTestClassProcessor.stop(JUnitPlatformTestClassProcessor.java:75)
at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.stop(SuiteTestClassProcessor.java:61)
at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
at org.gradle.api.internal.tasks.testing.worker.TestWorker.stop(TestWorker.java:133)
at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'auditFieldRepository' defined in data.neo4j.repository.AuditFieldRepository defined in #EnableNeo4jRepositories declared on CustomNeo4jConfiguration: Cannot resolve reference to bean 'neo4jTemplate' while setting bean property 'neo4jOperations'; nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException: No bean named 'neo4jTemplate' available
at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:342)
at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:113)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1707)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1452)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:619)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:542)
at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:335)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:234)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:333)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:208)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:936)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:918)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:583)
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:734)
at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:408)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:308)
at org.springframework.boot.test.context.SpringBootContextLoader.loadContext(SpringBootContextLoader.java:132)
at org.springframework.test.context.cache.DefaultCacheAwareContextLoaderDelegate.loadContextInternal(DefaultCacheAwareContextLoaderDelegate.java:99)
at org.springframework.test.context.cache.DefaultCacheAwareContextLoaderDelegate.loadContext(DefaultCacheAwareContextLoaderDelegate.java:124)
... 64 more
Caused by: org.springframework.beans.factory.NoSuchBeanDefinitionException: No bean named 'neo4jTemplate' available
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanDefinition(DefaultListableBeanFactory.java:874)
at org.springframework.beans.factory.support.AbstractBeanFactory.getMergedLocalBeanDefinition(AbstractBeanFactory.java:1344)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:309)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:208)
at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:330)
... 82 more
I've also attempted in various ways to create the missing beans, but that doesn't feel like the right path to take and that didn't produce any results.
What's the right way to use a @Configuration file with Testcontainers?
Appreciate any help!
If I interpret your mixture of Java and Groovy/Spock code correctly, I would recommend the following approach, but only if you insist on not using @SpringBootTest.
Also: you are using the outdated neo4j-java-driver-spring-boot-starter classes; they haven't been needed for a while in recent Spring Boot versions and will potentially break SDN 6+.
Please note: you should never import auto-configuration classes and use them in @ContextConfiguration at the same time. They are only meant to be imported via @ImportAutoConfiguration, and even that is redundant or overly complicated most of the time.
First of all, here's the pom so you get the proper dependencies:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.7.2</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>
    <groupId>com.example</groupId>
    <artifactId>neowithapoc</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>neowithapoc</name>
    <description>neowithapoc</description>
    <properties>
        <java.version>17</java.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-data-neo4j</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.testcontainers</groupId>
            <artifactId>junit-jupiter</artifactId>
            <version>1.17.3</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.testcontainers</groupId>
            <artifactId>neo4j</artifactId>
            <version>1.17.3</version>
            <scope>test</scope>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>
</project>
Solution 1
Here's a solution along the lines of what you have there, with a trait, or in the case of pure Java, an abstract test class:
package com.example.neowithapoc;

import org.springframework.boot.autoconfigure.ImportAutoConfiguration;
import org.springframework.boot.autoconfigure.data.neo4j.Neo4jDataAutoConfiguration;
import org.springframework.boot.autoconfigure.neo4j.Neo4jAutoConfiguration;
import org.springframework.boot.test.context.TestConfiguration;
import org.springframework.data.neo4j.repository.config.EnableNeo4jRepositories;
import org.springframework.test.context.DynamicPropertyRegistry;
import org.springframework.test.context.DynamicPropertySource;
import org.springframework.transaction.annotation.EnableTransactionManagement;
import org.testcontainers.containers.Neo4jContainer;
import org.testcontainers.containers.Neo4jLabsPlugin;

@TestConfiguration
@ImportAutoConfiguration({
        Neo4jAutoConfiguration.class,
        Neo4jDataAutoConfiguration.class
})
@EnableTransactionManagement
@EnableNeo4jRepositories(considerNestedRepositories = true)
public abstract class EmbeddedNeo4jConfig {

    static final Neo4jContainer<?> neo4j = new Neo4jContainer<>("neo4j:4.4")
            .withLabsPlugins(Neo4jLabsPlugin.APOC)
            .withReuse(true);

    @DynamicPropertySource
    static void neo4jProperties(DynamicPropertyRegistry registry) {
        neo4j.start();
        registry.add("spring.neo4j.uri", neo4j::getBoltUrl);
        registry.add("spring.neo4j.authentication.username", () -> "neo4j");
        registry.add("spring.neo4j.authentication.password", neo4j::getAdminPassword);
    }
}
Use like this:
package com.example.neowithapoc;

import static org.assertj.core.api.Assertions.assertThat;

import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.neo4j.core.Neo4jClient;
import org.springframework.data.neo4j.core.Neo4jTemplate;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit.jupiter.SpringExtension;

@ExtendWith(SpringExtension.class)
@ContextConfiguration(classes = EmbeddedNeo4jConfig.class)
public class SomeTest extends EmbeddedNeo4jConfig {

    @Test
    void whatever(@Autowired Neo4jTemplate template, @Autowired Neo4jClient client) {
        assertThat(template).isNotNull();
        assertThat(client).isNotNull();

        String apoc = client
                .query("RETURN apoc.version() AS output")
                .fetchAs(String.class)
                .first().get();
        assertThat(apoc).startsWith("4.4");
    }
}
Solution 2
I prefer this much simpler and less complicated approach:
package com.example.neowithapoc;

import static org.assertj.core.api.Assertions.assertThat;

import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.data.neo4j.core.Neo4jClient;
import org.springframework.data.neo4j.core.Neo4jTemplate;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.DynamicPropertyRegistry;
import org.springframework.test.context.DynamicPropertySource;
import org.springframework.test.context.junit.jupiter.SpringExtension;
import org.springframework.transaction.annotation.Transactional;
import org.testcontainers.containers.Neo4jContainer;
import org.testcontainers.containers.Neo4jLabsPlugin;
import org.testcontainers.junit.jupiter.Testcontainers;

@SpringBootTest
@Transactional
@Testcontainers(disabledWithoutDocker = true)
public class SomeSpringBootTest {

    static final Neo4jContainer<?> neo4j = new Neo4jContainer<>("neo4j:4.4")
            .withLabsPlugins(Neo4jLabsPlugin.APOC)
            .withReuse(true);

    @DynamicPropertySource
    static void neo4jProperties(DynamicPropertyRegistry registry) {
        neo4j.start();
        registry.add("spring.neo4j.uri", neo4j::getBoltUrl);
        registry.add("spring.neo4j.authentication.username", () -> "neo4j");
        registry.add("spring.neo4j.authentication.password", neo4j::getAdminPassword);
    }

    @Test
    void whatever(@Autowired Neo4jTemplate template, @Autowired Neo4jClient client) {
        assertThat(template).isNotNull();
        assertThat(client).isNotNull();

        String apoc = client
                .query("RETURN apoc.version() AS output")
                .fetchAs(String.class)
                .first().get();
        assertThat(apoc).startsWith("4.4");
    }
}
Bonus question: custom conversions
Given the following class:
package com.example.neowithapoc;

public class SomeObject {

    private final String value;

    public SomeObject(String value) {
        this.value = value;
    }

    public String getValue() {
        return value;
    }
}
It is not an entity and it needs some special treatment during conversions.
It can be converted back and forth like this:
import java.util.Set;

import org.neo4j.driver.Value;
import org.neo4j.driver.Values;
import org.springframework.core.convert.TypeDescriptor;
import org.springframework.core.convert.converter.GenericConverter;

public class SomeObjectConverter implements GenericConverter {

    @Override
    public Set<ConvertiblePair> getConvertibleTypes() {
        return Set.of(
                new ConvertiblePair(Value.class, SomeObject.class),
                new ConvertiblePair(SomeObject.class, Value.class)
        );
    }

    @Override
    public Object convert(Object source, TypeDescriptor sourceType, TypeDescriptor targetType) {
        if (source == null) {
            return Value.class.isAssignableFrom(targetType.getType()) ? Values.NULL : null;
        } else if (source instanceof SomeObject someObject) {
            return Values.value(someObject.getValue());
        } else if (source instanceof Value value) {
            return new SomeObject(value.asString());
        } else {
            throw new IllegalArgumentException();
        }
    }
}
Configure it with a @TestConfiguration (or in your main config) like this:
@TestConfiguration
static class AdditionalBeans {

    @Bean
    SomeObjectConverter someObjectConverter() {
        return new SomeObjectConverter();
    }

    @Bean
    Neo4jConversions neo4jConversions(SomeObjectConverter someObjectConverter) {
        return new Neo4jConversions(List.of(someObjectConverter));
    }
}
In both cases this class is a static nested class of either EmbeddedNeo4jConfig or SomeSpringBootTest.
In the former you would normally just declare the bean methods directly, but since I tried to recreate your "trait" as closely as possible in Java and I extend that class, this is not possible: tests must not have their own @Bean methods.
After that, a test like this:
@Node
static class ContainerNode {

    @Id @GeneratedValue
    Long id;

    SomeObject someObject;

    public Long getId() {
        return id;
    }

    public SomeObject getSomeObject() {
        return someObject;
    }
}

@Test
void conversionsShouldWork(@Autowired Neo4jTemplate template) {
    var container = template.find(ContainerNode.class)
            .matching("CREATE (n:ContainerNode {someObject: 'Hello'}) RETURN n")
            .one();
    assertThat(container).map(ContainerNode::getSomeObject).map(SomeObject::getValue).hasValue("Hello");
}
will work just fine, also with the repository.
This test here will fail:
@Test
void conversionsShouldWork(@Autowired Neo4jClient client) {
    var anObject = client.query("RETURN 'whatever'").fetchAs(SomeObject.class).first();
    assertThat(anObject).map(SomeObject::getValue).hasValue("whatever");
}
but Spring Data Neo4j fixed this in 6.3.3 (https://github.com/spring-projects/spring-data-neo4j/issues/2594), and you can manually add the conversions to the client like this:
@TestConfiguration
static class AdditionalBeans {

    @Bean
    SomeObjectConverter someObjectConverter() {
        return new SomeObjectConverter();
    }

    @Bean
    Neo4jConversions neo4jConversions(SomeObjectConverter someObjectConverter) {
        return new Neo4jConversions(List.of(someObjectConverter));
    }

    @Bean
    public Neo4jClient neo4jClient(
            Driver driver,
            DatabaseSelectionProvider databaseSelectionProvider,
            Neo4jConversions neo4jConversions) {
        return Neo4jClient.with(driver)
                .withDatabaseSelectionProvider(databaseSelectionProvider)
                .withNeo4jConversions(neo4jConversions)
                .build();
    }
}
Then this test will work just fine:
@Test
void clientConversionsShouldWork(@Autowired Neo4jClient client) {
    var anObject = client.query("RETURN 'whatever'").fetchAs(SomeObject.class).first();
    assertThat(anObject).map(SomeObject::getValue).hasValue("whatever");
}

Apache Beam Coder for GenericRecord

I am building a pipeline that reads Avro generic records. To pass GenericRecord between stages I need to register AvroCoder. The documentation says that if I use generic record, the schema argument can be arbitrary: https://beam.apache.org/releases/javadoc/2.2.0/org/apache/beam/sdk/coders/AvroCoder.html#of-java.lang.Class-org.apache.avro.Schema-
However, when I pass an empty schema to the method AvroCoder.of(Class, Schema) it throws an exception at run time. Is there a way to create an AvroCoder for GenericRecord that does not require a schema? In my case, each GenericRecord has an embedded schema.
The exception and stacktrace:
Exception in thread "main" java.lang.NullPointerException
at org.apache.beam.sdk.coders.AvroCoder$AvroDeterminismChecker.checkIndexedRecord(AvroCoder.java:562)
at org.apache.beam.sdk.coders.AvroCoder$AvroDeterminismChecker.recurse(AvroCoder.java:430)
at org.apache.beam.sdk.coders.AvroCoder$AvroDeterminismChecker.check(AvroCoder.java:409)
at org.apache.beam.sdk.coders.AvroCoder.<init>(AvroCoder.java:260)
at org.apache.beam.sdk.coders.AvroCoder.of(AvroCoder.java:141)
I had a similar case and solved it with a custom coder. The simplest (but sub-optimal) solution is to encode the schema along with each record. If your schemas are not too volatile, you can benefit from caching.
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.concurrent.ConcurrentHashMap;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.beam.sdk.coders.AtomicCoder;
import org.apache.beam.sdk.coders.AvroCoder;
import org.apache.beam.sdk.coders.StringUtf8Coder;

public class GenericRecordCoder extends AtomicCoder<GenericRecord> {

    public static GenericRecordCoder of() {
        return new GenericRecordCoder();
    }

    private static final ConcurrentHashMap<String, AvroCoder<GenericRecord>> avroCoders = new ConcurrentHashMap<>();

    @Override
    public void encode(GenericRecord value, OutputStream outStream) throws IOException {
        String schemaString = value.getSchema().toString();
        String schemaHash = getHash(schemaString);
        StringUtf8Coder.of().encode(schemaString, outStream);
        StringUtf8Coder.of().encode(schemaHash, outStream);
        AvroCoder<GenericRecord> coder = avroCoders.computeIfAbsent(schemaHash,
                s -> AvroCoder.of(value.getSchema()));
        coder.encode(value, outStream);
    }

    @Override
    public GenericRecord decode(InputStream inStream) throws IOException {
        String schemaString = StringUtf8Coder.of().decode(inStream);
        String schemaHash = StringUtf8Coder.of().decode(inStream);
        AvroCoder<GenericRecord> coder = avroCoders.computeIfAbsent(schemaHash,
                s -> AvroCoder.of(new Schema.Parser().parse(schemaString)));
        return coder.decode(inStream);
    }

    // Helper not shown in the original answer: any stable digest of the schema string works as a cache key.
    private static String getHash(String schemaString) {
        return Integer.toHexString(schemaString.hashCode());
    }
}
While this solves the task, in practice I did it slightly differently, using an external schema registry (you can build one on top of Datastore, for example). In that case you don't need to serialize/deserialize the schema. The code looks like this:
public class GenericRecordCoder extends AtomicCoder<GenericRecord> {

    public static GenericRecordCoder of() {
        return new GenericRecordCoder();
    }

    private static final ConcurrentHashMap<String, AvroCoder<GenericRecord>> avroCoders = new ConcurrentHashMap<>();

    @Override
    public void encode(GenericRecord value, OutputStream outStream) throws IOException {
        SchemaRegistry.registerIfAbsent(value.getSchema());
        String schemaName = value.getSchema().getFullName();
        StringUtf8Coder.of().encode(schemaName, outStream);
        AvroCoder<GenericRecord> coder = avroCoders.computeIfAbsent(schemaName,
                s -> AvroCoder.of(value.getSchema()));
        coder.encode(value, outStream);
    }

    @Override
    public GenericRecord decode(InputStream inStream) throws IOException {
        String schemaName = StringUtf8Coder.of().decode(inStream);
        AvroCoder<GenericRecord> coder = avroCoders.computeIfAbsent(schemaName,
                s -> AvroCoder.of(SchemaRegistry.get(schemaName)));
        return coder.decode(inStream);
    }
}
The usage is pretty straightforward:
PCollection<GenericRecord> inputCollection = pipeline
        .apply(AvroIO
                .parseGenericRecords(t -> t)
                .withCoder(GenericRecordCoder.of())
                .from(...));
After reviewing the code for AvroCoder, I do not think the documentation is correct there. Your AvroCoder instance will need a way to figure out the schema for your Avro records - and likely the only way to do that is by providing one.
So, I'd recommend calling AvroCoder.of(GenericRecord.class, schema), where schema is the correct schema for the GenericRecord objects in your PCollection.
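A minimal sketch of that, assuming the writer schema is available as a JSON string (schemaJson and the surrounding pipeline variables are placeholders, not part of the original question):

Schema schema = new Schema.Parser().parse(schemaJson);   // the schema your records were written with
PCollection<GenericRecord> records = ...;                // output of your read/parse step
records.setCoder(AvroCoder.of(GenericRecord.class, schema));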

Unzip File in Dataflow Before Reading

Our client is uploading files into GCS, but they are zipped. Is there any way, using the Java Dataflow SDK, in which we can run through all the zipped files, unzip each one, combine all the resulting .csv files into one file, and only then do the TextIO transforms?
EDIT
To answer jkffs's questions,
Well I don't really need to combine them all into a single file, it would just be much easier from a reading perspective.
They are ZIP files, not GZ or BZ2 or anything else. Each ZIP contains multiple files. The file names are not really significant, and yes, I would actually prefer that TextIO transparently decompresses and concatenates all the files, on a per-archive basis.
Hope that helps!
I'm answering because I had the same problem and only found this year-old and quite incomplete solution. Here is a full example of how to unzip files on Google Dataflow:
public class SimpleUnzip {

    private static final Logger LOG = LoggerFactory.getLogger(SimpleUnzip.class);

    public static void main(String[] args) {
        Pipeline p = Pipeline.create(
                PipelineOptionsFactory.fromArgs(args).withValidation().create());

        GcsUtilFactory factory = new GcsUtilFactory();
        GcsUtil util = factory.create(p.getOptions());
        try {
            List<GcsPath> gcsPaths = util.expand(GcsPath.fromUri("gs://tlogdataflow/test/*.zip"));
            List<String> paths = new ArrayList<String>();

            for (GcsPath gcsp : gcsPaths) {
                paths.add(gcsp.toString());
            }
            p.apply(Create.of(paths))
                    .apply(ParDo.of(new UnzipFN()));
            p.run();
        }
        catch (Exception e) {
            LOG.error(e.getMessage());
        }
    }

    public static class UnzipFN extends DoFn<String, Long> {
        private static final long serialVersionUID = 2015166770614756341L;
        private long filesUnzipped = 0;

        @Override
        public void processElement(ProcessContext c) {
            String p = c.element();
            GcsUtilFactory factory = new GcsUtilFactory();
            GcsUtil u = factory.create(c.getPipelineOptions());
            byte[] buffer = new byte[100000000];

            try {
                SeekableByteChannel sek = u.open(GcsPath.fromUri(p));
                InputStream is = Channels.newInputStream(sek);
                BufferedInputStream bis = new BufferedInputStream(is);
                ZipInputStream zis = new ZipInputStream(bis);
                ZipEntry ze = zis.getNextEntry();

                while (ze != null) {
                    LOG.info("Unzipping File {}", ze.getName());
                    WritableByteChannel wri = u.create(GcsPath.fromUri("gs://tlogdataflow/test/" + ze.getName()), getType(ze.getName()));
                    OutputStream os = Channels.newOutputStream(wri);
                    int len;
                    while ((len = zis.read(buffer)) > 0) {
                        os.write(buffer, 0, len);
                    }
                    os.close();
                    filesUnzipped++;
                    ze = zis.getNextEntry();
                }
                zis.closeEntry();
                zis.close();
            }
            catch (Exception e) {
                e.printStackTrace();
            }
            c.output(filesUnzipped);
        }

        private String getType(String fName) {
            if (fName.endsWith(".zip")) {
                return "application/x-zip-compressed";
            }
            else {
                return "text/plain";
            }
        }
    }
}
Dataflow / Apache Beam support ZIP-compressed files in TextIO automatically: TextIO.read().from(filepattern) will automatically decompress files matching the filepattern according to their extension, and .zip is one of the supported formats - in that case it will implicitly concatenate all files inside the .zip into a single file, and parse lines of text from that.
You can also specify compression type explicitly using TextIO.read().from(filepattern).withCompressionType(...) if the files don't have an extension.
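A minimal sketch following this answer (the bucket path and the pipeline variable are hypothetical; with a .zip extension TextIO picks the compression automatically, so the explicit compression setting is only needed when the extension is missing):

PCollection<String> lines = pipeline.apply(
        TextIO.read().from("gs://my-bucket/incoming/*.zip")); // decompressed and concatenated per archive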

Jena programming error while reading from an input .rdf file - please guide me

package sample;

import java.io.InputStream;

import com.hp.hpl.jena.rdf.model.Model;
import com.hp.hpl.jena.rdf.model.ModelFactory;
import com.hp.hpl.jena.util.FileManager;

public class ReadRDF extends Object {
    static final String fileName = "foaf-ijd.rdf";

    public static void main(String[] args) {
        Model model = ModelFactory.createDefaultModel();
        InputStream in = FileManager.get().open(fileName);
        if (in == null) {
            throw new IllegalArgumentException("File: " + fileName
                    + " not found");
        }
        model.read(in, "");
        model.write(System.out);
    }
}
Error output:
Exception in thread "main" java.lang.NoSuchMethodError: org.slf4j.Logger.isTraceEnabled()Z
    at com.hp.hpl.jena.util.LocatorFile.open(LocatorFile.java:118)
    at com.hp.hpl.jena.util.FileManager.openNoMapOrNull(FileManager.java:527)
    at com.hp.hpl.jena.util.FileManager.openNoMap(FileManager.java:510)
    at com.hp.hpl.jena.util.LocationMapper.initFromPath(LocationMapper.java:132)
    at com.hp.hpl.jena.util.LocationMapper.get(LocationMapper.java:61)
    at com.hp.hpl.jena.util.FileManager.makeGlobal(FileManager.java:116)
    at com.hp.hpl.jena.util.FileManager.get(FileManager.java:82)
    at sample.ReadRDF.main(ReadRDF.java:17)
This error can appear if you don't add all the jar files from the /lib dir of the Jena distribution to your CLASSPATH.
It can also appear if the slf4j version you use differs from the one Jena ships with.
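If the project is built with Maven rather than a hand-assembled classpath, a sketch of the equivalent fix is to depend on the apache-jena-libs POM, which pulls in Jena together with a consistent slf4j version; the version below is illustrative and should be a 2.x release to match the com.hp.hpl.jena imports above:

<dependency>
    <groupId>org.apache.jena</groupId>
    <artifactId>apache-jena-libs</artifactId>
    <type>pom</type>
    <version>2.13.0</version>
</dependency>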
