JSON.SET throws an exception in redis-cli and StackExchange.Redis - docker

I'm trying to set and get JSON values in Redis with Redis OM. It uses StackExchange.Redis under the hood, and the same exception occurs either way.
using System;
using StackExchange.Redis;
namespace test
{
class Program
{
static void Main(string[] args)
{
Console.WriteLine("Hello World!");
var muxe = ConnectionMultiplexer.Connect("localhost");
var db = muxe.GetDatabase();
var JSONResult = db.CreateTransaction();
db.Execute("JSON.SET", "dog:1", "$", "{\"name\":\"Honey\",\"breed\":\"Greyhound\"}");
db.Execute("JSON.GET", "dog:1", "$.breed");
JSONResult.Execute();
Console.WriteLine("I'm ok");
}
}
}
Exception: StackExchange.Redis.RedisServerException: 'ERR unknown command `JSON.SET`,

I messed up with the Redis modules: the alpine image only ships the core functionality, so some of the modules (including RedisJSON, which provides JSON.SET) are not present. Please check the loaded Redis modules:
redis-cli info modules
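If the module list comes back empty, the server has no RedisJSON loaded, so JSON.SET cannot work no matter which client sends it. A minimal sketch of a fix, assuming you are free to swap the container image: run the redis-stack image, which bundles RedisJSON, and re-check the modules:
docker run -d --name redis-stack -p 6379:6379 redis/redis-stack:latest
docker exec -it redis-stack redis-cli info modules
With a module-enabled server, the original db.Execute("JSON.SET", ...) call should work unchanged.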

Related

REST call not working with Camel running in Docker

I have this Camel Rest Route:
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.main.Main;
import org.apache.camel.model.rest.RestBindingMode;
import spark.Spark;
public class MainCamel {
public static void main(final String[] args) throws Exception {
final Main camelMain = new Main();
camelMain.configure().addRoutesBuilder(new RouteBuilder() {
@Override
public void configure() throws Exception {
this.getContext().getRegistry().bind("healthcheck", CheckUtil.class);
this.restConfiguration()
.bindingMode(RestBindingMode.auto)
.component("netty-http")
.host("localhost")
.port(11010);
this.rest("/healthcheck")
.get()
.description("Healthcheck for docker")
.outType(Integer.class)
.to("bean:healthcheck?method=healthCheck");
}
});
// spark
Spark.port(11011);
Spark.get("/hello", (req, res) -> "Hello World");
System.out.println("ready");
camelMain.run(args);
}
public static class CheckUtil {
public Integer healthCheck() {
return 0;
}
}
}
I also created a second REST server with Spark.
The Camel route does NOT work if the code is executed in a Docker container.
Exception: org.apache.http.NoHttpResponseException: localhost:11010 failed to respond
The Spark server works fine.
However, when executing the code directly in IntelliJ, both REST servers work. Of course, both ports are exposed in the container.
You are binding the Netty HTTP server to localhost, meaning it will not be able to serve requests that originate from outside the container.
Change .host("localhost") to .host("0.0.0.0") so that the server listens on all available network interfaces.
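For reference, a minimal sketch of that change against the route above (only the host argument differs):
this.restConfiguration()
.bindingMode(RestBindingMode.auto)
.component("netty-http")
.host("0.0.0.0") // listen on all interfaces so the published Docker port is reachable
.port(11010);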

How to use Dart's analysis server method search.findTopLevelDeclarations?

I'm trying to get a list of top-level class declarations with Dart's analysis server. So I'm sending a search.findTopLevelDeclarations request, but the search results are always empty.
It seems to me that the analysis server doesn't know where to search. I've tried setting my project's root as the execution context root (execution.createContext) and/or the analysis root (analysis.setAnalysisRoots), but the search results are still empty.
What should I do to make the server understand where to search for declarations?
Never played with this before so I got into quite a journey...
I don't know how you are interacting with the analysis server, but I have made a working example using the analysis_server_client package. One problem with that is that the version on pub.dev is quite old, so I ended up fetching the version from the stable branch of the Dart SDK:
https://github.com/dart-lang/sdk/tree/stable/pkg/analysis_server_client
You can then reference the package in your pubspec.yaml with a path dependency:
dependencies:
  analysis_server_client:
    path: /path/to/analysis_server_client
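Alternatively (an untested sketch; pub supports git dependencies that point at a sub-directory of a repository), you can let pub fetch it straight from the SDK repository:
dependencies:
  analysis_server_client:
    git:
      url: https://github.com/dart-lang/sdk.git
      ref: stable
      path: pkg/analysis_server_client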
I then made a simplified version of the example code from:
https://github.com/dart-lang/sdk/blob/stable/pkg/analysis_server_client/example/example.dart
import 'dart:io' show exit;
import 'package:analysis_server_client/handler/connection_handler.dart';
import 'package:analysis_server_client/handler/notification_handler.dart';
import 'package:analysis_server_client/protocol.dart';
import 'package:analysis_server_client/server.dart';
final server = Server();
Future<void> main(List<String> args) async {
const targetDirPath = r'C:\tmp\simple_project';
const searchPattern = 'main';
// Launch the server
await server.start();
// Connect to the server
final handler = _Handler(server);
server.listenToOutput(notificationProcessor: handler.handleEvent);
if (!await handler.serverConnected(timeLimit: const Duration(seconds: 15))) {
exit(1);
}
await server.send(ANALYSIS_REQUEST_SET_ANALYSIS_ROOTS,
AnalysisSetAnalysisRootsParams([targetDirPath], const []).toJson());
await server.send(SEARCH_REQUEST_FIND_TOP_LEVEL_DECLARATIONS,
SearchFindTopLevelDeclarationsParams(searchPattern).toJson());
}
class _Handler with NotificationHandler, ConnectionHandler {
@override
final Server server;
_Handler(this.server);
@override
void onSearchResults(SearchResultsParams params) {
print('-- Start of result --');
params.results.forEach(print);
print('-- End of result --');
server.stop();
}
}
The project at C:\tmp\simple_project is a simple project created with the following command, which means it just contains a single main method:
dart create -t console-simple simple_project
When I run my analyzer program I get the following output:
-- Start of result --
{"location":{"file":"C:\\tmp\\simple_project\\bin\\simple_project.dart","offset":5,"length":4,"startLine":1,"startColumn":6,"endLine":1,"endColumn":10},"kind":"DECLARATION","isPotential":false,"path":[{"kind":"FUNCTION","name":"main","location":{"file":"C:\\tmp\\simple_project\\bin\\simple_project.dart","offset":5,"length":4,"startLine":1,"startColumn":6,"endLine":1,"endColumn":10},"flags":8,"parameters":"(List<String> arguments)","returnType":"void"},{"kind":"COMPILATION_UNIT","name":"simple_project.dart","location":{"file":"C:\\tmp\\simple_project\\bin\\simple_project.dart","offset":0,"length":0,"startLine":1,"startColumn":1,"endLine":1,"endColumn":1},"flags":16},{"kind":"LIBRARY","name":"","location":{"file":"C:\\tmp\\simple_project\\bin\\simple_project.dart","offset":0,"length":0,"startLine":1,"startColumn":1,"endLine":1,"endColumn":1},"flags":0}]}
-- End of result --
If I change searchPattern to an empty String, I get a long list of top-level declarations from the Dart SDK libraries that are included by default. I am sure there is a way to exclude those.
But as far as I can see, the searchPattern is a regular expression tested against the name of each top-level declaration, and the declaration is included if the expression matches any part of its name.
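So, as an illustration of mine (not taken from the analysis server docs), anchoring the pattern restricts the results to exact names:
// Matches only top-level declarations named exactly 'main'.
await server.send(SEARCH_REQUEST_FIND_TOP_LEVEL_DECLARATIONS,
    SearchFindTopLevelDeclarationsParams(r'^main$').toJson());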
I found the code responsible for the search here:
@override
Future<List<SearchMatch>> searchTopLevelDeclarations(String pattern) async {
var allElements = <Element>{};
var regExp = RegExp(pattern);
var drivers = _drivers.toList();
for (var driver in drivers) {
var elements = await driver.search.topLevelElements(regExp);
allElements.addAll(elements);
}
return allElements.map(SearchMatchImpl.forElement).toList();
}
https://github.com/dart-lang/sdk/blob/1278bd5adb6a857580f137e47bc521976222f7b9/pkg/analysis_server/lib/src/services/search/search_engine_internal.dart#L113-L123
Which calls into:
/// Returns top-level elements with names matching the given [regExp].
Future<List<Element>> topLevelElements(RegExp regExp) async {
List<Element> elements = <Element>[];
void addElement(Element element) {
if (!element.isSynthetic && regExp.hasMatch(element.displayName)) {
elements.add(element);
}
}
List<FileState> knownFiles = _driver.fsState.knownFiles.toList();
for (FileState file in knownFiles) {
var unitResult = await _driver.getUnitElement(file.path);
if (unitResult is UnitElementResult) {
CompilationUnitElement unitElement = unitResult.element;
unitElement.accessors.forEach(addElement);
unitElement.classes.forEach(addElement);
unitElement.enums.forEach(addElement);
unitElement.extensions.forEach(addElement);
unitElement.functions.forEach(addElement);
unitElement.mixins.forEach(addElement);
unitElement.topLevelVariables.forEach(addElement);
unitElement.typeAliases.forEach(addElement);
}
}
return elements;
}
https://github.com/dart-lang/sdk/blob/1278bd5adb6a857580f137e47bc521976222f7b9/pkg/analyzer/lib/src/dart/analysis/search.dart#L166-L192

Deploying a transaction event listener in a Neo4jDesktop installation

I have created a project that contains an ExtensionFactory subclass annotated with @ServiceProvider, which returns a LifecycleAdapter subclass that registers a transaction event listener in its start() method, as shown in this example. The code is below:
@ServiceProvider
public class EventListenerExtensionFactory extends ExtensionFactory<EventListenerExtensionFactory.Dependencies> {
private final List<TransactionEventListener<?>> listeners;
public EventListenerExtensionFactory() {
this(List.of(new MyListener()));
}
public EventListenerExtensionFactory(List<TransactionEventListener<?>> listeners) {
super(ExtensionType.DATABASE, "EVENT_LISTENER_EXT_FACTORY");
this.listeners = listeners;
}
@Override
public Lifecycle newInstance(ExtensionContext context, Dependencies dependencies) {
return new EventListenerLifecycleAdapter(dependencies, listeners);
}
@RequiredArgsConstructor
private static class EventListenerLifecycleAdapter extends LifecycleAdapter {
private final Dependencies dependencies;
private final List<TransactionEventListener<?>> listeners;
@Override
public void start() {
DatabaseManagementService managementService = dependencies.databaseManagementService();
listeners.forEach(listener -> managementService.registerTransactionEventListener(
DEFAULT_DATABASE_NAME, listener));
dependencies.log()
.getUserLog(EventListenerExtensionFactory.class)
.info("Registering transaction event listener for database " + DEFAULT_DATABASE_NAME);
}
}
interface Dependencies {
DatabaseManagementService databaseManagementService();
LogService log();
}
}
It works fine in an integration test:
public AbstractDatabaseTest(TransactionEventListener<?>... listeners) {
URI uri = Neo4jBuilders.newInProcessBuilder()
.withExtensionFactories(List.of(new EventListenerExtensionFactory(List.of(listeners))))
.withDisabledServer()
.build()
.boltURI();
driver = GraphDatabase.driver(uri);
session = driver.session();
}
Then I copy the jar file into the plugins directory of my desktop database:
$ cp build/libs/<myproject>.jar /mnt/c/Users/albert.gevorgyan/.Neo4jDesktop/relate-data/dbmss/dbms-7fe3cbdb-11b2-4ca2-81eb-474edbbb3dda/plugins/
I restart the database and even the whole desktop Neo4j program but it doesn't seem to identify the plugin or to initialize the factory: no log messages are found in neo4j.log after the start event, and the transaction events that should be captured by my listener are ignored. Interestingly, a custom function that I have defined in the same jar file actually works - I can call it in the browser. So something must be missing in the extension factory as it doesn't get instantiated.
Is it possible at all to deploy an ExtensionFactory in a Desktop installation and if yes, what am I doing wrong?
It works after I added a provider configuration file under META-INF/services, as explained in https://www.baeldung.com/java-spi. Neo4j then finds it.
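For completeness, a sketch of what that looks like (the package name of the factory is hypothetical, and the file name assumes the Neo4j 4.x base class org.neo4j.kernel.extension.ExtensionFactory): add a plain text file to the jar at META-INF/services/org.neo4j.kernel.extension.ExtensionFactory containing a single line, the fully qualified name of the implementation:
com.example.neo4j.EventListenerExtensionFactory
With that file on the classpath, Neo4j's ServiceLoader-based scan picks up the factory at startup and start() registers the listeners.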

Running Spring Boot test inside Docker container

I am wondering if it is possible to run a Spring Boot test inside a Docker container. I would like to test my service in an environment as close to the real one as possible. I am already using Testcontainers and LocalStack to set up my environment (Postgres, AWS Secrets Manager, etc.), but I couldn't find any way to run my service itself inside Docker. Ideally I would like to test my service inside LocalStack's EC2 instance, but just running it as a Docker container would be sufficient. I have a Dockerfile at the root of my project and am using the Gradle Palantir plugin to build the container. The bootstrap section of my test:
@Testcontainers
@ActiveProfiles(profiles = {"test","jpa"})
@SpringBootTest(classes = PayoutsApplication.class, webEnvironment=WebEnvironment.RANDOM_PORT,
properties = {"aws.paramstore.enabled=false", "aws.secretsmanager.enabled=false"})
@EnableAutoConfiguration(exclude = {
org.springframework.cloud.aws.autoconfigure.context.ContextInstanceDataAutoConfiguration.class,
org.springframework.cloud.aws.autoconfigure.context.ContextStackAutoConfiguration.class,
org.springframework.cloud.aws.autoconfigure.context.ContextRegionProviderAutoConfiguration.class
})
@TestMethodOrder(OrderAnnotation.class)
@ContextConfiguration(loader = PayoutsApplicationTests.CustomLoader.class)
@TestExecutionListeners(listeners = {DependencyInjectionTestExecutionListener.class, PayoutsApplicationTests.class})
class MyApplicationTests extends AbstractTestExecutionListener {
private static AWSSecretsManager awsSecretsManager;
private static AWSSimpleSystemsManagement ssmClient;
// we need it to execute the listeners
public static class CustomLoader extends SpringBootContextLoader {
@Override
protected SpringApplication getSpringApplication() {
PropertiesListener listener = new PropertiesListener();
ReflectionTestUtils.setField(listener, "awsSecretsManager", awsSecretsManager);
ReflectionTestUtils.setField(listener, "ssmClient", ssmClient);
SpringApplication app = super.getSpringApplication();
app.addListeners(listener);
return app;
}
}
static DockerImageName localstackImage = DockerImageName.parse("localstack/localstack:0.11.3");
private static final LocalStackContainer.Service[] TEST_SERVICES = {
LocalStackContainer.Service.SECRETSMANAGER,
LocalStackContainer.Service.SSM,
LocalStackContainer.Service.EC2,
};
static LocalStackContainer awsLocalStackContainers = new LocalStackContainer(localstackImage)
.withServices(TEST_SERVICES).withEnv("LOCALSTACK_HOSTNAME", "localhost")
.withEnv("HOSTNAME", "localhost");
@Container
public static PostgreSQLContainer<?> postgreSQL =
new PostgreSQLContainer<>("postgres:13.1")
.withUsername("clusteradmin")
.withPassword("testPassword")
.withDatabaseName("public");
@DynamicPropertySource
static void postgresqlProperties(DynamicPropertyRegistry registry) {
registry.add("spring.datasource.url", postgreSQL::getJdbcUrl);
registry.add("spring.datasource.password", postgreSQL::getPassword);
registry.add("spring.datasource.username", postgreSQL::getUsername);
}
...

Neo4j OGM returning No Host exception

I am running neo4j-community-2.2.5 locally on my macbook.
I am trying to connect to it from my code using neo4j-ogm version 1.1.2.
Here is the session factory:
public class Neo4jSessionFactory {
private final static SessionFactory sessionFactory = new SessionFactory("com.readypulse.rpinfluencernetwork.ogm.model");
private static Neo4jSessionFactory factory = new Neo4jSessionFactory();
public static Neo4jSessionFactory getInstance() {
return factory;
}
private Neo4jSessionFactory() {
}
public Session getNeo4jSession() {
return sessionFactory.openSession("http://localhost:7474", "neo4j", "mypassword");
}
}
I have an entity class:
@NodeEntity(label="Hashtag")
public class Hashtag extends Entity {
@Property(name = "name")
String name;
...
Service:
public interface HashtagService extends Service<Hashtag>{
}
Generic Service:
public abstract class GenericService<T> implements Service<T> {
private static final int DEPTH_LIST = 0;
private static final int DEPTH_ENTITY = 1;
private Session session = Neo4jSessionFactory.getInstance().getNeo4jSession();
public Iterable<T> findAll() {
return session.loadAll(getEntityType(), DEPTH_LIST);
}
public T find(Long id) {
return session.load(getEntityType(), id, DEPTH_ENTITY);
}
public void delete(Long id) {
session.delete(session.load(getEntityType(), id));
}
public void createOrUpdate(T entity) {
session.save(entity, DEPTH_ENTITY);
}
public abstract Class<T> getEntityType();
}
Calling code:
public static void main(String args[]) {
Hashtag hashtag = new Hashtag("fun");
HashtagService service = new HashtagServiceImpl();
service.createOrUpdate(hashtag);
}
I am running the code in Eclipse as a simple Java process, not on any application server.
Here is the full log with trace:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/lazywiz/.m2/repository/org/slf4j/slf4j-log4j12/1.5.8/slf4j-log4j12-1.5.8.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/lazywiz/.m2/repository/org/slf4j/slf4j-jdk14/1.5.11/slf4j-jdk14-1.5.11.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/lazywiz/.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
15/10/09 14:55:16 INFO info.ClassFileProcessor: Starting Post-processing phase
15/10/09 14:55:16 INFO info.ClassFileProcessor: Building annotation class map
15/10/09 14:55:16 INFO info.ClassFileProcessor: Building interface class map for 9 classes
15/10/09 14:55:16 INFO info.ClassFileProcessor: Registering default type converters...
15/10/09 14:55:16 INFO info.ClassFileProcessor: Post-processing complete
15/10/09 14:55:16 INFO info.ClassFileProcessor: 9 classes loaded in 16 milliseconds
15/10/09 14:55:17 WARN request.DefaultRequest: Caught response exception: No Host
Exception in thread "main" org.neo4j.ogm.session.result.ResultProcessingException: Failed to execute request: {"statements":[{"statement":"CREATE (_0:`Hashtag`{_0_props}) RETURN id(_0) AS _0","parameters":{"_0_props":{"name":"varun"}},"resultDataContents":["row"],"includeStats":false}]}
at org.neo4j.ogm.session.request.DefaultRequest.execute(DefaultRequest.java:105)
at org.neo4j.ogm.session.request.SessionRequestHandler.execute(SessionRequestHandler.java:99)
at org.neo4j.ogm.session.delegates.SaveDelegate.save(SaveDelegate.java:68)
at org.neo4j.ogm.session.Neo4jSession.save(Neo4jSession.java:391)
at com.readypulse.rpinfluencernetwork.ogm.service.GenericService.createOrUpdate(GenericService.java:26)
at com.readypulse.rpinfluencernetwork.GraphManager.main(GraphManager.java:16)
Caused by: org.apache.http.client.HttpResponseException: No Host
at org.neo4j.ogm.session.request.DefaultRequest.execute(DefaultRequest.java:86)
... 5 more
Can someone please suggest where I am going wrong?
Prior to this I had a completely different code base where I was using graphDb = new GraphDatabaseFactory().newEmbeddedDatabase(dbPath);. But later I realized that this is not the right way when I want to connect to a Neo4j server running in a prod environment. I want to start the server and connect to it via Java and Ruby clients concurrently.
Thanks!
Some points:
a) You cannot use neo4j as the password. It is the default password when you install a new database, but it has to be changed on first start.
To change the password:
Open the Neo4j browser, and the first prompt will ask you to change the password.
Or issue a curl request to change the password:
curl -H "Content-Type: application/json"\
-H "Authorization: Basic echo -n 'neo4j:neo4j' | base64"\
-X POST -d '{"password":"yourNewPassword"}'\
-I http://localhost:7474/user/neo4j/password
b) If you're not using SDN4, you need to pass the user and password as arguments to the SessionFactory's openSession method:
Session session = sessionFactory.openSession("http://localhost:7474", username, password);
Docs:
Neo4j Authentication: http://neo4j.com/docs/stable/rest-api-security.html#rest-api-user-status-on-first-access
Neo4j OGM Session Authentication : http://neo4j.com/docs/ogm/java/stable/#reference_programming-model_session
