Getting an error while installing kured using Fleet - Docker

The error I am getting is:
error while running post render on files: invalid cluster scoped object [name=namespaces-kured kind=ClusterRole apiversion:rbac.authorization.k8s.io/v1] found. Consider using "DefaultNamespace" not "namespace" in fleet.yaml
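For reference, the fix the message points at is to set the target namespace through defaultNamespace in fleet.yaml rather than namespace, because the bundle contains cluster-scoped objects (ClusterRole). A minimal sketch, assuming a Helm-based bundle; the chart repo URL is only illustrative:

# fleet.yaml (sketch): use defaultNamespace instead of namespace when the chart
# contains cluster-scoped objects such as ClusterRole
defaultNamespace: kured
helm:
  chart: kured
  repo: https://kubereboot.github.io/charts  # illustrative repo URL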

Related

Kafka Streams stateful mode throws an error (RocksDB does not initialize)

The error:
Failed to close task manager due to the following error:
java.lang.NoClassDefFoundError: Could not initialize class org.rocksdb.DBOptions
at org.apache.kafka.streams.state.internals.RocksDBStore.openDB(RocksDBStore.java:133)
at org.apache.kafka.streams.state.internals.TimestampedSegment.openDB(TimestampedSegment.java:49)
at org.apache.kafka.streams.state.internals.TimestampedSegments.getOrCreateSegment(TimestampedSegments.java:50)
at org.apache.kafka.streams.state.internals.TimestampedSegments.getOrCreateSegment(TimestampedSegments.java:25)
at org.apache.kafka.streams.state.internals.AbstractSegments.getOrCreateSegmentIfLive(AbstractSegments.java:84)
at org.apache.kafka.streams.state.internals.AbstractRocksDBSegmentedBytesStore.put(AbstractRocksDBSegmentedBytesStore.java:146)
at org.apache.kafka.streams.state.internals.RocksDBWindowStore.put(RocksDBWindowStore.java:61)
at org.apache.kafka.streams.state.internals.RocksDBWindowStore.put(RocksDBWindowStore.java:27)
at org.apache.kafka.streams.state.internals.ChangeLoggingWindowBytesStore.put(ChangeLoggingWindowBytesStore.java:111)
at org.apache.kafka.streams.state.internals.ChangeLoggingWindowBytesStore.put(ChangeLoggingWindowBytesStore.java:34)
at org.apache.kafka.streams.state.internals.CachingWindowStore.putAndMaybeForward(CachingWindowStore.java:106)
at org.apache.kafka.streams.state.internals.CachingWindowStore.lambda$initInternal$0(CachingWindowStore.java:86)
at org.apache.kafka.streams.state.internals.NamedCache.flush(NamedCache.java:151)
at org.apache.kafka.streams.state.internals.NamedCache.flush(NamedCache.java:109)
at org.apache.kafka.streams.state.internals.ThreadCache.flush(ThreadCache.java:124)
at org.apache.kafka.streams.state.internals.CachingWindowStore.flush(CachingWindowStore.java:291)
at org.apache.kafka.streams.state.internals.WrappedStateStore.flush(WrappedStateStore.java:84)
at org.apache.kafka.streams.state.internals.MeteredWindowStore.lambda$flush$4(MeteredWindowStore.java:200)
at org.apache.kafka.streams.processor.internals.metrics.StreamsMetricsImpl.maybeMeasureLatency(StreamsMetricsImpl.java:801)
at org.apache.kafka.streams.state.internals.MeteredWindowStore.flush(MeteredWindowStore.java:200)
at org.apache.kafka.streams.processor.internals.ProcessorStateManager.flush(ProcessorStateManager.java:282)
at org.apache.kafka.streams.processor.internals.StreamTask.suspend(StreamTask.java:647)
at org.apache.kafka.streams.processor.internals.StreamTask.close(StreamTask.java:745)
at org.apache.kafka.streams.processor.internals.AssignedStreamsTasks.closeTask(AssignedStreamsTasks.java:81)
at org.apache.kafka.streams.processor.internals.AssignedStreamsTasks.closeTask(AssignedStreamsTasks.java:37)
at org.apache.kafka.streams.processor.internals.AssignedTasks.shutdown(AssignedTasks.java:256)
at org.apache.kafka.streams.processor.internals.AssignedStreamsTasks.shutdown(AssignedStreamsTasks.java:535)
at org.apache.kafka.streams.processor.internals.TaskManager.shutdown(TaskManager.java:292)
at org.apache.kafka.streams.processor.internals.StreamThread.completeShutdown(StreamThread.java:1133)
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:682)
It works fine locally.
The error only occurs in the environment where the application is deployed to Docker.
Kafka Streams version: 2.5.0

error converting YAML to JSON: yaml: line 74: mapping values are not allowed in this context

I have downloaded the akv2k8s Helm chart locally and changed nothing, and I am using the default credentials to install it on AKS.
I am getting this error when installing it:
Error: UPGRADE FAILED: YAML parse error on akv2k8s/templates/env-injector-deployment.yaml: error converting YAML to JSON: yaml: line 74: mapping values are not allowed in this context
Has anyone else faced this problem?
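For context, this class of YAML error usually means a plain (unquoted) scalar contains a colon followed by a space, which the parser reads as the start of a nested mapping. A minimal, hypothetical illustration of the error pattern (not the actual akv2k8s template):

# hypothetical values, not the real env-injector-deployment.yaml
broken: this value breaks: the unquoted colon starts a mapping    # -> mapping values are not allowed in this context
fixed: "this value is fine: the colon sits inside a quoted string"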

A command from the RDF4J framework is throwing an error

When running the following code, an error is thrown by the add command. The RDF4J framework is used for communicating with a GraphDB knowledge base:
import org.eclipse.rdf4j.model.Model;
import org.eclipse.rdf4j.repository.RepositoryConnection;

Model model;
// statements have been added to the model
public RepositoryConnection connection;
...
connection.add(model); // <-- error is thrown here
The following error is thrown by the add command:
2022-01-24 09:46:02 [http-nio-8080-exec-5] ERROR restapi.SubmitRMService - unable to rollback transaction. HTTP error code 404
org.eclipse.rdf4j.repository.RepositoryException: unable to rollback transaction. HTTP error code 404
at org.eclipse.rdf4j.http.client.RDF4JProtocolSession.rollbackTransaction(RDF4JProtocolSession.java:786)
at org.eclipse.rdf4j.repository.http.HTTPRepositoryConnection.rollback(HTTPRepositoryConnection.java:354)
at org.eclipse.rdf4j.repository.base.AbstractRepositoryConnection.conditionalRollback(AbstractRepositoryConnection.java:335)
at org.eclipse.rdf4j.repository.base.AbstractRepositoryConnection.add(AbstractRepositoryConnection.java:379)
The code runs successfully in two other environments, but not in the one we are setting up with Kubernetes. We ensured that /opt/graphdb/home has write permissions.
Question: I just do not understand the 404 RepositoryException. The code can successfully run queries before the .add(model) call, so the connection with GraphDB is OK.
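For what it is worth, the stack trace shows the 404 being raised by the automatic rollback that AbstractRepositoryConnection attempts after add() fails, so the underlying add failure is what needs diagnosing. A minimal sketch of the same call with explicit transaction handling, assuming the connection field from the snippet above:

import org.eclipse.rdf4j.repository.RepositoryException;

try {
    connection.begin();        // open an explicit transaction on the remote repository
    connection.add(model);     // send the collected statements
    connection.commit();
} catch (RepositoryException e) {
    if (connection.isActive()) {
        connection.rollback(); // in the failing environment this rollback itself comes back with HTTP 404
    }
    throw e;
}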

Failed to start the VM error when starting a Dataflow SQL job

Getting the following error when I try to launch a Dataflow SQL job:
Failed to start the VM, launcher-____, used for launching because of status code: INVALID_ARGUMENT, reason: Error: Message: Invalid value for field 'resource.networkInterfaces[0].network': 'global/networks/default'. The referenced network resource cannot be found. HTTP Code: 400.
This issue just started today.
Adding the default network solved the issue.
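For reference, if the project has no VPC named default, an auto-mode network with that name can be recreated with gcloud before relaunching the job (a sketch, assuming sufficient project permissions):

gcloud compute networks create default --subnet-mode=auto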

Using Ant task in MobileFirst: error in the wladm command

I followed this tutorial:
http://www.ibm.com/support/knowledgecenter/SSHS8R_7.1.0/com.ibm.worklight.appadmin.doc/admin/r_invoking_the_wladm_program.html
I'm trying to run a command in cmd, and tried this:
wladm --url=http://IP:9080 --user=demo --passwordfile=PATH\wladm.config --secure=false show info
I'm getting this error:
Error accessing http://IP:9080/userAndConfigInfo?locale=en_US:
HTTP/1.1 404 Not Found
Now when I enter another command:
wladm --url=http://IP:9080 --user=demo --passwordfile=PATH\wladm.config --secure=false list adapters RuntimeName
I'm getting this error:
Error accessing http://sv591527.ph.sunlife:9080/management-apis/1.0/runtimes/SunlifeTestApp/adapters?pageSize=1000000000&locale=en_US: HTTP/1.1 404 Not Found
Anyone have an idea what I'm missing?
The URL should contain the context root of the MobileFirst web application for administration services; that is, it should be something like http://IP:9080/wladmin or http://IP:9080/worklightadmin. For more details, please consult the wladm usage documentation: http://www.ibm.com/support/knowledgecenter/SSHS8R_7.1.0/com.ibm.worklight.appadmin.doc/admin/r_invoking_the_wladm_program.html.
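For example, assuming the default context root /worklightadmin, the first command would become:

wladm --url=http://IP:9080/worklightadmin --user=demo --passwordfile=PATH\wladm.config --secure=false show info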
