error from Akka shutting down JVM since 'akka.jvm-exit-on-fatal-error' is enabled for ActorSystem[play-dev-mode] java.lang.StackOverflowError - playframework-2.6

When I start the Play/Scala application in IntelliJ, I get the following error:
Uncaught error from thread [play-dev-mode-akka.actor.default-dispatcher-2]: null, shutting down JVM since 'akka.jvm-exit-on-fatal-error' is enabled for ActorSystem[play-dev-mode]
java.lang.StackOverflowError
at play.api.BuiltInComponents.httpRequestHandler(Application.scala:321)
The error points to the following line of code: lazy val httpRequestHandler: HttpRequestHandler = new DefaultHttpRequestHandler(router, httpErrorHandler, httpConfiguration, httpFilters: _*). This is not my code, though; it is Play's. The first frame from my own code that I can see in the stack trace (AppLoader.scala) is:
lazy val userController = new UserController(credentialsProvider,application.configuration, utilities,userRepository, userTokenRepository,mailerService,controllerComponents, silhouetteJWTProvider,messagesApi,langs)
The UserController is defined as follows:
class UserController (credentialsProvider:CredentialsProvider,config:Configuration, utilities:HelperMethods, userRepo: UsersRepository,userTokenRepo:UserTokenRepository, mailerService:MailerService, cc: ControllerComponents, silhouette: Silhouette[JWTEnv],messagesApi: MessagesApi,langs:Langs)(implicit exec: ExecutionContext) extends AbstractController(cc){
I have seen solutions on SO which show how to disable the feature (disable akka.jvm-exit-on-fatal-error for actorsystem in java), but I don't think it is a good idea to let the JVM keep running after a StackOverflowError. I don't know what is causing the issue, though.
Why am I getting the error? To me, it seems the code is running in some sort of loop because the stack trace is repetitive.
Server started, use Alt+D to stop
practice question Javascript repo is practice_questions_javascript_tag
practice question html repo is practice_questions_html_tag
[warn] c.d.d.c.Cluster - You listed localhost/0:0:0:0:0:0:0:1:9042 in your contact points, but it wasn't found in the control host's system.peers at startup
database will connect with keyspace Some(codingjedi)
(keyspace is ,codingjedi)
user repo is users
app loader: csrf values: csrfToken, Some(CJCsrfCookie), CJCsrfHeader
Uncaught error from thread [play-dev-mode-akka.actor.default-dispatcher-2]: null, shutting down JVM since 'akka.jvm-exit-on-fatal-error' is enabled for ActorSystem[play-dev-mode]
java.lang.StackOverflowError
at play.api.BuiltInComponents.httpRequestHandler(Application.scala:321)
at play.api.BuiltInComponents.httpRequestHandler$(Application.scala:321)
at play.api.BuiltInComponentsFromContext.httpRequestHandler$lzycompute(ApplicationLoader.scala:122)
at play.api.BuiltInComponentsFromContext.httpRequestHandler(ApplicationLoader.scala:122)
at play.api.BuiltInComponents.application(Application.scala:324)
at play.api.BuiltInComponents.application$(Application.scala:323)
at play.api.BuiltInComponentsFromContext.application$lzycompute(ApplicationLoader.scala:122)
at play.api.BuiltInComponentsFromContext.application(ApplicationLoader.scala:122)
at app.AppComponents.userController$lzycompute(AppLoader.scala:357)
at app.AppComponents.userController(AppLoader.scala:357)
at app.AppComponents.userWSRoutes$lzycompute(AppLoader.scala:377)
at app.AppComponents.userWSRoutes(AppLoader.scala:377)
at app.AppComponents.router$lzycompute(AppLoader.scala:379)
at app.AppComponents.router(AppLoader.scala:379)
at app.AppComponents.router(AppLoader.scala:128)
... (the 15 frames above repeat in the same order until the stack overflows; the remainder of the trace is identical)
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256M; support was removed in 8.0

I haven't fully debugged the issue yet, but it seems that I should be using the configuration object provided by BuiltInComponentsFromContext instead of application.configuration. I changed the userController definition to the following and the code worked:
lazy val userController = new UserController(credentialsProvider,configuration, utilities,userRepository, userTokenRepository,mailerService,controllerComponents, silhouetteJWTProvider,messagesApi,langs)
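For context, here is a minimal sketch of why the old wiring loops. The controller body, route, and config key below are made up for illustration, and the real constructor takes many more dependencies; only the userController/router wiring and the Play-provided members mirror the real AppLoader. application depends on httpRequestHandler, which depends on router, which depends on userController, so referencing application.configuration inside userController closes the circle and the lazy vals keep re-entering each other until the stack overflows. The configuration member that BuiltInComponentsFromContext provides comes straight from the loader context and breaks the cycle:

import play.api.{ApplicationLoader, BuiltInComponentsFromContext, Configuration, NoHttpFiltersComponents}
import play.api.mvc.{AbstractController, ControllerComponents}
import play.api.routing.Router
import play.api.routing.sird._

// Hypothetical controller that only needs a Configuration, to keep the sketch small.
class UserController(config: Configuration, cc: ControllerComponents) extends AbstractController(cc) {
  def index = Action { Ok(config.get[String]("play.http.context")) }
}

class AppComponents(context: ApplicationLoader.Context)
  extends BuiltInComponentsFromContext(context)
  with NoHttpFiltersComponents {

  // BAD: application.configuration forces `application` to be built, and
  // application -> httpRequestHandler -> router -> userController -> application ... (StackOverflowError)
  // lazy val userController = new UserController(application.configuration, controllerComponents)

  // GOOD: `configuration` is provided directly by BuiltInComponentsFromContext,
  // so userController no longer depends on `application` and the cycle is broken.
  lazy val userController = new UserController(configuration, controllerComponents)

  lazy val router: Router = Router.from {
    case GET(p"/") => userController.index
  }
}

With this wiring the router (and every controller it needs) can be constructed first, and the Application itself is only built afterwards.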

Related

Keycloak and Docker OutOfMemoryError - process/resource limits reached

I have rented a virtual ubuntu server. Various applications run on it in Docker containers and natively:
Plesk
Wordpress
Flarum
MySQL
Wiki.js (in Docker container)
Keycloak (in Docker container)
MariaDB (in Docker container)
I use Keycloak as SSO for Wordpress, Wiki.js and Flarum. Now I have the problem that Keycloak simply crashes after a while and I can't restart it in Docker. I get the following error message:
keycloak_1 | 17:22:06,447 DEBUG [org.jboss.as.config] (MSC service thread 1-3) VM Arguments: -D[Standalone] -Xms512m -Xmx2048m -XX:MetaspaceSize=96M -XX:MaxMetaspaceSize=256m -XX:+UseAdaptiveSizePolicy -XX:MaxMetaspaceSize=1024m -Djboss.modules.system.pkgs=org.jboss.byteman -Djava.awt.headless=true-Djava.net.preferIPv4Stack=true --add-exports=java.base/sun.nio.ch=ALL-UNNAMED --add-exports=jdk.unsupported/sun.misc=ALL-UNNAMED --add-exports=jdk.unsupported/sun.reflect=ALL-UNNAMED -Dorg.jboss.boot.log.file=/opt/jboss/keycloak/standalone/log/server.log -Dlogging.configuration=file:/opt/jboss/keycloak/standalone/configuration/logging.properties
keycloak_1 | 17:22:19,493 ERROR [org.jboss.as.controller.management-operation] (Controller Boot Thread) WFLYCTL0013: Operation ("add") failed - address: ([
keycloak_1 | ("subsystem" => "infinispan"),
keycloak_1 | ("cache-container" => "keycloak"),
keycloak_1 | ("thread-pool" => "transport")
keycloak_1 | ]) - failure description: {"WFLYCTL0080: Failed services" => {"org.wildfly.clustering.infinispan.cache-container.keycloak" => "org.infinispan.manager.EmbeddedCacheManagerStartupException: org.infinispan.commons.CacheException: java.lang.OutOfMemoryError: unable to create native thread: possibly out of memory or process/resource limits reached
keycloak_1 | Caused by: org.infinispan.manager.EmbeddedCacheManagerStartupException: org.infinispan.commons.CacheException: java.lang.OutOfMemoryError: unable to create native thread: possibly out of memory or process/resource limits reached
keycloak_1 | Caused by: org.infinispan.commons.CacheException: java.lang.OutOfMemoryError: unable to create native thread: possibly out of memory or process/resource limits reached
keycloak_1 | Caused by: java.lang.OutOfMemoryError: unable to create native thread: possibly out of memory or process/resource limits reached"}}
keycloak_1 | 17:22:19,505 INFO [org.jboss.as.server] (ServerService Thread Pool -- 46) WFLYSRV0010: Deployed "keycloak-server.war" (runtime-name : "keycloak-server.war")
keycloak_1 | 17:22:19,507 INFO [org.jboss.as.controller] (Controller Boot Thread) WFLYCTL0183: Service status report
keycloak_1 | WFLYCTL0186: Services which failed to start: service org.wildfly.clustering.infinispan.cache.ejb.http-remoting-connector: org.infinispan.commons.CacheConfigurationException: Error starting component org.infinispan.expiration.impl.InternalExpirationManager
keycloak_1 | service org.wildfly.clustering.infinispan.cache-container.keycloak: org.infinispan.manager.EmbeddedCacheManagerStartupException: org.infinispan.commons.CacheException: java.lang.OutOfMemoryError: unable to create native thread: possibly out of memory or process/resource limits reached
keycloak_1 | WFLYCTL0448: 32 additional services are down due to their dependencies being missing or failed
keycloak_1 | 17:22:19,599 INFO [org.jboss.as.server] (Controller Boot Thread) WFLYSRV0212: Resuming server
keycloak_1 | 17:22:19,606 ERROR [org.jboss.as] (Controller Boot Thread) WFLYSRV0026: Keycloak 12.0.4 (WildFly Core 13.0.3.Final) started (with errors) in 15455ms - Started 558 of 926 services (44 services failed or missing dependencies, 684 services are lazy, passive or on-demand)
keycloak_1 | 17:22:19,614 INFO [org.jboss.as] (Controller Boot Thread) WFLYSRV0060: Http management interface listening on http://127.0.0.1:9990/management
keycloak_1 | 17:22:19,614 INFO [org.jboss.as] (Controller Boot Thread) WFLYSRV0051: Admin console listening on http://127.0.0.1:9990
The critical error seems to be the following:
keycloak_1 | 17:48:15,196 ERROR [org.jboss.msc.service.fail] (ServerService Thread Pool -- 60) MSC000001: Failed to start service org.wildfly.clustering.infinispan.cache-container.keycloak: org.jboss.msc.service.StartException in service org.wildfly.clustering.infinispan.cache-container.keycloak: org.infinispan.manager.EmbeddedCacheManagerStartupException: org.infinispan.commons.CacheException: java.lang.OutOfMemoryError: unable to create native thread: possibly out of memory or process/resource limits reached
At first I thought that Keycloak in Docker needed more memory. Unfortunately, that change didn't bring the desired success. After some research, I read that there are sometimes problems with thread limits on virtual servers. Unfortunately, I don't know much about this topic. I hope someone can help me. :)
Am I right that it can be due to the thread limit of the virtual server?
Attached is my docker-compose file:
version: '3'
services:
  mariadb:
    image: mariadb:latest
    restart: always
    environment:
      MYSQL_ROOT_PASSWORD: ******
      MYSQL_DATABASE: app_keycloak
      MYSQL_USER: ******
      MYSQL_PASSWORD: ******
    ports:
      - 3308:3306
    # Copy-pasted from https://github.com/docker-library/mariadb/issues/94
    healthcheck:
      test: ["CMD", "mysqladmin", "ping", "--silent"]
  keycloak:
    image: jboss/keycloak:latest
    restart: always
    environment:
      DB_VENDOR: mariadb
      DB_ADDR: mariadb
      DB_DATABASE: ******
      DB_USER: ******
      DB_PASSWORD: ******
      KEYCLOAK_USER: ******
      KEYCLOAK_PASSWORD: ******
      JGROUPS_DISCOVERY_PROTOCOL: JDBC_PING
      JAVA_OPTS: "-server -Xms512m -Xmx2048m -XX:MetaspaceSize=96M -XX:MaxMetaspaceSize=256m -XX:+UseAdaptiveSizePolicy -XX:MaxMetaspaceSize=1024m -Djboss.modules.system.pkgs=org.jboss.byteman -Djava.awt.headless=true-Djava.net.preferIPv4Stack=true"
    ports:
      - 8080:8080
    depends_on:
      - mariadb
Update 1:
It does not seem to be due to the thread limit.
systemctl show --property=DefaultTasksMax
I checked whether there was a limit. I read that Ubuntu sets DefaultTasksMax to 15%.
cat /proc/user_beancounters
Overall, my provider gives me a limit of 700 threads.
Additionally, I looked at how many threads the current services were using, Docker in particular.
systemctl status *.service | grep -e Tasks
systemctl status docker.service | grep -e Tasks --> 75
Based on these findings, I set DefaultTasksMax to 200.
nano /etc/systemd/system.conf
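Presumably the edit in /etc/systemd/system.conf boiled down to setting one line in the [Manager] section (the value comes from the text above; the rest of the file is not shown, so this is only a sketch):
[Manager]
DefaultTasksMax=200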
systemctl daemon-reload
In the end, I restarted the Docker Compose.
docker-compose down
docker-compose up
Unfortunately, I still get the same error. :(
Update 2:
An update to version 13 of Keycloak has apparently fixed the problem. I will continue to monitor the behavior.

How to test a node app with webdriverio in a dockerized environment (ERR_SSL_PROTOCOL_ERROR)

I want to have three connected docker containers (to run it on a build server):
My application (name: app)
A browser (in this case chrome; name: selenium)
My End2End/UI tests (name: tester)
However, the tests aren't passing. The current error message from the tester container is "Request failed with status 500 due to unknown error: net::ERR_SSL_PROTOCOL_ERROR".
file structure:
dir
  test
    specs
      basic.js
  app.js
  docker-compose.yml
  Dockerfile
  package.json
  wdio.conf.js
And here my files:
"docker-compose.yml":
version: '3'
services:
  tester:
    build:
      context: .
      target: e2e-tests
    command: npx wdio wdio.conf.js
    links:
      - selenium
  selenium:
    image: selenium/standalone-chrome
    expose:
      - "4444"
    links:
      - app
  app:
    build:
      context: .
      target: prod
    expose:
      - "3000"
    command: npm start
"Dockerfile":
FROM node:12 as base
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install
FROM base as prod
COPY app.js ./
EXPOSE 3000
FROM base as e2e-tests
COPY wdio.conf.js ./
COPY test test
"wdio.conf.js":
exports.config = {
  hostname: 'selenium',
  path: '/wd/hub',
  specs: [
    './test/specs/**/*.js'
  ],
  exclude: [],
  maxInstances: 10,
  capabilities: [{
    maxInstances: 5,
    browserName: 'chrome',
    acceptInsecureCerts: true
  }],
  logLevel: 'info',
  bail: 0,
  baseUrl: 'http://localhost',
  waitforTimeout: 10000,
  connectionRetryTimeout: 120000,
  connectionRetryCount: 3,
  framework: 'jasmine',
  reporters: ['spec'],
  jasmineNodeOpts: {
    defaultTimeoutInterval: 60000,
    expectationResultHandler: function(passed, assertion) {}
  },
}
"test/spec/basic.js":
describe('test app', () => {
  it('should have the right title', () => {
    browser.url('http://app:3000')
    expect(browser).toHaveTextContaining('Hello');
  })
})
"app.js":
const express = require('express');
const app = express();
app.get('/', (req, res) => res.send('<html><head><title>Hello</title></head><body>Hello World!</body></html>'));
const server = app.listen(3000, () => {
  const { port } = server.address();
  console.log(`Test app listening on port ${port}`);
});
"package.json":
{
  "name": "wdiodocker",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1",
    "start": "node app.js"
  },
  "author": "",
  "license": "ISC",
  "dependencies": {
    "express": "^4.17.1"
  },
  "devDependencies": {
    "@wdio/cli": "^6.5.2",
    "@wdio/jasmine-framework": "^6.5.0",
    "@wdio/local-runner": "^6.5.2",
    "@wdio/spec-reporter": "^6.4.7",
    "@wdio/sync": "^6.5.0",
    "wdio-docker-service": "^3.0.0"
  }
}
"bigger snippet from the log":
tester_1 | 2020-09-28T07:55:11.844Z INFO @wdio/cli:launcher: Run onPrepare hook
tester_1 | 2020-09-28T07:55:11.848Z INFO @wdio/cli:launcher: Run onWorkerStart hook
tester_1 | 2020-09-28T07:55:11.850Z INFO @wdio/local-runner: Start worker 0-0 with arg: wdio.conf.js
selenium_1 | 07:55:12.017 INFO [GridLauncherV3.lambda$buildLaunchers$3] - Launching a standalone Selenium Server on port 4444
selenium_1 | 2020-09-28 07:55:12.150:INFO::main: Logging initialized #1038ms to org.seleniumhq.jetty9.util.log.StdErrLog
tester_1 | [0-0] 2020-09-28T07:55:12.634Z INFO @wdio/local-runner: Run worker command: run
tester_1 | [0-0] 2020-09-28T07:55:12.658Z INFO webdriverio: Initiate new session using the ./protocol-stub protocol
selenium_1 | 07:55:12.862 INFO [WebDriverServlet.<init>] - Initialising WebDriverServlet
tester_1 | [0-0] RUNNING in chrome - /test/specs/basic.js
tester_1 | [0-0] 2020-09-28T07:55:13.056Z INFO webdriverio: Initiate new session using the webdriver protocol
tester_1 | [0-0] 2020-09-28T07:55:13.060Z INFO webdriver: [POST] http://selenium:4444/wd/hub/session
tester_1 | [0-0] 2020-09-28T07:55:13.061Z INFO webdriver: DATA {
tester_1 | capabilities: {
tester_1 | alwaysMatch: { browserName: 'chrome', acceptInsecureCerts: true },
tester_1 | firstMatch: [ {} ]
tester_1 | },
tester_1 | desiredCapabilities: { browserName: 'chrome', acceptInsecureCerts: true }
tester_1 | }
selenium_1 | 07:55:13.122 INFO [SeleniumServer.boot] - Selenium Server is up and running on port 4444
selenium_1 | 07:55:13.331 INFO [ActiveSessionFactory.apply] - Capabilities are: {
selenium_1 | "acceptInsecureCerts": true,
selenium_1 | "browserName": "chrome"
selenium_1 | }
selenium_1 | 07:55:13.335 INFO [ActiveSessionFactory.lambda$apply$11] - Matched factory org.openqa.selenium.grid.session.remote.ServicedSession$Factory (provider: org.openqa.selenium.chrome.ChromeDriverService)
selenium_1 | Starting ChromeDriver 85.0.4183.83 (94abc2237ae0c9a4cb5f035431c8adfb94324633-refs/branch-heads/4183@{#1658}) on port 24508
selenium_1 | Only local connections are allowed.
selenium_1 | Please see https://chromedriver.chromium.org/security-considerations for suggestions on keeping ChromeDriver safe.
selenium_1 | ChromeDriver was started successfully.
selenium_1 | [1601279713.386][SEVERE]: bind() failed: Cannot assign requested address (99)
selenium_1 | 07:55:14.377 INFO [ProtocolHandshake.createSession] - Detected dialect: W3C
selenium_1 | 07:55:14.428 INFO [RemoteSession$Factory.lambda$performHandshake$0] - Started new session 82d59567606b7f101da8650600e7dd00 (org.openqa.selenium.chrome.ChromeDriverService)
tester_1 | [0-0] 2020-09-28T07:55:14.607Z INFO webdriver: COMMAND navigateTo("http://app:3000/")
tester_1 | [0-0] 2020-09-28T07:55:14.610Z INFO webdriver: [POST] http://selenium:4444/wd/hub/session/82d59567606b7f101da8650600e7dd00/url
tester_1 | [0-0] 2020-09-28T07:55:14.611Z INFO webdriver: DATA { url: 'http://app:3000/' }
tester_1 | [0-0] 2020-09-28T07:55:15.188Z WARN webdriver: Request failed with status 500 due to unknown error: net::ERR_SSL_PROTOCOL_ERROR
tester_1 | (Session info: chrome=85.0.4183.83)
tester_1 | [0-0] 2020-09-28T07:55:15.189Z INFO webdriver: Retrying 1/3
tester_1 | [0-0] 2020-09-28T07:55:15.190Z INFO webdriver: [POST] http://selenium:4444/wd/hub/session/82d59567606b7f101da8650600e7dd00/url
tester_1 | [0-0] 2020-09-28T07:55:15.191Z INFO webdriver: DATA { url: 'http://app:3000/' }
tester_1 | [0-0] 2020-09-28T07:55:15.592Z WARN webdriver: Request failed with status 500 due to unknown error: net::ERR_SSL_PROTOCOL_ERROR
tester_1 | (Session info: chrome=85.0.4183.83)
tester_1 | [0-0] 2020-09-28T07:55:15.592Z INFO webdriver: Retrying 2/3
tester_1 | [0-0] 2020-09-28T07:55:15.594Z INFO webdriver: [POST] http://selenium:4444/wd/hub/session/82d59567606b7f101da8650600e7dd00/url
tester_1 | [0-0] 2020-09-28T07:55:15.595Z INFO webdriver: DATA { url: 'http://app:3000/' }
tester_1 | [0-0] 2020-09-28T07:55:15.942Z WARN webdriver: Request failed with status 500 due to unknown error: net::ERR_SSL_PROTOCOL_ERROR
tester_1 | (Session info: chrome=85.0.4183.83)
tester_1 | [0-0] 2020-09-28T07:55:15.943Z INFO webdriver: Retrying 3/3
tester_1 | [0-0] 2020-09-28T07:55:15.944Z INFO webdriver: [POST] http://selenium:4444/wd/hub/session/82d59567606b7f101da8650600e7dd00/url
tester_1 | [0-0] 2020-09-28T07:55:15.945Z INFO webdriver: DATA { url: 'http://app:3000/' }
tester_1 | [0-0] 2020-09-28T07:55:16.104Z ERROR webdriver: Request failed with status 500 due to unknown error: unknown error: net::ERR_SSL_PROTOCOL_ERROR
tester_1 | (Session info: chrome=85.0.4183.83)
tester_1 | [0-0] Error in "test app should have the right title"
tester_1 | unknown error: unknown error: net::ERR_SSL_PROTOCOL_ERROR
tester_1 | (Session info: chrome=85.0.4183.83)
tester_1 | [0-0] 2020-09-28T07:55:16.118Z INFO webdriver: COMMAND deleteSession()
tester_1 | [0-0] 2020-09-28T07:55:16.119Z INFO webdriver: [DELETE] http://selenium:4444/wd/hub/session/82d59567606b7f101da8650600e7dd00
selenium_1 | 07:55:16.196 INFO [ActiveSessions$1.onStop] - Removing session 82d59567606b7f101da8650600e7dd00 (org.openqa.selenium.chrome.ChromeDriverService)
tester_1 | [0-0] FAILED in chrome - /test/specs/basic.js
tester_1 | 2020-09-28T07:55:16.299Z INFO @wdio/cli:launcher: Run onComplete hook
tester_1 |
tester_1 | "spec" Reporter:
tester_1 | ------------------------------------------------------------------
tester_1 | [chrome 85.0.4183.83 linux #0-0] Spec: /usr/src/app/test/specs/basic.js
tester_1 | [chrome 85.0.4183.83 linux #0-0] Running: chrome (v85.0.4183.83) on linux
tester_1 | [chrome 85.0.4183.83 linux #0-0] Session ID: 82d59567606b7f101da8650600e7dd00
tester_1 | [chrome 85.0.4183.83 linux #0-0]
tester_1 | [chrome 85.0.4183.83 linux #0-0] test app
tester_1 | [chrome 85.0.4183.83 linux #0-0] ✖ should have the right title
tester_1 | [chrome 85.0.4183.83 linux #0-0]
tester_1 | [chrome 85.0.4183.83 linux #0-0] 1 failing (1.6s)
tester_1 | [chrome 85.0.4183.83 linux #0-0]
tester_1 | [chrome 85.0.4183.83 linux #0-0] 1) test app should have the right title
tester_1 | [chrome 85.0.4183.83 linux #0-0] unknown error: unknown error: net::ERR_SSL_PROTOCOL_ERROR
tester_1 | (Session info: chrome=85.0.4183.83)
tester_1 | [chrome 85.0.4183.83 linux #0-0] at <Jasmine>
tester_1 | [chrome 85.0.4183.83 linux #0-0] at processTicksAndRejections (internal/process/task_queues.js:97:5)
tester_1 | [chrome 85.0.4183.83 linux #0-0] at UserContext.<anonymous> (/usr/src/app/test/specs/basic.js:3:17)
Some hints:
when I call curl http://app:3000 from the tester or the selenium container, I receive the HTML as expected
when I navigate to Google in the test, I receive the Google page source
Thank you for your time!

How can I connect to a remote InfluxDB

I can't connect to my remote InfluxDB. I pushed my InfluxDB Docker image (the latest one from Docker Hub) to Cloud Foundry and started the app. If I use the ping command
$ curl -sl -I http://address:8086/ping
*address is just a placeholder for the real IP
I get this response:
HTTP/1.1 503 Service Unavailable
Cache-Control: no-cache
Pragma: no-cache
Content-Type: text/html; charset=utf-8
Proxy-Connection: close
Connection: close
Content-Length: 1440
Do I have to add something to the Docker Image? Does someone know how to connect to it?
Thank you for your help.
Here are the logs:
2020-02-18T13:26:26.43+0100 [API/0] OUT Updated app with guid ff79fa42-b225-4c66-aebc-cfc754c04b0e ({"state"=>"STARTED"})
2020-02-18T13:26:26.65+0100 [CELL/0] OUT Cell 8078bac2-fb7e-47f3-a85d-f79d2bd1a115 creating container for instance be12bd72-8e35-452c-49de-e1bf
2020-02-18T13:26:37.65+0100 [CELL/0] OUT Cell 8078bac2-fb7e-47f3-a85d-f79d2bd1a115 successfully created container for instance be12bd72-8e35-452c-49de-e1bf
2020-02-18T13:26:37.77+0100 [CELL/0] OUT Starting health monitoring of container
2020-02-18T13:26:38.20+0100 [APP/PROC/WEB/0] OUT INFLUXDB_ADMIN_PASSWORD:IP5p9-dfvfRJG1jZOClwZ_ry5liL1lDE
2020-02-18T13:26:42.82+0100 [APP/PROC/WEB/0] OUT influxdb init process in progress...
2020-02-18T13:26:42.93+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:42.935499Z lvl=info msg="InfluxDB starting" log_id=0L244kTl000 version=1.7.9 branch=1.7 commit=23bc63d43a8dc05f53afa46e3526ebb5578f3d88
2020-02-18T13:26:42.93+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:42.935537Z lvl=info msg="Go runtime" log_id=0L244kTl000 version=go1.12.6 maxprocs=8
2020-02-18T13:26:43.04+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:43.041205Z lvl=info msg="Using data dir" log_id=0L244kTl000 service=store path=/var/lib/influxdb/data
2020-02-18T13:26:43.04+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:43.041288Z lvl=info msg="Compaction settings" log_id=0L244kTl000 service=store max_concurrent_compactions=4 throughput_bytes_per_second=50331648 throughput_bytes_per_second_burst=50331648
2020-02-18T13:26:43.04+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:43.041305Z lvl=info msg="Open store (start)" log_id=0L244kTl000 service=store trace_id=0L244ktG000 op_name=tsdb_open op_event=start
2020-02-18T13:26:43.04+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:43.041348Z lvl=info msg="Open store (end)" log_id=0L244kTl000 service=store trace_id=0L244ktG000 op_name=tsdb_open op_event=end op_elapsed=0.045ms
2020-02-18T13:26:43.04+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:43.041387Z lvl=info msg="Opened service" log_id=0L244kTl000 service=subscriber
2020-02-18T13:26:43.04+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:43.041400Z lvl=info msg="Starting monitor service" log_id=0L244kTl000 service=monitor
2020-02-18T13:26:43.04+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:43.041411Z lvl=info msg="Registered diagnostics client" log_id=0L244kTl000 service=monitor name=build
2020-02-18T13:26:43.04+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:43.041421Z lvl=info msg="Registered diagnostics client" log_id=0L244kTl000 service=monitor name=runtime
2020-02-18T13:26:43.04+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:43.041430Z lvl=info msg="Registered diagnostics client" log_id=0L244kTl000 service=monitor name=network
2020-02-18T13:26:43.04+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:43.041457Z lvl=info msg="Registered diagnostics client" log_id=0L244kTl000 service=monitor name=system
2020-02-18T13:26:43.04+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:43.041521Z lvl=info msg="Starting precreation service" log_id=0L244kTl000 service=shard-precreation check_interval=10m advance_period=30m
2020-02-18T13:26:43.04+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:43.041545Z lvl=info msg="Starting snapshot service" log_id=0L244kTl000 service=snapshot
2020-02-18T13:26:43.04+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:43.041555Z lvl=info msg="Starting continuous query service" log_id=0L244kTl000 service=continuous_querier
2020-02-18T13:26:43.04+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:43.041568Z lvl=info msg="Starting HTTP service" log_id=0L244kTl000 service=httpd authentication=true
2020-02-18T13:26:43.04+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:43.041574Z lvl=info msg="opened HTTP access log" log_id=0L244kTl000 service=httpd path=stderr
2020-02-18T13:26:43.04+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:43.041584Z lvl=info msg="Auth is enabled but shared-secret is blank. BearerAuthentication is disabled." log_id=0L244kTl000 service=httpd
2020-02-18T13:26:43.04+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:43.041578Z lvl=info msg="Storing statistics" log_id=0L244kTl000 service=monitor db_instance=_internal db_rp=monitor interval=10s
2020-02-18T13:26:43.04+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:43.041628Z lvl=info msg="Listening on HTTP" log_id=0L244kTl000 service=httpd addr=127.0.0.1:8086 https=false
2020-02-18T13:26:43.04+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:43.041648Z lvl=info msg="Starting retention policy enforcement service" log_id=0L244kTl000 service=retention check_interval=30m
2020-02-18T13:26:43.04+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:43.041723Z lvl=info msg="Listening for signals" log_id=0L244kTl000
2020-02-18T13:26:43.04+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:43.041789Z lvl=info msg="Sending usage statistics to usage.influxdata.com" log_id=0L244kTl000
2020-02-18T13:26:47.87+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:47.870136Z lvl=info msg="Executing query" log_id=0L244kTl000 service=query query="CREATE USER ifadmin WITH PASSWORD [REDACTED] WITH ALL PRIVILEGES"
2020-02-18T13:26:47.95+0100 [APP/PROC/WEB/0] ERR [httpd] 127.0.0.1 - - [18/Feb/2020:12:26:47 +0000] "POST /query?chunked=true&db=&epoch=ns&q=CREATE+USER+%22ifadmin%22+WITH+PASSWORD+%5BREDACTED%5D+WITH+ALL+PRIVILEGES HTTP/1.1" 200 57 "-" "InfluxDBShell/1.7.9" eeae1cbc-5249-11ea-8002-eeee0affece2 80271
2020-02-18T13:26:51.79+0100 [APP/PROC/WEB/0] ERR [httpd] 127.0.0.1 - ifadmin [18/Feb/2020:12:26:51 +0000] "GET /ping HTTP/1.1" 204 0 "-" "InfluxDBShell/1.7.9" f1052d30-5249-11ea-8003-eeee0affece2 38
2020-02-18T13:26:51.86+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:51.869567Z lvl=info msg="Executing query" log_id=0L244kTl000 service=query query="CREATE DATABASE jmeter"
2020-02-18T13:26:51.87+0100 [APP/PROC/WEB/0] ERR [httpd] 127.0.0.1 - ifadmin [18/Feb/2020:12:26:51 +0000] "POST /query?chunked=true&db=&epoch=ns&q=CREATE+DATABASE+jmeter HTTP/1.1" 200 57 "-" "InfluxDBShell/1.7.9" f10539f2-5249-11ea-8004-eeee0affece2 81291
2020-02-18T13:26:51.88+0100 [APP/PROC/WEB/0] OUT /init-influxdb.sh: ignoring /docker-entrypoint-initdb.d/*
2020-02-18T13:26:51.88+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:51.882254Z lvl=info msg="Signal received, initializing clean shutdown..." log_id=0L244kTl000
2020-02-18T13:26:51.88+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:51.882284Z lvl=info msg="Waiting for clean shutdown..." log_id=0L244kTl000
2020-02-18T13:26:51.88+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:51.882375Z lvl=info msg="Listener closed" log_id=0L244kTl000 service=snapshot
2020-02-18T13:26:51.88+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:51.882395Z lvl=info msg="Shutting down monitor service" log_id=0L244kTl000 service=monitor
2020-02-18T13:26:51.88+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:51.882417Z lvl=info msg="Terminating storage of statistics" log_id=0L244kTl000 service=monitor
2020-02-18T13:26:51.88+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:51.882444Z lvl=info msg="Terminating precreation service" log_id=0L244kTl000 service=shard-precreation
2020-02-18T13:26:51.88+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:51.882471Z lvl=info msg="Terminating continuous query service" log_id=0L244kTl000 service=continuous_querier
2020-02-18T13:26:51.88+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:51.882528Z lvl=info msg="Closing retention policy enforcement service" log_id=0L244kTl000 service=retention
2020-02-18T13:26:51.88+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:51.882580Z lvl=info msg="Closed service" log_id=0L244kTl000 service=subscriber
2020-02-18T13:26:51.88+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:51.882618Z lvl=info msg="Server shutdown completed" log_id=0L244kTl000
2020-02-18T13:26:55.54+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:55.545168Z lvl=info msg="InfluxDB starting" log_id=0L245WjG000 version=1.7.9 branch=1.7 commit=23bc63d43a8dc05f53afa46e3526ebb5578f3d88
2020-02-18T13:26:55.54+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:55.545197Z lvl=info msg="Go runtime" log_id=0L245WjG000 version=go1.12.6 maxprocs=8
2020-02-18T13:26:55.64+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:55.645680Z lvl=info msg="Using data dir" log_id=0L245WjG000 service=store path=/var/lib/influxdb/data
2020-02-18T13:26:55.64+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:55.645721Z lvl=info msg="Compaction settings" log_id=0L245WjG000 service=store max_concurrent_compactions=4 throughput_bytes_per_second=50331648 throughput_bytes_per_second_burst=50331648
2020-02-18T13:26:55.64+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:55.645735Z lvl=info msg="Open store (start)" log_id=0L245WjG000 service=store trace_id=0L245X7G001 op_name=tsdb_open op_event=start
2020-02-18T13:26:55.64+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:55.645786Z lvl=info msg="Open store (end)" log_id=0L245WjG000 service=store trace_id=0L245X7G001 op_name=tsdb_open op_event=end op_elapsed=0.053ms
2020-02-18T13:26:55.64+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:55.645828Z lvl=info msg="Opened service" log_id=0L245WjG000 service=subscriber
2020-02-18T13:26:55.64+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:55.645838Z lvl=info msg="Starting monitor service" log_id=0L245WjG000 service=monitor
2020-02-18T13:26:55.64+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:55.645843Z lvl=info msg="Registered diagnostics client" log_id=0L245WjG000 service=monitor name=build
2020-02-18T13:26:55.64+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:55.645850Z lvl=info msg="Registered diagnostics client" log_id=0L245WjG000 service=monitor name=runtime
2020-02-18T13:26:55.64+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:55.645854Z lvl=info msg="Registered diagnostics client" log_id=0L245WjG000 service=monitor name=network
2020-02-18T13:26:55.64+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:55.645864Z lvl=info msg="Registered diagnostics client" log_id=0L245WjG000 service=monitor name=system
2020-02-18T13:26:55.64+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:55.645883Z lvl=info msg="Starting precreation service" log_id=0L245WjG000 service=shard-precreation check_interval=10m advance_period=30m
2020-02-18T13:26:55.64+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:55.645889Z lvl=info msg="Starting snapshot service" log_id=0L245WjG000 service=snapshot
2020-02-18T13:26:55.64+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:55.645895Z lvl=info msg="Starting continuous query service" log_id=0L245WjG000 service=continuous_querier
2020-02-18T13:26:55.64+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:55.645910Z lvl=info msg="Starting HTTP service" log_id=0L245WjG000 service=httpd authentication=true
2020-02-18T13:26:55.64+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:55.645915Z lvl=info msg="opened HTTP access log" log_id=0L245WjG000 service=httpd path=stderr
2020-02-18T13:26:55.64+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:55.645919Z lvl=info msg="Auth is enabled but shared-secret is blank. BearerAuthentication is disabled." log_id=0L245WjG000 service=httpd
2020-02-18T13:26:55.64+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:55.646003Z lvl=info msg="Storing statistics" log_id=0L245WjG000 service=monitor db_instance=_internal db_rp=monitor interval=10s
2020-02-18T13:26:55.64+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:55.648680Z lvl=info msg="Listening on HTTP" log_id=0L245WjG000 service=httpd addr=0.0.0.0:8086 https=false
2020-02-18T13:26:55.64+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:55.648714Z lvl=info msg="Starting retention policy enforcement service" log_id=0L245WjG000 service=retention check_interval=30m
2020-02-18T13:26:55.64+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:55.648790Z lvl=info msg="Listening for signals" log_id=0L245WjG000
2020-02-18T13:26:55.64+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:26:55.648821Z lvl=info msg="Sending usage statistics to usage.influxdata.com" log_id=0L245WjG000
2020-02-18T13:26:56.59+0100 [CELL/0] OUT Container became healthy
2020-02-18T13:28:31.54+0100 [RTR/14] OUT ccp-influxdb-ci.apps.emea.vwapps.io - [2020-02-18T12:28:31.537112792Z] "GET / HTTP/1.1" 404 0 19 "-" "Mozilla/5.0 (Windows NT 10.0; WOW64; rv:68.0) Gecko/20100101 Firefox/68.0" "10.11.8.226:39928" "10.11.34.95:61133" x_forwarded_for:"-" x_forwarded_proto:"https" vcap_request_id:"a3766d33-48e1-4e26-7f2f-f6125ce7b01f" response_time:0.007266 gorouter_time:0.000114 app_id:"ff79fa42-b225-4c66-aebc-cfc754c04b0e" app_index:"0" x_b3_traceid:"8a8c75b4bb5af422" x_b3_spanid:"8a8c75b4bb5af422" x_b3_parentspanid:"-" b3:"8a8c75b4bb5af422-8a8c75b4bb5af422"
2020-02-18T13:28:31.54+0100 [RTR/14] OUT
2020-02-18T13:28:31.64+0100 [RTR/18] OUT ccp-influxdb-ci.apps.emea.vwapps.io - [2020-02-18T12:28:31.640897474Z] "GET /favicon.ico HTTP/1.1" 404 0 19 "-" "Mozilla/5.0 (Windows NT 10.0; WOW64; rv:68.0) Gecko/20100101 Firefox/68.0" "10.11.8.226:31322" "10.11.34.95:61133" x_forwarded_for:"-" x_forwarded_proto:"https" vcap_request_id:"0c985087-0372-4a48-46d1-8204e656b395" response_time:0.006984 gorouter_time:0.000086 app_id:"ff79fa42-b225-4c66-aebc-cfc754c04b0e" app_index:"0" x_b3_traceid:"ae3db301fc8f45b5" x_b3_spanid:"ae3db301fc8f45b5" x_b3_parentspanid:"-" b3:"ae3db301fc8f45b5-ae3db301fc8f45b5"
2020-02-18T13:28:31.64+0100 [RTR/18] OUT
2020-02-18T13:56:55.64+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:56:55.648914Z lvl=info msg="Retention policy deletion check (start)" log_id=0L245WjG000 service=retention trace_id=0L25oOO0000 op_name=retention_delete_check op_event=start
2020-02-18T13:56:55.64+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T12:56:55.648977Z lvl=info msg="Retention policy deletion check (end)" log_id=0L245WjG000 service=retention trace_id=0L25oOO0000 op_name=retention_delete_check op_event=end op_elapsed=0.079ms
2020-02-18T14:26:55.64+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T13:26:55.648885Z lvl=info msg="Retention policy deletion check (start)" log_id=0L245WjG000 service=retention trace_id=0L27XFd0000 op_name=retention_delete_check op_event=start
2020-02-18T14:26:55.64+0100 [APP/PROC/WEB/0] ERR ts=2020-02-18T13:26:55.648930Z lvl=info msg="Retention policy deletion check (end)" log_id=0L245WjG000 service=retention trace_id=0L27XFd0000 op_name=retention_delete_check op_event=end op_elapsed=0.058ms

How to get influxdb docker image to initialize graphite and telegraf

Using the official InfluxDB docker image, I am trying to have multiple databases: one that uses Graphite and one that uses the normal InfluxDB protocol.
I am using docker-compose to build and start the images. I initialize InfluxDB with a slightly modified version of what is in the docs, because I also want Graphite enabled. This all seems to work fine, but when I run docker-compose up I can see that it is not opening the Graphite port 2003. My end goal is to get data from netdata and pi-hole and use Grafana to graph all of that data.
The script to initialize InfluxDB:
docker run --rm \
-e INFLUXDB_DB=db0 -e INFLUXDB_ADMIN_ENABLED=true \
-e INFLUXDB_ADMIN_USER=admin -e INFLUXDB_ADMIN_PASSWORD=password \
-e INFLUXDB_USER=telegraf -e INFLUXDB_USER_PASSWORD=password \
-e INFLUXDB_GRAPHITE_ENABLED=true \
-v $PWD:/var/lib/influxdb \
influxdb /init-influxdb.sh
The docker-compose YAML file:
version: "2"
services:
grafana:
image: grafana/grafana
container_name: grafana
restart: always
ports:
- 3000:3000
networks:
- monitoring
volumes:
- grafana-volume:/var/lib/grafana
influxdb:
image: influxdb
container_name: influxdb
restart: always
ports:
- 8086:8086
- 2003:2003
networks:
- monitoring
volumes:
- influxdb-volume:/var/lib/influxdb
networks:
monitoring:
volumes:
grafana-volume:
external: true
influxdb-volume:
external: true
This is the output I get when starting the image.
influxdb | ts=2019-02-04T20:50:53.957730Z lvl=info msg="InfluxDB starting" log_id=0DQ_IJXG000 version=1.7.3 branch=1.7 commit=698dbc789aff13c2678357a6b93ff73dd7136571
influxdb | ts=2019-02-04T20:50:53.957829Z lvl=info msg="Go runtime" log_id=0DQ_IJXG000 version=go1.11 maxprocs=4
influxdb | ts=2019-02-04T20:50:54.059549Z lvl=info msg="Using data dir" log_id=0DQ_IJXG000 service=store path=/var/lib/influxdb/data
influxdb | ts=2019-02-04T20:50:54.059704Z lvl=info msg="Compaction settings" log_id=0DQ_IJXG000 service=store max_concurrent_compactions=2 throughput_bytes_per_second=50331648 throughput_bytes_per_second_burst=50331648
influxdb | ts=2019-02-04T20:50:54.059763Z lvl=info msg="Open store (start)" log_id=0DQ_IJXG000 service=store trace_id=0DQ_IJvl000 op_name=tsdb_open op_event=start
influxdb | ts=2019-02-04T20:50:54.068293Z lvl=info msg="Reading file" log_id=0DQ_IJXG000 engine=tsm1 service=cacheloader path=/var/lib/influxdb/wal/_internal/monitor/1/_00001.wal size=10564283
influxdb | ts=2019-02-04T20:51:00.401835Z lvl=info msg="Reading file" log_id=0DQ_IJXG000 engine=tsm1 service=cacheloader path=/var/lib/influxdb/wal/_internal/monitor/1/_00002.wal size=6833556
influxdb | ts=2019-02-04T20:51:09.724641Z lvl=info msg="Opened shard" log_id=0DQ_IJXG000 service=store trace_id=0DQ_IJvl000 op_name=tsdb_open index_version=inmem path=/var/lib/influxdb/data/_internal/monitor/1 duration=15659.985ms
influxdb | ts=2019-02-04T20:51:09.725095Z lvl=info msg="Open store (end)" log_id=0DQ_IJXG000 service=store trace_id=0DQ_IJvl000 op_name=tsdb_open op_event=end op_elapsed=15665.323ms
influxdb | ts=2019-02-04T20:51:09.728132Z lvl=info msg="Opened service" log_id=0DQ_IJXG000 service=subscriber
influxdb | ts=2019-02-04T20:51:09.728283Z lvl=info msg="Starting monitor service" log_id=0DQ_IJXG000 service=monitor
influxdb | ts=2019-02-04T20:51:09.728352Z lvl=info msg="Registered diagnostics client" log_id=0DQ_IJXG000 service=monitor name=build
influxdb | ts=2019-02-04T20:51:09.728416Z lvl=info msg="Registered diagnostics client" log_id=0DQ_IJXG000 service=monitor name=runtime
influxdb | ts=2019-02-04T20:51:09.728471Z lvl=info msg="Registered diagnostics client" log_id=0DQ_IJXG000 service=monitor name=network
influxdb | ts=2019-02-04T20:51:09.728562Z lvl=info msg="Registered diagnostics client" log_id=0DQ_IJXG000 service=monitor name=system
influxdb | ts=2019-02-04T20:51:09.728787Z lvl=info msg="Starting precreation service" log_id=0DQ_IJXG000 service=shard-precreation check_interval=10m advance_period=30m
influxdb | ts=2019-02-04T20:51:09.729479Z lvl=info msg="Starting snapshot service" log_id=0DQ_IJXG000 service=snapshot
influxdb | ts=2019-02-04T20:51:09.729555Z lvl=info msg="Starting continuous query service" log_id=0DQ_IJXG000 service=continuous_querier
influxdb | ts=2019-02-04T20:51:09.729043Z lvl=info msg="Storing statistics" log_id=0DQ_IJXG000 service=monitor db_instance=_internal db_rp=monitor interval=10s
influxdb | ts=2019-02-04T20:51:09.729642Z lvl=info msg="Starting HTTP service" log_id=0DQ_IJXG000 service=httpd authentication=false
influxdb | ts=2019-02-04T20:51:09.729699Z lvl=info msg="opened HTTP access log" log_id=0DQ_IJXG000 service=httpd path=stderr
influxdb | ts=2019-02-04T20:51:09.731377Z lvl=info msg="Listening on HTTP" log_id=0DQ_IJXG000 service=httpd addr=[::]:8086 https=false
influxdb | ts=2019-02-04T20:51:09.732022Z lvl=info msg="Starting retention policy enforcement service" log_id=0DQ_IJXG000 service=retention check_interval=30m
influxdb | ts=2019-02-04T20:51:09.732881Z lvl=info msg="Sending usage statistics to usage.influxdata.com" log_id=0DQ_IJXG000
influxdb | ts=2019-02-04T20:51:09.733412Z lvl=info msg="Listening for signals" log_id=0DQ_IJXG000
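One thing worth noting (a guess, not something verified here): INFLUXDB_GRAPHITE_ENABLED is only set for the one-off docker run init step above, while the influxdb service in the docker-compose file starts without it, so the Graphite listener may simply not be enabled in the running container. A minimal sketch of the compose service with the variable added, reusing the official image's INFLUXDB_* environment-variable configuration that the init script already relies on (everything else mirrors the compose file above):
  influxdb:
    image: influxdb
    container_name: influxdb
    restart: always
    ports:
      - 8086:8086
      - 2003:2003   # Graphite plaintext protocol listens on :2003 by default when enabled
    environment:
      - INFLUXDB_GRAPHITE_ENABLED=true   # same variable as in the init script, applied at runtime
    networks:
      - monitoring
    volumes:
      - influxdb-volume:/var/lib/influxdb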

Kubernetes-Kafka unable to write message on topic

I am trying to write data to a Kafka topic but am stuck with some errors. Below are my configuration and error details.
Kubernetes Service:
kubectl get services
NAME TYPE CLUSTER-IP EXTERNAL-IP PORT(S) AGE
kafka-service NodePort 10.105.214.246 <none> 9092:30998/TCP 17m
kubernetes ClusterIP 10.96.0.1 <none> 443/TCP 4d
zoo1 ClusterIP 10.101.3.128 <none> 2181/TCP,2888/TCP,3888/TCP 20m
Kubernetes Pods:
kubectl get pods
NAME READY STATUS RESTARTS AGE
kafka-broker0-69c97b67f-4pmw9 1/1 Running 1 1m
zookeeper-deployment-1-796f9d9bcc-cr756 1/1 Running 0 20m
Kafka Docker Process:
docker ps | grep kafka
f79cd0196083 wurstmeister/kafka#sha256:d04dafd2b308f26dbeed8454f67c321579c2818c1eff5e8f695e14a19b1d599b "start-kafka.sh" About a minute ago Up About a minute k8s_kafka_kafka-broker0-69c97b67f-4pmw9_default_a747d38a-0da6-11e9-bd84-fa163e7d3173_1
75393e9e25c1 k8s.gcr.io/pause-amd64:3.1 "/pause" About a minute ago Up About a minute k8s_POD_kafka-broker0-69c97b67f-4pmw9_default_a747d38a-0da6-11e9-bd84-fa163e7d3173_0
Topic test is created successfully in Kafka as shown below:
docker exec k8s_kafka_kafka-broker0-69c97b67f-4pmw9_default_a747d38a-0da6-11e9-bd84-fa163e7d3173_1 /opt/kafka_2.12-2.1.0/bin/kafka-topics.sh --list --zookeeper zoo1:2181
OR
docker exec k8s_kafka_kafka-broker0-69c97b67f-4pmw9_default_a747d38a-0da6-11e9-bd84-fa163e7d3173_1 /opt/kafka_2.12-2.1.0/bin/kafka-topics.sh --list --zookeeper 10.101.3.128:2181
Output of above command:
test
Since the topic is available to write to, I executed the command below, first with the host machine IP 10.225.36.98 and then with the service IP 10.105.214.246:
kubectl exec kafka-broker0-69c97b67f-4pmw9 -c kafka -i -t -- /opt/kafka_2.12-2.1.0/bin/kafka-console-producer.sh [ --broker-list 10.225.36.98:30998 --topic test ]
>{"k":"v"}
But neither of them works for me; both throw the exception below:
[2019-01-01 09:26:52,215] ERROR Error when sending message to topic test with key: null, value: 9 bytes with error:
(org.apache.kafka.clients.producer.internals.ErrorLoggingCallback)
org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 60000 ms.
>[2019-01-01 09:27:59,513] WARN [Producer clientId=console-producer]
Connection to node -1 (/10.225.36.98:30998) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
When I tried to write to the broker using the hostname kafka:
kubectl exec kafka-broker0-69c97b67f-4pmw9 -c kafka -i -t -- /opt/kafka_2.12-2.1.0/bin/kafka-console-producer.sh [ --broker-list kafka:9092 --topic test ]
[2019-01-01 09:34:41,293] WARN Couldn't resolve server kafka:9092 from bootstrap.servers as DNS resolution failed for kafka
(org.apache.kafka.clients.ClientUtils)
org.apache.kafka.common.KafkaException: Failed to construct kafka producer
As the host & service IP were not working, I tried with pod IP, but get test=LEADER_NOT_AVAILABLE error.
kubectl exec kafka-broker0-69c97b67f-4pmw9 -c kafka -i -t -- /opt/kafka_2.12-2.1.0/bin/kafka-console-producer.sh [ --broker-list 172.17.0.7:9092 --topic test ]
>{"k":"v"}
[2019-01-01 09:52:30,733] WARN [Producer clientId=console-producer] Error while fetching metadata with correlation id 1 : {test=LEADER_NOT_AVAILABLE} (org.apache.kafka.clients.NetworkClient)
After searching Google, I found a command to list the available brokers in ZooKeeper, so I tried to run it from the container and got stuck on the error below:
bash-4.4# ./opt/zookeeper/bin/zkCli.sh -server zoo1:2181 ls /brokers/ids
Connecting to zoo1:2181
Exception from Zookeeper:
2019-01-01 09:18:05,215 [myid:] - INFO [main:Environment#100] - Client environment:zookeeper.version=3.4.10-39d3a4f269333c922ed3db283be479f9deacaa0f, built on 03/23/2017 10:13 GMT
2019-01-01 09:18:05,219 [myid:] - INFO [main:Environment#100] - Client environment:host.name=zookeeper-deployment-1-796f9d9bcc-cr756
2019-01-01 09:18:05,220 [myid:] - INFO [main:Environment#100] - Client environment:java.version=1.8.0_151
2019-01-01 09:18:05,223 [myid:] - INFO [main:Environment#100] - Client environment:java.vendor=Oracle Corporation
2019-01-01 09:18:05,223 [myid:] - INFO [main:Environment#100] - Client environment:java.home=/usr/lib/jvm/java-1.8-openjdk/jre
2019-01-01 09:18:05,223 [myid:] - INFO [main:Environment#100] - Client environment:java.class.path=/opt/zookeeper/bin/../build/classes:/opt/zookeeper/bin/../build/lib/*.jar:/opt/zookeeper/bin/../lib/slf4j-log4j12-1.6.1.jar:/opt/zookeeper/bin/../lib/slf4j-api-1.6.1.jar:/opt/zookeeper/bin/../lib/netty-3.10.5.Final.jar:/opt/zookeeper/bin/../lib/log4j-1.2.16.jar:/opt/zookeeper/bin/../lib/jline-0.9.94.jar:/opt/zookeeper/bin/../zookeeper-3.4.10.jar:/opt/zookeeper/bin/../src/java/lib/*.jar:/opt/zookeeper/bin/../conf:
2019-01-01 09:18:05,223 [myid:] - INFO [main:Environment#100] - Client environment:java.library.path=/usr/lib/jvm/java-1.8-openjdk/jre/lib/amd64/server:/usr/lib/jvm/java-1.8-openjdk/jre/lib/amd64:/usr/lib/jvm/java-1.8-openjdk/jre/../lib/amd64:/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
2019-01-01 09:18:05,223 [myid:] - INFO [main:Environment#100] - Client environment:java.io.tmpdir=/tmp
2019-01-01 09:18:05,224 [myid:] - INFO [main:Environment#100] - Client environment:java.compiler=<NA>
2019-01-01 09:18:05,224 [myid:] - INFO [main:Environment#100] - Client environment:os.name=Linux
2019-01-01 09:18:05,224 [myid:] - INFO [main:Environment#100] - Client environment:os.arch=amd64
2019-01-01 09:18:05,224 [myid:] - INFO [main:Environment#100] - Client environment:os.version=3.10.0-693.11.6.el7.x86_64
2019-01-01 09:18:05,224 [myid:] - INFO [main:Environment#100] - Client environment:user.name=root
2019-01-01 09:18:05,224 [myid:] - INFO [main:Environment#100] - Client environment:user.home=/root
2019-01-01 09:18:05,224 [myid:] - INFO [main:Environment#100] - Client environment:user.dir=/
2019-01-01 09:18:05,225 [myid:] - INFO [main:ZooKeeper#438] - Initiating client connection, connectString=zoo1:2181 sessionTimeout=30000 watcher=org.apache.zookeeper.ZooKeeperMain$MyWatcher#25f38edc
2019-01-01 09:18:05,259 [myid:] - INFO [main-SendThread(zoo1.default.svc.cluster.local:2181):ClientCnxn$SendThread#1032] - Opening socket connection to server zoo1.default.svc.cluster.local/10.101.3.128:2181. Will not attempt to authenticate using SASL (unknown error)
2019-01-01 09:18:35,280 [myid:] - WARN [main-SendThread(zoo1.default.svc.cluster.local:2181):ClientCnxn$SendThread#1108] - Client session timed out, have not heard from server in 30027ms for sessionid 0x0
2019-01-01 09:18:35,282 [myid:] - INFO [main-SendThread(zoo1.default.svc.cluster.local:2181):ClientCnxn$SendThread#1156] - Client session timed out, have not heard from server in 30027ms for sessionid 0x0, closing socket connection and attempting reconnect
Exception in thread "main" org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /brokers/ids
at org.apache.zookeeper.KeeperException.create(KeeperException.java:99)
at org.apache.zookeeper.KeeperException.create(KeeperException.java:51)
at org.apache.zookeeper.ZooKeeper.getChildren(ZooKeeper.java:1532)
at org.apache.zookeeper.ZooKeeper.getChildren(ZooKeeper.java:1560)
at org.apache.zookeeper.ZooKeeperMain.processZKCmd(ZooKeeperMain.java:731)
at org.apache.zookeeper.ZooKeeperMain.processCmd(ZooKeeperMain.java:599)
at org.apache.zookeeper.ZooKeeperMain.run(ZooKeeperMain.java:362)
at org.apache.zookeeper.ZooKeeperMain.main(ZooKeeperMain.java:290)
I also tried to create the Kafka service with type LoadBalancer, but no LoadBalancer IP is assigned to the service.
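For context, a Service of type LoadBalancer only receives an external IP when the cluster has a load-balancer integration (a cloud provider or something like MetalLB); on a bare cluster it stays in Pending, which would explain the missing IP. The existing NodePort service from the kubectl get services output above would correspond roughly to a manifest like the following (a reconstruction, not the actual file; the selector label is an assumption):
apiVersion: v1
kind: Service
metadata:
  name: kafka-service
spec:
  type: NodePort
  selector:
    app: kafka          # assumption: whatever label the kafka-broker0 deployment's pods carry
  ports:
    - port: 9092        # service port, reachable inside the cluster as kafka-service:9092
      targetPort: 9092  # container port the broker listens on
      nodePort: 30998   # exposed on every node's IP for clients outside the cluster
The port:nodePort pair is what kubectl shows as 9092:30998/TCP, i.e. 30998 is meant to be used together with a node IP from outside the cluster, while in-cluster clients use the service port 9092.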
References to resolve this issue:
https://rmoff.net/2018/08/02/kafka-listeners-explained/
https://github.com/wurstmeister/kafka-docker/wiki/Connectivity#additional-listener-information
https://github.com/kubernetes/contrib/issues/2891
https://dzone.com/articles/ultimate-guide-to-installing-kafka-docker-on-kuber
https://github.com/wurstmeister/kafka-docker/issues/85
Any help would be appreciated.
Try the following command to send data to the topic:
docker exec k8s_kafka_kafka-broker0-69c97b67f-4pmw9_default_a747d38a-0da6-11e9-bd84-fa163e7d3173_1 \
  /opt/kafka_2.12-2.1.0/bin/kafka-console-producer.sh \
  --broker-list kafka-service:30998 --topic test
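Beyond the bootstrap address, Kafka clients also follow the broker's advertised listeners, so those have to resolve from wherever the client runs (this is the point of the kafka-listeners article referenced in the question). A minimal sketch of the relevant container environment for the wurstmeister/kafka image, assuming in-cluster clients should reach the broker through the kafka-service service (the values are illustrative, not taken from the actual deployment):
env:
  - name: KAFKA_ZOOKEEPER_CONNECT
    value: "zoo1:2181"                        # the zoo1 service from the kubectl get services output
  - name: KAFKA_LISTENERS
    value: "PLAINTEXT://0.0.0.0:9092"         # bind on all interfaces inside the pod
  - name: KAFKA_ADVERTISED_LISTENERS
    value: "PLAINTEXT://kafka-service:9092"   # an address that clients inside the cluster can resolve
Access from outside the cluster through the NodePort would additionally need a second listener advertised with a node IP and port 30998, as described in the linked articles.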

Resources