Logstash: [ERROR][logstash.javapipeline ] A plugin had an unrecoverable error. Will restart this plugin. #446

Closed
farrukhahmed4220 opened this issue Nov 8, 2019 · 3 comments

farrukhahmed4220 commented Nov 8, 2019

This error appears after running docker-compose up.
PS: I'm running this on a 4 GB DigitalOcean droplet (virtual cloud machine).

[2019-11-08T14:58:10,393][ERROR][logstash.javapipeline    ] A plugin had an unrecoverable error. Will restart this plugin.
  Pipeline_id:main
  Plugin: <LogStash::Inputs::Tcp port=>5000, id=>"313916b0de4b98ad97271fe2049166dce20744d5d153cb29fe690b19a64c0047", enable_metric=>true, codec=><LogStash::Codecs::Line id=>"line_9a9ab803-e11c-46c2-9233-6ccfd6734418", enable_metric=>true, charset=>"UTF-8", delimiter=>"\n">, host=>"0.0.0.0", mode=>"server", proxy_protocol=>false, ssl_enable=>false, ssl_verify=>true, ssl_key_passphrase=><password>, tcp_keep_alive=>false, dns_reverse_lookup_enabled=>true>
  Error: event executor terminated
  Exception: Java::JavaUtilConcurrent::RejectedExecutionException
  Stack: io.netty.util.concurrent.SingleThreadEventExecutor.reject(io/netty/util/concurrent/SingleThreadEventExecutor.java:821)
io.netty.util.concurrent.SingleThreadEventExecutor.offerTask(io/netty/util/concurrent/SingleThreadEventExecutor.java:327)
io.netty.util.concurrent.SingleThreadEventExecutor.addTask(io/netty/util/concurrent/SingleThreadEventExecutor.java:320)
io.netty.util.concurrent.SingleThreadEventExecutor.execute(io/netty/util/concurrent/SingleThreadEventExecutor.java:746)
io.netty.channel.AbstractChannel$AbstractUnsafe.register(io/netty/channel/AbstractChannel.java:479)
io.netty.channel.SingleThreadEventLoop.register(io/netty/channel/SingleThreadEventLoop.java:80)
io.netty.channel.SingleThreadEventLoop.register(io/netty/channel/SingleThreadEventLoop.java:74)
io.netty.channel.MultithreadEventLoopGroup.register(io/netty/channel/MultithreadEventLoopGroup.java:86)
io.netty.bootstrap.AbstractBootstrap.initAndRegister(io/netty/bootstrap/AbstractBootstrap.java:331)
io.netty.bootstrap.AbstractBootstrap.doBind(io/netty/bootstrap/AbstractBootstrap.java:282)
io.netty.bootstrap.AbstractBootstrap.bind(io/netty/bootstrap/AbstractBootstrap.java:278)
io.netty.bootstrap.AbstractBootstrap.bind(io/netty/bootstrap/AbstractBootstrap.java:260)
org.logstash.tcp.InputLoop.run(org/logstash/tcp/InputLoop.java:87)
jdk.internal.reflect.GeneratedMethodAccessor48.invoke(jdk/internal/reflect/GeneratedMethodAccessor48)
jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(jdk/internal/reflect/DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:566)
org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(org/jruby/javasupport/JavaMethod.java:440)
org.jruby.javasupport.JavaMethod.invokeDirect(org/jruby/javasupport/JavaMethod.java:304)
usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_tcp_minus_6_dot_0_dot_3_minus_java.lib.logstash.inputs.tcp.run(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-tcp-6.0.3-java/lib/logstash/inputs/tcp.rb:152)
usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.inputworker(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:314)
usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.start_input(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:306)
org.jruby.RubyProc.call(org/jruby/RubyProc.java:295)
org.jruby.RubyProc.call(org/jruby/RubyProc.java:274)
org.jruby.RubyProc.call(org/jruby/RubyProc.java:270)
java.lang.Thread.run(java/lang/Thread.java:834)
[2019-11-08T14:58:11,395][INFO ][logstash.inputs.tcp      ] Starting tcp input listener {:address=>"0.0.0.0:5000", :ssl_enable=>"false"}

logstash/pipeline/logstash.conf:

input {
        tcp {
                port => 5000
        }
        udp {
                port => 5000
        }
        beats {
                port => 5000
        }
}

## Add your filters / logstash plugins configuration here
filter {
  if [type] == "syslog" {
    grok {
      match => {
         "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}"
      }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
     }
  }
}


output {
        elasticsearch {
                hosts => "elasticsearch:9200"
                index => "logger-%{+YYYY.MM.dd}"
                user => "elastic"
                password => "changeme"
        }
}

logstash/config/logstash.yml:

## Default Logstash configuration from Logstash base image.
## https://github.com/elastic/logstash/blob/master/docker/data/logstash/config/logstash-full.yml
#
http.host: "0.0.0.0"
xpack.monitoring.elasticsearch.hosts: [ "http://elasticsearch:9200" ]

## X-Pack security credentials
#
xpack.monitoring.enabled: true
xpack.monitoring.elasticsearch.username: elastic
xpack.monitoring.elasticsearch.password: changeme

docker-compose.yml:

version: '3.2'

services:
  elasticsearch:
    build:
      context: elasticsearch/
      args:
        ELK_VERSION: $ELK_VERSION
    volumes:
      - type: bind
        source: ./elasticsearch/config/elasticsearch.yml
        target: /usr/share/elasticsearch/config/elasticsearch.yml
        read_only: true
      - type: volume
        source: elasticsearch
        target: /usr/share/elasticsearch/data
    ports:
      - "9200:9200"
      - "9300:9300"
    environment:
      ES_JAVA_OPTS: "-Xmx512m -Xms512m"
      ELASTIC_PASSWORD: changeme
    networks:
      - elk

  logstash:
    build:
      context: logstash/
      args:
        ELK_VERSION: $ELK_VERSION
    volumes:
      - type: bind
        source: ./logstash/config/logstash.yml
        target: /usr/share/logstash/config/logstash.yml
        read_only: true
      - type: bind
        source: ./logstash/pipeline
        target: /usr/share/logstash/pipeline
        read_only: true
    ports:
      - "5000:5000"
      - "9600:9600"
    environment:
      LS_JAVA_OPTS: "-Xmx512m -Xms512m"
    networks:
      - elk
    depends_on:
      - elasticsearch

  kibana:
    build:
      context: kibana/
      args:
        ELK_VERSION: $ELK_VERSION
    volumes:
      - type: bind
        source: ./kibana/config/kibana.yml
        target: /usr/share/kibana/config/kibana.yml
        read_only: true
    ports:
      - "5601:5601"
    networks:
      - elk
    depends_on:
      - elasticsearch

networks:
  elk:
    driver: bridge

volumes:
  elasticsearch:

What's the issue?

@antoineco
Collaborator

You have two inputs listening on the same TCP port: the tcp and beats inputs both bind to port 5000, and the Beats protocol runs over TCP too (the udp input is fine, since UDP is a separate protocol). You have to remove the tcp input or give each TCP listener a distinct port.
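
For example, here is a minimal sketch of a conflict-free input block, assuming the Beats input moves to its conventional port 5044 (the exact port is illustrative; any free port works):

input {
        tcp {
                port => 5000
        }
        udp {
                port => 5000          # UDP listener; does not conflict with the TCP listeners
        }
        beats {
                port => 5044          # Beats speaks TCP, so it must not share port 5000 with the tcp input
        }
}

If you go this route, remember to also publish the new port in docker-compose.yml, e.g. add - "5044:5044" under the logstash service's ports.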

@antoineco
Collaborator

@farrukhahmed4220 I'll consider this issue solved. Feel free to reopen if that's not the case.

@surojitbhattatgmail

surojitbhattatgmail commented Sep 12, 2021

Hi,

My Serilog NetTcp client produces the following JSON message:

{
	"@t": "2021-09-11T23:27:15.5387671Z",
	"@mt": "{@request} registered",
	"request": {
		"RequestId": "ddb7a8e8-9972-4178-a177-a8d5cad1c462",
		"DateTime": "2021-09-12T04:57:15.4851381+05:30",
		"InfType": "Info",
		"ServiceType": "Console1",
		"Body": "{ Message = This is not fare }",
		"$type": "Request"
	}
}

and the following is my Logstash pipeline configuration:

input {
    beats {
        port => 5044
        type => "file-beat"
    }

    tcp {
        port => 5000
        codec => json
        type => "console-event"
    }
}

## Add your filters / logstash plugins configuration here
filter {
    if [type] == "console-event" {
        json {
            source => "request"
            tag_on_failure => [ "_jsonparsefailure" ]
            id => "%{RequestId}"
            add_tag => [ "Request_%{RequestId}", "Request_%{ServiceType}" ]
        }
    }
}

output {
    elasticsearch {
        hosts => "elasticsearch:9200"
        index => "gateway-request-%{+YYYY.MM.dd}"
        ecs_compatibility => disabled
        ## user => "elastic"
        ## password => "changeme"
    }
}

However, I am getting the following exception:

Pipeline_id:main
logstash_1       | + host=>"0.0.0.0", mode=>"server", proxy_protocol=>false, ssl_enable=>false, ssl_verify=>true, ssl_key_passphrase=><password>, tcp_keep_alive=>false, dns_reverse_lookup_enabled=>true>
logstash_1       |   Error: event executor terminated
logstash_1       |   Exception: Java::JavaUtilConcurrent::RejectedExecutionException
logstash_1       |   Stack: io.netty.util.concurrent.SingleThreadEventExecutor.reject(io/netty/util/concurrent/SingleThreadEventExecutor.java:926)
logstash_1       | io.netty.util.concurrent.SingleThreadEventExecutor.offerTask(io/netty/util/concurrent/SingleThreadEventExecutor.java:353)

I am clueless as to what I am doing wrong.
