amazon s3 - Logstash S3 output plugin codec issue
I am experiencing strange behavior in Logstash when using a combination of codecs on input/file and output/s3. There seems to be an issue with the output/s3 Logstash plugin: it will not upload part files to S3 unless I specify a codec in the output/s3 plugin.
I am tailing Java application log files, using the input/file plugin to watch the log files in a directory and make sure any stack traces encountered (and their newlines) are wrapped into the same Logstash event. Like this:

```
input {
  file {
    path => "c:/some/directory/logs/*"
    codec => multiline {
      pattern => "^%{DATESTAMP}"
      negate => true
      what => "previous"
    }
  }
}
```
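For example, given hypothetical log lines like these, only the first line matches the timestamp pattern, so the multiline codec should fold the following lines into the previous event:

```
05-01-2017 10:15:30 ERROR com.example.Foo - request failed
java.lang.NullPointerException
    at com.example.Foo.bar(Foo.java:42)
```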
This appends stack traces to their parent events. I then want to perform two different output/s3 operations (essentially recreating the raw log line by line, and also uploading each event as JSON):
```
output {
  s3 {
    access_key_id => "mykey"
    secret_access_key => "myseckey"
    region => "us-east-1"
    bucket => "somebucket"
    size_file => 10000
    upload_workers_count => 2
    restore => true
    prefix => "rawlogs/"
    temporary_directory => "c:/temp/logstash/raw"
    time_file => 5
  }
  s3 {
    access_key_id => "mykey"
    secret_access_key => "myseckey"
    region => "us-east-1"
    bucket => "somebucket"
    size_file => 10000
    upload_workers_count => 2
    restore => true
    prefix => "jsoneventlogs/"
    temporary_directory => "c:/temp/logstash/json"
    time_file => 5
    codec => "json_lines"
  }
}
```
The S3 upload that uses the "json_lines" codec works fine, but the raw log upload that uses the default "plain" codec does not work at all. The files just sit in the temporary directory and are never pushed to S3. I have tried using the "line" codec instead, with the same behavior. If I remove the "multiline" codec from the input/file plugin and use it in the raw output/s3 plugin instead, the upload to S3 works fine, but then each newline in a stack trace comes in as its own event, so the codec does not seem to do its job.
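For reference, this is roughly the variant of the raw output I tried with an explicit line codec (the `format` string re-emitting the message field is my assumption of how to reproduce the raw log line):

```
  s3 {
    access_key_id => "mykey"
    secret_access_key => "myseckey"
    region => "us-east-1"
    bucket => "somebucket"
    size_file => 10000
    upload_workers_count => 2
    restore => true
    prefix => "rawlogs/"
    temporary_directory => "c:/temp/logstash/raw"
    time_file => 5
    codec => line { format => "%{message}" }
  }
```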
Any idea why the output/s3 plugin only seems to work with the "json_lines" and "multiline" codecs?