
Commit aed3db3

andsel and andrewvc authored

Merge pull request #43 from andsel/size_limit_bytes

Exposes the decode_size_limit_bytes setting to limit the line width that the codec can parse. Leverages the second parameter of BufferedTokenizerExt to throw an IllegalStateException when the size of the line to parse is bigger than decode_size_limit_bytes.

Co-authored-by: Andrew Cholakian <[email protected]>

2 parents d6f97ad + 3b5a05e commit aed3db3
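As a sketch of how the new option would be used, here is a hypothetical pipeline snippet (the `tcp` input and the 1 MB value are illustrative, not part of this commit; the commit's default is 20 MB):

```
input {
  tcp {
    port => 5044
    codec => json_lines {
      # hypothetical value; the default added in this commit is 20971520 (20 MB)
      decode_size_limit_bytes => 1048576
    }
  }
}
```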

File tree

4 files changed (+26, -2 lines)


CHANGELOG.md

Lines changed: 3 additions & 0 deletions
@@ -1,3 +1,6 @@
+## 3.2.0
+ - Add decode_size_limit_bytes option to limit the maximum size of each JSON line that can be parsed. [#43](https://github.com/logstash-plugins/logstash-codec-json_lines/pull/43)
+
 ## 3.1.0
  - Feat: event `target => namespace` support (ECS) [#41](https://github.com/logstash-plugins/logstash-codec-json_lines/pull/41)
  - Refactor: dropped support for old Logstash versions (< 6.0)

lib/logstash/codecs/json_lines.rb

Lines changed: 6 additions & 1 deletion
@@ -42,6 +42,11 @@ class LogStash::Codecs::JSONLines < LogStash::Codecs::Base
   # Change the delimiter that separates lines
   config :delimiter, :validate => :string, :default => "\n"

+  # Maximum number of bytes for a single line before a fatal exception is raised
+  # which will stop Logstash.
+  # The default is 20MB which is quite large for a JSON document
+  config :decode_size_limit_bytes, :validate => :number, :default => 20 * (1024 * 1024) # 20MB
+
   # Defines a target field for placing decoded fields.
   # If this setting is omitted, data gets stored at the root (top level) of the event.
   # The target is only relevant while decoding data into a new event.
@@ -50,7 +55,7 @@ class LogStash::Codecs::JSONLines < LogStash::Codecs::Base
   public

   def register
-    @buffer = FileWatch::BufferedTokenizer.new(@delimiter)
+    @buffer = FileWatch::BufferedTokenizer.new(@delimiter, @decode_size_limit_bytes)
     @converter = LogStash::Util::Charset.new(@charset)
     @converter.logger = @logger
   end
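The real FileWatch::BufferedTokenizer is backed by a Java class in Logstash core, so its internals are outside this diff. A minimal pure-Ruby sketch of the size-limit semantics the new second argument enables (the class name, error class, and exact check are illustrative assumptions, not the actual implementation):

```ruby
# Illustrative sketch only: mimics the size-limited tokenizing that
# FileWatch::BufferedTokenizer performs when given a size-limit argument.
class SizeLimitedTokenizer
  def initialize(delimiter = "\n", size_limit = 20 * 1024 * 1024)
    @delimiter  = delimiter
    @size_limit = size_limit
    @buffer     = +""   # mutable accumulator for the trailing partial line
  end

  # Appends +data+ and returns every complete line seen so far.
  # Raises once the undelimited remainder grows past the limit,
  # mirroring the tokenizer's "input buffer full" failure mode.
  def extract(data)
    @buffer << data
    lines = @buffer.split(@delimiter, -1)
    @buffer = lines.pop || +""  # trailing partial line stays buffered
    raise "input buffer full" if @buffer.bytesize > @size_limit
    lines
  end
end
```

The boundary matches what the new specs below exercise: a partial line of exactly the limit is accepted, one byte more raises.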

logstash-codec-json_lines.gemspec

Lines changed: 1 addition & 1 deletion
@@ -1,7 +1,7 @@
 Gem::Specification.new do |s|

   s.name = 'logstash-codec-json_lines'
-  s.version = '3.1.0'
+  s.version = '3.2.0'
   s.licenses = ['Apache License (2.0)']
   s.summary = "Reads and writes newline-delimited JSON"
   s.description = "This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gemname. This gem is not a stand-alone program"

spec/codecs/json_lines_spec.rb

Lines changed: 16 additions & 0 deletions
@@ -118,6 +118,22 @@
     end
   end

+  describe "decode_size_limits_bytes" do
+    let(:maximum_payload) { "a" * subject.decode_size_limit_bytes }
+
+    it "should not raise an error if the number of bytes is not exceeded" do
+      expect {
+        subject.decode(maximum_payload)
+      }.not_to raise_error
+    end
+
+    it "should raise an error if the max bytes are exceeded" do
+      expect {
+        subject.decode(maximum_payload << "z")
+      }.to raise_error(java.lang.IllegalStateException, "input buffer full")
+    end
+  end
+
 end

 context "#encode" do
