Demo Logs
Generate fake log events, which can be useful for testing and demos
Alias
This component was previously called the generator source. Make sure to update your Vector configuration to accommodate the name change:
[sources.my_demo_logs_source]
+type = "demo_logs"
-type = "generator"
Configuration
Example configurations
{
"sources": {
"my_source_id": {
"type": "demo_logs",
"format": "apache_common",
"lines": [
"line1"
]
}
}
}
[sources.my_source_id]
type = "demo_logs"
format = "apache_common"
lines = [ "line1" ]
sources:
my_source_id:
type: demo_logs
format: apache_common
lines:
- line1
{
"sources": {
"my_source_id": {
"type": "demo_logs",
"count": 9223372036854776000,
"format": "apache_common",
"interval": 1,
"lines": [
"line1"
]
}
}
}
[sources.my_source_id]
type = "demo_logs"
count = 9_223_372_036_854_776_000
format = "apache_common"
interval = 1
lines = [ "line1" ]
sources:
my_source_id:
type: demo_logs
count: 9223372036854776000
format: apache_common
interval: 1
lines:
- line1
count
optional uint
The total number of lines to output.
By default, the source continuously prints logs (infinitely).
default: 9.223372036854776e+18
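For example, a minimal sketch of a configuration that emits a fixed number of lines and then stops (the count value below is arbitrary):
[sources.my_source_id]
type = "demo_logs"
format = "apache_common"
count = 100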
decoding
optional object
decoding.avro
required object (relevant when codec = "avro")
decoding.avro.schema
required string literal
The Avro schema definition.
Please note that the following apache_avro::types::Value variants are currently not supported:
Date
Decimal
Duration
Fixed
TimeMillis
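As a rough sketch of how these Avro options fit together (the inline record schema below is a hypothetical example, and decoding is only meaningful when the incoming bytes are actually Avro-encoded, which demo_logs output normally is not):
[sources.my_source_id]
type = "demo_logs"
format = "json"
decoding.codec = "avro"
# Hypothetical one-field record schema, shown only to illustrate the shape of the option
decoding.avro.schema = '{"type": "record", "name": "Log", "fields": [{"name": "message", "type": "string"}]}'
decoding.avro.strip_schema_id_prefix = false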
decoding.avro.strip_schema_id_prefix
required bool
decoding.codec
optional string literal enum
Option | Description |
---|---|
avro | Decodes the raw bytes as an Apache Avro message. |
bytes | Uses the raw bytes as-is. |
gelf | Decodes the raw bytes as a GELF message. This codec is experimental for the following reason: the GELF specification is more strict than the actual Graylog receiver. Vector's decoder currently adheres more strictly to the GELF spec, with the exception that some characters such as @ are allowed in field names. Other GELF codecs, such as Loki's, use a Go SDK that is maintained by Graylog and is much more relaxed than the GELF spec. Going forward, Vector will use that Go SDK as the reference implementation, which means the codec may continue to relax the enforcement of the specification. |
influxdb | Decodes the raw bytes as an InfluxDB Line Protocol message. |
json | Decodes the raw bytes as JSON. |
native | Decodes the raw bytes as native Protocol Buffers format. This codec is experimental. |
native_json | Decodes the raw bytes as native JSON format. This codec is experimental. |
protobuf | Decodes the raw bytes as protobuf. |
syslog | Decodes the raw bytes as a Syslog message. Decodes either as the RFC 3164-style format ("old" style) or the RFC 5424-style format ("new" style, which includes structured data). |
vrl | Decodes the raw bytes as a string and passes them as input to a VRL program. |
default: bytes
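For instance, one way to get structured events out of this source is to generate JSON-formatted lines and decode them with the json codec (a sketch, not the only valid combination):
[sources.my_source_id]
type = "demo_logs"
format = "json"          # emit randomly generated HTTP server logs as JSON lines
decoding.codec = "json"  # parse each line into a structured event instead of a raw string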
decoding.gelf
optional object (relevant when codec = "gelf")
decoding.gelf.lossy
optional bool
Determines whether or not to replace invalid UTF-8 sequences instead of failing.
When true, invalid UTF-8 sequences are replaced with the U+FFFD REPLACEMENT CHARACTER.
default: true
decoding.influxdb
optional object (relevant when codec = "influxdb")
decoding.influxdb.lossy
optional bool
Determines whether or not to replace invalid UTF-8 sequences instead of failing.
When true, invalid UTF-8 sequences are replaced with the U+FFFD REPLACEMENT CHARACTER.
default: true
decoding.json
optional object (relevant when codec = "json")
decoding.json.lossy
optional bool
Determines whether or not to replace invalid UTF-8 sequences instead of failing.
When true, invalid UTF-8 sequences are replaced with the U+FFFD REPLACEMENT CHARACTER.
default: true
decoding.native_json
optional object (relevant when codec = "native_json")
decoding.native_json.lossy
optional bool
Determines whether or not to replace invalid UTF-8 sequences instead of failing.
When true, invalid UTF-8 sequences are replaced with the U+FFFD REPLACEMENT CHARACTER.
default: true
decoding.protobuf
optional object (relevant when codec = "protobuf")
decoding.protobuf.desc_file
optional string literal
decoding.protobuf.message_type
optional string literal
decoding.syslog
optional object (relevant when codec = "syslog")
decoding.syslog.lossy
optional bool
Determines whether or not to replace invalid UTF-8 sequences instead of failing.
When true, invalid UTF-8 sequences are replaced with the U+FFFD REPLACEMENT CHARACTER.
default: true
decoding.vrl
required object (relevant when codec = "vrl")
decoding.vrl.source
required string literal
The . target will be used as the decoding result.
A compilation error, or use of 'abort' in the program, will result in a decoding error.
decoding.vrl.timezone
optional string literal
The name of the timezone to apply to timestamp conversions that do not contain an explicit time zone. The time zone name may be any name in the TZ database, or local to indicate system local time.
If not set, local will be used.
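A hedged sketch of the vrl codec; the field that holds the raw line inside the VRL program (.message below) is an assumption, not something stated in this section:
[sources.my_source_id]
type = "demo_logs"
format = "json"
decoding.codec = "vrl"
# Assumes the generated line is exposed as .message; parse it and make the result the event
decoding.vrl.source = ". = parse_json!(string!(.message))"
decoding.vrl.timezone = "UTC"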
format
required string literal enum
Option | Description |
---|---|
apache_common | Randomly generated logs in Apache common format. |
apache_error | Randomly generated logs in Apache error format. |
bsd_syslog | Randomly generated logs in Syslog format (RFC 3164). |
json | Randomly generated HTTP server logs in JSON format. |
shuffle | Lines are chosen at random from the list specified using lines. |
syslog | Randomly generated logs in Syslog format (RFC 5424). |
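A small sketch of the shuffle format, which replays user-supplied lines (the lines below are placeholders):
[sources.my_source_id]
type = "demo_logs"
format = "shuffle"
# Each emitted line is chosen at random from this list
lines = [ "line1", "line2" ]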
framing
optional object
Framing configuration.
Framing handles how events are separated when encoded in a raw byte form, where each event is a frame that must be prefixed, or delimited, in a way that marks where an event begins and ends within the byte stream.
framing.character_delimited
required object (relevant when method = "character_delimited")
framing.character_delimited.delimiter
required ascii_char
framing.character_delimited.max_length
optional uint
The maximum length of the byte buffer.
This length does not include the trailing delimiter.
By default, there is no maximum length enforced. If events are malformed, this can lead to additional resource usage as events continue to be buffered in memory, and can potentially lead to memory exhaustion in extreme cases.
If there is a risk of processing malformed data, such as logs with user-controlled input, consider setting the maximum length to a reasonably large value as a safety net. This ensures that processing is not actually unbounded.
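A sketch of character-delimited framing; the comma delimiter and the length cap are arbitrary illustrative choices:
[sources.my_source_id]
type = "demo_logs"
format = "apache_common"
framing.method = "character_delimited"
framing.character_delimited.delimiter = ","
# Optional safety net against unbounded buffering of malformed frames
framing.character_delimited.max_length = 102400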
framing.chunked_gelf
optional object (relevant when method = "chunked_gelf")
framing.chunked_gelf.max_length
optional uint
The maximum length of a single GELF message, in bytes. Messages longer than this length will be dropped. If this option is not set, the decoder does not limit the length of messages and the per-message memory is unbounded.
Note that a message can be composed of multiple chunks and this limit is applied to the whole message, not to individual chunks.
This limit only takes into account the message's payload; the GELF header bytes are excluded from the calculation. The message's payload is the concatenation of all the chunks' payloads.
framing.chunked_gelf.pending_messages_limit
optional uint
framing.chunked_gelf.timeout_secs
optional float
default: 5
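A sketch of how the chunked GELF limits might be set (the values are placeholders):
[sources.my_source_id]
type = "demo_logs"
format = "syslog"
framing.method = "chunked_gelf"
# Drop any reassembled message whose payload exceeds ~1 MiB
framing.chunked_gelf.max_length = 1048576
framing.chunked_gelf.timeout_secs = 5.0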
framing.length_delimited
required object (relevant when method = "length_delimited")
framing.length_delimited.length_field_is_big_endian
optional bool
default: true
framing.length_delimited.length_field_length
optional uint
default: 4
framing.length_delimited.length_field_offset
optional uint
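A sketch of length-delimited framing spelling out the defaults listed above (a 4-byte, big-endian length prefix; the zero offset is an assumption for illustration):
[sources.my_source_id]
type = "demo_logs"
format = "apache_common"
framing.method = "length_delimited"
framing.length_delimited.length_field_is_big_endian = true
framing.length_delimited.length_field_length = 4
framing.length_delimited.length_field_offset = 0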
framing.method
optional string literal enum
Option | Description |
---|---|
bytes | Byte frames are passed through as-is according to the underlying I/O boundaries (for example, split between messages or stream segments). |
character_delimited | Byte frames which are delimited by a chosen character. |
chunked_gelf | Byte frames which are chunked GELF messages. |
length_delimited | Byte frames which are prefixed by an unsigned big-endian 32-bit integer indicating the length. |
newline_delimited | Byte frames which are delimited by a newline character. |
octet_counting | Byte frames according to the octet counting format. |
default: bytes
framing.newline_delimited
optional object (relevant when method = "newline_delimited")
framing.newline_delimited.max_length
optional uint
The maximum length of the byte buffer.
This length does not include the trailing delimiter.
By default, there is no maximum length enforced. If events are malformed, this can lead to additional resource usage as events continue to be buffered in memory, and can potentially lead to memory exhaustion in extreme cases.
If there is a risk of processing malformed data, such as logs with user-controlled input, consider setting the maximum length to a reasonably large value as a safety net. This ensures that processing is not actually unbounded.
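Following that advice, a sketch of newline-delimited framing with a length cap as a safety net (the limit value is arbitrary):
[sources.my_source_id]
type = "demo_logs"
format = "apache_common"
framing.method = "newline_delimited"
# Cap the buffered frame size so a missing newline cannot grow memory without bound
framing.newline_delimited.max_length = 65536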
framing.octet_counting
optional object (relevant when method = "octet_counting")
framing.octet_counting.max_length
optional uint
interval
optional float
The amount of time, in seconds, to pause between each batch of output lines.
The default is one batch per second. To remove the delay and output batches as quickly as possible, set interval to 0.0.
default: 1 (seconds)
sequence
optional bool (relevant when format = "shuffle")
If true, each output line starts with an increasing sequence number, beginning with 0.
default: false
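Putting the shuffle-specific options together, a sketch that replays placeholder lines as quickly as possible with sequence numbers:
[sources.my_source_id]
type = "demo_logs"
format = "shuffle"
lines = [ "line1", "line2" ]
sequence = true  # prefix each line with an increasing sequence number
interval = 0.0   # no pause between batches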
Outputs
<component_id>
Output Data
Logs
Telemetry
Metrics
component_discarded_events_total
counter
The intentional tag is true if the events were discarded intentionally, like a filter transform, or false if due to an error.
component_errors_total
counter
component_received_bytes_total
counter
component_received_event_bytes_total
counter
component_received_events_count
histogram
A histogram of the number of events passed in each internal batch in Vector's internal topology.
Note that this is separate from sink-level batching. It is mostly useful for low-level debugging of performance issues in Vector due to small internal batches.