Describe the bug
We have a scenario where security-related logs are sent to S3 using Fluent Bit. This service is shared by two of our teams (Network & Security). The Network team has already set up two INPUT plugins, multiple FILTERs, and two OUTPUTs to send their events to S3. While adding the Security team's INPUT, FILTER, and OUTPUT plugins, we do get objects in S3, but not the desired results: it is dumping all logs.
[INPUT]
    Name              tail
    Tag               soc
    Path              /var/log/**/*.log
    Skip_Long_Lines   On
    Refresh_Interval  10
    Inotify_Watcher   false
    multiline.parser  cri
    storage.type      filesystem
    Buffer_Chunk_Size 64KB
    Buffer_Max_Size   128KB

[FILTER]
    Name      modify
    Match     soc
    Condition Key_value_matches log (securityContext:.+privileged:\strue|securityContext:.+allowPrivilegeEscalation:\strue)
    Set       keep true

[FILTER]
    Name      modify
    Match     soc
    Condition Key_value_matches log (securityContext:.+runAsNonRoot:\sfalse|containerSecurityContext:.+runAsNonRoot:\sfalse)
    Set       keep true

[FILTER]
    Name      modify
    Match     soc
    Condition Key_value_matches log authorization.k8s.io.+selfsubjectaccessreviews
    Set       keep true

[FILTER]
    Name      modify
    Match     soc
    Condition Key_value_matches log authorization.k8s.io.+selfsubjectrulesreviews
    Set       keep true

[OUTPUT]
    Name                   s3
    Match                  soc
    bucket                 bucketname
    region                 awsregion
    json_date_key          date
    json_date_format       iso8601
    total_file_size        100M
    upload_chunk_size      6M
    upload_timeout         10m
    store_dir              /var/log/s3/AWS/accountid/awsregion
    store_dir_limit_size   256M
    s3_key_format          /var/log/s3/AWS/accountid/awsregion/$TAG/%Y/%m/%d/%H/${HOSTNAME}$UUID.log
    auto_retry_requests    true
    preserve_data_ordering true
    retry_limit            1
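One thing worth noting about the filters above: as far as I understand, the `modify` filter only performs its `Set` when the `Condition` matches; records that do not match any condition are passed through unchanged rather than dropped, so every tailed line still reaches the S3 output. If the intent is to forward only the matching lines, the `grep` filter is the usual mechanism. A minimal, untested sketch reusing one of the regexes from this config:

```
# Assumption: we only want records whose "log" field matches the pattern.
# grep keeps records matching Regex and discards everything else.
[FILTER]
    Name  grep
    Match soc
    Regex log (securityContext:.+privileged:\strue|securityContext:.+allowPrivilegeEscalation:\strue)
```

Multiple `Regex` entries in a single `grep` filter are AND-ed together, so to OR several patterns they would need to be combined into one regex (as the alternation above does) rather than listed as separate rules.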
Listed above are my INPUT, FILTER, and OUTPUT plugins. The commit was applied via Argo CD and passed without any errors, but the tag-matched logs are not being sent to S3 on their own; instead, all /var/log messages are being sent.
Can anyone help me understand what issue I'm facing here?