First - thanks for contributing this tool! At $dayjob we're looking at piloting laikaboss as the engine for automated analysis of files extracted from a large grid of network sensors, so here's the first of several newbie questions:
Right now I have a networked instance of laika running and accepting requests from cloudscan.py just fine. I plan to use cloudscan to send files to laikad from each sensor; however, I'd like the results of those scans to be sent from the centralized laikad instance to a log aggregation point. It looks like I can accomplish this with the log_fluent module, but I'm having trouble figuring out how to configure it to send these logs to my fluentd endpoint. Is this done in laikad.conf or somewhere else? I'd really appreciate it if someone could point me in the right direction. Once I have it figured out, I'd be happy to write documentation on it and contribute it back.
Many thanks!
You'll see that by default, it sends to a local instance of fluentd. You can then configure your local fluentd instance to forward to a central collector. Configuring fluentd this way offers some additional flexibility, such as load balancing the output across multiple receivers. I believe you could also go directly to the central receiver by configuring the module that way. You can change the settings two ways:
In conditional-dispatch.yara, modify the LOG_FLUENT call to be something like `LOG_FLUENT(host=mylogreceive.company.com,port=27524,tag=whatever)`
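To show where that call sits, here is a minimal sketch of a dispatch rule. The rule name, the `scan_modules` meta key, and the always-true condition are assumptions for illustration; only the LOG_FLUENT argument syntax comes from this thread, so check your own conditional-dispatch.yara for the exact rule layout your install uses:

```
// Hypothetical dispatch rule (names are placeholders, not from the laikaboss source)
rule log_all_scans_to_fluent
{
    meta:
        // Route every scanned object through the fluentd logging module
        scan_modules = "LOG_FLUENT(host=mylogreceive.company.com,port=27524,tag=whatever)"
    condition:
        true
}
```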
All of the available options are in the code -- note that the get_option() helper function takes the dispatcher argument name, the config item name, and a default.
If you wanted to set the host option, you could do it two ways:
First, as a dispatcher argument: it would just be `host`, as shown in the example above.
Second, you could set a config item in laikaboss.conf under ModuleHelpers.
If you set neither, the third argument (in this case "localhost") is used as the default.
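The precedence described above can be sketched in a few lines. This is an illustrative stand-in, not the actual laikaboss get_option() implementation -- the parameter names and the dict-based config are assumptions chosen to make the resolution order concrete:

```python
# Hypothetical sketch of the option-resolution order described above.
# Not the real laikaboss helper; names and types are placeholders.

def get_option(args, arg_name, config_name, default, config=None):
    """Resolve a module setting: dispatcher arg > config item > default."""
    config = config or {}
    if arg_name in args:        # 1. dispatcher argument from the dispatch rule
        return args[arg_name]
    if config_name in config:   # 2. item from laikaboss.conf ModuleHelpers
        return config[config_name]
    return default              # 3. hard-coded fallback

# A dispatcher argument wins over both the config item and the default:
print(get_option({'host': 'mylogreceive.company.com'}, 'host', 'fluenthost',
                 'localhost', config={'fluenthost': 'fluentd.internal'}))

# With no dispatcher argument and no config item, the default is used:
print(get_option({}, 'host', 'fluenthost', 'localhost'))
```

The point is simply that a per-rule dispatcher argument overrides the site-wide config, which in turn overrides the module's built-in default.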
Hope this helps! Let me know if you have any further questions.
p.s. We provided LOG_FLUENT as an example -- we recognize not everyone will use it, but it's a nice "out of the box" solution. If you have a preferred way to collect logs and want to use LOG_FLUENT as a starting point to develop a new module, feel free to contribute it back!