Datajson v3.1 #12319
base: master
Conversation
ERROR: QA failed on build_asan. Pipeline 24045
ERROR: QA failed on build_asan. Pipeline 24046
I modified the PR description to fix the SV branch and restarted the checks.
Codecov Report
Attention: Patch coverage is ...
Additional details and impacted files
@@ Coverage Diff @@
## master #12319 +/- ##
==========================================
- Coverage 83.26% 83.09% -0.17%
==========================================
Files 912 920 +8
Lines 257643 259009 +1366
==========================================
+ Hits 214521 215222 +701
- Misses 43122 43787 +665
Flags with carried forward coverage won't be shown.
Sorry I missed that again.
@regit there are some build errors in the CI checks as well. Have you looked at them?
Indeed, going to fix that.
Setting to draft as it needs some more love but I would be happy to get feedback on the concept. |
ERROR: QA failed on build_asan. Pipeline 24048
This pull request sets up GitHub code scanning for this repository. Once the scans have completed and the checks have passed, the analysis results for this pull request branch will appear on this overview. Once you merge this pull request, the 'Security' tab will show more code scanning analysis results (for example, for the default branch). Depending on your configuration and choice of analysis tool, future pull requests will be annotated with code scanning analysis results. For more information about GitHub code scanning, check out the documentation.
ERROR: QA failed on build_asan. Pipeline 24049
This patch introduces a new keyword datajson that is similar to dataset, with a twist. Where dataset allows matching from sets, datajson allows the same but also adds JSON data to the alert event. This data comes from the set definition itself. For example, an ipv4 set will look like:

10.16.1.11,{"test": "success","context":3}

The syntax is the value and the JSON data separated by a comma.

The syntax of the keyword is the following:

datajson:isset,src_ip,type ip,load src.lst,key src_ip;

Compared to dataset, it just has a supplementary option key that is used to indicate in which subobject the JSON value should be added.

The information is added to the event under the alert.extra subobject:

"alert": {
  "extra": {
    "src_ip": {
      "test": "success",
      "context": 3
    },

The main interest of the feature is the ability to contextualize a match. For example, if you have an IOC source, you can do:

value1,{"actor":"APT28","Country":"FR"}
value2,{"actor":"APT32","Country":"NL"}

This way, a single dataset is able to provide context to the event, where previously this was not possible and multiple signatures had to be used.

Ticket: OISF#7372
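As a worked illustration (not taken from the patch), a complete signature using the keyword could look like the sketch below; the rule itself, the sid, and the use of the ip.src keyword are assumptions made for the example.

```
# src.lst - one value per line, JSON context after the comma:
#   10.16.1.11,{"test": "success","context":3}

alert ip any any -> any any (msg:"Match on contextualized IOC set"; \
    ip.src; datajson:isset,src_ip,type ip,load src.lst,key src_ip; \
    sid:1; rev:1;)
```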
The previous code was using an array, which introduced a limit on the number of datajson keywords that can be used in a signature. This patch uses a linked list instead to overcome the limit. By making the first element of the list part of the structure, we limit the cost of the feature to a single structure member added to the PacketAlert structure. Only the PacketAlertFree function is impacted, as we need to iterate the list to find potential allocations. Ticket: OISF#7372
It was not correctly handling JSON values containing spaces, as they were seen as multiple arguments. Ticket: OISF#7372
As 1.2.23.4,1 can be both a valid datarep and a valid datajson entry, we can't differentiate datarep and datajson contents by the value alone, so we need to separate the parsing based on knowing which keyword is in use.
With the datajson infrastructure in place, it is now possible to add data to the extra information section. Following an idea by Jason Ish, this patch adds the feature for pcre extraction. A PCRE such as pcre:"/(?P<alert_ua>[a-zA-Z]+)\//" will add the content of the captured group to alert.extra.ua.
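Assuming the alert.extra layout shown in the first commit message, the resulting EVE record might then carry something like the sketch below (the captured value is made up):

```
"alert": {
  "extra": {
    "ua": "Mozilla"
  }
}
```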
It can contain any vars, so it needs additional properties.
As the previous commit adds the alert option, let's document the full family.
The format introduced in datajson is an evolution of the historical datarep format. This has some limitations. For example, if a user fetches IOCs from a threat intel server, there is a large chance that the format will be JSON or XML. Suricata has no support for the second but can support the first one.

This patch implements this concept. An optional json_key option is added to the datajson keyword. If present, Suricata assumes that the data file contains a JSON array. For each element elt of this array, the value added to the dataset is elt['$json_key'] and the JSON data is elt itself. Keeping the key value may seem redundant, but it is useful to have it directly accessible in the extra data, to be able to query it independently of the signature (where it can be multiple metadata or even a transformed metadata).
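As an illustration (the set name, file name, and field names are hypothetical, not from the patch), a JSON data file used with json_key could look like:

```
[
  {"ip": "10.16.1.11", "actor": "APT28", "Country": "FR"},
  {"ip": "10.16.1.12", "actor": "APT32", "Country": "NL"}
]
```

loaded with a keyword along these lines:

```
datajson:isset,bad_ips,type ip,load iocs.json,key src_ip,json_key ip;
```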
In some cases, when interacting with data (mostly coming from threat intel servers), the JSON array containing the data to use is not at the root of the object and it is necessary to access a subobject. This patch implements this with support for a key of the form level1.level2. This is done via the `array_key` option that contains the path to the data.
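A minimal sketch of such a nested file, with hypothetical field names:

```
{
  "level1": {
    "level2": [
      {"ip": "10.16.1.11", "actor": "APT28"}
    ]
  }
}
```

Here `array_key level1.level2` would point at the inner array holding the entries.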
This patch separates the datajson code from dataset. Files like datasets.c were really too long after adding the datajson feature. In doing so, some functions implementing the datajson feature have been renamed with a Datajson prefix to simplify them.
ERROR: QA failed on build_asan. Pipeline 24050
With this patch, it is now possible to define the value to be used in the datajson set as a value in a chain of subobjects. For example, with the following JSON:

{
  "info": {
    "threat": [
      {
        "context": "gold old test",
        "year": 2005,
        "host": {
          "fqdn": "www.testmyids.com",
          "domain": "testmyids.com"
        }
      }
    ]
  }
}

it is possible to match on host.fqdn by doing:

http.host; datajson:isset,nkbadhost,type string,load hosts-nested-key.json,key host,json_key host.fqdn,array_key info.threat;

Here `array_key info.threat` is used to access the inner array and `json_key host.fqdn` to access the field inside each element.
ERROR: QA failed on build_asan. Pipeline 24051
Information:
ERROR: QA failed on SURI_TLPW2_autofp_suri_time.
ERROR: QA failed on SURI_TLPR1_suri_time.
Pipeline 24052
Update of #12289.
This is a major functional upgrade, and it also includes a code organization update: most datajson code is now in its own files to limit the size of files like datasets.c, which was becoming really complicated to manage.
On the functional side, there is a new feature that changes the scope of the original proposal.
There are a lot of tools where the IOCs are published in a format like the following: an object containing an array, with one object per IOC in the array, where some of the keys contain the IOC value we want to inject into Suricata.
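A sketch of such a feed (the structure follows the description above; all names and values are illustrative):

```
{
  "response": {
    "iocs": [
      {"value": "bad.example.com", "actor": "APT28", "confidence": 80},
      {"value": "evil.example.org", "actor": "APT32", "confidence": 60}
    ]
  }
}
```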
The feature update consists of giving Suricata the capability to directly handle this type of data. To do so, 2 options have been added: json_key and array_key.
In the previous example, we can use it with the following match:
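For the hypothetical feed sketched above (the set name and file name are assumptions; the options are the ones introduced in the commits), such a match would be along these lines:

```
http.host; datajson:isset,feed_iocs,type string,load feed.json,key ioc,json_key value,array_key response.iocs;
```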
A signature with this match will then produce something like:
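Based on the alert.extra layout described in the commits, and with the hypothetical field names used above, the output would presumably resemble:

```
"alert": {
  "extra": {
    "ioc": {
      "value": "bad.example.com",
      "actor": "APT28",
      "confidence": 80
    }
  }
}
```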
The previous data format has been kept, so this new one is an additional choice for the user. The advantage is that it does not require post-processing to inject the data into Suricata, which should be convenient for most users.
Contribution style:
https://docs.suricata.io/en/latest/devguide/contributing/contribution-process.html
Our Contribution agreements:
https://suricata.io/about/contribution-agreement/ (note: this is only required once)
Changes (if applicable):
(including schema descriptions)
https://redmine.openinfosecfoundation.org/projects/suricata/issues
Link to ticket: https://redmine.openinfosecfoundation.org/issues/
Describe changes:
SV_BRANCH=OISF/suricata-verify#2205