[FEATURE IDEA] User data store for IP lists, etc. #275
Comments
We have two concepts in the roadmap for August:
Would you have a mission-critical use-case for this?
The reference table feature sounds like what I'm looking for, as long as the upload or maintenance could happen from within a workflow. For example, with a resource like https://iptoasn.com/ I would want a workflow running once a week to ingest that dataset into the table, and then in other workflows I could use an action to look up an IP in that table. The use cases I had in mind are checking incoming alerts against those lists to determine what I want to do with them; the rest fall into the enrichment category (looking up ASN or geolocation when the alert data does not contain it).

Mission-critical? Depends on how you define that. A pretty standard activity for a SOC analyst is to enrich an IP by determining its location and whether it's a known Tor exit node, VPN service, etc. ASN lookups are handy for blocking in services like Okta, where you can block an entire ASN rather than just individual IPs. If we build integrations for firewall management, I can see the lists being useful for maintaining a whitelist of IPs to never block, or for firing off extra workflows when you get an alert for an IP on a monitoring list; that extra workflow could reach out to your EDR to run a script, gather artifacts, etc. Maybe I'm not blocking it outright, but I want to know when it happens so I can hang other actions off of it.
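To make the ingest-then-lookup idea concrete, here's a rough sketch (plain Python, not Tracecat code) of what the weekly ingest and the per-alert lookup could do with that dataset. It assumes iptoasn.com's TSV columns are range_start, range_end, as_number, country_code, as_description; double-check the file format before relying on it.

```python
# Sketch of the kind of lookup a reference table would back.
# Assumed column order: range_start, range_end, as_number, country_code, as_description.
import bisect
import csv
import ipaddress

def load_ip2asn(path: str):
    """Load the TSV into a list of ranges sorted by range start."""
    rows = []
    with open(path, newline="") as f:
        for start, end, asn, cc, name in csv.reader(f, delimiter="\t"):
            rows.append((int(ipaddress.ip_address(start)),
                         int(ipaddress.ip_address(end)), int(asn), cc, name))
    rows.sort()                              # order by range start for binary search
    starts = [r[0] for r in rows]
    return starts, rows

def lookup_asn(ip: str, starts, rows):
    """Return ASN details for an IP, or None if it falls outside every range."""
    ip_int = int(ipaddress.ip_address(ip))
    i = bisect.bisect_right(starts, ip_int) - 1
    if i >= 0 and ip_int <= rows[i][1]:
        _, _, asn, cc, name = rows[i]
        return {"asn": asn, "country": cc, "as_name": name}
    return None

# Example: enrich an alert's source IP once the table is loaded.
# starts, rows = load_ip2asn("ip2asn-v4.tsv")
# print(lookup_asn("8.8.8.8", starts, rows))
```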
Clarification: by "mission critical", would this be a blocking feature for evaluating / using Tracecat this month and in August? We definitely want to add it (and plan to build it in August and release before September), as it's something other folks have requested privately as well.
💯 It's slow, wasteful, and flaky to have to pull in IoC lists from your TI source on every workflow run. Reference tables are a really nice feature. They're also a necessary building block for more "AI-enabled" features (e.g. associating cases and detections with MITRE ATT&CK labels, where the labels would be stored as reference tables).
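As a throwaway illustration of the labels-as-reference-table idea, here's a minimal Python sketch; the rule names and case shape are made up, only the ATT&CK technique IDs are real.

```python
# Map detection rule names to ATT&CK technique IDs and tag a case on the way in.
ATTACK_LABELS = {
    "suspicious_powershell_encoded_command": "T1059.001",  # Command and Scripting Interpreter: PowerShell
    "new_tor_exit_node_connection": "T1090.003",           # Proxy: Multi-hop Proxy
}

def label_case(case: dict) -> dict:
    """Attach an ATT&CK technique tag to a case based on its detection rule."""
    technique = ATTACK_LABELS.get(case.get("rule_name", ""))
    if technique:
        case.setdefault("tags", []).append(f"attack.{technique}")
    return case

# Example:
# label_case({"rule_name": "new_tor_exit_node_connection"})
# -> {"rule_name": "new_tor_exit_node_connection", "tags": ["attack.T1090.003"]}
```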
Not mission-critical by this definition for me. We definitely have use cases that these features would enable, but it's not blocking for me.
Just wanted to throw a few more use cases on the pile for this one:
Huge. Overall thinking: two-way sync embedded into your SOAR would be extremely useful. Over time, we could build in two-way sync across different tooling (e.g. your SentinelOne cases), so you don't need to add 2-3 extra actions just for the "update" step after triage and investigation.
Yes, I had a SQL/ODBC integration on my list to get data in and out of other systems, but having it in the platform to sync these user tables with other sources would be awesome, without having to sprinkle those actions into every workflow.
Is your feature request related to a problem? Please describe.
We have the database sink action, but it is intended to sink to external DBs. A way for users to store data locally in Tracecat, and then query against it in other actions, would be really handy. Example use case: schedule one workflow once a day to pull the latest Tor exit nodes into the store, and then in a separate incident/event workflow compare an IP against that list without having to pull the list every time that workflow runs.
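A rough sketch of that two-workflow pattern, with SQLite standing in for the proposed local store. The bulk exit list URL is the Tor Project's published endpoint; treat it as an assumption and verify it before depending on it.

```python
# Daily workflow refreshes the list; incident workflows only query the local store.
import sqlite3
import urllib.request

DB = "tracecat_local.db"
EXIT_LIST_URL = "https://check.torproject.org/torbulkexitlist"  # assumed endpoint

def refresh_tor_exit_nodes():
    """Scheduled (e.g. daily) workflow: replace the stored exit node list."""
    body = urllib.request.urlopen(EXIT_LIST_URL, timeout=30).read().decode()
    ips = [line.strip() for line in body.splitlines() if line.strip()]
    with sqlite3.connect(DB) as conn:
        conn.execute("CREATE TABLE IF NOT EXISTS tor_exit_nodes (ip TEXT PRIMARY KEY)")
        conn.execute("DELETE FROM tor_exit_nodes")
        conn.executemany("INSERT OR IGNORE INTO tor_exit_nodes (ip) VALUES (?)",
                         [(ip,) for ip in ips])

def is_tor_exit_node(ip: str) -> bool:
    """Incident/event workflow: check an alert IP without re-downloading the list."""
    with sqlite3.connect(DB) as conn:
        row = conn.execute("SELECT 1 FROM tor_exit_nodes WHERE ip = ?", (ip,)).fetchone()
    return row is not None
```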
Other good use cases for having this capability:
Describe the solution you'd like
2 actions:
Describe alternatives you've considered
Other alternatives:
Additional context