
[AI-5153] DDS: Mac Audit Logs Integration v1.0.0 #19989


Open · wants to merge 15 commits into master

Conversation

tirthrajchaudhari-crest
Contributor

What does this PR do?

This is the initial release PR for the Mac Audit Logs integration, including all the required assets. This is an agent-based integration.

Additional Notes

  • OOTB detection rules JSON will be shared separately with the required teams as part of a separate repository.
  • Because the standard attribute remapping does not preserve the source attributes (per the suggested best practices), filters that use these standard attributes will also match values from other integrations, as per current Datadog behavior.

Review checklist (to be filled by reviewers)

  • Feature or bugfix MUST have appropriate tests (unit, integration, e2e)
  • Add the qa/skip-qa label if the PR doesn't need to be tested during QA.
  • If you need to backport this PR to another branch, you can add the backport/<branch-name> label to the PR and it will automatically open a backport PR once this one is merged


codecov bot commented Apr 2, 2025

Codecov Report

Attention: Patch coverage is 60.55046% with 43 lines in your changes missing coverage. Please review.

Project coverage is 88.65%. Comparing base (2d13353) to head (39b8924).
Report is 67 commits behind head on master.

Additional details and impacted files
| Flag | Coverage Δ |
| --- | --- |
| activemq | ? |
| cassandra | ? |
| hive | ? |
| hivemq | ? |
| hudi | ? |
| ignite | ? |
| jboss_wildfly | ? |
| kafka | ? |
| mac_audit_logs | 60.55% <60.55%> (?) |
| presto | ? |
| solr | ? |

Flags with carried forward coverage won't be shown.


@tirthrajchaudhari-crest tirthrajchaudhari-crest marked this pull request as ready for review April 2, 2025 15:26
@tirthrajchaudhari-crest tirthrajchaudhari-crest requested review from a team as code owners April 2, 2025 15:26
@tirthrajchaudhari-crest tirthrajchaudhari-crest changed the title DDS: Mac Audit Logs Integration v1.0.0 [AI-5153] DDS: Mac Audit Logs Integration v1.0.0 Apr 2, 2025
@drichards-87 drichards-87 added the editorial review Waiting on a more in-depth review from a docs team editor label Apr 2, 2025
@drichards-87
Contributor

Created DOCS-10537 for Docs Team editorial review.

@nubtron
Contributor

nubtron commented Apr 8, 2025

Hi @tirthrajchaudhari-crest, just a quick update: I'm working on the review and discussing some aspects with the team. I'll share it as soon as it's ready!

Contributor

@nubtron nubtron left a comment

Setting the time zone in the backend is inconvenient for the customer, and it would cause complications if there are clients in different time zones. Could you please modify the integration so that timezones are handled automatically?

You can read the time zone with, for example, the `date "+%z"` command. Then you can convert the timestamp and send the log data with the `send_log` function:

```python
from datadog_checks.base import AgentCheck
from datadog_checks.base.utils.time import get_timestamp

class HelloCheck(AgentCheck):
    def check(self, instance):
        # Build the log payload and submit it through the Agent.
        data = {
            'timestamp': get_timestamp(),
            'message': "this is a custom log message!",
            'ddtags': "env:dev,bar:foo",
        }
        self.send_log(data)
```
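The conversion step can be sketched with the standard library. The helper names (`parse_offset`, `local_utc_offset`, `to_epoch`) and the timestamp format below are illustrative, not part of the integration:

```python
import subprocess
from datetime import datetime, timedelta, timezone

def parse_offset(raw: str) -> timezone:
    """Turn a `date "+%z"` string such as "+0530" into a tzinfo."""
    sign = 1 if raw[0] == "+" else -1
    return timezone(sign * timedelta(hours=int(raw[1:3]), minutes=int(raw[3:5])))

def local_utc_offset() -> timezone:
    # Ask the OS for the machine's current UTC offset.
    out = subprocess.run(["date", "+%z"], capture_output=True, text=True, check=True)
    return parse_offset(out.stdout.strip())

def to_epoch(local_ts: str, tz: timezone) -> float:
    # Interpret a wall-clock timestamp (format is illustrative) in the given zone
    # and return an epoch value suitable for the log payload's 'timestamp' field.
    return datetime.strptime(local_ts, "%Y-%m-%d %H:%M:%S").replace(tzinfo=tz).timestamp()
```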

You also need to change the `type` entry in the `logs` section to `integration`, for example:

```yaml
logs:
  - type: integration
    source: my_source
    service: my_service
```

The Octopus Deploy and SAP HANA integrations use `send_log` and can serve as examples.

You can parse the XML in the logs with Python code. Use the `lxml` library for that instead of the built-in Python `xml` library, which has security issues. The IBM WAS integration has an example of `lxml` usage.
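A minimal sketch of such parsing with `lxml`. The `<record>` layout and the `parse_record` helper below are illustrative, not the exact praudit XML schema:

```python
from lxml import etree

def parse_record(xml_bytes: bytes) -> dict:
    """Flatten one illustrative audit <record> into a dict of attributes."""
    # Disable entity resolution and network access to avoid XXE-style issues.
    parser = etree.XMLParser(resolve_entities=False, no_network=True)
    root = etree.fromstring(xml_bytes, parser=parser)
    data = dict(root.attrib)
    # Prefix child attributes with the child's tag name, e.g. "subject.audit-uid".
    for child in root:
        for name, value in child.attrib.items():
            data[f"{child.tag}.{name}"] = value
    return data

record = b'<record version="11" event="login"><subject audit-uid="root"/></record>'
parsed = parse_record(record)
```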

- **Input/Output Control**
- **IPC (Inter-Process Communication)**

This integration collects mac audit logs and sends them to Datadog for analysis, providing visual insights through out-of-the-box dashboards and Log Explorer. It also helps monitor and respond to security threats with ready-to-use Cloud SIEM detection rules.
Contributor

Suggested change
This integration collects mac audit logs and sends them to Datadog for analysis, providing visual insights through out-of-the-box dashboards and Log Explorer. It also helps monitor and respond to security threats with ready-to-use Cloud SIEM detection rules.
This integration collects Mac audit logs and sends them to Datadog for analysis, providing visual insights through out-of-the-box dashboards and the Log Explorer. It also helps monitor and respond to security threats with ready-to-use Cloud SIEM detection rules.

Contributor Author

Done

### Configuration

#### Configure BSM Auditing on Mac

Contributor

Suggested change
**Note**: The following steps are required for the Mac version >=14.

Contributor Author

Done


4. Restart the Mac.

**Note**: The above steps are needed for the mac version >=14.
Contributor

Recommend moving this note to the beginning of the configuration.

Suggested change
**Note**: The above steps are needed for the mac version >=14.

Contributor Author

Done


2. Enter "Mac Audit Logs" in the **Filter Pipelines** search box.

3. Hover over the Mac Audit Logs pipeline and click on the **clone** button. This will create an editable clone of the Mac Audit Logs pipeline.
Contributor

Suggested change
3. Hover over the Mac Audit Logs pipeline and click on the **clone** button. This will create an editable clone of the Mac Audit Logs pipeline.
3. Hover over the Mac Audit Logs pipeline and click **clone**. This creates an editable clone of the Mac Audit Logs pipeline.

Contributor Author

Timezone support has been added in the code itself; hence, this section has been removed.


3. Hover over the Mac Audit Logs pipeline and click on the **clone** button. This will create an editable clone of the Mac Audit Logs pipeline.

4. Edit the Grok Parser using the below steps:
Contributor

Suggested change
4. Edit the Grok Parser using the below steps:
4. Edit the Grok Parser by following these steps:

Contributor Author

Timezone support has been added in the code itself; hence, this section has been removed.

4. Edit the Grok Parser using the below steps:
- In the cloned pipeline, find a processor with the name "Grok Parser: Parse \`record.time\` attribute" and click on the `Edit` button by hovering over the pipeline.
- Under **Define parsing rules**,
- Change the string `UTC` to the [TZ identifier][9] of the time zone of your MAC machine. For example, if your timezone is IST, you would change the value to`Asia/Calcutta`.
Contributor

Recommend consistent capitalization of Mac, unless this is all caps for a reason.

Suggested change
- Change the string `UTC` to the [TZ identifier][9] of the time zone of your MAC machine. For example, if your timezone is IST, you would change the value to`Asia/Calcutta`.
- Change the string `UTC` to the [TZ identifier][9] of the time zone of your Mac machine. For example, if your timezone is IST, you would change the value to`Asia/Calcutta`.

Contributor Author

Timezone support has been added in the code itself; hence, this section has been removed.

- In the cloned pipeline, find a processor with the name "Grok Parser: Parse \`record.time\` attribute" and click on the `Edit` button by hovering over the pipeline.
- Under **Define parsing rules**,
- Change the string `UTC` to the [TZ identifier][9] of the time zone of your MAC machine. For example, if your timezone is IST, you would change the value to`Asia/Calcutta`.
- Click the **update** button.
Contributor

Suggested change
- Click the **update** button.
- Click **Update**.

Contributor

To confirm, is the button lowercase?

Contributor Author

Timezone support has been added in the code itself; hence, this section has been removed.

4. Edit the Grok Parser using the below steps:
- In the cloned pipeline, find a processor with the name "Grok Parser: Parse \`record.time\` attribute" and click on the `Edit` button by hovering over the pipeline.
- Under **Define parsing rules**,
- Change the string `UTC` to the [TZ identifier][9] of the time zone of your MAC machine. For example, if your timezone is IST, you would change the value to`Asia/Calcutta`.
Contributor

To confirm, did you intend to send the reader into the logs pipelines page or to a reference for TZ identifier? If you meant to link to the logs pipeline page, it seems redundant since the user is directed there in step 1.

Suggested change
- Change the string `UTC` to the [TZ identifier][9] of the time zone of your MAC machine. For example, if your timezone is IST, you would change the value to`Asia/Calcutta`.
- Change the string `UTC` to the TZ identifier of the time zone of your MAC machine. For example, if your timezone is IST, you would change the value to`Asia/Calcutta`.

Contributor Author

Timezone support has been added in the code itself; hence, this section has been removed.

@tirthrajchaudhari-crest
Contributor Author


Hey @nubtron, thank you for the feedback. We are currently discussing this internally and will get back to you soon.

@tirthrajchaudhari-crest
Contributor Author

Hi @estherk15, I'll make the suggested changes to the README along with this change.

@tirthrajchaudhari-crest
Contributor Author


Hey @nubtron, we have made the necessary changes per your recommendations.

@nubtron
Contributor

nubtron commented Apr 15, 2025

Hi @tirthrajchaudhari-crest, there is an issue with the main check function being blocked by the praudit process; I'm looking into a good alternative.
