feat: Add distributed tracing example (#279)
Co-authored-by: James Sumners <[email protected]>
1 parent 391246c · commit e7a78c4
Showing 16 changed files with 271 additions and 1 deletion.
custom-instrumentation/distributed-tracing/.npmrc
@@ -0,0 +1 @@
package-lock=false
custom-instrumentation/distributed-tracing/README.md
@@ -0,0 +1,60 @@
# Sample distributed tracing application

This example provides both a BullMQ producer and consumer, along with a Redis instance.

The producer starts a transaction, inserts distributed trace headers into it, and then adds those headers as part of the job data pushed onto the queue. The producer and the New Relic agent shut down after 10 seconds.

The consumer starts a transaction, processes jobs from the queue, and links back to the producer's transaction by accepting the headers that were added as part of the job data.

## Getting started
**Note**: This application requires Node.js v20+ and Docker.

1. Clone or fork this repository.

2. Set up the Redis container:

```sh
docker compose up -d
```

3. Install dependencies and run the application:

```sh
npm install
cp env.sample .env
# Fill out `NEW_RELIC_LICENSE_KEY` in .env and save
# Start the consumer
npm run start:consumer
# Start the producer in a different shell
npm run start:producer
```

***You can send more messages to the consumer by rerunning the producer with `npm run start:producer`.***

## Exploring Telemetry
After the producer sends a few messages and the consumer processes them, navigate to your application under `APM & Services` and select `Distributed Tracing`. Transactions are created for the messages sent and processed. Because the consumer is running and handling message consumption, Distributed Tracing links the two entities.

![Producer distributed tracing](./images/producer-dt.png?raw=true "Producer distributed tracing")
![Producer distributed trace](./images/producer-dt-trace.png?raw=true "Producer distributed trace")

The producer service map shows two entities: the producer and the consumer.
![Producer service map](./images/producer-service-map.png?raw=true "Producer service map")

You will see a distributed trace and a service map for the consumer as well.

![Consumer distributed tracing](./images/consumer-dt.png?raw=true "Consumer distributed tracing")

The consumer service map shows both entities (producer and consumer) plus Redis.
![Consumer service map](./images/consumer-service-map.png?raw=true "Consumer service map")

A transaction is created for every message consumption.
![Consumer Transactions](./images/consumer-transactions.png)

## About `insertDistributedTraceHeaders` and `acceptDistributedTraceHeaders`

For context on how to use `acceptDistributedTraceHeaders` and `insertDistributedTraceHeaders`, first read [Enable distributed tracing with agent APIs](https://docs.newrelic.com/docs/distributed-tracing/enable-configure/language-agents-enable-distributed-tracing/).

You can use `insertDistributedTraceHeaders` and `acceptDistributedTraceHeaders` to link different transactions together. In this example, one background transaction is linked to another background transaction.

`insertDistributedTraceHeaders` modifies the headers map passed to it by adding W3C Trace Context headers and New Relic distributed trace headers. The New Relic headers can be disabled with `distributed_tracing.exclude_newrelic_header: true` in the config.

`acceptDistributedTraceHeaders` instruments the called service for inclusion in a distributed trace. It links the spans in a trace by accepting a payload generated by `insertDistributedTraceHeaders` or by some other W3C Trace Context compliant tracer. The method takes the headers of an incoming request, looks for W3C Trace Context headers, and falls back to New Relic distributed trace headers if those are not found.
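The W3C Trace Context `traceparent` header that gets propagated this way has the form `version-traceId-parentId-flags` (hex fields of 2, 32, 16, and 2 characters). A minimal sketch of parsing such a header, purely for illustration (this is not the agent's implementation, and `parseTraceparent` is a hypothetical helper):

```javascript
'use strict'

// Parse a W3C Trace Context `traceparent` header into its four fields.
// Illustrative only; the New Relic agent handles this internally.
function parseTraceparent(header) {
  const match = /^([0-9a-f]{2})-([0-9a-f]{32})-([0-9a-f]{16})-([0-9a-f]{2})$/.exec(header)
  if (!match) return null
  const [, version, traceId, parentId, flags] = match
  return { version, traceId, parentId, flags }
}

const parsed = parseTraceparent('00-0af7651916cd43dd8448eb211c80319c-b7ad6b7169203331-01')
console.log(parsed.traceId) // prints "0af7651916cd43dd8448eb211c80319c"
```

Any tracer that emits a header in this shape can participate in the same trace, which is why `acceptDistributedTraceHeaders` works with non-New-Relic W3C-compliant tracers.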
custom-instrumentation/distributed-tracing/consumer.js
@@ -0,0 +1,59 @@
/*
 * Copyright 2024 New Relic Corporation. All rights reserved.
 * SPDX-License-Identifier: Apache-2.0
 */

'use strict'
const newrelic = require('newrelic')
const { Worker } = require('bullmq')
const IORedis = require('ioredis')

const connection = new IORedis({
  maxRetriesPerRequest: null
})

// Since BullMQ is not auto-instrumented by the newrelic Node agent, we have to manually start a transaction.
return newrelic.startBackgroundTransaction('Message queue - consumer', function outerHandler() {
  const worker = new Worker(
    'jobQueue',
    async (job) => {
      // Create a transaction for every consumption.
      newrelic.startBackgroundTransaction('Message consumption', function innerHandler() {
        console.log('Processing job:', job.id)
        console.log('Job data:', job.data)
        console.log('Job headers', job.data.headers)

        // Call newrelic.getTransaction to retrieve a handle on the current transaction.
        const backgroundHandle = newrelic.getTransaction()

        // Link the transaction started in the producer by accepting its headers.
        backgroundHandle.acceptDistributedTraceHeaders('Queue', job.data.headers)

        // End the transaction.
        backgroundHandle.end()
        return Promise.resolve()
      })
    },
    { connection }
  )

  worker.on('completed', (job) => {
    console.log(`Job with ID ${job.id} has been completed`)
  })

  worker.on('failed', (job, err) => {
    console.log(`Job with ID ${job.id} has failed with error: ${err.message}`)
  })

  console.log('Worker started')

  return new Promise((resolve) => {
    process.on('SIGINT', () => {
      newrelic.shutdown({ collectPendingData: true }, () => {
        console.log('new relic agent shutdown')
        // eslint-disable-next-line no-process-exit
        process.exit(0)
      })
    })
  })
})
custom-instrumentation/distributed-tracing/docker-compose.yml (13 additions, 0 deletions)
@@ -0,0 +1,13 @@
version: '3'
services:
  redis:
    image: redis:latest
    container_name: sample_redis
    ports:
      - "6379:6379"
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 10s
      timeout: 60s
      retries: 60
custom-instrumentation/distributed-tracing/env.sample
@@ -0,0 +1 @@
NEW_RELIC_LICENSE_KEY=
custom-instrumentation/distributed-tracing/.eslintrc.js
@@ -0,0 +1,16 @@
/*
 * Copyright 2024 New Relic Corporation. All rights reserved.
 * SPDX-License-Identifier: Apache-2.0
 */

'use strict'

module.exports = {
  extends: '@newrelic',
  parserOptions: {
    ecmaVersion: 'latest'
  },
  rules: {
    'no-console': 'off'
  }
}
Binary file added: BIN +292 KB custom-instrumentation/distributed-tracing/images/consumer-service-map.png
Binary file added: BIN +421 KB custom-instrumentation/distributed-tracing/images/consumer-transactions.png
Binary file added: BIN +628 KB custom-instrumentation/distributed-tracing/images/producer-dt-trace.png
Binary file added: BIN +289 KB custom-instrumentation/distributed-tracing/images/producer-service-map.png
custom-instrumentation/distributed-tracing/newrelic.js
@@ -0,0 +1,50 @@
/*
 * Copyright 2024 New Relic Corporation. All rights reserved.
 * SPDX-License-Identifier: Apache-2.0
 */

'use strict'
/**
 * New Relic agent configuration.
 *
 * See lib/config/default.js in the agent distribution for a more complete
 * description of configuration variables and their potential values.
 */
exports.config = {
  logging: {
    /**
     * Level at which to log. 'trace' is most useful to New Relic when diagnosing
     * issues with the agent, 'info' and higher will impose the least overhead on
     * production applications.
     */
    level: 'info'
  },
  /**
   * When true, all request headers except for those listed in attributes.exclude
   * will be captured for all traces, unless otherwise specified in a destination's
   * attributes include/exclude lists.
   */
  allow_all_headers: true,
  attributes: {
    /**
     * Prefix of attributes to exclude from all destinations. Allows * as wildcard
     * at end.
     *
     * NOTE: If excluding headers, they must be in camelCase form to be filtered.
     *
     * @env NEW_RELIC_ATTRIBUTES_EXCLUDE
     */
    exclude: [
      'request.headers.cookie',
      'request.headers.authorization',
      'request.headers.proxyAuthorization',
      'request.headers.setCookie*',
      'request.headers.x*',
      'response.headers.cookie',
      'response.headers.authorization',
      'response.headers.proxyAuthorization',
      'response.headers.setCookie*',
      'response.headers.x*'
    ]
  }
}
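The `exclude` entries above support a trailing `*` wildcard, so a pattern like `request.headers.x*` filters every attribute with that prefix. Conceptually the matching behaves like the sketch below (an illustration only, with a hypothetical `isExcluded` helper, not the agent's actual filter code):

```javascript
'use strict'

// Illustrative sketch: decide whether an attribute name matches any exclude
// pattern, where a trailing '*' acts as a prefix wildcard. Not the agent's real code.
function isExcluded(attribute, patterns) {
  return patterns.some((pattern) =>
    pattern.endsWith('*')
      ? attribute.startsWith(pattern.slice(0, -1))
      : attribute === pattern
  )
}

const patterns = ['request.headers.cookie', 'request.headers.x*']
console.log(isExcluded('request.headers.xForwardedFor', patterns)) // prints true
console.log(isExcluded('request.headers.accept', patterns)) // prints false
```

Note the config's own caveat: header names must be in camelCase form (as in `xForwardedFor` above) to be filtered.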
custom-instrumentation/distributed-tracing/package.json
@@ -0,0 +1,22 @@
{
  "name": "distributed-tracing",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "start:producer": "NEW_RELIC_APP_NAME=message-queue-producer node -r newrelic --env-file .env producer.js",
    "start:consumer": "NEW_RELIC_LOG=./consumer_agent.log NEW_RELIC_APP_NAME=message-queue-consumer node -r newrelic --env-file .env consumer.js",
    "lint": "eslint . ",
    "lint:fix": "eslint . --fix"
  },
  "author": "",
  "license": "ISC",
  "dependencies": {
    "bullmq": "^5.10.1",
    "ioredis": "^5.4.1",
    "newrelic": "^11.19.0"
  },
  "devDependencies": {
    "@newrelic/eslint-config": "^0.4.0"
  }
}
custom-instrumentation/distributed-tracing/producer.js
@@ -0,0 +1,47 @@
/*
 * Copyright 2024 New Relic Corporation. All rights reserved.
 * SPDX-License-Identifier: Apache-2.0
 */

'use strict'
const newrelic = require('newrelic')
const { Queue } = require('bullmq')
const IORedis = require('ioredis')

const connection = new IORedis({
  maxRetriesPerRequest: null
})

const queue = new Queue('jobQueue', { connection })

// Since BullMQ is not auto-instrumented by the newrelic Node agent, we have to manually start a transaction.
return newrelic.startBackgroundTransaction('Message queue - producer', function innerHandler() {
  console.log('Message queue started')

  // Call newrelic.getTransaction to retrieve a handle on the current transaction.
  const backgroundHandle = newrelic.getTransaction()

  // Insert the distributed trace headers into the headers map.
  const headers = { 'test-dt': 'test-newrelic' }
  backgroundHandle.insertDistributedTraceHeaders(headers)

  // Add a job every 600 milliseconds with data containing the message and the headers.
  setInterval(async () => {
    await queue.add('simpleJob', { message: 'This is a background job', headers })
    console.log('Job added to the queue')
  }, 600)

  // End the transaction.
  backgroundHandle.end()

  return new Promise((resolve) => {
    setTimeout(() => {
      newrelic.shutdown({ collectPendingData: true }, () => {
        console.log('new relic agent shutdown')
        resolve()
        // eslint-disable-next-line no-process-exit
        process.exit(0)
      })
    }, 10000)
  })
})