
feat: Added new API method withLlmCustomAttributes to run a function in an LLM context #2437

Merged Aug 22, 2024 · 39 commits
Changes from 16 commits
2361240
feat: Set LLM events custom attributes
MikeVaz Jul 16, 2024
4dd3be8
fix: Debug statements
MikeVaz Jul 16, 2024
55da778
fix: Example
MikeVaz Jul 16, 2024
720be07
fix: Add guards
MikeVaz Jul 16, 2024
878d6b5
fix: Add unit test
MikeVaz Jul 17, 2024
de8b498
feat: WithLlmCustomAttributes
MikeVaz Jul 22, 2024
dc0261a
feat: Merge parent context into children
RyanKadri Jul 23, 2024
07cbcab
Merge branch 'newrelic:main' into setLlmCustomAttributes
MikeVaz Jul 29, 2024
08927ec
feat: Option 4
MikeVaz Jul 31, 2024
8b0cd84
fix: Unnecessary test
MikeVaz Jul 31, 2024
7ab7bcd
fix: Test name
MikeVaz Jul 31, 2024
0b2f2d6
fix: Integration tests
MikeVaz Aug 1, 2024
f75dabf
fix: Remove extra npm scripts
MikeVaz Aug 2, 2024
19a9e80
fix: Remove extra npm scripts
MikeVaz Aug 2, 2024
24a13bd
fix: PR feedback
MikeVaz Aug 2, 2024
af4aafa
fix: PR feedback
MikeVaz Aug 2, 2024
d428aaa
fix: PR feedback
MikeVaz Aug 5, 2024
d2264ad
fix: Unit test and pr feedback
MikeVaz Aug 5, 2024
807f0cf
Merge branch 'newrelic:main' into setLlmCustomAttributes
MikeVaz Aug 13, 2024
ef488da
fix: PR feedback
MikeVaz Aug 13, 2024
bcc9e15
fix: Typo
MikeVaz Aug 13, 2024
cef4289
fix: PR feedback
MikeVaz Aug 13, 2024
87588e3
fix: PR feedback
MikeVaz Aug 13, 2024
d0101f6
fix: Unit test
MikeVaz Aug 13, 2024
dbd7282
Merge branch 'newrelic:main' into setLlmCustomAttributes
MikeVaz Aug 14, 2024
e7531d7
fix: Apply solution 1
MikeVaz Aug 14, 2024
308bdeb
Update lib/util/llm-utils.js
MikeVaz Aug 15, 2024
a32f932
fix: PR feedback
MikeVaz Aug 15, 2024
ba516b0
Merge branch 'newrelic:main' into setLlmCustomAttributes
MikeVaz Aug 16, 2024
6e629f9
fix: PR feedback
MikeVaz Aug 20, 2024
a8cbb8a
Merge branch 'newrelic:main' into setLlmCustomAttributes
MikeVaz Aug 20, 2024
6d3bd2d
fix: PR feedback
MikeVaz Aug 20, 2024
1f2d240
Update lib/instrumentation/openai.js
MikeVaz Aug 21, 2024
8cc326f
Update lib/instrumentation/langchain/common.js
MikeVaz Aug 21, 2024
cab2f74
Update lib/instrumentation/aws-sdk/v3/bedrock.js
MikeVaz Aug 21, 2024
67a35de
fix: Improve test coverage
MikeVaz Aug 21, 2024
d8353ff
Merge branch 'setLlmCustomAttributes' of https://github.com/MikeVaz/n…
MikeVaz Aug 21, 2024
cb423d6
fix: More unit test and PR feedback
MikeVaz Aug 21, 2024
21e4b5f
chore: Addressed code review feedback
bizob2828 Aug 22, 2024
43 changes: 43 additions & 0 deletions api.js
@@ -1548,7 +1548,7 @@
* @param {string} params.traceId Identifier for the feedback event.
* Obtained from {@link getTraceMetadata}.
* @param {string} params.category A tag for the event.
* @param {string} params.rating An indicator of how useful the message was.

[GitHub Actions lint warning (lts/*), api.js line 1551: The type 'getTraceMetadata' is undefined]
* @param {string} [params.message] The message that triggered the event.
* @param {object} [params.metadata] Additional key-value pairs to associate
* with the recorded event.
@@ -1902,4 +1902,47 @@
transaction.ignoreApdex = true
}

/**

[GitHub Actions lint warning (lts/*), api.js line 1905: Missing JSDoc @returns declaration]
* Runs a function synchronously within a provided LLM custom attributes context and returns its return value.
[Review comment — Member] I don't think it has to be synchronous. The context manager should properly bind all functions run within the context.

[Reply — Contributor (author)] I went with how the Node.js docs describe it: "Runs a function synchronously within a context and returns its return value. The store is not accessible outside of the callback function. The store is accessible to any asynchronous operations created within the callback."

[Reply — Member, quoting the docs] "The store is accessible to any asynchronous operations created within the callback."

*
* An example of setting a custom attribute:
*
* newrelic.withLlmCustomAttributes({'llm.someAttribute': 'someValue'}, () => {
* return;
* })
* @param {Object} context LLM custom attributes context
* @param {Function} callback synchronous function called within the context
*/
API.prototype.withLlmCustomAttributes = function withLlmCustomAttributes(context, callback) {
const metric = this.agent.metrics.getOrCreateMetric(
NAMES.SUPPORTABILITY.API + '/withLlmCustomAttributes'
)
metric.incrementCallCount()

const transaction = this.agent.tracer.getTransaction()
if (!transaction) {
logger.warn('withLlmCustomAttributes must be called within the scope of a transaction.')
return callback?.()
}

for (const key in context) {
if (Object.hasOwn(context, key)) {
const value = context[key]
if (typeof value === 'object' || typeof value === 'function') {
logger.warn(`Invalid attribute type for ${key}. Skipped.`)
delete context[key]
} else if (key.indexOf('llm.') !== 0) {
logger.warn(`Invalid attribute name ${key}. Renamed to "llm.${key}".`)
delete context[key]
context[`llm.${key}`] = value
}
}
}

const parentContext = this.agent._contextManager.getContext()

const fullContext = parentContext ? Object.assign(parentContext, context || {}) : context
return this.agent._contextManager.runInContext(fullContext, callback, this)
}

module.exports = API
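To make the attribute handling in `withLlmCustomAttributes` easier to follow, here is the validation loop extracted into a standalone sketch (a hypothetical helper, not part of the PR): object and function values are dropped, and keys missing the `llm.` prefix are renamed, mirroring the `for...in` loop above.

```javascript
// Hypothetical extraction of the validation performed by
// withLlmCustomAttributes. Non-primitive values are dropped (the real
// implementation logs a warning), and keys are normalized to the
// "llm." prefix so downstream instrumentation can recognize them.
function normalizeLlmAttributes(context) {
  const normalized = {};
  for (const [key, value] of Object.entries(context)) {
    if (typeof value === 'object' || typeof value === 'function') {
      continue; // skipped: only primitive attribute values are allowed
    }
    normalized[key.startsWith('llm.') ? key : `llm.${key}`] = value;
  }
  return normalized;
}

const out = normalizeLlmAttributes({
  'toRename': 'value1',
  'llm.number': 1,
  'toDelete': () => {}
});
// out: { 'llm.toRename': 'value1', 'llm.number': 1 }
```

Note that the PR itself normalizes the caller's `context` object in place (it `delete`s and re-adds keys), which is visible to the caller after the call returns; the sketch above builds a fresh object instead.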
13 changes: 12 additions & 1 deletion lib/instrumentation/aws-sdk/v3/bedrock.js
@@ -55,7 +55,18 @@ function isStreamingEnabled({ commandName, config }) {
*/
function recordEvent({ agent, type, msg }) {
msg.serialize()
agent.customEventAggregator.add([{ type, timestamp: Date.now() }, msg])
const context = agent._contextManager.getContext()
const llmContext = Object.keys(context || {}).reduce((result, key) => {
if (key.indexOf('llm.') === 0) {
result[key] = context[key]
}
return result
}, {})

agent.customEventAggregator.add([
{ type, timestamp: Date.now() },
Object.assign({}, msg, llmContext || {})
])
}
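The `reduce()` added to `recordEvent` keeps only context entries whose keys start with `llm.`, so unrelated context state (for example internal agent bookkeeping) is not copied onto LLM events. The same pattern is repeated in the langchain and openai instrumentations below; a standalone sketch of the filter (a hypothetical helper, not part of the PR):

```javascript
// Hypothetical standalone version of the filtering added to each
// recordEvent: tolerate a null/undefined context and keep only keys
// with the "llm." prefix.
function filterLlmContext(context) {
  return Object.keys(context || {}).reduce((result, key) => {
    if (key.indexOf('llm.') === 0) {
      result[key] = context[key];
    }
    return result;
  }, {});
}

const filtered = filterLlmContext({ 'llm.a': 1, initial: true });
// filtered: { 'llm.a': 1 }
```

Since `filterLlmContext` always returns an object, the `llmContext || {}` guard in the `Object.assign` call that follows is redundant but harmless.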

/**
13 changes: 12 additions & 1 deletion lib/instrumentation/langchain/common.js
@@ -49,7 +49,18 @@ common.mergeMetadata = function mergeMetadata(localMeta = {}, paramsMeta = {}) {
*/
common.recordEvent = function recordEvent({ agent, type, msg, pkgVersion }) {
agent.metrics.getOrCreateMetric(`${LANGCHAIN.TRACKING_PREFIX}/${pkgVersion}`).incrementCallCount()
agent.customEventAggregator.add([{ type, timestamp: Date.now() }, msg])
const context = agent._contextManager.getContext()
const llmContext = Object.keys(context || {}).reduce((result, key) => {
if (key.indexOf('llm.') === 0) {
result[key] = context[key]
}
return result
}, {})

agent.customEventAggregator.add([
{ type, timestamp: Date.now() },
Object.assign({}, msg, llmContext || {})
])
}

/**
13 changes: 12 additions & 1 deletion lib/instrumentation/openai.js
@@ -75,7 +75,18 @@ function decorateSegment({ shim, result, apiKey }) {
* @param {object} params.msg LLM event
*/
function recordEvent({ agent, type, msg }) {
agent.customEventAggregator.add([{ type, timestamp: Date.now() }, msg])
const context = agent._contextManager.getContext()
const llmContext = Object.keys(context || {}).reduce((result, key) => {
if (key.indexOf('llm.') === 0) {
result[key] = context[key]
}
return result
}, {})

agent.customEventAggregator.add([
{ type, timestamp: Date.now() },
Object.assign({}, msg, llmContext || {})
])
}

/**
42 changes: 42 additions & 0 deletions test/unit/api/api-llm.test.js
@@ -121,6 +121,48 @@ tap.test('Agent API LLM methods', (t) => {
})
})

t.test('withLlmCustomAttributes', (t) => {
const { api } = t.context
helper.runInTransaction(api.agent, (tx) => {
const contextManager = api.agent._contextManager
t.context.agent.tracer.getTransaction = () => {
return tx
}
contextManager.setContext({
initial: true
})

api.withLlmCustomAttributes(
{
'toRename': 'value1',
'llm.number': 1,
'llm.boolean': true,
'toDelete': () => {},
'toDelete2': {},
'toDelete3': []
},
() => {
const parentContext = contextManager.getContext()
t.ok(parentContext.initial)
t.equal(parentContext['llm.toRename'], 'value1')
t.notOk(parentContext.toDelete)
t.notOk(parentContext.toDelete2)
t.notOk(parentContext.toDelete3)
t.equal(parentContext['llm.number'], 1)
t.equal(parentContext['llm.boolean'], true)

api.withLlmCustomAttributes({ 'llm.someAttribute': 'someValue' }, () => {
const context = contextManager.getContext()
t.ok(context.initial)
t.equal(context[`llm.toRename`], 'value1')
t.equal(context['llm.someAttribute'], 'someValue')
t.end()
})
}
)
})
})

t.test('setLlmTokenCount should register callback to calculate token counts', async (t) => {
const { api, agent } = t.context
function callback(model, content) {
2 changes: 1 addition & 1 deletion test/unit/api/stub.test.js
@@ -8,7 +8,7 @@
const tap = require('tap')
const API = require('../../../stub_api')

const EXPECTED_API_COUNT = 36
const EXPECTED_API_COUNT = 37

tap.test('Agent API - Stubbed Agent API', (t) => {
t.autoend()
22 changes: 22 additions & 0 deletions test/unit/instrumentation/openai.test.js
@@ -119,5 +119,27 @@ test('openai unit tests', (t) => {
t.equal(isWrapped, false, 'should not wrap chat completions create')
t.end()
})

t.test('should record LLM custom events with attributes', (t) => {
const { shim, agent, initialize } = t.context
shim.pkgVersion = '4.12.2'
const MockOpenAi = getMockModule()
agent.config.ai_monitoring.record_content = { enabled: true }
initialize(agent, MockOpenAi, 'openai', shim)
const completions = new MockOpenAi.Chat.Completions()
agent._contextManager.setContext({ initial: true })
const api = helper.getAgentApi()
helper.runInTransaction(agent, () => {
api.withLlmCustomAttributes({ 'llm.attribute': `someValue` }, async () => {
await completions.create({ stream: false, messages: [{ role: 'user', content: 'Hello' }] })
const events = agent.customEventAggregator.events.toArray()
const [[, message]] = events
t.notOk(message.initial)
t.equal(message['llm.attribute'], 'someValue')
t.end()
})
})
})

t.end()
})
41 changes: 23 additions & 18 deletions test/versioned/aws-sdk-v3/bedrock-chat-completions.tap.js
@@ -135,25 +135,30 @@
const api = helper.getAgentApi()
helper.runInTransaction(agent, async (tx) => {
api.addCustomAttribute('llm.conversation_id', 'convo-id')
await client.send(command)
const events = agent.customEventAggregator.events.toArray()
t.equal(events.length, 3)
const chatSummary = events.filter(([{ type }]) => type === 'LlmChatCompletionSummary')[0]
const chatMsgs = events.filter(([{ type }]) => type === 'LlmChatCompletionMessage')

t.llmMessages({
modelId,
prompt,
resContent: '42',
tx,
expectedId: modelId.includes('ai21') || modelId.includes('cohere') ? '1234' : null,
chatMsgs
api.withLlmCustomAttributes({ 'llm.contextAttribute': 'someValue' }, async () => {
await client.send(command)
const events = agent.customEventAggregator.events.toArray()
t.equal(events.length, 3)
const chatSummary = events.filter(([{ type }]) => type === 'LlmChatCompletionSummary')[0]
const chatMsgs = events.filter(([{ type }]) => type === 'LlmChatCompletionMessage')

t.llmMessages({
modelId,
prompt,
resContent: '42',
tx,
expectedId: modelId.includes('ai21') || modelId.includes('cohere') ? '1234' : null,
chatMsgs
})

t.llmSummary({ tx, modelId, chatSummary })

const [, message] = chatSummary
t.ok(message['llm.contextAttribute'])

tx.end()
t.end()
})

t.llmSummary({ tx, modelId, chatSummary })

tx.end()
t.end()
})
}
)
69 changes: 37 additions & 32 deletions test/versioned/langchain/runnables.tap.js
@@ -52,7 +52,6 @@ tap.test('Langchain instrumentation - runnable sequence', (t) => {

t.test('should create langchain events for every invoke call', (t) => {
const { agent, prompt, outputParser, model } = t.context

helper.runInTransaction(agent, async (tx) => {
const input = { topic: 'scientist' }
const options = { metadata: { key: 'value', hello: 'world' }, tags: ['tag1', 'tag2'] }
@@ -99,39 +98,45 @@
'should create langchain events for every invoke call on chat prompt + model + parser',
(t) => {
const { agent, prompt, outputParser, model } = t.context

const api = helper.getAgentApi()
agent._contextManager.setContext({ initial: true })
helper.runInTransaction(agent, async (tx) => {
const input = { topic: 'scientist' }
const options = { metadata: { key: 'value', hello: 'world' }, tags: ['tag1', 'tag2'] }

const chain = prompt.pipe(model).pipe(outputParser)
await chain.invoke(input, options)

const events = agent.customEventAggregator.events.toArray()

const langchainEvents = filterLangchainEvents(events)
const langChainMessageEvents = filterLangchainEventsByType(
langchainEvents,
'LlmChatCompletionMessage'
)
const langChainSummaryEvents = filterLangchainEventsByType(
langchainEvents,
'LlmChatCompletionSummary'
)

t.langchainSummary({
tx,
chatSummary: langChainSummaryEvents[0]
})

t.langchainMessages({
tx,
chatMsgs: langChainMessageEvents,
chatSummary: langChainSummaryEvents[0][1]
api.withLlmCustomAttributes({ 'llm.contextAttribute': 'someValue' }, async () => {
const input = { topic: 'scientist' }
const options = { metadata: { key: 'value', hello: 'world' }, tags: ['tag1', 'tag2'] }

const chain = prompt.pipe(model).pipe(outputParser)
await chain.invoke(input, options)

const events = agent.customEventAggregator.events.toArray()

const langchainEvents = filterLangchainEvents(events)
const langChainMessageEvents = filterLangchainEventsByType(
langchainEvents,
'LlmChatCompletionMessage'
)
const langChainSummaryEvents = filterLangchainEventsByType(
langchainEvents,
'LlmChatCompletionSummary'
)

t.langchainSummary({
tx,
chatSummary: langChainSummaryEvents[0]
})
const [[, message]] = events

t.ok(message['llm.contextAttribute'])

t.langchainMessages({
tx,
chatMsgs: langChainMessageEvents,
chatSummary: langChainSummaryEvents[0][1]
})

tx.end()
t.end()
})

tx.end()
t.end()
})
}
)