Can't send custom metrics from AWS Lambda function deployed via datadog-cdk-constructs
dmcd opened this issue · 1 comment
Expected Behavior
We have a Lambda function deployed using datadog-cdk-constructs with a valid API key, as described here: https://docs.datadoghq.com/serverless/installation/nodejs/?tab=awscdk
The Lambda runs and Datadog traces are collected.
When we send a custom metric from within the handler, following the instructions here: https://docs.datadoghq.com/serverless/custom_metrics/, we expect a new metric to appear in the Datadog dashboard.
import { SQSEvent } from 'aws-lambda';
import { sendDistributionMetric } from 'datadog-lambda-js';

export const handler = async (event: SQSEvent) => {
  sendDistributionMetric('my_metrics.test', 1);
  // ...
};
Actual Behavior
We receive the following error instead:
{
  "status": "error",
  "message": "datadog:handler not initialized"
}
We've tried enabling debug logging, but nothing in the output explains why currentMetricsListener would be undefined on this line:
datadog-lambda-js/src/index.ts, line 212 (commit c9aa645)
Steps to Reproduce the Problem
- Write a Lambda function that uses sendDistributionMetric.
- Deploy the Lambda function using the CDK constructs as described here: https://docs.datadoghq.com/serverless/installation/nodejs/?tab=awscdk (a sketch of this setup follows the list).
- Invoke the Lambda function and check the CloudWatch logs.
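For context, here is roughly how the stack is wired up. This is a minimal sketch only: the stack name, construct IDs, entry path, and API-key handling are illustrative, not our exact code.

import * as cdk from '@aws-cdk/core';
import * as awsLambdaNodejs from '@aws-cdk/aws-lambda-nodejs';
import { Datadog } from 'datadog-cdk-constructs';

export class MyServiceStack extends cdk.Stack {
  constructor(scope: cdk.Construct, id: string, props?: cdk.StackProps) {
    super(scope, id, props);

    // The handler is bundled with esbuild; by default datadog-lambda-js
    // gets pulled into the bundle along with the rest of node_modules.
    const fn = new awsLambdaNodejs.NodejsFunction(this, 'MyHandler', {
      entry: 'src/handler.ts',
      handler: 'handler',
    });

    // Attaches the Datadog Node.js layer and the Lambda extension.
    const datadog = new Datadog(this, 'Datadog', {
      nodeLayerVersion: 65,
      extensionLayerVersion: 13,
      apiKey: process.env.DD_API_KEY!, // illustrative: a valid API key is supplied here
    });
    datadog.addLambdaFunctions([fn]);
  }
}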
Specifications
- nodeLayerVersion: 65
- extensionLayerVersion: 13
- "datadog-cdk-constructs": "^0.4.0",
- "datadog-lambda-js": "^4.67.0"
Stacktrace
timestamp,message
1642399541611,"START RequestId: df4c9551-3be6-4c3a-9b8c-0a16abe9958e Version: $LATEST"
1642399541612,"2022-01-17 06:05:41 UTC | SAGENT | DEBUG | Starting daemon to receive messages from runtime..."
1642399541618,"2022-01-17 06:05:41 UTC | SAGENT | DEBUG | Unable to restore the state from file"
1642399541618,"2022-01-17 06:05:41 UTC | SAGENT | DEBUG | Using a SyncForwarder with a 5s timeout"
1642399541618,"2022-01-17 06:05:41 UTC | SAGENT | INFO | Retry queue storage on disk is disabled"
1642399541618,"2022-01-17 06:05:41 UTC | SAGENT | DEBUG | 'telemetry.dogstatsd.aggregator_channel_latency_buckets' is empty, falling back to default values"
1642399541618,"2022-01-17 06:05:41 UTC | SAGENT | DEBUG | 'telemetry.dogstatsd.listeners_latency_buckets' is empty, falling back to default values"
1642399541620,"2022-01-17 06:05:41 UTC | SAGENT | DEBUG | 'telemetry.dogstatsd.listeners_channel_latency_buckets' is empty, falling back to default values"
1642399541620,"2022-01-17 06:05:41 UTC | SAGENT | DEBUG | dogstatsd-udp: 127.0.0.1:8125 successfully initialized"
1642399541620,"2022-01-17 06:05:41 UTC | SAGENT | DEBUG | Enabling logs collection HTTP route"
1642399541620,"2022-01-17 06:05:41 UTC | SAGENT | DEBUG | Subscribing to Logs for types: [platform function extension]"
1642399541623,"2022-01-17 06:05:41 UTC | SAGENT | INFO | dogstatsd-udp: starting to listen on 127.0.0.1:8125"
1642399541623,"2022-01-17 06:05:41 UTC | SAGENT | WARN | Error loading config: open /var/task/datadog.yaml: no such file or directory"
1642399541624,"2022-01-17 06:05:41 UTC | SAGENT | INFO | Features detected from environment: "
1642399541624,"2022-01-17 06:05:41 UTC | SAGENT | DEBUG | Could not get hostname via gRPC: grpc client disabled via cmd_port: -1. Falling back to other methods."
1642399541624,"LOGS Name: datadog-agent State: Subscribed Types: [platform,function,extension]"
1642399541624,"2022-01-17 06:05:41 UTC | SAGENT | DEBUG | Acquired hostname from OS: ""169.254.198.45"". Core agent was unreachable at ""/opt/datadog-agent/bin/agent/agent"": fork/exec /opt/datadog-agent/bin/agent/agent: no such file or directory."
1642399541624,"2022-01-17 06:05:41 UTC | SAGENT | DEBUG | Trace writer initialized (climit=200 qsize=78)"
1642399541625,"2022-01-17 06:05:41 UTC | SAGENT | DEBUG | Stats writer initialized (climit=20 qsize=83)"
1642399541625,"2022-01-17 06:05:41 UTC | SAGENT | INFO | Starting a serverless logs-agent..."
1642399541625,"2022-01-17 06:05:41 UTC | SAGENT | INFO | logs-agent started"
1642399541626,"2022-01-17 06:05:41 UTC | SAGENT | DEBUG | Adding AWS Logs collection source"
1642399541626,"2022-01-17 06:05:41 UTC | SAGENT | DEBUG | serverless agent ready in 16.82694ms"
1642399541626,"2022-01-17 06:05:41 UTC | SAGENT | INFO | Listening for traces at http://localhost:8126"
1642399542535,"2022-01-17 06:05:41 UTC | SAGENT | DEBUG | Starting concentrator"
1642399542535,"EXTENSION Name: datadog-agent State: Ready Events: [INVOKE,SHUTDOWN]"
1642399542536,"2022-01-17 06:05:42 UTC | SAGENT | DEBUG | Received invocation event..."
1642399542539,"2022-01-17T06:05:42.538Z df4c9551-3be6-4c3a-9b8c-0a16abe9958e DEBUG {""status"":""debug"",""message"":""datadog:Patched console output with trace context""}"
1642399542539,"2022-01-17T06:05:42.539Z df4c9551-3be6-4c3a-9b8c-0a16abe9958e DEBUG {""status"":""debug"",""message"":""datadog:Not patching HTTP libraries"",""autoPatchHTTP"":true,""tracerInitialized"":true}"
1642399542540,"2022-01-17T06:05:42.540Z df4c9551-3be6-4c3a-9b8c-0a16abe9958e DEBUG {""status"":""debug"",""message"":""datadog:Reading trace context from env var Root=1-61e50735-5a8e05d935de03567715dbbd;Parent=0f25c866689d2899;Sampled=0""}"
1642399542542,"2022-01-17T06:05:42.542Z df4c9551-3be6-4c3a-9b8c-0a16abe9958e DEBUG {""status"":""debug"",""message"":""datadog:extracted trace context from xray context"",""trace"":{""parentID"":""1091498826874562713"",""sampleMode"":-1,""source"":""xray"",""traceID"":""3881543598739938237""},""header"":""Root=1-61e50735-5a8e05d935de03567715dbbd;Parent=0f25c866689d2899;Sampled=0""}"
1642399542558,"2022-01-17 06:05:42 UTC | SAGENT | DEBUG | Hit on the serverless.Hello route."
1642399542558,"2022-01-17 06:05:42 UTC | SAGENT | DEBUG | Client library registered with extension"
1642399542558,"2022-01-17 06:05:42 UTC | SAGENT | DEBUG | The flush strategy end has decided to not flush at moment: starting"
1642399542560,"2022-01-17T06:05:42.560Z df4c9551-3be6-4c3a-9b8c-0a16abe9958e DEBUG [dd.trace_id=3881543598739938237 dd.span_id=1091498826874562713] {""status"":""debug"",""message"":""datadog:Extension present: true""}"
1642399542560,"2022-01-17T06:05:42.560Z df4c9551-3be6-4c3a-9b8c-0a16abe9958e DEBUG [dd.trace_id=3881543598739938237 dd.span_id=1091498826874562713] {""status"":""debug"",""message"":""datadog:Using StatsD client""}"
1642399542578,"2022-01-17T06:05:42.577Z df4c9551-3be6-4c3a-9b8c-0a16abe9958e DEBUG [dd.trace_id=3881543598739938237 dd.span_id=1091498826874562713] {""status"":""debug"",""message"":""datadog:Didn't attempt to find parent for aws.lambda span"",""mergeDatadogXrayTraces"":false,""traceSource"":""xray""}"
1642399542578,"2022-01-17T06:05:42.578Z df4c9551-3be6-4c3a-9b8c-0a16abe9958e DEBUG [dd.trace_id=3881543598739938237 dd.span_id=1091498826874562713] {""status"":""debug"",""message"":""datadog:Creating the aws.lambda span""}"
1642399542579,"2022-01-17T06:05:42.579Z df4c9551-3be6-4c3a-9b8c-0a16abe9958e ERROR [dd.trace_id=6582879259361563072 dd.span_id=6582879259361563072] {""status"":""error"",""message"":""datadog:handler not initialized""}"
1642399542787,"2022-01-17T06:05:42.787Z df4c9551-3be6-4c3a-9b8c-0a16abe9958e DEBUG [dd.trace_id=3881543598739938237 dd.span_id=1091498826874562713] {""status"":""debug"",""message"":""datadog:Flushing statsD""}"
1642399542789,"2022-01-17T06:05:42.789Z df4c9551-3be6-4c3a-9b8c-0a16abe9958e DEBUG [dd.trace_id=3881543598739938237 dd.span_id=1091498826874562713] {""status"":""debug"",""message"":""datadog:Flushing Extension""}"
1642399542789,"2022-01-17T06:05:42.789Z df4c9551-3be6-4c3a-9b8c-0a16abe9958e DEBUG [dd.trace_id=3881543598739938237 dd.span_id=1091498826874562713] {""status"":""debug"",""message"":""datadog:sending payload with body {}""}"
1642399542791,"2022-01-17 06:05:42 UTC | SAGENT | DEBUG | Hit on the serverless.Flush route."
1642399542791,"2022-01-17 06:05:42 UTC | SAGENT | DEBUG | The flush strategy end has decided to flush at moment: stopping"
1642399542791,"2022-01-17 06:05:42 UTC | SAGENT | DEBUG | Beginning metrics flush at time 1642399542"
1642399542791,"2022-01-17 06:05:42 UTC | SAGENT | DEBUG | Received a Flush trigger"
1642399542791,"2022-01-17 06:05:42 UTC | SAGENT | DEBUG | Beginning traces flush at time 1642399542"
1642399542791,"2022-01-17 06:05:42 UTC | SAGENT | DEBUG | Serializing 3 traces and 0 APM events."
1642399542792,"2022-01-17T06:05:42.792Z df4c9551-3be6-4c3a-9b8c-0a16abe9958e DEBUG [dd.trace_id=3881543598739938237 dd.span_id=1091498826874562713] {""status"":""debug"",""message"":""datadog:Reading trace context from env var Root=1-61e50735-5a8e05d935de03567715dbbd;Parent=0f25c866689d2899;Sampled=0""}"
1642399542797,"2022-01-17T06:05:42.797Z df4c9551-3be6-4c3a-9b8c-0a16abe9958e DEBUG [dd.trace_id=3881543598739938237 dd.span_id=1091498826874562713] {""status"":""debug"",""message"":""datadog:Xray daemon received metadata payload"",""error"":null,""bytes"":407}"
1642399542797,"2022-01-17 06:05:42 UTC | SAGENT | DEBUG | Beginning logs flush at time 1642399542"
1642399542797,"2022-01-17 06:05:42 UTC | SAGENT | INFO | Triggering a flush in the logs-agent"
1642399542799,"2022-01-17 06:05:42 UTC | SAGENT | DEBUG | Flushing 1 sketches to the forwarder"
1642399542816,"2022-01-17 06:05:42 UTC | SAGENT | DEBUG | The payload was not too big, returning the full payload"
1642399543677,"2022-01-17 06:05:43 UTC | SAGENT | INFO | Successfully posted payload to ""https://6-0-0-app.agent.datadoghq.com/api/beta/sketches?api_key=***************************2f539"", the agent will only log transaction success every 500 transactions"
1642399543677,"2022-01-17 06:05:43 UTC | SAGENT | DEBUG | SyncForwarder has flushed 1 transactions"
1642399543677,"2022-01-17 06:05:43 UTC | SAGENT | DEBUG | Flushing 2 series to the forwarder"
1642399543677,"2022-01-17 06:05:43 UTC | SAGENT | DEBUG | The payload was not too big, returning the full payload"
1642399543708,"2022-01-17 06:05:43 UTC | SAGENT | DEBUG | Flush in the logs-agent done."
1642399543708,"2022-01-17 06:05:43 UTC | SAGENT | DEBUG | Finished logs flush that was started at time 1642399542"
1642399543761,"2022-01-17 06:05:43 UTC | SAGENT | DEBUG | Flushed traces to the API; time: 946.004332ms, bytes: 1851"
1642399543765,"2022-01-17 06:05:43 UTC | SAGENT | DEBUG | Finished traces flush that was started at time 1642399542"
1642399543880,"2022-01-17 06:05:43 UTC | SAGENT | DEBUG | SyncForwarder has flushed 1 transactions"
1642399543880,"2022-01-17 06:05:43 UTC | SAGENT | DEBUG | Flushing 1 service checks to the forwarder"
1642399543880,"2022-01-17 06:05:43 UTC | SAGENT | DEBUG | The payload was not too big, returning the full payload"
1642399544084,"2022-01-17 06:05:44 UTC | SAGENT | DEBUG | SyncForwarder has flushed 1 transactions"
1642399544084,"2022-01-17 06:05:44 UTC | SAGENT | DEBUG | Finished metrics flush that was started at time 1642399542"
1642399544084,"2022-01-17 06:05:44 UTC | SAGENT | DEBUG | Finished flushing"
1642399544086,"END RequestId: df4c9551-3be6-4c3a-9b8c-0a16abe9958e"
1642399544087,"REPORT RequestId: df4c9551-3be6-4c3a-9b8c-0a16abe9958e Duration: 1548.77 ms Billed Duration: 1549 ms Memory Size: 1024 MB Max Memory Used: 144 MB Init Duration: 1183.51 ms "
Ok, I figured this one out. I needed to exclude 'datadog-lambda-js' from being bundled with my code, since it is already provided by the Datadog layer that the CDK construct adds. I did this by passing the following additional options to awsLambdaNodejs.NodejsFunction:
bundling: {
  externalModules: [
    'datadog-lambda-js', // already included by datadog-cdk-constructs
  ],
}
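In context, the NodejsFunction from the sketch above ends up looking something like this (same caveats: the names and entry path are illustrative):

const fn = new awsLambdaNodejs.NodejsFunction(this, 'MyHandler', {
  entry: 'src/handler.ts',
  handler: 'handler',
  bundling: {
    // datadog-lambda-js is provided at runtime by the Datadog layer that
    // datadog-cdk-constructs attaches, so esbuild should not bundle a second copy.
    externalModules: ['datadog-lambda-js'],
  },
});

Presumably the bundled copy of datadog-lambda-js was a separate module instance from the one the layer's handler wrapper initializes, which would explain why currentMetricsListener was undefined; marking the module as external makes the handler resolve the layer's copy instead.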