axiomhq/next-axiom

Axiom is not formatting my logs properly

Opened this issue · 14 comments

I have a Next.js project on Vercel using Axiom for logging.

On the logs tab in Vercel, my log is being properly formatted like so:

Starting endpoint {
  path: '/api/interviews/sendUserMessage',
  assistantId: 'asst_I5VvxwtWttxq7hg9GwxXPEPj',
  interviewId: 'clrcweurd00019mz85x8vrzaf',
  threadId: 'thread_nvLQOiEN1HHyEZijlHt5iYh2',
  userId: 'user_2ahEnXntrH6x792VvzOXnsPYQyY'
}

but if you look at the attached image you can see that each field in the JSON is being split up into its own individual log message.

[Screenshot 2024-01-13 at 21:04:21]

I am calling the log.info like so:

export const logStartEndpoint = (
  log: Logger,
  args: { [key: string]: any } = {}
) => {
  const logger = log.with(args);
  logger.info(`Starting endpoint`);
  console.log(`Starting endpoint`, args);
};

Am I doing something wrong? I would ideally want each field to be part of the properties that are logged with the top-level message "Starting endpoint".
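
For what it's worth, next-axiom also accepts the fields object as the second argument to the logging call, which should attach them as properties on a single entry with the top-level message. A minimal sketch mirroring the helper above (whether this changes the behaviour in Axiom isn't confirmed in this thread):

import { Logger } from "next-axiom";

// Same idea as logStartEndpoint, but passing the fields directly to info()
// instead of creating a child logger with log.with().
export const logStartEndpointDirect = (
  log: Logger,
  args: { [key: string]: any } = {}
) => {
  log.info("Starting endpoint", args);
};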

What does the log record for logger.info() look like? The one you sent seems to be from the console.log().

Locally, logger.info seems to fire correctly, but it doesn't consistently send to Axiom. I checked my Axiom logs again and, despite using the same logger.info across all of my API endpoints, it is only being sent from one endpoint and not from anywhere else.
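
For what it's worth, one thing that can make logs show up from only some endpoints is the logger buffer not being flushed before the serverless function returns. A minimal sketch with a standalone Logger (the route and fields here are only placeholders, and this is just a guess at the cause):

// pages/api/example.ts
import type { NextApiRequest, NextApiResponse } from "next";
import { Logger } from "next-axiom";

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  const log = new Logger();
  log.info("Starting endpoint", { path: req.url });
  res.status(200).json({ status: "ok" });
  // flush() is async; awaiting it makes sure buffered logs are sent
  // before the function is frozen or torn down.
  await log.flush();
}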

+1

I'm also seeing this same behaviour of a log being split into individual log entries. I'm using Axiom as a Vercel integration with NextJS and logging within a server action (preview deployment).

Here's basically how I'm using it:

"use server";

import { Logger } from "next-axiom";

export async function action(data: z.infer<typeof Schema>) {
    const log = new Logger();
    ...
    log.error("an error occurred", data);
}

Versions:

"next": "^14.1.0",
"next-axiom": "^1.1.1",

Have also followed the steps to wrap the Next.js config with withAxiom.
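
For reference, that wrapping looks roughly like this (a minimal sketch; the rest of the config is a placeholder):

// next.config.mjs
import { withAxiom } from "next-axiom";

/** @type {import('next').NextConfig} */
const nextConfig = {
  // ...existing Next.js config
};

export default withAxiom(nextConfig);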

@schehata Are there any updates here? I haven't changed anything on my project and it was working previously but now the logs are largely useless as I can't pull any information from them at all.

Hi @YourAverageTechBro, unfortunately I am not able to reproduce; I checked our apps that use the same version. If you can provide a repo that reproduces this, I could debug and find out what's going on.

I'm also experiencing this issue. This is making investigations in logs very painful. Today, I had to download the logs and process them manually to concatenate messages by request ID. Not very practical when you need to fix a production issue.

I've run some tests using only the Vercel integration, with no additional changes, on my test Next app. I noticed that the log drain was working as expected (one log entry per request) as long as the function logs were short enough. It started creating a log entry per line when the Vercel 4KB limit was reached.

Vercel logs for a function call:
[Screenshot 2024-03-22 at 11:43:48]

Axiom creates a log entry per line:
[Screenshot 2024-03-22 at 11:43:01]

Here is the code I used for this test:

// src/pages/api/logPost.ts
import type { NextApiRequest, NextApiResponse } from "next";

export default function handler(req: NextApiRequest, res: NextApiResponse) {
  console.log("Request body:", req.body)
  res.status(200).json({ status: "ok" })
}

Then I just call this endpoint with a long body:

curl --request GET \
  --url https://test-axiom.vercel.app/api/logPost \
  --header 'Content-Type: application/json' \
  --header 'User-Agent: insomnia/8.6.1' \
  --data '{
	"a": [
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit"
	],
	"b": [
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit"
	],
	"c": [
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit"
	],
	"d": [
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit"
	],
	"e": [
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit"
	],
	"f": [
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit",
		"this is some test data with long content in order to cross the 4KB Vercel logs limit"
	]
}'

Hi @testerez, that's insightful. I will use your test to reproduce and debug what's going on.

Was there any progress made here? I'm experiencing this as well.

Also, since Server Actions are just mini api routes, can withAxiomRouteHandler be used?
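
For route handlers (as opposed to Server Actions), the pattern documented by next-axiom is wrapping the handler with withAxiom, which gives you a request-scoped req.log. A minimal sketch; whether the same wrapper helps inside a Server Action isn't confirmed here:

// app/api/example/route.ts
import { withAxiom, AxiomRequest } from "next-axiom";
import { NextResponse } from "next/server";

export const POST = withAxiom(async (req: AxiomRequest) => {
  // req.log is flushed for you when the wrapped handler finishes.
  req.log.info("handling request");
  return NextResponse.json({ ok: true });
});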

Regular Axiom logging doesn't seem to work from Server Actions (for me at least) when NEXT_PUBLIC_AXIOM_INGEST_ENDPOINT is set for that Vercel environment.

When I instantiate the logger, the config object shows the axiomUrl to be https://axiom-vitals.com/api/v1/send?configurationId=icfg_<xxxxyyyyy>&projectId=<uuid>&type=web-vitals -- this is coming from the Vercel integration with Axiom via the marketplace.

Is that correct? Once I removed that env var from the Preview environment, the config URL changed to the default Axiom ingest and the logs started showing up properly.

Here's what I have in one of my server actions and nothing ever appears via the Vercel log drain. If I use console.info or console.warn then the Vercel standard logs pick those up and they appear in Axiom.

I would guess that the web-vitals endpoint is not expecting to see log type messages and they're getting dropped. Is that the case?

"use server";

import { inngest } from "./client";
import { Logger } from "next-axiom";

// Sending events to Inngest should always be from a server action. The EVENT_KEY is typically not NEXT_PUBLIC.

export async function inngestSendEvent(event: any) {
    const log = new Logger();

    log.warn('[inngestSendEvent] starting');
    log.warn('[inngestSendEvent] called with:', event);

    let resp;

    try {
        resp = await inngest.send(event);
    } catch (e) {
        resp = e;
    }

    log.warn('[inngestSendEvent] finishing.');
    // flush() is async; awaiting it ensures buffered logs are sent before the action returns
    await log.flush();

    return JSON.stringify(resp);
}
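
Regarding the web-vitals ingest URL mentioned above, one way to sanity-check what the environment is handing the logger is to print the variable the Vercel integration sets (just a diagnostic sketch, using the env var named earlier in this thread):

// Somewhere server-side, e.g. at the top of the server action:
console.log(
  "NEXT_PUBLIC_AXIOM_INGEST_ENDPOINT =",
  process.env.NEXT_PUBLIC_AXIOM_INGEST_ENDPOINT
);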

Hey @schehata, did you get a chance to test this? This issue may be a deal breaker at some point for us...

I've been experiencing this too. I was using Axiom a lot over the past few days to look at the logs for an app, then I deployed a fairly minor code change today (completely unrelated to the logging setup) and now the Axiom logs are split into single lines. I'd done multiple deploys throughout the past couple of days without any problems with the logs. It just broke for no obvious reason with the last one. 🤷

Our logs are now working properly again without making any changes at all (no code deploys or configuration changes — I literally went to bed and looked at it again in the morning and it was fixed)! I can see the logs started getting formatted correctly again at a seemingly arbitrary point in time. The last incorrectly formatted log happens to be a TypeError: fetch failed error… but I doubt that's significant.

This is still happening intermittently for us. Definitely seems more common immediately after a deploy.

Just want to pile on, having the same issues here. Printing an object results in each line of the object being a separate item in Axiom. It is pretty much unusable.

We were able to reproduce and we are working on a fix. Just wanted to note that this is not related to next-axiom but to the Axiom Log Drain in the Vercel integration. When the fix is released it will just work for you; you will not have to update anything. Will keep you posted.