Client doesn't work with Laravel Queue
Hi,
We have this package installed in a Laravel application. The app listens for some specific events, which are queued and processed in the background. When the queued jobs are processed, we also submit the events to PostHog, e.g.:
PostHog::capture([
    'distinctId' => 'user:1',
    'event' => 'some-event'
]);
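For context, here is a minimal sketch of the kind of queued job this call runs in; the class and event names are illustrative, not our actual code:

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use PostHog\PostHog;

// Hypothetical job class, just to show where the capture call sits.
class TrackSomeEvent implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function handle(): void
    {
        // This runs inside the long-lived queue worker process,
        // not inside a normal web request.
        PostHog::capture([
            'distinctId' => 'user:1',
            'event' => 'some-event',
        ]);
    }
}
```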
But we found out that the event is never sent to PostHog. There are no errors; it just does nothing. We tested the same app locally (without the queue) and it works as expected. We also tested making an HTTP request directly to the API through the Guzzle client, and that works fine. After some debugging I think there may be an issue with the consumer clients (the default is lib_curl), and I guess it's related to how the __destruct function behaves inside a queued job in a Laravel app.
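If that guess is right, the lib_curl consumer buffers events in an in-memory queue and only sends them when the batch fills or when the client is destructed; a worker started with php artisan queue:work stays alive between jobs, so __destruct never fires. A possible workaround, assuming the client exposes PostHog::flush() (it is derived from Segment's analytics-php, which does), would be to flush explicitly at the end of the job:

```php
use PostHog\PostHog;

PostHog::capture([
    'distinctId' => 'user:1',
    'event' => 'some-event',
]);

// Force the buffered events out instead of relying on __destruct,
// which never runs in a long-lived worker process. Assumes a
// flush() method is available on the client.
PostHog::flush();
```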
Laravel version: 8.75.0
Queue jobs are processed with the command php artisan queue:work.
Looking forward to a solution. Thank you.
Are you calling PostHog::init in the queue job as well?
Sounds like PostHog::init wasn't called.
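For example, a minimal sketch of initializing the client once in a service provider, so it is configured for web requests and queue workers alike; the config keys here are assumptions, adjust them to your app:

```php
<?php

namespace App\Providers;

use Illuminate\Support\ServiceProvider;
use PostHog\PostHog;

class AppServiceProvider extends ServiceProvider
{
    public function boot(): void
    {
        // boot() also runs when the framework starts under
        // `php artisan queue:work`, so worker processes get an
        // initialized client too.
        PostHog::init(config('services.posthog.api_key'), [
            'host' => config('services.posthog.host'),
        ]);
    }
}
```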
Facing the same problem. I have tested calling PostHog::init in the queued job as well, but it still didn't work:
PostHog::init(config()->get('services.posthog.api_key'), [
    'host' => config()->get('services.posthog.host'),
]);

PostHog::capture([
    'distinctId' => $this->user->getKey(),
    'event' => 'user:registered',
    'properties' => [
        'registered_via' => $this->registeredVia,
    ],
]);
However it does work when running the job synchronously.
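That fits a buffering theory: a synchronous dispatch runs inside the request, so the consumer's buffer is flushed when the PHP process shuts down, while a queued dispatch runs in a worker process that never shuts down. Sketched with a hypothetical job class:

```php
use App\Jobs\TrackUserRegistered; // hypothetical job class

// Works: runs in-process, and the buffered events are sent
// when the request terminates and the client is destructed.
TrackUserRegistered::dispatchSync($user, $registeredVia);

// Appears to drop events: runs inside the long-lived queue
// worker, where the client is never destructed.
TrackUserRegistered::dispatch($user, $registeredVia);
```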
Passing batch_size as an optional parameter solved my problem (we never accumulate 100 events in time, which is why no events showed up in PostHog).
@nwdles could you show an example? I am confused about that parameter. Do you mean this?
https://posthog.com/docs/api/post-only-endpoints#batch-events
I mean this:
PostHog::init(config('posthog.api_key'), ['host' => config('posthog.url'), 'batch_size' => 1]);
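With batch_size set to 1, the consumer's queue reaches its threshold on every capture call, so each event is sent immediately instead of waiting for the default batch of 100 or for __destruct. A sketch of the full setup, reusing the config keys from the comment above:

```php
use PostHog\PostHog;

// batch_size => 1 flushes after every event, so nothing is left
// sitting in the buffer of a long-lived queue worker.
PostHog::init(config('posthog.api_key'), [
    'host' => config('posthog.url'),
    'batch_size' => 1,
]);

PostHog::capture([
    'distinctId' => 'user:1',
    'event' => 'some-event',
]);
```

The trade-off is one HTTP request per event; for higher volumes, an explicit flush at the end of each job (as sketched earlier) would keep batching while still emptying the buffer.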