Designed for Salesforce admins, developers & architects. A robust logger for Apex, Flow, Process Builder & Integrations.
- Easily add log entries via Apex, Flow & Process Builder to generate 1 consolidated log
- Manage & report on logging data using the `Log__c` and `LogEntry__c` objects
- Leverage `LogEntryEvent__e` platform events for real-time monitoring & integrations
- Enable logging and set the logging level for different users & profiles using the `LoggerSettings__c` custom hierarchy setting
- View related log entries on any Lightning SObject flexipage by adding the 'Related Log Entries' component in App Builder
- Dynamically assign Topics to `Log__c` and `LogEntry__c` records for tagging/labeling your logs (not currently available in the managed package)
- Plugin framework: easily build or install plugins that enhance the `Log__c` and `LogEntry__c` objects, using Apex or Flow
- Event-driven integrations with platform events, an event-driven messaging architecture. External integrations can subscribe to log events using the `LogEntryEvent__e` object - see the Platform Events Developer Guide site for more details
Learn more about the design and history of the project in the Joys Of Apex blog post
You can choose to install the unlocked package, deploy the metadata from this repo to your org, or install the managed package. The metadata is the same in all 3 options, but there are some differences in using the 3 versions. All examples in the README are for the unlocked package/unpackaged metadata (no namespace) - simply add the `Nebula` namespace to the examples if you are using the managed package.
| | Unlocked Package (Recommended) | Managed Package |
|---|---|---|
| Namespace | none | `Nebula` |
| Future Releases | Faster release cycle: new patch versions (e.g., `v4.4.x`) are released for new enhancements & bugfixes that are merged to the `main` branch in GitHub | Slower release cycle: new minor versions (e.g., `v4.x`) are only released once new enhancements & bugfixes have been tested and the code is stabilized |
| Public & Protected Apex Methods | Any `public` and `protected` Apex methods are subject to change in the future - they can be used, but you may encounter deployment issues if future changes to `public` and `protected` methods are not backwards-compatible | Only `global` methods are available in managed packages - any `global` Apex methods available in the managed package will be supported for the foreseeable future |
| Apex Debug Statements | `System.debug()` is automatically called | Requires adding your own calls to `System.debug()` due to Salesforce limitations with managed packages |
| Apex Stack Traces | Automatically stored in `LogEntry__c.StackTrace__c` when calling methods like `Logger.debug('my message');` | Requires calling `parseStackTrace()` due to Salesforce limitations with managed packages. For example: `Logger.debug('my message').parseStackTrace(new DmlException().getStackTrace());` |
| Assign Topics (Tagging/Labeling System) | Provide `List<String> topics` in Apex or Flow to dynamically assign Salesforce Topics to `Log__c` and `LogEntry__c` records | This functionality is not currently available in the managed package |
| Logger Plugin Framework | Leverage Apex or Flow to build your own "plugins" for Logger - to add your own automation to the `Log__c` or `LogEntry__c` objects. The logger system will then automatically run your plugins after each trigger event (`BEFORE_INSERT`, `BEFORE_UPDATE`, `AFTER_INSERT`, `AFTER_UPDATE`, and so on) | This functionality is not currently available in the managed package |
After deploying Nebula Logger to your org, there are a few additional configuration changes needed:
- Assign permission set(s) to users
  - `LoggerLogCreator` provides the minimum access needed for users to generate logs via Apex, Flow or Process Builder
  - `LoggerEndUser` provides access to generate logs, as well as read-only access to any log records shared with the user
  - `LoggerLogViewer` provides view-all access (read-only) to all log records. This does not provide access to generate logs
  - `LoggerAdmin` provides view-all and modify-all access to all log records
- Customize the default settings in `LoggerSettings__c`
  - You can customize settings at the org, profile and user levels
- Unlocked Package Only: Enable Salesforce Topics for the `Log__c` and `LogEntry__c` objects for tagging/labeling. See Salesforce Help for more details.
  - Currently, enabling Topics for objects must still be done using the Salesforce Classic UI. Once enabled, Topics can then be used from within Lightning Experience.
  - Once enabled, Topics can be added via Apex and Flow and then used as list view filters (and more) for the `Log__c` object
For Apex developers, the `Logger` class has several methods that can be used to add entries with different logging levels. Each logging level's method has several overloads to support multiple parameters.
```apex
// This will generate a debug statement within developer console
System.debug('Debug statement using native Apex');

// This will create a new `Log__c` record with multiple related `LogEntry__c` records
Logger.error('Add log entry using Nebula Logger with logging level == ERROR');
Logger.warn('Add log entry using Nebula Logger with logging level == WARN');
Logger.info('Add log entry using Nebula Logger with logging level == INFO');
Logger.debug('Add log entry using Nebula Logger with logging level == DEBUG');
Logger.fine('Add log entry using Nebula Logger with logging level == FINE');
Logger.finer('Add log entry using Nebula Logger with logging level == FINER');
Logger.finest('Add log entry using Nebula Logger with logging level == FINEST');
Logger.saveLog();
```
This results in 1 `Log__c` record with several related `LogEntry__c` records.
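The same logging-level methods work well inside a `try`/`catch` block for logging caught exceptions. The sketch below assumes an overload that accepts an exception exists - check the `Logger` class documentation for the exact signatures available in your version:

```apex
try {
    insert new Account(); // Fails - the required Name field is missing
} catch (Exception ex) {
    // Assumption: an overload accepting an Exception is available
    Logger.error('An unexpected exception occurred', ex);
    Logger.saveLog();
    throw ex;
}
```

Calling `saveLog()` before re-throwing gives the log entries a chance to be published before the transaction ends.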
Within Flow & Process Builder, you can select 1 of several logging actions.
In this simple example, a Flow is configured to run after-insert and after-update to log a Case record (using the action 'Add Log Entry for an SObject Record').
This results in a `Log__c` record with related `LogEntry__c` records.
After incorporating Logger into your Flows & Apex code (including controllers, trigger framework, etc.), you'll have a unified transaction log of all your declarative & custom code automations.
```apex
Case currentCase = [SELECT Id, CaseNumber, Type, Status, IsClosed FROM Case LIMIT 1];

Logger.info('First, log the case through Apex', currentCase);
Logger.debug('Now, we update the case in Apex to cause our record-triggered Flow to run');
update currentCase;

Logger.info('Last, save our log');
Logger.saveLog();
```
This generates 1 consolidated `Log__c` record, containing `LogEntry__c` records from both Apex and Flow.
Within Apex, there are several additional `Logger` methods that provide greater control over the logging system, letting developers dynamically control how logs are saved during the current transaction.
- `Logger.suspendSaving()` - causes `Logger` to ignore any calls to `saveLog()` in the current transaction until `resumeSaving()` is called. Useful for reducing DML statements used by `Logger`
- `Logger.resumeSaving()` - re-enables saving after `suspendSaving()` is used
- `Logger.flushBuffer()` - discards any unsaved log entries
- `Logger.setSaveMethod(SaveMethod saveMethod)` - sets the default save method used when calling `saveLog()`. Any subsequent calls to `saveLog()` in the current transaction will use the specified save method
- `Logger.saveLog(SaveMethod saveMethod)` - saves any entries in Logger's buffer, using the specified save method for only this call. All subsequent calls to `saveLog()` will use the default save method
- Enum `Logger.SaveMethod` - this enum can be used for both `Logger.setSaveMethod(saveMethod)` and `Logger.saveLog(saveMethod)`
  - `Logger.SaveMethod.EVENT_BUS` - the default save method; uses the `EventBus` class to publish `LogEntryEvent__e` records. The default save method can also be controlled declaratively by updating the field `LoggerSettings__c.DefaultSaveMethod__c`
  - `Logger.SaveMethod.QUEUEABLE` - triggers `Logger` to save any pending records asynchronously using a queueable job. This is useful when you need to defer some CPU usage and other limits consumed by Logger
  - `Logger.SaveMethod.REST` - uses the current user's session ID to make a synchronous callout to the org's REST API. This is useful when you have other callouts being made and you need to avoid mixed DML operations
  - `Logger.SaveMethod.SYNCHRONOUS_DML` - skips publishing the `LogEntryEvent__e` platform events, and instead immediately creates `Log__c` and `LogEntry__c` records. This is useful when you are logging from within the context of another platform event and/or you do not anticipate any exceptions to occur in the current transaction. Note: when using this save method, any exceptions will prevent your log entries from being saved - Salesforce will roll back any DML statements, including your log entries! Use this save method cautiously.
In Salesforce, asynchronous jobs like batchable and queueable jobs run in separate transactions - each with their own unique transaction ID. To relate these jobs back to the original log, Apex developers can use the method `Logger.setParentLogTransactionId(String)`. `Logger` uses this value to relate child `Log__c` records, using the field `Log__c.ParentLog__c`.
This example batchable class shows how you can leverage this feature to relate all of your batch job’s logs together.
ℹ️ If you deploy this example class to your org, you can run it using `Database.executeBatch(new BatchableLoggerExample());`
```apex
public with sharing class BatchableLoggerExample implements Database.Batchable<SObject>, Database.Stateful {
    private String originalTransactionId;

    public Database.QueryLocator start(Database.BatchableContext batchableContext) {
        // Each batchable method runs in a separate transaction
        // ...so store the first transaction ID to later relate the other transactions
        this.originalTransactionId = Logger.getTransactionId();

        Logger.info('Starting BatchableLoggerExample');
        Logger.saveLog();

        // Just as an example, query all accounts
        return Database.getQueryLocator([SELECT Id, Name, RecordTypeId FROM Account]);
    }

    public void execute(Database.BatchableContext batchableContext, List<Account> scope) {
        // One-time call (per transaction) to set the parent log
        Logger.setParentLogTransactionId(this.originalTransactionId);

        for (Account account : scope) {
            // TODO add your batch job's logic

            // Then log the result
            Logger.info('Processed an account record', account);
        }

        Logger.saveLog();
    }

    public void finish(Database.BatchableContext batchableContext) {
        // The finish method runs in yet-another transaction, so set the parent log again
        Logger.setParentLogTransactionId(this.originalTransactionId);

        Logger.info('Finishing running BatchableLoggerExample');
        Logger.saveLog();
    }
}
```
Queueable jobs can also leverage the parent transaction ID to relate logs together. This example queueable job will run several chained instances. Each instance uses the `parentLogTransactionId` to relate its log back to the original instance's log.
ℹ️ If you deploy this example class to your org, you can run it using `System.enqueueJob(new QueueableLoggerExample(3));`
```apex
public with sharing class QueueableLoggerExample implements Queueable {
    private Integer numberOfJobsToChain;
    private String parentLogTransactionId;

    private List<LogEntryEvent__e> logEntryEvents = new List<LogEntryEvent__e>();

    // Main constructor - for demo purposes, it accepts an integer that controls how many times the job runs
    public QueueableLoggerExample(Integer numberOfJobsToChain) {
        this(numberOfJobsToChain, null);
    }

    // Second constructor, used to pass the original transaction's ID to each chained instance of the job
    // You don't have to use a constructor - a public method or property would work too.
    // There just needs to be a way to pass the value of parentLogTransactionId between instances
    public QueueableLoggerExample(Integer numberOfJobsToChain, String parentLogTransactionId) {
        this.numberOfJobsToChain = numberOfJobsToChain;
        this.parentLogTransactionId = parentLogTransactionId;
    }

    // Creates some log entries and starts a new instance of the job when applicable (based on numberOfJobsToChain)
    public void execute(System.QueueableContext queueableContext) {
        Logger.setParentLogTransactionId(this.parentLogTransactionId);

        Logger.fine('queueableContext==' + queueableContext);
        Logger.info('this.numberOfJobsToChain==' + this.numberOfJobsToChain);
        Logger.info('this.parentLogTransactionId==' + this.parentLogTransactionId);

        // TODO add your queueable job's logic

        Logger.saveLog();

        --this.numberOfJobsToChain;
        if (this.numberOfJobsToChain > 0) {
            String parentLogTransactionId = this.parentLogTransactionId != null ? this.parentLogTransactionId : Logger.getTransactionId();
            System.enqueueJob(new QueueableLoggerExample(this.numberOfJobsToChain, parentLogTransactionId));
        }
    }
}
```
Each of the logging methods in `Logger` (such as `Logger.error()`, `Logger.debug()`, and so on) has several static overloads for various parameters. These are intended to provide simple method calls for common parameters, such as:

- Log a message and a record - `Logger.error(String message, SObject record)`
- Log a message and a record ID - `Logger.error(String message, Id recordId)`
- Log a message and a save result - `Logger.error(String message, Database.SaveResult saveResult)`
- ...
To see the full list of overloads, check out the `Logger` class documentation.
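For example, the `Database.SaveResult` overload listed above pairs well with partial-success DML to log only the failed records. The query below is illustrative - substitute your own records:

```apex
List<Account> accountsToUpdate = [SELECT Id, Name FROM Account LIMIT 10];

// allOrNone=false allows partial success, so some records can fail without an exception
List<Database.SaveResult> saveResults = Database.update(accountsToUpdate, false);
for (Database.SaveResult saveResult : saveResults) {
    if (!saveResult.isSuccess()) {
        // Uses the Logger.error(String message, Database.SaveResult saveResult) overload
        Logger.error('Account update failed', saveResult);
    }
}
Logger.saveLog();
```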
Each of the logging methods in `Logger` returns an instance of the class `LogEntryEventBuilder`. This class provides several additional methods to further customize each log entry - the builder methods can be chained together. In this example Apex, 3 log entries are created using different approaches for calling `Logger` - all 3 approaches result in identical log entries.
```apex
// Get the current user so we can log it (just as an example of logging an SObject)
User currentUser = [SELECT Id, Name, Username, Email FROM User WHERE Id = :UserInfo.getUserId()];

// Using static Logger method overloads
Logger.debug('my string', currentUser);

// Using the instance of LogEntryEventBuilder
LogEntryEventBuilder builder = Logger.debug('my string');
builder.setRecord(currentUser);

// Chaining builder methods together
Logger.debug('my string').setRecord(currentUser);

// Save all of the log entries
Logger.saveLog();
```
The class `LogMessage` provides the ability to generate string messages on demand, using `String.format()`. This provides 2 benefits:
- Improved CPU usage by skipping unnecessary calls to `String.format()`

  ```apex
  // Without using LogMessage, String.format() is always called, even if the FINE logging level is not enabled for a user
  String formattedString = String.format('my example with input: {0}', new List<Object>{ 'myString' });
  Logger.fine(formattedString);

  // With LogMessage, when the specified logging level (FINE) is disabled for the current user, String.format() is not called
  LogMessage logMessage = new LogMessage('my example with input: {0}', 'myString');
  Logger.fine(logMessage);
  ```
- Easily build complex strings

  ```apex
  // There are several constructors for LogMessage to support different numbers of parameters for the formatted string
  String unformattedMessage = 'my string with 3 inputs: {0} and then {1} and finally {2}';
  String formattedMessage = new LogMessage(unformattedMessage, 'something', 'something else', 'one more').getMessage();
  String expectedMessage = 'my string with 3 inputs: something and then something else and finally one more';
  System.assertEquals(expectedMessage, formattedMessage);
  ```
For more details, check out the `LogMessage` class documentation.
Within Flow (and Process Builder), there are 4 invocable actions that you can use to leverage Nebula Logger:

- 'Add Log Entry' - uses the class `FlowLogEntry` to add a log entry with a specified message
- 'Add Log Entry for an SObject Record' - uses the class `FlowRecordLogEntry` to add a log entry with a specified message for a particular SObject record
- 'Add Log Entry for an SObject Record Collection' - uses the class `FlowCollectionLogEntry` to add a log entry with a specified message for an SObject record collection
- 'Save Log' - uses the class `Logger` to save any pending logs
The Logger Console app provides access to the `Log__c` and `LogEntry__c` object tabs (for any users with the correct access).
To help development and support teams better manage logs (and any underlying code or config issues), some fields on `Log__c` are provided to track the owner, priority and status of a log. These fields are optional, but are helpful in critical environments (production, QA sandboxes, UAT sandboxes, etc.) for monitoring ongoing user activities.
- All editable fields on `Log__c` can be updated via the 'Manage Log' quick action (shown below)
- Additional fields are automatically set based on changes to `Log__c.Status__c`
  - `Log__c.ClosedBy__c` - The user who closed the log
  - `Log__c.ClosedDate__c` - The datetime that the log was closed
  - `Log__c.IsClosed__c` - Indicates if the log is closed, based on the selected status (and associated config in the 'Log Status' custom metadata type)
  - `Log__c.IsResolved__c` - Indicates if the log is resolved (meaning that it required analysis/work, which has been completed). Only closed statuses can be considered resolved. This is also driven by the selected status (and associated config in the 'Log Status' custom metadata type)
- To customize the statuses provided, simply update the picklist values for `Log__c.Status__c` and create/update corresponding records in the custom metadata type `LogStatus__mdt`. This custom metadata type controls which statuses are considered closed and resolved.
Everyone loves JSON - so to make it easy to see a JSON version of a `Log__c` record, you can use the 'View JSON' quick action button. It displays the current `Log__c` record and all related `LogEntry__c` records in JSON format, as well as a handy button to copy the JSON to your clipboard. All fields that the current user can view (based on field-level security) are dynamically returned, including any custom fields added directly in your org or by plugins.
Within App Builder, admins can add the 'Related Log Entries' lightning web component to any record page. Admins can also control which columns are displayed by creating & selecting a field set on `LogEntry__c` with the desired fields.

- The component automatically shows any related log entries, based on `LogEntry__c.RecordId__c == :recordId`
- Users can search the list of log entries for a particular record using the component's built-in search box. The component dynamically searches all related log entries using SOSL.
- The component automatically enforces Salesforce's security model
  - Object-Level Security - Users without read access to `LogEntry__c` will not see the component
  - Record-Level Security - Users will only see records that have been shared with them
  - Field-Level Security - Users will only see the fields within the field set that they have access to
Admins can easily delete old logs using 2 methods: list views or Apex batch jobs
Salesforce (still) does not support mass deleting records out-of-the-box. There's been an Idea for 11+ years about it, but it's still not standard functionality. A custom button is available on `Log__c` list views to provide mass deletion functionality.

- Admins can select 1 or more `Log__c` records from the list view to choose which logs will be deleted
- The button shows a Visualforce page `LogMassDelete` to confirm that the user wants to delete the records
Two Apex classes are provided out-of-the-box to handle automatically deleting old logs:

- `LogBatchPurger` - this batch Apex class will delete any `Log__c` records with `Log__c.LogRetentionDate__c <= System.today()`
  - By default, this field is populated with "TODAY + 14 DAYS" - the number of days to retain a log can be customized in `LoggerSettings__c`
  - Admins can also manually edit this field to change the retention date - or set it to null to prevent the log from being automatically deleted
- `LogBatchPurgeScheduler` - this schedulable Apex class can be scheduled to run `LogBatchPurger` on a daily or weekly basis
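For example, the purge job can be run immediately or scheduled from Anonymous Apex. The job name and cron expression below are just placeholders - adjust them for your org:

```apex
// Run the purge batch job immediately
Database.executeBatch(new LogBatchPurger());

// Or schedule it to run daily at 1:00 AM (cron expression is an example)
System.schedule('Nightly Log Purge', '0 0 1 * * ?', new LogBatchPurgeScheduler());
```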
If you want to add your own automation to the `Log__c` or `LogEntry__c` objects, you can leverage Apex or Flow to define "plugins" - the logger system will then automatically run the plugins after each trigger event (`BEFORE_INSERT`, `BEFORE_UPDATE`, `AFTER_INSERT`, `AFTER_UPDATE`, and so on). This framework makes it easy to build your own plugins, or deploy/install others' prebuilt packages, without having to modify the logging system directly.
- Flow plugins: your Flow should be built as an auto-launched Flow with these parameters:
  - `Input` parameter `triggerOperationType` - The name of the current trigger operation (such as BEFORE_INSERT, BEFORE_UPDATE, etc.)
  - `Input` parameter `triggerNew` - The list of logger records being processed (`Log__c` or `LogEntry__c` records)
  - `Output` parameter `updatedTriggerNew` - If your Flow makes any updates to the collection of records, you should return a record collection containing the updated records
  - `Input` parameter `triggerOld` - The list of logger records as they exist in the database
- Apex plugins: your Apex class should extend the abstract class `LoggerSObjectHandlerPlugin`. For example:

  ```apex
  public class ExamplePlugin extends LoggerSObjectHandlerPlugin {
      public override void execute(
          TriggerOperation triggerOperationType,
          List<SObject> triggerNew,
          Map<Id, SObject> triggerNewMap,
          List<SObject> triggerOld,
          Map<Id, SObject> triggerOldMap
      ) {
          switch on triggerOperationType {
              when BEFORE_INSERT {
                  for (Log__c log : (List<Log__c>) triggerNew) {
                      log.Status__c = 'On Hold';
                  }
              }
          }
      }
  }
  ```
Once you've created your Apex or Flow plugin(s), you will also need to configure the plugin:
- 'Logger Plugin' - use the custom metadata type `LoggerSObjectHandlerPlugin__mdt` to define your plugin, including the plugin type (Apex or Flow) and the API name of your plugin's Apex class or Flow
- 'Logger Plugin Parameter' - use the custom metadata type `LoggerSObjectHandlerPluginParameter__mdt` to define any configurable parameters needed for your plugin, such as environment-specific URLs and other similar configurations
Note: the logger plugin framework is not available in the managed package due to some platform limitations & considerations with some of the underlying code. The unlocked package is recommended (instead of the managed package) when possible, including if you want to be able to leverage the plugin framework in your org.
The optional Slack plugin leverages the Nebula Logger plugin framework to automatically send Slack notifications for logs that meet a certain (configurable) logging level. The plugin also serves as a functioning example of how to build your own plugin for Nebula Logger, such as how to:
- Use Apex to apply custom logic to `Log__c` and `LogEntry__c` records
- Add custom fields and list views to Logger's objects
- Extend permission sets to include field-level security for your custom fields
- Leverage the new `LoggerSObjectHandlerPluginParameter__mdt` CMDT object to store configuration for your plugin

Check out the Slack plugin for more details on how to install & customize the plugin.
If you want to remove the unlocked or managed packages, you can do so by simply uninstalling them in your org under Setup --> Installed Packages.