Version: 2.2-dev

# SparkPlug B Nodes

SparkPlug B is the MQTT-based interoperability standard for industrial IoT, providing a well-defined topic namespace, payload encoding, and birth/death certificate lifecycle. MaestroHub acts as a SparkPlug B Edge Node, publishing device data to SCADA hosts and IoT platforms.

## Publish-Only Protocol

SparkPlug B nodes in MaestroHub publish data outward only. There are no read nodes; data ingestion from SparkPlug B is handled through MQTT subscriptions at the connection level.

## Configuration Quick Reference

| Field | What you choose | Details |
| --- | --- | --- |
| Parameters | Connection, Function, Function Parameters, Timeout Override | Select the connection profile, function, configure function parameters with expression support, and optionally override timeout. |
| Settings | Description, Timeout (seconds), Retry on Timeout, Retry on Fail, On Error | Node description, maximum execution time, retry behavior on timeout or failure, and error handling strategy. All execution settings default to pipeline-level values. |

*SparkPlug B Publish node configuration*

## SparkPlug B Publish Node

Publish device data metrics to a SparkPlug B infrastructure via MQTT.

Supported Function Types:

| Function Name | Purpose | Common Use Cases |
| --- | --- | --- |
| Publish Device Data (`sparkplugb.ddata`) | Publish a DDATA message with device metrics | Sensor telemetry, machine status updates, periodic data reporting |

### Node Configuration

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| Connection | Selection | Yes | SparkPlug B connection profile to use |
| Function | Selection | Yes | Publish function from the selected connection |
| Function Parameters | Dynamic | Varies | Auto-populated from the function schema. See your SparkPlug B connection functions for full parameter details. |
| Timeout Override | Number (seconds) | No | Override the default function timeout |

All function parameters support expression syntax (`{{ expression }}`) for dynamic values from the pipeline context.

### Input

The node receives the output of the previous node as input. Input data can be referenced in function parameter expressions using `$input`.
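
As a hypothetical illustration of how `$input` expressions could be used in function parameters (the field names `sensorId`, `readings.temperature`, and `readings.humidity` are assumptions, not part of the product), a DDATA configuration might look like:

```json
{
  "deviceId": "{{ $input.sensorId }}",
  "metrics": {
    "temperature": "{{ $input.readings.temperature }}",
    "humidity": "{{ $input.readings.humidity }}"
  }
}
```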

### Output Structure

On success, the node produces:

```json
{
  "success": true,
  "functionId": "<function-id>",
  "data": {
    "topic": "spBv1.0/Plant1/DDATA/EdgeNode1/sensor-001",
    "deviceId": "sensor-001",
    "metricCount": 2,
    "action": "ddata"
  },
  "durationMs": 15,
  "timestamp": "2026-01-15T08:30:00Z"
}
```

| Field | Type | Description |
| --- | --- | --- |
| `success` | boolean | `true` when the publish completed without errors |
| `functionId` | string | ID of the executed function |
| `data` | object | Publish result details (see below) |
| `durationMs` | number | Execution time in milliseconds |
| `timestamp` | string | ISO 8601 / RFC 3339 UTC timestamp |

`data` fields:

| Field | Type | Description |
| --- | --- | --- |
| `topic` | string | Full MQTT topic the message was published to |
| `deviceId` | string | Device identifier used in the message |
| `metricCount` | number | Number of metrics included in the payload |
| `action` | string | Always `"ddata"` for device data publishes |

## Publish Device Data (DDATA)

The DDATA function publishes device metrics to the SparkPlug B topic namespace.

### Function Parameters

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| `deviceId` | String | Yes | Device identifier. Becomes part of the MQTT topic. Supports template placeholders. |
| `metrics` | Object | Yes | Name-value map of metrics to publish. At least one metric is required. Values support template placeholders. |

Example metrics:

```json
{
  "temperature": 25.5,
  "humidity": 60,
  "motorRunning": true,
  "status": "ACTIVE"
}
```

### Supported Metric Data Types

Metric types are automatically inferred from the values provided:

| Value Type | SparkPlug B Type | Example |
| --- | --- | --- |
| Integer | Int32 / Int64 | `60` |
| Float | Double | `25.5` |
| Boolean | Boolean | `true` |
| String | String | `"ACTIVE"` |
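
The inference rules above can be sketched in Python. This is a minimal illustration, not the connector's actual implementation; in particular, the assumption that integers outside the 32-bit range promote to Int64 is ours:

```python
def infer_sparkplug_type(value):
    """Map a metric value to a SparkPlug B data type name (sketch)."""
    if isinstance(value, bool):
        # Check bool before int: in Python, bool is a subclass of int.
        return "Boolean"
    if isinstance(value, int):
        # Assumption: values outside the signed 32-bit range promote to Int64.
        return "Int32" if -2**31 <= value < 2**31 else "Int64"
    if isinstance(value, float):
        return "Double"
    if isinstance(value, str):
        return "String"
    raise TypeError(f"unsupported metric value type: {type(value).__name__}")

metrics = {"temperature": 25.5, "humidity": 60, "motorRunning": True, "status": "ACTIVE"}
print({name: infer_sparkplug_type(v) for name, v in metrics.items()})
```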

### Topic Structure

Messages are published to the standard SparkPlug B topic namespace:

```
spBv1.0/{groupId}/DDATA/{edgeNodeId}/{deviceId}
```

The `groupId` and `edgeNodeId` are configured on the connection profile; the `deviceId` comes from the function parameter.
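
A minimal sketch of how the topic is assembled, assuming the usual MQTT constraint that topic segments contain no `/` or wildcard characters (the real connector's validation may differ):

```python
def ddata_topic(group_id: str, edge_node_id: str, device_id: str) -> str:
    """Build the SparkPlug B DDATA topic for a device."""
    for name, part in (("groupId", group_id),
                       ("edgeNodeId", edge_node_id),
                       ("deviceId", device_id)):
        # Topic segments must be non-empty and free of '/' and MQTT wildcards.
        if not part or "/" in part or "+" in part or "#" in part:
            raise ValueError(f"{name} must be non-empty with no '/', '+', or '#'")
    return f"spBv1.0/{group_id}/DDATA/{edge_node_id}/{device_id}"

print(ddata_topic("Plant1", "EdgeNode1", "sensor-001"))
# → spBv1.0/Plant1/DDATA/EdgeNode1/sensor-001
```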


## Automatic Lifecycle Management

The SparkPlug B connector automatically manages birth and death certificates — you only need to publish DDATA.

| Certificate | When Published | Purpose |
| --- | --- | --- |
| NBIRTH (Node Birth) | Automatically on connection | Announces the Edge Node to the SCADA host |
| NDEATH (Node Death) | Automatically on disconnect (also set as MQTT Last Will) | Notifies the host that the Edge Node is offline |
| DBIRTH (Device Birth) | Automatically on first DDATA for a device | Registers the device and its metric definitions |
| DDEATH (Device Death) | Automatically on disconnect | Marks all devices as offline |
:::tip

You do not need to publish birth or death certificates manually. The connector handles the full SparkPlug B session lifecycle, including sequence number management and LWT (Last Will and Testament) configuration.

:::
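
The message ordering the connector produces can be sketched as a small state machine. This is a simplified model of the behavior described above, not the connector's code:

```python
class LifecycleTracker:
    """Toy model of automatic SparkPlug B birth/death certificate ordering."""

    def __init__(self):
        self.connected = False
        self.born_devices = set()
        self.log = []

    def connect(self):
        self.connected = True
        self.log.append("NBIRTH")  # announce the Edge Node on connection

    def publish_ddata(self, device_id):
        assert self.connected, "connection must be active at execution time"
        if device_id not in self.born_devices:
            # First DDATA for a device triggers its DBIRTH automatically.
            self.log.append(f"DBIRTH:{device_id}")
            self.born_devices.add(device_id)
        self.log.append(f"DDATA:{device_id}")

    def disconnect(self):
        for device_id in sorted(self.born_devices):
            self.log.append(f"DDEATH:{device_id}")  # all devices marked offline
        self.log.append("NDEATH")  # delivered via MQTT Last Will in practice
        self.connected = False
        self.born_devices.clear()

t = LifecycleTracker()
t.connect()
t.publish_ddata("sensor-001")
t.publish_ddata("sensor-001")  # no second DBIRTH for a known device
t.disconnect()
print(t.log)
```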

## Sequence Numbers

The connector manages two sequence counters per the SparkPlug B specification:

| Counter | Scope | Behavior |
| --- | --- | --- |
| `bdSeq` | Birth-death cycle | Increments on each NBIRTH/NDEATH cycle; wraps to 0 after 255 |
| `seq` | Message sequence | Increments per message, resets to 0 on NBIRTH; wraps to 0 after 255 |
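
The modulo-256 behavior of both counters can be sketched as follows (a simplified model; the connector manages these internally and you never set them yourself):

```python
class SeqCounters:
    """Toy model of SparkPlug B sequence counters, both modulo 256."""

    def __init__(self):
        self.bd_seq = -1  # so the first NBIRTH uses bdSeq 0
        self.seq = 0

    def on_nbirth(self):
        # New birth/death cycle: bdSeq advances, message seq resets to 0.
        self.bd_seq = (self.bd_seq + 1) % 256
        self.seq = 0
        return self.bd_seq

    def next_seq(self):
        # Return the seq for the next message, then advance with wraparound.
        current = self.seq
        self.seq = (self.seq + 1) % 256
        return current

c = SeqCounters()
c.on_nbirth()  # bdSeq = 0; the NBIRTH message itself carries seq 0
print([c.next_seq() for _ in range(3)])
# → [0, 1, 2]
```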

## Validation Rules

- `deviceId` is required and cannot be empty.
- `metrics` must be a valid object with at least one key-value pair.
- The connection must be active (connected to the MQTT broker) at execution time.
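
The first two rules can be expressed as a simple pre-flight check. This is an illustrative sketch of the rules listed above (the connection-state check is omitted because it depends on the broker session):

```python
def validate_ddata_params(params: dict) -> None:
    """Raise ValueError if DDATA function parameters violate the validation rules."""
    device_id = params.get("deviceId")
    if not isinstance(device_id, str) or not device_id.strip():
        raise ValueError("deviceId is required and cannot be empty")
    metrics = params.get("metrics")
    if not isinstance(metrics, dict) or len(metrics) == 0:
        raise ValueError("metrics must be an object with at least one key-value pair")

# Valid parameters pass silently; invalid ones raise.
validate_ddata_params({"deviceId": "sensor-001", "metrics": {"temperature": 25.5}})
```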

## Settings Tab

The SparkPlug B Publish node uses the standard Settings tab:

| Setting | Type | Default | Description |
| --- | --- | --- | --- |
| Description | Text | | Optional description displayed on the node |
| Timeout (seconds) | Number | Pipeline default | Maximum time the node may run before timing out |
| Retry on Timeout | Toggle | Pipeline default | Automatically retry the node if it times out |
| Retry on Fail | Toggle | Pipeline default | Automatically retry the node if it fails |
| On Error | Selection | Pipeline default | Error strategy: stop the pipeline, continue to the next node, or follow the error output path |

When left at their defaults, these settings inherit from the pipeline-level execution configuration.