SparkPlug B Integration Guide
Use MaestroHub's SparkPlug B connector to publish industrial metrics to SCADA systems, historians, and IoT platforms. MaestroHub operates as a SparkPlug B Edge Node (EoN), automatically managing birth/death certificates, sequence numbers, and Protocol Buffers encoding according to the SparkPlug B specification.
Overview
The SparkPlug B connector delivers:
- Edge Node (EoN) Publisher role with automatic NBIRTH/NDEATH lifecycle management
- Automatic DBIRTH/DDEATH handling for devices on first data and disconnect
- Protocol Buffers encoding for efficient, standardized message payloads
- Sequence number management ensuring proper message ordering
- Last Will and Testament (LWT) for graceful and unexpected disconnection handling
SparkPlug B is an open specification that defines how to use MQTT in industrial environments. It provides a consistent topic namespace, payload format, and state management model. For more details, see the Eclipse SparkPlug Specification.
Connection Configuration
Creating a SparkPlug B Connection
Navigate to Connections → New Connection → SparkPlug B and fill in these details:
SparkPlug B Connection Creation Fields
1. Profile Information
| Field | Default | Description |
|---|---|---|
| Profile Name | - | A descriptive name for this connection profile (required, max 100 characters) |
| Description | - | Optional description for this SparkPlug B connection |
2. MQTT Broker Configuration
| Field | Default | Description |
|---|---|---|
| Broker | - | MQTT broker hostname or IP address (e.g., broker.example.com) – required |
| Port | 1883 | MQTT broker port (1-65535) |
| Scheme | tcp | Connection scheme: tcp, ssl, ws, or wss. Auto-set to ssl if TLS is enabled |
| Client ID | - | MQTT client identifier. Auto-generated as maestrohub-spb-{connectionId} if not provided |
3. SparkPlug B Namespace Configuration
| Field | Default | Description |
|---|---|---|
| Group ID | - | SparkPlug B group identifier (required). Represents a logical grouping of Edge Nodes (e.g., factory-floor, building-a) |
| Edge Node ID | - | Unique Edge Node identifier within the group (required). Represents this MaestroHub instance (e.g., edge-node-01, plc-gateway) |
Topic Namespace
SparkPlug B uses a standardized topic structure:
spBv1.0/{group_id}/{message_type}/{edge_node_id}[/{device_id}]
Example topics:
- spBv1.0/factory/NBIRTH/edge-node-1 – Node birth certificate
- spBv1.0/factory/DDATA/edge-node-1/sensor-01 – Device data
- spBv1.0/factory/DDEATH/edge-node-1/sensor-01 – Device death certificate
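The topic layout above can be sketched as a small helper. This is illustrative Python; the function and parameter names are our own, not part of MaestroHub:

```python
from typing import Optional

# Build a SparkPlug B topic from its components (illustrative helper,
# not MaestroHub's API). device_id is omitted for node-level messages
# such as NBIRTH and NDEATH.
def spb_topic(group_id: str, message_type: str, edge_node_id: str,
              device_id: Optional[str] = None) -> str:
    parts = ["spBv1.0", group_id, message_type, edge_node_id]
    if device_id is not None:
        parts.append(device_id)
    return "/".join(parts)

print(spb_topic("factory", "NBIRTH", "edge-node-1"))
# spBv1.0/factory/NBIRTH/edge-node-1
print(spb_topic("factory", "DDATA", "edge-node-1", "sensor-01"))
# spBv1.0/factory/DDATA/edge-node-1/sensor-01
```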
4. Authentication
| Field | Default | Description |
|---|---|---|
| Username | - | MQTT broker username (optional) |
| Password | - | MQTT broker password (optional) |
5. TLS/SSL Settings
| Field | Default | Description |
|---|---|---|
| Enable TLS | false | Use encrypted connection to the MQTT broker |
When TLS is enabled, the connection scheme automatically switches to ssl. For production environments, ensure your MQTT broker has proper certificate configuration.
6. Connection Settings
| Field | Default | Description |
|---|---|---|
| Clean Session | true | Start with a clean session on connect |
| Keep Alive (seconds) | 60 | Keep-alive interval in seconds |
| Connect Timeout (seconds) | 30 | Connection timeout in seconds |
7. Connection Labels
| Field | Default | Description |
|---|---|---|
| Labels | - | Key-value pairs to categorize and organize this SparkPlug B connection (max 10 labels) |
Example Labels
- environment: production – Deployment environment
- team: automation – Responsible team
- protocol: sparkplugb – Connection protocol
- region: us-east-1 – Geographical region
- Single Ownership: SparkPlug B connections require exclusive scaling because birth/death certificates and sequence numbers must be managed by a single instance.
- Automatic LWT: The connector automatically configures Last Will and Testament with NDEATH payload to ensure proper death certificate delivery on unexpected disconnection.
- Sequence Numbers: The message sequence number (0-255) and the birth-death sequence number (bdSeq) are managed automatically; the message sequence wraps back to 0 after 255.
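The wrap-around rule can be illustrated with a tiny counter. This is a sketch of the rule as stated above, not MaestroHub's internal code:

```python
# Sparkplug message sequence numbers run 0-255 and wrap back to 0.
# Illustrative sketch of the rule, not MaestroHub's implementation.
class SeqCounter:
    def __init__(self) -> None:
        self.seq = -1  # first publish uses seq 0

    def next(self) -> int:
        self.seq = (self.seq + 1) % 256
        return self.seq

c = SeqCounter()
first = c.next()        # 0
for _ in range(255):
    last = c.next()     # counts up to 255
print(first, last)      # 0 255
print(c.next())         # wraps back to 0
```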
Function Builder
Creating SparkPlug B Functions
After the connection is configured:
- Go to Functions → New Function
- Choose Publish Device Data as the function type
- Select the SparkPlug B connection profile
- Define the device ID and metrics to publish
Publish Device Data Function
Purpose: Send device metric values (DDATA) to SCADA systems. The connector automatically manages device birth/death certificates — a DBIRTH is published on the first data message for each device.
Configuration Fields
| Field | Type | Required | Default | Description |
|---|---|---|---|---|
| Device ID | String | Yes | - | Unique device identifier within this Edge Node (e.g., sensor-01, plc-line-1) |
| Metrics | Object | Yes | - | Map of metric names to values. Must contain at least one metric. Supports parameter templates |
Supported Metric Data Types
The connector automatically infers the SparkPlug B data type from the metric value:
| Go/JSON Type | SparkPlug B Type | Description |
|---|---|---|
int, int64 | Int64 | 64-bit signed integer |
int8 | Int8 | 8-bit signed integer |
int16 | Int16 | 16-bit signed integer |
int32 | Int32 | 32-bit signed integer |
uint, uint64 | UInt64 | 64-bit unsigned integer |
uint8 | UInt8 | 8-bit unsigned integer |
uint16 | UInt16 | 16-bit unsigned integer |
uint32 | UInt32 | 32-bit unsigned integer |
float32 | Float | 32-bit floating point |
float64 | Double | 64-bit floating point |
bool | Boolean | True/false value |
string | String | Text value |
[]byte | Bytes | Binary data |
time.Time | DateTime | Timestamp (milliseconds since epoch) |
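The inference table can be approximated in Python. Note that Python has no sized integer types, so everything collapses to Int64/Double here; the function name is illustrative, not the connector's API:

```python
from datetime import datetime

# Approximate the connector's type inference for Python values
# (illustrative only; the real connector inspects Go types, so
# the sized Int8/Int16/Int32/UInt* cases have no Python analogue).
def infer_spb_type(value) -> str:
    if isinstance(value, bool):   # check bool first: bool subclasses int
        return "Boolean"
    if isinstance(value, int):
        return "Int64"
    if isinstance(value, float):
        return "Double"
    if isinstance(value, str):
        return "String"
    if isinstance(value, (bytes, bytearray)):
        return "Bytes"
    if isinstance(value, datetime):
        return "DateTime"
    raise TypeError(f"unsupported metric type: {type(value).__name__}")

print(infer_spb_type(42))      # Int64
print(infer_spb_type(23.5))    # Double
print(infer_spb_type(True))    # Boolean
```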
Use Cases: Sensor data publishing, real-time telemetry, periodic data reporting, status metric updates
Example Metrics Configuration
{
"temperature": 23.5,
"pressure": 101.3,
"running": true,
"status": "operational",
"count": 42
}
Using Parameters
SparkPlug B functions support parameterized metric values via the ((parameterName)) syntax.
| Configuration | Description | Example |
|---|---|---|
| Type | Validate incoming pipeline data | string, number, boolean, datetime, json, buffer |
| Required | Force presence of the parameter | Required / Optional |
| Default Value | Provide fallback values | 0, false, "unknown" |
| Description | Document intent for other authors | "Current temperature reading from sensor" |
Example with Parameters
{
"temperature": "((sensorTemp))",
"pressure": "((sensorPressure))",
"timestamp": "((now))"
}
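How ((parameterName)) placeholders resolve against pipeline values can be sketched as follows. This is an illustrative resolver, not MaestroHub's implementation; one reasonable design keeps the parameter's native type when the placeholder is the entire value:

```python
import re

# Replace ((name)) placeholders in a metrics template with runtime
# values (illustrative sketch of the syntax, not MaestroHub's code).
PARAM_RE = re.compile(r"\(\((\w+)\)\)")

def resolve_params(template: dict, params: dict) -> dict:
    out = {}
    for name, value in template.items():
        if isinstance(value, str):
            m = PARAM_RE.fullmatch(value)
            if m:
                # Whole value is a placeholder: substitute and keep
                # the parameter's native type (float, bool, ...).
                value = params[m.group(1)]
        out[name] = value
    return out

metrics = resolve_params(
    {"temperature": "((sensorTemp))", "running": True},
    {"sensorTemp": 23.5},
)
print(metrics)  # {'temperature': 23.5, 'running': True}
```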
SparkPlug B Message Lifecycle
Automatic Message Management
The SparkPlug B connector automatically manages the complete message lifecycle:
| Message Type | When Published | Purpose |
|---|---|---|
| NBIRTH | Automatically on connection | Announces the Edge Node is online and provides its initial state |
| NDEATH | On graceful disconnect or via LWT on unexpected disconnect | Announces the Edge Node is offline |
| DBIRTH | Automatically on first DDATA for each device | Announces a device is online and provides its metric schema |
| DDEATH | Automatically on disconnect | Announces a device is offline |
| DDATA | When you call the Publish Device Data function | Contains updated metric values for a device |
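The "DBIRTH on first data" row in the table can be sketched as a small rule: track which devices have already announced themselves, and precede the first DDATA for a device with a DBIRTH. Illustrative code, not the connector's implementation:

```python
# Emit a DBIRTH before the first DDATA for each device (sketch of the
# lifecycle rule above; not MaestroHub's code).
def publish_device_data(device_id, metrics, birthed, publish):
    if device_id not in birthed:
        publish(f"DBIRTH/{device_id}", metrics)  # first data: announce device
        birthed.add(device_id)
    publish(f"DDATA/{device_id}", metrics)

sent = []
birthed = set()
publish = lambda topic, payload: sent.append(topic)
publish_device_data("sensor-01", {"temperature": 23.5}, birthed, publish)
publish_device_data("sensor-01", {"temperature": 23.6}, birthed, publish)
print(sent)  # ['DBIRTH/sensor-01', 'DDATA/sensor-01', 'DDATA/sensor-01']
```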
Birth-Death Sequence (bdSeq)
The bdSeq metric correlates NBIRTH and NDEATH messages. SCADA host applications use this to:
- Detect if they missed any death certificates
- Determine if the current birth certificate is the latest
- Properly handle reconnection scenarios
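The correlation check a host performs can be sketched like this: an NDEATH is only acted on if its bdSeq matches the bdSeq from the most recent NBIRTH, so a stale Last Will from an earlier session is ignored. Illustrative logic, not a specific SCADA product's implementation:

```python
# A Sparkplug host matches the bdSeq in an NDEATH against the bdSeq of
# the last NBIRTH it saw; a mismatch means the death certificate is
# from an older session and should be ignored (illustrative sketch).
def ndeath_is_current(last_nbirth_bdseq: int, ndeath_bdseq: int) -> bool:
    return last_nbirth_bdseq == ndeath_bdseq

# Current session: NBIRTH carried bdSeq 5, NDEATH carries bdSeq 5
# -> the node really went offline.
print(ndeath_is_current(5, 5))  # True
# Stale LWT from an older session (bdSeq 4) arrives after a reconnect
# -> discard it; the node is still online.
print(ndeath_is_current(5, 4))  # False
```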
Pipeline Integration
Use the SparkPlug B connection functions you create here as nodes inside the Pipeline Designer to publish industrial metrics to SCADA systems. Drag in the Publish Device Data node, bind its parameters to upstream node outputs or constants, and build event-driven flows for your industrial data.
If you are planning broader orchestration, review the Connector Nodes page for guidance on where SparkPlug B nodes fit within multi-system automation patterns.
Common Use Cases
SCADA Integration
Publish real-time process data from PLCs and sensors to SCADA systems that support SparkPlug B, enabling standardized data exchange without custom parsing.
Historian Connectivity
Stream time-series data to historians like Ignition, InfluxDB, or cloud-based solutions that consume SparkPlug B messages for long-term storage and analysis.
Edge-to-Cloud Telemetry
Bridge legacy industrial protocols (OPC UA, Modbus, Siemens S7) to cloud IoT platforms by combining reads from those protocols with SparkPlug B publish steps.
Multi-Site Data Aggregation
Use consistent Group ID naming across sites to aggregate data from multiple factories or buildings into a central SCADA or analytics platform.
Digital Twin Integration
Feed real-time equipment metrics to digital twin platforms that consume SparkPlug B messages, keeping virtual models synchronized with physical assets.