Version: 2.0-beta.1
Convert to File Node

Overview

  • Type: transform.data.serializer
  • Display Name: Convert to File
  • Category: transform
  • Execution: supportsExecution: true
  • I/O Handles:
    • Input: defaultIn (left), label: Input
    • Outputs: result (Success), error (Error)

Purpose: Serializes structured input into CSV or Excel payloads. Supports header control, flexible input mapping, boolean/null formatting, and output encoding for safe transport to file writers or connectors.
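As a minimal sketch of what this serialization step involves, assuming arrayOfObjects input (the `toCsv` and `serializePacket` helpers below are illustrative, not the node's actual API):

```typescript
// Illustrative sketch only: serialize object rows to CSV, base64-encode,
// and place the payload in the packet's output field.
type Row = Record<string, string | number | boolean | null>;

function toCsv(rows: Row[], delimiter = ","): string {
  const headers = Object.keys(rows[0] ?? {});
  const escape = (v: unknown): string => {
    const s = v === null ? "" : String(v);
    // Quote fields containing the delimiter, quotes, or newlines.
    return /[",\n]/.test(s) || s.includes(delimiter)
      ? `"${s.replace(/"/g, '""')}"`
      : s;
  };
  const lines = [
    headers.join(delimiter),
    ...rows.map(r => headers.map(h => escape(r[h])).join(delimiter)),
  ];
  return lines.join("\n");
}

// Hypothetical packet shape: input read from data.records, encoded
// payload written back to the default output field (data).
function serializePacket(packet: { data: { records: Row[] } }) {
  const csv = toCsv(packet.data.records);
  return { data: Buffer.from(csv, "utf8").toString("base64") };
}
```

Base64 encoding keeps the payload binary-safe when it is handed to a downstream file writer or connector.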


Configuration Reference

  • Format: CSV or Excel with format-specific options
  • Headers: header mode, provided headers, include/always/strict flags
  • Input / Output: input structure, mapping path, encoding and append mode
  • Data Processing: null/boolean formatting, nested flattening
  • Settings: retry behavior, on-error strategy, documentation

Format

| Field | Type | Default | Description |
| --- | --- | --- | --- |
| Format | enum | csv | Output file format: csv or excel. |
| Delimiter (CSV) | string | , | Field separator (comma, semicolon, tab, pipe). Must not be empty. |
| Line Ending (CSV) | enum | LF | Line terminator style: LF or CRLF. |
| Sheet Name (Excel) | string | Sheet1 | Required. Max 31 chars. Cannot include / \ ? * [ ]. |
| Auto Filter (Excel) | boolean | true | Adds an auto-filter to the header row. |
| Freeze Header (Excel) | boolean | true | Freezes panes at the header row. |
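The CSV options can be sketched as follows; `rowsToCsv` is a hypothetical helper, with option names borrowed from the table above:

```typescript
// Illustrative: apply the delimiter and lineEnding options to row data.
type LineEnding = "LF" | "CRLF";

function rowsToCsv(
  rows: (string | number)[][],
  opts: { delimiter: string; lineEnding: LineEnding }
): string {
  // Mirrors the validation rule: the delimiter must not be empty.
  if (opts.delimiter === "") throw new Error("delimiter must not be empty");
  const eol = opts.lineEnding === "CRLF" ? "\r\n" : "\n";
  return rows.map(r => r.join(opts.delimiter)).join(eol);
}
```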

Headers

| Field | Type | Default | Description |
| --- | --- | --- | --- |
| Header Mode | enum | auto | How to determine headers: auto, provided, fromFirstRow, or none. |
| Headers | string[] | [] | Required when headerMode === 'provided'. Ordered list of header names. |
| Include Headers | boolean | true | Writes the header row. |
| Always Include Headers | boolean | false | When Include Headers is enabled, forces a header row on every execution (useful for Excel overwrite). |
| Strict Headers | boolean | false | Validates data structure against established headers. |
Stateful header behavior

Headers are written once per execution session. Use the node's reset manual action to clear state and force a new header write when needed.
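This session behavior can be illustrated with a small stateful class; this is an assumption about the mechanism, not the node's internals:

```typescript
// Illustrative: headers are emitted once per session unless
// alwaysIncludeHeaders is set; reset() forces a new header write.
class HeaderSession {
  private headerWritten = false;

  constructor(
    private headers: string[],
    private alwaysIncludeHeaders = false
  ) {}

  /** Returns the rows to emit for one execution, with or without a header row. */
  emit(rows: string[][]): string[][] {
    const withHeader = this.alwaysIncludeHeaders || !this.headerWritten;
    this.headerWritten = true;
    return withHeader ? [this.headers, ...rows] : rows;
  }

  /** Mirrors the node's manual reset action. */
  reset(): void {
    this.headerWritten = false;
  }
}
```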

Input / Output

| Field | Type | Default | Description |
| --- | --- | --- | --- |
| Input Format | enum | auto | Expected structure: auto, arrayOfArrays, or arrayOfObjects. |
| First Row Is Header | boolean | false | When inputFormat === 'arrayOfArrays', treat the first row as headers. |
| Input Field | string | data | Dot path to the data to serialize (e.g., data.items.records). |
| Output Field | string | data | Where the encoded payload is placed in the output packet. |
| Encoding | enum | base64 | base64 or utf8. Excel is always base64 (binary); CSV supports both. |
| Append Mode | boolean | true | Accumulate data across executions (useful for streaming). |
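A sketch of how the dot-path mapping and output encoding might work; `resolvePath` and `encodePayload` are hypothetical helpers, and the path handling assumes plain dot segments (no array indices):

```typescript
// Illustrative: walk a dot path like "data.items.records" into a packet.
function resolvePath(obj: unknown, path: string): unknown {
  return path.split(".").reduce<any>((acc, key) => acc?.[key], obj);
}

// Illustrative: CSV may be utf8 or base64; Excel payloads are binary,
// so they would always take the base64 branch.
function encodePayload(csv: string, encoding: "base64" | "utf8"): string {
  return encoding === "base64"
    ? Buffer.from(csv, "utf8").toString("base64")
    : csv;
}
```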

Data Processing

| Field | Type | Default | Description |
| --- | --- | --- | --- |
| Null Value | string | "" | Text used for null values. |
| Boolean Format | enum | true/false | Boolean string format: true/false, TRUE/FALSE, or 1/0. |
| Flatten Nested | boolean | false | Reserved for future use; flattens nested objects when enabled. |
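The null and boolean formatting options can be sketched like this; option names follow the table, and the function itself is illustrative:

```typescript
// Illustrative: render one cell value using nullValue and booleanFormat.
type BooleanFormat = "true/false" | "TRUE/FALSE" | "1/0";

function formatCell(
  value: string | number | boolean | null,
  opts: { nullValue: string; booleanFormat: BooleanFormat }
): string {
  if (value === null) return opts.nullValue;
  if (typeof value === "boolean") {
    switch (opts.booleanFormat) {
      case "TRUE/FALSE": return value ? "TRUE" : "FALSE";
      case "1/0": return value ? "1" : "0";
      default: return value ? "true" : "false";
    }
  }
  return String(value);
}
```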

Settings

| Field | Type | Default | Description |
| --- | --- | --- | --- |
| Retry on Fail | boolean | false | Retry on transient errors. |
| On Error | enum | stopPipeline | One of: stopPipeline, continueExecution, retryNode. |
| Notes | string | "" | Documentation notes for collaborators. |
| Display Note in Pipeline | boolean | false | Show notes on the pipeline canvas. |
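One way to picture the retry and on-error semantics; this is a sketch under assumed defaults (e.g. a hypothetical `maxRetries` of 3), not the engine's actual behavior:

```typescript
// Illustrative executor: retry on failure, then apply the on-error strategy.
type OnError = "stopPipeline" | "continueExecution" | "retryNode";

function runNode<T>(
  fn: () => T,
  opts: { retryOnFail: boolean; onError: OnError; maxRetries?: number }
): { ok: boolean; value?: T; error?: string } {
  const attempts =
    opts.retryOnFail || opts.onError === "retryNode" ? (opts.maxRetries ?? 3) : 1;
  let lastError = "";
  for (let i = 0; i < attempts; i++) {
    try {
      return { ok: true, value: fn() };
    } catch (e) {
      lastError = String(e);
    }
  }
  if (opts.onError === "stopPipeline") throw new Error(lastError);
  // continueExecution: the failure is routed to the Error output handle.
  return { ok: false, error: lastError };
}
```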

Validation Rules

  • Node label: required (non-empty)
  • format: required; must be csv or excel
  • CSV: delimiter required and non-empty
  • Excel: sheetName required; must not contain / \ ? * [ ]
  • headerMode: must be one of auto|provided|fromFirstRow|none
  • When headerMode === 'provided': headers must be a non-empty array
  • inputField: required (non-empty)
  • outputField: required (non-empty)
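The rules above, collected into one hypothetical validation function; the config shape is inferred from the tables on this page, not taken from a schema:

```typescript
// Illustrative config shape, inferred from this page's tables.
interface SerializerConfig {
  label: string;
  format: "csv" | "excel";
  delimiter?: string;
  sheetName?: string;
  headerMode: "auto" | "provided" | "fromFirstRow" | "none";
  headers?: string[];
  inputField: string;
  outputField: string;
}

// Returns a list of validation errors; empty means the config is valid.
function validate(cfg: SerializerConfig): string[] {
  const errors: string[] = [];
  if (!cfg.label) errors.push("Node label is required");
  if (cfg.format === "csv" && !cfg.delimiter)
    errors.push("CSV delimiter must be non-empty");
  if (cfg.format === "excel") {
    if (!cfg.sheetName) errors.push("Excel sheetName is required");
    else if (cfg.sheetName.length > 31 || /[/\\?*[\]]/.test(cfg.sheetName))
      errors.push("Excel sheetName is too long or contains / \\ ? * [ ]");
  }
  if (cfg.headerMode === "provided" && !(cfg.headers && cfg.headers.length > 0))
    errors.push("headers must be a non-empty array when headerMode is 'provided'");
  if (!cfg.inputField) errors.push("inputField is required");
  if (!cfg.outputField) errors.push("outputField is required");
  return errors;
}
```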

Typical Integration

  • Upstream: Use parsing/processing nodes (e.g., CSV/Excel Processor) to produce structured data (arrayOfObjects preferred; arrayOfArrays supported).
  • Downstream:
    • For file output, wire into a file write function (e.g., Local File connector “Write” function).
    • When Local File “Fetch” is used earlier, the typical flow is: Fetch raw file → process → serialize → write file.

Examples

  • CSV output for a .csv file:

    • format: 'csv'
    • delimiter: ',', lineEnding: 'LF'
    • headerMode: 'auto', includeHeaders: true
    • inputFormat: 'arrayOfObjects'
    • inputField: 'data.records', outputField: 'data'
    • encoding: 'base64' (recommended for binary-safe transport)
    • appendMode: false for one-shot exports
  • Excel output for an .xlsx file:

    • format: 'excel'
    • sheetName: 'Summary', autoFilter: true, freezeHeader: true
    • headerMode: 'auto', includeHeaders: true, alwaysIncludeHeaders: true (useful for overwrite flows)
    • inputFormat: 'arrayOfObjects'
    • inputField: 'data.items', outputField: 'data'
    • encoding: 'base64' (fixed for Excel)

  • Serialize → Write to file:

    • Connect Serializer result → Local File write function node.
    • Set Local File functionConfig.data to the serializer’s outputField (default data).
    • For CSV: choose encoding per downstream; for Excel: base64 is automatic.
  • Using Local File Fetch upstream:

    • If Fetch is used to read source data, ensure processors convert it into the structure expected by Serializer:
      • arrayOfObjects (preferred), or
      • arrayOfArrays (enable firstRowIsHeader if first row holds headers).
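The arrayOfArrays case can be sketched as a small conversion step; this is illustrative, and the fallback column names are an assumption:

```typescript
// Illustrative: turn fetched rows into the arrayOfObjects structure the
// serializer prefers, honoring the firstRowIsHeader flag.
function rowsToObjects(
  rows: string[][],
  firstRowIsHeader: boolean
): Record<string, string>[] {
  const headers = firstRowIsHeader
    ? rows[0]
    : rows[0].map((_, i) => `column${i + 1}`); // fallback names (assumption)
  const body = firstRowIsHeader ? rows.slice(1) : rows;
  return body.map(r => Object.fromEntries(headers.map((h, i) => [h, r[i]])));
}
```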