Version: 2.1

Local File Nodes

Local File nodes allow pipelines to interact with files on the local filesystem. All file operations are restricted to a configured base directory for security. Use these nodes to read data files, write output files, and integrate file-based workflows with other pipeline operations.

Configuration Quick Reference

Local File Fetch node configuration


| Field | What you choose | Details |
| --- | --- | --- |
| Parameters | Connection, Function, Function Parameters, Timeout Override | Select the connection profile and function, configure function parameters (with expression support), and optionally override the timeout. |
| Settings | Description, Timeout (seconds), Retry on Timeout, Retry on Fail, On Error | Node description, maximum execution time, retry behavior on timeout or failure, and error handling strategy. All execution settings default to pipeline-level values. |

Local File Fetch Node

Read files from the local filesystem for processing in your pipeline.

Supported Function Types:

| Function Name | Purpose | Common Use Cases |
| --- | --- | --- |
| Fetch Local File | Read file content | CSV data import, configuration files, log analysis |

Node Configuration

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| Connection | Selection | Yes | Local File connection profile to use |
| Function | Selection | Yes | Fetch function from the selected connection |
| Function Parameters | Dynamic | Varies | Auto-populated from the function schema. See your Local File connection functions for full parameter details. |
| Timeout Override | Number (seconds) | No | Override the default function timeout |

All function parameters support expression syntax ({{ expression }}) for dynamic values from the pipeline context.

Input

The node receives the output of the previous node as input. Input data can be referenced in function parameter expressions using $input.
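To make the expression mechanics concrete, here is a minimal sketch of how a `{{ $input.… }}` placeholder could be resolved against the pipeline context. The `render_expression` helper and the context shape are hypothetical, for illustration only; the real engine supports a richer expression language (filters, literals, and more).

```python
import re

def render_expression(template, context):
    """Resolve {{ $input.some.path }} placeholders against a context dict.

    Hypothetical minimal evaluator for illustration; not the real engine.
    """
    def resolve(match):
        path = match.group(1).strip()          # e.g. "$input.payload.filename"
        root, *keys = path.lstrip("$").split(".")
        value = context[root]                  # "$input" -> context["input"]
        for key in keys:                       # walk the remaining path segments
            value = value[key]
        return str(value)
    return re.sub(r"\{\{\s*(.*?)\s*\}\}", resolve, template)

# Example: pull a filename from the previous node's output
context = {"input": {"payload": {"filename": "sales_data.csv"}}}
render_expression("{{ $input.payload.filename }}", context)
# -> "sales_data.csv"
```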

Output Structure

On success the node produces:

```json
{
  "success": true,
  "functionId": "<function-id>",
  "data": {
    "fileName": "sales_data.csv",
    "filePath": "/data/reports/sales_data.csv",
    "size": 2048,
    "data": "<file content as string or base64>",
    "encoding": "utf-8",
    "mimeType": "text/csv",
    "modifiedAt": "2026-01-15T08:30:00Z"
  },
  "durationMs": 42,
  "timestamp": "2026-01-15T08:30:00Z"
}
```
| Field | Type | Description |
| --- | --- | --- |
| success | boolean | `true` when the function executed without errors |
| functionId | string | ID of the executed function |
| data | object | File content and metadata (see below) |
| durationMs | number | Execution time in milliseconds |
| timestamp | string | ISO 8601 / RFC 3339 UTC timestamp |

File Data Fields

| Field | Type | Description |
| --- | --- | --- |
| fileName | string | Name of the file that was read |
| filePath | string | Full path to the file (within base path) |
| size | number | File size in bytes |
| data | string | File content (text or base64-encoded) |
| encoding | string | Content encoding used |
| mimeType | string | Detected MIME type of the file |
| modifiedAt | string | File last modification timestamp |
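A downstream step typically checks the envelope's `success` flag, then reads `data.data`, decoding it if it arrived base64-encoded. The sketch below assumes base64 payloads are flagged via `encoding == "base64"`; the `decode_file_data` helper is hypothetical, so verify the flagging convention against your connection's actual behavior.

```python
import base64

def decode_file_data(result):
    """Return a fetch result's file content as text (illustrative helper)."""
    if not result.get("success"):
        raise RuntimeError(f"fetch failed for function {result.get('functionId')}")
    file_data = result["data"]
    # Assumption: base64 payloads are marked with encoding == "base64"
    if file_data.get("encoding") == "base64":
        return base64.b64decode(file_data["data"]).decode("utf-8")
    return file_data["data"]

result = {
    "success": True,
    "functionId": "fetch-csv",
    "data": {"fileName": "sales_data.csv", "data": "id,amount\n1,100\n", "encoding": "utf-8"},
}
decode_file_data(result)  # -> "id,amount\n1,100\n"
```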

Local File Write Node

Write content to files on the local filesystem.

Supported Function Types:

| Function Name | Purpose | Common Use Cases |
| --- | --- | --- |
| Write to Local File | Write file content | Export reports, save processed data, create log files |

Node Configuration

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| Connection | Selection | Yes | Local File connection profile to use |
| Function | Selection | Yes | Write function from the selected connection |
| Function Parameters | Dynamic | Varies | Auto-populated from the function schema (e.g., fileName, data). See your Local File connection functions for full parameter details. |
| Timeout Override | Number (seconds) | No | Override the default function timeout |

All function parameters support expression syntax ({{ expression }}) for dynamic values.

Input

The node receives the output of the previous node as input. Use expressions like {{ $input.payload.content }} to pass dynamic values to write parameters.

Output Structure

The write node uses the same output envelope as the fetch node:

```json
{
  "success": true,
  "functionId": "<function-id>",
  "data": {
    "fileName": "output.csv",
    "filePath": "/data/exports/output.csv",
    "bytesWritten": 1024,
    "created": true,
    "appended": false
  },
  "durationMs": 15,
  "timestamp": "2026-01-15T08:30:00Z"
}
```

Write Result Fields

| Field | Type | Description |
| --- | --- | --- |
| fileName | string | Name of the file that was written |
| filePath | string | Full path to the written file |
| bytesWritten | number | Number of bytes written |
| created | boolean | Whether the file was newly created |
| appended | boolean | Whether content was appended to an existing file |
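The `created` and `appended` flags let a later step distinguish the three write outcomes: new file, appended content, or overwrite. A small hypothetical helper to illustrate:

```python
def summarize_write(result):
    """Describe a write node's outcome from its result fields (sketch only)."""
    data = result["data"]
    if data["created"]:
        action = "created"
    elif data["appended"]:
        action = "appended to"
    else:
        action = "overwrote"
    return f"{action} {data['filePath']} ({data['bytesWritten']} bytes)"

summarize_write({
    "success": True,
    "data": {"filePath": "/data/exports/output.csv",
             "bytesWritten": 1024, "created": True, "appended": False},
})
# -> "created /data/exports/output.csv (1024 bytes)"
```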

Common Use Cases

Reading CSV Data for Processing

  1. Configure a Local File connection with base path pointing to your data directory
  2. Create a Fetch Local File function targeting your CSV file
  3. Add a Local File Fetch node to your pipeline
  4. Connect the output to a File Extractor node to parse the CSV into JSON
[Trigger] → [Local File Fetch] → [File Extractor] → [Process Data]

Writing Pipeline Results to File

  1. Configure a Local File connection with base path pointing to your output directory
  2. Create a Write to Local File function with parameterized filename and data fields
  3. Add a Local File Write node after your data processing nodes
  4. Use expressions to pass the processed data and dynamic filename
[Fetch Data] → [Transform] → [Local File Write]

Parameterized File Operations

Use parameter placeholders in your functions to create dynamic file paths:

| Parameter Pattern | Example | Use Case |
| --- | --- | --- |
| ((fileName)).csv | sales.csv | Simple parameterized filename |
| ((date))/report.csv | 2026-01-15/report.csv | Date-based folder structure |
| ((region))/((year))/data.json | us-west/2026/data.json | Multi-level parameterization |

Pass parameter values using expressions in the node configuration:

```
fileName: {{ $trigger.payload.filename }}
date: {{ $execution.startedAt | date: 'YYYY-MM-DD' }}
```
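Conceptually, the `((name))` placeholders are simple string substitution over the path pattern. This sketch (with a hypothetical `fill_path` helper) shows the idea; the engine's own substitution may differ in details such as escaping and error handling.

```python
import re

def fill_path(pattern, params):
    """Substitute ((name)) placeholders in a file path pattern (sketch)."""
    def resolve(match):
        return str(params[match.group(1)])   # KeyError if a parameter is missing
    return re.sub(r"\(\((\w+)\)\)", resolve, pattern)

fill_path("((region))/((year))/data.json", {"region": "us-west", "year": 2026})
# -> "us-west/2026/data.json"
```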

Settings Tab

Both Local File node types share the same Settings tab:

| Setting | Type | Default | Description |
| --- | --- | --- | --- |
| Description | Text | | Optional description displayed on the node |
| Timeout (seconds) | Number | Pipeline default | Maximum time the node may run before timing out |
| Retry on Timeout | Toggle | Pipeline default | Automatically retry the node if it times out |
| Retry on Fail | Toggle | Pipeline default | Automatically retry the node if it fails |
| On Error | Selection | Pipeline default | Error strategy: stop the pipeline, continue to the next node, or follow the error output path |

When left at their defaults, these settings inherit from the pipeline-level execution configuration.


Integration with File Extractor

The Local File Fetch node is commonly used with the File Extractor node to parse CSV and Excel files into structured JSON data:

  1. Local File Fetch reads the raw file content
  2. File Extractor parses the content into rows and columns

Ensure the File Extractor's inputField matches the Local File Fetch output path (default: data.data).
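An input field like `data.data` is a dotted path into the fetch node's output envelope (the outer `data` object, then its inner `data` string). A minimal sketch of that lookup, using a hypothetical `get_by_path` helper:

```python
def get_by_path(obj, dotted_path):
    """Resolve a dotted field path like "data.data" against a node's output."""
    value = obj
    for key in dotted_path.split("."):
        value = value[key]
    return value

fetch_output = {
    "success": True,
    "data": {"fileName": "sales.csv", "data": "id,amount\n1,100\n"},
}
get_by_path(fetch_output, "data.data")  # -> the raw CSV string
```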


Best Practices

  • Use specific file paths rather than regex patterns when possible for better performance
  • Set appropriate timeouts for large files or slow storage
  • Validate file existence before write operations that should not overwrite
  • Use parameterized paths to create organized, date-based folder structures
  • Configure size limits in the connection to prevent memory issues with large files

Security

All file operations are restricted to the base path configured in the connection. Symlinks are disabled by default to prevent access outside the base directory.
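One common way to enforce such a restriction is to canonicalize every requested path and reject anything that resolves outside the base directory; canonicalization also collapses symlinks, which blocks symlink escapes. The sketch below (with a hypothetical `resolve_within_base` helper) illustrates the pattern and is not the engine's actual implementation.

```python
import os

def resolve_within_base(base_dir, requested_path):
    """Resolve a requested path, rejecting anything outside base_dir (sketch).

    realpath() canonicalizes ".." segments and follows symlinks, so a
    symlink pointing outside the base directory is also rejected.
    """
    base = os.path.realpath(base_dir)
    target = os.path.realpath(os.path.join(base, requested_path))
    if os.path.commonpath([base, target]) != base:
        raise PermissionError(f"path escapes base directory: {requested_path}")
    return target
```

For example, with a base path of `/data`, `reports/sales.csv` resolves normally, while `../etc/passwd` raises `PermissionError`.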