Parse JSON and JSON Lines files uploaded to Flatfile and extract data into Sheets within a Workbook
This plugin parses JSON (`.json`) and JSON Lines (`.jsonl`, `.jsonlines`) files uploaded to Flatfile. It automatically detects the file type and extracts the data into one or more Sheets within a Flatfile Workbook.
The primary use case is to allow users to upload structured JSON data and have it seamlessly transformed into a tabular format for review and import. The plugin can handle two main JSON structures: flat objects and nested objects. Nested objects are flattened into dot-delimited column names (for example, `{ "address": { "city": "..." } }` becomes a column named `Address.City`). This plugin is intended to be used in a server-side listener.
The `JSONExtractor` function accepts an optional options object with the following properties:

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `chunkSize` | `number` | `10000` | Controls the number of records to process in each batch or “chunk” when inserting data into Flatfile. |
| `parallel` | `number` | `1` | Controls how many chunks are processed concurrently. Increasing this can speed up the import of very large files but may use more server resources. |
| `debug` | `boolean` | `false` | If set to `true`, enables more verbose logging for debugging purposes. |
You can also use the `jsonParser` function directly to parse a JSON file buffer outside of the Flatfile listener context.
`JSONExtractor(options?)` listens for `.json`, `.jsonl`, and `.jsonlines` file uploads, parses them, and creates the corresponding Sheets and Records in Flatfile.
Parameters:
- `options` (optional): `PluginOptions`
  - `chunkSize` (number): Records per chunk. Default: `10000`
  - `parallel` (number): Number of chunks to process in parallel. Default: `1`
  - `debug` (boolean): Enable verbose logging. Default: `false`

Returns: An `Extractor` instance, which is a type of listener middleware.
Usage Example:
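A minimal sketch of registering the extractor in a server-side listener. The import paths are assumptions (the plugin is presumed to be exported as `JSONExtractor` from `@flatfile/plugin-json-extractor`), and the option values shown are simply the documented defaults:

```typescript
import { FlatfileListener } from "@flatfile/listener";
// Assumed package name for this plugin
import { JSONExtractor } from "@flatfile/plugin-json-extractor";

export default function (listener: FlatfileListener) {
  // Register the extractor; the options shown are the documented defaults
  listener.use(
    JSONExtractor({
      chunkSize: 10000,
      parallel: 1,
      debug: false,
    })
  );
}
```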
The plugin is built on the `@flatfile/util-extractor` utility, which automatically handles job lifecycle events. If the parser throws an error (e.g., due to malformed JSON), the utility will catch it and fail the corresponding job in Flatfile, providing feedback to the user in the UI.
`jsonParser(buffer, options?)` parses a raw file buffer and returns a `WorkbookCapture` object. This is useful for testing or for use cases where parsing is needed without the full listener integration. It handles both standard JSON and JSONL formats.
Parameters:
- `buffer`: `Buffer` - A Node.js Buffer containing the raw file content
- `options` (optional): `{ readonly fileExt?: string }`
  - `fileExt` (string): Can be set to `'jsonl'` to explicitly trigger JSON Lines parsing logic, even if the file name is not available

Returns: A `WorkbookCapture` object, which has sheet names as keys and `SheetCapture` objects as values.
Example: `{ "MySheet": { headers: ["id", "name"], data: [{ id: {value: 1}, name: {value: "Test"} }] } }`
Usage Example:
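A minimal sketch of calling the parser directly, outside of a listener. The import path is an assumption and the file name is purely illustrative:

```typescript
import * as fs from "fs";
// Assumed export location for the parser
import { jsonParser } from "@flatfile/plugin-json-extractor";

// Read a JSON Lines file from disk (hypothetical path)
const buffer = fs.readFileSync("contacts.jsonl");

// Pass fileExt to force JSON Lines parsing when the file name is not available
const workbook = jsonParser(buffer, { fileExt: "jsonl" });

// Sheet names are the keys of the returned WorkbookCapture
console.log(Object.keys(workbook));
```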
Troubleshooting:
- A malformed JSON file will cause the parser to throw a `SyntaxError` and the extraction job will fail.
- If a file is not being extracted, confirm that it has a `.json` or `.jsonl` extension and that the listener is correctly configured with `listener.use(JSONExtractor())`.
- Supported file extensions are `.json`, `.jsonl`, and `.jsonlines`.
- Nested objects are flattened using dot notation; for example, `{ "user": { "name": "John" } }` will result in a column named `user.name`.
- The `parseBuffer` function is wrapped in a try/catch block. On a fatal parsing error (like malformed JSON), it re-throws the error, which is then handled by the extractor utility to fail the job.
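For illustration, here is a minimal sketch of the try/catch pattern described above, using assumed type and helper names rather than the plugin's actual internals (it handles only a top-level array of flat records):

```typescript
// Assumed shapes, mirroring the WorkbookCapture example shown earlier
type SheetCapture = {
  headers: string[];
  data: Record<string, { value: unknown }>[];
};
type WorkbookCapture = Record<string, SheetCapture>;

function parseBuffer(buffer: Buffer): WorkbookCapture {
  try {
    // Parse a top-level array of flat records into a single sheet
    const rows = JSON.parse(buffer.toString("utf8")) as Record<string, unknown>[];
    const headers = rows.length > 0 ? Object.keys(rows[0]) : [];
    const data = rows.map((row) =>
      Object.fromEntries(headers.map((key) => [key, { value: row[key] }]))
    );
    return { Sheet1: { headers, data } };
  } catch (error) {
    // Fatal parse errors (e.g. malformed JSON) are logged and re-thrown
    // so the extractor utility can mark the Flatfile job as failed
    console.error("Failed to parse JSON buffer:", error);
    throw error;
  }
}
```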