
Conversation


@iokinpardo iokinpardo commented Oct 22, 2025

🧩 Enhanced Controlled Export/Import Process with Intelligent Conflict Management

📄 Description

This PR introduces a new controlled Export/Import process for Flowise workspaces, allowing users to precisely manage which elements to import, duplicate, or update.
The goal is to provide greater control and transparency while maintaining data isolation and credential security across workspaces.

This functionality has been tested and runs correctly across all exportable elements.


✅ Key Features

1. Granular Import Control

  • Users can decide per item whether to:
    • Update the existing element (replace content)
    • Duplicate it (create a new UUID with a name suffix, e.g., -duplicate)
    • Skip import altogether
  • Conflict checking is now based on item names, not UUIDs, ensuring workspace isolation and independence.
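For illustration, a minimal sketch (not the exact server types) of the per-item decisions the UI sends alongside the import payload. Field names follow the ConflictResolution type quoted later in this review; the 'skip' value is the option described above and is discussed further down, since the backend type currently only lists update and duplicate.

// Hypothetical shape of the per-item decisions attached to the import request.
type ConflictResolutionAction = 'update' | 'duplicate' | 'skip'

interface ConflictResolution {
    type: string        // entity key, e.g. 'ChatFlow' or 'Tool'
    importId: string    // id of the item inside the uploaded file
    existingId: string  // id of the matching item (matched by name) in the target workspace
    action: ConflictResolutionAction
}

// Example: update one chatflow in place, duplicate a tool, skip a variable.
const conflictResolutions: ConflictResolution[] = [
    { type: 'ChatFlow', importId: 'import-uuid-1', existingId: 'existing-uuid-1', action: 'update' },
    { type: 'Tool', importId: 'import-uuid-2', existingId: 'existing-uuid-2', action: 'duplicate' },
    { type: 'Variable', importId: 'import-uuid-3', existingId: 'existing-uuid-3', action: 'skip' }
]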

2. Supported Elements

The controlled import process is validated for the following exportable elements:

AgentFlow, AgentFlowV2, AssistantFlow, AssistantCustom,
AssistantOpenAI, AssistantAzure, ChatFlow, CustomTemplate,
DocumentStore, Tool, Variable

Items of the following types always appear as “New Items” on the second tab (the user decides whether to import them):

Executions, ChatMessages, ChatMessageFeedbacks, DocumentStoreFileChunks


🔐 Credential Handling & Security

  • Exported data does not include secret credential values; only credential name references are included.
  • During import:
    • If credentials exist on the target instance, they are automatically reconnected.
    • If credentials are missing, fields remain empty (safe default).
  • This ensures:
    • No exposure of sensitive data (UUIDs or secret values)
    • Automatic reconnection for smooth post-import operation
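As a hedged sketch of what gets written to the export file, the credential binding entries look roughly like this (field names taken from docs/credential-export-options.md in this PR; the concrete values are illustrative). No credential UUIDs or secret values appear in the file.

interface CredentialBinding {
    nodeId: string          // node that referenced the credential
    inputPath: string       // where in the node inputs the credential was bound
    credentialName: string  // human-readable name used to re-match on import
    credentialType: string  // credential type used to re-match on import
}

const exampleBinding: CredentialBinding = {
    nodeId: 'chatOpenAI_0',          // illustrative node id
    inputPath: 'credential',         // illustrative input path
    credentialName: 'My OpenAI key', // illustrative name
    credentialType: 'openAIApi'      // illustrative type
}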

🧠 Implementation Details

  • Conflict management uses element names as keys for comparison.
  • The system displays conflicts in a dedicated “Conflict Tab”, where users can:
    • Choose to update, duplicate, or skip each conflicting element.
  • No automatic imports — user confirmation is required for every item.
  • The export process now includes credential name mapping without exposing secrets.
  • Fully compatible and tested within a single workspace setup.
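A minimal sketch of the name-based matching described above, assuming each exportable entity type is compared independently (this is not the actual service code): an imported item conflicts when an element with the same name already exists in the destination workspace.

type NamedItem = { id: string; name: string }

const findConflicts = (type: string, imported: NamedItem[], existingInWorkspace: NamedItem[]) => {
    // Index existing workspace items by name for O(1) lookups.
    const byName = new Map<string, NamedItem>()
    for (const existing of existingInWorkspace) byName.set(existing.name, existing)

    // Any imported item whose name already exists is reported as a conflict.
    return imported
        .filter((item) => byName.has(item.name))
        .map((item) => ({
            type,
            importId: item.id,
            existingId: byName.get(item.name)!.id,
            name: item.name
        }))
}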

⚠️ Security Considerations

Enterprise Context

  • Importing data into another workspace creates new UUIDs, ensuring full isolation.
  • The original workspace data cannot be modified from another workspace.

SaaS Context

  • Since matching is name-based, not UUID-based, data conflicts cannot expose other users’ data.
  • The Conflict Tab guarantees user approval before overwriting or duplicating data.

🧭 Next Steps

  • Validate under multi-workspace environments
  • Add optional “Sanitize Sensitive Info” toggle for enterprise/SaaS export contexts
  • Finalize UX refinements for the conflict resolution interface

Summary by CodeRabbit

  • New Features

    • Import preview endpoint and API to analyze payloads and surface conflicts before importing
    • Import review UI/dialog with per-item and bulk conflict resolution controls (duplicate/update) and cloning helpers
  • Enhancements

    • Automatic credential mapping for workspace export/import, preserving and restoring credential bindings where possible
    • Improved import flow with detailed conflict view, item selection, and summary reporting
  • Documentation

    • Added documentation describing credential export/import flow and limitations
  • Tests

    • New tests and comprehensive mocks covering export/import conflict handling
  • Chores

    • Test config updates to support new mocks and test environment

…conflict-resolution

Add import conflict preview and update handling
…conflict-resolution-1izsy9

Fix assistant conflict type mapping
…ployment-en-render.com

Fix export-import assistant type mapping duplication
…ise-repository

Enhance import conflict review dialog selection controls
…-capabilities-in-agentflow-v2

Add credential binding metadata to workspace exports
…iew-and-selection

feat: improve import review with tabs and summary
…ore-display-in-tabs

Fix duplicate document store sections when switching tabs
…e-to-unchecked

feat(ui): add accordion sections to import review dialog
…abled-item

Improve import review visibility and final summary
…onflicts-tab

Fix conflict selection keys and accordion support in import review
…during-import-process

Fix import review dialog showing conflicts as new items
…-migration-issue

Fix chatflow import credential restoration
…icate-items

Adjust duplicate import handling for imports
@chungyau97 added the "enhancement" (New feature or request) label on Oct 29, 2025
@HenryHengZJ (Contributor) commented:

@coderabbitai review


coderabbitai bot commented Oct 29, 2025

✅ Actions performed

Review triggered.

Note: CodeRabbit is an incremental review system and does not re-review already reviewed commits. This command is applicable only when automatic reviews are paused.


coderabbitai bot commented Oct 29, 2025

Walkthrough

Adds credentialBindings export/import support with preview endpoint, backend conflict-detection/resolution types and logic, UI import-review workflow (preview + decisions), workspace export propagation of credentialBindings, new tests and mocks, and README and test-config updates.

Changes

Cohort / File(s) Summary
Documentation
docs/credential-export-options.md,
README.md
New doc describing credentialBindings export/import format and reconstruction rules; README adds "Conflict-aware import filtering" feature docs and usage example.
Backend — Controllers & Routes
packages/server/src/controllers/export-import/index.ts,
packages/server/src/routes/export-import/index.ts
Added previewImportData controller and POST /export-import/preview route (requires workspace:import); controller default export extended to include preview handler.
Backend — Service Logic
packages/server/src/services/export-import/index.ts
Large additions: conflict/import types and ImportPayload, previewImportData, credential-binding collection/enrichment/reinstatement (formatting, path resolution), import orchestration with conflictResolutions, ID remapping with skip-sets, exportData now enriches chatflows with credential metadata, and workspace-scoped checks and logging. Public API updated to export previewImportData.
UI — API Layer
packages/ui/src/api/exportimport.js
Added previewImportData API method posting to /export-import/preview and included it in default export.
UI — Components / Import Flow
packages/ui/src/layout/MainLayout/Header/ProfileSection/index.jsx
Added ImportReviewDialog, extended ImportDialog props (status, summary, onClose), wired preview/import flow and conflict UI, per-item decisions (duplicate/update), bulk actions, clone-name helpers, selection state, and API/snackbar handling.
UI — Utilities
packages/ui/src/utils/genericHelper.js
Export now preserves flowData.credentialBindings into exported JSON when present (shallow copy).
Tests, Mocks & Test Config
packages/server/jest.config.js,
packages/server/test/__mocks__/flowise-components.ts,
packages/server/test/index.test.ts,
packages/server/test/services/export-import.service.test.ts
Jest config adjustments (ts-jest globals, moduleNameMapper), new broad mock for flowise-components, test harness update to run new service test, and new test exercising import conflict-resolution (skipping child recreations when parent updated).

Sequence Diagram(s)

sequenceDiagram
    participant User
    participant UI as Browser UI
    participant API as UI API
    participant Controller as ExportImport Controller
    participant Service as ExportImport Service
    participant DB as Database

    rect rgb(230,240,255)
    Note over User,Service: Export (credential metadata enrichment)
    User->>UI: Click "Export workspace"
    UI->>API: POST /export (exportData)
    API->>Controller: exportData request
    Controller->>Service: exportData()
    Service->>DB: fetch entities
    Service->>Service: enrichChatflowsWithCredentialMetadata()
    Service-->>Controller: ExportData (includes credentialBindings)
    Controller-->>API: return payload
    API-->>UI: download file
    end

    rect rgb(230,255,235)
    Note over User,Service: Preview Import
    User->>UI: Upload import file
    UI->>API: POST /export-import/preview (previewImportData)
    API->>Controller: previewImportData
    Controller->>Service: previewImportData()
    Service->>DB: scan destination workspace for conflicts
    Service-->>Controller: ImportPreview { conflicts[] }
    Controller-->>API: return conflicts
    API-->>UI: show ImportReviewDialog
    end

    rect rgb(255,245,230)
    Note over User,Service: Import with resolutions
    User->>UI: Confirm resolutions
    UI->>API: POST /export-import/import (importData with conflictResolutions)
    API->>Controller: importData
    Controller->>Service: importData(ImportPayload)
    Service->>Service: apply conflictResolutions & id remapping
    Service->>Service: reinstateCredentialBindingsOnImport()
    Service->>DB: insert/update entities (skip sets honored)
    DB-->>Service: persist results
    Service-->>Controller: import summary
    Controller-->>API: return summary
    API-->>UI: show success summary
    end
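To make the preview step in the diagram above concrete, here is a rough usage sketch of the new endpoint. The route is POST /export-import/preview per the walkthrough; the /api/v1 prefix, headers, and auth scheme are assumptions about a typical deployment, and the UI itself goes through packages/ui/src/api/exportimport.js rather than raw fetch.

async function previewImport(baseUrl: string, token: string, importFileJson: unknown) {
    // Assumed path prefix and bearer auth; adjust to the deployment's actual setup.
    const response = await fetch(`${baseUrl}/api/v1/export-import/preview`, {
        method: 'POST',
        headers: {
            'Content-Type': 'application/json',
            Authorization: `Bearer ${token}`
        },
        body: JSON.stringify(importFileJson)
    })
    if (!response.ok) throw new Error(`Preview failed: ${response.status}`)
    // Expected to contain the conflicts[] that ImportReviewDialog renders.
    return response.json()
}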

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~60 minutes

  • Review focus:
    • packages/server/src/services/export-import/index.ts — conflict-detection types, previewImportData, credential binding path resolution, reintegration, and ID remapping/skip-set logic.
    • packages/ui/src/layout/MainLayout/Header/ProfileSection/index.jsx — ImportReviewDialog state/decision handling, payload construction, and edge-case behaviors (parent-child mappings, clone suffix).
    • Tests & mocks: packages/server/test/services/export-import.service.test.ts, packages/server/test/__mocks__/flowise-components.ts, and packages/server/jest.config.js for test correctness and isolation.

Poem

🐰 I found some bindings tucked inside a flow,
I sniffed for names and types where imports go,
I hop and map each node to find a match,
I nudge the ids so nothing breaks the patch,
A little rabbit keeps your flows aglow. 🥕✨

Pre-merge checks and finishing touches

✅ Passed checks (2 passed)
  • Description Check: ✅ Passed (check skipped because CodeRabbit’s high-level summary is enabled)
  • Title Check: ✅ Passed. The pull request title 'Chore: Enhanced Controlled Export/Import Process with Intelligent Conflict Management' directly describes the main feature being introduced: an enhanced export/import process with conflict management capabilities. This aligns with the substantial changes across the codebase, including new conflict detection types, a preview endpoint, UI components for conflict review, and supporting documentation. The title is specific and clearly conveys the primary objective without being vague or misleading.

Comment @coderabbitai help to get the list of available commands and usage tips.

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 4

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (3)
packages/server/src/services/export-import/index.ts (3)

522-525: Wrong flag gating AssistantFlow export; add a dedicated assistantFlow boolean.

AssistantFlow export is currently controlled by exportInput.assistantCustom (Assistant entity) instead of its own flag. This will export AssistantFlow chatflows when the user only asked for AssistantCustom entities.

Fix by introducing assistantFlow (or assistant_flow on the wire) and using it here.

Apply:

-        let AssistantFlow: ChatFlow[] | { data: ChatFlow[]; total: number } =
-            exportInput.assistantCustom === true ? await chatflowService.getAllChatflows('ASSISTANT', activeWorkspaceId) : []
+        let AssistantFlow: ChatFlow[] | { data: ChatFlow[]; total: number } =
+            exportInput.assistantFlow === true ? await chatflowService.getAllChatflows('ASSISTANT', activeWorkspaceId) : []

And extend ExportInput plus convertExportInput (see separate comments).


481-506: Update convertExportInput to reflect new inputs and normalize assistant_flow → assistantFlow.

This validator still checks a legacy assistant flag and doesn’t validate assistantCustom, assistantOpenAI, assistantAzure, or the missing assistantFlow. It may silently accept malformed payloads.

Apply:

 const convertExportInput = (body: any): ExportInput => {
   try {
     if (!body || typeof body !== 'object') throw new Error('Invalid ExportInput object in request body')
     if (body.agentflow && typeof body.agentflow !== 'boolean') throw new Error('Invalid agentflow property in ExportInput object')
     if (body.agentflowv2 && typeof body.agentflowv2 !== 'boolean') throw new Error('Invalid agentflowv2 property in ExportInput object')
-    if (body.assistant && typeof body.assistant !== 'boolean') throw new Error('Invalid assistant property in ExportInput object')
+    if (body.assistant_flow && typeof body.assistant_flow !== 'boolean')
+      throw new Error('Invalid assistant_flow property in ExportInput object')
+    if (body.assistantFlow && typeof body.assistantFlow !== 'boolean')
+      throw new Error('Invalid assistantFlow property in ExportInput object')
+    if (body.assistantCustom && typeof body.assistantCustom !== 'boolean')
+      throw new Error('Invalid assistantCustom property in ExportInput object')
+    if (body.assistantOpenAI && typeof body.assistantOpenAI !== 'boolean')
+      throw new Error('Invalid assistantOpenAI property in ExportInput object')
+    if (body.assistantAzure && typeof body.assistantAzure !== 'boolean')
+      throw new Error('Invalid assistantAzure property in ExportInput object')
     if (body.chatflow && typeof body.chatflow !== 'boolean') throw new Error('Invalid chatflow property in ExportInput object')
     if (body.chat_message && typeof body.chat_message !== 'boolean')
       throw new Error('Invalid chat_message property in ExportInput object')
     if (body.chat_feedback && typeof body.chat_feedback !== 'boolean')
       throw new Error('Invalid chat_feedback property in ExportInput object')
     if (body.custom_template && typeof body.custom_template !== 'boolean')
       throw new Error('Invalid custom_template property in ExportInput object')
     if (body.document_store && typeof body.document_store !== 'boolean')
       throw new Error('Invalid document_store property in ExportInput object')
     if (body.execution && typeof body.execution !== 'boolean') throw new Error('Invalid execution property in ExportInput object')
     if (body.tool && typeof body.tool !== 'boolean') throw new Error('Invalid tool property in ExportInput object')
     if (body.variable && typeof body.variable !== 'boolean') throw new Error('Invalid variable property in ExportInput object')
-    return body as ExportInput
+    // normalize wire -> internal
+    const normalized = {
+      ...body,
+      assistantFlow: body.assistantFlow ?? body.assistant_flow ?? false
+    }
+    return normalized as ExportInput
   } catch (error) {
     throw new InternalFlowiseError(
       StatusCodes.INTERNAL_SERVER_ERROR,
       `Error: exportImportService.convertExportInput - ${getErrorMessage(error)}`
     )
   }
 }

832-839: Impossible checks: records.length < 0 can never be true. Use === 0.

These early-exit guards don’t work. Replace with if (records.length === 0) return originalData to avoid unnecessary processing.

Apply:

-        if (records.length < 0) return originalData
+        if (records.length === 0) return originalData

Occurrences in: ChatFlow, Assistant, ChatMessage, ChatMessageFeedback, CustomTemplate, DocumentStore, Tool, Variable, Execution, DocumentStoreFileChunk.

Also applies to: 859-866, 916-922, 1082-1093, 1113-1120, 1140-1147, 1196-1203, 1225-1232, 1247-1253, 1166-1176

🧹 Nitpick comments (8)
packages/server/src/controllers/export-import/index.ts (1)

48-71: Avoid name shadowing: rename local importData → importPayload.

Shadowing the module-level importData harms readability and tooling.

Apply:

-        const importData = req.body
-        if (!importData) {
+        const importPayload = req.body
+        if (!importPayload) {
             throw new InternalFlowiseError(
                 StatusCodes.BAD_REQUEST,
                 'Error: exportImportController.previewImportData - importData is required!'
             )
         }
 
-        const preview = await exportImportService.previewImportData(importData, workspaceId)
+        const preview = await exportImportService.previewImportData(importPayload, workspaceId)
packages/ui/src/layout/MainLayout/Header/ProfileSection/index.jsx (1)

1511-1539: Verify “new items” detection uses the same type normalization as conflicts.

Comparing Set(conflict keys) with keys from pendingImportPayload assumes exact key equality. Any mismatch classifies conflicted items as “new”, confusing users.

Normalize types when building newItems:

-        Object.entries(pendingImportPayload).forEach(([type, items]) => {
+        const normalize = (t) => t?.trim() || ''
+        Object.entries(pendingImportPayload).forEach(([type, items]) => {
             if (!Array.isArray(items)) return
             items.forEach((item) => {
                 if (!item || !item.id) return
-                const key = getImportItemKey({ type, importId: item.id })
+                const key = getImportItemKey({ type: normalize(type), importId: item.id })
                 if (conflictKeys.has(key)) return
                 newItems.push({
-                    type,
+                    type: normalize(type),
                     importId: item.id,
                     name: getImportItemName(type, item)
                 })
             })
         })

Optionally, build and reuse the same buildTypeKey helper from handleConfirmImport. Would you like me to open a follow‑up patch?

packages/server/src/services/export-import/index.ts (4)

1547-1554: Batch ChatMessageFeedback inserts to avoid parameter limits/timeouts.

ChatMessage and DocumentStoreFileChunk use saveBatch; ChatMessageFeedback saves the whole array in a single call. Large imports can exceed DB parameter limits.

Apply:

-            if (importData.ChatMessageFeedback.length > 0)
-                await queryRunner.manager.save(ChatMessageFeedback, importData.ChatMessageFeedback)
+            if (importData.ChatMessageFeedback.length > 0)
+                await saveBatch(queryRunner.manager, ChatMessageFeedback, importData.ChatMessageFeedback)

934-1093: Dedup by messageId may drop legitimate multiple feedback entries. Confirm cardinality.

This block enforces one ChatMessageFeedback per messageId globally. If multiple feedback entries per message are valid (e.g., different users or categories), this will silently discard data.

If 1:1 is intended, add a unique DB constraint to enforce it. If not, dedup by (messageId, userId, type) or keep all.
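As a concrete illustration of the second option, a hedged sketch of composite-key dedup (field names are assumptions drawn from the tuple named above, not the entity's actual columns):

type FeedbackLike = { messageId: string; userId?: string; type?: string }

// Keep one feedback entry per (messageId, userId, type) rather than per messageId only.
const dedupFeedback = <T extends FeedbackLike>(entries: T[]): T[] => {
    const seen = new Set<string>()
    return entries.filter((entry) => {
        const key = `${entry.messageId}|${entry.userId ?? ''}|${entry.type ?? ''}`
        if (seen.has(key)) return false
        seen.add(key)
        return true
    })
}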


1318-1326: Global JSON replaceAll for ID remaps can touch unintended strings; consider structured remap.

serialized.replaceAll(oldId, newId) runs across every property/value, including free text. With UUIDs risk is low but not zero; it also has O(n*m) cost.

Consider iterating entity arrays and remapping only id and known FK fields instead of whole-JSON replacement. If keeping string replace, at least match quoted tokens (" + id + ") to reduce false positives.
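A minimal sketch of the quoted-token variant suggested here (an illustrative helper, not the service's code): only occurrences of the old id that appear as a full JSON string value are replaced, leaving ids embedded in free text untouched.

const remapQuotedIds = (serialized: string, idRemap: Record<string, string>): string => {
    let result = serialized
    for (const [oldId, newId] of Object.entries(idRemap)) {
        if (!oldId || !newId || oldId === newId) continue
        // Replace only `"old-id"` tokens, i.e. whole quoted JSON values.
        result = result.split(`"${oldId}"`).join(`"${newId}"`)
    }
    return result
}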


1283-1299: Param name shadowing reduces readability.

Function importData shadows its parameter importData, making assignments ambiguous at a glance.

Rename the parameter to payload or data and adjust local references.

docs/credential-export-options.md (2)

7-15: Align doc with server-side sanitization and clarify scope.

  • If you accept the server-side sanitization change suggested in code review, note that exportImportService.exportData will now strip FLOWISE_CREDENTIAL_ID and secret-like inputs server-side as well (not only in the UI).
  • Consider replacing hard-coded file/line citations with stable anchors; line numbers drift quickly.

1-21: Minor Spanish nits.

  • IDs → identificadores (optional), “IDs sensibles” → “identificadores sensibles”.
  • “cliente” → “el cliente” (artículo).
  • “sólo” → “solo” (RAE).
📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between eed7581 and d9e0026.

📒 Files selected for processing (7)
  • docs/credential-export-options.md (1 hunks)
  • packages/server/src/controllers/export-import/index.ts (1 hunks)
  • packages/server/src/routes/export-import/index.ts (1 hunks)
  • packages/server/src/services/export-import/index.ts (16 hunks)
  • packages/ui/src/api/exportimport.js (1 hunks)
  • packages/ui/src/layout/MainLayout/Header/ProfileSection/index.jsx (10 hunks)
  • packages/ui/src/utils/genericHelper.js (1 hunks)
🧰 Additional context used
🪛 LanguageTool
docs/credential-export-options.md

[grammar] ~3-~3: Aquí puede haber un error.
Context: ...nstruyen automáticamente al importarlos. ## Qué se incluye en la exportación * `genera...

(QB_NEW_ES_OTHER_ERROR_IDS_MISSING_PUNCTUATION)


[grammar] ~5-~5: Aquí puede haber un error.
Context: ...l importarlos. ## Qué se incluye en la exportación * generateExportFlowData continúa eliminando cualquier clave `FL...

(QB_NEW_ES_OTHER_ERROR_IDS_MISSING_PUNCTUATION)


[grammar] ~7-~7: Aquí puede haber un error.
Context: ... exportación * generateExportFlowData continúa eliminando cualquier clave `FLO...

(QB_NEW_ES)


[grammar] ~7-~7: Aquí puede haber un error.
Context: ...exportación * generateExportFlowData continúa eliminando cualquier clave `FLOWISE_CRED...

(QB_NEW_ES_OTHER_ERROR_IDS_REPLACEMENT_ORTHOGRAPHY_LOWERCASE)


[grammar] ~7-~7: Aquí puede haber un error.
Context: ... no exponer IDs sensibles en el archivo descargado.【F:packages/ui/src/utils/genericHelper.j...

(QB_NEW_ES)


[grammar] ~7-~7: Aquí puede haber un error.
Context: ...ui/src/utils/genericHelper.js†L551-L607】 * Antes de que los datos lleguen al client...

(QB_NEW_ES_OTHER_ERROR_IDS_REPLACEMENT_SPACE)


[grammar] ~8-~8: Aquí puede haber un error.
Context: ...ath, credentialName, credentialType }`. Sólo se incluyen entradas cuyo nombre y tipo...

(QB_NEW_ES_OTHER_ERROR_IDS_REPLACEMENT_CONFUSION)


[grammar] ~8-~8: Aquí puede haber un error.
Context: ... tipo pudieron resolverse en la base de datos.【F:packages/server/src/services/export-i...

(QB_NEW_ES)


[grammar] ~9-~9: Aquí puede haber un error.
Context: ... junto con la lista de nodos y edges ya sanitizados.【F:packages/ui/src/utils/exportImport.js...

(QB_NEW_ES)


[grammar] ~9-~9: Aquí puede haber un error.
Context: ...ui/src/utils/genericHelper.js†L559-L606】 ## Cómo se reconstruyen al importar * Dura...

(QB_NEW_ES)


[grammar] ~11-~11: Aquí puede haber un error.
Context: ...06】 ## Cómo se reconstruyen al importar * Durante la importación, `exportImportSer...

(QB_NEW_ES)


[grammar] ~19-~19: Aquí puede haber un error.
Context: ...†L356-L433】 ## Notas y limitaciones * Sólo se generan entradas en `credentialBindi...

(QB_NEW_ES_OTHER_ERROR_IDS_REPLACEMENT_CONFUSION)


[grammar] ~19-~19: Aquí puede haber un error.
Context: ...endo configuración manual al importarse. * El mecanismo confía en que los nombres d...

(QB_NEW_ES)


[grammar] ~21-~21: Aquí puede haber un error.
Context: ...redentialBindings`; el mapeo automático sólo se aplica a las exportaciones de worksp...

(QB_NEW_ES_OTHER_ERROR_IDS_REPLACEMENT_CONFUSION)


[grammar] ~21-~21: Aquí puede haber un error.
Context: ...aplica a las exportaciones de workspace.

(QB_NEW_ES)

🔇 Additional comments (2)
packages/ui/src/api/exportimport.js (1)

5-11: LGTM — API surface extended cleanly.

Matches existing client usage; no issues spotted.

packages/server/src/routes/export-import/index.ts (1)

8-9: Apply rate limiting middleware to the export-import routes, particularly /preview.

Global body size limit (50mb) is configured and applies to all routes. However, no rate limiting middleware is applied to the export-import routes. The rateLimit utility exists but is not used here. Add rate limiting to prevent abuse through repeated large preview requests.

Suggested approach: Import and apply the rate limiter from utils/rateLimit.ts to the /preview endpoint, or apply a standard express-rate-limit middleware at the router level in packages/server/src/routes/export-import/index.ts.
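For illustration only, one possible wiring with the express-rate-limit package; the project's existing utils/rateLimit.ts helper could equally be used, and the controller import path plus the window/limit values below are assumptions.

import express from 'express'
import rateLimit from 'express-rate-limit'
import exportImportController from '../../controllers/export-import' // assumed relative path

const router = express.Router()

const previewLimiter = rateLimit({
    windowMs: 60 * 1000, // 1 minute window (assumed value)
    max: 10              // at most 10 preview requests per window per client (assumed value)
})

// Apply the limiter only to the potentially expensive preview route.
router.post('/preview', previewLimiter, exportImportController.previewImportData)

export default router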

Comment on lines +83 to +91
type ConflictResolutionAction = 'update' | 'duplicate'

type ConflictResolution = {
type: ConflictEntityKey
importId: string
existingId: string
action: ConflictResolutionAction
}


⚠️ Potential issue | 🔴 Critical

Implement “skip” resolution end-to-end.

UI/PR describe update/duplicate/skip, but the API type only supports update | duplicate and the import path never omits items marked skip. Skipped items will still be imported.

Apply:

-type ConflictResolutionAction = 'update' | 'duplicate'
+type ConflictResolutionAction = 'update' | 'duplicate' | 'skip'

And honor skip during import:

-const importData = async (importData: ImportPayload, orgId: string, activeWorkspaceId: string, subscriptionId: string) => {
+const importData = async (importData: ImportPayload, orgId: string, activeWorkspaceId: string, subscriptionId: string) => {
-  const idsToSkipMap = conflictEntityKeys.reduce((acc, key) => {
+  const idsToSkipMap = conflictEntityKeys.reduce((acc, key) => {
     acc[key] = new Set<string>()
     return acc
   }, {} as Record<ConflictEntityKey, Set<string>>)
+  const idsToOmitMap = conflictEntityKeys.reduce((acc, key) => {
+    acc[key] = new Set<string>()
+    return acc
+  }, {} as Record<ConflictEntityKey, Set<string>>)

   const idRemap: Record<string, string> = {}

   for (const resolution of conflictResolutions) {
     if (!resolution || !resolution.type || !resolution.importId || !resolution.existingId) continue
     if (resolution.action === 'update') {
       idRemap[resolution.importId] = resolution.existingId
       idsToSkipMap[resolution.type].add(resolution.existingId)
+    } else if (resolution.action === 'skip') {
+      idsToOmitMap[resolution.type].add(resolution.importId)
     }
   }

   if (Object.keys(idRemap).length > 0) {
     let serialized = JSON.stringify(importData)
     for (const [oldId, newId] of Object.entries(idRemap)) {
       if (!oldId || !newId || oldId === newId) continue
       serialized = serialized.replaceAll(oldId, newId)
     }
     importData = JSON.parse(serialized)
   }
+
+  // Drop items explicitly marked as skip
+  importData.AgentFlow        = importData.AgentFlow.filter(i => !idsToOmitMap.AgentFlow.has(i.id))
+  importData.AgentFlowV2      = importData.AgentFlowV2.filter(i => !idsToOmitMap.AgentFlowV2.has(i.id))
+  importData.AssistantFlow    = importData.AssistantFlow.filter(i => !idsToOmitMap.AssistantFlow.has(i.id))
+  importData.AssistantCustom  = importData.AssistantCustom.filter(i => !idsToOmitMap.AssistantCustom.has(i.id))
+  importData.AssistantOpenAI  = importData.AssistantOpenAI.filter(i => !idsToOmitMap.AssistantOpenAI.has(i.id))
+  importData.AssistantAzure   = importData.AssistantAzure.filter(i => !idsToOmitMap.AssistantAzure.has(i.id))
+  importData.ChatFlow         = importData.ChatFlow.filter(i => !idsToOmitMap.ChatFlow.has(i.id))
+  importData.CustomTemplate   = importData.CustomTemplate.filter(i => !idsToOmitMap.CustomTemplate.has(i.id))
+  importData.DocumentStore    = importData.DocumentStore.filter(i => !idsToOmitMap.DocumentStore.has(i.id))
+  importData.Tool             = importData.Tool.filter(i => !idsToOmitMap.Tool.has(i.id))
+  importData.Variable         = importData.Variable.filter(i => !idsToOmitMap.Variable.has(i.id))

Also applies to: 1283-1317

Comment on lines +165 to +177
const sanitizeInputsForExport = (inputs: any, inputParams: any[]): any => {
if (!inputs || typeof inputs !== 'object') return {}

const sanitizedInputs: Record<string, any> = {}
for (const inputName of Object.keys(inputs)) {
const param = Array.isArray(inputParams)
? inputParams.find((inp: any) => inp && inp.name === inputName)
: undefined
if (param && (param.type === 'password' || param.type === 'file' || param.type === 'folder')) continue
sanitizedInputs[inputName] = inputs[inputName]
}
return sanitizedInputs
}

⚠️ Potential issue | 🟠 Major

Server-side sanitization: strip credential IDs and secret-y inputs before returning exportData.

Today, sanitizeInputsForExport/stripCredentialIds operate on a cloned node, but parsed is not updated. That means server responses can still include FLOWISE_CREDENTIAL_ID and sensitive input values unless the UI performs extra scrubbing. Defense-in-depth: sanitize on the server so API consumers are safe.

Apply minimal change inside collectCredentialBindingsFromChatflow to replace node data with the sanitized clone:

         const exportableNodeData = {
           id: nodeData.id,
           …,
           inputs: sanitizedInputs,
           …,
           selected: false
         }
-
-        stripCredentialIds(exportableNodeData, ['data'], nodeId, chatflow.id, bindings)
+        // 1) replace node data with sanitized copy for export
+        node.data = exportableNodeData
+        // 2) record and strip credential IDs from the sanitized structure
+        stripCredentialIds(exportableNodeData, ['data'], nodeId, chatflow.id, bindings)

This ensures enrichChatflowsWithCredentialMetadata writes back a fully sanitized flowData (lines 346–355) and the object returned from exportData (lines 575–576) is safe even outside the UI.

Also applies to: 211-247, 346-355, 575-576

🤖 Prompt for AI Agents
In packages/server/src/services/export-import/index.ts around lines 165–177 (and
related spots 211–247, 346–355, 575–576), sanitizeInputsForExport currently
returns a sanitized clone but the calling code keeps using the original node
data so FLOWISE_CREDENTIAL_ID and secret-like inputs remain in server responses;
inside collectCredentialBindingsFromChatflow replace the node's inputs (or
node.data) with the sanitized clone returned by sanitizeInputsForExport (i.e.,
create the sanitized copy and assign it back to the node object before
continuing) so that enrichChatflowsWithCredentialMetadata and the final
exportData return the fully sanitized flowData.

Comment on lines +603 to +612
const credentialBindings = Array.isArray(flowData.credentialBindings) ? flowData.credentialBindings : []

const exportJson = {
nodes,
edges
}

if (credentialBindings.length) {
exportJson.credentialBindings = credentialBindings.map((binding) => ({ ...binding }))
}

⚠️ Potential issue | 🟠 Major

Sanitize credentialBindings before exporting to prevent ID/secret leakage.

Current logic shallow-copies bindings; unlike nodes, it doesn’t strip FLOWISE_CREDENTIAL_ID (or nested fields). Reuse _removeCredentialId to deep-sanitize.

Apply:

-    const credentialBindings = Array.isArray(flowData.credentialBindings) ? flowData.credentialBindings : []
+    const credentialBindings = Array.isArray(flowData.credentialBindings)
+        ? flowData.credentialBindings.map((b) => _removeCredentialId(b))
+        : []
@@
-    if (credentialBindings.length) {
-        exportJson.credentialBindings = credentialBindings.map((binding) => ({ ...binding }))
-    }
+    if (credentialBindings.length) {
+        exportJson.credentialBindings = credentialBindings
+    }
🤖 Prompt for AI Agents
In packages/ui/src/utils/genericHelper.js around lines 603 to 612, the export
currently shallow-copies flowData.credentialBindings which can leave
FLOWISE_CREDENTIAL_ID or nested secrets intact; instead call the existing
_removeCredentialId sanitizer for each binding (and any nested objects) to
perform a deep sanitize before assigning exportJson.credentialBindings. Replace
the map that does ({ ...binding }) with a call to _removeCredentialId(binding)
(or a small wrapper that clones then passes through _removeCredentialId) so
exported credentialBindings never include credential IDs or secrets.

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 0

♻️ Duplicate comments (2)
packages/server/src/services/export-import/index.ts (2)

83-83: 'skip' action still not implemented.

Previous review flagged that ConflictResolutionAction is missing the 'skip' option mentioned in PR objectives, and the import logic (lines 1310-1325) doesn't filter out skipped items. This remains unfixed.


248-248: Sanitized node data not written back to parsed structure.

Previous review flagged that stripCredentialIds returns a sanitized clone but the return value is discarded. The original unsanitized node.data remains in the parsed structure, potentially leaking credentials and sensitive inputs in server responses.

Apply this fix to assign the sanitized result back:

-            stripCredentialIds(exportableNodeData, ['data'], nodeId, chatflow.id, bindings)
+            node.data = stripCredentialIds(exportableNodeData, ['data'], nodeId, chatflow.id, bindings)
🧹 Nitpick comments (2)
packages/server/src/services/export-import/index.ts (2)

642-643: Consider handling duplicate names within import data.

If the import payload contains multiple items with the same name, findFirstImportByName will only match the first one. Subsequent items with duplicate names won't be flagged as conflicts, potentially causing name collision errors during import.

Consider either:

  1. Pre-validating import data to reject duplicate names per entity type, or
  2. Documenting that conflicts are detected for the first occurrence only

This is likely acceptable if duplicate names within imports are unexpected, but explicit handling would improve robustness.
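A hedged sketch of option 1 above (pre-validation); the helper name and the usage shown in the trailing comment are illustrative, not part of the PR.

const findDuplicateNames = (items: Array<{ name?: string }>): string[] => {
    const seen = new Set<string>()
    const duplicates = new Set<string>()
    for (const item of items) {
        if (!item?.name) continue
        if (seen.has(item.name)) duplicates.add(item.name)
        seen.add(item.name)
    }
    return [...duplicates]
}

// e.g. reject the payload before previewing if an entity type contains repeated names:
// const dupes = findDuplicateNames(importData.ChatFlow ?? [])
// if (dupes.length) throw new Error(`Duplicate ChatFlow names in import: ${dupes.join(', ')}`)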


1310-1316: 'duplicate' action relies on implicit default behavior.

The conflict resolution loop only explicitly handles 'update' action. For 'duplicate' action, the code relies on default behavior where replaceDuplicateIds functions generate new IDs for items not in idsToSkipMap. While this works correctly, explicit handling would improve code clarity.

Consider adding an explicit branch for documentation:

 for (const resolution of conflictResolutions) {
     if (!resolution || !resolution.type || !resolution.importId || !resolution.existingId) continue
     if (resolution.action === 'update') {
         idRemap[resolution.importId] = resolution.existingId
         idsToSkipMap[resolution.type].add(resolution.existingId)
+    } else if (resolution.action === 'duplicate') {
+        // Default behavior: replaceDuplicateIds will generate new UUID for this item
+        // No entry in idsToSkipMap, so new ID will be generated
     }
 }
📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between d9e0026 and b373720.

📒 Files selected for processing (2)
  • packages/server/src/controllers/export-import/index.ts (1 hunks)
  • packages/server/src/services/export-import/index.ts (16 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
  • packages/server/src/controllers/export-import/index.ts
🧰 Additional context used
🧬 Code graph analysis (1)
packages/server/src/services/export-import/index.ts (3)
packages/server/src/Interface.ts (1)
  • AssistantType (21-21)
packages/server/src/utils/getRunningExpressApp.ts (1)
  • getRunningExpressApp (3-13)
packages/ui/src/api/exportimport.js (2)
  • previewImportData (5-5)
  • importData (4-4)
🔇 Additional comments (2)
packages/server/src/services/export-import/index.ts (2)

575-576: Credential metadata enrichment properly integrated.

The call to enrichChatflowsWithCredentialMetadata correctly passes all chatflow types that can contain credential bindings, ensuring exported flowData includes credential name/type metadata without exposing secret values.


1343-1346: Credential reinstatement correctly positioned in import flow.

The call to reinstateCredentialBindingsOnImport is properly placed after conflict resolution/ID remapping but before database persistence, ensuring credentials are reconnected using the target workspace's credential instances.

@chungyau97 (Contributor) left a comment

Finding: Duplication of Child Data

Test Data Used
ExportData Default.json

Steps to Reproduce

  1. Create a new enterprise setup.
  2. Import ExportData Default.json as new items.
[Screenshots: Step 2 and Step 2 confirmation]
  3. Import ExportData Default.json again using the conditions below.
    Conditions
  • Conflict
    • Agent Flow V2:
      • 48fb167f-160c-465d-b7d9-6857e87734d3
  • New Items
    • Chat Message:
      • 121320c2-239c-458f-901c-f92a6a156514
      • d6bc7e62-4f8b-44dd-978a-cb20cd2f7d43
    • Chat Message Feedback:
      • 2c4148e8-5f83-4dbe-81ac-1c767c949f39
    • Execution:
      • 50cabf62-a17c-4697-870e-66824846b4cc
[Screenshots: Step 3 conflict tab, new items tab, and confirmation]

Result

Duplicate messages appear, as shown in the attached screenshots.

[Screenshots: Result 1 and Result 2]

Required Action
When the user toggles update on parent data, remove the corresponding child data from ExportData Default.json if the child UUIDs already exist in the database.

Child Entity Relationships

  • ChatMessage, ChatMessageFeedback → AgentFlow, AgentFlowV2, AssistantCustom, AssistantFlow, AssistantOpenAI, AssistantAzure, ChatFlow
  • Execution → AgentFlowV2
  • DocumentStoreFileChunk → DocumentStore
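A hedged sketch of the requested filtering (names and shapes are illustrative, not the PR's code): when a parent is resolved as "update", drop child records from the import payload whose ids already exist in the database.

const dropExistingChildren = <T extends { id: string }>(
    children: T[],
    existingChildIds: Set<string>,  // child ids already present in the target DB
    updatedParentIds: Set<string>,  // parents the user chose to update
    getParentId: (child: T) => string | undefined
): T[] =>
    children.filter((child) => {
        const parentId = getParentId(child)
        // Keep children of parents that are not being updated.
        if (!parentId || !updatedParentIds.has(parentId)) return true
        // For updated parents, keep only children that do not already exist.
        return !existingChildIds.has(child.id)
    })

// e.g. payload.ChatMessage = dropExistingChildren(payload.ChatMessage, existingIds, updatedIds, (m) => m.chatflowid)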

Opinions on this PR

  1. Complicates the UI: users are now forced to decide between "duplicate" or "update", which increases the risk of incorrect choices.
  2. If a user selects "update", the UI must provide efficient ways to identify and select the related child entities (e.g., ChatMessage, ChatMessageFeedback, ...); this is essential when thousands of related records exist.
  3. The export-import flow in the main branch already handles the duplication issue, as shown in the attached screenshots.
[Screenshots: Agent Flow V2, Direct Reply 1, Direct Reply 2 (main branch behaviour)]
  4. Another, simpler approach for this PR could be to modify the name of duplicated parent data during import (e.g., append " - duplicate"). This flags the duplicates so users can easily review or delete them after the import process completes.

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 2

♻️ Duplicate comments (2)
packages/server/src/services/export-import/index.ts (2)

83-83: [Duplicate] Add 'skip' action to ConflictResolutionAction type.

The type still only supports 'update' | 'duplicate'. The PR description and UI allow users to skip conflicts, but the backend type and logic do not implement this action. This is a known issue from a previous review.


211-293: [Duplicate] Sanitized node data not integrated into export structure.

The functions sanitizeInputsForExport and stripCredentialIds return sanitized clones, but collectCredentialBindingsFromChatflow does not update the parsed.nodes with these sanitized versions. The original node data with credential IDs and sensitive inputs remains in the structure. This is a known issue from a previous review.

🧹 Nitpick comments (1)
packages/ui/src/layout/MainLayout/Header/ProfileSection/index.jsx (1)

370-1173: Consider splitting ImportReviewDialog into smaller components.

The ImportReviewDialog component is approximately 800 lines, which makes it harder to maintain and test. Consider extracting sub-components such as:

  • ConflictTab for conflict review UI
  • NewItemsTab for new items UI
  • ConflictItem and NewItemRow for individual item rendering
  • TypeSection for the collapsible type groupings

This would improve readability and make unit testing easier.

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 620b707 and 0301742.

📒 Files selected for processing (2)
  • packages/server/src/services/export-import/index.ts (21 hunks)
  • packages/ui/src/layout/MainLayout/Header/ProfileSection/index.jsx (10 hunks)
🧰 Additional context used
🧬 Code graph analysis (2)
packages/server/src/services/export-import/index.ts (4)
packages/server/src/Interface.ts (1)
  • AssistantType (21-21)
packages/ui/src/utils/errorHandler.js (2)
  • getErrorMessage (17-19)
  • getErrorMessage (17-19)
packages/server/src/utils/getRunningExpressApp.ts (1)
  • getRunningExpressApp (3-13)
packages/ui/src/api/exportimport.js (1)
  • previewImportData (5-5)
packages/ui/src/layout/MainLayout/Header/ProfileSection/index.jsx (2)
packages/ui/src/views/canvas/index.jsx (4)
  • errorFailed (365-379)
  • navigate (65-65)
  • dispatch (79-79)
  • enqueueSnackbar (89-89)
packages/ui/src/utils/errorHandler.js (2)
  • getErrorMessage (17-19)
  • getErrorMessage (17-19)
🔇 Additional comments (1)
packages/ui/src/layout/MainLayout/Header/ProfileSection/index.jsx (1)

1564-1620:

The concern about potential mismatch between conflict.type and payload keys is unfounded. The backend service explicitly defines a conflictEntityKeys array with controlled values ('AgentFlow', 'AgentFlowV2', 'AssistantFlow', 'AssistantCustom', 'AssistantOpenAI', 'AssistantAzure', 'ChatFlow', 'CustomTemplate', 'DocumentStore', 'Tool', 'Variable'), which are iterated to populate conflict types. These values directly correspond to the top-level keys in the import payload structure, ensuring consistency by design. The code correctly accesses payload[conflict.type] because conflict.type originates from this predefined list that matches the payload schema.

Likely an incorrect or invalid review comment.

Comment on lines +1589 to +1642
importData = await replaceDuplicateIdsForChatMessage(
queryRunner,
importData,
importData.ChatMessage,
activeWorkspaceId,
childIdsMarkedForUpdate.chatMessage
)
importData = await replaceExecutionIdForChatMessage(queryRunner, importData, importData.ChatMessage, activeWorkspaceId)
}
if (importData.ChatMessageFeedback.length > 0)
importData = await replaceDuplicateIdsForChatMessageFeedback(
queryRunner,
importData,
importData.ChatMessageFeedback,
activeWorkspaceId
activeWorkspaceId,
childIdsMarkedForUpdate.chatMessageFeedback
)
if (importData.CustomTemplate.length > 0) {
importData.CustomTemplate = insertWorkspaceId(importData.CustomTemplate, activeWorkspaceId)
importData = await replaceDuplicateIdsForCustomTemplate(queryRunner, importData, importData.CustomTemplate)
importData = await replaceDuplicateIdsForCustomTemplate(
queryRunner,
importData,
importData.CustomTemplate,
idsToSkipMap.CustomTemplate
)
}
if (importData.DocumentStore.length > 0) {
importData.DocumentStore = insertWorkspaceId(importData.DocumentStore, activeWorkspaceId)
importData = await replaceDuplicateIdsForDocumentStore(queryRunner, importData, importData.DocumentStore)
importData = await replaceDuplicateIdsForDocumentStore(
queryRunner,
importData,
importData.DocumentStore,
idsToSkipMap.DocumentStore
)
}
if (importData.DocumentStoreFileChunk.length > 0)
importData = await replaceDuplicateIdsForDocumentStoreFileChunk(queryRunner, importData, importData.DocumentStoreFileChunk)
importData = await replaceDuplicateIdsForDocumentStoreFileChunk(
queryRunner,
importData,
importData.DocumentStoreFileChunk,
childIdsMarkedForUpdate.documentStoreFileChunk
)
if (importData.Tool.length > 0) {
importData.Tool = insertWorkspaceId(importData.Tool, activeWorkspaceId)
importData = await replaceDuplicateIdsForTool(queryRunner, importData, importData.Tool)
importData = await replaceDuplicateIdsForTool(queryRunner, importData, importData.Tool, idsToSkipMap.Tool)
}
if (importData.Execution.length > 0) {
importData.Execution = insertWorkspaceId(importData.Execution, activeWorkspaceId)
importData = await replaceDuplicateIdsForExecution(queryRunner, importData, importData.Execution)
importData = await replaceDuplicateIdsForExecution(
queryRunner,
importData,
importData.Execution,
childIdsMarkedForUpdate.execution
)

⚠️ Potential issue | 🔴 Critical

Fix undefined variable childIdsMarkedForUpdate.

Lines 1594, 1604, 1629, and 1641 reference childIdsMarkedForUpdate which is never defined in the function scope. This will cause a ReferenceError at runtime.

Looking at the surrounding logic (lines 1381-1453), the code already filters out existing child entities from the importData arrays directly. Since those children are removed before reaching the replaceDuplicateIds functions, you likely don't need to pass skip sets for them.

Option 1: Remove the undefined parameter (simplest fix):

 if (importData.ChatMessage.length > 0) {
     importData = await replaceDuplicateIdsForChatMessage(
         queryRunner,
         importData,
         importData.ChatMessage,
-        activeWorkspaceId,
-        childIdsMarkedForUpdate.chatMessage
+        activeWorkspaceId
     )

And update the function signatures to make idsToSkip optional with a default empty Set:

 async function replaceDuplicateIdsForChatMessage(
     queryRunner: QueryRunner,
     originalData: ExportData,
     chatMessages: ChatMessage[],
     activeWorkspaceId?: string,
-    idsToSkip: Set<string> = new Set()
+    idsToSkip?: Set<string>
 ) {
+    const skipSet = idsToSkip ?? new Set<string>()
     // ... use skipSet instead of idsToSkip in the function body

Option 2: Define and populate childIdsMarkedForUpdate (if you need to track these IDs for other purposes):

Add after line 1320:

+    const childIdsMarkedForUpdate = {
+        chatMessage: new Set<string>(),
+        chatMessageFeedback: new Set<string>(),
+        documentStoreFileChunk: new Set<string>(),
+        execution: new Set<string>()
+    }

Then populate these sets in the filtering logic (lines 1386-1452) by adding the IDs before filtering them out.

Also applies to: 1604, 1629, 1641

🤖 Prompt for AI Agents
In packages/server/src/services/export-import/index.ts around lines 1589-1642
the code references an undefined variable childIdsMarkedForUpdate in calls to
replaceDuplicateIdsForChatMessage, replaceDuplicateIdsForChatMessageFeedback,
replaceDuplicateIdsForDocumentStoreFileChunk and replaceDuplicateIdsForExecution
causing a ReferenceError; fix by either (A) removing the undefined parameter
from those calls and updating the respective replaceDuplicateIds* function
signatures to make the idsToSkip parameter optional with a default empty Set
(preferred/simplest), or (B) if you actually need to track these IDs, define
childIdsMarkedForUpdate before its use and populate its appropriate Sets during
the earlier filtering steps so the calls receive a valid Set.

created: createdItems,
duplicated: duplicateConflicts,
updated: updatedConflicts,
skipped: [...skippedConflicts, ...skippedNew, ...dependencyRemoved]

⚠️ Potential issue | 🔴 Critical

Critical: dependencyRemoved is undefined.

Line 1670 references dependencyRemoved in the skipped array, but this variable is never declared or populated. This will cause a runtime error.

The dependency filtering logic at lines 1622-1665 removes child items whose parents are not in the payload, but these removed items are never tracked. You need to collect them before filtering.

Apply this diff to fix:

         })
     })
+
+    const dependencyRemoved = []
+    const buildDependencySkipInfo = (item, reason, parentId) => ({
+        type: item.type || 'Unknown',
+        name: getImportItemName(item.type, item),
+        importId: item.id,
+        reason: 'dependency',
+        parent: { id: parentId }
+    })
+
     const chatflowCollections = ['AgentFlow', 'AgentFlowV2', 'AssistantFlow', 'ChatFlow']
     const chatflowIdsInPayload = new Set()
     chatflowCollections.forEach((key) => {
         const collection = payload[key]
         if (!Array.isArray(collection)) return
         collection.forEach((item) => {
             if (item && item.id) {
                 chatflowIdsInPayload.add(item.id)
             }
         })
     })
     if (Array.isArray(payload.ChatMessage)) {
+        const removed = payload.ChatMessage.filter((message) => {
+            if (!message || !message.chatflowid) return true
+            return !chatflowIdsInPayload.has(message.chatflowid)
+        })
+        removed.forEach((msg) => {
+            dependencyRemoved.push(buildDependencySkipInfo(msg, 'dependency', msg.chatflowid))
+        })
         payload.ChatMessage = payload.ChatMessage.filter((message) => {
             if (!message || !message.chatflowid) return false
             return chatflowIdsInPayload.has(message.chatflowid)
         })
     }
     if (Array.isArray(payload.ChatMessageFeedback)) {
+        const removed = payload.ChatMessageFeedback.filter((feedback) => {
+            if (!feedback || !feedback.chatflowid) return true
+            return !chatflowIdsInPayload.has(feedback.chatflowid)
+        })
+        removed.forEach((fb) => {
+            dependencyRemoved.push(buildDependencySkipInfo(fb, 'dependency', fb.chatflowid))
+        })
         payload.ChatMessageFeedback = payload.ChatMessageFeedback.filter((feedback) => {
             if (!feedback || !feedback.chatflowid) return false
             return chatflowIdsInPayload.has(feedback.chatflowid)
         })
     }
     if (Array.isArray(payload.Execution)) {
+        const removed = payload.Execution.filter((execution) => {
+            if (!execution || !execution.agentflowId) return true
+            return !chatflowIdsInPayload.has(execution.agentflowId)
+        })
+        removed.forEach((exec) => {
+            dependencyRemoved.push(buildDependencySkipInfo(exec, 'dependency', exec.agentflowId))
+        })
         payload.Execution = payload.Execution.filter((execution) => {
             if (!execution || !execution.agentflowId) return false
             return chatflowIdsInPayload.has(execution.agentflowId)
         })
     }

     const documentStoreIdsInPayload = new Set()
     if (Array.isArray(payload.DocumentStore)) {
         payload.DocumentStore.forEach((store) => {
             if (store && store.id) {
                 documentStoreIdsInPayload.add(store.id)
             }
         })
     }
     if (Array.isArray(payload.DocumentStoreFileChunk)) {
+        const removed = payload.DocumentStoreFileChunk.filter((chunk) => {
+            if (!chunk || !chunk.storeId) return true
+            return !documentStoreIdsInPayload.has(chunk.storeId)
+        })
+        removed.forEach((chunk) => {
+            dependencyRemoved.push(buildDependencySkipInfo(chunk, 'dependency', chunk.storeId))
+        })
         payload.DocumentStoreFileChunk = payload.DocumentStoreFileChunk.filter((chunk) => {
             if (!chunk || !chunk.storeId) return false
             return documentStoreIdsInPayload.has(chunk.storeId)
         })
     }
🤖 Prompt for AI Agents
In packages/ui/src/layout/MainLayout/Header/ProfileSection/index.jsx around line
1670, the skipped array references dependencyRemoved which is never declared;
initialize a dependencyRemoved array before the dependency filtering block,
during the loop that removes child items whose parents are not in the payload
push those removed child items into dependencyRemoved (or collect them as you
filter), and then use that array in the skipped spread expression so skipped:
[...skippedConflicts, ...skippedNew, ...dependencyRemoved] works without runtime
errors; ensure dependencyRemoved is declared in the correct scope and populated
before line 1670.
