
feat(console): add JSON export option to database documents#2897

Open
Divyansh2992 wants to merge 12 commits into appwrite:main from Divyansh2992:feature/database-json-export

Conversation

@Divyansh2992 Divyansh2992 commented Feb 28, 2026

What does this PR do?

This PR adds support for exporting collection documents as a .json file from the Database Console.

Currently, users can export documents as CSV. This enhancement extends the existing Export wizard to support both CSV and JSON formats without introducing any backend changes.


Key Changes:

  1. Renamed the “Export CSV” wizard to a generic “Export” wizard

  2. Added a format selector (CSV / JSON) to the Export options

  3. Implemented client-side JSON export using:

  • listDocuments

  • Cursor-based pagination with Query.cursorAfter() and Query.limit(100)

  4. Ensured the export respects:

  • Active filters

  • Search queries

  • Selected columns

  5. Generated a downloadable .json file using the browser Blob API

  6. Set the file name dynamically to:
    ${tableName}.json

  7. Added analytics tracking:

  • Click.DatabaseExportJson

  • Submit.DatabaseExportJson

This is a Console-only UI enhancement and does not require backend modifications. Existing CSV export behavior remains unchanged.
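For illustration, the paginated fetch loop described above can be sketched roughly as follows. This is a simplified sketch, not the PR's actual code: `fetchPage` is a hypothetical stand-in for the SDK's paginated listDocuments call with `Query.limit(pageSize)` and `Query.cursorAfter(lastId)` applied, and `Row`/`Page` are illustrative types.

```typescript
// Simplified sketch of the client-side export loop. fetchPage is a
// hypothetical stand-in for the paginated listDocuments SDK call.
type Row = Record<string, unknown> & { $id: string };
type Page = { rows: Row[]; total: number };

async function collectRows(
    fetchPage: (pageSize: number, lastId?: string) => Promise<Page>,
    selectedCols: string[],
    pageSize = 100
): Promise<Record<string, unknown>[]> {
    const allRows: Record<string, unknown>[] = [];
    let lastId: string | undefined;
    let fetched = 0;
    let total = Infinity;
    while (fetched < total) {
        const page = await fetchPage(pageSize, lastId);
        total = page.total;
        if (page.rows.length === 0) break;
        for (const row of page.rows) {
            // Keep only the columns the user selected in the wizard.
            const obj: Record<string, unknown> = {};
            for (const col of selectedCols) obj[col] = row[col];
            allRows.push(obj);
        }
        fetched += page.rows.length;
        // Cursor for the next page: the id of the last row seen.
        lastId = page.rows[page.rows.length - 1].$id;
    }
    return allRows;
}
```

The collected rows would then be serialized with `JSON.stringify(allRows, null, 2)`, wrapped in a `Blob`, and downloaded via a temporary anchor element.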

Test Plan

I verified the implementation locally with the following steps:

  1. Started the Console locally.

  2. Navigated to:
    Database → Collection → Documents

  3. Clicked the Export button.

  4. Selected:

  • Format: JSON

  • Specific columns via checkboxes

  5. Applied:

  • Search queries

  • Filters

  6. Triggered the export.

Verified:

  1. The JSON file downloads automatically.

  2. The file name matches the collection name (.json).

  3. The export respects:

  • Active filters

  • Search queries

  • Selected columns

  4. Pagination correctly fetches all documents (tested with datasets of more than 100 documents).

  5. CSV export still works exactly as before.

  6. No console errors.

  7. Analytics events trigger correctly.

Related PRs and Issues

Closes Issue #2891

Have you read the Contributing Guidelines on issues?

Yes, I have read and followed the contributing guidelines.

Summary by CodeRabbit

  • New Features

    • Added JSON export alongside CSV with a format selector, progress display, and cancel support.
    • JSON export assembles paged results and triggers a client download with completion feedback.
  • Style

    • Export UI label changed from "Export CSV" to "Export".
    • CSV-specific options (delimiter, include header) shown only when CSV is selected.
  • New Events

    • Added analytics tracking for database export (CSV and JSON) actions.
  • Chores

    • Updated a package dependency.


appwrite bot commented Feb 28, 2026

Console (appwrite/console)

Project ID: 688b7bf400350cbd60e9

Sites (1)
  • console-stage (688b7cf6003b1842c9dc): status Failed, logs Failed



coderabbitai bot commented Feb 28, 2026

Note

Reviews paused

It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior by changing the reviews.auto_review.auto_pause_after_reviewed_commits setting.

Use the following commands to manage reviews:

  • @coderabbitai resume to resume automatic reviews.
  • @coderabbitai review to trigger a single review.


Walkthrough

Adds JSON export alongside the existing CSV export: introduces exportFormat and filename derivation, and adds a server-paginated JSON export flow that assembles results and triggers a client-side download with progress and cancel support. The CSV export path and its messages are preserved, and the UI is updated (tooltip changed from "Export CSV" to "Export") with conditional controls per format. Adds two analytics enum members (DatabaseExportJson) for click and submit events. Updates package.json overrides: minimatch bumped from 10.2.1 to 10.2.3.

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~45 minutes

🚥 Pre-merge checks: ✅ 3 passed
  • Description Check: ✅ Passed. Check skipped; CodeRabbit’s high-level summary is enabled.
  • Title Check: ✅ Passed. The PR title clearly and accurately describes the main change: adding JSON export functionality to database documents. It is concise, specific, and reflects the primary objective of the changeset.
  • Docstring Coverage: ✅ Passed. No functions found in the changed files to evaluate docstring coverage; check skipped.



@coderabbitai bot left a comment

Actionable comments posted: 1

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte (1)

15-15: ⚠️ Potential issue | 🟡 Minor

Emit Click.DatabaseExportJson when JSON export is initiated.

The JSON path currently tracks submit/error but not click. Since this PR introduces Click.DatabaseExportJson, add it when the JSON export action starts.

📈 Proposed analytics wiring
-import { Submit, trackEvent, trackError } from '$lib/actions/analytics';
+import { Click, Submit, trackEvent, trackError } from '$lib/actions/analytics';
@@
         } else {
             // JSON export logic
+            trackEvent(Click.DatabaseExportJson);
             $isSubmitting = true;
             try {

Also applies to: 132-193

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte
at line 15, When the JSON export flow starts, emit the Click.DatabaseExportJson
analytics event: in the handler that initiates the export (the Submit call path
in this file where trackEvent and trackError are imported), call
trackEvent('Click.DatabaseExportJson') immediately before starting the JSON
export submission; ensure the same addition is applied to the other JSON export
branch referenced around lines 132-193 so both JSON export entry points call
trackEvent('Click.DatabaseExportJson') prior to invoking Submit and before any
trackError handling.
🧹 Nitpick comments (1)
src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export-json/+page.svelte (1)

12-12: Use route alias import instead of relative import.

Please replace ../store with the configured alias style for route imports.

As per coding guidelines **/*.{js,ts,svelte}: Use $lib, $routes, and $themes path aliases for imports.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export-json/+page.svelte
at line 12, Replace the relative import "import { table } from '../store'" with
the configured route-alias import (use the $routes alias per guidelines) so the
symbol table is imported via the route alias instead of a relative path; update
the import statement in +page.svelte to use $routes (and fix any other relative
route imports in this file) to comply with the /*.{js,ts,svelte} alias rule.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export-json/+page.svelte:
- Around line 94-125: The loop currently uses offset-based pagination (pageSize,
offset and Query.offset) which should be changed to cursor-based to match the
other export path: replace offset and Query.offset with a cursor variable and
pass Query.cursorAfter(cursor) to sdk.forProject(...).tablesDB.listRows; in the
loop update cursor from the listRows response (response.cursor /
response.nextCursor / response.cursorAfter depending on the SDK field) and break
when the response indicates no further cursor or rows, while keeping the
selectedCols filtering and pushing into allRows unchanged.


ℹ️ Review info

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between ec5b6ad and 05cbf96.

📒 Files selected for processing (4)
  • src/lib/actions/analytics.ts
  • src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/+page.svelte
  • src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export-json/+page.svelte
  • src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte


greptile-apps bot commented Feb 28, 2026

Greptile Summary

This PR successfully extends the database export functionality to support JSON format alongside the existing CSV export. The implementation uses cursor-based pagination (Query.cursorAfter()) to fetch documents in chunks of 100, respecting active filters and column selections.

Key improvements:

  • Format selector added to export wizard (CSV/JSON)
  • JSON export includes progress bar with completion percentage
  • Cancel functionality allows users to abort long-running exports
  • Document count notification after first API response (Exporting X rows...)
  • Memory warning for large datasets (>10k rows)
  • Analytics tracking for both click and submit events
  • CSV-specific options (delimiter, header) conditionally shown

Changes:

  • export/+page.svelte: Added JSON export logic with client-side download, progress tracking, and abort controller for cancellation
  • analytics.ts: Added Click.DatabaseExportJson and Submit.DatabaseExportJson events
  • +page.svelte: Updated tooltip from "Export CSV" to "Export"
  • package.json: Security update for minimatch (CVE-2024-44003)

The implementation addresses most concerns from previous review threads, including progress feedback, document count warnings, and proper analytics tracking.

Confidence Score: 4/5

  • Safe to merge with minor error handling improvements needed
  • The implementation is solid with proper pagination, progress tracking, and user warnings for large datasets. Two minor issues with error message handling won't affect normal operation since the SDK throws proper Error objects. No critical bugs, security issues, or breaking changes. The feature works as intended and includes good UX considerations.
  • The export wizard (export/+page.svelte) has two minor error handling issues that should be addressed before merge, but they're non-blocking

Important Files Changed

  • src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte: Added JSON export with cursor-based pagination, progress tracking, and cancel support. Includes document count warnings for large datasets and proper error handling.
  • src/lib/actions/analytics.ts: Added Click.DatabaseExportJson and Submit.DatabaseExportJson analytics events for tracking JSON export usage.
  • package.json: Updated minimatch from 10.2.1 to 10.2.3 (security patch for the CVE-2024-44003 ReDoS vulnerability).

Flowchart

%%{init: {'theme': 'neutral'}}%%
flowchart TD
    A[User clicks Export button] --> B{Select format}
    B -->|CSV| C[Server-side export via migrations API]
    B -->|JSON| D[Initialize AbortController & progress]
    
    C --> C1[Show 'CSV export started' notification]
    C1 --> C2[Navigate back to table]
    
    D --> E[Fetch page with Query.limit & cursorAfter]
    E --> F{First page?}
    F -->|Yes| G[Show 'Exporting X rows' notification]
    G --> H{Total > 10k?}
    H -->|Yes| I[Show memory warning]
    H -->|No| J[Continue]
    I --> J
    F -->|No| J
    
    J --> K[Filter selected columns & append to allRows]
    K --> L[Update progress bar percentage]
    L --> M{User clicked Cancel?}
    M -->|Yes| N[Show 'Export cancelled' notification]
    M -->|No| O{More pages?}
    
    O -->|Yes| E
    O -->|No| P[JSON.stringify with pretty-print]
    P --> Q[Create Blob & trigger download]
    Q --> R[Show 'Export complete' notification]
    R --> S[Track Submit.DatabaseExportJson]
    S --> T[Navigate back to table]
    
    N --> U[Cleanup & reset]
    R --> U

Last reviewed commit: 485804a

@greptile-apps bot left a comment

5 files reviewed, 17 comments


@coderabbitai bot left a comment

Actionable comments posted: 1

🧹 Nitpick comments (2)
src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte (2)

133-133: Remove this non-essential inline comment.

// JSON export logic does not add context beyond the code structure and can be dropped.

As per coding guidelines: "Use minimal comments; only comment TODOs or complex logic".

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte
at line 133, Remove the non-essential inline comment "// JSON export logic" that
appears inside the JSON export handler (the export logic block/function) —
delete the comment string and any extra blank line it leaves so the export
handler code remains clean and properly indented (no other code changes
required).

137-176: Avoid buffering the full JSON dataset before serialization.

This currently stores every row in memory and then stringifies everything at once, which can freeze the tab on large tables. Prefer building BlobPart[] incrementally per page.

♻️ Suggested refactor to lower peak memory usage
-                const allRows: Record<string, unknown>[] = [];
+                const jsonParts: BlobPart[] = ['[\n'];
+                let isFirstRow = true;
+                let exportedCount = 0;
                 const pageSize = 100;
                 let lastId: string | undefined = undefined;
                 let fetched = 0;
                 let total = Infinity;
@@
-                    const filtered = response.rows.map((row) => {
+                    const filtered = response.rows.map((row) => {
                         const obj: Record<string, unknown> = {};
                         for (const col of selectedCols) {
                             obj[col] = row[col];
                         }
                         return obj;
                     });
-
-                    allRows.push(...filtered);
+                    for (const row of filtered) {
+                        const rowJson = JSON.stringify(row, null, 2);
+                        jsonParts.push(isFirstRow ? rowJson : `,\n${rowJson}`);
+                        isFirstRow = false;
+                        exportedCount += 1;
+                    }
                     fetched += response.rows.length;
                     lastId = response.rows[response.rows.length - 1].$id as string;
                 }
-
-                const json = JSON.stringify(allRows, null, 2);
-                const blob = new Blob([json], { type: 'application/json' });
+                jsonParts.push('\n]');
+                const blob = new Blob(jsonParts, { type: 'application/json' });
@@
-                    message: `JSON export complete — ${allRows.length} row${allRows.length !== 1 ? 's' : ''} downloaded`
+                    message: `JSON export complete — ${exportedCount} row${exportedCount !== 1 ? 's' : ''} downloaded`
                 });
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte
around lines 137 - 176, The loop currently accumulates every row in allRows and
calls JSON.stringify(allRows) which buffers the entire dataset; instead, stream
build a BlobPart[] as you page through results: create an array parts:
BlobPart[] and before the while add the opening '['; for each response map
selectedCols to plain objects (like you do in filtered) then push JSON.stringify
for each row to parts (prefix commas for subsequent rows) without storing them
in allRows; after the loop push the closing ']' and create the Blob from parts;
keep using pageSize/lastId/fetched/total and
sdk.forProject(...).tablesDB.listRows and ensure you handle empty result sets
and comma placement to produce valid JSON.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte:
- Around line 178-183: The download revocation happens immediately after
anchor.click(), which can abort the download in some browsers; update the
download flow that creates the anchor so that URL.revokeObjectURL(url) (and
anchor removal) runs after the click has been processed — for example by
deferring revocation with a short setTimeout or listening for the anchor's
click/visibility change/download completion event — referencing the existing
anchor variable, anchor.click(), and URL.revokeObjectURL(url) to locate and
modify the code.


ℹ️ Review info

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between f7a82df and 6474248.

📒 Files selected for processing (3)
  • package.json
  • src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/+page.svelte
  • src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte
✅ Files skipped from review due to trivial changes (1)
  • src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/+page.svelte

@coderabbitai bot left a comment

🧹 Nitpick comments (2)
src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte (2)

197-201: Consider defensive error handling for the message property.

If the caught error is not a standard Error object, accessing .message directly could display undefined to the user.

🛡️ Suggested defensive fix
             } catch (error) {
                 addNotification({
                     type: 'error',
-                    message: error.message
+                    message: error instanceof Error ? error.message : 'JSON export failed'
                 });
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte
around lines 197 - 201, The catch block that calls addNotification({ type:
'error', message: error.message }) should defensively handle non-Error thrown
values: update the catch handling around the addNotification call in the same
catch(error) block (referencing the addNotification call and the caught variable
error) to compute a safe message (e.g. use error instanceof Error ?
error.message : String(error) with a fallback like 'An unexpected error
occurred') before passing it to addNotification so users never see undefined.

143-173: Pagination logic is well implemented.

The cursor-based pagination using Query.cursorAfter() correctly handles large datasets. The loop structure with the empty-response break and fetched < total guard is sound.

One consideration: for very large collections, all rows accumulate in allRows before download. This is acceptable for typical use cases, but extremely large exports (hundreds of thousands of rows) could strain browser memory.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte
around lines 143 - 173, The current loop accumulates allRows in memory which can
OOM for huge exports; change the export to stream/write rows incrementally
instead of pushing into allRows: inside the while (fetched < total) loop (the
code using pageSize, lastId, Query.cursorAfter,
sdk.forProject(...).tablesDB.listRows and selectedCols), convert each fetched
page's filtered rows into CSV/NDJSON text and append them to a streaming sink
(e.g., a WritableStream, FileSystemWritableFileStream, or an array of Blob parts
flushed periodically) and update fetched/lastId as now; remove or avoid building
a full allRows array and only keep minimal state (lastId, fetched) to reduce
browser memory usage.
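The streaming sink this prompt suggests could look roughly like the sketch below. `makeNdjsonSink` is a hypothetical name, and NDJSON is one of the output shapes the prompt mentions; the point is that only serialized Blob parts are retained, never a full array of row objects.

```typescript
// Hypothetical streaming sink: each page of already-filtered rows is
// serialized to NDJSON lines and kept only as Blob parts, so peak memory
// holds strings rather than a growing array of row objects.
function makeNdjsonSink() {
    const parts: BlobPart[] = [];
    return {
        // Append one page of column-filtered rows.
        write(rows: Record<string, unknown>[]): void {
            for (const row of rows) parts.push(JSON.stringify(row) + '\n');
        },
        // Finalize into a single Blob for download.
        finish(): Blob {
            return new Blob(parts, { type: 'application/x-ndjson' });
        }
    };
}
```

The export loop would call `sink.write(filtered)` per page and `sink.finish()` after the last page, keeping only `lastId` and `fetched` as loop state.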

ℹ️ Review info

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 6474248 and 848930c.

📒 Files selected for processing (1)
  • src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte

    });
    try {
        const activeQueries = exportWithFilters ? Array.from(localQueries.values()) : [];
        const allRows: Record<string, unknown>[] = [];

all documents loaded into memory before download - will cause browser crashes for collections >10k documents

consider streaming the download or showing a document count warning before starting export


greptile-apps bot commented Mar 1, 2026

Additional Comments (1)

src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/+page.svelte
always tracks Click.DatabaseExportCsv before user selects format - creates inconsistent analytics where CSV exports get double-counted

consider tracking generic Click.DatabaseExport here, or remove tracking from this button and let the wizard handle it

    } catch (error) {
        addNotification({
            type: 'error',
            message: error.message

error.message will be undefined if a non-Error object is thrown. use error?.message || String(error) for safer error display

Suggested change:
-    message: error.message
+    message: error?.message || String(error)

    } catch (error) {
        addNotification({
            type: 'error',
            message: error.message

same issue - error.message will be undefined if a non-Error object is thrown

Suggested change:
-    message: error.message
+    message: error?.message || String(error)
