feat(console): add JSON export option to database documents #2897
Divyansh2992 wants to merge 12 commits into appwrite:main
Conversation
Note: Reviews paused. This branch appears to be under active development, so CodeRabbit has automatically paused this review to avoid overwhelming the author with comments on each new commit.
Walkthrough: Adds JSON export alongside the existing CSV export.
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
Pre-merge checks: ✅ 3 passed
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte (1)
15-15: ⚠️ Potential issue | 🟡 Minor

Emit `Click.DatabaseExportJson` when JSON export is initiated.

The JSON path currently tracks submit/error but not click. Since this PR introduces `Click.DatabaseExportJson`, add it when the JSON export action starts.

📈 Proposed analytics wiring

```diff
-import { Submit, trackEvent, trackError } from '$lib/actions/analytics';
+import { Click, Submit, trackEvent, trackError } from '$lib/actions/analytics';
@@
     } else {
         // JSON export logic
+        trackEvent(Click.DatabaseExportJson);
         $isSubmitting = true;
         try {
```

Also applies to: 132-193
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/routes/`(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte at line 15, When the JSON export flow starts, emit the Click.DatabaseExportJson analytics event: in the handler that initiates the export (the Submit call path in this file where trackEvent and trackError are imported), call trackEvent('Click.DatabaseExportJson') immediately before starting the JSON export submission; ensure the same addition is applied to the other JSON export branch referenced around lines 132-193 so both JSON export entry points call trackEvent('Click.DatabaseExportJson') prior to invoking Submit and before any trackError handling.
🧹 Nitpick comments (1)
src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export-json/+page.svelte (1)
12-12: Use route alias import instead of relative import.

Please replace `../store` with the configured alias style for route imports.

As per coding guidelines: `**/*.{js,ts,svelte}`: Use $lib, $routes, and $themes path aliases for imports.
Verify each finding against the current code and only fix it if needed. In `@src/routes/`(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export-json/+page.svelte at line 12, Replace the relative import "import { table } from '../store'" with the configured route-alias import (use the $routes alias per guidelines) so the symbol table is imported via the route alias instead of a relative path; update the import statement in +page.svelte to use $routes (and fix any other relative route imports in this file) to comply with the /*.{js,ts,svelte} alias rule.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In
`@src/routes/`(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export-json/+page.svelte:
- Around line 94-125: The loop currently uses offset-based pagination (pageSize,
offset and Query.offset) which should be changed to cursor-based to match the
other export path: replace offset and Query.offset with a cursor variable and
pass Query.cursorAfter(cursor) to sdk.forProject(...).tablesDB.listRows; in the
loop update cursor from the listRows response (response.cursor /
response.nextCursor / response.cursorAfter depending on the SDK field) and break
when the response indicates no further cursor or rows, while keeping the
selectedCols filtering and pushing into allRows unchanged.
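The offset-to-cursor change the prompt describes can be sketched as below. This is a hypothetical, framework-free outline, not the PR's actual code: the `fetchPage` callback stands in for `sdk.forProject(...).tablesDB.listRows`, and advancing on the last row's `$id` mirrors what `Query.cursorAfter(lastId)` does in the real SDK call.

```typescript
// Sketch of cursor-based pagination. fetchPage is an assumed stand-in for
// sdk.forProject(...).tablesDB.listRows; Row.$id plays the cursor role.
type Row = Record<string, unknown> & { $id: string };
type Page = { rows: Row[]; total: number };

async function fetchAllRows(
    fetchPage: (limit: number, cursorAfter?: string) => Promise<Page>,
    pageSize = 100
): Promise<Row[]> {
    const allRows: Row[] = [];
    let cursor: string | undefined;
    let total = Infinity;
    while (allRows.length < total) {
        const response = await fetchPage(pageSize, cursor);
        total = response.total;
        if (response.rows.length === 0) break; // guard against an empty page
        allRows.push(...response.rows);
        // Advance the cursor to the last row's id, as Query.cursorAfter expects.
        cursor = response.rows[response.rows.length - 1].$id;
    }
    return allRows;
}
```

Unlike offset paging, this stays consistent even if rows are inserted or deleted mid-export, because each page is anchored to the id of the last row seen.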
ℹ️ Review info
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (4)
- src/lib/actions/analytics.ts
- src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/+page.svelte
- src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export-json/+page.svelte
- src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte
Greptile Summary

This PR successfully extends the database export functionality to support JSON format alongside the existing CSV export. The implementation uses cursor-based pagination.

Key improvements:

Changes:

The implementation addresses most concerns from previous review threads, including progress feedback, document count warnings, and proper analytics tracking.

Confidence Score: 4/5

Important Files Changed
Flowchart

```mermaid
%%{init: {'theme': 'neutral'}}%%
flowchart TD
    A[User clicks Export button] --> B{Select format}
    B -->|CSV| C[Server-side export via migrations API]
    B -->|JSON| D[Initialize AbortController & progress]
    C --> C1[Show 'CSV export started' notification]
    C1 --> C2[Navigate back to table]
    D --> E[Fetch page with Query.limit & cursorAfter]
    E --> F{First page?}
    F -->|Yes| G[Show 'Exporting X rows' notification]
    G --> H{Total > 10k?}
    H -->|Yes| I[Show memory warning]
    H -->|No| J[Continue]
    I --> J
    F -->|No| J
    J --> K[Filter selected columns & append to allRows]
    K --> L[Update progress bar percentage]
    L --> M{User clicked Cancel?}
    M -->|Yes| N[Show 'Export cancelled' notification]
    M -->|No| O{More pages?}
    O -->|Yes| E
    O -->|No| P[JSON.stringify with pretty-print]
    P --> Q[Create Blob & trigger download]
    Q --> R[Show 'Export complete' notification]
    R --> S[Track Submit.DatabaseExportJson]
    S --> T[Navigate back to table]
    N --> U[Cleanup & reset]
    R --> U
```

Last reviewed commit: 485804a
Actionable comments posted: 1
🧹 Nitpick comments (2)
src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte (2)
133-133: Remove this non-essential inline comment.

`// JSON export logic` does not add context beyond the code structure and can be dropped.

As per coding guidelines: "Use minimal comments; only comment TODOs or complex logic".
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/routes/`(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte at line 133, Remove the non-essential inline comment "// JSON export logic" that appears inside the JSON export handler (the export logic block/function) — delete the comment string and any extra blank line it leaves so the export handler code remains clean and properly indented (no other code changes required).
137-176: Avoid buffering the full JSON dataset before serialization.

This currently stores every row in memory and then stringifies everything at once, which can freeze the tab on large tables. Prefer building `BlobPart[]` incrementally per page.

♻️ Suggested refactor to lower peak memory usage

```diff
-    const allRows: Record<string, unknown>[] = [];
+    const jsonParts: BlobPart[] = ['[\n'];
+    let isFirstRow = true;
+    let exportedCount = 0;
     const pageSize = 100;
     let lastId: string | undefined = undefined;
     let fetched = 0;
     let total = Infinity;
@@
         const filtered = response.rows.map((row) => {
             const obj: Record<string, unknown> = {};
             for (const col of selectedCols) {
                 obj[col] = row[col];
             }
             return obj;
         });
-
-        allRows.push(...filtered);
+        for (const row of filtered) {
+            const rowJson = JSON.stringify(row, null, 2);
+            jsonParts.push(isFirstRow ? rowJson : `,\n${rowJson}`);
+            isFirstRow = false;
+            exportedCount += 1;
+        }
         fetched += response.rows.length;
         lastId = response.rows[response.rows.length - 1].$id as string;
     }
-
-    const json = JSON.stringify(allRows, null, 2);
-    const blob = new Blob([json], { type: 'application/json' });
+    jsonParts.push('\n]');
+    const blob = new Blob(jsonParts, { type: 'application/json' });
@@
-        message: `JSON export complete — ${allRows.length} row${allRows.length !== 1 ? 's' : ''} downloaded`
+        message: `JSON export complete — ${exportedCount} row${exportedCount !== 1 ? 's' : ''} downloaded`
     });
```
Verify each finding against the current code and only fix it if needed. In `@src/routes/`(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte around lines 137 - 176, The loop currently accumulates every row in allRows and calls JSON.stringify(allRows) which buffers the entire dataset; instead, stream build a BlobPart[] as you page through results: create an array parts: BlobPart[] and before the while add the opening '['; for each response map selectedCols to plain objects (like you do in filtered) then push JSON.stringify for each row to parts (prefix commas for subsequent rows) without storing them in allRows; after the loop push the closing ']' and create the Blob from parts; keep using pageSize/lastId/fetched/total and sdk.forProject(...).tablesDB.listRows and ensure you handle empty result sets and comma placement to produce valid JSON.
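As a self-contained illustration of the incremental idea (the names here are illustrative, not the PR's actual variables), the per-page serialization reduces to a small helper whose joined output is valid JSON:

```typescript
// Illustrative helper: append one page of rows to a growing list of JSON
// fragments instead of buffering every row and stringifying at the end.
function appendRowsAsJson(parts: string[], rows: Record<string, unknown>[]): void {
    for (const row of rows) {
        // parts[0] is the opening '[\n'; every row after the first needs a comma.
        const prefix = parts.length === 1 ? '' : ',\n';
        parts.push(prefix + JSON.stringify(row, null, 2));
    }
}

// Seed with the opening bracket, append each fetched page, then close.
const parts: string[] = ['[\n'];
appendRowsAsJson(parts, [{ $id: 'a' }]);
appendRowsAsJson(parts, [{ $id: 'b' }, { $id: 'c' }]);
parts.push('\n]');
// In the browser the array can go straight to
// new Blob(parts, { type: 'application/json' }).
```

The browser only has to materialize the full string when the Blob is read for download, which keeps peak memory closer to one page of rows than to the whole table.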
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In
`@src/routes/`(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte:
- Around line 178-183: The download revocation happens immediately after
anchor.click(), which can abort the download in some browsers; update the
download flow that creates the anchor so that URL.revokeObjectURL(url) (and
anchor removal) runs after the click has been processed — for example by
deferring revocation with a short setTimeout or listening for the anchor's
click/visibility change/download completion event — referencing the existing
anchor variable, anchor.click(), and URL.revokeObjectURL(url) to locate and
modify the code.
ℹ️ Review info
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
- package.json
- src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/+page.svelte
- src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte
✅ Files skipped from review due to trivial changes (1)
- src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/+page.svelte
🧹 Nitpick comments (2)
src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte (2)
197-201: Consider defensive error handling for the message property.

If the caught error is not a standard `Error` object, accessing `.message` directly could display `undefined` to the user.

🛡️ Suggested defensive fix

```diff
     } catch (error) {
         addNotification({
             type: 'error',
-            message: error.message
+            message: error instanceof Error ? error.message : 'JSON export failed'
         });
```
Verify each finding against the current code and only fix it if needed. In `@src/routes/`(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte around lines 197 - 201, The catch block that calls addNotification({ type: 'error', message: error.message }) should defensively handle non-Error thrown values: update the catch handling around the addNotification call in the same catch(error) block (referencing the addNotification call and the caught variable error) to compute a safe message (e.g. use error instanceof Error ? error.message : String(error) with a fallback like 'An unexpected error occurred') before passing it to addNotification so users never see undefined.
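The defensive handling can also be factored into a tiny helper; this is a sketch, and the fallback string is an assumption rather than the console's actual copy:

```typescript
// Sketch: derive a user-facing message from an unknown thrown value so the
// notification never displays "undefined".
function toErrorMessage(error: unknown): string {
    if (error instanceof Error && error.message) return error.message;
    if (typeof error === 'string' && error) return error;
    return 'An unexpected error occurred';
}
```

The catch block would then read `addNotification({ type: 'error', message: toErrorMessage(error) })`, which is safe whether the SDK throws an `Error`, a plain string, or something else entirely.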
143-173: Pagination logic is well implemented.

The cursor-based pagination using `Query.cursorAfter()` correctly handles large datasets. The loop structure with the empty-response break and `fetched < total` guard is sound.

One consideration: for very large collections, all rows accumulate in `allRows` before download. This is acceptable for typical use cases, but extremely large exports (hundreds of thousands of rows) could strain browser memory.
Verify each finding against the current code and only fix it if needed. In `@src/routes/`(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte around lines 143 - 173, The current loop accumulates allRows in memory which can OOM for huge exports; change the export to stream/write rows incrementally instead of pushing into allRows: inside the while (fetched < total) loop (the code using pageSize, lastId, Query.cursorAfter, sdk.forProject(...).tablesDB.listRows and selectedCols), convert each fetched page's filtered rows into CSV/NDJSON text and append them to a streaming sink (e.g., a WritableStream, FileSystemWritableFileStream, or an array of Blob parts flushed periodically) and update fetched/lastId as now; remove or avoid building a full allRows array and only keep minimal state (lastId, fetched) to reduce browser memory usage.
ℹ️ Review info
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte
```ts
});
try {
    const activeQueries = exportWithFilters ? Array.from(localQueries.values()) : [];
    const allRows: Record<string, unknown>[] = [];
```
all documents loaded into memory before download - will cause browser crashes for collections >10k documents
consider streaming the download or showing a document count warning before starting export
Additional Comments (1)
consider tracking generic
```ts
} catch (error) {
    addNotification({
        type: 'error',
        message: error.message
```
error.message will be undefined if a non-Error object is thrown. use error?.message || String(error) for safer error display
```diff
-message: error.message
+message: error?.message || String(error)
```
```ts
} catch (error) {
    addNotification({
        type: 'error',
        message: error.message
```
same issue - error.message will be undefined if a non-Error object is thrown
```diff
-message: error.message
+message: error?.message || String(error)
```

What does this PR do?
This PR adds support for exporting collection documents as a .json file from the Database Console.
Currently, users can export documents as CSV. This enhancement extends the existing Export wizard to support both CSV and JSON formats without introducing any backend changes.
Key Changes:

- Renamed the “Export CSV” wizard to a generic “Export” wizard
- Added a format selector (CSV / JSON) in the Export options
- Implemented client-side JSON export using:
  - listDocuments
  - Cursor-based pagination using Query.cursorAfter() and Query.limit(100)
  - Active filters
  - Search queries
  - Selected columns
- Generated the downloadable .json file using the browser Blob API
- File name dynamically set to: `${tableName}.json`
- Added analytics tracking:
  - Click.DatabaseExportJson
  - Submit.DatabaseExportJson
This is a Console-only UI enhancement and does not require backend modifications. Existing CSV export behavior remains unchanged.
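The client-side flow described above (serialize, name the file after the table, download via the Blob API) reduces to roughly the following sketch; `buildJsonExport` and its arguments are illustrative names, not the PR's code:

```typescript
// Illustrative sketch of the Blob-based JSON download described above.
function buildJsonExport(
    tableName: string,
    rows: Record<string, unknown>[]
): { filename: string; blob: Blob } {
    const json = JSON.stringify(rows, null, 2); // pretty-printed output
    return {
        filename: `${tableName}.json`,
        blob: new Blob([json], { type: 'application/json' })
    };
}

// Browser-only download step (runs in the console UI, not shown executing here):
// const { filename, blob } = buildJsonExport(tableName, allRows);
// const url = URL.createObjectURL(blob);
// const anchor = Object.assign(document.createElement('a'), { href: url, download: filename });
// anchor.click();
```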
Test Plan
I verified the implementation locally with the following steps:
1. Started the Console locally.
2. Navigated to Database → Collection → Documents.
3. Clicked the Export button.
4. Selected:
   - Format: JSON
   - Specific columns via checkboxes
   - Search queries
   - Filters
5. Verified:
   - The JSON file downloads automatically.
   - The file name matches the collection name (`.json`).
   - Export respects active filters, search queries, and selected columns.
   - Pagination correctly fetches all documents (tested with datasets >100 documents).
   - CSV export still works exactly as before.
   - No console errors.
   - Analytics events trigger correctly.
Related PRs and Issues
Closes Issue #2891
Have you read the Contributing Guidelines on issues?
Yes, I have read and followed the contributing guidelines.
Summary by CodeRabbit
New Features
Style
New Events
Chores