27 changes: 15 additions & 12 deletions README.md
Original file line number Diff line number Diff line change
@@ -101,6 +101,9 @@ http.createServer(function (request, response){
case 'csv':
response.setHeader('Content-Type', 'text/csv')
break
case 'tsv':
response.setHeader('Content-Type', 'text/tsv')
break
case 'xls':
response.setHeader('Content-Type', 'application/vnd.ms-excel')
break
@@ -119,18 +122,18 @@ http.createServer(function (request, response){

**Note:** `JSON` refers to a parsable JSON string or a serializable JavaScript object.

| Option name | Required | Type | Description
| ----------- | -------- | ---- | ----
| data | true | `Array<JSON>`, `JSON` or `string` | If the exportType is 'json', data can be any parsable JSON. If the exportType is 'csv' or 'xls', data can only be an array of parsable JSON. If the exportType is 'txt', 'css', 'html', the data must be a string type.
| fileName | false | string | filename without extension, default to `'download'`
| extension | false | string | filename extension, by default it takes the exportType
| fileNameFormatter | false | `(name: string) => string` | filename formatter, by default the file name will be formatted to snake case
| fields | false | `string[]` or field name mapper type `Record<string, string>` | fields filter, also supports mapper field name by passing an name mapper, e.g. { 'bar': 'baz' }, default to `undefined`
| exportType | false | Enum ExportType | 'txt'(default), 'css', 'html', 'json', 'csv', 'xls', 'xml'
| processor | false | `(content: string, type: ExportType, fileName: string) => any` | default to a front-end downloader
| withBOM | false | boolean | Add BOM(byte order mark) meta to CSV file. BOM is expected by `Excel` when reading UTF8 CSV file. It is default to `false`.
| beforeTableEncode | false | `(entries: { fieldName: string, fieldValues: string[] }[]) => { fieldName: string, fieldValues: string[] }[]` | Given a chance to altering table entries, only works for `CSV` and `XLS` file, by default no altering.
| delimiter | false | `',' \| ';'` | Specify CSV raw data's delimiter between values. It is default to `,`
| Option name | Required | Type | Description
| ----------- | -------- |---------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
| data | true | `Array<JSON>`, `JSON` or `string` | If the exportType is 'json', data can be any parsable JSON. If the exportType is 'csv' or 'xls', data can only be an array of parsable JSON. If the exportType is 'txt', 'css', 'html', the data must be a string type.
| fileName | false | string | filename without extension, default to `'download'`
| extension | false | string | filename extension, by default it takes the exportType
| fileNameFormatter | false | `(name: string) => string` | filename formatter, by default the file name will be formatted to snake case
| fields | false | `string[]` or field name mapper type `Record<string, string>` | fields filter, also supports mapper field name by passing an name mapper, e.g. { 'bar': 'baz' }, default to `undefined`
| exportType | false | Enum ExportType | 'txt'(default), 'css', 'html', 'json', 'csv', 'xls', 'xml', 'tsv'

Documentation updates for TSV support are accurate.

The updates to the exportType and delimiter options in the documentation correctly reflect the new support for TSV files. This enhances user understanding and usability of the library.

Static Analysis Fixes:

  • Line 131: Correct the article usage from "an name mapper" to "a name mapper".
  • Line 134: Update the spelling of "UTF8" to "UTF-8" as per IANA standards.

Apply these corrections:

- an name mapper
+ a name mapper

- UTF8 CSV file
+ UTF-8 CSV file

Also applies to: 136-136

| processor | false | `(content: string, type: ExportType, fileName: string) => any` | default to a front-end downloader
| withBOM | false | boolean | Add BOM(byte order mark) meta to CSV file. BOM is expected by `Excel` when reading UTF8 CSV file. It is default to `false`.
| beforeTableEncode | false | `(entries: { fieldName: string, fieldValues: string[] }[]) => { fieldName: string, fieldValues: string[] }[]` | Given a chance to altering table entries, only works for `CSV` and `XLS` file, by default no altering.
| delimiter | false | `',' \| ';' \| '\t'` | Specify CSV/TSV raw data's delimiter between values. It is default to `,`
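How the `delimiter` and `withBOM` options shape the raw output can be sketched with a small self-contained helper. Note that `encodeRows` is hypothetical — it is not part of the library's API — and it ignores the quoting rules that `createCSVData` applies:

```typescript
// Hypothetical helper illustrating how a delimiter and an optional BOM
// combine into raw CSV/TSV output, per the options documented above.
function encodeRows(
  rows: Array<Record<string, string>>,
  delimiter: ',' | ';' | '\t' = ',',
  withBOM = false,
): string {
  const fields = Object.keys(rows[0] ?? {})
  const header = fields.join(delimiter)
  const body = rows.map(row => fields.map(f => row[f]).join(delimiter))
  const content = [header, ...body].join('\n')

  // '\ufeff' is the UTF-8 byte order mark that Excel expects.
  return withBOM ? '\ufeff' + content : content
}

const rows = [{ id: '1', name: 'foo' }, { id: '2', name: 'bar' }]
console.log(encodeRows(rows, '\t'))       // tab-separated, no BOM
console.log(encodeRows(rows, ',', true))  // comma-separated, BOM-prefixed
```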

### Tips

6 changes: 3 additions & 3 deletions src/converters.ts
@@ -110,8 +110,8 @@ export function _createTableEntries (
// (not all programs support values with line breaks).
// Rule: All other fields do not require double quotes.
// Rule: Double quotes within values are represented by two contiguous double quotes.
function encloser (value: string, delimiter: ',' | ';') {
const enclosingTester = new RegExp(`${delimiter}|"|\n`)
function encloser (value: string, delimiter: ',' | ';' | '\t') {
const enclosingTester = new RegExp(`${delimiter === "\t" ? "\\t" : delimiter}|"|\n`)
Comment on lines +113 to +114

Review the updated encloser function for handling the tab delimiter.

The function now correctly handles the tab delimiter ('\t') by adjusting the regular expression used to determine if a value needs to be enclosed in double quotes. This change is crucial for supporting TSV formats and ensures that values containing tabs are properly quoted.

However, the implementation could be optimized further by pre-compiling the regular expression outside the function, avoiding re-compilation on every call and improving performance when processing large datasets.

Consider pre-compiling the regular expression to improve performance:

+ const enclosingTesterMap = {
+   ',': new RegExp(",|\"|\n"),
+   ';': new RegExp(";|\"|\n"),
+   '\t': new RegExp("\\t|\"|\n")
+ }

function encloser(value: string, delimiter: ',' | ';' | '\t') {
-  const enclosingTester = new RegExp(`${delimiter === "\t" ? "\\t" : delimiter}|"|\n`)
+  const enclosingTester = enclosingTesterMap[delimiter]
  const enclosingCharacter = enclosingTester.test(value) ? '"' : ''
  const escaped = value.replace(/"/g, '""')

  return `${enclosingCharacter}${escaped}${enclosingCharacter}`
}

Committable suggestion was skipped due to low confidence.

const enclosingCharacter = enclosingTester.test(value) ? '"' : ''
const escaped = value.replace(/"/g, '""')

@@ -120,7 +120,7 @@ function encloser (value: string, delimiter: ',' | ';') {

interface CreateCSVDataOptions {
beforeTableEncode?: (entries: ITableEntries) => ITableEntries,
delimiter?: ',' | ';',
delimiter?: ',' | ';' | "\t",
}

const defaultCreateCSVDataOption: Required<CreateCSVDataOptions> = { beforeTableEncode: i => i, delimiter: ',' }
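For illustration, the patched `encloser` can be exercised in isolation; this sketch copies the function from the diff above verbatim:

```typescript
// Self-contained copy of the patched encloser from src/converters.ts.
// A value is wrapped in double quotes when it contains the delimiter,
// a double quote, or a line break; embedded quotes are doubled.
function encloser(value: string, delimiter: ',' | ';' | '\t'): string {
  const enclosingTester = new RegExp(`${delimiter === '\t' ? '\\t' : delimiter}|"|\n`)
  const enclosingCharacter = enclosingTester.test(value) ? '"' : ''
  const escaped = value.replace(/"/g, '""')

  return `${enclosingCharacter}${escaped}${enclosingCharacter}`
}

console.log(encloser('plain', '\t'))     // stays unquoted
console.log(encloser('has\ttab', '\t'))  // quoted: contains the delimiter
console.log(encloser('say "hi"', ','))   // quotes doubled and value wrapped
```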
12 changes: 9 additions & 3 deletions src/exportFromJSON.ts
@@ -16,7 +16,7 @@ export interface IOption<R = void> {
beforeTableEncode?: (
tableRow: Array<{ fieldName: string, fieldValues: string[] }>,
) => Array<{ fieldName: string, fieldValues: string[]}>
delimiter?: ',' | ';'
delimiter?: ',' | ';' | '\t'
}

function exportFromJSON<R = void> ({
@@ -31,7 +31,7 @@
processor = downloadFile as never,
withBOM = false,
beforeTableEncode = (i) => i,
delimiter = ',',
delimiter,
}: IOption<R>): R {
const MESSAGE_IS_ARRAY_FAIL = 'Invalid export data. Please provide an array of objects'
const MESSAGE_UNKNOWN_EXPORT_TYPE = `Can't export unknown data type ${exportType}.`
@@ -64,13 +64,19 @@
case 'json': {
return processor(JSONData, exportType, normalizeFileName(fileName, extension ?? 'json', fileNameFormatter))
}
case 'tsv':
case 'csv': {
assert(isArray(safeData), MESSAGE_IS_ARRAY_FAIL)
const BOM = '\ufeff'

if(delimiter === undefined) {
delimiter = exportType === 'tsv' ? '\t' : ',';
}

const CSVData = createCSVData(safeData, { beforeTableEncode, delimiter })
const content = withBOM ? BOM + CSVData : CSVData

return processor(content, exportType, normalizeFileName(fileName, extension ?? 'csv', fileNameFormatter))
return processor(content, exportType, normalizeFileName(fileName, extension ?? exportType, fileNameFormatter))
}
case 'xls': {
assert(isArray(safeData), MESSAGE_IS_ARRAY_FAIL)
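The delimiter-defaulting step added in this hunk reduces to a small pure function. The `resolveDelimiter` name below is illustrative, not part of the source:

```typescript
type Delimiter = ',' | ';' | '\t'

// Mirrors the new defaulting logic: an explicit delimiter always wins;
// otherwise 'tsv' implies a tab and every other type falls back to a comma.
function resolveDelimiter(exportType: string, delimiter?: Delimiter): Delimiter {
  if (delimiter !== undefined) return delimiter
  return exportType === 'tsv' ? '\t' : ','
}

console.log(resolveDelimiter('tsv'))       // '\t'
console.log(resolveDelimiter('csv'))       // ','
console.log(resolveDelimiter('tsv', ';'))  // explicit override wins
```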
3 changes: 2 additions & 1 deletion src/processors.ts
@@ -30,8 +30,9 @@ export function generateDataURI (content: string, type: ExportType, byBlob: bool

return `data:,${blobType}` + encodeURIComponent(content)
}
case "tsv":
case 'csv': {
const blobType = 'text/csv;charset=utf-8'
const blobType = `text/${type};charset=utf-8`

if (byBlob) return URL.createObjectURL(new Blob([content], { type: blobType }))

3 changes: 2 additions & 1 deletion src/types.ts
@@ -1,4 +1,4 @@
export type ExportType = 'txt' | 'json' | 'csv' | 'xls' | 'xml' | 'css' | 'html'
export type ExportType = 'txt' | 'json' | 'csv' | 'xls' | 'xml' | 'css' | 'html' | 'tsv'

export const exportTypes: { [ET in ExportType]: ET } = {
txt : 'txt',
@@ -8,4 +8,5 @@ export const exportTypes: { [ET in ExportType]: ET } = {
csv : 'csv',
xls : 'xls',
xml : 'xml',
tsv : 'tsv'
}