Add option to cache generated interfaces #156

Open
vsund opened this issue Mar 29, 2018 · 4 comments

@vsund
vsund commented Mar 29, 2018

Hey,

I made this part of my build process to make sure the interfaces stay in sync with the schema (which, by the way, works great).
Unfortunately, this adds a few seconds to every build. Since these schemas don't change often, it would be cool to have an option to cache the generated interfaces, e.g. only regenerate them when the timestamp on the schema file has changed (similar to make).

Or is there another way that I missed? Either way, happy for suggestions 👍
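
A minimal sketch of the make-style timestamp check described above (the `regen_if_stale` name is hypothetical, and the json2ts invocation in the usage comment is illustrative):

```shell
#!/usr/bin/env bash
set -euo pipefail

# regen_if_stale SRC DST CMD...
# Run CMD (reading SRC on stdin, writing DST on stdout) only when SRC is
# newer than DST, or DST does not exist yet -- the same rule make applies.
regen_if_stale() {
  local src="$1" dst="$2"
  shift 2
  if [[ "$src" -nt "$dst" ]]; then
    "$@" <"$src" >"$dst"
  fi
}

# usage: regen_if_stale schema.json schema.d.ts npx json2ts
```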

@bcherny
Owner

bcherny commented Apr 4, 2018

This is a really cool idea! I wonder if we could use a more general utility for caching. Something like:

cat schema.json | fromCache | json2ts | toCache > schema.d.ts

One implementation of something like this: https://bitbucket.org/sivann/runcached/src

If you can make a standalone package for this, I'd love to add it to the docs as the recommended way of using the CLI! What do you think?
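
A minimal sketch of such a wrapper in the spirit of runcached, assuming a content-addressed cache keyed on a hash of stdin plus the command line (the `cached` name and `CACHE_DIR` variable are hypothetical; the json2ts usage line is illustrative):

```shell
#!/usr/bin/env bash
set -euo pipefail

cached() {
  local cache_dir="${CACHE_DIR:-.json2ts-cache}"
  mkdir -p "$cache_dir"

  # Buffer stdin so it can be both hashed and replayed to the command.
  local input
  input=$(mktemp)
  cat >"$input"

  # Cache key: SHA-1 over the input plus the exact command line.
  local key
  key=$({ cat "$input"; printf '%s' "$*"; } | openssl dgst -sha1 | awk '{print $NF}')

  local entry="$cache_dir/$key"
  if [[ ! -f "$entry" ]]; then
    # Miss: run the real command once and store its output.
    if ! "$@" <"$input" >"$entry"; then
      rm -f "$entry" "$input"
      return 1
    fi
  fi
  rm -f "$input"
  cat "$entry"
}

# usage: cached npx json2ts <schema.json >schema.d.ts
```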

@vsund
Author

vsund commented Apr 5, 2018

I'm pretty sure I could solve this with a few lines of Bash (I currently don't have the capacity to maintain it as a standalone side project).
That said, I think I'd prefer this to be part of json2ts itself. It's not much logic and could be as simple as toggling it with --[no]cache.

Due to #16 my call to json2ts is already very complex :D (If I can simplify this, please let me know.)

I understand that you'd like to keep such state out of json2ts, but in this case I'm in favor of making it a built-in feature.
If you'd rather keep it external, though, I can certainly work on a small script that solves this outside the tool.

@qm3ster
Contributor

qm3ster commented Aug 23, 2018

@vsund Is there any requirement that you use the CLI binary?
I'm currently using something like this:

import * as Ajv from 'ajv'
import * as pack from 'ajv-pack'
import { join, basename, relative, dirname } from 'path'
import { outputFile, outputJson } from 'fs-extra'
import * as walk from 'klaw'
import {
  compile,
  Options as TypeCompilerOptions
} from 'json-schema-to-typescript'

import { JSONSchema4 } from 'json-schema'

const ajv = new Ajv({ sourceCode: true })

const SRC_ROOT = join(__dirname, 'src/')
const JSON_ROOT = join(__dirname, 'json/')
const DIST_ROOT = join(__dirname, 'dist/')

const SUFFIX = '.schema.ts'

const typerOptions: Partial<TypeCompilerOptions> = { bannerComment: '' }

const results: Promise<void | { error: any }>[] = []
walk(SRC_ROOT)
  .on('data', item => {
    if (!item.stats.isFile()) return
    const filename = basename(item.path)
    console.log(filename)
    if (!filename.endsWith(SUFFIX)) return
    const name = filename.slice(0, -SUFFIX.length)
    const schemaObject = import(item.path).then(
      module => module.default as JSONSchema4
    )

    const jsonDir = join(JSON_ROOT, relative(SRC_ROOT, dirname(item.path)))

    results.push(
      schemaObject.then(schemaObject =>
        outputJson(join(jsonDir, name + '.schema.json'), schemaObject).catch(
          error => {
            console.error("Couldn't make schema json", name, error)
            return { error }
          }
        )
      )
    )

    const outDir = join(DIST_ROOT, relative(SRC_ROOT, dirname(item.path)))

    results.push(
      schemaObject
        .then(schemaObject => compile(schemaObject, name, typerOptions))
        .then(typings => outputFile(join(outDir, name + '.d.ts'), typings))
        .catch(error => {
          console.error("Couldn't make schema typings", name, error)
          return { error }
        })
    )

    results.push(
      schemaObject
        .then(schemaObject => {
          const validator = ajv.compile(schemaObject)
          const validatorCode: string = pack(ajv, validator)

          return outputFile(join(outDir, 'is' + name + '.js'), validatorCode)
        })
        .catch(error => {
          console.error("Couldn't make validator code", name, error)
          return { error }
        })
    )

    const validatorTypings = `import {Validate} from '${relative(
      outDir,
      join(__dirname, './types/Validate')
    )}'
import {${name}} from './${name}'
declare const isFile: Validate<${name}>
export = isFile
`
    results.push(
      outputFile(join(outDir, 'is' + name + '.d.ts'), validatorTypings).catch(
        error => {
          console.error("Couldn't make validator typings", name, error)
          return { error }
        }
      )
    )
  })
  .on('end', async () => {
    console.log('Finishing...')
    // wait for every pending write before reporting completion
    await Promise.all(results)
    console.log('Done!')
  })

The reason it uses import() is that my schemas aren't actually JSON; they look like this:

import { JSONSchema4 } from 'json-schema'
import { freeze } from '../../utils/schema'
export default freeze({
  $schema: 'http://json-schema.org/draft-07/schema#',
  $id: 'http://bepis.com/Event.schema.json',
  title: 'Event',
  description: 'A full event',
  properties: {
    type: { description: 'The event type name', type: 'string' },
    ts: {
      description: 'A Unix timestamp, currently valid between 2010 and 2110',
      type: 'number',
      minimum: 1262304000000,
      maximum: 4417977600000
    },
    data: {}
  }
}) as JSONSchema4

I haven't solved external $refs yet.

@dhvector

dhvector commented Aug 2, 2022

Here's a slightly trimmed down portion of a checksum script I threw together to wrap our usage of json2ts, in case it's useful.

I'm computing a checksum of all json input files and all d.ts output files, and only regenerating the d.ts files when anything changes.

Snippet:
#!/usr/bin/env bash

set -eo pipefail

cd "$(dirname -- "$(readlink -f -- "$0" || echo "$0")")"

CHECKSUM_CACHE_FILE="CHANGE ME"
DST_DIR="CHANGE ME"
SRC_DIR="CHANGE ME"

main() {
  preconditions
  initialize
  codegen
  cache-checksums
}

preconditions() {
  # exit early if checksums are up-to-date
  if validate-checksums; then
    exit 0
  fi
}

initialize() {
  rm -rf "$DST_DIR" || true
}

codegen() {
  echo "generating type definitions..." >&2
  mkdir -p "$DST_DIR"

  # generate .d.ts files
  npx json2ts -i "$SRC_DIR" -o "$DST_DIR" \
    [YOUR_OPTS_HERE]
}

validate-checksums() {
  if [[ ! -d "$DST_DIR" ]]; then
    echo "no generated files found" >&2
    return 1
  elif [[ ! -f "$CHECKSUM_CACHE_FILE" ]]; then
    echo "cached checksum not found" >&2
    return 1
  fi

  local current cached
  current=$(checksum-all)
  cached=$(<"$CHECKSUM_CACHE_FILE")

  if [[ $current != "$cached" ]]; then
    echo "changes detected" >&2
    return 1
  fi
}

cache-checksums() {
  checksum-all > "$CHECKSUM_CACHE_FILE"
}

checksum-all() {
  {
    # digest < "<ANY_OTHER_FILE>";
    checksum-dir "$SRC_DIR" "*.json";
    checksum-dir "$DST_DIR" "*.d.ts";
  } | digest
}

checksum-dir() {
  local dir="$1"
  local pattern="$2"
  find "$dir" -type f -iname "$pattern" -print0 | while IFS= read -r -d '' file; do
    digest < "$file"
  done |
    sort |
    digest
}

digest() {
  openssl dgst -sha1
}

main "$@"
