# Release History

## 1.5.0

### Highlights

- Added OAuth M2M support (databricks/databricks-sql-nodejs#168, databricks/databricks-sql-nodejs#177)
- Added named query parameters support (databricks/databricks-sql-nodejs#162, databricks/databricks-sql-nodejs#175)
- The `runAsync` option is now deprecated (databricks/databricks-sql-nodejs#176)
- Added staging ingestion support (databricks/databricks-sql-nodejs#164)

### Databricks OAuth support

Databricks OAuth support, added in v1.4.0, is now extended with the M2M flow. To use OAuth instead of a PAT, pass
the corresponding auth provider type and options to `DBSQLClient.connect`:

```ts
// instantiate DBSQLClient as usual

client.connect({
  // provide other mandatory options as usual - e.g. host, path, etc.
  authType: 'databricks-oauth',
  oauthClientId: '...', // optional - overrides the default OAuth client ID
  azureTenantId: '...', // optional - provide a custom Azure tenant ID
  persistence: ..., // optional - user-provided storage for OAuth tokens; should implement the OAuthPersistence interface
});
```

The U2M flow involves user interaction: the library opens a browser tab asking the user to log in. To use this flow,
no options other than `authType` are required.

The M2M flow does not require any user interaction, which makes it a good option for scripting. To use this
flow, two extra options are required for `DBSQLClient.connect`: `oauthClientId` and `oauthClientSecret`.
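
To make the M2M flow concrete, here is a minimal sketch of such a connect options object. The host, HTTP path, and credential values below are placeholders, not real values:

```ts
// Options for DBSQLClient.connect using the M2M OAuth flow.
// All values below are placeholders - substitute your workspace host,
// HTTP path, and the credentials of a Databricks service principal.
const m2mConnectOptions = {
  host: 'xxxxxxxx.databricks.com',
  path: '/sql/1.0/warehouses/xxxxxxxxxxxxxxxx',
  authType: 'databricks-oauth',
  oauthClientId: 'service-principal-application-id',
  oauthClientSecret: 'service-principal-oauth-secret',
};

// then connect as usual, e.g.:
// const client = await new DBSQLClient().connect(m2mConnectOptions);
```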

Also see the [Databricks docs](https://docs.databricks.com/en/dev-tools/auth.html#oauth-machine-to-machine-m2m-authentication)
for more details about Databricks OAuth.

### Named query parameters

v1.5.0 adds support for [query parameters](https://docs.databricks.com/en/sql/language-manual/sql-ref-parameter-marker.html).
Currently, only named parameters are supported.

Basic usage example:

```ts
// obtain session object as usual

const operation = session.executeStatement('SELECT :p1 AS "str_param", :p2 AS "number_param"', {
  namedParameters: {
    p1: 'Hello, World',
    p2: 3.14,
  },
});
```

The library infers parameter types from the primitive values passed. Supported data types include booleans, various
numeric types (including native `BigInt` and `Int64` from `node-int64`), the native `Date` type, and strings.

It is also possible to specify the parameter type explicitly by passing `DBSQLParameter` instances instead of
primitive values. This also allows you to use values that don't have a corresponding primitive representation:

```ts
import { ..., DBSQLParameter, DBSQLParameterType } from '@databricks/sql';

// obtain session object as usual

const operation = session.executeStatement('SELECT :p1 AS "date_param", :p2 AS "interval_type"', {
  namedParameters: {
    p1: new DBSQLParameter({
      value: new Date('2023-09-06T03:14:27.843Z'),
      type: DBSQLParameterType.DATE, // by default, Date objects are inferred as TIMESTAMP; this overrides the type
    }),
    p2: new DBSQLParameter({
      value: 5, // INTERVAL '5' DAY
      type: DBSQLParameterType.INTERVALDAY,
    }),
  },
});
```

Of course, you can mix primitive values and `DBSQLParameter` instances.
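
For illustration, here is a self-contained sketch of such a mixed `namedParameters` object. The `DBSQLParameter` class below is a minimal stand-in defined only so the snippet runs on its own; in real code you would import it from `@databricks/sql` as shown above:

```ts
// Minimal stand-in for @databricks/sql's DBSQLParameter, for illustration only.
class DBSQLParameter {
  constructor(public options: { value: unknown; type?: string }) {}
}

// A mixed namedParameters object: the type of a plain primitive is inferred,
// while a DBSQLParameter instance can carry an explicit type.
const namedParameters = {
  p1: 'Hello, World',                                        // inferred as a string
  p2: new DBSQLParameter({ value: 5, type: 'INTERVALDAY' }), // explicit type
};
```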

### `runAsync` deprecation

Starting with this release, the library executes all queries asynchronously, so the `runAsync` option has been
deprecated; it will be removed completely in v2. Stop using it going forward, and remove any existing usages
from your code before v2 is released. From the user's perspective, the library's behaviour does not change.

### Data ingestion support

This feature allows you to upload, retrieve, and remove Unity Catalog volume files using the SQL `PUT`, `GET`, and `REMOVE` commands.
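
As a rough sketch of what these statements look like (the local and volume file paths below are made up, and the exact driver options for permitting local file access are not shown here):

```ts
// Illustrative staging ingestion statements; all paths are placeholders.
const volumeFile = '/Volumes/my_catalog/my_schema/my_volume/data.csv';

const putQuery = `PUT '/tmp/data.csv' INTO '${volumeFile}' OVERWRITE`; // upload a local file
const getQuery = `GET '${volumeFile}' TO '/tmp/data_copy.csv'`;        // download it back
const removeQuery = `REMOVE '${volumeFile}'`;                          // delete it from the volume

// Each statement would be executed via session.executeStatement(...) as usual.
```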

## 1.4.0

- Added Cloud Fetch support (databricks/databricks-sql-nodejs#158)