
Support for array parameters in namedParameters? #276

Closed
@D-Maher

Description


I am trying to construct the following SQL query:

SELECT *
FROM my_table
WHERE weight IN (0.14, 0.23);

by doing the following:

const queryOperation = await session.executeStatement(
  `SELECT * FROM my_table WHERE weight IN (:weights)`,
  {
    namedParameters: {
      weights: [0.14, 0.23],
    },
  }
);

However, running that query results in the following error: Error: writeString called without a string/Buffer argument: 0.14,0.23.

Full Stack Trace
Error: writeString called without a string/Buffer argument: 0.14,0.23
    at TBinaryProtocol.writeStringOrBinary (/path/to/my_app/node_modules/thrift/lib/nodejs/lib/thrift/binary_protocol.js:165:11)
    at TBinaryProtocol.writeString (/path/to/my_app/node_modules/thrift/lib/nodejs/lib/thrift/binary_protocol.js:170:8)
    at module.exports.TSparkParameterValue.TSparkParameterValue.write (/path/to/my_app/node_modules/@databricks/sql/thrift/TCLIService_types.js:6535:12)
    at module.exports.TSparkParameter.TSparkParameter.write (/path/to/my_app/node_modules/@databricks/sql/thrift/TCLIService_types.js:6641:16)
    at module.exports.TExecuteStatementReq.TExecuteStatementReq.write (/path/to/my_app/node_modules/@databricks/sql/thrift/TCLIService_types.js:6291:17)
    at TCLIService_ExecuteStatement_args.write (/path/to/my_app/node_modules/@databricks/sql/thrift/TCLIService.js:366:14)
    at exports.Client.TCLIServiceClient.send_ExecuteStatement (/path/to/my_app/node_modules/@databricks/sql/thrift/TCLIService.js:2373:10)
    at exports.Client.TCLIServiceClient.ExecuteStatement (/path/to/my_app/node_modules/@databricks/sql/thrift/TCLIService.js:2361:10)
    at /path/to/my_app/node_modules/@databricks/sql/lib/hive/Commands/BaseCommand.ts:58:17
    at new Promise (<anonymous>)

Is it not possible to pass an array as a namedParameter, or am I missing something?

I've also tried joining the array into a string (i.e. [0.14, 0.23].join(',')), but then I end up with a different error further down:

org.apache.spark.SparkNumberFormatException: [CAST_INVALID_INPUT] The value '0.14,0.23' of the type "STRING" cannot be cast to "DOUBLE" because it is malformed. Correct the value as per the syntax, or change its target type. Use try_cast to tolerate malformed input and return NULL instead. If necessary set "ansi_mode" to "false" to bypass this error.
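
For reference, that attempt looked like the snippet below; the joined value binds as a single STRING parameter ("0.14,0.23"), which is what Spark then tries (and fails) to cast to DOUBLE:

const queryOperation = await session.executeStatement(
  `SELECT * FROM my_table WHERE weight IN (:weights)`,
  {
    namedParameters: {
      // The array collapses into one string value, "0.14,0.23"
      weights: [0.14, 0.23].join(','),
    },
  }
);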

Is there a way to cast each member of the array to the proper data type individually?
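
For illustration, the only workaround I can think of is to expand the array into one named parameter per element, so each value is bound with its own type. This is just a sketch (the placeholder names w0, w1, ... are made up, and it reuses the same session as above); it feels like something the library could handle directly:

// Sketch: generate one placeholder and one named parameter per array element.
const weights = [0.14, 0.23];

const placeholders = weights.map((_, i) => `:w${i}`).join(', ');  // ":w0, :w1"
const namedParameters = Object.fromEntries(
  weights.map((w, i) => [`w${i}`, w])                             // { w0: 0.14, w1: 0.23 }
);

const queryOperation = await session.executeStatement(
  `SELECT * FROM my_table WHERE weight IN (${placeholders})`,
  { namedParameters }
);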
