Bug Report
I found multiple issues with `.at()` when used with tuples. Since a tuple is an array that is strictly typed by index, I think the type inference for `.at()` should be better.
🔎 Search Terms
- array
- array.at()
- at()
- tuple
- array.at() with tuple
- tuple at()
🕗 Version & Regression Information
Version: 4.8.2 (also 4.9.0-dev.20220921)
- This is the behavior in every version I tried, and I reviewed the FAQ for entries about arrays and tuples with `.at()`
⏯ Playground Link
Playground link with relevant code
💻 Code
```ts
const tuple: [string, number] = ['1', 2]

// 1. Shows that "a" can be undefined, but should be a string
// 2. Shows that "a" can be a number, but should be a string
const a = tuple.at(0)
const b = tuple[0] // this is correct

// 3. Shows a type error with bracket notation but no type error with .at()
const d = tuple.at(3)
const c = tuple[3] // shows a type error, which is correct
```
🙁 Actual behavior
When I use `.at()` with a tuple, it infers the wrong type (see the declaration sketch below):
- It shows `string | number | undefined` when used with an existing index.
- It doesn't show a type error when used with an index that doesn't exist.
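For context, this matches how `at` is declared in lib.es2022.array.d.ts: it lives on the generic `Array<T>` interface, so a tuple's element types widen to their union and every call can also return `undefined`. Roughly:

```ts
// Simplified from lib.es2022.array.d.ts (doc comments omitted). Because `at`
// is declared on Array<T>, a [string, number] tuple resolves T to
// string | number, so the call returns string | number | undefined
// regardless of which index is passed.
interface Array<T> {
  at(index: number): T | undefined;
}
```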
🙂 Expected behavior
I would expect it to work the same as bracket notation `[]` when used with a tuple, because the compiler knows which type is at which index, and whether that index exists in the tuple at all.
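For illustration only, the kind of precision this report asks for can be approximated in user code with a generic helper; `tupleAt` is a hypothetical name, and this sketch does not model negative or out-of-range indexes the way a built-in tuple-aware `.at()` presumably would:

```ts
// Hypothetical sketch: the literal index type I is used to look up the element
// type on the tuple type T, so the result matches bracket notation for known
// indexes. Negative and out-of-range indexes are not handled here.
function tupleAt<T extends readonly unknown[], I extends number>(
  tuple: T,
  index: I
): T[I] {
  return tuple[index]
}

const pair: [string, number] = ['1', 2]

const first = tupleAt(pair, 0)  // first: string
const second = tupleAt(pair, 1) // second: number
```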