  • Custom User Avatar

    The problem itself, being set up for multiple languages, makes no mention of JavaScript "Numbers"; it refers only to "integers".

    In JavaScript, the Number, BigInt, or Boolean primitives can all be used to represent what we consider an integer.
    I don't see the justification for making the Number primitive the only acceptable way to represent a mathematical integer (see the sketch at the end of this comment).

    I think the test case "[ [ true ] ] should return false" should be omitted from the tests, as it's ambiguous and would require deeper context.
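
    A minimal sketch of the distinction being discussed (this is not the kata's actual test code, and isIntegerNumber is a hypothetical check): all three primitives below can represent the mathematical integer 1, yet a check written against the Number primitive accepts only one of them.

    ```javascript
    // Three JavaScript primitives that can each represent the integer 1.
    const candidates = [1, 1n, true];

    // Hypothetical strict check, as many JS solutions would write it:
    // accept only the Number primitive holding an integral value.
    const isIntegerNumber = (x) => typeof x === "number" && Number.isInteger(x);

    for (const value of candidates) {
      console.log(typeof value, isIntegerNumber(value));
    }
    // number  true
    // bigint  false
    // boolean false
    ```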

  • Custom User Avatar

    I think some others have noted this here, but for JavaScript there is a test case of [ [ true ] ] which is disallowed. Booleans are typically considered a special case of number and behave as such in JS (e.g., 5 + true == 6; see the sketch below). It seems that [ [ true ] ] should be true as a best practice, since the trivial 1x1 sudoku can only hold the single value 1, which is naturally represented by a boolean.
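
    A minimal sketch of the coercion behaviour referred to above (the grid here is a hypothetical 1x1 input, not the kata's own fixture): booleans coerce to numbers under arithmetic and loose equality, so whether [ [ true ] ] "contains the digit 1" depends entirely on whether the validator compares loosely or checks the primitive type.

    ```javascript
    // Boolean-to-number coercion in arithmetic and loose comparison.
    console.log(5 + true);   // 6
    console.log(true == 1);  // true
    console.log(true === 1); // false

    // Hypothetical 1x1 grid using a boolean where the digit 1 is expected.
    const grid = [[true]];
    const value = grid[0][0];

    // A loose check treats it as a valid 1; a strict type check rejects it.
    console.log(value == 1);                               // true
    console.log(typeof value === "number" && value === 1); // false
    ```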