GraphQL large integer error: Int cannot represent non 32-bit signed integer value

I'm trying to store a UNIX timestamp in MongoDB using GraphQL, but it seems that GraphQL has a limit on the size of integers it can handle. See the mutation below:

const addUser = {
    type: UserType,
    description: 'Add an user',
    args: {
        data: {
            name: 'data',
            type: new GraphQLNonNull(CompanyInputType)
        }
    },
    resolve(root, params) {

        params.data.creationTimestamp = Date.now();

        const model = new UserModel(params.data);
        const saved = model.save();

        if (!saved)
            throw new Error('Error adding user');

        return saved;
    }
}

Result:

{
  "errors": [
    {
      "message": "Int cannot represent non 32-bit signed integer value: 1499484833027",
      "locations": [
        {
          "line": 14,
          "column": 5
        }
      ],
      "path": [
        "addUser",
        "creationTimestamp"
      ]
    }
  ]
}

I'm currently using GraphQLInt for this field in the type definition:

creationTimestamp: { 
    type: GraphQLInt
}

How can I solve this if there is no larger integer type available in GraphQL?

3 Answers



GraphQL doesn’t support integers larger than 32 bits, as the error indicates. You’re better off using a custom scalar such as GraphQL Date. There’s also a "Long" type available here. Or you could roll your own custom scalar; there’s a great example from Apollo here.

If you’re curious why GraphQL does not support anything bigger, you can check out this issue on Github.
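A minimal sketch of what rolling your own scalar might look like: the coercion function below rejects anything that isn't exactly representable as a JavaScript number (i.e. outside ±(2^53 − 1)), and the commented-out wiring shows how it would plug into graphql-js's `GraphQLScalarType`. The name `Long` and the error wording are illustrative, not from any particular library.

```javascript
// Coercion function for a hypothetical "Long" scalar: accepts any value that
// coerces to a safe integer (exactly representable in an IEEE 754 double),
// and throws for fractions or integers beyond +/-(2^53 - 1).
function coerceLong(value) {
  const num = Number(value);
  if (!Number.isSafeInteger(num)) {
    throw new TypeError(`Long cannot represent non-safe-integer value: ${value}`);
  }
  return num;
}

// Wiring it into graphql-js would look roughly like this:
// const { GraphQLScalarType, Kind } = require('graphql');
// const GraphQLLong = new GraphQLScalarType({
//   name: 'Long',
//   description: 'Integer within the IEEE 754 safe range (+/- 2^53 - 1)',
//   serialize: coerceLong,
//   parseValue: coerceLong,
//   parseLiteral: (ast) => coerceLong(ast.value),
// });
```

With this in place, the field from the question would be declared as `creationTimestamp: { type: GraphQLLong }` instead of `GraphQLInt`.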


  • Thanks for the links. Could we get an explanation? 😀

    – reergymerej

    Sep 23, 2020 at 17:52

  • @reergymerej what is still unclear from the answer?

    – Daniel Rearden

    Sep 23, 2020 at 19:00

  • I think what @reergymerej meant is a TLDR for why GraphQL doesn't support integers larger than 32 bits.

    – Paul Razvan Berg

    Nov 5, 2020 at 11:28

  • github.com/graphql/graphql-js/issues/292#issuecomment-186702763

    – Daniel Rearden

    Nov 5, 2020 at 12:52



⚠️Not recommended due to potential loss of precision…

… but if you want a quick fix (maybe you don’t have the time to implement a custom scalar), you can use the Float type instead of Int.
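Concretely, the quick fix is a one-line change in the type definition (assuming graphql-js). The reason it works for timestamps: GraphQL's Float is an IEEE 754 double, which represents every integer up to 2^53 − 1 exactly, and millisecond UNIX timestamps are far below that limit.

```javascript
// Quick-fix sketch: declare the timestamp field as Float instead of Int.
// const { GraphQLFloat } = require('graphql');
// creationTimestamp: {
//     type: GraphQLFloat
// }

// Sanity check: a current millisecond timestamp is well inside the range of
// integers a double can represent exactly.
const timestamp = Date.now();
const withinSafeRange = Number.isSafeInteger(timestamp);
```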


  • Hey, I downvoted your comment because it's not very helpful and you're even "Not recommending" it, yourself. Float and Int share the same bit-size.

    – bastianwegge

    Apr 12, 2021 at 18:50

  • 2

    I mean, I know, but when I ran into this error I didn't have the time to refactor my whole schema as the accepted answer suggests :/ This error is kind of unexpected, since a simple timestamp doesn't fit in an Int (and please try it yourself, because this 'trick' works; I found it in this issue). So maybe it's not the best answer, but it could definitely help someone facing the same problem, so there's no reason to downvote it.

    – johannchopin

    Apr 12, 2021 at 21:23

  • 1

    This says not recommended, but does not provide a reason why you wouldn't want to do this.

    – Gakio

    Jul 22, 2021 at 19:18

  • 1

    @Gakio Because it's better to use the right type for what you need. If you don't actually need floats, there's no point.

    – johannchopin

    Jul 23, 2021 at 14:18

  • 1

    The real issue is that a signed 32-bit integer can exactly represent any integer up to about 2 x 10^9. GraphQL's Float is a 64-bit double, which can exactly represent any integer up to about 9 x 10^15 but can only approximately represent numbers up to about 1.8 x 10^308: a hugely wider range, but at the cost of precision beyond that point. Sometimes loss of precision is fine and other times it very much isn't.

    – Tamlyn

    Jan 27 at 21:11




Float instead of Int is actually not a bad choice in some cases. Int is a signed 32-bit integer, so it can safely represent numbers with up to 9 decimal digits, whereas Float is a 64-bit floating-point number (see: https://graphql.org/learn/schema/) with a 53-bit significand, so it can exactly represent every integer with up to 15 decimal digits.
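A quick demonstration of those limits, using the timestamp from the question's error message:

```javascript
// Int tops out near 2.1e9; a double represents integers exactly up to 2^53 - 1.
const INT32_MAX = 2 ** 31 - 1;       // 2147483647 (~9 full decimal digits)
const FLOAT_SAFE_MAX = 2 ** 53 - 1;  // 9007199254740991 (~15 full decimal digits)

const timestamp = 1499484833027;     // timestamp from the error in the question
const fitsInInt32 = timestamp <= INT32_MAX;       // false -> hence the error
const fitsInFloat = timestamp <= FLOAT_SAFE_MAX;  // true
```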
