• expr@programming.dev

    I just found out about this debate and it’s patently absurd. The ISO 80000-2 standard defines ℕ as including 0, and that convention is foundational in basically all of mathematics and computer science. Excluding 0 is a fringe position and shouldn’t be taken seriously.

    • RandomWalker@lemmy.world

      I could be completely wrong, but I doubt any of my (US) professors would reference an ISO definition, and they may not even know it exists. Mathematicians in my experience are far less concerned about the terminology or symbols used to describe something as long as they’re clearly defined. In fact, they’ll probably make up their own notation just because it’s slightly more convenient for their proof.

    • Kogasa@programming.dev

      Ehh, among American academic mathematicians, including 0 is the fringe position. It’s not a “debate”; it’s just a different convention. There are numerous ISO standards whose conventions would be highly unusual in American academia.

      FWIW I was taught that the inclusion of 0 is a French tradition.

      • xkforce@lemmy.world

        The US is one of three countries on the planet that still stubbornly use imperial units as their primary system. “The US doesn’t do it that way” isn’t a great argument against adopting a standard.

      • pooberbee (any)@lemmy.ml

        This isn’t strictly true. I went to school for math in America, and I don’t think I’ve ever encountered a zero-exclusive definition of the natural numbers.

      • holomorphic@lemmy.world

        I have yet to meet a single logician, American or otherwise, who would use the definition without 0.

        That said, it seems to depend on the field. I think I’ve had this discussion with a friend working in analysis.

  • affiliate@lemmy.world

    the standard (set theoretic) construction of the natural numbers starts with 0 (the empty set) and then builds up the other numbers from there. so to me it seems “natural” to include it in the set of natural numbers.
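
    For anyone who hasn’t seen it: in the usual (von Neumann) construction, each number is the set of all smaller ones, so 0 (the empty set) is where the whole thing starts. A rough sketch:

    ```latex
    0 := \varnothing, \quad 1 := \{0\} = \{\varnothing\}, \quad 2 := \{0, 1\}, \quad \ldots, \quad n + 1 := n \cup \{n\}
    ```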

  • baseless_discourse@mander.xyz

    I think if you ask any mathematician (or any academic who uses math professionally, for that matter), they’ll tell you 0 is a natural number.

    There is nothing natural about not having an additive identity in your semiring.
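
    To spell that out, these are the axioms that need 0 around (a quick sketch; with them, (ℕ, +, ·, 0, 1) is a commutative semiring):

    ```latex
    n + 0 = 0 + n = n \quad \text{(additive identity)}, \qquad n \cdot 0 = 0 \cdot n = 0 \quad \text{(0 is absorbing)}
    ```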

  • l10lin@lemmy.world

    The definition of the natural numbers is the same as that of the non-negative integers, so of course 0 is a natural number.
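
    i.e., under that convention:

    ```latex
    \mathbb{N} = \{\, n \in \mathbb{Z} : n \ge 0 \,\} = \{0, 1, 2, 3, \ldots\}
    ```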

  • AppleMango@lemmy.world

    I have been taught, and everyone around me accepts, that Natural numbers start from 1 and Whole numbers start from 0.

    • baseless_discourse@mander.xyz

      Oh no, are we calling non-negative integers “whole numbers” now? There are proposals to change bad naming in mathematics, but I hope this is not one of them.

      On the other hand, changing “integer” to “whole number” makes perfect sense.

    • RandomWalker@lemmy.world

      Rigorously, yes. Unambiguously, no. Plenty of words (like continuity) can mean different things in different contexts. The important thing isn’t the word; it’s that the word has a clear definition within the context of a proof. Obviously you want to be able to communicate ideas clearly, and so a convention of symbols and terms has been established over time, but conventions can change too.

  • Codex@lemmy.world

    I’d learned somewhere along the line that Natural numbers (that is, the set ℕ) are all the positive integers and zero. Without zero, I was told, these were the Whole numbers. I see on Wikipedia (as I was digging up that Unicode symbol) that this is contested now. Seems very silly.

    • Kogasa@programming.dev

      There can’t really be an argument either way. It’s just a matter of convention. “Natural” is just a name; it’s not meant to imply that 1 is somehow more fundamental than -1, so arguing that 0 is “natural” is beside the point.

  • werefreeatlast@lemmy.world

    So 0 is hard. But you know what? Tell me what non-whole number comes right after or before 0. That’s right, we don’t even have a thing to call that number.

  • HexesofVexes@lemmy.world

    N is the set of “counting numbers”.

    When you count upwards you start from 1, and go up. However, when you count down you usually end on 0. Surely this means 0 satisfies the definition.

    The natural numbers are derived, according to Brouwer, from our intuition of time, by the way. From this notion, 0 is no strange idea, since it marks the moment our intuition first begins.

    • baseless_discourse@mander.xyz

      Countably infinite sets are unique up to bijection; you can count by rational numbers if you want. I don’t think counting is a good intuition.

      • HexesofVexes@lemmy.world

        On the contrary: to be countably infinite is generally assumed to mean there exists a 1-1 correspondence with N. Though I freely admit that another set could be used if you assumed it more primitive.

        • baseless_discourse@mander.xyz

          On the contrary: to be countably infinite is generally assumed to mean there exists a 1-1 correspondence with N.

          Isn’t this what I just said? If I am not mistaken, this is exactly what “unique up to bijection” means.

          Anyways, I mean whether you start from 1 or 0, they can be used to count in exactly the same way.
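
          Concretely, the shift map is a bijection between the two conventions, which is why they count equally well:

          ```latex
          f : \{0, 1, 2, \ldots\} \to \{1, 2, 3, \ldots\}, \qquad f(n) = n + 1, \qquad f^{-1}(m) = m - 1
          ```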

          • HexesofVexes@lemmy.world

            I’m arguing from the standpoint that we establish the idea of counting using the naturals: a set is countable if it can be put in 1-1 correspondence with the naturals, thus the link. Apologies for the lack of clarity.

      • baseless_discourse@mander.xyz

        I don’t personally know many programming languages that provide a natural number type in their prelude or standard library.

        In fact, I can only think of proof assistants like Lean, Coq, and Agda. Obviously the designers of these languages know enough mathematics to make the correct choice.
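
        For what it’s worth, here’s a from-scratch sketch in Lean 4 mirroring how its built-in Nat is defined (MyNat is just an illustrative name so it doesn’t clash with the prelude). The base constructor is zero, so 0 is a natural number by construction:

        ```lean
        -- Peano-style naturals: zero is the base case, everything else is a successor.
        inductive MyNat where
          | zero : MyNat
          | succ (n : MyNat) : MyNat
          deriving Repr

        -- 2 is succ (succ zero); counting starts at zero.
        #eval MyNat.succ (MyNat.succ MyNat.zero)
        ```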

        (I wouldn’t expect the same from IEEE or W3C, LOL)

  • aberrate_junior_beatnik@midwest.social

    As a programmer, I’m ashamed to admit that the correct answer is no. If zero were natural, we wouldn’t have needed tens of thousands of years to invent it.

    • lowleveldata@programming.dev

      As a programmer, I’d ask you to link your chosen definition of natural number along with your request, because I can’t be fucked to guess.