I just found out about this debate and it’s patently absurd. The ISO 80000-2 standard defines ℕ as including 0, and that convention is foundational in basically all of mathematics and computer science. Excluding 0 is a fringe position and shouldn’t be taken seriously.
I could be completely wrong, but I doubt any of my (US) professors would reference an ISO definition; they may not even know it exists. Mathematicians, in my experience, are far less concerned about the terminology or symbols used to describe something as long as they’re clearly defined. In fact, they’ll probably make up their own symbology just because it’s slightly more convenient for their proof.
Yeah, don’t do that.
From what I understand, you can pay ISO to standardise anything, so it’s only useful for interoperability.
Can I pay them to make my dick length the ISO standard?
I feel they have an image to maintain, but I also feel they would sell out for enough money. So… tell me if you make it.
Yeah, interoperability. Like every software implementation of the natural numbers that includes 0.
Ehh, among American academic mathematicians, including 0 is the fringe position. It’s not a “debate,” it’s just a different convention. There are numerous ISO standards which would be highly unusual in American academia.
FWIW I was taught that the inclusion of 0 is a French tradition.
The US is one of three countries on the planet that still stubbornly use imperial units as their primary system. “The US doesn’t do it that way” isn’t a great argument against adopting a standard.
This isn’t strictly true. I went to school for math in America, and I don’t think I’ve ever encountered a zero-exclusive definition of the natural numbers.
It is true.
I have yet to meet a single logician, American or otherwise, who would use the definition without 0.
That said, it seems to depend on the field. I think I’ve had this discussion with a friend working in analysis.
I did say mathematician, not logician.
Logicians are mathematicians. Well, most of them are.
But not all mathematicians are logicians.
Logically.
The standard (set-theoretic) construction of the natural numbers starts with 0 (the empty set) and builds the other numbers up from there, so to me it seems “natural” to include it in the set of natural numbers.
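Concretely, that’s the von Neumann construction: each number is the set of all smaller numbers, and the whole tower bootstraps from the empty set, i.e. from 0:

    0 := ∅
    1 := {0} = {∅}
    2 := {0, 1} = {∅, {∅}}
    n + 1 := n ∪ {n}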
I think if you ask any mathematician (or any academic that uses math professionally, for that matter), 0 is a natural number.
There is nothing natural about not having an additive identity in your semiring.
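For instance, in Lean 4 (whose ℕ includes 0), the identity laws for 0 are already theorems in the core library; a minimal sketch:

    -- 0 is the additive identity on Nat
    example (n : Nat) : 0 + n = n := Nat.zero_add n
    -- and it is absorbing for multiplication
    example (n : Nat) : 0 * n = 0 := Nat.zero_mul n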
“Natural numbers” is defined the same as “non-negative integers,” so of course 0 is a natural number.
But -0 is also 0, so it can’t be a natural number.
I was taught, and everyone around me accepts, that natural numbers start from 1 and whole numbers start from 0.
Oh no, are we calling non-negative integers “whole numbers” now? There are proposals to change bad naming in mathematics, but I hope this is not one of them.
On the other hand, changing integer to whole number makes perfect sense.
In school I was taught that ℕ contained 0 and that ℕ* was ℕ without 0.
I was taught ℕ did not contain 0 and that ℕ₀ is ℕ with 0.
ℕ₀* is ℕ with 0 without 0
Wait, I thought everything in math is rigorously and unambiguously defined?
There’s a hole at the bottom of math.
Platonism vs. intuitionism would like a word.
Yes, and like any science it gets revisited and contested periodically.
Rigorously, yes. Unambiguously, no. Plenty of words (like continuity) can mean different things in different contexts. The important thing isn’t the word, it’s that the word has a clear definition within the context of a proof. Obviously you want to be able to communicate ideas clearly, and so conventions of symbols and terms have been established over time, but conventions can change over time too.
I’d learned somewhere along the line that the natural numbers (that is, the set ℕ) are all the positive integers and zero, and that without zero these were the whole numbers. I see on Wikipedia (as I was digging up that Unicode symbol) that this is contested now. Seems very silly.
N0
It is a natural number. Is there an argument for it not being so?
Well I’m convinced. That was a surprisingly well reasoned video.
There can’t really be an argument either way; it’s just a matter of convention. “Natural” is just a name, it’s not meant to imply that 1 is somehow more fundamental than -1, so arguing that 0 is “natural” is beside the point.
Zero is a number. Need I say more?
So 0 is hard. But you know what? Tell me what non-whole number comes right after or before 0. That’s right, we don’t even have a name for that number.
I think the p-adics have that.
ℕ is the set of “counting numbers”.
When you count upwards you start from 1, and go up. However, when you count down you usually end on 0. Surely this means 0 satisfies the definition.
The natural numbers are derived, according to Brouwer, from our intuition of time, by the way. From that notion, 0 is no strange idea, since it marks the moment our intuition first begins.
Countably infinite sets are unique up to bijection; you can count by rational numbers if you want. I don’t think counting is a good intuition.
On the contrary - to be countably infinite is generally assumed to mean there exists a 1-1 correspondence with ℕ. Though I freely admit that another set could be used if you assumed it more primitive.
> On the contrary - to be countably infinite is generally assumed to mean there exists a 1-1 correspondence with ℕ.
Isn’t this what I just said? If I am not mistaken, this is exactly what “unique up to bijection” means.
Anyway, I mean whether you start from 1 or 0, they can be used to count in exactly the same way.
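(Explicitly: n ↦ n + 1 is a bijection from {0, 1, 2, …} to {1, 2, 3, …}, so anything you can count with one you can count with the other.)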
I’m arguing from the standpoint that we establish the idea of counting using the naturals - it’s countable if it maps to the naturals, thus the link. Apologies for the lack of clarity.
0 is natural.
Source - programming languages.
I don’t personally know many programming languages that provide a natural-number type in their prelude or standard library.
In fact, I can only think of proof assistants, like Lean, Coq, and Agda. Obviously the designers of these languages know a reasonable amount of mathematics, enough to make the correct choice.
(I wouldn’t expect the same from IEEE or W3C, lol.)
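For what it’s worth, Lean 4’s core Nat literally has zero as the base case of its inductive definition; roughly, from the prelude:

    -- Lean 4 prelude: the natural numbers start at zero
    inductive Nat where
      | zero : Nat
      | succ (n : Nat) : Nat

Coq’s nat has the same shape (constructors O and S), so 0 is natural in both.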
It’s really just a joke about counting from 0 instead of 1.
Oh, array indexing, sure.
As a programmer, I’m ashamed to admit that the correct answer is no. If zero were natural, we wouldn’t have needed tens of thousands of years to invent it.
As a programmer, I’d ask you to link your preferred definition of “natural number” along with your request, because I don’t give enough of a fuck to guess.
I truly have no idea what you’re saying.
I think you’re considering whether zero is somehow “naturally occurring,” while others may be considering the concept of a natural number, which is a non-negative integer.