Well, you can naturally have zero of something. In fact, you have zero of most things right now.
How do you know so much about my life?
I have seen arguments that zero is countable by a kind of transitive property: not counting at all is still an option within any set of numbers you can intuitively count to.
How do I have anything if I have nothing of something?
the standard (set theoretic) construction of the natural numbers starts with 0 (the empty set) and then builds up the other numbers from there. so to me it seems “natural” to include it in the set of natural numbers.
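That construction (the von Neumann encoding, where each natural is the set of all smaller naturals) can be sketched in a few lines of Python. This is just an illustrative sketch; the helper names `zero` and `successor` are my own, and `frozenset` stands in because ordinary Python sets can't contain sets.

```python
# Von Neumann construction of the naturals:
# 0 := {} (the empty set), and S(n) := n ∪ {n}.

def zero():
    return frozenset()          # 0 is the empty set

def successor(n):
    return n | {n}              # S(n) = n ∪ {n}

one = successor(zero())         # {0}
two = successor(one)            # {0, 1}
three = successor(two)          # {0, 1, 2}

# The cardinality of each encoded number equals the number itself:
assert len(zero()) == 0
assert len(three) == 3
# Under this encoding, "m < n" is literally "m ∈ n":
assert two in three
```

So the whole tower of naturals is built starting from 0, which is part of why set theorists find excluding it unnatural.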
On top of that, I don’t think it’s particularly useful to have 2 different easy shorthands for the positive integers, when it means that referring to the union of the positive integers and the singleton of 0 becomes cumbersome as a result.
I think if you ask any mathematician (or any academic that uses math professionally, for that matter), 0 is a natural number.
There is nothing natural about not having an additive identity in your semiring.
I’d learned somewhere along the line that the Natural numbers (that is, the set ℕ) are all the positive integers and zero. Without zero, I was told, these were the Whole numbers. I see on Wikipedia (as I was digging up that Unicode symbol) that this is contested now. Seems very silly.
Weird, I learned the exact reverse. The recommended mnemonic was that the whole numbers included zero because zero has a hole in it.
Definition of natural numbers is the same as non-negative numbers, so of course 0 is a natural number.
In some countries, zero is neither positive nor negative. But in others, it is both positive and negative. So saying the set of natural numbers is the same as the non-negative [integers] doesn’t really help. (Also, obviously not everyone would even agree with that definition, regardless of whether zero counts as negative.)
But -0 is also 0, so it can’t be a natural number.
ℕ₀
I have been taught and everyone around me accepts that Natural numbers start from 1 and Whole numbers start from 0
Oh no, are we calling non-negative integers “whole numbers” now? There are proposals to change bad naming in mathematics, but I hope this is not one of them.
On the other hand, changing integer to whole number makes perfect sense.
Wait, I thought everything in math is rigorously and unambiguously defined?
There’s a hole at the bottom of math.
There’s a frog on the log on the hole on the bottom of math. There’s a frog on the log on the hole on the bottom of math. A frog. A frog. There’s a frog on the log on the hole on the bottom of math.
Rigorously, yes. Unambiguously, no. Plenty of words (like continuity) can mean different things in different contexts. The important thing isn’t the word, it’s that the word has a clear definition within the context of a proof. Obviously you want to be able to communicate ideas clearly, and so conventions of symbols and terms have been established over time, but conventions can change over time too.
Platonism Vs Intuitionism would like a word.
Zero grew up from the seeds of the undefined, just like negative numbers and people who refuse to accept that the square root only has one value. Undefined is a pathway to many abilities some would consider unnatural.
It is a natural number. Is there an argument for it not being so?
Well I’m convinced. That was a surprisingly well reasoned video.
My favourite part is all the replies claiming that their answer to it is correct and it’s not at all controversial.
Which is funny because to a mathsless individual like me it proves how true the post is.
N is the set of “counting numbers”.
When you count upwards you start from 1, and go up. However, when you count down you usually end on 0. Surely this means 0 satisfies the definition.
The natural numbers are derived, according to Brouwer, from our intuition of time, by the way. From this notion, 0 is no strange idea, since it marks the moment our intuition first begins.
0 is natural.
Source - programming languages.
*Most programming languages
We don’t talk about those kids, they’re weird. :)
I don’t personally know many programming languages that provide a natural-number type in their prelude or standard library.
In fact, I can only think of proof assistants, like Lean, Coq, and Agda. Obviously the designers of these languages know a reasonable amount of mathematics to make the correct choice.
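And for what it’s worth, those proof assistants all start the naturals at zero. Lean 4’s built-in `Nat`, for example, is (roughly) the standard inductive definition:

```lean
-- Lean 4's natural numbers: zero is the base case,
-- and every other natural is a successor of one.
inductive Nat where
  | zero : Nat
  | succ (n : Nat) : Nat
```

Coq’s `nat` and Agda’s `ℕ` are defined the same way, so in that corner of the programming world 0 ∈ ℕ isn’t even a debate.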
(I wouldn’t expect the same from IEEE or W3C, LOL)
It’s really just a joke about counting from 0 instead of 1.
Oh, array indexing, sure.
Countably infinite sets are unique up to bijection; you can count with the rational numbers if you want. I don’t think counting is a good intuition.
On the contrary - to be countably infinite is generally assumed to mean there exists a 1-1 correspondence with ℕ. Though, I freely admit that another set could be used if you assumed it more primitive.
On the contrary - to be countably infinite is generally assumed to mean there exists a 1-1 correspondence with ℕ.
Isn’t this what I just said? If I am not mistaken, this is exactly what “unique up-to bijection” means.
Anyway, I mean whether you start from 1 or from 0, they can be used to count in exactly the same way.
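That point is easy to make concrete: `n ↦ n + 1` is a bijection between the naturals starting at 0 and the naturals starting at 1, so either convention counts the same things. A minimal sketch (function names are mine, just for illustration):

```python
# A bijection between {0, 1, 2, ...} and {1, 2, 3, ...}:
# both conventions for ℕ are "the same" up to relabeling.

def to_one_based(n):
    return n + 1

def to_zero_based(n):
    return n - 1

zero_based = list(range(10))                 # 0..9
one_based = [to_one_based(n) for n in zero_based]

assert one_based == list(range(1, 11))       # 1..10
# Round-tripping recovers the original, so the map is invertible:
assert [to_zero_based(m) for m in one_based] == zero_based
```

That invertibility is all “unique up to bijection” asks for.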
I’m arguing from the standpoint that we establish the idea of counting using the naturals - it’s countable if it maps to the naturals, thus the link. Apologies for the lack of clarity.
So 0 is hard. But you know what? Tell me what non-whole number comes right after or before 0. That’s right, we don’t even have a thing to call that number.
±ε
Just make star wars universe live action Rick and Morty but crucially WITHOUT Rick and Morty.
Science memes…
Shows a Jedi.
🤡🤡🤡🤡🤡