Numbers, in their pure form, are abstract concepts. They only behave according to the set of rules you make them follow. What you are asking is really a question of number theory. I'm sure that if you are really that upset with the idea that 1+1=2, you could find some abstract algebra out there in which the result doesn't hold. For instance, if you disregard the rule forbidding division by zero, you can "prove" that 1=2 (or more precisely that 1x=2x, i.e. any number equals its double). Using that idea, you could make 1+1 equal whatever your heart desires. Or, if it's just the visual aspect of the equation that bothers you, you could always assign other symbols to 'one' and 'two' and come up with <+<=^ or something. Or you could switch bases and get 1+1=10, but then we're just arguing semantics, because it's the same thing in a different 'language'. The simple answer is that 1+1=2 because that's what people agreed on a long, long time ago. There is no magical, mystical secret here.
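The base-switching remark is easy to check in a couple of lines of Python (a throwaway sketch, nothing more):

```python
# The sum of one and one, written out in base 2: same number,
# different notation.
total = 1 + 1
print(bin(total))          # -> '0b10', i.e. "10" in base 2
print(int("10", 2) == 2)   # reading "10" as a base-2 numeral gives 2 back
```

So "1+1=10" in binary and "1+1=2" in decimal are the same fact spelled two ways.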
bravo Mephura. "The simple answer is that 1+1=2 because that's what people agreed on a long, long time ago. There is no magical, mystical secret here." It is a standard agreed on for doing business, not a fundamental truth.
Zero doesn't count? Oh, right, natural numbers. So this proof doesn't work on the set of integers, then? We have to fix up the situation before the math works, right? The point being: the proof works, yes, but only in certain situations. In other situations, the math doesn't work. That makes it a subset of all possibilities (because other rules apply in other situations), and therefore it is not a full definition of what you are trying to describe (the universe). As such, while the rules behind math exist in nature, and those rules were mapped out into what is now used as integer math, those rules are a simplification of what exists. They describe it; they don't create or rule it. These rules define simple integer math and ignore the rest. And the rest fits into a much bigger bag.
Umm... isn't that a little bit too incorrect? 2 is not 1 according to that; 2 is still 2. Or have I made a mistake? Please correct me if I have. (aa - ab) / (aa - ab) is not 1. Let me see...

(aa/aa) - (aa/ab) - (ab/aa) + (ab/ab) = 1 - a/b - b/a + 1 = 2 - (ab/ab) - (ab/ab) = 2 - [(ab-ab)/ab] = 2 - [(ab/ab) - (ab/ab)] = 2 - (1-1) = 2 - 0 = 2

But I'm really tired so I can't think straight. Oh well, I'm still having fun.

I've been wondering about why 1+1=2 for a while now. All I can think of is that if 1+1=2 then 1=2-1, and it follows that 1=1. That comes from marking 1 with x: then x+1 = 2, so x = 2-1, so x = 1. That wouldn't work without the mathematical laws that have been agreed on, though. We don't really know anything; everything has just been agreed on to be the way it is. Like which direction the electrons go in circuits, or how much 1 kg weighs.
Re: bravo Mephura It seems to me that 1+1=2 even before we had language and math to communicate the fact to each other, even before life existed.
This has already been said, I know, BUT: 1+1=2 because 2 is defined as 1+1. End of story. If three were defined as 1+1 and two were defined as 1+1+1, then we would count 1, 3, 2, 4... The Romans would have said I+I=II, and nobody would have questioned it, because the symbol used for a pair of units is a pair of unit symbols. For convenience and simplicity we use "2" instead of "11" to represent a pair of units, but again, 1+1 only equals 2 because the Arabic number system uses "2" as a more convenient replacement for "1+1".
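The symbol-substitution point can be illustrated with a toy snippet (the dictionary of notations here is my own invention for the example, nothing standard):

```python
# One quantity, two conventional spellings: the value of 1 + 1
# doesn't change when the symbols do.
value = 1 + 1
spellings = {
    "Arabic": str(value),   # "2"
    "Roman": "I" * value,   # "II" -- literally a pair of unit symbols
}
print(spellings)
```

"2" and "II" are different names for the same pair of units; the arithmetic is untouched by the renaming.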
Here is the accepted mathematical answer, in other words the right answer. Take some classes on set theory if you want to know more.

The proof starts from the Peano Postulates, which define the natural numbers N. N is the smallest set satisfying these postulates:

P1. 1 is in N.
P2. If x is in N, then its "successor" x' is in N.
P3. There is no x such that x' = 1.
P4. If x isn't 1, then there is a y in N such that y' = x.
P5. If S is a subset of N, 1 is in S, and the implication (x in S => x' in S) holds, then S = N.

Then you have to define addition recursively:

Def: Let a and b be in N. If b = 1, then define a + b = a' (using P1 and P2). If b isn't 1, then let c' = b, with c in N (using P4), and define a + b = (a + c)'.

Then you have to define 2:

Def: 2 = 1'. 2 is in N by P1, P2, and the definition of 2.

Theorem: 1 + 1 = 2
Proof: Use the first part of the definition of + with a = b = 1. Then 1 + 1 = 1' = 2. Q.E.D.

Note: There is an alternate formulation of the Peano Postulates which replaces 1 with 0 in P1, P3, P4, and P5. Then you have to change the definition of addition to this:

Def: Let a and b be in N. If b = 0, then define a + b = a. If b isn't 0, then let c' = b, with c in N, and define a + b = (a + c)'.

You also have to define 1 = 0' and 2 = 1'. Then the proof of the Theorem above is a little different:

Proof: Use the second part of the definition of + first: 1 + 1 = (1 + 0)'. Now use the first part of the definition of + on the sum in parentheses: 1 + 1 = (1)' = 1' = 2. Q.E.D.

http://mathforum.org/library/drmath/view/51551.html
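The recursive definition of addition is small enough to model directly. Below is a sketch in Python of the 0-based formulation, with a natural number encoded as nested applications of a successor function (the names ZERO, successor, and add are my own, chosen for the example):

```python
# A toy model of the 0-based Peano naturals: a number is represented
# by how many times successor() wraps ZERO.
ZERO = ()

def successor(x):
    return (x,)          # x' is just x wrapped in one more layer

def add(a, b):
    if b == ZERO:        # a + 0 = a
        return a
    c = b[0]             # otherwise b = c', so a + b = (a + c)'
    return successor(add(a, c))

ONE = successor(ZERO)    # 1 = 0'
TWO = successor(ONE)     # 2 = 1'

print(add(ONE, ONE) == TWO)   # the theorem: 1 + 1 = 2
```

The recursion in `add` mirrors the two clauses of the definition: the base case handles b = 0, and the recursive case peels one successor off b, exactly as in the proof above.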
Sorry to drag up this thread, but the same argument is going on in another forum. Nasor: what happens to this proof if you move the number set out of the natural numbers? Say to integers, decimals, imaginaries, and so on, all the way up to the number set Q where all possible numbers exist?
The subject of this thread apparently assumes that we are just dealing with pure math, which would then result in the answer to the question being: because we have defined it so. Without that assumption, 1+1 rarely equals 2 and usually doesn't. Try adding one apple and one hungry donkey, or one just-sub-critical mass of Plutonium and another equal mass, or virtually whatever. Two things being similar enough to be added together to make two is the exception.
Who cares? Using natural numbers is sufficient to prove that 1+1=2, since 1 and 2 are both natural numbers.
you don't have to. this is merely a logical consequence of the common definition of numbers. obviously it's highly practical to accept. technically though, it only exists as an abstract and has zero bearing on reality. it has bearing on how to interpret reality in your mind.
2 = 1 + 1 by mathematical definition. whether 1 + 1 really do give you "2".... ask Orwell. he's got an interesting theory on math in "1984" ...
Most of the posts seem to start with the presupposition that numbers are purely conventional, and then they "prove" it in basically two ways: either by showing that our notational system is conventional, or by pointing out that in alternative algebras the elements most resembling our 0 or 1 behave differently. Both types of argument miss the point in my opinion. The question is neither about notational systems, nor alternative algebraic systems. <b>It is about the logical and metaphysical status of natural numbers and the operations on them. </b>Whatever those are, they are not purely conventional, because then the rules of addition could be changed at will or by legislation. But this is not so: through changing the conventions a monarch could change the notational system or could force people to use a different algebra instead of the usual algebra of natural numbers, but even a monarch could not change the way natural numbers behave. This is because numbers and the notational means we use to refer to them are different things, and of course the names can be changed without changing the entities they are the names of. It is particularly tempting to treat numbers as purely conventional because they aren't perceptible entities, so we predominantly think about them in terms of their signs. You can change the names you use; you may even change the whole language, introducing different rules so that you will talk about something completely different in the end. These are all possible manoeuvres but constitute no real answer to the original question. Best,
It seems to me that you are confusing the natural numbers with the physical properties they were designed to represent. Yes, you can change the notation; no, you can't change how mass works in the universe. But that *is* changing the NUMBERS. The numbers are a notion, a representation of the world. If all the humans were dead, would numbers be floating around? No. Would the rules governing natural-number math still exist? Yes, sort of. The defined rules would be gone, because no one would know them; the rules are the written description of how to do the math. The physical principles on which those rules were based, principles such as 'an object in motion stays in motion', would continue to exist. The rule and the number are invented, and will leave with us; the stuff they were invented to describe would not go anywhere.
--------------------------------------------------------------------------------
Originally posted by proteus42
they are not purely conventional because then the rules of addition could be changed at will or by legislation. But this is not so: through changing the conventions a monarch could change the notational system or could force people to use a different algebra instead of the usual algebra of natural numbers, but even a monarch could not change the way natural numbers behave.
--------------------------------------------------------------------------------

Sorry, Wes, for any imprecision in my text. Please take the word "this" as referring back to the view that numbers are conventional creatures. It should be better now.

River-wind, how can you be so certain about this? It's impossible to prove a statement like this. Where does your confidence come from, then? That's also a dogma, sorry. What you say about rules shows that you take numbers as constituted by rules, just like chess is constituted by the rules people play it by. And it is true that if humanity went extinct, games like chess would go with the last human. But it is some sort of confusion to assimilate arithmetic to chess. The rules of chess can be imagined being different pretty easily: you can consider what it would be like if rooks moved differently. If you really believe arithmetic is essentially the same as chess, then you should be able to imagine what it would be like if 1+1 were not 2 but, say, 43. But whereas you could coherently describe what it would be like for the rook to move differently, I'd like to see a coherent description of what a world would look like where 1+1 is not 2. (Our world is not like that, in spite of the fact that two masses of Plutonium might annihilate each other, as one of the posts noted.
Addition in the set of natural numbers is not a physical operation like "putting near each other" or the like: when you count things they don't have to be in the same place for a minute. They don't even have to exist physically.)
Your point is mostly rational but IMO flawed in that it assumes a perspective. If your perspective on the world is that of a dolphin, do numbers exist? What about from the perspective of a rock? What about from the perspective of a computer (which is, I guess, redundant, as it is the same as that of the rock)? Mathematics and logic are actually both logical consequences of the definition of a number system. I see a strong argument that "they are a property of abstract space", in that a "thinking being" who came across the concept of numbering would be forced to draw similar conclusions based on that concept. In other words, math is self-consistent. However, the definition is purely abstract (it cannot exist anywhere besides conscious minds) and has no bearing on anything other than our interpretation of our input.
I'd agree that a thinking being would most likely come to the same conclusions. But I don't see the connection to math therefore being self-consistent. I see the properties which math (the set of written rules and written numbers) describes as being self-consistent. I think we have different ideas of what exactly falls under the heading "Math". I place the "concept of numbering" under the heading of "universal property", and the "act or practice or method of numbering" under "Math". Therefore the thinking being in your example would be inventing his own Math, even if it were identical to our own.

edit: and you're right, it does have a fair amount of personal prejudice in it. It's hard to get rid of.

And proteus42: you are right, it is unprovable; I should have been more specific: "In all likelihood, if all the humans were dead, would numbers be floating around? No." This, of course, is still only my opinion, and is based on what I define "Math" as.