NATURAL NUMBERS, translated by Dedi Erianto Manullog from the Indonesian Wikipedia



          

In mathematics there are two conventions for the set of natural numbers. The first, traditionally favoured by number theorists, takes the natural numbers to be the positive integers {1, 2, 3, 4, ...}. The second, common among logicians and computer scientists, includes zero as well, giving {0, 1, 2, 3, ...}.

Natural numbers are among the simplest concepts in mathematics and among the first that people learn to understand; several studies even indicate that various kinds of apes and monkeys can grasp them. It is natural, then, that the natural numbers were the first kind of number used for counting and reckoning. A deeper property of the natural numbers is their relationship to the prime numbers, as studied in number theory. In more advanced mathematics, natural numbers are used for ordering and for defining the cardinality (the counting size) of a set.

Any particular number, for example 1, is an abstract concept that the eye cannot see, yet it is universal in character. One way to introduce the natural numbers as an abstract structure is through the Peano axioms (see Peano arithmetic for an illustration); a standard formulation is sketched at the end of this article. The most general notions of number require further discussion, and sometimes considerable logical depth, to grasp and define. For example, in mathematical theory, larger sets of numbers can be constructed in stages, starting from the set of natural numbers.

The History of Natural Numbers

Natural numbers originated from the words used to count things, beginning with the number one. The first great advance in abstraction was the use of numeral systems to symbolize numbers, which made it possible to write down large numbers. For example, the Babylonians developed a positional system built on symbols for 1 and 10. The ancient Egyptians had a numeral system with distinct hieroglyphs for 1, 10, and every power of 10 up to one million. A stone carving from Karnak, dated to around 1500 BC and now in the Louvre in Paris, depicts 276 as 2 hundreds, 7 tens and 6 ones; the same was done for the number 4622.

Another major advance was the development of the idea of zero as a number with its own symbol. Zero had been used in positional notation as early as 700 BC by the Babylonians, but they omitted it when it would have been the last symbol of a number.[1] The modern concept of zero originates with the Indian mathematician Brahmagupta.

In the 19th century a theory of the natural numbers was developed using set theory. Under that definition it proved more convenient to include zero (corresponding to the empty set) as a natural number, and this has since become the convention in set theory, logic and computer science. Other mathematicians, for instance in number theory, keep to the older tradition and still take 1 as the first natural number.
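As an illustration of the abstract structure mentioned above, here is one standard formulation of the Peano axioms for the natural numbers starting from zero (this sketch is an addition for clarity, not part of the original article), writing S(n) for the successor of n:

1. 0 is a natural number.
2. If n is a natural number, then S(n) is a natural number.
3. For all natural numbers m and n, if S(m) = S(n), then m = n (the successor function is injective).
4. There is no natural number n with S(n) = 0.
5. Induction: if a set K contains 0, and contains S(n) whenever it contains n, then K contains every natural number.

From these axioms the arithmetic of the natural numbers can be built up; for instance, addition is defined recursively by m + 0 = m and m + S(n) = S(m + n), and multiplication is defined in a similar way on top of addition.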
Posted on: Fri, 16 Jan 2015 02:46:44 +0000
