Notation, Language, and Rigor
Most of the mathematical notation in use today was not invented until the 16th century. Before that, mathematics was written out in words, a painstaking process that limited mathematical discovery. Euler (1707–1783) was responsible for many of the notations in use today. Modern notation makes mathematics much easier for the professional, but beginners often find it daunting. It is extremely compressed: a few symbols contain a great deal of information. Like musical notation, modern mathematical notation has a strict syntax (which to a limited extent varies from author to author and from discipline to discipline) and encodes information that would be difficult to write in any other way.
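To give a sense of this compression, here is a minimal sketch in Lean 4 (assuming the Mathlib library for the real numbers; the name TendsTo is an ad hoc illustration, not a standard library definition) of the ε–δ statement "f tends to L at a". In words this reads: for every positive tolerance ε there is a positive δ such that every x within δ of a, but distinct from a, has f(x) within ε of L. In symbols it is one line:

    import Mathlib.Data.Real.Basic

    -- One line of symbols carries the entire quantifier structure that the
    -- verbal description spreads over several clauses.
    def TendsTo (f : ℝ → ℝ) (a L : ℝ) : Prop :=
      ∀ ε > 0, ∃ δ > 0, ∀ x : ℝ, 0 < |x - a| → |x - a| < δ → |f x - L| < ε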
Mathematical language can be difficult for beginners to understand. Words such as "or" and "only" have more precise meanings than in everyday speech. Moreover, words such as "open" and "field" have been given specialized mathematical meanings, and technical terms such as "homeomorphism" and "integrable" have precise definitions of their own. Additionally, shorthand phrases such as "iff" for "if and only if" belong to mathematical jargon. There is a reason for this special notation and technical vocabulary: mathematics requires more precision than everyday speech. Mathematicians refer to this precision of language and logic as "rigor".
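For instance, the mathematical "or" is inclusive: "p or q" holds when either or both parts hold, unlike the usually exclusive everyday reading. The following checked examples, a minimal sketch in Lean 4 (core language only; the statements are generic illustrations, not drawn from the source), make the point concrete:

    -- The mathematical "or" is inclusive: p ∨ q already holds when p does,
    -- whether or not q also holds.
    example (p q : Prop) (hp : p) : p ∨ q := Or.inl hp

    -- "iff" (↔) packs two implications into one statement; .mp extracts
    -- the forward direction.
    example (p q : Prop) (h : p ↔ q) (hp : p) : q := h.mp hp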
Mathematical proof is fundamentally a matter of rigor. Mathematicians want their theorems to follow from axioms by means of systematic reasoning. This is to avoid mistaken "theorems", based on fallible intuitions, of which many instances have occurred in the history of the subject. The level of rigor expected in mathematics has varied over time: the Greeks expected detailed arguments, but at the time of Isaac Newton the methods employed were less rigorous. Problems inherent in the definitions used by Newton would lead to a resurgence of careful analysis and formal proof in the 19th century. Misunderstanding this rigor is a cause of some of the common misconceptions about mathematics. Today, mathematicians continue to argue among themselves about computer-assisted proofs: since large computations are hard for a human to verify, some argue that such proofs are not sufficiently rigorous.
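As a sketch of what machine-checked rigor looks like, here is a Lean 4 proof of the commutativity of addition on the natural numbers, a standard textbook example rather than anything specific to this article. The proof assistant verifies each step against the definitions; nothing is accepted on intuition alone:

    -- Commutativity of addition on the natural numbers, proved by induction:
    -- the base case by simplification, the inductive step by two rewrites
    -- and the induction hypothesis.
    theorem addComm (m n : Nat) : m + n = n + m := by
      induction n with
      | zero => simp
      | succ n ih => rw [Nat.add_succ, Nat.succ_add, ih]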
Axioms in traditional thought were "self-evident truths", but that conception is problematic. At a formal level, an axiom is just a string of symbols, which has an intrinsic meaning only in the context of all derivable formulas of an axiomatic system. It was the goal of Hilbert's program to put all of mathematics on a firm axiomatic basis, but according to Gödel's incompleteness theorem every sufficiently powerful, consistent axiomatic system has undecidable formulas; a final axiomatization of mathematics is therefore impossible. Nonetheless, mathematics is often imagined to be (as far as its formal content) nothing but set theory in some axiomatization, in the sense that every mathematical statement or proof could be cast into formulas within set theory.
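One standard illustration of this set-theoretic reduction is Kuratowski's classical encoding of the ordered pair. The sketch below renders it in Lean 4 (assuming Mathlib for set notation; the name kpair is an ad hoc label):

    import Mathlib.Data.Set.Basic

    -- Kuratowski's encoding: the ordered pair (a, b) is represented by the
    -- set {{a}, {a, b}}.  Its characteristic property, that (a, b) = (c, d)
    -- exactly when a = c and b = d, then becomes a theorem about sets
    -- rather than a primitive assumption.
    def kpair {α : Type} (a b : α) : Set (Set α) := {{a}, {a, b}}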