Math Notation vs Code
The cluster debates the ambiguity, terseness, and context-dependence of traditional mathematical notation versus the explicitness and readability of programming code, with programmers criticizing the notation and mathematicians defending its purpose.
Sample Comments
Math notation is just obfuscated code in a language that doesn't compile or run.
Mathematical notation is often too vague, informal, and too dependent on context. Frankly, there is too much ambiguous overloading of symbols. It seems the language has evolved to be succinct and short to write, rather than explicit and unambiguous. I know maths people say that it is meant to be read by people and not by computers, but more people are accustomed to reading code than ever before. I feel that math notation needs reform that makes it formal, unambiguous, and capable of describing imperative…
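To make the "ambiguous overloading" complaint concrete, here is a minimal Python sketch (the NumPy calls are standard; the example itself is illustrative, not from the thread) showing how code forces apart four readings that blackboard notation writes with the same vertical bars, |.|:

    import numpy as np

    x = -3.5
    v = np.array([3.0, 4.0])
    A = np.array([[1.0, 2.0], [3.0, 4.0]])
    S = {1, 2, 3}

    # In notation all four of these are written |.|; in code each
    # reading needs its own explicit name.
    print(abs(x))             # absolute value of a scalar -> 3.5
    print(np.linalg.norm(v))  # Euclidean norm of a vector -> 5.0
    print(np.linalg.det(A))   # determinant of a matrix    -> approx -2.0
    print(len(S))             # cardinality of a set       -> 3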
I do complex math without using Greek letters and other untypable characters all the time. It's called programming. The order of operations is explicit, the operations themselves are explicit, and there is very little room for ambiguity. The notation is only for saving space, not because it's the be-all, end-all, absolute best solution for representing mathematics.
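As a concrete version of that claim, the compact formula sigma^2 = (1/n) * sum_i (x_i - mu)^2 can be written with every operation spelled out; a minimal Python sketch (the function and variable names are illustrative, not from the thread):

    # Population variance: in notation, sigma^2 = (1/n) * sum_i (x_i - mu)^2.
    # In code, each operation and its order is explicit.
    def variance(xs):
        n = len(xs)
        mu = sum(xs) / n  # the mean, usually written with a Greek mu
        return sum((x - mu) ** 2 for x in xs) / n

    print(variance([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]))  # -> 4.0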
Perhaps it's time to rethink the sanctity of mathematical symbolic communication. Seriously, that equation borders on parody in attempting to relay a rather simple concept. We don't expect CS people to communicate concepts through compiled, machine-language-style syntax (minimum number of bits above all else), so why not revisit the communication of mathematical concepts with weight given to clarity over compression?
Can you give an example of a piece of math notation that you think could be improved? Maths is more like a human language: the writing necessarily does not contain a complete picture, just as a book can never unambiguously describe a scene. This is what sets it apart from the computer programs you have compared it with, since a computer program is a complete, unambiguous definition of the thing it is describing. All that math notation does is give us a language that sits somewhere between…
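One way to read the "complete, unambiguous definition" point: notation's superscript convention means different things in different places (sin^2 x is (sin x)^2, yet sin^-1 x is the inverse function arcsin, not 1/sin x), while each code expression admits exactly one reading. A minimal sketch using only Python's standard library:

    import math

    x = 0.5

    # "sin^2 x" conventionally means (sin x)^2 ...
    print(math.sin(x) ** 2)
    # ... but "sin^-1 x" conventionally means arcsin(x), not 1/sin(x).
    print(math.asin(x))
    print(1 / math.sin(x))  # the reading the superscript pattern would suggest
    # The three expressions are distinct in code; in notation, the reader
    # must know which convention the author intends.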
As a mathematician by training who does a lot of programming for a living: this is the biggest misconception about math I see coming from programmers. There's frequently a complaint about notation (be it that it is too compact, too obscure, too gatekept, whatever) and the difficulty of picking out what a given "line" (meaning equation, diagram, theorem, or whatever) means without context. Here's the thing, though: mathematics isn't code! The symbols we use to…
In my opinion, the problem with math is the actual mathematical notation, which is, frankly, terrible. Ridiculously bad, especially given the advances that have been made in CS in that regard. Early programming languages created by mathematicians? They were terrible (for the most part). Then CS people started formalizing stuff, grammars were invented, and finally the situation began to improve in the late 60s. But not math. Nope, it's the same archaic, imprecise, terrible notation that's been a…
Well put! I was in the same boat, i.e. wanting everything to be expressible in C/C++-type syntax, before I realized that it was quite the wrong way of looking at things, and in particular at mathematics. Mathematical notation is fundamental, while programming-language notation is incidental, when looking at mathematical subjects. It was merely an unwillingness to put forth the effort to learn a new "shorthand conceptual language" that was holding me back. Once I learnt to read the notation…
What I mean is that mathematical notation was never developed with the intention of being easily understood. Nobody writes out a formula and then spends time making it more readable. In programming they do: it's called refactoring. What mathematicians typically do to polish an already correct formula is called "simplification," and this usually leads to their collection of symbols becoming even more inscrutable. But the more mathematical proofs, formulae, and other statements are unintelligible…
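For contrast, here is "refactoring for readability" applied to a formula; a minimal Python sketch (both functions compute the same quadratic-formula root; the names are illustrative):

    import math

    # Notation-style: correct, but the whole computation is one dense line.
    def root_dense(a, b, c):
        return (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)

    # Refactored: the same computation with the intermediate step named.
    def root_refactored(a, b, c):
        discriminant = b * b - 4 * a * c
        return (-b + math.sqrt(discriminant)) / (2 * a)

    print(root_dense(1, -3, 2), root_refactored(1, -3, 2))  # both -> 2.0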
I always wondered why math notation hasn't been replaced by code. While math notation can be interpreted in different ways, code is going to give an unambiguous result. Why don't mathematicians, especially those writing in the machine learning or computer science domains, do this? Is it just a problem of agreeing on a common language?
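Partial answers to this question already exist: computer-algebra systems treat formulas as unambiguous, executable objects. A minimal sketch using SymPy, a real Python library (the expressions chosen are illustrative):

    import sympy as sp

    x = sp.symbols('x')

    # A symbolic identity, simplified mechanically rather than by convention.
    print(sp.simplify(sp.sin(x) ** 2 + sp.cos(x) ** 2))  # -> 1
    # Calculus as code: each result is exact and reproducible.
    print(sp.diff(sp.sin(x), x))     # -> cos(x)
    print(sp.integrate(x ** 2, x))   # -> x**3/3

Proof assistants such as Lean and Coq go further and machine-check entire proofs, though adoption remains limited outside formal-methods circles.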