Symbols for names
Latest revision as of 16:33, 12 July 2021
There are many conventions to do with naming things. This page might split into several when it gets big enough.
Symbol exhaustion
Mathematicians tend to use single Latin or Greek letters to name things, when they can. A common consequence is that the author runs out of symbols, so needs to look to other alphabets or to different typefaces to convey meaning.
Once you've exhausted all the alphabets you know, you should take a long, hard look at yourself, but you might feel tempted to start using non-letter symbols.
This question on academia.stackexchange.com[1] asks whether it's OK to use symbols from other character sets for names. The highest-scoring reply points out that using more symbols makes the material harder to read.
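As a concrete aside, some programming languages do permit names drawn from other character sets: Python 3, for example, allows Unicode letters in identifiers (PEP 3131), so Greek variable names are syntactically valid, whatever their effect on readability. A minimal sketch (the variable names here are arbitrary illustrations):

```python
# Python 3 identifiers may contain Unicode letters (PEP 3131),
# so Greek symbols are legal variable names -- legibility aside.
α = 3
β = 4
γ = α * β  # names behave exactly like ASCII identifiers
print(γ)   # -> 12
```

Whether this helps or hurts depends on the audience; the objection in the highest-scoring reply (unfamiliar symbols slow readers down) applies to code just as it does to prose.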
Starting at \(x\)
For reasons that aren't completely clear, \(x\) is normally used for the first variable, then \(y\) and \(z\). After that, some go to \(u\), \(v\) and \(w\).
Florian Cajori credits Descartes with starting at \(x\)[2]: in Descartes' convention, the start of the alphabet is used for known quantities and the end for unknowns. Cajori gives a few dubious explanations for why \(x\) is more common than \(z\).
An alternative tack is to use the Greek letters \(\alpha\), \(\beta\), \(\gamma\), \(\delta\), and so on: the convention seems to be to take them in their standard alphabetical order.