Math notation bothers me

Maybe it is just because I am from a Computer Science background, but there are a few things about math notation that really bother me, especially when I’m taking notes.

Function notation

Here are two I see often. Let me give you the first example.

f(x,y) = sin x + y

I see this handwritten all the time, but I have never seen this mistake in a textbook. Here’s why it is a mistake: IT IS AMBIGUOUS! It can mean either:

(1) f(x,y) = sin(x + y)
(2) f(x,y) = sin(x) + y

I usually find that the professors intend this to mean (2). On the other hand…

f(x,y) = sin x y

could mean either:

(1) f(x,y) = sin(x y)
(2) f(x,y) = sin(x) y

In this case, professors usually intend this to mean (1). What?! sin() is supposed to be a function, right? Without parentheses it doesn’t really have any meaning. I have never seen anyone write…

f x = 2x

to mean…

f(x) = 2x

so why do mathematicians think it is okay to do this with trig and log notation? And yes, I have gotten problems wrong before from misinterpreting the above examples.
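
For contrast, here is roughly how a programming language forces you to disambiguate. This is just an illustrative sketch in Python (any language with a math library would make the same point): the parentheses leave no room for interpretation.

import math

x, y = 1.0, 2.0

# The two readings of "sin x + y" are genuinely different expressions:
print(math.sin(x + y))   # reading (1): sin of the sum, about 0.141
print(math.sin(x) + y)   # reading (2): sin of x, then add y, about 2.841

# Likewise for "sin x y":
print(math.sin(x * y))   # reading (1): sin of the product, about 0.909
print(math.sin(x) * y)   # reading (2): sin of x, then multiply by y, about 1.683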

log()? Do you mean ln(), lg(), or lb()?

I hate the function log(). It simply should not exist. The logarithm of any base can be defined in terms of the natural logarithm:

log[b](a) = ln(a) / ln(b)

The only true logarithm is the natural one. ln() is base e (for math and physics), lg() is base 10 (which only exists because you have ten fingers), and lb() is base 2 (for Computer Science). So what the heck is log()? It is ambiguous. Most programming languages take log() to mean ln(), the only true logarithm, but for some reason, in math, log() defaults to lg(). Why? If you mean base 10, write either lg() or log[10]().
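
To see this in action, here is a small Python sketch (Python’s math module is just one example, but most standard libraries behave the same way): log() defaults to the natural logarithm, and every other base has to be asked for explicitly.

import math

print(math.log(math.e))   # 1.0 -- log() means ln(), base e
print(math.log10(100))    # 2.0 -- lg(), base 10, gets its own function
print(math.log2(8))       # 3.0 -- lb(), base 2, gets its own function
print(math.log(8, 2))     # 3.0 -- arbitrary base, computed as ln(8) / ln(2)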

The equals sign

In most programming languages today, there are two operators, “==” and “=”. The first is a comparison; the second is an assignment. In some languages these are “=” and “:=” respectively. We have two operators for a reason: a comparison is a question, and an assignment is a statement. If the same notation is used for both, it is ambiguous. In math, “=” is… well… something in between… I guess… Now, I often see people write:

Let x = 5

This is okay; I don’t have a problem with this. However, countless times in a math lecture, the professor will be in the middle of an example and then suddenly write something like…

x = 5

This never ceases to cause a student to raise his hand and ask, “Wait… how did you know x was 5?” The professor will then look at the student like he’s an idiot and say, “Because I set it to 5.” This happens because the equals sign is ambiguous; there needs to be a second sign for setting a variable to a specific number.
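
Compare that with how it looks in code, where the two signs cannot be confused. A minimal Python sketch of my own (not from any lecture):

x = 5            # assignment: a statement, "set x to 5"
print(x == 5)    # comparison: a question, prints True
print(x == 6)    # comparison: prints False

No one ever raises a hand to ask how the computer knew x was 5.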

So maybe this is all a bit silly; however, to a Computer Scientist, notation is very important and cannot be ambiguous. Speaking as a programmer, I have to say that if Math is such a “universal language”, you’d think they would at least make it unambiguous. :)