1. Memories from school

Part 1: problems, solutions.
Part 2: problems, solutions.
Part 3: problems.

Linear function

A function of the form f(x)=ax+b is called linear. Its graph is a line. Notice that b is the value the function takes for x=0, so it is where the line intersects the y-axis. Meanwhile, a decides on the slope of the line. For a>0 we get an increasing function, for a<0 a decreasing one. The bigger |a|, the "steeper" the line is. Notice that the line y=ax goes through (0,0) and (1,a), which along with (1,0) form a right triangle. The ratio of its legs is a/1=a and is the tangent of the angle at the origin. Thus, a is simply \text{tg}\alpha, where \alpha is the angle between a horizontal line and our line.

Notice also that two lines meet at exactly one point if and only if their slopes are different. To find this point we have to solve a system of two equations. On the other hand, if the slopes are equal but the lines are not identical, they are parallel and do not meet at all.
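The intersection point can also be computed directly; a minimal Python sketch (the function name is my own), assuming the lines are given as y=a_1x+b_1 and y=a_2x+b_2:

```python
# Intersection of y = a1*x + b1 and y = a2*x + b2, assuming a1 != a2.
def intersection(a1, b1, a2, b2):
    if a1 == a2:
        raise ValueError("equal slopes: parallel or identical lines")
    # From a1*x + b1 = a2*x + b2 we get x = (b2 - b1) / (a1 - a2).
    x = (b2 - b1) / (a1 - a2)
    return x, a1 * x + b1

print(intersection(1, 0, -1, 2))  # y = x and y = -x + 2 meet at (1.0, 1.0)
```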

Quadratic polynomial

A quadratic polynomial is a function of form f(x)=ax^2+bx+c. We assume that a\neq 0.

We can rewrite it by putting the first two terms under a square (completing the square):

    \[f(x)=a\left(x+\frac{b}{2a}\right)^2+c-\frac{b^2}{4a},\]

which is called the canonical form. Notice that this allows us to find the roots (i.e. the values for which the expression gives 0) immediately. E.g. take x^2-2x-1=0. We get

    \[(x-1)^2-2=0\]

(we do not have to use any formulas here). Thus, x-1=\pm \sqrt{2}, so x=1\pm\sqrt{2}.

By the way, it is clear that the minimum or the maximum (depending on the sign of a) is attained when the squared expression is zero, i.e. for x=-\frac{b}{2a}. The expression \Delta=b^2-4ac is called the discriminant of the quadratic equation. We get two roots if \Delta>0, one if \Delta=0, and none if \Delta<0.

And finally, since c-\frac{b^2}{4a}=-\frac{\Delta}{4a}, the equation f(x)=0 can be written as

    \[a\left(x+\frac{b}{2a}\right)^2=\frac{\Delta}{4a}, \qquad \text{i.e.} \qquad \left(x+\frac{b}{2a}\right)^2=\frac{\Delta}{4a^2},\]

so x+\frac{b}{2a}=\frac{\pm\sqrt{\Delta}}{2a}, and x=\frac{-b\pm\sqrt{\Delta}}{2a}, provided \Delta\geq 0.
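The whole \Delta-technique fits in a few lines of Python; a sketch assuming real coefficients and a\neq 0 (the function name is my own):

```python
import math

def quadratic_roots(a, b, c):
    """Real roots of a*x^2 + b*x + c = 0, assuming a != 0."""
    delta = b * b - 4 * a * c          # the discriminant
    if delta < 0:
        return []                      # no real roots
    if delta == 0:
        return [-b / (2 * a)]          # one (double) root
    sq = math.sqrt(delta)
    return sorted([(-b - sq) / (2 * a), (-b + sq) / (2 * a)])

print(quadratic_roots(1, -2, -1))  # x^2 - 2x - 1: roots 1 - sqrt(2), 1 + sqrt(2)
```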


Polynomials

A polynomial is a function of the following form: a_{n}x^{n}+a_{n-1}x^{n-1}+\ldots+a_{2}x^{2}+a_{1}x+a_{0}, where a_{0}, a_{1}, \ldots are coefficients, i.e. given real numbers. So constant, linear or quadratic functions are examples of polynomials. E.g.: f(x)=4x-2, g(z)=-z^{2}+4z-\frac{1}{2}.

The highest exponent of the indeterminate (appearing with a nonzero coefficient) is called the degree of the polynomial.

How to solve an equation f(x)=0, where f is a polynomial? If f is quadratic, then everyone knows how to solve it: just calculate \Delta and so on.

If the polynomial is of higher degree, e.g. x^3-4x^2+x+6=0, then we first need to laboriously guess a root. In our case we guess that it is -1. Indeed, (-1)^3-4\cdot(-1)^2+(-1)+6=-1-4-1+6=0. Having any root a, one needs to know that the polynomial is divisible by (x-a) (see Bézout's theorem). The quotient is a polynomial of degree lower by one which has the same roots as the initial polynomial, apart from a.

But how to divide one polynomial by another? One of a few possible methods resembles the usual long division of numbers:

    \[\begin{array}{l} x^2-5x+6\\ \overline{(x^3-4x^2+x+6) : (x+1)}\\ \underline{x^3+x^2}\\ \qquad -5x^2+x\\ \qquad \underline{-5x^2-5x}\\ \qquad \qquad 6x+6\\ \qquad \qquad \underline{6x+6}\\ \qquad \qquad \qquad 0 \end{array} \]

So: x^3-4x^2+x+6=(x+1)\cdot (x^2-5x+6).

And now it suffices to use the \Delta-technique to solve the equation x^2-5x+6=0: \Delta=25-24=1, so the roots are 2 and 3. Therefore the solutions of the equation x^3-4x^2+x+6=0 are: -1, 2 and 3.
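Division by a linear factor (x-a) can also be done by Horner's scheme, which is less laborious than long division; a Python sketch (the helper name is my own), with coefficients listed from the highest power down:

```python
def divide_by_root(coeffs, a):
    """Divide the polynomial with the given coefficients by (x - a)."""
    quotient = [coeffs[0]]
    for c in coeffs[1:]:
        quotient.append(c + a * quotient[-1])
    remainder = quotient.pop()   # zero exactly when a is a root (Bezout)
    return quotient, remainder

# x^3 - 4x^2 + x + 6 divided by (x - (-1)) = (x + 1):
print(divide_by_root([1, -4, 1, 6], -1))  # ([1, -5, 6], 0), i.e. x^2 - 5x + 6
```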


Permutations, combinations and variations

Counting permutations means counting the number of possible orderings of a number of elements. The elements of an n-element set can be put in an n-element sequence (without repetitions) in n! ways, where n!=1\cdot 2\cdot\ldots\cdot n.
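For small n this is easy to check directly with Python's standard library:

```python
import itertools
import math

# All orderings of a 3-element set; there should be 3! = 6 of them.
perms = list(itertools.permutations("abc"))
print(len(perms), math.factorial(3))  # 6 6
print(perms[0], perms[-1])            # ('a', 'b', 'c') ('c', 'b', 'a')
```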

Variations without repetitions are a more general question: from an n-element set we choose k-element sequences (without repetitions). This can be done in

    \[V_n^k=n\cdot (n-1)\cdot\ldots\cdot (n-k+1)=\frac{n!}{(n-k)!}\]

ways.

Variations with repetitions is choosing from an n-element set a k-element sequence in which the elements can be repeated. It can be done in

    \[W_n^k=n^k\]

ways.

Combinations without repetitions is choosing from an n-element set a k-element subset (i.e. without repetitions and without order). It is easy to notice that

    \[C_n^k\cdot k!=V_n^k,\]

where C_n^k is the number of ways to do it. Thus,

    \[C_n^k=\frac{V_n^k}{k!}=\frac{n\cdot (n-1)\cdot\ldots\cdot (n-k+1)}{k!}=\frac{n!}{(n-k)!\cdot k!}=\binom{n}{k}.\]

The last symbol is called the binomial coefficient (Newton's symbol).

Finally, combinations with repetitions is the problem of choosing from an n-element set a k-element collection in which each element can repeat any number of times (but the order does not count). It is equivalent to the problem of putting k indistinguishable objects into n boxes. This in turn is equivalent to choosing which positions in an (n+k-1)-element sequence of objects and walls are taken by the n-1 walls, i.e. there are

    \[K_n^k=\binom{n+k-1}{n-1}\]

ways to do this.

To sum up we get:

    \[\begin{array}{l|cc} & \text{variations (order)} & \text{combinations (without order)}\\ \hline \text{without repetitions} & V_n^k=\frac{n!}{(n-k)!} & C_n^k=\binom{n}{k}\\ \text{with repetitions} & W_n^k=n^k & K_n^k=\binom{n+k-1}{n-1} \end{array}\]
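All four table entries can be checked against brute-force enumeration with Python's standard library; a sketch for n=5, k=3:

```python
import itertools
import math

n, k = 5, 3
elems = range(n)

# variations without repetitions: k-element sequences with distinct entries
V = len(list(itertools.permutations(elems, k)))
# variations with repetitions: arbitrary k-element sequences
W = len(list(itertools.product(elems, repeat=k)))
# combinations without repetitions: k-element subsets
C = len(list(itertools.combinations(elems, k)))
# combinations with repetitions: k-element multisets
K = len(list(itertools.combinations_with_replacement(elems, k)))

assert V == math.factorial(n) // math.factorial(n - k)
assert W == n ** k
assert C == math.comb(n, k)
assert K == math.comb(n + k - 1, n - 1)
print(V, W, C, K)  # 60 125 10 35
```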

Pascal's triangle

Notice that:

    \[\binom{n}{k}=\binom{n-1}{k-1}+\binom{n-1}{k}.\]

Indeed, the chief of an n-monkey herd choosing k monkeys to go on a quest can simply do it (left-hand side). Alternatively, he can first decide whether he himself goes on the quest. If so, he still has to choose k-1 monkeys from the n-1 monkeys left. If not, he chooses k monkeys from the n-1 monkeys left.

Using this fact, and having, for every n,

    \[\binom{n}{0}=\binom{n}{n}=1,\]

we get that the numbers \binom{n}{k} can be arranged in the following triangle, called Pascal's triangle, in which every element is the sum of the element above to the left and the element directly above it.

k=0 k=1 k=2 k=3 k=4 k=5 k=6 k=7
n=0 1
n=1 1 1
n=2 1 2 1
n=3 1 3 3 1
n=4 1 4 6 4 1
n=5 1 5 10 10 5 1
n=6 1 6 15 20 15 6 1
n=7 1 7 21 35 35 21 7 1
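The triangle above can be generated directly from this addition rule; a Python sketch (the function name is my own):

```python
def pascal(rows):
    """Rows n = 0, ..., rows-1 of Pascal's triangle."""
    triangle = [[1]]
    for _ in range(rows - 1):
        prev = triangle[-1]
        # each inner entry is the sum of the two entries above it
        triangle.append([1] + [prev[i] + prev[i + 1]
                               for i in range(len(prev) - 1)] + [1])
    return triangle

for row in pascal(8):
    print(row)
# the last printed row is [1, 7, 21, 35, 35, 21, 7, 1]
```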

By the way, we have proved that, for every n,

    \[\binom{n}{0}+\binom{n}{1}+\ldots+ \binom{n}{n}=2^n.\]
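This identity (each row of Pascal's triangle sums to 2^n; both sides count all subsets of an n-element set) is easy to verify for small n:

```python
import math

# Check binom(n,0) + ... + binom(n,n) == 2^n for n = 0, ..., 11.
for n in range(12):
    assert sum(math.comb(n, k) for k in range(n + 1)) == 2 ** n
print("identity holds for n = 0, ..., 11")
```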

Newton’s binomial formula

When expanding the expression

    \[(a+b)^n=(a+b)(a+b)\cdot \ldots\cdot (a+b)\]

we notice that the term a^kb^{n-k} can be obtained in a number of ways, and this number is the coefficient standing by this term in the expanded form. It is the number of ways in which one can choose the k brackets contributing the letter a out of the n brackets, which is simply \binom{n}{k}. Thus:

    \[(a+b)^n=\binom{n}{0}b^n+\binom{n}{1}ab^{n-1}+\binom{n}{2}a^2b^{n-2}+\ldots+ \binom{n}{n-1}a^{n-1}b+\binom{n}{n}a^n,\]
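A quick numeric check of the formula for concrete values:

```python
import math

# Both sides of Newton's binomial formula for a = 2, b = 3, n = 6.
a, b, n = 2, 3, 6
lhs = (a + b) ** n
rhs = sum(math.comb(n, k) * a ** k * b ** (n - k) for k in range(n + 1))
print(lhs, rhs)  # 15625 15625
```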



Dirichlet's rule

It is a simple combinatorial fact, but an extremely useful one: if n monkeys sleep on k trees, and k<n, then there is at least one tree on which at least two monkeys sleep.
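In code the rule shows up whenever more items are mapped into fewer slots; a toy Python sketch (the data is made up):

```python
from collections import Counter

# n = 5 monkeys sleep on k = 3 trees (k < n), so some tree hosts >= 2.
choice = ["oak", "pine", "oak", "birch", "pine"]   # one tree per monkey
busiest, monkeys = Counter(choice).most_common(1)[0]
print(busiest, monkeys)  # the busiest tree hosts at least two monkeys
assert monkeys >= 2
```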