The limit is the number that you get closer and closer to as you advance in your sequence/sum/whatever.
Formally, I think it is defined something like this: pick any small number (often named epsilon) that you like. If an infinite sum approaches a limit L, then at some point the finite partial sums get within epsilon of L (and stay there). This works no matter how small epsilon is.
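In symbols, here is my sketch of the standard epsilon-N definition (writing s_n for the sum of the first n terms, which isn't spelled out above):

```latex
% s_n = sum of the first n terms; L is the limit of the infinite sum.
\[
\forall \varepsilon > 0,\ \exists N \ \text{such that}\ |s_n - L| < \varepsilon \ \text{for all } n > N .
\]
```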
Although a finite sum might never reach exactly the limit L, it will get arbitrarily close. Typically, the limit is never reached exactly (without infinitely many steps). It isn't against the rules for something to reach the limit exactly (e.g. Borek's example); it just usually doesn't happen that way.
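Here's a quick numerical sketch (my own toy example, not anyone's from the thread) using the geometric series 1/2 + 1/4 + 1/8 + ..., whose limit is 1:

```python
# Toy illustration: partial sums of 1/2 + 1/4 + 1/8 + ... approach the limit 1
# but never reach it in finitely many steps.
epsilon = 1e-6   # pick any small number you like
limit = 1.0

partial_sum = 0.0
term = 0.5
n = 0
while abs(partial_sum - limit) >= epsilon:
    partial_sum += term
    term /= 2
    n += 1

print(f"After {n} terms the partial sum is {partial_sum},")
print(f"within epsilon={epsilon} of the limit {limit}, yet still not equal to it.")
```

Make epsilon smaller and the loop just runs a few more steps; that's the "no matter how small epsilon is" part.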
BTW, one of the most important uses of limits is calculus. Without the concept of a limit, calculus makes no sense. For example, the derivative of a function is defined as a limit: take the ratio (f(x+dx) - f(x)) / dx and find the limit as dx -> 0. That is the derivative. dx never actually reaches 0 (the ratio doesn't work there, since it would be 0/0); it only gets very close. Limits have exact values, just like derivatives do.
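To see that numerically (again my own quick sketch; the choice of f(x) = x^2 and the point x = 3 is arbitrary), the ratio creeps toward the exact derivative 2x = 6 as dx shrinks:

```python
# Toy illustration: the difference quotient (f(x+dx) - f(x)) / dx for f(x) = x**2
# at x = 3 approaches the derivative 2*x = 6 as dx shrinks toward 0.
def f(x):
    return x**2

x = 3.0
for dx in [1.0, 0.1, 0.01, 0.001, 0.0001]:
    ratio = (f(x + dx) - f(x)) / dx
    print(f"dx = {dx:<8} ratio = {ratio}")

# Setting dx = 0 would give 0/0 (a ZeroDivisionError here), which is why the
# derivative is defined as the limit of this ratio, not its value at dx = 0.
```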