Wikipedia:Reference desk/Archives/Mathematics/2010 September 5



September 5

Question on polynomials

Hey guys, how are ya all? Anyway, I'm stuck on a question and your help would be appreciated.

The question is this: Let k be a natural number and let r be a real number such that |r| < 1. Prove (by induction on k) that for any polynomial P of degree k there's a polynomial Q of degree k s.t. Q(n+1)r^(n+1) - Q(n)r^n = P(n)r^n.

The hint is this: Consider differences of successive terms for n^k r^n, and use the inductive hypothesis.

OK. So that's the question and hint. I've been trying hard at this question since I woke up, but to no avail, so I've been at it for 5 hours. I really would appreciate an answer. For example, I've tried it with P(x) = x^2 and r = 1/2, and I've found that Q(x) = -2x^2 - 4x - 6 works. The reason I tried this is that it helps me to sum the series n^2 2^(-n). But I'm really stumped on this one. Help??? Thanks guys ... I've worked out about 7 pages of rough work, so please don't say I'm lazy and want my homework done for me. This isn't homework, just independent study for my own benefit, but I'd really like a decent hint or an answer please. —Preceding unsigned comment added by 114.72.228.12 (talk) 05:02, 5 September 2010 (UTC)

Also you don't have to use the hint if you don't want to. Thanks guys ...
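As a quick sanity check of that worked example, here is a short sympy script (the variable names are mine). Dividing the target identity Q(n+1)r^(n+1) - Q(n)r^n = P(n)r^n through by r^n leaves r*Q(n+1) - Q(n) = P(n), which is easy to verify symbolically:

```python
# Symbolic check of the worked example above: P(x) = x^2, r = 1/2,
# Q(x) = -2x^2 - 4x - 6.  After dividing the identity by r^n it reads
#   r*Q(n+1) - Q(n) = P(n).
from sympy import symbols, Rational, expand

n = symbols('n')
r = Rational(1, 2)

P = n**2
Q = lambda x: -2*x**2 - 4*x - 6

print(expand(r * Q(n + 1) - Q(n) - P))  # prints 0
```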

I've just run through the argument but only have a mess (though I think it's a correct mess). The point you might be missing (which is basically the hint restated) is that you don't want to evaluate what Q is if P = x^k; it suffices to show you can deal with x^k (in much the same way as the base case of the induction), and then the leftovers are of lower degree, so they can be absorbed into the rest of P and dealt with by the inductive hypothesis. Also, I don't see that you need |r| < 1 (though you're in trouble if r = 1). Hope that helps. 95.150.22.63 (talk) 14:17, 5 September 2010 (UTC)
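To spell that out a little, here is one way the key step can be written (a sketch following the hint, not necessarily how anyone above worked it): for P(n) = n^k, try Q(n) = a_k n^k + R(n) with deg R < k. Then

```latex
\[
Q(n+1)\,r^{n+1} - Q(n)\,r^{n}
  = r^{n}\Bigl( a_k\bigl( r(n+1)^{k} - n^{k} \bigr) + \bigl( r\,R(n+1) - R(n) \bigr) \Bigr)
  = r^{n}\Bigl( a_k (r-1)\, n^{k} + (\text{terms of degree} \le k-1) \Bigr).
\]
```

Choosing a_k = 1/(r-1) makes the coefficient of n^k equal to 1, and what is left over is the same kind of problem with a right-hand side of degree at most k-1, which is exactly what the inductive hypothesis handles.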
Following on from 95.150.22.63's point about induction, I think it is pretty clear that for P(n) = n^k we can rewrite the identity as r*Q(n+1)*r^n - Q(n)*r^n = n^k r^n. We ignore the r = 0 case, since it's rather meaningless, and so we divide through by r^n. Q is a polynomial of degree k, so let us express it as Q(x) = a_k x^k + a_{k-1} x^(k-1) + ... + a_1 x + a_0, so our equation is
r*(a_k (n+1)^k + ... + a_1 (n+1) + a_0) - (a_k n^k + ... + a_1 n + a_0) = n^k.
Then expand the first term by the binomial theorem, and by equating coefficients of n^i for i = 0, ..., k you get k+1 linear equations to solve for the k+1 coefficients a_0, ..., a_k. To calculate, you can start from a_k and work back; a_k = 1/(r-1) always, for example (a_i can be solved for directly once you know the values of a_{i+1}, ..., a_k). In any case, you have k+1 linear equations in k+1 unknowns, and now you want to be sure that for all |r| < 1 these have a unique solution. Invrnc (talk) 14:31, 5 September 2010 (UTC)
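For anyone who wants to see that coefficient-matching in action, here is a short sympy sketch (my own variable names; k = 2 and r = 1/2 are example choices made only so the result can be compared with the worked example further up):

```python
# Coefficient-matching sketch for the approach described above.
# k = 2 and r = 1/2 are example choices.
from sympy import symbols, Rational, Poly, expand, solve

n = symbols('n')
r = Rational(1, 2)
k = 2

# Unknown coefficients a0, ..., ak of Q.
a = symbols('a0:%d' % (k + 1))
Q = sum(a[i] * n**i for i in range(k + 1))

# After dividing by r^n, the identity reads r*Q(n+1) - Q(n) = n^k.
residual = expand(r * Q.subs(n, n + 1) - Q - n**k)

# Equating each coefficient of n to zero gives k+1 linear equations.
equations = Poly(residual, n).all_coeffs()
print(solve(equations, a))  # {a0: -6, a1: -4, a2: -2}, i.e. Q(x) = -2x^2 - 4x - 6
```

Note that the system is triangular, with r - 1 on the diagonal, which is why a unique solution exists for every r other than 1.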

Jensen's inequality?

Is this a case of Jensen's inequality, or some other inequality?

—Preceding unsigned comment added by 130.102.158.15 (talk • contribs) 08:14, 5 September 2010 (UTC)

That's Minkowski's inequality, with . —Bkell (talk) 15:43, 5 September 2010 (UTC)
Thanks! —Preceding unsigned comment added by 130.102.158.15 (talk) 22:32, 5 September 2010 (UTC)
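For reference, the general finite-sum form of Minkowski's inequality, valid for any exponent p ≥ 1, is:

```latex
\[
\Bigl( \sum_{i=1}^{m} |a_i + b_i|^{p} \Bigr)^{1/p}
\;\le\;
\Bigl( \sum_{i=1}^{m} |a_i|^{p} \Bigr)^{1/p}
+ \Bigl( \sum_{i=1}^{m} |b_i|^{p} \Bigr)^{1/p} .
\]
```

The integral version is analogous, and the triangle inequality for the Euclidean norm is the special case p = 2.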

Limit Question

I know it's possible to reason out that , but is there any way to do it algebraically? Thanks. --Basho: banana tree (talk) 20:42, 5 September 2010 (UTC)

For every ε > 0, you can choose an appropriate δ > 0 (you will be able to express this value in terms of ε) such that for every point (x,y) within a distance of δ of (0,0), the value of the function is within ε of the claimed limit. This is formally how you demonstrate a limit. Rckrone (talk) 22:34, 5 September 2010 (UTC)
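In symbols, with f standing for the function in question and L for the claimed limit, the definition being used here is:

```latex
\[
\lim_{(x,y)\to(0,0)} f(x,y) = L
\quad\Longleftrightarrow\quad
\forall \varepsilon > 0 \;\exists \delta > 0 :\;
0 < \sqrt{x^{2}+y^{2}} < \delta \;\Rightarrow\; |f(x,y) - L| < \varepsilon .
\]
```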
(edit conflict) You are interested in how the function behaves as (x,y) tends towards (0,0). But there are many ways in which a point (x,y) can tend towards (0,0). Let's assume that (x,y) follows a line towards (0,0). In other words x = r⋅cos(θ) and y = r⋅sin(θ), where θ is a fixed parameter and r is the variable along the line. We consider:
Notice that the penultimate expression is independent of θ and does not depend upon the sign of r, since 1/(+r)^2 = 1/(−r)^2. Fly by Night (talk) 23:04, 5 September 2010 (UTC)
Note that this is not a proof. There are functions which converge along every line towards the origin, and yet do not converge at the origin.--203.97.79.114 (talk) 09:53, 6 September 2010 (UTC)
That's the whole point of a limit. The function may not be defined at the origin, but you calculate the limit of the function as you tend towards the origin. In this example the function is indeterminate at the origin; but the limit exists and is well defined. Fly by Night (talk) 12:59, 6 September 2010 (UTC)
I didn't say such functions were undefined; I said they did not converge. Consider the function . The limit as (x,y) → (0,0) along any line is . The limit as (x,y) → (0,0) along the curve is . The limit in this case does not converge, yet if you only considered approaching along a line, you would think it does. —Preceding unsigned comment added by 203.97.79.114 (talk) 13:18, 6 September 2010 (UTC)
I don't think that example works. Surely if we have then and then ? (However, Meni's example below illustrates the point.) Gandalf61 (talk) 13:36, 6 September 2010 (UTC)
Woops. works if you want something elementary. Or Meni's, as you say. --203.97.79.114 (talk) 13:48, 6 September 2010 (UTC)
[ec] 203's point is that only looking at straight lines is insufficient; you need it to work for every curve. The canonical example is
This function has no limit at the origin; but along any line, it converges to 0 at the origin.
Of course, in the OP's case, this objection can be considered a nitpick, since you have demonstrated that the function depends only on r, so finding the limit for a single curve suffices. -- Meni Rosenfeld (talk) 13:27, 6 September 2010 (UTC)
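To make the line-versus-curve point concrete, here is a small numeric sketch in Python. The function f(x, y) = x^2 y / (x^4 + y^2) is a standard textbook example chosen here for illustration; it is not necessarily the exact one the posters above had in mind.

```python
# A function that tends to 0 along every straight line through the origin
# but has no limit there.  f is a standard textbook example, chosen here
# for illustration only.

def f(x, y):
    return x**2 * y / (x**4 + y**2)

t = 1e-4  # how close to the origin we sample

# Along the lines y = m*x the sampled values are tiny (they tend to 0).
for m in (0.5, 1.0, 2.0):
    print("along y = %.1fx:" % m, f(t, m * t))

# Along the curve y = x^2 the value is identically 1/2, so no limit exists.
print("along y = x^2:", f(t, t**2))
```

Since the values along y = x^2 stay at 1/2 while the values along every line tend to 0, the two-variable limit at the origin does not exist, even though every straight-line approach suggests it is 0.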
Strictly speaking, you cannot prove this statement (or any other statement involving limits) algebraically, because it is a statement of analysis, not of algebra. The result depends on the topology that you use - with the usual topology on R^2 the statement is true, but with the discrete topology (for example) it is false. The choice of topology does not affect the algebraic properties of R^2 - therefore the result cannot be derived from algebraic properties alone. Gandalf61 (talk) 08:34, 6 September 2010 (UTC)
True, but we often teach an algebraic approach to limits that relies on analysis only for the fact that certain basic functions (addition, division where defined, etc.) are continuous. Presumably that's what was being asked for.--203.97.79.114 (talk) 13:29, 6 September 2010 (UTC)
So what do we think the OP is looking for? What would a proof of the OP's statement look like under this algebraic approach to limits? Presumably it would not involve any δs, εs, open intervals or neighbourhoods? Gandalf61 (talk) 13:51, 6 September 2010 (UTC)
I think it would be something like . -- Meni Rosenfeld (talk) 14:55, 6 September 2010 (UTC)

dL

In physics you express uncertainty as dL or delta-L. What does this have to do with derivatives? 76.229.214.25 (talk) —Preceding undated comment added 23:00, 5 September 2010 (UTC).

Because if y = f(x), then dy = f'(x) dx is an approximate computation of the uncertainty of y based on the uncertainty of x. Bo Jacoby (talk) 08:13, 6 September 2010 (UTC).
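As a small numeric illustration of that formula, here is a Python sketch; the function and the numbers are arbitrary choices for the example.

```python
# Illustration of dy = f'(x) dx as a first-order uncertainty estimate.
# The function and the numbers are arbitrary example choices.

def f(x):
    return x**3        # some quantity computed from the measurement x

def fprime(x):
    return 3 * x**2    # its derivative

x = 2.0     # measured value
dx = 0.05   # uncertainty in the measurement (the "delta-x")

dy = fprime(x) * dx    # propagated uncertainty in y = f(x)
print("y =", f(x), "+/-", dy)               # y = 8.0 +/- 0.6

# For comparison, the exact change in f when x shifts by dx:
print("exact change:", f(x + dx) - f(x))    # about 0.615
```

The linear estimate 0.6 is close to the exact change of about 0.615, and the agreement improves as dx shrinks, which is the sense in which the derivative notation is being borrowed for uncertainties.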