 Originally Posted by wufwugy
Thanks for the responses. This is pretty over my head though, so I'm not sure I understand much.
I asked because of a situation in my calculus class. An exam problem was to test the sum from n=whatever to infinity of 1/(n(n^(1/2)+10)) for convergence. I looked at this and said it compares to 1/n^(3/2); since p>1, it converges. But my professor marked me wrong because it's not technically correct unless I do the limit comparison test. I'm left wondering why I can't just say it converges. Doesn't any series have to converge when the fastest growing term is an exponent and its ratio in the denominator compared to numerator is >1?
My best guess as to why he said that is that a college exam problem is a simplified example designed to be completed (and graded) in a short amount of time. In the real world, the examples won't be so simple and straightforward, and knowing how to figure out whether your function converges is an important skill. In physics and engineering, a particular skill almost always ends up in the course material because the industry asked for it.
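To make that concrete, here's a quick numeric sanity check (a sketch, not a proof), with a(n) the terms of the exam series and b(n) the p-series you compared to. The ratio a(n)/b(n) settles toward a finite positive limit, which is exactly what the limit comparison test formalizes:

```python
import math

def a(n):
    # terms of the exam series: 1/(n*(sqrt(n) + 10))
    return 1.0 / (n * (math.sqrt(n) + 10))

def b(n):
    # terms of the comparison p-series with p = 3/2
    return 1.0 / n ** 1.5

# The ratio a(n)/b(n) = sqrt(n)/(sqrt(n) + 10) creeps up toward 1.
for n in (10, 10**3, 10**5, 10**7):
    print(n, a(n) / b(n))
```

A finite, positive limit (here, 1) is what the limit comparison test needs: it says the two series converge or diverge together, and the p-series side is settled by p = 3/2 > 1.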
I was wrong every time I thought something was taught to me in vain. I thought Taylor series were just a crude approximation that I would never need. How wrong I was. [deleted for brevity] It's all about getting over the hubris of not seeing the applications the first minute you are exposed to a new idea. At least... it was for me.
At any rate, my real world advice is to accept that this guy is going to give you your grade. He's not grading you on what you think makes sense to learn. This can be a burden or a privilege, depending on your point of view.
 Originally Posted by wufwugy
I mean, even though technically a p-series is the form 1/n^p, I can't find any more complex functions where p is the fastest growing value and is >1 yet the series doesn't converge. This is technically not a p-series: "sum of n=1 to infinity of 1/(n^(100/99)-1000000000000000000000000000000000000000000000000n)", but it still converges on p-series logic.
I don't want to presume to know your professor's specific gripe.
If your conclusion really is based on logic you already understand, then it should be a small matter to show your work in mathematical symbols.
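For what it's worth, the "work" he's asking for is short. A sketch of the limit comparison test for that exam problem (with b_n the p-series you compared to):

```latex
a_n = \frac{1}{n(\sqrt{n}+10)}, \qquad b_n = \frac{1}{n^{3/2}}

\lim_{n\to\infty} \frac{a_n}{b_n}
  = \lim_{n\to\infty} \frac{n^{3/2}}{n(\sqrt{n}+10)}
  = \lim_{n\to\infty} \frac{\sqrt{n}}{\sqrt{n}+10}
  = 1
```

Since the limit is finite and positive and the sum of b_n converges (p = 3/2 > 1), the sum of a_n converges too. That's the whole proof; writing it down is what makes the "p-series logic" airtight.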
 Originally Posted by wufwugy
Is there some point where I add enough zeroes to the n^1 term that it starts mattering?
So long as you're testing convergence of a sum that runs to infinity, then no. A constant coefficient, however large, never changes which term wins in the end.
In the case of a polynomial, the term with the largest degree dominates as the argument goes to either +infinity or -infinity.
There are other rules, both more general and more specific, for various classes of functions.
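As a concrete illustration of that dominance (a sketch, using a made-up polynomial n^2 - C*n with a deliberately huge coefficient C on the lower-degree term): the ratio against the bare leading term still goes to 1 once n is large enough.

```python
C = 10**6  # an intentionally enormous coefficient on the lower-degree term

def p(n):
    # p(n) = n^2 - C*n; the n^2 term eventually dominates anyway
    return n**2 - C * n

# Compare p(n) against its leading term n^2 as n grows.
for n in (10**6, 10**9, 10**12):
    print(n, p(n) / n**2)
```

Adding more zeroes to C only moves the point where the leading term takes over; it never changes which term wins, which is why piling zeroes onto the n^1 term never flips the convergence verdict.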
 Originally Posted by wufwugy
I tried asking my professor but he wasn't understanding my question and almost seemed to be getting mad that I didn't understand why I can't just find the simplest way to get the right answer instead of doing what mathematicians do and rigorously prove it for the sake of rigorous proof, or whatever.
First of all, I've been there, man. I feel your frustration.
There is often a benefit to a rigorous proof in that it can be generalized to a class of results. Explicitly state your assumptions at the top of the proof and notice how broadly it applies. You may find that you've been shown a very powerful tool that is broad in its applications.
Or your prof is a knucklehead and you gain nothing by frustrating him and yourself.
***
FWIW, convergence is a requirement for all physically meaningful solutions of the Schrödinger equation: the wavefunction must be normalizable, which means the integral of its squared magnitude over all space has to converge.