Hensel's Lemma and Implicit Function Theorem
In the literature and on the web I have several times come across confused or simply cryptic assertions that Hensel's Lemma is the algebraic version of the Implicit Function Theorem.
I tried to make this relation explicit, but I failed; here are some observations I made.
A first good property of Henselian rings, i.e. rings that satisfy Hensel's Lemma, is that their spectrum is homotopy equivalent to its closed point in the sense of Grothendieck. Precisely, if $\widehat{\pi}$ denotes the pro-fundamental group of a scheme as in SGA1, then $\widehat{\pi}(\operatorname{Spec}(A)) \simeq \widehat{\pi}(\operatorname{Spec}(k(m)))$, where $A$ is a Henselian local ring and $k(m)$ is the residue field of its maximal ideal $m$.
So I thought that spectra of Henselian rings were the kind of "small neighborhoods" on which one can write a "function" explicitly, thanks to Hensel's Lemma. But I am confused about what kind of functions I should be examining.
Another observation is that Henselianity is exactly the condition a local ring $R$ needs in order to have no non-trivial étale coverings of $\operatorname{Spec}(R)$ that are trivial on the closed point. Since these coverings correspond to étale algebras over $R$, I explored this direction and found that, for any field $k$, a $k$-algebra of the form $k[x]/(f(x))$ is étale over $k$ if and only if $f'(x)$ is invertible in the algebra.
There is also a more involved criterion for étale algebras over general rings, which uses the invertibility of the determinant of the Jacobian of a system of polynomials. This is very reminiscent of the key hypothesis of the Implicit Function Theorem, but I don't know why.
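To convince myself of the one-variable criterion, here is a rough SymPy sanity check I put together (the polynomials are just my own examples): $f'(x)$ is invertible in $k[x]/(f(x))$ exactly when $\gcd(f, f') = 1$, i.e. when $f$ is separable.

```python
# Sanity check of the one-variable etale criterion over Q (examples are mine):
# Q[x]/(f) is etale over Q  <=>  f' is invertible mod f  <=>  gcd(f, f') = 1.
from sympy import symbols, diff, gcdex

x = symbols('x')

def is_etale_over_Q(f):
    """Return (is_etale, inverse of f' in Q[x]/(f) or None)."""
    fp = diff(f, x)
    s, t, h = gcdex(fp, f)      # Bezout: s*f' + t*f = h = gcd(f', f)
    if h == 1:
        return True, s          # s is the inverse of f' modulo f
    return False, None

print(is_etale_over_Q(x**2 - 2))    # separable, hence etale; f' = 2x is a unit mod f
print(is_etale_over_Q((x - 1)**2))  # repeated root, so not etale
```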
For reference, see the Wikipedia pages on some related concepts: the implicit function theorem, Henselian rings and Hensel's lemma. There is also an article with a long introduction to Henselian rings.
Thank you in advance for your time.
abstract-algebra algebraic-geometry
asked Jun 29 '11 at 11:37 – Giovanni De Gaetano
edited Dec 27 '18 at 19:31 – user26857
$begingroup$
In the literature and on the web happened to me several times to read confused or simply cryptic assertions regarding the fact that Hensel's Lemma is the algebraic version of Implicit Function Theorem.
I tried to explicit this relation but I failed, here there are some observations I made.
A first good property of Henselian rings, so rings that satisfy Hensel's Lemma, is that their spectrum is homotopically equivalent to their closed point in the sense of Grothendieck. Precisely, if $widehat{pi}$ is the pro-fundamental group of a scheme as in SGA1, then $widehat{pi}(operatorname{Spec}(A)) simeq widehat{pi}(operatorname{Spec}(k(m))$, where $A$ is an Henselian ring and $k(m)$ is the residue field of the maximal ideal $m$ of $A$.
So I thought that spectra of Henselian rings were the kind of "small neighborhoods" in which you can write a "function" explicitly, thanks to Hensel's Lemma. But I'm confused in trying to understand what kind of functions I have to examine.
Another observation is that Henselianity is exactly the condition needed for a local ring $R$ for having no non-trivial étale coverings of $operatorname{Spec}(R)$ which are trivial on the closed point. Since these coverings are in correspondence with ètale algebras of $R$ I examined this direction and I found that, for any field $k$, the $k$-algebra of the form $k[x]/f(x)$ is ètale over $k$ if and only if $f'(x)$ is invertible in the algebra.
There is also a more complicated criterion for ètale algebras over rings which uses the invertibility of the determinant of the Jacobian of a system of polynomials. This is very reminiscent of the key condition of the Implicit Function Theorem, but I don't know why.
Here I put the link for the wikipedia pages of some related concepts, such as the implicit function theorem, Henselian rings and Hensel's lemma. Moreover here you can find an article with a large introduction about Henselian rings.
Thank you in advance for your time.
abstract-algebra algebraic-geometry
$endgroup$
So, is there a question? For what it's worth, I always thought Hensel's Lemma was the $p$-adic version of Newton's Method.
– Gerry Myerson
Jun 29 '11 at 12:56
I also think of Hensel's lemma as Newton's method, and so does the Wikipedia article. Where did you find the "assertions regarding the fact that Hensel's Lemma is the algebraic version of Implicit Function Theorem"?
– ShreevatsaR
Jun 29 '11 at 13:12
I see both principles as a way to go from "local" solutions to "global" solutions for an equation. I think that's where the analogy lies.
– Joel Cohen
Jun 29 '11 at 13:33
I'm sorry if I was not explicit enough. The question was: "how is Hensel's Lemma the algebraic analogue of the Implicit Function Theorem?", which implicitly includes: "is it true that Hensel's Lemma is the algebraic analogue of the IFT?". Actually I found the explicit statement of the analogy only in some lecture notes, but the two are often compared in the sense outlined by J. Cohen, so I thought there was a deeper and more precise connection between the topics. It would also be enough for me to know that this connection does not actually exist. Anyway, thanks again for your time.
– Giovanni De Gaetano
Jun 29 '11 at 13:46
Hensel's lemma, in a sufficiently general formulation using more than one variable, is an algebraic version of the IFT. This is well explained in the book H. Kurke, G. Pfister, M. Roczen, Henselsche Ringe, Deutsch. Verlag Wissenschaft. (1975), which was printed in the German Democratic Republic and is obviously out of print. Unfortunately I know of no other reference.
– Hagen Knaf
Jun 29 '11 at 13:57
$begingroup$
In the literature and on the web happened to me several times to read confused or simply cryptic assertions regarding the fact that Hensel's Lemma is the algebraic version of Implicit Function Theorem.
I tried to explicit this relation but I failed, here there are some observations I made.
A first good property of Henselian rings, so rings that satisfy Hensel's Lemma, is that their spectrum is homotopically equivalent to their closed point in the sense of Grothendieck. Precisely, if $widehat{pi}$ is the pro-fundamental group of a scheme as in SGA1, then $widehat{pi}(operatorname{Spec}(A)) simeq widehat{pi}(operatorname{Spec}(k(m))$, where $A$ is an Henselian ring and $k(m)$ is the residue field of the maximal ideal $m$ of $A$.
So I thought that spectra of Henselian rings were the kind of "small neighborhoods" in which you can write a "function" explicitly, thanks to Hensel's Lemma. But I'm confused in trying to understand what kind of functions I have to examine.
Another observation is that Henselianity is exactly the condition needed for a local ring $R$ for having no non-trivial étale coverings of $operatorname{Spec}(R)$ which are trivial on the closed point. Since these coverings are in correspondence with ètale algebras of $R$ I examined this direction and I found that, for any field $k$, the $k$-algebra of the form $k[x]/f(x)$ is ètale over $k$ if and only if $f'(x)$ is invertible in the algebra.
There is also a more complicated criterion for ètale algebras over rings which uses the invertibility of the determinant of the Jacobian of a system of polynomials. This is very reminiscent of the key condition of the Implicit Function Theorem, but I don't know why.
Here I put the link for the wikipedia pages of some related concepts, such as the implicit function theorem, Henselian rings and Hensel's lemma. Moreover here you can find an article with a large introduction about Henselian rings.
Thank you in advance for your time.
abstract-algebra algebraic-geometry
$endgroup$
In the literature and on the web happened to me several times to read confused or simply cryptic assertions regarding the fact that Hensel's Lemma is the algebraic version of Implicit Function Theorem.
I tried to explicit this relation but I failed, here there are some observations I made.
A first good property of Henselian rings, so rings that satisfy Hensel's Lemma, is that their spectrum is homotopically equivalent to their closed point in the sense of Grothendieck. Precisely, if $widehat{pi}$ is the pro-fundamental group of a scheme as in SGA1, then $widehat{pi}(operatorname{Spec}(A)) simeq widehat{pi}(operatorname{Spec}(k(m))$, where $A$ is an Henselian ring and $k(m)$ is the residue field of the maximal ideal $m$ of $A$.
So I thought that spectra of Henselian rings were the kind of "small neighborhoods" in which you can write a "function" explicitly, thanks to Hensel's Lemma. But I'm confused in trying to understand what kind of functions I have to examine.
Another observation is that Henselianity is exactly the condition needed for a local ring $R$ for having no non-trivial étale coverings of $operatorname{Spec}(R)$ which are trivial on the closed point. Since these coverings are in correspondence with ètale algebras of $R$ I examined this direction and I found that, for any field $k$, the $k$-algebra of the form $k[x]/f(x)$ is ètale over $k$ if and only if $f'(x)$ is invertible in the algebra.
There is also a more complicated criterion for ètale algebras over rings which uses the invertibility of the determinant of the Jacobian of a system of polynomials. This is very reminiscent of the key condition of the Implicit Function Theorem, but I don't know why.
Here I put the link for the wikipedia pages of some related concepts, such as the implicit function theorem, Henselian rings and Hensel's lemma. Moreover here you can find an article with a large introduction about Henselian rings.
Thank you in advance for your time.
abstract-algebra algebraic-geometry
abstract-algebra algebraic-geometry
edited Dec 27 '18 at 19:31
user26857
39.5k124284
39.5k124284
asked Jun 29 '11 at 11:37
Giovanni De GaetanoGiovanni De Gaetano
2,3791333
2,3791333
1
$begingroup$
So, is there a question? For what it's worth, I always thought Hensel's Lemma was the $p$-adic version of Newton's Method.
$endgroup$
– Gerry Myerson
Jun 29 '11 at 12:56
1
$begingroup$
I also think of Hensel's lemma as Newton's method, and so does the Wikipedia article. Where did you find the "assertions regarding the fact that Hensel's Lemma is the algebraic version of Implicit Function Theorem"?
$endgroup$
– ShreevatsaR
Jun 29 '11 at 13:12
$begingroup$
I see both principles as a way to go from "local" solutions to "global" solutions for an equation. I think that's where the analogy lies.
$endgroup$
– Joel Cohen
Jun 29 '11 at 13:33
$begingroup$
I'm sorry if I was not enough explicit. The question was: "how Hensel's Lemma is the algebraic analogue of the Implicit Function Theorem?", which follows implicitely: "it is true that Hensel's lemma is the algebraic analogue of IFT?". Actually I found the explicit statement of the analogy only in some lecture notes, but they are often compared in the sense outlined by J.Cohen. So I thought there was a more deep and precise connection between the topics. For me it is also enough to know that this connection actually does not exist. Anyway, thank again for your time.
$endgroup$
– Giovanni De Gaetano
Jun 29 '11 at 13:46
$begingroup$
Hensel's lemma in a sufficiently general formulation using more than one variable is an algebraic version of the IFT. This is well-explained in the book H. Kurke, G. Pfister, M. Roczen, Henselsche Ringe, Deutsch. Verlag Wissenschaft. (1975), which was printed in the German Democratic Republic, obviously out of print. Unfortunately I know of no other reference.
$endgroup$
– Hagen Knaf
Jun 29 '11 at 13:57
add a comment |
1
$begingroup$
So, is there a question? For what it's worth, I always thought Hensel's Lemma was the $p$-adic version of Newton's Method.
$endgroup$
– Gerry Myerson
Jun 29 '11 at 12:56
1
$begingroup$
I also think of Hensel's lemma as Newton's method, and so does the Wikipedia article. Where did you find the "assertions regarding the fact that Hensel's Lemma is the algebraic version of Implicit Function Theorem"?
$endgroup$
– ShreevatsaR
Jun 29 '11 at 13:12
$begingroup$
I see both principles as a way to go from "local" solutions to "global" solutions for an equation. I think that's where the analogy lies.
$endgroup$
– Joel Cohen
Jun 29 '11 at 13:33
$begingroup$
I'm sorry if I was not enough explicit. The question was: "how Hensel's Lemma is the algebraic analogue of the Implicit Function Theorem?", which follows implicitely: "it is true that Hensel's lemma is the algebraic analogue of IFT?". Actually I found the explicit statement of the analogy only in some lecture notes, but they are often compared in the sense outlined by J.Cohen. So I thought there was a more deep and precise connection between the topics. For me it is also enough to know that this connection actually does not exist. Anyway, thank again for your time.
$endgroup$
– Giovanni De Gaetano
Jun 29 '11 at 13:46
$begingroup$
Hensel's lemma in a sufficiently general formulation using more than one variable is an algebraic version of the IFT. This is well-explained in the book H. Kurke, G. Pfister, M. Roczen, Henselsche Ringe, Deutsch. Verlag Wissenschaft. (1975), which was printed in the German Democratic Republic, obviously out of print. Unfortunately I know of no other reference.
$endgroup$
– Hagen Knaf
Jun 29 '11 at 13:57
1
1
$begingroup$
So, is there a question? For what it's worth, I always thought Hensel's Lemma was the $p$-adic version of Newton's Method.
$endgroup$
– Gerry Myerson
Jun 29 '11 at 12:56
$begingroup$
So, is there a question? For what it's worth, I always thought Hensel's Lemma was the $p$-adic version of Newton's Method.
$endgroup$
– Gerry Myerson
Jun 29 '11 at 12:56
1
1
$begingroup$
I also think of Hensel's lemma as Newton's method, and so does the Wikipedia article. Where did you find the "assertions regarding the fact that Hensel's Lemma is the algebraic version of Implicit Function Theorem"?
$endgroup$
– ShreevatsaR
Jun 29 '11 at 13:12
$begingroup$
I also think of Hensel's lemma as Newton's method, and so does the Wikipedia article. Where did you find the "assertions regarding the fact that Hensel's Lemma is the algebraic version of Implicit Function Theorem"?
$endgroup$
– ShreevatsaR
Jun 29 '11 at 13:12
$begingroup$
I see both principles as a way to go from "local" solutions to "global" solutions for an equation. I think that's where the analogy lies.
$endgroup$
– Joel Cohen
Jun 29 '11 at 13:33
$begingroup$
I see both principles as a way to go from "local" solutions to "global" solutions for an equation. I think that's where the analogy lies.
$endgroup$
– Joel Cohen
Jun 29 '11 at 13:33
$begingroup$
I'm sorry if I was not enough explicit. The question was: "how Hensel's Lemma is the algebraic analogue of the Implicit Function Theorem?", which follows implicitely: "it is true that Hensel's lemma is the algebraic analogue of IFT?". Actually I found the explicit statement of the analogy only in some lecture notes, but they are often compared in the sense outlined by J.Cohen. So I thought there was a more deep and precise connection between the topics. For me it is also enough to know that this connection actually does not exist. Anyway, thank again for your time.
$endgroup$
– Giovanni De Gaetano
Jun 29 '11 at 13:46
$begingroup$
I'm sorry if I was not enough explicit. The question was: "how Hensel's Lemma is the algebraic analogue of the Implicit Function Theorem?", which follows implicitely: "it is true that Hensel's lemma is the algebraic analogue of IFT?". Actually I found the explicit statement of the analogy only in some lecture notes, but they are often compared in the sense outlined by J.Cohen. So I thought there was a more deep and precise connection between the topics. For me it is also enough to know that this connection actually does not exist. Anyway, thank again for your time.
$endgroup$
– Giovanni De Gaetano
Jun 29 '11 at 13:46
$begingroup$
Hensel's lemma in a sufficiently general formulation using more than one variable is an algebraic version of the IFT. This is well-explained in the book H. Kurke, G. Pfister, M. Roczen, Henselsche Ringe, Deutsch. Verlag Wissenschaft. (1975), which was printed in the German Democratic Republic, obviously out of print. Unfortunately I know of no other reference.
$endgroup$
– Hagen Knaf
Jun 29 '11 at 13:57
$begingroup$
Hensel's lemma in a sufficiently general formulation using more than one variable is an algebraic version of the IFT. This is well-explained in the book H. Kurke, G. Pfister, M. Roczen, Henselsche Ringe, Deutsch. Verlag Wissenschaft. (1975), which was printed in the German Democratic Republic, obviously out of print. Unfortunately I know of no other reference.
$endgroup$
– Hagen Knaf
Jun 29 '11 at 13:57
add a comment |
3 Answers
This is elaborated in various places; see e.g. Kuhlmann's paper "Valuation theoretic and model theoretic aspects of local uniformization", in Hauser et al., Resolution of Singularities, p. 389 ff. Full proofs can be found in the references below. See also Ribenboim, "Equivalent forms of Hensel's lemma", Exposition. Math. 3 (1985), no. 1, 3-24.
[K2] 10.5 "The multidimensional Hensel's Lemma", in Ch. 10, "Hensel's Lemma", in the draft of Franz-Viktor Kuhlmann's book on Valuation Theory.
[PZ] A. Prestel, M. Ziegler, "Model-theoretic methods in the theory of topological fields", J. Reine Angew. Math. 299(300) (1978), 318-341.
– Bill Dubuque
answered Jun 29 '11 at 14:19
Ok! This is the kind of answer I was looking for; I'm only a little bit worried by "It is (not all too well) known that Hensel's lemma...". Now I will try to understand the connection completely. Thank you very much!
– Giovanni De Gaetano
Jun 29 '11 at 14:46
Just a remark: Franz-Viktor's book covers only the case of valuation domains. I was under the impression that you are interested in the general case ...
– Hagen Knaf
Jun 29 '11 at 15:08
Bill Dubuque has largely answered the question, but just to be explicit:
Suppose that you have an equation
$f(x,y) = 0$, which you want to solve to express $y$ as a function of $x$.
(This is a typical implicit function theorem situation.)
Well, the implicit function theorem says that first you should choose a small neighborhood of a point, say $x = 0$ to fix ideas. You should then choose a value $y_0$ of $y$ at this point, i.e. fix a solution to $f(0, y_0) = 0$; again, let's assume that we can take $y_0 = 0$. (In other words, we assume that $f(x,y)$ has no constant term, i.e. that $f(0,0) = 0$.)
Now the implicit function theorem says that we can solve for $y$ locally as a function of $x$ provided that $\dfrac{\partial f}{\partial y}(0,0) \neq 0.$ (Of course, there are other technical assumptions --- $f$ should be smooth and so on; let's ignore those, since in a moment I will take $f$ to be a polynomial.)
Now we could think about the implicit function theorem for analytic functions, and then for formally analytic functions, i.e. for formal power series.
So now the question is: given $f(x,y) = 0$, with $f$ a polynomial with no constant term, when can we find a solution $y \in \mathbb{C}[[x]]$ with no constant term? A sufficient condition is given by Hensel's lemma: one needs $f'(0)$ to be a unit in $\mathbb{C}[[x]]$ (thinking of $f$ as a polynomial in $y$ with coefficients in $\mathbb{C}[[x]]$, and taking the derivative in $y$). A formal power series is a unit precisely when its constant term is non-zero, so this can be rephrased as $\dfrac{\partial f}{\partial y}(0,0) \neq 0,$
which is exactly the same condition as in the implicit function theorem.
In short, Hensel's Lemma in the case of a formal power series ring is
exactly the same as an implicit function theorem (for polynomial equations,
say) in which one only asks for formal power series solutions.
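For reference, one standard formulation of the version of Hensel's Lemma being used here, stated just for the complete local ring $\mathbb{C}[[x]]$ with maximal ideal $(x)$, is the following: if $F \in \mathbb{C}[[x]][y]$ and $y_0 \in \mathbb{C}[[x]]$ satisfy
$$F(y_0) \equiv 0 \pmod{(x)} \quad\text{and}\quad \frac{\partial F}{\partial y}(y_0) \not\equiv 0 \pmod{(x)},$$
then there is a unique $y \in \mathbb{C}[[x]]$ with $F(y) = 0$ and $y \equiv y_0 \pmod{(x)}$. Taking $F(y) = f(x,y)$ and $y_0 = 0$ gives exactly the situation above.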
Incidentally, the connection with Newton's Method is easy to see too:
Suppose that we are trying to solve $f(x,y) = 0$ for $y$ in terms of $x$,
under the assumption that $f(0,0) = 0$ and that $\dfrac{\partial f}{\partial y}(0,0) \neq 0.$ We may assume that the latter quantity equals $1$, by rescaling $f$ if necessary, so our equation has the form
$$0 = a x + y + b x^2 + c xy + d y^2 + \cdots,$$
which we can rewrite as
$$ y = -a x - b x^2 - c x y - d y^2 - \cdots .$$
Note that this already determines $y$ up to second order terms.
Now substitute this expression for $y$ back into the right-hand side, to
get
$$y = - a x - b x^2 - c x (- a x - b x^2 - \cdots) - d (- a x - b x^2 - \cdots)^2 - \cdots,$$
to get $y$ up to third order terms. Now substitute in again, to
get $y$ up to fourth order terms, and so on.
This is just Newton's Method.
This proves Hensel's Lemma in this context. It is also easy to estimate
the size of the power series coefficients for $y$ that one obtains, and
to prove that $y$ has a positive radius of convergence. Thus we also
establish a version of the implicit function theorem for analytic functions with the same argument.
Summary: The Implicit Function Theorem, Hensel's Lemma, and Newton's Method are all variants of the same theme.
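To see the iteration in action, here is a rough SymPy sketch of the substitution procedure above (the particular equation, $f(x,y) = y - x - y^2$, is just an example; its solution is the Catalan generating series $y = x + x^2 + 2x^3 + 5x^4 + \cdots$):

```python
# Solve f(x, y) = 0 for y in C[[x]] by the iterated substitution y <- y - f(x, y),
# truncating modulo x^N at each step.  The example f is mine: f = y - x - y^2.
from sympy import symbols, expand

x, y = symbols('x y')
f = y - x - y**2            # f(0,0) = 0 and (df/dy)(0,0) = 1
N = 8                       # work modulo x^N

step = expand(y - f)        # rewrite f(x, y) = 0 as y = y - f(x, y)
sol = 0                     # initial approximation y = 0
for _ in range(N):
    sol = expand(step.subs(y, sol))          # substitute the current approximation
    sol = sol.series(x, 0, N).removeO()      # truncate to order N

print(sol)                                     # the Catalan series x + x^2 + 2x^3 + ... up to x^7
print(expand(f.subs(y, sol)).series(x, 0, N))  # all terms below x^N cancel
```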
– Matt E
answered Jun 29 '11 at 16:22
Although it might not be exactly the same result, you can see both principles as a way to go from "local" solutions to "global" solutions for an equation. Here's a very rough sketch of how I think about it (in each case there are some adjustments to be made if you want a rigorous statement).
First, note that the implicit function theorem is usually formulated with a function of two variables $x$ and $y$ (it goes "for all $y$, there is a unique $x$ ..."), but $y$ will be fixed throughout, so I'll omit it. Start from an "approximate" solution $x$ to your equation (in the implicit function theorem case, you take an actual solution $(x_0, y_0)$, and then $(x_0, y)$ is an approximate solution). In each case, you want to solve $f(x+h) = 0$. Now when $h$ is "small", because $f$ is "sufficiently regular", you can write
$$f(x+h) = f(x) + f'(x)\cdot h + o(h).$$
So the rough idea (or guess) would be that $f(x+h) = 0$ has a unique solution if $f(x) + f'(x)\cdot h = 0$ has one, which would be the case if $f'(x)$ is invertible (if $f$ is multivariate, $f'(x)$ is understood as the differential, meaning it's invertible when the Jacobian is non-zero). Both theorems more or less state that this actually works.
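As a concrete instance of this Newton step in the $p$-adic setting mentioned in the comments, here is a rough Python sketch of Hensel lifting (the choice $f(x) = x^2 - 2$, $p = 7$ and the starting root $3$ are just an example):

```python
# Hensel lifting as a Newton iteration: given a root of f mod p at which f' is
# invertible mod p, refine it by  x <- x - f(x) * f'(x)^(-1)  modulo growing
# powers of p.  Example (mine): a square root of 2 in Z/7^8.

def hensel_lift(f, df, x0, p, k):
    """Lift a simple root x0 of f modulo p to a root modulo p**k."""
    assert f(x0) % p == 0 and df(x0) % p != 0
    x, mod = x0, p
    while mod < p**k:
        mod = min(mod * mod, p**k)                  # modulus grows quadratically
        x = (x - f(x) * pow(df(x), -1, mod)) % mod  # Newton step mod p^m
    return x

f, df = (lambda t: t*t - 2), (lambda t: 2*t)
root = hensel_lift(f, df, 3, 7, 8)       # 3^2 = 9 = 2 (mod 7)
print(root, f(root) % 7**8)              # residual is 0 modulo 7^8
```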
– Joel Cohen
answered Jun 29 '11 at 14:19