The convergence of the Newton-Raphson method is by no means guaranteed: it depends on the choice of the initial guess $`\mathbf{x}_0`$.
Convergence can often be improved by relaxing (damping) the Newton update, i.e. setting $`\mathbf{x}_k = \mathbf{x}_{k-1} + \alpha\, \mathbf{\delta x}_k`$.
The relaxation factor $`\alpha`$ is usually chosen in $`]0,1]`$.
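As a minimal sketch of the relaxed iteration (assuming a user-supplied residual `F` and Jacobian `J`; the name `damped_newton` and the 2×2 test system are illustrative, not from the text):

```python
import numpy as np

def damped_newton(F, J, x0, alpha=0.7, tol=1e-10, max_iter=100):
    """Newton-Raphson with a fixed relaxation factor alpha in ]0,1]."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        # Newton step: solve J(x_{k-1}) dx_k = -F(x_{k-1})
        dx = np.linalg.solve(J(x), -F(x))
        # relaxed update: x_k = x_{k-1} + alpha * dx_k
        x = x + alpha * dx
        if np.linalg.norm(dx) < tol:
            break
    return x

# usage: solve x0^2 + x1^2 = 1 together with x0 = x1,
# whose positive root is (1/sqrt(2), 1/sqrt(2))
F = lambda x: np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]])
J = lambda x: np.array([[2.0*x[0], 2.0*x[1]], [1.0, -1.0]])
root = damped_newton(F, J, x0=[1.0, 0.5])
```

With `alpha = 1` this reduces to the plain Newton-Raphson iteration; smaller values trade convergence speed for robustness far from the solution.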
|
|
|
|
|
|
When the nonlinear function $`\mathbf{F}(\mathbf{x})`$ has the particular form
$`\mathbf{F}(\mathbf{x}) := \mathbf{A}(\mathbf{x}) \mathbf{x} - \mathbf{b}(\mathbf{x})`$ (i.e.
involves a square $`N\times N`$ matrix $`\mathbf{A}`$ whose entries depend on $`\mathbf{x}`$),
the Newton-Raphson iteration becomes
|
|
|
```math
\mathbf{J}(\mathbf{x}_{k-1}) \mathbf{\delta x}_k
= \mathbf{b}(\mathbf{x}_{k-1}) - \mathbf{A}(\mathbf{x}_{k-1}) \mathbf{x}_{k-1}
```
|
|
|
with

```math
\mathbf{J}(\mathbf{x})_{ij}
= \frac{\partial(\mathbf{A}(\mathbf{x})\mathbf{x})_i}{\partial\mathbf{x}_j}
- \frac{\partial\mathbf{b}(\mathbf{x})_i}{\partial\mathbf{x}_j}
= \sum_{l} \frac{\partial\mathbf{A}(\mathbf{x})_{il}}{\partial\mathbf{x}_j} \mathbf{x}_l
+ \mathbf{A}(\mathbf{x})_{ij}
- \frac{\partial\mathbf{b}(\mathbf{x})_i}{\partial\mathbf{x}_j}.
```
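As a sanity check (on a made-up 2×2 example, not from the text), the entry-wise expansion $`\mathbf{J}(\mathbf{x})_{ij} = \sum_l \partial_{\mathbf{x}_j}\mathbf{A}(\mathbf{x})_{il}\,\mathbf{x}_l + \mathbf{A}(\mathbf{x})_{ij} - \partial_{\mathbf{x}_j}\mathbf{b}(\mathbf{x})_i`$ can be compared against a finite-difference Jacobian of $`\mathbf{F}`$:

```python
import numpy as np

# evaluation point (arbitrary)
x = np.array([0.7, -1.3])

# illustrative A(x) and b(x), chosen only for this check
A = lambda x: np.array([[1 + x[0], x[1]], [x[0]*x[1], 2.0]])
b = lambda x: np.array([x[0]**2, x[1]])
F = lambda x: A(x) @ x - b(x)

# partial derivatives of the entries of A and b, written out by hand
dA = [np.array([[1.0, 0.0], [x[1], 0.0]]),   # dA/dx0
      np.array([[0.0, 1.0], [x[0], 0.0]])]   # dA/dx1
db = [np.array([2*x[0], 0.0]),               # db/dx0
      np.array([0.0, 1.0])]                  # db/dx1

# column j of J: sum_l dA_il/dx_j * x_l  +  A_ij  -  db_i/dx_j
J = np.column_stack([dA[j] @ x + A(x)[:, j] - db[j] for j in range(2)])

# central finite-difference Jacobian of F for comparison
h = 1e-6
Jfd = np.column_stack([(F(x + h*np.eye(2)[j]) - F(x - h*np.eye(2)[j])) / (2*h)
                       for j in range(2)])
print(np.allclose(J, Jfd, atol=1e-7))  # → True
```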
|
|
|
Equivalently, one can solve
|
|
|
```math
\mathbf{J}(\mathbf{x}_{k-1}) \mathbf{x}_k
= \mathbf{b}(\mathbf{x}_{k-1}) - \mathbf{A}(\mathbf{x}_{k-1}) \mathbf{x}_{k-1} + \mathbf{J}(\mathbf{x}_{k-1}) \mathbf{x}_{k-1}.
```
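The direct form above can be sketched as follows; the Jacobian is approximated here by forward finite differences for brevity (an analytic Jacobian is preferable in practice), and `newton_axb` together with the diagonal test system are illustrative, not from the text:

```python
import numpy as np

def newton_axb(A, b, x0, tol=1e-10, max_iter=50, h=1e-7):
    """Newton-Raphson for F(x) = A(x) x - b(x) = 0, solving directly
    for x_k via  J x_k = b - A x + J x  (equivalent to the delta form)."""
    x = np.asarray(x0, dtype=float)
    F = lambda x: A(x) @ x - b(x)
    n = len(x)
    for _ in range(max_iter):
        # forward finite-difference Jacobian J_ij = dF_i/dx_j
        f0 = F(x)
        Jm = np.empty((n, n))
        for j in range(n):
            e = np.zeros(n); e[j] = h
            Jm[:, j] = (F(x + e) - f0) / h
        # solve J(x_{k-1}) x_k = b(x_{k-1}) - A(x_{k-1}) x_{k-1} + J(x_{k-1}) x_{k-1}
        x_new = np.linalg.solve(Jm, b(x) - A(x) @ x + Jm @ x)
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# usage: diagonal A(x) with entries 2 + x_i^2 and constant b, i.e. each
# component solves x^3 + 2x - 3 = 0, whose real root is x = 1
A = lambda x: np.diag(2.0 + x**2)
b = lambda x: np.array([3.0, 3.0])
sol = newton_axb(A, b, x0=np.array([0.5, 2.0]))
```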
|
|
|
|
|
|
### Picard method
|
|
|
|
|
|
|
|
|
-->