Talk:Lanczos algorithm

Old comments that did not have a section of their own

[1] - an unfocused variety of Lanczos algorithm —Preceding unsigned comment added by 134.219.166.104 (talkcontribs) 21:23, 1 September 2005

This doesn't have much, but it does have a reference to a book: MathWorld on the Lanczos Algorithm. —Preceding unsigned comment added by RJFJR (talkcontribs) 23:36, 25 September 2005

I don't believe this is the Lanczos algorithm at all. It is the power method. —Preceding unsigned comment added by 130.126.55.123 (talkcontribs) 01:04, 5 August 2006

I don't know if the algorithm is correct, but it's certainly different from the power method, and presented pretty clearly. I think it's gotten me on the right track at least... Thanks. --Jjdonald (talk) 22:22, 17 December 2007 (UTC)

It is not easy to say whether it's wrong or correct, since quite a bit of the information needed to apply it is missing: (a) how to choose v[1], (b) how to choose m, (c) how to recognize the eigenvalues of A among those of T_mm. Unfortunately, this vagueness is by no means eliminated by the Numerical stability section. — MFH:Talk 21:57, 12 September 2008 (UTC)
It is certainly not completely correct: there's at least something faulty with the indices. — MFH:Talk 19:59, 8 December 2011 (UTC)
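For concreteness while (a)-(c) are being discussed, here is a minimal NumPy sketch of the iteration as I read it: v[1] a random unit vector, m chosen by the caller, and the eigenvalues of T_mm (the Ritz values) taken as approximations to the extreme eigenvalues of a Hermitian A. The variable names and the toy matrix are mine, not the article's.

import numpy as np

def lanczos(A, m, rng=np.random.default_rng(0)):
    # Minimal Lanczos sketch for Hermitian A, no reorthogonalization.
    # Returns the diagonal (alpha) and off-diagonal (beta) of the m-by-m T.
    n = A.shape[0]
    v = rng.standard_normal(n)
    v /= np.linalg.norm(v)              # (a) v[1]: a random unit vector
    v_prev = np.zeros(n)
    alpha, beta = np.zeros(m), np.zeros(m - 1)
    for j in range(m):                  # (b) m: chosen by the caller, m <= n
        w = A @ v
        alpha[j] = v @ w
        w -= alpha[j] * v + (beta[j - 1] * v_prev if j > 0 else 0.0)
        if j < m - 1:
            beta[j] = np.linalg.norm(w)
            if beta[j] == 0.0:          # invariant subspace found; stop (or restart)
                break
            v_prev, v = v, w / beta[j]
    return alpha, beta

A = np.diag(np.arange(1.0, 101.0))      # toy Hermitian (diagonal) test matrix
alpha, beta = lanczos(A, m=20)
T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
print(np.linalg.eigvalsh(T)[-3:])       # (c) three largest Ritz values; close to 98, 99, 100

Whether those extreme Ritz values are accurate enough depends on the gaps in the spectrum and on m, which is exactly the vagueness complained about above.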

It should state that "it applies to Hermitian matrices" at the start of the article and not somewhere in the middle. limweizhong (talk) 09:54, 11 November 2008 (UTC)

There is a paper about the non-symmetric Lanczos algorithm (compared to Arnoldi) by Jane Cullum. — MFH:Talk 20:07, 8 December 2011 (UTC)

In Latent Semantic Indexing, for...

I really think that this sentence does not belong in the first paragraph! Please, someone who understands this topic should create a separate section and explain what it is about! Alain Michaud (talk) 16:52, 19 February 2010 (UTC)

Block Lanczos algorithm

I suppose that Peter Montgomery's 1995 paper was very good, but I do not see the need to inform everyone about its existence. This topic is much too advanced to be discussed at the top of the page. Please move this (second paragraph) towards the end of the page.

Alain Michaud (talk) 16:50, 19 February 2010 (UTC)

Extracting information from tridiagonal matrix

So Lanczos gives you a tridiagonal matrix. I think a link explaining how to extract the low eigenvalues/eigenvectors from this matrix would be helpful. —Preceding unsigned comment added by 209.6.144.249 (talk) 06:30, 2 March 2008 (UTC)

Agree - or the largest eigenvalues: anyway, the article starts by saying that it's for calculating eigenvalues, but then stops with the tridiagonal matrix.
By the way, the algorithm calculates up to v[m+1]; I think this could be avoided. (Also, "unrolling" the first part of the m=1 case as initialization should make it possible to avoid using v[0].) — MFH:Talk 03:09, 11 September 2008 (UTC)
PS: it should also be said what 'm' is...
  • Seconded — the article spends a lot of ink on the Lanczos iteration (but could do a better job of explaining it) for producing a tridiagonal matrix, says various other algorithms can be used for calculating eigenvalues and eigenvectors of that tridiagonal matrix, but is almost silent on how the two are related. As far as I can tell, early steps of the iteration tend to put most of the weight on the extreme eigenvalues (largest and smallest both, regardless of their absolute values), meaning those are fairly accurately reproduced in the tridiagonal matrix, and the algorithm proceeds towards less extreme eigenvalues the longer it is run; it's 'tend' because the initial weight distribution depends on the initial vector, which is chosen at random. What is not clear from mere thought experiments is how concentrated the distribution is … 130.243.68.202 (talk) 14:22, 2 May 2017 (UTC)
Thinking some more about this, I find it desirable to modify the way the algorithm is stated — partly to address the case β[j] = 0, and partly to do something about the "changes to" remark at the end, which is confusing in that no variable assigned in the algorithm is ever changed. It turns out the remark is boilerplate text in the {{algorithm-end}} template, and there is no option to omit it. Since Wikipedia:WikiProject_Computer_science/Manual_of_style#Algorithms_and_data_structures does not recommend using that template, and instead recommends using (numbered) lists with explicit input and output headings, a rewrite from scratch seems in order.
Said rewrite is now live in the article. It also addresses the gap of where to go when you've got the tridiagonal matrix. A remaining gap is the result that the eigenvalues of T approximate the eigenvalues of A. 130.243.68.122 (talk) 15:48, 26 May 2017 (UTC)

Following up on the previous comment, the section "Application to eigenproblem" misleadingly implies that the eigenvalues of the m×m matrix T are exactly the eigenvalues of A, which is obviously false. (Indeed the equation in parentheses at the end of the first paragraph is only valid if m = n.) The point is that these eigenvalues approximate the eigenvalues of A; more work needs to be done to understand why and how good the approximation is.
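For the record, here is the standard way to make "approximate" precise (notation is mine, following the usual Rayleigh-Ritz argument; I am not claiming this is in the article yet). After m steps the iteration satisfies

    A V_m = V_m T_m + \beta_m v_{m+1} e_m^*,  \qquad V_m = [v_1, \dots, v_m],

so if T_m s = \theta s with \|s\| = 1 and y = V_m s, then

    \|A y - \theta y\| = \beta_m \, |e_m^* s|.

Each eigenpair (θ, y) of T_m (a Ritz pair) therefore comes with a computable residual, and the β_m v_{m+1} e_m^* term is exactly what vanishes when m = n, which is the only case in which the eigenvalues of T coincide with those of A.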

Define variables

It would be nice if variables were defined before (or just after) being used. For example, at the beginning, some of the variables are used without being defined, and it's confusing for the non-expert public.

Felipebm (talk) 13:34, 17 May 2011 (UTC)

problematic matrix decomposition

In the section "Power method for finding eigenvalues", the matrix A is represented as A = U D U^*, which is true only for normal matrices. For the general case, the SVD decomposition should be used, i.e. A = U Σ V^*, where U and V are some orthogonal matrices. — Preceding unsigned comment added by 89.139.52.157 (talk) 12:14, 24 April 2016 (UTC)

It's not stated explicitly at that point, but presumably A is already taken to be Hermitian (as it needs to be for the Lanczos algorithm to work), which means it has an eigendecomposition of the form stated. Using the SVD decomposition instead in this argument won't work, because the entire point is that the inner factors cancel (U^* U = I), so that the product A^k = U D^k U^* telescopes! Possibly it would be clearer to just use a general diagonalization A = S D S^{-1}, i.e., hold off on requiring orthogonality — the reason being that the paragraph in question is about the plain power method, which applies in greater generality. 130.243.68.202 (talk) 13:01, 2 May 2017 (UTC)
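A small numerical illustration of the telescoping point (a toy NumPy example of mine, not a statement about what the article should say): for Hermitian A = U D U^*, unitarity of U makes the inner factors cancel in A^k, whereas the SVD factors of a generic non-normal matrix do not cancel that way.

import numpy as np

rng = np.random.default_rng(2)

# Hermitian (here real symmetric) A: eigendecomposition A = U D U^T with orthogonal U
A = rng.standard_normal((5, 5))
A = (A + A.T) / 2
d, U = np.linalg.eigh(A)

k = 7
lhs = np.linalg.matrix_power(A, k)
rhs = U @ np.diag(d**k) @ U.T               # telescoping: every inner U.T @ U is the identity
print(np.allclose(lhs, rhs))                # True

# A generic (non-normal) matrix: the SVD factors do not telescope, since V^T U != I
B = rng.standard_normal((5, 5))
u, s, vt = np.linalg.svd(B)
print(np.allclose(np.linalg.matrix_power(B, 2), u @ np.diag(s**2) @ vt))   # False in general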

External links modified

Hello fellow Wikipedians,

I have just modified one external link on Lanczos algorithm. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FAQ for additional information. I made the following changes:

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 18 January 2022).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 14:26, 16 December 2017 (UTC)