Wikipedia:Reference desk/Archives/Mathematics/2023 October 24

Mathematics desk
Welcome to the Wikipedia Mathematics Reference Desk Archives
The page you are currently viewing is a transcluded archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


October 24

Singular Vector of Scatter Plot

When you have a scatter plot, you have a matrix of values: two columns are X and Y for the scatter plot, and each row is a point in the plot. Assume I have 11 points, so I have 11 rows and 2 columns. I cannot get an eigenvector because the matrix is not square. So I multiply the matrix by its transpose and get an 11x11 matrix. I get the eigenvectors of that, which form an 11x11 matrix. I take the square root to get the singular vectors, which is again an 11x11 matrix. What I have not been able to find is how to make sense of the resulting 11x11 matrix. I expect to have a set of vectors, each indicating a direction. I am in 2D space, just X and Y, so how do I turn the 11x11 matrix into 2D vectors indicating the direction for each vector? 12.116.29.106 (talk) 16:28, 24 October 2023 (UTC)

I am not sure why you multiply the matrix by its transpose and then proceed to find its eigenvectors and so on. Are you following some recipe? What kind of sense would you hope to make of the result of these manipulations? Assuming we can produce 2D vectors, what would they mean or indicate? If you do the multiplication the other way around (AᵀA instead of AAᵀ), you get a symmetric 2×2 matrix for which the indices have at least some meaning, as they correspond to the X and Y columns. It may become more meaningful if the X and Y values are adjusted by subtracting from each its average value, so that their sums become zero. Then the 2×2 matrix contains the data needed to compute the slope of the "trendline" or regression line, which goes through the point (x̄, ȳ); see Simple linear regression § Simple linear regression without the intercept term (single regressor).  --Lambiam 21:00, 24 October 2023 (UTC)
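The centering-and-slope recipe above can be sketched in NumPy (the data here is made up purely for illustration):

```python
import numpy as np

# Hypothetical data: 11 (X, Y) points as an 11x2 matrix A.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 11)
y = 2.0 * x + rng.normal(scale=0.5, size=11)
A = np.column_stack([x, y])

# Subtract each column's mean so the column sums become zero.
Ac = A - A.mean(axis=0)

# The symmetric 2x2 matrix Ac^T Ac holds the sums of squares and
# cross-products needed for the regression slope.
M = Ac.T @ Ac

# Slope of the trendline, which passes through the mean point.
slope = M[0, 1] / M[0, 0]
```

Here M[0, 1] is Σ(xᵢ−x̄)(yᵢ−ȳ) and M[0, 0] is Σ(xᵢ−x̄)², so their ratio is the usual least-squares slope.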
@Lambiam: Doing so is key to the singular value decomposition.--Jasper Deng (talk) 06:51, 26 October 2023 (UTC)
Your process is nearly correct. You simply need to reverse your matrix product, as Lambiam noted. Your square matrix will be 2x2, and its eigenvectors already form the 2x2 set of singular vectors; it is the square roots of the eigenvalues (not the eigenvectors) that give the singular values. Each column will be an X/Y vector. Transpose it to make the X/Y vectors rows, like your original data. The way you did it, you effectively had two points in 11-dimensional space, so you ended up with 11 singular vectors in 11-dimensional space. 97.82.165.112 (talk) 10:39, 25 October 2023 (UTC)
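This can be checked directly in NumPy with made-up data; the eigenvectors of the 2x2 product agree with the rows of Vt from the built-in SVD:

```python
import numpy as np

# Hypothetical 11x2 data matrix: 11 points, columns X and Y.
rng = np.random.default_rng(1)
A = rng.normal(size=(11, 2))

# Eigen-decompose the 2x2 matrix A^T A. Its eigenvectors are the
# right singular vectors of A -- the 2D directions asked about --
# and the square roots of its eigenvalues are the singular values.
evals, evecs = np.linalg.eigh(A.T @ A)
sing_vals = np.sqrt(evals)

# Cross-check against the built-in SVD: the rows of Vt are the same
# 2D directions, up to ordering and sign.
U, S, Vt = np.linalg.svd(A)
```

Note that np.linalg.svd returns singular values in descending order while np.linalg.eigh returns eigenvalues in ascending order, so compare them sorted; the direction of largest spread is Vt[0], which matches evecs[:, -1] up to sign.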