Eigenvalues and Eigenvectors From Scratch Using QR Decomposition (Householder) With Python

One Sunday morning I decided to implement a function to compute the eigenvalues and eigenvectors of a matrix (square, symmetric) from scratch using Python.

Note: Eigenvalues and eigenvectors apply only to square matrices. If the source matrix is symmetric, the eigenvalues will be ordinary real numbers, but if the source matrix is not symmetric, one or more eigenvalues could be complex (involving the square root of -1).
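For example (a quick sketch using NumPy's built-in eig() rather than the from-scratch code below), a 90-degree rotation matrix is square but not symmetric, and its eigenvalues are purely imaginary:

```python
import numpy as np

# a 90-degree rotation matrix -- square but not symmetric
M = np.array([[0.0, -1.0],
              [1.0,  0.0]])
vals, vecs = np.linalg.eig(M)
print(vals)  # the two eigenvalues are +1j and -1j
```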

There are several algorithms for eigenvalues / eigenvectors. I used the QR algorithm. And then there are several algorithms for QR decomposition. I used the Householder algorithm.
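The core idea of the QR algorithm can be sketched in a few lines, here using the built-in np.linalg.qr() as a stand-in for the Householder version implemented below. Repeatedly factor X = QR and then form X = RQ; because RQ is similar to X, the eigenvalues are preserved, and the diagonal of X converges to the eigenvalues (possibly in a different order than np.linalg.eig() reports them):

```python
import numpy as np

A = np.array([[0.9, 0.5, 0.2],
              [0.5, 0.3, 0.4],
              [0.2, 0.4, 0.8]])
X = np.copy(A)
for _ in range(100):
  Q, R = np.linalg.qr(X)  # factor X = QR
  X = np.matmul(R, Q)     # X = RQ is similar to X: same eigenvalues
print(np.diag(X))  # converges to the eigenvalues of A
```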


The results of my from-scratch my_eigen() function and the numpy linalg eig() function are the same. Columns [0] and [2] of my from-scratch version are -1 times columns [0] and [2] of the linalg version. This is OK because multiplying an eigenvector by any nonzero constant (including -1) has no effect on its eigenvector-ness.
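To see that the sign flip is harmless, here is a small check using the same 3x3 demo matrix that appears below: if v is an eigenvector with eigenvalue lam, then -v satisfies A(-v) = lam(-v) too.

```python
import numpy as np

A = np.array([[0.9, 0.5, 0.2],
              [0.5, 0.3, 0.4],
              [0.2, 0.4, 0.8]])
vals, vecs = np.linalg.eig(A)
v = vecs[:, 0]  # an eigenvector with eigenvalue vals[0]
print(np.allclose(np.matmul(A, v), vals[0] * v))      # True
print(np.allclose(np.matmul(A, -v), vals[0] * (-v)))  # True
```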

As I was writing my code, I realized that the term “from scratch” can have several interpretations. For example, my my_eigen() function calls QR decomposition in a loop. I could use the built-in numpy np.linalg.qr() function or I could implement a my_qr() function from scratch, which I did.

But then my from-scratch QR decomposition calls np.eye(), np.copy(), np.linalg.norm(), np.dot(), and np.matmul(). Depending on what is meant by “from scratch,” I could use those numpy functions, or I could implement my_eye(), my_copy(), and so on, all from scratch.

In the end, it was a fun and interesting mental exercise on a Sunday morning.



Writing code from scratch is one thing. Making Star Wars themed costumes from scratch is another.


Demo code below.

# eigen_demo.py
# eigenvalues, eigenvectors from scratch using Python

import numpy as np

# -----------------------------------------------------------

def my_eigen(A):
  # A must be square, symmetric
  # return (eigenvalues, eigenvectors) tuple
  n = len(A)
  X = np.copy(A)  # or X = my_copy(A), see below
  pq = np.eye(n)
  max_ct = 10000

  ct = 0
  while ct < max_ct:
    Q, R = my_qr(X)
    pq = np.matmul(pq, Q)  # accum Q
    X = np.matmul(R, Q)  # note order
    ct += 1

    if is_upper_tri(X, 1.0e-8):
      break

  if ct == max_ct:
    print("WARN: did not converge ")

  # eigenvalues are diag elements of X
  e_vals = np.zeros(n, dtype=np.float64)
  for i in range(n):
    e_vals[i] = X[i][i]

  # eigenvectors are columns of pq
  e_vecs = np.copy(pq)
  
  return (e_vals, e_vecs)

# -----------------------------------------------------------

def my_qr(A, standard=False):
  # QR decomposition. standard: make R diag positive
  # Householder algorithm, i.e., not Gram-Schmidt or Givens
  # if not square, verify m > n ("tall")
  # if standard==True verify m == n

  m, n = A.shape  
  Q = np.eye(m)  # or my_eye(m) -- see below
  R = np.copy(A)
  if m == n: end = n-1
  else: end = n
  for i in range(0, end):
    H = np.eye(m)

    # -------------------
    a = R[i:, i]  # partial column vector
    norm_a = np.linalg.norm(a)
    if a[0] < 0.0: norm_a = -norm_a
    v = a / (a[0] + norm_a)
    v[0] = 1.0
    h = np.eye(len(a))  # H reflection
    h -= (2 / np.dot(v, v)) * \
      np.matmul(v[:,None], v[None,:])
    # -------------------

    H[i:, i:] = h  # copy h into H
    Q = np.matmul(Q, H)
    R = np.matmul(H, R)

  if standard:  # A must be square
    S = np.zeros((n,n))  # signs of diagonal
    for i in range(n):
      if R[i][i] < 0.0: S[i][i] = -1.0
      else: S[i][i] = +1.0
    Q = np.matmul(Q, S)
    R = np.matmul(S, R)

  return Q, R

# -----------------------------------------------------------

def is_upper_tri(A, tol):
  n = len(A)
  for i in range(0,n):
    for j in range(0,i):  # lower
      if np.abs(A[i][j]) > tol:
        return False
  return True
      
# -----------------------------------------------------------
# 
# for really from-scratch, use these functions
#
# -----------------------------------------------------------

def my_eye(n):
  result = np.zeros((n,n), dtype=np.float64)
  for i in range(n):
    result[i][i] = 1.0 
  return result

def my_copy(v): 
  n = len(v)
  result = np.zeros(n, dtype=np.float64)
  for i in range(n):
    result[i] = v[i]
  return result

def my_norm(v):
  n = len(v)
  sum = 0.0
  for i in range(n):
    sum += v[i] * v[i]
  return np.sqrt(sum)

def my_matmul(A, B):
  a_rows = len(A); a_cols = len(A[0])
  b_rows = len(B); b_cols = len(B[0])
  result = np.zeros((a_rows,b_cols), dtype=np.float64)
  for i in range(a_rows):
    for j in range(b_cols):
      for k in range(a_cols):
        result[i][j] += A[i][k] * B[k][j]
  return result
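One helper the functions above don't cover is np.dot(). A from-scratch version in the same style (my addition, not part of the original demo) might look like:

```python
def my_dot(u, v):
  # vector dot product from scratch
  n = len(u)
  result = 0.0
  for i in range(n):
    result += u[i] * v[i]
  return result
```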
 
# -----------------------------------------------------------

def main():
  print("\nBegin eigenvalues, eigenvectors from scratch ")
  np.set_printoptions(suppress=True,
    precision=4, floatmode='fixed')

  A = np.array([[0.9, 0.5, 0.2],
                [0.5, 0.3, 0.4],
                [0.2, 0.4, 0.8]], dtype=np.float64)

  print("\nSource matrix: ")
  print(A)

  e_vals, e_vecs = my_eigen(A)
  print("\neigenvalues from scratch: ")
  print(e_vals)
  print("\neigenvectors (columns) from scratch: ")
  print(e_vecs)

  # verify Ax = vx
  # Ax = np.matmul(A, e_vecs[:,0])
  # print(Ax)
  # vx = e_vals[0] * e_vecs[:,0]
  # print(vx)

  e_vals, e_vecs = np.linalg.eig(A)
  print("\neigenvalues from np.linalg: ")
  print(e_vals)
  print("\neigenvectors (columns) from np.linalg: ")
  print(e_vecs)

  print("\nEnd demo ")

if __name__ == "__main__":
  main()