Lecture 10 Problems

  1. Complete the following function, which performs inverse warping with nearest-neighbor sampling in the source image. Use Python-esque pseudocode (or working code in the lecture codebase); one possible completion is sketched after the skeleton.
```python
import numpy as np

def warp(img, tx, dsize=None):
    """Warp img using tx, a matrix representing a geometric transformation.
    Pre: tx is 3x3 (or some upper-left slice of a 3x3 matrix). img is grayscale.
    Returns: an output image of shape dsize containing the warped img."""
    H, W = img.shape[:2]

    # turn a 2x2 or 2x3 tx into a full 3x3 matrix
    txH, txW = tx.shape
    M = np.eye(3)
    M[:txH, :txW] = tx

    # set the output size to the input size if not specified
    if dsize is None:
        DH, DW = (H, W)
    else:
        DH, DW = dsize[::-1]  # dsize is given as (width, height)
    out = np.zeros((DH, DW))

    # your code here

    return out
```
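One possible completion (a sketch, not the reference solution): invert `M` once, then map each output pixel back into the source image and round to the nearest source pixel, skipping anything that lands outside the input. This snippet is written to slot in at the `# your code here` marker and reuses `M`, `DH`, `DW`, `H`, `W`, `img`, and `out` from the skeleton above.

```python
    Minv = np.linalg.inv(M)
    for y in range(DH):
        for x in range(DW):
            # inverse warp: map the output pixel (x, y) back into the source image
            src = Minv @ np.array([x, y, 1.0])
            sx, sy = src[:2] / src[2]  # divide out the homogeneous coordinate
            # nearest-neighbor sampling: round to the closest source pixel
            sx, sy = int(round(sx)), int(round(sy))
            if 0 <= sx < W and 0 <= sy < H:
                out[y, x] = img[sy, sx]
```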
  2. Suppose you are stitching a panorama with three images \(I_1, I_2, I_3\) and you’ve fitted transformations \(T_{12}\) and \(T_{23}\) that map coordinates from image 1 to image 2 and from image 2 to image 3, respectively. Give the transformation that maps points from image 3’s coordinates to image 1’s coordinates (one way to compose this is sketched after this list).
  3. Give a strategy (describe, or write pseudocode) for finding the corners of the bounding box of a given image img after it has been warped using a homography \(T\) (a minimal code sketch follows this list).
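For problem 2, one way to reason about the composition (a sketch): \(T_{12}\) takes points from image 1 to image 2 and \(T_{23}\) takes points from image 2 to image 3, so mapping from image 3 back to image 1 means undoing each step in reverse order:

\[
T_{31} = T_{12}^{-1}\, T_{23}^{-1} = (T_{23}\, T_{12})^{-1}
\]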
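For problem 3, a minimal code sketch (assuming \(T\) is a 3x3 homography stored as a NumPy array; the helper name `warped_bbox_corners` is just illustrative): warp the four corners of the input in homogeneous coordinates, divide out the homogeneous component, and take the min/max over x and y.

```python
import numpy as np

def warped_bbox_corners(img, T):
    """Corners of the axis-aligned bounding box of img after warping by homography T."""
    H, W = img.shape[:2]
    # homogeneous coordinates (x, y, 1) of the four input corners, one per column
    corners = np.array([[0, 0, 1],
                        [W - 1, 0, 1],
                        [W - 1, H - 1, 1],
                        [0, H - 1, 1]], dtype=float).T
    warped = T @ corners
    warped = warped[:2] / warped[2]  # divide out the homogeneous coordinate
    xmin, ymin = warped.min(axis=1)
    xmax, ymax = warped.max(axis=1)
    return (xmin, ymin), (xmax, ymax)
```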

ML1 Problem

  1. For a linear regression model under MSE loss, calculate the derivative of the loss with respect to each of the parameters \(w\) and \(b\). These are partial derivatives, so when differentiating with respect to \(w\) we’ll just treat \(b\) as a constant, and vice versa.

Use the chain rule to break down the problem as follows:

* \(\frac{\partial \mathcal{L}}{\partial w} = \frac{\partial }{\partial \hat{y_i}} \left[ \frac{1}{n}\sum_{i=1}^{n}(y_i - \hat{y_i})^2\right] \cdot \frac{\partial}{\partial w} \hat{y_i} = \)
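A possible worked version of these derivatives (a sketch, assuming the single-feature model \(\hat{y_i} = w x_i + b\), which the problem statement doesn’t spell out):

\[
\frac{\partial \mathcal{L}}{\partial w} = \frac{1}{n}\sum_{i=1}^{n} 2\,(y_i - \hat{y_i})\cdot\frac{\partial}{\partial w}\left(y_i - \hat{y_i}\right) = -\frac{2}{n}\sum_{i=1}^{n} (y_i - \hat{y_i})\, x_i
\]

\[
\frac{\partial \mathcal{L}}{\partial b} = -\frac{2}{n}\sum_{i=1}^{n} (y_i - \hat{y_i})
\]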