
# Week 3 - Friday

## What did we talk about last time?

- Vertex shaders
- Geometry shaders
- Pixel shaders

## The Utah teapot

- The Utah teapot was modeled in 1975 by graphics pioneer Martin Newell at the University of Utah
- It's actually taller than it looks: they distorted the model so that it would look right on their non-square pixel displays
- (Figures: original vs. modern teapot)

## Vectors

Yeah… just what do you know about vectors?

- We refer to n-dimensional real Euclidean space as Rn
- A vector v in this space is an n-tuple, an ordered list of n real numbers
- To match the book (and because we are computer scientists), we'll index these from 0 to n − 1
- We will generally write our vectors as column vectors rather than row vectors
- We will be interested in a number of operations on vectors, including:
  - Addition
  - Scalar multiplication
  - Dot product
  - Norm

## Vector addition

Addition of two vectors is just element-by-element addition:

u + v = (u0 + v0, u1 + v1, …, un−1 + vn−1) ∈ Rn

- Vector addition is associative: (u + v) + w = u + (v + w)
- Vector addition is commutative: u + v = v + u
- There is a unique vector 0 = (0, 0, …, 0) in Rn, with a total of n zeroes
- 0 is the additive identity: 0 + v = v
- For each vector v, there is a unique inverse −v = (−v0, −v1, …, −vn−1)
- −v is the additive inverse: v + (−v) = 0

## Scalar multiplication

Multiplication by a scalar is just element-by-element multiplication by that scalar:

au = (au0, au1, …, aun−1) ∈ Rn

Rules for scalar multiplication can easily be inferred from the normal properties of reals under addition and multiplication:

- (ab)u = a(bu)
- (a + b)u = au + bu
- a(u + v) = au + av
- 1u = u

## Dot product

The dot product is a form of multiplication between two vectors that produces a scalar:

u · v = u0v0 + u1v1 + … + un−1vn−1 ∈ R

Dot product rules are slightly less obvious than the scalar multiplication rules:

- u · u ≥ 0, and it is 0 only when u = 0
- (u + v) · w = u · w + v · w
- (au) · v = a(u · v)
- u · v = 0 ↔ u ⊥ v

## Norm

A norm is a way of measuring the magnitude of a vector. We are actually only interested in the L2 norm, but there are many (infinitely many) norms for any given vector space. We'll denote the norm of u as ‖u‖:

‖u‖ = √(u · u) = √(u0² + u1² + … + un−1²)

- ‖u‖ = 0 ↔ u = 0
- ‖au‖ = |a| ‖u‖
- ‖u + v‖ ≤ ‖u‖ + ‖v‖ (the triangle inequality)
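The four operations above can be sketched in a few lines of Python. This is a toy illustration (vectors as plain lists, not the course's code) just to make the element-by-element definitions concrete:

```python
import math

def vec_add(u, v):
    # Element-by-element addition: (u + v)_i = u_i + v_i
    return [ui + vi for ui, vi in zip(u, v)]

def vec_scale(a, u):
    # Scalar multiplication: (a*u)_i = a * u_i
    return [a * ui for ui in u]

def dot(u, v):
    # Dot product: sum of u_i * v_i for i = 0 .. n-1; produces a scalar
    return sum(ui * vi for ui, vi in zip(u, v))

def norm(u):
    # L2 norm: sqrt(u . u)
    return math.sqrt(dot(u, u))

u, v = [1.0, 2.0, 2.0], [3.0, 0.0, 4.0]
print(vec_add(u, v))  # [4.0, 2.0, 6.0]
print(dot(u, v))      # 11.0
print(norm(u))        # 3.0
```

Note that `norm` is just the dot product of a vector with itself, square-rooted, exactly as in the formula above.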
- |u · v| ≤ ‖u‖ ‖v‖ (the Cauchy–Schwarz inequality)

## Linking algebra to geometry

Mathematical rules are one thing. Understanding how they are interpreted in geometry is something else. Unfortunately, this means getting more math to link up the existing math with geometry.

## Linear independence

A set of vectors u0, u1, …, un−1 is linearly independent if the only scalars that satisfy

v0u0 + v1u1 + … + vn−1un−1 = 0

are v0 = v1 = … = vn−1 = 0. In other words, you can't make any one vector out of any of the others.

## Span and basis

A set of vectors u0, u1, …, un−1 spans Rn if any vector v ∈ Rn can be written:

v = v0u0 + v1u1 + … + vn−1un−1

In addition, if v0, v1, …, vn−1 are uniquely determined for all v ∈ Rn, then u0, u1, …, un−1 form a basis of Rn.

- To properly describe a vector in Rn, the scalars we give are actually multiplied by each of the basis vectors ui
- By convention, we leave off the basis vectors ui, because it would be cumbersome to show them
- Also, they are often boring: (1, 0, 0), (0, 1, 0), and (0, 0, 1)

## Geometric interpretation

- A vector can be either a point in space or an arrow (a direction and a distance)
- The norm of a vector is its distance from the origin (or the length of the arrow)
- In R2 and R3, the dot product is u · v = ‖u‖ ‖v‖ cos φ, where φ is the smallest angle between u and v

## Orthogonal and orthonormal bases

- A basis is orthogonal if every vector in the basis is orthogonal to every other (has dot product 0)
- An orthogonal basis is orthonormal if every vector in it has length 1
- The standard basis is orthonormal and made up of vectors ei, which are all 0's except for a 1 at location i

## Orthogonal projection

We can find the orthogonal projection w of vector u onto vector v. Essentially, this is the part of u that lies along v:

w = tv = ((u · v) / ‖v‖²) v = ((u · v) / (v · v)) v

## Cross product

The cross product of two vectors finds a vector that is orthogonal to both. For 3D vectors u and v in an orthonormal basis, the cross product w = u × v has components:

- wx = uyvz − uzvy
- wy = uzvx − uxvz
- wz = uxvy − uyvx

Its properties:

- ‖u × v‖ = ‖u‖ ‖v‖ sin θ
- u × v = −v × u
- (au + bv) × w = a(u × w) + b(v × w)
- In addition, w ⊥ u and w ⊥ v, and u, v, and w form a right-handed system

## Next time

- More linear algebra
- Matrices
- Homogeneous notation
- Geometric techniques

Keep reading Appendix A.
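The projection and cross product formulas can be sketched the same way. Again a toy illustration with vectors as plain lists (not the course's code); the final check confirms the defining property that u × v is orthogonal to both inputs:

```python
def dot(u, v):
    # Dot product: sum of u_i * v_i
    return sum(ui * vi for ui, vi in zip(u, v))

def project(u, v):
    # Orthogonal projection of u onto v: w = ((u . v) / (v . v)) v
    t = dot(u, v) / dot(v, v)
    return [t * vi for vi in v]

def cross(u, v):
    # Cross product of 3D vectors in an orthonormal basis:
    # (u_y v_z - u_z v_y, u_z v_x - u_x v_z, u_x v_y - u_y v_x)
    return [u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]]

u, v = [1.0, 2.0, 3.0], [0.0, 1.0, 0.0]
w = cross(u, v)
print(project(u, v))         # [0.0, 2.0, 0.0]
print(w)                     # [-3.0, 0.0, 1.0]
print(dot(w, u), dot(w, v))  # 0.0 0.0  (w is orthogonal to both)
```

Projecting onto a standard basis vector, as here, simply picks out one coordinate of u, which matches the geometric picture of "the part of u that lies along v."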
## Reminders

- Assignment 1 is due tonight by midnight!
- Keep working on Project 1, due next Friday, February 10, by 11:59