Mathematics for Machine Learning – Day 13



[Image: isomorphic vector meme]

Linear Mappings

Let me slap you with a formula first.

Consider

$$
V, W \text{ are vector spaces} \\
\text{A mapping } \Phi : V \to W \text{ preserves the vector space}
$$

If

$$
\Phi(x+y) = \Phi(x) + \Phi(y) \\
\Phi(\lambda x) = \lambda \Phi(x) \\
\forall x, y \in V \text{ and } \lambda \in \reals
$$

What's a mapping? And why did you start with a formula?

Good question. It's because you might recognize this as the property of distributivity, just with a different symbol. Also, if I'm getting jumped by an equation, you're going to get jumped as well.

It preserves the vector while still allowing it to be scaled. And don't forget, with mathematical notation Phi can be anything so long as it doesn't violate the stated rules: it can involve a scalar, a vector, a matrix, even a set!... Actually, it can't be a set; I'd get slapped by a mathematician for trying to map a vector with a set.

What use is mapping?

I don't know :D but much like map in pandas (it's a Python library), a mapping can be an important tool, and I see similarities with this type of mapping.
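
For instance, here's a minimal sketch of that analogy (assuming you have pandas installed; the Series and the doubling function are just my picks):

```python
import pandas as pd

# A Series is our "domain": map sends every element somewhere,
# much like Phi sends every vector in V to a vector in W.
v = pd.Series([1, 2, 3])
doubled = v.map(lambda x: 2 * x)  # elementwise "mapping"

print(doubled.tolist())  # [2, 4, 6]
```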

Also, the mapping itself isn't the main topic, it's:

Linear mapping

A mapping is called a linear mapping if:

$$
\forall x, y \in V, \forall \lambda, \psi \in \reals : \\
\Phi(\lambda x + \psi y) = \lambda \Phi(x) + \psi \Phi(y)
$$

In words: a mapping is linear when mapping a linear combination of vectors gives the same result as mapping each vector first and then combining them; there's a quick numerical check below. A linear mapping is also called a linear transformation or a vector space homomorphism.
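
Here's that check as a minimal sketch, assuming Φ(x) = Ax for a matrix A that I picked arbitrarily, along with arbitrary test vectors and scalars:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
phi = lambda x: A @ x  # Phi(x) = Ax, a mapping from R^2 to R^2

x = np.array([1.0, 2.0])
y = np.array([-3.0, 0.5])
lam, psi = 2.0, -1.5  # arbitrary scalars

# Linearity: Phi(lam*x + psi*y) == lam*Phi(x) + psi*Phi(y)
print(np.allclose(phi(lam * x + psi * y),
                  lam * phi(x) + psi * phi(y)))  # True
```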

Consider a mapping

$$
\text{A mapping } \Phi : V \to W \\
\text{where } V, W \text{ can be arbitrary sets}
$$

Phi gets a different name depending on which of the following conditions it satisfies:

Injective

$$
\forall x, y \in V : \Phi(x) = \Phi(y) \implies x = y
$$

This means no two distinct elements of V map to the same element in W.

Surjective

$$
\Phi(V) = W
$$

This means that every element in W can be reached from V using Phi.

Bijective

$$
\Psi \circ \Phi(x) = x
$$

A bijective mapping fulfills both the injective and surjective conditions. It can be undone by composing it with the inverse mapping:

$$
\Psi : W \to V \\
\text{where } \Psi = \Phi^{-1}
$$
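
Here's a minimal numerical sketch of that undo step (an invertible matrix of my own choosing plays Φ, and its matrix inverse plays Ψ = Φ⁻¹):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # det(A) = -2, so Phi(x) = Ax is bijective
A_inv = np.linalg.inv(A)     # Psi = Phi^{-1}

x = np.array([5.0, -1.0])

# Psi(Phi(x)) = x: the mapping is undone by its inverse.
print(np.allclose(A_inv @ (A @ x), x))  # True
```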

Special cases of linear mapping

$$
\text{1. Isomorphism: } \Phi : V \to W \text{, linear and bijective} \\
\text{2. Endomorphism: } \Phi : V \to V \text{, linear} \\
\text{3. Automorphism: } \Phi : V \to V \text{, linear and bijective}
$$
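
To make these concrete, here's a small sketch with matrices I made up: a projection onto the x-axis is an endomorphism of R^2 that isn't bijective, while a rotation is an automorphism:

```python
import numpy as np

# Endomorphism: linear, V -> V, not necessarily bijective.
endo = np.array([[1.0, 0.0],
                 [0.0, 0.0]])  # projection onto the x-axis

# Automorphism: linear, V -> V, and bijective (invertible).
theta = np.pi / 4
auto = np.array([[np.cos(theta), -np.sin(theta)],
                 [np.sin(theta),  np.cos(theta)]])  # rotation by 45 degrees

print(np.linalg.matrix_rank(endo))  # 1 -> not bijective on R^2
print(np.linalg.matrix_rank(auto))  # 2 -> full rank, bijective on R^2
```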

Example

$$
\Phi : \reals^2 \to \Complex, \quad \Phi(x) = x_1 + ix_2
$$
$$
\Phi\left(\left[\begin{array}{c} x_1 \\ x_2 \end{array}\right] + \left[\begin{array}{c} y_1 \\ y_2 \end{array}\right]\right) = (x_1 + ix_2) + (y_1 + iy_2) \\
= \Phi\left(\left[\begin{array}{c} x_1 \\ x_2 \end{array}\right]\right) + \Phi\left(\left[\begin{array}{c} y_1 \\ y_2 \end{array}\right]\right)
$$

On the other hand:

$$
\Phi\left(\lambda \left[\begin{array}{c} x_1 \\ x_2 \end{array}\right]\right) = \lambda x_1 + i \lambda x_2 \\
= \lambda (x_1 + ix_2) = \lambda \, \Phi\left(\left[\begin{array}{c} x_1 \\ x_2 \end{array}\right]\right)
$$
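
If you want to sanity-check this example numerically, here's a small sketch (the vectors and the scalar are just my picks):

```python
import numpy as np

phi = lambda x: x[0] + 1j * x[1]  # Phi: R^2 -> C, Phi(x) = x1 + i*x2

x = np.array([1.0, 2.0])
y = np.array([3.0, -4.0])
lam = 2.5

print(np.isclose(phi(x + y), phi(x) + phi(y)))  # additivity: True
print(np.isclose(phi(lam * x), lam * phi(x)))   # homogeneity: True
```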

Finite-dimensional vector spaces

(From theorem 3.59 in Axler, 2015)

Finite-dimensional vector spaces V and W are isomorphic if and only if dim(V) = dim(W).

Intuition

This means that V and W are kind of the same thing, since they can be transformed into one another without incurring any loss.
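
For example, the 2×2 real matrices and R^4 both have dimension 4, so the theorem says they're isomorphic; flattening is one such lossless transformation. A minimal sketch (the particular matrix is arbitrary):

```python
import numpy as np

M = np.array([[1.0, 2.0],
              [3.0, 4.0]])

v = M.reshape(4)          # R^{2x2} -> R^4: flatten
M_back = v.reshape(2, 2)  # R^4 -> R^{2x2}: undo it

print(np.array_equal(M, M_back))  # True: no information was lost
```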

Consider the vector spaces V, W, X.

For linear mappings:

$$
\Phi : V \to W \text{ and } \Psi : W \to X
$$

their composition is also a linear mapping:

$$
\Psi \circ \Phi : V \to X
$$
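
In matrix terms, composition is matrix multiplication, which is why the composite is linear too. A minimal sketch with matrices I made up (Φ as A, Ψ as B):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [2.0, 1.0],
              [0.0, 3.0]])        # Phi: R^2 -> R^3
B = np.array([[1.0, -1.0, 0.0],
              [0.0,  2.0, 1.0]])  # Psi: R^3 -> R^2

x = np.array([1.0, 2.0])

# Psi(Phi(x)) equals the single linear mapping given by B @ A.
print(np.allclose(B @ (A @ x), (B @ A) @ x))  # True
```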

For isomorphism

If:

$$
\Phi : V \to W \text{ is an isomorphism}
$$

Then:

$$
\Phi^{-1} : W \to V \text{ is an isomorphism too}
$$

For linear mappings (2)

If:

$$
\Phi : V \to W, \quad \Psi : V \to W \text{ are linear}
$$

Then:

$$
\Phi + \Psi \text{ and } \lambda \Phi, \ \lambda \in \reals \text{ are linear too}
$$
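
One more quick check (matrices, scalar, and vector are arbitrary picks of mine): in matrix form, Φ + Ψ acts as A + B and λΦ as λA, so both stay linear:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 1.0]])   # Phi
B = np.array([[0.0, -1.0],
              [3.0,  2.0]])  # Psi
lam = 0.5
x = np.array([2.0, -3.0])

print(np.allclose((A + B) @ x, A @ x + B @ x))    # (Phi + Psi)(x)
print(np.allclose((lam * A) @ x, lam * (A @ x)))  # (lam * Phi)(x)
```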

Acknowledgement

I can't overstate this: I'm truly grateful that this book has been open-sourced for everyone. Many people will be able to learn and understand machine learning at a fundamental level. Whether changing careers, demystifying AI, or just learning in general, this book offers immense value, even for a fledgling learner such as myself. So, Marc Peter Deisenroth, A. Aldo Faisal, and Cheng Soon Ong, thank you for this book.

Source:
Axler, S. (2015). Linear Algebra Done Right. Springer.
Deisenroth, M. P., Faisal, A. A., & Ong, C. S. (2020). Mathematics for Machine Learning. Cambridge University Press.
https://mml-book.com

