Do you know what a vector space is and what its structure looks like? Vector spaces are probably **the most common mathematical structures** we can find. All phenomena classified as linear, in a multitude of contexts, are linked in some way to a vector space, which gives an idea of their importance. Therefore, in this post we talk about the structure of a vector space.

## What is a vector space?

Before talking about the structure of a vector space, it is convenient to define what a vector space is. Among other things, a vector space can be understood as:

A graphic representation: a set of dimensions in which vectors can be located.

**Imagine a vector space as a region of space in which all the vectors that reside there, absolutely all we can imagine, have the same properties.** What this indicates is that all the vectors residing in the vector space behave the same way with respect to their operations. Specifically, some of these operations are vector addition and the product by a scalar, among others.

**A vector space is nothing more than a mathematical foundation that allows us to define where vectors live.** A consequence of defining a vector space is that operations can be performed with all the vectors that reside within it.

Among these operations is, for example, *cosine similarity*. This is possible because all the vectors have the same dimensions.
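As a quick sketch, cosine similarity can be written in a few lines of Python (the function name and the use of NumPy here are our own illustration, not something defined in the post):

```python
import numpy as np

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors of the same dimension."""
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

# Two parallel vectors in the same 3-dimensional vector space
u = np.array([1.0, 2.0, 3.0])
v = np.array([2.0, 4.0, 6.0])
print(cosine_similarity(u, v))  # parallel vectors give a similarity of ~1.0
```

Note that the comparison only makes sense because both vectors belong to the same vector space and therefore have the same number of dimensions.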

## The structure of a vector space

A vector space is a non-empty set *V* of objects, called vectors, on which two operations have been defined: the sum and the product by a scalar (a real number), subject to ten axioms and with some properties, which make up the structure of a vector space.

**The 10 axioms of vector spaces**

There are some axioms present in the structure of a vector space that are worth mentioning, such as:

1. u + v ∈ V
2. u + v = v + u
3. (u + v) + w = u + (v + w)
4. There exists a null vector 0V ∈ V such that v + 0V = v
5. For every v in V, there exists an opposite (−v) ∈ V such that v + (−v) = 0V
6. αv ∈ V
7. α(u + v) = αu + αv
8. (α + β)v = αv + βv
9. α(βv) = (αβ)v
10. 1v = v
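As an illustrative sketch (using NumPy and random test vectors, which are our own choice and not part of the formal definition), the equational axioms can be checked numerically for vectors in R³:

```python
import numpy as np

rng = np.random.default_rng(0)
u, v, w = rng.standard_normal((3, 3))  # three random vectors in R^3
a, b = 2.0, -1.5                       # two real scalars

assert np.allclose(u + v, v + u)                # axiom 2: commutativity
assert np.allclose((u + v) + w, u + (v + w))    # axiom 3: associativity
zero = np.zeros(3)
assert np.allclose(v + zero, v)                 # axiom 4: null vector
assert np.allclose(v + (-v), zero)              # axiom 5: opposite vector
assert np.allclose(a * (u + v), a * u + a * v)  # axiom 7: distributivity over vectors
assert np.allclose((a + b) * v, a * v + b * v)  # axiom 8: distributivity over scalars
assert np.allclose(a * (b * v), (a * b) * v)    # axiom 9: compatibility of products
assert np.allclose(1.0 * v, v)                  # axiom 10: scalar identity
print("equational axioms hold for these test vectors")
```

The two closure axioms (1 and 6) need no check here: adding or scaling arrays in R³ always yields another array in R³.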

**Properties of vector spaces**

Likewise, the structure of a vector space also has some important properties to highlight. Among them are:

#### For the sum

**Associative property**: if we have vectors u, v, and w, adding v and w first and then u gives the same result as doing it the other way around → u + (v + w) = (u + v) + w → **∀u, v, w ∈ V**

**Commutative property**: if we do u plus v, it will be the same as v plus u → u + v = v + u → **∀u, v ∈ V**

**Existence of the neutral element**: means that if we add zero to a vector, the result is the original vector → There exists e ∈ V such that e + v = v + e = v **∀v ∈ V**

**Existence of the opposite element**: means that for every vector v there is an opposite vector that, added to v, gives the null vector → For each v ∈ V there exists w ∈ V such that v + w = w + v = e

#### For the product by a scalar

The properties of the structure of a vector space for the product by a scalar are almost the same as those for the sum:

**Associative property:** λ(µv) = (λµ)v **∀v ∈ V, ∀λ, µ ∈ R**

**Existence of the neutral element:** 1v = v **∀v ∈ V**

**Distributive property of the product with respect to the sum of vectors:** λ(u + v) = λu + λv **∀u, v ∈ V** and **∀λ ∈ R**

**Distributive property of the product with respect to the sum of scalars:** (λ + µ)u = λu + µu **∀u ∈ V** and **∀λ, µ ∈ R**
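A minimal numeric sketch of the two distributive properties, with example vectors and scalars chosen purely for illustration:

```python
import numpy as np

lam, mu = 2.0, 3.0                 # two real scalars
u = np.array([1.0, 0.0, 2.0])
v = np.array([0.0, 1.0, 1.0])

# Distributivity over a sum of vectors: λ(u + v) = λu + λv
print(lam * (u + v))               # [2. 2. 6.]
print(lam * u + lam * v)           # [2. 2. 6.]

# Distributivity over a sum of scalars: (λ + µ)v = λv + µv
print((lam + mu) * v)              # [0. 5. 5.]
print(lam * v + mu * v)            # [0. 5. 5.]
```

Both sides of each equality produce the same vector, which is exactly what the properties guarantee for every choice of vectors and scalars.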

## How to continue?

Now that we have seen how the structure of a vector space works, you can continue learning about Big Data thanks to the Big Data, Artificial Intelligence & Machine Learning Full Stack Bootcamp, a high-intensity training program in which you will acquire all the theoretical and practical knowledge you need to enter the labor market in a short time. You will be supported by a great team of professionals ready to guide you through your training process and the subsequent job search. **Dare to boost your future and request more information now!**