
Literal Types


Understanding how TypeScript infers types

Let's take the small snippet of code below as a reference to explain how type inference works in TypeScript:

let number1 = 10;
const number2 = 10;

In the snippet above, when we use the keyword let to declare a variable, TypeScript automatically infers the correct type for it, in this case number1: number. If we then try to re-assign number1 to a value of a different type, TypeScript will display a type warning.
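
As a minimal sketch of that warning (the string value below is just an illustrative example):

let number1 = 10;   // inferred as number1: number
number1 = 20;       // OK: still a number
number1 = "ten";    // Error: Type 'string' is not assignable to type 'number'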

On the other hand, when we make use of the keyword const, TypeScript will not infer the general number type - it infers the type as the value 10 itself (number2: 10), creating what we call a literal type - a type that represents one specific value.

This means we are telling TypeScript that number2 is not a generic number, but specifically A NUMBER with the value 10.

If we analyze what's happening here, this makes sense because:

  • We can re-assign the value of variables declared with the keyword let, so TypeScript only restricts the data type of that variable
  • We can't re-assign the value of variables declared with const, so that variable not only has its data type restricted, but also its value itself (as shown in the sketch after this list)
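
To make the contrast visible, here is a small sketch; the type aliases T1 and T2 are only illustrative names used to inspect what TypeScript inferred:

let number1 = 10;          // inferred as number, value can change
const number2 = 10;        // inferred as the literal type 10, value can never change

type T1 = typeof number1;  // number
type T2 = typeof number2;  // 10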

In TypeScript it's also possible to explicitly declare literal types when declaring variables:

let number1: 10 = 10;
const number2: 10 = 10;

It's important to note, though, that this kind of declaration is most commonly useful in TypeScript when it's associated with union types, which will be discussed later.
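
As a brief peek at that idea (the direction variable and its values are only an illustrative example), a union of literal types restricts a variable to a fixed set of allowed values:

let direction: "left" | "right" = "left";
direction = "right";  // OK: "right" is part of the union
direction = "up";     // Error: Type '"up"' is not assignable to type '"left" | "right"'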
