
Why Using the Right Variable Type is Important (Especially in JavaScript)

Variable types are a fundamental programming concept that often confuses beginners – especially those learning JavaScript. This article will show you the consequences of using the wrong variable types.

What is a Variable Type?

In computer programming, the type of a variable determines what values a variable can hold and what can be done with the variable. As an example, numeric variables can have mathematical operations performed on them, whereas string variables cannot (more on this later).
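A quick sketch of this distinction in JavaScript, where the type belongs to the value a variable holds (the variable names here are illustrative):

```javascript
// The type of the value determines which operations make sense.
const count = 3;        // a number: arithmetic works
const label = "three";  // a string: string operations apply instead

console.log(typeof count);        // "number"
console.log(typeof label);        // "string"
console.log(count * 2);           // 6
console.log(label.toUpperCase()); // "THREE" – a string operation
```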

Different programming languages support different types of variables, but generally, they fall into the categories of numeric, string, and object variables.

Strongly Typed Languages

Strongly typed languages enforce the correct usage of variable types and will emit warnings and errors if you try to perform an action on a variable type that does not support that action.

Loosely Typed Languages

On the other hand, loosely typed languages will ignore these incompatibilities and try to continue with a best guess – coercing variables of the wrong type into a type that supports the attempted action.

This may seem convenient but often results in unexpected behavior. Relying on your computer to correctly guess what type a variable was meant to be is unwise. When programming, your intentions should be clear and unambiguous, so that both your computer and future programmers who work on your code know what it is supposed to do.
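One way to keep those intentions unambiguous in JavaScript is to convert values explicitly instead of relying on coercion. A minimal sketch (userInput is a hypothetical value read as a string, e.g. from a form field):

```javascript
// A string value, as it might arrive from user input.
const userInput = "5";

// Implicit coercion: '+' concatenates when either operand is a string.
console.log(userInput + 1); // "51"

// Explicit conversion makes the intent clear.
const quantity = Number(userInput);
console.log(quantity + 1);  // 6
```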

JavaScript – Nightmare Mode

The worst offender for being loosely typed is JavaScript. Strings can have arithmetic performed on them. Numbers can be appended to strings. If your variables don't hold values of the right type, you will run into all sorts of inexplicable behavior.

Here’s an example:

Adding two numeric values works as you would expect:

3 + 3

Returns:

6

What if the values being added are strings that contain a number?

'3' + '3'

Returns the two strings joined – the numbers are not treated as numbers but simply as characters in the string:

"33"

Mixing addition and subtraction on numbers also works as you would expect:

3 + 3 - 3

Returns:

3

Mixing addition and subtraction with strings containing numbers, however:

'3' + '3' - '3'

Returns:

30

Seems to make no sense at all! What actually happens: '3' + '3' concatenates the two strings into '33', but the subtraction operator has no meaning for strings, so JavaScript then coerces '33' and '3' to numbers and evaluates 33 - 3, giving 30.
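To avoid surprises like this, convert the strings to numbers before doing any arithmetic. A sketch:

```javascript
// The same expression with explicit conversion behaves predictably.
const result = Number('3') + Number('3') - Number('3');
console.log(result); // 3

// parseInt also works when the string may carry trailing characters.
console.log(parseInt('3px', 10)); // 3
```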

In a complex application, these issues can pile up quickly. Making sure your variables contain the right type of value is important if you want a reliable, accurate application.
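In plain JavaScript, one defensive option is to check types at runtime before operating on values. A minimal sketch, using a hypothetical addNumbers helper:

```javascript
// Reject non-numeric arguments instead of letting coercion guess.
function addNumbers(a, b) {
  if (typeof a !== "number" || typeof b !== "number") {
    throw new TypeError("addNumbers expects two numbers");
  }
  return a + b;
}

console.log(addNumbers(3, 3)); // 6
// addNumbers('3', '3') would throw a TypeError instead of returning "33".
```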

TypeScript is a noble attempt to tame JavaScript by introducing static typing. It also adds a bunch of other helpful syntax and vastly improves the language.

It even compiles down to regular JavaScript so that it can be used wherever JavaScript runs. Check out the official TypeScript documentation and getting-started guide to learn more.

Author
I'm Brad, and I'm nearing 20 years of experience with Linux. I've worked in just about every IT role there is before taking the leap into software development. Currently, I'm building desktop and web-based solutions with NodeJS and PHP hosted on Linux infrastructure. Visit my blog or find me on Twitter to see what I'm up to.
