How Airport Codes Work

DALLAS — Have you ever wondered why the tag on your checked bag carries three letters that match the airport you are departing from? These are International Air Transport Association (IATA) airport codes, and they have been in use since the early days of commercial aviation. It is hard to imagine the commercial aviation we know today working without them.

The International Air Transport Association (IATA) is a global trade organization that represents more than 300 airlines and supports airport operations in 194 countries. The association is responsible for assigning and managing the three-letter codes you see printed on luggage tags and flight tickets.

The practice of assigning codes to airports began in the 1930s, with each location identified by two letters. However, as more airports were built and the aviation industry continued to develop, this method became unmanageable.

In the 1960s, IATA introduced the three-letter code that we know today, which was established by IATA Resolution 763. This resolution set out a series of requirements that must be met by the entities requesting the codes for their airports.

Assigning airport codes can sometimes be a difficult task, as, with few exceptions, a code cannot be connected to more than one airport. Photo: Adrian Nowakowski/Airways

Assigning Airport Codes


Today, there is no specific process for assigning and approving airport codes, other than the requirement that each airport receive a unique code that is not used anywhere else.

Generally, it is the airport itself that submits a list of proposed codes to IATA, which approves them based on availability. The most common way to assign codes is to take the first three letters of the city or airport, such as MIA (Miami), MAD (Madrid), or FRA (Frankfurt). However, in some cases, such as Santiago de Chile, the desired code was already taken by San Diego International Airport (SAN), so the Chilean airport had to choose a different code (SCL).

Another method of assigning airport codes is to combine the first letters of the full airport name, such as JFK (New York—John F. Kennedy), CDG (Paris—Charles de Gaulle), or DFW (Dallas—Fort Worth). In the case of Toronto Pearson International Airport, the assigned IATA code is YYZ, which does not contain any letters from either the airport or the city.

Canada’s third-largest airport, in Calgary, was assigned “YYC” as its IATA code for commercial operations, even though the code bears little resemblance to the city’s name. Photo: Daniel0685 (CC BY-SA 2.0)

Canadian Airport Codes


All of Canada’s commercial airports have IATA codes that start with the letter “Y.” These codes were carried over from the identifiers used for weather stations during the country’s earliest aviation period, in the 1930s.

A code starting with “Y” meant that the weather station was located at an airport, while a code starting with “W” meant there was no airport nearby. Other codes started with “U” if the station was co-located with an NDB navaid, or “Z” if the station was located in the United States.

This means that some Canadian airport codes give clues about the airport’s location, such as YVR for Vancouver, while others, such as YQX for Gander International Airport, are not as readily identifiable. Toronto’s YYZ falls into the latter group.

With six commercial airports serving passengers bound for London, the city has been assigned the general IATA code “LON”, which covers traffic to the whole metropolitan area. Diagram: Adrian Nowakowski/Airways.

Cities With More Than One Airport


When traffic and passenger flow in a large city is high enough that it needs two or more airports operating at once, the city can request that IATA assign a city-wide code encompassing all the airports that serve it.

This allows passengers to search for flights to the city as a whole, instead of needing to search for each individual airport.

This practice is common among major cities worldwide, with examples including London (LON), New York (NYC), Stockholm (STO), and Rio de Janeiro (RIO). However, if a passenger is looking for flights to Montreal, they may need to guess that the code “YMQ” stands for the city.
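
To picture how this works in practice, a flight-search system might simply expand a metropolitan code into the individual airports it covers before looking up schedules. The Python sketch below uses a hypothetical expand_city_code helper and an abbreviated mapping for illustration only; it is not IATA's official data.

    # Minimal sketch: expanding a metropolitan IATA code into the individual
    # airports it covers. The mapping is abbreviated and purely illustrative.
    METRO_CODES = {
        "LON": ["LHR", "LGW", "STN", "LTN", "LCY", "SEN"],  # London
        "NYC": ["JFK", "LGA", "EWR"],                        # New York
        "YMQ": ["YUL", "YMX"],                               # Montreal
    }

    def expand_city_code(code: str) -> list[str]:
        """Return the member airports of a metropolitan code, or the code
        itself if it already names a single airport."""
        return METRO_CODES.get(code, [code])

    print(expand_city_code("LON"))  # ['LHR', 'LGW', 'STN', 'LTN', 'LCY', 'SEN']
    print(expand_city_code("MAD"))  # ['MAD']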

Recognized by almost all UN member states, ICAO has been responsible for how aviation functions and evolves since 1944. Photo: Kurt Raschke (CC BY-SA 2.0).

ICAO Codes


In addition to IATA, the International Civil Aviation Organization (ICAO) has developed its own airport coding system, which includes not only commercial airports but also all other airstrips around the world. Every small aerodrome, air base, commercial airport, and even airstrip in Antarctica has its own unique ICAO code.

Unlike IATA codes, ICAO codes are composed of four letters instead of three, which increases the range of possibilities to 456,976 different ICAO codes.

The limited number of combinations is a growing issue for IATA: more than 10,000 airport codes have been registered since the 1960s, and three letters allow only 17,576 combinations in total. Because of this scarcity, airports now have less influence over the final code they are assigned, and codes are generally granted according to availability.
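
Those figures follow directly from the code length: with 26 letters available for each position, a three-letter code allows 26 × 26 × 26 combinations and a four-letter code 26 to the fourth power. A two-line check in Python confirms the numbers quoted above:

    # Code space = 26 raised to the number of letters in the code.
    print(26 ** 3)  # 17576  -> possible three-letter IATA codes
    print(26 ** 4)  # 456976 -> possible four-letter ICAO codes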

The main difference between IATA and ICAO codes is that ICAO codes encode information about where in the world a particular aerodrome is located. This is accomplished through classification by world region, with the first two letters of a code reserved to indicate the country to which it belongs.

To keep aerodromes and airspace properly organized, ICAO established a series of world regions, each with an identifying initial letter that appears in every ICAO code. Photo: Hytar/Wikimedia Commons.

ICAO World Regions


In the 1940s, ICAO assigned 22 different initial letters to the world’s regions, distinguishing them geographically and simplifying the process of assigning codes to countries. In a four-letter ICAO code, the first letter always denotes the world region in which the country is located.

Examples include “E” for Northern Europe, “S” for South America, or “U” for former Soviet Union countries like Belarus, Kazakhstan, and Ukraine. The second letter then refers to the specific country within the region.

For instance, “EG” stands for the United Kingdom, “SB” for Brazil, “UK” for Ukraine, and so on. Putting it together, “LFPO” refers to Paris Orly Airport: “L” indicates the ICAO world region of Southern Europe, “F” denotes France, and “PO” stands for Paris Orly.
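
One way to picture this structure is to read the prefix of an ICAO code from left to right: the first letter gives the region, the first two letters together give the country, and the remaining letters identify the aerodrome. The Python sketch below uses a deliberately tiny, hypothetical lookup table limited to the examples mentioned in this article.

    # Minimal sketch: decoding region and country from an ICAO code prefix.
    # The tables only cover the handful of examples discussed in the text.
    REGIONS = {"E": "Northern Europe", "L": "Southern Europe",
               "S": "South America", "U": "former Soviet Union"}
    COUNTRIES = {"EG": "United Kingdom", "LF": "France",
                 "SB": "Brazil", "UK": "Ukraine"}

    def describe_icao(code: str) -> str:
        region = REGIONS.get(code[0], "unknown region")
        country = COUNTRIES.get(code[:2], "unknown country")
        return f"{code}: {country} ({region}), local identifier {code[2:]}"

    print(describe_icao("LFPO"))  # LFPO: France (Southern Europe), local identifier PO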

Things become complicated when looking at larger countries like Canada, Russia, and the United States. This is because, as two of the four letters in an ICAO code are fixed to the country, only 676 combinations remain for the specific airport code. As the US currently has more than 5,000 civil aerodromes, a new problem arises when registering all the ICAO codes for these airports.

To solve this, the ICAO granted the United States a special status by allowing the country to have a one-letter country code (“K”) followed by its three-letter IATA code, instead of the usual two-letter code. This opened up the possibility of assigning up to 17,576 different ICAO codes for all the different aerodromes in the United States.
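
In practice, this means the ICAO code of many airports in the contiguous United States can be derived by simply prefixing the IATA code with “K” (LAX becomes KLAX, JFK becomes KJFK). The shortcut is only a rule of thumb, though: Alaska, Hawaii, and US territories fall under other prefixes, so the sketch below should be read as an illustration of the convention rather than a reliable converter.

    # Sketch of the US convention: ICAO code = "K" + IATA code.
    # Holds for most contiguous-US airports, but not for Alaska, Hawaii, or
    # US territories, so real systems rely on lookup tables instead.
    def us_icao_from_iata(iata: str) -> str:
        return "K" + iata.upper()

    print(us_icao_from_iata("LAX"))  # KLAX
    print(us_icao_from_iata("DFW"))  # KDFW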

Canada faces a similar issue, with the letter “C” assigned as the national initial for every aerodrome inside Canadian territory. Combined with the country’s unusual “Y” coding system, this gives Canadian airports some of the most distinctive codes in the world.

LAX, the IATA code for Los Angeles International Airport, has become a worldwide marketing symbol for the US West Coast city. Photo: Andre M./Wikimedia Commons

When to Use IATA or ICAO Codes


With two global organizations maintaining their own airport coding systems, the industry puts IATA and ICAO codes to different uses. Each has its own purpose.

ICAO codes are typically reserved for the internal and technical operations of airlines and airports, as they include not only commercial airports but also every airstrip and military air base in the world. They are used by air navigation service providers, air traffic controllers, pilots, and airlines to identify airports on flight plans, communications, and classified documents, among other things.

Meanwhile, IATA codes are mostly used in situations involving the general public and passengers. As they tend to be similar to the airport name or city, IATA codes are easily recognizable by passengers and don’t require any special aviation knowledge. For this reason, airlines and airports use these three-letter codes in their commercial operations, route networks, and marketing campaigns, as well as for luggage management and sorting.

By using a three-letter code instead of a full airport name, the chances of luggage transfer confusion are greatly reduced, ensuring that bags arrive at their correct destination more often.

Photo: London Stansted Airport Media Centre

Interesting Airport Codes Worldwide


With so many airports around the globe, assigning airport codes can sometimes cause bizarre situations where the three- or four-letter codes result in amusing names that are not related to the airport at all.

For instance, the IATA code for Omega Airport in Namibia is OMG; flying from Funafuti International Airport in Tuvalu to Derby Field in the US is like going from FUN to LOL; and someone who is a big animal fan would be lucky to be taking a flight from Cascais Municipal Airport in Portugal to Dongola Airport in Sudan, which is CAT to DOG.

These are just a few examples of how entertaining it can be to look into airport codes worldwide.


Featured image: Piviso (CC-0)

ANWAviation
Commercial aviation enthusiast from Madrid, Spain. Studying for a degree in Air Traffic Management and Operations at the Technical University of Madrid. Aviation photographer since 2018.
