- Getting Started
- Project Structure
What is Infrastructure as Code?
Infrastructure as Code is an approach for defining your infrastructure via code (in definition files written in a high-level language or DSL) that can be kept under source control. It enables a faster and more reliable provisioning process for your infrastructure and eliminates human error.
Why do we need it?
There are lots of benefits to think of when we talk about Infrastructure as Code:
- Reduced risk of human error
- Configurations kept as code under source control
A vast number of tools like HashiCorp Terraform, Puppet, Ansible, and the cloud providers' REST APIs make everything easy to accomplish. In this blog post series I will mainly use HashiCorp Terraform and Ansible for provisioning and managing an infrastructure in DigitalOcean.
Getting started with Terraform
Terraform is a tool for managing infrastructure. Configurations are written in the HashiCorp Configuration Language (HCL), which is human-readable, self-documenting and machine-friendly. Terraform also supports JSON configurations, but the recommended approach is to use HCL. Once a change is introduced in a definition file, Terraform is able to determine what should be changed, and how, in order to match the desired state.
Helpful information on how to install and configure Terraform can be found in the “Install Terraform” documentation article.
Terraform uses code files, called Terraform configurations (with a .tf extension), to describe the infrastructure and define the needed variables. Two data formats are supported: HCL and JSON.
The demo project has the following structure:
- templates - contains the templates data to generate the user data files, used for DO droplets bootstrapping
- droplets.tf - DO droplets configurations, including DO volumes
- firewalls.tf - firewalls configurations for our infrastructure
- outputs.tf - definition of the output values
- provider.tf - the DO provider configuration
- tags.tf - tags definitions for DO resources
- terraform.tfvars - a file with variable values, used to populate the configuration data (fill the variables with values); it is loaded automatically if no tfvars file is specified explicitly
- variables.tf - variables definitions used by all DO resources
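As an illustration, an output definition in outputs.tf might be sketched like this (the join() expression over the droplets' IP addresses is an assumption on my part, not necessarily the demo project's exact code):

```hcl
# Exposes the web droplets' IPv4 addresses as a comma-separated list.
output "web_server_ip_addresses" {
  value = "${join(",", digitalocean_droplet.web.*.ipv4_address)}"
}
```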
Since we will need to provision and set up different environments in different data centers - a dynamic setup - we will need a means to parameterize our infrastructure code. The way to do it in Terraform is to use input variables, which serve as parameters for Terraform modules.
Each variable block declares a single variable and has three arguments:
- type (optional) - defines the variable’s type, supported types: string, list and map (boolean values are supported as well, but it is recommended to use them as strings)
- description (optional)
- default (optional) - default value for the variable
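As a sketch, a variable declaration using these three arguments might look like the following (the web_server_image name and its default value are illustrative):

```hcl
# Declares a string variable with a description and a default value.
variable "web_server_image" {
  type        = "string"
  description = "Image slug used for the web server droplets"
  default     = "ubuntu-18-x64"
}
```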
There are several different approaches for setting values for variables:
- Environment variables - the value of TF_VAR_<variable_name> will be used as the value of the variable_name variable.
TF_VAR_web_server_image=ubuntu-18-x64 terraform apply
- Variable files - the terraform.tfvars file and any *.auto.tfvars files are loaded automatically to fill the variable values, if no explicit variable files are specified via the -var-file flag
terraform apply -var-file=data.tfvars
- Command line parameters - values can be passed on the command line using the -var flag
terraform apply -var web_server_image=centos
Once we have the variables in place, we should consume them in our configuration files. Accessing a variable's value in Terraform is done via interpolation, by prefixing the variable name with var.
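For example, assuming the web_server_image variable declared earlier, a resource attribute could consume it like this:

```hcl
# The var. prefix interpolates the variable's value into the string.
image = "${var.web_server_image}"
```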
Further information about input variables can be found in the following articles: “Input Variable Configuration” and “Input Variables”
HashiCorp Terraform provides DigitalOcean support via the DigitalOcean provider. To be able to interact with the DigitalOcean API (which is exactly what the Terraform provider for DO does under the hood), you should obtain an OAuth token with full access to your account. This can be done from the Apps & API section of the DO panel.
Now that you have an OAuth token generated, you should pass it to the DigitalOcean provider.
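A minimal provider configuration could look like the following sketch, assuming a digitalocean_token input variable is declared in variables.tf:

```hcl
# Passes the OAuth token to the DigitalOcean provider.
provider "digitalocean" {
  token = "${var.digitalocean_token}"
}
```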
Since this is sensitive information, you should avoid storing the DO token in tfvars files. You could either pass it via a command line argument or expose it as an environment variable.
The first Terraform command to run for a new configuration, or after checking out an existing one, is:

```
terraform init
```

This command will download all the needed plugins, based on the providers used in the project.
```
Initializing provider plugins...
- Checking for available provider plugins on https://releases.hashicorp.com...
- Downloading plugin for provider "digitalocean" (0.1.3)...
- Downloading plugin for provider "template" (1.0.0)...

The following providers do not have any version constraints in
configuration, so the latest version was installed.

To prevent automatic upgrades to new major versions that may contain
breaking changes, it is recommended to add version = "..." constraints to
the corresponding provider blocks in configuration, with the constraint
strings suggested below.

* provider.digitalocean: version = "~> 0.1"
* provider.template: version = "~> 1.0"

Terraform has been successfully initialized!

You may now begin working with Terraform. Try running "terraform plan" to see
any changes that are required for your infrastructure. All Terraform commands
should now work.

If you ever set or change modules or backend configuration for Terraform,
rerun this command to reinitialize your working directory. If you forget, other
commands will detect it and remind you to do so if necessary.
```
Should you add a new provider to your Terraform project, you need to run the init command again, in order to fetch its plugin's binaries.
Resources and Dependencies
Terraform is used for configuring infrastructure components: virtual machines, firewalls, DNS records, etc. Using Terraform's definition language, we can describe such a component via a resource block.
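A droplet resource, for instance, might be sketched like this (the attribute values are illustrative, not necessarily the demo project's exact configuration):

```hcl
# Describes a single DigitalOcean droplet.
resource "digitalocean_droplet" "web" {
  name       = "web-1"
  image      = "${var.web_server_image}"
  region     = "fra1"
  size       = "s-1vcpu-1gb"
  monitoring = true
}
```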
The first parameters of the resource block are used to specify the type of the resource “digitalocean_droplet” and its name / identifier “web”.
Each resource block of a specific type (e.g. digitalocean_droplet) has its own type-specific arguments, like name, image, monitoring, etc.
There are several general meta-parameters:
- count - the number of resources of the specified type to be created
- depends_on - explicit dependency(ies) on other resources that should be created before this resource
We could use a timeouts block, nested in the resource block, to allow more time for the resource to be created, if it needs it.
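A sketch of such a block is shown below (the duration is illustrative, and not every resource type supports the timeouts block):

```hcl
resource "digitalocean_volume" "storage" {
  region = "fra1"
  name   = "storage"
  size   = 100

  # Wait up to 10 minutes for the volume to be created.
  timeouts {
    create = "10m"
  }
}
```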
Sometimes we will need to introduce dependencies between our resources. For instance, if we want to attach a persistent volume to a droplet in DigitalOcean, we should have this volume created beforehand. The syntax for referring to a resource in a dependency is <resource_type>.<resource_identifier>. In the example below, we have an explicit dependency between the node and bastion droplets. In this scenario, before creating the node droplet, Terraform will create the bastion droplet.
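Such an explicit dependency might be sketched as follows (the droplet attributes are illustrative):

```hcl
resource "digitalocean_droplet" "bastion" {
  name   = "bastion"
  image  = "ubuntu-18-x64"
  region = "fra1"
  size   = "s-1vcpu-1gb"
}

resource "digitalocean_droplet" "node" {
  name   = "node-1"
  image  = "ubuntu-18-x64"
  region = "fra1"
  size   = "s-1vcpu-1gb"

  # Terraform will create the bastion droplet before this one.
  depends_on = ["digitalocean_droplet.bastion"]
}
```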
Implicit dependencies: resource A automatically depends on resource B if we reference resource B via interpolation in resource A's block:
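For instance, referencing a volume's id from a droplet block creates such an implicit dependency (a sketch; the attribute values are illustrative):

```hcl
resource "digitalocean_volume" "storage" {
  region = "fra1"
  name   = "storage"
  size   = 100
}

resource "digitalocean_droplet" "node" {
  name       = "node-1"
  image      = "ubuntu-18-x64"
  region     = "fra1"
  size       = "s-1vcpu-1gb"

  # Interpolating the volume's id makes the droplet implicitly
  # depend on the volume, so the volume is created first.
  volume_ids = ["${digitalocean_volume.storage.id}"]
}
```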
Further information about the Terraform resources can be found in this article “Build Infrastructure”.
Terraform allows users to render templates from a file or using inlined template format via the data source template_file.
- template - the content of the template; it could be loaded from a file using the file() function
- vars - variables for the interpolation
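A sketch of rendering a user data template from a file (the file name and the server_name variable are illustrative):

```hcl
data "template_file" "user_data" {
  # Load the template content from a file.
  template = "${file("templates/user_data.tpl")}"

  # Variables available for interpolation inside the template.
  vars {
    server_name = "web-1"
  }
}
```

The rendered content is then available via `${data.template_file.user_data.rendered}`.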
Further information can be found here.
We are ready to apply the configuration:
```
terraform apply -var digitalocean_token=<YOUR_DIGITAL_OCEAN_TOKEN>
...

Apply complete! Resources: 4 added, 0 changed, 0 destroyed.

Outputs:

web_server_ip_addresses = <web-1_ip_address>,<web-2_ip_address>
```
You could try to access the default Apache site on each one of the web nodes.
The complete source code, used in the blog post, can be found here.
In the next blog post I will describe how to structure your project.