Build workflows-as-code automations for AWS services using Flowpipe

Author: Gabriel Costa
Published on: 2024-09-03 16:24:44
Source: Integration & Automation

DevOps practitioners are always looking for better ways to easily build workflows that integrate AWS services with other services and APIs. Since we’ve all embraced infrastructure as code (IaC), why not extend that concept to workflow automation? That’s the philosophy of Flowpipe, a new open-source tool from Turbot. It offers the same components that you’ll find in typical workflow tools, including pipelines, steps, triggers, and control flow. And it integrates with everything you’d expect from this type of tool.

But Flowpipe isn’t ClickOps; you don’t draw diagrams. Pipeline definitions use HashiCorp Configuration Language (HCL) for code artifacts that live in repositories as first-class citizens of the modern software ecosystem—version-controlled and collaborative. You run pipelines using a single binary that you deploy locally, in the cloud, or in any CI/CD pipeline.

In this blog post, we introduce you to Flowpipe and provide a set of examples for automating cloud operations on AWS.

Prerequisites

To get started with Flowpipe for your AWS environment, follow these steps:

  1. Install Flowpipe.
  2. Configure the AWS CLI in your local environment.
  3. Create or locate IAM credentials (for example, in the IAM console), and configure the following environment variables so that Flowpipe can authenticate to AWS:
    • AWS_PROFILE
    • AWS_ACCESS_KEY_ID
    • AWS_SECRET_ACCESS_KEY
  4. Optionally, define credentials in Flowpipe configuration files, as shown in the sketch after this list.
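
If you take the optional step of defining credentials in Flowpipe configuration files, a credential block in a .fpc file is one way to do it. The following is a minimal sketch under the assumption that your Flowpipe version uses the credential block syntax; the file path, block names, and values are placeholders, so check the Flowpipe documentation for the authoritative syntax.

# Assumed file: ~/.flowpipe/config/aws.fpc (names and values are illustrative)
credential "aws" "aws_profile_based" {
  profile = "YOUR-AWS-PROFILE"
}

# Alternatively, reference static keys (prefer profiles or environment variables where possible)
credential "aws" "aws_static" {
  access_key = "YOUR-ACCESS-KEY-ID"
  secret_key = "YOUR-SECRET-ACCESS-KEY"
}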

Examples

The following examples show you how to use Flowpipe to perform common cloud operations: creating a basic Flowpipe pipeline, building an Amazon VPC scenario, and building an Amazon S3 scenario.

Create a basic pipeline using a geolocation IP service

This example is a two-step pipeline that asks api.ipify.org for your public IP address and then calls the Really Free GEO IP Flowpipe library to geolocate the address.

Usage

To initialize a mod:

mkdir my_mod
cd my_mod
flowpipe mod init

To install the Really Free GEO IP mod as a dependency:

flowpipe mod install github.com/turbot/flowpipe-mod-reallyfreegeoip

To use the dependency in a pipeline step:

To create a pipeline that geolocates a public IP address, paste the following code into a .fp file (for example, geolocate.fp) in your mod directory:

pipeline "geolocate" {
  step "http" "get_ipv4" {
    url = "https://api.ipify.org?format=json"
  }

  step "pipeline" "get_geo" {
    pipeline = reallyfreegeoip.pipeline.get_ip_geolocation
    args = {
      ip_address = step.http.get_ipv4.response_body.ip
    }
  }

  output "ip_address" {
    value = step.http.get_ipv4.response_body.ip
  }

  output "latitude" {
    value = step.pipeline.get_geo.output.geolocation.latitude
  }

  output "longitude" {
    value = step.pipeline.get_geo.output.geolocation.longitude
  }
}

Pipeline steps run concurrently unless one depends on another. In this example, the get_geo step references the output of the get_ipv4 step, so it waits for that step to finish before starting.

This pipeline example uses the following Flowpipe step types.

http

The http step makes an HTTP request, allowing Flowpipe to interact with external systems. Pipelines can also run steps powered by containerized CLI commands and AWS Lambda-compatible functions.
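
For comparison, the following sketch shows a container step that runs an AWS CLI command in a Docker image. The image, command, and step name are illustrative, and attribute support can vary by Flowpipe version, so treat this as a sketch rather than a tested pipeline.

# Hypothetical container step: runs "aws s3api list-buckets" inside the AWS CLI image
# (AWS credentials would need to be supplied, for example through env)
step "container" "list_buckets" {
  image = "public.ecr.aws/aws-cli/aws-cli"
  cmd   = ["s3api", "list-buckets"]

  env = {
    AWS_REGION = "YOUR-AWS-REGION"
  }
}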

input

Using the input step, you can build workflows that ask for and respond to human input. Flowpipe supports these interactions through Slack, Microsoft Teams, or email; you configure notifiers and integrations to specify the communication channel for an approval step.

Note: If you don’t require human input and only need to notify a channel, use the message step with the same configuration.
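
Here is a minimal sketch of an approval-style input step, assuming a default notifier is already configured; the step name, prompt, and options are illustrative.

# Hypothetical approval step: pauses the pipeline until someone selects an option
step "input" "approve_change" {
  notifier = notifier.default
  type     = "button"
  prompt   = "Apply this change?"

  option "Approve" {}
  option "Deny" {}
}

# Downstream steps can branch on the selected value, for example:
# step.input.approve_change.value == "Approve"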

pipeline

The pipeline step, Flowpipe’s basic unit of composition, runs another pipeline. Flowpipe libraries are made of interacting pipelines. In our example, the get_geo step runs the reallyfreegeoip.pipeline.get_ip_geolocation pipeline.

query

The query step runs a SQL query either immediately or on a schedule. It works with Steampipe (a Postgres-based tool that queries cloud APIs) or with your own Postgres, MySQL, SQLite, or DuckDB database. When using Flowpipe to query Steampipe, you can take advantage of over 140 featured plugins that make APIs available to SQL queries. Steampipe’s AWS plugin is especially rich, offering hundreds of tables that cover a vast number of AWS APIs.
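
For example, a query step that reads from a local Steampipe instance with the AWS plugin might look like the following sketch. The connection string assumes Steampipe’s default local service, and the step name is illustrative.

# Hypothetical query step: lists stopped EC2 instances via the Steampipe aws_ec2_instance table
step "query" "stopped_instances" {
  database = "postgres://steampipe@localhost:9193/steampipe"
  sql      = "select instance_id, region from aws_ec2_instance where instance_state = 'stopped'"
}

# The results are available to later steps as step.query.stopped_instances.rows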

Create an Amazon VPC scenario

This example creates a virtual private cloud with Amazon VPC, creates an Amazon CloudWatch log group, and enables delivery of VPC Flow Logs to that log group. By referencing the output of the create_vpc step in the second and third steps, you automate the creation of the CloudWatch log group and enable the VPC flow log for the new VPC.

Usage

To initialize a mod:

mkdir create_vpc
cd create_vpc
flowpipe mod init

To install the AWS mod as a dependency:

flowpipe mod install github.com/turbot/flowpipe-mod-aws

To use the dependency in a pipeline step:

Paste the following code into the create_vpc.fp file:

pipeline "vpc" {
  step "pipeline" "create_vpc" {
    pipeline = aws.pipeline.create_vpc  
    args = {
      region = "YOUR-AWS-REGION"
      cidr_block = "YOUR-IPV4-NETWORK-RANGE"
    }
  }

  step "pipeline" "create_cw_log" {
    pipeline = aws.pipeline.create_cloudwatch_log_group  
    args = {
      region = "YOUR-AWS-REGION"
      log_group_name = step.pipeline.create_vpc.output.vpc.VpcId
    }
  }

  step "pipeline" "create_vpc_flow_log" {
    pipeline = aws.pipeline.create_vpc_flow_logs  
    args = {
      region = "YOUR-AWS-REGION"
      vpc_id = step.pipeline.create_vpc.output.vpc.VpcId
      log_group_name = step.pipeline.create_vpc.output.vpc.VpcId
      iam_role_arn = "YOUR-IAM-ARN-ROLE-WITH-PERMISSION-TO-CREATE-VPC-FLOW-LOGS"
    }
  }

  output "vpc" {
    value = step.pipeline.create_vpc.output
  }

  output "cw_log" {
    value = step.pipeline.create_cw_log.output
  }

  output "vpc_flow_log" {
    value = step.pipeline.create_vpc_flow_log.output
  }
}

To run the pipeline, run this command: flowpipe pipeline run vpc

Aligning with security best practices for VPCs, this pipeline creates a VPC and enables VPC Flow Logs.

Create an Amazon S3 bucket scenario

Initialize a mod for creating an Amazon S3 bucket and add the AWS mod dependency:

mkdir create_s3
cd create_s3
flowpipe mod init
flowpipe mod install github.com/turbot/flowpipe-mod-aws 

To use the dependency in a pipeline step:

Paste the following code into the create_s3.fp file to create an Amazon S3 bucket and enable bucket versioning and encryption with a customer managed AWS KMS key. This pipeline aligns with security best practices for Amazon S3.

pipeline "s3" {
  step "pipeline" "create_s3" {
    pipeline = aws.pipeline.create_s3_bucket 
    args = {
      region = "YOUR-AWS-REGION"
      bucket = "YOUR-BUCKET-NAME"
    }
  }

  step "pipeline" "s3_versioning" {
    pipeline = aws.pipeline.put_s3_bucket_versioning 
    args = {
      region = "YOUR-AWS-REGION"
      bucket = "YOUR-BUCKET-NAME"
    }
    depends_on = [step.pipeline.create_s3]
  }

  step "pipeline" "s3_encryption" {
  pipeline = aws.pipeline.put_s3_bucket_encryption  
  args = {
    region = "YOUR-AWS-REGION"
    bucket = "YOUR-BUCKET-NAME"
    sse_algorithm = "aws:kms"
    kms_master_key_id = "YOUR-KMS-KEY-ID"
    bucket_key_enabled = true
  }
  depends_on  = [step.pipeline.create_s3]
  }
}

To run the pipeline, run this command: flowpipe pipeline run s3

‘Detect and correct’ libraries

Flowpipe ‘detect and correct’ libraries, such as the AWS Thrifty mod for Flowpipe, can detect problems such as unattached, cost-incurring Elastic IP addresses and correct them automatically or with human approval.

The following Flowpipe command runs the unattached Elastic IP detection pipeline from the AWS Thrifty mod for Flowpipe:

flowpipe pipeline run detect_and_correct_vpc_eips_unattached \
  --arg 'approvers=["slack"]' \
  --host local

You can adapt this command to work with any supported communication channel (Slack, Microsoft Teams, or email), run it interactively or on a schedule, and either require approval or let it correct issues automatically.
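
To run the detection on a schedule rather than invoking it by hand, one option is to wrap the pipeline in a schedule trigger inside your own mod. The following is a sketch; the trigger name is illustrative, and the exact pipeline reference depends on the namespace under which you installed the AWS Thrifty mod.

# Hypothetical schedule trigger: runs the detection pipeline once a day
trigger "schedule" "daily_unattached_eip_check" {
  schedule = "daily"
  pipeline = pipeline.detect_and_correct_vpc_eips_unattached

  args = {
    approvers = ["slack"]
  }
}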

For details about integrating with Slack and other communication channels, including authentication and security information, see Slack integration in the Flowpipe documentation.
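
As a rough sketch, a Slack integration and a notifier that approval steps can reference might be defined in a .fpc configuration file as follows. The token, channel, and block names are placeholders; see the linked documentation for the authoritative syntax and authentication options.

# Hypothetical Slack integration and notifier configuration
integration "slack" "my_slack" {
  token = "YOUR-SLACK-BOT-TOKEN"
}

notifier "approvers" {
  notify {
    integration = integration.slack.my_slack
    channel     = "#cloud-ops"
  }
}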

In the following image, the pipeline detects one unattached Elastic IP address and waits for a decision (Release or Skip) in a Slack channel. If you choose Release, the pipeline calls a utility pipeline in the AWS library to release the Elastic IP address, and a Slack message confirms the action.

Image showing how the pipeline detects one unattached Elastic IP address and waits for a decision in a Slack channel

If more than one unattached Elastic IP address is present, the Slack interactions occur sequentially, one address at a time. To batch them, set the max_concurrency variable to your desired batch size.
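
One way to set that variable, assuming your Flowpipe version supports the --var flag (you can also set mod variables in a flowpipe.fpvars file in the mod directory), is shown below with an illustrative batch size:

# Assumed flag; the batch size of 5 is illustrative
flowpipe pipeline run detect_and_correct_vpc_eips_unattached \
  --arg 'approvers=["slack"]' \
  --var max_concurrency=5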

Conclusion

With its ability to run locally, in the cloud, or within CI/CD pipelines, Flowpipe streamlines DevOps processes and promotes infrastructure-as-code principles into your AWS Cloud workflows. We hope our examples have given you a taste of this powerful tool and its many benefits.

To learn more about Flowpipe, we invite you to join our community of builders. Just sign up for the Turbot community and install Flowpipe today. Also check out the AWS library in the Flowpipe Hub, where you’ll find several prebuilt pipelines for common scenarios, including creating a VPC, configuring Amazon S3 bucket encryption and versioning, launching and stopping Amazon EC2 instances, and tagging resources.

If you have feedback about this blog post, use the Comments section on this page.

About the authors

Gabriel Costa is a senior partner solutions architect at AWS, working with AWS Partners and customers on all things cloud operations. Outside of work, he enjoys playing the guitar, reading about philosophy, watching sci-fi and anime, and searching with his wife for the new cool restaurant in town.

Jon Udell is the community lead for Turbot’s open source products Steampipe, Powerpipe, and Flowpipe. He’s known as both a developer and tech journalist who explores and explains many kinds of software, and many ways of developing it. He has worked for Lotus Development, BYTE, InfoWorld, O’Reilly Media, and Microsoft.

