
Developer Advocate for Google Cloud Platform, Apache Groovy programming language project VP/Chair

In the two previous episodes, we saw how to create and call subworkflows, and we applied this technique to build a reusable routine for logging with Cloud Logging. However, there's already a built-in function for that purpose! So let's have a look at this integration.

To call the built-in logging function, just create a new step with a call to the sys.log function:
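A minimal step calling sys.log might look like this (the step name logMessage and the message text are illustrative):

```yaml
- logMessage:
    call: sys.log
    args:
        text: "Hello from my workflow!"
        severity: INFO
```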

This function takes a mandatory parameter, text, and an optional one, severity.

The text parameter accepts all types of supported values…

Workflows are made of sequences of steps and branches. Sometimes, a particular sequence of steps is repeated, and it would be a good idea to avoid error-prone repetitions in your workflow definition (in particular if you change something in one place and forget to change it in another). You can modularize your definition by creating subworkflows, a bit like subroutines or functions in programming languages. For example, yesterday, we had a look at how to log to Cloud Logging: if you want to log in several places in your workflow, you can extract that routine into a subworkflow.
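As a sketch of the idea, here's a hypothetical workflow where a log subworkflow wraps the built-in logging call behind a single parameter, so the main workflow can log from several steps without repetition (all names are illustrative):

```yaml
main:
    steps:
        - firstLog:
            call: log
            args:
                message: "Starting the workflow"
        - secondLog:
            call: log
            args:
                message: "Workflow finished"

log:
    params: [message]
    steps:
        - doLog:
            call: sys.log
            args:
                text: ${message}
                severity: INFO
```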


Time to come back to our series on Cloud Workflows. Sometimes, for debugging or auditing purposes, it is useful to log some information via Cloud Logging. As we saw last month, you can call HTTP endpoints from your workflow. We can actually use Cloud Logging's REST API to log such messages! Let's see that in action.

We call the API endpoint to write new logging…
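A call to the entries:write endpoint of the Cloud Logging REST API could be sketched like this, using the workflow's OAuth2 authentication; the log name suffix and the payload text are illustrative:

```yaml
- logToCloudLogging:
    call: http.post
    args:
        url: https://logging.googleapis.com/v2/entries:write
        auth:
            type: OAuth2
        body:
            entries:
                - logName: ${"projects/" + sys.get_env("GOOGLE_CLOUD_PROJECT_ID") + "/logs/workflow"}
                  resource:
                      type: global
                  textPayload: "Hello from my workflow"
```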

In previous episodes of this Cloud Workflows series, we've learned about variable assignment, data structures like arrays and dictionaries, switch conditions to move between steps, and expressions to do some computations, including potentially some built-in functions.

With all these previous learnings, we are now equipped with all the tools to create loops and iterations, for example iterating over the elements of an array, perhaps to call an API several times with different arguments. So let's see how to create such an iteration!

First of all, let’s prepare some variable assignments:
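Assuming we want to iterate over a list and accumulate a result, the initial assignments and the loop could be sketched as follows, using a counter, a switch condition, and next jumps (all step and variable names are illustrative):

```yaml
- init:
    assign:
        - items: ["one", "two", "three"]
        - i: 0
        - result: ""
- checkCondition:
    switch:
        - condition: ${i < len(items)}
          next: iterate
    next: returnResult
- iterate:
    assign:
        - result: ${result + items[i] + " "}
        - i: ${i + 1}
    next: checkCondition
- returnResult:
    return: ${result}
```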

Workflows are not necessarily instantaneous, and executions can span a long period of time. Some steps may launch asynchronous operations, which might take seconds or minutes to finish, but you are not notified when the process is over. So when you want to wait for something to finish, for example before polling again to check the status of the async operation, you can introduce a sleep operation in your workflow.

To introduce a sleep operation, add a step to the workflow with a call to the built-in sleep function:
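For instance, pausing for one minute between two polling attempts might look like this (the step name is arbitrary):

```yaml
- waitOneMinute:
    call: sys.sleep
    args:
        seconds: 60
```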

Google Cloud Workflows offers a few built-in environment variables that are accessible from your workflow executions.

There are currently 5 environment variables that are defined:

  • GOOGLE_CLOUD_PROJECT_NUMBER: The workflow project’s number.
  • GOOGLE_CLOUD_PROJECT_ID: The workflow project’s identifier.
  • GOOGLE_CLOUD_LOCATION: The workflow’s location.
  • GOOGLE_CLOUD_WORKFLOW_ID: The workflow’s identifier.
  • GOOGLE_CLOUD_WORKFLOW_REVISION_ID: The workflow’s revision identifier.

Let’s see how to access them from our workflow definition:
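A sketch reading each variable with sys.get_env() and returning some of them (the step and variable names are illustrative):

```yaml
- getEnvVars:
    assign:
        - projectId: ${sys.get_env("GOOGLE_CLOUD_PROJECT_ID")}
        - projectNumber: ${sys.get_env("GOOGLE_CLOUD_PROJECT_NUMBER")}
        - location: ${sys.get_env("GOOGLE_CLOUD_LOCATION")}
        - workflowId: ${sys.get_env("GOOGLE_CLOUD_WORKFLOW_ID")}
        - revisionId: ${sys.get_env("GOOGLE_CLOUD_WORKFLOW_REVISION_ID")}
- returnEnvVars:
    return: ${projectId + " / " + location + " / " + workflowId}
```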

We use the built-in sys.get_env() function to access those variables. We’ll revisit the various existing built-in functions in later episodes.

Then when you execute this workflow…

So far, in this series on Cloud Workflows, we've only used the Google Cloud Console UI to manage our workflow definitions and their executions. But it's also possible to deploy new definitions and update existing ones from the command line, using the gcloud CLI. Let's see how to do that!

If you don’t already have an existing service account, you should create one following these instructions. I’m going to use the workflow-sa service account I created for the purpose of this demonstration.

Our workflow definition is a simple “hello world” like the one we created earlier in our exploration…
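Deploying such a definition from the command line could look like this, assuming the definition is saved in a local workflow.yaml file and the current project is already configured (the workflow name and service account address are illustrative):

```shell
# Deploy (or update) the workflow from its YAML definition,
# running it under the dedicated service account
gcloud workflows deploy my-workflow \
    --source=workflow.yaml \
    --service-account=workflow-sa@my-project.iam.gserviceaccount.com

# Launch an execution of the deployed workflow
gcloud workflows run my-workflow
```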

Time to do something pretty handy: calling an HTTP endpoint from your Google Cloud Workflows definitions. Whether you're calling GCP-specific APIs such as the ML APIs, REST APIs of other products like Cloud Firestore, your own services, or third-party, external APIs, this capability lets you plug your business processes into the external world!

Let’s see calling HTTP endpoints in action in the following video, before diving into the details below:

By default, when creating a new workflow definition, a default snippet / example is provided for your inspiration. We’ll take a look at it for this article…
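As a sketch, a step making an HTTP GET call and returning the response body could look like this (the URL is a placeholder):

```yaml
- callApi:
    call: http.get
    args:
        url: https://example.com/api/data
    result: apiResponse
- returnResponse:
    return: ${apiResponse.body}
```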

All the workflow definitions we’ve seen so far in this series were self-contained: they were not parameterized. But we often need our business processes to take arguments (the ID of an order, the details of the order, etc.), so that we can process those input values and do something with them. That’s where workflow input parameters become useful!

Let’s start with a simple greeting message that we want to customize with a firstname and lastname. We’d like our workflow to look something like this:
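A sketch of such a parameterized workflow, assuming the execution input is a dictionary with firstname and lastname keys (the field names are hypothetical):

```yaml
main:
    params: [input]
    steps:
        - createGreeting:
            assign:
                - greeting: ${"Hello " + input.firstname + " " + input.lastname + "!"}
        - returnGreeting:
            return: ${greeting}
```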

In the…

So far, in this series of articles on Cloud Workflows, we have used simple data types, like strings, numbers and boolean values. However, it’s possible to use more complex data structures, like arrays and dictionaries. In this new episode, we’re going to use those new structures.

Arrays can be defined inline (like anArray) or spanning several lines (like anotherArray):
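Reusing the names from the text, the two styles might look like this in an assignment step (the values are illustrative):

```yaml
- assignArrays:
    assign:
        - anArray: ["a", "b", "c"]
        - anotherArray:
            - one
            - two
            - three
```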

For dictionaries, you can define them as follows:
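A dictionary sketch, including a nested dictionary and an array value (keys and values are illustrative):

```yaml
- assignDictionary:
    assign:
        - person:
            name: "Ada"
            languages: ["en", "fr"]
            address:
                city: "Paris"
```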

Guillaume Laforge
