Using Steampipe in Google Cloud Shell
Google Cloud Shell is a web-based environment preloaded with tools for managing Google Cloud. Because Cloud Shell includes the gcloud CLI and launches with your credentials, you can quickly install Steampipe along with the GCP plugin and then instantly query your cloud resources.
About the Google Cloud Shell
The Google Cloud Shell is free to all Google Cloud customers. Because it's a free resource, Google imposes a few limits on the service. You can use at most 50 hours of Cloud Shell each week. The home directory of your Cloud Shell is deleted if you don't use it for 120 days. An inactive session is shut down after one hour, and an active session can run for at most 12 hours.
When a Google Cloud Shell session terminates, only files inside the home directory are preserved. For that reason we install the Steampipe binary into the home directory rather than /usr/local/bin, and all Steampipe commands in this guide start with ./.
To get started with the Google Cloud Shell, go to the Google Cloud Console. Select a Google Project that has billing enabled, then click on the Cloud Shell icon in the upper right.
Installing Steampipe in Google Cloud Shell
To install Steampipe, copy and run this command.
curl -s -L https://github.com/turbot/steampipe/releases/latest/download/steampipe_linux_amd64.tar.gz | tar -xzf -
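As an optional sanity check (not part of the original steps), you can confirm the binary unpacked into your home directory by printing its version:
./steampipe --version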
To install the GCP plugin, copy and run this command.
./steampipe plugin install gcp
Your output should look something like:
Installed plugin: gcp@latest v0.27.0
Documentation: https://hub.steampipe.io/plugins/turbot/gcp
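If you want to double-check which plugins are installed at any point, Steampipe's plugin list command (shown here as an optional extra step) will report them:
./steampipe plugin list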
Run your first query
To run a query, type:
./steampipe query
Let's query the gcp_project table.
select
  name,
  project_id,
  project_number,
  lifecycle_state,
  create_time
from
  gcp_project;
The first time you run a query, a dialog box may prompt you to authorize Cloud Shell to use your credentials. Click "Authorize".
That's it! You didn't have to read GCP API docs, install an API client library, or learn how to use that client to make API calls and unpack JSON responses. Steampipe did all that for you. It works the same way for every GCP table. And because you can use SQL to join across multiple tables representing GCP services, it's easy to reason over your entire GCP organization.
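For example, here is a minimal sketch of a cross-table join that counts Compute Engine instances per project. It assumes the gcp_compute_instance table's project column matches gcp_project.project_id; check the table's documentation on the Steampipe Hub if your plugin version differs.
select
  p.name as project_name,
  count(i.name) as instance_count
from
  gcp_project as p
  left join gcp_compute_instance as i on i.project = p.project_id
group by
  p.name;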
To view information about your GCP organization, you can run:
select
  display_name,
  organization_id,
  lifecycle_state,
  creation_time
from
  gcp_organization;
To see the full set of columns for any table, along with examples of their use, visit the Steampipe Hub. For quick reference, you can also autocomplete table names directly in the shell.
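You can also explore a table's schema without leaving the query shell. For instance, Steampipe's .inspect meta-command lists the columns of a table (shown here against gcp_project as an example):
.inspect gcp_project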
If you haven't used SQL lately, see our handy guide for writing Steampipe queries.