Snakemake storage plugin: gcs

Author: Vanessa Sochat · Available on PyPI

A Snakemake storage plugin that reads from and writes to Google Cloud Storage (GCS).

Installation

Install this plugin with pip or mamba, e.g.:

pip install snakemake-storage-plugin-gcs
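
Or with mamba from the conda-forge and bioconda channels (the usual packaging route for Snakemake plugins; channel availability is an assumption here):

mamba install -c conda-forge -c bioconda snakemake-storage-plugin-gcs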

Usage

Queries

Queries to this storage should have the following format:

| Query type | Query | Description |
|------------|---------------------------|---------------------------------------------------------------------------|
| any | gs://mybucket/myfile.txt | A file in a Google Cloud Storage (GCS) bucket |
| any | gcs://mybucket/myfile.txt | A file in a Google Cloud Storage (GCS) bucket (alternative query scheme) |
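
Both schemes address the same objects; for example, the following two queries (bucket and file names are hypothetical) refer to the same file:

gs://mybucket/data/sample.txt
gcs://mybucket/data/sample.txt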

As default provider

If you want all your input and output files (that are not explicitly marked as coming from another storage) to be written to and read from this storage, you can use it as the default provider via:

snakemake --default-storage-provider gcs --default-storage-prefix ...

with ... being the prefix of a query under which you want to store all your results. You can also pass custom settings via command line arguments:

snakemake --default-storage-provider gcs --default-storage-prefix ... \
    --storage-gcs-max-requests-per-second ... \
    --storage-gcs-project ... \
    --storage-gcs-keep-local ... \
    --storage-gcs-stay-on-remote ... \
    --storage-gcs-retries ...
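
As a concrete sketch, with hypothetical values substituted for the placeholders:

snakemake --default-storage-provider gcs \
    --default-storage-prefix gs://mybucket/results \
    --storage-gcs-project my-gcp-project \
    --storage-gcs-retries 5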

Within the workflow

If you want to use this storage plugin only for specific items, you can register it inside your workflow:

# register storage provider (not needed if no custom settings are to be defined here)
storage:
    provider="gcs",
    # optionally add custom settings here if needed
    # alternatively they can be passed via command line arguments
    # starting with --storage-gcs-..., see
    # snakemake --help
    # Maximum number of requests per second for this storage provider.
    # If nothing is specified, the default implemented by the storage plugin is used.
    max_requests_per_second=...,
    # Google Cloud Project
    project=...,
    # keep local copy of storage object(s)
    keep_local=...,
    # The artifacts should stay on the remote
    stay_on_remote=...,
    # Google Cloud API retries
    retries=...,

rule example:
    input:
        storage.gcs(
            # define query to the storage backend here
            ...
        ),
    output:
        "example.txt"
    shell:
        "..."

Using multiple entities of the same storage plugin

If you need to use this storage plugin multiple times with different settings (e.g. to connect to different storage servers), you can register it multiple times, each time providing a different tag:

# register shared settings
storage:
    provider="gcs",
    # optionally add custom settings here if needed
    # alternatively they can be passed via command line arguments
    # starting with --storage-gcs-..., see below
    # Maximum number of requests per second for this storage provider.
    # If nothing is specified, the default implemented by the storage plugin is used.
    max_requests_per_second=...,
    # Google Cloud Project
    project=...,
    # keep local copy of storage object(s)
    keep_local=...,
    # The artifacts should stay on the remote
    stay_on_remote=...,
    # Google Cloud API retries
    retries=...,

# register multiple tagged entities
storage foo:
    provider="gcs",
    # optionally add custom settings here if needed
    # alternatively they can be passed via command line arguments
    # starting with --storage-gcs-..., see below.
    # To only pass a setting to this tagged entity, prefix the given value with
    # the tag name, i.e. foo:max_requests_per_second=...
    # Maximum number of requests per second for this storage provider.
    # If nothing is specified, the default implemented by the storage plugin is used.
    max_requests_per_second=...,
    # Google Cloud Project
    project=...,
    # keep local copy of storage object(s)
    keep_local=...,
    # The artifacts should stay on the remote
    stay_on_remote=...,
    # Google Cloud API retries
    retries=...,

rule example:
    input:
        storage.foo(
            # define query to the storage backend here
            ...
        ),
    output:
        "example.txt"
    shell:
        "..."

Settings

The storage plugin has the following settings, which can be passed via the command line, within the workflow, or via environment variables (where a value is given in the respective column):

| CLI setting | Workflow setting | Envvar setting | Description | Default | Choices | Required | Type |
|---|---|---|---|---|---|---|---|
| --storage-gcs-max-requests-per-second VALUE | max_requests_per_second | | Maximum number of requests per second for this storage provider. If nothing is specified, the default implemented by the storage plugin is used. | None | | | str |
| --storage-gcs-project VALUE | project | SNAKEMAKE_STORAGE_GCS_PROJECT | Google Cloud Project | None | | | str |
| --storage-gcs-keep-local VALUE | keep_local | | Keep local copy of storage object(s) | False | | | bool |
| --storage-gcs-stay-on-remote VALUE | stay_on_remote | | The artifacts should stay on the remote | False | | | bool |
| --storage-gcs-retries VALUE | retries | | Google Cloud API retries | 5 | | | int |
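
For example, the project can be supplied via its environment variable rather than the CLI flag (project and prefix values are hypothetical):

export SNAKEMAKE_STORAGE_GCS_PROJECT=my-gcp-project
snakemake --default-storage-provider gcs --default-storage-prefix gs://mybucket/results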