Export Files
URL
To create temporary URLs for downloading files, create a job with an export/url task.
Note that all tasks are automatically deleted after 24 hours. Consequently, the created URLs remain valid for 24 hours only.
Task Parameters
operation: export/url
input: The input task name(s) for this task.
inline: Makes the export URLs return a Content-Disposition: inline header, which tells the browser to display the file instead of downloading it.
archive_multiple_files: By default, multiple files create multiple export URLs. When this option is enabled, a single export URL with a ZIP file is created instead.
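As a sketch, a job containing an export/url task might be built like this. The task names and the exact JSON shape of the job wrapper are assumptions based on the parameters above, not a definitive payload:

```python
import json

# Hypothetical job payload for an export/url task. The parameter names
# ("input", "inline", "archive_multiple_files") follow the descriptions
# above; treat the overall structure as an assumption.
job = {
    "tasks": {
        "export-my-file": {
            "operation": "export/url",
            "input": "convert-my-file",     # name of the preceding task
            "inline": False,                # download instead of displaying inline
            "archive_multiple_files": True  # bundle multiple files into one ZIP URL
        }
    }
}

print(json.dumps(job, indent=2))
```

Remember that the resulting URL expires together with the task after 24 hours.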
S3
To export files to an S3 compatible object storage, create a job with an export/s3 task.
Task Parameters
operation: export/s3
input: The input task name(s) for this task.
bucket: The Amazon S3 bucket where to store the file(s).
region: Specify the Amazon S3 region, e.g. us-west-2 or eu-west-1.
endpoint: Use a custom S3 API endpoint. The default endpoint is built from the configured region. This makes it possible to use other S3 compatible storage services (e.g. DigitalOcean).
key: S3 key for storing the file (the filename in the bucket, including path). If there are multiple files to export, printf-style placeholders are possible (e.g. myfile-%d.pdf produces the output files myfile-1.pdf, myfile-2.pdf and so on).
key_prefix: Alternatively to using key, you can specify a key prefix for exporting files.
access_key_id: The Amazon S3 access key ID. It needs the s3:PutObject permission. When using an ACL other than private, it also needs the s3:PutObjectAcl permission.
secret_access_key: The Amazon S3 secret access key.
session_token: Authenticate using temporary credentials (AWS Security Token Service).
acl: S3 ACL for storing the files. Possible values include private, public-read, public-read-write, authenticated-read, bucket-owner-read and bucket-owner-full-control. Defaults to private.
cache_control: S3 Cache-Control header to specify the lifetime of the file, for example max-age=172800.
content_disposition: Specify the Content-Disposition header for the file, for example attachment or inline.
content_type: Specify the Content-Type header for the file, for example application/pdf. By default, the correct Content-Type is set automatically based on the MIME type.
metadata: Object of additional S3 metadata.
server_side_encryption: The server-side encryption algorithm used when storing this object in S3. Possible values include AES256 and aws:kms.
tags: Object of S3 tags to add to the keys.
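As a sketch, an export/s3 task might be defined like this (the parameter names are assumptions inferred from the descriptions above). The snippet also shows how a printf-style key expands when there are multiple output files:

```python
# Hypothetical export/s3 task payload; parameter names are assumptions.
task = {
    "operation": "export/s3",
    "input": "convert-my-file",
    "bucket": "my-bucket",
    "region": "us-west-2",
    "key": "exports/myfile-%d.pdf",   # printf-style placeholder for multiple files
    "access_key_id": "AKIA...",       # needs s3:PutObject (and s3:PutObjectAcl for non-private ACLs)
    "secret_access_key": "...",
    "acl": "private",                 # the default ACL
}

# With multiple output files, a printf-style key expands per file:
keys = [task["key"] % i for i in (1, 2, 3)]
print(keys)  # ['exports/myfile-1.pdf', 'exports/myfile-2.pdf', 'exports/myfile-3.pdf']
```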
Azure Blob Storage
To export files to an Azure blob container, create a job with an export/azure/blob task.
Task Parameters
operation: export/azure/blob
input: The input task name(s) for this task.
storage_account: The name of the Azure storage account (the string before .blob.core.windows.net).
storage_access_key: The Azure secret key. Required only if you are not providing a SAS token.
sas_token: The Azure SAS token.
container: The Azure container name.
blob: Blob name for storing the file (the filename in the container, including path). If there are multiple files to export, printf-style placeholders are possible (e.g. myfile-%d.pdf produces the output files myfile-1.pdf, myfile-2.pdf and so on).
blob_prefix: Alternatively to using blob, you can specify a blob prefix for exporting files.
metadata: Object of additional Azure metadata.
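A minimal sketch of an export/azure/blob task, assuming the parameter names inferred from the descriptions above, and authenticating with a SAS token instead of the secret key:

```python
# Hypothetical export/azure/blob task payload; parameter names are assumptions.
task = {
    "operation": "export/azure/blob",
    "input": "convert-my-file",
    "storage_account": "mystorageaccount",   # the part before .blob.core.windows.net
    "sas_token": "sv=2020-08-04&ss=b&...",   # alternative to the secret key
    "container": "exports",
    "blob": "reports/myfile.pdf",            # blob name including path
}

# One of the two credentials must be present (assumption):
assert "sas_token" in task or "storage_access_key" in task
```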
Google Cloud Storage
To export files to a Google Cloud Storage bucket, create a job with an export/google-cloud-storage task.
Task Parameters
operation: export/google-cloud-storage
input: The input task name(s) for this task.
project_id: The Google Cloud project ID (api-project-...).
bucket: The Google Cloud Storage bucket name.
client_email: The client email of the service account to use (...@api-project-....iam.gserviceaccount.com).
private_key: The private key of the service account.
file: Filename of the file to create (the filename in the bucket, including path).
file_prefix: Alternatively to using file, you can specify a file prefix for exporting files.
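A sketch of an export/google-cloud-storage task with service-account credentials; parameter names and placeholder values are assumptions based on the descriptions above:

```python
# Hypothetical export/google-cloud-storage task payload; names are assumptions.
task = {
    "operation": "export/google-cloud-storage",
    "input": "convert-my-file",
    "project_id": "api-project-1234567890",
    "bucket": "my-bucket",
    "client_email": "exporter@api-project-1234567890.iam.gserviceaccount.com",
    "private_key": "-----BEGIN PRIVATE KEY-----\n...",  # service account key
    "file": "exports/myfile.pdf",                       # full path inside the bucket
}

print(task["file"])
```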
OpenStack
To export files to OpenStack Object Storage (Swift), create a job with an export/openstack task.
Task Parameters
operation: export/openstack
input: The input task name(s) for this task.
auth_url: The URL of the OpenStack Identity endpoint (without version).
region: Specify the OpenStack region.
container: The name of the OpenStack Storage container.
username: The OpenStack username.
password: The OpenStack password.
file: File name of the file to create (the filename in the container, including path).
file_prefix: Alternatively to using file, you can specify a file prefix for exporting files.
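A sketch of an export/openstack task; the parameter names and the example Identity endpoint are assumptions inferred from the descriptions above:

```python
# Hypothetical export/openstack task payload; names are assumptions.
task = {
    "operation": "export/openstack",
    "input": "convert-my-file",
    "auth_url": "https://auth.example.com",  # Identity endpoint, no version suffix
    "region": "RegionOne",
    "container": "exports",
    "username": "demo",
    "password": "secret",
    "file": "myfile.pdf",                    # filename in the container, including path
}
```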
SFTP
To export files to your SFTP server, create a job with an export/sftp task.
Task Parameters
operation: export/sftp
input: The input task name(s) for this task.
host: The SFTP server hostname.
port: The SFTP port. Defaults to 22.
username: The SFTP username.
password: The SFTP password.
private_key: Alternatively to using password, you can provide a private key.
file: File name of the file to create (the filename on the server, including path).
path: Alternatively to using file, you can specify a path for exporting files.
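A sketch of an export/sftp task authenticating with a private key rather than a password; parameter names are assumptions based on the descriptions above:

```python
# Hypothetical export/sftp task payload; names are assumptions.
task = {
    "operation": "export/sftp",
    "input": "convert-my-file",
    "host": "sftp.example.com",
    "port": 22,                    # the default SFTP port
    "username": "deploy",
    "private_key": "-----BEGIN RSA PRIVATE KEY-----\n...",  # alternative to password
    "file": "/uploads/myfile.pdf", # filename on the server, including path
}
```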
Upload
To upload files to any arbitrary URL via HTTP PUT, create a job with an export/upload task.
This can be used to upload to AWS S3 using presigned URLs, for example.
Task Parameters
operation: export/upload
input: The input task name(s) for this task.
url: The URL to send the PUT request to.
headers: Object of additional headers to send with the PUT request. Can be used to access URLs that require authorization.
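A sketch of an export/upload task targeting an S3 presigned PUT URL, as mentioned above; the parameter names and the example URL are assumptions:

```python
# Hypothetical export/upload task payload; names are assumptions.
task = {
    "operation": "export/upload",
    "input": "convert-my-file",
    # e.g. an S3 presigned PUT URL generated beforehand
    "url": "https://my-bucket.s3.amazonaws.com/myfile.pdf?X-Amz-Signature=...",
    # optional extra headers, e.g. for endpoints requiring authorization
    "headers": {"Authorization": "Bearer my-token"},
}
```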