This is the API v2 documentation. API v2 is currently in a preview state and is not recommended for production use! Read more about API v2 in our blog post.

Export files

Create export URL tasks

This task creates temporary URLs that can be used to download the files.

Please note that all tasks are deleted automatically after 24 hours, which means the created URLs are available for 24 hours only.

Arguments
input string or array, required The ID of the task to create temporary URLs for. Multiple task IDs can be provided as an array.
archive_multiple_files boolean, optional By default, multiple files will create multiple export URLs. When enabling this option, one export URL with a ZIP file will be created.
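For example, a request body that bundles the output of two tasks behind a single ZIP export URL could be built as in this Python sketch (both task IDs are made-up placeholders):

```python
import json

# Hypothetical task IDs -- replace with the IDs of your own finished tasks.
payload = {
    "input": [
        "73df1e16-fd8b-47a1-a156-f197babde91a",
        "4c80f1ae-5b3a-4d8e-9c6f-0a1b2c3d4e5f",
    ],
    # Bundle all files into one ZIP behind a single export URL.
    "archive_multiple_files": True,
}

body = json.dumps(payload)
print(body)
```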

Returns

The created task. You can find details about the task model response in the documentation about the show tasks endpoint.

When the task is finished, the result key has a files array with the filenames and their temporary URLs, as shown in the example response. The URLs are valid for 24 hours and are invalidated prematurely if the export task is deleted.
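Reading the URLs out of a finished task is a matter of walking data.result.files in the decoded response. A minimal sketch, using a response shaped like the example in this section:

```python
# A decoded task response, shaped like the example response in this section.
task = {
    "data": {
        "status": "finished",
        "result": {
            "files": [
                {"filename": "data.txt", "url": "https://storage.cloudconvert.com/..."},
            ]
        },
    }
}

# Only finished tasks carry a result with downloadable URLs.
if task["data"]["status"] == "finished":
    for f in task["data"]["result"]["files"]:
        print(f["filename"], "->", f["url"])
```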

Endpoint

POST https://api.cloudconvert.com/v2/export/url

Example Request

$ curl -X POST "https://api.cloudconvert.com/v2/export/url" \
       -H "Authorization: Bearer API_KEY" \
       -H "Content-Type: application/json" \
       -d '{
  "input": "73df1e16-fd8b-47a1-a156-f197babde91a"
}'

Example Response

{
  "data": {
    "id": "eed87242-577e-4e3e-8178-9edbe51975dd",
    "operation": "export/url",
    "status": "finished",
    "message": null,
    "created_at": "2018-09-19T14:42:58+00:00",
    "started_at": "2018-09-19T14:42:58+00:00",
    "ended_at": "2018-09-19T14:42:58+00:00",
    "result": {
      "files": [
        {
          "filename": "data.txt",
          "url": "https://storage.cloudconvert.com/eed87242-577e-4e3e-8178-9edbe51975dd/data.txt?temp_url_sig=79c2db4d884926bbcc5476d01b4922a19137aee9&temp_url_expires=1545962104"
        }
      ]
    }
  }
}

Create export S3 tasks

Create a task to export files to an S3 bucket.

Arguments
input string or array, required The ID of the task to export. Multiple task IDs can be provided as an array.
bucket string, required The Amazon S3 bucket in which to store the file(s).
region string, required The Amazon S3 region, e.g. us-west-2 or eu-west-1.
endpoint string, optional Use a custom S3 API endpoint. The default endpoint is built from the configured region. This makes it possible to use other S3-compatible storage services (e.g. DigitalOcean).
key string, optional S3 key for storing the file (the filename in the bucket, including path). If there are multiple files to export, printf style placeholders are possible (e.g. myfile-%d.pdf produces the output files myfile-1.pdf, myfile-2.pdf and so on).
key_prefix string, optional As an alternative to key, you can specify a key prefix for exporting files.
access_key_id string, required The Amazon S3 access key ID. It needs to have the s3:PutObject permission. When using an ACL other than private, it also needs the s3:PutObjectAcl permission.
secret_access_key string, required The Amazon S3 secret access key.
session_token string, optional Auth using temporary credentials (AWS Security Token Service).
acl string, optional S3 ACL for storing the files. Possible values include: private, public-read, public-read-write, authenticated-read, bucket-owner-read, bucket-owner-full-control. Defaults to private.
cache_control string, optional S3 CacheControl header to specify the lifetime of the file, for example: max-age=172800.
metadata object, optional Object of additional S3 metadata.
server_side_encryption string, optional The server-side encryption algorithm to use when storing the object in S3. Possible values include AES256 and aws:kms.
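To illustrate the printf-style placeholders in key: when exporting multiple files, the server expands the template roughly as in this Python sketch (the template and file count below are made-up examples):

```python
# Hypothetical key template with a printf-style placeholder.
key_template = "myfile-%d.pdf"

# With three files to export, the resulting S3 keys would be:
keys = [key_template % i for i in range(1, 4)]
print(keys)  # -> ['myfile-1.pdf', 'myfile-2.pdf', 'myfile-3.pdf']
```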

Returns

The created task. You can find details about the task model response in the documentation about the show tasks endpoint.

Endpoint

POST https://api.cloudconvert.com/v2/export/s3

Example Request

$ curl -X POST "https://api.cloudconvert.com/v2/export/s3" \
       -H "Authorization: Bearer API_KEY" \
       -H "Content-Type: application/json" \
       -d '{
  "input": "73df1e16-fd8b-47a1-a156-f197babde91a",
  "access_key_id": "AKXXXXXXXXXXXX",
  "secret_access_key": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
  "bucket": "mybucket",
  "region": "eu-central-1",
  "key": "myfile.ext",
  "metadata": {
    "x-amz-meta-some-key": "data"
  }
}'

Create export Azure blob tasks

Create a task to export files to an Azure blob container.

Arguments
input string or array, required The ID of the task to export. Multiple task IDs can be provided as an array.
storage_account string, required The name of the Azure storage account (the string before .blob.core.windows.net).
storage_access_key string, optional The Azure secret key. Required only if you are not providing a SAS token.
sas_token string, optional The Azure SAS token. Required only if you are not providing a storage access key.
container string, required Azure container name.
blob string, optional Blob name for storing the file (the filename in the container, including path). If there are multiple files to export, printf style placeholders are possible (e.g. myfile-%d.pdf produces the output files myfile-1.pdf, myfile-2.pdf and so on).
blob_prefix string, optional As an alternative to blob, you can specify a blob prefix for exporting files.
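Since storage_access_key and sas_token are alternatives, a client might validate its payload before sending the request. A small sketch, assuming exactly one of the two must be present (the payload values are placeholders):

```python
def validate_azure_auth(payload: dict) -> None:
    """Raise if neither (or both) of the Azure auth options are provided."""
    has_key = "storage_access_key" in payload
    has_sas = "sas_token" in payload
    if has_key == has_sas:
        raise ValueError("Provide either storage_access_key or sas_token, not both.")

# Placeholder values for illustration only.
payload = {
    "input": "73df1e16-fd8b-47a1-a156-f197babde91a",
    "storage_account": "XXXXXXXXXXXX",
    "sas_token": "xxxxxxxx",
    "container": "mycontainer",
}
validate_azure_auth(payload)  # passes: exactly one auth option present
```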

Returns

The created task. You can find details about the task model response in the documentation about the show tasks endpoint.

Endpoint

POST https://api.cloudconvert.com/v2/export/azure/blob

Example Request

$ curl -X POST "https://api.cloudconvert.com/v2/export/azure/blob" \
       -H "Authorization: Bearer API_KEY" \
       -H "Content-Type: application/json" \
       -d '{
  "input": "73df1e16-fd8b-47a1-a156-f197babde91a",
  "storage_account": "XXXXXXXXXXXX",
  "storage_access_key": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
  "container": "mycontainer",
  "blob": "myfile.ext"
}'

Create export Google Cloud Storage tasks

Create a task to export files to a Google Cloud Storage bucket.

Arguments
input string or array, required The ID of the task to export. Multiple task IDs can be provided as an array.
project_id string, required The Google Cloud Project ID (api-project-...).
bucket string, required The Google Cloud Storage Bucket name.
client_email string, required The client email of the service account to use (...@api-project-....iam.gserviceaccount.com).
private_key string, required The private key of the service account.
file string, optional File name of the file to create (the filename in the bucket, including path).
file_prefix string, optional As an alternative to file, you can specify a file prefix for exporting files.
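The private_key value is a multi-line PEM key, so its line breaks must be escaped as \n inside the JSON body, as in the example request in this section. json.dumps handles the escaping automatically; a short sketch with a truncated placeholder key:

```python
import json

# Truncated placeholder key -- use the real service-account key in practice.
private_key = "-----BEGIN PRIVATE KEY-----\nXXXXXXXXXXXX\n-----END PRIVATE KEY-----\n"

body = json.dumps({"private_key": private_key})
# The literal newlines are escaped as \n inside the JSON string.
print(body)
```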

Returns

The created task. You can find details about the task model response in the documentation about the show tasks endpoint.

Endpoint

POST https://api.cloudconvert.com/v2/export/google-cloud-storage

Example Request

$ curl -X POST "https://api.cloudconvert.com/v2/export/google-cloud-storage" \
       -H "Authorization: Bearer API_KEY" \
       -H "Content-Type: application/json" \
       -d '{
  "input": "73df1e16-fd8b-47a1-a156-f197babde91a",
  "project_id": "api-project-XXXXXXXXXXXX",
  "bucket": "mybucket",
  "client_email": "xxxxxxx@api-project-xxxxxx.iam.gserviceaccount.com",
  "private_key": "-----BEGIN PRIVATE KEY-----\nXXXXXXXXXXXX....",
  "file": "myfile.ext"
}'

Create export OpenStack Object Storage tasks

Create a task to export files to OpenStack Object Storage (Swift).

Arguments
input string or array, required The ID of the task to export. Multiple task IDs can be provided as an array.
auth_url string, required The URL of the OpenStack Identity endpoint (without version).
username string, required The OpenStack username.
password string, required The OpenStack password.
region string, required Specify the OpenStack region.
container string, required The name of the OpenStack Storage container.
file string, optional File name of the file to create (the filename in the container, including path).
file_prefix string, optional As an alternative to file, you can specify a file prefix for exporting files.

Returns

The created task. You can find details about the task model response in the documentation about the show tasks endpoint.

Endpoint

POST https://api.cloudconvert.com/v2/export/openstack

Example Request

$ curl -X POST "https://api.cloudconvert.com/v2/export/openstack" \
       -H "Authorization: Bearer API_KEY" \
       -H "Content-Type: application/json" \
       -d '{
  "input": "73df1e16-fd8b-47a1-a156-f197babde91a",
  "auth_url": "https://auth.cloud.ovh.net",
  "region": "DE1",
  "username": "test",
  "password": "test",
  "container": "container_name",
  "file": "path/myfile.ext"
}'

Create export SFTP tasks

Create a task to export files to an SFTP server.

Arguments
input string or array, required The ID of the task to export. Multiple task IDs can be provided as an array.
host string, required The SFTP server hostname.
port integer, optional The SFTP port. Defaults to 22.
username string, required The SFTP username.
password string, optional The SFTP password.
private_key string, optional As an alternative to password, you can provide a private key.
file string, optional File name of the file to create (the filename on the server, including path).
path string, optional As an alternative to file, you can specify a path for exporting files.
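As with the other export targets, password and private_key are alternatives. A sketch of a key-based payload (the hostname, key, and path are placeholders):

```python
import json

payload = {
    "input": "73df1e16-fd8b-47a1-a156-f197babde91a",
    "host": "myserver.com",   # placeholder hostname
    "port": 22,               # optional; 22 is the default
    "username": "test",
    # Key-based auth instead of a password (truncated placeholder key).
    "private_key": "-----BEGIN RSA PRIVATE KEY-----\nXXXX\n-----END RSA PRIVATE KEY-----",
    "path": "exports/",       # export into this directory on the server
}
print(json.dumps(payload))
```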

Returns

The created task. You can find details about the task model response in the documentation about the show tasks endpoint.

Endpoint

POST https://api.cloudconvert.com/v2/export/sftp

Example Request

$ curl -X POST "https://api.cloudconvert.com/v2/export/sftp" \
       -H "Authorization: Bearer API_KEY" \
       -H "Content-type: application/json" \
       -d '{
  "input": "73df1e16-fd8b-47a1-a156-f197babde91a",
  "host": "myserver.com",
  "username": "test",
  "password": "test",
  "file": "path/myfile.ext"
}'