Import files

Create import URL tasks

Create a task to import one file by downloading it from a URL.

Arguments
url string, required The URL to the file.
filename string, optional The filename of the input file, including extension. If none is provided, we will try to detect the filename from the URL.
headers object, optional Object of additional headers to send with the download request. Can be used to access URLs that require authorization.

Returns

The created task. You can find details about the task model response in the documentation about the show tasks endpoint.

Endpoint

POST https://api.cloudconvert.com/v2/import/url

Example Request

$ curl -X POST "https://api.cloudconvert.com/v2/import/url" \
       -H "Authorization: Bearer API_KEY" \
       -H "Content-type: application/json" \
       -d '{
  "url": "https://url-to/file",
  "filename": "myfile.jpg",
  "headers": {
    "Authorization": "Bearer XX",
  }
}'
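
A minimal end-to-end sketch of this endpoint is shown below. It assumes jq is installed and polls the show tasks endpoint (GET https://api.cloudconvert.com/v2/tasks/{id}, see the task documentation) until the import has finished or failed:

# Create the import/url task and capture its id.
TASK_ID=$(curl -s -X POST "https://api.cloudconvert.com/v2/import/url" \
       -H "Authorization: Bearer API_KEY" \
       -H "Content-type: application/json" \
       -d '{"url": "https://url-to/file", "filename": "myfile.jpg"}' \
  | jq -r '.data.id')

# Poll the task status until it is no longer waiting or processing.
while true; do
  STATUS=$(curl -s "https://api.cloudconvert.com/v2/tasks/$TASK_ID" \
       -H "Authorization: Bearer API_KEY" | jq -r '.data.status')
  if [ "$STATUS" = "finished" ] || [ "$STATUS" = "error" ]; then
    break
  fi
  sleep 2
done

echo "Task $TASK_ID ended with status $STATUS"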

Create upload tasks

Create a task which uploads one input file. It allows your users to directly upload input files to CloudConvert, without temporarily storing them on your server.

Arguments
redirect string, optional Optionally redirect user to this URL after upload.

Returns

The response contains a form object in the result key which allows uploading the file. As shown below, you can use its url and parameters to perform a browser-based upload. Please note that all POST parameters are required and that their number and names might vary. file always needs to be the last POST parameter.

<form action="https://upload.cloudconvert.com/d660c0df-d15e-468a-9554-917e0f0f3ef1/"
      method="POST"
      enctype="multipart/form-data">
    <input type="hidden" name="expires" value="1545444403">
    <input type="hidden" name="max_file_count" value="1">
    <input type="hidden" name="max_file_size" value="10000000000">
    <input type="hidden" name="signature" value="d0db9b5e4ff7283xxfe0b1e3ad6x1db95c616121">
    <input type="file" name="file">
    <input type="submit">
</form>

If you have used the redirect parameter when creating the task, the user is redirected to that URL after submitting the form.

The equivalent curl command for uploading the file would be:

curl -L "https://upload.cloudconvert.com/d660c0df-d15e-468a-9554-917e0f0f3ef1/" \
    -F "expires=1545444403" \
    -F "max_file_count=1" \
    -F "max_file_size=10000000000" \
    -F "signature=d0db9b5e4ff7283xxfe0b1e3ad6x1db95c616121" \
    -F "file=@/path/to/file.ext" \

The task completes as soon as the file has been uploaded.

Endpoint

POST https://api.cloudconvert.com/v2/import/upload

Example Request

$ curl -X POST "https://api.cloudconvert.com/v2/import/upload" \
       -H "Authorization: Bearer API_KEY" \
       -H "Content-type: application/json" \
       -d '{
  "redirect": "https://url-to/redirect"
}'

Example Response

{
  "data": {
    "id": "c85f3ca9-164c-4e89-8ae2-c08192a7cb08",
    "operation": "import/upload",
    "status": "waiting",
    "message": "Waiting for upload",
    "code": null,
    "created_at": "2018-09-19T14:42:58+00:00",
    "started_at": null,
    "ended_at": null,
    "payload": {},
    "result": {
      "form": {
        "url": "https://upload.cloudconvert.com/d660c0df-d15e-468a-9554-917e0f0f3ef1/",
        "parameters": {
          "expires": 1545444403,
          "max_file_count": 1,
          "max_file_size": 10000000000,
          "signature": "d0db9b5e4ff7283xxfe0b1e3ad6x1db95c616121"
        }
      }
    }
  }
}
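
Putting both steps together, the following is a minimal sketch that creates the upload task, extracts the form url and parameters from the response and then uploads a local file. It assumes jq is installed and that no form parameter value contains spaces:

# 1. Create the upload task.
TASK=$(curl -s -X POST "https://api.cloudconvert.com/v2/import/upload" \
       -H "Authorization: Bearer API_KEY" \
       -H "Content-type: application/json" \
       -d '{}')

# 2. Extract the upload URL and turn every form parameter into a -F flag.
#    The parameters are read dynamically because their number and names might vary.
UPLOAD_URL=$(echo "$TASK" | jq -r '.data.result.form.url')
PARAM_FLAGS=$(echo "$TASK" | jq -r '.data.result.form.parameters | to_entries[] | "-F \(.key)=\(.value)"')

# 3. Upload the file. file must be the last form field.
curl -L "$UPLOAD_URL" $PARAM_FLAGS -F "file=@/path/to/file.ext"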

Create import base64 tasks

Create a task to import one file by a base64 string.

Do not use this import method for files larger than 10 MB. Embedded base64 encoded files inflate the request payload, which might cause issues. For bigger files, use an asynchronous import method such as import/upload or import/url.

Arguments
file string, required The base64 encoded file content.
filename string, required The filename of the input file, including extension.

Returns

The created task. You can find details about the task model response in the documentation about the show tasks endpoint.

Endpoint

POST https://api.cloudconvert.com/v2/import/base64

Example Request

$ curl -X POST "https://api.cloudconvert.com/v2/import/base64" \
       -H "Authorization: Bearer API_KEY" \
       -H "Content-type: application/json" \
       -d '{
  "file": "dGVzdDEyMw==",
  "filename": "test.txt"
}'
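
To import an actual local file, you can base64 encode it on the fly. The sketch below assumes jq and GNU coreutils base64 are available (on macOS, drop the -w0 flag):

# Build the JSON body with jq so the base64 string is escaped correctly,
# then pipe it to curl (-d @- reads the request body from stdin).
jq -n --arg file "$(base64 -w0 /path/to/test.txt)" \
      --arg filename "test.txt" \
      '{file: $file, filename: $filename}' \
  | curl -X POST "https://api.cloudconvert.com/v2/import/base64" \
       -H "Authorization: Bearer API_KEY" \
       -H "Content-type: application/json" \
       -d @-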

Create import raw tasks

Create a task to import one file by a raw string.

Do not use this import method for files larger than 10 MB. Files embedded as raw strings inflate the request payload, which might cause issues. For bigger files, use an asynchronous import method such as import/upload or import/url.

Arguments
file string, required The raw file content.
filename string, required The filename of the input file, including extension.

Returns

The created task. You can find details about the task model response in the documentation about the show tasks endpoint.

Endpoint

POST https://api.cloudconvert.com/v2/import/raw

Example Request

$ curl -X POST "https://api.cloudconvert.com/v2/import/raw" \
       -H "Authorization: Bearer API_KEY" \
       -H "Content-type: application/json" \
       -d '{
  "file": "content",
  "filename": "test.txt"
}'
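
When sending the contents of a local text file, the main pitfall is escaping it correctly for JSON. The sketch below uses jq's --rawfile option (requires jq 1.6 or newer) to build the request body:

# --rawfile loads the file contents into $content as a raw string and
# jq takes care of the JSON escaping; -d @- reads the body from stdin.
jq -n --rawfile content /path/to/test.txt \
      '{file: $content, filename: "test.txt"}' \
  | curl -X POST "https://api.cloudconvert.com/v2/import/raw" \
       -H "Authorization: Bearer API_KEY" \
       -H "Content-type: application/json" \
       -d @-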

Create import S3 tasks

Create a task to import files by downloading them from an S3 bucket.

Arguments
bucket string, required The Amazon S3 bucket to download the file from.
region string, required The Amazon S3 region of the bucket, e.g. us-west-2 or eu-west-1.
endpoint string, optional Use a custom S3 API endpoint. The default endpoint is built from the configured region. This makes it possible to use other S3-compatible storage services (e.g. DigitalOcean).
key string, optional S3 key of the input file (the filename in the bucket, including path).
key_prefix string, optional As an alternative to key, you can specify a key prefix to import multiple files at once.
access_key_id string, required The Amazon S3 access key ID. It needs to have the s3:GetObject permission.
secret_access_key string, required The Amazon S3 secret access key.
session_token string, optional Auth using temporary credentials (AWS Security Token Service).
filename string, optional The filename of the input file, including extension. If none is provided, we will use the key parameter as the filename.

Returns

The created task. You can find details about the task model response in the documentation about the show tasks endpoint.

Endpoint

POST https://api.cloudconvert.com/v2/import/s3

Example Request

$ curl -X POST "https://api.cloudconvert.com/v2/import/s3" \
       -H "Authorization: Bearer API_KEY" \
       -H "Content-type: application/json" \
       -d '{
  "access_key_id": "AKXXXXXXXXXXXX",
  "secret_access_key": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
  "bucket": "mybucket",
  "region": "eu-central-1",
  "key": "myfile.ext"
}'
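
For illustration, the request below imports all files under a key prefix using temporary credentials from the AWS Security Token Service. All credential, bucket and prefix values are placeholders:

$ curl -X POST "https://api.cloudconvert.com/v2/import/s3" \
       -H "Authorization: Bearer API_KEY" \
       -H "Content-type: application/json" \
       -d '{
  "access_key_id": "ASIAXXXXXXXXXXXX",
  "secret_access_key": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
  "session_token": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
  "bucket": "mybucket",
  "region": "eu-central-1",
  "key_prefix": "invoices/"
}'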

Create import Azure blob tasks

Create a task to import files by downloading them from an Azure blob container.

Arguments
storage_account string, required The name of the Azure storage account (this is the part of the hostname before .blob.core.windows.net).
storage_access_key string, optional The Azure storage access key. Required only if you are not providing a SAS token.
sas_token string, optional The Azure SAS token.
container string, required Azure container name.
blob string, optional Azure blob name of the input file (the filename in the bucket, including path).
blob_prefix string, optional As an alternative to blob, you can specify a blob prefix to import multiple files at once.
filename string, optional The filename of the input file, including extension. If none is provided, we will use the blob parameter as the filename.

Returns

The created task. You can find details about the task model response in the documentation about the show tasks endpoint.

Endpoint

POST https://api.cloudconvert.com/v2/import/azure/blob

Example Request

$ curl -X POST "https://api.cloudconvert.com/v2/import/azure/blob" \
       -H "Authorization: Bearer API_KEY" \
       -H "Content-type: application/json" \
       -d '{
  "storage_account": "XXXXXXXXXXXX",
  "storage_access_key": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
  "container": "mycontainer",
  "blob": "myfile.ext"
}'
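
For illustration, the request below authenticates with a SAS token instead of the storage access key and imports all blobs under a prefix. All values are placeholders:

$ curl -X POST "https://api.cloudconvert.com/v2/import/azure/blob" \
       -H "Authorization: Bearer API_KEY" \
       -H "Content-type: application/json" \
       -d '{
  "storage_account": "XXXXXXXXXXXX",
  "sas_token": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
  "container": "mycontainer",
  "blob_prefix": "invoices/"
}'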

Create import Google Cloud Storage tasks

Create a task to import files by downloading them from a Google Cloud Storage bucket.

Arguments
project_id string, required The Google Cloud Project ID (api-project-...).
bucket string, required The Google Cloud Storage Bucket name.
client_email string, required The client email of the service account to use (...@api-project-....iam.gserviceaccount.com).
private_key string, required The private key of the service account.
file string, optional Filename of the input file (the filename in the bucket, including path).
file_prefix string, optional As an alternative to file, you can specify a file prefix to import multiple files at once.
filename string, optional The filename of the input file, including extension. If none is provided, we will use the file parameter as the filename.

Returns

The created task. You can find details about the task model response in the documentation about the show tasks endpoint.

Endpoint

POST https://api.cloudconvert.com/v2/import/google-cloud-storage

Example Request

$ curl -X POST "https://api.cloudconvert.com/v2/import/google-cloud-storage" \
       -H "Authorization: Bearer API_KEY" \
       -H "Content-type: application/json" \
       -d '{
  "project_id": "api-project-XXXXXXXXXXXX",
  "bucket": "mybucket",
  "client_email": "xxxxxxx@api-project-xxxxxx.iam.gserviceaccount.com",
  "private_key": "-----BEGIN PRIVATE KEY-----\nXXXXXXXXXXXX....",
  "file": "myfile.ext"
}'
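
The client_email and private_key values are part of the service account key file that you can download from the Google Cloud console. The sketch below builds the request body directly from such a file; it assumes jq is installed, that the key file is saved as service-account.json, and uses placeholder bucket and file names:

# project_id, client_email and private_key are standard fields of the
# service account key file; -d @- reads the request body from stdin.
jq '{project_id: .project_id,
     client_email: .client_email,
     private_key: .private_key,
     bucket: "mybucket",
     file: "myfile.ext"}' service-account.json \
  | curl -X POST "https://api.cloudconvert.com/v2/import/google-cloud-storage" \
       -H "Authorization: Bearer API_KEY" \
       -H "Content-type: application/json" \
       -d @-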

Create import OpenStack Object Storage tasks

Create a task to import files by downloading them from OpenStack Object Storage (Swift).

Arguments
auth_url string, required The URL of the OpenStack Identity endpoint (without version).
username string, required The OpenStack username.
password string, required The OpenStack password.
region string, required Specify the OpenStack region.
tenant_name string, optional Specify the OpenStack tenant name.
container string, required The name of the OpenStack Storage container.
file string, optional File name of the input file (the filename in the container, including path).
file_prefix string, optional As an alternative to file, you can specify a file prefix to import multiple files at once.
filename string, optional The filename of the input file, including extension. If none is provided, we will use the file parameter as the filename.

Returns

The created task. You can find details about the task model response in the documentation about the show tasks endpoint.

Endpoint

POST https://api.cloudconvert.com/v2/import/openstack

Example Request

$ curl -X POST "https://api.cloudconvert.com/v2/import/openstack" \
       -H "Authorization: Bearer API_KEY" \
       -H "Content-type: application/json" \
       -d '{
  "auth_url": "https://auth.cloud.ovh.net",
  "region": "DE",
  "username": "test",
  "password": "test",
  "container": "container_name",
  "file": "path/myfile.ext"
}'
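
For illustration, the request below additionally sets tenant_name and imports all files under a prefix. All values are placeholders:

$ curl -X POST "https://api.cloudconvert.com/v2/import/openstack" \
       -H "Authorization: Bearer API_KEY" \
       -H "Content-type: application/json" \
       -d '{
  "auth_url": "https://auth.cloud.ovh.net",
  "region": "DE",
  "username": "test",
  "password": "test",
  "tenant_name": "mytenant",
  "container": "container_name",
  "file_prefix": "invoices/"
}'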

Create import SFTP tasks

Create a task to import files by downloading them from an SFTP server.

Arguments
host string, required The SFTP server hostname.
port integer, optional The SFTP port. Defaults to 22.
username string, required The SFTP username.
password string, optional The SFTP password.
private_key string, optional As an alternative to password, you can provide a private key.
file string, optional File name of the input file (the filename on the server, including path).
path string, optional As an alternative to file, you can specify a path to import multiple files at once.
filename string, optional The filename of the input file, including extension. If none is provided, we will use the file parameter as the filename.

Returns

The created task. You can find details about the task model response in the documentation about the show tasks endpoint.

Endpoint

POST https://api.cloudconvert.com/v2/import/sftp

Example Request

$ curl -X POST "https://api.cloudconvert.com/v2/import/sftp" \
       -H "Authorization: Bearer API_KEY" \
       -H "Content-type: application/json" \
       -d '{
  "host": "myserver.com",
  "username": "test",
  "password": "test",
  "file": "path/myfile.ext"
}'
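
To authenticate with a private key instead of a password, the key (which usually spans multiple lines) has to be embedded as a JSON string. The sketch below uses jq's --rawfile option (requires jq 1.6 or newer); the key path and the other values are placeholders:

# --rawfile loads the multi-line key into $key and escapes it for JSON;
# -d @- reads the request body from stdin.
jq -n --rawfile key /path/to/id_rsa \
      '{host: "myserver.com", username: "test", private_key: $key, file: "path/myfile.ext"}' \
  | curl -X POST "https://api.cloudconvert.com/v2/import/sftp" \
       -H "Authorization: Bearer API_KEY" \
       -H "Content-type: application/json" \
       -d @-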