Import Files
URL
To import a file by downloading it from a URL, create a job with an import/url task.
Task Parameters
operation: import/url
url (required): The URL of the file.
filename: The filename of the input file, including extension. If none is provided, we try to detect the filename from the URL.
headers: Object of additional headers to send with the download request. Can be used to access URLs that require authorization.
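As a sketch, the JSON body of such a job could be built like this (the URL, filename, header value, and task names are placeholders, not values from the API):

```python
# Hypothetical job payload with an import/url task followed by an export task.
job_payload = {
    "tasks": {
        "import-my-file": {
            "operation": "import/url",
            "url": "https://example.com/input.docx",      # placeholder URL
            "filename": "input.docx",                     # optional; detected from the URL if omitted
            "headers": {"Authorization": "Bearer token"}  # optional; for protected URLs
        },
        "export-my-file": {
            "operation": "export/url",
            "input": "import-my-file"
        }
    }
}
# This payload would be sent as the JSON body of a POST request to the jobs endpoint.
```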
Base64 String
To import a file by providing a base64 encoded string with the file content, create a job with an import/base64 task.
Task Parameters
operation: import/base64
file (required): Base64 encoded file content.
filename (required): The filename of the input file, including extension.
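For example, the file content can be encoded with Python's standard base64 module before it is placed in the task (the content and filename below are placeholders):

```python
import base64

raw_bytes = b"Hello CloudConvert!"  # the raw file content to import
encoded = base64.b64encode(raw_bytes).decode("ascii")

# Hypothetical task definition using the encoded content
task = {
    "operation": "import/base64",
    "file": encoded,
    "filename": "hello.txt",
}

# Decoding the "file" value yields the original bytes again
assert base64.b64decode(task["file"]) == raw_bytes
```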
Raw String
To import a file by providing a raw string with the file content, create a job with an import/raw task.
Task Parameters
operation: import/raw
file (required): File content as string.
filename (required): The filename of the input file, including extension.
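A minimal sketch of such a task definition (the content and filename are placeholders):

```python
# Hypothetical import/raw task: the file content is passed verbatim as a string,
# no encoding step is needed.
task = {
    "operation": "import/raw",
    "file": "col1,col2\n1,2\n",  # placeholder file content
    "filename": "data.csv",
}
```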
Upload
Allow your users to upload input files directly to CloudConvert, without temporarily storing them on your server.
First, create a job with an import/upload task:
Task Parameters
operation: import/upload (no additional parameters are required; the file itself is uploaded in a second step using the form returned in the task result)
The job creation response has a form object in the result key which then allows uploading the file:
{
  "data": {
    "id": "9a160154-58e2-437f-9b6b-19d63b1f59e3",
    "tag": "myjob-123",
    "status": "waiting",
    "created_at": "2018-09-19T14:42:58+00:00",
    "started_at": "2018-09-19T14:42:58+00:00",
    "tasks": [
      {
        "id": "c85f3ca9-164c-4e89-8ae2-c08192a7cb08",
        "operation": "import/upload",
        "status": "waiting",
        "message": "Waiting for upload",
        "code": null,
        "created_at": "2018-09-19T14:42:58+00:00",
        "started_at": null,
        "ended_at": null,
        "payload": {},
        "result": {
          "form": {
            "url": "https://upload.cloudconvert.com/d660c0df-d15e-468a-9554-917e0f0f3ef1/",
            "parameters": {
              "expires": 1545444403,
              "max_file_count": 1,
              "max_file_size": 10000000000,
              "signature": "d0db9b5e4ff7283xxfe0b1e3ad6x1db95c616121"
            }
          }
        }
      }
    ],
    "links": {
      "self": "https://api.cloudconvert.com/v2/jobs/Xh56hvvMhG"
    }
  }
}
As shown below, you can use the url and parameters values to implement browser-based uploads. Please note that all POST parameters are required and that the number and names of the POST parameters may vary. file always needs to be the last POST parameter.
Alternatively, our SDKs have built-in methods for uploading files:
<form action="https://upload.cloudconvert.com/d660c0df-d15e-468a-9554-917e0f0f3ef1/"
method="POST"
enctype="multipart/form-data">
<input type="hidden" name="expires" value="1545444403">
<input type="hidden" name="max_file_count" value="1">
<input type="hidden" name="max_file_size" value="10000000000">
<input type="hidden" name="signature" value="d0db9b5e4ff7283xxfe0b1e3ad6x1db95c616121">
<input type="file" name="file">
<input type="submit">
</form>
curl -L "https://upload.cloudconvert.com/d660c0df-d15e-468a-9554-917e0f0f3ef1/" \
-F "expires=1545444403" \
-F "max_file_count=1" \
-F "max_file_size=10000000000" \
-F "signature=d0db9b5e4ff7283xxfe0b1e3ad6x1db95c616121" \
-F "file=@/path/to/file.ext" \
use \CloudConvert\Models\Job;
use \CloudConvert\Models\Task;

$job = (new Job())
    ->addTask(new Task('import/upload', 'upload-my-file'))
    ->addTask(
        (new Task('convert', 'convert-my-file'))
            ->set('input', 'upload-my-file')
            ->set('output_format', 'pdf')
    )
    ->addTask(
        (new Task('export/url', 'export-my-file'))
            ->set('input', 'convert-my-file')
    );

$job = $cloudconvert->jobs()->create($job);

$uploadTask = $job->getTasks()->whereName('upload-my-file')[0];

$cloudconvert->tasks()->upload($uploadTask, fopen('./file.pdf', 'r'), 'file.pdf');
const job = await cloudConvert.jobs.create({
    tasks: {
        'upload-my-file': {
            operation: 'import/upload'
        }
        // ...
    }
});

const uploadTask = job.tasks.find(task => task.name === 'upload-my-file');

const inputFile = fs.createReadStream('./file.pdf');

await cloudConvert.tasks.upload(uploadTask, inputFile, 'file.pdf');
job = cloudconvert.Job.create(payload={
    'tasks': {
        'upload-my-file': {
            'operation': 'import/upload'
        }
    }
})

upload_task_id = job['tasks'][0]['id']

upload_task = cloudconvert.Task.find(id=upload_task_id)
res = cloudconvert.Task.upload(file_name='path/to/sample.pdf', task=upload_task)
// Create a client
final AsyncCloudConvertClient asyncCloudConvertClient = new AsyncCloudConvertClient();

// Create a job
final JobResponse createJobResponse = asyncCloudConvertClient.jobs().create(
    ImmutableMap.of(
        "import-my-file", new UploadImportRequest(),
        "convert-my-file", new ConvertFilesTaskRequest()
            .setInput("import-my-file")
            .set("width", 100)
            .set("height", 100),
        "export-my-file", new UrlExportRequest().setInput("convert-my-file")
    )
).get().getBody();

// Get the upload task from the job response
final TaskResponse uploadFileTaskJobResponse = createJobResponse.getTasks().stream()
    .filter(taskResponse -> taskResponse.getName().equals("import-my-file"))
    .findFirst().get();

// File as input stream
final InputStream inputStream = getClass().getClassLoader().getResourceAsStream("file.jpg");

// Actually upload the file
asyncCloudConvertClient.importUsing()
    .upload(uploadFileTaskJobResponse.getId(), uploadFileTaskJobResponse.getResult().getForm(), inputStream).get();
job = cloudconvert.jobs.create({
  tasks: [
    {
      name: "upload-my-file",
      operation: "import/upload",
    }
  ]
})

upload_task = job.tasks.where(operation: "import/upload").first

response = cloudconvert.tasks.upload("/path/to/sample.pdf", upload_task)

updated_task = cloudconvert.tasks.find(upload_task.id)
var job = await _cloudConvertAPI.CreateJobAsync(new JobCreateRequest
{
    Tasks = new
    {
        upload_my_file = new ImportUploadCreateRequest()
        // ...
    }
});

var uploadTask = job.Data.Tasks.FirstOrDefault(t => t.Name == "upload_my_file");

string path = @"TestFiles/test.pdf";
string fileName = "test.pdf";

using (System.IO.Stream stream = File.Open(path, FileMode.Open, FileAccess.Read, FileShare.Read))
{
    await _cloudConvertAPI.UploadAsync(uploadTask.Result.Form.Url.ToString(), stream, fileName, uploadTask.Result.Form.Parameters);
}
The job stays in the waiting status until the input file has been uploaded, and continues as soon as the upload completes.
Do not hardcode the upload URL or form parameters, because they change dynamically! The keys and values on this page are just examples.
S3
To import a file from your S3-compatible object storage, create a job with an import/s3 task.
Task Parameters
operation: import/s3
bucket (required): The Amazon S3 bucket from which to download the file.
region: The Amazon S3 region, e.g. us-west-2 or eu-west-1.
endpoint: Use a custom S3 API endpoint. The default endpoint is built from the configured region. Makes it possible to use other S3-compatible storage services (e.g. DigitalOcean).
key: S3 key of the input file (the filename in the bucket, including path).
key_prefix: Alternatively to using key, you can specify a key prefix for importing multiple files at once.
access_key_id: The Amazon S3 access key ID. It needs to have the s3:GetObject permission.
secret_access_key: The Amazon S3 secret access key.
session_token: Session token for auth using temporary credentials (AWS Security Token Service).
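A sketch of such a task definition; the bucket, key, and credential values below are placeholders:

```python
# Hypothetical import/s3 task definition; all values are placeholders.
task = {
    "operation": "import/s3",
    "bucket": "my-input-bucket",
    "region": "eu-west-1",
    "key": "uploads/input.docx",       # the filename in the bucket, including path
    "access_key_id": "AKIAEXAMPLE",    # needs the s3:GetObject permission
    "secret_access_key": "secret-example",
}
```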
Azure Blob Storage
To import a file from an Azure blob container, create a job with an import/azure/blob task.
Task Parameters
operation: import/azure/blob
storage_account: The name of the Azure storage account (this is the string before .blob.core.windows.net).
storage_access_key: The Azure secret key. Only required if you are not providing a SAS token.
sas_token: The Azure SAS token.
container: The Azure container name.
blob: Azure blob name of the input file (the filename in the container, including path).
blob_prefix: Alternatively to using blob, you can specify a blob prefix for importing multiple files at once.
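A sketch of such a task definition, authenticating with a SAS token (all values are placeholders):

```python
# Hypothetical import/azure/blob task definition; all values are placeholders.
task = {
    "operation": "import/azure/blob",
    "storage_account": "mystorageaccount",     # the part before .blob.core.windows.net
    "sas_token": "sv=2020-08-04&sig=example",  # or provide storage_access_key instead
    "container": "inputs",
    "blob": "uploads/input.docx",
}
```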
Google Cloud Storage
To import a file from a Google Cloud Storage bucket, create a job with an import/google-cloud-storage task.
Task Parameters
operation: import/google-cloud-storage
project_id: The Google Cloud project ID (api-project-...).
bucket: The Google Cloud Storage bucket name.
client_email: The client email of the service account to use (...@api-project-....iam.gserviceaccount.com).
private_key: The private key of the service account.
file: Filename of the input file (the filename in the bucket, including path).
file_prefix: Alternatively to using file, you can specify a file prefix for importing multiple files at once.
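A sketch of such a task definition; the project, bucket, and service account values are placeholders:

```python
# Hypothetical import/google-cloud-storage task definition; all values are placeholders.
task = {
    "operation": "import/google-cloud-storage",
    "project_id": "api-project-example",
    "bucket": "my-input-bucket",
    "client_email": "importer@api-project-example.iam.gserviceaccount.com",
    "private_key": "-----BEGIN PRIVATE KEY-----\nexample\n-----END PRIVATE KEY-----",
    "file": "uploads/input.docx",
}
```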
OpenStack
To import a file from OpenStack Object Storage (Swift), create a job with an import/openstack task.
Task Parameters
operation: import/openstack
auth_url: The URL of the OpenStack Identity endpoint (without version).
region: The OpenStack region.
container: The name of the OpenStack Storage container.
username: The OpenStack username.
password: The OpenStack password.
file: Filename of the input file (the filename in the container, including path).
file_prefix: Alternatively to using file, you can specify a file prefix for importing multiple files at once.
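A sketch of such a task definition, using a file prefix to import several files at once (all values are placeholders):

```python
# Hypothetical import/openstack task definition; all values are placeholders.
task = {
    "operation": "import/openstack",
    "auth_url": "https://identity.example.com",  # Identity endpoint, without version
    "region": "RegionOne",
    "container": "inputs",
    "username": "demo",
    "password": "secret-example",
    "file_prefix": "uploads/",  # imports every file under this prefix instead of a single file
}
```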
SFTP
To import a file from your SFTP server, create a job with an import/sftp task.
Task Parameters
operation: import/sftp
host: The SFTP server hostname.
port: The SFTP port. Defaults to 22.
username: The SFTP username.
password: The SFTP password.
private_key: Alternatively to using password, you can provide a private key.
file: Filename of the input file (the filename on the server, including path).
path: Alternatively to using file, you can specify a path for importing multiple files at once.
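A sketch of such a task definition; the host, credentials, and filename are placeholders:

```python
# Hypothetical import/sftp task definition; all values are placeholders.
task = {
    "operation": "import/sftp",
    "host": "sftp.example.com",
    "port": 22,                     # the default SFTP port
    "username": "demo",
    "password": "secret-example",   # or provide private_key instead
    "file": "uploads/input.docx",   # the filename on the server, including path
}
```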