# How to upload files to S3 using @aws-sdk/client-s3
Using @aws-sdk/client-s3 from Node.js to upload single files and entire folders to AWS S3, plus the gotchas you only learn the hard way (content type, leading slashes, Windows paths).
## Intro
On a recent project I needed a script that uploads static files to AWS S3. The environment running the script couldn't have aws-cli installed, so I ended up using @aws-sdk/client-s3 directly to get the job done.
This post walks through how to use @aws-sdk/client-s3 to upload files and folders to S3, plus a few non-obvious things to watch out for along the way.
## Installation
Besides the SDK itself, we'll also need `mime-types` and `slash`. I'll explain why later in the post.
```sh
yarn add @aws-sdk/client-s3 mime-types slash
```

## How does `@aws-sdk/client-s3` work?
The flow is pretty straightforward:
- Initialize a client with your AWS credentials.
- Create a `PutObjectCommand` pointing at the bucket, key, and file contents.
- Tell the client to `send` the command.
```js
const fs = require('fs');
const { S3Client, PutObjectCommand } = require('@aws-sdk/client-s3');

// 1. Init the client.
const { AWS_REGION, AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, S3_BUCKET } =
  process.env;

const client = new S3Client({
  region: AWS_REGION,
  credentials: {
    accessKeyId: AWS_ACCESS_KEY_ID,
    secretAccessKey: AWS_SECRET_ACCESS_KEY,
  },
});

// 2. Build the upload command.
const pathFile = './my-file.txt';
const params = {
  Bucket: S3_BUCKET, // bucket name, e.g. 'sample-bucket-101'.
  Key: 'my-file.txt', // object name, e.g. 'my-file.txt'.
  Body: fs.readFileSync(pathFile), // file content.
  ContentType: 'text/plain', // file content type.
};
const uploadCommand = new PutObjectCommand(params);

// 3. Send the command.
client
  .send(uploadCommand)
  .then(() => {
    console.log('Success uploading the file to S3.');
  })
  .catch(err => {
    console.log('Error uploading file', err);
  });
```

## Things worth knowing
The library is solid, but a few details aren't obvious from the docs. Here's a short list of gotchas — hopefully it saves you the hours I lost looking these up:
- Unlike the AWS CLI, `@aws-sdk/client-s3` only uploads one file at a time. If you want to upload a set of files or a whole folder, you need to write the orchestration in JavaScript yourself (we'll do that in the next section).

- You have to set `ContentType` per file or `@aws-sdk/client-s3` defaults to `"application/octet-stream"`. The `mime-types` package gives you a helper to derive the content type from a file extension. There's a related issue here.

  ```js
  const mime = require('mime-types');

  const uploadCommand = new PutObjectCommand({
    // ...
    ContentType: mime.lookup(filePath),
  });
  client.send(uploadCommand);
  ```

- If `Key` looks like a path, `@aws-sdk/client-s3` will create matching folders inside the bucket. With `static/js/app.js` as an example:

  ```js
  const uploadCommand = new PutObjectCommand({
    // ...
    Key: 'static/js/app.js',
  });
  client.send(uploadCommand);
  ```

  You'll end up with this folder structure inside the bucket:

  ```
  my-bucket-s3/
  └── static/
      └── js/
          └── app.js
  ```

- Related: if `Key` starts with `/`, the SDK will create a folder literally called `/`. Taking `/app.js`:

  ```js
  const uploadCommand = new PutObjectCommand({
    // ...
    Key: '/app.js',
  });
  client.send(uploadCommand);
  ```

  You get:

  ```
  my-bucket-s3/
  └── /
      └── app.js
  ```

- If you're going to use a path as a `Key`, remember to convert Windows backslashes to forward slashes. The `slash` package handles this:

  ```js
  const slash = require('slash');

  const filePath = 'folder\\dist\\app.js';
  const uploadCommand = new PutObjectCommand({
    // ...
    Key: slash(filePath), // 'folder\\dist\\app.js' → 'folder/dist/app.js'
  });
  client.send(uploadCommand);
  ```
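Those last three gotchas boil down to one rule: normalize the key before you upload. If you'd rather not pull in the `slash` dependency, a minimal sketch of such a helper could look like this (`toS3Key` is a hypothetical name; unlike the real `slash` package, it doesn't special-case extended-length Windows paths):

```javascript
// Hypothetical helper: build a safe S3 key from a local file path.
// Simplified stand-in for the slash package (does not special-case
// extended-length Windows paths like '\\?\C:\...').
function toS3Key(filePath) {
  // Windows backslashes -> forward slashes.
  const posix = filePath.replace(/\\/g, '/');
  // Drop a leading slash so we don't create a folder named '/'.
  return posix.startsWith('/') ? posix.slice(1) : posix;
}

console.log(toS3Key('folder\\dist\\app.js')); // 'folder/dist/app.js'
console.log(toS3Key('/app.js')); // 'app.js'
```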
## Uploading a whole folder
Since `@aws-sdk/client-s3` uploads a single file at a time, we'll need a couple of helpers:
- `walk`: recursively visits every file in a folder.
- `uploadFile`: uploads a single file to S3.
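Uploading one file at a time also means chaining the uploads in series. The promise-chaining pattern we'll rely on is easier to see in isolation first (a generic sketch, with `fakeUpload` as a stand-in for a real upload):

```javascript
// Run async tasks strictly one after another by folding the list
// into a promise chain. fakeUpload is a stand-in for a real upload.
const tasks = ['a.js', 'b.js', 'c.js'];
const finished = [];

function fakeUpload(name) {
  return new Promise(resolve =>
    setTimeout(() => {
      finished.push(name);
      resolve(name);
    }, 10)
  );
}

tasks
  .reduce((chain, name) => chain.then(() => fakeUpload(name)), Promise.resolve())
  .then(() => {
    // Order is preserved because each task waits for the previous one.
    console.log(finished.join(', '));
  });
```

Each `.then` only fires once the previous upload resolves, so the bucket never sees more than one in-flight request from this script.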
For the example, assume the folder we want to upload is `my-app/dist` (the static assets for the app). Inside the bucket, we want them under a `static` prefix so they end up reachable at `your-public-s3-url.com/static/`:
```
my-app/
├── dist/   <== upload this to S3
│   ├── app.js
│   ├── manifest.json
│   ├── chunks/
│   │   ├── app.js
│   │   ├── chunk-1.js
│   │   ├── chunk-2.js
│   │   └── chunk-3.js
│   └── images/
│       └── background.png
└── src/
```

### walk
This helper walks a folder recursively — it'll also descend into subfolders. The second argument is a callback that runs for each file:
```js
const path = require('path');
const fs = require('fs');

function walk(dir, callback) {
  const files = fs.readdirSync(dir);
  files.forEach(file => {
    const filePath = path.join(dir, file);
    if (fs.statSync(filePath).isDirectory()) {
      // recurse into directories
      walk(filePath, callback);
    } else {
      // run the callback on each file we find
      callback(filePath);
    }
  });
}
```

### uploadFile
This one uploads a single file to S3. Here's where `mime-types` and `slash` come in:
```js
const fs = require('fs');
const path = require('path');
const mime = require('mime-types');
const { S3Client, PutObjectCommand } = require('@aws-sdk/client-s3');
const slash = require('slash');

const { AWS_REGION, AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, S3_BUCKET } =
  process.env;

const client = new S3Client({
  region: AWS_REGION,
  credentials: {
    accessKeyId: AWS_ACCESS_KEY_ID,
    secretAccessKey: AWS_SECRET_ACCESS_KEY,
  },
});

function uploadFile({ filePath, folderPath, bucketBasePath = '' }) {
  /**
   * Strip the folderPath to get the path relative to the upload root.
   * '/project/my-app/dist/chunks/app.js' → 'chunks/app.js'
   */
  const filename = filePath.replace(folderPath, '');

  /**
   * Prepend the destination folder.
   * 'chunks/chunk-1.js' → 'static/chunks/chunk-1.js'
   */
  const s3File = slash(path.join(bucketBasePath, filename));

  /**
   * Drop a leading slash if there is one, otherwise we'd
   * create a folder literally named '/' inside the bucket.
   */
  const s3KeyFile = s3File[0] === '/' ? s3File.slice(1) : s3File;

  const params = {
    Bucket: S3_BUCKET,
    Key: s3KeyFile,
    Body: fs.readFileSync(filePath),
    // mime.lookup returns false for unknown extensions,
    // so fall back to the generic default in that case.
    ContentType: mime.lookup(filePath) || 'application/octet-stream',
  };

  console.log('Uploading file: ', s3KeyFile);
  const uploadCommand = new PutObjectCommand(params);
  return client.send(uploadCommand).catch(err => {
    console.log('Error uploading file', err);
    process.exit(1);
  });
}
```

Finally, glue both helpers together to upload the whole folder:
```js
async function uploadFolderToS3(folderPath, bucketBasePath) {
  const filesArr = [];

  // collect every file
  walk(folderPath, filePath => {
    filesArr.push(filePath);
  });

  // upload them in series
  await filesArr.reduce((p, filePath) => {
    return p.then(() => uploadFile({ filePath, folderPath, bucketBasePath }));
  }, Promise.resolve());
}

const distFolderPath = path.join(process.cwd(), './dist');
// 'static' is the destination prefix inside the bucket
uploadFolderToS3(distFolderPath, 'static');
```

## Wrapping up
Using @aws-sdk/client-s3 itself is straightforward. The work goes into the orchestration around it — in this case, walking a folder and uploading every file individually with the right key and content type.
Thanks for reading.