
Transforming images in an S3 bucket using Serverless Functions and Node.js

Reviewed on 02 May 2022 • Published on 26 October 2021
  • Serverless
  • Functions
  • FaaS
  • Containers
  • Object-Storage

Serverless Functions can help you schedule the automated transformation of all *.jpg and *.png pictures stored in an Object Storage bucket.

In this tutorial, you will learn how to transform images stored in an S3 bucket using Serverless Functions and Node.js.

This project contains two functions:

  • The first connects to the source Object Storage bucket, gets all image files from it, then calls the second function to resize each image.
  • The second function gets a specific image (whose name is passed through an HTTP call), resizes it and pushes the resized image to a new bucket.
Requirements:

  • A Scaleway account with a valid API key (access key ID and secret key)
  • A source and a destination Object Storage bucket
  • Node.js and npm installed on your local machine

Creating the BucketScan function

  1. Set the following environment variables:

    • SOURCE_BUCKET: Name of your source bucket
    • S3_ENDPOINT_URL: Scaleway’s Object Storage URL (https://s3.{region}.scw.cloud)
    • ACCESS_KEY_ID: Your API key ID
    • ACCESS_KEY: Your API key
    • TRANSFORM_URL: URL of the function used to transform the images
  2. Create the BucketScan function:

    • Import dependencies:
    const AWS = require("aws-sdk");
    const https = require("https");
    • Import environment variables:
    const SOURCE_BUCKET = process.env.SOURCE_BUCKET;
    const S3_ENDPOINT_URL = process.env.S3_ENDPOINT_URL;
    const ID = process.env.ACCESS_KEY_ID;
    const SECRET = process.env.ACCESS_KEY;
    const TRANSFORM_URL = process.env.TRANSFORM_URL;
    • Connect to Object Storage:
    // Create the S3 service object
    const s3 = new AWS.S3({
      endpoint: S3_ENDPOINT_URL,
      credentials: {
        accessKeyId: ID,
        secretAccessKey: SECRET,
      },
    });

    // Configure parameters for the listObjectsV2 method
    const params = {
      Bucket: SOURCE_BUCKET,
    };
    • Initiate your function:
    exports.handle = async (event, context, callback) => {
    • Get all files from the source bucket:
    s3.listObjectsV2(params, function (err, data) {
      if (err) {
        console.log(err, err.stack); // an error occurred
      } else {
        let counter = 0;
        const contents = data.Contents;
        const total = contents.length;
    • Asynchronously call the ImageTransform function after having filtered out all non-image objects:
    contents.forEach(function (content) {
      // Infer the image type from the file suffix.
      const srcKey = content.Key;
      const typeMatch = srcKey.match(/\.([^.]*)$/);
      if (!typeMatch) {
        console.log("Could not determine the image type.");
      } else {
        const imageType = typeMatch[1].toLowerCase();
        // Check that the image type is supported
        if (["jpeg", "jpg", "png"].includes(imageType) !== true) {
          return console.log(`Unsupported image type: ${imageType}`);
        } else {
          try {
            // Encode the key: object keys may contain spaces or
            // unicode non-ASCII characters
            const url = TRANSFORM_URL + "?key=" + encodeURIComponent(srcKey);
            console.log(url);
            https.get(url);
            counter += 1;
          } catch (error) {
            console.log(error);
          }
        }
      }
    });
    console.log("Requested transformation for " + counter + " of " + total + " objects");
      }
    });
    };
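Once deployed, you can trigger BucketScan manually with a plain HTTP call. A minimal test, using a hypothetical function endpoint that you should replace with the URL shown for your own function in the console:

    # Trigger a scan of the source bucket (hypothetical endpoint)
    curl https://bucketscanexample.functions.fnc.fr-par.scw.cloud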

Creating the ImageTransformation function

The following steps are adapted from the AWS Lambda tutorial.

  1. Install sharp for the linuxmusl platform to ensure compatibility with Scaleway’s node runtime:

    npm install --platform=linuxmusl --arch=x64 sharp --ignore-scripts=false --save
  2. Set the following environment variables:

    • SOURCE_BUCKET: Name of your source bucket
    • S3_ENDPOINT_URL: Scaleway’s Object Storage URL (https://s3.{region}.scw.cloud)
    • ACCESS_KEY_ID: Your API key ID
    • ACCESS_KEY: Your API key
    • DESTINATION_BUCKET: Name of the destination bucket
    • RESIZED_WIDTH: Desired width of the resized image
  3. Create the ImageTransformation function:

    • Import dependencies:
    const AWS = require("aws-sdk");
    const sharp = require("sharp");
    • Import environment variables:
    const srcBucket = process.env.SOURCE_BUCKET;
    const S3_ENDPOINT_URL = process.env.S3_ENDPOINT_URL;
    const ID = process.env.ACCESS_KEY_ID;
    const SECRET = process.env.ACCESS_KEY;
    const dstBucket = process.env.DESTINATION_BUCKET;
    // Fall back to a 200px width if RESIZED_WIDTH is missing or out of range
    let width = parseInt(process.env.RESIZED_WIDTH, 10);
    if (isNaN(width) || width < 1 || width > 1000) {
      width = 200;
    }
    • Connect to Object Storage:
    const s3 = new AWS.S3({
      endpoint: S3_ENDPOINT_URL,
      credentials: {
        accessKeyId: ID,
        secretAccessKey: SECRET,
      },
    });
    • Get image information from the request call, using the event object, and derive the destination key:
    exports.handle = async (event, context, callback) => {
      // Read options from the event parameter.
      console.log("Reading options from event");
      // Object keys may have spaces or unicode non-ASCII characters.
      const srcKey = event.queryStringParameters.key;
      // Keep the same key for the resized image in the destination bucket
      const dstKey = srcKey;
    • Fetch the image from the source bucket:
    try {
      const params = {
        Bucket: srcBucket,
        Key: srcKey,
      };
      var origimage = await s3.getObject(params).promise();
    } catch (error) {
      console.log(error);
      return;
    }
    • Transform the image based on the width specified in the environment variable:
    try {
      var buffer = await sharp(origimage.Body)
        .resize({ width, withoutEnlargement: true })
        .toBuffer();
    } catch (error) {
      console.log(error);
      return;
    }
    • Push the image to the destination bucket:
    try {
      const destparams = {
        Bucket: dstBucket,
        Key: dstKey,
        Body: buffer,
        ContentType: "image",
      };
      const putResult = await s3.putObject(destparams).promise();
    } catch (error) {
      console.log(error);
      return;
    }
    • Log the successful transformation and end the function:
    console.log(
      "Successfully resized " +
        srcBucket +
        "/" +
        srcKey +
        " and uploaded to " +
        dstBucket +
        "/" +
        dstKey
    );
    return {
      statusCode: 201,
      body: JSON.stringify({
        status: "ok",
        message:
          "Image " +
          srcKey +
          " has successfully been resized and pushed to the bucket " +
          dstBucket,
      }),
      headers: {
        "Content-Type": "application/json",
      },
    };
    };
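You can verify the function in isolation by calling it directly with an object key as a query string parameter. A minimal example, assuming an image named example.jpg exists in the source bucket and using a placeholder function URL:

    # Resize a single image (hypothetical endpoint and key)
    curl "https://imagetransformexample.functions.fnc.fr-par.scw.cloud?key=example.jpg"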

Pushing the two functions to Scaleway Serverless

Please refer to the Serverless Documentation to learn how to package and deploy your functions using the Serverless Framework or a zip file.
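As a quick sketch, assuming both handler files and the serverless.yml from the appendix below sit in the same directory, a Serverless Framework deployment looks like this:

    # Install the Serverless Framework and the Scaleway plugin
    npm install -g serverless
    npm install --save-dev serverless-scaleway-functions

    # Deploy both functions described in serverless.yml
    serverless deploy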

Appendix

Below you can find a serverless.yml configuration file to use with the Serverless Framework:

service: fileupload
configValidationMode: off

provider:
  name: scaleway
  runtime: node14
  # Global environment variables - used in every function
  env:
    ACCESS_KEY: {Scaleway API key related to source and destination buckets}
    ACCESS_KEY_ID: {Scaleway API key ID}
    SOURCE_BUCKET: {source bucket name}
    DESTINATION_BUCKET: {destination bucket name}
    RESIZED_WIDTH: {desired width of the resized images, e.g. 200}
    S3_ENDPOINT_URL: https://s3.fr-par.scw.cloud
    TRANSFORM_URL: https://{your function}.functions.fnc.fr-par.scw.cloud
  # the path to the credentials file needs to be absolute
  scwToken: {Scaleway API token related to the Project ID}
  scwProject: {Scaleway Project ID}

plugins:
  - serverless-scaleway-functions

patterns:
  - '!.gitignore'
  - '!.git/**'

functions:
  imagetransform:
    handler: ImageTransform.handle
    memoryLimit: 1024
    minScale: 1
  bucketscan:
    handler: BucketScan.handle
    memoryLimit: 1024
    minScale: 0
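The introduction mentions scheduling the transformation. As a sketch, assuming the serverless-scaleway-functions plugin's schedule event support, you can attach a CRON trigger to the bucketscan function so the scan runs automatically:

functions:
  bucketscan:
    handler: BucketScan.handle
    memoryLimit: 1024
    minScale: 0
    events:
      # Run the bucket scan every day at midnight (CRON syntax)
      - schedule:
          rate: '0 0 * * *'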