Uploading Images from Google Sheets to Cloudinary

Recently, I started using Cloudinary to upload images directly from Google Sheets, and the experience has been outstanding. In this blog post, I will explain what I am doing and how I approach uploading those images.

The backstory: earlier, I was using Imgur to upload Google Drive images from inside Google Sheets so they could be imported into WordPress for a programmatic SEO setup. The setup worked just fine until Imgur started acting up and I started seeking an alternative.

Upon looking around, almost every website I came across suggested Cloudinary for asset management, so I finally decided to give it a try.

The Google Sheets Apps Script

In one column (say column A), I had the Google Drive links of all the images to be uploaded to Cloudinary. But those links were not directly fetchable, so I had to make the Google Drive folder containing the images public. I also added another column that extracts the file ID from the original Google Drive link and converts it into a public direct link. Here’s the formula for that:

=""&(RIGHT(A2, 33))

After that, I took the help of ChatGPT, used some common sense, and finally got the Google Sheets Apps Script working as it should. Here’s the Apps Script function I created; it takes the Google Drive public link from a column (say B) and returns the Cloudinary image link for the next column (say C):

function uploadToCloudinary(imageUrl) {
  var cloudinaryCloudName = "YOUR_CLOUD_NAME";
  var unsignedPresetName = "YOUR_UPLOAD_PRESET_NAME";

  // Construct the Cloudinary API endpoint URL
  var url = "https://api.cloudinary.com/v1_1/" + cloudinaryCloudName + "/image/upload";

  // Fetch the image
  var image = UrlFetchApp.fetch(imageUrl).getBlob();

  // Create the payload for the API request
  var payload = {
    "file": image,
    "upload_preset": unsignedPresetName
  };

  // Set the options for the API request
  var options = {
    "method": "post",
    "payload": payload
  };

  // Make the API request
  var response = UrlFetchApp.fetch(url, options);

  // Parse the JSON response
  var json = response.getContentText();
  var data = JSON.parse(json);
  var compressedUrl = data.secure_url;

  // Return the secure URL of the uploaded image
  return compressedUrl;
}

The code was working just fine!
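With the script saved under Extensions > Apps Script, it can be called like any other function straight from the sheet. This is how I’d wire it up for the columns above (the cell reference is illustrative):

```
=uploadToCloudinary(B2)
```

One caveat: custom functions can re-run when the sheet recalculates, which would re-upload the same image and eat into the Cloudinary quota, so it’s worth copying the resulting URLs and pasting them back as plain values once they’re generated.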

Optimizing the uploaded images

I noticed in the Cloudinary documentation that it can also automatically serve optimized images by simply adding q_auto to the image URL. Below is an example:

Original image (889 KB):

Optimized image (315 KB):

Yes, just adding q_auto automatically served an optimized version of the image. But the original image was still taking up 889 KB of storage, and since there is a limit on the storage space, I wanted to store only the optimized version of the image.
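For reference, q_auto is just a transformation segment inserted into the delivery URL path, right after the /image/upload/ segment. A tiny sketch of that rewrite (the helper name and the demo URL are mine, purely for illustration):

```javascript
// Hypothetical helper: insert the q_auto transformation into a
// Cloudinary delivery URL, right after the /image/upload/ segment.
function withQAuto(deliveryUrl) {
  return deliveryUrl.replace("/image/upload/", "/image/upload/q_auto/");
}

// withQAuto("https://res.cloudinary.com/demo/image/upload/sample.jpg")
//   returns "https://res.cloudinary.com/demo/image/upload/q_auto/sample.jpg"
```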

It turned out there was an option to optimize the images while uploading by modifying the payload as follows:

var payload = {
    "file": image,
    "upload_preset": unsignedPresetName,
    "eager": "q_auto",
    "eager_async": true
};

But this didn’t work because I was using unsigned uploads (in fact, it wasn’t possible to sign the uploads inside Google Sheets, or maybe I just couldn’t make it work), so the eager transformation was never applied.
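For completeness, here is roughly what signing would involve, in case someone wants to try it: Cloudinary’s signature is the SHA-1 hex digest of the upload parameters (everything except the file and api_key) sorted alphabetically, joined with &, with the API secret appended. The string-building part is sketched below; in Apps Script the digest itself would come from Utilities.computeDigest. The function name and placeholder values are mine, and I haven’t verified this end to end:

```javascript
// Hypothetical sketch: build the Cloudinary "string to sign" for a signed
// upload. The SHA-1 hex digest of the returned string (computed in Apps
// Script via Utilities.computeDigest(Utilities.DigestAlgorithm.SHA_1, ...))
// would be sent as the "signature" parameter alongside api_key.
function buildStringToSign(params, apiSecret) {
  var toSign = Object.keys(params)
    .sort()
    .map(function (key) { return key + "=" + params[key]; })
    .join("&");
  return toSign + apiSecret;
}

// buildStringToSign({ timestamp: 1700000000, eager: "q_auto" }, "SECRET")
//   returns "eager=q_auto&timestamp=1700000000SECRET"
```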

Upon researching, I found out that I could also automatically compress/optimize images after uploading. I just had to go to Settings > Upload, edit the unsigned upload preset, go to the Upload Manipulations tab, and set Quality to Auto (as you can see in the screenshot below).

Cloudinary Dashboard - Image Optimization Auto

After that, all the uploaded images, whether PNGs, JPGs, or anything else, were automatically optimized before being stored. Most of my images were around 200 KB before uploading, and after uploading they were just around 60 KB.

Yes, now I was able to store three optimized images in the same space as one original image, and that too without compromising the quality.


Hope this helps you in some way.

If you get stuck somewhere, feel free to let me know in the comments below.
