Kamal Hossain
Upload file to Google Cloud Storage from a Node.js server

Cloud storage is an essential part of web and mobile applications, and Google Cloud Storage is a popular choice. In this blog, we will see how to upload files to Google Cloud Storage from a Node.js server. Let's jump into it.

Step one: Getting keys for authentication

Before uploading anything to Cloud Storage, we must have authorization for it. For that, we have to do the following steps:

  • Create a new service account if you don't have one.
  • Click on Create service account.
  • Give your service account a name, skip the optional entries for now, and click Done, as we don't need those for this task.

  • Now you can see your newly created service account in the service account list. As you can see, it doesn't have any key yet, so we need to create one.

  • Open the service account and click on Add key, then Create new key. Keep it in JSON format, and then download the key file to the root of your Node.js project.
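
For reference, the downloaded key is a JSON file shaped roughly like the following; every value here is a placeholder, and the exact set of fields may vary slightly.

{
  "type": "service_account",
  "project_id": "your-project-id",
  "private_key_id": "<key id>",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "your-service-account@your-project-id.iam.gserviceaccount.com",
  "client_id": "<numeric client id>",
  "token_uri": "https://oauth2.googleapis.com/token"
}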

Step two: Giving the service account write permission on the bucket

  • Get the email address of the service account you just created above.
  • Go to the bucket you want to upload content to, and click on the PERMISSIONS tab to find the GRANT ACCESS button.

  • Paste the email address of the service account into the New principals input; your service account should appear, then click on it.


  • Now, under Assign roles, select Storage Object Creator.


  • Click Save. Your service account now has permission to create new objects in the bucket.
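
If you want to double-check the grant from code, here is a minimal sketch (not part of the original steps) that asks Cloud Storage which of the listed permissions the credential actually holds. It assumes the key file from step one and the bucket name my-test-bucket used later in this post.

const { Storage } = require('@google-cloud/storage')

// Initialize storage with the key file from step one
const storage = new Storage({
  keyFilename: `./key-file-from-service-account.json`,
})

const bucket = storage.bucket('my-test-bucket')

async function checkPermission() {
  // testPermissions reports which of the requested permissions the caller has
  const [permissions] = await bucket.iam.testPermissions(['storage.objects.create'])
  console.log('Permission check result:', permissions)
}

checkPermission().catch((err) => {
  console.error(`Error checking permissions: ${err}`)
})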

Step three: Upload an image to Google Cloud Storage

For these steps, you must have Node.js and npm installed on your computer. A bucket must also already exist. Although it is possible to create the bucket from the Node.js application, we are not doing that here.
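
For completeness, creating a bucket from Node.js with the same client library would look roughly like the sketch below. This is only a sketch and not part of the steps that follow; the credential used would also need bucket-creation rights, which the Storage Object Creator role alone does not grant.

const { Storage } = require('@google-cloud/storage')

// Initialize storage with the key file from step one
const storage = new Storage({
  keyFilename: `./key-file-from-service-account.json`,
})

async function createBucket() {
  // storage.createBucket resolves to the newly created Bucket object
  const [bucket] = await storage.createBucket('my-test-bucket')
  console.log(`Bucket ${bucket.name} created.`)
}

createBucket().catch(console.error)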

  • Create a file called index.js at the root of your project folder, and keep the key file you created in step one at the same level.
  • Run npm i @google-cloud/storage
  • Now inside the index.js file, add these lines:


const { Storage } = require('@google-cloud/storage')

// Initialize storage
const storage = new Storage({
  keyFilename: `./key-file-from-service-account.json`,
})

const bucketName = 'my-test-bucket'
const bucket = storage.bucket(bucketName)

// Sending the upload request
bucket.upload(
  `./image_to_upload.jpeg`,
  {
    destination: `someFolderInBucket/image_to_upload.jpeg`,
  },
  function (err, file) {
    if (err) {
      console.error(`Error uploading image image_to_upload.jpeg: ${err}`)
    } else {
      console.log(`Image image_to_upload.jpeg uploaded to ${bucketName}.`)
    }
  }
)




  • Make sure your key file and the image are in the root folder of your project, or use the appropriate paths if these files are stored in another location.

  • Now run node index.js from the command line. It should upload the image file to the Google Cloud Storage bucket.
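
If you prefer promises over callbacks, the same upload can also be written with async/await; here is a minimal sketch assuming the same key file, bucket name, and image path as in the callback example above. When no callback is passed, bucket.upload returns a promise.

const { Storage } = require('@google-cloud/storage')

// Initialize storage
const storage = new Storage({
  keyFilename: `./key-file-from-service-account.json`,
})

const bucketName = 'my-test-bucket'
const bucket = storage.bucket(bucketName)

async function uploadImage() {
  // Without a callback, bucket.upload resolves to [File]
  const [file] = await bucket.upload(`./image_to_upload.jpeg`, {
    destination: `someFolderInBucket/image_to_upload.jpeg`,
  })
  console.log(`Image ${file.name} uploaded to ${bucketName}.`)
}

uploadImage().catch((err) => {
  console.error(`Error uploading image: ${err}`)
})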

Additionally, don't forget to add your key file to your .gitignore file.
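
For example, assuming the key file name used throughout this post, the .gitignore entry is a single line:

# .gitignore
key-file-from-service-account.json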

Extra step: Making the content public after uploading

If you want to make the object publicly accessible from the internet, you can make it public from Node.js right after a successful upload. Just add these extra lines after the "uploaded" log message.



const { Storage } = require('@google-cloud/storage')

// Initialize storage
const storage = new Storage({
  keyFilename: `./key-file-from-service-account.json`,
})

const bucketName = 'my-test-bucket'
const bucket = storage.bucket(bucketName)

// Sending the upload request
bucket.upload(
  `./image_to_upload.jpeg`,
  {
    destination: `someFolderInBucket/image_to_upload.jpeg`,
  },
  function (err, file) {
    if (err) {
      console.error(`Error uploading image image_to_upload.jpeg: ${err}`)
    } else {
      console.log(`Image image_to_upload.jpeg uploaded to ${bucketName}.`)

      // Making the file public to the internet
      file.makePublic(function (err) {
        if (err) {
          console.error(`Error making file public: ${err}`)
        } else {
          console.log(`File ${file.name} is now public.`)
          const publicUrl = file.publicUrl()
          console.log(`Public URL for ${file.name}: ${publicUrl}`)
        }
      })
    }
  }
)




Happy to help if you get stuck anywhere.

Top comments (5)

Donald Moore

Thank you so much! I was getting frustrated trying to figure out how to get GCS to authenticate the connection.

Kamal Hossain

@donald_moore_39ad1f1e01dc
You are welcome, glad it helped you.

Arun V

Hey, it's working fine for smaller files, but for larger files I am getting:
=>ERROR updating transcoding completion Error: Client network socket disconnected before secure TLS connection was established
at connResetException (/internal/errors.js:639:14)
at TLSSocket.onConnectEnd (/_tls_wrap.js:1570:19)
at TLSSocket.emit (/events.js:412:35)
at endReadableNT (/internal/streams/readable.js:1333:12)
at processTicksAndRejections

Jai Mohan

Thanks Kamal, it was helpful.

Innocent-Bern

Thank you.