Cloud storage is an essential part of most web and mobile applications, and Google Cloud Storage is a popular choice. In this blog, we will see how to upload files to Google Cloud Storage from a Node.js server. Let's jump into it.
Step one: Getting keys for authentication
Before uploading anything to Cloud Storage, we must have authorization for it. For that, we have to follow these steps:
- Create a new service account if you don't have one.
- Click on Create service account.
- Give your service account a name, skip the optional entries for now, and click Done, as we don't need those for this task.
- You can now see your newly created service account in the service account list. It doesn't have any key yet, so we need to create one.
- Open the service account, click on Add key, then Create new key. Keep the key in JSON format, and download it to the root of your Node.js project.
Step two: Giving the service account write permission to the bucket
- Get the email address of the service account you just created above.
- Go to the bucket you want to upload content to, and click on the PERMISSIONS tab to find the GRANT ACCESS button.
- Paste the service account's email address into the New principals input; when your service account appears in the list, click on it.
- Under Assign roles, select Storage Object Creator.
- Your service account now has permission to create new objects in the bucket. Click Save.
Step three: Uploading an image to Google Cloud Storage
For this step, you must have Node.js and npm installed on your computer, and a bucket must already be created. Although it is possible to create the bucket from the Node.js application, we are not doing that here.
- Create a file called index.js at the root of your project folder, and keep the key file you created in Step one at the same level.
- Run npm i @google-cloud/storage
- Now inside the index.js file, add these lines:
const { Storage } = require('@google-cloud/storage')

// Initialize storage with the service account key file
const storage = new Storage({
  keyFilename: `./key-file-from-service-account.json`,
})

const bucketName = 'my-test-bucket'
const bucket = storage.bucket(bucketName)

// Sending the upload request
bucket.upload(
  `./image_to_upload.jpeg`,
  {
    destination: `someFolderInBucket/image_to_upload.jpeg`,
  },
  function (err, file) {
    if (err) {
      console.error(`Error uploading image image_to_upload.jpeg: ${err}`)
    } else {
      console.log(`Image image_to_upload.jpeg uploaded to ${bucketName}.`)
    }
  }
)
Make sure the key file and the image are in the root folder of your project, or use the appropriate paths if they are stored somewhere else.
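For example, if you keep these files elsewhere, one option is to resolve the paths relative to the script itself rather than the current working directory. A minimal sketch (the assets subfolder below is just a hypothetical example):

const path = require('path')
const { Storage } = require('@google-cloud/storage')

// Build absolute paths relative to this script's location
const keyFilename = path.join(__dirname, 'key-file-from-service-account.json')
const imagePath = path.join(__dirname, 'assets', 'image_to_upload.jpeg') // hypothetical subfolder

const storage = new Storage({ keyFilename })
storage
  .bucket('my-test-bucket')
  .upload(imagePath, { destination: `someFolderInBucket/image_to_upload.jpeg` })
  .catch(console.error)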
Now run node index.js from the command line. It should upload the image file to the Google Cloud Storage bucket.
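As a side note, bucket.upload also returns a promise when no callback is passed, so the same upload can be written with async/await. A minimal sketch using the same bucket and file names as above:

const { Storage } = require('@google-cloud/storage')

const storage = new Storage({
  keyFilename: `./key-file-from-service-account.json`,
})
const bucket = storage.bucket('my-test-bucket')

async function uploadImage() {
  try {
    // Without a callback, upload() resolves to an array whose first item is the File object
    const [file] = await bucket.upload(`./image_to_upload.jpeg`, {
      destination: `someFolderInBucket/image_to_upload.jpeg`,
    })
    console.log(`Image ${file.name} uploaded to ${bucket.name}.`)
  } catch (err) {
    console.error(`Error uploading image: ${err}`)
  }
}

uploadImage()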
Additionally, don't forget to add your key file to your .gitignore file.
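For example, assuming the key file name used in this post, your .gitignore could contain a line like this:

# Service account key - never commit this to the repository
key-file-from-service-account.json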
Extra step: Making the content public after uploading
If you want to make the object publicly accessible from the internet, you can make it public from Node.js right after a successful upload. Just add these extra lines after the successfully-uploaded message:
const { Storage } = require('@google-cloud/storage')

// Initialize storage with the service account key file
const storage = new Storage({
  keyFilename: `./key-file-from-service-account.json`,
})

const bucketName = 'my-test-bucket'
const bucket = storage.bucket(bucketName)

// Sending the upload request
bucket.upload(
  `./image_to_upload.jpeg`,
  {
    destination: `someFolderInBucket/image_to_upload.jpeg`,
  },
  function (err, file) {
    if (err) {
      console.error(`Error uploading image image_to_upload.jpeg: ${err}`)
    } else {
      console.log(`Image image_to_upload.jpeg uploaded to ${bucketName}.`)

      // Making the file public to the internet
      file.makePublic(function (err) {
        if (err) {
          console.error(`Error making file public: ${err}`)
        } else {
          console.log(`File ${file.name} is now public.`)
          const publicUrl = file.publicUrl()
          console.log(`Public URL for ${file.name}: ${publicUrl}`)
        }
      })
    }
  }
)
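Alternatively, the library's upload options may let you skip the separate makePublic call: as far as I recall, passing public: true applies the publicRead predefined ACL during the upload itself. Treat this as a sketch and double-check the @google-cloud/storage docs:

const { Storage } = require('@google-cloud/storage')

const storage = new Storage({
  keyFilename: `./key-file-from-service-account.json`,
})
const bucket = storage.bucket('my-test-bucket')

bucket.upload(
  `./image_to_upload.jpeg`,
  {
    destination: `someFolderInBucket/image_to_upload.jpeg`,
    public: true, // assumption: shorthand for the publicRead predefined ACL
  },
  function (err, file) {
    if (err) {
      console.error(`Error uploading image: ${err}`)
    } else {
      console.log(`Public URL: ${file.publicUrl()}`)
    }
  }
)

Note that either approach relies on per-object ACLs, so it only works if the bucket does not have uniform bucket-level access enabled.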
Happy to help if you get stuck anywhere.
Top comments (5)
Thank you so much! I was getting frustrated trying to figure out how to get GCS to authenticate the connection.
@donald_moore_39ad1f1e01dc
You are welcome, glad it helped you.
Hey, it's working fine for smaller files, but for larger files I am getting:
=>ERROR updating transcoding completion Error: Client network socket disconnected before secure TLS connection was established
at connResetException (/internal/errors.js:639:14)
at TLSSocket.onConnectEnd (/_tls_wrap.js:1570:19)
at TLSSocket.emit (/events.js:412:35)
at endReadableNT (/internal/streams/readable.js:1333:12)
at processTicksAndRejections
Thanks Kamal, it was helpful.
Thank you.