The Why
At work this past week we had a hackathon-type event we called 'Innovation Week'. A buddy and I decided we wanted to do some predictive maintenance on customer systems. The main issue? We don't have their data. We don't even have everything we would need to do it with our data. Also, how would we get their data? Internet of Things (IoT) measurement systems, of course. So, in a week, I needed to write a heat transfer system simulator and then get the measurement points to communicate back to a dashboard. This is the reason for faking IoT - I didn't want to buy actual IoT measurement systems and try to hook them up to an actual heat transfer system. Even if I could, I wouldn't get data fast enough to do predictive maintenance, so I needed a way to simulate the devices so I could demonstrate a working prototype. I chose to use node.js because I'm trying to learn more JS and I think more server-side than client-side.
Why write a tutorial on this? It turns out the whole process was genuinely hard for someone like me. There are a few tutorials out there about hooking up a Raspberry Pi, but most of them are old enough that some of the steps in AWS, and even in the SDK, are out of date. I had to figure it out myself, and I wanted to document the process for future me and anybody else out there like me.
The Assumptions
- You already have an AWS account
- You're OK using node.js
- You're not super concerned with having overly permissive IAM roles in AWS
- You're at least moderately comfortable with provisioning servers from a provider (DigitalOcean, AWS, whatever)
The How
Step 1 - Get started with AWS IoT
The first thing you'll need to do is log in to the AWS Console and navigate to the IoT Core service. If it's your first time, your screen may look a little different than mine, but our goal is to create a 'thing', that is, register an IoT device. Don't worry, you don't need to have a device yet to make this happen; this step just gets things prepared.
You'll need to be in the Manage > Things section and click 'Create'.
We just want to make a single AWS IoT thing, so that's what we'll click.
There are a lot of possible fields to fill in on this next screen. You can safely ignore all of them except the name at the top. I've named this device 'dev-tutorial'. You may be wondering what all the other fields do, and the answer, as best I can tell, is that they help you keep stuff organized. The company I work for uses AWS, and there are so many people doing so many different things that the tags and groups and such are essential. I do proof-of-concept work and blow it all away when I'm done, so I ignore all of this.
This next step is important: we need to create the certificates that will allow our as-yet non-existent IoT device to identify itself to AWS. You don't want to mess up these next steps. Click 'Create certificate' to generate three files we'll need to download and copy to our server.
You'll probably see a box flash telling you the certificate has been created, but don't think you're done. You need to download the three files in the table, plus a root CA (Certificate Authority) file. The first three links download actual files, and you must download them now or they will be lost forever. This is your one shot to get these certificates. Don't blow it. The last link, though, doesn't directly download a file.
The new page that loads has a bunch of links. The one you want is the Amazon Root CA 1 file, an RSA 2048 bit key. Go ahead and click the link.
With all four files now downloaded, be sure to click 'Activate' back on the IoT screen that says 'Certificate Created!'. Once activated, click 'Attach a policy'.
Remember when I said I assumed you were OK with permissive IAM roles and stuff? Well, here I'm just selecting the global policy that allows this thing to do anything with IoT on any resource. Probably not a good long term idea, but it's what the tutorial tells you to do :)
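If you're curious what that policy amounts to, it's essentially a policy document along these lines (every IoT action allowed on every resource):
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "iot:*",
      "Resource": "*"
    }
  ]
}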
Congrats! You've registered a thing with IoT!
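As an aside, this whole step can also be scripted. I stuck with the console for the tutorial, but if you end up creating lots of things, a sketch like the one below can create and save an activated certificate for you. It uses the general aws-sdk npm package, which isn't otherwise part of this setup, so treat it as an assumption; you'd also still need to attach the policy and the thing to the new certificate afterward.
// create_cert.js - a sketch, assuming the aws-sdk package is installed and credentials are configured
var fs = require('fs');
var AWS = require('aws-sdk');

var iot = new AWS.Iot({ region: 'us-east-1' });

// creates a key pair and certificate and activates the certificate in one call
iot.createKeysAndCertificate({ setAsActive: true }, function(err, data) {
  if (err) { return console.log(err); }
  // save the same three files the console would have had us download
  fs.writeFileSync('./certs/certificate.pem.crt', data.certificatePem);
  fs.writeFileSync('./certs/private.pem.key', data.keyPair.PrivateKey);
  fs.writeFileSync('./certs/public.pem.key', data.keyPair.PublicKey);
  console.log('certificate created:', data.certificateArn);
});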
Step 2 - Get a Server to Pretend to be an IoT device
First, go get yourself a server running Linux. I use DigitalOcean (note, that is my personal referral link) because $5/month for a decent little box is great. I also chose to use Ubuntu.
Next, connect to the box and we'll get stuff installed.
- Install node.js
- Install the AWS CLI (on Ubuntu 18.10 I used apt install awscli without issue)
- Install the AWS IoT Device SDK for JS: npm i aws-iot-device-sdk
- Configure your AWS credentials: aws configure (example prompts below)
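Running aws configure will prompt you for four values. You'll want an access key pair for your account, and the default region should match wherever you plan to create your IoT resources; the prompts look roughly like this:
AWS Access Key ID [None]: YOUR_ACCESS_KEY_ID
AWS Secret Access Key [None]: YOUR_SECRET_ACCESS_KEY
Default region name [None]: us-east-1
Default output format [None]: json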
I chose to make a project folder called 'iweek'. Wherever you want to work, make a directory called 'certs' and upload the four certificates we downloaded earlier. For ease of copy/paste, rename the files as follows:
- ...-certificate.pem.crt > certificate.pem.crt
- ...-private.pem.key > private.pem.key
- ...-public.pem.key > public.pem.key
- AmazonRootCA1.pem > root-CA.crt
The last thing we need to determine is the custom host endpoint we'll be connecting to. This is done with the AWS CLI:
aws iot describe-endpoint --endpoint-type 'iot:Data-ATS' --region us-east-1
Make sure to update the region to whatever region you've set up your IoT Thing in. Copy the endpoint address in the response; we'll need it in a minute.
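The response should look roughly like this, with an address unique to your account (the value below is just a placeholder):
{
    "endpointAddress": "xxxxxxxxxxxxxx-ats.iot.us-east-1.amazonaws.com"
}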
Now we're ready to make the JavaScript file to make sure everything can connect OK. If your directory structure is set up like mine, the file should be saved to ~/iweek/test_iot.js, with ~/iweek/certs/ holding the certificates.
At the top of the file we need to load the IoT Device SDK and then initialize our device. We're not doing anything with the device just yet, just defining what it looks like. The 'clientId' is a string you use to identify what's connecting; it doesn't have to match your Thing name, so it can be goofy or informative. The 'host' is where you paste the endpoint address we copied a minute ago.
var awsIot = require('aws-iot-device-sdk');
var device = awsIot.device({
  keyPath: './certs/private.pem.key',
  certPath: './certs/certificate.pem.crt',
  caPath: './certs/root-CA.crt',
  clientId: 'first-try',
  host: 'CUSTOM HOST ENDPOINT'
});
Add to the bottom of the file some instructions for the device to follow when it connects to the IoT Core.
device
  .on('connect', function() {
    console.log('connect');
  });
At this point we'll boot up a terminal, cd iweek, and run node test_iot.js. All we should see after we hit enter is the word 'connect' in our STDOUT and no new prompt. This is because there's no end to our code; the device is connected, just not doing anything, so you'll need to send a cancel code (Ctrl+C) to your terminal.
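If you see an error (or nothing at all) instead of 'connect', it helps to also listen for the SDK's 'error' event so certificate or endpoint problems get printed rather than failing silently:
device
  .on('error', function(error) {
    console.log('error', error);
  });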
Now we can try sending messages.
We'll now modify the 'on connect' part of the code to subscribe to a topic and publish to that same topic. A topic is just a name, and it can be whatever you want. The name we publish to is important to remember because it's how we'll retrieve data later.
device
  .on('connect', function() {
    console.log('connect');
    device.subscribe('dev_to_test', function(error, result) {
      console.log(result);
    });
    device.publish('dev_to_test', JSON.stringify({ 'message': 'hi there!', 'points': 168 }));
  });
We also want to add a code block to alert us whenever a message is added to the topic. Now, when the device receives a message, we'll print the message contents to STDOUT.
device
  .on('message', function(topic, payload) {
    console.log('message', topic, payload.toString());
  });
Back at our terminal we run node test_iot.js and we get a few messages. First, we get our 'connect' to tell us we successfully connected. Next we get information telling us we've subscribed to the topic 'dev_to_test', and finally we see the result of publishing our message to the topic.
connect
[ { topic: 'dev_to_test', qos: 0 } ]
message dev_to_test {"message":"hi there!","points":168}
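That's enough to prove the plumbing works. For the actual Innovation Week project the fake device needed to keep sending readings rather than a single test message; a minimal sketch of that idea (the interval, the 'supply_temp' name, and the reading fields are all just made up for illustration) looks something like this:
// fake_sensor.js - a sketch of a simulated sensor publishing on an interval
var awsIot = require('aws-iot-device-sdk');

var device = awsIot.device({
  keyPath: './certs/private.pem.key',
  certPath: './certs/certificate.pem.crt',
  caPath: './certs/root-CA.crt',
  clientId: 'fake-sensor-1',
  host: 'CUSTOM HOST ENDPOINT'
});

device.on('connect', function() {
  console.log('connect');
  setInterval(function() {
    // pretend to read a temperature sensor: a baseline value plus some noise
    var reading = {
      sensor: 'supply_temp',
      value: 70 + (Math.random() * 4 - 2),
      timestamp: Date.now()
    };
    device.publish('dev_to_test', JSON.stringify(reading));
  }, 5000);
});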
Step 3 - Collect data sent via IoT
This was the hardest step for me; it's where most of the tutorials I found broke down and I had to figure it out by my lonesome. What we're going to try to do is get IoT Core to push incoming data on a topic to a Kinesis Firehose, which should drop the results into S3.
First we need to set up the Kinesis Firehose. Navigate to the Kinesis service, click on the Data Firehose tab, and then click 'Create delivery stream'.
In the create menu we need to give the stream a name and, most importantly, make sure 'Direct PUT or other sources' is selected. This is the simplest interface from IoT to Kinesis, rather than going through a Data Stream. Click Next at the bottom of the page.
The next page has a few options. If you want to do work on the data submitted by the IoT thing, you can trigger an AWS Lambda function. You can also convert the record format to something like Apache Parquet if you'd like. I chose to disable both of these features since I'm submitting only simple data.
Finally, we need to select the destination. Firehose will stream data to S3, Redshift, Elasticsearch, or Splunk. In this demo we're storing things in S3 because of the simplicity of storage and the ease of applying Athena on top of the S3 data. S3 is the default selection, so scroll down to pick which S3 bucket you want to use for storage (alternatively, click 'Create new' to make a new bucket) and then optionally specify a prefix for the files (the pseudo folder structure in S3). Once this is done, click 'Next'.
The final step in getting our Firehose set up is configuration. The most important part of this page is the S3 buffer conditions. Firehose will receive data and store it until a buffer condition is met, and then it will push the data to S3. The defaults here are 5 MB or 5 minutes; I've set mine to 1 MB or 1 minute (the minimum) because we're not going to be shipping a ton of data back for this tutorial and I don't want to wait forever for it to arrive. The data we're sending isn't huge, so we don't need compression, and it isn't sensitive, so we don't need encryption, but those options exist. We do need an IAM role with the correct permissions.
I have an IAM role called firehose_delivery_role which I selected here, and I created a new role policy to make sure it can access my newly created S3 bucket. Once you've finished on this screen, click Next.
On the final screen, make sure your choices look good and click 'Create delivery stream'. While the delivery stream is being created, we need to go back to our IoT Core page (now full of successful connections and donut charts!), click into the 'Act' tab, and then click the 'Create' button.
On the create screen we need to give our rule a name and, ideally, a description. I've named mine dev_to_rule. Next we need to write a SQL query stating what data we want to pass through the rule. This works like a basic SQL query; you can use 'where' clauses and the like. If we were passing in complex data we might even use the 'select' statement to filter which columns to keep. Here, though, we just want to pass it all through, so the query looks like the following. Note the table name is the topic we publish messages to.
select * from 'dev_to_test'
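As an example of that filtering, if we only cared about some of the fields from our test message, or only about messages above a certain 'points' value, something like this should work:
select message, points from 'dev_to_test' where points > 100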
Now we need to add an action. Click the 'Add action' button, which brings up a huge list of options. We want to pass the data to our Kinesis Firehose stream. You may notice we could just store the messages directly in an S3 bucket, which is true, but by leveraging Firehose we have more options (Lambda processing, other destinations, etc.). So we make the appropriate selection and then click 'Configure action'.
When configuring, we want to select the stream we created, dev_to_firehose, and pick the separator placed between messages. Since Firehose builds up a buffer, multiple messages will land in the same file; I chose a new line to help with readability. Next, we'll want to create a new IAM role and give it a name, dev_to_iot_to_firehose in this case. You'll need to hit the refresh button after you've created the role and select it from the dropdown list. Finally, hit 'Update role' to make sure it's applied and then click 'Add action'.
This takes us back to the create rule screen and so we'll press 'Create rule'.
With the rule created, we go back to our server and run node test_iot.js from the terminal to trigger a message being sent to our topic. We need to be patient now (1 minute) while the buffer in Firehose builds up. After a minute we can go to S3 > dev-to-tutorial / iot / year / month / day / hour and see that a file has been created!
We can download the file and see that we have text in a JSON structure that contains the message we sent. Success!
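With the newline separator we picked, the file contents end up as one JSON object per line; in this case it's just our single test message:
{"message":"hi there!","points":168}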
Innovation Week Results
Hopefully the tutorial helped you get started with IoT devices powered by node.js communicating with AWS services. The results of my efforts here plus the machine learning algorithm and real-time dashboard won my team first place which was very exciting. Thanks for letting me share this with you!