Category Archives: nodejs

Super functions are delicious, just you let me show ya

With the advent of the cloud function, it looks like we're heading towards a new era in web development, where the front end becomes totally detached and is served by a serverless back-end.

This has its positive aspects and its drawbacks, which I'm not going to get into here. Instead, I'm going to take a quick(ish) sprint around the current offerings from the various companies out there.

The plan

Our scenario: I'm a super-connected industry influencer, just, like, mooching around the top art pop-ups and bubble tea establishments. Time is precious when you're prowling around influencing things, so instead of writing a blog post when something catches my eye (boooooring!), I just want to send an SMS with the latest hash-fashion and have a blog post immediately appear on the now hip and trendsome Blogger platform (Kanye uses it).

Across the street, however, is an evil ad conglomerate; let's just call them InterOmnitsu. They see your influencing, and although BoingBoing is pretty great at keeping their content fresh, they need some of your youthful energy at that next pitch for water-based caffeinated wet wipes. They want to watch your blog posts, and get a copy of each one as soon as you post it.

It's an arms race, rivalling the Cuban Shoe Crisis of 1967.

ahem.

So, to fulfill our scenario, we will be SMSing into Twilio, which will send the message to an Azure Function, which will then query Twitter… Our Azure Function will take the results of the Twitter query and send them to Google Cloud Functions, which will in turn format the content for a post to Blogger and send a message to LinkedIn notifying my followers of my newest musings.

On the dark side, an Amazon service will monitor the Blogger page; when a new post is detected, it will take a screen grab, save it to cloud storage, then send the screenshot to an email account.
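Sketched out, the flow we're building looks like this:

SMS → Twilio → Azure Function → Twitter query → Google Cloud Function → Blogger post + LinkedIn notification

(and on the dark side) new Blogger post → AWS watcher → screenshot → cloud storage → email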

The yoof marketing industry is surely a den of iniquity and vice.

To fulfill this task, we will need to set up the following (*cough* this may change):

Microsoft Azure Account

Twilio Demo Account

Twitter Application Account

Google Account

Blogger Account

LinkedIn Account

Amazon AWS account

All the above have 30-day/demo credit offers, and as we're micro-functioning the whole thing, even if your demo does run out, just create another.

Step One: Twilio to Azure

Note: you will have a lot of API keys and accounts to keep track of, so it's best to create a document and keep them safe.

Create an Azure demo account and log into the functions area ( https://portal.azure.com/#create/Microsoft.FunctionApp ).

Create an account on Twilio ( https://www.twilio.com/try-twilio ).

Get a new number and create a Programmable SMS. Point the Request URL at your Azure Function URL and set the dropdown to POST.

SMS the number with a word, and you should see the request come into your function in your Azure log. That's a big POST!
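Buried in that POST are the message details as name/value parameters; the one we care about is Body. Texting "sneakers", for example, arrives along the lines of the following (the numbers here are made up):

Body=sneakers&From=%2B447700900123&To=%2B14155550100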

Add the following code to the Azure Function JavaScript, and we should be able to text a Twilio number and see only the requested word in the log.

module.exports = function(context, req) {
    var whatWasMessaged = req.query.Body;
    context.log('Search for a tweet for ' + whatWasMessaged);
    context.done();
};

Oh yeah, create a Twitter app and note down the API and consumer keys.

Next, we start with the following code, building on it to connect to Twitter once the call from Twilio comes in (which will enter the function as an HTTP POST).

module.exports = function(context, req) {
    var tweets = getTweetsForWord(req.query.Body);
    sendToGoogle(tweets, context);
    context.done();
};

function getTweetsForWord(nam) {
    return {tweets: [{message: "HAM:" + nam + "HAM:", from: "neilhighley"},
                     {message: "JAM:" + nam + "JAM:", from: "cooldude"}]};
}

function sendToGoogle(pak, context) {
    for (var i = 0; i < pak.tweets.length; i++) {
        context.log(pak.tweets[i].message + " from " + pak.tweets[i].from);
    }
}

I've just created dummy functions so that I can test the connection from Twilio to my function URL and get the meat of the app done as soon as possible.

We need to use the Twitter API, so we have to install the Twitter package via npm.

Open up the Function App settings and navigate to the App Service Editor.

[screenshot: azure-function-advanced]

On the left, select the console so that we can install packages, then install the Twitter package.

[screenshot: azure-scm-install-twitter-npm]
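In the console, the install is the usual npm command:

npm install twitter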

Add the following to your function and run it.

var Twitter = require('twitter');
var client = new Twitter({
    consumer_key: 'xxxxxxxxxx',
    consumer_secret: 'xxxxxxxxxx',
    access_token_key: 'xxxxxxxxxx',
    access_token_secret: 'xxxxxxxxxx'
});

Add a console.log(client); to the first function (module.exports), then observe the output in the monitor section on the left of the App Service Editor. You should see a huge JSON object for the Twitter client. Otherwise, check the log next to the function code, which should show the error coming from Twitter.

Now that we have a connection to Twitter, we can wire our Azure Function up to Twilio so that an incoming SMS triggers a query to the Twitter API.

var Twitter = require('twitter');

var client = new Twitter({
    consumer_key: 'xxxxxxxxxx',
    consumer_secret: 'xxxxxxxxxx',
    access_token_key: 'xxxxxxxxxx',
    access_token_secret: 'xxxxxxxxxx'
});

var tweet_count = 3;

// The Twitter call is asynchronous, so getTweetsForWord takes a callback
// rather than returning a value (which would still be empty at return time).
module.exports = function(context, req) {
    getTweetsForWord(req.query.Body, context, function(tweets) {
        sendToGoogle(tweets, context);
        context.done();
    });
};

function getTweetsForWord(nam, context, done) {
    var tweetsReceived = [];
    // screen_name is hardcoded for now; swapping in a search on nam comes later
    client.get('statuses/user_timeline', { screen_name: 'donaldtrump', count: tweet_count },
        function(error, tweets, response) {
            if (!error) {
                for (var i = 0; i < tweets.length; i++) {
                    var thisTweet = tweets[i];
                    tweetsReceived.push({
                        tweet_text: thisTweet.text,
                        date: thisTweet.created_at,
                        id: thisTweet.id
                    });
                }
            } else {
                context.log("error", error);
            }
            done({ tweets: tweetsReceived });
        });
}

function sendToGoogle(pak, context) {
    // just gonna test for now
    for (var i = 0; i < pak.tweets.length; i++) {
        context.log(pak.tweets[i].tweet_text + " from " + pak.tweets[i].id);
    }
}

Now we have Twilio sending a POST to Azure Functions, which calls Twitter and formats a JSON object ready for sending to Google/Blogger.
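For reference, the payload handed to sendToGoogle looks something like this (values invented):

{"tweets":[{"tweet_text":"Water-based wet wipes are so last week","date":"Mon Sep 05 12:00:00 +0000 2016","id":1234567890}]}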

next time.. hopefully…

Setup SSH on a home Linux server for remote Node development

Hello again! Today I'm going to run through what's required to get a Node server running from home.

This may seem like an odd thing to do, but if you do a lot of remote work/hackathons/contract work, you may find that the facilities for an internet-accessible demo are quite lacking.

Firstly, we take our old laptop/micro PC/spare desktop and install the latest version of Ubuntu (15.10 at the time of writing). However, we don't need the desktop experience, so we'll just install the server edition. You'll need to do this in front of the machine (it is possible to roll an SSH-enabled distro, but that is far from Quick 😉 ).

After installing Ubuntu and setting a static IP, log in and install OpenSSH.

Ensure that you follow the instructions in the link below, and alter the listening port to something other than 22 (e.g. 36622):

https://help.ubuntu.com/community/SSH/OpenSSH/Configuring
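The relevant change in /etc/ssh/sshd_config is just the Port line, followed by a restart of the daemon:

# /etc/ssh/sshd_config
Port 36622

# then restart sshd
sudo service ssh restart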

So, now you should be able to access your SSH prompt via a local loopback:

ssh -v localhost

Let's add Node and a simple HTTP application.

sudo apt-get install nodejs npm

Note: on Ubuntu the package installs the binary as nodejs; if you want the plain node command used below, install the nodejs-legacy package as well.

Once node is installed, create a folder for your server

mkdir nodetest

Then browse to your new folder and initialise node

cd nodetest
npm init

The http module we'll be using is built into Node, so there's nothing extra to install for it.

(As ever, use sudo if any of this fails, or chmod/chown your folder.)

And add the following code to a new JavaScript file called quickanddirty.js, to create a simple HTTP listener on port 8090:

var http = require('http');
var server = http.createServer(function(req,resp){
    resp.end("Welcome to your Node server");
});
server.listen(8090, function(){
    console.log("Your server has started", 8090);
});

Test your server out by running node with the JavaScript file:

node quickanddirty.js

You will see that the server has started and is listening on port 8090. Leave it running as we move on to accessing the box remotely.

Note: you can use cURL to check the response also if you are feeling unstoppable 😉
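Something like this, from another shell on the box:

curl http://localhost:8090/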

So, to recap, we have an Ubuntu Linux box running OpenSSH and Node. Happy times, happy times.

At this point, assuming you have a home broadband connection, we will connect the box to the outside world.

As broadband supplier software differs, I'll try to explain what you need to do both on and away from the box.

Firstly, you need a way of mapping the often-shifting IP address of your router to a static DNS entry. This is done using a dynamic DNS service such as DynDNS (there are others available, but they will generally require installing perl scripts on your Linux box to keep the dynamic DNS entry up to date).

So, register an account with DynDNS (others are available) and choose a subdomain. Note: don't make the name identifiable to yourself.. let's not give hackers an easy ride 😉

Once you have your subdomain, you need to create a mechanism to update the dynamic service so calls to the domain get passed to your router IP address.

Both the Sky and Virgin broadband devices have areas to select the dynamic DNS service. Note: advanced users can configure the dynamic DNS update from the Linux box instead; one way is sketched below.
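For those advanced users, ddclient is one option; a minimal sketch, assuming a dyndns2-compatible provider (the account details and hostname are placeholders):

sudo apt-get install ddclient

# /etc/ddclient.conf
protocol=dyndns2
use=web
server=members.dyndns.org
login=yourusername
password='yourpassword'
randomchicken47.dyndns.org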

Once it is selected, you'll enter your account details for the dynamic DNS service, and your router will periodically let DynDNS (or whoever) know its current IP address. This allows you to SSH in on a domain name and always reach your router.

Once the dynamic DNS is set up, you'll generally need to set up a port forward via the router's firewall, from the entry point of your router to the Linux server's OpenSSH port (as chosen previously), 36622.
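As a concrete (made-up) example: forward external TCP port 36622 to port 36622 on 192.168.0.50, where 192.168.0.50 is the static LAN IP you gave the server at install time.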

With the Virgin router, you will need to buy another router and put your Virgin box into modem mode, which simply passes the connection through to your other router for dynamic DNS, port forwarding and firewall setup. Full instructions can be found online by searching for "virgin wifi modem mode dynamic dns".

The Sky router is friendlier, with settings to choose the port to listen on, then firewall settings to point it at your box.

As I said previously, you don't need to use DynDNS through the broadband box; just ensure that the port is open and you have some method of updating the dynamic DNS entry at your provider with your router's IP.

The clevererer of you reading will have realised that you don't need dynamic DNS at all if you know the current IP of your router, so as a last resort, you can use that to connect over SSH.

Which leads us to: connecting to your server.

With your server running, hop onto another network (such as your phone's) using a different computer, and try to connect to your SSH server.

In a terminal, type the following, taking "nodeuser" as the user created on your Linux box, "randomchicken47.dyndns.org" as the dynamic DNS entry (you could use the router IP instead), and the port number 36622 we chose earlier:

ssh nodeuser@randomchicken47.dyndns.org -p 36622

You should be able to log in to your server. Verify by browsing to your nodetest folder.

So, we can access your server via OpenSSH, but how can we access the Node instance running on port 8090? Simples. We tunnel to it.

Type "exit" to close the OpenSSH session, then create a new session with tunnelling added. To explain how tunnelling works in one easy example, I am going to tunnel to port 8090 on the server via a local port of 9999.

ssh nodeuser@randomchicken47.dyndns.org -p 36622 -L 9999:randomchicken47.dyndns.org:8090 -N

or, if that doesn't seem to work correctly, replace the second dynamic domain with your server's actual hostname (or simply localhost, since that destination is resolved on the server's side of the tunnel):

ssh nodeuser@randomchicken47.dyndns.org -p 36622 -L 9999:randomchicken47svr:8090 -N

Now you'll be able to browse to http://localhost:9999 in a web browser, and see the response from your Node server via the tunnel.

We have used tunnelling instead of just opening a port directly to your Node port because it improves security. Opening ports for multiple services increases your attack surface, meaning an attacker has more things to probe to gain access to your network. It's much safer to have a single fortified SSH access point on a non-standard port.

Be careful: you may get addicted to SSH tunnelling, as it can enable you to do some amazing things. But bear in mind the tunnel uses your home bandwidth allowance, if you have one.

Take care,

Neil

Quick N Dirty: Using Git BASH in WebStorm (MS Windows)

Whether you are having problems with NodeJS running in the Windows terminal in WebStorm, or are just comfortable with Git BASH, there is a way of replacing the default Windows command line with a Git BASH terminal.

Firstly, install Git for Windows and make a note of the installation directory.

Then open up WebStorm and go to Settings (default or project-specific, it doesn't really matter).

Navigate to Tools > Terminal, and enter the BASH initialisation command:

D:\Program Files (x86)\Git\bin\sh.exe -login -i

The path is optional if you already have it set in your system variables. If you're not sure, just use the full path.

Until next time,

Have fun

Retrieving records from Apache Cassandra using NodeJS v0.10.35 and ExpressJS v4.0 via a REST interface

The adventures in Cassandra continue. Following on from the last post, I'm going to show how to set up a REST interface to retrieve records from Apache Cassandra via NodeJS.

I'll only set up the GET for a list of items and an individual item; the PUT, POST and DELETE actions I'll leave to you.

NodeJS is a server-side runtime environment for JavaScript which uses Google's V8 JavaScript engine. It has proven to be an extremely fast way of pushing content and has been around for about 5 years now.

It is single-threaded, avoiding per-session and per-context overheads to maximise throughput.

Like Cassandra, it can cope with a great number of requests per second (~500 rps at 150 concurrent requests), obviously scaling with processor power, and as it is stateless it can be scaled out into round-robin farms to cope at massive scale (see the PayPal and eBay metrics for more stat-candy).

To achieve a full-JavaScript stack, Node is often used to drive JavaScript test runners and compilation tools on the developer side, then used again, alongside a web server such as ExpressJS, to serve server-side content back to a JavaScript application.

NodeJS runs on a variety of server and desktop platforms, but I'm going to skip the install of NodeJS and try to keep this walkthrough as platform-agnostic as possible. A common feature of all platforms is NPM (the Node Package Manager), and this is what we'll be using to install the ExpressJS and Cassandra libraries we will need to serve the content.

Firstly, I'll create a folder to run this application from, then open up a command prompt and browse to the folder. Then I install Express by entering the following:

npm install express

The folder should now have a node_modules folder, which will contain the express server code and its dependencies.

We will also need to add a package.json file so that node can verify the app when it runs.

{
  "name": "nd.neilhighley.com",
  "version": "1.0.0",
  "private": true,
  "scripts": {
    "start": "node ./bin/www"
  },
  "dependencies": {
    "express": "~4.11.1"
  }
}

Next, I create my application folder structure as follows;

>bin
>>www.js
>public
>>javascripts
>routes
>views

Now I add the web server bootstrap, bin/www.js (the "start" script above runs node ./bin/www, which node resolves to this file), with the following code:

#!/usr/bin/env node

var app = require('../app');
var http = require('http');

var port = normalizePort(process.env.PORT || '8080');
app.set('port', port);
var server = http.createServer(app);
server.listen(port);
server.on('error', onError);
server.on('listening', onListening);

function normalizePort(val) {
    var port = parseInt(val, 10);

    if (isNaN(port)) {
        // named pipe
        return val;
    }

    if (port >= 0) {
        // port number
        return port;
    }

    return false;
}

function onError(error) {
    if (error.syscall !== 'listen') {
        throw error;
    }

    var bind = typeof port === 'string'
        ? 'Pipe ' + port
        : 'Port ' + port;

    // handle specific listen errors with friendly messages
    switch (error.code) {
        case 'EACCES':
            console.error(bind + ' requires elevated privileges');
            process.exit(1);
            break;
        case 'EADDRINUSE':
            console.error(bind + ' is already in use');
            process.exit(1);
            break;
        default:
            throw error;
    }
}

function onListening() {
    var addr = server.address();
    var bind = typeof addr === 'string'
        ? 'pipe ' + addr
        : 'port ' + addr.port;
    console.log('Listening on ' + bind); // log so we can see the server came up
}

The code above simply sets up the web server to listen on port 8080 and points at our application's main file, app.js, in the folder root.

Before delving into app.js, I need to set up the routes; these trap any calls to a particular URI and pass them to the appropriate code. I'm going to name the file after the route it serves; this is not a requirement, but it will help you in future! 😉

>routes
somedata.js
var express = require('express');
var router = express.Router();

/* GET data. */
router.get('/somedata/', function(req, res, next) {
 var jsonToSend={Message:"Here is my resource"};
 res.json(jsonToSend);
});

module.exports = router;

Now, I can set up my app.js file in the folder root to contain the main application logic.

var express = require('express');
var path = require('path');
var routes = require('./routes/somedata');
var app = express();

// view engine setup (not needed for a JSON-only API)
//app.set('views', path.join(__dirname, 'views'));
//app.set('view engine', 'jade');

app.use(express.static(path.join(__dirname, 'public')));
app.use('/', routes);

// catch 404 and forward to error handler
app.use(function(req, res, next) {
    var err = new Error('Not Found');
    err.status = 404;
    next(err);
});

// no view engine is configured, so return errors as JSON
app.use(function(err, req, res, next) {
    res.status(err.status || 500);
    res.json({
        message: err.message,
        error: err
    });
});

module.exports = app;

Let's do a quick test by starting the application

npm start

Then browsing to

http://localhost:8080/somedata

If all goes well, we will receive a JSON response.
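Something like:

{"Message":"Here is my resource"}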

Now I can alter the routes file to mirror a typical REST interface.

Replace the previous single GET function above with the following

/* GET data. */
router.get('/somedata/', function(req, res, next) {
 var jsonToSend={Message:"Here is my resource"};
 res.json(jsonToSend);
});

router.get('/somedata/:item_id', function(req, res, next) {
 var jsonToSend={Message:"Here is my resource of item "+req.params.item_id};
 res.json(jsonToSend);
});

Confirm the above by calling the following URLs and checking the content:

http://localhost:8080/somedata/
http://localhost:8080/somedata/12

The second URL should return the message with the item_id in it.
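For example, /somedata/12 should respond with:

{"Message":"Here is my resource of item 12"}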

Now that I've confirmed the framework is running, I'll connect to Cassandra and retrieve the records. To do that, we just need to replace jsonToSend with the Cassandra response.

So, install the Cassandra node client, as follows, in the root of your application.

npm install cassandra-driver

Then update package.json to add the cassandra-driver dependency we just installed, so it now looks like this:

{
  "name": "nd.neilhighley.com",
  "version": "1.0.0",
  "private": true,
  "scripts": {
    "start": "node ./bin/www"
  },
  "dependencies": {
    "express": "~4.11.1",
    "cassandra-driver": "~1.0.2"
  }
}

I'll use the same keyspace as the last blog post, so have a look there for details on setting up Cassandra.

[screenshot: casswcfexample]
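If you don't have it to hand, CQL along these lines recreates the keyspace and table (the column types are my assumptions, based on the queries below):

CREATE KEYSPACE casswcfexample WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1};

USE casswcfexample;

CREATE TABLE exampletable (tid int PRIMARY KEY, title text, description text);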

Update the routes/somedata.js file to the following

var express = require('express');
var router = express.Router();

var cassandra = require('cassandra-driver');

var client = new cassandra.Client({contactPoints: ['127.0.0.1'], keyspace: 'casswcfexample'});

function GetRecordsFromDatabase(callback) {
    client.execute("select tid,description,title from exampletable", function (err, result) {
        if (!err) {
            if (result.rows.length > 0) {
                callback(1, result.rows);
            } else {
                callback(1, {});
            }
        } else {
            callback(0, {});
        }
    });
}

function GetRecords(res) {
    var callback = function(status, recs) {
        if (status != 1) {
            res.json({Error: "Error"});
        } else {
            res.json({Results: recs});
        }
    };

    GetRecordsFromDatabase(callback);
}

/* GET data. */
router.get('/somedata/', function(req, res, next) {
    GetRecords(res);
});

router.get('/somedata/:item_id', function(req, res, next) {
    var jsonToSend = {Message: "Here is my resource from " + req.params.item_id};
    res.json(jsonToSend);
});

module.exports = router;

Now when we call the following URL

http://localhost:8080/somedata/

We should receive the following (or similar).

{"Results":[{"tid":1,"description":"description","title":"first title"},{"tid":2,"description":"another description","title":"second title"}]}

If you get an error, it may mean that the async and long packages (dependencies of the driver) also need to be installed in your application root. Install them with npm.

We can format the JSON returned by altering the GetRecords function.

The GetRecord function will be almost identical to the GetRecords function, just passing in the id and changing the CQL. Note that the id is passed as a bound parameter rather than concatenated into the statement, which keeps injection issues at bay.

function GetRecordFromDatabase(passed_item_id, callback) {
    // parameterised query: safer than concatenating the id into the CQL
    client.execute("select tid,description,title from exampletable where tid=?",
        [passed_item_id], {prepare: true}, function (err, result) {
            if (!err) {
                if (result.rows.length > 0) {
                    callback(1, result.rows[0]);
                } else {
                    callback(1, {});
                }
            } else {
                callback(0, {});
            }
        });
}

function GetRecord(r_item_id, res) {
    var callback = function(status, recs) {
        if (status != 1) {
            res.json({Error: "Error"});
        } else {
            res.json({Results: recs});
        }
    };

    GetRecordFromDatabase(r_item_id, callback);
}
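Then point the :item_id route at it, replacing the placeholder handler from earlier:

router.get('/somedata/:item_id', function(req, res, next) {
    GetRecord(req.params.item_id, res);
});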

There are a few things that could still be done to the above code, including:

– Place all CRUD operations in a standard library
– Have middleware fulfill requests, to enable client-side error catching and a cleaner implementation

Hope you have enjoyed the example above. Apologies for any typos, etc. 🙂

Until next time.