Graduate Program KB

Node

Command Line Scripts

Keywords

  • process: a global object that provides information about, and control over, the current Node process.
    • stdout.write: allows you to write to standard output (similar but not the same as console.log).
    • stderr.write: allows you to write to standard error (similar but not the same as console.error).
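  • A minimal sketch of the difference (console.log adds formatting and a trailing newline; process.stdout.write writes the string exactly as given):

    process.stdout.write('hello') // no newline appended
    console.log('hello') // newline appended automatically
    process.stderr.write('something went wrong\n')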

Setting up a script file

  • Begin the file with the shebang #!/usr/bin/env node so the shell knows to execute it with Node.
  • Make the file executable: chmod u+x file.js.
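  • Putting both together, a minimal executable script (the file name file.js is just an example) might look like:

    #!/usr/bin/env node

    'use strict'

    // run with: ./file.js (after chmod u+x file.js)
    process.stdout.write('Hello from a Node CLI script\n')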

Command Line Arguments

  • process.argv: is an array of all the arguments that were passed in through the executing shell.
    • Often don't care about the first 2 arguments (node location, script file location).
    • We can chop these off with process.argv.slice(2)
  • yargs: another argument tool.
  • minimist: npm package for parsing arguments.
    • var args = require("minimist")(process.argv.slice(2)): returns an object with the argument names as keys and the values given to them.
    • You can pass an options object to control how the arguments are parsed.
      var args = require("minimist")(process.argv.slice(2), {
        // Specifying a help flag and a file flag
        // --help && --file=""
        boolean: ['help'],
        string: ['file'],
      })
      // We can set up if statements on how we want to handle things when these args are or are not provided.
      
    • Access the flags as you would any other object property: args.help or args.file. A minimal handling sketch is shown below.
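  • As a sketch of that handling (printHelp and processFile here are hypothetical helpers):

    var args = require('minimist')(process.argv.slice(2), {
      boolean: ['help'],
      string: ['file'],
    })

    if (args.help) {
      printHelp() // hypothetical helper that prints usage information
    } else if (args.file) {
      processFile(args.file) // hypothetical helper that handles the file
    } else {
      console.error('Incorrect usage, try --help')
    }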

Useful Script Packages and Uses

  • fs: file system package
    • readFileSync(filepath): returns the file contents as a binary Buffer by default, so console.log shows the raw buffer. We can see the contents by printing with process.stdout.write.
      • Alternatively we could have simply done fs.readFileSync(filepath, "utf8");
    • fs.createReadStream(): creates a readable stream.
  • path: paths package
    • path.resolve: we can give this our file arg (args.file) and it will return its full path given only a relative path.
    • __dirname: magic node variable which tells you the absolute path of the directory containing the currently executing file.
    • path.join: takes any number of path segments and joins them using the appropriate directory separator for the system.
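  • A short sketch of how these fit together (assuming args.file came from minimist as above; config.json is a hypothetical file):

    var fs = require('fs')
    var path = require('path')

    // resolve the relative --file argument to an absolute path
    var filepath = path.resolve(args.file)

    // pass an encoding to get a string back instead of a Buffer
    var contents = fs.readFileSync(filepath, 'utf8')
    process.stdout.write(contents)

    // __dirname is handy for files that live next to the script itself
    var configPath = path.join(__dirname, 'config.json')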

Asynchronous readFile

  • fs.readFile(filepath, callback): the asynchronous form; it expects a callback.
    fs.readFile(filepath, function onContents(err, contents) {
      if (err) {
        console.error(err)
      } else {
        // contents is a Buffer; to process it we first need to convert the binary buffer to a string.
        // For example, to upper-case the contents:
        contents = contents.toString().toUpperCase()
        process.stdout.write(contents)
      }
    })
    

stdin

  • var getStdin = require("get-stdin"): "get-stdin" -> a node package that reads the stdin stream for us and hands back its contents.
  • Allows us to pipe contents into our program and handle it appropriately.
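  • A minimal sketch, assuming the CommonJS version of get-stdin (getStdin() resolves with everything piped in, as a string):

    var getStdin = require('get-stdin')

    // e.g. run as: cat file.txt | ./program.js
    getStdin()
      .then(function onContents(contents) {
        process.stdout.write(contents.toUpperCase())
      })
      .catch(function onError(err) {
        console.error(err)
      })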

Environment Variables

  • Work on a per command basis.
    • If we run VALUE=2 ./program.js, this sets an environment variable VALUE and gives it a value of 2 only for that command.
  • Typically used for configuration purposes
// a good test to make sure your program sees the environment variable as expected.
if (process.env.VARIABLE) {
  console.log(process.env.VARIABLE)
}
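
A common configuration pattern is to fall back to a default when the variable isn't set (BASE_PATH here is just an example variable name):

var path = require('path')

// use BASE_PATH if provided, otherwise default to the script's directory
var BASE_PATH = path.resolve(process.env.BASE_PATH || __dirname)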

Streams

  • readable streams and writeable streams

  • we can pipe readable streams into writeable streams with .pipe

    • stream1.pipe(stream2)
  • You can create pipe chains because piping a readable stream through a transform (duplex) stream returns a stream that is itself readable.

  • Transforming a stream requires a little more work:

    var { Transform } = require("stream");

    function processFile(inStream) {
        var outStream = inStream;
    
        var upperStream = new Transform({
            transform(chunk, enc, cb) {
                this.push(chunk.toString().toUpperCase());
                cb();
            }
        });
    
        outStream = outStream.pipe(upperStream);
    
        var targetStream = process.stdout;
        outStream.pipe(targetStream);
    }
    
  • We can make the function write either to stdout or to a file by adding the following code into the above function (replacing the fixed targetStream).

    • We essentially just add an out flag; if it is provided we choose stdout, otherwise a file write stream (OUTFILE is the output file path, defined elsewhere).
    var targetStream
    if (args.out) {
      targetStream = process.stdout
    } else {
      targetStream = fs.createWriteStream(OUTFILE)
    }
    outStream.pipe(targetStream)
    

Zipping and Unzipping

  • We can gzip with the built-in zlib module.
  • To zip, we pipe our transformed stream into a gzip stream (zlib.createGzip()), then pipe that stream to our targetStream to output the zipped file.
  • To unzip we can use zlib.createGunzip(); similar to above, we just insert the appropriate stream into the pipe chain depending on how we want to use it (see the sketch below).
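  • A sketch of how these slot into the pipe chain from processFile above (args.compress and args.uncompress are hypothetical flags):

    var zlib = require('zlib')

    // unzip the incoming stream first, if asked to
    if (args.uncompress) {
      var gunzipStream = zlib.createGunzip()
      outStream = outStream.pipe(gunzipStream)
    }

    // ... transform steps (e.g. upperStream) go here ...

    // zip the outgoing stream, if asked to
    if (args.compress) {
      var gzipStream = zlib.createGzip()
      outStream = outStream.pipe(gzipStream)
    }

    outStream.pipe(targetStream)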

Notifying the End of Stream

  • To make it clear when we have reached the end of the stream, we can make the function that handles the stream logic asynchronous.

  • This lets us call then() on the returned promise and log a completion message once the async function has finished.

    processFile(stream)
      .then(function () {
        console.log('Complete')
      })
      .catch(error) // error: an error-reporting helper defined elsewhere
    
    // Helper function to notify once stream has completed
    function streamComplete(stream) {
      return new Promise(function callback(resolve) {
        // Built-in feature to know when complete.
        stream.on('end', function () {
          resolve() // signals that the stream has ended
        })
      })
    }
    
    async function processFile(inStream) {
      var outStream = inStream
    
      var upperStream = new Transform({
        transform(chunk, enc, cb) {
          this.push(chunk.toString().toUpperCase())
          cb()
        },
      })
    
      outStream = outStream.pipe(upperStream)
    
      var targetStream = process.stdout
      outStream.pipe(targetStream)
    
      // will wait until the stream is done before printing complete message
      await streamComplete(outStream)
    }
    
  • We can re-write this as a generator using CAF (Cancelable Async Flows), which is useful for cancelling execution if it is taking too long, to save system resources. A rough sketch follows.
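
  • A rough sketch of the CAF version (the exact CAF API should be double-checked against its documentation; the 5 second timeout is an arbitrary example):

    var CAF = require('caf')
    var { Transform } = require('stream')

    // wrap the generator so it becomes a cancelable async function;
    // CAF passes the cancellation signal as the first argument
    var processFile = CAF(function* processFile(signal, inStream) {
      var upperStream = new Transform({
        transform(chunk, enc, cb) {
          this.push(chunk.toString().toUpperCase())
          cb()
        },
      })

      var outStream = inStream.pipe(upperStream)
      outStream.pipe(process.stdout)

      // yield instead of await inside the generator
      yield streamComplete(outStream)
    })

    // a token that cancels processFile if it takes longer than 5 seconds
    var tooLong = CAF.timeout(5000, 'Took too long!')

    // stream: the readable input stream from the earlier example
    processFile(tooLong, stream)
      .then(function () {
        console.log('Complete')
      })
      .catch(console.error)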


Database

  • We can use a basic sqlite3 database for simplicity.

    • The database file is managed directly by the program; a separate database application isn't necessary.
  • Basic table to use as a reference mydb.sql:

    -- Enables the use of Foreign Keys
    PRAGMA foreign_keys = ON;
    
    CREATE TABLE IF NOT EXISTS Something (
        id INTEGER PRIMARY KEY ASC,
        otherID INTEGER,
        data INTEGER,
    
        FOREIGN KEY (otherID) REFERENCES Other(id)
    );
    
    CREATE TABLE IF NOT EXISTS Other (
        id INTEGER PRIMARY KEY ASC,
        data VARCHAR(40) UNIQUE
    );
    
  • new sqlite3.Database(DB_PATH): creates/opens the database file in the program.

  • var initSQL = fs.readFileSync(DB_SQL_PATH, "utf8");: reads the SQL used to initialize the database (run it with the exec helper below).

  • A good way to initialize the database with some helpers can be done like so:

    var fs = require('fs')
    var path = require('path')
    var util = require('util')
    var sqlite3 = require('sqlite3')

    const DB_PATH = path.join(__dirname, 'my.db')
    const DB_SQL_PATH = path.join(__dirname, 'mydb.sql')

    // define some SQLite3 database helpers
    var myDB = new sqlite3.Database(DB_PATH)
    var SQL3 = {
      run(...args) {
        return new Promise(function c(resolve, reject) {
          myDB.run(...args, function onResult(err) {
            if (err) reject(err)
            else resolve(this)
          })
        })
      },
      get: util.promisify(myDB.get.bind(myDB)),
      all: util.promisify(myDB.all.bind(myDB)),
      exec: util.promisify(myDB.exec.bind(myDB)),
    }
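
  • A sketch of using these helpers to initialize the schema and insert/query data (builds on the requires and tables above; the surrounding async main function is assumed):

    async function main() {
      // run the schema file to create the tables if they don't already exist
      var initSQL = fs.readFileSync(DB_SQL_PATH, 'utf8')
      await SQL3.exec(initSQL)

      // insert a row, using a parameterized query
      var result = await SQL3.run(
        'INSERT INTO Other (data) VALUES (?)',
        'hello world'
      )
      var otherID = result.lastID // id of the row we just inserted

      // read rows back out
      var other = await SQL3.get('SELECT id, data FROM Other WHERE id = ?', otherID)
      var rows = await SQL3.all('SELECT * FROM Something')
      console.log(other, rows)
    }

    main().catch(console.error)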
    

Web Servers

  • Can use require("http") to build a web server

  • Also using require("node-static-alias") to handle the static-file-serving details that we don't want to implement ourselves.

  • Create a server:

    var http = require('http')
    
    var httpServer = http.createServer(handleRequest)
    
  • To listen to a port:

    httpServer.listen(HTTP_PORT)
    console.log(`Listening on http://localhost:${HTTP_PORT}...`)
    
  • Handling a request:

    // BASIC EXAMPLE
    function handleRequest(request, response) {
      // 200 => success
      // { header tags }
      response.writeHead(200, { 'Content-Type': 'text/plain' })
      response.write('Some text') // can just pass this into end if wanted.
      response.end()
    }
    
    // ASYNCHRONOUS EXAMPLE
    async function handleRequest(request, response) {
      fileServer.serve(request, response)
    }
    
  • Starting up a static file server:

    // WEB_PATH = document root
    const WEB_PATH = path.join(__dirname, 'web')
    
    var fileServer = new staticAlias.Server(WEB_PATH, {
      cache: 100,
      serverInfo: 'Node Workshop',
      alias: [
        {
          match: /^\/(?:index\/?)?(?:[?#].*$)?$/,
          serve: 'index.html',
          force: true,
        },
        {
          match: /^\/js\/.+$/,
          serve: '<% absPath %>',
          force: true,
        },
        {
          match: /^\/(?:[\w\d]+)(?:[\/?#].*$)?$/,
          serve: function onMatch(params) {
            return `${params.basename}.html`
          },
        },
        {
          match: /[^]/,
          serve: '404.html',
        },
      ],
    })
    

Express JS

  • A handy tool for creating servers.
  • Makes routing a lot easier with a little bit of configuration.
  • We define the routing via middleware.
var express = require('express')
var http = require('http')

var app = express()

var httpServer = http.createServer(app)
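
A minimal sketch of defining routes via middleware on the app above (the route path, directory, and port are just examples):

var path = require('path')

// serve static files out of the ./web directory
app.use(express.static(path.join(__dirname, 'web')))

// an example route handler
app.get('/hello', function (req, res) {
  res.send('Hello from Express')
})

// fallback middleware for anything not matched above
app.use(function (req, res) {
  res.status(404).send('Not found')
})

httpServer.listen(8080) // example port
console.log('Listening on http://localhost:8080...')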

Child Process

  • Use the require("child_process") library
  • Allows us to manage child processes.
  • Able to monitor child processes via their exit codes.
    • 0: success; non-zero: failure.
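  • A minimal sketch of spawning a child process and reacting to its exit code (child.js is a hypothetical script):

    var childProcess = require('child_process')

    // spawn a child Node process running a hypothetical script
    var child = childProcess.spawn('node', ['child.js'])

    child.on('exit', function onExit(code) {
      if (code === 0) {
        console.log('Child succeeded')
      } else {
        console.error(`Child failed with exit code ${code}`)
      }
    })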

Debugging

  • Launch your process with a debug flag (e.g. node --inspect) so it listens on a specific port, then attach the Chrome DevTools (via chrome://inspect) to that port.
  • Very useful for stepping through the process and inspecting state.
