Tuesday, April 15, 2014

converting openstreetmap xml files to geojson with a node application

I recently found a nice project named osm2geojson. This project converts OpenStreetMap XML files, or OSM files, to GeoJSON format.
In my projects I normally do not need a complete conversion, only a subset. For example, I only need the streets (identified by a tag called highway), so I created a little node application for myself.

The code is as simple as this for my purpose:

#!/usr/bin/env node

var osm2geojson = require('osm2geojson')(function (feature) {
    // keep only the features that carry a highway tag
    return feature.properties.highway !== undefined;
});

var argv = require('optimist').options('f', {
        alias: 'file',
        default: '',
        describe: 'an input file'
}).argv;
var fs = require('fs');
var input = null;

if (argv.file !== '') {
    console.log(argv.file);
    input = fs.createReadStream(argv.file);
}

if (!input) return console.log('There was no input received');
input.pipe(osm2geojson).pipe(process.stdout);

Now I just launch the application by running my file, which is called parser.js:
node ./parser.js -f ./inputfile.osm
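The filter function is the only part I change between projects. A small variation that keeps features carrying any of several tags could look like this (the tag list here is just an example of mine, adjust it per project):

```javascript
// keep a feature when it carries any of the tags we care about
// (this tag list is only an example)
var wantedTags = ['highway', 'building', 'waterway'];

function keepFeature(feature) {
    return wantedTags.some(function (tag) {
        return feature.properties[tag] !== undefined;
    });
}

module.exports = keepFeature;
```

Passing keepFeature to require('osm2geojson')(...) instead of the inline function gives the same pipeline, just with a broader filter.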

Sunday, April 6, 2014

Nodejs

Today I am doing some end-to-end testing of a frontend. That frontend is made in AngularJS. A feature of this application is that it has to be generic enough to fit into several other projects.

This means it is a kind of framework. Because the back-ends of the different applications can contain some hardware, it is time to mock that hardware. But with mocking, is it still an end-to-end test? Because I think it is not, I wrote a kind of simulator that mimics the backend we use. I decided to write that back-end in JavaScript, to learn how to create a Node.js application. First an environment has to be set up.

The environment will contain tests and source code. Luckily we can make use of the grunt tool, and testing will be done with jasmine. As simple as that.

What do we need? A file called package.json. This file describes the application: it contains things like the project name, description, dependencies and version number. A basic template we use is here:
{
    "name": "simulator",
    "version": "0.0.1",
    "description": "A nodeJs app for simulation of the backend",
    "main": "backend.js",
    "dependencies": {
        "ws": "~0.4.31"
    },
    "devDependencies": {
        "grunt": "~0.4.2",
        "grunt-contrib-watch": "~0.5.3", 
        "grunt-jasmine-node": "~0.1.0",
        "grunt-contrib-jshint": "~0.8.0", 
        "grunt-jasmine-node-coverage": "~0.1.7", 
        "grunt-exec": "~0.4.5" 
    }, 
    "scripts": { 
        "test": "test" 
    }, 
    "author": "Bart Vanherck", 
    "license": "GPL" 
}
As you can see, devDependencies lists all dependencies needed to run the grunt tool, and dependencies lists the dependencies of the application itself.
After creating this file, a Gruntfile.js needs to be created. This file contains the tasks grunt can run.

Example (Gruntfile.js):
module.exports = function(grunt) {
    // Project configuration.
    grunt.initConfig({
        pkg: grunt.file.readJSON('package.json'),
 
        watch: {
            files: ['Gruntfile.js', 'spec/*.spec.js', 'src/*.js'],
            tasks: ['jshint', 'jasmine_node']
        },

        jshint: {
            options: {
                unused: true,
                trailing: true,
                noempty: true,
                curly: true,
                indent: 4
            },
            all: ['Gruntfile.js', 'src/*.js', 'spec/*.spec.js']
        },
 
        jasmine_node: {
            coverage: {
                showColors: true
            },
            options: {
                forceExit: true,
                match: '.',
                matchall: false,
                extensions: 'js',
                specNameMatcher: 'spec',
                junitreport: {
                    report: true,
                    savePath: "./coverage/reports/",
                    useDotNotation: true,
                    consolidate: false
                }
            }
        },
        exec: {
            toHtml: {
                command: 'istanbul report html'
            }
        }
    });
 
    grunt.loadNpmTasks('grunt-contrib-watch');
    grunt.loadNpmTasks('grunt-jasmine-node');
    grunt.loadNpmTasks('grunt-contrib-jshint');
    grunt.loadNpmTasks('grunt-jasmine-node-coverage');
    grunt.loadNpmTasks('grunt-exec');
 
    grunt.registerTask('coverage', 'jasmine_node');
    grunt.registerTask('test', 'jasmine_node');
    grunt.registerTask('tohtml', 'exec:toHtml');
    grunt.registerTask('default', 'test');
};
After creating these files, the real application can be built. That, however, is for a next time.
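To give an idea of what jasmine_node will pick up with the configuration above, here is a minimal sketch of a source file and its spec. The file names and contents are my own example, not the real simulator:

```javascript
// src/greeter.js -- a trivial module, just to have something to test
function greet(name) {
    return 'Hello ' + name;
}
module.exports = greet;

// spec/greeter.spec.js -- picked up because of the .spec.js suffix,
// which matches the specNameMatcher option above:
//
//   var greet = require('../src/greeter');
//   describe('greet', function () {
//       it('puts Hello before the name', function () {
//           expect(greet('world')).toBe('Hello world');
//       });
//   });
```

Running grunt test (or plain grunt, since test is the default task) will then execute this spec.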

Tuesday, March 18, 2014

git bash prompt

Even as a tester, we need version control systems. At the moment I personally use git. When installing git on windows, the shell has tab completion by default. This is a nice feature that I wanted to have on all my systems, being mac and linux.

I did not have to implement that myself, because of course it already exists. I followed the procedure that I explain below. When my command prompt enters a repository, the prompt changes as follows:
[user@localhost projectdir (master)]$
While rebasing or during a merge the prompt changes too. Very handy, according to me.
[user@localhost projectdir (master|REBASE-i)]$
To get this nice feature is simple. Download some scripts from the net and place the files in a directory on your system. The scripts themselves ship with git, in the contrib/completion directory of the git source tree.
I placed the files in a directory ~/.bash_files on my system. After that, in my ~/.bashrc file (or ~/.bash_profile on my mac) I added the following commands somewhere at the bottom:
source ~/.bash_files/git-completion.sh
source ~/.bash_files/git-prompt.sh
export PS1='[\u@\h \W$(__git_ps1 " (%s)")]\$ '
If you followed these instructions, you now have tab completion inside your git repositories. It's very nice and handy, that tab key.
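The git-prompt.sh script also understands a couple of optional environment variables. Setting them before the prompt is built adds extra state markers:

```shell
# optional markers supported by git-prompt.sh
export GIT_PS1_SHOWDIRTYSTATE=1       # show * (unstaged) and + (staged) changes
export GIT_PS1_SHOWUNTRACKEDFILES=1   # show % when untracked files exist
export GIT_PS1_SHOWUPSTREAM="auto"    # show <, >, = relative to upstream
```

These exports go in the same ~/.bashrc (or ~/.bash_profile) block, before the PS1 line.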

Monday, March 17, 2014

angularjs

nodejs
I need to test a new web application that our team is going to write in AngularJS.
To start the project, first a kind of environment has to be prepared. AngularJS is a JavaScript framework written by the people at Google, if I am correct.
There are several tools that we use together with this framework, to make our lives easier: grunt, karma, yeoman, protractor and a lot more.

Because some of the tools are written as Node.js applications, Node.js needs to be installed, together with npm, its package manager. Because I did not want to install the complete package system-wide, I downloaded the node.js binary and placed it in a directory. Afterwards I created a script that adds the location of the node.js binary to my PATH variable. With that script, I can use node.js as if it were installed.
The advantage I see here is that I can now place my environment in version control and share it with my colleagues or with our build server. If something needs to be installed globally, it is just installed in the directory where node.js is located. Simple, and now we can have fun with our project.
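The PATH script mentioned above is nothing more than a small shell snippet. A sketch of it, where the directory name is an assumption (use whatever location you unpacked node into):

```shell
#!/bin/sh
# activate-node.sh -- put the local node.js binary first in PATH
# NODE_HOME points at the unpacked node directory (the name is an example)
NODE_HOME="$PWD/tools/node"
PATH="$NODE_HOME/bin:$PATH"
export PATH
```

Sourcing the script (. ./activate-node.sh) makes node, and anything installed next to it, available in the current shell.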

Friday, July 26, 2013

Verifying HTTP websockets traffic with wireshark

I really do like the wirreshark tool. It is a so called network packet analyzer and helped me a lot in the past. For the project where I am working on, there were 2 problems:

  • Wireshark does not capture data at localhost on windows.
  • Wireshark did not dissect our websocket traffic correctly.

Sniffing traffic for the localhost interface on windows.

The official wiki of wireshark says it all:
You can't capture on the local loopback address 127.0.0.1 with a Windows packet capture driver like WinPcap.
This is a problem for me, because I test on my machine with the loopback device, since our application will run on a machine that never has access to any network. This means the requirement of many thousands of clients does not apply to us, but that is another issue.

There is however another solution, and that is the use of another tool. This tool is called RawCap. It is a free command line network sniffer for windows that will be part of my toolbox from now on, together with wireshark of course. To run the tool you need administrator privileges, which is normal, because the tool connects to the network devices at a low level, and a normal user can never have access to hardware drivers at that level.

The usage of the program is also very simple: just run it in a terminal and follow the options. There are also command line options, but to see what they are, you have to check the manual on the website of RawCap. The program logs all traffic to a file, which can be opened in wireshark for further investigation.

An example of the terminal session, with what I typed included:
E:\tools>rawcap
Interfaces:
 0. 172.16.0.151 Wireless Network Connection Wireless80211
 1. 172.16.0.153 Local Area Connection Ethernet
 2. 192.168.6.1 VMware Network Adapter VMnet1 Ethernet
 3. 192.168.111.1 VMware Network Adapter VMnet8 Ethernet
 4. 127.0.0.1 Loopback Pseudo-Interface 1 Loopback
Select interface to sniff [default '0']: 4
Output path or filename [default 'dumpfile.pcap']:
Sniffing IP : 127.0.0.1
File : dumpfile.pcap
Packets : 15370^C
With this useful program, problem 1 is solved.

Dissection of JSON WebSocket traffic
It is possible that, like in our case, the WebSocket traffic is not on an HTTP port. In our case the traffic is on port 5000. That port is also used for IPA, or GSM over IP. Because of this, our traffic is shown as unreadable data, because it will not be unmasked.


Luckily for us there is a solution. Go to the Preferences window and search for HTTP in the Protocols list. On the right side an option TCP Ports will be available. Here we can add our custom port.

That is not enough, because our WebSockets use JSON. Search in the same list for Websocket and select the "As json" option in the "Dissect websocket text" dropdown box.

After applying, our problem is solved. The JSON data can be seen as plain text now.

All the options are there for inspecting the data. Pretty handy for debugging when something is wrong, that wireshark tool.

Thursday, July 4, 2013

Belgium's king Albert II


Yesterday our king decided to leave the throne to his son on 21 july. That day we are off anyway, and we would have celebrated that he had been on the throne for 20 years. But he decided that it was enough. I can imagine that. The recent political situation in our country is very stressful for him. He was never meant to rule anyway, because his brother was king before him. His brother had no children, and that made him the successor twenty years ago.

Because our king has three children, there is a successor. He has two sons and a daughter, which makes the firstborn the new king. Or does it? There might be a little problem. The constitution describes the process of succession when the king dies. Unlike the constitution of the netherlands, the Belgian constitution says nothing about this case, when the king decides that he no longer wants to rule the country. In theory this can cause a problem, because everything the king does or decides must be covered by a member of the government. If it is not covered, there is a crisis.

Luckily for us the latter is not the case, but maybe in the future it can be a problem.

We as testers are aware of this kind of situation. Is it not that we sometimes see such problems in the requirements, and that some say it is not a problem because it nearly never happens? Are we happy about that? What is the risk we are taking? Most of the problems we see in software exist in the real world too. It is just a matter of risk, and of how creative we are in solving the potential problems.

Friday, June 7, 2013

confirmation bias

A few months ago I went to Belgium Testing Days 2013. This was again an amazing conference with a lot of speakers and some nice tutorials. I listened there to the presentation of Huib Schoots. He mentioned several books we should read. Because his presentation gave me some insights and links to other domains, I began to read the book "You Are Not So Smart". One chapter in the book is about confirmation bias. Let's investigate this a little further.

This confirmation bias is in every person. It is the phenomenon that when you buy a new car, you afterwards see that car everywhere. This is because previously you ignored that kind of information, and now your mind is primed to watch for it.

That chapter also describes an experiment, like the many experiments done in the social sciences. In this experiment, a group of people read about a fictive person X. Some of the things the person did were extrovert and others introvert, which makes it a realistic person after all. After a week of knowing all about the life of that person, the researchers divided the group in two. Group one was asked whether person X would be good at working in a library; the other group was asked whether the person would be good at a job as a lawyer.

Both groups agreed that person X could do the work the researchers suggested. The librarian group remembered the introvert character of person X as a pro; the other group remembered person X more as an extrovert. After this, both groups were asked whether person X would be good at the job that had been suggested to the other group. Each group decided that person X was not good for that other job.
This study suggests that the confirmation bias is present even in your memory. You keep recalling the things that confirm what you already know and forget the things that contradict them. Maybe it is because we learned in math that contradictions cannot exist?

The conclusion is that we must think more like scientists and seek evidence for the contrary of what we think. Maybe that is true for testing too?

If you find a bug in the system you are testing, the normal process is that a developer fixes the bug. You verify the fix and find new bugs, similar to this one. That is why they say that bugs stick together and are social: they always show up in the same place. But is that so? Is this not the confirmation bias? Maybe it is. Maybe you should also look further, at totally different parts of the system.

The developer has fixed the bug, but after a while the bug is in the system again for some reason. This should be found too. Even though we are confident that it worked previously, because they fixed the bug, it is not certain that the bug stays away forever. We must collect evidence that it stays away. A solution for this could be automated tests that run at regular times. But not everything is automatable, and that is a risk.

But in my opinion there is another risk too, not a technical one, but at the social level. A manager sees that everything goes fine, no or not that many problems in the system, so the product will be delivered on time. At that moment, you as a tester discover a major blocking bug that impacts the complete system and must be fixed, so the delivery date is in danger. How will the manager react? Does he also suffer from the confirmation bias? How is it possible that everything was fine and suddenly it is not? That is a contradiction in his mind. We as testers have to communicate this, and we run the risk that they blame the messenger instead of the message.