Very often when I deliver a GigaSpaces training course I get asked: “I like the technology, but how would you recommend we test our XAP application?” This happened most recently during the GigaSpaces XAP Advanced training I held in Kiev.
So I thought it would be a good idea to answer that question in this article, so that everyone can benefit from it.
A couple of interns started work at Avisi recently. Every one of them is working on an interesting assignment, and we will introduce them on the blog in the coming weeks. Today we are featuring Mitchel Kuijpers. Our testing framework is based on Selenium WebDriver and uses Sauce Labs for execution. It’s a typical code-first solution. Mitchel’s job is to transform it into a behavior-driven framework.
Ever wanted to improve a badly performing Oracle 11 database, or parts of it? And how would you know for sure that performance actually improved for end users?
An Oracle database is a complex entity. It has all sorts of mechanisms to improve and optimize performance, like caching results, creating and caching execution plans, caching dictionaries, and so on. When measuring query times, the first attempt will often take several seconds, while subsequent attempts take only a few milliseconds. That is because Oracle caches almost everything during that first attempt. In the real world though, where databases are under heavy use, caches expire. Realistic performance is based on first and subsequent attempts together. This means that the more diverse the queries, the less benefit you get from caching.
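To make that concrete, here is a minimal measurement sketch. It times a query callable over several attempts and reports the cold (first, uncached) latency separately from the warm average, so both can be weighed together as described above. The `run_query` callable is an assumption: in practice it would wrap a real JDBC or cx_Oracle call against your database.

```python
import time

def measure_query(run_query, attempts=5):
    """Time a query over several attempts.

    run_query is any zero-argument callable that executes the query.
    The first call is the 'cold' attempt; later calls hit Oracle's caches.
    """
    timings = []
    for _ in range(attempts):
        start = time.perf_counter()
        run_query()
        timings.append(time.perf_counter() - start)
    return {
        "cold": timings[0],                          # uncached attempt
        "warm_avg": sum(timings[1:]) / (attempts - 1),  # cached attempts
        "overall_avg": sum(timings) / attempts,         # realistic mix
    }
```

Comparing `cold` against `warm_avg` for your most common queries gives a feel for how much of the perceived speed is really just cache, which is exactly the effect that disappears under a diverse production load.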
For almost a year now I’ve been testing a Yubikey hard token. Basically, it’s a USB key that adds strong two-factor authentication to the process of logging in to my computer. You can check out my previous blog post on exploring hard tokens and the need for better identity management.
Now it’s time to update you on my experiences thus far…
A very important part of our software development cycle is functional testing. Luckily, functional testing techniques have evolved tremendously since the dark days of old-school testing. Back then, testing was done with countless Excel sheets, each with multiple tabs reflecting the individual scenarios. Each tab looked a bit like this:
Goto web-page: http://myincredibletestproject.com
Click on the login link
Enter username: test
Enter password: secret
Click login button
Verify response: “Failed to login. Invalid credentials.”
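A scenario like the tab above maps almost one-to-one onto a data-driven automated test. The sketch below is illustrative only: the steps become plain data, and a small interpreter executes them against a page object. In a real suite that page object would wrap Selenium WebDriver; here the interface (`goto`, `click`, `enter`, `response`) is a hypothetical stand-in so the example stays self-contained.

```python
# The Excel-tab scenario expressed as data-driven steps.
LOGIN_SCENARIO = [
    ("goto", "http://myincredibletestproject.com"),
    ("click", "login link"),
    ("enter", ("username", "test")),
    ("enter", ("password", "secret")),
    ("click", "login button"),
    ("verify", "Failed to login. Invalid credentials."),
]

def run_scenario(page, steps):
    """Interpret the scenario steps against a page object."""
    for action, arg in steps:
        if action == "goto":
            page.goto(arg)
        elif action == "click":
            page.click(arg)
        elif action == "enter":
            field, value = arg
            page.enter(field, value)
        elif action == "verify":
            assert page.response() == arg, page.response()
        else:
            raise ValueError("unknown step: %s" % action)
```

The point of the data-driven shape is that the scenario reads much like the original sheet, while execution and verification are fully automated.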
Our international economic system is highly dependent on the stability and quality of numerous individual banks. In Europe the main banks have been subjected to so-called ‘banking stress test exercises’ every year since 2009. Banks must take part in the stress test if they are deemed to have a measurable impact on the economic system as a whole.
We software engineers perform testing duties on a daily basis, and every project we work on will be tested, regardless of its size and complexity. For some projects we choose a risk-based approach, and for a few of them we can choose a 100% coverage approach.
If you get involved in a project with 0% test coverage, the first thing you want to do is improve that percentage. If the project is very large, you will not see much movement in your coverage graphs during the first weeks… Better results can be achieved by starting out with graphical user interface tests and measuring the coverage they produce.
In our development process we use an internally developed integration test system, based on Selenium, to test web application user interfaces. It works great. The Selenium tests run against the test environment (usually JBoss, but it can be any application server), and the tests are run periodically by our continuous integration tool, Jenkins. We can even automatically export test results with test steps to Confluence (our wiki) and link the test cases to their specific use cases.
But there was one thing missing: calculating the code coverage of the integration tests.
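For intuition, the core of any coverage measurement is simply recording which lines execute while the tests run. On the JVM this is done by an agent (JaCoCo or Cobertura, for example) instrumenting the bytecode; the toy sketch below shows the same idea in Python using the interpreter's trace hook. It is an illustration of the mechanism, not our actual setup.

```python
import sys

def trace_coverage(func, *args):
    """Run func and return the set of its line numbers that executed.

    A toy version of what a real coverage agent does: hook execution,
    record each line event, then report what was (and wasn't) reached.
    """
    executed = set()
    code = func.__code__

    def tracer(frame, event, arg):
        if event == "line" and frame.f_code is code:
            executed.add(frame.f_lineno)
        return tracer  # keep tracing inside this frame

    sys.settrace(tracer)
    try:
        func(*args)
    finally:
        sys.settrace(None)  # always detach the hook
    return executed
```

Running a branchy function through this once per test input immediately shows which branches the tests never touched, which is exactly the information we wanted from the integration test runs.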
Yesterday, March 6th, 2012, my colleague Barri Jansen and I attended Valori’s “thema avond” (theme night). The subject for the evening was “New generation software for automated testing”. The event was held at Microsoft headquarters in the Netherlands, which is located almost on top of the runway at Amsterdam’s Schiphol airport.