Our first point of pain is the amount of time it takes to run our acceptance tests. We have more than 500, and they have been taking close to 40 minutes to run. I'm not sure at what point they began to take so long, but I hardly consider 40 minutes acceptable. One of the key ideas behind Test Driven Development is the red, green, refactor cycle, which requires you to run your tests often. With the acceptance tests taking 40 minutes to run, I had begun to find myself running them less often, and thus losing the security blanket we had created by writing those tests from the very beginning.
Another point of pain is the number of web browsers we have to test against. Currently we are supporting eight different browsers: Firefox 2 & 3 (Mac & PC varieties), Safari 2.0.4 & 3 (Mac only), and IE 6 & 7. We are looking to add support for at least two more, Google Chrome & IE 8. One of WatiN's biggest limitations is that it only drives IE. Our stories had to be tested manually in the other browsers to ensure they were complete, and if you do this often enough, you will lose a lot of time out of your development cycle.
We decided to reexamine using Selenium, particularly Selenium Grid and Selenium RC, in our test suite. We were hoping to a) speed up the time it took to run our tests and b) run our tests across a larger set of browsers. Knowing how the Grid works, we knew that b could easily be achieved, but how would the speed compare to WatiN's?
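To give a feel for the cross-browser fan-out the Grid makes possible, here is a minimal, purely illustrative sketch: the environment names are taken from the browser list above, but the `plan_runs` helper and the fixture names are hypothetical, not our actual code. With Selenium Grid, each test request names an environment string, and the hub routes it to a remote control registered under that name, so one suite can target every browser.

```python
# Hypothetical sketch of fanning one acceptance suite out across
# Selenium Grid environments. The environment strings mirror the
# browsers we support; the helper and fixture names are illustrative.
BROWSER_ENVIRONMENTS = [
    "Firefox 2 on Windows", "Firefox 3 on Windows",
    "Firefox 2 on Mac", "Firefox 3 on Mac",
    "Safari 2.0.4 on Mac", "Safari 3 on Mac",
    "IE 6 on Windows", "IE 7 on Windows",
]

def plan_runs(environments, fixtures):
    """Pair every fixture with every target browser environment."""
    return [(env, fixture) for env in environments for fixture in fixtures]

# Two hypothetical fixtures against all eight browsers -> 16 runs,
# each of which the Grid hub can dispatch to a matching remote control.
runs = plan_runs(BROWSER_ENVIRONMENTS, ["LoginFixture", "CheckoutFixture"])
print(len(runs))
```

The point is that adding a browser becomes a one-line change to the environment list rather than another round of manual testing.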
As a test, we converted a small subset of our fixtures, about 10, over to Selenium and did some light benchmarking. After the 10 fixtures were converted, we didn't see any initial improvement in the average test time. Still, we decided the advantage of being able to test across multiple browsers was more than enough reason to convert the rest of our tests to Selenium.
It appeared the quickest way to move forward was to rewrite our tests using Selenium IDE. It took two developers about a week and a half to convert every test over and weed out any old or broken tests. After all was said and done, the Selenium tests ran only about two to five minutes faster than the WatiN tests, but I feel that what we gained was well worth the effort. At the very least, we did not slow ourselves down, we can now test across multiple browsers with ease, and I like knowing that whatever changes we make will be tested and tested well.
*We were since able to speed the tests up and get them to run in as little as 12 minutes. I'll share how in a later post.