
By SmartBear Blog | March 18, 2016
How Developers and Testers Can Work Together to Overcome the Challenge of Object Recognition
By Justin Rohrman
Webpages are fickle.
Right when I think I know where everything is, a label changes or a button moves, sometimes a couple of pixels, sometimes to a completely different part of the page. For the last year or so, I have been working full time on a project building automated checks for a user interface.
There are a lot of challenges in UI automation - timing, selecting the right tests, figuring out how to make tests powerful - but handling page objects is always the first one I have to figure out.
Here are a few object recognition challenges I have come across and how I got past them.
The Elusive Object
I have come across a lot of projects trying to script away some testing problems in the user interface, but not a lot of success stories. There are a couple of patterns I notice in these hard lessons. The first one is having one person on the test team spend a week building a proof of concept.
Usually that person will spend a couple of days getting their head around the tool, and then a couple more building or recording a few checks to demonstrate it. When I did this, the demo was a nightmare. The one-off scripts would fail intermittently, and in different ways each time. Management understandably lost interest and we took another route.
The other pattern throws caution out and charges forward. The first automation project I worked on was like this. We built a proof of concept that was small and simple enough that it misled the team into thinking things would be easy. Over the next six months we built some shaky infrastructure and a little over a hundred checks. Unfortunately, we rarely had more than half of those passing during the nightly runs. We ended up with an expensive source of information that couldn't be trusted.
Usually when these checks failed, the script just plain couldn't find the screen element it was supposed to be touching. We would kick off a script, it would navigate to the next page, start searching before the page had finished loading, and then, boom: red light.
This can happen for a few reasons.
Probably the most basic problem you'll ever have is dealing with pixels. The first tool I ever used for UI automation was for automating basic IT jobs, like getting a nightly server backup running. When this tool performed a click or typed into a text field, it found those fields by their pixel location on the screen. Any time something on that screen changed - a button changing size, moving to another place on the same screen, or even the browser getting resized - the script would break. If this is how your tool finds objects, you should probably get a different tool.
My next journey was with a very basic framework; I think it was one of the first UI automation libraries built for Ruby. This tool used XPath to find everything on the page, which is a little like specifying a file path.
The first drawback was how slow the process of building checks was. Each time I wanted to touch a new button or field, I had to stop and open developer tools to find the path, which was tedious. The locators were also brittle. If we added a second table above the first, the locator would break. If the default sort on a table changed, the locator would report that the text in row 2, column 2 was now "incorrect." If the rows were orders sorted newest first, and someone logged in as that user and created another order, or deleted the first one, the automated check would again report a failure that wasn't really a failure.
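The article doesn't name that early tool, so here is only a rough sketch of the contrast, using PyAutoGUI for the coordinate-driven approach and Selenium WebDriver in Python for a locator-driven one; the URL, coordinates, and "submit-button" ID are hypothetical.

```python
# A minimal sketch, assuming PyAutoGUI and Selenium are installed and a
# ChromeDriver is available; the page and element names are made up.
import pyautogui
from selenium import webdriver
from selenium.webdriver.common.by import By

# Coordinate-based: clicks whatever happens to be at (640, 480).
# Resize the window or move the button and this clicks the wrong thing.
pyautogui.click(640, 480)

# Locator-based: finds the element wherever it is rendered on the page.
driver = webdriver.Chrome()
driver.get("https://example.com/orders")  # hypothetical page
driver.find_element(By.ID, "submit-button").click()
driver.quit()
```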
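To make the failure mode concrete, here is a hedged sketch of a positional XPath locator next to one anchored on a stable attribute, written with Selenium in Python rather than the Ruby library the author used; the page, the table layout, and the data-order-id attribute are assumptions for illustration.

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("https://example.com/orders")  # hypothetical page

# Positional locator: "second cell of the second row of the first table".
# Adding a table above this one makes the path point at the wrong table;
# re-sorting the rows makes row 2 a different order, so the check compares
# against the wrong cell.
brittle_cell = driver.find_element(By.XPATH, "//table[1]/tbody/tr[2]/td[2]")

# A locator anchored on a stable attribute (assuming the app exposes one)
# survives layout and sort changes.
stable_cell = driver.find_element(By.XPATH, "//td[@data-order-id='12345']")

driver.quit()
```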
Tools have come a long way from those days.
An automated testing tool like TestComplete provides built-in functionality to improve the stability of automated tests. TestComplete works at the object level, which means that when it captures user actions over an application, it records more than just mouse clicks and simple screen-coordinate-based actions.
Searching for an object by its ID in the DOM is one of the most commonly used methods right now. It isn't perfect, but it is much better than the alternatives. Usually when this fails, it's because of a timing issue. For example, I work on a product that is very data dense. Every page has expandable sections with buttons and rows of data. When a test fails from object problems now, it's because the script is trying to interact with the page before everything has loaded and is ready.
Sometimes the ID is generated based on an order number, with a different button for each order. This can be very challenging, especially if you want the tool to go through creating an order and then click the "right" radio button. A human can understand what to click, but coding that in software can be challenging.
Let's talk about how to write code that makes the job of test tooling easier, sometimes called testability requirements.
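One common way to handle that timing problem is an explicit wait that polls until the element is actually present and usable. A minimal sketch with Selenium in Python; the page URL, element ID, and 10-second timeout are assumptions, not details from the article.

```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()
driver.get("https://example.com/orders")  # hypothetical page

# Poll for up to 10 seconds until the element exists and is clickable,
# instead of assuming the page finished loading the instant navigation returned.
expand_button = WebDriverWait(driver, 10).until(
    EC.element_to_be_clickable((By.ID, "expand-section-1"))
)
expand_button.click()
driver.quit()
```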
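When IDs are generated per order, a partial-match locator can sometimes bridge the gap between "a human knows what to click" and something a script can find. A hedged sketch in Python with Selenium, assuming the generated IDs share a stable shape such as order-<number>-approve; that naming pattern is an illustration, not the product described here.

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("https://example.com/orders")  # hypothetical page

# Assume the app generates IDs like "order-98231-approve" for each new order.
# A CSS attribute selector matches the stable prefix and suffix rather than
# the full, unpredictable ID.
radio = driver.find_element(
    By.CSS_SELECTOR, "input[id^='order-'][id$='-approve']"
)
radio.click()
driver.quit()
```

This still assumes only one matching element is on the page; if the script just created the order, it usually has enough context (the order number the application returned, or the newest row) to narrow the selector further.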
Develop for Testability
One of the stranger things I've seen in user interface automation isn't related to technology at all. When these projects start, managers decide that testers should build these systems all on their own. The marketing and product demos tend to support this idea; after all, why should test automation require the help of the people writing the production code? When the project starts moving along and the testers inevitably find a hitch, people get understandably frustrated.
The first thing we noticed was that we couldn't just lay the scripts on top of our product and have them run. Instead, the product itself needed to change to add testability features. Originally, our software wasn't designed with any sort of UI automation in mind, so developers weren't very careful about adding IDs to page elements. Each time I wanted to work on a new page, or even a new element on an old page, I had to put in a change request to get an ID added for that field, and hope I had remembered all the fields on the first try. That created a one-day lag at minimum before I could get to work, longer if something more important was going on at the time.
The other thing we did was a larger effort. The initial tests we wrote were full of hard-coded wait commands. Each button click or navigation was followed by a five-second (or longer) wait to give the page time to load before moving on to the next step in the script. Those waits made the scripts incredibly brittle, because we never really knew how long a page would take to load. If five seconds wasn't long enough, the script would try to click a button or type into a field that wasn't there yet and fail. To get around this problem, we had to design a polling solution that would tell us when a page was ready to be used. Not just design it, but also convince the development managers that it was a good idea and in their best interest. The end result was a flag in the DOM that would be set to isReady='true' when the page was fully loaded.
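A readiness flag like that lets the scripts replace hard-coded sleeps with a poll. A minimal sketch in Python with Selenium, assuming the flag is exposed as an attribute on the page's body element; that location and the timeout are assumptions, while the isReady='true' convention comes from the account above.

```python
from selenium import webdriver
from selenium.webdriver.support.ui import WebDriverWait

driver = webdriver.Chrome()
driver.get("https://example.com/orders")  # hypothetical page

# Instead of time.sleep(5), poll until the application sets its readiness flag.
# Here the flag is assumed to live on the <body> element as isReady='true'.
WebDriverWait(driver, 30, poll_frequency=0.5).until(
    lambda d: d.execute_script(
        "return document.body.getAttribute('isReady') === 'true';"
    )
)
# Safe to interact with the page from here on.
driver.quit()
```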
Successful UI automation projects take effort from everyone.
The product I am working on now probably has the most successful UI automation work I have seen. The product is mostly legacy: it isn't getting new JavaScript libraries thrown in once a week, and we don't have to worry about responsive design. It is built mostly on top of SQL stored procedures. Changes can be tricky because of that; sometimes a change in one place will break something in an area we had never considered. It's also very hard to unit test.
Every time a new feature is added or an old bit of functionality is changed, the developers do it with the idea that I may need to add it to the suite of checks that runs overnight. Every new element in the UI gets a static ID that hopefully doesn't change, and we try to minimize the number of fields that are added and removed dynamically. When it helps stability, the developers also work with me to write SQL scripts to seed data into the database.
Putting a strategy in place for handling your user interface objects will save a lot of pain and frustration down the road. The usual objection follows the old story of "we are already short on developers, we don't have time for that."
A better question might be: do you have the time, a couple of months from now, to untangle the web you are making today? I bet the answer is no, and that you'd rather handle it now.
More Stories By SmartBear Blog
As the leader in software quality tools for the connected world, SmartBear supports more than two million software professionals and over 25,000 organizations in 90 countries that use its products to build and deliver the world’s greatest applications. With today’s applications deploying on mobile, Web, desktop, Internet of Things (IoT) or even embedded computing platforms, the connected nature of these applications through public and private APIs presents a unique set of challenges for developers, testers and operations teams. SmartBear's software quality tools assist with code review, functional and load testing, API readiness as well as performance monitoring of these modern applications.