How we automatically test responsive pages on mobile devices

The challenge of responsive testing – manual testing is too much work!

Responsive sites have made it to the mainstream. If you order a new website today, it is likely to be responsive. While this is a wonderful concept for the end user, it has its challenges for the developer. The meaning of browser compatibility testing (or cross-browser testing) has changed: manually testing your website on over 20 different devices, plus handling the quirks of desktop browsers, is a task so daunting that few even start. So how can you test a responsive website quickly?

Give me a reference!

For any test (especially an automated test) to be set up well, you need to know what the expected result is. In the literature this is called “the oracle” or “reference”. What is the reference in responsive testing? Most developers work with their favourite browser (which in the majority of cases is Chrome, with Firefox to a lesser extent). For responsive layouts, developers simply manipulate the window width or use Chrome developer tools. There are also some web-based window size manipulators that claim to “emulate iPad”, but don’t take their word for it. The point remains: there is one browser where the developer debugs her work manually anyway.

So we decided to do the same: take screenshots at different browser sizes and use them as the point of reference. However, that is nothing new – there are plenty of screenshot tools out there. This is where you hit the next challenge. Given that you test your site on 30 devices and 13 desktop browser versions, you end up with 43 pictures. Even if someone took the time to go through this deck manually, the detection rate would be low. Very low.
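For illustration, here is a minimal Selenium sketch (in Python) of what capturing such reference screenshots could look like. The breakpoints, the example URL and the file names are arbitrary choices for the sketch, not part of Browserbite’s own tooling:

```python
# Minimal sketch: capture baseline screenshots at a few viewport widths.
# The breakpoints, URL and file names are illustrative examples only.
from selenium import webdriver

BREAKPOINTS = {            # width x height in CSS pixels
    "phone":   (375, 667),
    "tablet":  (768, 1024),
    "desktop": (1366, 768),
}

driver = webdriver.Chrome()
driver.get("https://example.com")

for name, (width, height) in BREAKPOINTS.items():
    driver.set_window_size(width, height)
    driver.save_screenshot(f"baseline_{name}_{width}x{height}.png")

driver.quit()
```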

Desktop layout vs. mobile device layout on a sample website.

The real pain – and a painkiller

What a web developer wants to know in the end is “it works fine in my browser (usually Google Chrome), but does it look the same everywhere else?”. At this stage you want to focus only on the potential problems and ignore the devices where your site renders fine. Looking through dozens of screenshots by hand or doing manual cross-browser testing is not the solution here.

This is where Browserbite’s computer-vision-based comparison algorithm comes on stage. It does not care whether it is comparing 2 or 200 pictures, and it is not doing a pixel-by-pixel comparison either. The result is an effective filter that works in real life: you can focus on the differences and feel safe about the green flags.
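Browserbite’s actual algorithm is not public, so the sketch below is only a generic illustration of the idea of comparing coarse regions rather than raw pixels; the grid size and threshold are arbitrary example values:

```python
# Illustrative region-based comparison (not Browserbite's algorithm):
# compare coarse grid cells so that one-pixel shifts and anti-aliasing
# noise do not flag a difference the way a pixel-by-pixel diff would.
import numpy as np
from PIL import Image

def region_differences(path_a, path_b, rows=8, cols=12, threshold=12.0):
    """Return grid cells whose average colour differs by more than `threshold`."""
    size = (960, 640)  # normalise both screenshots to a common size (w, h)
    a = np.asarray(Image.open(path_a).convert("RGB").resize(size), dtype=float)
    b = np.asarray(Image.open(path_b).convert("RGB").resize(size), dtype=float)

    cell_h, cell_w = size[1] // rows, size[0] // cols
    flagged = []
    for r in range(rows):
        for c in range(cols):
            cell_a = a[r * cell_h:(r + 1) * cell_h, c * cell_w:(c + 1) * cell_w]
            cell_b = b[r * cell_h:(r + 1) * cell_h, c * cell_w:(c + 1) * cell_w]
            # Mean absolute difference of the region's average colour
            diff = np.abs(cell_a.mean(axis=(0, 1)) - cell_b.mean(axis=(0, 1))).mean()
            if diff > threshold:
                flagged.append((r, c, round(float(diff), 1)))
    return flagged

# Usage: compare a desktop baseline against a device screenshot
print(region_differences("baseline_desktop_1366x768.png", "device_screenshot.png"))
```

A production tool would use more robust features than average colour, but the principle of tolerating small rendering differences while flagging layout-level changes is the same.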

Excerpt from a test where differences are detected on two mobile devices.

Show me the money!

So how much would that automation be worth to you? The real question in testing is how much a bug can cost you and how much you are willing to invest to avoid that cost.

Let’s consider the alternatives: manual testing and manual screenshot comparison.

Manual testing covers a lot more than layout checks and can therefore deliver more business value, especially in a web application with business logic. The downsides are time and detection rate. Even if a manual tester spends only about a minute on each device and browser (with all of them readily at hand), it still adds up to the better part of an hour of manual labor. Secondly, the detection rate for layout errors in manual testing is notoriously low: roughly two thirds of bugs go unnoticed.

Manual screenshot comparison does not have the downside of opening every configuration by hand, but it suffers from the same setback of a low detection rate.

In conclusion, it is always more effective to let automation do the detection work and let people do the final scrutiny, without the boring part of flipping through hundreds of images. The methods complement each other.

The obstacles – when real life interferes

In practice there are some pitfalls when you start comparing images from different devices and browsers. We will publish a separate article on the pitfalls and how to tackle them. In short, the following things will cause false results in image-based tests:

  • Sliders and on-page animations – two devices will capture a different phase of the slider or animation
  • Dynamic content – banners, Facebook/Twitter/RSS feeds and A/B tests that shuffle their content; rotating banners also fall into this category
  • Lazy loading – pictures, CSS or fonts that are loaded after the page load has completed
  • User-agent-string-based logic – loading different fonts/images/CSS depending on the user’s browser
  • Splash pages – a prompt that spans the entire viewport, hiding the content
  • Fixed menus – menus that stick to the header when scrolling down long pages
  • Cookie information prompts – those annoying pop-ups that inform you about basic web technology
  • Font differences – different operating systems and browsers have access to different fonts, causing rendering differences

There are solutions and workarounds for all of the problems listed, and we have covered pretty much all of them (except for splash pages and cookie prompts, but a fix is coming!).
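As one example of such a workaround (a generic approach, not necessarily how Browserbite handles it), dynamic content and animations can be neutralised with an injected stylesheet before the screenshot is taken; the CSS selectors below are hypothetical:

```python
# Sketch: freeze animations and hide dynamic blocks before capturing.
# The selectors are hypothetical examples for a typical page.
FREEZE_CSS = """
*, *::before, *::after {
    animation: none !important;
    transition: none !important;
}
.carousel, .ad-banner, .social-feed { visibility: hidden !important; }
"""

def freeze_page(driver):
    """Inject a <style> tag that disables animations and hides dynamic content."""
    driver.execute_script(
        "const style = document.createElement('style');"
        "style.textContent = arguments[0];"
        "document.head.appendChild(style);",
        FREEZE_CSS,
    )

# Usage, continuing the earlier Selenium sketch:
# freeze_page(driver)
# driver.save_screenshot("baseline_phone_frozen.png")
```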

What next? A future both simpler and more complex

We feel that we are at the start of something wonderful, but we can do much better. Right now our solution is built around browsers, but we would like to simplify the workflow even further and group the automatically detected issues together. Stay tuned!

Browserbite is a unique service for automatically testing the layout of responsive webpages on more than 45 mobile devices and desktop browsers. Try it out on your website below!
