Cross Browser Testing - What does it mean?

This post is from the Experitest blog and has not been updated since the original publish date.

Last Updated Oct 20, 2018 — Continuous Testing Expert


Automating Visual Tests Across Multiple Viewports and Browsers

Over the past few years, we have witnessed enterprises pursue "Cross Browser Testing". Simply defined, it is the process of testing websites on multiple operating systems, devices, browsers, and screen resolutions. Devices and operating systems are more fragmented than ever, so it is essential that a website or web application displays correctly no matter how it is viewed.

What any enterprise wants is for its customers and users to enjoy the best possible user experience across all of its endpoints. Selenium is the tool increasingly used to accomplish this.

What cross browser testing infrastructure providers give their customers is a wide matrix of browsers (such as Chrome, Safari, Internet Explorer, and Edge), along with the ability to test on different browser versions, operating systems, and screen sizes.

The main way this is done is by developing a system in which you can run Selenium test scripts on different browsers and versions.
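As a sketch of what such a system has to cover, the combinations multiply quickly. The browser, OS, and viewport values below are illustrative, not an exhaustive real-world matrix:

```python
from itertools import product

# Hypothetical test matrix: these browsers, OSes, and viewport sizes
# are illustrative values, not a complete real-world list.
BROWSERS = ["chrome", "firefox", "safari", "edge"]
OSES = ["windows", "macos", "linux"]
VIEWPORTS = [(1920, 1080), (1366, 768), (375, 667)]

def build_matrix(browsers, oses, viewports):
    """Enumerate every browser/OS/viewport combination to test."""
    return [
        {"browser": b, "os": o, "width": w, "height": h}
        for b, o, (w, h) in product(browsers, oses, viewports)
    ]

matrix = build_matrix(BROWSERS, OSES, VIEWPORTS)
print(len(matrix))  # 4 browsers x 3 OSes x 3 viewports = 36 combinations
```

Each entry in the matrix would then be handed to a test runner that launches the corresponding browser and runs the same Selenium script against it.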


The importance of having a responsive website

Responsive web design is the ability of a website or web application to adapt successfully to any screen size, and it has become paramount, due in no small part to the dramatic rise in mobile web use. Statista has reported that by 2020 the number of smartphone users in the world will have passed the 5 billion mark. This has accelerated the need to perform cross browser testing across a wide variety of browsers and OS versions.

It is not only the browser that must be tested; the browser viewport is of equal importance. The viewport is the visible area of a web page (its width and height) and plays a huge role in how sites are displayed.
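A responsive site typically maps viewport width to a layout breakpoint. A minimal sketch, with pixel thresholds that are illustrative rather than taken from any real stylesheet:

```python
# Hypothetical responsive breakpoints, checked widest-first.
# The pixel thresholds are illustrative, loosely following
# common CSS framework defaults.
BREAKPOINTS = [(1200, "desktop"), (768, "tablet"), (0, "mobile")]

def layout_for(viewport_width):
    """Return the layout a responsive site would serve at this width."""
    for min_width, name in BREAKPOINTS:
        if viewport_width >= min_width:
            return name

print(layout_for(1366))  # desktop
print(layout_for(375))   # mobile
```

Because each breakpoint can rearrange, hide, or resize elements, every breakpoint is effectively a different page that needs its own visual verification.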

If a website does not display correctly for whatever reason - device or viewport - this can severely impact your user experience.


Cross browser testing with Selenium

As we mentioned at the top, the consensus in today's testing strategies is to automate your cross browser tests using Selenium. Personally, I feel we are missing a very important point here: in many cases, Selenium tests will overlook compatibility issues between different browsers.

In fact, your website could be totally broken in a specific browser version, yet all of your Selenium tests will still be marked as passed.

The reason for this is that Selenium does not interact with a website in the same way in which humans interact with a website - visually.

Where we as humans look at the visual side of a website in order to interact with it, Selenium simply looks at the page source and performs actions on elements by triggering JavaScript events. This means a broken, unusable site that is only tested via Selenium will pass every test, and UX issues will go unresolved.

So why waste all of the effort of executing your tests on different combinations if the end result is going to be a false positive?


User Interaction vs Selenium Interaction

A normal user interaction is a path a user takes on a given website or app in order to complete a task. The most common example is a customer purchasing a product: the customer must select the product in the configuration they want, add it to their shopping cart, navigate to the cart, and complete the purchase. If any element - for example, the "checkout" button - does not display correctly, the user simply will not be able to complete the purchase and the sale will be lost.

This is the main challenge facing Selenium users. Selenium WebDriver controls your website through the DOM (the HTML document); the driver does not test the visual part of the user experience.

The worst thing that can happen in terms of user experience testing is that a user cannot identify an element that is mandatory for completing the action they want to perform.

That is the difference in how Selenium tests a user flow: the flow is analyzed at the code level, not the user level, so UX issues can easily slip through Selenium's grasp.
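The gap can be shown with a toy model: a DOM-level check in the style of Selenium passes for an element a real user could never press. Everything here - the element, its fields, and both checks - is invented for illustration:

```python
# Toy model: the button exists in the DOM, so a Selenium-style check
# passes, even though its rendered geometry makes it unusable.
button = {
    "in_dom": True,             # Selenium can find and click it...
    "width": 0,                 # ...but broken CSS collapsed it to 0px
    "overlapped_by": "banner",  # ...and another element covers it
}

def selenium_style_check(el):
    """DOM-level check: is the element present so a click can be fired?"""
    return el["in_dom"]

def visual_style_check(el):
    """Visual check: could a human actually see and press the button?"""
    return el["in_dom"] and el["width"] > 0 and el["overlapped_by"] is None

print(selenium_style_check(button))  # True  -> test suite reports "passed"
print(visual_style_check(button))    # False -> real users cannot proceed
```

The DOM-level check is blind to geometry and stacking, which is exactly the information a layout bug corrupts.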

Take the picture below for example.

Figure 1 - GitHub Sign-in Page - Rendered by IE 10


The 'Sign in' button is the most important button (the main action button) on this page. However, in this example the view is corrupted when rendered in Internet Explorer. Although no real user would mark this test as passed, Selenium overlooks the issue and simply triggers the click, which fires the button's "onClick" function and navigates the user to the next page.


What Can We Do?

Selenium does a great job when it comes to testing the functionality of a website. In our experience, there is no real need to run functional tests across different browsers and viewports: the rare cases in which a functional aspect of an application fails on a particular browser or viewport do not justify the effort of deploying and maintaining an automated cross browser testing environment for them.

That leaves us with two options:

  1. Abandon our cross browser testing automation strategy and have real users run our tests across multiple browsers manually.
  2. Find another method of automating visual testing.

Since continuous delivery has become the standard for web applications, we cannot really consider the first option. No enterprise has the time or desire to shell out the money it would cost to manually test thousands of browser/OS combinations.

So, to address this challenge, we need tools that complement Selenium's capabilities by validating the visual side of the application's user interface. One such tool is the Galen Framework:


Galen Framework is an open source testing framework developed specifically to execute layout testing of web applications in a real browser, with responsive design testing as its express goal.

Here is how it works:

  1. Galen opens a page in a browser
  2. Resizes the browser window to a specified size
  3. Tests the layout with Galen Specs

The Galen Framework is built on Selenium, so you are free to carry out whatever actions you need: clicking, typing, injecting client-side JavaScript, and so on. The language that defines how a page should look on different devices is called Galen Specs. It is expressive and allows you to describe the complete layout of your website with minimal text.

Let's take a look at an example.

The following example shows you how to catch a UX issue using Galen spec:
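A minimal sketch in the Galen Specs style - the object names, CSS selectors, and pixel ranges below are all invented for illustration, not taken from a real project:

```
@objects
    header          css  .site-header
    sign-in-button  css  .btn-signin

= Sign-in page =
    sign-in-button:
        visible
        below header 10 to 80px
        width 200 to 320px
        height 30 to 50px
```

A spec like this would fail on a page where the button is rendered collapsed or pushed out of place, even though a plain Selenium click on the same button would still succeed.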


Galen result file for the given spec:

Galen and Selenium working as a team.

It is an exciting time to be in the web application testing business. There are so many permutations of devices, browsers, and OSes that carrying out cross browser testing manually is completely infeasible.

Selenium's ability to test the functionality of a web application is strong, but it lacks the ability to carry out visual testing.

Enter Galen, which was developed specifically to execute visual testing. There have been other attempts at visual testing solutions, but these rely on image comparison and are too complex and awkward to really solve the issue. Galen, with its purpose-built Galen Specs language, can truly interact with and test the visual nature of your web application.

Scaling automated visual testing

Experitest has recently launched a new automated visual testing tool that helps you easily develop Galen based tests and integrate them into your Continuous Testing pipeline for truly effective cross browser testing. Read more about the solution here.


Guy Arieli - CTO
