Matt Lowrie (Google)
Evolution of Business and Engineering Productivity
Manasi Joshi (Google)
In this keynote, we take everyone on the journey of how the engineering productivity discipline evolved at Google, and how it was and is instrumental to Google's business growth: moving fast, staying stable, and providing a high degree of confidence throughout the development, release, and monitoring processes. We also touch on some of the challenges we face today, and on new horizons for cross-platform testing in the highly connected, vertical product experience that Google is going through.
Automating Telepresence Robot Driving
Tanya Jenkins (Cantilever Consulting)
Testing a telepresence device's driving interface is challenging: the device operates in the real world and interacts with people and objects, yet must be tested in a controlled environment. How do you create a realistic remote driving environment while simultaneously validating the location and position of a device you can't see? I'll present an innovative solution.
What’s in your Wallet?
Hima Mandali (Capital One)
Capital One is one of the largest credit card companies in the US, with over 70 million accounts. At Capital One, we are building lots of cool products that provide amazing digital experiences for our customers. With mobile devices becoming our customers' preferred channel, this talk will focus on how we solved the problem of test automation for mobile web apps, and what we did to achieve a faster software delivery pipeline. We will also share the open-source tools we used and the open-source dashboard we built to solve our problems.
Using test run automation statistics to predict which tests to run
Boris Prikhodky (Unity Technologies)
Tests have become a vital part of the application development process, but what do you do when a one-time savior becomes a bottleneck in daily work? Here we share our experience of what we did when we faced 3-6 hour wait times for a test configuration to run. This talk presents a simple yet powerful approach that saves the precious time otherwise spent running evergreen tests on a build-and-test farm. Possible ways to improve the process are also covered.
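One way to read "using run statistics to predict which tests to run" is a selector that always runs historically failing tests and only occasionally runs evergreen ones. A minimal sketch of that idea (function names, thresholds, and the rotation policy are mine, not necessarily what the talk describes):

```python
from collections import namedtuple

Run = namedtuple("Run", ["test", "passed"])

def select_tests(history, min_failure_rate=0.01, always_run_every=10, build_number=0):
    """Pick the tests worth running on this build.

    Tests with a non-trivial historical failure rate are always run;
    "evergreen" tests (which never fail) are only run on every Nth build,
    saving time on the build-and-test farm.
    """
    stats = {}
    for run in history:
        passed, total = stats.get(run.test, (0, 0))
        stats[run.test] = (passed + (1 if run.passed else 0), total + 1)

    selected = []
    for test, (passed, total) in stats.items():
        failure_rate = 1 - passed / total
        if failure_rate >= min_failure_rate:
            selected.append(test)   # historically failing: always run
        elif build_number % always_run_every == 0:
            selected.append(test)   # evergreen: run only occasionally
    return sorted(selected)
```

A real system would also weight recent runs more heavily and re-run skipped tests before a release.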
Selenium-based test automation for Windows and Windows Phone
Nikolai Abalov (2gis)
There is Selenium for test automation of web applications. There is Appium for mobile applications on iOS and Android. But for Windows Desktop and Windows Phone/Mobile we had to come up with our own Selenium-based solution, and so Winium was created. Winium is an open-source solution for test automation of Windows Desktop and Windows Phone/Mobile apps. Because Winium is Selenium-based, it should be relatively easy to start using it for your automation needs if you already know Selenium or Appium, and it can be integrated into your existing Selenium infrastructure. In the talk I will present the projects that compose Winium and demonstrate both Winium.Desktop and Winium.Mobile in action.
The Quirkier Side of Testing
Brian Vanpee (Google)
ML Algorithm for Setting up Mobile Test Environment
Rajkumar Bhojan (Wipro Technologies)
With the rapid advance of mobile computing technology, there is significant demand for testing mobile applications on mobile devices. Mobile device management plays a vital role in mobile app testing, and understanding its challenges is as important as solving them. To avoid device-specific problems, test automation developers must test their apps on a large number of devices, which is costly and inefficient. In this talk, we show how a machine learning algorithm can identify the right set of devices for setting up a mobile test environment.
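The talk's actual algorithm is not described in the abstract; as an illustration of the underlying idea, picking a small set of devices that still covers the attribute space can be sketched as a greedy set-cover heuristic (the device names and attributes below are invented):

```python
def pick_devices(devices, k):
    """Greedily pick k devices that together cover the most attribute values.

    `devices` maps a device name to its set of attributes
    (OS version, screen type, chipset, ...).  Each step picks the device
    adding the most not-yet-covered attributes -- a simple stand-in for
    the learned selection the talk describes.
    """
    chosen, covered = [], set()
    for _ in range(k):
        best = max(devices,
                   key=lambda d: len(devices[d] - covered) if d not in chosen else -1)
        if best in chosen:
            break
        chosen.append(best)
        covered |= devices[best]
    return chosen
```

A learned model could replace the scoring function with predicted defect-exposure value per device rather than raw attribute counts.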
“Can you hear me?” - Surviving Audio Quality Testing
IATF: A New Automated Cross-platform and Multi-device API Test Framework
Yanbin Zhang (Intel)
To ease adoption of WebRTC technology and make it widely available to expand or create new applications, Intel has developed an end-to-end WebRTC solution, Intel® Collaboration Suite for WebRTC. Intel has already established a growing worldwide ecosystem around it, with cooperation covering areas including education, medical, industrial cloud, social media, online broadcasting, video conferencing, and wearables. The rapidly growing number of platforms supported by the SDK APIs makes the cross-platform compatibility and integration testing effort grow explosively, and automatically testing interoperability across the various SDKs on different platforms becomes a big problem. In this talk, we present our Automated Cross-platform and Multi-device API Test Framework (IATF). It can be adopted for testing any cross-platform, multi-device SDK that needs communication across different platforms.
Using Formal Concept Analysis in software testing
Fedor Strok (Yandex/NRU HSE)
Formal Concept Analysis provides a toolbox for building a formal ontology over a set of objects with descriptions (expressed as sets of attributes). This branch of algebraic theory was introduced in 1984 and is now applied to a wide variety of data mining tasks. This talk focuses on techniques that can be especially valuable for software testing: using a formal ontology for convenient test reports and for semi-automatic test case derivation.
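For readers new to FCA: a formal concept is a pair (extent, intent) where the extent is exactly the objects sharing all attributes in the intent, and vice versa. A naive enumeration for toy contexts (the tiny test context below is mine; real FCA tooling uses far more efficient algorithms):

```python
from itertools import combinations

def formal_concepts(context):
    """Enumerate all formal concepts of a small object -> attribute-set context."""
    objects = list(context)
    all_attrs = set().union(*context.values()) if context else set()
    # Closed attribute sets arise as intersections of object intents.
    intents = {frozenset(all_attrs)}
    for r in range(1, len(objects) + 1):
        for group in combinations(objects, r):
            intents.add(frozenset(set.intersection(*(set(context[o]) for o in group))))
    concepts = []
    for intent in intents:
        extent = frozenset(o for o in objects if intent <= context[o])
        # Keep only pairs where the intent equals the attributes shared by the extent.
        shared = set.intersection(*(set(context[o]) for o in extent)) if extent else all_attrs
        if frozenset(shared) == intent:
            concepts.append((extent, intent))
    return concepts
```

Ordering these concepts by extent inclusion yields the concept lattice that the talk proposes using for test reports and test case derivation.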
Flaky Tests in Continuous Integration: Current Practice at Google and Future Directions
John Micco (Google) and
Atif Memon (University of Maryland, College Park)
Google has an enormous corpus of tests that we run continuously in our massive continuous integration system. Looking at this data, we find that flaky tests cause waste along several different dimensions. We are working to improve our ability to understand the impact of the inherent flakiness in our system, and to detect and mitigate it.
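The classic signal for flakiness, which the talk's data analysis builds on, is a test that both passes and fails at the same code snapshot. A minimal sketch of that detector (the data shape is my assumption):

```python
def find_flaky(results):
    """Flag tests that both passed and failed at the same code snapshot.

    `results` is an iterable of (test, commit, passed) tuples.  A test
    showing both outcomes for one commit failed for a reason other than
    the code under test -- the defining signal of flakiness.
    """
    outcomes = {}
    for test, commit, passed in results:
        outcomes.setdefault((test, commit), set()).add(passed)
    return sorted({test for (test, _), seen in outcomes.items() if len(seen) == 2})
```

At scale, the hard part is not this check but deciding how many re-runs at a snapshot are enough to trust the verdict.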
Developer Experience, FTW!
Niranjan Tulpule (Google)
Docker Based Geo Dispersed Test Farm - Test Infrastructure Practice in Intel Android Program
Jerry Yu (Intel) and Guobing Chen (Intel)
OpenHTF - The Open-Source Hardware Testing Framework
Joe Ethier (Google) and John Hawley (Google)
Directed Test Generation to Detect Loop Inefficiencies
Monika Dhok (Indian Institute of Science)
Redundant traversal of loops has been identified as a source of performance bugs in many Java libraries. This has led to the design of static and dynamic analysis techniques that detect these performance bugs automatically. However, while the effectiveness of dynamic analyses depends on the input tests analysed, static analyses are less effective at automatically validating the presence of these problems, validating the fixes, and avoiding regressions in future versions. We propose a novel approach to automatically generate tests that detect loop inefficiencies in Java libraries. This talk gives a brief overview of this work.
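To make the bug class concrete, here is an invented example of a redundant traversal, plus the kind of check a generated test can apply: grow the input and verify the work grows no faster than linearly (the 1.5x slack factor is an arbitrary choice of mine):

```python
def count_comparisons(haystack, needles):
    """A membership check with a redundant traversal: the haystack is
    re-scanned from the start for every needle, giving O(n*m) work."""
    comparisons = 0
    found_all = True
    for n in needles:
        hit = False
        for h in haystack:          # redundant traversal restarts here
            comparisons += 1
            if h == n:
                hit = True
                break
        found_all = found_all and hit
    return found_all, comparisons

def looks_superlinear(small, large):
    """A directed test doubles the input and checks whether the measured
    work more than doubles -- the signature of a loop inefficiency."""
    (n1, c1), (n2, c2) = small, large
    return c2 / c1 > 1.5 * (n2 / n1)
```

The point of *directed* generation is choosing adversarial inputs, here needles absent from the haystack, so every scan runs to completion and the inefficiency is guaranteed to show.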
Need for Speed - Accelerate Automation Tests From 3 Hours to 3 Minutes
Emanuil Slavov (Komfo Inc)
All high-level automated tests are slow for today's fast-paced, first-to-market environment. This is the elephant in the room that everyone ignores, and for a good reason: achieving fast, reliable, and useful automated tests is hard work. However, you have no choice, because with slow automated tests you're just shipping crap to your customers faster. At Komfo, we had tests running for more than 3 hours every night, and the execution time kept growing unrestricted. The tests were getting unstable and unusable as a feedback loop; at one point they failed for more than 20 days in a row, and regression bugs started to appear in production. We decided to stop this madness, and after considerable effort and dedication the same tests now run in less than 3 minutes. This is the continuous-improvement story of how we achieved 60x faster tests.
Code Coverage is a Strong Predictor of Test Suite Effectiveness in the Real World
Rahul Gopinath (Oregon State University)
ClusterRunner: making fast test-feedback easy through horizontal scaling
Box runs around thirty hours of unit and integration tests on every commit. We parallelize them to run in less than 17 minutes using our open-source test distribution platform, ClusterRunner. Why does Box have so many tests? How does ClusterRunner work? Is it easy to set up ClusterRunner for your own tests? (Spoiler: yes.) ClusterRunner gives you insanely speedy test feedback by both parallelizing tests on a single host and distributing them across many hosts. Developed by Box's Productivity Engineering team, ClusterRunner is used internally to run a suite of over thirty linear hours of tests in 17 minutes, hundreds of times every day. It is open source and language-agnostic, so you can easily use it for your own project. We created ClusterRunner for engineering teams who struggle with long test feedback delays or under-tested code. It is designed from the bottom up to be easy to use and to integrate with your existing CI system. It learns how long your tests take to run and schedules future runs accordingly to deliver feedback as fast as possible. Its components communicate via a friendly REST API, which makes it both accessible and extensible.
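"Learns how long your tests take and schedules accordingly" is, at its core, a makespan-minimization problem. One standard heuristic, which may or may not match ClusterRunner's actual scheduler, is longest-processing-time-first assignment onto the least-loaded worker:

```python
import heapq

def schedule(durations, workers):
    """Assign tests to workers, slowest test first, always onto the
    currently least-loaded worker.  Returns (makespan, assignment).

    `durations` maps test name to its historically observed run time.
    """
    heap = [(0.0, w) for w in range(workers)]   # (current load, worker id)
    heapq.heapify(heap)
    assignment = {w: [] for w in range(workers)}
    for test, secs in sorted(durations.items(), key=lambda kv: -kv[1]):
        load, w = heapq.heappop(heap)           # least-loaded worker
        assignment[w].append(test)
        heapq.heappush(heap, (load + secs, w))
    return max(load for load, _ in heap), assignment
```

Scheduling long tests first avoids the worst case where one worker picks up a long test at the end while the others sit idle.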
Integration Testing with Multiple Mobile Devices and Services
Mobly is an open-source, Google-developed framework for testing products that require interactions among multiple devices, like social apps, or tests that require controlling the test environment, like a Wi-Fi connection. We'll discuss how multi-device testing differs from single-device testing, its unique problems, like synchronization and code flow between multiple devices, and how Mobly solves them.
Scale vs Value: Test Automation at the BBC
We built an in-house, open-source device cloud to scale testing of our mobile and TV applications, but it pretty quickly grew into a monster that forced us to rethink our approach to automation and find the correct balance between scale and value. Learn how we solved the challenges of on-device testing with focused automation and shared ownership, and discover how to build your own internal device cloud and leverage our open-source tools.
Finding bugs in C++ libraries using LibFuzzer
Kostya Serebryany (Google)
How I learned to crash test a server
Jonathan Abrahams (MongoDB)
Come learn how we tested the robustness of the MongoDB server to survive various system crash scenarios, and how we automated crashing a server on any OS and host configuration, physical or virtual.