About three years ago, the six elevators in the office building where I work were upgraded to be accessible to the visually impaired. As part of the upgrade, the elevators now verbally announce the floor they’re stopping on. Except… some of them announce “Floor ten” and others “Tenth floor.” All six elevators are in one cluster and, in fact, they share common software: they are part of a system where you select your destination floor and are directed to a specific one of the six. So why in the heck does the verbal announcement differ? Inquiring minds want to know. Drives me crazy.
Glad I wasn’t the QA Engineer here!
According to the NTSB’s analysis of a recent incident where a self-driving Uber car struck and killed a pedestrian:
A radar on the modified Volvo XC90 SUV first detected Herzberg roughly six seconds before the impact, followed quickly by the car’s laser-ranging lidar. However, the car’s self-driving system did not have the capability to classify an object as a pedestrian unless they were near a crosswalk.
For the next five seconds, the system alternated between classifying Herzberg as a vehicle, a bike and an unknown object. Each inaccurate classification had dangerous consequences. When the car thought Herzberg a vehicle or bicycle, it assumed she would be travelling in the same direction as the Uber vehicle but in the neighboring lane. When it classified her as an unknown object, it assumed she was static.
Worse still, each time the classification flipped, the car treated her as a brand new object. That meant it could not track her previous trajectory and calculate that a collision was likely, and thus did not even slow down. Tragically, Volvo’s own City Safety automatic braking system had been disabled because its radars could have interfered with Uber’s self-driving sensors.
By the time the XC90 was just a second away from Herzberg, the car finally realized that whatever was in front of it could not be avoided. At this point, it could have still slammed on the brakes to mitigate the impact. Instead, a system called “action suppression” kicked in.
This was a feature Uber engineers had implemented to avoid unnecessary extreme maneuvers in response to false alarms. It suppressed any planned braking for a full second, while simultaneously alerting and handing control back to its human safety driver. But it was too late. The driver began braking after the car had already hit Herzberg. She was thrown 23 meters (75 feet) by the impact and died of her injuries at the scene.
Being a software engineer over age 50
As someone who is currently 52 years old and who has worked in software development now for over 23 years, I agree with a lot of the points in this essay: Being a Developer After 40.
It’s not a bug…
The evolution of cucumber UI test steps
I’m currently the framework/lead developer of a UI testing framework using Selenium WebDriver and Cucumber (written in JavaScript using the webdriver.io bindings). In an ideal world, cucumber test steps have no reference to anything beyond what is visible to the user in the user interface:
And I enter Password1 in the Password field
And I click the Log In button
In the initial version of the test steps that I wrote, I made steps like this reusable by allowing the test author to supply custom data:
And I enter "Password1" in the "Password" field And I click the "Log In" button
Where the text in quotation marks is a variable, allowing the same step to be used for any similar field:
And I enter "johndoe" in the "Username" field And I enter "Password1" in the "Password" field
The JavaScript code behind these cucumber steps made certain assumptions about how the UI text was associated with the UI objects that the user needed to interact with, but this was hidden from the author of the cucumber steps. In the examples above, the assumption was that the text associated with the input field was contained in an associated <label> tag, e.g.,
<label for="username">Username</label> <input type="text" name="username>
Another example step:
And I click the "Submit" button
which corresponds to this DOM element:
<input type="submit" value="Submit">
For our applications that are actively under development, we discussed the need for consistent UI conventions with our developers, and they agreed to abide by a set of UI design patterns. Since the agreed-upon patterns represented best practices in HTML UI development in addition to making automation easier, this was an easy sell. Afterwards, if we started to automate a new UI screen and discovered that it did not conform to the agreed-upon patterns, we filed a design-for-testability bug and the UI would be changed.
In the next phase of our automation project, we began to automate some of our legacy applications, and we quickly discovered that the developers had not been nearly as consistent with the UI designs of these applications: the assumptions behind our existing cucumber steps did not always hold true. Since these applications will eventually be phased out, the company is carefully limiting the amount of development work that is put into them. Therefore, we could not request that the UIs be changed to conform to the design patterns we had agreed upon for our new UIs.
To accommodate automation of these applications, we had to deviate from the cucumber principle that the test writer doesn’t need to know any more about the UI than what they can see. We developed some steps for other common UI patterns, and our automated test developers had to look at the DOM of the elements they needed to interact with in order to decide which step to use, such as:
And I enter "johndoe" in the input text field with name "username"
which corresponds to the input text field:
<input type="text" name="username">
and:
And I click the button that contains the text "Submit"
which corresponds to:
<button...>Submit</button>
(In that step, ‘contains’ is the part that requires the test writer to understand the DOM, and it is what differentiates this step from the other button-related steps.)
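Rough sketches of the step definitions behind these two steps (assuming the same promise-based webdriver.io world object as the other code samples on this blog; these are illustrations, not our exact production steps):

this.Given(/^I enter "([^"]*)" in the input text field with name "([^"]*)"$/, function (textToEnter, name, callback) {
    // No label lookup needed: the test writer supplies the name= attribute directly.
    this.setValue('//input[@type="text" and @name="' + name + '"]', textToEnter).then(
        function () { callback(); },
        function (err) { callback(err); }
    );
});

this.Given(/^I click the button that contains the text "([^"]*)"$/, function (buttonText, callback) {
    // contains() allows partial matches, which is what makes this step different from the
    // exact-match button steps.
    this.click('//button[contains(.,"' + buttonText + '")]').then(
        function () { callback(); },
        function (err) { callback(err); }
    );
});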
To my pleasant surprise, even our manual testers who didn’t really have any significant experience with HTML or the DOM learned these UI patterns quickly and adapted to them.
For the most part, we discovered that the UIs of our legacy applications, while not as conducive to the spirit of cucumber, still used only a fairly small set of UI conventions. We coded some more reusable steps for those other conventions and covered probably 90% of the cases that we encountered.
Eventually, I had to add some steps for the more unusual UI designs, such as:
And I enter "johndoe" in the "input" field with attribute "custom_attribute_name" with value "x_username"
I’m the first to admit that that is an ugly, ugly step, but steps like this allow us to automate the other 10% of UI designs that lie outside the conventions outlined above, and we’ve only had to use these types of steps a few times.
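For the curious, the step definition behind that step might look roughly like this (same webdriver.io assumptions as the sketches above; the locator is built entirely from what the test writer read out of the DOM):

this.Given(/^I enter "([^"]*)" in the "([^"]*)" field with attribute "([^"]*)" with value "([^"]*)"$/,
    function (textToEnter, tagName, attrName, attrValue, callback) {
        var xpath = '//' + tagName + '[@' + attrName + '="' + attrValue + '"]';
        this.setValue(xpath, textToEnter).then(
            function () { callback(); },
            function (err) { callback(err); }
        );
    }
);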
It did not take us long to create a library of steps that accommodate all of our standard UI interactions. I don’t think I’ve added a general-purpose UI interaction step in a couple of months, which is awesome.
For the next phase in the evolution of our automation framework, I’m thinking of using XPath’s ‘or’ operator (or the ‘|’ union operator) to collapse multiple steps back into one, but that may end up being more confusing to our test writers, especially if I can collapse some similar steps but not all of the same type. We’ll see.
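For example, the two button conventions above could, at least in principle, be collapsed into a single step whose locator tries both; this is a hypothetical sketch, not something we have actually shipped:

this.Given(/^I click the "([^"]*)" button$/, function (buttonText, callback) {
    // One locator that matches either button convention; whichever is present gets clicked.
    var xpath = '//input[@type="submit" and @value="' + buttonText + '"]' +
                ' | //button[contains(.,"' + buttonText + '")]';
    this.click(xpath).then(
        function () { callback(); },
        function (err) { callback(err); }
    );
});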
Travel Time to…
I had to take a different route to work this morning, down I-35 from Round Rock to downtown Austin. Every few miles along I-35 there are light-up signs that are sometimes used for Amber alerts and senior alerts. Most of the time, however, they report “Travel Time to” the next major crossroad. As I pass these signs, the QA engineer in me tries to answer the following questions:
- What is the purpose of these signs?
- What am I, a passing driver, supposed to do with this information?
- How well do the signs serve their presumed purpose?
I assume that the purpose of these signs is to give drivers an idea of how congested the roadway is at the present time. As for what I can do with this information: the sign doesn’t give any recommendations, but I assume that if the travel time is short, I could choose to stay on the road, and if it’s longer, I could choose an alternate route or just say f**k it, I’m heading back home.
The most interesting question to me, though, is how well the signs serve their purpose. In order to make a driving decision based on the information on the sign, I would have to have some idea of the following:
- Which crossroad does the sign refer to (I’ll come back to this one)?
- What does the current number of minutes shown say about the expected traffic volume? Making this judgment means I would need to know some of the following:
  - How far is it to the crossroad?
  - What would be a ‘normal’ travel time?
  - By ‘normal,’ am I thinking of traveling the speed limit without congestion slowing me down, or what I might consider ‘normal’ for this day and time, morning rush hour in my case this morning?
  - How does the current estimated travel time compare to ‘normal’?
The first sign that I saw this morning, and the one that inspired me to write this blog post, is installed around Pflugerville, just south of the Louis Henna/SH-45 intersection with I-35, I think. This morning, it read:
TRAVEL TIME TO
FM 734
7-9 MINS
I would argue that this particular sign fails its intended purpose by the simple fact that it refers to a road number that nobody in the Austin area uses. If you asked me which roadway in the Austin area is FM 734, I probably wouldn’t be able to tell you, but from the location of the sign and from having lived in north Austin for 20 years, I deduce/remember that FM 734 is the numerical designation for the roadway that EVERYONE in the Austin area knows as Parmer Ln.
But this brings up another question: who is the intended audience for these signs? I-35 is both a major local expressway and an important interstate highway. Since I am a local resident, my initial assumption is that the information is for me, and therefore, since I believe most Austin-area residents would not know that FM 734 is Parmer Ln, my conclusion is that the information is not very useful. If the sign is intended for someone just passing through Austin, then referring to the crossroad as FM 734 is probably acceptable.
But assuming that I, the passing traveler, know the crossroad, then we come to the point where all of these signs fail their assumed purpose: in order to make use of the information provided, I have to have some way to gauge the estimated travel time against a known quantity. If you asked me, for instance, how far Parmer Ln is from the sign mentioned above, I’d estimate ‘a few miles’; if you asked how long it would take to travel there at the speed limit, I’d estimate ‘a few minutes.’ Assuming my ability to estimate distances and travel times is average, the estimated travel time on the sign is pretty useless.
Let’s assume that my estimate of ‘a few miles’ is somewhere between 3 and 7 miles. If it’s only 3 miles, then the 7-9 minute estimate means traffic is crawling along at roughly 20-25 mph, but if the distance is actually toward the longer end of my estimate, then traffic must be traveling close to the speed limit. Considering that traffic was probably moving at 40-50 mph at the time I saw the sign, right in the middle of that range, my only conclusion is that traffic must be about the same as usual for that time during morning rush hour.
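To make that back-of-the-envelope math explicit (the 3- and 7-mile figures are my guesses, not measured distances):

// Implied speed (mph) = distance (miles) / time (hours)
console.log(3 / (9 / 60)); // 20 mph: short distance, slow end of the 7-9 minute estimate
console.log(7 / (7 / 60)); // 60 mph: long distance, fast end of the estimate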
Conclusion
Giving the estimated drive time to a point that is a few miles away seems to be of very limited usefulness, and it requires the person reading the sign to have fairly accurate knowledge of distances or usual drive times in order to make use of the information. I guess if I drove this stretch of I-35 every day and noticed the signs regularly, then I could make comparative use of the data provided. But if I were unfamiliar with Austin, then without consulting a map, this information would be completely useless. If the estimated drive time this morning had been, say, 15-17 minutes, I probably would have concluded that traffic was very heavy this morning, but considering the relatively short distance from my current location to FM 734, if congestion were that heavy, I would probably already be traveling slowly, in which case I don’t need a sign to tell me that traffic is slow today.
Alternatives
Since these signs apparently use some sort of real-time detection of traffic density or speed, it seems to me that it would be more useful to give an average traffic speed, such as:
AVG TRAFFIC SPD
NEXT 5 MI
40 MPH
Honestly, I would enjoy nothing more than having someone tell me that some of my assumptions above are wrong, or explain the process behind the decision to show estimated travel time.
If tests fail, eliminate them
JSF Program Ditches Tests To Protect Schedule:
A major operational test series planned for the Lockheed Martin F-35 Joint Strike Fighter has been abandoned in an attempt to protect the schedule for delivering a fully operational aircraft, according to the just-released fiscal 2014 report on the program from the Pentagon’s Director of Operational Test & Evaluation (DOT&E).
As I understand it, the F-35 has suffered from increases in scope, schedule and cost. Quality sacrificed. Good thing there are no lives on the line. Oh wait…
Automating Android Cordova/PhoneGap applications with Appium
I am just putting this out there for others to find, because I had such a difficult time locating this crucial tidbit of information: if you select the Android WebView context in your hybrid mobile application, you can run your DOM-based Selenium tests against it using Appium.
I am tasked with creating UI automation for an AngularJS-based application that we are deploying as a web application and as a mobile application using Cordova/PhoneGap. I wrote my Selenium-based tests for the web deployment and then wanted to use the same tests for the Android mobile application using Appium. When I launched my mobile application in the Android emulator and then used Android’s UIAutomator to view the UI objects, all I saw were Android-based objects, no DOM-based objects, even when I selected the WebView context. My heart sank because I thought I would have to write separate automation for the web deployment and the Android app. After quite a bit of Googling, though, I found the nugget of information above (I can’t find the source now). So I’m able to write my tests against the web deployment using Selenium and then run them against the Android app using Appium.
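For anyone hitting the same wall, the context switch itself only takes a couple of calls. This is a rough sketch assuming a webdriver.io client named browser; the webview context name and the exact response shape will vary by app and by webdriver.io/Appium version:

// Assumes an existing Appium session with a webdriver.io client available as `browser`.
browser.contexts().then(function (result) {
    // Typically something like ['NATIVE_APP', 'WEBVIEW_com.example.app'].
    var webview = result.value.filter(function (ctx) {
        return ctx.indexOf('WEBVIEW') === 0;
    })[0];
    return browser.context(webview);
}).then(function () {
    // From here on, the same DOM-based Selenium steps written for the web deployment apply.
    return browser.getTitle();
});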
Identifying form elements by the <label> tag contents
In a previous post, I explained how I use the text associated with UI objects in my cucumber tests. My steps look something like this:
Given I go to the login page
And I enter "johndoe" in the "Username" field
And I enter "password1" in the "Password" field
If the application under test is using the <label> tag to identify form elements (and it should! The <label> tag was designed specifically for this purpose), then your application under test has UI objects that look something like this:
<label for="username">Username</label> <input name="username" type="text"></input>
Writing a cucumber step to interact with the <input> field based on the text in the <label> tag consists of locating the label element and then using the value of its for= attribute to locate the input element. Using the webdriver.io Selenium bindings, my code looks like this:
this.Given(/^I enter "([^"]*)" in the "([^"]*)" field$/, function (textToEnter, labelText, callback) {
    var that = this;
    // Find the <label> whose text contains the label text the test writer sees on screen.
    var xpath = '//label[contains(.,"' + labelText + '")]';
    this.getAttribute(xpath, 'for').then(
        function (value) {
            // get the input tag with the name= {value} and enter the text
            // (assumes the input's name matches its id, as in the example above)
            xpath = '//input[@name="' + value + '"]';
            that.setValue(xpath, textToEnter).then(
                function () { callback(); },
                function (err) { callback(err); }
            );
        },
        function (err) { callback(err); }
    );
});
Words have meanings!
I recently received the following message via LinkedIn:
Dear Stan, We are a young silicon-valley-like startup …
… developing disruptive products for sensing, cognition and communication for the Internet of Things (IoT) market;
… fully funded with an exclusive Fortune 200 customer already secured;
… who is working closely with us to specify the product and take it to market;
… led by a top-notch team of seasoned start-up engineers and executives with successful prior startup exits to multi-national corporations; and
… all right here in Austin.

We are currently looking for a top-notch automation expert and looking at your resume and background I thought you might be a good fit. I hope you are interested in hearing more and would be glad to discuss this opportunity further via a call or f2f meeting. Thanks, [name redacted]
I was really curious to know what he meant by ‘silicon-valley-like,’ so I answered:
What does ‘silicon-valley-like’ denote? That could mean a lot of different things–both positive and negative.
His response:
Good point re: the silicon-valley reference – esp. being a long-time Austinite (by choice) I can understand why it could be considered negative! I was referring to the fact that we have an exciting mission in a hot industry area that can have a big impact with a top-notch team to work with. And the particular role I’d like to go over has some very interesting challenges – for example capturing, storing, analyzing, labeling and retrieving very large data sets.
And my answer again:
I’m pretty sure that “an exciting mission in a hot industry area that can have a big impact with a top-notch team to work with” isn’t a characteristic unique to the Bay Area. I understand that you probably can’t reveal many details, but the quote above doesn’t tell me anything more than “silicon-valley-like.” So far, you’ve basically told me nothing at all about the opportunity.
If this ‘silicon-valley-like’ startup hired this guy to do their recruiting, I can only come to one of two conclusions:
- He doesn’t know the business well enough to give meaningful details, or
- He doesn’t understand recruiting well enough to get to a candidate’s concerns quickly and answer them.
Based on what I saw on LinkedIn (LI only let me see this guy’s name and title), however, I suspect that he is one of the founders or early employees. If that’s the case, then the possible explanations above for his behavior make me even less inclined to take his offer seriously. Do I want to work in a company where this person has a leading role? Hell no.