Nicky Tests Software

Friday, May 17, 2019

Testbash Brighton 2019 Conference Day Part II of II

Part I is here

Here are some of my key learnings from the last 3 talks I was able to attend at Testbash Brighton 2019 before I had to leave early to catch my flight back home.


Gareth Waterhouse and Lindsay Strydom: Building Communities at Scale

2010: 
  • 12 QAs
  • 1 location
  • Around 5 teams
  • "Where we're going, we don't need a community"

Currently:
  • Around 120 QAs (this number fluctuates due to contractors)
  • 4 locations - 3 in the UK, 1 in India
  • Different tech stacks/languages
  • Around 70 dev teams
Dealing with growth:
  • Before, they would send meeting invites for 16:30 UK time, which was roughly 22:00 in India.
  • Then they started having 2 clocks in the room, so they could consider time differences when scheduling meetings.
They faced challenges in scheduling meetups for the testing community:



After Testbash 2018, Lindsay felt inspired. A few days away from the office gave her time to reflect on the current lack of a testing community.


They then scheduled an unworkshop to discuss exactly what they didn't want in a testing community.



They looked into overcoming challenges:
  • Different formats, e.g. TestSphere, eating lunch together, etc.
  • Multiple locations
  • Different times of day
  • Community driven (not just by one person)
  • Increase publicity for the test community


What's next and how to set up for success:




Eric Proegler: Continuous Performance Testing

Sites go down because of arrival rate --> the sudden arrival of lots of users.

The common approach to load testing is to gradually ramp up the number of users in a stair-step pattern. But this doesn't happen in real life.
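To make that contrast concrete for myself afterwards, here's a rough sketch of my own (hypothetical numbers and function names, nothing from Eric's slides) of the two arrival profiles as (minute, active users) pairs - the staircase adds users in neat increments, while a real spike dumps everyone on the site at once:

  # Hypothetical illustration of two load profiles, as (minute, active users) pairs.
  def staircase_ramp(total_users=100, steps=5, minutes=10):
      """The classic load test shape: add users in equal increments at equal intervals."""
      per_step = total_users // steps
      step_length = minutes / steps
      return [(round(i * step_length), per_step * (i + 1)) for i in range(steps)]

  def spike_arrival(total_users=100, spike_minute=2):
      """Closer to real life: (almost) everyone shows up at the same moment, e.g. after a marketing push."""
      return [(0, 0), (spike_minute, total_users)]

  print("staircase:", staircase_ramp())  # [(0, 20), (2, 40), (4, 60), (6, 80), (8, 100)]
  print("spike:    ", spike_arrival())   # [(0, 0), (2, 100)]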

I took a lot of photos during this presentation and wasn't fast enough to write many notes.

Here are some of the slides that stood out to me:








Conor Fitzgerald: Benefits of exploring other industries and disciplines

Does the culture of a company affect the quality of the testing?


3 different company cultures that Conor experienced:
  • Controlled, conservative, checking
  • Energy, empathy, exploring
  • Autonomy, anarchy, automation (he initially thought it was the 3 Es)


He discovered Context-Driven Testing (CDT); Principle 4 stood out to Conor:


Aviation:
  • Focus on preventing crashes
  • Checklists (e.g. pilot checklists before takeoff) --> checklists are also used in healthcare and construction
  • Pairing: Modern aircraft are designed to be flown by 2 or more people because it's mentally taxing to fly
  • Blameless culture
  • A culture of questioning - people are encouraged to ask questions. *Korean Air story from Outliers by Malcolm Gladwell  
                        



Economics:







Wednesday, May 15, 2019

Testbash Brighton 2019 Conference Day Part I of II

It's been an intense month in my personal life and there are even more intense (exciting!) months to come. So before I get too busy, let's turn my notes (that only I can read) into a blog post!

In early April this year, I had the honour of speaking at Testbash Essentials on the Wednesday. I also had the opportunity to attend the main Testbash Conference on the Friday and learn a lot from the speakers.

Here are my notes on what I learned in the first few talks I was able to attend (I had to leave a bit early to make my flight back home).


Lisi Hocke - Cross-team Pair Testing: Lessons of a testing traveller.

"Have you ever asked yourself, are you good enough?"
Lisi didn't come from a technical background --> she fell into testing.
She was mostly the lone tester at a company and didn't have a mentor. But now at her current company there are more testers.

"I'm basically just managing my list, but not learning something."
She shared her struggles in learning new things and constantly having to edit a to-do list.

When she met Toyer, they helped each other get out of their comfort zones. 

Sunday, April 7, 2019

TestBash Brighton 2019: Morning Workshop on "Context Driven Coaching... There is no script."

In the morning I attended Martin Hynie's and Chris Blain's workshop on Context Driven Coaching... There is no script.

Here are my takeaways on what I learned in this workshop:

1. It's common for people to be promoted to managerial positions but not have the skills to coach people. 
Martin and Chris have coached managers, who are in charge of coaching people.
But people promoted to managerial positions aren't necessarily promoted to these positions because they have the skills to coach people.

2. If you want people to work together, you need to give them a chance to get to know each other, to build trust and to show some vulnerability
To kick off the workshop, Martin and Chris had us answer four questions in our table groups (roughly six people per table), so we could (presumably) achieve this goal.

3. The person you are coaching needs to trust you and to trust that you know what you're doing.

4. When you are coaching, there is a high probability that you need to explain why the skill (you are coaching) is important/needed

5. No major change has come from logic; it comes from energy

6. It's important to remember that everyone is trying (in their own mind) to make the world a better place.
If things come up that stress you out, think about:

  • What can I control?
  • What can't I control? --> What can I influence?
7. The difference between Feedback, Coaching and Mentoring
Feedback - giving someone information about past behaviour, in the present, to affect future behaviour
Coaching - trying to transfer knowledge/skills
Mentoring - reflecting on a journey you've taken; sharing your experience with the mentee and how that experience can be related to the mentee's current experiences

8. Coaching vs. Training
Coaching - more likely to be individual
Training - more likely to be a larger audience

9. Coaching vs. Mentoring

Coaching:
  • "Transaction" - clear end/objective
  • Timeframe - shorter; a few months
  • You can measure if it was successful

Mentoring:
  • Reflecting on a journey - no clear objective
  • Timeframe - could be years
  • Can't clearly measure success


10. Johanna Rothman's 4 step approach to coaching
  1. Verify help is wanted/needed
  2. Generate options - at least three
  3. Review options, let them choose
  4. Create an action plan (both the coach and the person being coached are responsible for this)



Thursday, February 7, 2019

Reflecting on leading a Testing Community of Practice Part II

For Part I go here

Devoting time and effort - when I have it

While I'm on my project my priority is as a tester in my scrum team. Therefore, I only devote time and effort when I have it. Some weeks I'm very busy in my team and barely give the CoP a second thought; other weeks I have more time to prepare a presentation or approach people to give presentations (or look up topics to see what people may find interesting to hear about from others).

I really appreciate the flexibility. While there is an expectation that something happens regularly, it seems the definition of "regularly" has become roughly once a month.

Merging the Test Automation COP and Testing COP

The lead of the Test Automation COP pinged me on slack a few weeks ago to see what I thought about merging the two. I said I was all for it (after all I saw Test Automation as a part of testing; a way to approach testing - and so did he).

We both posted messages in our slack channel saying we had this idea and wanted to hear what people thought of it, or if people were concerned/worried about this move. Based on the feedback, people seemed OK with it.

Now that we were merged, we updated the Confluence page (for those who read it, that is - are there page counters in Confluence? 💁).
I also sent out a survey asking how testing was going in their teams, what their biggest testing problems were, and what they wanted to learn more about. I also asked if they had automation set up in their team or if they wanted help getting it set up. (I've found people don't always actively seek out help, but if you offer it - they may take you up on that offer.)

Wednesday, February 6, 2019

Reflecting on leading a Testing Community of Practice Part I

For about 4-6 months, I have been leading the Testing Community of Practice at my current project. Before that, four of us were co-leads (for about 6 months) before I was approached to see if I wanted to drive it and be the lead. I said yes - and said I wanted to see if I was a good fit as a lead, if I had the energy/desire for it, and if there was a need/desire for a Testing CoP in the first place.

Finding out what people expected from this Community of Practice

My first focus was to find out what people expected from a Community of Practice. I sent out some surveys to those already in the Testing slack channel, and had two discussion groups in our Malmö and Helsingborg offices.
The hard part was that I already had my own opinions on what it was and what it would involve, so when I was holding these discussions I had to watch what I said, and how I said it, in an effort not to affect people's opinions.
The two main things people expected were to share information about how they test in their teams, and to learn about testing concepts and tools that were new to them.

Getting people involved

While the majority of people expected information sharing and wanted to hear about how testing is done in other teams (we are in scrum teams distributed across two offices), people aren't exactly jumping up and down to share how they test in their team.
If I ask in a slack channel, "Does anyone have anything to share or want to share what they learned this week? Or a tool they are using?" then chances are I don't hear anything (it has happened, but very rarely).

I have found a much better approach is to go to people directly with what you have noticed about their skillset and ask if they want to talk about their experiences. It seems a lot of people don't realise that what they think is "easy" or "normal" or "not interesting to listen to" is actually something others would benefit from hearing about.

Some upcoming sessions I'm really looking forward to in our Testing Community of Practice are on how one tester implemented and is using Cypress, and how another works with developers.


Figuring out if there's value in this, and if I'm adding value

This is a big one for me. At the start, I said yes, then said I wanted to see if I was a good fit as a lead, if I had the energy/desire for it and if there was a need/desire for a Testing CoP in the first place.

In terms of how testing is done in my project and what people want, part of me can't help but feel a little helpless. A very small minority tell me what they want/expect, and those same people share information about how they test in their team and new tools they use. I don't know what the majority wants, or if they even care about the links posted in the slack channel or the automation workshops I arranged with an automation teacher (he actually taught an automation course before our current project), as I don't hear anything from a lot of people.

Another thing: there are fewer and fewer testers in my project. I'm seeing testers either being forced out or choosing to leave as they don't feel testing (and thus their skills) is valued anymore. And you know what - that sucks! So here I am thinking: is this Testing CoP adding value? Am I adding value in this role?

The thing is, I have no "real authority" or leadership title on this project (which is interesting considering it seems most people involved in testing at my project are "Test Managers"). Therefore, in terms of upskilling people or trying to inspire people to get involved and learn more, I'm not sure exactly how I'm perceived by my peers.

Maybe I care too much? Maybe.
But if I realise that I'm feeling apathy, then that's almost a definite sign it's time for me to leave the project.


For Part II, go here.



Friday, December 7, 2018

How to set up a proxy on Charles (so you can test other devices on your local dev machine)

Have you ever wondered, how do I test my fixes/changes on my local machine on IE 11?
Or in general, how do I test my fixes/changes on my local machine, when I don't actually have that browser on my laptop?

If so, the Charles Proxy tool might be your answer.

A few years ago I learned this and thought it could be useful if any of the below apply to you:
  • You are developing on a Mac and want to test Internet Explorer 11 or Microsoft Edge on your local machine
  • You want to test your fixes on a (physical) mobile device, with fixes done on your local machine
  • You don't have access to a mobile device/internet browser simulator like Browserstack or Perfecto

For this, I'm going to explain how to set it up using your development laptop and a "test device".

1. Download Charles from https://www.charlesproxy.com/download/ and install it on the laptop where you do your development
2. Make sure both the laptop and the test device are on the same internet network
3. Get the IP address of your laptop (you need your internal IP address, not the public-facing one - see the sketch after these steps for one way to look it up)
4. On your test device, go to your internet settings and enable HTTP proxy manually. In the server field, enter the IP address of your laptop and in the port field type 8888 (the port on your test device must match the one configured in Charles; you can also use 8080 in both)
5. In Charles, click on the Proxy menu, then click Proxy settings - then see the checkboxes etc as below.


6. Once you click OK, go back to your test device and access your normal localhost URL - you should be able to see your traffic in Charles.
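For step 3, you can find your internal IP address in your network settings (System Preferences > Network on a Mac) - or, as a quick sketch of my own (nothing to do with Charles itself), ask the operating system which local address it would use for an outbound connection:

  import socket

  # Open a UDP socket towards an arbitrary public address (8.8.8.8 here; no traffic
  # is actually sent for a UDP connect) and read back the local/internal IP address
  # the OS would use for that route.
  s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
  try:
      s.connect(("8.8.8.8", 80))
      print(s.getsockname()[0])  # e.g. 192.168.x.x - this goes in the proxy server field on your test device
  finally:
      s.close()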




Wednesday, November 21, 2018

My experience at Agile Testing Days 2018

I had seen blog posts and activity on Twitter in the years leading up to my first ever Agile Testing Days (ATD) - it seemed like people really enjoyed this conference, so it's safe to say I was glad to get this email from Uwe saying:



"Congratulations, you are part of the Agile Testing Days 10th anniversary as a speaker with: Bringing in change when you don’t have a “leadership” job title! "


For me, this is a BIG conference: a lot of attendees and A LOT of tracks - I won't deny I found this intimidating, as there were a lot of interesting talks/workshops on at the same time. Looking back at this conference, one thing that really stands out to me is choice: which will I see? Which am I prepared to miss out on?