Friday, 24 November 2017

41 Definitions of Software Testing

41 Definitions of Software Testing is the first write-up I wrote for a Software Testing e-magazine.
I was asked to share an article (including a bio) for the Testing Circus e-magazine, and I had under an hour to write, review and share it. I submitted 40 definitions and was glad to have made it in time. The article has now completed four years, and I am re-sharing it on this medium. Thanks to the Testing Circus e-magazine for publishing it. I still believe testing is all of this. The 41st definition was contributed by Testing Circus e-magazine editor Ajoy Singha.

What is Software Testing?
Heck! Define it yourself. I have been thinking about an answer myself but have found it hard to convey in one statement. Am I a born, reborn or a resurrected tester? I hope to find out for myself, and the article below is an attempt at that.

Testing – I will be using the word testing as I continue to write; wherever you read it, read it as software testing. Here are my definitions of software testing.

1) Testing is the responsibility of presenting information that is essential for bettering the application/product under test.

2) Testing is learning to think well.

3) Testing is to understand the various contexts a system can be applicable in.

4) Testing is identifying the subtleties and extremities where the system can be used.

5) Testing is craving to dig deep into the system, to look into every nook and corner, in order to surface information that can awaken the product owner and the user to surprises and to the wow of what the product can do.

6) Testing is to provide the consumer with an application which reinforces confidence in the consumer and in the business.

7) Testing is that ability which the whole team is entitled to, with an opportunity to grab the consumer's attention, meet the consumer's demand and deliver well.

8) Testing is to convert that dormant thought into an active on-going action oriented process.

9) Testing is to continuously collect aids which aim at delivering quality information to anyone equipped to better build the product.

10) Testing is remembering to act in unison with the vision and mission reflecting in the consumable product.

11) Testing is questioning, challenging, being biased and uprooting the biases about how the product is presumed to be built and used.

12) Testing is having an eye for detail, however miniature or massive.

13) Testing is buying yourself a microscope and a telescope to look at how a product is consumed today and in future.

14) Testing is building a lifelong insanity to learn in all sanity.

15) Testing is a role play of that of an investigator, a doctor, a builder, a victim, a crime fighter, an intruder, a seeker, an evangelist, a doer.

Do you see such attributes in a tester? – Hire that person.

16) Testing is being in a context all assuming and continuously judging.

17) Testing is testing the assumptions and then falling prey to the judgements made.

18) Testing is re-opening a concluded case.

19) Testing is to don the hat of someone other than you, change perspective and test with a prejudice.

20) Test to KNOW.

21) Testing is time boxed and, at times, unleashing the you, learning to think in a way which is not brand you.

22) Testing is building credibility for yourself, your organisation which serves you and which you are serving.

23) Testing is learning to explore the paths which you are willing to tread and the roads less travelled.

24) Testing is defining, redefining and un-defining.

25) Testing is breaking barriers to test.

26) Testing is a courageous act of preparing oneself to tread a new path, take another challenge.

27) Testing is taking ownership of mistakes with a pitcher of gratitude, that I learnt what not to do and what to do in this context.

28) Testing is storytelling via testing and the experience reports.

29) Testing is diminishing confusion and expanding the confidence of a user.

30) Testing is that walk down memory lane to think whether this issue has occurred before or is a déjà vu.

31) Testing is that feeling when you love yourself for learning to learn anew every day.

32) Testing is together untying and revealing the product/application to itself.

33) Testing is you emerging out of the bath tub with a ‘Eureka’ moment.

34) Testing is a knock on the door of a developer to help undertake measures to provide a fix.

35) Testing is a wake-up call to innovation, to time travel back into the future.

Did you relate to any one or more of these?

36) Testing is an unconventional mode of transport to the minds of a user.

37) Testing is at-times masking the status quo.

38) Testing is closing in on the ‘I’ the consumer, ‘I’ the tester, ‘I’ the developer, ‘I’ the owner and illuminating the path of ‘We’ the team.

39) Testing is that run down the rabbit hole to discover the wonderland of Alice/Alfred to sketch the tomorrow of testing.

40) Testing is Learning.

41) Testing is Circus :) [This definition is contributed by Testing Circus]

I am sure you have moulded yourself into a tester with your own definitions of testing and of a tester; do share your thoughts on them. Come, join and be a part of this community of information seekers and providers.

What is your definition of testing?

BIO:
Jyothi Rangaiah, a trespasser into the minds of users, dons the many hats that a tester should in order to test. She has made learning a way of living and continues to inspire herself by constantly connecting with the learned from the newly introduced community of testers. Willing to make a way where there is none, she finds herself fighting the crime scene in her vicinity, and on this journey bugs bump into her. Being unwilling to take no for an answer without judgement, reasoning and questioning has helped her in her way of testing. Jyothi Rangaiah is a budding blogger who writes at chroniclesoftesting.blogspot.com. A fan of defining art, her favourite English word is genuine.

Originally published here: https://www.testingcircus.com/41-definitions-software-testing/

Thursday, 16 November 2017

COPENHAGEN CONTEXT DRIVEN TESTING CONFERENCE 2017

Pre-conference:
Some of the conference attendees, and those who wished to be part of the pre-conference, registered for the meetup held a day before the Copenhagen CDT conference, courtesy of Danske Bank.

The pre-conference meetup was kicked off by Paul Holland, who used the United States as an analogy for software testing and routes to different states in the USA as the test environment. It was an interesting and eye-opening session for most of us present, learning about ‘The Potholes of Automating Too Much’ and how NOT to test. Paul, being an ex-pilot with over 20 years of experience in testing, had several gems of lessons to share with us.
Points to ponder:
How to make sense out of the test environment when performing testing?
If investigation does not happen then why run the tests?

The second session was by Keith Klain, whom I idolise; his talk was about ‘How to "Sell" Testing to C-level Management?’
I, along with the other audience members, asked questions and found ways to overcome the challenges we face selling testing to the CXOs.
Key takeaways:
Do not sell testing; instead, use a language that management understands and speak in terms of business, revenue generated and, where possible, cost saved, so that your voice is heard.
I noted that Keith rarely used or referred to the slides; he has been speaking about this topic for a while now, and we testers have yet to learn the tact of 'how not to sell testing to CXOs'.
Many a time we fail by spending the limited time we get on educating management about testing. Instead, prepare yourself to speak in a language and in terms that make sense from a business perspective.

CONFERENCE DAY 1 - TUTORIAL DAY
I opted to attend the tutorial on 'HOW TO COACH TESTERS TO BE BETTER TESTERS'.
Carsten Feilberg and Cindy Carless, with their brilliant approach to running the tutorial, took us participants down a path of learning to coach ourselves and others.
It was a fantastic full day of learning, through examples and models, how to coach.
I learned ways to coach myself, which isn't all that easy, and ways to coach others.
How to be, and not to be, a coachee was demonstrated at the end of the day with practical examples.
This exercise brought out many key lessons for us all to learn from.
I was glad to have had this opportunity and to learn from both Carsten and Cindy.
A point to note here was the pace of this full-day tutorial, with breaks which allowed us all to question and learn from each other.
The ice breaker at the beginning of the session was an excellent way to introduce ourselves to others when we first meet.
A few of the ice breaker questions were as follows:
Where is home? Siblings? First job? Worst job? Challenges growing up? Interesting hobbies? It helped build trust and honour all present.

Carsten and Cindy were/are very well equipped and shared practical examples, do's and don'ts of coaching.
Lessons shared:
Coaching: Assist person A to go from X to Y in their own best way, not the coach's way.
Mentoring: A mentor helps with your existing skills.
A coach can wear a mentoring hat.
Have goal directed conversations with your coach.
Refine the goals - Know when you get there. Build connection / rapport. Learn by matching / mirroring.
Be involved in coaching as a coach and as a coachee, know that they are with you and you with them. So that they can go along this process with you.
Coaching takes two willing participants: Coach and the coachee.
Include stakeholder when coaching.
Deal with the coachee's emotional imbalance if they come to you with it.
Coaching requires equal level of emotion to begin coaching.
Meet the coachee at their emotional level.
Deal with the emotions, then start coaching. This for me is an excellent piece of suggestion that I learned and registered.
If feasible, invite an observer when you coach the coachee initially, as you take up coaching.
Be a guide and not a problem solver.
Do not force a coach on others.
Coaching may not be for all.
Know that some do not need a coach.
Whilst coaching, feelings and emotions can come into play; keep them at bay.
Patience is the key along this journey.
Structure the coaching process, do not derail. Learn and refer various models of coaching and use different models for different scenarios.
And many more gems of wisdom were shared by the coaches in this tutorial. Gratitude and love to both of them for arranging and running this tutorial with us all.
Personally, I have been briefly coached by Carsten Feilberg, and he is an excellent coach to be associated with. Kudos to them both.


CONFERENCE DAY 2
Day 2 began by shedding light on the what, why, where, when, who and how of the context-driven testing community in Copenhagen.
It was an absolute honour to learn about Morten Hougaard, who has been a torch bearer of this education in Denmark. With contributions from Michael Bolton, COCO2017 was made possible.

The first keynote of the day was by Keith Klain, who has adopted a great style of delivery and an excellent way of sharing lessons learned.
Few gems from the keynote:
Focus on business risks.
Test managers need to be hands on testers too.
Think deeply about your work.
TRUST your team to do a great job.
Learn and spread the word about CDT.
Do not fetishise tools.

Having heard Keith on earlier occasions too, I found that his talk struck a chord with the audience, and he engaged the participants very well with his notes on 'I don't think deeply about my work', 'I don't trust my team' and 'I don't like testing'. Thinking testers understand this and need support from those who "don't". Let us be wise and support each other towards "I do".

Next up was my talk on 'Testers' role as requirement gatherers'. I was fortunate to have my coach, Carsten, in the audience, who would later share first-hand feedback on the talk.
It was my second talk in the European region and I did not know what to expect from this audience. But the participants did give a listening ear and asked relevant questions at the end of the session. I wish I had more time to also do a practical exercise. Nevertheless, it was a great day attending other talks at the conference, and this made me happy: a seamless transition from the keynote to my talk in the same hall, with the infrastructure already set up. No hassle faced, no pre-check required.
Ruud Cox was at my talk and made this sketch note. Thank you Ruud.


Having Keith K, Duncan N, Jokin A, Elizabeth Z, Carsten F, Mike L and Smita M from the awesome testing community present at the talk, some of whom later shared feedback, was helpful. Happy to have made new friends in Corey and Soren from the tutorial day, who made it to the talk.

I sat in on a few other talks through the day by most promising speakers - Elizabeth Zagroba, Ash Winter, Jokin Aspiazu and Tomislav Delalic - who shared their learning.

‘How to succeed as an introvert' - I felt like I met my twin in Elizabeth. Her talk was profound, and I came back home inspired to share the knowledge with others around me who feel stuck and challenged for being an introvert amidst the very vocal extroverts. Her talk encourages introverts to be just themselves, and not to cripple their introversion or change into who they aren't in order to grow and succeed as testers.
Ash Winter's session on testing below the application - hardware testing, infrastructure testing, and deep dives into discussions on logistics, performance and logs - with its Star Wars themed slide designs, was brilliant.
Building Local Testing Community Out Of Nothing - forming and growing a testing community, the challenges faced, and how to grow beyond those challenges made for an excellent learning opportunity for building a testing community around us. It was very well presented by two testers from Spain, Jokin and Tomislav, who make a fantastic speaking duo.
The last keynote, ‘Implementing Context-Driven Testing Keeps Kicking My Ass - But I Think I'm Finally Winning!’, on implementing CDT at different organisations, by Nancy Kelln, was amazing. She is a rock star in the testing world; her brave and at times vulnerable ways of dealing with tough situations and people at work, and of educating testers and others on testing, call it CDT or otherwise, are worth noting and following. We need more Nancy Kellns around the place I come from, someone who can kick ass and be epic about it. She is awe-inspiring; thank you, program committee, for bringing her to COCO.

About the conference itself:
It was a very well-paced conferring experience. I found time to meet and greet many friends from the testing community at the conference - meeting Ben Kelly and family, Janet Gregory, Martin Hynie, Ash Coleman, Smita Mishra, Mike Lyles, Helena Jeret-Mäe, Nancy Kelln, Jokin Aspiazu, Tomislav Delalic and other friends during this conferring experience was hyggelig. Testing talks during and after the conference with the other testers made it all the more worth being at this event, despite the tons of challenges that I had to face to get to the conference. My immense gratitude and kudos to Morten Hougaard, Duncan Nisbet, Paul Holland, Maria Kedemo, Keith Klain, Michael Bolton and others from the community who made it possible for me to be at the Copenhagen Context Driven Testing conference 2017.

I must confess to two things here:
  1. The challenges that I faced in getting myself to this conference affected me and the talk I delivered. I learned an important lesson by the end of it all: to next submit a talk to a conference only when I am holistically well placed to make it, without a multitude of issues bogging me down.
  2. Amidst the hurdles, I felt at home with this specific testing community. Thank you all for making this experience of conferring happen for me; it helped me make a worthy decision in my career, to move on, be better at my work and improve what I do. Thank you for the encouragement and for the inspiration, COCO.

Until next time, keep TESTING, which is synonymous with learning, or as Ash Winter puts it, 'Testing is believing'. If you are reading this, I urge you to be at this conference to learn and to share your learning on CDT with the testers who need to know about it. Special thanks to James Bach, Parimala Hariprasad, Vivien Ibiyemi, and AST, without whose wise words and support this experience would not have been possible.



Sunday, 16 July 2017

Hiring problems emerging from not knowing what a tester's responsibilities are.

  • Does it matter to you as a hiring manager to know and share what a tester’s responsibilities are?
  • Does this knowing shape what a tester does in the role of a tester?
  • Does this knowing shape the future of testers and testing?
  • What gets communicated as a tester's responsibilities does affect what a newbie / clueless tester does at his or her first job. It takes a rebel and an educated tester to question the norms. My education in software testing was of the traditional sort: write test cases by looking into the FRS, send them for review, rework them based on the review comments, execute the tests and log bugs against them. When I did something outside the recorded tests, one of these three things happened: 'In awe!', 'Why did you do this?' or 'Who detected this?'. 1. In awe, because a new tester swayed away from the test cases (it was unheard of for some that a newbie would not follow the rules set). 2. 'Why did you do this?', because it was not recorded as one of the test cases (and was not reviewed), and who will now explain to those involved that we have a new test :). 3. 'Who detected this?', because the bug was disguised as a feature and remained undetected for a while, when test cases were followed as is.
  • How often do we witness and / or encourage a tester who questions the norms, professionally?
  • How often are testers educated in software testing prior to joining an organization as a tester? Is the syllabus for software testing formed with the consensus of any active software testing practitioners?
  • Are we equipped to encourage newness / questions that come our way when we make a hiring (accept / reject) decision? How often do you summon up the energy to ask for the rejection reason? In this biased world, it is sometimes safe not to ask for the rejection reason. But when we do ask, know that they may or may not reveal the real reason.
  • Does it matter to you to know that you hired right? Maybe you did hire right, but the system and the processes followed in the organization remain unquestioned, and you get a feeling that 'you don't fit in', or that someone you hired doesn't fit in.

An attempt to share a tester's responsibilities (actual versus the expected) is made here. 

A lot of this learning comes from the notes, articles and books I have read, from being present at the test-opsy session conducted at StarEAST [2016] by James Bach, Jon Bach and Michael Bolton, and from witnessing the tasks I carried out in the role of a tester. In itself, this work is incomplete and there is scope for improvement, as I see that contributing to building the testing community, and one's own credibility, is also part of a tester's responsibility.


Some of the questions that helped me frame the map are below:

  • What does a tester do? 
  • How do I define a tester?
  • What kind of responsibilities suit this tester versus that tester? (Automation / CodeJunkie / Script Kiddie / ET)
  • What else does a tester do? 
  • What matters? What doesn't matter?
  • How can I help? Who can I seek help from? 
  • Does this process versus that matter? What matters to the organization may not matter to a commoner / a user. 
  • How to communicate the good and the bad news about the product's quality? 
  • Is the information shared useful? Useful to the relevant people?

(The mind-map images from the original post, showing the keywords and skills discussed below, appear here and throughout this section.)



A typical CV would contain keywords such as the above for an automation test engineer and a traditional test engineer. But is that all a tester does, and all that defines a tester's everyday learning?
A tester does more than document tests and prepare the tool to perform the tests; as part of performing testing, a tester also needs to learn, perform, be and eventually become a tester. See below for some of the other skills a tester needs to be educated in and equipped with.





Process is for the program / project to run smoothly. Do not discard a tester / CV / resume because it lacks domain / process knowledge / information in it. How we design, program, test, build should reflect in the quality of the product that a user operates.




What matters to the business and the user is important. But what matters to a tester is equally important. If you are learning, you are growing.







Does this know-how help you shape better questions to ask when you meet a tester you may hire?
Does this make you rethink how and whom to hire? Am I asking the right and relevant questions of the tester being interviewed?

Add your comments below: how would you define a tester's responsibilities differently when you share what they are?


Do this exercise with your testers (already hired). 

Exercise - What tasks is a tester bound to do when they explore the requirements or perform a product walk-through? Take notes.

If there are no questions at the end of this exercise, then the client is not going to be happy with the testers hired. Hire better. Design the tests to test the tester better. Learn a way to solve hiring problems. 


At the end of it, share your honest feedback on the candidate interviewed; it's important to do so.

Sunday, 25 June 2017

Risk Versus Experiment

Risk versus experiment - A change in mindset for self learners attempting to learn by experimenting.

The tweet above about risk and experimenting didn't convey all that I meant it to; in fewer than 140 characters, it is hard at times to fit both the content and the context into a single tweet. Hence this attempt at a short post on how re-thinking risk as an experiment can help us in self-learning, so that we do not abruptly end our learning through experimenting just because it is categorised as hard or risky.

Every experiment may or may not be composed of risks. 
What I'd like to address via this tweet is that some regard certain tasks as risky and wouldn't even attempt to figure out the solution by themselves. They would rather wait for someone else to conduct the experiment, or take the risk on their behalf, and learn from it, which is fair too. But at the same time, if we are relaxed about whatever the outcome of an experiment conducted in a controlled environment may be, then we will continue to learn, focusing on learning by experimenting and on the results obtained, rather than over-focusing on the risk attached to the task.

Note that the experiments I am talking about here are not life-threatening experiments, but those aimed at helping us learn. They help us learn better when we are fine with whatever the outcome is: success or failure. We learn from both outcomes what to attempt next and what not to. Let us give ourselves the liberty to accept learning from failures as well as from successes.

The context in which I shared this tweet (and the way it was perceived and interpreted by Twitter friends) proves yet again how important it is to state the context. Helping the reader understand the context within a tweet sometimes isn't possible, so I made a few more attempts, by replying to the tweet, to expand on the context.
The tweet is aimed at an audience who would not try out tests and experiments for themselves when learning, but rely on ready-made answers, or who would stop looking for answers because it isn't easy to proceed further along this journey of self-discovering answers / solutions. The target audience of this tweet is those who are focused on learning and those who attempt to learn by experimenting.

Point to ponder - Try replacing risk with experiment, as an experiment, for a test that you are performing, and check whether there is a change in mindset in the way you learn to perform tests. Note how differently you continue learning regardless of the end result of this experiment. Share your views of such an experiment in the comments below.

Concluding thoughts - Know that if we did not try finding an answer / solution for ourselves, we are relying on someone else's assumptions, beliefs, circumstances, interpretations and limitations that they had when they arrived at their conclusion. If you have the interest, time, resources and a drive to self-learn, then there is nothing wrong in attempting to learn by self-experimenting, and along the way you might as well gain self-satisfaction (if that is something you crave).

Edited to include - 

  • One may refer to others' prior work in order to continue from it, or to get inspired to perform one's own experiments.
  • If you find someone with similar drive to experiment on a chosen area, then bond with the inspired group to experiment together.
  • Find Make-a-thon groups around you to be a part of such experimentation. 
  • Become a mentor to aspiring students. Share your experience and results of experiments conducted by you / your group.

Tuesday, 23 May 2017

When should a tester engage in testing?

Test Early
Image Credit: https://cdn-images-1.medium.com
- Why should I hire testers at an early stage of a project and not just during the testing phase?
- Why should I let my testers test early?
- How can I hire and retain testers?

- How can testers evolve as PRODUCT ENGINEERS?

- And finally, why is software testing not a boring job?

Here is an attempt I made to address these questions.
As far back as my memory goes, I have been an advocate of testers involving themselves in the SDLC right from the beginning of a project. I questioned why no testers were invited to the Architecture Review Board meeting at a firm that I earlier worked with.
I strongly wish that testers would start contributing to a project from its discovery phase, right from when the decision is made to build the software. This means that we contribute to building the product right from the beginning and are not mere time wasters for the programmers (coders/designers) who have already spent a considerable amount of time learning about the product, data modelling, coding, unit testing and deploying the code to the test server, and who then have to pass on all of that (or filtered) knowledge to a tester just before or during the testing cycle. Does it annoy you as a programmer?
Point to ponder here: Is this the reason why programmers are paid more than testers? Let me restate it for the testers: Is this the reason why testers are paid less than programmers?
Moving on: How can I as a tester contribute from the beginning of a project?
Here are some solutions on how testers can collaborate with the client / Product Owner, programmer, designer, SEO engineer, security analyst and contribute to each phase of software development.
Requirement Gathering Cycle
When the client / product owner shares the requirement, resolve to this:
As a tester, I want to refine these requirements based on quality criteria so that I will not contribute to cost, effort and time wasted at the last phase of a project.
And do this: share a clarifications log with the client, containing the assumptions, clarifications and suggestions to enhance the requirement.
The clarification log can consist of columns in a sheet with these details:

| Query | Clarification by the client / Product Owner | Priority | Implemented in sprint |

Example: the requirement states "The user is able to download the report." As a tester, you can ask:

| By user, do we mean a guest user or a logged-in user in this context? | Both the guest user and the logged-in user | Low / Medium / High | Sprint number |

Above is an example of the clarification log which the whole team maintained throughout the product life cycle at a startup I worked at. The programmers then became better equipped to handle both scenarios: making the report available to a logged-in user and to a guest user.
Understand and refine the requirements when you receive them, so that as a tester you can start contributing to the quality of the product and build credibility for yourself right from the early stages.
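If your team prefers to keep such a log in version control next to the tests instead of in a spreadsheet, a small structured form works just as well. The sketch below is a minimal illustration in Python under that assumption; the class, field and function names are mine, chosen to mirror the columns above, and are not from the original log.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Clarification:
    """One row of the clarifications log described above (illustrative only)."""
    query: str                           # question raised by the tester
    clarification: Optional[str] = None  # answer from the client / Product Owner, if any
    priority: str = "Medium"             # Low / Medium / High
    implemented_in_sprint: Optional[int] = None

def open_queries(log: List[Clarification]) -> List[Clarification]:
    """Entries still awaiting an answer from the client / Product Owner."""
    return [entry for entry in log if entry.clarification is None]

# Entry mirroring the example row above.
log = [
    Clarification(
        query="By 'user', do we mean a guest user or a logged-in user here?",
        clarification="Both the guest user and the logged-in user",
        priority="High",
        implemented_in_sprint=3,
    ),
]
print(len(open_queries(log)))  # 0 -- this query has already been answered
```

A plain sheet does the same job; the point is simply that every query, answer, priority and sprint lands in one place the whole team can see.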
Design Cycle
Develop the skills required to test the design / wire-frames.
Test for user flows / navigation, user behavior in the design cycle. Do not wait until the end of functional testing to begin with User Interface and User Experience testing. We are responsible for a user forming good habits when they use the software built. Let us make every click count :)
Development Cycle
Test the code that is being developed for missing validations, known vulnerabilities and bugs. Ask for access to the code and the database, and learn what is being implemented as part of a fix and what its impact is. This helps us as testers learn about the fix and its impact early on, and helps us identify a bug during the coding cycle and fix it then, rather than waiting for the fix to be made, unit tested and deployed onto the test server, and then for the bug to be logged in the bug tracking system, fixed and re-tested.
This, to me, sounds like a waste of valuable time for everyone involved.
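As a concrete illustration of what testing for a missing validation while the code is still being written can look like, here is a small, hypothetical sketch in Python; the function, its rule and the example inputs are assumptions made for this post, not something from the project described above.

```python
import re

def is_valid_report_name(name: str) -> bool:
    """Hypothetical validation under discussion with the developer:
    report names are 1-50 characters of letters, digits, dashes or underscores."""
    return bool(re.fullmatch(r"[A-Za-z0-9_-]{1,50}", name))

# Checks a tester can run against the in-progress code, before it ever reaches
# the test server. Each failing case becomes a conversation with the developer
# now, rather than a bug report, a fix, a redeploy and a re-test later.
assert is_valid_report_name("monthly_sales-2017")
assert not is_valid_report_name("")                  # empty input
assert not is_valid_report_name("../etc/passwd")     # path traversal attempt
assert not is_valid_report_name("a" * 51)            # overly long input
print("validation checks passed")
```

The specifics do not matter; the point is that the same questions asked during the testing cycle can be asked while the code is still on the developer's machine.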
Testing Cycle
Now that you have invested a lot of time in testing at different cycles of the project, utilize the testing time effectively to test the system post-integration. Take time to learn about effective bug logging and about interactions with the application and its users, and invest time to gather user feedback and enhance the product suitably.
Release Cycle
Proceed confidently to the release cycle. Take time to test the release notes in the dev/test/beta phases. Check the version numbers and the fixed-bugs section, ask whether adding a given piece of information creates an impact, and whether we can do away with the trivial details. Those users who do read the release notes (a number not equal to the number of downloads) are unforgiving if there is nothing new they can learn from reading them. Add value by logging bugs and enhancing the product quality at this stage too.
Maintenance Cycle
Use this time to work on those low-priority bugs and get them fixed and re-tested, along with the other bugs found through the team, user feedback, suggestions and the user reviews that you read.
Conclusion
Invest the time earned to learn: take testing courses, give talks, attend webinars and conferences, and invite users to test and to gather another test idea. Up-skill to remain relevant in an ever-changing and ever-growing industry.
Investing early effort in testing can help address major problems which we face as testers.
1. No time to test.
2. Building credibility early on.
3. Knowing that this credibility eventually gets translated to respect and testers being treated equally with the other roles in an organization.

My question for you today is: Do you recognize valuable respected testers around you? 
If yes, make them your mentors today. They are waiting for you too :)
Earlier published : https://blog.whyable.com/test-early-904c337696f1