Thursday, 31 December 2015

HAPPY NEW YEAR

Dear friends,

Thanks to you all for providing me with opportunities to learn, un-learn and work with you at any scale this year.
Wishing you the best in the coming year and I look forward to a continued association with you all.
Let us collaborate and contribute to the testing community to the best of each of our abilities as we continue to learn.

Many of you have been a guru and an inspiration, and have helped me look forward to the coming year. Thank you for being an example, for leading the way and for setting high standards for the next generation.

A special mention to the many bloggers and authors for providing insights into your practical learning.
Speakers I met at conferences this year - watching you speak has been a treat; keep going strong.
Newbie testers with whom I worked, thank you for helping me learn and be better than before.
Mentors who provided positive feedback at the right time have helped me take the right step forward.

I cannot name everyone here, but your kind gestures in nurturing new speakers and guiding them in the right direction have been worthy of applause.

Thank you friends, co-workers, critics, readers, bloggers, speakers, activists. Cheers and a happy new year y'all.

Friday, 25 December 2015

Testing the Design - Part I

What factors do you, as a designer, a user experience creator, a tester or a coder, consider when thinking about a design and while creating it?

Here are some questions, based on the few factors listed below, that can trigger us to think and, when considered, can help us design better.

PLACEMENT

Placement is where you place different page elements on a webpage.
Where would you place some of the frequently used applications on your device? Why would you place them there?

- For ease of access.

- For Security - lock or unlock based on what you store in the application.
- To group relevant applications and page elements together.

ALIGNMENT

Alignment is how you place the page elements on the webpage.
- Original or Inverted or Tilted

Why is alignment important?
- to share another perspective
- because it is relevant in a context
- to connect with your target audience
- to target a different set of users

DISPLAY

- Font
- Font size
- Font color
- Font style
Do you have users with low vision?
How would you cater to their expectations of your application? Are you designing for the differently-abled?
As we learn which foreground color to use on a particular background color to enhance visibility, this know-how can at times lead us to change a decision we have already made; go ahead and take that risk early.
Test under low light, bright light, natural and artificial light; move closer to the screen and away from it; and perform display and visibility testing.
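To take some of the guesswork out of the foreground / background color choice, a quick contrast check helps. Below is a minimal Python sketch of the WCAG contrast-ratio calculation; the two colors in it are hypothetical examples, not values from any real application.

```python
# Minimal sketch: WCAG 2.x contrast ratio between two sRGB colors.
# The example colors below are hypothetical, not from a real design.

def channel(c):
    """Linearize one sRGB channel given as 0-255."""
    c = c / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def luminance(rgb):
    """Relative luminance of an (R, G, B) color."""
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio as defined by WCAG: (L_lighter + 0.05) / (L_darker + 0.05)."""
    l1, l2 = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

if __name__ == "__main__":
    foreground = (119, 119, 119)   # mid-grey text (hypothetical)
    background = (255, 255, 255)   # white page background
    ratio = contrast_ratio(foreground, background)
    print(f"Contrast ratio: {ratio:.2f}:1")
    print("Meets WCAG AA for normal text:", ratio >= 4.5)
```

A ratio of at least 4.5:1 is the usual WCAG AA threshold for normal-size text; larger text has a lower threshold of 3:1.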

CONTENT

- What you put in there helps you win and retain customers, and keeps them coming to your website rather than opting for a competitor's website.
- The content needs to be legible and influential, something a user can connect to and remember with ease, and it must not be offensive.
- Remember that while you make it fancy, it must remain relevant to the theme of the website you are building.
Example:
Consider that you are designing a website which you wish to make accessible, interactive, functional, responsive, reachable to all users and user friendly.

You are on the login page of a website designed for dancers entering a dance competition.
You enter a username and password; there is a Submit button for user A and a Let's dance button for user B.

Which of these buttons appeals to you as a dancer, A or B?
Do you as a user care?
Do we as creators care?

I leave you with these questions which can act as a trigger when you test for design on both web and mobile applications.

FEW DESIGN TIPS
  • Share the prototype / model / wire-frame with the client, coder and tester. Test it and fix bugs reported during the design cycle to reduce cost, time and effort.
  • Remember to note any missing or unimplemented designs and let the design team know.
  • Perform A/B testing - compare, cater to the needs of the user and conclude (a minimal comparison sketch follows this list).
  • Do not ignore designing for mobile along with web.
  • Choose interactive design every time, because the design talks to the user even without a demo / guideline / manual on how to use the application.
  • Read books, articles on design. Enhance your ability to assess a website for design. 
  • Experiment - Bring ideas from anywhere, remember nature is our best teacher.
  • Cater to the differently-abled users. Make it usable for the majority of your users.
  • Design based on a theme, make the application interactive with every user message displayed. Success message: You have successfully registered for competition "XYZ". Break a leg!
  • Give only what is needed as part of the design. Address the spaces wisely.
  • Design the user guide / manual with the same theme. Let the website be a reflection of how a dancer breathes and thinks. 
  • Know who your users are.
  • Design for the user to navigate further rather than log out if, for example, entries are closed or you are no longer accepting dancers for a competition. Help them with directions / signs which, when clicked, land the user on a list of competitions that are still accepting entries.
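On the A/B testing tip above: once variant A (Submit) and variant B (Let's dance) have each been shown to some users, their click-through rates can be compared. The sketch below uses made-up counts and a simple two-proportion z-test, which is just one reasonable way to draw a conclusion.

```python
# Minimal A/B comparison sketch: "Submit" (A) vs "Let's dance" (B).
# The counts below are hypothetical, for illustration only.
from math import sqrt, erf

def two_proportion_z(clicks_a, views_a, clicks_b, views_b):
    """Two-proportion z-test: returns (z, two-sided p-value)."""
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    p_pool = (clicks_a + clicks_b) / (views_a + views_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

if __name__ == "__main__":
    # Variant A: plain "Submit" button; Variant B: "Let's dance" button.
    z, p = two_proportion_z(clicks_a=120, views_a=1000, clicks_b=155, views_b=1000)
    print(f"z = {z:.2f}, p = {p:.4f}")
    print("Difference looks significant at 5%:", p < 0.05)
```

With these hypothetical numbers, variant B's higher click-through rate comes out as statistically significant at the 5% level; with your own data the conclusion may of course differ.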

Thursday, 10 December 2015

Where do we testers invest our testing effort and time?

Before we proceed: This is not a measure or an estimate of the testing time and effort.

Do you have testers in your team asking for more time to test than what is allotted for testing?
Ever wondered why Testers need that time and how it gets utilized?
Here is a mind-map I put together to capture where I spent my testing effort and time.

Is it effective use of testing time? 
Answer for yourself.


Testers, take a look at the distribution of testing effort and time across the nodes in this mind-map. Do you consider time spent researching, learning and communicating with the right sources of knowledge as testing time?

Give it a good thought and act on where you are spending your testing effort and time.
Remember that attending meetings and conferences, taking up a speaking assignment and interacting with knowledge sources are not time off from testing; we testers should count them as testing time, so that when we talk to our team we are assertive about taking time out for these testing tasks.

Shared below is my perspective on why testers ask for time to test rather than merely accept what management considers and allocates as testing time.

Image - Testing: Effort and Time
Click on the image to enlarge, or alternatively view the node view below.
If you have more to share, please add it as a comment below. The next time you ask for testing time, do consider the factors that would otherwise not be considered by non-testers. Thank you for reading this blog post.

Please note: 
Audience, this post is in no way related to performance, performance testing, performance review or performance rating. If you have a recommendation for me to write / share on the aforementioned topics then do share your comments below.

Tuesday, 21 July 2015

Blueprint of a Test Strategy

Draft the testing strategy not in isolation but with the team.
Take inputs from the team - edit, correct, read, review, rectify and share.
Do not enforce a strategy, plan, policy, tool on the team.
Build teams which can self-organize and dissolve on a need basis.
Define a *culture to fit the needs of a team and not otherwise.

*Civilization is what we have. Culture is what we are  - Dr. H. R. Nagendra


The mind map below lays out a blueprint of Software Testing:
  • Policy
  • Strategy
  • Plan
  • Metrics
  • Tips
Software Testing
Policy - Strategy - Plan - Metrics - Tips

Tuesday, 30 June 2015

Are You Performing Your Actual role?

Audience – Testers, Developers, Business Owners and others who are reading this article
Scenario – Actions and reactions to a rejected defect by the team
Key takeaway points – Amplify your productivity at work by sharing all that you do as part of your role. Ask for sufficient time required to perform each of these testing tasks.

Tester

The defect we log is at times not accepted as a valid one. At times it simply is not reproducible.
There are occasions when a server restart can fix a server error encountered on a web page.
But the same is not conveyed to a tester, unless the team has to raise a request every time the server is restarted.

Dev team - Do you keep your testers in loop at all times when you make changes to the delivered code?
Testers - Do you ask for information on what fixed the defect? And do you say no to accommodating frequent changes to the delivered code?

It is important for the entire team to be able to COMMUNICATE and be INVOLVED wherever and whenever required.

  • Be transparent and help each other maintain a healthy environment.
  • Start by reproducing a defect with a developer sitting beside you, on the test environment. There, the problem is solved.
  • Provide sufficient details to reproduce the defect. Reproduce it yourself. Ask a peer to do so. Ask the dev team to do so on the test environment.
  • Ensure that the team actually investigates the defect before marking it as fixed.
  • Collect testing wisdom as you do this and share the notes with the team.
  • Use a defect tracking tool, to track a defect to its closure.
A tester’s job is not limited to logging a defect but is extended to taking it to a justifiable closure.

Developer

Do you feel offended when you miss implementing a requirement or portions of a requirement?

  • A defect is a reflection of the (low) quality of the product, not a reflection on you or your work.
  • Not being able to find ALL defects is not solely the fault of the tester.
  • Irrespective of developer / tester – it is necessary to understand errors and to differentiate human error from machine error.
A requirement is raised and prioritized by the business owner (BO) and is signed off by the BO.
If the entire team contributes to prioritizing the tasks then how can a defect or a suggestion waiting for its turn to be fixed be de-scoped by someone other than the BO?
Do you check with the business / product owner prior to de-scoping / rejecting a defect / suggestion?

Business Owner

  • Are you accessible to the whole team?
  • Are you present in a defect meeting?
  • How do you contribute to defect triage?
  • Do you have a say on de-scoping / rejecting a defect /suggestion?
If the answer to the above questions is no, then it is important that the BO be invited to, and kept in the loop on, meetings which require their involvement in decisions to de-scope or reject a defect. The power to de-scope or reject a defect / suggestion does not suddenly shift from the BO to a Developer / Lead / Manager.

Back to the question: Are you acting / performing your role?

  • How often are you required to work beyond your role and responsibility?
  • Are you really asking the right questions during the interview, regarding the role and your responsibility as a team member and not just as an individual contributor?
  • Does it surprise you when you are asked to deliver more in a shorter time-frame?

The above questions will not worry us later if, at the time of the interview, we take time out to define the role and the responsibility we are endowed with, and compare it with what we actually do as part of our day-to-day testing tasks.
  • Record and share with the team the tasks you performed whenever you overshot a set target.
  • Share what other activities you performed, being in the current role.
  • Ask for time to do all these activities.
  • Provide information on risks encountered, road blocks you met with, research that you did and mitigations you implemented while performing these tasks.
Are you still penalized for not acting your role? Or promoted for taking too much work upon yourself?
Both can be lethal!

Answer for yourself – Am I acting and performing my role?
Above all – Know that what defines you differentiates you. You are unique because you ask a different question, you solve a tougher problem and you can help solve them.


Allow yourself to act your role

ACT is to – Apply Clever *Tactics
*Tactic - A plan for attaining a particular goal.

Featured in the June 2015 edition of Testing Circus

Friday, 19 June 2015

Testing - A Migration Project

Lessons learnt when working on a migration project - Pitfalls and Preparations


I had not worked on a migration project before. I did not know whether testing this project would be the same as or different from the other projects that I had worked on earlier.

I did learn new lessons when I worked on this project and have summed up the lessons learnt in the form of a mind-map below.

Note: If several web applications are being migrated it helps to track the progress on each relevant component meticulously.
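As a purely hypothetical illustration of that note, even a tiny script can keep per-component migration and test status visible; the component names and statuses below are invented.

```python
# Hypothetical sketch: tracking migration / test status per component.
# Component names and statuses below are made up for illustration.
from collections import Counter

components = {
    "login-service":   {"migrated": True,  "tested": True},
    "payment-gateway": {"migrated": True,  "tested": False},
    "reports-module":  {"migrated": False, "tested": False},
}

def summarize(components):
    """Print per-component status and an overall summary."""
    counts = Counter()
    for name, status in sorted(components.items()):
        state = ("done" if status["migrated"] and status["tested"]
                 else "migrated, not tested" if status["migrated"]
                 else "not migrated")
        counts[state] += 1
        print(f"{name:18} {state}")
    print("Summary:", dict(counts))

summarize(components)
```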

A few other aspects which helped me (in this instance) sail through this project are:
  • Assertive – Being flexible and saying no when needed
  • Asking for help / clarifications in time
  • Learning from mistakes
  • Being practical 
  • Communicating – regularly and effectively
  • Sharing risks early and following them up to closure (in some instances)

Why mind map?
Because it helps me collate and share the ideas around a centralized topic better than the other modes I have tried.
If it is relevant to what you wish to share and helps you visualize better, try it.


Add on to this mind-map with your learning and / or share your comments below.  Happy testing and migrating! 

Click / tap on the image and zoom to view better, or alternatively view the node images below.
Migration - Web application
Identify and Do this
Tips
Testing

Friday, 8 May 2015

Mobile App Testing

Mobile Application Testing - This is not a test-first, but a prepare-to-test-first, approach to testing

I) Testing Requirements

Begin by gathering the required information to test
  • Credentials – Device,  App, Defect logging Tool [if any]
  • Clarifications – Keep the clarifications log updated throughout the testing phase
  • Deployed code
  • Device – Availability (and unlocked), device details, operating system, build version, charger, adapter, USB cord (a small device-detail capture sketch follows this list)
  • Functional Requirement Specification – Obtain information from the development team, previous tester, from the FRS, existing defects, test data, UI screens, known vulnerabilities of the device
  • Context – Learn the context in which the app needs to be tested
  • Scope – Extend / limit testing based on the approaching target time
  • Test Instance – which instance to test on; having this clarified at the earliest is better
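For the device details mentioned above, one repeatable option (assuming an Android device, with adb installed and the device connected and authorized) is to script the capture. The sketch below queries standard Android build properties; adjust or extend the list for your context.

```python
# Minimal sketch: capture basic Android device details via adb.
# Assumes adb is installed, on PATH, and a device is connected and authorized.
import subprocess

PROPS = {
    "model": "ro.product.model",
    "manufacturer": "ro.product.manufacturer",
    "android_version": "ro.build.version.release",
    "build_id": "ro.build.display.id",
}

def getprop(prop):
    """Read one system property from the connected device."""
    out = subprocess.run(["adb", "shell", "getprop", prop],
                         capture_output=True, text=True, check=True)
    return out.stdout.strip()

if __name__ == "__main__":
    for label, prop in PROPS.items():
        print(f"{label}: {getprop(prop)}")
```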

II) Test Preparation

Prepare a checklist / mind map
  • Check access with the provided credentials
  • Ensure the device is available throughout testing with the testing team
  • Deployed code / build is available and is the correct version
  • Clarifications can be created, maintained and shared at every point. Keep the clarifications log updated and share it with the whole team. Everyone in the team can benefit from this
  • Read from the available sources about the app, domain, instance, known issues
  • Learn more from various knowledge sources about specific details to equip and perform better than earlier
  • Read from books, blogs and from previous testers / seniors prior to and during testing
  • Make a list of third-party components used and build knowledge about the same, availability of support being the prime check
  • Prepare the test environment – it is crucial that we take time to set up the environment beforehand where feasible. Ensure you take notes of environmental failures and log them

III) Testing

  • Record every tap, touch, scenario and piece of data used while testing. Share the test data and the source of the test data (see the session-notes sketch after this list)
  • Note the network tested on, the components tested, the tools used and the observations made, and provide suggestions if any
  • Clarify doubts on time
  • As we do this, build a rapport with the entire team – programmers, network, infra and testing teams – and anyone who can help speed up or add to the test execution phase
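For the first point in the list above, a minimal note-taking helper like this sketch can timestamp observations into a session log; the file name, categories and sample notes are my own illustrative choices, not a prescribed format.

```python
# Minimal sketch: timestamped session notes while testing a mobile app.
# File name and note categories are illustrative choices, not a standard.
from datetime import datetime

SESSION_LOG = "test_session_notes.txt"

def note(category, text):
    """Append one timestamped observation to the session log."""
    stamp = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
    with open(SESSION_LOG, "a", encoding="utf-8") as log:
        log.write(f"{stamp} [{category}] {text}\n")

if __name__ == "__main__":
    note("SETUP", "Build 2.3.1 installed on the test device over Wi-Fi")
    note("ACTION", "Tapped 'Register' with a 51-character username")
    note("BUG?", "Spinner never dismissed after losing network mid-upload")
```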

IV) Test Deliverables

Communicate – to the team what you will deliver post testing.
  • Defects
  • Experience Report
  • Impact on business
  • Impact on users
  • Observations
  • Suggestions
  • Test Data
  • Test Report
  • Test Results

V) Risks faced

Communicate – what else can be / needs to be tested, the risks and the mitigation plan
  • unavailability of device
  • not having sufficient time
  • clarifications pending
  • new to the team
  • new to testing
  • lack of adequate knowledge to continue testing
  • other time wasting factors
Final thoughts
  • Timely reporting and learning to enhance the way we communicate during the testing phase can help us do better
  • Clearly state the context, scope, test data used and the components tested in the test report

This can help us gain satisfaction from the effort we put in during the testing phase and can help the team plan better for the next phase / build / iteration, be it testing or any other stage of the product development life cycle.

Sunday, 19 April 2015

DEV and Testing Deliverables

Many problems arise for testers and the testing team mainly due to the following reasons.
  • Late delivery of the code for testing
  • Test environment not being available / ready / set up aptly
  • Lack of information regarding the product, project and process, and
  • Lack of clear communication at all these levels

These are also the reasons for delays in delivering tested code and the product for User Acceptance Testing (UAT) in instances where a UAT team is involved.

Here is a mind map which sheds light on what the testing team requires prior to commencing testing and on the teams involved in delivering it.

This is applicable in an agile context where the testing team waits on the DEV team for test deliverables.

Do add on to this mind map. Share and use in a context applicable to you.

Having a checklist will not help unless it is put to use rightly.
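One way to put a checklist to use rightly is to make it executable. The sketch below walks a hypothetical readiness checklist, mirroring the reasons listed at the top of this post, and reports what blocks testing; the items and their True/False values are placeholders.

```python
# Minimal sketch: an executable test-readiness checklist.
# Items mirror the reasons above; the True/False values are placeholders.
READINESS = {
    "Code delivered to the test environment": True,
    "Test environment available / set up": False,
    "Product, project and process information shared": True,
    "Open clarifications answered": False,
}

def report(checklist):
    """List blockers and say whether testing can start."""
    blockers = [item for item, done in checklist.items() if not done]
    if blockers:
        print("Testing blocked. Pending items:")
        for item in blockers:
            print(f"  - {item}")
    else:
        print("All prerequisites met - testing can start.")
    return not blockers

report(READINESS)
```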

To avoid blame and miscommunication within and between the involved teams:
  • Be aware of the risks
  • Educate and convey the risks involved to the concerned teams, and
  • Have a mitigation plan to tackle unforeseen problems
Above all, communicate honestly, bravely and decisively about the delays and the risks.

Saturday, 18 April 2015

WINJA at NULLCON2015


WINJA - An all women’s Capture The Flag (CTF) event arranged at NULLCON 2015 by Sneha Rajguru, Apoorva Giri and Shruthi Kamath was a well organized and well run event.
The event attracted enthusiastic participants from across India, some of whom were already regulars at the null chapters in their respective cities.

Wondering what or who a WINJA is?
It is an on-site simulated hacking competition at nullcon where individuals attempt to attack and defend computers and networks using certain software and network structures. http://nullcon.net/website/goa-15/ctf.php#winja
Women Ninja at NullCon
Three groups were given vulnerable systems and asked to crack the challenges. Below are some of the challenges presented to the participants (a small, generic SQL injection illustration follows the list).
  1. Missing function level access control
  2. Command execution
  3. SQL Injection
  4. IDOR
  5. Spoofing Referer
  6. Reflective XSS
  7. Sensitive data exposure
  8. File upload
  9. Stored XSS
  10. CSRF
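As a small, generic illustration of one challenge class from the list (and explicitly not one of the actual WINJA challenges), the sketch below shows a classic SQL injection bypass against a naively built query, and the parameterized alternative, using an in-memory SQLite database.

```python
# Generic SQL injection illustration (not an actual WINJA challenge).
# Uses an in-memory SQLite database so it is safe to run anywhere.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def login_vulnerable(name, password):
    # BAD: user input concatenated straight into the SQL string.
    query = (f"SELECT * FROM users WHERE name = '{name}' "
             f"AND password = '{password}'")
    return conn.execute(query).fetchone() is not None

def login_safe(name, password):
    # GOOD: parameterized query, input is never treated as SQL.
    query = "SELECT * FROM users WHERE name = ? AND password = ?"
    return conn.execute(query, (name, password)).fetchone() is not None

payload = "' OR '1'='1"
print("Vulnerable login bypassed:", login_vulnerable("alice", payload))  # True
print("Safe login bypassed:      ", login_safe("alice", payload))        # False
```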

Participants were grouped into three teams consisting of 3 to 4 girls each.
  1. Group A participants
Kriti, Shobha, Rupali, Soni
  2. Group B participants
Sudeeksha, Elizabeth, Angeline
  3. Group C participants [WINJA Winners]
Vandana, Hema, Ananya

Cracked challenges and scores
Missing function level access control - 10, SQL Injection - 10, Reflective XSS - 30

The participants tested their hacking skills and learnt different attack vectors for various vulnerabilities and had fun while doing it.

Feedback from some of the participants
Saumya - Excellent concept; glad to have bonded at this first and one-of-its-kind women-only event.
Kriti from Adobe - Liked the opportunity to volunteer and be a part of the event.
Sneha, a participant from Attify - An exclusive women-only event helped us network with the other participants and get to know each other.
Elizabeth - It was my 1st CTF event, tried and understood what I was doing and enjoyed it.
More such events should be organized.
Ananya Chatterjee - Having the event organized at NULLCON helped. Glad that it was an in-house event, so that the participants could attend the conference as well as the competition.

Overall, the participants were unanimous that the event was educative and helped them all get to know and network with each other.
Team 2 continued to crack the challenges after the event, with the winning team helping the runners-up.
It did not stop there, as Sneha Rajguru extended her help for continued learning after the event by sharing contacts to exchange ideas and knowledge.
Some of the participants expressed their interest to contribute to Infosec girls and be a part of the null chapters at their respective cities.

Group C emerged as the winners and were awarded at the end of the event.

The event in pictures

About The Event organizers - The Infosec Girls
Apoorva Giri
Apoorva works as a Security Analyst with iViZ Security (a Cigital company). She has presented a workshop on "Cyber Security and Ethical Hacking for Women" at c0c0n 2014 at Kochi, Kerala. Her interests lie in Web Application Security and Mobile Security. She's an active member of null/OWASP Bangalore Chapter. She has been listed on the Barracuda Hall of Fame for finding vulnerabilities on their application.
Shruthi Kamath
Shruthi works at Infosys Limited. She is a certified Ethical Hacker from EC Council. She has presented a workshop on "Cyber Security and Ethical Hacking for Women" at c0c0n 2014. She has conducted a one-day workshop on "OWASP TOP 10" at the Null Bangalore chapter. She has presented on "Secure SDLC" at c0c0n Conference 2013. She has participated at Jailbreak, nullcon 2014. She presented a talk on "Cybercrimes in India and its Mitigation" at the National Conference for Women Police held at Trivandrum. She's an active member of null/OWASP Bangalore Chapter. Her area of interest is Web Application Security.
Sneha Rajguru
Sneha works at Payatu Technologies Pvt. Ltd. She is a Certified Ethical Hacker and a Licensed Penetration Tester from EC Council. She's an active member of null Pune Chapter and has presented talks on various information security related topics during the local null meets (Pune chapter). Her area of interest lies in Web application and mobile application security and fuzzing.
Follow the below web links to learn more about NULLCON conference, Infosec girls and null chapters.
Null - http://null.co.in/ Infosec Girls - https://infosecgirls.in/ NULLCON - http://nullcon.net/

Saturday, 4 April 2015

About automation and mismeasurements

Testing Newsletter - 1

Headlines
  • 100/100 test cases automated - testing team awarded.
  • Management says 12.73 failure rate is a must for every testing cycle.
  • Testers denied permission to attend trainings on analytical and critical thinking skills.
If such are the criteria for accolades and criticism, then we as professionals and leaders have failed ourselves.

Are these people who make such suggestions / decisions educated in software testing?
The testing team must seek answers and challenge such decisions made.

To people who think automation is the solution to any testing problem: read, read often and read enough to know the origin, history and usage of automation. Then apply this knowledge to learn whether automation is the solution to the problem at hand.

Read enough to know the pros and cons of test automation. Then use it wisely.

About automation and mis-measurements

Consider this example below to understand what we can do to educate ourselves and those around us.

Preparation of Doughnut (or Vada)
We can automate the process of doughnut preparation, but everyone who consumes the doughnut has a different need or taste, or is allergic to the ingredients used. The same doughnut prepared from the same ingredients using the same preparation process is not the only way to prepare a doughnut. Nor is test automation the only solution to all testing problems.

Extending this analogy to testing, we can learn that with the introduction of variations and complications we can begin to gather information about the product. With this information we can learn the context in which a doughnut is prepared and served, to whom and under what conditions. These parameters define the CONTEXT.

Do we, as inheritors of this knowledge, understand and apply the knowledge of context before defining or suggesting automation? If yes, then good.

If no is the answer to the above question, then let's begin to read regularly and apply the knowledge. Every user is not in the same environment, nor in the same context, when preparing, serving and consuming the doughnut. Then why suggest and sell test automation as a solution without exhaustive research into the organization, product, team, technology, environment and the users?

Try not to suggest test automation by reading the profit sheet of organization X. Do your own study and put your power to influence to good use.

Measure under pressure
Measurement under pressure is a recipe for disaster. 
Though not immediately, metrics / measurements made under pressure eventually corrupt the system and the process followed and used.

Metrics must serve a purpose, have well-defined parameters, and state the environment and conditions in which they can be put to use aptly.
Record the data as clearly as possible. 
Metrics should not be used merely to achieve, over-achieve or under-achieve an objective.
Nor should they be used only to meet the standards of an audit-ready document.

"Whenever possiblebe clear" -  Confucius
When defining the requirements, test strategy, automation framework, test reports - be clear.