Friday, 26 December 2014

Assumptions in Testing

Testing is performed largely on the basis of initial assumptions made about the application under development or test. Especially when a tester is involved in the early stages of the product life cycle, clarifying those assumptions becomes part of the testing activities.

To clarify these assumptions, the testing team needs to be an integral part of project discussions throughout the project: product discovery meetings, architecture review board meetings, change request meetings and those held during the design and development phases.

Preparation for testing begins well before the requirements take the form of a functional requirement specification document. Based on the assumptions made, testers start to gather information, set up the test environment and test. And while doing so, we may make more assumptions.

How do we convert assumptions made early on into valid test ideas?
If you are testing an application:
● with or without documents,
● having started testing without looking for any documents,
● seated away from the team that can help you get clarifications,
● as a testing firm without immediate access to the right sources (people) of information, or
● as a newcomer to a team that is already in the test execution phase,

Then try this:
● Make a note of the assumptions made.
● Share these notes with the team members who can help bring clarity.

Test the assumptions made:
● Note the assumptions that turned out to be correct.
● Note the assumptions that turned out to be incorrect.

This will help us understand how we make assumptions, how we learn and how easily we unlearn. At times it is necessary to proceed with testing on the basis of the assumptions made; know that not all clarifications may arrive in time for testing. In that case, share the details of the test and how it was performed, along with the assumptions underlying it.
To assume is to presume - Jude Morgan
To assume is to miss learning something new about what a product is designed to do and what else it can do. To understand this, apply it to the people around you: to assume something about them is to deny yourself the chance to learn something new about them.

Would you be willing to take the risk of not learning something new? If not, ask away, gently.

Consider the below test as an example:
Test: Check if the end users can view a report available on a web page.
Apply Jerry Weinberg’s “Mary Had a Little Lamb” heuristic to the above test.
  • Who are the end users?
  • Do all end users have read permission to view the report?
  • Can we check on behalf of all these end users?
  • Are the end users required to subscribe to or follow a group in order to view the report under test?
There is no minimum or maximum number of assumptions.
The above questions can now help us define precisely who the end users are. Clarify the assumptions made. Assumptions are not a dangerous thing to make, but not clarifying them can turn out to be expensive.
Make assumptions. ‘Never say, never assume’
Know and note the assumptions made to clarify them with the right sources of knowledge.
Unattended assumptions may bother us while we are performing time-bound testing.

Take time to clarify the assumptions made.
Maintain a clarification log.
Ask for and receive clarifications.
Learn how it can help us remove foreseen and unforeseen roadblocks.
Sometimes we may increase the shelf life of assumptions. Do not let them overstay their welcome.

Asking for clarifications 

Be gentle.
Establish a positive tone when asking for clarification and when clarifying. Assumptions are the basis for many inventions. Why shy away from new learning?

How can a tester benefit from assuming and clarifying?

A few benefits are noted below. Making assumptions and subsequently clarifying them can teach us:
● Survival skills
● Life skills
● Being genuine
● Communication skills and
● An ability to understand what was unknown when we started.

Know what else? It helps us build a rapport with the other teams involved in building the application, and those teams can then openly discuss and share ideas with ease. During your learning process you will have shared a few assumptions that may turn into areas to develop further, which means another test idea to try and more value added to the product.

It can help the teams involved add value to the product and to each other's effort. Learning together as a team has more benefits than are listed here.
I will leave you with this quote:
“Your assumptions are your windows on the world. Scrub them off every once in a while, or the light won't come in.” ― Isaac Asimov

Optional further reading:
References to the “Mary Had a Little Lamb” heuristic:
http://www.developsense.com/blog/2011/01/exegesissaves/
http://testsidestory.com/2010/10/12/childrens-ownpassfail-criteria-and-nursery-rhymes/

This article was featured in: http://www.testingcircus.com/december-2014/

Saturday, 15 November 2014

Being Quality Conscious

It may so happen that, in the busyness of trying to achieve n (insert any random number here) test case executions per day, one loses the childlike curiosity to view a product or learn from it. Unless we crave newness in learning, questioning, implementing, listening and being wowed, our testing hasn't gotten curious enough yet.


Testing skills can be enhanced by many ways:
  1. Testing solo
  2. Pair testing
  3. Testing with a team
  4. Participating in bug bashes
  5. Reading a book
  6. Reviewing bug reports
  7. Learning from a mentor
The list is not exhaustive. We all learn from many sources and in unique ways.


What have we learnt that is new in testing today?
This question can haunt us if we do not practise honing our testing skills every day. Practice here can mean practising observing, testing, listening, reading, exploring, recording and reporting findings. Testing ideas can be derived from such practice.


The expected result of a test case may seem obvious, but each user’s behaviour isn’t. Asking a tester to follow a set of test steps and ranking him or her on that criterion is weird in any age and time. But it has to be done, as we will finally need a winner to announce in the race.


Every time we test, we can ask the question: what is quality?
If we tend to follow a script or are testing a product using a tool, try adding an element of "I wish for and want this result from this product, feature or service".


Coding, scripting and testing with tools can introduce testers to the technicalities of the product under test, so let us neither rule out these methods nor limit our learning to them.


Context-Driven Testing
Introduce context-driven testing to the testers around you, along with instilling an interest in testing beyond test cases.


Context 1:
How would we test product A if it had no competitor?


Context 2:
How would we test product A in comparison with product B (a competitor to product A)?
Would we limit testing to the requirements specified and signed off in a meeting, based on a method used to derive tests, or can we extend the quality of the product by considering features of product B for product A?


Does testing contextually add to the quality of the product? Try it yourself.
Diversify testing.


What is Quality?
Let us not tie quality to a definition, a method, a cycle, a phase or a methodology.


We have been speaking about testers learning to code and coders learning to test (dev-testers). Can we as a team be quality conscious in our respective roles (defined or undefined)?


Who should be Quality Conscious?
Let us just remember that Quality is everyone’s concern.
We can introduce testing to the BA, Developer, Product Owner and others in the team.


Is this Testing?
Matching lines of code to a requirement or an expected behaviour is a common practice in many software testing life cycles (STLCs). It establishes that code coverage is provided and that a ‘Pass’ is recorded against the unit tests.
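To make that concrete, here is a minimal, hypothetical sketch (JUnit 4 style; the discount rule, class and method names are my own, not from any real project). The test matches the stated requirement, executes every line of the method and passes, yet it leaves untouched the questions a curious tester would ask.

    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    public class DiscountTest {

        // Hypothetical requirement, as signed off: orders of 1000 or more get a 10% discount.
        static double discountedTotal(double total) {
            return total >= 1000 ? total * 0.9 : total;
        }

        @Test
        public void matchesTheSignedOffRequirement() {
            // Executes every line of discountedTotal, matches the requirement, and passes...
            assertEquals(900.0, discountedTotal(1000.0), 0.001);
            assertEquals(500.0, discountedTotal(500.0), 0.001);
            // ...but says nothing about negative totals, rounding of 999.99,
            // or what an end user actually expects at the boundary.
        }
    }

A green bar and full coverage here do not, by themselves, tell us whether the rule serves the user.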


Testing isn’t all about record and playback, nor about traceability (the ability to trace the tests identified back to the requirements gathered and signed off at the beginning of the project), which is the norm in some firms.


Introduce the team to testing beyond the requirements gathered. Learn, and help the team add testing ideas, as we (or more of us) test together.


What are the testers testing for?
Testing needs to extend well beyond the architecture laid out, the platform chosen, the standards set, the tools chosen and the metrics used.
Branch, path and statement coverage, unit testing and database testing can meet business needs. Do they also meet user needs? Knowledgeable and experienced testers come in handy for testing beyond what the requirement document covers.
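As a small, hypothetical illustration of why such coverage numbers can satisfy a business metric without answering the user question (the greeting rule below is made up for this sketch):

    public class CoverageSketch {

        // Hypothetical greeting rule, made up for illustration.
        static String greet(String name) {
            String greeting = "Hello";
            if (name != null && !name.isEmpty()) {
                greeting = greeting + ", " + name;
            }
            return greeting + "!";
        }

        public static void main(String[] args) {
            // This single call executes every statement in greet(), so statement
            // coverage reports 100%; yet the "missing name" branch - what an
            // anonymous user actually sees - is never exercised. Branch coverage,
            // or a curious tester, would notice the gap.
            System.out.println(greet("Jyothi"));
        }
    }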


Where?
Share the test environment details with the whole team: details about the device, OS, browser, internet connection (speed, package, provider, router preferences), protocol, client, server and the versions used. Familiarize the team with testing gizmos.


What is the vocabulary of the team?
Does the vocabulary of the team members start with requirements gathering and end at delivering clean code?


Can we as testers introduce the term ‘testing’ into this vocabulary, thus helping the team to be conscious about quality?
Be meticulous about the terms introduced to the technical team: introduce terms that make the most sense to you as a technical tester, in a language the developers can follow, without ruling out the fact that other testing terms can gradually seep into the system from this start.


Learn to learn that our differences can make us a stronger team ~ Diversity Heuristic referenced from http://www.satisfice.com/blog/archives/893

Credit - James Marcus Bach


Thursday, 9 October 2014

Functional Conference - FuConf2014

Day 1 at the Functional Conference - FuConf2014

Keynote:
The keynote, 'The Joy of Functional Programming', was delivered by Venkat Subramaniam, and it was indeed a joy to listen to. Lessons were shared on coding standards and sub-standards.
I had not set any expectations before the talk. As it progressed, Venkat covered the history of programming, computability and object-oriented programming, and mentioned this quote by Alan Kay, who coined the term 'object oriented':
"I made up the term 'object-oriented', and I can tell you I didn't have C++ in mind."
-- Alan Kay, OOPSLA '97


The talk centred on the idea that what was once presumed 'mainstream' needn't be taken as gospel.
Venkat deconstructed the word 'mainstream' with many examples, such as:
- the heliocentric theory,
- the importance of hygiene in hospitals,
- women's right to vote.


The gist of these references: do not fall prey to dogmatic mainstream theories, but be as pragmatic as possible.


The mention of Rosa Parks was a spark in the talk. He then moved on to explain pure functions, expressions and statements, programs returning null values, goto statements, variable declarations, the use of parallel streams, referential transparency and lambda expressions in Java 8.


He shed light upon the use and misuse of expressions versus statements.


Coding standards:
- Use of expressions over statements

Coding sub-standards:
- Use of statements
- Programs returning null values
- Goto statements
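These are not Venkat's own examples, but a minimal sketch of the contrast in Java 8 (the class, method and variable names are mine): the first method works through mutable state, an explicit loop and a null result, while the second expresses the same intent as a single expression whose result type makes "not found" explicit.

    import java.util.Arrays;
    import java.util.List;
    import java.util.Optional;

    public class ExpressionsOverStatements {

        // Statement style: mutable state, an explicit loop and a null "not found" result.
        static String firstLongNameWithStatements(List<String> names) {
            String result = null;
            for (String name : names) {
                if (name.length() > 4) {
                    result = name;
                    break;
                }
            }
            return result; // callers must remember to null-check
        }

        // Expression style: one stream pipeline; Optional makes "not found" explicit.
        static Optional<String> firstLongNameAsExpression(List<String> names) {
            return names.stream()
                    .filter(name -> name.length() > 4)
                    .findFirst();
        }

        public static void main(String[] args) {
            List<String> names = Arrays.asList("Ada", "Alan", "Grace");
            System.out.println(firstLongNameWithStatements(names));   // Grace
            System.out.println(firstLongNameAsExpression(names));     // Optional[Grace]
        }
    }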


He also cited examples of lazy coding and his experiences with code reviews. Every participant had a lesson to carry back and implement.


There was also a mention of the syllabus and of academicians in the talk.
A thought to ponder:
What is taught, learnt and implemented in schools?

Clearly, teaching faculty like Venkat, and his books, can help the student community a great deal.
When asked which is his favourite programming language, he answered that he enjoys coding in every language and can solve client issues irrespective of the programming language used. The talk was well constructed, and the message and lessons were clear.
His ability to teach and share is commendable. The talk was recollected and referenced in the later sessions of the day. In one word, the talk on the joy of functional programming was, and is, impactful.

I opted to attend talk 2: Functional Reactive UIs with Elm by Shashi Gowda.

In technical sessions people expect to learn about what is being talked about, and Shashi met those expectations. This was an introductory session on Elm, and he meticulously showed examples of programming in it. The participants could gather several practical applications of Elm during the talk.


Primarily, I think Elm is a language that could be introduced to first-time coders and learners.
The in-browser result screen and the debugger screen motivate new learners to keep learning. For someone who wishes to build a virtual theme park, Elm is a place to start.


Some of the practical implementations of Elm:
  • Education - practising coding and debugging.
  • Building prototypes using the model, update and display paradigm.
  • Gaming - writing code for games like chess, Mario or bounce (ball).
  • Tracing mouse clicks for various other applications.
  • Designing icons, banners and animated GIFs.
If you are interested in UI design, I think you might lose sleep thinking about, learning and implementing Elm after this session. A discussion on elm-html, FRP, and the syntax, semantics and implementation of Elm followed. There was a call to the audience to try out interactive programming in Elm.
In one word, the talk was inspiring; on day 2 the audience agreed that it was an impressive talk.

References:
elm-lang.org/try


Before I headed off to talk with the FuConf organizing committee, I attended another talk, 'Functional Programming in Java' by Premanand Chandrasekaran, who ripped open the chest of an inventory management system: searching, counting items and products per warehouse, fetching the first matching item and totalling the items in stock in a particular warehouse or across all warehouses, implemented in Java 6, 7 and 8 styles.
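The talk walked through Java 6, 7 and 8 versions of these operations; as a rough idea of the Java 8 flavour only (the Item class, warehouse names and stock list below are my own stand-ins, not the speaker's code):

    import java.util.Arrays;
    import java.util.List;
    import java.util.Optional;

    public class InventorySketch {

        static class Item {
            final String name;
            final String warehouse;
            final int count;

            Item(String name, String warehouse, int count) {
                this.name = name;
                this.warehouse = warehouse;
                this.count = count;
            }
        }

        public static void main(String[] args) {
            List<Item> stock = Arrays.asList(
                    new Item("keyboard", "Bangalore", 120),
                    new Item("mouse", "Bangalore", 45),
                    new Item("monitor", "Chennai", 30));

            // Fetch the first item stocked in a particular warehouse.
            Optional<Item> first = stock.stream()
                    .filter(item -> item.warehouse.equals("Bangalore"))
                    .findFirst();

            // Count the distinct products held in that warehouse.
            long products = stock.stream()
                    .filter(item -> item.warehouse.equals("Bangalore"))
                    .count();

            // Total items in stock across all warehouses.
            int total = stock.stream()
                    .mapToInt(item -> item.count)
                    .sum();

            System.out.println(first.map(item -> item.name).orElse("none")
                    + " | " + products + " products | " + total + " items in total");
        }
    }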
More followed with mentions of standards and sub-standards of Java programming.
In one word, the talk was informative.

Day 2
It was evident on day 2 that all attendees craved more learning and interaction with the functional programming community.

Keynote:
Bruce Tate, in his keynote 'The role of fear in language adoption', explained the pros and cons of 'crossing the chasm' for an employee and for the organization itself.
The fear factors, as per Bruce, were segregated into paralysing and motivating fears, which creep in when an organization tries to adopt a programming language new to or different from the one already in use.

In recent times the paralysing fear factors are support for the adopted language, building communities, onboarding the required talent, and documentation.
The motivating fear factors are concurrency, code complexity, multi-core hardware and distribution.

Not everyone accepted the idea of fear in adopting a different language.

The other talks that followed on day 2 shed light on programming in functional languages like Dyalog APL. The talk delivered by Morten Kromberg had newbies and experienced programmers alike craving more of Dyalog APL. I spoke to a Dyalog programmer, Radha, who was overwhelmed to have met Morten, the CTO at Dyalog, and expressed her joy at getting to know the community. She explained that where complex computations are involved, programming in Dyalog can come to your rescue.
The speaker then briefly covered the history and mentioned that functional programming has existed, and has solved problems for several companies, since the 1960s and 1970s; it is not as new as it is presumed to be.
The talk was hilarious, informative and very well presented, and had the audience rushing towards Morten afterwards to ask for additional coaching exercises. Morten expressed his willingness to coach the interested lot.
It struck me: what if Dyalog were introduced as part of the syllabus in educational programmes? I guess learners would enjoy learning and programming in this language (which has survived for five decades) for years to come. I personally liked the use of cachedget among other computations.

Dyalog has existed for over five decades, and the team has kept making updates to keep up with the trends in programming to this day. Radha conveyed that the recently rolled out version of Dyalog is feature rich, with excellent and quick updates from the support team.

The current trends in programming, and what the participants were gaga about at this conference (as I could gather from one of the slides in Bruce Tate's presentation), are Scala, Erlang, Elixir, Haskell and Clojure. With many more programming languages exploding onto the scene, I prefer to conclude that there are many solutions to one problem. Figuring out how to code, test, implement TDD, integrate with standalone apps, web apps and services, and how to deploy and release, is now a choice left to the learners.

Kudos to the Functional Conference committee for organizing this educational and very informative conference for the developer and testing communities alike.

To sum up the references made and the jargon used at the conference, here's a word cloud:


Thursday, 11 September 2014

Exploratory Testing And Test Coverage

Test coverage while exploratory testing needn't be limited to one or more of the factors below:
- Test ideas matching the requirements
- Test ideas matching a tester’s skills
Let us explore other options for providing test coverage when:
- The requirements are not yet available or are incomplete, which is usually the case in an agile way of working, with frequent changes being made to the product.
- There is not yet a definite way to map the requirements to the test ideas generated.
- A tester does not know whether the coverage provided is sufficient for the ask of the product owner or business.
Many may differ on the mention of enough or sufficient coverage, ruling it out with the answer that the test cases match the requirements. In that case, have you as a tester ever been asked, “Why was this not tested?” And what was your answer?
Did the answer “I have provided test coverage as per the requirements gathered” help you get away from answering further questions about test coverage?
If not, read on.
A tester’s skills and test ideas are, I would like to think, directly related.
Should a tester stop providing test coverage at that? No.
When I was at the Selenium Conference, Pradeep Soundararajan spoke to me about providing test coverage when exploratory testing, and he added another insight to my learning:
- Having a list of ideas to refer to can help a tester cover areas in which he or she is not proficient.
- Preparing or modelling this list before testing begins is essential and can help match the test coverage to the test requirements.
Having said this, who will prepare, provide and decide the relevance of test ideas? Everyone in the team. Because, as we all know and now need to put into action, “Quality is everyone’s responsibility”, not just an individual’s.
Next time you encounter the question “Why was this not tested?”, gather sufficient evidence and answer these questions:
- Was a requirement raised to include this particular functionality?
- Was the same functionality built into the code?
- Was test coverage provided for this functionality?
- Were the test ideas reviewed?
“Was it tested?” and “Why was it not tested?” are, as we now notice, definitely not the first questions in this series.
How else can one benefit from testing beyond one’s own skills, and from not stopping at matching test ideas to test requirements?
I have tried to answer this question below.
I am currently building a mind map factory at my workplace.
The mind map factory houses mind maps: sets of test ideas to refer to when testing.
This opened up more opportunities to learn. How? The exercise helped me identify the areas in which I am proficient and the areas I needed to work on.
There was a concentration of ideas in certain areas and fewer in others.
I got in touch with other learners in the areas I needed to improve.
We got to pair test and in turn added to our learning. How often do we do this?
It is essential to spare time for learning as we work, which can only boost our performance and the product’s. Weekend testing is yet another place where we can learn from many.
I am an expert; do I need to pair test?
Yes. An expert or guru can add to others’ learning and can learn a thing or two from many others.
I witnessed this recently as a participant at the CAST conference: testing gurus pairing up with fellow learners and co-presenting, getting their work or presentations reviewed, and asking for feedback after a presentation are a few examples of how we can learn from everyone.
When the community itself embraces learning, why should a company or firm stay away from such a culture? Embrace learning from others, learning with others and participating in a product’s development as a team.
I stress a point I also emphasized at the Selenium Conference held in Bangalore on September 5th and 6th: ‘Stop asking for permission to do things (work, learn) which are within your ability to do.’ Take decisions about your learning without the influence of others. Stop at nothing.
Conclusion:
We testers need not limit our test ideas to requirements or an individual tester’s skills.
Originally published here: TestBrewer

Friday, 15 August 2014

CAST2014 - An experience report by a first time CAST attendee

CAST2014
An experience report by a first time CAST attendee.


A year ago, when I was introduced to the software testing community, I made a list of things to do. The CAST conference, a Mecca for software testers, was on my wish list. And I am glad to have made it to the 9th annual conference of the Association for Software Testing, held this year at the Kimmel Center in the picturesque city of New York.


I witnessed the goings-on first hand, thanks to the AST Board of Directors (BOD), and I have made an attempt here to present the learnings to you.


My journey began with the volunteers' meet on Sunday evening, August 10th.
The volunteers and the Board of Directors of the Association for Software Testing (AST) met to discuss the volunteering work. The proceedings that follow are an after-effect of this meeting.
Thanks, Anna, for leading the way.


Volunteers at CAST:
Anna Royzman (BOD)
Keith Klain (BOD)
Richard Robinson
Smita Mishra
Paul Holland
Bernie Berger
Paul Holland (BOD)
Mike Lyles
Helena Jeret Mae
Volunteers


Registration Desk
The busy registration desk catered to the registrants, providing them with adequate information and directions from day 1 to day 3. Thanks to Pete Walen, Dee Ann Pizzica, Markus Gartner and the AST Board members, who made it an event to cherish and helped facilitate the learning.


AST Board of Directors in action.


Keynotes at CAST2014
  • James Marcus Bach
  • Trish Khoo
  • Carol Strohecker
  • Ben Simo
  • Matt Heusser


James Bach - Test cases are not testing: towards a performance culture
If you have read about and followed the context-driven testing community, and have been an active part of the context-driven testing talks online and offline, then you would already know that test cases are not testing.
James emphasized the fact, and the need not to consider test cases as testing, with hilarious analogies of the crab and the booby trap for testing and checking.
James delivering the keynote


I learnt this from James’s keynote:
  • How to be a strong advocate of what you believe in, despite opposition and arguments.
  • Online mentoring can help in sharing tacit knowledge.


Many mentors have taken it upon themselves to illuminate the software testing world and eliminate the darkness that exists in nonsensical definitions, terms and processes. James Marcus Bach, Michael Bolton, Anne-Marie Charrett and Huib Schoots are a few among the many testers who coach online (via Skype), and I was glad to meet them all in person.


Douglas Hoffman and James Bach continued to debate automation in testing even after the open season*. And amen to this: arguments are common among people with a passion for the craft.


*Open season is when the audience gets to ask a speaker questions using the K-cards provided to all participants.
The facilitators Paul Holland and Richard Robinson explained the usage of K-cards to a room full of attendees. Paul and Richard were great at facilitating throughout the event and kept the audience educated and entertained. Kudos to both of you.


Trish Khoo - Scaling up with Embedded Testing
A passionate tester, Trish presented on the cross-over from software development to software testing: an experience report gathered from her conversations with various testers and test teams across the industry, and on the need for, and lack of, developers in testing and vice versa. From the talk:
Trish shed light on testers' and non-testers' involvement in shortening the feedback loop between the creation and verification of a product - an excellent contribution towards understanding the delay in responses across teams in the software development life cycle.


Carol Strohecker - STEM to STEAM: Advocacy to Curricula
Carol, from the Rhode Island School of Design (RISD), presented her experiments with design. She struck a chord with the audience through interesting insights into design in everyday life, and stuck to the theme of the conference. I agree that design can sound too artsy, but her references to innovation by design put the science back into the CAST theme, “The art and science of testing”.


Ben Simo - There Was Not a Breach; There Was a Blog
The focus of Ben's keynote was security testing. The talk engaged the audience, provided guidelines for securing an online web application, and showed how HTML injection and browser add-ons like the Developer Toolbar can reveal leaks of sensitive information. It did seem that a lot of the issues in healthcare.gov could have been fixed if the team had tested the application from a web security perspective as well. The testing community sat up and took notice of the security aspect of testing. It was a well articulated and well presented talk; a commendable job by Ben.


Matt Heusser - Software Testing State of the Practice (And Art! And Science!)
Highlights from Matt’s talk included:
  • State and longevity of the software testing practice
  • Teaching testing


These topics need to be spoken about more widely, as the testing community around me (in India) is still naive and not yet much exposed to the context-driven school of testing.
Matt’s message was clear: there is no difference between development and testing teams; rather, we are all on the same level. And I feel that level is “everyone in the team being responsible for delivering a quality product”.


I agree with what Jean Ann Harrison, and later Matt, echoed: weekend testing sessions* require little commitment - one Saturday a month - and they are where participants also get to learn from many other testers across the globe.


*Weekend testing - online testing sessions where testers are taught skill-based testing on one weekend every month.
Learn more about weekend testing chapters here:


As I gathered, the essence of Matt’s talk was: “Don’t just rely on your company trainings. Make your own opportunities - take help from mentors and learn from each other.”
He wrapped up his talk with three words - honesty, capability and reach - as the futuristic mantra for the software testing world.


Live and Recorded Keynotes and tracks
Thanks to Benjamin and Paul Yaroch for livestreaming CAST, and to Dee Ann Pizzica for co-hosting “CAST Live” with Ben this year. All recorded sessions and interviews are now available online: https://www.youtube.com/user/TheAstVideos


How did CAST help my learning?
  • I need an introduction to the subject or sciences.
  • I need to gather, interact with and interpret the knowledge sources for myself.
  • I need to be actively present to know what’s in it for me.
  • Only then am I equipped either to advocate it or to decide whether it is for me.
And CAST helped me learn just this. The conference was indeed a fast-track ticket for me to learn how I learn.


How did I learn at this conference?
  • I absorbed as much as possible of the information generated at the conference.
  • For the rest of the learning, I made notes, talked to people I could later recollect with, and got onto the social sharing media* to learn about the references I had noted down.
*A sizeable amount of information is shared on social media; on Twitter, for example, you can follow the leads from CAST with the hashtag #CAST2014.
Link to the search results for #CAST2014: https://twitter.com/search?q=%23CAST2014&src=typd


******************************************************************************


What did I learn or unlearn about testing at CAST?
  • What is in scope and out of scope for testing.
  • How to test, and how not to test at all.
  • Is there a school of testing that I belong or subscribe to?
  • Do I belong or subscribe to the existing schools of testing at all?
  • How to change and quickly adapt to the evolving world of software testing.
  • Solving problems and sharing solutions with other community members by conferring, blogging and/or interacting through other communication media.
  • This list is incomplete, as I am still learning even as I jot down these points.


Per Scholas
If you need an honest opinion about CAST, I would say head straight to the Per Scholas participants.
In a word: wow. The Per Scholas graduates knew what was being talked about at this conference and how relevant it is to current trends in the software testing world, as the education provided to them via the STEP program is first hand and undiluted.
There was active participation from this group.
  • Harrison C Lovell
  • J. Winston Tokuhisa
  • Jessica Nickel


Humans of New York - And how these people helped me learn.


Keith Klain and Anna Royzman
As enthusiastic as ever, Keith and Anna helped me answer a lot of questions throughout my journey to CAST2014. I was glad to meet and spend time with both of these AST board members. Thank you for sharing your time; you both made this conference happen through event promotion before, during and after the event.
Harrison C. Lovell, a Per Scholas graduate, has an unquenchable thirst for questioning. As he puts it: “I came, I saw, I conquered.”
He found automation, philosophy and gaming well paired with each other at the conference.
Other talks emphasized questioning as an integral part of testing, including the one Harrison and Michael Larsen delivered - Coyote Teaching.
More about the talk, in Michael Larsen’s words: http://www.mkltesthead.com/2014/08/coyote-teaching-watch-how-it-all-came.html


Geordie Keitt from Doran Jones Inc
His views on the conference:
  • Very well run and logistically smooth. Kudos to all the past and present AST Board members.
  • On the number of participants: many, including Geordie, felt the attendance was just the right size compared to other, larger conferences.


Ryan Arsenault, a Community Management Associate at uTest:
  • Introduced me to the ways of working at uTest.
  • Shared uTest’s vision, mission and focus.
  • Explained what they do and how they get testers to be socially informed and active on their discussion forums.
For the most part, Ryan spoke about how to get projects at uTest. If you are new to testing and need more information, visit http://www.utest.com/


Conferences are a place to meet and learn from testers who are quite inaccessible otherwise and elsewhere. I was glad to interact with Maria Kademo, Claire Moss, Jean Ann Harrison, Pradeepa Narayanaswamy and Lanessa Hunter, among others, with whom I shared practical problems, discussed solutions and approaches used to solve testing problems, and gathered their takeaways from this conference. Thank you all for your involvement. Oh, and did I mention Mike Lyles and Michael Larsen, who shed light on other testing conferences and on live blogging, respectively?
A Testing Troupe at CAST2014


Election and results - BOD
The results are announced, and we have the new and re-elected Board of Directors for AST.
President:  Michael Larsen
Executive Vice President:  Keith Klain
Treasurer:  Alessandra Moreira
Secretary:  Markus Gartner
Vice President of Education:  Justin Rohrman
Executive at Large: Erik Davis and Peter Walen


Congratulations to the AST Board. I personally look forward to all that this team can collectively do for the betterment of the testing community. We already have a great start in the form of the Stop 29119 movement.


I spoke to a few of the Board members about their vision for CAST2015, and here is an excerpt.


Justin Rohrman
His biggest focus will be on:
  • Applicable knowledge (know it → use it), so that participants can help themselves with the practical, applicable knowledge gained.
  • Authentic problem solving, and ensuring that CAST continues to encourage open debate.


Alessandra Moreira
As a newly elected Board member of the AST, Ale wishes to get CAST to reach out to more of its international audience.
She shared that CAST2014 was different this year because of the buzz and excitement it generated pre-conference, that CAST is diverse, and that she was glad to see the testing community maturing. The part of her vision I liked most is “bringing CAST to Asia”, which would make CAST affordable to the many Asian participants who look forward to being part of the conference in the near future. I know that Ilari Henrik Aegerter is rooting for Portland as the next CAST venue ;)


For a detailed report on the AST election results, please follow the link below:


Indian participation
From a land of over a billion people, there were only a few participants (countable on one's fingers).
I wish the participants would spread the word and get the community to participate more often and in good numbers (good numbers = enough to spread the word about the context-driven testing community). Many, including myself, were unaware of the goings-on in the software testing world until very recently. So I urge the participants to go out and spread the word about this friendly testing community, who are willing to learn and share their knowledge with you. I am doing my bit.


Speakers and Volunteers:
Parimala Hariprasad
Dhanasekar Subramaniam
Smita Mishra
Lalit Bhamare
Jyothi Rangaiah


Test Lab - an extension of CAST
Testing products and services, like TestComplete and SoapUI Pro by SmartBear, were on display at the Test Lab.


At the Test Lab, I found Mike Lyles encouraging testers to test the http://mailinator.com/ and http://leankit.com/ applications. We spoke about a host of ideas, mostly about:
  • the testing community, the participation and the talks at CAST, and how well the prepared abstracts matched the talks;
  • K-cards and their usage;
  • traditional lessons versus the new lessons we learnt at CAST;
  • the relevance of the talks to current trends in software testing.


Mike shared that he gets to meet great testers here who come only to CAST.
Testers took time out of the scheduled talks to come over, log bugs at the Test Lab and win goodies from SmartBear. Thanks to the SmartBear team for sponsoring and getting involved with CAST on this scale. I spotted Claire Moss enthusiastically participating; she seemed omnipresent and was also live-tweeting the entire event.

This happened at CAST2014
Standard 29119 and a petition against it
Standard 29119, non-compliance, and what's in it for me?
Go ahead and read the 28-page document if you wish to.


I guess you are now fairly sure, and get to decide for yourself, whether this is how you wish to introduce software testing to future generations. Answer for yourself.


1,127 signatures are registered as of 4th October. Whether you are a software tester or not yet one, you could be a part of this historic moment too. Sign the petition, if you choose to, by following this link: http://www.ipetitions.com/petition/stop29119 Thanks to Karen Nicole Johnson, Iain McCowatt and the ISST team for their enthusiasm as precursors of this movement.

One conference, many lessons - whether you subscribe to a particular school of testing or not, this conference appeals because of its diversity, the learning it has to offer and its enthusiastic, ever-growing community. I urge the people in it to participate actively. The community needs you as much as you need it.


Reading material and Related articles
A humongous amount of information was generated at CAST; a lot of it can be read on participants' blogs, and the event itself was live-streamed (http://www.ustream.tv/channel/castlive) and can now be watched online by subscribing to the AST channel on YouTube.


Some references are as follows.
Below is a link to the four-page reference reading, book recommendations and related articles collated at CAST.


Here's a Test Engineer's skills map that I chanced upon while preparing this article and thought you might want to take a look at:


About AST and its involvement in the testing community
AST's Grant Program
Read about it here:
Many volunteers (including myself), Per Scholas graduates and local testing community meetups benefit from this Grant Program. Thank you, AST and team, for making my wish come true to meet and learn with world-class software testers. I continue to add value and give back to this community in whatever capacity I can.

Here’s a mind map version of the above write-up.




Cheers and enjoy the online version of CAST2014 on a computer near you.


Acknowledgements
Thank you, AST Board of Directors, facilitators, sponsors and participants, for making CAST2014 a grand success and an event that I will look forward to in the coming years.


Thank you, Keith Klain and the Board, for introducing me to CAST. I have enjoyed my time at CAST, meeting people, learning from them and sharing my learning via this write-up.
I look forward to participating actively in future CAST conferences (live or otherwise), doing my best to help the community grow, and seeing people share new learnings year after year at CAST and at other software testing conferences.


Appendix
AST - Association of Software Testing
@ast_news


ISST -  International Society for Software Testing


K-Cards
Read the comments too - making K-cards accessible to colour-blind facilitators, speakers and audience members.


Per Scholas
@perscholas


STEP

Software Testing Education Program (STEP) is a free 8-week course written in collaboration with several industry partners and led by Paul Holland.

Originally published at: http://www.teatimewithtesters.com/