Thursday, August 18, 2016

Registration Site Testplan



The following was requested by some folks at an online education company.  The assignment was as follows:

I would also like you to include the following: 

  1. What questions might you ask the Product Owner?
  2. What coordination/communication would you do at this point?
  3. What suggestions would you have to make this User Story and Acceptance Criteria better or more clear?
----

User Story: 
As a training attendee, I want to be able to fully register online, so that I can register quickly, painlessly and reduce overall paperwork

Acceptance Criteria: 
  •     All the mandatory fields are required for a user to submit the form.
  •     Protection against spam is functioning properly.
  •     Information from the form is stored in the correct database and the data is intact.
  •     Payment can be made via credit card in a secure way.
  •     An acknowledgment email is sent to the user after submitting the form which includes how to get started.

My questions to the product owner are below.
1.    What sort of training is this?  Is this just a training session or something more like a convention or seminar?
1.a.  Are there different options or packages the user can sign up for?  The test plan assumes only one session of training at a single price.  If there are multiple options, we will need to create test cases around all costs and options.
1.b.  Is there a need for the user to schedule anything within this training?  Seminars may have a single cost but the user may need to schedule which sessions or panels they attend.  The test plan assumes a single training without sub-events.  This could be a secondary page with many testing permutations.
2.   Will this training happen again?  Is it a single event or a regular training our company performs?
2.a. If the training occurs on multiple dates that the user can pick from, I need to capture this.  My tests assume the event occurs only once.
2.b. If it is annual, we may be doing extra mailings to advertise future trainings to previous attendees.
3.   Who is the host of this event?
3.a. If we are hosting, do we need to provide food to the attendees?  We may need to add a choice of food as an option to select.
4.    Is a hotel connected to this event?
4.a. Do we need to book the hotel room along with the training registration?
4.b. Do we need to sell or include parking passes?
5.    What materials are given out?
5.a. We should send digital versions out with the confirmation email.
5.b. Are there options here like purchase of software? 
6.    Do we limit who can come to this training in any way?
6.a. Is there a blacklist of banned persons not allowed at the training?  I would need tests that cover exclusion of these people.
6.b. Is there a whitelist or qualification that must be met?  I don't know if only people from a particular company are allowed or if there are other qualifiers.  I would need to test for this.




And finally the test plan itself.
Training Registration Web Site Test plan
1.       Introduction:  Through this document I will outline my testing strategy for a training registration web site.  The goal is to list the broad areas of testing here, including general test descriptions.  Further, I will document the dependencies and resources the testing team will require for successful validation.
1.1.    In-scope items:  Payment system flow, email confirmation, validation of the database through system test flows, cross-browser web validation, and checking of anti-spam/real-person validation.
1.2.    Out-of-scope items:  Speed and performance benchmarks, and internationalization and localization testing.  We are assuming a single training event that does not repeat annually and will occur in an unnamed US city.
2.       Required Resources:  I am assuming a standard web site configured in a generic web server environment that is standard to the industry.
2.1.    Test and live environments
2.2.    Database set up and access to said database
2.3.    Deployment system, perhaps Jenkins.
2.4.    Automation environment if there are to be future training sessions.
3.       Test Approach:  My general methodology will be exploratory testing to validate the code is ready to move on to functional test.  Functional verification will validate positive and negative flows of the ticket registration system.  Regression cases will be marked for the future; as we are starting at square one, there is no existing code to validate against.
3.1.    Exploratory testing:  We will be performing two rounds of unstructured tests.  The first will explore areas broadly to determine whether they are functionally ready to enter testing.  The second round will start after the sign-up and billing flows are functional.
3.1.1.  Both rounds will be reported using James Bach's exploratory testing format as documented at http://www.satisfice.com/articles/et-article.pdf
3.1.2.  Round 1 will be 30 minutes.  The goal will be verification of base functionality.
3.1.3.  Round 2 will be a 1-2 hour block.  The goal will be familiarizing the tester with execution and basic detection of issues.  Further, the UI will be evaluated for user friendliness, basic look and feel, etc.
3.2.    Functional testing:  This will need sign-off by both development and the product owner before execution.  I will send it out at least three days prior to estimated code delivery for sign-off and will update it with feedback.
3.2.1.  (Regression) Verify the registration flow through completion of a single purchase.
3.2.2.  Verify Name, Email, Credit Card, and Billing information fields are mandatory
3.2.3.  Verify standard email format compliance is enforced for the email field; a@b.com should be the minimum accepted pattern.  Verify credit card entry requires a 16-digit card number; an error should be issued if fewer or more digits are entered.  (A sketch of these format checks appears at the end of this section.)
3.2.4. Verify a Visa prefix of 4 or a MasterCard prefix of 5 is required for credit card entry.  An error should be issued if the combination is not correct.
3.2.5. Verify the CVV is prompted for and explained within the credit card information.
3.2.6. Verify the billing address can be set to the same as the mailing address with a checkbox.
3.2.7. Verify Company and address fields are not mandatory.
3.2.8. Verify the CVV requires 3 digits for all cards except Amex, which requires 4.
3.2.9. Verify a test card number set to be rejected by the processor displays an appropriate message within the purchase flow.
3.2.10.    (Regression) Verify purchase information is routed through HTTPS.
3.2.11.    Verify entry of Real Person verification is enforced as a mandatory field.
3.2.12.    Verify the user is alerted when entering invalid Real Person information.
3.2.13.    (Regression) Verify we can positively proceed through Real Person validation with correct information.
3.2.14.    Verify the user can enter non-standard characters from the UTF-8 set in the name and email fields.
3.2.15.    (Regression) Upon completion of purchase, verify the correct email is sent to the user.
3.2.16.    (Regression) Upon completion of purchase, verify the database tables are updated with the registration information.  (A sketch of these post-purchase checks appears at the end of this section.)
3.2.17.    (Regression) Upon completion of purchase, verify the funds are delivered to our accounts from the credit card processor.
3.2.18.    (Regression) Upon completion of purchase, verify the web page alerts the user with confirmation text and a confirmation number.
3.2.19.    Attempt to register as a banned individual; verify the user cannot purchase attendance to the training.
3.2.20.    Attempt to register as a user that has already registered with completely matching information.  Verify the user cannot purchase a second attendance and is alerted that they are already signed up.
3.2.21.    Attempt to book a number of users beyond the limit of the particular training; verify the user cannot purchase attendance.  The web site should alert people coming to the site that the training is now full.
3.2.22.    Cancel the scheduled training through the database or backend; verify an email is sent to all registered users that the training has been cancelled.
3.2.23.    Attempt to purchase when the server time is after the scheduled training; verify the user cannot purchase attendance.  The user really should not be given the option to see it as a valid item for purchase.
3.2.24.    (Regression) Verify we can refund a purchase through backend flows.
3.2.25.    Verify you can purchase attendance for two different users with the same credit card.
3.2.26.    Verify the card information is retained if the browser session is ended but the cache has not been cleared.
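To make the field-format cases above (3.2.3, 3.2.4, and 3.2.8) concrete, here is a minimal sketch of how those rules might be automated.  This assumes Python with pytest; the helper names are my own, and the accepted patterns (16-digit numbers, Visa prefix 4, MasterCard prefix 5, 3-digit CVV except Amex) simply mirror the test cases above rather than the full card-network rules.

```python
import re

# Minimal email pattern per 3.2.3: something@something.something (a@b.com at minimum).
# Real-world email validation is more involved; this mirrors the acceptance bar only.
EMAIL_PATTERN = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")


def email_is_valid(email):
    return EMAIL_PATTERN.match(email) is not None


def card_number_is_valid(number):
    # Mirror 3.2.3/3.2.4: exactly 16 digits with a Visa (4) or MasterCard (5) prefix.
    return number.isdigit() and len(number) == 16 and number[0] in ("4", "5")


def cvv_is_valid(cvv, card_type):
    # Mirror 3.2.8: 3 digits for all cards except Amex, which uses 4.
    expected = 4 if card_type.lower() == "amex" else 3
    return cvv.isdigit() and len(cvv) == expected


def test_email_format():
    assert email_is_valid("a@b.com")
    assert not email_is_valid("not-an-email")


def test_card_number_format():
    assert card_number_is_valid("4111111111111111")      # Visa-style test number
    assert not card_number_is_valid("411111111111111")   # too few digits
    assert not card_number_is_valid("6011111111111111")  # unsupported prefix


def test_cvv_length():
    assert cvv_is_valid("123", "visa")
    assert not cvv_is_valid("123", "amex")
```

In the real suite these same accept/reject expectations would be exercised against the form itself rather than local helpers, but the boundary values carry over directly.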
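The post-purchase checks in 3.2.15 and 3.2.16 could be spot-checked with short helpers along these lines.  This is only a sketch under assumptions: sqlite3 stands in for whatever database the site actually uses, the registrations table and its columns are invented for illustration, and the confirmation email is pulled from a hypothetical QA test mailbox over IMAP.

```python
import imaplib
import sqlite3


def registration_row_exists(db_path, email):
    # 3.2.16: the registration landed in the database and the data is intact.
    # sqlite3 and the table/column names are stand-ins for the real schema.
    conn = sqlite3.connect(db_path)
    try:
        row = conn.execute(
            "SELECT name, email, confirmation_number FROM registrations WHERE email = ?",
            (email,),
        ).fetchone()
        return row is not None and all(row)
    finally:
        conn.close()


def confirmation_email_received(host, user, password):
    # 3.2.15: a confirmation email reached the test mailbox.
    # The subject text is an assumption; adjust to the real confirmation email.
    mailbox = imaplib.IMAP4_SSL(host)
    try:
        mailbox.login(user, password)
        mailbox.select("INBOX")
        status, data = mailbox.search(None, '(SUBJECT "registration confirmation")')
        return status == "OK" and bool(data[0].split())
    finally:
        mailbox.logout()


# Example usage after completing a purchase in the test environment:
# assert registration_row_exists("registrations.db", "a@b.com")
# assert confirmation_email_received("imap.test.example.com", "qa-inbox", "password")
```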
3.3.    Browser Testing Matrix (a cross-browser smoke-test sketch follows the matrix):
Browser  | Pass/Fail | Defects
IE       |           |
Firefox  |           |
Chrome   |           |
Safari   |           |

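As a starting point for filling in the matrix, a parameterized smoke test along the following lines could drive each browser through the same basic check.  This is a sketch only: it assumes Selenium WebDriver and pytest are installed, a driver binary is present for each browser, and the URL and form field names are placeholders for the real registration page.

```python
import pytest
from selenium import webdriver
from selenium.webdriver.common.by import By

# Placeholder URL; substitute the real registration page.
REGISTRATION_URL = "https://training.example.com/register"

# One entry per row of the matrix above.
BROWSERS = {
    "ie": webdriver.Ie,
    "firefox": webdriver.Firefox,
    "chrome": webdriver.Chrome,
    "safari": webdriver.Safari,
}


@pytest.mark.parametrize("browser", sorted(BROWSERS))
def test_registration_form_renders(browser):
    driver = BROWSERS[browser]()  # requires the matching driver for that browser
    try:
        driver.get(REGISTRATION_URL)
        # Field names are assumptions; adjust to the actual page markup.
        assert driver.find_element(By.NAME, "name").is_displayed()
        assert driver.find_element(By.NAME, "email").is_displayed()
    finally:
        driver.quit()
```

Each browser's pass/fail result then drops straight into the matrix, with any failures logged as defects.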
4.       Entrance criteria:  Full functionality of website checked into source management.  No blocking issues that prevent testing from proceeding. 
5.       Exit criteria:  Functional testing, including negative tests, has passed validation.  Only low-priority defects remain unfixed.  Regression of the live site has been completed, conforming to the regression set marked above.
6.       Deliverables: The following are the items the Quality team recognizes as its responsibility to provide.
·                       Final test plan document
·                       Detailed test cases.
·                       Sign-off of functional tests in the staging or QA environment
·                       Sign-off of functional regression in the live environment
7.       Risks:  Late delivery and schedule compression beyond the control of the Quality team.  Also third-party systems, which include the credit card processor and the provider of Real Person validation.
8.       Schedule: This will be determined by the SDLC style; Scrum sprints are usually two weeks in duration.  I am assuming testing will need to be completed within the confines of a two-week cycle.  This includes reporting defects and revalidating those defects once they have been fixed.  Regression of the live site would also be completed within the two-week sprint cycle.

Toaster Testplan



Hi Everyone, 

I am starting a series of posts related to my job search.  In several of my interviews I have been given various tasks related to QA work.  The following is the first such request, which I was given a few years ago.  This sort of request has become the norm in interviews of late.  This is a test plan for a toaster that was requested by a company in Silicon Valley.  I will try to find out which one, but for now here it is.  I hope it can be of some use to you.



Toaster Test plan
Version # | Implemented By | Revision Date
1.0       | Jim Rickel     | 4/17/2014

1.       Introduction:  Through this document I will outline our strategy for testing a standard toaster.  The goal is to list the broad areas of testing here, including general test descriptions.  Further, I will document the dependencies and resources the testing team will require for successful validation.
1.1.    In-scope items:  We will validate the toaster and any documentation provided for it.  We are making the assumption that this is a standard two-slot toaster with a lowering and rising design.  We are assuming the target market for this product is the US.  Further, we assume it to only be available in a single all-metal housing, without variation.
1.2.    Out-of-scope items:  All aspects of the box and packaging are assumed to be out of scope for our validation.  Marketing will be responsible for all advertising, including proofing the copy of their materials.  We are also assuming no responsibility for power outside of the wall socket.
2.       Required Resources:  The following is a list of materials the quality team will require to complete a full pass of the functional test cases.
2.1.    A large cross section of breads, toaster pastries and frozen waffles.  This will assure depth of test coverage.
2.2.    Butter
2.3.    Metal knives
2.4.    Volt meter
2.5.    Standard wall outlet configured for US 110-volt power.
2.6.    A variety of foreign adapters and a set of matching wall outlets.  See http://www.voltagevalet.com/outlets.html for a list of outlet varieties.
2.7.    High wattage microwave.
3.       Test Approach:  As this is an electric appliance, our goal is to first validate the safety of the toaster through a series of “smoke” validations.  Testing will then move on to a functional verification of the toaster.  We will then benchmark the toaster against others on the market and finally execute a series of destructive tests to assure structural quality.  All testing will assume the removal of packaging stickers prior to execution.
3.1.    Safety and Smoke testing:  This aspect of testing will assure we have created a toaster that is in fact safe to own and operate.  Localization testing with regard to adapters and non-US power outlets will also be included in this testing.
3.1.1.  Verify the item can be plugged in safely.  Cord and toaster do not become unsafely hot.
3.1.2.  Verify bread can be toasted safely.  Bread does not ignite; cord and toaster do not become unsafely hot.
3.1.3.  Verify bread can be toasted to maximum darkness.  Bread does not ignite; cord and toaster do not become unsafely hot.
3.1.4. Verify the toaster blows a fuse when a metal knife is inserted into the operating toaster.  The toaster should become inactive rather than passing current through the knife.
3.1.5.  Overload the circuit with toaster and high wattage microwave.  Bread does not ignite; cord and toaster do not become unsafely hot. Verify toaster continues to operate safely after circuit is reset.
3.1.6.  Verify operation of toaster with foreign adapters and wall sockets.  The following matrix should be completed.
Adapter | Pass/Fail
PBC-1   |
GUB     |
GUR     |
GUK     |
GUI     |
GUS     |
GUL     |
PCC-1   |
GUC     |
PDC-1   |
GUD     |
PEC-1   |
GUE     |
PFC-1   |
GUF     |
GUZ     |

3.2.    Functional testing:  After we have completed verifying the toaster is safe, we will begin functional validation.  Here we are assuming toasting/warming and clean-up to be the primary functionality of the toaster.  We will also include a set of incorrect-use test cases in this section.
3.2.1.  Verify toaster evenly toasts a slice of bread from center to crust.
3.2.2.  Verify two slices from a loaf toasted with a similar darkness setting are toasted to the same degree.
3.2.3.  Verify the toaster darkness setting.  We will toast single slices of bread, all from a single loaf, each with a different darkness setting.  Testing will start with the lightest setting and move the darkness setting one click or millimeter (dependent on the internal potentiometer design) per slice.  The expected outcome would be a gradual change in darkness from slice to slice.
3.2.4.  Set toaster darkness to max and begin toasting.  Eject the toast before toasting completion.  Finally start toasting a fresh slice.  Verify the second slice toasts to full darkness.
3.2.5.  Verify the toasting of waffles in the toaster.  Expected outcome should match results returned with bread.
3.2.6.  Verify the toasting of evenly divided bagel halves.  The toaster must accommodate this size of food item.  The expected outcome should match the results returned with bread.
3.2.7.  Verify the warming of pop tart brand toaster pastries.  Internal jam should become warm and viscous after toasting.  Fire does not occur.
3.2.8.  Verify the warming of breaded fish patty in toaster.  Oil release should not catch fire. 
3.2.9.  Verify you can open the crumb door.  Verify toaster cleans easily and completely.  Poor cleaning could result in bad reviews which will negatively impact sales.
3.2.10.    Verify the toasting of buttered bread does not result in fire.  Verify clean up as well.
3.2.11.    Verify the toasting of Amish friendship bread or other bread containing chocolate does not melt chocolate into the internal workings of the toaster.
3.2.12.    Verify the company's brand name is correctly spelled, with the correct logo and font.
3.2.13.    Verify model number is correctly marked somewhere on the toaster.
3.3.    Benchmark validation:  In this phase of testing we will be validating our toaster against other toasters currently on the market.  We are going to assume that only our most recent previous toaster model, rather than every past model, will be included in this verification.  For this benchmarking we will want to include competitors’ toasters, our own previous model of toaster, and our current model of toaster.  This section will not include test cases but rather capture results in the form of a comparison matrix.

Model          | Darkness settings | Slot Dimensions | Electrical draw during operation | Electrical draw at rest
Current model  |                   |                 |                                  |
Previous model |                   |                 |                                  |
Competitor 1   |                   |                 |                                  |
Competitor 2   |                   |                 |                                  |
Competitor 3   |                   |                 |                                  |
3.4.    Destructive Testing:  Our goal with destructive testing is to ensure the product can stand up to some degree of mishandling during its lifetime.
3.4.1. Verify the toaster can withstand a fall from a height of three feet and continue operation.
3.4.2. Verify the three-foot fall test can be repeated at least five times.
3.4.3. Verify the fall test while the toaster is plugged in.  The plug should not be adversely affected.
3.4.4. Verify the plugged-in fall test can be executed at least five times.
3.4.5. Stuff the bread slots completely full with bread, ramming the bread down to further fill the slots.  Verify extended toasting does not result in fire.
3.4.6. Verify toaster can remain plugged into the wall socket for a month.  Verify fire does not occur.
3.5.    Documentation Verification:  In this section we will validate the documentation provided by the tech publications team.  Our goal is not to evaluate the grammar of the text but rather to look for factual and formatting errors.
3.5.1. Verify the text of the documentation is free from spelling errors.
3.5.2. Verify the text is correct with regard to the operation of the toaster.
3.5.3. Verify all phone numbers within the documentation by calling each.
3.5.4. Verify online documentation matches printed documentation that is to be included with the toaster.
3.5.5. Verify all photos in the documentation show or represent the current model of toaster.
4.       Entrance criteria:  This will just need to be the toaster with some level of mechanical and electrical functionality; it should be able to produce heat from the heating elements, and the mechanism to raise and lower should be functional.
5.       Exit criteria:  Safety and functional test cases passed.  Documentation tests should be passed, but sign-off is acceptable if the product owner wishes.  Benchmarking must be complete.  Destructive testing should be run but is not of a pass/fail nature.
6.       Deliverables: The following are the items the Quality team recognizes as its responsibility to provide.
·                       Final test plan document
·                       Detailed test cases.
·                       Daily bug triage report, or alternatively a Scrum daily burn-down report
·                       Final sign off report for functional testing
7.       Risks:  As always, the Quality team is a downstream team that does not control when a testable toaster is delivered.  Should the engineering or tech publications teams be delayed in delivering to the Quality team, the final testing will be delayed.
8.       Milestones
·                       Toaster product handed off for testing from engineering
·                       Toaster passes safety verification
·                       Toaster passes functional verification
·                       Toaster destructive tests completed with acceptable results
·                       Toaster benchmarking complete
·                       Documentation handed off for testing
·                       Documentation verification
9.       Schedule: This will be determined by the SDLC style, though, that said, Scrum is probably not a wise choice for development of a toaster.  Validation of the documentation and the toaster product can be performed in parallel, depending on resources.  Benchmarking should only need to wait on safety verification of the toaster.