May 14, 2009

Bug reporting

After you complete your software testing, it is good practice to prepare an effective bug report. How quickly a bug gets fixed depends largely on how effectively you report it. Below are some tips for writing a good software bug report:



If you are doing manual software testing and reporting bugs without the help of any tool, assign a unique number to each bug report. This will help to identify the bug record.
Clearly mention the steps to reproduce the bug. Do not assume or skip any reproduction step.
Be specific and to the point.
Apart from these tips, below are some good practices:

Report the problem immediately.
Reproduce the bug at least one more time before you report it.
Test for the same bug on other similar modules of the application.
Read the bug report before you submit or send it.
Never criticize any developer or attack any individual.
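
To see how these tips fit together, here is a minimal sketch of a structured bug record as a Python dataclass. All field names and values are illustrative, not a prescribed format:

    from dataclasses import dataclass, field
    from datetime import date
    from typing import List

    @dataclass
    class BugReport:
        """A structured bug record reflecting the tips above (illustrative only)."""
        bug_id: str                    # unique number, e.g. "BUG-0042"
        title: str                     # specific and to the point
        steps_to_reproduce: List[str]  # every step, none assumed or skipped
        expected_result: str
        actual_result: str
        module: str                    # where it occurred; retest similar modules
        reported_on: date = field(default_factory=date.today)

    report = BugReport(
        bug_id="BUG-0042",
        title="Login button stays disabled after a failed password attempt",
        steps_to_reproduce=[
            "1. Open the login page",
            "2. Enter a valid username and a wrong password",
            "3. Click 'Login'",
            "4. Correct the password and click 'Login' again",
        ],
        expected_result="Login button is enabled and the user can retry",
        actual_result="Login button stays disabled until the page is reloaded",
        module="Authentication",
    )
    print(report.bug_id, "-", report.title)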

Quality assurance (QA) and testing's role in requirements

The typical late involvement of quality assurance (QA)/testing in software projects certainly limits their effectiveness. By the time many QA pros and testers get involved, all the development defects are already in the code, which makes detecting them much harder and fixing them much more costly. Moreover, late involvement limits how many of those errors are actually found, due to both lack of time to test and QA/testing's lack of knowledge about what to test.

Thus, virtually every QA/testing authority encourages the widely accepted conventional wisdom that QA/testing should get involved early in the life cycle, participating in the requirements process and especially in requirements reviews. The reasoning goes that by getting themselves admitted during requirements definition, QA/testing can point out untestable requirements and learn about the requirements early enough to enable developing more thorough tests by the time the code arrives. Such well-intentioned and seemingly obvious advice unfortunately has a number of hidden pitfalls that in fact can further reduce QA/testing's effectiveness.

Participating in defining requirements

In many organizations, QA/testing receives inadequate information upon which to base tests. Inadequacy comes from several directions. The documentation of requirements and/or design frequently is skimpy at best and often is incomplete, unclear and even wrong. Furthermore, QA/testing often receives the documentation so late that there's not time to create and run sufficient suitable tests.

QA/testing absolutely has a need to learn earlier and more fully what the requirements and design are. However, involving QA/testing in requirements definition and/or review raises several critical issues.

Quite simply, defining requirements is not QA/testing's job. Many organizations have some role(s) responsible for defining requirements, such as business analysts. For some, the business unit with the need is charged with defining their requirements for projects; in other organizations, the project manager or team members may end up defining requirements as part of their other project activities. QA/testing may be one of the responsibilities of these other roles, but I've not heard of any competent organization that makes such other tasks part of the QA/testing role.

Consequently, if QA/testing staff members are assigned to participate in defining requirements, they're probably going to be in addition to (rather than instead of) those regularly performing the task. Not many places are likely to double their definition costs, especially not for something which offers no apparent benefits and may even be detrimental to the definition itself.

While some folks in QA/testing may have some knowledge of the business, one cannot assume they will, and many of those in QA/testing may lack relevant business knowledge. Moreover, it's hard enough to find business analysts with good requirements definition skills, and they're supposed to be specialists in defining requirements. There's no reason to expect that QA/testing people, for whom requirements definition is not a typical job function, would have had any occasion to develop adequate requirements definition skills. Piling on the costs by including QA/testing people in requirements definition would be unlikely to help and could just get in the way.

Participation in reviewing requirements

On the other hand, it's much more logical to include QA/testing specialists in requirements reviews. After all, reviews are a form of QA/testing. In fact, some organizations distinguish QA from testing by saying QA performs static testing, primarily reviewing documents, whereas testing (or quality control, QC) executes dynamic tests of products.

Organizations with such a distinction frequently make QA responsible for reviewing requirements, designs and other documents. It's not in these organizations, but rather in all the others, that QA/testing is clamoring for admission to requirements reviews.

In organizations where requirements reviews are run by someone other than QA, such as the business units/users or management, there may be resistance to allowing QA/testing to join reviews. An obvious reason would be that limited review budgets may not allow for the added costs of QA/testing staff's time attending reviews.

Of course, budgeted expenses could be shifted from later project activities that presumably would require less effort due to QA/testing's participation in reviews. Nonetheless, such seemingly logical budget shifts often are not made, especially when the future savings go to a different part of the organization from that charged for reviews.

However, the bigger but often less apparent obstacle is a surprisingly (to QA/testing) common perception that adding QA/testing to reviews not only may provide no significant positive value but could actually have a negative impact on review efficiency and effectiveness. In such cases, the already stressed rest of the organization is unlikely to go out of their way just to help QA/testing meet its needs. Such rejection often is couched in terms of limited budget, but it may be based on not really wanting QA/testing involved.

The "testability trap"

Why would people feel that QA/testing actually impedes reviews? I call it the "testability trap." In the QA/testing industry, widely held conventional wisdom is that lack of testability is the main issue with requirements. Generally, lack of clarity makes a requirement untestable. An unclear/untestable requirement is likely to be implemented incorrectly, and regardless, without being able to test it, QA/testing has no way to detect whether the requirement was implemented correctly.

Consequently, it's common for comments of QA/testing folks who have been let into requirements reviews to focus almost entirely on the various places in the requirements they feel lack testability. The less they know about the business domain, the more they are stuck speaking only about lack of testability.

While testability is indeed important, frequently it mainly matters to QA/testing, and their repeated review comments about lack of testability can seem like so much annoying noise to the rest of the review participants. In such instances, the presence of QA/testing can be perceived as simply getting in the way of the review, tying up participants with trivial nitpicking. At best, QA/testing may be ignored; sometimes they even get "disinvited" from participating in further requirements reviews.

Be prepared to contribute

The key to not wearing out one's welcome is contributing productively to the review in ways that all participants recognize as valuable. That takes familiarity with the subject area content and with more effective review techniques.

QA/testing people are not only unlikely to have requirements definition skills, they also often have little familiarity with the business domain subject area that is the topic of the requirements. The difficulty can be especially acute for QA/testing groups charged with performing requirements reviews. Since they'll probably be dealing with a wide variety of business areas, chances are lower that they'll know much about any of the many areas.

Requirements are all about content. To contribute effectively to reviews, it's incumbent upon QA/testing to learn about the relevant business content before reviewing related requirements. Because few organizations recognize the need for such preparation, time and budget are seldom provided for it. Therefore, the burden will be on QA/testing to make the time, quite possibly on their own time, in order to enable them to comprehend content sufficiently to contribute productively to the reviews.

Review technique effectiveness

Understanding content is necessary but not sufficient. Most reviews are far weaker than recognized, largely because the reviewers don't really know what to do, how to do it, or how to tell whether they've done it well. Group reviews generally enhance findings because multiple heads are better than one, but they still find far fewer issues than they could or that participants presume they've found.

With the best of intentions, typical reviewers look at the requirements and spot in a somewhat haphazard manner whatever issues happen to occur to them. Even though they may be very familiar with the subject area, it's easy for them to miss even glaring errors. In fact, their very familiarity sometimes causes them to miss issues by taking things for granted, where their minds may fill in gaps unconsciously or fail to recognize something that wouldn't be understood adequately by someone with less subject expertise.

Moreover, it's hard for someone to view objectively what they're accustomed to and often have been trained in and rewarded for. QA/testing emphasizes the importance of independent reviewers/testers because people are unlikely to find their own mistakes. Yet, surely the most common reviewers of requirements are the key business stakeholders who were the primary sources of the requirements.

In addition, it's common for typical reviewers to provide insufficient feedback to the requirements writers. For example, often the comments are largely not much more than, "These requirements need work. Do them over. Do them better." The author probably did as well as they could, and such nonspecific feedback doesn't give the author enough information about what, why or how to do differently.

Formal requirements reviews

Many authorities on review techniques advise formalizing the reviews to increase their effectiveness. Formal reviews are performed by a group and typically follow specific procedural guidelines, such as making sure reviewers are selected based on their ability to participate productively and are prepared so they can spend their one- to two-hour review time finding problems rather than trying to figure out what the document under review is about.

Formal reviews usually have assigned roles, including a moderator who is like the project manager for the review, a recorder/scribe to assure review findings are captured and communicated, and a reader/leader other than the author who physically guides reviewers through the material. The leader often is a trained facilitator charged with assuring all reviewers participate actively. Many formal reviews keep key measurements, such as length of preparation time, review rate and number of issues found. Detailed issues are reported back to the author to correct, and a summary report is issued to management.

Some formal reviews have the reviewers independently review the materials and then report back their findings in the group review session. Often each reviewer reviews a separate portion of the material or looks at the material from a specific perspective different from each of the other reviewers' views. Other formal reviews work together as a group through the materials, frequently walking through the materials' logic flow, which typically covers less material but may be more effective at detecting problems.

Proponents of some prominent review methodologies essentially rely solely on such procedures to enable knowledgeable reviewers to detect defects. However, I've found that it's also, and probably more, important to provide content guidance on what to look for, not just guidance on how to look at the material under review and assurance that reviewers are engaged.

For example, in my experience, typical reviews tend to miss a lot more than recognized for the reasons above and because they use only a few review perspectives, such as checking for clarity/testability, correctness and completeness. Often, such approaches find only format errors, and then sometimes only the most blatant, while missing content issues.

In contrast, I help my clients and seminar participants learn to use more than 21 techniques to review requirements and more than 15 ways to review designs. Each different angle reveals issues the other methods may miss. The more perspectives that are used, the more defects the review detects. Moreover, many of these are more powerful special methods that also can spot wrong and overlooked requirements and design content errors that typical weaker review approaches fail to identify.

When QA/testing truly contributes to reviews by revealing the more important requirements content errors, and not just lack of testability format issues, business stakeholders can appreciate it. When the business stakeholders recognize QA/testing's review involvement as valuable to them, they're more likely not only to allow participation, but advocate it.

If you find it very boring in the office

If you find it very boring in the office, here are some tips for you!

1. Form a detective agency to find out who is quitting next.
2. Make blank calls to your boss.
3. Send mail from Lotus Notes (or Outlook) to your internet mail, immediately get on the internet to see who reaches first, you or your mail, read it there, and note down how long it took. Then do it vice versa!
4. Rearrange the furniture, i.e. flick someone else's chair just to irritate him/her.
5. Count your fingers (and toes if you still get bored).
6. Watch other people changing their facial expressions while working, and try changing your expressions too.
7. Try to stretch status meetings as long as possible just by asking silly questions. (IMPORTANT)
8. Make faces at strangers in the office.
9. Have a three-and-a-half-hour lunch; it's a big social occasion.
10. Learn to whistle.
11. Revise last week's newspaper.
12. Hold "how fast my computer boots" competitions.
13. Practice aiming the coffee cup into the dustbin.
14. Enhance your literature skills: author "1001 Innovative Ways to Waste Your Day" to help your colleagues.
15. Pick up the phone and dial non-existent numbers.
16. Have work breaks in between tea.
17. Count the maximum number of applications your computer can open at a time.
18. For Win NT/95 users: move things to the Recycle Bin and restore them. Then repeat the process. (very important)
19. Look at someone and try to imagine how (s)he might have looked at 5 years old.
20. Read jokes and send jokes.
21. Make full use of the comfortable chair and table provided and take a nap.
22. Send this mail to everyone in your contact list, one at a time.
ENJOY EVERYTHING IN YOUR OFFICE AS I ENJOY DAILY!

Top ten tips for building your self confidence


1. Visualise Yourself As The Person You Want To Be

Each morning spend a few minutes visualising yourself as the person you want to be. Think about the way you dress, the way you carry yourself and the way you interact with other people. Seeing yourself as the person you want to become is the first step towards building self confidence.

2. Self Confidence Statement

After visualising yourself as the person you want to be, read the following statement out loud:

"I know I have the ability to achieve my major goal in life. Therefore today,
I demand of myself persistent and continuous action towards achieving my goal"

Reading this statement out loud is a great way to start your day in a confident state of mind.

3. Dress Well

One of the most effective ways to instantly improve your level of self confidence is to dress well and to make the decision to always be well groomed. This does not mean you have to go out and buy a whole new wardrobe. Instead, just focus on gradually building up a small collection of good quality clothes. Also remember that simple accessories such as a tie clip or necklace can make a big difference to the way you look and feel.

4. Positive Posture

Another powerful way to build your sense of self confidence is to stand up straight and lose the slouch that many of us have acquired over the years. While you may be able to make a change to your posture by simply becoming aware of it, the best way to make a long lasting change to your posture is to practice yoga or pilates.

5. Move With Purpose

A simple but effective tip for increasing your levels of self confidence is to always move with a sense of purpose. In his book 'The Magic of Thinking Big', David Schwartz recommends walking 25% faster than normal. Having a spring in your step lets people know that you have important things to do and actually makes you feel more confident as you go about your daily business.

6. Become A Participant

Have you ever noticed that in most meetings or groups, people immediately head towards the back of the room so that they can remain as inconspicuous as possible? A great way to increase your visibility and sense of self confidence is to make the decision to always sit towards the front of the room and be a participant. When you have something to say - don't be afraid to say it.

7. Connect With Confidence

Another way to quickly improve your self confidence is to practice making a strong first impression. When you meet someone face-to-face, look them directly in the eye, smile broadly, shake hands firmly and say, "Hi Jim, nice to meet you".

Similarly, you can sound more confident on the phone by answering, "Good morning, Carol Jones speaking" instead of simply saying "Hello".

8. Build Your Success File

Occasionally your self confidence will take a hit when something doesn't work out the way you hoped. One of the best ways to repair your self confidence in this situation is to keep a folder outlining your past achievements and successes. You should also include any positive feedback that you've received from others.

As you review your success file and fill your mind with positive comments, your doubts and insecurities will quickly disappear and your self confidence will be restored.

9. Preparation

The BIG secret to being self confident that people rarely talk about is - preparation. The more you prepare and practice for an event, the more self confident you will become.

If you are worried about an upcoming event, use your apprehension as a stimulus to take action and practice, practice, practice. The simple but powerful truth is that self confidence grows through repetition and experience.

10. Toastmasters

My final tip for developing self confidence is to join a Toastmasters group.

Toastmasters is a non-profit organisation that helps people from all walks of life to develop their public speaking and leadership skills. In my experience Toastmasters offers a safe and relaxed environment to step out of your comfort zone and develop the invaluable skills of being able to think on your feet and speak in public.

So there you have it!

If you implement some or all of these 10 techniques you'll gradually develop a greater sense of self confidence which in turn will help you to pursue and achieve your most important life goals.

Until next time,

Dare To Dream!

May 13, 2009

Testing certifications

Nowadays I observe that the majority of software testing people attach great importance to certification: ISTQB alone has 100,000 graduates! Is this good from an employer's point of view? How do you choose the right person when all of them have a chest full of distinctions?

It’s a safe bet that a person who passes an additional exam gains some extra knowledge and will know the theory a bit better. On the other hand, I know the sort of people who only collect “these kinds of papers” and do not care about expanding their horizons.

I think that work experience is the key; a person with a bunch of diplomas but no real-life testing record is a risky choice. Below is a list of available certifications for quality assurance staff, and some advice for choosing the right person.

Vendor certifications:

HP / Mercury
Segue
Rational
Empirix

These are good if we want an engineer specialized in a particular vendor's test tools.

On the other hand, we have vendor-neutral exams and courses:

ISTQB Certified Tester

three levels: Foundation, Advanced, Expert
exam prices: 200-300 EUR
website: http://www.istqb.org

Certified Software Quality Analyst (CSQA)

work experience required
exam prices: $350-400
website: http://www.softwarecertifications.org/

Certified Software Test Engineer (CSTE)

work experience required
exam prices: $350-400
website: http://www.softwarecertifications.org/

Certified Manager of Software Testing (CMST)

work experience required
an active CSTE certification must currently be held
exam price: $600
website: http://www.softwarecertifications.org/

Certified Test Manager (CTM)

work experience required
formal education required: minimum 10 days (http://www.testinginstitute.com/Course_List.php)
website: http://www.testinginstitute.com/ctm.php

Certified Software Test Professional (CSTP)

formal education required: minimum 10 days (http://www.testinginstitute.com/Course_List.php)
exam price: included in the course price ($500-1000)
website: http://www.testinginstitute.com/cstp.php

Six Sigma Black Belt Certification (SSBB)

website: http://www.asq.org/certification/six-sigma/

The exams and courses above, I think, relate strictly to test manager skills rather than those of a test engineer.

Finally, check whether your potential employee is visible on Google. Maybe this sounds strange and funny, but it's true! A test manager should show some internet activity: a blog, Digg, Twitter, or at least commenting. Check social media like LinkedIn and Facebook; the results may be hard to interpret, but they are often telling.

Summary
So, if you want to hire a test engineer, look at the CV in this order:

work experience
internet activity
vendor certificates
vendor-neutral certificates

For a test manager:

work experience
vendor-neutral certificates with a formal job experience requirement
internet activity
other vendor-neutral certificates

So, do not rely on certificates only! Think, think …
And do not use "Tester" as the job position in advertisements; use "Test Engineer" instead.
What's your opinion about this process of typecasting IT specialists by their distinctions?

Ways to Quickly Rate Website Quality

Our expert judgement alone may be insufficient; we must support it with a report of several pages. How do we write such a document, and where do we take the data from? Basic information can be gathered from free online tools. Below is a list composed with a view to quickly obtaining data for a basic website quality rating.

1. Look at Standards
A webpage that fails to comply with the standards can exhibit all kinds of errors that are very difficult to predict, from problems in different browsers to poor Google indexing. The W3C comes to the rescue: it sets the standards and provides ready-to-use tools (a small automation sketch follows the list):

HTML, XHTML - http://validator.w3.org/
CSS - http://jigsaw.w3.org/css-validator/
FEED (RSS or ATOM) - http://validator.w3.org/feed/
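
If you need to run these checks repeatedly, the markup check can be scripted. Below is a minimal sketch that assumes the JSON interface of the W3C Nu HTML Checker (the out=json parameter at validator.w3.org/nu/); verify the current API before relying on it:

    import json
    import urllib.parse
    import urllib.request

    def validate_html(page_url):
        """Ask the W3C checker to validate page_url; return its messages."""
        api = ("https://validator.w3.org/nu/?out=json&doc="
               + urllib.parse.quote(page_url, safe=""))
        request = urllib.request.Request(
            api, headers={"User-Agent": "quality-report-sketch"})
        with urllib.request.urlopen(request) as response:
            result = json.loads(response.read().decode("utf-8"))
        return result.get("messages", [])

    for message in validate_html("http://example.com/"):
        print(message.get("type", "?"), "-", message.get("message", ""))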

2. Look at Design
Design has many meanings; here it is mainly about how the site looks in different environments: that it is not too wide or too tall, and that it looks the same in Firefox as in Opera.

Plenty of browsers, systems, etc - http://browsershots.org/
Look at webpage in different resolution - http://www.markhorrell.com/tools/browser.html

3. Look at Performance
An underestimated but very important issue: the speed of page loading. We can have beautiful graphics, animations and great scripts, but the user may not wait for the full page to load. Remember that we need to count the full load: not just the size of the HTML source, but also images, Flash, ads, scripts and dynamic content fetched via AJAX (a rough do-it-yourself measurement is sketched after the list).

Popular, cute tool - http://tools.pingdom.com/fpt/
Overall page performance - http://site-perf.com/
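
For a rough measurement of your own, here is a sketch that downloads a page plus the images, scripts and stylesheets it references and totals the bytes and seconds. It deliberately ignores fonts, iframes and AJAX-loaded content, so real tools will report more:

    import time
    import urllib.parse
    import urllib.request
    from html.parser import HTMLParser

    class ResourceCollector(HTMLParser):
        """Collects the src/href URLs of images, scripts and stylesheets."""
        def __init__(self):
            super().__init__()
            self.resources = []

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag in ("img", "script") and attrs.get("src"):
                self.resources.append(attrs["src"])
            elif tag == "link" and attrs.get("rel") == "stylesheet" and attrs.get("href"):
                self.resources.append(attrs["href"])

    def measure_page(url):
        """Return (total_bytes, seconds) for the page and its resources."""
        start = time.time()
        html = urllib.request.urlopen(url).read()
        total_bytes = len(html)
        collector = ResourceCollector()
        collector.feed(html.decode("utf-8", errors="replace"))
        for resource in collector.resources:
            full_url = urllib.parse.urljoin(url, resource)
            try:
                total_bytes += len(urllib.request.urlopen(full_url).read())
            except OSError:
                pass  # unreachable resource; a real report should flag it
        return total_bytes, time.time() - start

    size, seconds = measure_page("http://example.com/")
    print("Downloaded %d bytes in %.2f seconds" % (size, seconds))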

4. Look at SEO Quality
Search engine optimization is a difficult and lengthy process, but most mistakes are the same: simple yet crucial errors that substantially reduce the Google ranking. The following tools very quickly point out a site's SEO problems (the most basic checks are also sketched after the list):

Complete SEO Analyzer - http://www.seoworkers.com/tools/analyzer.html
Google Webmaster Guidelines - http://www.google.com/support/webmasters/bin/answer.py?answer=35769
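
The most basic of these checks are easy to script yourself. A small sketch, far simpler than what real SEO tools examine: does the page have a title, a meta description, and exactly one h1?

    import urllib.request
    from html.parser import HTMLParser

    class SeoChecker(HTMLParser):
        """Records the presence of a few basic on-page SEO elements."""
        def __init__(self):
            super().__init__()
            self.has_title = False
            self.has_description = False
            self.h1_count = 0

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "title":
                self.has_title = True
            elif tag == "meta" and attrs.get("name", "").lower() == "description":
                self.has_description = True
            elif tag == "h1":
                self.h1_count += 1

    html = urllib.request.urlopen("http://example.com/").read()
    checker = SeoChecker()
    checker.feed(html.decode("utf-8", errors="replace"))
    print("title present:", checker.has_title)
    print("meta description present:", checker.has_description)
    print("h1 tags:", checker.h1_count)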

5. Look at Linking
Dead links directly hurt the SEO score. It is good to have many internal and external links, but not too many, and all of them must be live (a simple single-page checker is sketched after the list). Google Webmaster Tools is particularly useful for analyzing the number of links from other sites.

W3C Tool - http://validator.w3.org/checklink
DeadLinks - http://www.dead-links.com/
Google Webmaster Tools - https://www.google.com/webmasters/tools/
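
A dead-link sweep of a single page can be sketched in a few lines: collect the anchor targets and report anything that does not answer with HTTP 200. Production checkers also throttle requests, follow redirects carefully and honour robots.txt:

    import urllib.parse
    import urllib.request
    from html.parser import HTMLParser

    class LinkCollector(HTMLParser):
        """Collects href targets of <a> tags."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                href = dict(attrs).get("href")
                if href and href.startswith(("http://", "https://", "/")):
                    self.links.append(href)

    def check_links(page_url):
        """Print every link on page_url that does not return HTTP 200."""
        html = urllib.request.urlopen(page_url).read()
        collector = LinkCollector()
        collector.feed(html.decode("utf-8", errors="replace"))
        for link in collector.links:
            target = urllib.parse.urljoin(page_url, link)
            try:
                status = urllib.request.urlopen(target).getcode()
            except OSError:
                status = None  # unreachable or HTTP error
            if status != 200:
                print("DEAD?", target, status)

    check_links("http://example.com/")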

6. Look at Accessibility
Accessibility lies close to usability, which is hard to check by machine. Still, we have standards for website accessibility and tools to rate compliance (one machine-checkable rule is sketched after the list).

Great, visual tool - http://wave.webaim.org/
Compliance with standards - http://www.contentquality.com/
Standard - http://www.w3.org/TR/WCAG10/
Standard - http://www.access-board.gov/508.htm
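
One accessibility rule that a machine can check reliably is the WCAG requirement for text alternatives: every img element should carry an alt attribute. A sketch of that single check, nothing like a full audit:

    import urllib.request
    from html.parser import HTMLParser

    class AltChecker(HTMLParser):
        """Collects images that are missing an alt attribute."""
        def __init__(self):
            super().__init__()
            self.missing_alt = []

        def handle_starttag(self, tag, attrs):
            if tag == "img":
                attrs = dict(attrs)
                if "alt" not in attrs:
                    self.missing_alt.append(attrs.get("src", "(no src)"))

    html = urllib.request.urlopen("http://example.com/").read()
    checker = AltChecker()
    checker.feed(html.decode("utf-8", errors="replace"))
    for src in checker.missing_alt:
        print("img missing alt:", src)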

7. Ready for Mobile
More and more often we browse websites on our phones. Displays are already large, but most sites still have serious problems with correct navigation. If an improvement can be made at low cost that makes mobile users' lives easier, it should be done.

Simple tool - http://ready.mobi/start.jsp
W3C Mobile Validator - http://validator.w3.org/mobile/

Conclusion
The above list is neither exhaustive nor detailed, but it makes it possible to quickly verify whether a given website has basic and crucial defects.

7 Ways to Be a Good Developer, from a Tester's Point of View

Here I try to identify the points that matter to a developer from the tester's, and more broadly the quality assurance, point of view. There are hundreds of articles about becoming a good developer; I hope these few thoughts will help programmers collaborate better with testers.

1. Do Not Test the Tester!
Even if you have very bad relations with the tester or the entire quality department, never plant artificial bugs to demonstrate the tester's poor skills! Fraud sooner or later comes to light. Testers also have great ways of exposing low skills, and a fight between people or departments always ends with critical exceptions at the customer's site.

2. Do Your Own Acceptance Tests
In the era of unit testing and code coverage, usability and GUI testing seem all the more important when we look at the developer's work. Unfortunately, GUI unit testing, which is closely linked with usability, is not very effective. So carry out a short acceptance test every time, even though as the developer you will often subconsciously steer around the reefs (a minimal sketch of such a check follows).
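
Here is that sketch, using Python's unittest. The feature under test (submit_login_form) is hypothetical and stands in for whatever the real application exposes; the point is exercising the feature the way a user would before handing it to testers:

    import unittest

    def submit_login_form(username, password):
        """Stand-in for the real feature; returns what the user would see."""
        if username and password == "secret":
            return "Welcome, %s" % username
        return "Invalid username or password"

    class LoginAcceptanceTest(unittest.TestCase):
        def test_valid_login_greets_user(self):
            self.assertEqual(submit_login_form("anna", "secret"),
                             "Welcome, anna")

        def test_wrong_password_shows_error(self):
            self.assertEqual(submit_login_form("anna", "oops"),
                             "Invalid username or password")

    if __name__ == "__main__":
        unittest.main()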

3. Do Not Repeat Bugs
One bad thing you may experience as a tester is a developer who repeats the same errors, to the point where the tester can predict how an error will surface in a functional change. This illustrates the programmer's carelessness and lack of progress in learning this difficult craft. We learn from mistakes; so should the developer.

4. They Do Not Want to Hurt You
Developers usually think that the tester's main task is to demonstrate that the code's author is feeble by detecting as many bugs as possible. Developers are often afraid to hand over code for testing, but they should rather seek the tester's assistance to make sure they are doing a good job. If the tester comes and says, "You are poor because I detected 29 errors in your code," ask, "And how many are left?" Someone once said, "The more errors we find, the more are still there." Do not forget it.

5. Do Not Shift the Whole Responsibility to the Tester
Another negative thing that often happens is a developer who does not feel responsible for an error found at the client's site and shifts it onto quality assurance. QA is of course responsible too, but remember that the product is created through the collaboration and accountability of the entire company. Try to write the best code possible, and avoid thinking "I will write this piece of code, the testers will find all the mistakes, and if they don't, it will be their fault." That is a very bad approach.

6. Write Comments and Human-Readable Code
In times of auto-generated comments in Visual Studio and other development tools, developers forget to include anything from the inside: the kind of comment that proves very helpful in a crisis and is necessary for code review. In parallel, write code that explains a lot even without the comments; function and variable names no longer have restrictions on their length! (A small illustration follows.)
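
As a small, invented illustration (the VAT calculation is hypothetical), the second version below reads like the requirement it implements and barely needs a comment:

    # Hard to review: what are c, p and q?
    def c(p, q):
        return p * q * 0.22

    # Reads like the requirement it implements:
    VAT_RATE = 0.22

    def calculate_vat(net_price, quantity):
        """Return the VAT due for a line item at the standard 22% rate."""
        return net_price * quantity * VAT_RATE

    print(calculate_vat(net_price=10.0, quantity=3))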

7. Provide Descriptive Error Alerts
I think one of the most arduous and time-consuming activities a tester performs is finding the access path to a bug. Submitting an error to the bug tracker and getting the response "Cannot reproduce" usually ends with calling the developer over for a demo or sending him a screencast. By providing good error messages, the application hands the tester ready information along with the bug. It is great if you have a logging engine, because a client log file attached to the bug is very helpful (see the sketch below).
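
A sketch of the difference, with invented names (load_order, the "orders" logger): the first alert gives the tester nothing to act on, while the second carries the context needed to reproduce and file the bug:

    import logging

    logging.basicConfig(level=logging.ERROR)
    log = logging.getLogger("orders")

    def load_order(order_id, path):
        """Load an order file, logging enough context to reproduce a failure."""
        try:
            with open(path) as handle:
                return handle.read()
        except OSError as error:
            # Bad:    log.error("Error!")  -- gives the tester nothing to act on.
            # Better: say what failed, for which input, and why:
            log.error("Could not load order %s from %r: %s", order_id, path, error)
            raise

    try:
        load_order("A-1001", "/tmp/missing-order.json")
    except OSError:
        pass  # the log line above is what the tester attaches to the bug report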

Summary
These are just a few loose observations from my experience as both a developer and a tester. If you have other suggestions, please comment.