thorough analysis of project spec and goals
a complete, step-by-step methodology including test cases and testing scripts for use by QA, engineers and translators throughout project
setup and configuration instructions for testing environment
detailed timetable for testing with project milestones
Entry Criteria: A series of Build Acceptance Tests will be performed on localised builds received from engineers. These tests require a certain level of quality before testing commences, ensuring that QA is incorporated into all stages of the process.
Exit Criteria: A series of tests performed on the proposed final build of the product, forming a subset of the overall testing plan and technical analysis. The results of these tests determine whether the product is approved for final linguistic testing.
Technical Verification ensures the quality of individual files in localised builds received from engineers. Criteria include code integrity, link integrity, variable integrity, string consistency, etc.
Compatibility testing ensures compatibility of a software product or Web site with different hardware platforms, browsers, network configurations, operating systems, and other third-party software. Our testing labs at Oxford Conversis offer all the hardware and software needed for such testing, including:
PCs, Macs, and UNIX workstations and servers
Windows 95/98/ME/2000/NT/XP, MacOS, and many platform versions of UNIX
All versions of Internet Explorer, Netscape, and AOL
Performance Testing is very much project specific; depending on the product to be localised, a number of testing procedures such as Load, Stress and Capacity Testing can be applied to test the application's scalability.
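The variable-integrity and string-consistency checks above lend themselves to partial automation. A minimal sketch in Python; the resource format (a dict of string IDs to text) and the sample strings are illustrative assumptions, not the actual tooling:

```python
import re

# Matches common printf-style and named placeholders, e.g. %s, %1$d, {username}.
PLACEHOLDER = re.compile(r"%\d*\$?[sdif]|\{\w+\}")

def placeholder_mismatches(source_strings, localised_strings):
    """Compare placeholder sets between source and localised resources.

    Both arguments map string IDs to text. Returns a list of
    (string_id, source_placeholders, localised_placeholders) tuples for
    every entry whose placeholders differ, plus any ID missing from the
    localised file entirely.
    """
    problems = []
    for string_id, source_text in source_strings.items():
        if string_id not in localised_strings:
            problems.append((string_id, "missing in localised build", None))
            continue
        src = sorted(PLACEHOLDER.findall(source_text))
        loc = sorted(PLACEHOLDER.findall(localised_strings[string_id]))
        if src != loc:
            problems.append((string_id, src, loc))
    return problems

# Example: the German translation drops the %s placeholder.
en = {"welcome": "Welcome, %s!", "files": "%d files copied"}
de = {"welcome": "Willkommen!", "files": "%d Dateien kopiert"}
print(placeholder_mismatches(en, de))  # [('welcome', ['%s'], [])]
```

A check like this runs on every build drop, so variable-integrity regressions surface before manual testing begins.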
Verification of complete translation of text, and images if applicable, on all screens
Testing for correct translation, with particular focus on newly added strings in built version
Testing to ensure that translated text always conveys the correct meaning in the target locale.
Verification of correct layout of all data on each screen
Check for correct display of language specific characters, with regard to their specific keyboard settings, such as the umlaut in German.
Verification of other international standard issues such as tags, fonts, date & time functions, currencies
Reporting and fixing UI problems such as truncated text caused by translation. German, for example, expands considerably in translation and frequently leads to text truncation.
Check for duplication and correct functioning of hot-keys on all screens.
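The truncation criterion above can be pre-flagged automatically before a visual pass. A minimal sketch, assuming a simple mapping of string IDs to text and a rule-of-thumb expansion threshold (both hypothetical):

```python
def likely_truncations(source_strings, localised_strings, max_expansion=1.3):
    """Flag localised strings whose length exceeds the source length by more
    than max_expansion, a rough signal of UI truncation risk. German, for
    instance, routinely runs well past its English source in length."""
    flagged = []
    for string_id, source_text in source_strings.items():
        localised = localised_strings.get(string_id, "")
        if source_text and len(localised) > len(source_text) * max_expansion:
            flagged.append((string_id, len(source_text), len(localised)))
    return flagged

en = {"save": "Save", "cancel": "Cancel"}
de = {"save": "Speichern", "cancel": "Abbrechen"}
print(likely_truncations(en, de))  # [('save', 4, 9), ('cancel', 6, 9)]
```

Flagged strings still need a visual check on the running build, since actual truncation depends on font and control width, but the list focuses the tester's attention.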
Quality Assurance Procedures
Scope
This document outlines the general procedures on a typical localisation project. It should provide the lead test engineer with the correct guidelines for developing a specific QA plan.
1. Prepare Testing Environment
Technical analysis
based on client materials such as:
QA Team Priorities
2.2 Functional Testing
Functional testing of all areas of the product ensures it will perform as closely to the English version as possible. This entails a series of tests which perform a feature-by-feature validation of behaviour, using a wide range of standard and erroneous input data. Depending on the project specification, it can range from simple smoke testing of the product to thorough script-based testing of the product's user interface, APIs, database management, security, installation, networking, etc. The QA team at Oxford Conversis can perform functional testing on an automated or manual basis using black box and white box methodologies. Typical testing criteria for a localisation project would include:
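A feature-by-feature validation against standard and erroneous input data is typically table driven. A minimal black-box sketch using Python's unittest; the validate_postcode function and its locale rules are hypothetical stand-ins for a feature under test:

```python
import unittest

def validate_postcode(postcode, locale):
    """Hypothetical localised feature under test: accepts 5-digit codes
    for de_DE and 4-digit codes for da_DK."""
    lengths = {"de_DE": 5, "da_DK": 4}
    return postcode.isdigit() and len(postcode) == lengths.get(locale, 5)

class PostcodeValidationTest(unittest.TestCase):
    # Each case pairs standard or erroneous input with the expected result,
    # so the localised build is validated feature by feature against the
    # behaviour of the English version.
    CASES = [
        ("80331", "de_DE", True),   # standard input
        ("8000", "da_DK", True),    # standard input
        ("ABCDE", "de_DE", False),  # erroneous input: letters
        ("", "da_DK", False),       # erroneous input: empty string
    ]

    def test_cases(self):
        for postcode, locale, expected in self.CASES:
            with self.subTest(postcode=postcode, locale=locale):
                self.assertEqual(validate_postcode(postcode, locale), expected)

result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(PostcodeValidationTest)
)
```

The same case table can be re-run unchanged against each new localised build, which is what makes scripted functional passes repeatable across test cycles.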
2.3 UI Testing
UI Testing will be performed on the running application or web site. Our Visual QA procedures provide comprehensive testing of all possible cosmetic issues in the localised product. All testing involves valuable input from our language experts. Throughout the testing cycle, exploratory testing is also performed by expert QA engineers to find defects that may not be found by formal testing methods. Typical criteria for such testing would include:
2.4 Linguistic Testing
In the linguistic testing phase our testing specialists add another level of quality assurance by thoroughly testing the final build, conclusively ensuring the product is both linguistically and technically sound, i.e. the translations are accurate, grammatically correct, and culturally appropriate.
Bug reporting and Tracking
Using the testing methodology listed above, our QA engineers, translators and language specialists log issues, or bugs, in our Online Bug Tracking System. Localisation issues which can be fixed by our engineers will be fixed accordingly. Internationalisation and source code issues will also be logged and reported to the Client with suggestions on how to fix them. The bug tracking process is as follows:
When a bug is logged, our QA engineers include all information relevant to that bug, such as:
The QA engineer also analyses the error and describes, in a minimum number of steps, how to reproduce the problem for the benefit of the engineer. At this stage the bug is labelled "Open". Each issue must pass through at least four states:
Open: Opened by QA during testing
Pending: Fixed by Engineer but not verified as yet
Fixed: Fix Verified by QA
Closed: Fix re-verified before sign-off
In order to manage the testing process efficiently it is important to include several other bug status levels, such as: Not a Bug, Use as Is, Deferred, Regressed, and Cannot Reproduce.
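The status workflow above can be encoded as a small state machine so the tracking system rejects illegal transitions. A minimal sketch; the transition map is an illustrative assumption, not the actual Online Bug Tracking System:

```python
# Allowed transitions between bug statuses, including the auxiliary levels.
TRANSITIONS = {
    "Open": {"Pending", "Not a Bug", "Use as Is", "Deferred", "Cannot Reproduce"},
    "Pending": {"Fixed", "Reopened"},   # fix verified, or verification failed
    "Fixed": {"Closed", "Regressed"},   # re-verified at sign-off, or regressed
    "Reopened": {"Pending"},
    "Regressed": {"Pending"},
}

def advance(current, new):
    """Move a bug to a new status, raising if the transition is not allowed."""
    if new not in TRANSITIONS.get(current, set()):
        raise ValueError(f"illegal transition: {current} -> {new}")
    return new

# A bug walking the four mandatory states: Open -> Pending -> Fixed -> Closed.
state = "Open"
for step in ("Pending", "Fixed", "Closed"):
    state = advance(state, step)
print(state)  # Closed
```

Making the transition map explicit ensures, for instance, that a bug cannot jump straight from "Open" to "Closed" without an engineer's fix and QA verification in between.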
3. Engineer releases a new internal version, with fixed bugs and new features.
See Engineering Department Workflow document.
4. QA verifies the bugs in the new release and changes their status.
5. Fixed bugs are closed in the BTS by QA; unfixed bugs are ‘Reopened’ in the BTS.
See BugCycle.vsd for diagram on Bug lifecycle.
Depending on the quality of the translations and engineering, and on the number of bugs logged, the QA team decides on the number of test cycles for the project. Usually this is no more than two or three full passes before Final Bug Regression and release to Linguistic Testing.
2. Initiate Testing
2.1 Minimal Acceptance Tests