Tuesday, June 3, 2008

QA Tester Certifications-Types

QA Tester Certifications
Certification Comparisons
American Society for Quality Certified Software Quality Engineer (CSQE)
American Society for Quality Quality Improvement Associate (CQIA)
American Society for Quality Six Sigma Black Belt Certification (SSBB)
British Computer Society Information Systems Examinations Board (ISEB) qualification in Software Testing
International Software Quality Institute ISTQB Certified Tester. The ISTQB is the umbrella organization for the national testing boards, which have already been established in many countries across Europe and around the world.
Mercury Tools certification
Quality Assurance Institute Certified Software Quality Analyst (CSQA)
Rational Function and Performance tester certification
Segue Tools certification
http://www.testinginstitute.com/cstp.php
http://www.software-testing.com/web/CTPP.html
http://www.freetestingcertification.com/
http://www.eplanetlabs.com/CSTP-Testing-QA-Quality-Assurance-Teste

Testing Certification

Doing a test certification might add some value to your profile. Certification could be important for various reasons. You might want to pursue certification because:
  • There are not many certified software test engineers, so certification can increase your value in the organization.
  • It can add value to your career path.
  • You stand out from your peers with your professional certification.
  • You get to learn things on which you are not working.
  • Certification could also mean that you are serious about software testing as a profession.
There are various certification programs available for the software testing professional. On this page we will not compare programs on their merits and demerits. The purpose of this page is to provide information and links about the certification opportunities available to you.

I am in no way claiming that this list is complete; I am sure there are more options available. If an option is not present here, it simply means I am not aware of it, and there are probably more people like me. Why not share that information with all of us?

Certification Opportunities

Certification options available to the test professional can be divided into two categories:
  • Subject knowledge
  • Tool knowledge

Certification Based on the Subject Knowledge

In this category, the candidate's knowledge of software testing is examined and certification is awarded based on that subject knowledge. Certifications are available for both fields, quality control and quality assurance, and they are also divided according to the professional level and knowledge required. I have grouped the available certifications by the controlling institute or organization.
  • Certifications from Quality Assurance Institute, QAI

  • Certified Software Tester (CSTE) and Certified Software Quality Analyst (CSQA) are both very popular in the testing community. CSTE relates to quality control and CSQA to quality assurance.

    According to the QAI, "Acquiring the designation of Certified Software Tester (CSTE) indicates a professional level of competence in the principles and practices of quality control in the IT profession." Whereas for CSQA, "Acquiring the designation of Certified Software Quality Analyst (CSQA) indicates a professional level of competence in the principles and practices of quality assurance in the IT profession."

    QAI also has advanced-level certifications for people who have already completed the CSTE or CSQA. These are called CMST (Certified Manager of Software Testing) and CMSQ (Certified Manager of Software Quality).

    More information about these certifications can be obtained from the official website.

  • Certifications from International Software Testing Qualification Board, ISTQB

  • ISTQB has three levels of certification, all under one brand: certifications from ISTQB are called ISTQB Certified Tester at the Foundation, Advanced, or Expert level. These certifications are based on a syllabus created by the ISTQB.

    More information about these certifications can be obtained from the official website.

  • International Institute for Software Testing, IIST

  • IIST offers two certification programs, CSTP (Certified Software Test Professional) and CTM (Certified Test Manager). More information about these certifications can be obtained from the official website.
Certification Based on the Tool Knowledge
In this category, the candidate's proficiency with a test tool is examined. These certifications are normally conducted by the various tool vendors. Vendor certifications are available from vendors like Rational, Mercury, Segue, etc. Some of the vendor certifications are mentioned below.

IBM Rational
Rational, well, IBM Rational now, is probably one of the few organizations to have a tool for every activity involved in the SDLC (Software Development Life Cycle). On this page, I will give information about the certifications offered by IBM Rational that are related to software testing. Surprisingly, IBM has two categories of certifications:
  • Certifications for all candidates
  • There are two certifications available: IBM Rational Manual Tester (RMT) and IBM Rational Performance Tester (RPT).

    IBM Certified Solution Designer - RMT - According to IBM website "This intermediate-level solution designer is an individual with extensive product knowledge who understands how to setup, configure and create a manual testing framework with Rational Manual Tester."

    IBM Certified Solution Designer - RPT - Again from IBM " An intermediate level solution designer is an individual with extensive product knowledge who understands how to use the IBM Rational Performance Tester tool to validate the performance, scalability and reliability of Web-based, SAP, Siebel, or Citrix hosted systems. This person is proficient in creating dynamic tests, developing workload schedules for various performance and load testing scenarios, execute tests with both small and large loads, and evaluating resulting data to measure, analyze and pinpoint factors that affect system performance and other related issues before deployment."
  • Certifications for IBM Business Partners, Educational Partners and IBM Employees

  • I do not think I need to give this information here. If you are not in IBM, you will not care about these certifications, and if you are in IBM, you already know about them. In case you do not and would like to know, go here.
Borland Segue

Segue, well, Borland Segue now, has two categories of certifications.
  • Test and Performance Management certifications

  • Under this category, there are three certifications available. These certifications address different aspects of the application.
    • Functional Test Management Expert - The focus of this certification is on proficiency with SilkTest, SilkCentral Test Manager and SilkCentral Issue Manager. More details about the certification can be found here.
    • Performance Test Management Expert - The focus of this certification is on SilkPerformer, SilkCentral Test Manager and SilkCentral Issue Manager. More details can be found here.
    • Application Performance Management Expert - The focus of this certification is on SilkPerformer, SilkCentral Performance Manager and SilkCentral Issue Manager. For more details visit Segue.
  • Engineer Certifications

  • Under this category, the emphasis is on a single tool instead of a complete solution. There are two certifications available under this category.
    • Borland Certified SilkTest Engineer - According to the Segue website, "This process will ensure that candidates demonstrate a functional knowledge of the product as well as possess the ability to create and run tests to validate accuracy within an application under test." More details here.
    • Borland Certified SilkPerformer Engineer - According to the Segue website "This process will ensure that candidates will have demonstrated functionality knowledge of the product, the ability to effectively define, implement and execute load tests, generate reports, and interpret the results to provide accurate conclusions and recommendations." For more detail visit here.

Mercury

Mercury, well, HP Mercury before long. This should give you an idea of the software testing market: all the big names among software testing tool organizations are now part of even bigger organizations. Mercury has divided its certification program into two categories.
  • Certified Product Consultant - According to the website "The CPC exam certifies that candidates have demonstrated extensive knowledge and ability with Mercury Interactive products. These exams are designed for mid to senior level professionals with several years of experience in their respective fields. The CPC is a hands-on, lab-based examination using actual Mercury Interactive software. Certifications are organized by product and based on a specific release family of the software." More details about the program can be found here.
  • Certified Instructor - According to the website "While the CPC focuses on product knowledge and ability, the CI certifies that a candidate is a skilled instructor with a seasoned knowledge of Mercury Interactive course materials and instructional philosophies. An individual who has achieved a CI is able to download Mercury Interactive training materials from the web site and conduct a training course for products they hold a CPC in." For more details.

Monday, June 2, 2008

Test Plan Template

Test planning is the selection of techniques and methods to be used to validate the product against its approved requirements and design. In this activity we assess the software application's risks and then develop a plan to determine whether the software minimizes those risks. We document this planning in a test plan document.

Explanation of different sections in the template

Document Sign off: Usually a test plan document is a contract between the testing team and all the other teams involved in developing the product, including the higher management folks. Before sign-off, all interested parties thoroughly review the test plan and give feedback, raising issues or concerns, if any. Once everybody is satisfied with the test plan, they sign off the document, which is the green signal for the testing team to start executing the test plan.

Change History: Under this section you specify who changed what in the document and when, along with the version of the document that contains the changes.

Review and Approval History: This captures who reviewed the document and whether they approved the test plan. The reviewers may suggest changes or comments (if any) to be incorporated into the test plan.

Document References: Any additional documents that help in understanding the test plan better, such as the design documents and/or the requirements document.

Document Scope: In this section specify what the test plan covers and who its intended audience is.

Product Summary: In this section, briefly describe the product that is to be tested.

Product Quality Goals: In this section describe the important quality goals of the product. The following are some typical quality goals:
-Reliability: proper functioning as specified and expected.
-Robustness: acceptable response to unusual inputs, loads and conditions.
-Efficiency of use for frequent users.
-Ease of use even for less frequent users.


Testing Objectives: In this section specify the testing goals that need to be accomplished by the testing team. The goals must be measurable and should be prioritized. The following are some example test objectives.
Verify functional correctness
Test product robustness and stability.
Measure performance ‘hot spots’ (locations or features that are problem areas).

Assumptions: In this section specify the expectations which, if not met, could have a negative impact on the execution of this test plan. Some assumptions may concern the test budget that must be allocated, the resources needed, etc.

Testing Scope: In this section specify ‘what will be covered in testing’ and ‘what will not be covered’.

Testing Strategy: In this section specify different testing types used to test the product. Tools needed to execute the strategy are also specified.

Testing Schedule: In this section specify first the overall project schedule and then the detailed testing schedule.

Resources: In this section specify all the resources needed to execute the plan successfully

Communication Approach: In this section specify how the testing team will report bugs to the development team, how it will report testing progress to management, and how it will report issues and concerns to the higher-ups.

Testing Vocabulary-1.2

Recovery Test
Evaluates the contingency features built into the application for handling
interruptions and for returning to specific points in the application processing cycle, including checkpoints, backups, restores, and restarts. This test also assures that disaster recovery is possible.

Regression Testing
Testing of a previously verified program or application following program
modification for extension or correction to ensure no new defects have been introduced.

Risk Matrix
Shows the controls within application systems used to reduce the identified risk, and in what segment of the application those risks exist. One dimension of the matrix is the risk, the second dimension is the segment of the application system, and within the matrix at the intersections are the controls. For example, if a risk is “incorrect input” and the systems segment is “data entry,” then the intersection within the matrix would show the controls designed to reduce the risk of incorrect input during the data entry segment of the application system.
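
As a rough Python sketch of this idea (the risks, segments and controls below are invented purely for illustration and are not from any standard), a risk matrix can be held as a mapping from a (risk, segment) pair to the controls documented at that intersection:

    # Hypothetical risk matrix: (risk, application segment) -> controls at that intersection
    risk_matrix = {
        ("incorrect input", "data entry"): ["field-level validation", "double-key verification"],
        ("incorrect input", "batch update"): ["record count reconciliation"],
        ("unauthorized access", "data entry"): ["role-based login permissions"],
    }

    def controls_for(risk, segment):
        """Return the controls documented for a given risk/segment intersection."""
        return risk_matrix.get((risk, segment), [])

    print(controls_for("incorrect input", "data entry"))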

Scatter Plot Diagram
A graph designed to show whether there is a relationship between two
changing variables.

Standards
The measure used to evaluate products and identify nonconformance. The basis upon which adherence to policies is measured.

Statement of Requirements
The exhaustive list of requirements that define a product.

Statement Testing
A test method that executes each statement in a program at least once during program testing.

Static Analysis
Analysis of a program that is performed without executing the program. It
may be applied to the requirements, design, or code.

Stress Testing
This test subjects a system, or components of a system, to varying
environmental conditions that defy normal expectations. For example, high transaction volume, large database size or restart/recovery circumstances. The intention of stress testing is to identify constraints and to ensure that there are no performance problems.

Structural Testing
A testing method in which the test data is derived solely from the program structure.

Stub
Special code segments that when invoked by a code segment under testing, simulate the behavior of designed and specified modules not yet constructed.
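
For example, a minimal stub might look like the sketch below (the module names and the flat 10% tax rate are made up for illustration): the code under test depends on a tax module that has not been built yet, so the test supplies a stand-in with canned behaviour.

    # Hypothetical stub: stands in for a tax-calculation module that is not yet built
    class TaxServiceStub:
        def tax_for(self, amount):
            return round(amount * 0.10, 2)   # canned, simplified behaviour

    def total_with_tax(amount, tax_service):
        """Code under test: depends on a tax service supplied by the caller."""
        return amount + tax_service.tax_for(amount)

    # The test exercises total_with_tax() using the stub instead of the real module
    assert total_with_tax(100.0, TaxServiceStub()) == 110.0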

System Test
During this event, the entire system is tested to verify that all functional,
information, structural and quality requirements have been met.

Test Case
Test cases document the input, expected results, and
execution conditions of a given test item.
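
A test case can be captured as a simple record. The sketch below (the field names and values are just one possible, illustrative layout, not a prescribed format) shows the three elements the definition mentions:

    # Illustrative test case record: execution conditions, input, and expected result
    test_case = {
        "id": "TC-042",
        "execution_conditions": "User is logged in; shopping cart is empty",
        "input": {"item": "book", "quantity": 2},
        "expected_result": "Cart shows 2 books and the correct total price",
    }
    print(test_case["id"], "-", test_case["expected_result"])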

Test Plan
A document describing the intended scope, approach, resources, and schedule of testing activities. It identifies test items, the features to be tested, the testing tasks, the personnel performing each task, and any risks requiring contingency planning.

Test Scripts
A tool that specifies an order of actions that should be performed during a test session. The script also contains expected results. Test scripts may be manually prepared using paper forms, or may be automated using
capture/playback tools or other kinds of automated scripting tools.

Test Suite Manager
A tool that allows testers to organize test scripts by function or other grouping.

Unit Test
Testing individual programs, modules, or components to demonstrate that the work package executes per specification, and validate the design and technical quality of the application. The focus is on ensuring that the detailed logic within the component is accurate and reliable according to pre-determined specifications. Testing stubs or drivers may be used to simulate behavior of interfacing modules.

Usability Test
The purpose of this event is to review the application user interface and other human factors of the application with the people who will be using the application. This is to ensure that the design (layout and sequence, etc.) enables the business functions to be executed as easily and intuitively as possible. This review includes assuring that the user interface adheres to documented User Interface standards, and should be conducted early in the design stage of development. Ideally, an application prototype is used to walk the client group through various business scenarios, although paper copies of screens, windows, menus, and reports can be used.

User Acceptance Test
User Acceptance Testing (UAT) is conducted to ensure that the system meets the needs of the organization and the end user/customer. It validates that the system will work as intended by the user in the real world, and is based on real world business scenarios, not system requirements. Essentially, this test validates that the right system was built.

Validation
Determination of the correctness of the final program or software produced from a development project with respect to the user needs and requirements.

Verification
1. The process of determining whether the products of a given phase of the software development cycle fulfill the requirements established during the previous phase.
2. The act of reviewing, inspecting, testing, checking, auditing, or otherwise establishing and documenting whether items, processes, services, or documents conform to specified requirements.

Walkthroughs
During a walkthrough, the producer of a product “walks through” or
paraphrases the product's content, while a team of other individuals follows along. The team’s job is to ask questions and raise issues about the product that may lead to defect identification.

White-box Testing
A testing technique that assumes that the path of the logic in a program unit or component is known. White-box testing usually consists of testing paths, branch by branch, to produce predictable results. This technique is usually used during tests executed by the development team, such as Unit or Component testing.

Testing Vocabulary-1.1

Debugging: The process of analysing and correcting syntactic, logic and other errors identified during testing.

Decision Coverage: A white-box testing technique that measures the number or percentage of decision directions executed by the test cases designed. 100% decision coverage would indicate that every decision direction had been executed at least once during testing. Alternatively, each logical path through the program can be tested.
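
As a small Python illustration (the function and test values are invented for this example), the single if statement below has two decision directions, so 100% decision coverage needs at least one test taking the true direction and one taking the false direction:

    def classify(order_total):
        # One decision with two directions: true (discount) and false (no discount)
        if order_total >= 100:
            return "discount"
        return "no discount"

    # Together these two test cases give 100% decision coverage of classify()
    assert classify(150) == "discount"     # exercises the true direction
    assert classify(40) == "no discount"   # exercises the false direction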

Decision Table
A tool for documenting the unique combinations of conditions and associated results in order to derive unique test cases for validation testing.
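
As an illustration (the conditions and results below are invented), a decision table can be held as data and walked to derive one validation test case per unique combination of conditions:

    # Hypothetical decision table: each row is a unique combination of conditions
    # (valid account?, sufficient funds?) and the associated expected result.
    decision_table = [
        {"valid_account": True,  "sufficient_funds": True,  "expected": "approve"},
        {"valid_account": True,  "sufficient_funds": False, "expected": "decline"},
        {"valid_account": False, "sufficient_funds": True,  "expected": "reject account"},
        {"valid_account": False, "sufficient_funds": False, "expected": "reject account"},
    ]

    for rule in decision_table:
        # Each row becomes one validation test case
        print("inputs:", rule["valid_account"], rule["sufficient_funds"],
              "-> expected:", rule["expected"])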

Defect Tracking Tools
Tools for documenting defects as they are found during testing and for
tracking their status through to resolution.

Desk Check: A verification technique conducted by the author of the artifact to verify the completeness of their own work. This technique does not involve anyone else.

Dynamic Analysis: Analysis performed by executing the program code. Dynamic analysis executes or simulates a development-phase product and detects errors by analyzing the product's response to sets of input data.

Entrance Criteria: Required conditions and standards for work product quality that must be present or met for entry into the next stage of the software development process.

Equivalence Partitioning: A test technique that utilizes a subset of data that is representative of a larger class. This is done in place of exhaustive testing of each value of the larger class of data.
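
For example (the 18-65 age requirement below is assumed purely for illustration): rather than testing every possible age, one representative value is chosen from each equivalence class.

    # Assumed requirement for illustration: valid ages are 18-65 inclusive
    def is_valid_age(age):
        return 18 <= age <= 65

    # One representative value per equivalence class instead of every possible value
    representatives = {
        "below range (invalid)": 10,
        "within range (valid)": 30,
        "above range (invalid)": 80,
    }
    for partition, value in representatives.items():
        print(partition, "->", is_valid_age(value))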

Error or defect: 1. A discrepancy between a computed, observed or measured value or condition and the true, specified or theoretically correct value or condition. 2. Human action that results in software containing a fault (e.g., omission or misinterpretation of user requirements in a software specification, or incorrect translation or omission of a requirement in the design specification).

Error Guessing: Test data selection techniques for picking values that seem likely to cause defects. This technique is based upon the theory that test cases and test data can be developed based on intuition and experience of the tester.

Exhaustive Testing: Executing the program through all possible combination of values for program variables.
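
A tiny sketch of why this rarely scales (the variables and their values are chosen only for illustration): even a few small inputs multiply quickly.

    from itertools import product

    # Exhaustive testing would execute the program for every combination of these values
    payment_methods = ["card", "cash", "voucher"]
    shipping_options = ["standard", "express"]
    countries = ["US", "UK", "IN", "DE"]

    combinations = list(product(payment_methods, shipping_options, countries))
    print(len(combinations), "combinations for just three small variables")  # 24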

Exit criteria: Standards for work product quality which block the promotion of incomplete or defective work products to subsequent stages of the software development process.

Flowchart
Pictorial representations of data flow and computer logic. It is frequently easier to understand and assess the structure and logic of an application system by developing a flowchart than by attempting to understand narrative descriptions or verbal explanations. Flowcharts for systems are normally developed manually, while flowcharts of programs can be produced automatically by flowcharting tools.

Force Field Analysis
A group technique used to identify both driving and restraining forces that
influence a current situation.

Formal Analysis
Technique that uses rigorous mathematical techniques to analyze the
algorithms of a solution for numerical properties, efficiency, and correctness.

Functional Testing
Testing that ensures all functional requirements are met without regard to the final program structure.

Histogram
A graphical description of individually measured values in a data set that is organized according to the frequency or relative frequency of occurrence. A histogram illustrates the shape of the distribution of individual values in a data set along with information regarding the average and variation.

Inspection
A formal assessment of a work product conducted by one or more qualified independent reviewers to detect defects, violations of development standards, and other problems. Inspections involve authors only when specific questions concerning deliverables exist. An inspection identifies defects, but does not attempt to correct them. Authors take corrective actions and arrange follow-up reviews as needed.

Integration Testing
This test begins after two or more programs or application components have been successfully unit tested. It is conducted by the development team to validate the interaction or communication/flow of information between the individual components which will be integrated.

Life Cycle Testing
The process of verifying the consistency, completeness, and correctness of software at each stage of the development life cycle.

Pass/Fail Criteria
Decision rules used to determine whether a software item or feature passes or fails a test.

Path Testing
A test method satisfying the coverage criteria that each logical path through the program be tested. Often, paths through the program are grouped into a finite set of classes and one path from each class is tested.

Performance Test
Validates that both the online response time and batch run times meet the
defined performance requirements.

Policy
Managerial desires and intents concerning either process (intended objectives) or products (desired attributes).

Population Analysis
Analyzes production data to identify, independent from the specifications, the types and frequency of data that the system will have to process/produce. This verifies that the specs can handle types and frequency of actual data and can be used to create validation tests.

Procedure
The step-by-step method followed to ensure that standards are met.

Process
1. The work effort that produces a product. This includes efforts of people and equipment guided by policies, standards, and procedures.
2. A statement of purpose and an essential set of practices (activities) that address that purpose.

Proof of Correctness
The use of mathematical logic techniques to show that a relationship between program variables assumed true at program entry implies that another relationship between program variables holds at program exit.

Quality
A product is a quality product if it is defect free. To the producer, a product is a quality product if it meets or conforms to the statement of requirements that defines the product. This statement is usually shortened to: quality means meets requirements. From a customer’s perspective, quality means “fit for use.”

Quality Assurance (QA)
Deals with the 'prevention' of defects in the product being developed. It is associated with a process. QA is the set of support activities (including facilitation, training, measurement, and analysis) needed to provide adequate confidence that processes are established and continuously improved to produce products that meet specifications and are fit for use.

Quality Control (QC)
Its focus is defect detection and removal. Testing is a quality control activity

Quality Improvement
To change a production process so that the rate at which defective products (defects) are produced is reduced. Some process changes may require the product to be changed.

Testing Vocabulary - 1

Every profession has its own vocabulary. To learn a profession, the first and crucial step is to master its vocabulary. The entire knowledge of a profession is compressed and kept in its vocabulary.
Take our own software testing profession: while communicating with our colleagues, we frequently use terms like 'regression testing' and 'system testing'. Now imagine communicating the same to a person who is not in our profession or who does not understand our testing vocabulary; we would need to explain each and every term in detail. Communication becomes difficult and painful. To speak the language of testing, you need to learn its vocabulary.
Find below a large collection of testing vocabulary.

Affinity Diagram: A group process that takes large amounts of language data, such as that developed by brainstorming, and divides it into categories.

Audit: This is an inspection/assessment activity that verifies compliance with plans, policies and procedures and ensures that resources are conserved.

Baseline: A quantitative measure of the current level of performance.

Benchmarking: Comparing your company's products, services or processes against best practices or competitive practices, to help define superior performance of a product, service or support process.

Black-box Testing: A test technique that focuses on testing the functionality of the program, component or application against its specifications, without knowledge of how the system is constructed.

Boundary value analysis: A data selection technique in which test data is chosen from the "boundaries" of the input or output domain classes, data structures and procedure parameters. Choices often include the actual minimum and maximum boundary values, the maximum value plus or minus one and the minimum value plus or minus one.
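
As a sketch (the 1-100 quantity requirement is assumed purely for illustration), boundary value analysis picks the minimum, the maximum, and the values just inside and just outside those boundaries:

    # Assumed requirement for illustration: valid order quantity is 1-100 inclusive
    def is_valid_quantity(qty):
        return 1 <= qty <= 100

    # Boundary values: each boundary, plus and minus one
    for qty in [0, 1, 2, 99, 100, 101]:
        print(qty, "->", is_valid_quantity(qty))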

Branch Testing: A test method that requires that each possible branch on each decision be executed at least once.
Brainstorming: A group process for generating creative and diverse ideas.

Bug: A catchall term for all software defects or errors.

Certification testing: Acceptance of software by an authorized agent after the software has been validated by the agent or after its validity has been demonstrated to the agent.

Checkpoint (or verification point): The expected behaviour of the application, which must be validated against the actual behaviour after a certain action has been performed on the application.

Client: The customer that pays for the product received and receives the benefit from the use of the product.

Condition Coverage: A white-box testing technique that measures the number or percentage of decision outcomes covered by the test cases designed. 100% condition coverage would indicate that every possible outcome of each decision had been executed at least once during testing.

Configuration Management Tools
Tools that are used to keep track of changes made to systems and all related artifacts. These are also known as version control tools.

Configuration testing: Testing of an application on all supported hardware and software platforms. This may include various combinations of hardware types, configuration settings and software versions.

Completeness: A product is said to be complete if it has met all requirements.

Consistency: Adherence to a given set of rules.

Correctness: The extent to which software is free from design and coding defects. It is also the extent to which software meets the specified requirements and user objectives.

Cost of Quality: Money spent above and beyond expected production costs to ensure that the product the customer receives is a quality product. The cost of quality includes prevention, appraisal, and correction or repair costs.

Conversion Testing: Validates the effectiveness of data conversion processes, including field-to-field mapping and data translation.

Customer: The individual or organization, internal or external to the producing organization that receives the product.

Cyclomatic complexity: The number of decision statements plus one.
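
As a rough Python illustration of that rule of thumb (the function is invented, and only if statements are counted as decisions here):

    def grade(score):
        # Three decision statements below
        if score >= 90:      # decision 1
            return "A"
        if score >= 75:      # decision 2
            return "B"
        if score >= 60:      # decision 3
            return "C"
        return "F"

    # Cyclomatic complexity (simplified rule): decision statements + 1 = 3 + 1 = 4
    decision_statements = 3
    print("cyclomatic complexity:", decision_statements + 1)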

Sunday, June 1, 2008

Linux important Commands

Startx :

The startx command is an easy way to start an X session if you're working on a single computer or boot Linux to runlevel 3 (full multiuser mode with networking, but without a graphical login).

[The startx command redirects X server and X client error messages to the file specified by the user's XERRORS environment variable. This process is useful for debugging and gives the X server a clean startup and shutdown appearance on a workstation.]

Xterm :

In computing, xterm is the standard terminal emulator for the X Window System. A user can have many different invocations of xterm running at once on the same display, each of which provides independent input/output for the process running in it (normally the process is a Unix shell).

xhost - server access control program for X

The xhost program is used to add and delete host names or user names to the list allowed to make connections to the X server. In the case of hosts, this provides a rudimentary form of privacy control and security. It is only sufficient for a workstation (single user) environment, although it does limit the worst abuses. Environments which require more sophisticated measures should implement the user-based mechanism or use the hooks in the protocol for passing other authentication data to the server.

[+]name

The given name (the plus sign is optional) is added to the list allowed to connect to the X server. The name can be a host name or a user name.

-name

The given name is removed from the list of those allowed to connect to the server. The name can be a host name or a user name. Existing connections are not broken, but new connection attempts will be denied. Note that the current machine is allowed to be removed; however, further connections (including attempts to add it back) will not be permitted. Resetting the server (thereby breaking all connections) is the only way to allow local connections again.

+

Access is granted to everyone, even if they aren't on the list (i.e., access control is turned off).

-

Access is restricted to only those on the list (i.e., access control is turned on).

TWM

Twm is a window manager for the X Window System. It provides titlebars, shaped windows, several forms of icon management, user-defined macro functions, click-to-type and pointer-driven keyboard focus, and user-specified key and pointer button bindings.

xinit - X Window System initialize

The xinit program is used to start the X Window System server and a first client program on systems that cannot start X directly from /etc/init or in environments that use multiple window systems. When this first client exits, xinit will kill the X server and then terminate.

Synopsis

xinit [ [ client ] options ] [ -- [ server ] [ display ] options ]

Examples

Below are several examples of how command line arguments in xinit are used.

xinit

This will start up a server named X and run the user's .xinitrc, if it exists, or else start an xterm.

xinit -- /usr/X11R6/bin/Xqdss :1

This is how one could start a specific type of server on an alternate display.

xinit -geometry =80x65+10+10 -fn 8x13 -j -fg white -bg navy

This will start up a server named X, and will append the given arguments to the default xterm command. It will ignore .xinitrc.

xinit -e widgets -- ./Xsun -l -c

This will use the command ./Xsun -l -c to start the server and will append the arguments -e widgets to the default xterm command.

xinit /usr/ucb/rsh fasthost cpupig -display ws:1 -- :1 -a 2 -t 5

This will start a server named X on display 1 with the arguments -a 2 -t 5. It will then start a remote shell on the machine fasthost in which it will run the command cpupig, telling it to display back on the local workstation.

Xserver - X Window System display server

X is the generic name for the X Window System display server. It is frequently a link or a copy of the appropriate server binary for driving the most frequently used server on a given machine.

arping [-fqbDUAV] [-c count] [-w timeout] [-I device] [-s source] destination

-f : quit on first reply

-q : be quiet

-b : keep broadcasting, don't go unicast

-D : duplicate address detection mode

-U : Unsolicited ARP mode, update your neighbours

-A : ARP answer mode, update your neighbours

-V : print version and exit

-c count : how many packets to send

-w timeout : how long to wait for a reply

-I device : which ethernet device to use (eth0)

-s source : source ip address

destination : ask for what ip address

# Marks a comment.

alias Displays alias.

Syntax

alias [name=['command']]

bg Resumes job in the background.

break Resumes execution after the loop.

breaksw Breaks from a switch command; resumes after the endsw command.

case Defines a label in a switch command.

cd Changes directory.

chdir Changes directory, same as cd.

continue Continues a loop.

default Specifies the default case in a switch.

dirs Displays the directory stack.

echo Writes arguments to the standard output of the shell.

eval Evaluates a command.

exec Executes the command in the current shell.

exit Exits the shell.

fg Brings a job in the foreground.

foreach Specifies a looping control statement and executes a sequence of commands until reaching an end command.

glob Writes arguments to the standard output of the shell, like the echo command, but without the new line.

goto Continues execution after the specified label.

hashstat Displays hash table statistics.

history Displays the history list.

if Executes a command if a condition is met.

jobs Lists active jobs.

kill Sends a signal to a process. term (terminate) is the default signal.

limit

Sets or lists system resource limits.

login Logs on.

logout Logs out.

nice Changes the priority of commands run in the shell.

nohup Ignores the hangup signal.

notify

Notifies the user about changes in job status.

onintr Tells the shell what to do on interrupt.

popd Pops the top directory off the directory stack and changes to the new top directory.

pushd Exchanges the top two elements of the directory stack.

rehash Re-computes the hash table of the contents of the directories in the path shell variable.

repeat

Repeats the execution of a command.

set Displays or sets the value of a shell variable.

setenv Sets environment variables.

shift Shifts shell arguments.

source Reads commands from a script.

stop

Stops a background job.

suspend Stops the current shell.

switch Starts a switch.

time Displays the time used to execute commands.

umask Shows or sets the file creation mask (default file permissions).

unalias

Removes command alias.

unhash Disables the internal hash table.

unlimit Removes limitations on system Resource.

unset Deletes shell variables.

unsetenv Deletes environment variables.

wait Waits for background jobs to complete.

while …end

Executes the commands between the while and matching end statements repeatedly.

@ Displays or sets the values of the shell variables.



There are several commands that can be used to control file permissions and ownership. They are:

  • chmod - modify file access rights
  • su - temporarily become the superuser
  • chown - change file ownership
  • chgrp - change a file's group ownership

There are several commands that can be used to control processes. They are:

  • ps - list the processes running on the system
  • kill - send a signal to one or more processes (usually to "kill" a process)
  • jobs - an alternate way of listing your own processes
  • bg - put a process in the background
  • fg - put a process in the foreground


Linux Commands Information

General information:
x(7)

Protocols:
X Window System Protocol, The X Font Service Protocol, X Display Manager Control Protocol

Fonts:
bdftopcf(1), mkfontdir(1), mkfontscale(1), xfs(1), xlsfonts(1), xfontsel(1), xfd(1), X Logical Font Description Conventions

Security:
xsecurity(7), xauth(1), Xau(1), xdm(1), xhost(1), xfwp(1), Security Extension Specification

Starting the server:
xdm(1), xinit(1)

Controlling the server once started:
xset(1), xsetroot(1), xhost(1)

Server-specific man pages:
xorg(1), xdmx(1), xnest(1), xvfb(1), XDarwin(1), XWin(1).

Server internal documentation:
Definition of the Porting Layer for the X v11 Sample Server

Software Testing Process and Requirements

Introduction:

In the Software Development Life Cycle, testing is placed right after the coding/development phase. In the coding/development phase, the individual objects or components of the application are coded from the physical model. Once the system objects have been developed, they are gathered and connected together (integrated) to create a working application. The integrated application is placed on a staging server for testing. Software testing is the process of executing a program or system with the intent of finding errors.

Purpose:

Software bugs will almost always exist in any software module of moderate size: not because programmers are careless or irresponsible, but because the complexity of software is generally intractable -- and humans have only limited ability to manage complexity. It is also true that for any complex system, design defects can never be completely ruled out. As computers and software are used in critical applications, the outcome of a bug can be severe. Testing is performed -

a. To demonstrate that the product performs each function intended

b. To demonstrate that the internal operation of the product performs according to specification and all internal components have been adequately exercised

c. To increase our confidence in the proper functioning of the software.

d. To show the product is free from defects.

General Types of Testing:

Unit Testing

Integration Testing

Acceptance Testing

Testing Process:

The testing process can be divided into three main phases –

Test Planning

Test Execution

Test Closure

Test Planning –

The Test-Planning phase includes -

1) Receipt of documents from the client/developers. The documents include –

a. Business Requirement Document (From the Client)

b. Functional Specification Document (From the Client/Developers)

c. Physical Design Document (From the Client/Developers)

2) The understanding of the application by the testers (includes clarification sessions arranged by the client)

3) Preparation of the following documents by the Testing Team –

a. Test Conditions and Test Cases (this would be done through Excel templates or using TestDirector, as the case may be)

b. Traceability Matrix (when the document is prepared through Excel templates; a simple sketch of what it captures follows after this list)

c. Test Data (with a valid set of data received from the client or developers)

d. Test Environment Set-up (simulation of the user’s environment)

i. Input Required from the Client –

1. Hardware used in Client Environment that is compatible with the application

2. Software used in Client Environment that is compatible with the application
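
As a rough sketch of what a traceability matrix captures (the requirement and test case IDs below are made up for illustration), it simply maps each requirement to the test cases that cover it, so uncovered requirements stand out:

    # Hypothetical traceability matrix: requirement ID -> covering test case IDs
    traceability = {
        "REQ-001": ["TC-001", "TC-002"],
        "REQ-002": ["TC-003"],
        "REQ-003": [],            # not yet covered by any test case
    }

    uncovered = [req for req, cases in traceability.items() if not cases]
    print("Requirements without test coverage:", uncovered)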

The Test-Execution phase includes –

Entry Criteria:

Sign off on the Test Cases prepared by the testing team, by the client

Completion of Test Environment set up

Deployment of code

Execution Phase:

Round 1 - The SQA engineers, with the test script document as the base, would perform one full round of testing. The defects encountered would be logged on a daily basis, each supported by a screenshot of the defect, and communicated to the developers daily (a sketch of the kind of entry the defect log might hold follows below).
Round 2 - The developers are expected to deploy the fixed code to the test environment only after the completion of one full round of testing. The second round of testing would be performed by the SQA engineers with the defect log as the base, retesting only the defects identified as 'Fixed' by the developers.
Round 3 - The final and third round would again be one full round of testing performed by the SQA engineers with the test scripts as the base.
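
A minimal sketch of the kind of entry the defect log might hold (the field names and values are illustrative only, not a prescribed format):

    # Illustrative defect log entry produced during Round 1 and retested in Round 2
    defect = {
        "id": "DEF-017",
        "summary": "Total price not updated after removing an item from the cart",
        "steps_to_reproduce": ["Add two items", "Remove one item", "Check the total"],
        "screenshot": "DEF-017.png",
        "status": "Fixed",   # set by the developers; retested by SQA in Round 2
    }
    print(defect["id"], "-", defect["status"])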

Exit Criteria:

All the defects raised are 'Closed' and one full round of regression testing has been performed.

Test Closure Phase –

The Closure phase would include the transfer of all the final list of deliverables to the client including –

    1. Test Conditions and Test Cases
    2. Traceability Matrix (as the case may be)
    3. Defect Log
    4. Final Summary Report

The above-mentioned process would be followed for each type of testing performed, depending on the scope of testing defined.

Conclusion:

As part of testing, the processes as categorized by SEI CMM/ISO are strictly adhered to so as to ensure the Quality of the product tested.