
Saturday, January 19, 2008

Quality Assurance Job

Quality Assurance Jobs available online.....









Saturday, December 22, 2007

GLOSSARY OF QUALITY ASSURANCE TERMS


Absolute method: a body of procedures and techniques for which measurement is
based entirely on physically defined, fundamental quantities.
Acceptable quality level: a limit above which quality is considered satisfactory and
below which it is not. In sampling inspection, the maximum percentage of defects
or failures that can be considered satisfactory as an average.
Acceptable quality range: the interval, between specified upper and lower limits of a
sequence of values, within which the values are considered to be satisfactory.
Acceptable value: an observed or corrected value that falls within the acceptable
range. See Corrected value and Observed value.
Acceptance criteria: specified limits placed on characteristics of an item, process, or
service which are defined in requirements documents. (ASQC Definitions)
Acceptance sampling: the procedure of drawing samples from a lot or population to
determine whether to accept or reject a sampled lot or population.
Accepted reference value: a numerical quantity that serves as an agreed-upon basis
for comparison, and which is derived as: 1) a theoretical or established quantity
based on scientific principles, 2) an assigned value, based on experimental work
of some recognized organization, or 3) a consensus quantity based on
collaborative experimental work under the auspices of a scientific or engineering
group.
Accreditation: a formal recognition that an organization (e.g., laboratory) is competent
to carry out specific tasks or specific types of tests. See also Certification.
The process by which an agency or organization evaluates and recognizes a
program of study or an institution as meeting certain predetermined qualifications
or standards, thereby accrediting the laboratory. In the context of the National
Environmental Laboratory Accreditation Program (NELAP), this process is a
voluntary one. (NELAC)
Accreditation criterion: a requirement that a laboratory must meet to receive
authorization and approval to perform a specified task.
Accredited laboratory: a laboratory which has been evaluated and given approval to
perform a specified measurement or task, usually for a specific property or
analyte and for a specified period of time.
Accrediting Authority: the agency having responsibility and accountability for
environmental laboratory accreditation and who grants accreditation. For the
purposes of NELAC, this is EPA, other federal agencies, or the state.
Accuracy: the degree of agreement between an observed value and an accepted
reference value. Accuracy includes a combination of random error (precision) and
systematic error (bias) components which are due to sampling and analytical
operations; a data quality indicator. EPA recommends that this term not be used
and that precision and bias be used to convey the information usually associated
with accuracy. See Precision and Bias.
Action limit: see Control limit.
Adjusted value: the observed value after adjustment for values of a blank or bias of the
measurement system.
Aliquant: a subsample derived by a divisor that divides a sample into a number of
equal parts but leaves a remainder; a subsample resulting from such a division.
See Subsample.
Aliquot: a subsample derived by a divisor that divides a sample into a number of equal
parts and leaves no remainder; a subsample resulting from such a division. In
analytical chemistry the term aliquot is generally used to define any
representative portion of the sample.
Alpha error: see “Type I Error.”
Alternate method: any body of procedures and techniques of sample collection and/or
analysis for a characteristic of interest which is not a reference or approved
equivalent method but which has been demonstrated in specific cases to produce
results comparable to those obtained from a reference method.
Analysis (chemical): the determination of the qualitative and/or quantitative
composition of a substance.
Analysis duplicates: the subjection of two portions of the same prepared sample,
extract or digestate to the determinative step of an analytical method or a
measurement system to estimate that step's precision.
Analysis matrix spike: the subjection of a prepared sample, extract or digestate that
has been fortified (spiked) with a known amount of the analyte of interest, to the
determinative step of an analytical method to estimate the bias imparted by the
instrumental or determinative procedure.
Analyte: the substance, a property of which is to be measured by chemical analysis.
Analytical batch: a group of samples, including quality control samples, which are
processed together using the same method, the same lots of reagents, and at the
same time or in continuous, sequential time periods. Samples in each batch
should be of similar composition and share common internal quality control
standards.
Analytical blank: see Reagent blank.
Analytical Detection Limit (LD): the smallest amount of an analyte that can be
distinguished in a sample by a given measurement procedure for a given
confidence interval (e.g., 0.95). See Method Detection Limit.
Analytical limit of discrimination: see Method detection limit.
Analytical Reagent (AR): the American Chemical Society’s designation for the highest
purity of certain chemical reagents and solvents. See Reagent grade.
Arithmetic mean: the sum of all the values of a set of measurements divided by the
number of values in the set, usually denoted by x; a measure of central tendency.
See Measure of central tendency.
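A one-line illustration in Python (hypothetical values):

```python
# Arithmetic mean: sum of the values divided by the number of values (hypothetical set).
values = [4.1, 4.3, 4.0, 4.2]
x_bar = sum(values) / len(values)
print(f"x-bar = {x_bar:.2f}")   # 4.15
```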
Assessment: the evaluation process used to measure the performance or
effectiveness of a system and its elements; the term is used to denote any of the
following: audit, performance evaluation, management systems review, peer
review, inspection, or surveillance. (ANSI/ASQC E4-1994)
Assignable cause: a factor or an experimental variable shown to significantly change
the quality of an effect or a result.
Audit: a systematic evaluation to determine the conformance to quantitative
specifications of some operational function or activity. See Audit of data quality,
Performance evaluation audit, and Technical systems audit; see also Review
and Management systems review.
Audit of data quality (ADQ): a qualitative and quantitative evaluation of the
documentation and procedures associated with environmental measurements to
verify that the resulting data are of acceptable quality.
Audit sample: See Performance evaluation sample.
Average: see Arithmetic mean.
Background level (environmental): the concentration of a substance in a defined
control area during a fixed period of time before, during or after a data gathering
operation.
Batch: a quantity of material (e.g., samples) of the same or similar matrix, expected to
behave similarly with respect to the procedure(s) being employed and produced
or processed in one operation, considered to be a uniform discrete unit.
NELAC defines batch as follows: environmental samples which are prepared
and/or analyzed together with the same process and personnel, using the same
lot(s) of reagents. A preparation batch is composed of one to 20 environmental
samples of the same NELAC-defined matrix, meeting the above mentioned
criteria and with a maximum time between the start of processing of the first and
last sample in the batch to be 24 hours. An analytical batch is composed of
prepared environmental samples (extracts, digestates or concentrates) which are
analyzed together as a group. An analytical batch can include prepared samples
originating from various environmental matrices and can exceed 20 samples.
(Quality Systems)
Batch-lot: the samples collected under sufficiently uniform conditions to be processed
as a group. See Batch, Batch size.
Batch-sample: one of the samples drawn from a batch.
Batch-size: the number of samples in a batch-lot.
Beta error: see Type II Error.
Bias: the systematic or persistent distortion of a measurement process which deprives
the result of representativeness (i.e., the expected sample measurement is
different from the sample's true value); a data quality indicator.
Blank: a sample that has not been exposed to the analyzed sample stream in order to
monitor contamination during sampling, transport, storage or analysis. The blank
is subjected to the usual analytical and measurement process to establish a zero
baseline or background value and is sometimes used to adjust or correct routine
analytical results. (ASQC, Definitions of Environmental Quality Assurance Terms,
1996)
Blank sample: a clean sample or a sample of matrix processed so as to measure
artifacts in the measurement (sampling and analysis) process.
Blind sample: a subsample submitted for analysis with a composition and identity
known to the submitter but unknown to the analyst and used to test the analyst’s
or laboratory’s proficiency in the execution of the measurement process. See
Double-blind sample.
Bulk sample: a sample taken from a larger quantity (lot) for analysis or recording
purposes.
Calibrant: see Calibration standard.
Calibrate: to determine, by measurement or comparison with a standard, the correct
value of each scale reading on a meter or other device, or the correct value for
each setting of a control knob. The levels of the calibration standards should
bracket the range of planned measurements. See Calibration curve.
Calibration-check: calibration material obtained from a source other than the one
supplying the (primary) calibration standard, used to assess (check) the
calibration of a measurement instrument; the act of assessing the calibration of a
measurement instrument utilizing calibration material from a secondary source.
See Span check, Mid-range check, and Zero check.
Calibration-check standard: see Calibration standard.
Calibration curve: the graphical relationship between the known values for a series of
calibration standards and instrument responses.
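For a linear instrument response, the calibration curve can be sketched as a least-squares fit of response against the known standard values, which is then inverted to quantify an unknown. The standard levels, responses, and unknown below are hypothetical (statistics.linear_regression requires Python 3.10+).

```python
# Minimal sketch (hypothetical values): a linear calibration curve fitted by least
# squares, then inverted to estimate the concentration of an unknown sample.
import statistics   # statistics.linear_regression requires Python 3.10+

concentrations = [0.0, 1.0, 2.0, 5.0, 10.0]      # known calibration standards
responses = [0.02, 0.21, 0.39, 1.01, 1.98]       # hypothetical instrument responses

slope, intercept = statistics.linear_regression(concentrations, responses)

unknown_response = 0.75
estimated_conc = (unknown_response - intercept) / slope   # invert the fitted line
print(f"slope = {slope:.4f}, intercept = {intercept:.4f}, "
      f"estimated concentration = {estimated_conc:.2f}")
```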
Calibration drift: the difference between the instrument response and a reference
value after a period of operation without recalibration.
Calibration standard: a substance or reference material used to calibrate an
instrument.
Calibration Standard: a solution prepared from the primary dilution standard solution
or stock standard solutions and the internal standards and surrogate analytes.
The Calibration solutions are used to calibrate the instrument response with
respect to analyte concentration.
Candidate method: a body of procedures and techniques of sample collection and/or
analysis that is submitted for approval as a reference method, an equivalent
method, or an alternative method.
Carrying-agent: any diluent or matrix used to entrain, dilute or to act as a vehicle for a
compound of interest.
CAS#: Chemical Abstracts Service registry number of elements, chemical compounds,
and certain mixtures.
Cause-effect diagram: a graphical representation of an effect and possible causes. A
popular one is the Ishikawa “fish bone diagram.”
Central line: the line on a control chart that represents the expected value of the
control chart statistic; often the mean. See Control chart.
Certification: the process of testing and evaluation against specifications designed to
document, verify, and recognize the competence of a person, organization, or
other entity to perform a function or service usually for a specified time. See also
Accreditation.
Certification of Data Quality: the real-time attestation that the activities of an
environmental data collection operation’s individual elements (e.g., sampling
design, sampling, sample handling, chemical analysis, data reduction, etc.) have
been carried out in accordance with the operation’s requirements and that the
results meet the defined quality criteria.
Certified Reference Material (CRM): a reference material that has one or more of its
property values established by a technically valid procedure and is accompanied
by or traceable to a certificate or other documentation issued by a certifying body.
See Certification and Reference material.
Certified value: the reported numerical quantity that appears on a certificate for a
property of a reference material.
Chain-of-custody: an unbroken trail of accountability that ensures the physical security
of samples, data and records.
Chance cause: an unpredictable, random determinant of variation of a response in a
sampling or measurement operation.
Characteristic: see Property.
Check sample: an uncontaminated sample matrix spiked with known amounts of
analytes usually from the same source as the calibration standards. It is generally
used to establish the stability of the analytical system but may also be used to
assess the performance of all or a portion of the measurement system. See also
Quality control sample.
Check standard: a substance or reference material obtained from a source
independent from the source of the calibration standard; used to prepare check
samples.
Chi-square test: a statistical test of the agreement between the observed frequency of
events and the frequency expected according to some hypothesis.
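As a worked illustration (hypothetical counts), the chi-square statistic is the sum of (observed - expected)^2 / expected over all categories; it is then compared with a tabulated critical value for the appropriate degrees of freedom.

```python
# Minimal sketch (hypothetical counts): the chi-square statistic for comparing observed
# frequencies with the frequencies expected under some hypothesis.
observed = [18, 22, 30, 30]
expected = [25, 25, 25, 25]

chi_square = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
degrees_of_freedom = len(observed) - 1
print(f"chi-square = {chi_square:.2f}, df = {degrees_of_freedom}")
# Compare with the tabulated critical value (about 7.81 for 3 df at the 0.05 level).
```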
Clean sample: a sample of a natural or synthetic matrix containing no detectable
amount of the analyte of interest and no interfering material.
Coefficient of variation (CV): a measure of relative dispersion (precision). It is equal to
the ratio of the standard deviation to the arithmetic mean. See also
Relative standard deviation.
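A brief Python sketch (hypothetical replicate results); multiplying the CV by 100 gives the relative standard deviation defined later in this glossary.

```python
# Minimal sketch (hypothetical values): coefficient of variation and relative
# standard deviation for a set of replicate results.
import statistics

replicates = [10.2, 9.8, 10.1, 9.9, 10.0]

mean = statistics.mean(replicates)
s = statistics.stdev(replicates)     # sample standard deviation
cv = s / mean                        # relative dispersion
rsd_percent = cv * 100               # relative standard deviation, in percent

print(f"mean = {mean:.2f}, s = {s:.3f}, CV = {cv:.4f}, RSD = {rsd_percent:.2f}%")
```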
Collaborative testing: the evaluation of an analytical method by typical or
representative laboratories using subsamples prepared from a homogeneous
standard sample.
Collocated sample: one of two or more independent samples collected so that each is
equally representative for a given variable at a common space and time.
Collocated samplers: two or more identical sample collection devices, located
together in space and operated simultaneously, to supply a series of duplicate or
replicate samples for estimating precision of the total measurement
system/process.
Comparability: the degree to which different methods, data sets and/or decisions
agree or can be represented as similar; a data quality indicator.
Compatibility: ability of entities to be used together under specific conditions to fulfil
relevant requirements. (ISO 8402)
Completeness: the amount of valid data obtained from a data collection project
compared to the planned amount needed to meet the data quality objectives.
Usually expressed as a percentage. A data quality indicator.
Component of variance: a part of the total variance associated with a specified source
of variation.
Composite sample: a sample prepared by physically combining two or more samples
having some specific relationship and processed to ensure homogeneity. See
Flow-proportioned sample and Time-proportioned sample.
Confidence coefficient: the probability statement that accompanies a confidence
interval and is equal to unity minus the associated type I error rate (false positive
rate). A confidence coefficient of 0.90 implies that 90% of the intervals resulting
from repeated sampling of a population will include the unknown (true) population
parameter. See Confidence interval.
Confidence interval: the numerical interval constructed around a point estimate of a
population parameter, combined with a probability statement (the confidence
coefficient) linking it to the population’s true parameter value. If the same
confidence interval construction technique and assumptions are used to calculate
future intervals, they will include the unknown population parameter with the
same specified probability. See Confidence coefficient.
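A minimal sketch of a two-sided interval for a mean, using hypothetical data and a normal-theory (z) multiplier for simplicity; for small samples a Student's t multiplier would normally be used instead.

```python
# Minimal sketch (hypothetical data): a two-sided confidence interval for a mean,
# using the normal approximation (a Student's t multiplier is used for small samples).
import statistics
from statistics import NormalDist

data = [12.1, 11.8, 12.4, 12.0, 11.9, 12.2, 12.3, 11.7]
confidence_coefficient = 0.95            # unity minus the Type I error rate

mean = statistics.mean(data)
std_err = statistics.stdev(data) / len(data) ** 0.5
z = NormalDist().inv_cdf(1 - (1 - confidence_coefficient) / 2)   # about 1.96 for 95%

lower, upper = mean - z * std_err, mean + z * std_err
print(f"{confidence_coefficient:.0%} confidence interval: ({lower:.3f}, {upper:.3f})")
```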
Confirmation: verification of the presence of a component through the use of an
analytical technique that differs from the original method. These may include:
- second column confirmation,
- alternate wavelength,
- derivatization,
- mass spectral interpretation,
- alternative detectors, or
- additional cleanup procedures.
Conformity: fulfilment of specified requirements. (ISO 8402)
Control chart: a graph of some measurement plotted over time or sequence of
sampling, together with control limit(s) and, usually, a central line and warning
limit(s). See Central line, Control limit and Warning limit.
Control limit: a specified boundary on a control chart that, if exceeded, indicates a
process is out of statistical control; the process must be stopped and corrective
action taken before proceeding (e.g., for a Shewhart chart the control limits are
the mean plus and minus three standard deviations, i.e., approximately the 99.7%
confidence level on either side of the central line).
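A minimal sketch (hypothetical in-control QC results) of how Shewhart-style limits are typically derived: the central line is the mean, warning limits sit at plus or minus two standard deviations, and control limits at plus or minus three.

```python
# Minimal sketch (hypothetical values): central line, warning limits, and control
# limits for a Shewhart-style control chart.
import statistics

qc_results = [99.8, 100.4, 100.1, 99.6, 100.2, 99.9, 100.3, 100.0]

central_line = statistics.mean(qc_results)
s = statistics.stdev(qc_results)

warning_limits = (central_line - 2 * s, central_line + 2 * s)   # +/- 2 sigma
control_limits = (central_line - 3 * s, central_line + 3 * s)   # +/- 3 sigma

print(f"central line = {central_line:.2f}")
print(f"warning limits = {warning_limits[0]:.2f} to {warning_limits[1]:.2f}")
print(f"control limits = {control_limits[0]:.2f} to {control_limits[1]:.2f}")
```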
Control sample: see quality control sample and Check sample.
Control standard: see Check standard.
Controlled variable: a variable that is set at a pre-selected level when a controlled
experiment is conducted.
Corrective action: an action taken to eliminate the causes of an existing
nonconformance, deficiency, or other undesirable situation in order to prevent
recurrence. (ISO 8402)
Correlation: a measure of association between two variables. See also Correlation
coefficient.
Correlation coefficient: a number between -1 and 1 that indicates the degree of
linearity between two variables or sets of numbers. The closer to -1 or +1, the
stronger the linear relationship between the two (i.e., the better the correlation).
Values close to zero suggest no correlation between the two variables. The most
common correlation coefficient is the product-moment, a measure of the degree
of linear relationship between two variables.
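A short sketch of the product-moment (Pearson) coefficient for two hypothetical paired data sets:

```python
# Minimal sketch (hypothetical values): product-moment (Pearson) correlation
# coefficient for two paired sets of numbers.
import statistics

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]

mean_x, mean_y = statistics.mean(x), statistics.mean(y)
covariance_sum = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
r = covariance_sum / (sum((a - mean_x) ** 2 for a in x) ** 0.5
                      * sum((b - mean_y) ** 2 for b in y) ** 0.5)

print(f"r = {r:.4f}")   # values near +1 or -1 indicate a strong linear relationship
```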
Critical-toxicity range: the interval between the highest concentration at which all test
organisms survive and the lowest concentration at which all test organisms die
within the test period.
Customer: any individual or organization for whom items or services are furnished or
work performed in response to defined requirements and expectations.
- recipient of a product provided by the supplier. (ISO 8402)
Daily standard: synonym for Calibration standard.
Data: facts or figures from which conclusions can be inferred.
Data analysis: the comparison of suitably reduced data with a conceptual model (e.g.,
a dispersion model); may include computation of summary statistics, standard
errors, confidence intervals, tests of hypotheses, and goodness-of-fit tests.
Data Audit: a qualitative and quantitative evaluation of the documentation and
procedures associated with environmental measurements to verify that the
resulting data are of acceptable quality (i.e., that they meet specified acceptance
criteria).
Data quality: the totality of features and characteristics of data that bears on their
ability to satisfy a given purpose; the sum of the degrees of excellence for factors
related to data.
Data Quality Assessment (DQA): the statistical evaluation of a data set to establish
the extent to which it meets user-defined application requirements (i.e., DQOs).
Data of Known Quality: data are of known quality when the qualitative and quantitative
components associated with their derivation are documented appropriately for
their intended use, and such documentation is verifiable and defensible.
Data quality indicators: quantitative statistics and qualitative descriptors that are used
to interpret the degree of acceptability or utility of data to the user. The principal
data quality indicators are bias, precision, accuracy (precision and bias are
preferred), comparability, completeness, and representativeness.
Data Quality Objective (DQO): qualitative and quantitative statements of the overall
level of uncertainty that a decision-maker is willing to accept in results or in
decisions derived from environmental data. DQOs provide the statistical
framework for planning and managing environmental data operations consistent
with the data user’s needs.
Data Quality Objectives process: a systematic planning tool based on the scientific
method that identifies and defines the type, quality and quantity of data needed to
satisfy a specified use.
Data reduction: the process of transforming the number of data items by arithmetic or
statistical calculations, standard curves, concentration factors, etc., and collation
into a more useful form. Data reduction is irreversible and generally results in a
reduced data set and an associated loss of detail.
Data review: the systematic evaluation of achieved quality control results to establish if
the samples and/or measurements performed on them meet specified
acceptance criteria, for the purpose of determining whether or not the affected
results may or may not be used or should be qualified.
Data set: all the observed values for the samples in a test or study; a group of data
collected under similar conditions and which, therefore, can be analyzed as a
whole.
Data transformation: the conversion of individual data point values into related values
or symbols using formulae (reversible) or symbols (irreversible).
Data validation: See Data review.
Datum: the singular of data. See Data and Value.
Decision error: applying incorrect or erroneous data in choosing between alternatives,
resulting in making the wrong selection.
Defect: nonfulfilment of an intended usage requirement or reasonable expectation.
(ISO 8402)
Defensible: the ability to withstand any reasonable challenge related to the veracity or
integrity of laboratory documents and derived data.
Defensible decision making: the systematic application of objective data or
information in selecting between alternatives.
Degrees of freedom: the total number of items in a sample minus the number of
independent relationships existing among them; the divisor used to calculate a
variance term; in the simplest cases, it is one less than the number of
observations.
Dependability: collective term used to describe the availability performance and its
influencing factors: reliability performance, maintainability performance and
maintenance support performance. (ISO 8402)
Dependent variable: see Response variable.
Detection limit (DL): the lowest concentration or amount of the target analyte that can
be determined to be different from zero by a single measurement at a stated
degree of confidence. See Method detection limit.
Determination: the complete analytical process of measuring the property of interest in
a sample, from selecting or measuring a test portion or subsample to the
reporting of results. See Test determination.
Diluent: a substance added to another to reduce its concentration, resulting in a
homogeneous end product without chemically altering the compound of interest.
Dilution factor: the numerical value obtained from dividing the new volume of a diluted
substance by its original volume.
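A trivial worked example (hypothetical volumes and result), including the usual correction of a measured result back to the undiluted concentration:

```python
# Minimal sketch (hypothetical values): dilution factor and the corresponding
# correction of a measured result back to the original, undiluted concentration.
original_volume_ml = 10.0
diluted_volume_ml = 250.0

dilution_factor = diluted_volume_ml / original_volume_ml   # new volume / original volume
measured_in_dilution = 0.42                                # hypothetical instrument result

corrected_result = measured_in_dilution * dilution_factor
print(f"dilution factor = {dilution_factor:g}, corrected result = {corrected_result:.1f}")
```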
Document control: the policies and procedures used by an organization to ensure that
its documents and their revisions are proposed, reviewed, approved for release,
inventoried, distributed, archived, stored, and retrieved in accordance with the
organization's requirements.
Double-blind sample: a sample submitted to evaluate performance with concentration
and identity unknown to the analyst. See Blind sample.
Duplicate: an adjective describing the taking of a second sample or performance of a
second measurement or determination. Often incorrectly used as a noun and
substituted for “duplicate sample.” Replicate is to be used if there are more than
two items. See Replicate.
Duplicate analyses or measurements: the analyses or measurements of the variable
of interest performed identically on two subsamples of the same sample. The
results from duplicate analyses are used to evaluate analytical or measurement
precision but not the precision of sampling, preservation or storage internal to the
laboratory.
Duplicate samples: two samples taken from and representative of the same population
and carried through all steps of the sampling and analytical procedures in an
identical manner. Duplicate samples are used to assess variance of the total
method including sampling and analysis. See Collocated sample.
Dynamic blank: a sample-collection material or device (e.g., filter or reagent solution)
that is not exposed to the material to be selectively captured, but is transported
and processed in the same manner as the sample. See Field blank, Instrument
blank and Sampling equipment blank.
Dynamic calibration: standardization of both the measurement and collection systems
using a reference material similar to the unknown. For example, a series of
air-mixture standards containing sulfur dioxide of known concentrations could be
used to calibrate a sulfur dioxide bubbler system.
Dynamic range: the extent over which a method can be calibrated for measuring a
variable of interest.
Entity: that which can be individually described and considered. (ISO 8402)
Environmental data: measurements or information that describes environmental
processes or conditions, or the performance of environmental technology.
Environmental data operations: work performed to obtain, use, or report information
pertaining to environmental processes and conditions.
Environmental Detection Limit (EDL): the smallest level at which a radionuclide in an
environmental medium can be unambiguously distinguished for a given
confidence interval using a particular combination of sampling and measurement
procedures, sample size, analytical detection limit, and processing procedure.
The EDL shall be specified for the 0.95 or greater confidence interval. The EDL
shall be established initially and verified annually for each method and sample
matrix. (NELAC)
Environmental sample: a sample of any material that is collected from an
environmental source.
Environmentally related measurement: any assessment of environmental concern
generated through or for field, laboratory, or modeling processes; the value
obtained from such an assessment.
Environmental technology: pollution control devices and systems, waste treatment
processes and storage facilities, and site remediation technologies and their
components that may be added to process discharges (e.g., emissions, effluents)
or utilized in the ambient environment to remove pollutants or contaminants, or
prevent them from entering the environment. (ANSI/ASQC E4-1994)
Equivalent method: any method of sampling and/or analysis demonstrated to result in
data having a consistent and quantitatively known relationship to the results
obtained with a reference method under specified conditions, and formally
recognized by the EPA.
Error (measurement): the difference between an observed or corrected value of a
variable and a specified, theoretically correct, or true value.
Error function: the mathematical relationship of the results obtained from the
measurement of one or more properties and the error of the applied
measurement process. See Normal distribution.
Experimental variable: See Independent variable.
External quality control: the activities which are routinely initiated and performed by
persons outside of normal operations to assess the capability and performance of
a measurement process.
False negative decision: see Type II Error.
False negative result: estimating (incorrectly) that an analyte is not present when it
actually is present.
False positive decision: see Type I Error.
False positive result: estimating (incorrectly) that an analyte is present when it is
actually not present.
Field blank: a clean sample (e.g., distilled water), carried to the sampling site, exposed
to sampling conditions (e.g., bottle caps removed, preservatives added) and
returned to the laboratory and treated as an environmental sample. Field blanks
are used to check for analytical artifacts and/or background introduced by
sampling and analytical procedures. See Dynamic blank and Sampling equipment
blank.
Field duplicates: see Duplicate sample.
Field (matrix) spike: a sample prepared at the sampling point (i.e., in the field) by
adding a known mass of target analyte to a specified amount of sample. Field
matrix spikes are used, for example, to determine the effect of the sample
preservation, shipment, storage and sample preparation on analyte recovery
efficiency (analytical bias).
Field reagent blank: see Field blank.
Field sample: see Sample.
Field split samples: two or more representative portions taken from the same sample
and submitted for analysis to different laboratories to estimate interlaboratory
precision.
Flag: to qualify or signal that an item does not meet specified requirements.
Flow rate: the quantity-per-unit time of a substance passing a point, plane, or space;
for example, the volume or mass of gas or liquid emerging from an orifice, pump,
or turbine or moving through a point in a conduit or channel.
Flow-proportioned sample: a sample or subsample collected from a fluid system at a
rate that produces a constant ratio of sample accumulation to matrix flow rate.
Fortify: synonym for Spike.
Full-scale response: the maximum output of a measurement instrument in a given
range as displayed on a meter or scale.
Functional analysis: a mathematical evaluation of each component of the
measurement system (sampling and analysis) in order to quantitate the error for
each component. A functional analysis is usually performed prior to a ruggedness
test in order to determine those variables which should be studied experimentally.
Geometric mean: the antilogarithm of the mean of the logarithms of all the values in a
set.
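A brief sketch (hypothetical values) showing the antilog-of-mean-of-logs calculation:

```python
# Minimal sketch (hypothetical values): geometric mean as the antilogarithm of the
# mean of the logarithms.
import math
import statistics

values = [2.0, 8.0, 4.0]
geo_mean = math.exp(statistics.mean(math.log(v) for v in values))
print(f"geometric mean = {geo_mean:.3f}")   # 4.000 for this set
# Python 3.8+ also provides statistics.geometric_mean(values) directly.
```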
Good laboratory practices (GLP): either general guidelines or formal regulations for
performing basic laboratory operations or activities that are known or believed to
influence the quality and integrity of the results.
Goodness-of-fit: the measure of agreement of the values in a data set and the
expected or hypothesized ones; the application of the chi-square distribution in
comparing the frequency distribution of a statistic observed in a sample with the
expected frequency distribution based on some theoretical model.
Grab sample: a single sample which is collected at one point in time and place.
Grade: the category or rank given to entities having the same functional use but
different requirements for quality. (ISO 8402)
Graded approach: the process of basing the level of managerial controls applied to an
item or work on the intended use of the results and the degree of confidence
needed in the quality of the results (see Data Quality Objectives). (U.S. DOE
Order 5700.6C, Quality Assurance)
Gross sample: see Bulk sample.
Guidance: suggested practice that is not mandatory, intended as an aid or example in
complying with a standard or requirement. (ASQC Definitions of Environmental
Quality Assurance Terms, 1996).
Holding time: the period a sample may be stored prior to its required analysis. While
exceeding the holding time does not necessarily negate the veracity of analytical
results, it causes the qualifying or flagging of the data for not meeting all of the
specified acceptance criteria. The maximum times that samples may be held
prior to analysis and still be considered valid. (40 CFR Part 136).
Homogeneity: the degree of uniformity of structure or composition.
Hypothesis (statistical): a tentative statement about one or more parameters of a
population or group of populations
- an unproved theory, proposition, supposition, etc. tentatively accepted to
explain certain facts or to provide a basis for further investigation.
Hypothesis testing: the application of statistical tests to enable an informed decision
between the null and the alternative hypothesis.
In-control: a condition indicating that performance of the quality control system is within
the specified control limits, i.e., that a stable system of chance causes is operating and
resulting in statistical control. See Control chart.
Independent variable: see Controlled variable.
Initial Demonstration of Analytical Capability: the procedure for establishing a
laboratory's ability to generate the measurement accuracy and precision required
by many of the EPA’s analytical methods. In general the procedure includes the
addition of a specified concentration of each analyte (using a QC check sample)
in each of four separate aliquots of laboratory pure water. These are carried
through the entire analytical procedure and the percentage recovery and the
standard deviation are determined and compared to specified QC acceptance
limits. (40 CFR Part 136).
Inspection criterion: the specification(s) and rationale for rejecting and accepting
samples in a particular sampling plan.
Instrument blank: a clean sample processed through the instrumental steps of the
measurement process; used to assess instrument contamination. See Dynamic
blank.
Interference: a positive or negative effect on a measurement caused by a variable
other than the one being investigated.
Interference equivalent: the mass or concentration of a foreign substance which gives
the same measurement response as one unit of mass or concentration of the
substance being measured.
Interlaboratory calibration: the process, procedures, and activities for standardizing a
given measurement system to ensure that laboratories participating in the same
program can produce comparable data.
Interlaboratory method validation study (IMVS): the formal study of a sampling
and/or analytical method, conducted with replicate, representative matrix
samples, following a specific study protocol and utilizing a specific written
method, by a minimum of seven laboratories, for the purpose of estimating
interlaboratory precision, bias and analytical interferences.
Interlaboratory precision: a measure of the variation, usually given as the standard
deviation, among the test results from independent laboratories participating in
the same test.
Interlaboratory test: a test performed by two or more laboratories on the same
material for the purpose of assessing the capabilities of an analytical method or
for comparing different methods.
Internal quality control: see Intralaboratory quality control.
Internal standard: a known amount of a standard added to a test portion of a sample
and carried through the entire determination procedure as a reference for
calibrating and controlling the precision and bias of the applied analytical method.
Intralaboratory quality control: the routine activities and checks, such as periodic
calibrations, duplicate analyses and spiked samples, that are included in normal
internal procedures to control the accuracy and precision of measurements.
Intralaboratory precision: a measure of the method/sample specific analytical
variation within a laboratory; usually given as the standard deviation estimated
from the results of duplicate/replicate analyses. See also Standard deviation and
Variance.
Laboratory accreditation: see Accredited laboratory and Accreditation.
Laboratory blank: see Reagent blank.
Laboratory control sample (however named, such as laboratory fortified blank, spiked
blank): an uncontaminated sample matrix spiked with known amounts of analytes
from a source independent of the calibration standards. It is generally used to
establish intra-laboratory or analyst specific precision and bias or to assess the
performance of all or a portion of the measurement system. (Glossary of Quality
Assurance Terms, QAMS, 8/31/92).
Laboratory duplicates: synonym for Duplicate analyses. Aliquots of a sample taken
from the same container under laboratory conditions and processed and analyzed
independently.
Laboratory performance check solution: a solution of method and surrogate analytes
and internal standards; used to evaluate the performance of the instrument
system against defined performance criteria.
Laboratory replicates: see Replicate analysis or measurement.
Laboratory spiked blank: see Spiked laboratory blank.
Laboratory spiked sample: see Spiked sample.
Laboratory splits or split samples: two or more representative portions taken from
the same sample and analyzed by different laboratories to estimate the
interlaboratory precision or variability and data comparability.
Laboratory sample: a subsample of a field, bulk or batch sample selected for
laboratory analysis.
Least squares method: a technique for estimating model coefficients which minimizes
the sum of the squares of the differences between each observed value and its
corresponding predicted value derived from the assumed model.
Limit of detection (LOD): The lowest concentration level that can be determined (by a
single analysis and with a defined level of confidence) to be statistically different
from a blank. [Analytical Chemistry, 55, p. 2217, December, 1983, modified] See
also Method Detection Limit.
Limit of quantification (LOQ): the concentration of analyte in a specific matrix for
which the probability of producing analytical values above the method detection
limit is 99 percent.
Linearity: the degree of agreement between the calibration curve of a method and a
straight line assumption.
Lot: a number of units of an article or a parcel of articles offered as one item;
commonly, one of the units, such as a sample of a substance under study. See
Batch.
Lot size: the number of units in a particular lot. See Batch lot and Batch size.
Lower control limit: see Control limit.
Lower warning limit: see Warning limit.
Management review: formal evaluation by top management of the status and
adequacy of the quality system in relation to quality policy and objectives.
(ISO 8402)
Management system: a structured nontechnical system describing the policies,
objectives, principles, organizational authority, responsibilities, accountability, and
implementation plan of an organization for conducting work and producing items
and services. (ANSI/ASQC E4-1994)
Management Systems Review (MSR): the qualitative assessment of a data collection
operation and/or organization(s) to establish whether the prevailing quality
management structure, practices, and procedures are adequate for ensuring that
the type and quality of data needed and expected are obtained. See Review and
Audit.
Matrix: a specific subset of a medium (e.g., surface water, drinking water, kaolinite) in
which the analyte of interest may be contained. Matrices may be
defined/differentiated by their behavior: samples of the same or similar matrix are
expected to behave the same or similarly with respect to the procedure(s)
employed on them. See Medium.
For NELAC: The component or substrate which contains the analyte of interest. For
purposes of batch determination, the following matrix types shall be used:
- Aqueous: Any aqueous sample excluded from the definition of a
drinking water matrix or Saline/Estuarine source. Includes surface water,
groundwater and effluents.
- Drinking water: Any aqueous sample that has been designated a
potable or potentially potable water source.
- Saline/Estuarine: Any aqueous sample from an ocean or estuary, or
other salt water source such as the Great Salt Lake.
- Non-aqueous liquid: Any organic liquid with <15% settleable solids.
- Solids: Any sample matrix with >15% settleable solids (e.g., soils,
sediments, sludges).
- Chemical Waste: A product or by-product of an industrial process
that results in a matrix not previously defined.
- Air Samples: Media used to retain the analyte of interest from an air
sample such as sorbent tubes or summa canisters. Each medium shall be
considered as a distinct matrix. (Quality Systems)
Matrix spike: see Spiked sample.
Matrix spike duplicate sample analysis: see Matrix, Duplicate analysis and Spiked
sample.
Maximum contaminant level: the highest permissible concentration of a pollutant that
may be delivered to any receptor.
Maximum holding time: the length of time a sample can be kept under specified
conditions without undergoing significant degradation of the analyte(s) or property
of interest.
May: permitted but not required. (TRADE)
Mean: see Arithmetic mean.
Measurement range: the range over which the precision and/or recovery of a
measurement method are regarded as acceptable. See Acceptable quality range.
Measurement standard: a standard added to the prepared test portion of a sample
(e.g. to the concentrated extract or the digestate) as a reference for calibrating
and controlling measurement or instrumental precision and bias.
Measurement system: those elements of a data collection project comprised of the
sampling process, the analytical method(s), the quality control and instrument
calibration requirements, and its data acquisition and management requirements.


Measure of central tendency: a statistic that describes the grouping of values in a
data set around some common value (e.g., the median, arithmetic mean, or
geometric mean.)
Measure of dispersion: a statistic that describes the variation of values in a data set
around some common value. See Coefficient of variation, Range, Variance and
Standard deviation.
Medium: a substance (e.g., air, water, soil) which serves as a carrier of the analytes of
interest. See Matrix.
Medium blank: see Field blank and/or Laboratory blank.
Median: the middle value for an ordered set of n values; represented by the central
value when n is odd or by the mean of the two most central values when n is
even.
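A quick illustration of the odd-n and even-n cases (hypothetical values):

```python
# Minimal sketch (hypothetical values): median for odd and even numbers of values.
import statistics

odd_n = [3.1, 2.7, 3.4, 2.9, 3.0]    # n = 5: the middle value after ordering
even_n = [3.1, 2.7, 3.4, 2.9]        # n = 4: the mean of the two central values

print(statistics.median(odd_n))      # 3.0
print(statistics.median(even_n))     # 3.0, i.e., (2.9 + 3.1) / 2
```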
Method: a body of procedures and techniques for performing a task (e.g., sampling,
characterization, quantification) systematically presented in the order in which
they are to be executed.
Method blank: a clean sample processed simultaneously with and under the same
conditions as samples containing an analyte of interest through all steps of the
analytical procedure.
Method check sample: see Spiked laboratory blank.
Method detection limit (MDL): the minimum concentration of an analyte that, in a given
matrix and with a specific method, has a 99% probability of being identified,
qualitatively or quantitatively measured, and reported to be greater than zero. See
Detection limit.
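One common estimation procedure (patterned on 40 CFR Part 136, Appendix B; the replicate values below are hypothetical) analyzes at least seven replicate low-level spikes and multiplies their standard deviation by the one-tailed Student's t value for n-1 degrees of freedom at the 99% confidence level:

```python
# Minimal sketch (hypothetical replicate results): MDL estimated as Student's t
# (n-1 df, 99% confidence, one-tailed) times the standard deviation of at least
# seven replicate low-level spiked aliquots, as in 40 CFR Part 136, Appendix B.
import statistics

replicate_results = [0.021, 0.025, 0.019, 0.023, 0.022, 0.020, 0.024]
t_99_6df = 3.143   # one-tailed t value for 6 degrees of freedom at 99% confidence

s = statistics.stdev(replicate_results)
mdl = t_99_6df * s
print(f"s = {s:.4f}, estimated MDL = {mdl:.4f}")
```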
Method of least squares: see Least squares method.
Method performance study: see Interlaboratory method validation study.
Method quantification limit (MQL): see Limit of quantification and also Method
detection limit.
Mid-range check: a standard used to establish whether the middle of a measurement
method's calibrated range is still within specifications.
Minimum detectable level: see Method detection limit.
Mixed waste: Hazardous waste material as defined by 40 CFR Part 261 (RCRA) and
mixed with radioactive waste, subject to the requirements of the Atomic Energy
Act. (ANSI/ASQC E4-1994)
Mode: the most frequent value or values in a data set.
Multipoint calibration: the determination of correct scale values by measuring or
comparing instrument responses at a series of standardized analyte
concentrations; used to define the range for generating quantitative data of
acceptable quality.
Must: denotes a requirement that must be met. (Random House College Dictionary)
Negative controls: measures taken to ensure that a test, its components, or the
environment do not cause undesired effects, or produce incorrect test results.
NELAC: National Environmental Laboratory Accreditation Conference. A voluntary
organization of state and federal environmental officials and interest groups
whose primary purpose is to establish mutually acceptable standards for accrediting
environmental laboratories. A subset of NELAP. (NELAC)
NELAP: the overall National Environmental Laboratory Accreditation Program of which
NELAC is a part.
Noise: the sum of random errors in the response of a measuring instrument.
Nonconformity: nonfulfillment of a specified requirement. (ISO 8402)
Normal distribution: an idealized probability density function that approximates the
distribution of many random variables associated with measurements of natural
phenomena and takes the form of a symmetric “bell-shaped curve.”
Objective evidence: information which can be proven true, based on facts obtained
through observation, measurement, test or other means. (ISO 8402)
Observation: a fact or occurrence that is recognized and recorded.
Observed value: the magnitude of a specific measurement; a variable; a unit of space,
time or quantity; a datum. The observed value is that reported before correction
for a blank value. See Corrected value.
Organization: company, corporation, firm, enterprise or institution, or part thereof,
whether incorporated or not, public or private, that has its own functions and
administration. (ISO 8402)
Organizational structure: responsibilities, authorities and relationships, arranged in a
pattern, through which an organization performs its functions. (ISO 8402)
Outlier: an observed value that appears to be discordant from the other observations in
a sample or data set. The declaration of an outlier is dependent on the significance level of the
applied identification test. See also Significance level.
Parameter: any quantity such as a mean or a standard deviation characterizing a
population. Commonly misused for “variable”, “characteristic” or “property.”
Peer review: the documented critical evaluation of projects generally beyond the state
of the art or characterized by potential uncertainty, conducted to ensure that
activities are technically adequate, competently performed, properly documented,
and satisfy established technical and quality requirements. The peer review is
conducted by qualified individuals or organizations independent of, but
collectively equivalent to those who performed the original work.
Percentage standard deviation: synonym for Relative standard deviation.
Performance Based Measurement System (PBMS): a set of processes wherein the
data quality needs, mandates or limitations of a program or project are specified
and serve as criteria for selecting appropriate methods to meet those needs in a
cost-effective manner.
Performance evaluation audit: a type of audit in which the quantitative data generated
in a measurement system are obtained independently and compared with
routinely obtained data to evaluate the proficiency of an analyst or laboratory.
Performance evaluation sample (PE sample): a sample, the composition of which is
unknown to the analyst and is provided to test whether the analyst/laboratory can
produce analytical results within specified performance limits. See Blind sample
and Performance evaluation audit.
Population: all possible items or units which possess a variable of interest and from
which samples may be drawn.
- the totality of items or units of material under consideration. (ANSI/ASQC
A1-1978)
Positive controls: measures taken to ensure that a test and/or its components are
working properly and producing correct or expected results from positive test
subjects.
Precision: the degree to which a set of observations or measurements of the same
property, usually obtained under similar conditions, conform to themselves; a data
quality indicator. Precision is usually expressed as standard deviation, variance or
range, in either absolute or relative terms. See also Standard deviation and
Variance.
Preservation: refrigeration and/or reagents added at the time of sample collection to
maintain the chemical and/or biological integrity of the sample.
Preventative maintenance: an orderly program of activities designed to ensure against
equipment failure.
Primary reference standard: see Primary standard.
Primary standard: a substance or device, with a property or value that is
unquestionably accepted (within specified limits) in establishing the value of the
same or related property of another substance or device.
Probability: a number between zero and one inclusive, reflecting the limiting proportion
of the occurrence of an event in an increasingly large number of identical trials,
each of which results in either the occurrence or nonoccurrence of the event.
Probability sampling: sampling in which: (a) every member of the population has a
known probability of being included in the sample; (b) the sample is drawn by
some method of random selection consistent with these probabilities; and (c) the
known probabilities of inclusion are used in forming estimates from the sample.
The probability of selection need not be equal for members of the population.
Procedure: a set of systematic instructions for performing an activity.
- specified way to perform an activity. (ISO 8402)
Process: set of inter-related resources and activities which transform inputs into
outputs. (ISO 8402)
Proficiency Test Sample (PT): a sample, the composition of which is unknown to the
analyst and is provided to test whether the analyst/laboratory can produce
analytical results within specified performance limits. (Glossary of Quality
Assurance Terms, QAMS, 8/31/92).
Proficiency Testing: Determination of the laboratory calibration or testing performance
by means of interlaboratory comparisons. (ISO/IEC Guide 2 - 12.6, amended) A
systematic program in which one or more standardized samples is analyzed by
one or more laboratories to determine the capability of each participant.
Proficiency Testing Program: the aggregate of providing rigorously controlled and
standardized environmental samples to a laboratory for analysis, reporting of
results, statistical evaluation of the results in comparison to peer laboratories and
the collective demographics and results summary of all participating laboratories.
Property: a quality or trait belonging and peculiar to a thing; a response variable is a
measure of a property. Synonym for Characteristic.
Protocol: a detailed written procedure for a field and/or laboratory operation (e.g.,
sampling, analysis) which must be strictly adhered to.
Pure Reagent Water: shall be ASTM Type I or Type II water in which no target
analytes or interferences are detected as required by the analytical method.
Qualified: status given to an entity when the capability of fulfilling specified
requirements has been demonstrated. (ISO 8402)
Qualitative (determination or analysis): the identification of a sample, material,
compound or element without any certainty as to its mass, volume or amount.
Qualitative results are generally expressed as the presence or absence of a
material and are usually not accompanied by confidence statements.
Quality: the sum of features and properties/characteristics of a product or service that
bear on its ability to satisfy stated or implied needs.
- The totality of characteristics of an entity that bear on its ability to satisfy
stated and implied needs. (ISO 8402)
- The consistent conformance of a product or service to a given set of
standards or expectations. (ISO-9000)
Quality (assurance) assessment: the evaluation of environmental data, comprised of
data validation/verification and data quality assessment, to establish whether they
meet the quality criteria needed for a specific application.
Quality assurance (QA): an integrated system of activities involving planning, quality
control, quality assessment, reporting and quality improvement to ensure that a
product or service meets defined standards of quality with a stated level of
confidence.
Quality Assurance Narrative Statement: a description of the quality assurance and
quality control activities to be followed for a research project.
Quality Assurance Objectives: the limits on bias, precision, comparability,
completeness and representativeness defining the minimal acceptable levels of
performance as determined by the data user’s acceptable error bounds.
Quality Assurance Project Plan (QAPP): a formal document describing the detailed
quality control procedures by which the quality requirements defined for the data
and decisions pertaining to a specific project are to be achieved.
Quality audit: systematic and independent examination to determine whether quality
activities and related results comply with planned arrangements and whether
these arrangements are implemented effectively and are suitable to achieve
objectives. (ISO 8402)
Quality Circle: a small group of individuals from an organization or unit who have
related interests and meet regularly to consider problems or other matters related
to the quality of the product or process.
Quality control (QC): the overall system of technical activities whose purpose is to
measure and control the quality of a product or service so that it meets the needs
of users. The aim is to provide quality that is satisfactory, adequate, dependable,
and economical.
- operational techniques and activities that are used to fulfil requirements for
quality. (ISO 8402)
Quality control chart: see Control chart.
Quality control check sample: see Calibration standard.
Quality control sample: an uncontaminated sample matrix spiked with known amounts
of analytes from a source independent from the calibration standards. It is
generally used to establish intralaboratory or analyst-specific precision and bias
or to assess the performance of all or a portion of the measurement system. See
also Check sample.
Quality improvement: actions taken throughout the organization to increase the
effectiveness and efficiency of activities and processes in order to provide added
benefits to both the organization and its customer. (ISO 8402)
Quality loop: conceptual model of interacting activities that influence quality at the
various stages ranging from the identification of needs to the assessment of
whether these needs have been satisfied. (ISO 8402)
Quality management: all activities of the overall management function that determine
the quality policy, objectives and responsibilities, and implement them by means
such as quality planning, quality control, quality assurance, and quality
improvement within the quality system. (ISO 8402)
Quality Management Plan (QMP): a formal document describing the management
policies, objectives, principles, organizational authority, responsibilities,
accountability, and implementation plan of an agency, organization or laboratory
for ensuring quality in its products and utility to its users.
Quality planning: activities that establish the objectives and requirements for quality
and for the application of quality system elements. (ISO 8402)
Quality policy: overall intentions and direction of an organization with regard to quality
as formally expressed by top management. (ISO 8402)
Quality system: a structured and documented management system describing the
policies, objectives, principles, organizational authority, responsibilities,
accountability, and implementation plan of an organization for ensuring quality in
its work processes, products (items), and services. The quality system provides
the framework for planning, implementing, and assessing work performed by the
organization and for carrying out required QA and QC.
- organizational structure, procedures, processes, and resources needed to
implement quality management. (ISO 8402)
Quantitation limits: the maximum or minimum levels or quantities of a target variable
that can be quantified with the confidence level required by the data user.
Quantitative (determination or analysis): the relatively accurate measurement of the
amounts or percentages of one or more components of a sample or material.
Depending on the QC operations performed in support of the analysis, quantitative
results may be reported with or without estimates of variability.
Random: lacking a definite plan, purpose or pattern; due to chance.
Random error: the deviation of an observed value from a true value, which behaves
like a variable in that any particular value occurs as though chosen at random
from a probability distribution of such errors. The distribution of random error is
generally assumed to be normal.
Random sample or subsample: a subset of a population or a subset of a sample,
selected according to the laws of chance with a randomization procedure.
Random variable: a quantity which may take any of the values of a specified set with a
specified relative frequency or probability. It is defined by a set of possible values,
and by an associated probability function giving the relative frequency of
occurrence of each possible value.
Randomization: the arrangement of a set of objects in a random order; a set of
treatments applied to a set of experimental units is said to be randomized when
the treatment applied to any given unit is chosen at random from those available
and not already allocated.
Randomness: a basic statistical concept and property implying an absence of a plan,
purpose or pattern, or of any tendency to favor one outcome rather than another.
Range: the difference between the minimum and the maximum of a set of values.
Raw data: any original factual information from a measurement activity or study
recorded in laboratory worksheets, records, memoranda, notes, or exact copies
thereof and that are necessary for the reconstruction and evaluation of the report
of the activity or study. Raw data may include photographs, microfilm or
microfiche copies, computer printouts, magnetic media, including dictated
observations, and recorded data from automated instruments. If exact copies of
raw data have been prepared (e.g., tapes which have been transcribed verbatim,
dated, and verified accurate by signature), the exact copy or exact transcript may
be substituted.
Readiness review: a systematic, documented review of the readiness for start-up or
continued use of a facility, process, or activity. Readiness reviews are typically
conducted before proceeding beyond project milestones and prior to initiation of a
major phase of work. (ANSI/ASQC E4-94)
Reagent blank: a sample consisting of reagent(s), without the target analyte or sample
matrix, introduced into the analytical procedure at the appropriate point and
carried through all subsequent steps to determine the contribution of the reagents
and of the involved analytical steps to error in the observed value.
Reagent grade: the second highest purity designation for reagents which conform to
the current specifications of the American Chemical Society Committee on
Analytical Reagents.
Records system (or plan): a written, documented group of procedures describing
required records, steps for producing them, storage conditions, retention period
and circumstances for their destruction or other disposition.
Recovery efficiency: in an analytical method, the fraction or percentage of a target
analyte extracted from a sample containing a known amount of the analyte.
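As a worked illustration of the usual calculation (the concentrations below are hypothetical, and the helper name is ours):

    def recovery_percent(found_in_spiked, found_in_unspiked, amount_spiked):
        """Percent recovery = (found in spiked sample - native background) / amount added x 100."""
        return (found_in_spiked - found_in_unspiked) / amount_spiked * 100.0

    # Hypothetical: 50 ug/L spiked, 3 ug/L native background, 48 ug/L found in the spiked sample.
    print(round(recovery_percent(48.0, 3.0, 50.0), 1))  # 90.0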
Reference material: a material or substance, one or more properties of which are
sufficiently well established to be used for the calibration of an apparatus, the
assessment of a measurement method, or assigning values to materials.
Reference method: a sampling and/or measurement method which has been officially
specified by an organization as meeting its data quality requirements.
Reference standard: a standard, generally of the highest metrological quality available
at a given location, from which measurements made at that location are derived.
(VIM - 6.08). See also Calibration standard.
Relative standard deviation: the standard deviation expressed as a percentage of the
mean recovery, i.e., the coefficient of variation multiplied by 100.
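For illustration, a short sketch that computes the relative standard deviation exactly as defined above (sample standard deviation divided by the mean, times 100); the replicate recoveries are hypothetical:

    from statistics import mean, stdev

    def relative_standard_deviation(values):
        """RSD (%) = sample standard deviation / mean x 100, i.e. the coefficient of variation x 100."""
        return stdev(values) / mean(values) * 100.0

    replicates = [98.2, 101.5, 99.7, 100.4]  # hypothetical replicate recoveries
    print(round(relative_standard_deviation(replicates), 2))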
Reliability: the likelihood that an instrument or device will function under defined
conditions for a specified period of time.
Repeatability: the degree of agreement between mutually independent test results
produced by the same analyst using the same test method and equipment on
random aliquots of the same sample within a short period of time.
Replicability: see Repeatability.
Replicate: an adjective or verb referring to the taking of more than one sample or to the
performance of more than one analysis. Incorrectly used as a noun in place of
replicate analysis. Replicate is to be used when referring to more than two items.
See Duplicate.
Replicate analyses or measurements: the analyses or measurements of the variable
of interest performed identically on two or more subsamples of the same sample
within a short time interval. See Duplicate analyses or measurements.
Replicate samples: two or more samples representing the same population
characteristic, time, and place, which are independently carried through all steps
of the sampling and measurement process in an identical manner. Replicate
samples are used to assess total (sampling and analysis) method variance. Often
incorrectly used in place of the term “replicate analysis.” See Duplicate samples
and Replicate analysis.
Representative sample: a sample taken so as to reflect the variable(s) of interest in
the population as accurately and precisely as specified. To ensure
representativeness, the sample may be either completely random or stratified
depending upon the conceptualized population and the sampling objective (i.e.,
upon the decision to be made.)
Representativeness: the degree to which data accurately and precisely represent the
frequency distribution of a specific variable in the population; a data quality
indicator.
Reproducibility: the extent to which a method, test or experiment yields the same or
similar results when performed on subsamples of the same sample by different
analysts or laboratories.
Requirement: a formal statement of a need and the expected manner in which it is to
be met. The translation of a need into a set of individual quantified or descriptive
specifications of the characteristics of an entity in order to enable its realization
and examination.
Requirements for quality: expressions of the needs or their translation into a set of
quantitatively or qualitatively stated requirements for the characteristics of an
entity to enable its realization and examination. (ISO 8402)
Response variable: a variable that is measured when a controlled experiment is
conducted.
Result: the product of a calculation, test method, test or experiment. The result may be
a value, data set, statistic, tested hypothesis or an estimated effect.
Review: the assessment of management/operational functions or activities to establish
their conformance to qualitative specifications or requirements. See Management
systems review and also, Audit.
Rework: action taken on a nonconforming product so that it will fulfil the specified
requirements. (ISO 8402)
Rinsate blank: the solvent used to rinse a container or sampling apparatus. Rinsate
blanks are generally subjected to analysis to determine whether a container or
sampler is free of contamination.
Risk: the probability or likelihood of an adverse effect.
Risk (statistical): the expected loss due to the use of a given decision procedure.
Robustness: (in)sensitivity of a statistical test method to departures from underlying
assumptions. See Ruggedness.
Rounded number: a number reduced to a specified number of significant digits or
decimal places using defined criteria.
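One possible sketch of rounding to a specified number of significant digits using base-10 logarithms (the helper name is illustrative, not part of any standard):

    from math import floor, log10

    def round_to_sig_digits(x, n):
        """Round x to n significant digits; zero is returned unchanged."""
        if x == 0:
            return 0.0
        return round(x, n - 1 - int(floor(log10(abs(x)))))

    print(round_to_sig_digits(0.0123456, 3))  # 0.0123
    print(round_to_sig_digits(98765.4, 2))    # 99000.0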
Round-robin study: a method validation study involving an undefined number of
laboratories or analysts, all analyzing the same sample(s) by the same method. In
a round-robin study all results are compared and used to develop summary
statistics such as interlaboratory precision and method bias or recovery efficiency.
Routine method: a defined plan of procedures and techniques used regularly to
perform a specific task.
Ruggedness: the (in)sensitivity of an analytical test method to departures from
specified analytical or environmental conditions. See Robustness.
Ruggedness testing: the carefully ordered testing of an analytical method while
making slight variations in test conditions (as might be expected in routine use) to
determine how such variations affect test results. If a variation affects the
results significantly, the method restrictions are tightened to minimize this
variability.
Sample: a part of a larger whole or a single item of a group; a finite part or subset of a
statistical population. A sample serves to provide data or information concerning
the properties of the whole group or population.
Sample data custody: see Chain-of-custody.
Sample variance (statistical): a measure of the dispersion of a set of values. The sum
of the squares of the difference between the individual values of a set and the
arithmetic mean of the set, divided by one less than the number of values in the
set. (The square of the sample standard deviation.) See also Measure of
dispersion.
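A small sketch of the sample variance exactly as defined above (sum of squared deviations from the arithmetic mean, divided by n - 1); the measurement values are hypothetical:

    def sample_variance(values):
        """Sum of squared deviations from the mean divided by (n - 1); requires at least two values."""
        n = len(values)
        m = sum(values) / n
        return sum((v - m) ** 2 for v in values) / (n - 1)

    data = [10.1, 9.8, 10.4, 10.0, 9.9]  # hypothetical measurements
    print(round(sample_variance(data), 4))  # 0.053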
Sampling: the process of obtaining a representative portion of the material of concern.
Sampling equipment blank: a clean sample that is collected in a sample container
with the sample-collection device and returned to the laboratory as a sample.
Sampling equipment blanks are used to check the cleanliness of sampling
devices. See Dynamic blank.
Sampling error: the difference between an estimate of a population value and its true
value. Sampling error is due to observing only a limited number of the total
possible values and is distinguished from errors due to imperfect selection, bias in
response, errors of observation, measurement or recording, etc. See also
Probability sampling.
Scheduled maintenance: see Preventative maintenance.
Screening test: a quick test for coarsely assessing a variable of interest.
Secondary standard: a standard whose value is based upon comparison with a
primary standard.
Selectivity (analytical chemistry): the capability of a method or instrument to respond
to a target substance or constituent in the presence of nontarget substances.
Semiqualitative: the determination of the presence or absence of one or more members of a class or group
of substances, compounds, etc., all of which produce the same or similar
response from the detection/measurement system.
Semiquantitative: the relatively inaccurate (e.g., within one order of magnitude)
measurement or approximation of the amounts or percentages of one or more
components of a sample.
Sensitivity: the ability of a method or instrument to discriminate between minimally
different levels of a variable of interest by producing a noticeably different
measurement response.
Shall: denotes a requirement that is mandatory whenever the criterion for conformance
with the specification requires that there be no deviation. This does not prohibit
the use of alternative approaches or methods for implementing the specification
so long as the requirement is fulfilled. (Style Manual for Preparation of Proposed
American National Standards, American National Standards Institute, Eighth
Edition, March 1991.)
Should: denotes a guideline or recommendation whenever noncompliance with the
specification is permissible. (Style Manual for Preparation of Proposed American
National Standards, American National Standards Institute, Eighth Edition, March
1991.)
Significance level: the maximum acceptable probability of rejecting a true null
hypothesis (a Type I error); commonly denoted alpha and compared against the
probability value derived from the sample result.
Significant digit: any of the digits 0 through 9, excepting leading zeros and some
trailing zeros, which is used with its place value to denote a numerical quantity to
a desired rounded number. See Rounded number.
Significant figure: see Significant digit.
Single operator precision: the degree of variation among the individual
measurements of a series of determinations by the same analyst or operator, all
other conditions being equal.
Site: the area within boundaries established for a defined activity.
Span check: a standard used to establish that a measurement method is not deviating
from its calibrated range.
Span-drift: the change in the output of a continuous monitoring instrument over a
stated time period during which the instrument is not recalibrated.
Span-gas: a gas of known concentration which is used routinely to calibrate the output
level of an analyzer. See Calibration check standard.
Specification: document stating requirements. (ISO 8402)
Specimen: see Sample.
Spike: a known mass of target analyte added to a blank sample or subsample; used to
determine recovery efficiency or for other quality control purposes.
Spiked laboratory blank: see Spiked reagent blank.
Spiked reagent blank: a specified amount of reagent blank fortified with a known mass
of the target analyte; usually used to determine the recovery efficiency of the
method.
Spiked sample: a sample prepared by adding a known mass of target analyte to a
specified amount of matrix sample for which an independent estimate of target
analyte concentration is available. Spiked samples are used, for example, to
determine the effect of the matrix on a method’s recovery efficiency.
Spiked sample duplicate analysis: see Duplicate analysis and Spiked sample.
Split samples: two or more representative portions taken from a sample or subsample
and analyzed by different analysts or laboratories. Split samples are used to
replicate the measurement of the variable(s) of interest.
Standard (measurement): a substance or material with a property quantified with
sufficient accuracy to permit its use to evaluate the same property in a similar
substance or material. Standards are generally prepared by placing a reference
material in a matrix. See Reference material.
Standard addition: the procedure of adding known increments of the analyte of
interest to a sample to cause increases in detection response. The level of the
analyte of interest present in the original sample is subsequently established by
extrapolation of the plotted responses.
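A minimal sketch of the extrapolation step: fit the response-versus-added-amount line and take the x-intercept magnitude as the estimate of the original level. The data and the least-squares helper are illustrative only:

    def least_squares(x, y):
        """Ordinary least-squares slope and intercept for y = slope * x + intercept."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
        return slope, my - slope * mx

    added = [0.0, 10.0, 20.0, 30.0]        # hypothetical analyte additions (ug/L)
    response = [5.2, 10.1, 15.3, 20.2]     # hypothetical instrument responses

    slope, intercept = least_squares(added, response)
    # The magnitude of the x-intercept (intercept / slope) estimates the original concentration.
    print(round(intercept / slope, 1))  # roughly 10.3 ug/L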
Standard curve: see Calibration curve.
Standard deviation: the most common measure of the dispersion or imprecision of
observed values expressed as the positive square root of the variance. See
Variance.
Standard material: see Standard (measurement), Reference material.
Standard method: an assemblage of techniques and procedures based on consensus
or other criteria, often evaluated for its reliability by collaborative testing and
receiving organizational approval.
Standard operating procedure (SOP): a written document which details the method of
an operation, analysis or action whose techniques and procedures are thoroughly
prescribed and which is accepted as the method for performing certain routine or
repetitive tasks.
Standard reference material (SRM): a certified reference material produced by the
U.S. National Institute of Standards and Technology and characterized for
absolute content independent of analytical method.
Standard reference sample: see Secondary standard.
Standard solution: a solution containing a known concentration of analytes, prepared
and verified by a prescribed method or procedure and used routinely in an
analytical method.
Standardization: the process of establishing the quantitative relationship between a
known mass of target material (e.g., concentration) and the response variable
(e.g., the measurement system or instrument response.) See Calibration,
Calibration curve and Multipoint calibration.
Statistic: an estimate of a population characteristic calculated from a data set
(observed or corrected values), e.g., the mean or standard deviation.
Stratification: the division of a target population into subsets or strata which are
internally more homogeneous with respect to the characteristic to be studied than
the population as a whole.
Stratified sampling: the sampling of a population that has been stratified, part of the
sample coming from each stratum. See Stratification.
Stock solution: a concentrated solution of analyte(s) or reagent(s) prepared and
verified by prescribed procedure(s), and used for preparing working standards or
standard solutions.
Subsample: a representative portion of a sample. A subsample may be taken from any
laboratory or field sample. See Aliquant, Aliquot, Split sample and Test portion.
Supplier: organization that provides a product to the customer. (ISO 8402)
Surrogate analyte: a pure substance with properties that mimic the analyte of interest.
It is unlikely to be found in environmental samples and is added to them for
quality control purposes.
Surveillance: the act of maintaining supervision of or vigilance over a well-specified
portion of the environment so that detailed information is provided concerning the
state of that portion.
Synthetic sample: a manufactured sample. See Quality control sample.
Systematic error: a consistent deviation in the results of sampling and/or analytical
processes from the expected or known value. Such error is caused by human and
methodological bias.
Systems audit: see Technical systems audit.
Systems error: see Total systems error.
Target: the chosen object of investigation for which qualitative and/or quantitative data
or information is desired, e.g., the analyte of interest.
Technical systems audit: a thorough, systematic on-site, qualitative review of
facilities, equipment, personnel, training, procedures, record keeping, data
validation, data management, and reporting aspects of a total measurement
system.
Technique: a principle and/or the procedure of its application for performing an
operation.
Test: a procedure used to identify or characterize a substance or constituent. See
Method.
Test data: see Data.
Test determination: see Determination.
Test method: see Method.
Test portion: a subsample of the proper amount for analysis and measurement of the
property of interest. A test portion may be taken from the bulk sample directly, but
often preliminary operations, such as mixing or further reduction in particle size,
are necessary. See Subsample.
Test result: a product obtained from performing a test determination. See Test
determination.
Test sample: see Test portion.
Test specimen: see Test portion.
Test unit: see Test portion.
Time-proportioned sample: a composite sample produced by combining samples of a
specific size, collected at preselected, uniform time intervals.
Tolerance chart: a chart in which the plotted quality control data are assessed against a
tolerance level (e.g., +/- 10% of a mean) based on the precision level judged
acceptable to meet overall quality/data use requirements, rather than a statistical
acceptance criterion (e.g., +/- 3 sigma). (ANSI N42.23-1995, Measurement and
Associated Instrument Quality Assurance for Radioassay Laboratories)
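As a sketch of the distinction drawn above, the check below flags QC results against a fixed +/- 10% tolerance about a target mean rather than statistically derived +/- 3-sigma limits; the values are hypothetical:

    def within_tolerance(value, target, tolerance_fraction=0.10):
        """True if value lies within +/- tolerance_fraction of the target (e.g., +/- 10% of a mean)."""
        return abs(value - target) <= tolerance_fraction * target

    target_mean = 100.0
    qc_results = [97.5, 104.2, 111.3, 92.0]  # hypothetical QC measurements
    for result in qc_results:
        print(result, "OK" if within_tolerance(result, target_mean) else "outside tolerance")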
Total Quality Management (TQM): the process whereby an entire organization, led by
senior management, commits to focusing on quality as a first priority in every
activity. TQM implementation creates a culture in which everyone in the
organization shares the responsibility for continuously improving the quality of
products and services (i.e., “doing the right thing, the right way, the first time,
on time”) in order to satisfy the customer.
- management approach of an organization, centered on quality, based on
the participation of all its members and aiming at long-term success through
customer satisfaction, and benefits to all members of the organization and to
society. (ISO 8402)
Total measurement error: the sum of all the errors that occur from the taking of the
sample through the reporting of results; the difference between the reported result
and the true value of the population that was to have been sampled.
Traceability: an unbroken trail of accountability for verifying or validating the chain-of-custody
of samples, data, the documentation of a procedure, or the values of a
standard.
The ability to trace the history, application or location of an entity by means of
recorded identifications. (ISO 8402)
The property of a result of a measurement whereby it can be related to
appropriate standards, generally international or national standards, through an
unbroken chain of comparisons. (VIM - 6.12)
Treatment (experimental): an experimental procedure whose effect is to be measured
and compared with the effect of other treatments.
Trip blank: a clean sample of matrix that is carried to the sampling site and transported
to the laboratory for analysis without having been exposed to sampling
procedures.
Tuning: the process of adjusting a measurement device or instrument, prior to its use,
to ensure that it works properly and meets established performance criteria.
Type I error, (alpha error): an (incorrect) decision resulting from the rejection of a true
hypothesis. (A false positive decision.)
Type II error, (beta error): an (incorrect) decision resulting from acceptance of a false
hypothesis. (A false negative decision.)
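For illustration only, the decision rule behind these two error types can be expressed as a comparison of a test's p-value with the chosen significance level; the data are hypothetical and scipy is assumed to be available:

    from scipy import stats

    alpha = 0.05  # chosen significance level: the accepted risk of a Type I error
    measurements = [10.3, 9.9, 10.6, 10.2, 10.5, 10.1]  # hypothetical results
    hypothesized_mean = 10.0

    # Two-sided one-sample t-test of H0: the population mean equals the hypothesized value.
    result = stats.ttest_1samp(measurements, hypothesized_mean)

    if result.pvalue < alpha:
        print("Reject H0 (a Type I error if H0 is actually true).")
    else:
        print("Fail to reject H0 (a Type II error if H0 is actually false).")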
Uncertainty: a measure of the total variability associated with a process that includes
the two major error components: systematic error (bias) and random error
(imprecision).
Universe: see Population.
Upper control limit: see Control limit.
Upper warning limit: see Warning limit.
User check: an evaluation of a written procedure (e.g., chemical analysis method) for
clarity and accuracy in which an independent laboratory analyzes a small number
of spiked samples, following the procedure exactly.
Valid study: a study conducted in accordance with accepted scientific methodology,
the results of which satisfy predefined criteria.
Validated method: a method which has been determined to meet certain performance
criteria for sampling and/or measurement operations.
Validation: the process of substantiating specified performance criteria.
- confirmation by examination and provision of objective evidence that the
particular requirements for a specific intended use are fulfilled. (ISO 8402)
Value: the magnitude of a quantity. A single piece of factual information obtained by
observation or measurement and used as a basis of calculation.
Variable: an entity subject to variation or change.
Variance: see Sample variance.
Verifiable: the ability to be proven or substantiated.
Verification: confirmation by examination and provision of objective evidence that
specified requirements have been fulfilled. In design and development, verification
concerns the process of examining a result of a given activity to determine
conformance to the stated requirements for that activity. (ANSI/ISO/ASQC
A8402-1994)
Warning limit: a specified boundary on a control chart that indicates a process may be
going out of statistical control and that certain precautions are required. For
example, for a Shewhart chart the warning limits are placed at plus and minus
two standard deviations of the mean (i.e., at the 95% confidence interval.)
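A minimal sketch of Shewhart-style limits as described above, with warning limits at plus and minus two standard deviations and control limits at three; the historical QC data are hypothetical:

    from statistics import mean, stdev

    qc_history = [99.1, 100.4, 98.7, 101.2, 99.8, 100.9, 99.5, 100.2]  # hypothetical QC results
    center = mean(qc_history)
    s = stdev(qc_history)

    print(f"warning limits: {center - 2 * s:.2f} to {center + 2 * s:.2f}")
    print(f"control limits: {center - 3 * s:.2f} to {center + 3 * s:.2f}")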
Working standard: see Secondary standard.
Zero check: a standard, usually devoid of the analyte or variable of interest, used to
establish whether the "zero" point of a measurement method is still properly
calibrated.
Zero drift: the change in instrument output over a stated time period of continuous,
non-recalibrated operation, when the initial input concentration is zero; usually
expressed as a percentage of the full scale response.
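A small illustrative calculation of zero drift expressed as a percentage of the full-scale response (the readings below are hypothetical):

    def zero_drift_percent(final_zero_reading, initial_zero_reading, full_scale):
        """Zero drift as a percentage of full-scale response over the unrecalibrated period."""
        return (final_zero_reading - initial_zero_reading) / full_scale * 100.0

    # Hypothetical: the analyzer reads 0.0 ppm at the start and 0.4 ppm 24 h later; full scale is 50 ppm.
    print(round(zero_drift_percent(0.4, 0.0, 50.0), 2))  # 0.8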
Acronyms
AAPCO Association of American Pesticide Control Officials (FIFRA)
ACS American Chemical Society
ADQ Audit of Data Quality
ANPRM Advanced Notice of Proposed Rule Making
AOAC Association of Official Analytical Chemists
AQCR Air Quality Control Region
ARAR Applicable or Relevant and Appropriate Standards, Limitations, Criteria,
and Requirements
ASTM American Society for Testing and Materials
BACT Best Available Control Technology
BDAT Best Demonstrated Available Technology
CA Cooperative Agreement
CAA Clean Air Act
CAIR Comprehensive Assessment Information Rule
CAR Corrective Action Report
CAS Chemical Abstract Service
CBI Compliance Biomonitoring Inspection
CEI Compliance Evaluation Inspection
CEPP Chemical Emergency Preparedness Program
CERCLA Comprehensive Environmental Response, Compensation, and
Liability Act
CFR Code of Federal Regulations
CGI Comprehensive Ground Water Inspection
CGME Comprehensive Ground-Water Monitoring Evaluation
CIS Compliance Inspection Strategy
CLP Contract Laboratory Program
CME Construction Management Evaluation
COE U. S. Army Corps of Engineers
CRM Certified Reference Material
CSI Compliance Sampling Inspection
CSO Combined Sewer Overflow
CV Coefficient of Variation
CWA Clean Water Act
DL Detection Limit
D&R Demolition and Renovation
DMR-QA Discharge Monitoring Report - QA Program
DPO Deputy Project Officer
DQA Data Quality Assessment
DQO Data Quality Objectives
DU Decision Unit
EDCA Environmental Data Collection Activity
EDL Estimated Detection Level
EHMW Extra High Molecular Weight
EMAP Environmental Monitoring and Assessment Program
EMS Enforcement Management System
EMPC Estimated Maximum Possible Concentration
ERAMS Environmental Radiation Ambient Monitoring System
ERC Emergency Response Contractor
ERCS Emergency Response Cleanup Service
ERT Emergency Response Team
ESAT Environmental Services Assistance Team
ESP Electrostatic Precipitator
FDA Food and Drug Administration
FIFRA Federal Insecticide, Fungicide and Rodenticide Act
FISMP Field Inspection with Sampling
FIT Field Investigation Team
FR Federal Register
FRDS Federal Reporting Data System
FS Feasibility Study
GLP Good Laboratory Practice
HDPE High Density Polyethylene
HRS Hazard Ranking System
HWDMS Hazardous Waste Data Management System
I/A Innovative/Alternative (Technology)
I&M Inspection and Maintenance
ICP Inductively Coupled Plasma Atomic Emission Spectrometry
ICR Information Collection Request
IFB Invitation for Bidders
IMR Immediate Removal
IMVS Interlaboratory Method Validation Study
IRM Initial Remedial Measure
ISS Interim Status Survey
IU Industrial User
LAER Lowest Achievable Emissions Rate
LOEC Lowest Observed Effect Concentration
LOIS Loss of Interim Status
LOQ Limit of Quantitation
MCL Maximum Contaminant Level
MCLG Maximum Contaminant Level Goals
MCP Municipal Compliance Plan
MDL Method Detection Limit
MIT Mechanical Integrity Test
MPRSA Marine Protection, Research and Sanctuaries Act
MSR Management Systems Review
MSIS Model State Information System
MTR Minimum Technology Requirements
NAAQS National Ambient Air Quality Standards
NADB National Aerometric Data Bank
NAMS National Air Monitoring Stations
NBAR Non-binding Preliminary Allocation of Responsibility
NCLAN National Crop Loss Assessment Network
NCP National Contingency Plan
NEDS National Emissions Data System
NEIC National Enforcement Investigations Center (OECM, Denver)
NESHAP National Emission Standards for Hazardous Air Pollutants
NHANES National Health and Nutrition Examination Study
NPDWR National Primary Drinking Water Regulations
NIOSH National Institute for Occupational Safety and Health
NIST National Institute of Standards and Technology
NMP National Municipal Policy
NOD Notice of Deficiency
NOEC No-Observed Effect Concentration
NOPES Non-Occupational Pesticide Exposure Study
NPAP National Performance Audit Program
NPDES National Pollutant Discharge Elimination System
NDHAP National Pesticide Hazard Assessment Program
NPL National Priorities List
NPO National Program Office
NPRM Notice of Proposed Rule Making
NRC National Resource Center
NSPS New Source Performance Standards
NSR New Source Review
NTIS National Technical Information Service
O&M Operation and Maintenance
OSHA Occupational Safety and Health Administration
PA/SI Preliminary Assessment/Site Inspection
PA Preliminary Assessment
PARS Precision and Accuracy Reporting System
PCI Pretreatment Compliance Inspection
PCS Permit Compliance System
PE Performance Evaluation
PE Program Element
PI Principal Investigator
PMC Project Management Conference
PO Project Officer
POTW Publicly-Owned Treatment Works
PQL Practical Quantitation Limits
PRP Potentially Responsible Party
PSD Prevention of Significant Deterioration
PTE Potential to Emit
PTI Permit to Install
PWSSP Public Water System Supervision Program
QA Quality Assurance
QAMS Quality Assurance Management Staff
QAPjP Quality Assurance Project Plan
QAPP Quality Assurance Program Plan
QC Quality Control
QNCR Quarterly Non-Compliance Report
RA Remedial Action
RACM Reasonably Available Control Measures
RACT Reasonably Available Control Technologies
RAS Routine Analytical Service (CLP)
RCRA Resource Conservation and Recovery Act
RD Remedial Design
RE Relative Error
REM RI/FS Contractors
RFA RCRA Facility Assessment (RCRA site version of PA/SI)
RFD Reference Doses
RFP Request for Proposals
RFP Reasonable Further Progress (toward attainment)
RI Reconnaissance Inspection
RI Remedial Investigation
RI/FS Remedial Investigation/Feasibility Study
RMCL Recommended Maximum Contaminant Level
ROD Record of Decision
RPM Remedial Project Manager
RSCC Regional Sample Control Center (CLP)
RSD Risk-Specific Dose
SAP Sample Analysis Plan
SARA Superfund Amendments and Reauthorizations Act of 1986
SAROAD Storage and Retrieval of Aerometric Data
SAS Special Analytical Service (CLP)
SBO Senior Budget Official
SCAP Superfund Comprehensive Accomplishment Plan
SDWA Safe Drinking Water Act
SI Site Inspection
SIF Site Inspection Follow-up
SIP State Implementation Plan
SLAMS State and Local Air Monitoring Stations
SNC Significant Non-Compliance
SNUR Significant New Use Rule (TSCA 5(e))
SOP Standard Operating Procedure
SRM Standard Reference Material
SS Site Survey
SSID Site/Spill Identification Designation
STC Special Terms and Conditions
TAT Technical Assistance Team
TCLP Toxicity Characteristic Leaching Procedure
TCM Transportation Control Measures
TDD Technical Direction Document
TEAM Total Exposure Assessment Methodology
TEGD Technical Enforcement Guidance Document
TMDL Total Maximum Daily Load
TOC Total Organic Carbon
TOX Total Organic Halides
TQM Total Quality Management
TSA Technical System Audit
TSCA Toxic Substances Control Act
TSD Treatment, Storage, and Disposal
TSDF Treatment, Storage, and Disposal Facility
TSP Total Suspended Particulates
TTO Total Toxic Organics (NPDES permits)
UIC Underground Injection Control
UST Underground Storage Tanks
VE Value Engineering
VE Visual Emissions
VOA Volatile Organics Analysis
VOC Volatile Organic Contaminants
VOC Volatile Organic Chemicals
WAM Work Assignment Manager
WAP Waste Analysis Plan
WENDB Water Enforcement National Data Base
WLA Waste Load Allocation
WQM Water Quality Management

Now, '100 percent' vegetarian eggs

Erode (Tamil Nadu): Here's some good news for diehard vegetarians who may yet like to tuck in some eggs.

India's leading egg powder manufacturer and exporter will launch a "100 percent vegetarian egg" in the coming year.

"We will commercially launch the completely 100 percent vegetarian eggs both in the domestic market and also export them across the world in a couple of months from now," S Hariharan, general manager, operations of SKM Egg Products Ltd, told IANS.

The company is already exporting 100 percent vegetarian egg powder, egg yolk powder and egg albumen powder to as many as 27 countries in the world, including Europe and Japan.

So what is a vegetarian egg?

Chicks aged between zero and eight weeks are brought to poultry farms and bred till up to 72 weeks when they become "layers".

Normally, each layer lays about 300 eggs in poultry farms. However, these eggs are not totally vegetarian because the hens are fed fishmeal (dry fish powder) as a protein supplement.

However, SKM Egg Products Ltd, located aptly on the Gandhiji Road in Erode, claims all the "egg-laying birds" in its contract farms are not fed any "animal-based food".

Instead of fishmeal, soya powder is added to the poultry feed as the protein supplement.

"Hence, eggs produced in our contract farms are fully vegetarian," asserts Hariharan.

But this company, which buoyantly ended last fiscal (2006-07) with a Rs.845-million ($21.4 million) turnover, did not hit upon the vegetarian egg concept for the sheer sake of vegetarianism.

It was for commercial reasons to meet the strict stipulations of the export market.

The eggs laid by the hens fed on fishmeal contained antibiotic residues in excess of the limits (0.5 parts per billion) set by European countries. Hence, the company substituted soya for fish powder. Thus, the 100 percent vegetarian egg was born.

Recently, SKM, which exported 4,500 tonnes of egg powder last year, set up its own poultry farm with nearly 1.5 million chicks.

However, as of now, the company largely sources the "vegetarian eggs" from nearby Namakkal, which is southern India's "egg land".

With over 700 poultry farms, Namakkal produces 22.5 million eggs every day, which is 14 percent of the country's egg production.

"If milk is vegetarian, then all commercially produced eggs in our farms are vegetarian. Only, most of us use fish feed for the hens because soya feed is expensive," says Namakkal Poultry Feeds and Egg Producers Association president Nallathambi.

So the next time you gobble up an egg pastry, just don't feel guilty.

Source:IANS

Tuesday, June 5, 2007

Flextronics purchasing Solectron for $3.6 billion


Tuesday, May 15, 2007

Quality Management


A systematic set of activities to ensure that processes create products with maximum *Quality* at minimum *Cost of Quality*. The activities include *Quality Assurance*, *Quality Control*, and *Quality Improvement*.

Quality Assurance


A planned and systematic set of activities to ensure that variances in processes are clearly identified and assessed, and that the defined processes for fulfilling the requirements of customers and of product or service makers are improved accordingly.

A planned and systematic pattern of all actions necessary to provide adequate confidence that the product optimally fulfils the customer's expectations.

A planned and systematic set of activities to ensure that requirements are clearly established and that the defined process complies with these requirements.

"Work done to ensure that Quality is built into work products, rather than Defects." This is by (a) identifying what "quality" means in context; (b) specifying methods by which its presence can be ensured; and (c) specifying ways in which it can be measured to ensure conformance (see *Quality Control*, also *Quality*).

Quality


Quality is difficult to define; it is an abstract term. It requires continuous and dynamic adaptation of products and services to fulfill or exceed the requirements or expectations of all parties in the organization and the community as a whole.

----------------
'Quality means conformance to requirements' (Philip Crosby, 'Quality Is Free'). It does not matter whether or not the requirements are articulated or specified; if a product does not fully satisfy, it lacks quality in some respect. ('Quality is binary -- you've either got it, or you haven't' -- ibid. Note that both these quotes are 'top-of-the-head' and therefore approximate.)

The starting-point for a 'quality product', therefore, is precise determination of the requirements of its users. This may not be possible in practice, but should still be attempted as well as possible (see *Acceptable Quality Level*).

Note that the 'quality' of a product is the sum of multiple separate *Quality Attributes*.