
Talk:Calibration

From Wikipedia, the free encyclopedia



Untitled

Rewrite completed. All of the old article follows "Additional Information" so that anyone who is interested can see other points of view and subject matter I didn't include. George5530 (talk) 21:01, 23 January 2009 (UTC)[reply]

Just in case someone likes simple definitions, I added this to URBAN DICTIONARY:

1. The process of data acquisition (obtained by comparison against standards) able to anticipate the behavior of something. 2. The data so obtained. 3. The final report or documentation showing such data. 4. The stage in which something is, regarding the level of its calibration knowledge.

Adjustment is not calibration. When we adjust something we change its behavior (usually to move it closer to standards, aiming at better performance); when we calibrate we just understand and certify its behavior so that we can compensate for it or even adjust it.

--Arturo Cortijo 19:01, 21 May 2008 (UTC)Artcort.fis


These comments were at the end of the text:

Please add references to standards organizations in other countries. Other definitions and/or views on traceability also welcome.

Please add mention of instruments that are commonly calibrated. A discussion of statistical error would also be appreciated.

Definition

Hi. I've learned that NIST defines calibration as strictly the determination of the instrument's precision, comparing it to a higher standard, and not its adjustment to enhance its precision. Does someone know about this? I'm going to check this out, and if that's the case I will change the article to this more formal definition...

  • I would agree that calibration does not need to include any adjustment. An obvious example of this would be the calibration of an artifact standard that can't be adjusted because it has a fixed length or mass or whatever. I suspect that you really need to look at the VIM (and maybe the GUM) to get the official metrology definition of calibration (see refs 4 and 5 in here, Metrology in Short). This document is fairly good in itself, though. If I'm feeling inspired over Xmas I may do this myself and some similar edits on traceability, metrology, etc. JMiall 18:23, 22 December 2005 (UTC)[reply]

When I was calibrating instruments over the summer, calibration was the process of taking the readings (e.g. putting in 5.00 volts and SEEING what the instrument said it was) and checking whether the reading was too far out (as defined by the manual and the standard of calibration used, e.g. in-house, UKAS). The process of changing the output of the instrument so that it was no longer out was called adjusting. --Elfwood 19:21, 13 August 2006 (UTC)[reply]

The International Vocabulary of Basic and General Terms in Metrology is an ISO copyrighted document that defines calibration. The current definition is pretty close to what I wrote: establishing the relationship between the measuring device and a system of units. A new draft dated April 2004 is also pretty close to what I wrote but adds that an evaluation of the measurement uncertainty is also necessary. So only the new draft addresses precision in the definition. Neither definition requires an adjustment with the calibration. I think the draft ISO uncertainty requirement needs to be addressed separately from a general definition of calibration, as calibrations are not always done to satisfy the ISO community. I think what is needed is a general definition of calibration without the precision part, a simple example of a calibration (say, a measuring stick to a ruler), the current VIM's definition, and finally the draft definition. Bestshot (talk) 14:21, 29 March 2008 (UTC)[reply]
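
To make the "measuring stick to a ruler" idea concrete, here is a minimal sketch in Python (the numbers are invented and not taken from the VIM or the article) of what "establishing the relationship" can amount to: the device is compared against a reference standard and the deviations are documented, with no adjustment involved.

    # Illustrative only: reference values realised by a standard vs. device readings.
    reference_mm = [100.0, 250.0, 500.0, 750.0, 1000.0]   # standard (traceable to the metre)
    indicated_mm = [100.4, 250.6, 500.9, 751.3, 1001.8]   # what the device under test indicates

    # The calibration result is simply the documented relationship (the errors).
    for ref, ind in zip(reference_mm, indicated_mm):
        print(f"reference {ref:7.1f} mm   indicated {ind:7.1f} mm   error {ind - ref:+5.1f} mm")

The recorded errors (or corrections) are the calibration result; whether they are acceptable is a separate question for the user.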

There is something missing: the calibration process also involves calculating and documenting the accuracy, uncertainty, and stability (implied by the calibration period). Sometimes the documentation is the most important part; the user of the equipment needs to know. —Preceding unsigned comment added by Adam.ratcliff (talkcontribs) 00:24, 22 October 2010 (UTC)[reply]

Cleanup 1

I changed the definition given. Calibration does not consist of adjustment; however, I acknowledge that many people believe it does, especially in the U.S. Some other incorrect and sloppy statements were corrected or removed. Dalle 19:50, 5 September 2006 (UTC)[reply]


As you said, calibration does not consist of adjustment - the two are separate processes. However, the main page says "In non-specialized use, calibration is often regarded as including the process of adjusting....". This should be reworded to say that the word is often misused to include adjustment. Otherwise there are two confusing meanings of the same word, and the wrong one, which includes adjustment, should be discouraged. I think your wording is still confusing: what does "non-specialized use" mean? It suggests that it is OK to use the wrong meaning.

Stockdam 14:33, 22 August 2007 (UTC)

Guys, that's a good definition. —Preceding unsigned comment added by 82.44.253.94 (talk) 21:49, 15 November 2007 (UTC)[reply]

Snipers and Calibration

I came here from Sniper, which links here, looking to see what zeroing (calibration) does for them. However, this page is completely missing this information. Someone really does need to work on this, or remove the link that connects this page to Sniper, since this page is irrelevant as is.

Needs more detail

I am not expert enough to help here, but this article needs help. It needs to address this: http://www.hartscientific.com/publications/17025.htm and calibration for test equipment end-users. Also, there is a military standard for calibration which should be discussed. —Preceding unsigned comment added by Mcttocs (talkcontribs) 18:12, 9 June 2009 (UTC)[reply]

Required Accuracy

May I suggest adding some words to the effect that the using organization determines the required accuracy, and that one result of the calibration might be to confirm whether the device fulfills those requirements. Stefanhanoi (talk) 02:09, 3 November 2009 (UTC)[reply]

Calibration of measuring and testing equipment by an outside lab

When a piece of equipment is sent out for calibration to an outside lab or testing agency, it is calibrated, which generally includes confirmation of whether it meets its published calibration accuracy specifications. It is up to the user to define whether he wants a 5-point or a 10-point calibration; otherwise the lab will use its own norms, which may not even be clearly stated. If the user needs to know this information, he has to request as-received calibration data; otherwise none may be supplied. When an item is adjustable, the lab will attempt to put it back within calibration specifications and issue an as-shipped data sheet if one is requested. In some cases, if the instrument has failed, a repair may also be required.

All in all, the calibration precision requirements have to be user-defined; otherwise the calibration lab may only attempt to put the instrument marginally within specifications. For a more detailed study of calibration requirements, refer to the ISA 51.1-1979 (reaffirmed 1993) and SAMA PMC 20.2-1970 documentation.

The importance of calibration is determined by the kind of assurance you need, or the level of uncertainty you can live with without losing sleep over an impending disaster.

70.50.246.68 (talk) 14:29, 22 September 2010 (UTC) <Past experience></member ISA, ASQ>[reply]
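
To illustrate the point above about as-received data and published accuracy specifications, here is a minimal Python sketch of a 5-point as-received check; the points, readings, and tolerance are invented for illustration and are not taken from ISA 51.1, SAMA PMC 20.2, or any lab's procedure.

    # Hypothetical as-received check of a pressure gauge at five user-specified points.
    applied_psi   = [0.0, 25.0, 50.0, 75.0, 100.0]    # reference pressures applied
    indicated_psi = [0.1, 25.3, 50.6, 74.8, 100.9]    # gauge readings as received
    tolerance_psi = 0.5                               # assumed published accuracy specification

    out_of_spec = [(ref, ind) for ref, ind in zip(applied_psi, indicated_psi)
                   if abs(ind - ref) > tolerance_psi]
    print("as received:", "within specification" if not out_of_spec
          else f"out of specification at {len(out_of_spec)} point(s)")

If the instrument is then adjusted, the same check against the as-shipped (as-left) data shows whether it was put back within its specifications.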

This definition for the calibration "term" is NOT correct.

The definition as of 02/04/2012 is simply not correct. The problem lies with the question: is Wikipedia defining the "term" calibration or the calibration "process"? The calibration process usually involves much more than just what the term calibration means.

I would recommend as a definition of the term calibration: "An instrument calibration establishes the relationship between what the instrument measures and a unit of measure (meter, gram, second, joule, etc.)." See [1]. This defines the term.

This is typically done by using the instrument to measure a standard (an object, device, or material) where the magnitude of the measurement unit for the standard is well known. This defines the process.

What is wrong with the definition?

1) Discussion on the "Calibration is a comparison between measurements" part of the definition.

An average and standard deviation are comparisons between multiple measurements, and although this is often done during a calibration, it is not required for one. A calibration can be done with a single comparison to a standard where the magnitude of the desired measurement unit is well known.

Furthermore, how do you calibrate the first instrument? All of the units of measure have been defined intrinsically save one, the kilogram. This means, for example, that to measure time an instrument (an atomic clock) can be built and calibrated without comparison to another instrument or standard. The clock is calibrated by virtue of its ability to meet the definition of the measurement unit, the second. This is in fact what is done to establish the primary standards around the world. To be sure, these primary standards are compared to each other as a quality control measure or as secondary calibrations. But the primary and first calibration is by means of establishing the relationship between the measuring device and a system of units.

2) As for the part "...one of known magnitude or correctness...": the term "correctness" implies precision and uncertainty. Strictly speaking, only the magnitude is required for a calibration. Neither precision nor uncertainty is part of the "calibration" term. The quality of the calibration or measurement, i.e. the precision and uncertainty, is part of the "traceability" term. Clearly, for most "calibration processes" traceability is a critical part of the process. Traceability is not part of the calibration term, but calibrations are part of the traceability term.

3) As for the part "... one ... made or set with one device and another measurement made in as similar a way as possible with a second device": the sentence is not clear and needs help, particularly the phrase "one made or set with one device".

The use of one instrument to calibrate another through a transfer standard is a common way to create a measurement standard. Because the first instrument is already calibrated and traceable (well known), it is considered correct and is called a standard. It can be used to measure something. Once this measurement is done and the uncertainty of that measurement determined, that something becomes well known and can be used as a measurement standard. A second instrument can then be calibrated using the "something" measurement standard. Again, once this measurement is done and the uncertainty of that measurement determined, the second instrument becomes well known and can be used as a measurement standard. And on it goes.
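
As a rough illustration of how uncertainty behaves along such a chain of transfer standards (a sketch only, assuming the contributions are uncorrelated and combined in quadrature in the GUM style; this is not part of the VIM definition):

    u_n = \sqrt{ u_{n-1}^2 + u_{\mathrm{transfer},\,n}^2 }

so the standard uncertainty can only grow at each link as one moves further from the primary standard.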

This is not the only way to create standards. As I said above, the primary standards are calibrated intrinsically by the definitions of the measurement units.

You can purchase standards that were created as described.

Again, the term calibration does not speak to the pedigree, quality, accuracy, or precision. The term calibration speaks only to establishing the relationship to a unit of measure. The pedigree, quality, accuracy, and precision of a calibration can be extremely poor and it can still be called a calibration. The pedigree, quality, accuracy, and precision are part of the "traceability" of the calibration or measurement and not of the calibration or measurement themselves.

It is up to the individual or organization utilizing the calibrations and measurements to establish requirements and tolerances for the pedigree, quality, accuracy, and precision.

Example: A stick is marked with a 1 meter length. The relationship of the measurement device (the stick) to a unit of measure (the meter) is established. The instrument is "calibrated". No statement about the pedigree, quality, accuracy, or precision is necessary.

Stick ORG101112 was compared to NIST Standard meter ABC1234 on Jan 4, 2012. The meter mark on Stick ORG101112 was found to be within +/-2% at a 95% confidence level. The pedigree, quality, accuracy, and precision of the calibration are determined and stated; therefore, traceability of the calibration has been established.

The organization has established a requirement that measurement sticks must be traceable through NIST, and a tolerance for the accuracy of 0.1% at a 95% confidence level. In this case the accuracy of our calibration does not meet the organization's tolerance. Although it is a traceable calibration, it is NOT acceptable for use by the organization. Bestshot (talk) 19:13, 4 February 2012 (UTC)[reply]
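
For what it is worth, the acceptance decision in the example above reduces to a single comparison, using the figures already given there:

    U_{\mathrm{reported}} = 2\% \;(95\%) \;>\; T_{\mathrm{required}} = 0.1\% \;(95\%)

so the calibration is traceable but falls outside the organization's tolerance, as stated.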

Vacuums don't pull

"unknown vacuum pulls the liquid up the tube, as shown" - sheer nonsense. Vacuums don't pull. The best they can do is fail to push. — Preceding unsigned comment added by 213.150.228.38 (talk) 14:30, 6 February 2013 (UTC)[reply]

Calibration versus metrology

The terms Calibration and Metrology have very different meanings. Metrology applies to calibration as well as to any other measurement. This section is misleading and I propose to delete it. SV1XV (talk) 14:55, 17 April 2013 (UTC)[reply]

ITAR

The image of the F18 NA test set may be an ITAR violation. It is an image of portions of a test set used to manufacture US military equipment and is owned by the US government. There may be restrictions against displaying and storing an image of the equipment. I'm not an expert just putting it out there.... — Preceding unsigned comment added by 205.175.225.23 (talk) 15:18, 6 March 2014 (UTC)[reply]

Unreferenced

There are not nearly enough references for an article of this length, and the four that are offered are not supporting most of the contents of the article. The "refimprove" tag should be kept on until someone can support this long chatty personal essay with some citations. --Wtshymanski (talk) 20:19, 2 September 2014 (UTC)[reply]

Definition of "uncertainty"?

What is the definition of "uncertainty"?

Is "uncertainty" a synonym for the standard deviation of a random variable? Or are there different definitions for "uncertainty" depending on the type of equipment being calibrated? 216.223.227.115 (talk) 18:49, 23 January 2015 (UTC)[reply]

Why not look in Wikipedia? See Measurement uncertainty. --Cms metrology (talk) 16:21, 15 June 2015 (UTC)[reply]
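
As a brief pointer only (these relations follow the GUM, not text from the article): a standard uncertainty is an uncertainty expressed as a standard deviation, and an expanded uncertainty is the combined standard uncertainty scaled by a coverage factor, e.g.

    u(\bar{x}) = s/\sqrt{n} \quad \text{(Type A, from n repeated readings)}, \qquad U = k\,u_c \quad (k \approx 2 \text{ for roughly } 95\% \text{ coverage})

The definition does not depend on the type of equipment being calibrated; what changes from instrument to instrument is which contributions enter the uncertainty budget.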

Modern Calibration

The changes made 29 April 2015 by Shefes, while adding content, are full of grammatical mistakes and could definitely use a facelift. — Preceding unsigned comment added by 184.69.155.202 (talk) 15:56, 8 June 2015 (UTC)[reply]

I found the source for the material you refer to to be strongly related to NABL and FCRI accreditation standards in India. Accordingly, I put it in the context of international standards and copyedited the section. Hope this helps. ← scribbleink ᗧHᗣT 21:33, 14 June 2015 (UTC)[reply]

Possible Caption Mistake in "Manual and automatic calibrations" Section

There appears to be a misalignment between a picture caption and what it shows. In the "Manual and automatic calibrations" section, the first picture has a caption that partially reads, "Manual calibration - US serviceman calibrating a temperature gauge". But the details associated with the picture indicate "Fireman Joshua Morgan, of Waco, Texas, calibrates a Engineering pressure gage". If the caption were edited to read "pressure" instead of "temperature", this would also align with the first sentence under the "Manual" heading in the same section, which reads "The first picture shows a U.S. Navy Airman performing a manual calibration procedure on a pressure test gauge." Rez Evans (talk) 19:29, 28 December 2016 (UTC)[reply]

References

  1. ^ International Vocabulary of Metrology — Basic and General Concepts and Associated Terms (VIM)