18. Spectrometer failure

Here is the original question, and, since Howard Mark asked for the background behind the asking, a short explanation. Then the responses follow.

Since all spectrometers will eventually fail, and some may not work correctly at the start, what procedures do you use to (1) verify the instrument is operating within specifications, (2) ensure the calibrations developed for its use are still yielding correct results, and (3) correct for possible malfunction (including recalibration, calibration transfer, or other remedies)?

I was wondering what the various practices are for verifying the quality of spectrophotometers, both when initially installed and in routine use. There are instrument suppliers' practices. There are also guidelines and rules promulgated by various bodies such as ASTM and the FDA, which in the regulatory arena must be followed rigorously. However, many analyses fall outside that arena, so individual practice may vary. Further, a chapter in a soon-to-be-issued book makes a point about (eventual) instrument failure. This also piqued my curiosity about the various ways to discover when a spectrophotometer begins to fail.

Bruce Campbell

From: [email protected]

Instrument Hardware Failure

1. Instrument vendor diagnostics

a. Repeatability (noise) and patterns in the data

b. Wavelength accuracy with polystyrene

c. Response to test lamp output and detector gain

d. Bandwidth measurements

I chart the above data in an Excel spreadsheet and look for trending or drifting (see the sketch below).
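
As an illustration of that trend-watching step (an editorial addition, not part of the original post): a minimal control-chart check on logged diagnostic values, such as daily polystyrene wavelength-accuracy errors. The data and the 3-sigma limits are assumptions chosen for the example.

    import random
    import statistics

    def flag_drift(values, n_baseline=20, sigma_limit=3.0):
        """Flag readings outside baseline mean +/- sigma_limit * baseline SD."""
        baseline = values[:n_baseline]
        mu = statistics.mean(baseline)
        sd = statistics.stdev(baseline)
        return [(i, v) for i, v in enumerate(values[n_baseline:], start=n_baseline)
                if abs(v - mu) > sigma_limit * sd]

    # Simulated daily wavelength-accuracy errors in nm; the last five days
    # drift high, as a failing grating drive or reference might cause.
    random.seed(0)
    log = [random.gauss(0.00, 0.02) for _ in range(30)]
    log += [random.gauss(0.10, 0.02) for _ in range(5)]

    for day, value in flag_drift(log):
        print(f"day {day}: {value:+.3f} nm exceeds control limits")

The same idea applies to any of the logged diagnostics above: establish limits from a known-good baseline period, then flag excursions and watch for runs on one side of the mean.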

2. Sealed sample sets (5-10 samples) spanning the constituent range, shared between instruments in our network at routine intervals.

a. Look for predictions that are drifting for any given instrument

b. Look at differences between instruments for predicted data

c. Look for spectral differences using ISI contrast

This data is also charted in Excel; a sketch of the between-instrument comparison follows below.
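
A minimal sketch of the between-instrument comparison in point 2 (an editorial addition; the instrument names and predicted values are hypothetical): compute each unit's mean bias against the network mean for the sealed sample set, and watch for a unit pulling away from the pack.

    # Predicted constituent values for the same sealed sample set,
    # one list per networked instrument (hypothetical numbers).
    predictions = {
        "unit_A": [12.1, 15.4, 18.0, 21.2, 24.9],
        "unit_B": [12.0, 15.5, 18.2, 21.1, 25.0],
        "unit_C": [12.6, 16.0, 18.7, 21.8, 25.6],  # drifting high?
    }

    n = len(next(iter(predictions.values())))
    network_mean = [sum(p[i] for p in predictions.values()) / len(predictions)
                    for i in range(n)]

    for name, preds in predictions.items():
        bias = sum(p - m for p, m in zip(preds, network_mean)) / n
        print(f"{name}: mean bias vs. network = {bias:+.2f}")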

3. Check cells or known standard samples run daily for each instrument. For liquids in transmission we take a large sample or samples; low, medium, and high constituent levels would be better than one sample. We mix thoroughly, split into many smaller containers, and freeze them. These are run daily.

Near the end of the designated life of the control, we get the next sample out of the freezer and compare it to the previous one. Changes in Mahalanobis distance values can also be an indication that the instrument may be drifting (a sketch of this check follows below).
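
For the Mahalanobis distance check mentioned above, here is a minimal sketch (an editorial addition): it scores a new control-sample run against the history of previous runs, assuming the spectra have already been reduced to a few principal-component scores. The score data are simulated placeholders.

    import numpy as np

    rng = np.random.default_rng(1)
    history = rng.normal(size=(50, 3))          # 50 past runs, 3 PC scores each
    new_scores = np.array([0.2, 2.8, -0.1])     # today's control-sample run

    mu = history.mean(axis=0)
    cov = np.cov(history, rowvar=False)
    d = new_scores - mu
    md = float(np.sqrt(d @ np.linalg.inv(cov) @ d))
    print(f"Mahalanobis distance: {md:.2f}")    # a rising trend suggests drift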

4. Visual inspection of spectra and the instrument.

a. This step, though not very sophisticated, can catch problems that the data don't always show.

b. Routine instrument maintenance such as cleaning, and inspection for correct mechanical operation: spinning drawer, instrument cooling fan, etc.

5. Any other measurable parameters such as path length.

6. Reference absorbance standards have been considered, but have not been implemented at this time.

7. ASTM has also developed some relevant guidelines.

Calibration accuracy after repair or instrument calibration transfer

1. ISI standardization is used for all instruments in our network. After repair, restandardization is done as necessary.

2. Each calibration is designed with samples from multiple standardized instruments.

3. ISI repeatability files are used and include standardization samples for multiple instruments, temperature variation, etc.

Hope this helps. Thanks.

From: Howard Mark [email protected]

Subject: Re: Instrument failure

Well, on the surface, the answer(s) is(are) relatively straightforward:

For questions #1 & #3, follow the manufacturer's recommendations. This usually includes a daily QC routine for the hardware performance, which is at least partially automated these days.

For #1, the traditional "quick check" is still a good idea, if not completely comprehensive: run some small number of bias check samples. Running one sample each day along with the QC check is not too onerous and will reveal major changes in performance. A slightly more extensive set of readings, say 5-15 samples at intervals of one week to one month depending on what makes you feel comfortable, is in order. With a properly chosen set of samples you can test for sensitivity changes as well as bias (a sketch of such a check follows below), and this addresses question #2 as well.
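
As an illustration of that bias/sensitivity check (an editorial addition, not part of Howard's reply): regress predicted values against reference values for a spanning sample set. All numbers here are hypothetical; a bias moving away from zero, or a slope moving away from one, signals a change worth investigating.

    import statistics

    # Reference and predicted values for a spanning check set (hypothetical).
    reference = [10.0, 14.0, 18.0, 22.0, 26.0]
    predicted = [10.4, 14.5, 18.3, 22.6, 26.5]

    n = len(reference)
    bias = sum(p - r for p, r in zip(predicted, reference)) / n

    mean_r = statistics.mean(reference)
    mean_p = statistics.mean(predicted)
    slope = (sum((r - mean_r) * (p - mean_p) for r, p in zip(reference, predicted))
             / sum((r - mean_r) ** 2 for r in reference))

    print(f"bias  = {bias:+.2f}")   # should stay near 0
    print(f"slope = {slope:.3f}")   # should stay near 1; change => sensitivity shift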

For further tests, and more information on how to conduct testing, ASTM E1655 is a good reference. While written from the point of view of the petrochemical industry, the tests are general enough to be valid for just about any application.

ASTM is currently working on developing a generalized hardware test procedure. Input is welcomed and invited. There will be a meeting of the committee at Pittcon, and anybody with input is invited to attend.

But all this is the standard, obvious stuff. I get the impression that the question is intended to look a little deeper, or at least dig out some non-obvious, special answer, but it would help to have a clue as to what the underlying basis of it is.

Howard

From: "Jim Reeves, NCML, B-200" [email protected]

First, I have not had to worry in the past about long-term changes; my work has mostly been short studies, complete within themselves. I can tell you that we had to rescan a large group of silages many years ago when we lost a detector. The new detector/source (I think the lamp was also replaced at the same time) had a completely different energy profile: there seemed to be more energy at the high end (2500 nm) and less at the low. So breakdowns can be very messy.

For my Fourier transform spectrometers we check the energy using the alignment procedure and also run a 100% line. With the new scanning monochromator, there are built-in filters for wavelength checks and a check sample to run. What really happens if the thing breaks I don't know. However, we ran manures over a period of months and got excellent calibrations. During this period we were switching back and forth between the transport, probe, and spinning cup modules and still got good results.

Jim Reeves
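
As an aside on the 100% line check Jim mentions: a minimal sketch (an editorial addition, with simulated scan data) that ratios two back-to-back background scans and reports the deviation from 100%T. A flat line near 100% indicates a stable instrument; growing noise or curvature suggests source, detector, or alignment problems.

    import numpy as np

    rng = np.random.default_rng(2)
    envelope = np.hanning(512) + 0.1            # fake single-beam energy profile
    scan1 = envelope * (1 + rng.normal(0, 0.001, 512))
    scan2 = envelope * (1 + rng.normal(0, 0.001, 512))

    line = 100.0 * scan1 / scan2                # ideally flat at 100%T
    print(f"peak-to-peak: {line.max() - line.min():.3f} %T")
    print(f"RMS noise:    {line.std():.4f} %T")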

From: "David Geller" [email protected]

Bruce,

Foss supplies traceable absorbance and wavelength standards. They also have rigorous diagnostic routines that test performance via internal polystyrene and didymium standards.

Author: "Ritchie; Gary" [email protected] at Internet

Date: 02/14/2000 9:20 AM

(1) For reflectance work, I have been using the proposed USP chapter on Near-Infrared Spectrophotometry <1119>, pp. 6463 - 6573, issued in Pharmacopeial Forum, July-August 1999 issue.

For transmission work on tablets, I am in the process of testing a Vision software upgrade for use with the Multi-tab module, for which FOSS has manufactured wavelength and photometric accuracy standards for external calibration, that is, not based on an internal standard hidden inside the spectrophotometer.

(2) Using the following schedule:

 

<<...>>

(3) I haven't experienced a major malfunction with calibration equations yet; however, I am only developing throw-away models for clinical identification purposes in reflectance mode, so it would not matter anyhow. As for the quantitative equations I am working on, only the transmission module applies, and there is not enough experience with this yet to see how it would hold up...stay tuned.

Gary E. Ritchie

Secretary (1999) NY SAS

Sr. Scientist

Purdue Pharma L.P.

444 Saw Mill River Road

Ardsley, NY 10502