Help with Pharmaceutical analysis

NIR Discussion Forum » Bruce Campbell's List » I need help » Help with Pharmaceutical analysis


Kevin Chalmers (Kevin)
Posted on Thursday, March 08, 2001 - 8:40 am:   

Hello,

New to NIR analysis and am already having problems creating a simple ID method.

We are trying to build ID methods for some of our active ingredients (Brompheniramine Maleate and Guaifenesin). Using Vision software by FOSS NIRSystems, we are able (we think) to create a good library. We apply Max. Dist in Wavelength space using 2nd derivative on the samples. Then for the ID library, we use correlation in wavelength space with no derivative. The library passes validation, with what seems to be adequate responses.

The problem lies with challenges versus this library. All positive challenges pass, as expected. All negative challenges fail, as expected. The problem begins when we scan contaminated samples versus this library. We contaminate a library component with 5% of a different ingredient (Tartaric Acid or Oxybenzone). These contaminated samples PASS, being ID'd as almost 100%.

I realize this is a convoluted question, but is there anything obvious we are doing wrong?

I started out feeling very confident in the application of NIR with our raw materials, but as each day passes, the confidence declines.

Any help would be greatly appreciated.

Thanks

Gary Ritchie
Posted on Thursday, March 08, 2001 - 9:12 pm:   

Kevin,

Some questions to ponder before I attempt to answer this:

1. Are you using the full spectrum of each substance for the library?

If you are, then the problem you are having when you try to discriminate the tainted samples from the "known" library is that even at the 5% level the spectra are still very similar (if you are using the FULL spectral range). We have been very successful using only those wavelength regions where there are very discrete bands that form a gap between different samples. The chance that another substance will "fit" in those bands or fill the gap is much lower than when you use the full spectrum.
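Gary's suggestion of restricting the comparison to discrete band regions can be sketched as follows; a minimal NumPy illustration with made-up wavelength windows (the actual regions would come from inspecting your own spectra, not from these numbers):

```python
import numpy as np

def restrict_to_windows(wavelengths, spectrum, windows):
    """Keep only the points that fall inside the chosen wavelength windows."""
    mask = np.zeros(len(wavelengths), dtype=bool)
    for lo, hi in windows:
        mask |= (wavelengths >= lo) & (wavelengths <= hi)
    return spectrum[mask]

# Hypothetical 1100-2500 nm grid at 2 nm spacing, with two windows chosen
# around discrete, well-separated bands (illustrative values only).
wl = np.arange(1100, 2500, 2)
windows = [(1650, 1800), (2250, 2350)]
spec = np.random.default_rng(0).normal(size=len(wl))
sub = restrict_to_windows(wl, spec, windows)
```

Any correlation or distance test is then run on `sub` rather than the full 700-point spectrum, so a contaminant has to "hide" inside those specific bands to go undetected.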

2. What are the threshold values you have set for the settings you refer to: "We apply Max. Dist in Wavelength space using 2nd derivative on the samples. Then for the ID library, we use correlation in wavelength space with no derivative."?

This doesn't make any sense to me, for the following reason:

If you have been able to differentiate the sample sets you selected to build the library with by max distance in wavelength space on second-derivative spectra, why would you then change the parameter to correlation in wavelength space on NON-derivatized spectra?

Try your exercise again using thresholds based on standard deviation of max distance (the default is 4), KEEPING the second derivative spectra for your library. Then optimize your library by finding the cutoff at which your library fails on positive and negative challenges, tweaking your threshold value up or down as needed. Good luck!
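A rough sketch of the max-distance test Gary describes, assuming a simple point-difference second derivative and a per-wavelength standard-deviation threshold (Vision's actual preprocessing, e.g. its smoothing and derivative settings, will differ):

```python
import numpy as np

def second_derivative(spectra):
    """Point-difference second derivative along the wavelength axis."""
    return np.diff(spectra, n=2, axis=-1)

def max_distance_id(library, unknown, n_sd=4.0):
    """Pass if, at every wavelength, the unknown's 2nd-derivative value
    stays within n_sd library standard deviations of the library mean
    (4 being the default threshold mentioned above)."""
    lib, unk = second_derivative(library), second_derivative(unknown)
    mean, sd = lib.mean(axis=0), lib.std(axis=0, ddof=1)
    sd = np.where(sd > 0, sd, np.finfo(float).tiny)  # guard against zero SD
    dist = float(np.max(np.abs(unk - mean) / sd))
    return dist <= n_sd, dist

# Hypothetical library: 10 replicate scans of the same material.
rng = np.random.default_rng(1)
base = np.sin(np.linspace(0, 6, 200))
library = base + rng.normal(scale=0.01, size=(10, 200))

# A sample carrying a sharp foreign band deviates by many SDs at that band.
contaminated = base.copy()
contaminated[100] += 0.5
passed, dist = max_distance_id(library, contaminated)
```

Tweaking `n_sd` up or down is then the optimization step: lower it until negative challenges fail while positive challenges still pass.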

Gary Ritchie

campbell
Posted on Friday, March 09, 2001 - 8:22 am:   

Kevin,
Gary has some very good points. You may also want to overlay spectra of the added contaminants to see how similar they are on visual examination. I have seen a library/qual software program that allows more than one presentation, such as transmission, absorbance and various derivatives, all at the same time. My intuition says that would be a more rugged approach than just one presentation.

Additionally, there may be a problem with homogeneity: if the mixtures are not homogeneous, you could be obtaining spectra from the main component only. Yet another aspect that may be upsetting the measurements is a large difference in particle sizes.
Bruce Campbell

DJDahm
Posted on Sunday, March 11, 2001 - 3:04 am:   

Kevin:
I'm not exactly sure what you are doing, but I know that you are not alone. The "I started out feeling very confident in the application of NIR with our raw materials, but as each day passes, the confidence declines." is fairly common among my consulting customers as they start up. Usually, in the end, the techniques work fine.

First, correlation is a good way to get tentative identification of an unknown from a library of knowns. You get one number that shows how well the spectra correlate. Perfect correlation (1.0) happens when the spectra are identical. However, since instrumental conditions, particle size, etc affect the spectra, it is quite possible that there is poorer correlation between two spectra of the same chemical material than between a pure sample and a contaminated one.

Depending on the mathematical expression used to calculate the correlation, perfect 1.0 correlation can also occur when one spectrum is an exact multiple of the other. This is very useful when checking a spectrum against data collected on other instruments or under quite different conditions.
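Don's two points about correlation — its invariance to overall scaling, and its insensitivity to a small contaminant — are easy to demonstrate. A small sketch with made-up Gaussian "bands" standing in for real spectra; the numbers are illustrative, not from any real material:

```python
import numpy as np

def correlation(a, b):
    """Pearson correlation between two spectra; 1.0 means identical shape."""
    a, b = a - a.mean(), b - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

x = np.linspace(0, 4, 200)
pure = np.exp(-(x - 2) ** 2)        # hypothetical band of the pure material
other = np.exp(-(x - 1) ** 2)       # hypothetical band of a contaminant

scaled = 3.0 * pure                     # same material, different pathlength/gain
contaminated = 0.95 * pure + 0.05 * other  # 5% contamination

r_scaled = correlation(pure, scaled)       # scaling leaves correlation at 1 (up to rounding)
r_contam = correlation(pure, contaminated)  # still very close to 1
```

This is exactly why Kevin's 5%-contaminated samples can pass a correlation-based ID at "almost 100%": the mixture's spectrum still points in almost the same direction as the pure one.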

When you want to distinguish a spectrum of a pure material from a contaminated one, wavelength distance is a better choice. You are running a test at every wavelength that you include, so you get many indications of similarity instead of one. It is relatively rare that addition of 5% of another material will not alter the spectrum sufficiently to be detectable. The problem is that it must be detectable (by deviating from the pure spectrum at one or more wavelengths) at a level greater than the "normal" variation included in the library. So we want to keep the variation in the library as low as possible. This is counter to the advice that you get when building a calibration, when we generally want to include variation beyond the range that we will encounter in our samples to be analyzed.

I have seen many very good schemes tailored by imaginative application of the tools you are using. The following is a strategy that I have found useful, but it cannot replace the intelligent customization needed for some applications.

First, don't include too many spectra of good material. You do not want a good statistical distribution of spectra of good material in the library. You want the spectra to be weighted toward the "edges" of the "normal" or "acceptable" variation. If you have too many spectra in the library, one more will not change it much (unless it is "way out there"). If you have a normal distribution, you will fail one out of 100 independent measurements at 3 sigma. In a NIR spectrum, there might be close to 100 independent measurements, so you would be failing a lot of spectra at 3 sigma with a large, normally distributed library. (That's why they had to make the default be 4 sigma.) You can get away with a "smaller sigma" if you have a library skewed toward the edges of the distribution.
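The arithmetic behind the 4-sigma default can be checked directly. Assuming independent, roughly Gaussian deviations at about 100 wavelengths (both assumptions are only rough — neighbouring wavelengths are correlated in practice), the fraction of genuinely good spectra that survive an n-sigma test at every wavelength is:

```python
from math import erf, sqrt

def pass_rate(n_sigma, n_tests=100):
    """Chance that all n_tests independent Gaussian deviations stay
    within n_sigma of the mean."""
    p_single = erf(n_sigma / sqrt(2))   # P(|z| < n_sigma) for one test
    return p_single ** n_tests

rate3 = pass_rate(3.0)   # roughly 0.76: about a quarter of good spectra fail at 3 sigma
rate4 = pass_rate(4.0)   # above 0.99: hence the 4-sigma default
```

So with a large, normally distributed library and a 3-sigma cutoff you reject a substantial fraction of perfectly good material, which matches Don's observation.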

Initially concentrate on challenging the library with "bad" samples (i.e. samples that you want your library to be able to reject). If one passes the test as a good material, you must reduce the variation in the library. For example, you may be able to have a stricter tolerance over a smaller wavelength range, or taking derivatives might reduce irrelevant variation. If you can't get past this hurdle, you are in for tough sledding using wavelength distance. [This is where you may want to consider more sophisticated schemes, but a novice would be well advised to get help at this point.]

Next challenge the library with spectra of good samples that were obtained under reasonable conditions. If it passes, fine. If it fails, add it to the library. Hopefully adding several such spectra to the library will fix the problem of rejecting good material, but you can always add more of these if you encounter them down the road.

Now you must make sure that the library will still reject the bad samples after the new sources of variation have been added.

Tony Davies (Td)
Posted on Monday, March 19, 2001 - 8:30 am:   

Hello Kevin,

I think Don has given you most, if not all, of the answer. I have one comment to add but I may be quite wrong because I do not know exactly what you have done.

You said that you were making a library for actives (Brompheniramine Maleate and Guaifenesin) and then you are adding Tartaric Acid or Oxybenzone as contaminants. It reads as if these compounds are not in the library. If this is true then it is not so surprising that they are not detected. When you create the library the program is searching for directions which separate the members. If you add a non-member then there is no reason why the system should detect it. Conversely if you add compounds which are in the library then I would expect them to be detected.

Tony

shay
Posted on Wednesday, February 19, 2003 - 2:59 am:   

Dear All,

I am currently working on identification of tablets and raw materials, and quantification of active substances in granulates and blends. At the moment I have encountered a problem with the placebo for a certain tablet and would like your opinion.
The whole case:

I have had some successful experience with a few types of tablets. In such systems the validation process includes the use of a placebo, which is the same tablet without the active substance. As expected, identification of a placebo results in a "Fail", since the method was developed to identify the active substance in the tablet.
Currently I am working with tablets in which the active substance is both the main component and the binding material. In order to use an equivalent placebo I need to add cellulose as a supplement to replace the active substance.
Since I add a new component, I expect a failure in the identification of the active substance and of the tablet as a whole. Changing 2 parameters at the same time (adding a new substance and removing the active substance) would both affect the NIR spectra. How could I assess this validation vs. previous work (when only the active substance was omitted)? Could you suggest another alternative?


I am most thankful for the time you took to look at my work.
Shay Ben-Menachem

Tony Davies (Td)
Posted on Wednesday, February 19, 2003 - 4:43 am:   

Dear Shay,

I expect that you would get a "fail", but I recommend that you develop a separate calibration for "placebo". It is much safer to make a positive identification than to rely on a negative.
Good Luck!

Tony

Su-Chin Lo (Suchin)
Posted on Wednesday, February 19, 2003 - 7:29 am:   

Dear Shay,

In a typical tablet ID analysis, you need to re-validate the ID library whenever you go through the "change control" process. Therefore, after adding a new substance to the external validation set, the specificity test should be performed one more time. In our practice, there is no need to build a separate calibration for the placebo, since the purpose of the ID test is focused on the target tablet only.

Su-Chin

francesca lastrucci (Franci)
Posted on Tuesday, February 25, 2003 - 9:33 am:   

Hi all,
I have a question about instruments: we have to buy a new instrument for identifying raw materials in the pharmaceutical field. Until now we used the NIR Infraprover (Bran&Luebbe), but we want to change, and we don't know which is the best among the NIR instruments from FOSS NIRSystems, Bruker Optics and Perkin Elmer.
What do you think? Does anybody have experience with these instruments for identifying raw materials in the pharmaceutical field?
Thank you,
Franci

Tony Davies (Td)
Posted on Tuesday, February 25, 2003 - 12:36 pm:   

Hello Franci,

No one has jumped in because this is a very commercial question!
I think you have to ask yourself some additional questions to help you to decide. There are differences in the performances of the instruments you listed but all of them (and several more) would do the job.
Questions?
1) Do you want to start again from scratch or do you hope to transfer existing calibrations/libraries?
2) Do you want to continue with the same software or are you prepared to learn a new system?
3) Is cost a limitation?
4) Is the new instrument a stand-alone replacement or does it need to be integrated with other systems?
You might need a consultant to help you!

Best wishes,

Tony Davies
NOTE TO INSTRUMENT SALES ENGINEERS
Please do not begin an argument about the superiority of your instruments. You may give an e-mail address where you can be contacted.

Jesper Wagner
Posted on Wednesday, February 26, 2003 - 12:08 am:   

Hi Franci,

More questions you can ask yourself:

1) Is the software 21 CFR Part 11 compliant?
2) How is the support from the instrument manufacturers?

Like Tony, I would also advise you to get help from a consultant, especially on 21 CFR Part 11. All the instrument manufacturers say they are compliant, but not all of them are to the same degree.

Best wishes

Jesper Wagner

Kesley M. G. de Oliveira (Kesley)
Posted on Thursday, September 23, 2004 - 2:14 pm:   

Dear Colleagues:

I am a novice at NIR analysis, and after reading some articles I have some questions about creating a suitable strategy for composing a library for identification of our compounds.
It seems to me there is a consensus on using wavelength correlation and distance measures as discriminating criteria for identity confirmation after creating a general library.
We are using software which uses PCA for qualitative analysis. At first glance, I can't see how we can build a general library, since the matrix calculations involved would be a very computationally demanding task. In addition, library updates would not be straightforward.
In this case, are individual libraries the only alternative? In order to assure unequivocal identification, should I challenge the libraries one by one?

Tony Davies (Td)
Posted on Monday, September 27, 2004 - 9:36 am:   

Dear Kesley,

The importance of being a "novice" is that you are allowed to ask questions that "experts" have forgotten.
My answers are:
1) You are correct to be distrustful of the use of PCA for qualitative analysis. PCA is widely used, but it is NOT the correct tool. PCA is a very useful tool for compressing the spectral information into a few variables (PCs), and it is merely fortuitous that it is useful for qualitative analysis (it does work in many situations). The correct way is to use the PCs as input variables for other techniques, such as canonical variates analysis or SIMCA, which were developed to attack these problems.
2) At present there does not appear to be an answer to the question of how you should validate a library system. I have been trying to raise the question but very few people can see the problem. I believe that we need a "precursor" test which would check that your incoming raw material is more likely to be a member of one of your libraries than ANY other material.
3) Two years ago, I did a comparison of computer speeds over a twenty year gap. The modern personal computer is 100,000 times faster than the computer used to first study these problems! (www.spectroscopyeurope.com/TD_14_6.pdf) The required computations are unlikely to be that demanding of the computer but the organisation might be a difficult requirement for the human part of the activity!
4) We published a paper in the Journal of Near Infrared Spectroscopy ["A guide to raw material analysis using near infrared spectroscopy" by Kemper & Luchetta, 11, 155-174 (2003)], obtainable from the NIR Publications website, which I am sure you would find useful.

Good luck and please keep asking the questions!
Best wishes,

Tony

David W. Hopkins (Dhopkins)
Posted on Monday, September 27, 2004 - 11:20 am:   

Dear Kesley,

I don't think there is any consensus on the best way of identity confirmation. A simple distance method, the Conformity Index, was the first to be accepted by the European regulatory agency (see Plugge, W. and C. van der Vlies, "The use of near infrared spectroscopy in the quality control laboratory of the pharmaceutical industry", J Pharm Biomed Anal 10 (10-12): 797-803, 1992). I have found that this method suffers from a problem of dividing by numbers close to zero (when dividing by the mean spectrum) when first or second derivatives are used. This makes the method difficult to maintain.

PCA is at the heart of the highly successful SIMCA method of identifying materials. One needs to make a PCA model for every compound in the library, which is not too demanding of time. It also avoids the problem of dividing by the mean spectrum. A number of different software packages provide different flavors of SIMCA, and you should find them useful.
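A minimal SIMCA-flavoured sketch of what Dave describes: one PCA model per library compound, with the unknown assigned to the class whose principal-component subspace it fits best. Real SIMCA packages add statistical limits on the residual; this only shows the skeleton, on made-up Gaussian "bands":

```python
import numpy as np

def fit_pca(X, n_components):
    """Mean-centre the class spectra and keep the top PCs via SVD."""
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:n_components]

def residual(model, x):
    """Distance from x to the class's PC subspace (SIMCA-style residual)."""
    mean, load = model
    centered = x - mean
    return float(np.linalg.norm(centered - load.T @ (load @ centered)))

# Two hypothetical compound classes with distinct band positions.
rng = np.random.default_rng(2)
x = np.linspace(0, 4, 80)
class_a = np.exp(-(x - 1) ** 2) + rng.normal(scale=0.01, size=(12, 80))
class_b = np.exp(-(x - 3) ** 2) + rng.normal(scale=0.01, size=(12, 80))
models = {"A": fit_pca(class_a, 2), "B": fit_pca(class_b, 2)}

# Classify a new scan by smallest residual to any class model.
unknown = np.exp(-(x - 1) ** 2) + rng.normal(scale=0.01, size=80)
best = min(models, key=lambda k: residual(models[k], unknown))
```

Because each compound gets its own small model, the per-model computation stays trivial and the library can grow one model at a time.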

I agree with Tony, the computation time is not likely to be a problem. The model building, testing and validating is the time-consuming step. That time is paid back by the speed and usefulness of the NIR procedure.

I would recommend that you start with the program you are using, and if you are successful and you are pleased with the results, there is no need to look further. If you find your present software too limiting, please ask on the forum, and I and others will be glad to offer suggestions, I am sure.

Best wishes,
Dave

Gabi Levin
Posted on Monday, September 27, 2004 - 6:38 pm:   

Hi guys,

In real life, many substances differ from each other spectrally to an extent that almost any method (PCA, Mahalanobis distance, etc.) will distinguish them nicely.

However, there are instances where the differences become borderline and the job more tricky.

In several cases, I have done the following:

1. The qualitative identification step identifies the group that contains, say, two compounds that have similar spectra. This happens often with chemicals whose structures are very similar, such as monomers and dimers.
Once the sample is identified as belonging to this group, the Brimrose Predictor software goes to the next step, where we do "quantitative" analysis.

In this "quantitative" analysis we assign to compound A an arbitrary value of 1 and to compound B an arbitrary value of 2. We then predict, based on the "quantitative" model, the value for the unknown substance. For prediction values below 1.5 we determine that the compound is A, and for predicted values above 1.5 we determine that the compound is B.

In real life we seldom get predictions of more than 1.3 for compound A or below 1.7 for compound B. This leaves a very nice "safety margin" between the two compounds.

I have used this technique very successfully with compounds whose spectra looked very similar, and where PCA or Mahalanobis distance were limited.

This method was also expanded by me to the case of three compounds of very similar spectra, where we assigned arbitrary values, 1, 2 and 3.
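Gabi's scheme can be sketched with an ordinary least-squares calibration standing in for whatever the Brimrose Predictor software actually fits (presumably PLS or similar — that detail is an assumption here); the arbitrary 1/2 labels and the 1.5 cutoff are the essence:

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0, 4, 60)
# Two hypothetical compounds with very similar spectra (e.g. monomer/dimer):
# the same band shifted slightly.
band_a = np.exp(-(x - 2.0) ** 2)
band_b = np.exp(-(x - 2.2) ** 2)
spectra = np.vstack([band_a + rng.normal(scale=0.005, size=(15, 60)),
                     band_b + rng.normal(scale=0.005, size=(15, 60))])
y = np.array([1.0] * 15 + [2.0] * 15)   # arbitrary class values

# Least-squares "quantitative" model with an intercept term.
X = np.hstack([spectra, np.ones((30, 1))])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

def predict(spectrum):
    """Predict the arbitrary class value for one spectrum."""
    return float(np.append(spectrum, 1.0) @ coef)

# Classify an unknown scan of compound B by the 1.5 cutoff.
unknown = band_b + rng.normal(scale=0.005, size=60)
label = "A" if predict(unknown) < 1.5 else "B"
```

The prediction for a genuine A or B sample lands near 1 or 2 respectively, leaving the "safety margin" Gabi describes around the 1.5 decision point.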

If anyone cares to comment on that, I will be glad to hear it.


Gabi Levin
Brimrose

Renato Guchardi
Posted on Tuesday, September 28, 2004 - 1:37 am:   

Hi Gabi,

I did not see your software, but I think that what it does is calculate the Euclidean distance (point to point) between the averaged spectra (the average of each group) and rescale this to give 1 or 2...
The Euclidean distance is simple and helpful for seeing small differences in a spectrum, but it is not specific; I mean, 2 different spectra could have the same Euclidean distance from another one. Therefore I guess you have implemented an outlier check using the Euclidean distance between the sample and the predicted group average (this must be smaller than a variation limit).
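Renato's non-specificity point in a few lines: different spectra can sit at exactly the same Euclidean distance from a reference, so distance alone cannot tell you which one you have (toy numbers, not real spectra):

```python
import numpy as np

ref = np.array([1.0, 2.0, 3.0, 2.0, 1.0])        # toy reference "spectrum"
a = ref + np.array([0.1, 0.0, 0.0, 0.0, 0.0])    # perturbed at the first point
b = ref + np.array([0.0, 0.0, -0.1, 0.0, 0.0])   # perturbed at the peak instead

def euclid(u, v):
    """Point-to-point Euclidean distance between two spectra."""
    return float(np.linalg.norm(u - v))
```

Both `a` and `b` lie at distance 0.1 from `ref` yet are different spectra, which is why a distance check needs to be paired with something direction-sensitive (correlation, residuals, etc.).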

Regards,
Renato Guchardi


Universität Basel
Switzerland

hlmark
Posted on Tuesday, September 28, 2004 - 5:25 am:   

Gabi - when you are trying to discriminate between only two groups of data, several methods become equivalent: Regression, Linear Discriminant Analysis, Mahalanobis Distance and others. However, there is a danger in using regression for more than two groups; the danger is that you may assign the groups the "values" 1, 2, 3, but the meaningful spectral differences are present in the samples in a different order: 1, 3, 2, say. Even if you order them correctly, there is the further implicit assumption that the differences between #3 and #2 are the same as the differences between #2 and #1. Violation of either of these conditions is equivalent to a gross error in the reference value when doing quantitative work, and we all know the effect of that on our results! So when there are more than two materials to distinguish among at any one time, you're better off with a qualitative method.

That said, the idea of identifying subgroups and having separate libraries for each subgroup when there are only small differences between the materials in the subgroups is a good one, regardless of the algorithm you're using to do the identification. This approach allows the algorithm to choose optimum parameters (wavelengths, PC factors, etc.) to distinguish the small differences between the materials within the subgroup without being swamped by the larger differences between subgroups.
