NIR Discussion Forum » Bruce Campbell's List » Chemometrics » Pretreatment


Howard Mark (hlmark)
Senior Member
Username: hlmark

Post Number: 24
Registered: 9-2001
Posted on Monday, June 19, 2006 - 4:58 am:   

Sol - if you have enough samples, something else you might try is separating them into two sets: "clear" and "slurry" and calibrating each set separately.

Then, if the "clear" samples show spectral peaks where the "slurry" samples are noisy, you can retain those wavelengths for the clear samples. You certainly did the right thing in eliminating the wavelengths where the slurry samples have no peaks and a lot of noise, but you don't want to throw out the baby with the bathwater, so to speak.
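
A rough sketch of what I mean, just for illustration - the array names, the synthetic data, and the 9000 cm-1 cutoff are all placeholders, not anything from your data:

import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

# Synthetic stand-ins, purely for illustration: 40 spectra covering
# 12000-4300 cm-1, a reference value y, and a flag marking the slurry samples.
wavenumbers = np.linspace(12000, 4300, 500)
X = rng.normal(size=(40, wavenumbers.size))
y = rng.normal(size=40)
is_slurry = np.arange(40) >= 20

# "Clear" samples keep the full range, so peaks above 9000 cm-1 are retained.
X_clear, y_clear = X[~is_slurry], y[~is_slurry]

# "Slurry" samples drop the noisy 12000-9000 cm-1 region before calibration.
keep = wavenumbers <= 9000
X_slurry, y_slurry = X[is_slurry][:, keep], y[is_slurry]

# Calibrate the two sets separately.
pls_clear = PLSRegression(n_components=5).fit(X_clear, y_clear)
pls_slurry = PLSRegression(n_components=5).fit(X_slurry, y_slurry)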

Howard

\o/
/_\

Solomon Abebe (sol)
New member
Username: sol

Post Number: 5
Registered: 3-2006
Posted on Monday, June 19, 2006 - 2:19 am:   

Hi everybody,
I have a large number of NIR spectra (12000-4300 cm-1) for calibration purposes; some were taken in pure solution and some in slurry. The spectra taken in slurry have a very noisy region at the short-wavelength end and a large baseline shift over the entire spectral range.

I cut out the highly noisy region (12000-9000 cm-1) and then applied a second-order derivative, thinking that if I applied the derivative without removing the noisy section it might amplify the noise.
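
Roughly what I did, as a sketch - the array names, the synthetic numbers, and the smoothing window are only placeholders, not my real data:

import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(1)

# Stand-in for the measured spectra, purely for illustration:
# rows are samples, columns run from 12000 down to 4300 cm-1.
wavenumbers = np.linspace(12000, 4300, 800)
spectra = rng.normal(size=(30, wavenumbers.size))

# 1) Cut out the noisy 12000-9000 cm-1 region first ...
keep = wavenumbers <= 9000
spectra_cut = spectra[:, keep]

# 2) ... then take the Savitzky-Golay second derivative, so the derivative
#    is not computed over the noisy section it would otherwise amplify.
deriv2 = savgol_filter(spectra_cut, window_length=15, polyorder=2,
                       deriv=2, axis=1)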

The portion of the spectra I cut out has no peaks, either in the pure-solution state or in slurry at low concentration.

My question is: has anyone done this before? I just want to be sure about my approach and to have a reference from others. If you know of one, could you please tell me where I can find the paper?

Regards

Howard Mark (hlmark)
Senior Member
Username: hlmark

Post Number: 23
Registered: 9-2001
Posted on Sunday, June 18, 2006 - 10:56 am:   

Lois - Don't assume. If you want to normalize the data by dividing by the standard deviation, then you've got to subtract the mean yourself before computing the S.D. and dividing by it; otherwise exactly what you're seeing will happen.

The fact that the calibration program subtracts the mean during the course of its calculations is immaterial; yes, it would wind up with "means" of zero to "subtract" if you do the mean subtraction beforehand, but that doesn't help you if you don't do the subtraction. Addition/subtraction and multiplication/division do not commute, so you've got to make sure they're done in the order you want. The S.D. computation also subtracts the mean as part of the S.D. calculation, and as you can see, that doesn't help you either.
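
A quick numerical sketch of the point, using the first six values of each of your columns (any numbers would do; only the order of operations matters):

import numpy as np

# First six values of the two columns from the question, as an example.
col1 = np.array([0.95, 1.67, 1.17, 1.19, 0.83, 1.23])
col2 = np.array([0.07545, 0.07560, 0.07570, 0.07580, 0.07580, 0.07590])

# Dividing by the S.D. alone leaves the columns at very different levels,
# because the means are still in the data.
print(col1 / col1.std(ddof=1))   # roughly 3 to 6
print(col2 / col2.std(ddof=1))   # roughly 460 to 470

# Subtracting the mean first and then dividing by the S.D. (autoscaling)
# puts both columns on the same footing.
print((col1 - col1.mean()) / col1.std(ddof=1))
print((col2 - col2.mean()) / col2.std(ddof=1))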

\o/
/_\

Lois Weyer (lois_weyer)
Junior Member
Username: lois_weyer

Post Number: 19
Registered: 7-2002
Posted on Sunday, June 18, 2006 - 8:23 am:   

I have a data set that is not a series of spectra, but a collection of somewhat independent parameters. I am having trouble standardizing (or normalizing) the data in Unscrambler so that the variables will have equal chances to contribute to the final PLS equation. I am assuming that Unscrambler subtracts the mean before doing its other data treatments. If I divide by the standard deviations, the columns end up at very different levels. Here are two example columns. Could someone please help me standardize or normalize?

Column 1:
0.95
1.67
1.17
1.19
0.83
1.23
0.97
1.14
1.10
0.86
1.10
1.16
Column 2:
0.07545
0.07560
0.07570
0.07580
0.07580
0.07590
0.07560
0.07550
0.07550
0.07560
0.07560
0.07490
