Author |
Message |
Howard Mark (hlmark)
Senior Member Username: hlmark
Post Number: 313 Registered: 9-2001
| Posted on Thursday, March 18, 2010 - 9:08 am: | |
Andrew - you make a good point. "Division" math is a non-linear data transform, and therefore can do things that are, at best, difficult with purely linear methods. Regression, on the other hand, although a linear method, is an inherently optimizing method. Even Karl's software requires major user input to optimize the "division math" parameters. Arguably that's a good thing, but it presents the new user with a very high and steep learning curve. Karl's way of combining the advantages of the two ways of dealing with the data is, perhaps, not unique to him since we have responses on the thread of others doing the same thing, but certainly Karl has the most experience, and arguably the best insight into how to combine the two methods so as to get best results. The problem is that it is not easy to get the insight, and so the instrument companies do not have incentive to promote that method over (or even comparably to) one that they can sell as a "pushbutton" method - "push the button and let the computer do the thinking for you". Of course by the time the user realizes that he has to do at least some of his own thinking, he's locked into the instrument and the software package provided. Even if he purchases a third-party software package, the MLR capability is generally minimal compared to the development resources expended on the PLS approach (if it exists at all), the "division math" capability non-existent and the novice user gets little or no help on doing anything more than the routine methodology. \o/ /_\ |
Andrew McGlone (mcglone)
Intermediate Member Username: mcglone
Post Number: 20 Registered: 2-2001
| Posted on Wednesday, March 17, 2010 - 10:55 pm: | |
For what it's worth, I object to division math being lumped in with multiple linear regression. Division math has non-linear power that linear methods can't access. In my case, at the moment, it very much seems to act in the sense of making the model self-normalising to a particular but variable character in each sample. Or am I splitting hairs here, unable to grasp that MLR in the NIR community actually stands for so much more, something like "automated derivative-screening MLR"? And Don, you said "There are wonderful techniques coming along that have the effect of adding a priori knowledge into an analysis." Are you referring to Tom Fearn's recent paper in JNIRS on using a Bayesian framework for calibration? That paper stunned me when he presented it at the conference in Thailand. His final results are incredible compared to what he shows the other calibration methodologies can do on the data: prediction errors better than half those achieved using PLS, SVM, ANN or LOCAL. I'm very annoyed I haven't been able to get even close yet to reproducing his method... |
venkatarman (venkynir)
Senior Member Username: venkynir
Post Number: 97 Registered: 3-2004
| Posted on Wednesday, March 17, 2010 - 9:15 am: | |
Dear Lois and Mui: First, I want to make it clear that I am not against MLR. But as technology grows, one method takes the edge over another. We should work continuously on innovative method development rather than blaming this and that. It is possible. We wrote an object-oriented program for blown-film thickness measurement (on-line measurement) in 1995 under Windows 95. Everything from the sensor to the rest of the electronics was built by us. We provided outlier removal for the user, and the RMSEP used a scale factor. This was for a four-filter type of equipment. Our synchronization of the reference wavelength and the measurement wavelength was a novel one. We used stepwise (forward) linear regression with Gauss-Jordan elimination to solve the equations. Intensive work was done with MLR using terms like (R1+R2)/M1, (R1+2R1R2+R3)/M1, and so on. We saw the light in 1995, and that system is still working now. Later we moved to PLS for multivariate analysis. I am not supporting PLS or MLR; each has its own advantages and disadvantages. We should work on new methods rather than sticking around the old bush. |
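The stepwise (forward) MLR with Gauss-Jordan elimination mentioned above can be sketched in a few lines. This is a minimal, generic illustration with made-up numbers, not the 1995 blown-film code: at each step the candidate wavelength column that most reduces the residual sum of squares is added, and each trial fit solves its normal equations by Gauss-Jordan elimination.

```python
# Minimal sketch of forward stepwise MLR, with the normal equations
# solved by Gauss-Jordan elimination. Real inputs would be spectral
# readings at candidate wavelengths; the logic is instrument-agnostic.

def gauss_jordan_solve(A, b):
    """Solve A x = b by Gauss-Jordan elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        M[col] = [v / M[col][col] for v in M[col]]     # normalise pivot row
        for r in range(n):
            if r != col:
                f = M[r][col]
                M[r] = [v - f * w for v, w in zip(M[r], M[col])]
    return [M[i][n] for i in range(n)]

def fit_rss(x_cols, y):
    """Least-squares fit of y on the given columns (plus an intercept);
    returns (residual sum of squares, coefficients)."""
    n = len(y)
    cols = [[1.0] * n] + x_cols
    A = [[sum(ci[t] * cj[t] for t in range(n)) for cj in cols] for ci in cols]
    b = [sum(c[t] * y[t] for t in range(n)) for c in cols]
    beta = gauss_jordan_solve(A, b)
    pred = [sum(bj * cols[j][t] for j, bj in enumerate(beta)) for t in range(n)]
    return sum((p - yt) ** 2 for p, yt in zip(pred, y)), beta

def forward_stepwise(x_cols, y, max_terms=2):
    """Greedily add the wavelength column that most reduces the RSS."""
    chosen, remaining = [], list(range(len(x_cols)))
    for _ in range(max_terms):
        best = min(remaining,
                   key=lambda j: fit_rss([x_cols[i] for i in chosen] + [x_cols[j]], y)[0])
        chosen.append(best)
        remaining.remove(best)
    return chosen
```

The greedy search is the essence of the "forward" strategy: it never revisits earlier choices, which keeps the equations short, at the possible cost of missing the globally best subset.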
Howard Mark (hlmark)
Senior Member Username: hlmark
Post Number: 312 Registered: 9-2001
| Posted on Wednesday, March 17, 2010 - 8:36 am: | |
Venkynir - I'm afraid your statement, "A strong technology will edge over other," looked at scientifically, isn't supported by the evidence. In an ideal world your statement would be true, but in the real world marketing and salesmanship often overshadow technical merit. To remove the discussion from the spectroscopic realm and into one that we're all familiar with but don't necessarily have strong opinions about (any more, at least; at one time "religious" wars on the subject raged as badly as anywhere), consider computers. When IBM decided, back in the early 1980s, to take the plunge into microcomputers, they virtually wiped out all the other microcomputer manufacturers, despite their computer being technologically inferior, and obviously so to anyone at all familiar with the microcomputers available at the time. Other companies' computers had faster processors, better graphics, hard disk drives (the first IBM PC had only floppy disk drives), etc. To this day, the "PC"s that virtually all of us use were developed from that early IBM model. Apple is just about the only company still in continuous existence since then that makes a mass-market computer, and even their market share is only a fraction of the "WINTEL" design's. This is all because of IBM's marketing capabilities and clout. In the case of PLS/PCR versus MLR, the early promotion of the full-spectrum methods suppressed the usage of, interest in, and development of MLR, to the point where the instrument and software companies hardly even mention it any more, and therefore users are largely not even aware of its presence, much less capable of deciding when it might be beneficial, as attested by the people who do use it. Thus it becomes a self-perpetuating condition. \o/ /_\ |
Donald J Dahm (djdahm)
Senior Member Username: djdahm
Post Number: 35 Registered: 2-2007
| Posted on Wednesday, March 17, 2010 - 7:57 am: | |
Hello, young folks, fellow old timers, and self-described fossils: My take on all this is that the various mathematical tools are just that, and some tools are better than others for a particular job. One tool is not optimal for every situation. I think the development of Partial Least Squares was one of the great mathematical triumphs of my generation. It is a powerful tool. I am not against Chemometrics; I am against "Chemometrics, run amok", to quote myself. I think we who decry the fact that certain techniques are in the arsenal are doing so on philosophical rather than technical grounds. We are saying that if NIR practitioners were forced to include other knowledge in their analyses, we think there might be an even greater benefit. In my view that is the key: how do we get the things we "know" into the analysis? MLR is certainly a way to get spectroscopic specificity in. MSC (and related techniques) is a way to account for what we know about the effect of particle size. However, you must know enough about your system and how NIR works to be sure that the scheme you concoct is appropriate for the analysis at hand. There has been a suggestion that in the limit, if you discard enough spectral regions, PLS and MLR are not so different. Like poker, the trick is to know which cards to throw away and which to keep in building a winning hand. There are wonderful techniques coming along that have the effect of adding a priori knowledge into an analysis. I doubt any of us old folks want to stop the development of the tools. What we want to fight, until our fast-approaching deaths, is the idea that there is some magic in the application of these tools that can substitute for actual thought. And it seems to us the old tools were better than the new ones at getting that in. |
Sirinnapa Saranwong (mui)
Junior Member Username: mui
Post Number: 6 Registered: 10-2006
| Posted on Wednesday, March 17, 2010 - 7:40 am: | |
Agree with Lois very much. From NSAS/MLR Fan |
Lois Weyer (lois_weyer)
Senior Member Username: lois_weyer
Post Number: 37 Registered: 7-2002
| Posted on Wednesday, March 17, 2010 - 7:17 am: | |
Mr. or Ms. Venkatarman or Venky: I think that, unless a person has started his or her NIR career using the DOS NSAS software with its automated derivative-screening MLR, he or she would not know the power of it. I suspect that most PLS users have not seen that technology and so cannot compare. They think of MLR as a simple linear regression. I have done many comparisons and have found that the derivative variations, as well as the division math, are capable of developing simple equations that are more logical, linear over a much wider range, and just as robust as PLS equations, if not more so. I agree with the others that PLS took over due to marketing issues. Lois |
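The "automated derivative-screening MLR" of the NSAS era is proprietary, but the general idea of screening gap derivatives for the best single regression term can be sketched roughly as follows. This is my own reconstruction of the idea with invented data, not the NSAS algorithm itself: for each gap size, a simple gap first derivative is computed at every wavelength index, and the single derivative term with the highest squared correlation to the reference values is kept.

```python
# Illustrative sketch of a derivative-screening step: search over gap
# sizes and wavelength positions for the single gap-derivative term
# that best correlates with the reference values. A real package would
# also screen second derivatives, segment smoothing, term ratios, etc.

def gap_first_derivative(spectrum, i, gap):
    """First-difference derivative across a gap of `gap` points."""
    return spectrum[i + gap] - spectrum[i - gap]

def r_squared(x, y):
    """Squared Pearson correlation between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return 0.0 if sxx == 0 or syy == 0 else sxy * sxy / (sxx * syy)

def screen_derivatives(spectra, y, gaps=(1, 2, 4)):
    """Return (gap, index, r2) of the best single derivative term."""
    npts = len(spectra[0])
    best = (None, None, -1.0)
    for gap in gaps:
        for i in range(gap, npts - gap):
            term = [gap_first_derivative(s, i, gap) for s in spectra]
            r2 = r_squared(term, y)
            if r2 > best[2]:
                best = (gap, i, r2)
    return best
```

A nice side effect, visible even in this toy version, is that the difference in the derivative cancels any additive baseline offset that is constant within a spectrum.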
venkatarman (venkynir)
Senior Member Username: venkynir
Post Number: 96 Registered: 3-2004
| Posted on Wednesday, March 17, 2010 - 4:14 am: | |
Dear Bob Rosenthal: First, thanks for the remark: "For example, there is an NIR commercial instrument that measures percent body fat from a range of 2 percent to 50 percent with an accuracy equal to the official methods and a precision equal to 0.1 percent." You are aware of the difficulty of on-line instrument measurement over a wide range of 2% to 50% on the same scale. It is also difficult to reach that accuracy over a wide range. You know very well that to reach that accuracy at the low end of the scale, our resolution should be higher. I used to segment the range of measurement into two scale factors, one for the low range and the other for the high range. Please think of an accuracy of 2% at 2%. To Mark: you have to accept that PCR and PLS are outgrowths of MLR. Your words: "So you had plenty of company in setting up the wrong technologies". Technology is growing; only after proper proof and validation is it accepted by the company. A strong technology will edge over others. The technology suppliers are experimental people only. I hope you will change your mindset. |
Howard Mark (hlmark)
Senior Member Username: hlmark
Post Number: 311 Registered: 9-2001
| Posted on Tuesday, March 16, 2010 - 5:48 pm: | |
Bob - what I stated was correct. You may have used division math, but I didn't know about it, so the statement stands! As for software, I can provide a package that runs under WINDOWS. It also has some unique wavelength-search capabilities. You can look on my web site for a description of it, but that's probably as much as I can say on the discussion group without getting banned for posting commercial messages. We can talk about it privately, though. I don't think you should blame yourself too much for promoting PCR & PLS. Harald Martens was the first person to go around "beating the drums" for those methods, and the mid-IR community also latched onto them, because they had even worse wavelength-search problems at the time than we in the NIR world did. Also, Harald was instrumental in founding CAMO, which grew into one of the premier third-party software companies serving the spectroscopic community, and guess what he promoted most in that software? Ditto Infometrix and Galactic (now Thermo). So you had plenty of company in setting up the wrong technologies. \o/ /_\ |
Andrew McGlone (mcglone)
Intermediate Member Username: mcglone
Post Number: 19 Registered: 2-2001
| Posted on Tuesday, March 16, 2010 - 12:43 pm: | |
Colleagues and I are using 'division' math, as you call it, right now. We have a plan to develop a 3- or 4-laser unit system for a job. Using lasers provides both a significant light-intensity advantage and great noise rejection through pulsing and phase-locking. Now the better models are coming out as a ratio of two differences using only 3 wavelengths: (a-b)/(c-b). It's obviously a simple normalisation process. Now, what might be additionally interesting is that we have done the basic searching for key wavelengths using the full spectral range, carefully measured with a wide-band source and a spectrometer. If we generate linear predictive equations via PLS, then we have to be careful to segment the range to remove noisy or otherwise redundant regions. Very good models result, better than the division math (and by about the SNR advantage you would expect from adding more signal at neighbouring wavelengths, etc.), but unfortunately we can't use those models with our laser system. Finally, I'll add that the mention of Kawano-san in a previous post in this thread reminds me that I learnt the importance of doing the range segmentation with PLS while working in his lab. He was passionate about MLR, but to my mind PLS just looked like the same sort of basic thing - they are both just math algorithms for developing linear equations. What became clear very quickly was that the PLS solutions often needed just as much 'intelligent' work, especially in terms of range segmentation, to generate solutions equivalent to the MLR ones. I ended up thinking it was really a matter of personal choice what you used for the types of problems we were then studying; both ways you had to think quite hard about where your information was! |
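The three-wavelength ratio model described above, (a-b)/(c-b), can be illustrated with a tiny sketch. The numbers are invented; the point is only the form of the predictor and its self-normalising property: a common multiplicative gain on all three readings cancels out of the ratio.

```python
# Sketch of a three-wavelength "division math" predictor: a ratio of
# two differences, followed by a simple straight-line calibration
# against reference values. All numbers used with it here are made up.

def ratio_feature(a, b, c):
    """Self-normalising ratio of two differences, (a - b) / (c - b)."""
    return (a - b) / (c - b)

def fit_line(x, y):
    """Ordinary least-squares slope and intercept for y = m*x + k."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    m = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
        sum((a - mx) ** 2 for a in x)
    return m, my - m * mx
```

Because any gain g gives (g*a - g*b) / (g*c - g*b) = (a - b) / (c - b), sample-to-sample intensity variations (path length, source drift) drop out before the regression ever sees them, which is exactly the appeal for a pulsed-laser system.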
Bob Rosenthal (rosenthal)
New member Username: rosenthal
Post Number: 5 Registered: 1-2006
| Posted on Tuesday, March 16, 2010 - 9:44 am: | |
Hi Everybody, I do appreciate your comments on my posting. The following are my responses: Mui, I do agree with you. I also have not found any good modern software available that does either MLR or Division Math. In fact, the software I have was written for DOS, and I have to keep an old 486 computer in my office so I can still run it. However, the point is that it does work, and it provides excellent calibrations in both MLR and Division Math. Venkatarman, you are absolutely correct. In order to use either Division Math or MLR correctly, you must understand the spectra. I was fortunate in the 1970s that my office was close to Karl Norris', and he was able to beat through my thick skull how to understand the spectra. In your second posting, you mentioned that one of the limitations of MLR is the short range of measurement capability. That is true; however, there are certain "tricks" you can do with MLR to give you extremely long ranges. For example, there is an NIR commercial instrument that measures percent body fat over a range of 2 percent to 50 percent with an accuracy equal to the official methods and a precision equal to 0.1 percent. Howard, you hit the nail on the head. The reason MLR and Division Math disappeared is that they didn't have the "pizzazz" that PCR and PLS have. And I was the idiot that made the decision to add the "pizzazz" of PCR and PLS, when my old company introduced the first commercial high-energy scanning spectrophotometer (that company is now part of Foss), to help sell such an expensive NIR product. That proved to be an excellent business decision; however, I don't think it was a good technical decision. I do want to correct the last paragraph of your posting, where you stated that you don't know of anybody except Karl Norris that used Division Math. I also use division math, and for your information my company has delivered over 40,000 commercial NIR instruments using Division Math.
Tony, as always your postings provide good insight, and as usual, I always argue with you. I believe the key to NIR gaining more widespread usage is to reduce the difficulties of calibration and of getting a sufficient number of samples. When these difficulties are combined with the high education level required for understanding PCR and PLS, the number of successful applications is limited. As I previously stated, I believe that Division Math requires fewer calibration samples and is simpler to understand than PLS. Thus, if it were re-introduced as a "new approach" with enough trumpets and drums beating so that it has, to use Howard's word, PIZZAZZ, it could greatly expand the use of NIR quantitative analysis. Bob Rosenthal |
venkatarman (venkynir)
Senior Member Username: venkynir
Post Number: 95 Registered: 3-2004
| Posted on Monday, March 15, 2010 - 11:39 pm: | |
Dear Tony: May I be permitted to make remarks on your last lines: "As Mui said, when you are running on-line processes it has to be fast, but also there is often more noise around, so it pays to keep the equations short." 1. With selective and segmented PLS we can optimize computational speed. 2. When you keep the equations short, it may lead to a fall in accuracy. 3. I have observed that prediction and validation with MLR fail to meet. 4. For a short range of measurement, MLR appears to be fine. 5. We have to think of a ratiometric method in MLR so that the noise problem might be avoided. 6. Reports say that SPM + MLR is better than PLS. |
Tony Davies (td)
Moderator Username: td
Post Number: 223 Registered: 1-2001
| Posted on Monday, March 15, 2010 - 4:01 pm: | |
Hello Bob, Glad to hear that you are still active in NIR! But the truth is we are all fossils!! (There is a joke hiding in there, I'm sure!) I agree with everything that Howard said (unusual but possible!). At a "Chambersburg Meeting" some time ago I got Karl to demonstrate his method in a live session. He always claims that he doesn't add anything, but in the session it was very obvious that he was using his vast experience of NIR spectroscopy and his insights into the interactions, which I think everyone found difficult to follow but knew he was right! Harald didn't really intend PLS to be used as a black box to produce long equations. If you use the "jackknife" it is quite easy to reduce the length of the equation without losing predictability, and I have regularly used it in this way. I regard PLS as a two-edged sword. It definitely did give NIR a boost and encouraged the pharma people to join us, but we had been developing tools that, given a bit more time (and the computer power we have today), would have made MLR more manageable. As Mui said, when you are running on-line processes it has to be fast, but also there is often more noise around, so it pays to keep the equations short. Best wishes, Tony |
Howard Mark (hlmark)
Senior Member Username: hlmark
Post Number: 310 Registered: 9-2001
| Posted on Monday, March 15, 2010 - 3:12 pm: | |
BTW - when Karl gives a talk these days (which is becoming rarer and rarer) he still does use his special software and special insights to analyze his results using the "division math". \o/ /_\ |
Howard Mark (hlmark)
Senior Member Username: hlmark
Post Number: 309 Registered: 9-2001
| Posted on Monday, March 15, 2010 - 3:10 pm: | |
Bob - there are many reasons why PCR and PLS "took over" the calibration world, over MLR. Some are technical, such as the difficulty in automatically selecting wavelengths to use with the MLR algorithm. Sure, Karl had his ways to deal with the spectra to reduce everything to a single term, but that required both his special software (which probably could have been reproduced) and his special brain, which probably could not. The instrument manufacturers always wanted to make the calibration procedure as simple, and more importantly as automatic, as possible for the user. Some retained the MLR capability in the software but even they didn't promote it. The other main reason for MLR being downgraded was that it had no "pizzazz"; PCR and PLS came along and everybody (meaning the instrument manufacturers) wanted to be able to claim the "latest and greatest" new algorithm; they didn't see much marketing benefit in using the "same old, same old" algorithms. Are there any manufacturers still providing MLR software? Sure, but I doubt that many users even know it's there, or what to do with it if they do know. As for the division math, I don't know that anybody except Karl ever took to it - besides the lack of software, it was "too hard" compared to letting the computer do all the "thinking" for you. \o/ /_\ |
venkatarman (venkynir)
Senior Member Username: venkynir
Post Number: 94 Registered: 3-2004
| Posted on Monday, March 15, 2010 - 1:57 pm: | |
I do agree with Mui on using MLR; however, it fails when you move to a close range of measurements, in prediction and validation. The instrument manufacturers use the trick of adding a gain factor and an offset value, that is, converting multivariate into univariate; the regression constant plays a vital role. I understood that SPM + MLR is better than PLS. If you are confident about the wavelength contributions, then MLR is good; otherwise not! If you take more contributing wavelengths, you have a lengthy equation and regression values. I found MLR with a ratiometric approach good. |
Sirinnapa Saranwong (mui)
New member Username: mui
Post Number: 5 Registered: 10-2006
| Posted on Monday, March 15, 2010 - 1:32 pm: | |
Bob, You are not alone. We, Sumio Kawano and I, and many others in Japan, especially those with high-speed sorting machines, are using MLR too. It is easier to understand and easier to maintain. For research, though, we do not have good software to work with MLR under the new types of Windows. Everyone just likes PLS. So we have no choice, but we would love to do MLR or Division Math if there were good Windows-based software, like a hybrid between The Unscrambler and NSAS. Mui |
Bob Rosenthal (rosenthal)
New member Username: rosenthal
Post Number: 4 Registered: 1-2006
| Posted on Monday, March 15, 2010 - 1:12 pm: | |
In the late 1970s Karl Norris and Phil Williams presented a paper describing "Division Math." This technique selects a single-term numerator that is related to the amount of the particular constituent to be measured. It combines that numerator with a single-term denominator that is related to the total amount of organic constituents within a sample. Typically, a single first derivative of the spectrum at a constituent-sensitive point would be used as the numerator, and a different first derivative, sensitive to the total of all the constituents, as the denominator (in most cases a difference in OD was used instead of an actual derivative). In a slightly more complex approach, a second derivative was used in the numerator and a different second derivative in the denominator. This technique was developed before the widespread use of personal computers for NIR applications, and thus preceded PLS and all the other "modern techniques." However, unlike the modern techniques, it provided a simple, low-cost means of making measurements that was usable far beyond the limits of the calibration samples. As an example, for a calibration based on samples between 8 and 15 percent, the resulting calibration proved to be valid from 2 percent to 19.5 percent. Therefore, the advantages were twofold: - Having only one variable term, it greatly reduced the number of samples needed for a calibration, and - It allowed the calibration to be extrapolated. Over the past 35 years I have evaluated NIR to accurately measure many constituents using PLS, division math, and MLR. Seldom, if ever, have I found a PLS solution that was better than division math or MLR, and therefore all of my "production" work has used the simpler spectral approaches. Admittedly, it does require the NIR user to study and understand the spectra of the material and of its significant constituents.
However, in my mind, that is an advantage, because then I am not just using "computer magic" to make the measurements. Thus, my questions: Is anyone else still using division math, or MLR? Or am I just a relic of times gone by? Bob Rosenthal |
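The single-term division-math calibration outlined above (a constituent-sensitive derivative divided by a total-organics-sensitive derivative, regressed against concentration) might be sketched as follows. The band indices and synthetic spectra are purely illustrative and not from any real instrument or the Norris/Williams paper:

```python
# Hedged sketch of a single-term "division math" calibration: a
# second-derivative term at a constituent-sensitive band divided by a
# second-derivative term at a band sensitive to total organics, then a
# one-variable least-squares fit against the reference concentrations.

def second_derivative(s, i):
    """Simple three-point second difference at index i."""
    return s[i - 1] - 2.0 * s[i] + s[i + 1]

def division_term(spectrum, num_idx, den_idx):
    """The single division-math variable: ratio of two derivatives."""
    return second_derivative(spectrum, num_idx) / second_derivative(spectrum, den_idx)

def calibrate(spectra, conc, num_idx, den_idx):
    """Fit conc = m * ratio + k; return a predictor for new spectra."""
    x = [division_term(s, num_idx, den_idx) for s in spectra]
    n = len(x)
    mx, my = sum(x) / n, sum(conc) / n
    m = sum((a - mx) * (b - my) for a, b in zip(x, conc)) / \
        sum((a - mx) ** 2 for a in x)
    k = my - m * mx
    return lambda s: m * division_term(s, num_idx, den_idx) + k
```

Because the model has only one variable term, and the ratio cancels the common multiplicative factor (path length, scatter level) between the two bands, a straight-line fit can remain valid well outside the concentration range of the calibration set, which is the extrapolation behaviour described in the post.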