For the linearly non-separable case, positive slack variables \xi_i are introduced into the constraints:

  y_i(\mathbf{w} \cdot \mathbf{x}_i + b) \geq 1 - \xi_i, \quad \xi_i \geq 0, \quad \forall i    (5)

and the objective function becomes:

  \min \; \frac{1}{2}\|\mathbf{w}\|^2 + C \sum_i \xi_i    (6)

The dual problem is the same as before, except that the \alpha_i are now bounded:

  0 \leq \alpha_i \leq C    (7)

and the solution is still of the form (4).

2.5.3 Nonlinear support vector machines

The above SVM algorithm for linearly separable data sets can be extended to nonlinear data. The general idea is that we can map the nonlinear data into another space in which we can still do a linear separation on the mapped data. To do the mapping, we need to find a suitable "kernel function". A detailed discussion of these nonlinear problems can be found in [9].

3. Database

The database to be used [1] is based on two sets of images taken approximately four years apart. The first set consists of photographs taken during 1990-91; the second set consists of photographs taken during 1994-95. The images included in the database were taken from a study spanning four years in the Ouachita Forest in Arkansas. Photographs taken under controlled conditions were digitized using an extremely high-quality scanning process and converted into computer-readable data.

The area under study was partitioned into four blocks, with each block subdivided into five plots. The database includes images from each plot during each of the four seasons in the year. The plots were photographed both in the 1990-91 period and the 1994-95 period. Each plot was photographed from four different angles. Thus the total number of images in the database is 4 blocks x 5 plots x 4 seasons x 4 angles x 2 years = 640. In addition, 40 images (20 images from the 1990-91 …
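The kernel-mapping idea of Section 2.5.3 can be illustrated with a toy example. This sketch is ours, not the paper's: the data set and the map phi(x) = (x, x^2) are invented for illustration, and no actual SVM training is performed.

```python
# Toy illustration (ours, not from the paper): 1-D points labeled +1 when
# |x| > 1.5 and -1 otherwise cannot be split by a single threshold, but
# become separable after the nonlinear map phi(x) = (x, x^2).
data = [(-3.0, 1), (-2.0, 1), (-1.0, -1), (-0.5, -1),
        (0.5, -1), (1.0, -1), (2.0, 1), (3.0, 1)]

def phi(x):
    """Map a 1-D input into a 2-D feature space."""
    return (x, x * x)

def separable_by_threshold(points):
    """True if one threshold on a 1-D coordinate separates the two classes."""
    pos = [p for p, label in points if label == 1]
    neg = [p for p, label in points if label == -1]
    return max(neg) < min(pos) or max(pos) < min(neg)

raw = [(x, label) for x, label in data]              # original coordinate
mapped = [(phi(x)[1], label) for x, label in data]   # second feature-space coordinate
```

In the mapped space the second coordinate x^2 alone separates the classes with one threshold, which is exactly the kind of linear decision region a kernel SVM constructs implicitly.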
Appendix A

5.2.2 Confusion Matrix

The following confusion matrix is for the experiment using RGB+LL on data set 1, which gave the error rate of 32.1%.

            hsbe   msbe   lsbe
  hsbe        3      0      0
  msbe       25    100     23
  lsbe        0      3      5

5.3. Histogram

We obtained the histograms for the experiment results by …

… distributions to the SVMs, thereby constructing a better nonlinear decision region. We expect that the classification capability of the combined system will be a great improvement over the system using either technique independently.

7. Evaluation

The system performance using the various algorithms was evaluated in a comparative study, with the help of a front-end demo tool developed in Tcl-Tk [A].

8. Acknowledgements

Sincere thanks to Aravind Ganapathiraju for helping us implement SVM in the USFS system. Thanks to Suresh Balakrishnama for giving us a real lead into ICA implementation. Thanks to Jonathan Hamaker, Neeraj Deshmukh and Dr. Joseph Picone for their immense help in making our project a success. Finally, we would also like to thank the students at ISIP for helping us in various aspects of the project.

[Figure: Results for ICA]

SCENIC BEAUTY ESTIMATION USING INDEPENDENT COMPONENT ANALYSIS AND SUPPORT VECTOR MACHINES

X. Zhang, V. Ramani, Z. Long, and Y. Zeng

Image Group
Department of Electrical and Computer Engineering
Mississippi State University, Mississippi State, Mississippi, 39762
{zhang, ramani, long, zeng}@isip.msstate.edu

… images to determine the utility of a plot of forest land both in terms of timber use and scenic quality. Several data modeling algorithms have been used for the scenic beauty estimation of forestry images, including Principal Component Analysis (PCA) and Decision Trees. Each has found only limited success on this task.
In this paper, we explore the use of two promising new techniques, Independent Component Analysis (ICA) and Support Vector Machines (SVMs).

Independent Component Analysis [5] is the name for a family of techniques which in recent years have produced interesting results on multi-source and single-source audio data, natural images, etc. It is a simple, general-purpose method for transforming ensembles of multivariate data into natural coordinate systems. When the data is transformed by the ICA transformation, the resulting class variables are said to be as statistically independent as possible. This is a considerable improvement over PCA, which considers only orthogonal mappings. Statistical independence is particularly important in the USFS problem, where the features are known to have a high degree of overlap.

Support Vector Machine (SVM) [9] is a machine learning technique which has been effective in many pattern matching applications such as face recognition and phone classification. With SVMs, input vectors are mapped into a higher-dimensional feature space through a nonlinear mapping. In this space a linear classification decision region is constructed.
This decision region, when mapped back into the original feature space, can take a nonlinear form.

Results for ICA (error rates, %):

  features      set 1   set 2   set 3   set 4   mean    variance
  ALL           35.22   33.54   31.87   33.12   33.44   5.7457
  RGB           34.59   33.54   32.50   33.12   33.44   2.3185
  RGB+LL        n/a     33.54   33.12   33.12   33.59   1.4443
  RGB+LL+ENT    34.59   34.18   33.12   32.50   33.60   2.7569
  RGB+LL+FRA    35.85   33.54   32.50   33.12   33.75   6.4135

Results for SVM (error rates, %):

  features      set 1   set 2   set 3   set 4   mean    variance
  ALL           34.6    34.2    31.2    31.9    33.0    8.4475
  RGB           35.2    34.8    32.5    32.5    33.8    6.3300
  RGB+LL        32.1    32.3    31.9    32.5    32.2    0.2222
  RGB+LL+ENT    35.2    34.2    31.9    32.5    33.4    6.9300
  RGB+LL+FRA    35.2    35.2    32.5    31.2    33.5    12.0675
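The SVM results table can be checked programmatically. This snippet is our own illustration (the variable names are not from the paper): it recomputes the RGB+LL mean from the per-set values and confirms that RGB+LL is the best feature combination, matching the 32.2% figure quoted in the paper.

```python
# Mean error rates (%) transcribed from the SVM results table above;
# the dictionary name is ours, not the paper's.
svm_mean_error = {
    "ALL": 33.0,
    "RGB": 33.8,
    "RGB+LL": 32.2,
    "RGB+LL+ENT": 33.4,
    "RGB+LL+FRA": 33.5,
}

# Recompute the RGB+LL mean from its per-set error rates (sets 1-4).
rgb_ll_sets = [32.1, 32.3, 31.9, 32.5]
rgb_ll_mean = round(sum(rgb_ll_sets) / len(rgb_ll_sets), 1)

# The feature set with the lowest mean error rate.
best = min(svm_mean_error, key=svm_mean_error.get)
```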
[Figure: Comparison between PCA, ICA and SVM]

[Figure: Results for SVM]

[Figure: Front end tool for comparing the various classification approaches]

ECE4773/Digital Signal Processing, December 11, 1998

The primary advantage of this technique is that it is able to model highly nonlinear data, and special properties of the decision surface ensure good generalization. In our work we use the public-domain SVMlight package, which provides access to many different SVM kernels.

A total of 45 features is extracted from the images, including color content, density of long lines, entropy and fractal dimension. We examined many combinations of features to find the one that matches the human subjective ratings best and thus gives the lowest error rates. The best error rates achieved using ICA and SVMs are 33.44% and 32.2%, respectively. These results are superior to all previously reported results on this problem. Encouraged by these results, we believe that we can combine these two techniques to produce a better classification scheme. By using ICA as a front-end to SVMs, we will be able to supply optimally separated distributions to the SVMs, thereby constructing a better nonlinear decision region.
We expect that the classification capability of the combined system will be a great improvement over using either technique independently.

2. Theory

2.1. Principal Component Analysis

Principal component analysis (PCA) [2] is commonly used for data compression by dimension reduction. It is supposed that the first principal component of a sample vector is the direction along which there is the largest variance over all samples. This direction corresponds to the eigenvector associated with the largest eigenvalue. The kth principal component is chosen to be the linear combination of the input features that has the largest variance, under the constraint that it is also uncorrelated with the previous k-1 principal components. This approach is well suited for data compression, where the objective is to transmit data with minimal distortion. It also finds application in pattern classification, where the consideration is that the direction along which there is maximum variation is also most likely to contain the information about the class.

… problem: Given the inequalities

  y_i(\mathbf{w} \cdot \mathbf{x}_i + b) \geq 1, \quad \forall i    (1)

find a pair of hyperplanes:

  H1: \mathbf{w} \cdot \mathbf{x} + b = +1
  H2: \mathbf{w} \cdot \mathbf{x} + b = -1

which give the maximum margin by minimizing \frac{1}{2}\|\mathbf{w}\|^2, subject to constraints (1).

Reformulating the problem using the Lagrangian method, we have the Lagrangian:

  L_P = \frac{1}{2}\|\mathbf{w}\|^2 - \sum_i \alpha_i y_i(\mathbf{x}_i \cdot \mathbf{w} + b) + \sum_i \alpha_i    (2)

It is further explained in [8] that minimizing (2) is the same as maximizing:

  L_D = \sum_i \alpha_i - \frac{1}{2} \sum_{i,j} \alpha_i \alpha_j y_i y_j (\mathbf{x}_i \cdot \mathbf{x}_j)    (3)

Therefore support vector training amounts to maximizing (3) with respect to the \alpha_i, subject to the constraint \sum_i \alpha_i y_i = 0 and positivity of the \alpha_i, with the solution given by:

  \mathbf{w} = \sum_i \alpha_i y_i \mathbf{x}_i    (4)

… First, prewhiten the data \mathbf{x} by \mathbf{V}: \tilde{\mathbf{x}} = \mathbf{V}\mathbf{x}, where \mathbf{V} = \langle \mathbf{x}\mathbf{x}^T \rangle^{-1/2}.
The matrix \mathbf{W} is then initialized to the identity matrix, and trained using the logistic function.
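The prewhitening step described above can be sketched in pure Python for the two-dimensional case. This is our own illustration under stated assumptions (zero-mean after centering, full-rank covariance), not the paper's implementation, which follows the ICA procedure of [5].

```python
import math

def whiten_2d(data):
    """Prewhiten 2-D data: center it, then apply V = C^(-1/2), where C is the
    covariance matrix, so the output has identity covariance.
    Assumes the data are not degenerate (both eigenvalues of C positive)."""
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    pts = [(x - mx, y - my) for x, y in data]
    cxx = sum(x * x for x, _ in pts) / n
    cyy = sum(y * y for _, y in pts) / n
    cxy = sum(x * y for x, y in pts) / n
    # Eigenvalues of the symmetric covariance matrix [[cxx, cxy], [cxy, cyy]].
    tr, det = cxx + cyy, cxx * cyy - cxy * cxy
    root = math.sqrt(tr * tr / 4 - det)
    l1, l2 = tr / 2 + root, tr / 2 - root
    # Rotation angle of the principal axes.
    theta = 0.5 * math.atan2(2 * cxy, cxx - cyy)
    c, s = math.cos(theta), math.sin(theta)
    d1, d2 = 1 / math.sqrt(l1), 1 / math.sqrt(l2)
    # V = R diag(d1, d2) R^T, the inverse square root of C.
    a = c * c * d1 + s * s * d2
    b = c * s * (d1 - d2)
    d = s * s * d1 + c * c * d2
    return [(a * x + b * y, b * x + d * y) for x, y in pts]

# Correlated toy data (ours): strongly elongated along the x = y diagonal.
data = [(t + 0.3 * u, t - 0.3 * u) for t in (-1, 0, 1) for u in (-1, 0, 1)]
white = whiten_2d(data)
```

After whitening, the sample covariance of `white` is the identity matrix, which is the starting point the ICA training step assumes.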
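As a supplementary illustration of the PCA description in Section 2.1, the first principal component of two-dimensional data can be computed directly from the sample covariance matrix. This sketch is ours, not the paper's code; the toy data set is invented.

```python
import math

def principal_component_2d(data):
    """First principal component: the unit eigenvector of the 2x2 sample
    covariance matrix associated with its largest eigenvalue."""
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    sxx = sum((x - mx) ** 2 for x, _ in data) / (n - 1)
    syy = sum((y - my) ** 2 for _, y in data) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in data) / (n - 1)
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    lam = tr / 2 + math.sqrt(tr * tr / 4 - det)   # largest eigenvalue
    if abs(sxy) < 1e-12:                          # axis-aligned covariance
        return (1.0, 0.0) if sxx >= syy else (0.0, 1.0)
    vx, vy = lam - syy, sxy                       # eigenvector for lam
    norm = math.hypot(vx, vy)
    return (vx / norm, vy / norm)

# Toy data (ours): points stretched along the x = y diagonal with slight noise,
# so the direction of largest variance is (1, 1) / sqrt(2).
data = [(t + 0.05 * s, t - 0.05 * s) for t in (-2, -1, 0, 1, 2) for s in (-1, 1)]
pc1 = principal_component_2d(data)
```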