
Typeset with NdThesiS version 2.14 (2000/09/08) on June 3, 2004 for Haoshu Wang, entitled STUDIES ON SENSITIVITY OF FACE RECOGNITION PERFORMANCE TO EYE LOCATION ACCURACY. This class conforms to the published instructions, but it is still possible to generate a non-conformant document if the published instructions are not followed! The summary page can be disabled by specifying the nosummary option in the class invocation (i.e., \documentclass[nosummary]{ndthesis}). THIS PAGE IS NOT PART OF THE THESIS, BUT SHOULD BE TURNED IN TO THE PROOFREADER!

NdThesiS documentation can be found at these locations: /~afsunix/faq/tetexdoc/latex/ndthesis/ and /Committees/ITC/. General LaTeX documentation and info: A Guide to LaTeX by Kopka and Daly; LaTeX: A Document Preparation System by Lamport; The LaTeX Companion by Goossens, Mittelbach and Samarin. Packages (check the on-line docs): rotating (sideways tables and figures), longtable (multi-page tables), graphicx (using PostScript and other figures).

STUDIES ON SENSITIVITY OF FACE RECOGNITION PERFORMANCE TO EYE LOCATION ACCURACY

A Thesis Submitted to the Graduate School of the University of Notre Dame in Partial Fulfillment of the Requirements for the Degree of Master of Science in Computer Science and Engineering

by Haoshu Wang, B.S.

STUDIES ON SENSITIVITY OF FACE RECOGNITION PERFORMANCE TO EYE LOCATION ACCURACY

Abstract

by Haoshu Wang

Face recognition systems generally require the location of face landmarks, often the eyes. Eye location accuracy can be assessed in absolute terms (proximity to the known eye location) and also in application-specific terms (the performance of a system that employs the location). This thesis assesses an automatic commercial eye-finding system in both absolute and application-specific terms. Experimental results highlight the sensitivity of the tested face recognition systems to eye location accuracy. The time-lapse effect is also investigated in this study, and the results suggest that face recognition system performance degrades with elapsed time.

CONTENTS

CHAPTER 3: …
CHAPTER 6: ANALYSIS OF AUTOMATIC EYE LOCATION FAILURES
  6.1 Statistics of "Bad" …
  6.2 Statistics of "Failed" …

TABLES
2.1, 3.1, 3.2, 3.3, 3.4, 3.5, 5.1, 5.2, 6.1, 6.2, 6.3, 6.4, 6.5

FIGURES
2.1 Six images captured for one subject in one image acquisition session
3.3 Histogram for eye separation (in pixels) of images taken under controlled lighting
3.4 Histogram for eye separation (in pixels) of images taken under uncontrolled lighting
3.6 RMS errors for automatic eye locations of FaceIt-G3 with different …
3.8 Eye location results for images taken under controlled lighting …
5.1 Face recognition systems' performance comparison for using eye locations of different …
5.2 Face recognition systems' performance comparison for using eye locations of different accuracies …
5.3 Face recognition systems' performance comparison using different …
5.4 First rank of face recognition for images taken under controlled lighting
5.5 Performance plot for the fixed gallery test shown as first rank correct match percentage against time lapse …
5.6 The frequency of probe images at different time lapse for the fixed gallery …
5.7 Performance plot for the running gallery test shown as first rank correct match percentage against time lapse …
5.8 The frequency of probe images at different time lapse …
5.9 …
5.10 Performance plot for the fixed gallery test shown as first rank correct match percentage against time lapse …
5.11 Performance plot for the running gallery test shown as first rank correct match percentage against time lapse …
5.12 Performance plot for the fixed gallery test …; the system is tested with the three different …
5.13 Performance plot for the running gallery test …; the system is tested with the three different …
6.1 Images taken under controlled lighting condition labeled "bad" …
6.2 Images taken under uncontrolled lighting labeled "bad" by FaceIt-G3
6.3 Images taken under natural sunshine lighting and labeled "bad"
6.4 Images taken under hallway lighting and labeled "bad"
6.5 Images taken under controlled lighting condition and labeled "bad" …
6.6 Sample images with "failed" …
6.7 Sample images with "failed" …

ACKNOWLEDGMENTS

I would like to thank my advisor for his help, great patience and encouragement, which led me into this interesting computer vision field and helped me finish this work. When I first came to Notre Dame, he gave me a great deal of help and many suggestions, without which I would not have come this far. I am also deeply grateful for the love and support I received throughout. This research was supported by DARPA under AFOSR grant F49620-66-1-0388 and ONR grant N00014-02-1-0410.

CHAPTER 1

INTRODUCTION

"The three most important problems in face recognition are registration, registration, and registration!" (invited lecture at AVBPA 2003, paraphrased).

Face recognition approaches can be distinguished by how they register the face, and in most cases the eyes are the landmarks used for registration, for several reasons: eye positions are not easily affected by other face changes (unlike, say, mouth location, which can be affected by expression); the interocular distance is relatively constant; poorly registered faces yield non-representative prototypes in projection spaces; and the orientation of the interocular line can be used to correct the head pose. The authors of [8] proposed a face recognition system which employed eyes as the only facial feature to recognize the face, and this approach obtained a surprisingly high correct classification rate of 85%. Early studies [15] applied the Hough transform to eye location; its value for this critical problem is that the Hough transformation can extract a circle pattern even when it is partly occluded. Experiments with a set of 6 face images under similar lighting conditions yielded small differences. Since those studies, many novel automatic eye extraction or eye location detection algorithms have been developed during the last twenty years, such as the eigenfeature-based method [17], deformable template-based methods [4][5][10], the Gabor wavelet filter method [9], the variance projection function algorithm [6], and neural networks [17].

Oh (2001) proposed a facial feature location algorithm based on eigenfeatures and a neural network [17]. Eigenfeatures are derived from eigenvalues and eigenvectors of a binary edge data set constructed from eye and mouth fields, and can locate eye and mouth regions efficiently. The extracted eigenfeatures are used to train a multilayer perceptron, and the output indicates the degree to which a specific region contains an eye or a mouth. The experiment results showed, on 61 facial images of 14 people with varying facial size and pose, a recognition rate of 96.8% for the eyes and 100% for the mouth. The use of a deformable template to extract the eye boundary was proposed in 1989 [21]. Deformable templates consist of a parametrized geometric model; an energy function is defined according to a priori knowledge about the expected shape of features and is used to guide the contour deformation process using geometrical constraints.

However, the original deformable template approach is computationally costly. The authors of [4] proposed a simplified deformable template model, which reduced the processing time while sacrificing some precision; however, the degraded-precision contour is still of sufficient quality for recognition purposes. The authors of [5] used the deformable template to extract eye features after estimating the approximate position of the eyes, which helps to accelerate the convergence of the deformable template fitting. Lam and Yan [10] introduced a corner-based initialization: once the eye corner is detected, the template can be initialized accurately, and experiments showed that 40% of the execution time was saved. The authors of [20] (2003) introduced a facial feature extraction algorithm using a Bayesian Shape Model (BSM) and an affine invariant deformable model. The face model consists of the contour points and the control points, and the affine invariant deformable model is used to describe the local shape deformations between the prototype and the observed face. Experiments verified that the BSM is more effective. Huang and Wechsler (1999) [9] implemented optimal wavelet packets for eye representation and radial basis functions for the subsequent classification of the facial area; experimental results showed 85% accuracy.

Besides methods that locate or detect the eye directly, many eye detection methods locate the rough face or eye windows first, and then perform a finer search. Feng and Yuen (2001) [6] first find possible eye windows using three cues: one cue is a property of the eye region itself, the second cue is the direction of the interocular line determined by PCA, and the third cue is the result of convolving their proposed eye variance filter with the image. Based on those three cues, candidate eye windows are cross-checked; for each possible eye window, the Variance Projection Function (VPF) is used for eye detection and verification. They document eye detection accuracy of 92.5% with 930 face images from the MIT AI lab database with different conditions. The authors of [11] proposed a new way to locate eyes based on valley fields. Eye valley field detection is used to extract the possible eye candidates from the complex background; then the local properties of these eye candidates are checked (such as roughness and orientation represented by fractal dimensions). The possible eye pairs are then further verified by comparing estimated fractal dimensions of the eye-pair window and the corresponding face region with the respective reference values. The paper also proposed a modified method to estimate the fractal dimension. Experiments on the MIT face database show that the proposed approach can achieve an overall rank 1 correct rate of 100% without head tilt and under head-on lighting, and 90.6% with tilted faces. When the light source deviation to the faces is 45 degrees from frontal, the rank 1 correct rates for the upright and tilted faces are 87.5% and 85.9%, respectively. Experiment results using the ORL database show that the overall rank 1 correct rate for upright faces is 92.9%, and for faces subjected to a slight perspective variation it is 82.4%. It was concluded that this method is robust in locating eyes under different scales, orientations and lighting conditions. Wides et al. [3] proposed a method that assumes the pupil and iris diameter to be approximately 130 to 160 pixels in a 640-pixel-wide image. Their work used a multi-resolution coarse-to-fine search approach to maximize gradient strengths and uniformities measured across rays radiating from a candidate iris or pupil center. Experiments with 670 eye images, which included very low contrast, specular reflection and oblique views, yielded 98% accuracy. The authors of [13] used a human face gravity-center template for face location, which also provides position information for the eyebrows, eyes, nose and mouth. Then, based on the face organs' location information obtained from the gravity-center template match, the region around each organ such as the eyes is scanned and 4 key points, which determine the size of the organ, are located. Once these key points are located, the shape of the organ is characterized by curve fitting. The main advantages of this approach are its simplicity and speed, taking 0.28 seconds per image on average on a Pentium-550 PC.

The algorithms reviewed above all try to locate eye positions as accurately as possible, since eye location accuracy will affect the face recognition system's performance significantly. However, there is little understanding of the effect of eye location on a face recognition system's performance. The authors of [12] offer one such study: their experimental results suggest that the eigenface algorithm is more sensitive to eye positions that deviate above or below the enrolled positions. In this thesis, we examine the effect of eye location accuracy on face recognition systems' performance and the sensitivities of different systems. We select two different face recognition systems: an open source PCA based system developed at Colorado State University [2] and the FaceIt system [1]. For FaceIt, both the older version G3 and the newer version G5 are tested; G5 was claimed to have much better performance than G3 for both automatic face-finding and recognition.

In this study, facial image data is drawn from a database of over 33,000 high-resolution color images collected at the University of Notre Dame [7]. Each image in the database is accompanied by "ground truth writing" eye coordinates. Eye coordinates were also extracted using the face-finding function (eye locater) included in the FaceIt suite, in both versions, G3 and G5. Face recognition experiments are then performed with the PCA system and FaceIt. Chapter 2 describes the data collection. Chapter 3 describes the eye location techniques employed and assesses their accuracy. Chapter 4 briefly describes the face recognition systems. Chapter 5 describes experiments and results. Chapter 6 addresses situations where the automatic eye location fails. Chapter 7 provides comments and conclusions.

CHAPTER 2

DATA COLLECTION

The database used in these experiments was collected between Spring 2002 and Spring 2003 at the University of Notre Dame, and forms a component of a larger database that supports longitudinal (time-lapse) studies of face recognition systems' performance [7]. Approximately 200 experimental subjects were photographed weekly with a high-resolution color camera, and two additional groups of subjects also participated. Following is a brief description of the image acquisition; we are concerned primarily with the lighting conditions, the facial expression of the subjects, and the cameras used. References to brand names and models are intended to enable replication of results and not to endorse a particular vendor or product.

2.1 Image Log Information

All the images used in this study were collected in 3 sections: Spring 2002, Fall 2002 and Spring 2003. Table 2.1 provides detailed image information for each section.

Table 2.1: IMAGE LOG INFORMATION

Section       Resolution
Spring 2002   1600 × 1200
Fall 2002     1600 × 1200 or 1704 × 2272
Spring 2003   1704 × 2272

(a) Uncontrolled | FA  (b) Uncontrolled | FB  (c) LM | FA  (d) LM | FB  (e) LF | FA  (f) LF | FB

Figure 2.1: Six images captured for one subject in one image acquisition session.

CHAPTER 3

EYE LOCATION: TECHNIQUES

In this study, we examine how eye location accuracy affects face recognition systems' performance. Two different systems are used: one is the PCA-based system developed at CSU [2] and the other is the FaceIt system from Identix [1]. For FaceIt, both G3 and G5 are used; a detailed description of the systems is given later. In this chapter, the eye location generation techniques and the metrics to examine the accuracy of eye location are the primary focus.

3.1 Eye Location Generation

Since we need to investigate the effect of eye location accuracy on face recognition systems' performance, we should have different sets of eye location coordinates with different accuracies. In this study, we use two different approaches: one is to locate eye positions manually using trained human annotators, which is called Ground Truth Writing; automatic eye location is itself performed using two techniques (FaceIt-G3 and FaceIt-G5), yielding three techniques in total.

3.1.1 Ground Truth Writing

In ground truth writing, the operator manually marks the eye positions in each image. In this study, the coordinates obtained in this way are called Truth Writing (TW) coordinates. We employed this ground truth writing process on each image in our data set. Figure 3.1 depicts the user interface we used for this manual processing; the pinpoints of the center of the pupil, the nose tip and the mouth center are marked for historical reasons.

Figure 3.1: Ground truth writing user interface.

Table 3.1: PROPERTIES ASSOCIATED WITH THE FACE FINDING FUNCTION OF FACEIT-G3

AutoEyeSpacing: for eye spacing; if this option is set on, … the constraint is applied after …
MaxEyeSpacing: the eyes must be at most this distance apart.
ShrinkFactor: used to make large images tractable; the source image is scaled down by half both vertically and horizontally …
SearchDepth: controls how thoroughly the function searches the image by controlling how many "candidate" … are examined. Values range from 0 to 10.
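Read as a configuration record, the properties in Table 3.1 can be sketched as follows. This is only an illustrative sketch: the field names echo the table, but the types, default values and the halving interpretation of ShrinkFactor are our assumptions, not the vendor's documented API.

```python
# Illustrative sketch of the FaceIt-G3 face-finding settings listed in Table 3.1.
# Field names follow the table; types, defaults and the ShrinkFactor semantics
# are assumptions for illustration, not the actual vendor API.
from dataclasses import dataclass


@dataclass
class FaceFindSettings:
    auto_eye_spacing: bool = True  # let the engine constrain eye spacing itself
    max_eye_spacing: int = 400     # eyes must be at most this many pixels apart
    shrink_factor: int = 1         # per the table, shrinking halves the source
                                   # image vertically and horizontally (assumed)
    search_depth: int = 10         # 0..10: how many "candidate" faces to examine

    def validate(self) -> None:
        """Reject settings outside the documented SearchDepth range."""
        if not 0 <= self.search_depth <= 10:
            raise ValueError("SearchDepth must be in [0, 10]")


settings = FaceFindSettings(shrink_factor=2)
settings.validate()
print(settings.shrink_factor)  # → 2
```

A record like this makes the later experiments easy to express: the batch runs in Section 3.3 amount to sweeping `shrink_factor` over (2, 4, 8) while keeping the other fields at their defaults.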

Figure 3.2: Eye locations generated by FaceIt software.

3.1.2 Automatic Eye Locating: FaceIt-G3 and FaceIt-G5

The FaceIt-G3 software suite provides a face-finding function. There are some parameters associated with the face-finding function that can be tuned to allow users to control settings for their own applications. All the images we used are of high resolution, and we hope to make the whole process automatic, so we keep the default setting for AutoEyeSpacing. Likewise, for the properties of NumEyeSpacing and SearchDepth, we are not willing to decrease their values to speed up the procedure while sacrificing accuracy, so we keep those default values.

We applied three different Shrink-Factors, with values of 2, 4 and 8. As a result, we have three sets of automatic eye locations, which should be of different accuracies. Figure 3.2 shows eye locations generated this way. For FaceIt-G5, we employed the "auto-face-finding function". This function performs the face finding automatically; the method operates similarly to the face-find function, but in FaceIt-G5 no properties need to be tuned for the face-finding. In order to keep the eye locating process automatic, we choose auto-face-finding in this experiment.

3.2 Quality Evaluation Measures for Automatic Eye Location

A companion face-checking function in the FaceIt package provides several quality measures. The summary metric has a value between -1 and 10 (inclusive). A value of 0 means the eye location failed; images with this score are labeled "failed". A value of 10 means the eye positions are located with full confidence; images with this resulting score are labeled "good". A value between 0 and 10 means the eye location is obtained without full confidence. The function provides scores of 2.5, 5.0 and 7.5 in this range: a value of 2.5 indicates the located eye positions are guessed, a value of 5.0 means the eyes are located with low confidence, and a value of 7.5 means the eyes are located with medium confidence. In our experiment, we treat all images yielding scores in this interval the same, and label them "bad".

3.3 Quality Checking Results for Automatically Generated Eye Locations

We wrapped the FaceIt face finding and face checking functions for batch processing. For FaceIt-G3, we tuned the ShrinkFactor to values of (2, 4, 8). Statistics of the automatic eye location generation results are tabulated in Tables 3.2 through 3.5. In the tables, "good", "bad" and "failed" refer to the quality checking results returned by the face-checking function. The results show that as the ShrinkFactor increases, the number of "good" images under controlled lighting increases, while the number of "good" images under uncontrolled lighting shows the opposite trend. According to the FaceIt documentation, there is an ideal eye separation after the image size is decreased by the ShrinkFactor. Histograms of eye separation for images under controlled lighting and uncontrolled lighting are shown in Figures 3.3 and 3.4. From those two histograms, we can see that images taken under controlled lighting have larger eye separations; as a result of this, a large ShrinkFactor is preferred for images taken under controlled lighting.
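The score-to-label scheme of Section 3.2 can be stated directly as code. A minimal sketch follows; the function name is ours, and scores at or below 0 are assumed equivalent to the "failed" score of 0 described in the text.

```python
def label_eye_location(score: float) -> str:
    """Map a FaceIt face-checking summary score (-1 to 10, inclusive) to a
    quality label, following the scheme described in Section 3.2.
    Scores at or below 0 are treated as "failed" (an assumption for -1)."""
    if not -1 <= score <= 10:
        raise ValueError("summary score must lie in [-1, 10]")
    if score == 10:
        return "good"    # eyes located with full confidence
    if score <= 0:
        return "failed"  # eye location failed outright
    return "bad"         # 2.5 guessed, 5.0 low, 7.5 medium confidence


print([label_eye_location(s) for s in (10, 7.5, 5.0, 2.5, 0)])
# → ['good', 'bad', 'bad', 'bad', 'failed']
```

Collapsing the intermediate scores into one "bad" label mirrors the experimental choice above: only "good" images enter the recognition experiments, and the others are set aside for Chapter 6.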

For FaceIt-G5, the number of "good" images is greater than in G3, and the number of "bad" images is smaller. We use the images labeled "good" in the recognition experiments; images labeled "bad" or "failed" will be studied further in Chapter 6.

Figure 3.3: Histogram for eye separation (in pixels) of images taken under controlled lighting.

3.4 Metrics of Eye Location Accuracy Measurement

In order to investigate the sensitivity of face recognition systems' performance to eye location accuracy, we should first measure the error of the eye locations.

Table 3.2: STATISTICS … Controlled 22,308 | Bad 78 | Total 10,486 | 3,691 | 83 | 100.00% | 29,098 | 12.23% (Percentage)

Table 3.3: STATISTICS … Controlled 22,580 | Bad 36 | Total 10,486 | 3,827 | 41 | 100.00% | 29,234 | 11.95% (Percentage)

Table 3.4: STATISTICS … Controlled 22,670 | Bad 3 | Total 10,486 | 4,084 | 20 | 100.00% | 29,055 | 12.55% (Percentage)

Table 3.5: STATISTICS … Controlled 22,727 | Bad 2 | Total 10,486 | 883 | 380 | 100.00% | 31,952 | 2.75% (Percentage)

Figure 3.4: Histogram for eye separation (in pixels) of images taken under uncontrolled lighting.

After checking the results of automatic eye location labeled "good", we decided to treat the TW eye coordinates as the accurate standard, and the disparities of the automatic eye locations are calculated relative to the TW coordinates as follows.

3.4.1 Metric 1

E_RMS, the average of two Root Mean Square (RMS) values, E_RMS,L and E_RMS,R, is used to measure eye location error. E_RMS,L is the disparity of the two sets of coordinates for the left eye, and E_RMS,R is the corresponding value for the right eye. It can be expressed as

(a) Images under controlled lighting  (b) Images under uncontrolled lighting

Figure 3.5

\[
E_{RMS} = \frac{1}{2}\left(E_{RMS,L} + E_{RMS,R}\right),
\qquad
E_{RMS,L} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left\lVert \mathbf{p}^{TW}_{i,L} - \mathbf{p}^{A}_{i,L}\right\rVert^{2}},
\]

where \(\mathbf{p}^{TW}_{i,L}\) and \(\mathbf{p}^{A}_{i,L}\) are the TW and automatic (FaceIt) left-eye coordinates for image \(i\), so that \(E_{RMS,L}\) measures the disparity of the TW eye coordinates and the FaceIt eye coordinates for the left eye; \(E_{RMS,R}\) is defined analogously for the right eye.
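Read this way, the metric is straightforward to compute. A minimal sketch follows, assuming E_RMS,L is the root mean square of per-image left-eye disparities over a set of N images (the function names are ours):

```python
import math


def rms_disparity(truth, auto):
    """RMS of Euclidean disparities between ground-truth (TW) and automatic
    eye coordinates for one eye, over a set of images.
    truth, auto: equal-length lists of (x, y) pixel coordinates."""
    n = len(truth)
    return math.sqrt(sum((tx - ax) ** 2 + (ty - ay) ** 2
                         for (tx, ty), (ax, ay) in zip(truth, auto)) / n)


def e_rms(truth_left, auto_left, truth_right, auto_right):
    """E_RMS: the average of the left- and right-eye RMS disparities."""
    return 0.5 * (rms_disparity(truth_left, auto_left)
                  + rms_disparity(truth_right, auto_right))


# Two images: the automatic left-eye locations are off by (3, 4) and (0, 0),
# so the left RMS is sqrt(25 / 2); the right eye is located exactly.
print(e_rms([(100, 120), (200, 220)], [(103, 124), (200, 220)],
            [(160, 120), (260, 220)], [(160, 120), (260, 220)]))
# → approximately 1.7678
```

Averaging the two eyes keeps the metric symmetric in left and right, so a locator that is accurate on one eye but poor on the other is still penalized.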
