Patent abstract:
The present invention relates to a personal identification method using finger node fingerprints (finger crease patterns). The method comprises the steps of: a) extracting region-of-interest data from input-converted image data of a finger; b) low-band filtering the extracted region-of-interest data; c) extracting finger boundary regions from the low-band filtered data; d) extracting finger node fingerprints from the low-band filtered data; e) clustering the finger node fingerprints by applying the extracted finger boundary regions to the extracted finger node fingerprints; f) independently matching the clustered finger node fingerprints against previously registered finger node fingerprint clusters; and g) recognizing a specific person according to the matching result. According to the present invention, image acquisition is easy and user convenience is excellent. In addition, the practicality of biometrics using node fingerprints is improved by registering each cluster independently through the clustering of the finger node fingerprints.
Publication number: KR20020092522A
Application number: KR1020010031173
Filing date: 2001-06-04
Publication date: 2002-12-12
Inventors: 최환수; 임상균
Applicant: 주식회사 테크스피어
IPC main class:
Patent description:

Personal identification method and device using finger node fingerprints {METHOD FOR IDENTIFYING A BIOMETRIC PERSON USING A FINGER CREASE PATTERN AND APPARATUS THEREOF}
[26] The present invention relates to a method and apparatus for identifying an individual using a finger node fingerprint, and more particularly, to a method and apparatus for identifying an individual using a finger node fingerprint clustering technique.
[27] Recently, the importance of biometrics as a method of identifying an individual is gradually increasing.
[28] Biometrics refers to performing personal identification using an individual's unique physical or behavioral characteristics.
[29] The unique characteristics of an individual include fingerprints, hand geometry, the iris of the eye, the face, the vein pattern on the back of the hand, and voice patterns, and the features differ in user convenience, acceptability, and recognition performance.
[30] Systems that identify individuals using the above features have been developed and deployed, but in most systems the recognition rate is low, or the images corresponding to these characteristics cannot yet be acquired stably enough for reliable identification, so practical application remains difficult.
[31] Accordingly, an object of the present invention is to solve the above-mentioned conventional problems, and to provide a personal identification method and apparatus for stably performing personal identification using a finger node fingerprint of an individual.
[1] 1 is a block diagram of a personal identification device using a finger node fingerprint according to an embodiment of the present invention.
[2] 2 is a flowchart of a personal identification method using a finger node fingerprint according to an embodiment of the present invention.
[3] FIG. 3 is a detailed flowchart of a finger boundary region extraction step of the personal identification method of FIG. 2.
[4] 4 is a detailed flowchart of a finger node fingerprint extraction step of the personal identification method of FIG. 2.
[5] FIG. 5 is a detailed flowchart of a clustering step of finger node fingerprints in the personal identification method of FIG. 2.
[6] FIG. 6 is a detailed flowchart of a matching step of an input image and a registered image cluster in the personal identification method of FIG. 2.
[7] FIG. 7 is a diagram illustrating a still image captured by a CCD camera in the personal identification device of FIG. 1.
[8] FIG. 8 is a diagram illustrating an ROI image extracted from the still image of FIG. 7.
[9] FIG. 9 illustrates a vertical border region image binarized from the ROI image of FIG. 8.
[10] FIG. 10 is a diagram illustrating an image obtained by Hough transforming the image of FIG. 9.
[11] FIG. 11 is a diagram illustrating an image in which a finger boundary region is defined with respect to the image of FIG. 10.
[12] FIG. 12 is a diagram illustrating node fingerprint images binarized after unsharp masking filtering of the image of FIG. 8.
[13] FIG. 13 is a view illustrating an image in which a ghost is removed from the image of FIG. 12 and only a fingerprint part is extracted.
[14] 14 is a diagram illustrating a node fingerprint center point on the image of FIG. 13.
[15] FIG. 15 is a diagram illustrating a distance between the node fingerprint center points shown in FIG. 14.
[16] FIG. 16 is a view illustrating clustering of the node fingerprints illustrated in FIG. 15.
[17] FIG. 17 is a view showing a center point of clusters shown in FIG. 16.
[18] FIG. 18 is a diagram illustrating a distance between center points of clusters shown in FIG. 17.
[19] FIG. 19 is a diagram illustrating the circumscribed rectangle of a node fingerprint cluster of a registered image stored in the data memory of the personal identification device of FIG. 1, together with the center point of the circumscribed rectangle.
[20] 20 is a diagram illustrating node fingerprint clusters of an input image corresponding to the image of FIG. 19.
[21] 21 is a diagram illustrating matching while moving the image of FIG. 19 and the image of FIG. 20.
[22] 22 illustrates an EER (Equal Error Rate) graph according to an embodiment of the present invention.
[23] * Explanation of symbols for main parts of the drawings
[24] 10: key input unit, 20: data memory, 30: CCD camera
[25] 40: frame grabber, 50: image memory, 60: microprocessor
[32] As a means for achieving the above object, the present invention preprocesses an input finger image to reinforce the vertical boundary regions of the fingers and the node fingerprints, then clusters and registers the node fingerprints of each finger. A finger node fingerprint image generated through the same process from a user requesting identification is then matched independently against the registered clusters to perform personal identification.
[33] According to this aspect, the present invention provides a method comprising the steps of: a) extracting region-of-interest data from input-converted image data of a finger; b) low-band filtering the extracted region-of-interest data; c) extracting finger boundary regions from the low-band filtered data; d) extracting finger node fingerprints from the low-band filtered data; e) clustering the finger node fingerprints by applying the extracted finger boundary regions to the extracted finger node fingerprints; f) independently matching the clustered finger node fingerprints against previously registered finger node fingerprint clusters; and g) recognizing a specific person according to the matching result.
[34] The c) finger boundary region extraction step may include: reinforcing the vertical boundary portions of the fingers in the low-band filtered data; binarizing the vertical-boundary-enhanced data; and defining finger boundaries on the binarized data.
[35] Here, the vertical boundary portion of the finger may be strengthened by using a mask that strengthens the vertical boundary with respect to the low-band filtered data.
[36] The binarization may be performed by an algorithm for automatically selecting a binarization value according to a probabilistic distribution of gray level levels from a gray level histogram.
[37] After the binarized data is Hough transformed, linear components of a predetermined length or more are selected, and a reference component is chosen from among them to define the boundary regions of the fingers.
[38] The d) finger node fingerprint extraction may include filtering the low band filtered data using an unsharp masking filter; Binarizing the filtered data; And performing labeling on the binarized data.
[39] In the labeling step, labels below a predetermined size and labels whose principal components are vertical are judged to be ghosts rather than node fingerprints and are removed.
[40] The e) finger node fingerprint clustering step may include: grouping the extracted finger node fingerprint labels based on a clustering reference distance; and merging the resulting finger node fingerprint clusters based on a merge reference distance.
[41] The clustering step includes: obtaining a circumscribed rectangle for each of the finger node fingerprint labels; calculating the center point of each circumscribed rectangle; calculating the distances between the center points; and grouping finger node fingerprint labels whose center-point distances are smaller than the clustering reference distance into one cluster.
[42] The merging step may include: obtaining a circumscribed rectangle for each cluster formed in the grouping step; calculating the center point of each circumscribed rectangle; calculating the distances between the center points; designating clusters as merge candidates when the distance between their center points is smaller than the merge reference distance; and merging the candidate clusters when the minimum distance between the vertices of their circumscribed rectangles is no more than 1/2 of the merge reference distance.
[43] The f) matching step may include: obtaining the circumscribed rectangle of each previously registered finger node fingerprint cluster; calculating the center point of each circumscribed rectangle; finding each input finger node fingerprint cluster whose center point lies within a predetermined range of the calculated center point; aligning the center point of the input cluster with the center point of the registered cluster; and matching while shifting the clusters.
[44] The matching of each cluster in the matching step is characterized by weighted matching.
[45] In this case, weights are assigned to the matching coefficients obtained, within the circumscribed rectangle of each cluster, for the node fingerprint area and background area of the registered cluster and for the node fingerprint area and background area of the input cluster, respectively.
[46] In addition, according to the above features, the present invention provides a key input unit for receiving key input from a user; a data memory in which individual finger node fingerprint data and personal identification numbers are stored in advance; a camera that photographs the finger of a user requesting identification and outputs a corresponding image; a frame grabber that extracts still image data from the finger image output by the camera; an image memory that stores the still image data extracted by the frame grabber; a microprocessor that performs processing such as finger boundary region extraction, finger node fingerprint extraction, and finger node fingerprint clustering on the still image data stored in the image memory and independently matches the resulting data against the data stored in the data memory to identify the user; and an interface unit for data and control communication between an external device and the microprocessor.
[47] Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings.
[48] 1 is a block diagram of a personal identification device using a finger node fingerprint according to an embodiment of the present invention.
[49] As shown in FIG. 1, a personal identification device using a finger node fingerprint according to an embodiment of the present invention includes a key input unit 10, a data memory 20, a CCD camera 30, a frame grabber 40, an image memory 50, a microprocessor 60, and an interface unit 70.
[50] The key input unit 10 is a means for a user who wants to receive a personal identification to input his or her own unique identification number.
[51] The data memory 20 stores each individual's finger node fingerprint data and personal identification number in advance. This data is later compared with the identification number entered through the key input unit 10 and with the data obtained by the microprocessor 60 processing the image captured by the CCD camera 30 and stored in the image memory 50; personal identification is performed through this comparison.
[52] The CCD camera 30 photographs the finger of a user requesting identification and outputs the corresponding image. Since the finger node fingerprints in particular are required, the finger node portions must be positioned so that they can be photographed.
[53] The frame grabber 40 extracts still image data from the user's finger image photographed by the CCD camera 30 and stores it in the image memory 50. At this time, it is preferable to extract the still image data at a stable moment from the image output from the CCD camera 30.
[54] Still picture data output from the frame grabber 40 is stored in the image memory 50, and intermediate processing data processed by the microprocessor 60 may also be stored.
[55] The microprocessor 60 performs various processing on data output from the frame grabber 40 and stored in the image memory 50, and compares the resultant data with the data stored in the data memory 20. Identifies the user.
[56] The processing performed by the microprocessor 60 includes extracting the node fingerprints of the fingers by applying vertical boundary region enhancement and node fingerprint enhancement to the image captured by the CCD camera 30, and clustering the extracted node fingerprints.
[57] If, as a result of identification by the microprocessor 60, the user's finger node fingerprint matches any of the finger node fingerprint data previously stored in the data memory 20, the microprocessor 60 outputs a signal for controlling an external device through the interface unit 70, for example a door open/close signal.
[58] 2 to 6 are flowcharts of a personal identification method using a finger node fingerprint according to an embodiment of the present invention. Hereinafter, the operation of the personal identification device will be described in detail with reference to the flowcharts of FIGS. 2 to 6.
[59] First, a user requesting identification enters his or her personal identification number through the key input unit 10, and then places a hand so that the CCD camera 30 can accurately photograph it, in particular the finger node fingerprint region (S10).
[60] The key input unit 10 transmits the entered personal identification number to the microprocessor 60, which determines whether the number exists in the data memory 20 (S20). If it does not, that is, if the user cannot be identified as a registered person, the process returns to the identification number entry step (S10). If the number is present in the data memory 20, the microprocessor 60 receives the user's finger node image according to the following process (S30).
[61] Under the control of the microprocessor 60, the CCD camera 30 captures an image of the user's finger and outputs a corresponding video signal, which is converted by the frame grabber 40 into one frame of still image data as shown in FIG. 7 and temporarily stored in the image memory 50.
[62] Images can be captured by the CCD camera 30 under various conditions; for example, a 640 × 480 8-bit grayscale image can be obtained under yellow LED illumination.
[63] The microprocessor 60 extracts a region of interest (ROI) containing the finger node fingerprint portions from the still image data stored in the image memory 50, as shown in FIG. 8 (S40), and then performs the following preprocessing to extract the node fingerprints from this region. Although the ROI size is specified here as 345 × 245, the technical scope of the present invention is not limited thereto, and the size may be chosen arbitrarily within the 640 × 480 still image.
[64] In the region of interest illustrated in FIG. 8, the node fingerprints, finger portions, and spurious marks are mixed at similar gray levels due to the swelling of the palm, uneven illumination, and the bending of the fingers.
[65] Therefore, in order to reduce the noise component of the region of interest, the microprocessor 60 applies low-band filtering with the 11 × 11 Gaussian mask of [Equation 1] below to obtain the noise-reduced image f1(x, y) (S50).
[66]
[67] [Equation 1] 11 × 11 Gaussian mask
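The Gaussian coefficients of [Equation 1] are not reproduced in this text, so the sketch below builds a generic normalized 11 × 11 Gaussian mask and applies it by direct convolution. The mask size matches the text, but the `sigma` value and the zero-padding at image borders are assumptions, not values from the patent.

```python
import math

def gaussian_kernel(size=11, sigma=2.0):
    """Build a normalized size x size Gaussian mask (coefficients sum to 1)."""
    half = size // 2
    k = [[math.exp(-(x * x + y * y) / (2 * sigma * sigma))
          for x in range(-half, half + 1)]
         for y in range(-half, half + 1)]
    s = sum(sum(row) for row in k)
    return [[v / s for v in row] for row in k]

def convolve(img, kernel):
    """Naive 2-D convolution with zero padding; img is a list of lists of gray levels."""
    h, w = len(img), len(img[0])
    kh, kw = len(kernel), len(kernel[0])
    oy, ox = kh // 2, kw // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for j in range(kh):
                for i in range(kw):
                    yy, xx = y + j - oy, x + i - ox
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += kernel[j][i] * img[yy][xx]
            out[y][x] = acc
    return out
```

Because the kernel is normalized, flat regions pass through unchanged while high-frequency noise is attenuated, which is the low-band behavior the text relies on.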
[68] As described above, two steps are required to extract the finger node fingerprints from the noise-reduced image f1(x, y).
[69] First, to prevent the node fingerprints of one finger from being grouped into the same cluster as those of an adjacent finger during the clustering process, the boundary region of each finger must be extracted (S60). Second, the node fingerprints of each finger must be extracted so that clustering can be performed (S70).
[70] First, the process of extracting the boundary region of the finger (S60) will be described.
[71] The microprocessor 60 applies the vertical-boundary-enhancement mask of [Equation 2] below to the f1(x, y) image obtained in step S50 to produce an image f2(x, y) in which the vertical boundary portions of the fingers are strengthened (S62), and then binarizes this image to extract the boundary areas between the fingers (S64).
[72]
[73] [Equation 2] vertical border enhancement mask
[74] The binarization uses the algorithm proposed in N. Otsu, "A threshold selection method from gray-level histograms" (IEEE Trans. SMC-9, No. 1, pp. 62-66, January 1979), which automatically selects a binarization value from the probability distribution of the gray-level histogram. The binarized vertical boundary image f3(x, y), separating the background from the fingers, is obtained from f2(x, y) as shown in FIG. 9.
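A minimal sketch of the Otsu threshold selection cited above: the threshold is the gray level that maximizes the between-class variance of the histogram's probability distribution.

```python
def otsu_threshold(hist):
    """Select the threshold maximizing between-class variance from a gray-level histogram."""
    total = sum(hist)
    sum_all = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0 = 0.0      # pixel count of the class below the threshold
    sum0 = 0.0    # gray-level sum of that class
    for t in range(len(hist)):
        w0 += hist[t]
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        sum0 += t * hist[t]
        m0 = sum0 / w0                      # mean of lower class
        m1 = (sum_all - sum0) / w1          # mean of upper class
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

For a bimodal histogram, such as background versus finger boundary pixels, the returned threshold falls between the two modes.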
[75] Next, the finger boundary regions must be defined using the binarized vertical boundary image f3(x, y). As shown in FIG. 9, f3(x, y) is generally affected by noise, and when a boundary-tracking-based algorithm is applied, the probability of failing to define the finger boundaries due to disconnected edges is observed to be very high.
[76] Therefore, in the exemplary embodiment of the present invention, the finger boundaries are defined using the Hough transform, which stably extracts the boundary lines between adjacent fingers rather than exact boundary contours. That is, the microprocessor 60 performs a Hough transform on the vertical boundary regions (S66), selects straight lines estimated to be linear components of a predetermined length or more in the f3(x, y) image, and defines the finger boundaries using them (S68).
[77] FIG. 10 illustrates the linear components obtained by Hough transforming the f3(x, y) image of FIG. 9. Among the straight lines representing vertical components, one reference straight line must be chosen to set the boundary region of each finger. There may be a number of ways to determine this reference straight line; in the present embodiment a straight line representing the boundary of the corresponding fingers is defined as shown in FIG. 11, but the technical scope of the present invention is not limited thereto.
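A toy illustration of the idea behind steps S66–S68: vote in (rho, theta) space for the binarized edge points and keep near-vertical lines with enough support. The integer accumulator resolution, the vote threshold, and the angle window are illustrative assumptions, not values from the patent.

```python
import math

def hough_vertical_lines(points, width, height, theta_window=10, min_votes=5):
    """
    Vote in (rho, theta) space for the given edge points and return the
    (rho, theta_deg) parameters of near-vertical lines (theta within
    +-theta_window degrees of the vertical) with at least min_votes votes.
    """
    acc = {}
    diag = int(math.hypot(width, height))
    for theta_deg in range(-theta_window, theta_window + 1):
        theta = math.radians(theta_deg)
        for (x, y) in points:
            # Line model: rho = x*cos(theta) + y*sin(theta).
            # theta = 0 corresponds to a vertical line x = rho.
            rho = int(round(x * math.cos(theta) + y * math.sin(theta)))
            if -diag <= rho <= diag:
                acc[(rho, theta_deg)] = acc.get((rho, theta_deg), 0) + 1
    return [k for k, v in acc.items() if v >= min_votes]
```

A column of edge pixels at x = c yields a strong peak at (rho = c, theta = 0), which is exactly the kind of reference straight line the text selects between fingers.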
[78] Next, a process (S70) of extracting a node fingerprint of each finger in order to perform clustering will be described.
[79] First, the microprocessor 60 filters the f1(x, y) image obtained in step S50 with the 3 × 11 unsharp masking filter of [Equation 3] below (S72).
[80] The 3 × 11 unsharp masking filter is applied directly to the image f1(x, y) processed in step S50, without regard to the individual finger areas. This filter is a high-boost filter that emphasizes the node fingerprints, which have strong horizontal contours, while smoothing out image distortion caused by uneven illumination.
[81]
[82] Equation 3 unsharp masking filter
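The coefficients of the 3 × 11 unsharp masking filter in [Equation 3] are not reproduced in this text, so the sketch below implements the generic high-boost form: a boosted copy of the pixel minus the local 3 × 11 mean (the low-band component). A mask that is short vertically and wide horizontally responds strongly to horizontal crease lines, as the text describes; the `boost` factor is an assumption.

```python
def unsharp_mask(img, kh=3, kw=11, boost=1.5):
    """High-boost filtering: boost*pixel minus the local kh x kw mean."""
    h, w = len(img), len(img[0])
    oy, ox = kh // 2, kw // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc, cnt = 0.0, 0
            for j in range(-oy, oy + 1):
                for i in range(-ox, ox + 1):
                    yy, xx = y + j, x + i
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += img[yy][xx]
                        cnt += 1
            out[y][x] = boost * img[y][x] - acc / cnt
    return out
```

A dark horizontal crease comes out strongly negative relative to the flat background, so a subsequent binarization separates crease from skin.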
[83] The microprocessor 60 then binarizes the node-fingerprint-enhanced image f'3(x, y) (S74) and performs a labeling process that assigns a label number to each group of elements connected by 8-connectivity. At this time, labels below a predetermined size and labels whose principal components are vertical are judged to be ghosts rather than node fingerprints and are removed.
[84] FIG. 12 illustrates the binarized node fingerprint images, and FIG. 13 illustrates the image in which ghosts have been removed through the labeling process and only the node fingerprint portions remain.
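A compact sketch of 8-connectivity labeling followed by the ghost-removal rule described above: drop labels that are too small or whose bounding box is predominantly vertical (node creases run horizontally). The `min_size` and aspect-ratio thresholds are illustrative assumptions.

```python
def label_components(binary):
    """8-connectivity labeling via flood fill; binary is a 2-D 0/1 grid."""
    h, w = len(binary), len(binary[0])
    labels = [[0] * w for _ in range(h)]
    comps = {}
    next_label = 0
    for y in range(h):
        for x in range(w):
            if binary[y][x] and not labels[y][x]:
                next_label += 1
                stack = [(y, x)]
                labels[y][x] = next_label
                comps[next_label] = []
                while stack:
                    cy, cx = stack.pop()
                    comps[next_label].append((cy, cx))
                    for dy in (-1, 0, 1):
                        for dx in (-1, 0, 1):
                            ny, nx = cy + dy, cx + dx
                            if (0 <= ny < h and 0 <= nx < w
                                    and binary[ny][nx] and not labels[ny][nx]):
                                labels[ny][nx] = next_label
                                stack.append((ny, nx))
    return comps

def drop_ghost_labels(comps, min_size=4, max_aspect=1.0):
    """Discard labels judged to be ghosts: too few pixels, or taller than wide."""
    kept = {}
    for lbl, pts in comps.items():
        ys = [p[0] for p in pts]
        xs = [p[1] for p in pts]
        height = max(ys) - min(ys) + 1
        width = max(xs) - min(xs) + 1
        if len(pts) >= min_size and height / width <= max_aspect:
            kept[lbl] = pts
    return kept
```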
[85] Next, the microprocessor 60 performs clustering of the finger node fingerprints by applying a boundary region of each finger defined using the Hough transform to an image from which the finger node fingerprints are extracted through the labeling process (S80).
[86] This clustering process (S80) will be described in detail below.
[87] First, the microprocessor 60 obtains the circumscribed rectangle of each node fingerprint label in the image produced by the labeling process, that is, the image shown in FIG. 13, and calculates the center point of each rectangle (S804). FIG. 14 shows the circumscribed rectangles of the node fingerprints together with their center points l_c.
[88] Next, the microprocessor 60 calculates the distance l_d between the center points l_c of the circumscribed rectangles obtained in step S804 (S806); this distance is illustrated in the accompanying FIG. 15.
[89] When the clustering reference distance is denoted L_Distance, it is calculated by dividing the average of the distances l_d between the circumscribed-rectangle center points by a constant k, as in [Equation 4] below (S808).
[90] L_Distance = mean(l_d) / k
[91] [Equation 4] Clustering reference distance
[92] Here, the constant k shows the best performance at about 2.5 experimentally.
[93] Subsequently, the microprocessor 60 groups into one cluster the node fingerprint labels whose circumscribed-rectangle center distance l_d is smaller than the reference distance L_Distance (S810). At this time, node fingerprints are not grouped into one cluster unless they lie in the same region delimited by the Hough-transform boundaries. The clustered result is shown in FIG. 16.
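The grouping rule of steps S804–S810 can be sketched as follows: compute pairwise center-point distances, derive L_Distance as their mean divided by k per [Equation 4], and transitively group centers closer than L_Distance, refusing to merge across finger regions. The union-find bookkeeping is an implementation choice, not from the patent.

```python
import math

def cluster_centers(centers, k=2.5, region=None):
    """
    Group center points whose pairwise distance is below
    L_Distance = mean(pairwise distances) / k.  region (optional) maps a
    center index to its finger region; centers in different regions are
    never grouped together.
    """
    n = len(centers)
    dists = [math.dist(centers[i], centers[j])
             for i in range(n) for j in range(i + 1, n)]
    if not dists:
        return [[i] for i in range(n)]
    l_distance = sum(dists) / len(dists) / k

    parent = list(range(n))        # union-find forest
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a

    for i in range(n):
        for j in range(i + 1, n):
            if region and region[i] != region[j]:
                continue           # Hough boundary separates these fingers
            if math.dist(centers[i], centers[j]) < l_distance:
                parent[find(i)] = find(j)

    clusters = {}
    for i in range(n):
        clusters.setdefault(find(i), []).append(i)
    return list(clusters.values())
```

With k ≈ 2.5 as the text reports, creases of the same knuckle (small mutual distances) fall into one cluster while distant creases stay separate.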
[94] Next, the microprocessor 60 calculates the center point c_c of the clusters bundled in step S810 (S812), and then calculates the distance c_d between the center points c_c (S814). The center point c_c calculated as described above is shown in FIG. 17.
[95] Meanwhile, when the reference distance for deciding whether to merge clusters is denoted C_Distance, it is determined by adding a weight τ to half the average distance between the cluster center points (S816). This calculation is given in [Equation 5] below, and the reference distance is illustrated in FIG. 18.
[96] C_Distance = mean(c_d) / 2 + τ
[97] [Equation 5] Cluster merge reference distance
[98] Here, the constant τ is a value for considering the distance deviation between the clusters, and shows the best performance at about 10 experimentally.
[99] Next, if clusters obtained in step S810 lie within the reference distance calculated in step S816, the microprocessor 60 treats them as candidates for forming a single node fingerprint group and determines whether to merge them (S818). Since usually only one or two finger node fingerprint clusters lie within the reference distance, merging typically involves two clusters, and it is determined whether those two candidate clusters should be merged.
[100] When candidate clusters have been determined in step S818, the microprocessor 60 merges them into one cluster if the minimum distance between the vertices of their circumscribed rectangles is no more than 1/2 of the calculated reference distance, that is, C_Distance / 2 (S820).
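The two-part merge test of steps S816–S820 (cluster centers closer than C_Distance, and closest rectangle corners within C_Distance / 2) can be sketched as below; rectangles are given as (x0, y0, x1, y1).

```python
import math

def min_vertex_distance(rect_a, rect_b):
    """Minimum distance between the corner vertices of two rectangles."""
    def corners(r):
        return [(r[0], r[1]), (r[0], r[3]), (r[2], r[1]), (r[2], r[3])]
    return min(math.dist(p, q) for p in corners(rect_a) for q in corners(rect_b))

def should_merge(rect_a, rect_b, c_distance):
    """
    Merge rule from the text: circumscribed-rectangle centers closer than
    C_Distance AND closest corners within C_Distance / 2.
    """
    ca = ((rect_a[0] + rect_a[2]) / 2, (rect_a[1] + rect_a[3]) / 2)
    cb = ((rect_b[0] + rect_b[2]) / 2, (rect_b[1] + rect_b[3]) / 2)
    return (math.dist(ca, cb) < c_distance
            and min_vertex_distance(rect_a, rect_b) <= c_distance / 2)
```

The corner test guards against merging clusters whose centers happen to be close but whose bodies do not actually adjoin.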
[101] When the user's finger node fingerprint image has been extracted from the still image captured by the CCD camera 30 as described above, the microprocessor 60 performs independent matching for each node fingerprint cluster against the node fingerprint data stored in the data memory 20 (S90). In particular, the number of node fingerprints contained in the same region of interest may differ between the image stored in the data memory 20 and the image captured by the CCD camera 30; in this case it is also possible to partially match only the clusters determined to be node fingerprint clusters at the same locations.
[102] For convenience of explanation, the image captured by the CCD camera 30 and finalized in step S820 is called the input image, and the image stored in the data memory 20 is called the registered image.
[103] First, the microprocessor 60 obtains the circumscribed rectangle of each node fingerprint cluster of the registered image (S910) and finds the center point c of each rectangle (S920). The circumscribed rectangles and center points thus obtained are shown in FIG. 19.
[104] Next, the center point c' of the circumscribed rectangle of each input-image node fingerprint cluster located within a predetermined range of the center point c is aligned with c (S930), as shown in FIG. 20, and the node fingerprint cluster is then matched while being shifted up, down, left, and right, taking the best similarity value (S940). FIG. 21 illustrates this shifting match between the registered image and the input image.
[105] In this way, because each finger node fingerprint cluster is matched independently as a feature vector, a fairly reliable matching result can be obtained from the remaining fingers' node fingerprint clusters even when a specific finger is excluded from the image input or node fingerprint loss occurs due to uneven illumination or user inexperience.
[106] Meanwhile, the matching of each node fingerprint cluster performed in step S940 uses weighted matching.
[107] The weighting means that the matching coefficients obtained within the circumscribed rectangle of each cluster for the node fingerprint region of the registered image, the background region of the registered image, the node fingerprint region of the input image, and the background region of the input image are each weighted by 1/4. Giving the node fingerprint and background regions the same weight prevents bias caused by the imbalance between the areas of the node fingerprint region and the background region.
[108] The equation for obtaining the matching coefficient as described above is as shown in [Equation 6] below.
[109]
[110]
[111] [Equation 6] Matching coefficient calculation formula
[112] where s and t are variables used to shift the input image to find the maximum match, f(x, y) is the binarized input image, its complement is the binarized input background, and w(x, y) is the binarized registered image. The final matching coefficient is the maximum of the coefficient values calculated while varying s and t over a certain range; that is, it is obtained by [Equation 7] below.
[114]
[115] [Equation 7] Final matching coefficient calculation formula
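Since [Equation 6] and [Equation 7] are not legible in this text, the sketch below is only a plausible reading of the weighted matching described above: matched-foreground and matched-background fractions are computed relative to both the registered image w and the input image f, averaged with weight 1/4 each, and the final coefficient is the maximum over shifts (s, t).

```python
def match_score(f, w):
    """
    Weighted matching score for two equal-size binary images (assumed
    reading of Equation 6): fractions of matched fingerprint and background
    pixels, relative to both w and f, each weighted 1/4.
    """
    h, wd = len(f), len(f[0])
    fg_match = bg_match = f_fg = f_bg = w_fg = w_bg = 0
    for y in range(h):
        for x in range(wd):
            a, b = f[y][x], w[y][x]
            f_fg += a; f_bg += 1 - a
            w_fg += b; w_bg += 1 - b
            if a and b: fg_match += 1
            if not a and not b: bg_match += 1
    terms = []
    for match, total in ((fg_match, w_fg), (bg_match, w_bg),
                         (fg_match, f_fg), (bg_match, f_bg)):
        terms.append(match / total if total else 0.0)
    return sum(terms) / 4.0

def best_match(f, w, max_shift=2):
    """Final coefficient (per Equation 7): maximum score over shifts (s, t)."""
    h, wd = len(f), len(f[0])
    best = 0.0
    for s in range(-max_shift, max_shift + 1):
        for t in range(-max_shift, max_shift + 1):
            shifted = [[f[y - t][x - s] if 0 <= y - t < h and 0 <= x - s < wd else 0
                        for x in range(wd)] for y in range(h)]
            best = max(best, match_score(shifted, w))
    return best
```

Weighting fingerprint and background terms equally reflects the stated goal of preventing the (usually much larger) background area from dominating the coefficient.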
[116] Next, if there is a registered image whose matching result in step S940 is greater than or equal to the reference value (S100), the microprocessor 60 recognizes the user who input the finger node fingerprint as the specific person designated by the personal identification number and outputs a control signal through the interface unit 70 to control an external device, for example opening a door (S110). If there is no registered image at or above the reference value, the microprocessor 60 determines that the user is not a registered person and does not open the door (S120).
[117] EXAMPLE
[118] The experimental environment used to evaluate the performance of the personal identification device was as follows: images were input through a BT-848 PCI overlay board from a general-purpose CCD camera, and the personal identification algorithm was implemented in C++ on a Pentium PC.
[119] For the experimental sample, 129 randomly acquired finger images were used. Images were input three times per person, and based on these the finger node fingerprint clusters were matched independently to calculate the false acceptance rate (FAR) and the false rejection rate (FRR). That is, one of the three images input for each individual was selected as the registered image; the false rejection rate was then calculated by comparing it with that person's other images, and the false acceptance rate by comparing it with other people's images.
[120] [Table 1] shows the false acceptance rate and false rejection rate as the recognition threshold is varied from 0.50 to 0.63. In Table 1, FA and FR are the false acceptance and false rejection counts, respectively.
[121]
[122] [Table 1] False recognition rate and false rejection rate test results
[123] In addition, the equal error rate (EER), frequently used as a performance index for biometric systems, at which FAR and FRR are set equal, is FAR = FRR ≈ 0.6% near a threshold of 0.5670. The EER graph is shown in the accompanying FIG. 22.
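The FAR/FRR/EER evaluation protocol described above can be sketched generically; the score lists and threshold grid below are illustrative, not the experiment's data.

```python
def far_frr(genuine, impostor, threshold):
    """FRR: fraction of genuine scores rejected (below threshold);
    FAR: fraction of impostor scores accepted (at or above threshold)."""
    frr = sum(1 for s in genuine if s < threshold) / len(genuine)
    far = sum(1 for s in impostor if s >= threshold) / len(impostor)
    return far, frr

def equal_error_rate(genuine, impostor, steps=1000):
    """Scan thresholds and return (threshold, far, frr) where FAR and FRR
    are closest, approximating the equal error rate operating point."""
    lo = min(genuine + impostor)
    hi = max(genuine + impostor)
    best = None
    for i in range(steps + 1):
        th = lo + (hi - lo) * i / steps
        far, frr = far_frr(genuine, impostor, th)
        if best is None or abs(far - frr) < abs(best[1] - best[2]):
            best = (th, far, frr)
    return best
```

Raising the threshold trades false acceptances for false rejections; the EER is the point where the two curves cross, the FAR = FRR ≈ 0.6% figure reported for the patent's experiment.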
[124] Although the present invention has been described with reference to the most practical and preferred embodiments, the present invention is not limited to the above disclosed embodiments, but also includes various modifications and equivalents within the scope of the following claims.
[125] According to the present invention, the image acquisition is simple and has an excellent effect in user convenience. In addition, the possibility of the practical use of biometrics using the node fingerprint is improved by the independent registration of each cluster through the clustering of the finger node fingerprints.
Claims:
Claims (14)
[Claim 1: Currently amended] a) extracting region of interest data from image data of an input-converted finger node;
b) low band filtering the extracted region of interest data;
c) extracting finger boundary regions from the low band filtered data;
d) extracting a finger node fingerprint from the low band filtered data;
e) clustering a finger node fingerprint by applying the extracted finger boundary region to the extracted finger node fingerprint;
f) independently matching the clustered finger node fingerprint clusters with previously registered finger node fingerprint clusters; and
g) recognizing a specific person according to the matching result
A personal identification method using a finger node fingerprint comprising the above steps.
[2" claim-type="Currently amended] The method of claim 1,
wherein the step c) of extracting a finger boundary region comprises:
Reinforcing a vertical boundary portion of the finger on the low band filtered data;
Binarizing the vertical-boundary-enhanced data; and
Defining a finger boundary for the binarized data,
in a personal identification method using a finger node fingerprint.
[3" claim-type="Currently amended] The method of claim 2,
wherein the vertical boundary portion of the finger is reinforced using a vertical-boundary-enhancing mask on the low band filtered data.
[4" claim-type="Currently amended] The method of claim 2,
wherein the binarization is performed by an algorithm that automatically selects the binarization threshold according to the probabilistic distribution of gray levels in the gray-level histogram.
[5" claim-type="Currently amended] The method of claim 2,
wherein the binarized data is Hough transformed, linear components of at least a predetermined length are selected, and a reference component determined among the selected linear components defines the boundary region of the finger, in the personal identification method using a finger node fingerprint.
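The automatic threshold selection from a gray-level histogram recited in claim 4 is commonly realized with Otsu's method. The sketch below is one such realization, offered as an assumption; the patent does not name the specific algorithm it uses.

```python
def otsu_threshold(gray):
    """Otsu's method: pick the threshold that maximizes the between-class
    variance of the gray-level histogram (a stand-in for the claim's
    'automatic binarization from a probabilistic gray-level distribution')."""
    hist = [0] * 256
    for v in gray:
        hist[v] += 1
    total = len(gray)
    total_sum = sum(i * h for i, h in enumerate(hist))
    w_b = sum_b = 0.0          # background weight and gray-level sum
    best_t, best_var = 0, -1.0
    for t in range(256):
        w_b += hist[t]
        if w_b == 0:
            continue
        w_f = total - w_b       # foreground weight
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b                      # background mean
        m_f = (total_sum - sum_b) / w_f        # foreground mean
        var_between = w_b * w_f * (m_b - m_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

Binarization then keeps pixels above the returned threshold as foreground and the rest as background.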
[6" claim-type="Currently amended] The method of claim 1,
wherein the step d) of extracting the finger node fingerprint comprises:
Filtering the low band filtered data using an unsharp masking filter;
Binarizing the filtered data; and
Labeling the binarized data
in a personal identification method using a finger node fingerprint.
[7" claim-type="Currently amended] The method of claim 6,
wherein, in the labeling step, labels smaller than a predetermined size and labels whose vertical component exceeds a predetermined height are determined not to be finger node fingerprints and are removed.
[8" claim-type="Currently amended] The method of claim 6,
wherein the step e) of clustering the finger node fingerprint comprises:
Clustering the extracted finger node fingerprint labels based on a clustering reference distance; and
Merging the finger node fingerprint clusters formed in the clustering step, based on a merge reference distance,
in a personal identification method using a finger node fingerprint.
[9" claim-type="Currently amended] The method of claim 8,
wherein said clustering step comprises:
Obtaining a circumscribed rectangle for each of the finger node fingerprint labels;
Calculating the center point of each obtained circumscribed rectangle;
Calculating the distances between the center points of the circumscribed rectangles; and
Grouping finger node fingerprint labels whose center-point distances are smaller than the clustering reference distance into one cluster,
in a personal identification method using a finger node fingerprint.
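The grouping recited in claim 9 can be sketched as single-link grouping over circumscribed-rectangle center points. The union-find bookkeeping and the toy labels below are assumptions made for illustration, not the patent's own code.

```python
import math

def bounding_box(pixels):
    """Circumscribed rectangle of a label: (min_x, min_y, max_x, max_y)."""
    xs = [x for x, _ in pixels]
    ys = [y for _, y in pixels]
    return min(xs), min(ys), max(xs), max(ys)

def box_center(box):
    x0, y0, x1, y1 = box
    return ((x0 + x1) / 2.0, (y0 + y1) / 2.0)

def cluster_labels(labels, l_distance):
    """Group labels whose circumscribed-rectangle centers lie closer than
    the clustering reference distance L_Distance (single link, union-find)."""
    centers = [box_center(bounding_box(l)) for l in labels]
    parent = list(range(len(labels)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i in range(len(labels)):
        for j in range(i + 1, len(labels)):
            if math.dist(centers[i], centers[j]) < l_distance:
                parent[find(i)] = find(j)

    groups = {}
    for i in range(len(labels)):
        groups.setdefault(find(i), []).append(i)
    return sorted(sorted(g) for g in groups.values())

# Toy labels (pixel coordinate lists), assumed for illustration.
labels = [[(0, 0), (1, 1)], [(2, 2), (3, 3)], [(50, 50), (51, 51)]]
clusters = cluster_labels(labels, l_distance=5.0)
```

With the toy input, the first two labels fall within the reference distance and form one cluster, while the distant third label stays alone.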
[10" claim-type="Currently amended] The method of claim 9,
The clustering reference distance is given by the relation below:
L_Distance =
where L_Distance is the clustering reference distance and
k is a constant,
in the personal identification method using a finger node fingerprint.
[11" claim-type="Currently amended] The method of claim 9,
wherein the merging step comprises:
Obtaining a circumscribed rectangle for each cluster formed in the clustering step;
Calculating the center point of each obtained circumscribed rectangle;
Calculating the distances between the center points of the circumscribed rectangles;
Determining clusters as merge candidates when the distance between the center points of their circumscribed rectangles is smaller than the merge reference distance; and
Merging the merge candidate clusters when the minimum distance between the vertices of the circumscribed rectangles of the merge candidate clusters is equal to or less than half the merge reference distance,
in a personal identification method using a finger node fingerprint.
[12" claim-type="Currently amended] The method of claim 11,
The merge reference distance is given by
C_Distance = + τ
where C_Distance is the merge reference distance and
τ is a constant related to the distance deviation between clusters,
in the personal identification method using a finger node fingerprint.
[13" claim-type="Currently amended] The method of claim 1,
wherein the step f) of matching comprises:
Obtaining a circumscribed rectangle of each pre-registered finger node fingerprint cluster;
Calculating the center point of each obtained circumscribed rectangle;
Obtaining each cluster of the input finger node fingerprints whose center point is located within a predetermined range of the calculated center point;
Aligning the center point of each obtained input finger node fingerprint cluster with the center point of the corresponding registered finger node fingerprint cluster; and
Matching while moving the clusters,
in a personal identification method using a finger node fingerprint.
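The center alignment followed by "matching while moving the clusters" in claim 13 can be sketched as a small translation search that maximizes pixel overlap. The overlap score and the search radius are assumptions; the patent does not specify its similarity measure at this step.

```python
def match_clusters(registered, candidate, search=2):
    """Align the candidate cluster's box center to the registered cluster's,
    then slide the candidate within +/- search pixels and return the best
    pixel-overlap score in [0, 1] (illustrative stand-in for the claimed
    'matching while moving the clusters')."""
    def box_center(pts):
        xs = [p[0] for p in pts]
        ys = [p[1] for p in pts]
        return ((min(xs) + max(xs)) // 2, (min(ys) + max(ys)) // 2)

    rcx, rcy = box_center(registered)
    ccx, ccy = box_center(candidate)
    reg = set(registered)
    best = 0.0
    for dx in range(-search, search + 1):
        for dy in range(-search, search + 1):
            # Translate candidate so its center lands on the registered
            # center, offset by the current (dx, dy) of the search.
            shifted = {(x - ccx + rcx + dx, y - ccy + rcy + dy)
                       for x, y in candidate}
            overlap = len(reg & shifted) / max(len(reg), len(shifted))
            best = max(best, overlap)
    return best
```

A threshold on the returned score (compare the 0.50 to 0.63 range explored in [Table 1]) would then decide acceptance or rejection per cluster.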
[14" claim-type="Currently amended] A key input unit for inputting a key of a user;
A data memory in which an individual finger node fingerprint data and a personal identification number are stored in advance;
A camera for photographing a user's finger to receive personal identification and outputting a corresponding image;
A frame grabber for extracting still image data from a finger image output from the camera;
An image memory for storing still image data extracted by the frame grabber;
A microprocessor for performing processing such as finger boundary region extraction, finger node fingerprint extraction, and finger node fingerprint clustering on the still image data stored in the image memory, and for independently matching the resulting data with the data stored in the data memory to identify the user as a specific person; and
Interface unit for data and control communication between an external device and the microprocessor
A personal identification device using a finger node fingerprint comprising the above components.
Similar technologies:
Publication No. | Publication Date | Patent Title
JP6650946B2|2020-02-19|System and method for performing fingerprint-based user authentication using images captured with a mobile device
JP2019079546A|2019-05-23|Image, feature quality, image enhancement, and feature extraction for ocular-vascular and facial recognition, and fusion of ocular-vascular with facial and/or sub-facial information for biometric systems
Zhou et al.2011|A new human identification method: Sclera recognition
Yan et al.2007|Biometric recognition using 3D ear shape
US20130301885A1|2013-11-14|Image processing device, imaging device, image processing method
Saraswat et al.2010|An efficient automatic attendance system using fingerprint verification technique
US7612875B2|2009-11-03|Personal identification system
Proença et al.2005|UBIRIS: A noisy iris image database
EP0967574B1|2011-08-24|Method for robust human face tracking in presence of multiple persons
US7072523B2|2006-07-04|System and method for fingerprint image enhancement using partitioned least-squared filters
Raja2010|Fingerprint recognition using minutia score matching
US7035461B2|2006-04-25|Method for detecting objects in digital images
KR101390756B1|2014-05-26|Facial feature detection method and device
KR100453943B1|2004-10-20|Iris image processing recognizing method and system for personal identification
Crihalmeanu et al.2009|Enhancement and registration schemes for matching conjunctival vasculature
EP1693782B1|2009-02-11|Method for facial features detection
JP3753722B2|2006-03-08|Extraction method of tooth region from tooth image and identification method and apparatus using tooth image
Messer et al.2003|Face verification competition on the XM2VTS database
US7298874B2|2007-11-20|Iris image data processing for use with iris recognition system
Burge et al.1996|Ear biometrics
Kawaguchi et al.2003|Iris detection using intensity and edge information
DE60213032T2|2006-12-28|Face detection device, facial part detection device, partial image extraction device, and methods for these devices
Chang et al.2005|Adaptive rigid multi-region selection for handling expression variation in 3D face recognition
US7110581B2|2006-09-19|Wavelet-enhanced automated fingerprint identification system
Min et al.2009|Eyelid and eyelash detection method in the normalized iris image using the parabolic Hough model and Otsu’s thresholding method
Family patents:
Publication No. | Publication Date
KR100467392B1|2005-01-24|
Legal status:
2001-06-04|Application filed by 주식회사 테크스피어
2001-06-04|Priority to KR20010031173A
2002-12-12|Publication of KR20020092522A
2005-01-24|Application granted
2005-01-24|Publication of KR100467392B1
Priority:
Application No. | Filing Date | Patent Title
KR20010031173A|KR100467392B1|2001-06-04|2001-06-04|Method for identifing biometric person using a finger crease pattern and apparatus thereof|