Patent abstract:
An image sensor comprising: a depth pixel (PZ) having: a detection zone (PD); a first memory (mem1) electrically coupled to the detection zone by a first gate (310); a second memory (mem2) electrically coupled to the detection zone by a second gate (314); and a third memory (mem3) electrically coupled to the detection zone by a third gate (316), each of the first, second and third memories being formed of a doped region sandwiched between first and second parallel straight walls (404, 406), the first and second walls of each memory having a conductive core adapted to receive a bias voltage; and a plurality of 2D image pixels (P1 to P8) disposed adjacent to the depth pixel, wherein the first, second and third memories extend to form at least partial isolation walls between corresponding adjacent pairs of 2D image pixels.
Publication number: FR3060250A1
Application number: FR1662340
Filing date: 2016-12-12
Publication date: 2018-06-15
Inventors: Yvon Cazaux; Benoit Giffard
Applicants: Commissariat à l'Energie Atomique (CEA); Commissariat à l'Energie Atomique et aux Energies Alternatives (CEA)
IPC main classification:
Patent description:

Technical field
This description relates to the field of image sensors, and more particularly to image sensors capable of capturing both a 2D image and depth information.
Technical background
Image sensors able to capture depth information have been proposed. For example, time-of-flight (ToF) detectors operate by emitting a light signal towards an image scene and then detecting the return light signal reflected by objects in the image scene. By calculating the time of flight of the light signal, the distance between the sensor and objects in the image scene can be estimated. For example, the pixels of such a sensor can use SPADs (single photon avalanche diodes).
In some applications, it would be desirable to be able to capture both a 2D image of a scene and a corresponding depth image of the scene.
While a solution to achieve this goal would be to use separate image sensors to capture the 2D image and the depth image, such a solution is not optimal because these image sensors would have different views of the image scene, which leads to misalignment between the pixels of the corresponding images. In addition, the use of two sensors would increase the size and the cost of the device.
Another solution would be to integrate the 2D image pixels and the depth pixels in the same detector array.
One problem, however, is that depth pixels usually have significantly larger dimensions than 2D image pixels, which makes such an integration complex.
Summary
An object of embodiments of the present description is to at least partially address one or more problems of the prior art.
In one aspect, an image sensor is provided comprising: a depth pixel having: a detection zone; a first memory electrically coupled to the detection zone by a first gate; a second memory electrically coupled to the detection zone by a second gate; and a third memory electrically coupled to the detection zone by a third gate, each of the first, second and third memories being formed of a doped region sandwiched between first and second parallel straight walls, the first and second walls of each memory having a conductive core adapted to receive a bias voltage; and a plurality of 2D image pixels disposed adjacent to the depth pixel, in which the first, second and third memories extend to form at least partial isolation walls between the corresponding adjacent pairs of 2D image pixels.
According to one embodiment, the walls of the first memory extend between the detection zones of the first and second pixels of the plurality of 2D image pixels; the walls of the second memory extend between the detection zones of the third and fourth pixels of the plurality of 2D image pixels; and the walls of the third memory extend between the detection zones of the fifth and sixth pixels of the plurality of 2D image pixels.
According to one embodiment, the depth pixel further comprises an additional structure electrically coupled to the detection zone of the depth pixel by a fourth gate, the additional structure comprising an n-type region sandwiched between straight parallel walls, and wherein the walls of the additional structure extend between the detection zones of the seventh and eighth pixels of the plurality of 2D image pixels.
According to one embodiment, the image sensor comprises a plurality of two by two blocks of pixels, at least some of said two by two blocks of pixels comprising the depth pixel and three 2D image pixels.
According to one embodiment, the image sensor consists of a plurality of two by two blocks of pixels, each two by two blocks of pixels comprising the depth pixel and three 2D image pixels.
According to one embodiment, the image sensor comprises an array consisting of a plurality of n by n blocks of 2D image pixels and a plurality of depth pixels as indicated above, each depth pixel occupying a space in the array corresponding to the space occupied by an n by n block of 2D image pixels, where n is an integer greater than or equal to 2.
According to one embodiment, each 2D image pixel is isolated from the adjacent 2D image pixel by corresponding isolation walls, at least some of said isolation walls being formed by said first, second and third memories, and one or more of said isolation walls being formed by an isolation trench.
According to one embodiment, each of the first, second and third memories has a form factor of at least 4.
According to one embodiment, the length of the charge storage area of each of said first, second and third memories is greater than the width of the depth pixel.
According to one embodiment, the depth pixel and the 2D image pixel are formed in a semiconductor layer disposed on a buried insulation layer.
One embodiment provides an image forming device comprising: the above image sensor; and a light signal transmission circuit configured to emit light in the near infrared range.
According to one embodiment, the depth pixel is covered with a filter having a pass band limited to the range of the near infrared.
Brief description of the figures
These characteristics and advantages, as well as others, will appear on reading the following detailed description of embodiments given by way of example and not of limitation in relation to the attached figures among which:
FIG. 1 schematically represents an image forming device comprising a combined 2D image and depth sensor according to one embodiment;
FIG. 2 is a graph illustrating examples of light intensity of a light signal emitted and returned according to one embodiment;
FIG. 3 schematically represents the circuit of a pixel capable of detecting depth information according to an embodiment;
FIG. 4 is a top view of a pixel capable of detecting depth information according to one embodiment;
Figure 5A is a sectional view along the line A-A'-A of Figure 4 according to one embodiment;
FIGS. 5B to 5E illustrate electrostatic voltages in the structure of FIG. 5A during different operating phases according to one embodiment;
Figure 6 is a sectional view along a line B-B' of the pixel of Figure 4 according to one embodiment;
FIGS. 7A to 7C are top views of the configuration of part of a combined 2D and depth sensor according to embodiments;
FIG. 8 is a graph representing the transmission of RGB (red, green, blue) and NIR (near infrared) filters according to one embodiment; and Figure 9 is a perspective view of an image forming device implemented in a 3D circuit according to one embodiment.
Detailed description
In the following description, the term connected is used to designate a direct connection between circuit elements while the term coupled is used to designate a connection which can be direct or can take place via one or more intermediate elements. such as resistors, capacitors, and transistors. The term approximately is used to designate a tolerance of ± 10% of the value in question.
Figure 1 schematically shows an image forming device 100 comprising a combined 2D image and depth sensor. The device 100 comprises for example a light signal transmission circuit 104 which controls a light emitting diode (LED) 106. The LED for example emits light at a wavelength in the near infrared spectrum, for example in the range from 700 to 1000 nm. The light signal produced by the LED 106 is for example emitted towards the image scene via one or more lenses (not shown in FIG. 1). The light of the light signal reflected by the image scene is captured by the image sensor 102, for example via an image forming objective 107 and an array of microlenses 108, which focuses the light on the individual pixels of the image sensor 102.
The sensor 102 comprises for example several pixels capable of receiving the light signal reflected by the image scene and of detecting the phase of the received signal to form a depth image. These pixels will hereinafter be called depth pixels. The image sensor 102 further comprises several pixels capable of capturing visible light from the image scene to form a 2D image, these pixels being here called 2D image pixels.
A processor 110 of the image forming device 100 is for example coupled to the image sensor 102 and to the light signal transmission circuit 104 and determines, on the basis of the signals picked up by the depth pixels of the sensor 102, the corresponding distances to objects in the image scene. The 2D image captured by the image sensor 102 and the depth image produced by the processor 110 are for example stored in a memory 112 of the image forming device 100.
FIG. 2 is a graph representing, by a dotted curve 202, an example of the light intensity of the light signal emitted by the LED 106 towards the image scene, and, by a solid-line curve 204, an example of the light intensity of the light signal received by one of the depth pixels of the image sensor 102. Although, to simplify the comparison, these signals are represented in FIG. 2 as having the same intensity, in practice the light signal received by each depth pixel is likely to be significantly less intense than the emitted signal. In the example of Figure 2, the light signal has the form of a sine wave. However, in alternative embodiments, it could have a different periodic form, for example consisting of a sum of sine waves.
Rather than being based on an SPAD array, the depth pixels of the present description are based on photodetectors used to detect the phase of the light signal received. There is a phase difference between the emitted light signal and the received light signal, which represents the time of flight (ToF) of the light signal from the LED 106 to the image sensor 102 via an object of the image scene which reflects the light signal. An estimate of the distance d to the object in the image scene can thus be calculated using the equation:
d = (c_light / (4·π·f)) · φ_TOF
where c_light designates the speed of light, and f the frequency of the light signal.
The phase shift φ_TOF is for example estimated on the basis of sampling the received signal by a depth pixel over at least three separate sampling windows for each period of the light signal. A technique based on the detection of four samples per period is described in more detail in the publication by R. Lange and P. Seitz entitled Solid-state TOF range camera, IEEE J. on Quantum Electronics, vol. 37, No. 3, March 2001, the content of which will be deemed to form an integral part of the present description to the extent permitted by law. The present embodiment is for example based on the detection of three samples per period.
The samples of each sampling window are, for example, integrated over a large number of periods, for example over approximately 100,000 periods, or more generally between 10,000 and 10 million periods. Each sampling window, for example, has a duration of up to a third of the period of the light signal. These sampling windows are named C0, C1 and C2 in FIG. 2, and in the example of FIG. 2, each sampling window has the same duration and the three sampling windows have a total cycle time equal to the period of the light signal. More generally, there may or may not be a time interval between one sampling window and the next, and in some cases there may be an overlap between the sampling windows. Each sampling window has for example a duration of between 25 and 40% of the period of the light signal in the case of a pixel capturing three samples per period.
The timing of the sampling windows C0 to C2 is controlled so as to be synchronized with the timing of the emitted light signal. For example, the light signal transmission circuit 104 generates the light signal based on a clock signal CLK, and the image sensor 102 receives the same clock signal CLK to control the end time of each sampling window, using for example delay elements to introduce the appropriate phase shifts.
Based on the integrated samples of the light signal, and for a purely sinusoidal light wave, the phase shift φ_TOF of the light signal can be determined using the following equation:
φ_TOF = arctan( √3·(C2 − C1) / ((C0 − C1) + (C0 − C2)) )
In certain embodiments, the frequency f of the light signal is 25 MHz, or more generally between 20 and 200 MHz.
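By way of illustration only, the following minimal Python sketch applies this phase formula and the preceding distance equation to three integrated sample values. The function name, the sample values and the use of a quadrant-aware atan2 (instead of a plain arctangent) are assumptions added here for clarity and are not taken from the patent description.

import math

C_LIGHT = 299_792_458.0  # speed of light, in m/s

def tof_distance(c0, c1, c2, f_mod=25e6):
    # Phase shift phi_TOF from the three integrated samples C0, C1, C2,
    # following the arctangent expression above (atan2 keeps the result
    # in the correct quadrant).
    num = math.sqrt(3.0) * (c2 - c1)
    den = (c0 - c1) + (c0 - c2)
    phi_tof = math.atan2(num, den)
    if phi_tof < 0.0:
        phi_tof += 2.0 * math.pi  # wrap into [0, 2*pi)
    # Distance d = c_light * phi_TOF / (4 * pi * f), as in the equation above.
    return C_LIGHT * phi_tof / (4.0 * math.pi * f_mod)

# Hypothetical integrated sample values (arbitrary units), for a 25 MHz signal.
print(tof_distance(c0=120_000, c1=80_000, c2=95_000))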
FIG. 3 is a circuit diagram illustrating an example of a ToF pixel circuit 300. Such a circuit is described in more detail in the French patent application filed under application number FR 15/63457 (agent's reference: B14596), the content of which will be considered to be an integral part of this description to the extent permitted by law. For example, a timing diagram illustrating an example of operation of this circuit is presented in FIG. 3 of FR 15/63457, and the same example of operation applies in the context of the present application.
The circuit 300 comprises a photosensitive element PD coupled between a node 302 and a reference power source, the photosensitive element being for example a photodiode. The node 302 is coupled to a detection node SN via the parallel connection of three sampling circuits 304, 306, 308. The sampling circuit 304 comprises a memory mem1 coupled to the node 302 by a transfer gate 310, which is for example an n-channel MOS transistor. The memory mem1 is also coupled to the detection node SN by an additional transfer gate 312, which is also, for example, an n-channel MOS transistor. The transfer gate 310 is controlled by a signal Vmem1 applied to its control node, and the transfer gate 312 is controlled by a signal Vsn1 applied to its control node. The sampling circuits 306 and 308 comprise circuit elements similar to those of the sampling circuit 304, and in particular the circuits 306, 308 respectively comprise memories mem2, mem3, transfer gates 314, 316 controlled respectively by signals Vmem2, Vmem3, and transfer gates 318, 320 controlled respectively by signals Vsn2, Vsn3. The memories mem1, mem2 and mem3 provide charge storage areas in which a charge transferred from the photosensitive element PD is temporarily stored.
The circuit 300 further comprises an output circuit formed by a source follower transistor 322, a selection transistor 324 and a reset transistor 326, these transistors being for example n-channel MOS transistors. The detection node SN is coupled to the control node of the transistor 322, which for example has its drain coupled to the supply voltage source Vdd, and its source coupled to an output line 328 of the pixel circuit 300 by the transistor 324 which is controlled by a Vsel signal applied to its gate. The detection node SN is also coupled to the supply voltage source Vdd through the transistor 326 which is controlled by a signal Vres applied to its gate. In alternative embodiments, the output circuit could be shared by several pixels, the detection node SN being for example coupled to the sampling circuit of one or more adjacent pixels.
The circuit 300 further comprises for example a transistor 330 coupling the node 302 to the supply voltage source Vdd and allowing the photosensitive element PD to be reset. The transistor 330 is for example controlled by a dedicated reset signal applied to its gate.
To determine the phase shift φ_TOF between the emitted light signal and the light signal received by the pixel, the received light signal is sampled by transferring, successively and at regular intervals, the charges photogenerated in the photosensitive element PD during the first sampling window C0 to the memory mem1, the charges photogenerated in the photosensitive element PD during the second sampling window C1 to the memory mem2, and the charges photogenerated in the photosensitive element PD during the third sampling window C2 to the memory mem3. These three successive transfers are repeated a large number of times, for example 100,000 times, before the signals stored by the memories mem1, mem2 and mem3 are successively read by the output circuit.
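To make this sequencing easier to follow, the sketch below abstracts the drive sequence in Python. The helper callables (pulse_gate, read_memory, reset_sense_node) are hypothetical abstractions introduced here only for illustration; they do not correspond to actual driver functions of the circuit, and the timings merely reflect the example values given above.

# Simplified sketch of the drive sequence of the pixel circuit of FIG. 3
# (assumed helper callables, not real driver functions).
N_PERIODS = 100_000          # integration over about 100,000 modulation periods
T_MOD = 1.0 / 25e6           # period of a 25 MHz light signal
WINDOW = T_MOD / 3.0         # three equal sampling windows C0, C1 and C2

def integrate_and_read(pulse_gate, read_memory, reset_sense_node):
    # Accumulation: during each period, the transfer gates 310, 314 and 316
    # are pulsed in turn so that the charge photogenerated in PD during the
    # windows C0, C1 and C2 is moved to mem1, mem2 and mem3 respectively.
    for _ in range(N_PERIODS):
        for gate_signal in ("Vmem1", "Vmem2", "Vmem3"):
            pulse_gate(gate_signal, duration=WINDOW)
    # Readout: only after the accumulation are the three memories emptied in
    # turn through the sense node SN (gates 312, 318, 320) and the output stage.
    samples = []
    for sn_signal in ("Vsn1", "Vsn2", "Vsn3"):
        reset_sense_node()                     # Vres pulse on transistor 326
        samples.append(read_memory(sn_signal))
    return samples                             # integrated values C0, C1, C2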
FIG. 4 is a top view of a three by three block 400 of pixels forming a part of the image sensor 102 of FIG. 1 according to an exemplary embodiment. In the example of FIG. 4, a central pixel PZ is a depth pixel for detecting the return signal of the light signal emitted towards the image scene, in the case of a pixel capturing four samples per period. The eight pixels P1 to P8 surrounding it are for example 2D image pixels for capturing a visible 2D light image projected from the image scene.
The depth pixel comprises memories mem1 to mem4 of relatively large dimensions for storing the samples. This is advantageously achieved by positioning these memories along the respective sides of the depth pixel, and by extending these memories beyond the limits of the depth pixel so that they form isolation walls which act as charge barriers between the photosensitive areas of the surrounding pixels. As will be described in more detail below, the memories mem1 to mem4 are for example memory areas formed by vertical charge storage regions defined by parallel walls having insulated electrodes to which a bias voltage can be applied so as to allow storage of electrons. The memory areas are advantageously protected from light, as will be described in more detail below.
As shown in the example in FIG. 4, the memory area mem1 extends for example along a left side of the depth pixel and then between the pixels P1, P2 forming the upper-left and upper-center pixels of the three by three block of pixels 400. The memory area mem2 extends for example along a top edge of the depth pixel and then between the pixels P3, P4 forming the upper-right and right-center pixels of the three by three block of pixels 400. The memory area mem3 extends for example along a right side of the depth pixel, and then between the pixels P5, P6 forming the lower-right and lower-center pixels of the three by three block of pixels 400. The memory area mem4 extends for example along the bottom side of the depth pixel and then extends between the pixels P7, P8 forming the lower-left and left-center pixels of the three by three block of pixels 400.
In certain embodiments, rather than storing a fourth sample, the memory area mem4 could be used for the purpose of resetting the photodiode and/or for anti-blooming.
Each of the memory zones mem1 to mem4 comprises for example a storage volume 402 sandwiched between a pair of straight and parallel walls 404, 406. The walls 404, 406 are for example active isolation trenches formed of a conductive core covered with an insulating layer. The conductive core, for example, has the shape of a wall extending over the length and depth of the active isolation trench. A bias voltage is for example applied to the conductive core of each wall so that a charge can be stored in the storage volumes 402 of the memory areas mem1 to mem4. In certain embodiments, the bias voltage is a negative voltage, for example around -2 V, so as to reduce the dark signal.
The walls 404 are for example internal walls which are shorter than the external walls 406, one end 408 of the walls 404 being spaced from the wall 404 of an adjacent memory area by an opening 410, and the other end 411 extending for example to the side of an additional wall 412 (only partially illustrated in FIG. 4) on the side remote from the surrounding pixels. The external walls 406 extend for example from the wall 404 of an adjacent memory area to the corresponding wall 412. A contact area 414 is for example adjacent to the end formed between the walls 404, 406 of each memory area.
Note that the walls 404 and 406 of each memory area are all straight walls. Thus the storage volumes 402 of each memory extend in a straight line, without angles or curvatures. This advantageously facilitates the transfer of charges into these storage volumes 402 and the discharge of charges from these storage volumes 402.
In addition, the storage volumes for example have a relatively large form factor. For example, the memory area mem1 has a form factor defined by Lm/Wm, where Lm is the length of the storage volume 402 from the internal side of the gate of the transistor 312 to the level of the end 408 of the wall 404, and Wm is the width of the storage volume 402 between the external sides of the walls 404, 406. This form factor is for example equal to at least 3 and, in certain embodiments, it is equal to 5 or more. Each of the other memory areas mem2, mem3, mem4 for example has a similar form factor. Such a form factor has the advantage, for example, of ensuring a relatively large charge storage capacity and efficient charge evacuation.
The length of the storage volume 402 is for example greater than the width w on each side of the depth pixel. The dimensions of the depth pixel are for example the same as the dimensions of the 2D image pixels. For example, the width w of each of the depth pixels and of each of the 2D image pixels is less than 10 μm, and for example less than 5 μm.
The transfer gates 310, 314, 316 and 416 are for example adjacent to the corresponding openings 410 in the internal walls 404 of the memory areas mem1 to mem4. The photosensitive zone of the depth pixel PZ includes for example a central charge collection region 415 and the regions under the transfer gates 310, 314, 316 and 416 of the memory zones mem1 to mem4. Thus, when a voltage is applied to one of the transfer gates 310, 314, 316 or 416, the charge present in the photosensitive zone will be transferred to the corresponding memory zone through the corresponding opening 410 in the wall 404.
It will be noted that the photosensitive area of the photodiode PD is relatively small, smaller than the dimensions of the pixel. The present inventors have found that this is not detrimental to the correct capture of photons, since in any case the sampling rates of the transfer gates are such that the photogenerated charges only have a short time to be evacuated, and therefore large pixel dimensions do not necessarily lead to an increased signal. In addition, the use of microlenses such as those referenced 108 in Figure 1 can compensate for a relatively small photosensitive area by increasing the sensitivity.
FIG. 5A is a sectional view along the line A-A'-A traversing the structure of FIG. 4, and in particular extending over the length of the memory mem1 and then making a turn to the right through the opening 410 and the transfer gate 310. The other memory areas mem2 to mem4 have for example a similar structure.
The photosensitive zone comprises for example an n-type region 541, having a doping level N1, formed on a p-type semiconductor substrate 543. In some embodiments, rather than being the substrate, the p-type layer 543 could be an SOI (silicon on insulator) layer formed by epitaxy and positioned on a layer 544 of buried oxide (BOX) (represented by dotted lines in Figure 5A). The charge collection zone is for example made up of an n-type region 545 having a doping level N2 which is greater than N1. The region 545 is for example covered by a heavily doped p-type (P+) layer 547, which for example forms a passivation layer.
One of the isolation walls 412 is illustrated in FIG. 5A and comprises a conductive core 551, formed for example of polycrystalline silicon surrounded by an insulating layer 553, for example made of silicon oxide.
The memory zone mem1 comprises, for example, an n-type well 557 having a doping level N5 greater than N1, covered with the P+ layer 547. The memory zone mem1 is for example separated from the charge collection zone 545 by the region 541 covered by a gate 559 of the transfer transistor 310, this gate being separated from the region 541 by a gate insulation layer 561. In addition, the memory zone mem1 is for example separated from the contact zone 414 by an n-type region 562, covered by a gate 563 of the transistor 312, the gate 563 being separated from the region 562 by a gate insulator 565.
The memory area mem1 also includes, for example, an n-type region 567 positioned between the n-type well 557 and the transfer gate 310. This region 567 has for example a doping level N4 greater than N1 and less than N5, and is covered by the P+ layer 547. In certain embodiments, the region 541 extends to the region 567 of the memory area mem1. However, in the example of FIG. 5A, an additional n-type region 569 is formed under the gate 310 and separates the regions 541 and 567. The region 569 has for example a doping level N3 greater than N1 and less than N4 and N5. In some embodiments, the doping level N3 is also less than N2.
Although this is not shown in FIG. 5A, the memory formed by the n-type regions 567 and 557 is for example masked by being covered with a screen protecting it against light. Furthermore, in certain embodiments, the light protection screen extends to cover at least partially the gate 559, and covers for example at least part or all of the n-type region 569.
FIGS. 5B to 5E represent electrostatic voltages according to the same section A-A'-A through the structure of FIG. 4 as that shown in FIG. 5A. In these figures, low voltages are represented by high levels of the curves, and vice versa.
FIGS. 5B to 5D illustrate a phase of charge transfer from the charge collection area 545 to the memory area mem1. FIG. 5B illustrates a step in which the transistors 310 and 312 are maintained in a blocked state by a negative or zero voltage applied to their gates. FIG. 5C illustrates a step during which the transfer gate 310 is made conductive, for example by applying a supply voltage Vdd to its gate, bringing the electrostatic voltage under the gate 310 to a value greater than that of the charge collection zone 545 and thus causing the transfer of charges to the region 569 under the gate. FIG. 5D illustrates a next step during which the voltage applied to the gate of the transistor 310 is reduced to a low level, so that the electrostatic voltage of the region containing the transferred charges falls below those of the regions 567 and 557 of the memory area mem1, and thus the charge is transferred to the charge storage region of the memory area.
FIG. 5E illustrates a next phase during which a charge is transferred out of the memory via the contact area 414. As illustrated, for this, the transistor 312 is controlled by a high voltage, for example the supply voltage Vdd, and thus the electrostatic voltage under the gate 563 becomes higher than that of the charge storage area 557 of the memory area, and the charge is transferred to the contact area 414, which for example has an electrostatic voltage greater than that of the zone 557.
Figure 6 is a sectional view taken along the line B-B' crossing the structure of Figure 4, and in particular crossing the pixels P8, PZ and P4, and the memory areas mem1 and mem3 on each side of the pixel PZ. The memory areas mem2 and mem4 are not illustrated but have, for example, a structure similar to the memory areas mem1 and mem3.
As illustrated, each of the pixels P8 and P4 comprises for example a pinned photodiode formed by an n-type layer 650 positioned under a heavily doped p-type (P+) layer 652. Each of the walls 404, 406 of the memory areas mem1, mem3 is formed by a trench covered with a layer 654 of an insulator such as an oxide and filled with a conductive core 656, formed for example of a silicide. The width of the memories mem1, mem3, measured between the exterior surfaces of the walls 404, 406, is for example around 1 μm, or more generally between 0.8 and 2 μm.
In some embodiments, light filters 658, 660, 662 are positioned on the pixels P8, PZ and P4 respectively. Filters 658 and 662 are for example color filters, such as red, green or blue filters, to allow only light having a limited wavelength range to enter the 2D image pixels. The filter 660 is for example an NIR filter allowing only light in the near infrared wavelength range to enter the pixel PZ.
Light-opaque screens 664 are for example positioned on each of the memory areas mem1, mem3 in such a way that the light coming from the image scene does not enter directly into these memories and does not generate parasitic charges. The screens 664 are for example made of a suitable metal such as tungsten, aluminum or copper.
FIG. 7A is a top view of a four by four block 700 of pixels of the image sensor 102 according to an exemplary embodiment. In the example of FIG. 7A, each two by two block of pixels is composed of pixels covered by a green filter (G), a red filter (R), a blue filter (B), and an NIR filter (Z). The memories associated with the depth pixels Z form isolation walls between the neighboring color pixels in a manner similar to that described in relation to the example in FIG. 4. To simplify the illustration, each of the memories is represented by a rectangle with a darkened region at one end representing the transfer gate and the contact area for transferring the charges out of the memory.
In the example of FIG. 7A, the density of the depth pixels is such that the corresponding memories constitute barriers between all the pixels of the array. Indeed, FIG. 7A also illustrates certain memories associated with depth pixels neighboring the four by four block. It is only at the edge of the pixel array that some barriers may be missing, but such barriers may be implemented by isolation trenches, such as shallow trench isolation (STI), deep trench isolation (DTI), or by the same type of isolation walls as that used to delimit the memories.
FIG. 7B is a top view of a four by four block 710 of pixels of the image sensor 102 according to an alternative embodiment with respect to the case of Figure 7A.
In the example of FIG. 7B, a lower density of depth pixels than in the embodiment of FIG. 7A is provided. For example, the array of pixels is formed by two by two blocks of pixels, some of which comprise, in the clockwise direction, green, red, green and blue pixels, and some of which comprise, in the clockwise direction, green, red, depth and blue pixels.
The relatively large spacing between the depth pixels means that the corresponding memories constitute barriers only between some of the 2D image pixels of the array. In places where no memory is present, the pixels are for example separated by isolation trenches 712 constituting barriers between adjacent pixels. These isolation trenches 712 constitute, for example, ghost memories, that is to say that they have the same structure as that of the memories of the depth pixels with the exception of the opening 410 of FIG. 4, which can be omitted. The conductive cores formed in the walls of the ghost memories are for example coupled to the same DC bias voltages as those of the depth pixels.
FIG. 7C is a top view of a block 720 of pixels of the image sensor 102 according to an alternative embodiment with respect to the case of FIGS. 7A and 7B.
In the example of FIG. 7C, each depth pixel Z has an area corresponding to a two by two block of 2D image pixels. For example, the 2D image pixels are arranged in two by two blocks having, in a clockwise direction, green, red, green and blue pixels. A two by two block of image pixels is replaced at regular intervals by a single depth pixel. More generally, the 2D image pixels can be arranged in n by n blocks of pixels, and each depth pixel can occupy the space of an n by n block of pixels, where n is an integer greater than or equal to two.
In the example of FIG. 7C, each of the memories extending from the depth pixels forms an at least partial barrier between more than one pair of adjacent 2D image pixels. Of course, although FIG. 7C illustrates an example in which each depth pixel is surrounded by 8 two by two blocks of 2D image pixels, in alternative embodiments a different density of depth pixels would be possible.
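As a purely illustrative aid, the short Python sketch below builds a pixel map in the spirit of FIG. 7C: the array is tiled with two by two green/red/green/blue blocks and, at a regular pitch, one whole two by two block is occupied by a depth pixel Z. The block pitch used here is an arbitrary assumption, not a value taken from the description.

def build_layout(blocks_y, blocks_x, pitch=3):
    # Returns a 2*blocks_y by 2*blocks_x map of pixel types, where 'Z' marks
    # the area occupied by one depth pixel (a full two by two block).
    layout = []
    for by in range(blocks_y):
        top_row, bottom_row = [], []
        for bx in range(blocks_x):
            if by % pitch == 0 and bx % pitch == 0:
                top_row += ["Z", "Z"]          # depth pixel covering the block
                bottom_row += ["Z", "Z"]
            else:
                top_row += ["G", "R"]          # clockwise green, red,
                bottom_row += ["B", "G"]       # green, blue image pixels
        layout += [top_row, bottom_row]
    return layout

for row in build_layout(6, 6):
    print(" ".join(row))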
Of course, although in the embodiments of FIGS. 7A, 7B and 7C the 2D pixels are green, red and blue color pixels, in alternative embodiments these pixels could be unfiltered pixels without a color filter, or pixels having a filter which blocks the near infrared (NIR) wavelengths. For example, a filter transmitting a wavelength range from about 400 nm to about 700 nm could be used for these pixels.
FIG. 8 is a graph illustrating examples of wavelength ranges transmitted by blue, green and red color filters (BGR) and by an NIR filter according to an exemplary embodiment. For example, these filters are characterized by the following transmission ranges for which the transmission is greater than or equal to 0.5:
- Blue: wavelength range from about 400 nm to about 490 nm;
- Green: wavelength range from approximately 490 nm to approximately 590 nm;
- Red: wavelength range from about 560 nm to about 660 nm;
- NIR: wavelength range from approximately 780 nm to approximately 820 nm.
FIG. 9 is a perspective view of a 3D circuit 900 comprising stacked devices 902, 904, which are for example interconnected by vertical 3D vias 906. The device 902 comprises for example the image sensor 102, which is for example back-side illuminated. The circuit 904 comprises for example some or all of the transistors associated with each pixel, such as the transistor 326 of FIG. 3. The circuit 904 can in addition or alternatively include some or all of the transistors forming the output circuit associated with each pixel, such as the transistor 322 and/or 324 of FIG. 3.
In certain embodiments, the relatively high frequency signals Vmem1, Vmem2, Vmem3 used to drive the transfer gates 310, 314, 316 are provided by wires on a device different from the device used to supply the signals driving the transfer gates of the 2D image pixels. For example, the signals Vmem1, Vmem2, Vmem3 are supplied to the pixels of the device 902 by wires on the device 904 and through the vertical 3D vias 906 between the devices 902, 904. Other arrangements of transistors and distributions of wires could be used by those skilled in the art to optimize the topography and the pixel array density.
An advantage of the embodiments described herein is that the arrangement of the depth pixel within the combined 2D image and depth sensor 102 allows at least three memories to be coupled to the photosensitive area of the pixel without significantly increasing the size of the pixel compared to the other pixels of the array. In addition, the memories advantageously provide effective barriers between the surrounding image pixels.
Particular embodiments have been described. Various variants and modifications will appear to those skilled in the art. In particular, although a particular example of depth pixel topography has been described in connection with Figure 4, it will be apparent to those skilled in the art that modifications can be made. For example, the fourth memory mem4 can be omitted in certain embodiments.
In addition, the memories could extend in the opposite direction to what is shown in the example of FIG. 4, the memory mem1 extending between the pixels P6 and P7, the memory mem2 extending between the pixels P8 and P1, the memory mem3 extending between the pixels P2 and P3, and the memory mem4 extending between the pixels P4 and P5.
In addition, it will be clear to those skilled in the art that although regions having p-type and n-type dopings have been described, in variant embodiments some or all of these regions may have an opposite doping type. For example, the n-type region 557 in FIG. 5A forming the memory could be replaced by a p-type region, and the other regions of the structure of FIG. 5A could also be of the opposite doping type.
Claims
[1" id="c-fr-0001]
1. Image sensor comprising:
a depth pixel (PZ) having: a detection zone (PD);
a first memory (mem1) electrically coupled to the detection zone by a first gate (310);
a second memory (mem2) electrically coupled to the detection zone by a second gate (314); and a third memory (mem3) electrically coupled to the detection zone by a third gate (316), each of the first, second and third memories being formed of a doped region (557) sandwiched between first and second parallel straight walls (404, 406), the first and second walls of each memory having a conductive core adapted to receive a bias voltage; and a plurality of 2D image pixels (P1 to P8) arranged adjacent to the depth pixel, in which the first, second and third memories extend to form at least partial isolation walls between the corresponding adjacent pairs of 2D image pixels.
[2" id="c-fr-0002]
2. An image sensor according to claim 1, in which:
the walls (404, 406) of the first memory (mem1) extend between the detection zones of the first and second pixels (P1, P2) of the plurality of 2D image pixels;
the walls (404, 406) of the second memory (mem2) extend between the detection zones of the third and fourth pixels (P3, P4) of the plurality of 2D image pixels; and the walls (404, 406) of the third memory (mem3) extend between the detection zones of the fifth and sixth pixels (P5, P6) of the plurality of 2D image pixels.
[3" id="c-fr-0003]
3. The image sensor according to claim 2, in which the depth pixel (PZ) further comprises an additional structure (mem4) electrically coupled to the detection zone (PD) of the depth pixel by a fourth gate (416), the additional structure comprising an n-type region sandwiched between straight parallel walls (404, 406), and wherein the walls of the additional structure extend between the detection zones of the seventh and eighth pixels (P7, P8) of the plurality of 2D image pixels.
[4" id="c-fr-0004]
4. The image sensor according to any of claims 1 to 3, wherein the image sensor comprises a plurality of two by two blocks of pixels, at least some of said two by two blocks of pixels comprising the depth pixel according to claim 1 and three 2D image pixels.
[5" id="c-fr-0005]
5. An image sensor according to any one of claims 1 to 3, wherein the image sensor consists of a plurality of two by two blocks of pixels, each two by two block of pixels comprising the depth pixel according to claim 1 and three 2D image pixels.
[6" id="c-fr-0006]
6. Image sensor according to any one of claims 1 to 3, in which the image sensor comprises an array consisting of a plurality of n by n blocks of 2D image pixels and a plurality of depth pixels according to claim 1, each depth pixel occupying a space in the array corresponding to the space occupied by an n by n block of 2D image pixels, where n is an integer greater than or equal to 2.
[7" id="c-fr-0007]
7. The image sensor according to any of claims 1 to 6, wherein each 2D image pixel is isolated from the adjacent 2D image pixel by corresponding isolation walls, at least some of said isolation walls being formed by said first, second and third memories, and one or more of said isolation walls being formed by an isolation trench.
[8" id="c-fr-0008]
8. An image sensor according to any one of claims 1 to 7, in which each of the first, second and third memories has a form factor of at least 4.
[9" id="c-fr-0009]
9. An image sensor according to claim 1, in which the length of the charge storage area of each of said first, second and third memories is greater than the width of the depth pixel.
[10" id="c-fr-0010]
10. The image sensor according to any of claims 1 to 9, wherein the depth pixel and the 2D image pixel are formed in a semiconductor layer (543) disposed on a buried isolation layer (544).
[11" id="c-fr-0011]
11. Image forming device comprising:
the image sensor of any one of claims 1 to 10; and a light signal transmission circuit (104, 106) configured to emit light in the near infrared range.
[12" id="c-fr-0012]
12. The image forming device according to claim 11, wherein the depth pixel is covered with a filter having a pass band limited to the range of the near infrared.