Communication

Multipurpose Bio-Monitored Integrated Circuit in a Contact Lens Eye-Tracker
Loïc Massin 1,2 , Cyril Lahuec 1,2, *, Fabrice Seguin 1,2 , Vincent Nourrit 1
and Jean-Louis de Bougrenet de la Tocnaye 1

 1 Optics Department, Institut Mines-Télécom Atlantique, Technopôle Brest Iroise, CS 83818, CEDEX 03,
 29238 Brest, Brittany, France; loic.massin@imt-atlantique.fr (L.M.); fabrice.seguin@imt-atlantique.fr (F.S.);
 vincent.nourrit@imt-atlantique.fr (V.N.); jl.debougrenet@imt-atlantique.fr (J.-L.d.B.d.l.T.)
 2 Laboratoire des Sciences et Techniques de l’Information, de la Communication et de la Connaissance,
 UMR 6285, 29238 Brest, Brittany, France
 * Correspondence: cyril.lahuec@imt-atlantique.fr

Abstract: We present the design, fabrication, and test of a multipurpose integrated circuit (Application Specific Integrated Circuit) in AMS 0.35 µm Complementary Metal Oxide Semiconductor technology. This circuit is embedded in a scleral contact lens, combined with photodiodes enabling the gaze direction detection when illuminated and wirelessly powered by an eyewear. The gaze direction is determined by means of a centroid computation from the measured photocurrents. The ASIC is used simultaneously to detect specific eye blinking sequences to validate target designations, for instance. Experimental measurements and validation are performed on a scleral contact lens prototype integrating four infrared photodiodes, mounted on a mock-up eyeball, and combined with an artificial eyelid. The eye-tracker has an accuracy of 0.2°, i.e., 2.5 times better than current mobile video-based eye-trackers, and is robust with respect to process variations, operating time, and supply voltage. Variations of the computed gaze direction transmitted to the eyewear, when the eyelid moves, are detected and can be interpreted as commands based on blink duration or using blinks alternation on both eyes.

Keywords: instrumented contact lens; eye-tracker; blink detection; blink interpretation; CMOS; ASIC

1. Introduction

Recent advances in wearable electronics, combined with wireless communications, have triggered a huge interest in the development of smart contact lenses [1]. Currently, they are capable of monitoring physiological information to aid with the treatment of pathologies (e.g., intraocular pressure for glaucoma or tear film glucose concentration for diabetics) and may be the next step in augmented reality (cf. retinal display concepts such as the Mojo lens). Hence, contact lenses should not be considered only as single-task devices but as an integrating platform embedding several functions simultaneously. Our paper examines the use of an ASIC to implement multipurpose tasks on the same contact lens, i.e., here eye-tracking and blink detection.

In parallel, eye-tracking has become a common tool in many fields, from cognitive science to human-computer interaction, through medical diagnosis [2], driver assistance [3], and head-mounted displays with augmented/virtual reality [4]. Video-based eye-trackers are currently the most widely used approach. One or multiple cameras acquire images of the eyes, usually illuminated in infrared, and the gaze direction is estimated by image processing using various methods [5]. The video-based approach owes its success to its noninvasive nature and to constant progress in imaging, computing power, and image processing. However, various imaging conditions (camera resolution, iris color, lighting conditions, etc.) can significantly decrease performance.
Moreover, the necessary powerful computers and fast cameras make them difficult to integrate in demanding environments. Accurate gaze tracking typically requires head immobilization, the use of several cameras and IR sources, or lengthy calibration [4]. Thus, over the last decade, the reported accuracy of head-mounted eye-trackers has remained above 0.5° [2–7].

An approach to improve gaze-tracking accuracy is the integration of functions in a scleral contact lens (SCL) [8–10]. In this system, photodetectors (PTDs) encapsulated into an SCL are illuminated by an IR source placed on an eyewear. The gaze direction is computed from the PTDs' response by an integrated circuit (ASIC) implemented into the lens and powered by an inductive link. The result is then wirelessly sent to the eyewear by near-field communication (NFC), also used to power the ASIC. The ASIC is activated as soon as the inductive link is established and transmits the digitized gaze direction to the eyewear, Figure 1a. Hence, no action from the user is required.

Figure 1. (a) Multifunction-instrumented SCL and its eyewear. An inductive link is used to transmit data and power the SCL. (b) Block diagram of both elements; the eyewear performs data postprocessing to use blinks as an HMI.

Such a device could also be used to detect blinks. Blinks may provide important information on cognitive processing, but they could also serve as the basis for a specific human-machine interface (HMI) [11]. For instance, in demanding environments (e.g., surgery, driving), the user could blink to select a function among several implemented into the SCL-eyewear pair (e.g., eye-tracking, displaying information, etc.), while keeping their hands free for other tasks [12]. Such a system should obviously be able to differentiate between involuntary and deliberate blinks. A blink usually lasts 0.1 s to 0.2 s; therefore, one can consider that a blink lasting longer than 0.5 s is a conscious one.
This paper extends the work presented in [9] by adding this HMI to the system. The HMI is implemented using the ASIC designed to continuously compute the gaze direction. The blinking command is detected on the eyewear from the data sent by the SCL, using a simple algorithm. By analyzing blink durations or blink sequences on both eyes, a voluntary command can be detected. This command could be used, for example, to validate whether the gaze direction data are recorded, or to validate a target designation. The number of possible combinations is increased when considering both eyes, allowing more complex HMIs to be designed. Moreover, we present performance measurements (accuracy, precision, robustness, energy harvesting, among others) of the full system (ASIC encapsulated into a scleral lens, plus eyewear) in realistic test set-ups, whereas [9] presented only simulation results of the ASIC.
2. Materials and Methods

The system is composed of the instrumented 16.5 mm diameter SCL and the eyewear. The instrumented SCL comprises a gaze-computing ASIC, four infrared Silonex SLCD-61N8 photodetectors (sensitivity Rλ = 0.55 A/W at λ = 940 nm), and an NXP NFC NTAG NT3H110. Based on the PTDs' response, a 1.8 V subthreshold-biased 0.35 µm CMOS ASIC computes the centroid of the received IR. As the eye moves, the centroid changes accordingly [9]. Two identical current-mode subthreshold analog CMOS cores perform the centroid computations, one per direction, horizontal (θ) and vertical (ϕ), and are only activated when required to save power. The results are then sequentially digitized by a voltage-mode 12-bit successive-approximation register analog-to-digital converter (SAR ADC). The operating frequency is 12.5 kHz, to meet the data rate requirement of at least 500 Hz, and is provided by the output signal of a 400 kHz Wien bridge harmonic oscillator divided by 32. Since the centroid is output as a current and the SAR ADC is voltage mode, a 1 MΩ resistor RI-V performs the current-to-voltage conversion, with a linear response over a wide dynamic range. In addition, a subthreshold folded-cascode amplifier implements a unity-gain buffer for impedance matching. After digitization, both coordinates are sequentially transmitted via the NFC tag, Figure 1b. The time diagram of the ASIC is shown in Figure 2.

Figure 2. Time diagram for our system. Centroid computing units compute θ and ϕ and are resting most of the time to save power. The results are sequentially sampled at 500 Hz.

Figure 3 shows a microphotograph of the fabricated ASIC. Supplied by a 1.8 V voltage source, the ASIC consumes an average of 137 µW (versus 177 µW if the centroid computation blocks were on all the time).

Figure 3. Microphotograph of the fabricated 0.35 µm CMOS ASIC. The circuit supply voltage is 1.8 V and the active surface area is 0.69 mm².
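As an illustration of the principle (not of the ASIC's current-mode circuits, which are detailed in [9]), the centroid of the received IR can be sketched as the photocurrent-weighted mean of the photodiode positions, normalized by the total photocurrent. The coordinates used below are hypothetical placeholders, not the prototype's actual PTD layout.

```python
import numpy as np

# Hypothetical PTD layout on the lens (mm), symmetric about the optical axis.
# These coordinates are illustrative only, not the prototype's actual layout.
PTD_POS = np.array([
    [ 0.0,  4.0],   # top
    [ 0.0, -4.0],   # bottom
    [-4.0,  0.0],   # left
    [ 4.0,  0.0],   # right
])

def centroid(photocurrents):
    """Photocurrent-weighted centroid of the PTD positions.

    photocurrents: four currents (A), ordered as in PTD_POS.
    Returns the (horizontal, vertical) centroid in mm; the ASIC produces the
    equivalent quantities as two currents, one analog core per axis.
    """
    i = np.asarray(photocurrents, dtype=float)
    total = i.sum()
    if total <= 0:
        raise ValueError("no IR power received")
    return PTD_POS.T @ i / total

# Equal illumination (gaze straight ahead): centroid at the origin.
print(centroid([1e-6, 1e-6, 1e-6, 1e-6]))          # -> [0. 0.]
# More light on the right/top PTDs: the centroid shifts accordingly.
print(centroid([1.2e-6, 0.8e-6, 0.9e-6, 1.1e-6]))
# Uniform attenuation (e.g., the eyelid covering all four PTDs) cancels out
# in the normalization, which is why the value recovers during a full blink.
print(centroid(0.76 * np.ones(4) * 1e-6))          # -> [0. 0.]
```

Note the normalization: any change affecting the four photocurrents equally leaves the centroid unchanged, a property exploited later for blink detection.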
The centroid computation yields an inherent error since the eye is a sphere and not a plane. This predictable error is easily corrected by a lookup table (LUT) prefilled with 1024 values uniformly distributed between ±20°, obtained by means of a theoretical study, allowing each calculated angle to correspond to a real angle. For system robustness purposes, the LUT contains more corrected values than the typical ±16° angle range of human eye movements. Moreover, to cater for other sources of impairment (fabrication process variations, eyewear–SCL misalignments, etc.), a simple but robust and efficient calibration procedure is implemented [9]. Monte Carlo (MC) simulations showed that, after calibration and LUT compensation, the system accuracy is 0.2° [9].

The eyewear is equipped with an NFC reader and a microcomputing unit implementing, for power consumption considerations, both the LUT and the calibration, as well as the blink detection and interpretation algorithm which triggers commands, Figure 1b. The eyewear electronics is emulated by a Raspberry Pi 3 B+ card running Python scripts for calibration, tabulated correction of the eye-tracker function [9], and the blink detection and interpretation algorithm. Figure 4 shows the prototype electronics before encapsulation in an SCL (a) and after (b) on a mock-up eye. The substrate is a 200 µm thick filled polyimide with 35 µm thick copper tracks. To simplify the fabrication, the NFC TAG chip is not encapsulated but is wire-connected to provide the ASIC its power and collect the gaze direction data (hence the flexible ribbon). Figure 5 shows the NFC part of the test bench; the other end of the flexible ribbon is clearly visible. The primary coil on the eyewear is excited with a 13.56 MHz, 140 mW source. The lens-size secondary coil is placed at 13 mm from the primary one, which corresponds to the typical eye-to-spectacles distance.

Figure 4. Prototype of the SCL electronics: (a) before encapsulation, (b) encapsulated.

Figure 5. Powering the scleral lens by means of NFC. The flexible ribbon connects the NFC chip to the encapsulated electronics to ease prototyping and to the Raspberry card to retrieve the gaze direction.
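Returning to the LUT correction described at the start of this section, a minimal eyewear-side sketch is given below. The table granularity (1024 entries over ±20°) follows the text; the polynomial used here to prefill the table is only a placeholder, since the real values come from the theoretical sphere-versus-plane study in [9].

```python
import numpy as np

N_ENTRIES = 1024
ANGLE_RANGE = 20.0   # degrees: the LUT covers computed angles in [-20, +20]

# Computed (raw) angles at which the LUT is tabulated.
raw_grid = np.linspace(-ANGLE_RANGE, ANGLE_RANGE, N_ENTRIES)

# Placeholder correction model: the actual table is prefilled from the
# theoretical study in [9], not from this polynomial.
true_grid = raw_grid + 0.002 * raw_grid**3

def lut_correct(raw_angle_deg):
    """Map a computed angle to a corrected (real) angle with a nearest-entry
    lookup, keeping the per-sample cost to a single index computation."""
    idx = int(round((raw_angle_deg + ANGLE_RANGE) / (2 * ANGLE_RANGE) * (N_ENTRIES - 1)))
    idx = min(max(idx, 0), N_ENTRIES - 1)
    return true_grid[idx]

# Example: correcting a raw horizontal angle of 12.3 degrees.
print(lut_correct(12.3))
```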
The instrumented SCL is illuminated by an infrared (IR) Lambertian light source using a single IR LED (Vishay TSAL6100, Vishay Intertechnology, Inc., Malvern, PA, USA) at λ = 940 nm [9], associated with a beam splitter to deport the light source onto the eyewear so that it is not in the field of view, Figure 6.

Figure 6. Deporting the IR light source on the eyewear: (a) principle, (b) actual realization.

Blinking is primarily accomplished by the upper lid; the lower one remains essentially idle. The upper lid displacement on the cornea is about 7 mm, with different speeds during the opening and closing action, 70 and 160 mm/s on average, respectively [13]. The eyelid mainly blocks visible light, with less than 5% transmittance for wavelengths below 600 nm. Moving toward the near infrared, the transmittance of the eyelid increases to around 80% for λ > 900 nm [14].

An artificial upper lid with optical characteristics similar to the human one was designed and assembled. Figure 7 illustrates the set-up, (a) side view and (b) front view. A 25 × 25 mm² color density filter with a 76% transmittance in the near IR (940 nm) is used as the material for the eyelid. The lid movements are emulated using a servomotor (Analog RE3LY S-0008, Conrad Electronic, Hirschau, Germany) to lower and lift the color density filter in front of the mock-up eye. The filter is placed on a 30 mm radius disc centered on the servo rotation axis. The number of rotations per minute (rpm) of the servomotor is set so as to correspond to the upper lid displacement speeds, e.g., 36 and 16 rpm for closing and opening, respectively, yielding artificial eyelid displacement speeds of 160 and 70 mm/s, respectively. The servomotor movements are controlled by an Arduino Uno card running basic programs defining the blinking parameters, e.g., number, duration, frequency, etc. Figure 7c,d shows the actual realization of the artificial upper lid. To facilitate assembly, the IR light source is placed directly in front of the mock-up eye, since the eyewear with the deported IR source would prevent the correct motion of the artificial lid. For the experiments, the light source is therefore placed in front of the lens, ensuring the right source type, at d0 = 13 mm from the prototype. Using an optical power meter (Thorlabs PM100D), the irradiance at the surface of the lens is 16.4 µW per mm², in accordance with eye safety regulation [6].

Figure 7. Eyelid emulator principle: (a) side view, (b) front view. Actual implementation: (c) side view, (d) front view.

The eyewear is implemented as a Raspberry Pi 3 B+ card. The gaze direction, directly read from the I2C ports of the tag to ease prototyping (see Figure 5), is first calibrated and corrected according to the algorithm described in [9] before being processed to detect the blinking command.
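A simplified sketch of this eyewear-side acquisition loop is shown below. The I2C address, register offset, byte packing, and code-to-degree scaling are hypothetical placeholders (the actual mapping used with the NT3H110 tag is not reproduced here); the calibration, LUT correction, and blink detection steps are those described in the text.

```python
import time
from smbus2 import SMBus

I2C_BUS = 1
TAG_ADDR = 0x55    # hypothetical I2C address of the NFC tag
GAZE_REG = 0x00    # hypothetical offset of the 4-byte gaze payload

def read_raw_angles(bus):
    """Read two 16-bit words holding the 12-bit theta and phi ADC codes."""
    data = bus.read_i2c_block_data(TAG_ADDR, GAZE_REG, 4)
    theta_code = (data[0] | (data[1] << 8)) & 0x0FFF   # assumed packing
    phi_code = (data[2] | (data[3] << 8)) & 0x0FFF
    return theta_code, phi_code

def code_to_degrees(code, full_scale_deg=20.0):
    """Map a 12-bit ADC code to a signed angle (placeholder scaling)."""
    return (code - 2048) / 2048 * full_scale_deg

with SMBus(I2C_BUS) as bus:
    while True:
        theta_code, phi_code = read_raw_angles(bus)
        theta = code_to_degrees(theta_code)
        phi = code_to_degrees(phi_code)
        # Calibration, the LUT correction, and the blink detection described
        # below are applied at this point in the actual Python scripts.
        print(f"theta = {theta:+.2f} deg, phi = {phi:+.2f} deg")
        time.sleep(1 / 500)   # ~500 Hz polling, matching the data-rate target
```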
When the eyelid closes, it gradually covers the four PTDs, starting from the top. This rapidly modifies the value of the computed centroid. When the eyelid covers all the PTDs, the centroid value returns to its initial value (since the influence of the eyelid then disappears in the centroid calculation). There is thus a brief but significant variation, a glitch, in the time sequence of the received centroid values at the beginning of the blink. The behavior is similar when the eyelid opens. To detect these variations, a z-score algorithm is implemented on the Raspberry Pi. Basically, it computes the distance of a measured gaze direction to the mean value of the gaze direction computed over a sliding time window, Equation (1), considering only the current horizontal eye angle θ.

Z = (θ − µ)/σ, (1)
 where µ is the average value of the previous 500 computed values of θ and σ the standard
 deviation of these 500 values of θ. Then, the absolute value of Z is compared to a threshold,
 chosen from several measurements, which is 4.4; if |Z| > 4.4, then the glitch is detected.
 Once the glitches are detected, the program can discriminate between voluntary and
 reflex blinks based on their durations.
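Equation (1) translates directly into a few lines of Python. The sketch below uses the 500-sample window and the experimentally chosen threshold of 4.4 stated above; it illustrates the detector rather than reproducing the exact eyewear script.

```python
from collections import deque
import statistics

WINDOW = 500       # previous theta samples used for the mean and deviation
THRESHOLD = 4.4    # |Z| above this value flags a glitch (blink edge)

class GlitchDetector:
    """Sliding-window z-score detector for blink-induced glitches, Equation (1)."""

    def __init__(self, window=WINDOW, threshold=THRESHOLD):
        self.history = deque(maxlen=window)
        self.threshold = threshold

    def update(self, theta):
        """Return True if the new horizontal angle deviates from the recent
        history by more than `threshold` standard deviations."""
        if len(self.history) < 2:
            self.history.append(theta)
            return False
        mu = statistics.fmean(self.history)
        sigma = statistics.stdev(self.history)
        self.history.append(theta)
        return sigma > 0 and abs((theta - mu) / sigma) > self.threshold

# Example: a stable gaze around 3 degrees, then a blink-induced drop.
detector = GlitchDetector()
for theta in [3.0, 3.1, 2.9, 3.0, 3.1, 3.0, 2.9, 3.0, -4.0]:
    print(theta, detector.update(theta))   # only the last sample is flagged
```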

 3. Results
 3.1. Eye-Tracker Performance
 The eye-tracker is first tested using the set-up shown in Figure 6b, with the deported
 IR light source and the NFC tag providing 1.8 V voltage supply to the ASIC. The test bench
is set in a room with no natural light but with the mock-up eye rotating within the typical
angle range of human eye movements, i.e., from −16° to +16° by steps of 2°. The digitized gaze direction, represented by a horizontal angle (θ) and a vertical angle (ϕ), is transmitted to the eyewear, which then applies the LUT correction in real time. The resulting gaze direction accuracy, taking only the θ angle into account, is given in Figure 8 (a similar result is obtained for ϕ). This figure shows that the accuracy obtained from five different ASICs is consistent with the MC analysis represented by the hatched area (250 runs). Repeating the accuracy measurement 100 times for each point yields a precision of 0.04°.

Figure 8. Measured accuracies from 5 ASICs, compared to MC analysis (hatched area); the supply is nominal, i.e., 1.8 V.

As most video-based eye-trackers require frequent recalibration, it is interesting to assess the validity duration of the proposed cameraless eye-tracker. The calibration is done at t0, and the accuracy is then measured at t0, 30 min later, and one hour later. The results show the calibration is robust over time (the accuracy remains at most 0.2°, Figure 9).

Figure 9. Calibration robustness versus operating time. Calibration done once at t0. Accuracy remains stable even after 1 h without recalibration. The supply is nominal, i.e., 1.8 V.
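For reference, the sketch below shows one way the accuracy and precision figures reported here can be computed from the logged, corrected angles. The exact conventions used by the authors (e.g., worst-case versus mean error) are not spelled out in the text, so the definitions below are an assumption for illustration only.

```python
import numpy as np

def accuracy_and_precision(measured, reference):
    """measured: (n_repeats, n_angles) corrected gaze angles in degrees.
    reference: (n_angles,) ground-truth mock-up eye angles in degrees.

    Accuracy is taken as the worst mean absolute error over the tested angles
    and precision as the worst standard deviation over the repeats -- one
    possible convention among several."""
    measured = np.asarray(measured, dtype=float)
    reference = np.asarray(reference, dtype=float)
    mean_error = np.abs(measured.mean(axis=0) - reference)   # per test angle
    spread = measured.std(axis=0, ddof=1)                     # per test angle
    return mean_error.max(), spread.max()

# Toy example: 100 repeats over the -16..+16 degree grid in 2 degree steps,
# with a 0.1 degree systematic offset and 0.04 degree random noise.
rng = np.random.default_rng(0)
reference = np.arange(-16, 17, 2, dtype=float)
measured = reference + 0.1 + rng.normal(0.0, 0.04, size=(100, reference.size))
print(accuracy_and_precision(measured, reference))   # roughly (0.1, 0.04)
```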
3.2. Detecting the Eye Blink with the Artificial Eyelid

To illustrate the blink detection, the set-up shown in Figure 7 is used. The ASIC is still powered using the NFC tag. The mock-up eye is placed at angles θ ≈ ϕ ≈ 0°. The servomotor is programmed to apply a series of blinks with various durations: two blinks 300 ms apart, each lasting about 270 ms, with closing and opening times of 45 ms and 100 ms, respectively. These values correspond to unconscious blinks. Figure 10 shows the θ angles the eye-tracker provides over time, top graph. Before the eyelid first closes, the correct θ value is obtained. When the eyelid closes, the θ value changes rapidly, with the lowest value reached when the artificial eyelid completely covers the IR light source. Since the four PTDs then receive the same amount of attenuated IR light, the centroid returns to its initial value, as expected. The process repeats itself for the second blink. The eyewear applies the z-score algorithm on the received data to detect the occurrence of openings and closings. Plotting the detections based on the z-score over time yields the series of pulses in the lower graph of Figure 10. Therefore, blinks are detected and their duration can easily be computed to determine whether they correspond to intentional or unintentional ones. As a simple example, a blink lasting longer than 0.5 s, hence interpreted as an intentional one, is used to validate the target designation.

Figure 10. Computed gaze direction versus time. Blink effects on the computation are used to generate blink start and end pulses to establish the blink duration so that the system interprets it as either a reflex or an intentional one.

Partial eyelid closures are detected as a single glitch, equivalent to the start pulse, and can thus be either discarded or taken into account.

4. Discussion

The results presented show that the proposed cameraless eye-tracker is accurate and robust. Several ASICs have been tested; they all have the same performance. The total power consumption of the electronics required by the instrumented SCL is 417 µW, less than half the harvested power (920 µW), out of which 280 µW is consumed by the NXP tag. Encapsulating two integrated circuits into the SCL was not convenient at this stage. Hence, further work will consist in integrating into the ASIC only the parts of the tag necessary to harvest energy and to communicate with the NFC reader. There is plenty of available surface area on the fabricated ASIC, as 61% of the chip core is free (see Figure 3). This should further reduce the total power consumption of the on-lens system. Table 1 summarizes the system's main features and performances.

 Table 1. Eye-tracker main features and performances.

 Item Value Unit
 ASIC power consumption 137 µW
 ASIC active surface area 0.69 mm2
 ASIC supply voltage 1.8 V
 NTAG power consumption 280 µW
 Harvested power 920 µW
 Sample rate 250 Hz
 Eye-tracker accuracy 0.2 degrees
 Eye-tracker precision 0.04 degrees

Table 2 compares the eye-tracking performance to that of other mobile eye-trackers. The accuracy is more than doubled, with a precision almost an order of magnitude better. Moreover, accuracy and precision remain valid after one hour of functioning. Although tested under different conditions, the results are encouraging and show a real potential for the proposed solution.

Table 2. Eye-tracking performance compared with commercial mobile eye-trackers.

                                   This Work         Tobii Pro Glasses 2 [6]                   Pupil Labs Glasses [7]
Test conditions                    On test bench     Worn by user in controlled environment    Worn by user in controlled environment
Accuracy                           0.2°              0.5°                                      0.82°
Precision                          0.04°             0.3°                                      0.31°
Sample rate                        250 Hz            100 Hz                                    240 Hz
Accuracy loss after calibration    None 1 h after    Unknown                                   +0.25° 5 min after

Moreover, we demonstrated that the eyelid only briefly disturbs the computed gaze direction when closing or opening and can therefore be used as a command to stop the gaze detection recording or to validate a target, provided that conscious and unconscious blinks can be differentiated. These short disturbances are used to implement a simple bio-monitoring of the instrumented contact lens by interpreting the blinks, provided reflex or unconscious blinks are set aside. As explained, this can be based on the blink duration; any blink shorter than 0.5 s can be discarded. Based on duration, one can imagine coding several commands using series of voluntary blinks, one blink corresponding to command 1, two blinks to command 2, and so forth. However, using different voluntary blink durations is not a convenient way to implement several commands, even though discriminating different durations is technically not a problem. Indeed, it is difficult to ensure that the user blinks with the correct duration (even allowing for some discrepancy in the duration). This could yield a wrong interpretation and hence activate the wrong command.
A better solution is to use both eyes, each equipped with an instrumented SCL. As unconscious blinking occurs on both eyes at the same time, a blink on a single eye is necessarily a command. This increases the number of possible combinations if coupled to series of blinks (left, right, both eyes). An example of three blink-controlled commands is given in Table 3.

 Table 3. Combining different voluntary blinks to generate three commands: “1” means one blink, “0”
 no blink.

 Right Eye Left Eye Command Number
 1 0 1
 0 1 2
 1 1 3
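A minimal interpreter for Table 3 could look like the sketch below. The 0.5 s voluntary-blink threshold follows the text and unconscious blinks are removed by that duration filter; the time window used to pair a left-eye and a right-eye blink into a "both eyes" command is an assumption.

```python
VOLUNTARY_MIN_S = 0.5      # blinks longer than this are considered deliberate
PAIRING_WINDOW_S = 0.15    # assumed window to treat two blinks as simultaneous

# (right_blink, left_blink) -> command number, as in Table 3
COMMANDS = {(1, 0): 1, (0, 1): 2, (1, 1): 3}

def interpret(right_blinks, left_blinks):
    """right_blinks / left_blinks: lists of (start_s, end_s) blink intervals
    produced by the per-eye glitch detectors. Yields command numbers."""

    def voluntary(blinks):
        return [(s, e) for s, e in blinks if (e - s) >= VOLUNTARY_MIN_S]

    right, left = voluntary(right_blinks), voluntary(left_blinks)
    used_left = set()
    for rs, _re in right:
        # Look for a voluntary left-eye blink starting at about the same time.
        match = next((i for i, (ls, _le) in enumerate(left)
                      if i not in used_left and abs(ls - rs) <= PAIRING_WINDOW_S),
                     None)
        if match is None:
            yield COMMANDS[(1, 0)]
        else:
            used_left.add(match)
            yield COMMANDS[(1, 1)]
    for i, _ in enumerate(left):
        if i not in used_left:
            yield COMMANDS[(0, 1)]

# One long blink of the right eye only -> command 1, then a simultaneous
# long blink of both eyes -> command 3.
print(list(interpret([(0.0, 0.7), (2.0, 2.8)], [(2.05, 2.75)])))   # [1, 3]
```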

The z-score algorithm has the advantage of being simple to implement and yields satisfactory results with the mock-up eye. The chosen threshold value was determined from a series of experiments; it might not be suitable when the system is used on a real eye. Moreover, a fixed threshold is not appropriate considering almost certain user-to-user and device-to-device variations. The threshold should instead be tailored by means of a specific algorithm. The same approach also works for a partial blink, although if the eyelid only covers the two upper photodiodes (fully or partially), a single peak will be detected. For this reason, and for the sake of simplicity, the HMI should rather be based on the differential response between the two eyes, as proposed.
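The threshold-tailoring algorithm is left open here; one possible (purely hypothetical) recipe is to calibrate the threshold per user from a short blink-free fixation recording, for instance by scaling the largest |Z| observed in that baseline, as sketched below. This is not the procedure used in the paper.

```python
import statistics

def calibrate_threshold(baseline_theta, margin=1.5):
    """Estimate a per-user z-score threshold from a blink-free recording.

    baseline_theta: horizontal angles recorded while the user fixates without
    blinking. The threshold is the largest |Z| seen in this baseline, scaled
    by a safety margin -- a hypothetical recipe for illustration only.
    """
    mu = statistics.fmean(baseline_theta)
    sigma = statistics.stdev(baseline_theta)
    if sigma == 0:
        return margin
    worst = max(abs((t - mu) / sigma) for t in baseline_theta)
    return margin * worst

# Example with a synthetic, slightly noisy fixation baseline.
baseline = [3.0 + 0.05 * ((i * 37) % 11 - 5) for i in range(500)]
print(calibrate_threshold(baseline))
```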
As the system is intended to be worn, ensuring eye safety is mandatory. Bio-compatibility is guaranteed by encapsulating all components (flexible substrate, ASIC, PTDs, etc.) in a medical SCL. Moreover, this rigid lens rests on the sclera and thus creates a tear-filled vault between the SCL and the cornea, improving comfort and avoiding eye dryness. Since the eyewear continuously illuminates the eye, an IR source is chosen to avoid disturbing the user. The measured irradiance at the lens surface is less than 20 µW/mm², compliant with ocular safety standards for near-IR lighting [15]. Regarding the induction link, the specific absorption rate (SAR) must be less than 2 W/kg. Hence, for safety reasons, it is necessary to limit the power emitted by the primary antenna, embedded in our case in the eyewear frame and thus at a fixed distance of 1.5 to 2 cm from the eyes. The only variable parameter is the orientation, which changes when the eye rotates. The rotation of the secondary coil with respect to the primary one, from 0° to 16°, leads to a 30% loss but has no impact on the system's energy harvesting. Simulations in a similar case (but with no rotation) show a SAR of 0.021 W/kg with a 2 W power source [16]. Therefore, we can conclude that the 140 mW power source used in this study complies with safety standards. Moreover, studies have shown prototypes of electronic lenses encapsulating an LED, consuming up to 10 mW, worn by humans without a significant rise in temperature. The system presented, consuming less than 0.5 mW, should generate only moderate heat and be compliant with human safety.
Finally, the current system could also be used to detect fatigue, since blink duration, blink interval, and partial closures are symptoms of fatigue or drowsiness [17,18]. This only requires modifying the software program on the eyewear to obtain these data from the start and end pulses detected by the system.
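Since the start and end pulses are already available, deriving such fatigue-related statistics is mostly bookkeeping, as the sketch below illustrates with blink rate, mean blink duration, and mean inter-blink interval over a recent window. The window length is arbitrary here, and any fatigue thresholds would have to come from studies such as [17,18], not from this example.

```python
def blink_statistics(blinks, window_s=60.0, now_s=None):
    """blinks: list of (start_s, end_s) intervals built from start/end pulses.

    Returns (blinks_per_minute, mean_duration_s, mean_interval_s) over the
    last window_s seconds; entries are None when too few blinks are available.
    """
    if now_s is None:
        now_s = blinks[-1][1] if blinks else 0.0
    recent = [(s, e) for s, e in blinks if s >= now_s - window_s]
    if not recent:
        return 0.0, None, None
    rate = len(recent) * 60.0 / window_s
    mean_duration = sum(e - s for s, e in recent) / len(recent)
    starts = [s for s, _ in recent]
    intervals = [b - a for a, b in zip(starts, starts[1:])]
    mean_interval = sum(intervals) / len(intervals) if intervals else None
    return rate, mean_duration, mean_interval

# Example: four blinks in the last minute, one of them noticeably longer.
print(blink_statistics([(1.0, 1.2), (16.0, 16.25), (33.0, 33.9), (50.0, 50.2)]))
```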

 Author Contributions: Conceptualization, J.-L.d.B.d.l.T.; writing—original draft preparation, C.L.;
 writing—review and editing, all authors; IC design, L.M., C.L. and F.S.; tests and measures, L.M. and
 V.N.; supervision, J.-L.d.B.d.l.T., C.L., F.S. and V.N.; funding acquisition, J.-L.d.B.d.l.T. and C.L. All
 authors have read and agreed to the published version of the manuscript.
 Funding: This research was funded by the “Institut Mines-Telecom Fundation Futur and Ruptures” program.
 Institutional Review Board Statement: Not applicable.
 Informed Consent Statement: Not applicable.
 Data Availability Statement: Not applicable.
 Conflicts of Interest: The authors declare no conflict of interest. The funders had no role in the design
 of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or
 in the decision to publish the results.

References
1. Park, J.; Kim, J.; Kim, S.-Y.; Cheong, W.H.; Jang, J.; Park, Y.-G.; Na, K.; Kim, Y.-T.; Heo, J.H.; Lee, C.Y.; et al. Soft, smart contact
 lenses with integrations of wireless circuits, glucose sensors, and displays. Sci. Adv. 2018, 4, eaap9841. [CrossRef] [PubMed]
2. Larrazabal, A.; Cena, C.G.; Martínez, C. Video-oculography eye tracking towards clinical applications: A review. Comput. Biol.
 Med. 2019, 108, 57–66. [CrossRef] [PubMed]
3. Vicente, F.F.; Huang, Z.; Xiong, X.; De la Torre, F.; Zhang, W.; Levi, D. Driver Gaze Tracking and Eyes Off the Road Detection
 System. IEEE Trans. Intell. Transp. Syst. 2015, 16, 2014–2027. [CrossRef]
4. Kar, A.; Corcoran, P. A Review and Analysis of Eye-Gaze Estimation Systems, Algorithms and Performance Evaluation Methods
 in Consumer Platforms. IEEE Access 2017, 5, 16495–16519. [CrossRef]

5. Villanueva, A.; Cabeza, R. A Novel Gaze estimation system with one calibration point. IEEE Trans. Syst. Man Cybern. Part B 2008,
 38, 1123–1138. [CrossRef] [PubMed]
6. Cognolato, M.; Atzori, M.; Müller, H. Head-mounted eye gaze tracking devices: An overview of modern devices and recent
 advances. J. Rehabil. Assist. Technol. Eng. 2018, 5, 2055668318773991. [CrossRef]
7. Ehinger, B.V.; Groß, K.; Ibs, I.; König, P. A new comprehensive eye-tracking test battery concurrently evaluating the Pupil Labs
 glasses and the EyeLink 1000. PeerJ 2019, 7, e7086. [CrossRef] [PubMed]
8. Khaldi, A.; Daniel, E.; Massin, L.; Kärnfelt, C.; Ferranti, F.; Lahuec, C.; Seguin, F.; Nourrit, V.; De Bougrenet de la Tocnaye, J.-L. A
 laser emitting contact lens for eye tracking. Sci. Rep. 2020, 10, 14804. [CrossRef] [PubMed]
9. Massin, L.; Seguin, F.; Nourrit, V.; Daniel, E.; De Bougrenet de la Tocnaye, J.-L.; Lahuec, C. Smart Contact Lens Applied to Gaze
 Tracking. IEEE Sens. J. 2021, 21, 455–463. [CrossRef]
10. Massin, L.; Nourrit, V.; Lahuec, C.; Seguin, F.; Adam, L.; Daniel, E.; de Bougrenet de la Tocnaye, J.-L. Development of a new
 scleral contact lens with encapsulated photodetectors for eye tracking. Opt. Express 2020, 28, 28635. [CrossRef] [PubMed]
11. Morimoto, C.H.; Mimica, M.R. Eye gaze tracking techniques for interactive applications. Comput. Vis. Image Underst. 2005, 98,
 4–24. [CrossRef]
12. Sony Interactive Entertainment Inc. Interface Using Eye Tracking Contact Lenses. U.S. Patent US20120281181A1, 21 January 2014.
13. Sforza, C.; Rango, M.; Galante, D.; Bresolin, N.; Ferrario, V. Spontaneous blinking in healthy persons: An optoelectronic study of
 eyelid motion. Ophthalmic Physiol. Opt. 2008, 28, 345–353. [CrossRef] [PubMed]
14. Prystowsky, J.H.; Keen, M.S.; Rabinowitz, A.D.; Stevens, A.W.; DeLeo, V.A. Present status of eyelid phototherapy: Clinical efficacy
 and transmittance of ultraviolet and visible radiation through human eyelids. J. Am. Acad. Dermatol. 1992, 26, 607–613. [CrossRef]
15. ICNIRP. Guidelines on limits of exposure to incoherent visible and infrared radiation. Health Phys. 2013, 5, 74–96.
16. Park, J.; Ahn, D.B.; Kim, J.; Cha, E.; Bae, B.S.; Lee, S.Y.; Park, J.U. Printing of wirelessly rechargeable solid-state supercapacitors
 for soft smart contact lenses with continuous operations. Sci. Adv. 2019, 5, eaay0764. [CrossRef] [PubMed]
17. Stern, J.A.; Boyer, D.; Schroeder, D. Blink Rate: A Possible Measure of Fatigue. Hum. Factors J. Hum. Factors Ergon. Soc. 1994, 36,
 285–297. [CrossRef] [PubMed]
18. Bang, J.W.; Heo, H.; Choi, J.-S.; Park, K.R. Assessment of Eye Fatigue Caused by 3D Displays Based on Multimodal Measurements.
 Sensors 2014, 14, 16467–16485. [CrossRef] [PubMed]