SearchGazer is an eye-tracking library that uses common webcams to infer the eye-gaze locations of visitors on a search engine in real time. In addition, SearchGazer predicts in real time which area of interest within a search engine result page is being examined by a visitor at any moment. SearchGazer extends WebGazer and its eye-tracking model, which self-calibrates by watching web visitors interact with the web page and trains a mapping between features of the eye and positions on the screen. SearchGazer is written entirely in JavaScript and, with only a few lines of code, can be integrated into any search engine that wishes to conduct remote eye-tracking studies. SearchGazer runs entirely in the client browser, so no video data needs to be sent to a server. SearchGazer runs only if the user consents to giving access to their webcam.
<!-- SearchGazer.js library -->
<script src="searchgazer.js" type="text/javascript"></script>
Be aware that for local development you may need to run a simple local HTTP server that supports the HTTPS protocol, since browsers only grant webcam access in a secure context.
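One way to set this up (a sketch, not part of SearchGazer itself; openssl and the http-server npm package are assumptions, and any HTTPS-capable static server works equally well):

```shell
# Generate a throwaway self-signed certificate for local testing
# (assumes openssl is installed; browsers will show a warning you must accept).
openssl req -x509 -newkey rsa:2048 -nodes -keyout key.pem -out cert.pem \
  -days 7 -subj "/CN=localhost"
# Then serve the page directory over HTTPS, e.g. with the http-server package:
#   npx http-server -S -C cert.pem -K key.pem
```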
Once the script is included, the webgazer object is introduced into the global namespace. webgazer has methods for controlling the operation of WebGazer.js, allowing us to start and stop it, add callbacks, or change out modules. The two most important methods on webgazer are webgazer.begin() and webgazer.setGazeListener(). webgazer.begin() starts the data collection that enables the predictions, so it is important to call it early on. Once webgazer.begin() has been called, WebGazer.js is ready to start giving predictions. webgazer.setGazeListener() is a convenient way to access these predictions. This method invokes a callback you provide every few milliseconds with the current gaze location of the user. If you don't need constant access to this data stream, you may instead call webgazer.getCurrentPrediction(), which gives you a prediction at the moment it is called.
webgazer.setGazeListener(function(data, elapsedTime) {
    if (data == null) {
        return;
    }
    var xprediction = data.x; //these x coordinates are relative to the viewport
    var yprediction = data.y; //these y coordinates are relative to the viewport
    console.log(elapsedTime); //elapsed time is based on time since begin was called
}).begin();
Here is the alternate method of getting predictions where you can request a gaze prediction as needed.
var prediction = webgazer.getCurrentPrediction();
if (prediction) {
    var x = prediction.x;
    var y = prediction.y;
}
There are several features that WebGazer.js enables beyond the example shown so far.
WebGazer.js can save and restore the training data between browser sessions by storing it in the browser's localStorage. This occurs automatically when end() is called. If you want each user session to be independent, make sure that you do not call the end() function.
webgazer.end(); //stops data collection and stores the training data to localStorage
At the heart of WebGazer.js are the tracker and regression modules. The tracker module controls how eyes are detected and the regression module determines how the regression model is learned and how predictions are made based on the eye patches extracted from the tracker module. These modules can be swapped in and out at any time. We hope that this will make it easy to extend and adapt WebGazer.js and welcome any developers that want to contribute.
WebGazer.js requires the bounding box that includes the pixels from the webcam video feed that correspond to the detected eyes of the user. Currently we include three external libraries that implement different Computer Vision algorithms to detect the face and eyes.
webgazer.setTracker("clmtrackr"); //set a tracker module
webgazer.addTrackerModule("newTracker", NewTrackerConstructor); //add a new tracker module
Here are all the external tracker modules that come by default with WebGazer.js. Let us know if you want to introduce your own facial feature detection library.
webgazer.setRegression("ridge"); //set a regression module
webgazer.addRegressionModule("newReg", NewRegConstructor); //add a new regression module
Here are all the regression modules that come by default with WebGazer.js. Let us know if you would like to introduce different modules - just keep in mind that they should be able to produce predictions very fast.
It may be necessary to pause the data collection and predictions of WebGazer.js for performance reasons.
webgazer.pause(); //WebGazer.js is now paused, no data will be collected and the gaze callback will not be executed
webgazer.resume(); //data collection resumes, gaze callback will be called again
We provide some useful functions and objects in webgazer.util. The webgazer.params object also contains some useful parameters to tweak, controlling video fidelity (which trades off speed and accuracy) and the sample rate for mouse movements.
webgazer.util.bound(prediction);
prediction.x; //now always in the bounds of the viewport
prediction.y; //now always in the bounds of the viewport
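The bounding behavior amounts to clamping the prediction into the viewport. A minimal sketch of the idea (the viewport size is passed in explicitly here, since outside the browser there is no window object; the library itself uses the real window dimensions):

```javascript
// Clamp a prediction into a w x h viewport, as webgazer.util.bound does
// with the actual viewport. This is an illustrative sketch, not the
// library's implementation.
function bound(prediction, w, h) {
    return {
        x: Math.min(Math.max(prediction.x, 0), w),
        y: Math.min(Math.max(prediction.y, 0), h)
    };
}

console.log(bound({ x: -12, y: 900 }, 1280, 720)); // { x: 0, y: 720 }
```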
For Bing and Google, SearchGazer provides findDomElementBing(x,y) and findDomElementGoogle(x,y) respectively, which, given a pair of coordinates within the search engine result page, return the corresponding DOM element.
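One way such a coordinate-to-area lookup can work (a self-contained sketch, not SearchGazer's actual code) is to test the predicted coordinates against the bounding boxes of known areas of interest; in the browser these rectangles would come from getBoundingClientRect(), but here they are hard-coded so the sketch runs on its own:

```javascript
// Hypothetical areas of interest on a result page, with viewport bounding boxes.
var areasOfInterest = [
    { name: "query box",     left: 150, top: 20,  width: 500, height: 40 },
    { name: "first result",  left: 150, top: 120, width: 500, height: 90 },
    { name: "second result", left: 150, top: 220, width: 500, height: 90 }
];

// Return the name of the area containing (x, y), or null if none matches.
function findAreaOfInterest(x, y) {
    for (var i = 0; i < areasOfInterest.length; i++) {
        var a = areasOfInterest[i];
        if (x >= a.left && x <= a.left + a.width &&
            y >= a.top && y <= a.top + a.height) {
            return a.name;
        }
    }
    return null;
}

console.log(findAreaOfInterest(300, 150)); // "first result"
```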
SearchGazer uses the getUserMedia/Stream API to get access to the webcam, so it is supported by the browsers that implement this API.
SearchGazer can identify which areas of interest are being examined in real time for the following search engines: Bing and Google.
Download searchgazer.js and add it as a script in your HTML page. For instructions on how to initialize the eye tracking and area of interest detection check the Usage section.
See how SearchGazer works on Bing and Google. The only thing you need to do is make sure your face is captured correctly; on the next page, click the center of a black circle that will appear in different locations on your screen. Once you have successfully clicked the target 9 times, you can choose to see SearchGazer's predictions in action on a Bing or Google demo. The areas of interest that correspond to the location of the predicted gaze will be logged in the console.
If you use SearchGazer please cite the following paper:
@inproceedings{papoutsaki2017searchgazer,
    author = {Alexandra Papoutsaki and James Laskey and Jeff Huang},
    title = {SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search},
    booktitle = {Proceedings of the ACM SIGIR Conference on Human Information Interaction \& Retrieval (CHIIR)},
    year = {2017},
    organization = {ACM}
}
Copyright (C) 2020 Brown HCI Group
Licensed under GPLv3.