During the last few decades, as part of efforts to enhance natural human-robot interaction (HRI), considerable research has been carried out to develop human-like gaze control. However, most studies did not consider hardware implementation, real-time processing, and the real environment, factors that should be taken into account to achieve natural HRI. This paper proposes a fuzzy integral-based gaze control algorithm, operating in real time in the real environment, for a robotic head. We formulate gaze control as a multi-criteria decision making (MCDM) problem and devise seven criteria inspired by human gaze. Partial evaluations of all candidate gaze directions are carried out with respect to the seven criteria, defined from perceived visual, auditory, and internal inputs, and fuzzy measures are assigned to the power set of the criteria to reflect the user-defined preference. A fuzzy integral of the partial evaluations with respect to the fuzzy measures is employed to make global evaluations of all candidate gaze directions. The global evaluation values, together with inhibition of return (IOR) and priming, determine the final gaze direction. The effectiveness of the proposed algorithm is demonstrated with a robotic head developed in the Robot Intelligence Technology (RIT) Laboratory at KAIST.

The proposed algorithm is further compared with real human eye-tracking data. Conventional research focused on evaluating each point in the visual input to predict where humans usually pay attention, relying on the high consistency of human gaze. However, humans produce various scanpaths even from the same visual information, because gaze is a cognitive process shaped by individual preferences. Thus, the fuzzy integral-based gaze control algorithm is extended to an evolutionary fuzzy integral-based gaze control algorithm, which produces various scanpaths according to the preferences of human gaze; the fuzzy integral-based gaze control algorithm itself is used to produce the scanpaths.
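The global-evaluation step described above can be sketched with a discrete Choquet integral, a common choice of fuzzy integral for MCDM. The criterion names, scores, and fuzzy measure values below are illustrative placeholders, not the actual seven criteria or learned measures of the proposed system:

```python
def choquet_integral(scores, mu):
    """Discrete Choquet integral of partial evaluations `scores`
    (criterion -> value in [0, 1]) with respect to a fuzzy measure
    `mu` (frozenset of criteria -> value in [0, 1])."""
    # Sort criteria by ascending score: x_(1) <= x_(2) <= ...
    order = sorted(scores, key=scores.get)
    total, prev = 0.0, 0.0
    for i, c in enumerate(order):
        # A_i: criteria whose score is at least x_(i)
        subset = frozenset(order[i:])
        total += (scores[c] - prev) * mu[subset]
        prev = scores[c]
    return total

# Toy example with two criteria (names and values hypothetical):
mu = {frozenset(): 0.0,
      frozenset({'saliency'}): 0.4,
      frozenset({'sound'}): 0.5,
      frozenset({'saliency', 'sound'}): 1.0}
value = choquet_integral({'saliency': 0.2, 'sound': 0.8}, mu)
```

A non-additive fuzzy measure lets interaction between criteria (redundancy or synergy) influence the global evaluation, which a simple weighted sum cannot express; each candidate gaze direction would be scored this way and the maximizer chosen, subject to IOR and priming.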
The produced scanpath is transformed into a fixation map and compared with a scanpath obtained from a human subject using the earth mover's distance (EMD). Based on this comparison, a quantum-inspired evolutionary algorithm (QEA) gradually develops the preferences of human gaze and adjusts the gaze control algorithm to produce a scanpath similar to the human scanpath. The effectiveness of the proposed algorithm is demonstrated by comparing a human scanpath with a scanpath produced by the algorithm, using the developed characteristics, on other images.

Lastly, when there is a task, human gaze shows task-oriented control with high consistency. In this case, gaze is used for information acquisition rather than information representation; both roles are important when robots cooperate with humans. This paper therefore proposes a hierarchical temporal memory (HTM)-based gaze control algorithm for human-robot cooperation. With the HTM, robots can store procedural memories of tasks and recognize when to interact with humans. From the perceived environment, robots derive tasks from procedural memory using the HTM, and then generate their movements with a sampling-based algorithm to perform each task. In each event of the procedural memory, the gaze is changed simultaneously to acquire the information needed to complete the task successfully. When humans interrupt a task, or when the robots are unable to perform it, the robots control their gaze to interact with humans. The effectiveness of the proposed algorithm is demonstrated with Mybot-KSR2, developed in the RIT Laboratory at KAIST.
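As background on the EMD comparison of fixation maps: in one dimension, the EMD between two histograms of equal total mass reduces to the L1 distance between their cumulative sums. The sketch below shows that special case only; actual fixation maps are 2-D, where the EMD is computed by solving a transportation problem (e.g. with an optimal-transport solver):

```python
def emd_1d(p, q):
    """Earth mover's distance between two 1-D histograms of equal
    total mass on the same unit-spaced bins.  In 1-D the EMD equals
    the L1 distance between the cumulative distributions."""
    assert abs(sum(p) - sum(q)) < 1e-9, "histograms must have equal mass"
    cp = cq = total = 0.0
    for a, b in zip(p, q):
        cp += a          # running mass of p up to this bin
        cq += b          # running mass of q up to this bin
        total += abs(cp - cq)  # mass that must still be moved past this bin
    return total

# Moving one unit of mass across two bins costs 2:
d = emd_1d([1, 0, 0], [0, 0, 1])
```

Unlike a bin-wise difference, the EMD grows with how far fixation mass must travel between the two maps, which makes it a natural ground-distance-aware similarity for comparing scanpaths.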