Frontiers in Robotics and AI (Front. Robot. AI), ISSN 2296-9144, Frontiers Media S.A. DOI: 10.3389/frobt.2023.1121624. Original Research.

Robots with tears can convey enhanced sadness and elicit support intentions

Akiko Yasuhara 1 * and Takuma Takehara 2

1 Graduate School of Psychology, Doshisha University, Kyotanabe, Kyoto, Japan
2 Department of Psychology, Doshisha University, Kyotanabe, Kyoto, Japan

Edited by: José Carlos Castillo, University Carlos III of Madrid, Spain

Reviewed by: Enrique Fernández Rodicio, Universidad Carlos III de Madrid, Spain

Kazunori Terada, Gifu University, Japan

*Correspondence: Akiko Yasuhara, a.yasuhara.do@gmail.com

These authors have contributed equally to this work

Received: 12 December 2022; Accepted: 18 May 2023; Published: 01 June 2023. Front. Robot. AI 10:1121624. Copyright © 2023 Yasuhara and Takehara.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

The behaviour of shedding tears is a unique human expression of emotion. Human tears have an emotional signalling function that conveys sadness and a social signalling function that elicits support intention from others. The present study aimed to clarify whether the tears of robots have the same emotional and social signalling functions as human tears, using methods employed in previous studies conducted on human tears. Tear processing was applied to robot pictures to create pictures with and without tears, which were used as visual stimuli. In Study 1, the participants viewed pictures of robots with and without tears and rated the intensity of the emotion experienced by the robot in the picture. The results showed that adding tears to a robot’s picture significantly increased the rated intensity of sadness. Study 2 measured support intentions towards a robot by presenting a robot’s picture with a scenario. The results showed that adding tears to the robot’s picture also increased the support intentions indicating that robot tears have emotional and social signalling functions similar to those of human tears.

Keywords: human-robot interaction, communicative robot, social robot, emotional tears, sadness, crying

Section at acceptance: Human-Robot Interaction


      1 Introduction

The ability of communicative robots to express emotions in various ways is critical for human-robot interaction. People communicate with each other using language and nonverbal cues such as gestures and facial expressions to convey their emotions and infer others’ emotions (Barrett, 1993; Lange et al., 2022). By expressing emotions in the same manner as humans, robots can send signals and intentions that humans understand intuitively, without special training in interacting with robots. Thus, it is essential for robots to express emotions to facilitate human-robot interactions (Cañamero and Fredslund, 2001; Eyssel et al., 2010; Kim and Kwon, 2010; Złotowski et al., 2015; Tsiourti et al., 2019). Furthermore, it has been argued that emotions have a social function (van Kleef and Côté, 2022). The social function of emotions refers to the influence of emotional expressions on the thoughts, feelings, and behaviours of others, and it has been widely confirmed in the emotional expressions of robots (Melo et al., 2023). For example, it has been demonstrated that when robots express emotions through language, gestures, and facial expressions, ratings of their intelligence and sociability (Garrell et al., 2017), as well as their humanlikeness, likeability, and closeness (Eyssel et al., 2010), increase. In addition, field research in supermarkets has shown that children are interested in robots that exhibit emotional expressions and actively engage with them (Motoda, 2019). In short, the social function of emotions has many advantages, as it operates not only between people but also between people and robots (Melo et al., 2023). Additionally, many studies have shown that emotional expressions by robots can change people’s attitudes towards them and influence their emotions and behaviours (e.g., Garrell et al., 2017; Motoda, 2019). Moreover, Cañamero and Fredslund (2001) argued that more emotional expressions are required for richer human-robot interactions, and that robots should be able to express diverse and complex emotions similar to those of humans.

In September 2021, a robot that sheds tears like a human was developed at a university in Japan. The robot has tear sacs in its eyes, which moisten the eyes and allow it to shed tears naturally (Yoon, 2021). Associate Professor Sejima, the robot’s developer, hopes to create an atmosphere that allows people to cry easily in the presence of crying robots. The robot is intended for tele-counselling and for activities that aim to relieve stress through crying (Yoon, 2021).

However, it is unclear whether tears shed by robots are recognised as emotional expressions. The behaviour of shedding tears in response to emotional arousal is unique to humans (Vingerhoets, 2013). In studies of non-human subjects, the tears of animals and of avatar faces have been found to convey sadness (Gračanin et al., 2021; Picó and Gadea, 2021). However, to the best of our knowledge, no studies have examined people’s perceptions of tears shed by robots. In other words, it is not clear whether the social functions of emotions that arise when people shed tears operate in the same manner when robots shed tears. Therefore, even if a robot is equipped with the ability to shed tears, it is unclear whether it will have the expected effects, such as emotional communication and interaction, and it is essential to clarify this issue when designing future robots. Accordingly, the present study aimed to determine whether robot tears exhibit the typical signalling functions of human tears, using methods employed in previous studies conducted on human tears. In the next section, the typical emotional and social signalling functions of human tears are reviewed. Finally, the overview and hypotheses of the present study are presented.

Human tears function as a signal that conveys sadness. Provine et al. (2009) and Zeifman and Brown (2011) showed that removing tears from a picture of a tearful person reduced the intensity ratings of sadness. Conversely, adding tears to a sad expression accelerates the rate at which it is perceived as sad, indicating that tears facilitate the perception of sadness (Balsters et al., 2013). Ito et al. (2019) demonstrated that when tears were added to neutral, sad, angry, fearful, and disgusted facial expressions, the rated intensity of sadness was higher for all expressions and the rating pattern of the facial expressions approached that of sadness. They concluded that tears convey sadness.

Additionally, tears can function as a social signal that elicits support intentions from others. Several studies have examined the effect of tears on eliciting support intentions using pictures of tearful people or digitally manipulated pictures of tears. The results have consistently shown enhanced support intentions towards tearful individuals compared to those who are not tearful (Hendriks and Vingerhoets, 2006; Vingerhoets et al., 2016; Stadel et al., 2019). The effect of tears on eliciting support intentions from others was proposed by Zickfeld et al. (2021) as the social support hypothesis. They demonstrated that tears elicit support intentions from others, although the magnitude of the effect varies with variables such as the event that caused the crying, the relationship with the person, and gender. Recently, the process by which tears elicit support intentions has become increasingly apparent. Vingerhoets et al. (2016) and Zickfeld et al. (2021) demonstrated that tears signal feelings of helplessness and warmth. Observers feel connected to a tearful individual, which leads to support intentions towards that individual.

      2 Hypotheses and overview of the study

This study investigated whether tears shed by robots have both emotional and social signalling functions, using methods similar to those employed in previous studies conducted on human tears. Specifically, we used pictures of robots as visual stimuli and measured self-reported responses. These responses provide a first clue to people’s feelings and behaviours in real-life situations. In other words, we aimed to determine whether the responses to a robot’s tears were similar to those to human tears. Study 1 investigated whether robot tears serve as an emotional signal that conveys sadness. Study 2 investigated whether robot tears serve as a social signal that elicits support intentions. The process underlying the effect of human tears on support intentions has been identified in the existing literature; therefore, Study 2 also examined whether robots and humans are similar in this process.

Because people tend to anthropomorphise robots and exhibit social reactions towards them, robot tears may elicit reactions similar to those elicited by human tears. Systematic experiments by Reeves and Nass (1996) have shown that people behave socially and politely towards inorganic objects, such as computers that flatter them. This theory, which argues that people tend to assign human characteristics to media, is called media equation theory (Reeves and Nass, 1996). Furthermore, this phenomenon occurs unconsciously and automatically (Reeves and Nass, 1996), and has been confirmed in people’s reactions to robots (e.g., Rosenthal-von der Pütten et al., 2013; Suzuki et al., 2015). This suggests that people tend to perceive inorganic objects anthropomorphically. Consequently, we predicted that people’s reactions to robot tears would be similar to their reactions to human tears. Specifically, we predicted that people would attribute a robot’s tears to feelings of sadness, thereby increasing the rating of sadness intensity. Furthermore, it was predicted that people would evaluate a robot shedding tears as warm and helpless, similar to a tearful person, and would feel more connected to the robot, resulting in increased support intention.

      3 Study 1

The participants viewed pictures of robots with and without tears and rated the intensity of the emotion experienced by the robot in the picture. In previous studies of human tears, the type of emotion participants were asked to rate varied from study to study: some asked about sadness only (Provine et al., 2009; Balsters et al., 2013), some about negative emotions only (Ito et al., 2019), and some about Ekman’s (1999) six basic emotions (Gračanin et al., 2021). It is also customary practice in numerous facial expression recognition studies to measure Ekman’s (1999) six basic emotions (happiness, sadness, anger, disgust, fear, and surprise; e.g., Miwa et al., 2003; Ge et al., 2008; Kishi et al., 2012; Bennett and Šabanović, 2014). This study is the first to examine the impact of robot tears on emotional ratings. To capture the impact on a broader range of emotions, we asked the participants to rate the six basic emotions proposed by Ekman (1999). Previous studies have shown that tears enhance sadness (e.g., Provine et al., 2009; Ito et al., 2019). Thus, we predicted that robots with tears would receive higher ratings of sadness intensity than those without tears (Hypothesis 1). Conversely, the effect of tears on emotions other than sadness depends on the facial expression before tears are applied. For example, applying tears to an angry facial expression strengthens the perceived anger conveyed by the expression (Gračanin et al., 2021). However, because the robots used in this study were incapable of changing their facial expressions or clearly conveying a particular emotion, no explicit hypotheses were formulated for emotions other than sadness.

3.1 Materials and methods

3.1.1 Participants

Fifty-two undergraduate students from a Japanese university participated in Study 1. Participants with missing data were excluded. The final analysis included 50 participants (16 men, 34 women; mean age = 20.16 years, SD = 1.27). All participants were native Japanese speakers and received class participation points for taking part in the experiment. Before Study 1 was conducted, all participants were provided with an overview of the study using Qualtrics, an online survey platform, and were then asked to select “agree” or “disagree” to participate. Only participants who selected “agree” were included in the survey. The ideal sample size was calculated using G*Power (Faul et al., 2007) with a significance level of 5% and power of 0.80, based on the effect size (d = 0.86) from the meta-analysis by Zickfeld et al. (2021). The power analysis suggested a minimum sample size of 13, which this study exceeded.
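The reported minimum sample sizes (13 here, and 28 in Study 2) can be reproduced from the power function of a two-sided paired t-test, which G*Power evaluates via the noncentral t distribution. The following Python sketch is an illustrative reimplementation, not the authors' actual procedure:

```python
from math import sqrt
from scipy import stats

def paired_t_power(d, n, alpha=0.05):
    """Two-sided power of a paired t-test with n pairs and effect size d."""
    df = n - 1
    ncp = d * sqrt(n)                        # noncentrality parameter
    t_crit = stats.t.ppf(1 - alpha / 2, df)  # two-sided critical value
    # P(|T| > t_crit) under the noncentral t distribution
    return (1 - stats.nct.cdf(t_crit, df, ncp)) + stats.nct.cdf(-t_crit, df, ncp)

def min_sample_size(d, alpha=0.05, target_power=0.80):
    """Smallest n whose power reaches the target."""
    n = 2
    while paired_t_power(d, n, alpha) < target_power:
        n += 1
    return n

print(min_sample_size(0.86))  # Study 1 (d from Zickfeld et al., 2021)
print(min_sample_size(0.56))  # Study 2
```

Both studies comfortably exceeded these minimums (50 and 56 participants, respectively).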

      3.1.2 Visual stimulus

      Four images of robots were used in this study. The robots were selected according to the following criteria set by Mathur and Reichling (2016). Furthermore, the signalling function of human tears has been shown to occur regardless of facial expression prior to the imposition of tears (Zeifman and Brown, 2011; Ito et al., 2019). Many robots are unable to change their facial expressions, and unlike humans, there are no standardised stimuli that represent each emotion. Thus, the facial expressions of the robots were not considered in their selection.

      3.1.2.1 Selection criteria

      1. The entire face of the robot must be in the picture.

      2. The robot’s face should be facing forward and both eyes should be in the picture.

      3. The robot must be designed to interact socially with people.

      4. The robot should have actually been produced.

      5. The robot must physically move (not a sculpture or computer graphics).

6. Picture resolution must be at least 100 d.p.i.

      3.1.2.2 Exclusion criteria

      1. The robot represents a well-known character or celebrity.

      2. The robot represents a specific gender.

      3. The robot is sold as a toy.

Pictures of Hitachi Building Systems Co., Ltd.’s EMIEW, Sharp Corporation’s RoBoHoN, Engineered Arts Ltd.’s RoboThespian, and Vstone Co., Ltd.’s Sota, which met the above criteria and whose use was permitted, were used in the experiments. These pictures were obtained from company websites or provided directly by the companies. Pictures of the robots were cropped from the shoulders upward, and the background was white. Next, the pictures of the four robots were digitally processed using Adobe Photoshop CC 2021 to add tears (Figure 1). The size and position of the tears were standardised across robots. The final number of stimuli was eight: one picture of each robot with tears and one without. The suitability of the tear processing was confirmed in a pilot study before use in the main study. Details of the pilot studies are provided at https://osf.io/cgvt5/?view_only=39a3afb9ba724f669ea852e57a7afee1.

Figure 1. Visual stimuli used in the experiment. From left to right: EMIEW, RoboThespian, and Sota. The picture of RoBoHoN is not shown due to copyright issues.

      3.1.3 Experimental design

      The independent variable was the addition of tears (no tears: No Tears condition; with tears: Tears condition), which was a within-participants factor. The dependent variable was emotional intensity, which measured the intensity of the six emotions (sadness, anger, fear, surprise, disgust, and happiness) for each stimulus.

      3.1.4 Procedure

Participants were seated approximately 30 cm in front of a 26.5 cm × 47.0 cm stimulus presentation monitor and were surveyed using Qualtrics. A picture of the robot was presented at a size of 15.87 cm × 15.87 cm (600 px × 600 px), and participants rated, on a 101-point scale (0: not at all strong, to 100: very strong), how strongly they thought the robot in the picture was experiencing sadness, anger, fear, surprise, disgust, and happiness. The order in which the six emotions were presented was randomised for each trial. Because the number of stimuli was small, each stimulus was presented three times to ensure measurement stability (Balsters et al., 2013; Gračanin et al., 2021), yielding 24 trials in Study 1. Across all 24 trials, stimuli were presented in random order using the Qualtrics randomisation function. The participants evaluated both versions of the same robot, with and without tears, and no time limit was placed on their responses. Study 1 was conducted after obtaining approval (KH-21088) from the Ethics Review Committee of the Faculty of Psychology at Doshisha University.

3.2 Results

      Data were analysed using IBM SPSS Statistics (v. 27). Each response for the four robots was combined for the No Tears and Tears conditions, and the mean of each condition was calculated (Figure 2). Cronbach’s alpha coefficient was calculated to check consistency between robots, which confirmed sufficient consistency (sadness: α = .90, anger: α = .81, fear: α = .83, disgust: α = .75, surprise: α = .78, happiness: α = .62). It should be noted that, in this study, we were interested in the average of the reactions to multiple robots, rather than the reaction to a unique robot. Therefore, the type of robot was not incorporated as a factor, but a post hoc analysis incorporating the imposition of tears and the type of robot as factors was conducted at the reviewer’s suggestion and is presented in the Supplementary Material.
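Cronbach's alpha across the four robots can be computed from a participants-by-items matrix of ratings. A minimal Python sketch (the matrix and its values are hypothetical, not the study's data):

```python
import numpy as np

def cronbach_alpha(X):
    """Cronbach's alpha for a (participants x items) response matrix.

    Here each 'item' would be one robot's rating for a given emotion
    and condition, averaged over its repeated presentations.
    """
    X = np.asarray(X, dtype=float)
    k = X.shape[1]                           # number of items (robots)
    item_vars = X.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_var = X.sum(axis=1).var(ddof=1)    # variance of the sum score
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Perfectly consistent items yield alpha = 1
print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))
```

With four robots as items, alpha near .80 or above (as for most emotions here) supports averaging the robots into a single condition mean.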

Figure 2. Mean emotional intensity for the No Tears and Tears conditions. Asterisks indicate significant differences (*p < .05, **p < .01, ***p < .001). Error bars represent the standard error of the mean.

As shown in Figure 2, the means of the six emotional intensities in the No Tears condition were below the scale midpoint of 50. In the Tears condition, only sadness had a mean above the midpoint. A paired t-test was conducted for each emotion to determine whether the addition of tears changed its rated intensity, using the condition means of the No Tears and Tears conditions. The results showed significant differences for all emotions except anger [anger: t(49) = 1.91, p = .06, d = 0.27, 95% CI (−0.01, 0.55); sadness: t(49) = 24.18, p < .001, d = 3.42, 95% CI (2.69, 4.15); fear: t(49) = 8.96, p < .001, d = 1.27, 95% CI (0.89, 1.64); surprise: t(49) = 2.10, p = .04, d = 0.30, 95% CI (0.01, 0.58); disgust: t(49) = 3.10, p = .003, d = 0.44, 95% CI (0.15, 0.73); happiness: t(49) = −5.14, p < .001, d = −0.73, 95% CI (−1.04, −0.41)]. The rated intensities of sadness, fear, surprise, and disgust were significantly higher in the Tears condition than in the No Tears condition, whereas the rated intensity of happiness was significantly lower.
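Each comparison above combines a paired t-test with a within-subjects Cohen's d (mean difference divided by the standard deviation of the differences). A sketch with made-up ratings for illustration only:

```python
import numpy as np
from scipy import stats

def paired_comparison(tears, no_tears):
    """Paired t-test plus within-subjects Cohen's d (hypothetical data)."""
    tears = np.asarray(tears, dtype=float)
    no_tears = np.asarray(no_tears, dtype=float)
    t, p = stats.ttest_rel(tears, no_tears)  # paired t-test
    diff = tears - no_tears
    d = diff.mean() / diff.std(ddof=1)       # Cohen's d for paired designs
    return t, p, d

# Hypothetical sadness-intensity means for five participants
t, p, d = paired_comparison([80, 75, 90, 70, 85], [20, 30, 25, 35, 15])
```

For a paired design, t = d × √n, so the very large t for sadness directly reflects its very large effect size.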

      3.3 Discussion

      Study 1 investigated whether robot tears also serve as emotional signals for conveying sadness. We hypothesised that robots that shed tears would receive higher ratings for intensity of sadness than robots that did not shed tears (Hypothesis 1). According to the results, the Tears condition had significantly higher ratings for the intensity of sadness compared to the No Tears condition, thereby supporting the hypothesis. Additionally, the Tears condition showed higher ratings of emotional intensity than the No Tears condition for the emotions of fear, surprise, and disgust.

The results of Study 1 indicate that robot tears have the emotional signalling function of conveying sadness. Compared with the No Tears condition, the Tears condition showed a significantly higher sadness intensity rating (77.36), above the scale midpoint. The intensity of happiness was lower in the Tears condition than in the No Tears condition. These results are consistent with previous studies showing that tears enhance sadness (Provine et al., 2009; Zeifman and Brown, 2011; Balsters et al., 2013; Ito et al., 2019; Gračanin et al., 2021). Tears may also enhance the intensity of emotions other than sadness (fear, surprise, and disgust). Shedding tears is an emotionally expressive behaviour associated with emotions other than sadness and occurs as a result of strong emotional arousal (Vingerhoets et al., 2001). This may explain why the addition of tears increased the intensity of these other emotions in the present study. However, their ratings remained below the scale midpoint even with tears, and their effect sizes (fear, d = 1.27; surprise, d = 0.30; disgust, d = 0.44) were smaller than that for sadness (d = 3.42). In other words, the enhancing effect of tears on emotions other than sadness appears to be secondary. Thus, this study demonstrated that tears enhanced sadness the most, that is, robot tears also have the emotional signalling function of conveying sadness.

The enhancement of sadness by robot tears may reflect people’s tendency to perceive media, such as computers and television, anthropomorphically. Anthropomorphism refers to the general tendency of people to attribute human-specific characteristics, including human-like mental abilities, to non-human objects (Epley et al., 2007; Waytz et al., 2010). People tend to anthropomorphise robots (e.g., Tanaka et al., 2007; Rosenthal-von der Pütten et al., 2013), and human tears enhance the sadness of various facial expressions (e.g., Provine et al., 2009; Zeifman and Brown, 2011; Balsters et al., 2013; Ito et al., 2019; Gračanin et al., 2021). Therefore, even when an inorganic robot sheds tears, people may perceive it in the same way as a tearful individual and interpret the robot as experiencing sadness. This would explain why the robot’s tears increased the rated intensity of sadness in the present study.

      4 Study 2

      Study 1 revealed that tears enhance the rated intensity of sadness in robots. Study 2 aimed to investigate whether robot tears serve as a social signal to elicit support intentions using different scenarios. We also examined whether the effect of tears on eliciting support intentions was mediated by perceptions of warmth, helplessness, and connectedness, as in humans. Because many previous studies have shown that tears have a strong effect on eliciting support intentions (e.g., Vingerhoets et al., 2016; Stadel et al., 2019; Zickfeld et al., 2021), it was hypothesised that support intentions would be higher for robots with tears than for robots without tears (Hypothesis 2). Furthermore, the tear-eliciting support intention effect is mediated by increased perceptions of warmth, helplessness, and connectedness of the tearful individual (Vingerhoets et al., 2016; Zickfeld et al., 2021). We also hypothesised that the effect of tears on eliciting support intentions would be mediated by the perceptions of warmth, helplessness, and connectedness (Hypothesis 3).

4.1 Materials and methods

4.1.1 Participants

Fifty-six undergraduate students (22 men and 34 women; mean age = 20.18 years, SD = 1.19 years) from a Japanese university agreed to participate in Study 2. All participants were native Japanese speakers and received class participation points for taking part in the experiment. Before Study 2 was conducted, all participants were provided with an overview of the study using Qualtrics, an online survey platform, and were then asked to select “agree” or “disagree” to participate. Only participants who selected “agree” were included in the survey. The ideal sample size was calculated using G*Power (Faul et al., 2007) with a significance level of 5% and a power of 0.80, based on the effect size (d = 0.56) from the meta-analysis by Zickfeld et al. (2021). The power analysis suggested a minimum sample size of 28, which this study exceeded.

      4.1.2 Visual stimulus

      The same pictures of the robots used in Study 1 were utilised.

      4.1.3 Scenarios

      We created two scenarios to be presented with the robot’s picture, one on the theme of “death” and the other on the theme of “farewell,” which are the antecedents that typically precede crying behaviour (Vingerhoets, 2013). The first scenario was “It has just been decided in front of the robot that this robot will be dismantled tomorrow.” The second was “The rental period for this robot ends today, and this robot is just now leaving the family it is with.”

      4.1.4 Measures

      The present study used the same questionnaire items as those used by Zickfeld et al. (2021) to measure support intentions and perceptions of warmth, helplessness, and connectedness.

      4.1.4.1 Support intentions

      To measure the intention to support the robot in the picture, participants were asked to respond to the following three items using a 7-point scale (0: not at all to 6: very much so): “I would be there if this robot needed me,” “I would express how much I accept this robot,” and “I would offer support to this robot.”

      4.1.4.2 Perceived warmth

      To measure the perceived warmth towards the robot in the picture, participants were asked to indicate the extent to which the two items—“warm” and “friendly”—applied to the robot in the picture using a 7-point scale (from 0: not at all to 6: very much so).

      4.1.4.3 Perceived helplessness

      To measure the perceived helplessness of the robot in the picture, participants were asked to respond to the following three items using a 7-point scale (0: not at all to 6: very much so): “How helpless does this robot appear to you?”, “How overwhelmed does this robot appear to you?”, and “How sad does this robot appear to you?”

      4.1.4.4 Perceived connectedness

To measure the perceived connectedness to the robot in the picture, participants were asked to rate their connectedness on the 7-point Inclusion of Other in the Self (IOS) scale (Aron et al., 1992). The IOS scale consists of seven figures, ranging from two separate circles representing the self and the other to two almost completely overlapping circles.

      4.1.5 Experimental design

      The independent variable was the addition of tears (no tears: No Tears condition, with tears: Tears condition), which was a within-participant factor. The dependent variables were support intention, perceived warmth, perceived helplessness, and perceived connectedness.

      4.1.6 Procedure

Participants were seated approximately 30 cm in front of a 26.5 cm × 47.0 cm stimulus presentation monitor and were surveyed using Qualtrics. A picture of the robot was presented at a size of 15.87 cm × 15.87 cm (600 px × 600 px) along with a scenario, and participants answered the questions. The participants completed 16 trials (four robots × with/without tears × two scenarios). The order of stimulus presentation was randomised across all 16 trials using the Qualtrics randomisation function, and no time limit was placed on the responses. Participants evaluated both versions of the same robot, with and without tears. Because Study 2 included more questions than Study 1, each visual stimulus was not rated three times, in order to reduce participants’ burden; however, because each visual stimulus was evaluated twice, once per scenario, measurement stability was still reasonably ensured. Study 2 was conducted after obtaining approval (KH-21088) from the Ethics Review Committee of the Faculty of Psychology at Doshisha University.

4.2 Results

      Data were analysed using IBM SPSS Statistics (v. 27). The mean of the three items on support intentions was used as the support intention score (α = .92). Similarly, means of the two items for perceived warmth and three items for perceived helplessness were treated as their respective scale scores (warmth: r = .82; helplessness: α = .74).

      4.2.1 Effect of tears

The responses to the four robots and the two scenarios were grouped according to the No Tears and Tears conditions, and the average was calculated for each condition. As shown in Figure 3, the means in the Tears condition were higher than those in the No Tears condition for all the dependent variables. A paired t-test was conducted for each dependent variable to examine whether the addition of tears produced differences in support intention and in perceived warmth, helplessness, and connectedness, using the means of the No Tears and Tears conditions. Results showed significant differences in all dependent variables, with higher scores in the Tears condition than in the No Tears condition [Support intention: t(55) = 7.89, p < .001, d = 1.05, 95% CI (0.72, 1.38); Warmth: t(55) = 9.31, p < .001, d = 1.24, 95% CI (0.89, 1.59); Helplessness: t(55) = 15.48, p < .001, d = 2.07, 95% CI (1.60, 2.53); Connectedness: t(55) = 10.98, p < .001, d = 1.47, 95% CI (1.09, 1.84)].

Figure 3. Mean of each dependent variable for the No Tears and Tears conditions. Asterisks indicate significant differences (***p < .001). Error bars represent the standard error of the mean.

      4.2.2 Mediation analysis

A mediation analysis was performed using R (ver. 4.2.1) to determine whether the relationship between the addition of tears and support intentions was mediated by perceptions of warmth, helplessness, and connectedness (Hypothesis 3). The analysis used the addition of tears as the independent variable, support intention as the dependent variable, and perceived warmth, helplessness, and connectedness as mediating variables (Figure 4). As this study utilised a repeated measures design, a multilevel mediation analysis was conducted with participants included as random intercepts. Monte Carlo simulations were employed to construct 95% confidence intervals for the indirect effects of the addition of tears on support intentions via the mediating variables (Falk and Biesanz, 2016). The 95% confidence intervals for the indirect effects via all mediating variables excluded zero [warmth: B = 0.31, 95% CI (0.24, 0.38); helplessness: B = 0.20, 95% CI (0.10, 0.29); connectedness: B = 0.47, 95% CI (0.38, 0.55)].
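The Monte Carlo interval of Falk and Biesanz (2016) can be illustrated in simplified form: draw the a path (tears → mediator) and b path (mediator → support intention) from normal distributions centred on their estimates, then take percentiles of their product. The coefficients and standard errors below are hypothetical, not those of the fitted model:

```python
import numpy as np

def monte_carlo_indirect_ci(a, se_a, b, se_b, n_sims=100_000, seed=0):
    """95% Monte Carlo CI for the indirect effect a*b (Falk & Biesanz, 2016)."""
    rng = np.random.default_rng(seed)
    a_draws = rng.normal(a, se_a, n_sims)  # sampled a-path coefficients
    b_draws = rng.normal(b, se_b, n_sims)  # sampled b-path coefficients
    indirect = a_draws * b_draws           # sampled indirect effects
    return np.percentile(indirect, [2.5, 97.5])

# Hypothetical paths: tears -> warmth (a) and warmth -> support (b)
lo, hi = monte_carlo_indirect_ci(a=0.5, se_a=0.1, b=0.6, se_b=0.1)
# An interval excluding zero indicates a reliable indirect effect
```

In the actual multilevel model the draws would come from the joint sampling distribution of the fitted coefficients, but the interval logic is the same.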

Figure 4. Mediation model summarising the direct and indirect effects of tears on support intention. Coefficients represent unstandardised estimates; values in parentheses represent 95% confidence intervals.

      4.3 Discussion

Study 2 examined whether robot tears served as social signals that elicited support intentions. We also examined whether the effect of tears on support intention was mediated by perceived warmth, helplessness, and connectedness. It was hypothesised that the Tears condition would result in higher support intentions than the No Tears condition (Hypothesis 2). The results showed that support intentions were significantly higher in the Tears condition than in the No Tears condition, thereby supporting this hypothesis. Furthermore, we hypothesised that the effect of tears on eliciting support intentions would be mediated by perceived warmth, helplessness, and connectedness (Hypothesis 3). The results replicated previous findings: tears not only made the expresser appear helpless, but also made others feel warmer and more connected to them (Stadel et al., 2019; Zickfeld et al., 2021). Furthermore, the effect of tears on eliciting support intentions was mediated by perceived warmth, helplessness, and connectedness, thereby supporting Hypothesis 3. Thus, the results suggest that, even in robots, tears can function as social signals that elicit support intention.

The results of this study demonstrate for the first time that robot tears have a social signalling function that elicits support intentions, thereby supporting the media equation theory proposed by Reeves and Nass (1996). The media equation theory states that people behave socially towards inorganic objects, such as media, and tend to perceive them as anthropomorphic. In the studies conducted by Reeves and Nass (1996), social behaviours that occur in person-to-person relationships were replicated in person-to-media relationships. In the same way, the present study demonstrated that a social behaviour that occurs in person-to-person relationships, namely offering support to a tearful individual, also occurs in person-to-robot relationships.

Additionally, the present study showed that the process by which tears increase support intentions may be similar to that in humans. Tearful robots received higher ratings for perceived warmth, helplessness, and connectedness than non-tearful robots, and these three factors mediated the effect of tears on increased support intentions, replicating the results of previous studies conducted with human pictures (Stadel et al., 2019; Zickfeld et al., 2021). Beyond the signalling effect itself, the finding that a robot's tears elicit support intentions through a human-like process suggests that people may perceive the robot as similar to a person.

      5 General discussion

The present study aimed to investigate whether tears shed by robots have an emotional signalling function for conveying sadness and a social signalling function for eliciting support intentions, using methods similar to those of previous studies conducted on human tears. Study 1 examined whether robot tears served as an emotional signal to convey sadness and demonstrated that robots with tears conveyed enhanced sadness. Study 2 examined whether robot tears served as a social signal to elicit support intentions and found that they did. Moreover, the process by which robot tears elicit support intentions was found to be similar to that in humans.

The results of the present study suggest that tears may have facilitated the anthropomorphisation of the robot. The two experiments conducted in the present study replicated the results of previous studies using human pictures, demonstrating for the first time that tears in robots may function as emotional and social signals in the same manner as in humans. As inorganic objects, robots have no emotions or will and do not shed tears spontaneously. However, because people tend to anthropomorphise inorganic robots (Reeves and Nass, 1996), it is presumed that they perceive emotions and sociality in the tears of the robots. Beyond this flexibility in human cognition, the tears themselves may have acted as a catalyst, leading people to find emotional and social qualities in the robot and promoting its anthropomorphisation and perception as human-like. Riek et al. (2009) examined whether people showed empathy towards robots by showing participants videos of robots being abused. They found that the degree of empathy differed depending on the appearance of the abused robot, with higher empathy shown for a human-like robot (an android) than for a robot with a mechanical appearance (the cleaning robot Roomba). Thus, the personhood of a robot influences people's social responses to it. Moreover, the perception of a robot's personhood has been argued to increase acceptance, liking, familiarity and trust towards the robot and to facilitate social interactions with people (Eyssel et al., 2010; Fink, 2012). Because shedding tears is a unique human expression of emotion (Vingerhoets, 2013), tears can be regarded as symbolic elements of personhood. Therefore, the addition of tears to a robot, an inorganic object, may have strengthened people's tendency to anthropomorphise it and to perceive it as a highly human-like entity.

Furthermore, shedding tears is a meaningful way for robots to express emotions. The results of this study suggest that robot tears increase the rated intensity of sadness, and that shedding tears can therefore function as an emotionally expressive behaviour of sadness in robots. In addition to sadness, tears accompany diverse emotions such as happiness and anger (Vingerhoets, 2013). Although Study 1 did not provide contextual information, it may be possible to express emotions other than sadness through tears, depending on the context in which the tears are shed. Consequently, equipping robots with tear-shedding behaviour is expected to broaden their range of emotional expressions and make human-robot interactions richer and more sophisticated. Moreover, the perception of sad facial expressions differs between young and older adults, with older adults being less sensitive to sad facial expressions than young adults (Phillips and Allen, 2004; MacPherson et al., 2006). However, no differences have been found between older and young adults in their assessments of sadness in response to shed tears, indicating that tears are a universal emotional signal that conveys sadness across all ages (Grainger et al., 2019). Therefore, equipping robots with a tear function as a way of expressing sadness may be beneficial, as it facilitates the transmission of sadness regardless of the age of the person interacting with the robot.

This study demonstrates for the first time that the social functions of emotions can arise when robots shed tears. It has previously been shown that when robots express emotions through facial expressions and gestures, those expressions exhibit the same social functions as in humans (Melo et al., 2023). The present study supports these findings and shows that they extend to the previously untested emotional expression of shedding tears, thereby contributing new knowledge to this area. Specifically, Study 2 showed that robot tears elicited support intentions and thus influenced human behaviour. The behaviour of shedding tears attracts the attention of others and elicits approach responses (Hendriks et al., 2008; Vingerhoets, 2013). Such capabilities could be applied, for example, in the treatment of autism through human-robot interaction (e.g., Scassellati et al., 2012; Taheri et al., 2018) and in “robots that are cared for”, which support the self-esteem and self-affirmation of older adults (Kanoh, 2014). In other words, equipping robots with the ability to shed tears may contribute to further developments in fields where robots are actively utilised.

Finally, the limitations and future perspectives of this study are discussed. First, the baseline of the stimuli used in this study was not uniform. For example, some robots lacked certain facial parts (mouth and nose) or appeared to smile. In humans, tears have been shown to enhance evaluations of sadness and support intentions regardless of the facial expression prior to the addition of tears (e.g., Provine et al., 2009; Zeifman and Brown, 2011; Ito et al., 2019). However, the magnitude of the effect of tears varies across facial expressions (e.g., Ito et al., 2019). Future work is therefore needed to control the facial expressions and facial parts of the robots when examining the effect of tears. In addition, the number of robot types used in the experiments was small. Although four different types of robots were used in this study, many communicative robots with varying appearances have been developed, ranging from those modelled on animals, such as dogs and seals, to those that look more like humans (Weiss et al., 2009; Zecca et al., 2009; Eyssel et al., 2010; Pandey and Gelin, 2018). Consequently, it is unclear whether the present results generalise to all robots. Second, contextual information was limited. Specifically, Study 1 did not include any contextual information; this was necessary to identify the emotions signified by tears alone. However, contextual information is a very important factor in determining the emotion indicated by an emotional expression, especially for tears, which can be associated with a range of emotions (Vingerhoets, 2013; Zickfeld et al., 2020). Future studies should provide contextual information and determine whether a robot's tears can also intensify the evaluation of other emotional expressions, such as joy, being moved, and anger. Third, the present study dealt only with subjective assessments. Our aim was to identify the signalling function of robot tears using an approach similar to that of previous studies conducted on human tears; we therefore asked participants to report their support intentions as subjective evaluations of pictures of the robot. This method is important because it provides the first clues to actual human reactions to robots. However, previous studies examining the social function of emotions have shown that emotional expressions can influence the behaviour of others: compared with opponents who did not express emotions, people made greater concessions in negotiation games to opponents who expressed emotions (Sinaceur et al., 2015), donated more money (Takagi and Terada, 2021), and made higher offers in ultimatum games (Terada and Takeuchi, 2017). In the present study, support intentions increased for robots that shed tears, but it is not clear whether this leads to actual supportive behaviour. Future hypothesis testing using behavioural indicators is required.

      6 Conclusion

Tears in robots may have signalling effects similar to those in humans. Robot tears enhanced the rated intensity of sadness, and tearful robots were perceived as warmer and more helpless. Observers also felt a sense of closeness to the tear-shedding robot, which led to support intentions. These results suggest that robot tears, like human tears, have both an emotional signalling function that conveys sadness and a social signalling function that elicits support intentions. This is the first study to demonstrate the previously unidentified emotional and social signalling functions of robot tears and to point to the potential for new tear-specific interactions between humans and robots.

      Data availability statement

The datasets presented in this study can be found in online repositories. The URL to access the repository can be found below: https://osf.io/cgvt5/?view_only=39a3afb9ba724f669ea852e57a7afee1.

      Ethics statement

      The studies involving human participants were reviewed and approved by the Ethics Review Committee of the Faculty of Psychology of Doshisha University. The participants provided informed consent to participate in this study on the online survey platform Qualtrics.

      Author contributions

      AY and TT contributed to the study conception and design. AY organised the database, performed the statistical analysis, and wrote the first draft and sections of the manuscript. All authors contributed to the article and approved the submitted version.

Acknowledgments

We thank Engineered Arts Ltd., Hitachi Building Systems Co., Ltd., Sharp Corporation, and Vstone Co., Ltd., for providing the pictures of the robots used in this study.

      Conflict of interest

      The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

      Publisher’s note

      All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

      Supplementary material

      The Supplementary Material for this article can be found online at: /articles/10.3389/frobt.2023.1121624/full#supplementary-material

References

Aron A. Aron E. N. Smollan D. (1992). Inclusion of other in the self-scale and the structure of interpersonal closeness. J. Pers. Soc. Psychol. 63 (4), 596–612. 10.1037/0022-3514.63.4.596

Balsters M. J. Krahmer E. J. Swerts M. G. Vingerhoets A. J. (2013). Emotional tears facilitate the recognition of sadness and the perceived need for social support. Evol. Psychol. 11 (1), 148–158. 10.1177/147470491301100114

Barrett K. C. (1993). The development of nonverbal communication of emotion: A functionalist perspective. J. Nonverbal Behav. 17 (3), 145–169. 10.1007/BF00986117

Bennett C. C. Šabanović S. (2014). Deriving minimal features for human-like facial expressions in robotic faces. Int. J. Soc. Robot. 6 (3), 367–381. 10.1007/s12369-014-0237-z

Cañamero L. Fredslund J. (2001). I show you how I like you - can you read it in my face? [robotics]. IEEE Trans. Syst. Man, Cybern. - Part A Syst. Humans 31 (5), 454–459. 10.1109/3468.952719

Epley N. Waytz A. Cacioppo J. T. (2007). On seeing human: A three-factor theory of anthropomorphism. Psychol. Rev. 114 (4), 864–886. 10.1037/0033-295X.114.4.864

Eyssel F. Hegel F. Horstmann G. Wagner C. (2010). “Anthropomorphic inferences from emotional nonverbal cues: A case study,” in Proceedings of the 19th International Symposium in Robot and Human Interactive Communication, Viareggio, Italy, 13-15 September 2010 (IEEE), 646–651.

Falk C. F. Biesanz J. C. (2016). Two cross-platform programs for inferences and interval estimation about indirect effects in mediational models. SAGE Open 6 (1). 10.1177/2158244015625445

Faul F. Erdfelder E. Lang A. G. Buchner A. (2007). G*Power 3: A flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behav. Res. Methods 39 (2), 175–191. 10.3758/BF03193146

Fink J. (2012). “Anthropomorphism and human likeness in the design of robots and human-robot interaction,” in Social robotics. Editors Ge S. S. Khatib O. Cabibihan J. J. Simmons R. Williams M. A. (Berlin Heidelberg: Springer), 199–208.

Garrell A. Villamizar M. Moreno-Noguer F. Sanfeliu A. (2017). Teaching robot’s proactive behavior using human assistance. Int. J. Soc. Robot. 9 (2), 231–249. 10.1007/s12369-016-0389-0

Ge S. S. Wang C. Hang C. C. (2008). “Facial expression imitation in human robot interaction,” in RO-MAN 2008 - The 17th IEEE International Symposium on Robot and Human Interactive Communication, Munich, Germany, 01-03 August 2008 (IEEE), 213–218.

Gračanin A. Krahmer E. Balsters M. Küster D. Vingerhoets A. J. J. M. (2021). How weeping influences the perception of facial expressions: The signal value of tears. J. Nonverbal Behav. 45 (1), 83–105. 10.1007/s10919-020-00347-x

Grainger S. A. Vanman E. J. Matters G. Henry J. D. (2019). The influence of tears on older and younger adults’ perceptions of sadness. Psychol. Aging 34 (5), 665–673. 10.1037/pag0000373

Hendriks M. C. P. Croon M. A. Vingerhoets A. J. J. M. (2008). Social reactions to adult crying: The help-soliciting function of tears. J. Soc. Psychol. 148 (1), 22–42. 10.3200/SOCP.148.1.22-42

Hendriks M. C. P. Vingerhoets A. J. J. M. (2006). Social messages of crying faces: Their influence on anticipated person perception, emotions and behavioural responses. Cogn. Emot. 20 (6), 878–886. 10.1080/02699930500450218

Ito K. Ong C. W. Kitada R. (2019). Emotional tears communicate sadness but not excessive emotions without other contextual knowledge. Front. Psychol. 10, 878. 10.3389/fpsyg.2019.00878

Kanoh M. (2014). Babyloid. J. Robot. Mechatron. 26 (4), 513–514. 10.20965/jrm.2014.p0513

Kim H. R. Kwon D. S. (2010). Computational model of emotion generation for human–robot interaction based on the cognitive appraisal theory. J. Intell. Robot. Syst. 60 (2), 263–283. 10.1007/s10846-010-9418-7

Kishi T. Otani T. Endo N. Kryczka P. Hashimoto K. Nakata K. (2012). “Development of expressive robotic head for bipedal humanoid robot,” in 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, Vilamoura-Algarve, Portugal, 07-12 October 2012 (IEEE), 4584–4589.

Lange J. Heerdink M. W. van Kleef G. A. (2022). Reading emotions, reading people: Emotion perception and inferences drawn from perceived emotions. Curr. Opin. Psychol. 43, 85–90. 10.1016/j.copsyc.2021.06.008

MacPherson S. E. Phillips L. H. Sala S. D. (2006). Age-related differences in the ability to perceive sad facial expressions. Aging Clin. Exp. Res. 18 (5), 418–424. 10.1007/BF03324838

Mathur M. B. Reichling D. B. (2016). Navigating a social world with robot partners: A quantitative cartography of the uncanny valley. Cognition 146, 22–32. 10.1016/j.cognition.2015.09.008

Melo C. M. d. Gratch J. Marsella S. Pelachaud C. (2023). Social functions of machine emotional expressions. Proc. IEEE 2023, 1–16. 10.1109/JPROC.2023.3261137

Miwa H. Okuchi T. Itoh K. Takanobu H. Takanishi A. (2003). “A new mental model for humanoid robots for human friendly communication introduction of learning system, mood vector and second order equations of emotion,” in 2003 IEEE International Conference on Robotics and Automation, Taipei, Taiwan, 14-19 September 2003 (IEEE), 3588–3593.

Motoda K. (2019). Poor communicator robots: How to deal with them well? Available at: https://project.nikkeibp.co.jp/mirakoto/atcl/robotics/h_vol31/ (Accessed December 1, 2022).

Pandey A. K. Gelin R. (2018). A mass-produced sociable humanoid robot: Pepper: The first machine of its kind. IEEE Robot. Autom. Mag. 25 (3), 40–48. 10.1109/MRA.2018.2833157

Phillips L. H. Allen R. (2004). Adult aging and the perceived intensity of emotions in faces and stories. Aging Clin. Exp. Res. 16 (3), 190–199. 10.1007/BF03327383

Picó A. Gadea M. (2021). When animals cry: The effect of adding tears to animal expressions on human judgment. PLOS ONE 16 (5), e0251083. 10.1371/journal.pone.0251083

Provine R. R. Krosnowski K. A. Brocato N. W. (2009). Tearing: Breakthrough in human emotional signaling. Evol. Psychol. 7 (1), 52–56. 10.1177/147470490900700107

Reeves B. Nass C. (1996). The media equation: How people treat computers, television, and new media like real people. UK: Cambridge University Press.

Riek L. D. Rabinowitch T. C. Chakrabarti B. Robinson P. (2009). “How anthropomorphism affects empathy toward robots,” in Proceedings of the 4th ACM/IEEE International Conference on Human Robot Interaction (La Jolla, California, USA: ACM), 245–246.

Rosenthal-von der Pütten A. M. Krämer N. C. Hoffmann L. Sobieraj S. Eimler S. C. (2013). An experimental study on emotional reactions towards a robot. Int. J. Soc. Robot. 5 (1), 17–34. 10.1007/s12369-012-0173-8

Scassellati B. Admoni H. Matarić M. (2012). Robots for use in autism research. Annu. Rev. Biomed. Eng. 14, 275–294. 10.1146/annurev-bioeng-071811-150036

Sinaceur M. Kopelman S. Vasiljevic D. Haag C. (2015). Weep and get more: When and why sadness expression is effective in negotiations. J. Appl. Psychol. 100, 1847–1871. 10.1037/a0038783

Stadel M. Daniels J. K. Warrens M. J. Jeronimus B. F. (2019). The gender-specific impact of emotional tears. Motiv. Emot. 43 (4), 696–704. 10.1007/s11031-019-09771-z

Suzuki Y. Galli L. Ikeda A. Itakura S. Kitazaki M. (2015). Measuring empathy for human and robot hand pain using electroencephalography. Sci. Rep. 5 (1), 15924. 10.1038/srep15924

Taheri A. Meghdari A. Alemi M. Pouretemad H. (2018). Human-robot interaction in autism treatment: A case study on three pairs of autistic children as twins, siblings, and classmates. Int. J. Soc. Robot. 10 (1), 93–113. 10.1007/s12369-017-0433-8

Takagi H. Terada K. (2021). The effect of anime character’s facial expressions and eye blinking on donation behavior. Sci. Rep. 11 (1), 9146. 10.1038/s41598-021-87827-2

Tanaka F. Cicourel A. Movellan J. R. (2007). Socialization between toddlers and robots at an early childhood education center. Proc. Natl. Acad. Sci. 104 (46), 17954–17958. 10.1073/pnas.0707769104

Terada K. Takeuchi C. (2017). Emotional expression in simple line drawings of a robot's face leads to higher offers in the ultimatum game. Front. Psychol. 8, 724. 10.3389/fpsyg.2017.00724

Tsiourti C. Weiss A. Wac K. Vincze M. (2019). Multimodal integration of emotional signals from voice, body, and context: Effects of (in)congruence on emotion recognition and attitudes towards robots. Int. J. Soc. Robot. 11 (4), 555–573. 10.1007/s12369-019-00524-z

van Kleef G. A. Côté S. (2022). The social effects of emotions. Annu. Rev. Psychol. 73 (1), 629–658. 10.1146/annurev-psych-020821-010855

Vingerhoets A. J. J. M. Boelhouwer A. J. W. Van Tilburg M. A. L. Van Heck G. L. (2001). “The situational and emotional context of adult crying,” in Adult crying: A biopsychosocial approach. Editors Vingerhoets A. J. J. M. Cornelius R. R. (NY, USA: Brunner-Routledge), 71–90.

Vingerhoets A. J. J. M. van de Ven N. van der Velden Y. (2016). The social impact of emotional tears. Motiv. Emot. 40 (3), 455–463. 10.1007/s11031-016-9543-0

Vingerhoets A. J. J. M. (2013). Why only humans weep: Unravelling the mysteries of tears. Oxford, USA: Oxford University Press.

Waytz A. Morewedge C. K. Epley N. Monteleone G. Gao J. H. Cacioppo J. T. (2010). Making sense by making sentient: Effectance motivation increases anthropomorphism. J. Pers. Soc. Psychol. 99, 410–435. 10.1037/a0020240

Weiss A. Wurhofer D. Tscheligi M. (2009). I love this dog—children’s emotional attachment to the robotic dog AIBO. Int. J. Soc. Robot. 1 (3), 243–248. 10.1007/s12369-009-0024-4

Yoon J. (2021). Tears come from my eyes - Japan’s reaction to developing a crying robot. Available at: https://news.sbs.co.kr/news/endPage.do?news_id=N1006483478 (Accessed December 1, 2022).

Zecca M. Mizoguchi Y. Endo K. Iida F. Kawabata Y. Endo N. (2009). “Whole body emotion expressions for KOBIAN humanoid robot — Preliminary experiments with different emotional patterns,” in RO-MAN 2009 - The 18th IEEE International Symposium on Robot and Human Interactive Communication, Toyama, Japan, 27 September 2009 - 02 October 2009 (IEEE), 381–386.

Zeifman D. M. Brown S. A. (2011). Age-related changes in the signal value of tears. Evol. Psychol. 9 (3), 313–324. 10.1177/147470491100900304

Zickfeld J. H. Arriaga P. Santos S. V. Schubert T. W. Seibt B. (2020). Tears of joy, aesthetic chills and heartwarming feelings: Physiological correlates of kama muta. Psychophysiol. 57 (12), e13662. 10.1111/psyp.13662

Zickfeld J. H. van de Ven N. Pich O. Schubert T. W. Berkessel J. B. Pizarro J. J. (2021). Tears evoke the intention to offer social support: A systematic investigation of the interpersonal effects of emotional crying across 41 countries. J. Exp. Soc. Psychol. 95, 104137. 10.1016/j.jesp.2021.104137

Złotowski J. Proudfoot D. Yogeeswaran K. Bartneck C. (2015). Anthropomorphism: Opportunities and challenges in human–robot interaction. Int. J. Soc. Robot. 7 (3), 347–360. 10.1007/s12369-014-0267-6