Brave Upside Down World – Does Looking Between the Legs Elongate or Shorten Perceived Distance?

The aim of this research was to provide further evidence for the influence of vestibular and proprioceptive information on distance perception, and to examine how perceived distance changes depending on whether a matching task or a verbal judgment task is used. That is, the aim was to investigate how perceived distance changes when we bend over and look between the legs. The experiment was performed in an open field in daylight (a full-cue situation), with 20 participants, high school students from the Petnica Science Center. Participants had the task of matching the distances of two stimuli, one of which was in front of them at eye level, whereas the other was behind them, where they could observe it by bending forward and looking between the legs. The stimuli were rectangular, 7 cm × 5 cm in size. Results have shown that perceived distance changes in such a way that distances observed between the legs are perceived as longer than distances observed from an upright position. This difference in perceived distance exists only for the larger physical distances (3 m and 5 m), but not for the smaller physical distance (1 m). The results coincide with our previous findings, which indicate that a change in vestibular and proprioceptive information elongates perceived distance. On the other hand, they contradict some findings obtained in experiments in which a verbal judgment task was used. These contrary results probably arise because the verbal judgment task leaves more room for higher cognitive processes to be involved.

distortions to reconstruct the distance information. The additional information that the visual system uses to determine distance is known as depth cues: perspective, shadows, occlusion, motion parallax, stereopsis, accommodation, and so on. On the other hand, there are findings which indicate that perceived distance depends not only on visual information, but also on proprioceptive or vestibular information (Suzuki, 2007; Lackner & Di Zio, 2005; Cohen & Stoper, 2001; Carter, 1977; Tošković, 2009). These findings suggest that depth cues are not the only information relevant for the perception of distance, and that perceived distance is a consequence of interactions between different sensory modalities.
One of the models which allows an influence of non-visual information on depth perception is the so-called flattened sky dome model of Rock and Kaufman (Kaufman & Rock, 1962; Ross & Plug, 2002). This model suggests that distances towards the zenith are perceived as shorter than physically equal distances towards the horizon. Rock and Kaufman tried to explain the Moon illusion using the so-called apparent distance theory, suggesting that misperception of the size of the Moon is a consequence of misperception of its distance. Namely, if we usually perceive the Moon as larger on the horizon than at the zenith, it might be because we perceive the Moon as being further away on the horizon than at the zenith. If we perceive the Moon on the horizon as further away, and it subtends the same visual angle as at the zenith, then its perceived size must be larger. It is just as if we wanted to cast a shadow of the same size on a wall, but from a larger distance: we would have to use a larger object. We can say that the visual system uses similar logic. But why should we perceive the Moon on the horizon as further away? Rock and Kaufman's idea was that towards the horizon, on the ground, we have a lot of depth cues, whereas towards the zenith we have very few (it is just an empty sky). As a consequence of this unequal distribution of depth cues, distance is perceived differently in the two directions. This inequality of perceived distances towards the horizon and the zenith is what Rock and Kaufman called the flattened sky dome model. But, as we can see, they thought that the inequality of perceived distances in different viewing directions is a consequence of the unequal distribution of depth cues, that is, of visual information only. But is that right?
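The size–distance reasoning behind the apparent distance theory can be expressed numerically: under a fixed visual angle, the size an object must have to produce that angle scales linearly with its perceived distance. A minimal sketch (the distances used are purely illustrative, not values from the literature):

```python
import math

def perceived_size(perceived_distance_m, visual_angle_deg):
    """Size an object must have to subtend the given visual angle
    at the given perceived distance (size-distance invariance)."""
    return perceived_distance_m * math.tan(math.radians(visual_angle_deg))

# The Moon subtends roughly 0.5 degrees of visual angle.
# If the horizon Moon is perceived as farther away than the zenith
# Moon, the same visual angle implies a larger perceived size.
angle = 0.5
near = perceived_size(10_000, angle)  # hypothetical zenith distance
far = perceived_size(15_000, angle)   # hypothetical horizon distance
print(round(far / near, 3))  # 1.5 -- size ratio equals distance ratio
```

Because the visual angle is the same in both cases, the ratio of perceived sizes reduces to the ratio of perceived distances, which is exactly the logic Rock and Kaufman attribute to the visual system.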
To test the flattened sky dome model, we performed several experiments in a dark room, where depth cue reduction is the same in both directions, towards the horizon and towards the zenith (Tošković, 2004, 2008). Our participants were asked to match the distances of two luminous objects in two directions, horizontal and vertical. Distances in the experiment ranged from 1 m to 5 m. First, we found that there is a difference in perceived horizontal and vertical distances; this must be a consequence of something other than visual information, since the distribution of depth cues is the same in both directions. Second, the direction of the mismatch was opposite to what was expected, i.e. our participants matched shorter vertical distances (towards the zenith) with longer horizontal distances (towards the horizon). This means that they perceived the vertical distances as longer. For example, if we perceive 3 m as equal to 5 m, it means that we perceive the 3 m as longer than it is. What these results suggest is that perceived distance elongates towards the zenith, and that this is not only a consequence of the distribution of visual information. So, what might be the cause of this change in perceived distance?
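The logic of interpreting matches can be captured as a simple ratio: if a standard distance is matched by a longer comparison distance, the standard is perceived as elongated by roughly that ratio. A small sketch with the example numbers from the text:

```python
def elongation_factor(standard_m, match_m):
    """If an observer matches a standard distance with a longer
    comparison distance, the standard is perceived as elongated
    by roughly the ratio of the two (a deliberate simplification
    for illustration)."""
    return match_m / standard_m

# A 3 m vertical standard matched with a 5 m horizontal comparison
# suggests the vertical distance is perceived as longer.
print(round(elongation_factor(3.0, 5.0), 2))  # 1.67
```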
If we look at what happens when we change our viewing direction from the horizon to the zenith, it appears that we change proprioception from the eye muscles (eye position shift) and the neck muscles (head position), as well as vestibular information (head and body position). When we changed each of these sources of information separately, that is, in experiments in which two kinds of information were controlled and the third was varied, it appeared that perceived distance changes when we change head and body position, but not when we change eye position (Tošković, 2008). These results suggest that proprioceptive information from the neck muscles and vestibular information affect perceived distance. For instance, if the viewing direction is parallel to the ground (normal to the direction of gravity), perceived distance is shorter than when the viewing direction is normal to the ground (parallel to the direction of gravity). Furthermore, we can assume that whenever vestibular and proprioceptive inputs differ from the usual ones (the upright standing position), perceived distance elongates.
There are other findings indicating similar changes in perceived distance. Morinaga asked his participants to match horizontal distances with static vertical standards, and reported that the horizontal matches were 4% to 14% longer (Morinaga, 1935, as cited in Higashiyama & Ueyama, 1988). Osaka confirmed that horizontal matches are longer, but found an elongation ranging from 12% to 21% (Osaka, 1947, as cited in Higashiyama & Ueyama, 1988). So, since participants matched longer horizontal distances with shorter vertical distances, it means that they perceived the horizontal distances as shorter.
On the other hand, Galanter asked his participants to give verbal estimates of the distance of an airplane while it was seen above sea level, on the horizon, while it was seen directly above them, towards the zenith, or while it was seen at around 45 degrees of elevation (Galanter & Galanter, 1973). His data show that distances towards the horizon are overestimated, distances towards the zenith are underestimated, and at 45 degrees of elevation the estimated distances correspond to the physical ones (participants give relatively precise estimates). These data indicate that perceived horizontal distances are longer than perceived vertical ones, so his findings contradict the previously mentioned ones (Suzuki, 2007; Morinaga, 1935, and Osaka, 1947, as cited in Higashiyama & Ueyama, 1988; Tošković, 2004, 2008). What might be the reason for these contradictory results? The only difference between the two groups of findings was the task. In our experiments, as well as in Morinaga's and Osaka's, a matching task was used, whereas in Galanter's experiment a verbal judgment task was used. We may ask why this change of task changes the results so dramatically.
According to some studies (e.g., Ganong, 2003; Klatzky, Lederman & Reed, 1987; Zangaladze, Epstein, Grafton, & Sathian, 1999), tactile and proprioceptive information interact with visual information in the posterior parietal cortex. Tactile information, produced at the Pacinian corpuscles and other receptors in the skin, is projected to the primary somatosensory area in the postcentral gyrus, and proprioceptive information is projected to the cerebellum and also to the postcentral gyrus. The information from the postcentral gyrus is then sent to the posterior parietal cortex, where it is combined with visual information arriving through the dorsal pathway of visual processing. It is thus suggested that the posterior parietal cortex is an important site for haptic-visual interactions, and it can be considered the site where visual and certain non-visual information interact during perception and action processes.
If a change in vestibular and proprioceptive information changes perceived distance, whether it elongates or shortens it, we might ask what would happen if we changed body position in such a way that both of these types of information change. Such a position might be bending forward and looking between our legs. Helmholtz thought that looking between the legs shortens the perceived distance between two objects (Higashiyama & Adachi, 2006). He also thought that this change of perceived distance is due to retinal image inversion, and not to proprioceptive or vestibular information, because when we bend over and look between the legs, the image on the retina inverts. Using a verbal judgment task, Higashiyama and Adachi tried to verify whether perceived distance changes while looking between the legs, and whether the change depends on retinal image inversion or on proprioceptive and vestibular input (Higashiyama & Adachi, 2006). Their participants verbally estimated the distances of various objects while standing upright or looking between the legs, with or without glasses which invert the retinal image. So, there were four conditions: standing upright with an upright retinal image, standing upright with an inverted retinal image, looking between the legs with an upright retinal image, and looking between the legs with an inverted retinal image. Their results showed that a change in retinal image orientation alone does not change perceived distance, but a change in body position does. They also showed that distances viewed between the legs are perceived as shorter than distances viewed from an upright position. So, unlike Helmholtz, they showed that proprioceptive and vestibular information are more important than retinal image orientation, but like Helmholtz they found that perceived distance shortens while looking between the legs. These findings correspond to Galanter's results, in that a change of vestibular and proprioceptive input shortens perceived distance. But, like Galanter, they used a verbal judgment task, and we might ask whether the results would be the same with a matching task, since earlier matching-task experiments produced results that differ from Galanter's. Based on previous results (Tošković, 2008), we predict that using the matching task while looking between the legs will change the results in such a way that distances will be perceived as longer than distances viewed from an upright position. Namely, if in all previous experiments in which we used the matching task a change in vestibular and proprioceptive information elongated perceived distance, there is no reason to believe that the opposite should happen in the looking-between-the-legs situation.

EXPERIMENT
The aim of this experiment was to provide further evidence for the influence of non-visual information (vestibular, proprioceptive, or both) on distance perception, and to show differences between the two tasks commonly used in this kind of research, verbal judgment and matching. Namely, the aim was to investigate whether perceived distance changes when looking from an upside-down position, between the legs, and what the direction of that change is. Previous results have shown that vestibular and some proprioceptive information do affect perceived distance, in such a way that changes in that information, relative to the usual upright posture, elongate perceived distance. On the other hand, Higashiyama and Adachi showed that looking between the legs shortens perceived distance (Higashiyama & Adachi, 2006). Since Higashiyama and Adachi used a verbal judgment task, and in our previous studies we used a matching task, the idea was to investigate whether we would obtain different results using the matching task in the looking-between-the-legs situation. So, the primary aim of this research was not to differentiate between the effects of vestibular and proprioceptive information on perceived distance, but to indicate some possible differences between the two tasks, verbal judgment and matching.

Method
Participants: There were 20 participants in the experiment, high school students from the Petnica Science Center, aged 16 to 19, of both genders. All participants had normal or corrected-to-normal vision.
Stimuli: As stimuli we used two rectangular objects, 7 cm × 5 cm in size.

Procedure:
The experiment was performed in an open field in daylight, which can be considered a full-cue situation. Participants had the task of matching the distances of two stimuli positioned in two viewing directions. One of the directions was at the participants' eye level, towards the horizon, and they could observe that stimulus from an upright position. The other direction was behind the participant, parallel to the first, and they could observe that stimulus by bending forward and looking between their legs. One of the stimuli was set at a standard distance, and the participant's task was to guide the experimenter (by giving the instructions "further" or "closer") to move the other stimulus until the two stimuli appeared to be at the same distance from the observer. Both stimuli served as the standard, in the sense that participants were instructed to look at one stimulus from an upright position and set the other stimulus viewed between the legs, and then the other way round, to look at one stimulus between the legs and set the stimulus viewed from an upright position. As standard distances we used 1 m, 3 m and 5 m. During the experiment participants wore glasses with a 1 mm wide horizontal aperture in order to prevent changes in eye position.
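The logic of this "further"/"closer" adjustment procedure can be sketched as a small simulation. Everything here is an illustrative assumption (the perceptual gain, the starting distance, and the step size are invented, not taken from the study); the point is only to show how a distorted percept of the standard would shift where the observer stops the comparison:

```python
def run_matching_trial(standard_cm, gain=1.2, step_cm=5):
    """Simulate one matching trial. The standard is viewed between
    the legs and perceived with a hypothetical elongation gain;
    the observer says "further" until the upright comparison
    appears to match the (elongated) percept of the standard."""
    perceived_cm = int(standard_cm * gain)
    comparison_cm = 50  # the experimenter starts the comparison close by
    while comparison_cm < perceived_cm:
        comparison_cm += step_cm  # "further"
    return comparison_cm

# Under a hypothetical 1.2x elongation, a 3 m between-legs standard
# would be matched by an upright comparison near 360 cm.
print(run_matching_trial(300))
```

A real trial would of course depend on the observer's percept rather than a fixed gain; the simulation only makes explicit why longer upright matches imply an elongated between-legs percept.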

RESULTS
We used a two-factor repeated-measures ANOVA to analyze the data. The factors were body position (the position from which subjects matched the test stimuli), with two levels (upright and upside-down), and standard distance, with three levels (1 m, 3 m and 5 m).
The analysis showed a significant main effect of position (F(1,19) = 6.88; p < 0.05) and of distance (F(2,38) = 864.71; p < 0.01), but no significant interaction (p > 0.05). The effect of distance was expected: for further physical distances of the standard, participants give further matches. The effect of position means that matches from the upright and upside-down positions differ, that is, perceived distance is not the same when viewed from an upright and from an upside-down position. Interestingly, participants matched longer upright distances with shorter upside-down distances, meaning that upside-down distances were perceived as longer (figure 2). The absence of a significant interaction suggests that this difference exists at all three distances used in the experiment.
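For a within-subject factor with only two levels, such as body position here, the repeated-measures F-test is equivalent to a paired t-test, with F = t² on (1, n−1) degrees of freedom. A self-contained sketch of that computation (the matches below are made-up numbers for six hypothetical observers, not the study's data):

```python
import numpy as np

def rm_f_two_levels(cond_a, cond_b):
    """F-test for a two-level within-subject factor. With two
    levels this reduces to a paired t-test: F = t^2, df = (1, n-1)."""
    d = np.asarray(cond_a) - np.asarray(cond_b)
    n = d.size
    t = d.mean() / (d.std(ddof=1) / np.sqrt(n))
    return t**2, (1, n - 1)

# Hypothetical matches (metres) for a 3 m standard from 6 observers.
upright = [3.1, 3.0, 3.2, 2.9, 3.1, 3.0]
upside_down = [3.5, 3.4, 3.6, 3.3, 3.6, 3.4]
F, df = rm_f_two_levels(upside_down, upright)
print(df)  # (1, 5)
```

The study's actual design had 20 participants and a second three-level factor (distance), so the reported F(1,19) comes from the full two-factor model rather than this two-condition reduction.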
We used LSD post-hoc tests to analyze the differences between the two positions at each of the three standard distances. The results showed significant differences between the upright and upside-down positions for the 3 m and 5 m distances, but not for the 1 m distance. So, we can conclude that, although the interaction effect was not significant, there is some interaction between position and distance, because the two positions differ only at the larger distances, and not at the smaller one.

DISCUSSION
The aim of this research was to provide further evidence for the influence of vestibular and proprioceptive information on distance perception, that is, to show that a change in one or both of these kinds of information changes perceived distance. Also, the aim was to examine how perceived distance changes depending on whether a matching task or a verbal judgment task is used.
Our results have shown that a change in body position, from upright to upside-down, affects perceived distance in such a way that perceived distance is longer from the upside-down position. That is, if we ask observers to match distances viewed from the upside-down position, looking between the legs, with distances viewed from an upright position, they match shorter distances viewed between the legs with longer upright distances. But this change in perceived distance was observed only for the larger physical distances (3 m and 5 m), and not for the closer physical distance (1 m). The reason why the difference in perceived distance appears only for the larger physical distances is probably that, in near space, a large amount of relatively precise depth information is available to the observer. In more remote space, the number and precision of depth cues diminish, and this is probably why the visual system begins to rely on non-visual information, such as vestibular or proprioceptive information, or both. These results agree with our previous results, which showed that a change in vestibular and proprioceptive information elongates perceived distance (Tošković, 2004, 2008, 2009). Namely, if vestibular or proprioceptive information indicates an unusual body or head position, or a position that differs from the regular upright one, perceived distance will be longer than from an upright position. They also agree with the results of other studies which found that perceived distances are longer when viewed from positions other than upright (Morinaga, 1935, and Osaka, 1947, as cited in Higashiyama & Ueyama, 1988; Suzuki, 2007). So, can we say that they support the hypothesis about the influence of vestibular and proprioceptive information on distance perception? Only partially. These results do support the idea that perceived distance also depends on vestibular or proprioceptive input, or on both, but we cannot say that the change in perceived distance in this experiment is due uniquely to vestibular and proprioceptive information. Namely, besides vestibular and proprioceptive information, retinal image orientation also changed between the two body positions. On the other hand, since Higashiyama and Adachi (2006) previously showed that retinal image orientation does not affect perceived distance, we can say that it is no longer a factor of interest. So, since according to previous findings retinal image orientation does not affect perceived distance, we can attribute the change in perceived distance observed in this experiment to a change in vestibular or proprioceptive information, or in both.
If we summarize the results obtained in this experiment, we can represent them as in figure 3.
Our results, on the other hand, do not agree with the results of Higashiyama and Adachi, in the sense that they showed compression of perceived distance while looking between the legs, whereas we showed elongation (Higashiyama & Adachi, 2006). Higashiyama and Adachi's participants reported shorter distances while looking between the legs. Our participants matched shorter distances viewed between the legs with longer distances viewed from an upright position, meaning that they perceived the distances viewed between the legs as longer. What might be the cause of these contradictory findings in two similar experiments? As we mentioned previously, Higashiyama and Adachi used a verbal judgment task, and we used a matching task. Let us try to analyze what happens during these two tasks. While an observer is giving verbal judgments, he first uses perceptual processes to determine the distance, and then, using some higher cognitive processes, he transforms the percept into a verbal report (Gogel & Da Silva, 1987; Kaufman & Rock, 1962; Ross & Plug, 2002). During this transformation, a lot can happen. Previous experience or knowledge can interfere and change the information. For instance, if someone sees an object at a certain distance and perceives this distance as longer than it is, while the observed object subtends the same visual angle, he might perceive the object as bigger than it is, as in the Moon illusion (Kaufman & Rock, 1962; Ross & Plug, 2002). Furthermore, if we ask this person how far away the object is, he can reason, based on his previous experience, that bigger objects are usually closer, and then report that he perceives the object as being closer than it is, although he actually perceives it as further away. Gogel and Da Silva think of this phenomenon in terms of two different kinds of processes, primary and secondary (Gogel & Da Silva, 1987). Primary processes would be purely perceptual and based only on sensory information, while secondary processes are affected by higher cognitive processes and are more influenced by experience. So, we can say that the difference between our findings and those of Higashiyama and Adachi is a consequence of the different processes involved during the experiment, provoked by the different experimental tasks. Higashiyama and Adachi used a verbal judgment task, which leaves more room for higher cognitive processes and previous experience to be involved, and their results can be attributed more to secondary processes than to purely perceptual ones. On the other hand, using the matching task, we obtained results in which the mismatch between physical and perceived distance can be interpreted in terms of purely perceptual processes, since the matching task relies on direct perception and leaves less room for higher cognitive processes to be involved. Although in our matching task participants guided the experimenter to move the stimuli, and during this process used some kind of verbalization ("move it further" or "bring it closer"), this kind of verbalization is not directly related to perceived distance. Their verbalization is not part of the process of perceiving distance. It happens at the same time, but it concerns guiding the experimenter, not the participant's perception. So, it is not verbalization that is the problem in the verbal judgment task, but the process of judging distance, which includes some kind of computation (Gogel & Da Silva, 1987) that does not exist in the matching task.
To conclude, we showed that perceived distance elongates when body or head position changes. This change in perceived distance can be attributed to a change in vestibular or proprioceptive information, or in both. We also believe that the matching task is probably a better way to investigate pure perceptual processes than the verbal judgment task, since it leaves less room for higher cognitive processes, previous experience and knowledge to be involved.
Figure 1: a) stimulus and glasses; b) experimental setting

Figure 3: The difference in perceived distances viewed from an upright position and between the legs