Domestication has significantly improved the dog’s capacity to cope with stress and social uncertainty via the evolution of antistress and antiaggression capacities, enhanced attention and impulse-control abilities, exchange-mediated autonomic attunement, and the integration of a sophisticated SES consolidating these various changes (). As a result, the dog’s ability to explore and rapidly establish social relations under a positive expectancy of reward is generally ascendant over negative expectancies and the social aversion associated with dispersion and entrapment dynamics. Dogs appear to respond to the presence of a person as an intrinsically rewarding object, with social contact possessing both incentive significance and hedonic value. For many dogs, petting is not only calmative but also restorative in nature (see Affection and Friendship). The mere presence of a person nearby activates antistress capacities that enhance a dog’s ability to cope with pain and stress. In addition to generally enjoying human social contact, dogs have evolved a proactive sociability that enables them to smooth over social tensions with conciliatory exchanges before they escalate into conflict. In short, dogs are developmentally organized to attune and commune with people. Along with these various changes affecting canine sociability and emotional adaptability, dogs appear to have acquired complementary sensory and cognitive capabilities that enable them to socially engage and communicate with people and to follow human instruction ().
Some authors have emphasized that the dog’s enhanced ability to initiate communicative interaction with people is due to an enhanced capacity for social gazing (), perhaps augmenting the dog’s abilities to decipher the significance of human social signals (). McGreevy and colleagues () report that brachycephalic breeds tend to concentrate receptor ganglion cells around the central area, in contrast to dogs with elongated muzzles, which tend to express a visual streak (see Orienting, Preattentive Sensory Processing, and Visual Acuity). Consistent with the aforementioned social-gaze hypothesis, these authors speculate that a genetic trend toward a frontal placement of the eyes and shortening of the muzzle might have developed as the result of selection pressures favoring visual capacities that enabled dogs to focus on the human face.
Relevantly, Viranyi and colleagues () have observed that canine begging behavior is preferentially directed toward an attentive person rather than a person looking away from the dog. The authors suggest that such preferences might reflect an appreciation of human attentional cues insofar as they help to improve the success of instrumental food-sharing projects. The authors also found that a dog’s ability to perform a basic obedience exercise (“Down”) in response to a recorded command varied depending on whether the owner was out of sight, faced the dog, turned away, or faced another person while giving the command. The best performance was obtained when the owner gave commands while facing the dog, followed by commands given as the owner turned his or her head away from both the dog and person. The dog showed an equal disruption of performance when the owner was out of sight as when facing a nearby person. The authors interpret these findings as evidence of special attention-dependent capabilities. However, since most dogs can be trained to lie down rapidly and consistently in each of the previously mentioned stimulus and contextual conditions, and given the limited controls used in the experiment, it would seem extremely difficult to sort out what is attributable to the effects of owner-training skills versus the effects of special cognitive abilities expressed by dogs as a group. Although some acquired skills appear to depend on the help of directional cues for a dog to perform well, others do not. Warden and Warner () explored many of these problems in the case of the dog named Fellow, finding that tasks such as sitting and lying down on command were not appreciably affected by changes in attentional focus or directional cueing, whereas routines that required the dog to move toward places or to select objects were much more dependent on attentional and directional cueing (see Nora, Roger, and Fellow: Extraordinary Dogs).
Several authors have hypothesized that dogs have acquired, as the result of domestication, unique capacities for interpreting and responding to human directional cues. The dog’s ability to translate directional information derived from gross and subtle pointing and indicating movements is well developed (), seeming to surpass the abilities of chimpanzees and wolves (). Although dogs are undoubtedly responsive to human deictic (pointing) signals, nonverbal directive signals, and social gaze, capabilities that trainers have fostered for centuries, it is not clear that this capacity is the result of special cognitive adaptations. To take an extreme example, unstable pointer dogs would likely show significantly less responsiveness to directional gaze and pointing cues than would stable counterparts, not because pointers lack such ability but because preemptive reactions toward humans prevent them from showing that they have it. The ability of such dogs to use directional signals in appropriate ways only becomes fully evident when training them to hunt, as demonstrated by McBryde and Murphree (). In addition to training, the participation of an eager and playful pointer appeared to prime and attune an unstable pointer with arousal and direction that helped to break the spell of cataplexy. Once in their umwelt, the unstable pointers rapidly learned to show and respond to pointing signals:
The performances of both nervous and normal dogs were quite comparable on an overall basis. The nervous dogs scored about as well throughout and just as well as the normal subjects on their last two trials which were intended to evaluate each dog’s final abilities after rehabilitation. On an individual basis some of the nervous dogs did better than the normal controls. ()
Despite the significant changes achieved away from the laboratory, the dogs’ confident and human-friendly behavior did not generalize back to the laboratory, where they rapidly reverted to the same unstable and nervous behavior shown before field training. Apparently, in the absence of natural stimuli promoting drive arousal conducive to hunting activity (prey-seeking action modes and modal strategies), these dogs get stuck. The disorder does not appear to be primarily caused by fear or an aversion to novelty, since nervous dogs rapidly lost most of their timidity and could tolerate close human contact and the blast of a shotgun while hunting. Instead, these dogs appear to be affected by an overspecialization of function genetically encoded around hunting. Perhaps more fundamental, though, is the presence of a genetic defect affecting parasympathetic braking and accelerator functions. The ability to attenuate and accelerate arousal competently while remaining in a parasympathetic mode of activation may be an important aspect of domestication and herald the emergence of the canine SES (). Thus, individual differences affecting the dog’s arousability and sociability (approach and withdrawal thresholds), motivational interest (incentive and hedonic value) in the reward object, susceptibility to conflict and distress during testing (anxiety and frustration thresholds), age, and relative social dependency () would likely introduce significant variability into any cognitive test relying on social and motivational variables not equally distributed among experimental subjects. These various influences represent additive confounds that have long been recognized as obstacles to the scientific investigation of animal cognition and continue to plague it with ambiguity.
To take an experimental example of the sort of risks involved in cognitive theorizing, Triana and Pasnak () tested 32 cats and 23 dogs in eight standardized object-permanence tasks using a soft toy as the target object. Although dogs and cats completed some of the tasks, they consistently failed (with the exception of one dog) to solve the invisible displacement tasks. In a second experiment, two additional naive dogs and three cats were tested, but this time the researchers used savory treats and chunks of hamburger as rewards. Under the influence of enhanced motivation, the two dogs and three cats completed all eight of the tasks in a “logical manner.” Now, if one took the results of the first test as a true estimate of canine and feline cognitive abilities, the interpretation would be consistent with the results of the experiment but wrong with respect to the dog’s actual object-permanence abilities. Further, the second experiment might be erroneously interpreted as evidence of extraordinary cognitive skills, but neither experiment actually says much about cognition per se; instead, both underscore the reality that cognition and motivation are not easily dissociable, especially when one variable is manipulated to test the other. Consequently, object-permanence tests employing such things as rubber toys may not measure the extent of cognitive capabilities so much as they measure a dog’s motivational interest in getting the concealed object and its willingness to invest the attentional resources and energy needed to encode a working memory of it.
Pointing at a container concealing a treat is not a neutral deictic signal but may also carry the added significance of a command; that is, the directional cue may signify a demand “Go there” — not merely indicating where the food is (i.e., a “There it is” signal) but carrying the added implication of a dominance imperative. Further, standing behind and pointing directly over an object may not necessarily be interpreted by the dog as a “Here it is” signal or a “Go there” signal but rather may project a “This is mine” significance. Accordingly, standing over and pointing at an object while repeatedly glancing from it to a dog should cause many dogs to withdraw from the object. In fact, many dogs can be caused to avoid forbidden objects merely by alternating glances toward the dog and back again while intently staring and pointing at the object. The effect can be very strong, appears to accumulate over repeated trials, and may be augmented with auditory orienting signals. With regard to such dogs, learning to approach and take objects that are pointed at from above may be contraprepared. Dogs rarely, if ever, relinquish food to other dogs by dropping it and then glancing at the other dog and staring at the object to indicate that the other dog should take it. Such social signals, when they do occur, more likely carry an opposite significance; that is, they represent a dare or challenge. Typically, when dogs give up objects, they indicate this intent by moving away from them. They are not particularly well adapted to engage actively in showing behavior with conspecifics when it comes to highly valued objects. The “Go there” imperative should also be subject to the influence of individual differences. In all of these cases, extraverts (with low-approach/high-withdrawal thresholds) should outperform introverts (with high-approach/low-withdrawal thresholds).
The finding by Hare and colleagues () that puppies perform the object-choice task fairly well from the start and that the “skill” does not appear to be much affected by rearing or social exposure to people seems inconsistent with the findings of other authors (). The lack of effect resulting from rearing and social experience is especially puzzling given the findings reported by Topal and colleagues (), who found a strong correlation between the number of glances toward the owner, social dependency, and reduced problem-solving efficiency. Of course, one way to explain Hare and colleagues’ findings is the possibility that the learning needed to decipher the significance of directional cues is epigenetically articulated into puppy behavior at an early age. The notion that complex social skills might emerge in the context of early ontogeny should not come as any surprise, nor should its significance be downplayed.
The social-cognition hypothesis faces other more formidable problems when the results are judged in the light of prior experimental work performed in Konorski’s laboratory. In a series of delayed-response experiments performed by Lawicka (), dogs were taught a 3-choice response that depended on the directional information provided by a 3-second orienting signal, a buzzer emanating from one of three locations. After a variable-duration delay, with or without intervening distractions (e.g., feeding the dog or taking the dog from the room), the dogs were released to choose. Dogs readily learned to make the correct location-choice responses, despite distractions, long delays, and even after falling asleep. In one dog, delays of over 18 minutes proved of little difficulty, with the dog making no errors in 7 trials (). Thus, the dogs did not depend on body orientation, but appeared to rely on an oculocentric map to orient toward the signaled location.
Lawicka’s findings suggest that the distance between the boxes from which the buzzer emanated appeared to enhance the integration of predictive information into the canine localizing map. When the boxes were widely spaced apart (e.g., over 12 feet), the dogs performed the location-choice task after long delays with few errors, whereas when the boxes were placed close together the dogs’ delay abilities were “drastically reduced.” These observations are extremely interesting, since they appear to suggest that delayed-response capabilities are partially dependent on the spatial distribution of reference points scaled to coordinate action to locate stationary objects concealed at some distance away, perhaps revealing significant features of the canine umwelt. One might expect that moving objects, including those in slow motion, would not yield lasting memory traces of a location but might instead yield predictions concerning the future location of the object based on its trajectory, speed, and prominent terrain markers. Accordingly, allowing a dog to briefly (2 seconds) observe another dog walk by in front of the house before closing the door and taking it to another room for a brief delay reveals that the dog immediately angles off in the direction last observed, even though the other dog was, in fact, immediately turned about and walked in the opposite direction after the door was shut.
Lawicka and Konorski () observed that prefrontal dogs treated directional cues much as a pointer does, orienting and freezing its focus and posture in the direction of the cued location, thereby depending on proprioceptive and vestibular signals to hold on point. They discounted the value of such responses with respect to cognitive function, however, referring to them as “pseudo-delayed,” since the arrangement could not exclude the confounding possibility that the dog was relying on proprioceptive and vestibular signals when orienting and making location-choice responses. In addition, Konorski and Lawicka () found that dogs suffering prefrontal lesions were still able to follow directional signals correctly, so long as the signals were closely tied to the object’s location and the dog was released during or shortly after the directional cue was discontinued. Now, if dogs with massive prefrontal lesions can “solve” such problems by remaining physically oriented on the location during the delay period, it is difficult to assume that the performance is, strictly speaking, cognitive in nature. These findings are bad news for the social-cognition hypothesis: if a dog without a functional prefrontal cortex can perform the requisite orienting and approach response, then the hypothesis is undermined; that is, the action might not depend on cognitive ability at all.
In order to overcome these confounding influences, delayed-response procedures can be designed with built-in interference effects that filter out positional information; e.g., the dog is turned around, distracted with food or petting, and even momentarily taken away from the starting point before being released again to choose. Interesting work performed by Nippack and colleagues () appears to have avoided many of these obvious experimental pitfalls while exploring the effect of delay on latency and response accuracy.