Magic in Human-Robot Interaction (HRI)

Cooney, Martin, Vinel, Alexey

arXiv.org Artificial Intelligence

"Magic" is referred to here and there in the robotics literature, from "magical moments" afforded by a mobile bubble machine, to "spells" intended to entertain and motivate children--but what exactly could this concept mean for designers? Here, we present (1) some theoretical discussion on how magic could inform interaction designs based on reviewing the literature, followed by (2) a practical description of using such ideas to develop a simplified prototype, which received an award in an international robot magic competition. Although this topic can be considered unusual and some negative connotations exist (e.g., unrealistic thinking can be referred to as magical), our results seem to suggest that magic, in the experiential, supernatural, and illusory senses of the term, could be useful to consider in various robot design contexts, also for artifacts like home assistants and autonomous vehicles--thus, inviting further discussion and exploration.


Did artificial intelligence shape the 2024 US election?

Al Jazeera

Days after New Hampshire voters received a robocall with an artificially generated voice that resembled President Joe Biden's, the Federal Communications Commission banned the use of AI-generated voices in robocalls. The 2024 United States election would be the first to unfold amid wide public access to AI generators, which let people create images, audio and video – some for nefarious purposes. Institutions rushed to limit AI-enabled misdeeds. Sixteen states enacted legislation around AI's use in elections and campaigns; many of these states required disclaimers in synthetic media published close to an election. The Election Assistance Commission, a federal agency supporting election administrators, published an "AI toolkit" with tips election officials could use to communicate about elections in an age of fabricated information.


The Morning After: A $6 million fine for robocalls from fake Biden

Engadget

The Federal Communications Commission (FCC) has officially issued its full recommended fine against political consultant Steve Kramer. The fine follows a series of robocalls to New Hampshire residents that used deepfake AI technology to deliver pre-recorded audio of President Biden's voice. The fake Biden told voters not to vote in the upcoming primary, saying, "Your vote makes a difference in November, not this Tuesday." Kramer must pay $6 million in fines in the next 30 days or the Department of Justice will handle collection, according to an FCC statement. Kramer doesn't just face a fine; he also faces criminal charges.


FCC fines political consultant $6 million for deepfake robocalls

Engadget

The Federal Communications Commission (FCC) has officially issued its full recommended fine against political consultant Steve Kramer for a series of illegal robocalls using deepfake AI technology and caller ID spoofing during the New Hampshire primaries. Kramer must pay $6 million in fines in the next 30 days or the Department of Justice will handle collection, according to an FCC statement. Kramer violated the Truth in Caller ID Act, passed in 2009, which prohibits anyone from "knowingly transmit[ting] misleading or inaccurate caller identification information with the intent to defraud, cause harm or wrongfully obtain anything of value," according to legislative records. The law preceded the widespread use of AI, but the FCC voted unanimously this past February to apply it to such deepfakes. The phony robocalls delivered pre-recorded audio of President Biden's voice, generated with deepfake AI technology, to New Hampshire residents leading up to the 2024 presidential primary election.



Hiding in Plain Sight: Towards the Science of Linguistic Steganography

Raj-Sankar, Leela, Rajagopalan, S. Raj

arXiv.org Artificial Intelligence

Covert communication (also known as steganography) is the practice of concealing a secret inside an innocuous-looking public object (cover) so that the modified public object (covert code) makes sense to everyone, but only someone who knows the code can extract the secret (message). Linguistic steganography is the practice of encoding a secret message in natural-language text, such as spoken conversation or short public communications like tweets. While ad hoc methods for covert communication exist in specific domains (JPEG images, Chinese poetry, etc.), there is no general model for linguistic steganography specifically. We present a novel mathematical formalism for creating linguistic steganographic codes, with three parameters: decodability (the probability that the receiver of the coded message will decode the cover correctly), density (the frequency of code words in a cover code), and detectability (the probability that an attacker can tell the difference between an untampered cover and its steganized version). Verbal or linguistic steganography is the most challenging form because of its lack of artifacts in which to hide the secret message. We detail a practical construction in Python of a steganographic code for tweets that uses inserted words to encode hidden digits, with n-gram frequency distortion as the measure of detectability of the insertions. Using the publicly accessible Stanford Sentiment Analysis dataset, we implemented the tweet steganization scheme: a codeword (an existing word in the dataset) is inserted at random positions in random existing tweets to find the tweet with the least possible n-gram distortion. We argue that this approximates KL distance in a localized manner at low cost, giving a linguistic steganography scheme that is both formal and practical and that permits a tradeoff between codeword density and detectability of the covert message.
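The insert-and-minimize-distortion idea the abstract describes can be sketched in a few lines. This is an illustrative toy, not the paper's actual construction: the corpus, the bigram-based distortion score, and the function names are all assumptions made here for demonstration.

```python
# Toy sketch of insertion-based linguistic steganography: place a codeword
# into a cover tweet at the position that least distorts bigram frequencies
# estimated from a (tiny, illustrative) corpus.
from collections import Counter

CORPUS = "the cat sat on the mat and the cat saw the dog on the mat".split()

# Bigram frequency table built from the corpus.
FREQ = Counter(zip(CORPUS, CORPUS[1:]))

def distortion(words):
    # Count bigrams never seen in the corpus; lower = more natural-looking.
    return sum(1 for bg in zip(words, words[1:]) if FREQ[bg] == 0)

def steganize(cover, codeword):
    """Insert codeword at the position with minimal bigram distortion."""
    candidates = [
        (distortion(cover[:i] + [codeword] + cover[i:]), i)
        for i in range(len(cover) + 1)
    ]
    _, pos = min(candidates)
    return cover[:pos] + [codeword] + cover[pos:]

stego = steganize("the cat sat on the mat".split(), "dog")
```

The position chosen (and which tweet in a corpus to use as cover) carries the hidden information; the real scheme additionally trades off codeword density against detectability.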


Intransitively winning chess players positions

Poddiakov, Alexander

arXiv.org Artificial Intelligence

Positions of chess players in intransitive (rock-paper-scissors) relations are considered. Namely, position A of White is preferable (it should be chosen if choice is possible) to position B of Black, position B of Black is preferable to position C of White, position C of White is preferable to position D of Black, but position D of Black is preferable to position A of White. Intransitivity of winningness of positions of chess players is considered to be a consequence of complexity of the chess environment -- in contrast with simpler games with transitive positions only. The space of relations between winningness of positions of chess players is non-Euclidean. The Zermelo-von Neumann theorem is complemented by statements about possibility vs. impossibility of building pure winning strategies based on the assumption of transitivity of positions of chess players. Questions about the possibility of intransitive positions of players in other positional games are raised.
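The intransitive cycle the abstract describes (A preferable to B, B to C, C to D, yet D preferable to A) can be illustrated with a short check that no linear "strength" ranking of the four positions is consistent with such results. The relation and helper below are illustrative assumptions, not from the paper.

```python
# Rock-paper-scissors-style intransitivity among four hypothetical positions:
# A beats B, B beats C, C beats D, but D beats A.
from itertools import permutations

BEATS = {("A", "B"), ("B", "C"), ("C", "D"), ("D", "A")}

def ranking_consistent(order):
    # A linear order is consistent if every winner ranks above its loser.
    rank = {p: i for i, p in enumerate(order)}
    return all(rank[w] < rank[l] for w, l in BEATS)

# No permutation of the four positions respects all four results,
# so "winningness" here cannot be a transitive, one-dimensional quantity.
no_ranking_exists = not any(ranking_consistent(o) for o in permutations("ABCD"))
```

This is the sense in which a pure strategy built on an assumed transitive ordering of positions can fail, as the abstract's complement to the Zermelo-von Neumann theorem suggests.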


Why detective video games are the perfect way to experience a mystery

The Guardian

In one of her best books, The Murder of Roger Ackroyd, Agatha Christie puts these words into the mouth of her least favourite character, Hercule Poirot: "Understand this, I mean to arrive at the truth. The truth, however ugly in itself, is always curious and beautiful to seekers after it." All detective stories are an attempt to reflect this. Uncovering the truth through clever reasoning, observation and logic is wondrous. You are forced to look at the world anew: a misplaced chair is no longer just a chair, but indicative of a killer's escape; a removed lightbulb tells us the killer did not want to be seen.


Beyond AI: The future of Intelligence is collective (human and machine)

#artificialintelligence

What would you describe as intelligence; memorising notes for exams only to forget about them in the summer, or making funny jokes on the fly on seemingly any topic? How about a magician who performs strictly on a script or a magician who changes the routine ever so slightly depending on the audience? This isn't a case for who is 'more intelligent', but rather, when something is an act of intelligence. Intelligence is a widely used term ("So and so is highly intelligent", "emotional intelligence", "artificial intelligence" etc.) across many aspects of life. It's often associated with the ability to learn and process information quickly, broadly or deeply.


An unexpected audience

Science

In the past decade, the study of magic effects has started to gain attention from the scientific community, particularly psychologists. This interest stems from what magic effects might reveal about the blind spots in our perception and roadblocks in our thinking. The study of magic effects may offer researchers opportunities for new lines of inquiry about perception and attention. Moreover, because magic effects capitalize on our ability to remember what happened and our ability to anticipate what will happen next, using magical frameworks elicits ways to investigate complex cognitive abilities such as mental time travel (i.e., remembering the past and anticipating the future). Moving beyond the intersection between magic and the human mind, the application of magic effects to investigate the animal mind can prompt the comparison of behavioral reactions among diverse species, in which magic effects might exploit similar perceptive blind spots and cognitive roadblocks. The internet is filled with videos of magicians performing magic effects for animals (mostly captive primates and domesticated pets), in which the attentive animal spectators appear to react with awe and exultation when objects or food magically vanish. Without further investigation, it cannot be assumed that the animal audiences in the videos are amazed and surprised by the magic effect, akin to a human spectator. However, these encounters prompt investigation about the extent to which animals are susceptible to the same techniques of deception commonly used by magicians. Over the past several decades, comparative psychologists, perhaps unintentionally, have been using magic effects as a methodological tool to explore a diverse range of cognitive abilities in animals. For instance, when investigating how dogs and great apes mentally represent different kinds of objects, experimenters have used devices inspired by props commonly used in magic effects, such as boxes with false bottoms (1).
Researchers have also investigated causal cognition in New Caledonian crows using invisible string, a see-through thread frequently used for levitation effects, to determine how crows respond to objects moving "without" human interaction (2). Moreover, violation of expectation paradigms, in which a subject is presented with a series of expected and unexpected outcomes, have been used extensively in comparative cognition (the investigation of cognitive mechanisms in diverse species and their origins). Such a premise is directly comparable to magic effects, given that both magic effects and violation of expectation paradigms aim to elicit the same reaction from the observer, namely surprise at witnessing the unexpected. Although animal subjects do not typically verbalize their surprise at unexpected events, surprise can be measured using looking time. For example, if the subject finds an event surprising, they spend significantly longer looking at the event compared with an event that is deemed ordinary. Although magical effects have permeated the field of comparative cognition, the scientific community has yet to study whether animals can be deceived by the same magic methodologies that would deceive a human observer. This is an interesting query because the use of magic effects to deceive animals could only be feasible if both human and animal spectators shared some analogous cognitive processes that capitalize on perceptive blind spots and cognitive roadblocks. Investigating the psychology behind magic effects in humans offers comparative psychologists an accessible pathway to formulate initial hypotheses to test in animal audiences. For example, the vanishing ball, an effect in which the magician seemingly makes a ball vanish into thin air, could be used to investigate whether past experiences and current expectations alter the animal's perception.
In humans, the illusion's success appears to rely on the spectator's expectation of the ball's movement and the social cues elicited by the magician (3). Using a similar design with animals could be insightful, regarding both the animal's expectations (i.e., throwing a ball toward the ceiling will make the ball go upward) and whether human body language offers an animal audience social cues when priming such illusions. A popular magic technique is misdirection, the manipulation of the spectator by the magician to prevent the discovery of the cause of a magic effect. Controlling the audience's attention is an important skill for magicians; otherwise, spectators might discover the mechanics behind the effect. Some species have been observed using behavioral tactics that can be considered analogous to misdirection. For example, chimpanzees sometimes divert their gaze from a desired object to draw a competitor's attention away from it (4). Jays (i.e., corvids) will protect their food caches from possible pilferers by moving them several times or discreetly hiding the food while performing several bluff caching events, thereby making it difficult for the observer to trace the genuine cache location (5). The use of analogous methodologies by a diverse range of animal taxa to deceive conspecifics suggests that some misdirection techniques could exploit similar blind spots in attention. It also prompts the question of whether misdirection techniques used by magicians can also effectively fool animal minds. However, when doing so, experimenters must engage the attentional mechanisms of their spectators, because misdirection techniques are contingent on this. This might be challenging with animal subjects who might not pay sufficient attention to humans. Engaging the undivided attention of our closest relative, the chimpanzee, is one of the major challenges of implementing experimental designs on apes (6).
Offering them long periods of intensive training, during which the ape must pay close attention to human movement, might ameliorate the challenge. By contrast, corvids possess sophisticated attentional mechanisms and are a suitable candidate for this line of research because they follow human gaze around particular objects and monitor human attentional states (7, 8).

[Figure: Hand gestures influence choice. A priming experiment to observe whether a magpie's choice can be influenced by human hand gestures; magpies are first trained to discriminate between three differently shaped objects and to exchange any shaped object for a food reward. GRAPHIC: A. KITTERMAN/SCIENCE]

In addition to misdirection, magicians often rely on our cognitive abilities to create a magical illusion. One such ability is object permanence, the ability to represent objects in the mind's eye when the object is out of sight. This ability appears to be adaptive for diverse taxa. For example, object permanence is harnessed by corvids during caching to successfully cache and recover because individuals must understand and remember that hidden items continue to exist even when they are out of sight (9). The ability to form a mental representation of an object when it is out of sight and to maintain it in memory is also vital for conjuring magic effects, because most effects tend to involve the appearance and disappearance of objects. Thus, object permanence paradigms grant a suitable starting point for comparative psychologists to investigate the analogous mechanisms of both human and animal observers of magic. Interesting insights into object permanence have been made when adopting magic as a framework of study.
When using a fake transfer technique (i.e., where the magician pretends to place an object in one hand while keeping it in the initial hand instead), human observers appear to retain the erroneous belief that a coin is placed inside the hand only for a limited period of time. Elongated reveal times seem to decrease the strength of this belief significantly (10), suggesting that inducing a false belief of object permanence might be contingent on not allowing enough time for the spectator to replay the events in their mind. Given the current research on object permanence in diverse taxa, translating the fake transfer technique to a suitable animal and paradigm (e.g., corvid caching) might elucidate the degree of commonality with object permanence abilities in humans and highlight whether perception of object permanence and memory of the hidden location in animal minds can be manipulated in analogous ways. Although the science of magic has mainly focused on the exploitation of simpler mechanisms such as attention and perception, magic effects also use techniques that affect complex cognitive abilities such as memory and mental time travel. For example, magicians often alter the spectator's recollection of an event and induce fake memories through suggestion. When researchers suggested to human subjects that a "magic" key, which had been previously bent, would continue to bend once the effect finished, the spectators were more likely to report that they had observed the bending process during and after the magic effect (11). Other effects such as the "one ahead principle" exploit the spectator's inability to effectively deconstruct memories to make them think that the magician can read their mind. This is done by the magician forcing the outcome of one of the predictions while altering the order of events that the spectator is experiencing.
Given the reconstructive nature of human memory, the spectator will recall the sequences in the order they occurred, instead of dissecting them into the events that were key for the experience (12). Such effects could only be investigated with species that possess mental time travel abilities, given that, evidently, one cannot exploit the faults of a nonexistent mechanism. Current research suggests that corvids exhibit sophisticated mental time travel abilities (13, 14) and are therefore ideal subjects for experiments with such magic effects. The application of similar techniques adapted to an animal audience might reveal whether animals that possess complex memory abilities also encounter comparable constraints. The imperative use of language in this kind of research is a strong barrier if one is to transpose it to an animal audience. However, recent research on humans raises the possibility that simple choices can be influenced by using hand gestures (15), thus offering a more relevant way to test for analogous roadblocks in animal memories. Magical frameworks ought to be the subject of in-depth methodological inspection and theorization. A good starting point might be the use of hand gestures depicting simple primes to observe whether humans can influence choice in corvids. For example, subjects could be trained to discriminate between three differently shaped objects and asked, by the experimenter, to retrieve any object in exchange for a reward. Experimental conditions could include whether making heart-shape gestures, when asking, primes the subject to retrieve the heart object instead of the circular or rectangular object (see the figure). The psychology of magic offers the scientific community a powerful methodological tool for testing the perceptive blind spots and cognitive roadblocks in diverse taxa.
Studying whether animals can be deceived by the same magic effects that deceive humans can offer a window into the cognitive parallels and variances in attention, perception, and mental time travel, especially for those species thought to possess the necessary prerequisites to be deceived by magic effects. Magical frameworks offer alternative and innovative avenues for hypothesis testing and experimental design, and it is hoped that future researchers will incorporate them into their investigations of the animal mind.

References

1. J. Bräuer, J. Call, J. Comp. Psychol. 125, 353 (2011).
2. A. H. Taylor, R. Miller, R. D. Gray, Proc. Natl. Acad. Sci. U.S.A. 109, 16389 (2012).
3. G. Kuhn, M. F. Land, Curr. Biol. 16, 950 (2006).
4. A. Whiten, R. W. Byrne, Behav. Brain Sci. 11, 233 (1988).
5. N. S. Clayton, C. Wilkins, Curr. Biol. 29, R349 (2019).
6. D. A. Leavens, K. A. Bard, W. D. Hopkins, Anim. Cogn. 22, 487 (2019).
7. A. M. P. von Bayern, N. J. Emery, Curr. Biol. 19, 602 (2009).
8. T. Bugnyar, M. Stöwe, B. Heinrich, Proc. R. Soc. London Ser. B 271, 1331 (2004).
9. L. H. Salwiczek, N. J. Emery, B. Schlinger, N. S. Clayton, J. Comp. Psychol. 123, 295 (2009).
10. T. Beth, V. Ekroll, Psychol. Res. 79, 513 (2015).
11. R. Wiseman, E. Greening, Br. J. Psychol. 96, 115 (2005).
12. N. Clayton, C. Wilkins, Interface Focus 7, 20160112 (2017).
13. N. S. Clayton, A. Dickinson, Nature 395, 272 (1998).
14. C. R. Raby, D. M. Alexis, A. Dickinson, N. S. Clayton, Nature 445, 919 (2007).
15. A. Pailhès, G. Kuhn, Proc. Natl. Acad. Sci. U.S.A. 117, 17675 (2020).