{"id":432,"date":"2023-03-02T20:16:49","date_gmt":"2023-03-02T20:16:49","guid":{"rendered":"https:\/\/content.one.lumenlearning.com\/introductiontopsychology\/chapter\/learn-it-multimodal-phenomena\/"},"modified":"2025-12-29T23:42:53","modified_gmt":"2025-12-29T23:42:53","slug":"learn-it-multimodal-phenomena","status":"publish","type":"chapter","link":"https:\/\/content.one.lumenlearning.com\/introductiontopsychology\/chapter\/learn-it-multimodal-phenomena\/","title":{"raw":"Perception and Illusions: Learn It 2\u2014Multimodal Phenomena","rendered":"Perception and Illusions: Learn It 2\u2014Multimodal Phenomena"},"content":{"raw":"<h2 data-start=\"481\" data-end=\"539\"><strong data-start=\"484\" data-end=\"539\">Multimodal Perception: How Our Senses Work Together<\/strong><\/h2>\r\n<p>Although it has been traditional to study the various senses independently, most of the time, perception operates in the context of information supplied by multiple <strong>sensory modalities<\/strong> at the same time.<\/p>\r\n<section class=\"textbox keyTakeaway\">\r\n<h3>unimodal and multimodal perception<\/h3>\r\n<ul>\r\n\t<li><strong>Unimodal perception<\/strong> is when we use information from just one sense, like seeing with our eyes or hearing with our ears. For example, when we look at a picture, we are using only our sense of vision.<\/li>\r\n\t<li><strong>Multimodal perception<\/strong> is when we use information from multiple senses at the same time to understand the world around us. For example, when we hear a sound and see where the sound is coming from, we are using both our sense of hearing and vision.<\/li>\r\n<\/ul>\r\n<\/section>\r\n<p>For example, imagine if you witnessed a car collision. You could describe the stimulus generated by this event by considering each of the senses independently; that is, as a set of <strong>unimodal<\/strong> stimuli. Your eyes would be stimulated with patterns of light energy bouncing off the cars involved. 
Your ears would be stimulated with patterns of acoustic energy emanating from the collision. Your nose might even be stimulated by the smell of burning rubber or gasoline.<\/p>\r\n<p>Indeed, unless someone were to explicitly ask you to describe your perception in unimodal terms, you would most likely experience the event as a unified bundle of sensations from multiple senses. In other words, your perception would be <strong>multimodal<\/strong>. The question is whether the various sources of information involved in this multimodal stimulus are processed separately by the perceptual system or not.<\/p>\r\n<p>For the last few decades, perceptual research has pointed to the importance of <strong>multimodal perception<\/strong>: the effects on the perception of events and objects in the world that are observed when there is information from more than one sensory modality. Most of this research indicates that, at some point in perceptual processing, information from the various sensory modalities is <strong>integrated<\/strong>. In other words, the information is combined and treated as a unitary representation of the world.<\/p>\r\n<h2 data-start=\"1849\" data-end=\"1889\"><strong data-start=\"1852\" data-end=\"1889\">Why Multimodal Perception Matters<\/strong><\/h2>\r\n<p data-start=\"1891\" data-end=\"2119\">Recent neuroscience has identified many brain regions that respond to <strong data-start=\"1961\" data-end=\"1996\">multiple types of sensory input<\/strong>, suggesting that humans are fundamentally <strong data-start=\"2039\" data-end=\"2064\">multimodal perceivers<\/strong> (Spence, Senkowski, &amp; Roder, 2009). 
This explains why:<\/p>\r\n<ul>\r\n\t<li data-start=\"2122\" data-end=\"2167\">Movie soundtracks feel emotionally powerful<\/li>\r\n\t<li data-start=\"2170\" data-end=\"2203\">Virtual reality seems immersive<\/li>\r\n\t<li data-start=\"2206\" data-end=\"2246\">Food tastes bland when you have a cold<\/li>\r\n\t<li data-start=\"2249\" data-end=\"2314\">Loud sounds appear \u201cbrighter,\u201d and bright flashes appear \u201clouder\u201d<\/li>\r\n<\/ul>\r\n<p data-start=\"2316\" data-end=\"2420\">Our brains are wired to create <strong data-start=\"2347\" data-end=\"2369\">one coherent world<\/strong>, not separate streams of vision, sound, and touch.<\/p>\r\n<h2 data-start=\"2427\" data-end=\"2489\"><strong data-start=\"2430\" data-end=\"2489\">Behavioral Effects: Multimodal vs. Crossmodal Phenomena<\/strong><\/h2>\r\n<p data-start=\"2491\" data-end=\"2556\">Psychologists study two major categories of multisensory effects:<\/p>\r\n<section class=\"textbox keyTakeaway\">\r\n<h3>multimodal and crossmodal phenomena<\/h3>\r\n<ul>\r\n\t<li><strong>Multimodal phenomena <\/strong>concern the binding together of inputs from multiple sensory modalities and the effects of this binding on perception.<\/li>\r\n\t<li><strong>Crossmodal phenomena <\/strong>concern the influence of one sensory modality on the perception of another (Spence, Senkowski, &amp; Roder, 2009).<\/li>\r\n<\/ul>\r\n<\/section>\r\n<h2>Multimodal Phenomena<\/h2>\r\n<h3 data-start=\"2861\" data-end=\"2897\"><strong data-start=\"2864\" data-end=\"2897\">Audiovisual Speech Perception<\/strong><\/h3>\r\n<p data-start=\"2899\" data-end=\"3111\">Speech is naturally multimodal: when someone talks, they produce sound waves and visual mouth movements. 
Watching a speaker\u2019s lips can dramatically improve comprehension, especially in noisy environments.<\/p>\r\n<p data-start=\"3161\" data-end=\"3434\">Sumby and Pollack (1954) showed that in loud background noise, seeing the speaker\u2019s mouth movements improves word recognition more than doubling the signal-to-noise ratio does. In other words, watching the speaker can make speech clearer than simply turning up the volume.<\/p>\r\n<section class=\"textbox example\" aria-label=\"Example\">\r\n<p>One of the earliest investigations of this question examined the accuracy of recognizing spoken words presented in a noisy context, much like trying to follow a conversation at a crowded party. To study this phenomenon experimentally, some irrelevant noise (\u201cwhite noise\u201d\u2014which sounds like a radio tuned between stations) was presented to participants. Embedded in the white noise were spoken words, and the participants\u2019 task was to identify the words. There were two conditions: one in which only the auditory component of the words was presented (the \u201cauditory-alone\u201d condition), and one in which both the auditory and visual components were presented (the \u201caudiovisual\u201d condition). The noise levels were also varied, so that on some trials, the noise was very loud relative to the loudness of the words, and on other trials, the noise was very soft relative to the words.<\/p>\r\n<\/section>\r\n<p data-start=\"3161\" data-end=\"3434\">Most people assume that deaf individuals are much better at lipreading than individuals with normal hearing. It may come as a surprise to learn, however, that some individuals with normal hearing are also remarkably good at lipreading (sometimes called \u201cspeechreading\u201d). In fact, there is a wide range of speechreading ability in both normal hearing and deaf populations (Andersson et al., 2001). 
However, the reasons for this wide range of performance are not well understood (Auer &amp; Bernstein, 2007; Bernstein, 2006; Bernstein et al., 2001; Mohammed et al., 2005).<\/p>\r\n<p data-start=\"3436\" data-end=\"3602\">This audiovisual advantage follows the <strong>principle of inverse effectiveness<\/strong>: the brain benefits most from multisensory information when each individual sense is degraded. You might have noticed this phenomenon when turning captions on to watch a show.<\/p>\r\n<section class=\"textbox example\">Another phenomenon using audiovisual speech is a very famous illusion called the \u201cMcGurk effect\u201d (named after one of its discoverers). In the classic formulation of the illusion, a movie is recorded of a speaker saying the syllables \u201cgaga.\u201d Another movie is made of the same speaker saying the syllables \u201cbaba.\u201d Then, the auditory portion of the \u201cbaba\u201d movie is dubbed onto the visual portion of the \u201cgaga\u201d movie. This combined stimulus is presented to participants, who are asked to report what the speaker in the movie said. McGurk and MacDonald (1976) reported that 98 percent of their participants reported hearing the syllable \u201cdada\u201d\u2014which was in neither the visual nor the auditory components of the stimulus. 
These results indicate that when visual and auditory information about speech is integrated, it can have profound effects on perception.<iframe src=\"\/\/plugin.3playmedia.com\/show?mf=4363814&amp;p3sdk_version=1.10.1&amp;p=20361&amp;pt=573&amp;video_id=G-lN8vWm3m0&amp;video_target=tpm-plugin-zpbuldaq-G-lN8vWm3m0\" width=\"800px\" height=\"500px\" frameborder=\"0\" marginwidth=\"0px\" marginheight=\"0px\" data-mce-fragment=\"1\"><\/iframe>\r\n<p>You can <a href=\"https:\/\/oerfiles.s3-us-west-2.amazonaws.com\/Psychology\/Transcriptions\/TryThisBizarreAudioIllusionBBC.txt\" target=\"_blank\" rel=\"noopener\">view the transcript for \"Try this bizarre audio illusion!\" here (opens in new window)<\/a>.<\/p>\r\n<\/section>\r\n<section class=\"textbox tryIt\">[ohm2_question height=\"300\"]4058[\/ohm2_question]<\/section>\r\n<h3 id=\"tactilevisual-interactions-in-body-ownership\">Tactile\/Visual Interactions in Body Ownership<\/h3>\r\n<p>Not all multisensory integration phenomena concern speech, however. One particularly compelling multisensory illusion involves the integration of tactile and visual information in the perception of body ownership.<\/p>\r\n<p>In the \u201c<strong>rubber hand illusion<\/strong>\u201d (Botvinick &amp; Cohen, 1998), an observer is situated so that one of his hands is not visible. A fake rubber hand is placed near the obscured hand, but in a visible location. The experimenter then uses a light paintbrush to simultaneously stroke the obscured hand and the rubber hand in the same locations. For example, if the middle finger of the obscured hand is being brushed, then the middle finger of the rubber hand will also be brushed. 
This sets up a correspondence between the tactile sensations (coming from the obscured hand) and the visual sensations (of the rubber hand).<\/p>\r\n<p>After a short time (around 10 minutes), participants report feeling as though the rubber hand \u201cbelongs\u201d to them; that is, that the rubber hand is a part of their body. This feeling can be so strong that surprising the participant by hitting the rubber hand with a hammer often leads to a reflexive withdrawal of the obscured hand\u2014even though it is in no danger at all. It appears, then, that our awareness of our own bodies may be the result of multisensory integration.<\/p>\r\n<section class=\"textbox example\">See the rubber hand illusion in the following video.<br \/>\r\n<iframe src=\"\/\/plugin.3playmedia.com\/show?mf=4363815&amp;p3sdk_version=1.10.1&amp;p=20361&amp;pt=573&amp;video_id=sxwn1w7MJvk&amp;video_target=tpm-plugin-ioa4amuw-sxwn1w7MJvk\" width=\"800px\" height=\"500px\" frameborder=\"0\" marginwidth=\"0px\" marginheight=\"0px\" data-mce-fragment=\"1\"><\/iframe>\r\n<p>You can <a href=\"https:\/\/oerfiles.s3-us-west-2.amazonaws.com\/Psychology\/Transcriptions\/TheRubberHandIllusionBBCTwo.txt\" target=\"_blank\" rel=\"noopener\">view the transcript for \"The Rubber Hand Illusion - Horizon: Is Seeing Believing? 
- BBC Two\" here (opens in new window)<\/a>.<\/p>\r\n<\/section>\r\n<h2 data-start=\"4915\" data-end=\"4954\"><strong data-start=\"4918\" data-end=\"4954\">More Everyday Crossmodal Effects<\/strong><\/h2>\r\n<ul>\r\n\t<li data-start=\"4957\" data-end=\"5038\"><strong data-start=\"4957\" data-end=\"4985\">Sound influences vision:<\/strong> A loud beep can make a visual flash seem brighter.<\/li>\r\n\t<li data-start=\"5041\" data-end=\"5137\"><strong data-start=\"5041\" data-end=\"5069\">Vision influences touch:<\/strong> Watching your hand being touched can enhance tactile sensitivity.<\/li>\r\n\t<li data-start=\"5140\" data-end=\"5212\"><strong data-start=\"5140\" data-end=\"5167\">Smell influences taste:<\/strong> Vanilla scent can make food taste sweeter.<\/li>\r\n\t<li data-start=\"5215\" data-end=\"5310\"><strong data-start=\"5215\" data-end=\"5244\">Touch influences hearing:<\/strong> Feeling low-frequency vibrations helps us perceive bass in music.<\/li>\r\n<\/ul>","rendered":"<h2 data-start=\"481\" data-end=\"539\"><strong data-start=\"484\" data-end=\"539\">Multimodal Perception: How Our Senses Work Together<\/strong><\/h2>\n<p>Although it has been traditional to study the various senses independently, most of the time, perception operates in the context of information supplied by multiple <strong>sensory modalities<\/strong> at the same time.<\/p>\n<section class=\"textbox keyTakeaway\">\n<h3>unimodal and multimodal perception<\/h3>\n<ul>\n<li><strong>Unimodal perception<\/strong> is when we use information from just one sense, like seeing with our eyes or hearing with our ears. For example, when we look at a picture, we are using only our sense of vision.<\/li>\n<li><strong>Multimodal perception<\/strong> is when we use information from multiple senses at the same time to understand the world around us. 
For example, when we hear a sound and see where the sound is coming from, we are using both our sense of hearing and vision.<\/li>\n<\/ul>\n<\/section>\n<p>For example, imagine if you witnessed a car collision. You could describe the stimulus generated by this event by considering each of the senses independently; that is, as a set of <strong>unimodal<\/strong> stimuli. Your eyes would be stimulated with patterns of light energy bouncing off the cars involved. Your ears would be stimulated with patterns of acoustic energy emanating from the collision. Your nose might even be stimulated by the smell of burning rubber or gasoline.<\/p>\n<p>Indeed, unless someone were to explicitly ask you to describe your perception in unimodal terms, you would most likely experience the event as a unified bundle of sensations from multiple senses. In other words, your perception would be <strong>multimodal<\/strong>. The question is whether the various sources of information involved in this multimodal stimulus are processed separately by the perceptual system or not.<\/p>\n<p>For the last few decades, perceptual research has pointed to the importance of <strong>multimodal perception<\/strong>: the effects on the perception of events and objects in the world that are observed when there is information from more than one sensory modality. Most of this research indicates that, at some point in perceptual processing, information from the various sensory modalities is <strong>integrated<\/strong>. 
In other words, the information is combined and treated as a unitary representation of the world.<\/p>\n<h2 data-start=\"1849\" data-end=\"1889\"><strong data-start=\"1852\" data-end=\"1889\">Why Multimodal Perception Matters<\/strong><\/h2>\n<p data-start=\"1891\" data-end=\"2119\">Recent neuroscience has identified many brain regions that respond to <strong data-start=\"1961\" data-end=\"1996\">multiple types of sensory input<\/strong>, suggesting that humans are fundamentally <strong data-start=\"2039\" data-end=\"2064\">multimodal perceivers<\/strong> (Spence, Senkowski, &amp; Roder, 2009). This explains why:<\/p>\n<ul>\n<li data-start=\"2122\" data-end=\"2167\">Movie soundtracks feel emotionally powerful<\/li>\n<li data-start=\"2170\" data-end=\"2203\">Virtual reality seems immersive<\/li>\n<li data-start=\"2206\" data-end=\"2246\">Food tastes bland when you have a cold<\/li>\n<li data-start=\"2249\" data-end=\"2314\">Loud sounds appear \u201cbrighter,\u201d and bright flashes appear \u201clouder\u201d<\/li>\n<\/ul>\n<p data-start=\"2316\" data-end=\"2420\">Our brains are wired to create <strong data-start=\"2347\" data-end=\"2369\">one coherent world<\/strong>, not separate streams of vision, sound, and touch.<\/p>\n<h2 data-start=\"2427\" data-end=\"2489\"><strong data-start=\"2430\" data-end=\"2489\">Behavioral Effects: Multimodal vs. 
Crossmodal Phenomena<\/strong><\/h2>\n<p data-start=\"2491\" data-end=\"2556\">Psychologists study two major categories of multisensory effects:<\/p>\n<section class=\"textbox keyTakeaway\">\n<h3>multimodal and crossmodal phenomena<\/h3>\n<ul>\n<li><strong>Multimodal phenomena <\/strong>concern the binding together of inputs from multiple sensory modalities and the effects of this binding on perception.<\/li>\n<li><strong>Crossmodal phenomena <\/strong>concern the influence of one sensory modality on the perception of another (Spence, Senkowski, &amp; Roder, 2009).<\/li>\n<\/ul>\n<\/section>\n<h2>Multimodal Phenomena<\/h2>\n<h3 data-start=\"2861\" data-end=\"2897\"><strong data-start=\"2864\" data-end=\"2897\">Audiovisual Speech Perception<\/strong><\/h3>\n<p data-start=\"2899\" data-end=\"3111\">Speech is naturally multimodal: when someone talks, they produce sound waves and visual mouth movements. Watching a speaker\u2019s lips can dramatically improve comprehension, especially in noisy environments.<\/p>\n<p data-start=\"3161\" data-end=\"3434\">Sumby and Pollack (1954) showed that in loud background noise, seeing the speaker\u2019s mouth movements improves word recognition more than doubling the signal-to-noise ratio does. In other words, watching the speaker can make speech clearer than simply turning up the volume.<\/p>\n<section class=\"textbox example\" aria-label=\"Example\">\n<p>One of the earliest investigations of this question examined the accuracy of recognizing spoken words presented in a noisy context, much like trying to follow a conversation at a crowded party. To study this phenomenon experimentally, some irrelevant noise (\u201cwhite noise\u201d\u2014which sounds like a radio tuned between stations) was presented to participants. Embedded in the white noise were spoken words, and the participants\u2019 task was to identify the words. 
There were two conditions: one in which only the auditory component of the words was presented (the \u201cauditory-alone\u201d condition), and one in which both the auditory and visual components were presented (the \u201caudiovisual\u201d condition). The noise levels were also varied, so that on some trials, the noise was very loud relative to the loudness of the words, and on other trials, the noise was very soft relative to the words.<\/p>\n<\/section>\n<p data-start=\"3161\" data-end=\"3434\">Most people assume that deaf individuals are much better at lipreading than individuals with normal hearing. It may come as a surprise to learn, however, that some individuals with normal hearing are also remarkably good at lipreading (sometimes called \u201cspeechreading\u201d). In fact, there is a wide range of speechreading ability in both normal hearing and deaf populations (Andersson et al., 2001). However, the reasons for this wide range of performance are not well understood (Auer &amp; Bernstein, 2007; Bernstein, 2006; Bernstein et al., 2001; Mohammed et al., 2005).<\/p>\n<p data-start=\"3436\" data-end=\"3602\">This audiovisual advantage follows the <strong>principle of inverse effectiveness<\/strong>: the brain benefits most from multisensory information when each individual sense is degraded. You might have noticed this phenomenon when turning captions on to watch a show.<\/p>\n<section class=\"textbox example\">Another phenomenon using audiovisual speech is a very famous illusion called the \u201cMcGurk effect\u201d (named after one of its discoverers). In the classic formulation of the illusion, a movie is recorded of a speaker saying the syllables \u201cgaga.\u201d Another movie is made of the same speaker saying the syllables \u201cbaba.\u201d Then, the auditory portion of the \u201cbaba\u201d movie is dubbed onto the visual portion of the \u201cgaga\u201d movie. This combined stimulus is presented to participants, who are asked to report what the speaker in the movie said. 
McGurk and MacDonald (1976) reported that 98 percent of their participants reported hearing the syllable \u201cdada\u201d\u2014which was in neither the visual nor the auditory components of the stimulus. These results indicate that when visual and auditory information about speech is integrated, it can have profound effects on perception.<iframe loading=\"lazy\" src=\"\/\/plugin.3playmedia.com\/show?mf=4363814&amp;p3sdk_version=1.10.1&amp;p=20361&amp;pt=573&amp;video_id=G-lN8vWm3m0&amp;video_target=tpm-plugin-zpbuldaq-G-lN8vWm3m0\" width=\"800px\" height=\"500px\" frameborder=\"0\" marginwidth=\"0px\" marginheight=\"0px\" data-mce-fragment=\"1\"><\/iframe><\/p>\n<p>You can <a href=\"https:\/\/oerfiles.s3-us-west-2.amazonaws.com\/Psychology\/Transcriptions\/TryThisBizarreAudioIllusionBBC.txt\" target=\"_blank\" rel=\"noopener\">view the transcript for &#8220;Try this bizarre audio illusion!&#8221; here (opens in new window)<\/a>.<\/p>\n<\/section>\n<section class=\"textbox tryIt\"><iframe loading=\"lazy\" id=\"ohm4058\" class=\"resizable\" src=\"https:\/\/ohm.one.lumenlearning.com\/multiembedq.php?id=4058&theme=lumen&iframe_resize_id=ohm4058&source=tnh&show_question_numbers\" width=\"100%\" height=\"300\"><\/iframe><\/section>\n<h3 id=\"tactilevisual-interactions-in-body-ownership\">Tactile\/Visual Interactions in Body Ownership<\/h3>\n<p>Not all multisensory integration phenomena concern speech, however. One particularly compelling multisensory illusion involves the integration of tactile and visual information in the perception of body ownership.<\/p>\n<p>In the \u201c<strong>rubber hand illusion<\/strong>\u201d (Botvinick &amp; Cohen, 1998), an observer is situated so that one of his hands is not visible. A fake rubber hand is placed near the obscured hand, but in a visible location. The experimenter then uses a light paintbrush to simultaneously stroke the obscured hand and the rubber hand in the same locations. 
For example, if the middle finger of the obscured hand is being brushed, then the middle finger of the rubber hand will also be brushed. This sets up a correspondence between the tactile sensations (coming from the obscured hand) and the visual sensations (of the rubber hand).<\/p>\n<p>After a short time (around 10 minutes), participants report feeling as though the rubber hand \u201cbelongs\u201d to them; that is, that the rubber hand is a part of their body. This feeling can be so strong that surprising the participant by hitting the rubber hand with a hammer often leads to a reflexive withdrawal of the obscured hand\u2014even though it is in no danger at all. It appears, then, that our awareness of our own bodies may be the result of multisensory integration.<\/p>\n<section class=\"textbox example\">See the rubber hand illusion in the following video.<br \/>\n<iframe loading=\"lazy\" src=\"\/\/plugin.3playmedia.com\/show?mf=4363815&amp;p3sdk_version=1.10.1&amp;p=20361&amp;pt=573&amp;video_id=sxwn1w7MJvk&amp;video_target=tpm-plugin-ioa4amuw-sxwn1w7MJvk\" width=\"800px\" height=\"500px\" frameborder=\"0\" marginwidth=\"0px\" marginheight=\"0px\" data-mce-fragment=\"1\"><\/iframe><\/p>\n<p>You can <a href=\"https:\/\/oerfiles.s3-us-west-2.amazonaws.com\/Psychology\/Transcriptions\/TheRubberHandIllusionBBCTwo.txt\" target=\"_blank\" rel=\"noopener\">view the transcript for &#8220;The Rubber Hand Illusion &#8211; Horizon: Is Seeing Believing? 
&#8211; BBC Two&#8221; here (opens in new window)<\/a>.<\/p>\n<\/section>\n<h2 data-start=\"4915\" data-end=\"4954\"><strong data-start=\"4918\" data-end=\"4954\">More Everyday Crossmodal Effects<\/strong><\/h2>\n<ul>\n<li data-start=\"4957\" data-end=\"5038\"><strong data-start=\"4957\" data-end=\"4985\">Sound influences vision:<\/strong> A loud beep can make a visual flash seem brighter.<\/li>\n<li data-start=\"5041\" data-end=\"5137\"><strong data-start=\"5041\" data-end=\"5069\">Vision influences touch:<\/strong> Watching your hand being touched can enhance tactile sensitivity.<\/li>\n<li data-start=\"5140\" data-end=\"5212\"><strong data-start=\"5140\" data-end=\"5167\">Smell influences taste:<\/strong> Vanilla scent can make food taste sweeter.<\/li>\n<li data-start=\"5215\" data-end=\"5310\"><strong data-start=\"5215\" data-end=\"5244\">Touch influences hearing:<\/strong> Feeling low-frequency vibrations helps us perceive bass in music.<\/li>\n<\/ul>\n","protected":false},"author":20,"menu_order":30,"template":"","meta":{"_candela_citation":"[{\"type\":\"cc\",\"description\":\"Multi-Modal Perception\",\"author\":\"Lorin Lachs\",\"organization\":\"California State University, Fresno\",\"url\":\"http:\/\/nobaproject.com\/modules\/multi-modal-perception\",\"project\":\"The Noba Project\",\"license\":\"cc-by-nc-sa\",\"license_terms\":\"\"},{\"type\":\"copyrighted_video\",\"description\":\"The McGurk Effect\",\"author\":\"\",\"organization\":\"BBC\",\"url\":\"https:\/\/youtu.be\/G-lN8vWm3m0?t=32s\",\"project\":\"\",\"license\":\"other\",\"license_terms\":\"Standard YouTube License\"},{\"type\":\"copyrighted_video\",\"description\":\"The Rubber Hand Illusion\",\"author\":\"\",\"organization\":\"BBC\",\"url\":\"https:\/\/youtu.be\/sxwn1w7MJvk\",\"project\":\"\",\"license\":\"other\",\"license_terms\":\"Standard YouTube 
License\"}]","pb_show_title":"on","pb_short_title":"","pb_subtitle":"","pb_authors":[],"pb_section_license":""},"chapter-type":[],"contributor":[],"license":[],"part":402,"module-header":"learn_it","content_attributions":[{"type":"cc","description":"Multi-Modal Perception","author":"Lorin Lachs","organization":"California State University, Fresno","url":"http:\/\/nobaproject.com\/modules\/multi-modal-perception","project":"The Noba Project","license":"cc-by-nc-sa","license_terms":""},{"type":"copyrighted_video","description":"The McGurk Effect","author":"","organization":"BBC","url":"https:\/\/youtu.be\/G-lN8vWm3m0?t=32s","project":"","license":"other","license_terms":"Standard YouTube License"},{"type":"copyrighted_video","description":"The Rubber Hand Illusion","author":"","organization":"BBC","url":"https:\/\/youtu.be\/sxwn1w7MJvk","project":"","license":"other","license_terms":"Standard YouTube License"}],"internal_book_links":[],"video_content":null,"cc_video_embed_content":{"cc_scripts":"","media_targets":[]},"try_it_collection":null,"_links":{"self":[{"href":"https:\/\/content.one.lumenlearning.com\/introductiontopsychology\/wp-json\/pressbooks\/v2\/chapters\/432"}],"collection":[{"href":"https:\/\/content.one.lumenlearning.com\/introductiontopsychology\/wp-json\/pressbooks\/v2\/chapters"}],"about":[{"href":"https:\/\/content.one.lumenlearning.com\/introductiontopsychology\/wp-json\/wp\/v2\/types\/chapter"}],"author":[{"embeddable":true,"href":"https:\/\/content.one.lumenlearning.com\/introductiontopsychology\/wp-json\/wp\/v2\/users\/20"}],"version-history":[{"count":14,"href":"https:\/\/content.one.lumenlearning.com\/introductiontopsychology\/wp-json\/pressbooks\/v2\/chapters\/432\/revisions"}],"predecessor-version":[{"id":7412,"href":"https:\/\/content.one.lumenlearning.com\/introductiontopsychology\/wp-json\/pressbooks\/v2\/chapters\/432\/revisions\/7412"}],"part":[{"href":"https:\/\/content.one.lumenlearning.com\/introductiontopsychology\/wp-json\/pressbo
oks\/v2\/parts\/402"}],"metadata":[{"href":"https:\/\/content.one.lumenlearning.com\/introductiontopsychology\/wp-json\/pressbooks\/v2\/chapters\/432\/metadata\/"}],"wp:attachment":[{"href":"https:\/\/content.one.lumenlearning.com\/introductiontopsychology\/wp-json\/wp\/v2\/media?parent=432"}],"wp:term":[{"taxonomy":"chapter-type","embeddable":true,"href":"https:\/\/content.one.lumenlearning.com\/introductiontopsychology\/wp-json\/pressbooks\/v2\/chapter-type?post=432"},{"taxonomy":"contributor","embeddable":true,"href":"https:\/\/content.one.lumenlearning.com\/introductiontopsychology\/wp-json\/wp\/v2\/contributor?post=432"},{"taxonomy":"license","embeddable":true,"href":"https:\/\/content.one.lumenlearning.com\/introductiontopsychology\/wp-json\/wp\/v2\/license?post=432"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}