{"id":1497,"date":"2023-06-22T02:36:57","date_gmt":"2023-06-22T02:36:57","guid":{"rendered":"https:\/\/content.one.lumenlearning.com\/introstatstest\/chapter\/interaction-terms-apply-it-1\/"},"modified":"2025-05-17T02:46:45","modified_gmt":"2025-05-17T02:46:45","slug":"interaction-terms-apply-it-1","status":"publish","type":"chapter","link":"https:\/\/content.one.lumenlearning.com\/introstatstest\/chapter\/interaction-terms-apply-it-1\/","title":{"raw":"Interaction Terms - Apply It 1","rendered":"Interaction Terms &#8211; Apply It 1"},"content":{"raw":"<section class=\"textbox learningGoals\">\r\n<ul>\r\n\t<li>Know what an interaction term is and use a scatterplot to understand the interaction effects<\/li>\r\n\t<li>Write appropriate multiple linear regression model equations for categorical predictors and interaction terms<\/li>\r\n<\/ul>\r\n<\/section>\r\n<h2>Interaction Effects on Equations<\/h2>\r\n<section class=\"textbox recall\">A linear regression model with two or more explanatory variables is called a <strong>multiple linear regression model. <\/strong>Since there is more than one explanatory variable, the model is no longer a line. In fact, we can include [latex]p[\/latex]\u00a0explanatory variables in our model. The equation for the estimated model that uses [latex]p[\/latex]\u00a0variables is\r\n\r\n<p style=\"text-align: center;\">[latex]\\hat{y} = a + b_1 \\cdot x_1 + b_2 \\cdot x_2 + ... + b_p \\cdot x_p[\/latex]<\/p>\r\n<p>where [latex]b_1, b_2, ... ,b_p[\/latex] are the regression coefficients for explanatory variables [latex]x_1, x_2, ... ,x_p[\/latex], respectively. In multiple linear regression, [latex]b_1, b_2, ... 
, b_p[\/latex] are called <strong>partial slopes<\/strong>.<\/p>\r\n<\/section>\r\n<p>An\u00a0<strong>interaction<\/strong>\u00a0occurs when an explanatory variable has a different effect on the response variable, depending on the values of another explanatory variable.<\/p>\r\n<p>Let's assume that there are two explanatory variables in the model. The full equation would be:<\/p>\r\n<p style=\"text-align: center;\">[latex]\\hat{y} = a + b_1 \\cdot x_1 + b_2 \\cdot x_2[\/latex]<\/p>\r\n<p>And here is the same regression equation with an interaction:<\/p>\r\n<p style=\"text-align: center;\">[latex]\\hat{y} = a + b_1 \\cdot x_1 + b_2 \\cdot x_2 + b_{12} \\cdot x_1x_2[\/latex]<\/p>\r\n<p>where [latex]b_{12}[\/latex] is the regression coefficient for the two-way interaction [latex]x_1x_2[\/latex].<\/p>\r\n<p>Note: We will not go into the details of how to calculate the regression equation; instead, we focus on interpreting it.<\/p>\r\n<section class=\"textbox tryIt\">[ohm2_question hide_question_numbers=1]3865[\/ohm2_question]<\/section>\r\n<section class=\"textbox tryIt\">[ohm2_question hide_question_numbers=1]3866[\/ohm2_question]<\/section>\r\n<section class=\"textbox tryIt\">[ohm2_question hide_question_numbers=1]3867[\/ohm2_question]<\/section>","rendered":"<section class=\"textbox learningGoals\">\n<ul>\n<li>Know what an interaction term is and use a scatterplot to understand the interaction effects<\/li>\n<li>Write appropriate multiple linear regression model equations for categorical predictors and interaction terms<\/li>\n<\/ul>\n<\/section>\n<h2>Interaction Effects on Equations<\/h2>\n<section class=\"textbox recall\">A linear regression model with two or more explanatory variables is called a <strong>multiple linear regression model. <\/strong>Since there is more than one explanatory variable, the model is no longer a line. In fact, we can include [latex]p[\/latex]\u00a0explanatory variables in our model. 
The equation for the estimated model that uses [latex]p[\/latex]\u00a0variables is<\/p>\n<p style=\"text-align: center;\">[latex]\\hat{y} = a + b_1 \\cdot x_1 + b_2 \\cdot x_2 + ... + b_p \\cdot x_p[\/latex]<\/p>\n<p>where [latex]b_1, b_2, ... ,b_p[\/latex] are the regression coefficients for explanatory variables [latex]x_1, x_2, ... ,x_p[\/latex], respectively. In multiple linear regression, [latex]b_1, b_2, ... , b_p[\/latex] are called <strong>partial slopes<\/strong>.<\/p>\n<\/section>\n<p>An\u00a0<strong>interaction<\/strong>\u00a0occurs when an explanatory variable has a different effect on the response variable, depending on the values of another explanatory variable.<\/p>\n<p>Let&#8217;s assume that there are two explanatory variables in the model. The full equation would be:<\/p>\n<p style=\"text-align: center;\">[latex]\\hat{y} = a + b_1 \\cdot x_1 + b_2 \\cdot x_2[\/latex]<\/p>\n<p>And here is the same regression equation with an interaction:<\/p>\n<p style=\"text-align: center;\">[latex]\\hat{y} = a + b_1 \\cdot x_1 + b_2 \\cdot x_2 + b_{12} \\cdot x_1x_2[\/latex]<\/p>\n<p>where [latex]b_{12}[\/latex] is the regression coefficient for the two-way interaction [latex]x_1x_2[\/latex].<\/p>\n<p>Note: We will not go into the details of how to calculate the regression equation; instead, we focus on interpreting it.<\/p>\n<section class=\"textbox tryIt\"><iframe loading=\"lazy\" id=\"ohm3865\" class=\"resizable\" src=\"https:\/\/ohm.one.lumenlearning.com\/multiembedq.php?id=3865&theme=lumen&iframe_resize_id=ohm3865&source=tnh\" width=\"100%\" height=\"150\"><\/iframe><\/section>\n<section class=\"textbox tryIt\"><iframe loading=\"lazy\" id=\"ohm3866\" class=\"resizable\" src=\"https:\/\/ohm.one.lumenlearning.com\/multiembedq.php?id=3866&theme=lumen&iframe_resize_id=ohm3866&source=tnh\" width=\"100%\" height=\"150\"><\/iframe><\/section>\n<section class=\"textbox tryIt\"><iframe loading=\"lazy\" id=\"ohm3867\" class=\"resizable\" 
src=\"https:\/\/ohm.one.lumenlearning.com\/multiembedq.php?id=3867&theme=lumen&iframe_resize_id=ohm3867&source=tnh\" width=\"100%\" height=\"150\"><\/iframe><\/section>\n","protected":false},"author":8,"menu_order":24,"template":"","meta":{"_candela_citation":"[]","pb_show_title":"on","pb_short_title":"","pb_subtitle":"","pb_authors":[],"pb_section_license":""},"chapter-type":[],"contributor":[],"license":[],"part":1473,"module-header":"apply_it","content_attributions":[],"internal_book_links":[],"video_content":null,"cc_video_embed_content":{"cc_scripts":"","media_targets":[]},"try_it_collection":null,"_links":{"self":[{"href":"https:\/\/content.one.lumenlearning.com\/introstatstest\/wp-json\/pressbooks\/v2\/chapters\/1497"}],"collection":[{"href":"https:\/\/content.one.lumenlearning.com\/introstatstest\/wp-json\/pressbooks\/v2\/chapters"}],"about":[{"href":"https:\/\/content.one.lumenlearning.com\/introstatstest\/wp-json\/wp\/v2\/types\/chapter"}],"author":[{"embeddable":true,"href":"https:\/\/content.one.lumenlearning.com\/introstatstest\/wp-json\/wp\/v2\/users\/8"}],"version-history":[{"count":5,"href":"https:\/\/content.one.lumenlearning.com\/introstatstest\/wp-json\/pressbooks\/v2\/chapters\/1497\/revisions"}],"predecessor-version":[{"id":6914,"href":"https:\/\/content.one.lumenlearning.com\/introstatstest\/wp-json\/pressbooks\/v2\/chapters\/1497\/revisions\/6914"}],"part":[{"href":"https:\/\/content.one.lumenlearning.com\/introstatstest\/wp-json\/pressbooks\/v2\/parts\/1473"}],"metadata":[{"href":"https:\/\/content.one.lumenlearning.com\/introstatstest\/wp-json\/pressbooks\/v2\/chapters\/1497\/metadata\/"}],"wp:attachment":[{"href":"https:\/\/content.one.lumenlearning.com\/introstatstest\/wp-json\/wp\/v2\/media?parent=1497"}],"wp:term":[{"taxonomy":"chapter-type","embeddable":true,"href":"https:\/\/content.one.lumenlearning.com\/introstatstest\/wp-json\/pressbooks\/v2\/chapter-type?post=1497"},{"taxonomy":"contributor","embeddable":true,"href":"https:\/\/cont
ent.one.lumenlearning.com\/introstatstest\/wp-json\/wp\/v2\/contributor?post=1497"},{"taxonomy":"license","embeddable":true,"href":"https:\/\/content.one.lumenlearning.com\/introstatstest\/wp-json\/wp\/v2\/license?post=1497"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}