{"_id":"57f2d7e8f4606e0e00dca29e","project":"5668fab608f90021008e882f","parentDoc":null,"version":{"_id":"5668fab608f90021008e8832","__v":19,"project":"5668fab608f90021008e882f","createdAt":"2015-12-10T04:08:22.769Z","releaseDate":"2015-12-10T04:08:22.769Z","categories":["5668fab708f90021008e8833","569740f124490c3700170a64","569742b58560a60d00e2c25d","569742bd0b09a41900b2446c","569742cd69393517000c82b3","569742f459a6692d003fad8f","569743020b09a41900b2446d","5697430b69393517000c82b5","56a17776470ae00d00c30642","56a2c48a831e2a0d0069b1ad","56b535757bccae0d00e9a1cd","56e1ff6aa49fdc0e005746b5","57e1c88115bf6522002a5e4e","57fa65275ba65a17008b988f","57fbeea34002550e004c032e","58474584889b6c2d00fb86e9","58475dcc64157f0f002f1907","587e7b5158666c2700965d4e","58a349fc30852819007ba083"],"is_deprecated":false,"is_hidden":false,"is_beta":false,"is_stable":true,"codename":"","version_clean":"1.18.0","version":"1.18"},"__v":0,"category":{"_id":"57e1c88115bf6522002a5e4e","project":"5668fab608f90021008e882f","__v":0,"version":"5668fab608f90021008e8832","sync":{"url":"","isSync":false},"reference":false,"createdAt":"2016-09-20T23:38:41.155Z","from_sync":false,"order":11,"slug":"metrics","title":"Metrics"},"user":"5668fa9755e4b32100935d41","updates":[],"next":{"pages":[],"description":""},"createdAt":"2016-10-03T22:12:56.710Z","link_external":false,"link_url":"","githubsync":"","sync_unique":"","hidden":false,"api":{"settings":"","results":{"codes":[]},"auth":"required","params":[],"url":""},"isReference":false,"order":6,"body":"The widget click event (called `widget_click`) is triggered every time a recommendation from a tracked recommendation set is clicked. Developers can get a quick summary at the [Widget Events](doc:widget-events) page. This page is more for people interested in data science and analysis.\n[block:api-header]\n{\n  \"type\": \"basic\",\n  \"title\": \"1. Conditions for triggering\"\n}\n[/block]\nA widget click is triggered every time a user clicks on a recommendation in a recommendation widget being tracked by LiftIgniter. This could include recommendation widgets powered by LiftIgniter, as well as base slices of such widgets (that are not powered by LiftIgniter, but are still tracked to provide an A/B test comparison). The event is triggered both on a left click (called a \"click\" in Javascript) or a right click (called a \"contextmenu\" in Javascript, since it opens up the context menu).\n\nThe `widget_click` event can be triggered even in cases where the user does not actually visit the clicked page. This is obvious for right clicks: some people right-click in order to open the link in a new tab or window, but others may right-click to copy the link URL or use another context menu option. It could also be that the user intends to open the page but the page fails to load or the user bounces too quickly. We will still count these as clicks.\n[block:api-header]\n{\n  \"type\": \"basic\",\n  \"title\": \"2. Fields in widget click events\"\n}\n[/block]\nWidget click events have the following special fields that play a key role in performance tracking, analytics, and machine learning:\n\n* Visible items (called `visibleItems`, and compactified in our JS as `vi`): This is a list of the recommended items shown in the widget. 
[block:api-header]
{
  "type": "basic",
  "title": "2. Fields in widget click events"
}
[/block]
Widget click events have the following special fields that play a key role in performance tracking, analytics, and machine learning (see the example payload after this list):

* Visible items (called `visibleItems`, and compactified in our JS as `vi`): This is a list of the recommended items shown in the widget. The list of visible items helps our machine learning systems keep track of how often specific items are shown, so that we can calculate item CTRs and our system can better gauge the performance of specific items.
* Click URL (called `clickUrl`): This is the URL of the item clicked. It is key to our machine learning, as it tells us what the user selected! It is also important for other analytics purposes, including deduplication.
* Widget name (called `widgetName`, and compactified in our JS as `w`): This is the name of the widget. For instance, you might have different widget names such as `home-page-recommendations` and `article-recommendations`.
* Source (called `source`): This gives information on the algorithm used as the source of recommendations. We use `LI` for LiftIgniter's recommendations (in our Analytics, this shows up as "LiftIgniter") and `base` for the baseline recommendations. You can use other names.
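For illustration, a `widget_click` payload carrying these fields might look roughly like the following; the values are made up, and the exact wire format (including the compactified field names) may differ:

```javascript
// Hypothetical widget_click payload; field names follow the list above,
// but the values and exact wire format are illustrative only.
var widgetClickEvent = {
  visibleItems: [                                   // "vi" in the compactified JS
    'https://example.com/articles/cats',
    'https://example.com/articles/dogs',
    'https://example.com/articles/birds'
  ],
  clickUrl: 'https://example.com/articles/dogs',    // the item the user clicked
  widgetName: 'article-recommendations',            // "w" in the compactified JS
  source: 'LI'                                      // 'LI' or 'base' (or a custom name)
};
```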
[block:api-header]
{
  "type": "basic",
  "title": "3. Comparing widget click event values in an A/B test"
}
[/block]

[block:image]
{
  "images": [
    {
      "image": [
        "https://files.readme.io/752dfaf-widget-click-ab-example.png",
        "widget-click-ab-example.png",
        1698,
        794,
        "#e0eaf8"
      ],
      "caption": "Widget click values in a fair 50/50 A/B test. LiftIgniter's slice (blue, filled solid below) performs about twice as well as the baseline slice (orange). Traffic volumes vary a lot by day, but LiftIgniter outperforms the baseline every day."
    }
  ]
}
[/block]
LiftIgniter does A/B testing by user. This means that when we run a 50/50 A/B test, 50% of users see LiftIgniter's recommendations all the time, and the other 50% see the baseline recommendations all the time.

In such an A/B test, a direct comparison of the number of `widget_click` events in LiftIgniter's slice and the baseline slice is a reasonably fair measure of relative performance. Note that if the test is not 50/50, this will *not* be a fair comparison. Even if it is 50/50, the split may not be perfectly even, and we generally recommend using a CTR comparison for a clearer picture. However, widget click comparisons can sometimes capture various direct and indirect effects on *top* of the CTR differences. For more on direct and indirect effects that affect A/B test result interpretability, see [here](https://liftigniter.readme.io/docs/widget-shown#3-comparing-widget-shown-event-values-in-an-ab-tes).
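As a back-of-the-envelope sketch (the counts below are made up), normalizing each slice's `widget_click` count by its `widget_shown` count yields the CTR comparison recommended above, and removes the effect of an uneven split:

```javascript
// Illustrative only: hypothetical daily totals for each A/B slice.
// Dividing widget_click by widget_shown (i.e. computing CTR) corrects for
// the slices not receiving exactly the same amount of traffic.
var slices = {
  LI:   { widgetShown: 49200, widgetClick: 2460 },
  base: { widgetShown: 50800, widgetClick: 1270 }
};

function ctr(slice) {
  return slice.widgetClick / slice.widgetShown;
}

var lift = ctr(slices.LI) / ctr(slices.base) - 1;
console.log('LI CTR:   ' + (ctr(slices.LI) * 100).toFixed(2) + '%');   // 5.00%
console.log('base CTR: ' + (ctr(slices.base) * 100).toFixed(2) + '%'); // 2.50%
console.log('Relative lift: ' + (lift * 100).toFixed(0) + '%');        // 100%
```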
[block:api-header]
{
  "type": "basic",
  "title": "4. Derived metrics"
}
[/block]
For a more complete list of all the metrics and their relationships, see [Metrics summary](doc:metrics-summary).
[block:parameters]
{
  "data": {
    "h-0": "Derived metric",
    "h-1": "Numerator",
    "h-2": "Denominator",
    "h-3": "Supported by LiftIgniter?",
    "0-0": "Click-Through Rate (CTR)",
    "0-1": "Widget Click",
    "0-2": "Widget Shown",
    "0-3": "Yes, with [Tracking Widgets](doc:tracking-widgets) implemented",
    "1-0": "Visible Click-Through Rate (VCTR)",
    "1-1": "Widget Click",
    "1-2": "Widget Visible",
    "1-3": "Yes, with [Tracking Widgets](doc:tracking-widgets) implemented",
    "2-0": "Conversion-to-Click Ratio",
    "2-1": "Conversion (through click)",
    "2-2": "Widget Click",
    "2-3": "Yes, with [Tracking Widgets](doc:tracking-widgets) and [Tracking Conversion](doc:tracking-conversion) implemented."
  },
  "cols": 4,
  "rows": 3
}
[/block]
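Using hypothetical event counts purely for illustration, the ratios in the table work out as follows:

```javascript
// Made-up counts, used only to show how the derived metrics are computed.
var counts = {
  widgetShown: 40000,
  widgetVisible: 25000,
  widgetClick: 1000,
  conversionThroughClick: 50
};

var ctr  = counts.widgetClick / counts.widgetShown;    // CTR  = 0.025 (2.5%)
var vctr = counts.widgetClick / counts.widgetVisible;  // VCTR = 0.04  (4%)
var conversionToClick =
  counts.conversionThroughClick / counts.widgetClick;  // 0.05 (5%)
```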