
Metrics After Click


We report all the standard metrics (widget shown, widget visible, widget click) and their associated derived metrics (CTR, VCTR, and visibility) restricted to those cases where the recommendation widget appeared on a page that the user arrived at by clicking on that same widget.
[block:api-header]
{
  "type": "basic",
  "title": "Simple example with two widgets"
}
[/block]
For instance, suppose your website has two widgets: `bottom-widget` and `right-rail`. A user clicks on `bottom-widget` and arrives at a new page. On that page, all events associated with `bottom-widget` (widget shown, widget visible, widget click) will be counted, respectively, in the shown after click, visible after click, and click after click counts for `bottom-widget`. The events associated with `right-rail` will *not* be counted as metrics after click.
[block:api-header]
{
  "type": "basic",
  "title": "The reasons for considering these metrics"
}
[/block]
There are two main advantages:

* **They measure our performance among engaged users who have definitely noticed the recommendation area**: One of the main problems with interpreting metrics such as CTR is that we don't know what fraction of users even noticed the recommendation area. Metrics after click restrict attention to users who have given a clear signal that they are interested in the recommendations, so performance on this engaged subset provides a complementary way of measuring our performance.
* **They are more robust, and less subject to day-to-day fluctuations arising from variation in traffic quality**: Say that one day, an article of yours goes viral on Reddit and gets a lot of shallow traffic. Overall CTR will probably go down, as the shallow "drive-by" traffic isn't interested in staying around on the site.
On the other hand, CTR after click is likely to remain stable, since the users who *do* click are probably just as likely to stay around.
[block:api-header]
{
  "type": "basic",
  "title": "Two reasons for data being unavailable or zero, plus a data size issue"
}
[/block]
If you see zero values or no data for the metrics after click, there are two possible reasons:

* **We are not recording clicked item views**: This could be because you are using [setNoTag](doc:psetnotag) in your tracking implementation to strip off URL tags, or because you are doing a URL rewrite or redirection after click. Either way, we are unable to identify, for a given pageview, whether it arose as a result of a click. Note that in this case, you will also not be able to get [10-second and 3-minute clicked item views](doc:10-second-and-3-minute-clicked-item-views).
* **We aren't showing the recommendation widget on the pages for the items we are recommending**: For instance, for a homepage recommendation widget, users who click on an item in the widget won't see the widget again on the page they arrive at, so all the "after click" metrics for this widget will be zero.

A further note: if your overall traffic level or your CTR on the widget is low, you may get too few "after click" events for the metrics to be statistically robust. For instance, if your widget is shown 10,000 times a day and gets a 1% CTR, you are effectively operating with 100 "shown after click" events per day. With such a small sample, the CTR after click won't be statistically robust and you won't be able to glean much meaning from it. You can address this problem by aggregating data over longer time periods (e.g., viewing at a weekly instead of a daily granularity).
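To see why 100 events per day is too few, here is a rough back-of-the-envelope sketch. The traffic figures are the hypothetical ones from the paragraph above, and the 20% after-click CTR is an assumed value for illustration, not real data:

```python
import math

def ctr_standard_error(clicks: int, shown: int) -> float:
    """Approximate standard error of an observed CTR, treating each
    widget-shown event as an independent Bernoulli trial."""
    p = clicks / shown
    return math.sqrt(p * (1 - p) / shown)

# Hypothetical daily figures from the text: 10,000 widget shows at 1% CTR
# leave only about 100 "shown after click" events per day.
# Suppose the observed CTR after click is 20% (20 clicks out of 100 shows):
daily_se = ctr_standard_error(clicks=20, shown=100)    # 0.04, i.e. +/- 4 points

# Aggregating a week of data (~700 shows, ~140 clicks) shrinks the error:
weekly_se = ctr_standard_error(clicks=140, shown=700)  # ~0.015, i.e. +/- 1.5 points
```

A one-standard-error band of ±4 percentage points on a ~20% estimate is too wide to act on from day to day, while the weekly aggregate is considerably tighter, which is why coarser granularity helps.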
[block:api-header]
{
  "type": "basic",
  "title": "Comparing these values with the overall values"
}
[/block]
As a general rule, here is how the counts compare (unless the data is unavailable or zero for the reasons discussed above):

click after click < visible after click < shown after click ~ widget click < widget visible < widget shown

More interesting are the ratios between the numbers. Usually, the CTR after click is *much* larger than the overall CTR, and similarly for VCTR and visibility. Essentially, people who have clicked once are much more likely to send further signals of engagement with the recommendations.
[block:parameters]
{
  "data": {
    "h-0": "Metric after click",
    "h-1": "Typical range",
    "h-2": "Associated overall metric",
    "h-3": "Typical value of ratio",
    "0-0": "CTR After Click",
    "0-1": "5% to 50%",
    "0-2": "CTR",
    "0-3": "CTR After Click is between 2 and 20 times the CTR",
    "1-0": "VCTR After Click",
    "1-1": "6% to 60%",
    "1-2": "VCTR",
    "1-3": "VCTR After Click is between 1.5 and 10 times the VCTR",
    "2-0": "Visibility After Click",
    "2-1": "30% to 100%",
    "2-2": "Visibility",
    "2-3": "Visibility After Click is between 1 and 3 times the visibility"
  },
  "cols": 4,
  "rows": 3
}
[/block]
[block:api-header]
{
  "type": "basic",
  "title": "Examples"
}
[/block]
Here's a comparison of CTR with CTR after click. The CTR after click for LiftIgniter, at 32.82%, is over 6 times LiftIgniter's overall CTR of 5.12%. Data is from the LiftIgniter [Analytics Panel](doc:analytics-panel).
[block:image]
{
  "images": [
    {
      "image": [
        "https://files.readme.io/21e7057-ctr-to-compare-with-after-click.png",
        "ctr-to-compare-with-after-click.png",
        1690,
        716,
        "#d4e3f9"
      ],
      "caption": "Overall CTR numbers"
    }
  ]
}
[/block]

[block:image]
{
  "images": [
    {
      "image": [
        "https://files.readme.io/34f910a-ctr-after-click.png",
        "ctr-after-click.png",
        1692,
        716,
        "#d4e3f9"
      ],
      "caption": "CTR after click numbers"
    }
  ]
}
[/block]
Here are the corresponding VCTR numbers.
[block:image]
{
  "images": [
    {
      "image": [
        "https://files.readme.io/0e5cfe1-vctr-to-compare-with-after-click.png",
        "vctr-to-compare-with-after-click.png",
        1692,
        712,
        "#d4e3f9"
      ],
      "caption": "Overall VCTR numbers"
    }
  ]
}
[/block]

[block:image]
{
  "images": [
    {
      "image": [
        "https://files.readme.io/23209a8-vctr-after-click.png",
        "vctr-after-click.png",
        1692,
        714,
        "#d4e3f9"
      ],
      "caption": "VCTR after click numbers"
    }
  ]
}
[/block]
Here are similar numbers for visibility.
[block:image]
{
  "images": [
    {
      "image": [
        "https://files.readme.io/b1a16fb-visibility-to-compare-with-after-click.png",
        "visibility-to-compare-with-after-click.png",
        1696,
        714,
        "#d4e3f9"
      ],
      "caption": "Overall visibility numbers"
    }
  ]
}
[/block]

[block:image]
{
  "images": [
    {
      "image": [
        "https://files.readme.io/cd7c739-visibility-after-click.png",
        "visibility-after-click.png",
        1694,
        720,
        "#d4e3f8"
      ],
      "caption": "Visibility after click numbers"
    }
  ]
}
[/block]
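As a way of tying the raw counts to the derived metrics discussed above, here is a minimal sketch. All of the counts are invented for illustration; they are chosen only so that the resulting CTRs roughly line up with the 5.12% and 32.82% figures from the panel screenshots, and so that the count ordering from the comparison section holds:

```python
def derived_metrics(shown: int, visible: int, clicks: int) -> dict:
    """Compute the three derived metrics from the three raw counts."""
    return {
        "visibility": visible / shown,  # fraction of shows that were actually seen
        "ctr": clicks / shown,          # clicks per widget shown
        "vctr": clicks / visible,       # clicks per widget visible
    }

# Made-up overall counts (illustrative only, tuned to a 5.12% CTR):
overall = derived_metrics(shown=100_000, visible=60_000, clicks=5_120)

# Made-up after-click counts: far fewer events, but much higher ratios
# (tuned to a ~32.8% CTR after click, with shown after click ~ widget click):
after_click = derived_metrics(shown=5_000, visible=4_500, clicks=1_640)

ratio = after_click["ctr"] / overall["ctr"]  # about 6.4, i.e. "over 6 times"
```

Note that these invented counts also respect the typical ratio ranges in the table: VCTR after click comes out around 4 times the overall VCTR, and visibility after click around 1.5 times the overall visibility.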