Apple Has Earned a Customer for Life

Broken MacBook Pro hinge (due to glue failure)

I used to think that when people talked about the “legendary Apple customer service,” there was plenty of hyperbole thrown in for good measure.  That is, until it happened to me with my broken MacBook Pro hinge.

“Broken MacBook Pro Hinge” - Plenty of search results

When the screen on my late 2008 15” MacBook Pro started separating from the hinge, the first thing I did was search Google.  There I found more than enough search results to make me believe this was a widespread issue with this vintage of laptop.  And since the laptop was out of warranty, most of the results talked about re-gluing the aluminum screen cover to the hinge.

After trying to re-attach the hinge to the screen using epoxy, I headed over to the Apple store in King of Prussia, PA.  To say this first encounter at the Genius Bar was frustrating is an understatement.

You should’ve bought AppleCare

Apple cashiers, “Geniuses,” and fanboys alike are very big on pushing the AppleCare warranty, selling you with tales that Apple will fix anything in that extended time period.  While that may be true, extended warranties generally don’t pay off for the consumer, and as such, I don’t buy them.

Not that it would have mattered for me anyway.  My MacBook Pro is well beyond 3 years old, one of the first unibody models that came out.  You’d think the Apple “Genius” would’ve known that after checking the serial number, but instead he just kept repeating robotically:

“You should’ve bought AppleCare.  You should’ve bought AppleCare.”

Even when I asked, “Doesn’t a glue failure seem like a manufacturer’s defect?” or “I should’ve paid $349 for an extended warranty to protect against $0.05 of faulty glue?”

“You should’ve bought AppleCare.”

At that point, after being asked if I dropped the laptop, given a series of robotic answers, told I should’ve spent $349 on something that wouldn’t have fixed my problem, and generally treated like a monkey, I felt like smashing the laptop right on the Genius Bar just to make a scene.  Instead, I walked out feeling worse than when I arrived, crippled MacBook Pro in hand.

Maybe an Apple Certified Repair facility can help

Since I wasn’t going back for a second round of stupidity at the King of Prussia Apple Store, I decided to look up an independent shop to see what the cost of repair would be.  The repair guy immediately said “Oh, I’ve seen this a few times recently…it’s probably around $500-$600 to fix.”

$%^$&%*(#!  For $600, I’d be about 30-35% of the way to a new 15” MacBook Pro.  Again I left a store without doing anything, feeling worse than when I arrived.  I could either pay $600 for the repair or pay $2,000+ for the newer equivalent of my laptop.

One more trip to the Apple Store

Several weeks had passed and my laptop became pretty much unusable.  I decided to bite the bullet and pay to get the screen fixed.  I also decided to go back to an Apple Store (this time, in Ardmore, PA) to have them fix it.  I figured if I had to pay, I might as well guarantee it would get fixed properly.

When I walked up to the Genius Bar, the Apple “Genius” still asked me if I dropped my laptop (sidebar:  Is this part of the mind tricks they give everyone?  There isn’t a scratch on the thing, let alone any dents).  After the Apple employee looked over the laptop, I told him in my most dejected voice that I wanted to find out how much it was to replace the screen.

Apple Genius:  “How about ‘free’?”

I damn near fell off the stool I was sitting on.  How could the Apple Store in King of Prussia have been so unhelpful, and then 5 minutes into the same explanation I get an offer to have the screen fixed FREE at the Suburban Square Apple Store in Ardmore?

Apple Genius:  “And we can probably get this back to you by tomorrow.”

Needless to say, I didn’t want to do anything except hit ‘Accept’ on the electronic repair form.  I’ve come too far to mess this gift up!

Apple, you’ve earned yourself a lifetime customer

Maybe I got lucky.  Maybe it was perseverance.  Maybe this screen/hinge defect has shown up too many times in the last six weeks and Apple could no longer ignore it.

Maybe it’s because I asked twice at two different Genius appointments. Or maybe Apple has realized I’ve spent several thousand dollars with them in the past several years, with this MacBook Pro, iMac, several iPhones and an iPad.  That level of spend probably doesn’t even get me in the top 50% of non-business customers, but it’s not negligible either.

Whatever the reason, by comping me the $492.41, Apple has “bought” themselves a customer for life.


The cost of a broken MacBook Pro hinge? Apparently, $492.41!

Edit: To read the follow-up of what eventually became of this MacBook Pro, click here for an article about my replacement-battery interaction with Apple.

For Maximum User Understanding, Customize the SiteCatalyst Menu


Default Omniture report menu

Visits vs. Visitors vs. Unique Visitors…click-throughs, view-throughs, bounces…these concepts in digital analytics are fairly abstract, and many in business and marketing never really grasp the concepts fully.  Knowing the enormous amount of learning that needs to take place for digital success, why do we make our internal stakeholders hunt for data that’s organized by TOOL definitions, instead of by business function?

In this case, the “tool” I’m referring to is Omniture SiteCatalyst.  To be clear, there’s nothing egregiously wrong with the default menu structure in Omniture; it’s just that in my experience, end-user understanding can be greatly enhanced by customizing the Omniture menu.

Simple modifications such as 1) hiding Omniture variables and products not in use, 2) organizing reports by logical business function, and 3) placing custom reports and calculated metrics next to the standard SiteCatalyst reports will get users making decisions with their data that much faster.

1)  Hide Omniture variables and products not being used

Do your users a favor and hide Omniture products such as Test & Target, Survey, and Genesis if you aren’t using them.  The same goes for any custom traffic variables (props) and custom conversion variables (eVars) that aren’t being used.  Nothing will distract your users faster than clicking into folders that are effectively advertisements (T&T, Survey) or, worse, frustrate them by making them wonder, “What data is supposed to be in this report?”

Just by hiding or disabling these empty reports and tool advertisements, you should see increased confidence in data quality.  Or at the very least, keep the conversation from taking a detour.

2)  Organize SiteCatalyst reports by logical business function

Your internal users aren’t thinking about Omniture variable structures when they are trying to find the answer to their business questions.  So why do we keep our data artificially separated by “Custom Events”, “Custom Conversions” and “Custom Traffic”?

Worse yet, who remembers that the number of Facebook Likes can be found at “Site Metrics -> Custom Events -> Custom Events 21-30?”  And why are Facebook Likes next to “Logins”?  Does that mean Facebook Logins?  Probably not.

Wouldn’t it be better for our users to organize reports by business function, such as:

  • Financial/Purchase Metrics (Revenue, Discounts, Shipping, AOV, Units, Revenue Per Visit)
  • Usability (Browser, Percent of Page Viewed, Operating System)
  • SEO (Non-campaign visits, Referring Domains)
  • Mobile (Device, browser, resolution)
  • Site Engagement (Page Views, Internal Campaigns, Logins)
  • Site Merchandising (Products Viewed, Cart Add Ratio, Cross-Sell)
  • Social (Facebook Likes, Pinterest Pins, Visits from Social domains)
  • Paid Campaigns (Email, Paid Search, Display)
  • Traffic (Total Visits, Geosegmentation)

The list above isn’t meant to be exhaustive, or necessarily how you should organize your SiteCatalyst menus.  But for me, organizing the reports by the business function keeps my business thinking flowing, rather than trying to remember how Omniture was implemented by variable type.
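One way to think about the exercise: the customized menu becomes a mapping from business function to reports, so a user never needs to know which custom variable backs a report. The folder contents below are hypothetical examples chosen for illustration, not a recommended implementation.

```javascript
// Hypothetical business-function menu map; which report sits behind
// which SiteCatalyst variable is an assumption for illustration.
var menuByFunction = {
  'Social': ['Facebook Likes', 'Pinterest Pins', 'Visits from Social Domains'],
  'Site Engagement': ['Page Views', 'Internal Campaigns', 'Logins'],
  'Mobile': ['Device', 'Browser', 'Resolution']
};

// A user hunting for a report browses (or searches) by what the
// report means, not by "Custom Events 21-30".
function findFolder(menu, reportName) {
  return Object.keys(menu).filter(function (folder) {
    return menu[folder].indexOf(reportName) !== -1;
  })[0];
}
```

`findFolder(menuByFunction, 'Facebook Likes')` returns `'Social'`, which is exactly the lookup a business user performs mentally when the menu is organized this way.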

3)  Place custom reports and calculated metrics next to the standard SiteCatalyst reports

This is probably more like “2b” to the above, but there’s no reason to keep custom reports and calculated metric reports segregated either.  Custom reports happen because of a specific business need, and the same is true of calculated metrics.  By placing these reports alongside the out-of-the-box reports from SiteCatalyst, you take away the artificial distinction between data native to SiteCatalyst and business-specific data populated by a web developer.

Why wouldn’t you want to customize?

Shawn makes two great points in his post about (not) customizing the SiteCatalyst menu: users require special training and menu customization isn’t scalable.

Users need special training

Users need to be trained anyway.  I don’t think either of us is suggesting moving all of the menus around after an implementation has been in place for years…but if you’re a company just starting out, why not start off customized?

Fellow Keystoner Tim Patten also commented to me via Twitter DM about power users being used to the default menu and finding it annoying to have to learn a new one when switching companies.  I’m not really worried about power users, though; I’m thinking about the hundreds of users in thousands of organizations who can’t get beyond page views and visits.  Power users can pick up a new menu quickly, switch back to the default, or use the search box.

Menu customization isn’t scalable

This is very much true.  The larger the company, and the more complex and varied the tracking, the less scalable menu customization inevitably becomes.  This is probably an area where purpose-built dashboards are a much better strategy than customizing the menus.


For me, one of the first things I look for when working with a company looking to get their digital analytics program off the ground is whether they’ve customized their Omniture menu structure.  As a free customization, it’s something that companies should at least consider.  Organizing reports by business function requires a business to think about the questions they want to regularly answer, will keep novice users from focusing on implementation concepts, and overall is just better because it’s how I think 🙂

This blog post is a continuation of a Twitter conversation with Shawn C. Reed (@shawncreed), Jason Egan (@jasonegan), Tim Patten (@timpatten) and others.  Shawn’s counter-argument can be found here.  Jason wrote about Omniture menu customization a few years back.  And finally, if you want to read more pros-and-cons about SiteCatalyst menu customization, see the Adobe blog posts here and here.

Effect Of Modified Bounce Rate In Google Analytics

A few months back, Justin Cutroni posted on his blog some jQuery code that modifies how Google Analytics tracks content.  Specifically, the code snippet changes how bounce rate and time on site are calculated, creates a custom variable to classify whether visitors are “Readers” vs. “Scanners” and adds some Google Analytics events to track how far down the page visitors are reading.

Given that this blog is fairly technical and specific in nature, I was interested in seeing how the standard Google Analytics metrics would change if I implemented this code and how my changes compared to Justin’s.  I’ve always suspected my bounce rate in the 80-90% range didn’t really represent whether people were finding value in my content.  The results were quite surprising to say the least!

Bounce Rate - Dropped through the floor!


Starting April 24th, Bounce Rate drops considerably!

As expected, implementing the content tracking code caused a significant drop in bounce rate, due to counting scrolling as a page “interaction” using Google Analytics events.  Thus, the definition of bounce rate changed from single-page-view visits to visits where the reader doesn’t interact with the page by scrolling at least 150 pixels.
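The scroll-as-interaction idea can be sketched in a few lines. This is a minimal illustration of the behavior described above, not Justin’s actual snippet; the event names and the classic async ga.js `_trackEvent` wiring are assumptions here.

```javascript
// Minimal sketch: treat scrolling past a pixel threshold as a page
// "interaction" so the visit no longer counts as a bounce.
// (Illustrative only -- not the published snippet.)
var SCROLL_THRESHOLD_PX = 150;

function hasEngaged(scrollTopPx) {
  // A visitor who scrolls at least 150px is considered engaged.
  return scrollTopPx >= SCROLL_THRESHOLD_PX;
}

// Browser-only wiring, assuming the classic async ga.js queue (_gaq):
if (typeof window !== 'undefined') {
  var fired = false;
  window.addEventListener('scroll', function () {
    if (!fired && hasEngaged(window.pageYOffset)) {
      fired = true;
      window._gaq = window._gaq || [];
      // Last argument (opt_noninteraction) is false, so this event
      // counts as an interaction and cancels the bounce.
      window._gaq.push(['_trackEvent', 'Reading', 'StartReading',
                        window.location.pathname, 0, false]);
    }
  });
}
```

Firing the event with `opt_noninteraction = true` instead would preserve the original bounce-rate definition, which is exactly the implementation subtlety that can trip up anyone unaware the snippet is running.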

In the case of my blog, the bounce rate dropped from 80-90% to 5-15%!  This result tells me that people who arrive on-site aren’t arriving by accident, that they are specifically interested in the content.  Sure, I could’ve validated this using incoming search term research, but this provides a second data point.  The content I provide not only ranks well in Google, but once on-site also causes readers to want to see what the article contains.

Readers vs. Scanners

Even with the bounce rate drop above, I really don’t get a good feeling about whether people are actually reading the content.  Sure, people are scrolling 150px or more, but due to the ADHD nature of the web, plenty of people scroll without reading just to see what else is on the page!  That’s where the “Readers vs. Scanners” report comes in:


62% of visits only scan instead of read - Need to do better here!

The report above shows that only 38% of visits to the site actually READ an article, rather than just quickly scroll.  This is disappointing, but now that I’ve got the information being tracked, I can set up a goal in Google Analytics with the aim of improving the ratio of actual readers vs. quick scrollers.
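The “Readers vs. Scanners” split boils down to timing how long a visitor takes to get through the content. The sketch below is my own simplification; the 60-second cutoff is an assumed value, and the actual snippet’s threshold logic may differ.

```javascript
// Hypothetical classifier: visitors who reach the end of the content
// faster than a cutoff are "Scanners"; everyone else is a "Reader".
// The 60-second cutoff is an assumption, not the snippet's real value.
var READ_CUTOFF_MS = 60 * 1000;

function classifyVisitor(startReadingMs, endContentMs) {
  var elapsed = endContentMs - startReadingMs;
  return elapsed >= READ_CUTOFF_MS ? 'Reader' : 'Scanner';
}
```

The resulting label is what gets written to the Google Analytics custom variable, which is what makes reports like the one above segmentable by reader type.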

Average Visit Duration - Still useless

Like the bounce rate definition change above, average visit duration and average time on page also change definitions when using the jQuery content tracking code.  Because Google Analytics calculates time metrics by measuring the time between page views or events, adding more events to the page means all time-on-site metrics have to increase (by definition).


Hard to see because of the Y-axis, but Avg. Visit Duration increases significantly as well.

That said, average visit duration is still a pretty useless metric, given that an increase/decrease in this metric doesn’t immediately tell you “good” or “bad”…
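A toy calculation makes the “has to increase” point concrete, using GA’s hit-delta model of time. This is a sketch of the calculation, not Google Analytics’ internal code.

```javascript
// Google Analytics derives visit duration from the gap between the
// first and LAST recorded hit (pageview or event); any time spent
// after the final hit is never measured.
function visitDurationSec(hitTimestampsSec) {
  if (hitTimestampsSec.length < 2) {
    return 0; // a single-hit visit reports a duration of zero
  }
  var first = Math.min.apply(null, hitTimestampsSec);
  var last = Math.max.apply(null, hitTimestampsSec);
  return last - first;
}
```

A visit with one pageview at t=0 measures 0 seconds; add a scroll event 95 seconds in and the same visit now measures 95 seconds. Since scroll tracking can only add hits, the measured durations can only go up.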

Content Consumption “Funnel”

Finally, the last change that occurs when you implement the content tracking code is a series of Google Analytics events that measure how far down the page visitors are actually seeing.  This report, in combination with the Readers vs. Scanners report, helps understand reader engagement better than any generic “Time on Site” metric can do.


From this report, I can see that of the 2,102 articles loaded:

  • 89.4% of the articles have a “StartReading” event fired
  • 89.8% of those who start to read an article reach the bottom of the article.
  • 19.7% of those who reach the end of the article scroll past the comments to reach the true end of page

The first metric above is analogous to 1 minus the bounce rate: the percentage of articles viewed that don’t bounce.  The second metric (complete articles seen), with a success rate of 89.8%, is ripe for segmentation.  I stated above that only 38% actually READ an article, so segmenting the above report by “Readers” vs. “Scanners” will surely lower the success rate in the “Readers” population.
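Chaining the rounded percentages above back into approximate event counts is simple arithmetic (my calculation, not figures pulled from the report):

```javascript
// Multiply each step's conversion rate by the prior step's count to
// approximate the raw event counts behind the funnel percentages.
function funnelCounts(pageLoads, stepRates) {
  var counts = [];
  var previous = pageLoads;
  stepRates.forEach(function (rate) {
    previous = Math.round(previous * rate);
    counts.push(previous);
  });
  return counts;
}

// 2,102 article loads -> StartReading -> ContentBottom -> PageBottom
var counts = funnelCounts(2102, [0.894, 0.898, 0.197]);
// -> [1879, 1687, 332] (approximate, since the rates are rounded)
```

Roughly 332 visits out of 2,102 article loads ever touch the true bottom of the page, which frames the “<20%” observation below in absolute terms.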

Finally, that <20% actually touch the true bottom of page is surprising to me, since this blog really doesn’t get many comments!  If there were thousands of comments and the pages were really long, ok, no one sees the bottom…but here?  I’ll have to think about this a bit.

Great update to Google Analytics default settings!

Overall, my impression of the jQuery code snippet developed by Justin and others is that it is extremely useful for understanding how visitors interact with content sites.  The only downside I see is that it changes the definition of bounce rate within Google Analytics, which could be confusing to others who 1) aren’t aware of the code snippet running on-site or 2) don’t quite understand the subtleties of Google Analytics implementation with respect to Events and the non-interaction setting.

But since this is my personal blog, I don’t need to worry about others misinterpreting my Google Analytics data, so I’m going to keep this functionality installed!

Update 7/25/12:  Google Analytics published a similar method to the one described above, using “setTimeout” to modify bounce rate based solely on time-on-page.
