Web Optimization

Visualizing Speed Metrics to Improve SEO, UX, & Revenue – Whiteboard Friday

Posted by sam.marsden

We know how important page speed is to Google, but why is that, exactly? With increasing benefits to SEO, UX, and customer loyalty that inevitably translates to revenue, there are more reasons than ever to both focus on site speed and become adept at communicating its value to devs and stakeholders. In today’s Whiteboard Friday, Sam Marsden takes us point-by-point through how Google understands speed metrics, the best ways to access and visualize that data, and why it all matters.

Video Transcription

Hi, Moz fans, and welcome to another Whiteboard Friday. My name is Sam Marsden, and I work as an SEO at web crawling platform DeepCrawl. Today we’re going to be talking about how Google understands speed and also how we can visualize some of the performance metrics that they provide to benefit things like SEO, to improve user experience, and to ultimately generate more revenue from your site.

Google & speed

Let's start by taking a look at how Google actually understands speed. We all know that a faster site generally results in a better user experience, but Google didn't actually start incorporating that directly into its algorithms until recently. It wasn't until the mobile speed update, back in July, that Google really started looking at speed. Even now it's likely only a secondary ranking signal, because relevance is always going to be much more important than how quickly the page actually loads.

But the interesting thing with this update was that Google confirmed some of the details about how they understand speed. We know that it's a mix of lab and field data. They're bringing in lab data from Lighthouse, via the Chrome dev tools, and mixing that with field data from anonymized Chrome users. That field data is available in the Chrome User Experience Report, otherwise known as CrUX.

CrUX metrics

Now this is a publicly available database, and it includes five different metrics. You've got first paint, which is when anything loads on the page. You've then got first contentful paint, which is when some text or an image loads. Then you've got DOM content loaded, which is, as the name suggests, once the DOM is loaded. You've also got onload, which is when any additional scripts have loaded. That's kind of like the full page load. The fifth and final metric is first input delay, and that's the time from when a user interacts with your site to when the browser is actually able to respond to that interaction.

These are the metrics that make up the CrUX database, and you can actually access this CrUX data in a number of different ways. 

Where is CrUX data?

1. PageSpeed Insights

The first and easiest way is to go to PageSpeed Insights. You just plug in whatever page you're interested in, and it returns some of the CrUX metrics along with Lighthouse data and a bunch of recommendations about how you can actually improve the performance of your site. That's really useful, but it only provides a snapshot rather than ongoing monitoring.
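
If you'd rather pull those field metrics programmatically than through the web interface, the PageSpeed Insights v5 API returns the same CrUX data. Here's a minimal sketch in Python; the page URL and API key are placeholders, and it assumes the response layout of the v5 API, where "loadingExperience" carries the CrUX field data.

```python
import requests

# Minimal sketch: fetch CrUX field data for one URL from the PageSpeed
# Insights v5 API. The URL and API key below are placeholders.
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

params = {
    "url": "https://www.example.com/",  # the page you're interested in
    "strategy": "mobile",
    "key": "YOUR_API_KEY",              # optional for light, ad-hoc use
}

resp = requests.get(PSI_ENDPOINT, params=params)
resp.raise_for_status()
data = resp.json()

# "loadingExperience" holds CrUX field data; "lighthouseResult" holds lab data.
for name, metric in data.get("loadingExperience", {}).get("metrics", {}).items():
    print(name, metric.get("percentile"), metric.get("category"))
```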

2. CrUX dashboard

Another way that you can access CrUX data is through the CrUX dashboard, and this provides all of the five different metrics from the CrUX database. What it does is it looks at the percentage of page loads, splitting them out into slow, average, and fast loads. This also trends it from month to month so you can see how you’re tracking, whether you’re getting better or worse over time. So that’s really good. But the problem with this is you can’t actually manipulate the visualization of that data all that much.

3. Accessing the raw data

To do that and get the most out of the CrUX database, you need to query the raw data. Because it's a freely available database, you can write a SQL query, put it into BigQuery, and run it against the CrUX dataset. You can then export the results into Google Sheets, pull that into Data Studio, and create all of these amazing graphs to visualize how the performance of your site is trending over time.



It might sound like a bit of a complicated process, but there are a load of great guides out there. You've got Paul Calvano, who has a number of video tutorials for getting started with this process. There's also Rick Viscomi, who's got a CrUX Cookbook, which is essentially a set of templated SQL queries where you just need to plug in the domains you're interested in and then put them straight into BigQuery.
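
To give a sense of what those cookbook-style queries look like, here's a minimal sketch that runs one from Python, assuming you have a Google Cloud project with BigQuery access. The project ID, origin, and fast/average/slow thresholds are placeholders to adjust to whatever cut-offs you report against; the query itself follows the shape of the public CrUX schema, where each metric is a histogram of bins with a density value.

```python
from google.cloud import bigquery

# Sketch of a CrUX query: the share of fast / average / slow first contentful
# paint loads for one origin in one monthly table (YYYYMM). The project ID,
# origin, and threshold values are placeholders.
QUERY = """
SELECT
  origin,
  ROUND(SUM(IF(fcp_bin.start < 1000, fcp_bin.density, 0)), 4) AS fast_fcp,
  ROUND(SUM(IF(fcp_bin.start >= 1000 AND fcp_bin.start < 3000, fcp_bin.density, 0)), 4) AS avg_fcp,
  ROUND(SUM(IF(fcp_bin.start >= 3000, fcp_bin.density, 0)), 4) AS slow_fcp
FROM
  `chrome-ux-report.all.201901`,
  UNNEST(first_contentful_paint.histogram.bin) AS fcp_bin
WHERE
  origin = 'https://www.example.com'
GROUP BY
  origin
"""

client = bigquery.Client(project="your-gcp-project")
for row in client.query(QUERY).result():
    print(f"{row.origin}: fast={row.fast_fcp} avg={row.avg_fcp} slow={row.slow_fcp}")
```

From there, the results can be exported to Google Sheets and pulled into Data Studio, as described above.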

Also, if you wanted to automate this process, rather than exporting it into Google Sheets you could pull the results into Google Cloud Storage and update the SQL query so it pulls in fresh data on a monthly basis. That's where you ultimately want to get to with this.
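
As a rough illustration of that kind of automation, the sketch below assumes the same placeholder project, a hypothetical dataset called crux_monitoring, and a Cloud Storage bucket of your own. It writes the query results for the most recent complete month into a table and exports them to Cloud Storage as CSV, so a scheduler could run it once a month.

```python
from datetime import date, timedelta
from google.cloud import bigquery

# Hypothetical project, dataset, and bucket names. CrUX tables are released
# with a lag, so this queries the previous calendar month (table names are YYYYMM).
PROJECT = "your-gcp-project"
last_month = (date.today().replace(day=1) - timedelta(days=1)).strftime("%Y%m")
dest_ref = bigquery.TableReference.from_string(
    f"{PROJECT}.crux_monitoring.fcp_{last_month}"
)

query = f"""
SELECT origin,
       ROUND(SUM(IF(fcp_bin.start < 1000, fcp_bin.density, 0)), 4) AS fast_fcp
FROM `chrome-ux-report.all.{last_month}`,
     UNNEST(first_contentful_paint.histogram.bin) AS fcp_bin
WHERE origin = 'https://www.example.com'
GROUP BY origin
"""

client = bigquery.Client(project=PROJECT)

# Run the query into a destination table, overwriting last run's results.
job_config = bigquery.QueryJobConfig(
    destination=dest_ref, write_disposition="WRITE_TRUNCATE"
)
client.query(query, job_config=job_config).result()

# Export that table to Cloud Storage as CSV for Data Studio or Sheets to pick up.
client.extract_table(
    dest_ref, f"gs://your-bucket/crux/fcp_{last_month}.csv"
).result()
```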

Why visualize?

Once you’ve got to this stage and you’re able to visualize the data, what should you actually do with it? Well, I’ve got a few different use cases here.

1. Get buy-in

The first is that you can get buy-in from management, from clients, from whoever you report to, for various optimization work. If you can show that you're lagging behind competitors, for example, that might be a good basis for getting some optimization initiatives rolling. You can also use the Revenue Impact Calculator, which is a really simple Google tool that allows you to put in various details about your site and then shows you how much more money you could be making if your site were X% faster.

2. Inform devs

Once you've got the buy-in, you can use the CrUX visualizations to inform developers. What you want to do here is show exactly where your site is falling down. Where are the problem areas? It might be, for example, that first contentful paint is suffering. You can go to the developers and say, "Hey, look, we need to fix this." If they come back and say, "Well, our independent tests show that the site is performing fine," you can point to the fact that this data comes from real users. This is how people are actually experiencing your site.

3. Communicate impact

Thirdly and finally, once you've got these optimization initiatives going, you can communicate the impact that they're actually having on performance and also on business metrics. You could trend these various performance metrics from month to month and then overlay various business metrics. You might want to look at conversion rates. You might want to look at bounce rates, etc. Showing those side by side lets you see whether they're improving as the performance of the site improves as well.

Faster site = better UX, better customer loyalty, and growing SEO benefit

These are different ways that you can visualize the CrUX database, and it’s really worthwhile, because if you have a faster site, then it’s going to result in better user experience. It’s going to result in better customer loyalty, because if you’re providing your users with a great experience, then they’re actually more likely to come back to you rather than going to one of your competitors.

There’s also a growing SEO benefit. We don’t know how Google is going to change their algorithms going forward, but I wouldn’t be surprised if speed is coming in more and more as a ranking signal.

This is how Google understands page speed, some ways that you can visualize the data from the CrUX database, and some of the reasons why you would want to do that.

I hope that’s been helpful. It’s been a pleasure doing this. Until the next time, thank you very much.

Video transcription by Speechpad.com

Social Media Marketing

A family tracking app was leaking real-time location data

A popular family tracking app was leaking the real-time locations of more than 238,000 users for weeks after the developer left a server exposed without a password.

The app, Family Locator, built by Australia-based software house React Apps, allows families to track each other in real time, such as parents or partners wanting to know where their children are. It also lets users set up geofenced alerts that notify them when a family member enters or leaves a particular location, such as school or work.

But the backend MongoDB database was left exposed and accessible to anybody who knew where to look.

Sanyam Jain, a security researcher and a member of the GDI Foundation, found the database and reported the findings to TechCrunch.

Based on a review of the database, each account record contained a user's name, email address, profile photo and their plaintext password. Each account also kept a record of their own and other family members' real-time locations, accurate to just a few feet. Any user who had a geofence set up also had those coordinates stored in the database, along with what the user called them, such as "home" or "work."

None of the data was encrypted.

TechCrunch verified the contents of the database by downloading the app and signing up using a dummy email address. Within seconds, our real-time location appeared as precise coordinates in the database.

We contacted one app user at random who, albeit shocked by the findings, confirmed to TechCrunch that the coordinates found under their record were accurate. The Florida-based user, who did not want to be named, said that the location stored was their place of business. The user also confirmed that a family member listed in the app was their child, a student at a nearby high school.

Several other records we reviewed also contained the real-time locations of parents and their children.

TechCrunch spent a week trying to contact the developer, React Apps, to no avail. The company's website had no contact details, nor did its bare-bones privacy policy. The website had a privacy-enabled WHOIS record, masking the owner's email address. We even bought the company's business records from the Australian Securities & Investments Commission, only to learn the business owner's name, Sandip Mann Singh, but no contact information. We sent several messages through the company's feedback form, but received no acknowledgment.

On Friday, we asked Microsoft, which hosted the database on its Azure cloud, to contact the developer. Hours later, the database was finally pulled offline.

It's not known exactly how long the database was exposed. Singh still hasn't acknowledged the data leak.
