Anyone can alter Google’s search results. This article will explain how.

In 2012 Google introduced what it calls the Knowledge Graph. The change in policy announced Google’s use of three sources of information to obtain knowledge for search results: Wikipedia, the CIA World Factbook and Freebase. Wikipedia appears to be the dominant source, and Google’s dependence upon Wikipedia caused the incident described below.

House Majority Leader Kevin McCarthy (R-Calif.) on Thursday went after Google for displaying “Nazism” as one of the ideologies of the California Republican Party.

A search on the site for “California Republican Party” apparently returned a sidebar result listing Nazism as an ideology alongside “conservatism” and “market liberalism.”

McCarthy noted the sidebar in a tweet at the company.

“Dear @Google, This is a disgrace #StopTheBias,” McCarthy tweeted, accompanied by a screenshot showing Nazism listed among the California Republican party’s ideologies alongside values like “conservatism” and “market liberalism.”

Ideologies associated with the California GOP are no longer visible in the sidebar on Google’s results page when users search “California Republican Party” or in similar searches.

The sidebar, which Google calls the “knowledge panel,” is often populated by content from Wikipedia; however, no mention of Nazism is visible on the California GOP’s page there. The edit history of the party’s Wikipedia page, though, shows that “Nazism” was briefly added to the page on Thursday.

Google blamed the result on online “vandalism” from outside the company that slipped through its safeguards.

“This was not the result of any manual change by anyone at Google. We don’t bias our search results toward any political party. Sometimes people vandalize public information sources, including Wikipedia, which can impact the information that appears in search,” a Google spokesperson said in a statement.

“We have systems in place that catch vandalism before it impacts search results, but occasionally errors get through, and that’s what happened here. This would have been fixed systematically once we processed the removal from Wikipedia, but when we noticed the vandalism we worked quickly to accelerate this process to remove the erroneous information,” they added.

The mistake comes in advance of California’s statewide primary elections, which are set for next week.

Other conservatives, including President Trump, expressed outrage over the Nazism listing.

Anyone can post a Wikipedia article. ABCO Technology’s web development program will teach you how. If you are interested in learning advanced website techniques for promoting your business online, call our campus. You can reach us by telephone between 9 AM and 6 PM Monday through Friday at: (310) 216-3067.

Email all questions to: info@abcotechnology.edu

Financial aid is available to all students who qualify for funding.

ABCO Technology is located at:
11222 South La Cienega Blvd. STE #588
Los Angeles, CA 90304

 

Build engaging and highly visible webpages today!

Optimizing YouTube videos

As we approach Memorial Day, many of our readers may want to put a video online about their business, activity or cause. I decided to reprint a chapter from my book, the Search Engine Marketing Guide for the Small Business. If you like this chapter, please consider giving our school a call. My book has 32 chapters filled with information about optimizing for search engines. Today we’ll discuss optimizing your YouTube video.

Chapter 24 Optimizing YouTube Videos

Google has provided statistics showing that Internet users spend more time on YouTube than they do on Google. YouTube states it receives over forty million visitors per day. YouTube videos are entertaining. The information presented on all search engines appeals to three types of learners: visual, auditory and kinesthetic. Understanding this concept will be of significant benefit to your organization’s bottom line. This chapter will show you how to make your videos stand out for your chosen key phrases on YouTube.

It’s no secret that YouTube is a traffic source with almost limitless potential. There’s only one problem: YouTube is crowded with video content. According to YouTube, over 100 hours of video are uploaded to this search engine every minute. When you take into consideration the large amount of content displayed on YouTube, the key question is how to get customers to view your videos. The answer is to optimize your YouTube video. It probably will not surprise you to learn that your competition simply uploads their videos to YouTube and hopes for the best. After reading this chapter you will be able to do much better than that.

Before we dive into how you can optimize your videos for YouTube, let’s review some of the most important ranking factors that YouTube uses.

YouTube’s ranking is not nearly as complex as Google’s famous 160 ranking signals published in 2015, but YouTube’s algorithm is much more focused. It compiles dozens of signals into a weighting system for ranking videos in YouTube search and for suggested videos, like this:

[Screenshot: YouTube sidebar of suggested videos]

Now let’s get started. We will begin with a basic YouTube primer.

YouTube Ranking Factors

Here’s some information to keep in mind about the trust signals YouTube uses. Below is a brief list of the most important signals YouTube uses to rank videos:

•Title tag information

•Audience retention

•Keywords in description tag

•Tags

•Video length

•Number of subscribers after watching

•Comments

•Likes and dislikes


It’s time to show you five strategies for leveraging these trust signals to bring more traffic to your videos…and ultimately to your site.

1. Write Super-Long content for your Video Descriptions

Please note that YouTube and Google can’t watch or listen to your video. The search engine can only understand the contents of your video by what you write about it.

That means that they heavily lean on the HTML text surrounding the video to index your video’s topic. That’s why it disturbs me to see extremely brief video descriptions. For example, a short description like “Japanese food” gives YouTube almost nothing to rank. The more YouTube knows about your video, the more confidently it can rank it for your chosen keywords. Even more important to your ranking, YouTube uses keywords in the description to rank you for super-long-tail keywords. A better-ranking video description would look like this: “Japanese food West Los Angeles including: Sushi, Sashimi and Tempura under $25.” If you owned a Japanese restaurant serving these dishes at these prices, your video would be viewed by potential customers who were hungry for Japanese food. YouTube videos are designed to be niche specific.

2 Find “Video Keywords”

As with normal SEO, the YouTube SEO process begins with keyword research.

Your goal is to find keywords that have YouTube results on the first page of Google. The keyword tools we provided will be of great help to you in selecting those niche key phrases.

These are called “Video Keywords”.

Unlike a normal SERP with 10 webpage results, for Video Keywords Google reserves a large portion of the first page for video results:

[Screenshot: Google video results on the first page of the SERP]

In general, Google tends to use video results for these types of keywords:

•How-to keywords (“how to install a dish TV antenna”)

•Reviews (“Islands Restaurant review”)

•Tutorials (“Setting up your YouTube Channel”)

•Anything fitness or sports related (“long distance running”)

•Funny videos (“Cute animals”)

Let’s suppose you optimize your video around a keyword that doesn’t have any video results in Google.

In this case, you’ll ONLY get traffic from people searching on YouTube. This will cut your traffic results in half.

If you optimize for Video Keywords, you’ll also receive targeted traffic to your video directly from Google’s first page.

How to Find those High Traffic Video Keywords

The easiest way to find video keywords is to search for keywords in your niche. Then see if any of the keywords you searched for have YouTube video results, like this:

[Screenshot: YouTube results in Google for your selected key phrase]

Does this sound simple enough?

Once you’ve found the right Video Keywords, it’s time to check if there’s enough search volume for that keyword.

Since videos don’t take a ton of time to put together, you don’t need to go after keywords with massive search volumes.

Just make sure your target keyword gets at least 300 searches per month in Google (you can find this information using the Google Keyword Planner):

[Screenshot: Google Keyword Planner results]

Why 300 searches per month?

If a keyword delivers at least 300 searches per month, then you know it also gets a decent amount of searches within YouTube itself. The free Keyword Suggestion tool by WordStream is another excellent tool for verifying the numbers you get from Google’s Keyword Planner.

If you can get that video to rank in Google, then a lot of those 300+ monthly searches will click on YOUR video from the search results.

That means you’ll get more high-quality traffic to your video, and ultimately, your site.

3 Make your (great) Video

When manufacturing a product for your business, the more effort you put into your product, the better your return on investment. This same philosophy also applies to your video.

If your budget can afford a professional look, you might want to consider hiring a professional videographer for the day, paying an editor to add graphics, or renting a studio…

YouTube has facilities around the country that offer this service at a reasonable cost. In fact a YouTube facility opened near me in the suburb of Playa Vista, California.

If you’re on a really tight budget, you can record your voice over a PowerPoint presentation using Screencast-O-Matic ($15/year). This service will deliver professional quality at a very reasonable price.

I’m strongly emphasizing quality because user engagement is THE most important YouTube ranking signal. Again, to repeat: Google can’t watch your presentation, but it can digitally judge the reaction to it.

If your video stinks, it won’t rank…no matter how optimized it is for SEO.

Unlike Google — which can use backlinks and other signals to evaluate the quality of a piece of content — YouTube doesn’t have this luxury.

They judge your video’s quality based on how people interact with it.

The User Experience Metrics That YouTube Uses

Here’s what YouTube uses to determine the quality of your video:

•Video retention: The percentage of your video people tend to watch (the more, the better).

•Comments: If people comment, they probably enjoyed the video (or at least watched it).

•Subscriptions after watching: If someone subscribes to your channel after watching your video, that sends a HUGE trust signal that you have an amazing video.

•Shares: How many people share your video on social media sites like Twitter and Google+.

•Favorites: The number of people that favorite your video or add it to their “Watch Later” playlist.

•Thumbs up/Thumbs down: Self-explanatory 🙂

If you want to see how your videos are performing, you can see user experience data in your YouTube Analytics:

[Screenshot: user experience data in YouTube Analytics]

If you make a top-notch video you’re highly likely to get top-retention views, likes, comments and all the things that YouTube likes to see in a video.

Make Your Video at Least 5 Minutes Long

Similar to text-based articles, longer videos rank better.

Since 2013, I’ve consistently observed longer videos outperforming shorter videos in YouTube and Google search.

For example, if you search in YouTube for the keyword “WordPress”, 3 out of the 4 top videos are an hour long.

So make a video that runs for at least 5 minutes.

If it makes sense for your video to be even longer than that, go for it. Don’t worry about your video being too long. If it’s awesome, people WILL watch it.

OK, so you’ve created your compelling video that’s 5 minutes or longer. Good job :-)

4 Optimize Your Video

Now it’s time to optimize your video and upload it to YouTube.

Here’s how to extract the most SEO value from your video.

Video Filename

When you’re done with the video make sure that you use your chosen keyword in the video’s filename.

For example, if you wanted to rank for the keyword “computer training Los Angeles”, you’d want to name your video computer_training_Los_Angeles_video.mp4.


Video Title

The title of your video should be at least 5 words long. That way, you can include your full keyword phrase without keyword stuffing.


Power Tip: Like with a blog post, I’ve found that you get a slight video SEO boost by putting your keyword at the beginning of the title.

So if you were trying to rank for “surfing tutorial” you’d want a title like: “Surfing Tutorial: Learn How to Ride a Wave Today”.

Description

Your video’s description is VERY important.

Because Google and YouTube can’t “listen” to videos, they rely on your text description to determine your video’s content.

Here are the basic guidelines for the description:

•Put your link at the very top of the description (this maximizes CTR to your site)

•Include your keyword in the first 15 words

•Make the description at least 300 words

•Include your keyword 3-4 times

This SEO-optimized description helps tell Google and YouTube what your video is about without being spammy.

Tags

Tags aren’t super-important…but they help.

Just include a few keywords to help YouTube and Google learn what your video is about.

Targeted Tags

Targeted tags will not only help you rank for your target keyword…

…but they will get you to show up more often as a related video in the sidebar area of YouTube:

[Screenshot: the YouTube sidebar of related videos]

When the video someone’s watching has a similar tag to your video – boom! – you’re added to the sidebar.

Once you’re done, hit “Save Changes” and your video will go live!

5 Get Video Views

We’ve talked a lot about user experience signals so far…and they are really important.

In order for your video to rank for competitive keywords, it needs to receive A LOT of views.

More views=higher rankings.

But there’s one catch…

…the views need to be real.

YouTube has caught onto fake views. That’s why I don’t recommend using a service on Fiverr to pump up your view count.

As we already discussed, long-retention views are worth their weight in gold.

Here are a few strategies you can use to get targeted views to your video:

Mention Your Video on Quora and Other Q&A Sites

Quora, Yahoo Answers, and other Q&A sites are some of the most popular sites on the web (Quora rocks a top 500 Alexa ranking).

But if you try to go in there and plaster links all over the place, you’re going to get banned in a flash unless you link to YouTube.

Because you’re posting your video in a place where people are desperate for information on a given topic, the views you’ll get are extremely high-retention.

Just search for a question on your video’s topic:

[Screenshot: Quora search results]

And add a link to your video. Or better yet, embed it into your answer:

[Screenshot: a video embedded in a Quora answer]

Link to Your Video in Your Email Signature

People that email you (like your family or wife) generally like you.

And if you’re like me, you get A TON of emails.

So when you add a link to your latest video in your email signature, that means you get high-retention views like they’re going out of style.


Embed Your Videos in Blog Posts

Whenever you write a blog post (on your site or as a guest post for another site), think to yourself:

“Where can I embed a YouTube video into this post?”

As we stated at the beginning of this chapter, Google reports its users spending more time on YouTube than searching on Google. If your product translates well to video, use videos for promotion.
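
If you haven’t embedded a video before, here is a minimal sketch of the standard YouTube embed markup you can paste into a post. The video ID and dimensions below are placeholders; YouTube’s Share > Embed dialog generates the exact snippet for your own video.

<!-- Standard YouTube embed; YOUR_VIDEO_ID is a placeholder for your video's ID -->
<iframe width="560" height="315"
        src="https://www.youtube.com/embed/YOUR_VIDEO_ID"
        title="Descriptive, keyword-relevant title for the video"
        frameborder="0" allowfullscreen></iframe>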

If you want to build websites or make videos that are highly visible to your audience, call ABCO Technology. You can reach our campus by telephone at: (310) 216-3067 from 9 AM to 6 PM Monday through Friday.

Email all questions to info@abcotechnology.edu

Financial aid is available to all students who qualify for funding.

ABCO Technology is located at:
11222 South La Cienega Blvd. STE #588
Los Angeles, CA 90304

Build highly visible webpages and videos today!

Mobile vs desktop web pages have different search engine rankings

As if we didn’t have enough to think about with respect to any given SEO campaign, it is now imperative to separate and refine your approaches to mobile and desktop search.

While mobile has grown to 70% of all searches over the past five years, this shouldn’t come at the expense of desktop. Although SEO for mobile and desktop follow the same basic principles and best practices, there are nuances and discrepancies that need to be factored into your overall ranking strategy.

Part of this is keyword rankings: you won’t ever know how to adapt your strategies if you’re not tracking the rankings separately for each. Research from BrightEdge found that 79% of listings have a different rank on mobile devices compared with desktop, and the top-ranking result for a query is different on desktop and mobile 35% of the time. These are statistics that simply cannot be ignored.

Why are they different?

Before delving into how to compare keyword rankings on mobile and desktop, it’s important to acknowledge the why and the what: why they rank differently and what that means for your SEO strategy.

It’s paramount to understand that desktop and mobile searches use different algorithms. Ultimately, Google wants to provide the best user experience for searchers, whatever device they are using. This means creating a bespoke device-tailored experience and in order to do that, we need to delve deeper into user intent.

It’s all about user intent

The crux of the mobile versus desktop conundrum is that user intent tends to differ for each device. This is particularly important when considering how far along the funnel a user is. It’s a generalization, but overall mobile users are often closer to the transactional phase, while desktop users are usually closer to the informational phase.

For example, we can better understand user intent on mobile by understanding the prevalence of local search. If a user is searching for a product or service on mobile, it is likely to be local. In contrast, users searching for a product or service on desktop are more likely to be browsing non-location-specific ecommerce sites.

Let’s also consider the types of conversions likely to occur on each device, in terms of getting in touch. Users on mobile are far more likely to call, by simply tapping the number which appears in the local map pack section. Alternatively, desktop users would be more inclined to type an email or submit a contact form.

What on earth is a micro-moment?

To better understand the different ways in which consumers behave, it may help to spend a little time familiarizing yourself with micro-moments. These refer to Google’s ability to determine a searcher’s most likely intent, and are particularly important for mobile users, where a consumer often needs to take immediate action.

For example, if a user is searching for a local product or service, the local map pack will appear, but if they are searching for information then the quick answer box will appear. These micro-moments therefore have a significant impact on the way the SERPs are constructed.

Once you’ve understood the user intent of a given searcher, you can ensure that you are providing content for both mobile and desktop users. However, it’s worth bearing in mind that content with longer word counts continues to perform well on mobile, despite the general consensus that people on mobile simply can’t be bothered to consume long form content. This harks back to Google’s prioritization of high quality content. Besides, anybody who has a long train commute into work will understand the need for a nice, long article to read on mobile.

Rankings tools

With that context, we can now return to the matter at hand: rankings. Of course, you could record the rankings for both desktop and mobile the old-fashioned way, but who has time for that? In short, any good SEO tool worth its salt will enable you to track both desktop and mobile rankings separately. Here are some favorites:

◾SEMRush is a personal favorite among the plethora of fancy SEO tools. SEMRush provides a comprehensive breakdown of mobile vs desktop results (as well as tablet if you really want to geek out) and displays the percentage of mobile-friendly results for your domain.

◾Search Metrics offers Desktop vs. Mobile Visibility metrics, detailing individual scores for desktop and mobile, as well as overlap metrics which show how many keyword search results appear in exactly the same position for both. You can also drill down further to view how a website performs with regard to localized results.

◾Moz. Through Moz Pro, you can track the same rankings metrics for both desktop and mobile. Filter by labels and locations to dig further into the data.

◾Google Search Console. Don’t have access to any of the above tools? Don’t panic – you can still rely on the trusty Google Search Console. When looking at your search analytics, filter devices by comparing mobile and desktop. Even if you do have access to an SEO tool that allows you to do comparison analysis, it’s still worth checking in on your Search Console insights.

Rankings are only part of your overall page strategy.

It’s important to remember that rankings are basically a tiny part of the picture; it’s essential to take a more holistic approach to the mobile vs desktop issue. This means taking the time to dig around Google Analytics and unearth the data and meaning beyond the vanity metrics.

You may have higher rankings for mobile, but those users might be bouncing more regularly. Is this a reflection of user intent, or is it a poor user experience? Do higher rankings for one device correlate with higher conversions? If not, then you need to consider the reasons for this. There’s no one-size-fits-all answer, so you must take a tailored approach to your search engine strategy.

ABCO Technology teaches a comprehensive course for web development. Call our campus between 9 AM and 6 PM Monday through Friday. You can reach our campus at: (310) 216-3067.

Email all questions to: info@abcotechnology.edu

Financial aid is available to all students who qualify for funding.

ABCO Technology is located at:

11222 South La Cienega Blvd. STE #588

Los Angeles, CA 90304

 

Learn to build highly visible webpages today!

Google improves mobile indexing

Google announced yesterday that after a year and a half of testing it was beginning a wider rollout of its mobile-first indexing and had started migrating sites that follow the best practices for mobile-first indexing.

Google started to move a small number of sites over late last year, but this is the first announcement of what seems to be a larger scale move.

Sites which are migrating will be notified via a message in Search Console:

[Screenshot: the mobile-first indexing notification in Search Console]

Google said site owners would see a significantly increased crawl rate from the smartphone Googlebot, and that Google would show the mobile version of pages in search results and in Google cached pages. It said that for sites which have AMP and non-AMP pages, Google would favor the mobile version of the non-AMP page.

Google moved to reassure site owners who are not included in this rollout that rankings would not be affected and that sites which only have desktop content would still be indexed.

“Sites that are not in this initial wave don’t need to panic. Mobile-first indexing is about how we gather content, not about how content is ranked. Content gathered by mobile-first indexing has no ranking advantage over mobile content that’s not yet gathered this way or desktop content. Moreover, if you only have desktop content, you will continue to be represented in our index.”

Source: https://webmasters.googleblog.com/2018/03/rolling-out-mobile-first-indexing.html

However, the push towards mobile friendly sites continues with Google noting that mobile friendly content can perform better, and that slow loading content will be a ranking factor for mobile searches from July.

You can find more information about best practices for mobile-first indexing in Google’s developer documentation.
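
As one small, concrete example of those best practices (this is a general responsive-design basic rather than anything specific to this announcement), a mobile-friendly page normally declares a viewport in its <head>:

<!-- Tells mobile browsers to render the page at device width rather than a zoomed-out desktop layout -->
<meta name="viewport" content="width=device-width, initial-scale=1">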

Google says it will continue to have one single index and there won’t be a mobile-first index separate from the main index. Historically it was the desktop version that was indexed but increasingly Google will now be using the mobile version of content, responding to the growth in use of mobile devices.

See Google’s blog post for full details.

ABCO Technology teaches a comprehensive course for mobile web site development. Call our campus between 9 AM and 6 PM Monday through Friday. You can reach us at:

(310) 216-3067.

Email all questions to: info@abcotechnology.edu

Financial aid is available to all students who qualify for federal funding.

 

Build highly visible mobile web sites today!

Facebook in competition with Google for Local search

When you think local SEO, you think Google. But another big name has been making some moves lately to enter the conversation, and that’s Facebook.

In the past few years, Facebook has made a lot of strides to become a real player in local search, improving its search results to the extent that they rival Google’s. Meanwhile, Google has made investments in Google My Business that justify business owners devoting time to it rather than treating it like a defunct social media listing.

Both of these trends bode well for the impact of search on social, and of social on search.

Let’s review some of the most recent changes in local SEO from Google and Facebook.

Prioritizing local news for community engagement

At the end of January 2018, shortly following Mark Zuckerberg’s announcement that Facebook would be demoting content from brands and publishers in favor of posts from family, friends, and groups, the social media giant announced that it was tweaking its algorithm to also highlight local news in the News Feed.


The focus on aggregating and finding local news indicates Facebook plans to double down on local search. The more they can pick up on local search signals, the better they can provide hyper-localized, relevant content for their users. Consumers increasingly expect more personalization, and assuming (like Facebook does) that there is a correlation between personalization and hyper-localized content, this change will make their platform a more valuable source of information for their users. The more valuable the information on the platform, the likelier that user base is to stick on it, using it as both a local search engine as well as a place for updates on friends and family. Let the advertising dollars roll in.

In his announcement, Zuckerberg said,

“People constantly tell us they want to see more local news on Facebook.”

Apparently Facebook wasn’t the only one listening, as earlier that same week, Google launched its own local news app.

Currently only available in two cities, the free Google Bulletin app lets users post news updates and upload photos and video about events and happenings in their area. The app essentially combines the social community features of Nextdoor with the You Report It feature many local news sites rely on to crowdsource content.


With Bulletin, Google may well be hoping to encourage users to visit it first as the source of immediate information, instead of turning to Facebook as people so often do during an emergency or to find a local event.

Google Bulletin and Facebook’s prioritization of local news are also a strong response to the pressure both companies received for disseminating fake news during the 2016 U.S. election. Both are making the same assumption – that hyperlocal necessarily means more relevant and, since it’s coming from news sources, more trustworthy.

However, both initiatives are in their early days and their assumptions don’t seem fully fool-proof. Facebook’s algorithm currently determines something is local news by noting the domain, and then seeing whether users from a concentrated geographical area engage with the content – a setup which should be fairly easy to game. Meanwhile, there’s currently no vetting process on Google Bulletin that would prevent users from uploading inaccurate information.

Crowdsourcing content to inform business listings

Besides news sources, both Facebook and Google are relying on crowdsourced information to complete, categorize, and rank the business listings in their database. On either platform, users can add places, update address information and hours, write reviews, and answer questions about the business. Then, the platform uses this information to determine the most relevant result based on a searcher’s query, their location, and even local time.

Both Google and Facebook provide robust results that display helpful attributes sourced by user reviews, ratings, and busy times.


Facebook also includes additional filters based on whether your friends have also visited a place – bringing the social into search.


Facebook’s City Guides do the same at a macro-level, providing trip planning for various large cities around the world, and showing the top places your friends as well as locals have explored.


Launched in November 2017, the Facebook Local tab incorporates local event results along with the business listings and displays which of your friends are attending. This hyper social aspect, as opposed to hyperlocal, is a unique differentiator that gives Facebook real value as a local search engine.

To its credit, Google has been working on ways to make its own search results more social. One of the biggest changes they introduced to Google My Business in 2017 was the Q&A feature. Users can click a button to ask questions about a business, which are then available to be answered by anyone, including the business itself, as well as local guides, regular Google users, and even competitors.

[Screenshot: Google My Business Q&A answers]

The fact that anyone can answer leads to misinformation, or less than helpful information, as in the last example shown above (“Depends what you order”). Google’s attempt to introduce social discussion to its local business listings shows a singular lack of foresight, similar to its failure to include a vetting process with Bulletin.

In their defense, Google may be dealing with information overload. Each month, 700,000 new places are added to Google Maps. They’ve turned to users to help, but they’ve needed to incentivize users to get the information they need, rather than crowdsourcing it as Facebook has successfully done with Facebook Local. As users answer questions on Google, upload photos, and edit business information, they earn points that designate them as Local Guides – points they can exchange for early access to Google initiatives, exclusive events, and real monetary benefits like free storage on Google Drive.

Helping businesses convert users from their listings

We’ve been a bit hard on Google in the previous sections, but that’s about to change. Last year, Google also introduced Posts for Google My Business. Google Posts, as opposed to regular posts on a Google+ page, allow businesses to update their listing with info that appears in the SERP along with their Knowledge Panel.


Posts let business owners promote new products, upcoming events, or simply useful information such as special holiday hours. Early studies indicate that engaging with Google Posts on a frequent basis can positively impact rankings – which may be an indication that Google is using a social feature as a search ranking factor.

Both Google and Facebook have also introduced CTA buttons businesses can add to their profiles, easing conversion from the SERP or social platform. Google users can book appointments with fitness and wellness-focused businesses directly from the SERP. Again, Facebook has outpaced Google here, since they offer seven CTA options which serve a variety of business needs: Book Now, Contact Us, Use App, Play Game, Shop Now, Sign Up, or Watch Video.

The convergence of local search and social

When you think about it, Facebook is the only business that could feasibly take on Google in the world of search. Its 2+ billion monthly users are a formidable force against Google’s 95% market share of mobile search. While Google has access to email, Facebook has access to social profiles. Both companies have access to an incredible amount of demographic information on their users.

Which will reign supreme in the realm of local search is yet to be decided, although Facebook is giving Google a real run for their money thus far. Facebook’s local search results have become smarter, while Google’s attempts to incorporate social into search seem clumsy at best.

Likely, what we’ll ultimately see is a merging of local search and social as the two platforms meet somewhere in the middle.

If you are interested in using local search and social media as a powerful marketing set of strategies for your webpages, contact ABCO Technology. You can reach our campus between 9 AM and 6 PM Monday through Friday. Call today at: (310) 216-3067.

Email your questions to: info@abcotechnology.edu

Financial aid is available to all students who qualify for funding.

ABCO Technology is located at:

11222 South La Cienega Blvd. STE #588
Los Angeles, CA 90304

Use social media for your web success.

Google updates its snippets

On December 1st, 2017, Barry Schwartz reported on Search Engine Land that Google had officially confirmed a change to how it displays text snippets in Google’s Search Engine Results Pages (SERPs).
Barry wrote,

“A Google spokesperson told us: ‘We recently made a change to provide more descriptive and useful snippets, to help people better understand how pages are relevant to their searches. This resulted in snippets becoming slightly longer, on average.’”

These snippets are the blurbs of text displayed in Google’s SERPs along with the clickable blue text and the page URL.

A quick Google search corroborates this – let’s use the query “how were the pyramids built” as an example:

In the results for this query, you can see that where Google would previously have displayed a snippet of approximately 150-165 characters including spaces (give or take – it varies now, and it varied before Google made the change too), the snippets are now much longer.

The text snippet Google shows in the SERP is *supposed* to be (more on this later) the contents of the meta description tag in the HTML of the page – let’s check each of these pages’ actual meta descriptions and their lengths.

Here they are, in the same order as above:

◾There are no photographs of the pyramid being built, and the engineers didn’t leave detailed blueprints. [Length:109]

◾The ancient Egyptians who built the pyramids may have been able to move massive stones by transporting them over wet sand. [Length:122]

◾No meta description specified in the HTML

◾No meta description specified in the HTML

◾Here’s everything you need to know about the incredible Egyptian pyramids. [Length:74]

Two things jump out right away.

1. Google is not displaying the page’s actual meta description as the SERP snippet for these specific listings for this specific query, even when the meta description is specified in the HTML; instead, the snippet is being pulled directly from text that appears at or near the top of the page.

2. The length of the snippets is longer than the length Google previously displayed, congruent with Google’s confirmation that it is showing longer SERP snippets.

Here’s how that breaks down for the above query, again in the same order as the SERP listing screenshot above:

◾The first sentence of the text is used as the SERP snippet

◾The first sentence of the text is used as the SERP snippet

◾The H1 page headline, followed by ellipses ( … ), followed by the second, third, and fourth sentences on the page in the first paragraph (skipping the first sentence in the first paragraph) are used as the SERP snippet.

◾The first and second sentences, and part of the third, are used as the SERP snippet

◾The first and second sentences, the image ALT attribute (or the image caption – they’re the same text), plus text from HTML code associated with the image, are used as the SERP snippet

Checking a number of other queries returned similar observations about what Google is using as the SERP snippet, but note that some SERP snippets were indeed taken from the actual meta description.

For example, in the SERP for a query for the musician “Todd Rundgren”, the SERP snippet is obviously taken directly from the meta description.

For many other queries I performed, both commercial and non-commercial in intent, the results showed a mix of SERP snippet sources – primarily either text on the page or the actual meta description specified in the HTML, in some cases the image ALT attribute, and occasionally some other bit of code in the HTML.

On mobile devices, the SERP snippets were very similar, in many cases the same as on desktop.

The SERP orders were slightly different, so yes, there’s going to be ranking variations based on various factors (it’s well known that Google can and will alter the SERPs you see based on your search history, geo-location, query type, your previous interaction with SERPs, etc.).

However, the overall scheme of the SERP snippets remained constant – text was taken mostly from either the first paragraph of the page, or the meta description, and in some cases the image ALT attribute, and occasionally from other text in the HTML code.

Dr. Pete Meyers over at Moz conducted research late last year on 89,909 page-one organic results.

Pete noted that the average SERP snippet was 215 characters long with the median length at 186, and he was quick to point out that, “big numbers are potentially skewing the average. On the other hand, some snippets are very short because their Meta Descriptions are very short”.

Pete also noted no significant differences between desktop and mobile snippet lengths, sometimes seeing mobile snippets longer than desktop snippets.

To be sure, the actual SERP snippet you see, and its length, will vary by query type.

What is going on here?

Google is trying to satisfy searchers.

Yes, traditionally the idea was that Google would pull the SERP snippet from the meta description, but for years now Google has been using whatever text its algorithms determine makes the most sense based on the user’s query.

Some sites – for example, Wikipedia and another we saw above – don’t even make use of the meta description tag in the HTML of their pages, so what’s a poor search engine to do in that case?

Similarly, what if the meta description is badly written, or spammy-sounding with lots of keyword stuffing, or doesn’t well-reflect the page’s theme and topic(s)?

So that’s what’s going on here – Google evolved over time to use whatever it deems makes the most sense to a user performing a certain query.

Wait: What the heck is a meta description, anyway?

Meta descriptions are HTML code that Google understands, and that is meant to provide a synopsis of the page.

Here’s an example (the wording below is placeholder text; yours should reflect your own page):
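
<!-- A representative meta description tag; the wording is placeholder text -->
<meta name="description" content="No blueprints survive, so how were the pyramids built? A plain-English look at the leading theories and the evidence behind each one.">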

This code goes between the <head> tags of the HTML and is not displayed in the visible content that a user would see.

Do meta descriptions impact SEO?

Meta descriptions will not impact rankings.

But, if Google does use a page’s meta description as the SERP snippet, that can impact click-through from the SERP.

That’s because a well-written meta description that is compelling, relevant to the page, and relevant to the query or queries for which the page is ranking, can impact organic traffic.

And that can have a downstream impact on conversions (the desired actions you want website visitors to take – fill out a form, buy something, and so on).

Poorly written meta descriptions, if used as the SERP snippet, can have the opposite effect, discouraging users from clicking through to your page and sending them instead to your competitors.

So, what should be your strategy now that Google has increased the SERP snippet length?

In summary, you could do any of the following:

◾Do nothing at all

◾Rewrite longer meta descriptions for all your pages

◾Rewrite longer meta descriptions for some of your pages (e.g. your top ten or twenty organic landing pages, or some pages you determine have low click-thru rates)

◾Delete all your meta descriptions

◾Audit your site’s content to ensure that the first text on your page is compelling and uses keywords congruent with how someone would search for your content; ensure the first paragraph contains at least 300-350 characters of text including spaces, and front-load the first 150 characters in case Google changes back to shorter snippets in the future.

What you decide to do (or not do) will at least in part hinge upon resources you have available to make changes.

Don’t take a “set it and forget it” attitude with your website’s content and your meta descriptions. It’s common for businesses to put a fair amount of work into their website, then just let it go stale.

A good recommendation here would be to cycle through this stuff on a regular basis – think quarterly or a couple times per year. Once per year at a minimum.

Here’s what I recommend

First, it should be obvious that your page’s textual content is for humans to consume, and that should always be your primary consideration.

You’ve heard the phrase “dance like no one’s watching” – well, write like Google doesn’t exist. But Google does exist, and their mission is satisfied users (so that people continue to use their service and click on ads) – Google is chasing satisfied users and so should you.

The refrain of “write great content” has been used ad nauseam. The only reason I’m mentioning the whole “write for your users” thing is simply because people often focus primarily on “how do I SEO my pages?” instead of “what’s good for my users?”.

Okay, with that out of the way and forefront in your mind, here’s what I recommend. Adjust this according to your specific needs – your industry, your users – don’t just take this as a cookie-cutter approach.

And, do this on the time frame that makes the most sense and works for you and the resources you have available to you to make changes to your site. If you haven’t looked at your page content and meta descriptions in a year or more, then this is a higher priority for you than if you refreshed all that 60 days ago.

Meta descriptions

◾Make them about 300-320 characters long, including spaces (a worked example follows this list)

◾Make the meta description super-relevant to the page text

◾Front-load the first 150-165 characters with your most-compelling text – compelling to your users who might see the text as a SERP snippet (just in case Google decides to shorten them again)

◾Use a call to action if applicable, but don’t be a used car salesman about it – and as appropriate, use action-oriented language

◾Remember WIIFM – what’s in it for me – as applicable, focus on benefits, not features

◾Don’t be deceptive or make promises your page content can’t keep
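
Putting those guidelines together, here is a hypothetical example. The topic and wording are placeholders; the description runs to roughly 300 characters, with the most compelling text front-loaded in the first 150:

<meta name="description" content="Compare the leading theories of how the Egyptian pyramids were built, with the evidence for each one in plain English. See which explanations archaeologists take seriously, which popular ideas don't hold up, and what the newest discoveries at Giza add to the picture. Start with the great ramp debate.">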

Keep in mind that Google may not use your meta description as the SERP snippet and may instead use content from your page, likely from the first paragraph.

With that in mind:

Review & refresh your content

◾Make sure the H1 page headline is super-relevant to the page’s topic (a small sketch of these elements follows this list)

◾Include an image (as applicable) that is super-relevant to the page (not one of those dumb, tangentially-related stock images) and craft an excellent and page-relevant image ALT attribute

◾Ensure that your opening paragraph is enticing and practically forces the reader to keep reading – that way if it’s the text used as the SERP snippet, that will capture people’s attention.
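
Here is a minimal, hypothetical sketch of how those three elements – a relevant H1, a page-relevant image with a descriptive ALT attribute, and an enticing opening paragraph – might sit together on a page. Every name and phrase below is a placeholder:

<h1>How Were the Pyramids Built? The Theories Archaeologists Take Seriously</h1>
<!-- An image relevant to the page topic, with a descriptive, page-relevant ALT attribute -->
<img src="/images/giza-ramp-reconstruction.jpg"
     alt="Artist's reconstruction of workers hauling limestone blocks up a ramp at the Great Pyramid of Giza">
<p>No blueprints survive and no photographs exist, yet the Great Pyramid went up within a single generation.
   Here is what the evidence from Giza actually tells us about how it was done – and which popular theories fall apart on inspection.</p>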

Summary

My summary is that if you haven’t already, please go back and read the whole article – I promise you it’ll be worth it. But I will add one more piece here and that is that ostensibly the type of content you’re creating is going to dictate how you configure your meta descriptions, H1 page headlines, and especially the opening text on the page.

In some cases, it makes sense to use the “how to feed a (Google) hummingbird” technique where you pose the topic’s question and answer it concisely at the top of the page, then defend that position, journalism style, in the rest of the text under that.

Similarly, you may be shooting for a SERP featured snippet and voice-assistant-device answer using bullet points or a numbered list at the top of your content page.

The point is, the guidelines and recommendations I’ve provided for you here are not a one-size-fits-all, cookie-cutter approach to your meta descriptions and your content. SEO experience, switching your brain into the on position, and a willingness to test, observe, and adjust are all mandatory to achieve the best results.

ABCO Technology teaches a comprehensive class for web development. Call our campus between 9 AM and 6 PM Monday through Friday at: (310) 216-3067.

Email your questions to info@abcotechnology.edu

Financial aid is available to all students who can qualify for funding.

ABCO Technology is located at:

11222 South La Cienega Blvd. STE #588
Los Angeles, CA 90304

 

Build highly visible webpages today

How to make a Google-proof website

Any SEO or webmaster who has ever had a website affected by a Google algorithm change – or feared being affected by one – has probably wished that they could find a way to make their website “algorithm-proof”.

Still, surely there’s no such thing as a website that’s never impacted by Google algorithms, right? As long as your site is indexed by Google, it’s at the mercy of the algorithms that Google uses to determine website ranking, all the more so if you happen to rely heavily on organic search traffic for your business.

The art – or science – of search engine optimization is about determining as best you can what those algorithms are looking for, and giving it to them.

Yet one website believes it has found the formula for making its content “Google algorithm-proof”. Ranker is a website made up of dynamic, crowdsourced lists that users can vote on, about everything from pop culture to geography, history to sports, celebrities to science.

And according to its CEO, Clark Benson, Ranker has never suffered a negative effect from a Google algorithm change, growing its traffic steadily without interruption over the course of eight and a half years.

ABCO Technology caught up with Benson to find out Ranker’s secret to success, and whether there is a formula for creating an algorithm-proof website.

Rankings, not review sites

So what is Ranker, exactly?

“Ranker’s primary reason for being is to crowdsource anything that makes sense to rank,” says Benson. “Any topic that people are really interested in.

“The unique angle that we’ve pursued is that instead of having this be one 23-year-old blogger’s opinion of the best new TV shows of the year, or whatever it happens to be, we would have a dynamic list that visitors could vote on, potentially add items to, and re-rank.

“The end result is a very wisdom-of-crowds-based answer which is always changing and dynamically moving along as tastes change, and as more people vote on things.”

Voting on a list of ‘Historical events you most want to go back and see’ on Ranker

Lists have been a time-honored draw for magazines and other print media over the years, but it was when the internet came along that they really exploded – spawning dozens of list-oriented viral websites and the much-mocked listicle, which became a staple of online journalism. However, Benson – a self-described “lifelong list nerd” – was frustrated by the fact that these lists only ever represented one person’s opinion.

In a similar vein, he found review websites unhelpful, as user-generated reviews represented a single person’s subjective opinion in a format that wasn’t conducive to making a decision.

“Part of the reason to build Ranker was my frustration with review sites, because when I’m looking for an answer to something, like which TV show to watch, I don’t want to read a lot of text reviews.

“I also feel that in typical five-star rating systems, everything tends to be clustered around three and a half to four stars, so you don’t get any true granularity on what is best.”

In a world increasingly “cluttered with choices”, therefore, Benson was convinced that rankings were “the simplest way to dissect a choice in a category, without losing the credibility of the answer”. And so he built Ranker as a website where the wisdom of the crowd could determine the ultimate ranking for any list of items, on any topic.

The secret to Ranker’s SEO success: Content freshness

Since Ranker’s launch in 2009, the site has amassed more than 100,000 rankings across dozens of broad categories, encompassing almost any topic that people could have a passion for.

When the website first launched, however, it had very few resources, and Benson explains that he had to learn SEO from scratch in order to give the website a strong foundation.

Luckily, earning traffic was never a problem for the site, because the type of content published on Ranker was uniquely suited to catering to Google’s algorithms.

“We’ve never been hit by any algorithm changes – we’ve always grown our organic search traffic year over year over year, steadily, for the eight and a half years we’ve been live.

“You never exactly know what works in SEO, because Google doesn’t tell you what works, but I’ve always believed that the best intelligence on what to do comes from the public statements Google makes – their best practices.

“And one of the key factors that Google says is in their index is freshness of content. Content has a lifespan. In our case, because our rankings are dynamic and always changing – people are adding things to them, voting things up and down – this makes for perpetually fresh content.

“We have a lot of content that is six, seven, even eight years old that is still doing as well as it was years ago, and in some cases it’s even growing in traffic.”

One of Ranker’s most evergreen pieces of content is a list ranking the ‘Best Movies of All Time’ – which is more than 5,000 items long.

“Obviously that’s a topic that there’s a lot of passion and a lot of competition for [in search rankings]. And in the last few years, we’ve been on the top three or so results on Google for that term.

“We’ve watched that page just grow in rankings over the span of seven or eight years. I can only guess it’s because the page is always changing.”

User-curated content

At the time of writing, Ranker’s front page is spotlighting a list of best-dressed celebs at the 2018 Oscars, a ranking of the best TV episode names, and a list of possible game-changing deep space observations to be made by the Webb Telescope.

Anyone can add an item to a list on Ranker, although Ranker’s content is not purely user-generated. Ranker has an editorial team which is made up of people who, in Benson’s words, “have a mind for cataloging things” rather than people who specialize in writing a lot of prose.

Lists are typically started off by one of Ranker’s editors, and when a user wants to add a new item to a list, it’s cross-referenced with Ranker’s database, a huge data set made up of more than 28 million people, places and things. If the item isn’t found in the database, it’s added to a moderation queue.

Rather than UGC (user-generated content), therefore, Benson thinks of Ranker’s lists as something he terms UCC – user-curated content.

How did Ranker build such a huge data set? Beginning in 2007, a company called Metaweb ran an open source, collaborative knowledge base called Freebase, which contained data harvested from sources such as Wikipedia, the Notable Names Database, Fashion Model Directory and MusicBrainz, along with user-submitted wiki contributions.

This knowledge base made up a large part of Ranker’s data set. What’s interesting is that Freebase was later acquired by none other than Google – and is the foundation of Google’s Knowledge Graph.

Additionally, not every list on Ranker is crowdsourced or voted on. Some lists, such as Everyone Who Has Been Fired Or Resigned From The Trump Administration So Far, don’t make sense to have users voting on them, but are kept fresh with the addition of new items whenever the topic is in the news.

Can other websites do ‘Ranker SEO’?

Benson acknowledges that Ranker’s setup is fairly unique, and so it isn’t necessarily possible to emulate its success with SEO by trying to do the same thing – unless you just happen to have your own crowdsourced, user-curated list website, of course.

With that said, there are still some practical lessons that website owners, particularly publishers, can take away from Ranker’s success and apply to their own SEO strategy.


First and foremost: content freshness is king

As you’ve no doubt gathered by now, the freshness of Ranker’s content is probably the biggest contributing factor to its success in search. “We’re convinced that the dynamism of our content is what really lets it just grow and grow and grow in search traffic,” says Benson.

“While our approach is somewhat unique to the way Ranker works – we have a bespoke CMS that makes lists out of datasets – I’m positive that there are other ways to apply this kind of thinking.”

To put content freshness front and center of your content marketing efforts, make sure that your publication or blog is well-stocked with evergreen content. For those articles or posts that are more time-sensitive, you can still publish a refreshed version, or look for an up-to-date spin to put on the old content, for example linking it in with current events.

According to research by Moz, other factors which can contribute to a positive “freshness” score for your website as a whole include:

◾Changes made to the core content of your website (as opposed to peripheral elements like JavaScript, comments, ads and navigation)

◾Frequency of new page creation

◾Rate of new link growth (an increase in links pointing back to your site or page)

◾Links from other fresh websites, which have the ability to transfer their “fresh value” (Justin Briggs dubbed this quality “FreshRank” in 2011)

Internal links trump external links

Other than content freshness, Benson attributes Ranker’s SEO success to one other big factor: its intricate network of internal links, which Benson believes are far more valuable to SEO than an impressive backlink profile.

“I think a lot of people who are new to SEO focus too much on trying to get outside links, versus optimizing their own internal infrastructure,” he says.

“We have a very broad site with millions of pages – not just lists, but a page for every item that’s included in a list on Ranker, showing you where it ranks on all of our different lists.”

The Ranker page for Leonardo da Vinci

“We made the mistake early on of leaving all of those pages open to Google’s index, and we learned over time that some of them are very thin, content-wise. New links are added to them, but they’re thin pages. So we quickly adopted a strategy of noindexing the thinner pages on our site – so they have utility, but they don’t necessarily have search utility.
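
For readers who haven’t used it, the noindexing Benson describes is normally done with a robots meta tag in the <head> of each thin page – a generic sketch of the approach, not Ranker’s actual implementation:

<!-- Asks search engines not to index this page; links on it can still be crawled and followed -->
<meta name="robots" content="noindex">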

“We’ve really focused a lot on internal link structure and on interlinking our content in a very intelligent and vertical-driven, page-optimized way. We’ve put a lot of engineering and product resources towards building a robust internal link structure that can also change as pages become more valuable in search.

“Outside links are very important, but they’re increasingly difficult to get. If you have good, unique content, and a strong internal link structure, I think you can get by with far fewer backlinks. Ranker has a lot of backlinks – we’re a big site – but we’ve never tactically gone out to build backlinks. And we get more than 30 million organic search visits per month.”

Think about how your content will appear to searchers

Benson emphasizes the importance of paying attention to basic on-site optimization like crafting good title tags and meta descriptions. These elements dictate how your website appears in the SERP to users when they search, and so will form the first impressions of your content.

“When it comes to creating new content, our editorial team definitely focuses on best practice with regards to title tags and meta descriptions – the basic stuff still applies,” says Benson. “Anyone doing editorial still needs to think about your content from the lens of the searcher.”

Optimizing for Google’s rich results and using Schema.org markup are additional ways that website owners can make sure that their website listing appears as attractive as possible to a searcher encountering it on the SERP.
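
As an illustration of that last point, here is a minimal sketch of Schema.org Article markup generated as JSON-LD – all field values are placeholders – which would be embedded in the page head inside a script tag of type application/ld+json.

```python
import json

def article_jsonld(headline: str, description: str, author: str, url: str) -> str:
    """Build a minimal Schema.org Article block as JSON-LD.
    Values passed in here are placeholders for illustration."""
    data = {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "description": description,
        "author": {"@type": "Person", "name": author},
        "url": url,
    }
    return json.dumps(data, indent=2)

print(article_jsonld("Example headline", "Meta-description-style summary.",
                     "Jane Doe", "https://example.com/post"))
```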

The future is psychographic

What plans does Benson have for the future of Ranker? Up to now, the site has been concentrating mostly on search and social distribution (Facebook is another big source of organic traffic), but it is now beginning to focus more on ad sales, media tie-ins and getting the brand name out there.

“We’re always focused on growing traffic, and we’re certainly investing a lot more into our brand,” says Benson.

However, the most exciting future project for Ranker is something called Ranker Insights – a psychographic interests platform which makes use of Ranker’s thousands of data points on what people are interested in and like to vote on.

Drawing connections between people’s interests on Ranker Insights

Big data on anything is extremely valuable in marketing, but big data on the things that people like is near enough invaluable – particularly in a world where psychographics (classifying people according to their attitudes, aspirations, and other aspects of their psychology) are increasingly more important than demographics (classifying people according to things like age, gender, race and nationality).

“The marketing world in general is steering a lot more towards psychographics rather than demographics,” says Benson. “Netflix doesn’t care what country you live in – when it comes to marketing or even recommendations, all they care about is your tastes. They stopped using demographics entirely years ago – and clearly they’re doing something right.

“We feel that in an interconnected world, what you like says at least as much about you as your age or your gender.

“And in a world where what you like tells people how to market to you and how to reach you, we have very, very granular, deep data on that front. There’s a lot of different applications for insights like this in a very data-driven world.”

“The end result is a very wisdom-of-crowds-based answer which is always changing and dynamically moving along as tastes change, and as more people vote on things.”

Voting on a list of ‘Historical events you most want to go back and see’ on Ranker

Lists have been a time-honored draw for magazines and other print media over the years, but it was when the internet came along that they really exploded – spawning dozens of list-oriented viral websites and the much-mocked listicle, which became a staple of online journalism. However, Benson – a self-described “lifelong list nerd” – was frustrated by the fact that these lists only ever represented one person’s opinion.

In a similar vein, he found review websites unhelpful, as user-generated reviews represented a single person’s subjective opinion in a format that wasn’t conducive to making a decision.

“Part of the reason to build Ranker was my frustration with review sites, because when I’m looking for an answer to something, like which TV show to watch, I don’t want to read a lot of text reviews.

“I also feel that in typical five-star rating systems, everything tends to be clustered around three and a half to four stars, so you don’t get any true granularity on what is best.”

In a world increasingly “cluttered with choices”, therefore, Benson was convinced that rankings were “the simplest way to dissect a choice in a category, without losing the credibility of the answer”. And so he built Ranker as a website where the wisdom of the crowd could determine the ultimate ranking for any list of items, on any topic.

Rebecca Sentance is the Deputy Editor of Search Engine Watch.

ABCO Technology teaches classes on building crowdfunding websites in our web development program. Call our campus between 9 AM and 6 PM Monday through Friday at: (310) 216-3067.

Email your questions to: info@abcotechnology.edu

Financial aid is available to all students who qualify for funding.

ABCO Technology is located at:
11222 South La Cienega Blvd. STE #588
Los Angeles, Ca. 90304

 

Learn to build crowdfunding websites today!

A beginner’s guide to using negative keywords in PPC

Let’s set the scene. You’ve signed up to Google Adwords, entered your payment details, maybe even created a few ads and got to grips with the different types of matches for keywords.

You may even have gone ahead and sent your ads live. Easy enough. But you are fully aware that it doesn’t end there.

PPC can be an expensive hobby and you’re determined that your PPC campaign will become a valuable marketing channel rather than a resented, money-burning pastime.

In order to make the most of your PPC investment, you are going to have to use both common sense and data to constantly tailor your ads. You want to home in on specific buyer personas, which, as a byproduct (or whichever way round you want to view it), rids your campaign of wasted clicks.

You can do this by assessing quality score, A/B testing ad formats, revisiting your keywords and adding nice features such as callout extensions.

But as the title suggests, we’re here to talk about negative keywords. In this article we will walk through the basics of negative keywords in order to get you up and running. There are loads of more detailed PPC tips on ABCO Technology’s Facebook page, so if you’re after pro tips we suggest using the handy search bar!

What are negative keywords?

One of the steps in creating your adverts is to assign the types of search terms that you want your adverts to appear for. Hopefully you have been specific about your keywords, focusing on user intent and relevance.

As you would imagine, negative keywords are almost the complete opposite of your target keywords. They help you give guidelines to Google, dictating the types of search terms for which you do not want to appear.

When would you use negative keywords?

Google defines negative keywords as “A type of keyword that prevents your ad from being triggered by a certain word or phrase. Your ads aren’t shown to anyone who is searching for that phrase. This is also known as a negative match.”

A common example of negative keyword use is ‘cheap’ (Google use ‘free’ as an example). Let’s say you make bespoke furniture or high-end watches; it makes sense that you would not want to pay for clicks from searchers looking for cheaper alternatives.

You also need to banish ambiguity. In her ultimate guide to AdWords keyword match types and negatives, Lisa Raehsler used the good example of ‘blueberry muffins’, where the searcher could be looking for either recipes or bakeries – two very different user intents.

In such a situation you would then add ‘recipes’ or ‘bakeries’, whichever suits you, to your negative keywords.

Where do I enter negative keywords?

You may already have noticed the negative keywords tab when you were busy adding keywords for either a campaign or ad group – the tab is right next to the ‘keywords’ tab!

You can either enter campaign-level negative keywords, which apply across your whole campaign, or define them for specific ad groups, depending on the complexity of your campaign. Simply select the ad group that you want to add your negative keywords to.

Note that, like keywords, you are able to define whether each negative keyword is exact, broad or phrase match. Amanda DiSilvestro explains more about these different types of keyword matches in her common PPC mistakes piece.
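
The real matching logic lives inside Google Ads, but the difference between the three negative match types can be sketched roughly in Python. This toy function is purely illustrative: exact blocks only the identical query, phrase blocks queries containing the words in order, and broad blocks queries containing all the words in any order.

```python
def blocked(query: str, negative: str, match_type: str) -> bool:
    """Rough mimic of negative keyword matching, only to show how the types differ.
    (Phrase matching here is a crude substring check, so it is not exact.)"""
    q, n = query.lower().split(), negative.lower().split()
    if match_type == "exact":    # the whole query must equal the negative keyword
        return q == n
    if match_type == "phrase":   # the negative words must appear in the same order
        return " ".join(n) in " ".join(q)
    if match_type == "broad":    # all negative words present, in any order
        return all(word in q for word in n)
    raise ValueError("match_type must be exact, phrase or broad")

print(blocked("cheap bespoke furniture", "cheap", "broad"))             # True - blocked
print(blocked("blueberry muffin recipes", "recipes", "phrase"))         # True - blocked
print(blocked("bespoke furniture london", "cheap furniture", "broad"))  # False - ad shows
```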

Finding negative keywords

If your campaign has already been running for a while, we would still not advise diving straight into your search terms tab. If you’ve ever read about the concept of ‘anchoring’ you would understand why – ever been asked to describe something without using a particular word, but all you can think about is that word? Same idea.

The data on search terms for which your website is appearing is not going anywhere, so why not take the time to use your own industry knowledge? Brainstorm the types of businesses, products or services that yours could be mistaken for and the search terms which would be used to describe them.

You are likely to uncover some negative keywords that haven’t been used by searchers yet – remember that if it shows up in your search terms, then you’ll have paid for it! After completing your brainstorming you can then use the search terms tab to identify further negative keywords.
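
If you want to combine the two steps, a short script can cross-check your brainstormed list against an exported search terms report. This is only a sketch: the file name and the ‘Search term’ column header are assumptions based on a typical AdWords export, so adjust them to match your own download.

```python
import csv

# Brainstormed terms your ads should not show for (illustrative only)
CANDIDATE_NEGATIVES = {"cheap", "free", "diy", "jobs", "recipes"}

def find_wasted_terms(report_path: str):
    """Scan an exported search terms report and flag queries containing any
    brainstormed negative keyword."""
    flagged = []
    with open(report_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            words = set(row["Search term"].lower().split())
            hits = words & CANDIDATE_NEGATIVES
            if hits:
                flagged.append((row["Search term"], sorted(hits)))
    return flagged

# Example usage:
# for term, hits in find_wasted_terms("search_terms.csv"):
#     print(term, "->", hits)
```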

SEO can play its part too. The worlds of SEO and Google Adwords can often come to blows, as teams compete for sought-after budgets and are inevitably looking to position their channel as the most effective.

We’re all on the same team, though, right? There is considerable overlap between the two, and PPC and SEO teams can actually work together, sharing data to benefit both campaigns.

If you are already collecting and analyzing data for your SEO campaign, it is advisable to dip into this data. It may well unearth potential negative keywords – search terms your website is appearing for in organic search which have not yet found their way into your AdWords data.

Kill two birds with one stone by adding these negative keywords to your AdWords campaign or ad group, and reassess your SEO strategy to home in on that perfect buyer persona!

Keep checking in

If you don’t need to make adjustments to your campaign after setting it up, then I would suggest quitting your job and becoming a PPC guru!

Your campaign set-up may be top-notch, but things change: new data appears, different search terms develop and competitors change tactics. The knock-on effect is that you should keep checking in on your AdWords campaign (and negative keywords) regularly. If you don’t, you are either braver or sillier than I am (probably both).

Don’t waste your hard-earned cash by missing opportunities to maximize your investment – or, in the case of negative keywords, by allowing Google to charge you for clicks on search terms that are irrelevant to you and your business.

ABCO Technology teaches a comprehensive program for web development, which includes search engine optimization and marketing.

Call our campus today; you can reach us between 9 AM and 6 PM Monday through Friday at: (310) 216-3067.

Email your questions to: info@abcotechnology.edu

Financial aid is available to all students who can qualify for funding.

ABCO Technology is located at:
11222 South La Cienega Blvd. STE #588
Los Angeles, Ca. 90304

 

Build highly effective and visible webpages today!

2018 predictions for Google

2018 might be the Chinese Year of the Dog, but it’s going to be Google’s Year of the Machine. In this year’s article I review how my predictions for 2017 played out and set out my predicted Google focus for 2018. Want to see how I did for 2017? Read on!

Google Assistant

Last year I wrote a piece focused on mobile and correctly predicted Google would continue to move away from big algorithm updates, and have a continued focus on the mobile index. So let’s start off with a quick recap of what I said last year.

The main thrust of the article was about mobile and how Google was going to be focusing on this to increase revenue from Adwords…

Google has access to brand new markets and a shortcut into markets they were previously struggling with. Desktop would never provide access to these as there are way too many barriers to ownership rates rising so dramatically (such as cost and internet infrastructure).

Mobile has solved a key problem for Google. 99% of revenue for Alphabet comes from Google and 77% of that comes from AdWords. That mobile traffic is key to this figure and they are going to be doing their best to keep pushing it.

And this is from an FT article on the third quarter earnings for Alphabet last year:

Strong growth in mobile search, programmatic advertising and smartphone use in Asia helped accelerate revenue at Alphabet to the fastest rate in almost five years, surpassing estimates for sales and earnings in the third quarter.

https://www.ft.com/content/4a80da12-ba8b-11e7-8c12-5661783e5589

I did focus more on Africa than Asia, which was misplaced with the amount of investment Google has placed in the Asian market and the potential gains there. However, Africa is still an incredibly important market for Google and one they invested in massively during 2017, pledging to train 10 million people in Africa in online skills.

Additionally, Google has launched and continued to progress the mobile index, with mobile SEO further splitting from desktop.

One of the other noteworthy predictions was that we weren’t going to see any major updates on the desktop index anymore, with updates instead being small, frequent and unnamed (by Google) corrections.

In 2017 we saw lots of small updates detected by the community and whilst there were a few new penalties put in place, for example targeting pages that used interstitials too heavily, these were not big algorithm shifts. The days of penguins, pandas and hummingbirds are over. It makes the lines even more blurred as it’s harder to point to things and say ‘after update X we know Y is now the case’. As a result of this, I would expect decreasing amounts of consensus across the SEO world as this continues. See our Google algorithm updates in 2017 for a recap of the year’s updates.

Finally, for the recap, I also talked about ‘Peak Mobile’ in some markets with Google’s focus shifting to changing user behavior for those who already owned mobile devices…

In the UK and US smartphones have reached saturation or are at least very near that point. With over 70% penetration in pretty much all the key markets (Europe, US, China), growth in mobile is going to be relying more on changing user behavior of existing device owners. Therefore we can expect more focus from the search engines on user behavior.

Google went on to broker a deal with Apple, which switched from Bing to Google to power Siri results on iOS in September 2017.

I’ll come back to this in the next part of the article where I lay out my predictions for 2018 as I think it’s strongly relevant.

The 2017 article is worth a read and covers a few other points as well. All the information I put across was an extension of the activity that was already taking place, so it was all a pretty safe bet. This year however I am going to go decidedly off piste…

The new device

There is a battle of the machines going on between Google and Amazon, with both of them vying to get themselves in your home. Google has ‘Home’ and Amazon has ‘Alexa’. Both of these are physically little more than glorified bluetooth speakers. However that speaker is linked to their respective AI offering.

Think about the last time there was an entirely new device for you to be served content from. We had desktop, then laptop and then mobile. An honorable mention goes out to tablet as well but it’s pretty much lumped in with mobile. But that’s it. Since 1998: 3. Well that’s now 4 with Google fighting a pitched battle to get into your home in yet another format. It’s Google VS Amazon or in other words Home VS Alexa.

Google is putting increasing resources into this battle, having realized that Amazon was leapfrogging ahead with Alexa. You might think this device is insignificant for Google, as it has really limited potential as an ad-serving platform – which is correct. But it has massive potential as a data collection platform, and crucially every search or query made through Alexa is not visible to Google, so Google is losing out on that data.

Google Home is going to become increasingly integrated with other devices and they are going to keep driving the device cost down offering different model types. I also think that Google Assistant will be released, for free, for any manufacturer to use in their 3rd party device. There were a couple of examples of this popping up with select manufacturers right at the end of last year. Google doesn’t care about the hardware, it’s about getting Google Assistant into as many homes as possible. I can see it becoming a standard integration into bluetooth speakers, especially as Google have a track record for developing and releasing products free for any manufacturer to use, Android being a great example. That’s how they came to dominate mobile search and it’s how they’ll dominate voice search as well.

Google makes a massive $0 from Android the software, but they make billions in revenue from the searches conducted on Android devices. They also, importantly, don’t have to pay handset manufacturers for Google to be the preset search on those devices.

Looking at the recent increase in the range of Alexa products, it does look like Amazon is throwing the kitchen sink at it hardware-wise, and this could be a pre-emptive strike against Google releasing Assistant to any 3rd party to use for free. Interestingly, within the EU the right to data portability will mean that consumers will be allowed to port their personal data between devices. This might sound like a win for the consumer, but it could have a dark side. It’s forcing companies to put aside their differences and develop a universal data format for our personal data. Theoretically this means it will be MUCH more useful to third parties and easier to compile even bigger data pools. Also, if you swap back and forth between devices you’re sharing that data in more places, adding more information to their own networks.

The machines are coming from your home

I recently wrote an article about Natural Language Processing, which we now use within our classes, and it touched on some of the current limitations of these offerings. They are a long way from perfect, and if you know what you are looking for – specifically the things these types of AI are typically bad at – it’s very easy to trip them up with simple questions.

One of those question types is comparisons. You can ask ‘how far away is the moon’ or ‘how far away is the sun’ and get an answer without a problem. However, ask ‘which is furthest away, the moon or the sun’ and, although the information is there, it’s too complex for Alexa to process. Responding takes a leap in understanding of the question, processing of the information and composing the expected result, and this is simply out of current reach.
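
To make the gap concrete, here is a toy Python illustration (with approximate figures) of the extra step a comparison requires: each fact is a trivial lookup on its own, but the assistant also has to retrieve both and reason over them to answer the combined question.

```python
# Average distances from Earth, in kilometres (approximate figures)
DISTANCES_KM = {"moon": 384_400, "sun": 149_600_000}

def how_far(body: str) -> str:
    # The easy, single-fact lookup an assistant handles well
    return f"The {body} is about {DISTANCES_KM[body]:,} km away."

def which_is_further(a: str, b: str) -> str:
    # The extra step a comparison needs: retrieve both facts, then reason over them
    winner = a if DISTANCES_KM[a] > DISTANCES_KM[b] else b
    return f"The {winner} is further away."

print(how_far("moon"))
print(which_is_further("moon", "sun"))
```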

It’s this ability to process the information that’s already present for a more useful result which I think Google is going to try its best to leverage. Last year I touched on ‘micromoments’ which are the moments when a consumer pulls out their phone and checks information mid-decision. That point where you are walking down the street and want to know where the nearest restaurant is. These moments are incredibly valuable as they are hugely actionable – you are ready to make a decision right there and then and your next action will likely be to commit or purchase.

“Mobile has forever changed the way we live, and it’s forever changed what we expect of brands. It’s fractured the consumer journey into hundreds of real-time, intent-driven micro-moments. Each one is a critical opportunity for brands to shape our decisions and preferences.”

Source: https://www.thinkwithgoogle.com/collections/micromoments-guide.htm

This is just one touchpoint where Google hopes to use AI to deliver better results. This ‘in the moment’ advertising is the context. However, there are also optimizations of the ad served, to make sure that it’s the right ad as well as the right time. For example, based on past behavior they might look at what colour, placement or type of ad you have responded to best previously. They then roll all that information into one to serve you an ad that is contextual (meaning served in the correct context, such as a restaurant ad as you are walking down the street looking at restaurants) and personalised (as in tailored to you specifically, based on past ads you have responded well to).

Using AI makes it possible to take huge amounts of data from multiple behaviors, touchpoints and, importantly, patterns, and roll that information into a single ad.
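
As a purely hypothetical sketch of how those two signals might combine, the toy scorer below weighs a contextual match against a user’s past response to different ad formats. Every field, weight and value here is invented for illustration – it is not how Google’s systems actually work.

```python
def score_ad(ad: dict, context: dict, user_history: dict) -> float:
    """Toy scoring of an ad as a blend of a contextual signal (right place, right
    time) and a personalized signal (formats this user has responded to before).
    All fields and weights are invented for illustration only."""
    contextual = 1.0 if ad["category"] == context["nearby_intent"] else 0.0
    personalized = user_history.get(ad["format"], 0.0)  # past response rate
    return 0.6 * contextual + 0.4 * personalized

ads = [
    {"name": "Bistro banner", "category": "restaurant", "format": "image"},
    {"name": "Watch promo", "category": "retail", "format": "video"},
]
context = {"nearby_intent": "restaurant"}   # walking down a street of restaurants
history = {"image": 0.08, "video": 0.02}    # this user's past response rates

best = max(ads, key=lambda ad: score_ad(ad, context, history))
print(best["name"])  # -> Bistro banner
```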

A big part of the opportunity for marketers is how AI will help us fully realize personalization—and relevance—at scale. With platforms like Search and YouTube reaching billions of people every day, digital ad platforms finally can achieve communication at scale. This scale, combined with customization possible through AI, means we’ll soon be able to tailor campaigns to consumer intent in the moment. It will be like having a million planners in your pocket.

We’re getting closer to a point where campaigns and customer interactions can be made more relevant end-to-end—from planning to creative messaging to media targeting to the retail experience. We will be able to take into account all the signals we have at the customer level, so we can consider not only things like a consumer’s color and tone preferences, but also purchase history and contextual relevance. And all of this will be optimized on the fly in real time.

https://www.thinkwithgoogle.com/marketing-resources/ai-personalized-marketing/

This is a mission statement from the Google VP of marketing, Marvin Chow, in September 2017 and I strongly believe it’s where they are going to be focusing a huge amount of effort and resources this year. Adwords is where Alphabet gets almost 80% of its revenue after all.

Taking AI a step further

Something not mentioned in the 2017 article – and where I am probably heading pretty deep off-piste – is predictive behaviour modeling, using AI to process and understand data from many users. In other words, it’s not just looking at your past behavior but at the past behavior of people like you.

For instance there are correlations between people and the products and services they like. So Google will be increasingly modeling you and trying to ascertain what you may be interested in from what type of person they think you are. On a simplistic level this is showing you ads for pet insurance after you googled ‘dog food’. People who purchase dog food also purchase pet insurance. There is a clear correlation. However things get a bit more weird when you take into account the multiple touch points. For instance you visit the vets and start getting ads for a specific pet insurance policy.

You’re already targeted based on your past behavior, and that can then be combined with the ads that people who share key attributes with you – age, gender, income and so on – have responded well to. Now you’re being pushed a single highly targeted product – advertising that is both contextual and personalized.
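
A crude version of that ‘people like you’ inference can be shown with simple co-occurrence counting – the dog food and pet insurance example above, in code. This is a toy sketch only; real systems use far richer models and far more signals.

```python
from collections import Counter

# Toy interest histories, one set per user (illustrative data)
users = [
    {"dog food", "pet insurance", "dog toys"},
    {"dog food", "pet insurance"},
    {"dog food", "dog toys"},
    {"watches", "cufflinks"},
]

def co_occurrence(interest: str) -> Counter:
    """Count what else appears alongside a given interest across users -
    the simplest possible 'people like you' signal."""
    counts = Counter()
    for basket in users:
        if interest in basket:
            counts.update(basket - {interest})
    return counts

print(co_occurrence("dog food").most_common(2))
# e.g. [('pet insurance', 2), ('dog toys', 2)] - a crude reason to show a pet insurance ad
```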

By the way, if you don’t want Google to know where you are all the time you need to go into your timeline within Google maps and turn off the tracking services. This is for both Android and iOS devices. This is the tracking on my device:

However I still continue to get notifications when I arrive or leave some places (particularly my local supermarket) asking me to ‘rate my experience’. This is without using Google Maps (I know the way to the supermarket!) so Google appears to still be tracking me through my device despite having it clearly set up not to. Which is comforting.

Understanding people and patterns through AI will be Google’s biggest driver in 2018; it follows on perfectly from the mobile adoption focus. They need mobile adoption for tracking and will be trying to use this to their advantage.

So what do you think Google’s focus in 2018 will be?

If you think I’m right, wrong or anywhere in between please do have your say in the comments….

Do you think Google is tracking us too much?

Will AI be taking over advertising?

Does it even matter?

ABCO Technology teaches a comprehensive program for e-commerce and search engine optimization. Call our campus today between 9 AM and 6 PM Monday through Friday at: (310) 216-3067.

Email your questions to info@abcotechnology.edu

Financial aid is available to all students who can qualify for funding.

ABCO Technology is located at:
11222 South La Cienega Blvd. STE #588
Los Angeles, Ca. 90304

 

Build highly visible webpages today!

Google’s featured snippets: what do they mean?

The use of snippets is important for many websites.

Google uses featured snippets to make it easier to connect us to the information we want, but in doing so could they be endangering the basic model the entire web relies on? We get free information and in return, we used to get served a couple of adverts on the site we look at. But without being able to serve those ads, there’s less incentive to create that content.

Featured snippets explained

Featured snippets are intended to make it easier for you to access the information available on a web page by bringing it directly into the search results.

Sometimes when you do a search, you’ll find that there’s a descriptive box at the top of Google’s results. We call this a “featured snippet.”

https://blog.google/products/search/reintroduction-googles-featured-snippets/

Here’s an example from the Google blog post where they ‘reintroduce’ them:

So in short it’s taking the text from a page and then featuring it prominently in the search results.

Google’s shift from connection engine to information engine

Google has always been a connection engine. However, there appears to be a continuing change in the way in which Google sees itself. The model has always been:

•I enter a search term and Google provides a list of links to content that best answers that search

•I click on a paid or free result

•Google gains money from paid results and advertising on publishers’ sites

•Publishers get paid by the advertising on their sites

Google is increasingly moving towards just showing me the information, lifted directly from the content it indexes. The shift is subtle but it is destroying that model. So now the relationship looks like this:

•I enter a search term and Google provides me the information that best serves the search

•I read the information on Google

Not only is this chain a lot shorter, it also removes the publishers – and with them Google’s own methods of monetization. The key, though, is that Google only shows snippets for certain types of results. Results for searches with a clear purchase intent are naturally less likely to show a snippet but more likely to have PPC ads. Whilst some results do also feature PPC results, in every search I did these were shown above the snippet, with the organic content below.

The potential effects of snippets on websites

When your business relies on traffic from providing specific or niche information, snippets can be devastating. Take the case of CelebrityNetWorth.com, as detailed in The Outline. If you want to know what someone famous is worth, you look it up on their site and they give you a number and a breakdown of how they reached it. The most important thing is the number – that’s the key information people are looking for.

Back in 2014 Google emailed the owner of the site, Brian Warner, and asked for permission to use the data from the site in the Knowledge Graph. Brian was not keen…

“I didn’t understand the benefit to us,” he said. “It’s a big ask. Like, ‘hey, let us tap into the most valuable thing that you have, that has taken years to create and we’ve spent literally millions of dollars, and just give it to us for free so we can display it.’ At the end of it, we just said ‘look, we’re not comfortable with this.’”

https://theoutline.com/post/1399/how-google-ate-celebritynetworth-com

However, when snippets were introduced Google just went ahead and took the information anyway – the information that Brian had said he didn’t want them to use.

The result was a loss of 65% of traffic year on year and having to lay off staff as the profitability of the site took a nose dive. That’s the very real impact of Google’s change from connecting you to the information to delivering that information right there on the page. The sites that provide that information, the ones that have actually put the time and effort into creating the content, are the ones that lose out.
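
Publishers who decide the trade-off isn’t worth it do have a lever: the long-standing nosnippet robots directive asks Google not to show a snippet for a page (the loss of SERP real estate that comes with it is another matter). A minimal sketch of emitting the tag:

```python
def robots_tag(allow_snippet: bool) -> str:
    """Emit a robots meta tag for the page head. 'nosnippet' asks Google not to
    show a text snippet for this page; the trade-off is that the plain blue link
    may attract fewer clicks."""
    if not allow_snippet:
        return '<meta name="robots" content="nosnippet">'
    return '<meta name="robots" content="all">'

print(robots_tag(allow_snippet=False))
```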

Snippets likely won’t affect all websites as badly as in this example – it is just one example – but other studies consistently show that featured snippets reduce clicks on other results, in effect cannibalizing traffic. Take this study from Ahrefs:

Why did snippets need reintroducing?

Snippets just weren’t that bright, and there were several high profile examples of them failing. Some snippets appeared to have been removed, especially on more controversial topics.

The problem came about through a combination of not understanding the user intent and not being picky about where information was pulled from. Google’s failure to properly understand intent is something they have got in trouble with before, like with the ‘Unprofessional Hair’ problem.

As Google shifts from connecting to content, to connecting to information directly, intent becomes even more important. Of course without the context of the rest of the content we’re even less able to judge the validity of the information shown. Especially when these snippets also serve to provide information for Google Home Assistant. So there is little context available, beyond the name of the site, to evaluate the information against. It’s simply a case of being told an answer to a question as if it’s ‘The Answer’ rather than ‘an answer’.

This also leads to problems such as the case highlighted by Danny Sullivan in his own announcement post for the new feature:

Source: Google blog

Here we have two queries where the intent is the same: the suitability of reptiles as pets. However, in a glass-half-full / glass-half-empty kind of way, different people phrase this question differently depending on their initial bias. Google has then served each with a snippet that reinforces that bias – in effect, two different answers to the same question depending on the searcher’s expectation of the result. For my results at least, Google appears to have put in a speedy fix for this by stopping the snippet showing on one set of results. Replacing reptiles with goats replicated the effect, though, so it doesn’t look to be a fix for the wider issue.

This might not appear to be a big problem when it comes to reptiles or goats but things could potentially get out of hand quickly as they roll this out across more queries and cover more topics (for example politics). Searching around at the moment it looks like political or controversial topics are more restricted, especially in terms of the search content.

It’s not just snippets either

It’s not just content publishers that need to watch out. Google appears to be developing their own tools for popular queries and placing these directly in the search results. This is the result I get for a search on ‘internet speed test’:

I guess for the rest of the sites offering a speed checker it’s just tough. This is different from snippets as it’s not using anyone else’s information. But in this example at least Google appears to be creating a tool and then placing it at the top of the search results above competing tools. I personally feel that sets a bit of a dangerous precedent as this could potentially spread with Google creating more tools, in partnership with more companies, so harming the competition. Competition and diversity are good, but people will be less likely to innovate and create new tools if Google is going to just step in when something gets popular and publish their own tool above everyone else’s in the results.

Google has got into trouble before for placing their own services above competitors. In July of 2017 it received a record-breaking $2.7 billion fine from the EU for antitrust violations with their shopping comparison service:

Google has systematically given prominent placement to its own comparison shopping service: when a consumer enters a query into the Google search engine in relation to which Google’s comparison shopping service wants to show results, these are displayed at or near the top of the search results.

Google has demoted rival comparison shopping services in its search results: rival comparison shopping services appear in Google’s search results on the basis of Google’s generic search algorithms. Google has included a number of criteria in these algorithms, as a result of which rival comparison shopping services are demoted. Evidence shows that even the most highly ranked rival service appears on average only on page four of Google’s search results, and others appear even further down. Google’s own comparison shopping service is not subject to Google’s generic search algorithms, including such demotions.

So Google put their own service higher up in the results than competing services and didn’t make their own service subject to the same ranking algorithms as their competitors.

Source: TechCrunch

What will this mean for content?

The trouble with snippets is that the places this might hit hardest are those which invest more in the creation of their content. Or in other words, the content which has higher editorial standards. If you’re a journalist, someone needs to be paying you to write the content, an editor needs to be paid to sub the content, designers and photographers paid for graphics and images.

So the content which stands to lose the most is arguably the most important, whilst the lower quality, recycled, poorly researched and quickly written content, which needs to generate less revenue as it costs so much less to churn out, remains profitable.

This creates a vicious cycle: as more low-quality content is produced it captures a greater share of the audience, so higher-quality content is increasingly drowned out, earns less revenue and diminishes even further.

Mobile users want information delivered more quickly and concisely. We have shorter user journeys on mobile, with less time on site and a higher bounce rate. Capturing these visitors with properly optimized content is important, since mobile is a key part of Google’s revenue and Google continues to dominate the mobile search market.

It does also mean, however, that the user is less likely to visit a site which is funded by ads – ironically, ads likely served through Google’s own platform. However, Google might be less concerned about this, depending on how much importance they are placing on their Home Assistant product. The snippets are used by the AI to provide answers to your questions. Ads don’t factor into this, and they could perhaps have calculated that they stand to gain more from better information here than they lose from fewer ads served on those sites.

ABCO Technology teaches a comprehensive course for search engine optimization. Call our campus between 9 AM and 6 PM Monday through Friday at: (310) 216-3067.

Email your questions to: info@abcotechnology.edu

Financial aid is available to all students who can qualify for funding.

ABCO Technology is located at:

11222 South La Cienega Blvd. STE #588
Los Angeles, Ca. 90304

 

Learn to build highly visible webpages today!
