6.7 million job openings and only 6.4 million people to fill them, US Department of Labor says

The US Department of Labor announced on June 6, 2018 that the United States had 6.7 million job openings and only 6.4 million workers to fill them.

The reason for this wide gap is that candidates simply don't possess the right qualifications. The announcement went on to say that three million of these job openings were in the field of information technology.

This article will focus on the specific areas of information technology where these job openings exist.

Network administration

Websites, newspapers and magazines are filled with ads for qualified network administrators. Companies are looking for candidates who can work both in the cloud and with on-site servers. The problem many organizations face is that the candidates who apply for these positions do not have the right qualifications. Network administrators need three basic certifications: CompTIA A+, the Microsoft Certified Solutions Expert (MCSE) or Microsoft Certified Solutions Associate (MCSA), and the Cisco Certified Network Associate (CCNA).

The certifications mentioned above cover repairing computers, administering a corporate server, and installing and configuring routers.

The certifications listed above can be obtained in less than one year.

Cyber Security

The next area experiencing high demand is cyber security. Every network is open to some type of cyber-attack. Cyber security professionals defend against these attacks and can also repair a network that has been damaged by a cybercriminal.

The Certifications required for cyber security are:

CompTIA A+

CompTIA Network+

Cisco Certified Network Associate

CompTIA Security+

Certified Ethical Hacker

Linux Fundamentals

The certifications listed in this program are completed in less than one year.

Database administration is another area in great demand. In this role, employees are paid to organize, secure and back up a company's information. The two popular database classes to take are Microsoft and Oracle.

A student can receive the database certification in less than one year.

Computer programming is another area in high demand. Computer programmers are needed for gaming, application development and website back-end functionality.

Computer programmers will receive their certification in one year.

Finally, web developers are needed. Pew Research found that 75% of small businesses with fewer than five employees don't have a functional website. If you become a web developer, you can be the person who builds it for them.

A career in information technology is only a telephone call away.

Call ABCO Technology between 9 AM and 6 PM Monday through Friday at (310) 216-3067.

Email all questions to: info@abcotechnology.edu

Financial aid is available to all students who can qualify for funding.

ABCO Technology is located at:

11222 South La Cienega Blvd. STE #588

Los Angeles, CA 90304

 

Get one of those three million jobs. Get certified for information technology today!

Anyone can alter Google’s search results. This article will explain how.

In 2012 Google introduced what it calls the Knowledge Graph. The change announced Google's use of three main sources of information to supply knowledge for search results: Wikipedia, the CIA World Factbook and Freebase. Wikipedia appears to be the dominant source, and Google's dependence upon Wikipedia is what caused the following incident.

House Majority Leader Kevin McCarthy (R-Calif.) on Thursday went after Google for displaying “Nazism” as one of the ideologies of the California Republican Party.

A search on the site for “California Republican Party” apparently returned with a sidebar result listing Nazism as an ideology alongside “conservativism” and “market liberalism.”

McCarthy noted the sidebar in a tweet at the company.

“Dear @Google, This is a disgrace #StopTheBias,” McCarthy tweeted, accompanied by a screenshot showing Nazism listed among the California Republican party’s ideologies alongside values like “conservatism” and “market liberalism.”

Ideologies associated with the California GOP are no longer visible in the sidebar on Google’s results page when users search “California Republican Party” or in similar searches.

The sidebar, which Google calls the “knowledge panel,” is often populated by content from Wikipedia; however, no mention of Nazism is visible on the California GOP’s page there. The party’s Wikipedia page’s edit history, though, shows that “Nazism” was briefly added to the page on Thursday.

Google blamed the result on online “vandalism” from outside the company that slipped through its safeguards.

“This was not the result of any manual change by anyone at Google. We don’t bias our search results toward any political party. Sometimes people vandalize public information sources, including Wikipedia, which can impact the information that appears in search,” a Google spokesperson said in a statement.

“We have systems in place that catch vandalism before it impacts search results, but occasionally errors get through, and that’s what happened here. This would have been fixed systematically once we processed the removal from Wikipedia, but when we noticed the vandalism we worked quickly to accelerate this process to remove the erroneous information,” they added.

The mistake comes in advance of California’s statewide primary elections, which are set for next week.

Other conservatives, including President Trump, expressed outrage over the Nazism listing.

Anyone can edit a Wikipedia article. ABCO Technology’s web development program will teach you how. If you are interested in learning advanced website techniques for promoting your business online, call our campus. You can reach us by telephone between 9 AM and 6 PM Monday through Friday at (310) 216-3067.

Email all questions to: info@abcotechnology.edu

Financial aid is available to all students who qualify for funding.

ABCO Technology is located at:
11222 South La Cienega Blvd. STE #588
Los Angeles, CA 90304

 

Build engaging and highly visible webpages today!

Network administrators get high paying jobs with CompTIA Security+ listed on their resume.

Are you a US veteran, a network administrator or a person who allowed their certifications to lapse? Would you like to gain a big advantage over your IT competition?

If you answered yes to these questions, the CompTIA Security+ certification is for you.

The CompTIA Security+ is gaining popularity across the United States. Over two million companies now list the CompTIA Security+ as a major requirement for network administrators.

The surge in demand for this certification comes from the massive increase in cybercrime. Cyber-attacks are a normal part of a network administrator’s business day. These attacks usually target the corporate server, routers and individual workstations, and in some cases hackers have gained access through a device as simple as a printer or fax machine.

On May 15, 2018, banks in Mexico admitted that cyber-criminals had stolen twenty million dollars through hacking. Money transfers had to be suspended for 24 hours while the crimes were sorted out. One clear result came from this attack: more cyber security professionals were hired.

The US government requires companies that contract with it to have a person on staff who is certified in cyber security. The last thing the government wants is for important information to fall into the wrong hands. Leaks dominate the news today, and cyber security is one way to stop vital information from reaching the media. The CompTIA Security+ is the certification most recognized by the US government.

What is needed to fight these attacks is a qualified person with certified cyber security skills. The CompTIA Security+ is the credential that proves to employers that you can defend their network against a cyber-attack.

The CompTIA Security+ is not a standalone certification. To receive maximum benefit from it, network administrators should already hold the CompTIA A+, Microsoft’s MCSE and the Cisco CCNA; if the MCSE isn’t possible, the CompTIA Network+ is sufficient preparation for the CompTIA Security+.

The CompTIA Security+ course is completed in 78 hours. In class, you will learn the critical strategies you can employ to harden your network against all kinds of cyber-attacks.

In addition, the US Department of Defense recognizes the CompTIA Security+ as major proof of cyber security competence. In fact, the CompTIA Security+ is a key certification under the Department of Defense workforce directive known as DoD 8140.

Veterans, persons working in network administration and candidates who have various degrees in the field of information technology will benefit from this information packed certification.

After completing the CompTIA Security+, many of our students move on to the Certified Ethical Hacker, which provides the tools you will need for network penetration testing and defense.

You will receive access to over 600 tools, which will enable you to defend your organization against a cyber-attack. Cyber-security is big business in 2018. Students who train and become certified will be in demand. Jobs for cyber security professionals are listed in all 50 states. If you obtain this certificate, you can choose where you want to live. Most important, you will know that you have the earning power to receive a solid income so you can buy that house or car you’ve always wanted.

If you are interested in learning about cyber security, contact ABCO Technology.

Reaching our campus by telephone is easy. Call Monday through Friday between 9 AM and 6 PM. Our phone number is (310) 216-3067.

Email your questions to info@abcotechnology.edu

Financial aid is available to all students who qualify for education funding.

ABCO Technology is located at:

11222 South La Cienega Blvd. STE #588

Los Angeles, CA 90304

 

Become a cyber-security certified professional today!

Another day, another cyber-attack.

Major restaurant chain hacked again, more cyber security needed

Chili’s, a large restaurant chain, was hit with a major cyber-attack in 2018.

The chain reports the attack occurred between March and April 2018.

Do you believe your credit card information is safe when you eat out? If so, think again! Chili’s parent company trades under the stock symbol EAT. Well, the restaurant chain sure ate it on this one.

Between March and April of 2018, Chili’s restaurants were hit by a data breach that may have compromised some guests’ payment card information. The breach was discovered last Friday, according to a press release by Brinker International, which owns the 1,600+ location chain.

Per Chili’s press release:

♬ Hey you got maybe hacked, maybe-hacked, maybe-hacked ♬

♬ Hey you got maybe hacked, maybe-hacked, maybe-hacked ♬

♬ Chili’s got hacked, baby, hacked ♬

Ok, I’m going to level with you. That wasn’t the press release. And I will probably never get a PR job with Chili’s. But seriously, how could you even be mad at that?

Back to the real press release, though. Here is what happened:

On May 11, 2018, we learned that some of our Guests’ payment card information was compromised at certain Chili’s restaurants as the result of a data incident. Currently, we believe the data incident was limited to between March – April 2018; however, we continue to assess the scope of the incident. We deeply value our relationships with our Guests and sincerely apologize to those who may have been affected.

Chili’s immediately launched into its response plan and is currently working with third-party forensic experts to investigate exactly what happened.

Based on the information currently available, it appears as though malware was used to collect payment card information, specifically credit and debit card numbers and cardholder names, used during in-restaurant purchases at affected restaurants.

Chili’s does not collect certain personal information (such as your social security number, your full date of birth, or federal or state identification number). Therefore, this personal information was not compromised.

Chili’s suggests that all customers monitor their bank accounts for any unauthorized transactions. So far the complete scope of the breach is not known, including how many restaurants were affected and how many customers had their data compromised. The restaurant will continue to issue updates as they become available.

Overall, Chili’s handled this situation with aplomb. We’ve reached a point where, for big companies, data breaches aren’t a matter of if, they’re a matter of when. Chili’s had a contingency plan in place, it disclosed the breach within 72 hours of its discovery (something that would make GDPR compliance specialists proud) and the company is communicating clearly and giving customers actionable advice to help ensure that they aren’t negatively impacted.

All in all, at least externally, Chili’s has handled itself fine here. Besides, a data breach is not what keeps successful chain restaurants awake at night. Millennials are the prime source of stress.

It is important for Chili’s to ensure that customers continue dining at its restaurants. To rebuild trust, the chain will be forced to spend more money on cyber security. That’s where readers of ABCO Technology’s articles will benefit. If you want a promising career in cyber security, contact ABCO Technology. You can reach our campus between 9 AM and 6 PM Monday through Friday at (310) 216-3067.

Email all questions to: info@abcotechnology.edu

Financial aid is available to all students who qualify for funding.

ABCO Technology is located at:

11222 South La Cienega Blvd. STE #588

Los Angeles, CA 90304

Cyber-crime is growing faster than professionals can be hired. Train and certify for a cyber security career today!

Mobile vs desktop web pages have different search engine rankings

As if we didn’t have enough to think about with respect to any given SEO campaign, it is now imperative to separate and refine your approaches to mobile and desktop search.

While mobile has grown to 70% of all searches over the past five years, that shouldn't come at the neglect of desktop. Although SEO for mobile and desktop follows the same basic principles and best practices, there are nuances and discrepancies that need to be factored into your overall ranking strategy.

Part of this is keyword rankings: you won't know how to adapt your strategies if you're not tracking rankings separately for each. Research from BrightEdge found that 79% of listings have a different rank on mobile devices compared with desktop, and the top-ranking result for a query is different on desktop and mobile 35% of the time. These are statistics that simply cannot be ignored.

Why are they different?

Before delving into how to compare keyword rankings on mobile and desktop, it's important to acknowledge the why and the what: why they rank differently and what it means for your SEO strategy.

It’s paramount to understand that desktop and mobile searches use different algorithms. Ultimately, Google wants to provide the best user experience for searchers, whatever device they are using. This means creating a bespoke device-tailored experience and in order to do that, we need to delve deeper into user intent.

It’s all about user intent

The crux of the mobile versus desktop conundrum is that user intent tends to differ for each device. This is particularly important when considering how far along the funnel a user is. It’s a generalization, but overall mobile users are often closer to the transactional phase, while desktop users are usually closer to the informational phase.

For example, we can better understand user intent on mobile by understanding the prevalence of local search. If a user is searching for a product or service on mobile, it is likely to be local. In contrast, users searching for a product or service on desktop are more likely to be browsing non-location-specific ecommerce sites.

Let’s also consider the types of conversions likely to occur on each device, in terms of getting in touch. Users on mobile are far more likely to call, by simply tapping the number which appears in the local map pack section. Alternatively, desktop users would be more inclined to type an email or submit a contact form.

What on earth is a micro-moment?

To better understand the different ways in which consumers behave, it may help to spend a little time familiarizing yourself with micro-moments. These refer to Google's ability to determine a searcher's most likely intent, and they are particularly important on mobile, where a consumer often needs to take immediate action.

For example, if a user is searching for a local product or service, the local map pack will appear, but if they are searching for information then the quick answer box will appear. These micro-moments therefore have a significant impact on the way the SERPs are constructed.

Once you’ve understood the user intent of a given searcher, you can ensure that you are providing content for both mobile and desktop users. However, it’s worth bearing in mind that content with longer word counts continues to perform well on mobile, despite the general consensus that people on mobile simply can’t be bothered to consume long form content. This harks back to Google’s prioritization of high quality content. Besides, anybody who has a long train commute into work will understand the need for a nice, long article to read on mobile.

Rankings tools

With that context, we can now return to the matter at hand: rankings. Of course, you could record the rankings for both desktop and mobile the old-fashioned way, but who has time for that? In short, any good SEO tool worth its salt will enable you to track both desktop and mobile rankings separately. Here are some favorites:

◾SEMRush is a personal favorite among the plethora of fancy SEO tools. SEMRush provides a comprehensive breakdown of mobile vs desktop results (as well as tablet if you really want to geek out) and displays the percentage of mobile-friendly results for your domain.

◾Search Metrics offers Desktop vs. Mobile Visibility metrics, detailing individual scores for desktop and mobile, as well as overlap metrics which show how many keyword search results appear in exactly the same position for both. You can also drill down further to view how a website performs with regard to localized results.

◾Moz. Through Moz Pro, you can track the same rankings metrics for both desktop and mobile. Filter by labels and locations to dig further into the data.

◾Google Search Console. Don't have access to any of the above tools? Don't panic, as you can still rely on the trusty Google Search Console. When looking at your search analytics, filter devices by comparing mobile and desktop. Even if you do have access to an SEO tool that allows you to do comparison analysis, it's still worth checking in on your Search Console insights.

Rankings are only part of your overall page strategy.

It’s important to remember that rankings are only a small part of the picture; it’s essential to take a more holistic approach to the mobile vs desktop issue. This means taking the time to dig around Google Analytics and unearth the data and meaning beyond the vanity metrics.

You may have higher rankings for mobile, but those users might be bouncing more regularly. Is this a reflection of user intent, or is it a poor user experience? Do higher rankings for one device correlate with higher conversions? If not, then you need to consider the reasons for this. There’s no one-size-fits-all answer, so you must take a tailored approach to your search engine strategy.

ABCO Technology teaches a comprehensive course for web development. Call our campus between 9 AM and 6 PM Monday through Friday. You can reach our campus at: (310) 216-3067.

Email all questions to: info@abcotechnology.edu

Financial aid is available to all students who qualify for funding.

ABCO Technology is located at:

11222 South La Cienega Blvd. STE #588

Los Angeles, CA 90304

 

Learn to build highly visible webpages today!

Google improves mobile indexing

Google announced yesterday that after a year and a half of testing it was beginning a wider rollout of its mobile-first indexing and had started migrating sites that follow the best practices for mobile-first indexing.

Google started to move a small number of sites over late last year, but this is the first announcement of what seems to be a larger scale move.

Sites which are migrating will be notified via a message in Search Console.

Google said site owners would see a significantly increased crawl rate from the smartphone Googlebot, and that Google would show the mobile version of pages in search results and in Google's cached pages. For sites which have AMP and non-AMP pages, Google said it would favor the mobile version of the non-AMP page.

Google moved to reassure site owners who are not included in this rollout that rankings would not be affected and that sites which only have desktop content would still be indexed.

“Sites that are not in this initial wave don’t need to panic. Mobile-first indexing is about how we gather content, not about how content is ranked. Content gathered by mobile-first indexing has no ranking advantage over mobile content that’s not yet gathered this way or desktop content. Moreover, if you only have desktop content, you will continue to be represented in our index.”

Source: https://webmasters.googleblog.com/2018/03/rolling-out-mobile-first-indexing.html

However, the push towards mobile-friendly sites continues, with Google noting that mobile-friendly content can perform better and that slow-loading content will count against pages in mobile search rankings from July.

You can find more information about best practices for mobile-first indexing in Google’s developer documentation.
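
One common way to follow those practices is a responsive design, where a single set of HTML serves both phones and desktops. The sketch below is a minimal illustration of that idea, not an excerpt from Google's documentation; the title, description and content are placeholders.

    <!DOCTYPE html>
    <html lang="en">
    <head>
      <!-- The viewport tag tells mobile browsers to lay the page out at the
           device's width, the starting point of a mobile-friendly page. -->
      <meta name="viewport" content="width=device-width, initial-scale=1">
      <title>Example Service | Example Company</title>
      <!-- With a responsive design there is only one version of the page, so the
           content, titles, metadata and structured data that Googlebot Smartphone
           crawls are the same ones desktop users see. -->
      <meta name="description" content="Placeholder description of the page.">
    </head>
    <body>
      <h1>Example Service</h1>
      <p>The same primary content is served to every device.</p>
    </body>
    </html>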

Google says it will continue to have one single index and there won’t be a mobile-first index separate from the main index. Historically it was the desktop version that was indexed but increasingly Google will now be using the mobile version of content, responding to the growth in use of mobile devices.

See Google’s blog post for full details.

ABCO Technology teaches a comprehensive course for mobile web site development. Call our campus between 9 AM and 6 PM Monday through Friday. You can reach us at:

(310) 216-3067.

Email all questions to: info@abcotechnology.edu

Financial aid is available to all students who qualify for federal funding.

 

Build highly visible mobile web sites today!

Database jobs increasing rapidly

The amount of the world’s information has doubled since 2013. As the amount of stored data continues to increase, the career of database administrator gains in importance.

The term database is basically defined as a collection of information. The collection can be as simple as an old phone book or as complex as the unstructured data used in data mining, which is becoming more important as technology advances.

The position of database administrator has been growing rapidly since the early eighties, when Ashton-Tate introduced one of the first commercial database systems for companies.

In 2018, two primary certification tracks will prepare database administrators for a fulfilling career: Microsoft and Oracle.

Let’s begin by looking at the Microsoft certification for database administrators. If you have little experience working with databases, the Microsoft certification may be just the right one for you. Microsoft has gained a 90% market share through its networking products, and its database is widely used in smaller companies with fewer than 1,000 employees. Microsoft offers plenty of technical support, and its database will communicate with Oracle and the open-source MySQL. The Microsoft database administrator certification is completed in six months when students attend courses full time. Microsoft’s database, SQL Server, uses SQL (Structured Query Language), the common language of all major databases. Students who earn the Microsoft database certification receive a Microsoft MCSE for databases. Microsoft’s database curriculum is well structured, and many students continue training afterward to obtain their Oracle database certification.

The second certification, intended for more experienced database professionals, is the Oracle database administrator. The Oracle database is used at large companies that have vast amounts of data and many employees accessing the same records at the same time. For example, the Los Angeles Unified School District uses Oracle, as do major hospitals, Fortune 500 companies, the US Department of Defense and the US Department of Homeland Security. Oracle is a bit more complicated to learn, so some knowledge of programming and of how the technology operates is a major plus. The Oracle database administrator program is completed in eight months. Students need to pass three exams for the Oracle Certified Professional (OCP) credential. Many job openings for Oracle database administrators are listed on major job sites, including Indeed and Monster.

A college degree isn’t a requirement for either Microsoft or Oracle certifications. Companies are looking for a professional who can use the software and will not require hours of training. Oracle and Microsoft certifications let a prospective employer know that you are fully trained and can use the software.

ABCO Technology teaches a comprehensive program for either Microsoft or Oracle database administrator. You can reach our campus by telephone at: (310) 216-3067 from 9 AM to 6 PM Monday through Friday.

Email your questions to: info@abcotechnology.edu

Financial aid is available to all students who can qualify for funding.

ABCO Technology is located at:

11222 South La Cienega Blvd. STE #588

Los Angeles, CA 90304

 

Database administrators are in demand. Get trained and certified for this career opportunity today!

Facebook in competition with Google for Local search

When you think local SEO, you think Google. But another big name has been making some moves lately to enter the conversation, and that’s Facebook.

In the past few years, Facebook’s made a lot of strides to become a real player in local search, improving their search results to the extent that they rival Google’s. Meanwhile, Google has made investments in Google My Business to justify business owners devoting time to it instead of treating it like a defunct social media listing.

Both of these trends bode well for the impact of search on social, and of social on search.

Let’s review some of the most recent changes in local SEO from Google and Facebook.

Prioritizing local news for community engagement

At the end of January 2018, shortly following Mark Zuckerberg’s announcement that Facebook would be demoting content from brands and publishers in favor of content from family, friends, and groups, the social media giant announced that it was tweaking its algorithm to also highlight local news in the News Feed.


The focus on aggregating and finding local news indicates Facebook plans to double down on local search. The more it can pick up on local search signals, the better it can provide hyper-localized, relevant content for its users. Consumers increasingly expect more personalization, and assuming (as Facebook does) that there is a correlation between personalization and hyper-localized content, this change will make the platform a more valuable source of information for its users. The more valuable the information on the platform, the more likely that user base is to stick with it, using it as both a local search engine and a place for updates on friends and family. Let the advertising dollars roll in.

In his announcement, Zuckerberg said,

“People constantly tell us they want to see more local news on Facebook.”

Apparently Facebook wasn’t the only one listening, as earlier that same week, Google launched its own local news app.

Currently only available in two cities, the free Google Bulletin app lets users post news updates and upload photos and video about events and happenings in their area. The app essentially combines the social community features of Nextdoor with the You Report It feature many local news sites rely on to crowdsource content.


With Bulletin, Google may well be hoping to encourage users to visit it first as the source of immediate information, instead of turning to Facebook as people so often do during an emergency or to find a local event.

Google Bulletin and Facebook’s prioritization of local news are also a strong response to the pressure both companies received for disseminating fake news during the 2016 U.S. election. Both are making the same assumption – that hyperlocal necessarily means more relevant and, since it’s coming from news sources, more trustworthy.

However, both initiatives are in their early days and their assumptions don’t seem fully fool-proof. Facebook’s algorithm currently determines something is local news by noting the domain, and then seeing whether users from a concentrated geographical area engage with the content – a setup which should be fairly easy to game. Meanwhile, there’s currently no vetting process on Google Bulletin that would prevent users from uploading inaccurate information.

Crowdsourcing content to inform business listings

Besides news sources, both Facebook and Google are relying on crowdsourced information to complete, categorize, and rank the business listings in their database. On either platform, users can add places, update address information and hours, write reviews, and answer questions about the business. Then, the platform uses this information to determine the most relevant result based on a searcher’s query, their location, and even local time.

Both Google and Facebook provide robust results that display helpful attributes sourced by user reviews, ratings, and busy times.


Facebook also includes additional filters based on whether your friends have also visited a place – bringing the social into search.


Facebook’s City Guides do the same at a macro-level, providing trip planning for various large cities around the world, and showing the top places your friends as well as locals have explored.


Launched in November 2017, the Facebook Local tab incorporates local event results along with the business listings and displays which of your friends are attending. This hyper social aspect, as opposed to hyperlocal, is a unique differentiator that gives Facebook real value as a local search engine.

To its credit, Google has been working on ways to make its own search results more social. One of the biggest changes they introduced to Google My Business in 2017 was the Q&A feature. Users can click a button to ask questions about a business, which are then available to be answered by anyone, including the business itself, as well as local guides, regular Google users, and even competitors.


The fact that anyone can answer leads to misinformation, or less than helpful information as in the last example shown above (“Depends what you order”). Google’s attempt to introduce social discussion to their local business listings shows a singular lack of foresight similar to their failure to include a vetting process with Bulletin.

In their defense, Google may be dealing with information overload. Each month, 700,000 new places are added to Google Maps. Google has turned to users to help, but it has needed to incentivize them to get the information it needs, rather than crowdsourcing it as Facebook has successfully done with Facebook Local. The more users answer questions on Google, upload photos, and edit business information, the more points they earn toward the Local Guide designation – which they can exchange for early access to Google initiatives, exclusive events, and real monetary benefits like free storage on Google Drive.

Helping businesses convert users from their listings

We’ve been a bit hard on Google in the previous sections, but that’s about to change. Last year, Google also introduced Posts for Google My Business. Google Posts, as opposed to regular posts on a Google+ page, allow businesses to update their listing with info that appears in the SERP along with their Knowledge Panel.


Posts let business owners promote new products, upcoming events, or simply useful information such as special holiday hours. Early studies indicate that engaging with Google Posts on a frequent basis can positively impact rankings – which may be an indication that Google is using a social feature as a search ranking factor.

Both Google and Facebook have also introduced CTA buttons businesses can add to their profiles, easing conversion from the SERP or social platform. Google users can book appointments with fitness and wellness-focused businesses directly from the SERP. Again, Facebook has outpaced Google here, since they offer seven CTA options which serve a variety of business needs: Book Now, Contact Us, Use App, Play Game, Shop Now, Sign Up, or Watch Video.

The convergence of local search and social

When you think about it, Facebook is the only business that could feasibly take on Google in the world of search. Its 2+ billion monthly users are a formidable counterweight to Google’s 95% market share of mobile search. While Google has access to email, Facebook has access to social profiles. Both companies have access to an incredible amount of demographic information on their users.

Which will reign supreme in the realm of local search is yet to be decided, although Facebook is giving Google a real run for their money thus far. Facebook’s local search results have become smarter, while Google’s attempts to incorporate social into search seem clumsy at best.

Likely, what we’ll ultimately see is a merging of local search and social as the two platforms meet somewhere in the middle.

If you are interested in using local search and social media as a powerful set of marketing strategies for your webpages, contact ABCO Technology. You can reach our campus between 9 AM and 6 PM Monday through Friday. Call today at (310) 216-3067.

Email your questions to: info@abcotechnology.edu

Financial aid is available to all students who qualify for funding.

ABCO Technology is located at:

11222 South La Cienega Blvd. STE #588
Los Angeles, CA 90304

Use social media for your web success.

Google updates its snippets

On December 1st, 2017, Barry Schwartz reported on Search Engine Land that Google had officially confirmed a change to how it displays text snippets in Google’s Search Engine Results Pages (SERPs).
Barry wrote,

“A Google spokesperson told us: ‘We recently made a change to provide more descriptive and useful snippets, to help people better understand how pages are relevant to their searches. This resulted in snippets becoming slightly longer, on average.’”

These snippets are the blurbs of text displayed in Google’s SERPs along with the clickable blue text and the page URL.

A quick Google search corroborates this – let’s use the query “how were the pyramids built” as an example:

In the results for this general query, you can see that where Google would previously have displayed a snippet of approximately 150-165 characters including spaces (give or take – it varied before Google made the change, and it still varies now), the snippets are now much longer.

The text snippet Google shows in the SERP is *supposed* to be (more on this later) the contents of the meta description tag in the HTML of the page – let’s check each of these pages’ actual meta descriptions and their lengths.

Here they are, in the same order as above:

◾There are no photographs of the pyramid being built, and the engineers didn’t leave detailed blueprints. [Length:109]

◾The ancient Egyptians who built the pyramids may have been able to move massive stones by transporting them over wet sand. [Length:122]

◾No meta description specified in the HTML

◾No meta description specified in the HTML

◾Here’s everything you need to know about the incredible Egyptian pyramids. [Length:74]

Two things jump out right away.

1.Google is not displaying the page’s actual meta description as the SERP snippet for these specific listings for this specific query, even when the meta description is specified in the HTML, but is instead pulling the snippet directly from the text that appears at or near the top of the page.

2.The snippets are longer than those Google previously displayed, congruent with Google’s confirmation that they’re showing longer SERP snippets.

Here’s how that breaks down for the above query, again in the same order as the SERP listing screenshot above:

◾The first sentence of the text is used as the SERP snippet

◾The first sentence of the text is used as the SERP snippet

◾The H1 page headline, followed by ellipses ( … ), followed by the second, third, and fourth sentences on the page in the first paragraph (skipping the first sentence in the first paragraph) are used as the SERP snippet.

◾The first and second sentences, and part of the third, are used as the SERP snippet

◾The first and second sentences, the image ALT attribute (or the image caption – they’re both the same text), plus text from HTML code associated with the image, are used as the SERP snippet.

Checking a number of other queries returned similar observations about what Google is using as the SERP snippet, but note that some SERP snippets were indeed taken from the actual meta description.

For example, in the SERP for a query for musician “Todd Rundgren”, this SERP snippet is obviously taken directly from the meta description:

For many other queries I performed, both commercial and non-commercial in intent, I found a mix of SERP snippet sources – primarily either text on the page or the actual meta description specified in the HTML, in some cases the image ALT attribute, and occasionally some other bit of code in the HTML.

On mobile devices, the SERP snippets were very similar, in many cases the same as on desktop.

The SERP orders were slightly different, so yes, there’s going to be ranking variations based on various factors (it’s well known that Google can and will alter the SERPs you see based on your search history, geo-location, query type, your previous interaction with SERPs, etc.).

However, the overall scheme of the SERP snippets remained constant – text was taken mostly from either the first paragraph of the page, or the meta description, and in some cases the image ALT attribute, and occasionally from other text in the HTML code.

Dr. Pete Meyers over at Moz conducted research late last year on 89,909 page-one organic results.

Pete noted that the average SERP snippet was 215 characters long with the median length at 186, and he was quick to point out that, “big numbers are potentially skewing the average. On the other hand, some snippets are very short because their Meta Descriptions are very short”.

Pete also noted no significant differences between desktop and mobile snippet lengths, sometimes seeing mobile snippets longer than desktop snippets.

The actual SERP snippet you see, and its length, will certainly vary by query type.

What is going on here?

Google is trying to satisfy searchers.

Yes, traditionally the idea was that Google would pull the SERP snippet from the meta description, but for years now Google has been using whatever text its algorithms determine makes the most sense based on the user’s query.

Some sites – for example, Wikipedia and another we saw above – don’t even make use of the meta description tag in the HTML of their pages, so what’s a poor search engine to do in that case?

Similarly, what if the meta description is badly written, or spammy-sounding with lots of keyword stuffing, or doesn’t reflect the page’s theme and topic(s) well?

So that’s what’s going on here – Google evolved over time to use whatever it deems makes the most sense to a user performing a certain query.

Wait: What the heck is a meta description, anyway?

A meta description is an HTML tag that Google understands and that is meant to provide a synopsis of the page.

Here’s an example:
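
(The description text below is placeholder wording, not taken from any real page.)

    <head>
      <!-- The meta description is a single tag in the page's head; only the
           text in its content attribute can ever appear as the SERP snippet. -->
      <meta name="description" content="Learn how meta descriptions work, why Google sometimes rewrites them, and how to write ones that earn the click.">
    </head>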

This code goes between the <head> tags of the HTML and is not displayed in the visible content that a user sees.

Do meta descriptions impact SEO?

Meta descriptions will not impact rankings.

But, if Google does use a page’s meta description as the SERP snippet, that can impact click-through from the SERP.

That’s because a well-written meta description that is compelling, relevant to the page, and relevant to the query or queries for which the page is ranking, can impact organic traffic.

And that can have a downstream impact on conversions (the desired actions you want website visitors to take – fill out a form, buy something, and so on).

Poorly written meta descriptions, if used as the SERP snippet, can have the opposite effect, discouraging users from clicking through to your page and sending them to your competitors instead.

So, what should be your strategy now that Google has increased the SERP snippet length?

In summary, you could do any of the following:

◾Do nothing at all

◾Rewrite longer meta descriptions for all your pages

◾Rewrite longer meta descriptions for some of your pages (e.g. your top ten or twenty organic landing pages, or some pages you determine have low click-thru rates)

◾Delete all your meta descriptions

◾Audit your site’s content to ensure that the first text on each page is compelling and uses keywords congruent with how someone would search for your content; ensure the first paragraph contains at least 300-350 characters of text including spaces, and front-load the first 150 characters in case Google changes back to shorter snippets in the future.

What you decide to do (or not do) will at least in part hinge upon resources you have available to make changes.

Don’t take a “set it and forget it” attitude with your website’s content and your meta descriptions. It’s common for businesses to put in a fair amount of work into their website, then just let it go stale.

A good recommendation here would be to cycle through this stuff on a regular basis – think quarterly or a couple times per year. Once per year at a minimum.

Here’s what I recommend

First, it should be obvious that your page’s textual content is for humans to consume, and that should always be your primary consideration.

You’ve heard the phrase “dance like no one’s watching” – well, write like Google doesn’t exist. But Google does exist, and their mission is satisfied users (so that people continue to use their service and click on ads) – Google is chasing satisfied users and so should you.

The refrain of “write great content” has been used ad nauseam. The only reason I’m mentioning the whole “write for your users” thing is simply because people often focus primarily on “how do I SEO my pages?” instead of “what’s good for my users?”.

Okay, with that out of the way and forefront in your mind, here’s what I recommend. Adjust this according to your specific needs – your industry, your users – don’t just take this as a cookie-cutter approach.

And, do this on the time frame that makes the most sense and works for you and the resources you have available to you to make changes to your site. If you haven’t looked at your page content and meta descriptions in a year or more, then this is a higher priority for you than if you refreshed all that 60 days ago.

Meta descriptions

◾Make them about 300-320 characters long, including spaces

◾Make the meta description super-relevant to the page text

◾Front-load the first 150-165 characters with your most-compelling text – compelling to your users who might see the text as a SERP snippet (just in case Google decides to shorten them again)

◾Use a call to action if applicable, but don’t be a used car salesman about it – and as appropriate, use action-oriented language

◾Remember WIIFM – what’s in it for me – as applicable, focus on benefits, not features

◾Don’t be deceptive or make promises your page content can’t keep

Keep in mind that Google may not use your meta description as the SERP snippet and may instead use content from your page, likely from the first paragraph.

With that in mind:

Review & refresh your content

◾Make sure the H1 page headline is super-relevant to the page’s topic

◾Include an image (as applicable) that is super-relevant to the page (not one of those dumb, tangentially-related stock images) and craft an excellent and page-relevant image ALT attribute

◾Ensure that your opening paragraph is enticing and practically forces the reader to keep reading – that way, if it’s the text used as the SERP snippet, it will capture people’s attention (a combined sketch of these elements follows this list).
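
Pulling those three bullets together, here is a rough sketch of how the top of such a page could be marked up, reusing the pyramids query from earlier in this article; the file name and wording are only illustrative.

    <head>
      <title>How Were the Pyramids Built?</title>
      <meta name="description" content="How the ancient Egyptians may have built the pyramids: the leading theories, the evidence behind them, and the questions that remain open.">
    </head>
    <body>
      <!-- H1 headline kept tightly on the page's topic -->
      <h1>How Were the Pyramids Built?</h1>
      <!-- A page-relevant image with a descriptive ALT attribute, since Google
           sometimes pulls ALT text into the snippet -->
      <img src="great-pyramid-giza.jpg"
           alt="Workers hauling a limestone block up a ramp at the Great Pyramid of Giza">
      <!-- An opening paragraph strong enough to stand alone as the snippet -->
      <p>Nobody left blueprints for the pyramids, but ramps, sledges and wet sand
         all feature in the leading theories. Here is what the evidence actually
         supports, and where the mystery still stands.</p>
    </body>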

Summary

My summary is that if you haven’t already, please go back and read the whole article – I promise you it’ll be worth it. But I will add one more piece here and that is that ostensibly the type of content you’re creating is going to dictate how you configure your meta descriptions, H1 page headlines, and especially the opening text on the page.

In some cases, it makes sense to use the “how to feed a (Google) hummingbird” technique where you pose the topic’s question and answer it concisely at the top of the page, then defend that position, journalism style, in the rest of the text under that.

Similarly, you may be shooting for a SERP featured snippet and voice-assistant-device answer using bullet points or a numbered list at the top of your content page.
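
As a rough sketch of that structure (the topic and steps below are invented purely for illustration), the top of such a page might look like this:

    <h2>How do you write a good meta description?</h2>
    <p>Keep it relevant to the page, front-load the most compelling words, and
       stay around 300 characters. In a little more detail:</p>
    <ol>
      <li>Summarize the page in one or two plain sentences.</li>
      <li>Put the most compelling phrase in the first 150 characters.</li>
      <li>Add a low-key call to action where it fits.</li>
    </ol>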

The point is, the guidelines and recommendations I’ve provided for you here are not a one-size-fits-all, cookie-cutter approach to your meta descriptions and your content. SEO experience, switching your brain into the on position, and a willingness to test, observe, and adjust are all mandatory to achieve the best results.

ABCO Technology teaches a comprehensive class for web development. Call our campus between 9 AM and 6 PM Monday through Friday at: (310) 216-3067.

Email your questions to info@abcotechnology.edu

Financial aid is available to all students who can qualify for funding.

ABCO Technology is located at:

11222 South La Cienega Blvd. STE #588
Los Angeles, CA 90304

 

Build highly visible webpages today

How to make a Google-proof website

Any SEO or webmaster who has ever had a website affected by a Google algorithm change – or feared being affected by one – has probably wished that they could find a way to make their website “algorithm-proof”.

Still, surely there’s no such thing as a website that’s never impacted by Google algorithms, right? As long as your site is indexed by Google, it’s at the mercy of the algorithms that Google uses to determine website ranking, all the more so if you happen to rely heavily on organic search traffic for your business.

The art – or science – of search engine optimization is about determining as best you can what those algorithms are looking for, and giving it to them.

Yet one website believes it has found the formula for making its content “Google algorithm-proof”. Ranker is a website made up of dynamic, crowdsourced lists that users can vote on, about everything from pop culture to geography, history to sports, celebrities to science.

And according to its CEO, Clark Benson, Ranker has never suffered a negative effect from a Google algorithm change, growing its traffic steadily without interruption over the course of eight and a half years.

ABCO Technology caught up with Benson to find out Ranker’s secret to success, and whether there is a formula for creating an algorithm-proof website.

Rankings, not review sites

So what is Ranker, exactly?

“Ranker’s primary reason for being is to crowdsource anything that makes sense to rank,” says Benson. “Any topic that people are really interested in.

“The unique angle that we’ve pursued is that instead of this being one 23-year-old blogger’s opinion of the best new TV shows of the year, or whatever it happens to be, we would have a dynamic list that visitors could vote on, potentially add items to, and re-rank. The end result is a very wisdom-of-crowds-based answer which is always changing and dynamically moving along as tastes change, and as more people vote on things.”

Voting on a list of ‘Historical events you most want to go back and see’ on Ranker

Lists have been a time-honored draw for magazines and other print media over the years, but it was when the internet came along that they really exploded – spawning dozens of list-oriented viral websites and the much-mocked listicle, which became a staple of online journalism. However, Benson – a self-described “lifelong list nerd” – was frustrated by the fact that these lists only ever represented one person’s opinion.

In a similar vein, he found review websites unhelpful, as user-generated reviews represented a single person’s subjective opinion in a format that wasn’t conducive to making a decision.

“Part of the reason to build Ranker was my frustration with review sites, because when I’m looking for an answer to something, like which TV show to watch, I don’t want to read a lot of text reviews.

“I also feel that in typical five-star rating systems, everything tends to be clustered around three and a half to four stars, so you don’t get any true granularity on what is best.”

In a world increasingly “cluttered with choices”, therefore, Benson was convinced that rankings were “the simplest way to dissect a choice in a category, without losing the credibility of the answer”. And so he built Ranker as a website where the wisdom of the crowd could determine the ultimate ranking for any list of items, on any topic.

The secret to Ranker’s SEO success: Content freshness

Since Ranker’s launch in 2009, the site has amassed more than 100,000 rankings across dozens of broad categories, encompassing almost any topic that people could have a passion for.

When the website first launched, however, it had very few resources, and Benson explains that he had to learn SEO from scratch in order to give the website a strong foundation.

Luckily, earning traffic was never a problem for the site, because the type of content published on Ranker was uniquely suited to catering to Google’s algorithms.

“We’ve never been hit by any algorithm changes – we’ve always grown our organic search traffic year over year over year, steadily, for the eight and a half years we’ve been live.

“You never exactly know what works in SEO, because Google doesn’t tell you what works, but I’ve always believed that the best intelligence on what to do comes from the public statements Google makes – their best practices.

“And one of the key factors that Google says is in their index is freshness of content. Content has a lifespan. In our case, because our rankings are dynamic and always changing – people are adding things to them, voting things up and down – this makes for perpetually fresh content.

“We have a lot of content that is six, seven, even eight years old that is still doing as well as it was years ago, and in some cases it’s even growing in traffic.”

One of Ranker’s most evergreen pieces of content is a list ranking the ‘Best Movies of All Time’ – which is more than 5,000 items long.

“Obviously that’s a topic that there’s a lot of passion and a lot of competition for [in search rankings]. And in the last few years, we’ve been on the top three or so results on Google for that term.

“We’ve watched that page just grow in rankings over the span of seven or eight years. I can only guess it’s because the page is always changing.”

User-curated content

At the time of writing this article, Ranker’s front page is spotlighting a list of the best-dressed celebs at the 2018 Oscars, a ranking of the best TV episode names, and a list of possible game-changing deep-space observations to be made by the Webb Telescope.

Anyone can add an item to a list on Ranker, although Ranker’s content is not purely user-generated. Ranker has an editorial team which is made up of people who, in Benson’s words, “have a mind for cataloging things” rather than people who specialize in writing a lot of prose.

Lists are typically started off by one of Ranker’s editors, and when a user wants to add a new item to a list, it’s cross-referenced with Ranker’s database, a huge data set made up of more than 28 million people, places and things. If the item isn’t found in the database, it’s added to a moderation queue.

Rather than UGC (user-generated content), therefore, Benson thinks of Ranker’s lists as something he terms UCC – user-curated content.

How did Ranker build such a huge data set? Beginning in 2007, a company called Metaweb ran an open source, collaborative knowledge base called Freebase, which contained data harvested from sources such as Wikipedia, the Notable Names Database, Fashion Model Directory and MusicBrainz, along with user-submitted wiki contributions.

This knowledge base made up a large part of Ranker’s data set. What’s interesting is that Freebase was later acquired by none other than Google – and is the foundation of Google’s Knowledge Graph.

Additionally, not every list on Ranker is crowdsourced or voted on. Some lists, such as Everyone Who Has Been Fired Or Resigned From The Trump Administration So Far, don’t make sense to have users voting on them, but are kept fresh with the addition of new items whenever the topic is in the news.

Can other websites do ‘Ranker SEO’?

Benson acknowledges that Ranker’s setup is fairly unique, and so it isn’t necessarily possible to emulate its success with SEO by trying to do the same thing – unless you just happen to have your own crowdsourced, user-curated list website, of course.

With that said, there are still some practical lessons that website owners, particularly publishers, can take away from Ranker’s success and apply to their own SEO strategy.


First and foremost: content freshness is king

As you’ve no doubt gathered by now, the freshness of Ranker’s content is probably the biggest contributing factor to its success in search. “We’re convinced that the dynamism of our content is what really lets it just grow and grow and grow in search traffic,” says Benson.

“While our approach is somewhat unique to the way Ranker works – we have a bespoke CMS that makes lists out of datasets – I’m positive that there are other ways to apply this kind of thinking.”

To put content freshness front and center of your content marketing efforts, make sure that your publication or blog is well-stocked with evergreen content. For those articles or posts that are more time-sensitive, you can still publish a refreshed version, or look for an up-to-date spin to put on the old content, for example linking it in with current events.

According to research by Moz, other factors which can contribute to a positive “freshness” score for your website as a whole include:

◾Changes made to the core content of your website (as opposed to peripheral elements like JavaScript, comments, ads and navigation)

◾Frequency of new page creation

◾Rate of new link growth (an increase in links pointing back to your site or page)

◾Links from other fresh websites, which have the ability to transfer their “fresh value” (Justin Briggs dubbed this quality “FreshRank” in 2011)

Internal links trump external links

Other than content freshness, Benson attributes Ranker’s SEO success to one other big factor: its intricate network of internal links, which Benson believes are far more valuable to SEO than an impressive backlink profile.

“I think a lot of people who are new to SEO focus too much on trying to get outside links, versus optimizing their own internal infrastructure,” he says.

“We have a very broad site with millions of pages – not just lists, but a page for every item that’s included in a list on Ranker, showing you where it ranks on all of our different lists.”

The Ranker page for Leonardo da Vinci

“We made the mistake early on of leaving all of those pages open to Google’s index, and we learned over time that some of them are very thin, content-wise. New links are added to them, but they’re thin pages. So we quickly adopted a strategy of noindexing the thinner pages on our site – so they have utility, but they don’t necessarily have search utility.

“We’ve really focused a lot on internal link structure and on interlinking our content in a very intelligent and vertical-driven, page-optimized way. We’ve put a lot of engineering and product resources towards building a robust internal link structure that can also change as pages become more valuable in search.

“Outside links are very important, but they’re increasingly difficult to get. If you have good, unique content, and a strong internal link structure, I think you can get by with far fewer backlinks. Ranker has a lot of backlinks – we’re a big site – but we’ve never tactically gone out to build backlinks. And we get more than 30 million organic search visits per month.”
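
The noindexing Benson mentions is typically done with a robots meta tag on each thin page; a minimal sketch of what that looks like (the page it sits on is hypothetical):

    <head>
      <!-- Ask search engines not to index this thin page while still letting
           them follow its links, so it keeps its internal-linking utility -->
      <meta name="robots" content="noindex, follow">
    </head>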

Think about how your content will appear to searchers

Benson emphasizes the importance of paying attention to basic on-site optimization like crafting good title tags and meta descriptions. These elements dictate how your website appears in the SERP to users when they search, and so will form the first impressions of your content.

“When it comes to creating new content, our editorial team definitely focuses on best practice with regards to title tags and meta descriptions – the basic stuff still applies,” says Benson. “Anyone doing editorial still needs to think about your content from the lens of the searcher.”

Optimizing for Google’s rich results and using Schema.org markup are additional ways that website owners can make sure that their website listing appears as attractive as possible to a searcher encountering it on the SERP.
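
As one illustration of what that markup can look like, an article page might embed Schema.org data as JSON-LD in its head; every value below is a placeholder rather than real data.

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "Example article headline",
      "description": "Placeholder summary of the article.",
      "image": "https://www.example.com/images/example.jpg",
      "author": { "@type": "Person", "name": "Example Author" },
      "datePublished": "2018-03-01"
    }
    </script>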

The future is psychographic

What plans does Benson have for the future of Ranker? Up to now, the site has been concentrating mostly on search and social distribution (Facebook is another big source of organic traffic), but it is now beginning to focus more on ad sales, media tie-ins and getting the brand name out there.

“We’re always focused on growing traffic, and we’re certainly investing a lot more into our brand,” says Benson.

However, the most exciting future project for Ranker is something called Ranker Insights – a psychographic interests platform which makes use of Ranker’s thousands of data points on what people are interested in and like to vote on.

Drawing connections between people’s interests on Ranker Insights

Big data on anything is extremely valuable in marketing, but big data on the things that people like is near enough invaluable – particularly in a world where psychographics (classifying people according to their attitudes, aspirations, and other aspects of their psychology) are increasingly more important than demographics (classifying people according to things like age, gender, race and nationality).

“The marketing world in general is steering a lot more towards psychographics rather than demographics,” says Benson. “Netflix doesn’t care what country you live in – when it comes to marketing or even recommendations, all they care about is your tastes. They stopped using demographics entirely years ago – and clearly they’re doing something right.

“We feel that in an interconnected world, what you like says at least as much about you as your age or your gender.

“And in a world where what you like tells people how to market to you and how to reach you, we have very, very granular, deep data on that front. There’s a lot of different applications for insights like this in a very data-driven world.”

Rebecca Sentance is the Deputy Editor of Search Engine Watch.


ABCO Technology teaches classes on building crowdsourced websites in our web development program. Call our campus between 9 AM and 6 PM Monday through Friday at (310) 216-3067.

Email your questions to: info@abcotechnology.edu

Financial aid is available to all students who qualify for funding.

ABCO Technology is located at:
11222 South La Cienega Blvd. STE #588
Los Angeles, CA 90304

 

Learn to build crowdsourced websites today!
