How to make a Google-proof website

Any SEO or webmaster who has ever had a website affected by a Google algorithm change – or feared being affected by one – has probably wished that they could find a way to make their website “algorithm-proof”.

Still, surely there’s no such thing as a website that’s never impacted by Google algorithms, right? As long as your site is indexed by Google, it’s at the mercy of the algorithms that Google uses to determine website ranking, all the more so if you happen to rely heavily on organic search traffic for your business.

The art – or science – of search engine optimization is about determining as best you can what those algorithms are looking for, and giving it to them.

Yet one website believes it has found the formula for making its content “Google algorithm-proof”. Ranker is a website made up of dynamic, crowdsourced lists that users can vote on, about everything from pop culture to geography, history to sports, celebrities to science.

And according to its CEO, Clark Benson, Ranker has never suffered a negative effect from a Google algorithm change, growing its traffic steadily without interruption over the course of eight and a half years.

ABCO Technology caught up with Benson to find out Ranker’s secret to success, and whether there is a formula for creating an algorithm-proof website.

Rankings, not review sites

So what is Ranker, exactly?

“Ranker’s primary reason for being is to crowdsource anything that makes sense to rank,” says Benson. “Any topic that people are really interested in.

“The unique angle that we’ve pursued is that instead of having this being one 23-year-old blogger’s opinion of the best new TV shows of the year, or whatever it happens to be, we would have a dynamic list that visitors could vote on, potentially add items to, and re-rank.

“The end result is a very wisdom-of-crowds-based answer which is always changing and dynamically moving along as tastes change, and as more people vote on things.”

Voting on a list of ‘Historical events you most want to go back and see’ on Ranker

Lists have been a time-honored draw for magazines and other print media over the years, but it was when the internet came along that they really exploded – spawning dozens of list-oriented viral websites and the much-mocked listicle, which became a staple of online journalism. However, Benson – a self-described “lifelong list nerd” – was frustrated by the fact that these lists only ever represented one person’s opinion.

In a similar vein, he found review websites unhelpful, as user-generated reviews represented a single person’s subjective opinion in a format that wasn’t conducive to making a decision.

“Part of the reason to build Ranker was my frustration with review sites, because when I’m looking for an answer to something, like which TV show to watch, I don’t want to read a lot of text reviews.

“I also feel that in typical five-star rating systems, everything tends to be clustered around three and a half to four stars, so you don’t get any true granularity on what is best.”

In a world increasingly “cluttered with choices”, therefore, Benson was convinced that rankings were “the simplest way to dissect a choice in a category, without losing the credibility of the answer”. And so he built Ranker as a website where the wisdom of the crowd could determine the ultimate ranking for any list of items, on any topic.

The secret to Ranker’s SEO success: Content freshness

Since Ranker’s launch in 2009, the site has amassed more than 100,000 rankings across dozens of broad categories, encompassing almost any topic that people could have a passion for.

When the website first launched, however, it had very few resources, and Benson explains that he had to learn SEO from scratch in order to give the website a strong foundation.

Luckily, earning traffic was never a problem for the site, because the type of content published on Ranker was uniquely suited to catering to Google’s algorithms.

“We’ve never been hit by any algorithm changes – we’ve always grown our organic search traffic year over year over year, steadily, for the eight and a half years we’ve been live.

“You never exactly know what works in SEO, because Google doesn’t tell you what works, but I’ve always believed that the best intelligence on what to do comes from the public statements Google makes – their best practices.

“And one of the key factors that Google says is in their index is freshness of content. Content has a lifespan. In our case, because our rankings are dynamic and always changing – people are adding things to them, voting things up and down – this makes for perpetually fresh content.

“We have a lot of content that is six, seven, even eight years old that is still doing as well as it was years ago, and in some cases it’s even growing in traffic.”

One of Ranker’s most evergreen pieces of content is a list ranking the ‘Best Movies of All Time’ – which is more than 5,000 items long.

“Obviously that’s a topic that there’s a lot of passion and a lot of competition for [in search rankings]. And in the last few years, we’ve been on the top three or so results on Google for that term.

“We’ve watched that page just grow in rankings over the span of seven or eight years. I can only guess it’s because the page is always changing.”

User-curated content

At the time of writing, Ranker’s front page is spotlighting a list of the best-dressed celebrities at the 2018 Oscars, a ranking of the best TV episode names, and a list of possible game-changing deep space observations to be made by the Webb Telescope.

Anyone can add an item to a list on Ranker, although Ranker’s content is not purely user-generated. Ranker has an editorial team which is made up of people who, in Benson’s words, “have a mind for cataloging things” rather than people who specialize in writing a lot of prose.

Lists are typically started off by one of Ranker’s editors, and when a user wants to add a new item to a list, it’s cross-referenced with Ranker’s database, a huge data set made up of more than 28 million people, places and things. If the item isn’t found in the database, it’s added to a moderation queue.

Rather than UGC (user-generated content), therefore, Benson thinks of Ranker’s lists as something he terms UCC – user-curated content.

How did Ranker build such a huge data set? Beginning in 2007, a company called Metaweb ran an open, collaborative knowledge base called Freebase, which contained data harvested from sources such as Wikipedia, the Notable Names Database, Fashion Model Directory and MusicBrainz, along with user-submitted wiki contributions.

This knowledge base made up a large part of Ranker’s data set. What’s interesting is that Freebase was later acquired by none other than Google – and is the foundation of Google’s Knowledge Graph.

Additionally, not every list on Ranker is crowdsourced or voted on. Some lists, such as Everyone Who Has Been Fired Or Resigned From The Trump Administration So Far, don’t make sense to have users voting on them, but are kept fresh with the addition of new items whenever the topic is in the news.

Can other websites do ‘Ranker SEO’?

Benson acknowledges that Ranker’s setup is fairly unique, and so it isn’t necessarily possible to emulate its success with SEO by trying to do the same thing – unless you just happen to have your own crowdsourced, user-curated list website, of course.

With that said, there are still some practical lessons that website owners, particularly publishers, can take away from Ranker’s success and apply to their own SEO strategy.


First and foremost: content freshness is king

As you’ve no doubt gathered by now, the freshness of Ranker’s content is probably the biggest contributing factor to its success in search. “We’re convinced that the dynamism of our content is what really lets it just grow and grow and grow in search traffic,” says Benson.

“While our approach is somewhat unique to the way Ranker works – we have a bespoke CMS that makes lists out of datasets – I’m positive that there are other ways to apply this kind of thinking.”

To put content freshness front and center of your content marketing efforts, make sure that your publication or blog is well-stocked with evergreen content. For those articles or posts that are more time-sensitive, you can still publish a refreshed version, or look for an up-to-date spin to put on the old content, for example linking it in with current events.

According to research by Moz, other factors which can contribute to a positive “freshness” score for your website as a whole include:

◾Changes made to the core content of your website (as opposed to peripheral elements like JavaScript, comments, ads and navigation)

◾Frequency of new page creation

◾Rate of new link growth (an increase in links pointing back to your site or page)

◾Links from other fresh websites, which have the ability to transfer their “fresh value” (Justin Briggs dubbed this quality “FreshRank” in 2011)

Internal links trump external links

Other than content freshness, Benson attributes Ranker’s SEO success to one other big factor: its intricate network of internal links, which Benson believes are far more valuable to SEO than an impressive backlink profile.

“I think a lot of people who are new to SEO focus too much on trying to get outside links, versus optimizing their own internal infrastructure,” he says.

“We have a very broad site with millions of pages – not just lists, but a page for every item that’s included in a list on Ranker, showing you where it ranks on all of our different lists.”

The Ranker page for Leonardo da Vinci

“We made the mistake early on of leaving all of those pages open to Google’s index, and we learned over time that some of them are very thin, content-wise. New links are added to them, but they’re thin pages. So we quickly adopted a strategy of noindexing the thinner pages on our site – so they have utility, but they don’t necessarily have search utility.

“We’ve really focused a lot on internal link structure and on interlinking our content in a very intelligent and vertical-driven, page-optimized way. We’ve put a lot of engineering and product resources towards building a robust internal link structure that can also change as pages become more valuable in search.

“Outside links are very important, but they’re increasingly difficult to get. If you have good, unique content, and a strong internal link structure, I think you can get by with far fewer backlinks. Ranker has a lot of backlinks – we’re a big site – but we’ve never tactically gone out to build backlinks. And we get more than 30 million organic search visits per month.”
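Benson doesn’t detail how Ranker implements this, but the noindexing tactic he describes is straightforward to sketch. Here is a minimal, hypothetical Node/Express example – the route, data lookup and “thinness” threshold are invented for illustration – that keeps thin pages useful to visitors and to the internal link structure while asking crawlers not to index them:

```typescript
// Hypothetical sketch: keep thin pages usable, but out of the index.
// The route, data lookup, and threshold are invented for illustration.
import express from "express";

const app = express();

// Stand-in for a real query: how many lists does this item appear on?
function listCountFor(itemId: string): number {
  return itemId.length; // placeholder logic only
}

app.get("/items/:id", (req, res) => {
  // Pages with little content keep their utility for users and for
  // internal linking, but are flagged so search engines skip them.
  if (listCountFor(req.params.id) < 5) {
    res.set("X-Robots-Tag", "noindex");
  }
  res.send(`<h1>Item ${req.params.id}</h1>`);
});

app.listen(3000);
```

The same signal can also be sent in the page itself with a <meta name="robots" content="noindex"> tag in the HTML head.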

Think about how your content will appear to searchers

Benson emphasizes the importance of paying attention to basic on-site optimization like crafting good title tags and meta descriptions. These elements dictate how your website appears in the SERP to users when they search, and so will form the first impressions of your content.

“When it comes to creating new content, our editorial team definitely focuses on best practice with regards to title tags and meta descriptions – the basic stuff still applies,” says Benson. “Anyone doing editorial still needs to think about your content from the lens of the searcher.”

Optimizing for Google’s rich results and using Schema.org markup are additional ways that website owners can make sure that their website listing appears as attractive as possible to a searcher encountering it on the SERP.

The future is psychographic

What plans does Benson have for the future of Ranker? Up to now, the site has concentrated mostly on search and social distribution (Facebook is another big source of organic traffic), but it is now beginning to focus more on ad sales, media tie-ins and getting the brand name out there.

“We’re always focused on growing traffic, and we’re certainly investing a lot more into our brand,” says Benson.

However, the most exciting future project for Ranker is something called Ranker Insights – a psychographic interests platform which makes use of Ranker’s thousands of data points on what people are interested in and like to vote on.

Drawing connections between people’s interests on Ranker Insights

Big data on anything is extremely valuable in marketing, but big data on the things that people like is all but priceless – particularly in a world where psychographics (classifying people according to their attitudes, aspirations, and other aspects of their psychology) are becoming more important than demographics (classifying people according to things like age, gender, race and nationality).

“The marketing world in general is steering a lot more towards psychographics rather than demographics,” says Benson. “Netflix doesn’t care what country you live in – when it comes to marketing or even recommendations, all they care about is your tastes. They stopped using demographics entirely years ago – and clearly they’re doing something right.

“We feel that in an interconnected world, what you like says at least as much about you as your age or your gender.

“And in a world where what you like tells people how to market to you and how to reach you, we have very, very granular, deep data on that front. There’s a lot of different applications for insights like this in a very data-driven world.”

Rebecca Sentance is the Deputy Editor of Search Engine Watch.


ABCO Technology teaches classes on building crowdsourced websites in our web development program. Call our campus between 9 AM and 6 PM Monday through Friday at: (310) 216-3067.

Email your questions to: info@abcotechnology.edu

Financial aid is available to all students who qualify for funding.

ABCO Technology is located at:
11222 South La Cienega Blvd. STE #588
Los Angeles, Ca. 90304


Learn to build crowdsourced websites today!

A guide to web technologies

As a certified web developer, your role will invariably lead you to interactions with people in a wide variety of roles, including business owners, marketing managers, content creators, link builders, PR agencies, and developers.

That last one – developers – is a catch-all term that can encompass software engineers, coders, programmers, front- and back-end developers, and IT professionals of various types. These are the folks who write the code and/or generally manage the underlying various web technologies that comprise and power websites.

In your role as a web developer, it may or may not be practicable for you to completely master programming languages such as C++ and Java, or scripting languages such as PHP and JavaScript, or markup languages such as HTML, XML, or the stylesheet language CSS.

And there are many more programming, scripting, and markup languages out there – it would be a Herculean task to master every kind of language, even for a full-time programmer, let alone a web developer.

But it is essential for you, as a certified web development professional, to understand the various languages, technologies, and technology stacks that comprise the web. When you’re making website recommendations, which developers will most likely be executing, you need to understand their mindset, their pain points, what their job is like – and you need to be able to speak their language.

You don’t have to know everything developers know, but you should have a good grasp of what developers do so that you can ask better questions and provide SEO recommendations in a way that resonates with them, and those recommendations are more likely to be executed as a result.

When you speak their language, and understand what their world is like, you’re contributing to a collaborative environment where everyone’s pulling on the same side of the rope for the same positive outcomes.

And of course, aside from building collaborative relationships, being a professional web developer involves a lot of technical detective work and problem detection and prevention, so understanding various aspects of web technology is not optional; it’s mandatory.

Web tech can be complex and intimidating, but hopefully this guide will help make things a little easier for you and fill in some blanks in your understanding.

Let’s jump right in!

The internet vs. the World Wide Web

Most people use these terms interchangeably, but technically the two terms do not mean the same thing, although they are related.

The Internet began as a decentralized network of independent interconnected computers.

Over time the US Department of Defense became involved and awarded contracts, including for the development of the ARPANET (Advanced Research Projects Agency Network) project, an early packet-switching network and the first to use TCP/IP (Transmission Control Protocol/Internet Protocol).

The ARPANET project led to “internetworking” where various networks of computers could be joined into a larger “network of networks”.

The development of the World Wide Web is credited to British computer scientist Sir Tim Berners-Lee in the 1980s; he developed a system for linking hypertext documents, which resulted in an information-sharing model built “on top” of the Internet.

Documents (web pages) were specified to be formatted in a markup language called “HTML” (Hypertext Markup Language), and could be linked to each other using “hyperlinks” that users could click to navigate to other web pages.

Further reading:

◾History of the Internet

◾History of the World Wide Web

◾ARPANET

Web hosting

Web hosting, or hosting for short, is a service that allows people and businesses to put a web page or a website on the internet. Hosting companies have banks of computers called “servers” that are not entirely dissimilar to the computers you’re already familiar with, but of course there are differences.

There are various types of web hosting companies that offer a range of services in addition to web hosting; such services may include domain name registration, website builders, email addresses, website security services, and more.

In short, a host is where websites are published.

Further reading:

◾Web Hosting Service

Web servers

A web server is a computer that stores web documents and resources. Web servers receive requests from clients (browsers) for web pages, images, etc. When you visit a web page, your browser requests all the resources/files needed to render that web page in your browser. It goes something like this:

Client (browser) to server: “Hey, I want this web page, please provide all the text, images and other stuff you have for that page.”

Server to client: “Okay, here it is.”

Various factors impact how quickly the web page will display (render) including the speed of the server and the size(s) of the various files being requested.
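You can watch this exchange happen with a few lines of code. Here’s a small sketch using the standard fetch API (available in modern browsers and in Node 18+); example.com is just a placeholder site:

```typescript
// The request/response exchange described above, made with the
// standard fetch API.
async function requestPage(url: string): Promise<void> {
  const response = await fetch(url); // client: "Hey, I want this web page..."

  // server: "Okay, here it is" -- a status code, headers, then the body
  console.log("Status:", response.status);
  console.log("Content-Type:", response.headers.get("content-type"));

  const html = await response.text();
  console.log(`Received ${html.length} characters of HTML`);
}

requestPage("https://example.com/");
```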

There are three server types you’ll most often encounter:

1.Apache is open-source, free software compatible with many operating systems such as Linux. An often-used acronym is “LAMP stack” referring to a bundling of Linux, Apache, MySQL (relational database) and PHP (a server-side scripting language).

2.IIS stands for “Internet Information Services” and is proprietary software made by Microsoft. An IIS server is often referred to as a “Windows Server” because it runs on Windows NT operating systems.

3.NGINX – pronounced “Engine X” – is billed as a high-performance server that can also handle load balancing, act as a reverse proxy, and more. Its stated goal is to outperform other types of servers.

Further reading:

◾Apache

◾IIS

◾NGINX

Server log files

Often shortened to “log files”, these are records of server activity in response to requests made for web pages and associated resources such as images. Some servers may already be configured to record this activity, others will need to be configured to do so.

Log files are the “reality” of what’s happening with a website and will include information such as the page or file requested, date and time stamp of the request, the user agent making the request, the response type (found, error, redirected, etc.), the referrer, and a few other items such as bytes served and client IP address.

Web developers should get familiar with parsing log files. To go into this topic in more detail, read JafSoft’s explanation of a web server log file sample.
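To make that concrete, here’s a short sketch that pulls the fields listed above out of a single line in the widely used Apache/NGINX “combined” log format (the log line itself is made up for illustration):

```typescript
// Parse one "combined" format access log line into named fields.
const LOG_LINE =
  '66.249.66.1 - - [10/Mar/2018:13:55:36 -0800] "GET /index.html HTTP/1.1" 200 5124 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"';

const COMBINED =
  /^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]*" (\d{3}) (\S+) "([^"]*)" "([^"]*)"$/;

const match = LOG_LINE.match(COMBINED);
if (match) {
  const [, ip, timestamp, method, path, status, bytes, referrer, userAgent] = match;
  console.log({ ip, timestamp, method, path, status, bytes, referrer, userAgent });

  // Log files are also how you verify crawler activity:
  if (userAgent.includes("Googlebot")) {
    console.log("This request came from Google's crawler");
  }
}
```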

FTP

FTP stands for File Transfer Protocol, and it’s how you upload resource files such as webpages, images, XML Sitemaps, robots.txt files, and PDF files to your web hosting account to make these resource files available and viewable on the Web via browsers. There are free FTP software programs you can use for this purpose.

The interface is a familiar file-folder tree structure where you’ll see your local machine’s files on the left, and the remote server’s files on the right. You can drag and drop local files to the server to upload. Voila, you’ve put files onto the internet! For more detail, Wired has an excellent guide on FTP for beginners.

Domain name

A domain name is a human-readable string of text used in a URL (Uniform Resource Locator). Keeping this simple, for the URL https://www.website.com, “website” is the domain name. For more detail, check out the Wikipedia article on domain names.

Root domain & subdomain

A root domain is what we commonly think of as a domain name such as “website” in the URL https://www.website.com. A subdomain is the www. part of the URL. Other examples of subdomains would be news.website.com, products.website.com, support.website.com and so on.

For more information on the difference between a domain and a subdomain, check out this video from HowTech.

URL vs. URI

URL stands for “Uniform Resource Locator” (such as https://www.website.com/this-is-a-page) and URI stands for “Uniform Resource Identifier”. A URI identifies a resource, sometimes by nothing more than a name or path (such as /this-is-a-page.html), while a URL is the type of URI that also tells you where the resource is located – strictly speaking, URLs are a subset of URIs, not the other way around. More info here.
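If you want to pull these pieces apart programmatically, the standard WHATWG URL class (built into modern browsers and Node) does it for you – a quick sketch:

```typescript
// Splitting a URL into the parts discussed in the last few sections.
const url = new URL("https://news.website.com/this-is-a-page?sort=new#comments");

console.log(url.protocol); // "https:"
console.log(url.hostname); // "news.website.com" (subdomain + domain)
console.log(url.pathname); // "/this-is-a-page"
console.log(url.search);   // "?sort=new"
console.log(url.hash);     // "#comments"
```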

HTML, CSS, and JavaScript

I’ve grouped HTML, CSS, and JavaScript together not because each doesn’t deserve its own section, but because it’s good for web developers to understand that these three languages make up much of how modern web pages are coded (with many exceptions of course, some of which are noted elsewhere in this guide).

HTML stands for “Hypertext Markup Language”, and it’s the original and foundational language of web pages on the World Wide Web.

CSS stands for “Cascading Style Sheets” and is a style sheet language used to style and position HTML elements on a web page, enabling separation of presentation and content.

JavaScript (not to be confused with the programming language “Java”) is a client-side scripting language used to create interactive features on web pages.

Further reading:

◾HTML intro

◾CSS intro

◾JavaScript intro

AJAX & XML

AJAX stands for “Asynchronous JavaScript And XML”. Asynchronous means the client/browser and the server can work and communicate independently, allowing the user to continue interacting with the web page regardless of what’s happening on the server. JavaScript is used to make the asynchronous server requests, and when the server responds, JavaScript modifies the page content displayed to the user. Data sent asynchronously from the server to the client is packaged in an XML format so it can be easily processed by JavaScript. This reduces the traffic between the client and the server, which improves response time and speed.
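Here’s a minimal sketch of that asynchronous flow using the modern fetch API (the endpoint and element ID are hypothetical; today the payload would more often be JSON than XML, but the pattern is the same):

```typescript
// Asynchronous update: the page stays interactive while the request is
// in flight, and only one element changes when the response arrives --
// no full page reload.
async function refreshVoteCount(itemId: string): Promise<void> {
  const response = await fetch(`/api/items/${itemId}/votes`); // async request
  const data: { votes: number } = await response.json();      // e.g. { "votes": 1234 }

  const el = document.querySelector(`#votes-${itemId}`);
  if (el) el.textContent = String(data.votes); // update just this element
}

refreshVoteCount("42");
```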

XML stands for “Extensible Markup Language” and is similar to HTML using tags, elements, and attributes and was designed to both store and transport data, whereas HTML is used to display data. For the purposes of SEO, the most common usage of XML is in XML Sitemap files.

Structured data (AKA, Schema.org)

Structured data is markup you can add to the HTML of a page to help search engines better understand the content of the page, or at least certain elements of that page. By using the approved standard formats, you provide additional information that makes it easier for search engines to parse the pertinent data on the page.

Common uses of structured data are to markup certain aspects of recipes, literary works, products, places, events of various types, and much more.

Schema.org was launched on June 2, 2011, as a collaborative effort by Google, Bing and Yahoo (soon after joined by Yandex) to create a common, agreed-upon set of standardized schemas for structured data markup on web pages. Since then, the term “Schema.org” has become synonymous with the term “structured data”, and Schema.org structured data types are continually evolving, with new types added with relative frequency.

One of the main takeaways about structured data is that it helps disambiguate data for search engines so they can more easily understand information, and that certain marked-up elements may result in additional information being displayed in Search Engine Results Pages (SERPs), such as review stars, recipe cooking times, and so on. Note that adding structured data is not a guarantee of such SERP features.

There are a number of structured data formats, but JSON-LD (JavaScript Object Notation for Linked Data) has emerged as Google’s preferred and recommended method of marking up structured data per the Schema.org guidelines; other formats, such as microdata and RDFa, are also supported.

JSON-LD is easier to add to pages, easier to maintain and change, and less prone to errors than microdata, which must be wrapped around existing HTML elements; JSON-LD, by contrast, can be added as a single block in the HTML head section of a web page.
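As an illustrative sketch (all values are placeholders), here’s a Schema.org Article described in JSON-LD and injected as that single block; in practice most sites simply print the same script element into the page template server-side:

```typescript
// A Schema.org Article as JSON-LD: one self-contained script element,
// rather than attributes wrapped around existing HTML as with microdata.
const articleSchema = {
  "@context": "https://schema.org",
  "@type": "Article",
  headline: "A guide to web technologies",
  author: { "@type": "Person", name: "Jane Developer" }, // placeholder
  datePublished: "2018-03-01",                           // placeholder
};

const script = document.createElement("script");
script.type = "application/ld+json";
script.textContent = JSON.stringify(articleSchema);
document.head.appendChild(script);
```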

Here is the Schema.org FAQ page for further investigation – and to get started using microdata, RDFa and JSON-LD, check out our complete beginner’s guide to Schema.org markup.

Front-end vs. back-end, client-side vs. server-side

You may have talked to a developer who said, “I’m a front-end developer” and wondered what that meant. Or you may have heard someone say, “oh, that’s a back-end functionality”. It can all seem confusing, but it’s easily clarified.

“Front-end” and “client-side” both mean the same thing: it happens (executes) in the browser. For example, JavaScript was originally developed as something that executed on a web page in the browser, and that means without having to make a call to the server.

“Back-end” and “server-side” both mean the same thing: it happens (executes) on a server. For example, PHP is a server-side scripting language that executes on the server, not in the browser. Some Content Management Systems (CMS for short) like WordPress use PHP-based templates for web pages, and the content is called from the server to display in the browser.
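A tiny sketch makes the contrast concrete – the server half below uses Node’s built-in http module as a stand-in for whatever back-end language a site actually runs:

```typescript
// Back-end / server-side: the HTML is assembled on the server before
// the browser ever sees it (much as a PHP template does in WordPress).
import { createServer } from "node:http";

createServer((_req, res) => {
  res.writeHead(200, { "Content-Type": "text/html" });
  res.end("<p id='greeting'>Hello, rendered on the server</p>");
}).listen(3000);

// Front-end / client-side: by contrast, a line like the following runs
// in the browser, changing the already-delivered page without another
// trip to the server:
//
//   document.querySelector("#greeting")!.textContent = "Changed in the browser";
```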

Programming vs. scripting languages

Engineers and developers do have differing explanations and definitions of terms, and some will say that ultimately there’s no difference or that the lines are blurry. But the generally accepted difference between a programming language (like C or Pascal) and a scripting language (like JavaScript or PHP) is that a programming language requires an explicit compiling step, in which human-created, human-readable code is turned into a specific set of machine-language instructions understandable by a computer before it runs, whereas a scripting language is interpreted at runtime without that separate compiling step.

Content Management System (CMS)

A CMS is a software application, or a set of related programs, used to create and manage websites (or, to use the fancy term, “digital content”). At its core, a CMS lets you create, edit, publish, and archive web pages, blog posts, and articles, and will typically have various built-in features.

Using a CMS to create a website means that there is no need to write any code from scratch, which is one of the main reasons CMSs have broad appeal.

Another common aspect of CMSs is plugins, which can be integrated with the core CMS to add functionality that is not part of the core feature list.

Common CMSs include WordPress, Drupal, Joomla, ExpressionEngine, Magento, WooCommerce, Shopify, and Squarespace, and there are many, many others.

Read more here about Content Management Systems.

Content Delivery Network (CDN)

Sometimes called a “Content Distribution Network”, CDNs are large networks of servers which are geographically dispersed with the goal of serving web content from a server location closer to the client making the request in order to reduce latency (transfer delay).

CDNs cache copies of your web content across these servers, and then servers nearest to the website visitor serve the requested web content. CDNs are used to provide high availability along with high performance. More info here.

HTTPS, SSL, and TLS

Web data is passed between computers via data packets of code. Clients (web browsers) serve as the user interface when we request a web page from a server. HTTP (hypertext transfer protocol) is the communication method a browser uses to “talk to” a server and make requests. HTTPS is the secure version of this (hypertext transfer protocol secure).

Website owners can switch their website to HTTPS to make the connection with users more secure and less prone to “man in the middle attacks” where a third party intercepts or possibly alters the communication.

SSL refers to “secure sockets layer” and is a standard security protocol for establishing encrypted communication between the server and the browser. TLS (Transport Layer Security) is the more recent successor to SSL.

◾More info on HTTPS, SSL, & TLS

HTTP/1.1 & HTTP/2

When Tim Berners-Lee invented the HTTP protocol in 1989, the computer he used did not have the processing power and memory of today’s computers. A client (browser) connecting to a server using HTTP/1.1 receives information in a sequence of network request-response transactions, which are often referred to as “round trips” to the server, sometimes called “handshakes”.

Each round trip takes time, and HTTPS is an HTTP connection with SSL/TLS layered in, which requires yet another handshake with the server. All of this takes time, causing latency. What was fast enough then is not necessarily fast enough now.

HTTP/2 is the first new version of HTTP since 1.1. Simply put, HTTP/2 allows the server to deliver more resources to the client/browser faster than HTTP/1.1 by utilizing multiplexing, compression, request prioritization, and server push which allows the server to send resources to the client that have not yet been requested.
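Node’s built-in http2 module makes the multiplexing point easy to see – this sketch sends two requests over a single connection (nghttp2.org is a public HTTP/2 demo server):

```typescript
// Two requests multiplexed over one HTTP/2 connection, instead of the
// separate round trips HTTP/1.1 would need.
import { connect } from "node:http2";

const session = connect("https://nghttp2.org");
let pending = 2;

for (const path of ["/", "/blog/"]) {
  const stream = session.request({ ":path": path }); // one stream per request
  stream.on("response", (headers) => console.log(path, "->", headers[":status"]));
  stream.resume(); // drain the body; we only care about the exchange
  stream.on("end", () => {
    if (--pending === 0) session.close(); // close once both streams finish
  });
  stream.end();
}
```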

Further reading:

◾HTTP/2 FAQ

◾What is HTTP/2 and how does it benefit SEO?

Application Programming Interface (API)

Application is a general term that, simply put, refers to a type of software that performs specific tasks. Applications include desktop software, web browsers, and databases.

An API is an interface with an application, typically a database. The API is like a messenger that takes requests, tells the system what you want, and returns the response back to you.

If you’re in a restaurant and want the kitchen to make you a certain dish, the waiter who takes your order is the messenger that communicates between you and the kitchen, which is analogous to using an API to request and retrieve information from a database. For more info, check out Wikipedia’s Application programming interface page.
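Carrying the analogy into code: in the sketch below the fetch call is the waiter and the JSON body is the dish. The endpoint and response shape are entirely hypothetical:

```typescript
// Ordering from a (hypothetical) restaurant API.
interface Dish {
  name: string;
  price: number;
}

async function orderDish(dishId: number): Promise<Dish> {
  // The request is the order handed to the kitchen...
  const response = await fetch(`https://api.example-restaurant.com/dishes/${dishId}`);
  if (!response.ok) throw new Error(`Kitchen says no: ${response.status}`);

  // ...and the JSON body is the dish brought back to your table.
  return (await response.json()) as Dish;
}

orderDish(7).then((dish) => console.log(`${dish.name}: $${dish.price}`));
```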

AMP, PWA, and SPA

If you want to build a website today, you have many choices.

You can build it from scratch using HTML for content delivery along with CSS for look and feel and JavaScript for interactive elements.

Or you could use a CMS (content management system) like WordPress, Magento, or Drupal.

Or you could build it with AMP, PWA, or SPA.

AMP stands for Accelerated Mobile Pages and is an open source Google initiative consisting of a specified set of HTML tags and various functionality components, which are ever-evolving. The upside to AMP is lightning-fast loading web pages when coded according to AMP specifications; the downsides are that some desired features may not currently be supported, and that there can be issues with proper analytics tracking.

Further reading:

◾What will Google’s Accelerated Mobile Pages mean for marketers?

◾Accelerated Mobile Pages (AMP) one year on: stats and infographic

◾Accelerated Mobile Pages vs Facebook Instant Articles: Is Google winning the mobile war?

PWA stands for Progressive Web App, and it blends the best of both worlds between traditional websites and mobile phone apps. PWAs deliver a native app-like experience to users, such as push notifications, the ability to work offline, and a start icon on your mobile phone’s home screen.

By using “service workers” to communicate between the client and server, PWAs combine fast-loading web pages with the ability to act like a native mobile phone app. However, because PWAs rely heavily on JavaScript, you may encounter a number of technical challenges.
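For a sense of what the service-worker piece looks like, here’s a minimal registration sketch using the standard browser API (“/sw.js” is a placeholder path to your own service worker script, which can then cache resources for offline use and receive push notifications):

```typescript
// Register a service worker -- the background script that enables
// offline caching and push notifications for a PWA.
if ("serviceWorker" in navigator) {
  navigator.serviceWorker
    .register("/sw.js")
    .then((registration) => {
      console.log("Service worker registered with scope:", registration.scope);
    })
    .catch((error) => console.error("Registration failed:", error));
}
```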

Further reading:

◾Progressive Web Apps versus Android Instant Apps: Which is better for marketers?

◾Google I/O: What’s going on with Progressive Web Apps?

SPAs – Single Page Applications – are different from traditional web pages which load each page a user requests in a session via repeated communications with the server. SPAs, by contrast, run inside the browser and new pages viewed in a user session don’t require page reloading via server requests.

The primary advantages of SPAs include streamlined and simplified development, and a very fast user experience. The primary disadvantages include potential problems with SEO, due to search engines’ inconsistent ability to parse content served by JavaScript. Debugging issues can also be more difficult and take up more developer time.

It’s worth noting that future success of each of these web technologies ultimately depends on developer adoption.

Conclusion

Obviously, it would require a very long book to cover each and every bit of web technology, and in sufficient detail, but this guide should provide you, the professional web developer, with helpful info to fill in some of the blanks in your understanding of various key aspects of web technology.

I’ve provided many links in this article that serve as jumping off points for any topics you would like to explore further. There’s no doubt that there are many more topics web developers need to be conversant with, such as robots.txt files, meta robots tags, rel canonical tags, XML Sitemaps, server response codes, and much more.

In closing, here’s a nice article on the Stanford website titled “How Does The Internet Work?” that you might find interesting reading; you can find that here.

ABCO Technology teaches a comprehensive program for web development. Call our campus between 9 AM and 6 PM Monday through Friday at: (310) 216-3067

Email your questions to: info@abcotechnology.edu

Financial aid is available to all students who qualify for funding.

ABCO Technology is located at:
11222 South La Cienega Blvd. STE #588
Los Angeles, Ca. 90304


Learn web development today!

Personalized search gets local results

As marketers in the ever-changing world of digital, success depends on knowing what consumers want and expect from us. After all, it’s the only way we can deliver.

So, it’s interesting to see that a recent data release from Google tells us that personalized search is becoming more and more prominent among internet users.

No longer are they turning to friends and family for personal advice and recommendations, but search engines too.

Of course, we already knew that… that’s why we work so hard at getting to know our audience and understanding their micro-moments and pain points, delivering the right content at the right time, in the right way.

But what Google is telling us is that rather than searching, “How often should you wash your hair?”, we are now searching “How often should I wash my hair?”. Changing those two little words is making the way that we use search engines far more personal than ever before.

And the data suggests that consumers now truly trust that their most specific needs can be answered by content on the web. In fact, Google reports that mobile searches using “…for me” have grown by a huge 60% over the last two years.

On top of this, they have also seen an 80% increase in mobile searches including “…should I?”. As a result, we really are treating search as one of our best, most trusted friends.

And that’s great news for content marketers.

For those of us working in motor, beauty, finance, fitness and pet care, it seems that this new insight is especially relevant – these are the industries in which users are most frequently turning to Google to solve their personal pain points.

How can we prepare and optimize our content for these types of search?

Tools

Creating calculators and tools is a brilliant way of targeting personal search terms and providing our users with the personalized response they are looking for. Let’s use a fitness example to demonstrate this:

This recent data release from Google suggests that users are starting to search for something like “how much water should I drink each day?” in higher volumes than something like “how much water should you drink per day?”.

Now, most of us know that the answer to this question will depend on a number of different factors including gender, body composition, activity level and so on.

What our audience is expecting from this search is a personalized answer that takes all of these things into consideration and tells them exactly how much water they should personally be drinking each day.

A water consumption calculator would do this well, and if the user wants the specificity of an individual result, they will be willing to fill in the necessary personal details to retrieve it. A blog post that simply states the average recommended fluid intake for a man or a woman as recommended by the NHS is no longer user focused enough.
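As a sketch of what such a calculator might compute – the numbers below are a common rule of thumb (roughly 33ml of water per kilogram of body weight, plus an allowance for exercise) used purely for illustration, and a real tool should use a properly sourced formula:

```typescript
// Illustrative rule-of-thumb water intake calculator.
// Assumes ~33ml per kg of body weight plus ~350ml per 30 minutes of
// exercise -- placeholder figures, not medical guidance.
function dailyWaterMl(weightKg: number, exerciseMinutes: number): number {
  const base = weightKg * 33;
  const activity = (exerciseMinutes / 30) * 350;
  return Math.round(base + activity);
}

const ml = dailyWaterMl(70, 45); // a 70kg person who exercises 45 minutes
console.log(`Aim for about ${(ml / 1000).toFixed(1)} litres today.`); // ~2.8 litres
```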

Case studies and testimonials

Providing personalized content will not always be easy, and at times users may need encouragement to spend a little longer on a page to find the personalized answer they are looking for. In this instance, case studies and testimonials are a great way to push users further through their journey in the right direction.

For example, “How much money do I need to retire?” is a more complex question than our fitness example. There are so many variants that could alter the accurate and personalized response to this question, so it’s difficult to answer it quickly in a personalized way.

However, if we provide users with a testimonial or case study at the right stage in their journey – one that was created after a lot of persona research and uses someone or a situation that will resonate with them – they are likely to engage with the content.

Creating engagement via a case study will increase the likelihood that they’ll enquire with your brand for a more personalized answer, continuing their journey on their way to the personalized answer they are looking for.

Hygiene content

Informational content (something we refer to here in ABCO Technology’s search engine class as ‘hygiene content’) is absolutely essential in light of this evolution of search.

It’s critical that all the informational content and resources on your website are up to date, and as specific to the different types of users you’re expecting to visit your site as possible. Not only this, but ensuring that on-page content is optimized for long tail search (tying back to your personas) is a must.

Moreover, having a clear call to action that points the user in the direction of personalized answers to their questions is also important. It isn’t always possible to answer their query in an individualized way using written content, but pointing the user towards a ‘contact us here’ call to action could make all the difference in their user journey, and ultimately, whether they end up with you or your competitor.

Thought leadership and expert content

Finally, with consumers turning to search like a trusted friend or family member more than ever before, you need to ensure that the content you’re putting out there is seen as being the most reliable. Therefore, it’s never been more important to be viewed as a thought leader within your field.

Expert content will naturally help to strengthen the consumer-brand relationship. It also means that when you are appearing in SERPs, your expert reputation will stand you in good stead when it comes to users choosing which ‘friend’ they want to seek advice from.

We can’t wait to see how the evolution of search changes the way that Google is rewarding and penalizing brands’ content. The above is just a start, but we are certain we will be kept on our toes as time goes on!

ABCO Technology teaches a comprehensive program for web development, which includes search engine optimization and social media strategies. Call our campus between 9 AM and 6 PM Monday through Friday at: (310) 216-3067.

Email your questions to: info@abcotechnology.edu

Financial aid is available to all students who qualify for funding.

ABCO Technology is located at:
11222 South La Cienega Blvd. STE #588
Los Angeles, Ca. 90304


Build highly visible web pages today!

Web design 101

The design of a company’s website plays an important role in attracting prospects and customers, generating leads, and ultimately creating customer satisfaction.

Generally speaking, a website has between 15 and 30 seconds to convey a favorable impression of a company, as well as the products and services offered, to online visitors; therefore, paying attention to the site’s design is a key factor in how well it performs.

Design Basics

To some extent, a website’s design will be influenced by a company’s industry and culture. As an example, a website for a law or accounting firm will likely have a more serious design approach than a party planner or an ice cream parlor.

An effective website blends design, text, images, and video to provide a good overview of a company and what it offers. A website’s images can highlight both team members and the workplace, as well as provide a personal touch that helps differentiate a company from its competitors.

To avoid increasing a website’s startup or “load time,” it’s a good idea to optimize images for online display. Because the screen resolution is lower than print, using a “save for web” command in a photo-editing program reduces the “density” of the image so it will load more rapidly. There are also many plugins available for WordPress-designed sites that will aid in optimizing load time for pages and images.

It’s also helpful to consider readability as a site’s design is prepared. Dark text on a light background, for instance, is easier to read. It’s also an excellent idea to consider the target market’s age — if your target market skews over 40, making text easy for on-screen reading will help improve your site’s usability for visitors. Font type is also a key factor in the design of a website. Fonts such as Arial, Times New Roman, Verdana, or Georgia are all easy-to-read fonts that can be read clearly through a variety of browser types.

Improving Search Results

A website’s content is a critical factor in how the site is indexed by search engines. Thus, it is important to make sure that headlines and content relate to products and services offered.

Consider what keywords or phrases customers are likely to use, as they search for offerings that a company specializes in, and include those phrases in page titles and headlines. Avoid repeating keywords over and over, since search engines may penalize for “keyword stuffing;” yet make sure that the site describes the company and what it does, utilizing the terms necessary to find the website online.

Prepare for Scanners

Since most visitors scan a site quickly (think 15 seconds), instead of reading it carefully, it’s helpful to break up text with headlines, bulleted lists, and images. Keeping sentences and paragraphs short is another helpful way to improve the appearance and readability of a website. Important elements and the company’s contact information should appear “above the fold,” which is the portion of the screen visitors can see without scrolling down. Most people won’t bother to look for information that’s not directly in front of them.

Navigation Drives Results

While most people think of how a site looks, good design also includes navigation menus that are straightforward and easy to understand. Therefore, a web designer should make it as easy as possible for visitors to find their way around the site, since confused visitors will likely hit the back button instead of exploring a website further.

Some navigation tips:

•Place a company brand/logo in the upper left corner of every page, and make that logo a clickable link to the home page.

•Keep navigation menu items to a minimum. Professional web designers suggest using seven items or fewer on a navigation menu.

•The navigation menu should use descriptive titles. Summarize what a user will find on the specific page. Rather than have a menu item called “Team,” consider expanding it to “Meet Our Team.”

•Keep design usability in mind. A user should never have to go more than three levels deep (or three clicks away) from the homepage to find information.

Enlisting the Help of a Professional

Depending on goals for the website, enlisting help from a professional web designer could be a profitable investment in making the website as appealing and functional as possible. If you’re technically inclined, or wish to tackle the site’s appearance yourself, most web hosting companies offer a wide range of website templates designed for small businesses. Additionally, many website building companies offer templates for little to no cost. WordPress and Wix are just two of the well-known companies that offer simple solutions for web design that don’t cost a fortune or require a programmer to code.

When working with a professional, it’s important to ask if you’ll have the ability to update or add content yourself (and how easy those updates will be to perform). If you have to rely on the web designer for future changes, additional costs will be incurred and possibly delays in having the website updated. Instead, ask the designer to include content management features so you or your colleagues will be able to update the site yourselves.

If you prefer to do it yourself with a design template, customize the images or graphics to give the site a more personal look and feel that better matches your company and its personality.

Either way, remember that creating a website is not a one-time project. You’ll have to plan for frequent updates as you develop more content and change your design periodically to prevent the site from appearing out of date.

ABCO Technology teaches a comprehensive program for web design. Call our campus between 9 AM and 6 PM Monday through Friday at: (310) 216-3067.

Email your questions to: info@abcotechnology.edu

Financial aid is available to all students who qualify for funding.

ABCO Technology is located at:
11222 South La Cienega Blvd. STE #588
Los Angeles, Ca. 90304


Good web design = success. Become a successful web design professional today!

Kodak creates cryptocurrency

Shares of Eastman Kodak stock were soaring Tuesday after the company announced a new cryptocurrency initiative.

The company unveiled a licensing partnership with Wenn Digital to launch an image rights management platform called KODAKOne and a photo-centric cryptocurrency called KODAKCoin.

The stock price opened at $3.10 a share Tuesday and rose as high as $7.65 after the announcement was made at the Consumer Electronics Show in Las Vegas. When the stock market closed at 4 p.m., shares were trading at $6.80.

In a release, the company said the KODAKOne platform will be an encrypted, digital ledger of rights ownership for photographers to register both new and archival work that they can then license for use.

The company describes KODAKCoin as “a new economy for photography” which will allow photographers to receive payment for licensing their work immediately upon sale and to sell their work confidently on a secure blockchain platform.

The system will be open to both professional and amateur photographers.

Initial reactions from financial analysts were mixed.

CBS MarketWatch said Kodak was “boarding the blockchain bandwagon,” hoping to capitalize on the cryptocurrency trend to boost its stock price.

Bloomberg said, “The move comes as investors snap up virtually any asset related to digital coins or the blockchain technology that underpins them — no matter how tenuous the tie.”

The Financial Times was more blunt in its criticism, calling it “Kodak’s last desperate bid for relevance”.

Encryption, which has been a part of information technology for 25 years, is now seeing its use expand. Bitcoin has been the number one cryptocurrency out of 1,375 different cryptocurrencies; now that Kodak has entered this market, we have 1,376. This writer believes Kodak’s entry will gain in popularity because it is focused upon a specific industry: photography. Could Kodak’s new currency be the beginning of a new trend, where different services have their own currency? This appears to be the new reality.

ABCO Technology teaches classes in encryption in our cyber security program. Call our campus between 9 AM and 6 PM Monday through Friday at: (310) 216-3067.

Email your questions to: info@abcotechnology.edu

Financial aid is available to all students who qualify for funding.

ABCO Technology is located at:
11222 South La Cienega Blvd. STE #588
Los Angeles, Ca. 90304


Encryption is opening up numerous opportunities. Join our cyber security program today to learn more.

Oracle database administrator information

There is an old saying, which dominates the marketing and search engine industries. This saying states “content is king.”

Content comes from information. If content is king, then information is power.

The question of how to use this information begins with its storage and access. Information is stored in columns, rows and tables, which make up what information technology experts call a relational database. The term relational means one piece of information relates to another. This information must be easily accessible, flexible enough to be assembled into reports, and presented in such a way that other relationships can be drawn from it.

One occupation charged with making all of this activity happen is that of the Oracle database administrator. Oracle handles the nation’s largest databases, including those of the US government, major educational institutions and Fortune 1,000 companies.

Oracle database administrators are responsible for the accessibility, security and safety of a company’s information.

This includes client records, financial data, product information and all other documents needed for an organization’s successful operation. Oracle database administrators do not have to possess a four-year college degree. The person entering into this occupation must be well organized, enjoy categorizing information and have a passion for making that information accessible to those persons who need to use it. Security is also a major concern. The database administrator will become proficient with Oracle’s database security protocols and procedures.

The title of Oracle database administrator is earned when students train for and pass two Oracle certification exams. The first exam a student must successfully complete is the Oracle Certified Associate. The OCA is usually completed after eight weeks of training in an accredited Oracle class.

The final exam, the Oracle Certified Professional, which carries the title of database administrator, is usually passed after three additional months of training.

Oracle database administrators are seeing a wide variety of job openings throughout the country. The job sites Indeed and Glassdoor show the average salary for Oracle database administrators at approximately $93,000 per year. As you gain experience in this occupation, your salary will increase. Oracle offers specialized databases, including Oracle Financials, Oracle medical and Oracle biological. Students can specialize in one of these databases after passing the Oracle Certified Professional exam. Naturally, specializing in a specific field will mean an increase in salary.

If you are interested in training for this exciting career, contact ABCO Technology. You can reach our campus between 9 AM and 6 PM Monday through Friday at: (310) 216-3067.

Email your questions to info@abcotechnology.edu

Financial aid is available to all qualified students.

ABCO Technology is located at:
11222 South La Cienega Blvd. STE #588
Los Angeles, Ca. 90304

Database administrators are in demand, train and certify for a job today!

Google pulls AdWords reviews

Google has withdrawn review extensions from AdWords, so you will no longer be able to add ratings like these to your ads. Why is Google taking this action, and does it signify a wider problem with ratings? More importantly, does this mean local results or seller ratings could be next?

Google AdWords allows you to add something called extensions to your ads. Extensions are additional pieces of information that are free to add and can boost your click-through rate. Some examples are:

•Sitelinks
•Callouts
•Structured snippets

If you’re not already using extensions, then you should be! They’re a great way of increasing CTR (by around 10%), and at a point where more advertisers are using them than not, you’re going to be hit increasingly hard if you’re not using them.

Unfortunately, like many things with Google, you can suggest what you want them to show but they decide whether or not they want to show it.

Review extensions

Google didn’t source these reviews themselves. You added them, linking to the source, and the AdWords team reviewed them before they were allowed to go live.

Although Google have given little reasoning for why they removed them, it seems it might just have been a badly designed system.

For a start, on Google’s part it has got to be resource-intensive, with team members having to click through, find reviews with matching text and approve them. Although it’s likely this was partially automated, if even only a tiny percentage required manual review then it would be a large drain on resources.

This brings us to the next issue: legitimacy. Could you just make the reviews up? Well yes, you could. If you wanted to add a fake review to your site and then cite it in the extension, there is little Google could do to tell. This is the problem with pretty much all reviews, and it leads me on nicely to the final point.

Was the review representative? The reason we trust a review from an unknown source is that it correlates with many other reviews. With this system you could just cherry-pick whichever review you wanted, so even if the product is terrible, someone, somewhere is bound to like it and leave a good review – which you could then legitimately use in your AdWords extension.

Finally, you might be clicking on an ad for a product but for the sitelinks reviews, Google encouraged you to utilise those which reflected the site as a whole:

Reviews should focus on your business as a whole, as opposed to a review about a specific product or service. This makes the reviews relevant to just about all of your ads.

So you the user could well think that a general review such as “Absolutely excellent, will be purchasing again” is about the specific link they see in the ad. But it’s not, it’s about the site as whole and may be validating an otherwise awful product.

Does this mean there is a wider problem with reviews?

Reviews themselves have come increasingly under attack. If like myself you use shopping sites such as Amazon and have been doing so for a while, you’ll have seen what I mean. There are so many fake reviews, paid-for reviews, and reviews created by third-party companies paid to create them.

This problem has been created by the effectiveness of reviews, if customer is purchasing a third party product through your site you’ll want to make sure that the best products are shown first. This means using reviews to help weight products in how you rank them. Which in turn means vendors are more and more incentivized to have the best reviews possible. Which is fine, except that not all vendors have great products, but they still want great reviews.

So what’s the solution? What about other ratings-based extensions?

Reviews as part of AdWords extensions are in principle a great idea, but there some real fundamental issues with these which need to be resolved. AdWords needs to strike a balance between what the advertisers – who are after all paying for that advert – want to display and what is a fair representation for the consumer.

These means really they should be looking at a more independent source of reviews or way of gathering these. This is what they do with both seller ratings and consumer ratings.

Seller ratings are an automated (opt out) extension which look like this:

These are shown on search network text ads, or as an abridged version in Google shopping results. Seller ratings show a star rating and text snippet which they gather from a long list of third party review sites and they only show in certain circumstances;

In most cases, seller ratings only appear when a business has 150 unique reviews and a composite rating of 3.5 stars or more.

Consumer ratings are another automated extension and are taken directly from Google customer surveys which they ran for specific industries. So, these will only appear on certain results where they have data:

Google also use a star rating in reviews within local results and although having these linked to a Google account negates some of the problems, they are still pretty open to abuse. Many businesses only have a small number of reviews, meaning just one or two added from friends or family accounts might be all that’s needed to influence a consumer’s decision.

AdWords needs to use a service which is more long standing with a large number of reviews already present and which have their own controls in place to ensure these are legitimate, much like they do with their seller reviews extension. However, even dedicated review sites have massive problems with this. Like the guy who made his shed the top rated restaurant on trip advisor. In short you can’t trust reviews.

Adverts fall under advertising guidelines in both the UK and US where if ads are misleading they have to be removed. Publishers who regularly break these rules or are deemed to be not doing enough to prevent misleading ads can be fined. So AdWords must adhere to a higher standard than the sites it would be taking the source content from, hence the potential problem.

So was Google right to remove them and will we be seeing them again?

In my opinion reviews are becoming less of a trustworthy metric. The ability of bad actors to place them has far outstripped that of companies to ensure they are legitimate. This might be down to the cost / benefit being vastly different from both sides. For an individual company a set of good reviews could be the difference in zero revenue for a product, or $1000’s. So spending several hundreds or even thousands just on ensuring they have good reviews is well worth it. For a review site though, they simply cannot afford to spend anywhere near an equivalent amount on checking the reviews for an individual product. In short there is a lot more total effort going into making fake reviews than there is in removing them. Google tries to negate this with automated extensions by using a wide variety of sources, or their own survey results which they have more control over.

The way the Reviews extension was implemented in AdWords was pretty much asking for trouble. Then again, while there are better ways, I still don’t think there are any that are ‘good enough’. We’ll have to wait and see what Google does in the longer term and if it decides to make any changes in the organic results or to the other automation tools on their site.

#ABCO #Technology teaches courses for E-commerce and web design. Contact our campus between 9 AM and 6 PM Monday through Friday at: (310) 216-3067.

Email your questions to: info@abcotechnology.edu

Financial aid is available to all students who can qualify for funding.

ABCO Technology is located at:
11222 South La Cienega Blvd. STE # 588
Los Angeles, Ca. 90304

Building successful webpages today!

New breakthroughs in cyber security by Polyverse will create new jobs for programmers and network administrators.

The “WannaCry” virus, which took down, among others, the National Health Services’ computer network in the United Kingdom last May, was formally labeled a North Korean plot by the U.S. last month. Lost in the foreign intrigue were some basic questions. Why had computer administrators in the U.K. and elsewhere not applied the software fix issued by Microsoft months earlier to protect the vulnerable Windows software? Or was the U.S.’s National Security Agency partly to blame for stockpiling malicious code?

Perhaps most important, why hadn’t billions of dollars worth of computer security gear from leading cyber protection companies such as Cisco Systems (ticker: CSCO), FireEye (FEYE), Palo Alto Networks (PANW), and Symantec (SYMC) foiled the attacks?

Not too surprisingly, the founders of a three-year-old cyber security start-up called Polyverse are convinced their new system “would have completely prevented WannaCry,” says Alex Gounares, the company’s CEO.

The chief technologist of Microsoft’s (MSFT) online unit back in the 2000s, Gounares—who was Bill Gates’ personal technology advisor—says the hackers’ success relied on a simple asymmetry: It costs far less to attack a computer system than it does to protect one. Existing systems build the equivalent of walls and moats around a castle, the so-called firewall that sniffs out intruders and tries to block access. The defenders must guess where they might be attacked and try to anticipate every possibility. It’s a costly and never-ending process.

The problem is that attackers can spend all the time they want studying the situation, looking for holes or ways to get around the protections. If anything, new discoveries have made it easier for hackers far less sophisticated than WannaCry’s creators to take control of a computer.

Polyverse, which has gotten good reviews for its fledgling system, tries to shift the balance of economic power back to the defenders. By replacing the basic instructions inside a computer program with alternate instructions, Polyverse keeps scrambling the code. Doors and windows familiar to hackers disappear quickly, raising the stakes—both on speed and spending—for attackers.

“Dollar for dollar, offense has been winning,” despite billions spent on computer defense, says Bryan Smith, who worked for six years at the National Security Agency and now runs a tech incubator called Bantam Technologies. “Polyverse actually does switch the advantage back to the defender.”

If #Polyverse or a rival does succeed, it will mark the latest shift in the decades-long war for control of computer networks. A computer operates via a series of instructions written by a programmer telling the microprocessor, the brains, to carry out one basic function over and over. That function is to take some values stored in its memory circuits, to perform an operation on them, such as addition, and stick the result back in memory. A hacker tries to gain control of a computer by replacing the programmer’s series of instructions with his own, either changing the operations specified or sometimes changing where in memory the chip fetches and stores values.

One of the last big strategic shifts in the war came in 2007, when a computer scientist named Hovav Shacham showed it was possible to use a computer’s own code against it without injecting new code. Code is a long string of ones and zeros, and the computer chip only knows the instructions by knowing how to divide the ones and zeros into the right sequence of bits that make up each successive instruction. But Shacham realized he could direct the chip to divide the ones and zeros differently, thus changing the instructions.

To complicate hackers’ task, Gounares, 46, conjured ways for them to find not the traditional string of instructions, but a completely different set. Polyverse’s technology is what’s called a binary scrambler. It mixes up the ones and zeros of a program but lets the users’ tasks be completed undisturbed. The exercise turns the attackers’ own game against them, employing different instructions before the attacker can.

AT MICROSOFT, Gounares was well aware of the common complaint that Windows was a “monoculture,” a uniform system that attracted a mass following of developers but also armies of attackers aware of the software’s vulnerabilities. At Gates’ famous retreats to contemplate high-level software issues, the two would occasionally discuss using epidemiology, or the study of the spread of disease, as a guide.

Gounares, who is fond of nerdy references, poses the question, “Why hasn’t the earth been taken over by the zombie apocalypse?” The answer is because human DNA varies enough that no diseases can spread so far they devastate the entire population. But software is like DNA that’s uniform: It can be compromised because it’s reliably the same.

The solution was to create “entropy,” as he puts it—a divergence in the code so that every computer has unique sequences of instructions running through it. Polyverse’s product to date has been for scrambling the Linux operating system. Later this year, it will offer a version that can scramble the entire Windows operating system and programs that run on Windows, says Gounares. Some customers have been given the Windows version to test.

POLYVERSE IS A VERY small company with a promising idea. Funded with just $6 million in private capital, the Seattle-area entity has less than $10 million in annual sales, though Gounares pledges that will rise into the tens of millions over the course of the next 12 months. That’s compared with roughly $2 billion annually in security-related revenue for Cisco, the biggest publicly traded cyber security vendor.

To be sure, Polyverse is not the only company to have thought of what’s known as “moving target defense.” The Massachusetts Institute of Technology’s Lincoln Labs has a rich literature on the subject. But researchers there found problems cropping up: Either the scrambling is limited, leaving avenues of attack, or the scrambled programs degrade in performance.

“We have taken this from an academic approach to an industrial-strength system,” insists Gounares. Polyverse scrambles all the parts of a program, not just some, he says, and without affecting the performance a user experiences.

Steven Potter, a former Navy SEAL who heads sales, sees the military as a key market for Polyverse. There are U.S. weapons systems running on versions of Windows no longer supported by Microsoft. To rip and replace, as they say, those computer systems to make them safer can run into billions of dollars. Hence, a Polyverse sale can be an economical option for government, notes Potter, who served as a contractor in Afghanistan ensuring cargo was safe for the war effort. The firm has already won several military contracts.

Potter, however, becomes most animated when discussing the possibilities offered by the weakness of existing cyber companies. “Where the disruption comes from,” says Potter, “is that with the Palo Alto’s, and the FireEyes, and Symantecs, you can literally take a class and for $1,000, you can hack through any known firewall on the planet.”

Cisco, FireEye, and Symantec declined to comment, while Palo Alto did not return my calls last week.

With the publicity and questions that accompany each new WannaCry-like cyber disruption, Polyverse’s opportunity grows. The system of walls and moats just might be giving this company a great opening.

ABCO Technology offers a complete program for cyber security. Cyber security jobs in Los Angeles are exploding. If you are interested in a career in this exciting field, contact ABCO Technology.

You can reach us by telephone from 9 AM to 6 PM Monday through
Friday at: (310) 216-3067.

Email your questions to: info@abcotechnology.edu

Financial aid is available to all students who qualify for funding.

ABCO Technology is located at: 11222 South La Cienega Blvd. STE #588

Los Angeles, Ca. 90304

 

Cyber security jobs will expand through 2030 says the US Department of Labor. Start your new career today!

How to ensure your local Search Engine Marketing campaign is working for your business

 

For local businesses, having a strong presence in local search results is fundamental to those all-important conversions

Just to be clear, a “local business” refers to any business that has either a physical location that offers face-to-face contact with the customer, such as a showroom or shop, or one that offers a face-to-face service within a certain area.

When it comes to local search, it’s simple: if searchers can’t find you on the web, then frankly, you are web invisible. It’s the way of the modern world.

It’s all very well dominating the SERPs for your more general target keywords, but if you fail to rank highly for location-specific terms then you are missing a great opportunity.

When users are searching for a local term, they are far more likely to be looking for a service or product. Hence why the conversions on local search tend to be higher, and why you need to ensure that your local search engine marketing is on target for your business.

Of course all the usual SEO 101 knowledge applies. Offer an unrivaled user experience, nail your on-site optimization, provide exceptional content and build quality links.

Those fundamentals will set you up for ranking well for local search terms, but there are extra steps you must take to differentiate yourself from the competition and really bolster your local SEM strategy.

Local business listings

The first place to start is with local business listings. Ensure that your business is included in all the major directories (Yell, which is the UK’s local directory, Yelp, Thomson Local, etc.), as well as any industry specific ones. Some listings may already exist, and it may just be a case of claiming your business so that you can take ownership of the listing.

We recommend keeping track of all your business listings in one comprehensive spreadsheet to save you repeating or forgetting any entries. It also enables you to be consistent (more on this in the next point) in your information across all listings.

Remove all duplicated entries, as multiple listings for one business or location can become confusing, both to potential customers but also to Google. And we certainly don’t want to be confusing the Big G.

Be thorough but don’t be reckless. Avoid spammy directories as these could have a detrimental effect on your SEO. Deploy a spot of common sense to identify the spammy directories but if you are really unsure then it’s worth checking the spam score via Moz’s Open Site Explorer or via other similar tools.

Google My Business

So this technically falls under business listings, but it’s so important we’ve given Google My Business its own subheading. Arguably the most important business listing because, well, it’s Google. Remember to implement the following:
◾Claim your business via a verification process
◾Include accurate information: contact details, location and opening hours
◾Carefully select a small number of highly relevant categories to represent your business
◾Ensure up-to-date branding, such as in any images of logos or premises
◾Use high quality images to represent the business

Be comprehensive and accurate in the information you provide in order to strengthen your Google My Business profile and improve your chances of being featured in Google’s three-pack.

For further information, have a read of Google’s guidelines on representing your business. Don’t forget to also cover off the equivalent for Bing and Yahoo with Bing Places and Yahoo! Local.

NAP consistency

NAP consistency sounds a like a fancy term but the concept is very simple. NAP stands for Name, Address and Phone number, although it is sometimes expanded to NAP+W to include website address too. As mentioned above, it is crucial that your business information appears consistently across the web.

This is particularly important to consider if your business has changed address, contact details or even rebranded. Any mentions of your business will need to be checked and updated to ensure accuracy.

Simply google your business name (do the same with your previous business name if you have undergone a name change) and work your way through the listings. Maintain a spreadsheet of your progress so you can keep track.

Reviews

Reviews can bring both utter joy and absolute misery to any business owner. Unfortunately you cannot simply ignore them, as reviews are indeed used as ranking signals in the eyes of the search engine. This is especially true for your Google My Business reviews.

Not only are reviews important in terms of local rankings, they are also key in terms of click-through rates. According to a recent study by BrightLocal, 74 per cent of consumers say that positive reviews make them trust a local business more.

Apart from providing the most incredible customer service you can muster, how else can you seize some control over your reviews? No, this isn’t about getting your mum, brother and great-nan to write a review for your business. It’s about a bit of gentle encouragement and managing a bad customer experience before it reaches the review stage.

It is also important to check the rules and regulations of each review platform, as they all have very different policies on asking customers for reviews and responding to them.

We’ve had several students who have received a negative one-off, anonymous review for their business or website that is either quite clearly spam, or in some cases, a bitter competitor or personal enemy. These situations can get a bit sticky, but sadly there isn’t an awful lot you can do.

Generally people won’t be deterred by one bad review, and the best course of action is to encourage other happy customers to get reviewing. This will push the bad review down and push the average star rating back up.

Many review platforms allow you to reply to reviews. This can be a good opportunity to set the record straight but you have to be careful about it. For this reason, sometimes it is best to get someone who is not as emotionally invested in the business to either write the response or edit it before it gets published. Be professional, remain calm, and kill them with kindness.

Location pages

If you don’t already have location pages on your website, then you could be missing a valuable opportunity to target all the relevant locations. For each key location that your business operates within, create a page dedicated to that location on your website. This is easier if you have a unique physical address in each location, as it is important to include as much location-specific information as possible.

Where there is a physical location, be sure to include an interactive map and images to further enhance the page. If you do not have separate physical addresses, try including testimonials and case studies relevant to each location.

This will help you to avoid duplicating content across your location pages; it’s a fine art to differentiate the copy, but do it right and it can have seriously good effects on your local SEM strategy.

Schema markup

Once you have your location pages set up, the cherry on the cake is schema markup. The whole concept of structured data can sound very daunting to markup newbies, but it’s easier than it sounds. Schema markup simply helps search engines to understand what your website is about.

This is particularly important for local information, as it will help those spiders crawl your location pages and you’ll benefit as a result.

According to a study by Searchmetrics, pages with schema markup rank an average of four positions higher in search results. Now that’s a pretty good incentive. Get your head around schema markup and you’ll have that crucial advantage over your competitors in the local search results.

Ensuring your local search marketing strategy is up to speed shouldn’t be difficult or convoluted. Follow the above steps and obey the usual SEO rules. With some hard work and perseverance, you’ll start dominating those coveted top spots and see your conversions skyrocket in no time.

ABCO Technology teaches a comprehensive program for web design, which includes search engine marketing and social media strategy. Call our campus to receive information about this program or other classes. Call us between 9 AM and 6 PM Monday through Friday at: (310) 216-3067.

Email your questions to: info@abcotechnology.edu

Financial aid is available to all students who can qualify for funding.

 

ABCO Technology is located at:
11222 South La Cienega Blvd. in STE # 588
Los Angeles, Ca. 90304.
Build highly successful websites today!

Four Top tips for guest blogging

Many people still rely on guest blogging as an ongoing part of their link-building strategy.

If you analyze this for a moment, getting links through guest blogging is much easier than getting links through some other channel.

So it’s no wonder that a lot of bloggers and SEO experts still favor this method. The main question is, are these guest post links still safe and viable?

The answer is yes.

It is very possible to build our link portfolio by contributing on authoritative websites. However, there are certain rules that we need to follow in order to avoid any issues with Google.

Guest blogging 101

One of the beautiful things about guest blogging is that it gives us the opportunity to score some great links from high-tier websites.

To be honest, we probably couldn’t have gotten those links with any other method. This is why guest blogging is always worth bearing in mind as a link-building strategy.

But, we need to be very careful when choosing the websites we guest post for.

Although guest blogging can be carried out on a large scale, you should probably avoid it. This method is optimal in small dosages while cooperating with the biggest domains.

Google trusts authoritative domains. If they notice that your links are coming only from reputable sources, they will not impose a penalty. However…

1) Be Careful not to overdo it!

The biggest problem with guest blogging is that people tend to overdo it.

Your articles should be coming from various sources, with different anchors. If your only source of inbound links is from guest articles, Google will notice this pattern and you will soon get into trouble.

Instead, you should choose your battles carefully. You need to diversify your link profile.

If you already decide to do some guest posting, make sure it counts. Otherwise you’ll waste all that time you spent building up relationships and writing your posts – with only a Google penalty to show for it.

2) Focus on quality

This is where most people go wrong.

Google assesses the articles from which you are getting links. If the article is of high quality, your link will also be regarded as a quality one.

This makes sense, right? After all, why would anyone bother creating a great piece only to place crappy links throughout?

So when you put together a guest post, make sure it’s a good one.

After creating their own article, people try to promote it by writing guest posts. These guest posts will usually be of much lower quality and they will have the same regurgitated content which you published on your own blog.

By doing this, not only are you getting a devalued link, but you are also endangering your original piece. Google will flag up the regurgitated versions of your article as possible duplicate content. And because there are any number of similar, low-quality pieces out there online, it may conclude that your article is low-quality as well.

Everything you create has to be unique and to provide value to the reader. When you write a guest post, ask yourself: would I link to this piece?

If the answer is yes, you are in the clear.

3) Add images, links and formatting

As I mentioned, each guest post you make needs to be distinctive. Even if you are employing this strategy on a larger scale, at least make sure that everything you create is a separate entity.

One of the best ways to differentiate articles is by using varied formatting.

Blogs always have use different fonts and that is something you have no control over. But you have the ability to break up paragraphs and add things like bullet points, subheadings, block quotes and more. These increase the readability of your piece, and also make it easier for search engines to crawl it and interpret the content.

Another way to improve the look and layout of your text is to add images and other media.

Do not be shy and don’t wait for the website editor to insert them for you. Instead, be proactive and use your own images. Add a couple of them if necessary. If they make sense and the text looks better because of it, the editor will be more inclined to ask you for additional guest posts.

You can even go the extra mile and write titles and alt text to optimize the images for SEO – the editor will thank you, as it will save them the effort, and it will improve the overall SEO value of your piece.

Lastly, we come to links.

Now, editors usually allow one link in your bio, or one link within the article. Most of them do not like it when an author writes a piece with numerous links pointing to different websites.

However, if the editor allows it, make sure to add some highly relevant links that will make the article even more authoritative.

4) Vary your anchor text

You are trying to rank for a certain keyword. In an attempt to rank, you try to spam the same anchor text over and over.

This strategy is pretty much obsolete. Instead, just as with everything else that we’ve mentioned so far, make sure to diversify things.

Anchor text should vary.

When people place links with purely editorial value, without trying to cynically rank for a specific keyword, they will rarely link with the exact same phrase every time. This is highly unnatural behavior and can get you in trouble.

Instead, make sure to use different phrases. Place links in different sentences, with different anchors. Focus on writing naturally and place your link accordingly.

Conclusion

Guest blogging is NOT dead. As far as we know, there is no Google system or algorithm that will penalize the creation of such articles.

Nevertheless, it is better to be conservative. Like always, it comes down to whether your link profile looks natural. There should be no indication that you are purposely trying to push a keyword (even if you are).

People usually think about guest blogging in terms of links. However, you should observe it from a different perspective. By using this strategy, not only should you get links, you should also get some good exposure.

Your articles should promote your skills as well as your blog.

By placing emphasis on this, you will be able to accomplish much more with guest posts and as a result, links will start coming from various sources without you forcing them.

ABCO Technology teaches a comprehensive course for web development, guest blogging and search engine marketing. Call our campus between 9 AM and 6 PM Monday through Friday. You can reach us at: (310) 216-3067.
Email your questions to: info@abcotechnology.edu

 

Financial aid is available to all students who can qualify for the federal funding.
ABCO Technology is located at:
11222 South La Cienega Blvd. in STE # 588
Los Angeles, Ca. 90304
Start building successful websites today!

×

Request Info with No Obligation

    How much is tuition?Can I get financial aid?What are my career prospects?When does it start?

    By checking this box, I give consent for ABCO Technology to use automated technology to call and/or text me at the number provided above, including my wireless number if applicable. Call us for information: 310-216-3067

    I understand & agree