On March 1, 2018, New York State moved forward with the strongest cybersecurity regulations in the United States. All banks, financial institutions, and other investment organizations that transact business with the public must have a cybersecurity officer as part of their information technology staff.
The New York regulations have nationwide reach because all national banks and investment offices have major branches in New York. Companies that fail to comply with this regulation will be subject to fines of up to 1% of assets, which is a large number. For example, a bank licensed in New York managing one trillion dollars of assets could be fined up to ten billion dollars simply for not having a cybersecurity officer on staff.
Becoming a cybersecurity officer can be accomplished with proper certification and training. The best path to a cybersecurity career in the financial industry is to first become a network administrator and then add cybersecurity certifications to your existing credentials.
Let’s break down each certification one by one.
1. Begin with the CompTIA A+ certification, which qualifies the holder to repair, maintain, install, and configure computers and devices on a network. This certification can be completed in less than two months.
2. Microsoft’s MCSE, or Microsoft Certified Solutions Expert. This certification, which is completed in six months, qualifies the holder to install, maintain, configure, and secure a corporate server, the central computer on a network that distributes programs to all workstations.
3. Your next certification is the Cisco Certified Network Associate or CCNA, which qualifies the holder to install, configure and secure a corporate router. This certification is usually finished in six weeks.
4. You are now a junior-level network administrator. Your first cybersecurity certification is the CompTIA Security+, which qualifies the holder to secure a network against cyberattacks and to train other employees in cybersecurity. This certification is completed in six weeks.
5. The next certification is the Certified Ethical Hacker, which is completed in six weeks. This course teaches you to think like a hacker so you can defend your organization against cyberattacks. The certification also gives you access to the latest attacks committed by hackers and shows how you can defend against them.
6. Your final certification is the CompTIA Linux+, which teaches you how to use the Linux operating system. Organizations large and small run Linux in the background because the operating system is difficult for hackers to penetrate and attack.
After completing these certifications, which will take you about one year, you can apply for a position in cybersecurity. The US Labor Department has stated that jobs in this field will grow at an average rate of 15% a year through 2030. This career offers strong job security, good benefits, and a chance to earn an excellent living doing what you love.
If you are interested in a promising career in the fast-growing field of cybersecurity, contact ABCO Technology. You can reach our campus between 9 AM and 6 PM, Monday through Friday. Call our campus today at: (310) 216-3067.
Email your questions to: info@abcotechnology.edu
Financial aid is available to all students who qualify for funding.
ABCO Technology is located at: 11222 South La Cienega Blvd., STE #588, Los Angeles, CA 90304
Cybersecurity is an exploding jobs field. Train and certify for a position today!
Over the years, I’ve written and reprinted many articles for ABCO Technology’s Facebook page. Today I want to reprint, and give special credit to, Tiernan Ray, who writes for Barron’s magazine. His article about quantum computing was published with an investment perspective; however, readers interested in information technology careers will grasp the job possibilities represented in this brilliant piece.
Technology Trader
Getting Your Mind Around Quantum Computing
Is five years beyond your investment horizon? If so, ignore what I’m about to say: In five years, we will have practical quantum computers, long the holy grail of computer scientists.
That prediction comes from Microsoft (ticker: MSFT), which is pursuing novel avenues to build a computer that operates on the strange quantum mechanical properties of subatomic particles. Such computers may solve previously intractable problems in information technology.
Even if quantum computing lies outside your portfolio considerations, there are implications worth pondering. Quantum computers are already being “simulated” by Microsoft, meaning that some of their basic operations are being mimicked on plain old microprocessors and memory chips.
As quantum computing grows nearer, and as programmers eager to learn about it explore it through mimicry, it could ripple through technology. The race for innovative chips, software, and cloud computing could be affected. Companies that shoulder the risk and reward include chip makers Intel (INTC), Nvidia (NVDA), and Micron Technology (MU), and cloud-computing operators such as Microsoft, with its Azure cloud service; Alphabet’s (GOOGL) Google; and Amazon.com (AMZN).
QUANTUM COMPUTERS EXPLOIT nonlinear aspects of quantum particles such as “entanglement” and “superposition,” in which particles exist in not one but several states simultaneously. That makes possible computations in parallel, rather than the traditional one-by-one processing of classical computing. Nobel physicist Richard Feynman helped propel the field in a series of 1981 lectures, when he proposed a computer built using individual atoms. Because atoms have “measurable physical attributes,” known as “spin,” said Feynman, digital ones and zeros could be represented, or encoded, in them. Later, scientists broadened the concept. Instead of ones and zeros at a subatomic level, the qualities of entanglement and superposition could give quantum computers the ability to dramatically multiply the work that can be done in a given amount of time.
Making quantum computing practical has taken decades of fundamental research. A turning point came in 2012, Microsoft’s quantum team leader, Todd Holmdahl, told Barron’s last week. That was the year a team that included Leo Kouwenhoven, principal researcher on Microsoft’s quantum team, found evidence of the Majorana fermion. The Majorana is a particle with the property of being both matter and antimatter at the same time. Prior to that, its existence had only been hypothesized.
Kouwenhoven and the Microsoft team have gained greater control of the Majorana since then, says Holmdahl. Today, they are using it as a storage medium to manipulate a qubit, the fundamental unit of information in a quantum computer.
The Microsoft approach has its detractors, but Holmdahl and his colleague, physicist Julie Love, who heads business development, believe that the company will end up with the best qubits, that is, those with the lowest error rates. Minimizing errors means that the eventual Microsoft quantum computer should involve a far simpler design than rivals, and one that’s more scalable and practical.
The eventual quantum machine could offer breakthroughs in computationally intense fields, such as the chemistry of heavy metals. Artificial intelligence could be dramatically sped up.
OUTSIDE OF MICROSOFT, MANY FIRMS, including Alphabet, IBM (IBM), and various start-ups, are actively working on the technology, and programmers increasingly want to simulate the computers before they’re available commercially. That could further boost demand for DRAM memory chips. To simulate a relatively simple quantum computer involving just 40 qubits requires 16 trillion bytes of DRAM, a thousand times as much as the average laptop. That’s nice for Micron Technology (MU), which makes such components, along with Samsung Electronics (005930KS) and SK Hynix (000600.Korea).
Such simulations should fuel demand for Azure and other cloud-computing providers. After all, it’s much easier to roll out trillions of bits of DRAM if you’re Microsoft Azure, Google Cloud, or Amazon AWS than it is for the average shop to buy tons of memory chips for laptops. Moreover, the algorithms to simulate quantum computing are still being theorized and tested. By rolling out new software, Microsoft and its cloud rivals can make the case that their services are perfect for learning about the new technology.
Quantum simulation may also put a strain on today’s chips. After all, current chips were first developed 60 years ago for processing simple bits, not for qubits with their multiple simultaneous states. Traditional processors that manipulate integer or floating-point arithmetic might suffer by comparison to novel designs based on other principles.
The chip industry is already undergoing great change, and industry veterans are reinventing themselves with new start-ups. One is Ampere Computing, led by former Intel software executive Renee James. While James declined to describe the design of her new chips, she says some will be built to handle tasks such as artificial intelligence. Quantum could fuel such specialization, if there is enough demand to run the new emerging algorithms. While Nvidia and Advanced Micro Devices (AMD) are both seeing a renaissance for their graphics chips, they may have to prepare for computers with very different requirements.
And what of Microsoft? Its quantum efforts have to be reckoned with. The effort may be the most promising development at the company since Satya Nadella became CEO four years ago. Regardless of whether Microsoft makes it across the finish line before others, the fact that it is competing in the race is encouraging for those rooting for the company.
Related: Microsoft: We Have the Qubits You Want
TIERNAN RAY can be reached at: tiernan.ray@barrons.com
ABCO Technology offers classes in five major areas of information technology, which include: networking, cyber security, web development, computer programming and Microsoft Office products. Call our campus today between 9 AM and 6 PM. We are available Monday through Friday at: (310) 216-3067.
Train and certify for a career in information technology today!
For one of the world’s largest tech companies, “small” is a relative term.
So when IBM, a tech conglomerate that boasts 380,000 employees, says it has a “small” team working on blockchain, by startup standards it’s anything but. Far from just building a garage and staffing it with a few engineers, IBM has created a network of global offices to operationalize its team of 1,500 blockchain professionals, now operating out of a dozen offices.
Perhaps more impressively, all those moving parts are choreographed by one person: Marie Wieck, a 20-year veteran of IBM and the general manager of the newly created blockchain unit.
In an exclusive interview with CoinDesk, Wieck explained what it takes to build distributed networks using both its proprietary IBM Blockchain Platform and the open-source Hyperledger Fabric, which her firm helped pioneer. For companies looking to gain access to one of those networks, build their own network or compete against IBM, the step-by-step description provides a rare glimpse into how the $135 billion company conducts its blockchain business.
Speaking from her office at IBM’s Watson headquarters in downtown Manhattan (one half of what is internally referred to as “Blockchain North”), Wieck painted a picture of a distributed team that in many ways mirrors a blockchain in its design.
She told CoinDesk:
“We’re trying to keep as co-located as possible with the teams working together so we can really focus on the speed to market that we want to see.”
Blockchain North
Thomas J. Watson Research Center
While her job now is buzzing back and forth between the Manhattan location and the Thomas J. Watson Research Center in Yorktown Heights, New York (the other half of Blockchain North), Wieck first started working with IBM back in 1997 when she joined as a founding member of the company’s nascent internet unit.
As part of this team, she began a career of finding business use cases for cutting-edge technology that would eventually include XML, web services and mobile, preparing her in many ways for her current task of helping IBM’s clients with blockchain.
The “solution work” of this process – as Wieck calls it – is centered around Blockchain North, the assembly line mechanism of the project, where staff help clients around the world build applications using the IBM Blockchain Platform.
Due in large part to the open-source code at the core of IBM’s blockchain strategy, one which lets clients build on their own distributed ledgers as well, Wieck frequently doesn’t get involved until the clients – or potential clients – are already well advanced in their work.
As for work on that open-source platform, and the IBM Blockchain Platform itself, that largely takes place 511 miles south.
Blockchain South
IBM Research Triangle Park
Known as “Blockchain South,” the Research Triangle Park offices in Raleigh, North Carolina, are home to what Wieck calls IBM’s “platform work.”
This is where the IBM Blockchain Platform – unveiled for enterprises last month – has been developed for the past three years. The platform is designed to be an end-to-end or “full-cycle” solution where developers and managers can experiment with the technology, building it and testing it either by the hour or via subscriptions.
This platform is the machinery that in part cranks out the solutions in Blockchain North. But “platform work” also has another meaning at Blockchain South.
For builders around the world with a more adventurous bent, this is also where they can go to hire help on projects that bypass IBM’s proprietary platform and go straight to its open-source core: Hyperledger Fabric.
While Fabric comprises about one-third the total code used in the proprietary IBM Blockchain Platform, anyone can build on it – even if what they want to create is a direct competitor to IBM.
“Whatever they need to do at the technical level to operate or to build a blockchain network, we would like to see continuing to expand in that platform,” said Wieck.
Littleton, Massachusetts
IBM Mass Lab – Littleton campus
IBM’s newest blockchain offices are located at the IBM Mass Lab in Littleton, Massachusetts.
Originally opened in January 2010 as what was then touted by IBM as the largest software development lab in North America, the location now serves as a satellite location of sorts for Blockchain North.
But instead of being focused on solutions work generally, the location is helping develop what Wieck calls “solution accelerators,” or frequently used widgets such as the provenance engine required by many of IBM’s clients to track items.
Crucially, however, this is also the base operations for another kind of solution: governance.
Based on the lessons learned from other implementations, IBM uses the Littleton branch to help companies write software to onboard new members, develop consensus mechanisms so they can find ways to agree, and if things go wrong, kick bad actors off the network.
Or as Wieck put it:
“How to actually operate a network at scale.”
In the garage
IBM Bluemix Soho
Arguably the most startup-like component of IBM’s blockchain work, Wieck also oversees nine “Bluemix Garages” scattered around the world, in New York City, Toronto, San Francisco, London, Nice, Tokyo, Singapore, Austin and Melbourne.
Initially launched in 2014, the collaborative locations are similar to WeWork facilities, but with startups hand-selected to receive support from IBM.
Gradually, those locations are being adapted to accommodate increasing demand from blockchain companies. Most recently, this July, the Bluemix Garage in the Soho area of New York expanded to include support for blockchain services.
At these disparate locations, and in any real garages where people build on the open-source technology Wieck helped develop, she said the basic principles that form IBM’s blockchain networks first take root.
“To me, it’s kind of like a mall,” she said, concluding:
“You may have the anchor tenants, but you don’t stay in a mall unless the food court is good, there’s good movies playing. You want all of those value-added services around that network.”
ABCO Technology teaches courses for cyber security. Learn about blockchain technology and what it can do for your business. Call our campus today between 9 AM and 6 PM at: (310) 216-3067.
Blockchain is more than bitcoin. Learn about it today!
As a certified Internet web developer, your role will invariably lead you to interactions with people in a wide variety of roles including business owners, marketing managers, content creators, link builders, PR agencies, and developers.
That last one – developers – is a catch-all term that can encompass software engineers, coders, programmers, front- and back-end developers, and IT professionals of various types. These are the folks who write the code and/or generally manage the underlying various web technologies that comprise and power websites.
In your role as a web developer, it may or may not be practicable for you to completely master programming languages such as C++ and Java, or scripting languages such as PHP and JavaScript, or markup languages such as HTML, XML, or the stylesheet language CSS.
And, there are many more programming, scripting, and markup languages out there – it would be a Herculean task to be a master of every kind of language, even if your role is full-time programmer and not a web developer.
But it is essential for you, as a certified web developer professional, to understand the various languages, technologies, and technology stacks that comprise the web. When you’re making website recommendations, which developers will most likely be executing, you need to understand their mindset, their pain points, and what their job is like, and you need to be able to speak their language.
You don’t have to know everything developers know, but you should have a good grasp of what developers do so that you can ask better questions and provide SEO recommendations in a way that resonates with them, and those recommendations are more likely to be executed as a result.
When you speak their language, and understand what their world is like, you’re contributing to a collaborative environment where everyone’s pulling on the same side of the rope for the same positive outcomes.
And of course, aside from building collaborative relationships, being a professional web developer involves a lot of technical detective work and problem detection and prevention, so understanding various aspects of web technology is not optional; it’s mandatory.
Web tech can be complex and intimidating, but hopefully this guide will help make things a little easier for you and fill in some blanks in your understanding.
Let’s jump right in!
The internet vs. the World Wide Web
Most people use these terms interchangeably, but technically the two terms do not mean the same thing, although they are related.
The Internet began as a decentralized network of independent interconnected computers.
The US Department of Defense became involved over time, awarding contracts that included the development of ARPANET (Advanced Research Projects Agency Network), an early packet-switching network and the first to use TCP/IP (Transmission Control Protocol and Internet Protocol).
The ARPANET project led to “internetworking” where various networks of computers could be joined into a larger “network of networks”.
The development of the World Wide Web is credited to British computer scientist Sir Tim Berners-Lee in the 1980s; he developed a way of linking hypertext documents, which resulted in an information-sharing model built “on top” of the Internet.
Documents (web pages) were specified to be formatted in a markup language called “HTML” (Hypertext Markup Language), and could be linked to each other using “hyperlinks” that users could click to navigate to other web pages.
Further reading:
◾History of the Internet
◾History of the World Wide Web
◾ARPANET
Web hosting
Web hosting, or hosting for short, is a service that allows people and businesses to put a web page or a website on the internet. Hosting companies maintain banks of computers called “servers” that are not entirely dissimilar in nature to computers you’re already familiar with, though of course there are differences.
There are various types of web hosting companies that offer a range of services in addition to web hosting; such services may include domain name registration, website builders, email addresses, website security services, and more.
In short, a host is where websites are published.
Further reading:
◾Web Hosting Service
Web servers
A web server is a computer that stores web documents and resources. Web servers receive requests from clients (browsers) for web pages, images, etc. When you visit a web page, your browser requests all the resources/files needed to render that web page in your browser. It goes something like this:
Client (browser) to server: “Hey, I want this web page, please provide all the text, images and other stuff you have for that page.”
Server to client: “Okay, here it is.”
Various factors impact how quickly the web page will display (render) including the speed of the server and the size(s) of the various files being requested.
There are three server types you’ll most often encounter:
1. Apache is open-source, free software compatible with many operating systems, such as Linux. An often-used acronym is “LAMP stack,” referring to a bundle of Linux, Apache, MySQL (a relational database), and PHP (a server-side scripting language).
2. IIS stands for “Internet Information Services” and is proprietary software made by Microsoft. An IIS server is often referred to as a “Windows Server” because it runs on Windows NT operating systems.
3. NGINX, pronounced “Engine X,” is billed as a high-performance server that can also handle load balancing, act as a reverse proxy, and more. Its stated goal is to outperform other types of servers.
Further reading:
◾Apache
◾IIS
◾NGINX
Server log files
Often shortened to “log files”, these are records of server activity in response to requests made for web pages and associated resources such as images. Some servers may already be configured to record this activity; others will need to be configured to do so.
Log files are the “reality” of what’s happening with a website and will include information such as the page or file requested, date and time stamp of the request, the user agent making the request, the response type (found, error, redirected, etc.), the referrer, and a few other items such as bytes served and client IP address.
Web developers should get familiar with parsing log files. To go into this topic in more detail, read JafSoft’s explanation of a web server log file sample.
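As a sketch of what parsing looks like, here is one line in the common Apache “combined” log format pulled apart with a regular expression. The line itself and the regex are illustrative; real log formats vary by server configuration.

```javascript
// One line of an Apache "combined" format access log (example data).
const line = '203.0.113.5 - - [10/Oct/2023:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326 "https://example.com/" "Mozilla/5.0"';

// Capture the fields described above: client IP, timestamp, request
// method and path, response status, bytes served, referrer, user agent.
const pattern = /^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]+" (\d{3}) (\d+|-) "([^"]*)" "([^"]*)"/;
const [, ip, timestamp, method, path, status, bytes, referrer, userAgent] =
  pattern.exec(line);

console.log(ip, method, path, status); // 203.0.113.5 GET /index.html 200
```

Multiplied across thousands of lines, this kind of parsing is how log analysis tools reconstruct which pages were requested, by whom, and with what result.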
FTP
FTP stands for File Transfer Protocol, and it’s how you upload resource files such as webpages, images, XML Sitemaps, robots.txt files, and PDF files to your web hosting account to make these resource files available and viewable on the Web via browsers. There are free FTP software programs you can use for this purpose.
The interface is a familiar file-folder tree structure where you’ll see your local machine’s files on the left, and the remote server’s files on the right. You can drag and drop local files to the server to upload. Voila, you’ve put files onto the internet! For more detail, Wired has an excellent guide on FTP for beginners.
Domain name
A domain name is a (usually) human-readable string of text used in a URL (Uniform Resource Locator). Keeping this simple, for the URL https://www.website.com, “website.com” is the domain name. For more detail, check out the Wikipedia article on domain names.
Root domain & subdomain
A root domain is what we commonly think of as a domain name, such as “website.com” in the URL https://www.website.com. A subdomain is a prefix added to the root domain; the www. part of the URL is one example. Other examples of subdomains would be news.website.com, products.website.com, support.website.com and so on.
For more information on the difference between a domain and a subdomain, check out this video from HowTech.
URL vs. URI
URL stands for “Uniform Resource Locator” (such as https://www.website.com/this-is-a-page) and URI stands for “Uniform Resource Identifier.” Strictly speaking, a URL is a type of URI: one that both identifies a resource and tells you where to find it. In casual usage, “URI” often refers to just the path portion (such as /this-is-a-page.html). More info here.
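The parts of a URL can be inspected with the standard URL API, available in browsers and Node.js. The example URL below reuses the hypothetical website.com names from above.

```javascript
// Breaking a URL into its named parts with the standard URL API.
const url = new URL("https://news.website.com/this-is-a-page?x=1");

console.log(url.protocol); // "https:"          the scheme
console.log(url.hostname); // "news.website.com" subdomain + root domain
console.log(url.pathname); // "/this-is-a-page"  the path to the resource
console.log(url.search);   // "?x=1"             the query string
```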
HTML, CSS, and JavaScript
I’ve grouped HTML, CSS, and JavaScript together not because each doesn’t deserve its own section, but because it’s good for web developers to understand that these three languages comprise much of how modern web pages are coded (with many exceptions of course, some of which are noted elsewhere in this guide).
HTML stands for “Hypertext Markup Language”, and it’s the original and foundational language of web pages on the World Wide Web.
CSS stands for “Cascading Style Sheets” and is a style sheet language used to style and position HTML elements on a web page, enabling separation of presentation and content.
JavaScript (not to be confused with the programming language “Java”) is a client-side scripting language to create interactive features on web pages.
Further reading:
◾HTML intro
◾CSS intro
◾JavaScript intro
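To see how the three languages fit together, here is a minimal, self-contained page. All element names and text are invented for illustration.

```html
<!DOCTYPE html>
<html>
<head>
  <title>Three languages, one page</title>
  <!-- CSS styles the HTML elements -->
  <style>
    h1 { color: navy; }
  </style>
</head>
<body>
  <!-- HTML provides the structure and content -->
  <h1 id="greeting">Hello</h1>
  <button onclick="changeGreeting()">Click me</button>
  <!-- JavaScript adds client-side interactivity -->
  <script>
    function changeGreeting() {
      document.getElementById("greeting").textContent = "Hello, web developer!";
    }
  </script>
</body>
</html>
```

Save this as an .html file and open it in a browser: the heading is HTML, its color comes from CSS, and clicking the button runs JavaScript, with no server involved.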
AJAX & XML
AJAX stands for “Asynchronous JavaScript and XML.” Asynchronous means the client/browser and the server can work and communicate independently, allowing the user to continue interacting with the web page regardless of what’s happening on the server. JavaScript makes the asynchronous server requests, and when the server responds, JavaScript modifies the page content displayed to the user. Data sent asynchronously from the server to the client is packaged in an XML format so it can be easily processed by JavaScript. This reduces the traffic between the client and the server, which improves response time and speed.
XML stands for “Extensible Markup Language” and is similar to HTML using tags, elements, and attributes and was designed to both store and transport data, whereas HTML is used to display data. For the purposes of SEO, the most common usage of XML is in XML Sitemap files.
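The asynchronous pattern can be sketched as follows. The server request below is simulated with a Promise so the example runs anywhere; in a real page you would use XMLHttpRequest or fetch() against a server URL, and the XML and names here are invented for illustration.

```javascript
// Simulates a server that responds after a short delay with XML data.
function fakeServerRequest() {
  return new Promise((resolve) => {
    setTimeout(() => resolve("<user><name>Ada</name></user>"), 10);
  });
}

async function updatePage() {
  // While we await the response, the browser stays responsive: the user
  // can keep interacting with the page. That is the "asynchronous" part.
  const xml = await fakeServerRequest();
  // Extract the data from the XML response (a real page would use a
  // proper XML parser and then update the DOM with the result).
  const name = /<name>(.*?)<\/name>/.exec(xml)[1];
  return name;
}

updatePage().then((name) => console.log(name)); // logs "Ada"
```

Only the extracted data crosses the wire and only the affected part of the page changes, which is why AJAX reduces traffic compared with reloading the whole page.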
Structured data (AKA, Schema.org)
Structured data is markup you can add to the HTML of a page to help search engines better understand the content of the page, or at least certain elements of that page. By using the approved standard formats, you provide additional information that makes it easier for search engines to parse the pertinent data on the page.
Common uses of structured data are to markup certain aspects of recipes, literary works, products, places, events of various types, and much more.
Schema.org was launched on June 2, 2011, as a collaborative effort by Google, Bing, and Yahoo (soon after joined by Yandex) to create a common, agreed-upon, standardized set of schemas for structured data markup on web pages. Since then, the term “Schema.org” has become synonymous with “structured data”, and Schema.org structured data types continue to evolve, with new types added with relative frequency.
One of the main takeaways about structured data is that it helps disambiguate data for search engines so they can more easily understand information, and that certain marked-up elements may result in additional information being displayed in Search Engine Results Pages (SERPs), such as review stars, recipe cooking times, and so on. Note that adding structured data is not a guarantee of such SERP features.
Several structured data formats exist, but JSON-LD (JavaScript Object Notation for Linked Data) has emerged as Google’s preferred and recommended method of doing structured data markup per the Schema.org guidelines; other formats, such as microdata and RDFa, are also supported.
JSON-LD is easier to add to pages, easier to maintain and change, and less prone to errors than microdata, which must be wrapped around existing HTML elements; JSON-LD, by contrast, can be added as a single block in the HTML head section of a web page.
Here is the Schema.org FAQ page for further investigation – and to get started using microdata, RDFa and JSON-LD, check out our complete beginner’s guide to Schema.org markup.
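As a concrete illustration, here is what a JSON-LD block for a recipe page might look like. All values are invented; the property names (name, author, cookTime, recipeIngredient) come from the Schema.org Recipe type, and the block goes in the page’s head section as a single script element.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Simple Pancakes",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "cookTime": "PT15M",
  "recipeIngredient": ["flour", "milk", "eggs"]
}
</script>
```

Notice that nothing here is wrapped around visible HTML elements; the whole description lives in one self-contained block, which is what makes JSON-LD easier to maintain than microdata.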
Front-end vs. back-end, client-side vs. server-side
You may have talked to a developer who said, “I’m a front-end developer” and wondered what that meant. Of course, you may also have heard someone say, “oh, that’s a back-end functionality”. All of this can seem confusing, but it’s easily clarified.
“Front-end” and “client-side” both mean the same thing: it happens (executes) in the browser. For example, JavaScript was originally developed as something that executed on a web page in the browser, and that means without having to make a call to the server.
“Back-end” and “server-side” both mean the same thing: it happens (executes) on a server. For example, PHP is a server-side scripting language that executes on the server, not in the browser. Some Content Management Systems (CMS for short) like WordPress use PHP-based templates for web pages, and the content is called from the server to display in the browser.
Programming vs. scripting languages
Engineers and developers have differing explanations and definitions of these terms. Some will say there’s ultimately no difference, or that the lines are blurry, but the generally accepted distinction is that a programming language (like C or Pascal) requires an explicit compiling step, in which human-created, human-readable code is turned into a specific set of machine-language instructions understandable by a computer, whereas a scripting language (like JavaScript or PHP) is interpreted at runtime without a separate compile step.
Content Management System (CMS)
A CMS is a software application, or a set of related programs, used to create and manage websites (or, to use the fancier term, “digital content”). At its core, a CMS lets you create, edit, publish, and archive web pages, blog posts, and articles, and it will typically have various built-in features.
Using a CMS to create a website means there is no need to write code from scratch, which is one of the main reasons CMSs have broad appeal.
Another common aspect of CMSs is plugins, which integrate with the core CMS to extend functionality beyond the core feature list.
Common CMSs include WordPress, Drupal, Joomla, ExpressionEngine, Magento, WooCommerce, Shopify, Squarespace, and many, many others.
Read more here about Content Management Systems.
Content Delivery Network (CDN)
Sometimes called a “Content Distribution Network”, CDNs are large networks of servers which are geographically dispersed with the goal of serving web content from a server location closer to the client making the request in order to reduce latency (transfer delay).
CDNs cache copies of your web content across these servers, and then servers nearest to the website visitor serve the requested web content. CDNs are used to provide high availability along with high performance. More info here.
HTTPS, SSL, and TLS
Web data is passed between computers in packets. Clients (web browsers) serve as the user interface when we request a web page from a server. HTTP (Hypertext Transfer Protocol) is the communication method a browser uses to "talk to" a server and make requests. HTTPS (Hypertext Transfer Protocol Secure) is the secure version of this protocol.
Website owners can switch their websites to HTTPS to make the connection with users more secure and less prone to "man-in-the-middle" attacks, where a third party intercepts or possibly alters the communication.
SSL stands for "Secure Sockets Layer" and is a standard security protocol for establishing an encrypted connection between the server and the browser. TLS (Transport Layer Security) is its more recent successor; modern HTTPS connections actually negotiate TLS.
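Python's standard-library `ssl` module reflects this history: the module is still named after SSL, but a default client context actually negotiates TLS, requires certificate verification, and disables the deprecated SSL protocols. A small sketch:

```python
import ssl

# A default client context verifies the server's certificate and
# hostname, and refuses the obsolete SSL 2.0/3.0 protocols.
context = ssl.create_default_context()

print(context.verify_mode == ssl.CERT_REQUIRED)  # → True
print(context.check_hostname)                    # → True
print(bool(context.options & ssl.OP_NO_SSLv3))   # → True
```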
◾More info on HTTPS, SSL, & TLS
HTTP/1.1 & HTTP/2
When Tim Berners-Lee invented the HTTP protocol in 1989, the computers of the day had a fraction of the processing power and memory of today's machines. A client (browser) connecting to a server using HTTP/1.1 receives information through a sequence of network request-response transactions, often referred to as "round trips" to the server.
Each round trip takes time, and HTTPS is an HTTP connection with SSL/TLS layered in, which requires an additional handshake with the server before data can flow. All of this adds latency. What was fast enough then is not necessarily fast enough now.
HTTP/2 is the first new version of HTTP since 1.1. Simply put, HTTP/2 allows the server to deliver more resources to the client/browser faster than HTTP/1.1 by using multiplexing, header compression, request prioritization, and server push, which lets the server send resources the client has not yet requested.
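The payoff of multiplexing is easy to see with back-of-the-envelope arithmetic. The round-trip time and resource count below are purely illustrative assumptions, and the model deliberately ignores bandwidth and browsers' parallel connections:

```python
ROUND_TRIP_MS = 100   # hypothetical network round-trip time
RESOURCES = 10        # hypothetical number of files on the page

# HTTP/1.1 on a single connection: requests queue up one after another,
# so each resource costs roughly one round trip.
sequential_ms = RESOURCES * ROUND_TRIP_MS

# HTTP/2 multiplexing: all requests share one connection concurrently,
# so (idealized) the resources arrive in about one round trip.
multiplexed_ms = ROUND_TRIP_MS

print(sequential_ms)   # → 1000
print(multiplexed_ms)  # → 100
```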
Further reading:
◾HTTP/2 FAQ
◾What is HTTP/2 and how does it benefit SEO?
Application Programming Interface (API)
Application is a general term that, simply put, refers to software that performs specific tasks. Web browsers and database systems are both examples of applications.
An API is an interface with an application, typically a database. The API is like a messenger that takes requests, tells the system what you want, and returns the response back to you.
If you’re in a restaurant and want the kitchen to make you a certain dish, the waiter who takes your order is the messenger that communicates between you and the kitchen, which is analogous to using an API to request and retrieve information from a database. For more info, check out Wikipedia’s Application programming interface page.
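The waiter analogy maps naturally onto code. Here is a toy, self-contained sketch (the "kitchen" data and function names are invented for illustration; no real API is being called):

```python
# The "kitchen": an internal data store the client never touches directly.
_menu_db = {
    "pasta": {"price": 12, "available": True},
    "steak": {"price": 28, "available": False},
}

def order_api(dish_name):
    """The 'waiter': takes a request, asks the kitchen, returns a response."""
    dish = _menu_db.get(dish_name)
    if dish is None:
        return {"status": 404, "body": "not on the menu"}
    if not dish["available"]:
        return {"status": 409, "body": "sold out"}
    return {"status": 200, "body": dish}

print(order_api("pasta")["status"])  # → 200
print(order_api("steak")["status"])  # → 409
```

The caller never sees `_menu_db` directly; everything goes through the messenger, which is the whole point of an API.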
AMP, PWA, and SPA
If you want to build a website today, you have many choices.
You can build it from scratch using HTML for content delivery along with CSS for look and feel and JavaScript for interactive elements.
Or you could use a CMS (content management system) like WordPress, Magento, or Drupal.
Or you could build it with AMP, PWA, or SPA.
AMP stands for Accelerated Mobile Pages, an open-source Google initiative consisting of a specified set of HTML tags and various functionality components that are ever-evolving. The upside to AMP is lightning-fast page loading when pages are coded to the AMP specification; the downsides are that some desired features may not yet be supported and that proper analytics tracking can be an issue.
Further reading:
◾What will Google’s Accelerated Mobile Pages mean for marketers?
◾Accelerated Mobile Pages (AMP) one year on: stats and infographic
◾Accelerated Mobile Pages vs Facebook Instant Articles: Is Google winning the mobile war?
PWA stands for Progressive Web App, and it blends the best of both worlds between traditional websites and mobile phone apps. PWAs deliver a native app-like experience to users, such as push notifications, the ability to work offline, and a home screen icon on your phone.
By using "service workers" to communicate between the client and server, PWAs combine fast-loading web pages with the ability to act like a native mobile phone app. However, because PWAs are typically built on JavaScript frameworks, you may encounter a number of technical challenges.
Further reading:
◾Progressive Web Apps versus Android Instant Apps: Which is better for marketers?
◾Google I/O: What’s going on with Progressive Web Apps?
SPAs – Single Page Applications – differ from traditional websites, which load each page a user requests via repeated communication with the server. SPAs, by contrast, run inside the browser, and new pages viewed in a user session don't require full page reloads via server requests.
The primary advantages of SPAs include streamlined and simplified development, and a very fast user experience. The primary disadvantages include potential problems with SEO, due to search engines’ inconsistent ability to parse content served by JavaScript. Debugging issues can also be more difficult and take up more developer time.
It’s worth noting that future success of each of these web technologies ultimately depends on developer adoption.
Conclusion
Obviously, it would require a very long book to cover every piece of web technology in sufficient detail, but this guide should provide you, the professional web developer, with helpful information to fill in some of the blanks in your understanding of key aspects of web technology.
I've provided many links in this article that serve as jumping-off points for any topics you would like to explore further. There's no doubt there are many more topics web developers need to be conversant with, such as robots.txt files, meta robots tags, rel canonical tags, XML sitemaps, server response codes, and much more.
In closing, here’s a nice article on the Stanford website titled “How Does The Internet Work?” that you might find interesting reading; you can find that here.
ABCO Technology teaches a comprehensive program for web development. Call our campus between 9 AM and 6 PM Monday through Friday at: (310) 216-3067
Email your questions to: info@abcotechnology.edu
Financial aid is available to all students who can qualify for funding.
ABCO Technology is located at: 11222 South La Cienega Blvd. STE #588, Los Angeles, CA 90304
Amazon, Berkshire Hathaway, and JPMorgan Chase on Tuesday announced a partnership to cut health-care costs and improve services for their U.S. employees. The announcement slammed the shares of multiple companies in the health-care sector.
The giant companies, which together employ more than 1.1 million workers, will launch an independent operation that’s intended to be free from profit-making incentives.
Investing In Health Care Innovation
The new company’s goal at first will be to target technology solutions to simplify the health-care system.
Details of the new company were sketchy, with principals at Amazon, Berkshire and J.P. Morgan noting that the way it will work remains to be seen. They're hoping that their sheer size will help bring the necessary scale and resources to tackle the issue.
“The ballooning costs of healthcare act as a hungry tapeworm on the American economy,” Berkshire CEO Warren Buffett said in a statement. “Our group does not come to this problem with answers. But we also do not accept it as inevitable. Rather, we share the belief that putting our collective resources behind the country’s best talent can, in time, check the rise in health costs while concurrently enhancing patient satisfaction and outcomes.”
Shares of Amazon and J.P. Morgan were off slightly in morning trade, while Berkshire edged higher.
However, shares of health-care companies fell sharply. Express Scripts sank 10 percent; Cigna was down 5 percent as was CVS and UnitedHealth, and Aetna slid about 3 percent.
Three top executives, one from each company, will take the lead on the project: Berkshire investment officer Todd Combs, J.P. Morgan’s Marvelle Sullivan Berchtold and Beth Galetti, a senior vice president at Amazon.
Combs was a hedge fund manager before joining Berkshire in 2010. Berchtold was previously global head of mergers and acquisitions at drug maker Novartis before joining J.P. Morgan last year, and Galetti was FedEx’s vice president for planning, engineering and operations before joining Amazon in 2013, according to their LinkedIn profiles.
“The healthcare system is complex, and we enter into this challenge open-eyed about the degree of difficulty,” said Amazon CEO Jeff Bezos. “Hard as it might be, reducing healthcare’s burden on the economy while improving outcomes for employees and their families would be worth the effort.”
“Our people want transparency, knowledge and control when it comes to managing their healthcare,” said J.P. Morgan CEO Jamie Dimon. “The three of our companies have extraordinary resources, and our goal is to create solutions that benefit our U.S. employees, their families and, potentially, all Americans.”
J.P. Morgan currently uses Cigna and UnitedHealth Group to administer health benefits on a self-insured basis and Amazon uses nonprofit Premera Blue Cross, according to Evercore analysts. Amazon uses ExpressScripts as its pharmacy benefits manager, said Leerink Partners’ Ana Gupte.
The move also speaks to a desire to break the traditional health-care system out of its distinct silos. Experts have anticipated more deals and vertical integration in the wake of CVS announcing its intention to buy Aetna.
“I think it is good news,” Allergan CEO Brent Saunders told Fox News. “The health-care delivery system is antiquated and in dire need of positive disruption. My hope is these three companies light the spark!”
Adam Fein, president of Pembroke Consulting, said it’s “long past time” for employers like these three to force innovation into the health-care system.
“For better or worse, there are warped incentives baked into every aspect of the U.S. health-care system, from medical innovation to care delivery to insurance and benefit management,” Fein told Fox News. “Rather than merely bashing the current system, I hope this new organization can help patients and their physicians make more informed and more cost-effective decisions. Technology will be necessary but not sufficient to make positive changes.”
Analysts echoed the sentiment that the health-care system is outdated and ripe for disruption, paving the way for the new endeavor. However, they cautioned it could take time.
“If this winds up being the low cost provider to make insurance more affordable at the employer level, it could wind up being a real disruptive competitor to an industry that has not seen any new players in years/decades,” Jefferies analyst Jared Holz told Fox News. “[I’m] not going to call this a black swan event yet because there are few details and would be making too many assumptions but it has potential to be.”
Amazon in particular can play a strong role if it promotes a greater presence for technological advances including artificial intelligence and information sharing platforms into health care, said Idris Adjerid, management information technology professor at the University of Notre Dame’s Mendoza College of Business.
"We find that technology initiatives which facilitated information sharing between disconnected hospitals resulted in significant reductions in healthcare spending," Adjerid said in a statement. "That said, it is unclear what the scope of this effort will be. If this partnership is to meaningfully improve healthcare delivery, it needs to include more than the employees of these companies."
ABCO Technology teaches comprehensive programs in database administration, networking and computer programming. All of these areas of information technology will play a vital part in reducing the cost of health care in America. Call our campus between 9 AM and 6 PM Monday through Friday at: (310) 216-3067.
Information technology will reduce health care costs. If you want to make a difference, join this field today!
As marketers in the ever-changing world of digital, success depends on knowing what consumers want and expect from us. After all, it’s the only way we can deliver.
So, it’s interesting to see that a recent data release from Google tells us that personalized search is becoming more and more prominent among internet users.
No longer are they turning to friends and family for personal advice and recommendations, but search engines too.
Of course, we already knew that… that’s why we work so hard at getting to know our audience and understanding their micro-moments and pain points, delivering the right content at the right time, in the right way.
But what Google is telling us is that rather than searching, “How often should you wash your hair?”, we are now searching “How often should I wash my hair?”. Changing those two little words is making the way that we use search engines far more personal than ever before.
And the data suggests that consumers now truly trust that their most specific needs can be answered by content on the web. In fact, Google reports that over the last two years mobile searches using "…for me" have grown by a huge 60%.
On top of this, they have also seen an 80% increase in mobile searches including “…should I?”. As a result, we really are treating search as one of our best, most trusted friends.
And that’s great news for content marketers.
For those of us working in motor, beauty, finance, fitness and pet care, it seems that this new insight is especially relevant – these are the industries in which users are most frequently turning to Google to solve their personal pain points.
How can we prepare and optimize our content for these types of search?
Tools
Creating calculators and tools is a brilliant way of targeting personal search terms and providing our users with the personalized response they are looking for. Let’s use a fitness example to demonstrate this:
This recent data release from Google suggests that users are starting to search for something like "how much water should I drink each day?" in higher volumes than something like "how much water should you drink per day?".
Now, most of us know that the answer to this question will depend on a number of different factors including gender, body composition, activity level and so on.
What our audience is expecting from this search is a personalized answer that takes all of these things into consideration and tells them exactly how much water they should personally be drinking each day.
A water consumption calculator would do this well, and if the user wants the specificity of an individual result, they will be willing to fill in the necessary personal details to retrieve it. A blog post that simply states the average recommended fluid intake for a man or a woman as recommended by the NHS is no longer user focused enough.
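A sketch of what such a calculator might compute. The 35 ml-per-kilogram figure and the exercise adjustment below are rough, commonly cited rules of thumb chosen purely for illustration, not medical guidance:

```python
def daily_water_ml(weight_kg, activity_minutes=0):
    """Rough personalized estimate: ~35 ml per kg of body weight, plus
    extra for exercise. Illustrative figures only, not medical advice."""
    base = weight_kg * 35
    exercise_bonus = (activity_minutes / 30) * 350  # ~350 ml per 30 min
    return round(base + exercise_bonus)

# A 70 kg person who exercises 30 minutes a day:
print(daily_water_ml(70, 30))  # → 2800
```

The point for content marketers is that the answer changes with the inputs, which is exactly what a static blog post cannot deliver.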
Case studies and testimonials
Providing personalized content will not always be easy, and at times users may need encouragement to spend a little longer on a page to find the personalized answer they are looking for. In this instance, case studies and testimonials are a great way to push users further through their journey in the right direction.
For example, “How much money do I need to retire?” is a more complex question than our fitness example. There are so many variants that could alter the accurate and personalized response to this question, so it’s difficult to answer it quickly in a personalized way.
However, if we provide users with a testimonial or case study at the right stage in their journey – one that was created after a lot of persona research and uses someone or a situation that will resonate with them – they are likely to engage with the content.
Creating engagement via a case study will increase the likelihood that they’ll enquire with your brand for a more personalized answer, continuing their journey on their way to the personalized answer they are looking for.
Hygiene content
Informational content (something we refer to here in ABCO Technology’s search engine class as ‘hygiene content’) is absolutely essential in light of this evolution of search.
It’s critical that all the informational content and resources on your website are up to date, and as specific to the different types of users you’re expecting to visit your site as possible. Not only this, but ensuring that on-page content is optimized for long tail search (tying back to your personas) is a must.
Moreover, having a clear call to action that points the user in the direction of personalized answers to their questions is also important. It isn’t always possible to answer their query in an individualized way using written content, but pointing the user towards a ‘contact us here’ call to action could make all the difference in their user journey, and ultimately, whether they end up with you or your competitor.
Thought leadership and expert content
Finally, with consumers turning to search like a trusted friend or family member more than ever before, you need to ensure that the content you’re putting out there is seen as being the most reliable. Therefore, it’s never been more important to be viewed as a thought leader within your field.
Expert content will naturally help to strengthen the consumer-brand relationship. It also means that when you are appearing in SERPs, your expert reputation will stand you in good stead when it comes to users choosing which ‘friend’ they want to seek advice from.
We can’t wait to see how the evolution of search changes the way that Google is rewarding and penalizing brands’ content. The above is just a start, but we are certain we will be kept on our toes as time goes on!
ABCO Technology teaches a comprehensive program for web development, which includes search engine optimization and social media strategies. Call our campus between 9 AM and 6 PM Monday through Friday at: (310) 216-3067.
Congratulations, you've recently graduated from high school. That's a great achievement. You completed twelve years of education, and now the time has come to search for a good-paying full-time job. So you begin your search.
Are you finding that job search to be more difficult than you believed?
Would you like a better job than working at that fast food restaurant?
Have you been told by many employers that they would like to hire you if you only had those important job skills?
Have employers emailed you that you don’t have the right experience?
Would you like a profitable solution to this problem that will not take a lot of time and cost a lot of money?
Would you want that solution to have career advancement and give you excellent raises?
An outstanding solution to your problem is to enroll in a career or vocational school that teaches information technology. According to the United States Department of Labor, information technology is one of the leading fields for hiring today, because the education is performance based, which is what employers are looking for in their new hires. Information technology training is built on hands-on practice, performance, and certification, and it does not require a college degree.
Many students who get a job in this field end up working for a company that will pay all or part of their college tuition after they have worked there for more than a year. If you have heard or read that attending a vocational school will stop you from ever attending college in the future, that statement is truly a myth. Many students use vocational training as a steppingstone to finance a future debt-free college degree. Many employers, especially colleges and universities, offer tuition-free classes to employees who have been employed for a certain length of time; this is one way to graduate from college with no student loans. A great example of a university offering free tuition to its employees is Loyola Marymount University, located about three miles from our ABCO Technology campus. UCLA and all community colleges offer education to their employees at a substantial discount or totally free!
What certifications will get that great job?
If you are a person who enjoys repairing computers and solving their problems, the CompTIA A+ certification is just for you. The A+ is completed in six weeks. After completing this training you can look for a better-paying job as a computer repair specialist or as a desktop support technician.
After repairing computers, you can advance to higher paying fields of networking, which include the MCSE or Microsoft Certified Solutions Expert and the Cisco Certified Network Associate. With a little experience and a few certifications your job title will be network administrator.
High school graduates can also train and certify in other fields of information technology including Web Development, database administration, and computer programming.
Students certifying in web development build websites, which all businesses need to advertise their services. The database field has the job title of database administrator, which involves handling large amounts of corporate information. Computer programmers write games, design smart phone applications and write programs for Windows and other operating systems.
Some of the certifications listed in this article will take six months to complete.
Get Hired and Get To Work
Vocational training is in high demand among employers because the training is performance based. Employers accept demonstrated performance in place of years of experience when your job application lists your performance-based skills. Employers in 2018 want to see at a glance what you can do for them. Businesses are spending less money on training, and a certification in information technology saves companies significant training time and money.
ABCO Technology is an ACCSC accredited institution. When an institution is accredited students may apply for financial aid and receive help with their education if they qualify.
Students enrolling at ABCO Technology receive a diploma instead of a certificate. The diploma is highly valued when placed next to that important certification.
If you would like to receive more information about how a vocational education will jump-start your job career, contact ABCO Technology.
You can reach our campus by phone at (310) 216-3067, Monday through Friday from 9 AM to 6 PM.
Get those important information technology job skills today!
2018 might be the Chinese Year of the Dog, but it's going to be Google's Year of the Machine. In this year's article I review my predictions for 2017 and lay out my predicted Google focus for 2018. Want to see how I did for 2017? Read on!
Google Assistant
Last year I wrote a piece focused on mobile and correctly predicted Google would continue to move away from big algorithm updates, and have a continued focus on the mobile index. So let’s start off with a quick recap of what I said last year.
The main thrust of the article was about mobile and how Google was going to be focusing on this to increase revenue from Adwords…
Google has access to brand new markets and a shortcut into markets they were previously struggling with. Desktop would never provide access to these as there are way too many barriers to ownership rates rising so dramatically (such as cost and internet infrastructure).
Mobile has solved a key problem for Google. 99% of revenue for Alphabet comes from Google and 77% of that comes from AdWords. That mobile traffic is key to this figure and they are going to be doing their best to keep pushing it.
And this is from an FT article on the third quarter earnings for Alphabet last year:
Strong growth in mobile search, programmatic advertising and smartphone use in Asia helped accelerate revenue at Alphabet to the fastest rate in almost five years, surpassing estimates for sales and earnings in the third quarter.
I did focus more on Africa than Asia, which was misplaced with the amount of investment Google has placed in the Asian market and the potential gains there. However, Africa is still an incredibly important market for Google and one they invested in massively during 2017, pledging to train 10 million people in Africa in online skills.
Additionally, Google has launched and continued to progress the mobile index, with mobile SEO further splitting from desktop.
One of the other noteworthy predictions was that we weren't going to see any major updates on the desktop index anymore, with updates instead being small, frequent corrections left unnamed by Google.
In 2017 we saw lots of small updates detected by the community and whilst there were a few new penalties put in place, for example targeting pages that used interstitials too heavily, these were not big algorithm shifts. The days of penguins, pandas and hummingbirds are over. It makes the lines even more blurred as it’s harder to point to things and say ‘after update X we know Y is now the case’. As a result of this, I would expect decreasing amounts of consensus across the SEO world as this continues. See our Google algorithm updates in 2017 for a recap of the year’s updates.
Finally, for the recap, I also talked about ‘Peak Mobile’ in some markets with Google’s focus shifting to changing user behavior for those who already owned mobile devices…
In the UK and US smartphones have reached saturation or are at least very near that point. With over 70% penetration in pretty much all the key markets (Europe, US, China), growth in mobile is going to be relying more on changing user behavior of existing device owners. Therefore we can expect more focus from the search engines on user behavior.
Google went on to broker a deal with Apple, which switched from Bing to Google to power Siri results on iOS in September 2017.
I’ll come back to this in the next part of the article where I lay out my predictions for 2018 as I think it’s strongly relevant.
The 2017 article is worth a read and covers a few other points as well. All the information I put across was an extension of the activity that was already taking place, so it was all a pretty safe bet. This year however I am going to go decidedly off piste…
The new device
There is a battle of the machines going on between Google and Amazon, with both vying to get themselves into your home. Google has 'Home' and Amazon has 'Alexa'. Both devices are physically little more than glorified Bluetooth speakers; however, each speaker is linked to its maker's AI offering.
Think about the last time there was an entirely new device for you to be served content from. We had desktop, then laptop and then mobile. An honorable mention goes out to tablet as well but it’s pretty much lumped in with mobile. But that’s it. Since 1998: 3. Well that’s now 4 with Google fighting a pitched battle to get into your home in yet another format. It’s Google VS Amazon or in other words Home VS Alexa.
Google is putting increasing resources into this battle, having realized that Amazon was leapfrogging ahead with Alexa. You might think this device is insignificant for Google, as it has really limited potential as an ad-serving platform. Which is correct. But it has massive potential as a data collection platform, and crucially, every search or query made through Alexa is not visible to Google, so Google is losing out on that data.
Google Home is going to become increasingly integrated with other devices and they are going to keep driving the device cost down offering different model types. I also think that Google Assistant will be released, for free, for any manufacturer to use in their 3rd party device. There were a couple of examples of this popping up with select manufacturers right at the end of last year. Google doesn’t care about the hardware, it’s about getting Google Assistant into as many homes as possible. I can see it becoming a standard integration into bluetooth speakers, especially as Google have a track record for developing and releasing products free for any manufacturer to use, Android being a great example. That’s how they came to dominate mobile search and it’s how they’ll dominate voice search as well.
Google makes a massive $0 from Android the software, but they make billions in revenue from the searches conducted on Android devices. They also, importantly, don’t have to pay handset manufacturers for Google to be the preset search on those devices.
Looking at the recent increase in the range of Alexa products, it does look like Amazon is throwing the kitchen sink at it hardware-wise, and this could be a pre-emptive strike against Google releasing Assistant for any third party to use for free. Interestingly, within the EU the right to data portability will mean that consumers will be allowed to port their personal data between devices. This might sound like a win for the consumer, but it could have a dark side: it is forcing companies to put aside their differences and develop a universal format for our personal data. Theoretically this means the data will be much more useful to third parties and easier to compile into even bigger data pools. Also, if you swap back and forth between devices, you're sharing that data in more places, adding more information to their networks.
The machines are coming from your home
I recently wrote an article about Natural Language Processing (material we now use within our classes) that touched on some of the current limitations of these offerings. They are a long way from perfect, and if you know what you are looking for – specifically the things these types of AI are typically bad at – it's very easy to trip them up with simple questions.
One of those question types is comparisons. You can ask 'how far away is the moon' or 'how far away is the sun' and get an answer without a problem. However, ask 'which is farther away, the moon or the sun' and, although the information is there, the question is too complex for Alexa to process. Responding requires a leap in understanding the question, processing the information, and forming the expected result, and this is simply out of reach for current systems.
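The gap this example exposes is that a comparison needs two fact lookups plus a reasoning step on top. A toy sketch of that extra step (the distance figures are rounded approximations, and this is obviously nothing like how a real assistant works internally):

```python
# Facts the assistant can already answer one at a time (km, approximate).
distances_km = {
    "the moon": 384_400,
    "the sun": 149_600_000,
}

def how_far(body):
    """Single-fact lookup: the kind of question assistants handle well."""
    return distances_km[body]

def which_is_farther(a, b):
    """The extra reasoning step: retrieve two facts, then compare them."""
    return a if how_far(a) > how_far(b) else b

print(which_is_farther("the moon", "the sun"))  # → the sun
```

In code the comparison is trivial; the hard part for an assistant is recognizing that the question calls for this composition of steps at all.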
It’s this ability to process the information that’s already present for a more useful result which I think Google is going to try its best to leverage. Last year I touched on ‘micromoments’ which are the moments when a consumer pulls out their phone and checks information mid-decision. That point where you are walking down the street and want to know where the nearest restaurant is. These moments are incredibly valuable as they are hugely actionable – you are ready to make a decision right there and then and your next action will likely be to commit or purchase.
“Mobile has forever changed the way we live, and it’s forever changed what we expect of brands. It’s fractured the consumer journey into hundreds of real-time, intent-driven micro-moments. Each one is a critical opportunity for brands to shape our decisions and preferences.”
This is just one touchpoint where Google hopes to be able to utilize AI to better deliver on. This ‘in the moment’ advertising is the context. However, there are also optimizations of the ad served to make sure that it’s the right ad as well as the right time. For example, based on past behavior they might look at what colour ads, placement or type you have responded to best previously. They then roll all that information into one to serve you an ad at that point that is contextual (meaning served in the correct context such as a restaurant ad as you are walking down the street looking at restaurants) and personalised (as in tailored to you specifically based on past ads you have responded well to).
Use of AI allows for taking huge amounts of data from multiple different behaviors, touchpoints and importantly, patterns and roll this information into an ad.
“A big part of the opportunity for marketers is how AI will help us fully realize personalization—and relevance—at scale. With platforms like Search and YouTube reaching billions of people every day, digital ad platforms finally can achieve communication at scale. This scale, combined with customization possible through AI, means we’ll soon be able to tailor campaigns to consumer intent in the moment. It will be like having a million planners in your pocket.
“We’re getting closer to a point where campaigns and customer interactions can be made more relevant end-to-end—from planning to creative messaging to media targeting to the retail experience. We will be able to take into account all the signals we have at the customer level, so we can consider not only things like a consumer’s color and tone preferences, but also purchase history and contextual relevance. And all of this will be optimized on the fly in real time.”
This is a mission statement from the Google VP of marketing, Marvin Chow, from September 2017, and I strongly believe it’s where they are going to be focusing a huge amount of effort and resources this year. AdWords is where Alphabet gets almost 80% of its revenue, after all.
Taking AI a step further
Something not mentioned in the 2017 article – and where I am probably heading pretty deep off-piste – is predictive behaviour: using AI to process and understand data from many users, and building models from it. In other words, it’s not just looking at your past behavior but the past behavior of people like you.
For instance, there are correlations between people and the products and services they like, so Google will increasingly be modeling you and trying to ascertain what you may be interested in based on the type of person it thinks you are. On a simplistic level this is showing you ads for pet insurance after you googled ‘dog food’: people who purchase dog food also purchase pet insurance, so there is a clear correlation. However, things get a bit more weird when you take into account the multiple touchpoints. For instance, you visit the vet’s and start getting ads for a specific pet insurance policy.
You’re already targeted based on past behavior, and that can then be combined with the ads that people who meet other key criteria for you, such as your age, gender and income, have responded well to. Now you’re being pushed a single highly targeted product: advertising that is both contextual and personalized.
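The ‘dog food, therefore pet insurance’ logic can be sketched as a simple co-occurrence model – a deliberately naive, invented stand-in for the large-scale modeling described above, with made-up purchase data:

```python
from collections import Counter

# Naive co-occurrence model: from past purchase baskets, find what else
# people who bought a given item tended to buy. All data is invented.
baskets = [
    {"dog food", "pet insurance", "dog lead"},
    {"dog food", "pet insurance"},
    {"dog food", "pet insurance"},
    {"dog food", "dog lead"},
    {"cat food", "pet insurance"},
]

def also_bought(item, baskets):
    """Rank other items by how often they co-occur with `item`."""
    counts = Counter()
    for basket in baskets:
        if item in basket:
            counts.update(basket - {item})
    return [product for product, _ in counts.most_common()]

print(also_bought("dog food", baskets)[0])  # → pet insurance
```

Real ad systems layer demographics, context and many more signals on top, but the underlying move is the same: infer your next interest from the behavior of people who look like you.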
By the way, if you don’t want Google to know where you are all the time, you need to go into your Timeline within Google Maps and turn off the tracking services. This applies to both Android and iOS devices. This is the tracking on my device:
However I still continue to get notifications when I arrive or leave some places (particularly my local supermarket) asking me to ‘rate my experience’. This is without using Google Maps (I know the way to the supermarket!) so Google appears to still be tracking me through my device despite having it clearly set up not to. Which is comforting.
Following on from this, understanding people and patterns through AI will be Google’s biggest driver in 2018; it follows on perfectly from the mobile adoption focus. They need mobile adoption for tracking and will be trying to use this to their advantage.
So what do you think Google’s focus in 2018 will be?
If you think I’m right, wrong or anywhere in between please do have your say in the comments….
Do you think Google is tracking us too much?
Will AI be taking over advertising?
Does it even matter?
ABCO Technology teaches a comprehensive program for e-commerce and search engine optimization. Call our campus between 9 AM and 6 PM at (310) 216-3067.
Email your questions to info@abcotechnology.edu
Financial aid is available to all students who can qualify for funding.
ABCO Technology is located at: 11222 South La Cienega Blvd. STE #588 Los Angeles, Ca. 90304
The field of network administration is experiencing strong job growth. The question many students ask us is how to get started in the field.
The answer, which has helped hundreds of new candidates find fulfilling jobs, is the CompTIA certification path.
Benefits of CompTIA
CompTIA certifications are vendor neutral. A holder of a CompTIA certification is qualified to work on any computer. The US government, state and local governments, private industry and non-profit organizations all recognize CompTIA as an excellent gateway to employment.
CompTIA certifications
The first CompTIA certification we will discuss is the CompTIA A+. This certification is granted when a candidate passes two exams: one for hardware and the other for installation, configuration and security of operating systems. The A+ certification applies to individual computers or workstations. Holders of the A+ also install other network devices, including printers and fax machines. A+ holders apply for the job titles of computer service technician, help desk specialist or help desk technician. All of these job titles are listed on the main job sites.
The next CompTIA certification is the CompTIA Network+. The Network+ states that the holder is qualified to install, configure, back up and secure corporate networks and their servers. The Network+ is a strong step toward the job title of junior network administrator.
The third CompTIA certification we will discuss is the CompTIA Security+. This certification covers cyber security: holders of the Security+ know how to secure a network. It is now a major requirement for employment in any company that has data to protect.
Our final CompTIA certification is the CompTIA Linux+. Linux is an open-source operating system. Linux has many flavors, or versions, which makes it difficult for cyber criminals to attack. In addition, those flavors can be modified, which makes the job of cyber crime even more difficult. The CompTIA Linux+ is required by many companies that use Linux as a back-end operating system.
ABCO Technology teaches each certification listed in this brief article.
The use of snippets is important for many websites.
Google uses featured snippets to make it easier to connect us to the information we want, but in doing so could they be endangering the basic model the entire web relies on? We get free information and in return, we used to get served a couple of adverts on the site we look at. But without being able to serve those ads, there’s less incentive to create that content.
Featured snippets explained
Featured snippets are intended to make it easier for you to access the information available on a web page by bringing it directly into the search results.
Sometimes when you do a search, you’ll find that there’s a descriptive box at the top of Google’s results. We call this a “featured snippet.”
Here’s an example from the Google blog post where they ‘reintroduce’ them:
So in short it’s taking the text from a page and then featuring it prominently in the search results.
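At heart, a featured snippet is an extraction problem: pick the passage from a page that best answers the query. A toy version of the idea – nothing like Google’s actual ranking, just a bare word-overlap heuristic on invented page text – might look like this:

```python
import re

def pick_snippet(query, passages):
    """Return the passage sharing the most words with the query (toy heuristic)."""
    query_words = set(re.findall(r"\w+", query.lower()))
    def overlap(passage):
        return len(query_words & set(re.findall(r"\w+", passage.lower())))
    return max(passages, key=overlap)

page = [
    "Our opening hours are 9am to 6pm, Monday to Friday.",
    "A garden snail's top speed is about 1 metre per hour.",
    "Contact us by email for press enquiries.",
]
print(pick_snippet("how fast is a snail", page))  # → the snail-speed passage
```

The gap between this and the real thing is exactly where the problems described below come from: a crude match can surface a passage that looks relevant without understanding the intent behind the query.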
Google’s shift from connection engine to information engine
Google has always been a connection engine. However, there appears to be a continuing change in the way in which Google sees itself. The model has always been:
•I enter a search term and Google provides a list of links to content that best answers that search
•I click on a paid or free result
•Google gains money from paid results and advertising on publishers’ sites
•Publishers get paid by the advertising on their sites
Google is increasingly moving towards just showing me the information, lifted directly from the content it indexes. The shift is subtle but it is destroying that model. So now the relationship looks like this:
•I enter a search term and Google provides me the information that best serves the search
•I read the information on Google
Not only is this chain a lot shorter, it also cuts out the publishers, and with them one of Google’s own methods of monetization. The key, though, is that Google only shows snippets for certain types of results. Results for searches with a clear purchase intent would be naturally less likely to show a snippet but more likely to have PPC ads. Whilst some results do also feature PPC results, in every search I did these were shown above the snippet, with the organic content below.
The potential effects of snippets on websites
When your business relies on traffic from providing specific or niche information then snippets can be devastating. Take the case of CelebrityNetWorth.com, as detailed in The Outline. If you want to know what someone famous is worth, you look it up on their site and they give you a number and a breakdown of how they reached it. The most important thing is the number; that’s the key information people are looking for.
Back in 2014 Google emailed the owner of the site, Brian Warner, and asked for permission to use the data from the site in the Knowledge Graph. Brian was not keen…
“I didn’t understand the benefit to us,” he said. “It’s a big ask. Like, ‘hey, let us tap into the most valuable thing that you have, that has taken years to create and we’ve spent literally millions of dollars, and just give it to us for free so we can display it.’ At the end of it, we just said ‘look, we’re not comfortable with this.’”
However, when snippets were introduced, Google just went ahead and took the information anyway – the information that Brian had said he didn’t want them to use.
The result was a loss of 65% of traffic year on year and having to lay off staff as the profitability of the site took a nose dive. That’s the very real impact of Google’s change from connecting you to the information to delivering that information right there on the page. The sites that provide that information, the ones that have actually put the time and effort into creating the content, are the ones that lose out.
Snippets likely won’t affect all websites as badly as this one example. But other studies consistently show that featured snippets reduce clicks on other results, in effect cannibalizing traffic. Take this study from Ahrefs:
Why did snippets need reintroducing?
Snippets just weren’t that bright, and there were several high profile examples of them failing. Some snippets appeared to have been removed, especially on more controversial topics.
The problem came about through a combination of not understanding the user intent and not being picky about where information was pulled from. Google’s failure to properly understand intent is something they have got in trouble with before, like with the ‘Unprofessional Hair’ problem.
As Google shifts from connecting to content, to connecting to information directly, intent becomes even more important. Of course without the context of the rest of the content we’re even less able to judge the validity of the information shown. Especially when these snippets also serve to provide information for Google Home Assistant. So there is little context available, beyond the name of the site, to evaluate the information against. It’s simply a case of being told an answer to a question as if it’s ‘The Answer’ rather than ‘an answer’.
This also leads to problems such as the case highlighted by Danny Sullivan in his own announcement post for the new feature:
Source: Google blog
Here we have two queries where the intent is the same: the suitability of reptiles as pets. However, in a glass half full / half empty kind of way, different people phrase this question differently depending on their initial bias, and Google has then served each with a snippet that reinforces that bias – in effect, two different answers to the same question depending on the searcher’s expectation of the result. For my results at least, Google appears to have put in a speedy fix for this by stopping the snippet showing on one set of results. Replacing reptiles with goats replicated the effect though, so it doesn’t look to be a fix for the wider issue.
This might not appear to be a big problem when it comes to reptiles or goats but things could potentially get out of hand quickly as they roll this out across more queries and cover more topics (for example politics). Searching around at the moment it looks like political or controversial topics are more restricted, especially in terms of the search content.
It’s not just snippets either
It’s not just content publishers that need to watch out. Google appears to be developing their own tools for popular queries and placing these directly in the search results. This is the result I get for a search on ‘internet speed test’:
I guess for the rest of the sites offering a speed checker it’s just tough. This is different from snippets as it’s not using anyone else’s information. But in this example at least Google appears to be creating a tool and then placing it at the top of the search results above competing tools. I personally feel that sets a bit of a dangerous precedent as this could potentially spread with Google creating more tools, in partnership with more companies, so harming the competition. Competition and diversity are good, but people will be less likely to innovate and create new tools if Google is going to just step in when something gets popular and publish their own tool above everyone else’s in the results.
Google has got into trouble before for placing their own services above competitors. In July of 2017 it received a record-breaking $2.7 billion fine from the EU for antitrust violations with their shopping comparison service:
Google has systematically given prominent placement to its own comparison shopping service: when a consumer enters a query into the Google search engine in relation to which Google’s comparison shopping service wants to show results, these are displayed at or near the top of the search results.
Google has demoted rival comparison shopping services in its search results: rival comparison shopping services appear in Google’s search results on the basis of Google’s generic search algorithms. Google has included a number of criteria in these algorithms, as a result of which rival comparison shopping services are demoted. Evidence shows that even the most highly ranked rival service appears on average only on page four of Google’s search results, and others appear even further down. Google’s own comparison shopping service is not subject to Google’s generic search algorithms, including such demotions.
So Google put their own service higher up in the results than competing services and didn’t make their own service subject to the same ranking algorithms as their competitors.
Source: TechCrunch
What will this mean for content?
The trouble with snippets is that the places this might hit hardest are those which invest more in the creation of their content. Or in other words, the content which has higher editorial standards. If you’re a journalist, someone needs to be paying you to write the content, an editor needs to be paid to sub the content, designers and photographers paid for graphics and images.
So the content which stands to lose the most is arguably the most important, whilst the lower quality, recycled, poorly researched and quickly written content, which needs to generate less revenue as it costs so much less to churn out, remains profitable.
This creates a vicious cycle: as low-quality content proliferates, it captures a greater share of the audience; higher-quality content gets more drowned out, earns less revenue and diminishes even further.
Mobile users want information delivered more quickly and concisely. We have shorter user journeys on mobile with less time on site and a higher bounce rate. Capturing these visitors with properly optimized content is important as mobile is a key part of Google’s revenue as it continues to dominate the mobile search market.
It does also mean, however, that the user is less likely to visit a site funded by ads – ironically, ads likely served through Google’s own platform. However, Google might be less concerned about this depending on how much importance they are placing on their Home Assistant product. The snippets are used by the AI to provide answers to your questions. Ads don’t factor into this, and they could perhaps have calculated that they stand to gain more from better information here than they lose from fewer ads served on those sites.
ABCO Technology teaches a comprehensive course for search engine optimization. Call our campus between 9 AM and 6 PM Monday through Friday at (310) 216-3067.