Crafting the perfect piece of content: ten top tested tips

Do you know about the perfect piece of content?

You are now living in the midst of a content revolution as the great minds of user experience (UX) and search engine optimization (SEO) finally collaborate to produce beautiful on-page content designed to rank in search results AND engage or educate the user.

Gone are the days of plugging keyword phrases into your blog posts to get the density just right and building landing page after landing page targeted at keyword variations like “automobiles for sale”, “cars for sale” and “trucks for sale”.

Since the introduction of RankBrain, the machine-learning component of Google’s Core Algorithm, in late 2015, Google has moved further away from a simple question-and-answer engine and has become a truly intelligent source of information matching the user’s intent — not just the user’s query.

Crafting compelling content is tough, especially in such a competitive landscape. How can you avoid vomiting up a 1,500-word blog post that will meet the deadline but fall very short of the user’s expectations? If you include these 10 essential on-page elements, your brand will be on the right track to provide a rich content experience designed to resonate with your audience for months to come.

The basics:

Title Tag
Always seen in the <head> block or the beginning of a web page’s source code, the title tag is text wrapped in the <title> HTML tag. Visible as the headline of the search listing on results pages, on the user’s browser tab, and sometimes in social media applications when an Open Graph Tag is not present, this text is intended to describe the overarching intent of the page and the type of content a user can expect to see when browsing.

What I mean by “intent” can be illustrated with the following example. Say my title tag for a product page was Beef for Dogs | Brand Name. As a user, I would not expect to find a product page, but rather, information about whether I can feed beef to my dogs.

A better title tag to accurately match my users’ intent would be Beef Jerky Dog Treats | Brand Name.
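For reference, here is a minimal sketch of how that title tag might appear in the page’s source; the brand name is a placeholder:

<head>
  <title>Beef Jerky Dog Treats | Brand Name</title>
</head>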

Query = “beef for dogs”

Query = “beef jerky dog treats”
How do I know what the title tag of my page is?
Identifying what has been set as the title tag or meta description of your pages can be done URL-by-URL or at scale for many URLs. There are distinct uses for each discovery method, and it is always important to remember that Google may choose to display another headline for your page in search results if it determines that a different title better represents the page for the user. Here are a few great online tools to get you started:

URL-by-URL inspection:
Slerpee (Free)
Moz Title Tag Preview Tool (Free)
Google SERP Simulator (Free)
At scale:
Screaming Frog (Free Up to 500 URLs)
Sitebulb (Paid)
DeepCrawl (Paid)
NOTE: If you are one who prefers to “live in the moment”, you can also view the page source of the page you are currently on and search for “<title>” in the code to determine what should be output in search results. Lifewire produced this handy guide on viewing the source code of a webpage, regardless of the internet browser you are using.

Are there guidelines for crafting the perfect title tag?

Yes. The optimal title tag is designed to fit the width of the devices it’s displayed upon. In my experience, the sweet spot for most screens is between 50 and 60 characters. In addition, a page title should:

Be descriptive and concise
Be on-brand
Avoid keyword stuffing
Avoid templated/boilerplate content
Meta Description
Though the text below the headline of your search result, also known as the meta description, does not influence the ranking of your business’ URL in search results, this text is still critical for providing a summary of the webpage. The meta description is your chance to correctly set a potential user’s expectations and entice them to click through to the website.

How do I build the perfect meta description?

Pay close attention to three things when creating a great meta description for each of your website’s pages: branding, user intent, and what’s working well in the vertical (the competitive landscape). These 150-160 characters are a special opportunity for your page to stand out from the crowd.

Do your page descriptions look and sound like they are templated? Investing time in describing the page in a unique way that answers users’ questions before they get to the website can go a long way in delighting customers and improving search performance.
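As a rough, hypothetical sketch (the wording below is illustrative, not a template to copy), a unique meta description sits in the <head> of the page like this:

<head>
  <meta name="description" content="Oven-dried beef jerky treats for dogs, with answers to the sizing and feeding questions shoppers ask most.">
</head>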

Take for example the following product page for the Outdoor Products Multi-Purpose Poncho. The top listing for this product page is via Amazon.com, with a very obviously templated meta description. The only information provided is the product name, aggregate rating, and an indication of free delivery.

While not the top listing, the following result from REI Co-op clearly includes the product name, breadcrumbs, aggregate rating, price, availability, and a unique, non-templated meta description. The standout feature of this meta description is that it does not copy the manufacturer’s text and provides product differentiators like “easy to pull out of your bag” and “great travel item” that speak to user questions about portability.

The meta description plays an important role in complementing other elements of a well-defined rich result, and it is often overlooked when retail businesses use rich results to improve the ecommerce search experience. That said, the same considerations apply to information-focused pages as well.

Section Headings
Section heading elements (H1-H6) were originally intended to size text on a webpage, with the H1 used to style the primary title of a document as the largest text on the page. With the advent of Cascading Style Sheets (CSS) in the late ’90s, CSS took over most of that styling work, and the heading tags came to act more as a “table of contents” for a variety of user agents (e.g., Googlebot) and users alike.

For this reason, the primary header (H1) and subheaders (H2-H6) can be important in helping search engines understand the organization of, and context around, a particular page of written content. Users do not want to read through a huge brick of text, and neither do search engines. Organizing written words into smaller sections will help digestion and lead to better organic results.

In the example above, the primary topic (How to Teach a Child to Ride a Bike) is marked up with an H1 tag, indicating that it is the primary topic of the information to follow. The next section, “Getting Ready to Ride”, is marked up with an H2 tag, indicating that it’s a secondary topic. Subsequent sections are marked up with H3 tags. As a result of carefully crafted headings, which organize the content in a digestible way, and supporting written content (among other factors), this particular page boasts 1,400 search listings in the top 100 positions on Google, with only 1,400 words.
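Based on that outline, the heading markup might look roughly like this (the H3 titles here are hypothetical stand-ins):

<h1>How to Teach a Child to Ride a Bike</h1>
  <h2>Getting Ready to Ride</h2>
    <h3>Adjusting the Seat and Helmet</h3> <!-- hypothetical subsection -->
    <h3>Practicing Balance Without Pedals</h3> <!-- hypothetical subsection -->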

Over 92% of long-tail (greater than 3 words) keyword phrases get less than 10 searches per month, but they are more likely to convert users than their head term counterparts.

Focusing on providing your potential users with answers to their questions about a particular topic, rather than on granular keyword phrases, will lead to a more authentic reading experience, more engaged readers, and more chances of capturing the plethora of long-tail phrases popping up by the minute.

Internal Linking

Internal links are hyperlinks in your piece of content that point back to a page on your own website. What is important to note here is that one should not create a link in a piece simply to provide a link pathway for SEO success. This is an old practice, and it will lead to a poor user experience. Instead, focus on providing a link to a supplemental resource if it will genuinely help a user answer a question or learn more about a specific topic.

A great example of helpful internal linking can be found above. In this article about “How to Ride a Bike”, the author has linked the text “Braking” to an article about types of bicycle brakes and more specifically how to adjust each type for optimal performance.
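In markup terms, that kind of contextual internal link is just a standard anchor tag pointing at another page on the same domain; the URL below is hypothetical:

<p>Teach your child to squeeze both levers gently when
  <a href="https://www.example.com/blog/bicycle-brake-types">braking</a>.</p>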

If there is supplemental information on your own website that substantiates your claims or further educates the reader on the article at hand, link to it. If this doesn’t exist, or there’s a better source of information on a particular topic, link out to that external content. There’s no harm in linking out to third parties; in many if not all cases, this will serve as a citation of sorts, making your content more legitimate and credible in the user’s eyes.

External Linking

Linking to sources outside your own domain is known as external linking, and external entities linking to your content are often seen as one of the major ranking factors in organic search. Earning those links is like being called a good neighbor by the people who live next to you, with a credibility effect similar to the citations you put in a term paper or an article on Wikipedia.

When writing a post or crafting a page for your own website, consider the following:

How can I substantiate my statistics or claims?
Why should my users believe what I have to say?
Can anyone (customers or companies) back up my thoughts?
If you are crafting the best user experience, you will want to take special care in building an authentic, data-driven relationship with your past and present customers.

There are no magic rules or hacks in how you link to external sources. As the SEO industry evolves, you will realize professionals are simply “internet custodial engineers,” cleaning up the manipulations of the past (part of the reason for Penguin, Panda, Hummingbird, and less notable algorithm changes by Google) and promoting the creation of content that demonstrates expertise, authoritativeness, and trustworthiness (E-A-T) on the web.

For more information on E-A-T, check out Google’s official Quality Raters Guidelines.

Images
Images break up large blocks of text with useful visuals.
Alternate text (alt text) included in an image’s markup can provide more context to search engines about the object, place, or person it represents. This can help to improve your rankings in image search.
According to a study by Clutch in 2017, written articles, videos, and images are the three most engaging types of content on social media. Adding images to your text can improve an article’s shareability.
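Alternate text lives in the alt attribute of the image element; here is a minimal sketch with a hypothetical file name and wording:

<img src="/images/child-first-bike-ride.jpg"
     alt="Parent steadying the seat while a child pedals a bike for the first time">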

A great example of using varying types of content to break up a topic can be seen below. In the article titled “How to Tie the Windsor Knot”, the author has provided an informative primary header (H1) based on the functional query and has also included video content (in case the user prefers this method of consumption), origin information, a comparison of this knot to others, and an explanatory graphic to walk anyone through the entire process.

By providing an abundance of detail and multimedia, your business can not only capture additional search opportunities in the form of video object structured data and alternate text on the images, but also meet the E-A-T standards that will delight your potential users and drive performance.

Open Graph Tags

Developed by Facebook in 2010, with inspiration from Microformats and RDFa, the Open Graph protocol is one element of your page that can easily be forgotten because it’s often built into popular content management systems. Forgetting to review how your shared content will display on popular social networks can kill productivity as you race to add an image, name, and description after publishing. A lack of “OG tags” can also hurt the shareability of the piece, decreasing the chances of its promotion being successful.

“OG tags,” as they are commonly called, are similar to other forms of structured data but are specifically relevant to social media sharing. They can also act as a failsafe when a page title is not available, as Google commonly looks to this field when it cannot find text within the <title> element.
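A minimal sketch of the core Open Graph properties, placed in the <head> of the page (the values shown are hypothetical):

<meta property="og:title" content="Beef Jerky Dog Treats | Brand Name">
<meta property="og:description" content="Oven-dried beef jerky treats for dogs of all sizes.">
<meta property="og:image" content="https://www.example.com/images/beef-jerky-dog-treats.jpg">
<meta property="og:url" content="https://www.example.com/products/beef-jerky-dog-treats">
<meta property="og:type" content="website">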

How can I construct and validate open graph tags on my website?
Unless your content management system automatically generates Open Graph tags for you, you will have to build a few snippets of code to populate this information for those sharing your posts. You can find a few tools to help you out below:

Documentation:
Facebook Open Graph Markup
Getting Started with Twitter Cards
Rich Pins Overview
Code snippet generators:
Web Code Tools
Mega Tags
Code snippet validation:
Facebook Open Graph Debugger
Twitter Open Graph Validator
Pinterest Rich Pins Validator
Meta Robots Tags
The content your team produces will never get the success it deserves in organic search if no one can find it. While a powerful tool for keeping search results nice and clean, the meta robots tag can also be a content marketer’s worst enemy. Similar to the robots.txt file, it is designed to tell crawlers how to treat a single URL in search results and whether to follow the links it contains; a single line of code can make your page or post disappear.

Where can I find the meta robots instructions?
This specific tag (if your website contains one) is generally contained within the <head> section of the HTML document and may appear to look similar to the following:

<meta name="robots" content="noindex, nofollow">

What instructions can I provide to crawlers via the meta robots tag?
At bare minimum, your URL will need to be eligible for indexing by Google or other search engines. Indexing is the default behavior when no meta robots tag is present, and it can also be stated explicitly with an INDEX directive in the content field above.

Note: It is still up to the search engine’s discretion if your URL is worthy and high-quality enough to include in search results.

In addition to the INDEX directive, you can also pass the following instructions via the meta robots tag:

NOINDEX – Tells a search engine crawler to exclude this page from their index

NOFOLLOW – Instructs the crawler to ignore following any links on the given page

NOARCHIVE – Excludes the particular page from being cached in search results

NOSNIPPET – Prevents a description from displaying below the headline in search results

NOODP – Blocks the usage of the Open Directory Project description from search results.

NONE – Acts as a combined NOINDEX, NOFOLLOW directive.
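Directives can also be combined in a single tag. For example, here is a hedged sketch of a page you want indexed but never cached or summarized:

<meta name="robots" content="index, noarchive, nosnippet">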

If you are taking the time to produce a high-quality article, make sure the world can see it with ease! Competing against yourself with duplicate articles and/or pages can lead to index bloat, and your search performance will not live up to its true potential.

Canonical Tags
Canonicalization and the canonical tag can be a tricky subject, but it is one that should not be taken lightly. Duplicate content can be the root of many unforeseen problems with your business’ organic search efforts.

What does a canonical tag (rel=”canonical”) do?
In simple terms, utilizing a canonical tag is a way of indicating to search engines that the destination URL noted in this tag is the “master copy” or the “single point of truth” that is worthy of being included in the search index. When implemented correctly, this should prevent multiple URLs with the same information or identical wording from being indexed and competing against each other on search engine results pages (SERPs).
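The tag itself is a single line in the <head> of each duplicate page (and, ideally, of the master copy as well); the URL below is hypothetical:

<link rel="canonical" href="https://www.example.com/products/beef-jerky-dog-treats">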

Can my canonical tag be self-referential?
Absolutely. If it’s the best version of a page, do not leave it up to a search engine to decide this. Wear the “single source of truth” badge with pride and potentially prevent the incorrect implementation of canonical tags on other pages that are identical or similar.

Page Speed Test
Last but not least, we can’t forget about page speed on individual pages of a business’ website. While the elements listed above are excellent for helping search engines and users better understand the context around a piece of content, page speed is important for ensuring the user gets a quality technical experience.

The entire premise of using a search engine is centered on getting a quick answer to a particular question or topic. Delivering a slow page will likely lead the user to leave your website altogether. According to a study from Google across multiple verticals, increasing page load time from 1 to 5 seconds increases the probability of a bounce by 90%. That could be a huge loss in revenue for a business.

Source: Google/SOASTA Research, 2017.

 

Tools for testing page speed:
Page by page:
Google PageSpeed Insights (Free)
GTMetrix (Free & Paid Versions)
Pingdom (Free & Paid Versions)
At scale:
Screaming Frog (Free Up to 500 URLs)
Sitebulb (Paid)
Crafting the perfect piece of content is more than simply understanding your audience and what they want to read about online. There are many technical elements outlined above that can make or break your success in organic search or many other marketing mediums. As you think about producing a blog, an informational guide, or even a product page, consider all of the information a user needs to take the desired next step.

If you need more info, contact ABCO Technology.

Computer programmer demand is exploding

 

US-based computer programmer demand is exploding. Key developments in this area include Google’s announcement of Stadia, Microsoft moving the majority of its games to the cloud, and similar plans from other companies, including Electronic Arts. If you are learning about this information for the first time, you might want to know the reason for the change.

The answer is right in front of us. Networks are moving to 5G, which means the delay now experienced on 4G and LTE systems will be gone. Last week a major surgery was performed using 5G technology.

The near elimination of delay, or latency as technicians prefer to call it, is a direct result of new 5G technology, and it allows games and other real-time activities to run over the network. In the past, these activities were conducted only through a console. With 5G, the console will disappear.

Since we will no longer have consoles, we will have more video games, and these games will be able to go online at a faster rate. More games mean more programmers. The US Department of Labor projects that jobs involving computer programmers will grow 25% faster than the normal growth rate for other occupations.

The programming languages leading this movement are Java, Python, C++ and the languages that work with Microsoft systems.

You can join this exciting occupation by contacting ABCO Technology. Call our campus today at (310) 216-3067 between 9 A.M. and 6 P.M., Monday through Friday.

Email all questions to: cpascal@abcotechnology.edu

Financial aid is available to all students who qualify for federal student funding.

Train for your career today.

 

 

CompTIA A+ power consumption questions

  1. What name do you use to talk about the rate of electrons flowing past a point in a circuit in one second? Ampere, or Amp. One amp equals one coulomb of charge per second, which works out to roughly 6.24 x 10^18 electrons flowing past that circuit point in one second.
  2. What is the calculation used to determine how much power is being consumed in a DC circuit?
    Volts * Amps = Watts. The watt is the unit of power. This relationship is closely related to Ohm’s law, and the simple form shown here applies to DC circuits. All computers, desktop and laptop, actually run on DC power. A power supply inside your computer converts household AC to direct current. Direct current doesn’t fluctuate, and it is much easier to lower its voltage and current to power solid-state components.
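    As a quick worked example with hypothetical numbers: a component drawing 5 amps from a 12-volt rail consumes 12 volts * 5 amps = 60 watts.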
  3. What is the difference in electrical power supplied to houses in the USA compared to the electrical power in Europe? Power is supplied at 110 – 120 volts @ 60 Hz in the US and 220-240 volts @ 50 Hz in Europe.
  4. What is one main difference between AC and DC? AC is alternating current; it moves in one direction and then reverses its direction, over and over again. Electrical technicians refer to this as the AC cycle. The alternating current stream is not steady; it’s always fluctuating a little more or a little less as the current changes direction. The average value of the AC waveform is expressed as RMS, or root mean square, voltage. DC is current flowing in one direction, so DC is a steady stream of current. Our sensitive electronics need a steady stream of current. Solid-state components such as chips and integrated circuits will not tolerate fluctuating current; they require low amounts of steady current and voltage.
  5. What does it mean when a power supply uses multiple rails? It means it has separate circuitry, and each rail is its own power source.
  6. Getting and using a 900-watt power supply means that when you are using the computer, you will be using 900 watts. True or False? False! You will only be using the power you need to run the computer. If your PSU can supply more power, it simply means up to 900 watts is available if your computer needs it, but it’s not using all of that available wattage.
  7. It’s OK to plug a 110-volt power supply into an outlet that provides 220 volts. True or False? False! If that power supply receives more voltage or current than it’s expecting, it will smoke, burn, or break.

Hopefully, you will have a fuse, ground fault interrupter or circuit breaker in the line to protect the circuit against such stupidity. If not, I’ll see you in class.

 

If you want to comment on this Blog, ask a question, or suggest further topics, email our school at:

 

cpascal@abcotechnology.edu

To know more, contact ABCO Technology.

A+ question Blog post by ABCO Technology

Topic: Motherboards
A+ question 1:

Are the connectors the same or different when comparing different sized motherboards?
All motherboard connectors are standardized, so it does not matter what the size of the motherboard is, they will all have the same type of connectors on them. It’s just a matter of how many connectors each motherboard is designed to handle.

A+ question 2
Airflow is not considered to be a factor in how a motherboard and its components are laid out. True or False?
Answer = False
Airflow is extremely important inside a computer’s case. The components heat up as you use the computer, and too much heat can damage them. Remember, solid-state components have low heat tolerance, so you must have some method to cool them down. One way is to move air into the case, across the components, and out of the case on the other side. Typically, fans in front of the case move air in and fans in the back of the case move air out.

Be careful what you post on social media; insurance companies are using your posts against you!

New York is the first state to legalize the monitoring of social media by insurance companies. What does this mean to your bottom line?
Put simply, what you post on social media will cost you money in terms of your life, health and even auto insurance premiums. Insurance companies need to save money; therefore, they will use your lifestyle against you, that is, if you make it public.

New York’s Department of Financial Services (NYFS) has released new guidelines that will allow life insurance companies to use data from customers’ social media posts to determine their premiums, and experts say that these rules could potentially extend beyond New York’s borders.

The new guidelines suggest that companies can use data from other “non-traditional” sources as well, though insurers will have to prove the information does not unfairly discriminate against protected groups:

“An insurer should not use an external data source, algorithm or predictive model for underwriting or rating purposes unless the insurer can establish that the data source does not use and is not based in any way on race, color, creed, national origin, status as a victim of domestic violence, past lawful travel, or sexual orientation in any manner, or any other protected class.”

The NYFS press release states:

“…insurers’ use of external data sources has the potential to benefit insurers and consumers alike by simplifying and expediting life insurance sales and underwriting processes. External data sources also have the potential to result in more accurate underwriting and pricing of life insurance.”

The use of social media by insurance companies has been a topic of debate for years now, although there’s very little legal guidance about what privacy
rights we have when posting online. Maria T. Vullo, the chief of the NYFS has been trying to get ahead of the inevitable by establishing some ground rules
after an 18-month investigation which collected information from 160 life insurers about their practices.

She told the Wall Street Journal last week:

“Because this is a rapidly evolving area in insurance underwriting, it was important for the department to create general principles now.”

According to an inside source from New York’s investigation, only one of the 160 companies polled currently uses social media data, but that company was not identified.

In 2012, the National Association of Insurance Commissioners released a white paper from their Social Media Working Group which mostly addressed the ways in which insurance companies could use social media in their marketing, but acknowledged that it was already being used to monitor customers:

“Companies are using social media in forensic data mining to discover workers’ compensation fraud. For example, some companies monitor social media sites that might contain posts negating the claims of allegedly injured workers who are participating in activities that are beyond the restrictions placed by the treating physician.”

Some background:

Traditionally, life insurance companies used physical exams and questionnaires to determine a customer’s rates. But as this is costly and time-consuming, companies began to engage in predictive modeling to determine how likely it was for a potential customer to develop a disease or engage in dangerous activities, using data collected from many public sources (think medical records of injuries, accident claims, even parking tickets). This new method of data collection is an extension of this, but into a realm we often (and mistakenly) treat as private.

Companies already had access to general social media trends (common phrases or hashtags, viral content, etc.) to help them understand their customers,
but this was largely for marketing and customer service, so it’s hard to make the case that that violated privacy in the way that this new, more personalized,
surveillance would.

A New York court decision in 2010 (McCann v. Harleysville Insurance Co.) declared that an insurance company could not conduct “a fishing expedition” into someone’s Facebook account “based on the mere hope of finding relevant
evidence,” but clearly insurers are finding workarounds. At the very least, it appears that the Fair Credit Reporting Act might give customers who are
denied insurance the right to know whether the decision was based on information gleaned from a social media profile. This could provide material for lawsuits
that could clarify the boundaries for everyone.

The danger and power of algorithms:

The new ground rules also warn life insurers using non-traditional data that they are responsible for analyzing their algorithms to be sure they are free
of bias against protected groups. This means that they can’t simply shop for algorithmic software and employ it without thorough testing first.

Of course, there are multiple issues here, despite the agency’s best efforts to try to make the process unbiased. First, we know that companies have often refused to share details of their algorithms with customers, and the law has allowed them to do so. We often don’t know how they are processing data, so all we can do is continue to test them. But
we also don’t know how much testing it takes to determine if an algorithm is unbiased, and there’s no objective mechanism or yardstick that allows a company
to truly confirm a lack of bias.

Second, while there are plenty of great data scientists and ethicists working together to find ways to make algorithms less biased, we simply don’t know
how to do it yet. Humans write algorithms and all of us have biases of some sort. The more we claim we don’t have them, the more deeply entrenched they
likely are, making it even more difficult to ferret them out. This has been a disaster already in employment decisions and court sentencing decisions that employ these algorithms. But we continue to think that data is objective and can yield some
sort of truth about the world.

Third, it will be very difficult for customers who are not well-versed in algorithmic bias to fight against unfair decisions made about their life insurance
premiums based on data they don’t even realize they’re giving away. We’ve been bombarded with stories about privacy violations, especially from social
media giants like Facebook, over the last year and instead of seeing people take steps to protect themselves,
Facebook has only seen more new customers and increased profits.

At the end of the day, a surprising number of people are
perfectly willing to hand over their data
to life insurance companies for as little as a gift card or a
discount on an Apple Watch.

And if you think you’re safe because you don’t have a social media profile, think again. Recent research has shown that
information about a person can be constructed from the comments of as few as 8 of their friends.
You are what you post, but apparently, you are what your friends and family post as well. This doesn’t appear to be a tactic insurance companies are looking
into yet, but it’s important to keep in mind as they expand their methods of surveillance.

It’s also important to note that social media posts can be deeply misleading, even to a deep learning algorithm assigned to seek out, process, and judge
the value of photos that customers post online. If you’ve given up smoking but have old photos with cigarette in hand (or repost one of those popular Facebook
Memories) how can a computer (or even an underwriter with a lot of work to do) properly assess the context of a photo? How do you control what other people
post about you online?

While Photoshopped images and even deepfakes could eventually become a problem for those looking to sabotage customers, that’s likely a problem that lies
farther down the line. But that’s not to say we shouldn’t keep it in mind.

There are some more likely scenarios that it’s worth watching out for (and, to be fair, some scenarios that are worth avoiding altogether). The U.S. Insurance
Agent’s blog, which aims to help customers compare plans and provides commentary on the industry, has shared some possibilities for ways in which customers might harm their chances of getting not just life insurance, but home and renters insurance, and keeping their premiums
down. It’s unclear whether or not companies would go this far, but some possibilities include posting photos or updates while driving, posting about an
unregistered pet that is classified as a “bully breed,” leaving on your geotagging when you’re on vacation and thus signaling to thieves that your house
is vacant. These are things we rarely think about when posting online.

So what can we do?

Plenty of people will join the chorus of protests against this invasion of privacy, but it could also be the case that New York is doing us a favor by
putting something on the record that can be challenged. No states have any rules right now governing how life insurers can populate their algorithms. We
know they currently use public records such as homeownership data, credit information, educational attainment, civil judgments, licensures and other public
filings, and even internet use. But now that they’ve taken an extra step – and one that will appropriately freak people out – the legal system can move
into action. Yes, it will take unfair decisions and lawsuits, and time, and money, but that’s how the system currently works.

While we wait to see how this all shakes out, customers should be sure to read the fine print on their insurance policies and ask specific questions about
what companies will access in order to determine their rates. Companies should be as transparent as possible about how they collect data and state that
on their websites in easy-to-understand terms so that customers can make informed decisions about whether they want to apply for a policy with a specific
company. The new guidelines do mention the need for transparency:

“Where an insurer is using external data sources or predictive models, the reason or reasons for any declination, limitation, rate differential or other adverse underwriting decision provided to the insured or potential insured should include details about all information upon which the insurer based such decision, including the specific source of the information upon which the insurer based its adverse underwriting decision.”

And while even data you’ve marked private can’t be fully protected from breaches and hacks, it still makes sense for social media users to explore the privacy settings on their accounts. It will be harder for an insurance company to defend its use of data if that data has been stolen and trafficked on the Dark Web.

In the meantime, make your profiles private, revisit your friends’ list and privacy settings for individual posts, turn off location services and geotagging,
delete compromising photos, do not allow other people to tag you on social media without your permission, and most importantly, do not engage in dangerous
behavior like texting and posting while driving, and always be honest with your insurance company about your habits and health.

Of course, it’s in the best financial interests of honest customers and insurance companies to prevent insurance fraud, which appears to be the main reason
a company would check a social media profile right now.

To know more, contact ABCO Technology.

A college degree needs a purpose

 

Over the years, I’ve posted numerous articles on ABCO Technology’s webpage. At times, while researching material, I run across an author who has something meaningful to add to the discussion. Today, I’m republishing an article written by Jack Hough. Mr. Hough writes for Barron’s magazine. His latest article, entitled “Want to Earn a Degree in Outrage?”, examines some interesting facts about obtaining a college degree. The tone of his article isn’t against attending college. Jack Hough points out that a college degree should have a purpose. One main flaw he identifies is that colleges should not be able to judge their own product. The important thing to consider when going for that degree is to determine a definite plan and purpose for future employment after completing your course of study. Now let’s get to Mr. Hough’s article.

Photoshopping one kid’s face onto another kid’s pole-vault picture is almost always funny. One exception is when a media executive pays $200,000 for the
job as part of a scam that gets her kid into the University of Southern California. It spoils the humor even more if she disguises the payoff as charity
to claim a tax break.

That’s one allegation among many in a sprawling college-admissions scandal unveiled this past week. An investigation by more than 200 federal agents led
to charges against 50 people in six states. Parents had test administrators bribed. They claimed that their kids had learning disabilities to get extra
test time. One actress accused of paying $500,000 to pass her daughters off as crew-team recruits had previously confessed to falsifying a preschool application—not
in real life, but in a 1993 episode of Full House. If aggravated irony isn’t a crime, it should be.

My personal outrage meter ought to be registering an eight out of 10 over this case, but it’s stuck at 4.5. The problem is I recently uncovered some other
scandalous college facts as part of a sweeping investigation using Google for nearly an hour. Among them:

New York University charges $6,500 for Calculus I. That’s tuition and fees, not books, residence, and a Vespa scooter. The rules of calculus were laid
down more than 300 years ago by two guys in wigs. You can learn everything for free on YouTube. So where does all this pricing power come from? Hold that
thought.

It’s not just math, and it’s not just NYU. The sticker price for the average private four-year college is now over $50,000 a year, including room and board.
Do you know what else $200,000 in cash can buy a 22-year-old? A $3 million retirement, if the money is invested at about a 6% yearly return until age 68.

Stanford University let in 4.3% of applicants last year. It ranks among the top U.S. schools. The number of applicants has more than tripled over the past
30 years, but yearly enrollment has barely budged.

That’s remarkable, because those 30 years have witnessed the birth of the web, which has enabled all sorts of once-tiny enterprises, especially right down
the road from Stanford in Silicon Valley, to connect with vast numbers of users. Some things you just can’t do online, of course. Like visiting Disney
World in Florida. Only Disney World has figured out how to grow attendance from 20 million a year to 55 million over the past 30 years.

Harvard’s admission rate is about 5%. Princeton’s, too. I’m not saying elite schools resemble a supply-restricting cartel. I’m saying that no hospital
would be proud of turning away 95% of patients. To applicants, these schools can seem less in the business of teaching than in the business of culling
the brightest students and branding them for lifelong success. No wonder some Hollywood power moms are going Full House on the admissions process.

The financial payoff of a college degree is looking shaky. I know: The average college graduate earns significantly more than the average nongraduate. But that income premium has dropped for students born in the 1980s, and even more important, the wealth premium for those students has fallen off a cliff, according to a recently published study by the Federal Reserve Bank of St. Louis.

Also, it doesn’t take a $6,500 calculus class to judge the financial payoff for a college degree, but it does require a fairly robust set of inputs. Ignore
any analysis that doesn’t mention the time value of money. I have yet to see one that adjusts for slowpokes and dropouts. Among students who started at
a four-year college in 2010, only 60% had graduated six years later.

Anyone with a reasonable shot of completing a college degree should do so. There are nonmonetary benefits related to career fulfillment, social status,
health, and more. But the fact that so many kids will lose money on a thing that ought to be cheap is at least as scandalous as impersonating a pole-vaulter.

U.S. student debt stands at $1.5 trillion. Not to sound ungrateful to lenders, or to the federal government for nudging them toward students, but whenever
we want to make something more affordable in America, we seem to come up with one answer: artificially puff up buying power against constrained supply.
See housing and health care for other examples. What eventually happens is about as comfortable for consumers as trying to sneeze with a mouthful of mashed
potatoes.

Did you hear that Walmart has started deciding for itself which steaks in its grocery section look likely to grill up juicy and flavorful? Just kidding.
Much as we like Walmart, we’d never trust it with a job that important. That’s why we have an army of U.S. Department of Agriculture inspectors to judge
beef “prime” or “choice” based on marbling. Yet colleges confer their own degrees, and so are the main judges of their own product, education. That’s not
scandalous, but it might hold the key to making college affordable.

Here’s my prediction: Somewhere hunched over a $180 textbook on data structures and algorithms right now is an indebted 20-year-old who will one day create
a new system for measuring student achievement. This will allow us to tell which schools impart the greatest student improvement per dollar of cost, and
not just which schools the brightest kids flock to. Brand names will become less important than performance. The cost of college will crash. Schools will
innovate to survive, then thrive.

Just in case, I’m going to keep stashing money in those college accounts for the kids. And maybe push them into crew.

Write to Jack Hough at
jack.hough@barrons.com
A vocational education in information technology could be the answer for many who are reading this article. Call our campus between 9 A.M. and 6 P.M., Monday through Friday, at (310) 216-3067.
Email all questions to: cpascal@abcotechnology.edu
Financial aid is available to all students who qualify for federal student aid.
ABCO Technology is located in Inglewood California less than two miles from LAX.
We are at:
11222 South La Cienega Blvd. in STE #588
Los Angeles, Ca. 90304
Train and certify for your information technology career today!

Google explains how dates are determined in search results listings and outlines best practices.

Google has put out a detailed blog post explaining how it determines the dates displayed by listings in search results and outlining helpful guidance and best practices on how to get these right.

Google shows a date when its systems determine it would be useful to do so, for example for news and other time-sensitive content.

Put your dates in snippets by using markup. You can learn how at www.schema.org.
Google wouldn’t be Google without a bit of obfuscation, and it was less than precise about defining exactly the factors that govern which dates are shown:

“Google determines a date using a variety of factors, including but not limited to: any prominent date listed on the page itself or dates provided by the publisher through structured markup.”

To be fair, Google says all factors can be prone to issues which is why it doesn’t depend on a single one. Publishers might not provide a clear date or
the structured data may be lacking or incorrect. So, it looks at multiple factors to determine a best estimate of when a page was published or significantly
updated.

Google offers the following guidance for publishers and site owners.

How to specify a date
• Show a clear date prominently on the page.
• With structured data, use datePublished and dateModified schema with the correct time zone designator for AMP or non-AMP pages.
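For reference, here is a minimal sketch of how those properties might be declared with JSON-LD structured data (the headline, dates, and time zone are hypothetical):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How to Teach a Child to Ride a Bike",
  "datePublished": "2019-03-13T08:00:00-08:00",
  "dateModified": "2019-04-01T09:30:00-07:00"
}
</script>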

Google News

Google says these items clearly require a correct date for when they were created or substantially updated, and specific guidelines for Google News are available via Google Help. Google also reminds publishers not to artificially freshen articles without a significant update.

Further best practice

In addition to the significant information above, Google says publishers should follow these best practices:

• Show when a page has been updated
• Use the right time zone
• Be consistent in usage
• Don’t use future dates or dates related to what a page is about
• Follow Google’s structured data guidelines
• Troubleshoot by minimizing other dates on the page

 

For more details on the above, see Google’s blog post.

Oracle Database careers in Los Angeles and how to get them

 

Have you ever thought about becoming an Oracle database administrator? This article will provide you the information you need to join this group of highly talented and professional information technology personnel.
On March 13, 2019, I decided to conduct a job search for Oracle database administrator jobs in Los Angeles. After typing the search query into Google’s search box, I found 22 jobs available for Oracle database administrators in the Los Angeles area. A great job is being offered by Planned Parenthood in Downtown Los Angeles with good benefits, the Pasadena school district is looking for two candidates with Oracle DBA certification, and a list of other companies all want to hire candidates with the latest database skills.
The salaries ranged from $60,000 to $130,000 based upon work experience. The salary range demonstrates that as you gain experience with Oracle, your salary will advance with your knowledge.
Many of you who will be reading this article for the first time will want to know about the best career path to become a database administrator with Oracle certifications.
Oracle has two certifications for this career path: the Oracle Certified Associate and the Oracle Certified Professional. Candidates who are interested in completing this course of study must pass three exams. The first exam grants the holder the Oracle Certified Associate certification, and passing the next two exams grants the candidate the Oracle Certified Professional credential as a database administrator.
A comprehensive course for Oracle is completed at ABCO Technology in 7 months. Students attending this course pass the first exam, the Oracle Certified Associate, within the first 7 weeks. Oracle Certified Associates become proficient with SQL (Structured Query Language) and with Oracle’s procedural extension of it, known as PL/SQL. Many companies hire Oracle Certified Associates for data entry positions.
After passing the next two exams, which we will call Oracle Certified Professional, Part 1 and Part 2, the student is granted a full certification as an Oracle database administrator.
After earning this certification, you are ready to begin your job search through ABCO Technology’s job placement assistance program, which will help you with resume writing, job interview skills and actual company searches.
Call our campus between 9 A.M. and 6 P.M., Monday through Friday, at (310) 216-3067.
Email: cpascal@abcotechnology.edu
Financial aid is available to all students who qualify for federal student aid.

Why CompTIA’s Network + will help your IT career

The Network+ certification is the next certificate level above the CompTIA A+. The Network+ focuses its curriculum on server theory and network operations. The A+ focuses on the individual computer or workstation.
After passing the A+ exams, which deal with computer hardware and operating systems, employers want their employees to advance to the level of being able to maintain a corporate server. The Network+ focuses on that employer requirement of handling servers either on site or in the cloud. CompTIA’s Network+ is also vendor neutral. The exam has questions about servers, Wi-Fi, cyber-security, cabling, troubleshooting networks and basic network theory. After completing the CompTIA Network+, your understanding of how a network operates and functions is greatly increased.
The Network + exam also explores TCP and UDP protocol theory, which is essential to a network’s operation.
Many candidates who have passed their Network + have reported getting raises in salary from their employer as a result of passing this exam.
The CompTIA Network+ is completed in approximately 90 hours. Close to 78 of those hours are instructor-led. Students who study for this exam should take several practice tests. A score of 90% or higher on the practice exams will place the odds of passing the actual exam strongly in your favor.
Like all CompTIA certifications, the Network+ doesn’t require a two- or four-year college degree. Companies are hiring network personnel who have completed high school and passed the A+ and Network+ exams.
After passing the Network+, the doors are open for a career in either cyber-security or network administration. Students will need to pass other certifications, including the Cisco Certified Network Associate. The Network+ builds the strong foundation needed to pass those higher certifications.

CompTIA A+ graduates are proven problem solvers in today’s digital world!

CompTIA launched A+, the organization’s first information technology certification, in 1993 to standardize skills for PC technicians in a manner that showed they could work on any machine, no matter the vendor. Today, CompTIA A+ validates the entry-level IT skills employers demand and gives information technology pros a broad perspective on many computer and networking functions. If you received your CompTIA A+ 10 or 20 years ago, you may be wondering how the Core Series, which launched in January 2019, compares to the exams you took. Keep reading below to see what’s changed and what’s stayed the same.

CompTIA A+: The Initial Release

1993 to 2000

The first iteration of CompTIA A+ focused on PC components and did not have dedicated domains for computer networking or security. It covered popular operating systems, including DOS 6.x, Windows 3.1 and Apple Macintosh System 7. It also covered an array of hardware, including AT motherboard form factors, serial and parallel ports and connectors, CRT monitors, dot matrix printers, floppy disk drives and CD-ROM.

When was the last time you saw a dot matrix printer or a floppy disk drive? This is a strong example of why CompTIA A+ has continued to evolve over the years – to reflect the changes in technology and the needs of the industry.

CompTIA A+: Version 2001

2001 to 2003

After eight years on the market, CompTIA A+ was ready for a refresh. This also marks the beginning of the three-year renewal cycle.

The core of the exam still focused on PC hardware components, but this sparked its evolution from computer repair to a better-rounded tech support certification. This version saw the addition of a domain dedicated to computer networking, including topics on internet connectivity, HTTP and TCP/IP. CompTIA A+ also now emphasized diagnosing and troubleshooting issues, a shift from the break-fix mentality.

The exam still covered Windows, motherboard form factors, drives and ports, but it upgraded to the latest technologies, including Windows 2000, DVD drives, SCSI connectors and LCD monitors. Coverage of the Apple Macintosh was dropped.

CompTIA A+: Version 2003

2004 to 2006

In this second update, we saw even more of a transition from PC repair to building a PC from components, connecting it to the network and troubleshooting. In 2004, CompTIA A+ put network and internet connectivity front and center and split networking into two domains: network hardware and client software configuration.

In 2003, the first iteration of personal devices – personal digital assistants (PDAs) and Palm Pilots – were added to CompTIA A+. New technology including wireless LAN connectivity, DNS and Cat6 networking cabling were also added, as were firewire, USB 1.1 and infrared receivers.

How Has CompTIA A+ Changed?

Wondering how CompTIA A+ has changed since you took your exam? Call ABCO Technology where you can learn about the new exam objectives from your exam to the new Core Series.

CompTIA A+: 600 Series

2006 to 2009

By 2006, security had found its way to tech support and the help desk, and CompTIA A+ reflected that. The 600 series added a dedicated security domain, covering topics including software-level security, accounts, permissions, antivirus software, firewalls, encryption and malware, including adware, viruses and grayware.

This series also offered three different exam options to meet the needs of different IT pros: remote support for end users, enterprise technical support and PC build and repair.

Computer networking and the internet continued to be a focus, with topics such as LAN/WAN and VoIP. In addition to PCs, CompTIA A+ also now supported laptops, setting the stage for mobile device support.

CompTIA A+: 700 Series

2010 to 2012

As cybersecurity threats exploded, CompTIA A+ delved even deeper into types of attacks and how to mitigate them. The entry-level IT certification covered topics including authentication technologies, encryption, BitLocker and social engineering as well as Disk Manager, Event Viewer, Device Manager and Remote Desktop Protocol (RDP).

Networking coverage expanded as well, covering ports and protocols, TCP/IP, HTTP, FTP, POP, SMTP, TELNET, wireless networking, WEP, WPA, SSID and more. In addition to the printers themselves, the 700 series of CompTIA A+ covered networked versus local printers.

Lastly, devices including projectors, web cameras, touchscreens, touchpads, track points and removable storage were added to the IT certification exam.

CompTIA A+: 800 Series

2013 to 2015

The CompTIA A+ 800 series split its focus evenly between supporting end users across devices and building custom PCs. Basic cloud computing concepts were added for the first time in 2013, as well as a dedicated domain for mobile devices, featuring OS administration and device connectivity.

Operating system coverage included Windows XP, Vista and 7, Android and iOS, with command-line admin tools and utilities, user account control, Windows firewall and upgrade paths.

On the hardware side, CompTIA A+ included motherboard expansion slots, storage drives, output connectors and devices. The exam even covered gamepads, joysticks, motion sensors and Smart TVs.

Physical and logical security, client-side virtualization and virtual printing were also added.

CompTIA A+: 900 Series

2016 to 2018

In 2016, CompTIA A+ brought back Apple and added Linux and the internet of things (IoT). The 900 series focused on supporting user access to applications and data from any device. Security moved to the forefront, bleeding into all exam domains rather than being limited to one.

CompTIA A+ now covered a wide variety of computers, ranging from PCs to wearables, from smartphones to GPS. The operating systems covered are just as broad, with Windows, Android, Mac and Linux all getting attention.

The modern exam stayed true to its roots with coverage of motherboard form factors, hardware and components, but it included the latest iterations. Hardware coverage also now included mobile device accessories, such as headsets, docking stations and credit card readers.

Cloud, mobile devices and connectivity, wireless networking and cybersecurity were all addressed, as well as PowerShell, software as a service (SaaS) and router configurations.

CompTIA A+: The Core Series

2019

This week CompTIA released the newest version of the CompTIA A+ certification, the Core Series. It focuses on enabling end users and devices to securely access the data needed to complete a task. Despite its evolution, the exam continues to emphasize problem solving and takes a vendor-neutral approach to tech support.

The latest iteration gives cloud computing and virtualization a dedicated domain and introduces scripting as an important technical support tool. Coverage of IoT is expanded, including both device hardware and connectivity.

Cybersecurity also plays a key role, with topics such as Active Directory security settings, software tokens, security protocols and encryption, authentication methods and security threat identification and prevention. Privacy concerns and policies, such as GDPR, are also addressed.

Networking and hardware round out the certification, which gives aspiring IT pros a well-rounded view of what they might encounter at the help desk and a taste of the specialties they could choose to pursue in the future.

 

They say the more things change, the more they stay the same. While CompTIA A+ still covers the basics, like motherboards and operating systems, it has evolved to include modern technology concepts and devices and shifted some emphasis to cloud technologies.
If you are interested in a fulfilling career in the field of information technology, stop what you are doing and contact ABCO Technology. You can reach our campus between 9 A.M. and 6 P.M., Monday through Friday, at (310) 216-3067.
Email all questions to: cpascal@abcotechnology.edu
Financial aid is available to all students who qualify for federal student aid.
ABCO Technology is located at:
11222 South La Cienega Blvd. STE #588
Los Angeles, Ca. 90304
Train and certify for your information technology career today!
