Black-Hat SEO
Despite knowing that black-hat SEO will result in penalties, some SEO experts resort to
underhanded techniques. Link farming, cloaking, keyword stuffing, irrelevant content,
and spamming are black-hat techniques that are still in use. The results may seem
positive, but eventually Google and other search engines realize that they are being
duped, resulting in a penalty.
Let’s consider the cloaking black-hat SEO technique. It is akin to spamdexing, where
the content presented to search engine crawlers is different from the content presented
to human users. It is a deceitful way of achieving higher rankings: the content delivered
varies depending on the requesting IP address or HTTP headers. This manipulative technique tries
to trick search engines into believing that the content they crawl is the same as what users see.
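To make the mechanics concrete, here is a minimal sketch (a hypothetical Node.js/Express handler, shown purely for illustration) of what cloaking looks like: the server inspects the User-Agent header and serves keyword-stuffed markup to crawlers while human visitors receive something entirely different. Do not deploy anything like this; it is exactly the behavior search engines penalize.

    var express = require('express');
    var app = express();

    // Illustration only: cloaking serves different content to crawlers and users.
    app.get('/', function (req, res) {
      var userAgent = (req.headers['user-agent'] || '').toLowerCase();
      var isCrawler = /googlebot|bingbot/.test(userAgent);

      if (isCrawler) {
        // Keyword-stuffed page shown only to search engine bots
        res.send('<h1>Cheap flights cheap hotels best cheap travel deals</h1>');
      } else {
        // Entirely different page shown to human visitors
        res.send('<h1>Welcome to our travel portal</h1>');
      }
    });

    app.listen(3000);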
Another black-hat SEO technique is link farming, where sites exchange reciprocal
links to boost their rankings by fooling the search engines. It is different from link building,
which is an organic way of boosting rankings. Because search engines such as Google and
Bing rank sites based on their popularity and inbound links, some SEO consultants used to
implement link farming to get links from hundreds of websites that were not even slightly
related to the target site. Some SEO consultants also ran link farms that used devious ways
of exchanging links with other sites, something like a link-exchange program.
For example, suppose a site is devoted to troubleshooting Windows OS issues. If
inbound links come from sites such as Stack Overflow, authority sites, and relevant
blogs, then they are legitimate. However, if such a site receives inbound links from travel
and tourism sites offering vacation packages in Miami or from sites offering plumbing
solutions in Ibiza, then there is no relevance: the sites have no connection. Therefore, such
sites are using link farming and being deceitful, because they just want to boost their
rankings using these underhanded techniques.
Instead of fooling the search engines, it is better to use white-hat SEO techniques
that are beneficial in the long run. Quality link building, using social media appropriately,
and engaging users with solid content are some of the white-hat SEO techniques. It may
take weeks or even months for the results to show, but white-hat techniques are the
norms that SEO experts must follow to gain visibility in SERPs.
Irrelevant Content
Content is king. However, if you use duplicate content or inappropriate methods such
as keyword stuffing in your content, you are bound to be penalized. Content must be
relevant and must engage users. Fresh content is also an essential factor, because the
search engines show an affinity for fresh, quality content. Write content for users, and
then tweak it to optimize it for the search engines.
Targeting the Wrong Audience
Your SEO implementation must be optimized for an appropriate target audience. If
you do not target the right audience, your efforts will be wasted. For example, gaming
consoles, portable music players, and MP3 gadgets are appealing to youth, whereas long-
term retirement plans are better suited for middle-aged users.
Competition
Small- and medium-sized businesses do not have a huge budget for advertising their
products and services. Therefore, they need to ensure that they do not try the same
approaches as large-scale enterprises that have a panel of SEO experts, funding, and
expensive advertising methods. You should also avoid using metadata and content
similar to that of large-scale enterprises, because doing so will hinder your SEO process.
Using keywords prevalent on the web pages of enterprise-grade organizations is
detrimental, because small enterprises do not have the budget, web presence, or mass-scale
advertising reach to outperform these larger competitors. You can use Google My
Business and other Google tools (as well as third-party enhancements) to gain visibility.
You can also use keywords that have less competition and target a niche market. You may
see your website gain prominence on SERPs using such low-competition keywords.
Overlooking Social Media as a Medium
Social media marketing is no longer an optional method. It is mandatory to take
advantage of the scope and reach of social media to create awareness of your brand.
Many organizations neglect social media, which limits their sites’ exposure. This
doesn’t mean you should over-optimize your social media strategy by using every social
media app on the market; you also have to be relevant.
For example, you can advertise by using a concise video about your product or
service on YouTube. You can tweet about your latest product update on Twitter or write a
blog post on WordPress that is relevant to your product. Users love to read fresh and engaging
content, and so do the search engines. You can also use backlinks to your site and
outbound links to relevant sites (such as linking a term to Wikipedia), which will benefit
users looking for informative content.
Ignoring UX for Your Website
Your site may have lots of jazzy features, but if your users cannot navigate easily or find it
difficult to access content, then the result may be a shabby UX. For example, the Add To
Cart button on an e-commerce website must be easily accessible to users. UX is a crucial
factor for search engines because they like sites that are popular and have a high degree
of usability.
Missing XML and HTML Sitemaps
XML and HTML sitemaps are designed for search engines and users, respectively. Your
site may have the latest updates and game-changing features, but if the search engines
are unable to crawl and map your site, that is detrimental to your SEO workflow. You should
submit an XML sitemap to the search engines and provide an HTML sitemap for your visitors,
so that even your deep web pages can be crawled and reached more easily. (Chapter 6 is
dedicated to sitemaps.)
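For reference, a minimal XML sitemap looks like the following sketch; the URLs and dates are placeholders for your own pages. The file conventionally lives at the site root (for example, /sitemap.xml) and can be submitted through each search engine’s webmaster tools.

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2016-08-01</lastmod>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>https://www.example.com/windows-troubleshooting/faq</loc>
        <lastmod>2016-07-15</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>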
Slow Page-Load Time
Slow page-load time is a deterrent to SEO processes. Code-heavy pages; uncompressed
and unoptimized HTML, images, and external embedded media; and extensive use of Flash
and JavaScript all result in slow page-loading times. Other factors may also lead to high
page-load times: server-centric dynamic scripts, non-optimal web hosting, and a lack of
required bandwidth. These factors negatively affect your SEO processes and are major
hindrances that result in lower rankings. For example, based on research and analytics data,
SEO experts recommend 2 to 3 seconds as the optimal loading time for product pages on
e-commerce sites.
Moreover, surveys and studies related to site speed indicate that users tend to abandon a
site that hasn’t loaded within 3 to 4 seconds. This also makes for a shabby user experience,
resulting in lower conversions and sales.
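To see where your pages stand against that target, you can log the load time in the browser; this minimal sketch uses the standard Navigation Timing API, which modern browsers expose on window.performance.

    // Minimal sketch: measure page-load time with the Navigation Timing API.
    window.addEventListener('load', function () {
      var timing = window.performance.timing;
      var loadTimeMs = timing.loadEventStart - timing.navigationStart;
      console.log('Page loaded in ' + (loadTimeMs / 1000).toFixed(2) + ' seconds');
    });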
Using Flash on Your Site
In the early days of the web, Flash was an awesome resource that helped sites build
intuitive modules and impressive page elements. However, with the advent of HTML5,
this once cutting-edge utility has taken a backseat. Moreover, as the mobile and tablet market
has become a dominant force, Flash is increasingly considered obsolete, because it
was tailored for desktop users.
From a security point of view, Flash is also more prone to malware and exploits.
Its content does not scale: you cannot minimize or expand it or set a viewport for it.
Everything you can do in Flash can be done more quickly and easily in HTML5. You can
also use the latest web design frameworks to build interactive sites, thereby relegating
Flash to the sidelines. Most search engines cannot read Flash, although Google has
recently claimed that it can index the text in Flash files. Considering the pitfalls
associated with Flash, you should use HTML5 to design and develop interactive sites with
enhanced animation and special effects.
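For example, a clip that once required a Flash-based player can be embedded with the native HTML5 video element; the file names below are placeholders.

    <!-- Native HTML5 video: no Flash plug-in required, works on mobile browsers -->
    <video width="640" height="360" controls poster="preview.jpg">
      <source src="product-demo.mp4" type="video/mp4">
      <source src="product-demo.webm" type="video/webm">
      Your browser does not support HTML5 video.
    </video>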
JavaScript Accessibility Issues
Client-side JavaScript, extensively used in single-page applications, helps you build
dynamic and highly intuitive websites. However, search engines cannot parse JavaScript-
generated content efficiently. Currently, only Google’s search engine is able to
understand JavaScript, albeit at a basic level. With frameworks such as Angular and
Ember being used to build dynamic web pages, search engines will have to develop the
ability to understand complex JavaScript; so far, that ability is still evolving. There are a few
workarounds you can implement to tackle JavaScript accessibility issues.
Suppose a user’s browser is an old version and cannot parse the latest JavaScript features
and functions. You can use fallback code (also called polyfills) to replicate the
content of JavaScript-based web applications. Using fallback code in combination with
server-side rendering is one option for tackling the JavaScript issue. However, it requires a
lot of effort and costs more, because you must develop fallback code for all the features on
the website; it also entails heavy code maintenance and makes your pages code-heavy,
resulting in higher page-loading times.
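As a small illustration of fallback code, the sketch below adds a simplified polyfill for Array.prototype.includes so that scripts relying on it keep working in older browsers; real projects would normally pull in a maintained polyfill library rather than hand-rolling one.

    // Simplified polyfill sketch: provide Array.prototype.includes where missing.
    // (Ignores edge cases such as NaN and the fromIndex argument.)
    if (!Array.prototype.includes) {
      Array.prototype.includes = function (searchElement) {
        return this.indexOf(searchElement) !== -1;
      };
    }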
Another aspect to be considered is the workaround for applications built using
JavaScript frameworks such as Angular and Backbone. When search engines and
social networks crawl your pages, they only see the JavaScript tags. To ensure that these
dynamic pages are accessible to the search engines, you can use a prerendering service
such as Prerender (https://prerender.io/) or SEO.js (http://getseojs.com/).
The Prerender middleware checks all requests and, if there is a request from a
search engine spider or bot, sends a request to Prerender.io for the static HTML for that
JavaScript page. The Prerender service uses PhantomJS to create static HTML, which
in turn is submitted to the spiders for crawling. Prerender can be used for most of the
JavaScript frameworks.
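For instance, an Express-based site can plug in Prerender’s middleware roughly as follows; the prerender-node package and the token setting reflect Prerender’s published setup, but treat the details as an assumption and verify them against the current documentation.

    // Sketch: serve prerendered static HTML to crawlers via the Prerender service.
    var express = require('express');
    var prerender = require('prerender-node');

    var app = express();

    // Requests from known crawlers are routed to Prerender.io, which returns
    // static HTML rendered by PhantomJS; normal visitors get the regular app.
    app.use(prerender.set('prerenderToken', 'YOUR_PRERENDER_TOKEN'));

    app.get('/', function (req, res) {
      res.sendFile(__dirname + '/index.html');
    });

    app.listen(3000);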
An alternative to Prerender is SEO.js. Once you submit a website to the SEO.js
dashboard, the service visits your web pages and creates an HTML snapshot of each page.
An added advantage is that the snapshots are updated automatically.
Therefore, when a search engine’s spiders or bots visit your site, they see only fully
rendered content from the snapshot. You can also use SEO.js to create XML sitemaps to
enhance the accessibility of your web pages to the spiders.
Now that you have learned the core obstacles that hinder SEO, let’s look at Google’s
Accelerated Mobile Pages (AMP): a new concept that will change the face of the mobile web.
AMP
Accelerated Mobile Pages (AMP) is a Google project aimed at the mobile web. Akin in
concept to Facebook’s Instant Articles and Apple News, AMP pages will change
the way people perceive the mobile web. AMP pages are web-based, meaning they are
rendered in a browser. They are independent documents that are sourced from your web
server. Optionally, you can store AMP documents in a CDN cache to render them more
quickly.
AMP pages are made up of the following modules:
• AMP HTML
• AMP Runtime
• AMP Cache
While responsive websites face issues such as rendering heavy-duty desktop content
on mobile devices, JavaScript bloat, and sluggish speed on the mobile platform, AMP
pages are designed specifically for mobile and help users view site pages efficiently
across various phone and tablet sizes. The AMP Runtime is JavaScript baked into every
AMP page; it manages the loading of AMP modules and provides features such as runtime
validation for AMP HTML. It also defines the priority for resource loading, resulting in an
optimal page-loading experience. AMP HTML documents can be stored on your own server,
and you can use your own CDN; but you can also take advantage of Google’s AMP Cache
(CDN), which streamlines the SEO processes built around these pages. When you search
on Google from a desktop browser, you see the regular results. On the mobile platform,
however, there is a high probability that Google will direct you to AMP pages rather
than regular pages, because AMP pages load almost instantaneously and the Runtime
streamlines the utilization of available resources.
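To give you a feel for AMP HTML, the following is a minimal sketch of an AMP document based on the boilerplate published at ampproject.org; the canonical URL is a placeholder, and the required amp-boilerplate CSS is abridged here, so copy the exact rules from the project site when building a real page.

    <!doctype html>
    <html amp lang="en">
    <head>
      <meta charset="utf-8">
      <title>Hello AMP</title>
      <link rel="canonical" href="https://www.example.com/article.html">
      <meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1">
      <!-- Placeholder for the mandatory amp-boilerplate CSS; use the exact version from ampproject.org -->
      <style amp-boilerplate>body{visibility:hidden}</style>
      <noscript><style amp-boilerplate>body{visibility:visible}</style></noscript>
      <script async src="https://cdn.ampproject.org/v0.js"></script>
    </head>
    <body>
      <h1>Hello, mobile web</h1>
    </body>
    </html>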
Unlike Facebook’s Instant Articles or Apple News, AMP pages (although
backed by Google) are portal-agnostic and open source. They come with built-in support
for ads and analytics. AMP pages are accessible from any portal: Google Search,
Pinterest, or anywhere else online. In summary, AMP pages are an attractive alternative to
heavy-duty websites; they display information quickly and render content effectively
without the bulk or clutter.
Go to www.theedesign.com/blog/2016/year-of-google-amp to see the difference
between a normal web page and an AMP page. You can find out more at
https://www.ampproject.org/.