Content and Links For Search Engines

Content

When you look at a Web page, you see the page displayed on your computer screen. You can read the text, look at the images, and figure out what that page is about.

Search engines don’t see Web pages the same way a person does. In fact, search engines cannot actually see at all, at least not visually. Instead, they read the HTML code of the Web page, and the actual text that it contains.

All search engines can actually read is text. They can also look at the HTML code of the site (which is also text) to try to get some clues about what that text means or which text is most important.

Search engines can sometimes use the HTML code to get some clues about other elements on the page, such as images and animation. For example, search engines can look at an image tag and read the alt text attribute, if the page author supplied it, to get an idea of what the image is.

<img src="cowpicture.jpg" alt="Picture of a cow">
However, this is not a replacement for actual text content.


Links

Web links from other sites are also important clues that search engines use to figure out what your page is about, or how important your page is for a particular search query. In a search engine’s view, a link from one page to another is basically a “vote” for that page.

If you have a page about cows, and a local farmer’s website links to your page for more information on the topic of cows, that is an extra vote for your page.

More links = more votes.

Not all votes are equal votes, however. Most important is how relevant the link is. For example, a link from a page about video poker software doesn’t have much to do with dairy products or cows, so a link from that page to your website about cows does not count for very much at all, if anything.

Some Web page owners put a lot of time and effort into chasing down links from other Web page authors, swapping links or trying to get listed on directories or have articles posted to sites like Digg or Reddit. This can be helpful for your site, but you have to remember to focus on your own page content first. If your Web page doesn’t have much value to other site authors, they are unlikely to link to it.

Search engines have developed a lot of sophisticated techniques for weighting and valuing pages on the Web. But they all come down to basically two categories:

  • What does your Web page say? The actual text content of your Web page and HTML code. What content does your site convey to the user?

  • Who is linking to you? What sort of other Web pages are linking to yours? Do they have the same topic or a related topic?

How to Create Great Alt Text

ALT Tags – If your browser cannot display an image from a website, then the ALT tag displays the description of the image as text. ALT image tags also make it possible for the visually impaired to understand the images on your website. The ALT tag should be only a few words describing the content of the image. ALT tags contribute to the keyword count on the Web page. So, using relevant images with appropriate ALT tags can increase the overall keyword count on your page.

 

Google Image publishing guidelines

  1. Don’t embed important text inside images
  2. Tell Google as much as you can about the image
  3. Give your images detailed, informative filenames
  4. Provide good context for your image
  5. Create a great user experience

The alt attribute is used to describe the contents of an image file. It’s important both for accessibility and for telling search engines what an image shows. Compare the following examples, from least to most useful:

<img src="puppy.jpg" alt=""/>

Better:

<img src="puppy.jpg" alt="puppy"/>

Best:

<img src="puppy.jpg" alt="Dalmatian puppy playing fetch"> 

To be avoided:

<img src="puppy.jpg" alt="puppy dog baby 
dog pup pups puppies doggies pups litter puppies dog retriever 
 labrador wolfhound setter pointer puppy jack russell terrier 
puppies dog food cheap dogfood puppy food"/>

https://support.google.com/webmasters/answer/114016?hl=en

About Malware on Your Server

Keep your site clean and secure. Every time shoppers place an order, they’re trusting you to keep them safe from hackers who steal information or spread spyware and viruses.

Malware is short for malicious software. It’s a catch-all term that describes harmful applications or other malicious code such as adware, spyware, trojan horses, worms or viruses.


Malware comes in many forms, from an unwanted ad reappearing on your site to an executable file that infects visitors who click on it. Telltale signs that your site is infected can include unexplained ads, links or pop-ups, but some malware can have no noticeable effects at all.

Your best defenses against malware are staying current with third-party application patches and using strong server passwords. When checking for the presence of malware, be sure to check the code residing on your server and not your backup files. Always use a virtual machine for verification to avoid infecting your own computer.

We cannot assist you with removing malware from your server. If your site is infected, consider taking it down immediately to prevent infecting visitors, and act quickly to identify and remove the malware.

Defend your website

SiteLock protects your web investment, keeping you and your customers safe from hackers and other online threats.

Slow Website Speed Problems – Why?

Knowing how to improve your website’s performance is important. We use tools like P3 Profiler, YSlow, PageSpeed, and WebPageTest.org to diagnose poor Web page performance. For more information on using tools to determine site slowness, see Website speed tests.

It’s quite possible that your website is slow because of one of the five issues below. Check them out, and see how they relate to your site.


  1. Page Size — The bigger your page, the longer it takes to download, especially over slower connections. Big images are probably the number one cause of slow-loading pages. Most image creation software has image compression options, and there are also online tools, such as Smush.it by Yahoo!, that can help you compress large images. Make sure each image on your website is optimized for the Web, and resize images to fit the width and height at which you want them displayed on your page. We often see people upload giant 2000-plus-pixel-wide images snapped with their digital cameras and then use the width and height attributes to shrink them, like this: width="500" height="300". Don’t do that. If you say width="500" height="300" in your img tag, the image itself should be optimized and 500×300 pixels.
  2. Time to First Byte — An increased time to first byte usually means too many SQL queries or non-optimized SQL queries; server-side calls to third-party APIs can also contribute. If you’re running WordPress, get the P3 Profiler plugin to discover which plugins are running which queries and how long each one takes. WordPress users can also try caching plugins. We’ve seen caching plugins affect performance both positively and negatively on customer sites, depending largely on the traffic and how dynamic the site is. Popular choices for WordPress are WP Super Cache, W3 Total Cache, Batcache and Tribe Object Cache, which offer various page, database and browser cache features. Try each one out (one at a time, not all at once) and see what works best for you.
  3. Total Objects and Third-Party Objects — Too many objects on your page force visitors’ browsers to repeat the request-and-receive cycle too many times, which slows down your page. Try combining JavaScript and CSS files, and use sprites for your images. You could use mod_pagespeed to help automate this for your site; for more information, see Which mod_pagespeed functions do you support? Also be wary of how many third-party domains you’re using, since too many social buttons cause problems. If you use WordPress, you might want to check out the Lazy Social Buttons plugin.
  4. Cached Objects — You want browsers caching your site. You need to instruct the Web server to enable expires headers on your static objects. This tells browsers to cache the site. This is not currently enabled by default on our Windows hosting plans, but is available for Linux plans. For more information, see Enabling mod_expires with Your Hosting Account.
  5. Text Compression — If you don’t have text compression turned on, your page is going to be slow. We turn this on by default on our Web Hosting plans, so if your page is suffering from this, it’s either because of third-party objects, or it somehow got disabled on your hosting account. See Enabling mod_deflate with Your Hosting Account for more information.
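If you manage your own Apache configuration, items 4 and 5 above correspond to the mod_expires and mod_deflate modules. As a rough sketch (assuming both modules are installed; the cache lifetimes and content types are example choices, not settings from this article), an .htaccess file could look like this:

```apache
# Sketch: cache static objects in the browser (item 4).
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType image/png  "access plus 1 month"
    ExpiresByType text/css   "access plus 1 week"
    ExpiresByType application/javascript "access plus 1 week"
</IfModule>

# Sketch: compress text responses before sending them (item 5).
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```

On our hosting plans you would enable these through the control panel rather than editing Apache configuration directly, as described in the linked articles.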

Website Builder Version 7 is Here!

The release of Website Builder v7 has been much anticipated, and we are happy to announce that it is now available!


Key Features of Version 7:

  • Drag-and-drop website builder, designed for easy use
  • Unlimited pages on ALL plans!
  • Up to 300 themes available, depending on the plan purchased
  • Mobile site available, depending on plan

All this and more! Check out your Website Builder page for more details!
With this change, we updated the names of the different plans to better communicate who they are intended for: the Economy plan is now called Personal, Deluxe is now called Business, and Premium is now called Business Plus.

All new purchases of Website Builder will be provisioned in version 7 of the product. Current Website Builder customers will continue to have access to version 6. Unfortunately, at this time there is no migration path between versions 6 and 7. Visit here for more information about Website Builder v7, or here for our domain registration service.

 

What is a Cron Job?

Cron is a standard Linux feature that allows you to schedule tasks, called “Cron Jobs,” to run unattended at a specified frequency. For example, you can set the frequency of a job to run twice an hour, Mondays at 8:00 a.m., or weekdays at 12:00 p.m. and 6:00 p.m.
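If you are used to standard crontab notation (the five fields are minute, hour, day of month, month, and day of week), the example frequencies above could be written like this. The script path is a made-up placeholder, and on our hosting you set the schedule through the Cron Job Manager rather than editing a crontab directly:

```
# min  hour   dom  month  dow   command
0,30   *      *    *      *     /path/to/job.sh   # twice an hour
0      8      *    *      1     /path/to/job.sh   # Mondays at 8:00 a.m.
0      12,18  *    *      1-5   /path/to/job.sh   # weekdays at 12:00 p.m. and 6:00 p.m.
```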

Cron Job reports are sent to the email address specified in the Hosting Control Panel Cron Job Manager.

There are several ways to schedule commands to run. Typically, you create a shell script to run as a Cron Job; it runs a list of commands while checking for errors and valid return codes. To run a shell script, set its permissions to executable. Alternatively, you can run a binary executable with a number of arguments. For example, to “touch” a file:

/bin/touch $HOME/html/cron_test

The first part of this sample command, “/bin/touch” runs the executable. The second part, “$HOME/html/cron_test” is an argument for the “touch” command.

NOTE: In the example above, the “$HOME” variable is set to the default directory of the hosting account. The “html” directory is the document root of the hosting account.
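As a sketch, a small shell script of the kind described above, which runs a command, checks its return code, and reports the result, might look like this. The output path here is an example for illustration (the article’s command writes under $HOME/html):

```shell
#!/bin/sh
# Minimal Cron Job script sketch: run a command, check its return code,
# and report what happened. The output file name is an example.
OUTFILE="${HOME}/cron_test"

if /bin/touch "$OUTFILE"; then
    echo "cron job succeeded: created $OUTFILE"
else
    echo "cron job failed with status $?" >&2
    exit 1
fi
```

When Cron runs the script, anything it prints is mailed to the address configured in the Cron Job Manager, which makes this kind of success/failure reporting useful.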

In addition to shell scripts, your hosting environment supports scripts in other languages. For example, Perl is a commonly used scripting language whose scripts can run as Cron Jobs. In most cases, you enable this by specifying the interpreter executable in the first line of the script. For example:

#!/usr/bin/perl

When a script begins with a line like the one in the previous example, and its executable permissions are set, you can specify the script itself as the command to run for a Cron Job.

Our PHP (versions 4 and 5) hosting installations do not support the use of this executable line in scripts. To run a PHP script with Cron, you must set the path to the PHP interpreter as the first element of the command, and then enter the full path of the script you want to run.

The full path to the PHP executable depends on what type of hosting you have (more info) and your PHP version (more info):

    • All cPanel Shared Hosting: /usr/local/bin/php -q
    • Other hosting PHP version 4: /web/cgi-bin/php
    • Other hosting PHP version 5 through 5.2: /web/cgi-bin/php5
    • Other hosting PHP version 5.3: /web/cgi-bin/php5_3
    • Plesk hosting PHP version 5.4: /web/cgi-bin/php5_4

NOTE: Our Classic & Web hosting accounts using PHP version 5.4 do not currently support running PHP scripts with Cron.
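Putting the pieces together, a complete Cron Job command for a PHP script on a cPanel shared hosting plan might look like the following; the script path is a made-up example:

```
/usr/local/bin/php -q $HOME/html/scripts/daily_report.php
```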

Search Engine Visibility – Optimizing Your Search Engine Tags

Search Engine Visibility lets you view your site’s current tags and create populated tags for easy downloading and pasting in your Web page’s code.

To View Current Tags

  1. Log in to your Account Manager.
  2. Click Search Engine Visibility V1.
  3. Next to the account you want to use, click Manage.
  4. From the Optimize tab, select Optimize Tags.
  5. Click the Current Tags tab to view your site’s active tags.
  6. Use the Back and Next buttons to navigate between your site’s pages.

NOTE: You cannot modify content on the Current Tags tab. To modify or add new tags, use the Optimized Tags tab.


To Create New Tags

    1. On the Optimize Tags page, select the Optimize Tags tab.
    2. Use the Back and Next buttons to navigate between your site’s pages.
    3. In the Title field, enter the title tag.
    4. In the Description field, enter description tags.
    5. In the Keywords field, enter the keywords tags.

NOTE: For suggestions and guidelines on each tag, click the question mark next to the tag name.

    6. (Optional) Click Select defined keywords to add keywords that you have already defined.
    7. Click Publish All Pages or More Options, and then choose between Download Text File or Export to Webmaster. Continue downloading or publishing for each page as needed.

About some Key SEO Terms

ALT Tags — If your browser cannot display an image from a website, then the ALT tag displays the description of the image as text. ALT image tags also make it possible for the visually impaired to understand the images on your website. The ALT tag should be only a few words describing the content of the image. ALT tags contribute to the keyword count on the Web page. So, using relevant images with appropriate ALT tags can increase the overall keyword count on your page.

Backlinks — Links to your page from other sites on the Internet are called backlinks. Search engines use links to indicate general popularity. Search engines take into account where the link is coming from, which page it’s pointing to, and what the actual text of the link says.
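For example, a backlink whose anchor text describes the target page gives the search engine all three of those signals at once; the URL here is a made-up example:

```html
<!-- The link text ("dairy cows") tells search engines what the
     target page is about. The URL is a hypothetical example. -->
<a href="http://www.example-farm.com/cows.html">dairy cows</a>
```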

Black Hat — In SEO, black hat SEO refers to using deceptive techniques to fool search engines into ranking a site higher than it deserves. These techniques are usually short-lived: search engines are constantly updating their ranking algorithms to eliminate the effectiveness of black hat practices, and may ban sites that use them.

Hidden Content — This is another technique common among black hat SEO. This practice involves placing content on a Web page that is hidden to normal Web viewers, and is only visible to search engines. The hidden content artificially increases search result rankings. Search engines have gotten very good at detecting these types of techniques. Using hidden content can cause your site to be penalized, including exclusion from search results.

Keywords — Chosen words and phrases that describe what your Web page is about. These keywords are the actual terms people search for in the search engines that relate to your web site. Once you identify the keywords, they should be placed in the Keywords meta tag.

Link Bait — Content that is posted to a web site with a controversial or inflammatory title or content, that is intended only to draw links and traffic. Most of the time this is used as a derogatory term for content that has no value except to get people angry or excited enough to link to or visit the content.

Link Farm — This is another black hat SEO technique. It involves setting up multiple sites whose main purpose is to contain links to other sites. This technique tries to take advantage of the relative importance search engines place on links. Changes to search engine algorithms have been made to detect and devalue these sort of links, rendering them useless from a ranking perspective.

Meta Tags — Contains data that describes your page to other systems, such as search engines or RSS feed readers. This information about your Web page is invisible to the typical user. Some of the common meta tags from a search engine standpoint include keywords, description, and title tags.
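As a sketch, the common tags mentioned above sit in the head section of a page; all of the values here are invented examples:

```html
<head>
  <title>Fresh Dairy Products from Example Farm</title>
  <!-- Invisible to visitors, but readable by search engines and feed readers. -->
  <meta name="description" content="Example Farm sells fresh milk and cheese from grass-fed cows.">
  <meta name="keywords" content="dairy, milk, cheese, cows">
</head>
```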

PageRank — This is a proprietary measure used by Google to indicate how much authority a page has, based on incoming links (backlinks) from other sites on the Internet. The outwardly-visible PageRank number that Google exposes through its tools no longer has much real-life bearing on rankings. However, it’s still well known and some people mistakenly focus on this number to improve on their search results rankings.

Pay Per Click (PPC) — Sponsored listings on the Search Engine Results Page (SERP). These links appear on a different colored background on Google. They are not actual search results, but paid listings; the search engines are paid every time people click on them. Even so, relevance may still play a part in how high on the page these listings show up. Running some PPC ads can be a good supplement to an SEO campaign.

Redirect — This is a command that a web server can give to a web browser (or search engine) to tell the requestor that the content has been moved. There are different types of redirect, meaning different things such as Moved Temporarily (302) and Moved Permanently (301). When you move content on your site, you need to check with your server administrator to make sure that the old pages are redirected to the new location using a Moved Permanently (301) code.
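On an Apache server, for instance, a Moved Permanently (301) redirect can be set up in an .htaccess file with a single mod_alias directive; both paths here are invented examples:

```
# Permanently (301) redirect the old page to its new location.
Redirect 301 /old-cow-page.html http://www.example.com/cows.html
```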

Robots file — This is an optional file that you include on the root of your web site (in the main domain name, not in a sub-folder). This file contains suggestions to the search engines including which pages you would not like the engines to include in their index, which pages you would like them to index, and the location of your sitemap file. This file is also used to block search engines entirely.
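A small robots.txt sketch; the directory name and sitemap URL are invented examples:

```
# Apply to all spiders.
User-agent: *
# Ask engines not to index this directory.
Disallow: /private/

# Tell engines where the sitemap file lives.
Sitemap: http://www.example.com/sitemap.xml
```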

Search Engine Optimization (SEO) — This refers to the process of making your web site more accessible to search engines. This can include optimizing the text content of your site to include proper keywords, optimizing the code structure of your site itself, and finding ways to attract incoming links to your page.

Search Engine Result Page (SERP) — The page on which the search engine displays the results of a visitor’s search.

Sitemap — This is a file that lists the pages on your site, along with each page’s relative importance. This optional file can help search engines find all of your site’s pages. You would use this file during search engine submission.
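A minimal XML sitemap sketch in the standard sitemaps.org format; the URLs and priority values are invented examples:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/cows.html</loc>
    <priority>0.8</priority>
  </url>
</urlset>
```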

Spider — A spider is a virtual browser program search engines run to crawl through the links on the Internet and compile information about the pages they find to index and rank the content.

Submission — Most search engines have a form you can use or a Web service you can call to submit your website to them. This is nothing more than letting the search engines know that your website is up and active so that they can add it to their list of pages to index. Submission does not guarantee search engine listing or ranking; those factors are decided entirely by each search engine’s ranking algorithms.

About Spiders and Internal Links

What Are Spiders and Internal Links?

Spiders — automatic programs used by Internet search engines to regularly crawl the World Wide Web — look for websites and other content to add to the search engines’ databases. They follow the hyperlinks that connect websites on the Internet.

578px-Many_Spiders

Search Engine Visibility provides several options to help search engine spiders and crawlers navigate your site with ease. Spiders are crippled when they have no links to follow. If your website has broken internal links, or none at all, a spider will likely not get beyond the first page. If your site contains external links, the spider might follow those and leave your site behind.

If, however, all the Web pages on your site are linked to each other with functional hyperlinks, a visiting spider is able to visit every nook and cranny of the site.

Migrating Your cPanel Site from HostGator to LuckyRegister

If you want to migrate your site hosted by HostGator to LuckyRegister, you can use cPanel’s backup features. This will move your website, its databases, and the email accounts you have set up in cPanel.

A quick note about email — if you follow these instructions, your email will continue working like it did before. However, if you change your domain’s nameservers to point to our nameservers, it might stop your email from working.

Overview

Though this process has a number of steps, you can complete them in about an hour. Here’s what to do:

  1. Download all of your backups from HostGator.
  2. Restore those backups at LuckyRegister.
  3. Preview your site on our servers.
  4. Make it live.

It’s pretty easy.

More about our domain hosting here