Posts

Do I need to build a Sitemap for my Website?

What is a sitemap? Human-readable sitemaps are static HTML files that outline the first- and second-level structure of a Web site. The original purpose of a site map was to let users easily find items on the Web site, but over time it also became useful as a shortcut to help search engines find and index all the parts of a site. Now we also have the XML sitemap, which effectively provides an easy-to-read link dump for the spiders to index. Some Web browsers can display an XML sitemap for users to read as well. You should offer both kinds of sitemap (HTML and XML) if you want to be sure to cover both the search engines and your users.

What is a sitemap used for? Building a website can be a long, tedious process, made more complicated by the huge amount of information that has to be organized before it goes into your website. Some designers will start by creating wireframes and mockups, but for the rest of us it is a lot easier to just build a sitemap.
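For reference, here is a minimal sketch of what an XML sitemap looks like, following the sitemaps.org protocol; the URLs and dates are placeholder values:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.myorg.com/</loc>
        <lastmod>2017-01-01</lastmod>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>http://www.myorg.com/about.html</loc>
        <lastmod>2017-01-01</lastmod>
      </url>
    </urlset>

Only the loc element is required for each URL; lastmod, changefreq, and priority are optional hints to the crawler.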

Do I need a robots.txt File?

The robots.txt file: Website owners use the robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol. Here is why it gets used: a robot wants to visit a Web site URL, say http://www.myorg.com/index/. But before it does so, it first checks for http://www.myorg.com/robots.txt, and finds that it contains the following:

    User-agent: Googlebot
    Allow: /
    Allow: /images/
    Allow: /css/
    Allow: /js/
    Disallow: /admin/
    Disallow: /cgi-bin/
    Disallow: /includes/
    Disallow: /processors/
    Disallow: /skins/

    User-agent: bingbot
    Allow: /
    Allow: /images/
    Allow: /css/
    Allow: /js/
    Disallow: /admin/
    Disallow: /cgi-bin/
    Disallow: /includes/
    Disallow: /processors/
    Disallow: /skins/

    User-agent: *
    Allow: /
    Allow: /images/
    Allow: /css/
    Allow: /js/
    Disallow: /admin/
    Disallow: /cgi-bin/
    Disallow: /includes/
    Disallow: /processors/
    Disallow: /skins/

    Sitemap: http://www.myorg.com/sitemap.xml

Could My Domain Be Blacklisted?

What is blacklisting? In Internet terminology, a blacklist is a generic name for a list of e-mail addresses or IP addresses that originate with known spammers. Individuals and enterprises can use blacklists to filter out unwanted e-mail, as most e-mail applications today have filtering capabilities. Network administrators and users alike employ blacklists to block entities that are likely to cause problems. The problem entities could be malware networks, spammers, hackers, DoS (denial-of-service) attackers, or abusive site or forum users, among a plethora of other possibilities. Application blacklisting prevents the execution of undesirable programs, including applications that are known to contain security threats or vulnerabilities and those that are deemed inappropriate for a given business environment. Hackers and IT security researchers sometimes use blacklists differently, seeking interaction with blacklisted entities to provide information. How does a blacklist work?
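As one concrete illustration of the mechanism: many e-mail blacklists are published as DNS-based blackhole lists (DNSBLs). To check whether an IP address is listed, you reverse its octets and look the result up as a hostname under the list's zone. A minimal sketch, assuming the widely used Spamhaus zen zone and the documentation address 192.0.2.1:

    dig +short 1.2.0.192.zen.spamhaus.org

An answer in the 127.0.0.x range means the address is listed; no answer (NXDOMAIN) means it is not.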

Do You Know the Robots META Tag?

Proper robots meta tag usage: There is a special HTML meta tag that tells robots not to index the content of a page, and/or not to scan it for links to follow. Example usage: <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW"> There are two important considerations when using the robots meta tag. First, robots can ignore your meta tag; in particular, malware robots that scan the web for security vulnerabilities, and the e-mail address harvesters used by spammers, will pay no attention to it. Second, the NOFOLLOW directive only applies to links on this page; it is entirely possible that a robot will find the same links on some other page without a NOFOLLOW (maybe on some other site), and so still arrive at your undesired page. How to write a robots meta tag. Where to put it: like any meta tag, it should be placed in the HEAD section of an HTML page. You should put it in every page on your site, because a robot can encounter a deep link to any page on your site. What to put into it: the sketch below shows both the placement and the content.
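A minimal sketch of a page that asks compliant robots to neither index it nor follow its links; the title and body are placeholders:

    <!DOCTYPE html>
    <html>
    <head>
      <title>Private page</title>
      <!-- ask compliant robots not to index this page or follow its links -->
      <meta name="robots" content="noindex, nofollow">
    </head>
    <body>
      ...page content...
    </body>
    </html>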

The Two Major Types of SEO

There are two major types of search engine optimization (SEO). On-page SEO: On-page is the primary form of SEO. This includes making sure your website uses all the right HTML tags and structured data to correctly mark up your content (see the sketch after this excerpt), and that its code is correctly formed and validates. It covers everything from the words in your content to the code that forms the page. Think of it this way: the key points of your page are known as keywords, and you have to use those keywords in your text in order for people to find them. There is also something called local SEO, where you identify your business for searches in your own area. And lastly there is the type of design you use: is it responsive? There are many more factors that need to be looked at, so hiring a professional can save you a lot of time. Off-page SEO: OK, so what does off-page SEO mean? Off-page SEO is basically about building proper links to your site. You do not have a lot of control over it, but you have some ability to influence it.
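To make the structured data point concrete, here is a minimal sketch of schema.org markup in JSON-LD form; the business name, address, and phone number are placeholder values:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "LocalBusiness",
      "name": "Example Bakery",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Springfield"
      },
      "telephone": "+1-555-0100"
    }
    </script>

Markup like this also identifies your business and its location to the search engines, which supports the local SEO factor mentioned above.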

Death of the Meta Tag Keywords

What exactly are meta keywords? The keywords meta tag is a tag you could use in the HTML code of your site. In theory, you would use this tag to provide the search engines with more information about what your page is about. The search engines would then read the keywords in the tag, and if a keyword did not also appear in your title or header tags, they would suspect you were spamming the search engine. The keywords meta tag quickly became a place where someone could stuff often-irrelevant keywords without typical visitors ever seeing them. Because the keywords meta tag was so often abused, Google began disregarding it many years ago. Google took action first: in 2009, the search engine officially announced that it does not use the meta keywords tag as a ranking factor. Google has since removed another related feature from Google Search Console: the content keywords report, one of the earliest features found in Search Console when it was first built. So the real answer is: Google ignores the keywords meta tag.
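For reference, this is the tag in question, with placeholder keywords; it does no harm, but per the 2009 announcement above it earns no ranking benefit either:

    <meta name="keywords" content="sitemap, robots.txt, seo, meta tags">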

Google favors the use of Secure Sockets Layer (SSL) on your server over non-secure websites

The main benefit of Hypertext Transfer Protocol Secure (HTTPS) is that it gives security to users on the pages where they share personal data with you. It’s great to have on your entire website, but when a user shares precious info, like credit card details, HTTPS adds extra layers of protection. From the Google Webmaster Central Blog, I quote the following: "For these reasons, over the past few months we’ve been running tests taking into account whether sites use secure, encrypted connections as a signal in our search ranking algorithms. We've seen positive results, so we're starting to use HTTPS as a ranking signal. For now it's only a very lightweight signal - affecting fewer than 1% of global queries, and carrying less weight than other signals such as high-quality content - while we give webmasters time to switch to HTTPS. But over time, we may decide to strengthen it, because we’d like to encourage all website owners to switch from HTTP to HTTPS to keep everyone safe on the web."
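If you do move to HTTPS, a common step is redirecting all HTTP traffic permanently to the secure version. A minimal sketch, assuming an Apache server with mod_rewrite enabled (nginx and other servers have their own equivalents):

    # .htaccess: send every HTTP request to the HTTPS version of the same URL
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]

The R=301 flag marks the redirect as permanent, which is what search engines expect when a site moves for good.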