
Use Google Webmaster Tools to Tweak Sites

Posted By Str82u on Wednesday 04 February 2009
Google Webmaster Tools lets you submit your sitemaps and gives quite a bit of useful data on the pages you submit, sometimes even on pages that don't exist because broken links inside your site point to internal files instead of live pages. The information may seem trivial, but it can definitely improve a site: warnings about duplicate meta tags, unreachable pages, and bad links, plus a look at how the Google search bot views your site inside and out.

One tool that helps with sites built from a common template is the warning about duplicate meta tags. It's easy to miss replacing certain attributes or to skip writing new meta data for custom pages, so this function makes housekeeping easy and serves as a reminder for those forgotten changes.
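If you'd rather catch those duplicates yourself before Google flags them, here's a minimal Python sketch. It assumes your pages sit in local files you can read; the parser and helper names are our own, not anything from Webmaster Tools.

```python
from html.parser import HTMLParser
from collections import defaultdict

class MetaExtractor(HTMLParser):
    """Pull the <title> text and meta description out of one HTML page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "description":
                self.description = a.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def find_duplicates(pages):
    """pages: dict of filename -> HTML source. Returns {title: [files]} for repeats."""
    by_title = defaultdict(list)
    for name, source in pages.items():
        parser = MetaExtractor()
        parser.feed(source)
        by_title[parser.title.strip()].append(name)
    return {title: names for title, names in by_title.items() if len(names) > 1}
```

Run it over every template-generated page and anything that comes back is a page you forgot to retitle.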

Another feature is called "What Googlebot Sees". It analyzes a site's content and its links from other sites, then presents a breakdown of the most commonly used words and phrases in the content, in outgoing links, and in incoming links from other sites, as the search engine sees them. You might find that your "Marching Band" site looks like a "High School" site to Google. You can make some important modifications from this information; the article Improving A Working Website With Google Webmaster Tools details exactly what is done when a working site is updated this way.

Removing annoying code brings numbers up

Posted By I Have You Admin on Friday 23 January 2009
This is an update to "Taking out code that may be alienating Google", which is finally complete, or so it's thought. Using a few simple time-saving helpers like CuteHTML (no longer updated), updating pages can be a little less painful.

Page visits are up from an average of 915 uniques per day to 1,585, and time on page has increased by 25 seconds for "Direct Traffic", with search engine traffic right behind. There are slightly higher positions for one keyphrase and wider distribution. These numbers come from Google Analytics, and the effect on Google the search engine is only slight. Yahoo, which lists the site well, has nearly doubled its traffic, and time on site has gone up slightly.

First, always remember to make backups before making changes, and on a regular basis if possible, even if your host or server does it at the data center. Backups on RAID systems might be as little as 5 minutes apart, and what are you going to do with a backup that's 5 minutes old after 6 minutes? You're stuck with the changes. Keep backups in your pocket if you can: a thumb drive is cheap, and if your site isn't a terabyte monster, all your files travel with you.
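A timestamped zip before every editing session is cheap insurance. Here's one way it might look in Python; the function name and folder layout are just an illustration, adjust to taste.

```python
import os
import zipfile
from datetime import datetime

def backup_site(site_dir, dest_dir):
    """Zip every file under site_dir into a timestamped archive in dest_dir."""
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    archive = os.path.join(dest_dir, f"site-backup-{stamp}.zip")
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(site_dir):
            for name in files:
                path = os.path.join(root, name)
                # Store paths relative to the site root so the zip restores cleanly
                zf.write(path, os.path.relpath(path, site_dir))
    return archive
```

Point dest_dir at the thumb drive and the backup travels with you.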

Most text editors and web authoring programs have "Find" and "Find and Replace" features, maybe named something different, but valuable, especially if they can be used on multiple pages or an entire website at once. A good example of needing this was when the only copy of a site was on the Web Archive (THANK YOU!). The archive modifies the source to preserve certain aspects of the pages, so bits of its code need to be removed. The repetitive parts added to links can be removed in one swift blow, usually page by page, but much faster than locating each instance per edit, and the plain "Find" feature is enough to locate the parts that the program, or YOU, missed.
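When the editor's Find and Replace isn't enough, a few lines of Python can do the same sweep. This sketch strips the Wayback Machine's link prefix from a page's source; the timestamp in the comment is made up for illustration, and the exact rewriting the archive does can vary, so check a page or two by hand first.

```python
import re

# The Wayback Machine rewrites links like
#   http://web.archive.org/web/20050101000000/http://example.com/page.html
# and we want the original URL back. (That timestamp is an example, not real.)
ARCHIVE_PREFIX = re.compile(r"https?://web\.archive\.org/web/\d+/")

def strip_archive_links(html):
    """Remove the Wayback Machine prefix from every link in one page's source."""
    return ARCHIVE_PREFIX.sub("", html)
```

Loop it over every recovered file and the repetitive junk comes out in one swift blow.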

Don't mess around while using FTP in Windows instead of a client: if your online and offline folders look identical and you're being a wise-apple to the guy in the next room, you could be overwriting updates instead of improving your SEO. And not understanding why a file isn't updating in the browser, even after being "uploaded" several times, is what the real pros do, all the time.

The site targeted in this update uses the .shtml extension, only because server-side includes were the simplest way to pull shared files into web pages at the time it was built. Updating the included files updates every page that includes them, but does so without changing the "Last Modified" date and time on the server. Another trick is using a rotating banner script that cycles through several messages on a regular basis, just to keep pages looking fresher to search spiders and robots.
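The rotating banner idea can be as simple as a scheduled job that rewrites the included banner file with a new message. Here's one possible sketch; the messages, file layout, and function names are hypothetical, not the script the site actually uses.

```python
import random

# Example messages only; a real site would have its own rotation.
BANNERS = [
    "Free shipping this week!",
    "New people-search tools added.",
    "Read our latest SEO notes.",
]

def pick_banner(seed=None):
    """Choose one banner message; a cron job could call this on a schedule."""
    rng = random.Random(seed)
    return rng.choice(BANNERS)

def write_banner(path, seed=None):
    """Rewrite the file that the .shtml pages include, so every page changes at once."""
    with open(path, "w") as f:
        f.write(f'<div class="banner">{pick_banner(seed)}</div>\n')
```

Every page that includes the banner file picks up the new message on the next request, no per-page edits needed.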

SEO outgoing links on directory pages

Posted By Str82u on Wednesday 14 January 2009
"Large" at this moment means between 300 and 500 links on one page without much content to water it down. Part of page rank that you can't buy in a box is good content. Directory pages are there to give visitors a large choice of RELEVANT options, and while it is tempting to slap paying affiliate links together to form content, that isn't necessarily going to get past Google, Yahoo, or Live.

So the lesson is going to target "free people search" or "people search" or both; might as well knock both out, since there's a big difference in traffic with the free version but on the page it's only a matter of deleting the extra word in the search form.

Using a keyword suggestion tool like SEO Book, you can determine the best-performing, or most-searched, phrases. For the page being created, "Alabama People Search" is the most desirable in theory, but is it really what people type into the search engines when they want to find a person in Alabama? It's actually not, but it does contain the phrase "people search". Adding the word "Alabama" might make the link less valuable, but for the content to make sense, it's necessary.

When linking to pages, each should be checked, not just to avoid your competition, but to see if the page really serves the purpose. Don't make a copy of Google's search results; not only will they detect it, but there are going to be others doing the same thing.

Check meta titles for the keyword or keyphrase, and check other pages that may seem similar to see if their titles fit better.

Check PR for multiple pages: if the choice is between two equal titles, go with the page that has the higher Google PageRank.

Use other words in or around the link to give the site visitor a clearer idea of the destination of that link.
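The title check above is easy to automate. A minimal sketch, assuming you've fetched each candidate page's HTML yourself (the function name is ours, and a regex is a rough stand-in for a real parser):

```python
import re

def title_matches(html, phrase):
    """True if the page's <title> contains the keyphrase, ignoring case."""
    m = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    return bool(m) and phrase.lower() in m.group(1).lower()
```

Run it against every outgoing candidate and you know at a glance which pages actually carry the keyphrase in their title.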

Taking out code that may be alienating Google

Posted By Str82u on Tuesday 13 January 2009
An associate mentioned a site going from having hundreds of pages listed by Google to almost nothing... Among the suspects was a sliding banner script for making ads, or anything else, slide down long pages. Of course, there are only a few reasons for doing this and you know what? IT WORKS. Then a similar problem turned up with a different site, one that needed the ad positioned in a DIV container because the original page was designed without the ad and the old script defined the ad's starting position.

Why would the code be offensive? Some online advertisers don't want their ads displayed in ways that could be considered overbearing.

Maybe the search engine recognizes the type of script and associates it with pushy advertising.

The most important part could be shedding the weight of the script.

From the visitor's point of view, when some of these scripts start running, they add load time to the page and can make the ad popping up obvious. Some even make the page unusable while waiting, not to mention that some scripts jerk. It was fun while it lasted; time to clean it up, keeping in mind the solution must be a DIV to position the ad properly. Modifying the table just isn't worth it, and DIVs are much lighter anyhow.

Here's just a quick note on the file size savings of a page with loads of links: it started out at 78,516 bytes and ended at 76,885, a saving of 1,631 bytes, about 1.6 KB. Remember, every inch counts.

The DIV code will probably end up in another post or article; time's up.
© 2012 I Have You Now, All rights reserved.