
Site Tweaking

Website Housekeeping

Posted By Str82u on Thursday 04 February 2010
Cleaning code is good SEO. Can it be bad? After doing terrible things to good domains just to see how it happens, I'm sure there's an "SEO Dossier" with "reseller rights" out there that describes all this in technical detail. For now, all we're working with is experience and the money saved. There's a keyword method we'll share toward the end that complements any housekeeping.

When trying to improve your page rank and placement, you would expect the only adverse effect of cleaning code to be the newness of the page on the server. That's a basic philosophy for our aged HTML. There aren't going to be specific references on this page; if you work with source and haven't experienced something like this, think about it. Humanizing search bots while you read might lend perspective to our conclusions, which influence our intuitive design over basic programming principles.

Reformatting and cleaning the HTML of a 111KB file dropped it to 12KB (short blip, read at the bottom). It was the first time we got a Google top-ten homepage on purpose. What are you going to do now? Start cleaning up source on ALL your sites so you can go to Disney! But WAIT! There's more. Same site, and an enormous in-page JavaScript for making banners slide down the page. We were making money on this until an email from AdSense made us rethink and recode, and that made another improvement. Speed, right? (CTR oddly improved too.)
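The slider rewrite above boils down to getting bulky script out of the page itself. A minimal sketch of the idea, with a made-up file name, assuming the slider code can simply be moved to its own file:

```html
<!-- Before: ~100KB of slider JavaScript pasted into every page,
     sitting between the spider and the content -->

<!-- After: one small reference near the end of the body; the browser
     caches /js/slider.js once and reuses it across the whole site -->
<script type="text/javascript" src="/js/slider.js"></script>
```

The page shrinks by roughly the size of the script, and the spider reaches the real content that much sooner.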

NOW we really had a plan. Remove the slider code from another popular site, reformat the HTML and make money. LOL, really loud! Sadly, this time the result was so negative it was HEART CRUSHING: the site disappeared from hundreds of Google SERPs within the month. DEAD! Yahoo resurrected it later, where it's successful today, but how can removing a large portion of JAVASCRIPT, of all things, cause that?

THERE'S THE POINT: The AdSense code was originally in a div at the bottom of the source, where it was the last thing the spider saw. When the ad script was written to a cell adjoining the "main attraction", the content had changed. Not visibly, but to the search engine's ability to assume what people see. Not going to try and make a theory out of this; we did this. Do whatever it takes to get the visible content as close to the head as possible. If you have an expanding menu, WE BELIEVE it's interpreted as a solid object your users have to scroll through to get to what they came for. If you have inline styles, nested anything or out-of-place scripts directly after your body tag, get rid of them or reposition them with CSS. As for nested tables, too many sites were done in this fashion when padding is all that's necessary.
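A minimal sketch of "content first in the source", assuming a left-hand menu and a right-hand ad block that should still render where they always did. The IDs and widths here are invented for illustration:

```html
<html>
<head>
<style type="text/css">
  /* content flows first in the HTML; menu and ads are pulled
     into place visually, not structurally */
  #content { margin: 0 170px 0 190px; }
  #menu { position: absolute; top: 0; left: 0; width: 180px; }
  #ads  { position: absolute; top: 0; right: 0; width: 160px; }
</style>
</head>
<body>
  <!-- The spider hits the real content first -->
  <div id="content">Main attraction: the text people came for.</div>

  <!-- Menu and ad script come last in the source,
       repositioned by the CSS above -->
  <div id="menu">Expanding menu...</div>
  <div id="ads"><!-- ad code here --></div>
</body>
</html>
```

Visually nothing changes; in the raw source, the content now sits as close to the head as the layout allows.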

Summary: Make sure search engines get meat and potatoes first when they hit your plate; serve the sweet stuff once they fill up. Want an example? On most any page here, the search bar and top of the layout float up from the bottom of the source. View source and you'll see content right away.

Here's that tip for writing the hyperlink. If it's a good phrase, hit the return/enter key so the link text begins against the left margin of the source. It didn't work when we coded a page like that exclusively; it's not natural human/software interaction. Other times we'll break the text of a link so the key PARTS land on the margin.
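To illustrate the line break in the raw HTML (the URL and phrase here are made up), the anchor text starts on a fresh source line so the key phrase sits against the left margin:

```html
<p>For state and county lookups, try the
<a href="http://www.example.com/records.html">
free public records search</a> before paying anywhere.</p>
```

The rendered page reads the same either way; only the position of the phrase in the source changes.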

Keep it Str8!

PS From the top: that site was originally on a big, messy PHP CMS, but it looked good to me and, at the time, it just HAD to go back to .html format. "View Source" showed it had become a really huge file. Eventually it came down to taking out all the redundant references to the database, CSS and all. The total added by automation was about 100KB, averaging around 50KB for most pages. Days after the cleanup, one phrase went from page 10 to page 4, another appeared from nowhere at #4, and believe it or not, another site followed for the same term when it got reformatted. This made more money than my three sons working in a lumber yard. Thanks for sticking with me down here.

Link Quality in SEO for Links Pages

Posted By Str82u on Monday 10 August 2009
So you have a link site (directory is the polite term) and it's update and optimize time. One of the hardest things to avoid when making link or information pages is information overload: trying to impress or deluge the user with more than they need, sacrificing quality just to get that extra link or tool added. Bigger is better, and when linking to other sites, the higher the target's page rank the better. The effect of the score on your page might not seem like much, but if you have 300+ links per page, it adds up. Here's an idea that has improved the quality of our links and can bring up your overall score on that factor. It's easier to do this from the start, but if you're updating now, you can try this.

First, check neighboring pages' ranks if they are close enough to lead the visitor to the right information. Sometimes the location that suits our needs for a subject may not be the best choice SEO-wise. Consider dropping an email to the webmaster or site owner to let them know about the listing; they might be in the middle of optimizing their site and want the link text different, or give you a different target that has a better rank and is being marketed purposely. Simply go to a page rank checker to compare the pages, then use the one with the highest number.

Second, use the page with the best title and content suited to your link. Incorporate the title into your link text as it applies to your keywords, and use the title with the most keywords IF it is relevant to the user. Don't make another sacrifice and alienate people in the process; send them where they need to go as quickly as you can. If the page title reads "Silver Gold and Platinum Jewelry Repair and Sales" and the page is about repair, don't go linking to it with "Gold Jewelry" just because it has a higher page rank than the sales or information pages.

The last thing to touch on is just plain honesty....

Mods and Mistakes Under Development

Posted By Str82u on Tuesday 05 May 2009
Due to an advertiser switch, at the advertiser's level, we have the "opportunity" to open a lot of sites in the editor and do a regular update, adding or fixing some of the things mentioned in our articles (and some that weren't) while swapping out ad code. The way the older sites were designed, built and tweaked originally, there is a lot that needs to be done to some. Keeping notes is about the only way to keep your head screwed on straight if you happen to be a one-man show in a 3-ring circus, and there should be more going on in the background of a site than is visible in the HTML output if you want to understand how your sites are doing. Analytics is a big one, considering you can track advertising and compare server logs against raw traffic. It's safe to say that we are going to tweak something on the inside and outside, and tweaking isn't just the bad, but the good and indifferent. It can cost you money but will teach you something one way or another... painfully at time$. The lesson: don't edit and re-edit content obsessively; it doesn't allow enough time to see results in search engines. If you can't manage that, keep reminders of what you did and refer to them before stumbling in the dark again.

How Many Extras Made The List
  • Added a new company to the Competitive Ad Filter for all sites at AdSense. Check earnings over the next 7 days for reductions. Already showing promise after only half a day!
  • Changed the last phrase on the homepage back to the original. The original version of the header was there; need to figure that out and decide on a previous modification to retry instead.
  • Something about findpeoplefree makes CTR and Conversions higher with half the traffic. Gotta feed people to this site!
  • Code change for Reunion to MyLife search box.
  • Added Analytics banner code on findpeoplefree and freepeoplesearch; also added Analytics code on realfreesearches; changed Str82u.
  • Changed code on index.htm, county jail, prison and courtcaserecord pages.
  • Run form tests. Waiting, waiting... Backups done 051909

Use Google Webmaster Tools to Tweak Sites

Posted By Str82u on Wednesday 04 February 2009
Google Webmaster Tools lets you submit your sitemaps and gives quite a bit of useful data on the pages submitted, maybe even pages that don't exist because of link errors in your site pointing at your own internal files instead of live site pages. Information that may seem trivial but can definitely improve a site includes warnings about duplicate meta tags, unreachable pages, bad links, and how the Google search bot views your site inside and out.
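Submitting a sitemap is the first step there. A minimal sitemap.xml looks like this (the domain and page names are placeholders, not real URLs from these sites):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2009-02-04</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/countyjail.html</loc>
  </url>
</urlset>
```

Once it's submitted, the warnings described below start accumulating against the pages it lists.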

One tool that helps with sites built from a common template is the warning about duplicate meta tags. It's easy to miss replacing certain attributes or writing new meta data for custom pages, so this function makes it easy to do some housekeeping and serves as a reminder for those forgotten changes.
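For example (the descriptions here are invented), the duplicate warning fires when every templated page ships the same line, and clears once each page gets its own:

```html
<!-- Template default copied onto every page: triggers the duplicate warning -->
<meta name="description" content="Free people search and public records links.">

<!-- Rewritten per page: each description unique to its content -->
<meta name="description" content="County jail and prison inmate record searches by state.">
```

The warning list in Webmaster Tools doubles as a checklist of which templated pages still need their own meta data.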

Another feature is called "What Googlebot Sees". It analyzes a site's content and links from other sites and presents a breakdown of the most commonly used words and phrases in your content and outgoing links, plus those of incoming links from other sites, as the search engine sees them. You might find that your "Marching Band" site looks like "High School" to Google. You can make some important modifications from this information; the article Improving A Working Website With Google Webmaster Tools details exactly what is done when a working site is updated using this.
© 2012 I Have You Now, All rights reserved.