
How to Optimize My Website

E Words

SEO Dictionary

Earnings Per Click – Many contextual advertising publishers estimate their potential earnings based on how much they make from each click.
Editorial Link – Search engines count links as votes of quality. They primarily want to count editorial links that were earned over links that were bought or bartered.
Many paid links, such as those from quality directories, still count as signs of votes as long as they are also associated with editorial quality standards. If they are from sites without editorial control, like link farms, they are not likely to help you rank well. Using an algorithm similar to TrustRank, some search engines may place more trust on well known sites with strong editorial guidelines.
Emphasis – An HTML tag used to emphasize text.
Please note that it is more important that copy reads well to humans than any boost you may think you will get by tweaking it for bots. If every occurrence of a keyword on a page is in emphasis that will make the page hard to read, convert poorly, and may look weird to search engines and users alike.
<em>emphasis</em> would appear as emphasis
Entry Page – The page through which a user enters your site.
If you are buying pay per click ads it is important to send visitors to the most appropriate and targeted page associated with the keyword they searched for. If you are doing link building it is important to point links at your most appropriate page when possible such that:
  • if anyone clicks the link they are sent to the most appropriate and relevant page
  • you help search engines understand what the pages on your site are associated with
Ethical SEO – Search engines like to paint SEO services which manipulate their relevancy algorithms as being unethical. Some search marketers lacking in creativity tend to describe services sold by others as being unethical while their own services are ethical. In reality, any particular technique is generally not associated with ethics; it is simply either effective or ineffective.
The only ethics issues associated with SEO are generally business ethics related issues. Two of the bigger frauds are:
  • Not disclosing risks: Some SEOs may use high risk techniques when they are not needed. Some may make that situation even worse by not disclosing potential risks to clients.
  • Taking money & doing nothing: Since selling SEO services has almost no start up costs many of the people selling services may not actually know how to competently provide them. Some shady people claim to be SEOs and bilk money out of unsuspecting small businesses.
As long as the client is aware of potential risks there is nothing unethical about being aggressive.
Everflux – Major search indexes are constantly updating. Google refers to this continuous refresh as everflux.
In the past Google updated their index roughly once a month. Those updates were named Google Dances, but since Google shifted to a constantly updating index Google no longer does what was traditionally called a Google Dance.
See also: Matt Cutts Google Terminology Video – Matt talks about the history of Google Updates and the shift from Google Dances to everflux.
Expert Document – Quality page which links to many non-affiliated topical resources.
External Link – Link which references another domain.
Some people believe in link hoarding, but linking out to other related resources is a good way to help search engines understand what your site is about. If you link out to lots of low quality sites or primarily rely on low quality reciprocal links some search engines may not rank your site very well. Search engines are more likely to trust high quality editorial links (both to and from your site).

D Words


Dayparting – Turning ad campaigns on or off, changing ad bid price, or budget constraints based on bidding more when your target audience is available and less when they are less likely to be available.
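Mechanically, dayparting often comes down to a bid multiplier keyed to the hour of day. A minimal sketch in Python (the hours and multipliers here are hypothetical examples, not recommendations):

```python
from datetime import datetime

def daypart_bid(base_bid, hour):
    """Return a bid adjusted for the hour of day (0-23).

    Hypothetical schedule: bid up during business hours when the
    target audience is assumed to be most active, bid down overnight.
    """
    if 9 <= hour < 17:       # business hours: audience most active
        return base_bid * 1.25
    elif 17 <= hour < 23:    # evening: base bid
        return base_bid * 1.0
    return base_bid * 0.5    # overnight: bid half, or pause entirely

# Example: adjust a $1.00 base bid for the current hour.
current_bid = daypart_bid(1.00, datetime.now().hour)
```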
Dead Link – A link which is no longer functional.
Most large high quality websites have at least a few dead links in them, but the ratio of good links to dead links can be seen as a sign of information quality. See Broken Link.
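That good-to-dead ratio can be estimated by fetching each link with an HTTP client and recording its status code. A small helper to score the recorded codes might look like this (a sketch; in practice you would also treat timeouts and connection errors as dead):

```python
def link_quality_ratio(status_codes):
    """Fraction of links that resolved successfully.

    status_codes: HTTP status codes recorded for each checked link.
    Codes of 400 and above (e.g. 404 Not Found) count as dead.
    """
    if not status_codes:
        return 0.0
    working = sum(1 for code in status_codes if code < 400)
    return working / len(status_codes)
```

A page whose checked links returned [200, 301, 404, 200] would score 0.75.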
Dedicated Server – Server which is limited to serving one website or a small collection of websites owned by a single person.
Dedicated servers tend to be more reliable than shared (or virtual) servers. Dedicated servers usually run from $100 to $500 a month. Virtual servers typically run from $5 to $50 per month.
Deep Link Ratio – The ratio of links pointing to internal pages to overall links pointing at a website.
A high deep link ratio is typically a sign of a legitimate natural link profile.
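The ratio itself is simple arithmetic once each inbound link is classified as pointing at the home page or at an internal page. A rough sketch (the home-page heuristic below is an assumption; real link profiles need more careful URL normalization):

```python
from urllib.parse import urlparse

def deep_link_ratio(inbound_links):
    """Fraction of inbound links that point at internal pages."""
    def is_deep(url):
        # Treat the bare root and common index filenames as home-page
        # links; everything else counts as a deep link.
        path = urlparse(url).path
        return path not in ("", "/", "/index.html", "/index.shtml")

    if not inbound_links:
        return 0.0
    deep = sum(1 for url in inbound_links if is_deep(url))
    return deep / len(inbound_links)
```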
Deep Linking – Linking to a page that is one or more levels removed from the home directory. Deep linking is often desirable to build PageRank to a specific page on a website.
De-Listing – Temporarily or permanently becoming de-indexed from a directory or search engine.
De-indexing may be due to any of the following:
  • Pages on new websites (or sites with limited link authority relative to their size) may be temporarily de-indexed until the search engine does a deep spidering and re-cache of the web.
  • During some updates search engines readjust crawl priorities. You need a significant number of high quality links to get a large website well indexed and keep it well indexed. Duplicate content filters, inbound and outbound link quality, or other information quality related issues may also relate to re-adjusted crawl priorities.
  • Pages which have changed location and are not properly redirected, or pages which are down when a search engine tries to crawl them, may be temporarily de-indexed.
  • Search spam: if a website tripped an automatic spam filter it may return to the search index anywhere from a few days to a few months after the problem has been fixed. If a website is editorially removed by a human you may need to contact the search engine directly to request reinclusion.
Del.icio.us – Popular social bookmarking website.
Demographics – Statistical data or characteristics which define segments of a population.
Some internet marketing platforms, such as AdCenter and AdWords, allow you to target ads at websites or searchers who fit amongst a specific demographic. Some common demographic data points are gender, age, income, education, location, etc.
Denton, Nick – Publisher of Gawker, a popular ring of topical weblogs, which are typically focused on controversy.
See also: Nick Denton.org – official blog, where Nick often talks about business and his various blogs.
Description – Directories and search engines provide a short description near each listing which aims to add context to the title.
High quality directories typically prefer a description that explains what the site is about rather than something that is overtly promotional in nature. Search engines typically:
  • use a description from a trusted directory (such as DMOZ or the Yahoo! Directory) for homepages of sites listed in those directories
  • use the page meta description (especially if it is relevant to the search query and has the words from the search query in it)
  • attempt to extract a description from the page content which is relevant for the particular search query and ranking page (this is called a snippet)
  • or some combination of the above
Description Meta Tag – A meta tag that describes the content of the web page in which it is found. Used by some search engines for keyword density purposes. Also, some search engines will use the description meta tag for the description provided to a user when the page is returned in a listing of search results. It is recommended that you use a couple of your targeted keywords in the description meta tag.
<META NAME="Description" CONTENT="This sentence describes the content on this page.">
Digg – Social news site where users vote on which stories get the most exposure and become the most popular.
Directory – A categorized list of websites that is maintained by human editors instead of crawlers. Yahoo.com is the most widely recognized directory on the web, but there are literally thousands of others.
Domain – The human-friendly address, or URL of a website. When a user types a URL into a web browser, a dedicated computer somewhere on the web known as a Domain Name Server, or DNS translates the URL into a discrete IP address which is then used to find the actual website being requested.
In the URL http://www.roadsidemultimedia.com, roadsidemultimedia.com is the domain.
Domain Name Servers (DNS) – These are special computers that translate human-friendly URLs into computer-friendly IP addresses. This process takes place every time a user requests a page from a website.
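Conceptually, DNS is a lookup from hostname to IP address. The toy resolver below stands in for that lookup (the table is a made-up example; in Python the real lookup is a call such as socket.gethostbyname, which queries actual name servers):

```python
# A made-up lookup table standing in for the distributed DNS system;
# a real resolver queries a chain of name servers instead of a dict.
TOY_DNS = {
    "www.example.com": "93.184.216.34",  # hypothetical mapping
}

def resolve(hostname):
    """Translate a hostname to an IP address, mimicking what
    socket.gethostbyname(hostname) does against real DNS."""
    try:
        return TOY_DNS[hostname]
    except KeyError:
        raise OSError(f"cannot resolve {hostname}")
```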
DMOZ (The Open Directory Project) – The largest human edited directory of websites. DMOZ is owned by AOL, and is primarily run by volunteer editors.
DNS (Domain Name Server or Domain Name System) – A naming scheme mechanism used to help resolve a domain name / host name to a specific TCP/IP Address.
DNS Propagation – Every time a new domain name is registered (or an existing one is transferred to a new DNS), the information about the domain and the DNS that hosts it must make its way around the entire internet. This process usually takes around 24 hours, during which time the domain will be inaccessible to users.
Doorway Pages – Pages designed to rank for highly targeted search queries, usually optimized for a particular search engine and search term, and typically designed to redirect searchers to a page with other advertisements. Multiple doorway pages are often used to help ensure that the same basic content ranks well on several different search engines. The use of doorway pages for this purpose is frowned upon by most larger search engines, including Google.
Some webmasters cloak thousands of doorway pages on trusted domains, and rake in a boatload of cash until they are caught and de-listed. If the page would have a unique purpose outside of search then search engines are generally fine with it, but if the page only exists because search engines exist then search engines are more likely to frown on the behavior.
Dreamweaver – Popular web development and editing software offering a what you see is what you get interface.
See also: Dreamweaver official site.
Duplicate Content – Two or more separate web pages that contain substantially the same content are said to contain duplicate content.
Google and other top search engines have set up filters to detect duplicate content when their crawlers are active on the web. When pages containing duplicate content are detected, they are often assessed a duplicate content penalty which means a lowering of the page’s ranking from what it would have received naturally.
Dynamic Content (dynamic pages) – Web pages that are often generated from database information based upon queries initiated by users. Dynamic pages often include the ? character in the URL.
The URLs of dynamic pages often use these extensions: .asp, .cfm, or .cgi. Most search engines don’t index dynamic content very well (or at all), though Google has recently been doing a better job of indexing it.
Dynamic IP Address – An IP address that changes every time a computer logs on to the internet. See also Static IP Address.
Dynamic Languages – Programming languages such as PHP or ASP which build web pages on the fly upon request.

C Words


Cache – Copy of a web page stored by a search engine. When you search the web you are not actively searching the whole web, but are searching files in the search engine index. Some search engines provide links to cached versions of pages in their search results, and allow you to strip some of the formatting from cached copies of pages.
Calacanis, Jason – Founder of Weblogs, Inc. Also pushed AOL to turn Netscape into a Digg clone. Jason’s blog, Calacanis.com
Canonical URL – Many content management systems are configured with errors which cause duplicate or exceptionally similar content to get indexed under multiple URLs. Many webmasters use inconsistent link structures throughout their site that cause the exact same content to get indexed under multiple URLs. The canonical version of any URL is the single most authoritative version indexed by major search engines. Search engines typically use PageRank or a similar measure to determine which version of a URL is the canonical URL.
Webmasters should use consistent linking structures throughout their sites to ensure that they funnel the maximum amount of PageRank at the URLs they want indexed. When linking to the root level of a site or a folder index it is best to end the link location at a / instead of placing the index.html or default.asp filename in the URL.
Examples of URLs which may contain the same information in spite of being at different web addresses:
  • http://www.seobook.com
  • http://www.seobook.com/index.shtml
  • http://seobook.com/
  • http://seobook.com/index.shtml
  • http://www.seobook.com/?tracking-code
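The duplicate URLs above can be collapsed programmatically. A simplified sketch of the kind of normalization involved (real search engines also weigh link equity and redirects when choosing the canonical version):

```python
from urllib.parse import urlparse

def canonicalize(url):
    """Collapse common URL variants onto one canonical form:
    force the www. host, strip index filenames, and drop
    query strings such as ?tracking-code."""
    parts = urlparse(url)
    host = parts.netloc.lower()
    if not host.startswith("www."):
        host = "www." + host
    path = parts.path
    for index_file in ("/index.shtml", "/index.html", "/default.asp"):
        if path.endswith(index_file):
            path = path[: -len(index_file)] + "/"
    if path == "":
        path = "/"
    return f"http://{host}{path}"  # query string is intentionally dropped
```

All five example URLs above normalize to http://www.seobook.com/ under this scheme.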
Catch All Listing – A listing used by pay per click search engines to monetize long tail terms that are not yet targeted by marketers. This technique may be valuable if you have very competitive keywords, but is not ideal since most major search engines have editorial guidelines that prevent bulk untargeted advertising, and most of the places that allow catch all listings have low traffic quality. Catch all listings may be an attractive idea on theme specific search engines and directories, though, as those clicks are already pre-qualified.
Click-Through – The action of clicking on a link to visit a web page.
Click-Through-Rate (CTR) – The number of times a link is clicked on divided by the number of times that same link is displayed (called an impression).
Example: a link is displayed 20 times (20 impressions) and clicked on 2 times.
The CTR is 10% (2/20 = 0.10).
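The calculation is just clicks divided by impressions; expressed as a tiny helper:

```python
def click_through_rate(clicks, impressions):
    """CTR as a fraction: clicks divided by impressions."""
    if impressions == 0:
        return 0.0
    return clicks / impressions

# 2 clicks on 20 impressions -> 0.10, i.e. a 10% CTR
ctr = click_through_rate(2, 20)
```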
Cloaking – Serving one version of a page to a human visitor and a different version of the same page to the search engines. This is usually done to fool the search engines into giving the page a higher rank than it would normally receive while making sure the human visitor sees a useful and attractive page.
Note: Cloaking is discouraged by most major search engines, including Google.
Co-citation – In topical authority based search algorithms links which appear near one another on a page may be deemed to be related to one another. In algorithms like latent semantic indexing words which appear near one another often are frequently deemed to be related.
CGI – Common Gateway Interface – interface software between a web server and other machines or software running on that server. Many cgi programs are used to add interactivity to a web site.
Client – A program, computer, or process which makes information requests to another computer, process, or program.
Cloaking – Displaying different content to search engines and searchers. Depending on the intent of the display discrepancy and the strength of the brand of the person / company cloaking it may be considered reasonable or it may get a site banned from a search engine.
Cloaking has many legitimate uses which are within search guidelines. For example, changing user experience based on location is common on many popular websites.
Cluetrain Manifesto, The – Book about how the web is a marketplace, and how it is different from traditional offline business.
Clustering – In search results the listings from any individual site are typically limited to a certain number and grouped together to make the search results appear neat and organized and to ensure diversity amongst the top ranked results. Clustering can also refer to a technique which allows search engines to group hubs and authorities on a specific topic together to further enhance their value by showing their relationships.
Comment Tags – Used in a web page’s HTML source code to indicate certain information about a section of the page code. Some search engines will consider keywords contained in comment tags for keyword density purposes, others (including Google) will not.
Comments – Many blogs and other content management systems allow readers to leave user feedback.
Leaving enlightening and thoughtful comments on someone else’s related website is one way to help get them to notice you.
Compacted Information – Information which is generally and widely associated with a product. For example, most published books have an ISBN.
As the number of product databases online increases and duplicate content filters are forced to get more aggressive the keys to getting your information indexed are to have a site with enough authority to be considered the most important document on that topic, or to have enough non compacted information (for example, user reviews) on your product level pages to make them be seen as unique documents.
CMS – Content Management System. Tool used to help make it easy to update and add information to a website.
Blog software programs are some of the most popular content management systems currently used on the web. Many content management systems have errors associated with them which make it hard for search engines to index content due to issues such as duplicate content.
Concept Search – A search which attempts to conceptually match results with the query, matching the underlying concept rather than the exact words.
For example, if a search engine understands a phrase to be related to another word or phrase it may return results relevant to that other word or phrase even if the words you searched for are not directly associated with a result. In addition, some search engines will place various types of vertical search results at the top of the search results based on implied query related intent or prior search patterns by you or other searchers.
Conceptual Links – Links which search engines attempt to understand beyond just the words in them. Some rather advanced search engines attempt to understand the concept a link conveys rather than just matching its text to a specific word set. Some search algorithms may even look at co-citation and words near the link instead of just focusing on anchor text.
Content – The information located on a web page. This includes text, images, and any other types of information that a webmaster places on the page.
Contextual Advertising – Advertising programs which generate relevant advertisements based on the content of a webpage.
Conversion – Many forms of online advertising are easy to track. A conversion is reached when a desired goal is completed.
Most offline ads have generally been much harder to track than online ads. Some marketers use custom phone numbers or coupon codes to tie offline activity to online marketing.
Here are a few common examples of desired goals:
  • a product sale
  • completing a lead form
  • a phone call
  • capturing an email
  • filling out a survey
  • getting a person to pay attention to you
  • getting feedback
  • having a site visitor share your website with a friend
  • having a site visitor link at your site
Bid management, affiliate tracking, and analytics programs make it easy to track conversion sources.
Cookie – Small data file written to a user’s local machine to track them. Cookies are used to help websites customize your user experience and help affiliate program managers track conversions.
Copyright – The legal rights to publish and reproduce a particular piece of work.
Counter – A script that counts the number of hits, unique visitors, and/or page views that a web page (or an entire site) receives. These stats provide very useful information for the webmaster.
CPA – Cost per action. Many forms of online advertising have their effectiveness measured on a cost per action basis. Many affiliate marketing programs and contextual ads are structured on a cost per action basis. An action may be anything from an ad click, to filling out a lead form, to buying a product.
CPC – Cost per click. Many search ads and contextually targeted ads are sold in auctions where the advertiser is charged a certain price per click. Examples of this include:
  • Google AdWords – Google’s pay per click ad program which allows you to buy search and contextual ads.
  • Google AdSense – Google’s contextual ad program.
  • Microsoft AdCenter – Microsoft’s pay per click ad platform.
  • Yahoo! Search Marketing – Yahoo!’s pay per click ad platform.
CPM – Cost per thousand ad impressions. Many people use CPM as a measure of how profitable a website is or has the potential of becoming.
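CPM math runs in both directions: what an ad buy costs at a given rate, and the effective CPM (eCPM) a page earns. A sketch of both:

```python
def cpm_cost(impressions, cpm_rate):
    """Cost of buying `impressions` ad views at `cpm_rate`
    dollars per thousand impressions."""
    return (impressions / 1000) * cpm_rate

def effective_cpm(revenue, impressions):
    """eCPM: revenue a page earned per thousand impressions,
    a common yardstick for how profitable a website is."""
    if impressions == 0:
        return 0.0
    return (revenue / impressions) * 1000
```

For example, 50,000 impressions at a $2.00 CPM cost $100, and a page earning $25 over 10,000 impressions has a $2.50 eCPM.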
Crawl Depth – How deeply a website is crawled and indexed. Since longer search queries tend to be more targeted, it is important to try to get most or all of a site indexed such that the deeper pages have the ability to rank for relevant long tail keywords. A large site needs adequate link equity to get deeply indexed. Another thing which may prevent a site from being fully indexed is duplicate content issues.
Crawl Frequency – How frequently a website is crawled. Sites which are well trusted or frequently updated may be crawled more frequently than sites with low trust scores and limited link authority. Sites with highly artificial link authority scores (ie: mostly low quality spammy links) or sites which are heavy in duplicate content or near duplicate content (such as affiliate feed sites) may be crawled less frequently than sites with unique content which are well integrated into the web.
Crawler – A program used by search engines to crawl the web by following links from page to page. This is how most search engines find the web pages that they place in their index. Also referred to as a spider or robot.
Crawling The Web – Search engines use crawlers to move from web page to web page by following the links on the pages. The pages found are then ranked using an algorithm and indexed into the search engine database.
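The crawl loop described above is essentially a breadth-first traversal of the link graph. A minimal sketch (here `get_links` stands in for fetching a page and extracting its links; a real crawler issues HTTP requests, parses HTML, and respects robots.txt):

```python
from collections import deque

def crawl(start_url, get_links, max_pages=100):
    """Follow links page to page, breadth first, returning the
    URLs in the order they were discovered."""
    seen = {start_url}
    queue = deque([start_url])
    discovered = []
    while queue and len(discovered) < max_pages:
        url = queue.popleft()
        discovered.append(url)
        for link in get_links(url):
            if link not in seen:   # never revisit a page
                seen.add(link)
                queue.append(link)
    return discovered
```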
Cross Linking – This is where the owner of two or more websites interlink the sites in order to boost their search engine rankings. If detected, cross linking often results in a search engine penalty.
CSS – Cascading Style Sheets is a method for adding styles to web documents.
Note: Using external CSS files makes it easy to change the design of many pages by editing a single file.
CTR – Clickthrough rate – the percentage of people who click on an advertisement they viewed, which is a way to measure how relevant a traffic source or keyword is. Search ads typically have a higher clickthrough rate than traditional banner ads because they are highly relevant to implied searcher demand.
Cutts, Matt – Google’s head of search quality.
Cybersquatting – Registering domains related to other trademarks or brands in an attempt to cash in on the value created by said trademark or brand.

B Words


Bad Neighborhood – A web page that has been penalized by a search engine (most notably Google) for using shady SEO tactics, such as hidden text or link farms.
Backlinks – Also referred to as Inbound Links. Links from another web site to your web site. Most search engines provide an easy way to get a list of all of the backlinks to a specific page.
Bait and Switch – Marketing technique where you make something look overtly pure or as though it has another purpose to get people to believe in it or vote for it (by linking at it or sharing it with friends), then switch the intent or purpose of the website after you gain authority.
It is generally easier to get links to informational websites than commercial sites. Some new sites might gain authority much quicker if they tried looking noncommercial and gaining influence before trying to monetize their market position.
Banner Blindness – During the first web boom many businesses were based on eyeballs more than actually building real value. Many ads were typically quite irrelevant and web users learned to ignore the most common ad types.
In many ways text ads are successful because they are more relevant and look more like content, but with the recent surge in the popularity of text ads some have speculated that in time people may eventually become text ad blind as well.
Battelle, John – Popular search and media blogger who co-founded The Industry Standard and Wired, and authored a popular book on search called The Search.
Behavioral Targeting – Ad targeting based on recent past behavior and/or implied intent. For example, if I recently searched for mortgages, then later read a book review, that page may still show me mortgage ads.
Bias – A prejudice based on experiences or a particular worldview.
Any media channel, publishing format, organization, or person is biased by:
  • how and why they were created and their own experiences
  • the current set of social standards in which they exist
  • other markets they operate in
  • the need for self preservation
  • how they interface with the world around them
  • their capital, knowledge, status, or technological advantages and limitations
Search engines aim to be relevant to users, but they also need to be profitable. Since search engines sell commercial ads some of the largest search engines may bias their organic search results toward informational (ie: non-commercial) websites. Some search engines are also biased toward information which has been published online for a great deal of time and is heavily cited.
Search personalization biases our search results based on our own media consumption and searching habits.
Large news organizations tend to aim for widely acceptable neutrality rather than objectivity. Some of the most popular individual web authors / publishers tend to be quite biased in nature. Rather than bias hurting one’s exposure:
  • The known / learned bias of a specific author may make their news more appealing than news from an organization that aimed to seem arbitrarily neutral.
  • I believe biased channels most likely have a larger readership than unbiased channels.
  • Most people prefer to subscribe to media which matches their own biases and worldview.
  • If more people read what you write and passionately agree with it then they are more likely to link at it.
  • Things which are biased in nature are typically easier to cite than things which are unbiased.
Bid Management Software – (see Automated Bid Management Software)
Black Hat SEO – Search engines set up guidelines that help them extract billions of dollars of ad revenue from the work of publishers and the attention of searchers. Within that highly profitable framework search engines consider certain marketing techniques deceptive in nature, and label them as black hat SEO. Those which are considered within their guidelines are called white hat SEO techniques. The search guidelines are not a static set of rules, and things that may be considered legitimate one day may be considered deceptive the next.
Search engines are not without flaws in their business models, but there is nothing immoral or illegal about testing search algorithms to understand how search engines work.
People who have extensively tested search algorithms are probably more competent and more knowledgeable search marketers than those who give themselves the arbitrary label of white hat SEOs while calling others black hat SEOs.
When making large investments in processes that are not entirely clear trust is important. Rather than looking for reasons to not work with an SEO it is best to look for signs of trust in a person you would like to work with.
Block Level Analysis – A method used to break a page down into multiple points on the web graph by breaking its pages down into smaller blocks.
Block level link analysis can be used to help determine if content is page specific or part of a navigational system. It also can help determine if a link is a natural editorial link, what other links that link should be associated with, and/or if it is an advertisement. Search engines generally do not want to count advertisements as votes.
Blog – A periodically updated journal, typically formatted in reverse chronological order. Many blogs not only archive and categorize information, but also provide a feed and allow simple user interaction like leaving comments on the posts.
Most blogs tend to be personal in nature. Blogs are generally quite authoritative with heavy link equity because they give people a reason to frequently come back to their site, read their content, and link to whatever they think is interesting.
Blog Comment Spam – Either manually or automatically (via a software program) adding low value or no value comments to other sites.
Automated blog spam:
Nice post!
Discreat Overnight Viagra Online Canadian Pharmacy Free Shipping
Manual blog spam:
I just wrote about this on my site. I don’t know you, but I thought I would add no value to your site other than linking through to mine. Check it out!!!!!
cluebag manual spammer (usually with keywords as my name)
As time passes both manual and automated blog comment spam systems are evolving to look more like legitimate comments. I have seen some automated blog comment spam systems that have multiple fake personas that converse with one another.
Blogger – Blogger is a free blog platform owned by Google.
It allows you to publish sites on a subdomain off of Blogspot.com, or to FTP content to your own domain. If you are serious about building a brand or making money online you should publish your content to your own domain because it can be hard to reclaim a website’s link equity and age related trust if you have built years of link equity into a subdomain on someone else’s website.
Blogger is probably the easiest blogging software tool to use, but it lacks some features present in other blog platforms.
Blogroll – Link list on a blog, usually linking to other blogs owned by the same company or friends of that blogger.
Bold – A way to make words appear in a bolder font. Words that appear in a bolder font are more likely to be read by humans that are scanning a page. A search engine may also place slightly greater weighting on these words than on regular text, but if you write natural page copy and a word or phrase appears on a page many times, it probably does not make sense or look natural to bold every occurrence.
Example use:
  • <b>words</b>
  • <strong>words</strong>
Either would appear as words set in a bold font.
Bookmarks – Most browsers come with the ability to bookmark your favorite pages. Many web based services have also been created to allow you to bookmark and share your favorite resources. The popularity of a document (as measured in terms of link equity, number of bookmarks, or usage data) is a signal for the quality of the information.
Boolean Search – Many search engines allow you to perform searches that use Boolean operators such as AND, OR, or NOT. By default most search engines include AND with your query, requiring results to be relevant for all the words in your query.
  • A Google search for SEO Book will return results for SEO AND Book.
  • A Google search for “SEO Book” will return results for the phrase SEO Book.
  • A Google search for SEO Book -Jorge will return results containing SEO AND Book but NOT Jorge.
  • A Google search for ~SEO -SEO will find results with words related to SEO that do not contain SEO.
Some search engines also allow you to search for other unique patterns or filtering ideas. Examples:
  • A numerical range: 12..18 would search for numbers between 12 and 18.
  • Recently updated: seo {frsh=100} would find recently updated documents. MSN search also lets you place more weight on local documents.
  • Related documents: related:www.threadwatch.org would find documents related to Threadwatch.
  • Filetype: AdWords filetype:PDF would search for PDFs that mentioned AdWords.
  • Domain Extension: SEO inurl:.edu
  • IP Address: IP:
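As a rough illustration of how AND/NOT style operators narrow a result set, here is a minimal sketch in Python. The document list and the `boolean_search` helper are hypothetical; this is not how any search engine is actually implemented.

```python
# Minimal sketch of boolean query filtering over a set of documents.

def boolean_search(documents, required=(), excluded=()):
    """Return documents containing all `required` words and
    none of the `excluded` words (case-insensitive)."""
    results = []
    for doc in documents:
        words = set(doc.lower().split())
        has_required = all(w.lower() in words for w in required)
        has_excluded = any(w.lower() in words for w in excluded)
        if has_required and not has_excluded:
            results.append(doc)
    return results

docs = [
    "SEO Book review",
    "SEO tips and tricks",
    "Book club meeting",
]

# "SEO Book" behaves like SEO AND Book:
print(boolean_search(docs, required=["SEO", "Book"]))
# "SEO -Book" requires SEO but excludes Book:
print(boolean_search(docs, required=["SEO"], excluded=["Book"]))
```

Real engines also stem words, rank by relevance, and handle phrase matching, but the core set logic of AND and NOT is the same.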
Brand – The emotional response associated with your company and/or products.
A brand is built through controlling customer expectations and the social interactions between customers. Building a brand is what allows you to move away from commodity based pricing and move toward higher margin value based pricing.
Branded Keywords – Keywords or keyword phrases associated with a brand. Typically branded keywords occur late in the buying cycle, and are some of the highest value and highest converting keywords.
Some affiliate marketing programs prevent affiliates from bidding on the core brand related keywords, while others actively encourage it. Either way can work depending on your business model and marketing savvy, but it is important to ensure there is synergy between internal marketing and affiliate marketing programs.
Breadcrumb Navigation – Navigational technique used to help search engines and website users understand the relationship between pages.
Example breadcrumb navigation:
Home > SEO Tools > SEO for Firefox
Whatever page the user is on is unlinked, while the pages above it are linked, organized starting with the home page and continuing down through the site structure.
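The linking rule described above can be sketched as a small template helper; the `breadcrumbs` function and the example page trail are made up for illustration.

```python
# Sketch: build breadcrumb markup where every ancestor page is linked
# and the current (last) page is left as plain text.

def breadcrumbs(trail):
    """trail: list of (title, url) pairs from the home page
    down to the current page."""
    parts = ['<a href="%s">%s</a>' % (url, title)
             for title, url in trail[:-1]]
    parts.append(trail[-1][0])  # current page stays unlinked
    return " > ".join(parts)

trail = [
    ("Home", "/"),
    ("SEO Tools", "/tools/"),
    ("SEO for Firefox", "/tools/seo-for-firefox/"),
]
print(breadcrumbs(trail))
```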
Brin, Sergey – Co-founder of Google.
Broken Link – A link that no longer takes the user to the destination page when it is clicked on. This is usually the result of the destination page having been renamed or deleted from the server. Also referred to as a Dead Link.
Browser – Client used to view the world wide web.
The most popular browsers are Microsoft’s Internet Explorer, Mozilla’s Firefox, Safari, and Opera.

A Words

SEO Dictionary

Above the Fold – A term traditionally used to describe the top portion of a newspaper. In email or web marketing it means the area of content viewable prior to scrolling. Some people also define above the fold as an ad location at the very top of the screen, but due to banner blindness typical ad locations do not perform as well as ads that are well integrated into content. If ads look like content they typically perform much better.
Absolute Link – A link which shows the full URL of the page being linked to. Some links only show a relative link path instead of the entire URL within the a href tag. Due to canonicalization and hijacking related issues it is typically preferable to use absolute links rather than relative links.
Example absolute link:
<a href="http://www.yoursite.com/folder/file.html">File Title</a>
Example relative link:
<a href="../folder/file.html">File Title</a>
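Python's standard library can illustrate how a browser or crawler resolves a relative link against the URL of the page it appears on:

```python
# Resolving a relative link against the page's absolute URL.
from urllib.parse import urljoin

page = "http://www.yoursite.com/folder/file.html"
resolved = urljoin(page, "../folder/file.html")
print(resolved)  # the relative path resolves back to an absolute URL
```

Because the relative form depends on where it is resolved from, a page served at more than one URL can yield different resolved links, which is the canonicalization risk mentioned above.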
AdCenter – Microsoft’s cost per click ad network.
While it has a few cool features (including dayparting and demographic based bidding) it is still quite nascent compared to Google AdWords. Due to Microsoft’s limited marketshare and the program’s newness many terms are vastly underpriced and present a great arbitrage opportunity.
AdSense – Google’s contextual advertising network. Publishers large and small may automatically publish relevant advertisements near their content and share the profits from those ad clicks with Google.
AdSense offers a highly scalable automated ad revenue stream which will help some publishers establish a baseline for the value of their ad inventory. In many cases AdSense will be underpriced, but that is the trade off for automating ad sales.
AdSense ad formats include:
  • text
  • graphic
  • animated graphics
  • videos
AdWords – Google’s advertisement and link auction network. Most of Google’s ads are keyword targeted and sold on a cost per click basis in an auction which factors in ad clickthrough rate as well as max bid. Google is looking into expanding their ad network to include video ads, demographic targeting, affiliate ads, radio ads, and traditional print ads.
Affiliate Marketing – Affiliate marketing programs allow merchants to expand their market reach and mindshare by paying independent agents on a cost per action (CPA) basis. Affiliates only get paid if visitors complete an action.
Most affiliates make next to nothing because they are not aggressive marketers, have no real focus, waste money on instant wealth programs that lead them to buy a bunch of unneeded garbage via others’ affiliate links, and do not attempt to create any real value.
Some power affiliates make hundreds of thousands or millions of dollars per year because they are heavily focused on automation and/or tap large traffic streams. Typically niche affiliate sites make more per unit effort than overtly broad ones because they are easier to focus (and thus have a higher conversion rate).
Selling a conversion is typically harder than selling a click (as AdSense does, for instance). Search engines are increasingly looking to remove the noise that low quality thin affiliate sites add to the search results through the use of:
  • algorithms which detect thin affiliate sites and duplicate content;
  • manual review; and,
  • implementation of landing page quality scores on their paid ads.
Age – Some social networks or search systems may take site age, page age, user account age, and related historical data into account when determining how much to trust that person, website, or document. Some specialty search engines, like blog search engines, may also boost the relevancy of new documents.
Fresh content which is also cited on many other channels (like related blogs) will temporarily rank better than you might expect because many of the other channels which cite the content will cite it off their home page or a well trusted high PageRank page. After those sites publish more content and the reference page falls into their archives those links are typically from pages which do not have as much link authority as their home pages.
Some search engines may also try to classify sites to understand what type of sites they are, such as news sites or reference sites that do not need to be updated that often. They may also look at individual pages and try to classify them based on how frequently they change.
AJAX – Asynchronous JavaScript and XML is a technique which allows a web page to request additional data from a server without requiring a new page to load.
Alexa – Amazon.com owned search service which measures website traffic.
Alexa is heavily biased toward sites that focus on marketing and webmaster communities. While not highly accurate, it is free.
Algorithm – A complex mathematical formula used by a search engine to rank the web pages that it finds by crawling the web.
AllTheWeb – Search engine which was created by Fast, then bought by Overture, which was bought by Yahoo. Yahoo may use AllTheWeb as a test bed for new search technologies and features.
ALT Tags – The ALT description is displayed in place of the image if the user is browsing with image display turned off. Microsoft Internet Explorer improperly displays a short text description of an image when you hover your mouse over it, thus mimicking the Title Tag.
Image ALT tags are useful to your page’s visitors. Equally important, they can help with your search engine rankings by increasing the keyword density (if you use your keywords in your ALT tags).
<img src="company-logo.jpg" width="200" height="80" alt="Photo of company logo" />
AltaVista – Search engine bought out by Overture prior to Overture being bought by Yahoo. AltaVista was an early powerhouse in search, but on October 25, 1999 they did a major algorithmic update which caused them to dump many websites. Ultimately that update and brand mismanagement drove them toward irrelevance and a loss of mindshare and marketshare.
Amazon.com – The largest internet retailing website. Amazon.com is rich in consumer generated media. Amazon also owns a number of other popular websites, including IMDB and Alexa.
Analytics – Software which allows you to track your page views, user paths, and conversion statistics based upon interpreting your log files or through including a JavaScript tracking code on your site.
Anchor Text – The text that a user would click on to follow a link. In the case the link is an image the image alt attribute may act in the place of anchor text.
Search engines assume that your page is authoritative for the words that people include in links pointing at your site. When links occur naturally they typically have a wide array of anchor text combinations. Too much similar anchor text may be considered a sign of manipulation, and thus discounted or filtered. For links you control, make sure you mix up your anchor text.
Example of anchor text:
<a href="http://www.yoursite.com/folder/file.html">File Title</a>
Outside of your core brand terms if you are targeting Google you probably do not want any more than 10% to 20% of your anchor text to be the same.
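A quick way to audit that 10% to 20% guideline is to tally the anchor text of the links you control; this is a minimal sketch, and the link list below is hypothetical.

```python
# Sketch: compute the anchor text distribution of a set of links.
from collections import Counter

anchors = [
    "seo book", "seo book", "seo book",
    "great seo resource", "Aaron's site", "click here",
]

counts = Counter(a.lower() for a in anchors)
total = len(anchors)
for text, n in counts.most_common():
    share = 100.0 * n / total
    print("%-20s %5.1f%%" % (text, share))
# If one non-brand phrase is far above ~10-20%, mix in more variations.
```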
AOL – Popular web portal which merged with Time Warner.
Apache Web Server – The web server software that is most used on the internet today.
API – Application Program Interface – a series of conventions or routines used to access software functions. Most major search products have an API program.
Arbitrage – Exploiting market inefficiencies by buying and reselling a commodity for a profit. As it relates to the search market, many thin content sites laced with an Overture feed or AdSense ads buy traffic from the major search engines and hope to send some percent of that traffic clicking out on a higher priced ad. Shopping search engines generally draw most of their traffic through arbitrage.
ASP – Active Server Pages – a dynamic Microsoft programming language.
Ask – Ask is a search engine owned by InterActive Corp. They were originally named Ask Jeeves, but they dumped Jeeves in early 2006. Their search engine is powered by the Teoma search technology, which is largely reliant upon Kleinberg’s concept of hubs and authorities.
Authority – The ability of a page or domain to rank well in search engines. Five large factors associated with site and page authority are link equity, site age, traffic trends, site history, and publishing unique original quality content.
Search engines constantly tweak their algorithms to try to balance relevancy algorithms based on topical authority and overall authority across the entire web. Sites may be considered topical authorities or general authorities. For example, Wikipedia and DMOZ are considered broad general authority sites. This site is a topical authority on SEO, but not a broad general authority.
Authorities – Topical authorities are sites which are well trusted and well cited by experts within their topical community. A topical authority is a page which is referenced from many topical experts and hub sites. A topical hub is a page which references many authorities.
Example potential topical authorities:
  • the largest brands in your field
  • the top blogger talking about your subject
  • the Wikipedia or DMOZ page about your topic
Automated Bid Management Software – Pay per click search engines are growing increasingly complex in their offerings. To help large advertisers cope with the increasing sophistication and complexity of these offerings some search engines and third party software developers have created software which makes it easier to control your ad spend. Some of the more advanced tools can integrate with your analytics programs and help you focus on conversion, ROI, and earnings elasticity instead of just looking at cost per click.

How PPC Works

PPC – How does it work?

Pay-Per-Click (PPC) or sponsored listings are used to help establish a website on a search engine’s result pages when the regular generic listings are not providing top positioning. By paying a PPC provider a fee it is possible to have a listing within the top few results given by a search engine. For this reason, PPC is a good solution while a website is still building its relationship with these search engines, which can be a long term project. But like everything, there is more to the process than just paying a fee.


If you have a bottomless budget then the issues of PPC are virtually eliminated. Of course, this is seldom the case. Let’s use Google as our default search engine in this discussion. When you search on Google for “LA Dentist” you are going to see over 2 million competing webpages. This would be a good candidate for PPC. But it is not the whole answer. Currently, the phrase “LA Dentist” will cost you $11.78 per click, with an average of 33 people a day clicking on the PPC ad. You would have to keep a budget of almost $400 a day, or $12,000 a month, to catch all of these clicks. Add to this the fact that most people are going to want multiple phrases advertised, and you quickly need a budget in the millions of dollars. This is simply not possible for most businesses. So the solution is to lower the budget to a more realistic amount. This causes another issue, though: competition.
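The budget arithmetic above can be spelled out directly; the cost per click and click volume are the figures quoted in the text, not live data.

```python
# Working through the "LA Dentist" budget example.
cost_per_click = 11.78   # quoted cost per click
clicks_per_day = 33      # quoted average clicks per day

daily_spend = cost_per_click * clicks_per_day
monthly_spend = daily_spend * 30

print("daily:   $%.2f" % daily_spend)    # just under $400
print("monthly: $%.2f" % monthly_spend)  # roughly $12,000
```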


There are dozens of advertisers bidding on the PPC phrase “LA Dentist” and Google is only going to display a few of these at any given time. Who is eliminated when not everyone can be displayed? Google looks at several factors:
  • Budget: Google is looking to make as big a profit as possible, so it favors the ads that bring in the most money.
  • Budget exhaustion: when the higher paying PPC clients exhaust their budgets they are dropped from the list and the lower budget advertisements get a chance.
  • Relevance: Google will choose to display ads that have been clicked on more often. To Google, clicks prove relevance. If one ad is picked far more often than another, then it must truly be about the searched-for subject.
  • Randomization: relying only on the factors above, a newly added PPC ad with a smaller budget would never appear, especially when there are dozens or more deep-pocketed PPC clients to compete with. In LA you can expect many, many dozens of competing clients. So a small percentage of ads are displayed randomly, or once in a while, to give them a chance at an occasional click through. Once they are clicked a few times, the relevance factor above promotes their rate of display.
  • History: the longer an ad has been running, the greater the relevance granted.

PPC Fraud:

It is a sad fact that a large share of clicks are fraudulent; most studies put click fraud at greater than 50%. When an advertiser’s budget is used up, their ads no longer appear. Many people who first start using PPC learn how all this works and will click on their competitors’ ads just to knock them out of the daily mix, or just to cost them some money for a quick laugh. These are sad facts. However, all of the PPC providers are working zealously to improve this issue.

Monthly Budgets:

Once the ads start being displayed there is another issue: overspending. You would think that if you set a monthly budget then you would stay within that monthly budget. Again, we find more issues. Even though these PPC providers allow for monthly budgeting, what they truly offer is a daily click budget, and it is not a very smart budget at that. Let us assume a monthly budget of $30 for Google. This equates to $1 per day. If a click costs $0.80 then you would think that only another 20 cents worth of clicks was possible. Instead, Google will allow any click to occur, no matter what its cost, because the budget is still open. If another 80 cent click comes in, the budget is then met and the account is treated as done for the day. We have seen budgets exceeded by 300% for the month because of this budget weirdness. The provider promises roughly a 20% accuracy rate for budgeting, but getting a refund requires a long phone call, proof that the budget was not changed, and further arguments, because how do you prove that you did not change the budget and… you get the point. We are going to be stuck with this overage. What we do to avoid it is monitor all of the accounts for all of our clients all month long and turn them off when they are exceeded. Of course this causes the ads not to be shown again until the next month. Currently we are only using Google and Overture. MSN will be added to the mix very soon, requiring yet another account to watch. Very troubling.
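The daily budget behavior described above can be sketched as a short simulation. The `spend_day` helper is hypothetical, but it reproduces the $1 budget, 80-cent click scenario from the text: a click is allowed whenever any budget remains, even if its cost pushes spend past the cap.

```python
# Sketch: a naive daily click budget that can overshoot.

def spend_day(daily_budget, click_costs):
    """Accept clicks in order until the budget is met or exceeded."""
    spent = 0.0
    for cost in click_costs:
        if spent >= daily_budget:   # budget already met: stop serving
            break
        spent += cost               # click allowed; may overshoot
    return spent

# $1/day budget, 80-cent clicks: the second click is still allowed,
# so the day ends 60% over budget.
print(spend_day(1.00, [0.80, 0.80, 0.80]))
```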


Even tracking clicks is complicated. You would think that you could look at the server logs of a site and know how many clicks came from a PPC ad. Not the case. These providers also allow personal sites to display PPC ads alongside the generic listings. Even when using the customized JavaScript provided by these PPC engines, many of the PPC referrals are not traceable.


When a client starts a PPC campaign they usually expect to see their advertisement right away, and they want to see that ad every time they search for every phrase they want supported, even for phrases that were not discussed. The fact of the matter is: it is not going to happen. It might take a few thousand searches before the randomizer kicks in and displays the ad for the first time. The second time might not be any better until the ad is clicked on. Dozens of clicks might be needed, depending on how entrenched your competition is.
PPC can be a very successful tool. In order for this to be the case we need a decent budget and enough time to get the needed relationship between the PPC provider and the site. A month or two is normal, but not always the case. It all depends on the site and its competition.

Google Toolbar PageRank

Does it mean anything or is it only for Entertainment Purposes?

Last week, two statements about Google’s PageRank started a new discussion about this topic.
The first statement was made in the Search Engine Watch forums. It was from a person who received the following answer to a PageRank question from a Google employee:
“The PageRank that is displayed in the Google Toolbar is for entertainment purposes only.
Due to repeated attempts by hackers to access this data, Google updates the PageRank data very infrequently because it is not secure. On average, the PR that is displayed in the Google Toolbar is several months old.
If the toolbar is showing a PR of zero, this is because the user is visiting a new URL that hasn’t been updated in the last update.
The PR that is displayed by the Google Toolbar is not the same PR that is used to rank the webpage results, so there is no need to be concerned if your PR is displayed as zero.
If a site is showing up in the search results, it does not have a real PR of zero; the Toolbar is just out of date.”
In another forum, a person with the name GoogleGuy, who is believed to be a Google employee, made the following statement:
“I’d strongly disagree with the statement that the toolbar PageRank is for ‘entertainment purposes only’; millions of toolbar users use the PageRank display to judge the quality of pages.
I think it’s also a little irresponsible to quote John Galt claiming to talk to some random person at Google, and then for you to quote it as a response from Google, which makes it sound more official. I’m happy to refute that this is any sort of official stance.”
GoogleGuy didn’t say that the comments in the first statement are wrong. He said that toolbar users use the PageRank display to judge the quality of web pages. He did not say that Google uses the number of the PageRank toolbar to rank web pages.
The PageRank feature is also no longer mentioned in the official Google toolbar tour.
Matt Cutts of Google made the statement as quoted here: “As far as the toolbar PageRank, I definitely wouldn’t expect to see it in the next few days. Probably not even in the next couple weeks, if I had to guess.” What is interesting is that it has already been several months since we have seen a ranking update for any websites. The author of that particular article went on to say, “The response is overwhelmingly unanimous to remove PageRank altogether. Some members are just totally overwhelmed by the amount of PageRank updates; they want it to cease existing so that they don’t have to hear about it anymore.” It would be sad, though, to see the tool disappear, as there is no other resource quite like it. Is it better to have misunderstood information that is seldom updated, or nothing at all? I personally cannot decide.

What does this mean to you?

Of course, PageRank is important to get good rankings on Google. However, the PageRank number that is displayed in the Google Toolbar and the green PageRank bar are not important at all for good Google rankings.
It’s very likely that the PageRank number that can be seen in the Google Toolbar is mainly a marketing instrument for Google that doesn’t have much effect on the search results. Google seems to use an internal PageRank value for its ranking algorithm and a public PageRank value for the toolbar.
That would explain why many web pages with a high ranking on Google have a low (toolbar) PageRank.
When you want to trade links with another web site, don’t look at the PageRank of that site. Instead, ask yourself: Is the web site related to your site? Would it make sense for web surfers if they linked to you and you linked to them? Could visitors of the other web site be interested in your site? If you find a web site that you would want to visit or your visitors would want to visit then link to it and ask for a link back to your site.
Just use common sense. If you like a page, chances are that other people also like that page, no matter what PageRank the Google toolbar displays.

Call to Action

SEO, like every other industry, is full of buzz words. In advertising, “Call to Action” is going to be one of the most popular and most important buzz words to consider and even design your website for.
Most of the websites concerned with “Call to Action” are focused on selling products, but any website should consider how it gets a visitor’s attention; why else be online than to get people’s attention?
There are misconceptions to avoid, though. For instance, how many times have you been inundated with popups, music, or in-your-face obnoxiousness, only to have to close the site or your browser to escape? Popups can be done tastefully, but the prevalence of popup-blocking technology may be reason enough not to use them.
If you are selling products then your best option is the “Add to Cart” link next to the product. Commonly used related phrases are “doing words” such as “Click here”, “Buy Now”, “Enter Now” or “Click to download.”
The page title tag, the text that appears at the very top of the browser window, is undoubtedly the most important area to consider, as it also appears above the description on the search engine result pages. The description meta tag is also important, as is good wording in the Open Directory (DMOZ.org) listing, which is often used instead of the description meta tag. The page description is what appears just below the title in the search engine result page listings.
In all cases, no matter the nature of the website, let your message make the sale. If your site pages are well written then you will be shown to be an authority and the visitor will want to contact you for more information or to purchase your services or product. Remember to make sure that every page on your website includes either your contact information or a link to it. Visitors must be able to contact you easily or else they will move on to your competition.
The worst thing that you can do is drive someone from your site before they can contact you.
That said, it should be noted that obnoxious spam techniques do work for many, but do you really want to have one of those kinds of sites?


Learning from Reports
… and Adjusting Accordingly

Now that your website has been built and initially optimized, the next stage of SEO can begin: re-optimization.

This website has explained the importance of reports and that optimization never ends. What does this mean? Depending on what you are optimizing and who you are trying to market to the answer is going to be different. Here is an example of an optimization job that I oversaw a few years ago and what I learned from it.

This particular client had performed dental work for a well known Playboy Bunny, Jenny McCarthy. He then took the initiative to work out a deal with Ms. McCarthy to film her praising his work. Granted, not everyone is going to have this type of material to work with. Stay with me here, though, as I learned a lot from what happened next.

We started by loading the videos onto YouTube with the text of the dialog entered into the description and using the other optimization options available. We also wanted the client’s website to benefit so we linked to it accordingly from YouTube. These were good moves.

Click on the graph image above. Jenny McCarthy has brought in 19,310 views to these videos thus far. That is a very decent amount of traffic for a dental testimonial.

There were things that did nothing for us, though. For instance, I wanted the website to benefit from search traffic from these videos, so the videos were also placed on the website. Nothing ever came of this, as any Google juice was given to YouTube and none applied to the website. The site’s server logs reported only a handful of hits from the name Jenny McCarthy. Bummer. We could have used different text for these videos on the site, or not placed them on YouTube, but I would argue that what we did worked well; sometimes one choice is not better, but simply provides different benefits.

In addition, the videos could have contained more “Call to Action” to redirect people to the doctor’s website. This is a touchy point. The client did benefit from our optimization work, but could we have done more? I am not certain, since too much “Call to Action” becomes spam, and people are quick to tune out obvious and/or overdone advertising. The trick here is to apply different tactics and see what happens. When too much of a tactic is applied, the reports will show it and you can back off some. What works or does not work now may produce different results at another time, so keep trying and adjusting, and even retry things that may not have shown benefit earlier.

Another excellent lesson from this particular effort was the viral marketing aspect. People subscribed to these videos. People left comments. We learned that the largest demographic was males between the ages of 45 and 54, and that the entire world likes viewing Jenny McCarthy, although the USA was definitely the largest supplier of viewers. We also see from this graph that even though the videos have been online since November 2007, people are still viewing them two and a half years later at a steady rate. Above we noted that these videos have been watched over 19 thousand times thus far. In this graph we see that those views came from 3,805 visitors. People liked these videos enough that they re-watched them many times.

These are just a few observations from one job. You get the point though; SEO requires CONTINUED effort and experimentation and adjustment. If you mess up then learn from it and move on. Also understand that what works for one client may not be possible to apply for another; in this case Jenny McCarthy is probably not available for you and your advertising, but you can do something similar… or not…

How to Optimize my WebSite

It is presumed that whoever is performing the optimization of the website:
  • Is familiar with the type of programming involved with the site.
  • Has the required tools and know-how necessary to tweak the coding.
  • Has access to the original site files or database, and is authorized to upload files and/or make modifications to the web server.
That said, let us continue.
There are multiple steps to optimizing your website, which include:
  • Choosing good phrases
  • Minimizing the code to text ratio
  • Writing good copy
  • Tweaking the content
Easy enough, right? So here we go…

Optimize Your WebSite with Good Phrases

Everything within optimization balances on the fact you are working with good phrases. You need keywords that accurately describe your site and the pages within.

Finding phrases

After you have a sizable list you now need to whittle away at it until you are left with marketable keywords. Here you need to keep two things in mind:
Do people search for these phrases?
And what kind of competition are you up against?
If you can only acquire position 380 for a phrase then no one is going to find your site. If you work with a phrase that is not as often searched for but gets you on the top 3 pages of a search engine, then you will get results for your work, albeit maybe not what you are hoping for. This will improve as your website gains in its ranking.
There used to be several free online tools for researching and brainstorming good keywords, but sadly this has changed. Overture used to provide a free tool titled “Search Term Suggestion Tool” for finding out how many people searched for a particular phrase in the last 30 days. These were searches that Overture managed, so the number represented only a portion of the searches made, but it gave a good basis to start with. Unfortunately this tool is no longer available. WordTracker and KeywordDiscovery are two of the more popular current services, but they come at a price. Note that their base of data is limited, so their results are also limited. But until the search engines start sharing this data, services like these are what is available.

Filtering Your Phrase List

Search frequency is currently hard to get good data on. The above mentioned tools, and even Alexa, can help here. Knowing the popularity of a keyword phrase is helpful, but it is not the most important aspect… especially when first optimizing a website.
Now go to your favorite search engines and type in your phrase. How many results are returned? This is not a perfect test either, since not all of the pages returned have been optimized. Some engines allow you to filter out pages that do not have the phrase in the title, description meta tag, or linking text. By putting quotes around the phrase you can filter out pages where the words do not appear next to each other. Tricks like this give you a better idea of what kind of competition you are up against.
Remember, the more competition you have the fewer phrases your site will be able to support. But again, as your site gains in ranking it will support more phrases. If you continue to add content to your site it will be able to support more phrases too. A site should never sit stagnant. Let it grow.

Minimize the Code to Text Ratio

If your site pages were built with a program such as a word processor or a user friendly web page tool then you may have a lot of redundant and unneeded markup code. Become familiar with style sheets as these can cut your document size in half or more.
If your pages are heavy with JavaScript, place the code into a separate file and reduce the bulk to a one line callout. This can be extremely helpful if the same code is used on multiple pages.
Can you format the page with good CSS positioning so that the precious text is closer to the top?
Reduce the useless bulk of indents. You can save a good 10% or more in your file size here.
Make it easy for the search engines to find your precious content. It should also be noted that a page with coding errors within it may stop the search engine from seeing the content below the error. Validate your work.
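The code to text ratio discussed above can be measured roughly with the standard library; this is a minimal sketch, and real auditing tools differ in what they count as "text".

```python
# Sketch: estimate what fraction of a page's bytes is visible text.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects only the text content of a page, ignoring markup."""
    def __init__(self):
        super().__init__()
        self.chunks = []
    def handle_data(self, data):
        self.chunks.append(data)

def text_ratio(html):
    parser = TextExtractor()
    parser.feed(html)
    visible = "".join(parser.chunks).strip()
    return len(visible) / len(html)

page = "<html><body><h1>Title</h1><p>Some readable copy.</p></body></html>"
print("text ratio: %.0f%%" % (100 * text_ratio(page)))
```

Moving styles and scripts into external files raises this ratio without changing what the visitor sees.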

Write Good Copy

Do not overuse a phrase. Your keywords should be near the top of the page, in the body, and in the conclusion. If you can use similar words or different forms of a word, then do. It is a common misconception that you should market plural and singular forms of a phrase, or misspellings. The search engines now filter this out and even recommend the correct spelling to the user. Badly worded or misspelled content gives the appearance of being unprofessional.
Use a good amount of text. This is tough to balance, since a potential client is not going to want to read through a thousand words of text to find what they are after, but the search engines like to see text, and lots of it. Place the main points of a topic near the top of a page and use anchor text and links to help the visitor find the detailed information below. I used this technique at the top of this page.
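The anchor-link technique described above can be sketched like this (the section names and `id` values are hypothetical examples):

```html
<!-- Summary links near the top of the page jump to detail sections below. -->
<ul>
  <li><a href="#pricing">Our Pricing</a></li>
  <li><a href="#testimonials">Client Testimonials</a></li>
</ul>

<!-- Further down the page, matching ids mark the detailed sections. -->
<h2 id="pricing">Our Pricing</h2>
<p>Full pricing details go here...</p>

<h2 id="testimonials">Client Testimonials</h2>
<p>Client quotes go here...</p>
```

Visitors get a quick overview and can jump straight to what they want, while the engines still see the full text of the page.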

Tweaking the Content

This is where you can get yourself into trouble. If you over-optimize the website you will be committing a form of spam. But some tweaking is not only accepted, it is expected and beneficial. Just do not overdo it, and never use every possible option at once. DO NOT OVER-OPTIMIZE.
Here are some of the more commonly used search engine friendly tricks:
  • Page Title: A well written page will cover one subject, use it in the head/title to describe the page.
  • Page Description: Some of the search engines look at the head/meta-description for a quick summary of the page and then use it, or part of it, on their result pages. Because this tag has been abused, the engines increasingly rely on their own tools to generate the description line, but it is still worth using.
  • Bold, Capitalization & Headings: Use the H3 and H4 tags to start a section or page. Also use the bold tag, but very sparingly. Capitalization can also help in places, but do NOT overuse it. NOTE that H1 tags are losing their power because of overuse; substitute with H2 and lower.
  • Alt Attributes: I have only found these to be useful within a picture link. Place your phrase in them, but again, use with care.
  • File Names: This is also a good place to use your phrases. As always, a page file name should correspond with the subject of that page. File names are also important for reflecting the contents of a graphic, since search engines cannot actually see a picture.
  • Directory Names: Along with file names, you can organize your pages within well-described folders or directories, such as before-and-afters, testimonials, articles, or whatever applies to your site. NOTE that you do not want to stuff keywords into these directory names or nest directories many levels deep; keep it real.
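The list above can be pulled together in one sketch. The subject phrase, file names and tags here are hypothetical examples, not a template to copy verbatim:

```html
<!-- Hypothetical page for the phrase "handmade oak furniture" -->
<!-- Descriptive path: /handmade-oak-furniture/dining-tables.html -->
<head>
  <title>Handmade Oak Dining Tables</title>
  <meta name="description"
        content="Solid oak dining tables, handmade to order in our workshop.">
</head>
<body>
  <h2>Handmade Oak Dining Tables</h2>
  <p>Each <strong>oak dining table</strong> is built by hand from solid timber.</p>
  <!-- Descriptive file name and alt text inside a picture link -->
  <a href="gallery.html">
    <img src="handmade-oak-dining-table.jpg" alt="handmade oak dining table">
  </a>
</body>
```

Note that the phrase appears once each in the title, description, heading, bold text, file name and alt text, and nowhere is it repeated excessively. That restraint is the point.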

Optimize your website pages to properly reflect what your site is about. Do not overdo it. The search engines look for proper emphasis within the code when ranking a site, but if the engines think you are trying to trick them you will only hurt yourself and the ranking value of your site.

What to Expect from Search Engine Optimization

There are many misunderstandings of what Search Engine Optimization can accomplish for a website. Here is what you can and can NOT expect:

What Can Be Expected

  • Organic Optimization allows the search engines to examine your site, nothing more.
  • The search engines will analyze the content and determine what the subject of each page is about and the site as a whole.
  • When a search is made your site pages will be compared with billions of other pages and the pages that the search engine believes to best represent the search will be shown first.
  • The search engines are stupid automated programs, nothing more.
  • What ranks high today may not rank well tomorrow. Be prepared for site positioning to change regularly and radically.
  • Sites that have been around for years will be more likely to rank higher.
  • If your site pages are about a competitive subject, you may be competing with millions of other pages, many of which are probably also optimized, and many of which are using spamming techniques that, for now, give them unwarranted positioning.
  • If you do NOT perform Search Engine Optimization, your site is unlikely to ever be found.
  • Every search engine ranks with a different set of ever-changing rules. Good results in one search engine do NOT mean good results in other engines.
  • If the search engines do not yet know your site exists, you can expect up to 3 or 4 months before they will show any results.
  • If successfully used, you can expect traffic.
  • If your site is eye-catching and informative, you can expect new clients and prospective sales.
  • As with all advertising approaches, being in the right place at the right time applies.

What Search Engine Optimization Will NOT Do

  • SEO will NOT give you guaranteed results.
  • Visitors will probably not start flocking to your site in great numbers for at least several months, if ever.
  • You can NOT market hundreds of phrases. Plan on fewer than a dozen phrases, or maybe one competitive keyword phrase.
  • This is not a one-time process. Your site will need continued updating, analysis and even severe rewriting. Markets change and you will need to change with them.
  • Search Engine Optimization is not an easy answer. It requires hard and continued work.
  • Real SEO will NOT come cheaply. The $49-a-month guarantees are worthless. Expect to pay about $2000 down and $400 a month thereafter, on the conservative side.

In Conclusion

Search Engine Optimization is important. Depending on your needs and revenue, you may not be able to have this professionally done. There are no magical or mysterious techniques. You can do this yourself, but plan on a lot of work ahead of you.

Optimize My Website

Welcome to Optimize My Website .com. It is my goal to provide the information necessary so that you can understand the Search Engine Optimization (SEO) industry and then either optimize your own site or knowledgeably find someone who can perform this service for you.

SEO is more than a one-time, do-it-and-forget-about-it service. SEO requires continued updates, tweaking and loving care. The industry is ever changing, and what worked yesterday for good search engine positioning may not work today.

Please check back often. We strive to provide up-to-date optimization techniques that are endorsed by the search engines themselves, and we try to keep our article pages current with important happenings. There is also a dictionary section if you come across a term that you are not familiar with.

We hope that you find this resource useful.
