The Startup Review focuses on case studies of successful online businesses. It makes for some interesting reading, especially if you run an online business or are considering starting one.
Companies profiled in case studies so far include Craig’s List, Advertising.com, Newegg, Rent.com, Flickr, Linkshare, Myspace, Zappos.com, Rotten Tomatoes, and Homegain. The case studies look at things like why the businesses are being profiled, their key success factors, launch strategies, exit analysis, a “food for thought” section, and reference articles about the businesses.
These are pretty nice, thoughtful looks at online businesses, and the factors that have brought them success.
One of the case study titles caught my eye – Rotten Tomatoes Case Study: SEO drives traffic growth. Most of the information in the post about Rotten Tomatoes’ actual SEO strategies is included in a comment to the post from the in-house team that worked on SEO for Rotten Tomatoes. Here is a brief summary:
I had the good fortune to be able to meet Jim Hedger at the San Jose SES a little over a week ago. While we didn’t have the opportunity to talk at great length, it was nice to meet him. I’ve been reading his blog posts and articles for a few years now. I really enjoyed one of his latest.
On the Tuesday during the four day conference, I ran into Jill Whalen, who had just finished an interview with someone outside of the press room in the conference hall. It was good to be able to say hi, though I caught Jill going to another interview. Seems like she had a pretty full day of interviews. One of them was with Jim – Jill Whalen Interviewed at SES San Jose. Jill makes some pretty astute observations. Definitely worth a read.
Jill talks about the growth and maturation of the Search Marketing Industry, a larger focus on in-house SEO, more women in the search sector, the importance of educating clients, and the next High Rankings Seminar in Texas in October. I’ve been a guest at a couple of those seminars, and I’d highly recommend them to people interested in learning more about search engine marketing.
Nice interview, Jim and Jill.
When you perform a search on one of the major search engines for a particular query, and when I perform the same search, chances are that we will see the same pages appearing on the search results pages. Then again, we may not. Chances are also good that in the future, the results that each of us sees will be different.
One of the areas that many in academia, and at commercial search engines are exploring is how to personalize web search.
We see that most visibly in the personalized search pages that the major search engines have released. They explain how to receive personalized searches on the following pages:
Google Help Center – Personalizing your search results
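To make the idea of personalization a bit more concrete, here is a minimal, purely illustrative sketch of how results two people see for the same query might diverge: the engine blends each result’s generic relevance score with how well it matches a profile of the searcher’s interests. The function name, scoring formula, and weights are all hypothetical; real engines draw on far richer signals (search history, location, clicks).

```python
# Hypothetical sketch: re-rank generic results against a user-interest
# profile (term -> weight). Not any engine's actual algorithm.

def personalize(results, profile, blend=0.5):
    """Blend each result's base relevance score with its average
    overlap with the user's interest profile."""
    reranked = []
    for title, base_score in results:
        terms = title.lower().split()
        affinity = sum(profile.get(t, 0.0) for t in terms) / max(len(terms), 1)
        reranked.append((title, (1 - blend) * base_score + blend * affinity))
    # Highest combined score first
    return sorted(reranked, key=lambda r: r[1], reverse=True)

# The same query, "python", re-ordered for a searcher interested in programming
results = [("python snake care", 0.9), ("python programming tutorial", 0.8)]
profile = {"programming": 1.0, "tutorial": 0.6}
for title, score in personalize(results, profile):
    print(title, round(score, 3))
```

A searcher with no programming terms in their profile would see the original order instead, which is the point: the pages returned depend on who is asking.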
This post doesn’t describe the actual creation of content for a site from an SEO stance, but it does detail some of the planning and steps that can help in the process.
It also doesn’t discuss some of the technical aspects of SEO that should be planned for to make a site easier for search engines to find. But it does provide a number of questions that may help someone who is putting together content for the pages of their site while considering optimizing that site for search engines.
One of my favorite articles of the past few years on design is a Digital Web article from 2003 by G.A. Buchholz, titled A Content Requirements Plan (CRP) helps Web designers take a leadership role.
I think that part of the planning of the content of a site also should include an awareness of search engines, and a knowledge of some SEO goals. Those goals aren’t too difficult to keep in mind when it comes to creating the words for a site, but are definitely worth considering:
There are a number of reasons why pages don’t show up in search engine results.
One area where this is particularly true is when the content at more than one web address, or URL, appears to be substantially similar at each of the locations it is seen by the search engines.
Some duplicate content may cause pages to be filtered at the time search engines serve results, and there is no guarantee as to which version of a page will show in results and which versions won’t. Duplicate content may also lead to some sites and pages not being indexed by search engines at all, or may cause a search engine’s crawling program to stop indexing the pages of a site because it finds too many copies of the same pages under different URLs.
There are a few different reasons why search engines dislike duplicate content. One is that they don’t want to show the same page more than once in their search results. Another is that they don’t want to spend resources indexing pages that are substantially similar.
I’ve listed some areas where duplicate content exists on the web, or seems to exist from the stance of search engine crawling and indexing programs. I’ve also included a list of some patents and some papers that discuss duplicate content issues on the web.
How harmful are dead links to search engine rankings? Or pages filled with outdated information? Can internal redirects on a site also hurt rankings? What about the redirects used on parked domains?
A new patent application published last week at the US Patent and Trademark Office (USPTO), and assigned to IBM, Methods and apparatus for assessing web page decay, explores the topics of dead pages, web decay, soft 404 error messages, redirects on parked pages, and automated ways for search engines to look at these factors while ranking pages. I’ll explore a little of the patent application here, and provide some ideas on ways to avoid having decay harm the rankings of web sites.
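One of the heuristics discussed for soft 404s can be sketched simply: request a URL on the site that almost certainly does not exist, and if the server answers 200 OK anyway, compare that probe page’s content to the page being tested. If they are nearly identical, the page under test is probably an error page wearing a success status code. The function name and the similarity threshold below are my own hypothetical choices, and real systems would use stronger content fingerprints than plain word overlap.

```python
# Hedged sketch of a soft-404 heuristic: compare a page's text to the
# text the server returns for a deliberately bogus URL. Names and the
# 0.8 threshold are hypothetical, not from the patent filing.

def looks_like_soft_404(page_text, known_404_text, threshold=0.8):
    """Rough word-set similarity; real systems use stronger
    fingerprints (e.g. shingling) over fetched HTML."""
    a = set(page_text.lower().split())
    b = set(known_404_text.lower().split())
    if not a or not b:
        return False
    return len(a & b) / len(a | b) >= threshold

# Text fetched (hypothetically) from a URL that should not exist:
probe = "sorry the page you requested was not found"
page = "sorry the page you requested was not found"
print(looks_like_soft_404(page, probe))  # identical boilerplate -> True
```

The practical takeaway for site owners is the inverse of the heuristic: make sure genuinely missing pages return a real 404 status rather than a 200, so crawlers don’t have to guess.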
The authors of the patent filing include:
I received my copy of the first magazine devoted to Search Marketing, and Search Engine Optimization on Monday.
Search Marketing Standard went out to more than 15,000 people over the last week or two. The first issue was on the slim side, but it had some well-written articles and news coverage. Headlines on the front page include:
- 15 of the Biggest Myths in Search Marketing Exposed
- Measuring SEO Success with Web Analytics
- Targeting the Tail: How to get the Most out of Every Marketing Dollar
I think that they are off to an excellent start, and I hope to see them grow and evolve into a well known and highly respected part of the Search Marketing community.
The magazine is quarterly, and is aimed at owners of small to medium sized businesses, and search marketers. Their fall issue is expected at the end of August, and will take a closer look at “Search Engine Marketing and Web Site Usability.”
I’ve been unhappy for a long time with what is on the Wikipedia page for Search Engine Optimization. I decided this weekend to start making some changes to present the subject from a more rounded perspective.
Some of the things that bothered me about the article as it was:
1. It presented the industry as one largely divided into two camps, mostly at odds with one another – white hats and black hats – those who follow ethical practices as defined by search engine guidelines, and those who don’t.
Ethics aren’t defined by search engines, but rather by moral codes of conduct, and having search engines set the tone of that conduct probably isn’t appropriate. They are businesses, beholden to shareholders, reliant on advertisers, and dependent upon searchers. They’ve never set themselves up to be the moral policemen for the search engine optimization community, and it’s a role that I suspect that they don’t relish.
2. Search engines have expanded their offerings considerably in the past few years to include much more than just organic results, and someone practicing SEO can be helped by having an understanding of RSS feeds, local search, mapping, vertical search, shopping search, news, and paid advertising.