Search Engine Optimization
There was a thread on this list recently about Search Engine Optimization for CF sites. Just so happens SEO has become an issue for me on a freelance project, and I want to start a list of best practices I can work into my development processes. Thus far, I have three items:

1) Configure the server to run other file extensions (i.e. HTM, HTML) through the CF parser, in order to avoid being ignored
2) Use meta description and meta keyword tags to indicate content on the site
3) Use search-engine-safe links instead of passing CGI parameters

Obviously, this is just a start, and there are non-technical issues as well (for instance, making sure there are plenty of incoming links to the site). Does anyone have any more tips, or else pointers to resources, that might be helpful here?

M

~| Archives: http://www.houseoffusion.com/cf_lists/index.cfm?forumid=4
Subscription: http://www.houseoffusion.com/cf_lists/index.cfm?method=subscribe&forumid=4
FAQ: http://www.thenetprofits.co.uk/coldfusion/faq
Get the mailserver that powers this list at http://www.coolfusion.com
Unsubscribe: http://www.houseoffusion.com/cf_lists/unsubscribe.cfm?user=89.70.4
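Item 3, search-engine-safe links, usually means moving URL parameters out of the query string and into slash-delimited path segments so crawlers never see a "?". A minimal sketch of such a link builder (in Python for illustration; the page and parameter names are made-up examples, not from any real site):

```python
def ses_url(page, **params):
    """Build a search-engine-safe URL: /page.cfm/name/value/...
    instead of the crawler-hostile /page.cfm?name=value&..."""
    segments = []
    for name, value in params.items():
        # Alternate name/value path segments, in parameter order.
        segments.append(str(name))
        segments.append(str(value))
    return "/" + page + ("/" + "/".join(segments) if segments else "")

# A query-string link and its SES equivalent:
#   /product.cfm?cat=books&id=42  becomes  /product.cfm/cat/books/id/42
```

The server-side page then has to recover the parameters from the extra path info itself, which is where the IIS configuration issues discussed later in this thread come in.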
Search Engine Optimization
I know this is a little off topic, but does anyone have any specific techniques they use to get the most out of their listing in search engines? Other than the title tag, the number of times the keywords appear in the page, submitting to search engines directly, and having as many links to your site as possible from other sites. Any help would be appreciated.

Thanks
Ryan
Re: Search Engine Optimization
At 15:37 08/07/2003 -0400, you wrote:
>1) Configure the server to run other file extensions through the CF parser
>(i.e. HTM, HTML) in order to avoid being ignored
>2) Use meta description and meta keyword tags to indicate content on the site
>3) Use search engine safe links instead of passing CGI parameters

Making sure the crawler bots index your pages is obviously the best first step... Bear in mind that Googlebot and others can index dynamic pages (i.e. ones with a "?" in the URL), but only if they're linked to from static pages. But special pages of links for crawlers are only a last resort, and using some other site-wide technique (slash-delimited query strings, or getting your CMS to write out flat HTML files) is preferable.

But as far as actual optimisation goes, the following rules are important in today's Google-centric web (more than META keywords and description, though I always use these anyway, for their potential value for things other than Google):

- Put keywords in the TITLE tag of your page. I used to avoid this cos I sympathise with people bookmarking things and having to change the title to something short and useful in your browser. But then, if no one finds your page, how can they bookmark it? ;-) I go for a reasonable phrase-like string like "Cheap Banana Imports for UK Retail, from XYZ corp" instead of "XYZ corp - Home" (which is nicer for bookmarking, but useless for search engines).

- Use structural XHTML markup wherever possible. Make sure the H1 tag contains keywords relevant to the page's topic (without rendering it silly as a human-readable main title, of course). Pages with keywords in the TITLE, H1, and body text near the top of the page get higher rankings than those without.

- If possible, use table-less CSS layouts. Then you can shove your H1 and main content right at the top of the markup, even if in the layout it comes underneath loads of navigation and banners and whatnot. These can be shoved to the bottom of the code, but positioned at the top using CSS positioning. In tables, you're often forced to have your left-hand side nav as well as your top nav above the content in your markup. This means lower rankings.

These aren't set in stone, but they've got me some pretty good rankings so far.

HTH,

Gyrus
[EMAIL PROTECTED]
play: http://norlonto.net/
work: http://tengai.co.uk/
PGP key available
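The TITLE/H1 advice above can be checked mechanically before publishing a page. A rough sketch of such a check using Python's standard `html.parser` (the sample page and keyword list are hypothetical, chosen to match the banana example in the post):

```python
from html.parser import HTMLParser

class KeywordAudit(HTMLParser):
    """Collect the text inside <title> and <h1> so a page can be
    audited for topic keywords in both places."""
    def __init__(self):
        super().__init__()
        self._in = None
        self.title = ""
        self.h1 = ""

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1"):
            self._in = tag

    def handle_endtag(self, tag):
        if tag == self._in:
            self._in = None

    def handle_data(self, data):
        if self._in == "title":
            self.title += data
        elif self._in == "h1":
            self.h1 += data

def has_keywords(html, keywords):
    """True if every keyword appears in the TITLE or H1 text."""
    audit = KeywordAudit()
    audit.feed(html)
    text = (audit.title + " " + audit.h1).lower()
    return all(k.lower() in text for k in keywords)

page = ("<html><head><title>Cheap Banana Imports for UK Retail</title></head>"
        "<body><h1>Banana Imports</h1><p>...</p></body></html>")
# has_keywords(page, ["banana", "imports"]) is True
```

This only checks presence, not placement near the top of the body, but it catches the "XYZ corp - Home" style of title before it goes live.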
Re: Search Engine Optimization
Is there any benefit whatsoever to the first point (parsing other extensions thru CF)? I can believe it might have been true in the past, but I've never had a problem getting CF pages indexed. I thought it was the parameters that *might* get you, not the page extension. Anyone with recent experience to the contrary?

I'd add that you should publish to static HTML wherever possible (which may be the bulk of the pages on many sites). I do it because I'm greedy about conserving server resources, but I sell it to clients by telling them the links are more SE-friendly.

Of course you can always do the old blah.cfm/parm/value bit, but that opened up a security hole in CF:

http://www.securityfocus.com/advisories/4110

I use it on 4.5 cuz I have a site-wide error handler, whose special handling of 404's supposedly makes the technique safe to use. Did this ever get patched in one of the MX updaters?

---
Matt Robertson, [EMAIL PROTECTED]
MSB Designs, Inc. http://mysecretbase.com
---
RE: Search Engine Optimization
I can't see what this security issue has to do with SE-friendly URLs; could you explain?

-Original Message-
From: Matt Robertson [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, 9 July 2003 6:35 AM
To: CF-Talk
Subject: Re: Search Engine Optimization

>Of course you can always do the old blah.cfm/parm/value bit, but that
>opened up a security hole in CF.
>
>http://www.securityfocus.com/advisories/4110
RE: Search Engine Optimization
>I can't see what this security issue has to do with SE
>friendly URLs, please explain?

To make SES URLs work (i.e. foo.cfm/parm/value) you have to shut OFF the setting for "verify that pages exist" in IIS. If you do that, you open yourself up to the exploit described at Bugtraq, where certain types of requests will reveal the true web root on the server. For that reason, MM issued the warning they did, copied at the BugTraq site. The warning said "don't do that", which more or less killed that widely used technique, unless you have a site-wide error handler that handles CF 404's, in which case you're safe. However, I *think* the site-wide error handler only protected you pre-MX; my memory is way hazy on this point and may be dead wrong.

---
Matt Robertson, [EMAIL PROTECTED]
MSB Designs, Inc. http://mysecretbase.com
---
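On the application side, the foo.cfm/parm/value technique means the extra path segments arrive as raw path info (CGI.PATH_INFO in CF) and the page has to split them back into parameters itself. A language-neutral sketch of that parsing step, written in Python for illustration (the parameter names are hypothetical):

```python
def parse_ses_path(path_info):
    """Split a path like '/parm/value/parm2/value2' back into a dict
    of parameters, mirroring what a CF page would do with CGI.PATH_INFO."""
    parts = [p for p in path_info.split("/") if p]
    params = {}
    # Pair up alternating name/value segments; an odd trailing name
    # is kept with an empty value rather than silently dropped.
    for i in range(0, len(parts), 2):
        name = parts[i]
        value = parts[i + 1] if i + 1 < len(parts) else ""
        params[name] = value
    return params

# parse_ses_path("/cat/books/id/42") yields {"cat": "books", "id": "42"}
```

Note this is only the parsing logic; it does nothing about the IIS "Check that File Exists" issue discussed above, which is a server configuration matter.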
Re: Search Engine Optimization
.cfm, .php, and .asp pages are all indexed as easily as .htm or .html pages; algorithms that weight static pages more heavily are pretty much nonexistent nowadays. It's the "?" that many of the spiders won't follow, as a basic rule to avoid getting caught in dynamic loops. Google for "coldfusion" and "safe URLs" and you'll find helpful alternatives using CF.

Great link on that /param/value stuff.

Matt Robertson wrote:
> Of course you can always do the old blah.cfm/parm/value bit, but that opened up a
> security hole in CF.
>
> http://www.securityfocus.com/advisories/4110
Re: Search Engine Optimization
Ryan...search the archives over the past few weeks...asked and answered ;-)

Cheers

Bryan Stevenson B.Comm.
VP & Director of E-Commerce Development
Electric Edge Systems Group Inc.
phone: 250.480.0642
fax: 250.480.1264
cell: 250.920.8830
e-mail: [EMAIL PROTECTED]
web: www.electricedgesystems.com
Re: Search Engine Optimization
SES URLs (article in FA), CSS for tighter code (Sandy is speaking on it tonight, or take her course), clean URL spaces, good content on a regular rotation, lots o' links.
Re: Search Engine Optimization
Ryan,

I can highly recommend a site called webmasterworld.com. Searching their forums will teach you anything you need to know about search engine optimization.

There is also Brett Tabke's famous "26 Steps to 15K a Day". It's a bit dated but a good starting point:

http://www.searchengineworld.com/misc/guide.htm

Rick Mason

- Original Message -
From: Ryan Mannion <[EMAIL PROTECTED]>
Date: Tue, 10 Aug 2004 12:08:20 -0400
Subject: Search Engine Optimization
To: CF-Talk <[EMAIL PROTECTED]>
RE: Search Engine Optimization
Ryan,

I've had tons of success with coding my applications so that I can go into the admin and write out the site in static HTML. Most of the big search engines are getting better at crawling dynamic sites, but they crawl like mad when it's good ole static HTML text. Also, make sure you don't have any broken links.

If you can offer any affiliate-type marketing partnerships, you will be able to easily boost your page rank and link popularity because of how many people will highlight your site in an effort to earn commissions. What I do with my affiliate programs is pay very good commissions, but I only offer the program to sites with very high page ranking and newsletters with good reach. If you do this, make sure you lay down the law about how your partners are NOT allowed to promote your products or services. You certainly don't want an affiliate bringing DOWN your reputation.

One of the other things you can do to help yourself out is to dedicate significant sections of your site to useful content. If your site is concerned with selling aftermarket Harley Davidson parts, provide as much useful information as you can on HD motorcycles, customizing, and riding culture, writing your own articles and syndicating as much content as you can from elsewhere. We all know there are plenty of sources on the internet without even having to plagiarize. The more real, informative content you have on your site, the better you will rank naturally (just be sure to keep it all neatly organized).

--Ferg
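The "write out the site in static HTML" approach above boils down to rendering each dynamic page once and saving the result under a crawlable flat filename. A minimal sketch of that publishing loop, assuming a hypothetical `render` callable standing in for the application's own templating and a made-up page list:

```python
import os

def publish_static(pages, render, out_dir="static"):
    """Render each dynamic page once and write the result to a flat
    .html file, so crawlers see plain static pages with no '?' URLs.

    `pages` maps output filenames to whatever context the renderer
    needs; `render` is a stand-in for the real templating layer."""
    os.makedirs(out_dir, exist_ok=True)
    written = []
    for filename, context in pages.items():
        html = render(context)
        path = os.path.join(out_dir, filename)
        with open(path, "w", encoding="utf-8") as f:
            f.write(html)
        written.append(path)
    return written
```

In practice this would run from the site's admin area whenever content changes, trading a publish step for cheaper serving and crawler-friendly links, exactly the bargain described above.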
IIS and making SES URLs work [WAS: Search Engine Optimization]
At 14:07 08/07/2003 -0700, you wrote:
> >I can't see what this security issue has to do with SE
> > friendly URLs, please explain?
>
>To make SES urls work (i.e. foo.cfm/parm/value) you have to shut OFF the
>setting for "verify that pages exist" in IIS.

I've been trying to figure out why that method worked on one server but not another, but I've never found a setting in IIS similar to "verify that pages exist". Could you point out where this option is set in the IIS Management Console?

Gyrus
[EMAIL PROTECTED]
play: http://norlonto.net/
work: http://tengai.co.uk/
PGP key available
Re: IIS and making SES URLs work [WAS: Search Engine Optimization]
Sure:

1) Open the IIS manager
2) Right-click on a site and choose Properties
3) Click the Home Directory tab
4) Click the Configuration button (lower right of the dialog)
5) Click the .cfm extension and choose 'Edit'
6) The lower-left checkbox: "Check that File Exists"

If you leave that on (the default), IIS will throw its own 404 if it does not find a page named foo.cfm/blah/blah (which of course it won't). Once you make this change, CF becomes responsible for handling 404's to .cfm pages, which is where you can get into trouble via that Bugtraq bit.

---
Matt Robertson, [EMAIL PROTECTED]
MSB Designs, Inc. http://mysecretbase.com
---
Re: IIS and making SES URLs work [WAS: Search Engine Optimization]
At 14:34 08/07/2003 -0700, you wrote:
>The lower left checkbox: "Check that File Exists"
>
>If you leave that on (the default) IIS will throw its own 404 if it does
>not find a page named foo.cfm/blah/blah (which of course it won't).

OK, thanks, found it. Still confused tho, as it's not checked on our local server, but /-delimited URLs don't seem to work locally. They do on our live site, but we don't have IIS access there. We plumped for "OK, it works, let's not ask questions" - but naturally this is less than ideal. I never really questioned it, but "index.cfm/action/page.view/id/42" seems to just work on our live site. Now I'm really foxed as to why - we don't have any custom 404 catching or anything... Any ideas?

Hmmm, frustration at not knowing why something *works* makes a change!

Gyrus
[EMAIL PROTECTED]
play: http://norlonto.net/
work: http://tengai.co.uk/
PGP key available
Re: IIS and making SES URLs work [WAS: Search Engine Optimization]
You should also check this website:

http://www.fusium.com/index.cfm?fuseaction=view.aProduct&contentObjectID=5DA3DC7A-1C03-4367-A7B90A1EEE3EF675#sesConverter

Also, if you're using IIS Lockdown, make sure AllowDotInPath is set to 1, otherwise SES URLs will not work.

- Original Message -
From: "Gyrus" <[EMAIL PROTECTED]>
To: "CF-Talk" <[EMAIL PROTECTED]>
Sent: Wednesday, July 09, 2003 12:25 AM
Subject: IIS and making SES URLs work [WAS: Search Engine Optimization]
RE: IIS and making SES URLs work [WAS: Search Engine Optimization ]
Dumb q... SES URLs... Are they even required any more? I mean, the smarter
engines follow links through your site, so is it really necessary?

-----Original Message-----
From: Gyrus [mailto:[EMAIL PROTECTED]
Sent: Tuesday, July 08, 2003 2:26 PM
To: CF-Talk
Subject: IIS and making SES URLs work [WAS: Search Engine Optimization]

At 14:07 08/07/2003 -0700, you wrote:
>>I can't see what this security issue has to do with SE friendly
>>URLs, please explain?
>
>To make SES urls work (i.e. foo.cfm/parm/value) you have to shut OFF
>the setting for "verify that pages exist" in IIS.

I've been trying to figure out why that method worked on one server but not
another, but I've never found a setting in IIS similar to "verify that pages
exist". Could you point out where this option is set in the IIS Management
Console?

Gyrus
[EMAIL PROTECTED]
play: http://norlonto.net/
work: http://tengai.co.uk/
PGP key available
RE: IIS and making SES URLs work [WAS: Search Engine Optimization]
At 15:27 08/07/2003 -0700, you wrote:
>Dumb q... SES urls... Are they even required any more? I mean, the
>smarter engines follow links through your site, so is it really necessary?

AFAIK Googlebot only crawls dynamic pages that are linked to from static
pages - an obvious rule to avoid the kind of crazy tangles that Inktomi's
Slurp seems to get into. Google only indexed the first pages of our site's
multi-page articles until we shifted to SES URLs.

Gyrus
[EMAIL PROTECTED]
play: http://norlonto.net/
work: http://tengai.co.uk/
PGP key available
RE: IIS and making SES URLs work [WAS: Search Engine Optimization]
Our pages, using SES URLs, get good rankings at Google. An example: if you
search for "football eye shield" you get this indexed link:

http://www.teamskyline.com/index.cfm/method/cdd2_productdisplay/catid/6/category/Football/subcatid/172/sub_category/Eye%20Shields/product_id/ITECH/name/Eye%20Shield/

Brook

At 12:11 AM 7/9/2003 +0100, you wrote:
>At 15:27 08/07/2003 -0700, you wrote:
> >Dumb q... SES urls... Are they even required any more? I mean, the
> >smarter engines follow links through your site, so is it really necessary?
>
>AFAIK Googlebot only crawls dynamic pages that are linked to from static
>pages - an obvious rule to avoid the kind of crazy tangles that Inktomi's
>Slurp seems to get into. Google only indexed the first pages of our site's
>multi-page articles until we shifted to SES URLs.
>
>Gyrus
>[EMAIL PROTECTED]
>play: http://norlonto.net/
>work: http://tengai.co.uk/
>PGP key available