ASP.NET 2.0 Web Site Search Page

Sam

Does anyone know of a way to create a search page under ASP.NET 2.0?

I have started out by configuring a catalog in Index Server,
registering the aspx and ascx extensions in the registry so they can
be indexed, and building the catalog as per the KB article, but I've
run into an interesting problem.

When you publish a website from Whidbey, it precompiles everything and
strips out the searchable details of the page (metadata, html, etc)
leaving only the statement "This is a marker file generated by the
precompilation tool, and should not be deleted".

Can anyone advise as to how we should be creating web pages under
ASP.NET 2.0 that can be detected by search engines, and/or how to
perform a search against a site created in Whidbey?

Regards,

Sam.
 
Steven Cheng[MSFT]

Hi Sam,

As for ASP.NET 2.0, it provides some new features but does not seem to have
a built-in search engine framework.
Generally, if we need to build a search feature in an ASP.NET web
application, we make use of the Indexing Service. Microsoft's SharePoint
Portal Server, which provides a very rich search function, is itself built
on the Indexing Service, so you may have a look at it and get some ideas
from there.
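To make the Indexing Service suggestion concrete, here is a minimal sketch of querying a catalog from ASP.NET through its OLE DB provider (MSIDXS). The catalog name and search term are assumptions for illustration; substitute your own catalog configured in Index Server:

```csharp
// Hypothetical sketch: querying an Indexing Service catalog via the
// MSIDXS OLE DB provider and returning the hits as a DataTable.
using System;
using System.Data;
using System.Data.OleDb;

public class IndexServerSearch
{
    public static DataTable Search(string catalog, string searchText)
    {
        // SCOPE() restricts the query to the directories the catalog
        // covers; Rank, DocTitle, Path and Characterization (abstract)
        // are standard Indexing Service columns.
        // Escape single quotes so user input cannot break the query.
        string sql = string.Format(
            "SELECT DocTitle, Path, Characterization FROM SCOPE() " +
            "WHERE FREETEXT('{0}') ORDER BY Rank DESC",
            searchText.Replace("'", "''"));

        using (OleDbConnection conn = new OleDbConnection(
            "Provider=MSIDXS;Data Source=" + catalog))
        {
            OleDbDataAdapter adapter = new OleDbDataAdapter(sql, conn);
            DataTable results = new DataTable();
            adapter.Fill(results);
            return results;
        }
    }
}
```

The resulting DataTable can be bound directly to a GridView or DataGrid on the search results page.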

Also, as for making pages search engine friendly, I think it is similar to
what we do in ASP.NET 1.1. Because most pages in ASP.NET web applications
are dynamically generated, they are by default not friendly to search
engines, so sometimes we need to provide our own friendly URLs, for example
by using ASP.NET URL rewriting. Here are some tech articles discussing this:

Search engine friendly URLs using ASP.NET
http://www.codetoad.com/asp.net_ma_searchenginefriendly.asp

#URL Rewriting in ASP.NET
http://msdn.microsoft.com/asp.net/using/building/web/default.aspx?pull=/library/en-us/dnaspp/html/urlrewriting.asp
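The articles above describe the HttpModule-based approach; here is a minimal sketch of it, assuming a hypothetical /articles/{id}.aspx path pattern and a target page of article.aspx (both are illustrations, not anything from this thread):

```csharp
// Minimal URL-rewriting HttpModule sketch: map a friendly static-looking
// path onto the real dynamic page before ASP.NET processes the request.
using System;
using System.Text.RegularExpressions;
using System.Web;

public class FriendlyUrlModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.BeginRequest += new EventHandler(OnBeginRequest);
    }

    private void OnBeginRequest(object sender, EventArgs e)
    {
        HttpContext context = ((HttpApplication)sender).Context;

        // e.g. /articles/123.aspx  ->  /article.aspx?id=123
        Match m = Regex.Match(context.Request.Path,
            @"/articles/(\d+)\.aspx$", RegexOptions.IgnoreCase);
        if (m.Success)
        {
            // RewritePath changes the executed page without changing
            // the URL the browser (or a search engine crawler) sees.
            context.RewritePath("~/article.aspx?id=" + m.Groups[1].Value);
        }
    }

    public void Dispose() { }
}
```

The module is registered under the `<httpModules>` section of web.config so it runs for every request.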

In addition, if you're interested in how these techniques will be supported
in Whidbey's final release, you can try posting in the Whidbey beta
newsgroup on MSDN.

Thanks

Regards,

Steven Cheng
Microsoft Online Support

Get Secure! www.microsoft.com/security
(This posting is provided "AS IS", with no warranties, and confers no
rights.)
 
Sam Loveridge

Hi Steven.

Thanks for your response, I'll have a look at those techniques. One
question though, doesn't the indexing service require some content in
the page to parse for matching search criteria? How does this work if
the only content in the page is a one line sentence of "This is a marker
file generated by the precompilation tool, and should not be deleted"?

It would appear that ASP.NET 2.0 uses these marker files to reference
back to a compiled dll to build the page on the fly (not sure myself??),
so with that in mind, would the indexing service be able to parse the
page for potential search results? I might be completely
misunderstanding the way the indexing service works.

Thanks in advance,

Sam.
 
Steven Cheng[MSFT]

Hi Sam,

Thanks for your follow-up. As for the Indexing Service, I've confirmed this
with some other experts. Yes, as you thought, it mainly works on static
pages (HTML); for dynamic web pages (ASP/ASP.NET), it parses the page's
template file (.asp or .aspx). So in ASP.NET 2.0, if we use the precompile
function to compile all the pages into DLLs, the Indexing Service can't
index them.
Also, the Indexing Service is provided by IIS to help users search the
documents on the site, but it is of no help to the web search engines
(such as Google or Yahoo), since they all have their own search rules. A
web search engine generally works as below:

1. Try to locate a website/page via its URL.

2. Send a request to that URL and retrieve the response HTML content.

3. Parse the response content and try to go through the site via the other
links in the content.
...
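The steps above can be sketched in a few lines: fetch a page and collect the links a crawler would follow next. The URL passed in is whatever page you want to start from; a real crawler would also resolve relative links, de-duplicate, and recurse:

```csharp
// Hypothetical sketch of crawl steps 2 and 3: download one page's HTML
// and extract the href targets a search engine spider would visit next.
using System;
using System.Collections.Generic;
using System.Net;
using System.Text.RegularExpressions;

public class MiniCrawler
{
    public static List<string> GetLinks(string url)
    {
        // Step 2: request the URL and read the response HTML.
        string html;
        using (WebClient client = new WebClient())
        {
            html = client.DownloadString(url);
        }

        // Step 3: pull out href attribute values to follow later
        // (a naive regex is enough for a sketch, not for production).
        List<string> links = new List<string>();
        foreach (Match m in Regex.Matches(html,
            "href\\s*=\\s*[\"']([^\"'#]+)[\"']",
            RegexOptions.IgnoreCase))
        {
            links.Add(m.Groups[1].Value);
        }
        return links;
    }
}
```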

So the most important thing is to let the web search engine find your web
page. To do that, we make the URLs of our pages search engine friendly.
Generally a static URL such as:

http://servername/webapp/folder/page.aspx

is good for search engines, while a dynamic URL such as:

http://servername/webapp/mainpage.aspx?folder=fsdfds&pageid=fsafds

is not search engine friendly. In ASP.NET, most people use URL rewriting to
manually provide a friendly (static-looking) URL path for their dynamic
pages. In addition to the ones mentioned in my last reply, here are some
further tech articles discussing this:

#Search engine friendly URLs using ASP.NET (C#.NET)
http://www.codetoad.com/asp.net_ma_searchenginefriendly.asp
http://www.ftponline.com/vsm/2002_02/magazine/columns/qa/
http://www.15seconds.com/issue/030522.htm
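For simple, fixed mappings, ASP.NET 2.0 itself also offers a declarative `urlMappings` section in web.config that rewrites a known set of URLs without any code; the paths below are hypothetical examples:

```xml
<!-- web.config fragment: static one-to-one URL mapping in ASP.NET 2.0.
     The url/mappedUrl values here are illustrative assumptions. -->
<system.web>
  <urlMappings enabled="true">
    <add url="~/books.aspx"
         mappedUrl="~/mainpage.aspx?folder=books" />
  </urlMappings>
</system.web>
```

This covers only exact matches; for pattern-based rewriting (e.g. an id embedded in the path) the HttpModule approach from the articles is still needed.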

In addition, here is another article discussing how to create a web spider
(search engine) in .NET:
#Crawl Web Sites and Catalog Info to Any Data Store with ADO.NET and Visual
Basic .NET
http://msdn.microsoft.com/msdnmag/issues/02/10/spiderinnet/

Hope this also helps. Thanks.


Regards,

Steven Cheng
Microsoft Online Support

Get Secure! www.microsoft.com/security
(This posting is provided "AS IS", with no warranties, and confers no
rights.)
 
Sam Loveridge

Thanks for your help Steven. I'll try out a few of your suggestions.

Regards,

Sam.
 
Steven Cheng[MSFT]

You're welcome Sam.
If you have any other questions later, please always feel free to post here.
Thanks again for your posting.


Regards,

Steven Cheng
Microsoft Online Support

Get Secure! www.microsoft.com/security
(This posting is provided "AS IS", with no warranties, and confers no
rights.)
 
