Steve,
Are you looking at some other code that I am not aware of?
The attached code does exactly what you are proposing: on every request it
retrieves ONLY the number of records needed for that page. In other
words, if you are displaying 10 records per page and you are on page 3, then
only 10 records are returned (transferred through the connection or over the
network) instead of 30.
The code works with derived tables: the inner query retrieves (page times
max rows) records, the query around that keeps only the last (max rows) of
those, and the outer query sorts the results as required.
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_NULLS OFF
GO
CREATE PROCEDURE [dbo].[spPaging]
(
@maxRow INT,
@currentPage INT
) AS
/*
####################################################
##
## author: Taco Fleur ([EMAIL PROTECTED])
## description: stored procedure to perform paging
## file name: spPaging
## this file can be freely used and distributed as long
## as this header remains intact
##
####################################################
*/
DECLARE @sql NVARCHAR(2000)
-- DT2 (inner): the first (@currentPage * @maxRow) records in display order.
-- DT3 (middle): the last @maxRow of those, obtained by reversing the sort.
-- Outer query: re-sorts that single page back into display order.
SET @sql = '
SELECT *
FROM (SELECT TOP ' + CAST(@maxRow AS VARCHAR(10)) + ' *
      FROM (SELECT TOP ' + CAST((@maxRow * @currentPage) AS VARCHAR(10)) + ' DT1.username
            FROM tblUser DT1
            ORDER BY DT1.username ASC) DT2
      ORDER BY DT2.username DESC) DT3
ORDER BY username ASC'
EXEC sp_executesql @sql
GO
SET QUOTED_IDENTIFIER OFF
GO
SET ANSI_NULLS ON
GO
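For what it's worth, here is what those three nested queries compute, sketched
as plain list operations (Python, purely for illustration - the usernames
below are made up, not from tblUser):

```python
def page_slice(rows, page, page_size):
    """Emulate spPaging's three nested TOP/ORDER BY derived tables."""
    # DT2: TOP (page * page_size) rows in ascending display order
    inner = sorted(rows)[: page * page_size]
    # DT3: TOP page_size of those in DESCENDING order,
    # i.e. the last page_size rows of the inner set
    last = sorted(inner, reverse=True)[:page_size]
    # outer SELECT: re-sort ascending for display
    return sorted(last)

usernames = ["user%02d" % n for n in range(1, 36)]  # 35 fake rows
print(page_slice(usernames, page=3, page_size=10))  # rows 21-30 of the sorted set
```

For page 3 at 10 rows per page this is the same as taking rows 21-30 of the
sorted set, which is exactly why only one page's worth of data ever crosses
the connection.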
Taco Fleur
Blog http://www.tacofleur.com/index/blog/
Methodology http://www.tacofleur.com/index/methodology/
Tell me and I will forget
Show me and I will remember
Teach me and I will learn
> -----Original Message-----
> From: [EMAIL PROTECTED]
> [mailto:[EMAIL PROTECTED] On Behalf
> Of Steve Onnis
> Sent: Sunday, 1 February 2004 2:11 PM
> To: CFAussie Mailing List
> Subject: [cfaussie] Re: Efficient Paging - follow up on an old post
>
>
> Just had a look at the rest of the code.
>
> You're not really doing anything differently.
>
> You're still having to query the database each time for the
> total count of the results, and the StartRow and MaxRows
> attributes are still being used to output the query results
> to the user.
>
> What ideally needs to happen is that the results returned are
> filtered so you only get the records for the page you're
> displaying, while still maintaining the total record count for the
> full recordset being returned.
>
> That way you're not caching: no SPs, no temp tables, just a
> custom tag for displaying the NEXT, PREV and page numbers.
>
> Anyway
>
> -----Original Message-----
> From: [EMAIL PROTECTED]
> [mailto:[EMAIL PROTECTED] Behalf Of
> Taco Fleur
> Sent: Sunday, 1 February 2004 2:54 PM
> To: CFAussie Mailing List
> Subject: [cfaussie] Re: Efficient Paging - follow up on an old post
>
>
> Steve, I think you are not reading my SP correctly.
> The SP uses no temp tables, and you are right that it uses one
> single table (or more if required), i.e. the actual table that
> holds the data. It would definitely work with more than one
> user hitting the search, and return the proper results too.
>
> I think you have thought about "paging" before and come
> across the issue you just mentioned; I hit the same
> issue when I first started thinking about paging. That
> issue could be handled by using the user's CFTOKEN for
> the temp table name, but temp tables are not the way to go, in
> my opinion.
>
> Taco Fleur
> Blog http://www.tacofleur.com/index/blog/
> Methodology http://www.tacofleur.com/index/methodology/
>
> Tell me and I will forget
> Show me and I will remember
> Teach me and I will learn
>
>
> > -----Original Message-----
> > From: [EMAIL PROTECTED]
> > [mailto:[EMAIL PROTECTED] On Behalf
> Of Steve
> > Onnis
> > Sent: Sunday, 1 February 2004 1:45 PM
> > To: CFAussie Mailing List
> > Subject: [cfaussie] Re: Efficient Paging - follow up on an old post
> >
> >
> > Taco
> >
> > The process wouldn't work if you had more than one user
> > hitting the same page.
> >
> > Your process only uses a single table for all records.
> >
> > If I hit the page and then you hit the page, I would no longer have my
> > results when I hit the next button; I would have yours.
> >
> > As a result, to make your process more flexible, I think you
> > would need to create tables on the fly to hold a specific user's
> > records. 100 concurrent users would then mean 100 extra tables in
> > your database. You would then also need dbOwner rights to the
> > database to do so.
> >
> > Also, would this mean that you would have to create a temp table for
> > each type of paging display?
> >
> > Or am I not reading your SP correctly?
> >
> > Steve
> >
> > -----Original Message-----
> > From: [EMAIL PROTECTED]
> > [mailto:[EMAIL PROTECTED] Behalf Of Taco
> > Fleur
> > Sent: Sunday, 1 February 2004 2:34 PM
> > To: CFAussie Mailing List
> > Subject: [cfaussie] Re: Efficient Paging - follow up on an old post
> >
> >
> > Hi Peter,
> >
> > Not sure whether I understand your post or whether you
> > understand mine ;-)
> >
> > But the method I am proposing does not use cached queries at all.
> > Cached queries are nice for little apps, but as you say yourself,
> > you are limited to a certain number of cached queries, so they are
> > not the ideal method for a heavily used search interface. Even so,
> > a cached query transfers ALL data in the resultset to the app; the
> > method I am proposing/using does not transfer any more data than
> > required on each request.
> >
> > Where the [EMAIL PROTECTED] is everyone?
> > Are we the only ones working on the weekend?
> >
> > Taco Fleur
> > Blog http://www.tacofleur.com/index/blog/
> > Methodology http://www.tacofleur.com/index/methodology/
> >
> > Tell me and I will forget
> > Show me and I will remember
> > Teach me and I will learn
> >
> >
> > > -----Original Message-----
> > > From: [EMAIL PROTECTED]
> > > [mailto:[EMAIL PROTECTED] On Behalf
> > Of Peter
> > > Tilbrook
> > > Sent: Sunday, 1 February 2004 1:20 PM
> > > To: CFAussie Mailing List
> > > Subject: [cfaussie] Re: Efficient Paging - follow up on
> an old post
> > >
> > >
> > > Caching the query for "paging", i.e. next 10 records, previous 10
> > > records, first page, last page, is a great idea.
> > >
> > > But take this into account first:
> > >
> > > How "dynamic" will the list be? If not altogether dynamic - great.
> > >
> > > Otherwise, if the database data is changed, force a database
> > > "refresh" to the same scope as your cached query to ensure the
> > > end user is seeing up-to-date information.
> > >
> > > Apart from that, this method can really help your application
> > > performance, as long as you respect the amount of memory your
> > > server has and the limit CF has for this (100 cached queries, I
> > > believe - limited by the server's memory; one cached query could
> > > kill it).
> > >
> > > ---
> > > You are currently subscribed to cfaussie as:
> > [EMAIL PROTECTED] To
> > > unsubscribe send a blank email to
> > > [EMAIL PROTECTED]
> > >
> > > MXDU2004 + Macromedia DevCon AsiaPac + Sydney, Australia
> > http://www.mxdu.com/ + 24-25 February, 2004
> >
> >
>
>