
Monday, March 26, 2012

Parameter Display

I need to display a parameter value on my report. This parameter is coming from a web application.

Example: I am passing a parameter, Dealer Name, from the web application to the report.

How do I display that Dealer Name on the report?

Drag a textbox control onto the report.

Go to Edit Expression for that textbox.

Select Parameters on the left side; it displays all the parameters defined in your report on the right-hand side (Dealer Name, in your example). In case a parameter is not used in a query, you can create one, and it will also be listed on the right-hand side.

Double-click the parameter you want to display in the textbox.
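
For example, assuming the report parameter is named DealerName (the name here is just illustrative), the textbox expression ends up as simply:

=Parameters!DealerName.Value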


Parameter Collection

Greetings -
I am trying to set up a custom parameters page in an ASP.NET application. I am in the process of choosing between keeping the values (the information I need to list the parameters for the reports, as well as the available, valid and default values for any given report) in a database, or accessing all I need through the Reporting Services web service.
Does anyone have any advice for setting up a web app to run RS reports - can you get all you need from the RS web services to create a custom parameter page, or is it better to retrieve the values you need from a database?
I can dynamically build the custom parm page using values from a database (dynamically list the parms for the report, fire off the sp to return valid values for the parm, and set the default value). I would like to get this going using the web services, but am stuck at the point of trying to fill the valid parm values. I do have the name of the stored procedure to run to retrieve the valid values for the parm, but can't seem to figure out how to either a) get the name of the sp to run, or b) just do a databind somehow to my dynamic control (whatever that may be - a textbox, a list box, a drop down list, etc.)
Any ideas?
Thanks
|||
I have done exactly this, but the code is at work and it is a (very) long
weekend.
You can certainly use the web services to retrieve everything - the trick to
getting the values you have set up from datasets that populate the possible
values etc. is that when you run the GetReportParameters method, make sure
you set the ForRendering option to true - then everything you need comes
back.
Have a go and I'll post my code on Tuesday
--
Mary Bray [SQL Server MVP]
Please reply only to newsgroups
"Myles" <Myles@.discussions.microsoft.com> wrote in message
news:44942D91-7D8D-4514-91BE-E0FD9E9750FA@.microsoft.com...
> [quoted message trimmed]
|||
Thank you for the reply Mary - I may need a little help yet,
I went back and read my question and I was so involved I think I mis-stated
my question! Do the web services expose the names of the stored procedures
used to fill the valid values for the parameters - from your response it
sounds as if you just use the GetReportParameters method and that returns
everything (all of the datasets for all of the parameters) - if that's the
case how do you separate or sort through it all...HELP!
It was a very long weekend - and I still didn't get anything done!
Thanks,
"Mary Bray [MVP]" wrote:
> [quoted message trimmed]
|||
OK - now I'm at work so here are some code snippets that may help (in C#):
They are used to build drop down lists in a web page with the parameter
values. This way RS takes care of the security for the parameter data. To
find out the names of the stored procs you need to get the DataSetDefinition
object and query it. I haven't yet worked out how to get it back, though...
sorry. I think I tried this first and gave up, so used the parameters
collection to build the lookup data.
ReportingService rs = new ReportingService();
rs.Credentials = System.Net.CredentialCache.DefaultCredentials;
bool forRendering = true; // must be true so the valid values come back populated
string historyID = null;
ParameterValue[] values = null;
DataSourceCredentials[] credentials = null;
ReportParameter[] parameters = null;
parameters = rs.GetReportParameters(ReportPath, historyID, forRendering,
    values, credentials);
if (parameters != null)
{
    foreach (ReportParameter rp in parameters)
    {
        // With ForRendering = true, query-based valid values are returned
        // in ValidValues as well, so the null check covers both cases.
        if (rp.ValidValues != null)
        {
            DropDownList lst = new DropDownList();
            foreach (ValidValue vv in rp.ValidValues)
            {
                ListItem item = new ListItem();
                item.Text = vv.Label;
                item.Value = vv.Value;
                lst.Items.Add(item);
            }
        }
    }
}
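
For the snippet to actually show anything on the page, the dynamically created list still has to be added to the page's control tree. A minimal sketch, assuming a Panel named pnlParameters already exists on the .aspx page (the panel is an assumption, not part of the original code):

// pnlParameters is a hypothetical placeholder Panel on the .aspx page
lst.ID = rp.Name; // so the selection survives postback
pnlParameters.Controls.Add(lst);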
"Myles" wrote:
> [quoted message trimmed]
|||
Beautiful - thank you very much Mary - this will be a tremendous help. If I
figure out how to actually get the sp names I will post back ~
Hope you had a great weekend!
Thanks,
Pete
"Mary Bray [SQL Server MVP]" wrote:
> [quoted message trimmed]

Wednesday, March 21, 2012

Parallel operations in SQL 2000 Standard

Hi,
I have an IIS 6.0 application on one server using a SQL Server 2000 Standard Edition database on a 2-processor machine. In server properties it is set to use all available processors for parallelism.
As I understand the differences between Standard and Enterprise editions, Standard can only execute 2 different queries at the same time on 2 processors. Enterprise can also divide one long-lasting query across 2 processors, if it makes sense of course.
I have tested it and noticed that my SQL Server Standard still uses one processor. Can it be caused by the fact that these two concurrent queries are executed by the same one application (IIS)?
Here is my test scenario:
1. I have an .aspx page which shows the results of a query (it takes about 7 s)
2. I open this page from 2 different IE windows at the same time (almost the same, but the delay is about 1-2 s, the time I need to click)
3. When I look at the processor history in Task Manager on the SQL Server, it never goes above 50% (when I have 1 graph for all 2 processors)
Am I wrong in my opinion about the capabilities of SQL Server Standard, or is it caused by 1 client application (IIS)?
Przemo
|||
SQL Server 2000 SE supports up to 4 processors. Do you observe the same
behavior when you execute the same 2 queries concurrently from Query
Analyzer?
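A CPU-bound query you might run side by side in the two windows for such a test - just a sketch, any sufficiently long-running query will do:

-- Run this simultaneously in two Query Analyzer windows and watch
-- both processors in Task Manager.
select count(*)
from master..syscolumns a
cross join master..syscolumns b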
Hope this helps.
Dan Guzman
SQL Server MVP
"Przemo" <Przemo@.discussions.microsoft.com> wrote in message
news:4498ECD3-D1F6-4687-AFDD-D538A6797AD4@.microsoft.com...
> [quoted message trimmed]
|||
Try disconnecting one of the connections and reconnect, or add a new
connection and then try it again. Connections are bound to a UMS which is
basically tied to a processor. The connections get assigned in a round
robin fashion. If you have 3 connections it could have gone like this.
Connection 1 - Attached to UMS 1
Connection 2 - Attached to UMS 2
Connection 3 - Attached to UMS 1
If you run 2 queries, one on Connection 1 and the other on Connection 3 they
may share the same processor. If the optimizer chose to not use parallelism
for your query the third connection could simply be waiting on the first.
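It is also worth confirming the instance-level parallelism setting mentioned in the original post - a minimal check in SQL Server 2000 syntax:

-- 'max degree of parallelism' = 0 means a parallel plan may use
-- all available processors.
exec sp_configure 'show advanced options', 1;
reconfigure;
exec sp_configure 'max degree of parallelism';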
--
Andrew J. Kelly SQL MVP
"Przemo" <Przemo@.discussions.microsoft.com> wrote in message
news:4498ECD3-D1F6-4687-AFDD-D538A6797AD4@.microsoft.com...
> [quoted message trimmed]
|||
Thank you.
This is the reason. I have tested it and it works OK. It was simply a coincidence.
Przemo
"Andrew J. Kelly" wrote:

> [quoted message trimmed]


Tuesday, March 20, 2012

Parallel activations with GET CONV GROUP

Hey guys. I'll try to keep this as short as possible.

Was testing out some thresholds with a simple Service Broker application and ran into something interesting that I'm trying to find a workaround for. I set up a simple service on a single instance, and am transmitting messages to/from the same instance. I assigned a procedure to be activated on the queue, with a max of 20 parallel activations.

The procedure originally read messages off the queue following a process whose major steps are outlined here:

1. Start a loop until there are no more messages to process
2. Call GET CONVERSATION GROUP with a 3000 ms timeout into variable x
3. If a group was retrieved, RECEIVE TOP 1 from the queue where the conversation group = x; if no group, exit
4. Process the message(s) received (simply insert some of the msg data into a user table, then WAITFOR 1 second to simulate doing some work) for the given group
5. Loop to get the next group and messages

So, with this type of configuration, I sent off a few thousand messages to the queue and monitored the queue count, user table count, and activated tasks count. To my surprise, despite the fact that there were thousands of messages in the queue waiting to be processed, only a single task was processing data, pulling about 1 per second.

I tried over and over with different values in the waitfor, and no additional tasks got spun up until I had a waitfor of 6 seconds or more...then only about 3-4 would get spun up at any time in parallel.

Given the way activation occurs, I thought I'd try things without the call to get conv group, and instead simply use a receive without a where clause...so, I rewrote the activation procedure to skip the call to GET CONV GROUP, and simply perform a receive without any where clause. I also removed the waitfor delay and simply left it to process messages as fast as it could...

This time I sent over the same number of messages and checked the same counts, noticing that within 20 seconds or so, all 20 parallel instances of the activation procedure were processing messages off the queue, and doing so very quickly. The same occurred if I put the waitfor delay back in...

The problem is that this type of coding doesn't allow for processing related messages, even though it seems to be much better at allowing parallel activations. I tried to rewrite again using 2 types of receive statements (the first one without a where clause and the next one with a where clause on the conv. group from the first receive, after processing the message) to see if that would allow for better parallel activation; however, that worked about the same as using the GET CONV GROUP call.

So, if I have an application that mostly does not make use of grouping and I want to ensure optimal parallel activation when needed, but also handle the times when related messages arrive and need to be processed together, what can I do? Any ideas?

I have test code scripts that can reproduce this behavior if needed. Thanks up front,

Can you send your scripts to Rushi (dot) Desai (at) microsoft (dot) com so that I can take a look at what is going on? Activation should not start tasks more frequently than once every 5 seconds, whether you use GET CONVERSATION GROUP or RECEIVE directly. Given that, starting 20 concurrent queue readers should take 100 or more seconds.
|||

Hi Rushi...you're correct, it appears that the tasks are getting started about 1 every 5 seconds at best. Not sure how I got the original 20 seconds I mentioned above, but I can't seem to reproduce that no matter what I do...must have been a brain mix-up on my part, sorry.

I can still forward you the scripts, but I'm mostly curious as to whether there's anything that can be done to 'force' the same type of activation when using the GET CONV GROUP call, then a RECEIVE with a filter on the given group. If my queue is getting filled with about 50 messages a second, 99.9% of which will contain unique conversation_group values, and each message takes about 1/2 second to process, it seems that the best I can get for parallel activation is about 3, and that isn't coming close to staying in front of the arrival rate...

Thanks,

|||

You should not notice any difference in the behavior of activation if you do:

WAITFOR (GET CONVERSATION GROUP ...)
... something that takes less than a second or two to complete ...
RECEIVE ... WHERE conversation_group_id = ...
... process message for long time...

and

WAITFOR (RECEIVE ... )
... process message for long time...

50 msgs/sec on unique conversation groups with about 1/2 second per message would perform as follows (roughly):

t = 0: Start first task
5 < t < 10: Consumed 10 messages, start second task
10 < t < 15: Consumed 30 messages; start third task
15 < t < 20: Consumed 50 messages

Unfortunately, in SQL Server 2005, the 5 second timeout cannot be adjusted by the user and therefore user cannot adjust the "responsiveness" of activation. There is a tradeoff to be made between how quickly activation should launch additional task and how much time it should let existing tasks take to drain the queue. In future releases it would be nice to allow the user to control responsiveness. I would recommend that you write to product feedback (http://lab.msdn.microsoft.com/productfeedback/) so that we can open an issue to resolve this in the next release.

The current workaround may be to use multiple identical services (and queues) and partition your dialogs among those services. Each of the queues will be activated independently and you could get a larger number of tasks started. (We know of at least one internal customer who is doing this.)

Thanks,
Rushi
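
A rough sketch of that partitioning workaround, with illustrative names (the _a/_b objects and their activation procs are assumptions layered on top of the scripts below, not something from this thread):

-- Each queue needs its own reader proc, since the proc RECEIVEs from a
-- fixed queue name (assumes cp_test1_a / cp_test1_b already exist).
create queue dbo.qu_test1_a with
    activation (status = on, procedure_name = dbo.cp_test1_a,
                max_queue_readers = 20, execute as 'dbo');
create queue dbo.qu_test1_b with
    activation (status = on, procedure_name = dbo.cp_test1_b,
                max_queue_readers = 20, execute as 'dbo');
create service sv_test1_a on queue dbo.qu_test1_a (ct_test1);
create service sv_test1_b on queue dbo.qu_test1_b (ct_test1);
go

-- Sender side: spread new dialogs across the two services.
declare @h uniqueidentifier;
if abs(checksum(newid())) % 2 = 0
begin
    begin dialog conversation @h
        from service sv_test1_a to service 'sv_test1_a'
        on contract ct_test1 with encryption = off;
end
else
begin
    begin dialog conversation @h
        from service sv_test1_b to service 'sv_test1_b'
        on contract ct_test1 with encryption = off;
end

Each queue's activation is then monitored independently, so roughly twice as many readers can end up running overall.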

|||

Hey again Rushi.

Well, I notice a HUGE difference in activation between those 2 exact scenarios. Using a GET CONV GROUP followed by a RECEIVE with a where clause, I can't get more than a single activation. However, using just a RECEIVE without a where clause and no call to GET CONV GROUP, I get max activations, 1 new activation every 5 seconds or so.

I'm not so concerned about the 5 second activation threshold, as that would be plenty fast enough an activation under load, and once the tasks are activated, they remain until they exit (i.e. the queue is empty).

Since I can't attach scripts, I'll post another message immediately following this one with the full text of the scripts that can easily reproduce this different behavior. To use the scripts:

1) SCRIPT 1 - will create the sample db, broker objects, endpoint, queue, service, etc., including the activation procedure (cp_test1). Notice in the initial creation, the procedure makes use of method #1 from above (i.e. a call to GET CONV GROUP, followed by a RECEIVE with a where clause), simply pulls the top 1 from the queue with a group filter, inserts into another logging table, then pauses for 1/2 second before continuing.

2) SCRIPT 2 - will simply send @c messages (2000 as posted) onto the queue, pausing for @delay (2/100ths of a second as posted) between each send

Run SCRIPT 1, then set up another script to simply monitor the count of activated tasks, then start SCRIPT 2. You should notice that despite the fact that messages are arriving much faster than they are being processed, only a single task is activated no matter how long you wait.

3) SCRIPT 3 - will alter the cp_test1 procedure to use method #2 from above (i.e. no call to GET CONV GROUP, but instead just a call to RECEIVE without a where clause). Run this to modify the proc, then start the tests over after all activated tasks have either exited or been killed.

After running SCRIPT 3, you'll notice that tasks get activated every 5 seconds up to max activations...

|||

--

SCRIPT 1

--

/*
use master
drop database sbTest
drop endpoint ep_ServiceBroker_tcp
*/
execute as login = 'sa'
go

create database sbTest
go

use sbTest
go

create master key encryption by password = 'jpaiweqf2q938hq2-3980nhf9piunfwpojf';
alter master key add encryption by service master key;
go

create endpoint ep_ServiceBroker_tcp
state = stopped
as tcp (listener_port = 4022, listener_ip = all)
for service_broker (authentication = windows, encryption = disabled,
message_forwarding = disabled);
go

alter database sbTest set enable_broker;
alter database sbTest set error_broker_conversations;
go

create message type mt_test1 validation = well_formed_xml;
go

create contract ct_test1 (mt_test1 sent by initiator);
go

create procedure dbo.cp_test1
as
declare @conversation_group_id uniqueidentifier,
        @conversation_handle uniqueidentifier,
        @msg_type_name nvarchar(128)

declare @messages table (status tinyint, priority int, queing_order bigint, conversation_group_id uniqueidentifier,
    conversation_handle uniqueidentifier, message_sequence_number bigint, service_name nvarchar(512),
    service_id int, service_contract_name nvarchar(256), service_contract_id int, message_type_name nvarchar(256),
    message_type_id int, validation nchar(2), message_body varbinary(max))

-- Loop until we manually break
while 1 = 1 begin
    -- Start a transaction to receive the appropriate group
    begin transaction;

    -- Get the next available conversation group...wait for a few seconds then loop...
    waitfor (
        get conversation group @conversation_group_id from dbo.qu_test1),
        timeout 3000;

    -- If we didn't get anything, break since we're all done and will be re-awoken automatically if needed
    if @conversation_group_id is null begin
        if @@trancount > 0
            rollback transaction;

        break;
    end

    -- Got a group, so process all the messages for said group...
    while 1 = 1 begin
        -- Receive the next message for the conversation group...notice the where clause in the
        -- receive statement, which ensures we only get messages for this group...no need for a waitfor here,
        -- as calling get conversation group will lock that group (so no one else gets any messages for it), and
        -- we wouldn't even get to this point if there wasn't a message for this group...
        receive top (1) *
        from dbo.qu_test1
        into @messages
        where conversation_group_id = @conversation_group_id;

        if @@rowcount = 0 or @@error <> 0
            -- If an error occurred, there are probably no more messages, or something happened in the receive
            break;

        -- Get the information that we need to process the message
        select @conversation_handle = conversation_handle,
               @msg_type_name = message_type_name
        from @messages

        -- If an error occurred (on the other end), or the message is an end conversation message,
        -- end the conversation from our end
        if @msg_type_name in ('http://schemas.microsoft.com/SQL/ServiceBroker/EndDialog',
                              'http://schemas.microsoft.com/SQL/ServiceBroker/Error') begin
            end conversation @conversation_handle;
        end

        -- Stick the message data into our logging table
        begin try
            insert dbo.zztemp_messages
            select status, priority, queing_order, conversation_group_id, conversation_handle,
                   message_sequence_number, service_name, service_id, service_contract_name,
                   service_contract_id, message_type_name, message_type_id, validation, message_body, getdate()
            from @messages;

            -- Simulate think time for doing some work for half a second or so
            waitfor delay '000:00:00.5'
        end try
        begin catch
            select error_message() as err_msg, error_number() as err_number,
                   error_line() as err_line, error_procedure() as err_proc;

            print 'Caught error ' + quotename(error_message());

            rollback transaction;
            break;
        end catch

        delete @messages
    end -- inner - while 1 = 1

    -- All done processing this conversation group, commit now (guarded, since
    -- the catch block above may already have rolled the transaction back)...
    if @@trancount > 0
        commit transaction
end -- while 1 = 1
go

if isnull(object_id('dbo.zztemp_messages'), 0) = 0
    create table dbo.zztemp_messages (status tinyint, priority int, queing_order bigint, conversation_group_id uniqueidentifier,
        conversation_handle uniqueidentifier, message_sequence_number bigint, service_name nvarchar(512),
        service_id int, service_contract_name nvarchar(256), service_contract_id int, message_type_name nvarchar(256),
        message_type_id int, validation nchar(2), message_body varbinary(max), dt datetime default getdate());
go

-- Create the queue for the source server
create queue dbo.qu_test1 with
    status = on,
    retention = off,
    activation (
        status = on,
        procedure_name = dbo.cp_test1,
        max_queue_readers = 20,
        execute as 'dbo'
    )
on [default];
go

-- Create the source service
create service sv_test1
on queue dbo.qu_test1
(ct_test1);
go

alter endpoint ep_ServiceBroker_tcp state = started
go

|||

--

SCRIPT 2

--

use sbTest
go

declare @i int, @c int, @delay varchar(25)

select @i = 0,                 -- incremental count...
       @c = 2000,              -- set this to the number of messages to send to the queue
       @delay = '000:00:00.02' -- waitfor delay value to pause on each iteration before sending the next message

while @i <= @c begin

    -- Send a test message to the target service
    declare @msg xml, @hConversation uniqueidentifier;
    set @msg = N'<message>Test Message to Target</message>';

    begin try
        begin transaction;

        -- Start the conversation
        begin dialog conversation @hConversation
            from service sv_test1
            to service 'sv_test1'
            on contract ct_test1
            with encryption = off;

        -- Send the message on the dialog
        send on conversation @hConversation
            message type mt_test1
            (@msg);

        -- Commit the send...
        commit transaction;

        -- Not going to code for receiving back, as the auto-activated stored procedure handles that
        -- NOTE that it also handles ending the conversation that was begun here...
    end try
    begin catch
        -- Show the error information
        select error_number() as err_num, error_message() as err_msg,
               error_line() as err_line, error_procedure() as err_proc;

        if @hConversation is not null
            end conversation @hConversation

        if @@trancount > 0
            rollback transaction;

    end catch

    if @@trancount > 0
        commit transaction;

    set @i = @i + 1
    waitfor delay @delay
end

|||

--

SCRIPT 3

--

use sbTest
go

alter procedure dbo.cp_test1
as
declare @conversation_group_id uniqueidentifier,
        @conversation_handle uniqueidentifier,
        @msg_type_name nvarchar(128),
        @error int

declare @messages table (status tinyint, priority int, queing_order bigint, conversation_group_id uniqueidentifier,
    conversation_handle uniqueidentifier, message_sequence_number bigint, service_name nvarchar(512),
    service_id int, service_contract_name nvarchar(256), service_contract_id int, message_type_name nvarchar(256),
    message_type_id int, validation nchar(2), message_body varbinary(max))

-- Process all available messages, one at a time, regardless of group...
while 1 = 1 begin
    begin transaction;

    -- Notice we are not performing a call to GET CONVERSATION GROUP here...
    waitfor (
        receive top (1) *
        from dbo.qu_test1
        into @messages), timeout 3000;

    if @@error <> 0 begin
        rollback tran;
        break;
    end

    if (select count(*) from @messages) = 0 begin
        rollback tran;
        break;
    end

    -- Get the information that we need to process the message
    select @conversation_handle = conversation_handle,
           @msg_type_name = message_type_name,
           @conversation_group_id = conversation_group_id
    from @messages

    -- If an error occurred (on the other end), or the message is an end conversation message,
    -- end the conversation from our end
    if @msg_type_name in ('http://schemas.microsoft.com/SQL/ServiceBroker/EndDialog',
                          'http://schemas.microsoft.com/SQL/ServiceBroker/Error') begin
        end conversation @conversation_handle;
    end

    -- Stick the message data into our logging table
    begin try
        insert dbo.zztemp_messages
        select status, priority, queing_order, conversation_group_id, conversation_handle,
               message_sequence_number, service_name, service_id, service_contract_name,
               service_contract_id, message_type_name, message_type_id, validation, message_body, getdate()
        from @messages;

        -- Simulate think time for doing some work for half a second or so
        waitfor delay '000:00:00.5'
    end try
    begin catch
        select error_message() as err_msg, error_number() as err_number,
               error_line() as err_line, error_procedure() as err_proc;

        print 'Caught error ' + quotename(error_message());

        rollback transaction;
        break;
    end catch

    -- All done processing this message, commit now...
    commit transaction;

    delete @messages
end -- while 1 = 1
go

|||

Yes, you can.

The problem is the inner loop that iterates over messages for a given group until RECEIVE returns an empty result set. This kind of processing fools the activation machinery into believing that the activated proc is keeping up with the incoming rate of messages, because of the empty result set.

The solution is to receive ALL available messages for a group (no TOP clause), and then process the result set using a cursor:

use sbTest
go

alter procedure dbo.cp_test1
as
declare @conversation_group_id uniqueidentifier,
        @conversation_handle uniqueidentifier,
        @msg_type_name nvarchar(128),
        @queuing_order bigint;

declare @messages table (status tinyint, priority int, queing_order bigint, conversation_group_id uniqueidentifier,
    conversation_handle uniqueidentifier, message_sequence_number bigint, service_name nvarchar(512),
    service_id int, service_contract_name nvarchar(256), service_contract_id int, message_type_name nvarchar(256),
    message_type_id int, validation nchar(2), message_body varbinary(max))

-- declare a cursor for @messages
DECLARE crsMessages CURSOR
forward_only read_only
for
SELECT conversation_handle,
       message_type_name,
       queing_order
FROM @messages;

-- Loop until we manually break
while 1 = 1 begin
    -- Start a transaction to receive the appropriate group
    begin transaction;

    -- Get the next available conversation group...wait for a few seconds then loop...
    waitfor (
        get conversation group @conversation_group_id from dbo.qu_test1),
        timeout 3000;

    -- If we didn't get anything, break since we're all done and will be re-awoken automatically if needed
    if @conversation_group_id is null begin
        if @@trancount > 0
            rollback transaction;

        break;
    end;

    -- Got a group, so receive ALL the messages for said group at once (no TOP clause)...
    -- notice the where clause, which ensures we only get messages for this group...no need for a
    -- waitfor here, as calling get conversation group will lock that group (so no one else gets
    -- any messages for it), and we wouldn't even get to this point if there wasn't a message
    -- for this group...
    receive *
    from dbo.qu_test1
    into @messages
    where conversation_group_id = @conversation_group_id;

    if @@rowcount = 0 or @@error <> 0 begin
        -- If an error occurred, there are probably no more messages, or something happened
        -- in the receive; roll back the open transaction and bail
        rollback transaction;
        break;
    end

    -- open the cursor over @messages and process the result set one message at a time
    OPEN crsMessages;

    FETCH NEXT FROM crsMessages
    INTO @conversation_handle, @msg_type_name, @queuing_order;

    WHILE (@@FETCH_STATUS = 0)
    BEGIN
        -- If an error occurred (on the other end), or the message is an end conversation message,
        -- end the conversation from our end
        if @msg_type_name in ('http://schemas.microsoft.com/SQL/ServiceBroker/EndDialog',
                              'http://schemas.microsoft.com/SQL/ServiceBroker/Error') begin
            end conversation @conversation_handle;
        end

        -- Stick the message data into our logging table
        begin try
            insert dbo.zztemp_messages
            select status, priority, queing_order, conversation_group_id, conversation_handle,
                   message_sequence_number, service_name, service_id, service_contract_name,
                   service_contract_id, message_type_name, message_type_id, validation, message_body, getdate()
            from @messages
            WHERE queing_order = @queuing_order;

            -- Simulate think time for doing some work for half a second or so
            waitfor delay '000:00:00.5'
        end try
        begin catch
            select error_message() as err_msg, error_number() as err_number,
                   error_line() as err_line, error_procedure() as err_proc;

            print 'Caught error ' + quotename(error_message());

            rollback transaction;
            break;
        end catch

        FETCH NEXT FROM crsMessages
        INTO @conversation_handle, @msg_type_name, @queuing_order;
    END

    CLOSE crsMessages;

    delete @messages

    -- All done processing this conversation group, commit now...
    if @@trancount > 0
        commit transaction
end

DEALLOCATE crsMessages
go

HTH,
~ Remus

|||
Yep, that makes sense... thanks guys!

Monday, March 12, 2012

Paging using Web Services

We are developing a web application for which we have written a class to render reports that consumes the web services provided by Reporting Services. We invoke the "Render" method of the web service to display the report in an aspx page. Paging is achieved using the "Section" config of the HTMLDeviceInfo passed to the Render method.
This approach is working fine for the First Page, Previous Page and Next Page buttons. However, we are not able to implement code for the Last Page button because we are unable to retrieve the total number of pages through code.
The documentation states that if a value greater than the last page index is passed to the "Section" config, the web service will render the last page. This behavior is causing our code to crash.
Any idea how to overcome this problem?
Thanks in advance.
|||
I've described my approach a few months ago.
http://groups.google.com/group/microsoft.public.sqlserver.reportingsvcs/tree/browse_frm/thread/5a73412801f5ba54/f63ce6b85e448735?rnum=1&hl=en&q=%22Oleg+Yevteyev%22+pages&_done=%2Fgroup%2Fmicrosoft.public.sqlserver.reportingsvcs%2Fbrowse_frm%2Fthread%2F5a73412801f5ba54%2F41e4ed35916811eb%3Flnk%3Dst%26q%3D%22Oleg+Yevteyev%22+pages%26rnum%3D8%26hl%3Den%26#doc_ac075ac1383673e8
Hope that helps
Oleg Yevteyev,
San Diego, CA
It is OK to contact me with a contracting opportunity.
"myfirstname"001atgmaildotcom.
Replace "myfirstname" with Oleg.
--
"PVV" <PVV@.discussions.microsoft.com> wrote in message
news:327535A8-2C87-4B38-A60D-223B95D8AFD2@.microsoft.com...
> [quoted message trimmed]

Paging of Large Results Using Server Cursors

I'm writing an ASP.NET application that uses a SQL Server 2000 database. The application searches in large tables with 500,000+ records and then displays the search results; the search results could easily be 20,000 or 30,000 rows. Of course I need to use paging to show something like 10 or 20 results per page. Unfortunately ADO.NET doesn't support the paging functions that were found in ADO (like PageSize or AbsolutePosition) so I have to implement the paging myself.

I've read many articles that talk about how paging could be implemented. Most of them suggest doing the paging through SQL Server using server cursors. I know that cursors are resource intensive and should be avoided whenever possible, but it seems that this is the only solution that fits. I just want you to notice that the cursor will just loop through 20 or 30 entries, no more (the page size). So is this a problem?

I will be using code that looks similar to this:

--

DECLARE @PK /* PK Type */
DECLARE @tblPK TABLE (
    PK /* PK Type */ NOT NULL PRIMARY KEY
)

DECLARE PagingCursor CURSOR DYNAMIC READ_ONLY FOR
SELECT PK FROM Table ORDER BY SortColumn -- the key column itself, not the variable

OPEN PagingCursor
FETCH RELATIVE @StartRow FROM PagingCursor INTO @PK

WHILE @PageSize > 0 AND @@FETCH_STATUS = 0
BEGIN
    INSERT @tblPK(PK) VALUES(@PK)
    FETCH NEXT FROM PagingCursor INTO @PK
    SET @PageSize = @PageSize - 1
END

CLOSE PagingCursor
DEALLOCATE PagingCursor

SELECT ... FROM Table JOIN @tblPK temp ON Table.PK = temp.PK
ORDER BY SortColumn

I got this from the article http://codeproject.com/aspnet/PagingLarge.asp

Another method that uses RowCount was also suggested, but it doesn't work for some technical reasons discussed in the article above.

So what do you think - should I move on or what?

Regards,

Mohamed Salah

Please take a look at Aaron's article on this:

http://aspfaq.com/show.asp?id=2120
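
One cursor-free technique often suggested for SQL Server 2000 is SET ROWCOUNT anchor paging. The sketch below is illustrative only: the procedure, table, and column names are hypothetical, and it assumes the sort key (here the primary key) is unique. The poster above notes that a ROWCOUNT-based variant did not fit his case, so check it against the article before relying on it.
---
-- Minimal sketch of SET ROWCOUNT paging (hypothetical names).
-- Assumes dbo.BigTable has a unique int key PK that is also the sort key.
CREATE PROCEDURE dbo.GetPageByAnchor
    @StartRow int,   -- 1-based position of the first row of the page
    @PageSize int
AS
BEGIN
    DECLARE @AnchorPK int

    -- Scan forward to the first row of the page; after this statement the
    -- variable holds the key of row number @StartRow.
    SET ROWCOUNT @StartRow
    SELECT @AnchorPK = PK FROM dbo.BigTable ORDER BY PK

    -- Return one page of rows starting at the anchor.
    SET ROWCOUNT @PageSize
    SELECT PK /*, other columns */
    FROM dbo.BigTable
    WHERE PK >= @AnchorPK
    ORDER BY PK

    SET ROWCOUNT 0   -- always reset, or later statements stay capped
END
---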

Paging of Large Results Using Server Cursors

I'm writing an ASP.NET application that uses a SQL Server 2000 database. The
application searches large tables with 500,000+ records and then displays the
search results, which can easily be 20,000 or 30,000 rows. Of course I need
paging to show 10 or 20 results per page. Unfortunately, ADO.NET doesn't
support the paging features found in ADO (like PageSize or AbsolutePosition),
so I have to implement paging myself.
I've read many articles about how paging can be implemented. Most of them
suggest doing the paging in SQL Server using server cursors. I know that
cursors are resource-intensive and should be avoided whenever possible, but
this seems to be the only solution that fits. Note that the cursor will only
loop through 20 or 30 entries at most (the page size), so is this a problem?
I will be using code that looks similar to this:
---
DECLARE @PK /* PK Type */                  -- key fetched from the cursor
DECLARE @tblPK TABLE (
    PK /* PK Type */ NOT NULL PRIMARY KEY  -- keys for the requested page
)
DECLARE PagingCursor CURSOR DYNAMIC READ_ONLY FOR
SELECT PK FROM Table ORDER BY SortColumn
OPEN PagingCursor
-- Jump directly to the first row of the requested page
FETCH RELATIVE @StartRow FROM PagingCursor INTO @PK
-- Collect @PageSize keys, stopping early if we run out of rows
WHILE @PageSize > 0 AND @@FETCH_STATUS = 0
BEGIN
    INSERT @tblPK (PK) VALUES (@PK)
    FETCH NEXT FROM PagingCursor INTO @PK
    SET @PageSize = @PageSize - 1
END
CLOSE PagingCursor
DEALLOCATE PagingCursor
-- Join back to the base table to return the page in sort order
SELECT ... FROM Table JOIN @tblPK temp ON Table.PK = temp.PK
ORDER BY SortColumn
----
I got this from the article http://codeproject.com/aspnet/PagingLarge.asp
Another suggested method uses ROWCOUNT, but it doesn't work here for
technical reasons discussed in the article above.
So what do you think: should I go ahead with this approach, or look for
something else?
Regards,
Mohamed Salah

Let's assume you are using a stored procedure to page through a Customer
table sorted by LastName. All you need to do is maintain in session state
the LastName and CustomerID of the last row shown. This should be fast and
resource-efficient. For example:
select top 20
    LastName,
    FirstName,
    PhoneNumber
from
    Customer
where
    -- seek past the last row of the previous page; the OR handles
    -- customers who share the previous page's last LastName
    LastName > @PrevLastName
    or (LastName = @PrevLastName and CustomerID > @PrevCustomerID)
order by
    LastName,
    CustomerID
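
For completeness, here is a minimal sketch of that keyset approach wrapped in a stored procedure. The table and column names follow the example above; the procedure name and parameter sizes are hypothetical, and it assumes CustomerID is unique and (LastName, CustomerID) is indexed.
---
-- Hypothetical wrapper around the keyset (seek) query above.
CREATE PROCEDURE dbo.GetNextCustomerPage
    @PrevLastName   varchar(50),  -- last LastName shown on the previous page
    @PrevCustomerID int           -- its CustomerID, used to break ties
AS
BEGIN
    SET NOCOUNT ON

    SELECT TOP 20
        LastName,
        FirstName,
        PhoneNumber,
        CustomerID    -- returned so the caller can remember the new anchor
    FROM Customer
    WHERE LastName > @PrevLastName
       OR (LastName = @PrevLastName AND CustomerID > @PrevCustomerID)
    ORDER BY LastName, CustomerID
END
---
The application keeps the LastName and CustomerID of the last row returned in session state and passes them back for the next page; for the first page, pass an empty string and 0.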
"Mohamed Salah" <MohamedSalah@.discussions.microsoft.com> wrote in message
news:1D026C27-4BA2-43F8-B1CF-053D10B127D5@.microsoft.com...
> I'm writing an ASP.NET application that uses a SQL Server 2000 database.
> The
> application searches in large tables with 500, 000+ Records and then
> displays
> the search results, the search results could be easily 20,000 or 30,000
> results. Ofcourse i need to use paging to show like 10 or 20 results per
> page. Unfortunetly ADO.NET doesn't support the paging functions that were
> found in ADO (like PageSize or AbsolutePosition) so i have to implement
> the
> paging myself.
> I've read many articles that talk about how paging could be implemented.
> Most of them suggest doing the paging through SQL Server using Server
> Cursors. I know that cursors are resource intensive and should be avoided
> whenever possible but it seems that this is the only solution that fits. I
> just want you to notice that the cursor will just loop through 20 or 30
> entries no more (Page Size) So is this a problem?
> I will be using code that looks similar to this:
> ---
> DECLARE @.PK /* PK Type */
> DECLARE @.tblPK TABLE (
> PK /* PK Type */ NOT NULL PRIMARY KEY
> )
> DECLARE PagingCursor CURSOR DYNAMIC READ_ONLY FOR
> SELECT @.PK FROM Table ORDER BY SortColumn
> OPEN PagingCursor
> FETCH RELATIVE @.StartRow FROM PagingCursor INTO @.PK
> WHILE @.PageSize > 0 AND @.@.FETCH_STATUS = 0
> BEGIN
> INSERT @.tblPK(PK) VALUES(@.PK)
> FETCH NEXT FROM PagingCursor INTO @.PK
> SET @.PageSize = @.PageSize - 1
> END
> CLOSE PagingCursor
> DEALLOCATE PagingCursor
> SELECT ... FROM Table JOIN @.tblPK temp ON Table.PK = temp.PK
> ORDER BY SortColumn
> ----
> I got this from the article http://codeproject.com/aspnet/PagingLarge.asp
> Another method was suggested also that uses RowCount but it doesn't work
> for
> some technical reasons discussed in the article above.
> So what do you think should i move on or what?
> Regards,
> Mohamed Salah
>

Wednesday, March 7, 2012

Pagination for Large data

I want to build a system that will have about 1 million rows in a
table in a SQL Server database. I am using this for a web application,
accessing it via a JDBC type 4 driver, but displaying only 20 records at a
time using pagination (as in Google). What will be the best way to go
about this?

On 10 Oct 2003 04:04:11 -0700, nik_sharma75@.hotmail.com (Nikhil
Sharma) wrote:

>I want to build a system that will have about 1 million rows in a
>table in sql server database.I am using this for a web application and
>accessing it via JDBC type 4 driver.But display 20 records at a time
>only using pagination(as in google).What will be the best way to go
>about this.

Figure out how many pages anyone is actually ever going to page through.
Obviously no one is going to page through to the end; how many Google
pages would you look at before refining your search? Then construct
your query and restrict it with TOP, cache the results, and handle the
paging in your app.
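
As a rough sketch of that advice (all names hypothetical), the query below caps the result set at 500 rows, i.e. 25 pages of 20, on the assumption that nobody pages further; the application caches these rows and slices them into pages.
---
-- Hypothetical example: cap the result at 500 rows (25 pages of 20).
DECLARE @Search varchar(50)
SET @Search = 'widget'   -- illustrative search term

SELECT TOP 500
    ItemID,
    Title,
    CreatedDate
FROM dbo.Items
WHERE Title LIKE @Search + '%'
ORDER BY CreatedDate DESC
---
Each search then costs a single round trip, and page n is served from the cached rows.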

Saturday, February 25, 2012

Page splits/ Dirty pages/ Checkpoint

We have a few tables in our application that are basically used as temporary
tables. Each day they start with no records (empty); as the day goes on they
are filled with data, and at the end of the day they are truncated to get
ready for the next day. We have a high transaction rate, about 4000/sec. The
page splits/sec counter is showing about 130-160 per second, and this is
driving the checkpoint to take longer.
How can I reduce these page splits?
Another question: we have a char(15) column in those tables, and that column
is indexed. It is an ID column but is not unique. Each record has a unique
numeric ID (generated by our app), but we need to search on the char ID
column, so we indexed it. This index is dirtying a lot of pages, which also
contributes to the checkpoint taking longer. How can I change the index so
that it does not dirty so many pages? I tried changing it to varchar and
there is not much difference.
The checkpoint takes about 10-15 seconds and repeats every 60 seconds.
Your suggestion is greatly appreciated.
Thanks.

Just try the following:
1. Check the recovery interval option on the system (see the sketch below).
2. Make sure the temporary tables have fixed-size rows by using char rather
than varchar; that way you can reduce the page splits.
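(For reference, a minimal sketch of checking and adjusting that option with sp_configure; the 5-minute value is purely illustrative.)
---
-- 'recovery interval' is an advanced option, so expose it first.
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
GO
EXEC sp_configure 'recovery interval'     -- 0 = SQL Server manages it (default)
EXEC sp_configure 'recovery interval', 5  -- illustrative: target roughly 5 minutes
RECONFIGURE WITH OVERRIDE
GO
---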
HTH
Regards
Rajesh Peddireddy.
"Srini" wrote:

> We have few tables in our application. Basically they are used as temporary
> tables. Each day they start with no records in it(empty) and as the day goes
> they are filled with the data and at the end of the day they will be
> truncated to get ready for the next day. We have a high transaction rate
> about 4000/sec. When I noticed the page splits /sec counter it is showing
> about 130-160 per second. This is driving the checkpoint to take longer time.
> How Can I reduce this high page splits.
> Another question is, we have a char(15) column in those tables and that
> column is indexed. It is an Id column but is not unique. Each record has an
> unique number Id(generated by our app). But we need to seacrh on the CHAR Id
> column so indexed on it. This index is creating/making lot of dirty pages.
> This also is a contributing reason for the checkpoint to take longer. How can
> I make changes to the index so that it would not create/make many pages
> dirty? I tried to change it to VARCHAR and there is not much difference.
> The check point is taking about 10-15 seconds and it repeats every 60 seconds.
> Your suggestion is greatly appreciated.
> Thanks.
> Thanks.

It would really help to show the entire DDL for the table, including the
indexes. It sounds like your disk subsystem isn't up to the task. If you are
going to have that many transactions you need a fast disk I/O subsystem,
especially for the transaction logs. Is the log file on its own RAID 1 or
RAID 10, and is the data on a RAID 10?
Andrew J. Kelly SQL MVP
"Srini" <Srini@.discussions.microsoft.com> wrote in message
news:1F6FF490-C19A-4DFE-BDBC-3AD397CC9D5A@.microsoft.com...
> We have few tables in our application. Basically they are used as
> temporary
> tables. Each day they start with no records in it(empty) and as the day
> goes
> they are filled with the data and at the end of the day they will be
> truncated to get ready for the next day. We have a high transaction rate
> about 4000/sec. When I noticed the page splits /sec counter it is showing
> about 130-160 per second. This is driving the checkpoint to take longer
> time.
> How Can I reduce this high page splits.
> Another question is, we have a char(15) column in those tables and that
> column is indexed. It is an Id column but is not unique. Each record has
> an
> unique number Id(generated by our app). But we need to seacrh on the CHAR
> Id
> column so indexed on it. This index is creating/making lot of dirty pages.
> This also is a contributing reason for the checkpoint to take longer. How
> can
> I make changes to the index so that it would not create/make many pages
> dirty? I tried to change it to VARCHAR and there is not much difference.
> The check point is taking about 10-15 seconds and it repeats every 60
> seconds.
> Your suggestion is greatly appreciated.
> Thanks.
> Thanks.

We have a SAN disk system. Probably the hardware is good enough; I'm just
trying to see if I can rearrange some things on the database front to make
some improvement.
Coming to the DDL: the tables are not temporary tables; the data is temporary
in the sense that it is kept only for the current day and truncated at the
end of the day. Each table has about 20 columns: some decimal, some datetime,
and the rest integer and char, including many char(1) columns and two or
three char(15 to 20) columns. One of the char(15) columns is indexed; it is a
kind of ID but is not unique. There are no relationships between these tables
and other tables, and no triggers, views, or anything like that. The data
comes into the system and gets inserted into these standalone tables using
stored procedures, and the tables are queried using other stored procedures.
The major problem looks to me like the page splits being generated and the
dirty pages (about 10,000 of them). Is there anything that can be done on the
table, or anything else, to make things perform better?
Thanks in advance for your suggestion.
"Andrew J. Kelly" wrote:

> It would really help to show the entire DDL for the table including the
> indexes. It sounds like your disk subsystem isn't up to the task. If you
> are going to have that many transactions you need a fast disk I/O subsystem,
> especially for the transaction logs. Is the log file on it's own RAID 1 or
> RAID 10 and is the data on a RAID 10?
> --
> Andrew J. Kelly SQL MVP
>
> "Srini" <Srini@.discussions.microsoft.com> wrote in message
> news:1F6FF490-C19A-4DFE-BDBC-3AD397CC9D5A@.microsoft.com...
>
>

Srini wrote:
> We have SAN disk system. Probably the HW is good enough, just trying
> to see if I can rearrange some things on the database front to make
> some improvement.
> Coming to the DDL, the tables are not temporary tables the data is
> temporary in the sense that the data is kept only for the current day
> and at the end of the day they are truncated. Each table has about 20
> columns. some are decimal fields some are datatime columns and others
> are integer and char type which includes many char(1)'s and two/three
> columns char(15 to 20)). One of the char(15) is indexed which is a
> kind of Id but is not unique there are no relationships between these
> tables and other tables. No triggers no views and anything as such.
> The data comes into to the system gets inserted to these standalone
> tables using some stored procedures. And these tables are queried
> using some other stored procedures. The major problem to me looks
> like is because of the page splits that it is generating and the
> dirty pages that it is generating(about 10000 dirty pages). Is there
> any thing that can be done on the table or anything else to make
> things perform better?
>
The reason it would help to see the DDL is that page splits are a result of
a clustered index and inserts that are not in clustered index order. You can
eliminate the page splitting by changing the clustered index to nonclustered
or by inserting the data in clustered index key order (if that's possible).
David Gugick
Quest Software
www.imceda.com
www.quest.com

Just because it is a SAN does not mean it is sufficient or configured
properly for your application. I run across SAN-related issues more often
simply because people tend to ignore the configuration, thinking it can
handle whatever they need. The DDL was to see what we are dealing with and
leave nothing to the imagination. It only takes a second to script the table
and indexes, but it goes a long way towards letting us see what is actually
there, not just what you may think is relevant. This is especially true for
the indexes.
Andrew J. Kelly SQL MVP
"Srini" <Srini@.discussions.microsoft.com> wrote in message
news:623BAACD-7754-4BF4-B9ED-5AA2EDC16F8B@.microsoft.com...
> We have SAN disk system. Probably the HW is good enough, just trying to
> see
> if I can rearrange some things on the database front to make some
> improvement.
> Coming to the DDL, the tables are not temporary tables the data is
> temporary
> in the sense that the data is kept only for the current day and at the end
> of
> the day they are truncated. Each table has about 20 columns. some are
> decimal
> fields some are datatime columns and others are integer and char type
> which
> includes many char(1)'s and two/three columns char(15 to 20)). One of the
> char(15) is indexed which is a kind of Id but is not unique there are no
> relationships between these tables and other tables. No triggers no views
> and
> anything as such. The data comes into to the system gets inserted to these
> standalone tables using some stored procedures. And these tables are
> queried
> using some other stored procedures. The major problem to me looks like is
> because of the page splits that it is generating and the dirty pages that
> it
> is generating(about 10000 dirty pages). Is there any thing that can be
> done
> on the table or anything else to make things perform better?
> Thanks in advance for your suggestion.
> "Andrew J. Kelly" wrote:
>

David is correct, but I just want to caution that changing the clustered
index to a nonclustered one will not remove page splits. It may reduce them,
but a nonclustered index is implemented just like a clustered index and can
page split as well.
Andrew J. Kelly SQL MVP
"David Gugick" <david.gugick-nospam@.quest.com> wrote in message
news:eC3cUzslFHA.3316@.TK2MSFTNGP14.phx.gbl...
> Srini wrote:
> The reason it would help to see the DDL is because page splits are a
> result of a clustered index and inserts that are not in clustered index
> order. You can eliminate the page splitting by chaning the clustered index
> to non-clustered or inserting the data in clustered index key order (if
> that's possible).
> --
> David Gugick
> Quest Software
> www.imceda.com
> www.quest.com

If we try to insert the data in clustered index order, which sounds to me
like a monotonically increasing clustered index, wouldn't it create hot
spots on the disk, thereby reducing the throughput? Currently we can't
control the data insert order, but I think I can change things so that I
create the clustered index on the serial number, which is sequential and
generated by me, so the inserts will be in clustered index order.
How about the dirty pages created by the other nonclustered indexes? When I
run DBCC MEMUSAGE it shows a lot of dirty pages related to the nonclustered
indexes. How can I reduce the dirty pages on those? I understand FILLFACTOR
will not help here, as that option only matters once there is data in the
table and we are creating indexes on it.
Thanks.
"David Gugick" wrote:

> Srini wrote:
> The reason it would help to see the DDL is because page splits are a
> result of a clustered index and inserts that are not in clustered index
> order. You can eliminate the page splitting by chaning the clustered
> index to non-clustered or inserting the data in clustered index key
> order (if that's possible).
> --
> David Gugick
> Quest Software
> www.imceda.com
> www.quest.com
>

Is there any limit on the number of replies one can post in a given time
frame? This thing is not letting me post mine. Trying to post again...
I agree the SAN may have some issues; we are trying to check on that. But
just looking at the SQL Server side, 10,000 dirty pages per checkpoint with
the default recovery interval (60 seconds) suggests something can be done
there to reduce that huge number of dirty pages.
DDL looks like this:
SET ANSI_PADDING ON
GO
CREATE TABLE [dbo].[DAILY_DATA1] (
    [Event_id] [char] (18) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL ,
    -- Unique identifier for the data; the clustered index is currently on this field
    [Category_cd] [char] (1) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL ,
    [Type_cd] [char] (1) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL ,
    [Session_id] [char] (1) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL ,
    [Serial_nb] [int] NOT NULL ,
    -- App-generated serial number, which I can use to create the clustered index
    [Order_ts] [datetime] NOT NULL ,
    [Data_Id_tx] [char] (14) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL ,
    -- This is the column that we have the index on
    -- ...
    [Description_tx] [char] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
    -- ...
    [Receipt_ts] [datetime] NOT NULL ,
    CONSTRAINT [PK_DD_1_Evt_ID] PRIMARY KEY CLUSTERED
    (
        [Event_id]
    ) ON [PRIMARY]
) ON [PRIMARY]
GO
CREATE INDEX [IDX_DD_1_Data_Id_tx] ON [dbo].[Daily_Data1] ([Data_Id_tx])
WITH FILLFACTOR = 90 ON [PRIMARY]
GO
Thanks.
"Andrew J. Kelly" wrote:

> Just because it is a SAN does not mean it is sufficient or configured
> properly for your application. I run across more issues SAN related simply
> because people tend to ignore the configuration in thinking it can handle
> what ever they need. The DDL was to see what we are dealing with and leave
> nothing to imagination. It only takes a second to script the table and
> indexes but it goes a long way towards letting us see what is actually
> there. Not just what you may think is relevant. This is especially true for
> the indexes.
> --
> Andrew J. Kelly SQL MVP
>
> "Srini" <Srini@.discussions.microsoft.com> wrote in message
> news:623BAACD-7754-4BF4-B9ED-5AA2EDC16F8B@.microsoft.com...
>
>

I replied to this in the morning but it did not get posted...
The recovery interval option is set to the default, which is 60 seconds. If
I increase it, the checkpoint takes way too long; if I decrease it,
checkpoints occur too frequently. Both are problematic.
The tables are not temporary, but the data is. Data gets inserted as part of
the daily operations and is truncated in the evening. All the columns are
fixed-length CHAR fields (at their maximum possible lengths). There are only
two or three columns of CHAR(15), CHAR(18), and CHAR(10); all other columns
are integer, decimal, datetime, or CHAR(1).
I need to find a way to reduce the number of dirty pages and the number of
page splits. How can I do that?
Thanks.
"Rajesh" wrote:
> Just give a try with the following info
> 1. Check the recovery interval option on the system.
> 2. make sure that the temporary tables have fixed size by using char
> rather than
> varchar therefore you can reduce the page spilts.
> HTH
> Regards
> Rajesh Peddireddy.
> "Srini" wrote:
>

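To make the suggestion in this thread concrete, here is a minimal sketch of moving the clustered index from the random Event_id to the sequential Serial_nb, so inserts append to the end of the table instead of splitting pages. The constraint name follows the DDL above; the new index name is hypothetical, and it should run while the table is empty (e.g., right after the daily truncate).
---
-- Sketch only: re-cluster DAILY_DATA1 on the app-generated serial number.

-- Drop the existing clustered primary key on Event_id.
ALTER TABLE [dbo].[DAILY_DATA1] DROP CONSTRAINT [PK_DD_1_Evt_ID]
GO

-- Cluster on the monotonically increasing serial number so new rows append.
CREATE CLUSTERED INDEX [IX_DD_1_Serial_nb] ON [dbo].[DAILY_DATA1] ([Serial_nb])
GO

-- Keep Event_id unique, but enforced by a nonclustered primary key.
ALTER TABLE [dbo].[DAILY_DATA1]
    ADD CONSTRAINT [PK_DD_1_Evt_ID] PRIMARY KEY NONCLUSTERED ([Event_id])
GO
---
As Andrew cautions above, the nonclustered index on Data_Id_tx can still split its own pages, so this addresses only the clustered-index splits.
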
Monday, February 20, 2012

Page Setup button in Report Viewer object

I have developed a Windows Forms application with a ReportViewer control
that renders an .rdlc report.
I have a problem with the Page Setup button (the button for setting the page
layout). When I click it, the page type is in Letter format, and I must
always change it to A4 format before printing the report. This happens only
on some PCs, and unfortunately it happens on the client's PC (in the printer
driver the page layout is set to A4). How can I resolve this problem?
Another problem is setting the margins in the Page Setup dialog.
If I click the Page Setup button, the left and right margins are not what I
set in the rdlc file at design time (0.5 cm); they are 2 mm.
If I click OK on the Page Setup form, the report margins change to the
margins from Page Setup.
If I open Page Setup again, the margin settings are now less than previously
set, and clicking OK changes the report margins again. The margins go from 2
to 0.8, from 0.8 to 0.3, from 0.3 to 0.1, then from 0.1 to 0.
I have Visual Studio 2005 Service Pack 1.

Are you doing the 'ChangeService' voodoo that needs to be done to save the
report?
Check 'www.gotreportviewer.com' for an example of what you are doing.
I am no expert at SSRS myself; I just started my fight with it last week, so...
- Tanmay
"Marco82bg" wrote:
> I have develop a Windows Form Application that have a Report Viewer object
> that rendering a report .rdlc.
> I have a problem with the PageSetup Button. The button for set the layout of
> the page. If I click that button the type of the page is on Letter format and
> i must always set the margin on A4 format of the page before print the
> report. But that appends in someone PC...unfortunately it appends on the PC
> of the client (in the driver of the printer the page layout is set to A4).
> How I can resolve that problem?
> Another problem is the set of the margin in PageSetup Button.
> If I click on 'Page setup' button the margins left & right are not what I
> set on the rdlc file at design time (0,5cm) they are 2 mm
> If I click OK on the 'Page Setup' form the Report margins change to the
> margins from 'Page setup'
> If I open ''Page setup' again the margin settings are now less than
> previously set and clicking OK results in the report margins changing again
> The margin change from 2 to 0,8 -> from 0,8 to 0,3 -> from 0,3 to 0,1 ->
> from 0,1 to 0.
> I have Visual Studio 2005 service pack 1

Page numbering - X of Y format

I have a Reporting Services application which generates reports for given IDs. You can pass any number of IDs.
E.g., if you pass 2 IDs it will generate a single report for those 2 IDs.
The page numbering would look like
1, 2, 3, 1
since for ID 1 it generated 3 pages, while for ID 2 it generated only 1 page.

But I want the page numbering to be 1 of 3, 2 of 3, 3 of 3, 1 of 2, 2 of 2.

The only parameter I pass to generate the report is the ID list, as a string; the SP I use to get the data splits the string and gets the relevant IDs...

Any help to achieve this would be great.

thanks

If you don't need the "of Y pages" part, look at:
http://blogs.msdn.com/bwelcker/archive/2005/05/19/420046.aspx

Hi, thanks for the reply, but I do want the "of Y pages" part :(
