
Wednesday, March 21, 2012

Read CSV file - Save Columns into Rows

I want to import a CSV file and convert columns into rows depending on the customer count (the 2nd field in each row of the CSV file), then save the result to a SQL table.

--CSV file format
State, Customer_Count, Name_1, Total_1, Name_2, Total_2, Name_3, Total_3 ... can go up to 350

GA,2,'John Doe',14.00,'Roger Smith',15.00
FL,3,'John Doe',14.00,'Roger Smith',15.00,'Sally Cox',16.00
SC,5,'John Doe',14.00,'Roger Smith',15.00,'Sally Cox',16.00,'James Brown',17.00,'Rick Davis',18.00

Data in SQL table from csv file should look like this

State,Name,Total
GA,John Doe,14.00
GA,Roger Smith,15.00
FL,John Doe,14.00
FL,Roger Smith,15.00
FL,Sally Cox,16.00

I have multiple CSV files with millions of records. How can I achieve this using Integration Services or bulk data import?

Amar:

I think for what you are describing I would use SSIS; if you are running SQL Server 2000, that would be DTS instead of SSIS. Other alternatives include OPENROWSET, BULK INSERT, or BCP. Read about these alternatives in Books Online and choose the one that you think fits best.


Dave

|||

Hi Amar,

Refer to http://blogs.msdn.com/euanga/archive/2006/07/20/672272.aspx as well.

Hemantgiri S. Goswami


Read CSV file - Save Columns into Rows

I want to import CSV file and convert columns into rows depending on Customer count (2nd record in each row of CSV file) and save to SQL table.
--CSV file format
State, Customer_Count, Name_1, Total_1, Name_2, Total_2, Name_3, Total_3 ... can go up to 350
GA,2,'John Doe',14.00,'Roger Smith',15.00
FL,3,'John Doe',14.00,'Roger Smith',15.00,'Sally Cox',16.00
SC,5,'John Doe',14.00,'Roger Smith',15.00,'Sally Cox',16.00,'James Brown',17.00,'Rick Davis',18.00
Data in SQL table from csv file should look like this
State,Name,Total
GA,John Doe,14.00
GA,Roger Smith,15.00
FL,John Doe,14.00
FL,Roger Smith,15.00
FL,Sally Cox,16.00
I have multiple CSV files with millions of records. What is the best and
fastest way to achieve this?
Throw me anything.
Hi
"pintoo" wrote:

> I want to import CSV file and convert columns into rows depending on
> Customer count(2nd record in each row of CSV file) and save to SQL
> table
> --CSV file format
> State, Customer_Count, Name_1, Total_1,Name_2, Total_2,Name_3,
> Total_3..can go upto 350
> GA,2,'John Doe',14.00,'Roger Smith',15.00
> FL,3,'John Doe',14.00,'Roger Smith',15.00,'Sally Cox',16.00
> SC,5,'John Doe',14.00,'Roger Smith',15.00,'Sally Cox',16.00,'James
> Brown',17.00,'Rick Davis',18.00
> Data in SQL table from csv file should look like this
> State,Name,Total
> GA,John Doe,14.00
> GA,Roger Smith,15.00
> FL,John Doe,14.00,
> FL,Roger Smith,15.00
> FL,Sally Cox,16.00
>
If you are moving these into rows the data is not normalized! You will only
be able to have a finite number of rows that can be converted, therefore you
could self-join if each row is allocated a sequence number, or use the method
described by Erland in
http://groups-beta.google.com/group/comp.databases.ms-sqlserver/msg/250f0c68596ce22e?&hl=en

> I have multiple CSV files with millions of records.What is the best and
> fastest way to achieve this?
> Throw me anything
>
If you are using SQL 2005 you could use the UNPIVOT operator.
John

Read CSV file - Save Columns into Rows

I want to import CSV file and convert columns into rows depending on Customer count(2nd record in each row of CSV file) and save to SQL table

--CSV file format
State, Customer_Count, Name_1, Total_1, Name_2, Total_2, Name_3, Total_3 ... can go up to 350

GA,2,'John Doe',14.00,'Roger Smith',15.00
FL,3,'John Doe',14.00,'Roger Smith',15.00,'Sally Cox',16.00
SC,5,'John Doe',14.00,'Roger Smith',15.00,'Sally Cox',16.00,'James Brown',17.00,'Rick Davis',18.00

Data in SQL table from csv file should look like this

State,Name,Total
GA,John Doe,14.00
GA,Roger Smith,15.00
FL,John Doe,14.00
FL,Roger Smith,15.00
FL,Sally Cox,16.00

I have multiple CSV files with millions of records. How can I achieve this using Integration Services or bulk data import?

Perhaps you should look at creating a VB/C# application to handle this situation.|||
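A small script in any language could do this reshaping before (or instead of) a bulk load. As a rough sketch only, in Python rather than VB/C#, and resting on the assumptions stated in the question (the 2nd field holds the pair count, names are wrapped in single quotes; the function name here is made up):

```python
import csv
from io import StringIO

def unpivot_csv(lines):
    """Yield (state, name, total) tuples from variable-width CSV rows.

    Each input row is: State, Customer_Count, Name_1, Total_1, Name_2, Total_2, ...
    The 2nd field says how many (name, total) pairs follow.
    """
    reader = csv.reader(lines, quotechar="'")
    for row in reader:
        state = row[0].strip()
        count = int(row[1])
        for i in range(count):
            # Pairs start at index 2: names at even offsets, totals at odd.
            name = row[2 + 2 * i].strip().strip("'")
            total = float(row[3 + 2 * i])
            yield (state, name, total)

sample = StringIO(
    "GA,2,'John Doe',14.00,'Roger Smith',15.00\n"
    "FL,3,'John Doe',14.00,'Roger Smith',15.00,'Sally Cox',16.00\n"
)
rows = list(unpivot_csv(sample))
# rows[0] == ('GA', 'John Doe', 14.0)
```

The resulting (State, Name, Total) tuples could then be batch-inserted into the target table with whatever bulk API the chosen language offers.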

I am trying to do something very similar to what you are asking. I found the Import Wizard by right-clicking on the database itself, under the "Tasks" option. I just got an administrator's handbook which is somewhat helpful. First, it seems you have to import the column names: there is a check box on the screen you need to select to get the column names to appear, which helps with the data import too. Then go back from the top and do the same thing to get the data. You need to arrive at the copy-or-query screen and choose the option to write a query. For me, however, when I go back for the second pass it doesn't seem to let me get to that screen again, though I saw it on the initial pass. Let me know if this worked for you.

|||

Once you have imported the rows into a temp table, you might then need to run a self-join (here, a cross join of TableA with itself) to extract the required columns, like:

INSERT INTO TableB (column1, column2, column3)
SELECT A.column1, B.column1, C.column1
FROM (SELECT column1 FROM TableA WHERE column1 LIKE 'A%') A
CROSS JOIN (SELECT column1 FROM TableA WHERE column1 LIKE 'B%') B
CROSS JOIN (SELECT column1 FROM TableA WHERE column1 LIKE 'C%') C
ORDER BY A.column1, B.column1, C.column1
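For intuition, a CROSS JOIN is a Cartesian product: every row of A is paired with every row of B and every row of C. A tiny sketch with made-up stand-in values for the three filtered subqueries:

```python
from itertools import product

# Hypothetical stand-ins for the three filtered subqueries on TableA.
a = ['A1', 'A2']   # SELECT column1 ... LIKE 'A%'
b = ['B1']         # SELECT column1 ... LIKE 'B%'
c = ['C1', 'C2']   # SELECT column1 ... LIKE 'C%'

# CROSS JOIN = Cartesian product: one row from each source, all combinations.
combos = sorted(product(a, b, c))
# 2 * 1 * 2 = 4 combinations
```

Note that row counts multiply, so this approach gets expensive quickly on large temp tables.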


Read CSV file - Save Columns into Rows

I want to import CSV file and convert columns into rows depending on Customer count(2nd record in each row of CSV file) and save to SQL table

--CSV file format
State, Customer_Count, Name_1, Total_1, Name_2, Total_2, Name_3, Total_3 ... can go up to 600

GA,2,'John Doe',14.00,'Roger Smith',15.00
FL,3,'John Doe',14.00,'Roger Smith',15.00,'Sally Cox',16.00
SC,5,'John Doe',14.00,'Roger Smith',15.00,'Sally Cox',16.00,'James Brown',17.00,'Rick Davis',18.00

Data in SQL table from csv file should look like this

State,Name,Total
GA,John Doe,14.00
GA,Roger Smith,15.00
FL,John Doe,14.00
FL,Roger Smith,15.00
FL,Sally Cox,16.00

I have multiple CSV files with millions of records. How can I achieve this using Integration Services or bulk data import?

You can use something like this:

Create Table #DataIntoRows
(
    State Varchar(10),
    Name Varchar(100),
    Total Decimal(10,2)  -- totals like 14.00 need a decimal type, not Int
)

SELECT * INTO #Data FROM OPENROWSET('MSDASQL','Driver={Microsoft Text Driver (*.txt; *.csv)};DefaultDir={PATH}','SELECT * FROM Data.csv')

Declare @I as Int;
Declare @Count as Int;

-- Count the Name_n/Total_n columns (everything after State and Customer_Count)
Select
    @I = 1,
    @Count = Count(colid)
From tempdb..syscolumns
Where id = Object_Id('tempdb..#Data') and colid > 2;

-- Walk the columns two at a time: one Name/Total pair per pass
While @I < @Count
Begin
    Declare @SQL as Varchar(300);
    Select @SQL = 'Insert Into #DataIntoRows Select State,'
    Select @SQL = @SQL + name From tempdb..syscolumns Where id = Object_Id('tempdb..#Data') and colid = @I + 2
    Select @SQL = @SQL + ',' + name + ' From #Data Where ' + name + ' Is NOT NULL' From tempdb..syscolumns Where id = Object_Id('tempdb..#Data') and colid = @I + 3
    Exec(@SQL)
    Select @I = @I + 2;
End

Read CSV file - Save Columns into Rows

I want to import CSV file and convert columns into rows depending on Customer count(2nd record in each row of CSV file) and save to SQL table

--CSV file format
State, Customer_Count, Name_1, Total_1, Name_2, Total_2, Name_3, Total_3 ... can go up to 600

GA,2,'John Doe',14.00,'Roger Smith',15.00
FL,3,'John Doe',14.00,'Roger Smith',15.00,'Sally Cox',16.00
SC,5,'John Doe',14.00,'Roger Smith',15.00,'Sally Cox',16.00,'James Brown',17.00,'Rick Davis',18.00

Data in SQL table from csv file should look like this

State,Name,Total
GA,John Doe,14.00
GA,Roger Smith,15.00
FL,John Doe,14.00
FL,Roger Smith,15.00
FL,Sally Cox,16.00

I have multiple CSV files with millions of records. How can I achieve this using Integration Services or bulk data import?

See if my post in this thread helps get you started. I know it's not quite the same, but it may be a good foundation for you.

http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=880588&SiteID=1

Phil|||

Thanks Phil for your reply

I am new to SSIS. Can you please provide some more detail?

|||You want to use the Unpivot transform to accomplish this. It does a fantastic job of flipping columns over to rows and purging NULLs for you. Let me know if you need more info, but there are some good blog posts out there on it.|||

Brian Knight wrote:

You want to use the Unpivot transform to accomplish this. It does a fantastic job of flipping columns over to rows and purging NULLs for you. Let me know if you need more info, but there are some good blog posts out there on it.

I don't believe the unpivot will work because there can be a variable number of columns to unpivot. If it was a set number of columns, then this would work, however from row to row the number of columns changes.|||

Phil Brammer wrote:

Brian Knight wrote:

You want to use the Unpivot transform to accomplish this. It does a fantastic job of flipping columns over to rows and purging NULLs for you. Let me know if you need more info, but there are some good blog posts out there on it.

I don't believe the unpivot will work because there can be a variable number of columns to unpivot. If it was a set number of columns, then this would work, however from row to row the number of columns changes.

I think it should be OK. If the original poster populates those columns which don't have a value with NULL, then they won't get unpivoted.

If unpivot doesn't do the job let us know, there are alternatives using SORT and UNION ALL.

-Jamie

|||

Jamie Thomson wrote:


I think it should be OK. If the original poster populates those columns which don't have a value with NULL, then they won't get unpivoted.

If unpivot doesn't do the job let us know, there are alternatives using SORT and UNION ALL.

-Jamie

My guess is that the OP can't control the file. That seems to usually be the case. The OP also states that the number of columns can go up to 600. Wow.

So for his data:
GA,2,'John Doe',14.00,'Roger Smith',15.00
FL,3,'John Doe',14.00,'Roger Smith',15.00,'Sally Cox',16.00
SC,5,'John Doe',14.00,'Roger Smith',15.00,'Sally Cox',16.00,'James Brown',17.00,'Rick Davis',18.00

Row 1 has 6 columns. Row 2 has 8 columns. Row 3 has 12 columns.

Will the SSIS flat file connector work to correctly set the number of columns if the file is 10,000 rows long with the longest record (say 50 columns) is at the end when all of the other records were under 40 columns?|||

Phil Brammer wrote:

Jamie Thomson wrote:


I think it should be OK. If the original poster populates those columns which don't have a value with NULL, then they won't get unpivoted.

If unpivot doesn't do the job let us know, there are alternatives using SORT and UNION ALL.

-Jamie

My guess is that the OP can't control the file. That seems to usually be the case. The OP also states that the number of columns can go up to 600. Wow.

So for his data:
GA,2,'John Doe',14.00,'Roger Smith',15.00
FL,3,'John Doe',14.00,'Roger Smith',15.00,'Sally Cox',16.00
SC,5,'John Doe',14.00,'Roger Smith',15.00,'Sally Cox',16.00,'James Brown',17.00,'Rick Davis',18.00

Row 1 has 6 columns. Row 2 has 8 columns. Row 3 has 12 columns.

Will the SSIS flat file connector work to correctly set the number of columns if the file is 10,000 rows long with the longest record (say 50 columns) is at the end when all of the other records were under 40 columns?

No, it won't. But then the complexity here is in the source file - there's no getting around that. The original poster will just have to accept that he has a ridiculously complex source file and bite the bullet. There are still lots of ways to achieve this.

-Jamie

|||

Can you guys please point me to any examples or blogs?

|||

Amar Khaira wrote:

Can you please point me to any examples or blogs?

One way is to search the forums for examples. On the main page of this forum is a search box on the left-hand side of the screen. Type in unpivot and you'll get a few examples/topics that discuss that transformation.|||

Here's a great blog post: http://sqljunkies.com/WebLog/ashvinis/archive/2005/03.aspx

Send me an email and I'll send you some example code, and I'll eventually do a video on JumpstartTV.com. The variable number of columns is not a problem though. bknight<at>jumpstarttv.com is my email address if needed.

|||

Brian Knight wrote:

Here's a great blog post: http://sqljunkies.com/WebLog/ashvinis/archive/2005/03.aspx

Send me an email and I'll send you some example code, and I'll eventually do a video on JumpstartTV.com. The variable number of columns is not a problem though. bknight<at>jumpstarttv.com is my email address if needed.

How can the variable columns not be a problem? Because the flat file source won't account for the variability, you'll have to read in each record as one column, and then split it with a script transformation. In doing that, you might be able to set a fixed number of columns and fill them with NULLs if no data exists. I'm trying to understand how you think you can expose all of the columns to the unpivot transformation when the metadata can't be pre-populated with certainty. (That is, based on my previous example, there's a certain row limit that the flat file connector uses to assess how many columns there are in the file.)|||
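The read-each-record-as-one-column idea (split it yourself, pad to a fixed width, fill the gaps with NULL) can be sketched as follows. This is only an illustration of the padding step: the function name and the pair count of 5 are arbitrary, and the naive split assumes no commas appear inside names.

```python
def split_and_pad(line, max_pairs=5):
    """Split one raw CSV record and pad it to a fixed column count with None.

    Output layout: [state, count, name_1, total_1, ..., name_N, total_N],
    so downstream unpivot logic can assume a constant number of columns
    and skip the None (NULL) slots. Assumes no commas inside field values.
    """
    fields = [f.strip() for f in line.split(',') if f.strip()]
    width = 2 + 2 * max_pairs  # State + Customer_Count + N pairs
    return fields + [None] * (width - len(fields))

row = split_and_pad("GA,2,'John Doe',14.00,'Roger Smith',15.00")
# len(row) == 12; the trailing slots are None
```

With every record padded to the same width, the column metadata is fixed and the NULL slots simply drop out of the unpivot.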

I am getting the following error while trying the Unpivot transform:

PivotKeyValue is not valid. In an UnPivot transform with more than one unpivoted DestinationColumn, the set of PivotKeyValues per destination must match exactly.




Read + Statistics from SQL

I have a very big problem: I must write this VBScript or I'll be fired... Help me, I'm a newbie at VB...

I have a SQL Server 2000 database with columns defined like this:
servername, data (a date), service1, service2, ..., service7.

Records look something like this:

k2-ss-ss, 2003-08-13, 1, 0, 1, 1, 1, 1, 0
k2-qq-ww, 2003-08-14, 0, 0, 1, 1, 0, 0, 1

"1" - means running
"0" - means stopped

For example, I write: "service1", "2003-08-11", "2003-08-14".
The script should list that service1 worked 94.3% of the chosen period on k2-ss-ss,
and that service1 worked 6.3% of the chosen period on k2-qq-ww.

Records to the database are writen every 3 minutes.

Can anyone help me? Some piece of code? :(

|||

create procedure SaveYerAss
    (@Service varchar(50),
    @StartDate datetime,
    @EndDate datetime)
as

select @Service + ' worked ' +
    cast(cast(case @Service
        when 'Service1' then 100.0 * Service1Count / RecordCount
        when 'Service2' then 100.0 * Service2Count / RecordCount
        when 'Service3' then 100.0 * Service3Count / RecordCount
        when 'Service4' then 100.0 * Service4Count / RecordCount
        when 'Service5' then 100.0 * Service5Count / RecordCount
        when 'Service6' then 100.0 * Service6Count / RecordCount
        when 'Service7' then 100.0 * Service7Count / RecordCount
    end as decimal(5,1)) as varchar(10))
    + '% at the chosen period, on ' + servername + '.'
from
    (select servername,
        sum([service1]) Service1Count,
        sum([service2]) Service2Count,
        sum([service3]) Service3Count,
        sum([service4]) Service4Count,
        sum([service5]) Service5Count,
        sum([service6]) Service6Count,
        sum([service7]) Service7Count,
        count(*) RecordCount
    from [YerDataTable]
    where [data] between @StartDate and @EndDate) SummarizedData

|||

bm, your subquery lacks a

    group by servername

clause.

|||

Well, he ought to do SOMETHING to save his own job! :D

|||

bm, I take it as a THANKS. :D

|||

What do you want to bet that was a homework problem?
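The percentage itself is simply (samples where the service ran) / (all samples in the window) * 100, since one 0/1 row is written every 3 minutes. A sketch of just that calculation (the function name and input shape are assumptions, not part of the thread's code):

```python
def availability(samples):
    """Percent of samples in which a service was running.

    `samples` is a list of 0/1 flags, one per 3-minute polling row,
    already filtered to one server and one date range.
    """
    if not samples:
        return 0.0  # no rows in the window: report 0% rather than divide by zero
    return round(100.0 * sum(samples) / len(samples), 1)

# 3 running samples out of 4 polled -> 75.0%
pct = availability([1, 1, 0, 1])
```

This mirrors the SUM(serviceN) / COUNT(*) ratio in the stored procedure above, with the same caveat that the division must be done in floating point (or decimal) to avoid integer truncation.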