
Friday, March 23, 2012

mmc.exe writes a lot to SQL.LOG when working with Enterprise Manager

I noticed that mmc.exe wrote a lot of information to the SQL.LOG file, which was generated by mmc.exe, when I was working with Enterprise Manager. For instance, double-clicking a database name or the server name would write about 330K to the file, so I had to wait about 40 seconds per click. It happened only on my Windows 2000 Server (SP4); other servers do not even generate the file. The following is a sample of the information written to the file:
sqlstp 17c-840 ENTER SQLAllocHandle
SQLSMALLINT 1 <SQL_HANDLE_ENV>
SQLHANDLE 00000000
SQLHANDLE * 00794460
Any idea what causes this? How can I turn it off? Thanks.

I found a solution: turning off ODBC tracing. It's the ODBC tracing that writes the trace to the file. (Tracing can usually be stopped from the Tracing tab of the ODBC Data Source Administrator.)

"Kathy" wrote:

> I noticed that mmc.exe wrote a lot of information to the SQL.LOG file, which was
> generated by mmc.exe, when I was working with Enterprise Manager. For
> instance, double-clicking a database name or the server name would write about
> 330K to the file. Thus, I had to wait for about 40 seconds per click. It
> happened only on my Windows 2000 Server (SP4). Other servers do not even
> generate the file. The following is a sample of the information written to the file:
> sqlstp 17c-840 ENTER SQLAllocHandle
> SQLSMALLINT 1 <SQL_HANDLE_ENV>
> SQLHANDLE 00000000
> SQLHANDLE * 00794460
> Any idea what causes this? How can I turn it off? Thanks.
>


Wednesday, March 21, 2012

mixed recovery modes

Are there any limitations in SQL Server 2000, or 2005 in working with one
database in full recovery mode and another in simple recovery mode?
Under other platforms, you can't join between a logged and non-logged
database.
I think the answer is "no problem", but I would like to be sure.
In particular, I'm thinking about changing the recovery model of the database drop off point to the search engine to simple, as it's just a copy and is reset daily from production.

No problem. :-)
--
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://sqlblog.com/blogs/tibor_karaszi
"Jay" <nospam@.nospam.org> wrote in message news:ec6fVRL8HHA.2004@.TK2MSFTNGP06.phx.gbl...
> Are there any limitations in SQL Server 2000, or 2005 in working with one database in full
> recovery mode and another in simple recovery mode?
> Under other platforms, you can't join between a logged and non-logged database.
> I think the answer is "no problem", but I would like to be sure.
> In particular, I'm thinking about changing the recovery model of the database drop off point to
> the search engine to simple, as it's just a copy and is reset daily from production.

ty sir.
"Tibor Karaszi" <tibor_please.no.email_karaszi@.hotmail.nomail.com> wrote in
message news:eAGqgSL8HHA.1204@.TK2MSFTNGP03.phx.gbl...
> No problem. :-)
> --
> Tibor Karaszi, SQL Server MVP
> http://www.karaszi.com/sqlserver/default.asp
> http://sqlblog.com/blogs/tibor_karaszi
>
> "Jay" <nospam@.nospam.org> wrote in message
> news:ec6fVRL8HHA.2004@.TK2MSFTNGP06.phx.gbl...
>> Are there any limitations in SQL Server 2000, or 2005 in working with one
>> database in full recovery mode and another in simple recovery mode?
>> Under other platforms, you can't join between a logged and non-logged
>> database.
>> I think the answer is "no problem", but I would like to be sure.
>> In particular, I'm thinking about changing the recovery model of the
>> database drop off point to the search engine to simple, as it's just a
>> copy and is reset daily from production.
>
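For reference, the change Jay describes is a one-statement operation. This is a sketch only; the database name below is hypothetical:

```sql
-- Switch the drop-off copy to SIMPLE recovery (database name is hypothetical).
ALTER DATABASE SearchDropOff SET RECOVERY SIMPLE;

-- Verify the current recovery model.
-- SQL Server 2000:
SELECT DATABASEPROPERTYEX('SearchDropOff', 'Recovery');
-- SQL Server 2005 and later:
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = N'SearchDropOff';
```

As Tibor notes, the recovery model only affects transaction logging and which backup/restore options are available; queries and joins across databases with different recovery models work normally.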

Monday, March 19, 2012

Missing xp_sqlagent_notify

I'm trying to have an alert fire when it receives an error via the RAISERROR call. I have it working in SQL 2000, but can't get it to work in SQL 2005. When I look at the history of the alert, it says that it has not been raised yet.
When I look at the log from SQLServerAgent, I see that it is missing the stored procedure xp_sqlagent_notify. Where would I find this missing procedure? I see that I have an sp_sqlagent_notify -- should I just rename or copy it?
Steve

It sounds like it may not be logging, if the history of the alert indicates it hasn't been raised. Did you create the message using @with_log = true, or are you raising the error using WITH LOG?
Try testing by doing a RAISERROR with WITH LOG. Or you can use
sp_altermessage YourMessageNumber, 'with log', 'true'
to log the message.
-Sue
On Fri, 19 May 2006 11:04:01 -0700, SteveIrwin
<SteveIrwin@.discussions.microsoft.com> wrote:

>I'm trying to have an alert fire when it receives an error via the RAISERROR
>call. I have it working in SQL 2000, but can't get it to work in SQL 2005.
>When I look at the history of the alert, it says that it has not been raised
>yet.
>When I look at the log from SQLServerAgent, I see that it is missing the
>stored procedure xp_sqlagent_notify. Where would I find this missing
>procedure? I see that I have an sp_sqlagent_notify -- should I just rename or
>copy it?
>Steve

That worked, Sue. Thanks!
Steve
"Sue Hoegemeier" wrote:

> It sounds like it may not be logging if the history of the
> alert indicates it hasn't been raised. Did you create the
> message using @with_log = true or are you raising the error
> using with log?
> Try testing by doing a raiserror with log. Or you can use
> sp_altermessage YourMessageNumber, 'with log', 'true'
> to log the message.
> -Sue
> On Fri, 19 May 2006 11:04:01 -0700, SteveIrwin
> <SteveIrwin@.discussions.microsoft.com> wrote:
>
>
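Sue's suggestion can be sketched in T-SQL as follows; the message number and text are hypothetical:

```sql
-- Define a user message that is always written to the error log;
-- only logged messages can trigger SQL Agent alerts.
EXEC sp_addmessage @msgnum = 50001, @severity = 16,
     @msgtext = N'Nightly load failed.', @with_log = 'true';

-- Or force logging for a single raise:
RAISERROR (50001, 16, 1) WITH LOG;

-- Or flip logging on for an existing message:
EXEC sp_altermessage 50001, 'WITH_LOG', 'true';
```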


Monday, March 12, 2012

missing tables

I have a database that I have been working on that is on a remote server for a website. I recently had them back up the database so I can work on it locally. After some time I managed to get the backup working, well sorta.

Originally the database was an Access 2000 database. This soon proved not to be up to the task. The server people converted the database to SQL 2000 and everything is working on the site. However, my backup has a problem. In the manager I can see all the tables that were there before. But when I go to run a query on the database through CF, the only tables it can see are tables that were created after the Access conversion. I have gone in and checked permissions and set every table and every column to public, and still cannot see the missing tables. Any help?

What is CF?

How did the server people move the Access tables to SQL Server?

Are you having this problem with the backup that you mention or with the original copy on the server?

Mike

CF is Cold Fusion.

I have no idea how they converted the Access database to SQL.

I am only having this problem with the backup.

Well, I have no idea how Cold Fusion works; it's always possible that they read the file in some "special" way that is causing this. But there is also the fact that the backup is only "sort of" working.

As a first test, it would be worth checking whether you can see the tables using some other tool. Try creating linked tables to your database in Access using the Link Table wizard and see if the behavior is any different. If it works in Access, then I'd say pursue the problem with the Cold Fusion people; if it behaves the same in Access, then we should investigate the backup itself and dig into the "sort of" part of the equation.

Mike

I had already thought of that. PHP cannot see the hidden tables either. My Access won't let me do linked tables to SQL. I only have Access 2000 and I guess it doesn't do that.

Odd, Access has supported linked tables to SQL since 2.0.

I'm assuming that you can see the tables using SQL tools such as Management Studio, right? Could you check what Schema the tables that you cannot see from other programs belong to?

If you can see tables that are created in SQL, the answer might just be to recreate these tables directly in SQL. You could script out the problem tables, change the names of the original, recreate the tables using the scripts and then move the data. I don't know how many tables we're talking about here, but it's worth looking into if it gets you up and running. (Try it with one table to see if it works.)

Mike
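To check which owner/schema each table belongs to, as Mike suggests, something like the following should work in Query Analyzer or Management Studio:

```sql
-- List user tables with their schema (owner); tables upsized from Access
-- sometimes end up owned by a login other than dbo, so clients that
-- reference them without an owner prefix cannot find them.
SELECT TABLE_SCHEMA, TABLE_NAME
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE'
ORDER BY TABLE_SCHEMA, TABLE_NAME;
```

If the missing tables show a non-dbo owner, that would explain why CF and PHP, which query without an owner prefix, cannot see them.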


Saturday, February 25, 2012

Missing MDF files but databases run fine?

About two months ago, we had a problem with a couple SQL 2005 databases. We
got everything working and all has been fine. Today, I noticed that the
databases that we had a problem with have an LDF file but no MDF files. I don't
know much about SQL but does that make sense? Is there any way to make the
MDF file at this point?
Thanks!
Arch Willingham

What do you mean by 'no MDF files'? You can have a scenario where the drive that hosts a database file disappears and the database may still function as usual for a while, until it has to access the disk.
Linchi
"Arch Willingham" wrote:
> About two months ago, we had a problem with a couple SQL 2005 databases. We
> got everything working and all has been fine. Today, I noticed that the
> databases that we had a problem with have LDF file but no MDF files. I don't
> know much about SQL but does that make sense? Is there any way to make the
> MDF file at this point?
> Thanks!
> Arch Willingham
>
Never mind... weird thing. I am a holdout that still uses the old File Manager (WINFILE.EXE). For whatever reason, the MDF files are hidden to it, but they are there when you look at the directory with Explorer.
Weird, as I can see all other hidden and system files.
Arch
"Linchi Shea" <LinchiShea@.discussions.microsoft.com> wrote in message
news:CEE64FB0-73F3-483E-A4DF-3EEB750908DC@.microsoft.com...
> What do you mean by 'no MDF files'? You can have a scenario where the
> drive
> that hosts a database file disappears and the database may still function
> as
> usual for a while until it has to access the disk.
> Linchi
> "Arch Willingham" wrote:
>> About two months ago, we had a problem with a couple SQL 2005 databases.
>> We
>> got everything working and all has been fine. Today, I noticed that the
>> databases that we had a problem with have LDF file but no MDF files. I
>> don't
>> know much about SQL but does that make sense? Is there any way to make
>> the
>> MDF file at this point?
>> Thanks!
>> Arch Willingham
>>
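Rather than trusting a file manager, you can also ask SQL Server 2005 itself where its files live. A minimal query:

```sql
-- Show the physical path of every data (MDF/NDF) and log (LDF) file
-- for all databases on the instance (SQL Server 2005 and later).
SELECT DB_NAME(database_id) AS database_name, type_desc, physical_name
FROM sys.master_files
ORDER BY database_name, type_desc;
```

If a database is online and sys.master_files lists its MDF path, the file exists as far as the server is concerned, whatever a particular file browser shows.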

Monday, February 20, 2012

Missing files after using File System Task

Hi All,

I don't know if anyone has faced this issue. We are having a strange problem. Our process was working well when it was implemented on a 32-bit machine; it ran perfectly for six months without a problem. But when we moved the packages to a 64-bit machine, this issue, along with some other issues, started to show up.

The issue is we are missing files in the source folder.

Our process is designed such that a source process brings in a file and updates a status for the file in an audit table. The ETL process picks up the file, assigns the status 'running' when the SRC process is complete, loads into the target DB, and updates the ETL status to complete. But the current problem is that the ETL is losing files after it assigns the status 'running'. When we looked into the DB to see whether the data was loaded, we could not find any data related to these files.

We have mapping-level parameters for the source path and target path.

We are using a For Each Loop task, processing files (which are simple flat files) in the source path. The file name is stored in the mapping-level parameter. Once a file is processed, we move it into a target path.

Our src and target file paths are on the same drive: we have a src folder, and inside the src folder we have a processed folder and a failed folder. Files are picked up from the source folder and moved into the processed folder after processing. The missing files are not even moved to the failed folder.

There is a lot of other processing going on on this box, and the trend observed is that when more processes are running at peak hours, the missing files' count is higher.

Right now we are re-fetching those files as a workaround, but does anyone have any suggestions about why this is happening, or any better implementation suggestions?

Thanks

We solved our problem. It was due to a combination of problems:

1. Environment setup for our new 64-bit server.

2. Use of an event handler for 'On Error' instead of 'On Task Failure'.

3. Overwrite destination in the move file task properties in the event handler.

4. Consistent connection (I don't exactly remember the word, but it's in the properties of the connections in the connection manager; likely the RetainSameConnection property).

1st and 4th issues: The symptom was that missing files were more frequent when the system was busy processing other packages. When we set up our initial 32-bit machine, we had named pipes enabled on both client and server, so there were no issues. But in our new 64-bit environment, named pipes were disabled on the server (Shared Memory and TCP/IP were enabled) while enabled on the client. When the client wants a connection, it tries protocols in order: first Shared Memory, second TCP/IP, and last named pipes. But the server only grants connections on the first two, so connections were timing out during peak processing.

Added to that, the connection properties in the package (the 4th issue) were not set to a stable state, so every time the package encounters a task which needs a DB connection, it requests a new connection. This issue multiplied because named pipes were disabled. The solution to both issues is to enable named pipes on the server (or disable them on the client; enabling on the server is better since the process then has extra options for connections). In the package connection properties, select a stable connection. This option keeps one connection open from the start of package execution to the end, instead of connecting and disconnecting for each task.

2nd and 3rd issues: This issue was already present on our 32-bit machine too, but did not show up since we did not have connection failure errors due to named pipes. Since connections were timing out in our new environment, each failure constitutes an error. When the first error was raised, the src file was moved to the destination folder (with the overwrite destination option true). But due to connection failures, we often had multiple error events, so for subsequent error events it tries to move the src file again, which is no longer present since it was moved the first time. For those subsequent errors, since we had the overwrite destination option set to true, the package first deletes the target file, then tries to move the src file and fails, since the src file is not present. This leads to the missing file problem.

The solution to this problem is to set the overwrite destination option to false in the file task. Based on your requirements, you can also put the event handler on the 'On Task Failure' event instead of the 'On Error' event.

Hope this saves a lot of pain for a lot of people, and a lot of bucks for many companies.

Excuse my spelling mistakes if any.

Thanks,

Venkat


Missing File

I am trying hard to get my downloaded SQL Server Express working.
I need to use Reporting Services.
This appears on my start menu, but clicking it gives a message that
devenv.exe is missing and can't be found anywhere.
Where does devenv.exe come from - how can I get hold of it?
Thanks for helping this newbie.

The Business Intelligence Development Studio.
The readme has the information you need - you need the
Express Edition Toolkit. You can find the download link at:
http://msdn.microsoft.com/vstudio/express/sql/download/
-Sue
On Sun, 03 Sep 2006 20:49:58 GMT, "PeeJayEll"
<nospamthanks@.hotmail.com> wrote:

>I am trying hard to get my downloaded SQL server express working.
>I need to have use of the reporting services.
>This appears on my start menu, but clicking it gives a message that
>devenv.exe is missing and can't be found anywhere.
>Where does devenv.exe come from - how can I get hold of it?
>Thanks for helping this newbie.
>
>