Help me with an issue, please.
My Windows service depends on MS SQL Server and has to load some information on startup, but it starts before MS SQL Server and stops with an exception. I set the dependency with the sc tool (sc config MyService depend= MSSQL), but my service still does not start on system load. MS SQL is listed as a dependency (I can see it on the Dependencies tab of the service properties and in the output of sc qc MyService), but my service still does not start. Can anyone help? My system is Win7 with MS SQL Server 2008 R2; my service runs under Local System and MS SQL runs as Network Service. Thanks a lot.
P.S. Sorry for poor English.
I think you should use dependencies; see
How to delay loading of specific services
Or use the "Automatic (Delayed Start)" option for your service (sc config MyService start= delayed-auto). Please read
WS2008: Startup Processes and Delayed Automatic Start
Alternatively, you can manage this through program logic: if the resource is unavailable, check for it repeatedly, proceed with normal operation once it becomes available, and inform service clients with an appropriate response in the meantime.
Hope that helps.
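The retry idea in the last paragraph can be sketched like this. This is a minimal Python sketch, not a Windows API; the function name is made up, and the commented-out port probe assumes SQL Server's default TCP port 1433 on localhost:

```python
import time

def wait_for_dependency(probe, attempts=30, delay=2.0):
    """Retry `probe` until it returns True or the attempts run out.

    `probe` is any zero-argument callable that returns True once the
    dependency (e.g. the SQL Server instance) is reachable.
    """
    for _ in range(attempts):
        if probe():
            return True
        time.sleep(delay)
    return False

# In a real service, probe could test the SQL Server TCP port, e.g.:
#   import socket
#   def probe():
#       try:
#           socket.create_connection(("localhost", 1433), timeout=1).close()
#           return True
#       except OSError:
#           return False
```

Run this during service startup, before the code that loads data from the database, and fail the start (or keep waiting) if it returns False.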
Related
I study both SQL and MongoDB, and after a power outage at home I can't connect to either locally. For SQL I use XE 18c and SQL Developer 21, and I get the silly message:
An error was encountered performing the requested operation:
Listener refused the connection with the following error: ORA-12514,
TNS:listener does not currently know of service requested in connect
descriptor (CONNECTION_ID=(id)==)
Vendor code 12514
For Mongo, I can't even start the server, not even from Services.
I'm on Windows 10.
I guess it has to do with networking... or Mercury is in retrograde or something. Any help on the former would be great.
UPDATE
One problem fixed. MongoDB's files got corrupted by the power-outage shutdown. I was able to fix it by running mongod with --repair. This MongoDB doc was very useful: Recover a Standalone after an Unexpected Shutdown
We have a distributed system built on Erlang with one server node and hundreds of client nodes (the systems are distributed over the intranet). We have a requirement that all the client nodes connect to the server node and try to download some file (mostly the same file will be accessed by all client nodes) simultaneously using SFTP. The steps we follow to download the file are:
Establish an SSH SFTP connection between the server node and the client node with the function call:
ssh_sftp:start_channel/2
Then read the file with the function call:
ssh_sftp:read_file/2
The problem we are facing is that when the number of clients grows, a few client nodes fail to establish a connection to the server node, i.e. the ssh_sftp:start_channel/2 call fails.
Can somebody please explain:
Is there any limit on the number of SFTP sessions we can establish on a single system?
What are the possible reasons for the connection request to fail?
Is there anything wrong with this approach?
Is there a better solution that guarantees all client nodes can connect to the server and download the file?
Observation: we tried to connect 25 client nodes to the server; on the first try only 2 nodes failed to connect, and on the second try 5 nodes failed to connect. Why this random behavior?
I think I can answer some of the questions below (correct me if I'm wrong):
Erlang is really strong at this kind of concurrency, so your limit here is most likely the capacity of your physical hardware (the server).
I don't know exactly what the issue is in your project, but my telecom project easily handles a thousand calls with 2 processes per call: one master process handles the session (connection), and the other watches the master process and handles its errors, so the connection does not fail.
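One common client-side mitigation for this kind of connection storm, which the question's random failures suggest, is randomized exponential backoff: each client retries the failed connection after a random, growing delay, so hundreds of nodes do not hammer the server at the same instant. A minimal Python sketch (the function name and parameters are assumptions; on the Erlang side you would wrap ssh_sftp:start_channel/2 the same way):

```python
import random
import time

def connect_with_backoff(connect, max_tries=6, base=0.5, cap=30.0):
    """Call `connect` until it succeeds; between failures, sleep a random
    interval drawn from a range that doubles with each attempt ("full
    jitter"), so many clients do not all retry at the same moment.

    `connect` is any zero-argument callable that raises on failure and
    returns a channel/connection object on success.
    """
    for attempt in range(max_tries):
        try:
            return connect()
        except Exception:
            if attempt == max_tries - 1:
                raise
            time.sleep(random.uniform(0, min(cap, base * 2 ** attempt)))
```

On the server side, also check the SSH daemon's concurrent-session limits; the client-side backoff only smooths the load, it does not raise any hard limit.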
I have built a few apps that receive and send data over a Datasnap server.
I have multiple Datasnap servers running on my customers servers.
The Datasnap server runs as a Windows Service on their machines.
How can I remotely update those servers or what is the best way to do this?
Is it maybe better to make the Datasnap Server run on IIS instead of as a Windows Service?
Can I let the server update itself? Maybe if I make a function that sends the new version to it, it can replace itself?
I can see no way of updating the exes on your customers' machines without a) getting them to do it, or b) logging in remotely using something like VNC (though I'd be happy to hear of a better way).
This is an approach I would consider:
I would have a separate application that is called by the service on a regular basis (daily/weekly). It looks for a new version of the software on your server (or somewhere), downloads it, stops the service, replaces the service's exe, and restarts the service.
In the new service exe I would include something that calls this application periodically, so you only have to do this once.
This assumes the machines that run your software have internet access.
Don't forget to include a mechanism to update the updater application.
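The one piece of this flow that is easy to get wrong is the version check that decides whether to download at all: comparing version strings lexically misorders them once a component reaches two digits. A minimal sketch, assuming simple dotted numeric versions (the surrounding download/stop/replace/restart steps are the ones described above, not shown here):

```python
def parse_version(v):
    """Turn '1.2.10' into (1, 2, 10) so versions compare numerically;
    as plain strings, '1.2.10' would sort *before* '1.2.9'."""
    return tuple(int(part) for part in v.split("."))

def update_available(installed, published):
    """True when the version published on the update server is newer
    than the version the customer is currently running."""
    return parse_version(published) > parse_version(installed)
```

The updater would call update_available with the local exe's version and the version advertised by your server, and only proceed to download and swap the exe when it returns True.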
I am using the Community Edition of Neo4j on my local machine. Is there any way to handle failover (a failover mechanism is present in the Enterprise Edition)?
One option I have thought of is to take a backup of the data directory in real time and run a new Neo4j installation on the backup directory. Is there any way to manage this through a script, or with software like a load balancer, so that the backup comes up on the same port and IP when the main server fails?
If anyone has prepared a setup for handling failover with the Community Edition, please share.
Copying over the store directory while the database is running is a dangerous operation that might result in corrupted data.
Of course you could write a clustering solution on top of Neo4j Community yourself, but be assured that you would need to invest multiple man-years to do it in a production-ready way. That's why Neo4j Enterprise has already solved that problem for you.
The recommended way, of course, is to use the Enterprise Edition. Contact the Neo4j sales folks for more information regarding licensing and prices.
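For what it's worth, the watchdog part of a home-grown setup is the easy bit and can be sketched in a few lines. In this minimal Python sketch, check_primary and promote_standby are placeholders for your own health probe and the script that brings the backup copy up (they are not Neo4j APIs), and nothing here addresses the store-corruption caveat above:

```python
import time

def failover_monitor(check_primary, promote_standby,
                     failures_needed=3, interval=5.0):
    """Poll the primary; after `failures_needed` consecutive failed
    health checks, call `promote_standby` once and return.

    A success at any point resets the failure counter, so a single
    dropped probe does not trigger a spurious failover.
    """
    failures = 0
    while failures < failures_needed:
        if check_primary():
            failures = 0
        else:
            failures += 1
        time.sleep(interval)
    promote_standby()
```

Everything hard about failover (consistent replication, split-brain, failing back) lives inside those two callables, which is the point the answer above is making.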
I built an application that is running as a windows service and is installed through my code.
All is fine except at logon.
At the first Windows XP/2003 Server logon screen, I am not sure if the service is running at all. If it is running, it doesn't work; it's simply not functional (the service is using WinPcap, so that could be an issue).
The service settings are set to "interact with desktop" and run as SYSTEM.
How can I ensure the service will start before windows logon? Also how can I make sure it is running even after I log off?
There are a couple of issues to consider.
First, you can check if your service really is running before login and after logout by logging events to the Windows Event Log. Pretty much all services do this whenever they start and stop and yours should do the same.
It may be that WinPcap is part of the problem. There are a couple of golden rules for using WinPcap in a service.
1a) Your service must not do anything that might cause the WinPcap service to try to start up while your own service is starting up because this will cause a deadlock in the Windows Service Control Manager. That means that if the WinPcap service is not already SERVICE_RUNNING when your service begins startup, you must not do anything that might cause it to start until after your service is SERVICE_RUNNING.
There are two ways to ensure this. Either make your service dependent on npf, the Network Packet Filter service. Or do not call any WinPcap function until after your service is SERVICE_RUNNING. I've not tried this latter method. I presume then the WinPcap function will block until npf is SERVICE_RUNNING.
1b) If you make your service dependent on npf, you will also have to make it dependent on nm (Network Monitor Driver) - if and only if nm is installed on the system. nm provides WinPcap with PPP/VPN support, and WinPcap always tries to use it if installed. Obviously, if you make nm a dependency of your service and nm isn't installed then your service will fail to start.
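What the SCM guarantees with declared dependencies amounts to computing a start order in which every service comes after everything it depends on. A toy model in Python (not a Windows API; the service names are just the ones from this answer):

```python
def start_order(deps):
    """Return a start order in which every service appears after all of
    its dependencies. `deps` maps a service name to the list of service
    names it depends on (a tiny model of what the SCM computes)."""
    order, seen = [], set()

    def visit(svc):
        if svc in seen:
            return
        seen.add(svc)
        for dep in deps.get(svc, []):
            visit(dep)        # start dependencies first
        order.append(svc)

    for svc in deps:
        visit(svc)
    return order
```

With nm installed you would declare {"MyService": ["npf", "nm"]}; without it, only ["npf"], exactly as 1b says, because a declared dependency that does not exist makes the dependent service fail to start.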
I don't think there is a guaranteed way to ensure that your service starts up before the desktop appears. But you might be able to help things along by creating a Service Control Group, adding it to the end of the existing list of Service Control Groups, and putting your service into this group. I'm not entirely convinced that this is an 'approved' way to get your service to start sooner, because if there was an approved way then everyone would do it and it wouldn't work any more. But there is a suggestion that services in a group are started before services not in a group.
Look at HKEY_LOCAL_MACHINE, "SYSTEM\CurrentControlSet\Control\GroupOrderList" and HKEY_LOCAL_MACHINE, "SYSTEM\CurrentControlSet\Control\ServiceGroupOrder" and do a bit of Googling.