Quartz.Net ZeroSizeThreadPool high CPU usage

For some reason, when I configure and start a Quartz.NET scheduler with the ZeroSizeThreadPool as below, CPU usage is very high and the computer becomes unresponsive. Does anyone have any idea why this is happening and how to fix it? TIA.
var properties = new NameValueCollection();
properties["quartz.scheduler.instanceName"] = "MyScheduler";
properties["quartz.scheduler.instanceId"] = "instance_myscheduler";
properties["quartz.threadPool.type"] = "Quartz.Simpl.ZeroSizeThreadPool, Quartz";
properties["quartz.jobStore.type"] = "Quartz.Impl.AdoJobStore.JobStoreTX, Quartz";
properties["quartz.jobStore.useProperties"] = "true";
properties["quartz.jobStore.dataSource"] = "default";
properties["quartz.jobStore.tablePrefix"] = "QRTZ_";
properties["quartz.jobStore.clustered"] = "false";
properties["quartz.dataSource.default.connectionString"] = "Server=(local);Database=mydb;Uid=user;Pwd=pass;";
properties["quartz.dataSource.default.provider"] = "SqlServer-20";
schedFact = new StdSchedulerFactory(properties);
Scheduler = schedFact.GetScheduler();
Scheduler.Start();

There is no reason to start a scheduler that uses the ZeroSizeThreadPool. It can schedule jobs without being started, so just don't call the Start() method and the CPU usage will not spike.
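A minimal sketch of the fix, reusing the configuration from the question (the jobDetail and trigger passed to ScheduleJob are hypothetical placeholders):
// The zero-size pool is meant for a "client" scheduler that only creates,
// updates and deletes jobs/triggers in the shared ADO job store; a separate
// scheduler instance with a real thread pool actually executes them.
schedFact = new StdSchedulerFactory(properties);
Scheduler = schedFact.GetScheduler();
// Scheduler.Start();   // intentionally omitted -- starting it is what pegs the CPU
Scheduler.ScheduleJob(jobDetail, trigger);   // scheduling still works without Start()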

Related

Google Cloud speech simple problems with no response

I'm getting errors with my requests. I still don't know where to invoke the request and how to fetch the response. Where do I set the API key?
var initialize = new Google.Apis.Services.BaseClientService.Initializer();
initialize.ApiKey = "key";
// note: a new, empty Initializer is passed here instead of the 'initialize' object that has the ApiKey set
var speech = new Google.Apis.Speech.v1.SpeechService(new Google.Apis.Services.BaseClientService.Initializer {
});
var recognizeReq = new Google.Apis.Speech.v1.Data.RecognizeRequest();
var recognitionConf = new Google.Apis.Speech.v1.Data.RecognitionConfig();
recognitionConf.LanguageCode = "pl-PL";
recognitionConf.SampleRateHertz = 16000;
recognitionConf.Encoding = "FLAC";
recognizeReq.Config = recognitionConf;
var aud = new Google.Apis.Speech.v1.Data.RecognitionAudio();
string path1 = @"c:\output.flac";
//var bytesAudio = File.ReadAllBytes(path1);
aud.Uri = path1;
recognizeReq.Audio = aud;
var variable = speech.Speech.Recognize(recognizeReq);
variable.Key = "key";
//variable.OauthToken =
variable.Execute();
Google.Apis.Speech.v1.Data.RecognizeResponse resp = new Google.Apis.Speech.v1.Data.RecognizeResponse();
var lista = resp.Results;
I changed approach and now use the Google.Cloud.Speech.V1 library.
I managed to record the voice using NAudio
and tried to send a continuous request to the cloud, but it doesn't work:
waveFile.Write(e.Buffer, 0, e.BytesRecorded);
waveFile.Flush();
audio5 = RecognitionAudio.FromBytes(e.Buffer);
var result = client.LongRunningRecognizeAsync(config, audio5);
This solved the problem. The buffer issue only shows up with longer recordings; I fell into the same trap as others and found the solution in a discussion of the bug (from Google, of course ;) ):
https://github.com/GoogleCloudPlatform/dotnet-docs-samples/blob/95b32e683ba534883b8a7f3c979deee101ba3678/speech/api/Recognize/InfiniteStreaming.cs
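For audio that has already been saved to a file (rather than the live-microphone case the InfiniteStreaming sample covers), a minimal non-streaming sketch with Google.Cloud.Speech.V1 could look like the following. The file path and language code are taken from the question; the credentials setup and the rest are assumptions:
// Assumes GOOGLE_APPLICATION_CREDENTIALS points at a service-account key file.
using Google.Cloud.Speech.V1;
using System;

var client = SpeechClient.Create();
var config = new RecognitionConfig
{
    Encoding = RecognitionConfig.Types.AudioEncoding.Flac,
    SampleRateHertz = 16000,
    LanguageCode = "pl-PL"
};
var audio = RecognitionAudio.FromFile(@"c:\output.flac");
var response = client.Recognize(config, audio);   // synchronous; suitable for clips up to about a minute
foreach (var result in response.Results)
    foreach (var alternative in result.Alternatives)
        Console.WriteLine(alternative.Transcript);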

ActiveMQTextMessages are kept in memory after they are consumed from topic

So I have a problem. I am running an embedded Apache ActiveMQ broker in my application which has a topic. I have one producer that sends small messages to the topic and one consumer that consumes them.
The problem is that the application's memory footprint just keeps growing, to the point where it takes up several gigabytes after a few days. I did some memory profiling with JProfiler and noticed that a lot of instances of type ActiveMQTextMessage are kept in memory.
This is how I set up my broker:
BrokerService brokerService = new BrokerService();
brokerService.setUseJmx(false);
brokerService.setUseLocalHostBrokerName(false);
brokerService.addConnector(tenantConfiguration.getConnectionString());
brokerService.setBrokerName(tenantConfiguration.getBrokerComponentIdentifier());
brokerService.setPersistenceAdapter(persistenceAdapterFromConnectionString);
SystemUsage systemUsage = new SystemUsage();
brokerService.setSystemUsage(systemUsage);
brokerService.setDestinationPolicy(createDestinationPolicyForBrokerService());
And here is how I set up the destination policy:
private PolicyMap createDestinationPolicyForBrokerService() {
    PolicyMap policyMap = new PolicyMap();
    List<PolicyEntry> policyEntries = new ArrayList<>();
    ConstantPendingMessageLimitStrategy constantPendingMessageLimitStrategy = new ConstantPendingMessageLimitStrategy();
    constantPendingMessageLimitStrategy.setLimit(10);
    PolicyEntry queuePolicyEntry = new PolicyEntry();
    queuePolicyEntry.setPrioritizedMessages(true);
    queuePolicyEntry.setGcInactiveDestinations(true);
    queuePolicyEntry.setInactiveTimoutBeforeGC(86400);
    queuePolicyEntry.setQueue(">");
    queuePolicyEntry.setPendingMessageLimitStrategy(constantPendingMessageLimitStrategy);
    PolicyEntry topicPolicyEntry = new PolicyEntry();
    topicPolicyEntry.setTopic(">");
    topicPolicyEntry.setGcInactiveDestinations(true);
    topicPolicyEntry.setInactiveTimoutBeforeGC(5000);
    topicPolicyEntry.setPendingMessageLimitStrategy(constantPendingMessageLimitStrategy);
    topicPolicyEntry.setUseCache(false);
    policyEntries.add(queuePolicyEntry);
    policyEntries.add(topicPolicyEntry);
    policyMap.setPolicyEntries(policyEntries);
    return policyMap;
}
Here is a screenshot of one of the messages' outgoing references, and another of what I see when I click "Show paths to GC root" (screenshots omitted).
EDIT:
Here is how I set up the DurableConsumer:
private NMSConnectionFactory _connnectionFactory;
private IConnection _connection;
private ISession _session;
private IMessageConsumer _topicConsumer; // declaration added for completeness

public void Start()
{
    _connection = _connnectionFactory.CreateConnection(queueUser, queuePwd);
    _connection.Start();
    _session = _connection.CreateSession(AcknowledgementMode.AutoAcknowledge);
    if (!string.IsNullOrEmpty(TopicName))
    {
        _topicConsumer = _session.CreateDurableConsumer(SessionUtil.GetTopic(_session, TopicName), ConsumerName, null, false);
        _topicConsumer.Listener += TopicConsumerOnListener;
    }
}
And this is how I publish messages to the topic:
public void PublishMessage(string message)
{
    using (var connection = _connnectionFactory.CreateConnection(user, pwd))
    {
        try
        {
            connection.Start();
            ActiveMQTopic topic = new ActiveMQTopic(TopicName);
            using (var session = connection.CreateSession())
            using (var producer = session.CreateProducer(topic))
            {
                var textMessage = producer.CreateTextMessage(message);
                producer.Send(textMessage);
            }
        }
        catch (Exception exception)
        {
            Console.WriteLine(exception);
        }
    }
}
Does anyone know why the messages are not being removed after they are consumed?
Thanks
I solved the problem by setting my own clientId on the topic connection:
_connection = _connnectionFactory.CreateConnection(queueUser, queuePwd);
_connection.ClientId = "MY CLIENT ID";
_connection.Start();
That way no new durable-consumer rows are created on each restart. With an auto-generated clientId, every restart registers a new durable subscription, and the broker keeps retaining topic messages for all of the old, now-abandoned subscriptions.
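If stale auto-named subscriptions were already left behind, they can also be removed explicitly. A hedged sketch, assuming Apache.NMS's ISession.DeleteDurableConsumer is available and using "OLD SUBSCRIPTION NAME" as a placeholder for whatever consumer name the stale subscription was created with:
// Deleting a durable subscription tells the broker to stop retaining
// messages for it; the name must match the one used in CreateDurableConsumer.
_connection = _connnectionFactory.CreateConnection(queueUser, queuePwd);
_connection.ClientId = "MY CLIENT ID";
_connection.Start();
using (var session = _connection.CreateSession(AcknowledgementMode.AutoAcknowledge))
{
    session.DeleteDurableConsumer("OLD SUBSCRIPTION NAME"); // placeholder subscription name
}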

How to set a WCF timeout on Xamarin Android

My application connects to a WCF service (.NET 4.5). I built the proxy using slsvcutil (Silverlight 5) and it works fine,
but I'm having problems with the timeout: I get an error after about 1 minute.
This is my code:
BasicHttpBinding bindin = new BasicHttpBinding();
bindin.MaxReceivedMessageSize = 267386880;
var timeout = new TimeSpan(0, 10, 0);
bindin.SendTimeout = timeout;
bindin.OpenTimeout = timeout;
bindin.ReceiveTimeout = timeout;
wcf = new ServicioInasaClient(bindin, new EndpointAddress(editHost.Text));
Thanks
I solved it with:
wcf.InnerChannel.OperationTimeout = timeout;
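Putting both pieces together, a minimal sketch (ServicioInasaClient and editHost.Text come from the question; the rest is an assumption). The binding timeouts govern the channel itself, while OperationTimeout on the inner channel is what bounds an individual call:
var timeout = new TimeSpan(0, 10, 0);

var binding = new BasicHttpBinding();
binding.MaxReceivedMessageSize = 267386880;
binding.SendTimeout = timeout;
binding.OpenTimeout = timeout;
binding.ReceiveTimeout = timeout;

var wcf = new ServicioInasaClient(binding, new EndpointAddress(editHost.Text));
// Without this, calls appear to fall back to the default one-minute operation
// timeout, which would match the error seen after about a minute.
wcf.InnerChannel.OperationTimeout = timeout;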

With C# Dev Kit, Invoice Not Appearing In QB

The code seems to run. I don't get any error messages, but an invoice does not appear in QB after I sync. The code is basically this (http://pastebin.com/y7QENxeX) with a few (presumably) minor changes as noted. I'm able to create Accounts and Customers so I believe the basic infrastructure of my app is good. I don't understand why I'm stuck on invoices. I think my customerID is 2. I only have 5 in my company right now. And I think my itemID is 1 as I only have one in QB right now.
Any and all help is greatly appreciated.
Intuit.Ipp.Data.Qbd.PhysicalAddress physicalAddress = new Intuit.Ipp.Data.Qbd.PhysicalAddress();
physicalAddress.Line1 = "123 Main St.";
physicalAddress.Line2 = "Apt. 12";
physicalAddress.City = "Mountain View";
physicalAddress.CountrySubDivisionCode = "CA";
physicalAddress.Country = "USA";
physicalAddress.PostalCode = "94043";
physicalAddress.Tag = new string[] { "Billing" };
Intuit.Ipp.Data.Qbd.InvoiceHeader invoiceHeader = new Intuit.Ipp.Data.Qbd.InvoiceHeader();
invoiceHeader.ARAccountId = new Intuit.Ipp.Data.Qbd.IdType() { idDomain = Intuit.Ipp.Data.Qbd.idDomainEnum.QB, Value = "37" };
invoiceHeader.ARAccountName = "Accounts Receivable";
// original code : invoiceHeader.CustomerId = new IdType() { idDomain = idDomainEnum.NG, Value = "3291253" };
invoiceHeader.CustomerId = new Intuit.Ipp.Data.Qbd.IdType() { idDomain = Intuit.Ipp.Data.Qbd.idDomainEnum.QB, Value = "2" };
invoiceHeader.Balance = (decimal)100.00;
invoiceHeader.BillAddr = physicalAddress;
invoiceHeader.BillEmail = "detroit@tigers.com";
invoiceHeader.CustomerName = "Detroit Tigers";
invoiceHeader.DocNumber = "1234567";
invoiceHeader.DueDate = DateTime.Now;
invoiceHeader.ShipAddr = physicalAddress;
invoiceHeader.ShipDate = DateTime.Now;
invoiceHeader.TaxAmt = (decimal)5;
invoiceHeader.TaxRate = (decimal).05;
invoiceHeader.ToBeEmailed = false;
invoiceHeader.TotalAmt = (decimal)105.00;
List<Intuit.Ipp.Data.Qbd.InvoiceLine> listLine = new List<Intuit.Ipp.Data.Qbd.InvoiceLine>();
//Loop for multiple invoice lines could be added here
Intuit.Ipp.Data.Qbd.ItemsChoiceType2[] invoiceItemAttributes = { Intuit.Ipp.Data.Qbd.ItemsChoiceType2.ItemId, Intuit.Ipp.Data.Qbd.ItemsChoiceType2.UnitPrice, Intuit.Ipp.Data.Qbd.ItemsChoiceType2.Qty };
// original code : object[] invoiceItemValues = { new IdType() { idDomain = idDomainEnum.QB, Value = "5" }, new decimal(33), new decimal(2) };
object[] invoiceItemValues = { new Intuit.Ipp.Data.Qbd.IdType() { idDomain = Intuit.Ipp.Data.Qbd.idDomainEnum.QB, Value = "1" }, new decimal(33), new decimal(2) };
var invoiceLine = new Intuit.Ipp.Data.Qbd.InvoiceLine();
invoiceLine.Amount = 66;
invoiceLine.AmountSpecified = true;
invoiceLine.Desc = "test " + DateTime.Now.ToShortDateString();
invoiceLine.ItemsElementName = invoiceItemAttributes;
invoiceLine.Items = invoiceItemValues;
invoiceLine.ServiceDate = DateTime.Now;
invoiceLine.ServiceDateSpecified = true;
listLine.Add(invoiceLine);
Intuit.Ipp.Data.Qbd.Invoice invoice = new Intuit.Ipp.Data.Qbd.Invoice();
invoice.Header = invoiceHeader;
invoice.Line = listLine.ToArray();
Intuit.Ipp.Data.Qbd.Invoice addedInvoice = commonService.Add(invoice);
Chris
You need to read the following information about how QuickBooks for Windows Sync Manager works: how to see whether Sync ran correctly, whether objects are in an errored state, and how to resolve them. It could be any number of things. Once a record is inserted into the cloud, it asynchronously downloads to QuickBooks on the desktop, at which time business logic is applied and records are matched from the cloud to the desktop. If there is an issue, Sync Manager will show a record of the object that failed and why it failed, and the object will then be in an error state.
At this point you can review the error and take steps to fix, like revert or update and resubmit. Links to the documentation below.
QuickBooks Sync Manager
Data Sync
Objects in Errored State
Sync Activity
Sync Status
regards
Jarred

When migrating from Quartz.Net 1.x to 2.x will I have to change the way my scheduler is configured and started?

Here is how I configure and start my Quartz.Net scheduler in 1.x:
properties["quartz.scheduler.instanceName"] = instanceName;
properties["quartz.scheduler.instanceId"] = "AUTO";
properties["quartz.threadPool.type"] = "Quartz.Simpl.SimpleThreadPool, Quartz";
properties["quartz.threadPool.threadCount"] = threadCount;
properties["quartz.threadPool.threadPriority"] = "Normal";
properties["quartz.jobStore.misfireThreshold"] = "60000";
properties["quartz.jobStore.type"] = "Quartz.Impl.AdoJobStore.JobStoreTX, Quartz";
properties["quartz.jobStore.useProperties"] = "true";
properties["quartz.jobStore.dataSource"] = "default";
properties["quartz.jobStore.tablePrefix"] = tablePrefix;
properties["quartz.jobStore.clustered"] = "false";
properties["quartz.jobStore.lockHandler.type"] = "Quartz.Impl.AdoJobStore.UpdateLockRowSemaphore, Quartz";
properties["quartz.dataSource.default.connectionString"] = connectionString;
properties["quartz.dataSource.default.provider"] = "SqlServer-20";
schedFact = new StdSchedulerFactory(properties);
svcScheduler = schedFact.GetScheduler();
svcScheduler.Start();
After migrating to 2.x, will I have to change anything here, and if so, what?
Most importantly: is quartz.dataSource.default.provider for SQL Server still SqlServer-20, or did something change there?
Nothing has really changed in the configuration of Quartz.net 2.x.
You can find some useful information here.
Yes, you still have to use SqlServer-20 if you want to use MS Sql Server.
For a full list of db providers, have a look here.
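As an illustration (reusing the properties from the question), the bootstrap code itself compiles unchanged against 2.x; where 2.x does differ is in how jobs and triggers are built, sketched below with a hypothetical MyJob class:
// Configuration keys, including the SqlServer-20 provider name, are the same in 2.x.
ISchedulerFactory schedFact = new StdSchedulerFactory(properties);
IScheduler svcScheduler = schedFact.GetScheduler();
svcScheduler.Start();

// Jobs and triggers in 2.x are created through the builder API:
IJobDetail job = JobBuilder.Create<MyJob>()
    .WithIdentity("myJob", "myGroup")
    .Build();
ITrigger trigger = TriggerBuilder.Create()
    .WithIdentity("myTrigger", "myGroup")
    .StartNow()
    .WithSimpleSchedule(x => x.WithIntervalInMinutes(5).RepeatForever())
    .Build();
svcScheduler.ScheduleJob(job, trigger);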
