Creating Scoped Job Requests for QuickBooks Web Connector (QBWC)?

I'm developing a web application that communicates with many different Web Connectors, sometimes simultaneously.
The problem I'm running into is that I have a single, global job queue on the server that all Web Connectors are polling from.
Is there any way to create an XML job request that specifies which Web Connector should run a particular job? I'm wondering if the OwnerID tag could be used to match a job to a specific local .qwc configuration, or possibly FileID? Beyond these two values, I can't see any other way to influence whether a given Web Connector runs a specific job or not.
I'm trying to avoid having each individual Web Connector run every single job on the queue, whether it was intended for them or not.
Thanks!!

The Web Connector itself doesn't have any logic like this - it's up to your SOAP server implementation to only feed the correct requests to the Web Connector.
This is what the username parameter in the .QWC files/Web Connector is for.
If you have a single username, everything gets sent to just a single Web Connector.
If you have multiple usernames, you specify which username each request is queued under, and only the Web Connector whose .QWC file contains that username will run the items queued for it.
When you create your .QWC files, use the corresponding usernames in each .QWC file.
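As a rough illustration of that username-based scoping (class, method, and queue names here are hypothetical, not part of the QBWC SDK), the SOAP service can record the authenticated username against the ticket during authenticate and then only ever dequeue that user's jobs in sendRequestXML:

```csharp
using System;
using System.Collections.Concurrent;

public class QbwcService
{
    // ticket -> username, established during authenticate()
    private static readonly ConcurrentDictionary<string, string> Sessions = new();

    // username -> pending qbXML requests queued for that Web Connector
    private static readonly ConcurrentDictionary<string, ConcurrentQueue<string>> JobQueues = new();

    public string[] Authenticate(string userName, string password)
    {
        // TODO: validate credentials against your own user store.
        string ticket = Guid.NewGuid().ToString();
        Sessions[ticket] = userName;

        bool hasWork = JobQueues.TryGetValue(userName, out var queue) && !queue.IsEmpty;
        // "" = use whatever company file is open; "none" = nothing queued for this user.
        return new[] { ticket, hasWork ? "" : "none" };
    }

    public string SendRequestXML(string ticket, string hcpResponse, string companyFileName,
                                 string country, int majorVersion, int minorVersion)
    {
        if (Sessions.TryGetValue(ticket, out var userName) &&
            JobQueues.TryGetValue(userName, out var queue) &&
            queue.TryDequeue(out var qbXml))
        {
            return qbXml;        // next job for *this* Web Connector only
        }
        return string.Empty;     // nothing queued under this username
    }
}
```

A separate queueing method (not shown) would enqueue each job under the username of the Web Connector that is supposed to run it.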

Related

Send reminder email with a Windows service

We have an ASP.NET MVC web app that we have installed for several clients/domains (more than 100). This web app works with an MS SQL database running on a Windows server. All the domains run under the same service, each with its own database on the same MS SQL Server.
Each client can create events for their users. On some days a single client could have more than 200 events.
We need to create a Windows service, running on the server that hosts all the clients, to send reminder emails for each client's events every few hours.
As explained above, each client has its own database, and there is a common database with all the clients' information and their database names.
What the service has to do is check, for each client, the events they have and send the corresponding emails.
We don't know if it is better to have one service go through all the clients or to have a service per client. If we have just one service, we would run it, for example, every 6 hours to send those emails.
If we have a service per client, we would need to create the service from the ASP.NET web app. Would that be possible?
What is the best approach for this?
You can use SQL Server Integration Services (SSIS) to do this job. All you need is to write an SSIS package and schedule it.
On each execution of this package you can check data and send emails accordingly.
I hope this idea will help.
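If you instead go with the single Windows service the question describes, the periodic check could look roughly like this sketch; the Clients table, its columns, and the SendRemindersForClient helper are assumptions for illustration, not part of the original setup:

```csharp
using System.Data.SqlClient;

public static class ReminderJob
{
    // Run this from a timer inside the Windows service, e.g. every 6 hours.
    public static void Run(string commonConnectionString)
    {
        using var common = new SqlConnection(commonConnectionString);
        common.Open();

        // Assumed schema: a Clients table in the common database listing each client's database name.
        using var cmd = new SqlCommand("SELECT Name, DatabaseName FROM Clients", common);
        using var reader = cmd.ExecuteReader();
        while (reader.Read())
        {
            SendRemindersForClient(reader.GetString(1));
        }
    }

    private static void SendRemindersForClient(string databaseName)
    {
        // Connect to the client's own database, select events due in the next window,
        // and send one reminder email per event (placeholder - details depend on your schema).
    }
}
```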

Why is the company file name supplied by the web service?

I understand that to access a company that is not currently open in QuickBooks, the web service needs to supply QuickBooks Web Connector with the file location as a return value to an authenticate() call.
This seems backwards to me. Why would the web service be in charge of telling the Web Connector where the relevant company file is? Wouldn't it make more sense for it to be managed by the Web Connector?
Here's the relevant explanation I've found within the QuickBooks Web Connector
Programmer’s Guide:
IF your web service wants to try a different company, supply the company pathname in the returned string. (You can supply an empty string if you want to use whatever company file happens to be open.) The web connector will respond by attempting to connect to QuickBooks again using that supplied string.
Why Would a Web Service Try a Different Company?
Why would a web service perform the second of these actions instead of simply just stopping altogether? In practice this approach is used when the web service remembers the company file path from session to session (a recommended practice) and wants to have a fall-back to use whatever company file is currently open in QuickBooks (by responding to the connectionError call with an empty string).
This is not as haphazard as it might seem. When a web service is added to the web connector, the web connector stores a unique FileID as a private data extension in the specified company. As a result, the web service can always verify that it is talking to the expected company file simply by checking the CompanyRet returned to your web service in the web connector’s first sendRequestXML call in the data exchange sequence. (Check the data extension list for the expected FileID.)
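(As a concrete illustration of that last point: a minimal sketch of the FileID check, assuming the web service has issued a CompanyQuery and is inspecting the returned CompanyRet; the element names follow qbXML's data-extension layout, so verify them against the OSR for your qbXML version.)

```csharp
using System.Linq;
using System.Xml.Linq;

public static class CompanyCheck
{
    // Call this with the CompanyQueryRs XML returned by QuickBooks.
    public static bool IsExpectedCompany(string companyQueryResponseXml, string expectedFileId)
    {
        var doc = XDocument.Parse(companyQueryResponseXml);

        // The Web Connector stores the FileID as a private data extension on the company,
        // surfaced as DataExtRet elements under CompanyRet.
        return doc.Descendants("DataExtRet")
                  .Any(ext => string.Equals((string)ext.Element("DataExtValue"),
                                            expectedFileId,
                                            System.StringComparison.OrdinalIgnoreCase));
    }
}
```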
This seems like a poor end-user experience; if they move their company file (assuming they want the Web Connector to operate without QuickBooks open), the web service will fail until that path is updated on the server side. It seems entirely plausible that an end user could do this without knowing it would break things.
Why is it structured this way? And more importantly: is there a way around this?
Why is it structured this way?
Because this is how Intuit built it.
is there a way around this?
No.

Windows Azure web role

We have planned to migrate our application to a web role, since a web role spreads server traffic across multiple instances. We have some queries regarding that; let me post them one by one.
1) Since a web role involves multiple instances (requests are redirected to different servers at run time based on the number of instances created), what happens to my session-related details that were maintained on one server (with InProc mode they reside in IIS) when my next request gets redirected to another server where those session details won't be available? Does Windows Azure take a copy of them too, or do we need to handle that manually?
2) Our application works like this: the presentation layer calls the web service, which in turn queries the database, and the results are displayed accordingly (presentation -> web service -> database). So when I make my presentation layer a cloud service web role, obviously I would need to make my service a web role as well. Am I right?
2.1) If so, what happens when I make the request from the presentation layer? How will the requests be carried?
2.2) I have my database in a separate VM (not Azure DB), hosted in SQL Express. When the service dynamically creates multiple instances, what happens to the database part?
2.3) Should we host the service and presentation in the same cloud service or in different ones? Which would be preferable?

Unable to distinguish between app server and web server in Rails

I'm confused about the difference between an app server and a web server.
As far as I know, a web server handles user requests, fetches from the database, renders the response back to the user, and so on.
Now my question is: what does an app server do in a web application?
Why is it useful to use an app server along with a web server?
A Web server exclusively handles HTTP requests, whereas an application server serves business logic to application programs through any number of protocols.
An example
As an example, consider an online store that provides real-time pricing and availability information. Most likely, the site will provide a form with which you can choose a product. When you submit your query, the site performs a lookup and returns the results embedded within an HTML page. The site may implement this functionality in numerous ways. I'll show you one scenario that doesn't use an application server and another that does. Seeing how these scenarios differ will help you to see the application server's function.
Scenario 1: Web server without an application server
In the first scenario, a Web server alone provides the online store's functionality. The Web server takes your request, then passes it to a server-side program able to handle the request. The server-side program looks up the pricing information from a database or a flat file. Once retrieved, the server-side program uses the information to formulate the HTML response, then the Web server sends it back to your Web browser.
To summarize, a Web server simply processes HTTP requests by responding with HTML pages.
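A bare-bones sketch of Scenario 1, with illustrative names (shown in C# only for concreteness; the idea is language-agnostic): one server-side program both looks up the price and formats the HTML.

```csharp
// Scenario 1 in miniature: one server-side program does the lookup *and* the HTML.
public class ProductPage
{
    public string HandleRequest(string productId)
    {
        decimal price = LookUpPrice(productId);   // query the database or flat file directly
        return $"<html><body>Price for {productId}: {price:C}</body></html>";
    }

    private decimal LookUpPrice(string productId)
    {
        // Database / flat-file lookup would go here.
        return 9.99m; // placeholder
    }
}
```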
Scenario 2: Web server with an application server
Scenario 2 resembles Scenario 1 in that the Web server still delegates the response generation to a script. However, you can now put the business logic for the pricing lookup onto an application server. With that change, instead of the script knowing how to look up the data and formulate a response, the script can simply call the application server's lookup service. The script can then use the service's result when the script generates its HTML response.
In this scenario, the application server serves the business logic for looking up a product's pricing information. That functionality doesn't say anything about display or how the client must use the information. Instead, the client and application server send data back and forth. When a client calls the application server's lookup service, the service simply looks up the information and returns it to the client.
By separating the pricing logic from the HTML response-generating code, the pricing logic becomes far more reusable between applications. A second client, such as a cash register, could also call the same service as a clerk checks out a customer. In contrast, in Scenario 1 the pricing lookup service is not reusable because the information is embedded within the HTML page. To summarize, in Scenario 2's model, the Web server handles HTTP requests by replying with an HTML page while the application server serves application logic by processing pricing and availability requests.
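And a matching sketch of Scenario 2, again with illustrative names: the pricing logic sits behind its own service interface, and the web-facing handler only formats HTML around whatever the service returns. Any other client, such as the cash register mentioned above, can call the same interface without touching any HTML.

```csharp
// Scenario 2 in miniature: the pricing logic lives behind its own service interface.
public interface IPricingService
{
    decimal GetPrice(string productId);   // raw data only - nothing about display
}

public class PricingService : IPricingService
{
    public decimal GetPrice(string productId)
    {
        // Business logic / database lookup running on the application server.
        return 9.99m; // placeholder
    }
}

// The web-server side just calls the service and formats the response.
public class ProductPageHandler
{
    private readonly IPricingService _pricing;
    public ProductPageHandler(IPricingService pricing) => _pricing = pricing;

    public string HandleRequest(string productId) =>
        $"<html><body>Price for {productId}: {_pricing.GetPrice(productId):C}</body></html>";
}
```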
Hope this is clear now!

Design pattern: ASP.NET API for RPC against a back-end application

I'm designing an API to enable remote clients to execute PowerShell scripts against a remote server.
To execute the commands effectively, the application needs to create a unique runspace for the remote client (so it can initialise the runspace with an appropriate host and command set for that client). Every time the client makes a request, the API will need to ensure the request is executed within the correct runspace.
An (over-simplified) view of the flow might look like this:
Client connects to Web API, POSTs credentials for the backend application
Web API passes these credentials through to the backend app, which uses them to create a RunSpace uniquely configured for that client
Web API and app "agree" on a linked session-runspace ID
Web API either informs client of session-runspace ID or holds it in memory
Client makes request: e.g. "GET http://myapiserver/api/backup-status/"
Web API passes request through to backend app function
Backend app returns results: e.g. "JSON {this is the current status of backup for user/client x}"
Web API passes these results through to remote client
Either timeout or logout request ends 'session' and RunSpace is disposed
(In reality, the PowerShell App might just be a custom controller/model within the Web API, or it could be an IIS snap-in or similar - I'm open to design suggestions here...).
My concern is, in order to create a unique RunSpace for each remote client, I need to give that client a unique "session" ID so the API can pass requests through to the app correctly. This feels like I'm breaking the stateless rule.
In truth, the API is still stateless, just the back-end app is not, but it does need to create a session (RunSpace) for each client and then dispose of the RunSpace after a timeout/end-session request.
QUESTIONS
Should I hack into the Authentication mechanism in ASP.NET MVC to spin-up the RunSpace?
Should I admit defeat and just hack up a session variable?
Is there a better SOA that I should consider? (Web API feels very neat and tidy for this though - particularly if I want to have web, mobile and what-have-you clients)
This feels like I'm breaking the stateless rule.
Your application is stateful - no way around it. You have to maintain a process for each client; that process has to run on one box, and the client always has to connect to that same box. So if you have a single server, no problem. If you have multiple, you have to use sticky sessions so the client always comes back to the same server (load balancers can do that for you).
Should I hack into the Authentication mechanism in ASP.NET MVC to
spin-up the RunSpace?
If you need authentication.
Should I admit defeat and just hack up a session variable?
No variable, just use plain in-memory session. If you have more than one server, use sticky sessions as explained above.
Is there a better SOA that I should consider? (Web API feels very neat
and tidy for this though - particularly if I want to have web, mobile
and what-have-you clients)
SOA does not come into this. You have a single service.
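As a concrete illustration of the "plain in-memory session" approach combined with a per-client RunSpace, here is a minimal sketch (class and member names are hypothetical; timeout handling and serializing access to a single runspace are left out): each authenticated client gets a session-runspace ID, API controllers look the runspace up by that ID, and logout or timeout disposes it.

```csharp
using System;
using System.Collections.Concurrent;
using System.Management.Automation.Runspaces;

public static class RunspaceRegistry
{
    // sessionId -> the client's dedicated runspace, held in this server's memory.
    private static readonly ConcurrentDictionary<string, Runspace> Runspaces = new();

    // Called after the client authenticates; returns the session-runspace ID.
    public static string Create()
    {
        var runspace = RunspaceFactory.CreateRunspace();
        runspace.Open();

        string sessionId = Guid.NewGuid().ToString("N");
        Runspaces[sessionId] = runspace;
        return sessionId;
    }

    // Called by API controllers to execute a request in the client's own runspace.
    public static Runspace Get(string sessionId) =>
        Runspaces.TryGetValue(sessionId, out var rs) ? rs : null;

    // Called on logout or timeout.
    public static void Dispose(string sessionId)
    {
        if (Runspaces.TryRemove(sessionId, out var rs))
        {
            rs.Close();
            rs.Dispose();
        }
    }
}
```

Because the dictionary lives in one box's memory, this only works behind a load balancer with sticky sessions, as noted above.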
