I understand that to access a company that is not currently open in QuickBooks, the web service needs to supply QuickBooks Web Connector with the file location as a return value to an authenticate() call.
This seems backwards to me. Why would the web service be in charge of telling the Web Connector where the relevant company file is? Wouldn't it make more sense for it to be managed by the Web Connector?
Here's the relevant explanation I've found in the QuickBooks Web Connector Programmer's Guide:
IF your web service wants to try a different company, supply the company pathname in the returned string. (You can supply an empty string if you want to use whatever company file happens to be open.) The web connector will respond by attempting to connect to QuickBooks again using that supplied string.
Why Would a Web Service Try a Different Company?
Why would a web service perform the second of these actions instead of simply stopping altogether? In practice, this approach is used when the web service remembers the company file path from session to session (a recommended practice) and wants a fall-back of using whatever company file is currently open in QuickBooks (by responding to the connectionError call with an empty string).
This is not as haphazard as it might seem. When a web service is added to the web connector, the web connector stores a unique FileID as a private data extension in the specified company. As a result, the web service can always verify that it is talking to the expected company file simply by checking the CompanyRet returned to your web service in the web connector’s first sendRequestXML call in the data exchange sequence. (Check the data extension list for the expected FileID.)
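For concreteness, the mechanism the guide describes boils down to something like the following (a C# sketch in the style of the SDK's ASMX samples; the OwnerID/FileID constants and the lookup helpers are placeholders, not part of the Web Connector interface):

```csharp
using System;
using System.Linq;
using System.Web.Services;
using System.Xml.Linq;

public class MyQbwcService : WebService
{
    // Hypothetical values; in practice the OwnerID comes from your .qwc file and the
    // FileID/company path come from whatever storage the web service uses to remember them.
    const string ExpectedOwnerId = "{placeholder-owner-id}";
    const string ExpectedFileId  = "{placeholder-file-id}";

    [WebMethod]
    public string[] authenticate(string strUserName, string strPassword)
    {
        string ticket = Guid.NewGuid().ToString();

        // Company file path remembered from a previous session (hypothetical lookup).
        // Returning "" instead means "use whatever company file is currently open".
        string companyFile = LookupStoredCompanyPath(strUserName) ?? "";

        return new[] { ticket, companyFile };
    }

    [WebMethod]
    public string sendRequestXML(string ticket, string strHCPResponse, string strCompanyFileName,
                                 string qbXMLCountry, int qbXMLMajorVers, int qbXMLMinorVers)
    {
        // On the first call of a session, strHCPResponse carries a CompanyRet; check its
        // DataExtRet list for the FileID the Web Connector stored when the app was added.
        bool expectedCompany = true;
        if (!string.IsNullOrEmpty(strHCPResponse))
        {
            expectedCompany = XDocument.Parse(strHCPResponse)
                .Descendants("DataExtRet")
                .Any(d => (string)d.Element("OwnerID") == ExpectedOwnerId
                       && (string)d.Element("DataExtValue") == ExpectedFileId);
        }

        if (!expectedCompany)
            return "";                        // stop: this is not the company we expected

        return BuildNextQbXmlRequest(ticket); // hypothetical: the next queued qbXML request
    }

    static string LookupStoredCompanyPath(string user) => null; // placeholder
    static string BuildNextQbXmlRequest(string ticket) => "";   // placeholder
}
```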
This seems like a poor end-user experience; if they move their company file (assuming they want the Web Connector to operate without QuickBooks open), the web service will fail until that path is updated on the server side. It seems totally plausible that an end user could do this without knowing it would break things.
Why is it structured this way? And more importantly: is there a way around this?
Why is it structured this way?
Because this is how Intuit built it.
is there a way around this?
No.
I am currently trying to find a way for a customer to use Power Query (an Excel plugin) to access their published OData feed (hosted by Microsoft NAV 2013 R2).
For security reasons the NAV server is set to accept only Windows as a credential type. This means the current user's credentials on the client are passed on to the web service.
The problem: the users of the system are often off site, working on another domain with a VPN connection to the NAV environment. As a result, Power Query does not pass the "correct" AD information to the published OData feed, which means the user is not authorized.
I am looking for a way to change which AD credentials are sent through Power Query and on to the OData web service.
The users have no problem typing the web service address into a web browser, entering their Windows credentials when prompted, and accessing the feed. But in Power Query there is no option to enter custom Windows credentials when refreshing the data.
I've tried WebAPIKey and Basic authentication, but since the NAV server/web service is set to accept only Windows authentication, I'm in the dark.
Any thoughts?
I got this answer from Curt Hagenlocher (a moderator on TechNet):
I'm afraid this isn't something we currently support, though we have considered implementing it. We do loosely track feature requests and use them to prioritize future work.
(https://social.technet.microsoft.com/Forums/en-US/03c529ba-5f20-4bc1-84de-35cc91e7c1a6/power-query-custom-windows-credentials-authentication-with-odata-feeds?forum=powerquery)
This is more of an architectural question than anything code-related. I'm working on a project to auto-create invoices in QuickBooks using data from our ERP, in order to speed up the process of adding that data. It seems to be a requirement that, in order to access data in the QuickBooks company file directly, you have to use the SDK (which actually opens the file in a QuickBooks application instance). Since my application will need to access two company files depending on where the request came from, I can't keep one file open, and this opening/closing adds roughly 15 seconds to each request.
Why is this a requirement, does anyone know? Is there another way to directly access the data and bypass the massive overhead of the QuickBooks application?
Thanks!
Why is this a requirement, does anyone know?
My understanding (based on a really old forum post by an Intuit developer) is that the SDK pushes data to QuickBooks through Windows COM calls that rely on a GUI message pump.
No GUI present = no message pump = no connection to QuickBooks. Thus, you need the GUI components of QB in place.
Is there another way to directly access the data and bypass the massive overhead of the QuickBooks application?
No.
If it's a problem, consider batching your requests: queue them up and send more than one at a time in a single batch. This is what the Web Connector (and most other QuickBooks-integrated apps) do to avoid this issue. It's very, very rare that an application truly needs a real-time connection to QuickBooks, and be careful if you think you do: QuickBooks is not a great candidate for real-time data access, and there are plenty of things that can lock you out of it, so don't build an app that assumes you'll always be able to connect.
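If you go the batching route, the qbXML side makes it straightforward: a single message set can carry many requests, so one open/close of the company file services the whole queue. A rough sketch (C#; the PendingInvoice type is a placeholder and the InvoiceAdd fields are heavily abbreviated, since a real request also needs line items and more):

```csharp
using System;
using System.Collections.Generic;
using System.Xml.Linq;

class PendingInvoice { public string CustomerName; public DateTime Date; }  // placeholder type

static class QbBatch
{
    // Drain the queue into ONE qbXML message set, so a single QuickBooks
    // session (one open of the company file) processes every invoice.
    public static string BuildBatchRequest(IEnumerable<PendingInvoice> queue)
    {
        var msgs = new XElement("QBXMLMsgsRq", new XAttribute("onError", "continueOnError"));
        int requestId = 1;
        foreach (var inv in queue)
        {
            msgs.Add(new XElement("InvoiceAddRq",
                new XAttribute("requestID", requestId++),
                new XElement("InvoiceAdd",
                    new XElement("CustomerRef", new XElement("FullName", inv.CustomerName)),
                    new XElement("TxnDate", inv.Date.ToString("yyyy-MM-dd")))));
        }
        return "<?xml version=\"1.0\"?><?qbxml version=\"13.0\"?>"
             + new XElement("QBXML", msgs).ToString(SaveOptions.DisableFormatting);
    }
}
```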
I've just begun to read the QB developer documentation and have come to the conclusion that, for a web-enabled application that will sync/remote-backup QB files between two machines over the Internet, the QBWC is the 'approved' way to accomplish the task. The .NET application samples in the QB SDK (V12) use WSDL and SOAP rather than WCF.
But before I commit to going that route, I am asking if anyone has a better approach. I'd prefer to use WCF and MS Sync Framework, but I don't want to head down that road if it will mean using a cannon to kill a mosquito.
Thanks
You really hinted at two separate goals here, so I'll address each specifically:
... remote backup QB files between two machines ...
If your goal is BACKUP then the Web Connector is certainly not the answer. The purpose of the Web Connector is to enable integration between QuickBooks and web applications, via the QuickBooks API/SDK. Since not all data stored within QuickBooks is accessible via the API, the Web Connector is not appropriate for backup. It is impossible to get a complete, accurate backup of the entire QuickBooks data set via the Web Connector.
On the other hand...
... web-enabled application that will sync ...
If your goal is to allow integration/sync of data between your web app and QuickBooks, the Web Connector is a decent solution. Yes, it uses SOAP (with a grand total of only about 5 very simple methods). No, you can't use WCF/anything else without writing your own version of the Web Connector.
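For reference, the full set of methods is small enough to show on one screen. A bare-bones skeleton (C#, ASMX-style like the SDK samples, with method names matching the QBWC WSDL and including the optional version-check calls; the return values below are placeholders, not a working implementation):

```csharp
using System;
using System.Web.Services;

public class QbwcSkeleton : WebService
{
    [WebMethod]
    public string serverVersion() => "1.0";

    [WebMethod]
    public string clientVersion(string strVersion) => "";       // "" = accept any QBWC version

    [WebMethod]
    public string[] authenticate(string strUserName, string strPassword)
        => new[] { Guid.NewGuid().ToString(), "none" };          // ticket + "none" = no work this session

    [WebMethod]
    public string sendRequestXML(string ticket, string strHCPResponse, string strCompanyFileName,
                                 string qbXMLCountry, int qbXMLMajorVers, int qbXMLMinorVers)
        => "";                                                   // would return the next qbXML request

    [WebMethod]
    public int receiveResponseXML(string ticket, string response, string hresult, string message)
        => 100;                                                  // percent complete; 100 ends the run

    [WebMethod]
    public string connectionError(string ticket, string hresult, string message)
        => "done";                                               // or a company file path to retry with

    [WebMethod]
    public string getLastError(string ticket) => "";

    [WebMethod]
    public string closeConnection(string ticket) => "OK";
}
```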
If you add more details about specifically what you're looking to do with specifically what data, you'll probably get some better answers and suggestions about approach.
I have some delphi code which, given a list of items, calculates the total price taking into account any special deals that might apply.
This code is non-trivial to rewrite in another language.
How should I set it up to communicate with a website running on the same server? The website will need to ask it for a price every time the user updates their shopping cart. It's possible that there will be multiple concurrent requests.
The delphi code needs to maintain an in-memory list of special deals, periodically refreshed from a database. So it cannot simply be executed every time or anything as simple as that.
I don't know what the website is written in, or even which http server it runs under, so I'm just looking for ideas or standard methods.
It sounds like the win32 app is already running as a Windows Service on the box. So, if you can't modify that service, you are going to have to deal with whatever way it wants to accept and respond to requests. This could be through sockets or some higher level communication protocol like web services.
You could do a couple of things. Write an assembly that knows how to communicate with the service and have your web site use that assembly. Or you could build a shim service that knows how to communicate with the legacy service, but exposes communication over higher level protocols such as web services. Either way will have the benefit of hiding the concurrency, threading and communications issue behind an easy to call interface, but the latter will make communicating with the service easier for everyone going forward.
If you can modify the delphi app to take an XML request and respond with an XML answer over a TCP socket (ideally using the HTTP protocol), you will be able to make it interoperate with most web server frameworks relatively easily. But the exact details of how to make that integration happen will depend on the language/framework it was written in.
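As an illustration of how cheap the web-site side becomes once the Delphi app speaks XML over HTTP, here is what the call might look like from a .NET site (C#; the local endpoint URL and the XML shapes are hypothetical, and whatever framework the site actually uses would have an equivalent one-liner):

```csharp
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

static class PricingClient
{
    static readonly HttpClient Http = new HttpClient();

    // POST the cart as XML to the Delphi process listening on a local port
    // (hypothetical endpoint) and return its XML price response.
    public static async Task<string> GetPriceXmlAsync(string cartXml)
    {
        var content = new StringContent(cartXml, Encoding.UTF8, "text/xml");
        HttpResponseMessage response = await Http.PostAsync("http://localhost:8080/price", content);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync();
    }
}
```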
If the web server is on windows you can compile your delphi app as a DLL that can return XML or HTML, taking parameters as part of the URL or a POST operation. Some details on making a Delphi DLL for web servers are here.
It doesn't matter what web server or OS the existing system is running under. What matters is what you want YOUR code to run under. If it is windows then the easiest solution would be to use WebBroker and write a custom ISAPI application, or use SOAP to expose web services. The first method could be used if you wanted to write a rest like API for instance, the second if your web application has the ability to consume web services.
Another option, if you are running both on the same box under IIS, is to create a COM/Automation object which you then invoke via server side scripting (ASP). If the application is an ASP.NET application, then I would use PRISM to port your code into an assembly.
I have done this with a quite complicated workers' compensation calculator. I created a Windows service using RemObjects SDK. The calculations are exposed as a SOAP method, so they can be accessed by nearly anything.
It's not necessary to use RemObjects in the service, but it makes things much easier because it handles a lot of the underlying plumbing. The clients don't need RemObjects; they just need to be able to call SOAP methods. Nearly any programming language can do that.
You could also create an ISAPI DLL for IIS that exposes a SOAP interface. This would be useful if other websites on different servers needed access to the methods. However, I have handled this in my case by opening a port in the firewall to access my Windows service.
There are a lot of examples on the web. A couple of places to start reading are About.Com and Dr Bob.
Turn this app into a Windows service. Then write a web service that communicates with your Windows service. You should spend some time designing that web service, because it is going to be your consistent interface, shielding the old Delphi app. In the future, whenever you want to write a web app, a mobile app, or whatever else you imagine, you will have one consistent interface: an XML web service.
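As a sketch of how small that facade can be (shown here as a C#/WCF contract, purely illustrative; the operation name and the XML-in/XML-out shape are assumptions, and the implementation would forward the call to the Delphi Windows service over whatever transport it already speaks):

```csharp
using System;
using System.ServiceModel;

// Hypothetical facade: the only thing web/mobile clients ever see.
[ServiceContract]
public interface IPricingService
{
    // Cart goes in as XML, total price (with special deals applied) comes back as XML.
    [OperationContract]
    string CalculatePrice(string cartXml);
}

public class PricingService : IPricingService
{
    public string CalculatePrice(string cartXml)
    {
        // Forward to the legacy Delphi service here (socket, named pipe, HTTP, ...).
        throw new NotImplementedException();
    }
}
```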
A popular way to integrate a web application with background services is a message broker.
The message flow would be:
the web application sends a "calculation request" message to a destination on the message broker; the message contains all needed parameters plus a correlation id used to match the request with the response from the Delphi service
one Delphi service (or, in a high-availability / load-balanced environment, several) handles the messages: it pulls the next incoming message, processes it by feeding the parameters to the calculation engine, and sends a "calculation result" message back to the web server
the web server can either synchronously wait for the response (discarding responses that have no matching correlation id) and build the result HTML document, or continue with other tasks and receive the calculation result asynchronously in a separate thread, for example in an Ajax-based web application
For an introduction, see this slideshow about the Dopplr image service:
http://de.slideshare.net/carsonified/dopplr-its-made-of-messages-matt-biddulph-presentation
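To make the correlation-id handshake concrete, here is a minimal sketch of the request side in C#, assuming a RabbitMQ broker and the classic synchronous RabbitMQ.Client API (the broker choice, queue names, and message shape are all assumptions; any broker with reply-to/correlation-id semantics works the same way):

```csharp
using System;
using System.Text;
using RabbitMQ.Client;

static class CalculationRequester
{
    // Publish one "calculation request" carrying a correlation id and a reply-to queue,
    // so the Delphi worker's answer can be matched back to this cart update.
    public static string SendRequest(IModel channel, string cartXml)
    {
        string replyQueue = channel.QueueDeclare().QueueName;   // private, auto-delete reply queue
        string correlationId = Guid.NewGuid().ToString();

        IBasicProperties props = channel.CreateBasicProperties();
        props.CorrelationId = correlationId;
        props.ReplyTo = replyQueue;

        channel.BasicPublish(exchange: "",
                             routingKey: "calculation_requests", // queue the Delphi service consumes
                             basicProperties: props,
                             body: Encoding.UTF8.GetBytes(cartXml));

        return correlationId;   // the web app waits on replyQueue for a reply bearing this id
    }
}
```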
If you can make it a service (but not a library), you have to do inter-process communication somehow - there are a few ways to do this on Windows:
Sockets directly, which is hardest since you have to do marshalling/auth yourself
Shared Memory (yuck!)
RPC, which works great but isn't trivial
DCOM, which is easier but a pain to configure
WCF - but can you call it from your Windows Service written in Delphi?