Firebird 3 connection over internet - database-connection

I have software developed in Delphi with a Firebird 3 database that is already finished. I would like to distribute this software to clients without installing the database physically on the client machine, so the database would stay on my server. I do not like the idea of distributing the database to the client, for several reasons, so that would be my last resort.
I have considered several possibilities: RDP, a direct connection through port 3050, and a VPN (which would best suit the way the software works). In a test I performed with a VPN using the IKEv2 protocol I obtained excellent results in terms of performance and stability, but I doubt this is the best approach, since a VPN connection would have to be created for each user who connects to the software.
So I would like to hear about experiences with Firebird database connections over the internet.

Related

How to license number of users at the database?

Given a Delphi and InterBase client-server application, I'd like to license the application by the number of users at the database. How can this be done with commercial licensing software? I don't see any of them listing features that would cover this. Every user initially logs on to the database. The database seems so available that it would be open to any user - or at least to administrators. Would I also have to write a Delphi exe or DLL to run on the server - perhaps as a function in the database - with the licensing connected to that? Not sure how to proceed.
BTW, InterBase licenses simultaneous users, but I think they wrote that right into the server; I want something similar.
To control simultaneous client connections you definitely need a server-side application.
It can be a simple TCP/IP socket server running as a service (a daemon on Linux) or another server layer (MIDAS?).
When your client app starts, it calls a server method, for example Session.Connect; there you count active connections and return False (no code) when the maximum limit has been reached.
When the application closes, you notify the server with Session.Disconnect to decrease the connection count.
It is also a good idea to keep a live (permanent) connection between the client app and the server service (as I said, sockets) to handle application hang-ups and uncontrolled restarts; process that event (for example OnSocketDisconnect on the server side) to decrease the connection count and handle the disconnect properly, for example by writing to logs, etc.
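Below is a minimal sketch of that counting logic in Delphi, assuming a plain threaded socket server. The class and method names (TSessionRegistry, TryConnect) are placeholders, not part of any existing framework:

    unit SessionRegistry;

    interface

    uses
      System.SyncObjs;

    type
      // Thread-safe session counter consulted by the socket server on
      // Session.Connect / Session.Disconnect and on socket-disconnect events.
      TSessionRegistry = class
      private
        FLock: TCriticalSection;
        FCount: Integer;
        FMaxSessions: Integer;
      public
        constructor Create(AMaxSessions: Integer);
        destructor Destroy; override;
        function TryConnect: Boolean;   // False once the licensed limit is reached
        procedure Disconnect;
      end;

    implementation

    constructor TSessionRegistry.Create(AMaxSessions: Integer);
    begin
      inherited Create;
      FLock := TCriticalSection.Create;
      FMaxSessions := AMaxSessions;
    end;

    destructor TSessionRegistry.Destroy;
    begin
      FLock.Free;
      inherited;
    end;

    function TSessionRegistry.TryConnect: Boolean;
    begin
      FLock.Acquire;
      try
        Result := FCount < FMaxSessions;
        if Result then
          Inc(FCount);          // one more licensed seat in use
      finally
        FLock.Release;
      end;
    end;

    procedure TSessionRegistry.Disconnect;
    begin
      FLock.Acquire;
      try
        if FCount > 0 then
          Dec(FCount);          // seat released (normal close or dropped socket)
      finally
        FLock.Release;
      end;
    end;

    end.

The server would call TryConnect when a client issues Session.Connect, and Disconnect both for Session.Disconnect and from its socket-disconnect handler, so crashed or hung clients release their seat as well. (In Delphi 7 the unit is SyncObjs rather than System.SyncObjs.)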
Of course, the communication should be encrypted (with a handshake) to keep unwanted guests out.
You can also play with SIM card readers, etc.
This method will not provide an industrial (nuclear) level of security, but if coded correctly it can take some time even for an expert hacker to break it.
Or, you may take a look at ready-made protection tools like SafeNet (HASP protection).
Also, Firebird (and maybe InterBase) has ON CONNECT / ON DISCONNECT database triggers, where a user with sufficient privileges can read the connection count. But these can easily be changed if the database is stored on the customer's server.
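For illustration, the current attachment count can also be read from the MON$ATTACHMENTS monitoring table (available since Firebird 2.1). A rough sketch using FireDAC from recent Delphi versions; the database path and credentials are placeholders:

    program CountAttachments;
    {$APPTYPE CONSOLE}

    uses
      System.SysUtils,
      FireDAC.Comp.Client, FireDAC.Stan.Def, FireDAC.Stan.Async,
      FireDAC.DApt, FireDAC.Phys.FB, FireDAC.ConsoleUI.Wait;

    var
      Conn: TFDConnection;
      Count: Integer;
    begin
      Conn := TFDConnection.Create(nil);
      try
        Conn.DriverName := 'FB';
        Conn.Params.Database := 'server:C:\Data\MyDB.FDB';  // placeholder
        Conn.Params.UserName := 'SYSDBA';                   // placeholder
        Conn.Params.Password := 'masterkey';                // placeholder
        Conn.Open;
        // MON$ATTACHMENTS lists every current attachment, including this one
        Count := Conn.ExecSQLScalar('select count(*) from MON$ATTACHMENTS');
        Writeln('Current attachments: ', Count);
      finally
        Conn.Free;
      end;
    end.

As the answer notes, anything enforced inside a database that sits on the customer's own server can be altered by the customer, so this is only a building block, not a protection scheme by itself.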

Experiences with Firebird server over the internet with multiple clients?

Does somebody have real experience with Firebird databases over the internet?
I have a typical Windows accounting/ERP application (done with Delphi) that works pretty well with the Firebird database server. Now my users (approximately 300, but the number should grow) also want to work "in the cloud" (connecting from the office, from the laptop, from home, etc.). It would be a lot of work to recreate everything as a standard web application (say, HTML+CSS+JS+PHP+MySQL), so I'm considering keeping the Windows client (I don't care about other OSes) but, instead of the server living on the clients' LANs, moving it to a pair of dedicated servers that I will rent (one primary and one secondary against failures, for a start).
While searching I came across this FAQ, http://www.firebirdfaq.org/faq53/, which explains that the Firebird protocol isn't ideal for working over the internet; still, all my users today have at least a 1 Mbit/s ADSL internet connection (I don't think that is as slow as the FAQ suggests).
Has somebody done this? What was the experience? How secure are Firebird servers when open to the internet? How well do they scale?
I know that building a "middleware" layer, with SOAP for example, would be more conventional, but the solution I'm evaluating here is much faster and easier (I still have some work to do on replication, backup and heartbeat services, but it's much less than redoing everything for the web).
Thanks! Edit: the Firebird version is 2.5.
I have been trying to "push" the Firebird core developers to improve the Firebird protocol to get better speed over high-latency networks (i.e. the internet). Recently, Dmitry Yemanov published some articles on his blog about this subject (dyemanov.blogspot.com). It seems that there is room for optimization, and I would really like to see this come in FB 2.5.3 and FB 3.0, although there is no guarantee of it happening in those versions or anytime soon. You can vote for the improvement here: http://tracker.firebirdsql.org/browse/CORE-2530
Security? You may try to set up a VPN. It may also help with speed, since most of the VPN software out there (Zebedee, etc.) can compress the data being transferred, which speeds up data transfer in some cases.
Some of my customers do use traditional Firebird client/server over the internet. It is much slower compared to a local network, and of course how much slower depends basically on the link speed and latency. You can do some optimization on the client side too, using a metadata cache, etc., but don't expect miracles with the current protocol. I would say that for all-day work, using Terminal Services would be the better option for now.
Regarding the scaling question: Firebird runs in production on big iron servers (512 GB of RAM, 100,000 concurrent users). For example:
"We run Firebird to power larger systems (for 12 government agencies and 3 banks). It has approximately 100,000 end users multiplexed through 2,500 (max) pooled connections."
https://plus.google.com/111558763769231855886/posts/Q1ACy1yyTgP
The protocol in Firebird 2.5 is improved, and there is still room left for 3.0, but you can check what has already been done:
http://asfernandes.blogspot.com/2009/07/network-latency-influence-on-firebird.html
And the future enhancements planned for 3.0:
http://www.firebirdnews.org/?p=6953
To protect your connection, I guess the best bet is an SSL/SSH tunnel (it can be OpenVPN) with a high compression option:
http://mapopa.blogspot.com/2010/11/securing-firebird-using-ssh-tunnel.html
The FB protocol problem isn't about bandwidth, but latency. In my experience, some operations can be very slow over the internet/VPN compared to a LAN or local connection. I haven't examined the issue further, since I don't really run applications over an internet connection.
However, I suggest a three-tier model for the application: create your own application server, which runs on the database server (or on the same network). Let the clients talk to the application server and you get maximum performance.
There are some N-tier application/middleware frameworks for Delphi:
RemObjects SDK and DataAbstract
RealThinClient
kbmMW
Delphi's own DataSnap
MidWare
With those you can get data compression, encryption, binary messages (faster than SOAP) etc.
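As a rough, framework-agnostic illustration of that three-tier idea, a minimal Delphi application server built on Indy's TIdHTTPServer could run next to the database and send only query results to the clients. The port, route and JSON payload below are made up:

    program AppServerSketch;
    {$APPTYPE CONSOLE}

    uses
      System.SysUtils, IdContext, IdCustomHTTPServer, IdHTTPServer;

    type
      THost = class
      public
        procedure HandleGet(AContext: TIdContext;
          ARequestInfo: TIdHTTPRequestInfo; AResponseInfo: TIdHTTPResponseInfo);
      end;

    procedure THost.HandleGet(AContext: TIdContext;
      ARequestInfo: TIdHTTPRequestInfo; AResponseInfo: TIdHTTPResponseInfo);
    begin
      // In a real server, the Firebird query would run here, close to the
      // database, so only the (small) result crosses the slow link.
      if ARequestInfo.Document = '/customers' then
      begin
        AResponseInfo.ContentType := 'application/json';
        AResponseInfo.ContentText := '[{"ID":1,"Name":"Acme"}]';  // placeholder data
      end
      else
        AResponseInfo.ResponseNo := 404;
    end;

    var
      Server: TIdHTTPServer;
      Host: THost;
    begin
      Server := TIdHTTPServer.Create(nil);
      Host := THost.Create;
      try
        Server.DefaultPort := 8080;            // assumed port
        Server.OnCommandGet := Host.HandleGet;
        Server.Active := True;
        Writeln('Application server listening on port 8080; press Enter to stop');
        Readln;
      finally
        Server.Active := False;
        Server.Free;
        Host.Free;
      end;
    end.

The frameworks listed above give you the same shape of solution, with compression, encryption and binary serialization already built in.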
You can implement TCP/IP packet encryption/decryption directly in the Firebird engine itself.
Personally, I downloaded the Firebird 2.5 source code and injected secure tunnelling code directly into its low-level communication layer (the INET socket layer). Now encryption/decryption is done directly by the Firebird engine for each TCP/IP packet, both on the server side and on the client side (fbclient.dll).
So there is no need to restructure the client application beyond adding one line of code that provides fbclient.dll with the secret key you choose to encrypt the communication. The same secret key must be declared in the firebird.conf file of your server installation.
I have also implemented a proxy negotiation solution in fbclient.dll to allow the TCP/IP packets to pass through any proxy server (Microsoft ISA Server, for example).
For us, this architecture has been working for more than a year in a real production system.
kbmMW CodeGear Edition is free but without source. It can be used for commercial apps.
Download it after registering at: https://portal.components4developers.com
In case you see certificate errors (you shouldn't, but I know we have heard that some people do), accept and ignore them. The site is valid despite the certificate error.
kbmMW CodeGear Edition contains a subset of kbmMW Professional Edition, but supports the following Delphi database APIs:
Borland Database Engine
DBExpress
kbmMemTable
SQLite3
It supports binary, binary over HTTP, XML and SOAP protocols for communication with clients.
It contains everything you need, including:
unified remote custom method invocation
unified remote dataset query, execute and data change resolving
unified database meta data handling and creation (tables, fields, indexes, generators/sequencers)
optional automatic proxying of requests to another server and proxying results back to original requester
full native XML DOM and SAX support
full dataset briefcase support as CSV, or binary data
advanced but simple to use wizard for creating new application server services
There is one caveat, though: the newest version of kbmMW CodeGear Edition only supports the newest Delphi version. You can still download older kbmMW CodeGear Editions matching older Delphi releases.
kbmMW Professional Edition and kbmMW Enterprise Edition do not have such limitations, and currently support D7, D2006, D2007, D2010, DXE and DXE2, along with their Embarcadero C++ counterparts.
best regards
Kim Madsen
www.components4developers.com

Database sync solutions for Delphi [closed]

I am looking for some starting points for integrating a Win32 Delphi application's data with a remote database for a web application.
Problem(s) this project intends to solve:
1) The desktop app does not perform well over VPNs. Users in remote offices could use the web app instead.
2) Some companies prefer a web app to the desktop app.
3) Mobile devices could hit the web app as a front end.
Issues I've identified:
The web application will run on a Unix-based system, probably Linux. The desktop application uses NexusDB, while the web application will likely use Postgres. Dissimilar platforms and databases.
Since we're using Delphi, it appears the Microsoft Sync Framework is not available for this project.
My first thought was to give the web app a standard REST API and have the desktop app hit the API as a client every n minutes from the local database server. I already see tons of issues with this!
Richard, I have been down this path before and all I can say is DON'T DO IT! I used to work for a company that had a large Delphi desktop application (over 250 forms) running on DBISAM (very similar to what you have). Clients wanted a "web" interface so people could work remotely and then have the web app and desktop app sync changes. Well, a few years later the application was horrible: there were data issues, and the user workflow was terrible, because managing the same data in two different places is a nightmare.
I would recommend moving your database to something like MySQL (which both the Delphi and web clients hit) and using one database between the two interfaces. The reason the Delphi client does not work well over the VPN is that desktop databases like NexusDB and DBISAM copy far too much data over the pipe when they run queries (they pull back all the data and then filter/order it, etc.). They are not truly client/server like SQL Server or MySQL, where all the heavy lifting is done on the server and only the results come back. Of course, moving the Delphi app to a database like MySQL could alleviate the speed issues altogether, but that doesn't solve #2 and #3.
Another option is to move the entire application to the web and have only one application to support. Of course, a good UI developer in a tool like Delphi can always make a user interface superior to a web app - especially in data-entry-heavy applications - so that may not be an option for you.
I would be very wary of "syncing data".
My 2 cents worth.
Mike
If you use a REST-based ORM, you could have, for instance, both AJAX and Delphi client applications calling the same Delphi server, using JSON as the transmission format, HTTP/1.1 as the remote connection layer, and Delphi and JavaScript objects to access the data.
For instance, if you type http://localhost:8080/root/SampleRecord in your browser, you'll receive something like:
[{"ID":1},{"ID":2},{"ID":3},{"ID":4}]
And if you ask for http://localhost:8080/root/SampleRecord/1 you'll get:
{"ID":1,"Time":"2010-02-08T11:07:09","Name":"AB","Question":"To be or not to be"}
This can be consumed by any AJAX application, if you know a bit about JavaScript.
And the same HTTP/1.1 RESTful requests (GET/POST/PUT/DELETE/LOCK/UNLOCK...) are already available to any HTTP/1.1 client application. The framework implements the server using the very fast kernel-mode http.sys (faster than any other HTTP server on Windows) and the fast HTTP API for the client. You can even use HTTPS for a secure connection.
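As a quick sketch of what consuming such an endpoint looks like from a Delphi client, here using Indy's TIdHTTP and the System.JSON unit from recent Delphi versions (independent of whatever client classes the framework itself provides; this only shows how plain the protocol is):

    program RestClientSketch;
    {$APPTYPE CONSOLE}

    uses
      System.SysUtils, System.JSON, IdHTTP;

    var
      Http: TIdHTTP;
      Raw: string;
      Rec: TJSONObject;
    begin
      Http := TIdHTTP.Create(nil);
      try
        // GET one record as JSON from the sample RESTful server above
        Raw := Http.Get('http://localhost:8080/root/SampleRecord/1');
        Rec := TJSONObject.ParseJSONValue(Raw) as TJSONObject;
        try
          Writeln('Name: ', (Rec.GetValue('Name') as TJSONString).Value);
          Writeln('Question: ', (Rec.GetValue('Question') as TJSONString).Value);
        finally
          Rec.Free;
        end;
      finally
        Http.Free;
      end;
    end.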
IMHO, using such an ORM is better than using only a database connection, because:
It will follow the n-tier principle more strictly: the business rules are written ONCE in the Delphi server, and you consume only services and RESTful operations on business objects;
It will use HTTP/1.1 for the connection, which is faster and more standard across the internet than any direct database connection, and can be strongly secured via HTTPS;
JSON and REST over HTTP are the de facto standard for AJAX applications (even Microsoft uses them for WCF);
The data will be transmitted as JSON, which is a very nice format for multiple front ends;
The stateless approach makes it very robust, even in disconnected mode;
Using a small local replica of the database (we encourage SQLite for this) allows client access in disconnected mode (for Delphi clients, or for HTML5 clients).
I recommend you have one database and two front ends (a web UI that calls SOAP methods for its back-end work, and a rich Delphi client based on SOAP method calls), plus a SOAP server tier that implements the SOAP-accessible methods containing your business logic.
From what you're describing, you think replication will simply speed you up, but what it will do instead is slow you down and cause replication, coherence and relational integrity problems that must be sorted out by hand (by you).
Take a look at this:
"CopyCat is a database replication engine, written as a component set for Embarcadero Delphi. CopyCat has been in production use since 2004, and is very stable. It is relied upon daily by a number of small to large businesses for applications ranging from inter-site synchronization, itinerant work, database backup and more. We are confident that it can fulfill your needs as well. Read on..."

Which Delphi technology to use?

I have a client/server application written in Delphi. Essentially all the application does is transfer XML data streams between a server application and connected clients. I am currently using the Indy TIdTCPServer component, but the server-side application keeps crashing on some of my installations, and it has been extremely difficult to debug. So I am wondering if there is some "architecture" I should be using that handles all the TCP/IP connection management and database connection pooling, allowing me to concentrate on the business logic.
Here are more details:
Clients must maintain a "persistent" connection. There are times when the server must notify and send data to all connected clients.
Clients connect from laptop computers using wireless air cards, so network "drops" are pretty common.
The backend database is SQL Server.
There can be upwards of 100 computers connected simultaneously.
When the server gets a new connection (TCPServer.OnConnect) I instantiate my own object containing its own SQL Server database connection. When TCP connections are dropped I in turn free these objects (and the associated database connections).
Client applications have a TTimer built into them. They routinely send heartbeats to the server, and if they "drop"/"lose" their connection they automatically establish a new one once the network is back.
Anyone have any suggestions on the best approach/architecture here?
I presume the Indy components would work, but at the same time I feel I am "reinventing the wheel" with respect to managing the connections.
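For reference, the per-connection object pattern described above maps naturally onto Indy 10's ContextClass mechanism, which lets Indy create and free the per-client object for you. A minimal console sketch; the port, class names and echo protocol are placeholders:

    program TcpServerSketch;
    {$APPTYPE CONSOLE}

    uses
      System.SysUtils, IdContext, IdCustomTCPServer, IdTCPServer;

    type
      // One instance per connected client, created and freed by Indy itself;
      // the per-client SQL Server connection would live here.
      TClientContext = class(TIdServerContext)
      public
        LoggedOnAt: TDateTime;
      end;

      THost = class
      public
        procedure DoConnect(AContext: TIdContext);
        procedure DoDisconnect(AContext: TIdContext);
        procedure DoExecute(AContext: TIdContext);
      end;

    procedure THost.DoConnect(AContext: TIdContext);
    begin
      TClientContext(AContext).LoggedOnAt := Now;
      // open the per-client database connection here
    end;

    procedure THost.DoDisconnect(AContext: TIdContext);
    begin
      // close the per-client database connection here; Indy frees the context
    end;

    procedure THost.DoExecute(AContext: TIdContext);
    begin
      // the real XML stream protocol goes here; this sketch just echoes one line
      AContext.Connection.IOHandler.WriteLn(AContext.Connection.IOHandler.ReadLn);
    end;

    var
      Server: TIdTCPServer;
      Host: THost;
    begin
      Server := TIdTCPServer.Create(nil);
      Host := THost.Create;
      try
        Server.ContextClass := TClientContext;   // must be set before activation
        Server.DefaultPort := 5000;              // assumed port
        Server.OnConnect := Host.DoConnect;
        Server.OnDisconnect := Host.DoDisconnect;
        Server.OnExecute := Host.DoExecute;
        Server.Active := True;
        Writeln('Listening on port 5000; press Enter to stop');
        Readln;
      finally
        Server.Active := False;
        Server.Free;
        Host.Free;
      end;
    end.

This does not solve the stability question by itself, but it removes the manual bookkeeping of pairing your own objects with Indy connections.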
Three component sets I am aware of that will take care of the nitty-gritty technical aspects of client/server applications for you:
kbmMW: http://components4developers.com/
Asta: http://www.astatech.com/index.asp
RemObjects: http://www.remobjects.com/
You may have to rework your applications to take advantage of the way these component sets work, but assuming you have properly separated layers, that shouldn't be too much of a hassle, and it will buy you the advantage of well-tested and widely used code for your client/server work.
If you want some light TCP/IP components, take a look at our SynCrtSock unit.
You'll find low-level classes to create IP clients and servers.
We implemented both TCP/IP and UDP/IP in one of our applications.
There is also a THttpServer class, which implements an HTTP/1.1 server and therefore follows HTTP/1.1 connection management. There is optional compression, and using HTTP/1.1 on a port other than 80 is not a bad idea. What is good about HTTP/1.1 is that it can pass through firewalls and can easily be put behind a VPN or hosted on another HTTP server (like IIS or Apache) via a proxy. There is even a FastCGI class, if you need such a server in a Linux-based solution.
Of course, a THttpClientSocket class does the same on the client side.
We use these classes to add HTTP/1.1 connection to our Open Source SQLite3 RESTful framework - http://synopse.info/forum/viewforum.php?id=2
See http://synopse.info/fossil/artifact?name=722e896e3d7aad1fe217b0e2e7903483e66d66d1 for the SynCrtSock unit. Open source; it works from Delphi 7 to Delphi 2010.
Misha Charrett's CSI Application Framework covers pretty much exactly what you're asking for.
It's an open-source Delphi framework that, at its heart, is a distributed message-passing and threading framework that allows XML message passing both from client to server and from server to client.
It can handle disconnections/reconnections and high client numbers, and there's an optional virtual database library that will handle SQL Server (or you could just use the same SQL Server access you're using now).
It's not particularly well known yet but I can tell you that it's been actively developed over the last few years and that the author Misha is very keen to assist anyone who's interested in using it in their application.
Well, it would probably require a complete rewrite of much of your C/S code, but instead of using the Indy components you could try a COM+ solution. Basically, you would create a COM+ component that is installed on the server, and your client applications would connect to this component and call its functions directly. Transaction management would be handled by Windows itself. It's also technically possible to create events, which would allow the server to do callbacks to the client, although that would make things a bit more complicated.
I don't think this solution would work out for you, though, unless you have a lot of experience with COM development in Windows and/or you're brave enough to try something different.
In the past, I had a similar problem where hundreds of clients had to connect to a single server, doing all kinds of database transactions. It has a steep learning curve, but my team and I managed to get things working, and once we understood the technique it resulted in a very stable and reliable solution that managed to have up to 500 users simultaneously doing updates and other actions in a one-time extreme stress test. But again, the learning curve is steep, so it might not be the solution you're looking for.
(Still, COM+ uses a lot of functionality that's built into Windows, like transaction management, database connection pooling and more.)
If you use Indy, each connection equals a thread.
Anyway, for connecting to MSSQL I suggest using SDAC from Devart (http://www.devart.com/sdac/), and for the connection layer HPScktSrvr, based on I/O completion ports, from http://www.torry.net/authorsmore.php?id=7131 (though I don't know what changes it will need for the TThread changes in newer VCL versions).
You build your client class around THPServerClient, set your new class as the server's ClientClass, and the framework will automatically create new clients for you.
You may also want to have a look at the ICS/Midware combo: http://www.overbyte.be/

Will embedded Firebird/Delphi cause a firewall 'hit'?

I'm looking at porting an InterBase 6 / Delphi 7 application to embedded Firebird in Delphi 2007. One of the problems we have is getting our users (often quite an unskilled bunch, really - though I love them to bits, naturally) to unblock our applications in their firewall. The Windows firewall itself is fairly straightforward, but they are often running McAfee or similar (they tend to buy cheap Dells with this stuff pre-installed), and it seems that each and every variation of this stuff has a slightly different user interface. Sigh.
Still, I'm digressing, sorry. Straight to the point: if my Delphi app connects to an embedded Firebird database, will I still need to allow/open something in the user's firewall (as I currently do when installing something that connects to 'normal' IB6)?
And if you've read this far (thanks): can embedded Firebird be used concurrently on a machine? Let's say we have two applications, both of which want to use DIFFERENT databases. Could the user run both of these apps simultaneously on the same machine, or is there some kind of port binding going on under the hood that we'd have to work around?
I have never had a problem with firewalls or McAfee with embedded Firebird. (I assume this is because the embedded version is not really a 'server' and does not require a port to operate.)
Yes, you can have two apps running concurrently; just keep the executables and databases in two different folders.
Even using Firebird in a non-embedded install on the local machine, we have never bumped into any firewall issues in hundreds of installations. You don't even have to use TCP/IP to connect to the database. We do use TCP/IP, but using the local shared-memory protocol would avoid the issue entirely.
Firebird makes an excellent embedded or semi-embedded database. We just install it in its normal mode and it runs in the background without any user intervention, 24x7, for years at a time.
As the embedded version of Firebird doesn't use TCP/IP to talk to the database, you'll be fine on single-user machines. Bear in mind that Firebird Embedded is single-user, so you won't be able to get two apps talking concurrently to the same database. To do that you'd need to install the Firebird server on the machine and use localhost:C:\Data\MyDB.FDB as the connection string in both apps.
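A minimal sketch of that connection-string difference, here with the IBX TIBDatabase component (any Firebird-capable access layer looks similar; paths and credentials are placeholders):

    program ConnectSketch;
    {$APPTYPE CONSOLE}

    uses
      SysUtils, IBDatabase;   // 'IBX.IBDatabase' in recent, unit-scoped Delphi versions

    var
      DB: TIBDatabase;
    begin
      DB := TIBDatabase.Create(nil);
      try
        // Embedded: a plain file path, no TCP/IP, one application at a time
        // DB.DatabaseName := 'C:\Data\MyDB.FDB';

        // Full/local server: host prefix, so several apps can share the database
        DB.DatabaseName := 'localhost:C:\Data\MyDB.FDB';

        DB.Params.Add('user_name=SYSDBA');    // placeholder credentials
        DB.Params.Add('password=masterkey');
        DB.LoginPrompt := False;
        DB.Open;
        Writeln('Connected');
      finally
        DB.Free;
      end;
    end.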
I use UIB to talk to Firebird (I wrote a persistence layer for the OPF I use with it); it's thread-safe (unlike IBX) and I've found it to be appreciably faster than IBX. There's a version that comes with the JVCL and a slightly later version at http://www.progdigy.com
