Informix Server - 5 databases with the same name

(Original question edited 12/21/22 - it had the wrong Informix server name; it should be Informix Dynamic Server.)
I have 5 different regional databases running on one machine using Informix SE that I now must move to Informix Dynamic Server (IDS). All 5 databases have the same name, but are stored in completely different directories. It is a business requirement that each database is kept separate.
Some database/code information
Code is over 20 years old, but has been continuously updated to meet changing business needs.
There are over 200 tables.
Each of the 5 databases maintains over 4 million records with retention being 1 year for most tables.
Running on 1 Solaris server, the performance has been great.
The code base contains 45 4ge modules made up of 286 4gl modules.
There are over 200 reports now converted from ace to 4gl as well.
So, renaming the database would require changing almost 500 4gl modules.
Having maintained the SE databases for 20 years, I am now being required to set up the IDS database with no previous experience in it.
The question: is there straightforward documentation on how to set up multiple instances of IDS to support the 5 separate regional databases?
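For orientation, the usual shape of a multi-instance IDS setup (one instance per regional database, since database names must be unique within a single instance) is one ONCONFIG file and one sqlhosts entry per instance. This is only a sketch; every name and path below is a placeholder:

```
# onconfig.region1 -- one copy per instance, each with unique values
DBSERVERNAME   region1_tcp        # matches INFORMIXSERVER for this instance
SERVERNUM      1                  # must be unique per instance on this machine
ROOTPATH       /ifmxdata/region1/rootdbs

# sqlhosts -- shared file, one line per instance
# servername   nettype    hostname   servicename
region1_tcp    onsoctcp   myhost     region1_svc
```

Each instance is then started with its own ONCONFIG and INFORMIXSERVER environment settings, and each can hold its own copy of the identically named database.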

Related

Is it OK to specify a schema in `table_name_prefix`?

TL;DR: Is it OK to specify a schema in table_name_prefix?
We have a large Rails application that is not quite a traditional multi-tenant app. We have a hundred clients, all supported by one app, and that number will never grow by more than 1-2 per year. Currently, every client has their own PostgreSQL database.
We are addressing some infrastructure concerns of having so many distinct databases...most urgently, a high number of simultaneous database connections when processing many clients' data at the same time.
The app is not visible, even to clients, so a lot of traditional multi-tenant web site philosophies don't apply here neatly.
Each tenant has a distinct Postgres database, managed in database.yml.
Each database has a schema, named for the tenant.
We have a model specific to each tenant with notably different code.
Each model uses establish_connection to select a different database and schema.
Each model uses a distinct table_name_prefix with the client's unique name.
The tables vary extensively for each tenant. There is no hope or desire to normalize the clients together. Clients are not provisioned dynamically -- it is always a new code release with migrations.
We intend to move each of the client schemas into one database, so fewer distinct connection pools are required. The unique names we currently have at the database, schema, and table names mean there is no possibility of name collisions.
We've looked at the Apartment gem, and decided it is not a good fit for what we're doing.
We could add all hundred schemas to schema_search_path, so all clients could share the same connection pool and still find their schema. We believe this would reduce our db connection count one-hundred-fold. But we're a bit uneasy about that. I've found no discussions of how many schemas are too many. Perhaps that would work, and perhaps there would be no performance penalty in finding tables.
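To get a feel for why ordering in schema_search_path matters, here is a toy plain-Ruby model of resolving an unqualified table name (this is not Rails or PostgreSQL code; the real lookup happens inside the database): each schema is probed in list order, so the cost of finding a table grows with its schema's position in the list.

```ruby
# Toy model of search-path resolution (illustrative only).
# Returns the qualified name and how many schemas were probed to find it.
def resolve_table(search_path, schemas, table)
  search_path.each_with_index do |schema, probes|
    # Probe each schema in order; cost grows with position in the list.
    if schemas.fetch(schema, []).include?(table)
      return ["#{schema}.#{table}", probes + 1]
    end
  end
  nil
end

schemas = {
  'tenant_1'  => ['client1_products'],
  'tenant_99' => ['client99_products'],
}

# With 'tenant_99' later in the path, its tables cost an extra probe each.
resolve_table(['tenant_1', 'tenant_99'], schemas, 'client99_products')
# => ["tenant_99.client99_products", 2]
```

This is why the advice to list the most active schemas first pays off as the path grows.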
We've found a very simple solution that seems promising, by adding the schema in the table_name_prefix. We're already setting this like:
def self.table_name_prefix
  'client99_'
end
Through experimentation and reading the Rails 4 (our current version) and Rails 5 source code, this works to specify the schema ('tenant_99') as well as the traditional table prefix ('client99'):
def self.table_name_prefix
  'tenant_99.client99_'
end
Before that change, queries looked like this:
SELECT COUNT(*) FROM "client99_products"
After, they include the schema, as desired:
SELECT COUNT(*) FROM "tenant_99"."client99_products"
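The mechanism can be mimicked in plain Ruby (a sketch, independent of Rails): the prefix is simply concatenated in front of the undecorated table name, so a "schema." embedded in the prefix carries straight through into the SQL identifier.

```ruby
# Sketch of the name assembly: final table name = prefix + base name.
def qualified_table_name(prefix, base)
  "#{prefix}#{base}"
end

qualified_table_name('client99_', 'products')
# => "client99_products"
qualified_table_name('tenant_99.client99_', 'products')
# => "tenant_99.client99_products"
```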
This seems to answer our needs, with no downsides. I've searched the Interwebs for people encouraging or discouraging this practice, and found no mention of it either way.
So through all this, here are the questions I haven't found definitive answers for:
Is there a concern of having too many schemas listed in schema_search_path?
Is putting a schema name in table_name_prefix okay?
To address your concerns in reverse order:
Is putting a schema name in table_name_prefix okay?
There are no problems with this, as long as the names are unique (both internally and externally).
Is there a concern of having too many schemas listed in schema_search_path?
The answer is maybe: any non-fully-qualified request (asking for a table by name only) will have to search each of the schemas in the order listed in schema_search_path. If the catalog is cached in memory there is little penalty; an on-disk search of all schemas will be slow (proportional to the schema's position in the list).
Be sure to list the most active schemas first.
A fully qualified request should take no longer than the separated database solution.
Assuming all of your calls are fully qualified, this technique should provide the full advantages of connection pooling, when possible.
Remember that connection pooling only minimizes the setup and tear-down overhead of connections by taking advantage of "gaps" during communication.
For example:
If you have four clients and three of them are making near-constant requests, you will still have four connections to the server, even with pooling.
The advantage comes when you have four clients each utilizing a quarter of the resources, pooled over a single connection.
The underlying database utilization (excluding connection overhead) will remain the same, whether pooling with a single database or using separate connections to separate databases.
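A minimal pool sketch (illustrative only, not Rails' actual ConnectionPool) makes this concrete: connections are pre-created once and handed out through a queue, so pooling saves repeated setup/tear-down but does not reduce the number of connections that simultaneously busy clients need.

```ruby
# Toy connection pool: "connections" live in a queue and are reused.
class TinyPool
  def initialize(size)
    @queue = Queue.new
    size.times { |i| @queue << "conn-#{i}" }   # pay setup cost only once
  end

  def with_connection
    conn = @queue.pop            # blocks when every connection is checked out
    yield conn
  ensure
    @queue << conn if conn       # return the connection for the next caller
  end
end

pool = TinyPool.new(2)
used = []
4.times { pool.with_connection { |c| used << c } }
used.uniq.sort
# => ["conn-0", "conn-1"]  -- four requests served by the same two connections
```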
The drawback (or advantage) of combining the databases into a single one is that it is no longer possible to move individual databases to another server for load balancing, outside of PostgreSQL's own load-balancing methods.

Umbraco Database size

We are hosting a new Umbraco 7 site with the database on SQL Server 2012 Express. The database seems to be growing rapidly. It's currently about 5 GB, and SQL Server 2012 Express has a maximum database size of 10 GB, so we are starting to get a little concerned. The cmsPropertyData and cmsPreviewXml tables seem to be taking up the most space, at about 2.5 GB each. Is there any general housekeeping that needs to be done to keep these tables under control? We have tried shrinking the database, but there is no unused space.
Any advice?
I don't know for sure whether this is the problem in your case, but Umbraco creates a new version of a content node each time the node is saved. This can cause your database to grow rapidly. There's a cool package called "UnVersion" that automatically removes older versions: https://our.umbraco.org/projects/website-utilities/unversion/

ASP.NET MVC website performance issue on Azure AppServices

We have an ASP.NET MVC5 website hosted on Azure App Services.
We have 2 distinct instances of this site on Azure: 1 for tests and 1 for production.
These 2 instances are in distinct Azure plans, but all the services in each instance are in the same region (Western Europe).
The first one works acceptably, but we are facing performance issues loading some pages on the 2nd one (page load times sometimes from 15 s to more than 30 s).
Each of our application instance is composed of:
ASP.NET MVC 5 (with FormsAuthentication)
N-Tiers Architecture
EntityFramework 6.1.3
ApplicationInsights service
2 SQL Server databases (1 for business data & 1 for security data) located in an Azure SQL service
The Azure plan used is "Basic (Small)" for AppServices, and "S0 Standard (10 DTUs)" for SqlServices.
The App Service plan runs at around 5% CPU and 58% memory; the SQL service at around 3% DTU.
With AppInsights, I've seen that "all is ok in the controller," and the problem might come from lower layers.
I've also noticed that some page loads exhibiting the issue show a failed SQL dependency call (with result code 207).
The SQL request response times are individually OK (under 300 ms).
We have, of course, already read a lot of posts about Azure performance issues but nothing that has helped us.
We would really appreciate some help please.
Many thanks!
Enable the profiler in Application Insights (same thing that used to live under https://azureserviceprofiler.com). It's now under the Performance blade.
Stress test your application for a few hours, enough for a good amount of ETW traces to be collected so the profiler can paint a comprehensive picture of where time is being spent. A tiny "trace" icon will then become available next to your controllers, and you can drill into the collected results from there.

Some features of the application sometimes fail to load on the production server

We have an ASP.NET application with SQL Server 2012 as the back end. There is a tab called 'Screening' that can be used to search data for different categories, let's say 'A' and 'B'. 'A' has 124 records and 'B' has 924 records. The Screening section always works fine for 'A' on the production server.
However, it sometimes takes more than 10 minutes, or even hours, to load for 'B' on the production server. It was loading properly there with the same data until two days ago (within 7 seconds), and it always loads quickly on the test server. If we run a query against a table in a SQL query window with 'begin tran' and do not commit it, then the application features related to that table will not work until we commit. I get the same experience when searching the records of 'B', sometimes even when none of our developers are logged into the production database server at that moment. I am not sure why it is happening. At most about 10 users work on this feature concurrently. The issue occurs on some days and may persist for a few days, then it works properly again on other days.

How Scalable is OpenERP? Are There Any Statistics Available?

The OpenERP user manual says 'Open ERP was developed mainly for small and medium-sized organizations with 5 to 150 users.' I want to implement OpenERP in an educational institution where more than 150 users can be active simultaneously.
Is OpenERP worth the effort for such an implementation?
I know there will be bottlenecks but are there any official or unofficial statistics available?
A similar question was asked at "How scalable is OpenERP?":
How does OpenERP perform with large deployments and what do you need to do to really scale it to enterprise level - 1000+ users.
There is a post floating around about a production instance of OpenERP that has 150,000 users, 100 HTTP requests per second, and over 1,000,000 product codes. I could not verify its origin.
I can tell you that 1,000 users on a non-media-rich implementation would not be hard at all. Run the web app on something like Ubuntu 12.04, and run PostgreSQL on a database server instance separate from the application. If you see repetitive queries slowing down response time, install an instance of memcached on a server for quicker reference times.
You could probably even do 1,000 users on a single stack, depending on how many concurrent users and queries you would see.
