Rails with SQL Server 2008/2012 - FILESTREAM - ruby-on-rails

Novice here! I'm currently creating an application using Ruby on Rails.
This particular application stores binary data as content. Apparently SQL Server is the best way to go because of its FILESTREAM feature, which, from what I found in the documentation, stores binary objects (typically those larger than 1 MB) on the file system instead of inside the database.
With that said, I'm preparing to set up the activerecord-sqlserver-adapter, but I need to know: how can I specify that a column should use FILESTREAM when setting up the database with an Active Record migration? Would I just edit the column to accept FILESTREAM in SQL Server Management Studio? (This is obviously after enabling FILESTREAM on the SQL Server instance.)
So the setup I predict is:
1. install SQL Server and all supporting components
2. install the activerecord-sqlserver-adapter gem
3. create a varbinary(max) column (for the binary file) in a migration
4. specify in SQL Server that this column uses FILESTREAM
All in all: how do I specify that a column should use FILESTREAM when creating it in a database using Rails/Ruby?

No, that's not all: every table that has a varbinary(max) column stored as FILESTREAM must also have a uniqueidentifier column marked ROWGUIDCOL.
Here is a sample I've used for attachments:
CREATE TABLE [dbo].[Attachment](
    [Attachment_Id] [uniqueidentifier] ROWGUIDCOL NOT NULL,
    [ContentLength] [int] NULL,
    [ContentType] [nvarchar](100) NULL,
    [Contents] [varbinary](max) FILESTREAM NULL,
    [DateAdded] [datetime] NULL,
    [FileName] [nvarchar](255) NULL,
    [Title] [nvarchar](255) NULL,
    PRIMARY KEY CLUSTERED
    (
        [Attachment_Id] ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY] FILESTREAM_ON [filestream]
) ON [PRIMARY] FILESTREAM_ON [filestream]
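To do this from a Rails migration: the Active Record column DSL has no FILESTREAM option, so the usual workaround is to issue the raw T-SQL yourself with execute. A minimal sketch, assuming FILESTREAM is already enabled on the instance and the database has a FILESTREAM filegroup named [filestream] (the class, file, and filegroup names here are just examples):

# db/migrate/20130101000000_create_attachments.rb (file name is an example)
class CreateAttachments < ActiveRecord::Migration
  def up
    # raw T-SQL, because the migration DSL cannot express FILESTREAM
    execute <<-SQL
      CREATE TABLE [dbo].[Attachment](
        [Attachment_Id] [uniqueidentifier] ROWGUIDCOL NOT NULL,
        [ContentType]   [nvarchar](100) NULL,
        [Contents]      [varbinary](max) FILESTREAM NULL,
        [FileName]      [nvarchar](255) NULL,
        PRIMARY KEY CLUSTERED ([Attachment_Id] ASC)
      ) ON [PRIMARY] FILESTREAM_ON [filestream]
    SQL
  end

  def down
    execute "DROP TABLE [dbo].[Attachment]"
  end
end

The model can then point at this table with self.table_name = 'Attachment'.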

Related

Java web app on WebLogic with an Informix stored procedure returns impossible data

I have a Java web application on WebLogic; the database is Informix.
The process is as follows:
1. The user queries data.
2. A serial (report ID) is created.
3. A stored procedure is called with that serial. The SP content is roughly:
insert into reporttable select data from table1
insert into reporttable select data from table2
if reporttable.count == 0 then insert 'NO DATA' into reporttable
4. reporttable is queried by that serial.
5. The result is shown on the web page.
(A rough sketch of such a procedure follows this list.)
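Here is what that procedure might look like in Informix SPL; the procedure name, the serial parameter, and the column names are assumptions based on the description above:

CREATE PROCEDURE build_report(p_serial INT)
    DEFINE v_count INT;

    -- copy the source rows for this report id
    INSERT INTO reporttable (serial, data)
        SELECT p_serial, data FROM table1;
    INSERT INTO reporttable (serial, data)
        SELECT p_serial, data FROM table2;

    -- if nothing was copied, insert the placeholder row
    SELECT COUNT(*) INTO v_count FROM reporttable WHERE serial = p_serial;
    IF v_count = 0 THEN
        INSERT INTO reporttable (serial, data) VALUES (p_serial, 'NO DATA');
    END IF;
END PROCEDURE;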
The important problem:
table1 contains 10 rows (data1, data2, ..., data10), yet reporttable ends up with only 3 rows (data1, data2, NO DATA), which should be impossible.
Important: the implementation does not handle any exceptions.
Once the problem occurs, every further query shows the same wrong result. But after I restart WebLogic (with the same parameters), the query returns correct data.
I have no idea how to solve this; can you help?
I found the cause of the error.
Test: rename the table. The SP uses table1, table2, and table3. For an unknown reason, perhaps an abnormal connection, I get:
java.sql.SQLSyntaxErrorException: [FMWGEN][Informix JDBC Driver][Informix] The specified table (table1) is not in the database.
The error is only raised on the first execution. Executing the SP again raises no error, but the execution silently skips table1. After restarting WebLogic (which re-establishes the JNDI connection), executing the SP gives normal results.

Perl DBI in PL/Perl in PostgreSQL

Can I use DBI in a PL/Perl function created in PostgreSQL to select from any foreign database?
I'm getting the error: Unable to load DBI.pm into plperl
(I know that there are Oracle foreign data wrappers, but I just need to run a SELECT statement against Oracle, MSSQL or PG and store the result set in Postgres.)
Here is my function (just with the connect string at the moment):
CREATE OR REPLACE FUNCTION sel_ora()
RETURNS VOID AS $$
use DBI;
my $db = DBI->connect( "dbi:Oracle:DBKUNDEN", "stadl", "sysadm" )
|| die( $DBI::errstr . "\n" );
$$ LANGUAGE plperl;
Yes, you can use DBI from within plperl.
Note that, for security reasons, plperl restricts the loading of Perl modules. This is intended for multi-user databases where your Postgres users are not trusted.
The solution in plperl is to add a line such as this to your postgresql.conf file:
plperl.on_init = 'use DBI;'
Then DBI will be available within your plperl functions. See docs: https://www.postgresql.org/docs/9.5/plperl-under-the-hood.html
Alternatively, if this security consideration does not apply in your situation, you can use plperlu (the "u" is for untrusted, i.e. unrestricted) instead of plperl. Then you can use any Perl module directly from your plperlu code.
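For completeness, here is a rough plperlu sketch of the "pull rows from Oracle and store them in Postgres" idea, using the same Oracle connect string as in the question; the local target table oracle_copy and the remote table/columns are made-up names, so adjust them to your schema:

CREATE OR REPLACE FUNCTION copy_from_oracle()
RETURNS void AS $$
    use strict;
    use DBI;

    # connect to the foreign Oracle database (same DSN as in the question)
    my $dbh = DBI->connect('dbi:Oracle:DBKUNDEN', 'stadl', 'sysadm',
                           { RaiseError => 1 });

    # fetch the result set from Oracle (remote table name is an example)
    my $sth = $dbh->prepare('SELECT id, name FROM remote_table');
    $sth->execute();

    # insert each row into a local Postgres table via SPI, using a prepared plan
    my $plan = spi_prepare('INSERT INTO oracle_copy (id, name) VALUES ($1, $2)',
                           'integer', 'text');
    while (my $row = $sth->fetchrow_arrayref()) {
        spi_exec_prepared($plan, $row->[0], $row->[1]);
    }
    spi_freeplan($plan);
    $dbh->disconnect();
$$ LANGUAGE plperlu;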

Datastax opscenter upgrade fails to display stats

I upgraded Opscenter from 5.1.2 to 5.2.0 yesterday, and now none of the graphs on the Dashboard are showing any statistics.
My cluster is datastax enterprise 4.5.1 with the following versions:
cqlsh 4.1.1 | Cassandra 2.0.8.39 | CQL spec 3.1.1 | Thrift protocol 19.39.0
I'm using this cluster for a search workload with Solr. The agent.log is filled with the following:
INFO [qtp1313948736-24] 2015-08-06 12:30:52,211 New JMX connection (127.0.0.1:7199)
INFO [qtp1313948736-24] 2015-08-06 12:30:52,214 New JMX connection (127.0.0.1:7199)
INFO [qtp1313948736-24] 2015-08-06 12:30:52,218 HTTP: :get /cluster/solr-cores {} - 200
INFO [jmx-metrics-1] 2015-08-06 12:30:57,186 New JMX connection (127.0.0.1:7199)
INFO [jmx-metrics-1] 2015-08-06 12:30:57,191 New JMX connection (127.0.0.1:7199)
ERROR [cassandra-processor-4] 2015-08-06 12:31:15,082 Error when proccessing cassandra call: com.datastax.driver.core.exceptions.InvalidQueryException: Unknown identifier timestamp
Any ideas?
The schema migration performed on upgrade failed. You can look through the schema with describe keyspace "OpsCenter" and find the tables that were not upgraded to 5.2.0 (look at the table comment). The changes the migration should have made are here:
https://gist.github.com/philip-doctor/2b7c87f551a35a5c7c79
-- depending on how far through the migration you progressed, parts of this may fail
-- this assumes you're using the default name of "OpsCenter" for the opscenter keyspace, otherwise
-- you'll have to rename the "OpsCenter" part.
ALTER TABLE "OpsCenter"."events" ADD message text;
ALTER TABLE "OpsCenter"."events" ADD column_family text;
ALTER TABLE "OpsCenter"."events" ADD target_node text;
ALTER TABLE "OpsCenter"."events" ADD event_source text;
ALTER TABLE "OpsCenter"."events" ADD "keyspace" text;
ALTER TABLE "OpsCenter"."events" ADD api_source_ip text;
ALTER TABLE "OpsCenter"."events" ADD user text;
ALTER TABLE "OpsCenter"."events" ADD source_node text;
ALTER TABLE "OpsCenter"."events" with comment = '{"info": "OpsCenter management data.", "version": [5, 2, 0]}';
ALTER TABLE "OpsCenter"."rollups60" RENAME column1 to timestamp;
ALTER TABLE "OpsCenter"."rollups60" with comment = '{"info": "OpsCenter management data.", "version": [5, 2, 0]}';
ALTER TABLE "OpsCenter"."rollups300" RENAME column1 to timestamp;
ALTER TABLE "OpsCenter"."rollups300" with comment = '{"info": "OpsCenter management data.", "version": [5, 2, 0]}';
ALTER TABLE "OpsCenter"."rollups7200" RENAME column1 to timestamp;
ALTER TABLE "OpsCenter"."rollups7200" with comment = '{"info": "OpsCenter management data.", "version": [5, 2, 0]}';
ALTER TABLE "OpsCenter"."rollups86400" RENAME column1 to timestamp;
ALTER TABLE "OpsCenter"."rollups86400" with comment = '{"info": "OpsCenter management data.", "version": [5, 2, 0]}';

Informix select first 250000 then the last 250000 records in a table

I'm working on a CFML script to back up some data from an Informix database into a CSV file. The problem is that the table has many records (286,906) and my script times out (even though I set it not to). The best I could successfully fetch was 260,000 rows with:
SELECT FIRST 260000
APE1, APE2, CALLE, CODPOSTAL, DNI, FCADU, FENACI, LOCALIDAD, NOMBRE, NSS, PROV, TELEFONO
FROM
mytable WHERE FCADU IS NOT NULL AND FENACI IS NOT NULL
Is there any way to select the remaining rows after the first 260,000?
I tried with:
SELECT SKIP 260000 FIRST 520000
APE1, APE2, CALLE, CODPOSTAL, DNI, FCADU, FENACI, LOCALIDAD, NOMBRE, NSS, PROV, TELEFONO
FROM
mytable WHERE FCADU IS NOT NULL AND FENACI IS NOT NULL
but I get Error Executing Database Query. A syntax error has occurred.
You can use the UNLOAD statement to create a file from the database:
UNLOAD TO 'mytable.txt' SELECT * FROM mytable;
That may not work from a CFML environment, though. In that case you can create a stored procedure that unloads your data.
See: unload statement in stored procedure
Is it your script timing out or your database connection? From your question it sounds to me like it's not the ColdFusion template that is timing out but the cfquery connection to the database. There is a timeout attribute for the cfquery tag; however, it is apparently not reliable. A better option is to configure the timeout in the advanced section of the datasource within the ColdFusion administrator.
Charlie Arehart blogged about this feature here:
http://www.carehart.org/blog/client/index.cfm/2010/7/14/hidden_gem_in_cf9_admin_querytimeout
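As for the SKIP/FIRST attempt in the question: FIRST takes the number of rows to return (the page size), not an end position, and older Informix engines do not understand SKIP at all and reject it with exactly that kind of syntax error. Assuming your version supports it, paging through the table looks roughly like this (the ORDER BY column is an assumption, but you need a deterministic order for the pages to be stable):

-- first page: rows 1-260000
SELECT FIRST 260000
    APE1, APE2, CALLE, CODPOSTAL, DNI, FCADU, FENACI, LOCALIDAD, NOMBRE, NSS, PROV, TELEFONO
FROM mytable
WHERE FCADU IS NOT NULL AND FENACI IS NOT NULL
ORDER BY DNI;

-- second page: skip the first 260000 rows, take the next 260000
SELECT SKIP 260000 FIRST 260000
    APE1, APE2, CALLE, CODPOSTAL, DNI, FCADU, FENACI, LOCALIDAD, NOMBRE, NSS, PROV, TELEFONO
FROM mytable
WHERE FCADU IS NOT NULL AND FENACI IS NOT NULL
ORDER BY DNI;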

Orchard CMS installation error?

I tried to install Orchard CMS from source code. I opened it in VS 2012, and I'm using SQL Server 2012.
I am getting the following error.
error text:
Setup failed: could not execute query
[ select rolerecord0_.Id as Id13_, rolerecord0_.Name as Name13_ from Test_Orchard_Roles_RoleRecord rolerecord0_ where rolerecord0_.Name=#p0 ]
Name:p1 - Value:Anonymous
[SQL: select rolerecord0_.Id as Id13_, rolerecord0_.Name as Name13_ from Test_Orchard_Roles_RoleRecord rolerecord0_ where rolerecord0_.Name=#p0]
I can't find any solution for this error. Where did I go wrong, and how can I fix it? When I choose the built-in storage, setup runs fine. Also, are there any disadvantages to using the built-in storage?
UPDATE (new error message)
Setup failed: could not execute query [ SELECT TOP (#p0) this_.Id as
Id17_2_, this_.Number as Number17_2_, this_.Published as
Published17_2_, this_.Latest as Latest17_2_, this_.Data as Data17_2_,
this_.ContentItemRecord_id as ContentI6_17_2_, contentite1_.Id as
Id16_0_, contentite1_.Data as Data16_0_, contentite1_.ContentType_id
as ContentT3_16_0_, contenttyp4_.Id as Id18_1_, contenttyp4_.Name as
Name18_1_ FROM Orchard_Framework_ContentItemVersionRecord this_ inner
join Orchard_Framework_ContentItemRecord contentite1_ on
this_.ContentItemRecord_id=contentite1_.Id left outer join
Orchard_Framework_ContentTypeRecord contenttyp4_ on
contentite1_.ContentType_id=contenttyp4_.Id WHERE contentite1_.Id =
#p1 and this_.Published = #p2 ] Name:cp0 - Value:2 Name:cp1 -
Value:True [SQL: SELECT TOP (#p0) this_.Id as Id17_2_, this_.Number
as Number17_2_, this_.Published as Published17_2_, this_.Latest as
Latest17_2_, this_.Data as Data17_2_, this_.ContentItemRecord_id as
ContentI6_17_2_, contentite1_.Id as Id16_0_, contentite1_.Data as
Data16_0_, contentite1_.ContentType_id as ContentT3_16_0_,
contenttyp4_.Id as Id18_1_, contenttyp4_.Name as Name18_1_ FROM
Orchard_Framework_ContentItemVersionRecord this_ inner join
Orchard_Framework_ContentItemRecord contentite1_ on
this_.ContentItemRecord_id=contentite1_.Id left outer join
Orchard_Framework_ContentTypeRecord contenttyp4_ on
contentite1_.ContentType_id=contenttyp4_.Id WHERE contentite1_.Id =
#p1 and this_.Published = #p2]
Before creating the SQL Server database, set the collation to Latin1_General_100_CI_AS.
To do this, right-click the Databases node in SQL Server Management Studio and click New Database. In the New Database window, type the database name and click the Options page on the left-hand side.
You will see the Collation combo box at the top of the window. Change the default to Latin1_General_100_CI_AS, and then run the Orchard setup again.
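Equivalently, you can create the database from a query window with the collation set up front (the database name here is just an example):

-- create the database with the Latin1_General_100_CI_AS collation
CREATE DATABASE OrchardDb
COLLATE Latin1_General_100_CI_AS;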
Set up a new application pool in IIS Manager and have it run as an account with permission to query your SQL Server, then assign your Orchard website to that application pool. Open the application pool's "Advanced Settings" to set its identity:
In the "Process Model" section, set the Identity to an account with SQL Server permissions and set "Load User Profile" to false. This prevents the pool from trying to retrieve the user profile when running the Orchard website.
I find that usually when I get that error message, it's because the query being executed has a syntax error, or table/column names don't match. If you can halt execution where the exception is thrown, you can check the InnerException to see if that provides more info. If not, just copy the SQL from the message, fill in the parameters #p0, #p1, #p2 with values, and try to run it in whatever tool you use to manually query your database. It will often give you a more helpful error message.
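For example, the first failing query from the setup error, with the parameter value filled in, would be something like:

-- #p0 replaced with the value reported in the error (Anonymous)
select rolerecord0_.Id as Id13_, rolerecord0_.Name as Name13_
from Test_Orchard_Roles_RoleRecord rolerecord0_
where rolerecord0_.Name = 'Anonymous';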
