icCube - Monitoring: incremental load - set up - monitoring

I have some trouble with the incremental load: it is not showing any changes to the data.
Here is what I am doing:
the schema is loaded
data is changed inside the origin data table
the incremental load is performed with the help of the scheduler
the "refresh time start" and "refresh time end" of the schema are viewable
no error occurred
I cannot see any changes in the reports after an incremental load was performed. After a full load the changes are visible.
My setup:
builder schema: incremental load is activated
builder schema data source:
incr. load strategy: incremental load
incr. load column: incremental_id (datatype is long)
monitoring schema's scheduler:
Any ideas what I am doing wrong or what I can try?
Cheers, Joe
log:

The incremental load works if I add an Incr. Load SQL statement within the schema:
Schema -> Data Integration -> Data Source -> Your Source -> Your Table -> field "Incr. Load SQL"
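For illustration only, such a statement typically selects just the rows added since the previous load. A rough sketch, assuming a hypothetical fact table sales_facts and a hard-coded stand-in value (the real mechanism for referencing the last loaded incremental_id depends on your icCube version):

-- hypothetical sketch: fetch only rows whose incremental_id is
-- greater than the highest value the previous load has seen
SELECT * FROM sales_facts WHERE incremental_id > 42000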

I cannot see any changes in the reports after an incremental load was performed. After a full load the changes are visible.
Assuming you have new data in the schema, you might have forgotten to set up the "Refresh Period" of the report.
Otherwise, check the log file, which will mention that new data has been loaded for the given schema during the incremental load. You can then check within the MDX IDE that the new data is available.
Hope that helps.
[edit: the error was due to a badly generated SQL query (www)]

Related

Creating a temporary table in Informix 4GL using a prepare statement

I've been trying to create a function to load some files, insert them into a temp table based on an existing table, verify that there are no duplicated rows in the loaded files, and then insert them into the proper tables in the DB. I tried using something like this:
let statement = " select * from ", vtable clipped, " where 1=0 into temp t_", vtable clipped
prepare pstatement from statement
execute pstatement
to no avail, because the temp table seems to be created in a different session than the one I'm working in.
Any suggestions?
Thank you all in advance
If you prepare and execute a statement as shown, it is created on the connection you're using at the time. If you don't mess with the connections (CONNECT, DISCONNECT, SET CONNECTION), then it should all be clean, assuming the statement worked at all. Are you checking errors (WHENEVER ERROR STOP, perhaps)? Have you displayed the contents of statement to ensure the SQL is as expected (no untoward chopping of the string, for example, which could account for why the table appears to be missing)?
Remember that a temporary table is private to the session. If you run a LOAD statement in the I4GL program, there should be no problem, but you can't use a separate loader program with a temporary table. In terms of the database, it would have to be a 'permanent' or 'regular' table, even if you remove it soon after creating it.
You could also prepare an explicit CREATE TEMP TABLE statement to create a table.
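A minimal sketch of that approach, with a purely illustrative column list (match it to your real table):

-- explicit temp-table definition; column names/types are hypothetical
CREATE TEMP TABLE t_orders (
    order_id   INTEGER,
    order_date DATE,
    amount     DECIMAL(10,2)
) WITH NO LOG;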
Also consider whether using an external table would help you with the loading. There are also violations tables that could be used to trap problematic rows while loading directly into the main table.
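For the loading side, an external table sketch might look like this (SAMEAS and the DATAFILES path are illustrative; check the exact syntax against your Informix version):

-- hypothetical external table that maps a flat file onto the
-- structure of an existing table so it can be queried directly
CREATE EXTERNAL TABLE ext_orders SAMEAS orders
USING (
    DATAFILES ('DISK:/tmp/orders.unl'),
    FORMAT 'DELIMITED'
);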

Elasticsearch can't sync 2GB database from Neo4j with GraphAware

The neo4j2elasticsearch module works on my machine when the database is only 250KB, but when the database is around 2GB it won't sync anymore. I'm wondering if it is because of these parameters in the config file:
#optional, size of the in-memory queue that queues up operations to be synchronised to Elasticsearch, defaults to 10000
com.graphaware.module.ES.queueSize=10000
#optional, size of the batch size to use during re-initialization, defaults to 1000
com.graphaware.module.ES.reindexBatchSize=2000
I'm wondering what the unit of the in-memory queue size of 10000 is, and how to estimate what value to set based on my database size.
Here is the debug file :
neo4j debug.log failure loading
The database is re-initialized, but there are only empty neo4j-index-relationship/neo4j-index-node indices in the Elasticsearch database.
Just for information, here is the debug file for the successful 250KB database loading:
neo4j debug.log successful loading
It seems like the "Re-indexing nodes..." step is missing from the 2GB database loading procedure.
The log doesn't have a line saying it will re-index. Did you configure:
com.graphaware.module.ES.initializeUntil=
to a timestamp that warrants re-indexing on startup? Otherwise, it will only index new data. It is explained at the bottom of https://github.com/graphaware/neo4j-to-elasticsearch :
...in order to trigger (re-)indexing, i.e. sending every node that
should be indexed to Elasticsearch upon Neo4j restart, you have to
manually intervene...
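For example, a sketch of what that setting could look like (the timestamp value below is made up; it only needs to be later than the restart at which you want re-indexing to happen):

#hypothetical: an epoch-millisecond timestamp in the future, so the
#module re-indexes existing data on the next restart
com.graphaware.module.ES.initializeUntil=2000000000000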
So try creating a new node and see whether the synchronization works for new data, to rule out this situation (the most common one).

How to use FileTable in EF Code First

I'm using FileTable in SQL Server 2014 and EF code first in my project.
When I use this command
USE [master]
GO
ALTER DATABASE [OnlineStore]
SET FILESTREAM( DIRECTORY_NAME = N'OnlineStore',
NON_TRANSACTED_ACCESS = FULL) WITH NO_WAIT
GO
it shows this warning in SQL Server:
When the FILESTREAM database option NON_TRANSACTED_ACCESS is set to FULL and the READ_COMMITTED_SNAPSHOT or the ALLOW_SNAPSHOT_ISOLATION options are on, T-SQL and transactional read access to FILESTREAM data in the context of a FILETABLE is blocked.
I continued anyway and created the table; inserting a folder and a file was no problem.
My problem is reading the data. When I read it, this error shows up:
Msg 33447, Level 16, State 1, Line 2
Cannot access file_stream column in FileTable 'File', because FileTable doesn't support row versioning. Either set transaction level to something other than READ COMMITTED SNAPSHOT or SNAPSHOT, or use READCOMMITTEDLOCK table hint.
I'm using EF Code First - how can I resolve this problem?
You must run this command to be able to SELECT from the table:
USE [master]
GO
ALTER DATABASE [dbname] SET READ_COMMITTED_SNAPSHOT OFF WITH NO_WAIT
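Alternatively, the error message itself points at a per-query workaround: keep snapshot isolation enabled and read the FileTable with the READCOMMITTEDLOCK table hint. A minimal sketch (the table name File comes from the error message; stream_id, name and file_stream are standard FileTable columns):

-- read the FILESTREAM data under READ COMMITTED locking semantics
-- for this query only, leaving the database options unchanged
SELECT stream_id, name, file_stream
FROM dbo.[File] WITH (READCOMMITTEDLOCK);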

Creating a DTS package that uses a stored procedure

We're trying to make a DTS package that will launch a stored procedure and capture its output in a flat file. This will have to run every night, and the new file should overwrite the existing file.
This wouldn't normally be a problem, as we'd just plug in the query and run it, but this time everything was complicated enough that we chose to approach it with a stored procedure employing temporary tables. How can I go about using this in a DTS package? I tried going the normal route with the wizard and then plugging in EXEC BlahBlah.dbo... It did not care for that:
The Statement could not be parsed. Additional information: Invalid object name '#DestinyDistHS'. (Microsoft SQL Server Native Client 10.0)
Can anyone guide me in the right direction here?
Thanks.
Is it an option to simply populate a non-temp table in your SP, call it, and select from the non-temp table when exporting?
This is only an issue if you have multiple simultaneous calls to the stored procedure; in that case you can't save to a single table.
If you do have multiple simultaneous calls, then you might be able to (see the sketch below):
Create a temp table to hold the results
Use INSERT INTO #TempTable EXEC YourProc
SELECT * FROM #TempTable
You might need to do this in a more forgiving command line tool (like SQLCMD). It's not as fussy about metadata.
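A minimal T-SQL sketch of that pattern (the procedure name and columns are hypothetical; the temp table must match the procedure's result set exactly):

-- temp table shaped like the stored procedure's result set
CREATE TABLE #TempTable (
    Id   INT,
    Name VARCHAR(100)
);

-- capture the procedure's output, then read it back for the export
INSERT INTO #TempTable
EXEC dbo.YourProc;

SELECT * FROM #TempTable;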

Unable to Generate Script for 3 Views in SQL Server Management Studio 2008

I have a strange problem
When I create an object script (a script to drop and create stored procedures, views, and functions) from SQL Server 2008, it misses 3 views and I don't know why.
I am performing the following steps to create the object script:
1) Open SQL Server 2008 Management Studio
2) Connect to the server
3) Right-click the selected database, then click Tasks -> Generate Script, select the database from the list, and click Next
4) It gives options; I am changing three of them, i.e. Include If Not Exists = true, Script Drop = true, Script Use Database = false, and clicking the Next button
5) Now selecting SPs, Views and Functions and clicking Next
6) Clicking Select All on all the following screens
7) Finally clicking the Finish button
Is there any limitation, special condition, or convention that I am not following that causes the views not to be included in the generated script?
Please let me know if I am missing something; I have tried many ways.
I also found that this problem not only exists with Views but also with Functions and Stored Procedures.
If we rename them it works fine; for example, a function earlier named dbo.SeperateElementsInt was working fine, but strangely, Generate Script ignored this function; later we renamed it to dbo.SeperateElementsInteger and it started generating script.
We cannot change the View names as they are used in many places.
The views which are giving problems are dbo.DivisionInfo and dbo.CustomerDivisonOfficeInfo.
The stored procedure which is giving problems is dbo.procsync_get_zVariable.
The problem exists with SSMS 2005 too.
Thanks
We didn't understand each other on the INFORMATION_SCHEMA/Profiler issue. I was suggesting turning Profiler on because SSMS does a SELECT on INFORMATION_SCHEMA with some WHERE clauses. I suspect that the query itself cuts off your views. Once you have the query that SSMS executes to get the list of objects, you should be able to find why it doesn't see some views.
Here are the scripts that SSMS executes when you select all views and start scripting. Check whether any of them fails to return the DivisionInfo view (I've created a DivisionInfo view in my database to reproduce your case). For a quick check, execute them one by one and read my comments after each query. Please note that you should actually catch the queries in your environment with Profiler, because they may differ from mine.
Before showing the screen to select views, procedures, etc., SSMS executes the following script to get the list of views:
exec sp_executesql N'SELECT
''Server[@Name='' + quotename(CAST(
serverproperty(N''Servername'')
AS sysname),'''''''') + '']'' + ''/Database[@Name='' + quotename(db_name(),'''''''') + '']'' + ''/View[@Name='' + quotename(v.name,'''''''') + '' and @Schema='' + quotename(SCHEMA_NAME(v.schema_id),'''''''') + '']'' AS [Urn],
v.name AS [Name],
SCHEMA_NAME(v.schema_id) AS [Schema]
FROM
sys.all_views AS v
WHERE
(v.type = @_msparam_0)and(CAST(
case
when v.is_ms_shipped = 1 then 1
when (
select
major_id
from
sys.extended_properties
where
major_id = v.object_id and
minor_id = 0 and
class = 1 and
name = N''microsoft_database_tools_support'')
is not null then 1
else 0
end
AS bit)=0)
ORDER BY
[Schema] ASC,[Name] ASC',N'@_msparam_0 nvarchar(4000)',@_msparam_0=N'V'
Is your view listed? You can add the condition WHERE v.name = 'DivisionInfo' to filter for it. If DivisionInfo is not listed, check which part of this query eliminates it from the result set.
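For a quick standalone check, the same conditions can be pulled out into a query you can run directly (derived from the script above, with only the view name hard-coded):

-- does DivisionInfo survive the same filters SSMS applies?
select v.name
from sys.all_views as v
where v.name = 'DivisionInfo'
  and v.type = 'V'
  and v.is_ms_shipped = 0
  and not exists (select 1
                  from sys.extended_properties
                  where major_id = v.object_id
                    and minor_id = 0
                    and class = 1
                    and name = N'microsoft_database_tools_support')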
Once you select the objects to script and start scripting, SSMS creates a temp table, stores the objects in it, and executes scripts to find related objects.
It creates the temp table and inserts the DivisionInfo view into it:
CREATE TABLE #tempdep (objid int NOT NULL, objname sysname NOT NULL, objschema sysname NULL, objdb sysname NOT NULL, objtype smallint NOT NULL)
exec sp_executesql N'INSERT INTO #tempdep
SELECT
v.object_id AS [ID],
v.name AS [Name],
SCHEMA_NAME(v.schema_id) AS [Schema],
db_name(),
2
FROM
sys.all_views AS v
WHERE
(v.type = @_msparam_0)and(v.name=@_msparam_1 and SCHEMA_NAME(v.schema_id)=@_msparam_2)',N'@_msparam_0 nvarchar(4000),@_msparam_1 nvarchar(4000),@_msparam_2 nvarchar(4000)',@_msparam_0=N'V',@_msparam_1=N'DivisionInfo',@_msparam_2=N'dbo'
Did this query insert anything into #tempdep? If not, check why. Once again, you have to use Profiler to get the queries from your environment instead of using the queries I put here, because they come from my environment.
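For instance, right after running the insert you can check what landed in the temp table:

-- quick check: was DivisionInfo registered for scripting?
SELECT * FROM #tempdep;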
When you start profiling, there should be many inserts like the one above; you need to find the one that relates to DivisionInfo. You can use the Find option to locate it, because you will see many queries in Profiler given all your other views. To make the Profiler log smaller, script only the views.
As you can see, the idea is to start profiling and then start scripting. Once scripting is finished, stop Profiler and check the scripts executed by SSMS; you should find why it doesn't see DivisionInfo. If there is no DivisionInfo in the Profiler log but you can still select it for scripting in the wizard, then take the scripts for DivisionInfo and for one view that scripting does work for, and compare them, looking closely at the differences in the scripts SSMS uses to retrieve them.
For some reason SSMS discards this view according to the data it extracted with these queries (caught from Profiler).
I just ran into the exact same issue. We were trying to script out the schema of one database (call it Database_A) and many views wouldn't script out.
We'd decommissioned another database (call it Database_B), and all the views that wouldn't script (in Database_A) pointed to that database (Database_B), which was accessed through a linked server and was offline. Since all the connection strings were now pointing to the new server that Database_A was on, I brought the database on the old server online in read_only just long enough to script out the views, and it worked. Then I took the database offline again, and we had what we needed.
The script I threw together to find the linked server reference in the views was this:
use Database_A
go
-- list objects whose source text references the decommissioned database
select so.name, sc.text
from sysobjects so, syscomments sc
where so.id = sc.id
and sc.text like '%Database_B%'
That's what worked for me, I hope it works for you as well.
Take care,
Tom
