I'm still very new at programming, and our local SSIS genius isn't here today for me to pick his brain.
I am working on an existing SSIS package, making modifications to a specific .dtsx file. The data flow has an OLE DB source, whose SQL query I have successfully changed to fit my project specs. The destination is a Flat File connection, where I have modified the column mappings to fit the new query.
I have a few concerns:
The source connection originally used SQL Server Authentication, and I don't have the user name or password. I can use Windows Authentication to test it locally, but in the end it will be set up by someone else as a scheduled task on a server somewhere. (I realize this is probably a question for people at my work, but I figured I would fill you guys in).
The destination preview doesn't show anything. I can, however, successfully parse and preview the Source query...
I also don't understand what "Error Output" means on the Source Editor.
Is this set up correctly already, or does it mean there will be some errors in the output?
Any explanations or elaborations would be helpful, but my overall question is: "Am I missing something for this .dtsx, or is this project finished and ready to be set up as a scheduled task?"
It will depend on the package configuration. Usually the user name and password are read from a configuration mechanism (a file or a server).
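For illustration, a file-based package configuration (.dtsConfig) that supplies the connection string, including the SQL login, typically looks something like this (the connection name, server, and login below are all placeholders):

    <?xml version="1.0"?>
    <DTSConfiguration>
      <!-- Hypothetical example: overrides the OLE DB connection string
           at package load time, so no credentials live in the .dtsx. -->
      <Configuration ConfiguredType="Property"
                     Path="\Package.Connections[SourceDB].Properties[ConnectionString]"
                     ValueType="String">
        <ConfiguredValue>Data Source=MyServer;Initial Catalog=MyDb;User ID=etl_user;Password=...;Provider=SQLNCLI10;</ConfiguredValue>
      </Configuration>
    </DTSConfiguration>

Whoever schedules the package can then point it at a configuration holding the real credentials.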
Yes, it should be fine
It determines what the component should do when it encounters an error: it can fail the component, ignore the error, or redirect the failing rows to the error output, for example.
I am really sorry to ask a simple question like this, but it is getting frustrating. I installed Neo4j 4.0.4 on my Windows machine, created a new project as shown in the official tutorial video, and set a password for my local graph. Funnily enough, the tutorial video ends right after setting the password and opening the browser, without showing how to perform Cypher queries on this newly created database. In Neo4j Desktop my database is shown correctly and it seems to be up and running.
However, when I try to connect to this database via the browser, I do not see the database at all. It is confusing that connecting to the server requires a username and a password when you only ever set a password for your database. The default neo4j user can see the system and default databases, but not my project database. In addition, I cannot reference files from the project directory in Cypher queries. I tried disabling authentication, but it did not help at all.
When I issue the SHOW DATABASES command, it does not list my database either.
Update / Edit:
It seems I misunderstood the concept of projects. Every database is named neo4j (the default), regardless of the name specified in the project. However, I still cannot access project files. So far I have copied the files manually into the database's "import" directory, but I guess that is not the intended way.
After importing data to this default database, it still shows no data in the project itself.
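For reference, here is how I have been checking which databases exist and switching between them (Neo4j 4.x syntax; the first statement is run against the system database):

    // List all databases known to this DBMS.
    SHOW DATABASES;

    // In Neo4j Browser or cypher-shell, switch the active database:
    :use neo4j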
Data files in the import directory are not automatically imported into the DB. That is because Neo4j has no idea how you want to store that data as nodes and relationships.
So, it is up to you to determine your desired data model, and then write the appropriate code to enforce that data model.
You can take a look at this page to learn about how to import CSV data (probably the most commonly used import data format).
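As a minimal sketch (the file name, headers, and labels here are all hypothetical), a CSV placed in the import directory can be loaded like this:

    // Assumes people.csv sits in the import directory and has a header
    // row with 'name' and 'city' columns (hypothetical example).
    LOAD CSV WITH HEADERS FROM 'file:///people.csv' AS row
    MERGE (p:Person {name: row.name})
    MERGE (c:City {name: row.city})
    MERGE (p)-[:LIVES_IN]->(c);

The MERGE clauses make the load idempotent: rerunning the script will not create duplicate nodes or relationships.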
I have an MVC project which uses EF code first, and I'm trying to publish to Azure from Visual Studio, but I'm receiving the error: "Web deploy task failed: data loss might occur". I did some refactoring, including renaming columns, so I'm aware why the error occurs, but I would like to force the migration because I'm sure that I have handled the data loss :-) Nevertheless, I have no idea how to skip the data-loss check.

I've found that in a SQL project there is an option in the properties where you can uncheck 'block potential data loss', but I cannot find anything like that in my MVC project. I've tried to include my own script for the schema update without the data-loss checks, but EF complains that there are pending migrations, so I tried to copy the missing entries into the _MigrationHistory table from my development DB, but it turned out that it's not that simple ;-)

Because my app is still in the development phase, I have reinitialized the DB, but it would be worth knowing how to handle that kind of situation in a "real" production environment :-)
Edit:
After some testing I've discovered that when publishing to Azure there is now an "Update database" option, which by default generates a DB update script based on a diff of the local and Azure databases. It differs from the old "Run Code First Migrations" option, because the old one changed the DB initializer to MigrateDatabaseToLatestVersion, and on application start the DB was migrated and the Seed method was run whenever there were unapplied migrations. The "Update database" process, on the other hand, is handled only by the generated script; the migration files and the _MigrationHistory table are not used in production, and neither is the Seed method. I was confused at the beginning, but now it seems logical: the update script gives you more control over the database change, since you can always modify the script, and furthermore the publish process only moves the new code to Azure after a successful DB update.
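For reference, the old behaviour amounted to something like this sketch (MyContext and Configuration stand in for your own DbContext and migrations configuration types):

    using System.Data.Entity;

    // The old "Run Code First Migrations" option effectively switched the
    // initializer so pending migrations (and Seed) ran at application start.
    Database.SetInitializer(
        new MigrateDatabaseToLatestVersion<MyContext, Configuration>());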
There is an option called AutomaticMigrationDataLossAllowed; set it to true and run Update-Database -Force. That should do it.
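A minimal sketch of where that flag lives, assuming the migrations Configuration class that Enable-Migrations generated (MyContext is a placeholder for your DbContext):

    using System.Data.Entity.Migrations;

    internal sealed class Configuration : DbMigrationsConfiguration<MyContext>
    {
        public Configuration()
        {
            AutomaticMigrationsEnabled = true;
            // Let migrations proceed even when a change would drop data.
            AutomaticMigrationDataLossAllowed = true;
        }
    }

Then, from the Package Manager Console, Update-Database -Force applies the pending changes without the data-loss check blocking them.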
I am having a very frustrating problem with my current project. It is continually losing the connection string binding for Entity models.
I have multiple models for different databases in separate areas and was having no problems. Suddenly, whenever I try to update from the database, I get the connection string setup prompt. I select the option to add it to the Web.config with the password, but it never picks it up from there again. The connection strings are all still in the web config; it just doesn't see them.
If I remove all the connection strings from the config file, it will write the new one there. Then, when I try to set up a stored procedure/function import, I still get this message in the lower box:
No database connection has been configured for this model.
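For comparison, the kind of entry the designer expects to find in Web.config looks roughly like this (the names, metadata paths, and database below are all placeholders):

    <connectionStrings>
      <!-- Hypothetical EF (EDMX) connection string; note the EntityClient
           provider and the embedded provider connection string. -->
      <add name="MyEntities"
           connectionString="metadata=res://*/Models.MyModel.csdl|res://*/Models.MyModel.ssdl|res://*/Models.MyModel.msl;provider=System.Data.SqlClient;provider connection string=&quot;Data Source=.;Initial Catalog=MyDb;Integrated Security=True&quot;"
           providerName="System.Data.EntityClient" />
    </connectionStrings>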
I have tried rebuilding the project and creating the models again from scratch, and that works for a while. When I try to bring the project in under Perforce source control, it winds up getting re-corrupted and the connection string goes away. It affects all of my models, too.
I am also using the EF 4.x DbContext Generator to create context files. They work fine. I am also able to run the application; it connects to the database just fine and returns data. No issues there. I am just unable to update entity complex types from the DB or import any more stored procedures.
An even weirder occurrence: I opened a broken project from a different directory, then opened an uncorrupted copy, and it instantly became broken as well. Contagious!
Any thoughts on where to look to see why this is happening? Has anyone else had this issue?
I seem to have found where the issue is coming from.
I seem to have a bad value somewhere in my web.config which causes the issue. Here is the workaround I found that did the trick and showed it is the config:
When the binding breaks, I close the project, then swap out the web.config with one from a good build; it is missing a couple of keys but works well enough to bind.
I reopen the solution and the "No database connection..." error goes away.
Then, while the solution is still open, I swap the original web.config back in, and it still works.
This is far from optimal, but I can work until I figure out the bad value by doing some compares.
I will comment on this again once I determine the exact issue.
I have configured TFS 2010, but when I try to load the project dashboard for a team project, it returns a Reporting Services error, so I am not able to see the "Task Burndown (hours)" and "Burn Rate (hours/day)" reports. Other parts of the page are working fine.
The error is:
An error has occurred during report processing. (rsProcessingAborted)
Cannot impersonate user for data source 'TfsReportDS'. (rsErrorImpersonatingUser)
Log on failed. (rsLogonFailed)
For more information about this error navigate to the report server on the local server machine, or enable remote errors.
I finally got a resolution...
Go to Analysis Services.
You will see a database named TFS_Analysis.
Go to the Roles node.
View the properties of "TfsWarehouseDataReader".
Click on Data Sources.
Now you will see Tfs_AnalysisDataSource.
Change access to "Read" and check the "Read Definition" box. Now click OK and you are done.
The main problem was with SCHEMA CONFLICTS.
First identify which fields are causing schema conflicts: invoke GetWarehouseStatus and observe in the XML which fields are in conflict, and in which collection. Once you have found the field names, rename the fields with the help of the links below.
geekswithblogs.net/Natalia/Default.aspx
msdn.microsoft.com/en-us/library/ee921480(v=VS.100).aspx
Then rebuild your warehouse from the TFS Admin Console. Take a backup of the old database in SQL Server and delete it. Wait for some time (it depends on how long the warehouse takes to refresh the cube), or invoke GetWarehouseStatus the next day and check the XML. To check this, use the link below:
http://localhost:8080/tfs/TeamFoundation/Administration/v3.0/WarehouseControlService.asmx?op=GetProcessingStatus
I spent a lot of time resolving this issue, which is why I am posting the solution here; it may help someone. If you have any queries related to schema conflicts or Reporting Services, feel free to ask me... I am not an expert, but I can certainly help you out with these issues.
We use the WCF-SAP binding in a WCF-Custom adapter, with ReceiveIDOCFormat set to 'String'. In the pipeline component we wrap and call a flat file disassembler to disassemble the SAP request to XML and process it later. We also have a logging component which logs the raw SAP message (the string version) to a database before the disassembler runs, in a streaming way using CForwardOnlyStream.
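For context, the idea behind that logging component is a forward-only tee over the message stream. Here is a minimal sketch using a plain Stream wrapper, as a hypothetical stand-in for the actual CForwardOnlyStream-based code:

    using System;
    using System.IO;

    // Copies every byte read from the inner stream into a log sink, so the
    // raw message can be captured without buffering it fully in memory.
    // Hypothetical stand-in for the CForwardOnlyStream-based component.
    public sealed class LoggingReadStream : Stream
    {
        private readonly Stream _inner;
        private readonly Stream _logSink;

        public LoggingReadStream(Stream inner, Stream logSink)
        {
            _inner = inner;
            _logSink = logSink;
        }

        public override int Read(byte[] buffer, int offset, int count)
        {
            int read = _inner.Read(buffer, offset, count);
            if (read > 0)
                _logSink.Write(buffer, offset, read); // tee into the log
            return read;
        }

        public override bool CanRead { get { return true; } }
        public override bool CanSeek { get { return false; } }
        public override bool CanWrite { get { return false; } }
        public override long Length { get { throw new NotSupportedException(); } }
        public override long Position
        {
            get { throw new NotSupportedException(); }
            set { throw new NotSupportedException(); }
        }
        public override void Flush() { _logSink.Flush(); }
        public override long Seek(long offset, SeekOrigin origin) { throw new NotSupportedException(); }
        public override void SetLength(long value) { throw new NotSupportedException(); }
        public override void Write(byte[] buffer, int offset, int count) { throw new NotSupportedException(); }
    }

The downstream disassembler still sees a single forward-only pass over the data, which is why the logging does not interfere with parsing.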
Here is the problem: during UAT testing with SAP, we find that occasionally the flat file disassembler complains 'Unexpected end of stream while looking for:....'. When we inspect the SAP message sent over the wire, we find the SAP request contains only the header (EDI_DC40), with empty content after that. What worries me is that when we go into SAP and resubmit the failed message using transaction WE19, the disassembler has no problem parsing it.
I am totally lost; can someone please suggest how to troubleshoot this?
Thanks a million!!
I think I have probably found the problem now. The SAP guys added a field, and what I used to do (which I thought was right, but it may be the flaw) was not to regenerate the schema; instead, I just manually added the field in Visual Studio and set the field length based on the IDOC description.
I regenerated the IDOC schema using the WCF wizard, and it seems it is not the same as adding the field by hand in Visual Studio. I have just deployed this schema and hope it will address the problem; I'll post my findings later if it works.