Could not run web test on agent TFS 2010

I'm running into an issue with data binding Web Performance tests. I have two web tests, each data-bound to a different CSV file. The first one, created initially, runs perfectly, but the second web test is throwing the following error:
Could not run Web test 'WebTest1' on agent {{SERVER}}: Could not access table 'Data#csv' in data source '{{Datasource}}' of test '6181b289-71fa-478f-8341-eba270b46c2a': No value given for one or more required parameters.
I'm running them locally. No controllers or agents are set up, and I'm using VS 2010 SP1.

Open the CSV file that is loaded in the web test in Visual Studio and change the separator from ; to , for both the header and the values. For example:
Before:
username;password;shopID;periodID
user1;password11;1;10
user2;password22;2;10
After:
username,password,shopID,periodID
user1,password11,1,10
user2,password22,2,10
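If the file has many rows, a one-line PowerShell sketch can make the same change (the file name Data.csv is just an example, and this assumes no field values themselves contain semicolons):
# Read all lines, swap the separator, and write the file back
(Get-Content .\Data.csv) -replace ';', ',' | Set-Content .\Data.csv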

I received the above error after creating my .CSV file from the results of a SQL Server query. After inspecting the .CSV with Notepad I found it was saved as UTF-8. In Notepad, simply set the encoding to ANSI instead of UTF-8 or Unicode when saving, and this will resolve the problem.
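If you would rather script the re-save, a minimal PowerShell sketch (the file name is again an example; -Encoding Default writes the system ANSI code page in Windows PowerShell):
# Rewrite the data file using the ANSI code page instead of UTF-8
(Get-Content .\Data.csv -Encoding UTF8) | Set-Content .\Data.csv -Encoding Default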

Related

Passing multiple parameters to SSRS with URL

I'm using Visual Studio 16.4.5 with SSRS 15.0.1659.0
Currently, I'm trying to pass a URL in native mode with three parameters:
="https://reporting/ReportServer?/Production/Equipment/Prime%20On%20Rent%20No%20Shipper&AttachmentOption=N&Store=" & Fields!StoreNumber.Value & "&Division=" & Parameters!Division.Value
Every time I click the resulting box on the server side, I get:
An error has occurred during report processing. (rsProcessingAborted)
Query execution failed for dataset 'DataSet1'. (rsErrorExecutingCommand)
For more information about this error navigate to the report server on the local server machine, or enable remote errors
I've tried wrapping it in JavaScript, even though I don't want to use JavaScript for this. I just want to be able to open the target report with its parameters filled in. What am I doing wrong?
So, a couple of things:
I had the wrong alias for one of the tables in my query, so it couldn't be found.
I had to make the two initial variables accept multiple values so that I could join them in the final URL (a sketch of the join is below).
Once I fixed those issues, it worked!
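For the multi-value piece, here is a hedged sketch of what the final expression can look like. The field and parameter names are taken from the question above; the Join pattern, which repeats the query-string key for every selected value, is an assumption about how the target report expects its parameters:
="https://reporting/ReportServer?/Production/Equipment/Prime%20On%20Rent%20No%20Shipper&AttachmentOption=N&Store=" & Fields!StoreNumber.Value & "&Division=" & Join(Parameters!Division.Value, "&Division=")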

Calling Excel from a Child Powershell script...losing context/session

OK, due to requirements I have a main PowerShell script that calls child PowerShell scripts using the & operator. In two of my child scripts I use the Excel COM object to either read an Excel file and/or create one. If I run these scripts locally they run great with no problems, but if I run them through a scheduler (in this case the Tidal scheduling tool) I have issues.
Issue 1:
The first child script reads an Excel file to get the names of the worksheets, then uses a worksheet name to query the Excel file using OleDB. The query function is in a utilities module and gives an error that it cannot find the file or that it is locked by another process. I killed the Excel process and it still wouldn't let me query the file. As a test I commented out the portion of the script that reads the file and hard-coded the worksheet name, and it works fine, so somehow the child script is not able to release the handle on the COM object/file.
Issue 2:
From a second child script I create an Excel spreadsheet: I build a CSV file which I then save as an XLS file. Again, this works fine when running locally, but when I run it through the scheduler I get an error when attempting to run the following line:
[void]$worksheet.QueryTables.item($connector.name).Refresh()
The error that I get is:
Exception calling "Refresh" with "0" argument(s): "Excel cannot find the text file to refresh this external data range. Check to make sure the text file has not been moved or renamed, then try the refresh again."
Again, I'm calling these child scripts using the & operator (i.e. & \scriptpath\script.ps1).
Anyone seen this before and know how I can make this work?
Thanks!
I have resolved this issue. It looks like the Tidal scheduler has agents, and some of the agents set up at my client will run my scripts with no problems, while others will not create files, or will lock files, with no real errors given. Anyway, sorry I don't have more than that, but the PowerShell is working fine. :)
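For anyone who hits the file-lock symptom from Issue 1 and does need to release the handle explicitly, here is a minimal PowerShell sketch of the usual COM cleanup. The file path and variable names are illustrative, not taken from the scripts above:
# Open the workbook only to read the worksheet names, then tear everything down explicitly
$excel = New-Object -ComObject Excel.Application
$excel.Visible = $false
$workbook = $excel.Workbooks.Open('C:\data\input.xlsx')
$sheetNames = @($workbook.Worksheets | ForEach-Object { $_.Name })
$workbook.Close($false)
$excel.Quit()
# Release the COM references so the child script is not left holding the file
[void][System.Runtime.InteropServices.Marshal]::ReleaseComObject($workbook)
[void][System.Runtime.InteropServices.Marshal]::ReleaseComObject($excel)
[GC]::Collect()
[GC]::WaitForPendingFinalizers()
After this cleanup, an OleDB query against the same file should no longer see it as locked by the script.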

Visual Studio 2012 MVC Local DB Add Table menu option missing

I usually connect to an external database, but I wanted to fiddle with creating a project with a local database. I am using SQL Server Express LocalDB, not CE.
I can add tables with EF but if I right click on Tables in Server Explorer the only options I have are Refresh and Properties, i.e. no Add Table menu option. Similarly, if I right click on one of the tables EF has created I also get only the Refresh and Properties menu options so I can't add rows, etc. Same problem with stored procedures; I can't add any (should I want to do some testing with EF and stored procedures) because I only have the Refresh and Properties menu options.
FOLLOW UP:
I have isolated this problem to VS2012 Professional. When I use VS2010 it works exactly as expected: I can add tables and stored procedures. Following the same steps with VS2012 results in the situation where there is no menu option to add either tables or stored procedures. I produced these test results with SQL Server 2012 LocalDB.
If you only see Refresh and Properties when you right click on Tables in Server Explorer, you probably need to install SQL Server Data Tools from http://msdn.microsoft.com/en-us/data/hh297027 .
I had a similar problem, only with Visual Studio 2012 Express for Web, following along with the "Getting started with ASP.NET MVC 3" tutorial at www.asp.net. The tutorial used a SQL Server Compact Edition database, but I have SQL Express 2012, so I changed the connection string to have the data source point at my local instance, .\SQLExpress2012. The app worked fine and was able to read and write to the database. However, I could not edit the database in the Visual Studio Express Database Explorer (all the context menu options were missing except "Refresh" and "Properties").
I found that changing the data source in the connection string to "Data Source=(LocalDB)\v11.0;" fixed the problem. I read somewhere that if you use LocalDb instead of the SQL Server instance name then it will be accessed under your user account - must have been some kind of permissions issue.
The whole (working) connection string is:
"Data Source=(LocalDB)\v11.0;AttachDBFilename=|DataDirectory|Movies.mdf;Integrated Security=True;"
Hope this saves someone a bunch of hours pulling their hair out, I'm almost bald now :)
I found that when installing Visual Studio 2013, I had not requested the 'SQL Server Data Tools' feature.
By re-running the installer from 'Control Panel' > 'Programs and Features', using the 'Modify' option, I could add the missing feature. On completion, the 'Add New Table' menu item was available.

Problems with Characterset when using DB2 SYSPROC.ADMIN_CMD for database import

I am running a Java Application that transfers the files I need to import to the server the DB2 is on. Then the Java Application creates a JDBC Connection to the database and runs:
CALL SYSPROC.ADMIN_CMD('import from <filename> of del modified by decpt, coldel; messages on server insert into <view>')
The problem I have seems somehow connected to the character set of either the database or of the user the database uses to import the files (using the ADMIN_CMD stored procedure). The problem is:
Umlauts like ä, ö, ü get messed up by this import. I had this sort of problem in the past, and the solution was always to set the LC_CTYPE of the user importing the data to de_DE.iso88591.
What I already ruled out as the source of the problem:
- The file transfer to the database server (umlauts are still OK after that)
- The JDBC connection (I simply inserted a line through an SQL command instead of reading from a file)
The thing is, I don't know what user DB2 uses to import files through ADMIN_CMD. And I also don't believe it could somehow be connected to the DB2 settings, since with every other way of inserting, loading ... data into it, everything works fine.
And yes, I need to use ADMIN_CMD. The DB2 command line tool is a performance nightmare.
The best approach (for sanity):
Create all databases as UTF-8
Make sure all operating system locales are UTF-8
Get rid of all applications that don't handle their data as UTF-8
Slaughter every developer and vendor not adhering to UTF-8. Repeat and rinse until 100% completed.
DB2 indeed attempts to be smart and convert your input data for you (the import command basically pipes your data into insert clauses, which always get handled like that). The link I gave outlines the basic principle and gives you a few commands to try out. There is also an official explanation of a similar issue; according to it, you could try setting the environment variable db2codepage to correspond with your delimited data files, and that should help. Also, IXF format exports might work better since they have encoding-related information attached in every file.
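As a hedged sketch of that suggestion (the value 1252 is only an example, matching the Windows-1252 data the asker ended up using), the variable, usually written DB2CODEPAGE, can be set with db2set on the client that submits the import:
db2set DB2CODEPAGE=1252
New sessions pick the value up, so the application or CLP session has to be restarted afterwards.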
Thanks for your response.
I finally fixed the issue by adding a
MODIFIED BY CODEPAGE=1252
to my JDBC ADMIN_CMD import command. This seems to override whatever code page settings the import was using before. It also appears the default code page of the database didn't matter, since it is set to 1252. The only reason I can think of right now for all this is a Linux setting DB2 uses when importing through ADMIN_CMD.
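For reference, combining that modifier with the original command from the question, the full call would look roughly like this (the <filename> and <view> placeholders are as in the question, and I am assuming the modifiers can simply be listed together in the MODIFIED BY clause):
CALL SYSPROC.ADMIN_CMD('import from <filename> of del modified by decpt, coldel; codepage=1252 messages on server insert into <view>')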

MVC 3 Adding a local Database

I am currently going through the MvcMusicStore tutorial (MVC 3). I have full SQL Server 2008 installed and created a local database by running the SQL scripts included in the MvcMusicStore-Assets data folder. However, when trying to add the mdf to the AppFolder in Visual Studio 2010 I get the error Access Denied. I am completely stuck at this point and would appreciate any help to resolve this.
Most probably the mdf file is locked by some other process, not allowing the application to read it. If you attached the database to SQL Server, you need to use a connection string with the server/instance name instead of specifying the mdf file directly.
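As a hedged example of what such a connection string might look like (the instance name .\SQLEXPRESS and the database name MvcMusicStore are assumptions, not values from the question):
"Data Source=.\SQLEXPRESS;Initial Catalog=MvcMusicStore;Integrated Security=True"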
You can't copy or modify a live working database. And I don't see why you should.
You need to connect to it? Pick a way: LINQ to SQL, Entity Framework, NHibernate, ADO.NET...
If you really want to copy the database file for some reason, you must first stop the MSSQL Service (or detach the database), then do that.
Like others have said, you shouldn't need to add the actual .mdf into your project. If you have it running on your local SQL Server instance, you should be able to add it via Visual Studio's Server Explorer (plus that gets you your connection string).
