We have an on-premises TFS 2018 server with Update 2 RC.
Web performance is very slow. We work with Kanban and Scrum boards, and they take a few seconds to load; moreover, moving a task from one column to another can also take a few seconds.
The operating system is Windows Server 2016, with 16 GB of RAM and 4 processors at 2.6 GHz. It's a virtual machine. It had been working properly until last month.
I have checked and changed the following:
- The Elasticsearch search extension has been deactivated because it was taking more than 5 GB of memory.
- The antivirus has been disabled.
- CPU usage is below 15% and memory usage is around 50%.
- The IIS worker process is taking 1.2 GB of RAM.
- I have deleted the "TfsData\ApplicationTier\_fileCache" folder, with no success.
We are out of ideas; any help would be really appreciated.
Thanks in advance.
It's difficult to know what's going on there from this information alone. An approach that would let you perform a better analysis is the typical divide-and-conquer approach; in your case:
- Put the build agent on another machine.
- Install TFS and SQL Server on different machines; you can try a clean install of TFS using your current SQL Server.
Once you have done this, you will:
- Have a more stable system.
- Be able to analyze which part is creating problems.
I have a server with 48 GB of memory and a SQL Server Analysis Services instance (Tabular mode), 2016 Standard edition, SP1 CU7, installed on it.
I can deploy a tabular model from visual studio.
I can manually run an XMLA script:
{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "MyCube"
      }
    ]
  }
}
But when I run that script from a SQL Agent job, I get this error:
the JSON DDL request failed with the following error: Failed to execute XMLA. Error returned: 'There's not enough memory to complete this operation. Please try again later when there may be more memory available.'.. at Microsoft.AnalysisServices.Xmla.XmlaClient.CheckForSoapFault
Memory usage before processing is about 4 GB; it increases while the cube is processing, but when it hits about 18.5 GB, it fails.
Does anybody know a solution?
Analysis Services Tabular instances in SQL Server 2016 are limited to 16 GB of RAM, as documented here, if you are running Standard Edition. Enterprise Edition removes that cap.
When you do a Process Full, you keep a working copy of the cube while a shadow copy is processed in the background. When the shadow copy is ready, it replaces the working copy. Basically this means that at processing time you need twice as much memory as the size of your cube, which can be an issue with the 16 GB per-instance limitation of SSAS Standard Edition.
One solution is to do a process with clearValues first, which empties the cube, and then do the full process, as sketched below. More details here: http://byobi.com/2016/12/how-much-ram-do-i-need-for-my-ssas-tabular-server/
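A minimal TMSL sketch, reusing the "MyCube" database name from the question; run this as a first command, then run the full refresh script from the question as a second one:
{
  "refresh": {
    "type": "clearValues",
    "objects": [
      {
        "database": "MyCube"
      }
    ]
  }
}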
Another one is to play with the Memory\VertiPaqPagingPolicy setting of the SSAS server. See more details here: https://www.jamesserra.com/archive/2012/05/what-happens-when-a-ssas-tabular-model-exceeds-memory/ and here: https://www.sqlbi.com/articles/memory-settings-in-tabular-instances-of-analysis-services/
And of course another solution is to upgrade to Enterprise Edition.
To follow up on Greg's comments: I was facing a similar issue at work, and the workaround was to do table refreshes instead of a database refresh. I created two SQL Agent jobs. My tabular model had 40 tables, so based on the sizes of the tables I refresh x tables in one job and y tables in the other. You can create more than two SQL jobs, with fewer tables per job, if you wish. This puts less load on the memory; a sketch of such a per-table refresh follows.
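For illustration, each job would run a TMSL refresh that lists only its share of the tables. The "Sales" and "Customers" table names below are hypothetical placeholders; "MyCube" is the database name from the question:
{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "MyCube",
        "table": "Sales"
      },
      {
        "database": "MyCube",
        "table": "Customers"
      }
    ]
  }
}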
You can also process small subsets of your data by partitioning your tables; this can be handled in SSMS. This article provides a nice overview of how to achieve it.
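As a rough TMSL sketch of the idea (the "Sales" table, "Sales 2017" partition, source query, and "MyDataSource" data source name are all assumptions, not taken from the article), you first define a partition over a subset of the data:
{
  "createOrReplace": {
    "object": {
      "database": "MyCube",
      "table": "Sales",
      "partition": "Sales 2017"
    },
    "partition": {
      "name": "Sales 2017",
      "source": {
        "query": "SELECT * FROM dbo.FactSales WHERE OrderDate >= '20170101' AND OrderDate < '20180101'",
        "dataSource": "MyDataSource"
      }
    }
  }
}
and then process only that partition instead of the whole table or database:
{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "MyCube",
        "table": "Sales",
        "partition": "Sales 2017"
      }
    ]
  }
}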
Our TFS 2012 deployment is currently quite simple:
a virtual Windows server with TFS, SharePoint, Reporting, SQL Server, and builds all on the same machine!
Is using the TFS admin console backup tool and/or backup of the entire machine enough to recover from a disaster?
There are no clear-cut criteria; you may take a look at the TFS planning and disaster recovery guidance for a more comprehensive answer.
In short, you must at least be sure that:
- Backups are saved on different hardware, and possibly copied to a remote location.
- Along with your backups, you have the recovery instructions and install packages.
This guarantees that you are able to recover, but it can take a long time, depending on the impact of the disaster (someone deleted a record vs. the server room has burnt down) and the size of your data.
We're about to deploy TFS 2012, mainly for source control at this stage, but it will hopefully ultimately provide a full workflow for us.
Can anybody point me towards a sizing guide for the database aspect?
The short answer is "how long is a piece of string?".
To qualify that short answer a bit, there is obviously an overhead to begin with. TFS is much better than SourceSafe in that only changes are stored, so you don't get a different version of the file in the database for each check-in. This is a good thing.
That said, the answer to this question really depends on how often you're going to be checking in, the amount of change between those check-ins, and the overall size of all the projects and their related files.
To give you some metric: on our TFS server, the supporting TFS databases plus our "collection" database, which has been running for 6 months now with regular daily check-ins, are hitting 800 MB.
Now, unless you head a massive project, I can't see you going over half a terabyte anytime soon. That said, given that TFS is SQL Server based, should you need to upgrade in the future it's not as much of a nightmare as you may think.
According to Microsoft's documentation:
- Fewer than 250 users: 1 disk at 7.2k rpm (125 GB)
- 250 to 500 users: 1 disk at 10k rpm (300 GB)
- 500 to 2,200 users: 1 disk at 7.2k rpm (500 GB)
- 2,200 to 3,600 users: 1 disk at 7.2k rpm (500 GB)
However, as Moo-Juice said, the real-world numbers are dependent on how you actually use TFS.
Also keep in mind that you'll want to create, store, and maintain backups of your TFS databases.
I have an MVC 4 application that communicates with my SQL Server database via a WCF layer. The application and WCF layers are co-located on the same server, with the database located on a different server.
I am seeing CPU issues on the server that hosts the applications, in particular with my MVC 4 application. The server is Windows Server 2008 R2 running IIS 7.5.
I would like to set up some performance counters on the server to analyze where the problem causing the high CPU usage may be.
I am new to setting these up and am looking for pointers on which counters to use, how to analyze them, and how best to gain more knowledge in this area.
Performance counters are generally good for production monitoring. In a development environment (and I suppose you are at this stage), there are many profiling tools and APIs.
On SQL Server
The best tool is SQL Server Profiler. You can find and diagnose slow-running queries by capturing all Transact-SQL statements and/or SQL Server events.
On ASP.NET MVC
I highly suggest you install a profiler like the ASP.NET MiniProfiler or Glimpse. When browsing your website, this will tell you which controller/action/partial/AJAX call is slow, and sometimes why.
Visual Studio includes a profiler. It lets you measure, evaluate, and target performance-related issues in your code, and it's fully integrated into the IDE. Once you have run a performance session, several reports are available to help visualize and detect performance issues from the data gathered.
If you still can't find the cause, you could run a load test using Visual Studio Web and Load Tests. You will rarely have performance issues with a single user, but with many concurrent users that is generally not the case.
I have started at a new site that is using .NET applications for the first time. As a developer I am used to VSS, but that product is dying a death, so we are using TFS (Basic) instead.
I have been using TFS for source control up until now. But now we are having new servers installed for a live environment.
Now I am not sure what I should be doing. There are no books on TFS 2010 that I can find and I am wondering what tips you can give me. Does TFS need to be installed again, or should I use the existing installation? I am thinking I ought to set up a daily build for a test server. I have not been using TDD up until now, but for the next project this may change.
What must I absolutely get right, and what pitfalls should I avoid?
Without being there in your environment, it's hard to make appropriate recommendations. I've made some assumptions about your installation based on what you said, but these may be wildly wrong.
You say you're using TFS (Basic); I'm not sure what you mean by that, but if you are using TFS installed on one of the developers' workstations, and you're starting to move towards a more robust development environment, I would recommend that you get a separate server (or servers) for your TFS installation.
It sounds like you're relatively small, so having your application tier and your data tier on the same machine shouldn't be that much of an issue. Just make sure that you have enough RAM on the machine to support both processes, and that you have enough disk space allocated for the growth of the database.
You talk about Test Driven Development (TDD), but what I think you're actually talking about is Continuous Integration (CI). When you have a CI environment set up, builds happen automatically, based either on a schedule or triggered by check-ins. Having this set up is never a bad idea, and I would recommend that you get into the rhythm of CI builds as soon as possible.
If you're looking for a build server, you are probably going to be ok hosting the build agent on the combined application/data tier. If you find that you're getting performance hits when you do builds, you can move your builds to a different server without much effort.
You will also want to look at migrating your source code repository from your current environment to your future environment. The TFS installation wizard might be able to help you with that. If not, there are other options available, such as moving the database files to the new machine, or using the CodePlex-based TFS Integration Platform.