I already have a website running and want to install JIRA Server on it. How do I add a subdirectory to my main website? For example, www.mywebsite.com is my actual website, and I want www.mywebsite.com/Jira. How do I achieve this?
I downloaded the Windows JIRA installer and ran it locally, connecting to the database hosted on my server. It worked using localhost (http://localhost:8080/secure/Dashboard.jspa), but now I want to access it through the web link, i.e. the base URL.
I am new to this so any help would be really appreciated.
Well, I understand that you have a domain and you want to run your JIRA with the context path /jira on that domain. If that's the case, you may need to consider running your JIRA behind a reverse proxy. Take a look at the Atlassian documentation, since it contains all the information you need for this:
https://confluence.atlassian.com/kb/proxying-atlassian-server-applications-with-apache-http-server-mod_proxy_http-806032611.html
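For illustration, a minimal Apache virtual host for this might look like the sketch below (the server name and backend port are assumptions based on your description; adapt them to your environment):

    # Minimal sketch: forward /jira on the public site to the local JIRA Tomcat.
    # Requires mod_proxy and mod_proxy_http to be enabled; ServerName and the
    # backend port are assumptions.
    <VirtualHost *:80>
        ServerName www.mywebsite.com

        ProxyRequests Off
        ProxyPreserveHost On

        ProxyPass        /jira http://localhost:8080/jira
        ProxyPassReverse /jira http://localhost:8080/jira
    </VirtualHost>

Note that JIRA's own Tomcat must serve the same context path (set path="/jira" on the <Context> element in JIRA's conf/server.xml), and the base URL must be updated to http://www.mywebsite.com/jira in JIRA's general configuration; the Atlassian page above covers both steps.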
Bamboo and Bitbucket are two pieces from the same vendor, and there should be no problem integrating the two with each other, but I have a weird situation.
Here is what I get when trying to add a Bitbucket repo to my Bamboo.
See attached screenshot.
I'm pretty sure my repo is public and that I'm using the correct Bitbucket user account name.
Thanks in advance.
I installed Bamboo on AWS using the script provided by Atlassian.
At the end it gave me web UI links, as shown in the following screenshot.
It worked; however, some functionality was blocked by XSRF protection (which is not enabled by default in Atlassian products).
It works fine when I use the native Bamboo URL (HTTP on port 8085) instead of HTTPS.
Be careful with that... I just wasted about two days trying to fix something that didn't need to be fixed at all.
Have you linked your Bitbucket repo to your Bamboo server? If not, see https://confluence.atlassian.com/bamboo/linking-to-another-application-360677713.html
Maybe this can help.
I'm having another problem using WSO2 API Manager 2.0.0: I have installed it in Docker using three containers (one for APIM, one for Analytics, and one for MySQL), and I replace some configuration files with my custom versions (e.g. DB, server name, gateway setup...).
Both APIM and Analytics are configured to save data in the MySQL container and I am able to see changes in the DB.
The issue is that I cannot find my APIs in either the publisher or the store after the container has been rebuilt. Changes in the DB persist; I can see the statistics for all my APIs, and I get an error if I try to create a new API using the same name or context, but the store is always empty after a new build.
I have also tried to put both /repository/deployment/server/synapse-config/default and /repository/tenants/ in two volumes and I can see the files created in /.../default/api/ for my APIs, but I cannot figure out the issue.
Should I persist some additional directories not mentioned in the guide?
I don't want to put the whole APIM and Analytics homes in volumes if possible.
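For reference, a docker-compose sketch of the kind of volume mapping described above (the service name, image tag, and WSO2 home path inside the container are assumptions):

    # Sketch only: image name, ports, and the wso2am-2.0.0 home path are assumptions.
    version: '2'
    services:
      apim:
        image: my-wso2am:2.0.0
        ports:
          - "9443:9443"
        volumes:
          - synapse-config:/opt/wso2am-2.0.0/repository/deployment/server/synapse-config/default
          - tenants:/opt/wso2am-2.0.0/repository/tenants
    volumes:
      synapse-config:
      tenants: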
First, check whether the artifacts can be located in the Resources Browser.
If you can find the API related files, then the issue is related to indexing.
Do the following to re-index the artifacts in the registry:
Rename the <lastAccessTimeLocation> element in the <APIM_2.0.0_HOME>/repository/conf/registry.xml file. If you use a clustered/distributed API Manager setup, change the file in the API Publisher node. For example, change the /_system/local/repository/components/org.wso2.carbon.registry/indexing/lastaccesstime registry path to /_system/local/repository/components/org.wso2.carbon.registry/indexing/lastaccesstime_1.
Shut down the API Manager, back up and delete the <APIM_2.0.0_HOME>/solr directory.
Finally, start the API Manager.
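Concretely, the rename in registry.xml amounts to appending a suffix to the indexing path, for example (the "_1" suffix is arbitrary; any changed value forces re-indexing):

    <!-- Before -->
    <lastAccessTimeLocation>/_system/local/repository/components/org.wso2.carbon.registry/indexing/lastaccesstime</lastAccessTimeLocation>
    <!-- After -->
    <lastAccessTimeLocation>/_system/local/repository/components/org.wso2.carbon.registry/indexing/lastaccesstime_1</lastAccessTimeLocation>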
The API information resides in the DB and in the file system (/repository/deployment/server/synapse-config/default/api). It is possible that the registry artifacts are not indexed properly. Can you try the following?
Delete the solr directory.
Open registry.xml and change the following line as shown below: <lastAccessTimeLocation>/_system/local/repository/components/org.wso2.carbon.registry/indexing/lastaccesstime-1</lastAccessTimeLocation>
Now restart the server. The server will re-index all the files again.
Also make sure the databases are properly configured, especially the registry-mounting-related configurations.
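If your nodes are meant to share a registry, the mount in registry.xml usually follows this pattern; the dbConfig name, datasource JNDI name, and connection details below are assumptions for illustration:

    <!-- Sketch of a shared JDBC registry mount; all names and URLs are assumptions. -->
    <dbConfig name="govregistry">
        <dataSource>jdbc/WSO2RegistryDB</dataSource>
    </dbConfig>

    <remoteInstance url="https://localhost:9443/registry">
        <id>gov</id>
        <dbConfig>govregistry</dbConfig>
        <readOnly>false</readOnly>
        <enableCache>true</enableCache>
        <registryRoot>/</registryRoot>
        <cacheId>root@jdbc:mysql://mysql:3306/regdb</cacheId>
    </remoteInstance>

    <mount path="/_system/governance" overwrite="true">
        <instanceId>gov</instanceId>
        <targetPath>/_system/governance</targetPath>
    </mount>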
My server running TFS Express crashed. I managed to mount the disk and extract the mdf/ldf files for my TFS collection. Here is what I did next:
Built a new machine (with the same name/IP address) and installed SQL Server Express and TFS Express.
From SQL Server Management Studio, attached the mdf/ldf files. I can now see TFS_MyCollection as a new database.
From TFS Administrative console, clicked on "Attach Collection."
However, the new database is not being listed.
I went through a bunch of links on the Internet. https://social.msdn.microsoft.com/Forums/en-US/d949edf3-1795-448a-a1cc-39555ce87b50/tfs-2010-installation-error had a similar situation. Based on the suggestion, I had attached the database. I also looked at https://msdn.microsoft.com/en-us/library/ms404869(VS.80).aspx. However, this one talks about using backup/restore, which is not my case.
I must be missing some configuration step. Please advise. Regards.
You can't just attach a collection that was never detached.
You need to unconfigure your TFS instance (tfsconfig.exe setup /uninstall:All) and then restore all of the databases.
You will need to restore each collection and the configuration DB; they are currently a set. Once you have all of the databases attached/restored, you need to run the setup and "configure application tier only".
https://msdn.microsoft.com/en-us/library/ms404869.aspx
You need to follow the documentation for moving hardware. Make sure that you follow each step.
Note: You should take backups!
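A sketch of the overall sequence, run from an elevated command prompt on the application tier (the database names are assumptions):

    REM 1. Unconfigure the existing application tier
    tfsconfig.exe setup /uninstall:All

    REM 2. In SQL Server, restore or attach the configuration DB and every
    REM    collection DB as one consistent set (e.g. Tfs_Configuration,
    REM    Tfs_MyCollection, ...)

    REM 3. Re-run the TFS configuration wizard and choose
    REM    "Application-Tier Only", pointing it at the restored databases.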
I am currently using TFS to source changes to a web site code base. Currently, when I'm done making a change, I need to deploy the changes to a web server for review by the end user.
Generally, the way I do this is to connect to that machine via RDP, open Visual Studio, and do a Get Latest to pull the changes...
However, this only works if I'm the only one working on the entire site. If someone else RDPs in to make changes, the site is locked to my TFS account, and they can't make any changes to it...
They could pull their own copy of the site onto their own machine via TFS and check in their changes there, but because so much of their work is done in the database (vs. code), they'd have to duplicate everything they do into the website every time they commit a change, so they prefer to work directly on the machine...
Is there any way to make this work, or a better way to set this up, so I can pull their changes into my local copy via TFS?
My biggest problem to overcome is the fact that when I Get Latest on the web server via RDP, it locks the entire solution to my TFS account, so that when they log in to RDP with their credentials, they can't make any changes because the files are checked in, and they can't check them out because the solution is tied to my account.
If I can get past that, I think we'd be okay.
Any info is appreciated; please let me know if I can provide more context. Thanks!
Can you set up a different TFS workspace for each user on your RDP machine? This should allow multiple users to use the TFS client to pull the same solution on the same machine without issue.
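For example, each user could create and map their own workspace from a command prompt on the RDP machine (the workspace, collection, and path names below are assumptions):

    REM Create a per-user workspace against the shared collection
    tf workspace /new /noprompt WebSite-Alice /collection:http://tfsserver:8080/tfs/DefaultCollection

    REM Map the solution into a per-user folder and pull the sources
    tf workfold /map $/WebSite C:\Work\Alice\WebSite /workspace:WebSite-Alice
    cd /d C:\Work\Alice\WebSite
    tf get /recursive

With one workspace per user, a Get Latest in one workspace no longer affects the files in another.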
I am currently in the middle of a TFS 2010 multi-server installation and have the following questions:
Should I go with the default installation of having TFS sit on http://server:8080/tfs? I would probably prefer http://server/ or http://server:8080.
I can't seem to find the best practice for which user I should use to remotely connect to the reporting services - should this be a domain user, or a local machine account on the database server?
If I install SharePoint and I want the database stored on another server, am I OK to install SharePoint as a single server, or do I need to install "application only" and create a "farm"?
That is all :)
• Should I go with the default installation of having TFS sit on http://server:8080/tfs? I would probably prefer http://server/ or http://server:8080.
We have ours hosted externally, so we go with a FQDN; but if not, you should use the default, as Visual Studio also uses this default. By all means set up a FQDN or another URL on port 80 to give people the choice and to make the web access more accessible.
• I can't seem to find the best practice for which user I should use to remotely connect to the reporting services - should this be a domain user, or a local machine account on the database server?
Network Service is fine, and TFS will set everything up during the install. You can create a "domain\TfsReports" account if it makes you feel better, and I would recommend this if you are going to set up Kerberos or are running in a security-conscious environment.
• If I install SharePoint and I want the database stored on another server, am I OK to install SharePoint as a single server, or do I need to install "application only" and create a "farm"?
You can put the SharePoint databases anywhere you like during the install. If you are doing what you suggest, remember not to lose them. A better option would be to integrate with your existing SharePoint 2007/2010 corporate deployment.
Integrate SharePoint 2010 with Team Foundation Server 2010