My workspace became out of sync with the repo while I was working. Now I cannot submit my changes or download the latest from the repo - plasticscm

I was working on code while our artist was working on his binary files. He had submitted his files to the repo. After that, I tried to submit my updates and got the following error:
The merge operation is not currently available in the cloud server
I also get this message when I try to download updates to bring my workspace back in sync.
Help!

As https://www.plasticscm.com/cloud/index.html states, developers should use distributed servers in order to be able to merge. Artists are fine with Gluon. Note that if all your development is done on br:/main, then developers can also use Gluon.
From the message you are getting, it seems you are working directly against the cloud repo instead of a local one. Merge in the Plastic SCM Cloud is not supported yet; it will be available in the near future.
For now, you should create a local repository, pull the cloud content into your local server, and work there. Once you are done with your local changes, you can push the new changesets to the Plastic SCM Cloud repository.
The Sync view will help you a lot with this process of pulling/pushing information.
As you currently have local changes that can't be submitted because of the merge lock, I recommend you first back up the changed files and then start working distributed; restore the files in your workspace, and you will be able to checkin and then push back to the Cloud repository.
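A sketch of that round trip with the cm command line. The repository, port, and organization names here are placeholders, and exact command spellings vary between Plastic versions, so treat this as an outline rather than copy-paste:

```shell
# Create a local repository on your own machine (placeholder names).
cm repository create project@localhost:8087
# Pull the cloud history down into the local repository.
cm pull br:/main@project@MyOrg@cloud project@localhost:8087
# ...checkin, branch, and merge against localhost:8087 as usual...
# Push the new changesets back up to the cloud repository.
cm push br:/main@project@localhost:8087 project@MyOrg@cloud
```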

Related

Gitbash/bitbucket: how to push changes from previous push, overwriting a "bad" push

I have a project I have been working on, and it was suggested that I use Git Bash and Bitbucket to keep track of all the changes I will make.
I made some changes, pushed them to my Bitbucket, and then cloned it to make sure I was really working on the latest files. I made a few more changes on that cloned download and tried to run it on my hardware, but it wouldn't run. I couldn't figure out why either.
I went back to the previous download and managed to get it running on my hardware setup. I have made a multitude of changes since but just completely forgot to push.
Now, when I try to push, because there are files I don't have locally, I get the error:
On branch master
Your branch and 'origin/master' have diverged, and you have 2 and 5 different commits respectively.
It also tells me to git pull to merge the remote into my local files.
I do not want to do that, as that would overwrite changes I want to keep locally. I just want to push what I have locally, to my remote.
I am not a software engineer at all. All of the help pages and tutorials are going way over my head.
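For reference, a quick way to see exactly which commits diverged, assuming the remote is named origin and the branch is master as in the error above. Neither command changes any files; they only list the two sides of the divergence:

```shell
# Update your view of the remote without merging anything.
git fetch origin
# Commits that exist only on the remote (the 5 you don't have):
git log --oneline HEAD..origin/master
# Commits that exist only locally (the 2 you haven't pushed):
git log --oneline origin/master..HEAD
```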

Jenkins build notification affecting all branches in bitbucket

I have previously used Jenkins and Bitbucket on premises and been able to have Jenkins notify Bitbucket of the build status of each individual branch (success, failed, in progress). However, since I moved to Bitbucket Cloud, it has started applying the status of every build to every branch. For example, if I have just a master and a develop branch (to keep it simple) and the master branch failed because of some deployment configuration, I am unable to merge a fix into it from develop even if develop is passing, because it claims 1 of my 2 builds is failing on the develop branch.
This is tough to explain clearly in words so I've attached some pictures:
Two branches one build failing but both being marked as failed
Showing that develop branch is passing
Proof it won't let me merge
These notifications come from Jenkins and were set up using the standard cloudbees-bitbucket-branch-source:2.9.7 plugin to scan my Bitbucket Cloud.
Okay, so this was a really obvious mistake, but I thought I would leave the reason it happened here in case anyone makes the same one. cloudbees-bitbucket-branch-source:2.9.7 notifies Bitbucket using the commit ID. When creating the branch structure for the repository, I branched off main to make develop; both got built, but both had the same commit ID, so both were notified of both builds. The problem fixes itself on the first cycle of code to run through it.
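The overlap described above is easy to reproduce in a throwaway repo: immediately after branching, both branch heads resolve to the same commit ID, so a build status posted against that commit shows up on both branches until the new branch gets a commit of its own.

```shell
# In a fresh repo with one commit on master:
git branch develop master      # develop starts at master's commit
git rev-parse master develop   # prints the same hash twice
# After develop gets its own commit, the two IDs differ and
# per-commit build statuses stop overlapping.
```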

Using Jira to trigger a Jenkins build

I have a strange use case here, I know, but basically I have a CI/CD solution that starts with a developer creating a zip file of a set of resources. This zip is then pulled into SVN via the tool's internal programs.
Currently the solution works, using the FSTrigger plugin to poll for an updated zip. When it sees one, the process kicks off and we're happy.
Going forward, I'd like the builds to be triggered by a Jira issue reaching a certain status, and I have been looking at the Jira Trigger Plugin. It looks like it will satisfy me with regard to triggering the build and passing data from Jira to Jenkins to use for delivery notes etc. However, it would still depend on the zip file being in a certain location to be picked up.
I'm wondering if it's possible to attach the zip to the Jira task and then, as part of the task status hitting 'build', kick off the Jenkins job and copy the zip so it can be picked up by the Jenkins build task.
For reasons too complex to mention, checking the zip into SVN first won't really work.
When your Jenkins build is triggered via the jira-trigger-plugin, you can access the JIRA_ISSUE_KEY environment variable, which contains the key of the JIRA issue whose status changed.
With the JIRA issue key, you can call the Get Issue JIRA REST API to retrieve the issue details. The issue details contain the attachment information, which can then be used to download the zip in Jenkins.
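A minimal sketch of that lookup from a Jenkins shell step. The site URL and the JIRA_USER/JIRA_TOKEN credentials are placeholders for this sketch (only JIRA_ISSUE_KEY comes from the plugin), and with jq installed the JSON handling would be cleaner than grep:

```shell
# JIRA_ISSUE_KEY is provided by the jira-trigger-plugin.
# JIRA_USER, JIRA_TOKEN, and the site URL are assumptions for this sketch.
ISSUE_JSON=$(curl -s -u "$JIRA_USER:$JIRA_TOKEN" \
  "https://yourcompany.atlassian.net/rest/api/2/issue/$JIRA_ISSUE_KEY?fields=attachment")
# Pull the first attachment's download URL out of the JSON.
ZIP_URL=$(echo "$ISSUE_JSON" | grep -o '"content":"[^"]*"' | head -1 | cut -d'"' -f4)
# Download it into the workspace so the build can pick it up.
curl -s -u "$JIRA_USER:$JIRA_TOKEN" -o resources.zip "$ZIP_URL"
```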

How does multiple developers work with the same repository (Unity3D)?

I have been trying out Plastic SCM for a few weeks now, and we still have issues figuring out how to use Plastic SCM when multiple developers are working with the same repository.
We are currently working with Unity3D for our project, and every time we try to merge our changes we end up with some sort of problem.
We are currently using the Plastic SCM Cloud Edition.
What we have tried is to work directly against the cloud repo, but that ended up with merge problems (since the Cloud Edition isn't able to merge content "on the fly" in the cloud).
We then created local repositories and replicated the cloud repository into them by simply pulling it, made some changes, and then tried to push the local repository. However, we then received a new error telling us something about the MemoryStream...
So... how are we supposed to work with Plastic SCM Cloud Edition in a small team of 2-5 developers working with Unity3D?
Your last approach makes sense if you don't need an on-premise server in your office.
Plastic Cloud Edition is the sum of the Plastic SCM software plus a Plastic Cloud server and storage all packaged together in a single pay-as-you-go subscription.
The client software and local repositories are installed on each developer's computer, and each developer then pushes/pulls to the cloud organization.
You commented that your workflow requires branching and merging, so checkin, update, create branch, and merge will be local operations; you then connect to the cloud only to push/pull your changes.
This "MemoryStream" message could be a "hard drive is full" issue, but we would need more details to confirm it. If you reach out to us, we will try to debug it, and we will also help you configure your setup: support at codicesoftware dot com

Importing the code changes while comparing the code from source control

Can I import the changes between the local version and the checked-in version of the code? I am using TFS 2010. The reason I am asking this question is that I want to send my code to an external reviewer who will not have access to TFS source control.
My current thinking is that if I can somehow import the changes, then the reviewer can review them through WinDiff or a tool like Beyond Compare.
Why not just:
Get latest
Copy it to a directory as <<filename>>-new.cs
Get the specific version
Copy it to the same directory as <<filename>>-old.cs
Send both files to the reviewer and have him use WinDiff (or whatever) to review?
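The pairing step above can be scripted. A rough sketch, assuming the two versions have already been fetched into checkout-latest/ and checkout-old/ (both directory names are made up for this example):

```shell
# Pair up old/new copies of each file so the reviewer can diff them.
mkdir -p review
for f in checkout-latest/*.cs; do
  name=$(basename "$f" .cs)
  cp "checkout-latest/$name.cs" "review/$name-new.cs"
  cp "checkout-old/$name.cs"    "review/$name-old.cs"
done
```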
If you want to import only the changed code, you can first check out all the files, then copy the files the reviewer sends to you over the local files.
Then use the tfpt uu command from the TFS Power Tools to undo all unchanged files. You are left with only the modified files.
For a (very) different approach, you could use git with TFS and push updates to your reviewer's remote repository using git. You can then pull their changes when you're ready and push them to TFS once you've looked at them.
For an idea on how this might be done have a look at http://www.richard-banks.org/2010/04/git-tfs-working-together-version-2.html
