I have used Ant Design in my projects and I would like to contribute.
I cloned the repo and ran it locally. My question is: if I change any code, how can I see the output, or test or debug a component?
I need some instructions.
Look for the Contributing section on the official website of the relevant antd project.
For example, see the contributing section for React.
Moreover, in every well-maintained GitHub repository, check the .github folder or the "How to contribute" section of the README.md file - it has all the information you need.
Finally, I suggest using the platform itself (GitHub) to communicate with the project you want to contribute to; for any question, try asking in the issues section.
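As a rough sketch of what the local workflow usually looks like (the exact script names are defined in antd's package.json and contributing guide, so treat these commands as assumptions to verify there):

# Clone your fork and install dependencies
git clone https://github.com/<your-username>/ant-design.git
cd ant-design
npm install

# Start the local dev site; edits to a component are hot-reloaded,
# so you can see the output in the browser
npm start

# Run the test suite (optionally narrowed to the component you changed)
npm test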
I am trying to adopt Bazel for a project which was recently converted to a monorepo.
(I am VERY new to Bazel and have no mentor handy, please excuse me if the questions I ask are answered with "rtfm"; I am happy to do so, please point me to the manual.)
This monorepo contains a bunch of libs and binaries. When a certain tag is set, the goal is to build everything (as in bazel build //...), pack all the binary files into a zip and attach the created archive to a GitHub release.
I am comfortable with GitHub Actions, triggers and the releases themselves, but I am not sure what an appropriate approach would be to create the archive containing all binaries via Bazel. Can anyone point me in the right direction?
You should go through the links below: #1, #2, #3, #4.
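As a minimal sketch of one approach, assuming your binaries end up under bazel-bin and you just want to collect them in a CI step (rules_pkg's pkg_zip rule is a more Bazel-native alternative worth a look):

# Build everything in the monorepo
bazel build //...

# Collect executable outputs from bazel-bin into a flat zip
# (adjust the find filter to match how your binaries are actually laid out)
find -L bazel-bin -maxdepth 2 -type f -executable -print0 | xargs -0 zip -j release.zip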
How can I download the source of an Overleaf project with a command line script? I want to make regular backups of the source, and it would be better if I could automate the download instead of having to do it through the web interface every time. I'm not aware of any API that would allow me to do that; is there one?
I know that an ideal solution would probably use the git-Overleaf integration for proper version control, and that's what I do for my personal projects. However, for some projects I have to work with collaborators who find git too confusing and do not want to enable the git features, to avoid possible confusion between the git history and Overleaf's history, so that's not an option.
You may want to look into scripts that download a zip archive of your project, for example overleafv2-git-integration-unofficial.
This Python script downloads a zip of your project, extracts it, and deletes the zip. It also offers basic git functionality, albeit largely experimental.
Sample usage:
overleafv2-git --email=your@email --password=yourpass --message="commit message" project-URL
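For regular backups, you could wrap that command in a small script and run it from cron. A rough sketch, assuming the tool's flags are exactly as in the sample usage above and that the credentials come from environment variables you set yourself (OVERLEAF_EMAIL and OVERLEAF_PASSWORD are made-up names here):

#!/bin/sh
# Back up an Overleaf project into a date-stamped directory.
# Usage: ./backup-overleaf.sh https://www.overleaf.com/project/<id>
set -e

PROJECT_URL="$1"
BACKUP_DIR="$HOME/overleaf-backups/$(date +%Y-%m-%d)"

mkdir -p "$BACKUP_DIR"
cd "$BACKUP_DIR"

overleafv2-git --email="$OVERLEAF_EMAIL" --password="$OVERLEAF_PASSWORD" \
  --message="automated backup $(date +%F)" "$PROJECT_URL"

# Example crontab entry for a daily 02:00 backup:
# 0 2 * * * /path/to/backup-overleaf.sh https://www.overleaf.com/project/<id>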
I am preparing to make a minor bug fix to Bazel's Java code. I am working on a Linux distribution.
I am following the instructions in https://bazel.build/contributing.html, but I encounter problems with two of the test instructions:
In the section about "Compiling bazel", the third paragraph states: "In addition to the Bazel binary, you might want to build the various tools Bazel uses. They are located in //src/java_tools/..., //src/objc_tools/... and //src/tools/... and their directories contain README files describing their respective utility." If I follow this, building //src/tools/... fails because there is no xcrun command in the Linux environment I am using. I suppose these are macOS platform-specific tests?
The next paragraph instructs you to build a distribution package, which you then unpack in a new directory, and then run: "bazel test //src/... //third_party/ijar/...". I now get an error that windows.h is missing, which I suppose comes from Windows platform-specific tests.
Some questions:
So is there an easy way to run tests only for the current platform?
Are the instructions good enough?
If the instructions should be updated, what is the best way to notify whoever maintains that documentation page?
Thanks for your interest in contributing to Bazel! The bazel-dev mailing list is a better avenue for these questions.
The tests that you want to run largely depend on the changes you make, but when you make a pull request, the Bazel CI will run all of Bazel's tests to make sure that nothing breaks.
So is there an easy way to run tests only for the current platform?
It depends, and this is still a work in progress where we want to make Bazel more aware of platforms and toolchains without specifying additional flags.
In general, you don't need to modify or worry about the //src/*_tools packages unless you're making direct changes to them.
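If you do want to skip those tool packages when running the suite locally, Bazel's target-pattern syntax lets you subtract them. A sketch (the exact set of packages worth excluding on your platform is an assumption to adjust):

# Run the main test targets while excluding the platform-specific tool packages
bazel test -- //src/... //third_party/ijar/... \
  -//src/java_tools/... -//src/objc_tools/... -//src/tools/...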
Are the instructions good enough?
The instructions will never be perfect, and we're always looking for ways to make them clearer and more concise.
If the instructions should be updated, what is the best way to notify whoever maintains that documentation page?
Please file an issue on the GitHub repository or email the bazel-dev mailing list for further discussion.
I'm tired of not knowing which build a web application came from when an issue is presented to me by QA. My proposed solution is to generate some static content as part of the build process that has useful information in it. Somebody could reference that file manually on the server, and I think it would also be nice to offer the ability to HTTP GET the content, perhaps as JSON, as part of the API.
Seems simple, right? I've done a lot of looking into doing this with Subversion in the past, but I'm not familiar with the options for TFS. I realize that I'm really looking for two things: a way to harvest the TFS information such as the repository, branch, and changeset number, and a way to write that content to a file, all within a standard build process. Are there any existing MSBuild targets or tools I can pull in, using something like NuGet, to ensure this is done for all builds, whether they are local dev builds or ones done through a CI server, without requiring any extra configuration or steps from the dev or the person setting up the CI server environment? What I'm hoping is that this is a common enough pattern that maybe somebody has already done this exact thing and packaged it for reuse, or that the pieces are simple enough to put together in a way that can be repeated as a general pattern for various types of .NET projects.
UPDATE: I realized that one issue with the proposed solution is that whatever static file is generated, it would have to be added as a reference to the project in order to be picked up by the build process as content so that it is copied/published properly along with the rest of the web application. Perhaps a broken reference can be added to the project ahead of time that would be fulfilled after the project is built at least once. This is reminiscent of the old way that Package Restore used to work with NuGet.
Related resources:
MSDN - Code Generation in a Build Process
Stack Overflow - MSBuild to copy dynamically generated files as part of project dependency
Stack Overflow - How to programmatically get information about branches in TFS?
Stack Overflow - It is possible to get TFS change set number from the local file system?
Stack Overflow - tf.exe info /version:T does not get latest
The version control information is already available in environment variables; you just need to write a simple command to read it from them. Check the following links to see whether this information meets your requirements:
For vNext build: Predefined variables.
For XAML build: TF_BUILD environment variables.
To include the generated file, you can refer to this question for details: How do you include additional files using VS2010 web deployment packages?
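As a rough sketch of a script step on a vNext build agent, assuming the predefined variables from the first link are exposed as environment variables (verify the exact names against those docs and your build definition):

# Writes basic build/source information to version.json so the deployed
# application can report which build it came from.
# Variable names follow the vNext "Predefined variables" docs linked above;
# confirm they are actually populated in your build definition.
cat > version.json <<EOF
{
  "repository": "$BUILD_REPOSITORY_NAME",
  "branch": "$BUILD_SOURCEBRANCHNAME",
  "changeset": "$BUILD_SOURCEVERSION",
  "buildNumber": "$BUILD_BUILDNUMBER"
}
EOF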
Here is what I've come up with which is working for now:
<Target Name="BeforeBuild">
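<!-- Locate TF.exe under the Visual Studio 14.0 install and compose the command that writes version control info to version.txt -->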
<PropertyGroup>
<ProgramFiles32>$(MSBuildProgramFiles32)</ProgramFiles32>
<VS14Dir>$(ProgramFiles32)\Microsoft Visual Studio 14.0</VS14Dir>
<TF Condition="Exists('$(VS14Dir)')">$(VS14Dir)\Common7\IDE\TF.exe</TF>
<TfCommand>"$(TF)" vc info . > version.txt</TfCommand>
</PropertyGroup>
<Message Importance="High" Text="Creating version.txt: $(TfCommand)" />
<Exec Command="$(TfCommand)"></Exec>
</Target>
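<!-- Include the generated version.txt as content so it is copied/published along with the rest of the web application -->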
<ItemGroup>
<Content Include="$(SolutionDir)\version.txt">
<Link>%(Filename)%(Extension)</Link>
<CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
</Content>
</ItemGroup>
This invokes the tf.exe console application with the args vc info ., which gets version control information for the current directory, and then uses > version.txt to redirect the output to a file. There are probably a bunch of improvements that could be made to this, but it's a start. Next, I'll have to find some way to include the generated version.txt file as content within the project.
UPDATE: It looks like tf info only gives info about the specified directory and not about the working copy. I'm looking around for alternatives again.
http://teamcity.codebetter.com/viewLog.html?buildId=11047&tab=artifacts&buildTypeId=bt21
CodeCampServer has two download packages: VisualStudioTemplate and CodeCampServerPackage.
I looked for any explanation of how to use them, but unfortunately I didn't find one. So I'll ask my questions here:
How do I use VisualStudioTemplate? The archive does not contain any .vstemplate file, so it cannot be used as a Visual Studio template. Do I need to rename every $safesolutionname$ manually? That doesn't make any sense.
What is the purpose of CodeCampServerPackage? This archive contains deployment files only.
Thanks to Eric Hexter I have some answers:
The current trunk of CodeCampServer is on CodePlex. The build server is moving right now from the regular teamcity.codebetter server to a new build server for HeadSpring projects: http://build1.headspringlabs.com/
VisualStudioTemplate is meant to be used with the solution factory command line tool. This tool basically takes the name of the new project, the directory of the source template, and the destination directory of the new project.
CodeCampServerPackage is a set of ready-to-use deployment files for the sites that CodeCampServer powers, like codecampserver.com, adnug.org and c4mvc.net.
Again, big credit to Eric Hexter for this answer.