Anyone know a good solution?
So far I have not found a better way than using File > New File and then copying the contents from the old file to the new one.
You can probably duplicate in Finder and re-import, but that's almost the same amount of work: switching to Finder, duplicating, and importing the new files.
Doing this with one class is not so hard, but what do you do if you need to generate 10+ similar classes based on a superclass?
In Eclipse you select a file and then copy/paste it in the same folder. In Finder there's Duplicate.
In Xcode there's a menu item Edit > Duplicate, but it's ALWAYS disabled. I tried selecting various files, classes, and methods; it's still disabled.
In Xcode 4.2 (I know this is an old question) there is Duplicate under the File menu.
Select the file in the Project Navigator (you can select multiple files, but that doesn't appear to do anything useful) and then File -> Duplicate. Hooray!
In Xcode 4.5 you can duplicate using File -> Duplicate or Cmd + Shift + S.
"Duplicate" is enabled for targets in Xcode (and for pretty much nothing else that I know of).
If you have a substantial number of subclasses with the same starting point to replicate, why not make a class template from it? Then you can just use File > New to make new instances. It's fairly quick to do.
This is probably the simplest example:
http://www.macresearch.org/custom_xcode_templates
Otherwise, I'd simply duplicate the files in Finder as many times as you need, name them, and drag them into Xcode en masse.
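If you really do need 10+ near-identical classes (as in the question), that Finder-duplication step can also be scripted. This is only a rough sketch, assuming a template pair MyBase.h/MyBase.m in the current directory and that a plain textual rename of the class is enough; all names and the count are placeholders:
#!/bin/bash
# Stamp out Subclass1..Subclass10 from the MyBase template pair,
# renaming the class inside each copy with a simple textual substitution.
for ((i = 1; i <= 10; i++)); do
    for ext in h m; do
        sed "s/MyBase/Subclass${i}/g" "MyBase.${ext}" > "Subclass${i}.${ext}"
    done
done
You'd still drag the generated files into Xcode afterwards.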
Careful!
When you use Duplicate (Cmd + Shift + S), Xcode can have a problem indexing the headers.
Also, when you want to do a refactoring, you may get an error window.
There are a couple of things you can do to fix that:
Delete the derived data from the Window > Projects menu, then restart Xcode.
Product > Clean
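If you prefer the Terminal, deleting the derived data directly achieves the same thing; this is just a sketch, assuming the default DerivedData location and that Xcode has been quit first:
# Remove all derived data; Xcode rebuilds its indexes on the next launch.
rm -rf ~/Library/Developer/Xcode/DerivedData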
You could use "Save As..."; you'd still have to go back and re-add the original files to the project, though.
It wouldn't be such a bad way to do a bunch of related classes, though: edit file, Save As "class2", edit file, Save As "class3", etc., then "Add Existing Files" and re-add all of the files but the last to your project.
I use the following Perl script to duplicate a file pair in the Terminal. You give it the base names of the original and new file, and it copies the header and implementation (c/cpp/m/mm) file, then replaces all occurrences of the base name with the new name, then adds them to Subversion. You still have to add the new files into Xcode and adjust the creation date in the comment (I've got a Keyboard Maestro macro for that), but it's quicker than doing a lot of the steps manually. I operate with a Terminal window and four tabs pre-set to the Project, Source, Resources, and English.lproj directories, which gives quick access for a lot of operations.
#!/usr/bin/perl
use lib "$ENV{HOME}/perl";
use warnings;
use strict;
our $cp = '/bin/cp';
our $svn = '/usr/bin/svn';
our $perl = '/usr/bin/perl';
our $source = shift;
our $add = 1;
if ( $source =~ m!^-! ) {
    if ( $source eq '-a' || $source eq '--add' ) {
        $add = 1;
        $source = shift;
    } elsif ( $source eq '-A' || $source eq '--noadd' ) {
        $add = undef;
        $source = shift;
    } else {
        die "Bad arg $source";
    }
}
our $dest = shift;
die "Bad source $source" unless $source =~ m!^(.*/)?[A-Za-z0-9]+$!;
die "Bad dest $dest" unless $dest =~ m!^(.*/)?[A-Za-z0-9]+$!;
my $cpp;
$cpp = 'c' if ( -e "$source.c" );
$cpp = 'cpp' if ( -e "$source.cpp" );
$cpp = 'mm' if ( -e "$source.mm" );
$cpp = 'm' if ( -e "$source.m" );
die "Missing source $source" unless -e "$source.h" && -e "$source.$cpp";
die "Existing dest $dest" if -e "$dest.h" && -e "$dest.$cpp";
our $sourcename = $source; $sourcename =~ s!.*/!!;
our $destname = $dest; $destname =~ s!.*/!!;
print "cp $source.h $dest.h\n";
system( $cp, "$source.h", "$dest.h" );
print "s/$sourcename/$destname in $dest.h\n";
system( $perl, '-p', '-i', '-e', "s/$sourcename/$destname/g", "$dest.h" );
print "cp $source.$cpp $dest.$cpp\n";
system( $cp, "$source.$cpp", "$dest.$cpp" );
print "s/$sourcename/$destname in $dest.$cpp\n";
system( $perl, '-p', '-i', '-e', "s/$sourcename/$destname/g", "$dest.$cpp" );
if ( $add ) {
    print "svn add $dest.$cpp $dest.h\n";
    system( $svn, 'add', "$dest.$cpp", "$dest.h" );
}
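For illustration only, assuming the script above is saved as dupclass.pl and made executable (that name and the class names below are placeholders), running it from the Source tab of the Terminal window mentioned above would look like this:
# Copies MyWidgetView.h/.m to MyButtonView.h/.m, renames the class inside,
# and (with the default --add behaviour) svn-adds the new files.
./dupclass.pl MyWidgetView MyButtonView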
In my case, one of my folders had moved from one place to another.
I had a "Home" folder in the Controller folder, but unfortunately it was moved from the Controller folder to the Manager folder.
I checked everything many times and it looked fine, but I kept getting Command PrecompileSwiftBridgingHeader failed with a nonzero exit code.
After 2 hours I realised that my folder structure had changed.
In my Docusaurus project my internal links work in my local environment, but when I push to GitLab they no longer work. Instead of replacing the original doc title with the new one, it adds it to the end of the URL ('https://username.io/test-site/docs/overview/add-a-category.html'). I looked over my config file, but I do not understand why this is happening.
I tried updating the id in the front matter for the page, and making sure it matches the id in the sidebars.json file. I have also added customDocsPath and set it to 'docs/' in the config file, though that is supposed to be the default.
---
id: "process-designer-overview"
title: "Process Designer Overview"
sidebar_label: "Overview"
---
# Process Designer
The Process Designer is a collaborative business process modeling and
design workspace for the business processes, scenarios, roles and tasks
that make up governed data processes.
Use the Process Designer to:
- [Add a Category](add-a-category.html)
- [Add a Process or Scenario](Add%20a%20Process%20or%20Scenario.html)
- [Edit a Process or Scenario](Edit%20a%20Process%20or%20Scenario.html)
I updated the add-a-category link in parentheses to an .md extension, but that broke the link on my local environment and it still didn't work on GitLab. I would expect that when a user clicks on the link it would replace the doc title in the URL with the new doc title ('https://username.gitlab.io/docs/add-a-category.html'), but instead it just tacks it on to the end ('https://username.gitlab.io/docs/process-designer-overview/add-a-category.html'), and so the link is broken as that is not where the doc is located.
There were several issues with my links. First, I converted these files from html to markdown using Pandoc and did not add front matter - relying instead on the file name to connect my files to the sidebars. This was fine, except almost all of the file names had spaces in them, which you can see in my code example above. This was causing real issues, so I found a Bash script to replace all of the spaces in my file names with underscores, but now all of my links were broken. I updated all of the links in my files with a search and replace in my code editor, replacing "%20" with "_". I also needed to replace the ".html" extension with ".md" or my project would no longer work locally. Again, I did this with a search and replace in my code editor.
Finally, I ended up adding the front matter because otherwise my sidebar titles were all covered in underscores. Since I was working with 90 files, I didn't want to do this manually. I looked for a while and found a great gist by thebearJew and adjusted it so that it would take the file name and add it as the id, and the first heading and add it as the title and sidebar_label, since as it happens that works for our project. Here is the Bash script I found online to convert the spaces in my file names to underscores if interested:
find $1 -name "* *.md" -type f -print0 | \
while read -d $'\0' f; do mv -v "$f" "${f// /_}"; done
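For what it's worth, if that snippet is saved as a script (I'll call it rename_spaces.sh here; the name is mine, not from the original), it takes the directory to process as its only argument:
# Replace spaces with underscores in every "*.md" file name under docs/.
bash rename_spaces.sh docs/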
Here is the script I ended up with if anyone else has a similar setup and doesn't want to update a huge amount of files with front matter:
#!/bin/bash
# Given a file path as an argument
# 1. get the file name
# 2. prepend template string to the top of the source file
# 3. resave original source file
# command: find . -name "*.md" -print0 | xargs -0 -I file ./prepend.sh file
filepath="$1"
file_name=$("basename" -a "$filepath")
# Getting the file name (title)
md='.md'
title=${file_name%$md}
heading=$(grep -r "^# \b" ~/Documents/docs/$title.md)
heading1=${heading#*\#}
# Prepend front-matter to files
TEMPLATE="---
id: $title
title: $heading1
sidebar_label: $heading1
---
"
echo "$TEMPLATE" | cat - "$filepath" > temp && mv temp "$filepath"
The goal of my Makefile is to create, in the end, a static library (*.a) out of Fortran 77 files and some *.c + *.h files, where a specific subset of the headers has to be precompiled with a special company-internal precompiler. The precompiler is provided as an executable, and all you have to hand over is the path name + file name.
Let's call the precompiler CPreComp.
The files needing precompilation are named *_l.h.
So first I want to collect all the headers I need to precompile and then hand them over to a script which does some magic (environment variables, blah blah) and calls the precompiler.
Here you go with my Makefile:
SHELL=/usr/bin/bash
.SHELLFLAGS:= -ec

SOURCE_PATH = ./src
CPRECOMP = ./tools/cprecomp.exe
DO_CPreComp = $(SOURCE_PATH)/do_cprec
HDREXT = .h
PREC_HEADERS = $(foreach d, $(SOURCE_PATH), $(wildcard $(addprefix $(d)/*, $(HDREXT))))

.PHONY: all prereq

all:: \
	prereq \
	lib.a

prereq:
	chmod 777 $(DO_CPreComp)
	echo $(PREC_HEADERS) >> makefileTellMeWhatYouHaveSoFar.txt

lib.a: \
	obj/file1.o \
	obj/file2.o
	ar -r lib.a $?

obj/file1.o:
	# do some fortran precompiling stuff here for a specific file

obj/file2.o: $(SOURCE_PATH)/*.h precomp_path/*.h $(SOURCE_PATH)/file2.c precomp_path/%_l.h
	cc -c -g file2.c

precomp_path/%_l.h : DatabaseForPreComp.txt
precomp_path/%_l.h :
	$(foreach i , $(PREC_HEADERS) , $(DO_CPreComp) $(i) $(CPRECOMP);)
So that is my Makefile; the script for DO_CPreComp looks as follows:
#!/bin/bash
filename="(basename "$1")"
dir="$(dirname "$1")"
cprecomptool="$2"
echo ${dir} ${filename} ${cprecomptool} >> scriptTellMeWhatYouKnow.txt
"${cprecomptool}" "precomp_path/${filename}.1" >&cprecomp.err
cp "precomp_path/${filename}.1" "precomp_path/${filename}"
So according to makefileTellMeWhatYouHaveSoFar.txt I collect all the headers, obviously also the ones not specified as _l.h. This has room for improvement, but the precompiler is smart enough to skip the files which are not suitable. makefileTellMeWhatYouHaveSoFar.txt looks like this:
header1.h header2.h header2_l.h headerx_l.h headery_l.h headerz.h
The Error tells me:
path_to_here/do_cprec : line xy: $2: unbound variable
make[2]: *** [precomp_path/%_l.h] Error 1
make[1]: *** [lib.a] Error 2
scriptTellMeWhatYouKnow.txt shows me the script knows nothing; it is not even created. If I hardcode cprecomptool directly in the script instead, scriptTellMeWhatYouKnow.txt shows me the argument $(CPRECOMP) twice, as file name and path name, plus the hardcoded precompiler. And of course it ends up with a segmentation fault, so the header name was never handed over.
Additionally:
If I do not call the script in the second foreach but instead print $(i) with echo into another file, it is empty.
Perhaps I am just too blind. And please, if you are able to help me, explain it to me in simple terms, so that the next time I stumble over a problem I am smarter because I know what I am doing. :)
OK, now that the main issue is solved, let's have a look at make coding style. The make way of accomplishing what you want is not exactly using foreach in recipes. There are several drawbacks with this approach: for instance, make cannot run parallel jobs, while it is extremely good at this, and on modern multi-core architectures it can really make a difference; or the fact that things are always redone even when they are potentially up to date.
Assuming the result of the pre-compilation of a foo_l.h file is a foo.h file (we will look at other options later), the make way is more something like:
SOURCE_PATH := ./src
CPRECOMP := ./tools/cprecomp.exe
DO_CPreComp := $(SOURCE_PATH)/do_cprec
HDREXT := .h
PREC_HEADERS := $(wildcard $(addsuffix /*_l$(HDREXT),$(SOURCE_PATH)))
PRECOMPILED_HEADERS := $(patsubst %_l.h,%.h,$(PREC_HEADERS))

$(PRECOMPILED_HEADERS): %.h: %_l.h DatabaseForPreComp.txt
	$(DO_CPreComp) $@ $(CPRECOMP)
($@ expands as the target). This is a static pattern rule. With this coding style only the headers that need to be pre-compiled (because they are older than their prerequisites) are re-built. And if you run make in parallel mode (make -j4 for 4 jobs in parallel) you should see a nice speed-up factor on a multi-core processor.
But what if the pre-compilation modifies the foo_l.h file itself? In this case you need another dummy (empty) file to keep track of when a file has been pre-compiled:
SOURCE_PATH := ./src
CPRECOMP := ./tools/cprecomp.exe
DO_CPreComp := $(SOURCE_PATH)/do_cprec
HDREXT := .h
PREC_HEADERS := $(wildcard $(addsuffix /*_l$(HDREXT),$(SOURCE_PATH)))
PREC_TAGS := $(patsubst %,%.done,$(PREC_HEADERS))

$(PREC_TAGS): %.done: % DatabaseForPreComp.txt
	$(DO_CPreComp) $< $(CPRECOMP) && \
	touch $@
($< expands as the first prerequisite). The trick here is that the foo_l.h.done empty file is a marker. Its last modification time records the last time foo_l.h has been pre-compiled. If foo_l.h or DatabaseForPreComp.txt has changed since, then foo_l.h.done is out of date and make re-builds it, that is, pre-compiles foo_l.h and then touch foo_l.h.done to update its last modification time. Of course, if you use this, you must tell make that some other targets depend on $(PREC_TAGS).
With the help of @Renaud Pacalet I was able to find a solution.
In the comments you can read further trial and error.
I am using GNU Make 3.82, built for x86_64-redhat-linux-gnu. It seems the foreach does not like the space behind the i, or rather takes the space as part of the variable name.
# ... like beforehand check out in the question
PREC_HEADERS=$(shell find $(SOURCE_PATH) -iname '*_l.h')
# nothing changed here in between...
$(foreach i,$(PREC_HEADERS),$(DO_CPC) $i $(CPC);)
This has the advantage that I only precompile the headers which have the _l.h ending. Having the brackets $(i) around the $i or not doesn't make a difference. What really changed everything was the space behind the first i.
Good luck!
I have the following script, that I launch using wscript:
Set sh = CreateObject("Shell.Application")
Set rv = sh.BrowseForFolder(0, "Now browse...", 1)
WScript.Echo rv
How can I obtain the full path of the selected folder?
The documentation for the Folder object that is returned by BrowseForFolder gives nothing appropriate.
Or maybe I should use something completely different for browsing for folders in wscript...
rv.Self.Path, discussed here in detail: How Can I Show Users a Dialog Box That Only Lets Them Select Folders? at Hey, Scripting Guy! Blog.
Set sh = CreateObject("Shell.Application")
Set rv = sh.BrowseForFolder(0, "Now browse...", 1)
If rv Is Nothing Then
WScript.Echo "Nothing chosen"
Else
WScript.Echo rv.Self.Path
End If
I'm trying to get started playing with factor.
So far, I've:
downloaded the OSX disk image
copied the factor directory into $INSTALL/factor
started up the debugger by running $INSTALL/factor/factor
Which seems to be running great.
Following the instructions for writing your first factor program, I noticed that scaffold-vocab generated files in my $INSTALL/factor/work directory, which I can use for now, but in general I like to keep a separate $INSTALL directory tree and $CODE directory tree.
So I'm trying to follow the instructions from the "Working with code outside of the Factor directory tree" documentation to add other directories to the path used to load code into the factor executable, but I'm not having much luck.
First, I tried to set a FACTOR_ROOTS environment variable:
% export FACTOR_ROOTS=.:$CODE/Factor:$INSTALL/factor
% $INSTALL/factor/factor
( scratchpad ) "work" resource-path .
"/usr/local/src/factor/work"
( scratchpad ) ^D
Then, I tried to create a ~/.factor-roots file
% echo . > ~/.factor-roots
% echo $CODE/Factor >> ~/.factor-roots
% echo $INSTALL/factor >> ~/.factor-roots
% $INSTALL/factor/factor
( scratchpad ) "work" resource-path .
"/usr/local/src/factor/work"
( scratchpad ) ^D
Then I checked to see if it should be ./.factor-roots instead:
% mv ~/.factor-roots .
% $INSTALL/factor/factor
( scratchpad ) "work" resource-path .
"/usr/local/src/factor/work"
( scratchpad ) ^D
Lastly, I tried adding it manually:
% $INSTALL/factor/factor
( scratchpad ) "." add-vocab-root
( scratchpad ) "$CODE/Factor" add-vocab-root ! no, I didn't actually use an environment variable here :)
( scratchpad ) "work" resource-path .
"/usr/local/src/factor/work"
( scratchpad ) ^D
It seems I'm missing something fundamental here.
How do I write code outside of the $INSTALL/factor directory-tree and use it in factor? How can I tell scaffold-vocab to build scaffolding in my $CODE/Factor directory?
Ok, I was able to work out what I was doing wrong thanks to the earnest help of slava and erg on #concatenative.
Simply put, resource-path is not a way to test your factor roots. Like the docs say it "resolve[s] a path relative to the Factor source code location."
A more effective test is simply vocab-roots get, which will fetch the current list of vocab roots.
"/path/to/wherever" add-vocab-root will add /path/to/wherever to your list of vocab-roots, and allow you to do "/path/to/wherever" "project" scaffold-vocab so you can build scaffolding in the desired location.
As erg said:
i usually make another word, like
: scaffold-games ( vocab -- ) [ "/home/erg/games" ] dip scaffold-vocab ;
"minesweeper" scaffold-games
I'm using the aapt tool to remove some files from different folders of my apk. This works fine.
But when I want to add files to the apk, the aapt tool's add command doesn't let me specify the path where I want the file to be added, so I can add files only to the root folder of the apk.
This is strange, because I can't imagine that developers would only ever want to add files to the root folder of the apk and never to a subfolder (the res folder, for example). Is this possible with aapt or any other method? Removing files from any folder works fine, but adding files works only for the root folder of the apk; I can't use it for any other folder.
Thanks
The aapt tool retains the directory structure specified in the add command. If you want to add something to an existing folder in an apk, you simply must have a similar folder on your system and must specify each file to add with its full directory path. Example:
$ aapt list test.apk
res/drawable-hdpi/pic1.png
res/drawable-hdpi/pic2.png
AndroidManifest.xml
$ aapt remove test.apk res/drawable-hdpi/pic1.png
$ aapt add test.apk res/drawable-hdpi/pic1.png
The pic1.png that is added resides in a res/drawable-hdpi/ folder under the current working directory of the terminal. Hope this answered your question.
There is actually a bug in aapt that will make this randomly impossible. The way it is supposed to work is as the other answer claims: paths are kept, unless you pass -k. Let's see how this is implemented:
The flag that controls whether the path is ignored is mJunkPath:
bool mJunkPath;
This variable is in a class called Bundle, and is controlled by two accessors:
bool getJunkPath(void) const { return mJunkPath; }
void setJunkPath(bool val) { mJunkPath = val; }
If the user specified -k at the command line, it is set to true:
case 'k':
    bundle.setJunkPath(true);
    break;
And, when the data is being added to the file, it is checked:
if (bundle->getJunkPath()) {
    String8 storageName = String8(fileName).getPathLeaf();
    printf(" '%s' as '%s'...\n", fileName, storageName.string());
    result = zip->add(fileName, storageName.string(),
            bundle->getCompressionMethod(), NULL);
} else {
    printf(" '%s'...\n", fileName);
    result = zip->add(fileName, bundle->getCompressionMethod(), NULL);
}
Unfortunately, the one instance of Bundle used by the application is allocated in main on the stack, and there is no initialization of mJunkPath in the constructor, so the value of the variable is random; without a way to explicitly set it to false, on my system I (seemingly deterministically) am unable to add files at specified paths.
However, you can also just use zip, as an APK is simply a Zip file, and the zip tool works fine.
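For example, something along these lines should work from a directory that mirrors the apk's internal layout (test.apk and the file path are just placeholders); keep in mind that modifying the archive invalidates the signature, so the apk may need to be re-signed afterwards:
$ cd myapp                                    # contains res/drawable-hdpi/pic1.png
$ zip ../test.apk res/drawable-hdpi/pic1.png  # stored with its relative path intact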
(For the record, I have not submitted the trivial fix for this as a patch to Android yet, if someone else wants to the world would likely be a better place. My experience with the Android code submission process was having to put up with an incredibly complex submission mechanism that in the end took six months for someone to get back to me, in some cases with minor modifications that could have just been made on their end were their submission process not so horribly complex. Given that there is a really easy workaround to this problem, I do not consider it important enough to bother with all of that again.)