When executing: duplicity --force remove-older-than 10D file:///fullSystemBackup/
it produces the error: Specified archive directory '/home/bitnami/.cache/duplicity/abcxyz' does not exist, or is not a directory
The issue was that I ran the command as a different user than the one I had used to create the backup with Duplicity.
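If you hit the same error, a sketch of two workarounds (the user name originaluser is a placeholder for whichever account created the backup): either run duplicity as that user so it finds its own cache, or point duplicity at the existing archive directory explicitly with --archive-dir.

```shell
# Run as the user that originally created the backup (placeholder name),
# so duplicity finds its cache under that user's home:
sudo -u originaluser duplicity --force remove-older-than 10D file:///fullSystemBackup/

# Or point duplicity at the existing archive directory explicitly:
duplicity --force --archive-dir /home/originaluser/.cache/duplicity \
    remove-older-than 10D file:///fullSystemBackup/
```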
When I try to move my iOS backup folder (which does not yet have any backups) to my external HDD, the command line (on Mac) tells me that the command -s is not found.
This is the directory I tried to link the iTunes backups to:
user123@user123s-MacBook-Pro ~ % -s /Volumes/Personal/user123/iOSBackup/Old_Backup/ ~/Library/Application\ Support/MobileSync
zsh: command not found: -s
When I entered it manually instead of copy-pasting it, it said permission was denied, even though I had previously granted Full Disk Access to the Terminal app for coding in VS Code and such...
Thanks!
Well, the first problem is that you’re trying to execute -s rather than ln -s.
What is the goal of your ln operation? To symlink the folder called Old_Backup inside the MobileSync directory? That won’t work if your intention is that additional backups go to Old_Backup. You should symlink the Old_Backup directory to the location and name it originally had.
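For example, a sketch using the paths from the question (assuming the goal is that new iTunes backups land on the external drive, and that the existing backup folder inside MobileSync is named Backup):

```shell
# Move the existing backup folder to the external drive
mv ~/Library/Application\ Support/MobileSync/Backup /Volumes/Personal/user123/iOSBackup/Backup

# Symlink it back under its original name, so iTunes keeps writing
# new backups to the external drive through the link
ln -s /Volumes/Personal/user123/iOSBackup/Backup ~/Library/Application\ Support/MobileSync/Backup
```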
I am trying to back up the Jenkins home directory (/home/ubuntu/.jenkins/) using rsync to the target directory /opt/jenbkup/. Since directory traversal did not seem to work as expected, I went with a single directory in the filter:
rsync -avr --include="jobs/*/config.xml" --exclude="*" /home/saga/.jenkins /opt/jenbkup
But nothing is copied. I also tried the exact file path in the include rule, and that did not work either:
rsync -avr --include="jobs/job1/config.xml" --exclude="*" /home/saga/.jenkins /opt/jenbkup
The file is not copied to the destination. I don't understand what's wrong here. Someone please assist.
I assume that you only want to back up the config.xml files in your JENKINS_HOME; then this should work:
rsync -av --include="*/" --include="config.xml" --exclude="*" \
--delete --prune-empty-dirs /home/saga/.jenkins/ /opt/jenbkup/
Short explanation of the used options:
--include="*/" traverse all directories
--include="config.xml" include only files named "config.xml"
--exclude="*" exclude everything else
--delete delete files from the backup that no longer exist in the source
--prune-empty-dirs remove empty directories from the backup
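Before running it for real, a dry run (-n) shows which files the filter chain would actually transfer without touching the target:

```shell
# -n (dry run) lists what would be copied; drop it once the list looks right
rsync -avn --include="*/" --include="config.xml" --exclude="*" \
    --prune-empty-dirs /home/saga/.jenkins/ /opt/jenbkup/
```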
Within a running Docker container I am trying to delete a folder. However, some files in it apparently cannot be deleted: during the deletion process it says something like "cannot find file" for several files supposedly inside the folder I am trying to delete. Is there a way to repair this?
I tried:
fsck -f /MyFolder
fsck from util-linux 2.27.1
e2fsck 1.42.13 (17-May-2015)
fsck.ext2: Is a directory while trying to open /MyFolder
The superblock could not be read or does not describe a valid ext2/ext3/ext4
filesystem. If the device is valid and it really contains an ext2/ext3/ext4
filesystem (and not swap or ufs or something else), then the superblock
is corrupt, and you might try running e2fsck with an alternate superblock:
e2fsck -b 8193
or
e2fsck -b 32768
Other error messages:
rm: cannot remove 'MyFolder/webpack.config.js': No such file or directory
rm: cannot remove 'MyFolder/webpack.config.js.bck2': No such file or directory
rm: cannot remove 'MyFolder/webpack.config.js.bck3': No such file or directory
I also noticed a lot of zero-size files with the same names as the files reported as 'cannot remove...'.
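As an aside on the fsck attempt above: fsck checks a block device, not a directory, which is why it complained about /MyFolder. A sketch of the usual procedure (the device name /dev/sda1 is a placeholder; inside a container the folder typically sits on an overlay filesystem, so the check has to happen on the host against the backing device, while it is unmounted):

```shell
# Find the device backing the mount point (first column of the output)
df /MyFolder

# Unmount it, then check the device itself (run on the host, as root)
umount /dev/sda1
fsck -f /dev/sda1
```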
I am running a make script (Execute Shell command option) inside a Jenkins job.
The make script runs the shell command rm -rf <directory name>.
This command fails with an error saying the directory is not empty. Since the script uses rm -rf, it should work even if the directory is not empty.
Not sure what is wrong here.
Any help with this would be much appreciated.
If your Jenkins job is executed on a Linux machine, this could be:
a permission issue.
a race condition (which is why deleting the files first is a good idea; your rm -rf will then delete all the empty folders).
On Windows, check the full error message: a resource could be held by the OS (in use by another process).
You can first empty the directory and then delete the directory.
Try running the following commands (note that a plain mydir/.* would also match mydir/. and mydir/.., so the extra patterns are used to catch dot-files safely):
rm -rf mydir/* mydir/.[!.]* mydir/..?*
rmdir mydir
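A variant that sidesteps shell glob pitfalls entirely (dot-files are easy to miss with *) is to let find do the emptying:

```shell
# Delete everything under mydir (files, dot-files, and subdirectories)
# but keep mydir itself; then remove the now-empty directory
find mydir -mindepth 1 -delete
rmdir mydir
```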
I have the problem that when I try to create the .apk file from the command line with aapt, it gives me the following error:
"...\res\layout-land\activity_statistics.xml:2: error: Error: No resource found that matches the given name (at 'background' with value '#drawable/bg_session')."
This error repeats through all the layout and drawable folders.
My command is the following:
"...\Android\sdk\platform-tools\aapt.exe"
package -v -f
-A "...\workspace\WBRLight\assets"
-M "...\workspace\WBRLight\AndroidManifest.xml"
-S "...\workspace\WBRLight\res"
-I "...\Android\sdk\platforms\android-17\android.jar"
-F "...\workspace\WBRLight\bin\WBRLight.unsigned.apk" "...\workspace\WBRLight\bin"
I checked whether my files are corrupted and already cleaned my project folder.
With Eclipse it works, but I want to do it from the command line.
Could anybody help me, please? I have been trying to solve this for three days...
So I figured it out:
I have to "crunch" all pictures in the res folder first:
aapt crunch -v -S \res -C \bin\res
Then I pointed the source folders (-S) to both the bin\res dir and the res dir, and also added --no-crunch --generate-dependencies:
aapt package --no-crunch --generate-dependencies -v -f
-M \AndroidManifest.xml
-S \bin\res
-S \res
-A \assets
-I \android.jar
-F \bin\APPNAME.unsigned.apk \bin
Now it's working perfectly, also with the 9-patch (.9.png) pictures.
I found that when using multiple native extensions, there can be a filename conflict, e.g. two different extensions use the same file (at the same path):
res/values/strings.xml
inside the ANE. During APK packaging, when these resources are merged in a temp folder, this file gets overwritten, leading to a similar error message.
The solution I've found so far is to open the ANE archive and rename the conflicting file. You can also contact the author of the extension to update it, or rebuild it yourself if possible.